US20050114283A1 - System and method for generating a report using a knowledge base


Info

Publication number
US20050114283A1
Authority
US
United States
Prior art keywords
keywords
user
sentence
available
display region
Prior art date
Legal status
Abandoned
Application number
US10/846,255
Inventor
Philip Pearson
Andrew Odlivak
Marc Shapiro
Deepak Agarwal
Aaron Divinsky
Peter Cotton
Current Assignee
Olympus Corp
Original Assignee
Olympus America Inc
Priority date
Filing date
Publication date
Application filed by Olympus America Inc filed Critical Olympus America Inc
Priority to US10/846,255
Publication of US20050114283A1
Assigned to OLYMPUS AMERICA INC. reassignment OLYMPUS AMERICA INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SHAPIRO, MARC, PEARSON, PHILIP
Assigned to OLYMPUS CORPORATION reassignment OLYMPUS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: OLYMPUS AMERICA INC.

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/10 Office automation; Time management
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/40 ICT specially adapted for the handling or processing of patient-related medical or healthcare data for data related to laboratory analysis, e.g. patient specimen analysis
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60 ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H15/00 ICT specially adapted for medical reports, e.g. generation or transmission thereof
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/20 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/555 Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633 Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera

Definitions

  • the invention relates generally to a system and method for generating a report using a knowledge base and, more specifically, to generating a report using sentence models that are automatically populated with keywords selected by a user.
  • the invention is illustrated in the context of a medical procedure such as an endoscopic examination.
  • Computerized word processing applications have gained widespread use since they allow a user to prepare and manipulate documents in electronic form. Moreover, various applications have been developed to assist users in specific industries in developing reports and other documents. For example, some voice-activated systems use a dictionary developed from scanning published reports to choose a term that most closely matches a spoken word. The spoken words of the user are then converted to text to produce a report. Other systems allow the user to enter information for a report via various prompts and menus. The system then generates a report based on the entered information.
  • the present invention provides a system and method for generating a report using a knowledge base.
  • a method for generating a report regarding a procedure.
  • the method includes: (a) displaying available keywords from a knowledge base on a first display region of a user interface, (b) receiving, via the user interface, at least one user command selecting at least one of the available keywords from the first display region, (c) displaying the at least one of the available keywords on a second display region of the user interface, responsive to the selection thereof by the at least one user command, (d) populating a sentence model according to the at least one of the available keywords to provide a populated sentence, and (e) displaying the populated sentence on a third display region of the user interface.
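  • As an illustrative sketch only (not the patent's actual implementation), steps (a) through (e) could be outlined in code roughly as follows; the class and function names here (SentenceModel, generate_report, the ui regions) are hypothetical:

```python
# Hypothetical sketch of the claimed report-generation flow, steps (a) through (e).
# All names (SentenceModel, generate_report, ui.*) are illustrative assumptions.

class SentenceModel:
    def __init__(self, template: str):
        # e.g. "A {size}, {qualifier} polyp was found arising from the {site}."
        self.template = template

    def populate(self, keywords: dict) -> str:
        # (d) populate the sentence model with the user-selected keywords
        return self.template.format(**keywords)


def generate_report(ui, knowledge_base, model: SentenceModel) -> str:
    # (a) display available keywords from the knowledge base in a first display region
    ui.first_region.show(knowledge_base.available_keywords())

    # (b) receive a user command selecting one or more of the available keywords
    selected = ui.wait_for_selection()

    # (c) display the selected keywords in a second display region
    ui.second_region.show(selected)

    # (d) populate a sentence model with the selected keywords
    sentence = model.populate(selected)

    # (e) display the populated sentence in a third display region
    ui.third_region.show(sentence)
    return sentence
```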
  • a method for providing keywords for generating a report regarding a procedure includes: (a) providing respective keywords for use in the report, (b) associating each of the respective keywords with a respective classification in a hierarchically arranged tree structure of classifications, (c) associating respective properties with the respective keywords, and (d) defining, based on the respective properties, a set of allowable values for a group of the respective keywords, where the keywords in the group are related.
  • a related user interface and program storage device are also provided.
  • FIG. 1 illustrates an overview of a report-generating system according to the invention
  • FIG. 2 illustrates a Registration and Scheduling clinical flow
  • FIG. 3 illustrates a Pre-Procedure clinical flow
  • FIG. 4 illustrates a Procedure clinical flow
  • FIG. 5 illustrates a Post-Procedure clinical flow
  • FIG. 6 illustrates a Home tab of a user interface
  • FIG. 7 illustrates a Patient File tab of a user interface
  • FIG. 8 illustrates a Registration tab of a user interface
  • FIG. 9 illustrates a Pre Procedure tab of a user interface
  • FIG. 10 illustrates a Procedure tab of a user interface
  • FIG. 11 illustrates a Post-Procedure tab of a user interface
  • FIG. 12 illustrates an Analysis tab of a user interface
  • FIG. 13 illustrates an Admin tab of a user interface
  • FIG. 14 illustrates a Pre-Procedure Lexicon interface
  • FIG. 15 illustrates a Lexicon user interface for selecting indications of GI symptoms for a patient
  • FIG. 16 illustrates a Lexicon interface that shows the selected keyword “heartburn”
  • FIG. 17 illustrates a Lexicon interface that shows a keyword “severity” in expanded form
  • FIG. 18 illustrates a Lexicon interface that shows a selected keyword “severe”
  • FIG. 19 illustrates a Lexicon interface that shows a user input value “6” as a selected keyword
  • FIG. 20 illustrates an interface with a list of billing codes
  • FIG. 21 illustrates a Lexicon interface in the Post-Procedure tab
  • FIG. 22 illustrates an interface for selecting a Procedure Note Template under the Post-Procedure tab
  • FIG. 23 illustrates a Procedure Note interface
  • FIG. 24 illustrates an interface for selecting a Procedure Note Template under the Admin tab
  • FIG. 25 illustrates a class diagram of the Concept Layer of the knowledge base
  • FIG. 26 illustrates a class diagram of the Data Layer of the knowledge base
  • FIG. 27 illustrates a class diagram of the Sentence Model infrastructure of the knowledge base
  • FIG. 28 illustrates a class diagram of the View Layer of the knowledge base
  • FIG. 29 illustrates a user interface for adding a keyword to the knowledge base
  • FIG. 30 illustrates a user interface for adding a property definition to the knowledge base
  • FIG. 31 illustrates a user interface for editing a keyword in the knowledge base
  • FIG. 32 illustrates a user interface for editing properties of a keyword in the knowledge base
  • FIG. 33 illustrates a user interface for adding properties of a keyword in the knowledge base
  • FIG. 34 illustrates a user interface for adding and editing codes in the knowledge base
  • FIG. 35 illustrates a user interface for adding and editing sentence models in the knowledge base
  • FIG. 36 illustrates a user interface for adding a condition in a sentence model in the knowledge base
  • FIG. 37 illustrates a user interface for adding and editing triggers in the knowledge base
  • FIG. 38 illustrates a user interface for adding a keyword item to a menu in a view
  • FIG. 39 illustrates a user interface for adding a menu in the knowledge base
  • FIG. 40 illustrates a user interface for creating a shortcut in the knowledge base
  • FIG. 41 illustrates a user interface for editing a shortcut in the knowledge base
  • FIG. 42 illustrates user interfaces for a grammar engine.
  • FIG. 1 illustrates an overview of a report-generating system according to the invention.
  • the invention involves a web-browser based clinical information management system that automates a medical lab such as an endoscopy lab by managing patient examination data at different phases of patient care, including the capture of images, data and written Procedure Notes, and further, the generation of medical records and procedure reports.
  • the system may include an endoscopic workstation 110 , a Mavigraph printer 112 , RGB monitor 114 and processor 116 .
  • the user provides inputs to the workstation 110 via keyboard, mouse, voice interface, or the like.
  • the workstation may be coupled with a web browser interface that provides the necessary information to perform exams, and facilitates for users of endoscopic equipment, e.g., physicians, nurses or clinicians, the efficient capture, management, organization and presentation of endoscopic images and patient and examination data.
  • the workflow processes associated with this aspect of the system are flexible enough to support small endoscopic practices in addition to endoscopic departments within large healthcare institutions.
  • the system may function as a stand-alone system including memory for storing patient data and image information.
  • the system may also include a server 140 and database element 145 that may be connected via a gateway application to various “external” systems such as a hospital information system where the gateway facilitates the transfer of healthcare information between the system and other applications.
  • Patient information stored in the system may be downloaded to external systems (e.g., a legacy system) via a gateway interface.
  • the workstation 110 may communicate with the server 140 via the Internet 170 or other network, such as a LAN or intranet.
  • the workstation 110 may also communicate with a fax server 160 , for instance, for faxing reports via a fax modem 162 .
  • software instructions, including firmware and microcode, may be stored in any type of program storage device or devices, also referred to as computer-readable media. The software is executed by a processor in a known manner to achieve the functionality described herein.
  • the system includes an Image Management function enabling a user to annotate, label, import, export, and enhance the quality of images, including the ability to manage, record, and export live video clips. Further to this is an “auto-masking” feature that automatically selects an appropriate video mask based on a particular endoscope device being utilized by the health care practitioner.
  • the system includes a medical terminology “Knowledge Base” (KB) comprising keywords relating to the procedure, e.g., such as gastrointestinal, endoscopic and bronchoscopic terminology keywords.
  • the keywords are captured via a graphical user interface (GUI) before, during, and/or after a procedure.
  • the keywords are made available for labeling images captured during an examination to be used in reports, auto-populating appropriate sections of a report such as a Procedure Note, described further below, based on patient history, and building Procedure Note templates or models to auto-populate sections of information.
  • the system also facilitates the use of custom terms that apply to a specific department or location.
  • a user may select KB terms for a procedure via a common user interface, which is employed wherever the user needs to locate or extract keywords. This also provides a consistent way to select and use terminology.
  • FIGS. 2-5 illustrate clinical flow diagrams that describe the most common activities associated with the system and their relationship in time in the context of one possible application of the invention.
  • Clinical flow is based on patient flow, which relates to how a patient is processed before, during, and after an endoscopic procedure.
  • the overall flow across all lifecycle stages starts with an exam request and ends with the generation of a Procedure Note, the release of the patient, and the generation of a set of related reports.
  • User roles are represented as horizontal bands.
  • the registration and scheduling clinical flow 200 of FIG. 2 includes a collection of all the information necessary to set up a visit. It is initiated through an exam request made by either the patient, a surrogate for the patient, or a referring physician. The nurse and physician share the activity of preparing prep instructions and medical advice for the patient.
  • the Pre-Procedure clinical flow 300 of FIG. 3 starts with the arrival of the patient at the endoscopy facility and addresses all administrative and medical activities necessary to prepare the patient for the exam.
  • the Procedure clinical flow 400 of FIG. 4 depicts the actual examination that takes place during the Procedure lifecycle stage.
  • the system is used to capture images, record vital signs, and administer medications during this stage.
  • the Post-Procedure clinical flow 500 of FIG. 5 depicts the activities that take place after the completion of an exam. These activities include a nurse continuing to monitor the patient's recovery, a nurse completing discharge instructions, releasing the patient, and preparing billing code reports, and a physician reviewing and editing the analysis of an exam by generating a Procedure Note. A physician signs the Procedure Note when it is complete. Afterward, management reports, patient recall requests, and referral letters can be created and distributed.
  • the invention is next described in connection with a user interface that allows the user to select different features under different tabs.
  • the Home tab 600 ( FIG. 6 ).
  • the Home tab is the default home page, and is pre-defined for each role. However, the user can modify the page to suit the user's needs. The following are the most common tasks that can be performed in the Home tab. Access to these tasks is based on the user's role. For example, if the user logs into the application as a scheduler, then the user would not see the Sign Reports menu option, since that option is reserved for the physician role.
  • Scheduled Exams: used to view a list of scheduled exams and create a new visit and exam.
  • Pending Items: used to view all of the pending tasks. The user can also select one or more pending items and close them.
  • Pathology Status: used to view the status of outstanding pathology requests or search the database for an existing record. The user can also edit or delete existing pathology records. When a pathology record is deleted, all of the specimens associated with that record are deleted.
  • Unsigned Reports: an attending physician can use the Unsigned Reports screen to view and sign unsigned Procedure Notes.
  • Sign Reports: a system administrator can use the Sign Reports screen to view unsigned Procedure Notes for a specific physician and mark them as signed.
  • Carbon Copies: when the user distributes a document to a medical provider, clinical staff, or contact via email, a notification is sent to the recipients that a document is available for them in the system. Recipients can then log on to the system and view a list of documents on the Carbon Copies screen.
  • ICU Synchronization: when the user performs an exam in ICU (Image Capture) mode, the user's imaging station is not connected to the network server. When the user finishes the exam, the user must upload images and data from the workstation to the server repository. When the workstation is re-connected to the network, a series of simple commands will upload the data and images captured during the exam. After the data is uploaded, the user uses the ICU Synchronization option to synchronize images and data.
  • Recall Letters: used to recall a patient for another examination.
  • the user can use this option to add an item to the Recall Letter Queue to remind a patient of a follow-up examination.
  • Patient File tab 700 ( FIG. 7 )—allows a user to capture information specific to the individual patient. This tab is used to record a patient's demographic information; record a patient's medical alerts and GI/pulmonary, medication, family, and social history information; and view a summary of the patient information.
  • Registration tab 800 ( FIG. 8 ). This tab is used to: (a) create and modify visit and/or exam information; (b) view past, current, or future schedules; (c) assign resources for an examination including procedure rooms and equipment; and (d) distribute registration documents.
  • Pre-Procedure tab 900 ( FIG. 9 ). This tab is used to: (a) record care plan information for a specific visit; (b) record medical alert information; (c) record GI, pulmonary, family, and social history information; (d) manage physical examination, patient assessment, and physician check information; (e) manage prep status information for the patient; (f) manage consent information for a visit; (g) capture vital signs and medications administered before the examination; (h) display a summary of selected Pre-Procedure information and capture nurse handoff information; and (i) distribute Pre-Procedure documents.
  • the Procedure Tab 1000 ( FIG. 10 ). This tab is used to: (a) capture images during an endoscopic procedure; (b) record live video clips; (c) record scope time used during an examination; (d) view images and Procedure Notes from a previous exam; (e) print images for an exam on a laser jet or a Mavigraph printer; (f) record nurse administration information; (g) record accessories and equipment used during an examination; (h) generate pathology requests; (i) capture vital signs and medications administered during the examination; and (j) distribute procedure documents.
  • the Post Procedure Tab 1100 ( FIG. 11 )—After an examination is completed, this tab is used to perform post-procedural tasks. These tasks include synchronizing images in the ICU mode, monitoring a patient's vital signs and medication information, managing captured images, and writing Procedure Notes. Images from a current procedure, e.g., image 1 and image 2, and from a prior procedure, e.g., image 3, image 4, and image 5, can be displayed together for comparison.
  • This tab is used to: (a) record patient recovery information; (b) manage images captured during an exam; (c) label, annotate, enhance, and print images; (d) import and export images to and from the current examination; (e) manage video clips recorded during an examination; (f) write and sign Procedure Notes; (g) capture patient recall information; (h) assess performance of a trainee participating in an examination; (i) capture patient survey information; (j) distribute Post-Procedure documents; and (k) perform ICU synchronization.
  • the Analysis Tab 1200 ( FIG. 12 )—used to generate predefined template-based management reports to satisfy end-user administrative reporting requirements related to patient, procedure and facility management, efficiency analysis, and resource utilization. This tab is used to generate: (a) Continuous Quality Improvement (CQI) reports; (b) efficiency reports; (c) equipment analysis reports; (d) procedure analysis reports; and (e) administration reports.
  • the Admin Tab 1300 ( FIG. 13 )—used to perform administrator tasks and ensure the efficiency and security of the system.
  • the system can be customized based on the needs and requirements of the facility, physician, and clinical staff.
  • This tab is used to: (a) maintain system data (such as Patient ID type and department information); (b) maintain application resource data (such as clinical staff and contact information); (c) perform system configuration (such as configure Mavigraph printer and video settings); (d) customize how the application will flow and generate information (for example, changing the order and location of menus within the application and editing or creating templates/models that are used to create Procedure Notes); (e) customize user-defined fields (such as other patient information and other visit information); (f) control access to or within the application (such as user and role maintenance); and (g) maintain equipment used during the procedure.
  • the Knowledge Base is a terminology database that contains terms related to the specific procedure.
  • the KB may be a medical terminology database that includes gastrointestinal, endoscopic and bronchoscopic terminology in one possible application.
  • the KB can be extended to other medical and non-medical applications by using the appropriate terminology.
  • the knowledge base may be used for any application where standardized terminology is desired. This may include applications for producing reports in various industries, such as financial services, insurance, legal services, real estate and so forth.
  • the KB contains concepts (terms), keywords (terms used in a specific context), sentence models and views (menus and keyword items). Keywords are the medical terms that are the basic building blocks of the KB. When the user selects a keyword from a menu of available keywords, it appears in a list of selected keywords or terms. Moreover, keywords are organized in menus. For example, a “size” menu type could contain the keywords “small”, “medium”, and “large”. Furthermore, a menu can be configured to be single-select, unique, or multi-select.
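  • As a rough, hypothetical illustration of the structure described above (the class names and fields are assumptions, not the patent's actual schema), the KB building blocks could be modeled as:

```python
# Illustrative sketch of KB building blocks: keywords, menus, and views.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Keyword:
    text: str                              # a term used in a specific context, e.g. "small"


@dataclass
class Menu:
    name: str                              # e.g. "size"
    mode: str                              # "single-select", "unique", or "multi-select"
    keywords: List[Keyword] = field(default_factory=list)


@dataclass
class View:
    name: str                              # organizes menus and keywords into a navigable tree
    menus: List[Menu] = field(default_factory=list)


size_menu = Menu("size", "single-select",
                 [Keyword("small"), Keyword("medium"), Keyword("large")])
findings_view = View("Findings", [size_menu])
```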
  • the user can use the KB to: (a) capture keywords before, during, or after a procedure; (b) label images captured during an examination to be used in reports; (c) auto-populate appropriate sections of a Procedure Note based on patient history; and (d) build Procedure Note templates/models to auto-populate sections of information.
  • the KB features a common user interface, which is employed wherever the user needs to locate or extract keywords. This also provides a consistent way for the user to select and use terminology.
  • the KB also facilitates the use of custom terms that apply to a specific department or location.
  • a Lexicon function is used to select terms from the KB.
  • the Lexicon arranges the KB content into different report sections, based on Phases of Care.
  • the user can click a tab to see the KB terms associated with the report section.
  • a facility determines which report sections should be available in the Lexicon screen for a phase of care.
  • a system administrator can make a report section available to appropriate phases of care in the Admin tab.
  • Each tab or report section in the Lexicon screen may contain two panes or display regions.
  • the right hand side pane displays a KB view and contains all the available terms of the KB arranged in a logical tree. Typically, only a portion of the tree is visible at a time. The user can navigate the tree to view different portions of it.
  • the left hand side pane contains the selected terms. When the user picks or selects an available term from the right hand side pane, it is copied to the left hand side pane, which displays the selected terms, thus allowing the user to logically build a comprehensive description of the exam. The user uses these selected terms along with other exam data collected during various phases of care to generate Procedure Notes, e.g., reports, and other exam related documents.
  • FIG. 14 illustrates an example of the Pre-Procedure Lexicon screen or interface 1400 .
  • the user can use the Lexicon function to select terms from the KB and to record procedure related information of the exam.
  • the Lexicon arranges the KB content into different report sections for the Pre-Procedure, including indications, unplanned events and billing codes. These report sections appear as tabs in the Lexicon screen 1400 .
  • the Pre-Procedure tab is selected, and the menu item “Lexicon” is selected from the far left hand side of the interface.
  • the “Indications” tab has been selected, so the term “Indications” appears at the top of the left hand pane or second display region 1410 .
  • the right-hand pane or first display region 1420 of the interface 1400 displays the available terms in the KB in a hierarchical tree or menu 1425 .
  • the user can view branches of the tree 1425 in an expanded view by clicking on a “+” icon, or in a contracted view by clicking on a “−” icon.
  • the branches of a node of the tree may contain related keywords that qualify the concept of the parent node keyword.
  • the keyword “severity” includes branches for keywords “mild”, “moderate” and “severe” that describe the degree of severity of the patient's symptoms. Keywords at lower levels of the tree thus can be provided to qualify the keywords at the higher levels.
  • the keywords in the tree 1425 are organized into menus in which the physician might organize the indications or symptoms of a patient. These menus may include, e.g., GI or gastrointestinal symptoms, airway symptoms, test results, imaging results, surveillance, treatment of established disease, systemic disorders, follow up, diagnostic sampling, and protocol study. Under each keyword, further qualifying terms can be viewed by expanding the tree.
  • the interface 1400 relates to GI symptoms, and the portion of the tree 1425 shown relates to the indication “abdominal pain”. Multiple menus of indications can be selected by the user together or in turn.
  • the second display region 1410 contains a tree 1415 with the keywords that the user has selected from the KB terms in the display region 1420 .
  • when the user selects an available term from the first display region 1420 , it is copied to the second display region 1410 to logically build a comprehensive description, e.g., of the patient's symptoms, or of the subsequent exam results.
  • the indications of diarrhea, weight loss, abdominal pain and heartburn have been selected along with various keywords for detailing each indication.
  • the following discussion explains how the keyword “heartburn” in the tree 1415 is selected by the user.
  • FIG. 15 illustrates an example lexicon user interface 1500 for selecting indications of GI symptoms for a patient.
  • the various GI symptoms are provided as branches of a tree 1525 in which the keyword “GI symptoms” is a parent node.
  • the user has expanded the tree under the keyword GI symptoms to reveal the next lower level of keywords, e.g., abdominal pain, bloating, etc.
  • the second display region 1410 indicates that the user has not yet selected any GI symptoms.
  • the user, such as a physician, obtains information regarding the GI symptoms of a patient, such as by interviewing the patient.
  • the user learns that the patient is experiencing heartburn.
  • the user selects the keyword “heartburn” from the tree 1525 in the first display region 1420 , e.g., by clicking on “heartburn” with a pointing device such as a mouse.
  • the keyword “heartburn” is then copied to the second display region 1410 as a first indication. See the interface 1600 of FIG. 16 , which shows the selected keyword “heartburn” displayed in the tree 1615 in the second display region 1410 .
  • Other indications such as diarrhea, weight loss and abdominal pain, as provided in the tree 1415 of FIG. 14 , may similarly be selected from the tree 1525 to build a lexicon of selected keywords in the second display region 1410 .
  • each keyword may be further detailed when selected.
  • when the keyword “heartburn” is selected from the tree 1525 in the first display region 1420 , as shown in FIG. 15 , the first display region 1420 may be updated to display the tree 1625 , in which various additional details appear as branches or child nodes of the selected keyword.
  • the tree 1625 displays keywords for detailing the severity, frequency, duration, length of last episode, precipitants and relief factors regarding the heartburn. Note that these same keywords may be used for detailing the other indications, such as diarrhea, weight loss and abdominal pain. Keywords that are specific to a particular indication may also be used for detailing the indication.
  • each of the keywords in the tree 1625 may be expanded to provide further detailing keywords.
  • the user may expand the node “severity” in the tree 1625 by clicking on the “+” sign next to “severity”.
  • the result is the tree 1725 in the interface 1700 of FIG. 17 , where the keywords “mild”, “moderate” and “severe” allow the user to detail the severity of the heartburn.
  • the term “heartburn” is highlighted in the tree 1615 , indicating to the user that any further selection from the first display region 1420 will provide details regarding the heartburn. If other indications such as weight loss and abdominal pain were present in the tree 1615 , each indication could be highlighted in turn by the user to allow the user to enter the associated details in turn.
  • the user selects the child keyword “severe” under the parent keyword “severity”.
  • the term “severe” is copied to the second display region 1410 and provided as a branch of the term “heartburn”.
  • the user proceeds to detail the frequency of the heartburn. To do this, the user expands the keyword “frequency” in the tree 1625 to reveal options such as “constantly”, “specify number of times per day”, “specify number of times per week”, “specify number of times per month”, and “specify number of times per year”.
  • the user may select “specify number of times per day”, which causes a pop-up window to be displayed requesting that the user enter a number.
  • the user enters “6” in an Enter Number field if the patient was feeling heartburn six times a day.
  • the user clicks OK to save the entry.
  • the interface is then updated so that the number “6” appears in the tree 1915 in the interface 1900 of FIG. 19 under the keyword “heartburn”.
  • the interface may display “6 times per day”.
  • the process as detailed above may be repeated for the different indications to arrive at the interface 1400 of FIG. 14 .
  • the hierarchical relationship of the keywords in the tree of available keywords is maintained in the tree of selected keywords.
  • the tree of selected keywords is a subset of the tree of available keywords.
  • this allows the context of a keyword to be understood by the user.
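  • One way to picture this hierarchy-preserving copy (a minimal sketch under the assumption that both panes are simple parent/child trees; none of these names come from the patent) is to copy a chosen keyword together with any missing ancestors into the selected tree:

```python
# Hypothetical sketch: copying a selected keyword into the tree of selected keywords
# while preserving its hierarchical position in the tree of available keywords.

class Node:
    def __init__(self, text, parent=None):
        self.text = text
        self.parent = parent
        self.children = []
        if parent is not None:
            parent.children.append(self)


def select(available_node: Node, selected_root: Node) -> Node:
    """Copy available_node, plus any missing ancestors, under selected_root."""
    # Build the path from the root of the available tree down to the chosen keyword.
    path = []
    node = available_node
    while node is not None:
        path.append(node.text)
        node = node.parent
    path.reverse()

    # Walk or extend the selected tree along the same path, reusing existing nodes,
    # so the selected tree stays a subset of the available tree.
    current = selected_root
    for text in path[1:]:  # skip the shared root node
        match = next((c for c in current.children if c.text == text), None)
        current = match if match is not None else Node(text, current)
    return current


# Example mirroring FIGS. 15-18: selecting "severe" under heartburn > severity
available = Node("Indications")
heartburn = Node("heartburn", available)
severity = Node("severity", heartburn)
severe = Node("severe", severity)

selected = Node("Indications")
select(severe, selected)   # selected tree now contains heartburn > severity > severe
```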
  • the “specify” keywords prompt the user for a specific entry, e.g., as text or a numerical entry. It is also possible for the numerical values or ranges of values to be provided as keywords in the tree of available keywords.
  • the menu or tree of available keywords in the KB can be configured as single-select, unique, or multi-select.
  • a single-select menu allows the user to select one keyword at a time.
  • the keyword “severity” is classified as a single-select menu, so the user can select only one qualifier from the available qualifiers, such as mild, moderate or severe.
  • the GI Symptoms menu 1525 of FIG. 15 is unique, and the user selects “heartburn” as the first symptom. If the user tries to select this keyword again from the tree 1525 , the keyword “heartburn” in the tree 1615 of FIG. 16 is highlighted to indicate that it has already been selected.
  • a multi-select menu allows the user to select multiple keywords at a time. The user can even select all the keywords in a multi-select menu. For example, if the GI symptoms menu in the tree 1525 of FIG. 15 is classified as a multi-select menu, the user may select multiple symptoms such as diarrhea, weight loss, abdominal pain and heartburn at the same time. For example, the user may select multiple terms by clicking on each term with a mouse while holding down the <Ctrl> key on the keyboard. Each selected symptom appears at the same hierarchical level of the tree 1415 of FIG. 14 .
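  • A minimal sketch of how these three menu configurations might be enforced on selection (the function name and behavior here are assumptions, not the patent's code):

```python
# Hypothetical sketch of enforcing single-select, unique, and multi-select menus.
# `selected` holds the keywords already chosen from the same menu.

def try_select(menu_mode: str, keyword: str, selected: list) -> list:
    if menu_mode == "single-select":
        return [keyword]                     # a new choice replaces any previous one
    if menu_mode == "unique":
        if keyword in selected:              # already chosen: flag it rather than re-adding
            print(f'"{keyword}" has already been selected')
            return selected
        return selected + [keyword]
    if menu_mode == "multi-select":
        return selected + [keyword]          # any number of keywords may be chosen
    raise ValueError(f"unknown menu mode: {menu_mode}")


chosen = try_select("unique", "heartburn", [])
chosen = try_select("unique", "heartburn", chosen)   # prints the already-selected notice
```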
  • Views are collections of menus and their associated keywords organized within a tree-structure.
  • the user can use views to navigate through the KB and select appropriate medical terms or keywords.
  • the interface 1400 of FIG. 14 illustrates a collection of menus and keywords or a View in the Lexicon screen.
  • the user navigates to the Pre-Procedure tab, selects an exam, and selects the “Lexicon” option from the far left hand menu.
  • the Lexicon screen 1400 may contain the following tools to assist the user in navigating the interface. Any appropriate icon design may be used.
  • The Lexicon screen tools and their purposes include the following:
    Generate Report: Generate sentences from selected keywords. Active only in the Procedure Note screen (the Post-Procedure tab).
    Move Up: Move keyword up in the selected terms hierarchy.
  • the user clicks on a keyword from the list of available keywords in the first display region 1420 , and the selected keyword is copied to the second display region 1410 .
  • this arrangement of the display regions is an example only. It is possible to have the available and selected keywords displayed in various other configurations. For example, the available keywords may be displayed on a top display region while the selected keywords are in a bottom display region. In another possible approach, the selected keywords are displayed in a window that is overlaid on a window that displays the available keywords. It is also possible to use multiple display screens, where, e.g., the available and selected keywords are displayed on different screens. Various other configurations are possible based on the available display technologies.
  • a window may be provided that allows the user to type in details of such an event, or keywords describing the event may be selected from the first display region 1420 .
  • an unplanned event during an endoscopic procedure may include bleeding. The location and extent of bleeding can thus be detailed.
  • the user can associate billing codes with selected keywords for billing purposes.
  • the user accesses the Lexicon screen and selects a keyword with which to associate the billing code.
  • the user searches for a billing code from this window.
  • the user enters the number of the code that is being searched for in the Number field.
  • the user enters a description of the code in the Description field.
  • FIG. 20 illustrates an interface 2000 with a list of billing codes from the ICD-9 (International Classification of Diseases) Diagnostic Code Set.
  • FIG. 21 illustrates an example of the Lexicon screen 2100 in the Post-Procedure tab.
  • the physician or other user uses the Lexicon function to record findings from performing a procedure. These report sections appear as tabs in the Lexicon screen 2100 labeled as: indications, procedure, findings, medication, unplanned events, recommendation, summary and codes.
  • FIG. 21 illustrates the Findings tab.
  • the user navigates to the Post-Procedure tab, selects an exam, and selects the Lexicon option from the far left hand menu.
  • the Lexicon interface for Post-Procedure is analogous to that for Pre-Procedure ( FIG. 14 ).
  • the Post-Procedure Lexicon screen may contain icons for tools as discussed above in connection with the Pre-Procedure lexicon function. Moreover, as with the Pre-Procedure case, a Lexicon menu for Post-Procedure can be configured as single-select, unique, or multi-select.
  • the selected keyword is copied to the left side of the screen or other separate display region. For example, assume the user finds a small polyp in the stomach of the patient during the endoscopic procedure, specifically on the anterior wall of the antrum. To document this finding, the user may perform the following steps:
  • the display region 1420 provides a tree where a parent node is for the keyword “organs” (not shown).
  • the first display region 1420 is updated to display a tree that is similar to the tree 2125 , except that the keywords are unexpanded.
  • the keywords in the tree 2125 , e.g., fundus site, body site, antrum site, other stomach site, etc., are child branches of the keyword “polyp”.
  • the system further allows the user to enter custom information. If the user does not find a keyword in the available KB terms, the user can specify a term using a keyword that begins with “specify” (such as “specify size” to enter a size, “specify finding” to enter a finding, etc.) to add a term. For example, if the user does not find the size that the user wants to specify for the “size of polyp”, the user can specify a different size. To specify a different size in millimeters (mm), for instance, the user can perform the following:
  • the Procedure Notes task may be accessed from the Post-Procedure tab of the interface to prepare Procedure Notes regarding a procedure performed on a patient.
  • a Procedure Note may include documented information about a specific exam. It can be used to document findings, diagnosis, medications, recommendations, and other information such as past diagnosis.
  • the Procedure Note function can be used to: (a) manage images, (b) view information such as images and Procedure Notes for other exams, (c) select terms from the Knowledge Base tree, (d) generate report text, (e) select billing codes, (f) sign a note, (g) generate different versions of a note, (h) discard a note, and (i) delete a note.
  • FIG. 22 illustrates a Select Procedure Note Template screen 2250 in the Post-Procedure screen 2200 .
  • the user does the following: (a) access the Post-Procedure screen 2200 , (b) select an exam, (c) select “Procedure Note” from the far left hand menu, causing the Select Procedure Note Template screen 2250 to be displayed, e.g., as a pop up window, (d) search for the Procedure Note template based on facility and/or physician names, (e) click Go, causing a list of resulting Procedure Note templates to be displayed, and (f) select a particular template to view the Procedure Note screen 2300 of FIG. 23 .
  • a template entitled “Standard—Bronch” is available to assist the user in generating a Procedure note for a bronchoscopy procedure.
  • a first display region 2364 may provide a tree 2365 of available KB keywords as discussed in connection with the first display region 1420 of the Lexicon screen 1400 ( FIG. 14 ).
  • a second display region 2362 may provide a tree 2363 of selected keywords as discussed in connection with the second display region 1410 of the Lexicon screen 1400 .
  • a third display region 2340 provides a Procedure Note Builder for editing and generating reports such as Procedure Notes.
  • a fourth display region 2320 provides an Image Strip section 2320 for managing images for the current exam. Example images for the current exam are displayed at the left side of the section 2320 as “image 1 ” and “image 2 ”.
  • the user can use the Images section 2320 in a similar manner as the Image Management screen.
  • the Images section 2320 may contain tools for the following functions: (a) delete selected images from the current exam, (b) delete all unlabeled images from the current exam, (c) mark selected images for printing, (d) unmark the selected image for printing, (e) view larger image, (f) label all selected images from the current exam, (g) delete the label from all selected images from the current exam, (h) associate findings, (i) disassociate findings, (j) show or hide menu and (k) show or hide strip.
  • the Procedure Note function can be used to view images and Procedure Notes for other exams associated with the selected patient.
  • the following tools may be used to view other exam information: (a) view large image, (b) view the Procedure Note for the other exam, and (c) close the other exam image window.
  • the user accesses the Procedure Note screen, and selects an exam from the Other Exam dropdown list 2322 . Images for the selected exam are displayed on the right side of the Images section 2320 .
  • to view Procedure Notes for another exam, the user selects the exam from the Other Exam dropdown list 2322 , and clicks the View the Procedure Note icon to display the Procedure Note for the other selected exam in a new window.
  • the Procedure Note function can also be used to select KB keywords for an examination.
  • the first display region 2364 is used to select keywords as discussed above in connection with the Lexicon function.
  • the Generate Report icon is activated if the user makes changes to the selected keywords.
  • the sentence is generated and populated with the keywords in the displayed report within the specific Report Section.
  • the third display region 2340 may include predefined sections corresponding to the tabs in the Lexicon screen 2100 of FIG. 21 , e.g., introduction, indications, procedure, findings, medication, unplanned events, recommendations, summary and codes.
  • the user may build a lexicon for one or more of the sections as discussed above. As the lexicons are developed, one or more sentence models are selected from a number of available sentence models. The selected sentence models are populated using the selected keywords and predefined or static text, as discussed further below.
  • a Use Organ Labels feature may be set to Yes for the exam type, in which case the sentences for findings are prefixed with the name of the organ. For example, assume there was a finding of “polyp” in “stomach”. Moreover, the polyp is further detailed by the user selecting the keyword “antrum site” and the branch keyword thereof “anterior wall of the antrum”, the keyword “polyp qualifier” and the branch keyword thereof “broad based”, and the keyword “size of polyp” and the branch keyword thereof “small”. These keywords are indicated in the tree 2363 in the second display region 2362 .
  • the sentence model in the report might appear as: “Stomach: A small, broad-based polyp was found arising from the anterior wall of the antrum. (2)”.
  • the notation “(2)” denotes a corresponding image in the Images section 2320 associated with the finding, e.g., image 2 .
  • the image number may be provided at the end of the finding.
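  • A minimal sketch of how such a findings sentence might be assembled from the selected keywords, the organ label, and the associated image number (the template string and function are illustrative assumptions, not the patent's actual format):

```python
# Hypothetical sketch of populating the example findings sentence above.

def populate_finding(template: str, keywords: dict, organ: str = None,
                     image_number: int = None) -> str:
    sentence = template.format(**keywords)
    if organ is not None:            # "Use Organ Labels" set to Yes for the exam type
        sentence = f"{organ}: {sentence}"
    if image_number is not None:     # associated image from the Images section
        sentence = f"{sentence} ({image_number})"
    return sentence


template = "A {size}, {qualifier} polyp was found arising from the {site}."
keywords = {"size": "small", "qualifier": "broad-based",
            "site": "anterior wall of the antrum"}

print(populate_finding(template, keywords, organ="Stomach", image_number=2))
# Stomach: A small, broad-based polyp was found arising from the anterior wall of the antrum. (2)
```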
  • a sentence model is populated to describe findings regarding a second organ, the esophagus.
  • the user has found food in the esophagus.
  • the food is characterized as being a trace amount, in a location described as the upper third of the esophagus, and the presence of the food is attributed to a motor disorder in the patient.
  • the sentence might appear as: “Esophagus: Trace of food was found in the upper third of the esophagus, due to motor disorder.”
  • a Recommendation may appear as the sentence: “Start a low fat diet” based, e.g., on one or more keywords selected by the user under the Recommendation tab.
  • Other example recommendations include “admit for observation”, “consult radiologist”, and so forth.
  • the sentence models advantageously allow the user to quickly generate reports using standard terminology and sentence structures.
  • the generated sentences can be easily edited by substituting one keyword in place of another.
  • the user may wish to change “The views were excellent” to “The views were good”. To achieve this, the user highlights the word “excellent” in the sentence.
  • the display region 2362 is updated to show the portion of the tree 2363 in which the selected keyword “excellent” is located. The keyword “excellent” may appear highlighted in the tree 2363 .
  • the display region 2364 is similarly updated to show the portion of the keyword tree 2365 in which the keyword “excellent” is located.
  • “excellent” will appear with other related keywords such as “poor”, “fair” and “good”, which are branches of the same parent node, such as “quality”.
  • the keyword “quality” may in turn be a branch of a parent node “views”, for instance.
  • the user selects “good” from the tree 2365 .
  • the tree 2363 is then updated with the keyword “good” replacing “excellent”.
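  • A minimal sketch of this substitution edit (assuming, for illustration only, that the selected keywords are held as property/value pairs and the sentence is regenerated from the same model):

```python
# Hypothetical sketch of the keyword-substitution edit ("excellent" -> "good").

def replace_keyword(selected: dict, old: str, new: str) -> dict:
    # Swap the highlighted keyword for the newly chosen one; everything else is unchanged.
    return {prop: (new if value == old else value) for prop, value in selected.items()}


template = "The views were {quality}."
selected = {"quality": "excellent"}

selected = replace_keyword(selected, "excellent", "good")
print(template.format(**selected))   # The views were good.
```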
  • a sentence can be edited by adding or deleting a keyword.
  • the user may wish to change “A small, broad-based polyp was found . . . ” to “A broad-based polyp was found . . . ”
  • the user highlights “small” in the sentence and clicks on a delete icon.
  • the user selects the additional keyword from the tree of available keywords 2365 .
  • the keyword is then copied to the tree of selected keywords 2363 and shown in the appropriate hierarchical position in the tree.
  • the user clicks the Generate Report icon to re-populate the sentence in the third display region 2340 .
  • the user may select the additional keyword “small” from the tree 2365 to change the sentence “A broad-based polyp was found . . . ” to “A small, broad-based polyp was found . . . ”
  • This keyword editing feature is very powerful since it allows the user to modify the report to correct erroneous entries, or to modify entries based on further findings or change in judgment, for instance.
  • a normal sentence model also typically incorporates all of the selected keywords associated with the report section.
  • a report can include both normal and summary sentences.
  • the Procedure Note function can further be used to select relevant billing codes for the exam. If a keyword has a billing code associated with it, the user would see all billing codes and descriptions associated with the keyword when clicking the Code icon. Relevant billing codes associated with a keyword can be selected for the current exam as discussed previously.
  • the user accesses the Procedure Note screen and clicks Save.
  • the user can sign a Procedure Note if the Procedure Note text is generated. In one approach, only attending physicians for the examination can sign the Procedure Note. If the Validate & Sign Procedure Note settings are set to Yes, the user will be asked to validate the user ID and password. Signing a note will lock the report from further editing. To sign a report, the user accesses the Procedure Note screen and clicks Sign. Signing a Procedure Note updates a patient's past procedure, past diagnosis, and past surgeries record.
  • the user saves the modified Procedure Note as a template for generating future Procedure Notes.
  • the user accesses the Procedure Note screen, clicks Save As, assigns a name to the template, selects either a facility or physician name to assign an owner to the template, and clicks Save.
  • once a Procedure Note is signed, it is locked and cannot be edited. To make any changes to an existing, signed Procedure Note, the user can create a new version of the Procedure Note.
  • the new version of the Procedure Note is an exact copy of the current signed Procedure Note, without regenerating any sentences or updating any database fields. To generate a new version, the user accesses the Procedure Note screen and clicks New.
  • Discarded Procedure Notes are stored in a Discard Bin, where they can be viewed but not restored.
  • the user accesses the Procedure Note screen, makes sure the Procedure Note is signed, clicks Discard, and clicks Yes.
  • a system administrator is responsible for maintaining the Knowledge Base by adding new keywords, sentence models, and menu structures.
  • Report Sections are used to generate Procedure Notes for a specific exam type.
  • the user can choose to include a few or all of the report sections in the Procedure Note.
  • the user can also assign these report sections to a phase of care and then use them in the Lexicon screen as tabs to record data.
  • the user can create, modify, and delete report sections from the Report Section List screen. Examples of report sections include: introduction, indications, procedure, findings, medication, unplanned events, recommendation, summary, and billing codes.
  • a Phase of Care function can be used to assign specific report sections to a phase of care, namely Registration, Pre-Procedure, Procedure, and Post-Procedure.
  • the user uses the Report Template function.
  • the user navigates to the Admin tab, and selects Customization from the left menu. Available customization options are displayed.
  • the user selects Report Template from the left menu, and the Select Procedure Note Template screen is displayed ( FIG. 24 ).
  • the user accesses the Select Procedure Note Template screen, selects an exam type, facility, or physician, and clicks Go.
  • a list of Procedure note templates is displayed, based on the search criteria.
  • the KB provides a controlled vocabulary for reporting results, e.g., of medical examinations and procedures. Discussed below are the three structural layers of the KB, an overview of the use of the KB by the end user while creating Procedure Notes, the use of selected terms for query purposes, and an interface to maintain the KB.
  • the KB includes three main layers: Concept Layer, Data Layer and View Layer.
  • the Concept Layer represents a dictionary from which individual words can be selected to build a more complicated grammar.
  • the Data Layer represents the data that describes the Concepts in greater detail.
  • the View Layer organizes the Keywords into groups of terms in a tree structure connected by menus. The view is the primary way for the user to navigate through the knowledge base to select the appropriate medical terms for a given examination type.
  • class diagrams are provided according to the Unified Modeling Language (UML).
  • class diagrams describe the static structure of a system. Classes represent an abstraction of entities with common characteristics. Classes are illustrated with rectangles divided into portions. The name of the class is in the top portion, and the attributes of the class are in the middle portion. Operations may be provided in the bottom portion.
  • associations represent the relationships between classes. Multiplicity or cardinality notations are indicated near the ends of an association. These symbols indicate the number of instances of one class linked to one instance of the other class, as follows: (a) “1” denotes no more than one, (b) “0 . . . 1” denotes zero or one, (c) “*” denotes many, (d) “0 . . . *” denotes zero or many, and (e) “1 . . . *” denotes one or many.
  • a filled diamond represents a composition relationship, denoting a strong ownership between a “whole” class and a “part” class.
  • a hollow diamond represents a simple aggregation relationship, in which the whole class plays a more important role than the part class, but the two classes are not dependent on each other. The diamond end in both a composition and aggregation relationship points toward the whole class or the aggregate.
  • FIG. 25 illustrates a conceptual class diagram 2500 of the Concept Layer.
  • the classes are “Concept” and “LUI” (Lexical Unique Identifier). There are one or many instances of “LUI” linked to one instance of “Concept”. Also, there is a composition relationship between “Concept” and “LUI”.
  • the Concept Layer defines the concepts that can exist in the KB. Concepts are given a textual representation and classified according to the dictionary definition of the concept within this layer. The Concept Layer acts much like a dictionary, defining meanings of concepts, whereas the rest of the knowledge base is more like an encyclopedia, defining concept usage.
  • Concepts are unique objects within the KB that represent the data that does not change regardless of the context in which the data is used.
  • the word “mass” is defined as a unified body of matter with no specific shape. It may also be defined as a large but nonspecific amount or number. These would be two separate concepts within the KB because they have different meanings. However, a mass within the esophagus and a mass within the colon would be the same concept because they are both a unified body of matter with no specific shape. As discussed further below, this approach advantageously allows accurate querying of indications, findings and other data obtained via the system, which distinguishes when the same word is used in different contexts. Concepts have no inherent relationship to any other concepts within the KB.
  • LUIs represent a specific version of text that represents a Concept.
  • a new LUI is created so that the Concept can still be referred to by its old name.
  • a user of the KB may choose to change the name of the Concept “mass” to “tumor” due to personal preference.
  • all references to that Concept will have the text of “tumor” unless the specific LUI for “mass” is used.
  • LUIs are used to “lock” the text of a signed Procedure Note.
  • the application traverses the tree of selected terms and stores the LUIs of the concepts at the time of signing.
  • the selected items refer to the text that was present at the time of signing until that item is modified, at which time the newest LUI is used.
  • the LUI infrastructure could also be extended to accommodate other features where the textual representation of a concept needs to change without the underlying meaning being affected, e.g., for internationalization and synonym support.
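  • A minimal sketch of the Concept/LUI idea (the class and method names are assumptions): each rename adds a new LUI, and a signed note keeps pointing at the LUI recorded at signing time:

```python
# Hypothetical sketch of Concepts with multiple LUIs (textual versions).

class Concept:
    def __init__(self, text: str):
        self.luis = [text]                 # LUI 0: the original textual representation

    def rename(self, new_text: str) -> None:
        self.luis.append(new_text)         # a new LUI; older LUIs remain addressable

    def current_lui(self) -> int:
        return len(self.luis) - 1

    def text_for(self, lui: int) -> str:
        return self.luis[lui]


mass = Concept("mass")
note_lui = mass.current_lui()              # a Procedure Note signed while the text is "mass"

mass.rename("tumor")                       # the user later prefers "tumor"
print(mass.text_for(mass.current_lui()))   # tumor  (new references use the newest LUI)
print(mass.text_for(note_lui))             # mass   (the signed note's text stays locked)
```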
  • FIG. 26 illustrates a conceptual class diagram 2600 of the Data Layer.
  • the Data Layer represents all the knowledge that is contained within the KB.
  • the Data Layer allows the application to understand what a concept means in the context of the KB.
  • the Data Layer can be thought of as an encyclopedia of knowledge as opposed to the Concept Layer, which is a dictionary. For instance, while the dictionary definition of "mass" might be quite simple, an encyclopedia entry for mass could talk about all the different causes for, and descriptions of, a mass/tumor.
  • Keywords are Concepts that have a defined set of properties. Unlike Concepts, Keywords are linked with other Keywords to define a specific use of a Concept.
  • sample data for a Concept called Mass may include a classification as an Entity/ConceptualEntity/PatientProblem/Finding/ImagingFinding/ProtrudingLesions.
  • the use of the "/" in this notation denotes that the item following the "/" is a subclass of the item preceding the "/".
  • Conceptual Entity is a subclass of Entity
  • PatientProblem is a subclass of Conceptual Entity, and so forth.
  • Classifications represent the semantics of Concepts. Through Classifications, Concepts are given a meaning. Classifications are constructed in a hierarchy much like a taxonomic classification of organisms. Taxonomy is the scientific discipline of categorizing various species of organisms into conveniently sized groups, referred to as taxa, which share common, identifiable traits.
  • the root of the hierarchy is a classification called Entity, from which all other classifications are derived.
  • the hierarchy is structured so that more specific classifications are lower in the hierarchy. For example, under the root classification of Entity there are two classifications: Physical Object and Conceptual Entity. If a Concept is classified as a Physical Object, like a chair for example, we know that the Concept exists in the physical world and could, for example, be measured.
  • Properties define the set of allowable values for a group of related keywords.
  • a keyword can have any number of properties that are made up of a homogenous list of keywords that all share the same classification or inherit from the same classification.
  • the property Location on Mass has a classification of BodyLocation; this means that any keyword that is also classified as BodyLocation or something derived from BodyLocation, such as Organ, can be assigned as an allowable value to the property.
  • Example properties for the keyword "Mass", each listed as Property (Classification): allowable Values:
    Location (Entity/PhysicalObject/AnatomicalStructure/AnatomicalForm/BodyLocation): esophagus, stomach, duodenum, distance from entry
    Appearance (Entity/ConceptualEntity/FindingModifier/VisualAppearance): nodular, ulcerated, friable, firm, frond-like/villous, fungating, infiltrative, polypoid, submucosal, smooth
    Bleeding (Entity/ConceptualEntity/FindingModifier/Bleeding): oozing, spurting, not bleeding, bleeding on contact
    Circumferential (Entity/ConceptualEntity/QuantitativeConcept/Size): <25%, 25-49%, 50-74%, 75-99%, specify
    Narrowing (Entity/ConceptualEntity/QuantitativeConcept): extrinsic, intrinsic, uncertain, nodular, friable, firm, ulcerated, infiltrative
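  • As a rough, hypothetical sketch (the class names and the path-prefix test are assumptions, not the patent's implementation), the classification paths and property value lists shown above can be modeled so that a keyword is an allowable value for a property only if its classification is the property's classification or a subclass of it:

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Keyword:
    name: str
    classification: str            # e.g. "Entity/PhysicalObject/.../BodyLocation/Organ"
    properties: Dict[str, "Property"] = field(default_factory=dict)


@dataclass
class Property:
    name: str
    classification: str            # allowable values must be this class or derive from it
    values: List[Keyword] = field(default_factory=list)

    def allows(self, candidate: Keyword) -> bool:
        # A value is allowable if its classification path starts with the
        # property's classification, i.e. it is that class or a subclass of it.
        return candidate.classification.startswith(self.classification)

    def add_value(self, candidate: Keyword) -> None:
        if not self.allows(candidate):
            raise ValueError(f"{candidate.name} is not a {self.classification}")
        self.values.append(candidate)


location = Property(
    "Location",
    "Entity/PhysicalObject/AnatomicalStructure/AnatomicalForm/BodyLocation")
esophagus = Keyword(
    "esophagus",
    "Entity/PhysicalObject/AnatomicalStructure/AnatomicalForm/BodyLocation/Organ")
location.add_value(esophagus)      # accepted: Organ derives from BodyLocation

mass = Keyword(
    "Mass",
    "Entity/ConceptualEntity/PatientProblem/Finding/ImagingFinding/ProtrudingLesions",
    properties={"Location": location})
```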
  • FIG. 27 illustrates a conceptual class diagram 2700 of the Sentence Model infrastructure.
  • Sentences define the prose that will be generated for a given set of keyword selections.
  • a sentence model may include up to four different types of placeholders. Text placeholders are used to add static text to the sentence model. Node-placeholders are used as a placeholder for any value of the property specified in the placeholder. Conditional-placeholders are used to generate text that depends on the value or values of a property or properties, respectively, in the sentence model. Trigger-placeholders are used to put the keyword that this sentence model is associated with into the sentence.
  • a keyword may be associated with a summary sentence model and/or a normal sentence model. The summary sentence model is used specifically in the summary report section.
  • the trigger-placeholder “TUMOR/MASS” indicates that the particular sentence model will be populated and displayed on the user interface, such as on the Procedure Note Builder display region 2340 ( FIG. 23 ), when the user selects the keyword “TUMOR/MASS” from the available keywords and clicks the Generate Report icon.
  • the node placeholders [Size] and [Mass Appearance] will be populated by the respective keywords selected by the user for those concepts. For example, [Size] and [Mass Appearance] may be populated by the keywords “large” and “rounded”, respectively.
  • conditional statements are used in the sentence model to adjust the grammar of the sentence depending on the keywords selected by the user to characterize the finding.
  • if the site is described as a distance in cm from the teeth, e.g., 10 cm, the sentence will state: "There was a large rounded tumor/mass present at 10 cm from the teeth". If the site is described by something other than a distance in cm from the teeth, e.g., the site is described as being the esophagus, the sentence will state: "There was a large rounded tumor/mass present in the esophagus".
  • the sentence models thus account for the different ways in which information can be provided by the user for the same findings, for instance.
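  • The same mechanics apply to the medication example that follows. As a purely illustrative sketch (the placeholder syntax, function names, and property names are invented, not the patent's), the four placeholder types can be rendered as small functions that map the user's selections to text fragments, reproducing the two tumor/mass sentences discussed above:

```python
from typing import Callable, Dict, List, Optional

Selections = Dict[str, str]                   # property name -> selected keyword/value
Placeholder = Callable[[Selections], Optional[str]]


def text(fragment: str) -> Placeholder:
    return lambda sel: fragment               # text placeholder: static text


def node(prop: str) -> Placeholder:
    return lambda sel: sel.get(prop)          # node placeholder: any value of the property


def trigger(keyword: str) -> Placeholder:
    return lambda sel: keyword.lower()        # trigger placeholder: the triggering keyword


def conditional(prop: str, then: Callable[[str], str], otherwise: str = "") -> Placeholder:
    # Conditional placeholder: text that depends on whether/how a property was selected.
    return lambda sel: then(sel[prop]) if prop in sel else otherwise


def render(model: List[Placeholder], selections: Selections) -> str:
    parts = [p(selections) for p in model]
    return " ".join(part for part in parts if part) + "."


tumor_mass_model = [
    text("There was a"), node("Size"), node("Mass Appearance"),
    trigger("TUMOR/MASS"), text("present"),
    conditional("Distance from teeth", lambda cm: f"at {cm} from the teeth"),
    conditional("Site", lambda site: f"in the {site}"),
]

print(render(tumor_mass_model, {"Size": "large", "Mass Appearance": "rounded",
                                "Distance from teeth": "10 cm"}))
# There was a large rounded tumor/mass present at 10 cm from the teeth.
print(render(tumor_mass_model, {"Size": "large", "Mass Appearance": "rounded",
                                "Site": "esophagus"}))
# There was a large rounded tumor/mass present in the esophagus.
```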
  • the trigger-placeholder “CIPROFLOXACIN” indicates that the particular sentence model will be populated and displayed on the user interface, such as on the Procedure Note Builder screen region 2340 ( FIG. 23 ), when the user selects the keyword “CIPROFLOXACIN” from the available keywords and clicks the Generate Report icon.
  • the node placeholders [dose] and [route] will be populated by the respective keywords selected by the user for those concepts. For example, [dose] and [route] may be populated by the keywords "2 mg" and "IV", respectively.
  • conditional statements are used to adjust the grammar of the sentence depending on the keywords selected by the user to characterize the finding. For example, if the “frequency” is described by 12-hour period (variable “q12 h”), the sentence will state: “Start CIPROFLOXACIN IV 2 mg every twelve (12) hours.”
  • Triggers are used to signal the application that some action must be taken when the triggering keyword is selected. There are two forms of triggers that can be assigned to a keyword: Education and Recall. An education trigger is used to signal the system that a particular document should be queued for printing when the keyword is selected. A recall trigger is used to generate an item in a recall queue for a patient when the keyword is selected.
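  • A minimal sketch of such trigger dispatch (hypothetical names; the document and recall values are placeholders, not the patent's implementation) might look like this:

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class TriggeredKeyword:
    name: str
    education_document: Optional[str] = None   # education trigger: document to queue for printing
    recall_days: Optional[int] = None          # recall trigger: days until follow-up


@dataclass
class ExamSession:
    print_queue: List[str] = field(default_factory=list)
    recall_queue: List[str] = field(default_factory=list)

    def on_keyword_selected(self, patient_id: str, kw: TriggeredKeyword) -> None:
        if kw.education_document:              # education trigger
            self.print_queue.append(kw.education_document)
        if kw.recall_days:                     # recall trigger
            self.recall_queue.append(f"{patient_id}: recall in {kw.recall_days} days")


session = ExamSession()
session.on_keyword_selected(
    "patient-001",
    TriggeredKeyword("polyp", education_document="polyp aftercare handout", recall_days=365))
```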
  • Codes are used to associate external codes to a keyword in the KB, primarily for billing. Any number of Codes can be assigned to a keyword. Codes are assigned to a Keyword from a code set, which is the universe of all possible codes for a given code set type.
  • FIG. 28 illustrates a class diagram 2800 of the View Layer.
  • Items are Keywords that appear within a View. Items are grouped into menus when they are inserted into a view. Menus are used to group related items together within a view. Shortcuts are stored selections that will be expanded when selected. Shortcuts are named entities that have their own menu structure. A shortcut menu appears under the parent of the highest-level menu for the shortcut.
  • references are stored in an ordered tree based on the order in which the terms are selected from the view.
  • the reference data associated with the examination will be open to queries.
  • the invention allows advanced data mining techniques from a database of patient information.
  • Other techniques that simply search a database by keyword yield less accurate results. For instance, a search for the word “mass” in a database with such techniques might yield a finding stating that “no mass was found” as well as other tangential or irrelevant results.
  • the KB can be tailored to the user's specific needs.
  • a complete baseline KB is defined by the developer of the system to enable users such as physicians to immediately use the system. The users can subsequently fine-tune the system to meet their specific needs after gaining experience with the system by employing the following maintenance features.
  • FIG. 29 illustrates a user interface for adding a keyword to the KB.
  • the KB maintenance user interface 2900 may include a number of tools, including: add menu/term, edit menu/term, add concept, edit concept, add property definition, edit property, remove, copy, paste, move up, move down, create shortcut, edit sentence, edit codes and triggers.
  • a classification tree 2922 indicates that the classification of the concept is under “organ”. The user can navigate the tree 2922 by clicking on the desired classification. The higher-level classifications in the tree are also shown, up to the top-level classification “Entity”. The user can also enter a plural term for the concept, and a description of the concept, via the window 2920 . A classification description is also provided in a grayed out manner to indicate that it cannot be edited. In the display region 2905 of the interface 2900 , the classification tree is repeated.
  • a display region 2910 indicates that “esophagus” and “duodenum” are concepts that are related to “stomach” since they are all members of the classification “organ”.
  • the user can check a “specify” checkbox 2925 so that the user is prompted to enter a value to replace the keyword when the report is generated.
  • the data type specified by the drop down menu 2930 determines whether the user is allowed to enter a text or numeric value.
  • the keyword “stomach” is also defined for a certain exam type, e.g., EGD, and for a report section, e.g., Findings. The user can also adjust these factors.
  • FIG. 30 illustrates a user interface for adding a property definition to the knowledge base.
  • the user interface 3000 includes a display region 3005 indicating that the property definition is for the classification of “body location”.
  • FIG. 31 illustrates a user interface 3100 for editing a keyword in the knowledge base.
  • a tree 3120 indicates that the user has selected the keyword “ulcer”.
  • the concept “ulcer” is presented in a grayed out manner to indicate it cannot be edited.
  • the sentence name can be edited if desired. If the keyword is a “specify” type item that prompts the user for a value, the check box 3125 will be checked, and the user will be allowed to determine its data type with the drop down menu 3130 .
  • FIG. 32 illustrates a user interface 3200 for editing properties of a keyword in the knowledge base.
  • the keyword tree 3120 is the same as shown in FIG. 31 .
  • a display region 3210 allows the user to edit the properties of a keyword, such as “size”.
  • the classification of the keyword is identified as a quantitative concept.
  • the user selects one or more keywords for the quantitative concept from a pop up window 3230 .
  • the selected keywords are then copied to a display region 3220 as available values for detailing the property of “size”.
  • the user may check a checkbox 3225 to indicate that a value for size is required to be entered by the user, e.g., when reporting the findings of a procedure.
  • FIG. 33 illustrates a user interface 3300 for assigning properties to a keyword in the knowledge base.
  • the user selects a classification from the Select Class tree.
  • a display region 3340 indicates the properties available in the selected class, and a user may choose a property to assign to the keyword.
  • the user may create a new property by assigning it to a classification.
  • FIG. 34 illustrates a user interface 3400 for adding and editing codes in the knowledge base.
  • the keywords are provided in the tree 3410 .
  • a pop up window 3430 allows the user to select codes such as for billing. For example, the user may choose a code set from a drop down menu, and one or more individual codes.
  • a display region 3420 displays the selected codes.
  • FIG. 35 illustrates a user interface 3500 for adding and editing sentence models in the knowledge base.
  • the tree 3410 is the same as in FIG. 34.
  • the keywords in the tree 3410 that trigger a sentence model may be highlighted, e.g., in bold font.
  • a display region 3520 provides the sentence model for the normal or full sentence, while the display region 3530 provides the summary sentence model.
  • the placeholder terms in brackets may be highlighted in yellow, for instance.
  • the pop up window 3540 indicates the available nodes that the user may choose to insert into the sentence model.
  • FIG. 36 illustrates a user interface 3600 for adding a condition in a sentence model in the knowledge base.
  • the user may desire to add a conditional placeholder to the sentence model in the display region 3520 .
  • the user positions the cursor to a location in the sentence model in which the conditional placeholder is to be added, and clicks on the “conditional” icon, causing the window 3630 to pop up.
  • the window 3630 allows the user to define a condition, operator, and values.
  • the user also defines the text that is to be entered in the sentence model depending on whether or not the condition is met.
  • the user can click on an icon 3635 to cause a window 3640 to pop up that displays values from which to select.
  • the sentence model shown in the display region 3620 is simplified.
  • detailed sentence models contain several conditional statements to account, e.g., for variations in the way a finding can be described, the level of detail, grammatical concerns and so forth.
  • FIG. 37 illustrates a user interface 3700 for adding and editing triggers in the knowledge base.
  • selected keywords may trigger an action such as printing a document or scheduling a patient recall.
  • the user may select the keyword “polyp” from the keyword tree 3705 .
  • the user checks a check box 3720 to set an education trigger.
  • a document is selected from a drop down menu 3725 to identify a relevant document to provide to the patient.
  • the document may be printed and handed to the patient during the patient's examination, mailed to the patient's home, or emailed to the patient, for instance.
  • Another checkbox sets a recall reminder for a given number of days, weeks or months, based on a second drop down list. A follow-up examination may be scheduled based on the recall.
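  • Converting the recall amount and unit into a concrete follow-up date is straightforward; in the sketch below the month handling is an assumption (a month is approximated as 30 days), and the function name is invented:

```python
from datetime import date, timedelta


def recall_date(start: date, amount: int, unit: str) -> date:
    """Return the follow-up date for a recall set in days, weeks or months."""
    if unit == "days":
        return start + timedelta(days=amount)
    if unit == "weeks":
        return start + timedelta(weeks=amount)
    if unit == "months":
        # Approximation: a month is treated as 30 days for scheduling purposes.
        return start + timedelta(days=30 * amount)
    raise ValueError(f"unknown unit: {unit}")


print(recall_date(date(2004, 5, 16), 6, "months"))   # 2004-11-12
```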
  • FIG. 38 illustrates a user interface 3800 for adding a keyword item to a menu in a view.
  • the user selects a menu such as “colon findings” in the tree 3810 , then selects from a menu of available terms in a list 3820 , after selecting a classification in region 3830 to display a list of available keywords in the selected classification.
  • FIG. 39 illustrates a user interface 3900 for adding a menu in the knowledge base.
  • the user selects a keyword, e.g., “normal”, from the tree 3910 , for which the menu is to be provided.
  • the user desires to add a menu for “rectal sites” under “normal” in the tree 3910 .
  • the user types in the menu name in a display region 3920 .
  • Other factors, such as menu type can also be defined.
  • FIG. 40 illustrates a user interface 4000 for creating a shortcut in the knowledge base.
  • the user can select multiple keywords at a time, which is desirable when the same group of keywords is selected over and over, e.g., for different patients.
  • the user provides pre text 4025 for the report, which appears prior to the selected keywords, and post text 4030 , which appears after the selected keywords.
  • a sentence model is populated to provide a preview 4035 of the resulting sentence.
  • FIG. 41 illustrates a user interface 4100 for editing a shortcut in the knowledge base.
  • the shortcut named “Std Meds” was created using the interface 4000 of FIG. 40 .
  • the keyword “Shortcuts” appears in the tree 4110 along with the specific shortcut “Std Meds”. The user can quickly generate the report in the display region 4120 by selecting the keyword “Std Meds”.
  • the user may automatically generate a report from selected keywords by clicking on the Generate Report icon.
  • a grammar checking routine or grammar engine may be used to correct or optimize the grammar in the populated sentences.
  • the grammar engine may be run automatically when a sentence is populated.
  • Various grammar engines, including those known in the art, may be used.
  • a grammar engine can be helpful for various reasons, such as ensuring that the verb and subject of a sentence agree, providing correct punctuation and capitalization, and ensuring that singular and plural nouns are properly modified.
  • a grammar engine includes three main components. First, a part-of-speech tagger assigns a part of speech (POS) tag to each word or word component (e.g., noun or verb) in the generated sentence. Second, a lexical analyzer is run to break the sentence into grammatical components, phrases, and clauses. Third, a correction component updates the sentence with the necessary corrections.
  • Each keyword in the KB may be assigned a primary tag by default that identifies the most likely grammatical characteristic of the keyword in the domain in which it is used. Tags for other allowable grammatical characteristics may also be provided. Tags may also be provided for the words in a sentence that are not keywords, such as static text.
  • Example grammatical characteristics identify a word as being, e.g., an adjective, singular noun, plural noun, adverb and so forth.
  • the tag may be a two-letter codeword, for instance.
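  • To make the three-stage pipeline concrete, the sketch below uses an invented tag dictionary and a single hand-written correction rule (subject/verb agreement for "was"/"were"); it is not the grammar engine of the patent, only an illustration of tagging followed by correction:

```python
from typing import List, Tuple

# Invented primary-tag dictionary; in practice each KB keyword carries its own tags.
PRIMARY_TAGS = {
    "polyp": "NN",      # singular noun
    "polyps": "NNS",    # plural noun
    "there": "EX",
    "was": "VB",
    "were": "VB",
    "found": "VB",
}


def tag(sentence: str) -> List[Tuple[str, str]]:
    # Stage 1: assign each word its primary tag ("XX" for untagged static text).
    return [(w, PRIMARY_TAGS.get(w.lower(), "XX")) for w in sentence.split()]


def correct(sentence: str) -> str:
    # Stages 2 and 3, collapsed: scan the tagged words and fix was/were
    # agreement with the noun that immediately follows.
    tagged = tag(sentence)
    out = []
    for i, (word, _) in enumerate(tagged):
        next_tag = tagged[i + 1][1] if i + 1 < len(tagged) else None
        if word.lower() == "was" and next_tag == "NNS":
            word = "were"
        elif word.lower() == "were" and next_tag == "NN":
            word = "was"
        out.append(word)
    return " ".join(out)


print(correct("There was polyps found"))   # There were polyps found
```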
  • FIG. 42 illustrates user interfaces for a grammar engine.
  • user interfaces may be provided that allow a user to set grammatical characteristics, and view existing characteristics, for keywords or groups of keywords in the KB for use by a grammar engine. This allows the user to tailor the system to the user's preferences and needs, as well as to gain a better understanding of the operation of the grammar engine.
  • a first interface 4200 allows the user to search a dictionary of terms. “Dilated bile duct” is an example. The interface indicates that the term has been assigned to the grammatical characteristic or category of NN, denoting a singular noun. The user then clicks on “edit” to edit the dictionary entry definition.
  • a user interface 4220 provides the name of the entry and the base word, e.g., “duct”.
  • the user can use check boxes, drop down menus and other widgets to set the grammatical characteristics of the entry. For example, the part of speech can be changed by clicking on the “edit” button, thereby causing the user interface 4240 to appear.
  • the interface 4240 provides a list of available grammatical characteristics on the left hand display region 4242 and the one or more assigned characteristics on the right hand display region 4244 .
  • the user can assign an available characteristic by clicking on the characteristic in the display region 4242 and clicking on the right-pointing arrow.
  • the user can delete an assigned characteristic by clicking on the characteristic in the display region 4244 and clicking on the left-pointing arrow.
  • the user designates one of the assigned characteristics as the primary characteristic by selecting it and clicking on the "primary" button in the display region 4244.
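  • The assign/remove/primary interactions above amount to maintaining two lists and a primary flag; a small illustrative sketch (names and tag values invented):

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class EntryCharacteristics:
    available: List[str] = field(default_factory=lambda: ["NN", "NNS", "JJ", "RB"])
    assigned: List[str] = field(default_factory=list)
    primary: Optional[str] = None

    def assign(self, tag: str) -> None:
        # Right-pointing arrow: move a characteristic into the assigned list.
        if tag in self.available and tag not in self.assigned:
            self.assigned.append(tag)
            self.primary = self.primary or tag   # first assignment becomes primary

    def remove(self, tag: str) -> None:
        # Left-pointing arrow: remove an assigned characteristic.
        if tag in self.assigned:
            self.assigned.remove(tag)
            if self.primary == tag:
                self.primary = self.assigned[0] if self.assigned else None

    def set_primary(self, tag: str) -> None:
        # "Primary" button: designate one assigned characteristic as primary.
        if tag in self.assigned:
            self.primary = tag


entry = EntryCharacteristics()
entry.assign("NN")
entry.assign("NNS")
entry.set_primary("NNS")
print(entry.assigned, entry.primary)   # ['NN', 'NNS'] NNS
```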

Abstract

A method and system for assisting a user in generating a report regarding a procedure such as a medical procedure. A first display region provides a hierarchical menu of available keywords from a knowledge base. A second display region provides a hierarchical menu of particular keywords that have been selected by the user. A third display region provides a report from a sentence that was generated by populating a sentence model based on the selected keywords. The sentence can be edited by selecting a keyword from the sentence, then selecting a replacement keyword from the first display region. A grammar engine corrects the grammar of the sentence based on user settings for the keywords.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. provisional patent application No. 60/471,349, filed May 16, 2003, entitled “System And Method For Endoscope Management” (docket No. P16531), and incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The invention relates generally to a system and method for generating a report using a knowledge base and, more specifically, to generating a report using sentence models that are automatically populated with keywords selected by a user. The invention is illustrated in the context of a medical procedure such as an endoscopic examination.
  • 2. Description of the Related Art
  • Computerized word processing applications have gained widespread use since they allow a user to prepare and manipulate documents in electronic form. Moreover, various applications have been developed to assist users in specific industries in developing reports and other documents. For example, some voice-activated systems use a dictionary developed from scanning published reports to choose a term that most closely matches a spoken word. The spoken words of the user are then converted to text to produce a report. Other systems allow the user to enter information for a report via various prompts and menus. The system then generates a report based on the entered information.
  • However, such approaches are problematic since the terms are not presented in a uniform sentence structure. Moreover, the terms are not classified in a way that improves understanding and data mining capabilities. Similarly, various other features and benefits have also been lacking in the known systems.
  • BRIEF SUMMARY OF THE INVENTION
  • To overcome these and other deficiencies in the prior art, the present invention provides a system and method for generating a report using a knowledge base.
  • In a particular aspect of the invention, a method is provided for generating a report regarding a procedure. The method includes: (a) displaying available keywords from a knowledge base on a first display region of a user interface, (b) receiving, via the user interface, at least one user command selecting at least one of the available keywords from the first display region, (c) displaying the at least one of the available keywords on a second display region of the user interface, responsive to the selection thereof by the at least one user command, (d) populating a sentence model according to the at least one of the available keywords to provide a populated sentence, and (e) displaying the populated sentence on a third display region of the user interface.
  • In another aspect of the invention, a method for providing keywords for generating a report regarding a procedure includes: (a) providing respective keywords for use in the report, (b) associating each of the respective keywords with a respective classification in a hierarchically arranged tree structure of classifications, (c) associating respective properties with the respective keywords, and (d) defining, based on the respective properties, a set of allowable values for a group of the respective keywords, where the group of the respective keywords are related.
  • A related user interface and program storage device are also provided.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and other features, benefits and advantages of the present invention will become apparent by reference to the following text and figures, with like reference numbers referring to like structures across the views, wherein:
  • FIG. 1 illustrates an overview of a report-generating system according to the invention;
  • FIG. 2 illustrates a Registration and Scheduling clinical flow;
  • FIG. 3 illustrates a Pre-Procedure clinical flow;
  • FIG. 4 illustrates a Procedure clinical flow;
  • FIG. 5 illustrates a Post-Procedure clinical flow;
  • FIG. 6 illustrates a Home tab of a user interface;
  • FIG. 7 illustrates a Patient File tab of a user interface;
  • FIG. 8 illustrates a Registration tab of a user interface;
  • FIG. 9 illustrates a Pre Procedure tab of a user interface;
  • FIG. 10 illustrates a Procedure tab of a user interface;
  • FIG. 11 illustrates a Post-Procedure tab of a user interface;
  • FIG. 12 illustrates an Analysis tab of a user interface;
  • FIG. 13 illustrates an Admin tab of a user interface;
  • FIG. 14 illustrates a Pre-Procedure Lexicon interface;
  • FIG. 15 illustrates a Lexicon user interface for selecting indications of GI symptoms for a patient;
  • FIG. 16 illustrates a Lexicon interface that shows the selected keyword “heartburn”;
  • FIG. 17 illustrates a Lexicon interface that shows a keyword "severity" in expanded form;
  • FIG. 18 illustrates a Lexicon interface that shows a selected keyword "severe";
  • FIG. 19 illustrates a Lexicon interface that shows a user input value “6” as a selected keyword;
  • FIG. 20 illustrates an interface with a list of billing codes;
  • FIG. 21 illustrates a Lexicon interface in the Post-Procedure tab;
  • FIG. 22 illustrates an interface for selecting a Procedure Note Template under the Post-Procedure tab;
  • FIG. 23 illustrates a Procedure Note interface;
  • FIG. 24 illustrates an interface for selecting a Procedure Note Template under the Admin tab;
  • FIG. 25 illustrates a class diagram of the Concept Layer of the knowledge base;
  • FIG. 26 illustrates a class diagram of the Data Layer of the knowledge base;
  • FIG. 27 illustrates a class diagram of the Sentence Model infrastructure of the knowledge base;
  • FIG. 28 illustrates a class diagram of the View Layer of the knowledge base;
  • FIG. 29 illustrates a user interface for adding a keyword to the knowledge base;
  • FIG. 30 illustrates a user interface for adding a property definition to the knowledge base;
  • FIG. 31 illustrates a user interface for editing a keyword in the knowledge base;
  • FIG. 32 illustrates a user interface for editing properties of a keyword in the knowledge base;
  • FIG. 33 illustrates a user interface for adding properties of a keyword in the knowledge base;
  • FIG. 34 illustrates a user interface for adding and editing codes in the knowledge base;
  • FIG. 35 illustrates a user interface for adding and editing sentence models in the knowledge base;
  • FIG. 36 illustrates a user interface for adding a condition in a sentence model in the knowledge base;
  • FIG. 37 illustrates a user interface for adding and editing triggers in the knowledge base;
  • FIG. 38 illustrates a user interface for adding a keyword item to a menu in a view;
  • FIG. 39 illustrates a user interface for adding a menu in the knowledge base;
  • FIG. 40 illustrates a user interface for creating a shortcut in the knowledge base;
  • FIG. 41 illustrates a user interface for editing a shortcut in the knowledge base; and
  • FIG. 42 illustrates user interfaces for a grammar engine.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Overview of the System
  • FIG. 1 illustrates an overview of a report-generating system according to the invention. In one possible aspect, the invention involves a web-browser based clinical information management system that automates a medical lab such as an endoscopy lab by managing patient examination data at different phases of patient care, including the capture of images, data and written Procedure Notes, and further, the generation of medical records and procedure reports. The system may include an endoscopic workstation 110, a Mavigraph printer 112, RGB monitor 114 and processor 116. The user provides inputs to the workstation 110 via keyboard, mouse, voice interface, or the like. The workstation may be coupled with a web browser interface that provides the necessary information to perform exams, and facilitates for users of endoscopic equipment, e.g., physicians, nurses or clinicians, the efficient capture, management, organization and presentation of endoscopic images and patient and examination data.
  • The workflow processes associated with this aspect of the system are flexible enough to support small endoscopic practices in addition to endoscopic departments within large healthcare institutions.
  • The system may function as a stand-alone system including memory for storing patient data and image information. The system may also include a server 140 and database element 145 that may be connected via a gateway application to various “external” systems such as a hospital information system where the gateway facilitates the transfer of healthcare information between the system and other applications. Patient information stored in the system may be downloaded to external systems (e.g., a legacy system) via a gateway interface. The workstation 110 may communicate with the server 140 via the Internet 170 or other network, such as a LAN or intranet. The workstation 110 may also communicate with a fax server 160, for instance, for faxing reports via a fax modem 162. Generally, software instructions, including firmware and microcode, may be stored in any type of program storage device or devices, also referred to as computer-readable media. The software is executed by a processor in a known manner to achieve the functionality described herein.
  • In a particular aspect, the system includes an Image Management function enabling a user to annotate, label, import, export, and enhance the quality of images, including the ability to manage, record, and export live video clips. The Image Management function also provides an "auto-masking" feature that automatically selects an appropriate video mask based on the particular endoscope device being utilized by the health care practitioner.
  • In another particular aspect, the system includes a medical terminology “Knowledge Base” (KB) comprising keywords relating to the procedure, e.g., such as gastrointestinal, endoscopic and bronchoscopic terminology keywords. The keywords are captured via a graphical user interface (GUI) before, during, and/or after a procedure. The keywords are made available for labeling images captured during an examination to be used in reports, auto-populating appropriate sections of a report such as a Procedure Note, described further below, based on patient history, and building Procedure Note templates or models to auto-populate sections of information. The system also facilitates the use of custom terms that apply to a specific department or location. Thus, for example, during an exam, a user may select KB terms for a procedure via a common user interface, which is employed wherever the user needs to locate or extract keywords. This also provides a consistent way to select and use terminology.
  • Clinical Flow
  • FIGS. 2-5 illustrate clinical flow diagrams that describe the most common activities associated with the system and their relationship in time in the context of one possible application of the invention. Clinical flow is based on patient flow, which relates to how a patient is processed before, during, and after an endoscopic procedure. The overall flow across all lifecycle stages starts with an exam request and ends with the generation of a Procedure Note, the release of the patient, and the generation of a set of related reports. User roles are represented as horizontal bands.
  • The registration and scheduling clinical flow 200 of FIG. 2 includes a collection of all the information necessary to set up a visit. It is initiated through an exam request made by either the patient, a surrogate for the patient, or a referring physician. The nurse and physician share the activity of preparing prep instructions and medical advice for the patient.
  • The Pre-Procedure clinical flow 300 of FIG. 3 starts with the arrival of the patient at the endoscopy facility and addresses all administrative and medical activities necessary to prepare the patient for the exam.
  • The Procedure clinical flow 400 of FIG. 4 depicts the actual examination that takes place during the Procedure lifecycle stage. The system is used to capture images, record vital signs, and administer medications during this stage.
  • The Post-Procedure clinical flow 500 of FIG. 5 depicts the activities that take place after the completion of an exam. These activities include a nurse continuing to monitor the patient's recovery, a nurse completing discharge instructions, releasing the patient, and preparing billing code reports, and a physician reviewing and editing the analysis of an exam by generating a Procedure Note. A physician signs the Procedure Note when it is complete. Afterward, management reports, patient recall requests, and referral letters can be created and distributed.
  • User Interface
  • The invention is next described in connection with a user interface that allows the user to select different features under different tabs.
  • I. Home tab 600 (FIG. 6). The Home tab is the default home page, and is pre-defined for each role. However, the user can modify the page to suit the user's needs. The following are the most common tasks that can be performed in the Home tab. Access to these tasks is based on the user's role. For example, if the user logs into the application as a scheduler, then the user would not see the Sign Reports menu option, since that option is reserved for the physician role.
  • 1) Scheduled Exams—used to view a list of scheduled exams and create a new visit and exam.
  • 2) Create a New Visit—allows the user to schedule a new visit for a patient.
  • 3) Pending Items—used to view all of the pending tasks. The user can also select one or more pending items and close them.
  • 4) Pathology Status—used to view the status of outstanding pathology requests or search the database for an existing record. The user can also edit or delete existing pathology records. When a pathology record is deleted, all of the specimens associated with that record are deleted.
  • 5) Unsigned Reports—an attending physician can use the Unsigned Reports screen to view and sign unsigned Procedure Notes.
  • 6) Sign Reports—A system administrator can use the Sign Reports screen to view unsigned Procedure Notes for a specific physician and mark them as signed.
  • 7) Carbon Copies—When the user distributes a document to a medical provider, clinical staff, or contact via email, a notification is sent to the recipients that a document is available for them in the system. Recipients can then log on to the system and view a list of documents on the Carbon Copies screen.
  • 8) ICU Synchronization—when the user performs an exam in ICU (Image Capture) mode, the user's imaging station is not connected to the network server. When the user finishes the exam, the user must upload images and data from the workstation to the server repository. When the workstation is re-connected to the network, a series of simple commands will upload the data and images captured during the exam. After the data is uploaded, the user uses the ICU Synchronization option to synchronize images and data.
  • 9) Recall Letters—used to recall a patient for another examination. The user can use this option to add an item to the Recall Letter Queue to remind a patient of a follow-up examination.
  • 10) System Log—Allows the system administrator to view errors and messages generated by the application.
  • II. Patient File tab 700 (FIG. 7)—allows a user to capture information specific to the individual patient. This tab is used to record a patient's demographic information; a patient's medical alerts, GI/pulmonary, medication, family, and social history information, and view a summary of the patient information.
  • III. Registration tab 800 (FIG. 8). This tab is used to: (a) create and modify visit and/or exam information; (b) view past, current, or future schedules; (c) assign resources for an examination including procedure rooms and equipment; and (d) distribute registration documents.
  • IV. Pre-Procedure tab 900 (FIG. 9). This tab is used to: (a) record care plan information for a specific visit; (b) record medical alert information; (c) record GI, pulmonary, family, and social history information; (d) manage physical examination, patient assessment, and physician check information; (e) manage prep status information for the patient; (f) manage consent information for a visit; (g) capture vital signs and medications administered before the examination; (h) display a summary of selected Pre-Procedure information and capture nurse handoff information; and (i) distribute Pre-Procedure documents.
  • V. The Procedure Tab 1000 (FIG. 10). This tab is used to: (a) capture images during an endoscopic procedure; (b) record live video clips; (c) record scope time used during an examination; (d) view images and Procedure Notes from a previous exam; (e) print images for an exam on a laser jet or a Mavigraph printer; (f) record nurse administration information; (g) record accessories and equipment used during an examination; (h) generate pathology requests; (i) capture vital signs and medications administered during the examination; and (j) distribute procedure documents.
  • VI. The Post Procedure Tab 1100 (FIG. 11)—After an examination is completed, this tab is used to perform post-procedural tasks. These tasks include synchronizing images in the ICU mode, monitoring a patient's vital sign and medication information, managing captured images, and writing Procedure Notes. Images from a current procedure, e.g., image 1 and image 2, and from a prior procedure, e.g., image 3, image 4, and image 5, can be displayed together for comparison. This tab is used to: (a) record patient recovery information; (b) manage images captured during an exam; (c) label, annotate, enhance, and print images; (d) import and export images to and from the current examination; (e) manage video clips recorded during an examination; (f) write and sign Procedure Notes; (g) capture patient recall information; (h) assess performance of a trainee participating in an examination; (i) capture patient survey information; (j) distribute Post-Procedure documents; and (k) perform ICU synchronization.
  • VII. The Analysis Tab 1200 (FIG. 12)—used to generate predefined template-based management reports to satisfy end-user administrative reporting requirements related to patient, procedure and facility management, efficiency analysis, and resource utilization. This tab is used to generate: (a) Continuous Quality Improvement (CQI) reports; (b) efficiency reports; (c) equipment analysis reports; (d) procedure analysis reports; and (e) administration reports.
  • VIII. The Admin Tab 1300 (FIG. 13)—used to perform administrator tasks and ensure the efficiency and security of the system. The system can be customized based on the needs and requirements of the facility, physician, and clinical staff. This tab is used to: (a) maintain system data (such as Patient ID type and department information); (b) maintain application resource data (such as clinical staff and contact information); (c) perform system configuration (such as configure Mavigraph printer and video settings); (d) customize how the application will flow and generate information (for example, changing the order and location of menus within the application and editing or creating templates/models that are used to create Procedure Notes); (e) customize user-defined fields (such as other patient information and other visit information); (f) control access to or within the application (such as user and role maintenance); and (g) maintain equipment used during the procedure.
  • Knowledge Base
  • The Knowledge Base (KB) is a terminology database that contains terms related to the specific procedure. For example, the KB may be a medical terminology database that includes gastrointestinal, endoscopic and bronchoscopic terminology in one possible application. The KB can be extended to other medical and non-medical applications by using the appropriate terminology. For example, the knowledge base may be used for any application where standardized terminology is desired. This may include applications for producing reports in various industries, such as financial services, insurance, legal services, real estate and so forth.
  • The KB contains concepts (terms), keywords (terms used in a specific context), sentence models and views (menus and keyword items). Keywords are the medical terms that are the basic building blocks of the KB. When the user selects a keyword from a menu of available keywords, it appears in a list of selected keywords or terms. Moreover, keywords are organized in menus. For example, a “size” menu type could contain the keywords “small”, “medium”, and “large”. Furthermore, a menu can be configured to be single-select, unique, or multi-select.
  • Generally, the user can use the KB to: (a) capture keywords before, during, or after a procedure; (b) label images captured during an examination to be used in reports; (c) auto-populate appropriate sections of a Procedure Note based on patient history; and (d) build Procedure Note templates/models to auto-populate sections of information.
  • During an exam, the user can select KB terms for a procedure. The KB features a common user interface, which is employed wherever the user needs to locate or extract keywords. This also provides a consistent way for the user to select and use terminology. The KB also facilitates the use of custom terms that apply to a specific department or location.
  • Lexicon Function
  • A Lexicon function is used to select terms from the KB. The Lexicon arranges the KB content into different report sections, based on Phases of Care. The user can click a tab to see the KB terms associated with the report section. A facility determines which report sections should be available in the Lexicon screen for a phase of care. A system administrator can make a report section available to appropriate phases of care in the Admin tab.
  • Each tab or report section in the Lexicon screen may contain two panes or display regions. As described further below, the right hand side pane displays a KB view and contains all the available terms of the KB arranged in a logical tree. Typically, only a portion of the tree is visible at a time. The user can navigate the tree to view different portions of it. The left hand side pane contains the selected terms. When the user picks or selects an available term from the right hand side pane, it is copied to the left hand side pane, which displays the selected terms, thus allowing the user to logically build a comprehensive description of the exam. The user uses these selected terms along with other exam data collected during various phases of care to generate Procedure Notes, e.g., reports, and other exam related documents.
  • Detailed Discussion
  • Lexicon Function—Pre-Procedure
  • FIG. 14 illustrates an example of the Pre-Procedure Lexicon screen or interface 1400. The user can use the Lexicon function to select terms from the KB and to record procedure related information of the exam. The Lexicon arranges the KB content into different report sections for the Pre-Procedure, including indications, unplanned events and billing codes. These report sections appear as tabs in the Lexicon screen 1400. To reach this interface, the Pre-Procedure tab is selected, and the menu item “Lexicon” is selected from the far left hand side of the interface. Furthermore, under the Lexicon function, the “Indications” tab has been selected, so the term “Indications” appears at the top of the left hand pane or second display region 1410.
  • The right-hand pane or first display region 1420 of the interface 1400 displays the available terms in the KB in a hierarchical tree or menu 1425. Generally, there is insufficient room to display all terms at the same time. Instead, the user can view branches of the tree 1425 in an expanded view by clicking on a “+” icon, or in a contracted view by clicking on a “−” icon. The branches of a node of the tree may contain related keywords that qualify the concept of the parent node keyword. For example, in the tree 1425, the keyword “severity” includes branches for keywords “mild”, “moderate” and “severe” that describe the degree of severity of the patient's symptoms. Keywords at lower levels of the tree thus can be provided to qualify the keywords at the higher levels.
  • In one example approach, the keywords in the tree 1425 are organized into menus according to the ways in which a physician might organize the indications or symptoms of a patient. These menus may include, e.g., GI or gastrointestinal symptoms, airway symptoms, test results, imaging results, surveillance, treatment of established disease, systemic disorders, follow up, diagnostic sampling, and protocol study. Under each keyword, further qualifying terms can be viewed by expanding the tree. The interface 1400 relates to GI symptoms, and the portion of the tree 1425 shown relates to the indication "abdominal pain". Multiple menus of indications can be selected by the user together or in turn.
  • The second display region 1410 contains a tree 1415 with the keywords that the user has selected from the KB terms in the display region 1420. Generally, when the user selects an available term from the first display region 1420, it is copied to the second display region 1410 to logically build a comprehensive description, e.g., of the patient's symptoms, or of the subsequent exam results. For example, in the tree 1415, the indications of diarrhea, weight loss, abdominal pain and heartburn have been selected along with various keywords for detailing each indication. The following discussion explains how the keyword “heartburn” in the tree 1415 is selected by the user.
  • FIG. 15 illustrates an example lexicon user interface 1500 for selecting indications of GI symptoms for a patient. In the first display region 1420, the various GI symptoms are provided as branches of a tree 1525 in which the keyword “GI symptoms” is a parent node. The user has expanded the tree under the keyword GI symptoms to reveal the next lower level of keywords, e.g., abdominal pain, bloating, etc. The second display region 1410 indicates that the user has not yet selected any GI symptoms.
  • The user, such as a physician, obtains information regarding the GI symptoms of a patient, such as by interviewing the patient. The user learns that the patient is experiencing heartburn. The user then selects the keyword "heartburn" from the tree 1525 in the first display region 1420, e.g., by clicking on "heartburn" with a pointing device such as a mouse. The keyword "heartburn" is then copied to the second display region 1410 as a first indication. See the interface 1600 of FIG. 16, which shows the selected keyword "heartburn" displayed in the tree 1615 in the second display region 1410. Other indications such as diarrhea, weight loss and abdominal pain, as provided in the tree 1415 of FIG. 14, may similarly be selected from the tree 1525 to build a lexicon of selected keywords in the second display region 1410.
  • Moreover, each keyword may be further detailed when selected. For example, when the keyword “heartburn” is selected from the tree 1525 in the first display region 1420, as shown in FIG. 15, the first display region 1420 may be updated to display the tree 1625, in which various additional details appear as branches or child nodes of the selected keyword. For example, as shown in FIG. 16, the tree 1625 displays keywords for detailing the severity, frequency, duration, length of last episode, precipitants and relief factors regarding the heartburn. Note that these same keywords may be used for detailing the other indications, such as diarrhea, weight loss and abdominal pain. Keywords that are specific to a particular indication may also be used for detailing the indication.
  • Furthermore, each of the keywords in the tree 1625 may be expanded to provide further detailing keywords. For example, the user may expand the node “severity” in the tree 1625 by clicking on the “+” sign next to “severity”. The result is the tree 1725 in the interface 1700 of FIG. 17, where the keywords “mild”, “moderate” and “severe” allow the user to detail the severity of the heartburn. Note that the term “heartburn” is highlighted in the tree 1615, indicating to the user that any further selection from the first display region 1420 will provide details regarding the heartburn. If other indications such as weight loss and abdominal pain were present in the tree 1615, each indication could be highlighted in turn by the user to allow the user to enter the associated details in turn.
  • Referring to the tree 1725, assume the user selects the child keyword "severe" under the parent keyword "severity". As shown in the tree 1815 in the interface 1800 of FIG. 18, the term "severe" is copied to the second display region 1410 and provided as a branch of the term "heartburn". As a further example, assume the user proceeds to detail the frequency of the heartburn. To do this, the user expands the keyword "frequency" in the tree 1625 to reveal options such as "constantly", "specify number of times per day", "specify number of times per week", "specify number of times per month", and "specify number of times per year". The user may select "specify number of times per day", which causes a pop-up window to be displayed requesting that the user enter a number. The user enters "6" in an Enter Number field if the patient was feeling heartburn six times a day. The user then clicks OK to save the entry. The interface is then updated so that the number "6" appears in the tree 1915 in the interface 1900 of FIG. 19 under the keyword "heartburn". Alternatively, the interface may display "6 times per day".
  • The process as detailed above may be repeated for the different indications to arrive at the interface 1400 of FIG. 14. Note that the hierarchical relationship of the keywords in the tree of available keywords is maintained in the tree of selected keywords. Essentially, the tree of selected keywords is a subset of the tree of available keywords. Advantageously, this allows the context of a keyword to be understood by the user. Also, note that the “specify” keywords prompt the user for a specific entry, e.g., as text or a numerical entry. It is also possible for the numerical values or ranges of values to be provided as keywords in the tree of available keywords.
  • Generally, the menu or tree of available keywords in the KB can be configured to be single-select, unique, or multi-select. A single-select menu allows the user to select one keyword at a time. For example, in the tree 1425 of FIG. 14, the keyword "severity" is classified as a single-select menu, so the user can select only one qualifier from the available qualifiers, such as mild, moderate or severe. In a unique menu, once the user selects a keyword, the user cannot select the keyword again. For example, the GI Symptoms menu 1525 of FIG. 15 is unique, and the user selects "heartburn" as the first symptom. If the user tries to select this keyword again from the tree 1525, the keyword "heartburn" in the tree 1615 of FIG. 16 is highlighted to indicate that it has already been selected.
  • A multi-select menu allows the user to select multiple keywords at a time. The user can even select all the keywords in a multi-select menu. For example, if the GI symptoms menu in the tree 1525 of FIG. 15 is classified as a multi-select menu, the user may select multiple symptoms such as diarrhea, weight loss, abdominal pain and heartburn at the same time. For example, the user may select multiple terms by clicking on each term with a mouse while holding down the <Ctrl> key on the keyboard. Each selected symptom appears at the same hierarchical level of the tree 1415 of FIG. 14.
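  • Because the selected-terms tree is a subset of the available-terms tree, a selection can be modeled as copying a keyword from one tree to the other, with the menu's select mode deciding whether the copy is allowed; the following sketch is illustrative only (the class and function names are invented):

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class MenuNode:
    keyword: str
    select_mode: str = "multi"             # "single", "unique", or "multi"
    children: List["MenuNode"] = field(default_factory=list)


@dataclass
class SelectedNode:
    keyword: str
    children: List["SelectedNode"] = field(default_factory=list)


def select(menu: MenuNode, keyword: str, selected_parent: SelectedNode) -> Optional[SelectedNode]:
    """Copy `keyword` from the available-terms menu under `selected_parent`,
    honoring the menu's select mode."""
    already = [c.keyword for c in selected_parent.children]
    if menu.select_mode == "unique" and keyword in already:
        return None                         # already chosen; cannot be selected again
    if menu.select_mode == "single" and already:
        selected_parent.children.clear()    # single-select: the new choice replaces the old
    node = SelectedNode(keyword)
    selected_parent.children.append(node)
    return node


severity = MenuNode("severity", select_mode="single",
                    children=[MenuNode("mild"), MenuNode("moderate"), MenuNode("severe")])
heartburn = SelectedNode("heartburn")
select(severity, "moderate", heartburn)
select(severity, "severe", heartburn)            # replaces "moderate"
print([c.keyword for c in heartburn.children])   # ['severe']
```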
  • Views are collections of menus and their associated keywords organized within a tree-structure. The user can use views to navigate through the KB and select appropriate medical terms or keywords. For example, the interface 1400 of FIG. 14 illustrates a collection of menus and keywords or a View in the Lexicon screen. To access the Lexicon screen, the user navigates to the Pre-Procedure tab, selects an exam, and selects the “Lexicon” option from the far left hand menu. The Lexicon screen 1400 may contain the following tools to assist the user in navigating the interface. Any appropriate icon design may be used.
    Tool: Purpose:
    Generate Report: Generate sentences from selected keywords. Active only in the Procedure Note screen (the Post-Procedure tab).
    Move Up: Move keyword up in the selected terms hierarchy.
    Move Down: Move keyword down in the selected terms hierarchy.
    Code: Generate billing codes based on keyword selection.
    Add KB Item: Add a new keyword to the Knowledge Base or available terms (the right side pane).
    Next: Take the user to the previous multi-select menu one level up in the selected terms (the left side pane).
    Previous: Take the user one level up in the selected terms (the left side pane).
    Delete KB Item: Delete a keyword from the selected terms (the left side pane).
  • As mentioned, to select a clinical term for a particular report section (tab), the user clicks on a keyword from the list of available keywords in the first display region 1420, and the selected keyword is copied to the second display region 1410. However, this arrangement of the display regions is an example only. It is possible to have the available and selected keywords displayed in various other configurations. For example, the available keywords may be displayed on a top display region while the selected keywords are in a bottom display region. In another possible approach, the selected keywords are displayed in a window that is overlaid on a window that displays the available keywords. It is also possible to use multiple display screens, where, e.g., the available and selected keywords are displayed on different screens. Various other configurations are possible based on the available display technologies.
  • Under the “unplanned events” tab of the interface 1400 of FIG. 14, a window may be provided that allows the user to type in details of such an event, or keywords describing the event may be selected from the first display region 1420. For example, an unplanned event during an endoscopic procedure may include bleeding. The location and extent of bleeding can thus be detailed.
  • Under the "codes" tab, the user can associate billing codes with selected keywords for billing purposes. The user accesses the Lexicon screen and selects a keyword with which to associate the billing code. The user clicks the Code icon, and a Select Billing Codes window is displayed. The user clicks Code Set, and the Code Sets window is displayed. The user searches for a billing code from this window. The user selects a type of code set from the Code Set dropdown list. The user enters the number of the code that is being searched for in the Number field. The user enters a description of the code in the Description field. The user clicks Go, and a list of billing codes based on the search criteria is displayed. If no search criteria are entered, all the billing codes from the selected code set are displayed. FIG. 20 illustrates an interface 2000 with a list of billing codes from the ICD-9 (International Classification of Diseases) Diagnostic Code Set. The user clicks a billing code to select it. To select multiple billing codes, the user presses and holds the <Ctrl> key and selects the billing codes.
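  • The code search described above amounts to filtering a code set by number and/or description, returning the whole set when no criteria are entered; a small illustrative filter follows (the entries shown are placeholders, not an authoritative ICD-9 extract):

```python
from dataclasses import dataclass
from typing import List


@dataclass
class BillingCode:
    number: str
    description: str


def search_codes(code_set: List[BillingCode], number: str = "",
                 description: str = "") -> List[BillingCode]:
    """Return codes whose number and description contain the given criteria;
    with empty criteria, the whole code set is returned."""
    return [c for c in code_set
            if number in c.number and description.lower() in c.description.lower()]


# Placeholder entries for illustration only.
icd9 = [BillingCode("530.81", "Esophageal reflux"),
        BillingCode("211.3", "Benign neoplasm of colon")]
print([c.number for c in search_codes(icd9, description="reflux")])   # ['530.81']
```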
  • Lexicon Function—Post-Procedure
  • FIG. 21 illustrates an example of the Lexicon screen 2100 in the Post-Procedure tab. In the Post-Procedure phase, the physician or other user uses the Lexicon function to record findings from performing a procedure. These report sections appear as tabs in the Lexicon screen 2100 labeled as: indications, procedure, findings, medication, unplanned events, recommendation, summary and codes. FIG. 21 illustrates the Findings tab. To access the Lexicon screen, the user navigates to the Post-Procedure tab, selects an exam, and selects the Lexicon option from the far left hand menu. The Lexicon interface for Post-Procedure is analogous to that for Pre-Procedure (FIG. 14).
  • The Post-Procedure Lexicon screen may contain icons for tools as discussed above in connection with the Pre-Procedure lexicon function. Moreover, as with the Pre-Procedure case, a Lexicon menu for Post-Procedure can be configured as single-select, unique, or multi-select.
  • To select a clinical term for a particular report section (tab), the user clicks on a keyword from the list of available keywords. The selected keyword is copied to the left side of the screen or other separate display region. For example, assume the user finds a small polyp in the stomach of the patient during the endoscopic procedure, specifically on the anterior wall of the antrum. To document this finding, the user may perform the following steps (an illustrative sketch of the resulting selected-terms tree follows step (k)):
  • (a) Access the Lexicon screen.
  • (b) Select the Findings report section tab. The display region 1420 provides a tree where a parent node is for the keyword “organs” (not shown).
  • (c) Click the + (plus) sign next to “organs” to expand it, revealing branch nodes for specific organs such as the stomach, duodenum, etc.
  • (d) Select “stomach” from the first display region 1420. This causes the tree 2115 (FIG. 21) to be updated with the keyword “stomach”, and the first display region 1420 to display branches including “stomach findings” under the parent node “stomach” (not shown).
  • (e) Click the + (plus) sign next to “stomach findings” to expand it. This causes the first display region 1420 to display branches of “stomach findings”, including “polyps” (not shown). Note that it is not necessary to update the tree 2115 with the keyword “stomach findings” since it is self-evident that the findings that are subsequently detailed are stomach findings.
  • (f) Select “polyp” from the first display region 1420. This causes the tree 2115 to be updated with the keyword “polyp” as a child branch of the keyword “stomach”. The first display region 1420 is updated to display a tree that is similar to the tree 2125, except that the keywords are unexpanded. The keywords in the tree 2125, e.g., fundus site, body site, antrum site, other stomach site, etc., are child branches of the keyword “polyp”.
  • (g) Click the + (plus) sign next to “antrum site” in the tree 2125 to expand it, revealing the child branches of anterior wall, posterior wall, greater curvature and lesser curvature.
  • (h) Select “anterior wall of the antrum” from the tree 2125. This causes the tree 2115 to be updated with the keyword “anterior wall of the antrum” as a first child branch of the keyword “polyp”.
  • (i) Click the + (plus) sign next to “size of polyp” to expand it, revealing the branches for diminutive, small, medium, and specify size (mm) in the tree 2125.
  • (j) Select “small” from the tree 2125. This causes the tree 2115 to be updated with the keyword “small” as a second child branch of the keyword “polyp”. Again, note that the hierarchical relationship of the available keywords is maintained in the tree 2115 for the selected keywords.
  • (k) Click Save.
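  • The hierarchical copy behavior in steps (a) through (k) can be pictured as maintaining a second tree whose nodes reference the selected knowledge-base keywords. The following Python sketch is illustrative only (the class and method names are assumptions, not the patented implementation): selecting a keyword attaches it beneath its nearest previously selected ancestor, so the tree 2115 preserves the hierarchy of the available keywords while skipping grouping menus such as “stomach findings”.

    class KBNode:
        """Node in the tree of available keywords (first display region 1420)."""
        def __init__(self, keyword):
            self.keyword = keyword
            self.parent = None
            self.children = []

        def add(self, child):
            child.parent = self
            self.children.append(child)
            return child


    class SelectedTree:
        """Tree of selected terms (tree 2115); mirrors the KB hierarchy."""
        def __init__(self):
            self.children = {}  # selected keyword -> list of selected child keywords

        def select(self, node, value=None):
            # "specify ..." items are replaced by the user-entered value, e.g. "23 mm".
            label = value if value is not None else node.keyword
            # Walk up to the nearest ancestor that was itself selected, so the
            # hierarchy of the available keywords is preserved; grouping menus
            # that were never selected (e.g. "stomach findings") are skipped.
            anc = node.parent
            while anc is not None and anc.keyword not in self.children:
                anc = anc.parent
            parent_label = anc.keyword if anc is not None else None
            self.children.setdefault(parent_label, []).append(label)
            self.children.setdefault(label, [])


    # Usage mirroring steps (a)-(k):
    organs = KBNode("organs")
    stomach = organs.add(KBNode("stomach"))
    findings_menu = stomach.add(KBNode("stomach findings"))
    polyp = findings_menu.add(KBNode("polyp"))
    antrum = polyp.add(KBNode("antrum site"))
    anterior = antrum.add(KBNode("anterior wall of the antrum"))
    size = polyp.add(KBNode("size of polyp"))
    small = size.add(KBNode("small"))

    tree_2115 = SelectedTree()
    for n in (stomach, polyp, anterior, small):
        tree_2115.select(n)
    # tree_2115.children == {None: ["stomach"], "stomach": ["polyp"],
    #                        "polyp": ["anterior wall of the antrum", "small"],
    #                        "anterior wall of the antrum": [], "small": []}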
  • The system further allows the user to enter custom information. If the user does not find a keyword in the available KB terms, the user can add a term by selecting a keyword that begins with “specify” (such as “specify size” to enter a size, “specify finding” to enter a finding, etc.). For example, if the user does not find the size that the user wants to specify for the “size of polyp”, the user can specify a different size. To specify a different size in millimeters (mm), for instance, the user can perform the following:
  • (a) Click the “specify size (mm)” option in the tree 2125 to display a Specify Size (mm) window as a pop up window with a text field.
  • (b) Enter a value in the Enter Value text field, e.g., “23”.
  • (c) Click OK. The size of the polyp then appears in the left side tree 2115. For example, in the tree 2115, the entry “23 mm” or “size of polyp 23 mm” may appear in place of the keyword “small”.
  • (d) Click Save.
  • Procedure Notes
  • The Procedure Notes task may be accessed from the Post-Procedure tab of the interface to prepare Procedure Notes regarding a procedure performed on a patient. A Procedure Note may include documented information about a specific exam. It can be used to document findings, diagnosis, medications, recommendations, and other information such as past diagnosis. Generally, the Procedure Note function can be used to: (a) manage images, (b) view information such as images and Procedure Notes for other exams, (c) select terms from the Knowledge Base tree, (d) generate report text, (e) select billing codes, (f) sign a note, (g) generate different versions of a note, (h) discard a note, and (i) delete a note.
  • The user selects a template/model to use the Procedure Note function. The system administrator creates a Procedure Note template/model based on the manner in which either a facility or a physician wants to display captured information in a document. FIG. 22 illustrates a Select Procedure Note Template screen 2250 in the Post-Procedure screen 2200. To select a template, the user does the following: (a) access the Post-Procedure screen 2200, (b) select an exam, (c) select “Procedure Note” from the far left hand menu, causing the Select Procedure Note Template screen 2250 to be displayed, e.g., as a pop up window, (d) search for the Procedure Note template based on facility and/or physician names, (e) click Go, causing a list of resulting Procedure Note templates to be displayed, and (f) select a particular template to view the Procedure Note screen 2300 of FIG. 23. In the example provided, a template entitled “Standard—Bronch” is available to assist the user in generating a Procedure Note for a bronchoscopy procedure.
  • Referring to FIG. 23, a first display region 2364 may provide a tree 2365 of available KB keywords as discussed in connection with the first display region 1420 of the Lexicon screen 1400 (FIG. 14). Similarly, a second display region 2362 may provide a tree 2363 of selected keywords as discussed in connection with the second display region 1410 of the Lexicon screen 1400. A third display region 2340 provides a Procedure Note Builder for editing and generating reports such as Procedure Notes. A fourth display region provides an Image Strip section 2320 for managing images for the current exam. Example images for the current exam are displayed at the left side of the section 2320 as “image 1” and “image 2”.
  • The user can use the Images section 2320 in a manner similar to the Image Management screen. The Images section 2320 may contain tools for the following functions: (a) delete selected images from the current exam, (b) delete all unlabeled images from the current exam, (c) mark selected images for printing, (d) unmark the selected image for printing, (e) view larger image, (f) label all selected images from the current exam, (g) delete the label from all selected images from the current exam, (h) associate findings, (i) disassociate findings, (j) show or hide menu and (k) show or hide strip.
  • The Procedure Note function can be used to view images and Procedure Notes for other exams associated with the selected patient. The following tools may be used to view other exam information: (a) view large image, (b) view the Procedure Note for the other exam, and (c) close the other exam image window. To view images for another exam, the user accesses the Procedure Note screen, and selects an exam from the Other Exam dropdown list 2322. Images for the selected exam are displayed on the right side of the Images section 2320.
  • To view Procedure Notes for another exam, the user selects the exam from the Other Exam dropdown list 2322, and clicks the View the Procedure Note icon to display the Procedure Note for the other selected exam in a new window.
  • The Procedure Note function can also be used to select KB keywords for an examination. The first display region 2364 is used to select keywords as discussed above in connection with the Lexicon function. When the user is done selecting keywords, the user clicks the Generate Report icon, discussed in connection with the tools above, to generate a report in the third display region 2340, which is the report section of the interface 2300. The Generate Report icon is activated if the user makes changes to the selected keywords. Moreover, if any of the selected keywords are associated with a sentence model, the sentence is generated and populated with the keywords in the displayed report within the specific Report Section.
  • The third display region 2340 may include predefined sections corresponding to the tabs in the Lexicon screen 2100 of FIG. 21, e.g., introduction, indications, procedure, findings, medication, unplanned events, recommendations, summary and codes. The user may build a lexicon for one or more of the sections as discussed above. As the lexicons are developed, one or more sentence models are selected from a number of available sentence models. The selected sentence models are populated using the selected keywords and predefined or static text, as discussed further below.
  • In the Findings section of the third display region 2340, a Use Organ Labels feature may be set to Yes for the exam type, in which case the sentences for findings are prefixed with the name of the organ. For example, assume there was a finding of “polyp” in “stomach”. Moreover, the polyp is further detailed by the user selecting the keyword “antrum site” and the branch keyword thereof “anterior wall of the antrum”, the keyword “polyp qualifier” and the branch keyword thereof “broad based”, and the keyword “size of polyp” and the branch keyword thereof “small”. These keywords are indicated in the tree 2363 in the second display region 2362. The populated sentence in the report might appear as: “Stomach: A small, broad-based polyp was found arising from the anterior wall of the antrum. (2)”. Here, the notation “(2)” denotes a corresponding image in the Images section 2320 associated with the finding, e.g., image 2. When a sentence for a finding is generated, and there is an image associated with the finding, the image number may be provided at the end of the finding.
  • Also in the third display region 2340, a sentence model is populated to describe findings regarding a second organ, the esophagus. As indicated in the tree of selected keywords 2363, the user has found food in the esophagus. The food is characterized as being a trace amount, in a location described as the upper third of the esophagus, and the presence of the food is attributed to a motor disorder in the patient. Specifically, after building the lexicon of terms to describe the Findings in the tree 2363, the user clicks the Generate Report icon, and the sentence is generated. The sentence might appear as: “Esophagus: Trace of food was found in the upper third of the esophagus, due to motor disorder.”
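  • The organ-label prefix and image-number suffix described above amount to two simple decorations applied to each generated finding sentence. A minimal sketch, assuming hypothetical function and parameter names:

    def decorate_finding(sentence, organ=None, image_number=None, use_organ_labels=True):
        """Prefix a finding with its organ label and append an associated image number."""
        if use_organ_labels and organ:
            sentence = f"{organ.capitalize()}: {sentence}"
        if image_number is not None:
            sentence = f"{sentence} ({image_number})"
        return sentence


    print(decorate_finding(
        "A small, broad-based polyp was found arising from the anterior wall of the antrum.",
        organ="stomach", image_number=2))
    # Stomach: A small, broad-based polyp was found arising from the anterior wall of the antrum. (2)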
  • Similarly, a Recommendation may appear as the sentence: “Start a low fat diet” based, e.g., on one or more keywords selected by the user under the Recommendation tab. Other example recommendations include “admit for observation”, “consult radiologist”, and so forth.
  • The sentence models advantageously allow the user to quickly generate reports using standard terminology and sentence structures. Moreover, the generated sentences can be easily edited by substituting one keyword in place of another. For example, under Procedure in the third display region 2340, the user may wish to change “The views were excellent” to “The views were good”. To achieve this, the user highlights the word “excellent” in the sentence. In response, the display region 2362 is updated to show the portion of the tree 2363 in which the selected keyword “excellent” is located. The keyword “excellent” may appear highlighted in the tree 2363. Additionally, the display region 2364 is similarly updated to show the portion of the keyword tree 2365 in which the keyword “excellent” is located. Moreover, in the tree 2365, “excellent” will appear with other related keywords such as “poor”, “fair” and “good”, which are branches of the same parent node, such as “quality”. The keyword “quality” may in turn be a branch of a parent node “views”, for instance. To replace the keyword “excellent” with the keyword “good”, the user selects “good” from the tree 2365. The tree 2363 is then updated with the keyword “good” replacing “excellent”. Next, the user clicks the Generate Report icon to re-populate the sentence model with “good” in place of “excellent”.
  • Similarly, a sentence can be edited by adding or deleting a keyword. For example, the user may wish to change “A small, broad-based polyp was found . . . ” to “A broad-based polyp was found . . . ” To achieve this, the user highlights “small” in the sentence and clicks on a delete icon. To add a keyword, the user selects the additional keyword from the tree of available keywords 2365. The keyword is then copied to the tree of selected keywords 2363 and shown in the appropriate hierarchical position in the tree. The user then clicks the Generate Report icon to re-populate the sentence in the third display region 2340. For example, the user may select the additional keyword “small” from the tree 2365 to change the sentence “A broad-based polyp was found . . . ” to “A small, broad-based polyp was found . . . ”
  • This keyword editing feature is very powerful since it allows the user to modify the report to correct erroneous entries, or to modify entries based on further findings or change in judgment, for instance.
  • The example sentence models discussed above are normal sentence models since they read as complete sentences rather than as sentence fragments. A normal sentence model also typically incorporates all of the selected keywords associated with the report section. However, it is also possible to associate a keyword in the KB with a summary sentence model. If a selected template has a Summary section and a selected keyword has a summary sentence model, the summary sentence for the keyword appears in the Summary section. For example, if the user finds a polyp in the stomach, the summary sentence might be: “Diagnosis—polyp”. Thus, a report can include both normal and summary sentences.
  • The Procedure Note function can further be used to select relevant billing codes for the exam. If a keyword has a billing code associated with it, the user would see all billing codes and descriptions associated with the keyword when clicking the Code icon. Relevant billing codes associated with a keyword can be selected for the current exam as discussed previously.
  • To save a Procedure Note, the user accesses the Procedure Note screen and clicks Save. The user can sign a Procedure Note if the Procedure Note text is generated. In one approach, only attending physicians for the examination can sign the Procedure Note. If the Validate & Sign Procedure Note settings are set to Yes, the user will be asked to validate the user ID and password. Signing a note will lock the report from further editing. To sign a report, the user accesses the Procedure Note screen and clicks Sign. Signing a Procedure Note updates a patient's past procedure, past diagnosis, and past surgeries record.
  • In situations where the user makes changes to an existing Procedure Note and wants to apply those changes to future Procedure Notes, the user saves the modified Procedure Note as a template for generating future Procedure Notes. To save a Procedure Note as a template, the user accesses the Procedure Note screen, clicks Save As, assigns a name to the template, selects either a facility or physician name to assign an owner to the template, and clicks Save.
  • Once a Procedure Note is signed, it is locked and cannot be edited. To make any changes to an existing, signed Procedure Note, the user can create a new version of the Procedure Note. The new version of the Procedure Note is an exact copy of the current signed Procedure Note, without regenerating any sentences or updating any database fields. To generate a new version, the user accesses the Procedure Note screen and clicks New.
  • Discarded Procedure Notes are stored in a Discard Bin, where they can be viewed but not restored. To discard a Procedure Note, the user accesses the Procedure Note screen, makes sure the Procedure Note is signed, clicks Discard, and clicks Yes.
  • To delete a Procedure Note, the user accesses the Procedure Note screen, makes sure the Procedure Note is not signed, clicks Delete, and clicks OK.
  • Maintaining the Knowledge Base
  • A system administrator is responsible for maintaining the Knowledge Base by adding new keywords, sentence models, and menu structures.
  • Report Sections are used to generate Procedure Notes for a specific exam type. When the user creates a Procedure Note template, the user can choose to include a few or all the report sections in it. The user can also assign these report sections to a phase of care and then use them in the Lexicon screen as tabs to record data. The user can create, modify, and delete report sections from the Report Section List screen. Examples of report sections include: introduction, indications, procedure, findings, medication, unplanned events, recommendation, summary, and billing codes.
  • A Phase of Care function can be used to assign specific report sections to a phase of care, namely Registration, Pre-Procedure, Procedure, and Post-Procedure.
  • To create a Procedure Note template for an exam type, the user uses the Report Template function. To access the Report Template screen, the user navigates to the Admin tab, and selects Customization from the left menu. Available customization options are displayed. The user selects Report Template from the left menu, and the Select Procedure Note Template screen is displayed (FIG. 24). To search for a Procedure Note template/model, the user accesses the Select Procedure Note Template screen, selects an exam type, facility, or physician, and clicks Go. A list of Procedure Note templates is displayed, based on the search criteria.
  • Knowledge Base Framework
  • The KB provides a controlled vocabulary for reporting results, e.g., of medical examinations and procedures. Discussed below are the three structural layers of the KB, an overview of the use of the KB by the end user while creating Procedure Notes, the use of selected terms for query purposes, and an interface to maintain the KB.
  • The KB includes three main layers: Concept Layer, Data Layer and View Layer. The Concept Layer represents a dictionary from which individual words can be selected to build a more complicated grammar. The Data Layer represents the data that describes the Concepts in greater detail. The View Layer organizes the Keywords into groups of terms in a tree structure connected by menus. The view is the primary way for the user to navigate through the knowledge base to select the appropriate medical terms for a given examination type.
  • In the following discussion, class diagrams are provided according to the Unified Modeling Language (UML). As is known, class diagrams describe the static structure of a system. Classes represent an abstraction of entities with common characteristics. Classes are illustrated with rectangles divided into portions. The name of the class is in the top portion, and the attributes of the class are in the middle portion. Operations may be provided in the bottom portion.
  • Moreover, associations represent the relationships between classes. Multiplicity or cardinality notations are indicated near the ends of an association. These symbols indicate the number of instances of one class linked to one instance of the other class, as follows: (a) “1” denotes no more than one, (b) “0 . . . 1” denotes zero or one, (c) “*” denotes many, (d) “0 . . . *” denotes zero or many, and (e) “1 . . . *” denotes one or many. A filled diamond represents a composition relationship, denoting a strong ownership between a “whole” class and a “part” class. A hollow diamond represents a simple aggregation relationship, in which the whole class plays a more important role than the part class, but the two classes are not dependent on each other. The diamond end in both a composition and aggregation relationship points toward the whole class or the aggregate.
  • Concept Layer (The Dictionary)
  • FIG. 25 illustrates a conceptual class diagram 2500 of the Concept Layer. The classes are “Concept” and “LUI” (Lexical Unique Identifier). There are one or many instances of “LUI” linked to one instance of “Concept”. Also, there is a composition relationship between “Concept” and “LUI”.
  • The Concept Layer defines the concepts that can exist in the KB. Concepts are given a textual representation and classified according to the dictionary definition of the concept within this layer. The Concept Layer acts much like a dictionary, defining meanings of concepts, whereas the rest of the knowledge base is more like an encyclopedia, defining concept usage.
  • Concepts are unique objects within the KB that represent the data that does not change regardless of the context in which the data is used. For example, the word “mass” is defined as a unified body of matter with no specific shape. It may also be defined as a large but nonspecific amount or number. These would be two separate concepts within the KB because they have different meanings. However, a mass within the esophagus and a mass within the colon would be the same concept because they are both a unified body of matter with no specific shape. As discussed further below, this approach advantageously allows accurate querying of indications, findings and other data obtained via the system, which distinguishes when the same word is used in different contexts. Concepts have no inherent relationship to any other concepts within the KB.
  • LUIs represent a specific version of text that represents a Concept. When the text for a Concept changes, a new LUI is created so that the Concept can still be referred to by its old name. For example, a user of the KB may choose to change the name of the Concept “mass” to “tumor” due to personal preference. In this case, all references to that Concept will have the text of “tumor” unless the specific LUI for “mass” is used.
  • LUIs are used to “lock” the text of a signed Procedure Note. When a Procedure Note is signed, the application traverses the tree of selected terms and stores the LUIs of the concepts at the time of signing. When the signed report is later amended, the selected items refer to the text that was present at the time of signing until that item is modified, at which time the newest LUI is used. The LUI infrastructure could also be extended to accommodate other features where the textual representation of a concept needs to change without the underlying meaning being affected, e.g., for internationalization and synonym support.
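  • For illustration only, the Concept/LUI versioning and the locking of signed text described above might be pictured as follows. This is a simplified sketch with hypothetical names, not the actual implementation: renaming a Concept adds a new LUI, and a signed Procedure Note keeps referring to the LUI that was current at signing time.

    class Concept:
        """A unique meaning in the KB; its display text is versioned through LUIs."""
        def __init__(self, text):
            self.luis = [text]              # LUI 0, 1, 2, ... in creation order

        def rename(self, new_text):
            # Changing the text creates a new LUI; older LUIs remain addressable.
            self.luis.append(new_text)

        def current_lui(self):
            return len(self.luis) - 1

        def text(self, lui=None):
            # A signed note stores a specific LUI; everything else uses the newest.
            return self.luis[self.current_lui() if lui is None else lui]


    mass = Concept("mass")
    signed_lui = mass.current_lui()         # stored when the Procedure Note is signed
    mass.rename("tumor")                    # user preference changes the display text
    print(mass.text())                      # "tumor" (unsigned references follow the rename)
    print(mass.text(signed_lui))            # "mass"  (text locked in the signed note)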
  • Data Layer
  • FIG. 26 illustrates a conceptual class diagram 2600 of the Data Layer. The Data Layer represents all the knowledge that is contained within the KB. The Data Layer allows the application to understand what a concept means in the context of the KB. The Data Layer can be thought of as an encyclopedia of knowledge as opposed to the Concept Layer, which is a dictionary. For instance, while the dictionary definition of “mass” might be quite simple, an encyclopedia entry for mass could talk about all the different causes for, and descriptions of, a mass/tumor.
  • Keywords are Concepts that have a defined set of properties. Unlike Concepts, Keywords are linked with other Keywords to define a specific use of a Concept. For example, sample data for a Concept called Mass may include a classification as an Entity/ConceptualEntity/PatientProblem/Finding/ImagingFinding/ProtrudingLesions. The use of the “/” in this notation denotes that the item following the “/” is a subclass of the item preceding the “/”. For example, ConceptualEntity is a subclass of Entity, PatientProblem is a subclass of ConceptualEntity, and so forth.
  • Classifications represent the semantics of Concepts. Through Classifications, Concepts are given a meaning. Classifications are constructed in a hierarchy much like a taxonomic classification of organisms. Taxonomy is the scientific discipline of categorizing various species of organisms into conveniently sized groups, referred to as taxa, which share common, identifiable traits. In one possible implementation, the root of the hierarchy is a classification called Entity, from which all other classifications are derived. The hierarchy is structured so that more specific classifications are lower in the hierarchy. For example, under the root classification of Entity there are two classifications: Physical Object and Conceptual Entity. If a Concept is classified as a Physical Object, like a chair for example, we know that the Concept exists in the physical world and could, for example, be measured. Alternatively, if a Concept is classified as a Conceptual Entity, for example, occupation, we know this is not a physical thing and could not be held or touched. For example, with the above approach, the user could formulate a search of the KB to find all patients who had neoplasms. The KB would return patients that had tumors and polyps because both Concepts are classified as neoplasms.
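  • The neoplasm query mentioned above works because membership is checked against a classification and all of its descendants. A minimal sketch, assuming a simple parent-pointer hierarchy and hypothetical names (the classification chain shown is illustrative, not taken from the baseline KB):

    class Classification:
        def __init__(self, name, parent=None):
            self.name = name
            self.parent = parent

        def is_a(self, ancestor_name):
            # True if this classification is the named one or is derived from it.
            node = self
            while node is not None:
                if node.name == ancestor_name:
                    return True
                node = node.parent
            return False


    entity = Classification("Entity")
    conceptual = Classification("ConceptualEntity", entity)
    finding = Classification("Finding", conceptual)
    neoplasm = Classification("Neoplasm", finding)
    polyp_cls = Classification("Polyp", neoplasm)
    tumor_cls = Classification("Tumor", neoplasm)
    ulcer_cls = Classification("Ulcer", finding)

    findings_by_patient = {"patient A": [polyp_cls],
                           "patient B": [tumor_cls],
                           "patient C": [ulcer_cls]}

    # "Find all patients who had neoplasms" matches both polyps and tumors.
    print([p for p, fs in findings_by_patient.items()
           if any(f.is_a("Neoplasm") for f in fs)])   # ['patient A', 'patient B']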
  • Properties define the set of allowable values for a group of related keywords. A keyword can have any number of properties that are made up of a homogeneous list of keywords that all share the same classification or inherit from the same classification. For example, the property Location on Mass has a classification of BodyLocation; this means that any keyword that is also classified as BodyLocation or something derived from BodyLocation, such as Organ, can be assigned as an allowable value to the property. The following table indicates sample properties, classifications and values for the keyword “mass”; a brief sketch of the allowable-value check is provided after the table.
    Keyword - Mass

    Property: Location
        Classification: Entity/PhysicalObject/AnatomicalStructure/AnatomicalForm/BodyLocation
        Values: esophagus, stomach, duodenum, distance from entry

    Property: Appearance
        Classification: Entity/ConceptualEntity/FindingModifier/VisualAppearance
        Values: nodular, ulcerated, friable, firm, frond-like/villous, fungating, infiltrative, polypoid, submucosal, smooth

    Property: Bleeding
        Classification: Entity/ConceptualEntity/FindingModifier/Bleeding
        Values: oozing, spurting, not bleeding, bleeding on contact

    Property: Circumferential
        Classification: Entity/ConceptualEntity/QuantitativeConcept/Size
        Values: <25%, 25-49%, 50-74%, 75-99%, specify

    Property: Narrowing
        Classification: Entity/ConceptualEntity/QuantitativeConcept
        Values: extrinsic, intrinsic, uncertain, nodular, friable, firm, ulcerated, infiltrative, submucosal, smooth

    Property: Obstructing
        Classification: Entity/ConceptualEntity/QuantitativeConcept
        Values: possible, mild, moderate, severe

    Property: Size
        Classification: Entity/ConceptualEntity/QuantitativeConcept/Size
        Values: diminutive, small, medium, large, measured (cm), measured (mm)

    Property: Stigmata
        Classification: Entity/ConceptualEntity/FindingModifier/Bleeding
        Values: adherent clot, loose clot, spot, visible vessel, no bleeding stigmata
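  • As noted above, a keyword is an allowable value for a property when its classification matches, or is nested under, the property's classification. Using the slash-delimited path notation from the table, a minimal sketch of that check (the “Organ” subclass path is an assumption for illustration):

    def allows(property_classification: str, keyword_classification: str) -> bool:
        """A keyword is an allowable value if its classification path equals,
        or is nested under, the property's classification path."""
        return (keyword_classification == property_classification
                or keyword_classification.startswith(property_classification + "/"))


    body_location = "Entity/PhysicalObject/AnatomicalStructure/AnatomicalForm/BodyLocation"
    organ = body_location + "/Organ"                    # e.g. stomach, duodenum
    appearance = "Entity/ConceptualEntity/FindingModifier/VisualAppearance"

    print(allows(body_location, organ))        # True: Organ derives from BodyLocation
    print(allows(body_location, appearance))   # False: not a BodyLocation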
  • FIG. 27 illustrates a conceptual class diagram 2700 of the Sentence Model infrastructure. Sentences define the prose that will be generated for a given set of keyword selections. A sentence model may include up to four different types of placeholders. Text placeholders are used to add static text to the sentence model. Node-placeholders are used as a placeholder for any value of the property specified in the placeholder. Conditional-placeholders are used to generate text that depends on the value or values of a property or properties, respectively, in the sentence model. Trigger-placeholders are used to put the keyword that this sentence model is associated with into the sentence. Moreover, a keyword may be associated with a summary sentence model and/or a normal sentence model. The summary sentence model is used specifically in the summary report section.
  • The following code illustrates example sentence models. In the examples, text-placeholders are represented surrounded by angle brackets (<>), node-placeholders are represented surrounded by square brackets ([ ]), trigger-placeholders are displayed all in uppercase, and conditional-placeholders are surrounded by curly braces ({ }).
  • EXAMPLE 1 Finding
  • <There was a> [Size] [Mass Appearance] TUMOR/MASS <present> {IF ([site]==(distance cm from teeth)) “at” else “in the” } [Site] {IF ([site]!=(whole esophagus) AND [site]!=(Distance cm from teeth) AND [site]==(ANY)) “of the esophagus” ELSE “”} {IF ([site]==(Distance cm from teeth)) “from the entry site” ELSE “”}<.>
  • The trigger-placeholder “TUMOR/MASS” indicates that the particular sentence model will be populated and displayed on the user interface, such as on the Procedure Note Builder display region 2340 (FIG. 23), when the user selects the keyword “TUMOR/MASS” from the available keywords and clicks the Generate Report icon. The node placeholders [Size] and [Mass Appearance] will be populated by the respective keywords selected by the user for those concepts. For example, [Size] and [Mass Appearance] may be populated by the keywords “large” and “rounded”, respectively. Moreover, note that various conditional statements are used in the example code above to adjust the grammar of the sentence depending on the keywords selected by the user to characterize the finding. For example, if the “site” is described by a distance in cm from the teeth, the sentence will state: “There was a large rounded tumor/mass present at 10 cm from the teeth”. If the site is described by something other than a distance in cm from the teeth, e.g., the site is described as being the esophagus, the sentence will state: “There was a large rounded tumor/mass present in the esophagus”. The sentence models thus account for the different ways in which information can be provided by the user for the same findings, for instance.
  • EXAMPLE 2 Medication
  • <Start> CIPROFLOXACIN [dose][route] {IF ([frequency]==(qpm)) "every evening" ELSE ""} {IF ([frequency]==(q12 h)) "every twelve (12) hours" ELSE ""} {IF ([duration]==(ANY)) "for" ELSE ""} [duration]<.>
  • The trigger-placeholder “CIPROFLOXACIN” indicates that the particular sentence model will be populated and displayed on the user interface, such as on the Procedure Note Builder screen region 2340 (FIG. 23), when the user selects the keyword “CIPROFLOXACIN” from the available keywords and clicks the Generate Report icon. The node placeholders [dose] and [route] will be populated by the respective keywords selected by the user for those concepts. For example, [dose] and [route] may be populated by the keywords “2 mg” and “IV”, respectively. Again, note that various conditional statements are used to adjust the grammar of the sentence depending on the keywords selected by the user to characterize the medication. For example, if the “frequency” is described by a 12-hour period (variable “q12 h”), the sentence will state: “Start CIPROFLOXACIN 2 mg IV every twelve (12) hours.”
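  • The expansion of the four placeholder types can be illustrated with a small renderer. The sketch below is not the patented grammar or parser; it represents a sentence model as a list of typed placeholders and fills them from the user's keyword selections, roughly following Example 1 (all names are assumptions):

    def render(model, trigger_keyword, selections):
        """Expand text, node, trigger and conditional placeholders (sketch only)."""
        parts = []
        for kind, payload in model:
            if kind == "text":                    # <static text>
                parts.append(payload)
            elif kind == "node":                  # [Property] -> selected keyword
                value = selections.get(payload)
                if value:
                    parts.append(value)
            elif kind == "trigger":               # keyword the model is attached to
                parts.append(trigger_keyword.lower())
            elif kind == "cond":                  # {IF property == value "then" ELSE "else"}
                prop, expected, then_text, else_text = payload
                parts.append(then_text if selections.get(prop) == expected else else_text)
        sentence = " ".join(p for p in parts if p)
        return sentence[0].upper() + sentence[1:] + "."


    # Roughly Example 1, with the site given as the esophagus.
    finding_model = [
        ("text", "There was a"),
        ("node", "Size"),
        ("node", "Mass Appearance"),
        ("trigger", None),
        ("text", "present"),
        ("cond", ("Site", "distance cm from teeth", "at", "in the")),
        ("node", "Site"),
    ]

    print(render(finding_model, "TUMOR/MASS",
                 {"Size": "large", "Mass Appearance": "rounded", "Site": "esophagus"}))
    # There was a large rounded tumor/mass present in the esophagus.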
  • Triggers are used to signal the application that some action must be taken when the triggering keyword is selected. There are two forms of triggers that can be assigned to a keyword: Education and Recall. An education trigger is used to signal the system that a particular document should be queued for printing when the keyword is selected. A recall trigger is used to generate an item in a recall queue for a patient when the keyword is selected.
  • Codes are used to associate external codes to a keyword in the KB, primarily for billing. Any number of Codes can be assigned to a keyword. Codes are assigned to a Keyword from a code set, which is the universe of all possible codes for a given code set type.
  • View Layer
  • FIG. 28 illustrates a class diagram 2800 of the View Layer. Items are Keywords that appear within a View. Items are grouped into menus when they are inserted into a view. Menus are used to group related items together within a view. Shortcuts are stored selections that will be expanded when selected. Shortcuts are named entities that have their own menu structure. A shortcut menu appears under the parent of the highest-level menu for the shortcut.
  • Using the Knowledge Base
  • Procedure Note Data
  • When terms are selected from a view and associated with an examination, references to the keywords, not the terms, are saved. The references are stored in an ordered tree based on the order in which the terms are selected from the view.
  • Queries
  • The reference data associated with the examination will be open to queries. The query can look at either the concepts or keywords associated with the examination. For example, to find all patients that had a finding of mass in the esophagus, the query could be: findings.mass.location=esophagus. To find all patients that had a finding of mass in the esophagus greater than 2 cm, the query could be: findings.mass.location=esophagus and findings.mass.size>2 cm. To find all patients that had a confirmed diagnosis of cancer in the esophagus, the query could be: diagnosis.(concept.classification=cancer).location=esophagus and diagnosis.(concept.classification=cancer).certainty=confirmed. Thus, the invention allows advanced data mining of a database of patient information. Other techniques that simply search a database by keyword yield less accurate results. For instance, a search for the word “mass” in a database with such techniques might yield a finding stating that “no mass was found” as well as other tangential or irrelevant results.
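  • The dotted query syntax above can be read as a filter over the keyword and property references stored for each examination. A rough sketch of how such a query might be evaluated, with a hypothetical in-memory data layout (the patent does not specify one):

    # Each exam stores keyword selections with their property values.
    exams = {
        "exam 1": {"findings": [{"keyword": "mass", "location": "esophagus", "size_cm": 3.0}]},
        "exam 2": {"findings": [{"keyword": "mass", "location": "colon", "size_cm": 1.0}]},
        "exam 3": {"findings": [{"keyword": "polyp", "location": "esophagus"}]},
    }

    def find_exams(section, keyword, **criteria):
        """Evaluate queries such as findings.mass.location=esophagus (sketch only).
        A criterion value may be a literal or a predicate for comparisons like >2 cm."""
        hits = []
        for exam_id, data in exams.items():
            for item in data.get(section, []):
                if item["keyword"] != keyword:
                    continue
                if all((value(item.get(prop)) if callable(value) else item.get(prop) == value)
                       for prop, value in criteria.items()):
                    hits.append(exam_id)
                    break
        return hits

    print(find_exams("findings", "mass", location="esophagus"))
    # ['exam 1']
    print(find_exams("findings", "mass", location="esophagus",
                     size_cm=lambda s: s is not None and s > 2))
    # ['exam 1']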
  • Maintaining the Knowledge Base
  • Generally, the KB can be tailored to the user's specific needs. In one possible approach, a complete baseline KB is defined by the developer of the system to enable users such as physicians to immediately use the system. The users can subsequently fine-tune the system to meet their specific needs after gaining experience with the system by employing the following maintenance features.
  • FIG. 29 illustrates a user interface for adding a keyword to the KB. The KB maintenance user interface 2900 may include a number of tools, including: add menu/term, edit menu/term, add concept, edit concept, add property definition, edit property, remove, copy, paste, move up, move down, create shortcut, edit sentence, edit codes and triggers.
  • In a pop up window 2920, the user enters a new concept named “stomach” in a field 2924. A classification tree 2922 indicates that the classification of the concept is under “organ”. The user can navigate the tree 2922 by clicking on the desired classification. The higher-level classifications in the tree are also shown, up to the top-level classification “Entity”. The user can also enter a plural term for the concept, and a description of the concept, via the window 2920. A classification description is also provided in a grayed out manner to indicate that it cannot be edited. In the display region 2905 of the interface 2900, the classification tree is repeated. A display region 2910 indicates that “esophagus” and “duodenum” are concepts that are related to “stomach” since they are all members of the classification “organ”. The user can check a “specify” checkbox 2925 so that the user is prompted to enter a value to replace the keyword when the report is generated. The data type specified by the drop down menu 2930 determines whether the user is allowed to enter a text or numeric value. Note that the keyword “stomach” is also defined for a certain exam type, e.g., EGD, and for a report section, e.g., Findings. The user can also adjust these factors.
  • FIG. 30 illustrates a user interface for adding a property definition to the knowledge base. The user interface 3000 includes a display region 3005 indicating that the property definition is for the classification of “body location”.
  • FIG. 31 illustrates a user interface 3100 for editing a keyword in the knowledge base. A tree 3120 indicates that the user has selected the keyword “ulcer”. The concept “ulcer” is presented in a grayed out manner to indicate it cannot be edited. The sentence name can be edited if desired. If the keyword is a “specify” type item that prompts the user for a value, the check box 3125 will be checked, and the user will be allowed to determine its data type with the drop down menu 3130.
  • FIG. 32 illustrates a user interface 3200 for editing properties of a keyword in the knowledge base. The keyword tree 3120 is the same as shown in FIG. 31. A display region 3210 allows the user to edit the properties of a keyword, such as “size”. The classification of the keyword is identified as a quantitative concept. The user selects one or more keywords for the quantitative concept from a pop up window 3230. The selected keywords are then copied to a display region 3220 as available values for detailing the property of “size”. The user may check a checkbox 3225 to indicate that a value for size is required to be entered by the user, e.g., when reporting the findings of a procedure.
  • FIG. 33 illustrates a user interface 3300 for assigning properties to a keyword in the knowledge base. The user selects a classification from the Select Class tree. A display region 3340 indicates the properties available in the selected class, and a user may choose a property to assign to the keyword. In window 3310, the user may create a new property by assigning it to a classification.
  • FIG. 34 illustrates a user interface 3400 for adding and editing codes in the knowledge base. The keywords are provided in the tree 3410. A pop up window 3430 allows the user to select codes such as for billing. For example, the user may choose a code set from a drop down menu, and one or more individual codes. A display region 3420 displays the selected codes.
  • FIG. 35 illustrates a user interface 3500 for adding and editing sentence models in the knowledge base. The tree 3410 is the same as in FIG. 34. The keywords in the tree 3410 that trigger a sentence model may be highlighted, e.g., in bold font. A display region 3520 provides the sentence model for the normal or full sentence, while the display region 3530 provides the summary sentence model. The placeholder terms in brackets may be highlighted in yellow, for instance. The pop up window 3540 indicates the available nodes that the user may choose to insert into the sentence model.
  • FIG. 36 illustrates a user interface 3600 for adding a condition in a sentence model in the knowledge base. For example, the user may desire to add a conditional placeholder to the sentence model in the display region 3520. To do this, the user positions the cursor at the location in the sentence model in which the conditional placeholder is to be added, and clicks on the “conditional” icon, causing the window 3630 to pop up. The window 3630 allows the user to define a condition, operator, and values. The user also defines the text that is to be entered in the sentence model depending on whether or not the condition is met. The user can click on an icon 3635 to cause a window 3640 to pop up that displays values from which to select. Once the user defines the conditions, corresponding code, such as in the C language, is generated in the sentence model. The example sentence model in the display region 3620 is simplified. In practice, detailed sentence models contain several conditional statements to account, e.g., for variations in the way a finding can be described, the level of detail, grammatical concerns and so forth.
  • FIG. 37 illustrates a user interface 3700 for adding and editing triggers in the knowledge base. As mentioned previously, selected keywords may trigger an action such as printing a document or scheduling a patient recall. For instance, the user may select the keyword “polyp” from the keyword tree 3705. In a display region 3710, the user checks a check box 3720 to set an education trigger. Additionally, a document is selected from a drop down menu 3725 to identify a relevant document to provide to the patient. The document may be printed and handed to the patient during the patient's examination, mailed to the patient's home, or emailed to the patient, for instance. Another checkbox sets a recall reminder for a given number of days, weeks or months, based on a second drop down list. A follow-up examination may be scheduled based on the recall.
  • FIG. 38 illustrates a user interface 3800 for adding a keyword item to a menu in a view. The user selects a menu such as “colon findings” in the tree 3810, then selects from a menu of available terms in a list 3820, after selecting a classification in region 3830 to display a list of available keywords in the selected classification.
  • FIG. 39 illustrates a user interface 3900 for adding a menu in the knowledge base. The user selects a keyword, e.g., “normal”, from the tree 3910, for which the menu is to be provided. Assume the user desires to add a menu for “rectal sites” under “normal” in the tree 3910. To do this, the user types in the menu name in a display region 3920. Other factors, such as menu type, can also be defined.
  • FIG. 40 illustrates a user interface 4000 for creating a shortcut in the knowledge base. With this feature, the user can select multiple keywords at a time, which is desirable when the same group of keywords is selected over and over, e.g., for different patients. The user provides pre text 4025 for the report, which appears prior to the selected keywords, and post text 4030, which appears after the selected keywords. When the user selects the multiple keywords from the tree 4010, a sentence model is populated to provide a preview 4035 of the resulting sentence.
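  • The preview 4035 amounts to joining the shortcut's pre text, the selected keywords, and the post text. A minimal sketch with hypothetical names and example medication keywords:

    def shortcut_preview(pre_text, keywords, post_text):
        """Assemble the preview sentence for a shortcut (sketch only)."""
        return f"{pre_text} {', '.join(keywords)} {post_text}"


    print(shortcut_preview("The patient was given",
                           ["Demerol 50 mg IV", "Versed 2 mg IV"],
                           "prior to the procedure."))
    # The patient was given Demerol 50 mg IV, Versed 2 mg IV prior to the procedure.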
  • FIG. 41 illustrates a user interface 4100 for editing a shortcut in the knowledge base. The shortcut named “Std Meds” was created using the interface 4000 of FIG. 40. The keyword “Shortcuts” appears in the tree 4110 along with the specific shortcut “Std Meds”. The user can quickly generate the report in the display region 4120 by selecting the keyword “Std Meds”.
  • Grammar Engine
  • As discussed, the user may automatically generate a report from selected keywords by clicking on the Generate Report icon. Furthermore, a grammar checking routine or grammar engine may be used to correct or optimize the grammar in the populated sentences. The grammar engine may be run automatically when a sentence is populated. Various grammar engines, including those known in the art, may be used.
  • A grammar engine can be helpful for various reasons, such as ensuring that the verb and subject of a sentence agree, providing correct punctuation and capitalization, and ensuring that singular and plural nouns are properly modified. In one possible approach, a grammar engine includes three main components. First, a part-of-speech tagger assigns a part of speech (POS) tag to each word or word component (e.g., noun or verb) in the generated sentence. Second, a lexical analyzer is run to break the sentence into grammatical components, phrases, and clauses. Third, a correction component updates the sentence with the necessary corrections.
  • Each keyword in the KB may be assigned a primary tag by default that identifies the most likely grammatical characteristic of the keyword in the domain in which it is used. Tags for other allowable grammatical characteristics may also be provided. Tags may also be provided for the words in a sentence that are not keywords, such as static text. Example grammatical characteristics identify a word as being, e.g., an adjective, singular noun, plural noun, adverb and so forth. The tag may be a two-letter codeword, for instance.
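  • A toy illustration of how per-keyword tags could drive the correction pass is given below. It only handles article and verb agreement for singular versus plural nouns; a real grammar engine is far more involved, and the tag names and rules here are assumptions, not the patented design.

    # Two-letter tags as mentioned above: NN = singular noun, NNS = plural noun,
    # JJ = adjective; words not in the KB default to an "unknown" tag.
    KB_TAGS = {"polyp": "NN", "polyps": "NNS", "small": "JJ", "antrum": "NN"}

    def tag(words):
        return [(w, KB_TAGS.get(w.lower(), "XX")) for w in words]

    def correct(words):
        tagged = tag(words)
        out = list(words)
        for i, (word, _tag) in enumerate(tagged):
            # Verb agreement: a plural subject noun takes "were" rather than "was".
            if word.lower() == "was" and any(p == "NNS" for _, p in tagged[:i]):
                out[i] = "were"
            # Article agreement: drop "a" before a plural noun.
            if word.lower() == "a" and i + 1 < len(tagged) and tagged[i + 1][1] == "NNS":
                out[i] = ""
        sentence = " ".join(w for w in out if w)
        sentence = sentence[0].upper() + sentence[1:]
        return sentence if sentence.endswith(".") else sentence + "."

    print(correct(["a", "polyps", "was", "found", "in", "the", "antrum"]))
    # Polyps were found in the antrum.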
  • FIG. 42 illustrates user interfaces for a grammar engine. In accordance with a further aspect of the invention, user interfaces may be provided that allow a user to set grammatical characteristics, and view existing characteristics, for keywords or groups of keywords in the KB for use by a grammar engine. This allows the user to tailor the system to the user's preferences and needs, as well as to gain a better understanding of the operation of the grammar engine. A first interface 4200 allows the user to search a dictionary of terms. “Dilated bile duct” is an example. The interface indicates that the term has been assigned to the grammatical characteristic or category of NN, denoting a singular noun. The user then clicks on “edit” to edit the dictionary entry definition. A user interface 4220 provides the name of the entry and the base word, e.g., “duct”. The user can use check boxes, drop down menus and other widgets to set the grammatical characteristics of the entry. For example, the part of speech can be changed by clicking on the “edit” button, thereby causing the user interface 4240 to appear. The interface 4240 provides a list of available grammatical characteristics on the left hand display region 4242 and the one or more assigned characteristics on the right hand display region 4244. The user can assign an available characteristic by clicking on the characteristic in the display region 4242 and clicking on the right-pointing arrow. Similarly, the user can delete an assigned characteristic by clicking on the characteristic in the display region 4244 and clicking on the left-pointing arrow. When multiple characteristics are assigned, the user designates one of them as a primary characteristic by selecting the characteristic and clicking on the “primary” button in the display region 4244.
  • The invention has been described herein with reference to particular exemplary embodiments. Certain alterations and modifications may be apparent to those skilled in the art, without departing from the scope of the invention. The exemplary embodiments are meant to be illustrative, not limiting of the scope of the invention, which is defined by the appended claims.

Claims (29)

1. A method for generating a report regarding a procedure, comprising:
displaying available keywords from a knowledge base on a first display region of a user interface;
receiving, via the user interface, at least one user command selecting at least one of the available keywords from the first display region;
displaying the at least one of the available keywords on a second display region of the user interface, responsive to the selection thereof by the at least one user command;
populating a sentence model according to the at least one of the available keywords to provide a populated sentence; and
displaying the populated sentence on a third display region of the user interface.
2. The method of claim 1, wherein:
the populating the sentence model comprises selecting the sentence model from among a plurality of available sentence models according to the at least one of the available keywords.
3. The method of claim 1, further comprising:
editing the populated sentence by: (a) receiving, via the user interface, at least one user command selecting the at least one of the available keywords in the populated sentence, and at least one user command selecting another of the available keywords from the first display region, and (b) re-populating the sentence model using the another of the available keywords in place of the at least one of the available keywords.
4. The method of claim 1, wherein:
the displaying the populated sentence comprises displaying the populated sentence with the at least one of the available keywords highlighted therein.
5. The method of claim 1, wherein:
the populating the sentence model comprises providing static text in at least one text placeholder.
6. The method of claim 1, wherein:
the populating the sentence model comprises providing a specified property value in at least one node placeholder.
7. The method of claim 1, wherein:
the populating the sentence model comprises generating text in the at least one sentence that depends on the value of a property.
8. The method of claim 1, wherein:
the receiving comprises receiving, via the user interface, at least one user command selecting a plurality of the available keywords from the first display region;
the populating the sentence model comprises populating the sentence model according to the plurality of the available keywords to provide the populated sentence; and
the sentence model comprises a summary sentence model in which fewer than all of the plurality of the available keywords are used.
9. The method of claim 1, wherein:
the receiving comprises receiving, via the user interface, at least one user command selecting a plurality of the available keywords from the first display region;
the populating the sentence model comprises populating the sentence model according to the plurality of the available keywords to provide the populated sentence; and
the sentence model comprises a normal sentence model in which all of the plurality of the available keywords are used.
10. The method of claim 1, further comprising:
receiving, via the user interface, at least one user command setting a grammatical property of the at least one of the available keywords; and
analyzing a grammar of the at least one populated sentence based on the grammatical property.
11. The method of claim 1, wherein:
the displaying the available keywords comprises displaying the available keywords on the first display region organized according to at least one hierarchical menu wherein related keywords are child keywords of a parent keyword.
12. The method of claim 11, wherein:
the receiving comprises receiving, via the user interface, at least one user command selecting a plurality of the available keywords from the first display region; and
the displaying the at least one of the available keywords comprises displaying the plurality of the available keywords on the second display region organized according to at least one hierarchical menu in accordance with the at least one hierarchical menu in the first display region.
13. The method of claim 11, wherein:
the child keywords qualify the parent keyword.
14. The method of claim 1, wherein:
the receiving comprises receiving, via the user interface, a shortcut command for selecting multiple keywords at a time from the available keywords.
15. The method of claim 1, further comprising:
receiving at least one user command for maintaining the knowledge base.
16. The method of claim 15, wherein:
the at least one user command for maintaining the knowledge base comprises at least one user command for adding a keyword to the knowledge base.
17. The method of claim 15, wherein:
the at least one user command for maintaining the knowledge base comprises at least one user command for editing properties of the at least one of the available keywords.
18. The method of claim 15, wherein:
the at least one user command for maintaining the knowledge base comprises at least one user command for associating a billing code with the at least one of the available keywords.
19. The method of claim 15, wherein:
the at least one user command for maintaining the knowledge base comprises at least one user command for at least one of creating and editing the sentence model.
20. The method of claim 15, wherein:
the at least one user command for maintaining the knowledge base comprises at least one user command for at least one of adding and editing a trigger for the at least one of the available keywords.
21. The method of claim 1, further comprising:
displaying at least one image obtained from the procedure in a fourth display region of the user interface.
22. A method for providing keywords for generating a report regarding a procedure, comprising:
providing respective keywords for use in the report;
associating each of the respective keywords with a respective classification in a hierarchically arranged tree structure of classifications;
associating respective properties with the respective keywords; and
defining, based on the respective properties, a set of allowable values for a group of the respective keywords;
wherein the group of the respective keywords are related.
23. The method of claim 22, wherein:
the group of the respective keywords which are related are associated with the same classification.
24. The method of claim 22, wherein:
the group of the respective keywords which are related inherit from the same classification.
25. The method of claim 22, further comprising:
displaying the available keywords on a first display region of a user interface organized in a first tree structure that is based on the hierarchically arranged tree structure of classifications;
receiving, via the user interface, at least one user command selecting at least one of the available keywords from the first tree structure;
displaying the at least one of the available keywords on a second display region of the user interface, responsive to the receiving, in a second tree structure that is based on the first tree structure.
26. The method of claim 22, further comprising:
populating a sentence model according to the at least one of the available keywords to provide a populated sentence; and
displaying the populated sentence on the user interface.
27. The method of claim 22, further comprising:
querying a database containing the keywords based on at least one of the respective classifications.
28. A user interface for assisting a user in generating a report regarding a procedure, comprising:
a first display region displaying available keywords from a knowledge base;
wherein the user provides at least one user command selecting at least one of the available keywords from the first display region;
a second display region displaying the at least one of the available keywords responsive to the selection thereof by the at least one user command; and
a third display region displaying a populated sentence that is provided by populating a sentence model according to the at least one of the available keywords.
29. A program storage device tangibly embodying a program of instructions executable by a machine to perform a method for providing keywords for generating a report regarding a procedure, the method comprising:
providing respective keywords for use in the report;
associating each of the respective keywords with a respective classification in a hierarchically arranged tree structure of classifications;
associating respective properties with the respective keywords; and
defining, based on the respective properties, a set of allowable values for a group of the respective keywords;
wherein the group of the respective keywords are related.
US10/846,255 2003-05-16 2004-05-14 System and method for generating a report using a knowledge base Abandoned US20050114283A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/846,255 US20050114283A1 (en) 2003-05-16 2004-05-14 System and method for generating a report using a knowledge base

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US47134903P 2003-05-16 2003-05-16
US10/846,255 US20050114283A1 (en) 2003-05-16 2004-05-14 System and method for generating a report using a knowledge base

Publications (1)

Publication Number Publication Date
US20050114283A1 true US20050114283A1 (en) 2005-05-26

Family

ID=33476833

Family Applications (4)

Application Number Title Priority Date Filing Date
US10/846,254 Abandoned US20050075535A1 (en) 2003-05-16 2004-05-14 Data entry system for an endoscopic examination
US10/846,253 Expired - Fee Related US7492388B2 (en) 2003-05-16 2004-05-14 System and method for automatic processing of endoscopic images
US10/846,245 Abandoned US20050075544A1 (en) 2003-05-16 2004-05-14 System and method for managing an endoscopic lab
US10/846,255 Abandoned US20050114283A1 (en) 2003-05-16 2004-05-14 System and method for generating a report using a knowledge base

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US10/846,254 Abandoned US20050075535A1 (en) 2003-05-16 2004-05-14 Data entry system for an endoscopic examination
US10/846,253 Expired - Fee Related US7492388B2 (en) 2003-05-16 2004-05-14 System and method for automatic processing of endoscopic images
US10/846,245 Abandoned US20050075544A1 (en) 2003-05-16 2004-05-14 System and method for managing an endoscopic lab

Country Status (5)

Country Link
US (4) US20050075535A1 (en)
EP (4) EP1627356A4 (en)
JP (4) JP2007505419A (en)
CA (4) CA2526149A1 (en)
WO (4) WO2004103151A2 (en)

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050120002A1 (en) * 2003-10-02 2005-06-02 Hassan Behbehani Automated text generation process
US20060093199A1 (en) * 2004-11-04 2006-05-04 Fram Evan K Systems and methods for viewing medical 3D imaging volumes
US20060093198A1 (en) * 2004-11-04 2006-05-04 Fram Evan K Systems and methods for interleaving series of medical images
US20060095423A1 (en) * 2004-11-04 2006-05-04 Reicher Murray A Systems and methods for retrieval of medical data
US20060103729A1 (en) * 2003-12-12 2006-05-18 James Burns Computer-based image capture system
US20060236142A1 (en) * 2002-02-01 2006-10-19 Xerox Corporation Methods and systems for accessing email
US20060288004A1 (en) * 2005-06-15 2006-12-21 Nintendo Co., Ltd. Storage medium storing program and information processing apparatus
US20070143143A1 (en) * 2005-12-16 2007-06-21 Siemens Medical Solutions Health Services Corporation Patient Discharge Data Processing System
US20080086330A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Providing multidisciplinary activities in context of clinician's role relevant activities
US20080086684A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Clinical activity navigator
US20080086334A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Providing clinical activity details in context
US20080086333A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Documentation of medication activities in context of mar
US20080086332A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Viewing clinical activity details within a selected time period
US20080086331A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Acknowledgement of previous results for medication administration
US20080086329A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Rescheduling clinical activities in context of activities view
US20080086336A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Patient outcomes in context of documentation
US20080086328A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Patient activity coordinator
US20080281629A1 (en) * 2007-05-08 2008-11-13 Medaptus, Inc. Method and System for Processing Medical Billing Records
US20090037222A1 (en) * 2007-08-02 2009-02-05 Kuo Eric E Clinical data file
US20090086269A1 (en) * 2007-09-28 2009-04-02 Kyocera Mita Corporation Image Forming Apparatus and Image Forming System
US20090132586A1 (en) * 2007-11-19 2009-05-21 Brian Napora Management of Medical Workflow
US20090144051A1 (en) * 2007-12-04 2009-06-04 Nhn Corporation Method of providing personal dictionary
US20090287487A1 (en) * 2008-05-14 2009-11-19 General Electric Company Systems and Methods for a Visual Indicator to Track Medical Report Dictation Progress
US20100138239A1 (en) * 2008-11-19 2010-06-03 Dr Systems, Inc. System and method of providing dynamic and customizable medical examination forms
US20100201714A1 (en) * 2004-11-04 2010-08-12 Dr Systems, Inc. Systems and methods for viewing medical images
US20110029853A1 (en) * 2009-08-03 2011-02-03 Webtrends, Inc. Advanced visualizations in analytics reporting
US20110054884A1 (en) * 2007-09-17 2011-03-03 Capfinder Aktiebolag System for assisting in drafting applications
US7953614B1 (en) 2006-11-22 2011-05-31 Dr Systems, Inc. Smart placement rules
US20110239099A1 (en) * 2010-03-23 2011-09-29 Disney Enterprises, Inc. System and method for video poetry using text based related media
US8094901B1 (en) 2004-11-04 2012-01-10 Dr Systems, Inc. Systems and methods for matching, naming, and displaying medical images
US20120173281A1 (en) * 2011-01-05 2012-07-05 Dilella James M Automated data entry and transcription system, especially for generation of medical reports by an attending physician
US20120323576A1 (en) * 2011-06-17 2012-12-20 Microsoft Corporation Automated adverse drug event alerts
US20130191154A1 (en) * 2012-01-22 2013-07-25 Dobkin William R. Medical data system generating automated surgical reports
US20140006294A1 (en) * 2012-06-28 2014-01-02 Korea Atomic Energy Research Institute Technology Trend Analysis Report Generating System and Recording Medium
US20140013219A1 (en) * 2012-07-06 2014-01-09 Canon Kabushiki Kaisha Apparatus and method for generating inspection report(s)
US8712120B1 (en) 2009-09-28 2014-04-29 Dr Systems, Inc. Rules-based approach to transferring and/or viewing medical images
US20140359509A1 (en) * 2013-05-31 2014-12-04 Alp Sinan Baran Templates
WO2014197812A1 (en) * 2013-06-07 2014-12-11 Kiglies Mauricio Electronic on-line motor vehicle management and auction system
WO2015026485A3 (en) * 2013-08-20 2015-06-18 Intelligent Medical Objects, Inc. System and method for implementing a 64 bit data searching and delivery portal
US9092551B1 (en) 2011-08-11 2015-07-28 D.R. Systems, Inc. Dynamic montage reconstruction
US9411804B1 (en) 2013-07-17 2016-08-09 Yseop Sa Techniques for automatic generation of natural language text
WO2016126859A1 (en) * 2015-02-03 2016-08-11 Texas Tech University System Graphical user interface system for interactive, hierarchical, multi-panel comprehension of multi-format data
WO2016135100A1 (en) * 2015-02-23 2016-09-01 Qmedify Gmbh Apparatus and method for producing a medical report
US10037317B1 (en) * 2013-07-17 2018-07-31 Yseop Sa Techniques for automatic generation of natural language text
US10665342B2 (en) 2013-01-09 2020-05-26 Merge Healthcare Solutions Inc. Intelligent management of computerized advanced processing
US10909168B2 (en) 2015-04-30 2021-02-02 Merge Healthcare Solutions Inc. Database systems and interactive user interfaces for dynamic interaction with, and review of, digital medical image data
US20220139548A1 (en) * 2018-10-23 2022-05-05 Nec Corporation Nursing assistance device, nursing assistance method, and recording medium
US11354007B2 (en) * 2015-04-07 2022-06-07 Olympus America, Inc. Diagram based visual procedure note writing tool

Families Citing this family (198)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070167681A1 (en) * 2001-10-19 2007-07-19 Gill Thomas J Portable imaging system employing a miniature endoscope
US7252633B2 (en) * 2002-10-18 2007-08-07 Olympus Corporation Remote controllable endoscope system
US7240119B2 (en) * 2002-11-04 2007-07-03 Ge Fanuc Automation North America, Inc. Method for configuring a programmable logic controller using an extensible markup language schema
US8719053B2 (en) * 2003-07-17 2014-05-06 Ventana Medical Systems, Inc. Laboratory instrumentation information management and control network
US7860727B2 (en) * 2003-07-17 2010-12-28 Ventana Medical Systems, Inc. Laboratory instrumentation information management and control network
US20080235055A1 (en) * 2003-07-17 2008-09-25 Scott Mattingly Laboratory instrumentation information management and control network
US7302444B1 (en) * 2003-08-15 2007-11-27 Microsoft Corporation System for designating grid-based database reports
US20070185390A1 (en) * 2003-08-19 2007-08-09 Welch Allyn, Inc. Information workflow for a medical diagnostic workstation
US20050050439A1 (en) * 2003-08-28 2005-03-03 Xerox Corporation Method to distribute a document to one or more recipients and document distributing apparatus arranged in accordance with the same method
US7779039B2 (en) 2004-04-02 2010-08-17 Salesforce.Com, Inc. Custom entities and fields in a multi-tenant database system
US20050091191A1 (en) * 2003-09-24 2005-04-28 Greg Miller System and method for managing and utilizing information
US8065161B2 (en) 2003-11-13 2011-11-22 Hospira, Inc. System for maintaining drug information and communicating with medication delivery devices
US7490021B2 (en) * 2003-10-07 2009-02-10 Hospira, Inc. Method for adjusting pump screen brightness
US9123077B2 (en) 2003-10-07 2015-09-01 Hospira, Inc. Medication management system
US7149973B2 (en) * 2003-11-05 2006-12-12 Sonic Foundry, Inc. Rich media event production system and method including the capturing, indexing, and synchronizing of RGB-based graphic content
US20050114180A1 (en) * 2003-11-26 2005-05-26 Ploetz Lawrence E. System and method for providing potential problem solutions to a service provider
US7840416B2 (en) * 2003-12-23 2010-11-23 ProVation Medical Inc. Naturally expressed medical procedure descriptions generated via synchronized diagrams and menus
JP4049115B2 (en) * 2004-03-15 2008-02-20 セイコーエプソン株式会社 projector
US20050273365A1 (en) * 2004-06-04 2005-12-08 Agfa Corporation Generalized approach to structured medical reporting
US7774326B2 (en) * 2004-06-25 2010-08-10 Apple Inc. Methods and systems for managing data
US7437358B2 (en) 2004-06-25 2008-10-14 Apple Inc. Methods and systems for managing data
US7730012B2 (en) 2004-06-25 2010-06-01 Apple Inc. Methods and systems for managing data
JP2006006834A (en) * 2004-06-29 2006-01-12 Pentax Corp Electronic endoscope system
US7970631B2 (en) * 2004-08-31 2011-06-28 Ethicon Endo-Surgery, Inc. Medical effector system
JP4690683B2 (en) * 2004-09-13 2011-06-01 株式会社東芝 Ultrasonic diagnostic apparatus and medical image browsing method
EP1650980A3 (en) * 2004-10-20 2010-09-29 FUJIFILM Corporation Electronic endoscope apparatus
US7747959B2 (en) * 2004-12-17 2010-06-29 Siebel Systems, Inc. Flexible and extensible combination user interfaces
US20060136832A1 (en) * 2004-12-17 2006-06-22 Siebel Systems, Inc. Flexible and extensible combination user interfaces
US20060162546A1 (en) * 2005-01-21 2006-07-27 Sanden Corporation Sealing member of a compressor
US20060173713A1 (en) * 2005-01-26 2006-08-03 Alan Petro Integrated medical device and healthcare information system
EP1895900A4 (en) * 2005-05-06 2009-12-30 Stereoraxis Inc Preoperative and intra-operative imaging-based procedure workflow with complexity scoring
AU2006245251B2 (en) * 2005-05-12 2009-10-08 Olympus Medical Systems Corp. Biometric instrument
US8527540B2 (en) * 2005-05-24 2013-09-03 Business Objects Software Ltd. Augmenting a report with metadata for export to a non-report document
US7983943B2 (en) * 2005-05-27 2011-07-19 Xerox Corporation Method and system for workflow process node synchronization
JP4879519B2 (en) * 2005-06-03 2012-02-22 株式会社ニデック Medical information management system
JP5395434B2 (en) 2005-09-09 2014-01-22 セールスフォース ドット コム インコーポレイティッド System and method for exporting, publishing, browsing and installing on-demand applications in a multi-tenant database environment
US8121863B2 (en) * 2005-09-12 2012-02-21 Diakides Nicholas A Method for detecting abnormalities in medical screening
US20070067179A1 (en) * 2005-09-16 2007-03-22 Wizard International, Inc. Framed art visualization software
US7403123B2 (en) * 2005-09-30 2008-07-22 General Electric Company Method and apparatus for displaying a patient worklist
JP4967317B2 (en) * 2005-11-17 2012-07-04 コニカミノルタエムジー株式会社 Information processing system
JP4791840B2 (en) * 2006-02-06 2011-10-12 株式会社日立ハイテクノロジーズ Charged particle beam apparatus, scanning electron microscope, and sample inspection method
US8566113B2 (en) * 2006-02-07 2013-10-22 International Business Machines Corporation Methods, systems and computer program products for providing a level of anonymity to patient records/information
US20070203744A1 (en) * 2006-02-28 2007-08-30 Stefan Scholl Clinical workflow simulation tool and method
US20070238962A1 (en) * 2006-04-06 2007-10-11 Andreas Hartlep Transfer of treatment planning information using standard image transfer protocols
US7861159B2 (en) * 2006-04-07 2010-12-28 Pp Associates, Lp Report generation with integrated quality management
EP2015678B1 (en) 2006-05-08 2014-09-03 C.R. Bard, Inc. User interface and methods for sonographic display device
US20080016120A1 (en) * 2006-06-29 2008-01-17 Yosi Markovich System and method for case management
US20080091065A1 (en) * 2006-10-04 2008-04-17 Olympus Medical Systems Corporation Medical image processing apparatus, endoscope system and medical image processing system
AU2007317669A1 (en) 2006-10-16 2008-05-15 Hospira, Inc. System and method for comparing and utilizing activity information and configuration information from multiple device management systems
US8694907B2 (en) * 2006-11-29 2014-04-08 Siemens Medical Solutions Usa, Inc. Imaging study completion processing system
JP2008149027A (en) * 2006-12-19 2008-07-03 Olympus Corp Endoscope apparatus
US8643484B2 (en) 2006-12-20 2014-02-04 Thomson Licensing Visual alert system for set-top box standby mode
WO2008089204A1 (en) * 2007-01-15 2008-07-24 Allscripts Healthcare Solutions, Inc. Universal application integrator
US10474318B2 (en) * 2007-03-26 2019-11-12 Adobe Inc. Systems and methods for controlling the display of tools based on document state
CN101652097A (en) * 2007-04-12 2010-02-17 皇家飞利浦电子股份有限公司 With the bonded image capturing of vital signs bedside monitor
JP2009022689A (en) * 2007-07-24 2009-02-05 Hoya Corp Electronic endoscope apparatus
WO2009015466A1 (en) * 2007-07-27 2009-02-05 The Hospital For Sick Children A medical vital sign indication tool, system and method
US20090138282A1 (en) * 2007-11-28 2009-05-28 Chuck Lee System and Method for Tracking and Maintaining Vascular Access Medical Records
US8517990B2 (en) 2007-12-18 2013-08-27 Hospira, Inc. User interface improvements for medical devices
US20090189978A1 (en) * 2008-01-29 2009-07-30 Olympus Medical Systems Corp. Medical support control system
US20100017223A1 (en) * 2008-03-03 2010-01-21 Amy Johnson Electronic donor medical records management system
US7765489B1 (en) 2008-03-03 2010-07-27 Shah Shalin N Presenting notifications related to a medical study on a toolbar
JP2009219573A (en) * 2008-03-14 2009-10-01 Fujinon Corp Image processor for endoscope and image processing method for endoscope
US10875182B2 (en) 2008-03-20 2020-12-29 Teladoc Health, Inc. Remote presence system mounted to operating room hardware
US20090254867A1 (en) * 2008-04-03 2009-10-08 Microsoft Corporation Zoom for annotatable margins
US8065167B1 (en) * 2008-05-09 2011-11-22 Robert Kurt Wyman Computer systems for managing patient discharge
US20090307618A1 (en) * 2008-06-05 2009-12-10 Microsoft Corporation Annotate at multiple levels
US20090316013A1 (en) * 2008-06-20 2009-12-24 Largent Jana C System and method for processing medical graphics data
US10162477B2 (en) * 2008-07-23 2018-12-25 The Quantum Group, Inc. System and method for personalized fast navigation
JP5215105B2 (en) * 2008-09-30 2013-06-19 オリンパスメディカルシステムズ株式会社 Image display device, image display method, and image display program
US8862485B2 (en) * 2008-10-15 2014-10-14 Rady Children's Hospital—San Diego System and method for data quality assurance cycle
US20100097576A1 (en) * 2008-10-16 2010-04-22 Woodlyn, Inc. Administering and analyzing ophthalmic examinations
IT1392731B1 (en) * 2009-01-27 2012-03-16 Techlab Works S A S Di Luigi Tummino & C INTEGRATED SYSTEM FOR ENDOSCOPIC ANALYSIS.
EP2419847A1 (en) * 2009-04-17 2012-02-22 Koninklijke Philips Electronics N.V. System and method for storing a candidate report
US8271106B2 (en) 2009-04-17 2012-09-18 Hospira, Inc. System and method for configuring a rule set for medical event management and responses
US11547275B2 (en) 2009-06-18 2023-01-10 Endochoice, Inc. Compact multi-viewing element endoscope system
US9101268B2 (en) 2009-06-18 2015-08-11 Endochoice Innovation Center Ltd. Multi-camera endoscope
US10165929B2 (en) 2009-06-18 2019-01-01 Endochoice, Inc. Compact multi-viewing element endoscope system
US9101287B2 (en) 2011-03-07 2015-08-11 Endochoice Innovation Center Ltd. Multi camera endoscope assembly having multiple working channels
US9872609B2 (en) 2009-06-18 2018-01-23 Endochoice Innovation Center Ltd. Multi-camera endoscope
US9706903B2 (en) 2009-06-18 2017-07-18 Endochoice, Inc. Multiple viewing elements endoscope system with modular imaging units
US11278190B2 (en) 2009-06-18 2022-03-22 Endochoice, Inc. Multi-viewing element endoscope
US9492063B2 (en) 2009-06-18 2016-11-15 Endochoice Innovation Center Ltd. Multi-viewing element endoscope
US11864734B2 (en) 2009-06-18 2024-01-09 Endochoice, Inc. Multi-camera endoscope
US9713417B2 (en) 2009-06-18 2017-07-25 Endochoice, Inc. Image capture assembly for use in a multi-viewing elements endoscope
WO2012120507A1 (en) 2011-02-07 2012-09-13 Peermedical Ltd. Multi-element cover for a multi-camera endoscope
US9642513B2 (en) 2009-06-18 2017-05-09 Endochoice Inc. Compact multi-viewing element endoscope system
EP3811847A1 (en) 2009-06-18 2021-04-28 EndoChoice, Inc. Multi-camera endoscope
US9402533B2 (en) 2011-03-07 2016-08-02 Endochoice Innovation Center Ltd. Endoscope circuit board assembly
US8926502B2 (en) 2011-03-07 2015-01-06 Endochoice, Inc. Multi camera endoscope having a side service channel
US9901244B2 (en) 2009-06-18 2018-02-27 Endochoice, Inc. Circuit board assembly of a multiple viewing elements endoscope
US8863031B2 (en) * 2009-07-17 2014-10-14 Andre Gene Douen Systems, methods and articles for managing presentation of information
US20110016427A1 (en) * 2009-07-17 2011-01-20 Andre Gene Douen Systems, Methods and Articles For Managing Presentation of Information
US20110029326A1 (en) * 2009-07-28 2011-02-03 General Electric Company, A New York Corporation Interactive healthcare media devices and systems
US20110029325A1 (en) * 2009-07-28 2011-02-03 General Electric Company, A New York Corporation Methods and apparatus to enhance healthcare information analyses
US20110071850A1 (en) * 2009-09-23 2011-03-24 General Electric Company Method and system for managing healthcare resources
US20110112850A1 (en) * 2009-11-09 2011-05-12 Roberto Beraja Medical decision system including medical observation locking and associated methods
US20110191356A1 (en) * 2010-02-01 2011-08-04 Gazula Krishna Advanced application for capturing, storing and retrieving digital images of a patient condition during a real-time virtual face-to-face encounter
WO2011100577A2 (en) * 2010-02-12 2011-08-18 Procure Treatment Centers, Inc. Robotic mobile anesthesia system
US8670017B2 (en) 2010-03-04 2014-03-11 Intouch Technologies, Inc. Remote presence system including a cart that supports a robot face and an overhead camera
US20130041681A1 (en) * 2010-03-04 2013-02-14 Koninklijke Philips Electronics N.V. Clinical decision support system with temporal context
US10019685B1 (en) * 2010-07-28 2018-07-10 EPOWERdoc, Inc. Emergency department information system
EP3718466B1 (en) 2010-09-20 2023-06-07 EndoChoice, Inc. Endoscope distal section comprising a unitary fluid channeling component
US9560953B2 (en) 2010-09-20 2017-02-07 Endochoice, Inc. Operational interface in a multi-viewing element endoscope
CN103403605A (en) 2010-10-28 2013-11-20 恩多巧爱思创新中心有限公司 Optical systems for multi-sensor endoscopes
JP6054874B2 (en) 2010-12-09 2016-12-27 エンドチョイス イノベーション センター リミテッド Flexible electronic circuit board for multi-camera endoscope
EP3420886B8 (en) 2010-12-09 2020-07-15 EndoChoice, Inc. Flexible electronic circuit board multi-camera endoscope
US11889986B2 (en) 2010-12-09 2024-02-06 Endochoice, Inc. Flexible electronic circuit board for a multi-camera endoscope
JP5657375B2 (en) * 2010-12-24 2015-01-21 オリンパス株式会社 Endoscope apparatus and program
EP2668008A4 (en) 2011-01-28 2018-01-24 Intouch Technologies, Inc. Interfacing with a mobile telepresence robot
US10674968B2 (en) 2011-02-10 2020-06-09 Karl Storz Imaging, Inc. Adjustable overlay patterns for medical display
US10631712B2 (en) 2011-02-10 2020-04-28 Karl Storz Imaging, Inc. Surgeon's aid for medical display
US11412998B2 (en) 2011-02-10 2022-08-16 Karl Storz Imaging, Inc. Multi-source medical display
US20120330674A1 (en) * 2011-02-21 2012-12-27 John Brimm Hospital-acquired infections dashboard systems and methods
US8606597B2 (en) 2011-02-24 2013-12-10 Olympus Corporation Endoscope inspection report creating apparatus, creating method of endoscope inspection report and storage medium
US10769739B2 (en) * 2011-04-25 2020-09-08 Intouch Technologies, Inc. Systems and methods for management of information among medical providers and facilities
WO2012174539A1 (en) * 2011-06-17 2012-12-20 Parallax Enterprises Consolidated healthcare and resource management system
CA2844807C (en) 2011-08-19 2022-07-26 Hospira, Inc. Systems and methods for a graphical interface including a graphical representation of medical data
AU2012325937B2 (en) 2011-10-21 2018-03-01 Icu Medical, Inc. Medical device update system
EP3659491A1 (en) 2011-12-13 2020-06-03 EndoChoice Innovation Center Ltd. Removable tip endoscope
EP2604172B1 (en) 2011-12-13 2015-08-12 EndoChoice Innovation Center Ltd. Rotatable connector for an endoscope
US10022498B2 (en) 2011-12-16 2018-07-17 Icu Medical, Inc. System for monitoring and delivering medication to a patient and method of using the same to minimize the risks associated with automated therapy
US8755606B2 (en) 2011-12-16 2014-06-17 Harris Corporation Systems and methods for efficient feature extraction accuracy using imperfect extractors
US8832593B2 (en) * 2011-12-16 2014-09-09 Harris Corporation Systems and methods for efficient spatial feature analysis
US8855427B2 (en) 2011-12-16 2014-10-07 Harris Corporation Systems and methods for efficiently and accurately detecting changes in spatial feature data
US9152764B2 (en) * 2012-02-02 2015-10-06 Photon Medical Communications, Inc. Systems and methods for managing data
JP6306566B2 (en) 2012-03-30 2018-04-04 アイシーユー・メディカル・インコーポレーテッド Air detection system and method for detecting air in an infusion system pump
CA2871674A1 (en) * 2012-05-31 2013-12-05 Ikonopedia, Inc. Image based analytical systems and processes
JP2013250967A (en) * 2012-05-31 2013-12-12 Xerox Corp Asynchronous personalization of records using dynamic scripting
US20130339051A1 (en) * 2012-06-18 2013-12-19 George M. Dobrean System and method for generating textual report content
US9560954B2 (en) 2012-07-24 2017-02-07 Endochoice, Inc. Connector for use with endoscope
ES2743160T3 (en) 2012-07-31 2020-02-18 Icu Medical Inc Patient care system for critical medications
CN104956391A (en) * 2012-09-13 2015-09-30 帕克兰临床创新中心 Clinical dashboard user interface system and method
US20140074495A1 (en) * 2012-09-13 2014-03-13 Arne Brock-Utne Ambulatory surgery centers
WO2014081867A2 (en) 2012-11-20 2014-05-30 Ikonopedia, Inc. Secure data transmission
AU2014225658B2 (en) 2013-03-06 2018-05-31 Icu Medical, Inc. Medical device communication method
US9993142B2 (en) 2013-03-28 2018-06-12 Endochoice, Inc. Fluid distribution device for a multiple viewing elements endoscope
US9986899B2 (en) 2013-03-28 2018-06-05 Endochoice, Inc. Manifold for a multiple viewing elements endoscope
US10499794B2 (en) 2013-05-09 2019-12-10 Endochoice, Inc. Operational interface in a multi-viewing element endoscope
WO2014190264A1 (en) 2013-05-24 2014-11-27 Hospira, Inc. Multi-sensor infusion system for detecting air or an occlusion in the infusion system
ES2838450T3 (en) 2013-05-29 2021-07-02 Icu Medical Inc Infusion set that uses one or more sensors and additional information to make an air determination relative to the infusion set
AU2014274122A1 (en) 2013-05-29 2016-01-21 Icu Medical, Inc. Infusion system and method of use which prevents over-saturation of an analog-to-digital converter
US11425579B2 (en) * 2013-07-09 2022-08-23 Commscope Technologies Llc Signal distribution interface
US20150019236A1 (en) * 2013-07-15 2015-01-15 Covidien Lp Data age display and management
US20150066535A1 (en) * 2013-08-28 2015-03-05 George M. Dobrean System and method for reporting multiple medical procedures
EP3039596A4 (en) 2013-08-30 2017-04-12 Hospira, Inc. System and method of monitoring and managing a remote infusion regimen
US9662436B2 (en) 2013-09-20 2017-05-30 Icu Medical, Inc. Fail-safe drug infusion therapy system
US10311972B2 (en) 2013-11-11 2019-06-04 Icu Medical, Inc. Medical device system performance index
EP3071253B1 (en) 2013-11-19 2019-05-22 ICU Medical, Inc. Infusion pump automation system and method
US9342215B2 (en) 2013-12-24 2016-05-17 Adobe Systems Incorporated Automatic environment restoration for a particular artwork
JP2015146550A (en) * 2014-02-04 2015-08-13 ソニー株式会社 information processing apparatus, information processing method, and program
JP6636442B2 (en) 2014-02-28 2020-01-29 アイシーユー・メディカル・インコーポレーテッド Infusion systems and methods utilizing dual wavelength optical in-pipe air detection
JP6140100B2 (en) * 2014-04-23 2017-05-31 富士フイルム株式会社 Endoscope apparatus, image processing apparatus, and operation method of endoscope apparatus
US9764082B2 (en) 2014-04-30 2017-09-19 Icu Medical, Inc. Patient care system with conditional alarm forwarding
AU2015266706B2 (en) 2014-05-29 2020-01-30 Icu Medical, Inc. Infusion system and pump with configurable closed loop delivery rate catch-up
US9724470B2 (en) 2014-06-16 2017-08-08 Icu Medical, Inc. System for monitoring and delivering medication to a patient and method of using the same to minimize the risks associated with automated therapy
EP3175773A4 (en) * 2014-07-30 2018-10-10 Olympus Corporation Image processing device
US9539383B2 (en) 2014-09-15 2017-01-10 Hospira, Inc. System and method that matches delayed infusion auto-programs with manually entered infusion programs and analyzes differences therein
CN106999257A (en) * 2014-09-23 2017-08-01 外科安全技术公司 Operating room black box device, system, method and computer-readable medium
US10440246B2 (en) * 2014-11-19 2019-10-08 Kiran K. Bhat System for enabling remote annotation of media data captured using endoscopic instruments and the creation of targeted digital advertising in a documentation environment using diagnosis and procedure code entries
US11344668B2 (en) 2014-12-19 2022-05-31 Icu Medical, Inc. Infusion system with concurrent TPN/insulin infusion
US10850024B2 (en) 2015-03-02 2020-12-01 Icu Medical, Inc. Infusion system, device, and method having advanced infusion features
CA2988094A1 (en) 2015-05-26 2016-12-01 Icu Medical, Inc. Infusion pump system and method with multiple drug library editor source capability
US20170132320A1 (en) * 2015-11-10 2017-05-11 Lexmark International Technology, SA System and Methods for Transmitting Health level 7 Data from One or More Sending Applications to a Dictation System
US10769363B2 (en) 2015-11-10 2020-09-08 Hyland Switzerland Sàrl System and methods for transmitting clinical data having multi-segment fields from one or more modalities to a dictation machine
US10664568B2 (en) * 2015-11-10 2020-05-26 Hyland Switzerland Sàrl System and methods for transmitting clinical data from one or more sending applications to a dictation system
US20190006032A1 (en) * 2015-12-30 2019-01-03 Koninklijke Philips N.V. Interventional medical reporting apparatus
JP6603590B2 (en) * 2016-01-28 2019-11-06 富士フイルム株式会社 Medical support device, its operating method and operating program, and medical support system
EP3454922B1 (en) 2016-05-13 2022-04-06 ICU Medical, Inc. Infusion pump system with common line auto flush
CA3027176A1 (en) 2016-06-10 2017-12-14 Icu Medical, Inc. Acoustic flow sensor for continuous medication flow measurements and feedback control of infusion
US9705931B1 (en) 2016-07-13 2017-07-11 Lifetrack Medical Systems Inc. Managing permissions
NZ750032A (en) 2016-07-14 2020-05-29 Icu Medical Inc Multi-communication path selection and security system for a medical device
JP6779089B2 (en) * 2016-10-05 2020-11-04 富士フイルム株式会社 Endoscope system and how to drive the endoscope system
US11862302B2 (en) 2017-04-24 2024-01-02 Teladoc Health, Inc. Automated transcription and documentation of tele-health encounters
US10483007B2 (en) 2017-07-25 2019-11-19 Intouch Technologies, Inc. Modular telehealth cart with thermal imaging and touch screen user interface
US11636944B2 (en) 2017-08-25 2023-04-25 Teladoc Health, Inc. Connectivity infrastructure for a telehealth platform
US10642451B2 (en) 2017-11-06 2020-05-05 Lifetrack Medical Systems Private Ltd. Computer control of access to fields and features of an application
US10089055B1 (en) 2017-12-27 2018-10-02 Icu Medical, Inc. Synchronized display of screen content on networked devices
US20190303825A1 (en) * 2018-04-02 2019-10-03 Lynell J. De Wind Healthcare Project Management process
US10617299B2 (en) 2018-04-27 2020-04-14 Intouch Technologies, Inc. Telehealth cart that supports a removable tablet with seamless audio/video switching
JP7171274B2 (en) * 2018-07-06 2022-11-15 ソニー・オリンパスメディカルソリューションズ株式会社 Medical image processing device and medical observation device
US10950339B2 (en) 2018-07-17 2021-03-16 Icu Medical, Inc. Converting pump messages in new pump protocol to standardized dataset messages
US11139058B2 (en) 2018-07-17 2021-10-05 Icu Medical, Inc. Reducing file transfer between cloud environment and infusion pumps
ES2962660T3 (en) 2018-07-17 2024-03-20 Icu Medical Inc Systems and methods to facilitate clinical messaging in a network environment
EP3824386B1 (en) 2018-07-17 2024-02-21 ICU Medical, Inc. Updating infusion pump drug libraries and operational software in a networked environment
US10692595B2 (en) 2018-07-26 2020-06-23 Icu Medical, Inc. Drug library dynamic version management
WO2020023231A1 (en) 2018-07-26 2020-01-30 Icu Medical, Inc. Drug library management system
GB2612245B (en) * 2018-10-03 2023-08-30 Cmr Surgical Ltd Automatic endoscope video augmentation
CN109472472A (en) * 2018-10-26 2019-03-15 南京米好信息安全有限公司 A kind of artificial intelligence points-scoring system
TWI715166B (en) * 2019-08-27 2021-01-01 宏正自動科技股份有限公司 Multi-screen control system
US11278671B2 (en) 2019-12-04 2022-03-22 Icu Medical, Inc. Infusion pump with safety sequence keypad
CN111009296B (en) * 2019-12-06 2023-05-09 安翰科技(武汉)股份有限公司 Capsule endoscopy report labeling method, device and medium
US11109741B1 (en) 2020-02-21 2021-09-07 Ambu A/S Video processing apparatus
US10835106B1 (en) 2020-02-21 2020-11-17 Ambu A/S Portable monitor
US10980397B1 (en) 2020-02-21 2021-04-20 Ambu A/S Video processing device
US11166622B2 (en) 2020-02-21 2021-11-09 Ambu A/S Video processing apparatus
WO2022020184A1 (en) 2020-07-21 2022-01-27 Icu Medical, Inc. Fluid transfer devices and methods of use
US11135360B1 (en) 2020-12-07 2021-10-05 Icu Medical, Inc. Concurrent infusion with common line auto flush

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5715449A (en) * 1994-06-20 1998-02-03 Oceania, Inc. Method for generating structured medical text through user selection of displayed text and rules
US5740801A (en) * 1993-03-31 1998-04-21 Branson; Philip J. Managing information in an endoscopy system
US5823948A (en) * 1996-07-08 1998-10-20 Rlis, Inc. Medical records, documentation, tracking and order entry system
US5911133A (en) * 1997-10-22 1999-06-08 Rush-Presbyterian -St. Luke's Medical Center User interface for echocardiographic report generation
US5920317A (en) * 1996-06-11 1999-07-06 Vmi Technologies Incorporated System and method for storing and displaying ultrasound images
US5924074A (en) * 1996-09-27 1999-07-13 Azron Incorporated Electronic medical records system
US6032120A (en) * 1997-12-16 2000-02-29 Acuson Corporation Accessing stored ultrasound images and other digital medical images
US6047259A (en) * 1997-12-30 2000-04-04 Medical Management International, Inc. Interactive method and system for managing physical exams, diagnosis and treatment protocols in a health care practice
US6353817B1 (en) * 1998-06-26 2002-03-05 Charles M Jacobs Multi-user system for creating and maintaining a medical-decision-making knowledge base
US20020072896A1 (en) * 1998-04-01 2002-06-13 Cyberpulse,L.L.C. Structured speech recognition
US20020111932A1 (en) * 1998-04-01 2002-08-15 Cyberpulse, L.L.C. Method and system for generation of medical reports from data in a hierarchically-organized database
US6597392B1 (en) * 1997-10-14 2003-07-22 Healthcare Vision, Inc. Apparatus and method for computerized multi-media data organization and transmission
US20040168119A1 (en) * 2003-02-24 2004-08-26 David Liu Method and apparatus for creating a report

Family Cites Families (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4724844A (en) * 1985-06-26 1988-02-16 Stephen Rafelson Vital sign modular unit
JPS626212A (en) 1985-07-02 1987-01-13 Olympus Optical Co Ltd Image signal processing circuit
US5031036A (en) * 1988-07-11 1991-07-09 Olympus Optical Co., Ltd. Electronic endoscope apparatus simultaneously displaying an original picture image and special picture image on a single displaying picture surface
US5697885A (en) * 1989-01-30 1997-12-16 Olympus Optical Co., Ltd. Endoscope for recording and displaying time-serial images
US5583566A (en) * 1989-05-12 1996-12-10 Olympus Optical Co., Ltd. Combined medical image and data transmission with data storage, in which character/diagram information is transmitted with video data
JP3041015B2 (en) * 1990-04-18 2000-05-15 オリンパス光学工業株式会社 Endoscope image file system
JPH0595900A (en) * 1991-04-11 1993-04-20 Olympus Optical Co Ltd Endoscope image processing device
JP3228972B2 (en) * 1991-10-31 2001-11-12 株式会社東芝 Medical image storage communication system
JPH06139287A (en) * 1992-10-30 1994-05-20 Toshiba Corp Picture recording and reproducing device
US5447164A (en) * 1993-11-08 1995-09-05 Hewlett-Packard Company Interactive medical information display system and method for displaying user-definable patient events
JP3732865B2 (en) * 1995-01-18 2006-01-11 ペンタックス株式会社 Endoscope device
US5624398A (en) * 1996-02-08 1997-04-29 Symbiosis Corporation Endoscopic robotic surgical tools and methods
CA2252698A1 (en) * 1996-04-23 1997-10-30 Deroyal Industries, Inc. Method for the administration of health care employing a computer generated model
US5797838A (en) * 1996-09-13 1998-08-25 Colin Corporation Physical-information-image displaying apparatus
JP2815346B2 (en) * 1997-01-31 1998-10-27 株式会社亀田医療情報研究所 Medical planning support system
US6252597B1 (en) * 1997-02-14 2001-06-26 Netscape Communications Corporation Scalable user interface for graphically representing hierarchical data
US6345260B1 (en) * 1997-03-17 2002-02-05 Allcare Health Management System, Inc. Scheduling interface system and method for medical professionals
US6106457A (en) * 1997-04-04 2000-08-22 Welch Allyn, Inc. Compact imaging instrument system
JPH1132986A (en) * 1997-07-16 1999-02-09 Olympus Optical Co Ltd Endoscope system
JP3855462B2 (en) * 1998-05-29 2006-12-13 株式会社日立製作所 Method for editing command sequence with processing time and apparatus using the same
CA2337847A1 (en) * 1998-07-17 2000-01-27 Starkey International Facility management system
IL140472A0 (en) * 1998-08-04 2002-02-10 Contec Medical Ltd Surgical recording and reporting system
US6025362A (en) * 1998-08-31 2000-02-15 Fukunaga; Atsuo F. Uses of xanthine compounds
DE19845030A1 (en) * 1998-09-30 2000-04-20 Siemens Ag Imaging system for reproduction of medical image information
US6574629B1 (en) * 1998-12-23 2003-06-03 Agfa Corporation Picture archiving and communication system
US6416471B1 (en) * 1999-04-15 2002-07-09 Nexan Limited Portable remote patient telemonitoring system
US6454708B1 (en) * 1999-04-15 2002-09-24 Nexan Limited Portable remote patient telemonitoring system using a memory card or smart card
US6859288B1 (en) * 1999-04-28 2005-02-22 General Electric Company Method and apparatus for requesting and displaying worklist data from remotely located device
JP3394742B2 (en) * 1999-05-31 2003-04-07 オリンパス光学工業株式会社 Data filing system for endoscope
AU5880400A (en) * 1999-06-21 2001-01-09 Ellora Software, Inc. Method and apparatus for internet-based activity management
JP2001052073A (en) * 1999-08-17 2001-02-23 Toshitada Kameda Medical treatment planning and recording support system and machine readable medium with program recorded
US6398728B1 (en) * 1999-11-16 2002-06-04 Cardiac Intelligence Corporation Automated collection and analysis patient care system and method for diagnosing and monitoring respiratory insufficiency and outcomes thereof
EP1278456A2 (en) * 2000-05-05 2003-01-29 Hill-Rom Services, Inc. Patient point of care computer system
JP3791894B2 (en) * 2000-05-12 2006-06-28 オリンパス株式会社 Endoscopic image filing system
US6633772B2 (en) * 2000-08-18 2003-10-14 Cygnus, Inc. Formulation and manipulation of databases of analyte and associated values
JP3742549B2 (en) * 2000-08-29 2006-02-08 オリンパス株式会社 Medical image filing system
US6678764B2 (en) * 2000-10-20 2004-01-13 Sony Corporation Medical image processing system
US6684276B2 (en) * 2001-03-28 2004-01-27 Thomas M. Walker Patient encounter electronic medical record system, method, and computer product
JP2002306509A (en) * 2001-04-10 2002-10-22 Olympus Optical Co Ltd Remote operation supporting system
US7395214B2 (en) * 2001-05-11 2008-07-01 Craig P Shillingburg Apparatus, device and method for prescribing, administering and monitoring a treatment regimen for a patient
WO2002095653A2 (en) * 2001-05-18 2002-11-28 Mayo Foundation For Medical Education And Research Ultrasound laboratory information management system and method
US7119814B2 (en) * 2001-05-18 2006-10-10 Given Imaging Ltd. System and method for annotation on a moving image
US6735556B2 (en) * 2001-06-15 2004-05-11 International Business Machines Corporation Real-time model evaluation
US20030050801A1 (en) * 2001-08-20 2003-03-13 Ries Linda K. System and user interface for planning and monitoring patient related treatment activities
US20030060691A1 (en) * 2001-08-24 2003-03-27 Olympus Optical Co., Ltd. Examination follow-up information management apparatus for performing the simple and certain follow-up examinations
JP4709443B2 (en) * 2001-08-29 2011-06-22 オリンパス株式会社 Endoscopic image filing system
US7447644B2 (en) * 2001-09-12 2008-11-04 Siemens Medical Solutions Usa, Inc. System and user interface for processing healthcare related event information
US6985870B2 (en) * 2002-01-11 2006-01-10 Baxter International Inc. Medication delivery system
US20030149598A1 (en) * 2002-01-28 2003-08-07 Santoso Nugroho Iwan Intelligent assignment, scheduling and notification scheme for task management
US20030212576A1 (en) * 2002-05-08 2003-11-13 Back Kim Medical information system

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5740801A (en) * 1993-03-31 1998-04-21 Branson; Philip J. Managing information in an endoscopy system
US5715449A (en) * 1994-06-20 1998-02-03 Oceania, Inc. Method for generating structured medical text through user selection of displayed text and rules
US5920317A (en) * 1996-06-11 1999-07-06 Vmi Technologies Incorporated System and method for storing and displaying ultrasound images
US5823948A (en) * 1996-07-08 1998-10-20 Rlis, Inc. Medical records, documentation, tracking and order entry system
US6347329B1 (en) * 1996-09-27 2002-02-12 Macneal Memorial Hospital Assoc. Electronic medical records system
US5924074A (en) * 1996-09-27 1999-07-13 Azron Incorporated Electronic medical records system
US6597392B1 (en) * 1997-10-14 2003-07-22 Healthcare Vision, Inc. Apparatus and method for computerized multi-media data organization and transmission
US5911133A (en) * 1997-10-22 1999-06-08 Rush-Presbyterian -St. Luke's Medical Center User interface for echocardiographic report generation
US6032120A (en) * 1997-12-16 2000-02-29 Acuson Corporation Accessing stored ultrasound images and other digital medical images
US6047259A (en) * 1997-12-30 2000-04-04 Medical Management International, Inc. Interactive method and system for managing physical exams, diagnosis and treatment protocols in a health care practice
US20020072896A1 (en) * 1998-04-01 2002-06-13 Cyberpulse,L.L.C. Structured speech recognition
US20020111932A1 (en) * 1998-04-01 2002-08-15 Cyberpulse, L.L.C. Method and system for generation of medical reports from data in a hierarchically-organized database
US6801916B2 (en) * 1998-04-01 2004-10-05 Cyberpulse, L.L.C. Method and system for generation of medical reports from data in a hierarchically-organized database
US6353817B1 (en) * 1998-06-26 2002-03-05 Charles M Jacobs Multi-user system for creating and maintaining a medical-decision-making knowledge base
US20040168119A1 (en) * 2003-02-24 2004-08-26 David Liu Method and apparatus for creating a report

Cited By (111)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7921166B2 (en) 2002-02-01 2011-04-05 Xerox Corporation Methods and systems for accessing email
US20060236142A1 (en) * 2002-02-01 2006-10-19 Xerox Corporation Methods and systems for accessing email
US20050120002A1 (en) * 2003-10-02 2005-06-02 Hassan Behbehani Automated text generation process
US20060103729A1 (en) * 2003-12-12 2006-05-18 James Burns Computer-based image capture system
US8094901B1 (en) 2004-11-04 2012-01-10 Dr Systems, Inc. Systems and methods for matching, naming, and displaying medical images
US10614615B2 (en) 2004-11-04 2020-04-07 Merge Healthcare Solutions Inc. Systems and methods for viewing medical 3D imaging volumes
US8019138B2 (en) 2004-11-04 2011-09-13 Dr Systems, Inc. Systems and methods for viewing medical images
US7970625B2 (en) 2004-11-04 2011-06-28 Dr Systems, Inc. Systems and methods for retrieval of medical data
US9501863B1 (en) 2004-11-04 2016-11-22 D.R. Systems, Inc. Systems and methods for viewing medical 3D imaging volumes
US20060093198A1 (en) * 2004-11-04 2006-05-04 Fram Evan K Systems and methods for interleaving series of medical images
US8610746B2 (en) 2004-11-04 2013-12-17 Dr Systems, Inc. Systems and methods for viewing medical 3D imaging volumes
US8626527B1 (en) 2004-11-04 2014-01-07 Dr Systems, Inc. Systems and methods for retrieval of medical data
US8731259B2 (en) 2004-11-04 2014-05-20 Dr Systems, Inc. Systems and methods for matching, naming, and displaying medical images
US8879807B2 (en) 2004-11-04 2014-11-04 Dr Systems, Inc. Systems and methods for interleaving series of medical images
US8244014B2 (en) 2004-11-04 2012-08-14 Dr Systems, Inc. Systems and methods for viewing medical images
US8217966B2 (en) 2004-11-04 2012-07-10 Dr Systems, Inc. Systems and methods for viewing medical 3D imaging volumes
US20060095423A1 (en) * 2004-11-04 2006-05-04 Reicher Murray A Systems and methods for retrieval of medical data
US11177035B2 (en) 2004-11-04 2021-11-16 International Business Machines Corporation Systems and methods for matching, naming, and displaying medical images
US10790057B2 (en) 2004-11-04 2020-09-29 Merge Healthcare Solutions Inc. Systems and methods for retrieval of medical data
US9471210B1 (en) 2004-11-04 2016-10-18 D.R. Systems, Inc. Systems and methods for interleaving series of medical images
US20110016430A1 (en) * 2004-11-04 2011-01-20 Dr Systems, Inc. Systems and methods for interleaving series of medical images
US10437444B2 (en) 2004-11-04 2019-10-08 Merge Healthcare Soltuions Inc. Systems and methods for viewing medical images
US10096111B2 (en) 2004-11-04 2018-10-09 D.R. Systems, Inc. Systems and methods for interleaving series of medical images
US9734576B2 (en) 2004-11-04 2017-08-15 D.R. Systems, Inc. Systems and methods for interleaving series of medical images
US9727938B1 (en) 2004-11-04 2017-08-08 D.R. Systems, Inc. Systems and methods for retrieval of medical data
US20100201714A1 (en) * 2004-11-04 2010-08-12 Dr Systems, Inc. Systems and methods for viewing medical images
US7885440B2 (en) 2004-11-04 2011-02-08 Dr Systems, Inc. Systems and methods for interleaving series of medical images
US8913808B2 (en) 2004-11-04 2014-12-16 Dr Systems, Inc. Systems and methods for viewing medical images
US10540763B2 (en) 2004-11-04 2020-01-21 Merge Healthcare Solutions Inc. Systems and methods for matching, naming, and displaying medical images
US9542082B1 (en) 2004-11-04 2017-01-10 D.R. Systems, Inc. Systems and methods for matching, naming, and displaying medical images
US7920152B2 (en) 2004-11-04 2011-04-05 Dr Systems, Inc. Systems and methods for viewing medical 3D imaging volumes
US20060093199A1 (en) * 2004-11-04 2006-05-04 Fram Evan K Systems and methods for viewing medical 3D imaging volumes
US8577149B2 (en) * 2005-06-15 2013-11-05 Nintendo Co., Ltd. Storage medium storing program and information processing apparatus
US20060288004A1 (en) * 2005-06-15 2006-12-21 Nintendo Co., Ltd. Storage medium storing program and information processing apparatus
US20070143143A1 (en) * 2005-12-16 2007-06-21 Siemens Medical Solutions Health Services Corporation Patient Discharge Data Processing System
US20080086328A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Patient activity coordinator
US8050946B2 (en) 2006-10-06 2011-11-01 Cerner Innovation, Inc. Clinical activity navigator
US8589185B2 (en) 2006-10-06 2013-11-19 Cerner Innovation, Inc. Acknowledgement of previous results for medication administration
US8560335B2 (en) 2006-10-06 2013-10-15 Cerner Innovation, Inc. Viewing clinical activity details within a selected time period
US20080086336A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Patient outcomes in context of documentation
US20080086329A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Rescheduling clinical activities in context of activities view
US20080086331A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Acknowledgement of previous results for medication administration
US8355924B2 (en) * 2006-10-06 2013-01-15 Cerner Innovation, Inc. Patient activity coordinator
US8775208B2 (en) 2006-10-06 2014-07-08 Cerner Innovation, Inc. Patient outcomes in context of documentation
US8423384B2 (en) 2006-10-06 2013-04-16 Cerner Innovation, Inc. Providing multidisciplinary activities in context of clinician's role relevant activities
US20080086332A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Viewing clinical activity details within a selected time period
US20080086333A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Documentation of medication activities in context of mar
US20080086334A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Providing clinical activity details in context
US20080086684A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Clinical activity navigator
US20080086330A1 (en) * 2006-10-06 2008-04-10 Cerner Innovation, Inc. Providing multidisciplinary activities in context of clinician's role relevant activities
US9672477B1 (en) 2006-11-22 2017-06-06 D.R. Systems, Inc. Exam scheduling with customer configured notifications
US9754074B1 (en) 2006-11-22 2017-09-05 D.R. Systems, Inc. Smart placement rules
US7953614B1 (en) 2006-11-22 2011-05-31 Dr Systems, Inc. Smart placement rules
US10896745B2 (en) 2006-11-22 2021-01-19 Merge Healthcare Solutions Inc. Smart placement rules
US8554576B1 (en) 2006-11-22 2013-10-08 Dr Systems, Inc. Automated document filing
US10157686B1 (en) 2006-11-22 2018-12-18 D.R. Systems, Inc. Automated document filing
US8457990B1 (en) 2006-11-22 2013-06-04 Dr Systems, Inc. Smart placement rules
US8751268B1 (en) 2006-11-22 2014-06-10 Dr Systems, Inc. Smart placement rules
US7685002B2 (en) * 2007-05-08 2010-03-23 Medaptus, Inc. Method and system for processing medical billing records
US20080281629A1 (en) * 2007-05-08 2008-11-13 Medaptus, Inc. Method and System for Processing Medical Billing Records
US8788285B2 (en) * 2007-08-02 2014-07-22 Align Technology, Inc. Clinical data file
US20090037222A1 (en) * 2007-08-02 2009-02-05 Kuo Eric E Clinical data file
US20110054884A1 (en) * 2007-09-17 2011-03-03 Capfinder Aktiebolag System for assisting in drafting applications
US20090086269A1 (en) * 2007-09-28 2009-04-02 Kyocera Mita Corporation Image Forming Apparatus and Image Forming System
US10438694B2 (en) * 2007-11-19 2019-10-08 Medicalis Corporation Management of medical workflow
US20090132586A1 (en) * 2007-11-19 2009-05-21 Brian Napora Management of Medical Workflow
US20090144051A1 (en) * 2007-12-04 2009-06-04 Nhn Corporation Method of providing personal dictionary
US20090287487A1 (en) * 2008-05-14 2009-11-19 General Electric Company Systems and Methods for a Visual Indicator to Track Medical Report Dictation Progress
US8380533B2 (en) 2008-11-19 2013-02-19 DR Systems Inc. System and method of providing dynamic and customizable medical examination forms
US9501627B2 (en) 2008-11-19 2016-11-22 D.R. Systems, Inc. System and method of providing dynamic and customizable medical examination forms
US20100138239A1 (en) * 2008-11-19 2010-06-03 Dr Systems, Inc. System and method of providing dynamic and customizable medical examination forms
US10592688B2 (en) 2008-11-19 2020-03-17 Merge Healthcare Solutions Inc. System and method of providing dynamic and customizable medical examination forms
US20110029853A1 (en) * 2009-08-03 2011-02-03 Webtrends, Inc. Advanced visualizations in analytics reporting
US9892341B2 (en) 2009-09-28 2018-02-13 D.R. Systems, Inc. Rendering of medical images using user-defined rules
US9386084B1 (en) 2009-09-28 2016-07-05 D.R. Systems, Inc. Selective processing of medical images
US8712120B1 (en) 2009-09-28 2014-04-29 Dr Systems, Inc. Rules-based approach to transferring and/or viewing medical images
US9501617B1 (en) 2009-09-28 2016-11-22 D.R. Systems, Inc. Selective display of medical images
US9684762B2 (en) 2009-09-28 2017-06-20 D.R. Systems, Inc. Rules-based approach to rendering medical imaging data
US9934568B2 (en) 2009-09-28 2018-04-03 D.R. Systems, Inc. Computer-aided analysis and rendering of medical images using user-defined rules
US10607341B2 (en) 2009-09-28 2020-03-31 Merge Healthcare Solutions Inc. Rules-based processing and presentation of medical images based on image plane
US9042617B1 (en) 2009-09-28 2015-05-26 Dr Systems, Inc. Rules-based approach to rendering medical imaging data
US9190109B2 (en) * 2010-03-23 2015-11-17 Disney Enterprises, Inc. System and method for video poetry using text based related media
US20110239099A1 (en) * 2010-03-23 2011-09-29 Disney Enterprises, Inc. System and method for video poetry using text based related media
US20120173281A1 (en) * 2011-01-05 2012-07-05 Dilella James M Automated data entry and transcription system, especially for generation of medical reports by an attending physician
US9412369B2 (en) * 2011-06-17 2016-08-09 Microsoft Technology Licensing, Llc Automated adverse drug event alerts
US20120323576A1 (en) * 2011-06-17 2012-12-20 Microsoft Corporation Automated adverse drug event alerts
US9092727B1 (en) 2011-08-11 2015-07-28 D.R. Systems, Inc. Exam type mapping
US9092551B1 (en) 2011-08-11 2015-07-28 D.R. Systems, Inc. Dynamic montage reconstruction
US10579903B1 (en) 2011-08-11 2020-03-03 Merge Healthcare Solutions Inc. Dynamic montage reconstruction
US20130191154A1 (en) * 2012-01-22 2013-07-25 Dobkin William R. Medical data system generating automated surgical reports
US20140006294A1 (en) * 2012-06-28 2014-01-02 Korea Atomic Energy Research Institute Technology Trend Analysis Report Generating System and Recording Medium
US10083166B2 (en) * 2012-07-06 2018-09-25 Canon Kabushiki Kaisha Apparatus and method for generating inspection report(s)
US20140013219A1 (en) * 2012-07-06 2014-01-09 Canon Kabushiki Kaisha Apparatus and method for generating inspection report(s)
US10665342B2 (en) 2013-01-09 2020-05-26 Merge Healthcare Solutions Inc. Intelligent management of computerized advanced processing
US11094416B2 (en) 2013-01-09 2021-08-17 International Business Machines Corporation Intelligent management of computerized advanced processing
US10672512B2 (en) 2013-01-09 2020-06-02 Merge Healthcare Solutions Inc. Intelligent management of computerized advanced processing
US20140359509A1 (en) * 2013-05-31 2014-12-04 Alp Sinan Baran Templates
WO2014197812A1 (en) * 2013-06-07 2014-12-11 Kiglies Mauricio Electronic on-line motor vehicle management and auction system
US10275442B2 (en) 2013-07-17 2019-04-30 Yseop Sa Techniques for automatic generation of natural language text
US10120865B1 (en) 2013-07-17 2018-11-06 Yseop Sa Techniques for automatic generation of natural language text
US10037317B1 (en) * 2013-07-17 2018-07-31 Yseop Sa Techniques for automatic generation of natural language text
US9411804B1 (en) 2013-07-17 2016-08-09 Yseop Sa Techniques for automatic generation of natural language text
US9135318B2 (en) 2013-08-20 2015-09-15 Intelligent Medical Objects, Inc. System and method for implementing a 64 bit data searching and delivery portal
WO2015026485A3 (en) * 2013-08-20 2015-06-18 Intelligent Medical Objects, Inc. System and method for implementing a 64 bit data searching and delivery portal
WO2016126859A1 (en) * 2015-02-03 2016-08-11 Texas Tech University System Graphical user interface system for interactive, hierarchical, multi-panel comprehension of multi-format data
US11043292B2 (en) 2015-02-23 2021-06-22 Smart Reporting Gmbh Apparatus and method for producing a medical report
WO2016135100A1 (en) * 2015-02-23 2016-09-01 Qmedify Gmbh Apparatus and method for producing a medical report
US11354007B2 (en) * 2015-04-07 2022-06-07 Olympus America, Inc. Diagram based visual procedure note writing tool
US10909168B2 (en) 2015-04-30 2021-02-02 Merge Healthcare Solutions Inc. Database systems and interactive user interfaces for dynamic interaction with, and review of, digital medical image data
US10929508B2 (en) 2015-04-30 2021-02-23 Merge Healthcare Solutions Inc. Database systems and interactive user interfaces for dynamic interaction with, and indications of, digital medical image data
US20220139548A1 (en) * 2018-10-23 2022-05-05 Nec Corporation Nursing assistance device, nursing assistance method, and recording medium

Also Published As

Publication number Publication date
WO2004104754A2 (en) 2004-12-02
JP2007505419A (en) 2007-03-08
EP1625751A4 (en) 2009-11-04
CA2526135A1 (en) 2004-12-02
JP2007516011A (en) 2007-06-21
JP2007516498A (en) 2007-06-21
EP1625751A2 (en) 2006-02-15
EP1629350A2 (en) 2006-03-01
US20050073578A1 (en) 2005-04-07
EP1627357A4 (en) 2010-01-06
WO2004103151A3 (en) 2005-10-20
CA2526073A1 (en) 2004-12-02
WO2004104742A2 (en) 2004-12-02
EP1629350A4 (en) 2007-08-01
US7492388B2 (en) 2009-02-17
EP1627356A2 (en) 2006-02-22
US20050075544A1 (en) 2005-04-07
CA2526149A1 (en) 2004-12-02
JP2007503282A (en) 2007-02-22
EP1627357A2 (en) 2006-02-22
WO2004103151A2 (en) 2004-12-02
WO2004104754A3 (en) 2005-03-03
US20050075535A1 (en) 2005-04-07
EP1627356A4 (en) 2006-12-20
WO2004104921A3 (en) 2005-03-31
WO2004104921A2 (en) 2004-12-02
CA2526078A1 (en) 2004-12-02
WO2004104742A3 (en) 2006-08-10

Similar Documents

Publication Publication Date Title
US20050114283A1 (en) System and method for generating a report using a knowledge base
US8311848B2 (en) Electronic medical record creation and retrieval system
JP4402033B2 (en) Information processing system
US8521561B2 (en) Database system, program, image retrieving method, and report retrieving method
US8150711B2 (en) Generating and managing medical documentation sets
US11557384B2 (en) Collaborative synthesis-based clinical documentation
Yamamoto et al. Challenges of electronic medical record implementation in the emergency department
US20200211685A1 (en) Universal medical charting
Waegemann et al. Healthcare documentation: A report on information capture and report generation
Lee Measuring nurses' experiences with unintended adverse consequences in EMR use in acute care settings
WO2022081731A9 (en) Automatically pre-constructing a clinical consultation note during a patient intake/admission process
Colicchio et al. Physicians’ perceptions about narrative note sections format and content: a multi-specialty survey
Chireshe et al. Integrated chronic care models for people with comorbid of HIV and non-communicable diseases in Sub-Saharan Africa: A scoping review
Pure et al. A Review on Association between Electronic Health Record use and Quality of Patient Care
Wilcox et al. ActiveNotes: computer-assisted creation of patient progress notes
Murray Unified Documentation and Information Retrieval for Electronic Health Records
Cam et al. An investigation on the use of computerized patient care documentation: Preliminary results
KR20030095691A (en) Management Device Of A Medical Data By Mouse And Storage Media Thereof
Wilcox et al. activeNotes: Computer-Assisted Creation of Patient Progress Notes in a Hospital Environment
Clarke et al. Electronic Health Record Usability Evaluation Improves Training
Wade et al. Agile systems for clinical research
Elkin et al. Terminological Systems
Efthimiadis et al. An Investigation on the Use of Computerized Patient Care Documentation: Preliminary Results

Legal Events

Date Code Title Description
AS Assignment

Owner name: OLYMPUS CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OLYMPUS AMERICA INC.;REEL/FRAME:017687/0697

Effective date: 20060322

Owner name: OLYMPUS AMERICA INC., NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHAPIRO, MARC;PEARSON, PHILIP;REEL/FRAME:017722/0237;SIGNING DATES FROM 20040928 TO 20041006

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION