US20110153343A1 - Adaptable medical workflow system - Google Patents

Adaptable medical workflow system

Info

Publication number
US20110153343A1
US20110153343A1 (application US 12/644,919)
Authority
US
United States
Prior art keywords
medical
workflow
healthcare worker
menu
information data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/644,919
Inventor
Jean-Sebastien Tremblay
Sebastien Dumont
Ali Dufour
Jean-Francois Lagace
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CareFusion 303 Inc
Original Assignee
CareFusion 303 Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CareFusion 303 Inc filed Critical CareFusion 303 Inc
Priority to US12/644,919 priority Critical patent/US20110153343A1/en
Assigned to CAREFUSION 303, INC. reassignment CAREFUSION 303, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: DUFOUR, ALI, DUMONT, SEBASTIEN, LAGACE, JEAN-FRANCOIS, TREMBLAY, JEAN-SEBASTIEN
Priority to BR112012014414A priority patent/BR112012014414A2/en
Priority to CA2783780A priority patent/CA2783780C/en
Priority to PCT/US2010/060860 priority patent/WO2011087710A2/en
Priority to MX2016009467A priority patent/MX362712B/en
Priority to CN201080058507.2A priority patent/CN102667848B/en
Priority to EP10843499.4A priority patent/EP2517168A4/en
Priority to JP2012546063A priority patent/JP2013515324A/en
Priority to AU2010341599A priority patent/AU2010341599A1/en
Priority to RU2012123163/08A priority patent/RU2012123163A/en
Priority to MX2012007047A priority patent/MX340693B/en
Publication of US20110153343A1 publication Critical patent/US20110153343A1/en
Priority to ZA2012/04343A priority patent/ZA201204343B/en
Priority to AU2016234929A priority patent/AU2016234929A1/en
Priority to AU2019203469A priority patent/AU2019203469A1/en
Priority to US16/459,540 priority patent/US11170325B2/en
Priority to US17/519,488 priority patent/US11880787B2/en
Abandoned legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/20ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms

Definitions

  • the present disclosure relates to medical workflow systems.
  • the work of a healthcare worker generally involves performing several tasks related to patient treatment. Often, a healthcare worker performs tasks such as feeding a patient or recording the temperature of a patient using devices such as barcode scanners to enter identities of medications and patients in a medical computer system. During care and treatment of a patient, a healthcare worker may perform several tasks that may collectively be referred to as a “workflow” of the healthcare worker.
  • a healthcare worker may follow a workflow specified by regulation of a healthcare facility. The workflow may be provided by a medical computer system to the healthcare worker on a display screen.
  • Acceptance of new technologies by a healthcare worker is a problem faced by the healthcare industry.
  • Introduction of a new technology often means some changes to a healthcare worker's workflow and therefore may require effort on the part of the healthcare worker to adopt the new technology.
  • a certain healthcare system configuration may expect a healthcare worker to perform tasks A, B and C, in that order (e.g., scanning a label, administering a medication, documenting a patient's vital signs, etc.).
  • a change or an upgrade to the healthcare system configuration may require the healthcare worker to perform tasks in the order A, C, B or may add an additional task D (e.g., new workflow is A, B, D and C).
  • Such changes in the healthcare system configuration may be disruptive to the workflow that the healthcare worker is accustomed to following.
  • upgrades or changes to existing healthcare configurations may therefore lead to increased errors or reduced efficiency on the part of healthcare workers during the time period in which the healthcare workers become accustomed to the changes.
  • a method of adapting a medical workflow implemented at a processor coupled to a hospital network comprises receiving a message comprising medical information data, predicting a healthcare worker's workflow using, at least in part, the medical information data, and communicating to an interface, based on the predicting, a menu comprising one or more medical action options.
  • a machine-readable medium encoded with instructions for adapting a medical workflow comprises code to cause a processor to receive a message comprising medical information data, predict a healthcare worker's workflow using, at least in part, the medical information data and communicate to an interface, based on the predicting, a menu comprising one or more medical action options.
  • a system for adapting a medical workflow comprises a server; a scanner having an interface and a database.
  • the server comprises a medical information reception section configured to receive a message from the scanner comprising medical information data, a workflow prediction section configured to predict a healthcare worker's workflow using, at least in part, the medical information data and a menu communication section configured to communicate to the scanner, based on the predicted workflow, a menu comprising one or more medical action options for displaying on the interface.
  • FIG. 1 is a diagrammatic view of a system, in accordance with embodiments of the present disclosure.
  • FIG. 2 is a conceptual block diagram of a server, in accordance with embodiments of the present disclosure.
  • FIG. 3A is a flow chart depicting an exemplary process implemented on a server, in accordance with embodiments of the present disclosure.
  • FIG. 3B is a conceptual block diagram depicting entries of a medical database, in accordance with embodiments of the present disclosure.
  • FIG. 4A is a flow chart depicting an exemplary prior art process implemented on a server.
  • FIG. 4B is a flow chart depicting an exemplary prior art process implemented on a server.
  • FIG. 4C is a flow chart depicting an exemplary process implemented on a server, in accordance with embodiments of the present disclosure.
  • FIG. 5 illustrates a scanner with an exemplary menu screen, in accordance with embodiments of the present disclosure.
  • FIG. 6 illustrates a scanner with an exemplary menu screen, in accordance with embodiments of the present disclosure.
  • FIG. 7 illustrates a scanner with an exemplary menu screen, in accordance with embodiments of the present disclosure.
  • FIG. 8 illustrates a scanner with an exemplary menu screen, in accordance with embodiments of the present disclosure.
  • FIG. 9 illustrates a scanner with an exemplary menu screen, in accordance with embodiments of the present disclosure.
  • the disclosed embodiments address and solve problems related to the aforementioned medical workflow configurations.
  • the embodiments solve these problems, at least in part, by minimizing or eliminating the need for a healthcare worker to follow only a single specific sequence of actions to accomplish certain healthcare tasks.
  • the disclosed embodiments solve these problems, at least by predicting workflow of a healthcare worker based on an identity of a medical item scanned by the healthcare worker.
  • workflow prediction is used to create a menu presented to the healthcare worker at a display in response to the healthcare worker's prior actions and menu selections.
  • the prediction of a workflow is triggered by the act of a healthcare worker scanning a label such as a patient's identification (ID) tag or a medical package label.
  • a predictive process at a computer uses the identity of the scanned medical object to calculate possible next actions by a healthcare worker. Based on the predicted next actions, the computer then directs a display to present a menu to the healthcare worker to facilitate fulfilling the possible next actions.
  • the predictive process adapts based on prior scans performed by the healthcare worker during a session. Therefore, in certain aspects, the predictive process relieves the healthcare worker from having to remember a specific sequence of scanning various medical objects.
  • FIG. 1 illustrates a simplified diagram of system 110 in accordance with certain configurations of the present disclosure.
  • System 110 includes one or more medical devices 102 capable of communication with a computer server 100 (server) via hospital network 104 .
  • System 110 further includes an interface 108 communicatively coupled to server 100 .
  • Server 100 communicates to interface 108 for displaying to a healthcare worker.
  • Interface 108 provides a display to a user, as well as an input device for a user to input information into system 110 .
  • An example of interface 108 is a touch screen, although in other embodiments, the interface 108 includes a separate display and input device.
  • Interface 108 communicates user interactions (e.g., menu selections) to server 100 .
  • interface 108 is directly attached to server 100 .
  • interface 108 is remotely located, and communicates with server 100 over hospital network 104 .
  • interface 108 is integrated into medical device 102 .
  • system 110 further includes medical database 106 communicatively coupled to server 100 through management network 105 .
  • Server 100 is configured to predict workflow of a healthcare worker based on the healthcare worker's interaction with medical device 102 and/or interface 108 .
  • Server 100 communicates with database 106 to receive or store certain relevant information useful in the prediction of a workflow.
  • server 100 predicts workflow of a healthcare worker using a software application running on a processor of server 100 .
  • medical device 102 may be a computer, a mobile phone, a laptop computer, a thin client device, a personal digital assistant (PDA), a portable computing device, a barcode scanner, a radio frequency identification (RFID) receiver or another device with a processor.
  • scanning refers to a wired or wireless operation of transferring information from an entity to a processor. This includes, for example, barcode scanning by an infrared receiver, sensing of radio frequency identification (RFID) using an RFID antenna, manually reading and entering barcode or patient information to a computer, and so on.
  • networks 104 and 105 may be, for example, modem connections, a LAN connection including the Ethernet or a broadband WAN connection including DSL, Cable, T1, T3, Fiber Optics, Wi-Fi, or a mobile network connection including GSM, GPRS, 3G, WiMax or other network connections.
  • hospital network 104 is a dedicated point-to-point link between medical device 102 and server 100 (e.g., a Bluetooth or a wireless USB link).
  • FIG. 2 illustrates a simplified block diagram of configuration 200 of server 100 in accordance with certain configurations of the present disclosure.
  • Operating system (OS) 218 is in communication with a medical information reception section 202 , a workflow prediction section 204 , a menu communication section 206 , a verification section 208 , a state determination section 210 , a database access section 212 , a database update section 216 and a session management section 214 .
  • FIG. 3A is a flow chart illustrating an exemplary process 300 implemented on server 100 , in accordance with certain aspects of the present disclosure.
  • Server 100 is configured to receive messages from medical device 102 .
  • Process 300 is initiated by the server 100 receiving a message comprising medical information data in operation 302 .
  • a healthcare worker uses medical device 102 (e.g., a barcode reader) to scan a barcode label or perform a radio frequency identification (RFID) scan of a medical object.
  • the medical object may be, for example, a medication vial, a food item, a patient's wristband comprising the patient's identity information, and so on.
  • a healthcare worker directly enters an identity of the medical object into medical device 102 via, for example, a touch screen or a keyboard.
  • In response to the scan or other input by a healthcare worker, medical device 102 communicates a message comprising certain medical information data to server 100.
  • the message is in the form of an internet protocol (IP) packet or any other well-known machine-to-machine communication format.
  • the medical information data includes medical object identification information.
  • the medical information data may include a barcode value or an RFID value that uniquely represents a medical object.
  • Medical information reception section 202 processes the received message carrying the medical information data received by server 100 .
  • medical information reception section 202 parses the received message to extract medical information data.
  • medical information reception section 202 performs authentication of medical device 102 to ensure that medical device 102 is not a rogue medical device.
  • Session management section 214 associates a session, based on the medical information received, to the received message.
  • prediction of a workflow is performed in the context of a session for the workflow.
  • session management section 214 creates a new session every time a patient ID scan is received or every time a healthcare worker logs in.
  • the workflow prediction process, described in greater detail below, uses any or all of the messages received during a session to predict a healthcare worker's next action.
  • workflow prediction section 204 of server 100 predicts the healthcare worker's workflow using, at least in part, the received medical information data.
  • workflow prediction section 204 predicts a healthcare worker's workflow based on actions possible for the medical object whose identity is communicated in the medical information data. For example, if a scanned medical object is “milk,” then workflow prediction section 204 includes in a list of predicted actions all possible actions to take for milk, including “feed,” “store,” and “document feeding” actions. If the scanned medical object corresponds to a patient's wristband, then using the patient's identity, the workflow prediction section 204 predicts the healthcare worker's next possible action (e.g., administer medication to the patient or take the patient's temperature, etc.).
  • state determination section 210 determines, based on the identity of the medical object, a state (or states) associated with the scanned medical object. For example, if a scanned medical object is a patient's wristband, state determination section 210 determines if the patient is in a pre-operating state, a post-operating state, an under-observation state, etc. State determination section 210 also obtains information regarding the patient's vitals state (e.g., weight, drug allergies etc.) from database 106 .
  • state determination section 210 determines an expiration state (whether medication has expired or not), a recall state (whether there have been any recalls issued for the medication), and so on. State determination section 210 makes the determination regarding states of a scanned medical object by querying database 106 via database access section 212 . State determination section 210 determines a state of the scanned medical object by reading state entries and their values associated with the scanned medical object and stored in a memory coupled to server 100 (e.g., database 106 ). State determination section 210 communicates the retrieved state information to workflow prediction section 204 .
  • workflow prediction section 204 predicts next possible actions in the healthcare worker's workflow. In certain configurations, workflow prediction section 204 predicts a single next possible action. Alternatively, in certain other configurations, workflow prediction section 204 predicts multiple levels of next possible actions (e.g., next possible action and the one after that and so on). When needed, workflow prediction section 204 also queries medical database 106 through database access section 212 for establishing a patient's diagnostics needs in order to predict the healthcare worker's workflow. Workflow prediction section 204 also communicates with state determination section 210 to determine a state (or states) associated with the medical object identified in the medical information data received, as previously described.
  • the prediction operation 304 comprises a database lookup operation.
  • medical database 106 may include a list of possible operations for a certain medical object (e.g., a medicine) and workflow prediction section 204 may simply use the list associated with the medical object as the predicted next possible action by a healthcare worker.
  • the prediction operation comprises selecting one of several possible next actions by a healthcare worker, associated with a medical object in medical database 106 .
  • the prediction operation includes using information related to past actions by the healthcare worker (or another healthcare worker having a similar role such as a nurse) and/or the same patient and including these actions in the predicted workflow.
  • workflow prediction section 204 is able to train itself by storing in memory the information about previous workflows and selections by a healthcare worker.
  • the prediction operation uses prior actions performed by a healthcare worker during the ongoing session to select a next possible action based on the medical object included in the message received in operation 302 .
  • the prediction operation may associate probabilities with actions possible by a healthcare worker.
  • a probability value associated with a possible next action may be used to prioritize display of the actions in the menu. For example, higher probability actions may be displayed at the top of the menu list, or may be made visibly prominent (e.g., color or font size) to the healthcare worker.
  • menu communication section 206 of server 100 communicates, based on the predicted workflow received from workflow prediction section 204 , a menu comprising one or more medical action options for displaying to the healthcare worker on interface 108 .
  • Various formats are possible for the communication of menu from menu communication section 206 to interface 108 .
  • menu communication section 206 conveys data to be displayed on interface 108 by including display instructions or graphics commands.
  • Such configurations are suitable when interface 108 is embedded in medical device 102 and medical device 102 has a processor capable of receiving graphics commands and displaying a menu to the healthcare worker by decoding the graphics commands.
  • menu communication section 206 may specify display properties such as font, size, color and placement of the menu on interface 108 .
  • menu communication section 206 uses a well-known technique such as the hypertext markup language (HTML) to specify the menu.
  • interface 108 is directly connected to server 100 and menu communication section 206 communicates menu to interface 108 using one of several well known graphics peripheral standards such as video graphics array (VGA) and the like.
  • menu communication section 206 includes one or more action items in the predicted workflow, but disables or “grays out” display of the action items so that a healthcare worker may be able to see the action items on the menu, but may not be able to interact with the action item by selecting it from the menu.
  • a display menu item for ordering new medication items to re-stock inventory may be grayed out if menu communication section 206 has determined that a sufficient number of doses are available in the inventory.
  • menu communication section 206 uses one or more operational parameters such as the time of the day, and state values associated with the scanned medical object in making decisions regarding disabling an action item. The grayed out action is displayed to make a healthcare worker aware, for her future reference, that such an action is possible for the scanned medical object.
  • database update section 216 allows updates to database 106, based on a system administrator's input. For example, a system administrator may update the workflow database to improve the quality of healthcare provided to patients. A system administrator may update database 106 to adapt “best practices” across several healthcare facilities managed by the system administrator. In certain configurations, updating database 106 includes updating a list of predicted workflows associated with a received medical object ID. In certain configurations, updating database 106 includes adding new patient care rules, such as “do not administer medication X and medication Y together.”
  • medical database 106 includes a list 350 of a plurality of medical objects 352 .
  • medical database 106 includes a plurality of possible actions 354 possible for a healthcare worker.
  • Each of the plurality of possible actions 354 has zero or more states 356 .
  • the list 350 of the plurality of medical objects also includes a default entry 360 for an “unknown” medical object.
  • list 350 includes medical objects such as “patient,” “drug vial,” “food item,” “IV fluid container,” and so on.
  • possible actions 354 for the entry corresponding to the medical object 352 “drug vial” include “administer,” and “discard.”
  • possible actions corresponding to the entry 360 “unknown” include “re-scan,” “switch to manual entry,” “help,” and “contact system administrator.” If server 100 cannot recognize a scanned medical object 352 , server 100 requests medical device 102 to display the list of possible actions 354 corresponding to the “unknown” entry 360 .
  • example states 356 associated with the plurality of possible actions 354 include various previously discussed patient states (e.g., pre-operating state, etc.), vital sign states, and so on.
  • entries of database 106 are static and pre-determined based on rules of the healthcare facility where database 106 is used.
  • the server 100 updates, from time to time, entries of database 106 and values of state fields 356 . For example, the server 100 updates the entries depending on a healthcare worker's previous scans and/or selections.
  • FIGS. 4A and 4B generally relate to a prior art “static” workflow wherein server 100 is not predicting a healthcare worker's workflow.
  • in contrast, the server 100 predicts the workflow, based on an identity of a scanned medical object.
  • process 401 illustrates an example prior art process related to documenting feeding milk to a baby patient when a medical workflow system is not predicting the workflow.
  • a menu is displayed to a healthcare worker.
  • the menu includes a list of several possible tasks. In general, this list of tasks includes several tasks not pertinent to the current task of documenting a feeding.
  • the healthcare worker chooses from the displayed tasks. Since the healthcare worker intends to document a feeding, the healthcare worker interacts with a “document feeding” workflow. In the document feeding workflow, the healthcare worker chooses milk functions (operation 405) or chooses baby functions (operation 407).
  • a menu of possible milk-related functions is shown to the healthcare worker (operation 409 ) or a menu of baby functions is displayed to the healthcare worker (operation 411 ).
  • the healthcare worker indicates, in operation 413 , that she wants to document a feeding session.
  • the healthcare worker is prompted for a baby ID.
  • the healthcare worker scans a barcode containing a patient's identification (e.g., a patient bracelet).
  • a healthcare worker cannot simply scan a baby's ID bracelet to begin the “document feeding” workflow. Instead, the healthcare worker has to navigate through multiple menu screens, before she is able to scan a medical object (baby ID).
  • a server (not shown in FIG. 4A ) verifies the baby ID received at the server. If the scanned baby ID is not valid, an error message is displayed to the healthcare worker. If the scanned baby ID is valid, the server sends a request to a medical device 102 to display “choose feeding type” menu screen in operation 419 . In certain configurations, the healthcare worker is given at least three choices: formula feeding, breast feeding or container feeding.
  • a “document formula feeding event” menu screen is displayed to the healthcare worker at operation 421 . If the healthcare worker chooses the breast feeding option, then a “document breast feeding event” option is displayed to the healthcare worker at operation 423 . If the healthcare worker chooses the container option, then a “feeding events list” option is displayed, at operation 425 , to the healthcare worker.
  • once the feeding event is documented by the healthcare worker (e.g., a nurse), the server communicates with database 106 and registers the feeding event.
  • if the feeding events list is chosen at operation 425, then a menu of possible events is displayed to the healthcare worker.
  • the healthcare worker chooses an event from the displayed menu.
  • server 100 instructs, at operation 431 , an interface 108 (not shown in FIG. 4B ) to display a form for documenting container feeding.
  • the healthcare worker documents the feeding event (operation 435 ) and the server also causes a state of the container maintained by the server to change at operation 437 . If the expected delay before the feeding can be documented has not elapsed, then server 100 causes a delay message to be displayed (operation 439 ) to the healthcare worker.
  • FIG. 4C shows exemplary process 400 implemented using the scanner of the present disclosure to predict a healthcare worker's workflow based on identities of scanned medical items.
  • a healthcare worker scans a medical object.
  • the scanned object may be a baby patient's wristband or a milk container.
  • server 100 verifies the received ID at operation 404 . For example, if the scanned ID was for a baby patient, at operation 404 , server 100 verifies that the baby patient is a currently admitted patient.
  • server 100 sends a message to a medical device 102 to display a menu of baby-related actions to the healthcare worker.
  • one of the actions relates to documenting feeding of milk to the baby patient.
  • the healthcare worker may want to document a milk feeding given to the baby patient in the hospital's medical records and may choose the “document feeding” option at operation 408 .
  • process 400 avoids the need for a healthcare worker to navigate through a set of menus before she can communicate to server 100 that the current task (workflow) relates to document feeding of a baby patient.
  • server 100 facilitates the display of a “choose feeding type” menu to the healthcare worker by sending a display request to a medical device 102 .
  • server 100 communicates a request to medical device 102 to display a workflow menu predicted from “breast feeding a baby” actions.
  • server 100 communicates a request to medical device 102 to display a workflow menu predicted from the selection of the “formula feeding” action and patient ID for a baby.
  • the healthcare worker documents the feeding event at operation 416.
  • server 100 predicts possible actions that the healthcare worker may want to perform and, in operation 418 , communicates a menu to medical device 102 comprising predicted actions.
  • the healthcare worker may choose, at operation 420 , an action from the menu displayed.
  • server 100 communicates a request to medical device 102 to display, at operation 422 , a list of “document container feed” actions.
  • server 100 checks whether the expected delay (e.g., duration of feeding milk to a baby patient) has elapsed. If the delay has elapsed, at operation 430, server 100 facilitates documentation of the feeding event by the healthcare worker. Server 100 updates a medical database coupled to the hospital network, at operation 432, to reflect the feeding. Server 100 then changes a state associated with the container scanned in operation 402 to indicate, for example, that the container has been used. If server 100 decides at operation 426 that a delay has not elapsed, server 100 communicates a message, at operation 428, to a medical device 102 to display a delay message to the healthcare worker.
  • process 400 illustrated in FIG. 4C is initiated by a healthcare worker scanning a medical object (e.g., a milk container or a baby patient's wrist ID). Based on the identity of the scanned medical object, server 100 predicts the healthcare worker's workflow and presents menu action items to the healthcare worker. In some aspects, the exact order in which various scans are performed does not matter for a successful execution of a workflow. For example, in process 400 , a healthcare worker need not go through multiple menu screens to communicate to server 100 what the healthcare worker wants to accomplish. Instead, the healthcare worker need only scan a baby ID and a milk container, for server 100 to predict the healthcare worker's workflow. Furthermore, the exact order in which the baby ID is scanned and the container ID is scanned does not matter to server 100 supporting the workflow.
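  • As a rough illustration of this order independence, the sketch below accumulates the kinds of objects scanned in a session and selects the “document feeding” workflow once both a baby ID and a milk container have been seen, in either order; the matching rule and all names are hypothetical assumptions, not the claimed prediction logic.

        # Hypothetical sketch: predict a workflow from the set of objects scanned
        # in a session, regardless of the order in which they were scanned.
        WORKFLOW_TRIGGERS = {
            frozenset({"baby_id", "milk_container"}): "document feeding",
            frozenset({"baby_id"}): "baby functions menu",
            frozenset({"milk_container"}): "milk functions menu",
        }

        def predict_workflow(scanned_kinds):
            for required, workflow in WORKFLOW_TRIGGERS.items():
                if required <= scanned_kinds:        # all required objects have been scanned
                    return workflow
            return "top level menu"

        session_scans = set()
        for kind in ("milk_container", "baby_id"):   # order does not matter
            session_scans.add(kind)
            print(kind, "->", predict_workflow(session_scans))
        # milk_container -> milk functions menu
        # baby_id -> document feeding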
  • an adaptive workflow method that guides a healthcare worker to a menu of medical tasks by predicting the healthcare worker's workflow from received barcode scan information.
  • a medical device communicatively coupled to a server displays a “top level” menu upon starting operation and awaits a barcode scan from the healthcare worker.
  • the server interprets what has been scanned and displays to the healthcare worker a menu of actions such that it is possible to perform a meaningful medical task with the previously scanned object.
  • the server is configured to accept any barcode format.
  • a healthcare worker uses a barcode reader or other device to scan medical objects such as a patient ID, a baby ID or a milk container ID. Based on what was scanned, the system will facilitate the display of next possible tasks to the healthcare worker.
  • the healthcare worker is a nurse administering milk to a baby.
  • the server 100 is able to predict that the workflow relates to administering milk to the baby. Based on this prediction, server 100 facilitates presentation of an appropriate medical task menu to the nurse.
  • FIGS. 5-9 illustrate various display screens displayed on interface 108 to a healthcare worker during accomplishment of medical tasks by predicting workflows.
  • interface 108 is a part of a medical device 102 such as a barcode or an RFID scanner.
  • a healthcare worker scans a barcode using the medical device 102 and interacts with a menu displayed on the interface 108.
  • FIG. 5 illustrates an exemplary menu screen 500 , in accordance with certain aspects of the present disclosure.
  • medical device 102 displays menu screen 500 to a healthcare worker as an initial login to a predictive workflow scanning application in accordance with various application configurations described above with respect to FIGS. 1 to 4C .
  • Menu screen 500 comprises area 502 where the healthcare worker enters her user name and further enters a password in area 504.
  • session management section 214 begins a new session for workflow prediction.
  • FIG. 6 illustrates an exemplary menu screen, in accordance with certain aspects of the present disclosure.
  • Menu screen 600 illustrates a message displayed for a healthcare worker suggesting a next action to perform, based on predictive workflow calculations of the present disclosure.
  • the suggested actions include scanning a patient's wristband or a medical container label or tapping the interface 108 to receive more possible actions.
  • FIG. 7 illustrates an exemplary menu screen, in accordance with certain aspects of the present disclosure.
  • Menu screen 700 of interface 108 includes display area 702 displaying a healthcare worker's identity and a patient's identity.
  • Menu screen 700 further includes a list of baby functions, as indicated by a heading display area 704 , displaying actions that are possible for a baby patient.
  • Menu screen 700 further comprises a list of actions that the healthcare worker may need to perform, as calculated by predicting the healthcare worker's workflow.
  • the example illustrated in FIG. 7 shows a “baby functions” action in region 704 , a “check out task” in region 706 , an “administer milk” action in region 708 , “document feeding” in region 710 and “print labels” in region 712 .
  • Menu screen 700 is an example of a menu that may be presented to a healthcare worker at operation 406 described in FIG. 4C .
  • FIG. 8 illustrates an exemplary menu screen, in accordance with certain aspects of the present disclosure.
  • Menu screen 800 includes display area 802 displaying a healthcare worker's identity and a patient's identity.
  • Menu screen 800 further includes a list of adult functions, as indicated by a heading display area 804 , displaying actions possible for an adult patient.
  • Menu screen 800 further comprises a list of actions that the healthcare worker may need to perform, as calculated by predicting the healthcare worker's workflow.
  • the example illustrated in FIG. 8 shows an “adult functions” action in region 804 , a “check out task” in region 806 , a “print labels” action in region 808 and a “receive milk” action in region 810 .
  • FIG. 9 illustrates an exemplary menu screen, in accordance with certain aspects of the present disclosure.
  • Menu screen 900 includes display area 902 displaying a healthcare worker's identity and a patient's identity.
  • Menu screen 900 further includes a list of milk functions, as indicated by a heading display area 904 , displaying actions possible for milk (e.g., when a milk container is scanned as a medical object).
  • Menu screen 900 further comprises a list of actions that the healthcare worker may need to perform, as calculated by predicting the healthcare worker's workflow.
  • the example illustrated in FIG. 9 shows a “receive milk” action in region 906, an “administer milk” action in region 908, a “fortify milk” action in region 910, and a “document feeding” action in region 912.
  • menu screens 500 , 600 , 700 , 800 and 900 described above with respect to FIGS. 5 to 9 are used at various operations in a workflow for feeding milk to a baby patient.
  • menu screen 700 may be presented at operation 406 .
  • menu screen 600 may be presented at operation 402 .
  • workflow prediction techniques described presently free up a healthcare worker from having to remember a specific sequence of scanning medical objects.
  • the methods and systems of the present disclosure provide for a server in a medical facility to manage menu screens displayed to a healthcare worker in ways that minimize disruption to a healthcare worker's workflow.
  • the healthcare worker “trains” a system to better predict her next actions, based on her actions during a previous medical workflow. Therefore, configurations of the present disclosure relieve a healthcare worker from having to memorize menu screens and inputs expected from her to accomplish certain healthcare tasks.
  • workflow prediction is based on IDs of medical objects scanned by a healthcare worker.

Abstract

Examples of systems and methods are provided for adapting a medical workflow implemented at a computer coupled to a hospital network, including receiving a message comprising medical information data, predicting a healthcare worker's workflow using, at least in part, the medical information data, and communicating, based on the predicted workflow, a menu comprising one or more medical action options for displaying to the healthcare worker.

Description

    FIELD
  • The present disclosure relates to medical workflow systems.
  • BACKGROUND
  • The work of a healthcare worker (e.g., a nurse) generally involves performing several tasks related to patient treatment. Often, a healthcare worker performs tasks such as feeding a patient or recording the temperature of a patient using devices such as barcode scanners to enter identities of medications and patients in a medical computer system. During care and treatment of a patient, a healthcare worker may perform several tasks that may collectively be referred to as a “workflow” of the healthcare worker. A healthcare worker may follow a workflow specified by regulation of a healthcare facility. The workflow may be provided by a medical computer system to the healthcare worker on a display screen.
  • Acceptance of new technologies by a healthcare worker is a problem faced by the healthcare industry. Introduction of a new technology often means some changes to a healthcare worker's workflow and therefore may require effort on the part of the healthcare worker to adopt the new technology. For example, a certain healthcare system configuration may expect a healthcare worker to perform tasks A, B and C, in that order (e.g., scanning a label, administering a medication, documenting a patient's vital signs, etc.). A change or an upgrade to the healthcare system configuration may require the healthcare worker to perform tasks in the order A, C, B or may add an additional task D (e.g., new workflow is A, B, D and C). Such changes in the healthcare system configuration may be disruptive to the workflow that the healthcare worker is accustomed to following. In some instances, upgrades or changes to existing healthcare configurations may therefore lead to increased errors or reduced efficiency on the part of healthcare workers during the time period in which the healthcare workers become accustomed to the changes.
  • SUMMARY
  • Methods and systems that solve the above-discussed and other needs for improved medical workflow are disclosed.
  • In one aspect of the disclosure, a method of adapting a medical workflow implemented at a processor coupled to a hospital network is disclosed. The method comprises receiving a message comprising medical information data, predicting a healthcare worker's workflow using, at least in part, the medical information data, and communicating to an interface, based on the predicting, a menu comprising one or more medical action options.
  • In another aspect of the disclosure, a machine-readable medium encoded with instructions for adapting a medical workflow is disclosed. The instructions comprise code to cause a processor to receive a message comprising medical information data, predict a healthcare worker's workflow using, at least in part, the medical information data and communicate to an interface, based on the predicting, a menu comprising one or more medical action options.
  • In yet another aspect of the disclosure, a system for adapting a medical workflow is disclosed. The system comprises a server; a scanner having an interface and a database. The server comprises a medical information reception section configured to receive a message from the scanner comprising medical information data, a workflow prediction section configured to predict a healthcare worker's workflow using, at least in part, the medical information data and a menu communication section configured to communicate to the scanner, based on the predicted workflow, a menu comprising one or more medical action options for displaying on the interface.
  • The foregoing and other features, aspects and advantages of the embodiments of the present disclosure will become more apparent from the following detailed description and accompanying drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagrammatic view of a system, in accordance with embodiments of the present disclosure.
  • FIG. 2 is a conceptual block diagram of a server, in accordance with embodiments of the present disclosure.
  • FIG. 3A is a flow chart depicting an exemplary process implemented on a server, in accordance with embodiments of the present disclosure.
  • FIG. 3B is a conceptual block diagram depicting entries of a medical database, in accordance with embodiments of the present disclosure.
  • FIG. 4A is a flow chart depicting an exemplary prior art process implemented on a server.
  • FIG. 4B is a flow chart depicting an exemplary prior art process implemented on a server.
  • FIG. 4C is a flow chart depicting an exemplary process implemented on a server, in accordance with embodiments of the present disclosure.
  • FIG. 5 illustrates a scanner with an exemplary menu screen, in accordance with embodiments of the present disclosure.
  • FIG. 6 illustrates a scanner with an exemplary menu screen, in accordance with embodiments of the present disclosure.
  • FIG. 7 illustrates a scanner with an exemplary menu screen, in accordance with embodiments of the present disclosure.
  • FIG. 8 illustrates a scanner with an exemplary menu screen, in accordance with embodiments of the present disclosure.
  • FIG. 9 illustrates a scanner with an exemplary menu screen, in accordance with embodiments of the present disclosure.
  • DETAILED DESCRIPTION
  • The detailed description set forth below is intended as a description of various configurations of the subject technology and is not intended to represent the only configurations in which the subject technology may be practiced. The appended drawings are incorporated herein and constitute a part of the detailed description. The detailed description includes specific details for the purpose of providing a thorough understanding of the subject technology. However, it will be apparent to those skilled in the art that the subject technology may be practiced without these specific details. In some instances, well-known structures and components are shown in block diagram form in order to avoid obscuring the concepts of the subject technology. Like components are labeled with identical element numbers for ease of understanding.
  • The disclosed embodiments address and solve problems related to the aforementioned medical workflow configurations. The embodiments solve these problems, at least in part, by minimizing or eliminating the need for a healthcare worker to follow only a single specific sequence of actions to accomplish certain healthcare tasks. The disclosed embodiments solve these problems, at least by predicting workflow of a healthcare worker based on an identity of a medical item scanned by the healthcare worker. In certain embodiments, workflow prediction is used to create a menu presented to the healthcare worker at a display in response to the healthcare worker's prior actions and menu selections.
  • According to certain embodiments, the prediction of a workflow is triggered by the act of a healthcare worker scanning a label such as a patient's identification (ID) tag or a medical package label. Using the identity of the scanned medical object, a predictive process at a computer calculates possible next actions by a healthcare worker. Based on the predicted next actions, the computer then directs a display to present a menu to the healthcare worker to facilitate fulfilling the possible next actions. The predictive process adapts based on prior scans performed by the healthcare worker during a session. Therefore, in certain aspects, the predictive process relieves the healthcare worker from having to remember a specific sequence of scanning various medical objects.
  • FIG. 1 illustrates a simplified diagram of system 110 in accordance with certain configurations of the present disclosure. System 110 includes one or more medical devices 102 capable of communication with a computer server 100 (server) via hospital network 104. System 110 further includes an interface 108 communicatively coupled to server 100. Server 100 communicates to interface 108 for displaying to a healthcare worker. Interface 108 provides a display to a user, as well as an input device for a user to input information into system 110. An example of interface 108 is a touch screen, although in other embodiments, the interface 108 includes a separate display and input device. Interface 108 communicates user interactions (e.g., menu selections) to server 100. In certain embodiments, interface 108 is directly attached to server 100. In certain other embodiments, interface 108 is remotely located, and communicates with server 100 over hospital network 104. In yet other embodiments, interface 108 is integrated into medical device 102.
  • Still referring to FIG. 1, system 110 further includes medical database 106 communicatively coupled to server 100 through management network 105. Server 100 is configured to predict workflow of a healthcare worker based on the healthcare worker's interaction with medical device 102 and/or interface 108. Server 100 communicates with database 106 to receive or store certain relevant information useful in the prediction of a workflow. By way of illustration and not limitation, in certain configurations server 100 predicts workflow of a healthcare worker using a software application running on a processor of server 100.
  • Still referring to FIG. 1, by way of illustration and not limitation, medical device 102 may be a computer, a mobile phone, a laptop computer, a thin client device, a personal digital assistant (PDA), a portable computing device, a barcode scanner, a radio frequency identification (RFID) receiver or another device with a processor. As used herein, the terms “scanning,” and “scan” refer to a wired or wireless operation of transferring information from an entity to a processor. This includes, for example, barcode scanning by an infrared receiver, sensing of radio frequency identification (RFID) using an RFID antenna, manually reading and entering barcode or patient information to a computer, and so on.
  • Still referring to FIG. 1, by way of illustration and not limitation, networks 104 and 105 may be, for example, modem connections, a LAN connection including the Ethernet or a broadband WAN connection including DSL, Cable, T1, T3, Fiber Optics, Wi-Fi, or a mobile network connection including GSM, GPRS, 3G, WiMax or other network connections. In certain configurations, hospital network 104 is a dedicated point-to-point link between medical device 102 and server 100 (e.g., a Bluetooth or a wireless USB link).
  • FIG. 2 illustrates a simplified block diagram of configuration 200 of server 100 in accordance with certain configurations of the present disclosure. Operating system (OS) 218 is in communication with a medical information reception section 202, a workflow prediction section 204, a menu communication section 206, a verification section 208, a state determination section 210, a database access section 212, a database update section 216 and a session management section 214. Features and functions of these sections according to certain aspects of the present disclosure may be readily implemented in software, in hardware and/or a combination thereof, and are further described in the disclosure.
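  • Features and functions of these sections may be realized in software. Purely as an illustrative sketch (class names, data structures and the in-memory stand-in for database 106 are assumptions, not the patent's implementation), the Python fragment below wires three of the sections named above, namely the medical information reception section, the workflow prediction section and the menu communication section, into a minimal request/response loop; the remaining sections are omitted for brevity.

        # Illustrative sketch only: names, message format and the in-memory
        # "database" are assumptions, not the actual implementation.
        ACTIONS_BY_OBJECT = {            # stand-in for medical database 106
            "milk":    ["feed", "store", "document feeding"],
            "patient": ["administer medication", "take temperature"],
        }

        class MedicalInformationReception:
            """Parses a message from a medical device into medical information data."""
            def receive(self, message: dict) -> str:
                return message["object_id"]          # e.g., a decoded barcode value

        class WorkflowPrediction:
            """Predicts next possible actions from the scanned object's identity."""
            def predict(self, object_id: str) -> list:
                return ACTIONS_BY_OBJECT.get(object_id, ["re-scan", "switch to manual entry"])

        class MenuCommunication:
            """Builds the menu of medical action options sent back to the interface."""
            def build_menu(self, actions: list) -> dict:
                return {"menu": actions}

        class Server:
            def __init__(self):
                self.reception = MedicalInformationReception()
                self.prediction = WorkflowPrediction()
                self.menu = MenuCommunication()

            def handle_scan(self, message: dict) -> dict:
                object_id = self.reception.receive(message)
                actions = self.prediction.predict(object_id)
                return self.menu.build_menu(actions)

        if __name__ == "__main__":
            server = Server()
            print(server.handle_scan({"object_id": "milk"}))
            # {'menu': ['feed', 'store', 'document feeding']}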
  • FIG. 3A is a flow chart illustrating an exemplary process 300 implemented on server 100, in accordance with certain aspects of the present disclosure. Server 100 is configured to receive messages from medical device 102. Process 300 is initiated by the server 100 receiving a message comprising medical information data in operation 302. In certain configurations, a healthcare worker uses medical device 102 (e.g., a barcode reader) to scan a barcode label or perform a radio frequency identification (RFID) scan of a medical object. The medical object may be, for example, a medication vial, a food item, a patient's wristband comprising the patient's identity information, and so on. In certain configurations, a healthcare worker directly enters an identity of the medical object into medical device 102 via, for example, a touch screen or a keyboard.
  • In response to the scan or other input by a healthcare worker, medical device 102 communicates a message comprising certain medical information data to server 100. In certain embodiments, the message is in the form of an internet protocol (IP) packet or any other well-known machine-to-machine communication format. The medical information data includes medical object identification information. For example, the medical information data may include a barcode value or an RFID value that uniquely represents a medical object.
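  • The description leaves the exact machine-to-machine format open (an IP packet or any other well-known format). Purely as an assumed illustration, a message carrying the medical information data could be encoded as a small JSON payload such as the following; every field name here is hypothetical.

        import json

        # Hypothetical message layout; the field names are assumptions, not a
        # format defined by the present disclosure.
        scan_message = {
            "device_id": "scanner-12",          # identifies medical device 102
            "medical_information": {
                "kind": "barcode",              # could also be "rfid" or "manual"
                "value": "0312345678906",       # uniquely identifies the scanned object
            },
        }

        payload = json.dumps(scan_message).encode("utf-8")  # bytes sent over hospital network 104
        decoded = json.loads(payload)                        # server-side parse by reception section 202
        print(decoded["medical_information"]["value"])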
  • Medical information reception section 202 (FIG. 2) processes the received message carrying the medical information data received by server 100. In certain configurations, medical information reception section 202 parses the received message to extract medical information data. In certain configurations, medical information reception section 202 performs authentication of medical device 102 to ensure that medical device 102 is not a rogue medical device.
  • Session management section 214 associates a session, based on the medical information received, to the received message. In certain configurations, prediction of a workflow is performed in the context of a session for the workflow. For example, in certain configurations, session management section 214 creates a new session every time a patient ID scan is received or every time a healthcare worker logs in. In certain configurations, the workflow prediction process, described in greater detail below, uses any or all of the messages received during a session to predict a healthcare worker's next action.
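  • A minimal sketch of such session handling, assuming (as a hypothetical reading of the above) that a patient ID scan or a healthcare worker login opens a new session and that every subsequent scan is attached to the current session:

        from dataclasses import dataclass, field

        @dataclass
        class Session:
            session_id: int
            scans: list = field(default_factory=list)   # object IDs received so far

        class SessionManagement:
            """Assumed behavior: a patient ID scan or a login opens a new session;
            other scans are associated with the current session."""
            def __init__(self):
                self._next_id = 0
                self.current = None                      # no session until first scan or login

            def associate(self, object_id, is_patient_id=False):
                if is_patient_id or self.current is None:
                    self._next_id += 1
                    self.current = Session(self._next_id)
                self.current.scans.append(object_id)
                return self.current

        sessions = SessionManagement()
        sessions.associate("baby-042", is_patient_id=True)
        session = sessions.associate("milk-container-7")
        print(session.session_id, session.scans)         # 1 ['baby-042', 'milk-container-7']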
  • Still referring to FIG. 3A, at operation 304, workflow prediction section 204 of server 100 predicts the healthcare worker's workflow using, at least in part, the received medical information data. In certain configurations, workflow prediction section 204 predicts a healthcare worker's workflow based on actions possible for the medical object whose identity is communicated in the medical information data. For example, if a scanned medical object is “milk,” then workflow prediction section 204 includes in a list of predicted actions all possible actions to take for milk, including “feed,” “store,” and “document feeding” actions. If the scanned medical object corresponds to a patient's wristband, then using the patient's identity, the workflow prediction section 204 predicts the healthcare worker's next possible action (e.g., administer medication to the patient or take the patient's temperature, etc.).
  • Still referring to FIG. 3A, state determination section 210 determines, based on the identity of the medical object, a state (or states) associated with the scanned medical object. For example, if a scanned medical object is a patient's wristband, state determination section 210 determines if the patient is in a pre-operating state, a post-operating state, an under-observation state, etc. State determination section 210 also obtains information regarding the patient's vitals state (e.g., weight, drug allergies etc.) from database 106. For example, if the scanned medical object is a medication vial, state determination section 210 determines an expiration state (whether medication has expired or not), a recall state (whether there have been any recalls issued for the medication), and so on. State determination section 210 makes the determination regarding states of a scanned medical object by querying database 106 via database access section 212. State determination section 210 determines a state of the scanned medical object by reading state entries and their values associated with the scanned medical object and stored in a memory coupled to server 100 (e.g., database 106). State determination section 210 communicates the retrieved state information to workflow prediction section 204.
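  • As a rough illustration of the state lookup, the sketch below assumes a dictionary-backed stand-in for database 106 in which each object ID maps to its state entries (a patient's care state and vitals, or a medication's expiration and recall flags); the structure and field names are assumptions.

        from datetime import date

        # Assumed stand-in for the state entries stored in database 106.
        STATE_DB = {
            "patient-17": {"care_state": "post-operating", "weight_kg": 3.4,
                           "allergies": ["penicillin"]},
            "vial-0042":  {"expires": date(2011, 6, 30), "recalled": False},
        }

        def determine_states(object_id, today):
            """Read state entries for a scanned object and derive simple flags."""
            entry = dict(STATE_DB.get(object_id, {}))
            if "expires" in entry:
                entry["expired"] = entry["expires"] < today   # expiration state
            return entry

        print(determine_states("vial-0042", date(2009, 12, 22)))
        # {'expires': datetime.date(2011, 6, 30), 'recalled': False, 'expired': False}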
  • Still referring to FIG. 3A, based on state information obtained from state determination section 210 and session information from session management section 214, workflow prediction section 204 predicts next possible actions in the healthcare worker's workflow. In certain configurations, workflow prediction section 204 predicts a single next possible action. Alternatively, in certain other configurations, workflow prediction section 204 predicts multiple levels of next possible actions (e.g., next possible action and the one after that and so on). When needed, workflow prediction section 204 also queries medical database 106 through database access section 212 for establishing a patient's diagnostics needs in order to predict the healthcare worker's workflow. Workflow prediction section 204 also communicates with state determination section 210 to determine a state (or states) associated with the medical object identified in the medical information data received, as previously described.
  • In certain configurations, the prediction operation 304 comprises a database lookup operation. For example, medical database 106 may include a list of possible operations for a certain medical object (e.g., a medicine) and workflow prediction section 204 may simply use the list associated with the medical object as the predicted next possible action by a healthcare worker. In certain configurations, the prediction operation comprises selecting one of several possible next actions by a healthcare worker, associated with a medical object in medical database 106. In certain configurations, the prediction operation includes using information related to past actions by the healthcare worker (or another healthcare worker having a similar role such as a nurse) and/or the same patient and including these actions in the predicted workflow. Therefore, in certain aspects, workflow prediction section 204 is able to train itself by storing in memory the information about previous workflows and selections by a healthcare worker. In certain configurations, the prediction operation uses prior actions performed by a healthcare worker during the ongoing session to select a next possible action based on the medical object included in the message received in operation 302.
  • In certain configurations, the prediction operation may associate probabilities with the possible actions by a healthcare worker. A probability value associated with a possible next action may be used to prioritize display of the actions in the menu. For example, higher probability actions may be displayed at the top of the menu list, or may be made visibly prominent (e.g., via color or font size) to the healthcare worker.
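  • By way of illustration only, and not as part of the original disclosure, mapping probabilities to menu ordering and visual prominence might be sketched as follows; the probability values, the highlight threshold, and the returned field names are assumptions for this example.

```python
def build_menu(action_probabilities: dict[str, float],
               highlight_threshold: float = 0.5) -> list[dict]:
    """Order menu entries by probability and flag high-probability entries.

    Entries above the threshold are marked so the interface can render them
    prominently (e.g., larger font or a highlight color).
    """
    ordered = sorted(action_probabilities.items(), key=lambda kv: kv[1], reverse=True)
    return [
        {"label": label, "probability": p, "prominent": p >= highlight_threshold}
        for label, p in ordered
    ]

# Example: build_menu({"document feeding": 0.7, "print labels": 0.2})
# -> the "document feeding" entry is listed first and flagged as prominent.
```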
  • Still referring to FIG. 3A, at operation 306, menu communication section 206 of server 100 communicates, based on the predicted workflow received from workflow prediction section 204, a menu comprising one or more medical action options for display to the healthcare worker on interface 108. Various formats are possible for communicating the menu from menu communication section 206 to interface 108. For example, in certain configurations, menu communication section 206 conveys the data to be displayed on interface 108 by including display instructions or graphics commands. Such configurations are suitable when interface 108 is embedded in medical device 102 and medical device 102 has a processor capable of receiving graphics commands and displaying a menu to the healthcare worker by decoding the graphics commands. For example, menu communication section 206 may specify display properties such as font, size, color, and placement of the menu on interface 108. As an example, in certain configurations, menu communication section 206 uses a well-known technique such as the hypertext markup language (HTML) to specify the menu. Alternatively, in certain configurations, interface 108 is directly connected to server 100 and menu communication section 206 communicates the menu to interface 108 using one of several well-known graphics peripheral standards, such as video graphics array (VGA) and the like.
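  • By way of illustration only, and not as part of the original disclosure, an HTML-based menu communication might be sketched as follows; the markup structure and element names are assumptions for this example and not a prescribed format.

```python
from html import escape

def menu_to_html(title: str, actions: list[str]) -> str:
    """Render a predicted-action menu as a minimal HTML fragment.

    The server could send a fragment like this to an interface capable of
    rendering HTML; the markup below is illustrative only.
    """
    items = "\n".join(
        f'  <li><button name="action" value="{escape(a)}">{escape(a)}</button></li>'
        for a in actions
    )
    return f"<h2>{escape(title)}</h2>\n<ul>\n{items}\n</ul>"

# Example: print(menu_to_html("Baby functions", ["Administer milk", "Document feeding"]))
```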
  • Still referring to FIG. 3A, in certain configurations, menu communication section 206 includes one or more action items in the predicted workflow but disables or “grays out” display of those action items, so that a healthcare worker is able to see the action items on the menu but is not able to interact with them by selecting them from the menu. For example, a menu item for ordering new medication items to re-stock inventory may be grayed out if menu communication section 206 has determined that a sufficient number of doses are available in the inventory. In certain configurations, menu communication section 206 uses one or more operational parameters, such as the time of day and state values associated with the scanned medical object, in making decisions regarding disabling an action item. The grayed-out action is displayed to make a healthcare worker aware, for her future reference, that such an action is possible for the scanned medical object.
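  • By way of illustration only, and not as part of the original disclosure, the decision to gray out certain action items might be sketched as follows; the inventory threshold, the feeding-hours window, and the action names are assumptions for this example.

```python
from datetime import datetime

def decorate_menu(actions: list[str], inventory_doses: int,
                  now: datetime | None = None) -> list[dict]:
    """Attach an 'enabled' flag to each predicted action.

    Re-stocking is disabled (grayed out) while sufficient doses remain, and
    feeding documentation is disabled outside an assumed feeding window;
    both thresholds are illustrative only.
    """
    now = now or datetime.now()
    decorated = []
    for action in actions:
        enabled = True
        if action == "order re-stock" and inventory_doses >= 10:
            enabled = False  # enough stock: show the item but disable it
        if action == "document feeding" and not (6 <= now.hour < 22):
            enabled = False  # outside the assumed feeding hours
        decorated.append({"action": action, "enabled": enabled})
    return decorated
```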
  • In certain embodiments, workflow database update section 216 allows updates to database 106 based on a system administrator's input. For example, a system administrator may update the workflow database to improve the quality of healthcare provided to patients. A system administrator may also update database 106 to adopt “best practices” across several healthcare facilities managed by the system administrator. In certain configurations, updating database 106 includes updating a list of predicted workflows associated with a received medical object ID. In certain configurations, updating database 106 includes adding new patient care rules, such as “do not administer medication X and medication Y together.”
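  • By way of illustration only, and not as part of the original disclosure, adding and checking a “do not administer together” patient care rule might be sketched as follows; the rule representation and function names are assumptions for this example.

```python
def add_interaction_rule(rules: list[dict], drug_a: str, drug_b: str) -> None:
    """Append a 'do not administer together' rule to the workflow rule set."""
    rules.append({"type": "no_coadministration", "drugs": sorted((drug_a, drug_b))})

def violates_rules(rules: list[dict], administered: set[str], candidate: str) -> bool:
    """Check whether administering `candidate` after `administered` breaks a rule."""
    for rule in rules:
        if rule["type"] == "no_coadministration":
            pair = set(rule["drugs"])
            other = pair - {candidate}
            if candidate in pair and other & administered:
                return True
    return False

# Example:
# rules: list[dict] = []
# add_interaction_rule(rules, "medication X", "medication Y")
# violates_rules(rules, {"medication X"}, "medication Y")  # -> True
```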
  • Referring now to FIG. 3B, a conceptual block diagram of the entries of database 106 is depicted. In certain embodiments, medical database 106 includes a list 350 of a plurality of medical objects 352. For each medical object 352 in the list 350, medical database 106 includes a plurality of actions 354 possible for a healthcare worker. Each of the plurality of possible actions 354 has zero or more states 356. In certain configurations, the list 350 of the plurality of medical objects also includes a default entry 360 for an “unknown” medical object.
  • As an example, list 350 includes medical objects such as “patient,” “drug vial,” “food item,” “IV fluid container,” and so on. As an example, possible actions 354 for the entry corresponding to the medical object 352 “drug vial” include “administer” and “discard.” As an example, possible actions corresponding to the entry 360 “unknown” include “re-scan,” “switch to manual entry,” “help,” and “contact system administrator.” If server 100 cannot recognize a scanned medical object 352, server 100 requests medical device 102 to display the list of possible actions 354 corresponding to the “unknown” entry 360.
  • Still referring to FIG. 3B, example states 356 associated with the plurality of possible actions 354 include various previously discussed patient states (e.g., pre-operating state, etc.), vital sign states, and so on. In certain configurations, entries of database 106 are static and pre-determined based on rules of the healthcare facility where database 106 is used. In certain configurations, the server 100 updates, from time to time, entries of database 106 and values of state fields 356. For example, the server 100 updates the entries depending on a healthcare worker's previous scans and/or selections.
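  • By way of illustration only, and not as part of the original disclosure, the structure of list 350 (medical objects 352, possible actions 354, states 356, and the default “unknown” entry 360) might be sketched as follows; the concrete entries and field names are assumptions for this example.

```python
from dataclasses import dataclass, field

@dataclass
class MedicalAction:
    name: str
    states: dict[str, str] = field(default_factory=dict)  # zero or more states 356

@dataclass
class MedicalObjectEntry:
    object_type: str
    possible_actions: list[MedicalAction] = field(default_factory=list)  # actions 354

# Hypothetical contents mirroring list 350, with a default "unknown" entry 360.
DATABASE: dict[str, MedicalObjectEntry] = {
    "patient": MedicalObjectEntry("patient", [
        MedicalAction("document feeding"),
        MedicalAction("print labels"),
    ]),
    "drug vial": MedicalObjectEntry("drug vial", [
        MedicalAction("administer", {"expiration": "valid"}),
        MedicalAction("discard"),
    ]),
    "unknown": MedicalObjectEntry("unknown", [
        MedicalAction("re-scan"),
        MedicalAction("switch to manual entry"),
        MedicalAction("help"),
        MedicalAction("contact system administrator"),
    ]),
}

def lookup(object_type: str) -> MedicalObjectEntry:
    """Fall back to the 'unknown' entry when a scanned object is not recognized."""
    return DATABASE.get(object_type, DATABASE["unknown"])
```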
  • By way of illustration, and not limitation, the process of adapting a healthcare worker's workflow by predicting the healthcare worker's actions is further illustrated below via an example workflow wherein a healthcare worker feeds milk to a baby patient and documents the feeding in medical records. In particular, two different workflows to perform the “document feeding” task are described below. The workflow illustrated in FIGS. 4A and 4B generally relates to a prior art “static” workflow wherein server 100 is not predicting a healthcare worker's workflow. On the other hand, in the workflow illustrated in FIG. 4C, the server 100 predicts the workflow, based on an identity of a scanned medical object.
  • Referring to FIG. 4A, process 401 illustrates an example prior art process related to documenting feeding milk to a baby patient when a medical workflow system is not predicting the workflow. At operation 403, a menu is displayed to a healthcare worker. The menu includes a list of several possible tasks; in general, this list includes several tasks not pertinent to the current task of documenting a feeding. The healthcare worker chooses from the displayed tasks. Since the healthcare worker intends to document a feeding, the healthcare worker interacts with a “document feeding” workflow. In the document feeding workflow, the healthcare worker chooses milk functions (operation 405) or chooses baby functions (operation 407). Based on this selection by the healthcare worker, either a menu of possible milk-related functions is shown to the healthcare worker (operation 409) or a menu of baby functions is displayed to the healthcare worker (operation 411). In response to the displayed menu of milk or baby functions, the healthcare worker indicates, in operation 413, that she wants to document a feeding session. At operation 415, the healthcare worker is prompted for a baby ID. The healthcare worker scans a barcode containing a patient's identification (e.g., a patient bracelet).
  • As can be seen from the description above of the operations illustrated in FIG. 4A, a healthcare worker cannot simply scan a baby's ID bracelet to begin the “document feeding” workflow. Instead, the healthcare worker has to navigate through multiple menu screens, before she is able to scan a medical object (baby ID).
  • Still referring to FIG. 4A, at operation 417, a server (not shown in FIG. 4A) verifies the baby ID received at the server. If the scanned baby ID is not valid, an error message is displayed to the healthcare worker. If the scanned baby ID is valid, the server sends a request to a medical device 102 to display a “choose feeding type” menu screen at operation 419. In certain configurations, the healthcare worker is given at least three choices: formula feeding, breast feeding, or container feeding.
  • Referring now to FIG. 4B, if the healthcare worker chooses the “formula feeding” option, then a “document formula feeding event” menu screen is displayed to the healthcare worker at operation 421. If the healthcare worker chooses the breast feeding option, then a “document breast feeding event” option is displayed to the healthcare worker at operation 423. If the healthcare worker chooses the container option, then a “feeding events list” option is displayed, at operation 425, to the healthcare worker.
  • Still referring to FIG. 4B, if a “formula feeding” or “breast feeding” event is chosen at operation 421 or 423, the healthcare worker (e.g., a nurse) documents the feeding event in operation 429 by entering pertinent data into a medical device. At operation 433, the server communicates with database 106 and registers the feeding event. If “feeding events list” is chosen at operation 425, then a menu of possible events is displayed to the healthcare worker. At operation 427, the healthcare worker chooses an event from the displayed menu. In response to the selection, server 100 instructs, at operation 431, an interface 108 (not shown in FIG. 4B) to display a form for documenting container feeding. If the delay expected before the documentation can be entered (e.g., the time to finish a typical milk feeding) has elapsed, then the healthcare worker documents the feeding event (operation 435) and the server also causes a state of the container maintained by the server to change at operation 437. If the expected delay before the feeding can be documented has not elapsed, then server 100 causes a delay message to be displayed (operation 439) to the healthcare worker.
  • FIG. 4C shows exemplary process 400 implemented using the scanner of the present disclosure to predict a healthcare worker's workflow based on identities of scanned medical items. At operation 402, a healthcare worker scans a medical object. For example, the scanned object may be a baby patient's wristband or a milk container. Upon receiving a message comprising medical information data related to the scanned medical object, server 100 verifies the received ID at operation 404. For example, if the scanned ID was for a baby patient, at operation 404, server 100 verifies that the baby patient is a currently admitted patient. If the scanned ID corresponds to a valid baby ID, at operation 406, server 100 sends a message to a medical device 102 to display a menu of baby-related actions to the healthcare worker. For example, one of the actions relates to documenting feeding of milk to the baby patient. The healthcare worker may want to document a milk feeding given to the baby patient in the hospital's medical records and may choose the “document feeding” option at operation 408.
  • In comparison to process 401 illustrated in FIG. 4A and FIG. 4B, in process 400 a healthcare worker is able to communicate the current task (document feeding) to server 100 by first scanning a medical object ID. Therefore, process 400 avoids the need for a healthcare worker to navigate through a set of menus before she can communicate to server 100 that the current task (workflow) relates to documenting feeding of a baby patient.
  • Still referring to FIG. 4C, at operation 410, server 100 facilitates the display of a “choose feeding type” menu to the healthcare worker by sending a display request to a medical device 102. If the healthcare worker chooses the “breast feeding” menu (operation 412), server 100 communicates a request to medical device 102 to display a workflow menu predicted from “breast feeding a baby” actions. If the healthcare worker chooses the “formula feeding” menu (operation 414), server 100 communicates a request to medical device 102 to display a workflow menu predicted from the selection of the “formula feeding” action and the patient ID for a baby. Subsequent to either operation 412 or operation 414, the healthcare worker documents the feeding event at operation 416.
  • Still referring to FIG. 4C, if the healthcare worker chooses, at operation 410, the “container feeding” menu, then server 100 predicts possible actions that the healthcare worker may want to perform and, in operation 418, communicates a menu to medical device 102 comprising the predicted actions. The healthcare worker may choose, at operation 420, an action from the displayed menu. In response to the healthcare worker's selection at operation 420, server 100 communicates a request to medical device 102 to display, at operation 422, a list of “document container feed” actions. The healthcare worker is also shown a similar message at operation 422 if the medical object ID scanned in operation 402 was that of a container and server 100 was able to link, at operation 424, the scanned container ID with the identity of a baby in the hospital. At operation 426, server 100 checks whether the expected delay (e.g., the duration of feeding milk to a baby patient) has elapsed. If the delay has elapsed, at operation 430, server 100 facilitates documentation of the feeding event by the healthcare worker. Server 100 updates a medical database coupled to the hospital network, at operation 432, to reflect the feeding. Server 100 then changes a state associated with the container scanned in operation 402 to indicate, for example, that the container has been used. If server 100 decides at operation 426 that the delay has not elapsed, server 100 communicates a message, at operation 428, to a medical device 102 to display a delay message to the healthcare worker.
  • In comparison to process 401 illustrated in FIG. 4A, process 400 illustrated in FIG. 4C is initiated by a healthcare worker scanning a medical object (e.g., a milk container or a baby patient's wrist ID). Based on the identity of the scanned medical object, server 100 predicts the healthcare worker's workflow and presents menu action items to the healthcare worker. In some aspects, the exact order in which various scans are performed does not matter for successful execution of a workflow. For example, in process 400, a healthcare worker need not go through multiple menu screens to communicate to server 100 what the healthcare worker wants to accomplish. Instead, the healthcare worker need only scan a baby ID and a milk container for server 100 to predict the healthcare worker's workflow. Furthermore, the exact order in which the baby ID is scanned and the container ID is scanned does not matter to server 100 supporting the workflow.
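  • By way of illustration only, and not as part of the original disclosure, the order-independent handling of scanned IDs might be sketched as follows; the ID prefix scheme and the single “document feeding” rule are assumptions for this example.

```python
def identify(object_id: str) -> str:
    """Hypothetical ID scheme: a one-letter prefix encodes the object type."""
    return {"B": "baby", "M": "milk_container"}.get(object_id[:1], "unknown")

def resolve_workflow(scanned_ids: set[str]) -> str | None:
    """Infer the intended workflow from the set of objects scanned so far.

    Because a set is unordered, scanning the baby wristband before or after
    the milk container yields the same prediction.
    """
    kinds = {identify(object_id) for object_id in scanned_ids}
    if {"baby", "milk_container"} <= kinds:
        return "document feeding"
    return None

# Example: resolve_workflow({"M-0042", "B-0007"}) == resolve_workflow({"B-0007", "M-0042"})
# -> both return "document feeding"
```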
  • It will be appreciated that in certain aspects of the present disclosure, an adaptive workflow method is provided that guides a healthcare worker to a menu of medical tasks by predicting the healthcare worker's workflow from received barcode scan information. In certain configurations, a medical device communicatively coupled to a server displays a “top level” menu upon starting operation and awaits a barcode scan from the healthcare worker. Upon receiving data related to a barcode scan, the server interprets what has been scanned and displays to the healthcare worker a menu of actions such that a meaningful medical task can be performed with the previously scanned object. The server is configured to accept any barcode format.
  • In certain configurations, a healthcare worker uses a barcode reader or other device to scan medical objects such as a patient ID, a baby ID or a milk container ID. Based on what was scanned, the system will facilitate the display of next possible tasks to the healthcare worker. In certain configurations, e.g., as described above with respect to FIGS. 1 to 4C, the healthcare worker is a nurse administering milk to a baby. Regardless of the order in which the nurse scans barcodes (e.g., the baby's wristband ID first or the milk container ID first), the server 100 is able to predict that the workflow relates to administering milk to the baby. Based on this prediction, server 100 facilitates presentation of an appropriate medical task menu to the nurse.
  • FIGS. 5-9 illustrate various display screens displayed on interface 108 to a healthcare worker during accomplishment of medical tasks using predicted workflows. For example, in certain configurations, interface 108 is a part of a medical device 102 such as a barcode or an RFID scanner. In such configurations, a healthcare worker scans a barcode using the medical device 102 and interacts with a menu displayed on the interface 108.
  • FIG. 5 illustrates an exemplary menu screen 500, in accordance with certain aspects of the present disclosure. In certain configurations, medical device 102 displays menu screen 500 to a healthcare worker as an initial login to a predictive workflow scanning application in accordance with various application configurations described above with respect to FIGS. 1 to 4C. Menu screen 500 comprises area 502, where the healthcare worker enters her user name, and area 504, where she enters a password. In certain configurations, when a healthcare worker logs in from screen 500, session management section 214 begins a new session for workflow prediction.
  • FIG. 6 illustrates an exemplary menu screen, in accordance with certain aspects of the present disclosure. Menu screen 600 illustrates a message displayed for a healthcare worker suggesting a next action to perform, based on predictive workflow calculations of the present disclosure. In this case, the suggested actions include scanning a patient's wristband or a medical container label or tapping the interface 108 to receive more possible actions.
  • FIG. 7 illustrates an exemplary menu screen, in accordance with certain aspects of the present disclosure. Menu screen 700 of interface 108 includes display area 702 displaying a healthcare worker's identity and a patient's identity. Menu screen 700 further includes a list of baby functions, as indicated by a heading display area 704, displaying actions that are possible for a baby patient. Menu screen 700 further comprises a list of actions that the healthcare worker may need to perform, as calculated by predicting the healthcare worker's workflow. The example illustrated in FIG. 7 shows a “baby functions” action in region 704, a “check out task” in region 706, an “administer milk” action in region 708, “document feeding” in region 710 and “print labels” in region 712. Menu screen 700 is an example of a menu that may be presented to a healthcare worker at operation 406 described in FIG. 4C.
  • FIG. 8 illustrates an exemplary menu screen, in accordance with certain aspects of the present disclosure. Menu screen 800 includes display area 802 displaying a healthcare worker's identity and a patient's identity. Menu screen 800 further includes a list of adult functions, as indicated by a heading display area 804, displaying actions possible for an adult patient. Menu screen 800 further comprises a list of actions that the healthcare worker may need to perform, as calculated by predicting the healthcare worker's workflow. The example illustrated in FIG. 8 shows an “adult functions” action in region 804, a “check out task” in region 806, a “print labels” action in region 808 and a “receive milk” action in region 810.
  • FIG. 9 illustrates an exemplary menu screen, in accordance with certain aspects of the present disclosure. Menu screen 900 includes display area 902 displaying a healthcare worker's identity and a patient's identity. Menu screen 900 further includes a list of milk functions, as indicated by a heading display area 904, displaying actions possible for milk (e.g., when a milk container is scanned as a medical object). Menu screen 900 further comprises a list of actions that the healthcare worker may need to perform, as calculated by predicting the healthcare worker's workflow. The example illustrated in FIG. 9 shows a “receive milk” action in region 906, an “administer milk” action in region 908, a “fortify milk” action in region 910, and a “document feeding” action in region 912.
  • The menu screens 500, 600, 700, 800 and 900 described above with respect to FIGS. 5 to 9 are used at various operations in a workflow for feeding milk to a baby patient. For example, during process 400 illustrated in FIG. 4C, menu screen 700 may be presented at operation 406. Similarly, menu screen 600 may be presented at operation 402.
  • It will be appreciated that, in certain aspects, the presently described workflow prediction techniques free a healthcare worker from having to remember a specific sequence of scanning medical objects. The methods and systems of the present disclosure provide for a server in a medical facility to manage the menu screens displayed to a healthcare worker in ways that minimize disruption to the healthcare worker's workflow. In certain aspects, the healthcare worker “trains” the system to better predict her next actions, based on her actions during a previous medical workflow. Therefore, configurations of the present disclosure relieve a healthcare worker from having to memorize the menu screens and inputs expected from her to accomplish certain healthcare tasks. In certain aspects, workflow prediction is based on the IDs of medical objects scanned by a healthcare worker.
  • Although embodiments of the present disclosure have been described and illustrated in detail, it is to be clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the scope of the present invention being limited only by the terms of the appended claims.

Claims (20)

1. A method of adapting a medical workflow implemented at a processor coupled to a hospital network, comprising:
receiving a message comprising medical information data;
predicting a healthcare worker's workflow using, at least in part, the medical information data;
communicating to an interface, based on the predicting, a menu comprising one or more medical action options.
2. The method of claim 1, wherein the predicting further comprises predicting using the healthcare worker's identity and the authorization level.
3. The method of claim 1, wherein the medical information data comprises at least one of a patient identification, a medical item identification or a barcode information.
4. The method of claim 1, wherein the medical information data comprises a medical action selected by the healthcare worker.
5. The method of claim 1 further comprising:
determining a state associated with the medical information data.
6. The method of claim 5, wherein the predicting further comprises predicting using the state associated with the medical information data.
7. The method of claim 1, further comprising:
associating a session information with the first message.
8. The method of claim 7, wherein the predicting further comprises predicting based on the session information.
9. The method of claim 1, wherein the predicting further comprises:
accessing a workflow database comprising medical action option entries; and
including at least one medical action option in the healthcare worker's predicted workflow.
10. The method of claim 1, further comprising:
updating the workflow database based on a system administrator's input.
11. The method of claim 1, further comprising:
facilitating disabling a menu item from the menu.
12. A machine-readable medium encoded with instructions for adapting a medical workflow, the instructions comprising code to cause a processor to:
receive a message comprising medical information data;
predict a healthcare worker's workflow using, at least in part, the medical information data;
communicate to an interface, based on the predicting, a menu comprising one or more medical action options.
13. The machine-readable medium of claim 12, wherein the code for predicting further comprises code for predicting using the healthcare worker's identity and the authorization level.
14. The machine-readable medium of claim 12, wherein the medical information data comprises at least one of a patient identification, a medical item identification or a barcode information.
15. A system for adapting a medical workflow, comprising:
a server;
a scanner having an interface; and
a database; wherein
the server comprises:
a medical information reception section configured to receive a message from the scanner comprising medical information data;
a workflow prediction section configured to predict a healthcare worker's workflow using, at least in part, the medical information data;
a menu communication section configured to communicate to the scanner, based on the predicted workflow, a menu comprising one or more medical action options for displaying on the interface.
16. The system of claim 15, wherein the workflow prediction section is further configured to predict the healthcare worker's workflow using the healthcare worker's identity and the authorization level.
17. The system of claim 15 further comprising:
a state determination section configured to determine a state associated with the medical information data.
18. The system of claim 17, wherein the workflow prediction section is further configured to predict using the state associated with the medical information data.
19. The system of claim 15, further comprising:
a session management section configured to associate a session information with the first message.
20. The system of claim 19, wherein the workflow prediction section is further configured to predict using the session information.
US12/644,919 2009-12-22 2009-12-22 Adaptable medical workflow system Abandoned US20110153343A1 (en)

Priority Applications (16)

Application Number Priority Date Filing Date Title
US12/644,919 US20110153343A1 (en) 2009-12-22 2009-12-22 Adaptable medical workflow system
MX2012007047A MX340693B (en) 2009-12-22 2010-12-16 Adaptable medical workflow system.
JP2012546063A JP2013515324A (en) 2009-12-22 2010-12-16 Compatible medical workflow system
RU2012123163/08A RU2012123163A (en) 2009-12-22 2010-12-16 ADAPTIVE SYSTEM OF ORGANIZATION OF WORK PROCESS IN THE FIELD OF HEALTH
CA2783780A CA2783780C (en) 2009-12-22 2010-12-16 Adaptable medical workflow system
PCT/US2010/060860 WO2011087710A2 (en) 2009-12-22 2010-12-16 Adaptable medical workflow system
MX2016009467A MX362712B (en) 2009-12-22 2010-12-16 Adaptable medical workflow system.
CN201080058507.2A CN102667848B (en) 2009-12-22 2010-12-16 Adaptable clinical workflow system
EP10843499.4A EP2517168A4 (en) 2009-12-22 2010-12-16 Adaptable medical workflow system
BR112012014414A BR112012014414A2 (en) 2009-12-22 2010-12-16 "instruction-coded machine-readable method, system and medium for adapting a medical workflow"
AU2010341599A AU2010341599A1 (en) 2009-12-22 2010-12-16 Adaptable medical workflow system
ZA2012/04343A ZA201204343B (en) 2009-12-22 2012-06-13 Adaptable medical workflow system
AU2016234929A AU2016234929A1 (en) 2009-12-22 2016-09-28 Adaptable medical workflow system
AU2019203469A AU2019203469A1 (en) 2009-12-22 2019-05-17 Adaptable medical workflow system
US16/459,540 US11170325B2 (en) 2009-12-22 2019-07-01 Adaptable medical workflow system
US17/519,488 US11880787B2 (en) 2009-12-22 2021-11-04 Adaptable medical workflow menu

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/644,919 US20110153343A1 (en) 2009-12-22 2009-12-22 Adaptable medical workflow system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/459,540 Continuation US11170325B2 (en) 2009-12-22 2019-07-01 Adaptable medical workflow system

Publications (1)

Publication Number Publication Date
US20110153343A1 true US20110153343A1 (en) 2011-06-23

Family

ID=44152352

Family Applications (3)

Application Number Title Priority Date Filing Date
US12/644,919 Abandoned US20110153343A1 (en) 2009-12-22 2009-12-22 Adaptable medical workflow system
US16/459,540 Active 2030-06-08 US11170325B2 (en) 2009-12-22 2019-07-01 Adaptable medical workflow system
US17/519,488 Active US11880787B2 (en) 2009-12-22 2021-11-04 Adaptable medical workflow menu

Family Applications After (2)

Application Number Title Priority Date Filing Date
US16/459,540 Active 2030-06-08 US11170325B2 (en) 2009-12-22 2019-07-01 Adaptable medical workflow system
US17/519,488 Active US11880787B2 (en) 2009-12-22 2021-11-04 Adaptable medical workflow menu

Country Status (11)

Country Link
US (3) US20110153343A1 (en)
EP (1) EP2517168A4 (en)
JP (1) JP2013515324A (en)
CN (1) CN102667848B (en)
AU (3) AU2010341599A1 (en)
BR (1) BR112012014414A2 (en)
CA (1) CA2783780C (en)
MX (2) MX362712B (en)
RU (1) RU2012123163A (en)
WO (1) WO2011087710A2 (en)
ZA (1) ZA201204343B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011011454A1 (en) * 2009-07-21 2011-01-27 Zoll Medical Corporation Systems and methods for collection, organization and display of ems information
KR101699195B1 (en) * 2013-03-13 2017-01-23 메델라 홀딩 아게 System and method for managing a supply of breast milk
US20180025283A1 (en) * 2015-05-11 2018-01-25 Sony Corporation Information processing apparatus, information processing method, and program
EP3621002A1 (en) * 2018-09-06 2020-03-11 Koninklijke Philips N.V. Monitoring moveable entities in a predetermined area
WO2020235428A1 (en) * 2019-05-20 2020-11-26 株式会社シェルパ Care assistance system, care assistance information recording system, care assistance method, and care assistance information recording method
CN113296671A (en) * 2021-06-23 2021-08-24 青岛大学附属医院 User interface operation method and device and storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2354089B (en) * 1999-09-08 2003-09-17 Sony Uk Ltd Artificial intelligence user profiling
US20020046047A1 (en) * 2000-07-07 2002-04-18 Budd Jeffrey R. Patient care management system and method
US20030050801A1 (en) 2001-08-20 2003-03-13 Ries Linda K. System and user interface for planning and monitoring patient related treatment activities
EP1443444A3 (en) * 2001-09-12 2006-06-07 Siemens Medical Solutions Health Services Corporation A system and user interface for processing healthcare related event information
JP2006502117A (en) 2002-07-17 2006-01-19 タイタン ファーマシューティカルズ インコーポレイテッド Combination of chemotherapeutic drugs to increase antitumor activity
US20060106641A1 (en) * 2004-11-16 2006-05-18 Ernst Bartsch Portable task management system for healthcare and other uses
US20060291657A1 (en) * 2005-05-03 2006-12-28 Greg Benson Trusted monitoring system and method
US8069055B2 (en) * 2006-02-09 2011-11-29 General Electric Company Predictive scheduling for procedure medicine
US20080040160A1 (en) * 2006-05-15 2008-02-14 Siemens Medical Solutions Usa, Inc. Medical Treatment Compliance Monitoring System
US7792922B2 (en) * 2008-03-05 2010-09-07 Caterpillar Inc. Systems and methods for managing health of a client system

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4857713A (en) * 1986-02-14 1989-08-15 Brown Jack D Hospital error avoidance system
US6338066B1 (en) * 1998-09-25 2002-01-08 International Business Machines Corporation Surfaid predictor: web-based system for predicting surfer behavior
US20060112050A1 (en) * 2000-10-17 2006-05-25 Catalis, Inc. Systems and methods for adaptive medical decision support
US20020143320A1 (en) * 2001-03-30 2002-10-03 Levin Bruce H. Tracking medical products with integrated circuits
US20020178126A1 (en) * 2001-05-25 2002-11-28 Beck Timothy L. Remote medical device access
US20040169673A1 (en) * 2002-08-19 2004-09-02 Orthosoft Inc. Graphical user interface for computer-assisted surgery
US20050144043A1 (en) * 2003-10-07 2005-06-30 Holland Geoffrey N. Medication management system
US20050086072A1 (en) * 2003-10-15 2005-04-21 Fox Charles S.Jr. Task-based system and method for managing patient care through automated recognition
US20050251418A1 (en) * 2003-10-15 2005-11-10 Cerner Innovation, Inc. System and method for processing ad hoc orders in an automated patient care environment
US20070290028A1 (en) * 2003-10-15 2007-12-20 Cerner Innovation, Inc. Utilizing scanned supply information and a patient task list to document care
US20050182654A1 (en) * 2004-02-14 2005-08-18 Align Technology, Inc. Systems and methods for providing treatment planning
US20060288095A1 (en) * 2004-05-25 2006-12-21 David Torok Patient and device location dependent healthcare information processing system
US20060175401A1 (en) * 2005-02-07 2006-08-10 Cryovac, Inc. Method of labeling an item for item-level identification
US20060229551A1 (en) * 2005-02-11 2006-10-12 Martinez John D Identification system and method for medication management
US20090157428A1 (en) * 2005-10-18 2009-06-18 Neoteric Technology, Limited Apparatus and Method for Administration of Mother's Milk
US20080077433A1 (en) * 2006-09-21 2008-03-27 Kasprisin Duke O Tissue management system
US20080184250A1 (en) * 2007-01-30 2008-07-31 Microsoft Corporation Synchronizing Workflows
US20080270178A1 (en) * 2007-04-30 2008-10-30 Mckesson Specialty Distribution Llc Inventory Management System For A Medical Service Provider
US20080306740A1 (en) * 2007-06-07 2008-12-11 Mckesson Automation Inc. Remotely and interactively controlling semi-automatic devices
US20090070137A1 (en) * 2007-09-10 2009-03-12 Sultan Haider Method and system to optimize quality of patient care paths
US20090171695A1 (en) * 2007-12-31 2009-07-02 Intel Corporation System and method for interactive management of patient care
US20090178004A1 (en) * 2008-01-03 2009-07-09 General Electric Company Methods and systems for workflow management in clinical information systems
US20100198620A1 (en) * 2009-01-30 2010-08-05 Omnicell, Inc. Tissue tracking
US20110093279A1 (en) * 2009-10-16 2011-04-21 Levine Wilton C Drug Labeling

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10124184B2 (en) 2003-12-17 2018-11-13 Physio-Control, Inc. Defibrillator/monitor system having a pod with leads capable of wirelessly communicating
US10068061B2 (en) 2008-07-09 2018-09-04 Baxter International Inc. Home therapy entry, modification, and reporting system
US10224117B2 (en) 2008-07-09 2019-03-05 Baxter International Inc. Home therapy machine allowing patient device program selection
US10095840B2 (en) 2008-07-09 2018-10-09 Baxter International Inc. System and method for performing renal therapy at a home or dwelling of a patient
US10061899B2 (en) 2008-07-09 2018-08-28 Baxter International Inc. Home therapy machine
US11164672B2 (en) 2010-01-22 2021-11-02 Deka Products Limited Partnership System and apparatus for electronic patient care
US11424029B2 (en) 2010-01-22 2022-08-23 Deka Products Limited Partnership System, method and apparatus for electronic patient care
US11524107B2 (en) 2010-01-22 2022-12-13 Deka Products Limited Partnership System, method, and apparatus for electronic patient care
US11244745B2 (en) 2010-01-22 2022-02-08 Deka Products Limited Partnership Computer-implemented method, system, and apparatus for electronic patient care
US11776671B2 (en) 2010-01-22 2023-10-03 Deka Products Limited Partnership Electronic patient monitoring system
US10453157B2 (en) 2010-01-22 2019-10-22 Deka Products Limited Partnership System, method, and apparatus for electronic patient care
US11810653B2 (en) 2010-01-22 2023-11-07 Deka Products Limited Partnership Computer-implemented method, system, and apparatus for electronic patient care
US10242159B2 (en) 2010-01-22 2019-03-26 Deka Products Limited Partnership System and apparatus for electronic patient care
US10872685B2 (en) 2010-01-22 2020-12-22 Deka Products Limited Partnership Electronic patient monitoring system
US11210611B2 (en) 2011-12-21 2021-12-28 Deka Products Limited Partnership System, method, and apparatus for electronic patient care
US20130166619A1 (en) * 2011-12-23 2013-06-27 Timothy Thompson Accessing business intelligence workflows
US10105546B2 (en) 2012-05-08 2018-10-23 Physio-Control, Inc. Utility module
US10926099B2 (en) 2012-05-08 2021-02-23 Physio-Control, Inc. Utility module interface
US10159846B2 (en) 2012-05-08 2018-12-25 Physio-Control, Inc. Utility module interface
US10124181B2 (en) 2012-05-08 2018-11-13 Physio-Control., Inc. Defibrillator network system
US10118048B2 (en) 2012-05-08 2018-11-06 Physio-Control, Inc. Utility module system
US9872998B2 (en) 2012-05-08 2018-01-23 Physio-Control, Inc. Defibrillator communication system
US10089443B2 (en) 2012-05-15 2018-10-02 Baxter International Inc. Home medical device systems and methods for therapy prescription and tracking, servicing and inventory
US10911515B2 (en) 2012-05-24 2021-02-02 Deka Products Limited Partnership System, method, and apparatus for electronic patient care
US11881307B2 (en) 2012-05-24 2024-01-23 Deka Products Limited Partnership System, method, and apparatus for electronic patient care
US10303852B2 (en) * 2012-07-02 2019-05-28 Physio-Control, Inc. Decision support tool for use with a medical monitor-defibrillator
US20140272860A1 (en) * 2012-07-02 2014-09-18 Physio-Control, Inc. Decision support tool for use with a medical monitor-defibrillator
WO2015190987A1 (en) * 2014-06-11 2015-12-17 Ledningsbolaget I Skandinavien Ab A decision support system and method for resource planning in the healthcare sector
US11427404B2 (en) 2018-04-10 2022-08-30 Fetch Robotics, Inc. System and method for robot-assisted, cart-based workflows
WO2019200012A1 (en) * 2018-04-10 2019-10-17 Cairl Brian System and method for robot-assisted, cart-based workflows
US11210116B2 (en) * 2019-07-24 2021-12-28 Adp, Llc System, method and computer program product of navigating users through a complex computing system to perform a task
US11115476B1 (en) 2020-04-22 2021-09-07 Drb Systems, Llc System for and method of controlling operations of a car wash
US11289195B2 (en) * 2020-08-09 2022-03-29 Kevin Patel System for remote medical care
US20220044802A1 (en) * 2020-08-09 2022-02-10 Kevin Patel System for remote medical care

Also Published As

Publication number Publication date
ZA201204343B (en) 2014-01-29
JP2013515324A (en) 2013-05-02
WO2011087710A3 (en) 2011-11-17
US11170325B2 (en) 2021-11-09
WO2011087710A2 (en) 2011-07-21
MX340693B (en) 2016-07-21
AU2019203469A1 (en) 2019-06-06
US20190325543A1 (en) 2019-10-24
US20220058535A1 (en) 2022-02-24
EP2517168A4 (en) 2014-01-22
CN102667848B (en) 2017-03-29
BR112012014414A2 (en) 2017-04-04
AU2016234929A1 (en) 2016-10-20
AU2010341599A1 (en) 2012-07-05
EP2517168A2 (en) 2012-10-31
MX362712B (en) 2019-02-05
CN102667848A (en) 2012-09-12
CA2783780A1 (en) 2011-07-21
CA2783780C (en) 2021-09-14
RU2012123163A (en) 2014-01-27
MX2012007047A (en) 2012-10-15
US11880787B2 (en) 2024-01-23

Similar Documents

Publication Publication Date Title
US11880787B2 (en) Adaptable medical workflow menu
US7225408B2 (en) System and user interface for communicating and processing patient record information
US7578432B2 (en) Method for transmitting medical information identified by a unique identifier barcode to a hospital
US20140089011A1 (en) Medication Management System
US20070138253A1 (en) Method for transmitting medical information idetified by a unique identifier
US20160318311A1 (en) Networkable medical labeling apparatus and method
US7555720B2 (en) System and user interface for processing and navigating patient record information
US20080281637A1 (en) System and Method of Automatically Displaying Patient Information
US20090281825A1 (en) Automated patient flow management system
US20090070142A1 (en) Methods and systems for providing patient registration information
JP2009531146A (en) Drug administration and management system and method
US20170344948A1 (en) Coordinated mobile access to electronic medical records
EP1946263A2 (en) Use of a mobile communications device for the secure real time alerting of patient health information
US20180349974A1 (en) System and method for presenting product-specific content on a client device based on a scanned barcode
KR101067326B1 (en) Apparatus and method for providing service based on location in mibile communication system
JP5142639B2 (en) Medical service support system, medical service support server
JP6440569B2 (en) Server device, display control method, and display control program
US20090228302A1 (en) System and method of prescribing alternative medications
US20160027138A1 (en) Automated Patient Flow Management Systems
JP2018133679A (en) Image formation device and information notification method
US20120259658A1 (en) System and Method of Automatically Displaying Patient Information
KR102172364B1 (en) Prescription delivery system using two-channel in cloud computing environment and the control method thereof
JP6545440B2 (en) Method for supporting prescription reception, computer program for supporting prescription reception, and prescription reception support device
JP2022066670A (en) Medication history data transmission system, information management server, and terminal program
JP2004303087A (en) Information provision system and information provision program

Legal Events

Date Code Title Description
AS Assignment

Owner name: CAREFUSION 303, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:TREMBLAY, JEAN-SEBASTIEN;DUMONT, SEBASTIEN;DUFOUR, ALI;AND OTHERS;REEL/FRAME:023701/0106

Effective date: 20091218

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION