WO2011116340A2 - Context-management framework for telemedicine - Google Patents

Context-management framework for telemedicine Download PDF

Info

Publication number
WO2011116340A2
WO2011116340A2 (PCT/US2011/029077)
Authority
WO
WIPO (PCT)
Prior art keywords
data
sensor data
action
user
sensor
Prior art date
Application number
PCT/US2011/029077
Other languages
French (fr)
Other versions
WO2011116340A3 (en)
Inventor
Robert A. Lowe
Robert Norton
Ritu Sahni
Richard Harper
Rita H. Wouhaybi
Mark D. Yarvis
Phillip Muse
Chieh-Yih Wan
Sangita Sharma
Sai Prasad
Lenitra Durham
Original Assignee
Oregon Health & Science University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oregon Health & Science University
Publication of WO2011116340A2
Publication of WO2011116340A3


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation

Definitions

  • This application relates to the field of health care, and in particular, to devices and techniques for monitoring and recording patient information.
  • Patient monitoring systems are used extensively to monitor vital signs of patients in multiple medical settings, including out-of-hospital care by emergency medical technicians and paramedics, out-patient settings such as urgent care clinics and freestanding surgical centers, intermediate care facilities such as nursing homes, and hospital facilities such as emergency departments, intensive care units, and hospital "step-down units" providing a transition from the ICU to general medical and surgical units.
  • Such systems acquire data from medical sensors to measure patient vital signs such as cardiac activity, blood-oxygenation, and blood pressure.
  • False alarms may distract medical personnel from providing patient care, forcing the personnel to pay more attention to the patient monitoring system instead of the patient herself. Eventually, false alarms may lead to alarm fatigue, a situation in which subsequent alarms are sometimes ignored, at times in critical cases.
  • a remote physician or other caregiver is often dependent on a local caregiver to act as a proxy, interacting with the patient and relaying physiological information, patient history, and procedures.
  • although paramedics in emergency medical services (EMS) may treat many routine patients independently, for more complex cases they may call upon a remote physician or other base station clinician for advice.
  • physiological information is captured on electronic monitoring devices, while patient history, procedures performed, and medications administered are often captured on paper, or even on a caregiver's hand or glove.
  • FIG. 1 is a block diagram illustrating relationships between components of a medical alarm control system in accordance with various embodiments of the present disclosure
  • Figures 2 and 3 are example screen shots of interfaces presented to a user of a medical communication system in accordance with various embodiments
  • Figure 4 is a block diagram illustrating relationships between components of a coordinated medical visualization system in accordance with various embodiments of the present disclosure
  • Figure 5 is an example screen shot of an interface presented to a user for display of medical data in accordance with various embodiments
  • Figure 6 is a block diagram illustrating relationships between components of a medical voice control system in accordance with various embodiments of the present disclosure
  • FIGS 7-9 are diagrams of systems employing medical control and communication systems and techniques in accordance with various embodiments.
  • Figure 10 is an example computing environment in accordance with various embodiments.
  • Illustrative embodiments of systems and techniques of the present disclosure include, but are not limited to, systems and methods for efficient collection and delivery of patient information.
  • the systems include automated field data collection and transmission to supporting clinicians at remote locations for real-time monitoring.
  • Embodiments of the present disclosure allow physiological vital signs, history, procedures performed, drugs administered, and physical assessments to be combined into a single chronological record of care. This record can be referenced in the field and delivered in real time to a supporting physician or receiving hospital.
  • aspects of the present disclosure include real-time delivery of contextual information streams, hands-free acquisition of patient information, and a rule-based smart alarm system that avoids alarm fatigue.
  • on-body sensors monitor and record physiological waveforms, which are aggregated on a mobile device and transmitted either in real time or periodically to a remote physician. This provides for a flexible and scalable platform for medical applications, including in telemedicine.
  • data may be correlated from multiple sensor streams to validate measurements made by the sensors.
  • systems and techniques may use prior- determined probabilities based on validated clinical data, patient history, and treatments administered to determine subsequent actions after an alarm is triggered. In various embodiments, these actions may include, but are not limited to, when to alarm, when to acquire more data, and one or more protocols to recommend to clinicians. Additionally, in various embodiments, systems and techniques may archive user responses, triggered alarms and related data so that the decision-making process surrounding alarm triggering may be refined during post-analysis.
  • techniques and systems may enable coordinated representation of a set of medical data by visually presenting data related to the events using two or more correlated, i.e., linked, representations.
  • the data may include multiple continuous streams and/or discrete events from multiple sources and/or include tracking of changes of data read.
  • multiple visualizations may be displayed, controlled, and/or reviewed with respect to a common time axis.
  • such a display will allow for time-consistent review, such that moving forward in time in one of the visualizations will scroll others to a similar time interval, providing clearer correlation of events to medical personnel.
  • systems and techniques may enable a user, such as medical field personnel, to use vocal expressions to input data to the system to capture and/or categorize the nature of the care being provided (such as, for example, drugs administered, procedures being performed, physical observations, patient history, etc.).
  • the captured information may be further used to annotate physiological patient data streaming from patient sensors.
  • a remote clinician may be provided contemporaneous and consolidated updates on patients' states.
  • medical monitoring alarms may be controlled so as to reduce the number of false alarms.
  • the control of alarms may include the use of one or more of data correlation, prior probability review, and archival capability.
  • FIG. 1 is a block diagram that illustrates various aspects of medical alarm control according to various embodiments.
  • sensor data may flow from sensors at block 3, and profile data from block 2, to a computing device at block 1.
  • the computing device at block 1 may be a decision engine and may apply rules, which may be pre-defined by medical personnel, to determine an action to take based on the sensor data.
  • the action determined by the rule may be to identify and/or validate new facts and store them in the profile data at block 2, to determine additional sensor measurements to take which may be taken at block 6, and/or to determine alarms that may be issued.
  • a selector (such as at block 7) may determine which actuation mechanism (illustrated at block 10) will be most appropriate and effective.
  • the user may then respond to alarms at block 8.
  • actions that have occurred such as, for example, the sensing of data, the making of decisions, initiated alarms, and user response may be stored at block 9.
  • Prior probabilities as will be described below may be injected into the decision engine at block 5 based on profile data from block 2.
  • software upgrades may be applied via block 11 to the prior probabilities and the decision engine rules.
  • Various aspects of the actions described above are described below with reference to data correlation, prior probability review, and archiving.
  • systems and techniques described herein utilize correlated data from multiple sensor inputs to determine an action to take, such as (a) when to alarm, and (b) when to acquire more data.
  • This data correlation may be directed through the use of one or more pre-determined rules.
  • the rule may correlate sensor data from a first sensor measuring a first parameter of the patient with sensor data from a second sensor measuring a second parameter of the patient, where the first parameter is different from the second parameter. That is, the first sensor and the second sensor may be different types of sensors that measure a different characteristic of the patient. For example, if medical personnel seek to validate that a complete set of ECG leads have been properly placed on a patient, sensor data from the ECG monitor and plethysmography sensors may be correlated using the following rule, presented here in vernacular language:
  • compute the heart rate based on the plethysmography waveform. If the heart rate based on plethysmography is normal and 02 sat >92%, do not alarm. But if the plethysmography signal is clear and the heart rate based on plethysmography does not match the heart rate from ECG, signal that there is an artifact so paramedics can adjust leads.
  • rules may have different outcomes.
  • the rule recited above both determines whether or not to trigger an alarm and records validated sensor data.
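  • As an illustration only (the patent states the rule in vernacular form, not as code), the lead-placement rule above might be encoded as follows; all names and thresholds here are assumptions:

```python
# Sketch of the ECG/plethysmography correlation rule above. The 5-bpm
# tolerance and the 60-100 bpm "normal" range are illustrative
# assumptions, not values taken from the patent.

HR_TOLERANCE_BPM = 5          # assumed allowable HR disagreement
NORMAL_HR_RANGE = (60, 100)   # assumed adult normal heart-rate range

def evaluate_lead_placement(pleth_hr, pleth_signal_clear, spo2_pct, ecg_hr):
    """Correlate pleth-derived and ECG-derived heart rates into an action."""
    pleth_normal = NORMAL_HR_RANGE[0] <= pleth_hr <= NORMAL_HR_RANGE[1]
    if pleth_normal and spo2_pct > 92:
        return "no_alarm"                # both streams agree and look healthy
    if pleth_signal_clear and abs(pleth_hr - ecg_hr) > HR_TOLERANCE_BPM:
        return "artifact_adjust_leads"   # clear pleth contradicts the ECG
    return "alarm"                       # otherwise treat as a real event

# Example: clear pleth at 72 bpm, SpO2 97%, ECG 71 bpm -> "no_alarm"
print(evaluate_lead_placement(72, True, 97, 71))
```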
  • the systems and techniques may also leverage prior probabilities, such as at block 5, to determine an action to take, such as (a) when to alarm, (b) when to acquire more data, and (c) what protocol to suggest to the clinician.
  • This provides several advantages over systems that treat facts as being either true or false, e.g., binary triggering. This binary triggering may produce an incorrect assessment of facts that may contribute to false alarms.
  • prior-determined probabilities such as those based on profile data related to the patient, such as validated clinical data, history, and treatments administered, may be assigned to trigger conditions and may be revised as new profile data and/or sensor data emerge. Using these probabilities may make the system adaptable to the changing conditions. For example:
  • recent history of one or more alarm conditions which have occurred may be used to alter the assumed probability that an alarm condition from another sensor is valid and clinically important. For example, if a patient's blood pressure is low, a system may set a higher probability that an abnormal heart rate or a low SP02 is real and could be more likely to alarm when a low SP02 is detected.
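  • A minimal sketch of this idea, with assumed numbers: a recent low-BP condition raises the prior probability that a low-SP02 reading is valid, pushing the system over its alarm threshold.

```python
# Prior-probability adjustment sketch. The base prior (0.5), the boost
# (0.3), the alarm threshold (0.7), and the 92% cutoff are all assumed
# values for illustration.

ALARM_THRESHOLD = 0.7

def spo2_validity_probability(base_prior=0.5, recent_low_bp=False):
    """Probability that a low-SpO2 condition is real, not an artifact."""
    p = base_prior
    if recent_low_bp:
        p = min(1.0, p + 0.3)   # hypotension corroborates the low reading
    return p

def should_alarm_low_spo2(spo2_pct, recent_low_bp):
    if spo2_pct >= 92:          # assumed normal cutoff: no alarm condition
        return False
    return spo2_validity_probability(recent_low_bp=recent_low_bp) > ALARM_THRESHOLD

print(should_alarm_low_spo2(88, recent_low_bp=True))    # True: corroborated
print(should_alarm_low_spo2(88, recent_low_bp=False))   # False: likely artifact
```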
  • prior-determined probabilities may also be used to determine when to acquire additional sensor data. For example:
  • an abnormal heart rate or cardiac rhythm may trigger an automatic 12-lead ECG.
  • prior-determined probabilities may be used to determine recommended medical protocols. For example, if an observed cardiac rhythm is ventricular tachycardia, the clinician may be recommended to check blood pressure, PPG wave form, and SP02. Then, if BP is normal, the PPG wave form is normal, and SP02 is within acceptable range, the system may alarm and recommend drug treatment. Alternatively, if BP is dangerously low, PPG wave form is abnormal, or SP02 is low, the system may charge an associated defibrillator and recommend cardioversion.
  • data related to triggered alarms, facts and conditions which caused the alarms to trigger, rules in the computing device that were used during decision-making, and/or user response/feedback may be archived, i.e., stored, in a persistent store in the system, such as at block 9 in Figure 1.
  • user response(s) may cause decision rules and prior probabilities to be adjusted. This adjustment, according to various embodiments, may either be in real time by the system or at a later time for a next firmware or software release. For example, if a specific alarm has triggered multiple times, and each time a user of the system responds by silencing the alarm, the system may choose to archive the user response. Using the archived responses, a manufacturer or other system maintainer may look for conditions that caused the false alarm to trigger and modify firmware or software to minimize future occurrences.
  • firmware or software is illustrated at block 11 of Figure 1.
  • the system may automatically modify the rules and/or prior probability data based on the user response data.
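  • One way such an archive-and-adjust loop might look in code is sketched below; the record fields and the adjustment policy are assumptions, not details from the patent.

```python
# Archival sketch: every alarm, its sensor snapshot, the rule that fired,
# and the user response are stored; alarms the user keeps silencing have
# their trigger probability lowered. All field names are illustrative.

import time
from collections import Counter

archive = []                 # stands in for the persistent store (block 9)
silence_counts = Counter()   # per-alarm count of "silenced" responses

def record_alarm(alarm_id, sensor_snapshot, rule_id, user_response):
    archive.append({
        "time": time.time(),
        "alarm": alarm_id,
        "snapshot": sensor_snapshot,   # data around the trigger moment
        "rule": rule_id,
        "response": user_response,     # e.g. "acknowledged", "silenced"
    })
    if user_response == "silenced":
        silence_counts[alarm_id] += 1

def adjusted_prior(alarm_id, prior):
    """Lower the prior for repeatedly silenced alarms (assumed policy)."""
    return max(0.1, prior - 0.05 * silence_counts[alarm_id])

record_alarm("spo2_low", {"spo2": 80}, "rule_spo2", "silenced")
print(adjusted_prior("spo2_low", 0.5))   # 0.45
```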
  • Figure 1 illustrates a decision engine, at block 1, which may be a computing device.
  • the decision engine contains rules which may be used to determine actions to be taken. For example, the rule may evaluate alarm trigger conditions.
  • the engine may make decisions by evaluating input from sources such as: sensor data from one or more sensors (illustrated at block 3), profile data, such as patient information and patient history (illustrated at block 2), threshold settings for sensor data (illustrated at block 4), and prior- determined probabilities (illustrated at block 5).
  • the decision engine may compare sensor data against a set of rules, which comprise embedded medical knowledge, so that the decision engine may determine when to alarm. Prior probabilities are used by these rules to determine whether and what severity of alarm or other action is required.
  • actions determined from the decision engine may include: alarms triggered from evaluating one or more inputs, acquiring additional data such as retaking a blood pressure measurement if it is low, and feedback related to new facts identified from validated sensor data.
  • the decisions made may be represented as decision-making code.
  • a rule determining alarms related to low blood pressure may comprise the following:
  • a rule relating to alarms for heart rate may comprise one or more of the following:
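  • The rule bodies themselves are elided in the published text; the following is only a hypothetical reconstruction of what such rules might look like, with illustrative thresholds and an assumed `engine` interface:

```python
# Hypothetical reconstruction of the two elided rules. The 90 mmHg,
# 50 bpm, and 120 bpm thresholds and the engine methods are assumptions.

def low_bp_rule(systolic_mmhg, engine):
    """Alarm on low blood pressure and request confirming measurements."""
    if systolic_mmhg < 90:
        engine.trigger_alarm("low_bp")
        engine.request_measurement("bp")            # retake to confirm (block 6)
        engine.request_measurement("ecg_12_lead")   # gather corroborating data

def heart_rate_rule(hr_bpm, engine):
    """Alarm on heart rates outside an assumed normal range."""
    if hr_bpm < 50:
        engine.trigger_alarm("bradycardia")
    elif hr_bpm > 120:
        engine.trigger_alarm("tachycardia")
```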
  • Figure 1 also illustrates profile data related to the patient at block 2.
  • the profile data may be fed into the decision engine of block 1 as a part of the decision making process.
  • Prior probabilities, such as at block 5, may be continuously updated as new profile data emerge.
  • profile data may include information such as patient demographics (such as gender, age, chief complaint, patient history, etc.), medical protocols (for example suggested procedures to be followed to treat a patient), drugs administered to the patient during treatment, procedures administered to the patient, (for example CPR or defibrillation), and validated sensor data.
  • the validated sensor data may be distinct from raw sensor data received by the system. For example, plethysmography data read from a sensor is raw data representing the SP02 waveform. This waveform may be analyzed to derive a pulse rate and this pulse rate may be validated against an actual, observed pulse rate. This validated data may subsequently be used by the decision engine for future decisions.
  • the data may be validated through correlation of data from multiple sensors, as described herein.
  • Figure 1 also illustrates sensors at block 3.
  • the sensors may be medical sensors that are operatively coupled to a patient to monitor the patient. Examples include ECG sensors to monitor the heart, SP02 sensors to monitor blood oxygenation, blood pressure sensors, ETC02 sensors to monitor breathing, etc.
  • sensors may be wired or wireless and the data coming in may be scalar (e.g., blood pressure) or streaming (e.g., cardiac rhythm).
  • sensor data may be input to the computing device, e.g., decision engine.
  • FIG. 1 illustrates thresholds at block 4.
  • these represent threshold settings for actions such as alarm triggers.
  • Various settings may be pre-programmed or defined by a user. For example, an alarm may trigger if an observed actual blood pressure reading is below or above determined low/high thresholds for BP.
  • normal values for sensor data may vary by age. Threshold settings may thus be adjusted according to profile data, such as age of the patient.
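  • A small sketch of age-adjusted thresholds, using illustrative pediatric and adult heart-rate ranges rather than values from the patent:

```python
# Age-adjusted alarm thresholds drawn from profile data. The age
# breakpoints and ranges below are illustrative assumptions.

def heart_rate_limits(age_years):
    """Return (low, high) heart-rate alarm thresholds for a patient's age."""
    if age_years < 1:
        return (100, 160)   # infant
    if age_years < 12:
        return (70, 120)    # child
    return (50, 120)        # adolescent / adult

low, high = heart_rate_limits(age_years=8)
print(low <= 65 <= high)    # True: 65 bpm would not alarm for an 8-year-old
```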
  • Figure 1 illustrates prior-determined probabilities at block 5.
  • this illustrated component may use prior-determined probabilities to assess conditions that could lead to a triggering of an alarm.
  • the prior- determined probabilities may be adjusted to improve the decision making process. These probabilities may be fed into the decision engine as inputs to the rules.
  • Figure 1 illustrates "actions" at block 6.
  • this component may effectuate some or all actions triggered by the decision engine. For example, if the decision engine comprises a rule to take a 12-lead ECG reading if blood pressure is low, the action component may send a command to the ECG sensor to take the ECG reading.
  • Figure 1 illustrates a selector at block 7.
  • when alarms are triggered by the decision engine, a user may be notified in various ways. For example, embodiments may alarm visually by flashing readings on the display, embodiments may play audible beeps or alarm voice playback, etc.
  • the selector component may be used to select the appropriate method of actuating alarms triggered, such as at block 10, by the decision engine. In various embodiments, the selector may also use knowledge of user response, such as at block 8, to decide on an actuation method.
  • Figure 1 illustrates a user response component at block 8.
  • this component may accept, record, and act on one or more user responses to alarms.
  • Such responses may include a user acknowledging alarms, silencing specific alarms, re-activating alarms, etc.
  • User response may also include feedback from a user to let the system know the user's impression of the alarm.
  • Such feedback may include indications such as: "this alarm is real," "this is a false alarm, do not trigger," and/or "this is a critical alarm, always trigger."
  • user response may feed into the selector component, such as at block 7, to select an appropriate way to notify a user. For instance, if a user has silenced the BP alarm, the selector would not use audio to notify the user of a BP alarm. User feedback about false alarms, for example, may also be sent to the decision engine to modify its decision making process appropriately. Finally, user feedback may be recorded in a persistent store, such as at block 9, to enable future adjustment of processes and techniques as described herein.
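  • A minimal sketch of such a selector policy, assuming a simple set of actuation mechanisms:

```python
# Selector sketch (block 7): pick an actuation mechanism using recorded
# user responses. The mechanism names and policy are assumed examples.

def select_actuator(alarm_id, audio_silenced_alarms):
    """Avoid audio for alarms the user has silenced; fall back to visual."""
    if alarm_id in audio_silenced_alarms:
        return "flash_display"      # visual-only notification
    return "audible_beep"           # default actuation

silenced = {"bp_low"}
print(select_actuator("bp_low", silenced))     # flash_display
print(select_actuator("spo2_low", silenced))   # audible_beep
```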
  • Figure 1 illustrates a persistent store at block 9.
  • this component may store a record of alarms which have been triggered, user response to each of the triggered alarms, when applicable, and the actuating mechanism which was used to notify the user for each alarm triggered.
  • Other relevant information may also be archived. For example, a snapshot of sensor data at a time an alarm was triggered may be archived; the archived data may include a determined amount of time before/after the moment the alarm was triggered.
  • embodiments may archive rules and/or decisions that caused an alarm to trigger and/or user response and facts known at the time of the triggering.
  • the records stored may be used by a manufacturer or other maintainer, for example to improve a rule base in the decision engine to make better decisions in future revisions of the software or firmware.
  • Figure 1 also illustrates actuators at block 10. In various aspects, these illustrated actuators represent different mechanisms by which alarms may be effected.
  • alarms may be rendered, according to various embodiments, on the display through flashing icons, by displaying sensor readings in different colors, by using audible beeps or a voice playback to notify the user about a specific alarm, and/or by other techniques such as flashing LEDs on the display.
  • Figure 1 also illustrates a software upgrade block at block 11.
  • this component may be used by a device manufacturer to upgrade software and/or firmware to include refinements and/or improvements to the decision making process. Upgrades may include, for example, revisions to rules in the decision engine and/or revisions to prior-determined probability tables.
  • Figures 2 and 3 show example screenshots illustrating one implementation of a graphical application operating according to the techniques and systems described herein.
  • Figure 2 illustrates a screenshot from a Current Vitals tab 202 in an application.
  • Figure 2 shows vital signs as an ECG waveform 204 and an SP02 waveform 206, and as scalar data for cardiac rhythm 208, SP02 210, and ETC02 212.
  • Alarms 214 and 216 have been triggered for low SP02 and pulse readings, respectively.
  • the illustrated screen shot of Figure 2 is divided into three regions: a top region 218, a middle region 220, and a bottom region 222.
  • the top region 218 and bottom region 222 may be consistent and may be kept visible.
  • the top region 218, as illustrated, contains critical patient information, including patient identifiers and chief complaints (which are often used to identify patients, e.g., "65 year old man with chest pain"), current vital sign information, and alarm status.
  • the middle region may contain navigation tabs.
  • the first three tabs— Current Vitals 202, Historical Data 224, and 12-lead ECG 226— may display physiological sensor data.
  • the bottom region 222 may provide visual feedback for the audio context system, as described herein, and may include the current voice-to-text translation and system state.
  • Figure 3 illustrates a screenshot from an Alarms tab 336 in an application. As illustrated, Figure 3 shows that a user may be allowed to set alarm thresholds 340, such as, for example, thresholds for high and low blood pressure 342 and end tidal C02 344. Additionally, in the illustrated implementation, an alarm history 346 of activation and deactivation of various triggered alarms is also shown.
  • the systems and techniques described herein may provide a facility for visualizing medical data from multiple sources in a coordinated representation.
  • data is correlated and represented using time as a common axis. This may include continuous data (from multiple sources) and discrete events which occur at specific points in time.
  • data may be acquired in various formats.
  • data may include:
  • data may be presented as a continuous plot versus time, such as ECG, SP02 waveform, etc.
  • data may represent non- medical measurements, such as, for example, speed of an object flying or bandwidth usage on a network link.
  • Such data may be, in one embodiment, displayed using a ticker with changing text.
  • Examples of data sampled on a non-continuous basis include stock prices, gas counters at gas pumps, and/or news updates.
  • These data may also be grouped in a visualization by virtue of being human-generated, thus giving a consistent record of human-produced activities. Examples of such data include a person logging into a machine, or someone activating a feature using voice.
  • the techniques may visualize events that are initiated by the execution of a process. These events may read data that falls under various event categories discussed herein, including, in some embodiments, other process-generated data, and produce log events. For example a system may include a security flag to be logged if a person tries to log in 20 times with a wrong password on multiple machines inside a company within 10 minutes, or may record an alarm generated due to a patient's heart rate suddenly dropping sharply.
  • FIG. 4 illustrates a block diagram of one implementation of a system 400 for coordinated visualization of data from multiple sources.
  • sensor data 402 may be obtained from a plurality of sensors, as described above.
  • the sensor data 402, in whole or in part, may be stored on a database or server 404, as illustrated.
  • the system 400 may attach a time stamp to the sensor data 402.
  • when visualized (e.g., by visual representations 406, 408, 410, and/or 412), the data may not be currently-observed data, but may instead be historical data which has been saved on the database or server.
  • the system may associate a portion of the data having time stamps within a time interval as treatment session data to be played back later.
  • a user of these techniques may play back a non-current session, including respective timestamps, using data fed from the database 404 instead of the sensors. Additionally, some of the visualized data may come from one or more processes 414 which may operate on the sensor data 402 before generating data to visualize, as illustrated.
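  • A sketch of this time-stamping and playback idea, assuming a simple SQLite schema (the patent does not specify a storage format):

```python
# Session playback sketch: readings are stored with timestamps, and any
# time interval can be replayed from the database instead of live sensors.

import sqlite3
import time

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (ts REAL, sensor TEXT, value REAL)")

def store(sensor, value):
    db.execute("INSERT INTO readings VALUES (?, ?, ?)",
               (time.time(), sensor, value))

def playback(session_start, session_end):
    """Yield a past session's readings in order, with their timestamps."""
    yield from db.execute(
        "SELECT ts, sensor, value FROM readings "
        "WHERE ts BETWEEN ? AND ? ORDER BY ts",
        (session_start, session_end))

store("hr", 72.0)
for ts, sensor, value in playback(0, time.time() + 1):
    print(sensor, value)
```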
  • the systems and techniques described herein may be implemented in an emergency medicine usage model.
  • This implementation may present data coming from different sources, showing the data to a user in one combined view using a multitude of tools and tying them together using time as a common axis.
  • Figure 5 illustrates an example screenshot 500 of such an implementation.
  • data may be visualized according to different visualization techniques.
  • visualization techniques may include continuous data plots 502, one or more visual timelines 504 representing a treatment session, in whole or in part, one or more textual charts 506, and one or more text tickers 508 with changing text.
  • the system may present data to the user using a plurality of visualization techniques:
  • coordination may start by capturing continuous data from sensors.
  • the data may also include discrete values of other physiological sensors, including ECG, ETC02, and SP02, which may be presented to the user as plots and/or waveforms.
  • sensors such as, for example, blood pressure. These are represented in the example of Figure 5 as the continuous graphs 502 in the mid- left area of the screenshot, as well as the ticker 508 which runs near the bottom of the screen shot, to the left of the illustrated buttons.
  • annotations correspond to events such as voice inputs (for example as described below), or alarms created by the values of physiological data.
  • the annotations may also include events, for example taking specific measurements such as blood pressure.
  • the annotations are shown in text chart 506 on the right side of the screenshot, where time, category, and detail are shown for each annotation.
  • data may be presented as a timeline.
  • the left hand side of the timeline represents the start time of monitoring and the right hand side represents a most recent time.
  • this most recent time comprises a current time if the session is still active; in another embodiment, if the data-gathering session has ended, the most recent time comprises an end time.
  • colored markers may be added to the timeline as visual clues to indicate major events, such as, for example, alarms or voice inputs.
  • a snapshot of a data stream.
  • timeline 504 is shown near the bottom of the screen, representing a timeline between 13:37:59 and 13:46:11. Additionally, in the illustrated implementation, the screenshot includes a button 510 for bringing the system up to a live timepoint.
  • all three of the visualization techniques described above may be linked together using time, e.g., may have a common time axis.
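  • One assumed way to implement this linkage is an observer over a shared time cursor, as sketched below:

```python
# Common-time-axis sketch: scrolling moves one shared cursor, and every
# registered visualization re-renders at that time. The observer pattern
# here is an assumed implementation choice, not one named by the patent.

class SharedTimeAxis:
    def __init__(self):
        self.cursor = 0.0    # current review time, seconds into the session
        self.views = []      # waveform plots, timelines, text tickers, ...

    def register(self, view):
        self.views.append(view)

    def scroll_to(self, t):
        """Move every linked view to time t for time-consistent review."""
        self.cursor = t
        for view in self.views:
            view.render_at(t)

class WaveformView:
    def render_at(self, t):
        print(f"waveform view at t={t:.1f}s")

axis = SharedTimeAxis()
axis.register(WaveformView())
axis.scroll_to(42.0)        # all registered views jump to 42 s
```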
  • systems and techniques may provide for user input to capture and categorize medical information and direct care, and in particular to be used in telemedicine.
  • embodiments may be directed to the creation and processing of voice annotations, which record medical information, such as decisions, drugs, procedures and the like, or which cause events to occur.
  • embodiments may also provide for text annotations.
  • some embodiments may convert the voice annotations to text.
  • FIG. 6 is a block diagram which illustrates embodiments of a voice control system 600 for integrating voice control into medical decision-making. Aspects of the figure, systems, and techniques will be described with reference to five elements:
  • these elements include a feedback phase of the system (shown as functional block 628 in Figure 6).
  • the systems and techniques described herein compose these elements into a mobile platform for use by a health practitioner in the field.
  • embodiments may enable context aspects sought to be explained during treatment, known colloquially as "who, what, when, how, and why" to be more easily captured in the field.
  • embodiments of the system 600 include aspects relating to determining when a speech recognizer may start listening for annotations during treatment. In various embodiments, this determination may be made according to different mechanisms. For example, such mechanisms may include:
  • a speech recognition engine may recognize this command and cause the system to enter a listening mode.
  • the system may turn off the listening mode.
  • a hardware push-button accessible to the user.
  • this push-button may be located on a device implementing the techniques discussed herein, in some embodiments the push-button may be worn by a user, such as, for example, on a ring or a wrist band. In various embodiments, the push-button may be wirelessly connected to the system and, upon activation, cause the system to toggle between start/stop listening modes.
  • a software push-button such as at block 604.
  • the systems and techniques may display a software-based push-button on an associated display.
  • this push-button When this push-button is activated, the system may toggle between start/stop listening modes.
  • these components may determine one or more categories of an annotation to be captured, thereby capturing the "what" aspect of a treatment being provided.
  • these components may operate through the use of pre-defined categories.
  • the categories may be stored at block 630 in Figure 6. For example, one or more of the following five categories may be defined through which annotations may be categorized.
  • DEMO for patient demographics (age, gender, chief complaint), DRUG for drugs administered, PROC for procedures performed, PHYS for physical assessments, and HIST for patient history.
  • the categories are oriented toward emergency medicine, and may be extended to telemedicine; however, in other contexts, other categories may be utilized. In some embodiments, in addition to the categories described above, other categories may be used. For example, a system may use a MEMO category for recording audio memos and/or an EVTS category, such as for voice commands as will be described below.
  • a user may say a category-based word, such as "Drug,” before recording any drug-related annotations.
  • buttons may be used, and the combination of their activation determines an intended annotation category.
  • these push-buttons may be located on a system display and there may be a separate one for each pre-defined category.
  • voice activation and deactivation may be combined with categorization, to simplify the experience for the user.
  • activation and categorization may be voice-based. For example, a user may say "Enter Drug." If the system recognizes this command, it may enter a listening mode with the DRUG category selected.
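  • A sketch of this combined activation-and-categorization step; the category names come from the text above, while the state machine itself is an assumed design:

```python
# Combined voice activation and categorization sketch: "Enter Drug"
# starts listening with the DRUG category selected; "Stop" deactivates.

CATEGORIES = {"DRUG", "PROC", "PHYS", "DEMO", "HIST"}

class AnnotationControl:
    def __init__(self):
        self.listening = False
        self.category = None

    def on_phrase(self, phrase):
        words = phrase.upper().split()
        if len(words) > 1 and words[0] == "ENTER" and words[1] in CATEGORIES:
            self.listening, self.category = True, words[1]
        elif words and words[0] == "STOP":
            self.listening, self.category = False, None

ctl = AnnotationControl()
ctl.on_phrase("Enter Drug")
print(ctl.listening, ctl.category)   # True DRUG
```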
  • Figure 6 also illustrates, at block 624, components directed to capturing raw audio annotations by recording audio between, for example, voice activation and deactivation.
  • This recording may be deactivated, according to various embodiments, through a deactivation mechanism as described above, or via a timeout mechanism, such as at block 612.
  • this raw audio may be further analyzed by a speech recognition (SR) engine to convert the annotation to text.
  • conversion to a text annotation may be performed on a database server, such as is illustrated in Figure 6, particularly if system performance is an issue.
  • time stamps for the annotations are also attached to, i.e., associated with, the annotations, such as at block 608.
  • speech recognition accuracy may be increased and latency decreased by defining one or more category-specific limited grammars or vocabularies (such as block 632 in Figure 6) that are activated based upon a category identified during a categorization stage.
  • a limited vocabulary may be defined, such as for categories like DRUG, PROC, PHYS, and DEMO discussed above, based on the feedback from actual users, such as paramedics and doctors involved in emergency medicine. While such a vocabulary may cover only specific medical scenarios under which it was created, such a vocabulary can be easily extended.
  • a category such as HIST discussed above may take the form of a large dictation vocabulary because of the open-ended nature of the annotations in such a category.
  • Examples of the limited grammar/vocabulary include:
  • PROC category (procedures performed)
  • DEMO category (patient information)
  • HIST category (patient history)
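  • A sketch of how such category-specific vocabularies might be organized; the word lists are invented examples, not the patent's actual grammars:

```python
# Limited-vocabulary sketch (block 632): selecting a category activates a
# small grammar, which can raise recognition accuracy and cut latency.

LIMITED_VOCABULARIES = {
    "DRUG": ["epinephrine", "aspirin", "morphine", "milligrams"],
    "PROC": ["intubated", "defibrillated", "started", "CPR", "IV"],
    "DEMO": ["male", "female", "year", "old", "chest", "pain"],
    # HIST is omitted: per the text, it would use a large dictation
    # vocabulary rather than a limited grammar.
}

def active_grammar(category):
    """Return the limited grammar, or None to fall back to dictation."""
    return LIMITED_VOCABULARIES.get(category)

print(active_grammar("DRUG"))
print(active_grammar("HIST"))   # None -> dictation vocabulary
```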
  • annotations may capture different aspects of care being provided.
  • a "what" context may be provided from the DRUG/PROC annotations, a "why" context from the PHYS/HIST annotations, a "how” context from the DRUG/PROC annotation, a "when” context from the recorded timestamp of the annotation, and a "who" from logged user information, such as at block 601.
  • the grammar described above may be extended to cover situations in which annotations are recorded later than the actual time care was given.
  • the grammar may be extended to accept an annotation of "2 minutes ago" for a procedure.
  • extensions may allow for a scenario in which a secondary user who is different from the person doing annotations, such as secondary field personnel, is actually carrying out the care.
  • such an extension may take the form of <what> ["at" <when>] ["by" <who>].
  • an annotation may take the form of "[PROC] intubated at 3:15 by Joe."
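  • A minimal sketch of parsing that extended grammar; the regular expression is one assumed encoding of <what> ["at" <when>] ["by" <who>]:

```python
# Parse annotations of the form <what> ["at" <when>] ["by" <who>],
# e.g. "intubated at 3:15 by Joe". The regex is an illustrative encoding.

import re

ANNOTATION = re.compile(
    r"^(?P<what>.+?)"
    r"(?:\s+at\s+(?P<when>\S+))?"
    r"(?:\s+by\s+(?P<who>\S+))?$")

m = ANNOTATION.match("intubated at 3:15 by Joe")
print(m.groupdict())   # {'what': 'intubated', 'when': '3:15', 'who': 'Joe'}
```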
  • embodiments of the system 600 and techniques may include a block 626 with components (such as block 613) that combine information, such as what, when, how, why, who, of a captured annotation and store it into a database 634.
  • the text annotations may be further analyzed by one or more understanding components, such as at block 615, which may enter information into a report and/or provide context to other system components.
  • the understanding component may cause a certain action (block 614) to be performed such as parse patient information (e.g., DEMO) to fill in age, gender, and/or complaint fields of a report and also to display such information on a display 636 for a user.
  • an understanding component may recognize a CPR procedure annotation "start CPR / stop CPR" such that the component may start or stop a CPR timer.
  • various embodiments may enable usage of audio to give commands to the system to perform certain actions to control the system.
  • the system may recognize commands to navigate to certain GUI screens, to control alarm activation and/or silencing, and to initiate pre-defined sensor readings.
  • a user may say "take BP" to take a blood pressure reading or "take 12 lead” to take an ECG reading.
  • this mode may utilize a pre-defined command category, such as that discussed above.
  • the database 634 of the system may also contain time-stamped sensor data from physiological sensors that have been connected to monitor a patient. This sensor data may also account partly for the "why" aspect of the patient treatment.
  • a remote system, for example at a consulting clinician's facility, may pull data, along with the annotations, and present consolidated and contemporaneous information on a particular patient treatment. The consulting clinician can thus provide more effective guidance to the local medical personnel without the need for lengthy and potentially incomplete conversations with the local personnel.

Examples of Annotation State Feedback
  • the system 600 and techniques may utilize a feedback component 628 which makes a user aware of a state of the system.
  • Such feedback may include, for example, whether voice control/input has been activated (block 609) and categorization has been successful, whether audio is being recorded and audio annotation capture is in progress, when speech-to-text conversion for text annotation is successful (block 610), and when audio recording is complete (block 611).
  • this type of feedback is useful in order for the user to know whether they need to repeat any processing stage.
  • different feedback mechanisms may be used. Two examples amenable to mobile telemedicine and emergency medicine scenarios are described herein. Either of these mechanisms may make it possible for feedback to be part of the user's environment and enable the user to be aware of an annotation state while allowing the user to continue to perform other activities.
  • a first mechanism 638 includes feedback using a plurality of differently colored LEDs mounted on a mobile device implementing the techniques or on safety glasses of a medical care-giver. Any combination of colors of LEDs and associated meanings may be used. For example, in one implementation, four colored LEDs may be used (blue, yellow, green, and red), each with an associated meaning.
  • a second mechanism 640 utilizes audio feedback with distinct sounds to identify a state. For example, the system may use one distinct sound to indicate that voice activation and categorization has succeeded and a second distinct sound to indicate that the voice recognition has been deactivated.

Examples of Use
  • during simulation sessions, one paramedic used an audio headset to create audio annotations. Prior to the sessions, the paramedics underwent an hour-long voice training session. The training created a voice model for the speaker and provided hands-on experience for the user in system usage and grammar/vocabulary. During each simulation session, paramedics created audio annotations, which were recorded in the database along with the textual translations. The sessions were also videotaped for additional ground truth.
  • the systems and techniques described herein may be focused around an aggregator device that is associated with a given person and collects sensor data from various sensors associated with that person.
  • the aggregator device may be a computing device, and may allow the sensor data to be processed, stored, and displayed locally, as well as delivered to remote telemedicine clients.
  • Figure 7 presents an example of a high-level system architecture 700 according to the techniques described herein.
  • the architecture 700 is partitioned into three parts: one or more sensors (e.g., sensors 702, 704, 706, and 708), an aggregator 710, and a back end 712.
  • the back end 712 may include a computer 714 and/or a database server 716.
  • the aggregator 710 may be a decision engine as described above.
  • the aggregator 710 may be implemented on a mobile platform, such as a PDA, smart phone, or tablet.
  • the aggregator 710 may be paired with on-body or environmental sensors, from which it receives data via a Wireless Personal Area Network (WPAN).
  • the aggregator 710 and sensors 702, 704, 706, and 708 may communicate via short-range wireless protocols such as, for example, Bluetooth.
  • the aggregator 710 may control the sensors 702, 704, 706, and 708 and receive, process, store, and/or display sensor data. Sensor data may then be replicated to back end telemedicine clients through a wide area wireless network.
  • the system 700 may also have associated one or more PC platforms 714 which receive, store, analyze and/or display data from each connected aggregator 710. Communication between each aggregator 710 and the back end platform 712 may take place through a storage subsystem, such as database server 716, using a database replication and data streaming architecture to synchronize data across the platforms.
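  • A rough sketch of this aggregator-to-back-end path, with an assumed queue-based replication loop standing in for the database replication described above:

```python
# Aggregator sketch: sensor readings are stored locally and periodically
# pushed to the back end. The queue and batch policy are assumed details.

import queue
import time

local_store = []            # aggregator-side storage subsystem
outbound = queue.Queue()    # rows awaiting replication to the back end

def on_sensor_data(row):
    local_store.append(row)  # process/store/display locally
    outbound.put(row)        # queue for wide-area replication

def replicate_once(send_to_backend):
    """Drain the outbound queue and push one batch to the back end."""
    batch = []
    while not outbound.empty():
        batch.append(outbound.get())
    if batch:
        send_to_backend(batch)   # e.g. database replication over a WWAN

on_sensor_data({"ts": time.time(), "sensor": "spo2", "value": 97})
replicate_once(lambda batch: print("replicated", len(batch), "row(s)"))
```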
  • FIG. 8 illustrates a high-level end-to-end view of an example software architecture 800 implementing embodiments described herein.
  • the architecture may be symmetric between the aggregator 810 and the back end 812.
  • the aggregator 810 may include a plurality of platforms, including a data acquisition block 820, a user interface block 822, and a storage subsystems block 824.
  • the back end 812 may include a data acquisition block 826, a user interface block 828, and a storage subsystems block 830.
  • the systems of the aggregator 810 and back end 812 may be encapsulated in application objects 832 and 834, respectively.
  • the application objects 832 and 834 manage core components, reducing the overhead if the system is to be customized to a specific application. Accordingly, for a specific application, a system designer may only need to create and manage application-specific data analysis components and a custom user interface.
  • the application object 832 may manage communication with wireless sensors, such as Bluetooth™ sensors. Acquired data may be stored via the storage subsystem 824 as well as streamed to the analysis components 836 and to the user interface 822.
  • the storage subsystem 824 may include a replication capability that is a communication mechanism between the aggregator 810 and the back end 812.
  • the application object 832 receives data via the storage subsystem 824 and may stream it to the analysis components 838 and the back end user interface application 828.
  • the user interfaces 822 and 828 may receive real-time data from the application objects 832 and 834, respectively, via a streaming interface. They may also access historical data from the storage subsystems 824 and 830, respectively, via a data access layer (DAL).
  • the application object 932 may provide functions of data acquisition. Such functions may include: 1) a publish/subscribe based reconfigurable data path 940 that may be configurable at system integration time, 2) an extensible sensing component 942 that may interface with sensors, 3) analysis components 944 for real-time data analysis, and 4) a database bridge 946 (DB) component to record incoming data streams in a local database.
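  • A minimal sketch of such a publish/subscribe data path; the bus below is an assumed illustration, not the patent's actual component:

```python
# Publish/subscribe data-path sketch (940): sensing components publish
# samples by topic; analysis, UI, and database-bridge components subscribe.

from collections import defaultdict

class DataPath:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, sample):
        for callback in self.subscribers[topic]:
            callback(sample)

bus = DataPath()
bus.subscribe("ecg", lambda s: print("analysis got", s))    # analysis (944)
bus.subscribe("ecg", lambda s: print("db bridge got", s))   # DB bridge (946)
bus.publish("ecg", {"t": 0.0, "mv": 0.42})
```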
  • Figure 10 illustrates, for one embodiment, an example system 1000 comprising one or more processor(s) 1004, system control logic 1008 coupled to at least one of the processor(s) 1004, system memory 1012 coupled to system control logic 1008, nonvolatile memory (NVM)/storage 1016 coupled to system control logic 1008, and one or more communications interface(s) 1020 coupled to system control logic 1008.
  • System control logic 1008 may include any suitable interface controller to provide for any suitable interface to at least one of the processor(s) 1004 and/or to any suitable device or component in communication with system control logic 1008.
  • System control logic 1008 for one embodiment may include one or more memory controller(s) to provide an interface to system memory 1012.
  • System memory 1012 may be used to load and store data and/or instructions, for example, for system 1000.
  • System memory 1012 for one embodiment may include any suitable volatile memory, such as suitable dynamic random access memory (DRAM), for example.
  • System control logic 1008 may include one or more input/output (I/O) controller(s) to provide an interface to NVM/storage 1016 and communications interface(s) 1020.
  • NVM/storage 1016 may be used to store data and/or instructions, for example.
  • NVM/storage 1016 may include any suitable non-volatile memory, such as flash memory, for example, and/or may include any suitable non-volatile storage device(s), such as one or more hard disk drive(s) (HDD(s)), one or more solid-state drive(s), one or more compact disc (CD) drive(s), and/or one or more digital versatile disc (DVD) drive(s) for example.
  • the NVM/storage 1016 may include a storage resource physically part of a device on which the system 1000 is installed or it may be accessible by, but not necessarily a part of, the device.
  • the NVM/storage 1016 may be accessed over a network via the communications interface(s) 1020.
  • System memory 1012 and NVM/storage 1016 may include, in particular, temporal and persistent copies of medical communication logic 1024, respectively.
  • the medical communication logic 1024 may include instructions that when executed by at least one of the processor(s) 1004 result in the system 1000 performing the medical monitoring, communication, and alarm control operations described herein.
  • the medical communication logic 1024 may additionally/alternatively be located in the system control logic 1008.
  • Communications interface(s) 1020 may provide an interface for system 1000 to communicate over one or more network(s) and/or with any other suitable device.
  • Communications interface(s) 1020 may include any suitable hardware and/or firmware.
  • Communications interface(s) 1020 for one embodiment may include, for example, a network adapter, a wireless network adapter, a telephone modem, and/or a wireless modem.
  • communications interface(s) 1020 for one embodiment may use one or more antenna(s).
  • At least one of the processor(s) 1004 may be packaged together with logic for one or more controller(s) of system control logic 1008.
  • at least one of the processor(s) 1004 may be packaged together with logic for one or more controllers of system control logic 1008 to form a System in Package (SiP).
  • at least one of the processor(s) 1004 may be integrated on the same die with logic for one or more controller(s) of system control logic 1008.
  • at least one of the processor(s) 1004 may be integrated on the same die with logic for one or more controller(s) of system control logic 1008 to form a System on Chip (SoC).
  • system 1000 may have more or fewer components, and/or different architectures.
  • references throughout this specification to "one embodiment” or “an embodiment” mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one implementation encompassed within the present invention. Thus, appearances of the phrase “one embodiment” or “in an embodiment” are not necessarily referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be instituted in other suitable forms other than the particular embodiment(s) illustrated.

Abstract

Embodiments herein provide a rule-based smart alarm system for a medical monitoring system that avoids alarm fatigue, delivery of coordinated medical data from multiple sources, and methods to facilitate data entry by a caregiver in the field. In various embodiments, data may be correlated from multiple sensor streams to validate measurements made by sensors. Additionally, in various embodiments, systems and techniques may use prior-determined probabilities based on validated clinical data, patient history, and treatments administered to determine subsequent actions after an alarm is triggered. In various embodiments, these actions may include, but are not limited to, when to alarm, when to acquire more data, and one or more protocols to recommend to clinicians. Additionally, in various embodiments, systems and techniques may archive user responses, triggered alarms, and related data so that the decision-making process surrounding alarm triggering may be refined during post-analysis. Additionally, in various embodiments, techniques and systems may enable coordinated representation of a set of medical data by visually presenting data related to the events using two or more correlated representations.

Description

Context-Management Framework for Telemedicine
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims priority to U.S. Provisional Patent
Application No. 61/315,358, filed March 18, 2010, entitled "Context-Management Framework for Telemedicine," the entire disclosure of which is hereby incorporated by reference in its entirety.
TECHNICAL FIELD
[0002] This application relates to the field of health care, and in particular, to devices and techniques for monitoring and recording patient information.
BACKGROUND
[0003] In a clinical setting, physicians make medical decisions based on physiological measurements and interaction with patients. Patient monitoring systems are used extensively to monitor vital signs of patients in multiple medical settings, including out-of-hospital care by emergency medical technicians and paramedics, out-patient settings such as urgent care clinics and freestanding surgical centers, intermediate care facilities such as nursing homes, and hospital facilities such as emergency departments, intensive care units, and hospital "step-down units" providing a transition from the ICU to general medical and surgical units. Such systems acquire data from medical sensors to measure patient vital signs such as cardiac activity, blood-oxygenation, and blood pressure.
[0004] However, the sheer amount of information that flows from these sensors, from the patient directly, and from medical personnel can make accurate and consistent analysis of the patient's medical condition difficult. Caregivers may find themselves unable to record and maintain careful records of patient information, such as histories, decisions made, drugs given, and procedures performed, which can hamper both their ability to provide effective care and the ability of other caregivers to review actions which were taken. [0005] These caregivers may also find themselves facing difficulties in interpreting measurements as they are taken. For example, alarms, which may be raised to alert medical personnel if sensors detect anomalies in sensor data, may often be falsely triggered. However, in many systems, every triggered alarm requires the medical personnel to tend to the alarm, determine the alarm trigger, and decide whether the situation requires attention. False alarms may distract medical personnel from providing patient care, forcing the personnel to pay more attention to the patient monitoring system instead of the patient herself. Eventually, false alarms may lead to alarm fatigue, a situation in which subsequent alarms are sometimes ignored, at times in critical cases.
[0006] One particular scenario in which these difficulties present themselves is in the context of telemedicine, where a remote physician or other caregiver is often dependent on a local caregiver to act as a proxy, interacting with the patient and relaying physiological information, patient history, and procedures. One particular example of telemedicine is emergency medical services (EMS). Although paramedics may treat many routine patients independently, for more complex cases they may call upon a remote physician or other base station clinician for advice. In this often chaotic environment, physiological information is captured on electronic monitoring devices, while patient history, procedures performed, and medications administered are often captured on paper, or even on a caregiver's hand or glove. The paramedic's best estimate of key facts and order of events is communicated by voice to the supporting clinician, delivered in summary to the receiving hospital, and integrated into a final report. Because these systems are difficult to coordinate and suffer from a lack of precision, however, physicians often base their decisions on information that is, at least in part, relayed by voice and is both limited and prone to error.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] Embodiments of the present disclosure will be described by way of exemplary embodiments, but not limitations, illustrated in the accompanying drawings in which like references denote similar elements, and in which: [0008] Figure 1 is a block diagram illustrating relationships between components of a medical alarm control system in accordance with various embodiments of the present disclosure;
[0009] Figures 2 and 3 are example screen shots of interfaces presented to a user of a medical communication system in accordance with various embodiments;
[0010] Figure 4 is a block diagram illustrating relationships between components of a coordinated medical visualization system in accordance with various embodiments of the present disclosure;
[0011] Figure 5 is an example screen shot of an interface presented to a user for display of medical data in accordance with various embodiments;
[0012] Figure 6 is a block diagram illustrating relationships between components of a medical voice control system in accordance with various embodiments of the present disclosure;
[0013] Figures 7-9 are diagrams of systems employing medical control and communication systems and techniques in accordance with various embodiments; and
[0014] Figure 10 is an example computing environment in accordance with various embodiments.
DETAILED DESCRIPTION OF EMBODIMENTS
[0015] Illustrative embodiments of systems and techniques of the present disclosure include, but are not limited to, systems and methods for efficient collection and delivery of patient information. In various embodiments, the systems include automated field data collection and transmission to supporting clinicians at remote locations for real-time monitoring. Embodiments of the present disclosure allow physiological vital signs, history, procedures performed, drugs administered, and physical assessments to be combined into a single chronological record of care. This record can be referenced in the field and delivered in real time to a supporting physician or receiving hospital. Aspects of the present disclosure include real-time delivery of contextual information streams, hands-free acquisition of patient information, and a rule-based smart alarm system that avoids alarm fatigue.

[0016] In various embodiments, on-body sensors monitor and record physiological waveforms, which are aggregated on a mobile device and transmitted either in real time or periodically to a remote physician. This provides for a flexible and scalable platform for medical applications, including in telemedicine.
[0017] Various embodiments of the present disclosure seek to reduce levels of false medical monitoring alarms. In various embodiments, data may be correlated from multiple sensor streams to validate measurements made by the sensors.
Additionally, in various embodiments, systems and techniques may use prior-determined probabilities based on validated clinical data, patient history, and treatments administered to determine subsequent actions after an alarm is triggered. In various embodiments, these actions may include, but are not limited to, when to alarm, when to acquire more data, and one or more protocols to recommend to clinicians. Additionally, in various embodiments, systems and techniques may archive user responses, triggered alarms, and related data so that the decision-making process surrounding alarm triggering may be refined during post-analysis.
[0018] Additionally, various embodiments, techniques and systems may enable coordinated representation of a set of medical data by visually presenting data related to the events using two or more correlated, i.e., linked, representations. The data may include multiple continuous streams and/or discrete events from multiple sources and/or include tracking of changes of data read. In various embodiments, in order to tie all of the data and events together, multiple visualizations may be displayed, controlled, and/or reviewed with respect to a common time axis. Thus, in various embodiments, such a display will allow for time-consistent review, such that moving forward in time in one of the visualizations will scroll others to a similar time interval, providing clearer correlation of events to medical personnel.
[0019] Also, in various embodiments of the present disclosure, systems and techniques may enable a user, such as medical field personnel, to use vocal expressions to input data to the system to capture and/or categorize the nature of the care being provided (such as, for example, drugs administered, procedures being performed, physical observations, patient history, etc.). The captured information may be further used to annotate physiological patient data streaming from patient sensors. As such, in various embodiments a remote clinician may be provided contemporaneous and consolidated updates on patients' states.
[0020] Various aspects of the illustrative embodiments will be described using terms commonly employed by those skilled in the art to convey the substance of their work to others skilled in the art. However, it will be apparent to those skilled in the art that alternate embodiments may be practiced with only some of the described aspects. For purposes of explanation, specific numbers, materials, and configurations are set forth in order to provide a thorough understanding of the illustrative embodiments. However, it will be apparent to one skilled in the art that alternate embodiments may be practiced without the specific details. In other instances, well-known features are omitted or simplified in order not to obscure the illustrative embodiments.
[0021] Further, various operations will be described as multiple discrete operations, in turn, in a manner that is most helpful in understanding the illustrative embodiments; however, the order of description should not be construed as to imply that these operations are necessarily order dependent. In particular, these operations need not be performed in the order of presentation.
[0022] The phrase "in one embodiment" is used. Various usages of the phrase generally do not refer to the same embodiment; however, they may. Additionally, while the phrases "in various embodiments" or "in some embodiments" are used, usages of these phrases generally do not refer to identical embodiments or sets of embodiments; however, particular embodiments may be common to the referred-to sets. The terms "comprising," "having," and "including" are synonymous, unless the context dictates otherwise. The phrase "A/B" means "A or B". The phrase "A and/or B" means "(A), (B), or (A and B)". The phrase "at least one of A, B and C" means "(A), (B), (C), (A and B), (A and C), (B and C) or (A, B and C)".
Examples of Medical Alarm Control
[0023] As discussed above, in various embodiments medical monitoring alarms may be controlled so as to reduce the number of false alarms. In various embodiments, the control of alarms may include the use of one or more of data correlation, prior probability review, and archival capability.
Alarm Control Overview
[0024] Figure 1 is a block diagram that illustrates various aspects of medical alarm control according to various embodiments. As illustrated, in various embodiments, sensor data may flow from sensors at block 3, along with profile data at block 2, to a computing device at block 1. In various embodiments, profile data, such as that in block 2, may include patient demographics and/or history, medical protocols, drugs and procedures that have been administered to the patient, as well as sensor data which have previously been validated. In various embodiments, the computing device at block 1 may be a decision engine and may apply rules, which may be pre-defined by medical personnel, to determine an action to take based on the sensor data. For example, the action determined by the rule may be to identify and/or validate new facts and store them in the profile data at block 2, to determine additional sensor measurements, which may be taken at block 6, and/or to determine alarms that may be issued. In various embodiments, for alarms, a selector (such as at block 7) may determine which actuation mechanism (illustrated at block 10) will be most appropriate and effective. The user may then respond to alarms at block 8. In various embodiments, actions that have occurred, such as, for example, the sensing of data, the making of decisions, initiated alarms, and user responses, may be stored at block 9. Prior probabilities, as will be described below, may be injected into the decision engine at block 5 based on profile data from block 2. Finally, software upgrades may be applied via block 11 to the prior probabilities and the decision engine rules. Various aspects of the actions described above are described below with reference to data correlation, prior probability review, and archiving.
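To make the data flow above concrete, the following is a minimal sketch, in Python, of how a decision engine might tie these blocks together. All class, function, and field names here are illustrative assumptions, not the disclosed implementation.

```python
# Minimal sketch of the Figure 1 data flow; all names are illustrative only.

class DecisionEngine:
    """Block 1: applies pre-defined rules to sensor and profile data."""

    def __init__(self, rules, profile, thresholds, priors):
        self.rules = rules            # pre-defined by medical personnel
        self.profile = profile        # block 2: demographics, history, validated data
        self.thresholds = thresholds  # block 4: alarm trigger settings
        self.priors = priors          # block 5: prior-determined probabilities

    def evaluate(self, sensor_data):
        """Apply each rule to incoming sensor data (block 3) and collect actions."""
        actions = []
        for rule in self.rules:
            result = rule(sensor_data, self.profile, self.thresholds, self.priors)
            if result:
                actions.append(result)  # e.g. alarm, acquire more data, store a fact
        return actions
```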
Examples of Usage of Data Correlation
[0025] In various embodiments, systems and techniques described herein utilize correlated data from multiple sensor inputs to determine an action to take, such as (a) when to alarm, and (b) when to acquire more data. This data correlation may be directed through the use of one or more pre-determined rules. A rule may correlate sensor data from a first sensor measuring a first parameter of the patient with sensor data from a second sensor measuring a second parameter of the patient, where the first parameter is different from the second parameter. That is, the first sensor and the second sensor may be different types of sensors that measure different characteristics of the patient. For example, if medical personnel seek to validate that a complete set of ECG leads have been properly placed on a patient, sensor data from the ECG monitor and plethysmography sensors may be correlated using the following rule, presented here in vernacular language:
[0026] "If heart rate from ECG monitor is too low or too high: check
heart rate based on plethysmography waveform. If heart rate based on plethysmography is normal and 02 sat >92%, do not alarm. But if plethysmography signal is clear and the heart rate based on
plethysmography does not match the heart rate from ECG, signal that there is an artifact so paramedics can adjust leads"
[0027] As the example rule illustrates, in various embodiments rules may have different outcomes. For example, the rule recited above both determines whether or not to trigger an alarm and records validated sensor data.
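A minimal sketch of this rule in Python follows. The heart-rate bounds, mismatch tolerance, and field names are illustrative assumptions not given in the text.

```python
# Sketch of the lead-placement validation rule quoted above; thresholds and
# field names are assumed for illustration only.

HR_LOW, HR_HIGH = 50, 120   # assumed "too low / too high" heart-rate bounds
HR_MISMATCH = 15            # assumed mismatch tolerance, beats per minute

def ecg_pleth_rule(data):
    """Correlate ECG heart rate with the plethysmography-derived heart rate."""
    ecg_hr, pleth_hr = data["ecg_hr"], data["pleth_hr"]
    if HR_LOW <= ecg_hr <= HR_HIGH:
        return []                                  # ECG rate normal: no action
    actions = []
    if HR_LOW <= pleth_hr <= HR_HIGH and data["spo2"] > 92:
        actions.append("do not alarm")             # corroborated as normal
    if data["pleth_clear"] and abs(pleth_hr - ecg_hr) > HR_MISMATCH:
        actions.append("signal artifact: adjust ECG leads")
    if not actions:
        actions.append("alarm: abnormal heart rate")
    return actions

# e.g. a detached lead yielding a spuriously low ECG rate:
print(ecg_pleth_rule({"ecg_hr": 30, "pleth_hr": 72,
                      "spo2": 97, "pleth_clear": True}))
# -> ['do not alarm', 'signal artifact: adjust ECG leads']
```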
Examples of Usage of Prior-Determined Probabilities
[0028] The systems and techniques may also leverage prior probabilities, such as at block 5, to determine an action to take, such as (a) when to alarm, (b) when to acquire more data, and (c) what protocol to suggest to the clinician. This provides several advantages over systems that treat facts as being either true or false, e.g., binary triggering. Binary triggering may produce an incorrect assessment of facts that may contribute to false alarms. In various embodiments of the current disclosure, prior-determined probabilities, such as those based on profile data related to the patient, such as validated clinical data, history, and treatments administered, may be assigned to trigger conditions and may be revised as new profile data and/or sensor data emerge. Using these probabilities may make the system adaptable to changing conditions. For example (see also the sketch following these examples):
[0029] a) Determining when to alarm: In various embodiments, which may be separate from the data correlation examples described above, the recent history of one or more alarm conditions may be used to alter the assumed probability that an alarm condition from another sensor is valid and clinically important. For example, if a patient's blood pressure is low, a system may set a higher probability that an abnormal heart rate or a low SP02 is real and could be more likely to alarm when a low SP02 is detected.
[0030] b) Determining when to acquire more data: In various embodiments, prior-determined probabilities may be used to determine that, based on an alarm trigger, more data should be acquired. For example, an abnormal heart rate or cardiac rhythm may trigger an automatic 12-lead ECG.
[0031] c) Determining which protocol to recommend to a caregiver, such as a clinician: In various embodiments, prior-determined probabilities may be used to determine recommended medical protocols. For example, if an observed cardiac rhythm is ventricular tachycardia, the clinician may be recommended to check blood pressure, PPG waveform, and SP02. Then, if BP is normal, the PPG waveform is normal, and SP02 is within acceptable range, the system may alarm and recommend drug treatment. Alternatively, if BP is dangerously low, PPG waveform is abnormal, or SP02 is low, the system may charge an associated defibrillator and recommend that the clinician consider shocking the patient.
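To make example (a) above concrete, the following sketch revises an assumed prior probability when a corroborating low-blood-pressure condition is present. The doubling factor and decision threshold are illustrative assumptions, not values from the disclosure.

```python
# Sketch of example (a): a recent low-blood-pressure condition raises the
# assumed probability that a low-SP02 alarm is valid. Numbers are assumptions.

def revised_probability(prior, recent_conditions):
    """Revise the prior probability that a low-SP02 alarm condition is real."""
    if "low_bp" in recent_conditions:
        return min(1.0, prior * 2.0)   # corroborating condition found
    return prior

ALARM_THRESHOLD = 0.6                  # assumed decision threshold

p = revised_probability(prior=0.4, recent_conditions={"low_bp"})
if p >= ALARM_THRESHOLD:
    print(f"alarm: low SP02 (validity probability {p:.2f})")   # prints 0.80
```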
Examples of Archival Capabilities
[0032] In various embodiments, data related to triggered alarms, facts and conditions which caused the alarms to trigger, rules in the computing device that were used during decision-making, and/or user response/feedback may be archived, i.e., stored, in a persistent store in the system, such as at block 9 in Figure 1. In some embodiments, user response(s) may cause decision rules and prior probabilities to be adjusted. This adjustment, according to various embodiments, may be made either in real time by the system or at a later time for a next firmware or software release. For example, if a specific alarm has triggered multiple times, and each time a user of the system responds by silencing the alarm, the system may choose to archive the user response. Using the archived responses, a manufacturer or other system maintainer may look for conditions that caused the false alarm to trigger and modify firmware or software to minimize future occurrences. One example of these software upgrades is illustrated at block 11 of Figure 1. Alternatively, the system may automatically modify the rules and/or prior probability data based on the user response data.
Examples of Decision-Making Details
[0033] Figure 1 illustrates a decision engine, at block 1, which may be a computing device. In various embodiments, the decision engine contains rules which may be used to determine actions to be taken. For example, a rule may evaluate alarm trigger conditions. The engine may make decisions by evaluating input from sources such as: sensor data from one or more sensors (illustrated at block 3), profile data, such as patient information and patient history (illustrated at block 2), threshold settings for sensor data (illustrated at block 4), and prior-determined probabilities (illustrated at block 5). In various embodiments, the decision engine may compare sensor data against a set of rules, which comprise embedded medical knowledge, so that the decision engine may determine when to alarm. Prior probabilities are used by these rules to determine whether an alarm or other action is required and, if so, of what severity.
[0034] In various embodiments, actions determined from the decision engine may include: alarms triggered from evaluating one or more inputs, acquiring additional data such as retaking a blood pressure measurement if it is low, and feedback related to new facts identified from validated sensor data.
[0035] In various embodiments, the decisions made may be represented as decision-making code. For example, a rule determining alarms related to low blood pressure may comprise the following:
[Rule listing reproduced in the original only as an image (Figure imgf000012_0001).]
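Because the listing above survives only as an image in this text, it cannot be reproduced. The following Python sketch is a hypothetical reconstruction consistent with the surrounding description (a low blood pressure reading is retaken before alarming, per paragraph [0034]); the threshold value is an assumption.

```python
# Hypothetical reconstruction of a low-blood-pressure rule; the threshold
# is an illustrative assumption, not a value from the disclosure.

BP_SYS_LOW = 90   # assumed systolic low threshold, mmHg

def low_bp_rule(systolic, confirmed):
    if systolic >= BP_SYS_LOW:
        return None
    if not confirmed:
        return "action: retake blood pressure measurement"
    return "alarm: low blood pressure confirmed"

print(low_bp_rule(82, confirmed=False))   # first low reading: re-measure
print(low_bp_rule(84, confirmed=True))    # repeat reading still low: alarm
```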
[0036] A rule relating to alarms for heart rate, in contrast, may comprise one or more of the following:
[Rule listings reproduced in the original only as images (Figures imgf000013_0001 through imgf000015_0001).]
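As with the blood-pressure rule, these listings survive only as images. The sketch below is a hypothetical reconstruction consistent with paragraphs [0030] and [0031]: an abnormal rate or rhythm triggers an automatic 12-lead ECG, and ventricular tachycardia additionally prompts checks of BP, PPG waveform, and SP02. The rate bounds and rhythm labels are assumptions.

```python
# Hypothetical reconstruction of a heart-rate rule; bounds and labels are
# illustrative assumptions only.

def heart_rate_rule(hr, rhythm):
    if 50 <= hr <= 120 and rhythm == "sinus":
        return []
    actions = ["action: acquire automatic 12-lead ECG"]
    if rhythm == "ventricular tachycardia":
        actions.append("recommend: check BP, PPG waveform, and SP02")
    return actions

print(heart_rate_rule(165, "ventricular tachycardia"))
```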
Examples of Further Architectural Details
[0037] Figure 1 also illustrates profile data related to the patient at block 2. In various embodiments, the profile data may be fed into the decision engine of block 1 as a part of the decision-making process. Prior probabilities, such as at block 5, may be continuously updated as new profile data emerge.
[0038] In various embodiments, profile data may include information such as patient demographics (such as gender, age, chief complaint, patient history, etc.), medical protocols (for example, suggested procedures to be followed to treat a patient), drugs administered to the patient during treatment, procedures administered to the patient (for example, CPR or defibrillation), and validated sensor data. In various embodiments, the validated sensor data may be distinct from raw sensor data received by the system. For example, plethysmography data read from a sensor is raw data representing the SP02 waveform. This waveform may be analyzed to derive a pulse rate, and this pulse rate may be validated against an actual, observed pulse rate. This validated data may subsequently be used by the decision engine for future decisions. In some embodiments, the data may be validated through correlation of data from multiple sensors, as described herein.
[0039] Figure 1 also illustrates sensors at block 3. In various embodiments, the sensors may be medical sensors that are operatively coupled to a patient to monitor the patient. Examples include ECG sensors to monitor the heart, SP02 sensors to monitor blood oxygenation, blood pressure sensors, ETC02 sensors to monitor breathing, etc. In various embodiments, sensors may be wired or wireless and the data coming in may be scalar (e.g., blood pressure) or streaming (e.g. cardiac rhythm). In various embodiments, sensor data may be input to the computing device, e.g., decision engine.
[0040] Figure 1 illustrates thresholds at block 4. In various embodiments, these represent threshold settings for actions such as alarm triggers. Various settings may be pre-programmed or defined by a user. For example, an alarm may trigger if an observed actual blood pressure reading is below or above determined low/high thresholds for BP. In various embodiments, normal values for sensor data may vary by age. Threshold settings may thus be adjusted according to profile data, such as age of the patient.
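As one illustration of age-adjusted thresholds, the following sketch selects assumed heart-rate alarm bounds from the patient's age in the profile data; the age bands and values are illustrative assumptions, not clinical guidance.

```python
# Sketch of age-adjusted alarm thresholds; bands and values are assumed.

def heart_rate_bounds(age_years):
    """Return assumed (low, high) heart-rate alarm bounds for the patient's age."""
    if age_years < 1:
        return (100, 160)   # infant
    if age_years < 12:
        return (70, 140)    # child
    return (50, 120)        # adolescent / adult

low, high = heart_rate_bounds(65)   # -> (50, 120)
```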
[0041] Figure 1 illustrates prior-determined probabilities at block 5. In various embodiments, this illustrated component may use prior-determined probabilities to assess conditions that could lead to a triggering of an alarm. In some embodiments, as new profile data emerge, such as from block 2, the prior-determined probabilities may be adjusted to improve the decision-making process. These probabilities may be fed into the decision engine as inputs to the rules.
[0042] Figure 1 illustrates "actions" at block 6. In various embodiments, this component may effectuate some or all actions triggered by the decision engine. For example, if the decision engine comprises a rule to take a 12-lead ECG reading if blood pressure is low, the action component may send a command to the ECG sensor to take the ECG reading.
[0043] Figure 1 illustrates a selector at block 7. In various embodiments, when alarms are triggered by the decision engine, a user may be notified of an alarm in various ways. For example, embodiments may alarm visually by flashing readings on the display, play audible beeps, provide alarm voice playback, etc. The selector component may be used to select the appropriate method of actuating alarms triggered by the decision engine, such as at block 10. In various embodiments, the selector may also use knowledge of user response, such as at block 8, to decide on an actuation method.
[0044] Figure 1 illustrates a user response component at block 8. In various embodiments, this component may accept, record, and act on one or more user responses to alarms. Such responses may include a user acknowledging alarms, silencing specific alarms, re-activating alarms, etc. User response may also include feedback from a user to let the system know the user's impression of the alarm. Such feedback may include indications such as: "this alarm is real," "this is a false alarm, do not trigger," and/or "this is a critical alarm, always trigger."
[0045] In various embodiments, user response may feed into the selector component, such as at block 7, to select an appropriate way to notify a user. For instance, if a user has silenced the BP alarm, the selector would not use audio to notify the user of a BP alarm. User feedback about false alarms, for example, may also be sent to the decision engine to modify its decision making process appropriately. Finally, user feedback may be recorded in a persistent store, such as at block 9, to enable future adjustment of processes and techniques as described herein.
[0046] Figure 1 illustrates a persistent store at block 9. In various embodiments, this component may store a record of alarms which have been triggered, user response to each of the triggered alarms, when applicable, and the actuating mechanism which was used to notify the user for each alarm triggered. Other relevant information may also be archived. For example, a snapshot of sensor data at a time an alarm was triggered may be archived; the archived data may include a determined amount of time before/after the moment the alarm was triggered. In another example, embodiments may archive rules and/or decisions that caused an alarm to trigger and/or user response and facts known at the time of the triggering. In some embodiments, the records stored may be used by a manufacturer or other maintainer, for example to improve a rule base in the decision engine to make better decisions in future revisions of the software or firmware.
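The following sketch illustrates the kind of record such a persistent store might keep per triggered alarm; the schema and field names are assumptions, not the disclosed format.

```python
# Sketch of a per-alarm archive record for block 9; field names are assumed.

import time

def archive_alarm(store, alarm_id, rule, facts, response, actuator, snapshot):
    store.append({
        "timestamp": time.time(),
        "alarm_id": alarm_id,
        "rule": rule,                 # rule/decision that caused the trigger
        "facts": facts,               # facts known at the time of triggering
        "user_response": response,    # e.g. "silenced", "acknowledged"
        "actuator": actuator,         # notification mechanism that was used
        "sensor_snapshot": snapshot,  # data around the moment of the trigger
    })

store = []
archive_alarm(store, "low_spo2", "spo2_rule", {"bp": "low"},
              "silenced", "audio beep", {"spo2": [91, 90, 89]})
```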
[0047] Figure 1 also illustrates actuators at block 10. In various embodiments, these illustrated actuators represent different mechanisms by which alarms may be effected. For example, alarms may be rendered, according to various embodiments, on the display through flashing icons, by displaying sensor readings in different colors, by using audible beeps or a voice playback to notify the user about a specific alarm, and/or by other techniques such as flashing LEDs on the display.
[0048] Figure 1 also illustrates a software upgrade block at block 11. In various embodiments, this component may be used by a device manufacturer to upgrade software and/or firmware to include refinements and/or improvements to the decision-making process. Upgrades may include, for example, revisions to rules in the decision engine and/or revisions to prior-determined probability tables.
Examples of Alarm Interaction
[0049] Figures 2 and 3 show example screenshots illustrating one implementation of a graphical application operating according to the techniques and systems described herein. Figure 2 illustrates a screenshot from a Current Vitals tab 202 in an application. As illustrated, Figure 2 shows vital signs as an ECG waveform 204 and an SP02 waveform 206, and as scalar data for cardiac rhythm 208, SP02 210, and ETC02 212. Alarms 214 and 216 have been triggered for low SP02 and pulse readings, respectively.
[0050] Generally, the illustrated screen shot of Figure 2 is divided into three regions: a top region 218, a middle region 220, and a bottom region 222. In various embodiments, the top region 218 and bottom region 222 may be consistent and may be kept visible. The top region 218, as illustrated, contains critical patient information, including patient identifiers and chief complaints (which are often used to identify patients, e.g., "65 year old man with chest pain"), current vital sign information, and alarm status. The middle region may contain navigation tabs. The first three tabs— Current Vitals 202, Historical Data 224, and 12-lead ECG 226— may display physiological sensor data. Of the remaining tabs, Notes 228, Imaging 230, and Report 232 may provide additional access to the patient record, while Personnel 234, Alarms 236, and Protocols 238 may be administrative in nature. The bottom region 222 may provide visual feedback for the audio context system, as described herein, and may include the current voice-to-text translation and system state.
[0051] Figure 3 illustrates a screenshot from an Alarms tab 336 in an application. As illustrated, Figure 3 shows that a user may be allowed to set alarm thresholds 340, such as, for example, thresholds for high and low blood pressure 342 and end tidal C02 344. Additionally, in the illustrated implementation, an alarm history 346 of activation and deactivation of various triggered alarms is also shown.
Examples of Coordinated Medical Data Presentation
[0052] As discussed above, in various embodiments, the systems and techniques described herein may provide a facility for visualizing medical data from multiple sources in a coordinated representation. In various embodiments, data is correlated and represented using time as a common axis. This may include continuous data (from multiple sources), discrete events which occur contemporaneously, and tracking of changes in data read. By ordering these events and displaying them in a coordinated fashion, medical personnel may be better able to get a complete picture of a patient's status at a particular point in time, which can assist in making patient-care decisions.
[0053] In various embodiments, data may be acquired in various formats. For example, data may include:
[0054] 1) Sampled continuous data, such as from a physical sensor. Such data may be presented as a continuous plot versus time, such as ECG, SP02 waveform, etc. In other embodiments of the data visualization techniques described herein, data may represent non-medical measurements, such as, for example, the speed of a flying object or bandwidth usage on a network link.

[0055] 2) Sampled data that is acquired on a less-frequent basis. Such data may be, in one embodiment, displayed using a ticker with changing text. Examples of data sampled on a non-continuous basis include stock prices, gas counters at gas pumps, and/or news headlines.

[0056] 3) Human-generated events. In various embodiments, these events happen at a low frequency, and may happen at a lower frequency than the non-continuous sampled data discussed above. These data may also be grouped in a visualization by virtue of being human-generated, thus giving a consistent record of human-produced activities. Examples of such data include a person logging into a machine, or someone activating a feature using voice.

[0057] 4) Process-generated events. In various embodiments, the techniques may visualize events that are initiated by the execution of a process. These events may read data that falls under various event categories discussed herein, including, in some embodiments, other process-generated data, and produce log events. For example, a system may include a security flag to be logged if a person tries to log in 20 times with a wrong password on multiple machines inside a company within 10 minutes, or may record an alarm generated due to a patient's heart rate suddenly dropping sharply.
[0058] Figure 4 illustrates a block diagram of one implementation of a system 400 for coordinated visualization of data from multiple sources. As illustrated, sensor data 402 may be obtained from a plurality of sensors, as described above. The sensor data 402, in whole or in part, may be stored on a database or server 404, as illustrated. The system 400 may attach a time stamp to the sensor data 402. Accordingly, in some embodiments, when visualized (e.g., by visual representations 406, 408, 410, and/or 412), the data may not be currently-observed data, but may instead be historical data which has been saved on the database or server. In some embodiments, the system may associate a portion of the data having time stamps within a time interval as treatment session data to be played back later. In this manner, a user of these techniques may play back a non-current session, including respective timestamps, using data fed from the database 404 instead of the sensors. Additionally, some of the visualized data may come from one or more processes 414 which may operate on the sensor data 402 before generating data to visualize, as illustrated.
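A minimal sketch of this time-stamping and session-playback idea follows; the function names and record layout are illustrative assumptions.

```python
# Sketch of time-stamping samples and grouping them into a treatment
# session for later playback; names are assumed for illustration.

import time

def stamp(sample):
    """Attach a time stamp to an incoming sensor sample."""
    return {"t": time.time(), **sample}

def session_data(samples, t_start, t_end):
    """Select stored samples whose time stamps fall within a session interval."""
    return [s for s in samples if t_start <= s["t"] <= t_end]
```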
[0059] In one embodiment, the systems and techniques described herein may be implemented in an emergency medicine usage model. This implementation may present data coming from different sources and show the data to a user in one combined view using a multitude of tools, tying them together using time as a common axis. Figure 5 illustrates an example screenshot 500 of such an implementation.
[0060] In various embodiments, data may be visualized according to different visualization techniques. Such visualization techniques may include continuous data plots 502, one or more visual timelines 504 representing a treatment session, in whole or in part, one or more textual charts 506, and one or more text tickers 508 with changing text. For example, in the screenshot 500 illustrated in Figure 5, the system may present data to the user using a plurality of visualization techniques:
[0061] In a first technique, coordination may start by capturing continuous data coming from several physiological sensors, including ECG, ETC02, and SP02, and presenting it to the user as plots and/or waveforms. The data may also include discrete values of other sensors, such as, for example, blood pressure. These are represented in the example of Figure 5 as the continuous graphs 502 in the mid-left area of the screenshot, as well as the ticker 508 which runs near the bottom of the screen shot, to the left of the illustrated buttons.
[0062] In a second technique, an annotated flowchart may be shown comprising multiple annotations. In various embodiments, the annotations correspond to events such as voice inputs (for example, as described below), or alarms created by the values of physiological data. In some embodiments, the annotations may also include events, for example taking specific measurements such as blood pressure. In the example of Figure 5, the annotations are shown in text chart 506 on the right side of the screenshot, where time, category, and detail are shown for each annotation.
[0063] In a third technique, data may be presented as a timeline. In various embodiments, the left hand side of the timeline represents the start time of monitoring and the right hand side represents a most recent time. In one embodiment, this most recent time comprises a current time if the session is still active; in another, if the data-gathering session has ended, the most recent time comprises an end time. In various implementations, colored markers may be added to the timeline as visual clues to indicate major events, such as, for example, alarms or voice inputs. In the example screenshot, a timeline 504 is shown near the bottom of the screen, representing a timeline between 13:37:59 and 13:46:11. Additionally, in the illustrated implementation, the screenshot includes a button 510 for bringing the system up to a live timepoint.
[0064] In some embodiments, all three of the visualization techniques described above may be linked together using time, e.g., may have a common time axis. Thus, in some embodiments, if a user scrolls through one of the three visualizations, clicks on a specific item in the chart or clicks on a specific point in time on the timeline, the other two representations may scroll accordingly to the same time interval, allowing the user to make the visual correlation of all events.
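One way such linking could be implemented is with a shared time cursor that notifies every registered view, as in the following sketch; the observer-style classes are illustrative assumptions, not the disclosed implementation.

```python
# Sketch of linking several visualizations to a common time axis.

class TimeAxis:
    """Shared time cursor: moving it scrolls every registered view."""
    def __init__(self):
        self.views = []

    def register(self, view):
        self.views.append(view)

    def seek(self, t):
        for view in self.views:
            view.scroll_to(t)        # each view redraws around time t

class WaveformPlot:
    def scroll_to(self, t):
        print(f"waveform centered on t={t}")

class EventChart:
    def scroll_to(self, t):
        print(f"chart highlighting events near t={t}")

axis = TimeAxis()
axis.register(WaveformPlot())
axis.register(EventChart())
axis.seek(30.0)   # e.g. clicking 30 s into the timeline scrolls both views
```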
Examples of Voice Control of Medical Care
[0065] As discussed above, in various embodiments, systems and techniques may provide for user input to capture and categorize medical information and direct care, and in particular to be used in telemedicine. In particular, embodiments may be directed to the creation and processing of voice annotations, which record medical information, such as decisions, drugs, procedures and the like, or which cause events to occur. Embodiments may also provide for text annotations. Additionally, some embodiments may convert the voice annotations to text.
[0066] Figure 6 is a block diagram which illustrates embodiments of a voice control system 600 for integrating voice control into medical decision-making. Aspects of the figure, systems, and techniques will be described with reference to five elements:
[0067] 1) Activation and deactivation of audio recording during medical treatment to capture, store, and analyze pertinent audio data (shown as functional block 620 in Figure 6).

[0068] 2) Categorization of annotations into well-defined and meaningful categories in medicine, and in particular in telemedicine (shown as functional block 622 in Figure 6).

[0069] 3) Capture of annotation data in audio and text format by using voice-to-text conversion (shown as functional block 624 in Figure 6).

[0070] 4) Creation of time-stamped final annotations that are stored, such as by a database, for access by the remote system at the consulting clinician's facility (shown as functional block 626 in Figure 6).

[0071] 5) Providing feedback to a user on a current operating phase of the system (shown as functional block 628 in Figure 6).
[0072] In various embodiments, the systems and techniques described herein compose these elements into a mobile platform for use by a health practitioner in the field. As such, embodiments may enable context aspects sought to be explained during treatment, known colloquially as "who, what, when, how, and why" to be more easily captured in the field.
Examples of Voice Activation and Deactivation
[0073] As Figure 6 illustrates, various embodiments of voice control system 600 include aspects relating to determining when a speech recognizer may start listening for annotations during treatment. In various embodiments, this determination may be made according to different mechanisms. For example, such mechanisms may include:
[0074] 1) Voice-recognition-based activation and deactivation, such as at block 602, may be used. For example, a user may say "Start annotation." Following this utterance, a speech recognition engine may recognize this command and cause the system to enter a listening mode. In another example, when a user says "Stop annotation," the system may turn off the listening mode.

[0075] 2) A hardware push-button accessible to the user, such as at block 603. While in some embodiments this push-button may be located on a device implementing the techniques discussed herein, in some embodiments the push-button may be worn by a user, such as, for example, on a ring or a wrist band. In various embodiments, the push-button may be wirelessly connected to the system and, upon activation, cause the system to toggle between start/stop listening modes.

[0076] 3) A software push-button, such as at block 604. In various embodiments, the systems and techniques may display a software-based push-button on an associated display. When this push-button is activated, the system may toggle between start/stop listening modes.
Examples of Annotation Categorization
[0077] Also illustrated in block 622 of Figure 6 are aspects related to categorizing annotations which are being captured by the voice control systems and techniques. In general, these components may determine one or more categories of an annotation to be captured, thereby capturing the "what" aspect of a treatment being provided. In various embodiments, these components may operate through the use of pre-defined categories. The categories may be stored at block 630 in Figure 6. For example, one or more of the following five categories may be defined through which annotations may be categorized.
[0078] 1) DRUG: for drugs that may be administered

[0079] 2) PROC: for procedures that may be performed

[0080] 3) PHYS: for physical exam findings

[0081] 4) HIST: for patient medical history information

[0082] 5) DEMO: for patient demographics (age, gender, chief complaint)
[0083] In the preceding example, the categories are oriented toward emergency medicine, and may be extended to telemedicine; however, in other contexts, other categories may be utilized. In some embodiments, in addition to the categories described above, other categories may be used. For example, a system may use a MEMO category for recording audio memos and/or an EVTS category, such as for voice commands as will be described below.
[0084] In various embodiments, different mechanisms may be employed for annotation categorization. For example, such mechanisms may include:
[0085] 1) Voice-recognition-based category identification, such as at block 605. For example, a user may say a category-based word, such as "Drug," before recording any drug-related annotations.

[0086] 2) Hardware push-buttons, such as at block 606. In some embodiments, two or more buttons may be used, and the combination of their activation determines an intended annotation category.

[0087] 3) Software push-buttons, such as at block 607. In various embodiments, these push-buttons may be located on a system display and there may be a separate one for each pre-defined category.
[0088] In various embodiments, voice activation and deactivation may be combined with categorization to simplify the experience for the user. Under one mechanism, activation and categorization may be voice-based. For example, a user may say "Enter Drug." If the system recognizes this command, it may simultaneously 1) cause voice capture to be activated, and 2) cause a category to be identified as "DRUG." Similarly, terms like "Enter Procedure," "Enter Physical," "Enter History," and "Enter Patient" may be used as keywords for contemporaneous voice activation and categorization of annotations. In another mechanism, hardware or software push-buttons may be used to cause both activation and categorization. In some embodiments which use push-button-based activation and voice-based categorization, the components may be separated.
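A minimal sketch of this combined voice activation and categorization follows; the keyword table and state dictionary are illustrative assumptions.

```python
# Sketch of combined activation + categorization per paragraph [0088];
# the keyword mapping is assumed for illustration only.

ACTIVATION_KEYWORDS = {
    "enter drug": "DRUG",
    "enter procedure": "PROC",
    "enter physical": "PHYS",
    "enter history": "HIST",
    "enter patient": "DEMO",
}

def handle_utterance(text, state):
    """If an activation keyword is recognized, start listening in that category."""
    phrase = text.strip().lower()
    category = ACTIVATION_KEYWORDS.get(phrase)
    if category:
        state["listening"] = True
        state["category"] = category
    elif phrase == "stop annotation":
        state["listening"] = False
    return state

state = handle_utterance("Enter Drug", {"listening": False, "category": None})
# state -> {"listening": True, "category": "DRUG"}
```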
Examples of Annotation Capture
[0089] Figure 6 also illustrates, at block 624, components directed to capturing raw audio annotations by recording audio between, for example, voice activation and deactivation. This recording may be deactivated, according to various embodiments, through a deactivation mechanism as described above, or via a timeout mechanism, such as at block 612. In some embodiments, this raw audio may be further analyzed by a speech recognition (SR) engine to convert the annotation to text. In some embodiments, text annotation may be performed on a database server, such as is illustrated in Figure 6, particularly if system performance is an issue. In some embodiments, in addition to recording audio and text annotations, time stamps are also attached to, i.e., associated with, the annotations, such as at block 608.
[0090] In various embodiments, speech recognition accuracy may be increased and latency decreased by defining one or more category-specific limited grammars or vocabularies (such as block 632 in Figure 6) that are activated based upon a category identified during a categorization stage. In some embodiments, a limited vocabulary may be defined, such as for categories like DRUG, PROC, PHYS, and DEMO discussed above, based on the feedback from actual users, such as paramedics and doctors involved in emergency medicine. While such a vocabulary may cover only specific medical scenarios under which it was created, such a vocabulary can be easily extended. In some embodiments, a category such as HIST discussed above may take the form of a large dictation vocabulary because of the open-ended nature of the annotations in such a category.
[0091] Examples of the limited grammar/vocabulary include:
[0092] DRUG category: (drugs administered)

[0093] Grammar: <DrugName> <DrugDose> <DrugUnit> <DrugRoute>

[0094] e.g., epinephrine 1 mg IV

[0095] PROC category: (procedures performed)

[0096] Grammar: <DefibProcedureName> <Intensity> joules <Phase>

[0097] e.g., start CPR; defibrillation 200 joules biphasic

[0098] DEMO category: (patient information)

[0099] Grammar: <Age> year old <Gender> <Complaint>

[00100] e.g., 60 year old male with chest pain

[00101] PHYS category: (physical findings)

[00102] e.g., breath sounds clear and equal bilateral; no pulse with CPR

[00103] HIST category: (patient history)

[00104] e.g., patient is diabetic and has high blood pressure; patient is allergic to xyz
[00105] As discussed above, such annotations may capture different aspects of care being provided. Thus, for example, a "what" context may be provided from the DRUG/PROC annotations, a "why" context from the PHYS/HIST annotations, a "how" context from the DRUG/PROC annotation, a "when" context from the recorded timestamp of the annotation, and a "who" from logged user information, such as at block 601.
[00106] In some embodiments, the grammar described above may be extended to cover situations in which annotations are recorded later than the actual time care was given. For example, the grammar may be extended to accept an annotation of "2 minutes ago" for a procedure. In some embodiments, extensions may allow for a scenario in which a secondary user who is different from the person doing annotations, such as secondary field personnel, is actually carrying out the care. In one example, such an extension may take the form of <what> ["at" <when>] ["by" <who>]. Thus, an annotation may take the form of "[PROC] intubated at 3.15 by Joe."
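As an illustration, the following sketch parses a DRUG annotation against the grammar above, including the optional "at <when>" and "by <who>" extension. The regular expression is an assumption for illustration, not the grammar actually used by the SR engine.

```python
# Sketch of parsing <DrugName> <DrugDose> <DrugUnit> <DrugRoute>
# [at <when>] [by <who>]; the pattern is an illustrative assumption.

import re

DRUG_PATTERN = re.compile(
    r"(?P<name>\w+)\s+(?P<dose>[\d.]+)\s+(?P<unit>\w+)\s+(?P<route>\w+)"
    r"(?:\s+at\s+(?P<when>\S+))?(?:\s+by\s+(?P<who>\w+))?"
)

def parse_drug_annotation(text):
    match = DRUG_PATTERN.fullmatch(text.strip())
    return match.groupdict() if match else None

print(parse_drug_annotation("epinephrine 1 mg IV"))
# -> {'name': 'epinephrine', 'dose': '1', 'unit': 'mg', 'route': 'IV',
#     'when': None, 'who': None}
```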
Examples of Annotation Creation and Action
[00107] As illustrated in Figure 6, embodiments of the system 600 and techniques may include a block 626 with components (such as block 613) that combine the information of a captured annotation, such as what, when, how, why, and who, and store it into a database 634. The text annotations may be further analyzed by one or more understanding components, such as at block 615, which may enter information into a report and/or provide context to other system components. For example, the understanding component may cause a certain action (block 614) to be performed, such as parsing patient information (e.g., DEMO) to fill in age, gender, and/or complaint fields of a report and also displaying such information on a display 636 for a user. In another example, an understanding component may recognize a CPR procedure annotation "start CPR / stop CPR" such that the component may start or stop a CPR timer.
[00108] In addition to the annotations, various embodiments may enable usage of audio to give commands to the system to perform certain actions to control the system. For example, the system may recognize commands to navigate to certain GUI screens, to control alarm activation and/or silencing, and to initiate pre-defined sensor readings. For example a user may say "take BP" to take a blood pressure reading or "take 12 lead" to take an ECG reading. In some embodiments, this mode may utilize a pre-defined command category, such as that discussed above.
[00109] In various embodiments, along with annotations which were created, the database 634 of the system may also contain time-stamped sensor data from physiological sensors that have been connected to monitor a patient. This sensor data may also account partly for the "why" aspect of the patient treatment. A remote system, for example at a consulting clinician's facility, may pull data, along with the annotations, and present consolidated and contemporaneous information on a particular patient treatment. The consulting clinician can thus provide more effective guidance to the local medical personnel without the need for lengthy and potentially incomplete conversations with the local personnel.

Examples of Annotation State Feedback
[00110] As illustrated at Figure 6, in various embodiments the system 600 and techniques may utilize a feedback component 628 which makes a user aware of a state of the system. Such feedback may include, for example, whether voice control/input has been activated (block 609) and categorization has been successful, whether audio is being recorded and audio annotation capture is in progress, when speech-to-text conversion for text annotation is successful (block 610), and when audio recording is complete (block 611). In some embodiments, given the possibility of errors in speech recognition, this type of feedback is useful in order for the user to know whether they need to repeat any processing stage.
[00111] According to various embodiments, different feedback mechanisms may be used. Two examples amenable to mobile telemedicine and emergency medicine scenarios are described herein. Either of these mechanisms may make it possible for feedback to be part of the user's environment and enable the user to be aware of an annotation state while allowing the user to continue to perform other activities.
[00112] A first mechanism 638 includes feedback using a plurality of differently colored LEDs mounted on a mobile device implementing the techniques or on safety glasses of a medical care-giver. Any combination of colors of LEDs and associated meanings may be used. For example, in one implementation, four colored LEDs may be used (blue, yellow, green, and red) with the following associated meanings:
[00113] blue: the audio system is ready

[00114] yellow: voice activation and categorization have succeeded

[00115] green: the SR engine has succeeded in converting the annotation to text

[00116] red: the SR engine could not convert the annotation to text
[00117] A second mechanism 640 utilizes audio feedback with distinct sounds to identify a state. For example, the system may use one distinct sound to indicate that voice activation and categorization has succeeded and a second distinct sound to indicate that the voice recognition has been deactivated.

Examples of Use
[00118] Implementations of the systems and techniques described herein were deployed in a simulation center. Thirteen paramedics and six physicians participated in a study utilizing a research prototype to evaluate the features of this technology in three simulated emergency care scenarios: chest pain, blunt trauma, and cardiac arrest. The medical goal was to evaluate via observational study the value and viability of system features, including electronic record keeping of vital signs' sensor data, the ability to view and playback historical data, voice interaction, smart alarms, replication and streaming, and use of wireless technologies for sensor data acquisition.
[00119] Emergency rescue scenarios were simulated using a manikin outfitted with sensors, such as a 3-lead ECG. The manikin's physiological signs were changed in response to care given by trainees. Additionally, software-based simulated sensors were used as a supplement for physiological data that the manikin could not simulate. As such, simulated data was received by the aggregator tablet in a similar manner to the real sensors. Data from the tablet was streamed live via an 802.11 LAN to a remote console where emergency physicians could view it in near real time and interact with the rescue team by telephone.
[00120] Three standardized patient scenarios were executed in 13 simulations utilizing 13 different paramedics and 6 physicians. Both physicians and paramedics were given training in advance of the simulations. Paramedic training included focus on the voice input system and grammars.
[00121] In each simulated scenario, one paramedic used an audio headset to create audio annotations. Prior to the sessions, the paramedics underwent an hour-long voice training session. The training created a voice model for the speaker, and provided hands-on experience for the user in system usage and grammar/vocabulary. During each simulation session, paramedics created audio annotations, which were recorded in the database along with the textual translations. The sessions were also video-taped for additional ground truth.
[00122] Recognition accuracy of voice-to-text conversion was measured by comparing audio recordings to transcribed text. An annotation was considered accurate only if the entire annotation was correctly transcribed. Accuracy of voice activation/categorization and deactivation was measured by comparing the video (ground truth) with the categorization recorded in the database. The following table summarizes the results.
[Results table reproduced in the original only as an image (Figure imgf000031_0001).]
[00123] Significant improvements in activation recognition were observed when paramedics retrained the SR engine on activation keywords immediately after a simulation session. Most of the failures in deactivation were due to insufficient pauses (< 250 msec) before and after the deactivation keyword. The failures on the custom grammar-based annotations were found to be due to the use of out-of-vocabulary words and insufficient pauses before/after the annotation.
[00124] To study the impact of vocabulary size and grammar on system/user performance, a subset of simulations was conducted using the large dictation SR vocabulary that, in contrast to the custom grammar, did not restrict the user's word patterns or vocabulary. In this case, accuracy was measured in terms of per-word accuracy. In addition, it was assessed whether each annotation could be correctly understood by a remote clinician, in spite of some words being incorrectly recognized. As shown above, even though word accuracy with the large vocabulary was only 73%, 87% of the annotations were still usable.
[00125] In the large vocabulary experiments, it was observed that the users were more at ease with the large vocabulary, since they did not have to remember the word patterns. However, the use of a large vocabulary increased system latency, especially when the annotations were long. On the other hand, concise annotations were found to not impact system performance, indicating the need to further train the users on system usage without restricting what they can say.
Examples of Systems
[00126] According to various embodiments, the systems and techniques described herein may be focused around an aggregator device that is associated with a given person and collects sensor data from various sensors associated with that person. The aggregator device may be a computing device, and may allow the sensor data to be processed, stored, and displayed locally, as well as delivered to remote telemedicine clients. Figure 7 presents an example of a high-level system architecture 700 according to the techniques described herein. The architecture 700 is partitioned into three parts: one or more sensors (e.g., sensors 702, 704, 706, and 708), an aggregator 710, and a back end 712. The back end 712 may include a computer 714 and/or a database server 716. In some embodiments, the aggregator 710 may be a decision engine as described above.
[00127] In some embodiments, the aggregator 710 may be implemented on a mobile platform, such as a PDA, smart phone, or tablet. The aggregator 710 may be paired with on-body or environmental sensors, from which it receives data via a Wireless Personal Area Network (WPAN). The aggregator 710 and sensors 702, 704, 706, and 708 may communicate via short-range wireless protocols such as, for example, Bluetooth. The aggregator 710 may control the sensors 702, 704, 706, and 708 and receive, process, store, and/or display sensor data. Sensor data may then be replicated to back end telemedicine clients through a wide area wireless network. The system 700 may also have associated one or more PC platforms 714 which receive, store, analyze and/or display data from each connected aggregator 710. Communication between each aggregator 710 and the back end platform 712 may take place through a storage subsystem, such as database server 716, using a database replication and data streaming architecture to synchronize data across the platforms.
[00128] Figure 8 illustrates a high-level end-to-end view of an example software architecture 800 implementing embodiments described herein. The architecture may be symmetric between the aggregator 810 and the back end 812. As illustrated the aggregator 810 may include a plurality of platforms, including a data acquisition block 820, a user interface block 822, and a storage subsystems block 824. Similarly, the back end 812 may include a data acquisition block 826, a user interface block 828, and a storage subsystems block 830.
[00129] In some embodiments, the systems of the aggregator 810 and back end 812 may be encapsulated in an application object 832 and 834, respectively. The application objects 832 and 834 manage core components, reducing the overhead if the system is to be customized to a specific application. Accordingly, for a specific application, a system designer may only need to create and manage application-specific data analysis components and a custom user interface.
[00130] On the aggregator 810, the application object 832 may manage communication with wireless sensors, such as Bluetooth™ sensors. Acquired data may be stored via the storage subsystem 824 as well as streamed to the analysis components 836 and to the user interface 822. The storage subsystem 824 may include a replication capability that is a communication mechanism between the aggregator 810 and the back end 812.
[00131] On the back end 812, the application object 834 receives data via the storage subsystem 830 and may stream it to the analysis components 838 and the back end user interface application 828. On both the aggregator 810 and back end 812, the user interface 822 and 828, respectively, may receive real-time data from the application object 832 and 834, respectively, via a streaming interface. It may also access historical data from the storage subsystem 824 and 830, respectively, via a data access layer (DAL).
[00132] As illustrated in Figure 9, in various embodiments, the application object 932 may provide functions of data acquisition. Such functions may include: 1) a publish/subscribe based reconfigurable data path 940 that may be configurable at system integration time, 2) an extensible sensing component 942 that may interface with sensors, 3) analysis components 944 for real-time data analysis, and 4) a database bridge 946 (DB) component to record incoming data streams in a local database.
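A minimal sketch of such a publish/subscribe data path follows; the broker class and topic names are illustrative assumptions, not the disclosed implementation.

```python
# Sketch of a publish/subscribe data path per paragraph [00132];
# class and topic names are assumed for illustration.

class DataPath:
    """Routes sensor data streams to subscribed components by topic."""
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, sample):
        for callback in self.subscribers.get(topic, []):
            callback(sample)

path = DataPath()
path.subscribe("ecg", lambda s: print("analysis:", s))   # analysis component
path.subscribe("ecg", lambda s: print("db bridge:", s))  # record to local DB
path.publish("ecg", {"hr": 72})
```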
[00133] The techniques and apparatuses described herein may be implemented into a system using suitable hardware and/or software to configure as desired. Figure 10 illustrates, for one embodiment, an example system 1000 comprising one or more processor(s) 1004, system control logic 1008 coupled to at least one of the processor(s) 1004, system memory 1012 coupled to system control logic 1008, nonvolatile memory (NVM)/storage 1016 coupled to system control logic 1008, and one or more communications interface(s) 1020 coupled to system control logic 1008.
[00134] System control logic 1008 for one embodiment may include any suitable interface controller to provide for any suitable interface to at least one of the processor(s) 1004 and/or to any suitable device or component in communication with system control logic 1008.
[00135] System control logic 1008 for one embodiment may include one or more memory controller(s) to provide an interface to system memory 1012. System memory 1012 may be used to load and store data and/or instructions, for example, for system 1000. System memory 1012 for one embodiment may include any suitable volatile memory, such as suitable dynamic random access memory (DRAM), for example.
[00136] System control logic 1008 for one embodiment may include one or more input/output (I/O) controller(s) to provide an interface to NVM/storage 1016 and communications interface(s) 1020.
[00137] NVM/storage 1016 may be used to store data and/or instructions, for example. NVM/storage 1016 may include any suitable non-volatile memory, such as flash memory, for example, and/or may include any suitable non-volatile storage device(s), such as one or more hard disk drive(s) (HDD(s)), one or more solid-state drive(s), one or more compact disc (CD) drive(s), and/or one or more digital versatile disc (DVD) drive(s) for example.
[00138] The NVM/storage 1016 may include a storage resource physically part of a device on which the system 1000 is installed or it may be accessible by, but not necessarily a part of, the device. For example, the NVM/storage 1016 may be accessed over a network via the communications interface(s) 1020.
[00139] System memory 1012 and NVM/storage 1016 may include, in particular, temporal and persistent copies of medical communication logic 1024, respectively. The medical communication logic 1024 may include instructions that when executed by at least one of the processor(s) 1004 result in the system 1000 performing the operations described in conjunction with, for example, the alarm control, visualization, and voice control systems described herein. In some embodiments, the medical communication logic 1024 may additionally/alternatively be located in the system control logic 1008.
[00140] Communications interface(s) 1020 may provide an interface for system 1000 to communicate over one or more network(s) and/or with any other suitable device. Communications interface(s) 1020 may include any suitable hardware and/or firmware. Communications interface(s) 1020 for one embodiment may include, for example, a network adapter, a wireless network adapter, a telephone modem, and/or a wireless modem. For wireless communications, communications interface(s) 1020 for one embodiment may use one or more antenna(s).
[00141] For one embodiment, at least one of the processor(s) 1004 may be packaged together with logic for one or more controller(s) of system control logic 1008. For one embodiment, at least one of the processor(s) 1004 may be packaged together with logic for one or more controller(s) of system control logic 1008 to form a System in Package (SiP). For one embodiment, at least one of the processor(s) 1004 may be integrated on the same die with logic for one or more controller(s) of system control logic 1008. For one embodiment, at least one of the processor(s) 1004 may be integrated on the same die with logic for one or more controller(s) of system control logic 1008 to form a System on Chip (SoC).
[00142] In various embodiments, system 1000 may have more or fewer components, and/or different architectures.
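The composition of system 1000 in paragraphs [00133]–[00142] can be summarized in a short structural sketch. All class and field names below are hypothetical; the sketch only mirrors the described arrangement, in which a persistent copy of the medical communication logic resides in NVM/storage and a temporary copy is loaded into volatile system memory for execution.

```python
# Illustrative composition of system 1000 (hypothetical names, not firmware).
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class NVMStorage:
    """Persistent store; may be local or reached via a communications interface."""
    persistent_logic: bytes = b"medical_communication_logic_1024"


@dataclass
class SystemMemory:
    """Volatile memory (e.g., DRAM) holding a temporary copy of the logic."""
    loaded_logic: Optional[bytes] = None


@dataclass
class System1000:
    processors: list = field(default_factory=lambda: ["cpu0"])
    memory: SystemMemory = field(default_factory=SystemMemory)
    storage: NVMStorage = field(default_factory=NVMStorage)
    comm_interfaces: list = field(default_factory=lambda: ["wlan0", "modem0"])

    def boot(self) -> None:
        # System control logic copies the persistent logic into system memory
        # so that the processor(s) can execute it.
        self.memory.loaded_logic = self.storage.persistent_logic


system = System1000()
system.boot()
assert system.memory.loaded_logic is not None
```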
[00143] References throughout this specification to "one embodiment" or "an embodiment" mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one implementation encompassed within the present invention. Thus, appearances of the phrases "one embodiment" or "in an embodiment" are not necessarily referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be embodied in suitable forms other than the particular embodiment(s) illustrated.
[00144] Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that a wide variety of alternate and/or equivalent implementations may be substituted for the specific embodiments shown and described, without departing from the scope of the embodiments of the present invention. This application is intended to cover any adaptations or variations of the embodiments discussed herein. Therefore, it is manifestly intended that the embodiments of the present invention be limited only by the claims and the equivalents thereof.

Claims
1. A method of medical monitoring comprising:
receiving, by a computing device, sensor data from a plurality of sensors operatively coupled to a patient, including first sensor data from a first sensor measuring a first parameter of the patient and second sensor data from a second sensor measuring a second parameter of the patient, the first parameter being different from the second parameter;
determining, by the computing device, an action to take by correlating the first sensor data with the second sensor data based on a rule.
2. The method of claim 1 further comprising receiving, by the computing device, profile data related to the patient, wherein the determined action is based on the rule and the profile data.
3. The method of claim 2 wherein the determined action comprises storing at least one of the first sensor data and the second sensor data in the profile data.
4. The method of claim 1 wherein the rule comprises comparing the first sensor data with a first threshold, and, based on the comparison, sending a command to the second sensor to obtain the second sensor data.
5. The method of claim 4 further comprising comparing the second sensor data with a second threshold, and, based on the comparison, actuating an alarm.
6. The method of claim 1 further comprising:
receiving user response data from at least one of the patient and a caregiver, the user response data related to the action identified by the computing device; and
modifying the rule based on the user response data.
7. The method of claim 6 wherein the action comprises selecting an actuation mechanism for activating an alarm, wherein the selecting is based on the user response data.
8. A method of medical monitoring comprising:
receiving, by a computing device, sensor data from one or more sensors operatively coupled to a patient;
receiving, by the computing device, prior probability data, the prior probability data based on profile data related to the patient;
determining, by the computing device, an action to take by applying a rule and the prior probability data to the sensor data; and
modifying the prior probability data based on the sensor data received or the action determined.
9. The method of claim 8 further comprising determining, by the computing device, a subsequent action to take by applying the rule and the modified prior probability data to the sensor data.
10. The method of claim 8 wherein the rule comprises a first rule, and further comprising determining, by the computing device, a subsequent action to take by applying a second rule and the modified prior probability data to the sensor data.
11. The method of claim 8 further comprising receiving, by the computing device, user response data related to the action determined, and modifying, by the computing device, the prior probability data based on the user response data.
12. The method of claim 8 wherein the action determined comprises actuating an alarm.
13. The method of claim 8 wherein the action determined comprises acquiring additional sensor data.
14. The method of claim 8 wherein the action determined comprises recommending a protocol to a caregiver.
15. A method comprising:
receiving, by a decision engine, sensor data from a plurality of sensors;
applying a rule, by the decision engine, to the sensor data to identify an action to carry out;
sending a command to carry out the action;
receiving user response data relating to the action;
storing data related to the action identified by the decision engine, the data including the action identified, the rule applied by the decision engine to identify the action, the sensor data to which the rule was applied, and the user response to the action identified; and
modifying the rule based on the stored data.
16. The method of claim 15 wherein the rule comprises activating an alarm and selecting an actuation mechanism for the alarm, wherein the modifying comprises modifying the alarm selected by the rule based on the user response data.
17. A method comprising:
receiving sensor data, by a computing device, from one or more sensors operatively coupled to a patient;
initiating an action, by the computing device contemporaneously with receiving the sensor data, in response to the sensor data or a command from a user;
storing data related to the action;
attaching a time stamp to the sensor data and the stored action data; and
displaying the sensor data and the stored action data on a common display organized by the time stamps.
18. The method of claim 17 wherein the sensor data includes a plurality of data formats, including two or more of sampled continuous data, sampled non-continuous data, and event data.
19. The method of claim 17 wherein the common display includes a plurality of visualization techniques for displaying the sensor data and the stored action data, including two or more of a continuous data plot, a visual timeline, a textual flowchart, and a text ticker with changing text, the plurality of visualization techniques linked together by time stamp so that the plurality of visualization techniques show data having time stamps within a common time interval.
20. The method of claim 17 further comprising associating a portion of the sensor data and the stored action data having time stamps within a time interval as treatment session data, and displaying at least a portion of the treatment session data at a later time.
21. The method of claim 17 further comprising:
receiving, contemporaneously with the sensor data, user-input data from a caregiver or the patient related to at least one of the patient and the sensor data;
attaching a time stamp to the user-input data; and
displaying the user-input data together with the sensor data and the stored action data on a common display organized by the time stamp.
22. The method of claim 21 wherein the user-input data comprises voice data.
23. The method of claim 22 further comprising converting the voice data into text data corresponding to the voice data, attaching a time stamp to the text data, and displaying the text data on the common display.
24. The method of claim 21 wherein the user-input data comprises text data.
25. A method comprising:
receiving sensor data, by a computing device, from one or more sensors operatively coupled to a patient;
receiving user-input data, by the computing device contemporaneously with receiving the sensor data, in response to the sensor data or a command from a user;
attaching a time stamp to the sensor data and the user-input data; and
indexing the sensor data and the user-input data according to the time stamp.
26. The method of claim 25 further comprising displaying the sensor data and the user-input data on a common display with a common time axis.
27. The method of claim 26 wherein the common display includes a plurality of visualization techniques for displaying the sensor data and the user-input data, including two or more of a continuous data plot, a visual timeline, a textual flowchart, and a text ticker with changing text, the plurality of visualization techniques linked together by the time stamp so that the plurality of visualization techniques show data having the time stamp within a common time interval.
28. The method of claim 25 wherein the user-input data relates to a contemporaneous event.
29. The method of claim 25 wherein the user-input data comprises voice data.
30. The method of claim 29 further comprising converting, by the computing device, the voice data into text data corresponding to the voice data, attaching the time stamp to the text data, and displaying the text data on the common display.
31. The method of claim 25 wherein the user-input data comprises text data.
32. The method of claim 25 further comprising associating a portion of the sensor data and the user-input data having time stamps within a time interval as treatment session data, and recalling at least a portion of the treatment session data at a later time.
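To make the claimed methods concrete, the following sketches restate two of the claim families above in executable form. Both are illustrative only: every function name, threshold value, and probability is a hypothetical example, not the claimed implementation. The first sketch follows claims 1, 4, and 5: a rule compares first sensor data with a first threshold, commands the second sensor on that basis, and actuates an alarm if the second reading also crosses a threshold.

```python
# Sketch of claims 1, 4, and 5 (hypothetical names and example thresholds).
def correlate_and_act(first_reading: float, read_second_sensor) -> str:
    """Apply a rule correlating two different patient parameters."""
    HR_THRESHOLD = 120.0   # first threshold (claim 4), example value
    SPO2_THRESHOLD = 90.0  # second threshold (claim 5), example value

    if first_reading <= HR_THRESHOLD:
        return "no action"
    # Rule triggered: command the second sensor to obtain second sensor data.
    second_reading = read_second_sensor()
    if second_reading < SPO2_THRESHOLD:
        return "actuate alarm"  # both parameters abnormal
    return "log and continue monitoring"


print(correlate_and_act(135.0, lambda: 86.0))  # -> "actuate alarm"
```

The second sketch follows claims 8 and 11: an action is determined by combining the sensor data with prior probability data derived from the patient profile, and the prior is then modified based on the observation and on user response data. The Bayesian update shown here is one plausible reading of the claim language, assumed for illustration.

```python
# Sketch of claims 8 and 11 (hypothetical model): a prior probability of a
# condition is combined with the sensor evidence, then updated.
def posterior(prior: float, likelihood_if_true: float, likelihood_if_false: float) -> float:
    """Bayes' rule for a binary condition given one observation."""
    evidence = likelihood_if_true * prior + likelihood_if_false * (1.0 - prior)
    return likelihood_if_true * prior / evidence


prior = 0.10  # from profile data: assumed baseline probability of the condition

# Sensor data arrives; the rule treats it as evidence for the condition.
prior = posterior(prior, likelihood_if_true=0.8, likelihood_if_false=0.2)
action = "recommend protocol" if prior > 0.3 else "continue monitoring"

# User response data (claim 11): the caregiver confirms or dismisses the
# action, and the prior is modified accordingly for subsequent determinations.
caregiver_confirms = False
prior = posterior(prior, 0.9, 0.3) if caregiver_confirms else posterior(prior, 0.6, 0.9)
print(round(prior, 3), action)
```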
PCT/US2011/029077 2010-03-18 2011-03-18 Context-management framework for telemedicine WO2011116340A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US31535810P 2010-03-18 2010-03-18
US61/315,358 2010-03-18

Publications (2)

Publication Number Publication Date
WO2011116340A2 true WO2011116340A2 (en) 2011-09-22
WO2011116340A3 WO2011116340A3 (en) 2011-11-17

Family

ID=44649856

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2011/029077 WO2011116340A2 (en) 2010-03-18 2011-03-18 Context-management framework for telemedicine

Country Status (1)

Country Link
WO (1) WO2011116340A2 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107451401A (en) * 2017-07-11 2017-12-08 武汉金豆医疗数据科技有限公司 A kind of medical insurance intelligent checks method and system


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7552101B2 (en) * 2003-10-31 2009-06-23 Vigimedia S.A.S. Health monitoring system implementing medical diagnosis
US20090036757A1 (en) * 2004-07-12 2009-02-05 Cardiac Pacemakers, Inc. Expert system for patient medical information analysis
US20080214904A1 (en) * 2005-06-22 2008-09-04 Koninklijke Philips Electronics N. V. Apparatus To Measure The Instantaneous Patients' Acuity Value
US20080015903A1 (en) * 2005-12-09 2008-01-17 Valence Broadband, Inc. Methods for refining patient, staff and visitor profiles used in monitoring quality and performance at a healthcare facility

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10976908B2 (en) 2013-01-11 2021-04-13 Zoll Medical Corporation EMS decision support interface, event history, and related tools
EP2943926A4 (en) * 2013-01-11 2018-05-23 Zoll Medical Corporation Ems decision support interface, event history, and related tools
US11816322B2 (en) 2013-01-11 2023-11-14 Zoll Medical Corporation EMS decision support interface, event history, and related tools
CN110277160A (en) * 2013-01-11 2019-09-24 卓尔医学产品公司 Code checks the system and decision support method of medical events
CN110277160B (en) * 2013-01-11 2023-07-28 卓尔医学产品公司 System for code viewing medical events and decision support method
WO2014110280A2 (en) 2013-01-11 2014-07-17 Zoll Medical Corporation Ems decision support interface, event history, and related tools
US9007207B2 (en) 2013-01-22 2015-04-14 General Electric Company Dynamic alarm system for operating a power plant and method of responding to same
US10720240B2 (en) 2015-02-26 2020-07-21 Koninklijke Philips N.V. Context detection for medical monitoring
CN105224383A (en) * 2015-08-21 2016-01-06 上海理工大学 cardiopulmonary resuscitation simulation system
CN105224383B (en) * 2015-08-21 2018-08-31 上海理工大学 cardiopulmonary resuscitation simulation system
US10504036B2 (en) 2016-01-06 2019-12-10 International Business Machines Corporation Optimizing performance of event detection by sensor data analytics
US11626207B2 (en) * 2016-05-24 2023-04-11 Koninklijke Philips N.V. Methods and systems for providing customized settings for patient monitors
US20190295696A1 (en) * 2016-05-24 2019-09-26 Koninklijke Philips N.V. Methods and systems for providing customized settings for patient monitors
JP2019523926A (en) * 2016-05-24 2019-08-29 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Method and system for providing customized settings for a patient monitor
US11925439B2 (en) 2018-10-23 2024-03-12 Zoll Medical Corporation Data playback interface for a medical device
CN114842935A (en) * 2022-04-29 2022-08-02 中国人民解放军总医院第六医学中心 Intelligent detection method and system for night ward round of hospital
CN114842935B (en) * 2022-04-29 2024-01-23 中国人民解放军总医院第六医学中心 Intelligent detection method and system for night ward round of hospital
CN116453637A (en) * 2023-03-20 2023-07-18 杭州市卫生健康事业发展中心 Health data management method and system based on regional big data
CN116453637B (en) * 2023-03-20 2023-11-07 杭州市卫生健康事业发展中心 Health data management method and system based on regional big data

Also Published As

Publication number Publication date
WO2011116340A3 (en) 2011-11-17

Similar Documents

Publication Publication Date Title
US11776669B2 (en) System and method for synthetic interaction with user and devices
US20220037038A1 (en) Operating room checklist system
WO2011116340A2 (en) Context-management framework for telemedicine
US11301680B2 (en) Computing device for enhancing communications
US11963924B2 (en) Tools for case review performance analysis and trending of treatment metrics
US20160117940A1 (en) Method, system, and apparatus for treating a communication disorder
JP2019527864A (en) Virtual health assistant to promote a safe and independent life
TW201327460A (en) Apparatus and method for voice assisted medical diagnosis
CN107910073A (en) A kind of emergency treatment previewing triage method and device
WO2015145424A1 (en) A system for conducting a remote physical examination
US10698983B2 (en) Wireless earpiece with a medical engine
US20210298711A1 (en) Audio biomarker for virtual lung function assessment and auscultation
JP2023539874A (en) Computerized decision support tools and medical devices for respiratory disease monitoring and care
Dhakal et al. IVACS: I ntelligent v oice A ssistant for C oronavirus Disease (COVID-19) S elf-Assessment
JP2005512608A (en) Correlation between sensor signals and subjective information in patient monitoring
US20170354383A1 (en) System to determine the accuracy of a medical sensor evaluation
Rabbani et al. Towards developing a voice-activated self-monitoring application (VoiS) for adults with diabetes and hypertension
US11501879B2 (en) Voice control for remote monitoring
US20220059238A1 (en) Systems and methods for generating data quality indices for patients
Wouhaybi et al. A context-management framework for telemedicine: an emergency medicine case study
KR20130115706A (en) Service providing method for physical condition based on user voice using smart device
Wouhaybi et al. Experiences with context management in emergency medicine
CN110024038B (en) System and method for synthetic interaction with users and devices
JP2024003313A (en) Information processing device, information processing method, and program
WO2024015885A1 (en) Systems and methods for providing context sensitive guidance for medical treatment of a patient

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 11757096; Country of ref document: EP; Kind code of ref document: A2)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 11757096; Country of ref document: EP; Kind code of ref document: A2)