US20140201627A1 - EMS decision support interface, event history, and related tools - Google Patents

EMS decision support interface, event history, and related tools

Info

Publication number
US20140201627A1
Authority
US
United States
Prior art keywords
user interface
softkeys
patient
event
decision support
Prior art date
Legal status
Granted
Application number
US14/151,602
Other versions
US9658756B2 (en)
Inventor
Gary A. Freeman
Annemarie Elizabeth Silver
Current Assignee
Zoll Medical Corp
Original Assignee
Zoll Medical Corp
Priority date
Filing date
Publication date
Application filed by Zoll Medical Corp
Priority to US14/151,602 (US9658756B2)
Publication of US20140201627A1
Assigned to ZOLL MEDICAL CORPORATION. Assignment of assignors interest (see document for details). Assignors: SILVER, Annemarie Elizabeth; FREEMAN, GARY A.
Priority to US15/497,745 (US10976908B2)
Application granted
Publication of US9658756B2
Priority to US17/197,341 (US11816322B2)
Status: Active
Expiration: Adjusted

Classifications

    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/0485: Scrolling or panning
    • G06F3/1423: Digital output to display device; cooperation and interconnection of the display device with other functional units; controlling a plurality of local displays, e.g. CRT and flat panel display
    • G16H40/63: ICT specially adapted for the management or operation of medical equipment or devices, for local operation
    • G16H50/20: ICT specially adapted for medical diagnosis; for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • Embodiments of the present invention relate generally to tools for facilitating acute care treatment, and more specifically to systems and methods for clinical decision support and differential diagnosis.
  • Automated differential diagnosis utilizes computer inference algorithms such as Bayesian algorithms, neural networks, or genetic algorithms.
  • Embodiments of the present invention include a system that provides a tool for the caregiver to more efficiently and accurately perform a differential diagnosis that is integrated into the caregiver's existing workflow during emergency situations.
  • Embodiments of the present invention may also provide an integrated view of physiological data from the patient, along with therapeutic treatment and patient history and examination findings, in an automated way to caregivers.
  • a method for code review of a medical event includes displaying, on a first device screen, a user interface during the medical event; recording images of the user interface, each of the images representing an entirety of the user interface associated with a time during the medical event, wherein the images are recorded at least once every second; displaying, on a second device screen, a visual timeline indicator and a user interface replicator, the user interface replicator displaying the images of the user interface and the visual timeline indicator representing a time associated with each of the images, wherein the visual timeline indicator and the user interface replicator permit sequential playback and review of the user interface images from the medical event, and wherein the visual timeline indicator accepts user input to move the sequential playback to a different time associated with the medical event.
  • the visual timeline indicator includes a timeline including a beginning time of the medical event and an end time of the medical event, the method further comprising indicating on the timeline the time associated with the image of the user interface shown in the user interface replicator.
  • indicating on the timeline comprises using an indicator on the timeline to indicate the time associated with the image of the user interface shown in the user interface replicator.
  • the visual timeline indicator includes one or more event markers marking occurrence of clinically-relevant sub-events during the medical event.
  • the one or more event markers include a drug event marker indicating a time at which a drug was administered to a patient during the medical event.
  • the one or more event markers include a defibrillation event marker indicating a time at which a defibrillation shock was applied to a patient during the medical event.
  • the one or more event markers include an ROSC event marker indicating a time at which a patient returned to spontaneous circulation.
  • the one or more event markers include a rearrest event marker indicating a time at which a patient returned to cardiac arrest.
  • the one or more event markers include an alarm event marker indicating a time at which an alarm was activated.
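The code-review method above pairs periodically recorded images of the user interface with a visual timeline carrying clinically relevant event markers (drug, defibrillation, ROSC, rearrest, alarm). The following Python sketch shows one way such a record could be structured; the class and method names (UiSnapshot, EventMarker, CodeReviewLog, frame_at) are illustrative assumptions, not identifiers from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class UiSnapshot:
    timestamp: float   # seconds since the start of the medical event
    image: bytes       # capture of the entire user interface at that time

@dataclass
class EventMarker:
    timestamp: float
    kind: str          # e.g. "drug", "defibrillation", "ROSC", "rearrest", "alarm"
    label: str = ""

@dataclass
class CodeReviewLog:
    """Recorded user-interface history and event markers for post-event review."""
    snapshots: List[UiSnapshot] = field(default_factory=list)
    markers: List[EventMarker] = field(default_factory=list)

    def record(self, image: bytes, timestamp: float) -> None:
        # Called at least once per second during the medical event.
        self.snapshots.append(UiSnapshot(timestamp, image))

    def mark(self, kind: str, timestamp: float, label: str = "") -> None:
        self.markers.append(EventMarker(timestamp, kind, label))

    def frame_at(self, t: float) -> Optional[UiSnapshot]:
        """Most recent snapshot at or before time t (used when scrubbing the timeline)."""
        candidates = [s for s in self.snapshots if s.timestamp <= t]
        return max(candidates, key=lambda s: s.timestamp) if candidates else None
```

Under this reading, moving the timeline indicator to a different time simply maps to a frame_at lookup followed by a redraw of the user interface replicator.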
  • a method for decision support during a medical event includes displaying, on a screen on a device, a user interface during the medical event, wherein the user interface comprises two or more softkeys each representing a possible user selection; collecting physiological data from a patient with the device; determining, based on the physiological data, which one of the two or more softkeys represents the possible user selection that most closely conforms to a treatment or diagnosis protocol; and based on the determination, visually distinguishing the one of the two or more softkeys from the others of the two or more softkeys on the user interface.
  • a method for decision support during a medical event includes displaying, on a first screen on a first device, a user interface during the medical event, wherein the user interface comprises two or more softkeys each representing a possible user selection; collecting physiological data from a patient with the first device; displaying, on a second screen on a second device, a visual representation of a clinical decision support tree and an indication of a current node on the clinical decision support tree, wherein the two or more softkeys each represent the possible user selection from the current node on the clinical decision support tree.
  • a method for decision support during a medical event includes: during the medical event, collecting physiological data from a patient at a first frequency with a patient monitoring device; guiding a user through a clinical decision support process with a display screen during the medical event; determining, with the clinical decision support process, a status of a patient; and based on the status of the patient, selecting a second frequency at which to collect the physiological data from the patient.
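As a rough illustration of the frequency-selection step in the preceding paragraph, the mapping below converts a patient status determined by the decision support process into a new collection rate; the status names and rates are assumptions for the sketch, not values given in the patent.

```python
# Illustrative mapping from a decision-support patient status to a sampling rate.
SAMPLE_RATES_HZ = {
    "stable": 1.0,          # assumed rates, for illustration only
    "deteriorating": 4.0,
    "critical": 10.0,
}

def select_sampling_frequency(status: str, current_hz: float) -> float:
    """Pick the second frequency at which to collect physiological data."""
    return SAMPLE_RATES_HZ.get(status, current_hz)
```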
  • a system for code review of a medical event includes a first screen configured to visually display a clinical decision support tree used in the medical event; a second screen configured to visually display a replication of a user interface of a patient monitoring device as the user interface appeared at times during the medical event, wherein selecting a location within the clinical decision support tree on the first screen causes the second screen to display the replication of the user interface corresponding to a time during the medical event represented by the location within the clinical decision support tree.
  • first screen is further configured to visually indicate on the clinical decision support tree a user's advancement through the clinical decision support tree in synchronization with advancement of the replication of the user interface on the second screen.
  • a method for decision support includes displaying, on a screen, a user interface during the medical event, wherein the user interface comprises trending information for a patient condition; collecting physiological data from a patient with a device; providing clinical decision support using at least some of the physiological data and at least some user input data; establishing a range for the patient condition based on the clinical decision support; and visually indicating on the user interface whether all or portions of the trending information are within the range.
  • establishing the range for the patient condition based on the clinical decision support comprises establishing the range for the patient condition based on the patient's age, wherein the patient's age is acquired via the clinical decision support.
  • a method for decision support includes displaying, on a screen, a user interface during the medical event, wherein the user interface comprises a dosing information display for a drug; collecting physiological data from a patient with a device; providing clinical decision support using at least some of the physiological data and at least some user input data; establishing a dosage recommendation for the drug based on the clinical decision support; and visually indicating the dosage recommendation on the dosing information display of the user interface.
  • establishing the dosage recommendation based on the clinical decision support comprises establishing the dosage recommendation based on the patient's age, wherein the patient's age is acquired via the clinical decision support.
  • establishing the dosage recommendation based on the clinical decision support comprises establishing the dosage recommendation based on the patient's weight, wherein the patient's weight is acquired via the clinical decision support.
  • establishing the dosage recommendation based on the clinical decision support comprises establishing the dosage recommendation based on the patient's allergies, wherein the patient's allergies are acquired via the clinical decision support.
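A dosage recommendation of the kind described in the preceding paragraphs could be derived from patient attributes (age, weight, allergies) gathered through the clinical decision support flow. The sketch below uses a placeholder drug and purely illustrative numbers; it is not clinical guidance, and none of the values come from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class PatientInfo:
    age_years: Optional[float] = None
    weight_kg: Optional[float] = None
    allergies: Tuple[str, ...] = ()

# Placeholder formulary entry: per-kg dose, cap, and a contraindicating allergy.
# All values are illustrative stand-ins.
FORMULARY = {
    "example_drug": {"mg_per_kg": 0.1, "max_mg": 5.0, "allergy": "example_allergen"},
}

def recommend_dose_mg(drug: str, patient: PatientInfo) -> Optional[float]:
    """Dose to show on the dosing information display, or None if none can be offered."""
    entry = FORMULARY.get(drug)
    if entry is None or patient.weight_kg is None:
        return None
    if entry["allergy"] in patient.allergies:
        return None  # suppress the recommendation; leave the decision to the caregiver
    return min(entry["mg_per_kg"] * patient.weight_kg, entry["max_mg"])
```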
  • FIG. 1 illustrates a clinical decision support system, according to embodiments of the present invention.
  • FIG. 2 illustrates a user interface for a medical device, according to embodiments of the present invention.
  • FIG. 3 illustrates the user interface of FIG. 2 upon selection of an acute care diagnosis mode, according to embodiments of the present invention.
  • FIG. 4 illustrates the user interface of FIGS. 2 and 3 upon selection of a respiratory distress mode, according to embodiments of the present invention.
  • FIG. 5 is a table describing a differential diagnosis outline for acute dyspnea in adults.
  • FIG. 6 is a table describing clues to the diagnosis of dyspnea.
  • FIG. 7 is a table listing physical examination findings in the diagnosis of acute dyspnea.
  • FIG. 8A is a top portion of a common medical protocol and differential diagnosis flow chart for adult shortness of breath.
  • FIG. 8B is a continuation of the common medical protocol and differential diagnosis flow chart of FIG. 8A .
  • FIG. 9 illustrates a carbon dioxide snapshot waveform which may be displayed on the user interface when selected by the user, according to embodiments of the present invention.
  • FIG. 10 illustrates the carbon dioxide snapshot waveform of FIG. 9 with displayed measurements, according to embodiments of the present invention.
  • FIG. 11 illustrates a tablet computing device docked on a defibrillator device, according to embodiments of the present invention.
  • FIG. 12 illustrates a protocol for use in patients with cardiac arrest.
  • FIG. 13 illustrates an example trauma assessment protocol.
  • FIG. 14 illustrates an example rapid trauma assessment protocol.
  • FIG. 15 illustrates an example focused physical exam protocol.
  • FIG. 16 illustrates an example amputation injuries protocol.
  • FIG. 17 illustrates an example bleeding control protocol.
  • FIG. 18 illustrates an example burns protocol.
  • FIG. 19 illustrates an example electrocution protocol.
  • FIG. 20 illustrates an example spinal immobilization protocol.
  • FIG. 21 illustrates additional steps in the spinal immobilization protocol of FIG. 20 .
  • FIG. 22 illustrates an example multi-system trauma protocol.
  • FIG. 23 illustrates an example near drowning protocol.
  • FIG. 24 illustrates an example trauma in pregnancy protocol.
  • FIG. 25 illustrates an example traumatic cardiac arrest protocol.
  • FIG. 26 illustrates a clinical decision support system, according to embodiments of the present invention.
  • FIG. 27 illustrates a computer system, according to embodiments of the present invention.
  • FIG. 28 illustrates a user interface display of a clinical decision support tree, according to embodiments of the present invention.
  • FIG. 29 illustrates the user interface display of FIG. 28 with a portion of the clinical decision support tree resized, according to embodiments of the present invention.
  • FIG. 30 illustrates the user interface display of FIGS. 28 and 29 with an additional portion of the clinical decision support tree resized, according to embodiments of the present invention.
  • FIG. 31 illustrates a user interface display with dynamic softkeys, according to embodiments of the present invention.
  • FIG. 32 illustrates the user interface display of FIG. 31 with one softkey emphasized based on clinical decision support, according to embodiments of the present invention.
  • FIG. 33 illustrates the user interface display of FIG. 31 with one softkey emphasized in a different way, based on clinical decision support, according to embodiments of the present invention.
  • FIG. 34 illustrates a code review interface for reviewing user interface display data corresponding to a medical event, according to embodiments of the present invention.
  • FIG. 35 illustrates a portion of a clinical decision support tree, according to embodiments of the present invention.
  • FIG. 36 illustrates a user interface display, according to embodiments of the present invention.
  • FIG. 1 shows a block diagram of the system, according to embodiments of the present invention.
  • a combined defibrillator/monitor device such as the E-Series manufactured by ZOLL Medical of Chelmsford, Massachusetts, has keys whose labeling is provided by on-screen text.
  • the text is thus configurable in real time either due to input by the user or as a result of analysis and decision making by the defibrillator or other devices with which the defibrillator is in communication at the time of the defibrillator's use, such as the computer tablet device 214 or a remote base station staffed by medical dispatch or medical supervisory personnel in communication with the computer tablet.
  • the computer tablet may take the form of an iPad (Apple Corp., Cupertino Calif.).
  • Such screen-labeled keys may be referred to as “soft-keys”.
  • a specific soft-key is initially labeled “Acute Care diagnose” at device turn-on as shown in FIG. 2 , according to embodiments of the present invention.
  • Upon detecting a key press of the Acute Care Diagnose key, the defibrillator changes the functionality and labeling of the keys to those shown in FIG. 3.
  • each DTP may include a template according to which sensor data, or the physiological and/or measurement data derived therefrom, is displayed in a way most useful and/or efficient for that particular DTP. For instance, if the “Respiratory Distress” soft-key is pressed, then the waveforms and numeric physiologic data on the screen change to that shown in FIG. 4 . Stored snapshots of individual CO2 breath waveforms may be initiated via the CO2 Snapshot soft-key. These snapshots remain on the display for reference to the clinician both for automating measurements for diagnosis as well as for assessing the effectiveness of a particular therapy.
  • Heart sound measurement and detection may be incorporated into the monitoring device to detect S3 and S4 heart sounds and to automatically narrow the differential toward heart failure or pulmonary edema, or to suggest that the rescuer confirm agreement with the software diagnosis.
  • a flowchart for evaluating heart sounds is shown in FIGS. 8A and 8B .
  • Pulse oximetry and capnography are also very helpful measures and may be automatically incorporated into the algorithm for more accurate diagnosis.
  • the same sensors used to detect heart sounds may also be employed to detect breath sounds and to analyze their quality. Specific algorithms may be employed to detect wheezing, crackles, rales, or stridor, each of which may be indicative of a particular disease.
  • Sensors such as flow sensors and O2 gas sensors are included in some embodiments, so that additional physiological measurements such as volumetric CO2, volumetric O2, and spirometry, which are relevant for diagnosis and treatment of dyspnea, may be included and displayed on the Respiratory Distress DTP screen.
  • An oxygen sensor may be located in the patient's airway, which may assist in calculating the metabolic needs of the patient.
  • the display on the defibrillator 212 is a touchscreen, according to some embodiments of the present invention.
  • the caregiver can easily initiate measurements such as on the CO2 snapshot waveform or the spirometry snapshot waveform via touchscreen gesture such as a double tap.
  • a zoom icon may exist in the upper corner of each waveform box, such as the CO2 snapshot, such that when the zoom button is touched, that particular waveform fills the display of the defibrillator.
  • Another measurement button is present which, when touched, displays all the relevant measurements for a particular waveform, according to embodiments of the present invention.
  • a gestural interface is provided as part of the touchscreen.
  • Using two fingers or finger and thumb to touch to two points in the waveform (which may also be referred to as a “caliper” measurement or gesture) will cause measurements to be displayed and/or overlaid onto the physiological data, as illustrated in FIG. 10 .
  • dead space volume, phase II and III slopes which are indicative of COPD, and estimates of arterial pCO2 may be listed on the screen after initiation of CO2 waveform measurement.
  • the processor communicably coupled with the touchscreen portion of a decision support system may be configured to recognize the wave shape of a wave signal being displayed, and/or recognize the edge of an image being displayed, in order to improve the accuracy of a caliper touch gesture. For example, if a user were to use a caliper gesture to measure or “zoom in” on an ST elevation in an ECG wave display, the decision support system may be configured to recognize that if one of the user's fingers taps just below the top of the ECG wave, that the user likely intended to include the top of the ECG wave in the enlarged or selected view. In addition, the decision support system may be configured to permit an ability to enlarge (zoom) and adjust measurement points individually using the touchscreen. A tap/click and drag method may be used to set the caliper gesture; for example, to hone in on a particular portion of displayed waveform, the user may press on one point and drag to another point to indicate the endpoints of the caliper gesture.
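The peak-aware caliper behavior described above can be approximated by snapping a tapped point to the nearest local extremum of the displayed waveform. The function below is a sketch under assumed inputs (the waveform as a list of samples, the tap as a sample index); it is not the patent's algorithm.

```python
def snap_caliper_point(waveform, tap_index, window=10):
    """Snap a caliper endpoint to the local extremum near the tapped sample.

    If the user taps just below the top of an ECG wave, the returned index lands
    on the nearby peak so the enlarged or selected view includes it.
    """
    lo = max(0, tap_index - window)
    hi = min(len(waveform), tap_index + window + 1)
    indices = range(lo, hi)
    peak = max(indices, key=lambda i: waveform[i])
    trough = min(indices, key=lambda i: waveform[i])
    tapped = waveform[tap_index]
    # Prefer whichever extremum is closer in amplitude to the tapped sample.
    if abs(waveform[peak] - tapped) <= abs(waveform[trough] - tapped):
        return peak
    return trough
```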
  • Specific out-of-range readings can be displayed in red or highlighted by other mechanisms, such as bold-face font and/or flashing. Touching the highlighted values will cause the display to show the possible diagnoses which are consistent with the measurements, according to embodiments of the present invention.
  • a specific graphical zone of the screen can be designated with a graphical image of the computer tablet. By dragging waveforms, measurements, or any other data object shown on the display over onto the computer tablet icon, it can automatically be presented on the computer tablet that is linked to the defibrillator.
  • Capnography is helpful in the assessment of asthma, where an increased slope in the expiratory plateau provides a measure of bronchospasm.
  • the slope of the plateau phase (phase III) provides a measure of airway obstruction.
  • the adequacy of β-agonist bronchodilatory therapy for an asthma exacerbation may be monitored through observation of slope change of phase III.
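The phase III (alveolar plateau) slope discussed here can be estimated with a least-squares fit over the plateau segment of a single CO2 breath. The sketch assumes the plateau samples have already been isolated from the capnogram, which is itself a separate step.

```python
def phase_iii_slope(co2_mmhg, t_seconds):
    """Least-squares slope (mmHg/s) of the alveolar plateau of a capnogram.

    A steeper slope suggests airway obstruction/bronchospasm, so a falling slope
    across successive breaths can indicate that bronchodilator therapy is working.
    """
    n = len(co2_mmhg)
    if n < 2 or n != len(t_seconds):
        raise ValueError("need at least two matching samples")
    mean_t = sum(t_seconds) / n
    mean_c = sum(co2_mmhg) / n
    num = sum((t - mean_t) * (c - mean_c) for t, c in zip(t_seconds, co2_mmhg))
    den = sum((t - mean_t) ** 2 for t in t_seconds)
    if den == 0:
        raise ValueError("time values must not all be equal")
    return num / den
```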
  • the data for the patient's history may be entered via the computer tablet, with patient physiological measures entered via the monitor.
  • patient examination findings may be captured along with measures of the patient's physiological state from such monitoring as ECG, capnography, and pulse oximetry.
  • these data elements are integrated into a user interface that automatically or semi-automatically integrates the various data elements on a single differential diagnosis screen within the application on the computer tablet.
  • the interface may begin by asking the rescuer to choose from a list of common presenting symptoms or complaints by the patient, for example dyspnea or respiratory distress.
  • the information such as on the screens of FIGS.
  • the defibrillator contains a docking feature for propping up a computer tablet such as an Apple® iPad® on top of the defibrillator in a stable position via mounting features integrated onto the defibrillator, as illustrated in FIG. 11 .
  • Other mobile computing devices including tablet computers, an iPhone®, an iTouch®, and other touchscreen monitors may be used.
  • a low power, battery powered, touchscreen monitor may be used, such as, for example, those that transfer information to and from a computing device via a wired or wireless USB connection. Communication may be provided wirelessly between the two devices (the medical device and the mobile computing device, for example). Other communicable coupling may be achieved between the two devices; for example, wired.
  • the iPad may include a protective housing and/or waterproof housing to protect it from the typical physical abuse it would likely encounter in the prehospital environment.
  • Mounting features integral to such an iPad housing allow it to be easily attached on top of the defibrillator on scene.
  • the mounting feature on the defibrillator may be able to hinge to allow the iPad® to hinge down when not in use into a protective pocket on the defibrillator.
  • the iPad® may also be undocked and used nearby to the defibrillator, without need for physical connection.
  • a physical slot may also be provided, preferably at the side, top or back of the unit that allows for the iPad® to have its battery charged by the defibrillator.
  • the results of this integrated analysis of physiological data, patient history and examination findings may then be displayed on the defibrillator, potentially in the form of asking to make an additional physiological measurement.
  • the results of this integrated analysis of physiological data, patient history and examination findings may alternatively, or additionally, be displayed on the tablet computer.
  • the tablet computer, or other mobile computing device may be communicably coupled with the defibrillator or other physiological assessment device, for example through a wireless connection.
  • the phrase “communicably coupled” is used in its broadest sense to refer to any coupling whereby information may be passed.
  • communicably coupled includes electrically coupled by, for example, a wire; optically coupled by, for example, an optical cable; and/or wirelessly coupled by, for example, a radio frequency or other transmission media.
  • “Communicably coupled” also includes, for example, indirect coupling, such as through a network, or direct coupling.
  • a user interface device is communicably coupled to a processor, and the processor is configured to receive data entered via the user interface device, as well as data received from one or more sensors, in order to perform clinical decision support based on both data sources.
  • the user interface device may include one or more devices such as a touch screen computer, a tablet computer, a mobile computing device, a smart phone, an audio receiver, an audio transmitter, a video receiver, a video transmitter, a camera, and a “heads up” display projected onto a user's glasses or face shield.
  • a small monitor may be mounted onto eyeglasses, a face shield, and/or integrated with other wearable communications devices, such as, for example, an ear bud or a Bluetooth® hands free phone adaptor.
  • the user interface device may include a combination of devices for conveying options and receiving input; for example, an audio speaker may be used to convey possible DTPs, and an audio receiver may be used to receive a verbal command indicating a selection of one of the DTPs.
  • in addition to, or instead of, an audio receiver, a video camera may be used to receive a gestural command that will be interpreted by the processor as a selection of one of the possible DTPs or input elements.
  • Using hands-free devices for user interface devices may free the hands of a caregiver to perform clinical tasks, while still permitting non-intrusive decision support and/or differential diagnosis for the caregiver.
  • FIGS. 8A and 8B illustrate a differential diagnosis and/or clinical support process through which a computer processor may take a caregiver, using the user interface device, according to embodiments of the present invention.
  • if the caregiver selected "Respiratory Distress" from among the five DTPs presented on the screen of FIG. 3, the user interface device would prompt the caregiver to input information about step 802 in the flowchart of FIG. 8, which flows from top to bottom.
  • at step 802, if the 12-lead reveals an S3 heart sound, or if the Dyspnea Engagement Score is greater than 3, then the decision support system will take the user through the Acute Decompensated Heart Failure (CHF) decision/diagnosis process.
  • the decision support system may take into account both physiological data received from sensors, and information data received from the caregiver (e.g. via mobile computing device such as an iPad®), in helping the caregiver move from one decision point in the flow chart to the next, while updating any display or information provided along the way.
  • the decision support system may indicate to the user that, based on processing of the ECG data, there does not appear to be an S3 heart sound present, and ask the caregiver to confirm this assessment.
  • the decision support system may also, or alternatively, request the caregiver to enter a Dyspnea Engagement Score, or suggest one for confirmation by the caregiver.
  • at step 802, if the 12-lead reveals no S3 heart sound, or if the Dyspnea Engagement Score is less than 3, then the decision support system recognizes that the caregiver is not dealing with a CHF situation and moves to step 804, in which the decision support system changes its display and/or input prompts in order to help the caregiver determine whether to enter the Asthma treatment path or the COPD treatment path.
  • the decision support system may factor in various physiological data from sensors, as well as various informational data received about the particular patient, in helping to support the caregiver's decision. For example, as illustrated in FIG. 6 , if the patient information (either entered by the caregiver or obtained from another source) indicates that the patient is involved in heavy tobacco use, the decision support system will recognize at step 804 that a COPD diagnosis is more likely, whereas if the caregiver indicates to the decision support system that the patient is experiencing a cough, or has a history of asthma, the decision support system may recognize at step 804 that an Asthma diagnosis is more likely. In addition to, or alternatively to, the informational diagnosis support reflected in FIG. 6 , the decision support system may gather findings using physiological data to help the caregiver determine the appropriate treatment path.
  • the decision support system may recognize at step 804 that a COPD treatment path is more appropriate, whereas if the breath sound sensor generates data indicative of pulsus paradoxus, or if a muscle activity sensor indicates accessory muscle use, the decision support system may recognize at step 804 that an Asthma treatment path is more appropriate.
  • the decision support system may suggest or propose a diagnosis or treatment path, for example by indicating statistical probabilities (based on charts and data such as those of FIGS. 6 and 7 ) or relative likelihoods, and ask for confirmation or final selection by the caregiver.
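One minimal, rule-based way to produce the relative likelihoods mentioned above is to weight individual findings toward the Asthma or COPD path and normalize. The findings and weights below are assumptions made for the sketch, not values taken from FIGS. 6 and 7.

```python
# Assumed finding weights for the step-804 decision between treatment paths.
WEIGHTS = {
    "heavy_tobacco_use":    {"copd": 2.0, "asthma": 0.0},
    "history_of_asthma":    {"copd": 0.0, "asthma": 2.0},
    "cough":                {"copd": 0.5, "asthma": 1.0},
    "pulsus_paradoxus":     {"copd": 0.0, "asthma": 1.5},
    "accessory_muscle_use": {"copd": 0.0, "asthma": 1.0},
}

def relative_likelihoods(findings):
    """Normalized scores to present alongside the proposed paths for caregiver confirmation."""
    scores = {"copd": 0.0, "asthma": 0.0}
    for finding in findings:
        for path, weight in WEIGHTS.get(finding, {}).items():
            scores[path] += weight
    total = sum(scores.values()) or 1.0
    return {path: score / total for path, score in scores.items()}

# Example: findings entered by the caregiver or derived from sensors.
print(relative_likelihoods({"heavy_tobacco_use", "cough"}))
```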
  • the decision support system may change the information presented to the caregiver, for example by launching into a treatment protocol specific to the Asthma diagnosis.
  • the decision support system may suggest that the caregiver attach a humidifier to the patient's oxygen supply, and administer 2.5 milligrams of albuterol mixed with 0.5 milligrams of Atrovent administered by nebulizer connected to a 6-9 liter per minute source, and may indicate that the dosage may be administered continuously as long as the heart rate is not greater than 140.
  • the decision support system may monitor the heart rate, and give a visual and/or audio indication when and if the heart rate reaches or approaches 140, in this example.
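The heart-rate watch described in the preceding paragraph reduces to a threshold check; the 140 bpm limit comes from the example above, while the 10 bpm "approaching" margin is an assumption of this sketch.

```python
def heart_rate_status(hr_bpm, limit_bpm=140, approach_margin_bpm=10):
    """Classify the current heart rate against the continuous-dosing limit."""
    if hr_bpm >= limit_bpm:
        return "exceeded"     # give a visual and/or audio indication
    if hr_bpm >= limit_bpm - approach_margin_bpm:
        return "approaching"  # warn the caregiver that the limit is near
    return "ok"
```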
  • the decision support system may help the caregiver decide whether the patient is extremely bronchoconstricted, for example by showing data or measurements related to blood oxygen content, respiration rate, or respiration volume. Upon a confirmation by the caregiver that the patient is extremely bronchoconstricted at step 808 , the decision support system may then suggest to the caregiver that a 125 milligram dose of Solumedrol be administered over a slow (e.g. 2 minute) intravenous push. At step 810 , the decision support system may help the caregiver to decide whether the patient's symptoms have improved (e.g. whether the patient's shortness of breath has improved with the treatment thus far).
  • the decision support system may display and/or analyze the patient's end-tidal waveform, and suggest that the patient does not appear to be responding to the treatment, and ask for the caregiver's confirmation. If the caregiver confirms the decision, then the decision support system may continue to guide the caregiver through additional treatment options, for example those indicated in FIG. 8 . In this way, the decision support system guides the caregiver through complex decision making processes, during the clinical encounter, using both physiological data and informational data gathered from the patient or input by the caregiver, in a way which would be too inconvenient or time-consuming for the caregiver to perform absent the decision support system.
  • the decision support may or may not be fully automated.
  • Inference engines utilizing Bayesian networks, neural networks, genetic algorithms, or simpler rule-based systems may be employed.
  • the tissue CO2 or pH are measured by methods such as those described in U.S. Pat. No. 6,055,447, which describes a sublingual tissue CO2 sensor, or U.S. Pat. Nos. 5,813,403, 6,564,088, and 6,766,188, which describe a method and device for measuring tissue pH via near infrared spectroscopy (NIRS), and which are all incorporated herein by reference in their entirety for all purposes.
  • NIRS technology allows the simultaneous measurement of tissue PO2, PCO2, and pH.
  • Tissue CO2 and pH are particularly helpful in monitoring in the trauma DTP.
  • Physiological parameters on display for the trauma DTP may be one or more of: invasive and non-invasive blood pressure, tissue CO2 and pH, ECG, SpO2 trending, and heart rate variability risk index.
  • the ECG may be analyzed to determine the interval between adjacent R-waves of the QRS complexes, and this interval may be used to calculate heart rate variability as a running difference between adjacent R-R intervals. It is known to those skilled in the art that an abrupt reduction in variability will often precede, by many minutes, a precipitous decline in a patient's blood pressure (traumatic arrest). By monitoring the trend in heart rate variability, the traumatic arrest can be anticipated and prevented.
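The R-R variability trend described above might be computed as a running statistic over successive R-R differences, flagging an abrupt drop. The window length and drop ratio below are assumptions of the sketch, not parameters from the patent.

```python
def rr_intervals(r_wave_times_s):
    """Intervals between adjacent R-waves, in seconds."""
    return [b - a for a, b in zip(r_wave_times_s, r_wave_times_s[1:])]

def rmssd(rr_s):
    """Root mean square of successive R-R differences, a common variability measure."""
    diffs = [b - a for a, b in zip(rr_s, rr_s[1:])]
    if not diffs:
        return 0.0
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

def variability_dropped(rr_s, window=30, drop_ratio=0.4):
    """Flag an abrupt reduction: recent-window RMSSD falls well below the prior window's."""
    if len(rr_s) < 2 * window:
        return False
    recent = rmssd(rr_s[-window:])
    baseline = rmssd(rr_s[-2 * window:-window])
    return baseline > 0 and recent < drop_ratio * baseline
```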
  • Another sensor of use for the trauma DTP is ultrasound, according to embodiments of the present invention.
  • the caregiver selecting elements of the flowchart results in the ultrasound sensor being activated and images presented on the computer tablet. Additional instructions can be requested from the interface on either the computer tablet and/or the defibrillator. Based on the selections and instructions, the settings of the ultrasound can be adjusted to deliver the optimal images, according to embodiments of the present invention.
  • the differential diagnosis/decision support system may be configured to support decision making and diagnosis with respect to other DTPs, and may be configured to display and support various combinations of one or more DTPs, from among the five shown in FIG. 3 and others.
  • each user may configure the decision support system to use a customized DTP for each DTP option; for example, the user may replace the default series of questions/steps/readings for the Trauma DTP with a new series of questions/steps/readings based on caregiver-specific, patient-specific, geography-specific, and/or regulation-specific treatment protocols.
  • the decision support system operates to guide decision making and diagnosis for a caregiver in a way that accommodates various kinds of DTPs.
  • the decision support system may be configured to guide a user through a decision and treatment pathway similar to that shown in FIGS. 13-25 .
  • the user would then be presented with a series of further options, such as “amputation injury,” “bleeding control,” “burns,” and the like. Selecting one of these further options would then cause the decision support system to enter and display the particular pathway or pathways relevant to the selected option.
  • the decision support system may be embodied in a user interface device, independent of a medical device or of one or more sensors, in a way that simply guides the caregiver through a series of decisions according to a pre-established flow chart.
  • a graphical DTP may be included in a defibrillator device as a reference document, electronically navigable.
  • the decision support system is informed by a combination of caregiver observations, patient information, and/or sensor data. Assessment and/or scoring may be performed, either by receiving data from the caregiver, or receiving data from sensors, or both. For example, for a trauma DTP, the decision support system may take into account pulse rate, breathing data, qualitative breathing data, blood loss, blood pressure, presence of broken limbs, and/or compound fractures. Or, in a cardiac distress DTP, the decision support system may be configured to display a cardiac arrest probability at a moment in time, which may be calculated and/or predicted by the decision support system based on selected criteria. The decision support system may also be configured to track certain criteria in order to suggest treatment outcome probabilities, for example suggesting the treatment pathway with the highest or a high perceived probability of success.
  • a monitor, or a defibrillator/monitor combination, or other similar device may be configured to provide a graphical tool to configure the monitor to follow recognized rescue protocols, for example one or more of the protocols described and/or shown herein.
  • a tool may be included on the monitor or defibrillator device, on a tablet or handheld or other computing device, and/or on both, according to embodiments of the present invention.
  • Such a tool may be provided in a graphical interface, for example a flowchart. The tool allows the user to configure the patient monitor to follow a particular rescue protocol, for example by visually presenting a flow chart for the protocol and allowing the user to customize the protocol.
  • the length of the CPR period may be configured by the user to customize the treatment protocol.
  • a tool may also permit the downloading and uploading of customized treatment protocols to and/or from a monitoring device, which may also permit the same customized protocol settings to be carried on a mobile device and/or transferred or uploaded to multiple other devices in different locations and/or at different times, according to embodiments of the present invention.
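Transferring a customized protocol between devices could be as simple as serializing its configurable parameters (such as the CPR period length mentioned above) to a portable file. The JSON representation and field names below are assumptions of this sketch.

```python
import json

def save_protocol(path, name, cpr_period_s, steps):
    """Write a customized treatment protocol so it can be uploaded to other monitors."""
    with open(path, "w") as f:
        json.dump({"name": name, "cpr_period_s": cpr_period_s, "steps": steps}, f, indent=2)

def load_protocol(path):
    """Read a protocol previously downloaded from a monitor or carried on a mobile device."""
    with open(path) as f:
        return json.load(f)

# Example: customize the CPR period and carry the protocol to another device.
save_protocol("cardiac_arrest_protocol.json", "Cardiac Arrest", 120,
              ["CPR", "Rhythm analysis", "Shock if advised"])
```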
  • FIG. 26 illustrates a clinical decision support system 2600, according to embodiments of the present invention.
  • System 2600 includes a processor 150 which is communicably coupled to a database 152, a decision support module 153, a display 156, and a patient monitor and/or defibrillator 154, which may itself be communicably coupled to another display module 155, according to embodiments of the present invention.
  • Some or all of the elements shown in FIG. 26 may be part of, or implemented by, one or more computer systems as illustrated in FIG. 27 .
  • FIG. 27 is an example of a computer or computing device system 200 with which embodiments of the present invention may be utilized.
  • defibrillator 154 and/or the tablet shown in FIG. 11 may be or incorporate a computer system 200 , according to embodiments of the present invention.
  • the computer system includes a bus 201, at least one processor 202, at least one communication port 203, a main memory 208, a removable storage media 205, a read only memory 206, and a mass storage 207.
  • Processor(s) 202 can be any known processor, such as, but not limited to, an Intel® Itanium® or Itanium 2® processor(s), or AMD® Opteron® or Athlon MP® processor(s), or Motorola® lines of processors, or any known microprocessor or processor for a mobile device, such as, but not limited to, ARM, Intel Pentium Mobile, Intel Core i5 Mobile, AMD A6 Series, AMD Phenom II Quad Core Mobile, or like devices.
  • Communication port(s) 203 can be any of an RS-232 port for use with a modem based dialup connection, a copper or fiber 10/100/1000 Ethernet port, or a Bluetooth® or WiFi interface, for example.
  • Communication port(s) 203 may be chosen depending on a network such as a Local Area Network (LAN), Wide Area Network (WAN), or any network to which the computer system 200 connects.
  • Main memory 208 can be Random Access Memory (RAM), or any other dynamic storage device(s) commonly known to one of ordinary skill in the art.
  • Read only memory 206 can be any static storage device(s) such as Programmable Read Only Memory (PROM) chips for storing static information such as instructions for processor 202 , for example.
  • Mass storage 207 can be used to store information and instructions.
  • flash memory or other storage media may be used, including removable or dedicated memory in a mobile or portable device, according to embodiments of the present invention.
  • hard disks such as the Adaptec® family of SCSI drives, an optical disc, an array of disks such as RAID (e.g. the Adaptec family of RAID drives), or any other mass storage devices may be used.
  • Bus 201 communicably couples processor(s) 202 with the other memory, storage and communication blocks.
  • Bus 201 can be a PCI/PCI-X or SCSI based system bus depending on the storage devices used, for example.
  • Removable storage media 205 can be any kind of external hard-drives, floppy drives, flash drives, zip drives, compact disc-read only memory (CD-ROM), compact disc-re-writable (CD-RW), or digital video disk-read only memory (DVD-ROM), for example.
  • the decision support module 153 may be a clinical support and/or differential diagnosis and/or treatment protocol as described herein. Based on information about the patient received from monitor 154 , the decision support module 153 determines and/or shows to the user a set or array of next available options in the decision tree. Alternatively, the decision support module 153 may be configured to calculate probabilities or other statistics based on decision support trees, algorithms, and/or historical data.
  • embodiments of the present invention include a separate display 156 which is available to the user or to someone other than the user in order to view information about a particular decision support process being implemented by the processor 150 and, optionally, by the patient monitor 154 .
  • when a user decides to implement a decision support process, a selection may be made on the user interface screen operated by the display module 155, and/or may be made on the user interface operated by display module 156. This then prompts the processor 150 to access a clinical decision support process via decision support module 153.
  • Decision support module 153 may include logic to guide the user through the various nodes and/or branches of a clinical decision support process, for example those shown in FIGS. 5-8B and 12-25.
  • the display module 155 operates the display screen of a monitor/defibrillator as shown in FIG. 11, while the display module 156 operates a tablet computer screen.
  • Such a tablet computing device may be communicably coupled to the processor 150 (whether such processor is located in the monitor/defibrillator or the tablet computing device) by docking it into a communications dock on the monitor/defibrillator as shown in FIG. 11 , and/or may be communicably coupled to the processor 150 wirelessly.
  • patient monitor 154 may include its own processor, and tasks described as performed by processor 150 may be distributed across one or multiple processors and/or physical devices.
  • FIG. 28 illustrates one example of a decision support tree that may be shown to a user on an auxiliary screen (operated by module 156 ) during a medical event, to guide the user through a treatment protocol or pre-diagnosis of the patient.
  • the decision support module 153 may be navigated through the various decision points (e.g. “nodes”) either by manual selection of the next available option or branch, or by complete or partial automatic selection of the next available option or branch based upon patient data collected during the medical event, for example physiological data collected by the patient monitor/defibrillator 154 that is connected to the patient, or by a combination of these two processes.
  • a process that is wholly or partially automatic may also be configured to prompt a user for confirmation before moving to a subsequent or previous node, according to embodiments of the present invention.
  • FIG. 28 illustrates a user interface display of a clinical decision support tree, according to embodiments of the present invention. This decision support tree begins at block 2, and the first decision is between blocks 4 and 24. If block 4 is selected, the next decision is between blocks 6 and 8. If block 6 is selected, the next decision is between blocks 10 and 12.
  • although a limited number of branches or decisions are shown, one of ordinary skill in the art will appreciate, based on the disclosure provided herein, that any number of branches or decision options may be provided to extend from a particular node, and that such branches could overlap and/or loop back to a previous node, according to embodiments of the present invention.
  • the remaining blocks 14, 16, 18, 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40, and 42 may function in a similar manner.
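The tree of FIGS. 28-30 can be modeled as nodes that each hold an arbitrary number of branches, including branches that loop back to earlier nodes. The block numbers below follow the figure description; the class and its API are assumptions of the sketch.

```python
class SupportNode:
    """One decision point (node/block) in a clinical decision support tree."""

    def __init__(self, block_id, label=""):
        self.block_id = block_id
        self.label = label
        self.branches = []  # any number of next options; may reference earlier nodes

    def add_branch(self, node):
        self.branches.append(node)
        return node

# First decisions of the FIG. 28 tree: block 2 branches to 4 or 24,
# block 4 to 6 or 8, and block 6 to 10 or 12.
root = SupportNode(2)
b4, b24 = root.add_branch(SupportNode(4)), root.add_branch(SupportNode(24))
b6, b8 = b4.add_branch(SupportNode(6)), b4.add_branch(SupportNode(8))
b10, b12 = b6.add_branch(SupportNode(10)), b6.add_branch(SupportNode(12))
```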
  • FIG. 29 illustrates one example of the user interface display of FIG. 28 with a portion of the clinical decision support tree resized, according to embodiments of the present invention.
  • the resizing may occur by making block 24 smaller, or by making block 4 larger, or both.
  • only the subsequent sets of blocks or nodes are resized, rather than the rest of the branches or nodes depending from the immediately subsequent nodes.
  • Each node may be represented by a shape, and the entire border of the shape may be resized in order to indicate a non-chosen or less-probable node.
  • the size of the node may remain the same but the text inside the node may be resized.
  • the size of the node may remain the same, but the color or transparency of the non-chosen or less-probable nodes may be changed, for example “grayed out” for the less important nodes and turned to a bolder color or flashing color for the more important nodes.
  • a combination of these and other visual indication features may be employed to assist the user in visually navigating through the decision support process in real time, during the medical event.
  • FIG. 29 also illustrates how a screen border can be recentered or moved dynamically to correspond with movement through the tree.
  • screen border 50 is initially centered (either vertically or horizontally or both) on block 2, and as soon as block 4 is selected, or becomes a more likely or recommended selection, the screen border 50 shifts along the direction indicated by arrow 52 to new screen border position 50′, which is now centered on block 4.
  • FIG. 30 illustrates a similar resizing feature as it might be displayed after block 6 is selected over block 8 .
  • the decision support module 153 may also be configured to transition between differential diagnosis and treatment protocols; for example, as a likely diagnosis is approached by a clinical support module, the user may be prompted to select or begin a treatment protocol consistent with one or more likely diagnoses or pre-diagnoses. As another example, one or more treatment protocol trees may be presented at the end of a differential diagnosis or clinical decision support tree, in order to guide the user through the recommended treatment protocol once the decision support module 153 has helped the user identify the condition that requires treatment.
  • the patient monitor/defibrillator device 154 may also be configured for several different care modes, and may be configured to enter the most likely or most relevant care mode based on the user's navigation of the clinical decision support process, for example on auxiliary display 156 , and to change between two or more care modes as appropriate as the user navigates the clinical decision support tree, according to embodiments of the present invention.
  • FIG. 31 illustrates a user interface display with dynamic softkeys, according to embodiments of the present invention.
  • just as the nodes on a decision support tree display may be dynamically visually adjusted to help the user in navigating the process, so too the selection options on a patient monitoring or treatment device 154 may be dynamically adjusted to guide the user through a particular clinical decision support process.
  • FIG. 31 shows a housing of a patient monitor/defibrillator 54, which may include a screen 55 (for example operated by display module 155 of FIG. 26), and which may include a number of physical user input devices 56, 58, 60, 62, 64, which may be for example buttons.
  • the screen 55 may be configured to display a user interface as shown, which may include one or more softkeys 66, 68, 70, 72, 74, with one or more of the softkeys 66-74 corresponding to one or more of the buttons 56-64.
  • more or fewer buttons and/or softkeys may be used, and the positioning of the buttons and/or softkeys may vary across different units, models, or designs.
  • the buttons may alternatively or additionally extend vertically across one side of the screen 55 .
  • the softkeys 66-74 are part of the display screen and may be dynamically modified by the processor 150 and/or the patient monitor 154, such that the buttons 56-64 may be used by the user to select different options at different times. This allows the user to navigate through various menus with a single row of buttons.
  • the device does not include any physical buttons, and instead uses only softkeys on the display screen 55 that are themselves selectable (e.g. via a touchscreen arrangement).
  • the term “softkey” is used herein in its broadest sense to refer to any combination of physical and virtual buttons that may be used by a user to select from one or more options.
  • the softkeys 66-74 may be dynamically adjusted to assist the user in navigating a decision support process. Based on the disclosure provided herein, one of ordinary skill in the art will recognize numerous different menus or clinical decision support processes that may benefit from such dynamically adjusting softkeys. Just a few particular examples are shown in FIGS. 32 and 33. For example, if a user selected the "acute care diagnose" button or softkey from the user interface display of FIG. 3, the user could be taken to the screen of FIG. 31 with dynamic softkeys 66-74. Such softkeys may initially look very similar to those of FIGS.
  • if the patient monitor/defibrillator observes a cardiac arrhythmia based on the patient's simultaneously observed ECG waveform, the display module 155 visually emphasizes or visually distinguishes the Cardiac Distress softkey 70 over the other simultaneously displayed softkeys, as shown in FIG. 32.
  • the Cardiac Distress softkey 70 may be changed in color or boldness.
  • the softkey 70 may include a displayed geometric shape, and such shape may be changed, or its perimeter may be made bolder or more visually distinct.
  • the text within the softkey 70 may be enlarged or emboldened or italicized in order to visually distinguish softkey 70 based on the physiological data.
  • the user interface displayed on the screen 55, and/or the screen display of an accompanying tablet device, includes one or more legends for visually indicating to the user why one or more softkeys have been emphasized or highlighted.
  • the display module 155 resizes the Cardiac Distress softkey 70 by making it larger, or by making the other softkeys smaller, as shown in FIG. 33 .
  • although FIGS. 32 and 33 illustrate only one softkey 70 being emphasized and/or resized based on available patient data, the display module 155 may further be configured to dynamically emphasize and/or resize more than one softkey, in more than one way, according to embodiments of the present invention.
  • both the Cardiac Distress softkey 70 and the Respiratory Distress softkey 66 may be visually emphasized or resized with respect to the other softkeys, and may also be visually emphasized or resized with respect to each other depending upon the relative significance of each possible diagnosis or treatment protocol.
  • the cardiac distress softkey 70 may be the largest or most emphasized softkey, while the respiratory distress softkey 66 may be the next largest or next most emphasized softkey, followed by the remaining softkeys.
  • the softkeys 66 - 74 may be configured to dynamically update to reflect the next decision/step or set of decisions/steps. The dynamic resizing and/or emphasizing of various softkeys conveys a greater level of helpful decision support to the user, without sacrificing the user's ability to select even one of the softkeys that is not enlarged or emphasized, according to embodiments of the present invention.
  • the dynamic adjustment of visual characteristics of softkeys has been described with respect to observed physiological data about the patient, such dynamic adjustment may alternatively or additionally be accomplished using patient charting data or other patient data entered manually or automatically. For example, if the patient's chart at the beginning of the medical event indicates that the patient was involved in an automobile accident, the Trauma softkey 72 may be configured for initial enlargement and/or emphasis as soon as the user selects the “acute care diagnose” function from the interface of FIG. 2 , according to embodiments of the present invention.
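  • The dynamic emphasis behavior described above can be illustrated with a brief sketch. The Python code below is only a hypothetical illustration; the class and function names (Softkey, emphasize) and the scoring rules are assumptions, not taken from the disclosure. It shows one way a display layer might rank softkeys from observed physiological data and charting data and mark the most relevant keys for enlargement and bolding, while leaving every softkey selectable.

```python
from dataclasses import dataclass

@dataclass
class Softkey:
    label: str
    scale: float = 1.0   # relative on-screen size
    bold: bool = False   # bolder border/text when emphasized

def emphasize(softkeys, observations):
    # Hypothetical scoring: rank each softkey by how strongly the available
    # patient data supports its diagnosis/treatment pathway.
    relevance = {key.label: 0 for key in softkeys}
    if observations.get("ecg_arrhythmia"):                 # e.g. arrhythmia seen on the ECG waveform
        relevance["Cardiac Distress"] += 2
    if observations.get("low_spo2"):                       # e.g. low pulse oximetry reading
        relevance["Respiratory Distress"] += 1
    if observations.get("chart_motor_vehicle_accident"):   # e.g. charting data at event start
        relevance["Trauma"] += 2
    for key in softkeys:
        score = relevance.get(key.label, 0)
        key.scale = 1.0 + 0.25 * score   # most relevant keys drawn larger
        key.bold = score > 0             # and visually distinguished
    return sorted(softkeys, key=lambda k: -k.scale)

keys = [Softkey(label) for label in ["Respiratory Distress", "Altered Mental Status",
                                     "Cardiac Distress", "Trauma",
                                     "Pain/Abnormal Nerve Sensation"]]
for key in emphasize(keys, {"ecg_arrhythmia": True, "low_spo2": True}):
    print(key.label, key.scale, key.bold)
```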
  • FIG. 34 illustrates a code review interface for reviewing user interface display data corresponding to a medical event, according to embodiments of the present invention.
  • the code review interface includes a user interface replicator 455 as well as a visual timeline indicator 300 .
  • the user of the patient monitor/defibrillator 154 takes the display screen 55 of the monitor 154 through various steps and user interface modes. It is often helpful, after the medical event has occurred, for the user, as well as for someone who is reviewing or critiquing the performance of the user, to be able to know what happened during the medical event and when such events occurred. Such information is particularly helpful in the time leading up to or following a significant patient event, in order to determine the appropriateness or effectiveness of the particular treatment applied.
  • the processor 150 may be configured to capture visual representations (e.g. “snapshots” in time) of some or all of the user interface screen 55 and store them for later review, for example in database 152 , according to embodiments of the present invention. Such review may be accomplished in the form of a playback interface as shown in FIG. 34 . Such snapshots of the user interface 55 may be recorded at least once each second, twice each second, or more times each second, at regular or irregular intervals, according to embodiments of the present invention. In some embodiments, the snapshots may be made frequently enough (e.g. at the data sample rate of 500 snapshots per second) to provide full fidelity playback of the event.
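  • As a minimal sketch of the snapshot capture described above (the ScreenRecorder name, the grab_frame callable, and the store interface are hypothetical assumptions, not part of the disclosure), the processor could periodically copy the rendered screen image together with a timestamp into a store such as database 152 for later playback:

```python
import threading
import time

class ScreenRecorder:
    """Hypothetical sketch: capture timestamped copies of the rendered
    user interface at a fixed interval for later code review."""
    def __init__(self, grab_frame, store, interval_s=1.0):
        self.grab_frame = grab_frame   # callable returning the current screen image (e.g. bytes)
        self.store = store             # e.g. a list or a wrapper around database 152
        self.interval_s = interval_s   # at least one snapshot per second

    def run(self, stop_event: threading.Event):
        next_due = time.monotonic()
        while not stop_event.is_set():
            self.store.append((time.time(), self.grab_frame()))
            next_due += self.interval_s
            time.sleep(max(0.0, next_due - time.monotonic()))

# Example usage with a dummy frame grabber:
# recorder = ScreenRecorder(grab_frame=lambda: b"frame", store=[], interval_s=1.0)
# stop = threading.Event(); threading.Thread(target=recorder.run, args=(stop,)).start()
```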
  • the interface replicator 455 and visual timeline indicator 300 may be configured to play back the screen user interface appearance at the same rate at which the images were taken or captured, and dynamically move the position of the timeline indicator 308 along the timeline 300 from the beginning time indicator 304 to the ending time indicator 306 , according to embodiments of the present invention.
  • a current position indicator 302 indicates the time, for example in hour:minute:second format, at which the particular user interface screen shot shown in the user interface replicator 455 was taken (or at which such a user interface was displayed during the medical event). As such, a person reviewing the progression of the screen interface 55 sees the screen interface 55 in the user interface replicator 455 just as it would have been seen by the user of the device at the time of the medical event, according to embodiments of the present invention.
  • the visual timeline indicator 300 may also include visual event indicators, such as drug administration visual event indicator 310 and patient defibrillation visual event indicator 312 .
  • Other visual event indicators may include, for example, the occurrence of an alarm, the time at which a blood pressure measurement or signal was acquired (which may be helpful for documenting at the end of a medical event), event markers, clinical decision tree points, the time at which spontaneous circulation returned (“ROSC”), and/or the time at which a “rearrest” softkey was pressed or at which a renewed or subsequent cardiac arrest condition was observed.
  • Visual event indicator 310 indicates the time during the medical event (e.g. on the timeline) at which a drug was administered to the patient.
  • Visual event indicator 312 indicates the time during the medical event (e.g. on the timeline) at which a defibrillation treatment was applied to the patient, according to embodiments of the present invention. Fewer or more of the same or additional visual event indicators may be used in the visual timeline indicator 300 , in order to signal to the reviewer the times at which significant events of interest occurred during the medical event. This then permits the reviewer to skip directly to the user interface time intervals of interest, rather than reviewing all user interface screen shots sequentially, according to embodiments of the present invention.
  • the user may select timeline indicator 308 with a cursor or other selection process, and drag it left or right on the timeline before releasing it to resume playback at the time corresponding to the new location of the indicator 308 , according to embodiments of the present invention.
  • the user may move the indicator 308 and thus the playback to the time of visual event indicator 310 (or to a time that is a predetermined interval before the time of visual event indicator 310 ) by simply clicking on visual event indicator 310 .
  • the interface of FIG. 34 may further include a current position indicator 302 , which displays a time corresponding to the position of the indicator 308 along the timeline 300 and corresponding to the image displayed in the user interface replicator 455 , according to embodiments of the present invention. While FIG. 34 illustrates a substantially linear timeline, other non-linear timeline indicators may be used.
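  • A short sketch of the timeline navigation described above (class and method names are hypothetical assumptions): given the recorded snapshots sorted by time, a binary search can return the image shown at any scrubbed position, and the same lookup could serve the event-marker jumps and the display preview pop-up window 332 described below.

```python
import bisect

class PlaybackTimeline:
    """Hypothetical sketch: map a time on the visual timeline to the recorded
    snapshot nearest that time, and jump playback to event markers."""
    def __init__(self, snapshots, event_markers):
        self.snapshots = sorted(snapshots)            # list of (timestamp, image)
        self.times = [t for t, _ in self.snapshots]
        self.event_markers = event_markers            # e.g. {"drug": t1, "defib": t2, "ROSC": t3}

    def frame_at(self, t):
        """Return the snapshot that was on screen at (or just before) time t."""
        i = bisect.bisect_right(self.times, t) - 1
        return self.snapshots[max(i, 0)]

    def jump_to_event(self, name, lead_in_s=5.0):
        """Resume playback a short interval before the marked event."""
        return self.frame_at(self.event_markers[name] - lead_in_s)

timeline = PlaybackTimeline([(0.0, "img0"), (1.0, "img1"), (2.0, "img2"), (10.0, "img3")],
                            {"drug": 2.0, "defib": 10.0})
print(timeline.frame_at(1.4))           # -> (1.0, 'img1')
print(timeline.jump_to_event("defib"))  # -> (2.0, 'img2'), five seconds before the shock
```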
  • the code review interface of FIG. 34 may also be particularly helpful in reviewing the recorded screen images for dynamic softkey adjustments, as described with respect to FIGS. 32 and 33.
  • for example, the code reviewer could set the indicator 308 to the time at which such a softkey was displayed to see whether the particular softkey was resized or emphasized in order to indicate that it was the preferred course of action.
  • Reviewers using the interface of FIG. 34 are also able to see what exactly was on the user's screen when certain actions were undertaken, for example what the user looked at just prior to the drug administration event 310 , according to embodiments of the present invention.
  • the interface of FIG. 34 operates in a manner similar to that of digital video recorder playback.
  • the interface may include a media navigation interface including media navigation bar 314 , volume selection bar 314 , and/or playback speed selection bar 316 .
  • the media navigation bar 314 may include screen controls similar to those used with playback of movies, to control the content of the user interface replicator.
  • the media navigation bar 314 may include a play button 322 , a stop button 324 , a pause button 326 , a rewind button 320 , and a fast forward button 328 .
  • a skip back button 314 and skip forward button 330 may also be included, for example to skip between medical events, chapters, and/or visual event indicators, according to embodiments of the present invention.
  • “button” is used to refer to either or both of a physical button or a virtual/screen selection interface option.
  • the playback speed selection bar 316 may also be configured to visually indicate which of the playback speed selections is currently active. Other or additional speed selections may be provided.
  • Clicking on or otherwise selecting volume selection bar 314 permits adjustment of any audio playback volume (e.g. when audio data from the medical event is played back simultaneously with, or instead of, the visual data).
  • the on-screen cursor 334 may take the form of a hand with a pointed finger.
  • when the cursor 334 is hovered over or near a location on the visual timeline indicator 300, a display preview pop-up window 332 opens, for example attached to or in the vicinity of the finger or cursor 334.
  • the display preview window 332 may show, for example, a physiologic waveform along with static measurements and time and events in sufficient detail for the user to determine whether to select that particular timeline location for current playback, according to embodiments of the present invention.
  • the display preview window 332 includes the physiologic waveform and measurements/events portion 336 , as well as a time indicator portion 338 indicating where, along the visual timeline indicator 300 , the cursor 334 has been placed, according to embodiments of the present invention. According to some embodiments of the present invention, selecting and “holding” the selection on the timeline indicator 308 and scrolling forward and backward along the timeline 300 causes a similar display preview window 332 to pop up at or near the slider 308 .
  • the user can play back the clinical decision support tree for reviewing the medical event.
  • a tablet screen, or a screen controlled by display module 156 or alternatively an interface similar to that of FIG. 34 , could be configured to indicate a timeline and display the user's progression through a clinical decision support tree by highlighting each node through which the process was taken, and the time at which such node selection was made.
  • a representation of the clinical decision support tree is itself used as a visual timeline indicator, permitting a user to select a node in order to see, in the user interface replicator 455 , what the defibrillator/monitor 154 screen 55 looked like at the time or times when the user was at the selected step in the decision support process.
  • the display module 156 and processor 150 may be communicably coupled bi-directionally with the defibrillator/monitor 154, and the defibrillator/monitor 154 screen 55 itself may be used as (for example instead of) the user interface replicator 455.
  • the tablet computer screen or other display device operated by display module 156 may be configured to show a user-selectable list of event markers which, when selected by the user, replicates the monitor's 154 display at the time of the marked event, either using display module 155 or user interface replicator 455 , according to embodiments of the present invention.
  • the following list of event markers could be displayed on a tablet computing device communicably coupled to the defibrillator/monitor 154 :
  • While FIG. 34 depicts a user interface replicator 455, other replicators may be used to display or play back other observed parameters that occurred over the course of a medical event; for example, graphs, trends, and/or charts representing patient information or physiological status.
  • Such an ability to quickly and efficiently review patient data for a medical event or portions thereof may be helpful not only for a subsequent reviewer, but may also be helpful for the user during the medical event, and/or for a subsequent user during the medical event, for example when a patient is transferred from a Basic Life Support crew to an Advanced Life Support crew.
  • the interface of FIG. 34 may permit review of the patient's care report, ECG or 12-lead waveforms, cardiopulmonary resuscitation quality, and other patient care information or data.
  • Event markers may be used as described above.
  • For example, an event marker may be used to indicate that the patient was administered a bronchodilator medication, and the code review interface may be used to look at the patient's respiratory status before and after the application of the bronchodilator. This permits the same user, or a subsequent user for the same patient, or a subsequent reviewer, to observe how effective the bronchodilator dosage was, and perhaps to factor such information into a decision to again administer the same or another treatment.
  • the interface of FIG. 34 or a similar interface may be used to review how the patient's carbon dioxide waveform changes upon patient treatment. “Snapshots” may be recorded and played back through a similar interface for other patient data, for example the data from a ventilation monitoring device (e.g. minute ventilation).
  • alarm thresholds, as well as the frequency of automated measurements (for example, blood pressure), may be dynamically adjusted based on patient physiological data and/or charting data.
  • the monitor 154 may be configured to automatically obtain vital signs (e.g. blood pressure, SpO 2 , heart rate, and respiratory rate) every five minutes. For other, less critical conditions, these vital signs may only need to be taken twice during the entire patient event.
  • automatic blood pressure measurements may be disabled when treating a cardiac patient, and then re-enabled once the patient achieves return of spontaneous circulation.
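  • As a rough sketch of the measurement-frequency behavior described above (the state names and the returned interval values other than the five-minute example are assumptions, not from the disclosure):

```python
def automatic_vitals_interval(patient_state):
    """Hypothetical sketch: choose how often the monitor should automatically
    acquire vital signs (blood pressure, SpO2, heart rate, respiratory rate)
    based on the patient state reported by the decision support process."""
    if patient_state == "cardiac_arrest":
        return None                    # suspend automatic cuff measurements during the arrest
    if patient_state in ("critical", "traumatic_brain_injury", "post_rosc"):
        return 5 * 60                  # every five minutes, in seconds
    return 30 * 60                     # assumption: infrequent checks for low-acuity patients

for state in ("critical", "cardiac_arrest", "post_rosc", "stable"):
    print(state, automatic_vitals_interval(state))
```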
  • a systolic blood pressure (“SBP”) alarm may be configured on the monitor 154 to alert the user with an alarm if an adult's SBP is less than 90 mmHg, with a ventilation rate target of 10 breaths per minute, and/or the end tidal carbon dioxide is less than 35 mmHg.
  • These targets may need to be adjusted based on a patient's age; for example, for a three-year-old, a systolic blood pressure alarm may be set to activate with an SBP of less than 76 mmHg and/or a ventilation rate target of twenty breaths per minute.
  • a systolic blood pressure alarm may be set to activate with an SBP of less than 72 mmHg, and/or a ventilation rate target of twenty-five breaths per minute.
  • the processor 150 is configured to automatically adjust the thresholds based on the patient's age in a traumatic brain injury situation (for example, based on user input), rather than requiring the user to manually reconfigure the alarm thresholds based on age.
  • the processor 150 may obtain the patient's age from database 152 , and/or from a patient charting system to which it is communicably coupled, and use the patient's age to automatically reconfigure the alarm thresholds upon an indication, either via a softkey selection or from the decision support module 153 , that a traumatic brain injury situation applies.
  • the clinical decision support tree for traumatic brain injury may, at the appropriate node in the process, request the user to select from various age groupings, and use the user's selection from the decision support tree to automatically adjust the alarm thresholds.
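  • The age-based threshold selection described above could look roughly like the sketch below. The SBP thresholds and ventilation rate targets for an adult and for a three-year-old are taken from the examples in this disclosure; the exact age boundaries, and the assignment of the 72 mmHg / 25 breaths-per-minute band to the youngest patients, are assumptions made only for illustration.

```python
# (minimum age in years, SBP alarm threshold in mmHg, ventilation rate target per minute)
TBI_TARGETS = [
    (18, 90, 10),   # adult example from the disclosure
    (3,  76, 20),   # three-year-old example from the disclosure
    (0,  72, 25),   # assumed band for younger patients
]

def tbi_alarm_targets(age_years):
    """Hypothetical sketch: pick the SBP alarm threshold and ventilation rate
    target for a traumatic brain injury patient from the patient's age."""
    for min_age, sbp_low_mmHg, vent_rate in TBI_TARGETS:
        if age_years >= min_age:
            return {"sbp_alarm_below_mmHg": sbp_low_mmHg,
                    "ventilation_rate_target_per_min": vent_rate}

print(tbi_alarm_targets(45))   # adult thresholds
print(tbi_alarm_targets(3))    # pediatric thresholds
```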
  • the processor 150 may also be configured to silence all alarms upon a determination that the patient has entered cardiac arrest, and then re-enable all alarms upon a determination that the patient has achieved a return of spontaneous circulation.
  • the processor 150 may be configured to, after a cardiac arrest event for an adult, reset the alarm thresholds to end tidal carbon dioxide less than 30 mmHg (possibly lower for a traumatic brain injury situation) or heart rate less than 40 beats per minute. While alarm and other thresholds are discussed as being adjustable in traumatic brain injury medical events, alarms and other thresholds may also be dynamically adjusted for other patient events or conditions, according to embodiments of the present invention.
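  • A compact sketch of the arrest/ROSC alarm handling just described (class and method names are hypothetical; the lower end tidal threshold for traumatic brain injury is marked as an assumption, since the disclosure says only "possibly lower"):

```python
class AlarmManager:
    """Hypothetical sketch of silencing alarms during cardiac arrest and
    resetting thresholds when spontaneous circulation returns."""
    def __init__(self):
        self.silenced = False
        self.thresholds = {}

    def on_cardiac_arrest(self):
        self.silenced = True                       # silence all alarms during the arrest

    def on_rosc(self, adult=True, traumatic_brain_injury=False):
        self.silenced = False                      # re-enable alarms at return of spontaneous circulation
        if adult:
            etco2_low_mmHg = 30                    # alarm if end tidal CO2 falls below 30 mmHg
            if traumatic_brain_injury:
                etco2_low_mmHg = 25                # assumption: "possibly lower" for TBI
            self.thresholds = {"etco2_below_mmHg": etco2_low_mmHg,
                               "heart_rate_below_bpm": 40}

alarms = AlarmManager()
alarms.on_cardiac_arrest()
alarms.on_rosc(adult=True)
print(alarms.silenced, alarms.thresholds)
```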
  • system 2600 is used to assist clinicians with delivering medications of the appropriate dose. Medication errors can cause significant problems, particularly for the treatment of pediatric patients and when drugs are substituted.
  • the decision support module 153 displays assistance for physicians to comply with a protocol, for example a protocol related to drug delivery.
  • a medical director may provide drug options for treatment of a particular condition. The dosing of the drug would be determined based on patient age, body weight, Broselow measurement, and/or medical complaint. At any time, the medical director may change the drug options and dosing based on factors such as the availability of the drug. A physician might even change the recommendations in real-time or in clinical time if remotely monitoring the treatment.
  • the system may include protections and/or safeguards to ensure that the information is correctly entered into the database 152 about the drug and/or the patient, in order to permit the decision support module 153 to accurately guide the caregiver in dosing according to the current drug delivery protocol, according to embodiments of the present invention.
  • As shown in FIG. 35, when part of a process, for example a clinical decision support tree, flows through arrow 401 and into decision point 400, system 2600 may assist the decision support module 153 in determining whether to select, suggest, and/or recommend node 402 or node 404, according to embodiments of the present invention. The process may then continue to the next node via arrow 403 or via arrow 405, according to embodiments of the present invention.
  • a drug delivery portion of a clinical decision support tree may, at node 400, have the decision support module 153 determine a correct dosing for the particular patient based upon observed and/or input patient characteristics (block 406), and/or may determine or suggest or recommend one of two or more doses, for example dose A (block 402) or dose B (block 404), according to embodiments of the present invention.
  • the observed and/or input patient characteristics (block 406 ) that might suggest different dosing include age, weight, allergies, and/or other conditions, according to embodiments of the present invention.
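  • A brief sketch of the dose selection at node 400 follows; the drug name, doses, and protocol table below are placeholders invented purely for illustration, not values from the disclosure and not clinical guidance.

```python
def recommend_dose(drug, age_years, weight_kg, allergies):
    """Hypothetical sketch: choose between dose A and dose B (or withhold the drug)
    from observed and/or entered patient characteristics, per the current
    drug delivery protocol loaded by the medical director."""
    protocol = {
        # drug: (adult fixed dose in mg, pediatric dose in mg/kg, contraindicating allergy)
        "example_drug": (300.0, 5.0, "example_drug_class"),
    }
    adult_dose_mg, peds_mg_per_kg, contraindication = protocol[drug]
    if contraindication in allergies:
        return None                                  # withhold and prompt for an alternative
    if age_years >= 18:
        return adult_dose_mg                         # "dose A": fixed adult dose
    return round(peds_mg_per_kg * weight_kg, 1)      # "dose B": weight-based pediatric dose

print(recommend_dose("example_drug", age_years=6, weight_kg=20.0, allergies=set()))
```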
  • FIG. 36 illustrates a user interface, for example a user interface for a patient monitor/defibrillator 154 or other device.
  • the user interface of FIG. 36 includes display portions which indicate trending information for various values.
  • the interface illustrates trending data for systolic blood pressure (referenced as 3600 ), end tidal carbon dioxide (EtCO2), and blood oxygen saturation (SpO2).
  • Trending data may be displayed as a running record of previous readings. The oldest readings may appear on the left, and the newest readings may appear on the right, and the newest reading may be inserted on the right side while displacing the oldest reading on the left side, according to embodiments of the present invention.
  • the oldest readings may appear on the right, and the newest readings may appear on the left, and the newest reading may be inserted on the left side while displacing the oldest reading on the right side.
  • Other options for visually indicating the trend data for a given signal may be employed.
  • the scaling of the trending readouts, and/or the frequency of the values displayed for the trending values, and/or a color in which the trending values are displayed may be customized according to the particular patient and/or the patient's condition. This may be done by the decision support module 153. For example, if as part of a decision support process the decision support module 153 receives information indicating a patient's age, then the processor may be configured to set the color in which each bar of the blood pressure trending graph 3600 is displayed.
  • the three bars on the left 3602 may be displayed as green to indicate that the patient's blood pressure at the times corresponding to those particular blood pressure measurements was within acceptable limits for the patient's age.
  • the middle five bars 3604 may be yellow to indicate that the patient's blood pressure at the times corresponding to those particular blood pressure measurements was below acceptable limits, but not yet at a critical level.
  • the right three bars 3606 may be red to indicate that the patient's blood pressure at the times corresponding to those particular blood pressure measurements was far below acceptable ranges, and was therefore at a critical level.
  • the trend graph 3600 for patient systolic blood pressure indicates that the patient's blood pressure is worsening over time by becoming lower.
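  • The green/yellow/red trend coloring described above might be sketched as follows; the age-based limits used here are placeholders for whatever limits the protocol or decision support module 153 supplies, not values from the disclosure.

```python
def color_sbp_trend(readings_mmHg, age_years):
    """Hypothetical sketch: classify each trended systolic blood pressure
    reading against age-based limits for green/yellow/red display."""
    # Placeholder limits; a real device would take these from its protocol data.
    acceptable_low = 90 if age_years >= 18 else 70 + 2 * age_years
    critical_low = acceptable_low - 20
    colors = []
    for sbp in readings_mmHg:
        if sbp >= acceptable_low:
            colors.append("green")    # within acceptable limits for the patient's age
        elif sbp > critical_low:
            colors.append("yellow")   # below acceptable limits, not yet critical
        else:
            colors.append("red")      # far below acceptable limits, critical
    return colors

print(color_sbp_trend([112, 104, 96, 88, 84, 80, 74, 66, 60], age_years=40))
```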
  • the clinician manually adjusts the target values of signals, which may be beneficial if the patient is “crashing,” for example.
  • the clinician could select ranges which correspond to conditions with a more realistic chance of being achieved, according to embodiments of the present invention.
  • the ETCO2 and/or ventilation rate targets may be changed in order to hyperventilate such patients so as to reduce intracranial pressure.
  • These ranges or targets may be adjusted automatically if, in the course of a decision support process, the decision support module 153 detects, either automatically, or via manual or clinical or other inputs, that the patient has or is about to experience cerebral herniation.
  • if the decision support module 153 detects that the ETCO2 is below a certain threshold, the target ventilation rate will be adjusted to lower the ventilation rate. If the decision support module 153 detects that the ETCO2 is above a certain threshold, the target ventilation rate will be adjusted to increase the ventilation rate, according to embodiments of the present invention.
  • Such adjusted ventilation rates may include an upper and/or lower limit to prevent other undesired results, because high or low ETCO2 readings may be caused by factors other than ventilation rate (e.g. a very low ETCO2 may be caused by poor perfusion).
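  • A minimal sketch of the ETCO2-driven ventilation rate adjustment with upper and lower limits; all thresholds, step sizes, and limits below are assumptions for illustration only.

```python
def adjust_ventilation_target(current_target, etco2_mmHg,
                              etco2_low=30, etco2_high=45,
                              min_rate=8, max_rate=30):
    """Hypothetical sketch: nudge the target ventilation rate based on the measured
    end tidal CO2, clamped to limits because EtCO2 can be driven by factors other
    than ventilation rate (for example, poor perfusion)."""
    if etco2_mmHg < etco2_low:
        current_target -= 2            # EtCO2 low: lower the ventilation rate
    elif etco2_mmHg > etco2_high:
        current_target += 2            # EtCO2 high: increase the ventilation rate
    return max(min_rate, min(max_rate, current_target))

print(adjust_ventilation_target(10, etco2_mmHg=22))   # -> 8
print(adjust_ventilation_target(10, etco2_mmHg=50))   # -> 12
```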

Abstract

Embodiments of the present invention include systems and methods for display and navigation of a clinical decision support process with portions thereof on separate display screens, as well as systems and methods for dynamically changing visual characteristics of softkeys on a patient monitor/defibrillator user interface screen based on clinical decision support or differential diagnosis processes, as well as a code review interface configured to permit a user to see what was displayed on a patient monitor/defibrillator user interface screen at any time during a medical event, as well as to see snapshots of other recorded parameters over the course of the medical event for purposes of code review, patient transfer, and improved patient care.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Patent Application Ser. No. 61/751,743, filed on Jan. 11, 2013, and of U.S. Provisional Patent Application Ser. No. 61/818,334, filed on May 1, 2013, both of which are incorporated herein by reference in their entireties for all purposes.
  • TECHNICAL FIELD
  • Embodiments of the present invention relate generally to tools for facilitating acute care treatment, and more specifically to systems and methods for clinical decision support and differential diagnosis.
  • BACKGROUND
  • In the pre-hospital and acute care treatment setting, medical responders often have difficulties in accurately determining the proper diagnosis of a particular patient. Even well-trained physicians often have difficulty under emergency conditions in which split second decisions are required with limited information. Computer-automated diagnosis was developed to improve the accuracy, effectiveness, and reliability of both field and hospital patient treatment.
  • Automated differential diagnosis utilizes computer inference algorithms such as Bayesian algorithms, neural networks, or genetic algorithms. According to a Wikipedia posting:
      • The Bayesian network is a knowledge-based graphical representation that shows a set of variables and their probabilistic relationships between diseases and symptoms. They are based on conditional probabilities, the probability of an event given the occurrence of another event, such as the interpretation of diagnostic tests. Bayes' rule helps us compute the probability of an event with the help of some more readily information and it consistently processes options as new evidence is presented. In the context of CDSS [(clinical decision support system)], the Bayesian network can be used to compute the probabilities of the presence of the possible diseases given their symptoms. Some of the advantages of Bayesian Network include the knowledge and conclusions of experts in the form of probabilities, assistance in decision making as new information is available and are based on unbiased probabilities that are applicable to many models. Some of the disadvantages of Bayesian Network include the difficulty to get the probability knowledge for possible diagnosis and not being practical for large complex systems given multiple symptoms. The Bayesian calculations on multiple simultaneous symptoms could be overwhelming for users. Example of a Bayesian network in the CDSS context is the Iliad system which makes use of Bayesian reasoning to calculate posterior probabilities of possible diagnoses depending on the symptoms provided. The system now covers about 1500 diagnoses based on thousands of findings. Another example is the DXplain system that uses a modified form of the Bayesian logic. This CDSS produces a list of ranked diagnoses associated with the symptoms.
      • Artificial Neural Networks (ANN) is a nonknowledge-based adaptive CDSS that uses a form of artificial intelligence, also known as machine learning, that allows the systems to learn from past experiences/examples and recognizes patterns in clinical information. It consists of nodes called neurodes and weighted connections that transmit signals between the neurodes in a unidirectional fashion. An ANN consists of 3 main layers: Input (data receiver or findings), Output (communicates results or possible diseases) and Hidden (processes data). The system becomes more efficient with known results for large amounts of data. The advantages of ANN include the elimination of needing to program the systems and providing input from experts. The ANN CDSS can process incomplete data by making educated guesses about missing data and improves with every use due to its adaptive system learning. Additionally, ANN systems do not require large databases to store outcome data with its associated probabilities. Some of the disadvantages are that the training process may be time consuming leading users to not make use of the systems effectively. The ANN systems derive their own formulas for weighting and combining data based on the statistical recognition patterns over time which may be difficult to interpret and doubt the system's reliability. Examples include the diagnosis of appendicitis, back pain, myocardial infarction, psychiatric emergencies and skin disorders. The ANN's diagnostic predictions of pulmonary embolisms were in some cases even better than physician's predictions. Additionally, ANN based applications have been useful in the analysis of ECG (a.k.a. EKG) waveforms.
      • A Genetic Algorithm (GA) is a nonknowledge-based method developed in the 1940s at the Massachusetts Institute of Technology based on Darwin's evolutionary theories that dealt with the survival of the fittest. These algorithms rearrange to form different re-combinations that are better than the previous solutions. Similar to neural networks, the genetic algorithms derive their information from patient data. An advantage of genetic algorithms is these systems go through an iterative process to produce an optimal solution. The fitness function determines the good solutions and the solutions that can be eliminated. A disadvantage is the lack of transparency in the reasoning involved for the decision support systems making it undesirable for physicians. The main challenge in using genetic algorithms is in defining the fitness criteria. In order to use a genetic algorithm, there must be many components such as multiple drugs, symptoms, treatment therapy and so on available in order to solve a problem. Genetic algorithms have proved to be useful in the diagnosis of female urinary incontinence.
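  • As a concrete illustration of the conditional-probability reasoning described in the quoted passage, the short calculation below applies Bayes' rule to a single finding; the prevalence and test characteristics are invented purely for illustration.

```python
def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' rule: P(disease | positive finding)."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Invented numbers: finding present in 90% of diseased patients, in 10% of
# non-diseased patients, with a 5% pre-test probability of disease.
print(round(posterior(prior=0.05, sensitivity=0.90, false_positive_rate=0.10), 3))  # 0.321
```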
  • Despite the fact that automated differential diagnosis systems have been developed and attempted to be implemented for more than 35 years now, they have not achieved any acceptance in the emergency medical setting for acute care treatment (ACT). In large part, this failure is due to the conditions under which emergency care of acute conditions is practiced. In those situations, such as the treatment of trauma, cardiac arrest or respiratory arrest, speed of decision-making is critical and caregivers already must split their time and attention between the patient and the physiological monitors and defibrillators. In such situations, automated differential diagnosis (ADD) tools are often viewed as interfering with the caregiving process and as a delay to treatment of the patient. Given that every minute can result in a 10% drop in survival rate, such as is the case for cardiac arrest, it is not surprising that ADD tools are ignored by the very people that they were designed to assist.
  • It has also been found that much of the patient's medical history is inaccessible by the caregiver at the time of the acute medical condition because patients are often treated in the prehospital setting where family members are often not present at the time of the injury.
  • SUMMARY
  • Embodiments of the present invention include a system that provides a tool for the caregiver to more efficiently and accurately perform a differential diagnosis that is integrated into the caregiver's existing workflow during emergency situations. Embodiments of the present invention may also provide an integrated view of physiological data from the patient, along with therapeutic treatment and patient history and examination findings, in an automated way to caregivers.
  • A method for code review of a medical event according to embodiments of the present invention includes displaying, on a first device screen, a user interface during the medical event; recording images of the user interface, each of the images representing an entirety of the user interface associated with a time during the medical event, wherein the images are recorded at least once every second; displaying, on a second device screen, a visual timeline indicator and a user interface replicator, the user interface replicator displaying the images of the user interface and the visual timeline indicator representing a time associated with each of the images, wherein the visual timeline indicator and the user interface replicator permit sequential playback and review of the user interface images from the medical event, and wherein the visual timeline indicator accepts user input to move the sequential playback to a different time associated with the medical event.
  • The method of paragraph [0008], wherein the visual timeline indicator includes a timeline including a beginning time of the medical event and an end time of the medical event, the method further comprising indicating on the timeline the time associated with the image of the user interface shown in the user interface replicator.
  • The method of any of paragraphs [0008] to [0009], wherein the indicating on the timeline comprises using an indicator on the timeline to indicate the time associated with the image of the user interface shown in the user interface replicator.
  • The method of any of paragraphs [0008] to [0010], further comprising advancing the indicator along the timeline in a direction from the beginning time toward the end time during sequential playback of the user interface images.
  • The method of any of paragraphs [0008] to [0011], wherein the visual timeline indicator accepts user input by permitting scrolling of the indicator to a different position along the timeline.
  • The method of any of paragraphs [0008] to [0012], wherein the visual timeline indicator displays the time associated with each of the images in an hour-minute-second format.
  • The method of any of paragraphs [0008] to [0013], wherein the visual timeline indicator includes one or more event markers marking occurrence of clinically-relevant sub-events during the medical event.
  • The method of any of paragraphs [0008] to [0014], wherein the one or more event markers include a drug event marker indicating a time at which a drug was administered to a patient during the medical event.
  • The method of any of paragraphs [0008] to [0015], wherein the one or more event markers include a defibrillation event marker indicating a time at which a defibrillation shock was applied to a patient during the medical event.
  • The method of any of paragraphs [0008] to [0016], wherein the one or more event markers include an ROSC event marker indicating a time at which a patient returned to spontaneous circulation.
  • The method of any of paragraphs [0008] to [0017], wherein the one or more event markers include a rearrest event marker indicating a time at which a patient returned to cardiac arrest.
  • The method of any of paragraphs [0008] to [0018], wherein the one or more event markers include an alarm event marker indicating a time at which an alarm was activated.
  • The method of any of paragraphs [0008] to [0019], further comprising displaying a cursor on the second device screen and, when the cursor is hovered over or near a time represented by the visual timeline indicator, displaying on the second screen at or near the cursor the image of the user interface associated with the time.
  • The method of any of paragraphs [0008] to [0020], further comprising, when the cursor is hovered over or near the time, displaying at or near the cursor a textual representation of the time.
  • A method for decision support during a medical event according to embodiments of the present invention includes displaying, on a screen on a device, a user interface during the medical event, wherein the user interface comprises two or more softkeys each representing a possible user selection; collecting physiological data from a patient with the device; determining, based on the physiological data, which one of the two or more softkeys represents the possible user selection that most closely conforms to a treatment or diagnosis protocol; and based on the determination, visually distinguishing the one of the two or more softkeys from the others of the two or more softkeys on the user interface.
  • The method of paragraph [0022], wherein visually distinguishing the one of the two or more softkeys comprises making the one of the two or more softkeys larger than the others of the two or more softkeys.
  • The method of any of paragraphs [0022] to [0023], wherein visually distinguishing the one of the two or more softkeys comprises changing a position of the one of the two or more softkeys on the user interface.
  • The method of any of paragraphs [0022] to [0024], wherein visually distinguishing the one of the two or more softkeys comprises changing a color of the one of the two or more softkeys on the user interface.
  • The method of any of paragraphs [0022] to [0025], wherein visually distinguishing the one of the two or more softkeys comprises changing a border of the one of the two or more softkeys on the user interface.
  • The method of any of paragraphs [0022] to [0026], wherein visually distinguishing the one of the two or more softkeys comprises making the one of the two or more softkeys dynamically flash on the user interface.
  • The method of any of paragraphs [0022] to [0027], wherein visually distinguishing the one of the two or more softkeys comprises displaying on the screen a legend describing why the one of the two or more softkeys has been visually distinguished.
  • The method of any of paragraphs [0022] to [0028], wherein the one of the two or more softkeys is a cardiac distress softkey, and wherein the legend textually indicates a possible cardiac arrhythmia.
  • A method for decision support during a medical event according to embodiments of the present invention includes displaying, on a first screen on a first device, a user interface during the medical event, wherein the user interface comprises two or more softkeys each representing a possible user selection; collecting physiological data from a patient with the first device; displaying, on a second screen on a second device, a visual representation of a clinical decision support tree and an indication of a current node on the clinical decision support tree, wherein the two or more softkeys each represent the possible user selection from the current node on the clinical decision support tree.
  • The method of paragraph [0030], wherein the visual representation of the clinical decision support tree includes, in addition to the current node, at least one prior node and at least one subsequent node.
  • The method of any of paragraphs [0030] to [0031], wherein selection of one of the two or more softkeys advances the indication of the current node on the second screen to a subsequent node as selected by the one of the two or more softkeys.
  • The method of any of paragraphs [0030] to [0032], wherein the second screen permits scrolling and resizing of the visual representation of the clinical decision support tree.
  • The method of any of paragraphs [0030] to [0033], wherein the visual representation of the clinical decision support tree is centered at the current node on the second screen.
  • The method of any of paragraphs [0030] to [0034], wherein the visual representation of the clinical decision support tree is positioned on the second screen based on the current node.
  • The method of any of paragraphs [0030] to [0035], wherein the visual representation of the clinical decision support tree on the second screen is positioned based on the current node, and wherein the selection of the one of the two or more softkeys repositions the visual representation of the clinical decision support tree on the second screen based on the subsequent node.
  • The method of any of paragraphs [0030] to [0036], wherein the visual representation of the clinical decision support tree on the second screen is centered at the current node, and wherein the selection of the one of the two or more softkeys recenters the visual representation of the clinical decision support tree on the second screen at the subsequent node.
  • A method for decision support during a medical event according to embodiments of the present invention includes: during the medical event, collecting physiological data from a patient at a first frequency with a patient monitoring device; guiding a user through a clinical decision support process with a display screen during the medical event; determining, with the clinical decision support process, a status of a patient; and based on the status of the patient, selecting a second frequency at which to collect the physiological data from the patient.
  • The method of paragraph [0038], further comprising collecting the physiological data from the patient at the second frequency.
  • The method of any of paragraphs [0038] to [0039], wherein the status of the patient indicates traumatic brain injury, and wherein the second frequency is selected to be greater than the first frequency.
  • The method of any of paragraphs [0038] to [0040], wherein the second frequency is at least once every five minutes.
  • A system for code review of a medical event according to embodiments of the present invention includes a first screen configured to visually display a clinical decision support tree used in the medical event; a second screen configured to visually display a replication of a user interface of a patient monitoring device as the user interface appeared at times during the medical event, wherein selecting a location within the clinical decision support tree on the first screen causes the second screen to display the replication of the user interface corresponding to a time during the medical event represented by the location within the clinical decision support tree.
  • The system of paragraph [0042], wherein the second screen is part of the patient monitoring device.
  • The system of any of paragraphs [0042] to [0043], wherein the first screen is further configured to visually indicate on the clinical decision support tree a user's advancement through the clinical decision support tree in synchronization with advancement of the replication of the user interface on the second screen.
  • A method for decision support according to an embodiment of the present invention includes displaying, on a screen, a user interface during the medical event, wherein the user interface comprises trending information for a patient condition; collecting physiological data from a patient with a device; providing clinical decision support using at least some of the physiological data and at least some user input data; establishing a range for the patient condition based on the clinical decision support; and visually indicating on the user interface whether all or portions of the trending information is within the range.
  • The method of paragraph [0045], wherein the screen is on the device.
  • The method of any of paragraphs [0045] to [0046], wherein visually indicating on the user interface whether all or portions of the trending information is within the range further comprises displaying portions of the trending information that falls outside of the range in a first color, and displaying portions of the trending information that falls within the range in a second color different from the first color.
  • The method of any of paragraphs [0045] to [0047], wherein the range is a first range, the method further comprising establishing a second range for the patient condition based on the clinical decision support, and visually indicating on the user interface whether all or portions of the trending information is within in the second range.
  • The method of any of paragraphs [0045] to [0048], further comprising establishing a third range for the patient condition based on the clinical decision support, wherein the first, second, and third ranges do not overlap each other, and visually indicating on the user interface whether all or portions of the trending information is within the third range.
  • The method of any of paragraphs [0045] to [0049], further comprising coloring portions of the trending information within the first range a first color, coloring portions of the trending information within the second range a second color, and coloring portions of the trending information within the third range a third color.
  • The method of any of paragraphs [0045] to [0050], wherein the first color is green, wherein the second color is yellow, and wherein the third color is red.
  • The method of any of paragraphs [0045] to [0051], wherein establishing the range for the patient condition based on the clinical decision support comprises establishing the range for the patient condition based on the patient's age, wherein the patient's age is acquired via the clinical decision support.
  • A method for decision support according to an embodiment of the present invention includes displaying, on a screen, a user interface during the medical event, wherein the user interface comprises dosing information display for a drug; collecting physiological data from a patient with a device; providing clinical decision support using at least some of the physiological data and at least some user input data; establishing a dosage recommendation for the drug based on the clinical decision support; and visually indicating the dosage recommendation on the dosing information display of the user interface.
  • The method of paragraph [0053], wherein the screen is on the device.
  • The method of any of paragraphs [0053] to [0054], wherein establishing the dosage recommendation based on the clinical decision support comprises establishing the dosage recommendation based on the patient's age, wherein the patient's age is acquired via the clinical decision support.
  • The method of any of paragraphs [0053] to [0055], wherein establishing the dosage recommendation based on the clinical decision support comprises establishing the dosage recommendation based on the patient's weight, wherein the patient's weight is acquired via the clinical decision support.
  • The method of any of paragraphs [0053] to [0056], wherein establishing the dosage recommendation based on the clinical decision support comprises establishing the dosage recommendation based on the patient's allergies, wherein the patient's allergies are acquired via the clinical decision support.
  • While multiple embodiments are disclosed, still other embodiments of the present invention will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments of the invention. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates a clinical decision support system, according to embodiments of the present invention.
  • FIG. 2 illustrates a user interface for a medical device, according to embodiments of the present invention.
  • FIG. 3 illustrates the user interface of FIG. 2 upon selection of an acute care diagnosis mode, according to embodiments of the present invention.
  • FIG. 4 illustrates the user interface of FIGS. 2 and 3 upon selection of a respiratory distress mode, according to embodiments of the present invention.
  • FIG. 5 is a table describing a differential diagnosis outline for acute dyspnea in adults.
  • FIG. 6 is a table describing clues to the diagnosis of dyspnea.
  • FIG. 7 is a table listing physical examination findings in the diagnosis of acute dyspnea.
  • FIG. 8A is a top portion of a common medical protocol and differential diagnosis flow chart for adult shortness of breath.
  • FIG. 8B is a continuation of the common medical protocol and differential diagnosis flow chart of FIG. 8A.
  • FIG. 9 illustrates a carbon dioxide snapshot waveform which may be displayed on the user interface when selected by the user, according to embodiments of the present invention.
  • FIG. 10 illustrates the carbon dioxide snapshot waveform of FIG. 9 with displayed measurements, according to embodiments of the present invention.
  • FIG. 11 illustrates a tablet computing device docked on a defibrillator device, according to embodiments of the present invention.
  • FIG. 12 illustrates a protocol for use in patients with cardiac arrest.
  • FIG. 13 illustrates an example trauma assessment protocol.
  • FIG. 14 illustrates an example rapid trauma assessment protocol.
  • FIG. 15 illustrates an example focused physical exam protocol.
  • FIG. 16 illustrates an example amputation injuries protocol.
  • FIG. 17 illustrates an example bleeding control protocol.
  • FIG. 18 illustrates an example burns protocol.
  • FIG. 19 illustrates an example electrocution protocol.
  • FIG. 20 illustrates an example spinal immobilization protocol.
  • FIG. 21 illustrates additional steps in the spinal immobilization protocol of FIG. 20.
  • FIG. 22 illustrates an example multi-system trauma protocol.
  • FIG. 23 illustrates an example near drowning protocol.
  • FIG. 24 illustrates an example trauma in pregnancy protocol.
  • FIG. 25 illustrates an example traumatic cardiac arrest protocol.
  • FIG. 26 illustrates a clinical decision support system, according to embodiments of the present invention.
  • FIG. 27 illustrates a computer system, according to embodiments of the present invention.
  • FIG. 28 illustrates a user interface display of a clinical decision support tree, according to embodiments of the present invention.
  • FIG. 29 illustrates the user interface display of FIG. 28 with a portion of the clinical decision support tree resized, according to embodiments of the present invention.
  • FIG. 30 illustrates the user interface display of FIGS. 28 and 29 with an additional portion of the clinical decision support tree resized, according to embodiments of the present invention.
  • FIG. 31 illustrates a user interface display with dynamic softkeys, according to embodiments of the present invention.
  • FIG. 32 illustrates the user interface display of FIG. 31 with one softkey emphasized based on clinical decision support, according to embodiments of the present invention.
  • FIG. 33 illustrates the user interface display of FIG. 31 with one softkey emphasized in a different way, based on clinical decision support, according to embodiments of the present invention.
  • FIG. 34 illustrates a code review interface for reviewing user interface display data corresponding to a medical event, according to embodiments of the present invention.
  • FIG. 35 illustrates a portion of a clinical decision support tree, according to embodiments of the present invention.
  • FIG. 36 illustrates a user interface display, according to embodiments of the present invention.
  • While the invention is amenable to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and are described in detail below. The intention, however, is not to limit the invention to the particular embodiments described. On the contrary, the invention is intended to cover all modifications, equivalents, and alternatives falling within the scope of the invention as defined by the appended claims.
  • DETAILED DESCRIPTION
  • FIG. 1 shows a block diagram of the system, according to embodiments of the present invention. In one embodiment, a combined defibrillator/monitor device such as the E-Series manufactured by ZOLL Medical of Chelmsford, Massachusetts has keys whose labeling is provided by on-screen text. The text is thus configurable in real time either due to input by the user or as a result of analysis and decision making by the defibrillator or other devices with which the defibrillator is in communication at the time of the defibrillator's use, such as the computer tablet device 214 or a remote base station staffed by medical dispatch or medical supervisory personnel in communication with the computer tablet. The computer tablet may take the form of an iPad (Apple Corp., Cupertino, Calif.). Such screen-labeled keys may be referred to as "soft-keys". A specific soft-key is initially labeled "Acute Care diagnose" at device turn-on as shown in FIG. 2, according to embodiments of the present invention. Upon detecting a key press of the Acute Care Diagnose key, the defibrillator changes the functionality and labeling of the keys to those shown in FIG. 3. These five labels—"Respiratory Distress" or alternatively "Dyspnea", "Altered Mental Status", "Cardiac Distress", "Trauma" and "Pain/Abnormal Nerve Sensation"—differ from the traditional symptoms associated with differential diagnosis in that they identify classes of patients and potential workflows and diagnosis and treatment pathways (DTP), and are listed in order of the relative frequency with which paramedics and other emergency personnel encounter patients meeting these criteria in actual practice.
  • By pressing the soft-key for each DTP, the defibrillator is then configured to potentially activate certain physiological sensors and display the sensor data in such a way as to provide the caregiver the optimal information, presented in the optimal fashion so as to diagnose and treat the patient most accurately and efficiently. Each DTP may include a template according to which sensor data, or the physiological and/or measurement data derived therefrom, is displayed in a way most useful and/or efficient for that particular DTP. For instance, if the “Respiratory Distress” soft-key is pressed, then the waveforms and numeric physiologic data on the screen change to that shown in FIG. 4. Stored snapshots of individual CO2 breath waveforms may be initiated via the CO2 Snapshot soft-key. These snapshots remain on the display for reference to the clinician both for automating measurements for diagnosis as well as for assessing the effectiveness of a particular therapy.
  • Heart sound measurement and detection may be incorporated into the monitoring device for the detection of S3 and S4 heart sounds, and may automatically narrow the differential diagnosis of heart failure or pulmonary edema, or suggest that the rescuer confirm agreement with the software diagnosis. A flowchart for evaluating heart sounds is shown in FIGS. 8A and 8B. Pulse oximetry and capnography are also very helpful measures and may be automatically incorporated into the algorithm for more accurate diagnosis. The same sensors used to detect heart sounds may also be employed to detect breath sounds and to analyze their quality. Specific algorithms may be employed to detect wheezing, crackles, rales or stridor, each of which may be indicative of a particular disease.
  • Sensors such as flow sensors and O2 gas sensors are included in some embodiments, so that additional physiological measurements such as volumetric CO2, volumetric O2 and spirometry, which are relevant for diagnosis and treatment of dyspnea, may be included and displayed on the Respiratory Distress DTP screen. An oxygen sensor may be located in the patient's airway, which may assist in calculating the metabolic needs of the patient.
  • The display on the defibrillator 212 is a touchscreen, according to some embodiments of the present invention. The caregiver can easily initiate measurements such as on the CO2 snapshot waveform or the spirometry snapshot waveform via touchscreen gesture such as a double tap. A zoom icon may exist in the upper corner of each waveform box, such as the CO2 snapshot, such that when the zoom button is touched, that particular waveform fills the display of the defibrillator. Another measurement button is present which, when touched, displays all the relevant measurements for a particular waveform, according to embodiments of the present invention. A gestural interface is provided as part of the touchscreen. Using two fingers or finger and thumb to touch to two points in the waveform (which may also be referred to as a “caliper” measurement or gesture) will cause measurements to be displayed and/or overlaid onto the physiological data, as illustrated in FIG. 10. For instance, dead space volume, phase II and III slopes which are indicative of COPD, and estimates of arterial pCO2 may be listed on the screen after initiation of CO2 waveform measurement.
  • According to embodiments of the present invention, the processor communicably coupled with the touchscreen portion of a decision support system may be configured to recognize the wave shape of a wave signal being displayed, and/or recognize the edge of an image being displayed, in order to improve the accuracy of a caliper touch gesture. For example, if a user were to use a caliper gesture to measure or "zoom in" on an ST elevation in an ECG wave display, the decision support system may be configured to recognize that if one of the user's fingers taps just below the top of the ECG wave, the user likely intended to include the top of the ECG wave in the enlarged or selected view. In addition, the decision support system may be configured to permit the user to enlarge (zoom) and adjust measurement points individually using the touchscreen. A tap/click and drag method may be used to set the caliper gesture; for example, to home in on a particular portion of the displayed waveform, the user may press on one point and drag to another point to indicate the endpoints of the caliper gesture.
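  • A rough sketch of the caliper gesture handling just described (function names, the snap window, and the sample data are assumptions): each touch point is snapped to the nearest local maximum of the displayed waveform before the time difference, amplitude difference, and slope between the two points are reported.

```python
def caliper_measurement(waveform, sample_period_s, touch_a_idx, touch_b_idx):
    """Hypothetical sketch: snap two touched samples to nearby waveform peaks
    and report the measurements between them."""
    def snap_to_peak(idx, window=5):
        lo, hi = max(0, idx - window), min(len(waveform), idx + window + 1)
        return lo + max(range(hi - lo), key=lambda k: waveform[lo + k])

    a, b = sorted((snap_to_peak(touch_a_idx), snap_to_peak(touch_b_idx)))
    dt = (b - a) * sample_period_s
    dy = waveform[b] - waveform[a]
    return {"dt_s": dt, "dy": dy, "slope_per_s": dy / dt if dt else float("inf")}

wave = [0, 1, 4, 9, 7, 5, 6, 8, 12, 10, 9]   # illustrative samples
print(caliper_measurement(wave, sample_period_s=0.02, touch_a_idx=2, touch_b_idx=7))
```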
  • Specific out-of-range readings can be displayed in red or highlighted by other mechanisms, such as bold-face font and/or flashing. Touching the highlighted values will cause the display to show the possible diagnoses which are consistent with the measurements, according to embodiments of the present invention. A specific graphical zone of the screen can be designated with a graphical image of the computer tablet. By dragging waveforms, measurements, or any other data object shown on the display over onto the computer tablet icon, it can automatically be presented on the computer tablet that is linked to the defibrillator.
  • Capnography is helpful in the assessment of asthma, where an increased slope in the expiratory plateau provides a measure of bronchospasm. The slope of the plateau phase (phase III) provides a measure of airway obstruction. The adequacy of β-agonist bronchodilatory therapy for an asthma exacerbation may be monitored through observation of slope change of phase III.
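  • One way to picture the phase III measurement just described is to fit a straight line to the expiratory plateau of a single CO2 breath. The sketch below is an illustrative assumption, not the patent's algorithm; how the plateau segment is detected is left out, and the sample values are made up.

      def phase_iii_slope(times, co2, plateau_start, plateau_end):
          """Least-squares slope (mmHg per second) of the expiratory plateau (phase III)."""
          t = times[plateau_start:plateau_end]
          y = co2[plateau_start:plateau_end]
          n = len(t)
          t_mean = sum(t) / n
          y_mean = sum(y) / n
          num = sum((ti - t_mean) * (yi - y_mean) for ti, yi in zip(t, y))
          den = sum((ti - t_mean) ** 2 for ti in t)
          return num / den

      # Example: a plateau rising 4 mmHg per second suggests more obstruction than a
      # nearly flat plateau; tracking this slope across breaths after therapy shows
      # whether the bronchodilator is working.
      t = [i * 0.1 for i in range(16)]      # 1.5 s of plateau samples
      y = [30 + 4 * ti for ti in t]         # synthetic CO2 values in mmHg
      print(round(phase_iii_slope(t, y, 0, 16), 2))   # 4.0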
  • As referenced in U.S. Patent Application Publication No. 2011/0172550, published on Jul. 14, 2011, which is incorporated by reference herein in its entirety for all purposes, the data for the patient's history may be entered via the computer tablet, with patient physiological measures entered via the monitor. As the differential diagnosis often implicates both patient history and patient examination findings, along with measures of the patient's physiological state from monitoring such as ECG, capnography, and pulse oximetry, these data elements are integrated into a user interface that automatically or semi-automatically integrates the various data elements on a single differential diagnosis screen within the application on the computer tablet. The interface may begin by asking the rescuer to choose from a list of common presenting symptoms or complaints by the patient, for example dyspnea or respiratory distress. Information such as that on the screens of FIGS. 5, 6, and 7 (taken from Am Fam Physician 2003; 68:1803-10) provides one possible structured approach for rescuers to obtain information. As patient history and physical examination findings are entered on the computer tablet, the differential diagnosis page will gradually narrow down the possible diagnoses.
  • In another embodiment, the defibrillator contains a docking feature for propping up a computer tablet such as an Apple® iPad® on top of the defibrillator in a stable position via mounting features integrated onto the defibrillator, as illustrated in FIG. 11. Other mobile computing devices, including tablet computers, an iPhone®, an iTouch®, and other touchscreen monitors may be used. Alternatively, a low power, battery powered, touchscreen monitor may be used, such as, for example, those that transfer information to and from a computing device via a wired or wireless USB connection. Communication may be provided wirelessly between the two devices (the medical device and the mobile computing device, for example). Other communicable coupling, for example a wired connection, may be achieved between the two devices. The iPad® may include a protective housing and/or waterproof housing to protect it from the typical physical abuse it would likely encounter in the prehospital environment. Mounting features integral to such an iPad® housing allow it to be easily attached on top of the defibrillator on scene. The mounting feature on the defibrillator may be able to hinge to allow the iPad® to hinge down when not in use into a protective pocket on the defibrillator. The iPad® may also be undocked and used nearby to the defibrillator, without need for physical connection. A physical slot may also be provided, preferably at the side, top or back of the unit, that allows for the iPad® to have its battery charged by the defibrillator. Internal to the frame of the iPad® protective housing is the standard iPad® connector, while on the exterior of the frame of the iPad® protective housing are much more robust mechanical and electrical connections that can withstand the extensive abuse experienced by medical devices in the prehospital emergency setting, according to embodiments of the present invention.
  • The results of this integrated analysis of physiological data, patient history and examination findings may then be displayed on the defibrillator, potentially in the form of asking to make an additional physiological measurement. The results of this integrated analysis of physiological data, patient history and examination findings may alternatively, or additionally, be displayed on the tablet computer. According to some embodiments of the present invention, the tablet computer, or other mobile computing device, may be communicably coupled with the defibrillator or other physiological assessment device, for example through a wireless connection. As used herein, the phrase “communicably coupled” is used in its broadest sense to refer to any coupling whereby information may be passed. Thus, for example, communicably coupled includes electrically coupled by, for example, a wire; optically coupled by, for example, an optical cable; and/or wirelessly coupled by, for example, a radio frequency or other transmission media. “Communicably coupled” also includes, for example, indirect coupling, such as through a network, or direct coupling.
  • According to embodiments of the present invention, a user interface device is communicably coupled to a processor, and the processor is configured to receive data entered via the user interface device, as well as data received from one or more sensors, in order to perform clinical decision support based on both data sources. The user interface device may include one or more devices such as a touch screen computer, a tablet computer, a mobile computing device, a smart phone, an audio receiver, an audio transmitter, a video receiver, a video transmitter, a camera, and a “heads up” display projected onto a user's glasses or face shield. A small monitor may be mounted onto eyeglasses, a face shield, and/or integrated with other wearable communications devices, such as, for example, an ear bud or a Bluetooth® hands free phone adaptor. The user interface device may include a combination of devices for conveying options and receiving input; for example, an audio speaker may be used to convey possible DTPs, and an audio receiver may be used to receive a verbal command indicating a selection of one of the DTPs. Instead of an audio receiver, a video camera may be used to receive a gestural command that will be interpreted by the processor as a selection of one of the possible DTPs, or input elements. Using hands-free devices for user interface devices may free the hands of a caregiver to perform clinical tasks, while still permitting non-intrusive decision support and/or differential diagnosis for the caregiver.
  • FIGS. 8A and 8B illustrate a differential diagnosis and/or clinical support process through which a computer processor may take a caregiver, using the user interface device, according to embodiments of the present invention. For example, if the caregiver selected “Respiratory Distress” from among the five DTPs presented on the screen of FIG. 3, then the user interface device would prompt the caregiver to input information about step 802 in the flowchart of FIG. 8, which flows from top to bottom. At step 802, if the 12-lead reveals an S3 heart sound, or if the Dyspnea Engagement Score is greater than 3, then the decision support system will take the user through the Acute Decompensated Heart Failure (CHF) decision/diagnosis process.
  • The decision support system may take into account both physiological data received from sensors, and information data received from the caregiver (e.g. via mobile computing device such as an iPad®), in helping the caregiver move from one decision point in the flow chart to the next, while updating any display or information provided along the way. For example, the decision support system may indicate to the user that, based on processing of the ECG data, there does not appear to be an S3 heart sound present, and ask the caregiver to confirm this assessment. The decision support system may also, or alternatively, request the caregiver to enter a Dyspnea Engagement Score, or suggest one for confirmation by the caregiver. At step 802, if the 12-lead reveals no S3 heart sound, or if the Dyspnea Engagement Score is less than 3, then the decision support system will recognize that the caregiver is not dealing with a CHF situation and will move to step 804, in which the decision support system changes its display and/or input prompts in order to help the caregiver determine whether to enter the Asthma treatment path or the COPD treatment path.
  • Again, the decision support system may factor in various physiological data from sensors, as well as various informational data received about the particular patient, in helping to support the caregiver's decision. For example, as illustrated in FIG. 6, if the patient information (either entered by the caregiver or obtained from another source) indicates that the patient is involved in heavy tobacco use, the decision support system will recognize at step 804 that a COPD diagnosis is more likely, whereas if the caregiver indicates to the decision support system that the patient is experiencing a cough, or has a history of asthma, the decision support system may recognize at step 804 that an Asthma diagnosis is more likely. In addition to, or as an alternative to, the informational diagnosis support reflected in FIG. 6, the decision support system may gather findings using physiological data to help the caregiver determine the appropriate treatment path. For example, if a breathing or breath sound sensor generates data that, when processed, indicates clubbing, barrel chest, or decreased breath sounds, the decision support system may recognize at step 804 that a COPD treatment path is more appropriate, whereas if the breath sound sensor generates data indicative of pulsus paradoxus, or if a muscle activity sensor indicates accessory muscle use, the decision support system may recognize at step 804 that an Asthma treatment path is more appropriate.
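  • A minimal sketch of how such findings might be weighed is shown below. The findings themselves, and their leanings toward COPD or asthma, follow the description above; the numeric weights and the decision threshold are illustrative assumptions only, not clinical guidance.

      # Positive weights lean toward the COPD path, negative weights toward Asthma.
      FINDING_WEIGHTS = {
          "heavy_tobacco_use": +2,
          "clubbing": +1,
          "barrel_chest": +1,
          "decreased_breath_sounds": +1,
          "cough": -1,
          "history_of_asthma": -2,
          "pulsus_paradoxus": -1,
          "accessory_muscle_use": -1,
      }

      def suggest_treatment_path(findings):
          """Return the path the decision support system might emphasize at step 804."""
          score = sum(FINDING_WEIGHTS.get(f, 0) for f in findings)
          if score > 0:
              return "COPD"
          if score < 0:
              return "Asthma"
          return "undetermined"   # prompt the caregiver for more information

      print(suggest_treatment_path(["cough", "history_of_asthma"]))         # Asthma
      print(suggest_treatment_path(["heavy_tobacco_use", "barrel_chest"]))  # COPD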
  • According to embodiments of the present invention, the decision support system may suggest or propose a diagnosis or treatment path, for example by indicating statistical probabilities (based on charts and data such as those of FIGS. 6 and 7) or relative likelihoods, and ask for confirmation or final selection by the caregiver. For example, if at step 804 the decision support system receives confirmation of an Asthma diagnosis, then the user interface device may change the information presented to the caregiver, for example by launching into a treatment protocol specific to the Asthma diagnosis. At step 806, the decision support system may suggest that the caregiver attach a humidifier to the patient's oxygen supply, and administer 2.5 milligrams of albuterol mixed with 0.5 milligrams of Atrovent delivered by nebulizer connected to a 6-9 liter per minute source, and may indicate that the dosage may be administered continuously as long as the heart rate is not greater than 140. The decision support system may monitor the heart rate, and give a visual and/or audio indication when and if the heart rate reaches or approaches 140, in this example.
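  • The heart rate check mentioned above could be implemented along the following lines. The 140 beats-per-minute limit comes from the example in the text; the warning margin and the callback arguments are assumptions made so the sketch is self-contained.

      HR_LIMIT = 140        # from the example above
      HR_WARN_MARGIN = 10   # assumed margin for an "approaching the limit" warning

      def check_heart_rate(heart_rate, alert, warn):
          """Warn as the rate approaches the limit; alert once it is reached.

          heart_rate  -- latest heart rate in beats per minute
          alert, warn -- caller-supplied callbacks (e.g. audio tone, on-screen banner)
          Returns False when continuous dosing should be paused.
          """
          if heart_rate >= HR_LIMIT:
              alert("Heart rate %d >= %d: pause continuous albuterol" % (heart_rate, HR_LIMIT))
              return False
          if heart_rate >= HR_LIMIT - HR_WARN_MARGIN:
              warn("Heart rate %d approaching %d" % (heart_rate, HR_LIMIT))
          return True

      check_heart_rate(133, alert=print, warn=print)   # prints a warning, returns True
      check_heart_rate(142, alert=print, warn=print)   # prints an alert, returns False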
  • At step 808, the decision support system may help the caregiver decide whether the patient is extremely bronchoconstricted, for example by showing data or measurements related to blood oxygen content, respiration rate, or respiration volume. Upon a confirmation by the caregiver that the patient is extremely bronchoconstricted at step 808, the decision support system may then suggest to the caregiver that a 125 milligram dose of Solumedrol be administered over a slow (e.g. 2 minute) intravenous push. At step 810, the decision support system may help the caregiver to decide whether the patient's symptoms have improved (e.g. whether the patient's shortness of breath has improved with the treatment thus far). For example, the decision support system may display and/or analyze the patient's end-tidal waveform, and suggest that the patient does not appear to be responding to the treatment, and ask for the caregiver's confirmation. If the caregiver confirms the decision, then the decision support system may continue to guide the caregiver through additional treatment options, for example those indicated in FIG. 8. In this way, the decision support system guides the caregiver through complex decision making processes, during the clinical encounter, using both physiological data and informational data gathered from the patient or input by the caregiver, in a way which would be too inconvenient or time-consuming for the caregiver to perform absent the decision support system.
  • The decision support according to embodiments of the present invention may or may not be fully automated. Inference engines utilizing Bayesian networks, neural networks, genetic algorithms, or simpler rule-based systems may be employed.
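  • Of those options, a rule-based system is the simplest to sketch. The fragment below is an illustrative skeleton only; the rule contents echo examples given elsewhere in this description, and the fact names are assumptions.

      # Minimal forward-chaining rule engine: each rule is (condition, conclusion).
      RULES = [
          (lambda f: f.get("s3_heart_sound") or f.get("dyspnea_score", 0) > 3,
           "consider_CHF_pathway"),
          (lambda f: f.get("phase_iii_slope_rising"),
           "consider_bronchospasm"),
      ]

      def infer(facts):
          """Apply rules repeatedly until no new conclusions are added."""
          facts = dict(facts)
          changed = True
          while changed:
              changed = False
              for condition, conclusion in RULES:
                  if condition(facts) and not facts.get(conclusion):
                      facts[conclusion] = True
                      changed = True
          return facts

      print(infer({"dyspnea_score": 4}))   # adds 'consider_CHF_pathway'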
  • In another embodiment, tissue CO2 or pH is measured by methods such as those described in U.S. Pat. No. 6,055,447, which describes a sublingual tissue CO2 sensor, or U.S. Pat. Nos. 5,813,403, 6,564,088, and 6,766,188, which describe a method and device for measuring tissue pH via near infrared spectroscopy (NIRS), and which are all incorporated herein by reference in their entirety for all purposes. NIRS technology allows the simultaneous measurement of tissue PO2, PCO2, and pH. One drawback of previous methods for the measurement of tissue pH is that the measurements provided excellent relative accuracy for a given baseline measurement performed in a series of measurements over the course of a resuscitation, but absolute accuracy was not as good, as a result of patient-specific offsets such as skin pigment. One of the benefits achieved by some embodiments of the present invention is the elimination of the need for absolute accuracy of these measurements, relying only on the offset and gain being stable over the course of the resuscitation. Tissue CO2 and pH are particularly helpful to monitor in the trauma DTP. Physiological parameters on display for the trauma DTP may be one or more of: invasive and non-invasive blood pressure, tissue CO2 and pH, ECG, SpO2 trending, and heart rate variability risk index. The ECG may be analyzed to determine the interval between adjacent R-waves of the QRS complexes, and this interval may be used to calculate heart rate variability as a running difference between adjacent R-R intervals. It is known to those skilled in the art that an abrupt reduction in variability will often precede, by many minutes, a precipitous decline in a patient's blood pressure (traumatic arrest). By monitoring the trend in heart rate variability, the traumatic arrest can be anticipated and prevented.
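  • The heart rate variability calculation described above might be sketched as follows. The running-difference definition comes from the text; the window contents and the reduction threshold used to flag an impending decline are illustrative assumptions.

      def rr_variability(rr_intervals):
          """Mean absolute difference between adjacent R-R intervals, in seconds."""
          diffs = [abs(b - a) for a, b in zip(rr_intervals, rr_intervals[1:])]
          return sum(diffs) / len(diffs) if diffs else 0.0

      def abrupt_variability_drop(recent_rr, baseline_rr, drop_fraction=0.5):
          """Flag an abrupt reduction in variability relative to a baseline window.

          recent_rr, baseline_rr -- lists of R-R intervals in seconds
          drop_fraction          -- assumed threshold: flag when recent variability
                                    falls below this fraction of the baseline
          """
          baseline = rr_variability(baseline_rr)
          recent = rr_variability(recent_rr)
          return baseline > 0 and recent < drop_fraction * baseline

      baseline = [0.80, 0.86, 0.78, 0.88, 0.79, 0.87]
      recent = [0.82, 0.82, 0.83, 0.82, 0.83, 0.82]
      print(abrupt_variability_drop(recent, baseline))   # True: variability has collapsed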
  • Another sensor of use for the trauma DTP is ultrasound, according to embodiments of the present invention. According to C. Hernandez et al., C.A.U.S.E.: Cardiac arrest ultra-sound exam—A better approach to managing patients in primary non-arrhythmogenic cardiac arrest, Resuscitation (2007), doi:10.1016/j.resuscitation.2007.06.033, which is incorporated by reference herein in its entirety for all purposes:
      • C.A.U.S.E. is a new approach developed by the authors. The C.A.U.S.E. protocol addresses four leading causes of cardiac arrest and achieves this by using two sonographic perspectives of the thorax; a four-chamber view of the heart and pericardium and anteromedial views of the lung and pleura at the level of the second intercostal space at the midclavicular line bilaterally. The four-chamber view of the heart and pericardium is attained using either the subcostal, parasternal or apical thoracic windows. This allows the individual performing the examination to select the most adequate view depending on the patients' anatomy. The authors recommend beginning with the subcostal view first as this view makes it possible for the practitioner to evaluate the heart without interrupting chest compression. If this view is not possible then the apical or parasternal approaches may be used during coordinated pulse checks led by the resuscitation team leader. A four-chamber view is used in this protocol as it allows for ease of comparison between the different chambers in the heart, facilitating the diagnosis of hypovolemia, massive PE, and cardiac tamponade (FIG. 6). Pneumothorax is diagnosed by identifying the lack of sliding sign and comet-tail artifact while looking in the sagittal plane at the second intercostal space of the midclavicular line (FIG. 7). For both the cardiac and lung views it is recommended to use a 2.5-5.0 phased array transducer probe. This allows the examiner to use the same probe for both lung, heart and if needed abdominal exam. This type of probe was used by Knudtson in his study involving ultrasound for the use of identifying pneumothorax as an addition to the FAST exam, and it yielded a very high accuracy in detecting pneumothorax, yet still remained useful in identifying the heart and abdominal organs. The protocol is best described in diagram form. [see FIG. 12]
  • The caregiver selecting elements of the flowchart results in the ultrasound sensor being activated and images being presented on the computer tablet. Additional instructions can be requested from the interface on either the computer tablet and/or the defibrillator. Based on the selections and instructions, the settings of the ultrasound can be adjusted to deliver the optimal images, according to embodiments of the present invention.
  • Although five diagnosis and treatment pathways are discussed with respect to FIG. 3, the differential diagnosis/decision support system may be configured to support decision making and diagnosis with respect to other DTPs, and may be configured to display and support various combinations of one or more DTPs, from among the five shown in FIG. 3 and others. According to other embodiments of the present invention, each user may configure the decision support system to use a customized DTP for each DTP option; for example, the user may replace the default series of questions/steps/readings for the Trauma DTP with a new series of questions/steps/readings based on caregiver-specific, patient-specific, geography-specific, and/or regulation-specific treatment protocols. In this way, the decision support system according to embodiments of the present invention operates to guide decision making and diagnosis for a caregiver in a way that accommodates various kinds of DTPs.
  • For example, if a user selected the Trauma DTP option from the screen of FIG. 3, the decision support system may be configured to guide a user through a decision and treatment pathway similar to that shown in FIGS. 13-25. The user would then be presented with a series of further options, such as “amputation injury,” “bleeding control,” “burns,” and the like. Selecting one of these further options would then cause the decision support system to enter and display the particular pathway or pathways relevant to the selected option. According to embodiments of the present invention, the decision support system may be embodied in a user interface device, independent of a medical device or one or more sensors, in a way which simply guides the caregiver through a series of decisions according to a pre-established flow chart. At a basic level, a medical device, such as a defibrillator, may include one or more decision support flow charts and/or treatment protocols, which guide the caregiver through various decisions, either with or without sensor data or other data input. A graphical DTP may be included in a defibrillator device as an electronically navigable reference document.
  • According to other embodiments, the decision support system is informed by a combination of caregiver observations, patient information, and/or sensor data. Assessment and/or scoring may be performed, either by receiving data from the caregiver, or receiving data from sensors, or both. For example, for a trauma DTP, the decision support system may take into account pulse rate, breathing data, qualitative breathing data, blood loss, blood pressure, presence of broken limbs, and/or compound fractures. Or, in a cardiac distress DTP, the decision support system may be configured to display a cardiac arrest probability at a moment in time, which may be calculated and/or predicted by the decision support system based on selected criteria. The decision support system may also be configured to track certain criteria in order to suggest treatment outcome probabilities, for example suggesting the treatment pathway with the highest or a high perceived probability of success.
  • According to some embodiments of the present invention, a monitor, or a defibrillator/monitor combination, or other similar device, may be configured to provide a graphical tool to configure the monitor to follow recognized rescue protocols, for example one or more of the protocols described and/or shown herein. Such a tool may be included on the monitor or defibrillator device, on a tablet or handheld or other computing device, and/or on both, according to embodiments of the present invention. Such a tool may be provided in a graphical interface, for example a flowchart. The tool allows the user to configure the patient monitor to follow a particular rescue protocol, for example by visually presenting a flow chart for the protocol and allowing the user to customize the protocol. For example, the length of the CPR period may be configured by the user to customize the treatment protocol. Such a tool may also permit the downloading and uploading of customized treatment protocols to and/or from a monitoring device, which may also permit the same customized protocol settings to be carried on a mobile device and/or transferred or uploaded to multiple other devices in different locations and/or at different times, according to embodiments of the present invention.
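  • One way to represent such a configurable protocol is sketched below. The user-configurable CPR period length follows the example in the text; the JSON layout, field names, and helper functions are assumptions made for illustration.

      import json

      DEFAULT_PROTOCOL = {
          "name": "cardiac_arrest_default",
          "cpr_period_seconds": 120,     # length of each CPR period, user-configurable
          "analyze_after_cpr": True,
          "shocks_per_cycle": 1,
      }

      def customize_protocol(base, **overrides):
          """Return a copy of a protocol with user-selected settings applied."""
          protocol = dict(base)
          protocol.update(overrides)
          return protocol

      def export_protocol(protocol, path):
          """Save a customized protocol so it can be carried to other devices."""
          with open(path, "w") as fh:
              json.dump(protocol, fh, indent=2)

      def import_protocol(path):
          """Load a protocol previously exported from another monitor."""
          with open(path) as fh:
              return json.load(fh)

      local = customize_protocol(DEFAULT_PROTOCOL, cpr_period_seconds=180)
      export_protocol(local, "my_protocol.json")
      print(import_protocol("my_protocol.json")["cpr_period_seconds"])   # 180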
  • FIG. 26 illustrates a clinical decision support system 2600, according to embodiments of the present invention. System 2600 includes a processor 150 which is communicably coupled to a database 152, a decision support module 153, a display 156, and a patient monitor and/or defibrillator 154, which may itself be communicably coupled to another display module 155, according to embodiments of the present invention. Some or all of the elements shown in FIG. 26 may be part of, or implemented by, one or more computer systems as illustrated in FIG. 27.
  • FIG. 27 is an example of a computer or computing device system 200 with which embodiments of the present invention may be utilized. For example, defibrillator 154 and/or the tablet shown in FIG. 11 may be or incorporate a computer system 200, according to embodiments of the present invention. According to the present example, the computer system includes a bus 201, at least one processor 202, at least one communication port 203, a main memory 208, a removable storage media 205, a read only memory 206, and a mass storage 207.
  • Processor(s) 202 can be any known processor, such as, but not limited to, an Intel® Itanium® or Itanium 2® processor(s), or AMD® Opteron® or Athlon MP® processor(s), or Motorola® lines of processors, or any known microprocessor or processor for a mobile device, such as, but not limited to, ARM, Intel Pentium Mobile, Intel Core i5 Mobile, AMD A6 Series, AMD Phenom II Quad Core Mobile, or like devices. Communication port(s) 203 can be any of an RS-232 port for use with a modem based dialup connection, a copper or fiber 10/100/1000 Ethernet port, or a Bluetooth® or WiFi interface, for example. Communication port(s) 203 may be chosen depending on a network such as a Local Area Network (LAN), Wide Area Network (WAN), or any network to which the computer system 200 connects. Main memory 208 can be Random Access Memory (RAM), or any other dynamic storage device(s) commonly known to one of ordinary skill in the art. Read only memory 206 can be any static storage device(s) such as Programmable Read Only Memory (PROM) chips for storing static information such as instructions for processor 202, for example.
  • Mass storage 207 can be used to store information and instructions. For example, flash memory or other storage media may be used, including removable or dedicated memory in a mobile or portable device, according to embodiments of the present invention. As another example, hard disks such as the Adaptec® family of SCSI drives, an optical disc, an array of disks such as RAID (e.g. the Adaptec family of RAID drives), or any other mass storage devices may be used. Bus 201 communicably couples processor(s) 202 with the other memory, storage and communication blocks. Bus 201 can be a PCI/PCI-X or SCSI based system bus depending on the storage devices used, for example. Removable storage media 205 can be any kind of external hard-drives, floppy drives, flash drives, zip drives, compact disc-read only memory (CD-ROM), compact disc-re-writable (CD-RW), or digital video disk-read only memory (DVD-ROM), for example. The components described above are meant to exemplify some types of possibilities. In no way should the aforementioned examples limit the scope of the invention, as they are only exemplary embodiments of computer system 200 and related components.
  • As shown in FIG. 26, the decision support module 153 may be a clinical support and/or differential diagnosis and/or treatment protocol as described herein. Based on information about the patient received from monitor 154, the decision support module 153 determines and/or shows to the user a set or array of next available options in the decision tree. Alternatively, the decision support module 153 may be configured to calculate probabilities or other statistics based on decision support trees, algorithms, and/or historical data.
  • Because the display module 155 of the monitor 154 is used for patient-critical monitoring or treatment functions, and because the monitor 154 must often be small or portable, there may be limited space available on the display device on which display module 155 operates. As such, embodiments of the present invention include a separate display 156 which is available to the user or to someone other than the user in order to view information about a particular decision support process being implemented by the processor 150 and, optionally, by the patient monitor 154. When a user decides to implement a decision support process, a selection may be made on the user interface screen operated by the display module 155, and/or may be made on the user interface operated by display module 156. This then prompts the processor 150 to access a clinical decision support process via decision support module 153. Decision support module 153 may include logic to guide the user through the various nodes and/or branches of a clinical decision support process, for example those shown in FIGS. 5-8B and 12-25. According to some embodiments of the present invention, the display module 155 operates the display screen of a monitor/defibrillator as shown in FIG. 11, and the display module 156 operates a tablet computer screen. Such a tablet computing device may be communicably coupled to the processor 150 (whether such processor is located in the monitor/defibrillator or the tablet computing device) by docking it into a communications dock on the monitor/defibrillator as shown in FIG. 11, and/or may be communicably coupled to the processor 150 wirelessly. Based on the disclosure provided herein, one of ordinary skill in the art will recognize that patient monitor 154 may include its own processor, and tasks described as performed by processor 150 may be distributed across one or multiple processors and/or physical devices.
  • FIG. 28 illustrates one example of a decision support tree that may be shown to a user on an auxiliary screen (operated by module 156) during a medical event, to guide the user through a treatment protocol or pre-diagnosis of the patient. The decision support module 153 may be navigated through the various decision points (e.g. “nodes”) either by manual selection of the next available option or branch, or by complete or partial automatic selection of the next available option or branch based upon patient data collected during the medical event, for example physiological data collected by the patient monitor/defibrillator 154 that is connected to the patient, or by a combination of these two processes. A process that is wholly or partially automatic may also be configured to prompt a user for confirmation before moving to a subsequent or previous node, according to embodiments of the present invention.
  • Due to the time critical nature of a medical first responder's tasks, such a medical first responder has limited attention resources. In order to further simplify such a user's interface with a decision support module 153, the processor 150 may be configured to dynamically adjust the display screen 156 during the medical event. As one example, FIG. 28 illustrates a user interface display of a clinical decision support tree, according to embodiments of the present invention. This decision support tree begins at block 2, and the first decision is between blocks 4 or 24. If block 4 is selected, the decision is next between blocks 6 and 8. If block 6 is selected, the next decision is between blocks 10 and 12. Although one or two possible branches or decisions are shown, one of ordinary skill in the art will appreciate, based on the disclosure provided herein, that any number of branches or decision options may be provided to extend from a particular node, and that such branches could overlap and/or loop back to a previous node, according to embodiments of the present invention. The remaining blocks 14, 16, 18, 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40, and 42 may function in a similar manner.
  • FIG. 29 illustrates one example of the user interface display of FIG. 28 with a portion of the clinical decision support tree resized, according to embodiments of the present invention. Once block 4 is selected over block 24 (manually by the user and/or automatically based on patient data), the display module 156 resizes the entire “branch” including block 24 and its subsequent nodes, and/or resizes block 24 itself, as shown in FIG. 29, in this case by making them smaller. Alternatively, in another embodiment, even before the user manually selects block 4, the processor 150 instructs the display module 156 to resize the block 24 branch as shown in FIG. 29 based on an indication from the decision support module 153, which factors in patient data received (either manually or automatically from the monitor 154) to indicate that choosing block 4 over block 24 would be more consistent with the particular clinical decision support process being implemented. By indicating a size difference between block 4 and block 24, the user is provided a visual indication which, if it coincides with the user's perceptions and experience, facilitates the navigation through the decision support process. This also makes such a process easier to use for those who may not have extensive experience with a particular decision support protocol.
  • The resizing may occur by making block 24 smaller, or by making block 4 larger, or both. In some cases, only the subsequent sets of blocks or nodes are resized, rather than the rest of the branches or nodes depending from the immediately subsequent nodes. Each node may be represented by a shape, and the entire border of the shape may be resized in order to indicate a non-chosen or less-probable node. As another alternative, the size of the node may remain the same but the text inside the node may be resized. As yet another alternative, the size of the node may remain the same, but the color or transparency of the non-chosen or less-probable nodes may be changed, for example “grayed out” for the less important nodes and turned to a bolder color or flashing color for the more important nodes. A combination of these and other visual indication features may be employed to assist the user in visually navigating through the decision support process in real time, during the medical event.
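  • The resizing and de-emphasis choices described above might be driven by a small helper such as the one below. The probability input stands for whatever estimate the decision support module 153 produces for a branch; the scale factors, thresholds, and colors are arbitrary illustrative values.

      def node_style(probability, selected=False):
          """Return display hints for a decision-tree node."""
          if selected or probability >= 0.7:
              return {"scale": 1.3, "color": "bold", "flash": probability >= 0.9}
          if probability <= 0.3:
              return {"scale": 0.6, "color": "gray", "flash": False}   # de-emphasized branch
          return {"scale": 1.0, "color": "normal", "flash": False}

      # Block 4 judged likely and block 24 unlikely: block 4 enlarged, block 24 shrunk.
      print(node_style(0.8))   # {'scale': 1.3, 'color': 'bold', 'flash': False}
      print(node_style(0.2))   # {'scale': 0.6, 'color': 'gray', 'flash': False}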
  • In some cases, the entire decision support tree may be shown on a device screen; in other cases, the tree may be too large to show all at once. FIG. 29 also illustrates how a screen border can be recentered or moved dynamically to correspond with movement through the tree. For example, screen border 50 is initially centered (either vertically or horizontally or both) on block 2, and as soon as block 4 is selected, or becomes a more likely or recommended selection, the screen border 50 shifts along the direction indicated by arrow 52 to new screen border position 50′, which is now centered on block 4. FIG. 30 illustrates a similar resizing feature as it might be displayed after block 6 is selected over block 8.
  • The decision support module 153 may also be configured to transition between differential diagnosis and treatment protocols; for example, as a likely diagnosis is approached by a clinical support module, the user may be prompted to select or begin a treatment protocol consistent with one or more likely diagnoses or pre-diagnoses. As another example, one or more treatment protocol trees may be presented at the end of a differential diagnosis or clinical decision support tree, in order to guide the user through the recommended treatment protocol once the decision support module 153 has helped the user identify the condition that requires treatment.
  • The patient monitor/defibrillator device 154 may also be configured for several different care modes, and may be configured to enter the most likely or most relevant care mode based on the user's navigation of the clinical decision support process, for example on auxiliary display 156, and to change between two or more care modes as appropriate as the user navigates the clinical decision support tree, according to embodiments of the present invention.
  • FIG. 31 illustrates a user interface display with dynamic softkeys, according to embodiments of the present invention. Just as the nodes on a decision support tree display may be dynamically visually adjusted to help the user in navigating the process, so too the selection options on a patient monitoring or treatment device 154 may be dynamically adjusted to guide the user through a particular clinical decision support process. FIG. 31 shows a housing of a patient monitor/defibrillator 54, which may include a screen 55 (for example operated by display module 155 of FIG. 26), and which may include a number of physical user input devices 56, 58, 60, 62, 64, which may be for example buttons. The screen 55 may be configured to display a user interface as shown, which may include one or more softkeys 66, 68, 70, 72, 74, with one or more of the softkeys 66-74 corresponding to one or more of the buttons 56-64. Based on the disclosure provided herein, more or fewer buttons and/or softkeys may be used, and the positioning of the buttons and/or softkeys may vary across different units, models, or designs. For example, the buttons may alternatively or additionally extend vertically across one side of the screen 55.
  • The softkeys 66-74 are part of the display screen that may be dynamically modified by the processor 150 and/or the patient monitor 154, such that the buttons 56-64 may be used by the user to select different options at different times. This allows the user to navigate through various menus with a single row of buttons. According to some embodiments of the present invention, the device does not include any physical buttons, and instead uses only softkeys on the display screen 55 that are themselves selectable (e.g. via a touchscreen arrangement). As such, the term “softkey” is used herein in its broadest sense to refer to any combination of physical and virtual buttons that may be used by a user to select from one or more options.
  • Similar to the process described with respect to FIGS. 28-30, the softkeys 66-74 may be dynamically adjusted to assist the user in navigating a decision support process. Based on the disclosure provided herein, one of ordinary skill in the art will recognize numerous different menus or clinical decision support processes that may benefit from such dynamically adjusting softkeys. Just a few particular examples are shown in FIGS. 32 and 33. For example, if a user selected the “acute care diagnose” button or softkey from the user interface display of FIG. 3, the user could be taken to the screen of FIG. 31 with dynamic softkeys 66-74. Such softkeys may initially look very similar to those of FIGS. 3 and 31; however, according to one embodiment of the present invention, after the user has entered the acute care diagnosis function, and before the user has selected the next branch of the process, the patient monitor/defibrillator observes a cardiac arrhythmia based on the patient's simultaneously observed ECG waveform. Based on this physiological data, the display module 155 emphasizes the Cardiac Distress softkey 70 by visually emphasizing it or visually distinguishing it over the other simultaneously displayed softkeys, as shown in FIG. 32. For example, the Cardiac Distress softkey 70 may be changed in color or boldness. The softkey 70 may include a displayed geometric shape, and such shape may be changed, or its perimeter may be made bolder or more visually distinct. As another option, the text within the softkey 70 may be enlarged or emboldened or italicized in order to visually distinguish softkey 70 based on the physiological data.
  • According to some embodiments of the present invention, the user interface displayed on the screen 55, and/or the screen display of an accompanying tablet device, includes one or more legends for visually indicating to the user why one or more softkeys have been emphasized or highlighted. For example, such a legend may include text such as “possible cardiac arrhythmia” to explain why the Cardiac Distress softkey 70 is emphasized, or “low SpO2” to explain why the Respiratory Distress softkey 66 is emphasized, or “dispatch: chief complaint=trauma” to explain why the Trauma softkey 72 is emphasized, according to embodiments of the present invention.
  • As an alternative, or in combination with the color, font, font size, shape, and similar visual distinguishing features, based on this physiological data, the display module 155 resizes the Cardiac Distress softkey 70 by making it larger, or by making the other softkeys smaller, as shown in FIG. 33. Although FIGS. 32 and 33 illustrate only one softkey 70 being emphasized and/or resized based on available patient data, the display module 155 may further be configured to dynamically emphasize and/or resize more than one softkey, in more than one way, according to embodiments of the present invention. For example, if the patient's blood oxygen content is observed by the monitor 154 as being below a certain threshold, and the patient's ECG waveform is observed by the monitor 154 as being irregular, both the Cardiac Distress softkey 70 and the Respiratory Distress softkey 66 may be visually emphasized or resized with respect to the other softkeys, and may also be visually emphasized or resized with respect to each other depending upon the relative significance of each possible diagnosis or treatment protocol. For example, if the decision support module 153 or processor 150 is able to determine that the cause of the respiratory distress is likely cardiac distress, then the cardiac distress softkey 70 may be the largest or most emphasized softkey, while the respiratory distress softkey 66 may be the next largest or next most emphasized softkey, followed by the remaining softkeys. Once a definitive selection is made, the softkeys 66-74 may be configured to dynamically update to reflect the next decision/step or set of decisions/steps. The dynamic resizing and/or emphasizing of various softkeys conveys a greater level of helpful decision support to the user, without sacrificing the user's ability to select even one of the softkeys that is not enlarged or emphasized, according to embodiments of the present invention.
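  • The relative emphasis of simultaneously displayed softkeys could be computed roughly as follows. The significance scores stand for whatever the decision support module derives from the sensor and charting data; the softkey labels are those used in the examples above, and the emphasis levels are illustrative.

      def rank_softkeys(significance):
          """Order softkeys by significance and assign a relative emphasis level.

          significance -- mapping of softkey label to a score from the decision
                          support module (e.g. derived from ECG and SpO2 findings)
          Returns a list of (label, emphasis), most significant first.
          """
          ordered = sorted(significance.items(), key=lambda kv: kv[1], reverse=True)
          levels = ["largest", "next_largest", "normal"]
          return [(label, levels[min(i, len(levels) - 1)])
                  for i, (label, _) in enumerate(ordered)]

      # Irregular ECG plus low SpO2, with cardiac distress judged the likely cause:
      scores = {"Cardiac Distress": 0.9, "Respiratory Distress": 0.6, "Trauma": 0.1}
      print(rank_softkeys(scores))
      # [('Cardiac Distress', 'largest'), ('Respiratory Distress', 'next_largest'), ('Trauma', 'normal')]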
  • Although the dynamic adjustment of visual characteristics of softkeys has been described with respect to observed physiological data about the patient, such dynamic adjustment may alternatively or additionally be accomplished using patient charting data or other patient data entered manually or automatically. For example, if the patient's chart at the beginning of the medical event indicates that the patient was involved in an automobile accident, the Trauma softkey 72 may be configured for initial enlargement and/or emphasis as soon as the user selects the “acute care diagnose” function from the interface of FIG. 2, according to embodiments of the present invention.
  • FIG. 34 illustrates a code review interface for reviewing user interface display data corresponding to a medical event, according to embodiments of the present invention. The code review interface includes a user interface replicator 455 as well as a visual timeline indicator 300. Throughout a medical event, the user of the patient monitor/defibrillator 154 takes the display screen 55 of the monitor 154 through various steps and user interface modes. It is often helpful, after the medical event has occurred, for the user, as well as someone who is reviewing or critiquing the performance of the user, to be able to know what happened during the medical event and when during the medical event such events occurred. Such information is particularly helpful in the time leading up to or following a significant patient event, in order to determine the appropriateness or effectiveness of the particular treatment applied. To this end, the processor 150 may be configured to capture visual representations (e.g. “snapshots” in time) of some or all of the user interface screen 55 and store them for later review, for example in database 152, according to embodiments of the present invention. Such review may be accomplished in the form of a playback interface as shown in FIG. 34. Such snapshots of the user interface 55 may be recorded at least once each second, twice each second, or more times each second, at regular or irregular intervals, according to embodiments of the present invention. In some embodiments, the snapshots may be made frequently enough (e.g. at the data sample rate of 500 snapshots per second) to provide full fidelity playback of the event.
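  • Snapshot capture and storage could be organized as in the sketch below. The half-second interval corresponds to one of the rates mentioned above; the class and the frame-grabbing callable are assumptions, since the actual capture depends on the display hardware and on database 152.

      import time

      class ScreenRecorder:
          """Capture periodic snapshots of the user interface for later playback."""

          def __init__(self, grab_frame, interval_s=0.5, store=None):
              self.grab_frame = grab_frame      # callable returning the current screen image
              self.interval_s = interval_s      # 0.5 s = two snapshots per second
              self.store = store if store is not None else []   # stands in for database 152

          def capture(self, now=None):
              """Record one (timestamp, image) pair."""
              now = time.time() if now is None else now
              self.store.append((now, self.grab_frame()))

          def frame_at(self, t):
              """Return the snapshot closest to time t, for the playback interface."""
              return min(self.store, key=lambda rec: abs(rec[0] - t))

      rec = ScreenRecorder(grab_frame=lambda: "<screen image>")
      rec.capture(now=0.0)
      rec.capture(now=0.5)
      print(rec.frame_at(0.4))   # (0.5, '<screen image>')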
  • The interface replicator 455 and visual timeline indicator 300 may be configured to play back the screen user interface appearance at the same rate at which the images were taken or captured, and dynamically move the position of the timeline indicator 308 along the timeline 300 from the beginning time indicator 304 to the ending time indicator 306, according to embodiments of the present invention. A current position indicator 302 indicates the time, for example in hour:minute:second format, at which the particular user interface screen shot shown in the user interface replicator 455 was taken (or at which such a user interface was displayed during the medical event). As such, a person reviewing the progression of the screen interface 55 sees the screen interface 55 in the user interface replicator 455 just as it would have been seen by the user of the device at the time of the medical event, according to embodiments of the present invention.
  • The visual timeline indicator 300 may also include visual event indicators, such as drug administration visual event indicator 310 and patient defibrillation visual event indicator 312. Other visual event indicators may include, for example, the occurrence of an alarm, the time at which a blood pressure measurement or signal was acquired (which may be helpful for documenting at the end of a medical event), event markers, clinical decision tree points, the time at which spontaneous circulation returned (“ROSC”), and/or the time at which a “rearrest” softkey was pressed or at which a renewed or subsequent cardiac arrest condition was observed.
  • Visual event indicator 310 indicates the time during the medical event (e.g. on the timeline) at which a drug was administered to the patient. Visual event indicator 312 indicates the time during the medical event (e.g. on the timeline) at which a defibrillation treatment was applied to the patient, according to embodiments of the present invention. Fewer or more of the same or additional visual event indicators may be used in the visual timeline indicator 300, in order to signal to the reviewer the times at which significant events of interest occurred during the medical event. This then permits the reviewer to skip directly to the user interface time intervals of interest, rather than reviewing all user interface screen shots sequentially, according to embodiments of the present invention. As one example of how a user may skip directly to a desired time for playback of the user interface screen, the user may select timeline indicator 308 with a cursor or other selection process, and drag it left or right on the timeline before releasing it to resume playback at the time corresponding to the new location of the indicator 308, according to embodiments of the present invention. According to some embodiments of the present invention, the user may move the indicator 308 and thus the playback to the time of visual event indicator 310 (or to a time that is a predetermined interval before the time of visual event indicator 310) by simply clicking on visual event indicator 310.
  • The interface of FIG. 34 may further include a current position indicator 302, which displays a time corresponding to the position of the indicator 308 along the timeline 300 and corresponding to the image displayed in the user interface replicator 455, according to embodiments of the present invention. While FIG. 34 illustrates a substantially linear timeline, other non-linear timeline indicators may be used. The code review interface of FIG. 34 may also be particularly helpful in reviewing the recorded screen images for dynamic softkey adjustments, as described with respect to FIGS. 32 and 33. For example, if a user failed to select a particular softkey that was later determined to have been the preferred course of action, the code reviewer could set the indicator 308 to the time that such softkey was displayed to see whether the particular softkey was resized or emphasized in order to indicate that it was the preferred course of action. Reviewers using the interface of FIG. 34 are also able to see what exactly was on the user's screen when certain actions were undertaken, for example what the user looked at just prior to the drug administration event 310, according to embodiments of the present invention. According to some embodiments of the present invention, the interface of FIG. 34 operates in a manner similar to that of digital video recorder playback.
  • Screen controls consistent with user interfaces that play back movies may be included in the interface of FIG. 34. For example, the interface may include a media navigation interface including media navigation bar 314, volume selection bar 314, and/or playback speed selection bar 316. The media navigation bar 314 may include screen controls similar to those used with playback of movies, to control the content of the user interface replicator. For example, the media navigation bar 314 may include a play button 322, a stop button 324, a pause button 326, a rewind button 320, and a fast forward button 328. A skip back button 314 and skip forward button 330 may also be included, for example to skip between medical events, chapters, and/or visual event indicators, according to embodiments of the present invention. As used herein, “button” is used to refer to either or both of a physical button or a virtual/screen selection interface option. By clicking on or otherwise selecting one of the 2x, 4x, 8x, or 16x portions of the playback speed selection bar 316, the speed at which the medical event is played on the user interface replicator 455 may be adjusted. The playback speed selection bar 316 may also be configured to visually indicate which of the playback speed selections is currently active. Other or additional speed selections may be provided. Clicking on or otherwise selecting volume selection bar 314 permits adjustment of any audio playback volume (e.g. when audio data from the medical event is also played back simultaneously or instead of the visual data).
  • According to some embodiments of the present invention, the on-screen cursor 334 (or other selection mechanism) may take the form of a hand with a pointed finger. When the finger is placed over, on, or near the timeline, a display preview pop-up window 332 opens, for example attached to or in the vicinity of the finger or cursor 334. The display preview window 332 may show, for example, a physiologic waveform along with static measurements and time and events in sufficient detail for the user to determine whether to select that particular timeline location for current playback, according to embodiments of the present invention. The display preview window 332 includes the physiologic waveform and measurements/events portion 336, as well as a time indicator portion 338 indicating where, along the visual timeline indicator 300, the cursor 334 has been placed, according to embodiments of the present invention. According to some embodiments of the present invention, selecting and “holding” the selection on the timeline indicator 308 and scrolling forward and backward along the timeline 300 causes a similar display preview window 332 to pop up at or near the slider 308.
  • According to some embodiments of the present invention, the user can play back the clinical decision support tree for reviewing the medical event. For example, a tablet screen, or a screen controlled by display module 156, or alternatively an interface similar to that of FIG. 34, could be configured to indicate a timeline and display the user's progression through a clinical decision support tree by highlighting each node through which the process was taken, and the time at which such node selection was made. According to some embodiments of the present invention, a representation of the clinical decision support tree is itself used as a visual timeline indicator, permitting a user to select a node in order to see, in the user interface replicator 455, what the defibrillator/monitor 154 screen 55 looked like at the time or times when the user was at the selected step in the decision support process. According to some embodiments, the display module 156 and processor 150 may be communicably coupled bidirectionally with the defibrillator/monitor 154, and the defibrillator/monitor 154 screen 55 itself may be used as (for example instead of) the user interface replicator 455. In addition to being able to select a particular node in the decision support tree to view the monitor display at that selected step, the tablet computer screen or other display device operated by display module 156 may be configured to show a user-selectable list of event markers which, when selected by the user, replicates the display of the monitor 154 at the time of the marked event, either using display module 155 or user interface replicator 455, according to embodiments of the present invention. For example, the following list of event markers could be displayed on a tablet computing device communicably coupled to the defibrillator/monitor 154 (a minimal sketch of marker-driven playback appears after the list):
  • 03:05:00 SBP 110/80, HR 99, SpO2 95%
  • 03:08:00 alarm: SpO2 88%
  • 03:08:30 event: O2 delivery
  • 03:10:00 SBP 105/82, HR 110, SpO2 92%
  • 03:11:01 event: ACLS arrive
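  • A minimal sketch of how such markers could drive playback follows. The marker times and labels mirror the example list above; the snapshot storage format follows the recorder sketch earlier in this section, and both are assumptions rather than the patent's implementation.

      # (seconds from midnight, label) pairs corresponding to the example list above
      EVENT_MARKERS = [
          (3 * 3600 + 5 * 60, "SBP 110/80, HR 99, SpO2 95%"),
          (3 * 3600 + 8 * 60, "alarm: SpO2 88%"),
          (3 * 3600 + 8 * 60 + 30, "event: O2 delivery"),
          (3 * 3600 + 10 * 60, "SBP 105/82, HR 110, SpO2 92%"),
          (3 * 3600 + 11 * 60 + 1, "event: ACLS arrive"),
      ]

      def snapshot_for_marker(snapshots, marker_time, lead_in_s=0):
          """Return the stored screen snapshot nearest the marker time.

          snapshots -- list of (timestamp, image) pairs captured during the event
          lead_in_s -- optional interval to jump to *before* the marked event
          """
          target = marker_time - lead_in_s
          return min(snapshots, key=lambda rec: abs(rec[0] - target))

      snaps = [(t, "<frame %d>" % t) for t in range(10800, 11400, 30)]
      print(snapshot_for_marker(snaps, EVENT_MARKERS[1][0], lead_in_s=10))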
  • Although FIG. 34 depicts a user interface replicator 455, other replicators may be used to display or play back other observed parameters that occurred over the course of a medical event; for example, graphs, trends, and/or charts representing patient information or physiological status. Such an ability to quickly and efficiently review patient data for a medical event or portions thereof may be helpful not only for a subsequent reviewer, but may also be helpful for the user during the medical event, and/or for a subsequent user during the medical event, for example when a patient is transferred from a Basic Life Support crew to an Advanced Life Support crew. The interface of FIG. 34, or a similar interface, may permit review of the patient's care report, ECG or 12-lead waveforms, cardiopulmonary resuscitation quality, and other patient care information or data. Event markers may be used as described above. As another example, an event marker may be used to indicate that the patient was administered a bronchodilator medication, and the code review interface may be used to look at the patient's respiratory status before and after the application of the bronchodilator. This permits the same user, or a subsequent user for the same patient, or a subsequent reviewer, to observe how effective the bronchodilator dosage was, and perhaps to factor such information into a decision to again administer the same or another treatment. As another example, the interface of FIG. 34 or a similar interface may be used to review how the patient's carbon dioxide waveform changes upon patient treatment. “Snapshots” may be recorded and played back through a similar interface for other patient data, for example the data from a ventilation monitoring device (e.g. minute ventilation).
  • According to some embodiments of the present invention, alarm thresholds may be dynamically adjusted based on patient physiological data and/or charting data. In addition, automated measurements, for example blood pressure measurements, may be adjusted based on patient physiological data and/or charting data, for example by changing the frequency of such measurements. For example, when a traumatic brain injury is suspected or diagnosed based on the patient physiological data, charting data, and/or via following a clinical decision support process, the monitor 154 may be configured to automatically obtain vital signs (e.g. blood pressure, SpO2, heart rate, and respiratory rate) every five minutes. For other, less critical conditions, these vital signs may only need to be taken twice during the entire patient event. As another example, automatic blood pressure measurements may be disabled when treating a cardiac patient, and then re-enabled once the patient achieves return of spontaneous circulation.
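  • The adjustment of automated measurement frequency might look roughly like the sketch below. The five-minute interval for a suspected traumatic brain injury comes from the example above; the other intervals and the condition keys are assumptions.

      # Seconds between automatic vital-sign measurements; None disables the measurement.
      MEASUREMENT_INTERVALS = {
          "traumatic_brain_injury": 5 * 60,   # every five minutes, per the example above
          "cardiac_arrest": None,             # e.g. disable automatic NIBP during arrest
          "post_rosc": 5 * 60,                # assumed interval after return of circulation
          "default": 15 * 60,                 # assumed interval for less critical cases
      }

      def nibp_interval(condition):
          """Return the automatic blood pressure interval for the current condition."""
          return MEASUREMENT_INTERVALS.get(condition, MEASUREMENT_INTERVALS["default"])

      print(nibp_interval("traumatic_brain_injury"))   # 300
      print(nibp_interval("cardiac_arrest"))           # None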
  • As another example illustrating how alarm thresholds may be dynamically adjusted in a traumatic brain injury medical event, a systolic blood pressure (“SBP”) alarm may be configured on the monitor 154 to alert the user with an alarm if an adult's SBP is less than 90 mmHg, with a ventilation rate target of 10 breaths per minute, and/or the end tidal carbon dioxide is less than 35 mmHg. These targets may need to be adjusted based on a patient's age; for example, for a three-year old, a systolic blood pressure alarm may be set to activate with an SBP of less than 76 mmHg and/or a ventilation rate target of twenty breaths per minute. For a one-year old, a systolic blood pressure alarm may be set to activate with an SBP of less than 72 mmHg, and/or a ventilation rate target of twenty-five breaths per minute. According to embodiments of the present invention, the processor 150 is configured to automatically adjust the thresholds based on the patient's age, in a traumatic brain injury situation, based on user input, rather than requiring the user to manually reconfigure the alarm thresholds based on age. For example, the processor 150 may obtain the patient's age from database 152, and/or from a patient charting system to which it is communicably coupled, and use the patient's age to automatically reconfigure the alarm thresholds upon an indication, either via a softkey selection or from the decision support module 153, that a traumatic brain injury situation applies. Alternatively, the clinical decision support tree for traumatic brain injury may, at the appropriate node in the process, request the user to select from various age groupings, and use the user's selection from the decision support tree to automatically adjust the alarm thresholds. The processor 150 may also be configured to silence all alarms upon a determination that the patient has entered cardiac arrest, and then re-enable all alarms upon a determination that the patient has achieved a return of spontaneous circulation. According to some embodiments of the present invention, the processor 150 may be configured to, after a cardiac arrest event for an adult, reset the alarm thresholds to end tidal carbon dioxide <30 mmHg (possibly lower for a traumatic brain injury situation) or heart rate <40 beats per minute. While alarm and other thresholds are discussed as being adjustable in traumatic brain injury medical events, alarms and other thresholds may also be dynamically adjusted for other patient events or conditions, according to embodiments of the present invention.
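  • The age-based reconfiguration could be expressed as a small lookup, as sketched below. The SBP and ventilation-rate values for an adult, a three-year-old, and a one-year-old follow the examples above; the age cut-offs used to pick a row, and the function name, are assumptions.

      # (max_age_years, SBP alarm threshold in mmHg, ventilation rate target in breaths/min)
      TBI_ALARM_TABLE = [
          (2, 72, 25),     # around one year old
          (5, 76, 20),     # around three years old
          (None, 90, 10),  # adult
      ]
      ETCO2_LOW_MMHG = 35  # alarm if end tidal carbon dioxide falls below this value

      def tbi_alarm_settings(age_years):
          """Pick SBP/ventilation alarm settings for a traumatic brain injury patient."""
          for max_age, sbp_low, vent_rate in TBI_ALARM_TABLE:
              if max_age is None or age_years <= max_age:
                  return {"sbp_low": sbp_low, "vent_rate_target": vent_rate,
                          "etco2_low": ETCO2_LOW_MMHG}

      print(tbi_alarm_settings(1))    # {'sbp_low': 72, 'vent_rate_target': 25, 'etco2_low': 35}
      print(tbi_alarm_settings(35))   # {'sbp_low': 90, 'vent_rate_target': 10, 'etco2_low': 35}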
  • According to embodiments of the present invention, system 2600 is used to assist clinicians with delivering medications at the appropriate dose. Medication errors can cause significant problems, particularly in the treatment of pediatric patients and when drugs are substituted. According to some embodiments of the present invention, the decision support module 153 displays guidance to help physicians comply with a protocol, for example a protocol related to drug delivery. For example, a medical director may provide drug options for treatment of a particular condition. The dosing of the drug may be determined based on patient age, body weight, Broselow measurement, and/or medical complaint. At any time, the medical director may change the drug options and dosing based on factors such as the availability of the drug. A physician might even change the recommendations in real time, or in clinical time, if remotely monitoring the treatment. The system may include protections and/or safeguards to ensure that the information about the drug and/or the patient is correctly entered into the database 152, in order to permit the decision support module 153 to accurately guide the caregiver in dosing according to the current drug delivery protocol, according to embodiments of the present invention.
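The hypothetical sketch below illustrates a weight-based dose lookup of the kind described. The drug name "drug_x", the mg/kg figure, and the dose cap are placeholders, not clinical recommendations and not part of any disclosed protocol.

    # Hypothetical protocol entry supplied by a medical director; all
    # numbers are placeholders.
    DRUG_PROTOCOL = {
        "drug_x": {"mg_per_kg": 0.1, "max_mg": 10.0},
    }

    def suggested_dose_mg(drug, weight_kg=None, broselow_weight_kg=None):
        """Weight-based dose suggestion, falling back to a Broselow
        length-based weight estimate when a measured weight is unavailable."""
        entry = DRUG_PROTOCOL[drug]
        kg = weight_kg if weight_kg is not None else broselow_weight_kg
        if kg is None:
            raise ValueError("need a measured or Broselow-estimated weight")
        return min(kg * entry["mg_per_kg"], entry["max_mg"])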
  • As shown in FIG. 35, when part of a process, for example a clinical decision support tree, has a decision process that flows through arrow 401 and into decision point 400, system 2600 may assist the decision support module 153 in determining whether to select, suggest, and/or recommend node 402 or node 404, according to embodiments of the present invention. After node 402 is selected, the process may continue to the next node via arrow 403. After node 404 is selected, the process may continue to the next node via arrow 405, according to embodiments of the present invention. As shown in FIG. 35, a drug delivery portion of a clinical decision support tree may, at node 400, have the decision support module 153 determine a correct dosing for the particular patient based upon observed and/or input patient characteristics (block 406), and/or may determine, suggest, or recommend one of two or more doses, for example dose A (block 402) or dose B (block 404), according to embodiments of the present invention. The observed and/or input patient characteristics (block 406) that might suggest different dosing include age, weight, allergies, and/or other conditions, according to embodiments of the present invention.
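A minimal Python sketch of a decision node like node 400 follows, with a toy rule standing in for the dosing logic of block 406. The class, field, and function names, and the weight/allergy rule itself, are assumptions for illustration and do not describe the actual tree in FIG. 35.

    from dataclasses import dataclass, field
    from typing import Callable, Dict, Optional

    @dataclass
    class DecisionNode:
        """One node of an illustrative decision support tree (cf. node 400)."""
        name: str
        choose: Optional[Callable[[dict], str]] = None   # picks a child key
        children: Dict[str, "DecisionNode"] = field(default_factory=dict)

    def pick_dose(patient: dict) -> str:
        # Toy rule standing in for the dosing logic of block 406.
        return ("dose_a"
                if patient.get("weight_kg", 0) >= 40
                and "drug_x" not in patient.get("allergies", [])
                else "dose_b")

    dose_a = DecisionNode("dose A (block 402)")
    dose_b = DecisionNode("dose B (block 404)")
    node_400 = DecisionNode("dosing decision (node 400)", choose=pick_dose,
                            children={"dose_a": dose_a, "dose_b": dose_b})

    def step(node: DecisionNode, patient: dict) -> DecisionNode:
        """Advance one step, e.g. along arrow 403 or arrow 405 out of node 400."""
        return node.children[node.choose(patient)] if node.choose else node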
  • FIG. 36 illustrates a user interface, for example a user interface for a patient monitor/defibrillator 154 or other device. The user interface of FIG. 36 includes display portions that indicate trending information for various values. For example, the interface illustrates trending data for systolic blood pressure (referenced as 3600), end tidal carbon dioxide (EtCO2), and blood oxygen saturation (SpO2). Trending data may be displayed as a running record of previous readings. The oldest readings may appear on the left and the newest readings on the right, with each new reading inserted on the right side while displacing the oldest reading on the left side, according to embodiments of the present invention. Alternatively, the oldest readings may appear on the right and the newest readings on the left, with each new reading inserted on the left side while displacing the oldest reading on the right side. Other options for visually indicating the trend data for a given signal may be employed.
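One simple way to hold such a running record is a fixed-length buffer, as in the Python sketch below. The class name, buffer length, and methods are assumptions about one possible implementation, not the disclosed device's data structure.

    from collections import deque

    class TrendBuffer:
        """Fixed-length running record of readings for one vital sign."""
        def __init__(self, max_points=12):
            self._points = deque(maxlen=max_points)

        def add(self, timestamp, value):
            # Newest reading on the right; once full, the oldest (leftmost)
            # reading is displaced automatically.
            self._points.append((timestamp, value))

        def oldest_first(self):
            return list(self._points)

        def newest_first(self):
            # Reversing gives the alternative newest-on-the-left layout.
            return list(reversed(self._points))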
  • Conventional trending data displays for medical devices, for example a patient monitor/defibrillator 154, help a clinician assess patient history and condition, but they often fail to convey how the trending values compare with acceptable values or ranges of values, or with user-defined values or ranges of values. According to some embodiments of the present invention, the scaling of the trending readouts, the frequency of the values displayed for the trending values, and/or a color in which the trending values are displayed is customized according to the particular patient and/or the patient's condition. This may be done by the decision support module 153. For example, if as part of a decision support process the decision support module 153 receives information indicating a patient's age, then the processor may be configured to set the color in which each bar of the blood pressure trending graph 3600 is displayed.
  • The three bars on the left 3602 may be displayed as green to indicate that the patient's blood pressure at the times corresponding to those particular blood pressure measurements was within acceptable limits for the patient's age. The middle five bars 3604 may be yellow to indicate that the patient's blood pressure at the times corresponding to those particular blood pressure measurements was below acceptable limits, but not yet at a critical level. The right three bars 3606 may be red to indicate that the patient's blood pressure at the times corresponding to those particular blood pressure measurements was far below acceptable ranges, and was therefore at a critical level. In embodiments in which the newest trending values appear on the right side, the trend graph 3600 for patient systolic blood pressure indicates that the patient's blood pressure is worsening over time by becoming lower. Of course, other colors may be used, and additional colors and/or ranges may be employed. These ranges may be automatically adjusted by the decision support module 153 based on various factors, for example the patient's age or other conditions. For example, all of the bars 3602, 3604, and 3606 can be displayed as green for a normal adult patient, while the same absolute readings may be colored as shown in FIG. 36 for a younger or adolescent patient. The coloring, target ranges, or other visual indication of the trending data may also be adjusted by the decision support module 153 during the patient monitoring event, based on data observed by the patient monitoring device 154.
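The color banding could be computed along the lines of the Python sketch below. The pediatric low-SBP formula matches the 76 mmHg (three-year-old) and 72 mmHg (one-year-old) examples given earlier; the adult cut-off age and the 15 mmHg "critical" offset are placeholder assumptions.

    def sbp_bar_color(sbp_mmhg, age_years):
        """Classify one systolic blood pressure bar for display."""
        low_limit = 90 if age_years >= 10 else 70 + 2 * age_years
        critical = low_limit - 15
        if sbp_mmhg >= low_limit:
            return "green"    # within acceptable limits (cf. bars 3602)
        if sbp_mmhg > critical:
            return "yellow"   # below limits, not yet critical (cf. bars 3604)
        return "red"          # far below acceptable range (cf. bars 3606)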
  • According to some embodiments of the present invention, the clinician manually adjusts the target values of signals, which may be beneficial if the patient is “crashing,” for example. Instead of a screen full of red target values, the clinician could select ranges corresponding to targets that have a more realistic chance of being achieved, according to embodiments of the present invention.
  • According to some embodiments of the present invention, if a patient has cerebral herniation or impending cerebral herniation, the ETCO2 and/or ventilation rate targets may be changed in order to hyperventilate such patients so as to reduce intracranial pressure. These ranges or targets may be adjusted automatically if, in the course of a decision support process, the decision support module 153 detects, either automatically or via manual, clinical, or other inputs, that the patient has experienced or is about to experience cerebral herniation.
  • According to some embodiments of the present invention, if the decision support module 153 detects that the ETCO2 is below a certain threshold, the target ventilation rate is adjusted downward. If the decision support module 153 detects that the ETCO2 is above a certain threshold, the target ventilation rate is adjusted upward, according to embodiments of the present invention. Such adjusted ventilation rate targets may include an upper and/or lower limit to prevent other undesired results, because high or low ETCO2 readings may be caused by factors other than ventilation rate (e.g., a very low ETCO2 may be caused by poor perfusion).
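A Python sketch of such a bounded feedback adjustment follows; the ETCO2 thresholds, step size, and clamp limits are placeholder assumptions rather than recommended settings.

    ETCO2_LOW_MMHG, ETCO2_HIGH_MMHG = 35.0, 45.0   # assumed thresholds
    RATE_MIN, RATE_MAX = 8, 30                     # breaths/min clamp

    def adjust_vent_rate_target(current_target, etco2_mmhg, step=2):
        if etco2_mmhg < ETCO2_LOW_MMHG:
            target = current_target - step   # likely over-ventilated: slow down
        elif etco2_mmhg > ETCO2_HIGH_MMHG:
            target = current_target + step   # likely under-ventilated: speed up
        else:
            target = current_target
        # Clamp, since extreme ETCO2 can reflect poor perfusion rather than
        # an inappropriate ventilation rate.
        return max(RATE_MIN, min(RATE_MAX, target))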
  • Various modifications and additions can be made to the exemplary embodiments discussed without departing from the scope of the present invention. For example, while the embodiments described above refer to particular features, the scope of this invention also includes embodiments having different combinations of features and embodiments that do not include all of the described features. Accordingly, the scope of the present invention is intended to embrace all such alternatives, modifications, and variations as fall within the scope of the claims, together with all equivalents thereof.

Claims (22)

What is claimed is:
1. A method for code review of a medical event, the method comprising:
displaying, on a first device screen, a user interface during the medical event;
recording images of the user interface, each of the images representing an entirety of the user interface associated with a time during the medical event, wherein the images are recorded at least once every second;
displaying, on a second device screen, a visual timeline indicator and a user interface replicator, the user interface replicator displaying the images of the user interface and the visual timeline indicator representing a time associated with each of the images, wherein the visual timeline indicator and the user interface replicator permit sequential playback and review of the user interface images from the medical event, and wherein the visual timeline indicator accepts user input to move the sequential playback to a different time associated with the medical event.
2. The method of claim 1, wherein the visual timeline indicator includes a timeline including a beginning time of the medical event and an end time of the medical event, the method further comprising indicating on the timeline the time associated with the image of the user interface shown in the user interface replicator.
3. The method of claim 2, wherein the indicating on the timeline comprises using an indicator on the timeline to indicate the time associated with the image of the user interface shown in the user interface replicator.
4. The method of claim 3, further comprising advancing the indicator along the timeline in a direction from the beginning time toward the end time during sequential playback of the user interface images.
5. The method of claim 4, wherein the visual timeline indicator accepts user input by permitting scrolling of the indicator to a different position along the timeline.
6. The method of claim 2, wherein the visual timeline indicator displays the time associated with each of the images in an hour-minute-second format.
7. The method of claim 2, wherein the visual timeline indicator includes one or more event markers marking occurrence of clinically-relevant sub-events during the medical event.
8. The method of claim 7, wherein the one or more event markers include a drug event marker indicating a time at which a drug was administered to a patient during the medical event.
9. The method of claim 7, wherein the one or more event markers include a defibrillation event marker indicating a time at which a defibrillation shock was applied to a patient during the medical event.
10. The method of claim 7, wherein the one or more event markers include an ROSC event marker indicating a time at which a patient returned to spontaneous circulation.
11. The method of claim 7, wherein the one or more event markers include a rearrest event marker indicating a time at which a patient returned to cardiac arrest.
12. The method of claim 7, wherein the one or more event markers include an alarm event marker indicating a time at which an alarm was activated.
13. The method of claim 1, further comprising displaying a cursor on the second device screen and, when the cursor is hovered over or near a time represented by the visual timeline indicator, displaying on the second screen at or near the cursor the image of the user interface associated with the time.
14. The method of claim 13, further comprising, when the cursor is hovered over or near the time, displaying at or near the cursor a textual representation of the time.
15. A method for decision support during a medical event, the method comprising:
displaying, on a screen on a device, a user interface during the medical event, wherein the user interface comprises two or more softkeys each representing a possible user selection;
collecting physiological data from a patient with the device;
determining, based on the physiological data, which one of the two or more softkeys represents the possible user selection that most closely conforms to a treatment or diagnosis protocol; and
based on the determination, visually distinguishing the one of the two or more softkeys from the others of the two or more softkeys on the user interface.
16. The method of claim 15, wherein visually distinguishing the one of the two or more softkeys comprises making the one of the two or more softkeys larger than the others of the two or more softkeys.
17. The method of claim 15, wherein visually distinguishing the one of the two or more softkeys comprises changing a position of the one of the two or more softkeys on the user interface.
18. The method of claim 15, wherein visually distinguishing the one of the two or more softkeys comprises changing a color of the one of the two or more softkeys on the user interface.
19. The method of claim 15, wherein visually distinguishing the one of the two or more softkeys comprises changing a border of the one of the two or more softkeys on the user interface.
20. The method of claim 15, wherein visually distinguishing the one of the two or more softkeys comprises making the one of the two or more softkeys dynamically flash on the user interface.
21. The method of claim 15, wherein visually distinguishing the one of the two or more softkeys comprises displaying on the screen a legend describing why the one of the two or more softkeys has been visually distinguished.
22. The method of claim 21, wherein the one of the two or more softkeys is a cardiac distress softkey, and wherein the legend textually indicates a possible cardiac arrhythmia.
US14/151,602 2013-01-11 2014-01-09 EMS decision support interface, event history, and related tools Active 2035-01-21 US9658756B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US14/151,602 US9658756B2 (en) 2013-01-11 2014-01-09 EMS decision support interface, event history, and related tools
US15/497,745 US10976908B2 (en) 2013-01-11 2017-04-26 EMS decision support interface, event history, and related tools
US17/197,341 US11816322B2 (en) 2013-01-11 2021-03-10 EMS decision support interface, event history, and related tools

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201361751743P 2013-01-11 2013-01-11
US201361818334P 2013-05-01 2013-05-01
US14/151,602 US9658756B2 (en) 2013-01-11 2014-01-09 EMS decision support interface, event history, and related tools

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/497,745 Continuation US10976908B2 (en) 2013-01-11 2017-04-26 EMS decision support interface, event history, and related tools

Publications (2)

Publication Number Publication Date
US20140201627A1 true US20140201627A1 (en) 2014-07-17
US9658756B2 US9658756B2 (en) 2017-05-23

Family

ID=51166246

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/151,602 Active 2035-01-21 US9658756B2 (en) 2013-01-11 2014-01-09 EMS decision support interface, event history, and related tools
US15/497,745 Active 2035-03-06 US10976908B2 (en) 2013-01-11 2017-04-26 EMS decision support interface, event history, and related tools
US17/197,341 Active US11816322B2 (en) 2013-01-11 2021-03-10 EMS decision support interface, event history, and related tools

Family Applications After (2)

Application Number Title Priority Date Filing Date
US15/497,745 Active 2035-03-06 US10976908B2 (en) 2013-01-11 2017-04-26 EMS decision support interface, event history, and related tools
US17/197,341 Active US11816322B2 (en) 2013-01-11 2021-03-10 EMS decision support interface, event history, and related tools

Country Status (9)

Country Link
US (3) US9658756B2 (en)
EP (2) EP2943926A4 (en)
JP (3) JP6417334B2 (en)
CN (6) CN105164627B (en)
AU (1) AU2014205388A1 (en)
BR (1) BR112015016664A2 (en)
CA (1) CA2897915A1 (en)
RU (1) RU2015133474A (en)
WO (1) WO2014110280A2 (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140365175A1 (en) * 2013-06-10 2014-12-11 Zoll Medical Corporation Rescue Performance Metrics for CPR and Traumatic Brain Injury
US20150109135A1 (en) * 2013-10-23 2015-04-23 Dock Technologies Inc. Indicators
US20160085927A1 (en) * 2014-09-18 2016-03-24 Preventice, Inc. Dynamically adaptive care plans
WO2016109393A1 (en) * 2014-12-29 2016-07-07 Zoll Medical Corporation Rescue performance metric
USD777753S1 (en) * 2014-11-14 2017-01-31 Espec Corp. Display screen with graphical user interface
US20170265819A1 (en) * 2014-07-27 2017-09-21 Oridion Medical 1987 Ltd. Personalized patient alarm management
WO2017173014A1 (en) * 2016-04-01 2017-10-05 Cardiac Pacemakers, Inc. Systems and methods for detecting worsening heart failure
EP3300713A1 (en) 2016-09-30 2018-04-04 Zoll Medical Corporation Wearable sensor devices and systems for patient care
US9972181B1 (en) * 2014-04-11 2018-05-15 Vivint, Inc. Chronological activity monitoring and review
USD859452S1 (en) * 2016-07-18 2019-09-10 Emojot, Inc. Display screen for media players with graphical user interface
WO2019203824A1 (en) * 2018-04-18 2019-10-24 Kent Thomas Andrew Method of capturing and evaluating uncertainty in computerized intelligence systems for medical diagnosis
US10485490B2 (en) * 2010-11-11 2019-11-26 Zoll Medical Corporation Acute care treatment systems dashboard
WO2020006391A1 (en) * 2018-06-29 2020-01-02 Zoll Medical Corporation Systems and methods for providing resuscitation guidance based on physical features of a patient measured during an acute care event
US20200107810A1 (en) * 2014-10-14 2020-04-09 M3Dicine Ip Pty Ltd Systems, devices, and methods for capturing and outputting data regarding a bodily characteristic
WO2020197903A1 (en) * 2019-03-22 2020-10-01 Zoll Medical Corporation Handling of age of transmitted data in medical device system
US20210022688A1 (en) * 2019-07-26 2021-01-28 GPS Health LLC Methods and systems for generating a diagnosis via a digital health application
US20210169736A1 (en) * 2018-04-17 2021-06-10 Koninklijke Philips N.V. Device, system and method for supporting detection of return of spontaneous circulation during cardiopulmonary resuscitation
US11179293B2 (en) 2017-07-28 2021-11-23 Stryker Corporation Patient support system with chest compression system and harness assembly with sensor system
US11202579B2 (en) 2016-08-08 2021-12-21 Zoll Medical Corporation Wrist-worn device for coordinating patient care
WO2022040433A1 (en) * 2020-08-19 2022-02-24 Recovery Exploration Technologies Inc. Augmented intelligence for next-best-action in patient care
US11262904B2 (en) * 2018-04-20 2022-03-01 Palantir Technologies Inc. Object time series system and investigation graphical user interface
WO2022051672A1 (en) * 2020-09-04 2022-03-10 Zoll Medical Corporation Medical treatment system with companion device
US11508474B2 (en) 2016-03-31 2022-11-22 Zoll Medical Corporation Event reconstruction for a medical device
US11615565B2 (en) 2018-04-20 2023-03-28 Palantir Technologies Inc. Object time series system
US11735320B2 (en) * 2018-12-04 2023-08-22 Merative Us L.P. Dynamic creation and manipulation of data visualizations
US11869674B2 (en) 2022-04-15 2024-01-09 Recovery Exploration Technologies Inc. Translation of medical evidence into computational evidence and applications thereof

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105164627B (en) 2013-01-11 2019-04-09 卓尔医学产品公司 EMS decision support interface, event history and related tool
JP6461696B2 (en) * 2015-04-20 2019-01-30 日本光電工業株式会社 Portable medical device, program, and biological information display method
KR20180039703A (en) * 2015-08-11 2018-04-18 마시모 코오퍼레이션 Medical monitoring analysis and replay including signs of reacting to light weakened by body tissues
JP2017213249A (en) * 2016-06-01 2017-12-07 セイコーエプソン株式会社 Biological information display system, portable terminal device, wearable device, biological information display method, and biological information display program
US9899038B2 (en) * 2016-06-30 2018-02-20 Karen Elaine Khaleghi Electronic notebook system
JP6817755B2 (en) * 2016-09-14 2021-01-20 浜松ホトニクス株式会社 How to operate the concentration measuring device and the concentration measuring device
EP3552177A1 (en) * 2016-12-12 2019-10-16 Koninklijke Philips N.V. System and method for facilitating computational analysis of a health condition
US10832594B2 (en) 2017-03-09 2020-11-10 Zoll Medical Corporation Automated detection of cardiopulmonary resuscitation chest compressions
US11311457B2 (en) 2017-03-09 2022-04-26 Zoll Medical Corporation Automated detection of cardiopulmonary resuscitation chest compressions
CN110998645A (en) * 2017-08-10 2020-04-10 奥林巴斯株式会社 Observation system and information management method
CN107789742A (en) * 2017-11-24 2018-03-13 青岛市光电工程技术研究院 Portable physiotherapeutic device
US11295835B2 (en) * 2017-11-27 2022-04-05 Biosense Webster (Israel) Ltd. System and method for interactive event timeline
JP6968195B2 (en) * 2017-12-01 2021-11-17 富士フイルム株式会社 Inspection information display device, method and program
AU2018379597A1 (en) * 2017-12-08 2020-06-04 Fisher & Paykel Healthcare Limited Graphical user interface for a flow therapy apparatus
US10235998B1 (en) 2018-02-28 2019-03-19 Karen Elaine Khaleghi Health monitoring system and appliance
USD977503S1 (en) * 2018-06-25 2023-02-07 Medipines Corporation Computer display screen with breathing samples interface
US20200113471A1 (en) * 2018-10-16 2020-04-16 Cardionomous Llc Heart signal waveform processing system and method
EP3870280A1 (en) 2018-10-23 2021-09-01 Zoll Medical Corporation Data playback interface for a medical device
US10559307B1 (en) 2019-02-13 2020-02-11 Karen Elaine Khaleghi Impaired operator detection and interlock apparatus
US10735191B1 (en) 2019-07-25 2020-08-04 The Notebook, Llc Apparatus and methods for secure distributed communications and data access
CN112562823A (en) * 2019-09-25 2021-03-26 深圳迈瑞科技有限公司 Infusion pump and setting method thereof
CN111341433A (en) * 2020-03-12 2020-06-26 李斌 CDSS-based medical diagnosis and treatment system
CN116209497A (en) * 2020-05-15 2023-06-02 深圳迈瑞生物医疗电子股份有限公司 Patient monitoring device and communication method thereof
US11173315B1 (en) 2020-07-24 2021-11-16 Defibrio AS Mobile defibrillator
CN116916813A (en) * 2021-10-12 2023-10-20 杏泽科技股份有限公司 Method for collecting and presenting physiological signal data and position information, server and system for realizing method
USD1016824S1 (en) * 2021-10-13 2024-03-05 Vitalmex Internacional S.A. De C.V. Display screen with graphical user interface
CN116431684B (en) * 2023-04-18 2024-03-19 中船海神医疗科技有限公司 Diagnosis and treatment data storage and playback method and system for portable life support system

Family Cites Families (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2691814B2 (en) * 1991-12-03 1997-12-17 シャープ株式会社 Portable electrocardiograph
US5549115A (en) * 1994-09-28 1996-08-27 Heartstream, Inc. Method and apparatus for gathering event data using a removable data storage medium and clock
US6055447A (en) 1995-07-06 2000-04-25 Institute Of Critical Care Medicine Patient CO2 Measurement
US5720771A (en) * 1995-08-02 1998-02-24 Pacesetter, Inc. Method and apparatus for monitoring physiological data from an implantable medical device
US5813403A (en) 1995-11-08 1998-09-29 Soller; Babs R. Optical measurement of tissue pH
US6681003B2 (en) 1999-10-05 2004-01-20 Lifecor, Inc. Data collection and system management for patient-worn medical devices
US6564088B1 (en) 2000-01-21 2003-05-13 University Of Massachusetts Probe for localized tissue spectroscopy
JP4189787B2 (en) 2001-03-06 2008-12-03 日本光電工業株式会社 Biological information display monitor and system
WO2003032808A2 (en) 2001-10-15 2003-04-24 University Of Massachusetts Tissue oxygen measurement system
EP1526442B8 (en) * 2003-09-25 2018-08-29 Ricoh Company, Ltd. Printer driver user interface for multimedia data
US20050124866A1 (en) 2003-11-12 2005-06-09 Joseph Elaz Healthcare processing device and display system
JP4319094B2 (en) * 2004-06-11 2009-08-26 ソニー株式会社 Data processing apparatus, data processing method, program, program recording medium, and data recording medium
JP3839839B2 (en) 2004-06-25 2006-11-01 株式会社セブンスディメンジョンデザイン Medical image management system and medical image management method
US7805191B2 (en) 2005-01-31 2010-09-28 Physio-Control, Inc. CPR time indicator for a defibrillator data management system
JP2006239084A (en) 2005-03-02 2006-09-14 Takahashi Kogei Kenkyusho:Kk Monitoring method of condition measuring device by portable terminal
US10124183B2 (en) 2005-12-16 2018-11-13 Koninklijke Philips N.V. Automatic external defibrillator with increased CPR administration time
JP2007233850A (en) * 2006-03-02 2007-09-13 Gifu Univ Medical treatment evaluation support device, medical treatment evaluation support system and medical treatment evaluation support program
KR20080100434A (en) * 2006-03-09 2008-11-18 톰슨 라이센싱 Content access tree
CN100399984C (en) * 2006-08-18 2008-07-09 方祖祥 Portable remote real-time monitor with first-aid and locate function
CN105999553A (en) * 2006-10-27 2016-10-12 赛昂国际医疗控股有限公司 Automatic external defibrillator (AED) capable of wireless monitoring of multiple patients
US8113844B2 (en) * 2006-12-15 2012-02-14 Atellis, Inc. Method, system, and computer-readable recording medium for synchronous multi-media recording and playback with end user control of time, data, and event visualization for playback control over a network
US10032236B2 (en) * 2007-04-26 2018-07-24 General Electric Company Electronic health record timeline and the human figure
JP5305616B2 (en) * 2007-06-07 2013-10-02 株式会社東芝 Inspection data processing apparatus and inspection system
US10702174B2 (en) * 2007-06-27 2020-07-07 Integra Lifesciences Corporation Medical monitor user interface
JP5110270B2 (en) * 2007-08-09 2012-12-26 日本光電工業株式会社 Biological information display system, biological information display monitor, and biological information display monitor program
WO2009024966A2 (en) * 2007-08-21 2009-02-26 Closevu Ltd. Method for adapting media for viewing on small display screens
US20090207140A1 (en) * 2008-02-19 2009-08-20 Sony Ericsson Mobile Communications Ab Identifying and responding to multiple time-overlapping touches on a touch panel
JP5169374B2 (en) * 2008-03-26 2013-03-27 株式会社Jvcケンウッド Medical information recording system and display device
US20100131293A1 (en) * 2008-11-26 2010-05-27 General Electric Company Interactive multi-axis longitudinal health record systems and methods of use
JP5306015B2 (en) 2009-02-23 2013-10-02 株式会社堀場製作所 Scanning probe microscope probe and scanning probe microscope
US9400874B2 (en) * 2009-06-08 2016-07-26 Airstrip Ip Holdings, Llc Systems and methods for viewing patient data
US20110172550A1 (en) 2009-07-21 2011-07-14 Michael Scott Martin Uspa: systems and methods for ems device communication interface
JP2011036371A (en) 2009-08-10 2011-02-24 Tohoku Otas Kk Medical image recording apparatus
US9098114B2 (en) * 2009-09-03 2015-08-04 Human Touch, Llc Comprehensive user control system for therapeutic wellness devices
US9492095B2 (en) * 2009-12-02 2016-11-15 Neetour Medical Ltd. Hemodynamics-based monitoring and evaluation of a respiratory condition
WO2011116340A2 (en) * 2010-03-18 2011-09-22 Oregon Health & Science University Context-management framework for telemedicine
WO2011122402A1 (en) 2010-03-31 2011-10-06 株式会社 日立メディコ Inspection information display device and method
CN102314912A (en) * 2010-07-08 2012-01-11 睿码科技股份有限公司 Multimedia player module
CN103052956B (en) * 2010-08-03 2016-05-18 皇家飞利浦电子股份有限公司 For the demonstration of clinical events and the method for navigation
CN103313650B (en) * 2010-11-11 2016-09-14 卓尔医学产品公司 Emergency treatment system information panel
WO2012065167A1 (en) * 2010-11-12 2012-05-18 Zoll Medical Corporation Real-time evaluation of cpr performance
JP5649425B2 (en) * 2010-12-06 2015-01-07 株式会社東芝 Video search device
US9060570B2 (en) 2011-03-15 2015-06-23 Nike, Inc. Method of manufacturing a knitted component
US20130110565A1 (en) 2011-04-25 2013-05-02 Transparency Sciences, Llc System, Method and Computer Program Product for Distributed User Activity Management
US20120278099A1 (en) * 2011-04-26 2012-11-01 Cerner Innovation, Inc. Monitoring, capturing, measuring and annotating physiological waveform data
GB2503888A (en) * 2012-07-10 2014-01-15 Wirewax Ltd Expandable video playback timeline that includes the location of tag content.
US20150227694A1 (en) * 2012-08-06 2015-08-13 Koninklijke Philips N.V. Method and apparatus for managing an annotated record of a medical treatment event
US9728228B2 (en) * 2012-08-10 2017-08-08 Smartdrive Systems, Inc. Vehicle event playback apparatus and methods
US20140068489A1 (en) * 2012-09-05 2014-03-06 Intelligent Decisions, Inc. Interactive historical timeline of personal health record
CN105164627B (en) 2013-01-11 2019-04-09 卓尔医学产品公司 EMS decision support interface, event history and related tool

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5414644A (en) * 1993-11-24 1995-05-09 Ethnographics, Inc. Repetitive event analysis system
US6073033A (en) * 1996-11-01 2000-06-06 Telxon Corporation Portable telephone with integrated heads-up display and data terminal functions
US20060229557A1 (en) * 2005-04-11 2006-10-12 Fathallah Marwan A User interface improvements for medical devices
US20090073114A1 (en) * 2007-09-13 2009-03-19 Susanne Bay Control of a scrollable context menu

Cited By (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10485490B2 (en) * 2010-11-11 2019-11-26 Zoll Medical Corporation Acute care treatment systems dashboard
US11127497B2 (en) 2013-06-10 2021-09-21 Zoll Medical Corporation Rescue performance metrics for CPR and traumatic brain injury
US11842811B2 (en) 2013-06-10 2023-12-12 Zoll Medical Corporation Rescue performance metrics for CPR and traumatic brain injury
US20140365175A1 (en) * 2013-06-10 2014-12-11 Zoll Medical Corporation Rescue Performance Metrics for CPR and Traumatic Brain Injury
US10204389B2 (en) * 2013-06-10 2019-02-12 Zoll Medical Corporation Rescue performance metrics for CPR and traumatic brain injury
US20150109135A1 (en) * 2013-10-23 2015-04-23 Dock Technologies Inc. Indicators
US9972181B1 (en) * 2014-04-11 2018-05-15 Vivint, Inc. Chronological activity monitoring and review
US10490042B1 (en) 2014-04-11 2019-11-26 Vivint, Inc. Chronological activity monitoring and review
US20170265819A1 (en) * 2014-07-27 2017-09-21 Oridion Medical 1987 Ltd. Personalized patient alarm management
US10194873B2 (en) * 2014-07-27 2019-02-05 Oridion Medical 1987 Ltd. Personalized patient alarm management
WO2016044519A1 (en) * 2014-09-18 2016-03-24 Preventice, Inc. Dynamically adaptive care plans
US20160085927A1 (en) * 2014-09-18 2016-03-24 Preventice, Inc. Dynamically adaptive care plans
US11120911B2 (en) * 2014-09-18 2021-09-14 Preventice Solutions, Inc. Dynamically adaptive care plans
US20200107810A1 (en) * 2014-10-14 2020-04-09 M3Dicine Ip Pty Ltd Systems, devices, and methods for capturing and outputting data regarding a bodily characteristic
US11179135B2 (en) 2014-10-14 2021-11-23 M3Dicine Ip Pty Ltd. Systems, devices, and methods for capturing and outputting data regarding a bodily characteristic
US10863965B2 (en) 2014-10-14 2020-12-15 M3Dicine Ip Pty Ltd. Systems, devices, and methods for capturing and outputting data regarding a bodily characteristic
US10842463B2 (en) 2014-10-14 2020-11-24 M3Dicine Ip Pty Ltd Systems, devices, and methods for capturing and outputting data regarding a bodily characteristic
US10856836B2 (en) 2014-10-14 2020-12-08 M3Dicine Ip Pty Ltd Systems, devices, and methods for capturing and outputting data regarding a bodily characteristic
US10806427B2 (en) 2014-10-14 2020-10-20 M3Dicine Ip Pty Ltd Systems, devices, and methods for capturing and outputting data regarding a bodily characteristic
USD777753S1 (en) * 2014-11-14 2017-01-31 Espec Corp. Display screen with graphical user interface
WO2016109393A1 (en) * 2014-12-29 2016-07-07 Zoll Medical Corporation Rescue performance metric
US11508474B2 (en) 2016-03-31 2022-11-22 Zoll Medical Corporation Event reconstruction for a medical device
US11883209B2 (en) 2016-04-01 2024-01-30 Cardiac Pacemakers, Inc. Systems and methods for detecting worsening heart failure
US10660577B2 (en) 2016-04-01 2020-05-26 Cardiac Pacamakers, Inc. Systems and methods for detecting worsening heart failure
US11523778B2 (en) 2016-04-01 2022-12-13 Cardiac Pacemakers, Inc. Systems and methods for detecting worsening heart failure
US11890116B2 (en) 2016-04-01 2024-02-06 Cardiac Pacemakers, Inc. Systems and methods for detecting worsening heart failure
WO2017173014A1 (en) * 2016-04-01 2017-10-05 Cardiac Pacemakers, Inc. Systems and methods for detecting worsening heart failure
USD859452S1 (en) * 2016-07-18 2019-09-10 Emojot, Inc. Display screen for media players with graphical user interface
US11202579B2 (en) 2016-08-08 2021-12-21 Zoll Medical Corporation Wrist-worn device for coordinating patient care
US10492986B2 (en) 2016-09-30 2019-12-03 Zoll Medical Corporation Wearable sensor devices and systems for patient care
US11883352B2 (en) 2016-09-30 2024-01-30 Zoll Medical Corporation Wearable sensor devices and systems for patient care
EP3300713A1 (en) 2016-09-30 2018-04-04 Zoll Medical Corporation Wearable sensor devices and systems for patient care
US11534365B2 (en) 2016-09-30 2022-12-27 Zoll Medical Corporation “Wearable sensor devices and systems for patient care”
US11179293B2 (en) 2017-07-28 2021-11-23 Stryker Corporation Patient support system with chest compression system and harness assembly with sensor system
US11723835B2 (en) 2017-07-28 2023-08-15 Stryker Corporation Patient support system with chest compression system and harness assembly with sensor system
US20210169736A1 (en) * 2018-04-17 2021-06-10 Koninklijke Philips N.V. Device, system and method for supporting detection of return of spontaneous circulation during cardiopulmonary resuscitation
WO2019203824A1 (en) * 2018-04-18 2019-10-24 Kent Thomas Andrew Method of capturing and evaluating uncertainty in computerized intelligence systems for medical diagnosis
US20220147237A1 (en) * 2018-04-20 2022-05-12 Palantir Technologies Inc. Object time series system and investigation graphical user interface
US11875436B2 (en) 2018-04-20 2024-01-16 Palantir Technologies Inc. Object time series system
US11262904B2 (en) * 2018-04-20 2022-03-01 Palantir Technologies Inc. Object time series system and investigation graphical user interface
US11579762B2 (en) * 2018-04-20 2023-02-14 Palantir Technologies Inc. Object time series system and investigation graphical user interface
US11615565B2 (en) 2018-04-20 2023-03-28 Palantir Technologies Inc. Object time series system
CN112638345A (en) * 2018-06-29 2021-04-09 Zoll医疗公司 System and method for providing resuscitation guidance based on physical characteristics of a patient measured during an acute care event
WO2020006391A1 (en) * 2018-06-29 2020-01-02 Zoll Medical Corporation Systems and methods for providing resuscitation guidance based on physical features of a patient measured during an acute care event
US11735320B2 (en) * 2018-12-04 2023-08-22 Merative Us L.P. Dynamic creation and manipulation of data visualizations
WO2020197903A1 (en) * 2019-03-22 2020-10-01 Zoll Medical Corporation Handling of age of transmitted data in medical device system
US20210022688A1 (en) * 2019-07-26 2021-01-28 GPS Health LLC Methods and systems for generating a diagnosis via a digital health application
US11551813B2 (en) 2020-08-19 2023-01-10 Recovery Exploration Technologies Inc. Augmented intelligence for next-best-action in patient care
US11437145B2 (en) 2020-08-19 2022-09-06 Recovery Exploration Technologies Inc. Augmented intelligence for next-best-action in patient care
WO2022040433A1 (en) * 2020-08-19 2022-02-24 Recovery Exploration Technologies Inc. Augmented intelligence for next-best-action in patient care
WO2022051672A1 (en) * 2020-09-04 2022-03-10 Zoll Medical Corporation Medical treatment system with companion device
US11869674B2 (en) 2022-04-15 2024-01-09 Recovery Exploration Technologies Inc. Translation of medical evidence into computational evidence and applications thereof

Also Published As

Publication number Publication date
CN110277160A (en) 2019-09-24
US20170293726A1 (en) 2017-10-12
JP6764596B2 (en) 2020-10-07
CN110265132B (en) 2024-01-26
JP2016505997A (en) 2016-02-25
CN110277159A (en) 2019-09-24
WO2014110280A3 (en) 2014-10-23
CN110251836A (en) 2019-09-20
US20210303143A1 (en) 2021-09-30
US10976908B2 (en) 2021-04-13
RU2015133474A (en) 2017-02-17
US9658756B2 (en) 2017-05-23
CN110265132A (en) 2019-09-20
EP3913639A3 (en) 2022-03-30
CA2897915A1 (en) 2014-07-17
JP2021000474A (en) 2021-01-07
JP7028521B2 (en) 2022-03-02
CN105164627B (en) 2019-04-09
CN117079792A (en) 2023-11-17
BR112015016664A2 (en) 2017-07-11
WO2014110280A2 (en) 2014-07-17
US11816322B2 (en) 2023-11-14
AU2014205388A1 (en) 2015-07-23
EP2943926A4 (en) 2018-05-23
JP6417334B2 (en) 2018-11-07
CN110251836B (en) 2024-01-23
CN105164627A (en) 2015-12-16
EP3913639A2 (en) 2021-11-24
EP2943926A2 (en) 2015-11-18
JP2019048078A (en) 2019-03-28
CN110277160B (en) 2023-07-28

Similar Documents

Publication Publication Date Title
US11816322B2 (en) EMS decision support interface, event history, and related tools
US20230277138A1 (en) Use of Muscle Oxygen Saturation and PH in Clinical Decision Support
US11826181B2 (en) Acute care treatment systems dashboard

Legal Events

Date Code Title Description
AS Assignment

Owner name: ZOLL MEDICAL CORPORATION, MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FREEMAN, GARY A;SILVER, ANNEMARIE ELIZABETH;SIGNING DATES FROM 20160616 TO 20160620;REEL/FRAME:038961/0050

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4