Embodiments of the present invention relate generally to tools for facilitating acute care treatment, and more specifically to systems and methods for clinical decision support and differential diagnosis.
In the pre-hospital and acute care treatment setting, medical responders often have difficulty accurately determining the proper diagnosis for a particular patient. Even well-trained physicians often struggle under emergency conditions, in which split-second decisions must be made with limited information. Computer-automated diagnosis was developed to improve the accuracy, effectiveness, and reliability of patient treatment both in the field and in the hospital.
Automated differential diagnosis utilizes computer inference algorithms such as Bayesian algorithms, neural networks, or genetic algorithms. According to a Wikipedia posting:
Despite the fact that automated differential diagnosis systems have been developed, and their implementation attempted, for more than 35 years, they have not achieved acceptance in the emergency medical setting for acute care treatment (ACT). In large part, this failure is due to the conditions under which emergency care of acute conditions is practiced. In situations such as the treatment of trauma, cardiac arrest, or respiratory arrest, speed of decision-making is critical, and caregivers already must split their time and attention between the patient and the physiological monitors and defibrillators. In such situations, automated differential diagnosis (ADD) tools are often viewed as interfering with the caregiving process and as delaying treatment of the patient. Given that every minute of delay can result in a 10% drop in survival rate, as is the case for cardiac arrest, it is not surprising that ADD tools are ignored by the very people they were designed to assist.
It has also been found that much of the patient's medical history is inaccessible to the caregiver at the time of the acute medical condition, because patients are often treated in the prehospital setting, where family members are frequently not present at the time of the injury.
Embodiments of the present invention include a system that provides a tool for the caregiver to more efficiently and accurately perform a differential diagnosis, integrated into the caregiver's existing workflow during emergency situations. Embodiments of the present invention may also provide an integrated view of physiological data from the patient, along with therapeutic treatment and patient history and examination findings, in an automated way to caregivers.
A medical system according to embodiments of the present invention includes at least one sensor configured to monitor physiological status of a patient and to generate sensor data based on the physiological status; a user interface device; a processor communicably coupled to the user interface device, the processor configured to: present via the user interface device an array of two or more possible input elements, the input elements each comprising a class of patients or a diagnosis and treatment pathway; receive a selected input element based on a user selection among the two or more possible input elements; acquire the sensor data and process the sensor data to generate physiological data; and present via the user interface device the physiological data according to a template that is customized for the selected input element.
The medical system as described above, in which the selected input element is selected based on activation of one or more switches.
The medical system as described above, in which the selected input element is selected based on touching a touch-activated screen.
The medical system as described above, wherein the touch-activated screen is the user interface screen.
The medical system as described above, wherein the at least one sensor is one or more of an ECG, SpO2, NIR tissue perfusion, NIR pH, ultrasound, ventilator flow rate, EtCO2, invasive blood pressure, and non-invasive blood pressure sensors.
The medical system as described above, wherein the processor is further configured to receive a caliper gesture signal generated by the touching of two points on the touch-activated screen at the same time with the same hand, and to overlay measurement data onto the physiological data upon receipt of the caliper gesture signal.
The medical system as described above, wherein the array of two or more possible input elements includes at least one of: a respiratory distress or dyspnea diagnosis and treatment pathway; an altered mental status diagnosis and treatment pathway; a cardiac distress diagnosis and treatment pathway; a trauma diagnosis and treatment pathway; and a pain or abnormal nerve sensation diagnosis and treatment pathway.
The medical system as described above, wherein the array of two or more possible input elements includes: a respiratory distress or dyspnea diagnosis and treatment pathway; an altered mental status diagnosis and treatment pathway; a cardiac distress diagnosis and treatment pathway; a trauma diagnosis and treatment pathway; and a pain or abnormal nerve sensation diagnosis and treatment pathway.
The medical system as described above, further comprising a tablet computer.
The medical system as described above, wherein the processor is part of the tablet computer.
The medical system as described above, wherein the tablet computer is an iPad® tablet computer.
The medical system as described above, wherein the user interface screen is part of the tablet computer.
The medical system as described above, further including a defibrillator.
The medical system as described above, wherein the user interface screen is part of the defibrillator.
The medical system as described above, wherein the tablet computer includes a protective housing, wherein the protective housing includes a first mounting feature, the medical system further including a second mounting feature configured to interfit with the first mounting feature.
The medical system as described above, wherein the array of two or more possible input elements comprises a respiratory distress or dyspnea diagnosis and treatment pathway.
The medical system as described above, wherein the at least one sensor is configured to monitor heart sounds of the patient.
The medical system as described above, wherein the at least one sensor is configured to monitor breathing sounds of the patient.
The medical system as described above, wherein the processor is further configured to differentiate between wheezing, crackles, rales, and stridor breathing sounds.
The medical system as described above, wherein the at least one sensor is a near infrared based sensor.
The medical system as described above, wherein the at least one sensor is configured to measure pH of either tissue or blood of the patient.
The medical system as described above, wherein the at least one sensor is an ECG sensor, and wherein the physiological data reflects heart rate variability.
While multiple embodiments are disclosed, still other embodiments of the present invention will become apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments of the invention. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.
While the invention is amenable to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and are described in detail below. The intention, however, is not to limit the invention to the particular embodiments described. On the contrary, the invention is intended to cover all modifications, equivalents, and alternatives falling within the scope of the invention as defined by the appended claims.
When the soft-key for a DTP is pressed, the defibrillator is then configured to potentially activate certain physiological sensors and to display the sensor data in a way that provides the caregiver with the optimal information, presented in the optimal fashion, for diagnosing and treating the patient most accurately and efficiently. Each DTP may include a template according to which sensor data, or the physiological and/or measurement data derived therefrom, is displayed in the way most useful and/or efficient for that particular DTP. For instance, if the “Respiratory Distress” soft-key is pressed, then the waveforms and numeric physiologic data on the screen change to that shown in
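Purely as an illustrative sketch, and not as a description of any particular embodiment, the mapping from a selected input element (DTP) to a customized display template might be represented as follows; the names, fields, and template contents here are hypothetical assumptions:

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class DisplayTemplate:
    waveforms: List[str]   # waveform traces to show, e.g. ["CO2", "spirometry"]
    numerics: List[str]    # numeric readouts to show, e.g. ["EtCO2", "SpO2"]

# Hypothetical template table keyed by input element (DTP or patient class).
TEMPLATES: Dict[str, DisplayTemplate] = {
    "respiratory_distress": DisplayTemplate(
        waveforms=["CO2", "spirometry"],
        numerics=["EtCO2", "SpO2", "respiration_rate"]),
    "trauma": DisplayTemplate(
        waveforms=["ECG", "invasive_bp"],
        numerics=["tissue_pH", "SpO2_trend", "hrv_risk_index"]),
}

def present(selected_element: str,
            physiological_data: Dict[str, list]) -> Dict[str, list]:
    """Return only the physiological data called for by the selected input element."""
    template = TEMPLATES[selected_element]
    wanted = template.waveforms + template.numerics
    return {name: physiological_data[name] for name in wanted if name in physiological_data}
```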
Heart sound measurement and detection may be incorporated into the monitoring device to detect S3 and S4 heart sounds and to automatically narrow the differential toward, or to suggest that the rescuer confirm agreement with a software diagnosis of, heart failure or pulmonary edema. A flowchart for evaluating heart sounds is shown in
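As a simplified illustration of the narrowing step just described (the flowchart itself is not reproduced here, and the logic below is an assumption rather than the actual detection algorithm), detected S3/S4 heart sounds might narrow a working differential and prompt the rescuer for confirmation:

```python
from typing import List, Optional, Tuple

def narrow_differential(s3_present: bool, s4_present: bool,
                        differential: List[str]) -> Tuple[List[str], Optional[str]]:
    """Narrow the working differential when an S3 or S4 heart sound is detected."""
    if s3_present or s4_present:
        narrowed = [dx for dx in differential
                    if dx in ("heart failure", "pulmonary edema")]
        prompt = "Heart sounds suggest heart failure/pulmonary edema - confirm?"
        return (narrowed or differential), prompt
    return differential, None
```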
Sensors such as flow sensors and O2 gas sensors are included in some embodiments, so that additional physiological measurements such as volumetric CO2, volumetric O2, and spirometry, which are relevant for the diagnosis and treatment of dyspnea, may be included and displayed on the Respiratory Distress DTP screen. An oxygen sensor may be located in the patient's airway, which may assist in calculating the metabolic needs of the patient.
The display on the defibrillator 212 is a touchscreen, according to some embodiments of the present invention. The caregiver can easily initiate measurements, such as on the CO2 snapshot waveform or the spirometry snapshot waveform, via a touchscreen gesture such as a double tap. A zoom icon may exist in the upper corner of each waveform box, such as the CO2 snapshot, such that when the zoom button is touched, that particular waveform fills the display of the defibrillator. Another measurement button is present which, when touched, displays all the relevant measurements for a particular waveform, according to embodiments of the present invention. A gestural interface is provided as part of the touchscreen. Using two fingers, or a finger and thumb, to touch two points in the waveform (which may also be referred to as a “caliper” measurement or gesture) will cause measurements to be displayed and/or overlaid onto the physiological data, as illustrated in
According to embodiments of the present invention, the processor communicably coupled with the touchscreen portion of a decision support system may be configured to recognize the wave shape of a wave signal being displayed, and/or to recognize the edge of an image being displayed, in order to improve the accuracy of a caliper touch gesture. For example, if a user were to use a caliper gesture to measure or “zoom in” on an ST elevation in an ECG wave display, the decision support system may be configured to recognize that if one of the user's fingers taps just below the top of the ECG wave, the user likely intended to include the top of the ECG wave in the enlarged or selected view. In addition, the decision support system may be configured to permit enlarging (zooming) and adjusting measurement points individually using the touchscreen. A tap/click and drag method may be used to set the caliper gesture; for example, to home in on a particular portion of a displayed waveform, the user may press on one point and drag to another point to indicate the endpoints of the caliper gesture.
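A minimal sketch of such wave-shape-aware “snapping” for a caliper gesture is shown below; the window size, sampling layout, and returned measurements are assumptions made only for illustration:

```python
import numpy as np

def snap_to_peak(touch_idx: int, waveform: np.ndarray, window: int = 25) -> int:
    """Return the index of the local maximum near the touched sample,
    so a tap just below a wave's top still selects the peak."""
    lo = max(0, touch_idx - window)
    hi = min(len(waveform), touch_idx + window + 1)
    return lo + int(np.argmax(waveform[lo:hi]))

def caliper_measurement(idx_a: int, idx_b: int,
                        waveform: np.ndarray, fs_hz: float) -> dict:
    """Compute the time interval and amplitude difference between two snapped points."""
    a, b = sorted((snap_to_peak(idx_a, waveform), snap_to_peak(idx_b, waveform)))
    return {"interval_s": (b - a) / fs_hz,
            "amplitude_delta": float(waveform[b] - waveform[a])}
```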
Specific out-of-range readings can be displayed in red or highlighted by other mechanisms, such as bold-face font and/or flashing. Touching the highlighted values will cause the display to show the possible diagnoses that are consistent with the measurements, according to embodiments of the present invention. A specific graphical zone of the screen can be designated with a graphical image of the computer tablet. By dragging waveforms, measurements, or any other data object shown on the display onto the computer tablet icon, that object can automatically be presented on the computer tablet that is linked to the defibrillator.
Capnography is helpful in the assessment of asthma, where an increased slope in the expiratory plateau provides a measure of bronchospasm. The slope of the plateau phase (phase III) provides a measure of airway obstruction. The adequacy of beta-agonist bronchodilatory therapy for an asthma exacerbation may be monitored by observing the change in the slope of phase III.
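For illustration only, the phase III slope could be estimated with an ordinary least-squares fit over the plateau samples; segmentation of the breath into phases is assumed to have been performed already, and the function below is a sketch rather than any particular embodiment's algorithm:

```python
import numpy as np

def phase_iii_slope(co2_mmHg: np.ndarray, t_s: np.ndarray) -> float:
    """Least-squares slope of the expiratory plateau (mmHg per second);
    a steeper slope suggests greater airway obstruction."""
    slope, _intercept = np.polyfit(t_s, co2_mmHg, 1)
    return float(slope)

# Trending this slope breath-to-breath offers one way to gauge response to
# beta-agonist therapy: a decreasing slope suggests the bronchospasm is easing.
```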
As referenced in U.S. Patent Application Publication No. 2011/0172550, published on Jul. 14, 2011, which is incorporated by reference herein in its entirety for all purposes, the data for the patient's history may be entered via the computer tablet, with patient physiological measures acquired via the monitor. Because the differential diagnosis often implicates patient history and patient examination findings along with measures of the patient's physiological state from monitoring such as ECG, capnography, and pulse oximetry, these data elements are integrated into a user interface that automatically or semi-automatically combines the various data elements on a single differential diagnosis screen within the application on the computer tablet. The interface may begin by asking the rescuer to choose from a list of common presenting symptoms or complaints by the patient, for example dyspnea or respiratory distress. The information such as on the screens of
In another embodiment, the defibrillator contains a docking feature for propping up a computer tablet such as an Apple® iPad® on top of the defibrillator in a stable position via mounting features integrated onto the defibrillator, as illustrated in
The results of this integrated analysis of physiological data, patient history and examination findings may then be displayed on the defibrillator, potentially in the form of asking to make an additional physiological measurement. The results of this integrated analysis of physiological data, patient history and examination findings may alternatively, or additionally, be displayed on the tablet computer. According to some embodiments of the present invention, the tablet computer, or other mobile computing device, may be communicably coupled with the defibrillator or other physiological assessment device, for example through a wireless connection. As used herein, the phrase “communicably coupled” is used in its broadest sense to refer to any coupling whereby information may be passed. Thus, for example, communicably coupled includes electrically coupled by, for example, a wire; optically coupled by, for example, an optical cable; and/or wirelessly coupled by, for example, a radio frequency or other transmission media. “Communicably coupled” also includes, for example, indirect coupling, such as through a network, or direct coupling.
According to embodiments of the present invention, a user interface device is communicably coupled to a processor, and the processor is configured to receive data entered via the user interface device, as well as data received from one or more sensors, in order to perform clinical decision support based on both data sources. The user interface device may include one or more devices such as a touch screen computer, a tablet computer, a mobile computing device, a smart phone, an audio receiver, an audio transmitter, a video receiver, a video transmitter, a camera, and a “heads up” display projected onto a user's glasses or face shield. A small monitor may be mounted onto eyeglasses, a face shield, and/or integrated with other wearable communications devices, such as, for example, an ear bud or a Bluetooth® hands free phone adaptor. The user interface device may include a combination of devices for conveying options and receiving input; for example, an audio speaker may be used to convey possible DTPs, and an audio receiver may be used to receive a verbal command indicating a selection of one of the DTPs. Instead of an audio receiver, a video camera may be used to receive a gestural command that will be interpreted by the processor as a selection of one of the possible DTPs, or input elements. Using hands-free devices for user interface devices may free the hands of a caregiver to perform clinical tasks, while still permitting non-intrusive decision support and/or differential diagnosis for the caregiver.
The decision support system may take into account both physiological data received from sensors and informational data received from the caregiver (e.g., via a mobile computing device such as an iPad®), in helping the caregiver move from one decision point in the flow chart to the next, while updating any display or information provided along the way. For example, the decision support system may indicate to the user that, based on processing of the ECG data, there does not appear to be an S3 heart sound present, and ask the caregiver to confirm this assessment. The decision support system may also, or alternatively, request the caregiver to enter a Dyspnea Engagement Score, or suggest one for confirmation by the caregiver. At step 802, if the 12-lead reveals no S3 heart sound, or if the Dyspnea Engagement Score is less than 3, then the decision support system recognizes that the caregiver is not dealing with a CHF situation and moves to step 804, in which the decision support system changes its display and/or input prompts in order to help the caregiver determine whether to enter the Asthma treatment path or the COPD treatment path.
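The branch at step 802 can be summarized, for illustration, as a simple rule; the score threshold of 3 comes from the description above, while the function name and return labels are hypothetical:

```python
from typing import Optional

def step_802(s3_present: Optional[bool],
             dyspnea_engagement_score: Optional[int]) -> str:
    """Decide whether to remain on the CHF path or move to step 804 (asthma vs. COPD)."""
    no_s3 = s3_present is False
    low_score = (dyspnea_engagement_score is not None
                 and dyspnea_engagement_score < 3)
    if no_s3 or low_score:
        return "step_804_asthma_vs_copd"   # not a CHF situation
    return "chf_treatment_path"
```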
Again, the decision support system may factor in various physiological data from sensors, as well as various informational data received about the particular patient, in helping to support the caregiver's decision. For example, as illustrated in
According to embodiments of the present invention, the decision support system may suggest or propose a diagnosis or treatment path, for example by indicating statistical probabilities (based on charts and data such as those of
At step 808, the decision support system may help the caregiver decide whether the patient is extremely bronchoconstricted, for example by showing data or measurements related to blood oxygen content, respiration rate, or respiration volume. Upon a confirmation by the caregiver that the patient is extremely bronchoconstricted at step 808, the decision support system may then suggest to the caregiver that a 125 milligram dose of Solumedrol be administered over a slow (e.g. 2 minute) intravenous push. At step 810, the decision support system may help the caregiver to decide whether the patient's symptoms have improved (e.g. whether the patient's shortness of breath has improved with the treatment thus far). For example, the decision support system may display and/or analyze the patient's end-tidal waveform, and suggest that the patient does not appear to be responding to the treatment, and ask for the caregiver's confirmation. If the caregiver confirms the decision, then the decision support system may continue to guide the caregiver through additional treatment options, for example those indicated in
The decision support according to embodiments of the present invention may or may not be fully automated. Inference engines utilizing Bayesian networks, neural networks, genetic algorithms, or simpler rule-based systems may be employed.
In another embodiment, the tissue CO2 or pH is measured by methods such as those described in U.S. Pat. No. 6,055,447, which describes a sublingual tissue CO2 sensor, or U.S. Pat. Nos. 5,813,403, 6,564,088, and 6,766,188, which describe a method and device for measuring tissue pH via near infrared spectroscopy (NIRS), all of which are incorporated herein by reference in their entirety for all purposes. NIRS technology allows the simultaneous measurement of tissue PO2, PCO2, and pH. One drawback of previous methods for the measurement of tissue pH is that the measurements provided excellent relative accuracy against a baseline measurement taken in a series of measurements over the course of a resuscitation, but absolute accuracy was not as good, as a result of patient-specific offsets such as skin pigment. One of the benefits achieved by some embodiments of the present invention is the elimination of the need for absolute accuracy in these measurements, relying only on the offset and gain being stable over the course of the resuscitation. Tissue CO2 and pH are particularly helpful to monitor in the trauma DTP. Physiological parameters displayed for the trauma DTP may include one or more of: invasive and non-invasive blood pressure, tissue CO2 and pH, ECG, SpO2 trending, and a heart rate variability risk index. The ECG may be analyzed to determine the interval between adjacent R-waves of the QRS complexes, and this interval may be used to calculate heart rate variability as a running difference between adjacent R-R intervals. It is known to those skilled in the art that an abrupt reduction in variability often precedes, by many minutes, a precipitous decline in the patient's blood pressure (traumatic arrest). By monitoring the trend in heart rate variability, the traumatic arrest can be anticipated and prevented.
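A sketch of the R-R interval analysis described above is shown below, assuming R-wave locations are already available from an ECG detector; the windowing and alarm threshold are illustrative assumptions only:

```python
import numpy as np

def rr_intervals_s(r_peak_samples: np.ndarray, fs_hz: float) -> np.ndarray:
    """R-R intervals in seconds from R-wave sample indices."""
    return np.diff(r_peak_samples) / fs_hz

def hrv_running_difference(rr_s: np.ndarray) -> np.ndarray:
    """Running difference between adjacent R-R intervals (successive differences)."""
    return np.abs(np.diff(rr_s))

def variability_drop_alarm(hrv: np.ndarray, window: int = 30,
                           drop_ratio: float = 0.5) -> bool:
    """Flag an abrupt fall in variability relative to the preceding window,
    which may precede a precipitous decline in blood pressure."""
    if len(hrv) < 2 * window:
        return False
    recent = hrv[-window:].mean()
    baseline = hrv[-2 * window:-window].mean()
    return baseline > 0 and recent < drop_ratio * baseline
```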
Another sensor of use for the trauma DTP is ultrasound, according to embodiments of the present invention. According to C. Hernandez et al., C.A.U.S.E.: Cardiac arrest ultra-sound exam—A better approach to managing patients in primary non-arrhythmogenic cardiac arrest, Resuscitation (2007), doi:10.1016/j.resuscitation.2007.06.033, which is incorporated by reference herein in its entirety for all purposes:
The caregiver selecting elements of the flowchart results in the ultrasound sensor being activated and images being presented on the computer tablet. Additional instructions can be requested from the interface on the computer tablet and/or the defibrillator. Based on the selections and instructions, the settings of the ultrasound can be adjusted to deliver the optimal images, according to embodiments of the present invention.
Although five diagnosis and treatment pathways are discussed with respect to
For example, if a user selected the Trauma DTP option from the screen of
According to other embodiments, the decision support system is informed by a combination of caregiver observations, patient information, and/or sensor data. Assessment and/or scoring may be performed by receiving data from the caregiver, by receiving data from sensors, or both. For example, for a trauma DTP, the decision support system may take into account pulse rate, breathing data, qualitative breathing data, blood loss, blood pressure, the presence of broken limbs, and/or compound fractures. Or, in a cardiac distress DTP, the decision support system may be configured to display a cardiac arrest probability at a moment in time, which may be calculated and/or predicted by the decision support system based on selected criteria. The decision support system may also be configured to track certain criteria in order to suggest treatment outcome probabilities, for example suggesting the treatment pathway with the highest, or a high, perceived probability of success.
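Purely as an illustration of how such a probability or outcome score might be assembled from caregiver observations and sensor-derived features, and not as a clinically validated model or a description of any particular embodiment, consider the following sketch with placeholder feature names and weights:

```python
from typing import Dict

def cardiac_arrest_risk_score(features: Dict[str, float],
                              weights: Dict[str, float]) -> float:
    """Weighted sum clipped to [0, 1] for display as a probability-like score."""
    score = sum(weights.get(name, 0.0) * value for name, value in features.items())
    return max(0.0, min(1.0, score))

# Example call with hypothetical, normalized inputs and arbitrary weights:
risk = cardiac_arrest_risk_score(
    {"hrv_drop": 0.8, "hypotension": 0.6, "tissue_pH_low": 0.4},
    {"hrv_drop": 0.5, "hypotension": 0.3, "tissue_pH_low": 0.2})
```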
According to some embodiments of the present invention, a monitor, or a defibrillator/monitor combination, or other similar device, may be configured to provide a graphical tool to configure the monitor to follow recognized rescue protocols, for example one or more of the protocols described and/or shown herein. Such a tool may be included on the monitor or defibrillator device, on a tablet or handheld or other computing device, and/or on both, according to embodiments of the present invention. Such a tool may be provided in a graphical interface, for example a flowchart. The tool allows the user to configure the patient monitor to follow a particular rescue protocol, for example by visually presenting a flow chart for the protocol and allowing the user to customize the protocol. For example, the length of the CPR period may be configured by the user to customize the treatment protocol. Such a tool may also permit the downloading and uploading of customized treatment protocols to and/or from a monitoring device, which may also permit the same customized protocol settings to be carried on a mobile device and/or transferred or uploaded to multiple other devices in different locations and/or at different times, according to embodiments of the present invention.
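One possible, purely illustrative representation of such a customizable protocol, suitable for editing in a graphical tool and for transfer between a monitor/defibrillator and a mobile device, is a small serializable structure; the field names and values below are assumptions, not a prescribed format:

```python
import json
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class ProtocolStep:
    name: str
    duration_s: int          # e.g. the length of a CPR period

@dataclass
class RescueProtocol:
    title: str
    steps: List[ProtocolStep]

    def to_json(self) -> str:
        """Serialize the customized protocol for upload/download between devices."""
        return json.dumps(asdict(self))

protocol = RescueProtocol(
    title="Cardiac arrest (customized)",
    steps=[ProtocolStep("CPR", 120), ProtocolStep("Rhythm analysis", 10)])
config_blob = protocol.to_json()   # could be transferred to another monitor
```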
Various modifications and additions can be made to the exemplary embodiments discussed without departing from the scope of the present invention. For example, while the embodiments described above refer to particular features, the scope of this invention also includes embodiments having different combinations of features and embodiments that do not include all of the described features. Accordingly, the scope of the present invention is intended to embrace all such alternatives, modifications, and variations as fall within the scope of the claims, together with all equivalents thereof.
This application is a continuation under 35 U.S.C. § 120 of U.S. patent application Ser. No. 16/399,827, filed on Apr. 30, 2019 and issued as U.S. Pat. No. 10,959,683, which is a continuation of U.S. patent application Ser. No. 13/294,947, filed on Nov. 11, 2011 and issued as U.S. Pat. No. 10,485,490, which claims priority under 35 USC § 119(e) to U.S. Provisional Patent Application Ser. No. 61/436,943, filed on Jan. 27, 2011 and to U.S. Provisional Patent Application Ser. No. 61/413,266, filed on Nov. 12, 2010 and to U.S. Provisional Patent Application Ser. No. 61/412,679, filed on Nov. 11, 2010. All subject matter set forth in the above referenced applications is hereby incorporated by reference in its entirety into the present application as if fully set forth herein.
Number | Name | Date | Kind |
---|---|---|---|
5086391 | Chambers | Feb 1992 | A |
5218969 | Bredesen et al. | Jun 1993 | A |
5255187 | Sorensen | Oct 1993 | A |
5626151 | Linden | May 1997 | A |
5782878 | Morgan et al. | Jul 1998 | A |
5813403 | Soller et al. | Sep 1998 | A |
5853005 | Scanlon | Dec 1998 | A |
6024699 | Surwit et al. | Feb 2000 | A |
6025699 | Cummings | Feb 2000 | A |
6055447 | Weil et al. | Apr 2000 | A |
6261238 | Gavriely | Jul 2001 | B1 |
6270456 | Iliff | Aug 2001 | B1 |
6321113 | Parker et al. | Nov 2001 | B1 |
6356785 | Snyder | Mar 2002 | B1 |
6398728 | Bardy | Jun 2002 | B1 |
6443889 | Groth et al. | Sep 2002 | B1 |
6488629 | Saetre et al. | Dec 2002 | B1 |
6551243 | Bocionek et al. | Apr 2003 | B2 |
6564088 | Soller et al. | May 2003 | B1 |
6572560 | Watrous et al. | Jun 2003 | B1 |
6725447 | Gilman et al. | Apr 2004 | B1 |
6766188 | Soller | Jul 2004 | B2 |
6829501 | Nielsen et al. | Dec 2004 | B2 |
6849045 | Iliff | Feb 2005 | B2 |
6898462 | Rock et al. | May 2005 | B2 |
7020844 | Trevino et al. | Mar 2006 | B2 |
7072840 | Mayaud | Jul 2006 | B1 |
7107095 | Manolas | Sep 2006 | B2 |
7164945 | Hamilton et al. | Jan 2007 | B2 |
7184963 | Shannon | Feb 2007 | B1 |
7212128 | Schenker | May 2007 | B2 |
7213009 | Pestotnik et al. | May 2007 | B2 |
7306560 | Iliff | Dec 2007 | B2 |
7421647 | Reiner | Sep 2008 | B2 |
7447643 | Olson et al. | Nov 2008 | B1 |
7558622 | Tran | Jul 2009 | B2 |
7742931 | McElwain Miller | Jun 2010 | B2 |
7835925 | Roe et al. | Nov 2010 | B2 |
7853327 | Patangay et al. | Dec 2010 | B2 |
7945452 | Fathallah et al. | May 2011 | B2 |
7986309 | Kim | Jul 2011 | B2 |
8068104 | Rampersad | Nov 2011 | B2 |
8098423 | Islam | Jan 2012 | B2 |
8137270 | Keenan et al. | Mar 2012 | B2 |
8335694 | Reiner | Dec 2012 | B2 |
8337404 | Osorio | Dec 2012 | B2 |
8352021 | Scheib | Jan 2013 | B2 |
8355928 | Spahn | Jan 2013 | B2 |
8392217 | Iliff | Mar 2013 | B2 |
8487881 | Keenan | Jul 2013 | B2 |
8510126 | Martin et al. | Aug 2013 | B2 |
8527049 | Koh et al. | Sep 2013 | B2 |
8587426 | Bloem | Nov 2013 | B2 |
8744543 | Li et al. | Jun 2014 | B2 |
8790257 | Libbus et al. | Jul 2014 | B2 |
8900141 | Smith et al. | Dec 2014 | B2 |
8923918 | Kreger et al. | Dec 2014 | B2 |
8956292 | Wekell et al. | Feb 2015 | B2 |
9286440 | Carter et al. | Mar 2016 | B1 |
9492084 | Behar et al. | Nov 2016 | B2 |
9596989 | Morris | Mar 2017 | B2 |
9658756 | Freeman et al. | May 2017 | B2 |
10146912 | Drysdale et al. | Dec 2018 | B2 |
10303852 | Peterson et al. | May 2019 | B2 |
10410748 | Fitzgerald et al. | Sep 2019 | B2 |
11622726 | Freeman | Apr 2023 | B2 |
20010047140 | Freeman | Nov 2001 | A1 |
20020184055 | Naghavi et al. | Dec 2002 | A1 |
20030036683 | Kehr et al. | Feb 2003 | A1 |
20030036924 | Rosen et al. | Feb 2003 | A1 |
20040049233 | Edwards | Mar 2004 | A1 |
20040064169 | Briscoe et al. | Apr 2004 | A1 |
20040064342 | Browne | Apr 2004 | A1 |
20040143298 | Nova et al. | Jul 2004 | A1 |
20040147818 | Levy et al. | Jul 2004 | A1 |
20040152954 | Pearce | Aug 2004 | A1 |
20040214148 | Salvino | Oct 2004 | A1 |
20050015115 | Sullivan | Jan 2005 | A1 |
20050187472 | Lysyansky et al. | Aug 2005 | A1 |
20050277872 | Colby et al. | Dec 2005 | A1 |
20060047188 | Bohan | Mar 2006 | A1 |
20060111749 | Westenskow | May 2006 | A1 |
20060111933 | Wheeler | May 2006 | A1 |
20060116908 | Dew et al. | Jun 2006 | A1 |
20060122864 | Gottesman et al. | Jun 2006 | A1 |
20060142648 | Banet et al. | Jun 2006 | A1 |
20070032830 | Bowers | Feb 2007 | A1 |
20070175980 | Alsafadi | Aug 2007 | A1 |
20070191687 | Justus | Aug 2007 | A1 |
20080176199 | Stickney et al. | Jul 2008 | A1 |
20090054735 | Higgins et al. | Feb 2009 | A1 |
20090069642 | Gao et al. | Mar 2009 | A1 |
20090076345 | Manicka et al. | Mar 2009 | A1 |
20090089095 | Esham et al. | Apr 2009 | A1 |
20090227883 | Zhang et al. | Sep 2009 | A1 |
20090270931 | Liden | Oct 2009 | A1 |
20100010319 | Tivig et al. | Jan 2010 | A1 |
20100087883 | Sullivan et al. | Apr 2010 | A1 |
20100161353 | Mayaud | Jun 2010 | A1 |
20100249540 | Lisogurski | Sep 2010 | A1 |
20100302281 | Kim | Dec 2010 | A1 |
20100305412 | Darrah et al. | Dec 2010 | A1 |
20110074831 | Lynch et al. | Mar 2011 | A1 |
20110118561 | Tari et al. | May 2011 | A1 |
20110130636 | Daniel et al. | Jun 2011 | A1 |
20110172550 | Martin | Jul 2011 | A1 |
20110208540 | Lord et al. | Aug 2011 | A1 |
20110295078 | Reid | Dec 2011 | A1 |
20120123218 | Renes | May 2012 | A1 |
20120123223 | Freeman et al. | May 2012 | A1 |
20120278099 | Kelly et al. | Nov 2012 | A1 |
20120302910 | Freeman | Nov 2012 | A1 |
20130096649 | Martin et al. | Apr 2013 | A1 |
20130124090 | Gotschall et al. | May 2013 | A1 |
20140296675 | Freeman et al. | Oct 2014 | A1 |
20150359489 | Baudenbacher et al. | Dec 2015 | A1 |
20160256102 | Castiel | Sep 2016 | A1 |
20190349652 | Greenewald et al. | Nov 2019 | A1 |
Number | Date | Country |
---|---|---|
101226452 | Jul 2008 | CN |
101779986 | Jul 2010 | CN |
101849241 | Sep 2010 | CN |
H0647005 | Feb 1994 | JP |
09-262213 | Oct 1997 | JP |
H10127586 | May 1998 | JP |
2003-521972 | Jul 2003 | JP |
2003532442 | Nov 2003 | JP |
2005-524436 | Aug 2005 | JP |
2005-524498 | Aug 2005 | JP |
2007-125151 | May 2007 | JP |
2008-200111 | Sep 2008 | JP |
2010514498 | May 2010 | JP |
2010515546 | May 2010 | JP |
20080086496 | Jul 2008 | WO |
Entry |
---|
Zoorob et al., Acute Dyspnea in the Office, American Family Physician, vol. 68, 2003 (Year: 2003). |
Hernandez, et al., “C.A.U.S.E.: Cardiac Arrest Ultra-sound Exam—A Better Approach to Managing Patients in Primary Non-arrhythmogenic Cardiac Arrest,” Resuscitation, vol. 76, Issue 2, pp. 198-206 (Feb. 2008). |
Notice of Reasons for Refusal for Japanese Patent Application No. 2020-104986, with English Translation, dated Apr. 13, 2021, 5 pages. |
Number | Date | Country | |
---|---|---|---|
20200359972 A1 | Nov 2020 | US |
Number | Date | Country | |
---|---|---|---|
61436943 | Jan 2011 | US | |
61413266 | Nov 2010 | US | |
61412679 | Nov 2010 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16399827 | Apr 2019 | US |
Child | 16947554 | US | |
Parent | 13294947 | Nov 2011 | US |
Child | 16399827 | US |