The present technology is generally related to non-contact monitoring systems for patients, used in conjunction with attached patient sensors.
Many conventional medical monitors require attachment of a sensor to a patient in order to detect physiologic signals from the patient and to transmit detected signals through a cable to the monitor. These monitors process the received signals and determine vital signs such as the patient's pulse rate, respiration rate, and arterial oxygen saturation. For example, a pulse oximetry system can include a finger sensor with two light emitters and a photodetector. The sensor emits light into the patient's finger and transmits the detected light signal to a monitor. The monitor includes a processor that processes the signal, determines vital signs (e.g., pulse rate, respiration rate, arterial oxygen saturation), and displays the vital signs on a display.
Other monitoring systems include other types of monitors and sensors, such as electroencephalogram (EEG) sensors, blood pressure cuffs, temperature probes, air flow measurement devices (e.g., spirometer), and others. Some wireless, wearable sensors have been developed, such as wireless EEG patches and wireless pulse oximetry sensors.
Video-based monitoring is a field of patient monitoring that uses one or more remote video cameras to detect physical attributes of the patient. This type of monitoring can also be called “non-contact” monitoring in reference to the remote video sensor(s), which does/do not contact the patient. The remainder of this disclosure offers solutions and improvements in this field.
The techniques of this disclosure generally relate to a patient monitoring system including a non-contact monitoring component, used in conjunction with one or more attached patient sensors, with the sensor(s) activating additionally streamed physiological parameters from the non-contact monitoring system.
In one aspect, a first sensor in contact with a patient provides first data to determine one or more patient parameters. A non-contact video monitoring system including an image capture device is programmed to capture second data related to the patient. A monitoring device is configured to receive and display said first data; and either said first sensor or an associated first sensor device is configured to provide instructions to the monitoring device to display said second data.
In another aspect, an additional connecting element is associated with the first sensor, the additional connecting element configured to receive the first data from the one or more sensors and to transmit the first data to the monitoring device.
In another aspect, the first sensor or associated first sensor device is configured to provide instruction as an encrypted key.
In another aspect, the non-contact video monitoring system is programmed to define one or more regions of interest (ROI's) on a patient, capture the second data related to the patient, wherein the second data includes two or more images of the ROI's, and to measure changes in depths of the ROI's across the two or more images of the ROI's.
In another aspect, the image capture device includes a depth sensing camera, an RGB camera, and/or an infrared camera.
In another aspect, instructions to the monitoring device are configured to: allow the monitoring device to include the second data directly onto a current display screen; allow the monitoring device to reconfigure a current screen to display the second data; and/or allow for separate pages to be accessible on the monitoring device for display of the second data.
In other aspects, the first and second data are combined at a monitoring device or are combined prior to receipt at a monitoring device.
The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale. Instead, emphasis is placed on illustrating clearly the principles of the present disclosure. The drawings should not be taken to limit the disclosure to the specific embodiments depicted, but are for explanation and understanding only.
The following disclosure describes patient monitoring devices, systems, and associated methods for detecting and/or monitoring one or more patient parameters, such as tidal volume, respiratory rate, minute volume, patient movement, temperature, blood pressure, heart rate, arterial oxygen saturation, and/or others. As described in greater detail below, devices, systems, and/or methods configured in accordance with embodiments of the present technology are configured to capture one or more images (e.g., a video sequence) of a patient or a portion of a patient (e.g., a patient's torso) within a field of view of a non-contact detector (e.g., an image capture device). The devices, systems, and/or methods can measure changes in depths of regions (e.g., one or more pixels or groups of pixels) in the captured images over time. Based, at least in part, on these measurements, the devices, systems, and/or methods can determine various respiratory parameters of a patient, including tidal volume, minute volume, and respiratory rate, among others. In these and other embodiments, the device, systems, and/or methods can analyze the respiratory parameters and can trigger alerts and/or alarms when the devices, systems, and/or methods detect one or more breathing abnormalities.
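By way of a non-limiting illustration only (not the claimed implementation), the depth-change approach described above can be sketched in a few lines: average the ROI depth in each captured frame to obtain a quasi-periodic breathing signal, then take the dominant frequency of that signal as the respiratory rate. The function name, the ROI representation, and the FFT-based frequency estimate are all illustrative assumptions.

```python
import numpy as np

def respiratory_rate_from_depths(depth_frames, roi, fps):
    """Estimate respiratory rate (breaths/min) from a sequence of depth
    images by tracking the mean depth of a region of interest (ROI).

    depth_frames: iterable of 2-D arrays of per-pixel depths (metres).
    roi: (row_slice, col_slice) selecting the chest region.
    fps: frames per second of the image capture device.
    """
    # Mean ROI depth per frame forms a quasi-periodic breathing signal.
    signal = np.array([frame[roi].mean() for frame in depth_frames])
    signal = signal - signal.mean()  # remove the static (DC) offset

    # Dominant frequency via FFT, restricted to a plausible breathing band.
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal))
    band = (freqs >= 0.1) & (freqs <= 1.0)   # 6-60 breaths/min
    dominant = freqs[band][np.argmax(power[band])]
    return dominant * 60.0
```

The same signal could instead be analyzed with peak detection or zero crossings; the band limits merely exclude physiologically implausible rates.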
Additionally, devices, systems, and/or methods configured in accordance with embodiments of the present technology can include one or more sensors or probes associated with (e.g., contacting) a patient that can be configured to capture data (e.g., temperature, blood pressure, heart rate, arterial oxygen saturation, etc.) related to a patient. The devices, systems, and/or methods can transmit the captured data to a monitoring device, hub, mobile patient management system (MPM), or the like. In some embodiments, the devices, systems, and/or methods can analyze the captured data to determine and/or monitor one or more patient parameters. In these and other embodiments, the devices, systems, and/or methods can use the data captured by the one or more sensors or probes in conjunction with data captured using a non-contact detector. In these and still other embodiments, the devices, systems, and/or methods can trigger alerts and/or alarms when the devices, systems, and/or methods detect one or more patient parameter abnormalities.
In some embodiments, one or more sensors or probes associated with (e.g., contacting) a patient can be configured to capture data related to a patient and can be configured to activate additionally streamed physiological parameters from non-contact monitoring (NCM) devices. In such cases, one or more sensors or probes and/or one or more associated intermediary devices (e.g., an additional connecting element (ACE)) can be configured to communicate with and instruct a monitoring device, such as a hub, mobile patient management system (MPM), or the like (note that a monitoring device could also be a clinician's data tablet, a central data collection and display system, etc.), to start receiving physiological data from an NCM camera and/or allow such data to be displayed on the screen.
In exemplary embodiments such instruction by a sensor, probe or associated device comprises a key that is transmitted to (either via a wired or wireless connection) and read by the monitoring device. The key can be configured to: allow the monitoring device to include the additional physiological information directly onto a current display screen; allow the device or system to reconfigure a current screen to display the additional physiological information; and/or allow for separate pages to be accessible on the device or system for display of the additional physiological information.
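Purely as an illustrative sketch of the key mechanism described above — the flag names, data shapes, and validation rule are hypothetical, not taken from any product — a monitoring device might read a sensor-supplied key and unlock one or more of the three display behaviours:

```python
from dataclasses import dataclass

# Hypothetical display-mode flags matching the three behaviours described
# above; the names are illustrative only.
MODE_INLINE = "inline"                # add NCM data onto the current screen
MODE_RECONFIGURE = "reconfigure"      # rearrange the current screen
MODE_SEPARATE_PAGE = "separate_page"  # expose NCM data on its own page

@dataclass
class SensorKey:
    sensor_id: str
    display_modes: tuple  # which display behaviours this key unlocks

def apply_key(monitor_state, key, trusted_sensor_ids):
    """Enable NCM streaming on the monitoring device if the key comes
    from a recognised sensor. Returns the updated state dictionary."""
    if key.sensor_id not in trusted_sensor_ids:
        return monitor_state  # unknown sensor: ignore the instruction
    monitor_state["ncm_streaming"] = True
    monitor_state["allowed_display_modes"] = set(key.display_modes)
    return monitor_state
```

In practice the key could be encrypted and delivered over a wired or wireless link, as noted above; this sketch only models the monitor-side decision.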
In some exemplary embodiments, sensors or probes that are configured to provide such instruction are provided with marking, for example different color coding, packaging, labeling, etc., to distinguish from standard sensors or probes. In some embodiments, such sensors or probes are configured with a unique or different connector shape and/or connector pin configuration relative to standard sensors or probes.
In some exemplary embodiments, a wireless receiver or additional connecting element (ACE) can collect or pass through the patient sensor or probe information (e.g., pulse oximeter information, or a combination of different types of information from plural patient sensors or probes) as well as receive the additional (NCM) physiological information, passing all of this information on to a monitoring device. Additionally, different wireless receivers or ACE components may be configured to switch on different physiological parameters, for example with one configured only to include respiratory parameters (e.g., respiratory rate, tidal volume, minute volume and central and obstructive apnea detection), whereas others could be configured to only include patient activity and posture information, etc. Accordingly, one or more ACE components may be configured to simultaneously provide information to a monitoring device.
In some embodiments, one or more ACE components may be configured to operate only when an appropriate/compatible probe is attached. Additionally, it should be recognized that various additional types of sensors and probes are contemplated, including without limitation regional saturation probes, depth of anesthesia (EEG) probes, capnography sidestream probes, etc. Further, the present disclosure contemplates other sources of additional physiological information (in addition to or instead of NCM), including temperature, ETCO2, rSO2 monitors, etc. Also as has been noted above, in some embodiments, a wireless (for example, electromagnetic, LiFi, sonar, etc.) system can be used for one or more transmission paths.
In some exemplary embodiments, data streams are combined and/or switched on prior to sending the streams to the monitoring device.
Specific details of several embodiments of the present technology are described herein with reference to
In some embodiments, the monitoring device can be a monitor with a screen 134 (e.g., to display various information, such as a power on/off button, one or more patient parameters, one or more alerts and/or alarms, etc.). The monitoring device can be attached to, be worn, and/or otherwise be carried by a patient 114. For example, the monitoring device can be attached to and/or worn by the patient 114 at the patient's upper arm, at the patient's belt, on the patient's wrist (e.g., as a watch and/or using a band), etc. In some embodiments, the monitoring device can be sewn into the patient's clothing. In these and other embodiments, the monitoring device can be a mobile device, such as a mobile phone, tablet, or laptop.
In the embodiments illustrated in
Information captured by the one or more sensors 112 can be stored and/or processed by the one or more sensors 112 and/or by the monitoring device. For example, the one or more sensors 112 can store captured information and/or can locally process the captured information (e.g., to determine one or more patient parameters). In these and other embodiments, the one or more sensors 112 can transmit the raw, captured information and/or the locally processed data to the monitoring device. For example, the one or more sensors 112 can include a wireless transmitter (not shown) to transfer the data directly to the monitoring device via a wired or wireless connection (not shown). In turn, the monitoring device can store and/or process the information received from the one or more sensors 112 (e.g., to determine one or more patient parameters). In these and other embodiments, the monitoring device can transfer the raw, captured information and/or processed data to a central unit (not shown), such as a central hospital station, via a wired or wireless connection (not shown).
Additionally or alternatively, the one or more sensors 112 can transmit the captured information and/or the locally processed data to a relay (not shown) (e.g., a band attached to the patient) via one or more wired and/or wireless connections. In some embodiments, a relay can store and/or process data received from the one or more sensors. In these and other embodiments, the relay can include a wireless transmitter that can be used to transmit the captured information and/or the processed data to the monitoring device via a wireless connection.
The image capture device 314 can capture a sequence of images over time. The image capture device 314 can be a depth sensing camera, such as a Kinect camera from Microsoft Corp. (Redmond, Wash.). A depth sensing camera can detect a distance between the camera and objects within its field of view. Such information can be used, as disclosed herein, to determine that a patient 114 is within the FOV 316 of the image capture device 314 and/or to determine one or more ROI's to monitor on the patient 114. Once an ROI is identified, the ROI can be monitored over time, and the changes in depths of regions (e.g., pixels) within the ROI can represent movements of the patient 114 (e.g., associated with breathing). As described in greater detail in U.S. patent application Ser. No. 16/219,360, U.S. Provisional Patent Application Ser. No. 62/779,964 and U.S. Provisional Patent Application Ser. No. 62/797,519, those movements, or changes of regions within the ROI, can be used to determine various patient parameters, such as various breathing parameters, including tidal volume, minute volume, respiratory rate, etc. Those movements, or changes of regions within the ROI, can also be used to detect various patient parameter abnormalities, as discussed in greater detail in U.S. Provisional Patent Application Ser. Nos. 62/716,724 and 62/779,964. The various patient parameter abnormalities can include, for example, apnea, rapid breathing (tachypnea), slow breathing, intermittent or irregular breathing, shallow breathing, obstructed and/or impaired breathing, and others. The entire disclosures of U.S. patent application Ser. No. 16/219,360 and U.S. Provisional Patent Application Ser. Nos. 62/716,724, 62/779,964 and 62/797,519 are incorporated herein by reference.
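One hedged reading of how depth changes within the ROI can yield a volume parameter such as tidal volume (the details are in the incorporated applications, not reproduced here) is to approximate the volume change between two frames as the sum of per-pixel depth differences over the ROI, scaled by the physical area each pixel covers. The function and the calibration parameter `pixel_area_m2` are hypothetical:

```python
import numpy as np

def volume_change(depth_baseline, depth_current, roi, pixel_area_m2):
    """Approximate the chest-volume change (litres) between two depth
    frames as the summed per-pixel depth change over the ROI times the
    physical area covered by each pixel.

    During inhalation the chest moves toward the camera, so per-pixel
    depth decreases and (baseline - current) is positive.
    """
    delta = depth_baseline[roi] - depth_current[roi]  # metres, per pixel
    volume_m3 = float(delta.sum()) * pixel_area_m2
    return volume_m3 * 1000.0                         # m^3 -> litres
```

Minute volume could then be estimated, under the same assumptions, by accumulating per-breath volume changes over a minute.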
In these and other embodiments, the image capture device can be an RGB (red green blue) camera or an infrared camera. An RGB camera can detect slight color changes within its field of view. Such information can be used, as disclosed herein, to determine that a patient 114 is within the FOV 316 of the image capture device 314 and/or to determine one or more ROI's to monitor on the patient 114. Once an ROI is identified, the ROI can be monitored over time, and the changes in color of regions (e.g., pixels) within the ROI can represent various information related to the patient 114. As described in greater detail in U.S. patent application Ser. No. 16/188,969 those color changes can be used to detect optical signals associated with one or more medical devices, such as a pulse oximeter attached to the patient. Those color changes can also be used to determine and/or monitor various vital signs of the patient, including pulse rate, respiration rate, and arterial oxygen saturation, as discussed in greater detail in U.S. patent application Ser. Nos. 15/432,057, 15/432,063 and 62/797,519. Additionally, or alternatively, as discussed in greater detail in U.S. Provisional Patent Application Ser. Nos. 62/685,485 and 62/695,244, those color changes can also be used in a surgical setting to monitor and/or assess blood flow in the ROI by detecting occlusions and/or monitoring pulsation, pulsation strength, and/or perfusion. The entire disclosures of U.S. patent application Ser. Nos. 16/188,969, 15/432,057, and 15/432,063 and U.S. Provisional Patent Application Ser. Nos. 62/685,485 and 62/695,244 are incorporated herein by reference.
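As a non-limiting sketch of the colour-change approach described above, a pulse rate can be extracted from an RGB sequence by averaging one colour channel over the ROI per frame and finding the dominant frequency; the green channel is commonly assumed to carry the strongest photoplethysmographic signal. The function name and band limits are illustrative assumptions:

```python
import numpy as np

def pulse_rate_from_frames(rgb_frames, roi, fps):
    """Estimate pulse rate (beats/min) from subtle colour changes in an
    ROI of an RGB video sequence.

    rgb_frames: iterable of (H, W, 3) arrays.
    roi: (row_slice, col_slice) selecting the skin region.
    fps: frames per second of the image capture device.
    """
    # Mean green-channel value per frame carries the pulsatile signal.
    green = np.array([frame[roi + (1,)].mean() for frame in rgb_frames])
    green = green - green.mean()
    freqs = np.fft.rfftfreq(len(green), d=1.0 / fps)
    power = np.abs(np.fft.rfft(green))
    band = (freqs >= 0.7) & (freqs <= 3.0)   # 42-180 beats/min
    return freqs[band][np.argmax(power[band])] * 60.0
```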
In some embodiments, the system 300 can receive user input to identify a starting point for defining a ROI. For example, an image can be reproduced on a display 322 of the system 300 (or on the display of the monitoring device 116), allowing a user of the system 300 to select a patient 114 for monitoring (which can be helpful where multiple objects are within the FOV 316 of the image capture device 314) and/or allowing the user to select a point on the patient 114 from which a ROI can be determined (such as the point 303 on the chest of the patient 114). In other embodiments, other methods for identifying a patient 114, for identifying points on the patient 114, and/or for defining one or more ROI's can be used. For example, a user can select a patient 114 for monitoring and a point on a patient bed 308 (which can be helpful in defining one or more ranges of depths to be used in measurements taken by a non-contact detector).
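One common way to expand a user-selected seed point (such as the point 303) into an ROI — offered here only as an illustrative assumption, since the disclosure leaves the method open — is depth-based region growing: starting from the seed, add 4-connected neighbours whose depth lies within a tolerance of the seed depth.

```python
from collections import deque

import numpy as np

def grow_roi(depth, seed, tolerance=0.05):
    """Grow an ROI mask from a user-selected seed pixel, adding
    4-connected neighbours whose depth is within `tolerance` metres
    of the seed depth. Returns a boolean mask the size of the frame."""
    rows, cols = depth.shape
    seed_depth = depth[seed]
    mask = np.zeros_like(depth, dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols and not mask[nr, nc]
                    and abs(depth[nr, nc] - seed_depth) <= tolerance):
                mask[nr, nc] = True
                queue.append((nr, nc))
    return mask
```

The same idea supports the bed-selection example above: a point on the patient bed fixes a reference depth, and the tolerance bounds the range of depths used in subsequent measurements.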
The images detected by the image capture device 314 can be sent to the computing device 315 through a wired or wireless connection 320. The computing device 315 can include a processor 318 (e.g., a microprocessor), the display 322, and/or hardware memory 326 for storing software and computer instructions. Sequential image frames of the patient 114 are recorded by the image capture device 314 and sent to the processor 318 for analysis. The display 322 can be remote from the image capture device 314, such as a video screen positioned separately from the processor 318 and the memory 326. Other embodiments of the computing device 315 can have different, fewer, or additional components than shown in
The computing device 410 can communicate with other devices, such as the server 425 and/or the image capture device(s) 485 via (e.g., wired or wireless) connections 470 and/or 480, respectively. For example, the computing device 410 can send to the server 425 information determined about a patient from images and/or other data captured by the image capture device(s) 485 and/or one or more other sensors or probes. The computing device 410 can be located remotely from the image capture device(s) 485, or it can be local and close to the image capture device(s) 485 (e.g., in the same room). In various embodiments disclosed herein, the processor 415 of the computing device 410 can perform the steps disclosed herein. In other embodiments, the steps can be performed on a processor 435 of the server 425. In some embodiments, the various steps and methods disclosed herein can be performed by both of the processors 415 and 435. In some embodiments, certain steps can be performed by the processor 415 while others are performed by the processor 435. In some embodiments, information determined by the processor 415 can be sent to the server 425 for storage and/or further processing.
In some embodiments, the image capture device(s) 485 are remote sensing device(s), such as depth sensing video camera(s). In some embodiments, the image capture device(s) 485 can be or include some other type(s) of device(s), such as proximity sensors or proximity sensor arrays, heat or infrared sensors/cameras, sound/acoustic or radio wave emitters/detectors, or other devices that include a field of view and can be used to monitor the location and/or characteristics of a patient or a region of interest (ROI) on the patient. Body imaging technology can also be utilized according to the methods disclosed herein. For example, backscatter x-ray or millimeter wave scanning technology can be utilized to scan a patient, which can be used to define and/or monitor a ROI. Advantageously, such technologies can be able to “see” through clothing, bedding, or other materials while giving an accurate representation of the patient's skin surface. This can allow for more accurate measurements, particularly if the patient is wearing baggy clothing or is under bedding. The image capture device(s) 485 can be described as local because they are relatively close in proximity to a patient such that at least a part of a patient is within the field of view of the image capture device(s) 485. In some embodiments, the image capture device(s) 485 can be adjustable to ensure that the patient is captured in the field of view. For example, the image capture device(s) 485 can be physically movable, can have a changeable orientation (such as by rotating or panning), and/or can be capable of changing a focus, zoom, or other characteristic to allow the image capture device(s) 485 to adequately capture images of a patient and/or a ROI of the patient.
In various embodiments, for example, the image capture device(s) 485 can focus on a ROI, zoom in on the ROI, center the ROI within a field of view by moving the image capture device(s) 485, or otherwise adjust the field of view to allow for better and/or more accurate tracking/measurement of the ROI.
The server 425 includes a processor 435 that is coupled to a memory 430. The processor 435 can store and recall data and applications in the memory 430. The processor 435 is also coupled to a transceiver 440. In some embodiments, the processor 435, and subsequently the server 425, can communicate with other devices, such as the computing device 410 through the connection 470.
The devices shown in the illustrative embodiment can be utilized in various ways. For example, either of the connections 470 and 480 can be varied. Either of the connections 470 and 480 can be a hard-wired connection. A hard-wired connection can involve connecting the devices through a USB (universal serial bus) port, serial port, parallel port, or other type of wired connection that can facilitate the transfer of data and information between a processor of a device and a second processor of a second device. In another embodiment, either of the connections 470 and 480 can be a dock where one device can plug into another device. In other embodiments, either of the connections 470 and 480 can be a wireless connection. These connections can take the form of any sort of wireless connection, including, but not limited to, Bluetooth connectivity, Wi-Fi connectivity, infrared, visible light, radio frequency (RF) signals, or other wireless protocols/methods. Other possible modes of wireless communication can include near-field communications, such as passive radio-frequency identification (RFID) and active RFID technologies. RFID and similar near-field communications can allow the various devices to communicate in short range when they are placed proximate to one another. In some embodiments, two or more devices in the patient monitoring system 400 can together create a dynamic mesh network that includes connections 470 and/or 480. In these and other embodiments, data captured by and/or received at one device of the system 400 may be sent to and/or through other devices of the system 400 (e.g., to reach the server(s)), hence improving wireless coverage. In these and still other embodiments, the various devices can connect through an internet (or other network) connection.
That is, either of the connections 470 and 480 can represent several different computing devices and network components that allow the various devices to communicate through the internet, either through a hard-wired or wireless connection. Either of the connections 470 and 480 can also be a combination of several modes of connection.
The configuration of the devices in
The routine 500 can begin at block 502 by sensing data via a patient attached probe (e.g., 112 in
At block 506, the patient attached sensor or associated device can transmit instructions (e.g., a key) to the monitoring device to instruct the monitoring device to display not just the first patient data from the attached sensor/probe, but also second patient data from the non-contact monitoring (NCM) system. At block 508, the monitoring device receives and reads the key (which may be encrypted) from the patient attached sensor or associated device. As has been discussed above, the instruction can prompt the monitoring device to start receiving data from the NCM system. Such instructions can also activate the NCM system, as well as: allow the monitoring device to include the additional physiological information directly onto a current display screen; allow the device or system to reconfigure a current screen to display the additional physiological information; and/or allow for separate pages to be accessible on the device or system for display of the additional physiological information.
At block 510, the routine 500 can sense/capture second patient data from the video system. For example, the routine 500 can recognize a patient within a field of view (FOV) of one or more image capture devices and/or define one or more regions of interest (ROI's) on the patient. In some embodiments, the routine 500 can recognize the patient by identifying the patient using facial recognition hardware and/or software of the image capture device(s). In these embodiments, the routine 500 can display the name of the patient on a display screen once the routine 500 has identified the patient.
At block 512, the monitoring device (e.g., a MPM device, clinician's tablet, etc.) displays data from both the attached sensor as well as second patient data from the non-contact monitoring system.
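The flow of blocks 502 through 512 described above might be summarized — purely as an illustrative sketch, with hypothetical object interfaces standing in for the probe, NCM system, and monitoring device — as:

```python
def routine_500(probe, ncm_system, monitor):
    """Illustrative sketch of the monitoring routine: sense data from the
    attached probe, check for a key, optionally pull in non-contact data,
    and return everything the monitoring device should display."""
    first_data = probe.read()                # blocks 502/504: attached-sensor data
    display = {"first": first_data}
    key = probe.get_key()                    # block 506: instruction, if any
    if key is not None and monitor.accepts(key):  # block 508: validate key
        second_data = ncm_system.capture()   # block 510: video-derived data
        display["second"] = second_data      # block 512: display both
    return display
```

Without a valid key, the routine simply falls back to displaying the attached-sensor data alone.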
Although the steps of the routine 500 are discussed and illustrated in a particular order, the routine 500 illustrated in
Referring to
Referring still to
In exemplary embodiments, the wireless receiving unit 702 only sends NCM data if the probe 112 is connected to it, and/or if the probe provides the correct key instructing the wireless receiving unit to use NCM data. Further, in some embodiments, the wireless receiving unit 702 can be configured to send only a subset of the wireless data (e.g., NCM data) streamed to it. For example, the wireless receiving unit could be configured only to send out respiratory rate, combined with pulse oximeter parameters. In other embodiments, it could only stream out apnea detection flags with pulse oximetry parameters, only stream out patient posture information, etc. Accordingly, some embodiments provide customized or customizable wireless receiving units according to different possible desired parameters handled by the wireless receiving units.
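A parameter-subset receiving unit of the kind described above could be sketched as follows; the function names and the dictionary representation of the data streams are illustrative assumptions only:

```python
def filter_ncm_stream(ncm_data, allowed_parameters):
    """An ACE/wireless receiving unit provisioned for a parameter subset:
    pass through only the NCM parameters it is configured to forward."""
    return {k: v for k, v in ncm_data.items() if k in allowed_parameters}

def forward(probe_data, ncm_data, allowed_parameters, probe_connected):
    """Combine probe data with the permitted NCM subset; forward NCM data
    only while a compatible probe is attached, as described above."""
    out = dict(probe_data)
    if probe_connected:
        out.update(filter_ncm_stream(ncm_data, allowed_parameters))
    return out
```

Two differently provisioned units would simply hold different `allowed_parameters` sets — e.g., one for respiratory parameters, another for activity and posture.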
In some embodiments, the data streams referred to above may include numerical values of a physiological parameter (e.g., heart rate, respiratory rate), a flag indicating a state (e.g., an apnea flag, a sensor-off flag, etc.), a physiological waveform (e.g., PPG, ECG, EEG, CO2), a video stream (e.g. from an RGB or depth camera), etc.
The above detailed descriptions of embodiments of the technology are not intended to be exhaustive or to limit the technology to the precise form disclosed above. Although specific embodiments of, and examples for, the technology are described above for illustrative purposes, various equivalent modifications are possible within the scope of the technology, as those skilled in the relevant art will recognize. For example, while steps are presented in a given order, alternative embodiments can perform steps in a different order. Furthermore, the various embodiments described herein can also be combined to provide further embodiments.
The systems and methods described herein can be provided in the form of tangible and non-transitory machine-readable medium or media (such as a hard disk drive, hardware memory, etc.) having instructions recorded thereon for execution by a processor or computer. The set of instructions can include various commands that instruct the computer or processor to perform specific operations such as the methods and processes of the various embodiments described here. The set of instructions can be in the form of a software program or application. The computer storage media can include volatile and non-volatile media, and removable and non-removable media, for storage of information such as computer-readable instructions, data structures, program modules or other data. The computer storage media can include, but are not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD, or other optical storage, magnetic disk storage, or any other hardware medium which can be used to store desired information and that can be accessed by components of the system. Components of the system can communicate with each other via wired or wireless communication. The components can be separate from each other, or various combinations of components can be integrated together into a monitor or processor or contained within a workstation with standard computer hardware (for example, processors, circuitry, logic circuits, memory, and the like). The system can include processing devices such as microprocessors, microcontrollers, integrated circuits, control units, storage media, and other hardware.
From the foregoing, it will be appreciated that specific embodiments of the technology have been described herein for purposes of illustration, but well-known structures and functions have not been shown or described in detail to avoid unnecessarily obscuring the description of the embodiments of the technology. To the extent any materials incorporated herein by reference conflict with the present disclosure, the present disclosure controls. Where the context permits, singular or plural terms can also include the plural or singular term, respectively. Moreover, unless the word “or” is expressly limited to mean only a single item exclusive from the other items in reference to a list of two or more items, then the use of “or” in such a list is to be interpreted as including (a) any single item in the list, (b) all of the items in the list, or (c) any combination of the items in the list. Additionally, the terms “comprising,” “including,” “having” and “with” are used throughout to mean including at least the recited feature(s) such that any greater number of the same feature and/or additional types of other features are not precluded.
From the foregoing, it will also be appreciated that various modifications can be made without deviating from the technology. For example, various components of the technology can be further divided into subcomponents, or various components and functions of the technology can be combined and/or integrated. Furthermore, although advantages associated with certain embodiments of the technology have been described in the context of those embodiments, other embodiments can also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the technology. Accordingly, the disclosure and associated technology can encompass other embodiments not expressly shown or described herein.
20140235976 | Bresch et al. | Aug 2014 | A1 |
20140267718 | Govro et al. | Sep 2014 | A1 |
20140272860 | Peterson et al. | Sep 2014 | A1 |
20140275832 | Muehlsteff et al. | Sep 2014 | A1 |
20140276104 | Tao et al. | Sep 2014 | A1 |
20140330336 | Errico et al. | Nov 2014 | A1 |
20140334697 | Kersten et al. | Nov 2014 | A1 |
20140358017 | Op Den Buijs et al. | Dec 2014 | A1 |
20140378810 | Davis et al. | Dec 2014 | A1 |
20140379369 | Kokovidis et al. | Dec 2014 | A1 |
20150003723 | Huang et al. | Jan 2015 | A1 |
20150131880 | Wang et al. | May 2015 | A1 |
20150157269 | Lisogurski et al. | Jun 2015 | A1 |
20150223731 | Sahin | Aug 2015 | A1 |
20150238150 | Subramaniam | Aug 2015 | A1 |
20150265187 | Bernal et al. | Sep 2015 | A1 |
20150282724 | McDuff et al. | Oct 2015 | A1 |
20150301590 | Furst et al. | Oct 2015 | A1 |
20150317814 | Johnston et al. | Nov 2015 | A1 |
20160000335 | Khachaturian et al. | Jan 2016 | A1 |
20160049094 | Gupta et al. | Feb 2016 | A1 |
20160082222 | Garcia Molina et al. | Mar 2016 | A1 |
20160140828 | DeForest | May 2016 | A1 |
20160143598 | Rusin et al. | May 2016 | A1 |
20160151022 | Berlin et al. | Jun 2016 | A1 |
20160156835 | Ogasawara et al. | Jun 2016 | A1 |
20160174887 | Kirenko et al. | Jun 2016 | A1 |
20160310084 | Banerjee et al. | Oct 2016 | A1 |
20160317041 | Porges et al. | Nov 2016 | A1 |
20160345931 | Xu et al. | Dec 2016 | A1 |
20160367186 | Freeman et al. | Dec 2016 | A1 |
20170007342 | Kasai et al. | Jan 2017 | A1 |
20170007795 | Pedro et al. | Jan 2017 | A1 |
20170055877 | Niemeyer | Mar 2017 | A1 |
20170065484 | Addison et al. | Mar 2017 | A1 |
20170071516 | Bhagat et al. | Mar 2017 | A1 |
20170095215 | Watson et al. | Apr 2017 | A1 |
20170095217 | Hubert et al. | Apr 2017 | A1 |
20170119340 | Nakai et al. | May 2017 | A1 |
20170147772 | Meehan et al. | May 2017 | A1 |
20170164904 | Kirenko | Jun 2017 | A1 |
20170172434 | Amelard et al. | Jun 2017 | A1 |
20170173262 | Veltz | Jun 2017 | A1 |
20170238805 | Addison et al. | Aug 2017 | A1 |
20170238842 | Jacquel | Aug 2017 | A1 |
20170311887 | Leussler et al. | Nov 2017 | A1 |
20170319114 | Kaestle | Nov 2017 | A1 |
20180042486 | Yoshizawa et al. | Feb 2018 | A1 |
20180042500 | Liao et al. | Feb 2018 | A1 |
20180053392 | White et al. | Feb 2018 | A1 |
20180104426 | Oldfield et al. | Apr 2018 | A1 |
20180106897 | Shouldice et al. | Apr 2018 | A1 |
20180169361 | Dennis et al. | Jun 2018 | A1 |
20180217660 | Dayal et al. | Aug 2018 | A1 |
20180228381 | Leboeuf et al. | Aug 2018 | A1 |
20180310844 | Tezuka et al. | Nov 2018 | A1 |
20180325420 | Gigi | Nov 2018 | A1 |
20180333050 | Greiner et al. | Nov 2018 | A1 |
20190050985 | Den Brinker et al. | Feb 2019 | A1 |
20190142274 | Addison et al. | May 2019 | A1 |
20190199970 | Greiner et al. | Jun 2019 | A1 |
20190209046 | Addison et al. | Jul 2019 | A1 |
20190209083 | Wu et al. | Jul 2019 | A1 |
20190307365 | Addison et al. | Oct 2019 | A1 |
20190311101 | Nienhouse | Oct 2019 | A1 |
20190343480 | Shute et al. | Nov 2019 | A1 |
20190380599 | Addison et al. | Dec 2019 | A1 |
20190380807 | Addison et al. | Dec 2019 | A1 |
20200046302 | Jacquel et al. | Feb 2020 | A1 |
20200187827 | Addison et al. | Jun 2020 | A1 |
20200202154 | Wang et al. | Jun 2020 | A1 |
20200205734 | Mulligan et al. | Jul 2020 | A1 |
20200237225 | Addison et al. | Jul 2020 | A1 |
20200242790 | Addison et al. | Jul 2020 | A1 |
20200250406 | Wang et al. | Aug 2020 | A1 |
20200253560 | De Haan | Aug 2020 | A1 |
20200289024 | Addison et al. | Sep 2020 | A1 |
20200329976 | Chen et al. | Oct 2020 | A1 |
20210068670 | Redtel | Mar 2021 | A1 |
20210153746 | Addison et al. | May 2021 | A1 |
Number | Date | Country |
---|---|---|
106725410 | May 2017 | CN |
111728602 | Oct 2020 | CN |
112233813 | Jan 2021 | CN |
19741982 | Oct 1998 | DE |
2793189 | Nov 2016 | EP |
2428162 | Aug 2017 | EP |
3207862 | Aug 2017 | EP |
3207863 | Aug 2017 | EP |
3384827 | Oct 2018 | EP |
2772828 | Jan 2019 | EP |
2009544080 | Dec 2009 | JP |
2011130996 | Jul 2011 | JP |
101644843 | Aug 2016 | KR |
2004100067 | Nov 2004 | WO |
2010034107 | Apr 2010 | WO |
2010036653 | Apr 2010 | WO |
2015059700 | Apr 2015 | WO |
2015078735 | Jun 2015 | WO |
2015110859 | Jul 2015 | WO |
2016065411 | May 2016 | WO |
2016178141 | Nov 2016 | WO |
2016209491 | Dec 2016 | WO |
2017060463 | Apr 2017 | WO |
2017089139 | Jun 2017 | WO |
2017144934 | Aug 2017 | WO |
2018042376 | Mar 2018 | WO |
2019094893 | May 2019 | WO |
2019135877 | Jul 2019 | WO |
2019240991 | Dec 2019 | WO |
2020033613 | Feb 2020 | WO |
2021044240 | Mar 2021 | WO |
Entry |
---|
Mukherjee S, Dolui K, Datta SK. Patient health management system using e-health monitoring architecture. In 2014 IEEE international advance computing conference (IACC) Feb. 21, 2014 (pp. 400-405). IEEE. (Year: 2014). |
Lawrence E, Navarro KF, Hoang D, Lim YY. Data collection, correlation and dissemination of medical sensor information in a WSN. In2009 Fifth International Conference on Networking and Services Apr. 20, 2009 (pp. 402-408). IEEE. (Year: 2009). |
Srinivas, Jangirala, Dheerendra Mishra, and Sourav Mukhopadhyay. “A mutual authentication framework for wireless medical sensor networks.” Journal of medical systems 41.5 (2017): 80. (Year: 2017). |
Fischer, et al., “ReMoteCare: Health Monitoring with Streaming Video,” ICMB '08, 7th International Conference on Mobile Business, IEEE, Piscataway, NJ, Jul. 7, 2008, pp. 280-286 (MD40378PCT). |
International Application No. PCT/US2021/015669 International Search Report and Written Opinion dated Apr. 21, 2021, 15 pages (MD40378PCT). |
“European Search Report”, European Patent Application No. 17156337.2, Applicant: Covidien LP, dated Aug. 23, 2017, 10 pages. |
“International Search Report and Written Opinion”, International Application No. PCT/US2018/060648, dated Jan. 28, 2019, 17 pages. |
“International Search Report and Written Opinion”, International Application No. PCT/US2018/065492, dated Mar. 8, 2019, 12 pages. |
“International Search Report and Written Opinion”, International Application No. PCT/US19/035433, dated Nov. 11, 2019, 17 pages. |
“International Search Report and Written Opinion”, International Application No. PCT/US2019/045600, dated Oct. 23, 2019, 19 pages. |
“Invitation to Pay Additional Fees and Partial International Search Report”, International Application No. PCT/US2019/035433, dated Sep. 13, 2019, 16 pages. |
“Medical Electrical Equipment, Part 2-61: Particular requirements for basic safety and essential performance of pulse oximeter equipment”, BSI Standards Publication, BS EN ISO 80601-2-61, 2011, 98 pages. |
Aarts, Lonneke A.M., et al., “Non-contact heart rate monitoring utilizing camera photoplethysmography in neonatal intensive care unit—A Pilot Study”, Early Human Development 89, 2013, pp. 943-948, 6 pages. |
Abbas, A.K., et al., “Neonatal non-contact respiratory monitoring based on real-time infrared thermography”, Biomed. Eng. Online, vol. 10, No. 93, 2011, 17 pages. |
Addison, Paul S., “A Review of Signal Processing Used in the Implementation of the Pulse Oximetry Photoplethysmographic Fluid Responsiveness Parameter”, International Anesthesia Research Society, vol. 119, No. 6, Dec. 2014, pp. 1293-1306, 14 pages. |
Addison, Paul S., et al., “Developing an algorithm for pulse oximetry derived respiratory rate (RRoxi): a healthy volunteer study”, J Clin Monit Comput, No. 26, 2012, pp. 45-51, 7 pages. |
Addison, Paul S., et al., “Pulse oximetry-derived respiratory rate in general care floor patients”, J. Clin Monit Comput, No. 29, 2015, pp. 113-120, 8 pages. |
Addison, P.S., et al., “Video-based Heart Rate Monitoring across a Range of Skin Pigmentations during an Acute Hypoxic Challenge”, J Clin Monit Comput, vol. 9, Nov. 9, 2017, 15 pages. |
Amazon, “Dockem Koala Tablet Wall Mount Dock for iPad Air/Mini/Pro, Samsung Galaxy Tab/Note, Nexus 7/10, and More (Black Brackets, Screw-in Version)”, https://www.amazon.com/Tablet-Dockem-Samsung-Brackets-Version-dp/B00JV75FC6?th=1, First available Apr. 22, 2014, viewed on Nov. 16, 2021, Apr. 22, 2014, 4 pages. |
Amelard, et al., “Non-contact transmittance photoplethysmographic imaging (PPGI) for long-distance cardiovascular monitoring”, ResearchGate, XP055542534 [Retrieved online Jan. 15, 2019], Mar. 23, 2015, pp. 1-13, 14 pages. |
Armanian, A. M., “Caffeine administration to prevent apnea in very premature infants”, Pediatrics & Neonatology, 57 (5), 2016, pp. 408-412, 5 pages. |
Barone, S, et al., “Computer-aided modelling of three-dimensional maxillofacial tissues through multi-modal Imaging”, Proceedings of the Institution of Mechanical Engineers, Journal of Engineering in Medicine, Part H vol. 227, No. 2, Feb. 1, 2013, 1 page. |
Barone, S, et al., “Creation of 3D Multi-body Orthodontic Models by Using Independent Imaging Sensors”, Sensors MDPI AG Switzerland, vol. 13, No. 2, Jan. 1, 2013, pp. 2033-2050, 18 pages. |
Bhattacharya, S., et al., “A Novel Classification Method for Predicting Acute Hypotensive Episodes in Critical Care”, 5th ACM Conference on Bioinformatics, Computational Biology and Health Informatics (ACM-BCB 2014), Newport Beach, USA, 2014, 10 pages. |
Bhattacharya, S., et al., “Unsupervised learning using Gaussian Mixture Copula models”, 21st International Conference on Computational Statistics (COMPSTAT 2014), Geneva, Switzerland, 2014, pp. 523-530, 8 pages. |
Bickler, Philip E., et al., “Factors Affecting the Performance of 5 Cerebral Oximeters During Hypoxia in Healthy Volunteers”, Society for Technology in Anesthesia, vol. 117, No. 4, Oct. 2013, pp. 813-823, 11 pages. |
Bousefsaf, Frederic, et al., “Continuous wavelet filtering on webcam photoplethysmographic signals to remotely assess the instantaneous heart rate”, Biomedical Signal Processing and Control 8, 2013, pp. 568-574, 7 pages. |
Bruser, C., et al., “Adaptive Beat-to-Beat Heart Rate Estimation in Ballistocardiograms”, IEEE Transactions Information Technology in Biomedicine, vol. 15, No. 5, Sep. 2011, pp. 778-786, 9 pages. |
Cennini, Giovanni, et al., “Heart rate monitoring via remote photoplethysmography with motion artifacts reduction”, Optics Express, vol. 18, No. 5, Mar. 1, 2010, pp. 4867-4875, 9 pages. |
Colantonio, S., et al., “A smart mirror to promote a healthy lifestyle”, Biosystems Engineering. vol. 138, Innovations in Medicine and Healthcare, Oct. 2015, pp. 33-43, 11 pages. |
Cooley, et al., “An Algorithm for the Machine Calculation of Complex Fourier Series”, Aug. 17, 1964, pp. 297-301, 5 pages. |
Di Fiore, J.M., et al., “Intermittent hypoxemia and oxidative stress in preterm infants”, Respiratory Physiology & Neurobiology, No. 266, 2019, pp. 121-129, 25 pages. |
Fei, J., et al., “Thermistor at a distance: unobtrusive measurement of breathing”, IEEE Transactions on Biomedical Engineering, vol. 57, No. 4, 2010, pp. 988-998, 11 pages. |
Feng, Litong, et al., “Dynamic ROI based on K-means for remote photoplethysmography”, IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Apr. 2015, pp. 1310-1314, 5 pages. |
George, et al., “Respiratory Rate Measurement From PPG Signal Using Smart Fusion Technique”, International Conference on Engineering Trends and Science & Humanities (ICETSH-2015), 2015, 5 pages. |
Goldman, L.J., “Nasal airflow and thoracoabdominal motion in children using infrared thermographic video processing”. Pediatric Pulmonology, vol. 47, No. 5, 2012, pp. 476-486, 11 pages. |
Grimm, T., et al., “Sleep position classification from a depth camera using bed aligned maps”, 23rd International Conference on Pattern Recognition (ICPR), Dec. 2016, pp. 319-324, 6 pages. |
Gsmarena, “Apple iPad Pro 11 (2018)”, https://www.gsmarena.com/apple_ipad_pro_11_(2018)-9386.php, viewed on Nov. 16, 2021, 1 page. |
Guazzi, Alessandro R., et al., “Non-contact measurement of oxygen saturation with an RGB camera”, Biomedical Optics Express, vol. 6, No. 9, Sep. 1, 2015, pp. 3320-3338, 19 pages. |
Han, J., et al., “Visible and infrared image registration in man-made environments employing hybrid visuals features”, Pattern Recognition Letters, vol. 34, No. 1, 2013, pp. 42-51, 10 pages. |
Huddar, V., et al., “Predicting Postoperative Acute Respiratory Failure in Critical Care using Nursing Notes and Physiological Signals”, 36th Annual International Conference of IEEE Engineering in Medicine and Biology Society (IEEE EMBC 2014), Chicago, USA, 2014, pp. 2702-2705, 4 pages. |
Hyvarinen, A., et al., “Independent Component Analysis: Algorithms and Applications”, Neural Networks, vol. 13, No. 4, 2000, pp. 411-430, 31 pages. |
Javadi, M., et al., “Diagnosing Pneumonia in Rural Thailand: Digital Cameras versus Film Digitizers for Chest Radiograph Teleradiology”, International Journal of Infectious Disease, 10(2), Mar. 2006, pp. 129-135, 7 pages. |
Jopling, M. W., et al., “Issues in the Laboratory Evaluation of Pulse Oximeter Performance”, Anesth. Analg., No. 94, 2002, pp. S62-S68, 7 pages. |
Kastle, Siegfried W., et al., “Determining the Artifact Sensitivity of Recent Pulse Oximeters During Laboratory Benchmarking”, Journal of Clinical Monitoring and Computing, vol. 16, No. 7, 2000, pp. 509-522, 14 pages. |
Klaessens, J.H.G.M., et al., “Non-invasive skin oxygenation imaging using a multi-spectral camera system: Effectiveness of various concentration algorithms applied on human skin”, Proc. of SPIE, vol. 7174 717408-1, 2009, 14 pages. |
Kong, Lingqin, et al., “Non-contact detection of oxygen saturation based on visible light imaging device using ambient light”, Optics Express, vol. 21, No. 15, Jul. 29, 2013, pp. 17464-17471, 8 pages. |
Kortelainen, J.M., et al., “Sleep staging based on signals acquired through bed sensor”, IEEE Transactions on Informational Technology in Biomedicine, vol. 14, No. 3, May 2010, pp. 776-785, 10 pages. |
Kumar, M., et al., “Distance PPG: Robust non-contact vital signs monitoring using a camera”, Biomedical Optics Express, vol. 6, No. 5, May 1, 2015, 24 pages. |
Kwon, Sungjun, et al., “Validation of heart rate extraction using video imaging on a built-in camera system of a smartphone”, 34th Annual International Conference of the IEEE EMBS, San Diego, CA, USA, Aug. 28-Sep. 1, 2012, pp. 2174-2177, 4 pages. |
Lai, C.J., et al., “Heated humidified high-flow nasal oxygen prevents intraoperative body temperature decrease in non-intubated thoracoscopy”, Journal of Anesthesia, Oct. 15, 2018, 8 pages. |
Li, et al., “A Non-Contact Vision-Based System for Respiratory Rate Estimation”, IEEE 978-1-4244-7929-0/14, 2014, pp. 2119-2122, 4 pages. |
Liu, H., et al., “A Novel Method Based on Two Cameras for Accurate Estimation of Arterial Oxygen Saturation”, BioMedical Engineering Online, vol. 14, No. 52, 2015, 18 pages. |
Liu, S., et al., “In-bed pose estimation: Deep learning with shallow dataset”, IEEE Journal of Translational Engineering in Health and Medicine, No. 7, 2019, pp. 1-12, 12 pages. |
Liu, C., et al., “Motion Magnification”, ACM Transactions on Graphics (TOG), vol. 24, No. 3, 2005, pp. 519-526, 8 pages. |
Lv, et al., “Class Energy Image Analysis for Video Sensor-Based Gait Recognition: A Review”, Sensors, No. 15, 2015, pp. 932-964, 33 pages. |
McDuff, Daniel J., et al., “A Survey of Remote Optical Photoplethysmographic Imaging Methods”, IEEE 987-1-4244-0270-1/15, 2015, pp. 6398-6404, 7 pages. |
Mestha, L.K., et al., “Towards Continuous Monitoring of Pulse Rate in Neonatal Intensive Care Unit with a Webcam”, Proc. of 36th Annual Int. Conf. of the IEEE Engineering in Medicine and Biology Society, Chicago, IL, 2014, pp. 3817-3820, 4 pages. |
Nguyen, et al., “3D shape, deformation and vibration measurements using infrared Kinect sensors and digital image correlation”, Applied Optics, vol. 56, No. 32, Nov. 10, 2017, 8 pages. |
Ni, et al., “RGBD-Camera Based Get-Up Event Detection for Hospital Fall Prevention”, Acoustics, Speech and Signal Processing (ICASSP) 2012 IEEE International Conf., Mar. 2012, pp. 1405-1408, 6 pages. |
Nisar, et al., “Contactless heart rate monitor for multiple persons in a video”, IEEE International Conference on Consumer Electronics—Taiwan (ICCE-TW), XP03291229 [Retrieved on Jul. 25, 2016], May 27, 2016, 2 pages. |
Pereira, C., et al., “Noncontact Monitoring of Respiratory Rate in Newborn Infants Using Thermal Imaging”, IEEE Transactions on Biomedical Engineering, Aug. 23, 2018, 10 pages. |
Poh, et al., “Advancements in Noncontact, Multiparameter Physiological Measurements Using a Webcam”, IEEE Transactions on Biomedical Engineering, vol. 58, No. 1, Jan. 2011, pp. 7-11, 5 pages. |
Poh, et al., “Non-contact, automated cardiac pulse measurements using video imaging and blind source separation”, OPT Express 18, 2010, pp. 10762-10774, 14 pages. |
Povsic, Klemen, et al., “Real-Time 3D visualization of the thoraco-abdominal surface during breathing with body movement and deformation extraction”, Physiological Measurement, vol. 36, No. 7, May 28, 2015, pp. 1497-1516, 22 pages. |
Prochazka, et al., “Microsoft Kinect Visual and Depth Sensors for Breathing and Heart Rate Analysis”, Sensors, vol. 16, No. 7, Jun. 28, 2016, 11 pages. |
Rajan, V., et al., “Clinical Decision Support for Stroke using Multiview Learning based Models for NIHSS Scores”, PAKDD 2016 Workshop: Predictive Analytics in Critical Care (PACC), Auckland, New Zealand, 2016, pp. 190-199, 10 pages. |
Rajan, V., et al., “Dependency Clustering of Mixed Data with Gaussian Mixture Copulas”, 25th International Joint Conference on Artificial Intelligence IJCAI, New York, USA, 2016, pp. 1967-1973, 7 pages. |
Reisner, A., et al., “Utility of the Photoplethysmogram in Circulatory Monitoring”, American Society of Anesthesiologists, May 2008, pp. 950-958, 9 pages. |
Rougier, Caroline, et al., “Robust Video Surveillance for Fall Detection Based on Human Shape Deformation”, IEEE Transactions on Circuits and Systems for Video Technology, vol. 21, No. 5, May 2011, pp. 611-622, 12 pages. |
Rubinstein, M, “Analysis and Visualization of Temporal Variations in Video”, Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Feb. 2014, 118 pages. |
Scalise, Lorenzo, et al., “Heart rate measurement in neonatal patients using a webcamera”, Department of Industrial Engineering and Mathematical Science, Italy, 978-1-4673-0882-3/12, IEEE, 2012, 4 pages. |
Schaerer, J., et al., “Multi-dimensional respiratory motion tracking from markerless optical surface imaging based on deformable mesh registration”, Physics in Medicine and Biology, vol. 57, No. 2, Dec. 14, 2011, pp. 357-373, 18 pages. |
Sengupta, A., et al., “A Statistical Model for Stroke Outcome Prediction and Treatment Planning”, 38th Annual International Conference of the IEEE Engineering in Medicine and Biology (Society IEEE EMBC2016), Orlando, USA, 2016, pp. 2516-2519, 4 pages. |
Shah, Nitin, et al., “Performance of three new-generation pulse oximeters during motion and low perfusion in volunteers”, Journal of Clinical Anesthesia, No. 24, 2012, pp. 385-391, 7 pages. |
Shao, Dangdang, et al., “Noncontact Monitoring Breathing Pattern, Exhalation Flow Rate and Pulse Transit Time”, IEEE Transactions on Biomedical Engineering, vol. 61, No. 11, Nov. 2014, pp. 2760-2767, 8 pages. |
Shrivastava, H., et al., “Classification with Imbalance: A Similarity-based Method for Predicting Respiratory Failure”, IEEE International Conference on Bioinformatics and Biomedicine (IEEE BIBM2015), Washington, DC, USA, 2015, pp. 707-714, 8 pages. |
Sun, Yu, et al., “Motion-compensated noncontact imaging photoplethysmography to monitor cardiorespiratory status during exercise”, Journal of Biomedical Optics, vol. 16, No. 7, Jul. 1, 2011, 10 pages. |
Sun, Yu, et al., “Noncontact imaging photoplethysmography to effectively access pulse rate variability”, Journal of Biomedical Optics, vol. 18(6), Jun. 2013, 10 pages. |
Tamura, et al., “Wearable Photoplethysmographic Sensors—Past & Present”, Electronics, vol. 3, 2014, pp. 282-302, 21 pages. |
Tarassenko, L., et al., “Non-contact video-based vital sign monitoring using ambient light and auto-regressive models”, Institute of Physics and Engineering in Medicine, vol. 35, 2014, pp. 807-831, 26 pages. |
Teichmann, D., et al., “Non-Contact monitoring techniques—Principles and applications”, In Proc. of IEEE International Conference of the Engineering in Medicine and Biology Society (EMBC), San Diego, CA, 2012, pp. 1302-1305, 4 pages. |
Verkruysse, Wim, et al., “Calibration of Contactless Pulse Oximetry”, Anesthesia & Analgesia, vol. 124, No. 1, Jan. 2017, pp. 136-145, 10 pages. |
Villarroel, Mauricio, et al., “Continuous non-contact vital sign monitoring in neonatal intensive care unit”, Healthcare Technology Letters, vol. 1, Issue 3, 2014, pp. 87-91, 5 pages. |
Wadhwa, N., et al., “Phase-Based Video Motion Processing”, MIT Computer Science and Artificial Intelligence Lab, Jul. 2013, 9 pages. |
Wadhwa, N., et al., “Riesz pyramids for fast phase-based video magnification”, In Proc. of IEEE International Conference on Computational Photography (ICCP), Santa Clara, CA, 2014, 10 pages. |
Wang, W., et al., “Exploiting spatial redundancy of image sensor for motion robust rPPG”, IEEE Transactions on Biomedical Engineering, vol. 62, No. 2, 2015, pp. 415-425, 11 pages. |
Wu, H.Y., et al., “Eulerian video magnification for revealing subtle changes in the world”, ACM Transactions on Graphics (TOG), vol. 31, No. 4, 2012, pp. 651-658, 8 pages. |
Wulbrand, H., et al., “Submental and diaphragmatic muscle activity during and at resolution of mixed and obstructive apneas and cardiorespiratory arousal in preterm infants”, Pediatric Research, No. 38(3), 1995, pp. 298-305, 9 pages. |
Zaunseder, et al., “Spatio-temporal analysis of blood perfusion by imaging photoplethysmography”, Progress in Biomedical Optics and Imaging, SPIE—International Society for Optical Engineering, vol. 10501, Feb. 20, 2018, 15 pages. |
Zhou, J., et al., “Maximum parsimony analysis of gene copy number changes in tumor phylogenetics”, 15th International Workshop on Algorithms in Bioinformatics WABI 2015, Atlanta, USA, 2015, pp. 108-120, 13 pages. |
Number | Date | Country | |
---|---|---|---|
20210235992 A1 | Aug 2021 | US |