The present disclosure relates to informative displays for non-contact patient monitoring, and more specifically, to informative displays for visualizing, e.g., low flow, apnea and/or patient motion events. Various patient breathing parameters can be obtained and/or calculated from depth measurements taken by a non-contact patient monitoring system including a depth sensing camera. The informative display can provide multiple visualizations of low flow, apnea, motion, etc., which can be adaptive such that the visualizations adjust based on further patient breathing measurements.
Depth sensing technologies have been developed that, when integrated into non-contact patient monitoring systems, can be used to determine a number of physiological and contextual parameters, such as respiration rate, tidal volume, minute volume, etc. Such parameters can be displayed on a display so that a clinician is provided with a basic visualization of these parameters. For example, respiratory volume as a function of time can be displayed as a rolling line plot to visualize a patient's breathing patterns.
However, additional effort and analysis are required for the clinician to decipher and interpret what the displayed data means with respect to the health of the patient being monitored. Accordingly, a need exists for systems and methods that are capable of both synthesizing patient monitoring data and providing additional visualization of the analyzed data for quick and easy interpretation and identification of developing medical issues.
Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale. Instead, emphasis is placed on illustrating clearly the principles of the present disclosure. The drawings should not be taken to limit the disclosure to the specific embodiments depicted but are for explanation and understanding only.
The present disclosure relates to informative displays for non-contact patient monitoring. The technology described herein can be incorporated into systems and methods for non-contact patient monitoring. As described in greater detail below, the described technology can include obtaining respiratory volume data, such as via non-contact patient monitoring using depth sensing cameras, and displaying the respiratory volume data as a function of time using a line plot. The technology may further include calculating absolute respiratory flow values from the plot line and determining when the calculated absolute respiratory flow value falls below a predetermined respiratory flow value, at which point a visual flag may be added to the display in order to indicate, e.g., low flow and/or apnea in the monitored patient. Subsequently collected data regarding respiratory volume and absolute respiratory flow values calculated therefrom may alter and/or remove previous visual flags added to the display. In some embodiments, the plot line is visually changed from a first plot line design to a second plot line design when the absolute respiratory flow value falls below the predetermined respiratory flow value to thereby indicate a low flow occurrence. In some embodiments, if the absolute respiratory flow value remains below the predetermined respiratory flow value for longer than a predetermined period of time, the rolling plot line is visually changed from the second plot line design to a third plot line design to thereby indicate an apnea event. In some embodiments, if the absolute respiratory flow value remains below the predetermined respiratory flow value for less than the predetermined period of time, the rolling plot line is changed from the second plot line design back to the first plot line design and any previous visualization of low flow via use of the second plot line design is removed from the display.
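By way of a hedged illustration only (not the disclosure's implementation), the logic summarized above can be sketched in Python; the sample rate, flow threshold, and all names below are assumptions introduced for the example:

```python
import numpy as np

def absolute_flow(volume, sample_rate_hz):
    """Estimate absolute respiratory flow as the magnitude of the time
    derivative of the respiratory volume signal."""
    return np.abs(np.gradient(volume) * sample_rate_hz)

def label_samples(abs_flow, flow_threshold, sample_rate_hz, apnea_seconds):
    """Label each sample 'normal', 'low_flow', or 'apnea' according to how
    long the absolute flow has stayed below the predetermined value."""
    apnea_run = max(1, int(apnea_seconds * sample_rate_hz))
    labels, below = [], 0
    for f in abs_flow:
        below = below + 1 if f < flow_threshold else 0
        if below == 0:
            labels.append("normal")       # first plot line design
        elif below >= apnea_run:
            labels.append("apnea")        # third plot line design
        else:
            labels.append("low_flow")     # second plot line design
    return labels
```

The per-sample labels can then drive which plot line design is drawn, and a run of "low_flow" labels that ends before ripening into "apnea" can simply be relabeled "normal" to model the removal of the low flow visualization.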
Informative displays presenting patient monitoring data such as respiratory volume as a function of time, a histogram of breathing rates, respiration rate as function of time, and a patient depth image can also be visually altered to easily and quickly convey to a clinician various information pertaining to, e.g., apnea or patient motion events. In some embodiments, detected apnea events are visualized on an informative display by at least one of: adding an apnea label to a patient depth image; visually changing the plot line design of a rolling plot line for respiratory volume as a function of time; dropping to zero a rolling plot line for respiration rate as a function of time; and introducing and/or growing an apnea bar to a histogram of breathing rates. In some embodiments, detected patient motion events are visualized on an informative display by at least one of: adding a motion label to a patient depth image; visually changing the plot line design of a rolling plot line for respiratory volume as a function of time (including the use of a different plot line design than the plot line design used for denoting apnea); dropping to zero or otherwise holding flat a rolling plot line for respiration rate as a function of time; and introducing and growing a new motion bar to a histogram of breathing rates. The informative display may also incorporate the use of visual and/or auditory warnings when detected apnea and/or motion events continue for longer than a predetermined period of time. The informative display may incorporate either or both of the previously described apnea visualizations and motion visualizations.
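Purely as a sketch of how such a display might be driven, the following hypothetical dispatch touches each of the four display elements described above; the display object and every method on it are invented for illustration and do not come from the disclosure:

```python
def visualize_event(display, event, duration_s):
    """Apply per-panel visual flags for a detected event ('apnea' or
    'motion'); all attributes and methods here are hypothetical."""
    display.depth_image.set_label(event.upper())    # e.g., "APNEA" or "MOTION" label
    display.volume_plot.set_line_design(event)      # distinct plot line design per event
    display.rate_trend.hold_flat(at_zero=True)      # drop/hold the respiration rate line
    display.rate_histogram.grow_event_bar(event)    # dedicated bar grows while event lasts
    if duration_s > display.warning_delay_s:        # escalate after a predetermined period
        display.raise_warning(event)
```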
Specific details of several embodiments of the present technology are described herein with reference to
The camera 114 can capture a sequence of images over time. The camera 114 can be a depth sensing camera, such as a Kinect camera from Microsoft Corp. (Redmond, Washington) or an Intel camera such as the D415, D435, and SR305 cameras from Intel Corp. (Santa Clara, California). A depth sensing camera can detect a distance between the camera and objects within its field of view. Such information can be used to determine that a patient 112 is within the FOV 116 of the camera 114 and/or to determine one or more regions of interest (ROI) to monitor on the patient 112. Once a ROI is identified, the ROI can be monitored over time, and the changes in depth of regions (e.g., pixels) within the ROI 102 can represent movements of the patient 112 associated with breathing. As described in greater detail in U.S. Patent Application Publication No. 2019/0209046, those movements, or changes of regions within the ROI 102, can be used to determine various breathing parameters, such as tidal volume, minute volume, respiratory rate, etc. Those movements, or changes of regions within the ROI 102, can also be used to detect various breathing abnormalities, as discussed in greater detail in U.S. Patent Application Publication No. 2020/0046302. The various breathing abnormalities can include, for example, low flow, apnea, rapid breathing (tachypnea), slow breathing, intermittent or irregular breathing, shallow breathing, obstructed and/or impaired breathing, and others. U.S. Patent Application Publication Nos. 2019/0209046 and 2020/0046302 are incorporated herein by reference in their entirety.
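The incorporated publications describe the actual parameter extraction; as a hedged sketch of the general principle only, depth changes within a ROI could be reduced to a relative volume signal as follows (the constant per-pixel area and all names are assumptions):

```python
import numpy as np

def roi_volume_signal(depth_frames, roi_mask, pixel_area_m2):
    """Reduce a stack of depth frames (in meters) to a relative chest
    volume signal by integrating depth change over the ROI."""
    baseline = depth_frames[0]
    signal = []
    for frame in depth_frames:
        # Chest rise moves the surface toward the camera, reducing depth.
        displacement = baseline[roi_mask] - frame[roi_mask]
        signal.append(displacement.sum() * pixel_area_m2)  # m^3, relative to frame 0
    return np.asarray(signal)
```

Breathing parameters such as respiratory rate then follow from the periodicity and amplitude of this signal.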
In some embodiments, the system 100 determines a skeleton-like outline of the patient 112 to identify a point or points from which to extrapolate a ROI. For example, a skeleton-like outline can be used to find a center point of a chest, shoulder points, waist points, and/or any other points on a body of the patient 112. These points can be used to determine one or more ROIs. For example, a ROI 102 can be defined by filling in an area around a center point 103 of the chest, as shown in
In another example, the patient 112 can wear specially configured clothing (not shown) that includes one or more features to indicate points on the body of the patient 112, such as the patient's shoulders and/or the center of the patient's chest. The one or more features can include a visually encoded message (e.g., a bar code, a QR code, etc.) and/or brightly colored shapes that contrast with the rest of the patient's clothing. In these and other embodiments, the one or more features can include one or more sensors that are configured to indicate their positions by transmitting light or other information to the camera 114. In these and still other embodiments, the one or more features can include a grid or another identifiable pattern to aid the system 100 in recognizing the patient 112 and/or the patient's movement. In some embodiments, the one or more features can be stuck on the clothing using a fastening mechanism such as adhesive, a pin, etc. For example, a small sticker can be placed on a patient's shoulders and/or on the center of the patient's chest that can be easily identified within an image captured by the camera 114. The system 100 can recognize the one or more features on the patient's clothing to identify specific points on the body of the patient 112. In turn, the system 100 can use these points to recognize the patient 112 and/or to define a ROI.
In some embodiments, the system 100 can receive user input to identify a starting point for defining a ROI. For example, an image can be reproduced on a display 122 of the system 100, allowing a user of the system 100 to select a patient 112 for monitoring (which can be helpful where multiple objects are within the FOV 116 of the camera 114) and/or allowing the user to select a point on the patient 112 from which a ROI can be determined (such as the point 103 on the chest of the patient 112). In other embodiments, other methods for identifying a patient 112, identifying points on the patient 112, and/or defining one or more ROIs can be used.
The images detected by the camera 114 can be sent to the computing device 115 through a wired or wireless connection 120. The computing device 115 can include a processor 118 (e.g., a microprocessor), the display 122, and/or hardware memory 126 for storing software and computer instructions. Sequential image frames of the patient 112 are recorded by the video camera 114 and sent to the processor 118 for analysis. The display 122 can be remote from the camera 114, such as a video screen positioned separately from the processor 118 and the memory 126. Other embodiments of the computing device 115 can have different, fewer, or additional components than shown in
The computing device 210 can communicate with other devices, such as the server 225 and/or the image capture device(s) 285 via (e.g., wired or wireless) connections 270 and/or 280, respectively. For example, the computing device 210 can send to the server 225 information determined about a patient from images captured by the image capture device(s) 285. The computing device 210 can be the computing device 115 of
In some embodiments, the image capture device(s) 285 are remote sensing device(s), such as depth sensing video camera(s), as described above with respect to
The server 225 includes a processor 235 that is coupled to a memory 230. The processor 235 can store and recall data and applications in the memory 230. The processor 235 is also coupled to a transceiver 240. In some embodiments, the processor 235, and thus the server 225, can communicate with other devices, such as the computing device 210, through the connection 270.
The devices shown in the illustrative embodiment can be utilized in various ways. For example, either of the connections 270 and 280 can be varied. Either of the connections 270 and 280 can be a hard-wired connection. A hard-wired connection can involve connecting the devices through a USB (universal serial bus) port, serial port, parallel port, or other type of wired connection that can facilitate the transfer of data and information between a processor of a device and a second processor of a second device. In another embodiment, either of the connections 270 and 280 can be a dock where one device can plug into another device. In other embodiments, either of the connections 270 and 280 can be a wireless connection. These connections can take the form of any sort of wireless connection, including, but not limited to, Bluetooth connectivity, Wi-Fi connectivity, infrared, visible light, radio frequency (RF) signals, or other wireless protocols/methods. For example, other possible modes of wireless communication can include near-field communications, such as passive radio-frequency identification (RFID) and active RFID technologies. RFID and similar near-field communications can allow the various devices to communicate in short range when they are placed proximate to one another. In yet another embodiment, the various devices can connect through an internet (or other network) connection. That is, either of the connections 270 and 280 can represent several different computing devices and network components that allow the various devices to communicate through the internet, either through a hard-wired or wireless connection. Either of the connections 270 and 280 can also be a combination of several modes of connection.
The configuration of the devices in
Referring back to
In some embodiments, the absolute respiratory flow value used for identifying and visualizing breathing events (as discussed in greater detail below) is an average of absolute respiratory flow values taken over a set period of time. The set period of time used for calculating an average respiratory flow value is not limited. In some embodiments, the set period of time is one second to thereby provide a one second average respiratory flow value, while in other embodiments, the set period of time is three seconds to thereby provide a three second average respiratory flow value. Some embodiments of the technology described herein may also calculate, for example, both a one second and a three second average and use both values in creating an informative display. As noted previously, these averages can be used when determining whether breathing events may be occurring, such as by comparing the average respiratory flow value against a predetermined respiratory flow value below which a breathing event is likely to be occurring.
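A minimal sketch of such an average, assuming a uniformly sampled sequence of absolute flow values (function and parameter names are illustrative):

```python
def trailing_average(abs_flow, sample_rate_hz, window_seconds):
    """Trailing average of absolute flow over a set period of time
    (e.g., 1 s or 3 s); the window shortens at the start of the record."""
    n = max(1, int(window_seconds * sample_rate_hz))
    averaged = []
    for i in range(len(abs_flow)):
        window = abs_flow[max(0, i - n + 1):i + 1]
        averaged.append(sum(window) / len(window))
    return averaged
```

A one second and a three second average can be maintained side by side, each compared against the predetermined respiratory flow value.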
With reference to
As shown in
In some embodiments, the amount of time during which the absolute respiratory flow value remains below the predetermined respiratory flow value is monitored. In such embodiments, a predetermined period of time is established as a benchmark after which the low flow event can be considered an apnea event due to its duration, during which little to no respiratory flow is detected. Any predetermined period of time can be used, and the predetermined period of time may be fixed or dynamic. In a fixed scenario, a fixed period of time is used for all comparisons regardless of any previous data collected from the patient. For example, the fixed period of time may be 5 seconds, 7 seconds, 10 seconds, or more, but the selected fixed predetermined period of time does not change. In a dynamic scenario, previous data collected from the patient being monitored may be used to establish the specific time period assigned to the predetermined period of time. Any suitable previously collected data regarding the patient can be used to adjust the dynamic predetermined period of time. For example, in some embodiments, the dynamic predetermined period of time is based on the average breath duration over a set number of breaths immediately preceding the breathing event. A multiplier can also be used to increase or decrease the predetermined time period calculated based on breathing data taken from a period of time immediately preceding the breathing event. In one non-limiting example, the predetermined period of time may be based on the average breath duration of the three breaths immediately preceding the breathing event. In an example where the average breath duration of the three breaths preceding the breathing event is 4 seconds, this time period can be multiplied by, for example, 0.5, 1.0, 1.5, 2.0, 5.0, etc., to establish a predetermined period of time for this specific breathing event of 2, 4, 6, 8, 20, etc., seconds. In another non-limiting example of a dynamic predetermined period of time, the predetermined period of time may be set as a multiple of the typical exhalation period (from peak to trough of each breath). Use of a dynamic predetermined time period helps to ensure that the low flow and apnea flags are more tailored to the specific patient being monitored and their breathing tendencies.
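The dynamic variant can be sketched as follows; the multiplier, fallback, and breath-count defaults are arbitrary stand-ins:

```python
def dynamic_apnea_period(breath_durations_s, n_breaths=3, multiplier=1.5,
                         fixed_fallback_s=7.0):
    """Predetermined period derived from the average duration of the
    breaths immediately preceding the event, scaled by a multiplier."""
    recent = breath_durations_s[-n_breaths:]
    if len(recent) < n_breaths:
        return fixed_fallback_s  # too little history: fall back to a fixed period
    return multiplier * sum(recent) / len(recent)

# Three preceding breaths of 4 s each with a 1.5 multiplier:
# dynamic_apnea_period([4.0, 4.0, 4.0]) -> 6.0 seconds
```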
While not shown in
As also shown in
While not shown in the Figures, in an alternate embodiment, the second plot line design at the first portion of the plot line section 602 can be retained rather than being overwritten by the third plot line design. In such an embodiment, the plot line section 602 would have a portion from the initial time when the absolute respiratory flow value fell below the predetermined respiratory flow value to the time when the absolute respiratory flow value remained below the predetermined respiratory flow value for the predetermined period of time that is in the second plot line design, and a portion after the predetermined period of time that is in the third plot line design. This representation would therefore show the clinician the progression of the breathing event from a low flow event to an apnea event.
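One way to render that progression, assuming per-sample state labels like those sketched earlier (the colors and line styles merely stand in for the first, second, and third plot line designs):

```python
import matplotlib.pyplot as plt

DESIGNS = {  # illustrative stand-ins for the three plot line designs
    "normal":   dict(color="tab:blue",   linestyle="-"),
    "low_flow": dict(color="tab:orange", linestyle="--"),
    "apnea":    dict(color="tab:red",    linestyle=":"),
}

def plot_with_progression(t, volume, labels, ax=None):
    """Draw the volume plot line, switching design at each state change
    and retaining the low-flow portion ahead of the apnea portion."""
    ax = ax or plt.gca()
    start = 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            # Overlap by one sample so adjacent segments join without gaps.
            ax.plot(t[start:i + 1], volume[start:i + 1], **DESIGNS[labels[start]])
            start = i
```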
With reference to
As also shown in
With reference to
With reference to
With reference
The predetermined period of time 704 can be similar to the predetermined period of time discussed previously with respect to
With reference to
Step 905 represents a decision block in which it is determined whether or not the monitored patient has begun a new breath. The initiation of a new breath can be determined using, for example, the depth sensing camera monitoring the region of interest of the patient and from which breath information can be extrapolated. In the event that a new breath has begun, step 906 results in lowering a low flow flag (if present). The low flow flag can be lowered in this scenario because the presence of the new breath by the patient safely indicates that the patient is not experiencing a low flow event. Step 907 represents another decision block in which it is determined whether or not the previous data sample was identified as apnea. If the previous data sample had been identified as apnea, then the associated apnea flag is lowered at step 908. Again, it is possible to lower the apnea flag due to the previous identification of a new patient breath. Following a lowering of the apnea flag at step 908, the flow chart may flow back to step 901. If the previous data sample had not been identified as apnea, then the flow chart proceeds to step 909, a decision block to determine if the previous data sample had been identified as low flow. If the previous data sample had been identified as low flow, then step 910 is carried out, in which the low flow flags are retroactively lowered. Following completion of step 910, the flow chart may flow back to step 901. While not shown in
Returning to decision block 905, if a new breath has not begun, then step 911 is performed, in which a low flow flag is raised. Decision block 912 follows, in which it is determined if the low flow period has exceeded the predetermined period of time (i.e., the period of time for establishing when an apnea event is occurring). While not shown in
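Read as pseudocode, decision blocks 905-912 amount to a small per-sample state machine. The sketch below is one possible reading of the flow chart, with the retroactive lowering of step 910 modeled simply as discarding a low-flow run that a new breath interrupts:

```python
class BreathFlagTracker:
    """One possible reading of decision blocks 905-912."""

    def __init__(self, apnea_period_s):
        self.apnea_period_s = apnea_period_s
        self.low_flow_since = None  # start time of the current low-flow run
        self.apnea = False

    def update(self, t, new_breath):
        if new_breath:
            # Steps 906-910: a new breath lowers the low flow flag and,
            # if the previous sample was apnea, lowers that flag as well.
            self.low_flow_since = None
            self.apnea = False
            return "normal"
        if self.low_flow_since is None:
            self.low_flow_since = t  # step 911: raise the low flow flag
        if t - self.low_flow_since >= self.apnea_period_s:
            self.apnea = True        # step 912 onward: escalate to apnea
        return "apnea" if self.apnea else "low_flow"
```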
Referring back to
With reference to
Each of images 310A, 320A, 330A, 340A is configured to provide additional visualizations of various patient events that are not necessarily immediately apparent or discernable from the basic data presented in images 310A, 320A, 330A and 340A. For example, and with respect to
With respect to patient depth image 310A, a visualization 311 of the occurrence of an apnea event is provided by superimposing over the depth image a label, such as an “APNEA” label. As shown in
With respect to graph 320A detailing respiratory volume as a function of time, the visualization of an apnea event may be similar or identical to the embodiments described previously wherein the design of the plot line changes once an apnea event has been determined and/or confirmed. As shown in
With respect to histogram 330A, the histogram 330A generally includes along the x-axis various ranges of breathing rates. For example, the various breathing rate ranges may include 0-4 breaths per minute (bpm), 5-8 bpm, 9-12 bpm, 13-16 bpm, etc. As the monitored patient exhibits various breathing rates, the bar over the associated breathing rate range within which the exhibited breathing rate occurred increases. As shown in
As shown in
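The binning just described could be sketched as follows; the ranges beyond 13-16 bpm, and all names, are assumptions:

```python
from collections import Counter

BIN_LABELS = ["0-4", "5-8", "9-12", "13-16", "17-20", "21+"]

def rate_bin(bpm):
    """Map a measured breathing rate to its histogram range."""
    idx = 0 if bpm <= 4 else (int(bpm) - 1) // 4
    return BIN_LABELS[min(idx, len(BIN_LABELS) - 1)]

class RateHistogram:
    """The bar over a range grows each time a rate in that range occurs."""

    def __init__(self):
        self.counts = Counter()

    def add_rate(self, bpm):
        self.counts[rate_bin(bpm)] += 1
```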
With respect to trend display 340A, the plot line 341 tracks the measured breaths per minute as a function of time. As shown in
Any of the above described visualizations of an apnea event may be used alone or in any combination. Furthermore, any or all of the visualizations may be accompanied by an audible alarm that provides another signal of an apnea event. In some embodiments, the audible alarm begins immediately upon the start of the apnea event, while in other embodiments the audible alarm does not begin until after the apnea event has continued for longer than a predetermined period of time. The audible alarm may also increase in pitch and/or volume the longer the apnea event continues in order to convey to a clinician an increasing seriousness of the apnea event.
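A toy mapping from event duration to alarm pitch and volume, consistent with the escalation just described (every constant is invented for illustration; setting delay_s to zero reproduces the immediate-alarm embodiments):

```python
def alarm_params(event_duration_s, delay_s=10.0, base_hz=440.0, base_volume=0.2):
    """Return (pitch_hz, volume) for the audible alarm, or None while the
    event is still shorter than the predetermined delay."""
    if event_duration_s < delay_s:
        return None
    overtime = event_duration_s - delay_s
    pitch = base_hz * (1.0 + 0.05 * overtime)         # pitch climbs with duration
    volume = min(1.0, base_volume + 0.05 * overtime)  # volume ramps, capped at full
    return pitch, volume
```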
In
With respect to graph 320A shown in
With respect to histogram 330A shown in
As shown in
With respect to trend display 340A shown in
While not shown in
Any of the above described visualizations of a motion event may be used alone or in any combination. Similarly, any combination of visualizations for apnea events and motion events can be used in the visual display 300. Furthermore, any or all of the previously described visualizations may be accompanied by an audible alarm that provides another signal of an apnea or a motion event. In some embodiments, an audible alarm for an apnea event or a motion event begins immediately upon the start of the associated event, while in other embodiments the audible alarm does not begin until after the event has continued for longer than a predetermined period of time. The audible alarm may also increase in pitch and/or volume the longer the event continues in order to convey to a clinician an increasing seriousness of the event. In some embodiments, a different sound, pitch, pattern of sounds, etc., is used for each of an apnea event audible alarm and a motion event audible alarm.
As mentioned previously, breathing data collected during an identified motion event may be ignored and/or eliminated from breathing monitoring data used to determine various patient breathing parameters. For example, the processor used together with the non-contact patient monitoring system may be configured such that once a motion event is identified and for so long as the motion event is occurring, none of the breathing data collected is used in calculating any breathing parameter, such as respiratory volume or average breathing rates. In other embodiments, the data collected during a motion event is maintained and/or used in determining various patient breathing parameters. Any data from a motion event that is maintained can be incorporated with data from normal breathing events or used to create a separate data set of breathing parameters during motion events.
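Both policies reduce to a few lines (a sketch; samples are assumed to be records carrying a boolean motion flag):

```python
def exclude_motion(samples):
    """Policy 1: drop samples collected during identified motion events
    before computing any breathing parameter."""
    return [s for s in samples if not s.get("motion", False)]

def split_by_motion(samples):
    """Policy 2: keep motion-period data, but as a separate data set."""
    quiet = [s for s in samples if not s.get("motion", False)]
    moving = [s for s in samples if s.get("motion", False)]
    return quiet, moving
```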
With reference to
The above detailed descriptions of embodiments of the technology are not intended to be exhaustive or to limit the technology to the precise form disclosed above. Although specific embodiments of, and examples for, the technology are described above for illustrative purposes, various equivalent modifications are possible within the scope of the technology, as those skilled in the relevant art will recognize. For example, while steps are presented in a given order, alternative embodiments can perform steps in a different order. Furthermore, the various embodiments described herein can also be combined to provide further embodiments.
The systems and methods described herein can be provided in the form of tangible and non-transitory machine-readable medium or media (such as a hard disk drive, hardware memory, etc.) having instructions recorded thereon for execution by a processor or computer. The set of instructions can include various commands that instruct the computer or processor to perform specific operations such as the methods and processes of the various embodiments described here. The set of instructions can be in the form of a software program or application. The computer storage media can include volatile and non-volatile media, and removable and non-removable media, for storage of information such as computer-readable instructions, data structures, program modules or other data. The computer storage media can include, but are not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD, or other optical storage, magnetic disk storage, or any other hardware medium which can be used to store desired information and that can be accessed by components of the system. Components of the system can communicate with each other via wired or wireless communication. The components can be separate from each other, or various combinations of components can be integrated together into a monitor or processor, or contained within a workstation with standard computer hardware (for example, processors, circuitry, logic circuits, memory, and the like). The system can include processing devices such as microprocessors, microcontrollers, integrated circuits, control units, storage media, and other hardware.
From the foregoing, it will be appreciated that specific embodiments of the technology have been described herein for purposes of illustration, but well-known structures and functions have not been shown or described in detail to avoid unnecessarily obscuring the description of the embodiments of the technology. To the extent any materials incorporated herein by reference conflict with the present disclosure, the present disclosure controls. Where the context permits, singular or plural terms can also include the plural or singular term, respectively. Moreover, unless the word “or” is expressly limited to mean only a single item exclusive from the other items in reference to a list of two or more items, then the use of “or” in such a list is to be interpreted as including (a) any single item in the list, (b) all of the items in the list, or (c) any combination of the items in the list. Additionally, the terms “comprising,” “including,” “having” and “with” are used throughout to mean including at least the recited feature(s) such that any greater number of the same feature and/or additional types of other features are not precluded. Furthermore, as used herein, the term “substantially” refers to the complete or nearly complete extent or degree of an action, characteristic, property, state, structure, item, or result. For example, an object that is “substantially” enclosed would mean that the object is either completely enclosed or nearly completely enclosed. The exact allowable degree of deviation from absolute completeness may in some cases depend on the specific context. However, generally speaking the nearness of completion will be so as to have the same overall result as if absolute and total completion were obtained. The use of “substantially” is equally applicable when used in a negative connotation to refer to the complete or near complete lack of an action, characteristic, property, state, structure, item, or result.
It should be understood that various aspects disclosed herein may be combined in different combinations than the combinations specifically presented in the description and accompanying drawings. It should also be understood that, depending on the example, certain acts or events of any of the processes or methods described herein may be performed in a different sequence, may be added, merged, or left out altogether (e.g., all described acts or events may not be necessary to carry out the techniques). In addition, while certain aspects of this disclosure are described as being performed by a single module or unit for purposes of clarity, it should be understood that the techniques of this disclosure may be performed by a combination of units or modules associated with, for example, a medical device.
In one or more examples, the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
The present application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 63/049,889, entitled “Informative Display for Non-Contact Patient Monitoring” and filed Jul. 9, 2020 and U.S. Provisional Patent Application No. 63/057,413, entitled “Informative Display for Non-Contact Patient Monitoring” and filed Jul. 28, 2020, both of which are incorporated herein by reference in their entirety. U.S. Patent Application Publication Nos. 2019/0209046 and 2020/0046302 are incorporated herein by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
5107845 | Guern et al. | Apr 1992 | A |
5408998 | Mersch | Apr 1995 | A |
5704367 | Ishikawa et al. | Jan 1998 | A |
5800360 | Kisner et al. | Sep 1998 | A |
5995856 | Mannheimer et al. | Nov 1999 | A |
6241684 | Amano et al. | Jun 2001 | B1 |
6668071 | Minkin et al. | Dec 2003 | B1 |
6920236 | Prokoski | Jul 2005 | B2 |
7431700 | Aoki et al. | Oct 2008 | B2 |
7558618 | Williams | Jul 2009 | B1 |
8149273 | Liu et al. | Apr 2012 | B2 |
8754772 | Horng et al. | Jun 2014 | B2 |
8792969 | Bernal et al. | Jul 2014 | B2 |
8971985 | Bernal et al. | Mar 2015 | B2 |
9226691 | Bernal et al. | Jan 2016 | B2 |
9282725 | Jensen-Jarolim et al. | Mar 2016 | B2 |
9301710 | Mestha et al. | Apr 2016 | B2 |
9402601 | Berger et al. | Aug 2016 | B1 |
9436984 | Xu et al. | Sep 2016 | B2 |
9443289 | Xu et al. | Sep 2016 | B2 |
9504426 | Kyal et al. | Nov 2016 | B2 |
9508141 | Khachaturian et al. | Nov 2016 | B2 |
9607138 | Baldwin et al. | Mar 2017 | B1 |
9662022 | Kyal et al. | May 2017 | B2 |
9693693 | Farag et al. | Jul 2017 | B2 |
9693710 | Mestha et al. | Jul 2017 | B2 |
9697599 | Prasad et al. | Jul 2017 | B2 |
9750461 | Telfort | Sep 2017 | B1 |
9839756 | Klasek | Dec 2017 | B2 |
9943371 | Bresch et al. | Apr 2018 | B2 |
10213540 | Burbank et al. | Feb 2019 | B2 |
10278585 | Ferguson et al. | May 2019 | B2 |
10376147 | Wood et al. | Aug 2019 | B2 |
10398353 | Addison et al. | Sep 2019 | B2 |
10447972 | Patil | Oct 2019 | B2 |
10489912 | Brailovskiy | Nov 2019 | B1 |
10523852 | Tzvieli et al. | Dec 2019 | B2 |
10588779 | Vorhees et al. | Mar 2020 | B2 |
10589916 | Mcrae | Mar 2020 | B2 |
10650585 | Kiely | May 2020 | B2 |
10667723 | Jacquel et al. | Jun 2020 | B2 |
10702188 | Addison et al. | Jul 2020 | B2 |
10729357 | Larson et al. | Aug 2020 | B2 |
10874331 | Kaiser et al. | Dec 2020 | B2 |
10937296 | Kukreja et al. | Mar 2021 | B1 |
10939824 | Addison et al. | Mar 2021 | B2 |
10939834 | Khwaja et al. | Mar 2021 | B2 |
10966059 | Dayal et al. | Mar 2021 | B1 |
11311252 | Jacquel et al. | Apr 2022 | B2 |
11315275 | Addison et al. | Apr 2022 | B2 |
11317828 | Addison et al. | May 2022 | B2 |
11350850 | Jacquel et al. | Jun 2022 | B2 |
11850026 | Levi et al. | Dec 2023 | B2 |
20020137464 | Dolgonos et al. | Sep 2002 | A1 |
20040001633 | Caviedes | Jan 2004 | A1 |
20040258285 | Hansen et al. | Dec 2004 | A1 |
20050203348 | Shihadeh et al. | Sep 2005 | A1 |
20070116328 | Sablak et al. | May 2007 | A1 |
20080001735 | Tran | Jan 2008 | A1 |
20080108880 | Young et al. | May 2008 | A1 |
20080279420 | Masticola et al. | Nov 2008 | A1 |
20080295837 | McCormick et al. | Dec 2008 | A1 |
20090024012 | Li et al. | Jan 2009 | A1 |
20090050155 | Alder | Feb 2009 | A1 |
20090141124 | Liu et al. | Jun 2009 | A1 |
20090304280 | Aharoni et al. | Dec 2009 | A1 |
20100152600 | Droitcour | Jun 2010 | A1 |
20100210924 | Parthasarathy et al. | Aug 2010 | A1 |
20100236553 | Jafari et al. | Sep 2010 | A1 |
20100249630 | Droitcour et al. | Sep 2010 | A1 |
20100324437 | Freeman et al. | Dec 2010 | A1 |
20110144517 | Cervantes | Jun 2011 | A1 |
20110150274 | Patwardhan et al. | Jun 2011 | A1 |
20120065533 | Carrillo et al. | Mar 2012 | A1 |
20120075464 | Derenne | Mar 2012 | A1 |
20120195473 | De Haan et al. | Aug 2012 | A1 |
20120243797 | Di Venuto Dayer et al. | Sep 2012 | A1 |
20130073312 | Thompson et al. | Mar 2013 | A1 |
20130267873 | Fuchs | Oct 2013 | A1 |
20130271591 | Van Leest et al. | Oct 2013 | A1 |
20130272393 | Kirenko et al. | Oct 2013 | A1 |
20130275873 | Shaw et al. | Oct 2013 | A1 |
20130324830 | Bernal et al. | Dec 2013 | A1 |
20130324876 | Bernal et al. | Dec 2013 | A1 |
20140023235 | Cennini et al. | Jan 2014 | A1 |
20140052006 | Lee et al. | Feb 2014 | A1 |
20140053840 | Liu | Feb 2014 | A1 |
20140073860 | Urtti | Mar 2014 | A1 |
20140139405 | Ribble et al. | May 2014 | A1 |
20140140592 | Lasenby et al. | May 2014 | A1 |
20140235976 | Bresch et al. | Aug 2014 | A1 |
20140267718 | Govro et al. | Sep 2014 | A1 |
20140272860 | Peterson et al. | Sep 2014 | A1 |
20140275832 | Muehlsteff et al. | Sep 2014 | A1 |
20140276104 | Tao et al. | Sep 2014 | A1 |
20140330336 | Errico et al. | Nov 2014 | A1 |
20140334697 | Kersten et al. | Nov 2014 | A1 |
20140358017 | Op Den Buijs et al. | Dec 2014 | A1 |
20140378810 | Davis et al. | Dec 2014 | A1 |
20140379369 | Kokovidis et al. | Dec 2014 | A1 |
20150003723 | Huang et al. | Jan 2015 | A1 |
20150068069 | Tran et al. | Mar 2015 | A1 |
20150094597 | Mestha et al. | Apr 2015 | A1 |
20150131880 | Wang et al. | May 2015 | A1 |
20150157269 | Lisogurski et al. | Jun 2015 | A1 |
20150198707 | Al-Alusi | Jul 2015 | A1 |
20150223731 | Sahin | Aug 2015 | A1 |
20150238150 | Subramaniam | Aug 2015 | A1 |
20150265187 | Bernal et al. | Sep 2015 | A1 |
20150282724 | McDuff et al. | Oct 2015 | A1 |
20150286779 | Bala et al. | Oct 2015 | A1 |
20150301590 | Furst et al. | Oct 2015 | A1 |
20150317814 | Johnston et al. | Nov 2015 | A1 |
20150379370 | Clifton et al. | Dec 2015 | A1 |
20160000335 | Khachaturian et al. | Jan 2016 | A1 |
20160049094 | Gupta et al. | Feb 2016 | A1 |
20160082222 | Garcia Molina et al. | Mar 2016 | A1 |
20160140828 | Deforest | May 2016 | A1 |
20160143598 | Rusin et al. | May 2016 | A1 |
20160151022 | Berlin et al. | Jun 2016 | A1 |
20160156835 | Ogasawara et al. | Jun 2016 | A1 |
20160174887 | Kirenko et al. | Jun 2016 | A1 |
20160189518 | Krüger | Jun 2016 | A1 |
20160210747 | Hay et al. | Jul 2016 | A1 |
20160235344 | Auerbach | Aug 2016 | A1 |
20160310084 | Banerjee et al. | Oct 2016 | A1 |
20160317041 | Porges et al. | Nov 2016 | A1 |
20160345931 | Xu et al. | Dec 2016 | A1 |
20160367186 | Freeman et al. | Dec 2016 | A1 |
20170007342 | Kasai et al. | Jan 2017 | A1 |
20170007795 | Pedro et al. | Jan 2017 | A1 |
20170055877 | Niemeyer | Mar 2017 | A1 |
20170065484 | Addison et al. | Mar 2017 | A1 |
20170071516 | Bhagat et al. | Mar 2017 | A1 |
20170095215 | Watson et al. | Apr 2017 | A1 |
20170095217 | Hubert et al. | Apr 2017 | A1 |
20170119340 | Nakai et al. | May 2017 | A1 |
20170147772 | Meehan et al. | May 2017 | A1 |
20170164904 | Kirenko | Jun 2017 | A1 |
20170172434 | Amelard et al. | Jun 2017 | A1 |
20170173262 | Veltz | Jun 2017 | A1 |
20170238805 | Addison et al. | Aug 2017 | A1 |
20170238842 | Jacquel et al. | Aug 2017 | A1 |
20170311887 | Leussler et al. | Nov 2017 | A1 |
20170319114 | Kaestle | Nov 2017 | A1 |
20180042486 | Yoshizawa et al. | Feb 2018 | A1 |
20180042500 | Liao et al. | Feb 2018 | A1 |
20180049669 | Vu | Feb 2018 | A1 |
20180053392 | White et al. | Feb 2018 | A1 |
20180104426 | Oldfield et al. | Apr 2018 | A1 |
20180106897 | Shouldice et al. | Apr 2018 | A1 |
20180169361 | Dennis et al. | Jun 2018 | A1 |
20180217660 | Dayal et al. | Aug 2018 | A1 |
20180228381 | Leboeuf et al. | Aug 2018 | A1 |
20180303351 | Mestha et al. | Oct 2018 | A1 |
20180310844 | Tezuka et al. | Nov 2018 | A1 |
20180325420 | Gigi | Nov 2018 | A1 |
20180333050 | Greiner et al. | Nov 2018 | A1 |
20180333102 | De Haan et al. | Nov 2018 | A1 |
20180352150 | Purwar et al. | Dec 2018 | A1 |
20190050985 | Den Brinker et al. | Feb 2019 | A1 |
20190133499 | Auerbach | May 2019 | A1 |
20190142274 | Addison et al. | May 2019 | A1 |
20190183383 | Brayanov | Jun 2019 | A1 |
20190199970 | Greiner et al. | Jun 2019 | A1 |
20190209046 | Addison et al. | Jul 2019 | A1 |
20190209083 | Wu et al. | Jul 2019 | A1 |
20190307365 | Addison et al. | Oct 2019 | A1 |
20190311101 | Nienhouse | Oct 2019 | A1 |
20190343480 | Shute et al. | Nov 2019 | A1 |
20190380599 | Addison et al. | Dec 2019 | A1 |
20190380807 | Addison et al. | Dec 2019 | A1 |
20200046302 | Jacquel et al. | Feb 2020 | A1 |
20200187827 | Addison et al. | Jun 2020 | A1 |
20200202154 | Wang et al. | Jun 2020 | A1 |
20200205734 | Mulligan et al. | Jul 2020 | A1 |
20200237225 | Addison et al. | Jul 2020 | A1 |
20200242790 | Addison et al. | Jul 2020 | A1 |
20200250406 | Wang et al. | Aug 2020 | A1 |
20200253560 | De Haan | Aug 2020 | A1 |
20200279464 | Llewelyn | Sep 2020 | A1 |
20200289024 | Addison et al. | Sep 2020 | A1 |
20200329976 | Chen et al. | Oct 2020 | A1 |
20200409383 | Maunder | Dec 2020 | A1 |
20210068670 | Redtel | Mar 2021 | A1 |
20210142874 | Llewelyn | May 2021 | A1 |
20210153746 | Addison et al. | May 2021 | A1 |
20210201517 | Yang et al. | Jul 2021 | A1 |
20210233631 | Llewelyn | Jul 2021 | A1 |
20210235992 | Addison | Aug 2021 | A1 |
20210295662 | Bugbee et al. | Sep 2021 | A1 |
20210313075 | McNamara et al. | Oct 2021 | A1 |
20220211296 | Addison et al. | Jul 2022 | A1 |
20230122367 | Tesar | Apr 2023 | A1 |
Number | Date | Country | |
---|---|---|---|
20220007966 A1 | Jan 2022 | US |
Number | Date | Country | |
---|---|---|---|
63057413 | Jul 2020 | US | |
63049889 | Jul 2020 | US |