The present technology is generally related to patient monitoring using an image capture device and edge handling methods therefor.
Many conventional medical monitors require attachment of a sensor to a patient in order to detect physiologic signals from the patient and to transmit detected signals through a cable to the monitor. These monitors process the received signals and determine vital signs such as the patient's pulse rate, respiration rate, and arterial oxygen saturation. For example, a pulse oximeter system uses a finger sensor that can include two light emitters and a photodetector. The sensor emits light into the patient's finger and transmits the detected light signal to a monitor. The monitor includes a processor that processes the signal, determines vital signs (e.g., pulse rate, respiration rate, arterial oxygen saturation), and displays the vital signs on a display.
Other monitoring systems include other types of monitors and sensors, such as electroencephalogram (EEG) sensors, blood pressure cuffs, temperature probes, air flow measurement devices (e.g., spirometer), and others. Some wireless, wearable sensors have been developed, such as wireless EEG patches and wireless pulse oximetry sensors.
Video-based monitoring is a field of patient monitoring that uses one or more remote video cameras to detect physical attributes of the patient. This type of monitoring can also be called “non-contact” monitoring in reference to the remote video sensor(s), which does/do not contact the patient. The remainder of this disclosure offers solutions and improvements in this field.
The techniques of this disclosure generally relate to systems and methods for patient monitoring using an image capture device, including defining a region of interest (ROI) on a patient; capturing two or more images of the ROI using an image capture device; calculating an overall change in depth of the ROI within the two or more images, wherein calculating the overall change in depth of the ROI includes: measuring changes in depths of portions of the ROI; recognizing steep changes in depths in the measured changes in depths; and adjusting the recognized steep changes in depths.
In another aspect, adjusting the recognized steep changes in depths includes excluding the recognized steep changes in depths from the calculation of the overall change in depth of the ROI.
In another aspect, adjusting the recognized steep changes in depths includes (i) excluding measured changes in depths corresponding to an outer percentage of the ROI and/or to an edge region of the patient and/or (ii) excluding a percentage of the measured changes in depths surrounding a recognized steep change in depth.
In another aspect, adjusting the recognized steep changes in depths comprises including only measured changes in depths up to and/or between one or more recognized steep changes in depths in the calculation of the overall change in depth of the ROI.
In another aspect, adjusting the recognized steep changes in depths includes interpolating and/or extrapolating over the recognized steep changes in depths using one or more other measured changes in depths.
In another aspect, adjusting the recognized steep changes in depths includes using a template to approximate changes in depths at portions of the ROI corresponding to the recognized steep changes in depths.
Other aspects include determining one or more patient respiratory parameters using all or a subset of the measured changes in depths and/or all or a subset of the adjusted changes in depths. One exemplary patient respiratory parameter is a tidal volume of the patient, which can be determined using a subset of the measured changes in depths that excludes the recognized steep changes in depths and/or using all or a subset of the adjusted changes in depths. Another exemplary patient respiratory parameter is a respiratory rate of the patient, which can be determined using all of the measured changes in depths and none of the adjusted changes in depths.
The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale. Instead, emphasis is placed on illustrating clearly the principles of the present disclosure. The drawings should not be taken to limit the disclosure to the specific embodiments depicted, but are for explanation and understanding only.
The following disclosure describes video-based patient monitoring devices, systems, and associated methods for mitigating errors in changes in depths measured at edges of a patient (e.g., at edges of a patient's torso). As described in greater detail below, devices, systems, and/or methods configured in accordance with embodiments of the present technology are configured to capture one or more images (e.g., a video sequence) of a patient or portion(s) of a patient (e.g., a patient's torso) within a field of view of an image capture device. The devices, systems, and/or methods can measure changes in depths of regions (e.g., one or more pixels or groups of pixels) in the captured images over time. Based, at least in part, on these measurements, the devices, systems, and/or methods can determine various respiratory parameters of a patient, including tidal volume, minute volume, and respiratory rate, among others. In these and other embodiments, the devices, systems, and/or methods can analyze the respiratory parameters and can trigger alerts and/or alarms when the devices, systems, and/or methods detect one or more breathing abnormalities.
Errors in measured depths and/or changes in depths can occur at edges of a patient within the field of view of the image capture device. For example, lateral movement of a patient's torso at the edges of the patient's torso can be perceived as large changes in depths of the patient's torso in these regions. Additionally, or alternatively, as a patient inhales and exhales, edge portions of the patient's torso can move within and outside a line of sight of an image capture device. Thus, during a first portion of the patient's respiratory cycle (e.g., during a portion in which the patient's lungs contain greater than or equal to a first volume of air), an edge portion of the patient's torso can move within and/or remain within the line of sight of the image capture device. During a second portion of the patient's respiratory cycle (e.g., during a portion in which the patient's lungs contain less than or equal to the first volume of air), the edge portion of the patient's torso can move outside and/or remain outside of the line of sight of the image capture device. As a result, the image capture device can perceive large, inaccurate changes in depths at edge regions of the patient at various points throughout the patient's respiratory cycle. These large, inaccurate changes in depths can contribute to errors in the respiratory parameters of the patient determined by the video-based patient monitoring devices, systems, and/or methods.
Therefore, the video-based patient monitoring devices, systems, and associated methods of the present technology are configured to mitigate the errors in changes in depths measured at edge regions of a patient (e.g., of a patient's torso). In some embodiments, the devices, systems, and associated methods exclude the edge portions while integrating over a region within the extent of the patient. In these and other embodiments, the devices, systems, and associated methods interpolate changes in depths at the edge regions using changes in depths perceived at other regions of the patient. In these and still other embodiments, the devices, systems, and associated methods use a template (e.g., a template generated from a previous scan of the patient) to estimate changes in depths that occur at the edge regions of the patient. In this manner, the video-based patient monitoring devices, systems, and associated methods configured in accordance with various embodiments of the present technology can more accurately measure changes in depths that occur at edge regions of a patient within the field of view of the image capture device(s). In turn, the devices, systems, and associated methods can more accurately determine a patient's respiratory parameters.
Specific details of several embodiments of the present technology are described herein with reference to the accompanying figures.
As used herein, the term “steep” shall be understood to include any change in depth or rate of change above a threshold value or percentage. In some embodiments, the threshold value or percentage can be a predetermined and/or predefined threshold value (e.g., 0.5 mm, 1 mm, 2 mm, 5 mm, 10 mm, 20 mm, 50 mm, 75 mm, 100 mm, etc.) or percentage (e.g., 1%, 2%, 5%, 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80%, 90%, 95%, 97%, etc.). In these and other embodiments, the term “steep” shall be understood to encompass any change in depth or rate of change above a threshold value or percentage vis-à-vis the same pixel and/or region of an ROI across two or more images. In these and still other embodiments, the term “steep” shall be understood to encompass any change in depth or rate of change above a threshold value or percentage vis-à-vis neighboring and/or adjacent pixels and/or regions of an ROI across one or more images.
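By way of a non-limiting illustration, the following sketch shows one way a steepness criterion such as the above could be evaluated, assuming the two images are available as NumPy depth arrays in millimeters. The function name, thresholds, and synthetic frames are hypothetical assumptions rather than part of the disclosed embodiments; the sketch applies both the same-pixel (across two images) and neighboring-pixel criteria described above.

```python
import numpy as np

def find_steep_changes(depth_a, depth_b, temporal_thresh_mm=10.0, spatial_thresh_mm=10.0):
    """Flag 'steep' changes in depth between two depth frames of the same ROI.

    A pixel is flagged if its depth changes by more than temporal_thresh_mm
    across the two images (same-pixel criterion), or if its change in depth
    differs sharply from that of neighboring pixels (neighbor criterion).
    """
    delta = depth_b - depth_a                       # per-pixel change in depth (mm)
    steep = np.abs(delta) > temporal_thresh_mm      # same pixel across two images
    grad_rows, grad_cols = np.gradient(delta)       # change vs. neighboring pixels
    steep |= np.hypot(grad_rows, grad_cols) > spatial_thresh_mm
    return steep

# Synthetic 8x8 ROI: the torso rises 2 mm, but the two rightmost columns
# expose background 80 mm farther away (an edge artifact).
rng = np.random.default_rng(0)
frame_a = 1000.0 + rng.normal(0.0, 0.2, (8, 8))
frame_b = frame_a - 2.0
frame_b[:, 6:] += 80.0
print(find_steep_changes(frame_a, frame_b).sum(), "pixels flagged as steep")
```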
The camera 114 can capture a sequence of images over time. The camera 114 can be a depth sensing camera, such as a Kinect camera from Microsoft Corp. (Redmond, Wash.). A depth sensing camera can detect a distance between the camera and objects within its field of view. Such information can be used, as disclosed herein, to determine that a patient 112 is within the FOV 116 of the camera 114 and/or to determine one or more ROI's to monitor on the patient 112. Once a ROI is identified, the ROI can be monitored over time, and the changes in depths of regions (e.g., pixels) within the ROI 102 can represent movements of the patient 112 associated with breathing. As described in greater detail in U.S. patent application Ser. No. 16/219,360 and U.S. Provisional Patent Application Ser. No. 62/779,964, those movements, or changes of regions within the ROI 102, can be used to determine various breathing parameters, such as tidal volume, minute volume, respiratory rate, etc. Those movements, or changes of regions within the ROI 102, can also be used to detect various breathing abnormalities, as discussed in greater detail in U.S. Provisional Patent Application Ser. Nos. 62/716,724 and 62/779,964. The various breathing abnormalities can include, for example, apnea, rapid breathing (tachypnea), slow breathing, intermittent or irregular breathing, shallow breathing, obstructed and/or impaired breathing, and others. The entire disclosures of U.S. patent application Ser. No. 16/219,360 and U.S. Provisional Patent Application Ser. Nos. 62/716,724 and 62/779,964 are incorporated herein by reference.
In some embodiments, the system 100 determines a skeleton-like outline of the patient 112 to identify a point or points from which to extrapolate a ROI. For example, a skeleton-like outline can be used to find a center point of a chest, shoulder points, waist points, and/or any other points on a body of the patient 112. These points can be used to determine one or more ROI's. For example, a ROI 102 can be defined by filling in an area around a center point 103 of the chest, as shown in the figures.
In another example, the patient 112 can wear specially configured clothing (not shown) that includes one or more features to indicate points on the body of the patient 112, such as the patient's shoulders and/or the center of the patient's chest. The one or more features can include a visually encoded message (e.g., a bar code, a QR code, etc.), and/or brightly colored shapes that contrast with the rest of the patient's clothing. In these and other embodiments, the one or more features can include one or more sensors that are configured to indicate their positions by transmitting light or other information to the camera 114. In these and still other embodiments, the one or more features can include a grid or another identifiable pattern to aid the system 100 in recognizing the patient 112 and/or the patient's movement. In some embodiments, the one or more features can be stuck on the clothing using a fastening mechanism such as adhesive, a pin, etc. For example, a small sticker can be placed on a patient's shoulders and/or on the center of the patient's chest that can be easily identified within an image captured by the camera 114. The system 100 can recognize the one or more features on the patient's clothing to identify specific points on the body of the patient 112. In turn, the system 100 can use these points to recognize the patient 112 and/or to define a ROI.
In some embodiments, the system 100 can receive user input to identify a starting point for defining a ROI. For example, an image can be reproduced on a display 122 of the system 100, allowing a user of the system 100 to select a patient 112 for monitoring (which can be helpful where multiple objects are within the FOV 116 of the camera 114) and/or allowing the user to select a point on the patient 112 from which a ROI can be determined (such as the point 103 on the chest of the patient 112). In other embodiments, other methods for identifying a patient 112, identifying points on the patient 112, and/or defining one or more ROI's can be used.
The images detected by the camera 114 can be sent to the computing device 115 through a wired or wireless connection 120. The computing device 115 can include a processor 118 (e.g., a microprocessor), the display 122, and/or hardware memory 126 for storing software and computer instructions. Sequential image frames of the patient 112 are recorded by the video camera 114 and sent to the processor 118 for analysis. The display 122 can be remote from the camera 114, such as a video screen positioned separately from the processor 118 and the memory 126. Other embodiments of the computing device 115 can have different, fewer, or additional components than shown in the figures.
The computing device 210 can communicate with other devices, such as the server 225 and/or the image capture device(s) 285 via (e.g., wired or wireless) connections 270 and/or 280, respectively. For example, the computing device 210 can send to the server 225 information determined about a patient from images captured by the image capture device(s) 285. The computing device 210 can be the computing device 115 described above.
In some embodiments, the image capture device(s) 285 are remote sensing device(s), such as depth sensing video camera(s), as described above.
The server 225 includes a processor 235 that is coupled to a memory 230. The processor 235 can store and recall data and applications in the memory 230. The processor 235 is also coupled to a transceiver 240. In some embodiments, the processor 235, and subsequently the server 225, can communicate with other devices, such as the computing device 210 through the connection 270.
The devices shown in the illustrative embodiment can be utilized in various ways. For example, either of the connections 270 and 280 can be varied. Either of the connections 270 and 280 can be a hard-wired connection. A hard-wired connection can involve connecting the devices through a USB (universal serial bus) port, serial port, parallel port, or other type of wired connection that can facilitate the transfer of data and information between a processor of a device and a second processor of a second device. In another embodiment, either of the connections 270 and 280 can be a dock where one device can plug into another device. In other embodiments, either of the connections 270 and 280 can be a wireless connection. These connections can take the form of any sort of wireless connection, including, but not limited to, Bluetooth connectivity, Wi-Fi connectivity, infrared, visible light, radio frequency (RF) signals, or other wireless protocols/methods. For example, other possible modes of wireless communication can include near-field communications, such as passive radio-frequency identification (RFID) and active RFID technologies. RFID and similar near-field communications can allow the various devices to communicate in short range when they are placed proximate to one another. In yet another embodiment, the various devices can connect through an internet (or other network) connection. That is, either of the connections 270 and 280 can represent several different computing devices and network components that allow the various devices to communicate through the internet, either through a hard-wired or wireless connection. Either of the connections 270 and 280 can also be a combination of several modes of connection.
The configuration of the devices shown in the figures is merely one example; other configurations of the devices and connections are possible.
In these and other embodiments, the system 100 can define other regions of interest in addition to or in lieu of the ROI's 102, 351, 352, 353, and/or 354. For example, the system 100 can define a ROI 356 corresponding to the patient's chest (e.g., the ROI 351 plus the ROI 354) and/or a ROI 357 corresponding to the patient's abdomen (e.g., the ROI 352 plus the ROI 353). In these and other embodiments, the system 100 can define a ROI 358 corresponding to the right side of the patient's chest or torso (e.g., the ROI 353 and/or the ROI 354) and/or a ROI 359 corresponding to the left side of the patient's chest or torso (e.g., the ROI 351 and/or the ROI 352). In these and still other embodiments, the system 100 can define one or more other regions of interest than shown in the figures.
As the patient's torso 512 illustrated in the figures moves laterally (e.g., as the patient inhales), the image capture device can perceive a large change in depth toward the image capture device at an edge region of the torso along its line of sight, even though the surface of the torso at that region has not actually moved toward the image capture device by that amount.
As the patient exhales, similar changes in depths (but in the opposite direction) are perceived by the image capture device along the line of sight. For example, the same large change in depth can be perceived, but away from the image capture device, as the patient's torso 512 moves in the direction opposite to that illustrated in the figures.
Therefore, as discussed above, the large changes in depths measured by the system at edge regions of a patient within a line of sight of an image capture device can inaccurately represent actual changes in depths exhibited by these regions. In turn, patient respiratory parameters that are determined at least in part using these large, perceived changes in depths (without correction) can be inaccurate. Accordingly, video-based patient monitoring devices, systems, and methods configured in accordance with various embodiments of the present technology are configured to account for the inaccuracy of changes in depths perceived at edge regions of a patient, thereby increasing accuracy of the subsequently determined patient respiratory parameters.
Referring to the figures, in some embodiments, the video-based patient monitoring devices, systems, and methods can exclude the recognized steep changes in depths at edge regions of the patient from subsequent calculations, for example, by integrating the measured changes in depths only over an inner portion of the ROI within the extent of the patient's torso.
In these and other embodiments, the video-based patient monitoring devices, systems, and methods can interpolate points between (i) a location on a change in depth curve corresponding to a location near the edge of a patient's torso and (ii) a location on the change in depth curve corresponding to a location of the edge of the patient's torso.
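A minimal numerical sketch of such an interpolation is given below, assuming a one-dimensional horizontal slice of measured changes in depths stored as a NumPy array. The function name, the 10 mm steepness threshold, and the flat hold of the last reliable value toward the torso edge are illustrative assumptions only.

```python
import numpy as np

def interpolate_edge(change_curve, steep_mask):
    """Replace flagged samples of a 1-D change-in-depth curve by linearly
    interpolating from the reliable (unflagged) samples; beyond the last
    reliable sample, np.interp holds its value flat toward the edge."""
    x = np.arange(change_curve.size)
    good = ~steep_mask
    return np.where(good, change_curve,
                    np.interp(x, x[good], change_curve[good]))

# Horizontal slice of changes in depths (mm); the last two samples sit at
# the torso edge and show inaccurately steep changes.
curve = np.array([2.0, 2.1, 2.0, 1.9, 1.8, 40.0, 55.0])
print(interpolate_edge(curve, np.abs(curve) > 10.0))   # edge values -> ~1.8
```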
In these and still other embodiments, the video-based patient monitoring devices, systems, and methods can fit a template to one or more points along a change in depth curve.
In other embodiments, the video-based patient monitoring devices, systems, and methods can include multiple image capture devices. In these embodiments, an image capture device can remain substantially orthogonal to a region of interest on a patient (e.g., to the patient's torso), and one or more other image capture devices can be positioned at other angles offset from 90 degrees to the region of interest. In this manner, the other image capture device(s) can view around the edge regions of the region of interest and/or can be positioned such that lateral movement of the patient is directed toward or away from the other image capture device(s). Thus, data captured by the other image capture device(s) can be used to factor or filter out and/or account for the large, inaccurate changes in depths perceived at edge portions of the patient by the substantially orthogonal image capture device.
While the foregoing discussion used a horizontal portion 470 of the measured changes in depths as an example, the video-based patient monitoring devices, systems, and methods can additionally or alternatively interrogate a vertical portion, a portion at another angle, and/or all or a subset of the two-dimensional surface of the measured changes in depths in the same manner.
The routine 900 can begin at block 901 by recognizing a patient within a field of view (FOV) of the image capture device and/or by defining one or more regions of interest (ROI's) on the patient. In some embodiments, the routine 900 can recognize the patient by identifying the patient using facial recognition hardware and/or software of the image capture device. In these embodiments, the routine 900 can display the name of the patient on a display screen once the routine 900 has identified the patient. In these and other embodiments, the routine 900 can recognize a patient within the FOV of the image capture device by determining a skeleton outline of the patient and/or by recognizing one or more characteristic features (e.g., a torso of a patient). In these and still other embodiments, the routine 900 can define one or more ROI's on the patient in accordance with the discussion above.
At block 902, the routine 900 can capture two or more images of one or more ROI's. In some embodiments, the routine 900 can capture the two or more images of the one or more ROI's by capturing a video sequence of the one or more ROI's. In these and other embodiments, the routine 900 can capture the two or more images of the one or more ROI's by capturing separate still images of the one or more ROI's. The routine 900 can capture the two or more still images at time intervals shorter than the period of the patient's respiration cycle to ensure that the two or more still images occur within one period of the patient's respiration cycle.
At block 903, the routine 900 can measure changes in depths of one or more regions in one or more ROI's over time. In some embodiments, the routine 900 can measure changes in depths of regions in the one or more ROI's by computing a difference between a depth of a region of a ROI in a first captured image of the ROI and a depth of the same region in a second captured image of the ROI.
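One possible implementation of this measurement, assuming the captured images are NumPy depth frames in millimeters and the ROI is an axis-aligned rectangle, is sketched below; the function name and the synthetic frames are hypothetical.

```python
import numpy as np

def measure_depth_changes(depth_images, roi):
    """Per-pixel changes in depths of an ROI across consecutive depth frames.

    depth_images: sequence of 2-D depth frames (mm).
    roi: (row_slice, col_slice) defining the region of interest.
    """
    rows, cols = roi
    frames = np.stack([img[rows, cols] for img in depth_images])
    return np.diff(frames, axis=0)          # frame-to-frame change in depth

# Two synthetic 480x640 frames; the 'torso' moves 3 mm toward the camera.
f0 = np.full((480, 640), 1000.0)
f1 = f0.copy()
f1[100:300, 200:400] -= 3.0
changes = measure_depth_changes([f0, f1], (slice(100, 300), slice(200, 400)))
print(changes.shape, changes.mean())        # (1, 200, 200) -3.0
```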
At block 904, the routine 900 can recognize steep changes (increases and/or decreases) in depths measured at block 903. In some embodiments, the routine 900 can interrogate all or a subset of the changes in depths measured at block 903 to locate steep changes in depths. For example, the routine 900 can interrogate all or a subset of a two-dimensional surface corresponding to all or a subset of the changes in depths measured at block 903. In these and other embodiments, the routine 900 can interrogate a portion of the changes in depths measured at block 903, such as a horizontal portion, a vertical portion, a portion at another angle, a portion the routine 900 recognizes corresponds to an edge region of the ROI's and/or the patient, etc. In these and still other embodiments, the routine 900 can recognize a steep change in depth as a change in depth having a magnitude greater than or equal to a threshold value (e.g., greater than or equal to a predefined value or a value dependent on the patient).
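As an illustration of interrogating a horizontal portion, the sketch below scans one row of the measured changes in depths and returns the columns whose magnitude meets a steepness threshold; the names and the 10 mm threshold are assumptions.

```python
import numpy as np

def steep_indices_along_row(changes, row, thresh_mm=10.0):
    """Interrogate one horizontal portion of the measured changes in depths
    and return the column indices whose change in depth is steep."""
    return np.flatnonzero(np.abs(changes[row, :]) >= thresh_mm)

# A slice whose two outermost samples show edge artifacts.
changes = np.array([[1.8, 2.0, 2.1, 2.0, 1.9, 35.0, 60.0]])
print(steep_indices_along_row(changes, row=0))   # [5 6] -> edge region
```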
At block 905, the routine 900 can adjust the steep changes in depths measured at block 903 and/or recognized at block 904. In some embodiments, the routine 900 can adjust the steep changes in depths by excluding them from subsequent calculations (e.g., excluding them from calculations used to determine a patient's respiratory parameters at block 906). For example, the routine 900 can exclude the steep changes in depths from a subsequent integration to determine an overall change in depth exhibited across all or a portion of the changes in depths measured at block 903 (e.g., by integrating within the extent of a patient's torso). In these embodiments, the routine 900 can include only changes in depths within an inner percentage of all or a subset of the measured changes in depths (e.g., within an inner percentage of regions corresponding to a patient's torso) in the integration such that an outer percentage near edge regions of the patient are excluded from the integration. In these and other embodiments, the routine 900 can include all measured changes in depths up to and/or between recognized steep changes in depths.
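A sketch of the exclusion approach follows, assuming the measured changes in depths for one frame pair are a 2-D NumPy array; the function name, the inner-percentage default, and the unit pixel area are illustrative assumptions. It integrates only an inner fraction of the ROI (and optionally drops any remaining flagged samples), yielding an overall volume change.

```python
import numpy as np

def overall_change_excluding_edges(changes, steep_mask=None,
                                   inner_fraction=0.9, pixel_area_mm2=1.0):
    """Integrate changes in depths over the ROI while excluding an outer
    band near the patient's edges (and, optionally, flagged steep samples).
    Returns an overall volume change in mm^3."""
    h, w = changes.shape
    dr = int(h * (1 - inner_fraction) / 2)
    dc = int(w * (1 - inner_fraction) / 2)
    inner = changes[dr:h - dr, dc:w - dc]
    if steep_mask is not None:
        inner = inner[~steep_mask[dr:h - dr, dc:w - dc]]
    return inner.sum() * pixel_area_mm2

changes = np.full((100, 100), 2.0)   # 2 mm rise over the torso
changes[:, :5] = 60.0                # steep edge artifact on the left
print(overall_change_excluding_edges(changes, inner_fraction=0.8))  # 12800.0
```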
In some embodiments, the routine 900 can use one or more measured changes in depths to interpolate or extrapolate one or more changes in depths over the recognized steep changes in depths (e.g., by using one or more curves and/or one or more changes in depths measured at block 903). In these and other embodiments, the routine 900 can use a template to approximate changes in depths exhibited by regions that correspond to recognized steep changes in depths. The template can be a default template or a template based on a prior body scan of the patient. In these and other embodiments, the template can be an aggregate shape determined from a population-based analysis of body shapes. In these and still other embodiments, the template can correspond to a current position along the patient's respiratory cycle and/or can correspond to one or more changes in depths measured at block 903. In these and yet other embodiments, the routine 900 can use data captured by one or more other, non-orthogonal image capture devices to filter or factor out and/or account for the recognized steep changes in depths perceived by an orthogonal image capture device.
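The template-based adjustment could, for example, scale a stored template of expected changes in depths so that it matches the reliable measurements and then substitute the scaled template values at the flagged regions. The sketch below assumes such a template exists as a NumPy array that is nonzero over the reliable region; all names and values are hypothetical.

```python
import numpy as np

def adjust_with_template(changes, steep_mask, template):
    """Approximate changes in depths at flagged (steep) regions using a
    template of expected changes (e.g., from a prior body scan), scaled to
    match the reliable measurements."""
    good = ~steep_mask
    scale = changes[good].mean() / template[good].mean()
    adjusted = changes.copy()
    adjusted[steep_mask] = scale * template[steep_mask]
    return adjusted

template = np.array([2.0, 2.0, 1.8, 1.5, 1.0, 0.5])    # expected edge roll-off
changes = np.array([4.0, 4.1, 3.6, 3.0, 45.0, 70.0])   # measured, edges corrupted
steep = np.abs(changes) > 10.0
print(adjust_with_template(changes, steep, template))  # edges -> ~2.0, ~1.0
```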
At block 906, the routine 900 can determine one or more patient respiratory parameters using all or a subset of the changes in depths measured at block 903 and/or all or a subset of the adjusted changes in depths generated at block 905. For example, the routine 900 can determine a patient's tidal volume, minute volume, and/or respiratory rate, among others. In some embodiments, the routine 900 can calculate the patient's tidal volume using a subset of the changes in depths measured at block 903 and/or all or a subset of the adjusted changes in depths generated at block 905. In these and other embodiments, the routine 900 can calculate the patient's respiratory rate using all of the changes in depths measured at block 903 (including the recognized steep changes in depths) and none of the adjusted changes in depths generated at block 905 (as the recognized steep changes in depths can serve as clear markers in the time signal of the patient's respiration).
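For the respiratory-rate case, a minimal sketch is shown below: it counts sign changes of the mean-removed time signal of overall changes in depth, at two crossings per breath. The function name, frame rate, and synthetic signal are assumptions, and real signals would typically be filtered first.

```python
import numpy as np

def respiratory_rate_bpm(overall_changes, fps):
    """Estimate respiratory rate (breaths/min) from the time series of
    overall changes in depth by counting sign changes of the mean-removed
    signal: each breath produces two crossings."""
    s = overall_changes - overall_changes.mean()
    sign = np.signbit(s)
    crossings = np.count_nonzero(sign[1:] != sign[:-1])
    minutes = overall_changes.size / fps / 60.0
    return (crossings / 2.0) / minutes

# 30 s of synthetic signal at 15 fps, breathing at 12 breaths/min.
fps = 15
t = np.arange(0, 30, 1 / fps)
signal = np.cos(2 * np.pi * (12 / 60) * t)
print(round(respiratory_rate_bpm(signal, fps), 1))   # 12.0
```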
Although the steps of the routine 900 are discussed and illustrated in a particular order, the routine 900 is not so limited. In other embodiments, the routine 900 can perform the steps in a different order.
The above detailed descriptions of embodiments of the technology are not intended to be exhaustive or to limit the technology to the precise form disclosed above. Although specific embodiments of, and examples for, the technology are described above for illustrative purposes, various equivalent modifications are possible within the scope of the technology, as those skilled in the relevant art will recognize. For example, while steps are presented in a given order, alternative embodiments can perform steps in a different order. Furthermore, the various embodiments described herein can also be combined to provide further embodiments.
The systems and methods described herein can be provided in the form of tangible and non-transitory machine-readable medium or media (such as a hard disk drive, hardware memory, etc.) having instructions recorded thereon for execution by a processor or computer. The set of instructions can include various commands that instruct the computer or processor to perform specific operations such as the methods and processes of the various embodiments described here. The set of instructions can be in the form of a software program or application. The computer storage media can include volatile and non-volatile media, and removable and non-removable media, for storage of information such as computer-readable instructions, data structures, program modules or other data. The computer storage media can include, but are not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD, or other optical storage, magnetic disk storage, or any other hardware medium which can be used to store desired information and that can be accessed by components of the system. Components of the system can communicate with each other via wired or wireless communication. The components can be separate from each other, or various combinations of components can be integrated together into a monitor or processor or contained within a workstation with standard computer hardware (for example, processors, circuitry, logic circuits, memory, and the like). The system can include processing devices such as microprocessors, microcontrollers, integrated circuits, control units, storage media, and other hardware.
From the foregoing, it will be appreciated that specific embodiments of the technology have been described herein for purposes of illustration, but well-known structures and functions have not been shown or described in detail to avoid unnecessarily obscuring the description of the embodiments of the technology. To the extent any materials incorporated herein by reference conflict with the present disclosure, the present disclosure controls. Where the context permits, singular or plural terms can also include the plural or singular term, respectively. Moreover, unless the word “or” is expressly limited to mean only a single item exclusive from the other items in reference to a list of two or more items, then the use of “or” in such a list is to be interpreted as including (a) any single item in the list, (b) all of the items in the list, or (c) any combination of the items in the list. Additionally, the terms “comprising,” “including,” “having” and “with” are used throughout to mean including at least the recited feature(s) such that any greater number of the same feature and/or additional types of other features are not precluded. Furthermore, as used herein, the term “substantially” refers to the complete or nearly complete extent or degree of an action, characteristic, property, state, structure, item, or result. For example, an object that is “substantially” enclosed would mean that the object is either completely enclosed or nearly completely enclosed. The exact allowable degree of deviation from absolute completeness may in some cases depend on the specific context. However, generally speaking, the nearness of completion will be so as to have the same overall result as if absolute and total completion were obtained. The use of “substantially” is equally applicable when used in a negative connotation to refer to the complete or near complete lack of an action, characteristic, property, state, structure, item, or result.
From the foregoing, it will also be appreciated that various modifications can be made without deviating from the technology. For example, various components of the technology can be further divided into subcomponents, or various components and functions of the technology can be combined and/or integrated. Furthermore, although advantages associated with certain embodiments of the technology have been described in the context of those embodiments, other embodiments can also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the technology. Accordingly, the disclosure and associated technology can encompass other embodiments not expressly shown or described herein.
The present application is a continuation application of U.S. application Ser. No. 16/750,502 filed Jan. 23, 2020, entitled “Edge Handling Methods for Associated Depth Sensing Camera Devices, Systems, and Methods” which claims priority to U.S. Provisional Patent Application Ser. No. 62/797,538, filed Jan. 28, 2019, which is specifically incorporated by reference herein for all that it discloses or teaches.