The present technology is generally related to wearable patient monitoring systems.
Many conventional medical monitors require attachment of a sensor to a patient to detect physiologic signals from the patient and to transmit the detected signals through a cable to the monitor. These monitors process the received signals and determine vital signs such as the patient's pulse rate, respiration rate, and arterial oxygen saturation. For example, a pulse oximetry system can include a finger sensor with two light emitters and a photodetector. The sensor emits light into the patient's finger and transmits the detected light signal to a monitor. The monitor includes a processor that processes the signal, determines vital signs (e.g., pulse rate, respiration rate, arterial oxygen saturation), and displays the vital signs on a display.
Other monitoring systems include other types of monitors and sensors, such as electroencephalogram (EEG) sensors, blood pressure cuffs, temperature probes, air flow measurement devices (e.g., spirometer), and others. Some wireless, wearable sensors have been developed, such as wireless EEG patches and wireless pulse oximetry sensors.
Video-based monitoring is a field of patient monitoring that uses one or more remote video cameras to detect physical attributes of the patient. This type of monitoring can also be called “non-contact” monitoring in reference to the remote video sensor(s), which do not contact the patient. The remainder of this disclosure offers solutions and improvements in this field.
The techniques of this disclosure generally relate to a patient monitoring system including a video monitoring system and a patient sensor that is in contact with the patient, both the video monitoring system and the sensor communicating patient data to a wearable wireless hub configured to attach to, be worn by, or be carried by the patient. In some aspects, a relay is configured to receive the sensor data and to transmit the data to the wearable wireless hub.
In another aspect, the disclosure provides video-based patient monitoring, wherein at least one region of interest (ROI) of a patient is defined, and wherein at least one image capture device captures two or more images of the ROI. A processor calculates a change in depth of at least one portion of the ROI within the two or more images.
The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale. Instead, emphasis is placed on illustrating clearly the principles of the present disclosure. The drawings should not be taken to limit the disclosure to the specific embodiments depicted, but are for explanation and understanding only.
The following disclosure describes patient monitoring devices, systems, and associated methods for detecting and/or monitoring one or more patient parameters, such as tidal volume, respiratory rate, minute volume, patient movement, temperature, blood pressure, heart rate, arterial oxygen saturation, and/or others. As described in greater detail below, devices, systems, and/or methods configured in accordance with embodiments of the present technology are configured to capture one or more images (e.g., a video sequence) of a patient or a portion of a patient (e.g., a patient's torso) within a field of view of a non-contact detector (e.g., an image capture device). The devices, systems, and/or methods can measure changes in depths of regions (e.g., one or more pixels or groups of pixels) in the captured images over time. Based, at least in part, on these measurements, the devices, systems, and/or methods can determine various respiratory parameters of a patient, including tidal volume, minute volume, and respiratory rate, among others. In these and other embodiments, the devices, systems, and/or methods can analyze the respiratory parameters and can trigger alerts and/or alarms when the devices, systems, and/or methods detect one or more breathing abnormalities.
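As a rough illustration of the depth-based measurement just described, the sketch below integrates per-pixel depth changes over a torso ROI to form a volume signal and reads tidal volume off its peak-to-trough excursion. The function names, the single-frame baseline, and the constant per-pixel area are illustrative assumptions, not the claimed implementation (in practice the pixel area would be derived from camera intrinsics and the camera-to-patient distance).

```python
import numpy as np

def volume_signal(depth_frames, roi_mask, pixel_area_m2):
    """Integrate per-pixel depth changes over an ROI into a volume signal.

    depth_frames: (T, H, W) array of depth maps in meters.
    roi_mask: (H, W) boolean mask selecting the ROI (e.g., the torso).
    pixel_area_m2: approximate real-world area covered by one pixel.
    """
    baseline = depth_frames[0]
    # Depth decreases as the chest rises toward the camera, so negate
    # the change to get outward displacement.
    displacement = -(depth_frames - baseline)          # (T, H, W), meters
    # Sum displacement across ROI pixels, scale by pixel area -> m^3.
    return (displacement * roi_mask).sum(axis=(1, 2)) * pixel_area_m2

def tidal_volume_liters(vol_signal):
    """Estimate tidal volume as the peak-to-trough excursion of the signal."""
    return (vol_signal.max() - vol_signal.min()) * 1000.0  # m^3 -> liters
```

A 5 mm sinusoidal chest excursion over a 100x100-pixel ROI at 1 mm^2 per pixel, for example, would yield a tidal volume on the order of 0.1 L under this sketch.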
Additionally or alternatively, devices, systems, and/or methods configured in accordance with embodiments of the present technology can include one or more sensors or probes associated with (e.g., contacting) a patient that can be configured to capture data (e.g., temperature, blood pressure, heart rate, arterial oxygen saturation, etc.) related to a patient. The devices, systems, and/or methods can transmit the captured data to a hub and/or central unit. In some embodiments, the devices, systems, and/or methods can analyze the captured data to determine and/or monitor one or more patient parameters. In these and other embodiments, the devices, systems, and/or methods can use the data captured by the one or more sensors or probes in conjunction with data captured using a non-contact detector. In these and still other embodiments, the devices, systems, and/or methods can trigger alerts and/or alarms when the devices, systems, and/or methods detect one or more patient parameter abnormalities.
Specific details of several embodiments of the present technology are described herein with reference to
In the embodiments illustrated in
Information captured by the one or more sensors 140 can be stored and/or processed by the one or more sensors 140 and/or by the wireless hub 130. For example, the one or more sensors 140 can store captured information and/or can locally process the captured information (e.g., to determine one or more patient parameters). In these and other embodiments, the one or more sensors 140 can transmit the raw, captured information and/or the locally processed data to the wireless hub 130. For example, the one or more sensors 140 can include a wireless transmitter (not shown) to transfer the data directly to the wireless hub 130 via a wired or wireless connection (not shown). In turn, the wireless hub 130 can store and/or process the information received from the one or more sensors 140 (e.g., to determine one or more patient parameters). In these and other embodiments, the wireless hub 130 can transfer the raw, captured information and/or processed data to a central unit (not shown), such as a central hospital station, via a wired or wireless connection (not shown).
Additionally or alternatively, as shown in
The image capture device 214 can capture a sequence of images over time. The image capture device 214 can be a depth sensing camera, such as a Kinect camera from Microsoft Corp. (Redmond, Wash.). A depth sensing camera can detect a distance between the camera and objects within its field of view. Such information can be used, as disclosed herein, to determine that a patient 112 is within the FOV 216 of the image capture device 214 and/or to determine one or more ROI's to monitor on the patient 112. Once an ROI is identified, the ROI can be monitored over time, and the changes in depths of regions (e.g., pixels) within the ROI can represent movements of the patient 112 (e.g., associated with breathing). As described in greater detail in U.S. patent application Ser. No. 16/219,360 and U.S. Provisional Patent Application Ser. No. 62/779,964, those movements, or changes of regions within the ROI, can be used to determine various patient parameters, such as various breathing parameters, including tidal volume, minute volume, respiratory rate, etc. Those movements, or changes of regions within the ROI, can also be used to detect various patient parameter abnormalities, as discussed in greater detail in U.S. Provisional Patent Application Ser. Nos. 62/716,724 and 62/779,964. The various patient parameter abnormalities can include, for example, apnea, rapid breathing (tachypnea), slow breathing, intermittent or irregular breathing, shallow breathing, obstructed and/or impaired breathing, and others. The entire disclosures of U.S. patent application Ser. No. 16/219,360 and U.S. Provisional Patent Application Ser. Nos. 62/716,724 and 62/779,964 are incorporated herein by reference.
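The depth changes described above can also yield a respiratory rate. The following sketch is one simple, hypothetical way to do so (it is not drawn from the incorporated applications): track the mean ROI depth per frame, remove the DC offset, and count rising zero crossings, each of which corresponds to one breath cycle.

```python
import numpy as np

def respiratory_rate_bpm(depth_means, fps):
    """Estimate respiratory rate from a mean-ROI-depth time series.

    depth_means: 1-D array, mean depth of the ROI per frame (meters).
    fps: frames per second of the capture.
    """
    x = depth_means - depth_means.mean()   # remove DC offset
    # Count rising zero crossings; each marks one breath cycle.
    crossings = np.sum((x[:-1] < 0) & (x[1:] >= 0))
    duration_min = len(depth_means) / fps / 60.0
    return crossings / duration_min
```

A real system would likely band-pass filter the signal first to reject motion artifacts before counting crossings.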
In these and other embodiments, the image capture device can be an RGB (red green blue) camera or an infrared camera. An RGB camera can detect slight color changes within its field of view. Such information can be used, as disclosed herein, to determine that a patient 112 is within the FOV 216 of the image capture device 214 and/or to determine one or more ROI's to monitor on the patient 112. Once an ROI is identified, the ROI can be monitored over time, and the changes in color of regions (e.g., pixels) within the ROI can represent various information related to the patient 112. As described in greater detail in U.S. patent application Ser. No. 16/188,969, those color changes can be used to detect optical signals associated with one or more medical devices, such as a pulse oximeter attached to the patient. Those color changes can also be used to determine and/or monitor various vital signs of the patient, including pulse rate, respiration rate, and arterial oxygen saturation, as discussed in greater detail in U.S. patent application Ser. Nos. 15/432,057 and 15/432,063. Additionally, or alternatively, as discussed in greater detail in U.S. Provisional Patent Application Ser. Nos. 62/685,485 and 62/695,244, those color changes can also be used in a surgical setting to monitor and/or assess blood flow in the ROI by detecting occlusions and/or monitoring pulsation, pulsation strength, and/or perfusion. The entire disclosures of U.S. patent application Ser. Nos. 16/188,969, 15/432,057, and 15/432,063 and U.S. Provisional Patent Application Ser. Nos. 62/685,485 and 62/695,244 are incorporated herein by reference.
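One common way color-based pulse-rate estimation of this kind is sketched in the literature — offered here purely as an illustrative assumption, not the method of the incorporated applications — is to take the mean green-channel intensity of a skin ROI per frame and locate the dominant spectral peak within a plausible cardiac band:

```python
import numpy as np

def pulse_rate_bpm(green_means, fps):
    """Estimate pulse rate from mean green-channel values of a skin ROI.

    green_means: 1-D array, mean green intensity per frame.
    fps: frames per second.
    """
    x = green_means - np.mean(green_means)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    # Restrict to a plausible cardiac band (40-240 beats per minute).
    band = (freqs >= 40 / 60.0) & (freqs <= 240 / 60.0)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0
```

The band restriction keeps respiration and slow lighting drift from being mistaken for the cardiac peak.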
In some embodiments, the system 200 can receive user input to identify a starting point for defining a ROI. For example, an image can be reproduced on a display 222 of the system 200 (or on the display 134 of the hub 130), allowing a user of the system 200 to select a patient 112 for monitoring (which can be helpful where multiple objects are within the FOV 216 of the image capture device 214) and/or allowing the user to select a point on the patient 112 from which a ROI can be determined (such as the point 203 on the chest of the patient 112). In other embodiments, other methods for identifying a patient 112, for identifying points on the patient 112, and/or for defining one or more ROI's can be used. For example, a user can select a patient 112 for monitoring and a point on a patient bed 208 (which can be helpful in defining one or more ranges of depths to be used in measurements taken by a non-contact detector).
The images detected by the image capture device 214 can be sent to the computing device 215 through a wired or wireless connection 220. The computing device 215 can include a processor 218 (e.g., a microprocessor), the display 222, and/or hardware memory 226 for storing software and computer instructions. Sequential image frames of the patient 112 are recorded by the image capture device 214 and sent to the processor 218 for analysis. The display 222 can be remote from the image capture device 214, such as a video screen positioned separately from the processor 218 and the memory 226. Other embodiments of the computing device 215 can have different, fewer, or additional components than shown in
Referring to
As shown in
In some embodiments, the image capture device 214 can be configured to capture data when the wireless hub 130 is placed in the dock 335. Additionally or alternatively, the image capture device 214 can be configured to capture data when the image capture device 214 recognizes that a patient 112 is in the patient bed 208 and/or that the patient 112 corresponds with the patient 112 who is registered with the wireless hub 130. In these and other embodiments, the image capture device 214 can be configured to store and/or transmit data captured and/or processed to the wireless hub 130 (via a wired or wireless connection 357) and/or to a central unit (not shown), such as a central hospital station (not shown) via a wireless connection (not shown). In these and other embodiments, the wireless hub 130, the one or more sensors 140, and/or the relay 142 can be configured to send data to the image capture device 214 via various wired and/or wireless connections (not shown). In these and still other embodiments, data captured and/or processed can be processed at the image capture device 214, at the wireless hub 130, in the cloud, at the central station, and/or at other locations.
When the patient 112 returns to the patient bed 208 (block 304), the patient 112 can continue to wear the wireless hub 130 and/or can place the wireless hub 130 in the dock 335 (block 308). While the patient 112 is in the patient bed 208, the one or more sensors or probes 104 can continue to capture the first data (block 302) and/or send it to the relay 142, to the wireless hub 130, and/or to the central unit (block 303). As the patient 112 moves within the field of view 216 of the image capture device 214 (block 304) and/or as the patient 112 places the wireless hub 130 in the dock 335 (block 308), the routine 300, using the image capture device 214, (i) can identify the patient 112 (block 309), (ii) determine whether the patient 112 matches the patient 112 registered to the wireless hub 130 (block 310), and/or (iii) begin capturing second data (block 311). If the routine 300 determines at block 310 that the patient 112 identified at block 309 does not match the patient registered to the wireless hub 130 (block 301), the routine 300 can trigger an alert/alarm (block 307). On the other hand, if the routine determines at block 310 that the identified patient 112 (block 309) matches the patient 112 registered to the wireless hub 130 (block 301), the routine 300 can capture the second data (block 311) and send the second data to the wireless hub 130 and/or to the central unit (block 312).
Based at least in part on the first data and/or on the second data, the routine 300 can determine one or more physiological parameters of the patient 112 (block 313). If the routine 300 detects a patient parameter abnormality in the physiological parameters (block 314), the routine 300 can trigger an alert/alarm (block 307) to indicate that the patient 112 needs aid. In some cases, the routine 300 can detect a patient abnormality (block 314) when either the first data or the second data is corrupted or not available (e.g., when one of the sensors 140 attached to the patient 112 falls off the patient 112 while the patient 112 is in the patient bed 208). In these instances, the routine 300 can determine whether the patient 112 truly needs aid and/or whether the data is available from one of the other sources of information (block 315 and/or 316), such as from the image capture device 214. If the patient 112 does not need assistance and/or if the data is available from another source of information, the routine 300 can cancel or mitigate (e.g., downgrade) the alert/alarm at blocks 315 and 316, respectively. In this manner, the systems 392 and 394 (
Referring to
The computing device 410 can communicate with other devices, such as the server 425 and/or the image capture device(s) 485 via (e.g., wired or wireless) connections 470 and/or 480, respectively. For example, the computing device 410 can send to the server 425 information determined about a patient from images and/or other data captured by the image capture device(s) 485 and/or one or more other sensors or probes. The computing device 410 can be the computing device 215 of
In some embodiments, the image capture device(s) 485 are remote sensing device(s), such as depth sensing video camera(s), as described above with respect to
The server 425 includes a processor 435 that is coupled to a memory 430. The processor 435 can store and recall data and applications in the memory 430. The processor 435 is also coupled to a transceiver 440. In some embodiments, the processor 435, and subsequently the server 425, can communicate with other devices, such as the computing device 410 through the connection 470.
The devices shown in the illustrative embodiment can be utilized in various ways. For example, either of the connections 470 and 480 can be varied. Either of the connections 470 and 480 can be a hard-wired connection. A hard-wired connection can involve connecting the devices through a USB (universal serial bus) port, serial port, parallel port, or other type of wired connection that can facilitate the transfer of data and information between a processor of a device and a second processor of a second device. In another embodiment, either of the connections 470 and 480 can be a dock where one device can plug into another device. In other embodiments, either of the connections 470 and 480 can be a wireless connection. These connections can take the form of any sort of wireless connection, including, but not limited to, Bluetooth connectivity, Wi-Fi connectivity, infrared, visible light, radio frequency (RF) signals, or other wireless protocols/methods. Other possible modes of wireless communication can include near-field communications, such as passive radio-frequency identification (RFID) and active RFID technologies. RFID and similar near-field communications can allow the various devices to communicate in short range when they are placed proximate to one another. In some embodiments, two or more devices in the patient monitoring system 400 can together create a dynamic mesh network that includes connections 470 and/or 480. In these and other embodiments, data captured by and/or received at one device of the system 400 may be sent to and/or through other devices of the system 400 (e.g., to reach the server(s)), thereby improving wireless coverage. In these and still other embodiments, the various devices can connect through an internet (or other network) connection.
That is, either of the connections 470 and 480 can represent several different computing devices and network components that allow the various devices to communicate through the internet, either through a hard-wired or wireless connection. Either of the connections 470 and 480 can also be a combination of several modes of connection.
The configuration of the devices in
The routine 510 can begin at block 511 by determining whether a wearable patient monitoring system is registered to and/or worn by a patient. In some embodiments, the routine 510 can determine whether a wearable patient monitoring system is registered to a patient by determining whether a wearable wireless hub is registered to a patient and/or whether the wireless hub is receiving data from one or more sensors or probes associated with (e.g., contacting) the patient. In these and other embodiments, the routine 510 can determine whether a wearable patient monitoring system is worn by the patient (i) by determining whether the wearable patient monitoring system is receiving data from one or more sensors or probes associated with the patient, (ii) by detecting motion data associated with the wireless hub (e.g., when the wireless hub is not docked), and/or (iii) by identifying the patient and/or the wireless hub within a field of view (FOV) of an image capture device. In some embodiments, the wireless hub can be configured to emit specific IR signals that the routine 510 can detect using an image capture device to locate the wireless hub within the field of view of the image capture device.
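The motion-based check in (ii) above could be realized with a heuristic as simple as the following sketch, in which the function name and threshold are purely hypothetical: a docked hub shows almost no variation in accelerometer magnitude, while a worn hub does.

```python
import numpy as np

def hub_is_worn(accel_magnitudes, threshold=0.05):
    """Heuristic: the hub is likely worn (not docked) if its accelerometer
    magnitude varies by more than a small threshold over a recent window.

    accel_magnitudes: 1-D sequence of acceleration magnitudes (g units).
    threshold: variation (standard deviation) above which motion is assumed.
    """
    return float(np.std(accel_magnitudes)) > threshold
```

A production system would likely combine such a motion check with the sensor-data and camera-based checks rather than rely on it alone.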
If the routine 510 determines that a wearable patient monitoring system is not registered and/or worn by a patient (e.g., that a wearable wireless hub of the wearable patient monitoring system is not receiving data from one or more sensors or probes associated with or contacting the patient), the routine 510 can proceed to block 518 to trigger one or more alerts and/or alarms. For example, the routine 510 can trigger an error message or sound to alert a user that a wearable patient monitoring system is not (e.g., properly) registered to a patient. In these and other embodiments, the routine 510 can trigger an alert and/or alarm to remind a user to wear the wearable patient monitoring system (e.g., to wear the wireless hub) and/or to notify a user that a patient is out of their hospital bed without their wireless hub. On the other hand, if the routine 510 determines that a wearable patient monitoring system is registered and/or worn by a patient, the routine 510 can proceed to block 512.
At block 512, the routine 510 can capture first patient data from a sensor in contact with the patient. In some embodiments, the routine 510 can capture patient data using one or more sensors or probes associated with (e.g., contacting) the patient. For example, the routine 510 can use one or more sensors or probes to capture an ECG signal, an EEG signal, a temperature signal, a heart rate signal, a respiratory rate signal, an average temperature signal, and/or one or more other physiological signals relating to the patient. In these and other embodiments, the routine 510 can capture patient data using one or more image capture devices. For example, the routine 510 can use an image capture device to detect a location of the patient, movement of the patient, whether the patient is awake or sleeping, and/or other data relating to the patient. In these and still other embodiments, the routine 510 can capture patient data by receiving data at a wearable wireless hub, at a central unit (e.g., a central hospital station), and/or at another location from the one or more sensors or probes, image capture device(s), and/or relay(s).
At block 513, the routine 510 can capture second patient data from the video system. For example, the routine 510 can recognize a patient within a field of view (FOV) of one or more image capture devices and/or define one or more regions of interest (ROI's) on the patient. In some embodiments, the routine 510 can recognize the patient by identifying the patient using facial recognition hardware and/or software of the image capture device(s). In these embodiments, the routine 510 can display the name of the patient on a display screen once the routine 510 has identified the patient. In some embodiments, the routine 510 can determine whether the recognized patient matches the patient registered to the wearable patient monitoring system at block 511, and/or can trigger one or more alerts and/or alarms if the patients do not match. In these and other embodiments, the routine 510 can recognize a patient within the FOV of the image capture device by determining a skeleton outline of the patient and/or by recognizing one or more characteristic features (e.g., a torso of a patient). In these and still other embodiments, the routine 510 can define one or more ROI's on the patient in accordance with the discussion above with respect to
At block 514, the routine 510 can capture two or more images of one or more ROI's. In some embodiments, the routine 510 can capture the two or more images of the one or more ROI's by capturing a video sequence of the one or more ROI's. In these and other embodiments, the routine 510 can capture the two or more images of the one or more ROI's by capturing separate still images of the one or more ROI's. The routine 510 can capture the still images at a rate faster than the patient's respiration rate (i.e., with an interval between images shorter than one period of the respiration cycle) to ensure that two or more still images fall within each period of the patient's respiration cycle.
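The frame-rate requirement just described can be expressed as a small helper; this is an illustrative sketch (the function and its default are assumptions, not part of the disclosure), reflecting the sampling-theory point that at least two samples per cycle are needed to resolve the respiration period:

```python
def min_capture_fps(expected_resp_rate_bpm, samples_per_cycle=2):
    """Minimum frame rate so that at least `samples_per_cycle` images fall
    within each respiration period (Nyquist requires >= 2).

    expected_resp_rate_bpm: anticipated respiratory rate, breaths per minute.
    """
    period_s = 60.0 / expected_resp_rate_bpm   # seconds per breath
    return samples_per_cycle / period_s        # frames per second
```

For a typical resting rate of 12 breaths per minute, two samples per 5-second cycle require only 0.4 fps, so ordinary video frame rates comfortably oversample the respiration signal.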
At block 515, the routine 510 can determine one or more patient parameters from the captured ROI's. In some embodiments, the routine 510 can determine one or more patient parameters using data captured at any of the blocks 512-514. For example, the routine 510 can determine a patient's respiratory rate, tidal volume, motion activity, temperature, blood pressure, arterial oxygen saturation, and/or other patient parameters using the captured data. In these and other embodiments, the routine 510 can determine the one or more patient parameters at the one or more sensors associated with (e.g., contacting) the patient, at a relay and/or wireless hub registered to the patient, at the one or more image capture devices, at a central unit (e.g., at a central hospital station), in the cloud, and/or at one or more other locations.
At block 516, the routine 510 can determine whether a patient parameter abnormality is detected based at least in part on data captured at any of the blocks 512-514 and/or on one or more patient parameters determined at block 515. For example, the routine 510 can detect that a patient is not breathing, is experiencing paradoxical breathing, is straining to breathe, and/or is experiencing other breathing or patient parameter abnormalities (e.g., central apnea). In some embodiments, the routine 510 can use data captured with the one or more sensors or probes associated with (e.g., contacting) the patient in conjunction with data captured with the one or more image capture devices to determine whether the patient is experiencing a parameter abnormality. For example, the routine 510 can use data captured with an image capture device to mitigate a potential alert and/or alarm that would otherwise be triggered from a degraded signal captured with the one or more sensors or probes, or vice versa. In these embodiments, when one or more sensors or probes associated with the patient fall off the patient (e.g., while the patient is sleeping), the routine 510 can use data captured with the one or more image capture devices to monitor the patient. If the routine 510 does not detect a parameter abnormality using the data from the one or more image capture devices, the routine 510 can mitigate (e.g., not trigger or downgrade) an alert and/or alarm that otherwise would be triggered when the one or more sensors or probes fall off the patient (e.g., so as not to waken the patient).
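One way the cross-source mitigation described above could be organized is as a small decision table over the two data sources. The function and its return labels below are illustrative assumptions for exposition, not the claimed alarm logic:

```python
def alarm_decision(contact_ok, contact_abnormal, camera_ok, camera_abnormal):
    """Decide whether to raise, downgrade, or suppress an alarm when two
    independent sources (contact sensor and camera) are available.

    *_ok: whether the source is producing a usable signal.
    *_abnormal: whether that source indicates a parameter abnormality.
    Returns 'alarm', 'notify' (downgraded), or 'none'.
    """
    if contact_ok and contact_abnormal:
        # Corroborate with the camera before escalating, if available.
        if camera_ok and not camera_abnormal:
            return "notify"          # downgrade: the sources disagree
        return "alarm"
    if not contact_ok:
        # Sensor signal lost (e.g., a probe fell off): fall back to camera.
        if camera_ok:
            return "alarm" if camera_abnormal else "notify"
        return "alarm"               # no usable data from either source
    return "none"
```

Under this sketch, a probe falling off a sleeping patient produces only a downgraded notification while the camera still shows normal breathing, matching the behavior described in the paragraph above.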
As another example, the routine 510 can detect and monitor body motion using one or more image capture devices in conjunction with the one or more sensors or probes. In these embodiments, the routine 510 can detect when a patient is experiencing a seizure (e.g., characterized by significant muscle strain and/or regular jerking motion). In these and other embodiments, the routine can detect neck muscle activation using data captured with the one or more image capture devices in conjunction with an SpO2 signal captured with the one or more sensors or probes to detect a patient's increased effort to breathe and/or to detect other obstructive or difficulty-of-breathing events. If the routine 510 detects a patient parameter abnormality, the routine 510 can proceed to block 518.
At block 517, the routine 510 can trigger one or more alerts and/or alarms. In some embodiments, the routine 510 can trigger one or more alerts and/or alarms at a wireless hub and/or at other components (e.g., a computer, an image capture device, a relay, a central unit, etc.) of a patient monitoring system. In these and other embodiments, the routine 510 can use a data buffer (e.g., a video buffer) such that when the routine 510 triggers an alert and/or alarm, the routine 510 can recall data in the buffer and/or save the data (e.g., in one or more logs for subsequent analysis). In these and still other embodiments, the routine 510 can associate a date and/or time stamp with the recalled and/or saved data to facilitate subsequent analysis and review of the data.
Although the steps of the routine 510 are discussed and illustrated in a particular order, the routine 510 illustrated in
The above detailed descriptions of embodiments of the technology are not intended to be exhaustive or to limit the technology to the precise form disclosed above. Although specific embodiments of, and examples for, the technology are described above for illustrative purposes, various equivalent modifications are possible within the scope of the technology, as those skilled in the relevant art will recognize. For example, while steps are presented in a given order, alternative embodiments can perform steps in a different order. Furthermore, the various embodiments described herein can also be combined to provide further embodiments.
The systems and methods described herein can be provided in the form of a tangible and non-transitory machine-readable medium or media (such as a hard disk drive, hardware memory, etc.) having instructions recorded thereon for execution by a processor or computer. The set of instructions can include various commands that instruct the computer or processor to perform specific operations such as the methods and processes of the various embodiments described herein. The set of instructions can be in the form of a software program or application. The computer storage media can include volatile and non-volatile media, and removable and non-removable media, for storage of information such as computer-readable instructions, data structures, program modules, or other data. The computer storage media can include, but are not limited to, RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD, or other optical storage, magnetic disk storage, or any other hardware medium which can be used to store desired information and that can be accessed by components of the system. Components of the system can communicate with each other via wired or wireless communication. The components can be separate from each other, or various combinations of components can be integrated together into a monitor or processor or contained within a workstation with standard computer hardware (for example, processors, circuitry, logic circuits, memory, and the like). The system can include processing devices such as microprocessors, microcontrollers, integrated circuits, control units, storage media, and other hardware.
From the foregoing, it will be appreciated that specific embodiments of the technology have been described herein for purposes of illustration, but well-known structures and functions have not been shown or described in detail to avoid unnecessarily obscuring the description of the embodiments of the technology. To the extent any materials incorporated herein by reference conflict with the present disclosure, the present disclosure controls. Where the context permits, singular or plural terms can also include the plural or singular term, respectively. Moreover, unless the word “or” is expressly limited to mean only a single item exclusive from the other items in reference to a list of two or more items, then the use of “or” in such a list is to be interpreted as including (a) any single item in the list, (b) all of the items in the list, or (c) any combination of the items in the list. Additionally, the terms “comprising,” “including,” “having” and “with” are used throughout to mean including at least the recited feature(s) such that any greater number of the same feature and/or additional types of other features are not precluded.
From the foregoing, it will also be appreciated that various modifications can be made without deviating from the technology. For example, various components of the technology can be further divided into subcomponents, or various components and functions of the technology can be combined and/or integrated. Furthermore, although advantages associated with certain embodiments of the technology have been described in the context of those embodiments, other embodiments can also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the technology. Accordingly, the disclosure and associated technology can encompass other embodiments not expressly shown or described herein.
This application claims priority to U.S. Provisional Patent Application Ser. No. 62/797,519, filed Jan. 28, 2019, the entire contents of which are incorporated herein by reference.