The present technology generally relates to a vision-based patient stimulus monitoring and response system that uses visual images or image streams to detect stimulation of a patient by a caregiver and to detect a corresponding response from the patient.
For various patients, for example neonate patients, it can be difficult to ascertain whether that patient is in, or is about to enter, an impaired state indicative of disease, such as sepsis, among others.
Accordingly, there is a need in the art for improved techniques for determining whether a patient, neonate or otherwise, is in or is about to enter such an impaired state.
The techniques of this disclosure generally relate to vision-based patient stimulus monitoring and response systems and methods, wherein patient stimulation is detected via one or more images or image streams.
In one aspect, the present disclosure provides systems and methods that produce a localized measure of patient response to stimuli from which to accurately monitor the patient's autonomic nervous system (“ANS”) state and determine whether the patient is in, or is about to enter, an impaired state (e.g., sepsis). Exemplary embodiments are particularly advantageous with regard to neonates, where such determinations may otherwise be more difficult. A lack of response, or a poor response, may indicate that the patient is in an impaired state, whereas an energetic response may indicate a healthier state.
In exemplary embodiments, systems and methods for determining an autonomic nervous system (ANS) state are provided, including: a camera configured to provide an image; and a processor, including memory configured to store data; wherein the camera is configured to view an image of a patient, and wherein the camera and the processor are configured to provide image recognition to determine: identification of the patient; identification of external stimulation of the patient; and identification of cessation of stimulation of the patient; and further, wherein the processor is configured to use a parallel assessment of at least one physiological signal of the patient, in time with the image-recognized stimulation of the patient, to assess a response to the stimulation of the patient in order to provide a measure of an autonomic nervous system (ANS) state of the patient.
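The parallel assessment described above can be illustrated with a minimal sketch: image-recognized start and cessation times are paired with a physiological signal, and the response during stimulation is compared against the pre-stimulation baseline. All names here (`StimulationEvent`, `assess_response`) are hypothetical and not from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class StimulationEvent:
    start_s: float  # time stimulation was first recognized in the image stream
    end_s: float    # time cessation of stimulation was recognized

def assess_response(event, signal):
    """Compare the physiological signal during stimulation with the
    pre-stimulation baseline; signal is a list of (time_s, value) samples.
    Returns the peak excursion above baseline, or None if data is missing."""
    baseline = [v for t, v in signal if t < event.start_s]
    during = [v for t, v in signal if event.start_s <= t <= event.end_s]
    if not baseline or not during:
        return None  # not enough parallel data to assess a response
    # A larger excursion suggests a more energetic (healthier) response.
    return max(during) - sum(baseline) / len(baseline)
```

For example, a heart rate steady at 120 bpm that rises to 130 bpm during a stimulation interval would yield a response measure of 10.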
In further exemplary aspects, the disclosure provides a detector that is associated with a camera image or an image stream and detects stimulation of the patient by a caregiver. Once stimulation of the patient is detected, the patient response is measured, either via a physiological signal from a patient sensor or via the image.
In exemplary aspects related to measuring response via a patient sensor, such a physiological signal may be, for example, a heart rate signal from, e.g., a pulse oximeter probe, an electrocardiogram (ECG) sensor, a blood pressure device, an accelerometer, a phonocardiogram, a device that records a cardiac waveform from which a heart rate or pulse rate can be derived, or any patient sensor providing a signal indicative of a response to stimulation.
In exemplary aspects related to measuring response via the image (or image stream), patient response parameters may be collected and analyzed, as will be described in more detail below.
In additional exemplary aspects, the system and method described herein includes a processor that provides analysis, alarms and/or metrics.
Accordingly, such exemplary aspects provide systems and methods that determine a measure of response to stimulation of the patient that is directly related to observed stimulation.
The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
As noted above, the present disclosure describes vision-based patient stimulus monitoring and response systems and methods, wherein patient stimulation is detected via one or more images or image streams. Accordingly, such systems and methods determine a measure of response to stimulation of the patient that is directly related to observed stimulation.
In exemplary embodiments, a camera is configured to monitor a patient (e.g., a neonate), with an associated detector to identify whether a patient is being stimulated, e.g., by the hand of a caregiver. Identification of the stimulus, by the detector, triggers analysis of the response in order to provide a measure directly related to the observed stimulation.
For convenience, various examples such as those referred to with regard to
In one aspect, the present disclosure provides a system for vision-based patient stimulus monitoring using a camera and a processor to detect patient stimulus and to measure a response thereto relative to the patient's autonomic nervous system (ANS) to further determine whether the patient is in or is about to enter an impaired state indicative of disease. Referring to
In exemplary embodiments, the processor 104 is also configured to provide such calculations over time. The processor is optionally also configured to provide control functions for the camera(s). Data from the camera(s) and/or from processor calculations is optionally stored in memory/storage 106. Additionally, data from the camera(s) and/or from processor calculations is optionally transmitted for display on the display device 108. The processor is also optionally connected to alarm 110, which may activate when defined limits are exceeded.
In exemplary embodiments, the processor/detector 104 (
In exemplary embodiments, such detection (e.g., of hands and/or feet of a neonate) is performed in accordance with deep learning routines, such as a YoLo deep learning algorithm, as described in commonly owned U.S. patent application Ser. No. 16/854,059, filed Apr. 21, 2020, the entire contents of which are expressly incorporated by reference herein.
Artificial intelligence models of varying types may be implemented to recognize objects within images or image streams, e.g., to place a box around recognized objects (multi-object classifiers), including YoLo, among others. For example, a classifier may be trained to detect hands using transfer learning. Additionally, a classifier may be trained (from a base model or from scratch) to recognize other objects, e.g., a caregiver's hands.
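Downstream of such a multi-object classifier, the detector's output can be filtered to the boxes of interest. The following sketch is illustrative only: the label string `"caregiver_hand"`, the tuple layout, and the confidence threshold are assumptions, not part of the disclosure or of any particular detector's API.

```python
def caregiver_hand_boxes(detections, min_conf=0.5):
    """Keep only confident caregiver-hand boxes from a detector's output.

    detections: list of (label, confidence, (x1, y1, x2, y2)) tuples,
    as might be produced by a YoLo-style multi-object classifier.
    """
    return [box for label, conf, box in detections
            if label == "caregiver_hand" and conf >= min_conf]
```

A detection kept by this filter could then serve as the trigger for the response analysis described above.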
With reference to
The camera system 502 illustrates a detected start of stimulation at 506, an end of stimulation at 508, and a flag (A) set during stimulation at 510. Parallel to such detection, pulse oximeter 504 provides a signal indicative of heart rate (an exemplary physiological signal) over time, with a measured heart rate change 512 resulting from stimulation. We note that the measured heart rate change in this exemplary embodiment is measured as a difference from the start of stimulation 506 through the end of stimulation 508, though different portions, metrics or adjustments may be taken into account in the calculation of such difference/change 512.
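The difference/change 512 described above can be sketched as follows; the sampling scheme (nearest sample to each timestamp) is an illustrative assumption, not the only way the disclosure permits the difference to be taken.

```python
def heart_rate_change(hr_samples, start_s, end_s):
    """Change in heart rate from the start of stimulation through its end.

    hr_samples: list of (time_s, bpm) pairs from, e.g., a pulse oximeter;
    the samples nearest in time to each boundary are used.
    """
    def nearest(t):
        return min(hr_samples, key=lambda s: abs(s[0] - t))[1]
    return nearest(end_s) - nearest(start_s)
```

For instance, with samples of 122 bpm at the detected start and 135 bpm at the detected end, the measured change would be 13 bpm.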
As is noted above, in addition to detection of stimulation of a patient, parallel measurement of physiological signal(s) may be provided, including but not limited to heart rate, heart rate variability, respiration rate, blood pressure, oxygen saturation, an indication of perfusion, skin conductivity, etc.
It should be noted that the system and method may identify when a caregiver touches the neonate, taking into account, for example, the duration and location of physical contact (which may be usable to set an exemplary start of stimulation). For example, a short, one-second touch may not qualify as a start of stimulation, whereas a longer-duration touch may qualify, depending on pre-set or otherwise defined rules. In such a way, a flag (or other indicator) is not automatically set by a brief touch, but is instead set only after a set or otherwise defined period of initial contact has been detected.
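The contact-duration rule above can be sketched as a simple filter over detected contact intervals; the three-second threshold is purely illustrative of a "pre-set or otherwise defined" rule.

```python
def stimulation_start(contact_intervals, min_contact_s=3.0):
    """Return the start time of the first contact long enough to qualify
    as a start of stimulation, or None if no contact satisfies the rule.

    contact_intervals: list of (start_s, end_s) caregiver-contact intervals,
    in chronological order, as detected from the image stream.
    """
    for start_s, end_s in contact_intervals:
        if end_s - start_s >= min_contact_s:
            return start_s  # brief touches before this are ignored
    return None
```

With this rule, a one-second touch sets no flag, while a later five-second touch sets the start of stimulation at its beginning.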
It should be noted that when looking at heart rate change, particularly with regard to setting the start of such change 610, such as in
It should also be noted that such a parallel measurement need not be discrete and may be integrated into a running assessment of stimulation/response over time, either aggregated over all identified stimuli, or via subsets of identified stimuli of interest particular to ANS or other desired categories.
In exemplary embodiments, neonate (or other patient) stimulus can be assessed via the camera 206 and/or via additional cameras.
With reference to
Referring still to
In such a way, the present disclosure provides a system and method for determining a measure of response to stimulation of a neonate (or other patient) directly related to the observed stimulation (with adjustable parameters). This permits localization of the measure in time and provides definable stimulation parameters. Additionally, confounding factors, e.g., loud noises, a neonate simply waking up independently of a caregiver, apneic episodes, the neonate changing position in bed, etc., can be excluded.
It should be noted that an ANS heart rate measure may be one or more of, without exclusion: variability of heart rate; the change in heart rate due to stimulus; the time to return to baseline heart rate after stimulus; the rate of return to baseline heart rate after stimulus (delta HR/time); the rate of increase from a baseline heart rate at the start of stimulation; etc.
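Two of the listed measures, time to return to baseline and rate of return (delta HR/time), can be sketched together; the tolerance band around baseline is an illustrative assumption.

```python
def return_to_baseline(hr_samples, end_of_stimulus_s, baseline_bpm, tol=2.0):
    """Time to return to baseline after stimulus, and rate of return.

    hr_samples: (time_s, bpm) pairs; baseline_bpm: pre-stimulus baseline.
    Returns (time_to_baseline_s, rate_bpm_per_s), or (None, None) if the
    heart rate never re-enters the baseline band in the recorded samples.
    """
    post = [(t, v) for t, v in hr_samples if t >= end_of_stimulus_s]
    if not post:
        return None, None
    start_t, start_v = post[0]
    for t, v in post:
        if abs(v - baseline_bpm) <= tol and t > start_t:
            elapsed = t - start_t
            return elapsed, (start_v - v) / elapsed  # delta HR / time
    return None, None
```

For example, a heart rate of 135 bpm at the end of stimulus that settles back near a 120 bpm baseline after 10 seconds yields a return rate of 1.4 bpm/s.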
In further exemplary embodiments, the length of intervention may be used to normalize the return to baseline. For example, a long intervention may cause more of a significant change in heart rate, with normalization of the ANS measure being performed accordingly.
It should also be noted that HRV measurements during stimulus may also be compared to patient (again, e.g., neonatal) HRV in the non-stimulated state for context or qualification. Further, in exemplary embodiments, the period of time of activity after the end of stimulus may also provide context or measure of neonatal ANS.
We refer again to commonly owned U.S. Patent Application Ser. No. 63/145,403, filed Mar. 2, 2021, which describes techniques for Non-Contact Determination of Heart Rate Using Dot Projection Pattern Signals, the contents of which are specifically incorporated herein by reference. As such, in exemplary embodiments, a parallel measure of response may be correlated with the determined stimulus.
Additionally, in exemplary embodiments, a direct measure of activity from a neonate can be measured from the video stream, for example by following fiducial points on the neonate, or by measuring gross changes in pixel values within a region, for example. The degree of activity may be used as an ANS measure, e.g., as depicted in
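The gross-pixel-change variant of the activity measure above can be sketched as a per-region mean absolute frame difference; representing frames as 2-D lists of grayscale values is an illustrative simplification.

```python
def region_activity(prev_frame, curr_frame, region):
    """Mean absolute per-pixel change within a region between two frames.

    prev_frame, curr_frame: 2-D grids of grayscale values (rows of columns);
    region: (row0, row1, col0, col1) half-open bounds of the region of interest.
    A larger value indicates more patient movement, usable as an ANS measure.
    """
    r0, r1, c0, c1 = region
    total, count = 0.0, 0
    for r in range(r0, r1):
        for c in range(c0, c1):
            total += abs(curr_frame[r][c] - prev_frame[r][c])
            count += 1
    return total / count if count else 0.0
```

Summing or averaging this value over the frames following the end of stimulus would give one possible degree-of-activity measure for the period described above.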
It should be noted that while the present disclosure relates to heart rate or heart rate variability (or indeed, response indicated by the video image or image stream), any other measurable physiological measure applicable to stimulus response is contemplated herein. For example, respiration rate (RR) of a neonate (or other patient) can serve as a measure of response as disclosed herein. For various possible responses, baselines may be created, updated, etc., to provide accurate measures in line with the above. With regard to RR, for example, a pre-stimulus baseline may be measured. When the stimulus occurs, the RR may increase. A measure of this change may be made. The RR may be measured from the pulse oximetry waveform, from the video stream, or from any other device configured to measure RR.
Accordingly, and advantageously, such systems and methods provide mechanisms to determine whether a patient is in or is about to enter an impaired state indicative of disease.
The present disclosure relates to use of any camera system, or combination of camera systems, where a distance or depth can be measured, including infrared (IR) and Red/Green/Blue (RGB), etc. Additionally, the camera(s) may generate distance or depth data independently, or with the assistance of one or more processors in communication therewith.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should be understood that various aspects disclosed herein may be combined in different combinations than the combinations specifically presented in the description and accompanying drawings. It should also be understood that, depending on the example, certain acts or events of any of the processes or methods described herein may be performed in a different sequence, may be added, merged, or left out altogether (e.g., all described acts or events may not be necessary to carry out the techniques). In addition, while certain aspects of this disclosure are described as being performed by a single module or unit for purposes of clarity, it should be understood that the techniques of this disclosure may be performed by a combination of units or modules associated with, for example, a medical device.
In one or more examples, the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
Instructions may be executed by one or more processors, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
| Entry |
|---|
| International Search Report and Written Opinion for PCT/US2022/036935 mailed Jan. 17, 2023, 13 pp. |
| Number | Date | Country |
|---|---|---|
| 20230012742 A1 | Jan 2023 | US |