The present disclosure is directed generally to health care. More particularly, but not exclusively, various methods and apparatus disclosed herein relate to monitoring changes in conditions of multiple individuals such as patients in areas such as waiting rooms.
When patients visit the hospital, they typically are triaged to determine various information about the patients, such as their names, ages, heights, weights, vital signs, reasons for visiting, and other similar information. Once triaged, the patients are sent to an area such as a waiting room to wait for hospital resources such as physicians to become available to examine and/or treat the patients. Wait times for the patients may be significant depending on availability of hospital resources. It is not uncommon for patients to deteriorate while waiting, and medical personnel may not always become aware of the deterioration in a timely fashion.
The present disclosure is directed to methods, systems, and apparatus for monitoring changes in conditions of multiple individuals such as patients in an area such as a waiting room. For example, a plurality of triaged patients may wait in a waiting room until they can be seen by an emergency room (“ER”) physician. The patients may be included in a patient monitoring queue (also referred to simply as a “patient queue”) that is ordered or ranked, for instance, based on an indicator or measure of acuity associated with each patient (referred to herein as a “patient acuity indicator”) that is determined based on information obtained/acquired from the patient by a triage nurse, as well as other data points such as patient waiting time, patient presence, etc. One or more “vital sign acquisition cameras” mounted in the waiting room may be configured to periodically perform contactless and/or unobtrusive acquisition of one or more updated vital signs from each patient. These updated vital signs may include but are not limited to blood pressure, temperature, pulse rate, oxygen saturation (“SO2”), respiration rate, skin color, posture, sweat levels, and so forth. In some embodiments, if the updated vital signs and/or a difference between updated and previously-acquired vital signs (e.g., initial vital signs obtained at triage, previous updated vital signs acquired by the vital sign acquisition cameras) satisfy one or more thresholds, an alert may be raised to notify medical personnel of deterioration of the patient. The medical personnel may then take immediate action.
Generally, in one aspect, a method may include: determining, by one or more processors, patient information associated with a given patient of a plurality of patients in an area, wherein the area can be captured by one or more vital sign acquisition cameras; acquiring, by one or more of the vital sign acquisition cameras, one or more updated vital signs from the given patient; generating, by one or more of the processors, one or more adjusted updated vital signs based on the one or more updated vital signs and the patient information associated with the given patient; comparing, by one or more of the processors, the one or more adjusted updated vital signs and one or more prior vital signs acquired previously from the given patient; detecting, by one or more of the processors, based on the comparing, deterioration of the given patient; and providing, by one or more of the processors, output alerting medical personnel of the deterioration of the given patient.
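The following is a minimal, non-limiting sketch (in Python) of how the determining, generating, comparing, detecting, and providing operations of such a method might fit together. The names, the simple ten percent rule, and the fifteen beats-per-minute offset are illustrative assumptions, not the disclosed implementation.

from dataclasses import dataclass, field

@dataclass
class Patient:
    patient_id: str
    gender: str                                        # e.g., "F" or "M"
    medications: list = field(default_factory=list)
    prior_vitals: dict = field(default_factory=dict)   # e.g., {"pulse": 72.0}

def generate_adjusted_vitals(updated: dict, patient: Patient) -> dict:
    """Adjust updated vital signs based on patient information (illustrative offsets)."""
    adjusted = dict(updated)
    if "pulse" in adjusted and "beta-blocker" in patient.medications:
        adjusted["pulse"] += 15.0   # hypothetical normalization, mirroring a later example
    return adjusted

def detect_deterioration(adjusted: dict, prior: dict, threshold: float = 0.10) -> bool:
    """Compare adjusted updated vitals with prior vitals; flag a large relative change."""
    for name, new_value in adjusted.items():
        old_value = prior.get(name)
        if old_value and abs(new_value - old_value) / old_value > threshold:
            return True
    return False

def monitor(patient: Patient, updated_vitals: dict) -> None:
    adjusted = generate_adjusted_vitals(updated_vitals, patient)
    if detect_deterioration(adjusted, patient.prior_vitals):
        print(f"ALERT: patient {patient.patient_id} may be deteriorating")

monitor(Patient("P001", "F", ["beta-blocker"], {"pulse": 72.0}), {"pulse": 95.0})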
In various embodiments, the patient information may include a gender of the given patient, and the generating includes adjusting the one or more updated vital signs based on the gender of the given patient. In various embodiments, the patient information may include one or more medications taken by the given patient, and the generating includes adjusting the one or more updated vital signs based on the medications taken by the given patient.
In various embodiments, the method may further include: determining, by one or more of the processors, a baseline vital sign score for at least one vital sign of the given patient, wherein the baseline vital sign score is based on a measurement of the at least one vital sign acquired previously from the given patient; and determining, by one or more of the processors, an updated vital sign score for the at least one vital sign of the given patient, wherein the updated vital sign score is based on the one or more adjusted updated vital signs of the patient. In various embodiments, the comparing may include comparing the baseline vital sign score with the updated vital sign score. In various embodiments, the detecting may include determining that a difference between the baseline vital sign score and the updated vital sign score satisfies a threshold. In various embodiments, the detecting may include determining that a difference between the baseline vital sign score and the updated vital sign score does not demonstrate a trend towards normalcy of the at least one vital sign. In various embodiments, the at least one vital sign may include pulse rate and/or respiration rate. In various embodiments, the one or more vital sign acquisition cameras may include a pan-tilt-zoom (“PTZ”) camera.
Other implementations may include a non-transitory computer readable storage medium storing instructions executable by a processor to perform a method such as one or more of the methods described above. Yet another implementation may include a control system including memory and one or more processors operable to execute instructions, stored in the memory, to implement one or more modules or engines that, alone or collectively, perform a method such as one or more of the methods described above.
It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the subject matter disclosed herein. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the subject matter disclosed herein. It should also be appreciated that terminology explicitly employed herein that also may appear in any disclosure incorporated by reference should be accorded a meaning most consistent with the particular concepts disclosed herein.
In the drawings, like reference characters generally refer to the same parts throughout the different views. Also, the drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the disclosure.
When patients visit the hospital, they typically are triaged to determine various information about the patients, such as their names, ages, heights, weights, vital signs, reasons for visiting, and other similar information. Once triaged, the patients are sent to an area such as a waiting room to wait for hospital resources such as physicians to become available to examine and/or treat the patients. Wait times for the patients may be significant depending on availability of hospital resources. It is not uncommon for patients to deteriorate while waiting, and medical personnel may not always become aware of the deterioration in a timely fashion. Accordingly, techniques described herein facilitate automatic and unobtrusive (e.g., contactless) monitoring of patients' conditions in an area such as a waiting room, so that alerts may be provided to medical personnel when a deterioration of a patient warrants immediate medical attention.
The following are definitions of terms as used in the various embodiments of the present invention. The term “database” as used herein refers to a collection of data and information organized in such a way as to allow the data and information to be stored, searched, retrieved, updated, and manipulated, and to be presented in one or more formats, such as in table form, or grouped into text, numbers, images, and audio data. The term “database” as used herein may also refer to a portion of a larger database, which in this case forms a type of database within a database. “Database” as used herein also refers to conventional databases that may reside locally or that may be accessed from a remote location, e.g., remote network servers. The database typically resides in computer memory that includes various types of volatile and non-volatile computer storage. Memory wherein the database resides may include high-speed random access memory or non-volatile memory such as magnetic disk storage devices, optical storage devices, and flash memory. Memory wherein the database resides may also comprise software for processing and organizing data received by and stored in the database.
At block 108, the new patient may be registered. Registration may include, for instance, collecting information about the patient such as the patient's name, age, gender, insurance information, and reason for visit. Typically, but not exclusively, this information may be manually input into a computer by medical personnel such as a triage nurse. In some embodiments, one or more reference images of the patient may be acquired, e.g., by a camera that is integral with a computing device operated by the triage nurse, by a standalone camera, and/or by a vital sign acquisition camera (in which case at least some vital signs may be optionally acquired at registration). In many instances, the triage nurse additionally may acquire various initial vital signs at block 110 using various medical instruments. These initial vital signs may include but are not limited to blood pressure, pulse, glucose level, SO2, photoplethysmogram (“PPG”), respiration rate (e.g., breathing rate), temperature, skin color, and so forth. While not depicted in
Once the patient is registered and their initial vital signs acquired, at block 112, the patient may be sent to waiting room 104. In some embodiments, the patient may be assigned a so-called “patient acuity indicator,” which may be a measure that is used to rank a severity of the patient's ailment, and in some instances may indicate an anticipated need for emergency room resources. Any number of commonly used indicators and/or clinician decision support (“CDS”) algorithms may be used to determine and/or assign a patient acuity indicator, including but not limited to the Emergency Severity Index (“ESI”), the Taiwan Triage System (“TTS”), the Canadian Triage and Acuity Scale (“CTAS”), and so forth. For example, in some embodiments, vital signs of the patient may be compared with predefined vital sign thresholds stored in a system database, or with published or known vital sign values typical for a given patient age, gender, weight, etc., to determine the patient's initial patient acuity indicator and/or the patient's initial position in the patient queue. In some embodiments, various physiological and other information about the patient may be fed into a trained model (e.g., regression model, neural network, deep learning network, etc.), case-based reasoning algorithm, or other clinical reasoning algorithm to derive one or more acuity measures. In some embodiments, the information used for deriving the acuity measure may include or even be wholly limited to vitals or other information that may be captured by the vital sign acquisition camera. In some embodiments, the information used for deriving the acuity measure may alternatively or additionally include information such as information from a previous electronic medical record (“EMR”) of the patient, information acquired from the patient at triage, information from wearable devices or other sensors carried by the patient, information about other patients or people in the waiting room (e.g., vitals of others in the room), information about family members or others associated with the patient (e.g., family member EMRs), etc.
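As one hedged illustration of how an initial patient acuity indicator might be derived by comparing initial vital signs with stored thresholds, consider the following sketch. The threshold values and the ESI-like 1 (most acute) to 5 (least acute) scale are assumptions for illustration only; real triage scales and CDS algorithms are considerably more involved.

def initial_acuity_indicator(vitals: dict) -> int:
    """Map initial vital signs onto a hypothetical 1 (most acute) to 5 scale."""
    pulse = vitals.get("pulse", 75)          # beats per minute
    resp = vitals.get("respiration", 14)     # breaths per minute
    spo2 = vitals.get("spo2", 98)            # percent oxygen saturation

    if spo2 < 90 or pulse > 140 or resp > 30:
        return 1   # immediate attention
    if spo2 < 94 or pulse > 120 or resp > 24:
        return 2
    if pulse > 100 or resp > 20:
        return 3
    return 4 if pulse > 90 else 5

print(initial_acuity_indicator({"pulse": 125, "respiration": 22, "spo2": 95}))  # -> 2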
At block 114, it may be determined, e.g., using one or more cameras, sensors, or input from medical personnel, that a patient has left the waiting room. Block 114 may include scanning each person currently within the waiting room (e.g., as part of a seeking function that attempts to locate the patient once the patient is at top of a queue of patients for which vitals are to be captured, such as an execution of block 120 described below, or cycling through each person in the room to capture vitals, as multiple executions of the loop including blocks 118 and 120 described below) and determining that the patient was not located. In some embodiments, the system may wait until a predetermined number of instances of the patient missing is reached or a predetermined amount of time has passed during which the patient is missing before the patient is deemed to have left the waiting room to account for temporary absences (e.g., visiting the restroom or speaking with clinical staff in a triage room). For example, the patient may have been taken into the ER proper because it is their turn to see a doctor. Or the patient's condition may have improved while they waited, causing them to leave the hospital. Or the patient may have become impatient and left to seek care elsewhere. Whatever the reason, once it is determined that the patient has left the waiting room for at least a threshold amount of time, at block 116, the patient may be released from the system, e.g., by removing them from a queue in which registered patients are entered.
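A minimal sketch of this release logic might look as follows; the AbsenceTracker helper, the miss count, and the time window are hypothetical example values rather than part of the disclosure.

import time

class AbsenceTracker:
    def __init__(self, max_misses: int = 3, max_missing_seconds: float = 900.0):
        self.max_misses = max_misses
        self.max_missing_seconds = max_missing_seconds
        self.misses = {}   # patient_id -> (consecutive misses, time of first miss)

    def record_scan(self, patient_id: str, found: bool) -> bool:
        """Return True if the patient should be released from the system."""
        if found:
            self.misses.pop(patient_id, None)   # temporary absence ended
            return False
        count, first_missed = self.misses.get(patient_id, (0, time.time()))
        count += 1
        self.misses[patient_id] = (count, first_missed)
        missing_for = time.time() - first_missed
        return count >= self.max_misses or missing_for >= self.max_missing_seconds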
At block 118, a patient in waiting room 104 may be identified for monitoring using techniques described herein. For example, in some embodiments, a database storing registration information obtained at blocks 108-110 may be searched to identify a patient having the highest patient acuity indicator or a patient having the highest acuity measure that has not been monitored recently, as may be determined by a time threshold set for all patients or set (e.g., inversely correlated) based on the acuity measure. In other embodiments, a plurality of patients in the waiting room may be ranked in a patient monitoring queue, e.g., by their respective patient acuity indicators, in addition to or instead of other measures such as waiting times, patient presence in the waiting room (e.g., missing patients may be selected for monitoring more frequently to determine whether they should be released if repeatedly absent), etc. In yet other embodiments, patient acuity indicators may not be considered when ranking the patient monitoring queue, and instead only patient waiting times, patient presence, etc., may be considered.
However such a patient monitoring queue is ranked, in some embodiments, the first patient in the queue may be identified as the one to be monitored next. It is not required (though it is possible) that the patient monitoring queue be stored in a sequence of physical memory locations ordered by patient acuity indicators. Rather, in some embodiments, a ranked patient monitoring queue may merely include a rank or priority level value associated with each patient. In other words, a “patient monitoring queue” as described herein may refer to a “logical” queue that is logically ranked based on patient acuity indicators, waiting time, etc., not necessarily a contiguous sequence of memory locations. Patients may be identified for monitoring at block 118 in an order of their respective ranking in the patient monitoring queue.
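For example, a logical patient monitoring queue might associate each patient with a priority value derived from a patient acuity indicator and waiting time, as in the following sketch; the weighting and class structure are illustrative assumptions.

import time

class PatientMonitoringQueue:
    def __init__(self, wait_weight: float = 0.01):
        self.entries = {}            # patient_id -> (acuity indicator, enqueue time)
        self.wait_weight = wait_weight

    def add(self, patient_id: str, acuity: float) -> None:
        self.entries[patient_id] = (acuity, time.time())

    def priority(self, patient_id: str) -> float:
        acuity, enqueued = self.entries[patient_id]
        waited = time.time() - enqueued
        return acuity + self.wait_weight * waited   # higher value = monitor sooner

    def next_patient(self) -> str:
        """Identify the next patient to monitor without storing entries in rank order."""
        return max(self.entries, key=self.priority)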
At block 120, the patient identified at block 118 may be located in waiting room 104. In various embodiments, one or more vital sign acquisition cameras (not depicted in
In other embodiments, no patient monitoring queue may be established. Instead, vital sign acquisition cameras may simply be configured to pan, tilt, and/or zoom so that their respective fields of view move across predetermined trajectories of waiting room 104. For example, vital sign acquisition cameras may be configured to sequentially scan across rows of chairs, and/or to sequentially scan through areas of waiting room 104 known to be commonly inhabited by patients. In such embodiments, as each face is captured, it is matched against patient records to identify the patient to which the face corresponds so that vitals may be captured and correlated to the correct patient or person.
In various embodiments, the system may not be limited to monitoring only patients that have been registered in the system (e.g., according to blocks 108-112). For example, it is possible that a companion of a patient in the waiting room may develop a condition that requires attention, even though the companion was not registered as a patient. As another example, a patient may not go through blocks 108-112 and may simply sit down in the waiting room because the waiting room/registration stations are busy, they do not know to register, they choose not to register, etc. In order to capture such unregistered persons/patients, in various embodiments, one or more vital sign acquisition cameras may scan the area being monitored (e.g., waiting room 104), e.g., by sequentially capturing locations at which persons are likely to wait and/or by capturing a wide-angle view of the area. In some embodiments, once one or more vital sign acquisition cameras capture a person, the next person they capture may be an adjacent person.
The system may detect a non-registered person by capturing an image/video of their face (or capture other identifying features) and failing to find a matching image among the records of registered patients. In such a case, the system may create a new record to represent the unknown person (which may include generating a unique identifier for the unregistered patient), capture vitals for storage in the record as initial vitals measurements, and record the image/video as a reference image (or attempt to capture one or more additional images/videos, potentially from better angles, for subsequent use as reference images). In some such embodiments, the new record may be associated with a population baseline for at least one vital, as described herein, for analysis of vitals to be captured in the future. As successive rounds of vitals are captured, a patient-specific baseline may be established to replace or supplement the population baseline. If an alert or other information about the unknown patient is to be displayed to the waiting room staff (e.g., as described below), the information may be displayed along with one or more of these reference images/videos to aid the staff in identifying the person in the waiting room to which the alert or other information corresponds. If the person is later registered according to blocks 108-112, the new information may be merged into the “unknown person” record either manually (e.g., by staff manually selecting the existing record which is to be supplemented with registration information) or automatically (e.g., by later comparing the reference images of the two records and determining they correspond to the same person, or by later encountering difficulty/ambiguity by matching the person to both records during a vitals capture sequence). Alternatively, when an unregistered person is found, the system can simply skip this person in terms of monitoring.
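The record-creation path for an unknown person might be sketched as follows; the similarity function, the matching threshold, and the population baseline values are placeholders for whatever face-matching approach and baselines the system actually uses.

import uuid

POPULATION_BASELINE = {"pulse": 75.0, "respiration": 14.0}   # illustrative values

def find_or_create_record(face_embedding, records, similarity, threshold=0.8):
    """Return the best-matching registered record, or create an "unknown person" record."""
    best = max(records, key=lambda r: similarity(face_embedding, r["reference"]),
               default=None)
    if best is not None and similarity(face_embedding, best["reference"]) >= threshold:
        return best
    new_record = {
        "id": f"unknown-{uuid.uuid4().hex[:8]}",      # unique identifier for the person
        "reference": face_embedding,                  # kept for later matching/merging
        "baseline": dict(POPULATION_BASELINE),        # replaced once vitals accumulate
        "registered": False,
    }
    records.append(new_record)
    return new_record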
At block 122, one or more vital sign acquisition cameras mounted or otherwise deployed in or near waiting room 104 may be operated to perform unobtrusive (e.g., contactless) acquisition of one or more updated vital signs from the patient identified at block 118 and located at block 120. These vital sign acquisition cameras may be configured to acquire (without physically contacting the patient) a variety of different vital signs from the patient, including but not limited to blood pressure, pulse (or heart rate), skin color, respiratory rate, PPG, SO2, temperature, posture, sweat levels, and so forth.
In some embodiments, vital sign acquisition cameras equipped to perform so-called “contactless methods” to acquire vital signs and/or extract physiological information from a patient may be used as medical image devices. Non-limiting examples of such cameras are described in United States Patent Application Publication Nos. 20140192177A1, 20140139656A1, 20140148663A1, 20140253709A1, 20140235976A1, and 20140275880A1, which are incorporated herein by reference for all purposes.
In some embodiments, one technique for determining a patient's heart rate or pulse may be to monitor the patient's facial skin color. Micro-changes in skin color that are caused by blood flow may be detected by a vital sign acquisition camera. These detected micro-changes may be used to determine a pulse rate of the patient. Facial skin color changes due to varying heart rate changes may not be visible to the naked eye, but the use of vital sign acquisition cameras described herein may allow detection of micro-changes in skin color.
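By way of a simplified, hedged illustration, a pulse estimate can be derived from the dominant frequency of the average green-channel intensity of a facial region over time; real remote-photoplethysmography pipelines use considerably more robust signal processing than this sketch.

import numpy as np

def pulse_from_green_signal(green_means: np.ndarray, fps: float) -> float:
    """green_means: mean green intensity of the face region, one value per frame."""
    signal = green_means - green_means.mean()            # remove the DC component
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 0.75) & (freqs <= 3.0)               # roughly 45-180 beats per minute
    peak_freq = freqs[band][np.argmax(power[band])]
    return peak_freq * 60.0                               # beats per minute

# Synthetic 72 bpm signal sampled at 30 frames per second:
t = np.arange(0, 20, 1 / 30.0)
print(round(pulse_from_green_signal(100 + 0.5 * np.sin(2 * np.pi * 1.2 * t), 30.0)))  # -> 72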
Another vital sign measurable by vital sign acquisition cameras described herein is a patient's respiratory rate. In some embodiments, a vital sign acquisition camera may zoom in to the patient's chest and/or abdominal area to track the patient's chest or abdominal movements. The medical image device may then determine the patient's respiratory rate, e.g., by monitoring the movement of the patient's chest or diaphragm area. Additionally or alternatively, a patient's body temperature may be determined by vital sign acquisition cameras described herein that are configured to capture thermographic or infrared images/video.
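An analogous sketch for respiratory rate, again purely illustrative, looks for the dominant low-frequency component of a chest-motion signal (for example, the average vertical position of a tracked chest region per frame).

import numpy as np

def respiration_rate(chest_positions: np.ndarray, fps: float) -> float:
    """chest_positions: one chest-region position value per frame."""
    signal = chest_positions - chest_positions.mean()
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 0.1) & (freqs <= 0.7)    # roughly 6-42 breaths per minute
    return freqs[band][np.argmax(power[band])] * 60.0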
At block 124, it may be determined, e.g., by one or more components depicted in
At block 126, it may be determined (again, by one or more components of
Registration module 242 may be configured to receive, e.g., as manual input from a duty nurse, registration information of new patients. This may include, for instance, the patient's name, age, insurance information, and so forth. Triage module 244 may be configured to receive, e.g., as manual input from a duty nurse or directly from networked medical equipment, vital signs such as those described above and/or other physiological data, such as weight, height, the patient's reason for the visit, etc. In various embodiments, vital signs received by triage module 244 and/or a patient acuity indicator (e.g., ESI in
Alarm module 248 may be configured to receive information indicative of various events, such as patient deterioration, and raise various alarms and/or alerts in response. These alarms and/or alerts may be output using a variety of modalities, including but not limited to visual output (e.g., on display screens visible to hospital personnel), intercom announcements, text messages, emails, audio alerts, haptic alerts, pages, pop-up windows, flashing lights, and so forth. Modules 242-248 of hospital information system 240 may be operably coupled, e.g., via one or more computer networks (not depicted), to a hospital information system interface 250 (“H.I.S. Interface” in
Hospital information system interface 250 may serve as an interface between the traditional hospital information system 240 and a patient monitoring system 252 configured with selected aspects of the present disclosure. In various embodiments, the hospital information system interface 250 may publish, e.g., to other modules of the patient monitoring system 252, various information about patients such as registration information, patient acuity indicators (e.g., ESI), prescribed and/or administered medications, whether a patient has been released, various alarms/alerts, and so forth. As will be described below, in some embodiments, these publications may be provided to an event publish and subscribe (“EPS”) module 270, which may then selectively store them in database 272 and/or selectively publish them to other modules of patient monitoring system 252. In some embodiments, hospital information system interface 250 may additionally or alternatively subscribe to one or more alerts or publications provided by other modules. For example, hospital information system interface 250 may subscribe to alerts from deterioration detection module 268, e.g., so that hospital information system interface 250 may notify appropriate components of hospital information system 240, such as alarm module 248, that a patient is deteriorating.
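A minimal publish/subscribe hub in the spirit of the EPS module described here might look like the following; the class, event type, and payload names are illustrative only.

from collections import defaultdict

class EventPublishSubscribe:
    def __init__(self):
        self.subscribers = defaultdict(list)   # event type -> list of callbacks

    def subscribe(self, event_type: str, callback) -> None:
        self.subscribers[event_type].append(callback)

    def publish(self, event_type: str, payload: dict) -> None:
        for callback in self.subscribers[event_type]:
            callback(payload)

eps = EventPublishSubscribe()
eps.subscribe("vital_sign_measurement",
              lambda event: print("deterioration check for", event["patient_id"]))
eps.publish("vital_sign_measurement", {"patient_id": "P001", "pulse": 112})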
Patient monitoring system 252 may include a variety of components that facilitate monitoring of patients in an area such as waiting room 104 to ensure that patients are served in a manner conducive with their actual medical condition. Patient monitoring system 252 may include, for instance, a patient capture module 254 that interfaces with one or more cameras 256, a patient queue module 258, a patient locator module 260, a dynamic calibration module 262, a face/torso acquisition module 264, a vital signs measurement module 266, a deterioration detection module 268, the aforementioned EPS module 270, and one or more databases 272, 274. As noted above, each of modules 250, 254, and 258-274 may be implemented using any combination of hardware and software. And while these modules are depicted separately, that is not meant to be limiting or to suggest each is implemented on a separate piece of hardware or software. For example, one or more modules may be combined and/or omitted, and one or more modules may be implemented on one or more computing systems operably connected via one or more computer networks (not depicted). The lines depicted connecting various components of
Patient monitoring system 252 may also include one or more vital sign acquisition cameras 276 that are configured to acquire, e.g., from some distance from a patient, one or more vital signs of the patient. Examples of such vital sign acquisition cameras were described above. In various embodiments, a vital sign acquisition camera 276 may be a pan-tilt-zoom (“PTZ”) camera that is operable to pan, tilt, and zoom so that different parts of an area such as waiting room 104 are contained within its field of view. In this manner, it is possible to scan the area being monitored to locate different patients, so that updated vital signs may be acquired unobtrusively.
Patient capture module 254 may receive, from one or more cameras 256, one or more signals carrying captured image data of a patient. For example, in some embodiments, patient capture module 254 may receive a video stream from camera 256. Patient capture module 254 may perform image processing (e.g., face detection, segmentation, shape detection to detect human form, etc.) on the video stream to detect when a patient is present, and may capture a reference image of the patient in response to the detection. In some embodiments, the reference image may be captured at a higher resolution than individual frames of the video stream, although this is not required. In some embodiments, camera 256 may be a standalone camera, such as a webcam, a PTZ camera (e.g., 276), and so forth, that is deployed in or near pre-waiting room area(s) 102. The one or more images captured by camera 256 may be used thereafter as reference patient images that are associated with the patient and used later to identify the patient in the area being monitored.
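A hedged sketch of such a capture path, using OpenCV's bundled Haar cascade face detector purely for illustration, might look like the following; production systems would likely use stronger detectors and higher-resolution reference capture.

import cv2

def capture_reference_image(stream_url: str, output_path: str) -> bool:
    """Watch a video stream and save the first frame containing a detected face."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    capture = cv2.VideoCapture(stream_url)
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                return False                       # stream ended without a detection
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
            if len(faces) > 0:
                cv2.imwrite(output_path, frame)    # keep the frame as the reference image
                return True
    finally:
        capture.release()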
Patient queue module 258 may be configured to establish and/or maintain a priority queue, e.g., in a database, of patients in the area being monitored. In various embodiments, the queue may be ordered by various parameters. In some embodiments, patients in the queue may be ranked in order of patient acuity indicators (i.e. by priority based on health status). For example, the most critical patients may be placed near the front of the queue and less critical patients may be placed near the end of the queue, or vice versa. In some embodiments, updated vital signs may be acquired from patients waiting in the area being monitored, such as waiting room 104, in an order of the queue. In other embodiments, updated vital signs may be acquired from patients in a FIFO or round robin order. In other embodiments, updated vital signs may be acquired from patients in an order that corresponds to a predetermined scan trajectory programmed into vital sign acquisition camera 276 (e.g., scan each row of chairs in order).
Patient locator module 260 may be configured to use one or more signals received from vital sign acquisition camera 276, in conjunction with one or more reference patient images captured by patient capture module 254, to locate one or more patients in the area being monitored (e.g., waiting room 104). Patient locator module 260 may use various image processing techniques to identify patients using various visual features of patients. These visual features that may be used to recognize patients may include but are not limited to facial features, torso features, clothing, hair, posture, and so forth.
In some embodiments, patient locator module 260 may search an area being monitored for particular patients from which to obtain updated vital signs. For example, patient locator module 260 may search the area being monitored for a patient identified by patient queue module 258, which may be, for instance, the patient in the queue having the highest patient acuity indicator. In some embodiments, patient locator module 260 may cause vital sign acquisition camera(s) 276 to scan the area being monitored (e.g., waiting room 104) until the identified patient is located.
Dynamic calibration module 262 may be configured to track the use of vital sign acquisition camera(s) 276 and calibrate them as needed. For instance, dynamic calibration module 262 may ensure that whenever vital sign acquisition camera 276 is instructed to point to a particular PTZ location, it always points to the same place. PTZ cameras may be in constant or at least frequent motion. Accordingly, their mechanical components may be subject to wear and tear. Small mechanical errors/biases may accumulate and cause vital sign acquisition camera 276 to respond, over time, differently to a given PTZ command. Dynamic calibration module 262 may correct this, for instance, by occasionally running a calibration routine in which landmarks (e.g., indicia such as small stickers on the wall) may be used to train a correction mechanism that will make vital sign acquisition camera 276 respond appropriately.
Once a patient identified from patient queue 258 is recognized by patient locator module 260, face/torso acquisition module 264 may be configured to pan, tilt, and/or zoom one or more vital sign acquisition cameras 276 so that their fields of view capture a desired portion of the patient. For example, in some embodiments, face/torso acquisition module 264 may pan, tilt, or zoom a vital sign acquisition camera 276 so that it is focused on a patient's face and/or torso. Additionally or alternatively, face/torso acquisition module 264 may pan, tilt, or zoom one vital sign acquisition camera 276 to capture the patient's face, and another to capture the patient's torso. Various vital signs may then be acquired. For instance, vital signs such as the patient's pulse, SpO2, respiratory rate, and blood pressure may be obtained, e.g., by vital signs measurement module 266, by performing image processing on an image/video of the patient's face captured by vital sign acquisition camera(s) 276. Vital signs such as the patient's respiratory rate, general posture (which may indicate pain and/or injury), and so forth may be obtained, e.g., by vital signs measurement module 266, by performing image processing on an image/video of the patient's torso captured by vital sign acquisition camera(s) 276. Of course, the face and torso are just two examples of body portions that may be examined to obtain vital signs, and are not meant to be limiting.
Deterioration detection module 268 may be configured to analyze one or more signals to determine whether a condition of a registered patient is deteriorating, improving, and/or remaining stable. In some embodiments, the patient condition may be represented, at least in part, by the same patient acuity indicators described above for determining the order of patients for monitoring. As such, the deterioration detection module 268 may include one or more CDS, case-based reasoning, or other clinical reasoning algorithms as described herein (e.g., trained logistic regression models or other machine learning models) for assessing measures of patient condition other than the acuity measures described herein. In some embodiments, the algorithms for assessing patient acuity or other measures of patient condition employed by the deterioration detection module 268 may be updated from time to time by, for example, writing new trained weights (e.g., theta values) for a selected machine learning module or providing new instructions for execution by a processor (e.g., in the form of a Java archive, JAR, file or compiled library). These signals may include, for instance, a patient's initial vital signs and other physiological information (e.g., obtained at blocks 108-110 of
EPS module 270 may be a general communication hub that is configured to distribute events released by various other components of
In some embodiments, EPS module 270 may be in communication with one or more databases, such as database 272 and/or archive 274 (which may be optional). In some embodiments, EPS module 270 may accept remote procedure calls (“RPC”) from any module to provide access to information stored in one or more databases 272 and/or 274, and/or to add information (e.g., alerts) received from other modules to databases 272 and/or 274. Database 272 may store information contained in alerts, publications, or other communications sent/broadcast/transmitted by one or more other modules in
It will be apparent that various hardware arrangements may be utilized to implement the patient monitoring system 252. For example, in some embodiments, a single device may implement the entire system 252 (e.g., a single server to operate the camera 276 to perform the vitals acquisition functions 260-266 and to perform the vitals analysis and alerting functions including deterioration detection module 268 and patient queue module 258). In other embodiments, multiple independent devices may form the system 252. For example, a first device may drive the camera 276 and implement functions 260-266 while another server may perform the remaining functions. In some such embodiments, one device may be local to the waiting room while another may be remote (e.g., implemented as a virtual machine in a geographically distant cloud computing architecture). In some embodiments, a device (e.g., including a processor and memory) may be disposed within the camera 276 itself and, as such, the camera 276 may not simply be a dumb peripheral and, instead may perform the vital signs functions 260-266. In some such embodiments, another server may provide indications (e.g. identifiers, full records, or registered facial images) to the camera 276 to request that vitals be returned for further processing. In some such embodiments, additional functionality may be provided on-board the camera 276 such as, for example, the deterioration detection 268 (or preprocessing therefor) and/or patient queue 258 management may be performed on-board the camera 276. In some embodiments, the camera 276 may even implement the HIS interface 250 or EPS 270. Various additional arrangements will be apparent.
In some embodiments, vital sign acquisition cameras 376 may analyze patients 378 for abnormal breathing patterns such as heavy or irregular breathing. An image or video of a patient with a very pale skin color that also depicts the patient experiencing shortness of breath may be compared to the patient's reference image, e.g., stored by hospital information system 240, to determine that the patient is experiencing a heart attack. In such a scenario, an alert may be sent immediately to medical personnel, e.g., by text message, intercom announcement, output on a display screen, etc.
Suppose that in
Techniques described herein are not limited to hospital waiting rooms. There are numerous other scenarios in which techniques described herein may be implemented to achieve a variety of technical advantages. For example, disclosed techniques may also be used for security monitoring of crowds in airports, arenas, and other public places. In such scenarios, rather than monitoring patients to determine patient acuity indicators, individuals may be monitored for other types of measurements, such as risk measurements.
As another example,
In some embodiments, at the start of the monitoring process, e.g., when first athlete 478A enters the gym to work out, one of the two vital sign acquisition cameras 476 may identify first athlete 478A, e.g., based on a reference image captured previously (e.g., when first athlete 478A joined the gym and received a photo ID). Vital sign acquisition camera 476A may zoom in to a facial area of first athlete 478A to acquire a heart rate, and then may zoom in on a chest area of first athlete 478A to acquire a respiratory rate. The acquired vital signs may be transmitted by vital sign acquisition camera 476A to a computing device (not depicted, e.g., one or more components of patient monitoring system 252) for further analysis, and may be stored in a database (e.g., 272, 274). If the acquired vital signs exceed a certain threshold level, a notification (e.g., in the form of an audible signal or a visual alert) may be generated to alert training instructor 486 about the exceeded threshold. In some embodiments, the computing device may recommend specific steps to be performed by first athlete 478A, such as stretching and adequate breaks between training sessions. Similar techniques may be applied to other athletes, such as second athlete 478B, depending on their respective health conditions. In some embodiments, rather than monitoring for signs of injury, techniques described herein may be used in a gym or similar setting to track calories burned or other physiological metrics, e.g., based on athlete movement, weight, temperature, pulse, respiration rate, etc., that are tracked by vital sign acquisition cameras 476 over time. To adapt the system of
At block 502, individual health indices (e.g., patient acuity indicator, workout intensity measure, ESI, etc.) may be received, e.g., from medical personnel (e.g., at triage), for a plurality of individuals (e.g., patients, athletes, residents of a nursing home, etc.) located in an area such as a waiting room that is capable of being captured in fields of view of the above-described vital sign acquisition cameras (which as noted above may have adjustable fields of view by virtue of panning, tilting, and/or zooming). For example, in various embodiments, prior to entering the monitored area, one or more initial vital signs may be acquired from each individual, e.g., by a triage nurse or trainer. Based on these initial vital signs, individual health indices may be determined, e.g., by medical personnel, for each individual in the area at block 502.
At block 504, the system may establish (or update, if it already exists) a queue (e.g., a patient queue) of individuals in the area. In various embodiments, the queue may be ordered and/or ranked based at least in part on the individual health indices determined at block 502. Additionally or alternatively, in some embodiments, the queue may be ordered based on other data points, including but not limited to the time each patient arrived, how long each patient has waited, and so forth. In some embodiments, when flow returns from block 512 or 514, block 504 may include placing the most recently monitored patient back on the queue (or otherwise placing a new entry for the patient in the queue). For example, in some embodiments, the patient may simply be placed at the end of the queue. In other embodiments, block 504 may take the patient acuity indicator, deterioration measure, vitals, or other information into account such as, for example, placing the patient at a position in the queue after any patients having higher acuity measures but ahead of any patients having lower acuity measures. In other embodiments, more complex rules may be employed. For example, in some embodiments, the patient may be placed back in the queue as described but no higher than fifth (or other constant value) from the top of the queue, to help prevent the same (highest acuity) patient from being monitored repeatedly while other patients are never monitored (because they never reach the top of the queue). Alternatively, rather than a constant maximum position, the maximum position may be determined based on the current contents of the queue. For example, the maximum position may be set to equal the number (or a constant plus the number) of “high acuity” patients, identified as those patients having a patient acuity indicator surpassing a preset threshold. In some embodiments, such high acuity patients may be placed at an intermediate point in the queue (according to any of the methods described herein), while others may be placed at the end of the queue. In other embodiments, the patient may be placed at a position from the front equal to the number of patients having a higher acuity measure in the queue plus one (or some other constant) to allow at least some lower acuity patients to be monitored ahead of the current patient. In some embodiments, acuity measure values (or ranges thereof) may be associated with a delay between subsequent measurements measured in, for example, number of queue positions or real time between measurements, which may then be translated into the position of the queue where the patient will be placed. In some embodiments, the patient acuity indicator (or other value driving queue position placement) may take into account the time that has passed since the patient was last monitored; as such, as a patient sits in the queue, their acuity measure (or other queue position determining value) may gradually increase, making it more difficult for other patients to be placed ahead of that patient in the queue. In some such embodiments, a “queue priority value” may be utilized in the manner described above as applied to the patient acuity indicator but may equal the patient acuity indicator plus the time since the patient was last monitored (or some weighted sum of these two or additional values).
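The following sketch illustrates two of the re-queueing rules described above: a constraint on how close to the front a returning patient may be placed, and a queue priority value combining acuity with time since last monitored. The weight, position constant, and dictionary keys are arbitrary example values.

def queue_priority(patient: dict, weight: float = 0.1) -> float:
    """Acuity plus weighted time (in minutes) since the patient was last monitored."""
    return patient["acuity"] + weight * patient["minutes_since_monitored"]

def reinsert(queue: list, patient: dict, min_position_from_top: int = 5) -> list:
    """Place `patient` behind all higher-priority patients, but no higher than
    `min_position_from_top` from the front, so other patients are not starved."""
    ahead = sum(1 for p in queue if queue_priority(p) > queue_priority(patient))
    index = min(max(ahead, min_position_from_top - 1), len(queue))
    return queue[:index] + [patient] + queue[index:]

With a longer queue, the max(ahead, ...) term keeps the returning patient out of the first few positions even when that patient's priority is highest, while the time-since-monitored term gradually pulls long-waiting patients forward.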
At block 506, the system may select a given individual from which updated vital signs are to be acquired. For example, the system may select a patient from the front of the queue established in block 504 or may select a patient having a highest patient acuity indicator (which in many embodiments may be the first patient in the queue). In other embodiments, the system may select individuals in different orders, such as using FIFO and/or round robin.
At block 508, the individual selected at block 506 may be located in the monitored area by one or more vital sign acquisition cameras 276, e.g., based on one or more reference images of the individual. As noted above, various visual features of individuals may be used for location, including but not limited to facial features, posture, clothing, size, and so forth. At block 510, one or more vital sign acquisition cameras 276 may unobtrusively acquire one or more updated vital signs from the individual selected at block 506 and located at block 508. In various embodiments, individuals may opt out of unobtrusive acquisition, e.g., by notifying a triage nurse or other personnel.
In some embodiments, at block 512, deterioration in the individual selected at block 506 and located at block 508 may be detected based on the updated vital signs obtained at block 510 and at least one of an individual health index (e.g., patient acuity indicator) associated with the given patient (e.g., determined at block 502) or initial vital signs (or updated vital signs acquired during a previous iteration of patient monitoring system 252) acquired from the given patient. If deterioration is detected, e.g., due to a difference between initial and updated vital signs satisfying a threshold, the method 500 may proceed to block 514. At block 514, various modalities of output, including but not limited to text messages, intercom announcements, visual output, audio output, haptic feedback, etc., may be provided to alert pertinent personnel of the difference, e.g., to notify a duty nurse of deterioration of a patient. In some embodiments, for example, medical personnel may be alerted of patient deterioration by displaying either a most-recently captured image of the deteriorating patient (e.g., so that medical personnel will know who to look for in the waiting room) or a live streaming video of the deteriorating patient in the waiting room. Regardless of whether method 500 proceeds to block 514 after block 512, method 500 may proceed back to block 504 to update the queue, e.g., to reorder the queue so that the patient having the next highest patient acuity indicator may be monitored.
While examples described herein have primarily involved vital sign acquisition cameras such as cameras configured to perform contactless acquisition of vital signs, this is not meant to be limiting. In various embodiments, other types of sensors may be incorporated into vital sign acquisition cameras and/or deployed separately to detect vital signs of patients. For example, motion sensors may be used, for example, to detect abnormal motions of a patient in a waiting room such as those due to a patient undergoing a seizure. Various types of motion sensors may be employed, including but not limited to infrared, optical, microwave, ultrasonic, acoustic, or tomographic based sensors, as well as those that fall under the category of occupancy sensors. Motion sensors may be passive and/or dynamic. Passive infrared sensors, for instance, detect heat movement by way of a pyroelectric sensor designed to detect infrared radiation radiated by a moving body. Ultrasonic sensors, by contrast, may leverage the Doppler-shift principle. An ultrasonic sensor may transmit high frequency sound waves in a monitored area and detect reflected wave patterns. Microwave sensors may work in a similar fashion except that they may transmit high frequency microwaves rather than sound waves.
User interface input devices 622 may include a keyboard, pointing devices such as a mouse, trackball, touchpad, or graphics tablet, a scanner, a touchscreen incorporated into the display, audio input devices such as voice recognition systems, microphones, and/or other types of input devices. In general, use of the term “input device” is intended to include all possible types of devices and ways to input information into computer system 610 or onto a communication network.
User interface output devices 620 may include a display subsystem, a printer, a fax machine, or non-visual displays such as audio output devices. The display subsystem may include a cathode ray tube (CRT), a flat-panel device such as a liquid crystal display (LCD), a projection device, or some other mechanism for creating a visible image. The display subsystem may also provide non-visual display such as via audio output devices. In general, use of the term “output device” is intended to include all possible types of devices and ways to output information from computer system 610 to the user or to another machine or computer system.
Data retention system 624 stores programming and data constructs that provide the functionality of some or all of the modules described herein. For example, the data retention system 624 may include the logic to perform selected aspects of method 500, and/or to implement one or more components of patient monitoring system 252.
These software modules are generally executed by processor 614 alone or in combination with other processors. Memory 625 used in the storage subsystem can include a number of memories including a main random access memory (RAM) 630 for storage of instructions and data during program execution, a read only memory (ROM) 632 in which fixed instructions are stored, and other types of memories such as instruction/data caches (which may additionally or alternatively be integral with at least one processor 614). A file storage subsystem 626 can provide persistent storage for program and data files, and may include a hard disk drive, a floppy disk drive along with associated removable media, a CD-ROM drive, an optical drive, or removable media cartridges. The modules implementing the functionality of certain implementations may be stored by file storage subsystem 626 in the data retention system 624, or in other machines accessible by the processor(s) 614. As used herein, the term “non-transitory computer-readable medium” will be understood to encompass both volatile memory (e.g. DRAM and SRAM) and non-volatile memory (e.g. flash memory, magnetic storage, and optical storage) but to exclude transitory signals.
Bus subsystem 612 provides a mechanism for letting the various components and subsystems of computer system 610 communicate with each other as intended. Although bus subsystem 612 is shown schematically as a single bus, alternative implementations of the bus subsystem may use multiple busses.
Computer system 610 can be of varying types including a workstation, server, computing cluster, blade server, server farm, or any other data processing system or computing device. In some embodiments, computer system 610 may be implemented within a cloud computing environment. Due to the ever-changing nature of computers and networks, the description of computer system 610 depicted in
Further, in this embodiment the color sensor 890 generates three different color signals 892A, 892B, 892C, e.g. by use of a color filter array 893 having three different color filter areas provided in front of a photo detector 895 (or, more generally, the image sensor). Such a color sensor (e.g. including a color filter array having only two color filter areas) could also be used in the embodiment shown in
As noted above, at blocks 124-126 of
When deterioration detection module 268 receives a vital sign measurement event, method 900 may proceed to block 904, at which point deterioration detection module 268 may determine, e.g., based on updated vital signs for the patient received from vital signs measurement module 266, whether the patient's condition has deteriorated sufficiently to warrant raising an alarm. If the answer is yes, then method 900 may proceed to block 906, at which point a medical alert may be raised to medical personnel of the patient's deterioration (e.g., similar to block 514 of
A pulse rate event 1020₁ may be raised, for instance, by vital signs measurement module 266 in response to vital sign acquisition camera 276 unobtrusively acquiring an updated pulse rate from a patient. As noted above, in some embodiments, vital signs measurement module 266 may provide (or “publish”) the patient's updated pulse rate (e.g., along with other information such as a patient identifier) to EPS module 270. EPS module 270 may then provide (or “publish”) the event to subscribing modules of patient monitoring system 252, such as deterioration detection module 268. The same process may be followed vis-à-vis other events depicted in
A respiration rate event 1020₂ may be raised, for instance, by vital signs measurement module 266 in response to vital sign acquisition camera 276 unobtrusively acquiring an updated respiration rate from a patient. In response to the respiration rate event 1020₂, in various embodiments, deterioration detection module 268 may, at block 1022₂, assess deterioration of the patient based on the patient's updated respiration rate. As is indicated in
At block 1102, one or more adjusted updated vital signs (in this example, pulse rate) may be generated, e.g., based on updated vital signs acquired automatically by vital sign acquisition camera(s) 276 (or manually by medical personnel). These adjusted updated vital signs may take into account various patient information that may be pertinent, including but not limited to drugs taken by the patient, gender, age, size, etc. The effect such patient information may have on vital signs may be set by medical personnel, and/or may be adjusted automatically, e.g., based on empirical evidence, one or more machine learning models, and so forth.
For example, patients taking beta-blockers may have a pulse rate that is approximately fifteen beats per minute (“bpm”) slower than the population average. Accordingly, at block 1104, it may be determined, e.g., from patient EMRs and/or information obtained from the patient at registration/triage contained in, for instance, database 272, whether the patient is currently on beta-blockers. If the answer is yes, then the measured pulse rate may be increased at block 1106 to account for that fact, e.g., by fifteen beats per minute. Of course, other adjustments are possible depending on the dosage of the beta-blockers taken by the patient, the size of the patient, the health of the patient, etc. Moreover, any increase/decrease applied as a result of beta-blocker usage may be selected, e.g., by medical personnel and/or automatically, e.g., based on empirical evidence, machine learning techniques, and so forth.
At blocks 1108 and 1112, it may be determined, e.g., using EMRs and/or registration/triage information contained, for instance, in database 272, whether the patient is male or female. If the patient is female, then the pulse rate may be increased at block 1110, e.g., by two beats per minute. This may account for a general difference between female and male pulse rates. Method 1100 may then proceed to block 1116. Likewise, if the patient is determined at block 1112 to be male, then the pulse rate may be decreased, e.g., by two beats per minute, and method 1100 may then proceed to block 1116. Of course, an increase or decrease of two beats per minute is for demonstration purposes only, and any increase/decrease in value may be used, e.g., as adjusted by medical personnel and/or automatically set, e.g., based on empirical evidence, machine learning techniques, and so forth.
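Put together, the example adjustments above might be transcribed as follows; the offsets of fifteen and two beats per minute are the illustrative values from the text, not clinically validated constants.

def adjusted_pulse_rate(measured_bpm: float, on_beta_blockers: bool, gender: str) -> float:
    """Generate an adjusted updated pulse rate from patient information."""
    adjusted = measured_bpm
    if on_beta_blockers:
        adjusted += 15.0          # beta-blockers slow pulse by roughly 15 bpm (example value)
    if gender.upper().startswith("F"):
        adjusted += 2.0           # example gender offset from the text
    elif gender.upper().startswith("M"):
        adjusted -= 2.0
    return adjusted

print(adjusted_pulse_rate(88.0, on_beta_blockers=True, gender="F"))   # -> 105.0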
At block 1116, a pulse rate score may be assigned to the patient, e.g., from a table. For example, in some embodiments, a lookup table such as Table 1, below, may be consulted to determine a patient's pulse rate score based on their adjusted pulse rate (“PR”). In this particular example, the top row represents ranges of adjusted pulse rates and the bottom row represents corresponding pulse rate scores:
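Table 1 itself is not reproduced here; in its place, the following sketch uses hypothetical stand-in ranges, chosen only so that, for instance, an adjusted pulse rate of 100 maps to a score of 0 while 101 maps to a score of 1, consistent with the discussion further below.

PULSE_RATE_SCORE_RANGES = [
    (50, 3),             # adjusted PR <= 50
    (60, 1),
    (100, 0),
    (110, 1),
    (130, 2),
    (float("inf"), 3),   # adjusted PR > 130
]

def pulse_rate_score(adjusted_pr: float) -> int:
    """Look up a pulse rate score from hypothetical adjusted-pulse-rate ranges."""
    for upper_bound, score in PULSE_RATE_SCORE_RANGES:
        if adjusted_pr <= upper_bound:
            return score
    return 3

print(pulse_rate_score(100), pulse_rate_score(101))   # -> 0 1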
These values/ranges and/or corresponding scores are not meant to be limiting. Rather, they constitute one example of how various pulse ranges may be used to assign pulse rate scores. One skilled in the art will understand that other ranges/scores may be selected, e.g., manually by medical personnel or automatically using various techniques, such as machine learning algorithms (e.g., neural networks), etc. For example, more granular tables may be established, e.g., based on data mining and/or expert opinions. In some cases, if such a table becomes sufficiently granular, the operation of determining whether a patient's deterioration qualifies as significant (block 1122, described below) may be obviated.
Additionally or alternatively, in some embodiments, various attributes of a patient (e.g., other vital signs, demographic data, etc.) may be used in conjunction with the patient's measured pulse rate, e.g., as input for one or more machine learning algorithms/models/classifiers that are configured to provide, as output, a “label” to the patient. In some cases, the output “label” may be, for instance, a pulse rate score. More generally, in various embodiments, feature vectors may be generated for a plurality of patients having known outcomes (e.g., “deteriorating,” “stable,” “not deteriorating,” etc.), and in some instances may be labeled with the known outcome. These feature vectors may then be used as training data for a machine learning model (e.g., a neural network). Subsequent patient information and/or associated vital signs (both previously acquired and updated) may then be used to build a new feature vector that is used as input for the machine learning model. Output of the machine learning model may include a label associated with the subsequent patient, such as “deteriorating,” “stable,” “not deteriorating,” etc.
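A hedged sketch of this feature-vector approach, using a simple logistic regression classifier and synthetic placeholder data purely for illustration, might look like the following; any suitable model (e.g., a neural network) could be substituted.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Feature vectors: [prior pulse, updated pulse, prior respiration, updated respiration]
X_train = np.array([
    [72, 75, 14, 15],
    [80, 78, 16, 15],
    [74, 105, 14, 24],
    [90, 128, 18, 28],
])
y_train = np.array(["not deteriorating", "not deteriorating",
                    "deteriorating", "deteriorating"])

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
new_patient = np.array([[76, 118, 15, 26]])
print(model.predict(new_patient)[0])   # expected label: "deteriorating"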
In yet other embodiments, rather than adjusting the patient's measured pulse rate (or more generally, a patient's updated vital sign measurement) based on gender/beta-blockers/size/age/etc., different tables may be selected for patients having different characteristics. For example, one pulse rate score table may be used for males, another for females, another for patients in a particular age range, and so forth. In such embodiments, the updated vital signs may or may not be adjusted according to various patient demographics, etc.
At block 1118, it may be determined, e.g., by deterioration detection module 268 based on the pulse rate score assigned at block 1116, whether the patient's pulse rate score is "worse" (or at least different) than a prior score, and/or whether the patient's pulse rate score reflects a change in the patient's condition relative to their previously-determined patient acuity indicator. For example, suppose a patient's previous pulse rate score (e.g., baseline) indicated a relatively high pulse rate, and the patient's new pulse rate score (e.g., determined from an updated pulse rate acquired by vital sign acquisition camera 276) indicates that the patient's pulse rate is slowing towards a "normal" and/or "healthy" pulse rate. In such a scenario, deterioration detection module 268 may determine that the patient's pulse rate score has not "worsened." Method 1100 may then proceed to block 1120, at which point it may be determined, e.g., by deterioration detection module 268, that no deterioration is detected. Method 1100 may then end. Likewise, if the patient's pulse rate score indicates that the patient's pulse rate is increasing towards a "normal" or "healthy" pulse rate (e.g., from a previously low rate), it may be determined that the score has not "worsened," and method 1100 may proceed to block 1120. However, if the answer at block 1118 is yes (e.g., the patient does not appear to be trending towards "normal" or "healthy"), method 1100 may proceed to block 1122.
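One non-limiting way to express the block 1118 determination, assuming the convention used in the illustrative table above (a score of zero corresponds to a "normal"/"healthy" range and larger scores indicate greater concern), is:

```python
# Sketch of the block 1118 check, under the assumed convention that 0 is
# "normal"/"healthy" and larger scores indicate greater concern. A score that
# moves toward 0 (trending toward normal) is not treated as a worsening.
def pulse_score_worsened(prior_score: int, new_score: int) -> bool:
    """True if the new score moved away from normal relative to the prior score."""
    return new_score > prior_score
```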
In some scenarios, a patient's baseline score may not be available. For example, and as was described above with respect to block 120 of
Flagging every patient who experiences a slight decline as deteriorating may not be helpful, especially when a slight decline may not be significant from a physiological standpoint. For example, strict use of lookup tables such as Table 1 above may lead to a baseline pulse rate value and a subsequent pulse rate value that are only one bpm apart (e.g., 100 versus 101) nonetheless indicating a difference in scores (e.g., 0 versus 1). Accordingly, at block 1122, it may be determined whether the patient's previous vital sign score and the patient's current vital sign score are "significantly" different. Whether a vital sign score is "significantly" worse than a previous vital sign score may be determined in various ways. In some embodiments, deterioration detection module 268 may raise an alert of patient deterioration only if a difference in the values exceeds some threshold, such as 5%, 10%, 20%, etc. Such thresholds may be manually or automatically established.
For example, some vital signs such as respiration rates may tend to be noisier, and consequently, thresholds used for those vital signs may be larger, so that measurement noise is less likely to trigger false alerts. Additionally or alternatively, in other embodiments, deterioration detection module 268 may employ absolute values, e.g., when a previous vital sign measurement is not available (e.g., for unregistered patients). In some embodiments, one or more trained models (e.g., regression model, neural network, deep learning network, etc.) and/or case-based reasoning algorithms may be used to determine thresholds that should be used to detect deterioration. For example, a model may be trained on a corpus of EMRs for which positive and/or negative outcomes are known. Trends reflected in various vital signs of those EMRs may be used in conjunction with known outcomes to train the model, so that subsequent vital signs (with a yet-to-be-determined outcome) associated with a new patient may be analyzed to determine whether deterioration is present.
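By way of non-limiting illustration, a simple relative-change check with assumed per-vital-sign thresholds might look like the following; the specific 10% and 20% figures are example values only, not thresholds prescribed by this disclosure:

```python
# Illustrative significance check for block 1122, with assumed per-vital-sign
# relative thresholds (a noisier signal such as respiration rate is given a
# larger threshold here so that noise is less likely to trigger an alert).
SIGNIFICANCE_THRESHOLDS = {
    "pulse_rate": 0.10,
    "respiration_rate": 0.20,
}

def is_significant_worsening(vital: str, previous: float, current: float) -> bool:
    """True if the relative change between measurements exceeds the vital's threshold."""
    if previous == 0:
        return False  # no usable baseline; absolute values could be used instead
    relative_change = abs(current - previous) / abs(previous)
    return relative_change > SIGNIFICANCE_THRESHOLDS.get(vital, 0.10)
```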
If the answer at block 1122 is no, then method 1100 may proceed to block 1120, which was explained previously. However, at block 1122, if the answer is yes, then method 1100 may proceed to block 1124. At block 1124, deterioration detection module 268 may publish an alert, e.g., to EPS module 270, that deterioration is detected. EPS module 270 may then publish an alert to various subscribers, such as alarm module 248 as described above. Alarm module 248 may then raise an appropriate alert, e.g., as was discussed at block 514 of
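The following is a minimal sketch of such a publish/subscribe flow; the event-bus class, topic name, and payload fields are assumptions made for illustration only and do not reflect the actual interfaces of deterioration detection module 268, EPS module 270, or alarm module 248:

```python
# Toy publish/subscribe flow for block 1124: the deterioration detection logic
# publishes an alert event, and a subscribed alarm handler reacts to it.
from collections import defaultdict
from typing import Callable, Dict, List

class EventPublicationSystem:
    """Minimal event bus: routes published events to subscribed callbacks."""
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(callback)

    def publish(self, topic: str, event: dict) -> None:
        for callback in self._subscribers[topic]:
            callback(event)

eps = EventPublicationSystem()
# An alarm handler subscribes to deterioration alerts...
eps.subscribe("deterioration_alert",
              lambda e: print(f"ALERT: patient {e['patient_id']} deteriorating"))
# ...and an alert is published when block 1122 answers "yes".
eps.publish("deterioration_alert", {"patient_id": "waiting-room-7", "vital": "pulse_rate"})
```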
Method 1100 in
While several embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, embodiments may be practiced otherwise than as specifically described and claimed. Inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.
All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.
The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”
The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of,” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e. “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.” “Consisting essentially of,” when used in the claims, shall have its ordinary meaning as used in the field of patent law.
As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or, equivalently, “at least one of A or B,” or, equivalently “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
It should also be understood that, unless clearly indicated to the contrary, in any methods claimed herein that include more than one step or act, the order of the steps or acts of the method is not necessarily limited to the order in which the steps or acts of the method are recited.
In the claims, as well as in the specification above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively, as set forth in the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03. It should be understood that certain expressions and reference signs used in the claims pursuant to Rule 6.2(b) of the Patent Cooperation Treaty (“PCT”) do not limit the scope.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/EP2017/078117 | 11/3/2017 | WO | 00

Number | Date | Country
---|---|---
62420629 | Nov 2016 | US