In some clinical and non-clinical settings, the constant monitoring of the well-being and positioning of patients or other subjects is necessary. In these settings, the use of a wearable or connecting patient monitoring system (PMS) or other monitoring device may not be available, desired, possible, or required. For example, patients may be admitted to emergency rooms, general hospitals, palliative care centers, or nursing homes. The patients might exhibit symptomology related to recovery from psychological, psychiatric, or stress symptoms of varying severity; or might be undergoing palliative, neo-natal, extended inpatient, or other long-term care. Babies, toddlers, the elderly, or other at-home individuals might also require constant well-being and positioning monitoring, while the use of wearable or connecting PMS or other devices is not practical, not possible, or not necessary. In some of these examples, patients may move between rooms or locations at these facilities.
When a well-being or monitoring device cannot be used, it might instead be necessary or desirable to have a skilled professional monitor an individual's well-being and positioning. However, such human monitoring can be cost prohibitive, impractical, or not possible due to lack of availability or other impediments.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
One or more techniques and systems described herein can be utilized for remote (e.g., non-contact) physiological well-being and positioning monitoring across a plurality of different locations or rooms. For example, systems and methods of monitoring, described herein, can utilize a combination of sensors to actively monitor a patient's well-being and positioning remotely from a distance, without the use of direct contact sensors applied to a patient. The system may be utilized to detect when a patient is mobile and moving from one location to another via continuous tracking and monitoring of the patient with a plurality of sensors or devices.
In one implementation for providing for improved monitoring of an individual, imaging data is received from a plurality of imaging sensors. The received imaging data is aggregated and patterns of targeted data segments of the aggregated imaging data are analyzed using one or more algorithms to determine one or more values. The one or more values determined from the analyzed patterns are classified to represent at least one of a physiological state determination or a positioning of the individual. Data corresponding to the classified one or more values is then output and displayed to a user as an indication of an individual's physiological state or positioning.
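By way of a non-limiting illustration, the following minimal sketch traces the receive, aggregate, analyze, classify, and output flow just described; the function names and the simple statistics are illustrative stand-ins for the disclosed (unspecified) algorithms, not the implementation itself.

```python
# Illustrative sketch only: simple statistics stand in for the
# disclosed pattern-analysis and classification algorithms.
import numpy as np

def aggregate(frames):
    """Combine imaging data received from a plurality of sensors."""
    return np.stack(frames)

def analyze_patterns(aggregated):
    """Determine one or more values from a targeted data segment."""
    roi = aggregated[:, 40:60, 40:60]   # hypothetical target segment
    return {"mean_intensity": float(roi.mean())}

def classify(values):
    """Map the determined values to a coarse state label."""
    return "normal" if values["mean_intensity"] < 128 else "review"

# Simulated frames from two imaging sensors (stand-ins for real data).
rng = np.random.default_rng(0)
frames = [rng.integers(0, 256, (100, 100)) for _ in range(2)]
values = analyze_patterns(aggregate(frames))
print(classify(values), values)         # output for display to a user
```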
In another implementation, a system for remotely monitoring at least one patient may comprise: a first subsystem comprising at least one image sensor for sensing imaging data of at least one patient and a user interface for displaying data associated with the at least one patient; and a second subsystem in communication with the first subsystem, the second subsystem comprising a processor and a computer-readable medium storing instructions that are operative upon execution by the processor to: receive the imaging data of the at least one patient from the first subsystem; aggregate the received imaging data for the at least one patient; analyze the aggregated imaging data using one or more algorithms to determine one or more data values associated with the at least one patient; classify the one or more data values determined from the aggregated imaging data to determine a physiological state or a position of the at least one patient; and transmit, to the first subsystem, data corresponding to the determined physiological state or the position of the at least one patient, wherein the determined physiological state or the position of the at least one patient is displayed via the user interface of the first subsystem.
In another implementation, a method carried out in a system for monitoring at least one patient may comprise: receiving, from a remote data acquisition system, imaging data of at least one patient, wherein the imaging data is acquired via at least one image sensor of the remote data acquisition system; aggregating the received imaging data for the at least one patient; analyzing the aggregated imaging data using one or more algorithms to determine one or more data values associated with the at least one patient; classifying the one or more data values determined from the aggregated imaging data to determine a physiological state or a position of the at least one patient; and transmitting, to the remote data acquisition system, data corresponding to the determined physiological state or the position of the at least one patient, wherein the data corresponding to the determined physiological state or the position of the at least one patient is displayed via a user interface of the remote data acquisition system.
In another implementation, a system for remotely monitoring a plurality of patients may comprise: a data acquisition system comprising a plurality of image sensors for sensing imaging data of a plurality of patients and a user interface for displaying data associated with the plurality of patients; and a data processing system in communication with the data acquisition system, the data processing system comprising a processor and a computer-readable medium storing instructions that are operative upon execution by the processor to: receive, from the data acquisition system, the imaging data of the plurality of patients; identify a first patient of the plurality of patients; register the first patient in memory; aggregate the received imaging data for the first patient of the plurality of patients, wherein aggregating includes aggregating imaging data for the first patient from each of the plurality of image sensors; analyze the aggregated imaging data for the first patient, using one or more algorithms, to determine one or more data values associated with the first patient; classify the one or more data values determined from the aggregated imaging data of the first patient to determine a physiological state or a position of the first patient; and transmit, to the data acquisition system, data corresponding to the determined physiological state or the position of the first patient, wherein the determined physiological state or the position of the first patient is displayed via the user interface of the data acquisition system.
To the accomplishment of the foregoing and related ends, the following description and annexed drawings set forth certain illustrative aspects and implementations. These are indicative of but a few of the various ways in which one or more aspects may be employed. Other aspects, advantages and novel features of the disclosure will become apparent from the following detailed description when considered in conjunction with the annexed drawings.
The claimed subject matter is now described with reference to the drawings, wherein like reference numerals are generally used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the claimed subject matter. It may be evident, however, that the claimed subject matter may be practiced without these specific details. In other instances, structures and devices are shown in block diagram form in order to facilitate describing the claimed subject matter.
There is a need for better decision support systems for caregivers: an alternative to current complex, expensive patient monitoring systems (PMS) that unnecessarily limit patient mobility and that are prone to caregiver error or caregiver fatigue. Alternatively, patients may be checked by nurses only a few times during their rounds, sometimes with hours between checks. Current remote tele-sitting services are a valuable patient care solution, but these services may lack parameter assessment and may rely on a remote caregiver monitoring video and sound feeds of multiple remote patient rooms. In these scenarios, errors may occur.
In one aspect, a real-time, no-contact, AI-enabled, autonomous decision support system that continuously reads and trends a target patient's condition can be devised. In this aspect, for example, such a system can utilize customizable statuses and alerts that may be integrated into traditional workstations and mobile platforms. The systems described herein can improve patient care and have the potential to reduce a patient's adverse events when used in high patient-to-nurse (PTN) scenarios. In some implementations, the example systems can combine imaging hardware with AI trained to recognize target situations to provide customizable user interface experiences, including physiological conditions and patient movements. In this aspect, the systems and methods described herein can provide a real-time, zero-contact, autonomous decision support system for caregivers. For example, the exemplary systems can provide real-time feedback of a subject's well-being based on multi-physiological parameters, subject motion, and positioning change analysis using proprietary AI algorithms. The systems can continuously read data and trend parameters against automatically defined or user-defined baselines.
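As one hedged illustration of such trending, the sketch below keeps a rolling baseline for a single parameter and flags deviations; the window size, tolerance, and names are assumptions for illustration and are not taken from the disclosure.

```python
from collections import deque

class BaselineTrender:
    """Trend a parameter against an automatic or user-defined baseline.

    Illustrative sketch: window and tolerance are assumed values.
    """

    def __init__(self, window=30, tolerance=0.2, user_baseline=None):
        self.readings = deque(maxlen=window)  # recent samples
        self.tolerance = tolerance            # allowed fractional deviation
        self.user_baseline = user_baseline    # optional user-defined baseline

    def baseline(self):
        if self.user_baseline is not None:
            return self.user_baseline
        if not self.readings:
            return None                       # still establishing baseline
        return sum(self.readings) / len(self.readings)

    def update(self, value):
        """Add a reading; return True if it deviates from the baseline."""
        base = self.baseline()
        self.readings.append(value)
        return base is not None and abs(value - base) > self.tolerance * base

hr = BaselineTrender()
for sample in (72, 74, 71, 73, 72, 96):
    if hr.update(sample):
        print(f"alert: {sample} bpm deviates from the rolling baseline")
```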
The methods and systems disclosed herein, for example, may be suitable for use in a myriad of situations (e.g., hospital room, emergency room, nursing home room, patient bedroom, etc.) having one or more target individuals disposed therein to provide one or more monitoring features. For example, automated remote monitoring of various implementations utilizes one or more combinations and applications of imaging sensors to distinguish patterns of aggregated data from the imaging sensors in real-time to provide well-being and positioning monitoring information. It should be appreciated that examples described herein can be used in different settings or environments in combination with various different types of imaging sensors. The examples given herein are merely for illustrative purposes.
In some implementations, the methods and systems described herein are able to detect and report a plurality of physiological and position/motion conditions for a target patient. For example, the systems can detect the heart rate of a target patient, such as indicated by blood perfusion, which is the rate at which blood is delivered to tissues, such as in the capillary beds. Further, breathing can be detected, as indicated by tidal respiration volume. Additionally, the motion of the target patient can be detected, such as ocular motion, global (e.g., overall) motion, motion of extremities, head motion, and foreign object motion in the target field. Positioning of the target patient may be detected, such as the subject elevation (e.g., height above the ground, bed, etc.), a global scene map of objects in the target field, and the relation of the subject to the global scene map. Other physiological conditions may be monitored, such as patient temperature (e.g., forehead temperature, global temperature, distal temperature, ROI temperature) and the temperatures of other objects in the field of view. These indications can be collected and provided in a user interface to a caregiver, and/or to an automated alert system, to help monitor one or more patients in real-time.
As an example, in some implementations, indications of blood perfusion, relating to heart rate monitoring, can provide early indication of conditions, and possible prevention. For example, skin changes, swelling, hot flashes, cognitive changes, fatigue, clammy or paling skin, skin breakdown (pressure ulcers), infection, and necrosis may be indicated by blood perfusion monitoring. Further, internal bleeding, heart alterations, myocarditis, intravascular volume status, and stroke volume are other conditions that can be indicated by blood perfusion monitoring. Additionally, other useful indicators using blood perfusion can include: single or multiple organ failure in trauma patients, for use with multiple-patient observation at lower cost to triage change over time and/or stability; sepsis progression and organ function degradation leading to failure; single organ failure leading to a systemic chain reaction; response to a vasoconstriction regimen, with a central versus peripheral differential tied to the vasoconstriction medication regimen; and perfusion change as an indicator of subsequent parameter issue advancement.
As another example, in some implementations, indications of tidal respiration, relating to breathing monitoring, can provide early indication of conditions, and possible prevention. For example, rapid breathing, slowing breathing, and erratic breathing are all indicative of downstream issues. Other useful indicators can include pre- and post-intervention assessment with objective trending related to breathing, RSV, Covid-19, asthmatic, and upper respiratory issues. Other tidal respiration data can indicate hallucination onset, seizure, stroke, or cardiac risk, as well as psychiatric issues, such as anxiety, confusion, flight risk, or self-harm. Additionally, tidal respiration may help indicate response to titrated analgesics where, if apnea occurs, the condition could lead to arrest. Further indications may include over- or under-ventilation and CO2 rise or downward trends.
As another example, subject motion monitoring can provide early indication and prevention of certain conditions. For example, ocular motion monitoring can detect a level of consciousness, fall risk, or alertness; and tumor presence or progression, concussion, diabetic retinopathy, retinal detachment, hallucinations, macular degeneration, retinal edema, and seizures. Head motion and global motion monitoring can provide indication of seizures, level of consciousness, confusion, and stroke. Extremities motion monitoring can provide indications of a fall candidate, infusion tubing removal risk, TIA mini-strokes, seizures, possible respiratory distress, and a cardiac event. Further, foreign object motion monitoring can provide indications of a failure to interact with a bed-side table successfully, perception or eyesight failure, and in-room equipment failure, such as an IV pole falling or bed rails being moved/covers moving erratically; it may also indicate Munchausen Syndrome (a deliberate attempt to harm a patient, such as parents to children or adult children to parents, including intentionally resetting alarms and/or pushing drugs into IVs). Additionally, motion monitoring may be able to help with psychiatric ward monitoring, such as objective assessment of medication effectiveness, aggression trending, and patient assessments.
As another example, patient temperature monitoring can provide early indication and prevention of certain conditions. For example, an increase in temperature could indicate an infection (viral or bacterial), heat exhaustion, inflammatory conditions (e.g., RA), or malignant tumors; a decrease may indicate a loss of fat or changes in medication reaction and/or the effect of medications the patient is on; and a sudden change from the normal range may indicate hypothermia or that systems are shutting down. A distal temperature monitor can provide indications of perfusion issues (see Perfusion), renal disease impact, diabetic conditions, and susceptibility to chronic infection, which may facilitate hospice and family caregiving during final moments. Further, ROI temperature monitoring can provide indications of IV infiltration, IV extravasation, changes in tissues, surgical site infection, tubing-to-tissue infection, incisions, scarring infection, and areas of trauma that may be degrading.
As another example, patient positioning monitoring can provide other indications of conditions that may be prevented or mitigated. For example, patient positioning monitoring may identify erratic movement such as thrashing, rapid change of positioning, decrease of motion compared to prior mobility. Other observations may include a reduction in motion or slowing changes, which may be equally valuable to alert a caregiver of a position change that could be leading to a fall, or a lack of position change that could be leading to a long-term tissue elevated pressure. Such a system may also infer a long-term, real-time pressure mapping of patients in bed, based on motion.
As one illustrative example, a depiction of a process 100 for physiological well-being and positioning monitoring is illustrated in
The acquired data is then aggregated at 104. For example, one or more data aggregation operations are performed to combine different data acquired by the one or more imaging sensors 120. It should be appreciated that the imaging sensors 120 can be located in different locations within the room and have different positions, orientations, fields of view, etc., such as based on the type of images to be acquired, the processing to be performed on the acquired image data (also referred to as imaging data), etc. As such, different perspectives or views of the room or portions of the room can be acquired. In some implementations, the data aggregation is performed on different categories of acquired data. For example, as shown in
In one implementation, the processing at 106a by a combination of the one or more algorithms analyzes recognized patterns at targeted data segments and classifies values against configurable thresholds to represent physiological state determinations. In one implementation, the processing at 106b by a combination of the one or more algorithms analyzes the recognized patterns at targeted data segments and classifies values against configurable areas of interest to represent the positioning of the patient.
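The following is a minimal sketch of this two-branch classification, assuming simple numeric thresholds and rectangular areas of interest; all thresholds, coordinates, and names are illustrative, not disclosed values.

```python
from dataclasses import dataclass

@dataclass
class Threshold:
    low: float
    high: float

# Configurable physiological thresholds (illustrative defaults).
THRESHOLDS = {
    "heart_rate_bpm": Threshold(50, 110),
    "respiration_bpm": Threshold(8, 25),
}

# Configurable areas of interest: named rectangles in room coordinates.
AREAS = {
    "bed": (0.0, 0.0, 2.0, 1.0),       # (x_min, y_min, x_max, y_max)
    "doorway": (4.0, 0.0, 5.0, 1.0),
}

def classify_physiology(values):
    """Classify each determined value against its configured threshold."""
    return {
        name: "normal" if THRESHOLDS[name].low <= v <= THRESHOLDS[name].high
        else "alert"
        for name, v in values.items() if name in THRESHOLDS
    }

def classify_position(x, y):
    """Map a subject's (x, y) location to a configured area of interest."""
    for area, (x0, y0, x1, y1) in AREAS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return area
    return "unclassified"

print(classify_physiology({"heart_rate_bpm": 118, "respiration_bpm": 14}))
print(classify_position(4.3, 0.5))
```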
In some implementations, the physiological state determinations and the positioning of the patient results are then combined to enhance the available information and monitoring of the patient. For example, the output of the processing of the datasets by the one or more algorithms is then combined at 108 and results prepared at 110, such as for display at a user interface at 112. In one implementation, the output from the processing by the one or more algorithms is encrypted and the results are prepared by performing an automatic function that receives the encrypted algorithm results, decrypts the results, and organizes the results for display via a user interface, such as on an end-user device 200 (see
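As a hedged sketch of this encrypt-transmit-decrypt-organize flow, the example below uses symmetric encryption from the Python cryptography library; the disclosure does not specify a cipher, and the result fields are illustrative.

```python
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, provisioned to both subsystems
cipher = Fernet(key)

# Processing side: encrypt the algorithm results before transmission.
results = {"heart_rate_bpm": 72, "position": "bed"}
ciphertext = cipher.encrypt(json.dumps(results).encode("utf-8"))

# End-user device side: decrypt and organize the results for display.
decrypted = json.loads(cipher.decrypt(ciphertext).decode("utf-8"))
for name, value in sorted(decrypted.items()):
    print(f"{name}: {value}")
```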
Thus, in one implementation, a combination of algorithms registers, aggregates, and combines data from at least two imaging sensors 120 to augment the usefulness of the resulting data set for analysis and subsequent inferences. That is, more effective well-being and positioning monitoring can be performed using the process 100. In some examples, the process 100 allows for real-time, remote, autonomous, patient physiological well-being and positioning monitoring.
In some implementations, a combination of algorithms harvests data from one or more data repositories for classification, curation, and augmentation, for the analysis thereof and subsequent inferences. For example, one or more datasets are acquired from stored data and used in the various implementations to perform one or more operations and/or enhance one or more operations.
As illustrated in
As can be seen in
A system configuration of one implementation is illustrated in
In some examples, one or more of the control system 302, the image recognition system 304, and the user interface 306 are configured as sub-systems that together implement the process 100. In the illustrated example, the system 300 includes a main power supply 320 that routes power to a power supply 308 of the control system 302 and a power supply 310 of the image recognition system 304. For example, the power supplies 308, 310 can be local or “on-board” power supplies that power the components of each of the control system 302 and image recognition system 304.
In the illustrated implementation, the image recognition system 304 further includes an image acquisition system 322 that receives image data of one or more subjects 324 and an image processing module 326 that processes the received image data. For example, the image processing module 326 pre-processes or filters received images to be processed by the control system 302. In one example, the image processing module 326 is configured to perform segmentation or other object detection techniques on the received image data to identify objects or data of interest in the images. The image recognition system also includes a communication module 328 that allows for communication with, for example, the control system 302. Additionally, the image processing module 326 in some examples then analyzes the image data to determine well-being and/or positioning information as described in more detail herein. For example, the image processing module 326 uses the one or more algorithms to identify different properties or characteristics of the image data corresponding to a state or condition of the subject(s) 324.
The control system 302 further includes a resolver engine 312 and an event logger 314 connected to a communication module 316 (e.g., a wireless communication device). The control system 302 is configured to receive data from the image recognition system 304 and prepare the data for transmission and display on the end-user device 200. For example, processed and prepared data is wirelessly transmitted by the communication module 316 to a receiver (not shown) of the end-user device 200, which then displays the data via the user interface 306. For example, the resolver engine 312 organizes, filters, and/or sorts the processed data for transmission and display via the user interface 306. The event logger 314 tracks the data that is communicated to and from the control system 302.
In operation, the control system 302 in some examples is configured as a sub-system that organizes and encrypts the results of the algorithm processing described herein and transmits the results to the end-user device 200, such as a remote device (e.g., smartphone, tablet, or other workstation) for interfacing and/or interaction by a user with the user interface 306 (e.g., via display and sound interfacing). In one implementation, the end-user device 200 is configured (e.g., has an application installed thereon) to receive the encrypted results of the algorithm processing, decrypt the results, and organize the results for one or multiple end-users using graphical and sound formats.
In some examples, the control system is configured as a sub-system that chronologically organizes the results of the algorithm processing (using the event logger 314) in text format and keeps an auditable, size-configurable first-in first-out (FIFO) log of results. The ordering in one example is based on a time-stamping of the processed image data.
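A minimal sketch of such a size-configurable FIFO log, assuming in-memory storage and ISO time stamps (both illustrative choices), follows.

```python
from collections import deque
from datetime import datetime, timezone

class EventLog:
    def __init__(self, max_entries=1000):
        # Oldest entries are evicted first once the limit is reached.
        self._entries = deque(maxlen=max_entries)

    def record(self, message):
        stamp = datetime.now(timezone.utc).isoformat()
        self._entries.append(f"{stamp} {message}")

    def dump(self):
        """Return entries in chronological order for auditing."""
        return list(self._entries)

log = EventLog(max_entries=3)
for event in ("HR=72", "HR=74", "position=bed", "HR=96 alert"):
    log.record(event)
print("\n".join(log.dump()))  # only the three most recent entries remain
```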
Various examples are operable in different environments and for different applications, such as where well-being and patient positioning monitoring is needed while no wearable or connecting PMS is required, desired, or is viable, including, but not limited to:
It should be appreciated that the problems overcome by one or more implementations of the present disclosure are applicable to patient monitoring at one or more different stages of the healthcare continuum, where patient well-being and/or patient positioning is necessary, but in which a connecting or wearable PMS or other device is not practical, not possible, or is not necessary.
It should further be noted that the one or more sensors in various implementations are not diagnostic devices, but devices that only capture image data within the room. That is, the sensors do not directly acquire diagnostic data, but instead acquire image data that can be analyzed for monitoring purposes, and to support a caregiver's decision-making process as described in more detail herein. As a result, the various examples are useful in more applications, including applications where monitoring sensors cannot or are not desired to be used.
At operation 404, the acquired image data is aggregated. For example, the image data from a set of cameras within a room is combined into a larger image dataset that is stored for analysis. In some examples, the aggregated data includes only data relating to monitoring activity for the patient as described in more detail herein. In one example, the image data is segmented or filtered to include data (e.g., image pixel data) of the patient and the immediately surrounding area (e.g., patient bed).
At operation 406, the aggregated image data is analyzed to determine, for example, physiological and/or positioning data for the monitored patient. For example, using one or more algorithms as described herein, the image data is processed to determine the physiological and/or positioning data. For example, one or more different image processing techniques can be used to determine respiration, blood perfusion, heart rate, and ocular motion (or other bio-signals, such as temperature, brain activity, etc.), among others. In one example, the imaging data from the different imaging sensors is processed to detect pixelation, such as in the eyes, ears, cheeks, etc. of the patient. In some examples, the processing is performed on image data within a defined spectrum, such as the infrared (IR) spectrum (e.g., beyond the approximately 350 nm-750 nm visible light spectrum). It should be noted that in various examples, calibration is automatically set, such as to distinguish between the background and the person (e.g., the patient).
As another example, a light sensor is used wherein an IR grid of dots is transmitted or projected and one or more image cameras “read” the dots. The changing geometry of dots and different depths can be used in the analysis to identify the different patient properties, states, etc.
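One plausible reading of this arrangement is standard structured-light triangulation, sketched below under assumed calibration values (the focal length and emitter-camera baseline are illustrative, not disclosed).

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px=600.0, baseline_m=0.075):
    """Triangulate per-dot depth: Z = f * B / d.

    disparity_px: observed shift of each projected IR dot versus its
    reference position, in pixels.
    """
    d = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        z = focal_px * baseline_m / d
    z[~np.isfinite(z)] = 0.0   # dots with no measurable shift
    return z

print(depth_from_disparity([15.0, 30.0, 60.0]))  # nearer dots shift more
```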
As still another example, when imaging portions of the skin that are thin, the image data is analyzed to determine changes (e.g., delta changes) of the skin between image frames. Using a correlation (e.g., 1:1), changes in the coloration are representative of changes in blood perfusion.
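A minimal sketch of this frame-to-frame analysis, assuming an RGB stream and a fixed skin region of interest (ROI), is shown below; the green channel is used because it is commonly the most sensitive to blood-volume changes in remote photoplethysmography.

```python
import numpy as np

def perfusion_signal(frames, roi):
    """Mean green-channel change inside the ROI between successive frames.

    frames: iterable of HxWx3 uint8 arrays; roi: (y0, y1, x0, x1).
    """
    y0, y1, x0, x1 = roi
    deltas, prev = [], None
    for frame in frames:
        mean = frame[y0:y1, x0:x1, 1].astype(float).mean()  # green channel
        if prev is not None:
            deltas.append(mean - prev)   # coloration delta between frames
        prev = mean
    return np.array(deltas)              # oscillates near the pulse rate
```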
As yet another example, changes in pixelation between the nose and mouth, and changes in temperature, can be analyzed to determine respiration, including tidal volume over time.
In some examples, the analysis performed at 406 is based on or uses machine learning. For example, online laboratory simulations can be performed to improve the analysis for different desired properties. It should be noted that any machine learning techniques can be used.
Physiological and/or positioning data is output at 408 and displayed at 410. For example, colored respiration data or heart rate data can be displayed, with the coloring determined based on one or more thresholds. The output data can be representative of different states of the patient, for example, the wakefulness of the patient (e.g., whether the patient is awake or asleep, in distress, etc.). The data can then be used to proactively support the decision on patient care and can be displayed on different devices, in real time, for that purpose (e.g., a visual on a mobile phone).
Thus, one or more implementations allow for proactive, touchless communication, for example, to a nurse or other caregiver responsible for the care of the patient. The data or information provided to the nurse or other caregiver can be used to make decisions or set different parameters. For example, a "geofence" can be set to determine movement beyond a visual barrier (shown, e.g., on the mobile phone). That is, a monitored area can be set, as sketched below. As another example, patient movement can be tracked, such as very small or minute motion (e.g., down to 200 milliseconds (ms)), to identify such movement for patient treatment (such as during surgery) or for imaging use.
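A minimal sketch of such a geofence check, using a standard ray-casting point-in-polygon test over illustrative room coordinates, follows.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test for a point against a polygon [(x, y), ...]."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x0, y0 = polygon[i]
        x1, y1 = polygon[(i + 1) % n]
        if (y0 > y) != (y1 > y) and \
           x < (x1 - x0) * (y - y0) / (y1 - y0) + x0:
            inside = not inside
    return inside

GEOFENCE = [(0.0, 0.0), (5.0, 0.0), (5.0, 4.0), (0.0, 4.0)]  # monitored area

def check_position(x, y):
    if not point_in_polygon(x, y, GEOFENCE):
        return "alert: subject moved beyond the monitored boundary"
    return "ok"

print(check_position(2.5, 2.0))  # ok
print(check_position(6.0, 2.0))  # alert
```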
It should be noted that the implementations described herein can be used in different environments, such as a hospital, nursing home, rehabilitation center, etc. Additionally, the herein described implementations can be used to monitor individuals of any age (e.g., babies, infants, elderly) or non-humans (e.g., animals). In some examples, the imaging devices include processing capabilities to perform one or more operations described herein. As such, application-specific sensors are configured in some examples, such as a nursing sensor or a baby monitor. In one example, a monitor for small delta (variation) changes can be used, such as for cyber-knife treatment for cancer, wherein real-time updates to the movement of the patient can be used to precisely apply radiation to a treatment site. In some examples, azimuths of the imaging space are updated within the system to adjust the imaging, such that no calibration or human interactions are needed.
Further, the patient monitoring device 500 can comprise a thermal camera 504 that is configured to capture thermal imagery 606 of the target area 604, such as one that detects temperatures and generates a thermal map that distinguishes different temperatures by different colors. In this example, the thermal imagery 606 can distinguish temperature by colors to create a thermal map and to detect potential anomalies and/or certain conditions, as described above. The patient monitoring device 500 can also comprise an infrared light emitter 506 that can be used in conjunction with an infrared camera 508. The infrared light emitter 506 can produce infrared light that can reflect off objects in the target area 604, which may be detected by the infrared camera 508. This can allow for image capture in low (visible) light situations.
In this implementation, the example device 500 can be configured to be readily modular, such that it can easily be moved from a first location to a second, desired location. Communication between the device and a remote computer (e.g., workstation, laptop, tablet, handheld, etc.) can be through wireless protocols, such as Wi-Fi, cellular, near-field, Bluetooth, etc. In some implementations, a wired solution can be provided that allows for coupling the device to communication cables (e.g., Cat 5/10) for communicating with a base station, and/or a modem for wireless communication.
As another example, in
As another example, in
In an implementation, the patient monitoring device 500, or a series of patient monitoring devices 500, can be configured to monitor at least one patient as the patient moves between a plurality of rooms or locations. It may be desirable to allow a patient to navigate, move, or walk from one room to another as it can instill a sense of freedom in the patient. Facilities may also have more than one room accessible by patients and may wish to monitor the patient or patients in each of the rooms or locations. By way of example, a patient may be located in an extended stay facility with a bedroom and a living room. A series of patient monitoring devices 500 may be used to monitor the patient's movement from one room to another so that the patient is monitored regardless of location.
The monitoring system 300 may include a single patient monitoring device 500 or may include a plurality of patient monitoring devices 500. Moreover, in some implementations, an ecosystem may include multiple monitoring systems 300, or subsystems, having multiple patient monitoring devices 500. It should be appreciated that any number of patient monitoring devices 500, systems 300, subsystems, or components thereof may be utilized to achieve a desired patient-monitoring scheme. Various references to the system 300 may be made herein and it should be appreciated that this may mean a single system 300 or a combination of connected systems 300 (e.g., an ecosystem).
Each location or room may be configured with at least one patient monitoring device 500 or system 300 to monitor at least one patient in a respective location. For example, in a facility having a bedroom, a kitchen, a hallway, a living room, and a bathroom, there may be one patient monitoring device 500 in each of the mentioned rooms or hallways. In some instances, more than one patient monitoring device 500 can be implemented in a single room if required. It should be appreciated that any number of patient monitoring devices 500 or systems 300 may be utilized to monitor patients in any number of rooms, locations, hallways, outdoor areas, gardens, walking paths, etc. In this manner, the system 300, ecosystem, or a series of ecosystems can monitor the patients in any number of locations or as they move or walk between locations so long as the patients are detected by at least one of the patient monitoring devices 500. In some instances, or configurations, a patient may be monitored by a plurality of patient monitoring devices 500 or systems 300 at any given time. In this example, data from multiple sources may be combined and/or analyzed to determine parameters of the patient or patients.
In one implementation, to achieve patient monitoring between a plurality of rooms or locations, the system 300 or ecosystem can detect and register a single patient or multiple patients within a given location. The process of registering a patient may include at least the steps of assigning a patient to a caregiver and/or identifying the patient as a subject of interest such that the system 300 is configured to track the patient. The system 300 may nonetheless track and monitor both registered and unregistered patients. It should be appreciated that the system 300 can identify and discern each patient from one another so that each patient (either registered or unregistered) is individually identifiable and monitored as desired.
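The following is a minimal sketch, with illustrative names only, of registering patients to caregivers while still tracking unregistered subjects.

```python
class PatientRegistry:
    def __init__(self):
        # subject_id -> {"registered": bool, "caregiver": str | None}
        self.subjects = {}

    def observe(self, subject_id):
        """Track any detected subject, registered or not."""
        self.subjects.setdefault(
            subject_id, {"registered": False, "caregiver": None})

    def register(self, subject_id, caregiver):
        """Assign a caregiver, marking the subject as registered."""
        self.observe(subject_id)
        self.subjects[subject_id].update(registered=True, caregiver=caregiver)

registry = PatientRegistry()
registry.observe("track-07")              # unregistered, still monitored
registry.register("track-03", "Nurse A")  # registered to a caregiver
print(registry.subjects)
```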
In addition, as multiple subjects might be visible within a field of view of one of the systems 300, patient monitoring devices 500, or other connected devices, the system 300 can discern, based on previous patient registration or by identifying an unregistered patient, bio-signals of these multiple subjects individually. Thus, the system 300 may maintain the data pertinent to each subject or patient as its own data for inferences, both for previously registered patients and newly identified patients. A caregiver, or other observer or administrator, can access the inferred data from each patient and may assign a record to each newly identified patient for continued observation (e.g., as the means for registration). Alternatively, a patient that might not be registered might still be observed by a patient monitoring device 500, without registration, and the patient's real-time inferred data might still be available to an observer or caregiver in real-time. In some instances, multiple caregivers may be assigned to a single patient; multiple patients may be assigned to a single caregiver; or any combination thereof.
In an implementation that includes an ecosystem of multiple systems 300 or a plurality of subsystems, a patient may be registered for multi-site or multi-location observation, so that any system 300, subsystem, or device within a defined ecosystem may identify a patient as the patient becomes visible to one or more systems within the ecosystem. A sub-system of the ecosystem may autonomously register a patient after the patient is assigned by a caregiver. For example, each building of a hospital campus may comprise a system 300, so that each of the systems 300 communicate together, forming a larger ecosystem of subsystems. In other examples, the same hospital campus may operate using a single system 300 with multiple components thereof.
To monitor multiple patients, software containers (e.g., containerization) may be implemented for any of the systems or subsystems described herein. By way of example, one instance of a software container may be used for each patient so that data, detection, or analysis for patients can be separated and processed in an efficient manner. As another example, an instance of multiple software containers may be utilized to monitor each patient. In this example, an instance may include software containers for cardiac function, respiratory function, and/or temperature. One instance or group of software containers may be used for monitoring each patient or subject.
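As a hedged sketch of this per-patient containerization, the example below uses the Docker SDK for Python to start one container per monitored function for a patient; the image name and environment variables are hypothetical.

```python
import docker

client = docker.from_env()

def start_patient_containers(
        patient_id,
        functions=("cardiac", "respiratory", "temperature")):
    """Start one isolated container per monitored function for a patient."""
    return [
        client.containers.run(
            image="monitoring/analysis:latest",   # hypothetical image
            name=f"patient-{patient_id}-{fn}",
            environment={"PATIENT_ID": patient_id, "FUNCTION": fn},
            detach=True,
        )
        for fn in functions
    ]

start_patient_containers("12345")
```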
In an implementation, the system 300 may issue an alarm when a patient or patients enter an unauthorized room or location. Specifically, a user or caregiver may identify portions of a room or specific rooms in a facility as off limits or as areas of concern. When a patient enters one of these locations or rooms, the system can alert the caregiver so that attention can be paid to the patient. This may provide numerous benefits for caregivers assigned to monitor multiple patients, as it can be difficult to keep track of multiple patients at a given time. In this manner, the system 300 may assist by alerting the caregiver of events that may have otherwise gone undetected. This may prevent or mitigate harm to patients or others.
In addition, the system 300, based on real-time data read from one or more connected devices, autonomously baselines, offsets, and normalizes bio-signals from each subject, based on novel algorithms and methods for data assessment and data management, as the system reads and infers the subject-specific bio-signals. Moreover, as a single subject or multiple subjects are registered for observation, the subjects' data for inference, and the inferred data, is assigned individual markers for containment, as pertinent to each subject that is being observed. With respect to the current disclosure, a combination of novel algorithms may analyze the recognized patterns at targeted data segments and may classify values against configurable thresholds to represent physiological state determination. A combination of novel algorithms can analyze the recognized patterns at targeted data segments and may classify values against configurable areas of interest to represent the positioning of the patient. A sub-system may organize and encrypt the algorithm processing results and may transmit such results to off-the-shelf smartphones, tablets, or other workstations for user interface via display and sound. A sub-system may receive the encrypted algorithm results, decrypt them, and organize the results for one or multiple end-users using graphical and sound formats. A sub-system may chronologically organize the algorithm's processed results in text format and keep an auditable, size-configurable FIFO (First In First Out) log of results. Similarly, a sub-system may autonomously register a patient after the patient is assigned by a caregiver, and track the patient while he/she is visible within the field of view of a plurality of systems for remote monitoring.
The process may include the following steps. First, for a single-system ecosystem, once the system is enabled, imaging sensors acquire and aggregate data. Then an automatic or manual function may determine bio-signals (biometrics/physiology parameters). A second function may determine or set boundaries and positioning parameters. Then, an automatic function may receive the encrypted algorithm results, decrypt them, and organize the results for one or multiple end-users using graphical and sound formats. In addition, the system, based on real-time readings from one of its connected devices, autonomously baselines, offsets, and normalizes the bio-signals from each subject, based on novel algorithms and methods for data assessment and data management, as the system reads and infers the subject-specific bio-signals.
When multiple systems are enabled, a subject is registered for multi-site observation, so any system within a defined ecosystem can identify a subject as the subject becomes visible to one or more systems or rooms within the ecosystem. A sub-system can autonomously register a patient after the patient is assigned by a caregiver and track the patient while he/she is visible within the field of view of a plurality of systems for remote monitoring.
The advantages of the methods and systems disclosed herein are applicable to patient monitoring at certain stages of the healthcare continuum where patient well-being and/or patient positioning monitoring is necessary, but in which a connecting or wearable PMS or other device is not practical, not possible, or not necessary.
With reference to
Turning to
An MQTT broker is a server that can manage communication between clients or devices in a Message Queuing Telemetry Transport (MQTT) network. The MQTT broker can act as an intermediary that receives messages from devices (e.g., publishers) and may route the messages to appropriate subscribers based on topic filters. MQTT may offer improvements over conventional communication schemes as it is lightweight, efficient in handling large volumes of data, and is ideal for low-bandwidth, high-latency, or unreliable network conditions. By way of example, the patient monitoring devices 500 (e.g., edge devices) may communicate with the admin system 1102 and the user interface 1104 (clinical clients) via an MQTT broker.
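A minimal sketch of this publish/subscribe arrangement, assuming the paho-mqtt 2.x client and hypothetical broker and topic names, follows.

```python
import json
import paho.mqtt.client as mqtt

BROKER = "broker.example.hospital"   # hypothetical broker host

# Edge device (publisher): send processed results for one patient.
pub = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
pub.connect(BROKER, 1883)
pub.publish("patients/12345/vitals",
            json.dumps({"heart_rate_bpm": 72, "respiration_bpm": 14}))
pub.disconnect()

# Clinical client (subscriber): receive results by topic filter.
def on_message(client, userdata, msg):
    print(msg.topic, json.loads(msg.payload))

sub = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
sub.on_message = on_message
sub.connect(BROKER, 1883)
sub.subscribe("patients/+/vitals")   # '+' matches any single patient id
sub.loop_forever()
```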
In the example illustrated in
By way of example, the system 1200 may also communicate with and receive data from devices other than the patient monitoring devices 500. For instance, the system 1200, or any other system described herein, may receive additional sensor data from original equipment manufacturer (OEM) systems. In other words, the system 1200 may be a stand-alone system, or it may interact with and receive additional data from an existing OEM system. By way of example, an OEM system may comprise an existing video monitoring system or patient monitoring system currently being utilized by a hospital or medical facility. The system 1200 may receive imaging data from the OEM system in addition to imaging data received by the patient monitoring devices 500. In this manner, the system 1200 may aggregate and analyze data from the system 1200 and from a remote OEM system without deviating from the scope of the disclosure.
Turning to
The first subsystem 1302 may also comprise a sensor manager 1308 for managing data from the various sensors 1306. The sensor manager 1308 may output raw sensor data 1310 from the various sensors 1306. The raw sensor data 1310 may be communicated, via the communications module 1312, to the second subsystem 1304. The communications module 1312 may be used to facilitate local area network (LAN) communications, wireless communication, or any other wired or wireless communication without deviating from the scope of the invention. In some embodiments, the communication between the first subsystem 1302 and the second subsystem 1304 may be accomplished via a cloud network or an MQTT broker, as discussed above with respect to system 1200.
The second subsystem 1304 may comprise a data acquisition module 1314 and a data aggregation module 1316. The data acquisition module 1314 and the data aggregation module 1316 may operate similarly to the data acquisition and aggregation method described above with respect to
That is, one or more imaging sensors 1306 acquire data of one or more objects within a room. One particular example includes one or more imaging sensors 1306 configured to acquire images of a patient within a hospital room for well-being and positioning monitoring. The acquired data from module 1314 is then aggregated at module 1316. For example, one or more data aggregation operations are performed to combine different data acquired by the one or more imaging sensors 1306. It should be appreciated that the imaging sensors 1306 can be located in different locations within the room and have different positions, orientations, fields of view, etc., such as based on the type of images to be acquired, the processing to be performed on the acquired image data (also referred to as imaging data), etc. As such, different perspectives or views of the room or portions of the room can be acquired. In some implementations, the data aggregation is performed on different categories of acquired data. For example, physiological and position data are separately aggregated into datasets and then real-time processing of each of the datasets is performed using one or more algorithms. In some implementations, when processing the datasets, an automatic or manual function sets biometrics/physiology parameters, and a different function sets boundaries and positioning parameters. That is, different parameters are used or set for processing each of the datasets.
Similarly, as described above, real time physiological and real time position data may be determined using the aggregated data for one or more patients. That is, processing by a combination of the one or more algorithms may analyze recognized patterns at targeted data segments and may classify values against configurable thresholds to represent physiological state determinations of a plurality of patients or subjects. In one implementation, the processing by a combination of the one or more algorithms analyzes the recognized patterns at targeted data segments and classifies values against configurable areas of interest to represent the positioning of the patient.
The physiological state determinations and the positioning of the patient results may then be combined to enhance the available information and monitoring of the patient. For example, the output of the processing of the datasets by the one or more algorithms is then combined and results prepared at module 1322, such as for communication and display at a user interface 1330 of the first subsystem 1302. It should be appreciated, though, that the results may be displayed on any suitable interface associated with the first subsystem 1302, the second subsystem 1304, or any other system.
To facilitate communication of the results from the module 1322, a communication module 1324 may be used. Similar to the communication module 1312, communication module 1324 may be used to facilitate local area network (LAN) communications, wireless communication, or any other wired or wireless communication without deviation from the scope of the invention. In some embodiments, the communication via the first subsystem 1302 and the second subsystem 1304 may be accomplished via a cloud network or an MQTT broker as discussed above with respect to system 1200.
Once processed data is prepared at 1322, it is communicated, either wired or wirelessly, to the first subsystem 1302 via communication modules 1312 and 1324. It should be appreciated that the data/results communicated from the second subsystem 1304 to the first subsystem 1302 may be smaller in size compared to the raw data that is communicated from the first subsystem 1302 to the second subsystem 1304 for processing. In this manner, the data/results may place a lower burden on IT infrastructure and communication infrastructure by requiring less bandwidth, for example.
At module 1328, an inference layer may be generated using the data/results (sensor processed data 1326). By way of example, the inference layer may be a visual representation of the physiological state or the position of a patient. The inference layer may be combined with imaging data of the at least one patient, for example from one of the sensors 1306, such that the inference layer is displayed on top of the video stream of the patient via the user interface 1330. In this manner, a user may view a live video stream of the patient along with an inference layer illustrating a physiological state or the position of the patient.
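A minimal sketch of rendering such an inference layer over a live feed, using OpenCV with illustrative labels and a stand-in camera source, follows.

```python
import cv2

def draw_inference_layer(frame, state_text, bbox):
    """Draw a position box and a physiological-state label on one frame."""
    x, y, w, h = bbox
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.putText(frame, state_text, (x, max(y - 10, 0)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    return frame

cap = cv2.VideoCapture(0)   # stand-in for a patient monitoring camera feed
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # In practice, these results would arrive from the second subsystem.
    frame = draw_inference_layer(frame, "HR 72 bpm | position: bed",
                                 (100, 80, 220, 320))
    cv2.imshow("patient view", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```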
As discussed above in the various other systems, the user interface 1330 may be utilized to display any data, images, alarms, alerts, notifications, or related data associated with the monitoring of a plurality of patients or subjects.
In an exemplary implementation, the data analysis subsystem 1304 and its various functions may be operated or offered as software as a service (SaaS) that may be integrated with or into a data acquisition subsystem 1302 (e.g., such as an OEM system). SaaS integration may involve any suitable method of integration with an existing or OEM system, such as the subsystem 1302. For instance, API integration may be utilized, such that APIs are used to send data to the SaaS platform, retrieve information, or trigger specific functions. The OEM system can push data to the SaaS platform, where it can be processed, analyzed, and visualized. The SaaS platform can also return processed data or insights back to the OEM system. SaaS software may be integrated into the OEM system through pre-installed software. Moreover, if an OEM system is cloud-enabled, it may integrate with the SaaS platform through cloud connectors or middleware (such as MQTT), facilitating the exchange of data and services between the cloud environments. Or, for OEM systems operating at the edge (close to the data source), integration with a SaaS platform can involve processing data locally before sending it to the cloud for further analysis (e.g., such as in the system 1200).
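As a hedged sketch of the API-integration path, the example below pushes observation data to a hypothetical SaaS endpoint and reads back processed insights; the URL, routes, and payload fields are assumptions, not part of the disclosure.

```python
import requests

SAAS_URL = "https://api.example-monitoring-saas.com/v1"  # hypothetical
API_KEY = "..."   # provisioned per facility

def push_observation(patient_id, payload):
    """Send OEM-side data to the SaaS platform; return its insights."""
    resp = requests.post(
        f"{SAAS_URL}/patients/{patient_id}/observations",
        json=payload,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()   # processed data returned to the OEM system

print(push_observation("12345", {"heart_rate_bpm": 72, "position": "bed"}))
```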
It should also be appreciated that any combination of systems disclosed herein may be utilized to facilitate a complete monitoring system for a plurality of patients. For instance, a system may comprise a plurality of edge devices such as devices 500 that may acquire, aggregate, and analyze data as discussed herein. The system may further include an integration with an OEM system, such as system 1302 having OEM sensors or cameras that send data to a subsystem such as system 1304 for processing. In other words, a complete patient monitoring system may comprise a combination of edge devices and OEM devices and the system may aggregate data from both edge devices and OEM devices to prepare a data set of physiological or positioning data for the plurality of patients.
The various systems and components described herein may comprise a processor and a computer-readable medium storing instructions that are operative upon execution by the processor to perform any of the functions described herein. For instance, the system 300, the device 500, the system 1200, the system 1302, and/or the system 1304 may comprise a processor and a computer-readable medium.
With reference now to
Although not required, implementations are described in the general context of “computer readable instructions” executed by one or more computing devices. Computer readable instructions may be distributed via computer readable media (discussed below). Computer readable instructions may be implemented as program modules, such as functions, objects, Application Programming Interfaces (APIs), data structures, and the like, that perform particular tasks or implement particular abstract data types. Typically, the functionality of the computer readable instructions may be combined or distributed as desired in various environments.
In some examples, the computing device 1400 includes a memory 1402, one or more processors 1404, and one or more presentation components 1406. The disclosed examples associated with the computing device 1400 are practiced by a variety of computing devices, including personal computers, laptops, smart phones, mobile tablets, hand-held devices, consumer electronics, specialty computing devices, etc. Distinction is not made between such categories as “workstation,” “server,” “laptop,” “hand-held device,” etc., as all are contemplated within the scope of
In one example, the memory 1402 includes any of the computer-readable media discussed herein. In one example, the memory 1402 is used to store and access instructions 1402a configured to carry out the various operations disclosed herein. In some examples, the memory 1402 includes computer storage media in the form of volatile and/or nonvolatile memory, removable or non-removable memory, data disks in virtual environments, or a combination thereof. In one example, the processor(s) 1404 includes any quantity of processing units that read data from various entities, such as the memory 1402 or input/output (I/O) components 1410. Specifically, the processor(s) 1404 are programmed to execute computer-executable instructions for implementing aspects of the disclosure. In one example, the instructions 1402a are performed by the processor 1404, by multiple processors within the computing device 1400, or by a processor external to the computing device 1400. In some examples, the processor(s) 1404 are programmed to execute instructions such as those illustrated in the flow charts discussed herein and depicted in the accompanying drawings.
In other implementations, the computing device 1400 may include additional features and/or functionality. For example, the computing device 1400 may also include additional storage (e.g., removable and/or non-removable) including, but not limited to, magnetic storage, optical storage, and the like. Such additional storage is illustrated in
The presentation component(s) 1406 presents data indications to an operator or to another device. In one example, the presentation components 1406 include a display device, speaker, printing component, vibrating component, etc. One skilled in the art will understand and appreciate that computer data is presented in a number of ways, such as visually in a graphical user interface (GUI), audibly through speakers, wirelessly between computing devices 1400, across a wired connection, or in other ways. In one example, the presentation component(s) 1406 are not used when processes and operations are sufficiently automated that a need for human interaction is lessened or not needed. I/O ports 1408 allow the computing device 1400 to be logically coupled to other devices including the I/O components 1410, some of which may be built in. Implementations of the I/O components 1410 include, for example but without limitation, a microphone, keyboard, mouse, joystick, pen, game pad, satellite dish, scanner, printer, wireless device, camera, etc.
The computing device 1400 includes a bus 1416 that directly or indirectly couples the following devices: the memory 1402, the one or more processors 1404, the one or more presentation components 1406, the input/output (I/O) ports 1408, the I/O components 1410, a power supply 1412, and a network component 1414. The computing device 1400 should not be interpreted as having any dependency or requirement related to any single component or combination of components illustrated therein. The bus 1416 represents one or more busses (such as an address bus, data bus, or a combination thereof). Although the various blocks of
The components of the computing device 1400 may be connected by various interconnects. Such interconnects may include a Peripheral Component Interconnect (PCI), such as PCI Express, a Universal Serial Bus (USB), FireWire (IEEE 1394), an optical bus structure, and the like. In another implementation, components of the computing device 1400 may be interconnected by a network. For example, the memory 1402 may be comprised of multiple physical memory units located in different physical locations interconnected by a network.
In some examples, the computing device 1400 is communicatively coupled to a network 1418 using the network component 1414. In some examples, the network component 1414 includes a network interface card and/or computer-executable instructions (e.g., a driver) for operating the network interface card. In one example, communication between the computing device 1400 and other devices occurs using any protocol or mechanism over a wired or wireless connection 1420. In some examples, the network component 1414 is operable to communicate data over public, private, or hybrid (public and private) connections using a transfer protocol, between devices wirelessly using short range communication technologies (e.g., near-field communication (NFC), Bluetooth® branded communications, or the like), or a combination thereof.
The connection 1420 may include, but is not limited to, a modem, a Network Interface Card (NIC), an integrated network interface, a radio frequency transmitter/receiver, an infrared port, a USB connection or other interfaces for connecting the computing device 1400 to other computing devices. The connection 1420 may transmit and/or receive communication media.
Although described in connection with the computing device 1400, examples of the disclosure are capable of implementation with numerous other general-purpose or special-purpose computing system environments, configurations, or devices. Implementations of well-known computing systems, environments, and/or configurations that are suitable for use with aspects of the disclosure include, but are not limited to, smart phones, mobile tablets, mobile computing devices, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, gaming consoles, microprocessor-based systems, set top boxes, programmable consumer electronics, mobile telephones, mobile computing and/or communication devices in wearable or accessory form factors (e.g., watches, glasses, headsets, or earphones), network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, VR devices, holographic devices, and the like. Such systems or devices accept input from the user in any way, including from input devices such as a keyboard or pointing device, via gesture input, proximity input (such as by hovering), and/or via voice input.
Implementations of the disclosure are described in the general context of computer-executable instructions, such as program modules, executed by one or more computers or other devices in software, firmware, hardware, or a combination thereof. In one example, the computer-executable instructions are organized into one or more computer-executable components or modules. Generally, program modules include, but are not limited to, routines, programs, objects, components, and data structures that perform particular tasks or implement particular abstract data types. In one example, aspects of the disclosure are implemented with any number and organization of such components or modules. For example, aspects of the disclosure are not limited to the specific computer-executable instructions or the specific components or modules illustrated in the figures and described herein. Other examples of the disclosure include different computer-executable instructions or components having more or less functionality than illustrated and described herein. In implementations involving a general-purpose computer, aspects of the disclosure transform the general-purpose computer into a special-purpose computing device when configured to execute the instructions described herein.
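As one hypothetical illustration of organizing computer-executable instructions into components or modules, the following Python sketch groups routines that each perform a particular task and composes them into a processing chain; the function names and the division of tasks are assumptions for illustration only, not the specific components of the disclosure.

    # Hypothetical program modules, each performing a particular task;
    # any number and organization of such components could be used.
    def acquire(raw: list) -> list:
        """Routine that filters out missing readings."""
        return [r for r in raw if r is not None]

    def reduce_to_value(samples: list) -> float:
        """Routine that reduces samples to a single value."""
        return sum(samples) / len(samples) if samples else 0.0

    def label(value: float) -> str:
        """Routine that maps a value to a label."""
        return "elevated" if value > 0.5 else "normal"

    # The modules compose into a simple processing chain.
    print(label(reduce_to_value(acquire([0.2, None, 0.9]))))  # elevated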
By way of example and not limitation, computer readable media comprise computer storage media and communication media. Computer storage media include volatile and nonvolatile, removable and non-removable memory implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules, or the like. Computer storage media are tangible and mutually exclusive to communication media. Computer storage media are implemented in hardware and exclude carrier waves and propagated signals. Computer storage media for purposes of this disclosure are not signals per se. In one example, computer storage media include hard disks, flash drives, solid-state memory, phase change random-access memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disk read-only memory (CD-ROM), digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium used to store information for access by a computing device. In contrast, communication media typically embody computer readable instructions, data structures, program modules, or the like in a modulated data signal such as a carrier wave or other transport mechanism and include any information delivery media.
While various spatial and directional terms, including but not limited to top, bottom, lower, mid, lateral, horizontal, vertical, front and the like are used to describe the present disclosure, it is understood that such terms are merely used with respect to the orientations shown in the drawings. The orientations can be inverted, rotated, or otherwise changed, such that an upper portion is a lower portion, and vice versa, horizontal becomes vertical, and the like.
The word “exemplary” is used herein to mean serving as an example, instance, or illustration. Any aspect or design described herein as “exemplary” is not necessarily to be construed as advantageous over other aspects or designs. Rather, use of the word “exemplary” is intended to present concepts in a concrete fashion. As used in this application, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from context, “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, if X employs A; X employs B; or X employs both A and B, then “X employs A or B” is satisfied under any of the foregoing instances. Further, “at least one of A and B” and/or the like generally means A or B or both A and B. In addition, the articles “a” and “an” as used in this application and the appended claims may generally be construed to mean “one or more” unless specified otherwise or clear from context to be directed to a singular form.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. Of course, those skilled in the art will recognize many modifications may be made to this configuration without departing from the scope or spirit of the claimed subject matter.
As used herein, a structure, limitation, or element that is “configured to” perform a task or operation is particularly structurally formed, constructed, or adapted in a manner corresponding to the task or operation. For purposes of clarity and the avoidance of doubt, an object that is merely capable of being modified to perform the task or operation is not “configured to” perform the task or operation as used herein.
Various operations of implementations are provided herein. In one implementation, one or more of the operations described may constitute computer readable instructions stored on one or more computer readable media, which, if executed by a computing device, will cause the computing device to perform the operations described. The order in which some or all of the operations are described should not be construed to imply that these operations are necessarily order dependent. Alternative ordering will be appreciated by one skilled in the art having the benefit of this description. Further, it will be understood that not all operations are necessarily present in each implementation provided herein.
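By way of a hypothetical illustration of such order independence, the following Python sketch executes a set of operations in a shuffled order and produces the same result regardless of ordering; the operation names are assumptions for illustration only.

    import random

    # Hypothetical, order-independent operations: each writes only to
    # its own key, so any execution order yields the same final state.
    results = {}

    def op_a(): results["a"] = 1
    def op_b(): results["b"] = 2
    def op_c(): results["c"] = 3

    ops = [op_a, op_b, op_c]
    random.shuffle(ops)  # an alternative ordering is equally valid
    for op in ops:
        op()
    print(sorted(results.items()))  # [('a', 1), ('b', 2), ('c', 3)]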
Any range or value given herein can be extended or altered without losing the effect sought, as will be apparent to the skilled person.
Also, although the disclosure has been shown and described with respect to one or more implementations, equivalent alterations and modifications will occur to others skilled in the art based upon a reading and understanding of this specification and the annexed drawings. The disclosure includes all such modifications and alterations and is limited only by the scope of the following claims. In particular regard to the various functions performed by the above described components (e.g., elements, resources, etc.), the terms used to describe such components are intended to correspond, unless otherwise indicated, to any component which performs the specified function of the described component (e.g., that is functionally equivalent), even though not structurally equivalent to the disclosed structure which performs the function in the herein illustrated exemplary implementations of the disclosure.
As used in this application, the terms “component,” “module,” “system,” “interface,” and the like are generally intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program and/or a computer. By way of illustration, both an application running on a controller and the controller can be a component. One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers.
Furthermore, the claimed subject matter may be implemented as a method, apparatus, or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof to control a computer to implement the disclosed subject matter. The term “article of manufacture” as used herein is intended to encompass a computer program accessible from any computer-readable device, carrier, or media.
In addition, while a particular feature of the disclosure may have been disclosed with respect to only one of several implementations, such feature may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. Furthermore, to the extent that the terms “includes,” “having,” “has,” “with,” or variants thereof are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term “comprising.”
The implementations have been described hereinabove. It will be apparent to those skilled in the art that the above methods and apparatuses may incorporate changes and modifications without departing from the general scope of this invention. It is intended that all such modifications and alterations be included insofar as they come within the scope of the appended claims or the equivalents thereof.
This application claims priority to U.S. Ser. No. 63/519,415, entitled MULTI-SITE, MULTI-SUBJECT JOURNEY OBSERVATION DECISION SUPPORT METHOD AND SYSTEM, filed Aug. 14, 2023, which is incorporated herein by reference.