The present disclosure relates to medical lines and, more particularly, to a system and method for medical line detection and monitoring using video.
In critical care, various lines, tubes, and cords can surround a patient. These objects include intravenous lines, percutaneously inserted central catheters, central lines, catheter tubes, endotracheal tubes, extracorporeal membrane oxygenation lines, electroencephalogram lines, electrocardiogram lines, pulse oximeter lines, power cords, and universal serial bus cords. The complexity associated with these lines causes many problems for patient outcomes and caregiver workflow. Challenges related to lines can include unwanted removal of the line. Unwanted removal can be intended by the patient or unintentional from patient movement. The consequences of unwanted removal can include infection, pain or discomfort, or even death. Other challenges can include the line touching the floor or other contaminated objects, which can result in infection. Disconnected leads can result in missing vitals data. Additionally, the line can become kinked or severed, especially when the line falls below the bedframe and/or becomes tangled. Moreover, patient falls can occur as a result of tangled lines or lines on the floor.
In addition to the negative patient outcomes and patient experience, replacing lines is a workflow issue for caregivers who cannot afford to perform additional tasks. Workflow problems are especially challenging when the lines removed require specialists to replace and when the replacement is urgent (e.g. extracorporeal membrane oxygenation lines, endotracheal tubes, etc.).
The present disclosure includes one or more of the features recited in the appended claims and/or the following features which, alone or in any combination, may comprise patentable subject matter.
According to a first aspect of the disclosed embodiments, a system for detecting and monitoring lines in a healthcare setting includes a medical device positioned in a healthcare facility. A line extends from the medical device. At least one imaging device is positioned in the healthcare facility and separate from the medical device. The at least one imaging device detects the line and identifies the line. A computer device has a processor and executable instructions stored on a non-transitory computer readable medium. The execution of the executable instructions configures the processor to analyze an image of the line from the at least one imaging device to determine whether an unwanted condition of the line has occurred.
In some embodiments of the first aspect, the execution of the executable instructions can configure the processor to segment the image to remove any portions of the image that do not include the line. The processor can trace a path of the line with a curvature analysis algorithm that fits the path to a curve of appropriate order. The processor can determine whether the unwanted condition of the line has occurred based on the curve. The processor can locate a terminus of the line. The processor can identify the unwanted condition of the line if the terminus is located at an unexpected position. The processor can detect a flow rate through the line. The processor can identify the unwanted condition if the flow rate does not match a predetermined flow rate. The processor can activate an alert if the unwanted condition of the line is detected. The alert can be at least one of a visual alert, an audible alert, and a vocal alert. The alert can identify the line that has experienced the unwanted condition. The processor can process the image using at least one of convolutional operations, thresholding, edge enhancement, and machine learning. The processor can detect the presence of compounds in fluids moving through the line.
Optionally, in the first aspect, the at least one imaging device can include at least one of a video camera, an infrared camera, a depth camera, and a thermal camera. The image of the line can include at least one of a visible image, an infrared image, a depth image, and a thermal image. The at least one imaging device can be mounted on at least one of a wall of the healthcare facility, a ceiling of the healthcare facility, or a fixture in the healthcare facility. The line can include at least one of an intravenous line, a percutaneously inserted central catheter, a central line, a catheter tube, an endotracheal tube, an extracorporeal membrane oxygenation line, an electroencephalogram line, an electrocardiogram line, a pulse oximeter line, a power cord, and a universal serial bus cord. The unwanted condition can include at least one of unwanted removal of the line, a kink in the line, the line touching the floor, disconnection of the line, a severed line, and a tangled line. The line can include a visual indicator that is detected by the at least one imaging device to identify the line. The visual indicator can include a dye flowing through the line. The visual indicator can include at least one of a fiducial marker and a grid printed on the line. The visual indicator can include at least one of a barcode, a QR code, or a dot pattern on the line. The visual indicator can be made with at least one of visible ink and infrared ink. The visual indicator can include an optical emitter coupled to generate propagation of light through the line. The optical emitter can be modulated to generate a pattern of illumination that can be read temporally.
According to a second aspect of the disclosed embodiments, a method is performed by a computer device having a processor and executable instructions stored on a non-transitory computer readable medium which, when executed by the processor, causes the computer device to perform the method. The method includes detecting, with an imaging device, a line extending from a medical device positioned in a healthcare facility. The method also includes acquiring, with the imaging device, an image of the line. The method also includes analyzing, with the computer device, the image of the line to determine whether an unwanted condition of the line has occurred.
In some embodiments of the second aspect, the method can also include segmenting, with the computer device, the image to remove any portions of the image that do not include the line. The method can also include tracing, with the computer device, a path of the line with a curvature analysis algorithm that fits the path to a curve of appropriate order. The method can also include determining, with the computer device, whether the unwanted condition of the line has occurred based on the curve. The method can also include locating, with the computer device, a terminus of the line. The method can also include identifying, with the computer device, the unwanted condition of the line if the terminus is located at an unexpected position. The method can also include detecting, with the computer device, a flow rate through the line. The method can also include identifying, with the computer device, the unwanted condition if the flow rate does not match a predetermined flow rate. The method can also include activating an alert, with the computer device, if the unwanted condition of the line is detected. The method can also include processing, with the computer device, the image using at least one of convolutional operations, thresholding, edge enhancement, and machine learning. The method can also include detecting, with the computer device, the presence of compounds in fluids moving through the line. The method can also include acquiring, with the imaging device, at least one of a visible image, an infrared image, a depth image, and a thermal image. The method can also include detecting, with an imaging device, a visual indicator on the line. The method can also include identifying, with the computer device, the line based on the visual indicator. The method can also include injecting a dye into the line to create the visual indicator. The method can also include propagating light, with a light emitter, through the line to create the visual indicator.
Additional features, which alone or in combination with any other feature(s), such as those listed above and those listed in the claims, may comprise patentable subject matter and will become apparent to those skilled in the art upon consideration of the following detailed description of various embodiments exemplifying the best mode of carrying out the embodiments as presently perceived.
The detailed description particularly refers to the accompanying figures in which:
While the concepts of the present disclosure are susceptible to various modifications and alternative forms, specific exemplary embodiments thereof have been shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that there is no intent to limit the concepts of the present disclosure to the particular forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
Referring now to
At least one imaging device 30 is positioned in the healthcare facility 10. The at least one imaging device 30 can be coupled to any one of a wall of the healthcare facility 10, a ceiling of the healthcare facility 10, or a fixture (e.g. furniture, an overhead lift, the at least one medical device 16, etc.) in the healthcare facility 10. The imaging device 30 can have a wide field of view capable of monitoring an entire room of the healthcare facility 10. In some embodiments, the healthcare facility 10 can include any number of imaging devices 30. In some embodiments, a range of the imaging devices 30 can be extended using mirrors. In some embodiments, the imaging device 30 can be any one of a video camera, an infrared camera, a depth camera, a thermal camera, or the like. The imaging device 30 is configured to use image processing techniques to identify the line 18 and monitor for unwanted conditions. The image processing techniques can include the analysis of any one of a visible image, an infrared image, a depth image, a thermal image, or the like. The imaging data from the imaging device 30 can be processed using an algorithm that employs traditional image processing techniques (convolutional operations, thresholding, edge enhancement, etc.) in conjunction with machine learning (CNN, etc.). The unwanted conditions can include, but are not limited to, unwanted removal of the line 18, a kink in the line 18, the line 18 touching the floor, disconnection of the line 18, a severed line 18, a tangled line 18, occlusion of the line 18, the line 18 looped around an object, the line 18 caught in a side rail of the patient support apparatus 12, the line 18 positioned under the patient 14, the line 18 under tension, and trip hazards, e.g. lines 18 in volumes through which the patient's legs will move. Unwanted conditions can also include a line 18 that has been removed by someone other than a caregiver, i.e. when a caregiver is not present.
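By way of a non-limiting illustration, the classical portion of such a pipeline, convolutional operations followed by thresholding, can be sketched as follows. The kernel, threshold value, and toy image frame are arbitrary example values chosen for illustration only and do not form part of the disclosure:

```python
# Illustrative sketch of a classical edge-detection step (convolutional
# filtering followed by thresholding) such as the pipeline described above
# might employ. The kernel and threshold are arbitrary example values.

def convolve2d(image, kernel):
    """Valid-mode 2-D filtering of a grayscale image (list of lists),
    implemented as cross-correlation, as is common for convolutional
    operations in image processing."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            acc = 0
            for u in range(kh):
                for v in range(kw):
                    acc += image[i + u][j + v] * kernel[u][v]
            row.append(acc)
        out.append(row)
    return out

def edge_mask(image, threshold=100):
    """Horizontal-gradient (Sobel-x) response thresholded to a binary mask."""
    sobel_x = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
    grad = convolve2d(image, sobel_x)
    return [[1 if abs(g) > threshold else 0 for g in row] for row in grad]

# A dark vertical "line" on a bright background yields strong edge responses
# on both sides of the line.
frame = [[200, 200, 10, 200, 200] for _ in range(5)]
mask = edge_mask(frame)
```

The resulting binary mask marks the two edges of the line, which downstream steps can group into masks and blobs.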
Such line 18 removal can be determined using video or a real time locating system. To determine if a line 18 has been pulled out, the change in location of the line 18 relative to the insertion point could be monitored. Furthermore, distortion of bandages or dressings can indicate the line 18 has moved relative to the insertion point.
Referring now to
An alert 60 is activated if an unwanted condition is detected. The alert 60 can be an audible alert, a visual alert, or a vocal alert. In some embodiments, an audible alert can include a sound, whereas a vocal alert can include a digitally recorded message, for example, “patient line occluded.” The alert can be conveyed using a wired and/or wireless facility communication system. The alert 60 can be activated in the patient's room of the healthcare facility 10, at a nurse's station (not shown), or at any other location within the healthcare facility 10. In some embodiments, the alert 60 is activated on a mobile device or smart device (not shown) carried by a healthcare worker or family member of the patient. In some embodiments, the patient 14, caregiver, or family member is alerted to unwanted conditions based on permissions set by the caregiver. That is, the alerts can be based on context of patient conditions, medication orders from an electronic medical record (EMR) 64, or encoded information about a medication, wherein the information is encoded in the line 18. The alerts can also be based on pre-selected urgency of different conditions that can be specific to the patient 14 or a unit of the healthcare facility 10. In some embodiments, the alert is delivered to and stored in the electronic medical record 64. For example, the electronic medical record 64 can store each alert created by the system 40. In some embodiments, the alert can communicate a certainty of risk based on calculations by the system 40.
A display 62 can also display information related to the unwanted condition. For example, the display 62 can identify the line 18 in question and provide a summary of the unwanted condition. The display 62 can also provide instructions for correcting or troubleshooting the unwanted condition. The display 62 can be positioned on the patient support apparatus 12. In other embodiments, the display 62 is provided on a remote computer (not shown) located at a nurse's station or other location within the healthcare facility 10. In other embodiments, the display 62 is provided on a mobile device or smart device (not shown) carried by a healthcare worker or family member of the patient.
Various visual indicators can be used to identify the line 18.
The visual indicators of
Referring now to
In some embodiments, the line 18 is doped with a dye. That is, dye is mixed with the fluid flowing through the line 18. As such, the imaging device 30 detects the dye to identify the line 18. In some embodiments, the color of the dye identifies the line 18. For example, a line 18 carrying a first medication can have a specific color associated therewith. Likewise, a line 18 carrying a second medication can have a different color associated therewith. It will be appreciated that, if the dye is not detected by the imaging device 30, the lack of detection can indicate a blockage in the line 18.
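A minimal sketch of such dye-based identification follows, assuming a hypothetical table of dye colors and a simple nearest-color match with a tolerance; the color values and tolerance are illustrative only:

```python
# Illustrative sketch: associating a detected dye color with a line
# identity. The color table and tolerance are hypothetical example values.

def classify_dye(rgb, table, tolerance=30):
    """Return the line name whose reference dye color is nearest the
    observed RGB sample, or None if nothing is within tolerance
    (which could indicate a blockage -- no dye is visible)."""
    best_name, best_dist = None, None
    for name, ref in table.items():
        dist = max(abs(a - b) for a, b in zip(rgb, ref))
        if dist <= tolerance and (best_dist is None or dist < best_dist):
            best_name, best_dist = name, dist
    return best_name

# Hypothetical mapping of dye colors to medication lines.
DYE_TABLE = {
    "medication A line": (0, 0, 200),    # blue dye
    "medication B line": (200, 150, 0),  # amber dye
}

assert classify_dye((5, 5, 210), DYE_TABLE) == "medication A line"
assert classify_dye((120, 120, 120), DYE_TABLE) is None  # no dye detected
```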
In one embodiment, the system 40 is pre-programmed to detect specific lines 18. For example, the system 40 can be pre-programmed to associate a particular bar code with an intravenous line. Likewise, the system 40 can be pre-programmed to associate a particular dot pattern with a central line. As such, all the intravenous lines within the healthcare facility 10 are marked with the same bar code, and all the central lines within the healthcare facility are marked with the same dot pattern. Accordingly, regardless of the room in which the system 40 is operating, the system 40 is pre-programmed to identify the intravenous line and the central line. The shape of the line 18 or location of the line 18 relative to other objects could further be identified based on patterns printed on the lines 18. The patterns could be printed on the lines 18 by manufacturers, in an aftermarket printing process by hospitals, or by caregivers themselves by adhering stickers onto lines 18. The patterns could be as simple as a repeating barcode, or a more complicated grid of points that can indicate length (like a meter stick). Further identification could be integrated into the bandages, dressings, or other adhesives that clinicians already use to secure lines 18.
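One possible form of such a pre-programmed association, assuming hypothetical marker payloads, is a simple facility-wide lookup table:

```python
# Illustrative sketch of the pre-programmed association above: a fixed
# table maps marker payloads to line types facility-wide. The marker
# values are hypothetical examples.

LINE_MARKERS = {
    "barcode:0042": "intravenous line",
    "dots:A7": "central line",
}

def identify_line(marker_payload):
    """Look up the line type for a detected marker, regardless of room."""
    return LINE_MARKERS.get(marker_payload, "unknown line")

assert identify_line("barcode:0042") == "intravenous line"
assert identify_line("dots:Z9") == "unknown line"
```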
In another embodiment, the caregiver trains the system 40 to identify the lines 18. For example, upon setup of the room, the caregiver can lift the intravenous line so that the visual indicator on the intravenous line is visible by the system 40. The system 40 is then programmed to associate the visual indicator on the intravenous line with the intravenous line. The caregiver can program the system 40 using auditory commands and/or by entering information into a control panel. At this time, the system 40 can instruct the caregiver on proper placement of a nominal position of the intravenous line. Additionally, the system 40, using the methods described below, can trace the intravenous line to associate a particular position of the line 18 with the intravenous line. As the caregiver identifies the lines 18 and programs the system 40, the system 40 can utilize machine learning to map the placement of lines 18 throughout the room of the healthcare facility 10. In another embodiment, detecting and monitoring of the lines 18 is automated entirely by algorithms that detect the lines 18, the relationship and orientation of the line 18 to the patient 14, and unwanted conditions.
An algorithm is applied to the image at block 210. In particular, a rule set is modeled in the algorithm. The algorithm “know-how” is codified by creating a multidimensional graph of the lines 18, medical devices 16, and the patient 14. For each item, various objects are defined to capture the operational and physical envelope of the item. The objects can be grouped, nested, or linked. For example, presentation characteristics can include dimensions, color, weight, length, etc. States of the items can be defined, wherein the states can include open, closed, sterile, stationary, moving, idle, locked, crossed, tangled, kinked, stretched, snagged, etc. Additionally, transitions of the items can be defined. For example, a line 18 can transition from safe to unsafe, or a line 18 can transition from sterile to used, etc. Inputs and outputs can also be defined, for example, alarms, operational inputs/outputs, pump infusing inputs/outputs, pump idle inputs/outputs, control panel inputs/outputs, and illumination inputs/outputs.
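A minimal sketch of how such states and transitions might be codified follows; the state names, the transition table, and the alerting rules are example stand-ins, not an exhaustive model of the rule set described above:

```python
# Illustrative sketch of codified states and transitions: each item
# carries a state, and tables of permitted transitions and unsafe states
# flag problematic changes. All names and entries are example values.

ALLOWED_TRANSITIONS = {
    ("sterile", "inserted"),
    ("inserted", "idle"),
    ("idle", "infusing"),
    ("infusing", "idle"),
}

UNSAFE_STATES = {"tangled", "kinked", "stretched", "snagged"}

class LineItem:
    def __init__(self, name, state="sterile"):
        self.name = name
        self.state = state
        self.alerts = []

    def transition(self, new_state):
        """Apply a state change, recording an alert for unsafe states or
        transitions not in the permitted table."""
        if new_state in UNSAFE_STATES:
            self.alerts.append(f"{self.name}: unsafe state '{new_state}'")
        elif (self.state, new_state) not in ALLOWED_TRANSITIONS:
            self.alerts.append(
                f"{self.name}: unexpected transition "
                f"{self.state} -> {new_state}")
        self.state = new_state

line = LineItem("IV line")
line.transition("inserted")  # sterile -> inserted: permitted, no alert
line.transition("kinked")    # unsafe state: alert recorded
```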
An analysis takes place at block 212 to identify the lines 18. The lines 18 can be identified using classical deterministic image processing algorithms. For example, convolutional operations to detect edges and textures can be used to process a raw image into a set of masks and blobs. Optionally, an atlas-based representation tree is applied for the determination of an object within the frame. This tree shows dependence information about where a line 18 should be with respect to another object and allows the algorithm to focus its search and apply a threshold for accepting identification of the object. For example, if the patient 14 is imaged near the line 18, the line 18 is determined to be attached. If the image only includes the line 18, the line 18 is determined to have a loose fit.
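The proximity rule described above can be sketched as follows; the coordinates, bounding box, and distance threshold are example values chosen only for illustration:

```python
# Illustrative sketch of the proximity rule: if any traced line pixel
# falls near the patient's bounding box, the line is treated as attached;
# otherwise it is flagged as loose. Threshold is an example value.

def attachment_status(line_points, patient_bbox, threshold=20.0):
    """Classify a traced line as 'attached' if any point lies within
    `threshold` pixels of the patient bounding box, else 'loose'."""
    x0, y0, x1, y1 = patient_bbox
    for (px, py) in line_points:
        # Distance from the point to the bounding box (0 if inside).
        dx = max(x0 - px, 0, px - x1)
        dy = max(y0 - py, 0, py - y1)
        if (dx * dx + dy * dy) ** 0.5 <= threshold:
            return "attached"
    return "loose"

assert attachment_status([(105, 50)], (100, 40, 300, 200)) == "attached"
assert attachment_status([(10, 10)], (100, 40, 300, 200)) == "loose"
```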
The lines 18 can also be identified using a machine learning algorithm, wherein a model is selected for each appropriate object from a zoo of pre-trained models and confidence intervals are set for an appropriate threshold. The machine learning then returns information about the locality of the known objects.
Lines 18 that are identified are then localized within the three-dimensional space of the room of the healthcare facility 10. Using a photogrammetry approach with information known about the size and length of the line 18, an estimation of the location of the line 18 can be computed. Using a source modality like time-of-flight data, the position of the line 18 can be directly observed once the line 18 is located within the field of view as a lookup into the three-dimensional point cloud data. Additionally, the presentation of the line 18 is given a set of orientation vectors with respect to a global coordinate system.
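The point cloud lookup described above can be sketched as follows, assuming a toy co-registered point cloud whose coordinates are arbitrary example values:

```python
# Illustrative sketch of the time-of-flight lookup: once a line's pixels
# are located in the image, their 3-D positions are read directly from
# the co-registered point cloud. All coordinates are example values.

def localize_line(pixel_path, point_cloud):
    """Map each (row, col) pixel on the traced line to its (x, y, z)
    position from the depth camera's point cloud."""
    return [point_cloud[r][c] for (r, c) in pixel_path]

# Toy 2x2 point cloud: each entry is an (x, y, z) point in metres.
cloud = [
    [(0.0, 0.0, 1.2), (0.1, 0.0, 1.2)],
    [(0.0, 0.1, 1.3), (0.1, 0.1, 1.4)],
]
positions = localize_line([(0, 1), (1, 1)], cloud)
```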
At block 214, sensors are applied. As used herein, the sensors are a classifier of a logical set of states or transitions that an object experiences to create a Boolean trigger that can be used to enable or select protocols, control the environment, or alert an external system. The sensors can directly observe events (an object moved), infer events (a patient transfer position implies that the patient turned over), or predict events (a line 18 is applied, so the patient is at risk). At blocks 218, 220, and 222, control outputs are provided to various objects 230, 232, and 234. It will be appreciated that any number of control outputs can be provided to any number of objects.
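A minimal sketch of such Boolean “sensors” follows, with example stand-in rules for the observed, inferred, and predictive event categories; the rules, thresholds, and posture labels are illustrative only:

```python
# Illustrative sketch of "sensors" in the sense used above: classifiers
# over object states that each yield a Boolean trigger. The rules and
# thresholds are hypothetical examples.

def line_moved(prev_pos, cur_pos, tol=5.0):
    """Directly observed event: the line's tracked point moved."""
    return abs(cur_pos - prev_pos) > tol

def patient_turned(posture_before, posture_after):
    """Inferred event: a posture change implies the patient turned over."""
    return posture_before != posture_after

def at_risk(lines_applied):
    """Predictive event: any applied line places the patient at risk."""
    return lines_applied > 0

triggers = {
    "moved": line_moved(100.0, 112.0),
    "turned": patient_turned("supine", "lateral"),
    "at_risk": at_risk(2),
}
```

Each trigger could then gate a protocol, an environmental control, or an alert to an external system.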
In some embodiments, a training mode occurs where the caregiver trains the system by identifying the lines 18. To start the training, the caregiver can press a button on the imaging device 30 or the patient support apparatus 12. Training can also be initiated through a mobile device application or by a verbal command that is heard through the imaging device 30, the patient support apparatus 12, a phone, or a nurse call. For example, the caregiver could say “[listening word] identify lines,” where the [listening word] is a trigger for the system 40. Alternatively, the system 40 can initiate training by instructing “please identify lines.” During the training mode, the system 40 can name lines 18 that the caregiver then touches in sequence, or the caregiver can name lines 18 while touching them. The system 40 then uses the identified locations as a seed three-dimensional position to the algorithm.
Simultaneously, the raw images from block 302 are fed to an image processing sequence that segments out each portion of the image that is not easily identified as a line 18, at block 340. At block 342, a genetic snake algorithm is employed to trace the line 18. Any random location on the line 18 is selected and traversed with the snake following the line 18 in both directions, at block 344. When the snake hits an obstruction (from an occlusion or artifact), an elastic dead reckoning algorithm uses the current trajectory of where the snake has been to allow the snake to hop over the obstruction in the two-dimensional image. The distance the snake is allowed to traverse (elasticity) is set a priori. If the snake is unable to find a landing spot within that distance, the current point is marked as a terminus, at block 346.
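A simplified stand-in for this tracing step, including the elastic dead reckoning over a gap, might be sketched as follows. For brevity the sketch sweeps in one direction from the seed rather than both, and the grid, seed, and elasticity value are illustrative:

```python
# Illustrative, simplified stand-in for the snake tracing above: starting
# from a seed pixel, the tracer follows line pixels (value 1); at a gap
# (occlusion) it dead-reckons along its current direction for up to
# `elasticity` cells before declaring a terminus.

def trace(grid, seed, elasticity=2):
    """Trace connected line pixels from `seed`, hopping over gaps no
    longer than `elasticity`. Returns (path, hit_terminus)."""
    rows, cols = len(grid), len(grid[0])
    path, visited = [seed], {seed}
    direction = (0, 1)  # assume a roughly horizontal line for this sketch
    r, c = seed
    while True:
        step = None
        # Try the 8-neighbourhood first.
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if ((dr, dc) != (0, 0) and 0 <= nr < rows
                        and 0 <= nc < cols and grid[nr][nc] == 1
                        and (nr, nc) not in visited):
                    step = (nr, nc)
                    direction = (dr, dc)
                    break
            if step:
                break
        if step is None:
            # Obstruction: dead-reckon along the current trajectory.
            for hop in range(2, elasticity + 2):
                nr, nc = r + direction[0] * hop, c + direction[1] * hop
                if (0 <= nr < rows and 0 <= nc < cols
                        and grid[nr][nc] == 1 and (nr, nc) not in visited):
                    step = (nr, nc)
                    break
            if step is None:
                return path, True  # no landing spot: mark a terminus
        r, c = step
        path.append(step)
        visited.add(step)

# A horizontal line with a one-pixel occlusion gap at column 3.
grid = [[0] * 7, [1, 1, 1, 0, 1, 1, 1], [0] * 7]
path, terminated = trace(grid, (1, 0))
```

The tracer hops over the gap at column 3 and marks a terminus at the end of the line.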
The snake's path is then post-processed by a curvature analysis algorithm, at block 348, that fits the path to a curve of appropriate order. If no fit can be made for each segment of the snake path, the line 18 is found to be taut or stretched and represents a hazard, at block 350. If the terminus of the snake is not located at the expected position for the line 18 (e.g., an IV at the hand), a hazard is also detected, at block 350. When negative conditions are detected, the system 40 alerts caregivers and patients as appropriate. Based on patient and/or family cognitive ability and medical literacy, caregivers can opt to have line issues communicated by a mobile device application or by the patient support apparatus speakers. Regardless of patient or family involvement, potential issues could be communicated to caregivers and escalated using typical alert management. For issues that are lower priority (e.g. a tangled saline drip), a notification could be delayed until a caregiver is in the room as detected by the imaging device 30 or a real time locating system.
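One plausible reading of the curvature check, in which a slack line droops while a taut line closely follows the straight chord between its endpoints, can be sketched as follows; the deviation threshold is an example value:

```python
# Illustrative sketch of one plausible taut-line check: a slack line
# droops, so a traced path whose deviation from the straight chord
# between its endpoints is near zero suggests the line is taut or
# stretched. The threshold is an example value.

def max_chord_deviation(path):
    """Largest perpendicular distance from any path point to the straight
    chord joining the path's endpoints."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    length = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if length == 0:
        return 0.0
    worst = 0.0
    for (x, y) in path:
        # Perpendicular distance from (x, y) to the chord.
        d = abs((x1 - x0) * (y0 - y) - (x0 - x) * (y1 - y0)) / length
        worst = max(worst, d)
    return worst

def is_taut(path, slack_threshold=2.0):
    return max_chord_deviation(path) < slack_threshold

drooping = [(0, 0), (5, 6), (10, 0)]    # sags well away from the chord
straight = [(0, 0), (5, 0.1), (10, 0)]  # nearly follows the chord

assert is_taut(straight) and not is_taut(drooping)
```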
When a negative condition is detected, the system 40 can also draw attention by having a projector or other light highlight the location of the condition in question (e.g., if a line is touching the floor or caught in a siderail). Although sheets and other visual obstructions could limit some of the ability of the system 40 to detect certain conditions, some methods allow the system 40 to detect negative outcomes even when obstructed. For example, the system 40 may have visibility to the line 18 at its source and/or destination; in this case, knowledge about the state of the line 18 (it is continuous) would allow the system 40 to make assumptions about its routing (under covers or other obstructions). Given that standards set the length of lines by their use and class, the system 40 has knowledge a priori about all the routes the line 18 could possibly take where it is occluded. This knowledge forms a computational matrix that could be generated in real time. Assuming the system 40 can see the line 18 at the time it is inserted into the patient, the system 40 can recognize points on the line 18 and determine the lengths of different points relative to the end of the line 18. If the system 40 later sees points emerging from the sheets that allow it to determine that the known length of the line 18 under a sheet is shorter than the distance from the end of the line 18 to the visible part of the line 18, then it can be concluded that the line 18 has been pulled. Similarly, if the length of line 18 under the sheet is determined to be very long, then there is an increased chance of tangling or entanglement with the patient or other objects. If a line 18 becomes undiscernible, the system 40 can use a hysteresis-based rule parser to determine if the condition warrants an alarm. For example, a PICC line that was detected may become no longer visible; if no potential hazards were observed within a radius prior to loss of lock (the rail is down, and there is no tray or cell phone present), the algorithm returns a safe condition.
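The occluded-length reasoning described above can be sketched as follows; the lengths, gap distance, and slack limit are example values:

```python
# Illustrative sketch of the occluded-length reasoning: knowing the
# line's total length and the summed length of its visible segments, the
# hidden length can be bounded and compared against the straight-line
# distance the hidden portion must span. Values are examples.

def hidden_line_status(total_len, visible_len, gap_distance,
                       slack_limit=3.0):
    """Classify the occluded portion of a line.

    total_len: known length of the whole line (set by standards)
    visible_len: summed length of visible segments
    gap_distance: straight-line distance the hidden portion must cover
    """
    hidden = total_len - visible_len
    if hidden < gap_distance:
        return "pulled"       # not enough line left under the sheet
    if hidden > gap_distance + slack_limit:
        return "tangle risk"  # far more line hidden than needed
    return "ok"

assert hidden_line_status(10.0, 8.0, 3.0) == "pulled"
assert hidden_line_status(10.0, 2.0, 3.0) == "tangle risk"
assert hidden_line_status(10.0, 6.5, 3.0) == "ok"
```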
Video can allow for detecting additional information about the operation of the lines 18. For example, with sufficient resolution, the system 40 can detect the flow rate of fluids. The actual flow rate could be compared to the patient conditions or medication orders (if the system 40 is connected to the electronic medical record 64) to detect possible dosing errors and alert accordingly. Similarly, the system could use color and/or infrared to determine the presence of certain compounds in blood or urine. Certain findings relative to specific patient conditions (using the electronic medical record 64) could flag potential complications that could be alerted to caregivers.
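The dosing check described above can be sketched as follows, assuming a hypothetical tolerance band around the ordered rate (e.g. from the electronic medical record 64):

```python
# Illustrative sketch of the dosing check: an observed flow rate is
# compared against the ordered rate, with a tolerance band to absorb
# measurement noise. The tolerance is a hypothetical example value.

def dosing_alert(observed_rate, ordered_rate, tolerance=0.10):
    """Return an alert string if the observed flow rate deviates from the
    ordered rate by more than `tolerance` (fractional), else None."""
    if ordered_rate <= 0:
        return "invalid order: non-positive flow rate"
    deviation = abs(observed_rate - ordered_rate) / ordered_rate
    if deviation > tolerance:
        return (f"possible dosing error: observed {observed_rate:.1f} "
                f"mL/h vs ordered {ordered_rate:.1f} mL/h")
    return None

assert dosing_alert(100.0, 102.0) is None      # within 10% tolerance
assert dosing_alert(60.0, 100.0) is not None   # 40% under-infusion
```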
Any theory, mechanism of operation, proof, or finding stated herein is meant to further enhance understanding of principles of the present disclosure and is not intended to make the present disclosure in any way dependent upon such theory, mechanism of operation, illustrative embodiment, proof, or finding. It should be understood that while the use of the words “preferable,” “preferably,” or “preferred” in the description above indicates that the feature so described can be more desirable, it nonetheless may not be necessary, and embodiments lacking the same can be contemplated as within the scope of the disclosure, that scope being defined by the claims that follow.
In reading the claims it is intended that when words such as “a,” “an,” “at least one,” “at least a portion” are used there is no intention to limit the claim to only one item unless specifically stated to the contrary in the claim. When the language “at least a portion” and/or “a portion” is used, the item can include a portion and/or the entire item unless specifically stated to the contrary.
It should be understood that only selected embodiments have been shown and described and that all possible alternatives, modifications, aspects, combinations, principles, variations, and equivalents that come within the spirit of the disclosure as defined herein or by any of the following claims are desired to be protected. While embodiments of the disclosure have been illustrated and described in detail in the drawings and foregoing description, the same are to be considered as illustrative and not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. Additional alternatives, modifications and variations can be apparent to those skilled in the art. Also, while multiple inventive aspects and principles have been presented, they need not be utilized in combination, and many combinations of aspects and principles are possible in light of the various embodiments provided above.
This application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application No. 63/312,410, filed Feb. 22, 2022, which is expressly incorporated by reference herein.
Number | Date | Country
--- | --- | ---
63312410 | Feb 2022 | US