The present disclosure relates to the use of non-contact patient monitoring systems to automatically determine a patient tidal volume for breathing. In some embodiments, the systems and methods described herein can employ non-contact patient monitoring technology to determine various characteristics of a patient that can then be used in calculating an appropriate patient tidal volume. Non-limiting examples of patient characteristics that can be obtained using non-contact patient monitoring technology include patient height, patient gender, and the length of one or more segments of the patient's body. In some embodiments, the measured patient characteristic or characteristics obtained using non-contact patient monitoring technology are used to calculate predicted (also sometimes referred to as ideal) patient height and/or body weight, which are then used to calculate an appropriate patient tidal volume.
The outcome of mechanical ventilation may be influenced by the size of breath given to a patient in relation to the size of that patient's lungs. The size of the lungs is influenced by, e.g., the height and gender of the patient, which in turn determines ideal/predicted body weight. Lung protection ventilation strategies are based on keeping delivered volume within a target range of mL of volume delivered for each kg of ideal/predicted body weight (mL/kg).
When a ventilator is used on a patient, various initial settings are input to ensure that the amount of air supplied to the ventilated patient with each breath is appropriate for the size of that patient's lungs. Initial tidal volume-related settings can be selected using patient demographics, such as gender, height, and/or predicted (ideal) weight of the patient. In one example, a predicted body weight (PBW) is calculated based on the gender and height of the patient using one of various preestablished formulas (see, e.g., Moreault, O., Lacasse, Y., & Bussières, J. S. (2017). Calculating ideal body weight: Keep it simple. Anesthesiology: The Journal of the American Society of Anesthesiologists, 127(1), 203-204). The PBW measurement is then used to set an initial tidal volume setting on the ventilator, again using one of various preestablished formulas or correlation charts.
Selecting an appropriate tidal volume setting for a ventilator can therefore depend heavily on obtaining accurate measurements of the various patient demographics. In an ideal setting, a patient's height is manually measured by a clinician so that subsequent calculations used to determine an appropriate tidal volume setting and which rely on the patient's height are as accurate as possible. However, it has recently been observed that many clinicians continue to only estimate a patient's height based on visual observation. Furthermore, in emergency situations, the clinician may not have the time or ability to take a manual measurement of the patient. As a result of these inaccurate patient demographic measurements, erroneous tidal volume settings occur more frequently than desired.
Accordingly, a need exists for methods and systems capable of automating an accurate measurement of various patient demographics used in establishing appropriate patient tidal volume so that more appropriate tidal volume settings can be used, regardless of the clinician's ability to manually measure such patient demographics.
Described herein are various embodiments of methods and systems for automatic determination of a patient's tidal volume using non-contact video-based patient monitoring technology. In one embodiment, a video-based patient monitoring method includes: obtaining a depth sensing image of a patient using a depth sensing camera, the depth sensing image encompassing at least the length of the patient's body; from the depth sensing image, determining the patient's height; calculating a predictive body weight of the patient based on the determined patient height; and calculating a tidal volume for the patient based on the calculated predictive body weight.
In another embodiment, a video-based patient monitoring method includes: obtaining a depth sensing image of a patient using a depth sensing camera; from the depth sensing image, determining the length of a segment of the patient's body; calculating a patient height from the length of the segment of the patient's body; calculating a predicted body weight of the patient based on the calculated patient height; and calculating a tidal volume for the patient based on the calculated predicted body weight.
In another embodiment, a video-based patient monitoring method includes: obtaining a depth sensing image of a patient using a depth sensing camera, the depth sensing image encompassing at least the patient's body; from the depth sensing image, determining the patient's body volume; and calculating a tidal volume of the patient based on the patient's body volume.
Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale. Instead, emphasis is placed on illustrating clearly the principles of the present disclosure. The drawings should not be taken to limit the disclosure to the specific embodiments depicted but are for explanation and understanding only.
Specific details of several embodiments of the present technology are described herein with reference to
The camera 114 can capture a sequence of images over time. The camera 114 can be a depth sensing camera, such as a Kinect camera from Microsoft Corp. (Redmond, Wash.) or an Intel camera such as the D415, D435, and SR305 cameras from Intel Corp. (Santa Clara, Calif.). A depth sensing camera can detect a distance between the camera and objects within its field of view. Such information can be used to determine that a patient 112 is within the FOV 116 of the camera 114 and/or to determine one or more regions of interest (ROI) to monitor on the patient 112. Once a ROI is identified, the ROI can be monitored over time, and the changes in depth of regions (e.g., pixels) within the ROI 102 can represent movements of the patient 112.
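The depth-based patient detection described above can be sketched as follows. The bed-plane distance, margin, and coverage fraction below are illustrative assumptions, not values from the disclosure: pixels measurably closer to the camera than the bed surface are treated as belonging to the patient.

```python
# Sketch: detecting that a patient occupies the camera's field of view from a
# single depth frame (distances in mm). BED_PLANE_MM and MARGIN_MM are
# illustrative assumptions; in practice the bed plane would be calibrated.

BED_PLANE_MM = 2000   # assumed camera-to-bed distance
MARGIN_MM = 150       # pixels this much closer than the bed count as "patient"

def patient_mask(depth_frame):
    """Return a binary mask of pixels likely belonging to the patient."""
    return [[1 if 0 < d < BED_PLANE_MM - MARGIN_MM else 0 for d in row]
            for row in depth_frame]

def patient_in_fov(depth_frame, min_fraction=0.05):
    """Presume a patient is present if enough pixels rise above the bed plane."""
    mask = patient_mask(depth_frame)
    total = sum(len(row) for row in mask)
    raised = sum(sum(row) for row in mask)
    return raised / total >= min_fraction
```

Monitoring a ROI over time would then amount to tracking frame-to-frame depth changes within the masked region.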
In some embodiments, the system 100 determines a skeleton-like outline of the patient 112 to identify a point or points from which to extrapolate a ROI. For example, a skeleton-like outline can be used to find a center point of a chest, shoulder points, waist points, and/or any other points on a body of the patient 112. These points can be used to determine one or more ROIs. For example, a ROI 102 can be defined by filling in an area around a center point 103 of the chest, as shown in
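Filling in a ROI around the chest center point might be sketched as below. The proportioning of the ROI from the shoulder-to-shoulder span is an illustrative assumption; the disclosure does not fix a specific sizing rule.

```python
# Sketch: defining a rectangular chest ROI from skeleton points. The
# height_scale factor relating ROI height to shoulder span is an illustrative
# assumption.

def chest_roi(center, left_shoulder, right_shoulder, height_scale=0.6):
    """Return (x0, y0, x1, y1) of a chest ROI filled in around the center
    point, sized from the shoulder-to-shoulder span."""
    cx, cy = center
    span = abs(right_shoulder[0] - left_shoulder[0])
    half_w = span / 2
    half_h = span * height_scale / 2
    return (cx - half_w, cy - half_h, cx + half_w, cy + half_h)
```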
In another example, the patient 112 can wear specially configured clothing (not shown) that includes one or more features to indicate points on the body of the patient 112, such as the patient's shoulders and/or the center of the patient's chest. The one or more features can include a visually encoded message (e.g., bar code, QR code, etc.), and/or brightly colored shapes that contrast with the rest of the patient's clothing. In these and other embodiments, the one or more features can include one or more sensors that are configured to indicate their positions by transmitting light or other information to the camera 114. In these and still other embodiments, the one or more features can include a grid or another identifiable pattern to aid the system 100 in recognizing the patient 112 and/or the patient's movement. In some embodiments, the one or more features can be affixed to the clothing using a fastening mechanism such as adhesive, a pin, etc. For example, a small sticker can be placed on a patient's shoulders and/or on the center of the patient's chest that can be easily identified within an image captured by the camera 114. The system 100 can recognize the one or more features on the patient's clothing to identify specific points on the body of the patient 112. In turn, the system 100 can use these points to recognize the patient 112 and/or to define a ROI.
In some embodiments, the system 100 can receive user input to identify a starting point for defining a ROI. For example, an image can be reproduced on a display 122 of the system 100, allowing a user of the system 100 to select a patient 112 for monitoring (which can be helpful where multiple objects are within the FOV 116 of the camera 114) and/or allowing the user to select a point on the patient 112 from which a ROI can be determined (such as the point 103 on the chest of the patient 112). In other embodiments, other methods for identifying a patient 112, identifying points on the patient 112, and/or defining one or more ROIs can be used.
The images detected by the camera 114 can be sent to the computing device 115 through a wired or wireless connection 120. The computing device 115 can include a processor 118 (e.g., a microprocessor), the display 122, and/or hardware memory 126 for storing software and computer instructions. Sequential image frames of the patient 112 are recorded by the video camera 114 and sent to the processor 118 for analysis. The display 122 can be remote from the camera 114, such as a video screen positioned separately from the processor 118 and the memory 126. Other embodiments of the computing device 115 can have different, fewer, or additional components than shown in
The computing device 210 can communicate with other devices, such as the server 225 and/or the image capture device(s) 285 via (e.g., wired or wireless) connections 270 and/or 280, respectively. For example, the computing device 210 can send to the server 225 information determined about a patient from images captured by the image capture device(s) 285. The computing device 210 can be the computing device 115 of
In some embodiments, the image capture device(s) 285 are remote sensing device(s), such as depth sensing video camera(s), as described above with respect to
The server 225 includes a processor 235 that is coupled to a memory 230. The processor 235 can store and recall data and applications in the memory 230. The processor 235 is also coupled to a transceiver 240. In some embodiments, the processor 235, and subsequently the server 225, can communicate with other devices, such as the computing device 210 through the connection 270.
The devices shown in the illustrative embodiment can be utilized in various ways. For example, either of the connections 270 and 280 can be varied. Either of the connections 270 and 280 can be a hard-wired connection. A hard-wired connection can involve connecting the devices through a USB (universal serial bus) port, serial port, parallel port, or other type of wired connection that can facilitate the transfer of data and information between a processor of a device and a second processor of a second device. In another embodiment, either of the connections 270 and 280 can be a dock where one device can plug into another device. In other embodiments, either of the connections 270 and 280 can be a wireless connection. These connections can take the form of any sort of wireless connection, including, but not limited to, Bluetooth connectivity, Wi-Fi connectivity, infrared, visible light, radio frequency (RF) signals, or other wireless protocols/methods. For example, other possible modes of wireless communication can include near-field communications, such as passive radio-frequency identification (RFID) and active RFID technologies. RFID and similar near-field communications can allow the various devices to communicate in short range when they are placed proximate to one another. In yet another embodiment, the various devices can connect through an internet (or other network) connection. That is, either of the connections 270 and 280 can represent several different computing devices and network components that allow the various devices to communicate through the internet, either through a hard-wired or wireless connection. Either of the connections 270 and 280 can also be a combination of several modes of connection.
The configuration of the devices in
With reference to
Any manner of determining the patient's height from the depth sensing image captured by the camera 114 can be used. The computing device 115 associated with the camera 114 as shown in
In other embodiments, the computing device 115 runs executable instructions that identify the opposite ends of the patient 400 based on the depth sensing data within the depth sensing image. For example, the computing device 115 may analyze the depth sensing data within the depth sensing image to identify a first end of the patient 400 and a second end of the patient 400 opposite the first end. Identifying these ends may be based on, e.g., identifying locations within the depth sensing image where relatively large changes in measured distance occur, such large changes denoting a transition from the patient's body to the bed upon which the patient 400 is positioned.
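The end-finding approach described above can be sketched as a one-dimensional scan along the bed's long axis. The depth-jump threshold is an illustrative assumption; the disclosure says only that relatively large changes in measured distance denote the transition from body to bed.

```python
# Sketch: locating the two opposite ends of the patient along a head-to-foot
# scan line by finding where the measured surface rises above the bed. The
# JUMP_MM threshold is an illustrative assumption.

JUMP_MM = 100  # a depth difference this large marks a body/bed transition

def body_extent(scan_depths):
    """Given depth samples (mm) along a line running head-to-foot, return the
    (first, last) indices where the surface sits above the bed, or None."""
    baseline = max(scan_depths)  # assume the bed is the farthest surface seen
    on_body = [i for i, d in enumerate(scan_depths)
               if baseline - d > JUMP_MM]
    if not on_body:
        return None
    return on_body[0], on_body[-1]
```

The pixel distance between the two returned indices, scaled by the camera's known geometry, would then yield the patient's height.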
The computing device 115 may also employ, in conjunction with the camera 114 and the captured patient depth sensing image, computer-executable instructions capable of identifying a predicted patient foot region and/or a predicted patient head region within the depth sensing image. Such predicted regions can then be used to measure the height of the patient 400.
With respect to identifying a predicted foot region of the patient 400, a predicted foot region can be determined using any suitable methods, including using methods similar to those described previously. For example, in a method where the depth sensing image is used to determine an outline of the patient, the shape of the obtained outline can be analyzed to identify the predicted foot region. The predicted foot region can be identified based on the predicted foot region having a shape similar to an expected foot region for standard body outlines, or may be identified based on its spatial relation to other portions of the outline that are identified as other parts of the patient's body. For example, identification of a hand, waist, hip, etc., from the obtained boundary shape of the patient may then be used to identify other portions of the outline based on proximity/spatial relation to the identified body part.
With reference to
With respect to identifying a predicted head region of the patient 400, a predicted head region can be determined using any suitable method, including using methods similar to those described previously with respect to identifying a predicted foot region. In some embodiments, facial recognition software can be integrated into the computing device 115 so that the facial recognition software can be used to assist with identifying a predicted head region. For example, facial recognition software can be used to identify the location of a face in the image captured by the camera 114. Location of a face by facial recognition software can then allow for assigning a predicted head region in the location of and/or encompassing the recognized face. A more precise predicted head region determination can be achieved by further using the identification of specific facial features within a facial recognition. For example, identification of a nose or eyes within a facial recognition analysis can then be used to more accurately identify a predicted head region, the predicted head region being established at least based on its proximity and/or spatial relation to the identified facial features.
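Establishing a head region from identified facial features might be sketched as below. The proportions relating inter-eye distance to head size are illustrative assumptions; any face detector output providing eye coordinates could feed this step.

```python
# Sketch: assigning a predicted head region from eye locations reported by
# facial recognition software. The width_factor and height_factor scale
# factors are illustrative assumptions, not values from the disclosure.

def head_region(left_eye, right_eye, width_factor=2.5, height_factor=3.5):
    """Return (x0, y0, x1, y1) of a predicted head region centered on the
    midpoint between the eyes and scaled from the inter-eye distance."""
    mx = (left_eye[0] + right_eye[0]) / 2
    my = (left_eye[1] + right_eye[1]) / 2
    eye_span = abs(right_eye[0] - left_eye[0])
    half_w = eye_span * width_factor / 2
    half_h = eye_span * height_factor / 2
    return (mx - half_w, my - half_h, mx + half_w, my + half_h)
```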
With reference to
When the system identifies the patient or a portion of the patient as being positioned at an angle, the calculation of the patient's height can be adjusted to take into account such an angle. Any suitable manner of calculating the patient's height as being the sum of segments A and B′ shown in
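One suitable calculation treats the angled segment's in-plane projection and the depth difference between its endpoints as the legs of a right triangle, recovering the true segment length by the Pythagorean theorem. The sketch below assumes the height decomposes into one flat segment and one angled segment, per the A and B′ segments referenced above.

```python
# Sketch: angle-corrected patient height. An angled segment's true length is
# recovered from its in-plane projection and the camera-depth difference
# between its endpoints; the height is the sum of the flat segment A and the
# corrected segment B'.

import math

def true_segment_length(projected_len_cm, depth_delta_cm):
    """Length of a segment whose endpoints differ in camera depth."""
    return math.hypot(projected_len_cm, depth_delta_cm)

def patient_height(flat_segment_cm, projected_cm, depth_delta_cm):
    """Height as the sum of flat segment A and angled segment B'."""
    return flat_segment_cm + true_segment_length(projected_cm, depth_delta_cm)
```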
Once the patient's height has been determined in step 320, such as by using any of the methods described previously, the method 300 proceeds to step 330, wherein a predictive body weight (PBW) is calculated based on the height determined in step 320. Any known formula or correlation chart used to calculate PBW from height can be used to carry out step 330. One exemplary formula is described in Moreault, O., Lacasse, Y., & Bussières, J. S. (2017). Calculating ideal body weight: Keep it simple. Anesthesiology: The Journal of the American Society of Anesthesiologists, 127(1), 203-204. For example, a man's predicted body weight can be calculated based on measured height using the following formula:
Weight (kg)=50 kg+(0.91×[Patient Height in Centimeters−152.5])
As noted previously, this calculation can be carried out automatically using the computing device 115. In some embodiments, the computing device 115 is used to determine the patient height in step 320 as described previously, and therefore the computing device 115 immediately has possession of the height calculation for implementation into the PBW calculation in step 330.
As alluded to previously, calculation of PBW may depend on the gender of the patient. For example, the formula provided previously determines PBW from patient height when the patient is a man. The formula for a woman is different, and therefore it may be beneficial for the methods described herein to further take into account the gender of the patient when calculating PBW. In some embodiments, the gender of the patient can be manually input into the system so that the appropriate formula is used when carrying out step 330. However, some embodiments of the method may include an additional step wherein the gender of the patient is determined using the non-contact patient monitoring system.
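The gender-dependent PBW calculation of step 330 can be sketched as follows. The male branch uses the formula quoted in the text; the female base value of 45.5 kg follows the same cited reference but is stated here as an assumption, since the disclosure does not reproduce the female formula.

```python
# Sketch of step 330: predicted body weight (PBW) from height and gender.
# The male formula matches the one quoted in the text; the 45.5 kg female
# base value is an assumption drawn from the same class of published formulas.

def predicted_body_weight(height_cm, gender):
    """PBW in kg from height in cm; gender is 'male' or 'female'."""
    base_kg = 50.0 if gender == "male" else 45.5
    return base_kg + 0.91 * (height_cm - 152.5)
```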
Any suitable method for determining patient gender using non-contact patient monitoring systems can be used. In some embodiments, patient gender is determined by calculating a ratio of shoulder length (S) to waist length (W). With reference to
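A minimal sketch of the S:W-ratio gender determination follows. The 1.4 decision threshold is an illustrative assumption; the disclosure says only that the calculated ratio is compared against preestablished values associated with men or women.

```python
# Sketch: inferring patient gender from the shoulder-length (S) to
# waist-length (W) ratio measured in the depth image. The threshold is an
# illustrative assumption, not a value from the disclosure.

def classify_gender(shoulder_len_cm, waist_len_cm, threshold=1.4):
    """Return 'male' or 'female' from the S:W ratio."""
    ratio = shoulder_len_cm / waist_len_cm
    return "male" if ratio >= threshold else "female"
```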
In step 340, the predictive body weight obtained in step 330 is used to calculate the patient's tidal volume. Any known formula or correlation chart used to calculate tidal volume from PBW can be used. In some embodiments, the tidal volume (in mL) is calculated as being 4 to 10 times the PBW of the patient in kilograms. In other words, the tidal volume used for the ventilator setting is set as being 4 to 10 mL per kilogram of the patient's PBW. As with step 330 discussed above, this calculation can be carried out automatically using the computing device 115. In embodiments where the computing device 115 is used to automatically calculate the PBW from the patient height and gender determined automatically by the non-contact patient monitoring technology associated with the computing device 115, the computing device 115 can also immediately and automatically apply this value to the tidal volume calculation to obtain the desired tidal volume setting for the ventilator.
In embodiments where the tidal volume is calculated as being 4 to 10 times the PBW of the patient in kilograms, it may be necessary to select the specific value between 4 and 10 that is used to carry out the tidal volume calculation. The specific value between 4 and 10 is often selected based on facility and/or clinician preference. Thus, in some embodiments, it may be possible for the preferred value to be entered into and stored in the computing device 115 such that the preferred value between 4 and 10 is automatically used when calculating tidal volume. In a scenario where different clinicians within the same facility and using the same equipment have a different preference for the value between 4 and 10 to be used when calculating tidal volume, it is possible for the unique preference of each clinician to be entered into and stored in the computing device 115 and for the computing device 115 to recognize and/or be told which clinician is treating the monitored patient such that each clinician's preferred value is automatically applied when calculating tidal volume.
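The per-clinician preference scheme described above can be sketched as a stored lookup of mL/kg factors. The clinician identifiers and the default factor of 6 mL/kg are hypothetical; the disclosure specifies only that the factor lies between 4 and 10 and may vary by facility or clinician.

```python
# Sketch of step 340: tidal volume as 4-10 mL per kg of PBW, with a stored
# per-clinician mL/kg preference. The clinician names and default factor are
# hypothetical.

CLINICIAN_ML_PER_KG = {"default": 6.0}  # assumed facility default

def set_preference(clinician, ml_per_kg):
    """Store a clinician's preferred mL/kg factor, bounded to 4-10."""
    if not 4.0 <= ml_per_kg <= 10.0:
        raise ValueError("mL/kg factor must be between 4 and 10")
    CLINICIAN_ML_PER_KG[clinician] = ml_per_kg

def tidal_volume_ml(pbw_kg, clinician="default"):
    """Tidal volume in mL using the treating clinician's stored preference."""
    factor = CLINICIAN_ML_PER_KG.get(clinician, CLINICIAN_ML_PER_KG["default"])
    return factor * pbw_kg
```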
Once the tidal volume calculation is determined in step 340, a ventilator's tidal volume setting can be programmed for a specific patient based on the tidal volume calculated in step 340. When the ventilator is communicatively associated with, for example, the computing device 115, the calculated tidal volume value can be automatically and immediately sent to the ventilator for use in initial ventilator settings for the patient.
Method 300 described previously generally relates to measuring a patient's height and determining patient tidal volume from the measured patient height (via a PBW calculation that depends on the measured patient height). However, it should be appreciated that modifications to method 300 can be implemented such that the patient's height is calculated from measuring a segment of the patient's body, rather than directly measuring the patient's overall height. With reference to
It should be appreciated that a modification to method 300 as described previously is not limited to measuring ulna length and calculating patient height from ulna length. Any other body segment that has been correlated to overall body height can be used in this embodiment. Regardless of the body segment selected, the manner of identifying and measuring the selected body segment can be similar or identical to methods described previously with respect to identifying various parts of the patient in a depth sensing image and using non-contact patient monitoring systems.
Once the patient's predicted height is calculated from a patient's body segment, the method 300 can generally progress in the same manner as described previously (i.e., where PBW is calculated from the calculated patient height in step 330 and tidal volume is then calculated from the calculated PBW in step 340).
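The body-segment modification can be sketched as a linear correlation from segment length to height. The coefficients below are hypothetical placeholders for whichever published ulna-to-height (or other segment) correlation a facility adopts; the disclosure does not fix a particular one.

```python
# Sketch: estimating patient height from a measured body-segment length
# (here, the ulna). The intercept/slope pairs are hypothetical placeholders
# for a published gender-specific correlation.

ULNA_COEFFS = {"male": (79.2, 3.60), "female": (95.6, 2.77)}  # hypothetical

def height_from_ulna(ulna_cm, gender):
    """Predicted height in cm from ulna length in cm."""
    intercept, slope = ULNA_COEFFS[gender]
    return intercept + slope * ulna_cm
```

The returned height would then feed directly into the PBW calculation of step 330.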
With reference to
Once a patient's total body volume 801 is determined using the depth sensing image or other techniques, the computing device 115 (as shown in
In some embodiments, the method further includes a step of determining the gender of the patient 800 prior to selecting the appropriate Rv value. Determination of patient gender may employ similar techniques as described herein previously, such as measuring a patient S:W ratio and making a determination of gender by comparing the calculated S:W ratio to preestablished S:W values associated with men or women. Once the gender is determined, an appropriate Rv value can be selected such that a more accurate tidal volume calculation can be carried out.
Determination of patient age can also be carried out using non-contact patient monitoring technology. In some embodiments, a rough approximation of age (e.g., infant, adolescent, adult) is all that is required to select an appropriate Rv value. In such embodiments, the non-contact patient monitoring technology is used to make a determination of patient age. For example, a depth sensing image captured from a depth sensing camera and which includes an overall outline of the patient body can be used to make determinations of patient age based on, e.g., the overall size of the patient outline and other ratios of body segments that denote whether a patient is an infant, adolescent or adult.
Information on patient gender, age, etc. can also be obtained by other means, such as manually inputting such data into the computing device 115 by a clinician. This data can then be accessed by the computing device 115 at the appropriate time for calculation of the patient tidal volume 802 in conjunction with the measured patient total body volume 801.
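The body-volume-based calculation can be sketched as a lookup of a tidal-volume-to-body-volume ratio Rv keyed on gender and age group. The numeric Rv values below are hypothetical placeholders; the disclosure states only that Rv may vary with patient gender and age.

```python
# Sketch: tidal volume 802 from measured total body volume 801 via a ratio Rv
# of tidal volume to body volume. The Rv entries are hypothetical
# placeholders, not values from the disclosure.

RV_TABLE = {  # (gender, age_group) -> hypothetical dimensionless Rv
    ("male", "adult"): 0.0070,
    ("female", "adult"): 0.0065,
}

def tidal_volume_from_body_volume(body_volume_ml, gender, age_group):
    """Tidal volume in mL from total body volume in mL."""
    rv = RV_TABLE[(gender, age_group)]
    return rv * body_volume_ml
```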
From the foregoing, it will be appreciated that specific embodiments of the invention have been described herein for purposes of illustration, but that various modifications may be made without deviating from the scope of the invention. Accordingly, the invention is not limited except as by the appended claims.
The present application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 63/137,886, entitled “Methods for Automatic Patient Tidal Volume Determination Using Non-Contact Patient Monitoring Systems”, filed Jan. 15, 2021, the entirety of which is incorporated herein by reference.