The present disclosure relates generally to a wound therapy system, and more particularly to measuring range of motion during healing progression of a wound.
Negative pressure wound therapy (NPWT) is a type of wound therapy that involves applying a negative pressure to a wound site to promote wound healing. Some wound treatment systems apply negative pressure to a wound using a pneumatic pump to generate the negative pressure and flow required. Recent advancements in wound healing with NPWT involve applying topical fluids to wounds to work in combination with NPWT. However, it can be difficult to measure range of motion accurately and precisely as the wound heals.
One implementation of the present disclosure is a system for calculating range of motion of a patient's jointed limb. In some embodiments, the system includes a drape adhered to a patient's skin of the jointed limb. The drape can include multiple locators. One or more of the locators may be positioned at an upper limb of the jointed limb, and one or more of the locators may be positioned at a lower limb of the jointed limb. The system can include a personal computer device having an imaging device. The personal computer device may be configured to record a first image of the patient's joint in a fully extended position with the imaging device and a second image of the patient's joint in a fully flexed position with the imaging device. The personal computer device can be configured to identify positions of the locators of both the first image and the second image, determine an extended angle of the patient's joint based on the identified positions of the locators of the first image, and determine a flexed angle of the patient's joint based on the identified positions of the locators of the second image. The personal computer device can be configured to determine a range of motion angle based on the extended angle and the flexed angle.
In some embodiments, the personal computer device is a mobile device with an application configured to determine the range of motion angle.
In some embodiments, the positions of the locators of both the first image and the second image are identified based on image data of the first image and the second image.
In some embodiments, the personal computer device is configured to generate a report and control a display screen to display the report.
In some embodiments, the report includes any of the range of motion angle, tabular historical information of the range of motion angle, graphical historical information of the range of motion angle, and improvements in the range of motion angle over time.
In some embodiments, the personal computer device is configured to provide the report to a clinician device.
In some embodiments, the personal computer device is configured to perform a calibration process to determine offset amounts for any of the flexed angle, the extended angle, and the range of motion angle to account for orientation of the imaging device relative to the jointed limb.
In some embodiments, the calibration process includes analyzing the first image and the second image to determine a difference in a shape of the locators relative to a known shape of the locators. The calibration process can further include determining an orientation of the imaging device relative to the jointed limb based on the difference in the shape of the locators. The calibration process can further include determining an offset amount for any of the flexed angle, the extended angle, and the range of motion angle to account for the orientation of the imaging device relative to the jointed limb.
In some embodiments, the difference in the shape of the locators is determined based on one or more initially recorded images.
In some embodiments, the personal computer device is configured to provide a notification to the patient to record the first image and the second image.
In some embodiments, the personal computer device is further configured to generate centerlines to determine the extended angle and the flexed angle.
Another implementation of the present disclosure is a controller for calculating a range of motion of a patient's jointed limb. In some embodiments, the controller is configured to record a first image of the patient's joint in a fully extended position with an imaging device and record a second image of the patient's joint in a fully flexed position with the imaging device. The controller can be configured to identify positions of the locators of both the first image and the second image. The controller can be configured to determine an extended angle of the patient's joint based on the identified positions of the locators of the first image, and determine a flexed angle of the patient's joint based on the identified positions of the locators of the second image. The controller can be configured to determine a range of motion angle based on the extended angle and the flexed angle.
In some embodiments, the controller is a mobile device with an application configured to determine the range of motion angle.
In some embodiments, the positions of the locators of both the first image and the second image are identified based on image data of the first image and the second image.
In some embodiments, the controller includes a display screen and is configured to generate a report and control the display screen to display the report.
In some embodiments, the report includes any of the range of motion angle, tabular historical information of the range of motion angle, graphical historical information of the range of motion angle, and improvements in the range of motion angle over time.
In some embodiments, the controller is configured to provide the report to a clinician device.
In some embodiments, the controller is configured to perform a calibration process to determine offset amounts for any of the flexed angle, the extended angle, and the range of motion angle to account for orientation of the imaging device relative to the jointed limb.
In some embodiments, the calibration process includes analyzing the first image and the second image to determine a difference in a shape of the locators relative to a known shape of the locators. The calibration process can further include determining an orientation of the imaging device relative to the jointed limb based on the difference in the shape of the locators, and determining an offset amount for any of the flexed angle, the extended angle, and the range of motion angle to account for the orientation of the imaging device relative to the jointed limb.
In some embodiments, the difference in the shape of the locators is determined based on one or more initially recorded images.
In some embodiments, the controller is configured to provide a notification to the patient to record the first image and the second image.
In some embodiments, the controller is further configured to generate centerlines that extend through the locators to determine the extended angle and the flexed angle.
Another implementation of the present disclosure is a method for calculating range of motion of a patient's jointed limb. In some embodiments, the method includes providing locators on the patient's jointed limb. One or more of the locators can be positioned at an upper limb of the jointed limb, and one or more of the locators can be positioned at a lower limb of the jointed limb. The method can include capturing a first image of the patient's joint in a fully extended position with an imaging device, and capturing a second image of the patient's joint in a fully flexed position with the imaging device. The method can include identifying positions of the locators of both the first image and the second image, and determining an extended angle of the patient's joint based on the identified positions of the locators of the first image. The method can include determining a flexed angle of the patient's joint based on the identified positions of the locators of the second image. The method can include determining a range of motion angle based on the extended angle and the flexed angle.
In some embodiments, the steps of capturing the first image, capturing the second image, identifying the positions of the locators, determining the extended angle, determining the flexed angle, and determining the range of motion are performed by a mobile device with an application.
In some embodiments, identifying the positions of the locators of both the first image and the second image includes identifying the positions of the locators based on image data of the first image and the second image.
In some embodiments, the method further includes generating a report and controlling a display screen to display the report.
In some embodiments, the report includes any of the range of motion angle, tabular historical information of the range of motion angle, graphical historical information of the range of motion angle, and improvements in the range of motion angle over time.
In some embodiments, the method further includes providing the report to a clinician device.
In some embodiments, the method further includes performing a calibration process to determine offset amounts for any of the flexed angle, the extended angle, and the range of motion angle to account for orientation of the imaging device relative to the jointed limb.
In some embodiments, the calibration process includes analyzing the first image and the second image to determine a difference in a shape of the locators relative to a known shape of the locators, and determining an orientation of the imaging device relative to the jointed limb based on the difference in the shape of the locators. The calibration process may include determining an offset amount for any of the flexed angle, the extended angle, and the range of motion angle to account for the orientation of the imaging device relative to the jointed limb.
In some embodiments, determining the difference in the shape of the locators includes comparing the shape of the locators to one or more initially recorded images.
In some embodiments, the method further includes providing a notification to the patient to record the first image and the second image.
In some embodiments, the method further includes generating centerlines that extend through the locators to determine the extended angle and the flexed angle.
Another implementation of the present disclosure is a method for performing negative pressure wound therapy and calculating a range of motion of a jointed limb, according to some embodiments. The method can include providing a dressing having a comfort layer, a manifold, and a drape positioned at a wound. The method can further include providing locators onto the dressing. The method can further include applying negative pressure to the wound through the dressing. The method can further include relieving the negative pressure applied to the wound. The method can further include calculating a range of motion of the jointed limb. The method can further include re-applying negative pressure to the wound through the dressing. Calculating the range of motion of the jointed limb can include capturing a first image of the jointed limb in a fully extended position with an imaging device. Calculating the range of motion of the jointed limb can further include capturing a second image of the jointed limb in a fully flexed position with the imaging device. Calculating the range of motion of the jointed limb can further include identifying positions of the locators of both the first image and the second image. Calculating the range of motion of the jointed limb can further include determining an extended angle of the patient's joint based on the identified positions of the locators of the first image, determining a flexed angle of the patient's joint based on the identified positions of the locators of the second image, and determining a range of motion angle based on the extended angle and the flexed angle.
Those skilled in the art will appreciate that the summary is illustrative only and is not intended to be in any way limiting. Other aspects, inventive features, and advantages of the devices and/or processes described herein, as defined solely by the claims, will become apparent in the detailed description set forth herein and taken in conjunction with the accompanying drawings.
Referring generally to the FIGURES, systems and methods for measuring range of motion of a patient's joint are shown. A smartphone or a personal computer device can be used with an installed mobile application for measuring the range of motion of the patient's joint. Three or four locators (e.g., dots, squares, reflective material, etc.) can be pre-affixed to a dressing, a drape, or the patient's skin. The patient can be reminded at regular time intervals to measure the range of motion. When the patient measures the range of motion, images of the joint in both the fully flexed and the fully extended configurations are recorded. The mobile application identifies positions of the locators, generates lines that extend through the locators, and determines angles between the lines. The mobile application determines angles for both the fully flexed image and the fully extended image. The mobile application then determines a difference between the fully extended angle and the fully flexed angle as the range of motion. The mobile application can configure the smartphone to communicate with a clinician device. The mobile application may generate reports and operate a screen of the smartphone to display the reports. The mobile application can also store range of motion measurements throughout healing of the patient's wound. The mobile application can provide reports (e.g., graphs, tabular data, analysis, etc.) to the clinician device.
The mobile application can also perform a calibration technique to identify position and orientation of the smartphone relative to the patient's limb. The mobile application can analyze the images to determine orientation of the smartphone relative to the patient's limb. The mobile application offsets the range of motion to account for orientation of the smartphone relative to the patient's limb. Advantageously, the systems and methods described herein can enable a patient to measure the range of motion of their limb at home. The range of motion can be provided to a remotely positioned clinician device for clinician monitoring, analysis, and checkups.
Referring now to
System 10 includes a negative pressure wound therapy (NPWT) system 28 applied to a patient's wound, according to some embodiments. NPWT system 28 can include a dressing 36 that substantially covers and seals the patient's wound. Dressing 36 can be adhered and sealed to the patient's skin 32 and can cover the patient's wound. Dressing 36 can be a foam dressing that adheres to the patient's skin 32. NPWT system 28 can include a therapy device 300 (e.g., a NPWT device) that fluidly couples with an inner volume of the patient's wound. Therapy device 300 can be configured to draw a negative pressure at the patient's wound. Therapy device 300 can be fluidly coupled with the patient's wound through conduit, tubing, medical tubing, flexible tubing, etc., shown as tubular member 30. Tubular member 30 can include an inner volume for drawing a negative pressure at the patient's wound. Tubular member 30 can include an inner volume for providing instillation fluid (e.g., a saline solution) to the patient's wound.
Tubular member 30 can be fluidly coupled with therapy device 300 at a first end (not shown) and with the patient's wound at a second end. In some embodiments, tubular member 30 fluidly couples with the patient's wound (e.g., an inner volume of the patient's wound) through a connector 34. Connector 34 can be sealed on an exterior surface of dressing 36 and can be fluidly coupled with an inner volume between the patient's skin/wound and dressing 36. Tubular member 30 is configured to facilitate drawing negative pressure at the wound site.
In some embodiments, a drape 18 is adhered to patient's skin 32 and covers substantially the entire dressing 36. Drape 18 can be a thin film, a plastic film, a plastic layer, etc., that adheres to an exterior surface of dressing 36 and skin surrounding dressing 36 (e.g., periwound skin). Drape 18 can seal with the patient's skin 32 to facilitate a sealed fluid connection between tubular member 30 (e.g., and the NPWT device) and the patient's wound or surgical incision.
In some embodiments, trackers, locators, dots, etc., shown as locators 20 are applied to drape 18. Locators 20 can be printed on drape 18, adhered to drape 18 after drape 18 is applied, or adhered to the patient's skin 32 before drape 18 is applied. For example, if drape 18 is transparent or translucent, locators 20 can be applied to the patient's skin 32 before drape 18 is applied. Drape 18 can then be applied over locators 20 which are still visible through drape 18. In other embodiments, locators 20 are applied onto an exterior surface of drape 18 after drape 18 is applied to skin 32. In still other embodiments, locators 20 are applied to the patient's skin 32 surrounding drape 18. For example, locators 20 can be applied to the patient's skin 32 at various locations along a perimeter of drape 18.
Locators 20 can be any visual indicator that can be tracked, located, etc., to determine range of motion of the patient's limb. In some embodiments, three locators 20 are applied to the patient's limb. For example, one locator 20b can be applied to joint 12 of the patient's limb, while another locator 20a is applied at upper limb 14, and another locator 20c is applied at lower limb 16. Locators 20 can be applied to any joint or hingedly coupled limbs of a patient. For example,
Angle 22 is formed between centerline 24 and centerline 26. Centerline 24 extends between a center of locator 20a and a center of locator 20b (the locator that is positioned at joint 12), according to some embodiments. Centerline 26 can extend between a center of locator 20c and a center of locator 20b (the locator that is positioned/applied at joint 12). Centerlines 24 and 26 can define angle 22 that indicates a degree of extension or flexion of the jointed limbs of the patient. Angle 22 can be calculated/determined by identifying locations/positions of locators 20, adding centerlines 24 and 26 through locators 20, and calculating angle 22 between centerlines 24 and 26.
Referring now to
Smartphone 100 can perform an angle or range of motion analysis to determine angle 22. In some embodiments, smartphone 100 can download and install an application (e.g., a mobile app) that configures smartphone 100 to calculate angle 22. The mobile app can use various sensory inputs of smartphone 100 to obtain image data and calculate angle 22. For example, smartphone 100 can include a camera, an accelerometer, touchscreen 102, a user interface, buttons, wireless communications, etc. The application can use any of the sensory inputs from the user interface, accelerometer, camera, touchscreen 102, buttons, wireless communications, etc., to determine/identify the locations of locators 20 and to calculate a value of angle 22. The application may configure smartphone 100 to perform any of the functionality, techniques, processes, etc., locally (e.g., via a processor and/or processing circuit that is locally disposed within smartphone 100). The application can configure smartphone 100 to provide image data and/or any other sensor data to a remote server, and the remote server performs any of the functionality, techniques, processes, etc., described herein to determine locations of locators 20 and to calculate a value of angle 22. In some embodiments, angle 22 is referred to as angle θ.
The application can prompt the patient to capture imagery data (e.g., take a picture) at a fully flexed state and a fully extended state. For example, the application can prompt the patient to fully flex their jointed limb and record image data. The application can then prompt the patient to fully extend their jointed limb and record image data. In some embodiments, the application prompts the patient to record fully flexed and fully extended image data via touchscreen 102. For example, the application can provide notifications, alerts, reminders, etc., that the patient should capture both fully flexed and fully extended image data. Smartphone 100 and/or a remote server can be configured to perform a process, algorithm, image analysis technique, etc., to determine a value of angle 22 in both the fully flexed position and the fully extended position. The fully extended value of angle 22 can be referred to as θextend and the fully flexed value of angle 22 can be referred to as θflexed. θextend can be determined by the application (e.g., locally by a processor and/or processing circuit of smartphone 100, or remotely by a remote device, server, collection of devices, etc.) based on the fully extended image data. θflexed can be determined by the application similarly to θextend based on the fully flexed image data.
The value of angle 22 can be determined by performing an image analysis technique to determine locations of locators 20. For example, the application can configure smartphone 100 to identify locations of locators 20. In some embodiments, if three locators (e.g., 20a, 20b, and 20c) are used, the application identifies locations p1, p2, and p3 of locators 20. The identified locations p can include an x-position coordinate and a y-position coordinate. For example, the application can determine that locator 20a has a location p1={x1 y1}, that locator 20b has a location p2={x2 y2}, and that locator 20c has a location p3={x3 y3}. The application can use the determined locations to generate centerlines 24 and 26. For example, centerline 24 can be determined based on the identified location of locator 20a and the identified location of locator 20b. The application can be configured to use the identified locations of locator 20a and locator 20b to determine a linear line that extends through both locator 20a and locator 20b. For example, the application can determine centerline 24 in point-point form based on the identified locations of locators 20a and 20b as:

y − y1 = ((y2 − y1)/(x2 − x1))(x − x1)

according to some embodiments.
The application can also be configured to determine centerline 24 and/or centerline 26 in point-slope form. In some embodiments, the application determines vectors (e.g., unit vectors) in Cartesian form, polar form, etc. The application can determine a value of the angle 22 based on the equations, vectors, etc., of centerline 24 and centerline 26. The application can determine an intersection location where centerline 24 and centerline 26 intersect, and determine angle 22 between centerline 24 and centerline 26 at the intersection location.
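By way of example, the centerline generation and angle calculation described above can be sketched in Python, according to some embodiments. The helper name centerline and the coordinate values are illustrative only, and handling of vertical centerlines (equal x-values) is omitted from this sketch:

import math

def centerline(p_a, p_b):
    """Slope and intercept of a linear line through two locator centers.

    Derived from point-point form; a vertical centerline (equal x-values)
    would need special handling that is omitted in this sketch.
    """
    (x1, y1), (x2, y2) = p_a, p_b
    slope = (y2 - y1) / (x2 - x1)
    return slope, y1 - slope * x1  # y = slope * x + intercept

# Three identified locator positions: 20a, 20b (at the joint), 20c
p1, p2, p3 = (120.0, 80.0), (200.0, 220.0), (340.0, 260.0)
m24, b24 = centerline(p1, p2)  # centerline 24
m26, b26 = centerline(p3, p2)  # centerline 26

# Angle 22 as the difference between each centerline's inclination to the
# horizontal axis; slope-based angles are only defined modulo 180 degrees
theta = abs(math.degrees(math.atan(m24) - math.atan(m26)))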
Referring now to
In some embodiments, using four locators 20a-d provides a more accurate measurement of angle 22. For example, the precision, repeatability, accuracy, reliability, etc., of the value of angle 22 can be improved by using four locators 20a-d. Four locators 20 or three locators 20 can be used as preferred by a clinician.
Referring now to
Referring still to
Communications interface 608 can be or include wired or wireless communications interfaces (e.g., jacks, antennas, transmitters, receivers, transceivers, wire terminals, etc.) for conducting data communications with clinician device 612 or other external systems or devices. In various embodiments, communications via communications interface 608 can be direct (e.g., local wired or wireless communications) or via a communications network (e.g., a WAN, the Internet, a cellular network, Bluetooth, etc.). For example, communications interface 608 can include an Ethernet card and port for sending and receiving data via an Ethernet-based communications link or network. In another example, communications interface 608 can include a Wi-Fi transceiver for communicating via a wireless communications network. In another example, communications interface 608 can include cellular or mobile phone communications transceivers. In one embodiment, communications interface 608 is a power line communications interface. In other embodiments, communications interface 608 is an Ethernet interface.
Still referring to
Memory 606 (e.g., memory, memory unit, storage device, etc.) can include one or more devices (e.g., RAM, ROM, Flash memory, hard disk storage, etc.) for storing data and/or computer code for completing or facilitating the various processes, layers and modules described in the present application. Memory 606 can be or include volatile memory or non-volatile memory. Memory 606 can include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present application. According to some embodiments, memory 606 is communicably connected to processor 604 via processing circuit 602 and includes computer code for executing (e.g., by processing circuit 602 and/or processor 604) one or more processes described herein.
In some embodiments, the functionality of smartphone 100 is implemented within a single computer (e.g., one server, one housing, one computer, etc.). In various other embodiments the functionality of smartphone 100 can be distributed across multiple servers or computers (e.g., that can exist in distributed locations).
Memory 606 includes calibration manager 618, locator position manager 620, and range of motion (ROM) manager 622, according to some embodiments. Calibration manager 618, locator position manager 620, and ROM manager 622 can be configured to perform visual imaging processes to determine θextend and θflexed. Calibration manager 618, locator position manager 620, and ROM manager 622 can be configured to receive image data from imaging device 614 to determine θextend and θflexed.
In some embodiments, locator position manager 620 is configured to receive image data from imaging device 614. Locator position manager 620 can receive image data for both a fully flexed position of joint 12 and a fully extended position of joint 12. Locator position manager 620 can receive real time image data from imaging device 614. Locator position manager 620 can receive an image file (e.g., a .jpeg file, a .png file, a .bmp file, etc.) from imaging device 614.
Locator position manager 620 is configured to perform an image processing technique to identify the positions of locators 20, according to some embodiments. Locator position manager 620 can determine the positions of locators 20 based on any of color of locators 20, shape of locators 20, brightness of locators 20, contrast of locators 20, etc. For example, locator position manager 620 can use kernel-based tracking, contour tracking, etc., or any other image analysis technique to determine the positions of locators 20. Locator position manager 620 can use a neural network technique (e.g., a convolutional neural network) to identify positions of locators 20 in the image file. Locator position manager 620 can be configured to use a Kalman filter, a particle filter, a condensation algorithm, etc., to identify the positions of locators 20. Locator position manager 620 can also use an object detection technique to identify the positions of locators 20. For example, locator position manager 620 can use a region-based convolutional neural network (RCNN) to identify the positions of locators 20.
Locator position manager 620 can determine the positions pi of any of locators 20 and provide the determined positions pi to ROM manager 622. In some embodiments, the determined positions of locators 20 are Cartesian coordinates (e.g., x and y positions of each of locators 20) relative to a coordinate system (e.g., relative to a corner of the image, relative to a center of the image, relative to a location of the image, etc.).
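By way of example, a minimal locator-detection sketch in Python using OpenCV is shown below. It uses simple brightness thresholding and contour centroids as one concrete possibility (assuming bright or reflective locators); the disclosure contemplates other techniques (e.g., kernel-based tracking, neural networks), and the threshold value is a placeholder rather than a tuned parameter:

import cv2

def locate_markers(image_path, expected_count=3):
    """Return centroid positions of the brightest blobs, assumed to be
    locators 20, as Cartesian pixel coordinates relative to the image
    origin. A minimal sketch; the threshold is a placeholder value."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Keep the largest blobs, assumed to be the locators
    contours = sorted(contours, key=cv2.contourArea, reverse=True)
    positions = []
    for c in contours[:expected_count]:
        m = cv2.moments(c)
        if m["m00"] > 0:  # skip degenerate contours
            positions.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return positions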
Locator position manager 620 can analyze the fully flexed and the fully extended image data concurrently or independently. For example, locator position manager 620 may first receive the fully flexed image data, determine the positions of locators 20, and provide the positions of locators 20 to ROM manager 622, and then receive the fully extended image data, determine the positions of locators 20, and provide the positions of locators 20 to ROM manager 622. In some embodiments, locator position manager 620 receives both the fully flexed and the fully extended image data, and identifies the positions of locators 20 for both images concurrently.
Locator position manager 620 can generate a first set Pflex of position data of locators 20, and a second set Pextend of position data of locators 20. In some embodiments, the first set Pflex includes the identified positions of locators 20 for the fully flexed image data and the second set Pextend includes the identified positions of locators 20 for the fully extended image data. For example, Pflex may have the form Pflex=[p1 p2 … pn] where n is the number of locators 20, and pi is the position data of an ith locator 20. Likewise, Pextend may have the form Pextend=[p1 p2 … pn].
If three locators 20 are used (as shown in
Locator position manager 620 provides the positions of locators 20 (e.g., Pflex and Pextend) to ROM manager 622. ROM manager 622 is configured to determine θextend and θflexed and a range of motion θROM of joint 12 based on the positions of locators 20. In some embodiments, ROM manager 622 is configured to generate centerline 24 and centerline 26 based on the positions of locators 20. Centerline 24 and centerline 26 can be linear lines that extend between corresponding positions of locators 20.
If three locators 20 are used, ROM manager 622 can generate centerline 24 between locator 20a and locator 20b. For example, ROM manager 622 can use the positions p1 and p2 to generate centerline 24. In some embodiments, if the positions are Cartesian coordinates, ROM manager 622 generates centerline 24 using point-point form:

y − y1 = ((y2 − y1)/(x2 − x1))(x − x1)

where x1 is the x-position of locator 20a (or locator 20b), y1 is the y-position of locator 20a (or locator 20b), x2 is the x-position of locator 20b (or locator 20a), and y2 is the y-position of locator 20b (or locator 20a) as identified/determined by locator position manager 620.
ROM manager 622 can similarly generate centerline 26 through locators 20b and 20c using point-point form:

y − y2 = ((y3 − y2)/(x3 − x2))(x − x2)

where y3 is the y-position of locator 20c, and x3 is the x-position of locator 20c as determined by locator position manager 620.
In some embodiments, ROM manager 622 is configured to use the equations of centerline 24 and centerline 26 to determine a value of angle 22. ROM manager 622 can be configured to determine the value of angle 22 based on the positions of locators 20. ROM manager 622 can determine an angle between centerline 24 and a horizontal or vertical axis, and an angle between centerline 26 and a horizontal or vertical axis. In some embodiments, ROM manager 622 uses the equation:

θ = arctan((y1 − y2)/(x1 − x2)) − arctan((y3 − y2)/(x3 − x2))

to determine angle 22 (θ), where y1 is the y-position of locator 20a, x1 is the x-position of locator 20a, y2 is the y-position of locator 20b, x2 is the x-position of locator 20b, y3 is the y-position of locator 20c, and x3 is the x-position of locator 20c.
In some embodiments, ROM manager 622 uses the equation:

θ = arccos(((x1 − x2)(x3 − x2) + (y1 − y2)(y3 − y2)) / (√((x1 − x2)² + (y1 − y2)²) · √((x3 − x2)² + (y3 − y2)²)))

to determine angle 22 (θ).
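A direct implementation of the dot-product form is sketched below in Python, according to some embodiments; the function name vertex_angle and the coordinate values are illustrative only:

import math

def vertex_angle(p1, p2, p3):
    """Angle 22 at joint locator p2 between vectors p2->p1 and p2->p3.

    Implements the arccosine (dot-product) form, which returns the joint
    angle in the full 0-180 degree range.
    """
    v1 = (p1[0] - p2[0], p1[1] - p2[1])
    v2 = (p3[0] - p2[0], p3[1] - p2[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(v1[0], v1[1]) * math.hypot(v2[0], v2[1])
    return math.degrees(math.acos(dot / norm))

theta_extend = vertex_angle((120.0, 80.0), (200.0, 220.0), (340.0, 260.0))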
In some embodiments, if four locators 20 are used, ROM manager 622 determines a position of a point of intersection (POI) of centerlines 24 and 26. ROM manager 622 can determine centerlines 24 and 26 using the techniques described above to generate linear equations. ROM manager 622 can generate a line that extends between locator 20a (p1) and locator 20b (p2) as centerline 24, and a line that extends between locator 20c (p3) and locator 20d (p4) as centerline 26. ROM manager 622 can set the equations of centerlines 24 and 26 equal to each other and solve for an x or y position of the POI. In some embodiments, the determined value of the x or y position is input into the equation of centerline 24 or centerline 26 to determine the position of the POI.
ROM manager 622 then uses the equations of centerlines 24 and 26 and the POI to determine angle 22. In some embodiments, ROM manager 622 uses trigonometric identities, the Pythagorean theorem, etc., to determine angle 22 based on the equations of centerlines 24 and 26, the POI, and the positions of locators 20a-d.
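By way of example, the POI calculation can be sketched in Python as follows, with each centerline represented by an illustrative (slope, intercept) pair; parallel centerlines (equal slopes) have no single POI and are not handled in this sketch:

import math

def point_of_intersection(line_a, line_b):
    """POI of two centerlines, each given as (slope, intercept).

    Sets the two line equations equal and solves for x, then substitutes
    back into one equation to recover y.
    """
    (ma, ba), (mb, bb) = line_a, line_b
    x = (bb - ba) / (ma - mb)
    return x, ma * x + ba

def angle_between(line_a, line_b):
    """Angle 22 between two centerlines from their slopes, in degrees."""
    return abs(math.degrees(math.atan(line_a[0]) - math.atan(line_b[0])))

# Centerline 24 through locators 20a-20b, centerline 26 through 20c-20d
poi = point_of_intersection((0.5, 10.0), (-1.2, 300.0))
theta = angle_between((0.5, 10.0), (-1.2, 300.0))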
ROM manager 622 can use any of the techniques, processes, methods, etc., described in greater detail hereinabove to determine the value of angle 22. ROM manager 622 can analyze both the fully flexed image data and the fully extended image data to determine values of angle 22. The value of angle 22 determined by ROM manager 622 based on the fully flexed image data is θflexed, and the value of angle 22 determined by ROM manager 622 based on the fully extended image data is θextend.
In some embodiments, ROM manager 622 uses θflexed and θextend to determine θROM. θROM is an angular amount that the patient can flex or extend joint 12 from the fully extended to the fully flexed position. In some embodiments, ROM manager 622 is configured to determine θROM using the equation: θROM=θextend−θflexed.
ROM manager 622 can provide the fully extended angle θextend, the fully flexed angle θflexed, and the range of motion angle θROM to ROM database 624. ROM database 624 can be a local database (e.g., memory 606 of smartphone 100), or a remote database (e.g., a remote server) that is configured to wirelessly communicate with smartphone 100. In some embodiments, ROM database 624 is configured to store any of the received angular values. ROM database 624 can also be configured to store a time, date, location, etc., at which the image data is captured, a time and date of when the angular values are calculated, etc. ROM database 624 can retrieve or receive a current time tcurrent from timer 628. Timer 628 can be a clock, a calendar, etc. ROM database 624 can store a datapoint including the range of motion angle θROM, the fully flexed angle θflexed, the fully extended angle θextend, and the corresponding time t at which the angular measurements are obtained/recorded. ROM database 624 can also receive the determined/identified positions of locators 20 from locator position manager 620 and store the positions of locators 20 that are used to calculate the angular values and the range of motion angle.
ROM database 624 can store any of the received angular values, the positions of locators 20, and the time at which the measurement is recorded or obtained as a table, a chart, a matrix, vectors, time series data, etc. In some embodiments, ROM database 624 stores the angular values (e.g., θROM, θextend, θflexed, etc.), the positions of locators 20 used to determine the angular values, and the time at which the angular values are recorded/obtained in a CSV file. ROM database 624 can also be configured to receive the image data used to obtain/calculate the range of motion angle from imaging device 614 and/or locator position manager 620 and store the image data (e.g., the fully flexed and the fully extended image data files) with the corresponding time at which the image data is recorded/obtained.
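By way of example, a datapoint of the kind stored by ROM database 624 can be appended to a CSV file as sketched below in Python; the column order and function name are illustrative only:

import csv
from datetime import datetime

def append_rom_record(path, theta_rom, theta_extend, theta_flexed, positions):
    """Append one datapoint (timestamp, angles, locator positions) to a CSV."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.now().isoformat(),           # time t of the measurement
            theta_rom, theta_extend, theta_flexed,
            ";".join(f"{x:.1f},{y:.1f}" for x, y in positions),
        ])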
In some embodiments, ROM database 624 is configured to provide timer 628 and/or display manager 630 with a time tprev at which the previous angular values (e.g., the range of motion angle) were recorded. Timer 628 and/or display manager 630 can use a current time (e.g., a current date, a current time of day, etc.) to determine an amount of elapsed time since the previous range of motion was obtained. Timer 628 and/or display manager 630 can be configured to compare the amount of elapsed time to a threshold value ΔtROM (e.g., 24 hours, 48 hours, 1 week, etc.). The threshold value can be a frequency of how often the patient should record the range of motion angle. For example, ΔtROM can be 24 hours (indicating that the range of motion angle of joint 12 should be recorded daily), 48 hours (indicating that the range of motion angle of joint 12 should be recorded every other day), etc. In some embodiments, ΔtROM is a predetermined threshold value. ΔtROM can be a value set by a clinician or a medical professional. For example, if the clinician desires the patient to record the range of motion angle of joint 12 daily, the clinician can set ΔtROM to a 24 hour period. The clinician can remotely set or adjust (e.g., increase or decrease) the threshold value ΔtROM.
The threshold value ΔtROM can be a value that is set at a beginning of NPWT and remains the same over an entire duration of a NPWT therapy time (e.g., a month, a week, two weeks, etc.). In some embodiments, the threshold value ΔtROM changes according to a schedule as the NPWT progresses. For example, the threshold value ΔtROM may be a smaller value (e.g., 24 hours) over a first time interval of the NPWT therapy time (e.g., the first week), and a larger value (e.g., 48 hours) over a second time interval of the NPWT therapy time. The clinician can set the schedule of ΔtROM at a beginning of NPWT. The clinician can remotely set, adjust, or change the schedule of ΔtROM (e.g., with clinician device 612 that is configured to wirelessly communicate with smartphone 100).
Display manager 630 and/or timer 628 compare the amount of elapsed time since the previously recorded range of motion angle to the threshold value ΔtROM to determine if the patient should record the range of motion angle θROM. If the amount of time elapsed since the previously recorded range of motion angle is greater than or equal to the threshold value ΔtROM, display manager 630 can operate touchscreen 102 to provide a notification, a message, a reminder, a pop-up, etc., to the patient. The notification can remind the patient that it is time to record the range of motion angle of joint 12 and prompt the patient to launch the application. In some embodiments, the notification or reminder includes a value of amount of elapsed time since the previously recorded range of motion angle.
In some embodiments, display manager 630 is configured to notify or prompt the patient to record the range of motion angle before the time since the last recorded range of motion angle is greater than or equal to the threshold value ΔtROM. For example, display manager 630 can pre-emptively remind, notify, prompt, etc., the patient to launch the application to record the range of motion angle to ensure that the patient does not forget to record the range of motion angle.
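By way of example, the elapsed-time comparison performed by timer 628 and/or display manager 630 can be sketched in Python as follows, with a 24-hour ΔtROM shown as an illustrative clinician-set value:

from datetime import datetime, timedelta
from typing import Optional

ROM_INTERVAL = timedelta(hours=24)  # threshold Δt_ROM, set by the clinician

def reminder_due(t_prev: datetime, now: Optional[datetime] = None) -> bool:
    """True once the elapsed time since time t_prev reaches Δt_ROM,
    indicating the patient should be prompted to record the ROM angle."""
    elapsed = (now or datetime.now()) - t_prev
    return elapsed >= ROM_INTERVAL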
Smartphone 100 can launch the application in response to receiving a user input via touchscreen 102. When the application is launched, the application can transition between a flex mode and an extend mode. When in the flex mode, display manager 630 can provide a message to the patient through touchscreen 102 (or any other display device) to record an image with joint 12 fully flexed. Likewise, when in the extend mode, display manager 630 can provide a message to the patient through touchscreen 102 to record an image with joint 12 fully extended. The images can be captured by smartphone 100 and provided to locator position manager 620 and calibration manager 618 in response to a user input via touchscreen 102 (e.g., in response to the user pressing a button on touchscreen 102).
Referring still to
In some embodiments, reporting manager 626 is configured to operate touchscreen 102 to display the report in response to receiving a request via touchscreen 102 indicating that the patient desires to see the report. The report provided to the patient can be generated based on user inputs. For example, the patient can indicate that the report should include a time series graph, tabular information, percent improvements in the range of motion angle, etc.
In some embodiments, reporting manager 626 is configured to automatically provide the report to the patient via touchscreen 102 in response to the range of motion angle being recorded. For example, after the patient launches the application, captures images, and the range of motion angle is determined, reporting manager 626 may operate touchscreen 102 to display a current value of the range of motion angle, a percent improvement since the previously recorded range of motion angle, a total percent improvement since the first recorded range of motion angle, etc.
For example, reporting manager 626 can identify a first recorded/obtained range of motion angle θROM,1, and compare θROM,1 to a current range of motion angle θROM,current. Reporting manager 626 can determine a difference, a percent change, an increase, etc., between θROM,1 and θROM,current and display the difference, the percent change, the increase, etc., to the patient via touchscreen 102. In some embodiments, reporting manager 626 determines a difference, a percent change, an increase, etc., between the current range of motion angle θROM,current and a previously obtained range of motion angle, and displays the difference, the percent change, the increase, etc., to the patient via touchscreen 102.
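By way of example, the percent-change calculation can be sketched in Python, where the reference angle may be either the first recorded range of motion angle θROM,1 or the previously recorded angle:

def percent_improvement(theta_ref: float, theta_current: float) -> float:
    """Percent change of the current ROM angle relative to a reference
    angle (the first recorded or the previously recorded value)."""
    return 100.0 * (theta_current - theta_ref) / theta_ref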
In some embodiments, reporting manager 626 generates and provides the reports to a remote device, shown as clinician device 612. Clinician device 612 can be a remote device that is communicably connected with smartphone 100 via communications interface 608. Clinician device 612 and smartphone 100 can be configured to communicate via the Internet, a network, a cellular network, etc. Clinician device 612 and smartphone 100 can be wirelessly communicably coupled. Clinician device 612 can launch a messaging application or a chat application, send an email, send an SMS, etc., to communicate with smartphone 100. A clinician can provide real-time feedback and communication to the patient via clinician device 612 and smartphone 100. In some embodiments, clinician device 612 can initiate or launch the messaging or chat application in response to receiving a progress report from reporting manager 626. Advantageously, this allows the clinician to remotely monitor range of motion and healing progress without requiring the patient to visit the clinic. Clinician device 612 can access the patient's calendar and schedule a clinic appointment.
A clinician can send a request from clinician device 612 to smartphone 100 to obtain the report from smartphone 100. Reporting manager 626 can generate and provide the reports to clinician device 612 every time a new range of motion angle value is recorded. In some embodiments, reporting manager 626 provides the reports to clinician device 612 in response to receiving the request from clinician device 612. The reports provided to clinician device 612 by reporting manager 626 can include image data associated with any of the range of motion angles. In this way, a clinician can remotely monitor healing progress, progress in the range of motion of joint 12, etc. The clinician can receive the reports periodically (e.g., automatically every day, every week, in response to a new range of motion angle measurement, etc.) or can receive the reports on a request basis. This facilitates allowing a clinician to remotely monitor and check up on healing progress of joint 12.
Referring still to
Remote device 610 can perform any of the functionality of smartphone 100 to measure or obtain the range of motion angle values. In some embodiments, remote device 610 is configured to perform any of the functionality of calibration manager 618, locator position manager 620, ROM manager 622, timer 628, display manager 630, ROM database 624, reporting manager 626, etc. Remote device 610 can receive the image data from imaging device 614 of smartphone 100, perform the processes described herein remotely, and provide smartphone 100 with the obtained angular values or positions of locators 20.
Referring still to
Calibration manager 618 can use any image analysis techniques described herein to determine the orientation of smartphone 100 relative to the patient's limb, or can use the orientation of smartphone 100 recorded by orientation sensor 616, or some combination of both. In some embodiments, calibration manager 618 communicates with locator position manager 620. Calibration manager 618 can receive the locator positions from locator position manager 620 and identify shape, skew, size, etc., of locators 20 on the image to determine orientation of smartphone 100 relative to the patient's limb.
Referring now to
Referring particularly to
Calibration manager 618 can analyze the image to determine a shape of locators 20. Calibration manager 618 can compare the shape of locators 20 as shown in the image to the known shape of locators 20 when smartphone 100 is perpendicular to the patient's limb. In some embodiments, calibration manager 618 is configured to determine focal points of locators 20. Calibration manager 618 can determine linear eccentricity of the shape of locators 20 as captured in the image. Depending on the ellipticity of locators 20 in the captured image, calibration manager 618 can determine an orientation of smartphone 100 relative to the patient's limb about either vertical axis 702 or about horizontal axis 704, or about both axes 702 and 704.
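By way of example, the tilt estimate can be sketched in Python under a weak-perspective assumption (an assumption made for illustration, not stated in the disclosure), where a circular locator viewed off-axis projects to an ellipse whose minor-to-major axis ratio approximates the cosine of the tilt:

import math

def tilt_from_ellipse(minor_axis_px: float, major_axis_px: float) -> float:
    """Estimate the off-perpendicular tilt (degrees) of the imaging device
    from the apparent flattening of a circular locator.

    Assumes weak perspective, so the axis ratio approximates cos(tilt)."""
    ratio = max(0.0, min(1.0, minor_axis_px / major_axis_px))
    return math.degrees(math.acos(ratio))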
Referring now to
In some embodiments, calibration manager 618 determines the orientation of smartphone 100 relative to the patient's limb, and/or the distance between smartphone 100 and the patient's limb based on initial images captured by smartphone 100. The initial images captured by smartphone 100 may be captured by a clinician. For example, a clinician can align smartphone 100 such that it is substantially perpendicular to the patient's limb and capture fully flexed and fully extended images. Smartphone 100 can then use any of the processes, techniques, functionality, etc., described in greater detail above to determine the range of motion angle θROM for the initial images.
Calibration manager 618 can store the initial images and compare subsequently captured images to the initial images to determine orientation of smartphone 100 relative to the patient's limb, and/or distance between smartphone 100 and the patient's limb. In some embodiments, the initial images are captured by the clinician in a controlled environment. For example, the clinician can hold smartphone 100 a predetermined distance from the patient's limb and at an orientation such that smartphone 100 is substantially perpendicular to the patient's limb. For example, the predetermined distance may be 2 feet, 3 feet, 2.5 feet, etc. Calibration manager 618 may use the positions, shapes, distances, etc., of locators 20 in the initial images as baseline values. Calibration manager 618 can determine similar values for subsequently captured images and compare the values of the subsequently captured images to the baseline values to determine distance between smartphone 100 and the patient's limb, in addition to the orientation of smartphone 100 relative to the patient's limb.
Referring to
Calibration manager 618 can also determine a distance 1104 between locators 20a and 20b, and a distance 1102 between locators 20b and 20c (assuming three locators 20 are used). In some embodiments, if four locators 20 are used, calibration manager 618 determines a distance between locators 20a and 20b, and a distance between locators 20c and 20d. Calibration manager 618 can use imaging techniques similar to those of locator position manager 620 to determine distances between locators 20. Calibration manager 618 can use the positions of locators 20 as determined by locator position manager 620 to determine distances between locators 20.
Calibration manager 618 can also identify a shape of locators 20 based on the initial image(s). For example, calibration manager 618 can determine that the shape of locators 20 is a circle, a square, a star, etc.
Referring now to
Calibration manager 618 can determine dimension 1206 (e.g., diameter, size, etc.) of locators 20 and compare dimension 1206 to dimension 1106. In some embodiments, calibration manager 618 determines a distance between smartphone 100 and the patient's limb for the image represented by diagram 1200 based on dimension 1206 of locators 20. For example, if locators 20 are circles, dimension 1206 can be a diameter, d. Calibration manager 618 can use a predetermined or predefined relationship and the value of d to determine the distance between smartphone 100 and the patient's limb. The diameter d of locators 20 may decrease with increased distance between smartphone 100 and the patient's limb, while the diameter d of locators 20 may increase with decreased distance between smartphone 100 and the patient's limb. In this way, the diameter d of locators 20 can be used by calibration manager 618 with a relationship to determine the distance between smartphone 100 and the patient's limb.
Calibration manager 618 can similarly compare distance 1202 (between locator 20b and locator 20c) to distance 1102 to determine the distance between smartphone 100 and the patient's limb. Distance 1202 and/or distance 1204 can have a relationship to the distance between smartphone 100 and the patient's limb similar to the relationship between the diameter d of locators 20 and the distance between smartphone 100 and the patient's limb (e.g., increased distance 1202 or increased distance 1204 corresponds to decreased distance between smartphone 100 and the patient's limb, and vice versa). In this way, calibration manager 618 can use distance 1202 and/or distance 1204 to determine the distance between smartphone 100 and the patient's limb.
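By way of example, the distance estimate can be sketched in Python under a pinhole-camera assumption (made for illustration), in which a locator's apparent diameter scales inversely with distance; the baseline pair comes from the clinician-captured initial image at a known distance:

def estimate_distance(d_baseline_px: float, r_baseline: float,
                      d_current_px: float) -> float:
    """Camera-to-limb distance r from a locator's apparent diameter d.

    Pinhole model: apparent size is inversely proportional to distance,
    so r = r_baseline * (d_baseline / d_current)."""
    return r_baseline * (d_baseline_px / d_current_px)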
Calibration manager 618 can also identify changes or deviations in the shape of locators 20 as compared to the shape of locators 20 in the initial image. For example, locators 20 as shown in
Calibration manager 618 can use the orientation of smartphone 100 relative to the patient's limb to determine angular offset amounts or adjustments for θextend, θflexed, and θROM to account for the orientation of smartphone 100 relative to the patient's limb. In some embodiments, calibration manager 618 calculates the distance between smartphone 100 and the patient's limb (e.g., r) and/or the orientation of smartphone 100 relative to the patient's limb (e.g., the azimuth angle ϕaz and the elevation angle ϕel) in real-time and notifies the patient when smartphone 100 is properly aligned with the patient's limb. Calibration manager 618 can operate imaging device 614 to capture image data (e.g., take a picture) when smartphone 100 is properly oriented relative to the patient's limb (e.g., when ϕaz and ϕel are substantially equal to zero, or desired values).
Calibration manager 618 can also record the orientation of smartphone 100 when the initial image is captured. In some embodiments, calibration manager 618 receives the orientation of smartphone 100 from orientation sensor 616. Calibration manager 618 can compare the orientation of smartphone 100 for later captured images to the orientation of smartphone 100 for the initial captured image to determine offsets or adjustments for θextend, θflexed, and θROM to account for the orientation of smartphone 100 relative to the patient's limb.
Calibration manager 618 can use the distance between smartphone 100 and the patient's limb (e.g., r), and/or the orientation of smartphone 100 relative to the patient's limb (e.g., ϕaz and ϕel) to determine offset or adjustment amounts θextend,adj, θflexed,adj, and θROM,adj. For example, calibration manager 618 can use a predetermined function, relationship, equation, etc., to determine θextend,adj, θflexed,adj, and θROM,adj based on the distance between smartphone 100 and the patient's limb (e.g., r) and/or the orientation of smartphone 100 relative to the patient's limb (e.g., ϕaz and ϕel). In some embodiments, calibration manager 618 provides the offset or adjustment amounts θextend,adj, θflexed,adj, and θROM,adj to ROM manager 622. ROM manager 622 can use the offset/adjustment amounts θextend,adj, θflexed,adj, and θROM,adj to adjust (e.g., increase, decrease, etc.) the values of θextend, θflexed, and θROM. For example, ROM manager 622 may add θextend,adj to θextend or subtract θextend,adj from θextend to account for the orientation of smartphone 100 relative to the patient's limb.
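By way of example, applying the adjustment amounts can be sketched in Python; additive offsets are assumed for illustration, since the disclosure states only that the angles can be increased or decreased:

def apply_offsets(theta_extend, theta_flexed, adj_extend, adj_flexed):
    """Apply calibration offsets and recompute θ_ROM.

    Additive offsets are an assumption of this sketch; negative values
    decrease the corresponding angle."""
    theta_extend_c = theta_extend + adj_extend
    theta_flexed_c = theta_flexed + adj_flexed
    return theta_extend_c - theta_flexed_c  # θ_ROM = θ_extend − θ_flexed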
Referring now to
Referring again to
Advantageously, the application can be installed on smartphone 100 by a clinician and/or by a patient. In some embodiments, the application is installed and set up by a clinician. The clinician can set various initial parameters (e.g., frequency of range of motion measurements, when reports should be provided to the patient, when reports should be provided to clinician device 612, what information is displayed to the patient
Referring now to
Referring now to
Reporting manager 626 and/or display manager 630 can also operate user device 1802 to display a currently calculated or a previously calculated (e.g., a most recent) range of motion angle notification 1806. Reporting manager 626 and/or display manager 630 can also operate user device 1802 to display a notification 1808 including a percent improvement since a previously recorded range of motion angle, a total percent improvement since an initially recorded range of motion angle, a total improvement (e.g., in degrees) since the previously recorded range of motion data, a total improvement (e.g., in degrees) since the initially recorded range of motion data, etc. Display manager 630 and/or reporting manager 626 can also display current or most recently calculated flexed angle values θflexed, current or most recently calculated extension angle θextend, percent improvements (e.g., since previously recorded values or since initially recorded values) of θflexed and/or θextend, total improvements (e.g., an angular improvement since previously recorded values or since initially recorded values) of θflexed and/or θextend, etc. Display manager 630 and/or reporting manager 626 can also operate user device 1802 to display historical data (e.g., in tabular form) of any of the information stored in ROM database 624 (e.g., θROM, θflexed, θextend, dates/times of recorded measurements, etc.).
Referring now to
Table 1900 can be stored in ROM database 624 and retrieved by reporting manager 626. In some embodiments, table 1900 is displayed on or transmitted to clinician device 612. Table 1900 can be displayed to the patient via touchscreen 102. The values of columns 1902, 1904, and 1906 can be determined by ROM manager 622 based on positions of locators 20 and/or based on image data. The values of column 1908 can be recorded/captured by timer 628. The values of columns 1910 and 1912 can be determined by reporting manager 626.
Referring now to
Process 1400 includes recording image data in both a fully flexed and fully extended position (step 1402), according to some embodiments. Step 1402 includes providing a notification to the patient to extend the joint into the fully extended position and capture an image, and to flex the joint into the fully flexed position and capture an image. Step 1402 can be performed by an imaging device. For example, step 1402 can be performed by imaging device 614 of smartphone 100.
Process 1400 includes determining positions of locators that are positioned about the joint (step 1404), according to some embodiments. The locators can be positioned on both the upper and lower limbs of the jointed limb. Three locators can be positioned on the jointed limb, with the first locator being positioned on the upper limb, the second locator being positioned on the joint, and the third locator being positioned on the lower limb. In some embodiments, four locators are positioned on the limb, with a first set of two locators being positioned on the upper limb, and a second set of two locators being positioned on the lower limb. Step 1404 can include analyzing any of the recorded image data of the fully flexed and the fully extended joint. Step 1404 can include using an image processing technique (e.g., a neural network technique, an object detection technique, an edge detection technique, etc.) to determine the positions of the locators. The positions of the locators can be determined as Cartesian coordinates relative to an origin (e.g., an upper left corner of the image, a lower right corner of the image, a center of the image, a lower left corner of the image, etc.). Step 1404 can be performed by locator position manager 620 to determine the positions of locators 20 based on the recorded image data.
Process 1400 includes generating centerlines that extend through the determined positions of the locators (step 1406), according to some embodiments. In some embodiments, the centerlines are linear lines. The centerlines may be centerlines 24 and 26. The centerlines can be generated based on the determined positions of locators 20. The centerlines may extend through a center of locators 20. Step 1406 can be performed by ROM manager 622.
Process 1400 includes calculating an angle between the centerlines for the fully flexed image data (step 1408) and the fully extended image data (step 1410), according to some embodiments. Steps 1408 and 1410 can be performed by ROM manager 622. Step 1408 can include determining θflexed and step 1410 can include determining θextend. The angles can be determined using trigonometric identities, equations of the centerlines, the determined positions of the locators, etc.
Process 1400 includes determining a range of motion (i.e., θROM) based on the calculated/determined angles (step 1412). In some embodiments, the range of motion is an angular value. The range of motion may be a difference between the calculated angles. For example, the range of motion can be θROM=θextend−θflexed. Step 1412 can be performed by ROM manager 622.
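As a minimal sketch of steps 1406 through 1412 for the three-locator arrangement described above (the pixel coordinates are hypothetical):

```python
import math

def joint_angle(upper, joint, lower):
    """Included angle (degrees) at the joint locator between the centerline
    through the upper-limb locator and the centerline through the lower-limb
    locator."""
    a = math.atan2(upper[1] - joint[1], upper[0] - joint[0])
    b = math.atan2(lower[1] - joint[1], lower[0] - joint[0])
    angle = abs(math.degrees(a - b)) % 360.0
    return 360.0 - angle if angle > 180.0 else angle

# Hypothetical locator positions (pixels) identified in the two images:
theta_extend = joint_angle((120, 210), (310, 200), (500, 205))  # fully extended
theta_flexed = joint_angle((150, 120), (310, 200), (210, 350))  # fully flexed
theta_rom = theta_extend - theta_flexed
print(f"Range of motion: {theta_rom:.1f} degrees")
```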
Referring now to FIG. 15, a process 1500 for configuring a patient's mobile device to measure range of motion is shown, according to some embodiments.
Process 1500 includes establishing communication between a patient's mobile device (e.g., smartphone 100) and a second device (e.g., remote device 610) (step 1502), according to some embodiments. The communication between the patient's mobile device and the second device may be a wireless connection or a wired connection. For example, smartphone 100 can wirelessly communicably connect with the second device, which can be remotely positioned. The patient's mobile device and the second device may be connected, wirelessly or by wire, in a clinic by a clinician. Step 1502 can be performed by smartphone 100, communications interface 608, a clinician, the patient, etc.
Process 1500 includes downloading or transferring an installation package onto the patient's mobile device (step 1504), according to some embodiments. The installation package can be transferred to the patient's mobile device from the second device. The installation package can be any of an .apk file, a .pkg file, etc., or any other package file or installation package file. Step 1504 can be performed by smartphone 100.
Process 1500 includes using the installation package to configure the patient's mobile device to calculate the range of motion angle θROM and to perform any of the other processes, functionality, etc., described herein (step 1506), according to some embodiments. Step 1506 can be performed by smartphone 100 using the installation package received from the second device (e.g., received from clinician device 612 and/or remote device 610).
Referring now to FIG. 16, a process 1600 for obtaining range of motion measurements and calibrating for an orientation of the imaging device is shown, according to some embodiments.
Process 1600 includes obtaining initial image data from an imaging device (step 1602), according to some embodiments. Step 1602 can be the same as or similar to step 1402 of process 1400. The initial image data can be captured by a clinician or a patient. For example, a clinician can use the patient's smartphone or mobile device to capture the initial image data. The clinician may align the patient's smartphone such that the smartphone is substantially perpendicular to the patient's joint or perpendicular to locators 20. The clinician can also capture the initial image data at a predetermined distance from the patient's joint. The initial image data can be recorded at ϕaz≈0 and ϕel≈0 such that locators 20 are substantially perpendicular to a line of sight of imaging device 614 of the patient's smartphone 100.
Process 1600 includes determining one or more initial parameters based on the initial image data (step 1604), according to some embodiments. Step 1604 can include analyzing the initial image/image data to determine relative distances between locators 20 and to identify an initial shape, size, skew, etc., of locators 20. Step 1604 can include receiving or capturing an initial orientation of smartphone 100 from orientation sensor 616. Step 1604 may be performed by calibration manager 618.
Process 1600 includes performing process 1400 (step 1606), according to some embodiments. Process 1400 can be performed at regularly spaced intervals according to a schedule. Process 1400 can be performed to obtain image data at various points in time along the healing process.
Process 1600 includes determining one or more values of the parameters based on the image data obtained in step 1606 (step 1608), according to some embodiments. Step 1608 can be performed by calibration manager 618. Calibration manager 618 can determine any of the parameters of step 1604 for the newly obtained images. For example, calibration manager 618 can analyze the newly obtained images/image data to determine the relative distances between locators 20 and the shape, size, skew, etc., of locators 20.
Process 1600 includes determining an orientation of the imaging device relative to a reference point (the patient's limb, locators 20, etc.) by comparing the values of the parameters of the newly obtained image to the initial parameters of the initial image (step 1610), according to some embodiments.
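Step 1610 can be implemented in many ways; as one hedged illustration, if the locators are known to be circular, the tilt of the imaging device away from perpendicular could be estimated from the foreshortening of a locator in the newly obtained image (the axis lengths would come from an ellipse fit, which is assumed here rather than shown):

```python
import math

def tilt_from_foreshortening(minor_axis_px, major_axis_px):
    """Estimate the imaging device's tilt (degrees) away from perpendicular.
    A circular locator viewed off-axis appears elliptical; the minor/major
    axis ratio approximates the cosine of the tilt angle."""
    ratio = max(0.0, min(1.0, minor_axis_px / major_axis_px))
    return math.degrees(math.acos(ratio))

# A locator measuring 94 px by 100 px suggests roughly a 20 degree tilt:
print(f"{tilt_from_foreshortening(94.0, 100.0):.1f} degrees")
```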
Referring now to FIG. 20, a process 2000 for measuring and reporting range of motion of a patient's jointed limb over the course of wound therapy is shown, according to some embodiments.
Process 2000 includes providing locators on a dressing or skin of a patient's joint (step 2002), according to some embodiments. Step 2002 can include adhering locators 20 to the patient's skin 32. Locators 20 can be adhered directly to the patient's skin or can be adhered to drape 18. Locators 20 can be printed on drape 18 by a drape manufacturer. Step 2002 can be performed by a clinician. For example, the clinician can adhere three or four (or more) locators 20 to the patient's jointed limb for tracking. Step 2002 can be performed periodically when dressing 36 is changed.
Process 2000 includes performing process 1500 to configure the patient's smartphone 100 to record and measure range of motion of the patient's joint (step 2004), according to some embodiments. Step 2004 can be initiated by a clinician in a clinical setting. Step 2004 can include setting various parameters such as measurement interval, measurement schedule, reminder schedules, etc., to ensure that the patient records range of motion when necessary.
Process 2000 includes performing process 1600 to obtain range of motion values of the patient's jointed limb (step 2006), according to some embodiments. Step 2006 can be performed multiple times over the course of NPWT to obtain range of motion values as the patient's wound heals. Step 2006 can be initiated by a patient. Step 2006 can be initiated a first time by a clinician in a controlled environment to obtain baseline image data.
Process 2000 includes generating a range of motion progress report (step 2008), according to some embodiments. The range of motion progress report can include graphs, tabular information, historical information, analysis, etc., of the range of motion of the patient's jointed limb. Step 2008 can be performed by reporting manager 626. Step 2008 can include retrieving historical range of motion data from ROM database 624. The range of motion progress report can also include a currently calculated range of motion angle, percent improvements in the range of motion of the patient's joint, image data, etc.
Process 2000 includes operating a display of a user device to show the range of motion progress report (step 2010), according to some embodiments. Step 2010 can be performed by display manager 630. Display manager 630 can operate touchscreen 102 to display the generated range of motion progress report. Display manager 630 can operate touchscreen 102 of smartphone 100 to display any of the tabular information, the range of motion graphs, etc.
Process 2000 includes providing the range of motion progress report to a clinician device (step 2012), according to some embodiments. The range of motion progress report can be provided to clinician device 612. The range of motion progress report can be provided to clinician device 612 in response to smartphone 100 receiving a request from clinician device 612. The range of motion progress report can be provided to clinician device 612 in response to obtaining a new range of motion measurement of the patient's joint. The range of motion progress report can be provided to clinician device 612 periodically so that the clinician can monitor healing progress of the patient's wound. The range of motion progress report can include historical range of motion data, graphs, images used to calculate the range of motion, etc. Providing the range of motion progress report to clinician device 612 allows a clinician to remotely monitor healing progress. The clinician can identify unexpected changes or problems with the healing progress. Process 2000 can include an additional step of receiving a notification from clinician device 612. For example, if the clinician determines, based on the received range of motion progress report, that the patient should come into the clinic, the clinician can send a notification to the patient's smartphone 100 indicating as much. The clinician can also launch a chat application and send a message to the patient's smartphone.
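As one minimal sketch of transmitting the report to clinician device 612 (the HTTP transport, endpoint URL, and payload fields are assumptions; the disclosure does not prescribe a transport):

```python
import requests  # third-party HTTP library, shown for illustration

def send_progress_report(report, clinician_url):
    """POST the range of motion progress report to a clinician endpoint;
    raises on transmission failure so the caller can retry or alert."""
    response = requests.post(clinician_url, json=report, timeout=10)
    response.raise_for_status()

# Hypothetical usage:
# send_progress_report({"patient_id": "123", "rom_deg": 120.0},
#                      "https://clinic.example.com/rom-reports")
```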
Referring now to FIG. 21, a process 2100 for reminding a patient to record range of motion data according to a measurement schedule is shown, according to some embodiments.
Process 2100 includes determining an amount of time since a previously recorded range of motion (step 2102), according to some embodiments. Step 2102 can include determining an amount of elapsed time between a present time and a time at which the previous range of motion was recorded. The time interval can be in hours, days, minutes, etc. Step 2102 can be performed by timer 628 by comparing a current time value to a time at which the previously recorded range of motion was measured. The time at which the previously recorded range of motion was measured may be retrieved from ROM database 624.
Process 2100 includes retrieving a range of motion measurement schedule (step 2104), according to some embodiments. The range of motion measurement schedule can be retrieved from ROM database 624. The range of motion measurement schedule can be stored in timer 628. The range of motion measurement schedule can be predetermined or set at a beginning of NPWT by a clinician. The measurements of the range of motion can be scheduled at regular time intervals (e.g., every day, every week, etc.). Step 2104 may be performed by timer 628.
Process 2100 includes determining a next range of motion measurement time based on the amount of time since the previously recorded range of motion and the range of motion measurement schedule (step 2106), according to some embodiments. The next range of motion measurement time can be retrieved from the range of motion measurement schedule. Step 2106 can include determining an amount of time from a present/current time to the next range of motion measurement time. In some embodiments, step 2106 is performed by timer 628 and/or display manager 630.
Process 2100 includes providing a reminder to the patient to record the range of motion data at a predetermined amount of time before the next range of motion measurement time, or at the next range of motion measurement time (step 2108), according to some embodiments. A notification/reminder can be provided to the patient a predetermined amount of time before the next range of motion measurement time. For example, display manager 630 can operate touchscreen 102 to provide the patient with a reminder or notification that the next range of motion measurement should be recorded/captured within the next 12 hours, the next 24 hours, the next 5 hours, etc. Step 2108 can be performed when the current time is substantially equal to the next range of motion measurement time. Step 2108 can be performed by display manager 630 and/or timer 628. Process 2100 can be performed to ensure that the patient does not forget to capture/record range of motion angular values.
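A minimal sketch of the timing logic of process 2100, assuming a fixed measurement interval (all names are hypothetical):

```python
from datetime import datetime, timedelta

def reminder_due(last_measurement, interval, lead_time, now=None):
    """Return True when the next scheduled measurement time, minus the
    reminder lead time, has arrived (steps 2102 through 2108)."""
    now = now or datetime.now()
    next_measurement = last_measurement + interval  # step 2106
    return now >= next_measurement - lead_time

# Weekly schedule with a 12 hour reminder lead time:
last = datetime(2020, 8, 14, 9, 0)
if reminder_due(last, timedelta(days=7), timedelta(hours=12)):
    print("Reminder: record your range of motion within the next 12 hours.")
```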
Referring now to FIG. 32, a process 3200 for providing negative pressure wound therapy to a wound at a jointed limb and measuring range of motion of the jointed limb is shown, according to some embodiments.
Process 3200 includes providing a dressing including a comfort layer, a manifold layer, and a drape (step 3202), according to some embodiments. The comfort layer can be a PREVENA™ layer. The comfort layer can be the wound-interface layer 128 (as described in greater detail below). The dressing can be dressing 36 and may be provided over or applied to a wound at a jointed limb.
Process 3200 includes providing one or more locators on the dressing (step 3204), according to some embodiments. The locators can be provided onto the drape layer (e.g., the drape 18, the drape layer 120). The locators can be provided onto an exterior surface of the drape layer or onto an interior surface of the drape layer if the drape layer is transparent or translucent. The locators can be printed, adhered, etc., or otherwise coupled with the drape layer such that the locators can be viewed on the dressing.
Process 3200 includes applying negative pressure to the wound at the dressing (step 3206), according to some embodiments. The negative pressure can be applied to the wound at the dressing by the therapy device 300. The therapy device 300 can fluidly couple with the dressing through a conduit that fluidly couples with an inner volume of the dressing.
Process 3200 includes relieving the applied negative pressure after a time duration (step 3208), according to some embodiments. The applied negative pressure can be relieved after a time duration (e.g., after the negative pressure is applied for some amount of time). Step 3208 may be performed by the therapy device 300 and/or the controller 318 of the therapy device 300.
Process 3200 includes performing process 2000 to calculate range of motion of the jointed limb (step 3210), according to some embodiments. Step 3210 can include performing any of the processes 1400, 1500, 1600, 2000, and/or 2100. Step 3210 is performed to calculate the range of motion using the locators on the dressing. The negative pressure may be relieved prior to performing step 3210.
Process 3200 includes re-applying the negative pressure to the wound at the dressing (step 3212), according to some embodiments. The negative pressure can be re-applied after step 3210 is performed. For example, the negative pressure can be re-applied to the wound at the dressing in response to calculating the range of motion of the jointed limb. Step 3212 can be performed by the controller 318.
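A minimal sketch of the overall sequence of process 3200, with hypothetical stand-in interfaces for therapy device 300 and the measurement application:

```python
import time

def npwt_with_rom_measurement(therapy, rom_app, therapy_seconds):
    """Apply negative pressure for a duration (step 3206), relieve it
    (step 3208), measure range of motion while the dressing is unpressurized
    (step 3210), then re-apply pressure (step 3212)."""
    therapy.apply_negative_pressure()
    time.sleep(therapy_seconds)
    therapy.relieve_pressure()
    rom_deg = rom_app.measure_range_of_motion()  # performs process 2000
    therapy.apply_negative_pressure()
    return rom_deg
```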
In various embodiments, dressing 36 can be formed as a substantially flat sheet for topical application to wounds. Dressing 36 can lie flat for treatment of substantially flat wounds and is also configured to bend to conform to body surfaces having high curvature, such as breasts, or body surfaces at joints (e.g., at elbows and knees).
Dressing 36 is shown to include a plurality of layers, including a drape layer 120 (e.g., drape 18), a manifold layer 124, a wound-interface layer 128, a rigid support layer 142, a first adhesive layer 146, a second adhesive layer 150, and a patient-contacting layer 154. In some embodiments, dressing 36 includes a removable cover sheet 132 to cover the manifold layer 124, the wound-interface layer 128, the second adhesive layer 150, and/or the patient-contacting layer 154 before use.
The drape layer 120 is shown to include a first surface 136 and a second, wound-facing, surface 140 opposite the first surface 136. When dressing 36 is applied to a wound, the first surface 136 faces away from the wound, whereas the second surface 140 faces toward the wound. The drape layer 120 supports the manifold layer 124 and the wound-interface layer 128 and provides a barrier to passage of microorganisms through dressing 36. The drape layer 120 is configured to provide a sealed space over a wound or incision. In some embodiments, the drape layer 120 is an elastomeric material or may be any material that provides a fluid seal. “Fluid seal” means a seal adequate to hold pressure at a desired site given the particular reduced-pressure subsystem involved. The term “elastomeric” means having the properties of an elastomer and generally refers to a polymeric material that has rubber-like properties. Examples of elastomers may include, but are not limited to, natural rubbers, polyisoprene, styrene butadiene rubber, chloroprene rubber, polybutadiene, nitrile rubber, butyl rubber, ethylene propylene rubber, ethylene propylene diene monomer, chlorosulfonated polyethylene, polysulfide rubber, polyurethane, EVA film, co-polyester, and silicones. As non-limiting examples, the drape layer 120 may be formed from materials that include a silicone, 3M Tegaderm® drape material, acrylic drape material such as one available from Avery, or an incise drape material.
The drape layer 120 may be substantially impermeable to liquid and substantially permeable to water vapor. In other words, the drape layer 120 may be permeable to water vapor, but not permeable to liquid water or wound exudate. This increases the total fluid handling capacity (TFHC) of wound dressing 36 while promoting a moist wound environment. In some embodiments, the drape layer 120 is also impermeable to bacteria and other microorganisms. In some embodiments, the drape layer 120 is configured to wick moisture from the manifold layer 124 and distribute the moisture across the first surface 136.
In the illustrated embodiment, the drape layer 120 defines a cavity 122.
In some embodiments, a reduced-pressure interface 158 can be integrated with the drape layer 120. The reduced-pressure interface 158 can be in fluid communication with the negative pressure system through a removed fluid conduit 268.
In some embodiments, the second surface 140 of the drape layer 120 contacts the manifold layer 124. The second surface 140 of the drape layer 120 may be adhered to the manifold layer 124 or may simply contact the manifold layer 124 without the use of an adhesive.
In some embodiments, the adhesive applied to the second surface 140 of the drape layer 120 is moisture vapor transmitting and/or patterned to allow passage of water vapor therethrough. The adhesive may include a continuous moisture vapor transmitting, pressure-sensitive adhesive layer of the type conventionally used for island-type wound dressings (e.g. a polyurethane-based pressure sensitive adhesive).
The manifold layer 124 can be made from a porous and permeable foam-like material and, more particularly, a reticulated, open-cell polyurethane or polyether foam that allows good permeability of wound fluids while under a reduced pressure. One such foam material that has been used is the V.A.C.® Granufoam™ material that is available from Kinetic Concepts, Inc. (KCI) of San Antonio, Tex. Any material or combination of materials might be used for the manifold layer 124 provided that the manifold layer 124 is operable to distribute the reduced pressure and provide a distributed compressive force along the wound site.
Reticulated pores of the Granufoam™ material in the range of about 400 to 600 microns are preferred, but other materials may be used. The density of the manifold layer material, e.g., Granufoam™ material, is typically in the range of about 1.3 lb/ft3-1.6 lb/ft3 (20.8 kg/m3-25.6 kg/m3). A material with a higher density (smaller pore size) than Granufoam™ material may be desirable in some situations. For example, the Granufoam™ material or similar material with a density greater than 1.6 lb/ft3 (25.6 kg/m3) may be used. As another example, the Granufoam™ material or similar material with a density greater than 2.0 lb/ft3 (32 kg/m3) or 5.0 lb/ft3 (80.1 kg/m3) or even more may be used. The denser the material, the higher the compressive force that may be generated for a given reduced pressure. If a foam with a density less than the tissue at the tissue site is used as the manifold layer material, a lifting force may be developed. In one illustrative embodiment, a portion, e.g., the edges, of dressing 36 may exert a compressive force while another portion, e.g., a central portion, may provide a lifting force.
The manifold layer material may be a reticulated foam that is later felted to a thickness of about one third (⅓) of the foam's original thickness. Among the many possible manifold layer materials, the following may be used: Granufoam™ material or a Foamex® technical foam (www.foamex.com). In some instances it may be desirable to add ionic silver to the foam in a microbonding process or to add other substances to the manifold layer material such as antimicrobial agents. The manifold layer material may be isotropic or anisotropic depending on the exact orientation of the compressive forces that are desired during the application of reduced pressure. The manifold layer material may also be a bio-absorbable material.
Referring now to FIG. 3, a therapy device 300 is shown, according to some embodiments.
Therapy device 300 can be configured to provide negative pressure wound therapy by reducing the pressure at wound 314. Therapy device 300 can draw a vacuum at wound 314 (relative to atmospheric pressure) by removing wound exudate, air, and other fluids from wound 314. Wound exudate may include fluid that filters from a patient's circulatory system into lesions or areas of inflammation. For example, wound exudate may include water and dissolved solutes such as blood, plasma proteins, white blood cells, platelets, and red blood cells. Other fluids removed from wound 314 may include instillation fluid 305 previously delivered to wound 314. Instillation fluid 305 can include, for example, a cleansing fluid, a prescribed fluid, a medicated fluid, an antibiotic fluid, or any other type of fluid which can be delivered to wound 314 during wound treatment. Instillation fluid 305 may be held in an instillation fluid canister 304 and controllably dispensed to wound 314 via instillation fluid tubing 308. In some embodiments, instillation fluid canister 304 is detachable from therapy device 300 to allow canister 304 to be refilled and replaced as needed.
The fluids 307 removed from wound 314 pass through removed fluid tubing 310 and are collected in removed fluid canister 306. Removed fluid canister 306 may be a component of therapy device 300 configured to collect wound exudate and other fluids 307 removed from wound 314. In some embodiments, removed fluid canister 306 is detachable from therapy device 300 to allow canister 306 to be emptied and replaced as needed. A lower portion of canister 306 may be filled with wound exudate and other fluids 307 removed from wound 314, whereas an upper portion of canister 306 may be filled with air. Therapy device 300 can be configured to draw a vacuum within canister 306 by pumping air out of canister 306. The reduced pressure within canister 306 can be translated to dressing 36 and wound 314 via tubing 310 such that dressing 36 and wound 314 are maintained at the same pressure as canister 306.
Referring particularly to FIG. 3, therapy device 300 is shown to include a pneumatic pump 320, according to some embodiments. Pneumatic pump 320 can be fluidly coupled to removed fluid canister 306 (e.g., via conduit 336) and can be operated to draw air out of canister 306, thereby establishing negative pressure within canister 306.
Similarly, instillation pump 322 can be fluidly coupled to instillation fluid canister 304 via tubing 309 and fluidly coupled to dressing 36 via tubing 308. Instillation pump 322 can be operated to deliver instillation fluid 305 to dressing 36 and wound 314 by pumping instillation fluid 305 through tubing 309 and tubing 308.
Filter 328 can be positioned between removed fluid canister 306 and pneumatic pump 320 (e.g., along conduit 336) such that the air pumped out of canister 306 passes through filter 328. Filter 328 can be configured to prevent liquid or solid particles from entering conduit 336 and reaching pneumatic pump 320. Filter 328 may include, for example, a bacterial filter that is hydrophobic and/or lipophilic such that aqueous and/or oily liquids will bead on the surface of filter 328. Pneumatic pump 320 can be configured to provide sufficient airflow through filter 328 that the pressure drop across filter 328 is not substantial (e.g., such that the pressure drop will not substantially interfere with the application of negative pressure to wound 314 from therapy device 300).
In some embodiments, therapy device 300 operates a valve 332 to controllably vent the negative pressure circuit.
In some embodiments, therapy device 300 vents the negative pressure circuit via an orifice 358.
In some embodiments, therapy device 300 includes a variety of sensors. For example, therapy device 300 is shown to include a pressure sensor 330 configured to measure the pressure within canister 306 and/or the pressure at dressing 36 or wound 314. In some embodiments, therapy device 300 includes a pressure sensor 313 configured to measure the pressure within tubing 311. Tubing 311 may be connected to dressing 36 and may be dedicated to measuring the pressure at dressing 36 or wound 314 without having a secondary function such as channeling instillation fluid 305 or wound exudate. In various embodiments, tubing 308, 310, and 311 may be physically separate tubes or separate lumens within a single tube that connects therapy device 300 to dressing 36. Accordingly, tubing 310 may be described as a negative pressure lumen that functions to apply negative pressure to dressing 36 or wound 314, whereas tubing 311 may be described as a sensing lumen configured to sense the pressure at dressing 36 or wound 314. Pressure sensors 330 and 313 can be located within therapy device 300, positioned at any location along tubing 308, 310, and 311, or located at dressing 36 in various embodiments. Pressure measurements recorded by pressure sensors 330 and/or 313 can be communicated to controller 318. Controller 318 can use the pressure measurements as inputs to various pressure testing operations and control operations performed by controller 318.
Controller 318 can be configured to operate pneumatic pump 320, instillation pump 322, valve 332, and/or other controllable components of therapy device 300. For example, controller 318 may instruct valve 332 to close and operate pneumatic pump 320 to establish negative pressure within the negative pressure circuit. Once the negative pressure has been established, controller 318 may deactivate pneumatic pump 320. Controller 318 may cause valve 332 to open for a predetermined amount of time and then close after the predetermined amount of time has elapsed.
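A minimal sketch of this control sequence, with hypothetical software interfaces for pneumatic pump 320, valve 332, and pressure sensor 330 (the target pressure, units, and polling approach are assumptions):

```python
import time

def establish_and_vent(pump, valve, sensor, target_kpa, vent_seconds):
    """Draw down to a target gauge pressure (negative values denote vacuum),
    deactivate the pump, then vent for a predetermined time."""
    valve.close()
    pump.on()
    while sensor.read_kpa() > target_kpa:  # e.g., -16.7 kPa (about -125 mmHg)
        time.sleep(0.1)                    # poll the pressure sensor
    pump.off()
    valve.open()                           # controllably vent the circuit
    time.sleep(vent_seconds)
    valve.close()
```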
In some embodiments, therapy device 300 includes a user interface 326. User interface 326 may include one or more buttons, dials, sliders, keys, or other input devices configured to receive input from a user. User interface 326 may also include one or more display devices (e.g., LEDs, LCD displays, etc.), speakers, tactile feedback devices, or other output devices configured to provide information to a user. In some embodiments, the pressure measurements recorded by pressure sensors 330 and/or 313 are presented to a user via user interface 326. User interface 326 can also display alerts generated by controller 318. For example, controller 318 can generate a “no canister” alert if canister 306 is not detected.
In some embodiments, therapy device 300 includes a data communications interface 324 (e.g., a USB port, a wireless transceiver, etc.) configured to receive and transmit data. Communications interface 324 may include wired or wireless communications interfaces (e.g., jacks, antennas, transmitters, receivers, transceivers, wire terminals, etc.) for conducting data communications with external systems or devices. In various embodiments, the communications may be direct (e.g., local wired or wireless communications) or via a communications network (e.g., a WAN, the Internet, a cellular network, etc.). For example, communications interface 324 can include a USB port or an Ethernet card and port for sending and receiving data via an Ethernet-based communications link or network. In another example, communications interface 324 can include a Wi-Fi transceiver, or a cellular or mobile phone communications transceiver, for communicating via a wireless communications network.
The construction and arrangement of the systems and methods as shown in the various exemplary embodiments are illustrative only. Although only a few embodiments have been described in detail in this disclosure, many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, colors, orientations, etc.). For example, the position of elements can be reversed or otherwise varied and the nature or number of discrete elements or positions can be altered or varied. Accordingly, all such modifications are intended to be included within the scope of the present disclosure. The order or sequence of any process or method steps can be varied or re-sequenced according to alternative embodiments. Other substitutions, modifications, changes, and omissions can be made in the design, operating conditions and arrangement of the exemplary embodiments without departing from the scope of the present disclosure.
The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure can be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Although the figures show a specific order of method steps, the order of the steps may differ from what is depicted. Also, two or more steps can be performed concurrently or with partial concurrence. Such variation will depend on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations could be accomplished with standard programming techniques, with rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps, and decision steps.
This application claims the benefit of priority to U.S. Provisional Application No. 62/890,804, filed on Aug. 23, 2019, which is incorporated herein by reference in its entirety.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/IB2020/057868 | 8/21/2020 | WO |
Number | Date | Country
---|---|---
62890804 | Aug 2019 | US