The present application relates to systems and methods for objectively measuring range of motion, and more particularly to measuring the range of motion of a human body part. The present application may be useful for anyone attempting to measure the range of motion of a body part, including but not limited to physical therapists, doctors, nurses, athletic trainers, and individuals.
Diagnosis and treatment of injuries to body parts often involves measurement of the range of motion of a body part. Oftentimes, the range of motion of an injured body part and a similar body part (e.g., the left and right arms) is compared to determine the severity of the injury and the progress made during treatment for an injury. Additionally, trainers and medical professionals may determine the range of motion of a body part to design exercises to help prevent injury to a limb or other body part. For example only, a limited range of mobility in a shoulder joint may cause injury to the elbow on the same arm.
Current measurement of the range of motion of a body part requires a practitioner to either estimate the range of motion based on experience and practice, or use large protractors to measure certain types of range of motion. Both methods introduce significant error and are subject to inconsistencies between practitioners. Both methods also require the practitioner to record his or her results on paper rather than digitally. Additionally, neither method allows a user to perform range of motion determinations on his or her own or to view a visual representation of the range of motion. Thus, a digital and accurate apparatus and method for measuring range of motion that could be used by either a practitioner or an individual patient is needed.
The present disclosure relates to range of motion measurement devices and techniques, and in particular to measurement of the range of motion of the human body, including but not limited to limbs such as arms and legs, the spine, the pelvis, and any other body part. The device may include a sleeve or sleeves that are placed on the body part and wirelessly communicate with a user device. The user device includes a program or application that may instruct the wearer which motions to perform with his or her body parts, record and store the measurements of those movements, and create a visual depiction of the movement of the body part. The program or application may also allow a user to move on his or her own and then record and store the measurements of those movements and create a visual depiction of the movement. The user device may wirelessly communicate with a display device that displays the visual depiction of the movement of the body part. The data received by the user device may be stored in the user device, in the cloud, and/or in any data storage device or central database.
A system may include a sleeve or sleeves configured to be placed on a human body, the sleeve including a sensor configured to contact the skin, and a user device in wireless communication with the sleeve that is capable of receiving and storing data communicated from the sleeve. The user device may be capable of creating a visual rendering of the movement of a body part using data communicated from the sleeve. The system may also include a display device, wherein the display device is in wireless communication with the user device. The display device may also display the visual rendering. The user device may also be in wireless communication with a storage device, the storage device storing data wirelessly transmitted from the user device. The system may also include a battery compartment disposed on the sleeve and a battery component configured to fit within the battery compartment. The system may also include a battery life indicator disposed on the battery.
A method for determining the range of motion of a human body may include receiving data from a sleeve placed on the human body, said data including angular rotation and direction, calculating range of motion data from the received data, storing the range of motion data, and generating a visual depiction of the movement from the data. The method may also include storing the range of motion data in the cloud or a central database. The method may also include directing the user to move the human body part in a particular way. The method may also include storing data from previous range of motion determinations. The method may also include comparing previous range of motion determinations with a current range of motion determination.
In one embodiment, a motion monitoring system is disclosed that includes a wearable monitoring device configured to be worn on a human body that includes a body part. The wearable monitoring device is deformable with movement of the body part. The wearable monitoring device includes a sensor system configured to deform with movement of the body part. The sensor system is configured to detect deformation occurring with movement of the body part. Furthermore, the motion monitoring system includes a user device that is in communication with the wearable monitoring device. The user device is capable of receiving data from the sensor system that corresponds to the detected deformation occurring with movement of the body part.
In another embodiment, a method for determining the range of motion of a human body part is disclosed. The method includes detecting, by a sensor system of a monitoring device that is worn on a human body that includes a body part, deformation of the sensor system occurring with movement of the body part. The method also includes receiving data by a user device from a wearable monitoring device. The data corresponds to the detected deformation occurring with movement of the body part. Furthermore, the method includes visually rendering, with a display device, movement of the body part according to the received data.
As shown in
As shown in
As shown in
The lattice 104 may be constructed and formed using the primary sensor 110 and at least one secondary sensor 112. A primary sensor 110 forms the layout and design on the lattice 104 and may be configured to integrate with one or more secondary sensors 112.
For example, the one or more secondary sensors 112 may be disposed at any intersection 106 of the primary sensor 110 of the lattice 104. The entire lattice 104 may be electronically connected to a monitoring system 114 which may accept and process both power and data transmissions from the lattice 104. A series of lattices 104 can be connected to cover great distances or span massive surfaces. A series of connected lattices 104 comprises a super lattice 118, which may be included on the sleeve 4 as shown in
By way of example,
The sensor system 102 within the sleeve 4 may be configured to monitor the status of the wearer's body. The sensor system 102 may monitor for a variety of conditions, including strain, temperature, motion, position in space, flow, oxygen saturation, vascular lag, wall thickness, etc. It will be understood that the primary sensors 110 and secondary sensors 112 may monitor for any condition capable of being monitored with sensors.
The primary sensor 110 may most often be the mode of connection to the base object 116 being monitored. The primary sensor 110 may be capable of measuring a specific series of changes in the base object 116 that is being monitored. For example, the primary sensor 110 may be used to monitor changes in electrical resistance to determine Freedom and Constraint Topology (“FACT”) on the surface of a base object 116, allowing the determination of changes in deflection, shearing force, and general motion with respect to the base object 116. As an example, the primary sensor 110 measures changes in voltage across the primary sensor 110 to determine a change in the length of the base object 116 (i.e., an amount of deformation of the base object 116) and/or motion of the base object 116. A 3D model may be built based on these measured changes in resistance, for example, to track the movements of the wearer. In certain embodiments, the primary sensor 110 may also be capable of powering and transmitting data for discrete secondary sensors 112 and may also be scalable. The size and shape of the primary sensor 110 may be utilized to determine the size and shape of the lattice 104. Because the primary sensor 110 may comprise an array of repeating geometric patterns, the lattice 104 and super lattice 118 structures may be constructed very quickly with known electrical and conductive properties.
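The following is a minimal sketch, not part of the disclosure, of how a measured change in electrical resistance might be converted into an estimated change in length of the base object 116. It assumes a simple gauge-factor model (ΔR/R0 = GF × strain); the gauge factor and baseline values shown are illustrative placeholders rather than specified parameters.

```python
# Illustrative sketch: convert a measured resistance change of a resistive
# strain element into an estimated elongation, assuming a gauge-factor model
# (delta_R / R0 = GF * strain). All numeric values are placeholders.

def estimate_length_change(r_measured: float, r_baseline: float,
                           gauge_factor: float, length_baseline: float) -> float:
    """Return the estimated change in length (same units as length_baseline)."""
    delta_r = r_measured - r_baseline
    strain = (delta_r / r_baseline) / gauge_factor  # strain = (dR/R0) / GF
    return strain * length_baseline

# Example: a 350-ohm element reads 352.1 ohms; gauge factor 2.0; 100 mm segment.
delta_length_mm = estimate_length_change(352.1, 350.0, 2.0, 100.0)
print(f"Estimated elongation: {delta_length_mm:.2f} mm")  # about 0.30 mm
```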
The secondary sensors 112 may be integrated into the primary sensor 110 to create a sensor-within-a-sensor platform. The secondary sensors 112 may measure specific metrics, including, for example, motion, temperature, or flow rate. In some embodiments, the secondary sensors 112 may be included at one or more conductive tabs 108 (
The sensor system 102 may be modular in that each component part of the system 102 may be independently removed, changed, replaced, or repaired. For example, a section of the primary sensor 110 may be removed from the main body of the primary sensor 110, secondary sensors 112 may be removed or replaced, sections of lattice 104 may be removed or replaced, and sections within a super lattice 118 may be removed or replaced. The sensor system 102 may be configured to maintain functionality when components are removed regardless of the size of the system. Additionally, removing sensor components within the sensor system 102 does not impede the flow of electricity or data. Also, in certain embodiments, the sensor system 102 may allow for new components to be integrated because no portion of any layer within the system is a closed loop.
In use, the sensor system 102 allows for the monitoring of 3-dimensional objects (e.g., the sleeve 4) that may be complex in shape and/or massive in size or length. Using the sensor system 102, the sleeve 4 may be tessellated, or covered in repeated geometric patterns of identical interlocking shapes, which allows the surface of the sleeve 4 to be approximated in two dimensions. The approximated surface area of the 3-D object determines the geometric shape of the lattice 104 and primary sensor 110 and the size and number of segments in a single lattice 104/primary sensor 110. The frequency of secondary sensor 112 placement may also be determined by the specific monitoring needs related to the base object 116.
In certain embodiments, the sensor system 102 may monitor the metrics collected by both the primary sensors 110 and secondary sensors 112 on the same system clock at the monitoring system 114. All of the data from the primary sensor 110 and secondary sensors 112 may be transmitted to the monitoring system 114 at the same time. This allows for less data processing at the monitoring system 114, which in turn provides faster data access and faster feedback regarding the monitoring of the base object 116. Signals may also be integrated in a straightforward manner. Sensor output may be collated, analyzed as a time series by simple linear models, and used to build predictive models of events relevant to the monitored environment (injury, structural or catastrophic failure, etc.). Because the sensor system 102 is modular, it allows for the introduction of new secondary sensors 112 as needed without redesigning the entire system or adding a new circuit. If the size or portion of the base object 116 to be monitored changes, lattices 104 may be added or removed from the system as necessary.
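As a purely illustrative sketch of the common-clock collection described above, the following assumes hypothetical read functions for a primary sensor 110 and secondary sensors 112 and collates their readings into rows of a single timestamped table; the function names, returned fields, and sample rate are assumptions, not an actual interface of the monitoring system 114.

```python
# Illustrative sketch: sample primary and secondary sensors on a common clock
# and append each frame to one shared table. Read functions are placeholders.
import time

def read_primary():
    return {"strain": 0.0021}                     # placeholder primary reading

def read_secondary():
    return {"temperature_c": 33.4, "flow": 1.2}   # placeholder secondary readings

def sample_frame(t0: float) -> dict:
    """Read every sensor at (approximately) the same instant and timestamp the row."""
    row = {"t": time.monotonic() - t0}
    row.update(read_primary())
    row.update(read_secondary())
    return row

table = []
t0 = time.monotonic()
for _ in range(3):          # three frames of the common table
    table.append(sample_frame(t0))
    time.sleep(0.01)        # e.g., a 100 Hz sample period
print(table)
```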
Using the sensor system 102, the 3-D surface of the sleeve 4 may be transformed into planes which provide an active monitoring area. For example, if the sleeve 4 is worn on the user's arm, the arm may be approximated as the curved surface of a cylinder using the formula A=2πrh. The surface area determines the geometric shape of the primary sensor 110 and the distance between secondary sensors 112. The relationship between shape and surface area determines the number of segments in a single cell, hence the number of repeating geometric cells over a given area.
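The following minimal sketch applies the cylinder approximation described above (A = 2πrh) to estimate the active monitoring area and the number of repeating cells; the cell size used is an illustrative assumption, since the disclosure does not fix a specific cell area.

```python
# Illustrative sketch: estimate the active monitoring area of a limb modeled
# as a cylinder, and the number of repeating geometric cells that tessellate it.
import math

def monitoring_area_cm2(radius_cm: float, height_cm: float) -> float:
    """Curved surface of a cylinder: A = 2 * pi * r * h."""
    return 2.0 * math.pi * radius_cm * height_cm

def cell_count(area_cm2: float, cell_area_cm2: float) -> int:
    """Number of repeating geometric cells needed to cover the area."""
    return math.ceil(area_cm2 / cell_area_cm2)

# Example: forearm approximated as a cylinder of radius 4 cm and length 25 cm,
# tessellated by cells of 4 cm^2 each (assumed values).
area = monitoring_area_cm2(4.0, 25.0)     # ~628.3 cm^2
print(area, cell_count(area, 4.0))        # ~628.3, 158 cells
```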
Accordingly, if one or more sleeves 4 are worn on body parts as represented in
The sensor system 102 may generate various signals, such as (in increasing order of complexity): 1) accurate monitoring of the desired shape as a time series; 2) complex instantaneous measurements made by combining signals (for instance, measuring electrical-to-vascular lag as a proxy for vascular stiffness, or monitoring wall thickness and ambient temperature); 3) the integration of (1) and (2) into near-instantaneous monitoring or user feedback; and 4) the eventual acquisition of a sufficient database of integrated signals to form models for predicting certain events.
By way of example, the lattice 104 may have a parabolic shape and may be incorporated into the cylindrical sleeve 4. The sensor system 102 may monitor multiple types of strain across multiple planes, as well as temperature, flow, and expansion. These measurements may be made on the same clock and the output may be sent to a common table. Accordingly, the data should be gathered consistently (i.e., without “holes”).
As shown in
Furthermore, the orientation indicator 18 (
Referring again to
The user device 6 may be any device capable of running a program or application, receiving inputs from a practitioner, and wirelessly connecting to and receiving inputs from the monitoring sleeve 4. For example, and without limitation, a user device 6 could be a tablet, laptop, or smart phone.
The user device 6 may include a program or application that receives data from the sensor system 102 of the monitoring sleeve 4, calculates a range of motion from the data, and creates a real-time visual model of the range of motion of a body part as the wearer of the sleeve 4 moves the body part. The user device 6 may also receive inputs from a practitioner or user regarding the wearer's personal information, information regarding which body part is being tested, and any diagnostic information regarding the body part that is being tested. Additionally, the user device 6 may store data and information from previous tests and compare a wearer's previous range of motion with his or her current range of motion.
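As a hedged illustration of the kind of calculation the program or application might perform, the following sketch derives a range of motion value (in degrees) from a recorded series of joint angles and compares it with a previously stored result; the max-minus-min definition and the sample values are assumptions for illustration only.

```python
# Illustrative sketch: compute a range of motion from recorded joint angles
# and compare the current assessment with a previously stored one.

def range_of_motion_deg(angles_deg: list[float]) -> float:
    """Range of motion as the span between the extremes of the recorded angles."""
    return max(angles_deg) - min(angles_deg)

def compare_sessions(previous_rom: float, current_rom: float) -> float:
    """Positive value indicates improvement relative to the previous assessment."""
    return current_rom - previous_rom

current = range_of_motion_deg([5.0, 42.3, 88.1, 131.7, 120.4])   # 126.7 degrees
print(current, compare_sessions(previous_rom=110.0, current_rom=current))
```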
Data and information from previous tests may also be stored remotely in a central database 22, for example and without limitation, in the cloud, and accessed wirelessly by the user device 6 as necessary. Data stored in a central database 22 may be available to the user device 6, any other user device 6 running a program or application capable of accessing the central database and processing the data, or other programs and users.
The user device 6 may also include one or more processors and/or protocols that direct, measure, display, and record real-time body part movement and biometric data. For example, the user device 6 protocols may direct a user to perform a series of movements with a body part placed within the monitoring sleeve 4, place the sleeve 4 on another corresponding body part, and perform a series of movements with that body part. The user device 6 may also direct a user to perform a series of movements on two or more body parts as described below and/or when more than one monitoring sleeve 4 is used. As the user moves the body part in the specified directions, the user device 6 may receive data from the sensor system 102 of the sleeve 4 that the user device 6 uses to determine the range of motion of the body part and record the information. The user device 6 may also produce a real-time display of the body part and monitoring sleeve 4 as the wearer moves his or her body part. Finally, the user device 6 protocols may record the data received from the sleeve 4 either as part of the application or remotely. The protocols may capture the complex movements of the body part within the sleeve 4 as a combination of angular rotation and direction. The protocols may use a set of landmarks, either internal or external to the user device 6, to control positioning error. The user device 6 may automatically detect and connect to any monitoring sleeves 4 in the vicinity of the user device 6.
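One possible encoding of a movement sample as "angular rotation and direction" is an axis-angle pair, sketched below for illustration only; this representation and the field names are assumptions, as the disclosure does not prescribe a specific data format.

```python
# Illustrative sketch: represent one movement sample as a unit direction vector
# (the axis of rotation) together with an angular rotation about that axis.
import math
from dataclasses import dataclass

@dataclass
class MovementSample:
    axis: tuple[float, float, float]   # unit vector giving the direction of rotation
    angle_deg: float                   # angular rotation about that axis
    timestamp_s: float                 # time of the sample

def normalized(v: tuple[float, float, float]) -> tuple[float, float, float]:
    n = math.sqrt(sum(c * c for c in v))
    return (v[0] / n, v[1] / n, v[2] / n)

sample = MovementSample(axis=normalized((0.0, 0.0, 1.0)), angle_deg=37.5, timestamp_s=0.02)
print(sample)
```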
The protocols may organize the reception and collection of data in any order or format. For example, in one embodiment the protocols may instruct a user to perform one particular range of motion movement at a time, record the data from that movement, and then instruct the user to perform another range of motion movement. In this embodiment, the range of motion movement may be the same or different than the previous movement. In another embodiment, the protocols may allow for a user to perform any movements he or she desires and record the data from those movements. The protocols may then instruct the user to perform particular movements if necessary. In yet another embodiment, the protocols may direct a user to perform a set of pre-specified movements and then record the data from those movements. In this embodiment, the practitioner or user may choose a set of protocols for the user to perform, or may allow the application to choose the protocols to run. Additionally, a practitioner may customize the protocols to suit the particular application, data, or analysis of range of motion needed. In any of the previously described embodiments, data may be recorded in a user device 6, cloud 20, and/or a central database 22. More than one user device 6 may also be used as a part of the system 2. Each of the user devices 6 of a system 2 may use the same or different protocols than the other user devices 6 of the same system 2.
In one embodiment, the wearer may be given one or more monitoring sleeves 4 to wear on one or more body parts so that the practitioner can evaluate the range of motion of both injured and uninjured body parts. The practitioner may ensure proper anatomical placement of the monitoring sleeve 4. The practitioner may place the battery component 14 into the battery compartment 12 located on the lateral side of the sleeve 4. The battery life indicator 16 may be a light. The light may turn green when the battery 14 is in the housing. The practitioner may then open the application on a user device 6 (iPad, tablet, smartphone) and/or project the application onto a display device 8 in the exam room. The practitioner may then select the range of motion assessment protocol to be used from a menu of options as described more fully below. The application may record the range of motion (in degrees), joint velocity (in degrees per second), and the force generated (in Newtons) in real-time and/or near real-time. The application may record a three-dimensional rendering of the motion of the wearer's body parts and body that may be played back immediately or at any time after each assessment. Range of motion data may be instantly recorded remotely, including but not limited to in a data cloud 20 or central database 22. Computations of the range of motion data may occur in the cloud 20 or as accessed in the central database 22 in order to take advantage of potentially greater computing power than is available on the user device 6. The range of motion data may also be printed immediately after the time of the assessment. The wearer may visualize his or her progress with respect to range of motion through a personal and secure login to the application. Range of motion data may be displayed in a table within the application and may show the difference in range of motion between assessments. The wearer and/or practitioner may then, if necessary, select another range of motion assessment and follow the same sequence as described above.
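The following is a small sketch, assuming uniformly timestamped joint-angle samples, of how joint velocity in degrees per second might be derived from recorded range of motion data; the finite-difference approach shown is illustrative and not required by the disclosure.

```python
# Illustrative sketch: joint velocity (degrees per second) from consecutive
# joint-angle samples using finite differences. Values are placeholders.

def joint_velocity_deg_per_s(angles_deg: list[float], times_s: list[float]) -> list[float]:
    """Instantaneous joint velocity between consecutive samples."""
    return [
        (a1 - a0) / (t1 - t0)
        for (a0, a1), (t0, t1) in zip(zip(angles_deg, angles_deg[1:]),
                                      zip(times_s, times_s[1:]))
    ]

angles = [0.0, 12.0, 30.0, 55.0]    # degrees
times = [0.0, 0.1, 0.2, 0.3]        # seconds
print(joint_velocity_deg_per_s(angles, times))  # [120.0, 180.0, 250.0]
```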
Referring now to
The mirror view 202 may be used to collect and record free-form range of movement data, while the range of motion tests 204 instruct the wearer to perform certain movements. The software application may also take the wearer through a video tutorial which may be bypassed if the wearer is familiar with the range of motion assessment.
If the practitioner selects the mirror view 202, the protocol may proceed as follows. At 211, the practitioner may be asked to enter a user profile for the wearer into the user device 6. Next, at 212, the practitioner may be asked to select a position for the movements using the user device 6. At 212, the choices of positions may include, but are not limited to, seated, supine, or standing. Next, at 213, the user device 6 may scan for and connect to at least one, but potentially more than one, monitoring sleeve 4. Then, at 214, the practitioner may assign and align monitoring sleeves 4 with the respective body parts to be evaluated. The wearer may then, at 215, move the body parts to be evaluated, and the application in the user device 6 may receive data, including but not limited to range of motion, velocity, and force data, from the monitoring sleeves 4. The application may then (at 216) print the data received, (at 218) record the data received on the user device 6, cloud 20, and/or central database 22, and (at 220) render a visual representation of the movement. Next, at 222, the application may record the visual representation of movement and (at 224) replay the visual representation on the user device 6 and/or the display device 8.
Alternatively, if the practitioner selects range of motion tests, the protocol may operate as follows. First, at 231, the practitioner may select the desired range of motion test from a list of available tests. Next, at 232, the application may provide the instructions for the range of motion test selected at 231, for example, via the display device 8. The practitioner may then, at 233, be asked to enter a user profile for the wearer using the user device 6. The practitioner may then, at 234, scan and connect the user device 6 to a monitoring sleeve 4 and, at 235, assign and align the monitoring sleeve 4 with the body parts to be evaluated. Next, at 236, the practitioner may utilize the user device 6 to select the body part or parts to be tested as necessary (e.g., selecting between the right or left limb for testing). The application then, at 237, instructs the wearer on what motions to perform and when to perform them by displaying the instructions on the display device 8. At 238, the wearer of the sleeve 4 performs those movements per the instructions provided at 237. The user device 6 may, at 239, receive the range of motion test results. Also, at 240, the test results may be recorded in memory on the user device 6, cloud 20, and/or central database 22. Furthermore, at 241, processors at the user device 6 may access previously stored range of motion test results and compare those with the new test results (received at 239) in the application on the user device 6. The application, at 242, may also render a visual representation of the body part movement for the selected test and display the visual representation on either the user device 6 or the display device 8. The application may also, at 243, print the range of motion test results received at 239.
In another embodiment of an Objective Range of Motion Monitoring System shown in
While at least one exemplary embodiment has been presented in the foregoing detailed description, it should be appreciated that a vast number of variations exist. It should also be appreciated that the exemplary embodiment or exemplary embodiments are only examples, and are not intended to limit the scope, applicability, or configuration of the invention in any way. Rather, the foregoing detailed description will provide those skilled in the art with a convenient road map for implementing an exemplary embodiment of the present disclosure. It is understood that various changes may be made in the function and arrangement of elements described in an exemplary embodiment without departing from the scope of the present disclosure as set forth in the appended claims.
This application claims the benefit of U.S. Provisional Application No. 63/003,705 filed Apr. 1, 2020.