The present disclosure generally relates to a system and methods for collecting, calculating, and outputting data useful in analyzing an individual's postural sway.
Medio-lateral (ML) and anterior-posterior (AP) sway balance assessment is considered a good indicator of the body's ability to stabilize its center of mass within the limits of its base of support. Impairments of balance control, which can result from a wide variety of neuromuscular and vestibular disorders, can lead to frequent falls and associated morbidity and mortality. ML and AP sway assessment also provides valuable diagnostic and prognostic information on athletes suffering concussions. Current devices used to measure sway balance are mostly limited to laboratory settings and require trained personnel, thereby reducing their value for at-home or in-the-field assessment. For example, the Optotrak Certus (NDI, Canada), widely considered the gold standard by the clinical and research communities, is expensive (tens of thousands of dollars), requires complex hardware, and is difficult to operate without considerable training. Other commercial alternatives are based on inertial sensors and pressure plates, but these lack the accuracy and response speed needed for useful analysis. Therefore, improvements are needed in the field.
A postural sway analysis system is disclosed. The system includes a camera worn by an individual, a processing unit coupled to the camera, and a floor marker placed on the floor near the shoes or feet of the individual. The camera is configured to acquire images of the floor marker, which has a known size or diameter, while the individual is standing. The processing unit is configured to capture an initial calibration image of the floor marker using the camera while the individual is standing still, to determine the distance between the camera and the floor marker. The processing unit is further configured to capture subsequent time-varying images of the floor marker while the individual is standing (and swaying). Furthermore, the processing unit is configured to compare the calibration image to the subsequent time-varying images to determine a postural sway of the individual.
A method for acquiring postural sway of an individual is also disclosed. The method includes capturing a calibration image of a floor marker placed on a floor near the shoes or feet of an individual to determine the distance between the camera and the floor marker, wherein the calibration image is obtained from a camera worn by the individual. The method also includes capturing subsequent time-varying images of the floor marker while the individual is standing (and swaying). Furthermore, the method includes comparing the calibration image to the subsequent time-varying images to determine a postural sway of the individual.
In the following description and drawings, identical reference numerals have been used, where possible, to designate identical features that are common to the drawings.
The attached drawings are for purposes of illustration and are not necessarily to scale.
For the purposes of promoting an understanding of the principles of the present disclosure, reference will now be made to the embodiments illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of this disclosure is thereby intended.
In response to the need for a more efficient and effective postural sway analysis system, disclosed herein is a novel postural sway analyzer that can measure postural sway using an imaging system and a processing unit, such as a smart cellular phone with an integrated camera.
Referring to
The processing unit 110 includes a processor (not shown) or multiple processors (not shown), memory (not shown), input/output (I/O) circuitry (not shown), and other peripheral circuits typically available in a smart cellular phone. The I/O circuitry may include a wireless communication circuit (not shown), e.g., a Bluetooth system or WiFi, and/or a wired communication circuit (not shown).
The imaging system 120 includes a camera 122 and a right angle lens assembly 130. It should be noted that the right angle lens assembly 130 may be omitted if the camera 122 is positioned so that it points downward toward the shoes/feet of the subject. The camera 122 is typically integrated with the processing unit 110 but can also be part of the right angle lens assembly 130. The right angle lens assembly 130 includes a housing and a lens. The right angle lens assembly 130 is configured to transfer images from the lens to the camera 122 in a right angle manner. In the embodiment shown in
The right angle lens assembly 130 is configured to tilt the view by 90 degrees and offer a wide angle of view. The camera 122 with the detachable right-angle lens is thus capable of capturing images of a subject's shoes/feet. Once worn, the camera angle can be adjusted, if needed, to bring the marker into a direct field of view and to center it on the camera screen.
To analyze the postural sway of a subject, several parameters need to be monitored. Referring to
Once a successful marker image is detected, the system determines a calibration factor by performing a unit-distance calculation using the known size or diameter of the marker to determine a distance-per-pixel for the received image (stage 316). Successive images are then compared by the processing unit to determine the movement of the marker within the pixel grid of the received images. The movement of the floor marker within the image pixel grid is then used to determine the ML and AP sway of the user, since the marker movement is directly related to the movement of the camera relative to the marker. The marker movement data is then written to a data log file in the processing unit memory (stage 320) for further processing and output to a display (e.g., smartphone screen or other electronic display).
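The calibration and sway computation described above can be sketched as follows. This is an illustrative sketch only; the function names and the marker centroid values are hypothetical, not taken from the disclosure, and marker detection itself (stage 316's input) is assumed to have already produced a pixel diameter and per-frame centroid positions.

```python
# Hypothetical sketch of the calibration factor and sway computation.
# Assumes a marker detector has already supplied the marker's imaged
# diameter (pixels) and its centroid in each successive frame.

def mm_per_pixel(marker_diameter_mm, marker_diameter_px):
    """Calibration factor: physical distance represented by one pixel,
    derived from the marker's known size (stage 316)."""
    return marker_diameter_mm / marker_diameter_px

def sway_series(centroids_px, calib_mm_per_px):
    """Convert marker centroid positions (pixels) in successive frames
    into ML (x-axis) and AP (y-axis) excursions in millimetres,
    relative to the first (calibration) frame."""
    x0, y0 = centroids_px[0]
    ml = [(x - x0) * calib_mm_per_px for x, y in centroids_px]
    ap = [(y - y0) * calib_mm_per_px for x, y in centroids_px]
    return ml, ap

# Example: a 90 mm marker imaged at 180 px across gives 0.5 mm/px.
calib = mm_per_pixel(90.0, 180)
ml, ap = sway_series([(320, 240), (324, 238), (318, 243)], calib)
# ml -> [0.0, 2.0, -1.0]; ap -> [0.0, -1.0, 1.5]
```

Because the camera is body-worn and the marker is fixed on the floor, the marker's apparent displacement in the pixel grid mirrors the camera's (and hence the body's) displacement, which is why a single distance-per-pixel factor suffices.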
The sway data log records various parameters, including date, time, sampling frequency, floor marker size (unit distance), AP sway (distance) and ML sway (distance). In the software developed for the system of the present disclosure, sway assessment provides a brief on-screen summary for the users.
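A log record containing the parameters listed above could be written as in the following sketch. The function name, the CSV layout, and the use of peak-to-peak range as the logged sway distance are illustrative assumptions, not specifics of the disclosure.

```python
import csv
import datetime

def write_sway_log(path, sampling_hz, marker_mm, ap_mm, ml_mm):
    """Append one assessment record: date, time, sampling frequency,
    marker size (unit distance), AP sway, and ML sway. The logged sway
    distances are peak-to-peak ranges (an assumed summary statistic)."""
    now = datetime.datetime.now()
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([
            now.date().isoformat(),
            now.time().strftime("%H:%M:%S"),
            sampling_hz,
            marker_mm,
            round(max(ap_mm) - min(ap_mm), 2),  # AP sway range (mm)
            round(max(ml_mm) - min(ml_mm), 2),  # ML sway range (mm)
        ])
```

A summary screen can then be populated by reading back the most recent record from this file.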
The sway data that is acquired from sway analysis system 100 of the present disclosure can be used to predict user health, as there is a known association between sway variables and various health conditions and diseases, such as concussion. The data acquired by the sway analysis system 100 can be stored and compared to a library of known parameters associated with such health conditions. The individual's values are compared to these libraries to determine whether any of the parameters exceed their thresholds. If one or more thresholds are exceeded, the individual is identified as being at higher risk for the associated health conditions.
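The threshold comparison can be sketched as below. The library structure, parameter names, and the numeric thresholds are hypothetical placeholders for illustration; they are not clinical values from the disclosure.

```python
# Hypothetical library of per-condition thresholds (mm). The values
# here are placeholders, not validated clinical cut-offs.
REFERENCE_THRESHOLDS_MM = {
    "concussion": {"ap_range": 25.0, "ml_range": 20.0},
}

def flag_risks(measured, library=REFERENCE_THRESHOLDS_MM):
    """Return the conditions for which any measured sway parameter
    exceeds the corresponding threshold in the library."""
    flagged = []
    for condition, limits in library.items():
        if any(measured.get(param, 0.0) > limit
               for param, limit in limits.items()):
            flagged.append(condition)
    return flagged

flag_risks({"ap_range": 31.4, "ml_range": 12.0})  # -> ["concussion"]
flag_risks({"ap_range": 10.0, "ml_range": 12.0})  # -> []
```

Flagging on any single exceeded parameter, as done here, matches the "one or more parameters" criterion described above; a deployed system might instead weight or combine parameters.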
In one example, the system 100 was validated by its direct comparison to the optical motion analysis system, OptoTrak (Optotrak 3020, NDI), with an infrared LED of the OptoTrak placed on the wide-angle lens. Ten young healthy adults (24.6±3.4 yrs) were asked to stand quietly for 1 minute in the following conditions: on two feet with eyes open (2FEO), on two feet with eyes closed (2FEC), on one foot with eyes open (1FEO), and tandem standing eyes open (TEO).
Those skilled in the art will recognize that numerous modifications can be made to the specific implementations described above. The implementations should not be limited to the particular limitations described. Other implementations may be possible. While the inventions have been illustrated and described in detail in the drawings and foregoing description, the same is to be considered as illustrative and not restrictive in character, it being understood that only certain embodiments have been shown and described and that all changes and modifications that come within the spirit of the invention are desired to be protected.
The present application is related to and claims the priority benefit of U.S. Provisional Patent Application Ser. No. 62/593,679, filed Dec. 1, 2017, the contents of which is hereby incorporated by reference in its entirety into this disclosure.