Hemiplegia is a disability that renders one side of a patient's body immovable. Therapy includes exercises to move the affected joints and muscles, and the therapeutic exercises are prescribed to patients by medical professionals. To provide quality of service, therapists need to measure certain kinematic metrics, which requires certain devices to be brought into proximity of the patient's hand. Therapy in the home is more flexible and more convenient for the patient, allowing more frequent repetition of therapy exercises.
Accordingly, what is needed, as recognized by the present inventor, is a system that uses noninvasive technologies to track and monitor joints. In addition, therapeutic exercises need to be simple, entertaining, and able to be performed and monitored outside of a clinical setting.
The foregoing “background” description is for the purpose of generally presenting the context of the disclosure. Work of the inventor, to the extent it is described in this background section, as well as aspects of the description which may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present invention. The foregoing paragraphs have been provided by way of general introduction, and are not intended to limit the scope of the following claims. The described embodiments, together with further advantages, will be best understood by reference to the following detailed description taken in conjunction with the accompanying drawings.
A game-based method for physical rehabilitation is provided that authenticates via processing circuitry, a user with authentication information input through a communication interface, identifies therapeutic movements by referencing a look-up table stored in a memory, the therapeutic movements being prescribed for the user or associated with a diagnosis of a predetermined physical condition, obtains a user rehabilitation status stored in the memory, generates a game based on the therapeutic movements and the user status wherein the game includes controlling browsing of a map by detecting the therapeutic movements performed by the user, provides the game to the user, receives a data stream from a motion-sensing device that monitors an actual movement of the user in correspondence to the therapeutic movements, analyzes the data stream to calculate information metrics that are associated with the correspondence of the actual movements to the therapeutic movements, and updates the user rehabilitation status based on the information metrics.
A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout several views, the following description relates to a system and associated methodology for detecting, tracking, and visualization of physical therapy data.
The present disclosure relates to using a motion-sensing device such as a LEAP device to detect, recognize, and track the rotational and angular movements of different joints of the body. Kinematic and therapeutic data are then extracted from these movements. The system and associated methodology calculate a range of motion (ROM) from each therapy session. The range of motion is then displayed to the user in real time. A detailed analysis may also be displayed to the user. The analysis may be plotted in a 3D environment. The system of the present disclosure has the advantage of being non-invasive, as the patient does not need to wear any external devices. Complex measurement devices are bulky and restrict movement even for an unimpaired patient. Because the system is non-invasive, it can be used with hemiplegic patients of any disability level.
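As one illustration, the ROM calculation described above can be sketched as follows. This is a minimal sketch, not the system's actual implementation: the function names, and the assumption that a joint angle is formed by three tracked positions, are chosen for the example.

```python
import math

def joint_angle(p_a, p_b, p_c):
    """Angle (degrees) at joint p_b formed by segments p_b->p_a and p_b->p_c,
    where each point is an (x, y, z) position from the motion sensor."""
    v1 = tuple(a - b for a, b in zip(p_a, p_b))
    v2 = tuple(c - b for c, b in zip(p_c, p_b))
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def range_of_motion(angle_samples):
    """ROM for one session: the span between the smallest and largest
    joint angle observed in the stream of samples."""
    return max(angle_samples) - min(angle_samples)
```

For instance, an elbow that moves between 10 and 170 degrees over a session yields a ROM of 160 degrees, which the system could then plot in real time.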
The method of the present disclosure may also incorporate a 3D webGL-based serious game environment where the live therapeutic movement of the patient and a therapist is synchronized between a physical and a 3D environment. Since the framework is web-based, the user just needs the motion-sensing device to be connected to an electronic device. The electronic device connects via a network to a server. The user may then use a web browser to use the system of the present disclosure. In addition, the system and associated methodology of the present disclosure detects, recognizes, and visualizes primitive hand gestures and then uses them to define high-level and complex therapies that combine the primitive hand gestures.
The network 102 is any network that allows the motion-sensing device and the medical professional computer 110 to communicate information with each other, such as a wide area network, a local area network, or the Internet. The medical professional computer 110 represents one or more medical professional computers that could be located in a doctor's office, hospital, or other medical or health facility where they are used in treating patients as well as in reviewing patient records. The patient 108 may also use a personal computer to connect to the server 100 through the network 102 to view personal medical records and therapeutic exercises. In one embodiment, the personal computer may be connected with the motion-sensing devices 104 and 106.
The server 100 may be or include any database configured to store and/or provide access to information, such as an electronic database, a computer and/or computerized server, database server or any network host configured to store data. Further, the server 100 may include or be a part of a distributed network or cloud computing environment. As shown in
The motion-sensing device 104 may be any device configured to detect a 3D movement. The motion-sensing device may be based on different types of technologies. For example, the motion-sensing device may use accelerometers to detect orientation and acceleration. The motion-sensing device may use infrared structured light. For example, the motion-sensing device may be a LEAP or KINECT device. The LEAP device is a 3D-sensor device that captures the motion of hands and fingers at a rate of 60 frames per second. The KINECT device captures the 3D movement data of all the joints in the body. The KINECT device may capture motions from 20 joints of a human body at a rate of 30 frames per second. In one embodiment, the motion-sensing device may also receive data from one or more sensors that sense motions of the user.
The user may use any electronic device connected to the motion-sensing device to visualize interfaces. The electronic device 114 may be a computer, a laptop, a smartphone, a tablet, a television, or the like. The electronic device may include a computer monitor, television, or projector that can be used to view the output from the system. The electronic device 114 may be connected to the motion-sensing device 104 using a wired or wireless connection. In one embodiment, when the connection to the server 100 is not available, the motion-sensing device may store the captured data. Once a connection becomes available, the motion-sensing device may upload the captured data to the server 100 via the network 102. In one embodiment, the server 100 may poll the motion-sensing device, at predetermined instances, to check whether updated data is available. In response to determining that new data is available, the data are uploaded to the server 100 using the network 102. The data may then be processed in the server 100. The user may then download the data to the electronic device 114.
As discussed above, each joint has a number of movements associated with it. Some movements are angular, for example, flexion/extension of the elbow, which takes place when the wrist is brought near the shoulder or moved away from it. Let J be the set of joints being tracked:

J = {j1, j2, j3, . . . , jm}   (1)
For example, J can be j1=finger MCP, j2=right shoulder, j3=left shoulder.
At any given time, a joint has a particular state, and in that state the joint produces one or more movements related to that state. A set of states may be defined as follows:

S = {s1, s2, s3, . . . , sn}   (2)
For example, S may be s1=flexion, s2=extension, s3=abduction, s4=adduction.
A primitive therapeutic context Pi may be defined as a set of ordered pairs of joints and their respective states as follows:

Pi = {<jm, sn>}   (3)
For example, P1 may represent the primitive therapeutic context for wrist flexion and P2 may represent the primitive therapeutic context for wrist extension:

P1 = {<j1, s1>}   (4)

P2 = {<j1, s2>}   (5)
A complete therapeutic context T is defined as a series of primitive therapeutic contexts P1 . . . Pn. As an example, the above two primitive therapeutic contexts can be serially combined into a complete therapeutic context T depicting the wrist bend therapy as follows:

T = {P1, P2}   (6)

where P1 is wrist flexion and P2 is wrist extension.
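The model of equations (1) through (6) can be sketched as a small data structure. This is an illustrative sketch only; the joint and state labels are assumptions for the example and follow the wrist-flexion/extension reading of equations (4) and (5).

```python
# Joints (J) and states (S) tracked by the system -- illustrative labels only.
JOINTS = {"j1": "wrist", "j2": "right shoulder", "j3": "left shoulder"}
STATES = {"s1": "flexion", "s2": "extension", "s3": "abduction", "s4": "adduction"}

# A primitive therapeutic context P_i is a set of <joint, state> ordered pairs.
P1 = {("j1", "s1")}   # wrist flexion,   eq. (4)
P2 = {("j1", "s2")}   # wrist extension, eq. (5)

# A complete therapeutic context T is an ordered series of primitives, eq. (6).
T_wrist_bend = [P1, P2]

def describe(context):
    """Human-readable trace of a complete therapeutic context."""
    steps = []
    for primitive in context:
        for joint, state in sorted(primitive):
            steps.append(f"{JOINTS[joint]} {STATES[state]}")
    return steps
```

Describing `T_wrist_bend` in this sketch yields the two steps of the wrist bend therapy, "wrist flexion" followed by "wrist extension", matching the series defined in equation (6).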
A high level therapy may be composed of a number of sub-therapeutic actions. For example, a "walking" therapeutic exercise may be broken down into three separate sub-therapeutic actions around three different joints that need to be monitored. The three joints and their associated movements are: flexion/extension of the hip joint, flexion/extension of the knee joint, and dorsiflexion/plantar flexion of the ankle joint. The system then tracks movement or motion using the modeling described above.
In one embodiment, the user may need to be authenticated before starting to use the system. The authentication can be performed by a variety of methods, such as voice recognition via a microphone or fingerprint recognition via a fingerprint sensor. The fingerprint is verified using the fingerprint verification circuitry by comparing the fingerprint with the fingerprint stored in a user profile. In another example, the user may be authenticated by entering a pin code. At the beginning of each session, the patient 108 may indicate what devices are available to him. The patient may use the speech-based interface or the menu-driven interface to choose the available devices.
The system may include a sensory data manager 600. The sensory data manager 600 processes raw data from the motion-sensing device to extract joint data. In one embodiment, the raw data frames are in a JavaScript Object Notation (JSON) format. The extracted joint data contains the locations of joints as observed at a predetermined rate. For example, the locations of hand joints may be observed 60 times per second using the LEAP device. The predetermined rate may depend on the maximum acquisition rate of the motion-sensing device 104 type. In other embodiments, the predetermined rate may be a function of the required resolution.
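A minimal sketch of this frame-parsing step might look like the following. The JSON field names (`joints`, `name`, `position`) are hypothetical and chosen for the example; they do not reflect the actual LEAP frame schema.

```python
import json

def extract_joint_positions(raw_frame, tracked_joints):
    """Pull the (x, y, z) positions of the tracked joints out of one raw
    JSON frame. The frame layout used here is hypothetical."""
    frame = json.loads(raw_frame)
    return {
        j["name"]: tuple(j["position"])
        for j in frame.get("joints", [])
        if j["name"] in tracked_joints
    }

# One hypothetical raw frame, and extraction of the wrist position only.
raw = ('{"timestamp": 1010, "joints": ['
       '{"name": "wrist", "position": [12.0, 4.5, -3.2]},'
       '{"name": "elbow", "position": [30.1, 8.0, -1.0]}]}')
positions = extract_joint_positions(raw, {"wrist"})
```

Filtering to the tracked joints at this stage keeps the downstream session recorder and analytics from carrying data for joints the user chose not to monitor.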
The system may also include a session recorder 602. The session recorder may record the therapy session. The recorded therapy session may be saved to a session repository 604. The recorded therapy session may also be used by a motion extractor 612 to provide a live-view in a 3D environment. The motion extractor 612 may also provide plotting of the different quality of performance metrics in real time. The user may choose to record the therapy session or not. For example, the user may choose not to save the therapy session when an error has occurred. For example, the patient may start the session then stop for a particular reason. The user may also choose which joints need to be tracked. The selected joints are then displayed in real time.
The session repository database 604 stores the session data. The session data may also be stored in a cloud-based secondary storage. The session data may then be accessed and played later by the user. The community of interest (COI) 112 may also access the session data using the network 102. The community of interest 112 may include caregivers. The COI 112 may include the patient's parents, family members, relatives, friends, medical professionals, or the like. A medical professional is any member of a health-related field having a professional status or membership as would be understood by one of ordinary skill in the art. The COI may be authenticated before allowing access. Access to the session data may be limited depending on the medical professional's level, experience, special privileges, or seniority. In other words, access to the session record may be restricted by the CPU 2400. For example, a nurse may display the session data but cannot delete or update it. In another example, relatives may display the patient status but may not be able to display detailed information about the patient's health. The session data may also be added to the patient's online electronic health record for sharing purposes. The system may also include a user profile database 606 and a therapy database 608.
The user profile database 606 stores electronic medical records (EMR). The user profile database 606 stores detailed information about the patient, the therapist and the caregiver. The patient identification information may include one or more of, but not limited to, a name, a photo, a date of birth, a weight, a height, a gender, a skin color, a hair color, a next of kin, a fingerprint, an address, an emergency contact number and an identification number.
The user profile database 606 may also store a patient medical record. The patient medical record can include one or more of, but not limited to, a blood type, a vaccination list, an allergy list, a past surgeries list, insurance company information, a genetic diseases list, an immunization list, a family medical history and a prescribed medicament list. In addition, the user profile database stores disability information. The disability information may include one or more of, but not limited to, type of disability, therapist name, past history of therapy, recorded sessions, and improvement parameters.
The therapy database 608 stores details about the disability type, a therapy identification, therapy type, types of motions involved in each therapy type, joints and muscles to be tracked in each motion type, metrics that store those joint and muscle values, normal range of each of the motions and metrics, improvement metrics for each disability type, and the like. The therapy database 608 may also include information about specific clinical syndrome.
The motion extractor 612 may combine session data with user preferences from the user profile database 606 and data from the therapy database 608 and provides the output to a session player 610 and to a kinematic analytics module 614.
The kinematic analytics module 614 employs analytics and algorithms to provide live visualization of different quality of improvement metrics for each selected joints.
The session player 610 manages the movement of joints in the interface. For example, the session player 610 manages the movement of the physical hand in the 3D visualization interface.
As mentioned above, the second interface shows live 2D kinematic data: the joint positions, range of motion around each joint, speed, and other metrics over the course of the therapy session. The graphs may be plotted in real time during the session. In an embodiment, the graphs may be plotted after a session is completed. The visualization interface is used to start and end the therapy session.
In one embodiment, the medical professional may monitor the therapy session in real time. The medical professional may then provide live feedback to the patient. For example, when the patient receives a newly prescribed therapeutic sequence of exercises, the medical professional may monitor the patient and provide live feedback to the patient via the network 102. For example, the medical professional may correct the patient's moves.
In one embodiment, audio to encourage the patient 108 may be generated by the server 100. For example, when the patient's performance is better than a predetermined criterion, a prerecorded sentence may be played. The predetermined criterion may be a performance better than the patient's average performance. In other embodiments, the predetermined criterion may be a goal set by the medical professional. The CPU 2400 may generate a target for a therapeutic session based on the user's past performance, other patients' performances, and the user profile. This is done by comparing the patient's performance against reference data stored in the therapy database 608. Once the patient achieves the goal, the audio is generated. For example, the CPU 2400 may analyze past data to determine that the patient is showing an improvement of 1% after each therapeutic session. The CPU 2400 may obtain the current state of the joint from the user health record and may calculate a target state with the 1% improvement. Once the target is reached, implying that the patient has achieved the 1% improvement, the audio may be played. The audio generated may be based on the patient's age. For example, for a young girl the audio may be played using the voice of a Disney princess. In other embodiments, other encouragement methods may be used. For example, once a child completes the therapeutic session, the system may play the child's favorite songs or display the child's favorite cartoon.
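The target computation described above can be sketched as follows, assuming the performance metric is a ROM value and the improvement rate is the 1% figure from the example. The function names are illustrative, not part of the disclosed system.

```python
def session_target(current_rom, improvement_rate=0.01):
    """Target ROM for the next session, given the per-session improvement
    rate observed in the patient's past data (1% in the example above)."""
    return current_rom * (1.0 + improvement_rate)

def goal_reached(measured_rom, current_rom, improvement_rate=0.01):
    """True when the measured ROM meets the computed target, at which point
    the encouragement audio would be generated."""
    return measured_rom >= session_target(current_rom, improvement_rate)
```

For a joint whose current ROM is 100 degrees, the target for the next session would be about 101 degrees; a measured ROM of 102 degrees reaches the goal, while 100.5 degrees does not.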
The therapeutic sequence of exercises that the patient follows and executes may be delivered to the user in a plurality of ways. For example, the therapeutic exercises may be prerecorded on a DVD. The exercises may be demonstrated using an avatar on the screen. The system may use an online virtual game such as Second Life to display the movements of the therapist. The virtual online environment may have a design similar to that of the rehabilitation center that the patient attends. The patient may log in to the virtual online environment to view or download a practice session.
The system may also include an authoring interface. The authoring interface may be used by the medical professional to design a new therapy. The new therapy may include new exercises or a new therapeutic exercise sequence. The new therapy may be stored in the therapy database 608. The new therapy is then available to other medical professionals. The medical professional may choose to create a new therapy or modify an existing therapy. The system may associate a score with each therapy. The score is a function of past patients' improvement using the therapy.
In one embodiment, the therapeutic exercises may be presented to the user using a serious game format. The serious game may be a health GIS game. The hand gestures are then captured while the patient is playing the game. The game may support multiple players. The other player may be a healthy individual, which may encourage kids to play while performing their required therapeutic exercises.
In one embodiment, the health GIS game may consist of browsing a map. The patient 108 may be presented with a map on the display of the electronic device. The user's gestures may be captured while browsing the map. The game may include browsing the map in order to find virtual checkpoints. The game may be in a 2D or 3D format. The serious game may be designed to implement a therapeutic session of exercises consisting of six movements for the forearm and two joints. The serious game consists of browsing a map by panning left (radial deviation), panning right (ulnar deviation), zooming in (wrist flexion), zooming out (wrist extension/hyperextension), and circling around an airplane (pronation/supination). The serious game's virtual background may be based on the patient's age. For example, a cartoon-like map may be used for a child.
Table 1 shows an exemplary mapping of joint movements to map movements. Table 1 shows one exemplary configuration. Other configurations may be used based on the patient's health condition. The other configurations may be stored in the server 100. For example, the therapist may indicate the patient's health condition and the processing circuitry may determine a suitable configuration for the patient. For example, when the patient is missing limbs or fingers, the suitable configuration may not include a map movement that requires the movement of the missing limbs. For example, the abduction/adduction movement 400 to move up in the map may be replaced by a radial deviation of the right hand when the patient is missing fingers. The map may correspond to the area where the patient resides.
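A patient-specific configuration of the kind described might be represented as a simple gesture-to-action mapping with substitutions for gestures the patient cannot perform. The gesture and action names below are assumptions for the example, not the contents of Table 1.

```python
# Default gesture-to-map-action mapping, mirroring the movements described
# above for the serious game (names are illustrative).
DEFAULT_MAPPING = {
    "radial_deviation": "pan_left",
    "ulnar_deviation": "pan_right",
    "wrist_flexion": "zoom_in",
    "wrist_extension": "zoom_out",
    "pronation_supination": "circle_airplane",
}

def configure_mapping(unavailable_gestures, substitutions):
    """Build a patient-specific mapping: each gesture the patient cannot
    perform is replaced by the substitute gesture chosen by the therapist,
    which inherits the original gesture's map action."""
    mapping = dict(DEFAULT_MAPPING)
    for gesture in unavailable_gestures:
        action = mapping.pop(gesture)
        mapping[substitutions[gesture]] = action
    return mapping
```

For a patient who cannot perform wrist flexion, the therapist could substitute a right-hand radial deviation, which then triggers the zoom-in action instead.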
An inverse kinematics analyzer 708 processes the data and detects the state of the joints and motions in the live stream. The system also provides information to the analyzer regarding the joints that need to be tracked. The analyzer calls the function required to parse the stream. The output is forwarded to the appropriate window in the user interface to provide the user with information for improving the quality of improvement (QoI). The algorithm for the LEAP and KINECT motion analyzer is shown in the flow chart in
The user interface may include a quality of improvement display window 722. The quality of improvement window may display the name of the motion of the joints being tracked, for example, supination or pronation of the forearm. The motion name is received from the inverse kinematics analyzer 708.
The session recorder 702 may record the data stream. The data stream may be stored in a memory of the electronic device of the user. The data stream may also be uploaded to a health cloud 707. The data stream may be stored in a JavaScript Object Notation (JSON) or a Bio Vision Hierarchy (BVH) format.
In one embodiment, the user may control the session recorder using the system interface. The user may click a button to start, stop, or pause the recording. The user has the ability to start the recording when he is ready to perform the therapeutic exercises. The user may pause the recording in the case of interference from clothes or objects in the environment. In one embodiment, the live stream display may continue even when the recording is paused. The user can hence get a visual cue when the interference is removed and can continue with the recording. As discussed above, the system may use voice control to accept the user input. The user may speak a single command. Then, the server may match the single command with a corresponding action. The server 100 may then execute the voice command. In one embodiment, the single command may also be used to authenticate and identify the user by comparing the voice command with a stored speech model as would be understood by one of ordinary skill in the art.
A reporting engine 712 takes the stored motion files and processes them to extract joint movement data. It converts the joint movement data to charts and plots them on the screen. For multiple joint movements, charts are plotted for each joint and its movement from top to bottom on the page, aligned by the time stamps as shown in
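The per-joint grouping performed by the reporting engine might be sketched as follows; the record field names are hypothetical, but the output is one time-ordered series per joint, ready to be plotted top to bottom and aligned by timestamp.

```python
from collections import defaultdict

def joint_series(motion_records):
    """Group a flat list of motion records into one time-ordered series per
    joint. Each record is assumed to carry a joint name, a timestamp, and a
    measured angle (field names are illustrative)."""
    series = defaultdict(list)
    for record in motion_records:
        series[record["joint"]].append((record["t"], record["angle"]))
    # Sorting by timestamp lets the charts for different joints be aligned.
    return {joint: sorted(points) for joint, points in series.items()}

records = [
    {"joint": "wrist", "t": 2, "angle": 40},
    {"joint": "wrist", "t": 1, "angle": 30},
    {"joint": "elbow", "t": 1, "angle": 90},
]
charts = joint_series(records)
```

Because every series is keyed by the same timestamps, charts for multiple joints can share a common time axis when rendered on the page.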
A GIS games repository 714 may store a plurality of GIS-based serious games. A GIS game controller 716 may configure the game based on a patient rehabilitation status. A GIS game interface 720 is provided to the user as would be understood by one of ordinary skill in the art.
The software environment is set up such that a therapist can record an exercise session in a 3D environment. The patient can log on to the framework and preview the hand therapy in a 3D environment in the form of an avatar hand on the screen. The system can record the patient's session and send it to the therapist. Temporal data collected from a number of sessions over a long period can be used to monitor the effectiveness and progress of the rehabilitation process.
In one embodiment, the system may display a video of the user performing the prescribed therapeutic exercises while the therapist is correcting him. This functionality provides a high level of personalization and increases the accuracy of the patient performing the therapeutic exercises.
In selected embodiments, the server 100 may select, based on the patient's current state and rate of improvement, therapeutic exercises with a higher complexity and difficulty level.
In one embodiment, the server 100 using the CPU 2400 may generate an alert when the patient 108 did not perform the required therapeutic exercise. The server 100 may also generate the alert when the patient 108 does not complete the exercise. When the patient 108 skips the therapeutic exercise more than a predetermined number of times, an alert is generated to the community of interest 112. For example, when a child skips the required therapeutic exercise or does not complete it correctly, an alert is generated to the parent or the guardian. For example, when a child performs the exercise fewer than a prescribed number of times, an alert may be generated.
In one embodiment, the CPU 2400 may compare the current state of a joint with a prescribed state stored in the therapy database. In response to determining that the joint state does not correspond with the prescribed state, the alert is generated. The alert may include a warning sound. The alert may also include generating an error message on the display. The error message may show the prescribed state.
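A minimal sketch of this comparison, assuming joint states are expressed as angles and using a hypothetical tolerance, might look like the following; the function name and the tolerance value are assumptions for the example.

```python
def check_joint_state(current_state, prescribed_state, tolerance_deg=5.0):
    """Compare the measured joint angle against the prescribed state from
    the therapy database; return an alert message when the two do not
    correspond within the tolerance, or None when they do."""
    deviation = abs(current_state - prescribed_state)
    if deviation > tolerance_deg:
        # The error message shows the prescribed state, as described above.
        return (f"Alert: joint at {current_state} deg, "
                f"prescribed {prescribed_state} deg")
    return None
```

In this sketch, a non-None return would trigger the warning sound and the on-screen error message showing the prescribed state.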
In one embodiment, a reminder may be provided to the patient to perform the required therapeutic exercise. For example, the alert may be visual, audible, or tactile. The alert may be shown on the patient's computer, television, smartphone, smartwatch, or the like.
The system may also poll a patient's electronic calendar to generate the reminder to perform the required therapeutic exercise at times convenient to the user. For example, the system may access an electronic calendar of the patient and the user profile. The system, using the CPU 2400, may determine a convenient time to perform the exercise. Then the system may generate an alert informing the user about the convenient time. For example, the therapist may indicate that the patient should perform the therapeutic session each morning. This information is stored in the user profile as explained above. The CPU 2400 may poll the electronic calendar of the patient to determine available free time during the morning. The CPU 2400 may then alert the patient to perform the therapeutic session at the available free time. In another example, the patient may perform his or her daily therapeutic session at 5 pm. The CPU 2400 may determine that the patient has an activity such as attending a birthday party at 5 pm and may then generate the alert to perform the therapeutic session at 4 pm. In this way, the system avoids constraints such as prayer times, school, or the like.
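The free-slot search over the electronic calendar might be sketched as follows, with busy intervals expressed as minutes since midnight; this representation and the function name are assumptions for the example.

```python
def first_free_slot(busy, window_start, window_end, duration):
    """Find the earliest gap of at least `duration` minutes inside the
    preferred window, given busy calendar intervals as (start, end) pairs
    in minutes since midnight. Returns the slot start, or None if no gap
    of sufficient length exists."""
    cursor = window_start
    for start, end in sorted(busy):
        if start - cursor >= duration:
            return cursor  # gap before this busy interval is long enough
        cursor = max(cursor, end)
    # Check the remaining time after the last busy interval.
    return cursor if window_end - cursor >= duration else None
```

For a morning window of 8:00 to 12:00 (480 to 720) and a 30-minute session, a calendar busy from 8:00 to 10:00 would push the suggested start to 10:00; an entirely busy window yields no suggestion.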
To illustrate the capabilities of the system, exemplary results are presented.
The patient may use a personal computer to connect to the Kinect device through a USB port. The output interface may be displayed in an HTML5-based browser using a WebSocket.
Next, a hardware description of the server 100 according to exemplary embodiments is described with reference to
Further, the claimed advancements may be provided as a utility application, background daemon, or component of an operating system, or a combination thereof, executing in conjunction with CPU 2400 and an operating system such as Microsoft Windows 7, UNIX, Solaris, LINUX, Apple MAC-OS, and other systems known to those skilled in the art.
The hardware elements of the server 100 may be realized by various circuitry elements known to those skilled in the art. For example, CPU 2400 may be a Xeon or Core processor from Intel of America or an Opteron processor from AMD of America, or may be other processor types that would be recognized by one of ordinary skill in the art. Alternatively, the CPU 2400 may be implemented on an FPGA, ASIC, PLD, or using discrete logic circuits, as one of ordinary skill in the art would recognize. Further, CPU 2400 may be implemented as multiple processors cooperatively working in parallel to perform the instructions of the inventive processes described above.
The server 100 in
The server 100 further includes a display controller 2408, such as a NVIDIA GeForce GTX or Quadro graphics adaptor from NVIDIA Corporation of America for interfacing with display 2410, such as a Hewlett Packard HPL2445w LCD monitor. A general purpose I/O interface 2412 interfaces with a keyboard and/or mouse 2414 as well as a touch screen panel 2416 on or separate from display 2410. General purpose I/O interface also connects to a variety of peripherals 2418 including printers and scanners, such as an OfficeJet or DeskJet from Hewlett Packard.
A sound controller 2420 is also provided in the server 100, such as Sound Blaster X-Fi Titanium from Creative, to interface with speakers/microphone 2422 thereby providing sounds and/or music.
The general purpose storage controller 2424 connects the storage medium disk 2404 with communication bus 2426, which may be an ISA, EISA, VESA, PCI, or similar, for interconnecting all of the components of the server 100. A description of the general features and functionality of the display 2410, keyboard and/or mouse 2414, as well as the display controller 2408, storage controller 2424, network controller 2406, sound controller 2420, and general purpose I/O interface 2412 is omitted herein for brevity as these features are known.
The exemplary circuit elements described in the context of the present disclosure may be replaced with other elements and structured differently than the examples provided herein. Moreover, circuitry configured to perform features described herein may be implemented in multiple circuit units (e.g., chips), or the features may be combined in circuitry on a single chipset, as shown on
In
For example,
Referring again to
The PCI devices may include, for example, Ethernet adapters, add-in cards, and PC cards for notebook computers. The hard disk drive 2560 and CD-ROM 2566 can use, for example, an integrated drive electronics (IDE) or serial advanced technology attachment (SATA) interface. In one implementation, the I/O bus can include a super I/O (SIO) device.
Further, the hard disk drive (HDD) 2560 and optical drive 2566 can also be coupled to the SB/ICH 2520 through a system bus. In one implementation, a keyboard 2570, a mouse 2572, a parallel port 2578, and a serial port 2576 can be connected to the system bus through the I/O bus. Other peripherals and devices can be connected to the SB/ICH 2520 using, for example, a mass storage controller such as SATA or PATA, an Ethernet port, an ISA bus, an LPC bridge, an SMBus, a DMA controller, or an audio codec.
Moreover, the present disclosure is not limited to the specific circuit elements described herein, nor is the present disclosure limited to the specific sizing and classification of these elements. For example, the skilled artisan will appreciate that the circuitry described herein may be adapted based on changes on battery sizing and chemistry, or based on the requirements of the intended back-up load to be powered.
The hardware description above, exemplified by any one of the structure examples shown in
The above-described hardware description is a non-limiting example of corresponding structure for performing the functionality described herein.
A system that includes the features in the foregoing description provides numerous advantages to users. In particular, the system helps patients conduct therapy at home as many times as needed and helps therapists view live therapy conducted in the patient's home. In addition, the system is easy to use, as it does not require any sensor to be attached to the human body.
Obviously, numerous modifications and variations are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein.
Thus, the foregoing discussion discloses and describes merely exemplary embodiments of the present invention. As will be understood by those skilled in the art, the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting of the scope of the invention, as well as other claims. The disclosure, including any readily discernible variants of the teachings herein, defines, in part, the scope of the foregoing claim terminology such that no inventive subject matter is dedicated to the public.
This application claims the benefit of priority from U.S. Provisional Application No. 62/060,981 filed Oct. 7, 2014 and from U.S. Provisional Application No. 62/141,719 filed Apr. 1, 2015, both of which are herein incorporated by reference in their entirety.