The present disclosure relates to apparatus and methods for improving eye-hand coordination. In particular, the present disclosure relates to an apparatus for measuring and quantifying eye-hand coordination using progressive displaying and tracing techniques and related methods.
Eye-hand coordination is the coordinated movement of a subject user's eyes and hands as the user's brain processes visual stimuli. In other words, it is the ability of the subject user's vision processing system to coordinate information received through the eyes to control and guide movement of the subject user's hands.
Eye-hand coordination is important for many day-to-day activities, such as writing, driving, or operating a computer. Beyond such basic needs, eye-hand coordination measurement and quantification is important to understand for particular individuals, such as athletes where activities may include catching a ball or making coordinated movements of the hands relative to a sports object (e.g., a baseball bat as a baseball approaches the subject user or a tennis racquet as a tennis ball moves towards the subject user).
Eye-hand coordination problems are usually first noted in children as a lack of skill in drawing or writing. For impaired children, drawing may show poor orientation on the page, and the child may be unable to stay "within the lines" when using a coloring book. The child may continue to depend on his or her hand for inspection and exploration of toys or other objects.
Poor eye-hand coordination can have a wide variety of causes. Some common conditions responsible for inadequate eye-hand coordination include aging, vision problems, and movement disorders. More specifically, impairments to eye-hand coordination are known to occur due to brain damage, degeneration of the brain, or other clinical conditions or problems. Adults having Parkinson's disease tend to have increasing difficulty with eye-hand coordination as the disease progresses over time. Other movement disorders exhibiting eye-hand coordination issues include hypertonia (a condition characterized by an abnormal increase in muscle tension and a decreased ability of the muscle to stretch) and ataxia (a condition characterized by a lack of coordination while performing voluntary movements).
In order to treat such impairments, it is desirable to repeatedly and consistently measure and quantify the eye-hand coordination of a subject user. Accordingly, there is a need for improved measurement and quantification of eye-hand coordination that permits individualized scoring from different visual stimuli presented to a subject user. Such improved methods and systems for measuring and quantifying eye-hand coordination may be used by insurance companies, which may desire quantifiable tests that analyze a subject user's eye-hand coordination and historically track the subject user's improvement over time. Thus, it may be desirable to provide an apparatus and/or related methods for improving eye-hand coordination that permit improved measuring and quantification of improvements to eye-hand coordination over time.
In the following description, certain aspects and embodiments will become evident. It should be understood that the aspects and embodiments, in their broadest sense, could be practiced without having one or more features of these aspects and embodiments. Thus, it should be understood that these aspects and embodiments are merely exemplary.
One aspect of the disclosure relates to an apparatus for measuring and quantifying eye-hand coordination of a subject user. The apparatus may include a processor, a tablet, and a memory. The tablet is interfaced with the processor and configured to accept progressive input from the subject user. The memory is interfaced with the processor and configured to maintain a record associated with measuring the eye-hand coordination of the subject user. The processor is configured to progressively display a visual symbol on the tablet, detect a progressive tracing of the displayed visual symbol from the progressive input of the subject user, determine a score based upon a characteristic of the progressive tracing, and store the determined score within the record in the memory.
Another aspect of the disclosure relates to an apparatus for measuring and quantifying eye-hand coordination of a subject user, where the apparatus may include a housing, a processor disposed within the housing, a tablet, a stylus, a measurement result interface, and a memory. The housing has a first display opening and a second display opening. The tablet is in communication with the processor and disposed within the housing such that a display surface of the tablet is oriented for viewing through the first display opening of the housing. The tablet is configured to accept progressive input from the subject user, who is operating the stylus. The tablet is configured to detect the presence of the stylus as it is moved relative to the display surface of the tablet by the subject user over time as the progressive input of the subject user. The measurement result interface is in communication with the processor and disposed within the housing such that the measurement result interface is viewable through the second display opening of the housing. The memory is in communication with the processor and configured to maintain a plurality of visual cues and a plurality of records associated with measuring the eye-hand coordination of the subject user.
As part of this apparatus, the processor is configured to select one of a plurality of visual cues stored in the memory as a visual symbol to be progressively displayed for the subject user based upon an analysis of previously determined eye-hand coordination scores for the subject user stored within the records in the memory, progressively display the selected visual symbol on the tablet, detect a progressive tracing of the displayed visual symbol from the progressive input of the subject user, determine a new eye-hand coordination score based upon how quickly the progressive tracing follows the progressive display of the visual symbol over time and how far a path of the progressive tracing deviates from a path of the progressive display of the visual symbol over time, and store the new eye-hand coordination score within a new record in the memory. The processor may be configured to progressively remove an older portion of the visual symbol while progressively displaying a newer part of the visual symbol.
The apparatus, according to this aspect of the disclosure, may also have the processor configured to provide, on the measurement result interface, a ranking of the new eye-hand coordination score for the subject user in comparison to at least one prior score for the subject user stored within the records in the memory, so as to quantify an improvement factor of the eye-hand coordination of the subject user over time.
Yet a further aspect of the disclosure relates to a method for measuring and quantifying eye-hand coordination of a subject user. The method begins by progressively displaying a visual symbol on a tablet and accepting progressive input from a stylus operated by the subject user as the visual symbol is progressively displayed on the tablet. The method continues by detecting a progressive tracing of the displayed visual symbol from the progressive input of the subject user. Next, the method determines a score based upon a characteristic of the progressive tracing, the characteristic of the progressive tracing being at least one of how quickly the progressive tracing follows the progressive display of the visual symbol over time on the tablet and how far a path of the progressive tracing on the tablet deviates from a path of the progressive display of the visual symbol on the tablet over time. Finally, the method stores the determined score as a record on a memory device, where the record is associated with the subject user's eye-hand coordination at a particular instance in time.
Another aspect of the disclosure relates to an apparatus for measuring and quantifying eye-hand coordination of a subject user applicable to a three-dimensional operating environment. The apparatus includes, at least in part, a three-dimensional display device, a processor, at least one sensor, and a memory. The three-dimensional display device provides a three-dimensional view of displayed information to the subject user. The processor is in communication with the three-dimensional display device and a sensor, which is configured to accept three-dimensional progressive input from the subject user. The memory is also in communication with the processor and configured to maintain a plurality of visual cues and a plurality of records associated with measuring the eye-hand coordination of the subject user. The processor is configured to progressively display at least one of the visual cues as a three-dimensional visual symbol on the three-dimensional display device, detect a three-dimensional progressive tracing of the displayed visual symbol from the progressive input of the subject user, determine a new eye-hand coordination score based upon how quickly the three-dimensional progressive tracing follows the progressive three-dimensional display of the visual symbol over time and how far a path of the three-dimensional progressive tracing deviates from a path of the three-dimensional progressive display of the visual symbol over time, and store the new eye-hand coordination score within a new record in the memory.
Additional advantages of the disclosure will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the disclosed exemplary embodiments.
Aside from the structural and procedural arrangements set forth above, the embodiments could include a number of other arrangements, such as those explained hereinafter. It is to be understood that both the foregoing description and the following description are exemplary only.
The accompanying drawings, which are incorporated in and constitute a part of this description, illustrate several exemplary embodiments and together with the description, serve to explain principles of the embodiments. In the drawings,
Reference will now be made in detail to exemplary embodiments. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
In general, embodiments of an apparatus and method for measuring and quantifying eye-hand coordination of a subject user are described herein. One or more visual symbols are progressively displayed on a display surface, such as an interactive surface of a tablet, where a progressive tracing of the displayed symbols can be detected based upon progressive input received from the user. Based upon various characteristics of the tracing, such as its path, accuracy, and/or how quickly the user completes the tracing, a score is determined and stored relative to the specific user. Thus, through repeated performance of following the developing progressive lines of the visual symbol, the subject user's eye-hand coordination may be measured and repeatedly quantified.
In overview,
Referring now to
Those skilled in the art will appreciate that a processor is used herein as a general term for one or more logic devices that are able to control an apparatus with inputs and conditional outputs, including but not limited to combinational logic circuits, general purpose microprocessors, programmable logic devices or programmable logic arrays (PLA). And while exemplary unit 100 is described herein as microprocessor based, other variations of such a unit may be implemented with similar functionality with hard wired circuits or other logic circuits to function without the need for a programmable microprocessor.
Tablet 110 may be implemented as a touch sensitive input device configured to display one or more different possible visual cues as a particular visual symbol, such as symbols 112, 113, 114, and 115a. The tablet 110 receives input from the subject user via touch by detecting the presence, relative location and movement of the user's finger when pressed against a display surface of tablet 110. Alternatively, the tablet 110 may receive input from the user by detecting the presence, relative location and movement of a stylus (not shown) as it is held against the display surface of tablet 110 and moved relative to that surface.
Display 120 of unit 100 is shown as generally having various interfaces, e.g., 125a-125e, that provide useful information to the user. In the illustrated embodiment of
In operation, based upon patient identification and the subject user's prior scores, the tablet 110 of unit 100 displays a particular visual symbol to be traced by the subject user once the test begins. Patient identification may be in the form of a user response to a prompt appearing on one of unit 100's displays (including the surface of tablet 110) or, alternatively, in the form of an electronic signal received by the unit 100 from an external source (not shown), such as a remote computer used in a rehabilitation or clinical environment. Visual cues may be of any type of scenes, shapes, objects, numbers or letters, such as the exemplary visual symbols shown in
In the example shown in
CPU 300 is implemented as a microprocessor capable of accessing RAM 310, where program code (not shown) for operating unit 100 resides during operation. The program code is initially stored within memory storage 320 as one or more sets of executable instructions. CPU 300 typically reads the appropriate program code from memory storage 320 upon initialization. From there, CPU 300 runs the program code, which in turn, controls operation of unit 100 as the subject user interacts with the unit 100 as part of measuring and quantifying the user's eye-hand coordination in accordance with embodiments of the present invention. The steps shown in
Memory storage 320 is implemented within unit 100 as a local memory storage, but alternatively may be implemented as a remote memory storage device accessible by CPU 300 through a data communication network interface (not shown). Thus, embodiments of the present invention may implement memory storage 320 as a variety of computer-readable medium including, but not limited to, a hard disk drive, a floppy disk drive, a flash drive, an optical drive, or small format memory card such as a Secure Digital (SD) card. Thus, other embodiments of the present invention may provide the program code on removable memory storage or on memory storage located remotely from the actual testing unit, such as unit 100 or unit 200. Memory storage 320 also stores and maintains determined scores after a subject user completes a test using unit 100 as well as prior scores for a particular subject user.
In operation, the visual symbol being progressively displayed may be selected from one of multiple possible visual cues (not shown) stored in memory storage 320 or generated from the program code resident on memory storage 320. The visual cues may be stored in memory storage 320 as separate code representing the particular visual symbols to be displayed or, alternatively, may be stored in memory storage 320 as part of the operational program code initially loaded by CPU 300 into RAM 310 described above. Selection of which visual cues to use as part of the visual symbol being progressively displayed may depend upon the patient I.D. information associated with the user, prior scores stored in records of memory storage 320 indicative of past performance by the user on eye-hand coordination tests, or determined rankings of the user's improvement of eye-hand coordination.
Additionally, CPU 300 may cause the tablet 110 to progressively display the visual symbol by progressively removing an older portion of the visual symbol while progressively displaying a newer part of the visual symbol while detecting the progressive tracing of the appearing and disappearing visual symbol.
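The appearing-and-disappearing behavior described above may be sketched as a sliding window over the symbol's path points. The following Python sketch is purely illustrative and not part of the disclosure; the `(x, y)` point representation and the fixed window size are assumptions made for the example:

```python
# Illustrative sketch only: the visual symbol is assumed to be an
# ordered list of (x, y) path points, and "window_size" points are
# visible at once. As each new point is appended, the oldest point
# beyond the window is removed, so the symbol appears to travel.
from collections import deque

def progressive_window(path_points, window_size):
    """Yield the successive visible segments of the symbol path."""
    visible = deque(maxlen=window_size)  # oldest points drop off automatically
    for point in path_points:
        visible.append(point)
        yield list(visible)

# Example: a five-point path shown through a three-point window.
segments = list(progressive_window([(0, 0), (1, 0), (2, 0), (3, 0), (4, 0)], 3))
```

Each yielded segment is what would be drawn on the tablet at that instant; earlier points have already been erased once they leave the window.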
Referring now to
In general, CPU 300 is configured to determine a score for the subject user upon completing the test based upon one or more characteristics of the detected progressive tracing. For example, in one embodiment, the score is based upon how quickly the user is able to trace the visual symbol over time. In another embodiment, the score may be based upon how accurate the progressive tracing is relative to the path of the progressively displayed visual symbol, such as how far the progressive tracing path deviates from a path of the progressively displayed visual symbol. In some embodiments, CPU 300 is configured to provide feedback and interim test results in the form of accumulated scores. However, other embodiments may configure the CPU 300 to determine the subject user's score at the completion of the test as a final score.
In addition to these general examples and accompanying description for how an embodiment of the present invention may determine a score, several more detailed examples for determining a score in accordance with principles of the present invention are provided with reference to Table 1 below.
Score 1 may be determined in one embodiment by determining the “Distance” as the length of the traced arc along the path of the progressive display (e.g., path 415 shown in
Score 2 is an inverse of Score 1 with the “Constant” being added to the Distance to prevent divide-by-zero errors. Score 2 may also be determined as a percentage when compared to when the Distance is a zero value (the ideal perfect score). Thus, an implementation of Score 2 as a percentage may be determined as (100)×(Score 2)/(Score 2 when Distance is a zero value).
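One possible reading of these two scores is sketched below in Python. Because Table 1 itself is not reproduced in this excerpt, the Multiplier and Constant values, and the exact form of Score 1, are assumptions made for illustration only:

```python
# Illustrative sketch only; the Multiplier, Constant, and the exact
# Table 1 formulas are assumptions, not taken from the disclosure.
MULTIPLIER = 100.0
CONSTANT = 1.0  # prevents a divide-by-zero when Distance is zero

def score_1(distance):
    """Error-style score: grows with the Distance between the
    displayed path and the user's trace."""
    return MULTIPLIER * distance

def score_2(distance):
    """Inverse of Score 1; the Constant keeps the denominator nonzero."""
    return MULTIPLIER / (distance + CONSTANT)

def score_2_percent(distance):
    """Score 2 expressed as a percentage of the ideal perfect score,
    i.e., (100) x (Score 2) / (Score 2 when Distance is zero)."""
    return 100.0 * score_2(distance) / score_2(0.0)
```

Under these assumed constants, a Distance of zero yields 100%, and the percentage falls off as the tracing error grows.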
Score 3 is a type of score that may be helpful in gauging initial reaction time between observation with the eye and coordination with hand movements. The Multiplier value is multiplied by two different factors: (1) the time between the first appearance of a particular visual symbol and when the subject user first provides the initial point of progressive input for tracing that visual symbol (i.e., the Elapsed Contact Time), and (2) the position error between the first appearance of the particular visual symbol and the initial point of progressive input from the subject user when attempting to trace that visual symbol (i.e., the Contact Position Error).
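A hypothetical sketch of this reaction-time score follows. How the two scaled factors are combined is not fixed by the excerpt above, so the simple additive combination and the default Multiplier shown here are assumptions for illustration:

```python
# Illustrative sketch only; the additive combination and the default
# Multiplier value are assumptions, not taken from the disclosure.
def score_3(elapsed_contact_time, contact_position_error, multiplier=10.0):
    """Reaction-time score: the Multiplier scales both the delay
    before first contact (Elapsed Contact Time) and the position
    error of that first contact (Contact Position Error)."""
    return multiplier * elapsed_contact_time + multiplier * contact_position_error
```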
Score 4 is the product of the Multiplier and a maximum value of the Distance determined when the subject user is attempting to trace a particular visual symbol. Likewise, Score 5 is the product of the Multiplier and a sum of the Distances incrementally determined over time as the subject user is attempting to trace a particular visual symbol. Thus, Score 4 is a maximum error type of scoring measurement while Score 5 is an accumulated error type of scoring measurement.
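These two error measurements may be sketched as follows, assuming (for illustration only) that the incremental Distance samples gathered during the trace are available as a list:

```python
# Illustrative sketch only; the list-of-samples representation and
# default Multiplier are assumptions, not taken from the disclosure.
def score_4(distances, multiplier=1.0):
    """Maximum-error score: the Multiplier times the largest Distance
    observed while the subject user traces the symbol."""
    return multiplier * max(distances)

def score_5(distances, multiplier=1.0):
    """Accumulated-error score: the Multiplier times the sum of the
    Distances incrementally determined over the trace."""
    return multiplier * sum(distances)
```

For the same trace, Score 4 flags the single worst deviation while Score 5 reflects total error over the whole symbol.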
Another example of scoring alternatively determines the Distance as a pure linear distance between points (e.g., the straight line distance between the newest point displayed on the progressively displayed visual image and the latest progressive input point when the subject user attempts to trace the path of the progressively displayed visual symbol) as opposed to the distance computed along the progressively displayed path (which may be different than the straight line distance). Scores 6-10 are types of scores determined in accordance with the exemplary Scores 1-5, but with the Distance value determined as a straight line Distance. Depending on the configuration of the visual symbol being progressively displayed, using a straight line Distance may be less taxing on the unit to compute.
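The straight-line alternative reduces each sample to a single Euclidean distance, as the following illustrative sketch shows (the `(x, y)` point representation is an assumption):

```python
# Illustrative sketch only: Euclidean distance between the newest
# displayed point and the user's latest trace point. This is cheaper
# to compute than arc length measured along the displayed path.
import math

def straight_line_distance(display_point, trace_point):
    """Return the straight-line Distance between two (x, y) points."""
    dx = display_point[0] - trace_point[0]
    dy = display_point[1] - trace_point[1]
    return math.hypot(dx, dy)
```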
As with Scores 1 and 2, which have an inverted relationship, other scores may be used that are inverted versions of such exemplary scores. For example, Scores 11-18 correspond to the factors and calculations used to determine Scores 3-10, respectively, but are merely inverted.
In some embodiments, Scores 1 and/or Score 2 may be determined and displayed in substantially real time as instantaneous scores. In other embodiments, they may also or alternatively be determined as averages and, at the end of the test, the last average may be used as the respective final score for the test. Embodiments of the invention may also or alternatively determine Scores 3, 4 and/or 5 at the end of each progressively displayed visual symbol and, as such, Scores 3, 4 and/or 5 would be updated incrementally rather than in a continuous or near real time manner. However, it is anticipated that embodiments of the present invention implementing Scores 3, 4, and 5 may determine these scores as averages and, at the end of the test, the last average may be used as the final score for the test.
Further details of the operation and functionality of embodiments of the present invention may be understood with reference to the flow diagrams set forth in
At stage 520, method 500 updates the progressive tracing from the detected progressive input from the user before proceeding to stage 525. At stage 525, if the time for the test is at an end, stage 525 proceeds to stage 530 for scoring. However, if the test is not yet ended, stage 525 proceeds back to stage 510, where the visual symbol continues to be progressively displayed, and stages 515 and 520, where progressive input is received and the progressive tracing continues to be detected.
At stage 530, the method 500 determines a score based upon a characteristic of the progressive tracing. In one embodiment, the score may be based upon how quickly the progressive tracing follows the progressive display of the visual symbol over time. In another embodiment, the score may be based upon how far a path of the progressive tracing deviates from a path of the progressive display of the visual symbol over time. Such deviations may be computed as an average accumulated error distance of the tracing off the path of the progressively displayed visual symbol. In some embodiments, scoring may be determined on an ongoing, accumulated basis instead of just at the completion of the test, and as such, may be periodically shown to the user during the test.
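The average accumulated error distance mentioned above may be sketched as follows, assuming (for illustration only) that the displayed symbol path and the user's trace are sampled as time-aligned lists of `(x, y)` points:

```python
# Illustrative sketch only; the time-aligned sampling of the two
# paths is an assumption made for this example.
import math

def average_deviation(display_path, trace_path):
    """Average error distance between time-aligned samples of the
    progressively displayed path and the user's traced path."""
    errors = [math.hypot(dx - tx, dy - ty)
              for (dx, dy), (tx, ty) in zip(display_path, trace_path)]
    return sum(errors) / len(errors)
```

Because each per-sample error is available as it is computed, such a measure could also be accumulated and shown to the user on an ongoing basis during the test, as the text notes.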
After the score is determined at stage 530, the determined score is stored within a record in memory at stage 535 before method 500 ends. The record may be of any predetermined format or data structure within volatile memory, such as RAM 310, or non-volatile memory, such as memory storage 320. The record may include the determined score at the end of the test or may also include details of the test (e.g., ongoing accumulated scores during the time, a time profile of such periodic accumulated scores, information relating to the tracing deviations and/or time lag information relating to how quickly the subject user was tracing the visual symbol, etc.).
At stages 625 and 630, method 600 receives progressive input from the subject user while progressively displaying a visual symbol from the selected visual cue(s). In some embodiments, the visual symbol may have an older portion that is progressively removed while a newer portion of the visual symbol is progressively displayed. At stage 635, a determination is made whether a change in stylus position is detected as updated progressive input from the subject user. If so, stage 635 proceeds to stage 640. If not, stage 635 proceeds back to stage 630 where the visual symbol continues to be progressively displayed.
At stage 640, the path of the tracing is updated. In some embodiments, the location information of the user's latest trace input may be recorded with reference to elapsed time. At stage 645, if the test has ended, method 600 proceeds to stage 650 for scoring. However, if the test has not yet ended, method 600 proceeds back to stage 630 where the visual symbol continues to be progressively displayed.
At stage 650, method 600 determines a new eye-hand coordination score based upon a characteristic of the progressive tracing. In one embodiment, the score may be based upon how quickly the progressive tracing follows the progressive display of the visual symbol over time. In another embodiment, the score may be based upon how far a path of the progressive tracing deviates from a path of the progressive display of the visual symbol over time. Such deviations may be computed as an average accumulated error distance of the tracing off the path of the progressively displayed visual symbol. In some embodiments, scoring may be determined on an ongoing, accumulated basis instead of just at the completion of the test and, as such, may be periodically shown to the user during the test and maintained for later storage in memory associated with the current test.
After the score is determined at stage 650, the determined new score is stored within a new record in memory at stage 655. As mentioned previously, such a new record may be of any predetermined format or data structure within volatile memory, such as RAM 310, or non-volatile memory, such as memory storage 320. The record may include the determined score at the end of the test or may also include details of the test (e.g., ongoing accumulated scores during the time, a time profile of such periodic accumulated scores, information relating to the tracing deviations and/or time lag information relating to how quickly the subject user was tracing the visual symbol, etc.).
At stage 660, method 600 also may determine and provide a ranking of the new score in comparison to one or more prior scores for the subject user. Alternatively, the ranking may be in comparison to other standards or statistics other than the subject user's actual prior scores, such as general population statistical information relating to eye-hand coordination. Such rankings may provide an indication of progress for prescribed therapy that is intended to address and improve the subject user's eye-hand coordination skills.
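Such a ranking against prior scores might be sketched as below. The assumption that lower scores rank better is illustrative only, since some of the exemplary scores described above (e.g., the inverted scores) run in the opposite direction:

```python
# Illustrative sketch only; assumes lower (error-style) scores are
# better, which is an assumption, not part of the disclosure.
def rank_new_score(new_score, prior_scores):
    """Return the 1-based rank of the new score among the subject
    user's prior scores, where rank 1 is the best score so far."""
    all_scores = sorted(prior_scores + [new_score])
    return all_scores.index(new_score) + 1
```

A rank of 1 would indicate the user's best performance to date; rankings against population statistics rather than prior personal scores would substitute a different comparison list.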
While the principles of the present invention as exemplified through the embodiments described above for measuring and quantifying eye-hand coordination rely upon two-dimensional (2D) input and output, those skilled in the art will appreciate that alternative embodiments of the present invention may be implemented with three-dimensional (3D) input and output. Generally, an exemplary embodiment of the present invention may progressively display and remove the path and trace of a visual symbol in three-dimensions (e.g., via output seen through a three-dimensional head set, goggles or other vision system) and detect three-dimensional input from the subject user (e.g., via input from a user-manipulated three-dimensional input device, such as sensors in a hand glove).
In one embodiment, the tracker system 740 is generally implemented as one or more communication ports (such as a universal serial bus (USB), serial, parallel, or other data communication interface) that links the computer and the sensor/display devices. The tracker system 740 provides access by computer 730 to positional signals generated by one or more input sensors 760 and the display device 750 while providing signals from the computer 730 to the display device 750 as a 3D user interface. In one embodiment, the tracker system 740 provides a wired interface between computer 730 and the sensors/display device. In other embodiments, the tracker system 740 may use a wireless transmitter and receiver in each of the respective devices to facilitate provision of signals from the computer 730 to each of the sensors 760 and display device 750 and reception by the computer 730 of signals generated from each of the sensors 760 and display device 750.
The 3D embodiment of the present invention illustrated in
In the context of such three-dimensional input and display output for the subject user, and consistent with principles of the present invention, the application software running on computer 730 may implement the exemplary methods described with respect to
Those skilled in the art will appreciate that the details of a computer or processor-based apparatus and computer-implemented method for receiving three-dimensional user input and displaying three-dimensional output are well known. Such well-known apparatus may provide the operating platform for embodiments of the present invention. For example, details of an implementation for such an apparatus consistent with the embodiment described in
Principles of the present invention, which include measuring and quantifying eye-hand coordination of a subject user, may be applied with embodiments used in a teaching environment. For example, embodiments of the present invention may be used to help teach a subject user how to draw, sketch or paint in two dimensions (e.g., via a tablet interface) or in three dimensions (e.g., via the 3D input and output devices referenced in
In more detail, an embodiment of the present invention may be used to teach a subject user how to sketch or draw with a guided measurement and quantification of eye-hand coordination as set forth above. Relevant embodiments used to measure and quantify eye-hand coordination and teach drawing may have a memory that maintains files associated with one or more composite images to be drawn. Each file for a composite image may include one or more visual items. The visual items collectively make up the composite image to be drawn by the subject user. For example, one file may include four distinct visual items that collectively make up a composite image of a person's face (e.g., two eyes, a nose and a mouth).
In operation of such an embodiment, the processor in the apparatus is configured, via programmatic instructions, to follow steps outlined generally and described with respect to
Another embodiment of the present invention may be used to teach a subject user how to paint as it measures and quantifies eye-hand coordination. In this embodiment, the memory maintains files associated with one or more composite images to be painted. Each file for a composite image may include one or more visual items and related color information assigned to the whole or parts of each visual item. The visual items, including their respective individually colored parts, collectively make up the composite image to be painted by the subject user.
In operation of such an embodiment, the user may be prompted to select a color from a predetermined palette to attempt to match the color associated with a particular visual symbol or part thereof. The visual symbol, or the part of the symbol, is then progressively traced by the user using the selected color. After the user completes painting of the visual symbol, including all of its individual parts, another of the visual items making up the composite image may be progressively displayed. In this manner, different visual symbols are progressively shown to the subject user, who manipulates an input device (e.g., a stylus and tablet, or a hand glove with positional sensors) that provides progressive input to represent painting of the composite image. The user's tracing, which includes outlining and filling of individual parts of the visual symbols, is then progressively displayed and scored. Scoring, as described above, may be determined based upon how quickly the subject user's progressive tracing follows the progressive display of the visual symbol over time and how close the tracing comes to the displayed visual symbol (generally, how far a path of the tracing deviates from a path of the displayed visual symbol). Additionally, scoring may include a matching determination of the user's selected color and the visual symbol's assigned color (or the color of its individual parts). As such, scoring in this application is associated with a level of painting skill, which may advantageously improve with time and practice using an embodiment of the present invention.
Similar to selection of color, other embodiments may also include selection of other reproduction characteristics (e.g., painting characteristics, drawing characteristics). For example, an embodiment may have the system or apparatus prompt selection of a pattern (e.g., dots, streaks, etc.), a representative brush type and shape (e.g., round, flat, fan, angle, filbert, etc.), and a texture to be applied (oil-like thick appearance, watercolor-like smooth appearance, charcoal-like rough appearance, etc.). In such embodiments, the file for the composite image would maintain predetermined stored characteristics for each visual item's assigned painting characteristics (e.g., pattern, correct brush type to be used when painting, and texture to be applied). As the visual symbol is progressively displayed, the progressive input from the user is shown on the relevant display as corresponding to painting with the selected painting characteristics.
Although aspects of the exemplary embodiments disclosed herein are explained in relation to a specific computer or microprocessor based system with programmatic instructions, it should be understood that the exemplary embodiments described herein could be used in systems with processors that are hard wired with programmatic instructions for providing the novel functionality and operations.
Thus, at least some portions of exemplary embodiments of the systems outlined above may be used in association with portions of other exemplary embodiments. Moreover, at least some of the exemplary embodiments disclosed herein may be used independently from one another and/or in combination with one another and may have applications to devices and methods not disclosed herein.
It will be apparent to those skilled in the art that various modifications and variations can be made to the structures and methodologies described herein. Thus, it should be understood that the invention is not limited to the subject matter discussed in the description. Rather, the present invention is intended to cover modifications and variations.