The present invention relates to methods and apparatus for teaching computer keyboarding, also known as touch typing.
It has long been recognized that the ability to detect incorrect finger use in learning keyboarding would be a substantial advantage. Students learning correct touch typing technique will, even months into such training, still exhibit incorrect finger use to a greater or lesser extent. This is because, although existing systems may indicate which finger to use, they are unable to detect whether that finger was actually used.
If, however, a system actively encouraged correct finger use, users would learn the proper technique significantly faster. They would be aware from the very beginning of the learning process when they were using the incorrect finger. After an initial acclimatization phase, the system might actually require correct finger use. In that case no incorrect-finger muscle memory would be reinforced by allowing the lesson to move forward and the student to continue practicing; only correct motor movements would allow the student to progress in the lesson.
The advantage would come from the time saved learning the proper technique. Students' progress through a lesson would be slower after this acclimatization phase, since a correct keypress made with the wrong finger would no longer advance the lesson. But the overall time required to gain proficiency in this motor skill would be sizably reduced.
In accordance with an exemplary and non-limiting embodiment of the invention, the video input is configured to provide a real-time feed of the user's keyboard.
The user wears markings on his or her fingers, which the system can analyze using, for example, computer vision. In one embodiment, the user wears full-fingered gloves with individually colored fingers, with which the lessons are performed.
The app has an internal map of the correct finger for each key. For example, the left ring finger is the correct finger for the W, S, and X keys on a US English keyboard. In one embodiment, the app also has an internal map of the expected color for each finger. In the case of the left ring finger, the expected color is orange. These colors correspond to the colors applied to the individual fingers of the gloves in this embodiment.
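By way of non-limiting illustration, such maps might be represented as simple lookup tables. In the following Python sketch, the finger names and all colors other than the orange of the left ring finger are hypothetical, not part of the disclosed embodiment:

```python
# Hypothetical key-to-finger map for a US English (QWERTY) layout,
# following standard touch-typing assignments.
KEY_TO_FINGER = {
    "Q": "left_pinky",  "A": "left_pinky",  "Z": "left_pinky",
    "W": "left_ring",   "S": "left_ring",   "X": "left_ring",
    "E": "left_middle", "D": "left_middle", "C": "left_middle",
    # ... remaining keys follow the same standard assignments
}

# Hypothetical finger-to-color map; only "orange" for the left ring
# finger is taken from the example above.
FINGER_TO_COLOR = {
    "left_pinky":  "green",    # assumed glove color
    "left_ring":   "orange",   # per the example in the text
    "left_middle": "purple",   # assumed glove color
    # ...
}
```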
The app performs real-time analysis on the video feed, running finger recognition on the region of pixels showing the expected key, as specified by the current lesson. When the correct key has been pressed (no processing is required when the wrong key has been pressed), finger recognition determines whether the finger above the key at the time of the keypress was the correct finger for that key. In one embodiment this recognition is performed via computer vision color detection: if the color of the gloved finger seen in this pixel region of the video feed matches the color expected according to the fingers/keys map, the lesson advances.
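By way of non-limiting illustration, such a color check might be sketched as follows using the OpenCV library. The HSV ranges, the region coordinates, and the maps KEY_TO_FINGER and FINGER_TO_COLOR (from the sketch above) are assumptions:

```python
import cv2
import numpy as np

# Assumed HSV ranges per glove color; real ranges would be calibrated
# to the actual gloves and lighting conditions.
HSV_RANGES = {
    "orange": ((5, 100, 100), (20, 255, 255)),
    "green":  ((40, 80, 80),  (80, 255, 255)),
    # ... one entry per glove color
}

def detect_finger_color(frame_bgr, key_region):
    """Return the glove color covering the most pixels in key_region.

    key_region is (x, y, w, h) in frame coordinates, assumed known from
    a one-time calibration of the keyboard's position in the video feed.
    """
    x, y, w, h = key_region
    hsv = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    best_color, best_count = None, 0
    for color, (lo, hi) in HSV_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        count = cv2.countNonZero(mask)
        if count > best_count:
            best_color, best_count = color, count
    return best_color

def correct_finger_used(frame_bgr, pressed_key, key_region):
    # KEY_TO_FINGER and FINGER_TO_COLOR are the maps sketched earlier.
    expected = FINGER_TO_COLOR[KEY_TO_FINGER[pressed_key]]
    return detect_finger_color(frame_bgr, key_region) == expected
```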
If the analysis determines that an incorrect finger was used, the user is alerted to having used the wrong finger. This alert can be merely visual or auditory (as in the aforementioned acclimatization phase), or it can halt the advance of the lesson.
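A minimal sketch of this alerting policy, assuming a two-phase lesson ("acclimatization" versus strict) and building on the functions above; the phase name and the alert helper are hypothetical:

```python
def show_wrong_finger_warning():
    # Hypothetical alert; a real app might flash the on-screen keyboard
    # or play a sound rather than printing.
    print("Wrong finger! Use the highlighted finger for this key.")

def handle_keypress(frame_bgr, pressed_key, key_region, phase):
    """Return True if the lesson should advance after this keypress."""
    if correct_finger_used(frame_bgr, pressed_key, key_region):
        return True                      # correct finger: always advance
    show_wrong_finger_warning()          # visual and/or auditory alert
    return phase == "acclimatization"    # strict phase halts the lesson
```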
In accordance with another exemplary and non-limiting embodiment of the invention, finger detection is performed via so-called hand pose estimation. In this embodiment, no special markings need be worn on the fingers.
In accordance with exemplary and non-limiting embodiments, the system disclosed herein makes it possible to alert the user when an incorrect finger has been used to press a particular key.
With reference to FIG. 1, the video feed 140 routes a live image of the keyboard and the user's fingers, when present, to the app for real-time analysis 103. In one embodiment, the video comes from a computing device's built-in camera, whose feed is directed toward the keyboard area by means of a mirror attached to the camera lens, redirecting the view from the user's face down to the keyboard. In another embodiment, for devices without a built-in camera, a webcam or similar device can be attached to, for example, the screen of a laptop to provide the needed video feed.
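By way of non-limiting illustration, such a feed might be acquired with OpenCV; the device index and single-camera assumption are hypothetical:

```python
import cv2

# Device index 0 (the default camera) is an assumption; with the mirror
# arrangement described above, this would be the built-in camera.
cap = cv2.VideoCapture(0)
if not cap.isOpened():
    raise RuntimeError("no camera available for the keyboard feed")
ok, frame = cap.read()   # one BGR frame covering the keyboard area
if ok:
    pass                 # pass `frame` to the analysis sketched above
cap.release()
```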
With reference to FIGS. 2 and 3, in one embodiment the analysis 204 is performed via color analysis of the pixel region 301. In this embodiment, gloves with colored fingers serve as the means of marking individual fingers, and the analysis uses computer vision to determine the color of the finger used. In another embodiment, gloves with a different geometric pattern on each finger are used, the computer vision function then analyzing the patterns to determine which finger was used. In a further embodiment, rings, for example, can be placed on the fingers to provide colors or patterns for the detection system to analyze.
In a further embodiment, nothing is worn on the fingers for the analysis to be performed. Instead, the system determines which finger was used by means of hand pose estimation.
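By way of non-limiting illustration, such markerless detection might use the MediaPipe Hands library, which estimates 21 hand landmarks per hand. The distance-based matching below is a simplification, and the key-center coordinate is assumed to come from keyboard calibration:

```python
import cv2
import mediapipe as mp

# Fingertip landmark indices as defined by MediaPipe Hands.
FINGERTIPS = {4: "thumb", 8: "index", 12: "middle", 16: "ring", 20: "pinky"}

def finger_over_key(frame_bgr, key_center_px):
    """Return the name of the fingertip nearest the key's center.

    key_center_px is an assumed (x, y) pixel coordinate from keyboard
    calibration. Distinguishing the left from the right hand (via the
    library's multi_handedness output) is omitted for brevity.
    """
    h, w = frame_bgr.shape[:2]
    with mp.solutions.hands.Hands(static_image_mode=True,
                                  max_num_hands=2) as hands:
        result = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        return None
    kx, ky = key_center_px
    best_name, best_dist = None, float("inf")
    for hand in result.multi_hand_landmarks:
        for idx, name in FINGERTIPS.items():
            tip = hand.landmark[idx]   # coordinates normalized to [0, 1]
            dist = ((tip.x * w - kx) ** 2 + (tip.y * h - ky) ** 2) ** 0.5
            if dist < best_dist:
                best_name, best_dist = name, dist
    return best_name
```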
This application claims the benefit of priority to U.S. Provisional Patent Application No. 62/365,959 filed Jul. 22, 2016 titled “IDENTIFYING FINGERS IN LEARNING COMPUTER KEYBOARDING”.
Number | Date | Country
---|---|---
62/365,959 | Jul. 22, 2016 | US