The present invention relates to keyboard input devices in general, and to a more convenient device and method for providing input to computerized devices in particular.
Interactions with computerized devices are generally achieved through the use of input devices. Input devices associated with computerized devices commonly include keyboards, which provide the computer with signals that are interpreted as characters. Most users of such regular keyboards must repeatedly lift their heads, re-focus their eyes on the computer screen and search for the current cursor position in order to see the text that has just been typed. Re-focusing of the user's field of view (FoV) occurs very often during typing, sometimes as often as every few seconds. For most users, the speed and accuracy of typing are reduced considerably because they have to refocus their FoV from the screen to the keyboard and vice versa.
Even with the rise in popularity of computer use, and though most people spend a large proportion of their time, at home or at work, using keyboards, very few people are full "touch typists"; that is, most are incapable of keeping their FoV focused on the screen while continuously using a keyboard for character input. Users of keyboards and like input devices can type for a period of time without looking at the keyboard, but must stop once in a while to re-orient their hands over the keyboard or to look for a specific key, shifting their eye focus as they do so. This shift in eye focus usually occurs once every few seconds in most users.
There is therefore a need for a device and method that allows users of computerized devices with input devices such as keyboards to do away with some or most of these focus re-orientation pauses.
Accordingly, it is a principal object of the present invention to increase typing speed, improve accuracy, and prevent deterioration of eyesight while performing input to computers and other electronic keyboard-enabled devices.
It is another principal object of the present invention to provide computer systems that overcome the drawbacks of separating the input device, e.g. a keyboard, from the display of the data entered.
It is one other principal object of the present invention to provide an input device and method for use with computerized devices that reduce the need to shift FoV from an output device to an input device.
It is a further principal object of the present invention to increase the speed and accuracy of using an input device, such as a keyboard in connection with a computerized device having an output device, such as a screen display, showing the input made.
It is one further principal object of the present invention to provide a user of the computerized device with an indication, preferably on an output device such as a screen display, as to the location of the user's fingers.
It is yet another principal object of the present invention to determine the location of the user's fingers and/or to determine which keys the user is likely to use next, based on various indications received from the input device, and to display on an output device, such as a screen display, the location of the user's fingers as well as a depiction of the input device under said fingers.
It is still another principal object of the present invention to determine the characters that will be generated by typing on the keyboard keys in the vicinity of the user's fingers, and to display on the output device, such as the screen display, a depiction, such as of keyboard keys, showing the characters that will be entered should any of those keys be engaged.
It is yet still another principal object of the present invention to reduce the need for the user of a computerized device to shift a field of view or refocus his eyesight between an input device such as a keyboard and an output device such as a screen.
It is one more principal object of the present invention to achieve a new type of keyboard with an enhanced level of typing efficiency and user friendliness.
A method is disclosed for a user operating a computer device that includes a computer program, a keyboard, a screen display and driver software. The method includes the driver software receiving information from the keyboard indicating where the user's fingers are located, associating the information with keyboard keys upon receiving it, and generating an image depicting an on-screen keyboard key display. The method further includes determining where to display the generated image on the screen, transmitting the generated image for display on the screen, and indicating to the user, by means of the generated image, the location of his fingers, which key he is likely to hit next, and how his hands are oriented with respect to the keyboard, such that the device assists in creating a closed eye-brain feedback loop, allowing the user to type without pause.
Aspects of the present invention relate to various embodiments of keyboards and displays, including inter alia (i) computer systems wherein a representation of the keyboard and the location of the user's hands or pointing devices on the keyboard are displayed, (ii) keyboards having touch sensors for determining multiple points of contact thereon by a user, (iii) computer systems wherein variation in the physical keystroke on a given key activates correspondingly different functions and wherein a menu of these different functions is presented on the display screen when a user approaches or touches the given key, and (iv) computer systems for handicapped individuals that enable user input without requiring the user to look at the input device.
The present invention will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which corresponding or like numerals or characters indicate corresponding or like components. Unless indicated otherwise, the drawings provide exemplary embodiments or aspects of the disclosure and do not limit the scope of the disclosure.
In the drawings:
The principles and operation of a method and an apparatus according to the present invention may be better understood with reference to the drawings and the accompanying description, it being understood that these drawings are given for illustrative purposes only and are not meant to be limiting.
These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer or other programmable data processing apparatus incorporated into a system, such that the executed instructions create means or devices for implementing the functions/acts specified in the drawings and/or their descriptions.
The computer program instructions may also be stored in a computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process, such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
In accordance with some embodiments of the subject matter, when a user places his hands on keyboard 304, an image 308 is displayed on screen display 306 in the vicinity of the character entry point. Such location can, in some embodiments of the present invention, be in the vicinity of an indicating element such as a cursor, mouse cursor, text entry cursor, highlighted location, field entry and the like. Image 308 can be a graphical or animated overlay or any graphical or other depiction of the location of the user's fingers with respect to the input device. In some embodiments image 308 can be replaced by an auditory indication, such as, for example, for visually impaired users.
In other embodiments of the subject matter, image 308 can be replaced by a vibratory indication. Image 308 indicates to the user the location of his hands or fingers on the keyboard, and may also indicate which keys the user is touching or about to touch. In some other embodiments, image 308 can provide a further indication as to which keys the user is likely to touch, based on various indicators collected by computer device 302.
In view of image 308, the user of computer 302 can then type freely without having to pause and look at the keyboard in order to re-orient his hands or in order to find a certain key to press next. The keys the user's hands are touching, and in some cases adjacent keys, are made visible to him on the screen through image 308 as he is typing.
In accordance with a preferred embodiment of the invention, computer 302 comprises a computer program, such as driver software (not shown), that receives information from keyboard 304 indicating where the user's fingers are located, or the vicinity thereof.
Upon receiving such information, the driver software associates the information with one or more keyboard keys and generates image 308. Next, the driver determines where to display image 308 on screen display 306. Next, the driver software transmits the generated image 308 to be displayed on screen display 306. Image 308, which is a depiction of an on-screen keyboard key display, indicates to the user the location of his fingers and/or hand, which key he is likely to hit next, and how his hands are oriented with respect to the keyboard. The driver software may also determine which character will be displayed on the screen upon depression of any key, and generate and display an image 308 wherein the characters on the image correspond to those which will be generated by computer 302 when the user activates the keys.
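The driver-software steps above can be sketched in software. The following is a minimal illustrative sketch only; the function names, the flat (x, y) key map, and the overlay-placement heuristic are assumptions introduced for illustration and are not part of any actual keyboard driver interface.

```python
# Illustrative sketch of the driver pipeline: map reported finger
# positions to keyboard keys, then place the on-screen image near
# the character entry point. All names and values are hypothetical.

def associate_with_keys(finger_positions, key_map):
    """Map raw (x, y) finger positions reported by the keyboard
    to the label of the nearest key in key_map."""
    touched = []
    for fx, fy in finger_positions:
        nearest = min(
            key_map,
            key=lambda k: (key_map[k][0] - fx) ** 2 + (key_map[k][1] - fy) ** 2,
        )
        touched.append(nearest)
    return touched

def overlay_position(cursor_x, cursor_y, screen_h, overlay_h=80):
    """Place the on-screen keyboard image just below the text-entry
    cursor, flipping it above the cursor when there is no room below."""
    y = cursor_y + 20
    if y + overlay_h > screen_h:
        y = cursor_y - overlay_h - 20
    return cursor_x, y

# Example: a three-key map and two reported finger positions.
key_map = {"f": (40, 20), "j": (70, 20), "space": (55, 40)}
print(associate_with_keys([(42, 19), (56, 41)], key_map))  # ['f', 'space']
print(overlay_position(100, 580, screen_h=600))            # (100, 480)
```

The nearest-key lookup stands in for whatever association logic a real driver would use; the point is only that reported positions become key identities before the image is generated.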
In accordance with some embodiments of the present invention, the device assists in creating a closed eye-brain feedback loop, allowing a user to type without pause to look at the keyboard.
In some exemplary embodiments, an indication of the user's fingers touching the touched keys may be shown. Indication 420 illustrates that the user is touching keys 406 and 408 with one finger. Indication 422 indicates that the user is touching keys 410 and 412 with another finger. In some exemplary embodiments, a key may have several meanings 430, as is exemplified in
In other embodiments of the present invention, ultrasound sensors and/or light-sensitive sensors and/or infrared sensors positioned in various locations on the keyboard, and/or a camera located on or in the vicinity of the keyboard, can be used to identify the proximity of the user's fingers to the keys, or whether said fingers are in contact with the keyboard.
In another embodiment of the present invention touch sensitive sensors are placed over the keys (or the keyboard). In accordance with such an embodiment, the processing unit determines the position of the user's hand when the user is in contact with any one or more of the keyboard keys. The driver software will generate an image that will show the current location of the fingers as long as they are placed over keyboard keys.
In yet another embodiment of the present invention the driver software generates and displays an image of a finger placed over more than one key, such as when the finger is located between two keys. By employing other types of sensors, such as a camera sensor or ultrasound sensors (or other types), the continuous position of the user's fingers, and optionally their distance from the keyboard keys, can be determined.
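The between-two-keys case can be sketched as a simple radius test: rather than snapping a finger to a single nearest key, every key centre within a given distance of the finger is reported, so a finger resting between two keys highlights both. This is an illustrative sketch; the key-map layout and the radius value are assumptions, not values from the disclosure.

```python
# Hypothetical detection of a finger resting over more than one key:
# report every key whose centre lies within `radius` of the finger.

def keys_under_finger(finger_xy, key_map, radius=12.0):
    """Return the labels of all keys within `radius` of the finger
    position, sorted for a stable display order."""
    fx, fy = finger_xy
    hits = [
        k for k, (kx, ky) in key_map.items()
        if (kx - fx) ** 2 + (ky - fy) ** 2 <= radius ** 2
    ]
    return sorted(hits)

# Example layout: 'g' and 'h' side by side, 'b' on the row below.
key_map = {"g": (0, 0), "h": (19, 0), "b": (5, 19)}
print(keys_under_finger((9, 2), key_map))  # finger between 'g' and 'h': ['g', 'h']
print(keys_under_finger((0, 0), key_map))  # finger squarely on 'g': ['g']
```

A driver using this test could render the overlay finger image spanning both returned keys, matching the behaviour described above.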
In accordance with the principles of the present invention, the user is provided with a real-time or near real-time indication of the current position of his hands and/or fingers throughout the typing process. The current position of the user's hands/fingers may include a three-dimensional representation of the keyboard and hands/fingers, to allow better accuracy, improve the typing process, and/or correct any pre-existing hand positions the user may be using which may lead to inefficiency, errors in typing, pain in the hand muscles and the like.
Instead of, or in addition to, the on-screen display, an audio indication of the position of the user's hands can be provided in some embodiments of the present invention. Different tones, rhythms, levels of volume or other sound variations can be used to indicate to the user the position of his hands over the keyboard input device. In some other embodiments of the present invention, gesture motions and/or hand movements of various types can be employed in association with touch sensitive sensors, camera sensors or other sensor types. The user will be able to perform gesture motions on or above the keyboard in order to achieve a specified function, such as opening an application, inputting a preset text string or any other preset function. In yet other embodiments of the present invention, the user may be shown a pull-down menu and may make a selection directly from a pre-assigned key in the vicinity of the location of his fingers, or use another input device such as a pointing device.
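One way the tone-and-volume variation could encode hand position is to map a key's column to pitch and its row to loudness. The following is a minimal sketch under assumed values: the grid dimensions, frequency range and volume range are illustrative choices, not parameters from the disclosure.

```python
# Hypothetical audio encoding of hand position: column -> pitch,
# row -> volume, so a listener can track position by ear alone.

def position_tone(col, row, n_cols=10, n_rows=4,
                  base_hz=220.0, span_hz=440.0):
    """Return (frequency_hz, volume) for a key at grid position
    (col, row). Leftmost column is lowest pitch; top row is quietest."""
    freq = base_hz + span_hz * (col / max(n_cols - 1, 1))
    volume = 0.3 + 0.7 * (row / max(n_rows - 1, 1))
    return round(freq, 1), round(volume, 2)

print(position_tone(0, 0))  # leftmost key, top row: (220.0, 0.3)
print(position_tone(9, 3))  # rightmost key, bottom row: (660.0, 1.0)
```

An actual implementation would feed these values to a tone generator; the mapping itself is the point here, since it is what lets the user localize his hands without looking down.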
In other embodiments of the present invention, in an input device that utilizes touch sensitive sensors, the different levels of pressure applied by the user during typing can be measured, and different functions can accordingly be activated depending on the amount of pressure applied. One exemplary embodiment would include an analysis of the touch sensitive sensors: if the duration or amount of pressure applied to the touch sensitive sensors is greater than a predetermined threshold, an appropriate indication is provided; in the present example, a capital letter or an alternate character is provided as an output. Other examples include outputting a lower-case character when weaker pressure is applied, or assigning a secondary function to any key upon a predetermined pressure being applied, such as a pull-down menu if the key is pressed for longer than a predetermined number of milliseconds.
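The pressure-and-duration logic just described can be summarized in a short decision function. This is an illustrative sketch only; the threshold values and the returned tuples are hypothetical stand-ins for whatever a real driver would emit.

```python
# Hypothetical keystroke resolution from measured pressure and hold
# duration: light press -> lower case, firm press -> capital letter,
# long hold -> secondary function (e.g. a pull-down menu).

def resolve_keystroke(char, pressure, duration_ms,
                      pressure_threshold=0.6, hold_threshold_ms=500):
    """Map a key's measured pressure (0.0-1.0) and hold duration to
    an output action, per the embodiment described above."""
    if duration_ms >= hold_threshold_ms:
        return ("menu", char)          # long hold opens the key's menu
    if pressure >= pressure_threshold:
        return ("char", char.upper())  # firm press yields a capital
    return ("char", char.lower())      # light press yields lower case

print(resolve_keystroke("a", 0.3, 80))   # ('char', 'a')
print(resolve_keystroke("a", 0.8, 80))   # ('char', 'A')
print(resolve_keystroke("a", 0.8, 900))  # ('menu', 'a')
```

Checking duration before pressure reflects the text's ordering, in which the long-hold secondary function takes precedence over the capital/lower-case distinction; a real device might resolve the ambiguity differently.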
Persons skilled in the art will appreciate that the present invention can be implemented in various devices, including Personal Computers, Laptop computers, Television sets with keyboard input devices, mobile telephones, mobile data devices and the like. The definition of input device and/or “keyboard” is not limited to any specific input device, computer or other keyboard and particular layout or any number of keys, or keys functions. The present invention can be applied to various text or character input devices in various layouts and configurations. The on-screen display can be shown on different devices such as screens, television sets, mobile device screens, projectors, display glasses, head mounted displays, on one or more displays, in various shapes sizes and configurations.
The present invention can also be adapted for typists who are blind but not hard of hearing. It can also work for people who cannot use their hands and need to type using their feet or artificial pointing devices.
Having described the present invention with regard to certain specific embodiments thereof, it is to be understood that the description is not meant as a limitation, since further modifications will now suggest themselves to those skilled in the art, and it is intended to cover such modifications as fall within the scope of the appended claims.