Early systems for entering commands into programs used keyboards to enter text strings that included the names of the commands, any input parameters, and any switches to modify operation of the commands. Over the last couple of decades, these systems have been largely replaced by graphical input systems that use a pointing device to move an icon, such as a graphical representation of an arrow, to point at objects displayed on the screen and then select them for further operations. The selection may be performed, for example, by setting the icon over the object and clicking a button on the pointing device. In recent years, systems for entering commands have been developed that more strongly emulate physical reality, for example, allowing physical selection of items on a touch-sensitive screen.
Certain exemplary embodiments are described in the following detailed description and in reference to the drawings, in which:
Embodiments described herein provide an optical command entry system that can use an optical sensor system to enter commands selected from a template. The optical sensor system may be configured to monitor a three-dimensional space in front of a display to determine locations of objects with respect to the display. A pattern recognition module can monitor an image of the area in front of the display as collected by the optical sensor system. If a template having printed patterns is placed in view of the sensor system, the pattern recognition module may identify the patterns, map their locations, and associate them with particular commands, such as commands for an application. A command module may determine a location of an object, such as a finger, hand, or other object, in front of the display and, if the location of the object intersects one of the patterns, the command associated with that pattern can be passed to an application. In some embodiments, if one of the patterns is associated with a particular application, placing the template in front of the display may cause the pattern recognition module to start the associated application.
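As a purely illustrative sketch of this division of labor, the following Python models a pattern recognition module that maps recognized patterns to commands and a command module that fires a command when an object's location intersects a pattern. The class names, the Pattern record, and the callback-based dispatch are assumptions made for the example rather than features of any particular embodiment.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

# A recognized printed pattern: a surface region and an associated command.
@dataclass
class Pattern:
    name: str                                   # e.g. "save", "undo", "fill"
    region: Tuple[float, float, float, float]   # (x_min, y_min, x_max, y_max) in sensor coordinates
    command: Callable[[], None]                 # operational code sequence to run when selected


class PatternRecognitionModule:
    """Maps patterns found on a template to commands (illustrative only)."""

    def __init__(self, command_library: Dict[str, Callable[[], None]]):
        self.command_library = command_library

    def map_template(
        self, detected: List[Tuple[str, Tuple[float, float, float, float]]]
    ) -> List[Pattern]:
        # 'detected' is a list of (pattern_name, region) pairs produced by image analysis.
        return [Pattern(name, region, self.command_library[name])
                for name, region in detected if name in self.command_library]


class CommandModule:
    """Fires a command when an object's location intersects a mapped pattern."""

    def dispatch(self, patterns: List[Pattern], object_xy: Tuple[float, float]) -> None:
        x, y = object_xy
        for p in patterns:
            x0, y0, x1, y1 = p.region
            if x0 <= x <= x1 and y0 <= y <= y1:
                p.command()          # pass the command to the application
                return


if __name__ == "__main__":
    library = {"save": lambda: print("save"), "fill": lambda: print("fill")}
    recognizer = PatternRecognitionModule(library)
    mapped = recognizer.map_template([("save", (0, 0, 10, 10)), ("fill", (20, 0, 30, 10))])
    CommandModule().dispatch(mapped, (25.0, 5.0))   # prints "fill"
```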
The sensors 102 may include motion sensors, infrared sensors, cameras, infrared cameras, or any other device capable of capturing an image. In an embodiment, the sensors 102 may include an infrared array or camera that senses the locations of targets using a time-of-flight calculation for each pixel in the infrared array. In this embodiment, an infrared emitter can emit pulses of infrared light, which are reflected from a target and returned to the infrared array. A computational system associated with the infrared array uses the time it takes for the infrared light to reach a target and be reflected back to the infrared sensor array to generate a distance map, indicating the distance from the sensor to the target for each pixel in the infrared sensor array. The infrared array can also generate a raw infrared image, in which the brightness of each pixel represents the infrared reflectivity of the target image at that pixel. However, embodiments are not limited to an infrared sensor array, as any number of other sensors that generate an image may be used in some embodiments.
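As a rough illustration of the per-pixel time-of-flight calculation, the sketch below converts an array of round-trip times into a distance map using d = c·t/2. The NumPy representation and the example array values are assumptions made for the illustration.

```python
import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_map(round_trip_times_s: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip times (seconds) into per-pixel distances (meters).

    The emitted infrared pulse travels to the target and back, so the one-way
    distance is half of the round-trip path: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_times_s / 2.0

# Example: a 2x2 "array" of round-trip times on the order of nanoseconds.
times = np.array([[4.0e-9, 5.0e-9],
                  [6.0e-9, 6.7e-9]])
print(distance_map(times))   # distances of roughly 0.6 to 1.0 meters
```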
The volume 108 imaged by the sensors 102 can extend beyond the display 106, for example, to a surface 110 which may be supporting the system 100, a keyboard 112, or a mouse 114. A template 116 may be placed on the surface 110 in front of the system 100 in view of the sensors 102. The system 100 may be configured to note the presence of the template 116, for example, by recognizing patterns 118 on the template. For example, the system may recognize an identifying pattern 120 associated with a particular program, such as a drawing application or a computer-aided drafting program, among others, or may recognize patterns associated with individual commands. The pattern recognition may be performed by any number of techniques known in the art, for example, by generating a hash code from the pattern and comparing the hash code to a library of codes. Any number of other techniques may also be used.
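One simple way to realize the hash-and-compare approach mentioned above is sketched below. The difference-hash function, the Hamming-distance comparison, and the one-entry library of codes are assumptions made for the example, not the hashing scheme of any particular embodiment.

```python
import numpy as np

def difference_hash(gray: np.ndarray, size: int = 8) -> int:
    """Compute a simple difference hash of a grayscale pattern image.

    The image is reduced to a (size x size+1) grid by sampling, and each bit
    records whether a pixel is brighter than its right-hand neighbor.
    """
    h, w = gray.shape
    rows = np.linspace(0, h - 1, size).astype(int)
    cols = np.linspace(0, w - 1, size + 1).astype(int)
    small = gray[np.ix_(rows, cols)]
    bits = (small[:, 1:] > small[:, :-1]).flatten()
    return int("".join("1" if b else "0" for b in bits), 2)

def match_pattern(code: int, library: dict, max_hamming: int = 5):
    """Return the library entry whose stored code is closest to 'code', if any."""
    best_name, best_dist = None, max_hamming + 1
    for name, stored in library.items():
        dist = bin(code ^ stored).count("1")
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name   # None if nothing in the library is close enough

# Example: a synthetic 32x32 pattern compared against a one-entry library.
pattern = np.tile(np.arange(32), (32, 1)).astype(float)
code = difference_hash(pattern)
print(match_pattern(code, {"fill": code}))   # fill
```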
The system 100 may respond in a number of ways to recognizing a pattern, for example, the identifying pattern 120 on the template 116. In one embodiment, the system 100 may start a program associated with the identifying pattern 120. The system 100 may analyze the template 116 for other patterns, which can be associated with specific functions, such as save 122, undo 124, redo 126, or fill 128, among many others.
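As a sketch of how recognizing the identifying pattern 120 might start an associated program, the mapping and the launch call below are illustrative assumptions; the pattern names and executables are stand-ins only.

```python
import subprocess
from typing import Optional

# Hypothetical mapping from identifying-pattern names to launch commands.
PROGRAM_FOR_PATTERN = {
    "drawing_logo": ["mydraw"],   # stand-in for a drawing application
    "cad_logo": ["mycad"],        # stand-in for a computer-aided drafting program
}

def launch_for_pattern(pattern_name: str) -> Optional[subprocess.Popen]:
    """Start the program associated with an identifying pattern, if one is known."""
    command = PROGRAM_FOR_PATTERN.get(pattern_name)
    if command is None:
        return None
    return subprocess.Popen(command)  # hand the launch off to the operating system
```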
The system 100 can allow gestures to be used for interfacing with programs. For example, an item 130 in a program, shown on the display 106, may be selected by a gesture, such as by using a finger 132 to touch the location of the item 130 on the display 106. Further, a function identified on the template 116 may be selected, for example, by using the finger 132 to touch the relevant pattern 128. Touching the pattern 128 may trigger an operational code sequence associated with the pattern 128, for example, filling a previously selected item 130 with a color. Any number of functions and/or shapes may be used in association with a selected item, or with open documents, the operating system itself, and the like, such as printing, saving, deleting, or closing programs, among others. Removing the template 116, or other patterns, from the view of the sensors 102 may trigger actions, such as querying the user about closing the program, saving the document, and the like.
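A minimal sketch of the touch test implied here, assuming the sensors report a fingertip position and pattern rectangles in a common surface coordinate frame; the tolerance value and pattern names are invented for the example.

```python
from typing import Dict, Optional, Tuple

# Hypothetical touch test: a pattern counts as "touched" when the tracked
# fingertip lies inside the pattern's rectangle on the surface and within a
# few millimeters of the surface plane (z close to zero).
TOUCH_TOLERANCE_MM = 8.0

def touched_pattern(finger_xyz: Tuple[float, float, float],
                    patterns: Dict[str, Tuple[float, float, float, float]]) -> Optional[str]:
    x, y, z = finger_xyz
    if abs(z) > TOUCH_TOLERANCE_MM:
        return None                      # fingertip hovering above the template, not touching
    for name, (x0, y0, x1, y1) in patterns.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name                  # this pattern's operational code sequence would fire
    return None

patterns_mm = {"fill": (120.0, 40.0, 160.0, 70.0), "save": (20.0, 40.0, 60.0, 70.0)}
print(touched_pattern((130.0, 55.0, 3.0), patterns_mm))   # fill
print(touched_pattern((130.0, 55.0, 40.0), patterns_mm))  # None
```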
In the all-in-one computer system 202, a bus 204 can provide communications between a processor 206 and a sensor system 208, such as the sensors 102 described with respect to
Other units are generally included in the all-in-one computer system 202 to provide functionality. For example, a human-machine interface may be included to connect a keyboard or a pointing device. In some embodiments, one or both of the pointing device and keyboard may be omitted in favor of the functionality provided by the sensor system, for example, using an on-screen keyboard or a keyboard provided, or projected, as a template. A display 220 will generally be built into the all-in-one computer system 202. As shown herein, the display 220 includes driver electronics, coupled to the bus 204, as well as the screen itself. Other units that may be present include a network interface card (NIC) for coupling the all-in-one computer system 202 to a network 226. The NIC can include an Ethernet card, a wireless network card, a mobile broadband card, or any combinations thereof.
Command patterns 404 on the template 400 may be recognized and associated with commands for the associated program. For example, the command patterns 404 may include commands such as save 406, open 408, line draw 410, and the like. Selecting a command, such as by touching a command pattern 404 on the template, can be used to activate the associated command, for example, generally following the method shown in
At block 504, the patterns on the template may be recognized, for example, by comparing a hash code generated from the pattern to a library of codes stored for various patterns. Once a pattern is identified, at block 506, it may be associated with an operational code sequence, such as a command for a program. The program may be manually selected by the user or may be automatically selected by a pattern on the template. Further, equivalent patterns may be associated with different commands depending on the program selected. For example, the play 302 and rewind 306 patterns discussed with respect to
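To make the program-dependent association concrete, the sketch below keys a single set of pattern names by the active program, so that the same pattern maps to different commands in different programs. The program names, pattern names, and command strings are illustrative assumptions.

```python
# One printed pattern can be associated with different commands depending on
# which program is active; the names and commands here are hypothetical.
COMMAND_MAPS = {
    "media_player": {"triangle": "play", "double_triangle_left": "rewind"},
    "drawing_app":  {"triangle": "select_tool", "double_triangle_left": "undo"},
}

def command_for(pattern_name: str, active_program: str) -> str:
    return COMMAND_MAPS.get(active_program, {}).get(pattern_name, "ignore")

print(command_for("triangle", "media_player"))  # play
print(command_for("triangle", "drawing_app"))   # select_tool
```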
After patterns are associated with commands for a loaded program, at block 610, the computer system may identify an input corresponding to a user action. The input may include the user touching a pattern on a template with a finger or other object. For example, a detection system within the computer system may locate an object in the three-dimensional space in front of the screen. When the object and a command location, such as a pattern on the template, intersect, the detection system may send a command to the program through the operating system. In some embodiments, the object may include three-dimensional shapes that activate specific commands, or code modules, relevant to the shape and the location selected.
An example of such a shape could be a pyramidal object that represents a printer. If the printer shape is touched to a pattern on the template, the associated command may be executed with a parameter controlled by the shape. Such shapes may also represent a program parameter, such as an operational selection. For example, touching a first shape to a pattern on a template may initiate a code module that prints the object, while touching a second shape to a pattern on a template may initiate a code module that saves the current file. Other shapes may activate code modules that modify the object or transmit the data representing the object to another system or location.
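A minimal sketch of such shape-dependent dispatch, assuming the detection system can already label both the recognized shape and the touched pattern; the shape labels, pattern name, and code-module functions below are invented for the example.

```python
from typing import Callable, Dict, Tuple

def print_module(item: str) -> None:
    print(f"printing {item}")

def save_module(item: str) -> None:
    print(f"saving {item}")

# The (shape, pattern) pair selects the code module to run.
SHAPE_DISPATCH: Dict[Tuple[str, str], Callable[[str], None]] = {
    ("pyramid", "output"): print_module,   # pyramidal "printer" shape touched to a pattern
    ("cube",    "output"): save_module,    # a second shape selects a different module
}

def on_shape_touch(shape: str, pattern: str, selected_item: str) -> None:
    module = SHAPE_DISPATCH.get((shape, pattern))
    if module is not None:
        module(selected_item)

on_shape_touch("pyramid", "output", "drawing-1")  # printing drawing-1
on_shape_touch("cube", "output", "drawing-1")     # saving drawing-1
```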
If a template pattern has been selected at block 612, process flow proceeds to block 614 where an associated command can be entered into the program. At block 616, the system may determine if the template has been removed from the scanned area. If not, process flow may return to block 610 to continue looking for user input. While the computer system is specifically looking for input relevant to the template present, it may detect the placement of another template in view of the imaging sensors, for example, by continuing to execute block 602 in parallel.
If at block 616 it is determined that the template is no longer in the imaged volume in front of the computer system, process flow may proceed to block 618, at which the system may perform a series of actions to close the program. However, embodiments are not limited to automatically closing the program, as the user may manually close the program at any time. In an embodiment, removing the template may have no effect except to eliminate selection of the associated commands using the template. The system may also take other actions to close out the program, such as saving the files in the program or prompting a user to save the files.
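The close-out behavior described for block 618 might look like the following sketch. The prompt, save, and close callbacks are placeholders for whatever the host application provides, and the auto_close flag reflects that embodiments need not close the program automatically when the template is removed.

```python
def on_template_removed(program, prompt_user, save_files, close_program,
                        auto_close: bool = False) -> None:
    """Handle removal of a template from the imaged volume (illustrative only).

    Depending on configuration, removal may simply disable the template's
    commands, prompt the user to save, or close the associated program.
    """
    if not auto_close:
        # Removal only eliminates selection of the template's commands.
        return
    if prompt_user(f"Save open files before closing {program}?"):
        save_files(program)
    close_program(program)

# Example wiring with stand-in callbacks.
on_template_removed(
    "drawing_app",
    prompt_user=lambda msg: True,
    save_files=lambda prog: print(f"files saved for {prog}"),
    close_program=lambda prog: print(f"{prog} closed"),
    auto_close=True,
)
```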