This disclosure pertains to interactive interfaces and, more particularly, to touch interfaces that feature optical touch navigation.
Touch interfaces permit a user to interact with computing devices by touching a screen with a finger or stylus. Touch interfaces are pervasive, particularly in mobile computing devices, and may be implemented using various technologies, e.g., resistive, capacitive, or optical technologies. Resistive technology-based touch interfaces typically include two layers coated with a resistive material and separated by a gap. A different voltage electrifies each of the two layers. A touch by a finger or stylus presses the two layers together, changing the voltage and allowing the interface to identify the location of the touch. Resistive technology-based interfaces are inexpensive to manufacture but suffer from low optical transparency and are susceptible to scratches on the touch surface.
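By way of illustration only, and not as part of this disclosure, the following sketch models an idealized 4-wire resistive panel as a pair of voltage dividers; the panel dimensions, drive voltage, and function name are assumptions.

```python
def resistive_touch_position(v_x, v_y, v_drive=3.3, width_mm=60.0, height_mm=100.0):
    """Idealized 4-wire resistive panel: with one layer driven, the voltage
    sensed on the other layer divides in proportion to the touch position
    along the driven axis. Dimensions and drive voltage are assumed values."""
    x = (v_x / v_drive) * width_mm   # position along the x-driven axis
    y = (v_y / v_drive) * height_mm  # position along the y-driven axis
    return x, y

# e.g. resistive_touch_position(1.65, 0.825) -> (30.0, 25.0)
```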
Capacitive technology-based touch interfaces may use a single active layer coated with a transparent conductor. A small current runs across the interface, with circuits located at the corners to measure the capacitance of a finger or a conductive stylus when it touches the interface. The touch of the finger or the conductive stylus draws current from the active layer, changing the capacitance and allowing the interface to identify the location of the touch. Capacitive technology-based touch interfaces may determine geometrical features of a contact patch, e.g., centroid and size, to track movement of the finger or conductive stylus. Such touch interfaces estimate movement based on the geometrical features of the contact patch as the finger or the conductive stylus moves from one location to another on the touch screen surface. The geometrical features of the contact patch, however, are indirect measurements of the position and trajectory of the finger or conductive stylus, which may lead to position estimation inaccuracies or slop, e.g., retrograde scrolling, in which a contact patch is erroneously interpreted as moving backwards even as the user extends a finger forward.
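Again for illustration only, the following sketch shows how a capacitive controller might infer motion indirectly from contact-patch centroids; the grid format, threshold, and function names are assumptions. Because motion is derived from the patch geometry rather than from the finger itself, a change in patch shape alone can register as movement, which is the source of slop such as retrograde scrolling.

```python
import numpy as np

def contact_centroid(cap_frame, threshold=0.2):
    """Estimate the centroid of a contact patch from a grid of per-node
    capacitance deltas (assumed normalized to 0..1)."""
    mask = cap_frame > threshold            # nodes covered by the contact patch
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    weights = cap_frame[ys, xs]
    return np.average(xs, weights=weights), np.average(ys, weights=weights)

def estimate_motion(prev_frame, curr_frame):
    """Motion inferred indirectly from centroid displacement between frames."""
    p, c = contact_centroid(prev_frame), contact_centroid(curr_frame)
    if p is None or c is None:
        return 0.0, 0.0
    return c[0] - p[0], c[1] - p[1]
```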
Optical technology-based touch interfaces rely on optics to detect light emissions or reflections from a touch that translate into movement of a cursor or other icon on a screen or monitor. Optical touch interfaces have been found useful for applications in which little physical space or area exists for larger capacitive or resistive touch interfaces. For example, optical touch interfaces are common in computer mice. Small area optical touch interfaces such as those implemented in mice are not generally considered ideal for the long distance precision control necessary for scrolling or panning, since these actions would require multiple swipes of the touch interface to scroll or pan through an entire page.
Consumer products manufacturers often seek touch interfaces that may address some of the disadvantages associated with resistive, capacitive, or optical touch interfaces.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
An exemplary touch interface includes a lighting device, an optical device, and a sensing device. The lighting device distributes light over at least one surface of the optical device which, in turn, generates a light beam at an exit of the optical device by internally reflecting the light in the optical device. The sensing device detects an object incident on or proximate to the optical device by comparing successive images of the object captured by the sensing device in response to the light beam striking the sensing device. The lighting device may include a light source and a backlight device configured to project the light generated by the light source onto at least one surface of the optical device. The optical device may comprise an optical wedge including a thick end opposing a thin end. The optical wedge may internally reflect the light between a top surface and a bottom surface of the optical wedge and may produce the light beam at the thick end.
Additional aspects and advantages of exemplary touch devices including optical touch navigation will be apparent from the following detailed description that proceeds with reference to the accompanying drawings.
An exemplary optical touch interface described herein is useful in applications where precision control is necessary, e.g., for scrolling or panning, which require tracking of one or more objects over larger distances than those afforded by known optical finger navigation devices such as optical touch mice. Note that the optical touch interface directly tracks movement of at least one object incident on a surface of an optical device, which increases tracking precision. The larger distance optical tracking is possible due at least in part to the optical device, which generates a light beam by internally reflecting the light. The internal light reflection, in turn, allows for a reduction in a size of an optical path necessary for such larger distance optical tracking. The reduction in the size of the optical path provides greater design freedom with regard to the touch interface's angles and contours. A sensing device captures an image of a surface of the optical device in response to the light beam striking the sensing device. The sensing device detects one or more objects incident on the optical device by comparing successively captured images of the light beam at the exit of the optical device since the object(s) will scatter at least a portion of the light internally reflected by the optical device.
Referring to
Touch interface 100 may include a lighting device 101, which, in turn, may include light source 102 and backlight device 104. Light source 102 may source light 103 while backlight device 104 may project light 103 onto optical device 106. Light source 102 may be any illuminant configured to emit or source any kind of light known to a person of ordinary skill in the art including structured, unstructured, single wavelength, visible, or infrared light. An exemplary light source 102 may include at least one light emitting diode positioned adjacent to end 105 of backlight device 104. Another exemplary light source 102 may include a plurality of light emitting diodes positioned along and adjacent to end 105 of backlight device 104. The plurality of light emitting diodes may increase the intensity of rays 114 distributed to optical device 106 by backlight device 104.
Backlight device 104 projects or otherwise distributes light 103 from light source 102 as rays 114 onto optical device 106. A portion of light 103 may leak out along a length of backlight device 104, e.g., due to diffusing elements shown in
Referring to
Referring back to
Optical device 106 may extend over a portion or an entire length of backlight device 104. Rays 114 may enter optical device 106 from bottom surface 128 at any angle, including a view angle greater than or equal to zero degrees. Optical device 106 may distribute rays 114 through internal reflection as reflected rays 116. Reflected rays 116 may reflect internally between top surface 126 and bottom surface 128 before exiting at thick end 124 to be delivered as light beam 118 to lens 108 and sensor 110. A portion of reflected rays 116 may exit top surface 126. In this manner, optical device 106 de-magnifies, focuses, or otherwise directs light 103 delivered from light source 102 to sensor 110 through optical device 106 bounded by top surface 126, bottom surface 128, opposing sides 125, thin end 122, and thick end 124. In an embodiment, light beam 118 may comprise substantially collimated or parallel rays.
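By way of illustration only, the following sketch models the internal reflection described above using Snell's law: a ray whose incidence angle meets or exceeds the critical angle is totally internally reflected, and for a ray traveling toward the thick end of a wedge the incidence angle grows by roughly the wedge angle per bounce, keeping the ray trapped until it emerges at the thick end. The refractive index, wedge angle, and function names are assumptions, not values from this disclosure.

```python
import math

def critical_angle_deg(n_wedge=1.49, n_air=1.0):
    """Critical angle for total internal reflection at a wedge/air boundary
    (Snell's law); the acrylic-like index of 1.49 is an assumed value."""
    return math.degrees(math.asin(n_air / n_wedge))

def stays_trapped(entry_angle_deg, wedge_angle_deg=2.0, bounces=20):
    """Toy trace of a ray bouncing between the top and bottom surfaces while
    traveling toward the thick end: each bounce steepens the incidence angle
    by roughly the wedge angle, so a ray at or above the critical angle stays
    trapped and exits at the thick end, while a shallower ray leaks out
    through a face of the wedge."""
    crit = critical_angle_deg()
    theta = entry_angle_deg
    for _ in range(bounces):
        if theta < crit:
            return False            # refracts out through a surface
        theta += wedge_angle_deg    # incidence angle grows toward the thick end
    return True                     # emerges at the thick end as the light beam
```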
Sensor 110, in turn, senses light beam 118 to capture images of top surface 126 or an object 130 proximate to or incident on top surface 126 as reflected rays 116 traverse optical device 106. Sensor 110 may be any kind of device that captures light and converts the captured light into an electronic signal, e.g., a charge-coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS) sensor, or an active pixel array. Sensor 110 may include an analog portion and a digital portion (not shown separately from sensor 110). The analog portion may include a photo sensor that holds an electrical charge representative of the light striking its surface and converts the charge into a voltage one pixel at a time. The digital portion may convert the voltage into a digital signal representative of the light striking the photo sensor. Sensor 110 may be an integrated circuit that includes both the analog portion and the digital portion. Alternatively, sensor 110 may comprise two distinct circuits separately implementing the analog portion and the digital portion. Sensor 110 may alternatively be integrated with processing device 112 to include additional features described in more detail herein.
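For illustration only, a minimal sketch of the digital portion's voltage-to-code conversion follows; the reference voltage, bit depth, and function name are assumptions rather than parameters of sensor 110.

```python
import numpy as np

def digitize_frame(pixel_voltages, v_ref=3.3, bits=10):
    """Quantize per-pixel analog voltages (in volts) into digital counts,
    mimicking the analog portion (charge -> voltage) feeding the digital
    portion (voltage -> code) of an image sensor."""
    levels = 2 ** bits - 1
    codes = np.clip(pixel_voltages / v_ref, 0.0, 1.0) * levels
    return codes.round().astype(np.uint16)
```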
In an embodiment, lens 108 may be interposed between thick end 124 and sensor 110 to focus the light beam 118 exiting thick end 124 onto sensor 110. In another embodiment, lens 108 may be interposed between bottom surface 128 and sensor 110 to focus the light beam 118 exiting bottom surface 128 onto sensor 110. Lens 108 may be any device known to a person of ordinary skill in the art capable of focusing light beam 118 on sensor 110 and capable of compensating for optical aberrations that may occur in optical device 106. In this context, an optical aberration may be any deviation of the actual image from the ideal image of object 130 that may occur due to, e.g., optical device shape variation, imaging aberrations, and the like (
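By way of illustration only, if lens 108 is approximated by a thin lens, the sensor placement follows from the thin-lens equation; the thin-lens model, focal length, distances, and function name below are assumptions for illustration and not values from this disclosure.

```python
def sensor_distance(focal_length_mm, object_distance_mm):
    """Thin-lens estimate (1/f = 1/d_o + 1/d_i) of where to place the sensor
    behind the lens so that the exit face of the optical device is in focus."""
    if object_distance_mm <= focal_length_mm:
        raise ValueError("object must sit beyond the focal length to form a real image")
    return 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)

# e.g. a 10 mm lens imaging an exit face 60 mm away: sensor_distance(10, 60) -> 12.0
```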
Processing device 112 may include any processor capable of manipulating or otherwise processing an output of sensor 110. Processing device 112 may include memory 113, which may be of any type or of any size known to a person of ordinary skill in the art. An embodiment of processing device 112 and memory 113 may include processor 504 and memory 506 shown in
Object 130 proximate to or incident on optical device 106 may scatter at least a portion of the reflected rays 116 as scattered rays 115. An angle at which at least a portion of the reflected rays 116 and scattered rays 115 exit the thick end 124 as light beam 118, therefore, may change according to a position of object 130 on top surface 126. Sensor 110 may capture images of object 130 proximate to or incident on top surface 126 at predetermined times and store the images in onboard memory (not shown separately from sensor 110). Alternatively, sensor 110 may transmit the images to processing device 112 for storage in memory 113 and subsequent processing. Processing device 112 may compare successively captured images or may compare images captured at predetermined intervals to determine a location of object 130 on top surface 126 and to directly track movement of object 130 on top surface 126. Note that positions of object 130 on top surface 126 are directly determined or tracked by processing device 112 from a comparison of the images, and not indirectly from other indicia, e.g., contact patch geometries, as is the case with other touch technologies. Processing device 112 may compare successively captured images using any number of algorithms, including cross-correlation algorithms or single or multi contact tracking algorithms as is well known to a person of ordinary skill in the art.
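By way of illustration only, the following sketch estimates the displacement of an object between two successively captured images using FFT-based cross-correlation, one of the algorithm families mentioned above; the frame names and the numpy-based implementation are assumptions, not the specific algorithm of processing device 112.

```python
import numpy as np

def frame_displacement(prev_frame, curr_frame):
    """Estimate the (dy, dx) shift of an imaged object between two
    successively captured frames from the peak of their FFT-based
    cross-correlation surface."""
    f0 = np.fft.fft2(prev_frame - prev_frame.mean())
    f1 = np.fft.fft2(curr_frame - curr_frame.mean())
    corr = np.fft.ifft2(f1 * np.conj(f0)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peak indices past half the frame size correspond to negative shifts.
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))
```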
Referring to
Referring to
Referring to
Moreover, a person of ordinary skill in the art readily will recognize that system 500 may be implemented on other types of computing architectures, e.g., general purpose or personal computers, hand-held devices, mobile communication devices, multi-processor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, application specific integrated circuits, System-On-Chip (SOC) devices, and the like. For illustrative purposes only, system 500 is shown in
Similarly, a person of ordinary skill in the art readily will recognize that system 500 may be implemented in a distributed computing system in which various computing entities or devices, often geographically remote from one another, e.g., computing device 502 and remote computing device 502R, perform particular tasks or execute particular objects, components, routines, programs, instructions, data structures, and the like. For example, system 500 may be implemented in a server/client configuration (e.g., computing device 502 may operate as a server and remote computing device 502R, tablet computing device 502T, mobile computing device 502M, or laptop computing device 502L may operate as clients). In system 500, application programs 506C may be stored in local memory device 506, external memory device 536, or remote memory device 534. Local memory device 506, external memory device 536, or remote memory device 534 may be any kind of memory known to a person of ordinary skill in the art including random access memory (RAM), flash memory, read only memory (ROM), ferroelectric RAM, magnetic storage devices, optical discs, and the like.
Computing device 502 may comprise processing device 504, memory device 506, device interface 508, and network interface 510, which all may be interconnected through bus 512. Processing device 504 may represent a single, central processing unit, or a plurality of processing units in a single computing device 502 or plural computing devices, e.g., computing device 502 and remote computing device 502R. Local memory device 506, external memory device 536, and/or remote memory device 534 may be any type of memory device, such as any combination of RAM, flash memory, ROM, ferroelectric RAM, magnetic storage devices, optical discs, and the like. Local memory device 506 may include a basic input/output system (BIOS) 506A with routines to transfer data, including data 506D, between the various elements of system 500. Local memory device 506 also may store an operating system (OS) 506B that, after being initially loaded by a boot program, manages other programs in computing device 502. Local memory device 506 may store routines or programs 506C designed to perform a specific function for a user or another application program, e.g., application programs configured to capture images from a sensor or to compare successively captured images to detect an object incident on an optical element, as described in more detail above. Local memory device 506 additionally may store any kind of data 506D, e.g., images from sensor 110 (
Computing device 502 may comprise processing device 112 and memory 113 of touch interface 100, shown in
Device interface 508 may be any one of several types of interfaces. Device interface 508 may operatively couple any of a variety of devices, e.g., a hard disk drive, optical disk drive, magnetic disk drive, or the like, to bus 512. Device interface 508 may represent either one interface or various distinct interfaces, each specially constructed to support the particular device that it interfaces to bus 512. Device interface 508 may additionally interface input or output devices utilized by a user to provide direction to computing device 502 and to receive information from computing device 502. These input or output devices may include keyboards, monitors, mice, pointing devices, speakers, styluses, microphones, joysticks, game pads, satellite dishes, printers, scanners, cameras, video equipment, modems, and the like. Device interface 508 may interface with touch devices, optical or otherwise, including optical touch interface 100 shown in
A person of skill in the art readily will recognize that system 500 may comprise any type of computer readable medium accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, cartridges, RAM, ROM, flash memory, magnetic disc drives, optical disc drives, and the like.
Network interface 510 may operatively couple computing device 502 to remote computing devices 502R, tablet computing devices 502T, mobile computing devices 502M, and/or laptop computing devices 502L on network 530. Network 530 may be a local, wide area, or wireless network, or any other type of network capable of electronically coupling one computing device to another computing device. Remote computing devices 502R may be geographically remote from computing device 502. Remote computing device 502R may have a structure corresponding to computing device 502, or may operate as a server, client, router, switch, peer device, network node, or other networked device and may include some or all of the elements of computing device 502. Computing device 502 may connect to the local or wide area network 530 through network interface 510 or an adapter included in interface 560, through a modem or other communications device included in network interface 510, using wireless device 532, or the like. A modem or other communications device may establish communications with remote computing devices 502R through global communications network 530. A person of ordinary skill in the art readily will recognize that application programs or modules 506C may be stored remotely through such networked connections.
A person of ordinary skill in the art will recognize that they may make many changes to the details of the above-described exemplary touch interfaces that feature optical touch navigation without departing from the underlying principles. Only the following claims, therefore, define the scope of the exemplary touch interfaces that feature optical touch navigation.