An optical joystick is a navigation device suitable for small electronics such as personal digital assistants (PDAs), cell phones, and other mobile and portable electronic devices. An optical joystick typically utilizes a sensor array, a light source and optics to detect motion of a user's finger on a navigation surface. The sensor array captures light reflected off the navigation surface, and a correlation-based algorithm determines the relative motion of the finger from changes in the captured light.
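The correlation step described above can be sketched as a brute-force search over candidate shifts between two successive frames from the sensor array. The following is a minimal, illustrative Python/NumPy sketch, not taken from the disclosure; the function name, window arithmetic, and similarity score are assumptions:

```python
import numpy as np

def estimate_shift(prev_frame, curr_frame, max_shift=3):
    """Brute-force correlation search over candidate shifts.

    Returns the (dx, dy) motion for which the overlapping regions of
    the two frames agree best, i.e. for which
    curr_frame[y + dy, x + dx] is approximately prev_frame[y, x].
    """
    h, w = prev_frame.shape
    best_score, best = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Windows of each frame that overlap under this candidate shift
            a = prev_frame[max(0, -dy):h + min(0, -dy),
                           max(0, -dx):w + min(0, -dx)]
            b = curr_frame[max(0, dy):h + min(0, dy),
                           max(0, dx):w + min(0, dx)]
            score = float((a * b).mean())  # simple correlation score
            if score > best_score:
                best_score, best = score, (dx, dy)
    return best
```

In a real navigation engine the search would typically use a normalized correlation measure and sub-pixel interpolation; this sketch only illustrates the pick-the-best-shift idea.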
In an optical joystick, the navigation surface is typically located on an optically transparent cover piece that protects the navigation engine (sensor and light source) and establishes a physical interface for the user's finger to navigate upon.
Existing optical joystick designs do not offer mechanical feedback to a user as do navigation devices that are implemented with moving parts. For example, a slide pad motion navigation device is a mechanical device that allows a user to navigate based on mechanical movement of a pod placed at the center of the device. A spring structure creates force feedback to the user as the pod is displaced from the center of the device, giving the user a sense of the position of the pod relative to the center of the slide pad motion navigation device. An optical joystick, however, has only a flat outer surface for the user's finger to navigate on, with no mechanically movable parts to provide feedback on the current position of the finger.
In accordance with an embodiment of the present invention, a navigation device includes a surface and an optical motion sensor. A user moves a finger across the surface to provide navigation information to a host device for the navigation device. The surface includes a contoured region that provides tactile feedback to the user as the user moves the finger across the surface. The optical motion sensor senses motion of the finger across the surface.
Surface 17 is at least partially transparent at a wavelength of light emitted by light source 15. For example, when the wavelength of light emitted by light source 15 is not visible to a human eye, surface 17 can be opaque to light visible to a human eye.
Lens array 14 includes M×N elements, where M≧1 and N≧1. Lens array 14 collects light reflected from surface 17 and forms a pattern on two-dimensional sensor array 12 underneath.
For example, when a light emitting diode (LED) is used as light source 15, lens array 14 may be used to form an image of a surface, for example a finger surface of a user, in contact with surface 17.
An optional lens 16 may be placed between the light source 15 and the surface 20 where the output beam is substantially collimated.
Illumination source 15 may be, for example, a coherent light source such as a laser diode or a vertical cavity surface emitting laser. Alternatively, illumination source 15 may be an incoherent or quasi-coherent light source such as a light emitting diode (LED) or a broadband source with or without an optical filter. As an alternative to using illumination source 15, surface 17 can be illuminated in another way, such as, for example, by an external light source such as ambient light. Lens array 14 is composed of elements that may be refractive, diffractive or hybrid. Sensor array 12 is, for example, a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) imaging array. Sensor array 12 is preferably positioned to capture the pattern formed by lens array 14.
A contoured region 18 on surface 17 creates for the user a natural feel of a center of the optical joystick surface.
An analog-to-digital converter (ADC) 22 receives analog signals from image array 12 and converts the signals to digital data. An automatic gain control (AGC) 23 evaluates digital data received from ADC 22 and controls shutter speed and gain adjust within image array 12. This is done, for example, to prevent saturation or underexposure of images captured by image array 12.
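The exposure-control loop described above might look like the following toy sketch. The thresholds, step sizes, and the policy of adjusting shutter before gain are illustrative assumptions, not taken from the disclosure:

```python
def adjust_exposure(mean_level, shutter, gain, low=64, high=192):
    """One AGC step (hypothetical thresholds for an 8-bit sensor):
    keep the average pixel level inside [low, high] by nudging the
    shutter (integration time) and analog gain settings."""
    if mean_level > high:        # image near saturation: expose less
        if shutter > 1:
            shutter -= 1         # shorten integration time first
        elif gain > 1:
            gain -= 1            # then back off the gain
    elif mean_level < low:       # image underexposed: expose more
        if gain < 8:
            gain += 1            # raise gain first (hypothetical cap of 8)
        else:
            shutter += 1         # then lengthen integration time
    return shutter, gain
```

A hardware AGC would typically apply such an adjustment once per captured frame, using the digital data from the ADC to compute the mean level.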
A navigation engine 24 evaluates the digital data from ADC 22 and performs a correlation to calculate the overlap of images and to determine the shift between images in order to detect motion. For example, the correlation is performed using an image processing algorithm such as a convolution, or can be performed in another way to detect image shift. Navigation engine 24 determines a delta x value placed on an output 25 and a delta y value placed on an output 26. A controller 28 receives the delta x value on output 25 and the delta y value on output 26. Controller 28, through an interface 29, forwards representations of these values to a host system. For example, the host system is a PDA, a cell phone or some other device utilizing an optical joystick. The representations of the delta x values on output 25 and the delta y values on output 26 can be transmitted immediately and continuously to the host system, or, alternatively, can be stored for later transmission in response to a query from the host system.
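The two reporting modes described above (immediate, continuous streaming versus store-and-respond-to-query) can be modeled with a small sketch. The class and method names are hypothetical and not from the disclosure:

```python
class JoystickController:
    """Toy model of the controller's two reporting modes (class and
    method names are hypothetical, not from the disclosure)."""

    def __init__(self, stream=False):
        self.stream = stream      # True: forward each delta immediately
        self.pending = (0, 0)     # accumulated motion awaiting a query
        self.sent = []            # stands in for the host interface

    def report(self, dx, dy):
        """Accept one (delta x, delta y) pair from the navigation engine."""
        if self.stream:
            self.sent.append((dx, dy))    # transmit immediately
        else:
            px, py = self.pending         # store for a later host query
            self.pending = (px + dx, py + dy)

    def poll(self):
        """Host query: return and clear the accumulated motion."""
        delta, self.pending = self.pending, (0, 0)
        return delta
```

In the stored mode the deltas are summed, so the host receives the net motion since its last query rather than every intermediate report.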
While specific examples of contoured regions (convex, concave and textured) have been provided, these are meant to be illustrative of contoured (i.e., non-flat) regions that can be used to provide tactile feedback to a user of an optical joystick.
The foregoing discussion discloses and describes merely exemplary methods and embodiments of the present invention. As will be understood by those familiar with the art, the invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.