System and method of detecting contact on a display

Information

  • Patent Grant
  • Patent Number
    8,094,137
  • Date Filed
    Monday, July 23, 2007
  • Date Issued
    Tuesday, January 10, 2012
Abstract
A system for detecting contact on a display is provided. The system comprises a planar medium associated with the display and includes at least one edge facet and opposing surfaces. The system also includes one or more optical sources operatively coupled to the at least one edge facet for transmitting an optical signal into the planar medium such that the transmitted optical signal is totally internally reflected between the at least one edge facet and opposing surfaces. An optical sensing device is positioned to substantially face at least a portion of the edge facet and adapted to detect at least one object contacting a first surface of the opposing surfaces. The optical sensing device is operative to detect a portion of the optical signal emitted from the first surface at a location corresponding to the object contacting the first surface.
Description
FIELD OF THE INVENTION

The present invention relates generally to display screens, and in particular, to a system and method for detecting contact on such display screens.


BACKGROUND OF THE INVENTION

Touch systems are well known in the art and typically include a touch screen having a touch surface on which contacts are made using a pointer. Pointer contacts with the touch surface are detected and are used to generate corresponding output pointer position data representing areas of the touch surface where the pointer contacts are made. There are basically two general types of touch systems available and they can be broadly classified as “active” touch systems and “passive” touch systems.


Active touch systems allow a user to generate pointer position data by contacting the touch surface with a special pointer that usually requires some form of on-board power source, typically batteries. The special pointer emits signals, such as infrared light, visible light, ultrasonic waves, or electromagnetic signals, that activate the touch surface.


Passive touch systems allow a user to generate pointer position data by contacting the touch surface with a passive pointer and do not require special pointers in order to activate the touch surface. A passive pointer can be a finger, a cylinder of some material, or any other suitable object that can be used to contact some predetermined area of interest on the touch surface. Since special active pointers are not necessary in passive touch systems, users need not be concerned with battery power levels, pointer damage, theft, or misplacement. The detection of one or more points of contact with the touch screen or other display surfaces may be accomplished by a myriad of techniques.


United States Patent Application No. 20060279558 to Van Delden et al. discloses a display device having a touch screen 301. The touch screen comprises a first light guide 302, a second light guide 307, and a medium 309 between the light guides for eliminating interference and reflections. A light source 308 is arranged to emit light 310 into the first light guide 302, where the light is normally confined within the first light guide by means of total internal reflection. The second light guide 307 is arranged at the exterior face of the first light guide 302. When a user of the display device establishes physical contact with the touch screen 301, light is extracted from the first light guide and directed towards light detecting means 303. The light detecting means 303 are arranged to relate a light detecting event to an input position on the touch screen 301 where the user interaction occurred.


United States Patent Application No. 20060114244 to Saxena et al. discloses a touch input system including a light-emitting device, a bent light guide, and a light detector. The bent light guide receives the light emitted by the light-emitting device and guides it to travel in a direction across a face of a display screen, where a light detector detects the light. When an object interrupts the transmission of light, the interruption is detected by an activated light detector opposite the light emitter transmitting the light. This is illustrated by an object 17 interrupting light transmitted from one of light emitters 10 to light detectors 11, and interrupting light transmitted from one of light emitters 12 to light detectors 14.


United States Patent Application No. 20050104860 to McCreary et al. discloses a touchframe system including a plurality of light emitting elements and a plurality of light receiving elements positioned around the perimeter of a display area. Each of the light receiving elements, in combination with a plurality of the light emitting elements, forms a zone of light beam paths. The number and positioning of receivers are sufficient to form a plurality of partially overlapping zone pairs. These zone pairs are arranged relative to the display area such that any touch event lies within at least two zone pairs. A processor monitors each of the zone pairs for blockage of at least one light beam path. Upon such blockage, the processor calculates the location of the touch event associated with the blockage based on the slopes and end points of at least two intersecting blocked light beam paths from a first zone pair and two intersecting blocked light beam paths from a second zone pair.


United States Patent Application No. 20040032401 to Nakazawa et al. discloses a substrate made of glass that serves both as a substrate for a touch panel and as a front light. The substrate both propagates an ultrasonic wave in order to detect a touched position and propagates light emitted from a light source to guide the light toward a reflective-type liquid crystal display. In the case where an image on the liquid crystal display is made visible by external light, the external light transmitted through the substrate is reflected by the liquid crystal display and transmitted back through the substrate to be emitted from the front face. In the case where the front light function is used, light that has been introduced into the substrate from the light source is reflected by the liquid crystal display and transmitted through the substrate to be emitted from the front face.


U.S. Pat. No. 7,002,555 to Jacobsen et al. discloses a display device having a touch sensor that consists of an electrochromic cell or a liquid crystal cell located between two transparent plates: a transparent cover plate and a transparent support plate. A radiation source whose light enters the cover plate and illuminates it is arranged on at least one of the end faces of the transparent cover plate. At least one photodetector is mounted on the support plate.


U.S. Pat. No. 6,738,051 to Boyd et al. discloses a frontlit touch panel for use with a reflective light valve, where the panel comprises a front light guide having at least one light input face that supplies light to the guide, a viewing face, a light output face opposite the viewing face, and at least one component of a touch-sensitive transducer. The light output face includes a light extraction layer thereon having a substantially flat light exit face and contains buried reflective facets that extract supplied light from the guide through the light exit face. The touch panel can be used with a light source, a reflective light valve, and suitable control electronics to form a compact and efficient illuminated touch panel display assembly.


U.S. Pat. No. 4,710,760 to Kasday discloses a touch-sensitive device comprising a photoelastic screen having light reflecting edges and a unique light emitting/receiving module placed at two of the four corners of the screen, which advantageously determines the location at which a force is applied to the screen. Circularly and linearly polarized light focused into the photoelastic screen by the modules reflects off the edges of the screen and is returned to the modules where it is absorbed by a circular polarizer. The polarization of light passing through a point at which the screen is touched is changed thereby allowing these rays or signals to pass through each module's absorber. The location as well as the magnitude and direction of the force imparted to the screen by the touch is then determined from the changes in the signals that pass through the absorber.


It is therefore at least one object of the present invention to provide a novel system and method of detecting contact on a display screen.


SUMMARY OF THE INVENTION

These and other objects may be accomplished according to one or more embodiments, whereby a system for detecting contact on a display is provided. The system for detecting contact comprises a planar medium associated with the display and includes at least one edge facet and opposing surfaces. The system also includes one or more optical sources operatively coupled to the edge facet for transmitting an optical signal into the planar medium such that the transmitted optical signal is totally internally reflected between the at least one edge facet and opposing surfaces. According to the system, an optical sensing device is positioned to substantially face at least a portion of the edge facet and adapted to detect at least one object contacting a first surface of the opposing surfaces. The optical sensing device is operative to detect a portion of the optical signal emitted from the first surface at a location corresponding to the object contacting the first surface.


According to another embodiment, a system for detecting contact on a display is provided, where the system comprises a planar medium associated with the display and includes at least one edge facet and opposing surfaces. The system also includes one or more optical sources operatively coupled to the edge facet for transmitting an optical signal into the planar medium such that the transmitted optical signal is totally internally reflected between the at least one edge facet and opposing surfaces. Further, according to the system, at least two camera devices are provided, where the camera devices are positioned to substantially face at least a portion of the edge facet and adapted to detect at least one object contacting a first surface of the opposing surfaces. The camera devices are operative to capture images of a portion of the optical signal emitted from the first surface at a location corresponding to the object contacting the first surface.


According to yet another embodiment, a method of detecting contact to a display is provided. The method comprises transmitting an optical signal into a planar medium associated with the display, where within the planar medium the transmitted optical signal is totally internally reflected. An optical sensing device is positioned to substantially face a side location associated with the planar medium, and a surface location on a first surface of the planar medium is contacted using at least one object. Using the optical sensing device, a portion of the optical signal emitted from the surface location is detected based on the object contacting the surface location.


Further, according to an embodiment, a method of detecting contact to a display is provided, where the method comprises transmitting an optical signal into a planar medium associated with the display, where within the planar medium the transmitted optical signal is totally internally reflected. The method also includes positioning a first camera device to substantially face a first side location associated with the planar medium, where the first camera device receives images from a first surface of the planar medium. A second camera device is positioned to substantially face a second side location associated with the planar medium, where the second camera device receives images from the first surface of the planar medium. A surface location on the first surface is contacted using at least one object, whereby, using the first and second camera devices, images of a portion of the optical signal emitted from the surface location based on the object contacting the surface location are captured.


Also, in another embodiment, a passive touch system comprises a touch screen having opposing surfaces that are adapted to receive an optical signal that is totally internally reflected within the opposing surfaces. Upon an object contacting a surface location associated with the opposing surfaces, a portion of the optical signal is emitted from the surface location. At least two cameras are associated with the touch screen and positioned substantially at a side location relative to the touch surface. Images of the portion of the optical signal emitted from the surface location are captured by the two cameras for determining a coordinate position associated with the object contacting the surface location.





BRIEF DESCRIPTION OF THE DRAWINGS

One or more embodiments will now be described more fully with reference to the accompanying drawings in which:



FIG. 1A is a system for detecting contact on a display according to one embodiment;



FIG. 1B is a cross-sectional view illustrating the position of an optical sensing device relative to the surface of a planar shaped medium associated with the embodiment of FIG. 1A;



FIGS. 2A-2C illustrate the concept of detecting contact to a planar shaped medium according to the embodiment of FIG. 1A;



FIGS. 3A-3B are photographic illustrations of contact detection on a planar shaped medium constituting a display;



FIG. 4 illustrates an alternative embodiment of a planar shaped medium associated with the described system;



FIGS. 5A-5C illustrate other alternative embodiments of a planar shaped medium associated with the described system;



FIG. 6 is a schematic diagram of a camera-based touch system adapted to incorporate the system of FIG. 1A;



FIG. 7 is a front elevation view of a touch screen forming part of the touch system of FIG. 6;



FIG. 8 is a schematic diagram of a camera system forming part of the touch system of FIG. 6;



FIG. 9 is a schematic diagram of a master controller forming part of the touch system of FIG. 6; and



FIG. 10 is a flow diagram illustrating the operation of the embodiment of FIG. 1A.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

In the following description, an embodiment of a system and method is provided for detecting contact on a display screen, such as a touch screen used in media presentation systems, from, for example, a user's finger, a cylindrical hand-held object, or any other suitable means.


Turning now to FIG. 1A, a system 100 for detecting contact on a display includes a planar shaped medium 102, an optical source 118, and at least one optical sensing device such as camera devices 130 and 132.


The planar shaped medium 102 may be adapted to conform to the shape of a display screen (not shown) or alternatively form an integral part of the outer-surface of a display screen. In either aspect, the planar shaped medium 102 is utilized to receive contact directly (e.g., via a finger) or indirectly (e.g., via a pen-shaped or any other object) from a user. Planar shaped medium 102 includes opposing surfaces such as top-surface 104 and bottom-surface 106. The planar shaped medium 102 also includes peripheral edge facets such as opposing edge facets 108 and 109, and opposing edge facets 110 and 112. The outer surfaces of opposing edge facets 108 and 109 and edge facet 112 are covered with an optically reflective material such as copper or silver tape. Alternatively, reflective materials may be deposited directly onto the outer surfaces of the side opposing facets 108, 109 and end facet 112 using known deposition, adhesion, or bonding techniques. As illustrated, edge facet 108 includes reflective surface 116, edge facet 109 includes reflective surface 114, and edge facet 112 includes reflective surface 118.


Edge facet 110 is adapted to receive an optical signal from the optical source 118, whereby the optical source 118 may be coupled either directly to the edge facet 110 using a surface mountable optical device (e.g., surface emitting light emitting diodes) or via an intermediate optical mechanism (e.g., an optical fiber, a lens assembly, optical filters, an optical diffuser, etc.). The planar shaped medium 102 is constructed from a material capable of exhibiting optical wave-guiding properties, such as an acrylic material having fire polished edges. Other materials such as glass may also be used to form planar shaped medium 102. Although optical source 118 is coupled to edge facet 110, other optical sources (not shown) may also be adapted to transmit optical signals into the planar shaped medium 102 via one or more of the other facets. For example, additional optical sources (not shown) may be coupled to facets 108, 109, and/or 112. The addition of other optical sources (not shown) reduces the reflectivity requirements placed on reflective surfaces 114, 116, and 118. For example, when an optical source is coupled to each of facets 108, 109, 110, and 112, reflective surfaces on the facets are no longer a necessity and become optional. According to another example, optical sources may be coupled to facets 110 and 108. In such an embodiment, edge facets 112 and 109 may include reflective surfaces that reflect the optical signals transmitted from the optical sources coupled to facets 110 and 108, respectively.


Optical source 118 may include one or more spatially distributed light emitting diodes, such as LED device 120a and LED device 120b. Light emitting diodes 120a-120b may have a viewing half angle in the range of 0-90 degrees. The number of LED devices utilized may depend on the required optical power relative to the spatial distribution of the incident optical signal transmitted into the planar shaped medium 102. These factors may in turn depend on the geometric size of the planar shaped medium 102 and the attenuation properties of the material forming it. For example, glass may attenuate a transmitted optical signal less than plastic. Other optical sources such as one or more laser devices (e.g., FP laser diodes, DFB laser diodes, VCSEL devices, etc.) may also be used for transmitting optical signals into planar shaped medium 102.
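
To make the attenuation consideration concrete, the following sketch converts a bulk attenuation figure into the fraction of optical power remaining after a given path length. The dB/m values are illustrative assumptions only; actual figures vary widely with material grade and wavelength.

```python
def remaining_fraction(alpha_db_per_m: float, distance_m: float) -> float:
    """Fraction of optical power remaining after distance_m of travel
    through a medium with bulk attenuation alpha_db_per_m (dB/m)."""
    return 10 ** (-alpha_db_per_m * distance_m / 10)

# Illustrative bulk attenuation values only; real figures vary widely
# with material grade and wavelength.
for name, alpha_db_per_m in [("glass", 0.2), ("acrylic (PMMA)", 0.9)]:
    frac = remaining_fraction(alpha_db_per_m, 1.5)
    print(f"{name}: {frac:.1%} of launched power remains after 1.5 m")
```

Under these assumed figures, a lower-loss medium preserves more of the launched signal across a large screen and may therefore require fewer or lower-power LED devices.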


The optical output signals provided by optical source 118 may include a wide range of wavelengths such as infrared, visible light, as well as ultraviolet. For example, the use of certain visible wavelengths may create various visual effects with respect to a planar shaped medium incorporated as a display screen. In one scenario, for example, multiple presentation screens may be set-up during a conference. The different presentation screens may then be color coded by transmitting different visible light signals into each display screen (i.e., via the planar medium) using different wavelength optical sources (e.g., red LED, blue LED, etc.). In another scenario, for example, no visual effect may be desired. Accordingly, an optical source operating in the infrared range is employed for signal transmission into the planar shaped medium 102.


According to one aspect, optical sensing devices such as camera devices 130 and 132 may include CMOS based camera sensors, which allow for the processing of a subset of available pixels as opposed to the collection of every pixel. This reduces the processing overhead while increasing the frame rate (fps) capabilities. Each of the camera devices 130, 132 is positioned to substantially face one of the edge facets and adapted to capture images of an object 124 (e.g., a user's finger) contacting a location 126 on the top-surface of the planar shaped medium 102. For example, camera 132 may be positioned to fully or partially face edge facet 109, while capturing images from top-surface 104. Similarly, camera 130 may be positioned to fully or partially face opposing edge facet 112, while also capturing images from top-surface 104. In an alternative example, camera 132 may be positioned to fully or partially face opposing edge facet 108, while capturing images from top-surface 104. The fields of view of cameras 130 and 132 overlap and cover the top-surface 104 of the planar medium 102 in order to facilitate both the detection and the position determination of an applied contact location on the top-surface 104. According to another aspect, the optical sensing device may include a photodetector device (not shown) such as a photodiode. As with the camera devices 130, 132, the photodetector may also be positioned to substantially face one of the opposing edge facets and adapted to detect the object 124 (e.g., a user's finger) contacting a region such as location 126 on the top-surface of the planar shaped medium 102.
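
A minimal sketch of the sub-array readout idea follows, assuming hypothetical frame dimensions (the 20×640 band mirrors the PB300 configuration described later); it illustrates the bookkeeping only and is not a sensor's actual readout API.

```python
import numpy as np

# Hypothetical full sensor resolution; only a narrow band of rows views
# the plane of the touch surface.
FULL_ROWS, FULL_COLS = 480, 640
BAND_TOP, BAND_HEIGHT = 230, 20          # 20x640 sub-array

frame = np.random.randint(0, 256, (FULL_ROWS, FULL_COLS), dtype=np.uint8)

# Process only the sub-array instead of the full frame.
sub_array = frame[BAND_TOP:BAND_TOP + BAND_HEIGHT, :]

reduction = 1 - sub_array.size / frame.size
print(f"pixels per frame: {sub_array.size} ({reduction:.0%} fewer than full frame)")
```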


The positions of cameras 130 and 132 relative to top-surface 104 are arranged in a manner that enables the capture of images from the top-surface 104. As illustrated in FIG. 1B, an optical sensing device 121 such as one or more camera devices or photodetectors may be positioned according to a multitude of positions while still capturing the images from top-surface 104. For example, the “position A” device 121 is positioned (i.e., relative to an axis going through the center of a camera lens or photodetector photosensitive area) to be substantially aligned with the top-surface 104 of planar medium 102. The “position B” and “position C” devices 121 are positioned (i.e., relative to an axis going through the center of a camera lens or photodetector photosensitive area) to be substantially elevated relative to the top-surface 104 of planar medium 102. In each case, however, the device 121 captures images from the side of the planar medium 102. Since the planar medium 102 may, for example, form the outer surface of a display such as a plasma or LCD screen, the side-looking positioning of the device 121 does not interfere with or obstruct any projection means used by the plasma or LCD technology in generating images on the corresponding plasma or LCD displays. For example, as illustrated in FIGS. 6 and 7, camera devices 270 (FIG. 6) may be installed in one of the corners 268 of display frame 262. Alternatively, for example, the cameras 270 may be positioned along any portion of the frame between corners 268.


Turning now to FIGS. 2A-2C, the concept of detecting contact to a display according to an embodiment such as the embodiment of system 100 (FIG. 1A) is illustrated. As illustrated in FIG. 2A, a user may apply contact to a location 126 on the top-surface 104 of the planar shaped medium 102. The effect of such a contact is now illustrated with the aid of FIGS. 2B and 2C, which show a cross-sectional view along axis A-A′ of region 126 of the planar shaped medium 102 of FIG. 2A. Referring to FIG. 2B, an optical signal 140 generated from source 118 (FIG. 1A) is totally internally reflected between opposing surfaces 104 and 106 and the peripheral edge facets 108, 109, 112 (FIG. 2A). Referring to FIG. 2C, as a user 124 applies a contact to top-surface 104, a portion 146 of the optical signal 140 that is totally internally reflected between the opposing surfaces 104, 106 and peripheral edge facets 108, 109, 112 (FIG. 2A) is emitted from the top-surface. The contact generates a refractive index change at the point of contact P, which causes the totally internally reflected optical signal 140 to be frustrated at the contact point P. Thus, the Frustrated Total Internal Reflection (FTIR) phenomenon at the boundary between the point of contact and the top-surface 104 allows the portion 146 of the internally reflected optical signal 140 emitted from the top-surface 104 to be detected by any suitable optical sensing device, such as a camera or photodetector, directed at the top-surface 104 of the planar shaped medium 102 (FIG. 2A).
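
The total internal reflection condition and its frustration can be made concrete with Snell's law. The sketch below computes the critical angle at the top-surface boundary; the refractive indices (about 1.49 for acrylic, 1.00 for air, and roughly 1.4 for skin) are nominal textbook values rather than figures from this description.

```python
import math

def critical_angle_deg(n_core: float, n_outside: float):
    """Incidence angle above which light is totally internally reflected
    at the core/outside boundary; None if TIR cannot occur."""
    if n_outside >= n_core:
        return None  # light can always refract into a denser medium
    return math.degrees(math.asin(n_outside / n_core))

n_acrylic, n_air, n_skin = 1.49, 1.00, 1.40  # nominal, illustrative indices

print(f"against air:  {critical_angle_deg(n_acrylic, n_air):.1f} deg")   # ~42.2
print(f"against skin: {critical_angle_deg(n_acrylic, n_skin):.1f} deg")  # ~70.0
# A ray guided at, say, 50 deg undergoes TIR against air but not against
# skin, so light escapes (is frustrated) exactly where a finger touches.
```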


The choice of a suitable optical sensing device may depend on the application of the system and methods described herein. For example, the use of one or more cameras provides the capability of both detecting the point or points of contact with the top-surface 104 and locating the position of the point of contact with the top-surface 104 using further image processing techniques. Alternatively, for example, a photodetector device may be utilized to detect the presence of the emitted portion of light 146, thereby signifying that contact with the top-surface 104 has been made.


Referring now to FIGS. 3A and 3B, photographic illustrations of images captured by a camera device of points of contact with the top-surface of a screen 150 are provided. In FIG. 3A, a user's finger 160 is used to contact the top-surface of the screen 150 incorporating a similar embodiment to that of system 100 (FIG. 1A). As illustrated, an emitted frustrated optical signal manifested as an illuminated region 152 is captured by the camera device at the point of contact T. Similarly, as depicted in FIG. 3B, the camera captures a user's fingers applying multiple contact points to the top-surface 150, causing the emission of multiple frustrated optical signals manifested as illuminated regions 152, 154, 156 at contact points T, U, and V, respectively.


Turning now to FIG. 4, an alternative embodiment of a planar shaped medium 163 is illustrated. As shown in FIG. 1A, the planar shaped medium 102 is rectangular shaped and therefore includes four flat-faced edge facets. In the alternative embodiment shown in FIG. 4, a planar shaped medium 163 having a single edge facet 165 is provided by utilizing an elliptical or circular shaped planar medium. The surface of edge facet 165 may be partially or completely covered by an optically reflective material to facilitate reflection of an optical signal transmitted by optical source 167 into the planar shaped medium 163.


Other shaped planar media may also be utilized within system 100 (FIG. 1A), as depicted in FIGS. 5A-5C. Referring to FIG. 5A, planar shaped medium 170 includes flat faced edge facets 174, 176, 178 and a curve shaped edge facet 172. One or more optical sources may be coupled to any one or more of edge facets 172-178. Moreover, any one of the surfaces of edge facets 172-178 may be covered by an optically reflective surface. In FIG. 5B, planar shaped medium 180 includes opposing flat faced edge facets 184, 188 and opposing curve shaped edge facets 182, 186. One or more optical sources may be coupled to any one or more of edge facets 182-188. Also, any one of the surfaces of edge facets 182-188 may be covered by an optically reflective surface. Turning now to FIG. 5C, planar shaped medium 190 includes multi-sided flat faced edge facets 192-202, where one or more optical sources may be coupled to any one or more of the multi-sided flat faced edge facets 192-202. Any one of the surfaces of multi-sided flat faced edge facets 192-202 may also be covered by an optically reflective surface. As previously described in relation to FIG. 1A, the outer surfaces of the edge facets associated with FIGS. 4 and 5A-5C may be covered with an optically reflective material such as copper or silver tape. Alternatively, reflective materials may be deposited directly onto these outer surfaces using known deposition, adhesion, or bonding techniques.


The exemplary embodiments described in relation to FIGS. 4 and 5A-5C illustrate that a multitude of shapes may be adopted for the planar shaped medium. The allocation of a particular shape to a planar shaped medium may depend on, but is not limited to, aesthetic considerations, the shape of the display screen to which the planar shaped medium may be coupled, the required size of the planar shaped medium, reflectivity considerations, optical source considerations, and other factors.


The foregoing embodiment for detecting contact on a planar shaped medium such as a display screen will now be explained in association with an exemplary media presentation system. Turning now to FIG. 6, an exemplary media presentation system such as camera-based touch system 250 is provided, as disclosed in U.S. Pat. No. 6,803,906 to Morrison et al. and assigned to the assignee of the subject application, the content of which is incorporated by reference herein in its entirety.


As illustrated in FIG. 6, a passive touch system 250 includes a touch screen 252 coupled to a master controller 254, whereby the master controller 254 is also coupled to a computer 256. Computer 256 executes one or more application programs and generates a display that is projected onto the touch screen 252 via a projector 258. The touch screen 252, master controller 254, computer 256, and projector 258 form a closed loop so that user contacts with the touch screen 252 can be recorded as writing or drawing, or used to control execution of application programs executed by the computer 256.



FIG. 7 better illustrates the touch screen 252. Touch screen 252 includes a touch surface 260 bordered by a frame 262. Touch surface 260 is passive and is in the form of a rectangular planar sheet of material such as the planar shaped medium 102 (FIG. 1A) described above. A camera system (not shown) is mounted adjacent each corner 268 of the touch screen 252 by a frame assembly 264. Each frame assembly 264 may include an angled support plate (not shown) on which the camera system is mounted.


Referring to FIG. 8, each camera system 263 may include a two-dimensional CMOS camera image sensor and associated lens assembly 280, a first-in-first-out (FIFO) buffer 282 coupled to the image sensor and lens assembly 280 by a data bus, and a digital signal processor (DSP) 284 coupled to the FIFO 282 by a data bus and to the image sensor and lens assembly 280 by a control bus. A boot EPROM 286 and a power supply subsystem 288 are also included.


The CMOS camera image sensor may be a Photobit PB300 image sensor configured for a 20×640 pixel sub-array that can be operated to capture image frames at rates in excess of 200 frames per second. For example, the FIFO buffer 282 may be manufactured by Cypress under part number CY7C4211V, and the DSP 284 by Analog Devices under part number ADSP2185M.


The DSP 284 provides control information to the image sensor and lens assembly 280 via the control bus. The control information allows the DSP 284 to control parameters of the image sensor and lens assembly 280 such as exposure, gain, array configuration, reset and initialization. The DSP 284 also provides clock signals to the image sensor and lens assembly 280 to control the frame rate of the image sensor and lens assembly 280.


As illustrated in FIG. 9, master controller 254 includes a DSP 290, a boot EPROM 292, a serial line driver 294 and a power supply subsystem 295. The DSP 290 communicates with the DSPs 284 of each of the camera systems 263 over a data bus and via a serial port 296. The DSP 290 also communicates with the computer 256 via a data bus, a serial port 298, and the serial line driver 294. In this embodiment, the DSP 290 is also manufactured by Analog Devices under part number ADSP2185M. The serial line driver 294 is manufactured by Analog Devices under part number ADM222.


The master controller 254 and each camera system 263 follow a communication protocol that enables bi-directional communications via a common serial cable similar to a universal serial bus (USB). The transmission bandwidth is divided into thirty-two (32) 16-bit channels. Of the thirty-two channels, five (5) channels are assigned to each of the DSPs 284 in the camera system 263 and to the DSP 290 in the master controller 254. The remaining seven (7) channels are unused. The master controller 254 monitors the twenty (20) channels assigned to the camera system DSPs 284 while the DSPs 284 in each of the camera systems 263 monitor the five (5) channels assigned to the master controller DSP 290. Communications between the master controller 254 and each of the camera systems 263 are performed as background processes in response to interrupts.
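
The channel arithmetic works out as 32 = 4 × 5 + 5 + 7. The following sketch tabulates one possible allocation; it is illustrative bookkeeping only, as the patent does not specify the firmware's data structures.

```python
TOTAL_CHANNELS = 32   # 16-bit channels on the shared serial link
CAMERA_SYSTEMS = 4    # one camera system per corner of the touch screen
PER_DEVICE = 5        # channels per camera DSP and per master DSP

allocation = {}
channel = 0
for cam in range(CAMERA_SYSTEMS):      # 20 channels, monitored by the master
    for _ in range(PER_DEVICE):
        allocation[channel] = f"camera_dsp_{cam}"
        channel += 1
for _ in range(PER_DEVICE):            # 5 channels, monitored by the camera DSPs
    allocation[channel] = "master_dsp"
    channel += 1

unused = TOTAL_CHANNELS - channel      # the remaining channels are unused
print(f"assigned: {channel}, unused: {unused}")   # assigned: 25, unused: 7
```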


The general operation of the passive touch system 250 will now be described in association with system 100 (FIG. 1A), whereby the planar shaped medium 102 (FIG. 1A) forms the touch surface 260. In this embodiment, it is possible to superimpose the planar shaped medium 102 onto an existing touch surface 260 and therefore adapt system 100 for use with passive touch system 250. Alternatively, the planar shaped medium 102 may form an integral part of the touch surface 260 such that system 100 is an integral part of the passive touch system 250.


Each camera system 263 acquires images of the touch surface 260 within the field of view of its image sensor and lens assembly 280 at the frame rate established by the DSP clock signals and processes the images to determine if a pointer is in the acquired images. If a pointer is in the acquired images, the images are further processed to determine characteristics of the pointer contacting or hovering above the touch surface 260. The contact of the pointer with touch surface 260 is detected by the camera as one or more illuminated regions created by frustrated optical signals emitted at the point of contact of the pointer with the touch surface 260. Pixel information associated with the one or more illuminated regions is captured by the image sensor and lens assembly 280 and then processed by the camera DSPs 284. Pointer characteristics corresponding to pointer contact with the touch surface are converted into pointer information packets (PIPs), and the PIPs are queued for transmission to the master controller 254. Each of the camera systems 263 also receives and responds to diagnostic PIPs generated by the master controller 254.
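
A minimal sketch of picking an illuminated FTIR region out of a captured frame is shown below. The threshold-and-centroid approach is a generic image-processing stand-in; the actual detection algorithm running on the camera DSPs is not disclosed here.

```python
import numpy as np

def find_bright_region(frame: np.ndarray, threshold: int = 200):
    """Return the centroid (row, col) of pixels brighter than threshold,
    or None. Single-blob simplification: real firmware would segment
    multiple touch points and reject noise before forming a PIP."""
    bright = np.argwhere(frame > threshold)
    if bright.size == 0:
        return None
    return tuple(bright.mean(axis=0))

# Synthetic 20x640 sub-array frame with one bright FTIR touch spot.
frame = np.zeros((20, 640), dtype=np.uint8)
frame[8:12, 300:310] = 255
print(find_bright_region(frame))   # approximately (9.5, 304.5)
```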


The master controller 254 polls each of the camera systems 263 at a set frequency (in this embodiment, 70 times per second) for PIPs and triangulates pointer characteristics (e.g., pointer contact) in the PIPs to determine pointer position data. The master controller 254 in turn transmits pointer position data and/or status information to the personal computer 256. In this manner, the pointer position data transmitted to the personal computer 256 can be recorded as writing (e.g., annotations) or drawing, used to execute a response, or used to control execution of application programs executed by the computer 256. The computer 256 also updates the display output conveyed to the projector 258 so that information projected onto the touch surface 260 reflects the pointer activity.
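
Triangulation from two cameras at known positions reduces to intersecting two rays in the plane of the touch surface. A simplified sketch follows, assuming each camera reports a single bearing angle to the bright region and ignoring lens distortion and calibration.

```python
import math

def triangulate(p0, a0_deg, p1, a1_deg):
    """Intersect rays leaving camera positions p0 and p1 at bearings
    a0_deg and a1_deg (measured from the +x axis of the touch plane)."""
    (x0, y0), (x1, y1) = p0, p1
    d0 = (math.cos(math.radians(a0_deg)), math.sin(math.radians(a0_deg)))
    d1 = (math.cos(math.radians(a1_deg)), math.sin(math.radians(a1_deg)))
    denom = d0[0] * d1[1] - d0[1] * d1[0]
    if abs(denom) < 1e-9:
        return None                    # parallel bearings: no position fix
    # Solve p0 + t*d0 == p1 + s*d1 for t by 2x2 elimination.
    t = ((x1 - x0) * d1[1] - (y1 - y0) * d1[0]) / denom
    return (x0 + t * d0[0], y0 + t * d0[1])

# Cameras in two corners of a hypothetical 100x80 surface sighting one touch.
print(triangulate((0, 0), 36.87, (100, 0), 153.43))   # approx (40.0, 30.0)
```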


The operation of system 100 (FIG. 1A) is now described with the aid of flow diagram 300 illustrated in FIG. 10. At step 302 an optical signal is transmitted into planar shaped medium 102 (FIG. 1A), where the planar shaped medium may form a display portion of a media presentation system such as passive touch system 250 (FIG. 6). The optical signal is totally internally reflected within the planar shaped medium 102.


At step 304, an optical sensing device such as one or more optical detectors and/or one or more camera devices 130, 132 (FIG. 1A) is positioned to substantially face a side location of the planar shaped medium and adapted to receive optical signals from the top-surface 104 (FIG. 1A) of the planar shaped medium 102. The side location of the planar shaped medium is generally the area or region surrounding the periphery, such as edge facets 108, 109, 110, and 112 (FIG. 1A). If other shaped planar media such as those illustrated and described in association with FIGS. 4 and 5A-5C are utilized, the side location of the planar shaped media would likewise be the area or region surrounding their edge periphery, such as any one of the edge facets.


At step 306, once an object such as a user's finger or other pointer device contacts the top-surface of the planar shaped medium 102, a portion of the optical signal that is totally internally reflected within the planar medium 102 is emitted from the contact location based on the change in refractive index introduced by the contacting object. The magnitude of emitted light may depend on the surface pressure applied by the object at the contact location and the material used to apply the contact. For example, an increased pressure by the object at the contact location may increase the magnitude of optical signal emitted from the contact location. Also, the use of different materials to apply the contact may increase or decrease the amount of emitted optical signal.


At step 308, once the portion of the optical signal is emitted from the top-surface 104 based on the applied contact (step 306), the positioned optical sensing device (step 304) receives the emitted optical signal. In order to increase the detection capabilities of the optical sensing device with respect to background optical reflections, ambient light changes, or any other factors that may create a false indication of a detected optical signal, the optical source 118 (FIG. 1A) that transmits the optical signal into the planar medium 102 may be modulated and/or encoded using known techniques. By modulating and/or encoding the transmitted optical signal, the received emitted optical signal will also be modulated and/or encoded. Upon reception and processing, the modulated and/or encoded emitted optical signal facilitates distinguishing an actual emitted optical signal from spurious optical signals or intensity level changes and therefore increases the signal-to-noise ratio of the system 100 (FIG. 1A). For example, the optical source may be encoded with a binary sequence using on-off keying (OOK). The optical signal may also be intensity modulated, frequency modulated, or phase modulated. In another example, the optical source may be encoded or modulated using a Pseudo Random Binary Sequence (PRBS) generator.
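
The benefit of encoding can be sketched as follows: modulate the source with a known binary sequence and correlate the detector samples against it, so that uncorrelated ambient light averages out. The PRBS7 generator polynomial x^7 + x^6 + 1 used here is a common choice but an assumption; no particular sequence is specified above.

```python
import numpy as np

def prbs7(n: int) -> np.ndarray:
    """n bits of a PRBS7 sequence (generator polynomial x^7 + x^6 + 1)."""
    state, bits = 0x7F, []
    for _ in range(n):
        newbit = ((state >> 6) ^ (state >> 5)) & 1
        state = ((state << 1) | newbit) & 0x7F
        bits.append(newbit)
    return np.array(bits)

code = prbs7(127) * 2 - 1                  # map {0,1} -> {-1,+1}
rng = np.random.default_rng(0)
ambient = rng.normal(0.0, 1.0, code.size)  # uncorrelated background light

received_touch = 0.5 * code + ambient      # emitted FTIR light carries the code
received_idle = ambient                    # no contact: background only

def score(samples: np.ndarray) -> float:
    """Normalized correlation of detector samples with the known code."""
    return float(np.dot(samples, code)) / code.size

print(f"with touch:    {score(received_touch):+.2f}")   # near +0.5
print(f"without touch: {score(received_idle):+.2f}")    # near 0
```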


If at step 310 it is determined that the optical sensing device is a camera device such as devices 130 and 132 (FIG. 1A), camera pixel information associated with the detected optical signal emitted from the planar medium 102 is captured and processed by a processor device such as the camera DSPs 284 (FIG. 8). Pointer information packets (PIPs) associated with the object contacting the location on the planar medium 102 are generated by and sent from the camera DSPs 284 to the DSP 290 or second processor device within master controller 254 (step 312). At the master controller 254, triangulation techniques may be used in conjunction with the PIPs received from the camera devices 130, 132 in order to generate coordinate information associated with the location or point(s) of contact of the object with top-surface 104 (step 314).


If at step 310 it is determined that the optical sensing device is one or more photodetectors, the detected signal associated with the emitted optical signal may be processed in order to decode the detected signal (step 316). For example, if a user contacts the top-surface 104 a few times in succession, the resultant successive detections of increased optical intensity by the photodetector may be processed by the master controller 254 (step 316). Responsive to this processing, one or more predetermined events such as launching an application program on computer 256 (FIG. 6) may be initiated (step 318). Thus, by encoding the contact sequence, various events or processes may be identified and executed.
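
A hypothetical decoder for such tap sequences is sketched below; the window length and the mapping from tap counts to actions are invented for illustration and are not specified above.

```python
TAP_WINDOW_S = 0.6            # assumed max spacing between taps in one gesture

# Hypothetical bindings from tap count to a predetermined event.
ACTIONS = {2: "launch_application", 3: "next_slide"}

def decode_taps(timestamps):
    """Count taps whose spacing stays within TAP_WINDOW_S and look up the
    bound action; returns None for an unbound or empty pattern."""
    if not timestamps:
        return None
    count = 1
    for prev, cur in zip(timestamps, timestamps[1:]):
        count = count + 1 if cur - prev <= TAP_WINDOW_S else 1
    return ACTIONS.get(count)

print(decode_taps([0.00, 0.35]))        # -> launch_application
print(decode_taps([0.00, 0.35, 0.70]))  # -> next_slide
```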


Other characteristics of the optical signal emitted from the top-surface 104 may be detected and decoded in response to the application of one or more contacts to the top-surface 104. For example, changes in the intensity of the emitted optical signal as a function of the pressure applied to a top-surface location by the object 124 (FIG. 1A), the simultaneous application of a plurality of objects (e.g., two, three, or more of the user's fingers) to the top-surface 104, and/or the successive application of contact (e.g., two or more taps) to one or more locations on the top-surface 104 may be decoded for initiating a predetermined response.


Although preferred embodiments of the present invention have been described, those of skill in the art will appreciate that variations and modifications may be made without departing from the spirit and scope thereof as defined by the appended claims.

Claims
  • 1. A system for detecting contact on a display, the system comprising: a planar medium associated with the display and including at least one edge facet and opposing surfaces;at least one optical source operatively coupled to the at least one edge facet for transmitting an optical signal into the planar medium such that the transmitted optical signal is totally internally reflected between the at least one edge facet and opposing surfaces; andan optical sensing device positioned to substantially face at least a portion of the at least one edge facet and adapted to detect at least one object contacting a first surface of the opposing surfaces, wherein the optical sensing device is operative to detect a portion of the optical signal emitted from the first surface at a location corresponding to the at least one object contacting the first surface.
  • 2. The system according to claim 1, wherein the at least one edge facet comprises a single circular shaped edge facet having an optically reflective surface.
  • 3. The system according to claim 1, wherein the at least one edge facet comprises at least one curved edge facet and at least one straight edge facet each having an optically reflective surface.
  • 4. The system according to claim 1, wherein the at least one edge facet comprises a first edge facet, a second edge facet, a third edge facet, and a fourth edge facet each comprising an optically reflective surface.
  • 5. The system according to claim 1, wherein the optical sensing device comprises at least one of the group consisting of at least one photodetector and at least two camera devices.
  • 6. A system for detecting contact on a display, the system comprising: a planar medium associated with the display and including at least one edge facet and opposing surfaces;at least one optical source operatively coupled to the at least one edge facet for transmitting an optical signal into the planar medium such that the transmitted optical signal is totally internally reflected between the at least one edge facet and opposing surfaces; andat least two camera devices positioned to substantially face at least a portion of the at least one edge facet and adapted to detect at least one object contacting a first surface of the opposing surfaces, wherein the at least two camera devices are operative to capture images of a portion of the optical signal emitted from the first surface at a location corresponding to the at least one object contacting the first surface.
  • 7. The system according to claim 6, further comprising at least one processor coupled to the at least two camera devices, wherein the at least one processor is adapted to process pixel information corresponding to the at least one object contacting the first surface.
  • 8. The system according to claim 7, further comprising a master controller coupled to the at least one processor, wherein the master controller is adapted to generate location coordinate information corresponding to the at least one object contacting the first surface.
  • 9. The system according to claim 8, further comprising at least one application program operative to receive the location coordinate information for displaying an annotation associated with the at least one object contacting the surface.
  • 10. The system according to claim 8, further comprising at least one application program operative to receive the location coordinate information for executing a response based on the at least one object contacting the surface.
  • 11. The system according to claim 6, wherein the at least one edge facet comprises a single circular shaped edge facet having an optically reflective surface.
  • 12. The system according to claim 6, wherein the at least one edge facet comprises at least one curved edge facet and at least one straight edge facet each having an optically reflective surface.
  • 13. The system according to claim 6, wherein the at least one edge facet comprises a first edge facet, a second edge facet, a third edge facet, and a fourth edge facet each comprising an optically reflective surface.
  • 14. The system according to claim 13, wherein the optically reflective surface comprises at least one of copper reflective tape and silver reflective tape.
  • 15. The system according to claim 6, wherein the planar medium comprises an acrylic sheet coupled to the display, the acrylic sheet having substantially the same shape as the display.
  • 16. The system according to claim 15, wherein the acrylic sheet comprises fire polished acrylic edges.
  • 17. The system according to claim 6, wherein the at least one optical source comprises a light emitting diode (LED).
  • 18. The system according to claim 17, wherein the light emitting diode (LED) comprises a viewing half angle of about 0-90 degrees.
  • 19. The system according to claim 6, wherein the at least one optical source comprises a laser device.
  • 20. The system according to claim 6, wherein the transmitted optical signal comprises an infrared signal.
  • 21. The system according to claim 6, wherein the transmitted optical signal comprises visible light.
  • 22. The system according to claim 6, wherein the at least two camera devices comprise complementary metal oxide semiconductor (CMOS) cameras.
  • 23. The system according to claim 6, wherein the portion of the optical signal emitted from the location on the first surface comprises a frustrated total internal reflection (FTIR) optical signal generated at the location based on the at least one object contacting the location.
  • 24. The system according to claim 6, further comprising a screen associated with the display, wherein the screen is coupled to a second surface of the opposing surfaces.
  • 25. The system according to claim 6, wherein the planar medium comprises an integral component of the display.
  • 26. The system according to claim 6, wherein the at least one object comprises a finger associated with a user interacting with the first surface of the planar medium.
  • 27. The system according to claim 6, wherein the at least one object comprises a cylindrical pen shaped object adapted to be used by a user interacting with the first surface of the planar medium.
  • 28. The system according to claim 6, wherein the transmitted optical signal comprises a modulated signal.
  • 29. The system according to claim 6, wherein the transmitted optical signal comprises an encoded signal.
  • 30. The system according to claim 29, wherein the encoded signal comprises a binary sequence.
  • 31. The system according to claim 28, wherein the modulated signal comprises a pseudo random binary sequence (PRBS) modulated signal.
  • 32. The system according to claim 28, wherein the modulated signal comprises an intensity modulated signal.
  • 33. A method of detecting contact to a display, comprising: transmitting an optical signal into a planar medium associated with the display, wherein within the planar medium the transmitted optical signal is totally internally reflected;positioning an optical sensing device to substantially face a side location associated with the planar medium;contacting a surface location on the first surface using at least one object; anddetecting using the optical sensing device a portion of the optical signal emitted from the surface location based on the object contacting the surface location.
  • 34. A method of detecting contact to a display, comprising: transmitting an optical signal into a planar medium associated with the display, wherein within the planar medium the transmitted optical signal is totally internally reflected;positioning a first camera device to substantially face a first side location associated with the planar medium, wherein the first camera device receives images from a first surface of the planar medium;positioning a second camera device to substantially face a second side location associated with the planar medium, wherein the second camera device receives images from the first surface of the planar medium;contacting a surface location on the first surface using at least one object; andcapturing using the first and second camera images of a portion of the optical signal emitted from the surface location based on the object contacting the surface location.
  • 35. The method according to claim 34, further comprising determining a coordinate position associated with the at least one object at the surface location based on the captured images.
  • 36. The method according to claim 34, wherein the captured images of the portion of the optical signal emitted from the surface location comprises an illuminated region.
  • 37. The method according to claim 34, wherein a second surface of the planar medium is coupled to the display.
  • 38. The method according to claim 34, wherein the portion of the optical signal emitted from the surface location is based on a Frustrated Total Internal Reflection (FTIR) of the optical signal at the contacted surface location.
  • 39. The method according to claim 34, wherein the portion of the optical signal emitted from the surface location is responsive to a change in refractive index associated with the surface location.
  • 40. The method according to claim 34, wherein the first camera includes a first field of view and the second camera includes a second field of view, the first and second field of view having an area of overlap.
  • 41. The method according to claim 40, wherein contacting the surface location on the first surface using the at least one object comprises changing a refractive index value associated with the first surface at the contacted surface location.
  • 42. A passive touch system, comprising: a touch screen having opposing surfaces and adapted to receive an optical signal that is totally internally reflected within the opposing surfaces, wherein upon an object contacting a surface location associated with the opposing surfaces a portion of the optical signal is emitted from the surface location; andat least two cameras associated with the touch surface and positioned substantially at a side location to the touch surface, wherein at the surface location images of the portion of the optical signal emitted from the surface location are captured by the at least two cameras for determining a coordinate position associated with the object contacting the surface location.
  • 43. The system according to claim 42, further comprising at least one first processor operatively coupled to the at least two cameras, the at least one first processor adapted to receive the captured images and generate pixel data associated with the captured images.
  • 44. The system according to claim 43, further comprising a second processor operatively coupled to the at least one first processor, wherein the second processor receives the generated pixel data and generates location coordinate information for the object at the surface location.
  • 45. The system according to claim 42, wherein the at least two cameras comprise complementary metal oxide semiconductor (CMOS) cameras.
US Referenced Citations (454)
Number Name Date Kind
2769374 Sick Nov 1956 A
3025406 Stewart et al. Mar 1962 A
3128340 Harmon Apr 1964 A
3187185 Milnes Jun 1965 A
3360654 Muller Dec 1967 A
3364881 Kool Jan 1968 A
3478220 Milroy Nov 1969 A
3613066 Cooreman Oct 1971 A
3764813 Clement et al. Oct 1973 A
3775560 Ebeling et al. Nov 1973 A
3857022 Rebane et al. Dec 1974 A
3860754 Johnson et al. Jan 1975 A
4107522 Walter Aug 1978 A
4144449 Funk et al. Mar 1979 A
4243879 Carroll et al. Jan 1981 A
4247767 O'Brien et al. Jan 1981 A
4372631 Leon Feb 1983 A
D270788 Umanoff et al. Oct 1983 S
4420261 Barlow et al. Dec 1983 A
4459476 Weissmueller et al. Jul 1984 A
4468694 Edgar Aug 1984 A
4507557 Tsikos Mar 1985 A
4550250 Mueller et al. Oct 1985 A
4553842 Griffin Nov 1985 A
4558313 Garwin et al. Dec 1985 A
D286831 Matyear Nov 1986 S
D290199 Hampshire Jun 1987 S
4672364 Lucas Jun 1987 A
4673918 Adler et al. Jun 1987 A
4703316 Sherbeck Oct 1987 A
4710760 Kasday Dec 1987 A
4737631 Sasaki et al. Apr 1988 A
4742221 Sasaki et al. May 1988 A
4746770 McAvinney May 1988 A
4762990 Caswell et al. Aug 1988 A
4766424 Adler et al. Aug 1988 A
4782328 Denlinger Nov 1988 A
4811004 Person et al. Mar 1989 A
4818826 Kimura Apr 1989 A
4820050 Griffin Apr 1989 A
4822145 Staelin Apr 1989 A
4831455 Ishikawa May 1989 A
4851664 Rieger Jul 1989 A
4868551 Arditty et al. Sep 1989 A
4868912 Doering Sep 1989 A
4888479 Tamaru Dec 1989 A
4893120 Doering et al. Jan 1990 A
D306105 Newhouse Feb 1990 S
4916308 Meadows Apr 1990 A
4928094 Smith May 1990 A
4943806 Masters et al. Jul 1990 A
D312928 Scheffers Dec 1990 S
4980547 Griffin Dec 1990 A
4990901 Beiswenger Feb 1991 A
5025314 Tang et al. Jun 1991 A
5025411 Tallman et al. Jun 1991 A
D318660 Weber Jul 1991 S
5097516 Amir Mar 1992 A
5103085 Zimmerman Apr 1992 A
5105186 May Apr 1992 A
5109435 Lo et al. Apr 1992 A
5130794 Ritchey Jul 1992 A
5140647 Ise et al. Aug 1992 A
5148015 Dolan Sep 1992 A
5162618 Knowles Nov 1992 A
5162783 Moreno Nov 1992 A
5164714 Wehrer Nov 1992 A
5168531 Sigel Dec 1992 A
5179369 Person et al. Jan 1993 A
5196835 Blue et al. Mar 1993 A
5196836 Williams Mar 1993 A
5239152 Caldwell et al. Aug 1993 A
5239373 Tang et al. Aug 1993 A
5272470 Zetts Dec 1993 A
5317140 Dunthorn May 1994 A
5359155 Helser Oct 1994 A
D353368 Poulos Dec 1994 S
5374971 Clapp et al. Dec 1994 A
5414413 Tamaru et al. May 1995 A
5422494 West et al. Jun 1995 A
5448263 Martin Sep 1995 A
5457289 Huang et al. Oct 1995 A
5483261 Yasutake Jan 1996 A
5483603 Luke et al. Jan 1996 A
5484966 Segen Jan 1996 A
5490655 Bates Feb 1996 A
5502568 Ogawa et al. Mar 1996 A
5525764 Junkins et al. Jun 1996 A
5528263 Platzker et al. Jun 1996 A
5528290 Saund Jun 1996 A
5537107 Funado Jul 1996 A
D372601 Roberts et al. Aug 1996 S
5554828 Primm Sep 1996 A
5581276 Cipolla et al. Dec 1996 A
5581637 Cass et al. Dec 1996 A
5591945 Kent Jan 1997 A
5594469 Freeman et al. Jan 1997 A
5594502 Bito et al. Jan 1997 A
5617312 Iura et al. Apr 1997 A
5638092 Eng et al. Jun 1997 A
5670755 Kwon Sep 1997 A
5686942 Ball Nov 1997 A
5698845 Kodama et al. Dec 1997 A
5729704 Stone et al. Mar 1998 A
5734375 Knox et al. Mar 1998 A
5736686 Perret, Jr. et al. Apr 1998 A
5737740 Henderson et al. Apr 1998 A
5739479 Davis-Cannon Apr 1998 A
5745116 Pisutha-Arnond Apr 1998 A
5764223 Chang et al. Jun 1998 A
5771039 Ditzik Jun 1998 A
5784054 Armstrong et al. Jul 1998 A
5785439 Bowen Jul 1998 A
5786810 Knox et al. Jul 1998 A
5790910 Haskin Aug 1998 A
5801704 Oohara et al. Sep 1998 A
5804773 Wilson et al. Sep 1998 A
5818421 Ogino et al. Oct 1998 A
5818424 Korth Oct 1998 A
5819201 DeGraaf Oct 1998 A
5825352 Bisset et al. Oct 1998 A
5831602 Sato et al. Nov 1998 A
5909210 Knox et al. Jun 1999 A
5911004 Ohuchi et al. Jun 1999 A
5914709 Graham et al. Jun 1999 A
5920342 Umeda et al. Jul 1999 A
5936615 Waters Aug 1999 A
5940065 Babb et al. Aug 1999 A
5943783 Jackson Aug 1999 A
5963199 Kato et al. Oct 1999 A
5982352 Pryor Nov 1999 A
5988645 Downing Nov 1999 A
5990874 Tsumura Nov 1999 A
6002808 Freeman Dec 1999 A
6008798 Mato, Jr. et al. Dec 1999 A
6031531 Kimble Feb 2000 A
6061177 Fujimoto May 2000 A
6075905 Herman et al. Jun 2000 A
6076041 Watanabe Jun 2000 A
6091406 Kambara et al. Jul 2000 A
6100538 Ogawa Aug 2000 A
6104387 Chery et al. Aug 2000 A
6118433 Jenkin et al. Sep 2000 A
6122865 Branc et al. Sep 2000 A
6128003 Smith et al. Oct 2000 A
6141000 Martin Oct 2000 A
6147678 Kumar et al. Nov 2000 A
6153836 Goszyk Nov 2000 A
6161066 Wright et al. Dec 2000 A
6179426 Rodriguez, Jr. et al. Jan 2001 B1
6188388 Arita et al. Feb 2001 B1
6191773 Maruno et al. Feb 2001 B1
6208329 Ballare Mar 2001 B1
6208330 Hasegawa et al. Mar 2001 B1
6209266 Branc et al. Apr 2001 B1
6215477 Morrison et al. Apr 2001 B1
6222175 Krymski Apr 2001 B1
6226035 Korein et al. May 2001 B1
6229529 Yano et al. May 2001 B1
6252989 Geisler et al. Jun 2001 B1
6256033 Nguyen Jul 2001 B1
6262718 Findlay et al. Jul 2001 B1
6310610 Beaton et al. Oct 2001 B1
6320597 Ieperen Nov 2001 B1
6323846 Westerman et al. Nov 2001 B1
6326954 Van Ieperen Dec 2001 B1
6328270 Elberbaum Dec 2001 B1
6335724 Takekawa et al. Jan 2002 B1
6337681 Martin Jan 2002 B1
6339748 Hiramatsu Jan 2002 B1
6346966 Toh Feb 2002 B1
6352351 Ogasahara et al. Mar 2002 B1
6353434 Akebi et al. Mar 2002 B1
6359612 Peter et al. Mar 2002 B1
6362468 Murakami et al. Mar 2002 B1
6377228 Jenkin et al. Apr 2002 B1
6384743 Vanderheiden May 2002 B1
6414671 Gillespie et al. Jul 2002 B1
6414673 Wood et al. Jul 2002 B1
6421042 Omura et al. Jul 2002 B1
6427389 Branc et al. Aug 2002 B1
6429856 Omura et al. Aug 2002 B1
6429857 Masters et al. Aug 2002 B1
D462346 Abboud Sep 2002 S
D462678 Abboud Sep 2002 S
6480187 Sano et al. Nov 2002 B1
6496122 Sampsell Dec 2002 B2
6497608 Ho et al. Dec 2002 B2
6498602 Ogawa Dec 2002 B1
6504532 Ogasahara et al. Jan 2003 B1
6507339 Tanaka Jan 2003 B1
6512838 Rafii et al. Jan 2003 B1
6517266 Saund Feb 2003 B2
6518600 Shaddock Feb 2003 B1
6522830 Yamagami Feb 2003 B2
6529189 Colgan et al. Mar 2003 B1
6530664 Vanderwerf et al. Mar 2003 B2
6531999 Trajkovic Mar 2003 B1
6532006 Takekawa et al. Mar 2003 B1
6540366 Keenan et al. Apr 2003 B2
6540679 Slayton et al. Apr 2003 B2
6545669 Kinawi et al. Apr 2003 B1
6545670 Pryor Apr 2003 B1
6559813 DeLuca et al. May 2003 B1
6563491 Omura May 2003 B1
6567078 Ogawa May 2003 B2
6567121 Kuno May 2003 B1
6570103 Saka et al. May 2003 B1
6570612 Saund et al. May 2003 B1
6577299 Schiller et al. Jun 2003 B1
6587099 Takekawa Jul 2003 B2
6590568 Astala et al. Jul 2003 B1
6594023 Omura et al. Jul 2003 B1
6597348 Yamazaki et al. Jul 2003 B1
6597508 Seino et al. Jul 2003 B2
6603867 Sugino et al. Aug 2003 B1
6608619 Omura et al. Aug 2003 B2
6608636 Roseman Aug 2003 B1
6614422 Rafii et al. Sep 2003 B1
6624833 Kumar et al. Sep 2003 B1
6626718 Hiroki Sep 2003 B2
6630922 Fishkin et al. Oct 2003 B2
6633328 Byrd et al. Oct 2003 B1
6650318 Arnon Nov 2003 B1
6650822 Zhou Nov 2003 B1
6674424 Fujioka Jan 2004 B1
6683584 Ronzani et al. Jan 2004 B2
6690357 Dunton et al. Feb 2004 B1
6690363 Newton Feb 2004 B2
6690397 Daignault, Jr. Feb 2004 B1
6710770 Tomasi et al. Mar 2004 B2
6714311 Hashimoto Mar 2004 B2
6720949 Pryor et al. Apr 2004 B1
6736321 Tsikos et al. May 2004 B2
6738051 Boyd et al. May 2004 B2
6741250 Furlan et al. May 2004 B1
6747636 Martin Jun 2004 B2
6756910 Ohba et al. Jun 2004 B2
6760009 Omura et al. Jul 2004 B2
6760999 Branc et al. Jul 2004 B2
6774889 Zhang et al. Aug 2004 B1
6803906 Morrison et al. Oct 2004 B1
6828959 Takekawa et al. Dec 2004 B2
6864882 Newton Mar 2005 B2
6867886 Lassen Mar 2005 B2
6911972 Brinjes Jun 2005 B2
6919880 Morrison et al. Jul 2005 B2
6927384 Reime et al. Aug 2005 B2
6933981 Kishida et al. Aug 2005 B1
6947032 Morrison et al. Sep 2005 B2
6954197 Morrison et al. Oct 2005 B2
6972401 Akitt et al. Dec 2005 B2
6972753 Kimura et al. Dec 2005 B1
7002555 Jacobsen et al. Feb 2006 B1
7007236 Dempski et al. Feb 2006 B2
7015418 Cahill et al. Mar 2006 B2
7030861 Westerman et al. Apr 2006 B1
7057647 Monroe Jun 2006 B1
7058204 Hildreth et al. Jun 2006 B2
7075054 Iwamoto et al. Jul 2006 B2
7084857 Lieberman et al. Aug 2006 B2
7084868 Farag et al. Aug 2006 B2
7098392 Sitrick et al. Aug 2006 B2
7121470 McCall et al. Oct 2006 B2
7129927 Mattsson Oct 2006 B2
7151533 Van Ieperen Dec 2006 B2
7176904 Satoh Feb 2007 B2
7184030 McCharles et al. Feb 2007 B2
7187489 Miles Mar 2007 B2
7190496 Klug et al. Mar 2007 B2
7202860 Ogawa Apr 2007 B2
7227526 Hildreth et al. Jun 2007 B2
7232986 Worthington et al. Jun 2007 B2
7236162 Morrison et al. Jun 2007 B2
7237937 Kawashima et al. Jul 2007 B2
7242388 Lieberman et al. Jul 2007 B2
7265748 Ryynanen Sep 2007 B2
7268692 Lieberman Sep 2007 B1
7274356 Ung et al. Sep 2007 B2
7283126 Leung Oct 2007 B2
7283128 Sato Oct 2007 B2
7289113 Martin Oct 2007 B2
7302156 Lieberman et al. Nov 2007 B1
7305368 Lieberman et al. Dec 2007 B2
7327376 Shen et al. Feb 2008 B2
7330184 Leung Feb 2008 B2
7333094 Lieberman et al. Feb 2008 B2
7333095 Lieberman et al. Feb 2008 B1
7355593 Hill et al. Apr 2008 B2
7372456 McLintock May 2008 B2
7375720 Tanaka May 2008 B2
RE40368 Arnon Jun 2008 E
D571365 Morelock et al. Jun 2008 S
D571803 Morelock et al. Jun 2008 S
D571804 Morelock et al. Jun 2008 S
7403837 Graiger et al. Jul 2008 B2
7411575 Hill et al. Aug 2008 B2
7414617 Ogawa Aug 2008 B2
7479949 Jobs et al. Jan 2009 B2
7492357 Morrison et al. Feb 2009 B2
7499037 Lube Mar 2009 B2
7515143 Keam Apr 2009 B2
7538759 Newton May 2009 B2
7559664 Walleman et al. Jul 2009 B1
7593593 Wilson Sep 2009 B2
7619617 Morrison et al. Nov 2009 B2
7630002 Jenkins Dec 2009 B2
7692625 Morrison et al. Apr 2010 B2
7705835 Eikman Apr 2010 B2
7710391 Bell et al. May 2010 B2
7728821 Hillis et al. Jun 2010 B2
20010012001 Rekimoto et al. Aug 2001 A1
20010019325 Takekawa Sep 2001 A1
20010022579 Hirabayashi Sep 2001 A1
20010026268 Ito Oct 2001 A1
20010033274 Ong Oct 2001 A1
20010050677 Tosaya Dec 2001 A1
20010055006 Sano et al. Dec 2001 A1
20020008692 Omura et al. Jan 2002 A1
20020015159 Hashimoto Feb 2002 A1
20020036617 Pryor Mar 2002 A1
20020041327 Hildreth et al. Apr 2002 A1
20020050979 Oberoi et al. May 2002 A1
20020064382 Hildreth et al. May 2002 A1
20020067922 Harris Jun 2002 A1
20020075243 Newton Jun 2002 A1
20020080123 Kennedy et al. Jun 2002 A1
20020118177 Newton Aug 2002 A1
20020145595 Satoh Oct 2002 A1
20020163530 Takakura et al. Nov 2002 A1
20030001825 Omura et al. Jan 2003 A1
20030025951 Pollard et al. Feb 2003 A1
20030043116 Morrison et al. Mar 2003 A1
20030046401 Abbott et al. Mar 2003 A1
20030063073 Geaghan et al. Apr 2003 A1
20030071858 Morohoshi Apr 2003 A1
20030085871 Ogawa May 2003 A1
20030095112 Kawano et al. May 2003 A1
20030137494 Tulbert Jul 2003 A1
20030142880 Hyodo Jul 2003 A1
20030151532 Chen et al. Aug 2003 A1
20030151562 Kulas Aug 2003 A1
20030156118 Ayinde Aug 2003 A1
20030161524 King Aug 2003 A1
20030227492 Wilde et al. Dec 2003 A1
20040001144 McCharles et al. Jan 2004 A1
20040012573 Morrison et al. Jan 2004 A1
20040021633 Rajkowski Feb 2004 A1
20040031779 Cahill et al. Feb 2004 A1
20040032401 Nakazawa et al. Feb 2004 A1
20040046749 Ikeda Mar 2004 A1
20040051709 Ogawa et al. Mar 2004 A1
20040108990 Lieberman Jun 2004 A1
20040125086 Hagermoser et al. Jul 2004 A1
20040149892 Akitt et al. Aug 2004 A1
20040150630 Hinckley et al. Aug 2004 A1
20040169639 Pate et al. Sep 2004 A1
20040178993 Morrison et al. Sep 2004 A1
20040178997 Gillespie et al. Sep 2004 A1
20040179001 Morrison et al. Sep 2004 A1
20040189720 Wilson et al. Sep 2004 A1
20040201575 Morrison Oct 2004 A1
20040204129 Payne et al. Oct 2004 A1
20040218479 Iwamoto et al. Nov 2004 A1
20040221265 Leung et al. Nov 2004 A1
20040233235 Rubin et al. Nov 2004 A1
20040252091 Ma et al. Dec 2004 A1
20050052427 Wu et al. Mar 2005 A1
20050057524 Hill et al. Mar 2005 A1
20050077452 Morrison et al. Apr 2005 A1
20050083308 Homer et al. Apr 2005 A1
20050104860 McCreary et al. May 2005 A1
20050110964 Bell May 2005 A1
20050122308 Bell Jun 2005 A1
20050128190 Ryynanen Jun 2005 A1
20050151733 Sander et al. Jul 2005 A1
20050156900 Hill et al. Jul 2005 A1
20050162381 Bell Jul 2005 A1
20050183035 Ringel et al. Aug 2005 A1
20050190162 Newton Sep 2005 A1
20050241929 Auger et al. Nov 2005 A1
20050243070 Ung et al. Nov 2005 A1
20050248539 Morrison et al. Nov 2005 A1
20050248540 Newton Nov 2005 A1
20050270781 Marks Dec 2005 A1
20050276448 Pryor Dec 2005 A1
20060012579 Sato Jan 2006 A1
20060022962 Morrison et al. Feb 2006 A1
20060028456 Kang Feb 2006 A1
20060034486 Morrison et al. Feb 2006 A1
20060044282 Pinhanez et al. Mar 2006 A1
20060114244 Saxena et al. Jun 2006 A1
20060152500 Weng Jul 2006 A1
20060158425 Andrews et al. Jul 2006 A1
20060158437 Blythe et al. Jul 2006 A1
20060170658 Nakamura et al. Aug 2006 A1
20060197749 Popovich Sep 2006 A1
20060202953 Pryor et al. Sep 2006 A1
20060227120 Eikman Oct 2006 A1
20060244734 Hill et al. Nov 2006 A1
20060274067 Hikai Dec 2006 A1
20060279558 Van Delden et al. Dec 2006 A1
20070002028 Morrison et al. Jan 2007 A1
20070019103 Lieberman et al. Jan 2007 A1
20070046775 Ferren et al. Mar 2007 A1
20070075648 Blythe et al. Apr 2007 A1
20070075982 Morrison et al. Apr 2007 A1
20070089915 Ogawa et al. Apr 2007 A1
20070116333 Dempski et al. May 2007 A1
20070126755 Zhang et al. Jun 2007 A1
20070139932 Sun et al. Jun 2007 A1
20070152984 Ording et al. Jul 2007 A1
20070152986 Ogawa et al. Jul 2007 A1
20070165007 Morrison et al. Jul 2007 A1
20070167709 Slayton et al. Jul 2007 A1
20070205994 van Ieperen Sep 2007 A1
20070236454 Ung et al. Oct 2007 A1
20070273842 Morrison Nov 2007 A1
20080029691 Han Feb 2008 A1
20080042999 Martin Feb 2008 A1
20080055262 Wu et al. Mar 2008 A1
20080055267 Wu et al. Mar 2008 A1
20080062140 Hotelling et al. Mar 2008 A1
20080062149 Baruk Mar 2008 A1
20080068352 Worthington et al. Mar 2008 A1
20080083602 Auger et al. Apr 2008 A1
20080084539 Daniel Apr 2008 A1
20080106706 Holmgren et al. May 2008 A1
20080122803 Izadi et al. May 2008 A1
20080129707 Pryor Jun 2008 A1
20080150890 Bell Jun 2008 A1
20080150913 Bell Jun 2008 A1
20080179507 Han Jul 2008 A2
20080234032 de Courssou et al. Sep 2008 A1
20080259050 Lin et al. Oct 2008 A1
20080259052 Lin et al. Oct 2008 A1
20080278460 Arnett et al. Nov 2008 A1
20090027357 Morrison Jan 2009 A1
20090058832 Newton Mar 2009 A1
20090058833 Newton Mar 2009 A1
20090085881 Keam Apr 2009 A1
20090103853 Daniel Apr 2009 A1
20090109180 Do et al. Apr 2009 A1
20090128499 Izadi May 2009 A1
20090146972 Morrison et al. Jun 2009 A1
20090153519 Suarez Rovere Jun 2009 A1
20100001963 Doray et al. Jan 2010 A1
20100020025 Lemort et al. Jan 2010 A1
20100073326 Keam Mar 2010 A1
20100079385 Holmgren Apr 2010 A1
20100079409 Sirotich et al. Apr 2010 A1
20100079493 Tse et al. Apr 2010 A1
20100083109 Tse et al. Apr 2010 A1
20100177049 Levy Jul 2010 A1
Foreign Referenced Citations (155)
Number Date Country
2003233728 Dec 2003 AU
2006243730 Nov 2006 AU
2058219 Apr 1993 CA
2367864 Apr 1993 CA
2219886 Apr 1999 CA
2251221 Apr 1999 CA
2267733 Oct 1999 CA
2268208 Oct 1999 CA
2252302 Apr 2000 CA
2350152 Jun 2001 CA
2412878 Jan 2002 CA
2341918 Sep 2002 CA
2386094 Dec 2002 CA
2372868 Aug 2003 CA
2390503 Dec 2003 CA
2390506 Dec 2003 CA
2432770 Dec 2003 CA
2493236 Dec 2003 CA
2448603 May 2004 CA
2453873 Jul 2004 CA
2460449 Sep 2004 CA
2521418 Oct 2004 CA
2481396 Mar 2005 CA
2491582 Jul 2005 CA
2563566 Nov 2005 CA
2564262 Nov 2005 CA
2501214 Sep 2006 CA
2606863 Nov 2006 CA
2580046 Sep 2007 CA
1310126 Aug 2001 CN
1784649 Jun 2006 CN
101019096 Aug 2007 CN
101023582 Aug 2007 CN
1440539 Sep 2009 CN
3836429 May 1990 DE
19810452 Dec 1998 DE
60124549 Sep 2007 DE
125068 Nov 1984 EP
0279652 Aug 1988 EP
0347725 Dec 1989 EP
420335 Apr 1991 EP
0657841 Jun 1995 EP
0762319 Mar 1997 EP
0829798 Mar 1998 EP
897161 Feb 1999 EP
911721 Apr 1999 EP
1059605 Dec 2000 EP
1262909 Dec 2002 EP
1739528 Jan 2003 EP
1739529 Jan 2003 EP
1315071 May 2003 EP
1420335 May 2004 EP
1450243 Aug 2004 EP
1457870 Sep 2004 EP
1471459 Oct 2004 EP
1517228 Mar 2005 EP
1550940 Jun 2005 EP
1611503 Jan 2006 EP
1674977 Jun 2006 EP
1297488 Nov 2006 EP
1741186 Jan 2007 EP
1766501 Mar 2007 EP
1830248 Sep 2007 EP
1876517 Jan 2008 EP
1877893 Jan 2008 EP
2279823 Sep 2007 ES
1575420 Sep 1980 GB
2176282 May 1986 GB
2204126 Nov 1988 GB
2263765 Aug 1993 GB
2404127 Jan 2005 GB
57-211637 Dec 1982 JP
61-196317 Aug 1986 JP
61-260322 Nov 1986 JP
62-005428 Jan 1987 JP
63-223819 Sep 1988 JP
3-054618 Mar 1991 JP
3244017 Oct 1991 JP
4-350715 Dec 1992 JP
4-355815 Dec 1992 JP
5-181605 Jul 1993 JP
5-189137 Jul 1993 JP
5-197810 Aug 1993 JP
6-110608 Apr 1994 JP
7-110733 Apr 1995 JP
7-230352 Aug 1995 JP
8-016931 Feb 1996 JP
8-108689 Apr 1996 JP
08-205113 Aug 1996 JP
8-240407 Sep 1996 JP
8-315152 Nov 1996 JP
9-091094 Apr 1997 JP
9-224111 Aug 1997 JP
9-319501 Dec 1997 JP
10-105324 Apr 1998 JP
11-051644 Feb 1999 JP
11-064026 Mar 1999 JP
11-085376 Mar 1999 JP
11-110116 Apr 1999 JP
11-203042 Jul 1999 JP
11-212692 Aug 1999 JP
2000-105671 Apr 2000 JP
2000-132340 May 2000 JP
2001-075735 Mar 2001 JP
2001-142642 May 2001 JP
2001-282456 Oct 2001 JP
2001-282457 Oct 2001 JP
2002-055770 Feb 2002 JP
2002-236547 Aug 2002 JP
2003-65716 Mar 2003 JP
2003-158597 May 2003 JP
2003-167669 Jun 2003 JP
2003-173237 Jun 2003 JP
2005-108211 Apr 2005 JP
2005-182423 Jul 2005 JP
2005-202950 Jul 2005 JP
9807112 Feb 1998 WO
9908897 Feb 1999 WO
9921122 Apr 1999 WO
9928812 Jun 1999 WO
9940562 Aug 1999 WO
0124157 Apr 2001 WO
0131570 May 2001 WO
0163550 Aug 2001 WO
0191043 Nov 2001 WO
0203316 Jan 2002 WO
0207073 Jan 2002 WO
0227461 Apr 2002 WO
03104887 Dec 2003 WO
03105074 Dec 2003 WO
2004072843 Aug 2004 WO
2004090706 Oct 2004 WO
2004102523 Nov 2004 WO
2004104810 Dec 2004 WO
2005031554 Apr 2005 WO
2005034027 Apr 2005 WO
2005106775 Nov 2005 WO
2005107072 Nov 2005 WO
2006002544 Jan 2006 WO
2006092058 Sep 2006 WO
2006095320 Sep 2006 WO
2006096962 Sep 2006 WO
2006116869 Nov 2006 WO
2007003196 Jan 2007 WO
2007019600 Feb 2007 WO
2007037809 Apr 2007 WO
2007064804 Jun 2007 WO
2007079590 Jul 2007 WO
2007132033 Nov 2007 WO
2007134456 Nov 2007 WO
2008128096 Oct 2008 WO
2009029764 Mar 2009 WO
2009029767 Mar 2009 WO
2009146544 Dec 2009 WO
2010051633 May 2010 WO
Related Publications (1)
Number Date Country
20090027357 A1 Jan 2009 US