The present invention relates to a touch sensitive screen and in particular to optically detecting the presence of an object by using signal processing.
Touch screens of the prior art take five main forms: resistive, capacitive, surface acoustic wave (SAW), infrared (IR), and optical. Each of these types of touch screen has its own features, advantages and disadvantages.
Resistive is the most common type of touch screen technology. It is a low-cost solution found in many touch screen applications, including hand-held computers, PDAs, consumer electronics, and point-of-sale applications. A resistive touch screen uses a controller and a specifically coated glass overlay on the display face to produce the touch connection. The primary types of resistive overlays are 4-wire, 5-wire, and 8-wire. The 5-wire and 8-wire technologies are more expensive to manufacture and calibrate, while 4-wire provides lower image clarity. Two options are generally given: polished or anti-glare. Polished offers clarity of image, but generally introduces glare. Anti-glare will minimize glare, but will also further diffuse the light, thereby reducing clarity. One benefit of using a resistive display is that it can be accessed with a finger (gloved or not), pen, stylus, or a hard object. However, resistive displays are less effective in public environments due to the degradation in image clarity caused by the layers of resistive film, and their susceptibility to scratching. Despite these trade-offs, the resistive screen is the most popular technology because of its relatively low price (at smaller screen sizes) and its ability to accept a range of input means (fingers, gloves, hard and soft stylus).
Capacitive touch screens are all glass and designed for use in ATMs and similar kiosk-type applications. A small electric current runs across the screen, with circuits located at the corners of the screen to measure the capacitance of a person touching the overlay. Touching the screen interrupts the current and activates the software operating the kiosk. Because the glass and the bezel that mounts it to the monitor can be sealed, the touch screen is both durable and resistant to water, dirt and dust. This makes it commonly used in harsher environments like gaming, vending retail displays, public kiosks and industrial applications. However, the capacitive touch screen is only activated by the touch of a bare human finger; a gloved finger, pen, stylus or hard object will not work. Hence, it is inappropriate for use in many applications, including medical and food preparation.
Surface acoustic wave (SAW) technology provides better image clarity because it uses pure glass construction. A SAW touch screen uses a glass display overlay. Sound waves are transmitted across the surface of the display. Each wave is spread across the screen by bouncing off reflector arrays along the edges of the overlay. Two receivers detect the waves. When the user touches the glass surface, the user's finger absorbs some of the energy of the acoustic wave and the controller circuitry measures the touch location. SAW touch screen technology is used in ATMs, amusement parks, banking and financial applications, and kiosks. The technology cannot be gasket sealed, and hence is not suitable for many industrial or commercial applications. Compared to resistive and capacitive technologies, it provides superior image clarity, resolution, and higher light transmission.
Infrared technology relies on the interruption of an infrared light grid in front of the display screen. The touch frame or opto-matrix frame contains a row of infrared LEDs and phototransistors, each mounted on two opposite sides to create a grid of invisible infrared light. The frame assembly comprises printed wiring boards on which the opto-electronics are mounted, and is concealed behind an infrared-transparent bezel. The bezel shields the opto-electronics from the operating environment while allowing the infrared beams to pass through. The infrared controller sequentially pulses the LEDs to create a grid of infrared light beams. When a stylus, such as a finger, enters the grid, it obstructs the beams. One or more phototransistors detect the absence of light and transmit a signal that identifies the x and y coordinates. Infrared touch screens are often used in manufacturing and medical applications because they can be completely sealed and operated using any number of hard or soft objects. The major issue with infrared is that the touch frame is “seated” slightly above the screen. Consequently, it is susceptible to “early activation” before the finger or stylus has actually touched the screen. The cost to manufacture the infrared bezel is also quite high.
Optical imaging for touch screens uses a combination of line-scan cameras, digital signal processing, front or back illumination and algorithms to determine a point of touch. The imaging lenses image the user's finger, stylus or object by scanning along the surface of the display. This type of touch screen is susceptible to false readings due to moving shadows and bright lights and also requires that the screen be touched before a reading is taken. Attempts have been made to overcome these disadvantages. Touch screens using optical imaging technology are disclosed in the following publications.
A touch screen using digital ambient light sampling is disclosed in U.S. Pat. No. 4,943,806. In particular, this patent discloses a touch input device that continuously samples and stores ambient light readings and compares these with previously taken readings, to minimise the effect of bright light and shadows.
A touch screen for use with a computer system is disclosed in U.S. Pat. No. 5,914,709. In particular, a user input device sensitive to touch is disclosed that uses threshold adjustment processing. A light intensity value is read and an “ON” threshold is established; this threshold measurement and adjustment is frequently and periodically performed.
U.S. Pat. No. 5,317,140 discloses a method for optically determining the position and direction of an object on a touch screen display. In particular, a diffuser is positioned over the light sources to produce an average light intensity over the touch screen.
U.S. Pat. No. 5,698,845 discloses a touch screen display that uses an optical detection apparatus to modulate the ON/OFF frequency of light emitters at a frequency of twice the commercial AC line source. The receiver determines the presence of light and compares this to the actual signal transmitted.
U.S. Pat. No. 4,782,328 discloses a touch screen that uses a photosensor unit positioned at a predetermined height above the touch screen; when a pointer nears the touch screen, it is sensed by the rays of ambient light it reflects or shadows.
U.S. Pat. No. 4,868,551 discloses a touch screen that can detect a pointer near the surface of the display by detecting light reflected by the pointer (reflected or diffusive).
It is an object of the present invention to provide a touch sensitive screen which goes some way towards overcoming the above-mentioned disadvantages or which will at least provide the public with a useful choice.
Accordingly in a first aspect the invention may broadly be said to consist in a touch display comprising:
a screen for a user to touch and view an image on or through;
light sources at one or more edges of said screen, said light sources directing light across the surface of said screen;
at least two cameras having outputs, each said camera located at the periphery of said screen to image the space in front of said screen, said output including a scanned image;
means for processing said outputs to detect the level of light, said light including:
direct light from said light sources, and/or
reflected light from said light sources; and
a processor receiving the processed outputs of said cameras, said processor employing triangulation techniques and said processed outputs to determine whether the processed outputs indicate the presence of an object proximate to said screen and if so the location of said object.
Preferably said processed output indicates the relative bearing of a presumed object location relative to said camera.
Preferably said processed output indicates the relative bearing of a presumed object location relative to the centre of the lens of said camera.
Preferably said processor determines location of said object as a planar screen co-ordinate.
Preferably said light sources are behind said screen, arranged to project light through said screen, and said display includes, at each edge having a light source, light deflectors in front of said screen directing light emitted from said light sources across the surface of said screen.
Preferably said cameras are line scan cameras, said camera output including information on line scanned and said processor using said information in determining location of said object.
Preferably said touch display including:
means for modulating said light from said light sources to provide a frequency band within the imageable range of said cameras;
means for excluding image data outside said frequency band.
Preferably said means for processing said outputs includes said means for excluding image data outside said frequency band, and said means for excluding image data outside said frequency band includes filtering.
Preferably said filtering includes applying a filter selected from the group consisting of:
a comb filter;
a high pass filter;
a notch filter; and
a band pass filter.
Preferably said touch display including:
means for controlling said light sources; and
means for taking and processing an image taken in a non lighted ambient light state and in a lighted state;
wherein said means for processing said outputs subtracts the ambient state from the lighted state before detecting the level of light.
Preferably said light sources are LEDs and said touch display includes means for controlling the operation of sections of said light source independent of other sections of said light source.
Preferably means for controlling the operation of sections of said light source includes means for independently controlling the effective intensity of said light source.
Preferably said means for controlling sections of said light source comprises wiring said sections in antiphase and driving using a bridge drive.
Preferably means for controlling sections of said light source comprises using a diagonal bridge drive.
Preferably said means for controlling sections of said light source comprises using a shift register for each section to be controlled.
Preferably said means for taking and processing images includes controlling sections of said light sources and each said camera and said means for processing said outputs includes processing information on whether a said section is lighted or not.
Preferably some sections are lighted and others are not when an image is taken.
Accordingly in a second aspect the invention may broadly be said to consist in a touch display comprising:
a screen for a user to touch and view an image on or through;
light sources at one or more edges of said screen, said light sources directing light across the surface of said screen;
at least two cameras having outputs located at the periphery of said screen, said cameras located so as not to receive direct light from said light sources, each said camera imaging the space in front of said screen, said output including a scanned image;
means for processing said outputs to detect level of reflected light; and
a processor receiving the processed outputs of said cameras, said processor employing triangulation techniques and said processed outputs to determine whether the processed outputs indicate the presence of an object proximate to said screen and if so the location of said object.
Preferably said processed output indicates the relative bearing of a presumed object location relative to said camera.
Preferably said processed output indicates the relative bearing of a presumed object location relative to the centre of the lens of said camera.
Preferably said processor determines location of said object as a planar screen co-ordinate.
Preferably said touch display including:
means for modulating said light from said light sources to provide a frequency band within the imageable range of said cameras;
means for excluding image data outside said frequency band.
Preferably said means for processing said outputs includes said means for excluding image data outside said frequency band, and said means for excluding image data outside said frequency band includes filtering.
Preferably filtering includes applying a filter selected from the group consisting of:
a comb filter;
a high pass filter;
a notch filter; and
a band pass filter.
Preferably said touch display including:
means for controlling said light sources; and
means for taking and processing an image taken in a non lighted ambient light state and in a lighted state;
wherein said means for processing said outputs subtracts the ambient state from the lighted state before detecting the level of light.
Preferably said light sources are LEDs and said touch display includes means for controlling the operation of sections of said light source independent of other sections of said light source.
Preferably means for controlling the operation of sections of said light source includes means for independently controlling the effective intensity of said light source.
Preferably the means for controlling sections of said light source comprises wiring said sections in antiphase and driving using a bridge drive.
Preferably the means for controlling sections of said light source comprises using a diagonal bridge drive.
Preferably the means for controlling sections of said light source comprises using a shift register for each section to be controlled.
Preferably said means for taking and processing images includes controlling sections of said light sources and each said camera and said means for processing said outputs includes processing information on whether a said section is lighted or not.
Preferably some sections are lighted and others are not when an image is taken.
Preferably said screen is reflective, said camera further images said screen, and said means for processing outputs detects the level of light from the mirror image.
Preferably said processed output indicates the relative bearing of a presumed object relative to said camera and the distance of said object from said screen.
Accordingly in a third aspect the invention may broadly be said to consist in a method of receiving user inputs in reference to an image including the steps of:
providing a screen for a user to touch and view an image on or through;
providing light sources at one or more edges of said screen, said light sources directing light across the surface of said screen;
providing at least two cameras having outputs, each said camera located at the periphery of said screen to image the space in front of said screen, said output including a scanned image;
processing said outputs to detect the level of light, said light including:
direct light from said light sources, and/or
reflected light from said light sources;
processing the processed outputs of said cameras, using triangulation techniques to obtain the location of said object.
Preferably said processed output indicates the relative bearing of a presumed object location relative to a said camera.
Preferably said processed output indicates the relative bearing of a presumed object location relative to the centre of the lens of said camera.
Preferably said location is a planar screen co-ordinate.
Preferably said light sources are behind said screen and arranged to project light through said screen, and said display includes, at each edge having a light source, light deflectors in front of said screen directing light emitted from said light sources across the surface of said screen.
Preferably said cameras are line scan cameras, said camera output including information on line scanned and said processor using said information in determining location of said object.
Preferably said method including the steps of:
modulating said light from said light sources to provide a frequency band within the imageable range of said cameras;
excluding image data outside said frequency band.
Preferably the step of processing said outputs includes the step of excluding image data outside said frequency band, and said step of excluding image data outside said frequency band includes filtering.
Preferably filtering includes the step of applying a filter selected from the group consisting of:
a comb filter;
a high pass filter;
a notch filter; and
a band pass filter.
Preferably said method including the steps of:
controlling said light sources; and
taking and processing an image taken in a non lighted ambient light state and in a lighted state;
wherein said step of processing said outputs subtracts the ambient state from the lighted state before detecting the level of light.
Preferably said light sources are LEDs and said method includes controlling the operation of sections of said light source independently of other sections of said light source.
Preferably the step of controlling the operation of sections of said light source includes independently controlling the effective intensity of said light source.
Preferably the step of controlling sections of said light source comprises wiring said sections in antiphase and driving using a bridge drive.
Preferably the step of controlling sections of said light source comprises using a diagonal bridge drive.
Preferably the step of controlling sections of said light source comprises using a shift register for each section to be controlled.
Preferably the step of taking and processing images includes controlling sections of said light sources and each said camera and said step of processing said outputs includes processing information on whether a said section is lighted or not.
Preferably some sections are lighted and others are not when an image is taken.
Accordingly in a fourth aspect the invention may broadly be said to consist in a method of receiving user inputs in reference to an image including the steps of:
providing a screen for a user to touch and view an image on or through;
providing light sources at one or more edges of said screen, said light sources directing light across the surface of said screen;
providing at least two cameras having outputs located at the periphery of said screen, said cameras located so as not to receive direct light from said light sources, each said camera imaging the space in front of said screen, said output including a scanned image;
processing said outputs to detect level of reflected light; and
processing the processed outputs of said cameras, employing triangulation techniques and said processed outputs to determine whether the processed outputs indicate the presence of an object proximate to said screen and if so the location of said object.
Preferably said processed output indicates the relative bearing of a presumed object location relative to said camera.
Preferably said processed output indicates the relative bearing of a presumed object location relative to the centre of the lens of said camera.
Preferably said processor determines location of said object as a planar screen co-ordinate.
Preferably said method includes the steps of:
modulating said light from said light sources to provide a frequency band within the imageable range of said cameras; and
excluding image data outside said frequency band.
Preferably the step of processing said outputs includes the step of excluding image data outside said frequency band, and said step of excluding image data outside said frequency band includes filtering.
Preferably filtering includes applying a filter selected from the group consisting of:
a comb filter;
a high pass filter;
a notch filter; and
a band pass filter.
Preferably said method includes the steps of:
controlling said light sources; and
taking and processing an image taken in a non lighted ambient light state and in a lighted state;
wherein the step of processing said outputs subtracts the ambient state from the lighted state before detecting the level of light.
Preferably said light sources are LEDs and said method includes controlling the operation of sections of said light source independently of other sections of said light source.
Preferably the step of controlling the operation of sections of said light source includes independently controlling the effective intensity of said light source.
Preferably the step of controlling sections of said light source comprises wiring said sections in antiphase and driving using a bridge drive.
Preferably the step of controlling sections of said light source comprises using a diagonal bridge drive.
Preferably the step of controlling sections of said light source comprises using a shift register for each section to be controlled.
Preferably the step of taking and processing images includes controlling sections of said light sources and each said camera, and the step of processing said outputs includes processing information on whether a said section is lighted or not.
Preferably some sections are lighted and others are not when an image is taken.
Preferably said screen is reflective, said camera further images said screen, and the step of processing said outputs detects the level of light from the mirror image.
Preferably said processed output indicates the relative bearing of a presumed object relative to said camera and the distance of said object from said screen.
Accordingly in a fifth aspect the invention may broadly be said to consist in a method of receiving user inputs in reference to an image including the steps of:
providing at least one light source on or adjacent the periphery of said image, said light source directing light across said image;
detecting the level of light at at least two locations on or adjacent the periphery of said image and providing said level as an output;
processing said outputs using triangulation techniques to determine whether said outputs indicate the presence of an object proximate to said image and if so the location of said object.
Preferably said locations are substantially non-opposite so that when an object is present said output is substantially indicative of light reflected from said object.
Accordingly in a sixth aspect the invention may broadly be said to consist in a user input device for locating an object with reference to an image comprising:
at least one light source at or proximate to the periphery of said image, said light source directing light across said image;
at least one detector having an output, said detector located at or in proximity to said image to image the space in front of said image, said output indicative of a level of light;
a processor receiving said outputs and using triangulation techniques on said outputs to determine the presence of said object and, if so, the location of said object.
One preferred form of the present invention will now be described with reference to the accompanying drawings, in which:
a is an illustration of a cross sectional view through X-X of
b is an illustration of front illumination of the preferred embodiment of the touch screen of the present invention,
a is a block diagram of the filter implementation of the preferred embodiment of the touch screen of the present invention,
b is a diagrammatic illustration of the pixels seen by an area camera and transmitted to the processing module in the preferred embodiment of the present invention,
a is a top view of the determination of the position of an object using the mirrored signal in the preferred embodiment of the touch screen of the present invention,
a is a graph representing in the frequency domain the filters responses on the signal from the imager in the preferred embodiment of the touch screen of the present invention,
b is a graph representing in the frequency domain the separation of the object from the background after two types of filtering in the preferred embodiment of the touch screen of the present invention,
a is an illustration of a cross sectional view through X-X of the alternate embodiment of the touch screen of the present invention,
b is an illustration of rear illumination of the alternate embodiment of the touch screen of the present invention,
c is an illustration of rear illumination controlling the sense height of the alternate embodiment of the present invention,
d is a diagrammatic illustration of the pixels seen by a line scan camera and transmitted to the processing module in the alternate embodiment of the present invention.
The present invention relates to improvements in signal processing in the field of optical imaging touch screens. In the preferred embodiment the optical touch screen uses front illumination and comprises a screen, a series of light sources, and at least two area scan cameras located in the same plane and at the periphery of the screen. In another embodiment, the optical touch screen uses backlight illumination; the screen is surrounded by an array of light sources located behind the touch panel whose light is redirected across the surface of the touch panel. At least two line scan cameras are used in the same plane as the touch screen panel. The signal processing improvements created by these implementations are that an object can be sensed when in close proximity to the surface of the touch screen, calibration is simple, and the sensing of an object is not affected by changing ambient light conditions, for example moving lights or shadows.
A block diagram of a general touch screen system 1 is shown in
Front Illumination Touch Screen
The preferred embodiment of the touch screen of the present invention is shown in
Referring to
Mirrored Signal
The mirrored signal occurs when the object 7 nears the touch panel 3. The touch panel 3 is preferably made from glass which has reflective properties. As shown in
A section of the processing module 10 is shown in
Referring back to
The mirrored signal also provides information about the position of the finger 7 in relation to the cameras 6. It can determine the height 8 of the finger 7 above the panel 3 and its angular position. The information gathered from the mirrored signal is enough to determine where the finger 7 is in relation to the panel 3 without the finger 7 having to touch the panel 3.
a show the positional information that can be obtained from the processing of the mirrored signal. The positional information is given in polar co-ordinates. The positional information relates to the height of the finger 7, and the position of the finger 7 over the panel 3.
Referring again to
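The geometry of the mirrored signal can be sketched numerically. In the simple model below, the camera sits a known height above the reflective panel and reports the bearings of the finger and of its mirror image; the function name, co-ordinate frame and all values are illustrative assumptions, not taken from the specification.

```python
import math

def object_from_mirror(cam_height, bearing_direct, bearing_mirror):
    """Recover the finger's distance along the panel and height above it
    from the direct ray and its glass-reflected (mirrored) counterpart.

    Model: camera at (0, cam_height) above the panel plane y = 0; a finger
    at (d, h) produces a mirror image at (d, -h).  Bearings are measured
    from the horizontal, positive upwards.
    """
    t1 = math.tan(bearing_direct)   # equals (h - cam_height) / d
    t2 = math.tan(bearing_mirror)   # equals (-h - cam_height) / d
    d = -2.0 * cam_height / (t1 + t2)
    h = d * (t1 - t2) / 2.0
    return d, h

# A finger 20 units above the panel, 100 units out, seen by a camera 10 units up:
d, h = object_from_mirror(10.0,
                          math.atan2(20.0 - 10.0, 100.0),
                          math.atan2(-20.0 - 10.0, 100.0))
```

Because two bearings and the known camera height over-determine the two unknowns, one camera suffices for range and height; the second camera then fixes the remaining planar co-ordinate.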
Modulating
The processing module 10 modulates and collimates the LEDs 4 and sets a sampling rate. The LEDs 4 are modulated; in the simplest embodiment they are switched on and off at a predetermined frequency. Other types of modulation are possible, for example modulation with a sine wave. Modulating the LEDs 4 at a high frequency results in a frequency reading (when the finger 7 is sensed) that is significantly greater than any other frequencies produced by changing lights and shadows. The modulation frequency is greater than 500 Hz but no more than 10 kHz.
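The on/off modulation described above can be sketched as follows. The 5 kHz modulation frequency and 100 kHz sample rate are illustrative choices within the stated 500 Hz to 10 kHz band, not values from the specification.

```python
import numpy as np

def modulated_led_drive(f_mod=5000.0, f_sample=100000.0, duration=0.01):
    """Square-wave on/off LED drive at f_mod, the simplest modulation
    described above (sine modulation would serve equally well)."""
    t = np.arange(0.0, duration, 1.0 / f_sample)
    return t, 0.5 * (1.0 + np.sign(np.sin(2.0 * np.pi * f_mod * t)))

t, drive = modulated_led_drive()
# The dominant AC component of the drive sits at the modulation frequency,
# well above the slow flicker of ambient lights and moving shadows.
spectrum = np.abs(np.fft.rfft(drive - drive.mean()))
freqs = np.fft.rfftfreq(len(drive), d=t[1] - t[0])
peak_hz = freqs[np.argmax(spectrum)]
```

The spectral peak at the modulation frequency is what later stages look for when deciding whether a finger is present.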
Sampling
The cameras 6 continuously generate an output, which due to data and time constraints is periodically sampled by the processing module 10. In the preferred embodiment, the sampling rate is at least twice the modulation frequency, to avoid aliasing. The modulation of the LEDs and the sampling frequency do not need to be synchronised.
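The requirement that the sampling rate be at least twice the modulation frequency can be illustrated with the standard frequency-folding relation; this is a textbook formula offered for clarity, not code from the specification.

```python
def apparent_frequency(f_signal, f_sample):
    """Frequency at which f_signal appears after sampling at f_sample.
    When f_sample is at least twice f_signal, the signal is unaliased."""
    folded = f_signal % f_sample
    return min(folded, f_sample - folded)

# Sampling a 500 Hz modulation at 1200 Hz (>= 2x) preserves it...
ok = apparent_frequency(500.0, 1200.0)
# ...but sampling at 800 Hz (< 2x) folds it down to a spurious 300 Hz.
aliased = apparent_frequency(500.0, 800.0)
```

An aliased modulation tone could collide with ambient-light flicker frequencies, which is exactly what the 2x rule avoids.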
Filtering
The output in the frequency domain from the scanning imager 13 is shown in
In the preferred embodiment, when there is no object in the field of view, no signal is transmitted to the area camera, so there are no other peaks in the output. When an object is in the field of view, there is a signal 24 corresponding to the LED modulation frequency, for example 500 Hz. The lower unwanted frequencies 22, 23 can be removed by various forms of filters. Types of filters can include comb, high pass, notch, and band pass filters.
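A band-pass stage of the kind listed above might be sketched with a simple FFT mask; the frequencies and amplitudes below are illustrative, and a comb, notch or analogue band-pass filter could serve the same role.

```python
import numpy as np

def bandpass_fft(x, f_sample, f_lo, f_hi):
    """Crude band-pass filter: zero every spectral bin outside [f_lo, f_hi]."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / f_sample)
    X[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(X, n=len(x))

fs = 10000.0
t = np.arange(0.0, 0.1, 1.0 / fs)
ambient = 2.0 + 0.5 * np.sin(2.0 * np.pi * 100.0 * t)  # DC sunlight + mains flicker
touch = 0.2 * np.sin(2.0 * np.pi * 500.0 * t)          # modulated light off a finger
recovered = bandpass_fft(ambient + touch, fs, 400.0, 600.0)
```

After filtering, only the component at the LED modulation frequency survives, so the ambient offset and flicker no longer mask the touch signal.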
In
Once the signal has been filtered and the signal in the area of interest identified, the resulting signal is passed to the comparators to be converted into a digital signal, and triangulation is performed to determine the actual position of the object. Triangulation is known in the prior art and is disclosed in U.S. Pat. No. 5,534,917 and U.S. Pat. No. 4,782,328, which are herein incorporated by reference.
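The triangulation step can be sketched as intersecting the two bearing rays reported by the cameras; the camera positions, angle convention and example point are illustrative assumptions.

```python
import math

def triangulate(cam_a, cam_b, bearing_a, bearing_b):
    """Intersect two bearing rays (radians from the +x axis) cast from
    cameras at known positions on the periphery of the screen."""
    (ax, ay), (bx, by) = cam_a, cam_b
    ta, tb = math.tan(bearing_a), math.tan(bearing_b)
    # Solve ay + ta*(x - ax) = by + tb*(x - bx) for the crossing point.
    x = (by - ay + ta * ax - tb * bx) / (ta - tb)
    return x, ay + ta * (x - ax)

# Cameras at two corners of a 100-unit-wide screen, finger at (50, 30):
x, y = triangulate((0.0, 0.0), (100.0, 0.0),
                   math.atan2(30.0, 50.0), math.atan2(30.0, -50.0))
```

With more than two cameras the same computation can be repeated pairwise and the results averaged, which also rejects the degenerate case of parallel rays.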
Calibration
The preferred embodiment of the touch screen of the present invention uses very quick and easy calibration that allows the touch screen to be used in any situation and moved to new locations, for example when the touch screen is manufactured into a laptop. Calibration involves touching the panel 3 in three different locations 31a, 31b, 31c, as shown in
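Three touch points suffice to determine an affine correction from raw triangulated co-ordinates to true screen co-ordinates, which is one plausible reading of the three-point calibration above; the point values and function names are illustrative.

```python
import numpy as np

def affine_calibration(measured, expected):
    """Solve for the 3x2 matrix M such that [x, y, 1] @ M yields the true
    screen co-ordinate; three non-collinear touch points determine it."""
    A = np.array([[mx, my, 1.0] for mx, my in measured])
    return np.linalg.solve(A, np.array(expected, dtype=float))

def apply_calibration(M, point):
    x, y = point
    return tuple(np.array([x, y, 1.0]) @ M)

# Raw readings distorted by a scale of 2 and an offset of (10, 5):
measured = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
expected = [(10.0, 5.0), (12.0, 5.0), (10.0, 7.0)]
M = affine_calibration(measured, expected)
corrected = apply_calibration(M, (1.0, 1.0))
```

An affine map absorbs translation, scaling and rotation of the camera frame at once, which is why three well-spread touches are enough to recalibrate after the screen is moved.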
Back Illumination Touch Screen
Alternately, the array of lights 42 may be replaced with cold cathode tubes. When using a cold cathode tube, a diffusing plate 43 is not necessary as the outer tube of the cathode tube diffuses the light. The cold cathode tube runs along the entire length of one side of the panel 41. This provides a substantially even light intensity across the surface of the panel 41. Cold cathode tubes are, however, not preferred as they are difficult and expensive to modify to suit the specific length of each side of the panel 41. Using LEDs allows greater flexibility in the size and shape of the panel 41.
The diffusing plate 43 is used when the array of lights 42 consists of numerous LEDs. The plate 43 is used to diffuse the light emitted from an LED and redirect it across the surface of panel 41. As shown in
Referring to
Each line scan camera 44 consists of a CCD element, a lens and driver control circuitry. When an image is seen by a camera 44, a corresponding output signal is generated.
Referring to
The line scan cameras 44 can read two light variables, namely direct light transmitted from the LEDs 42 and reflected light. The method of sensing and reading direct and mirrored light is similar to that previously described, but is simpler, as a line scan camera reads only one column from the panel at once; the image is not broken up into a matrix as when using an area scan camera. This is shown in
In the alternate embodiment, since the bezel surrounds the touch panel, the line scan cameras will continuously read the modulated light transmitted from the LEDs. This results in the modulated frequency being present in the output whenever there is no object to interrupt the light path. When an object interrupts the light path, the modulated frequency in the output will not be present. This indicates that an object is near to or touching the touch panel. The component at the modulation frequency in the output signal has twice the amplitude of that in the preferred embodiment, because both the direct and mirrored signals are present at once.
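Detecting interruption of the direct beam then reduces to measuring the amplitude at the modulation frequency in each pixel's sample train; the sample rate, threshold and signal shapes below are illustrative assumptions.

```python
import numpy as np

def modulation_amplitude(samples, f_sample, f_mod):
    """Amplitude of the f_mod component in one pixel's samples; it drops
    towards zero when an object blocks the light path to that pixel."""
    n = len(samples)
    k = int(round(f_mod * n / f_sample))  # DFT bin holding the modulation
    X = np.fft.rfft(samples - np.mean(samples))
    return 2.0 * np.abs(X[k]) / n

fs, fm = 8000.0, 500.0
t = np.arange(0.0, 0.1, 1.0 / fs)
unblocked = 1.0 + np.sin(2.0 * np.pi * fm * t)  # direct beam reaches the pixel
blocked = np.ones_like(t)                       # a finger interrupts the beam
touched = modulation_amplitude(blocked, fs, fm) < 0.5 * modulation_amplitude(unblocked, fs, fm)
```

Comparing each pixel's modulation amplitude against a fraction of its unblocked baseline gives a per-pixel touch decision that is insensitive to the DC ambient level.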
In a further alternate embodiment, shown in
Calibration of this alternate embodiment is performed in the same manner as previously described but the touch points 31a, 31b, 31c (referring to
In
The backlight switching may advantageously be arranged such that while one section is illuminated, the ambient light level of another section is being measured by the signal processor. By simultaneously measuring ambient and backlit sections, speed is improved over single backlight systems.
The backlight brightness is adaptively adjusted by controlling LED current or pulse duration as each section is activated, so as to use the minimum average power whilst maintaining a constant signal-to-noise-plus-ambient ratio for the pixels that view that section.
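One way to realise such an adjustment is a proportional update of the pulse duration per section; the function name, limits and target ratio are hypothetical, and a real controller would likely add damping.

```python
def adjust_pulse_duration(duration_us, measured_ratio, target_ratio,
                          min_us=1.0, max_us=1000.0):
    """One control step: widen the LED pulse when the measured
    signal-to-(noise + ambient) ratio is below target, narrow it when
    power is being wasted, clamped to the drive's practical limits."""
    scaled = duration_us * (target_ratio / max(measured_ratio, 1e-9))
    return min(max(scaled, min_us), max_us)

# A ratio at half the target doubles the pulse; at double the target it halves.
widened = adjust_pulse_duration(100.0, 5.0, 10.0)
narrowed = adjust_pulse_duration(100.0, 20.0, 10.0)
```

Running one such step per section each time it is activated holds the ratio near target while each section consumes only the power its local ambient conditions demand.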
Control of the plurality of sections with a minimum number of control lines is achieved in one of several ways.
In a first implementation, for a two-section backlight, two groups of diodes can be wired in antiphase and driven with a bridge drive.
In a second implementation, with more than two sections, a diagonal bridge drive is used. For example, four wires can select 1 of 12 sections, five wires can drive 20 sections, and six wires can drive 30 sections.
In a third implementation, for a large number of sections, a shift register is physically distributed around the backlight, and only two control lines are required.
X-Y multiplexing arrangements are well known in the art; for example, 8+4 wires are used to control a 4-digit display with 32 LEDs. In one embodiment, a 4-wire diagonal multiplexing arrangement can drive 12 LEDs. The control lines are driven by tristate outputs such as are commonly found at the pins of microprocessors such as the Microchip PIC family. Each tristate output has two electronic switches, commonly MOSFETs; either switch, or neither, can be turned on. To operate each LED, a unique combination of switches is enabled. This arrangement can be used with any number of control lines, but is particularly advantageous for 4, 5 or 6 control lines, where 12, 20 or 30 LEDs can be controlled whilst the printed circuit board tracking remains simple. Where higher numbers of control lines are used, it may be advantageous to use degenerate forms in which some of the possible LEDs are omitted to ease practical interconnection difficulties.
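The counts above follow from connecting each LED between an ordered pair of control lines, so that n tristate lines address n × (n − 1) LEDs (4 → 12, 5 → 20, 6 → 30). The sketch below enumerates that mapping; the 'H'/'L'/'Z' state encoding is an assumption for illustration, not a drive table from the patent.

```python
from itertools import permutations

def diagonal_mux_states(num_lines):
    """For each LED in a diagonal multiplexing arrangement, return the
    tristate line states that light it.

    Each LED sits between an ordered (anode, cathode) pair of control
    lines, so num_lines lines address num_lines * (num_lines - 1) LEDs.
    To light one LED: drive its anode line high ('H'), its cathode line
    low ('L'), and leave every other line high-impedance ('Z')."""
    led_states = []
    for anode, cathode in permutations(range(num_lines), 2):
        states = ['Z'] * num_lines
        states[anode] = 'H'
        states[cathode] = 'L'
        led_states.append(states)
    return led_states
```

Because only one line sources and one line sinks current at a time, no LED other than the selected one sees a forward voltage, which is why the PCB tracking stays simple.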
The diagonal multiplexing system has the following features:
To those skilled in the art to which the invention relates, many changes in construction and widely differing embodiments and applications of the invention will suggest themselves without departing from the scope of the invention as defined in the appended claims. The disclosures and the descriptions herein are purely illustrative and are not intended to be in any sense limiting.
Number | Date | Country | Kind |
---|---|---|---|
524211 | Feb 2003 | NZ | national |
This application is a continuation of U.S. patent application Ser. No. 11/033,183, filed Jan. 11, 2005, now U.S. Pat. No. 7,629,967 which is a continuation of Application No. PCT NZ2004/000029, published as WO 2004/072843, filed Feb. 16, 2004, which claims priority to NZ Application No. 524211, filed Feb. 14, 2003, each of which is incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
844152 | Little | Feb 1907 | A |
2407680 | Palmquist et al. | Sep 1946 | A |
2769374 | Sick | Nov 1956 | A |
3025406 | Stewart et al. | Mar 1962 | A |
3128340 | Harmon | Apr 1964 | A |
3187185 | Milnes | Jun 1965 | A |
3360654 | Muller | Dec 1967 | A |
3478220 | Milroy | Nov 1969 | A |
3563771 | Tung | Feb 1971 | A |
3613066 | Cooreman | Oct 1971 | A |
3764813 | Clement et al. | Oct 1973 | A |
3775560 | Ebeling et al. | Nov 1973 | A |
3810804 | Rowland | May 1974 | A |
3830682 | Rowland | Aug 1974 | A |
3857022 | Rebane et al. | Dec 1974 | A |
3860754 | Johnson et al. | Jan 1975 | A |
4107522 | Walter | Aug 1978 | A |
4144449 | Funk et al. | Mar 1979 | A |
4243618 | Van Arnam | Jan 1981 | A |
4243879 | Carroll et al. | Jan 1981 | A |
4247767 | O'Brien et al. | Jan 1981 | A |
4329037 | Caviness | May 1982 | A |
4420261 | Barlow et al. | Dec 1983 | A |
4459476 | Weissmueller et al. | Jul 1984 | A |
4468694 | Edgar | Aug 1984 | A |
4486363 | Pricone et al. | Dec 1984 | A |
4507557 | Tsikos | Mar 1985 | A |
4542375 | Alles et al. | Sep 1985 | A |
4550250 | Mueller et al. | Oct 1985 | A |
4553842 | Griffin | Nov 1985 | A |
4558313 | Garwin et al. | Dec 1985 | A |
4601861 | Pricone et al. | Jul 1986 | A |
4672364 | Lucas | Jun 1987 | A |
4673918 | Adler et al. | Jun 1987 | A |
4688933 | Lapeyre | Aug 1987 | A |
4703316 | Sherbeck | Oct 1987 | A |
4710760 | Kasday | Dec 1987 | A |
4737631 | Sasaki et al. | Apr 1988 | A |
4742221 | Sasaki et al. | May 1988 | A |
4746770 | McAvinney | May 1988 | A |
4762990 | Caswell et al. | Aug 1988 | A |
4766424 | Adler et al. | Aug 1988 | A |
4782328 | Denlinger | Nov 1988 | A |
4811004 | Person et al. | Mar 1989 | A |
4818826 | Kimura | Apr 1989 | A |
4820050 | Griffin | Apr 1989 | A |
4822145 | Staelin | Apr 1989 | A |
4831455 | Ishikawa et al. | May 1989 | A |
4851664 | Rieger | Jul 1989 | A |
4868551 | Arditty et al. | Sep 1989 | A |
4868912 | Doering | Sep 1989 | A |
4888479 | Tamaru | Dec 1989 | A |
4893120 | Doering et al. | Jan 1990 | A |
4916308 | Meadows | Apr 1990 | A |
4928094 | Smith | May 1990 | A |
4943806 | Masters et al. | Jul 1990 | A |
4980547 | Griffin | Dec 1990 | A |
4990901 | Beiswenger | Feb 1991 | A |
5025314 | Tang et al. | Jun 1991 | A |
5025411 | Tallman et al. | Jun 1991 | A |
5043751 | Rice | Aug 1991 | A |
5097516 | Amir | Mar 1992 | A |
5103085 | Zimmerman | Apr 1992 | A |
5103249 | Keene | Apr 1992 | A |
5105186 | May | Apr 1992 | A |
5109435 | Lo et al. | Apr 1992 | A |
5130794 | Ritchey | Jul 1992 | A |
5140647 | Ise et al. | Aug 1992 | A |
5148015 | Dolan | Sep 1992 | A |
5162618 | Knowles | Nov 1992 | A |
5162783 | Moreno | Nov 1992 | A |
5164714 | Wehrer | Nov 1992 | A |
5168531 | Sigel | Dec 1992 | A |
5177328 | Ito et al. | Jan 1993 | A |
5179369 | Person et al. | Jan 1993 | A |
5196835 | Blue et al. | Mar 1993 | A |
5196836 | Williams | Mar 1993 | A |
5200851 | Coderre et al. | Apr 1993 | A |
5200861 | Moskovich | Apr 1993 | A |
5233502 | Beatty et al. | Aug 1993 | A |
5239152 | Caldwell et al. | Aug 1993 | A |
5239373 | Tang et al. | Aug 1993 | A |
5272470 | Zetts | Dec 1993 | A |
5317140 | Dunthorn | May 1994 | A |
5359155 | Helser | Oct 1994 | A |
5374971 | Clapp et al. | Dec 1994 | A |
5414413 | Tamaru et al. | May 1995 | A |
5422494 | West et al. | Jun 1995 | A |
5448263 | Martin | Sep 1995 | A |
5457289 | Huang et al. | Oct 1995 | A |
5483261 | Yasutake | Jan 1996 | A |
5483603 | Luke et al. | Jan 1996 | A |
5484966 | Segen | Jan 1996 | A |
5490655 | Bates | Feb 1996 | A |
5502568 | Ogawa et al. | Mar 1996 | A |
5525764 | Junkins et al. | Jun 1996 | A |
5528263 | Platzker et al. | Jun 1996 | A |
5528290 | Saund | Jun 1996 | A |
5537107 | Funado | Jul 1996 | A |
5541372 | Baller et al. | Jul 1996 | A |
5554828 | Primm | Sep 1996 | A |
5581276 | Cipolla et al. | Dec 1996 | A |
5581637 | Cass et al. | Dec 1996 | A |
5591945 | Kent | Jan 1997 | A |
5594469 | Freeman et al. | Jan 1997 | A |
5594502 | Bito et al. | Jan 1997 | A |
5617312 | Iura et al. | Apr 1997 | A |
5638092 | Eng et al. | Jun 1997 | A |
5670755 | Kwon | Sep 1997 | A |
5686942 | Ball | Nov 1997 | A |
5698845 | Kodama et al. | Dec 1997 | A |
5712024 | Okuzaki et al. | Jan 1998 | A |
5729704 | Stone et al. | Mar 1998 | A |
5734375 | Knox et al. | Mar 1998 | A |
5736686 | Perret, Jr. et al. | Apr 1998 | A |
5737740 | Henderson et al. | Apr 1998 | A |
5739479 | Davis-Cannon et al. | Apr 1998 | A |
5745116 | Pisutha-Arnond | Apr 1998 | A |
5764223 | Chang et al. | Jun 1998 | A |
5771039 | Ditzik | Jun 1998 | A |
5784054 | Armstrong et al. | Jul 1998 | A |
5785439 | Bowen | Jul 1998 | A |
5786810 | Knox et al. | Jul 1998 | A |
5790910 | Haskin | Aug 1998 | A |
5801704 | Oohara et al. | Sep 1998 | A |
5804773 | Wilson et al. | Sep 1998 | A |
5818421 | Ogino et al. | Oct 1998 | A |
5818424 | Korth | Oct 1998 | A |
5819201 | DeGraaf | Oct 1998 | A |
5825352 | Bisset et al. | Oct 1998 | A |
5831602 | Sato et al. | Nov 1998 | A |
5877459 | Prater | Mar 1999 | A |
5909210 | Knox et al. | Jun 1999 | A |
5911004 | Ohuchi et al. | Jun 1999 | A |
5914709 | Graham et al. | Jun 1999 | A |
5920342 | Umeda et al. | Jul 1999 | A |
5936615 | Waters | Aug 1999 | A |
5936770 | Nestegard et al. | Aug 1999 | A |
5940065 | Babb et al. | Aug 1999 | A |
5943783 | Jackson | Aug 1999 | A |
5963199 | Kato et al. | Oct 1999 | A |
5982352 | Pryor | Nov 1999 | A |
5988645 | Downing | Nov 1999 | A |
5990874 | Tsumura et al. | Nov 1999 | A |
6002808 | Freeman | Dec 1999 | A |
6008798 | Mato, Jr. et al. | Dec 1999 | A |
6015214 | Heenan et al. | Jan 2000 | A |
6020878 | Robinson | Feb 2000 | A |
6031524 | Kunert | Feb 2000 | A |
6031531 | Kimble | Feb 2000 | A |
6061177 | Fujimoto | May 2000 | A |
6067080 | Holtzman | May 2000 | A |
6075905 | Herman et al. | Jun 2000 | A |
6076041 | Watanabe | Jun 2000 | A |
6091406 | Kambara et al. | Jul 2000 | A |
6100538 | Ogawa | Aug 2000 | A |
6104387 | Chery et al. | Aug 2000 | A |
6118433 | Jenkin et al. | Sep 2000 | A |
6122865 | Branc et al. | Sep 2000 | A |
6128003 | Smith et al. | Oct 2000 | A |
6141000 | Martin | Oct 2000 | A |
6147678 | Kumar et al. | Nov 2000 | A |
6153836 | Goszyk | Nov 2000 | A |
6161066 | Wright et al. | Dec 2000 | A |
6179426 | Rodriguez, Jr. et al. | Jan 2001 | B1 |
6188388 | Arita et al. | Feb 2001 | B1 |
6191773 | Maruno et al. | Feb 2001 | B1 |
6208329 | Ballare | Mar 2001 | B1 |
6208330 | Hasegawa et al. | Mar 2001 | B1 |
6209266 | Branc et al. | Apr 2001 | B1 |
6215477 | Morrison et al. | Apr 2001 | B1 |
6222175 | Krymski | Apr 2001 | B1 |
6226035 | Korein et al. | May 2001 | B1 |
6229529 | Yano et al. | May 2001 | B1 |
6252989 | Geisler et al. | Jun 2001 | B1 |
6256033 | Nguyen | Jul 2001 | B1 |
6262718 | Findlay et al. | Jul 2001 | B1 |
6285359 | Ogasawara et al. | Sep 2001 | B1 |
6310610 | Beaton et al. | Oct 2001 | B1 |
6320597 | Ieperen | Nov 2001 | B1 |
6323846 | Westerman et al. | Nov 2001 | B1 |
6326954 | Van Ieperen | Dec 2001 | B1 |
6328270 | Elberbaum | Dec 2001 | B1 |
6335724 | Takekawa et al. | Jan 2002 | B1 |
6337681 | Martin | Jan 2002 | B1 |
6339748 | Hiramatsu | Jan 2002 | B1 |
6346966 | Toh | Feb 2002 | B1 |
6352351 | Ogasahara et al. | Mar 2002 | B1 |
6353434 | Akebi et al. | Mar 2002 | B1 |
6359612 | Peter et al. | Mar 2002 | B1 |
6362468 | Murakami et al. | Mar 2002 | B1 |
6377228 | Jenkin et al. | Apr 2002 | B1 |
6384743 | Vanderheiden | May 2002 | B1 |
6406758 | Bottari et al. | Jun 2002 | B1 |
6414671 | Gillespie et al. | Jul 2002 | B1 |
6414673 | Wood et al. | Jul 2002 | B1 |
6421042 | Omura et al. | Jul 2002 | B1 |
6427389 | Branc et al. | Aug 2002 | B1 |
6429856 | Omura et al. | Aug 2002 | B1 |
6429857 | Masters et al. | Aug 2002 | B1 |
6480187 | Sano et al. | Nov 2002 | B1 |
6496122 | Sampsell | Dec 2002 | B2 |
6497608 | Ho et al. | Dec 2002 | B2 |
6498602 | Ogawa | Dec 2002 | B1 |
6501461 | Holtzman | Dec 2002 | B2 |
6504532 | Ogasahara et al. | Jan 2003 | B1 |
6507339 | Tanaka | Jan 2003 | B1 |
6512838 | Rafii et al. | Jan 2003 | B1 |
6517266 | Saund | Feb 2003 | B2 |
6518600 | Shaddock | Feb 2003 | B1 |
6518960 | Omura et al. | Feb 2003 | B2 |
6522830 | Yamagami | Feb 2003 | B2 |
6529189 | Colgan et al. | Mar 2003 | B1 |
6530664 | Vanderwerf et al. | Mar 2003 | B2 |
6531999 | Trajkovic | Mar 2003 | B1 |
6532006 | Takekawa et al. | Mar 2003 | B1 |
6537673 | Sada et al. | Mar 2003 | B2 |
6540366 | Keenan et al. | Apr 2003 | B2 |
6540679 | Slayton et al. | Apr 2003 | B2 |
6545669 | Kinawi et al. | Apr 2003 | B1 |
6559813 | DeLuca et al. | May 2003 | B1 |
6563491 | Omura | May 2003 | B1 |
6567078 | Ogawa | May 2003 | B2 |
6567121 | Kuno | May 2003 | B1 |
6570103 | Saka et al. | May 2003 | B1 |
6570612 | Saund et al. | May 2003 | B1 |
6577299 | Schiller et al. | Jun 2003 | B1 |
6587099 | Takekawa | Jul 2003 | B2 |
6590568 | Astala et al. | Jul 2003 | B1 |
6594023 | Omura et al. | Jul 2003 | B1 |
6597348 | Yamazaki et al. | Jul 2003 | B1 |
6597508 | Seino et al. | Jul 2003 | B2 |
6603867 | Sugino et al. | Aug 2003 | B1 |
6608619 | Omura et al. | Aug 2003 | B2 |
6614422 | Rafii et al. | Sep 2003 | B1 |
6624833 | Kumar et al. | Sep 2003 | B1 |
6626718 | Hiroki | Sep 2003 | B2 |
6630922 | Fishkin et al. | Oct 2003 | B2 |
6633328 | Byrd et al. | Oct 2003 | B1 |
6650318 | Arnon | Nov 2003 | B1 |
6650822 | Zhou | Nov 2003 | B1 |
6664952 | Iwamoto et al. | Dec 2003 | B2 |
6670985 | Karube et al. | Dec 2003 | B2 |
6674424 | Fujioka | Jan 2004 | B1 |
6683584 | Ronzani et al. | Jan 2004 | B2 |
6690357 | Dunton et al. | Feb 2004 | B1 |
6690363 | Newton | Feb 2004 | B2 |
6690397 | Daignault, Jr. | Feb 2004 | B1 |
6710770 | Tomasi et al. | Mar 2004 | B2 |
6714311 | Hashimoto | Mar 2004 | B2 |
6720949 | Pryor et al. | Apr 2004 | B1 |
6727885 | Ishino et al. | Apr 2004 | B1 |
6736321 | Tsikos et al. | May 2004 | B2 |
6738051 | Boyd et al. | May 2004 | B2 |
6741250 | Furlan et al. | May 2004 | B1 |
6741267 | Leperen | May 2004 | B1 |
6747636 | Martin | Jun 2004 | B2 |
6756910 | Ohba et al. | Jun 2004 | B2 |
6760009 | Omura et al. | Jul 2004 | B2 |
6760999 | Branc et al. | Jul 2004 | B2 |
6767102 | Heenan et al. | Jul 2004 | B1 |
6774889 | Zhang et al. | Aug 2004 | B1 |
6803906 | Morrison et al. | Oct 2004 | B1 |
6828959 | Takekawa et al. | Dec 2004 | B2 |
6864882 | Newton | Mar 2005 | B2 |
6909425 | Matsuda et al. | Jun 2005 | B2 |
6911972 | Brinjes | Jun 2005 | B2 |
6919880 | Morrison et al. | Jul 2005 | B2 |
6927384 | Reime et al. | Aug 2005 | B2 |
6933981 | Kishida et al. | Aug 2005 | B1 |
6947029 | Katagiri et al. | Sep 2005 | B2 |
6947032 | Morrison et al. | Sep 2005 | B2 |
6952202 | Hirabayashi | Oct 2005 | B2 |
6954197 | Morrison et al. | Oct 2005 | B2 |
6972401 | Akitt et al. | Dec 2005 | B2 |
6972753 | Kimura et al. | Dec 2005 | B1 |
7002555 | Jacobsen et al. | Feb 2006 | B1 |
7007236 | Dempski et al. | Feb 2006 | B2 |
7015418 | Cahill et al. | Mar 2006 | B2 |
7030861 | Westerman et al. | Apr 2006 | B1 |
7057647 | Monroe | Jun 2006 | B1 |
7058204 | Hildreth et al. | Jun 2006 | B2 |
7075054 | Iwamoto et al. | Jul 2006 | B2 |
7084857 | Lieberman et al. | Aug 2006 | B2 |
7084868 | Farag et al. | Aug 2006 | B2 |
7098392 | Sitrick et al. | Aug 2006 | B2 |
7113174 | Takekawa et al. | Sep 2006 | B1 |
7121470 | McCall et al. | Oct 2006 | B2 |
7133032 | Cok | Nov 2006 | B2 |
7151533 | Van Ieperen | Dec 2006 | B2 |
7176904 | Satoh | Feb 2007 | B2 |
7184030 | McCharles et al. | Feb 2007 | B2 |
7187489 | Miles | Mar 2007 | B2 |
7190496 | Klug et al. | Mar 2007 | B2 |
7202860 | Ogawa | Apr 2007 | B2 |
7227526 | Hildreth et al. | Jun 2007 | B2 |
7230608 | Cok | Jun 2007 | B2 |
7232986 | Worthington et al. | Jun 2007 | B2 |
7236132 | Lin et al. | Jun 2007 | B1 |
7236154 | Kerr et al. | Jun 2007 | B1 |
7236162 | Morrison et al. | Jun 2007 | B2 |
7237937 | Kawashima et al. | Jul 2007 | B2 |
7242388 | Lieberman et al. | Jul 2007 | B2 |
7265748 | Ryynanen | Sep 2007 | B2 |
7268692 | Lieberman | Sep 2007 | B1 |
7274356 | Ung et al. | Sep 2007 | B2 |
7283126 | Leung | Oct 2007 | B2 |
7283128 | Sato | Oct 2007 | B2 |
7289113 | Martin | Oct 2007 | B2 |
7302156 | Lieberman et al. | Nov 2007 | B1 |
7305368 | Lieberman et al. | Dec 2007 | B2 |
7330184 | Leung | Feb 2008 | B2 |
7333094 | Lieberman et al. | Feb 2008 | B2 |
7333095 | Lieberman et al. | Feb 2008 | B1 |
7355593 | Hill et al. | Apr 2008 | B2 |
7372456 | McLintock | May 2008 | B2 |
7375720 | Tanaka | May 2008 | B2 |
RE40368 | Arnon | Jun 2008 | E |
7411575 | Hill et al. | Aug 2008 | B2 |
7414617 | Ogawa | Aug 2008 | B2 |
7432914 | Kobayashi et al. | Oct 2008 | B2 |
7460110 | Ung et al. | Dec 2008 | B2 |
7477241 | Lieberman et al. | Jan 2009 | B2 |
7479949 | Jobs et al. | Jan 2009 | B2 |
7492357 | Morrison et al. | Feb 2009 | B2 |
7499037 | Lube | Mar 2009 | B2 |
7515138 | Sullivan | Apr 2009 | B2 |
7515141 | Kobayashi | Apr 2009 | B2 |
7522156 | Sano et al. | Apr 2009 | B2 |
7538759 | Newton | May 2009 | B2 |
7557935 | Baruch | Jul 2009 | B2 |
7559664 | Walleman et al. | Jul 2009 | B1 |
7619617 | Morrison et al. | Nov 2009 | B2 |
7629967 | Newton | Dec 2009 | B2 |
7692625 | Morrison et al. | Apr 2010 | B2 |
7751671 | Newton et al. | Jul 2010 | B1 |
7755613 | Morrison et al. | Jul 2010 | B2 |
7777732 | Herz et al. | Aug 2010 | B2 |
7781722 | Lieberman et al. | Aug 2010 | B2 |
20010019325 | Takekawa | Sep 2001 | A1 |
20010022579 | Hirabayashi | Sep 2001 | A1 |
20010026268 | Ito | Oct 2001 | A1 |
20010033274 | Ong | Oct 2001 | A1 |
20010048169 | Nilsen et al. | Dec 2001 | A1 |
20010050677 | Tosaya | Dec 2001 | A1 |
20010055006 | Sano et al. | Dec 2001 | A1 |
20020008692 | Omura et al. | Jan 2002 | A1 |
20020015159 | Hashimoto | Feb 2002 | A1 |
20020036617 | Pryor | Mar 2002 | A1 |
20020041327 | Hildreth et al. | Apr 2002 | A1 |
20020050979 | Oberoi et al. | May 2002 | A1 |
20020064382 | Hildreth et al. | May 2002 | A1 |
20020067922 | Harris | Jun 2002 | A1 |
20020075243 | Newton | Jun 2002 | A1 |
20020080123 | Kennedy et al. | Jun 2002 | A1 |
20020118177 | Newton | Aug 2002 | A1 |
20020145595 | Satoh | Oct 2002 | A1 |
20020145596 | Vardi | Oct 2002 | A1 |
20020163505 | Takekawa | Nov 2002 | A1 |
20020163530 | Takakura et al. | Nov 2002 | A1 |
20030001825 | Omura et al. | Jan 2003 | A1 |
20030025951 | Pollard et al. | Feb 2003 | A1 |
20030034439 | Reime et al. | Feb 2003 | A1 |
20030043116 | Morrison et al. | Mar 2003 | A1 |
20030046401 | Abbott et al. | Mar 2003 | A1 |
20030063073 | Geaghan et al. | Apr 2003 | A1 |
20030071858 | Morohoshi | Apr 2003 | A1 |
20030085871 | Ogawa | May 2003 | A1 |
20030095112 | Kawano et al. | May 2003 | A1 |
20030137494 | Tulbert | Jul 2003 | A1 |
20030142880 | Hyodo | Jul 2003 | A1 |
20030147016 | Lin et al. | Aug 2003 | A1 |
20030151532 | Chen et al. | Aug 2003 | A1 |
20030151562 | Kulas | Aug 2003 | A1 |
20030156118 | Ayinde | Aug 2003 | A1 |
20030161524 | King | Aug 2003 | A1 |
20030227492 | Wilde et al. | Dec 2003 | A1 |
20040001144 | McCharles et al. | Jan 2004 | A1 |
20040012573 | Morrison et al. | Jan 2004 | A1 |
20040021633 | Rajkowski | Feb 2004 | A1 |
20040031779 | Cahill et al. | Feb 2004 | A1 |
20040032401 | Nakazawa et al. | Feb 2004 | A1 |
20040046749 | Ikeda | Mar 2004 | A1 |
20040051709 | Ogawa et al. | Mar 2004 | A1 |
20040108990 | Lieberman et al. | Jun 2004 | A1 |
20040125086 | Hagermoser et al. | Jul 2004 | A1 |
20040149892 | Akitt et al. | Aug 2004 | A1 |
20040150630 | Hinckley et al. | Aug 2004 | A1 |
20040169639 | Pate et al. | Sep 2004 | A1 |
20040178993 | Morrison et al. | Sep 2004 | A1 |
20040178997 | Gillespie et al. | Sep 2004 | A1 |
20040179001 | Morrison et al. | Sep 2004 | A1 |
20040189720 | Wilson et al. | Sep 2004 | A1 |
20040201575 | Morrison | Oct 2004 | A1 |
20040204129 | Payne et al. | Oct 2004 | A1 |
20040218479 | Iwamoto et al. | Nov 2004 | A1 |
20040221265 | Leung et al. | Nov 2004 | A1 |
20040252091 | Ma et al. | Dec 2004 | A1 |
20050020612 | Gericke | Jan 2005 | A1 |
20050030287 | Sato | Feb 2005 | A1 |
20050052427 | Wu et al. | Mar 2005 | A1 |
20050057524 | Hill et al. | Mar 2005 | A1 |
20050077452 | Morrison et al. | Apr 2005 | A1 |
20050083308 | Homer et al. | Apr 2005 | A1 |
20050104860 | McCreary et al. | May 2005 | A1 |
20050128190 | Ryynanen | Jun 2005 | A1 |
20050151733 | Sander et al. | Jul 2005 | A1 |
20050156900 | Hill et al. | Jul 2005 | A1 |
20050178953 | Worthington et al. | Aug 2005 | A1 |
20050190162 | Newton | Sep 2005 | A1 |
20050218297 | Suda et al. | Oct 2005 | A1 |
20050241929 | Auger et al. | Nov 2005 | A1 |
20050243070 | Ung et al. | Nov 2005 | A1 |
20050248539 | Morrison et al. | Nov 2005 | A1 |
20050248540 | Newton | Nov 2005 | A1 |
20050270781 | Marks | Dec 2005 | A1 |
20050276448 | Pryor | Dec 2005 | A1 |
20060012579 | Sato | Jan 2006 | A1 |
20060022962 | Morrison et al. | Feb 2006 | A1 |
20060028456 | Kang | Feb 2006 | A1 |
20060033751 | Keely et al. | Feb 2006 | A1 |
20060034486 | Morrison et al. | Feb 2006 | A1 |
20060070187 | Chilson | Apr 2006 | A1 |
20060132432 | Bell | Jun 2006 | A1 |
20060139314 | Bell | Jun 2006 | A1 |
20060152500 | Weng | Jul 2006 | A1 |
20060158437 | Blythe et al. | Jul 2006 | A1 |
20060170658 | Nakamura et al. | Aug 2006 | A1 |
20060197749 | Popovich | Sep 2006 | A1 |
20060202953 | Pryor et al. | Sep 2006 | A1 |
20060202974 | Thielman | Sep 2006 | A1 |
20060227120 | Eikman | Oct 2006 | A1 |
20060232568 | Tanaka et al. | Oct 2006 | A1 |
20060232830 | Kobayashi | Oct 2006 | A1 |
20060244734 | Hill et al. | Nov 2006 | A1 |
20060274067 | Hidai | Dec 2006 | A1 |
20060279558 | Van Delden et al. | Dec 2006 | A1 |
20060284858 | Rekimoto | Dec 2006 | A1 |
20070002028 | Morrison et al. | Jan 2007 | A1 |
20070019103 | Lieberman et al. | Jan 2007 | A1 |
20070059520 | Hatin et al. | Mar 2007 | A1 |
20070075648 | Blythe et al. | Apr 2007 | A1 |
20070075982 | Morrison et al. | Apr 2007 | A1 |
20070089915 | Ogawa et al. | Apr 2007 | A1 |
20070116333 | Dempski et al. | May 2007 | A1 |
20070126755 | Zhang et al. | Jun 2007 | A1 |
20070132742 | Chen et al. | Jun 2007 | A1 |
20070139932 | Sun et al. | Jun 2007 | A1 |
20070152977 | Ng et al. | Jul 2007 | A1 |
20070152984 | Ording et al. | Jul 2007 | A1 |
20070152986 | Ogawa | Jul 2007 | A1 |
20070160362 | Mitsuo et al. | Jul 2007 | A1 |
20070165007 | Morrison et al. | Jul 2007 | A1 |
20070167709 | Slayton et al. | Jul 2007 | A1 |
20070205994 | Van Ieperen | Sep 2007 | A1 |
20070215451 | Sasloff et al. | Sep 2007 | A1 |
20070236454 | Ung et al. | Oct 2007 | A1 |
20070247435 | Benko et al. | Oct 2007 | A1 |
20070273842 | Morrison et al. | Nov 2007 | A1 |
20080012835 | Rimon et al. | Jan 2008 | A1 |
20080029691 | Han | Feb 2008 | A1 |
20080042999 | Martin | Feb 2008 | A1 |
20080055262 | Wu et al. | Mar 2008 | A1 |
20080055267 | Wu et al. | Mar 2008 | A1 |
20080062140 | Hotelling et al. | Mar 2008 | A1 |
20080062149 | Baruk | Mar 2008 | A1 |
20080068352 | Worthington et al. | Mar 2008 | A1 |
20080083602 | Auger et al. | Apr 2008 | A1 |
20080103267 | Hurst et al. | May 2008 | A1 |
20080106706 | Holmgren et al. | May 2008 | A1 |
20080122803 | Izadi et al. | May 2008 | A1 |
20080129707 | Pryor | Jun 2008 | A1 |
20080143682 | Shim et al. | Jun 2008 | A1 |
20080150913 | Bell et al. | Jun 2008 | A1 |
20080158170 | Herz et al. | Jul 2008 | A1 |
20080259050 | Lin et al. | Oct 2008 | A1 |
20080259052 | Lin et al. | Oct 2008 | A1 |
20080259053 | Newton | Oct 2008 | A1 |
20090030853 | De La Motte | Jan 2009 | A1 |
20090058832 | Newton | Mar 2009 | A1 |
20090058833 | Newton | Mar 2009 | A1 |
20090077504 | Bell et al. | Mar 2009 | A1 |
20090122027 | Newton | May 2009 | A1 |
20090135162 | Van De Wijdeven et al. | May 2009 | A1 |
20090141002 | Sohn et al. | Jun 2009 | A1 |
20090146972 | Morrison et al. | Jun 2009 | A1 |
20090207144 | Bridger | Aug 2009 | A1 |
20090213093 | Bridger | Aug 2009 | A1 |
20090213094 | Bridger | Aug 2009 | A1 |
20090219256 | Newton | Sep 2009 | A1 |
20090237376 | Bridger | Sep 2009 | A1 |
20090278816 | Colson | Nov 2009 | A1 |
20090284495 | Geaghan et al. | Nov 2009 | A1 |
20090295755 | Chapman et al. | Dec 2009 | A1 |
20090309844 | Woo et al. | Dec 2009 | A1 |
20090309853 | Hildebrandt et al. | Dec 2009 | A1 |
20100009098 | Bai et al. | Jan 2010 | A1 |
20100045629 | Newton | Feb 2010 | A1 |
20100045634 | Su et al. | Feb 2010 | A1 |
20100079412 | Chiang et al. | Apr 2010 | A1 |
20100085330 | Newton | Apr 2010 | A1 |
20100090987 | Lin et al. | Apr 2010 | A1 |
20100097353 | Newton | Apr 2010 | A1 |
20100103143 | Newton et al. | Apr 2010 | A1 |
20100177052 | Chang et al. | Jul 2010 | A1 |
20100182279 | Juni | Jul 2010 | A1 |
20100193259 | Wassvik | Aug 2010 | A1 |
20100207911 | Newton | Aug 2010 | A1 |
20100225588 | Newton et al. | Sep 2010 | A1 |
20100229090 | Newton et al. | Sep 2010 | A1 |
20100315379 | Allard et al. | Dec 2010 | A1 |
20110019204 | Bridger | Jan 2011 | A1 |
20110050649 | Newton et al. | Mar 2011 | A1 |
20110199335 | Li et al. | Aug 2011 | A1 |
20110199387 | Newton | Aug 2011 | A1 |
20110205151 | Newton et al. | Aug 2011 | A1 |
20110205155 | Newton et al. | Aug 2011 | A1 |
20110205185 | Newton et al. | Aug 2011 | A1 |
20110205186 | Newton et al. | Aug 2011 | A1 |
Number | Date | Country |
---|---|---|
7225001 | Jan 2002 | AU |
2003233728 | Dec 2003 | AU |
2004211738 | Aug 2004 | AU |
2006243730 | Nov 2006 | AU |
2058219 | Apr 1993 | CA |
2367864 | Apr 1993 | CA |
2219886 | Apr 1999 | CA |
2251221 | Apr 1999 | CA |
2267733 | Oct 1999 | CA |
2268208 | Oct 1999 | CA |
2252302 | Apr 2000 | CA |
2412878 | Jan 2002 | CA |
2341918 | Sep 2002 | CA |
2350152 | Dec 2002 | CA |
2386094 | Dec 2002 | CA |
2372868 | Aug 2003 | CA |
2390503 | Dec 2003 | CA |
2390506 | Dec 2003 | CA |
2432770 | Dec 2003 | CA |
2493236 | Dec 2003 | CA |
2448603 | May 2004 | CA |
2453873 | Jul 2004 | CA |
2460449 | Sep 2004 | CA |
2521418 | Oct 2004 | CA |
2481396 | Mar 2005 | CA |
2491582 | Jul 2005 | CA |
2563566 | Nov 2005 | CA |
2564262 | Nov 2005 | CA |
2501214 | Sep 2006 | CA |
2606863 | Nov 2006 | CA |
2580046 | Sep 2007 | CA |
2515955 | Jan 2011 | CA |
1277349 | Dec 2000 | CN |
1407506 | Apr 2003 | CN |
1440539 | Sep 2003 | CN |
1774692 | May 2006 | CN |
1784649 | Jun 2006 | CN |
1310126 | Apr 2007 | CN |
101019096 | Aug 2007 | CN |
101023582 | Aug 2007 | CN |
101663637 | Mar 2010 | CN |
101802759 | Aug 2010 | CN |
101802760 | Aug 2010 | CN |
3836429 | May 1990 | DE |
19810452 | Dec 1998 | DE |
60124549 | Sep 2007 | DE |
102007021537 | Jun 2008 | DE |
0125068 | Nov 1984 | EP |
0181196 | May 1986 | EP |
0279652 | Aug 1988 | EP |
0347725 | Dec 1989 | EP |
0420335 | Apr 1991 | EP |
0657841 | Jun 1995 | EP |
0762319 | Mar 1997 | EP |
0829798 | Mar 1998 | EP |
0843202 | May 1998 | EP |
0897161 | Feb 1999 | EP |
0911721 | Apr 1999 | EP |
1059605 | Dec 2000 | EP |
1262909 | Dec 2002 | EP |
1297488 | Apr 2003 | EP |
1420335 | May 2004 | EP |
1450243 | Aug 2004 | EP |
1457870 | Sep 2004 | EP |
1471459 | Oct 2004 | EP |
1517228 | Mar 2005 | EP |
1550940 | Jul 2005 | EP |
1577745 | Sep 2005 | EP |
1599789 | Nov 2005 | EP |
1611503 | Jan 2006 | EP |
1674977 | Jun 2006 | EP |
1736856 | Dec 2006 | EP |
1739528 | Jan 2007 | EP |
1739529 | Jan 2007 | EP |
1741186 | Jan 2007 | EP |
1759378 | Mar 2007 | EP |
1766501 | Mar 2007 | EP |
1830248 | Sep 2007 | EP |
1877893 | Jan 2008 | EP |
2135155 | Dec 2009 | EP |
2195726 | Jun 2010 | EP |
2250546 | Nov 2010 | EP |
2279823 | Sep 2007 | ES |
2521330 | Aug 1983 | FR |
1575420 | Sep 1980 | GB |
2176282 | Dec 1986 | GB |
2204126 | Nov 1988 | GB |
2263765 | Aug 1993 | GB |
57211637 | Dec 1982 | JP |
58146928 | Sep 1983 | JP |
61196317 | Aug 1986 | JP |
61260322 | Nov 1986 | JP |
62005428 | Jan 1987 | JP |
63223819 | Sep 1988 | JP |
1061736 | Mar 1989 | JP |
1154421 | Jun 1989 | JP |
3054618 | Mar 1991 | JP |
3244017 | Oct 1991 | JP |
4350715 | Dec 1992 | JP |
4355815 | Dec 1992 | JP |
5181605 | Jul 1993 | JP |
5189137 | Jul 1993 | JP |
5197810 | Aug 1993 | JP |
6110608 | Apr 1994 | JP |
7110733 | Apr 1995 | JP |
7160403 | Jun 1995 | JP |
7230352 | Aug 1995 | JP |
8016931 | Feb 1996 | JP |
8108689 | Apr 1996 | JP |
8506193 | Jul 1996 | JP |
8240407 | Sep 1996 | JP |
8315152 | Nov 1996 | JP |
9091094 | Apr 1997 | JP |
9224111 | Aug 1997 | JP |
9319501 | Dec 1997 | JP |
10031546 | Feb 1998 | JP |
10105324 | Apr 1998 | JP |
10162698 | Jun 1998 | JP |
10254623 | Sep 1998 | JP |
11045155 | Feb 1999 | JP |
11051644 | Feb 1999 | JP |
11064026 | Mar 1999 | JP |
11085376 | Mar 1999 | JP |
11110116 | Apr 1999 | JP |
11203042 | Jul 1999 | JP |
11212692 | Aug 1999 | JP |
11338687 | Dec 1999 | JP |
2000105671 | Apr 2000 | JP |
2000132340 | May 2000 | JP |
2000259347 | Sep 2000 | JP |
2001014091 | Jan 2001 | JP |
2001075735 | Mar 2001 | JP |
2001142642 | May 2001 | JP |
2001166874 | Jun 2001 | JP |
2001282445 | Oct 2001 | JP |
2001282456 | Oct 2001 | JP |
2001282457 | Oct 2001 | JP |
2002055770 | Feb 2002 | JP |
2002116428 | Apr 2002 | JP |
2002196874 | Jul 2002 | JP |
2002236547 | Aug 2002 | JP |
2002287886 | Oct 2002 | JP |
2003065716 | Mar 2003 | JP |
2003158597 | May 2003 | JP |
2003167669 | Jun 2003 | JP |
2003173237 | Jun 2003 | JP |
2003303046 | Oct 2003 | JP |
2003533786 | Nov 2003 | JP |
2004030003 | Jan 2004 | JP |
2004502261 | Jan 2004 | JP |
2005108211 | Apr 2005 | JP |
2005182423 | Jul 2005 | JP |
2005202950 | Jul 2005 | JP |
2006522967 | Oct 2006 | JP |
2007536652 | Dec 2007 | JP |
1020050111324 | Nov 2005 | KR |
WO8901677 | Feb 1989 | WO |
WO9807112 | Feb 1998 | WO |
WO9908897 | Feb 1999 | WO |
WO9921122 | Apr 1999 | WO |
WO9928812 | Jun 1999 | WO |
WO9936805 | Jul 1999 | WO |
WO9940562 | Aug 1999 | WO |
WO0021023 | Apr 2000 | WO |
WO0124157 | Apr 2001 | WO |
WO0131570 | May 2001 | WO |
WO0163550 | Aug 2001 | WO |
WO0186586 | Nov 2001 | WO |
WO0191043 | Nov 2001 | WO |
WO0203316 | Jan 2002 | WO |
WO0207073 | Jan 2002 | WO |
WO0208881 | Jan 2002 | WO |
WO0221502 | Mar 2002 | WO |
WO0227461 | Apr 2002 | WO |
WO03104887 | Dec 2003 | WO |
WO03105074 | Dec 2003 | WO |
WO2004072843 | Aug 2004 | WO |
WO2004090706 | Oct 2004 | WO |
WO2004102523 | Nov 2004 | WO |
WO2004104810 | Dec 2004 | WO |
WO2005031554 | Apr 2005 | WO |
WO2005034027 | Apr 2005 | WO |
WO2005106775 | Nov 2005 | WO |
WO2005107072 | Nov 2005 | WO |
WO2005109396 | Nov 2005 | WO |
WO2006002544 | Jan 2006 | WO |
WO2006092058 | Sep 2006 | WO |
WO2006095320 | Sep 2006 | WO |
WO2006096962 | Sep 2006 | WO |
WO2006116869 | Nov 2006 | WO |
WO2007003196 | Jan 2007 | WO |
WO2007019600 | Feb 2007 | WO |
WO2007037809 | Apr 2007 | WO |
WO2007064804 | Jun 2007 | WO |
WO2007079590 | Jul 2007 | WO |
WO2007132033 | Nov 2007 | WO |
WO2007134456 | Nov 2007 | WO |
WO2008007276 | Jan 2008 | WO |
WO2008085789 | Jul 2008 | WO |
WO2008128096 | Oct 2008 | WO |
WO2009029764 | Mar 2009 | WO |
WO2009029767 | Mar 2009 | WO |
WO2009035705 | Mar 2009 | WO |
WO2009102681 | Aug 2009 | WO |
WO2009137355 | Nov 2009 | WO |
WO2009146544 | Dec 2009 | WO |
WO2010039663 | Apr 2010 | WO |
WO2010039932 | Apr 2010 | WO |
WO2010044575 | Apr 2010 | WO |
WO2010051633 | May 2010 | WO |
WO2010110681 | Sep 2010 | WO |
WO2010110683 | Sep 2010 | WO |
Entry |
---|
Japanese Patent Application No. 2007-511305, Office Action, mailed Sep. 6, 2011, Office Action—3 pages, English Translation—4 pages. |
Anon, “SMART Board Specifications Model 680i”, XP7915047 Retrieved from the Internet: URL:http://www2.smarttech.com/kbdoc/74231 [retrieved on Sep. 23, 2010] the whole document, 2008, pp. 1-5. |
Benko, et al., “Precise Selection Techniques for Multi-Touch Screens”, Conference on Human Factors in Computing Systems—Proceedings 2006, 2: 1263-1273. |
Buxton, et al., “Issues and Techniques in Touch-Sensitive Tablet Input”, Computer Graphics, Proceedings of SIGGRAPH'85, 1985, 19(3): 215-223.
Canadian Patent Application No. 2412878, Office Action, mailed May 12, 2009, 4 pages.
Chinese Patent Application No. 200880105040.5, Office Action, at least as early as Aug. 11, 2011, 6 pages. (English Translation Not Available).
“Composite List of Projects 1983 to 1989”, NASA Small Business Innovation Research Program, Aug. 1990, 132 pages.
“Digital Vision Touch Technology”, White Paper, SMART Technologies Inc., Feb. 2003, 10 pages.
European Application No. 02253594.2, European Search Report, mailed Jan. 5, 2006, 3 pages.
European Application No. 03257166.3, Partial European Search Report, mailed May 29, 2006, 4 pages.
European Application No. 04251392.9, European Search Report, mailed Jan. 18, 2007, 3 pages.
European Application No. 04711522.5, Office Action, mailed Jun. 29, 2010, 8 pages.
European Application No. 04711522.5, Office Action, mailed Mar. 22, 2010, 1 page.
European Application No. 04711522.5, Supplementary European Search Report, mailed Mar. 3, 2010, 3 pages.
European Application No. 06019268.9, European Search Report and Search Opinion, mailed Nov. 24, 2006, 5 pages.
European Application No. 06019269.7, European Search Report and Search Opinion, mailed Nov. 23, 2006, 5 pages.
European Application No. 07250888.0, European Search Report and Search Opinion, mailed Jun. 22, 2007, 6 pages.
European Application No. 07701682.2, Supplementary European Search Report and Search Opinion, mailed Dec. 7, 2010, 10 pages.
European Application No. 08745663.8, Office Action, mailed Dec. 27, 2010, 13 pages.
European Application No. 08745663.8, Office Action, mailed Jul. 6, 2010, 6 pages.
Förstner, “On Estimating Rotations”, Institut für Photogrammetrie, Universität Bonn, 12 pages.
Fukushige, et al., “Interactive 3D Pointing Device Using Mirror Reflections”, Graduate School of Engineering, Osaka University, 2006, pp. 231-235.
Funk, “CCDs in optical touch panels deliver high resolution”, Electronic Design, Sep. 27, 1980, pp. 139-143.
Geer, “Will Gesture-Recognition Technology Point the Way?”, Industry Trends, Oct. 2004, pp. 20-23.
Hartley, “Multiple View Geometry in Computer Vision”, Cambridge University Press, first published 2000, reprinted (with corrections) 2001, pp. 70-73, 92-93, and 98-99.
Heddier Electronic, “Store Window Presentations”, Feb. 2, 2011, 2 pages.
Herot, et al., “One-Point Touch Input of Vector Information for Computer Displays”, Architecture Machine Group, Massachusetts Institute of Technology, Cambridge, Massachusetts, Oct. 31, 1977, pp. 210-216.
Herrero, et al., “Background Subtraction Techniques: Systematic Evaluation and Comparative Analysis”, Advanced Concepts for Intelligent Vision Systems, Springer-Verlag Berlin Heidelberg, Sep. 2009, pp. 33-42.
Hu, et al., “Multiple-view 3-D Reconstruction Using a Mirror”, The University of Rochester, May 2005, 14 pages.
International Application No. PCT/CA2001/00980, International Search Report, mailed Oct. 22, 2001, 3 pages.
International Application No. PCT/CA2004/001759, International Search Report and Written Opinion, mailed Feb. 21, 2005, 7 pages.
International Application No. PCT/CA2007/002184, International Search Report, mailed Mar. 13, 2008, 3 pages.
International Application No. PCT/CA2008/001350, International Search Report, mailed Oct. 17, 2008, 5 pages.
International Application No. PCT/CA2009/000733, International Search Report and Written Opinion, mailed Sep. 10, 2009, 6 pages.
International Application No. PCT/CA2010/001085, International Search Report, mailed Oct. 12, 2010, 4 pages.
International Application No. PCT/NZ2004/000029, International Preliminary Report on Patentability, issued May 20, 2005, 21 pages.
International Application No. PCT/NZ2004/000029, International Search Report and Written Opinion, mailed Jun. 10, 2004, 6 pages.
International Application No. PCT/NZ2005/000092, International Preliminary Report on Patentability, completed Dec. 30, 2006, 3 pages.
International Application No. PCT/NZ2005/000092, International Search Report, mailed Sep. 27, 2006, 4 pages.
International Application No. PCT/NZ2010/000049, International Search Report and Written Opinion, mailed Oct. 14, 2010, 12 pages.
International Application No. PCT/NZ2010/000051, International Search Report and Written Opinion, mailed Oct. 5, 2010, 15 pages.
International Application No. PCT/US2008/060102, International Preliminary Report on Patentability, mailed Oct. 22, 2009, 10 pages.
International Application No. PCT/US2008/060102, International Search Report and Written Opinion, mailed Feb. 12, 2009, 20 pages.
International Application No. PCT/US2008/074749, International Preliminary Report on Patentability, issued Mar. 2, 2010, 9 pages.
International Application No. PCT/US2008/074749, International Search Report and Written Opinion, mailed Feb. 11, 2009, 15 pages.
International Application No. PCT/US2008/074755, International Preliminary Report on Patentability, issued Mar. 2, 2010, 8 pages.
International Application No. PCT/US2008/074755, International Search Report and Written Opinion, mailed Jan. 29, 2009, 8 pages.
International Application No. PCT/US2009/030694, International Preliminary Report on Patentability, completed Apr. 26, 2010, 10 pages.
International Application No. PCT/US2009/030694, International Search Report, mailed Aug. 5, 2009, 5 pages.
International Application No. PCT/US2009/033624, International Preliminary Report on Patentability and Written Opinion, issued Aug. 17, 2010, 6 pages.
International Application No. PCT/US2009/033624, International Search Report, mailed Mar. 29, 2010, 3 pages.
International Application No. PCT/US2009/042547, International Preliminary Report on Patentability, mailed Nov. 9, 2010, 6 pages.
International Application No. PCT/US2009/042547, International Search Report and Written Opinion, mailed Sep. 2, 2010, 12 pages.
International Application No. PCT/US2009/058682, International Search Report and Written Opinion, mailed Apr. 27, 2010, 15 pages.
International Application No. PCT/US2009/059193, International Search Report and Written Opinion, mailed Dec. 7, 2009, 15 pages.
International Application No. PCT/US2010/059050, International Search Report and Written Opinion, mailed Mar. 23, 2011, 9 pages.
International Application No. PCT/US2010/059104, International Search Report and Written Opinion, mailed Jun. 6, 2011, 14 pages.
International Application No. PCT/US2010/059078, International Search Report and Written Opinion, mailed Aug. 2, 2011, 17 pages.
“Introducing the NextWindow 1900 Optical Touch Screen”, A NextWindow White Paper, NextWindow Human Touch, May 22, 2007, 13 pages.
INTUIFACE Press Release, “IntuiLab introduces IntuiFace, an interactive table and its application platform”, Nov. 30, 2007, 1 page.
INTUILAB, “Overview Page”, Mar. 9, 2011, 1 page.
Japanese Patent Application No. 2005-000268, Office Action, mailed Jul. 5, 2010, Office Action 3 pages, English Translation 3 pages.
Japanese Patent Application No. 2006-502767, Office Action, mailed Jan. 20, 2009, Office Action 2 pages, English Translation 3 pages.
Japanese Patent Application No. 2006-502767, Office Action, mailed Jun. 22, 2010, Office Action 3 pages, English Translation 4 pages.
Japanese Patent Application No. 2007-511305, Office Action, mailed Feb. 1, 2011, Office Action 2 pages, English Translation 5 pages.
Kanatani, “Camera Calibration”, Geometric Computation for Machine Vision, Oxford Engineering Science Series, 1993, 37(2): 56-63.
Korean Patent Application No. 10-2005-7014885, Office Action, dated Aug. 9, 2010, English Translation 5 pages.
Lane, et al., “Reflective Interaction in Virtual Environments”, Eurographics, 2001, 20(3): 7 pages.
Lo, “Solid-state image sensor: technologies and applications”, SPIE Proceedings, 1998, 3422: 70-80.
Loinaz, et al., “A 200-mW, 3.3-V, CMOS Color Camera IC Producing 352×288 24-b Video at 30 Frames/s”, IEEE Journal of Solid-State Circuits, Dec. 1998, 33(12): 2092-2103.
Piccardi, et al., “Background subtraction techniques: a review”, 2004 IEEE International Conference on Systems, Man and Cybernetics, Oct. 10, 2004, 4: 3099-3104.
Pogue, “The Multi-Touch Screen”, Pogue's Posts, Mar. 27, 2007, 13 pages.
Singapore Patent Application No. 201001122-9, Office Action, dated May 3, 2011, 9 pages.
Tappert, et al., “On-Line Handwriting Recognition—a Survey”, Proceedings of the 9th International Conference on Pattern Recognition (ICPR), Rome, IEEE Computer Society Press, Nov. 14-17, 1988, 2: 1123-1132.
“ThruGlass™ Projected Capacitive Touchscreens Specifications”, MicroTouch, 2000, 4 pages.
“Touch Panel”, Veritas et Visus, Nov. 2005, vol. 1, No. 1.
“Touch Panel”, Veritas et Visus, Dec. 2005, vol. 1, No. 2.
“Touch Panel”, Veritas et Visus, Feb. 2006, vol. 1, No. 3.
“Touch Panel”, Veritas et Visus, Mar. 2006, vol. 1, No. 4.
“Touch Panel”, Veritas et Visus, May 2006, vol. 1, No. 5.
“Touch Panel”, Veritas et Visus, Jun. 2006, vol. 1, No. 6.
“Touch Panel”, Veritas et Visus, Jul. 2006, vol. 1, No. 7.
“Touch Panel”, Veritas et Visus, Aug. 2006, vol. 1, No. 8.
“Touch Panel”, Veritas et Visus, Oct. 2006, vol. 1, No. 9.
“Touch Panel”, Veritas et Visus, Nov. 2006, vol. 1, No. 10.
“Touch Panel”, Veritas et Visus, Dec. 2006, vol. 2, No. 1.
“Touch Panel”, Veritas et Visus, Feb. 2007, vol. 2, No. 2.
“Touch Panel”, Veritas et Visus, Mar. 2007, vol. 2, No. 3.
“Touch Panel”, Veritas et Visus, May 2007, vol. 2, No. 4.
“Touch Panel”, Veritas et Visus, Jul. 2007, vol. 2, No. 5.
“Touch Panel”, Veritas et Visus, Oct. 2007, vol. 2, No. 6.
“Touch Panel”, Veritas et Visus, Jan. 2008, vol. 2, Nos. 7-8.
“Touch Panel”, Veritas et Visus, Mar. 2008, vol. 2, Nos. 9-10.
“Touch Panel”, Veritas et Visus, Aug. 2008, vol. 3, Nos. 1-2.
“Touch Panel”, Veritas et Visus, Nov. 2008, vol. 3, Nos. 3-4.
“Touch Panel”, Veritas et Visus, Jan. 2009, vol. 3, Nos. 5-6.
“Touch Panel”, Veritas et Visus, Mar. 2009, vol. 3, Nos. 7-8.
“Touch Panel”, Veritas et Visus, May 2009, vol. 3, No. 9.
“Touch Panel”, Veritas et Visus, Sep. 2009, vol. 4, Nos. 2-3.
“Touch Panel”, Veritas et Visus, Sep. 2010, vol. 5, Nos. 2-3.
“Touch Panel”, Veritas et Visus, Nov. 2010, vol. 5, No. 4.
Photobit Corporation, “VGA-format CMOS Camera-on-a-Chip for Multimedia Applications”, 1999, 2 pages.
Villamor, et al., “Touch Gesture Reference Guide”, last updated Apr. 15, 2010, 7 pages.
Wang, et al., “Stereo camera calibration without absolute world coordinate information”, SPIE, Jun. 14, 1995, 2620: 655-662.
Wrobel, et al., “Minimum Solutions for Orientation”, Calibration and Orientation of Cameras in Computer Vision, Springer Series in Information Sciences, 2001, 34: 28-33.
Related Publication:

Number | Date | Country
---|---|---
20100090985 A1 | Apr 2010 | US
Related Parent Applications:

Relation | Number | Date | Country
---|---|---|---
Parent | 11033183 | Jan 2005 | US
Child | 12578165 | | US
Parent | PCT/NZ2004/000029 | Feb 2004 | US
Child | 11033183 | | US