Multipoint touch surface controller

Information

  • Patent Grant
  • Patent Number
    11,853,518
  • Date Filed
    Monday, February 8, 2021
  • Date Issued
    Tuesday, December 26, 2023
Abstract
A multipoint touch surface controller is disclosed herein. The controller includes an integrated circuit including output circuitry for driving a capacitive multi-touch sensor and input circuitry for reading the sensor. Also disclosed herein are various noise rejection and dynamic range enhancement techniques that permit the controller to be used with various sensors in various conditions without reconfiguring hardware.
Description
BACKGROUND

There exist today many styles of input devices for performing operations in a computer system. The operations generally correspond to moving a cursor and/or making selections on a display screen. By way of example, the input devices may include buttons or keys, mice, trackballs, touch pads, joysticks, touch screens and the like. Touch pads and touch screens (collectively “touch surfaces”) are becoming increasingly popular because of their ease and versatility of operation as well as their declining price. Touch surfaces allow a user to make selections and move a cursor by simply touching the surface, which may be a pad or the display screen, with a finger, stylus, or the like. In general, the touch surface recognizes the touch and position of the touch and the computer system interprets the touch and thereafter performs an action based on the touch.


Of particular interest are touch screens. Various types of touch screens are described in applicant's co-pending patent application Ser. No. 10/840,862, entitled “Multipoint Touchscreen,” filed May 6, 2004, which is hereby incorporated by reference in its entirety. As noted therein, touch screens typically include a touch panel, a controller and a software driver. The touch panel is generally a clear panel with a touch sensitive surface. The touch panel is positioned in front of a display screen so that the touch sensitive surface covers the viewable area of the display screen. The touch panel registers touch events and sends these signals to the controller. The controller processes these signals and sends the data to the computer system. The software driver translates the touch events into computer events.


There are several types of touch screen technologies including resistive, capacitive, infrared, surface acoustic wave, electromagnetic, near field imaging, etc. Each of these devices has advantages and disadvantages that are taken into account when designing or configuring a touch screen. One problem found in these prior art technologies is that they are only capable of reporting a single point even when multiple objects are placed on the sensing surface. That is, they lack the ability to track multiple points of contact simultaneously. In resistive and traditional capacitive technologies, an average of all simultaneously occurring touch points is determined and a single point which falls somewhere between the touch points is reported. In surface wave and infrared technologies, it is impossible to discern the exact position of multiple touch points that fall on the same horizontal or vertical lines due to masking. In either case, faulty results are generated.


These problems are particularly acute in handheld devices, such as tablet PCs, where one hand is used to hold the tablet and the other is used to generate touch events. For example, as shown in FIGS. 1A and 1B, holding a tablet 2 causes the thumb 3 to overlap the edge of the touch sensitive surface 4 of the touch screen 5. As shown in FIG. 1A, if the touch technology uses averaging, the technique used by resistive and capacitive panels, then a single point that falls somewhere between the thumb 3 of the left hand and the index finger 6 of the right hand would be reported. As shown in FIG. 1B, if the technology uses projection scanning, the technique used by infrared and surface acoustic wave panels, it is hard to discern the exact vertical position of the index finger 6 due to the large vertical component of the thumb 3. The tablet 2 can only resolve the patches shown in gray. In essence, the thumb 3 masks out the vertical position of the index finger 6.


While virtually all commercially available touch screen based systems provide only single point detection and have limited resolution and speed, other products available today are able to detect multiple touch points. Unfortunately, these products only work on opaque surfaces because of the circuitry that must be placed behind the electrode structure. Examples of such products include the Fingerworks series of touch pad products. Historically, the number of points detectable with such technology has been limited by the size of the detection circuitry.


Therefore, what is needed in the art is a multi-touch capable touch screen controller that facilitates the use of transparent touch sensors and provides for a conveniently integrated package.


SUMMARY

A controller for multi-touch touch surfaces is disclosed herein. One aspect of the multi-touch touch surface controller relates to the integration of drive electronics for stimulating the multi-touch sensor and sensing circuits for reading the multi-touch sensor into a single integrated package.


Another aspect of the controller relates to a technique for suppressing noise in the sensor by providing a plurality of stimulus waveforms to the sensor wherein the waveforms have different frequencies. This permits at least one noise-free detection cycle in cases where noise appears at a particular frequency.


Another aspect of the controller relates to a charge amplifier that includes programmable components, namely programmable resistors and capacitors, to allow the circuit to be easily reconfigured to provide optimum sensing configurations for a variety of sensor conditions.


Another aspect of the controller relates to an offset compensation circuit that expands the dynamic range of the controller by eliminating a static portion of the multi-touch surface sensor output, allowing the full dynamic range of the sensing circuitry to be allocated to the changing portions of the output signal.


Another aspect of the controller relates to a demodulation circuit that enhances the noise immunity of the sensor arrangement by application of particular demodulation waveforms known to have particular frequency characteristics.


Another aspect of the controller relates to the application of various algorithms to the sensor outputs obtained from the multiple stimulus frequencies described above to further increase noise immunity of the system.


These and other aspects will be more readily understood by reference to the following detailed description and figures.





BRIEF DESCRIPTION OF THE DRAWINGS


FIGS. 1A and 1B illustrate certain problems with prior art touch screen technologies.



FIG. 2 illustrates a perspective view of a computing device incorporating a multi-touch touch screen and multi-touch touch screen controller according to certain teachings of the present disclosure.



FIG. 3 is a block diagram of a computing device incorporating a multi-touch touch screen and multi-touch touch screen controller according to certain teachings of the present disclosure.



FIGS. 4A and 4B illustrate two possible arrangements of drive and sense electrodes in a multi-touch touch surface.



FIG. 5 is a layer diagram illustrating communication between the multi-touch surface and the host computer device by way of a multi-touch controller incorporating various teachings of the present disclosure.



FIG. 6 is an equivalent circuit showing the output circuitry of the controller, a cell of the multi-touch sensor, and the input circuitry of a multi-touch controller incorporating various teachings of the present disclosure.



FIG. 7 is a circuit schematic of a charge amplifier incorporated in certain embodiments of a multi-touch controller incorporating various teachings of the present disclosure.



FIG. 8 is a block diagram of the multi-touch surface and multi-touch controller system in accordance with various teachings of the present disclosure.



FIG. 9 illustrates the sequence in which drive waveforms of varying frequencies are applied to the multi-touch sensor in accordance with certain teachings of the present disclosure.



FIG. 10 is a block diagram illustrating the input circuitry of a multi-touch controller incorporating certain teachings of the present disclosure.



FIGS. 11A and 11B illustrate various demodulation waveforms together with frequency spectra of their passbands.



FIG. 12 illustrates a sequence of stimulus waveforms, together with a particular demodulation waveform, and the resulting output.



FIG. 13 illustrates the noise rejection technique employed by the majority rules algorithm.



FIG. 14 illustrates a flowchart depicting operation of the controller in accordance with certain teachings of the present disclosure.





DETAILED DESCRIPTION

A multipoint touch screen controller (MTC) is described herein. The following embodiments of the invention, described in terms of devices and applications compatible with computer systems manufactured by Apple Computer, Inc. of Cupertino, Calif., are illustrative only and should not be considered limiting in any respect.



FIG. 2 is a perspective view of a touch screen display arrangement 30. Display arrangement 30 includes a display 34 and a transparent touch screen 36 positioned in front of display 34. Display 34 may be configured to display a graphical user interface (GUI) including perhaps a pointer or cursor as well as other information to the user. Transparent touch screen 36 is an input device that is sensitive to a user's touch, allowing a user to interact with the graphical user interface on display 34. In general, touch screen 36 recognizes touch events on surface 38 of touch screen 36 and thereafter outputs this information to a host device. The host device may, for example, correspond to a computer such as a desktop, laptop, handheld or tablet computer. The host device interprets the touch event and thereafter performs an action based on the touch event.


In contrast to prior art touch screens, touch screen 36 shown herein is configured to recognize multiple touch events that occur simultaneously at different locations on touch sensitive surface 38. That is, touch screen 36 allows for multiple contact points T1-T4 to be tracked simultaneously. Touch screen 36 generates separate tracking signals S1-S4 for each touch point T1-T4 that occurs on the surface of touch screen 36 at the same time. In one embodiment, the number of recognizable touches may be about 15, which allows for a user's 10 fingers and two palms to be tracked along with 3 other contacts. The multiple touch events can be used separately or together to perform singular or multiple actions in the host device. Numerous examples of multiple touch events used to control a host device are disclosed in U.S. Pat. Nos. 6,323,846; 6,888,536; 6,677,932; 6,570,557, and co-pending U.S. patent application Ser. Nos. 11/015,434; 10/903,964; 11/048,264; 11/038,590; 11/228,758; 11/228,700; 11/228,737; 11/367,749, each of which is hereby incorporated by reference in its entirety.



FIG. 3 is a block diagram of a computer system 50, employing a multi-touch touch screen. Computer system 50 may be, for example, a personal computer system such as a desktop, laptop, tablet, or handheld computer. The computer system could also be a public computer system such as an information kiosk, automated teller machine (ATM), point of sale machine (POS), industrial machine, gaming machine, arcade machine, vending machine, airline e-ticket terminal, restaurant reservation terminal, customer service station, library terminal, learning device, etc.


Computer system 50 includes a processor 56 configured to execute instructions and to carry out operations associated with the computer system 50. Computer code and data required by processor 56 are generally stored in storage block 58, which is operatively coupled to processor 56. Storage block 58 may include read-only memory (ROM) 60, random access memory (RAM) 62, hard disk drive 64, and/or removable storage media such as CD-ROM, PC-card, floppy disks, and magnetic tapes. Any of these storage devices may also be accessed over a network. Computer system 50 also includes a display device 68 that is operatively coupled to the processor 56. Display device 68 may be any of a variety of display types including liquid crystal displays (e.g., active matrix, passive matrix, etc.), cathode ray tubes (CRT), plasma displays, etc.


Computer system 50 also includes touch screen 70, which is operatively coupled to the processor 56 by I/O controller 66 and touch screen controller 76. (The I/O controller 66 may be integrated with the processor 56, or it may be a separate component.) In any case, touch screen 70 is a transparent panel that is positioned in front of the display device 68, and may be integrated with the display device 68 or it may be a separate component. Touch screen 70 is configured to receive input from a user's touch and to send this information to the processor 56. In most cases, touch screen 70 recognizes touches and the position and magnitude of touches on its surface.


Better understanding of the interface between the touch sensor and the host computer system may be had with reference to FIG. 5, which is a layer diagram of the system illustrated in FIG. 3. The touch sensor 301 resides at the lowermost layer. In a preferred embodiment, the sensor interfaces with an ASIC (application specific integrated circuit) 305 that stimulates the sensor and reads the raw sensor output as described in more detail below. ASIC 305 interfaces via signaling 306 with a DSP (digital signal processor) and/or microcontroller 307, which generates the capacitance images. Together ASIC 305 and DSP/microcontroller 307 form the multipoint touch screen controller.


DSP/Microcontroller 307 includes an interface 308 for accepting the signaling 306 from ASIC 305, and these signals are then passed to a data capture and error rejection layer 309. Data from this layer may be accessed by a calibration, baseline, and standby processing module, as well as by a feature (i.e., touch point) extraction and compression module. Once the features are extracted they are passed as high-level information to the host computer 302 via interface 303. Interface 303 may be, for example, a USB (universal serial bus) interface. Alternatively, other forms of interface, such as IEEE 1394 (“Firewire”), RS-232 serial interface, SCSI (small computer systems interface), etc., could be used.


The exact physical construction of the sensing device is not necessary for a complete understanding of the touch screen controller disclosed herein. Nonetheless, details of the construction may be understood by reference to the patents and patent applications incorporated by reference above. For purposes of the present description, the sensor may be assumed to be a mutual capacitance device constructed as described below with reference to FIGS. 4A and 4B.


The sensor panel is comprised of a two-layered electrode structure, with driving lines on one layer and sensing lines on the other. In either arrangement, the layers are separated by a dielectric material. In the Cartesian arrangement of FIG. 4A, one layer is comprised of N horizontal, preferably equally spaced row electrodes 81, while the other layer is comprised of M vertical, preferably equally spaced column electrodes 82. In a polar arrangement, illustrated in FIG. 4B, the sensing lines may be concentric circles and the driving lines may be radially extending lines (or vice versa). As will be appreciated by those skilled in the art, other configurations based on an infinite variety of coordinate systems are also possible.


Each intersection 83 represents a pixel and has a characteristic mutual capacitance, CSIG. A grounded object (such as a finger) that approaches a pixel 83 from a finite distance shunts the electric field between the row and column intersection, causing a decrease in the mutual capacitance CSIG at that location. For a typical sensor panel, the signal capacitance CSIG is about 0.75 pF and the change induced by a finger touching a pixel is about 0.25 pF.


The electrode material may vary depending on the application. In touch screen applications, the electrode material may be ITO (Indium Tin Oxide) on a glass substrate. In a touch tablet, which need not be transparent, copper on an FR4 substrate may be used. The number of sensing points 83 may also be widely varied. In touch screen applications, the number of sensing points 83 generally depends on the desired sensitivity as well as the desired transparency of the touch screen 70. More nodes or sensing points generally increases sensitivity, but reduces transparency (and vice versa).


During operation, each row (or column) is sequentially charged by driving it with a predetermined voltage waveform 85 (discussed in greater detail below). The charge capacitively couples to the columns (or rows) at the intersection. The capacitance of each intersection 83 is measured to determine the positions of multiple objects when they touch the touch surface. Sensing circuitry monitors the charge transferred and time required to detect changes in capacitance that occur at each node. The positions where changes occur and the magnitude of those changes are used to identify and quantify the multiple touch events. Driving each row and column and sensing the charge transfer is the function of a multipoint touch screen controller.



FIG. 6 is a simplified diagram of the equivalent mutual capacitance circuit 220 for each coupling node. Mutual capacitance circuit 220 includes a driving line 222 and a sensing line 224 that are spatially separated thereby forming a capacitive coupling node 226. When no object is present, the capacitive coupling at node 226 stays fairly constant. When an object, such as a finger, is placed proximate the node 226, the capacitive coupling through node 226 changes. The object effectively shunts the electric field so that the charge transferred across node 226 is less.


With reference to FIGS. 5 and 8, ASIC 305 generates all the drive waveforms necessary to scan the sensor panel. Specifically, the microprocessor sends a clock signal 321 to set the timing of the ASIC, which in turn generates the appropriate timing waveforms 322 to create the row stimuli to the sensor 301. Decoder 311 decodes the timing signals to drive each row of sensor 301 in sequence. Level shifter 310 converts timing signals 322 from the signaling level (e.g., 3.3V) to the level used to drive the sensor (e.g., 18V).


Each row of the sensor panel is driven in a sequence determined by microprocessor 307. For noise rejection purposes, it is desirable to drive the panel at multiple different frequencies. Noise that exists at a particular drive frequency may not, and likely will not, exist at the other frequencies. In a preferred embodiment, each sensor panel row is stimulated with three bursts of twelve square wave cycles (50% duty cycle, 18V amplitude), while the remaining rows are kept at ground. For better noise rejection, described in greater detail below, the frequency of each burst is different; exemplary burst frequencies are 140 kHz, 200 kHz, and 260 kHz.


During each burst of pulses ASIC 305 takes a measurement of the column electrodes. This process is repeated for all remaining rows in the sensor panel. The results are three images, each image taken at a different stimulus frequency.


Additionally, it is preferable to minimize the amount of stimulus frequency change required for each subsequent burst. Therefore a frequency hopping pattern that minimizes the changes is desirable. FIG. 9 shows one possible frequency hopping pattern. In this arrangement, a first row is driven with a 140 kHz burst, then a 200 kHz burst, and finally a 260 kHz burst. Then a next row is driven with three bursts at 260 kHz, 200 kHz, and 140 kHz, respectively. This particular frequency pattern was chosen to keep changes between frequencies small and to allow the frequency transitions to be smooth and glitch free. However, other frequency hopping arrangements are also possible, including scanning more than three frequencies, scanning the frequencies in a quasi-random sequence rather than the ordered pattern described, and adaptive frequency hopping, in which the scan frequencies are selected based on the noise environment.
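As a rough illustration only, the following Python sketch generates the alternating low-to-high, high-to-low burst schedule described above; the helper name, the number of rows, and the schedule representation are assumptions added for illustration, not details taken from the patent.

```python
# Hypothetical sketch of the alternating frequency-hopping pattern described
# above: even-numbered rows sweep the burst frequencies low-to-high, odd rows
# sweep high-to-low, so consecutive bursts never jump more than one step.
BURST_FREQS_HZ = [140_000, 200_000, 260_000]  # exemplary frequencies from the text

def burst_schedule(num_rows, freqs=BURST_FREQS_HZ):
    """Return a list of (row, frequency) pairs, one entry per burst."""
    schedule = []
    for row in range(num_rows):
        order = freqs if row % 2 == 0 else list(reversed(freqs))
        schedule.extend((row, f) for f in order)
    return schedule

if __name__ == "__main__":
    for row, freq in burst_schedule(4):
        print(f"row {row}: {freq / 1000:.0f} kHz burst")
```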


Turning back to FIG. 6, sensing line 224 is electrically coupled to a capacitive sensing circuit 230. Capacitive sensing circuit 230 detects and quantifies the current change and the position of the node 226 where the current change occurred and reports this information to a host computer. The signal of interest is the capacitance CSIG, which couples charge from RC network A to RC network B. The output from RC network B connects directly to the analog input terminals of ASIC 305. ASIC 305 also uses the clock signal 321 (FIG. 8) from microprocessor 307 (FIG. 8) to time the detection and quantification of the capacitance signals.



FIG. 10 is a block diagram illustrating the input stage of ASIC 305. The input signal is first received by a charge amplifier 401. The charge amplifier performs the following tasks: (1) charge-to-voltage conversion, (2) charge amplification, (3) rejection of stray capacitance present at the column electrode, (4) anti-aliasing, and (5) gain equalization at different frequencies. FIG. 7 is a diagram of one possible charge amplifier 401.


Charge to voltage conversion is performed by a capacitor CFB in the feedback path of an operational amplifier 450. In one embodiment, the feedback capacitor can be programmed with values ranging from 2 to 32 pF, which allows the output voltage level to be adjusted to obtain the best dynamic range for a range of CSIG values. The feedback resistor RFB is also preferably programmable to control the amplifier gain.


Because CSIG varies across a touch surface due to a variety of manufacturing tolerance related factors, it is useful to adjust the charge amplifier feedback capacitance CFB on a per-pixel basis. This allows gain compensation to be performed to optimize the performance of each pixel. In one embodiment, quasi-per-pixel adjustment is performed as follows: the feedback capacitor CFB has its value set by a register known as CFB_REG. The value of CFB_REG is set according to the following equation:

CFB_REG[Y]=CFB_UNIV+CFB[Y]

where Y is an individual pixel within a row, CFB_UNIV is adjusted on a row by row basis, and CFB[Y] is a lookup table loaded at system startup. In alternative arrangements, CFB_UNIV may be constant for all rows, or the CFB[Y] lookup table may be switched out on a row by row basis. Also, although discussed in terms of rows and columns, the adjustment arrangement is equally applicable to non-Cartesian coordinate systems.
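The per-column register computation can be illustrated with a short, hypothetical sketch; the clamping to the 2-32 pF programmable range, the units, and the example table contents are assumptions added for illustration.

```python
# Minimal sketch of the quasi-per-pixel feedback-capacitor adjustment:
# CFB_REG[Y] = CFB_UNIV + CFB[Y]. Units (pF) and table contents are
# illustrative assumptions, not values from the patent.
CFB_MIN_PF, CFB_MAX_PF = 2.0, 32.0   # programmable range mentioned in the text

def cfb_register_for_row(cfb_univ_pf, cfb_lut_pf):
    """Compute the per-column feedback capacitance for one row.

    cfb_univ_pf -- row-wide baseline value, updated on a row-by-row basis
    cfb_lut_pf  -- per-column correction table loaded at startup
    """
    regs = []
    for correction in cfb_lut_pf:
        value = cfb_univ_pf + correction
        # Clamp to the programmable range of the feedback capacitor.
        regs.append(min(max(value, CFB_MIN_PF), CFB_MAX_PF))
    return regs

if __name__ == "__main__":
    lut = [0.0, 0.5, -0.25, 1.0]            # hypothetical per-column corrections
    print(cfb_register_for_row(10.0, lut))  # e.g. [10.0, 10.5, 9.75, 11.0]
```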


Obviously it is desirable to measure CSIG while rejecting as much as possible the effects of any parasitic resistance and capacitance in the physical sensor. Rejection of parasitic resistance and capacitance in the sensor may be accomplished by holding the non-inverting input 451 of amplifier 450 at a constant value, e.g., ground. The inverting input 452 is coupled to the node being measured. As will be appreciated by those skilled in the art, inverting input 452 (connected to the column electrode being measured) is thus held at virtual ground. Therefore any parasitic capacitance present at the column electrode, e.g., PCB stray capacitance or dynamic stray capacitance caused by the user touching the column electrode, is rejected because the net charge of the stray capacitor does not change (i.e., the node is held at virtual ground, so the voltage across the stray capacitance does not change). Therefore the charge amplifier output voltage 453 is only a function of the stimulus voltage, CSIG, and CFB. Because the stimulus voltage and CFB are determined by the controller, CSIG may be readily inferred.
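For illustration, a minimal sketch of the idealized charge-to-voltage relationship implied above (output step of roughly Vstim x CSIG / CFB for an ideal charge amplifier) and its inversion to infer CSIG; the component values in the example are assumptions.

```python
# Hedged sketch of the ideal charge-amplifier relationship described above.
# For an ideal integrator the output step is Vout = Vstim * CSIG / CFB, so
# CSIG can be inferred once Vstim and CFB are known. Values are assumptions.
def charge_amp_output(v_stim, c_sig, c_fb):
    """Ideal charge-amplifier output step (volts)."""
    return v_stim * c_sig / c_fb

def infer_csig(v_out, v_stim, c_fb):
    """Invert the relation to recover the pixel capacitance."""
    return v_out * c_fb / v_stim

if __name__ == "__main__":
    v = charge_amp_output(v_stim=18.0, c_sig=0.75e-12, c_fb=10e-12)
    print(f"output ~{v:.2f} V, inferred CSIG ~{infer_csig(v, 18.0, 10e-12) * 1e12:.2f} pF")
```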


A series resistor 454 between the ASIC input pin 455 and the inverting input 452 of the charge amplifier forms an anti-aliasing filter in combination with the feedback network of RFB and CFB.


The high pass roll off of the charge amplifier is set by the parallel combination of the feedback resistor RFB and the feedback capacitor CFB.


Again with reference to FIG. 10, the output of charge amplifier 401 passes to demodulator 403. Demodulator 403 is a 5-bit quantized continuous time analog (four-quadrant) multiplier. The purpose of demodulator 403 is to reject out-of-band noise sources (from cell phones, microwave ovens, etc.) that are present on the signal entering ASIC 305. The output of the charge amplifier (VSIG) is mixed with a 5-bit quantized waveform that is stored in a lookup table 404. The shape, amplitude, and frequency of the demodulation waveform are determined by programming suitable coefficients into lookup table 404. The demodulation waveform determines the pass band, stop band, stop band ripple, and other characteristics of the mixer. In a preferred embodiment, a Gaussian-shaped sine wave is used as the demodulation waveform. A Gaussian sine wave provides a sharp pass band with reduced stop band ripple.
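A hypothetical sketch of how such a 5-bit quantized, Gaussian-enveloped sine table might be built and applied as a four-quadrant multiply follows; the table length, number of cycles, and envelope width are illustrative assumptions rather than values from the patent.

```python
import math

# Rough sketch (not the patent's implementation) of building a 5-bit quantized,
# Gaussian-enveloped sine demodulation table and mixing it with sampled input.
def gaussian_sine_table(length=64, cycles=12, sigma_frac=0.2):
    """Gaussian-windowed sine, quantized to 5-bit signed steps (-15..+15)."""
    table = []
    for n in range(length):
        t = n / length
        envelope = math.exp(-0.5 * ((t - 0.5) / sigma_frac) ** 2)
        table.append(round(envelope * math.sin(2 * math.pi * cycles * t) * 15))
    return table

def demodulate(samples, table):
    """Four-quadrant multiply of the input samples with the demodulation waveform."""
    return [s * w for s, w in zip(samples, table)]

if __name__ == "__main__":
    demod = gaussian_sine_table()
    in_band = [math.sin(2 * math.pi * 12 * n / 64) for n in range(64)]
    print(sum(demodulate(in_band, demod)))  # large accumulated value: in-band signal passes
```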


Another aspect of demodulator 403 relates to demodulator phase delay adjustment. As can be seen with reference to FIG. 10, the touch surface electrodes can be represented by RC networks (RC Network A and RC Network B) that have a mutual capacitance (CSIG) at the point they intersect. Each RC network constitutes a low pass filter, while CSIG introduces a high pass filter response. Therefore the touch panel looks like a bandpass filter, only allowing signals within a certain frequency range to pass through the panel. This frequency range, i.e., those frequencies that are above the high pass cutoff introduced by CSIG but below the low pass cutoff of RC Networks A and B, determines the stimulus frequencies that may be used to drive the touch panel.


The panel will therefore impose a phase delay on the stimulus waveform passing through it. This phase delay is negligible for traditional opaque touch panels, wherein the electrode structure is typically formed by PCB traces, which contribute negligible resistance. However, for transparent panels, typically constructed using Indium Tin Oxide (ITO) conductive traces, the resistive component may be quite large. This introduces a significant time (phase) delay in the propagation of the stimulus voltage through the panel. This phase delay causes the signal entering the pre-amplifier to lag the demodulation waveform, thereby reducing the dynamic range of the signal coming out of the ADC.


To compensate for this phase delay, a delay clock register (“DCL”, not shown) may be provided, which can be used to delay the demodulation waveform relative to the signal entering the preamplifier therefore compensating for the external panel delay and maximizing the dynamic range. This register is input into the demodulator 403 and simply delays the demodulation waveform by a predetermined amount. The amount may be determined either on startup of the panel by measurement, or may be estimated for the panel as a whole based on known manufacturing characteristics. Each pixel of the touch surface may have its own uniquely determined delay parameter to fully optimize the reading circuitry, or the delay parameter may be determined on a row by row basis. Adjustment would be generally similar to the techniques discussed above for adjustment of the charge amplifier feedback capacitor and the offset compensation voltage.
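A minimal sketch of the delay-clock idea follows, assuming the delay is applied by shifting the demodulation table by a programmable number of samples; the function name and table contents are hypothetical.

```python
# Illustrative sketch of the delay-clock (DCL) idea: the demodulation table is
# shifted by a programmable number of clocks so it lines up with the delayed
# signal arriving from the resistive ITO panel. Names and sizes are assumptions.
def delayed_demod_table(table, dcl_clocks):
    """Return the demodulation waveform delayed by dcl_clocks samples
    (zero-padded at the front so the table length is preserved)."""
    if dcl_clocks <= 0:
        return list(table)
    return [0] * dcl_clocks + list(table[:-dcl_clocks])

if __name__ == "__main__":
    table = [0, 3, 7, 11, 14, 15, 14, 11, 7, 3, 0]
    print(delayed_demod_table(table, 2))  # [0, 0, 0, 3, 7, 11, 14, 15, 14, 11, 7]
```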


The demodulated signal is then passed to offset compensation circuitry. The offset compensation circuitry comprises mixer 402 and programmable offset DAC 405. Mixer 402 takes the output of the demodulator and subtracts an offset voltage (discussed below) to increase the dynamic range of the system.


Offset compensation is necessary because the pixel capacitance CSIG is comprised of a static part and a dynamic part. The static part is a function of sensor construction. The dynamic part is a function of the change of CSIG when the finger approaches the pixel, and is thus the signal of interest. The purpose of the offset compensator is to eliminate or minimize the static component thereby extending the dynamic range of the system.


As noted above, the offset compensation circuitry is comprised of two parts, a programmable offset DAC 405 and a mixer 402. Offset DAC 405 generates a programmable offset voltage from the digital static offset value VOFF_REG. This digital value is converted into a static analog voltage (or current, if operating in the current domain) by the DAC and then mixed (by mixer 403b) with a voltage (or current) set by the absolute value (determined by block 404b) of the demodulation waveform. The result is a rectified version of the demodulation waveform, the amplitude of which is set by the static value of VOFF_REG and the absolute portion of the demodulation waveform currently retrieved from the DMOD lookup table 404. This allows for the right amount of offset compensation for a given portion of the demodulation waveform. Therefore the offset compensation waveform effectively tracks the demodulation waveform.
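The tracking behavior can be sketched as follows, assuming the offset amplitude is simply the static register value scaled by the absolute value of the current demodulation sample; the scaling and units are illustrative assumptions.

```python
# Hedged sketch of the tracking offset described above: the static VOFF_REG
# value is scaled by the absolute value of the current demodulation sample, and
# the resulting rectified waveform is subtracted from the demodulated signal.
def offset_waveform(voff_reg, demod_table, full_scale=15):
    """Rectified offset waveform whose amplitude is set by VOFF_REG."""
    return [voff_reg * abs(w) / full_scale for w in demod_table]

def offset_compensate(demodulated, offsets):
    """Mixer stage: subtract the tracking offset from the demodulated signal."""
    return [d - o for d, o in zip(demodulated, offsets)]

if __name__ == "__main__":
    demod_table = [0, 7, 15, 7, 0, -7, -15, -7]             # hypothetical table
    demodulated = [0.0, 1.4, 3.0, 1.4, 0.0, 1.4, 3.0, 1.4]  # already rectified
    offs = offset_waveform(voff_reg=2.0, demod_table=demod_table)
    print(offset_compensate(demodulated, offs))
```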


As with the charge amplifier feedback capacitor, it is useful to adjust the offset compensation circuitry to account for variations in the individual pixel capacitance due to manufacturing tolerances, etc. The adjustment may be substantially similar to that discussed above with respect to the charge amplifier feedback capacitor. Specifically, the offset voltage value stored in VOFF_REG may be calculated as follows:

VOFF_REG[Y]=VOFF_UNIV+VOFF[Y]

where Y is the individual column within a row, VOFF_UNIV is an offset voltage set on a row by row basis, and VOFF[Y] is a lookup table. Again, the adjustment could be performed on a true pixel by pixel basis or VOFF_UNIV could be a single constant value, depending on a particular implementation. Also, although discussed in terms of rows and columns, the adjustment arrangement is equally applicable to non-Cartesian coordinate systems.


As an alternative to the arrangement described above with respect to FIG. 10, the offset compensation could take place prior to demodulation. In this case, the shape of the offset compensation waveform has to match the waveform coming out of the preamplifier rather than the waveform coming out of the demodulator, i.e., it has to be a square wave, assuming negligible attenuation in the panel, such that the shape of the drive waveform is preserved. Also, if offset compensation is performed first, the offset waveform is an AC waveform with respect to the reference voltage, i.e., the maxima are positive with respect to VREF and the minima are negative with respect to VREF. The amplitude of the offset waveform is equivalent to the amount of offset compensation. Conversely, if demodulation is performed first, the offset waveform is a DC waveform, i.e., it is either positive or negative with respect to VREF (since the demodulated waveform is also DC with respect to VREF). Again, the amplitude in this case is equivalent to the amount of offset compensation for every part of the demodulated waveform. In essence, the offset compensation circuit must match the amount of offset compensation to the shape of the waveform being compensated.


The demodulated, offset compensated signal is then processed by programmable gain ADC 406. In one embodiment, ADC 406 may be a sigma-delta, although similar type ADCs (such as a voltage to frequency converter with a subsequent counter stage) could be used. The ADC performs two functions: (1) it converts the offset compensated waveform out of the mixer arrangement (offset and signal mixer) to a digital value; and (2) it performs low pass filtering functions, i.e., it averages the rectified signal coming out of the mixer arrangement. The offset compensated, demodulated signal looks like a rectified Gaussian shaped sine wave, whose amplitude is a function of CFB and CSIG. The ADC result returned to the host computer is actually the average of that signal.
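As a toy illustration (not the patent's converter) of why a sigma-delta ADC also performs the averaging step, the following first-order modulator counts the 1s in its bitstream to recover the mean of a rectified input:

```python
import math

# Toy first-order sigma-delta sketch: the density of 1s in the bitstream
# approximates the mean of the input, so counting bits performs both the
# conversion and the low-pass/averaging function in the digital domain.
def sigma_delta_average(samples, full_scale=1.0):
    integrator = 0.0
    feedback = 0.0
    ones = 0
    for x in samples:
        integrator += x - feedback           # analog integrator
        bit = 1 if integrator >= 0 else 0    # 1-bit quantizer
        ones += bit                          # digital decimation by counting
        feedback = full_scale if bit else -full_scale
    # Map the 1s density back to the input range [-full_scale, +full_scale].
    return (2.0 * ones / len(samples) - 1.0) * full_scale

if __name__ == "__main__":
    rectified = [abs(math.sin(2 * math.pi * n / 64)) for n in range(4096)]
    print(sigma_delta_average(rectified))    # close to 2/pi (~0.6366), the mean of |sin|
```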


One advantage of using a sigma delta ADC is that such ADCs are much more efficient for performing averaging in the digital domain. Additionally, digital gates are a lot smaller than analog low pass filters and sample and hold elements, thus reducing the size of the total ASIC. One skilled in the art will further appreciate other advantages, particularly with regard to power consumption and clock speed.


Alternatively, one could use an ADC separate from the controller ASIC. This would require a multiplexor to share the ADC between multiple channels and a sample and hold circuit for each channel to average and hold the average of the demodulation waveform. This would likely consume so much die area as to be impractical for controllers intended for use with touch surfaces having a large number of pixels. Additionally, to achieve acceptable operation, the external ADC would need to operate very fast, as a large number of pixels must be processed very quickly to provide timely and smooth results in response to a user's input.


As noted above, the sensor is driven at three different frequencies, resulting in three capacitance images, which are used for noise rejection as described below. The three frequencies are chosen such that the pass band at one particular frequency does not overlap with the pass bands at the other frequencies. As noted above, a preferred embodiment uses frequencies of 140 kHz, 200 kHz, and 260 kHz. The demodulation waveform is chosen such that the side bands are suppressed.


A standard sine wave, illustrated in FIG. 11A together with its passband frequency spectrum, may be used as a demodulation waveform. The sine wave provides a well-defined pass band with some stop band ripple. Alternatively, other waveforms having well defined pass bands with minimum stop band ripple could also be used. For example, a Gaussian-enveloped sine wave, illustrated in FIG. 11B together with its passband frequency spectrum, also has a well defined pass band, with less stop band ripple. One skilled in the art will appreciate that the shape and type of the demodulation waveform affects the passband of the mixer, which, in turn, affects the effectiveness of the noise suppression provided by the frequency hopping mechanism. As will be further appreciated by those skilled in the art, other waveforms could also be used.
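The effect of the envelope on stop-band ripple can be illustrated with a simple numerical sweep, assuming a 128-sample window and a 12-cycle carrier; these parameters are illustrative, not taken from the patent.

```python
import math

# Rough illustration of how the demodulation waveform shapes the mixer passband:
# sweep a test tone across frequency and record how strongly each waveform
# responds. The plain sine exhibits larger stop-band ripple than the
# Gaussian-enveloped sine. Window length, cycle count, and envelope width are
# assumptions.
def response(waveform, tone_cycles):
    n = len(waveform)
    acc = complex(0.0, 0.0)
    for i, w in enumerate(waveform):
        phase = 2 * math.pi * tone_cycles * i / n
        acc += w * complex(math.cos(phase), math.sin(phase))
    return abs(acc)

def make_waveforms(length=128, cycles=12, sigma_frac=0.2):
    plain, gauss = [], []
    for i in range(length):
        t = i / length
        s = math.sin(2 * math.pi * cycles * t)
        env = math.exp(-0.5 * ((t - 0.5) / sigma_frac) ** 2)
        plain.append(s)
        gauss.append(s * env)
    return plain, gauss

if __name__ == "__main__":
    plain, gauss = make_waveforms()
    for name, w in (("plain sine", plain), ("gaussian-enveloped sine", gauss)):
        peak = response(w, 12)
        stopband = max(response(w, f / 4) for f in range(80, 240))  # 20-60 cycles
        print(f"{name}: worst stop-band response {20 * math.log10(stopband / peak):.1f} dB")
```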


Turning now to FIG. 12, nine waveforms are illustrated that explain the noise suppression features of the system. Voltage waveform 501 is a square wave demonstrating the stimulus waveform applied to the sensor. Waveform 504 is the Gaussian enveloped sine wave signal used as a demodulation waveform. Waveform 507 is the output of the demodulator, i.e., the product of the waveforms 501 and 504. Note that it provides a well defined pulse at the fundamental frequency of the applied square wave voltage.


The center column illustrates an exemplary noise waveform 502. Demodulation waveform 505 is the same as demodulation waveform 504. Note that the demodulated noise signal 508 does not produce a significant spike, because the fundamental frequency of the noise signal is outside the passband of the demodulation signal.


The composite of the excitation waveform and noise signal is illustrated in 503. Again, demodulation waveform 506 is the same as demodulation waveforms 505 and 504. The demodulated composite does still show the noise waveform, although various signal processing algorithms may be applied to extract this relatively isolated spike.


Additionally, noise rejection may be accomplished by providing multiple stimulus voltages at different frequencies and applying a majority rules algorithm to the result. In a majority rules algorithm, for each capacitance node, the two frequency channels that provide the best amplitude match are averaged and the remaining channel is disposed of. For example, in FIG. 13 vertical line 600 represents the measured capacitance, with the markings 601, 602, and 603 representing the three values measured at three stimulus frequencies. Values 602 and 603 provide the best match, possibly suggesting that value 601 is corrupted. Thus value 601 is discarded and values 602 and 603 are averaged to form the output.
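A minimal sketch of this selection rule, with hypothetical readings, follows; the median-filter alternative described next could be obtained with statistics.median(readings) instead.

```python
from itertools import combinations

# Minimal sketch of the "majority rules" selection: for each pixel, keep the two
# frequency channels whose readings agree most closely, average them, and drop
# the outlier. The example readings are hypothetical.
def majority_rules(readings):
    """readings -- three values measured at the three stimulus frequencies."""
    best_pair = min(combinations(readings, 2), key=lambda p: abs(p[0] - p[1]))
    return sum(best_pair) / 2.0

if __name__ == "__main__":
    # Pretend the 140 kHz reading is corrupted by in-band noise.
    print(majority_rules([0.92, 0.51, 0.49]))  # -> 0.50
```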


Alternatively, a median filter could be applied, in which case value 602, i.e., the median value, would be selected as the output. As yet another alternative, the three results could simply be averaged, in which case a value somewhere between values 601 and 602 would result. A variety of other noise rejection techniques for multiple sample values will be obvious to those skilled in the art, any of which may suitably be used with the controller described herein.


Operation of the circuit may be further understood with respect to FIG. 14, which is a flowchart depicting operation of the controller. One skilled in the art will appreciate that various timing and memory storage issues are omitted from this flowchart for the sake of clarity.


Image acquisition begins at block 701. The system then sets the clock so as to acquire samples at the middle clock frequency (e.g., 200 kHz), as discussed above with respect to FIG. 9 (block 702). The various programmable registers, which control such parameters as voltage offset, amplifier gain, delay clocks, etc., are then updated (block 703). All columns are read, with the result stored as a Mid Vector (block 704). The high clock frequency is then set (block 705), and the steps of updating registers (block 706) and reading all columns and storing the result (block 707) are repeated for the high sample frequency. The clock is then set to the low frequency (block 708) and the register update (block 709) and column reading (block 710) are repeated for the low sample frequency.


The three vectors are then offset compensated, according to the algorithm described above (block 711). The offset compensated vectors are then subjected to a median filter as described above. Alternatively, the offset compensated vectors could be filtered by the majority rules algorithm described with respect to FIG. 13 or any other suitable filtering technique. In any case, the result is stored. If more rows remain, the process returns to the mid frequency sampling at block 702. If all rows are completed (block 713), the entire image is output to the host device (block 714), and a subsequent new image is acquired (block 701).
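The control flow of FIG. 14 can be sketched roughly as follows; the sensor-access callables are hypothetical placeholders for the ASIC interface, and only the loop structure follows the flowchart described above.

```python
import statistics

# Hedged sketch of the per-row acquisition loop of FIG. 14. The callables
# (set_clock, update_registers, read_all_columns, offset_compensate) are
# hypothetical placeholders standing in for the ASIC interface.
FREQS_HZ = {"mid": 200_000, "high": 260_000, "low": 140_000}

def acquire_image(num_rows, set_clock, update_registers, read_all_columns,
                  offset_compensate):
    image = []
    for row in range(num_rows):
        vectors = []
        for name in ("mid", "high", "low"):                    # blocks 702-710
            set_clock(FREQS_HZ[name])
            update_registers(row)
            vectors.append(read_all_columns(row))
        compensated = [offset_compensate(v) for v in vectors]  # block 711
        # Median filter across the three frequency channels, column by column.
        row_result = [statistics.median(cols) for cols in zip(*compensated)]
        image.append(row_result)                               # store the row result
    return image                                               # block 714: send to host

if __name__ == "__main__":
    img = acquire_image(
        num_rows=2,
        set_clock=lambda hz: None,
        update_registers=lambda row: None,
        read_all_columns=lambda row: [0.50, 0.52, 0.48],  # stubbed column readings
        offset_compensate=lambda vec: vec,
    )
    print(img)
```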


While this invention has been described in terms of several preferred embodiments, there are alterations, permutations, and equivalents, which fall within the scope of this invention. For example, the term “computer” does not necessarily mean any particular kind of device, combination of hardware and/or software, nor should it be considered restricted to either a multi-purpose or single-purpose device. Additionally, although the embodiments herein have been described in relation to touch screens, the teachings of the present invention are equally applicable to touch pads or any other touch surface type of sensor. Furthermore, although the disclosure is primarily directed at capacitive sensing, it should be noted that some or all of the features described herein may be applied to other sensing methodologies. It should also be noted that there are many alternative ways of implementing the methods and apparatuses of the present invention. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations, and equivalents as fall within the true spirit and scope of the present invention.

Claims
  • 1. A method for demodulating one or more capacitive coupling signals associated with an image of touch, the method comprising: receiving, at a demodulator circuit, a first signal associated with a capacitive coupling between one or more drive electrodes and one or more sense electrodes of a touch surface during a first time period, wherein the one or more drive electrodes are driven with a first waveform during the first time period; performing a first operation associated with the first signal and a first demodulation waveform, wherein the first demodulation waveform is determined based on one or more characteristics associated with the first signal, including delaying the first demodulation waveform by a predetermined amount to compensate for phase delay in the touch surface; generating a first demodulated signal based on performing the first operation; and determining an image of touch using the first demodulated signal.
  • 2. The method of claim 1, wherein the one or more characteristics at least includes a characteristic associated with a frequency of the first waveform.
  • 3. The method of claim 1, wherein the generated first demodulation signal rejects noise that is out of band with respect to the first signal.
  • 4. The method of claim 1, further comprising: receiving, at the demodulator circuit, a second signal associated with a capacitive coupling between the one or more drive electrodes and the one or more sense electrodes during a second time period, wherein the one or more drive electrodes are driven with a second waveform, different from the first waveform, during the second time period; performing a second operation associated with the second signal and a second demodulation waveform, different from the first demodulation waveform, wherein the second demodulation waveform is determined based on the one or more characteristics associated with the second signal; generating a second demodulated signal based on performing the second operation; and determining an image of touch using the first demodulated signal and the second demodulated signal.
  • 5. The method of claim 4, wherein the first demodulated signal and the second demodulated signal differ in at least one of shape, amplitude, and/or frequency.
  • 6. The method of claim 4, wherein the first demodulation waveform and the second demodulation waveform differ in at least one of shape, amplitude, and/or frequency.
  • 7. The method of claim 4, wherein the first demodulation waveform and the second demodulation waveform are further determined based on coefficients stored in a lookup table.
  • 8. The method of claim 4, wherein the demodulator circuit delays the second demodulation waveform by the predetermined amount to compensate for phase delay in the touch surface.
  • 9. The method of claim 4, wherein the one or more characteristics at least includes a characteristic associated with a frequency of the second waveform.
  • 10. The method of claim 4, wherein the generated second demodulation signal rejects noise that is out of band with respect to the second signal.
  • 11. The method of claim 4, wherein determining the image of touch using the first demodulated signal and the second demodulated signal includes: mixing the first demodulated signal with a first offset signal generated by an offset compensation circuit, mixing the second demodulated signal with a second offset signal different from the first offset signal, the second offset signal generated by the offset compensation circuit; and generating the image of touch based at least on the mixing of the first demodulated signal with the first offset signal and the mixing of the second demodulated signal and the second offset signal.
  • 12. The method of claim 11, wherein the first offset signal generated by the offset compensation circuit is based on one or more respective portions of the first demodulation waveform, and the second offset signal generated by the offset compensation circuit is based on one or more respective portions of the second demodulation waveform.
  • 13. A controller for a touch surface, the controller comprising: output circuitry configured to generate drive waveforms for one or more drive electrodes on the touch surface, the drive waveforms including a first periodic waveform having a first predetermined frequency; input circuitry operatively connected to one or more sense electrodes; and a demodulator circuit coupled to the input circuitry, wherein the controller is configured to: receive, at the demodulator circuit, a first signal, the first signal associated with a capacitive coupling between the one or more drive electrodes and the one or more sense electrodes on the touch surface during a first time period, wherein the one or more drive electrodes are driven with the first periodic waveform during the first time period; perform a first operation associated with the first signal and a first demodulation waveform, wherein the first demodulation waveform is determined based on one or more characteristics associated with the first signal, including delaying the first demodulation waveform by a predetermined amount to compensate for phase delay in the touch surface; generate a first demodulated signal based on performing the first operation; and determine an image of touch using the first demodulated signal.
  • 14. The controller of claim 13, wherein the one or more characteristics at least includes a characteristic associated with the first predetermined frequency of the first periodic waveform.
  • 15. The controller of claim 13, wherein: the drive waveforms generated by the output circuitry further include a second periodic waveform having a second predetermined frequency; and the controller is further configured to: receive, at the demodulator circuit, a second signal, the second signal associated with a capacitive coupling between the one or more drive electrodes and the one or more sense electrodes on the touch surface during a second time period, wherein the one or more drive electrodes are driven with the second periodic waveform during the second time period; perform a second operation associated with the second signal and a second demodulation waveform, different from the first demodulation waveform, wherein the second demodulation waveform is determined based on one or more characteristics associated with the second signal; generate a second demodulated signal based on performing the second operation; and determine an image of touch using the first demodulated signal and the second demodulated signal.
  • 16. The controller of claim 15, wherein the first demodulation waveform and the second demodulation waveform differ in at least one of shape, amplitude, and/or frequency.
  • 17. A computer-implemented method for controlling a touch surface, the method comprising: receiving, at a demodulator circuit, a first signal, the first signal associated with a capacitive coupling between one or more drive electrodes and one or more sense electrodes of a touch surface during a first time period, wherein the one or more drive electrodes are driven with a first waveform during the first time period; performing a first operation associated with the first signal and a first demodulation waveform, wherein the first demodulation waveform is determined based on one or more characteristics associated with the first signal, including delaying the first demodulation waveform by a predetermined amount to compensate for phase delay in the touch surface; generating a first demodulated signal based on performing the first operation; and determining an image of touch using the first demodulated signal.
  • 18. The computer-implemented method of claim 17, wherein the one or more characteristics at least includes a characteristic associated with a frequency of the first waveform.
  • 19. The computer-implemented method of claim 17, further comprising: receiving, at the demodulator circuit, a second signal, the second signal associated with a capacitive coupling between the one or more drive electrodes and the one or more sense electrodes during a second time period, wherein the one or more drive electrodes are driven with a second waveform, different from the first waveform, during the second time period; performing a second operation associated with the second signal and a second demodulation waveform, different from the first demodulation waveform, wherein the second demodulation waveform is determined based on the one or more characteristics associated with the second signal; generating a second demodulated signal based on performing the second operation; and determining an image of touch using the first demodulated signal and the second demodulated signal.
  • 20. The computer-implemented method of claim 19, wherein the first demodulation waveform and the second demodulation waveform differ in at least one of shape, amplitude, and/or frequency.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 15/384,202, filed on Dec. 19, 2016 (now U.S. Publication No. 2017-0097736), which is a continuation of U.S. patent application Ser. No. 12/549,229, filed on Aug. 27, 2009 (now U.S. Pat. No. 9,547,394), which is a divisional of U.S. patent application Ser. No. 11/381,313 filed on May 2, 2006 (now U.S. Pat. No. 8,279,180), the entire disclosures of which are incorporated by reference herein.

US Referenced Citations (263)
Number Name Date Kind
3333160 Gorski Jul 1967 A
3541541 Engelbart Nov 1970 A
3662105 Hurst et al. May 1972 A
3798370 Hurst Mar 1974 A
4246452 Chandler Jan 1981 A
4476463 Ng et al. Oct 1984 A
4511760 Garwin et al. Apr 1985 A
4543564 Audoin et al. Sep 1985 A
4550221 Mabusth Oct 1985 A
4672364 Lucas Jun 1987 A
4672558 Beckes et al. Jun 1987 A
4692809 Beining et al. Sep 1987 A
4695827 Beining et al. Sep 1987 A
4733222 Evans Mar 1988 A
4734685 Watanabe Mar 1988 A
4746770 Mcavinney May 1988 A
4771276 Parks Sep 1988 A
4788384 Bruere-dawson et al. Nov 1988 A
4806846 Kerber Feb 1989 A
4898555 Sampson Feb 1990 A
4902886 Smisko Feb 1990 A
4968877 Mcavinney et al. Nov 1990 A
5003519 Noirjean Mar 1991 A
5017030 Crews May 1991 A
5117071 Greanias et al. May 1992 A
5178477 Gambaro Jan 1993 A
5189403 Franz et al. Feb 1993 A
5194862 Edwards Mar 1993 A
5224861 Glass et al. Jul 1993 A
5241308 Young Aug 1993 A
5252951 Tannenbaum et al. Oct 1993 A
5281966 Walsh Jan 1994 A
5305017 Gerpheide Apr 1994 A
5345543 Capps et al. Sep 1994 A
5349303 Gerpheide Sep 1994 A
5357266 Tagawa Oct 1994 A
5376948 Roberts Dec 1994 A
5398310 Tchao et al. Mar 1995 A
5442742 Greyson et al. Aug 1995 A
5463388 Boie et al. Oct 1995 A
5463696 Beernink et al. Oct 1995 A
5483261 Yasutake Jan 1996 A
5488204 Mead et al. Jan 1996 A
5495077 Miller et al. Feb 1996 A
5513309 Meier et al. Apr 1996 A
5523775 Capps Jun 1996 A
5530455 Gillick et al. Jun 1996 A
5543590 Gillespie et al. Aug 1996 A
5543591 Gillespie et al. Aug 1996 A
5563632 Roberts Oct 1996 A
5563996 Tchao Oct 1996 A
5565658 Gerpheide et al. Oct 1996 A
5579036 Yates, IV Nov 1996 A
5581681 Tchao et al. Dec 1996 A
5583946 Gourdol Dec 1996 A
5590219 Gourdol Dec 1996 A
5592566 Pagallo et al. Jan 1997 A
5594810 Gourdol Jan 1997 A
5596694 Capps Jan 1997 A
5612719 Beernink et al. Mar 1997 A
5631805 Bonsall May 1997 A
5633955 Bozinovic et al. May 1997 A
5634102 Capps May 1997 A
5636101 Bonsall et al. Jun 1997 A
5642108 Gopher et al. Jun 1997 A
5644657 Capps et al. Jul 1997 A
5666113 Logan Sep 1997 A
5666502 Capps Sep 1997 A
5666552 Greyson et al. Sep 1997 A
5675361 Santilli Oct 1997 A
5677710 Thompson-rohrlich Oct 1997 A
5689253 Hargreaves et al. Nov 1997 A
5710844 Capps et al. Jan 1998 A
5729250 Bishop et al. Mar 1998 A
5730165 Philipp Mar 1998 A
5736976 Cheung Apr 1998 A
5741990 Davies Apr 1998 A
5745116 Pisutha-arnond Apr 1998 A
5745716 Tchao et al. Apr 1998 A
5746818 Yatake May 1998 A
5748269 Harris et al. May 1998 A
5764222 Shieh Jun 1998 A
5764818 Capps et al. Jun 1998 A
5767457 Gerpheide et al. Jun 1998 A
5767842 Korth Jun 1998 A
5790104 Shieh Aug 1998 A
5790107 Kasser et al. Aug 1998 A
5802516 Shwarts et al. Sep 1998 A
5808567 Mccloud Sep 1998 A
5809267 Moran et al. Sep 1998 A
5821690 Martens et al. Oct 1998 A
5821930 Hansen Oct 1998 A
5823782 Marcus et al. Oct 1998 A
5825351 Tam Oct 1998 A
5825352 Bisset et al. Oct 1998 A
5831600 Inoue et al. Nov 1998 A
5835079 Shieh Nov 1998 A
5854625 Frisch et al. Dec 1998 A
5880411 Gillespie et al. Mar 1999 A
5892540 Kozlowski et al. Apr 1999 A
5898434 Small et al. Apr 1999 A
5920309 Bisset Jul 1999 A
5923319 Bishop et al. Jul 1999 A
5933134 Shieh Aug 1999 A
5943044 Martinelli et al. Aug 1999 A
5945980 Moissev et al. Aug 1999 A
6002389 Kasser Dec 1999 A
6002808 Freeman Dec 1999 A
6020881 Naughton et al. Feb 2000 A
6031524 Kunert Feb 2000 A
6037882 Levy Mar 2000 A
6050825 Nichol et al. Apr 2000 A
6052339 Frenkel et al. Apr 2000 A
6072494 Nguyen Jun 2000 A
6075520 Inoue et al. Jun 2000 A
6084576 Leu et al. Jul 2000 A
6100827 Boesch et al. Aug 2000 A
6107997 Ure Aug 2000 A
6128003 Smith et al. Oct 2000 A
6131299 Raab et al. Oct 2000 A
6135958 Mikula-curtis et al. Oct 2000 A
6144380 Shwarts et al. Nov 2000 A
6188391 Seely et al. Feb 2001 B1
6198515 Cole Mar 2001 B1
6201573 Mizuno Mar 2001 B1
6208329 Ballare Mar 2001 B1
6222465 Kumar et al. Apr 2001 B1
6239788 Nohno et al. May 2001 B1
6239790 Martinelli et al. May 2001 B1
6243071 Shwarts et al. Jun 2001 B1
6246862 Grivas et al. Jun 2001 B1
6249606 Kiraly et al. Jun 2001 B1
6288707 Philipp Sep 2001 B1
6289326 Lafleur Sep 2001 B1
6292178 Bernstein et al. Sep 2001 B1
6300613 Kuderer Oct 2001 B1
6310610 Beaton et al. Oct 2001 B1
6323846 Westerman et al. Nov 2001 B1
6323849 He et al. Nov 2001 B1
6339363 Fowler Jan 2002 B1
6347290 Bartlett Feb 2002 B1
6377009 Philipp Apr 2002 B1
6380931 Gillespie et al. Apr 2002 B1
6411287 Scharff et al. Jun 2002 B1
6414671 Gillespie et al. Jul 2002 B1
6421234 Ricks et al. Jul 2002 B1
6452514 Philipp Sep 2002 B1
6457355 Philipp Oct 2002 B1
6466036 Philipp Oct 2002 B1
6515669 Mohri Feb 2003 B1
6525749 Moran et al. Feb 2003 B1
6535200 Philipp Mar 2003 B2
6543684 White et al. Apr 2003 B1
6543947 Lee Apr 2003 B2
6570557 Westerman et al. May 2003 B1
6577910 Eastty et al. Jun 2003 B1
6593916 Aroyan Jul 2003 B1
6610936 Gillespie et al. Aug 2003 B2
6624833 Kumar et al. Sep 2003 B1
6639577 Eberhard Oct 2003 B2
6650319 Hurst et al. Nov 2003 B1
6658994 Mcmillan Dec 2003 B1
6670894 Mehring Dec 2003 B2
6677932 Westerman Jan 2004 B1
6677934 Blanchard Jan 2004 B1
6690387 Zimmerman et al. Feb 2004 B2
6724366 Crawford Apr 2004 B2
6747264 Miida Jun 2004 B2
6757002 Oross et al. Jun 2004 B1
6803906 Morrison et al. Oct 2004 B1
6842672 Straub et al. Jan 2005 B1
6856259 Sharp Feb 2005 B1
6888536 Westerman et al. May 2005 B2
6900795 Knight, III et al. May 2005 B1
6927761 Badaye et al. Aug 2005 B2
6942571 Mcallister et al. Sep 2005 B1
6965375 Gettemy et al. Nov 2005 B1
6972401 Akitt et al. Dec 2005 B2
6977666 Hedrick Dec 2005 B1
6985801 Straub et al. Jan 2006 B1
6992659 Gettemy Jan 2006 B2
7015894 Morohoshi Mar 2006 B2
7031228 Born et al. Apr 2006 B2
7031886 Hargreaves Apr 2006 B1
7129416 Steinfeld et al. Oct 2006 B1
7184064 Zimmerman et al. Feb 2007 B2
7219829 Treat May 2007 B2
RE40153 Westerman et al. Mar 2008 E
7378856 Peine et al. May 2008 B2
RE40993 Westerman Nov 2009 E
7663607 Hotelling et al. Feb 2010 B2
7808479 Hotelling et al. Oct 2010 B1
8232970 Krah et al. Jul 2012 B2
8279180 Hotelling et al. Oct 2012 B2
8479122 Hotelling et al. Jul 2013 B2
8816984 Hotelling et al. Aug 2014 B2
8931780 Zachut et al. Jan 2015 B2
9262029 Hotelling et al. Feb 2016 B2
9547394 Hotelling et al. Jan 2017 B2
20020118848 Karpenstein Aug 2002 A1
20030006974 Clough et al. Jan 2003 A1
20030067447 Geaghan et al. Apr 2003 A1
20030076301 Tsuk et al. Apr 2003 A1
20030076303 Huppi Apr 2003 A1
20030076306 Zadesky et al. Apr 2003 A1
20030095095 Pihlaja May 2003 A1
20030095096 Robbin et al. May 2003 A1
20030098858 Perski et al. May 2003 A1
20030206202 Moriya Nov 2003 A1
20030234768 Rekimoto et al. Dec 2003 A1
20040056845 Harkcom et al. Mar 2004 A1
20040066367 Fagard Apr 2004 A1
20040187577 Higuchi et al. Sep 2004 A1
20040196936 Kawama et al. Oct 2004 A1
20040263484 Mantysalo et al. Dec 2004 A1
20050005703 Saito et al. Jan 2005 A1
20050012723 Pallakoff Jan 2005 A1
20050052425 Zadesky et al. Mar 2005 A1
20050068044 Peine et al. Mar 2005 A1
20050089120 Quinlan et al. Apr 2005 A1
20050104867 Westerman et al. May 2005 A1
20050110768 Marriott et al. May 2005 A1
20050189154 Perski et al. Sep 2005 A1
20050243893 Ranganathan et al. Nov 2005 A1
20060017701 Marten et al. Jan 2006 A1
20060022955 Kennedy Feb 2006 A1
20060022956 Lengeling et al. Feb 2006 A1
20060026521 Hotelling et al. Feb 2006 A1
20060026535 Hotelling et al. Feb 2006 A1
20060026536 Hotelling et al. Feb 2006 A1
20060032680 Elias et al. Feb 2006 A1
20060033007 Terzioglu Feb 2006 A1
20060033724 Chaudhri et al. Feb 2006 A1
20060044259 Hotelling et al. Mar 2006 A1
20060053387 Ording Mar 2006 A1
20060066582 Lyon et al. Mar 2006 A1
20060085757 Andre et al. Apr 2006 A1
20060097991 Hotelling et al. May 2006 A1
20060133600 Holcombe Jun 2006 A1
20060197753 Hotelling Sep 2006 A1
20060232567 Westerman et al. Oct 2006 A1
20060238517 King et al. Oct 2006 A1
20060238518 Westerman et al. Oct 2006 A1
20060238519 Westerman et al. Oct 2006 A1
20060238520 Westerman et al. Oct 2006 A1
20060238521 Westerman et al. Oct 2006 A1
20060238522 Westerman et al. Oct 2006 A1
20060267953 Peterson et al. Nov 2006 A1
20070007438 Liu et al. Jan 2007 A1
20070008299 Hristov Jan 2007 A1
20070062852 Zachut et al. Mar 2007 A1
20070109274 Reynolds May 2007 A1
20070176609 Ely et al. Aug 2007 A1
20070229464 Hotelling et al. Oct 2007 A1
20070236466 Hotelling Oct 2007 A1
20070247429 Westerman Oct 2007 A1
20070257890 Hotelling et al. Nov 2007 A1
20080297476 Hotelling et al. Dec 2008 A1
20090315850 Hotelling et al. Dec 2009 A1
20090315851 Hotelling et al. Dec 2009 A1
20120319996 Hotelling et al. Dec 2012 A1
20140362048 Hotelling et al. Dec 2014 A1
20170097736 Hotelling et al. Apr 2017 A1
Foreign Referenced Citations (43)
Number Date Country
1243096 Oct 1988 CA
10251296 May 2004 DE
0288692 Nov 1988 EP
0464908 Jan 1992 EP
0288692 Jul 1993 EP
0664504 Jul 1995 EP
0464908 Sep 1996 EP
0818751 Jan 1998 EP
1014295 Jun 2000 EP
1014295 Jan 2002 EP
1211633 Jun 2002 EP
S57176448 Oct 1982 JP
S57176448 Nov 1982 JP
605324 Jan 1985 JP
S61115118 Jun 1986 JP
0248724 Feb 1990 JP
H04266116 Sep 1992 JP
H04507316 Dec 1992 JP
H06161640 Jun 1994 JP
H09325852 Dec 1997 JP
H10233670 Sep 1998 JP
H11110116 Apr 1999 JP
H11143626 May 1999 JP
H11505641 May 1999 JP
H11305932 Nov 1999 JP
2000020229 Jan 2000 JP
2000163031 Jun 2000 JP
2002501271 Jan 2002 JP
2002342033 Nov 2002 JP
2002542633 Dec 2002 JP
2005507083 Mar 2005 JP
2005508073 Mar 2005 JP
9103039 Mar 1991 WO
9618179 Jun 1996 WO
9718547 May 1997 WO
9723738 Jul 1997 WO
9814863 Apr 1998 WO
03088176 Oct 2003 WO
2005020056 Mar 2005 WO
2005114369 Dec 2005 WO
2006023569 Mar 2006 WO
2007130771 Nov 2007 WO
2007130771 Feb 2008 WO
Non-Patent Literature Citations (110)
Entry
4-Wire Resistive Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-4resistive.html>, Accessed on Aug. 5, 2005.
5-Wire Resistive Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-resistive.html>, Accessed on Aug. 5, 2005.
A Brief Overview of Gesture Recognition, Available online at: <http://www.dai.ed.ac.uk/Cvonline/LOCA_COPIES/COHEN/gesture_overview.html>, Accessed on Apr. 20, 2004.
Advisory Action received for U.S. Appl. No. 11/381,313, dated Mar. 17, 2011, 3 pages.
Advisory Action received for U.S. Appl. No. 11/381,313, dated Mar. 29, 2010, 3 pages.
Advisory Action received for U.S. Appl. No. 12/549,229, dated Aug. 27, 2012, 5 pages.
Advisory Action received for U.S. Appl. No. 12/549,229, dated Feb. 24, 2014, 3 pages.
Advisory Action received for U.S. Appl. No. 14/464,524, dated May 26, 2015, 4 pages.
Capacitive Position Sensing, Available online at: <http://www.synaptics.com/technology/cps.cfm>, Accessed on Aug. 5, 2005.
Capacitive Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-capacitive.html>, Accessed on Aug. 5, 2005.
Combined Search Report and Examination Report received for GB Patent Application No. 1101918.9, dated Mar. 21, 2011, 4 pages.
Comparing Touch Technologies, Available online at: <http://www.touchscreens.com/intro-touchtypes.html>, Accessed on Oct. 10, 2004.
Decision to Grant received for Japanese Patent Application No. 2018-076137, dated Oct. 21, 2019, 4 pages (2 pages of English Translation and 2 pages of Official copy).
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 12/549,229, mailed on Jul. 17, 2014, 12 pages.
Final Office Action received for U.S. Appl. No. 11/381,313, dated Dec. 3, 2010, 25 pages.
Final Office Action received for U.S. Appl. No. 11/381,313, dated Dec. 11, 2009, 30 pages.
Final Office Action received for U.S. Appl. No. 12/549,229, dated May 9, 2012, 13 pages.
Final Office Action received for U.S. Appl. No. 12/549,229, dated Nov. 5, 2013, 12 pages.
Final Office Action received for U.S. Appl. No. 13/595,898, dated Jun. 26, 2013, 9 pages.
Final Office Action received for U.S. Appl. No. 14/464,524, dated Feb. 12, 2015, 15 pages.
Final Office Action received for U.S. Appl. No. 15/384,202, dated Nov. 19, 2019, 27 pages.
FingerWorks—Gesture Guide—Application Switching, Available online at: <http://www.fingerworks.com/gesture_guide_apps.html>, Accessed on Aug. 27, 2004, 1 page.
FingerWorks—Gesture Guide—Editing, Available online at: <http://www.fingerworks.com/gesure_guide_editing.html> Feb. 13, 2004, Accessed on Aug. 27, 2004, 1 page.
FingerWorks—Gesture Guide—File Operations, Available online at: <http://www.fingerworks.com/gesture_guide_files.html> Jun. 18, 2004, Accessed on Aug. 27, 2004, 1 page.
FingerWorks—Gesture Guide—Text Manipulation, Available online at: <http://www.fingerworks.com/gesture_guide_text_manip.html> Jun. 6, 2004, Accessed on Aug. 27, 2004, 2 pages.
FingerWorks—Gesture Guide—Tips and Tricks, Available online at: <http://www.fingerworks.com/gesture_guide_tips.html>, Accessed on Aug. 27, 2004, 1 page.
FingerWorks—Gesture Guide—Web, Available online at: <http://www.fingerworks.com/gesture_guide_web.html> Jun. 5, 2004, Accessed on Aug. 27, 2004, 1 page.
FingerWorks—Guide to Hand Gestures for USB Touchpads, Available online at: <http://www.fingerworks.com/igesture_userguide.html>, Accessed on Aug. 27, 2004, 1 page.
FingerWorks—iGesture—Technical Details, Available online at: <http://www.fingerworks.com/igesture_tech.html>, Accessed on Aug. 27, 2004, 1 page.
FingerWorks—The Only Touchpads with Ergonomic Full-Hand Resting and Relaxation!, Available online at: <http://www.fingerworks.com/resting.html>, 2001, 1 page.
FingerWorks—Tips for Typing on the Mini, Available online at: <http://www.fingerworks.com/mini_typing.html> Jun. 5, 2004, Accessed on Aug. 27, 2004, 2 pages.
Gesture Recognition, Available online at: <http://www.fingerworks.com/gesture_recognition.html>, Accessed on Aug. 30, 2005, 2 pages.
Glidepoint, Available online at: <http://www.cirque.com/technology/technology_gp.html>, Accessed on Aug. 5, 2005.
How Do Touchscreen Monitors Know Where You're Touching?, Available online at: <http://electronics.howstuffworks.com/question716.html>, Jul. 7, 2008, 2 pages.
How Does a Touchscreen Work?, Available online at: <http://www.touchscreens.com/intro-anatomy.html>, Accessed on Aug. 5, 2005.
IGesture Pad—the MultiFinger USB TouchPad with Whole-Hand Gestures, Available online at: <http://www.fingerworks.com/igesture.html>, Accessed on Aug. 27, 2004, 2 pages.
IGesture Products for Everyone (learn in minutes) Product Overview, Available online at: <FingerWorks.com>, Accessed on Aug. 30, 2005.
Infrared Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-infrared.html>, Accessed on Aug. 5, 2005.
International Search Report received for PCT Patent Application No. PCT/US2005/003325, dated Mar. 3, 2006.
International Search Report received for PCT Patent Application No. PCT/US2006/008349, dated Oct. 6, 2006, 3 pages.
International Search Report received for PCT Patent Application No. PCT/US2007/066021, dated Jan. 11, 2008, 5 pages.
Mouse Emulation, FingerWorks, Available online at: <http://www.fingerworks.com/gesture_guide_mouse.html> Dec. 10, 2002, Accessed on Aug. 30, 2005.
Mouse Gestures, Optimoz, May 21, 2004.
Mouse Gestures in Opera, Available online at: <http://www.opera.com/products/desktop/mouse/index.dml>, Accessed on Aug. 30, 2005.
MultiTouch Overview, FingerWorks, Available online at: <http://www.fingerworks.com/multoverview.html>, Accessed on Aug. 30, 2005.
Near Field Imaging Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-nfi.html>, Accessed on Aug. 5, 2005.
Non-Final Office Action received for U.S. Appl. No. 12/549,229, dated Jul. 22, 2013, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 12/549,229, dated Nov. 3, 2011, 12 pages.
Non-Final Office Action received for U.S. Appl. No. 11/381,313, dated Jan. 19, 2012, 25 pages.
Non-Final Office Action received for U.S. Appl. No. 11/381,313, dated Jul. 6, 2010, 26 pages.
Non-Final Office Action received for U.S. Appl. No. 11/381,313, dated Jun. 22, 2009, 31 pages.
Non-Final Office Action received for U.S. Appl. No. 12/549,269, dated Dec. 28, 2011, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 13/595,898, dated Dec. 24, 2013, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 13/595,898, dated Nov. 28, 2012, 9 pages.
Non-Final Office Action received for U.S. Appl. No. 14/464,524, dated Sep. 30, 2014, 17 pages.
Non-Final Office Action received for U.S. Appl. No. 15/384,202, dated May 30, 2019, 28 pages.
Non-Final Office Action received for U.S. Appl. No. 15/384,202, dated May 15, 2020, 30 pages.
Notice of Allowance received for U.S. Appl. No. 11/381,313, dated May 24, 2012, 8 pages.
Notice of Allowance received for U.S. Appl. No. 12/549,229, dated Sep. 8, 2016, 5 pages.
Notice of Allowance received for U.S. Appl. No. 13/595,898, dated Apr. 22, 2014, 7 pages.
Notice of Allowance received for U.S. Appl. No. 14/464,524, dated Oct. 15, 2015, 7 pages.
Notice of Allowance received for U.S. Appl. No. 15/384,202, dated Oct. 9, 2020, 10 pages.
Patent Board Decision received for U.S. Appl. No. 12/549,229, dated Jul. 15, 2016, 8 pages.
PenTouch Capacitive Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-pentouch.html>, Accessed on Aug. 5, 2005.
Restriction Requirement received for U.S. Appl. No. 11/381,313, dated Mar. 19, 2009, 5 pages.
Search Report received for European Patent Application No. 1621989, dated Mar. 27, 2006.
Supplemental Notice of Allowance received for U.S. Appl. No. 11/381,313, dated Jun. 26, 2012, 2 pages.
Supplemental Notice of Allowance received for U.S. Appl. No. 13/595,898, dated Jul. 7, 2014, 5 pages.
Supplemental Notice of Allowance received for U.S. Appl. No. 14/464,524, dated Dec. 8, 2015, 2 pages.
Surface Acoustic Wave Touchscreens, Available online at: <http://www.touchscreens.com/intro-touchtypes-saw.html>, Accessed on Aug. 5, 2005.
Symbol Commander, Available online at: <http://www.sensiva.com/symbolcomander/>, Accessed on Aug. 30, 2005.
Tips for Typing, FingerWorks, Available online at: <http://www.fingerworks.com/mini_typing.html>, Accessed on Aug. 30, 2005.
Touch Technologies Overview, 3M Touch Systems, Massachusetts, 2001.
Wacom Components—Technology, Available online at: <http://www.wacom-components.com/english/tech.asp>, Accessed on Oct. 10, 2004.
Watershed Algorithm, Available online at: <http://rsb.info.nih.gov/ij/plugins/watershed.html>, Accessed on Aug. 5, 2005.
Ahmad Subutai, “A Usable Real-Time 3D Hand Tracker”, Proceedings of the 28th Asilomar Conference on Signals, Systems and Computers—Part 2 (of 2), vol. 2, Oct. 1994, 5 pages.
Bier et al., “Toolglass and Magic Lenses: The See-Through Interface”, In James Kajiya, editor, Computer Graphics (SIGGRAPH '93 Proceedings), vol. 27, Aug. 1993, pp. 73-80.
Douglas et al., “The Ergonomics of Computer Pointing Devices”, 1997.
EVB Elektronik,“TSOP6238 IR Receiver Modules for Infrared Remote Control Systems”, Jan. 2004, 1 page.
Fisher et al., “Repetitive Motion Disorders: The Design of Optimal Rate-Rest Profiles”, Human Factors, vol. 35, No. 2, Jun. 1993, pp. 283-304.
Fukumoto et al., “ActiveClick: Tactile Feedback for Touch Panels”, In CHI 2001 Summary, 2001, pp. 121-122.
Fukumoto et al., “Body Coupled Fingering: Wireless Wearable Keyboard”, CHI 97, Mar. 1997, pp. 147-154.
Hardy Ian, “FingerWorks”, BBC World On Line, Mar. 7, 2002.
Hillier et al., “Introduction to Operations Research”, 1986.
Horowitz et al., “4.19 Integrators, Figure 4.48”, The Art of Electronics, 2nd Edition, Cambridge University Press, pp. 222-223.
Integration Associates Inc., “Proximity Sensor Demo Kit”, User Guide, Version 0.62—Preliminary, Apr. 13, 2004, 14 pages.
Jacob et al., “Integrality and Separability of Input Devices”, ACM Transactions on Computer-Human Interaction, vol. 1, Mar. 1994, pp. 3-26.
Kinkley et al., “Touch-Sensing Input Devices”, CHI '99 Proceedings, May 1999, pp. 223-230.
Kionx,“KXP84 Series Summary Data Sheet”, Oct. 21, 2005, 4 pages.
Lee et al., “A Multi-Touch Three Dimensional Touch-Sensitive Tablet”, CHI '85 Proceedings, Apr. 1985, pp. 121-128.
Lee S., “A Fast Multiple-Touch-Sensitive Input Device”, A Thesis Submitted in Conformity with the Requirements for the Degree of Master of Applied Science in the Department of Electrical Engineering, University of Toronto, Oct. 1984, 115 pages.
Matsushita et al., “HoloWall: Designing a Finger, Hand, Body and Object Sensitive Wall”, In Proceedings of UIST '97, Oct. 1997.
Peter et al., Unpublished U.S. Appl. No. 10/789,676, filed Feb. 27, 2004, titled “Shape Detecting Input Device”.
Quantum Research Group, “QT510/QWheel Touch Slider IC”, 2004-2005, 14 pages.
Quek, “Unencumbered Gestural Interaction”, IEEE Multimedia, vol. 3, 1996, pp. 36-47.
Radwin, “Activation Force and Travel Effects on Overexertion in Repetitive Key Tapping”, Human Factors, vol. 39, No. 1, Mar. 1997, pp. 130-140.
Rekimoto et al., “ToolStone: Effective Use of the Physical Manipulation Vocabularies of Input Devices”, In Proc. of UIST 2000, 2000.
Rekimoto J., “SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces”, CHI 2002 Conference Proceedings, Conference on Human Factors in Computing Systems, Minneapolis, vol. 4, No. 1, Apr. 20-25, 2002, pp. 113-120.
Rubine et al., “Programmable Finger-Tracking Instrument Controllers”, Computer Music Journal, vol. 14, No. 1, 1990, pp. 26-41.
Rubine Dean, “Combining Gestures and Direct Manipulation”, CHI'92, May 3-7, 1992, pp. 659-660.
Rubine Dean H., “The Automatic Recognition of Gestures”, CMU-CS-91-202, Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Computer Science at Carnegie Mellon University, Dec. 1991, 285 pages.
Rutledge et al., “Force-To-Motion Functions For Pointing”, Human-Computer Interaction—INTERACT, 1990.
Texas Instruments,“TSC2003/I2C Touch Screen Controller”, Data Sheet SBAS 162, Oct. 2001, 20 pages.
Wellner Pierre, “The Digital Desk Calculators: Tangible Manipulation on a Desk Top Display”, In ACM UIST '91 Proceedings, Nov. 11-13, 1991, pp. 27-34.
Westerman Wayne, “Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface”, A Dissertation Submitted to the Faculty of the University of Delaware in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Electrical Engineering, 1999, 363 pages.
Williams Jim, “Applications for a Switched-Capacitor Instrumentation Building Block”, Linear Technology Application Note 3, Jul. 1985, pp. 1-16.
Yamada et al., “A Switched-Capacitor Interface for Capacitive Pressure Sensors”, IEEE Transactions on Instrumentation and Measurement, vol. 41, No. 1, Feb. 1992, pp. 81-86.
Yeh et al., “Switched Capacitor Interface Circuit for Capacitive Transducers”, IEEE, 1985.
Zhai et al., “Dual Stream Input for Pointing and Scrolling”, Proceedings of CHI '97 Extended Abstracts, 1997.
Zimmerman et al., “Applying Electric Field Sensing to Human-Computer Interfaces”, In CHI '95 Proceedings, 1995, pp. 280-287.
Related Publications (1)
Number Date Country
20210240304 A1 Aug 2021 US
Divisions (1)
Number Date Country
Parent 11381313 May 2006 US
Child 12549229 US
Continuations (2)
Number Date Country
Parent 15384202 Dec 2016 US
Child 17170384 US
Parent 12549229 Aug 2009 US
Child 15384202 US