Multipoint touchscreen

Information

  • Patent Number
    9,035,907
  • Date Filed
    Thursday, November 21, 2013
  • Date Issued
    Tuesday, May 19, 2015
Abstract
A touch panel having a transparent capacitive sensing medium configured to detect multiple touches or near touches that occur at the same time and at distinct locations in the plane of the touch panel and to produce distinct signals representative of the location of the touches on the plane of the touch panel for each of the multiple touches is disclosed.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


The present invention relates generally to an electronic device having a touch screen. More particularly, the present invention relates to a touch screen capable of sensing multiple points at the same time.


2. Description of the Related Art


There exist today many styles of input devices for performing operations in a computer system. The operations generally correspond to moving a cursor and/or making selections on a display screen. By way of example, the input devices may include buttons or keys, mice, trackballs, touch pads, joy sticks, touch screens and the like. Touch screens, in particular, are becoming increasingly popular because of their ease and versatility of operation as well as their declining price. Touch screens allow a user to make selections and move a cursor by simply touching the display screen via a finger or stylus. In general, the touch screen recognizes the touch and position of the touch on the display screen and the computer system interprets the touch and thereafter performs an action based on the touch event.


Touch screens typically include a touch panel, a controller and a software driver. The touch panel is a clear panel with a touch sensitive surface. The touch panel is positioned in front of a display screen so that the touch sensitive surface covers the viewable area of the display screen. The touch panel registers touch events and sends these signals to the controller. The controller processes these signals and sends the data to the computer system. The software driver translates the touch events into computer events.


There are several types of touch screen technologies including resistive, capacitive, infrared, surface acoustic wave, electromagnetic, near field imaging, etc. Each of these devices has advantages and disadvantages that are taken into account when designing or configuring a touch screen. In resistive technologies, the touch panel is coated with thin, electrically conductive and resistive metallic layers separated by a small gap. When the panel is touched, the layers come into contact, thereby closing a switch that registers the position of the touch event. This information is sent to the controller for further processing. In capacitive technologies, the touch panel is coated with a material that stores electrical charge. When the panel is touched, a small amount of charge is drawn to the point of contact. Circuits located at each corner of the panel measure the charge and send the information to the controller for processing.


In surface acoustic wave technologies, ultrasonic waves are sent horizontally and vertically over the touch screen panel as for example by transducers. When the panel is touched, the acoustic energy of the waves is absorbed. Sensors located across from the transducers detect this change and send the information to the controller for processing. In infrared technologies, light beams are sent horizontally and vertically over the touch panel as for example by light emitting diodes. When the panel is touched, some of the light beams emanating from the light emitting diodes are interrupted. Light detectors located across from the light emitting diodes detect this change and send this information to the controller for processing.


One problem found in all of these technologies is that they are only capable of reporting a single point even when multiple objects are placed on the sensing surface. That is, they lack the ability to track multiple points of contact simultaneously. In resistive and capacitive technologies, an average of all simultaneously occurring touch points is determined and a single point which falls somewhere between the touch points is reported. In surface acoustic wave and infrared technologies, it is impossible to discern the exact position of multiple touch points that fall on the same horizontal or vertical lines due to masking. In either case, faulty results are generated.


These problems are particularly acute in tablet PCs, where one hand is used to hold the tablet and the other is used to generate touch events. For example, as shown in FIGS. 1A and 1B, holding a tablet 2 causes the thumb 3 to overlap the edge of the touch sensitive surface 4 of the touch screen 5. As shown in FIG. 1A, if the touch technology uses averaging, the technique used by resistive and capacitive panels, then a single point that falls somewhere between the thumb 3 of the left hand and the index finger 6 of the right hand would be reported. As shown in FIG. 1B, if the technology uses projection scanning, the technique used by infrared and SAW panels, it is hard to discern the exact vertical position of the index finger 6 due to the large vertical component of the thumb 3. The tablet 2 can only resolve the patches shown in gray. In essence, the thumb 3 masks out the vertical position of the index finger 6.


SUMMARY OF THE INVENTION

The invention relates, in one embodiment, to a touch panel having a transparent capacitive sensing medium configured to detect multiple touches or near touches that occur at the same time and at distinct locations in the plane of the touch panel and to produce distinct signals representative of the location of the touches on the plane of the touch panel for each of the multiple touches.


The invention relates, in another embodiment, to a display arrangement. The display arrangement includes a display having a screen for displaying a graphical user interface. The display arrangement further includes a transparent touch panel allowing the screen to be viewed therethrough and capable of recognizing multiple touch events that occur at different locations on the touch sensitive surface of the touch screen at the same time and of outputting this information to a host device.


The invention relates, in another embodiment, to a computer implemented method. The method includes receiving multiple touches on the surface of a transparent touch screen at the same time. The method also includes separately recognizing each of the multiple touches. The method further includes reporting touch data based on the recognized multiple touches.


The invention relates, in another embodiment, to a computer system. The computer system includes a processor configured to execute instructions and to carry out operations associated with the computer system. The computer system also includes a display device that is operatively coupled to the processor. The computer system further includes a touch screen that is operatively coupled to the processor. The touch screen is a substantially transparent panel that is positioned in front of the display. The touch screen is configured to track multiple objects, which rest on, tap on or move across the touch screen at the same time. The touch screen includes a capacitive sensing device that is divided into several independent and spatially distinct sensing points that are positioned throughout the plane of the touch screen. Each sensing point is capable of generating a signal at the same time. The touch screen also includes a sensing circuit that acquires data from the sensing device and that supplies the acquired data to the processor.


The invention relates, in another embodiment, to a touch screen method. The method includes driving a plurality of sensing points. The method also includes reading the outputs from all the sensing lines connected to the sensing points. The method further includes producing and analyzing an image of the touch screen plane at one moment in time in order to determine where objects are touching the touch screen. The method additionally includes comparing the current image to a past image in order to determine a change in the objects touching the touch screen.


The invention relates, in another embodiment, to a digital signal processing method. The method includes receiving raw data. The raw data includes values for each transparent capacitive sensing node of a touch screen. The method also includes filtering the raw data. The method further includes generating gradient data. The method additionally includes calculating the boundaries for touch regions based on the gradient data. Moreover, the method includes calculating the coordinates for each touch region.
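
A minimal sketch of this processing chain, written in Python, is given below; the NumPy/SciPy calls, the filter size and both thresholds are assumptions made for illustration rather than details taken from this disclosure.

    import numpy as np
    from scipy import ndimage

    def process_frame(raw):
        """raw: 2-D array holding one value per transparent capacitive node."""
        # Filter the raw data to suppress noise (filter size is assumed).
        filtered = ndimage.uniform_filter(raw.astype(float), size=3)
        # Generate gradient data; its magnitude peaks at region boundaries.
        gy, gx = np.gradient(filtered)
        boundaries = np.hypot(gx, gy) > 2.0      # assumed edge threshold
        # Group touched nodes into regions (a plain threshold stands in for
        # a full boundary calculation based on the gradient data).
        labels, count = ndimage.label(filtered > 8.0)
        # Calculate the coordinates of each touch region.
        coords = ndimage.center_of_mass(filtered, labels, range(1, count + 1))
        return boundaries, coords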





BRIEF DESCRIPTION OF THE DRAWINGS

The invention will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:



FIGS. 1A and 1B show a user holding conventional touch screens.



FIG. 2 is a perspective view of a display arrangement, in accordance with one embodiment of the present invention.



FIG. 3 shows an image of the touch screen plane at a particular point in time, in accordance with one embodiment of the present invention.



FIG. 4 is a multipoint touch method, in accordance with one embodiment of the present invention.



FIG. 5 is a block diagram of a computer system, in accordance with one embodiment of the present invention.



FIGS. 6A and 6B are partial top views of a transparent multiple point touch screen, in accordance with one embodiment of the present invention.



FIG. 7 is a partial top view of a transparent multipoint touch screen, in accordance with one embodiment of the present invention.



FIGS. 8A and 8B are front elevation views, in cross section, of a display arrangement, in accordance with one embodiment of the present invention.



FIG. 9 is a top view of a transparent multipoint touch screen, in accordance with another embodiment of the present invention.



FIG. 10 is a partial front elevation view, in cross section, of a display arrangement, in accordance with one embodiment of the present invention.



FIGS. 11A and 11B are partial top view diagrams of a driving layer and a sensing layer, in accordance with one embodiment.



FIG. 12 is a simplified diagram of a mutual capacitance circuit, in accordance with one embodiment of the present invention.



FIG. 13 is a diagram of a charge amplifier, in accordance with one embodiment of the present invention.



FIG. 14 is a block diagram of a capacitive sensing circuit, in accordance with one embodiment of the present invention.



FIG. 15 is a flow diagram, in accordance with one embodiment of the present invention.



FIG. 16 is a flow diagram of a digital signal processing method, in accordance with one embodiment of the present invention.



FIGS. 17A-E show touch data at several steps, in accordance with one embodiment of the present invention.



FIG. 18 is a side elevation view of an electronic device, in accordance with one embodiment of the present invention.



FIG. 19 is a side elevation view of an electronic device, in accordance with one embodiment of the present invention.





DETAILED DESCRIPTION OF THE INVENTION

Embodiments of the invention are discussed below with reference to FIGS. 2-19. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes as the invention extends beyond these limited embodiments.



FIG. 2 is a perspective view of a display arrangement 30, in accordance with one embodiment of the present invention. The display arrangement 30 includes a display 34 and a transparent touch screen 36 positioned in front of the display 34. The display 34 is configured to display a graphical user interface (GUI) including perhaps a pointer or cursor as well as other information to the user. The transparent touch screen 36, on the other hand, is an input device that is sensitive to a user's touch, allowing a user to interact with the graphical user interface on the display 34. By way of example, the touch screen 36 may allow a user to move an input pointer or make selections on the graphical user interface by simply pointing at the GUI on the display 34.


In general, touch screens 36 recognize a touch event on the surface 38 of the touch screen 36 and thereafter output this information to a host device. The host device may for example correspond to a computer such as a desktop, laptop, handheld or tablet computer. The host device interprets the touch event and thereafter performs an action based on the touch event. Conventionally, touch screens have only been capable of recognizing a single touch event even when the touch screen is touched at multiple points at the same time (e.g., averaging, masking, etc.). Unlike conventional touch screens, however, the touch screen 36 shown herein is configured to recognize multiple touch events that occur at different locations on the touch sensitive surface 38 of the touch screen 36 at the same time. That is, the touch screen 36 allows for multiple contact points T1-T4 to be tracked simultaneously, i.e., if four objects are touching the touch screen, then the touch screen tracks all four objects. As shown, the touch screen 36 generates separate tracking signals S1-S4 for each touch point T1-T4 that occurs on the surface of the touch screen 36 at the same time. The number of recognizable touches may be about 15; fifteen touch points allow for all 10 fingers, two palms and 3 others.
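
Purely as an illustration of the data involved, the tracking signals S1-S4 might be modeled as one record per touch point; the field names and types below are hypothetical, not nomenclature from this disclosure.

    # Hypothetical per-touch record standing in for tracking signals S1-S4.
    from dataclasses import dataclass

    @dataclass
    class TrackingSignal:
        touch_id: int     # identity of the touch point (T1, T2, ...)
        x: float          # location in the plane of the touch screen
        y: float
        magnitude: float  # intensity of the capacitance change

    MAX_TOUCHES = 15      # 10 fingers, two palms and 3 others
    frame: list[TrackingSignal] = []  # one record per simultaneous touch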


The multiple touch events can be used separately or together to perform singular or multiple actions in the host device. When used separately, a first touch event may be used to perform a first action while a second touch event may be used to perform a second action that is different from the first action. The actions may for example include moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device connected to the host device, etc. When used together, first and second touch events may be used for performing one particular action. The particular action may for example include logging onto a computer or a computer network, permitting authorized individuals access to restricted areas of the computer or computer network, loading a user profile associated with a user's preferred arrangement of the computer desktop, permitting access to web content, launching a particular program, encrypting or decoding a message, and/or the like.


Recognizing multiple touch events is generally accomplished with a multipoint sensing arrangement. The multipoint sensing arrangement is capable of simultaneously detecting and monitoring touches and the magnitude of those touches at distinct points across the touch sensitive surface 38 of the touch screen 36. The multipoint sensing arrangement generally provides a plurality of transparent sensor coordinates or nodes 42 that work independently of one another and that represent different points on the touch screen 36. When plural objects are pressed against the touch screen 36, one or more sensor coordinates are activated for each touch point as for example touch points T1-T4. The sensor coordinates 42 associated with each touch point T1-T4 produce the tracking signals S1-S4.


In one embodiment, the touch screen 36 includes a plurality of capacitance sensing nodes 42. The capacitive sensing nodes may be widely varied. For example, the capacitive sensing nodes may be based on self capacitance or mutual capacitance. In self capacitance, the “self” capacitance of a single electrode is measured as for example relative to ground. In mutual capacitance, the mutual capacitance between at least first and second electrodes is measured. In either case, each of the nodes 42 works independently of the other nodes 42 so as to produce simultaneously occurring signals representative of different points on the touch screen 36.


In order to produce a transparent touch screen 36, the capacitance sensing nodes 42 are formed with a transparent conductive medium such as indium tin oxide (ITO). In self capacitance sensing arrangements, the transparent conductive medium is patterned into spatially separated electrodes and traces. Each of the electrodes represents a different coordinate and the traces connect the electrodes to a capacitive sensing circuit. The coordinates may be associated with a Cartesian coordinate system (x and y), a polar coordinate system (r, θ) or some other coordinate system. In a Cartesian coordinate system, the electrodes may be positioned in columns and rows so as to form a grid array with each electrode representing a different x, y coordinate. During operation, the capacitive sensing circuit monitors changes in capacitance that occur at each of the electrodes. The positions where changes occur and the magnitude of those changes are used to help recognize the multiple touch events. A change in capacitance typically occurs at an electrode when a user places an object such as a finger in close proximity to the electrode, i.e., the object steals charge thereby affecting the capacitance.


In mutual capacitance, the transparent conductive medium is patterned into a group of spatially separated lines formed on two different layers. Driving lines are formed on a first layer and sensing lines are formed on a second layer. Although separated by being on different layers, the sensing lines traverse, intersect or cut across the driving lines thereby forming a capacitive coupling node. The manner in which the sensing lines cut across the driving lines generally depends on the coordinate system used. For example, in a Cartesian coordinate system, the sensing lines are perpendicular to the driving lines thereby forming nodes with distinct x and y coordinates. Alternatively, in a polar coordinate system, the sensing lines may be concentric circles and the driving lines may be radially extending lines (or vice versa). The driving lines are connected to a voltage source and the sensing lines are connected to a capacitive sensing circuit. During operation, a current is driven through one driving line at a time, and because of capacitive coupling, the current is carried through to the sensing lines at each of the nodes (e.g., intersection points). Furthermore, the sensing circuit monitors changes in capacitance that occur at each of the nodes. The positions where changes occur and the magnitude of those changes are used to help recognize the multiple touch events. A change in capacitance typically occurs at a capacitive coupling node when a user places an object such as a finger in close proximity to the capacitive coupling node, i.e., the object steals charge thereby affecting the capacitance.
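
A minimal sketch of that drive-and-sense sequence follows, in Python; drive_line() and read_sense_lines() are hypothetical stand-ins for the voltage source and the capacitive sensing circuit.

    # Sketch: drive one line at a time and read every sensing line at each
    # step, yielding one capacitance value per node (intersection point).
    import numpy as np

    def scan_frame(num_drive, num_sense, drive_line, read_sense_lines):
        """Return one frame of mutual-capacitance values, one per node."""
        frame = np.zeros((num_drive, num_sense))
        for d in range(num_drive):
            drive_line(d)                  # stimulate a single driving line
            frame[d] = read_sense_lines()  # sense all lines in parallel
        return frame                       # image of the touch screen plane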


By way of example, the signals generated at the nodes 42 of the touch screen 36 may be used to produce an image of the touch screen plane at a particular point in time. Referring to FIG. 3, each object in contact with the touch sensitive surface 38 of the touch screen 36 produces a contact patch area 44. Each of the contact patch areas 44 covers several nodes 42. The covered nodes 42 detect surface contact while the remaining nodes 42 do not detect surface contact. As a result, a pixelated image of the touch screen plane can be formed. The signals for each contact patch area 44 may be grouped together to form individual images representative of the contact patch area 44. The image of each contact patch area 44 may include high and low points based on the pressure at each point. The shape of the image as well as the high and low points within the image may be used to differentiate contact patch areas 44 that are in close proximity to one another. Furthermore, the current image, and more particularly the image of each contact patch area 44 can be compared to previous images to determine what action to perform in a host device.
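
One plausible way to split such an image into individual contact patch areas is sketched below; the SciPy labeling call and the threshold value are assumptions, as in the earlier sketch.

    # Sketch: group covered nodes into contact patch areas 44 and extract
    # each patch's sub-image, including its high point (pressure peak).
    import numpy as np
    from scipy import ndimage

    def contact_patches(frame, thresh=8.0):
        labels, count = ndimage.label(frame > thresh)
        patches = []
        for i, area in enumerate(ndimage.find_objects(labels), start=1):
            image = np.where(labels[area] == i, frame[area], 0.0)
            peak = np.unravel_index(np.argmax(image), image.shape)
            patches.append((area, image, peak))
        return patches   # compare against the previous frame's patches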


Referring back to FIG. 2, the display arrangement 30 may be a stand alone unit or it may be integrated with other devices. When stand alone, the display arrangement 30 (or each of its components) acts like a peripheral device (monitor) that includes its own housing and that can be coupled to a host device through wired or wireless connections. When integrated, the display arrangement 30 shares a housing and is hard wired into the host device thereby forming a single unit. By way of example, the display arrangement 30 may be disposed inside a variety of host devices including but not limited to general purpose computers such as desktop, laptop or tablet computers, handhelds such as PDAs and media players such as music players, or peripheral devices such as cameras, printers and/or the like.



FIG. 4 is a multipoint touch method 45, in accordance with one embodiment of the present invention. The method generally begins at block 46 where multiple touches are received on the surface of the touch screen at the same time. This may for example be accomplished by placing multiple fingers on the surface of the touch screen. Following block 46, the process flow proceeds to block 47 where each of the multiple touches is separately recognized by the touch screen. This may for example be accomplished by multipoint capacitance sensors located within the touch screen. Following block 47, the process flow proceeds to block 48 where the touch data based on multiple touches is reported. The touch data may for example be reported to a host device such as a general purpose computer.



FIG. 5 is a block diagram of a computer system 50, in accordance with one embodiment of the present invention. The computer system 50 may correspond to personal computer systems such as desktops, laptops, tablets or handhelds. By way of example, the computer system may correspond to any Apple or PC based computer system. The computer system may also correspond to public computer systems such as information kiosks, automated teller machines (ATM), point of sale machines (POS), industrial machines, gaming machines, arcade machines, vending machines, airline e-ticket terminals, restaurant reservation terminals, customer service stations, library terminals, learning devices, and the like.


As shown, the computer system 50 includes a processor 56 configured to execute instructions and to carry out operations associated with the computer system 50. For example, using instructions retrieved for example from memory, the processor 56 may control the reception and manipulation of input and output data between components of the computing system 50. The processor 56 can be a single-chip processor or can be implemented with multiple components.


In most cases, the processor 56 together with an operating system operates to execute computer code and produce and use data. The computer code and data may reside within a program storage block 58 that is operatively coupled to the processor 56. Program storage block 58 generally provides a place to hold data that is being used by the computer system 50. By way of example, the program storage block may include Read-Only Memory (ROM) 60, Random-Access Memory (RAM) 62, hard disk drive 64 and/or the like. The computer code and data could also reside on a removable storage medium and be loaded or installed onto the computer system when needed. Removable storage media include, for example, CD-ROM, PC-CARD, floppy disk, magnetic tape, and a network component.


The computer system 50 also includes an input/output (I/O) controller 66 that is operatively coupled to the processor 56. The (I/O) controller 66 may be integrated with the processor 56 or it may be a separate component as shown. The I/O controller 66 is generally configured to control interactions with one or more I/O devices. The I/O controller 66 generally operates by exchanging data between the processor and the I/O devices that desire to communicate with the processor. The I/O devices and the I/O controller typically communicate through a data link 67. The data link 67 may be a one-way or two-way link. In some cases, the I/O devices may be connected to the I/O controller 66 through wired connections. In other cases, the I/O devices may be connected to the I/O controller 66 through wireless connections. By way of example, the data link 67 may correspond to PS/2, USB, Firewire, IR, RF, Bluetooth or the like.


The computer system 50 also includes a display device 68 that is operatively coupled to the processor 56. The display device 68 may be a separate component (peripheral device) or it may be integrated with the processor and program storage to form a desktop computer (all in one machine), a laptop, handheld or tablet or the like. The display device 68 is configured to display a graphical user interface (GUI) including perhaps a pointer or cursor as well as other information to the user. By way of example, the display device 68 may be a monochrome display, color graphics adapter (CGA) display, enhanced graphics adapter (EGA) display, variable-graphics-array (VGA) display, super VGA display, liquid crystal display (e.g., active matrix, passive matrix and the like), cathode ray tube (CRT), plasma displays and the like.


The computer system 50 also includes a touch screen 70 that is operatively coupled to the processor 56. The touch screen 70 is a transparent panel that is positioned in front of the display device 68. The touch screen 70 may be integrated with the display device 68 or it may be a separate component. The touch screen 70 is configured to receive input from a user's touch and to send this information to the processor 56. In most cases, the touch screen 70 recognizes touches and the position and magnitude of touches on its surface. The touch screen 70 reports the touches to the processor 56 and the processor 56 interprets the touches in accordance with its programming. For example, the processor 56 may initiate a task in accordance with a particular touch.


In accordance with one embodiment, the touch screen 70 is capable of tracking multiple objects, which rest on, tap on, or move across the touch sensitive surface of the touch screen at the same time. The multiple objects may for example correspond to fingers and palms. Because the touch screen is capable of tracking multiple objects, a user may perform several touch initiated tasks at the same time. For example, the user may select an onscreen button with one finger, while moving a cursor with another finger. In addition, a user may move a scroll bar with one finger while selecting an item from a menu with another finger. Furthermore, a first object may be dragged with one finger while a second object may be dragged with another finger. Moreover, gesturing may be performed with more than one finger.


To elaborate, the touch screen 70 generally includes a sensing device 72 configured to detect an object in close proximity thereto and/or the pressure exerted thereon. The sensing device 72 may be widely varied. In one particular embodiment, the sensing device 72 is divided into several independent and spatially distinct sensing points, nodes or regions 74 that are positioned throughout the touch screen 70. The sensing points 74, which are typically hidden from view, are dispersed about the touch screen 70 with each sensing point 74 representing a different position on the surface of the touch screen 70 (or touch screen plane). The sensing points 74 may be positioned in a grid or a pixel array where each pixelated sensing point 74 is capable of generating a signal at the same time. In the simplest case, a signal is produced each time an object is positioned over a sensing point 74. When an object is placed over multiple sensing points 74 or when the object is moved between or over multiple sensing points 74, multiple signals are generated.


The number and configuration of the sensing points 74 may be widely varied. The number of sensing points 74 generally depends on the desired sensitivity as well as the desired transparency of the touch screen 70. More nodes or sensing points generally increase sensitivity but reduce transparency (and vice versa). With regard to configuration, the sensing points 74 generally map the touch screen plane into a coordinate system such as a Cartesian coordinate system, a polar coordinate system or some other coordinate system. When a Cartesian coordinate system is used (as shown), the sensing points 74 typically correspond to x and y coordinates. When a polar coordinate system is used, the sensing points typically correspond to radial (r) and angular (θ) coordinates.


The touch screen 70 may include a sensing circuit 76 that acquires the data from the sensing device 72 and that supplies the acquired data to the processor 56. Alternatively, the processor may include this functionality. In one embodiment, the sensing circuit 76 is configured to send raw data to the processor 56 so that the processor 56 processes the raw data. For example, the processor 56 receives data from the sensing circuit 76 and then determines how the data is to be used within the computer system 50. The data may include the coordinates of each sensing point 74 as well as the pressure exerted on each sensing point 74. In another embodiment, the sensing circuit 76 is configured to process the raw data itself. That is, the sensing circuit 76 reads the pulses from the sensing points 74 and turns them into data that the processor 56 can understand. The sensing circuit 76 may perform filtering and/or conversion processes. Filtering processes are typically implemented to reduce a busy data stream so that the processor 56 is not overloaded with redundant or non-essential data. The conversion processes may be implemented to adjust the raw data before sending or reporting them to the processor 56. The conversions may include determining the center point for each touch region (e.g., centroid).
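
A sketch of the centroid conversion mentioned above, assuming the node values of a single touch region are available as a small 2-D array:

    # Reduce one touch region to a single reported point: a capacitance-
    # weighted centroid of the region's node values.
    import numpy as np

    def region_centroid(region):
        """region: 2-D array of values for one touch region's nodes."""
        ys, xs = np.indices(region.shape)
        total = region.sum()
        return (xs * region).sum() / total, (ys * region).sum() / total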


The sensing circuit 76 may include a storage element for storing a touch screen program, which is capable of controlling different aspects of the touch screen 70. For example, the touch screen program may specify what type of value to output based on the sensing points 74 selected (e.g., coordinates). In fact, the sensing circuit in conjunction with the touch screen program may follow a predetermined communication protocol. As is generally well known, communication protocols are a set of rules and procedures for exchanging data between two devices. Communication protocols typically transmit information in data blocks or packets that contain the data to be transmitted, the data required to direct the packet to its destination, and the data that corrects errors that occur along the way. By way of example, the sensing circuit may place the data in a HID format (Human Interface Device).
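
The exact report layout is not spelled out here, so the following packing of touch data into a fixed binary block is purely hypothetical (the report ID, field widths and byte order are all assumed):

    # Hypothetical binary touch report (not an actual HID descriptor).
    import struct

    def pack_report(touches):
        """touches: list of (touch_id, x, y, pressure) integer tuples."""
        header = struct.pack("<BB", 0x01, len(touches))  # report id, count
        body = b"".join(struct.pack("<Bhhh", tid, x, y, p)
                        for tid, x, y, p in touches)
        return header + body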


The sensing circuit 76 generally includes one or more microcontrollers, each of which monitors one or more sensing points 74. The microcontrollers may for example correspond to an application specific integrated circuit (ASIC), which works with firmware to monitor the signals from the sensing device 72 and to process the monitored signals and to report this information to the processor 56.


In accordance with one embodiment, the sensing device 72 is based on capacitance. As should be appreciated, whenever two electrically conductive members come close to one another without actually touching, their electric fields interact to form capacitance. In most cases, the first electrically conductive member is a sensing point 74 and the second electrically conductive member is an object 80 such as a finger. As the object 80 approaches the surface of the touch screen 70, a tiny capacitance forms between the object 80 and the sensing points 74 in close proximity to the object 80. By detecting changes in capacitance at each of the sensing points 74 and noting the position of the sensing points, the sensing circuit can recognize multiple objects, and determine the location, pressure, direction, speed and acceleration of the objects 80 as they are moved across the touch screen 70. For example, the sensing circuit can determine when and where each of the fingers and palm of one or more hands are touching as well as the pressure being exerted by the finger and palm of the hand(s) at the same time.
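
For instance, direction, speed and acceleration can be derived by differencing the positions reported in successive frames; in this sketch the frame interval dt and the matching of touches across frames are assumed to be handled elsewhere.

    # Derive velocity and acceleration for one tracked object from its
    # positions in consecutive frames (finite differences).
    def motion(prev_pos, cur_pos, prev_vel, dt):
        vx = (cur_pos[0] - prev_pos[0]) / dt
        vy = (cur_pos[1] - prev_pos[1]) / dt
        ax = (vx - prev_vel[0]) / dt
        ay = (vy - prev_vel[1]) / dt
        speed = (vx ** 2 + vy ** 2) ** 0.5
        return (vx, vy), (ax, ay), speed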


The simplicity of capacitance allows for a great deal of flexibility in design and construction of the sensing device 72. By way of example, the sensing device 72 may be based on self capacitance or mutual capacitance. In self capacitance, each of the sensing points 74 is provided by an individual charged electrode. As an object approaches the surface of the touch screen 70, the object capacitively couples to those electrodes in close proximity to the object, thereby stealing charge away from the electrodes. The amount of charge in each of the electrodes is measured by the sensing circuit 76 to determine the positions of multiple objects when they touch the touch screen 70. In mutual capacitance, the sensing device 72 includes a two layer grid of spatially separated lines or wires. In the simplest case, the upper layer includes lines in rows while the lower layer includes lines in columns (e.g., orthogonal). The sensing points 74 are provided at the intersections of the rows and columns. During operation, the rows are charged and the charge capacitively couples to the columns at the intersection. As an object approaches the surface of the touch screen, the object capacitively couples to the rows at the intersections in close proximity to the object, thereby stealing charge away from the rows and therefore the columns as well. The amount of charge in each of the columns is measured by the sensing circuit 76 to determine the positions of multiple objects when they touch the touch screen 70.
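
A sketch of the self capacitance case follows; measure_electrode() is a hypothetical stand-in for the sensing circuit, and the detection threshold is an assumed constant.

    # Self capacitance sketch: measure each electrode against its untouched
    # baseline; electrodes that lose enough charge are reported as touched.
    def read_self_capacitance(baselines, measure_electrode, delta=5.0):
        touched = []
        for e, baseline in enumerate(baselines):
            drop = baseline - measure_electrode(e)  # charge "stolen" by object
            if drop > delta:
                touched.append((e, drop))
        return touched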



FIG. 6 is a partial top view of a transparent multiple point touch screen 100, in accordance with one embodiment of the present invention. By way of example, the touch screen 100 may generally correspond to the touch screen shown in FIGS. 2 and 4. The multipoint touch screen 100 is capable of sensing the position and the pressure of multiple objects at the same time. This particular touch screen 100 is based on self capacitance and thus it includes a plurality of transparent capacitive sensing electrodes 102, which each represent different coordinates in the plane of the touch screen 100. The electrodes 102 are configured to receive capacitive input from one or more objects touching the touch screen 100 in the vicinity of the electrodes 102. When an object is proximate an electrode 102, the object steals charge thereby affecting the capacitance at the electrode 102. The electrodes 102 are connected to a capacitive sensing circuit 104 through traces 106 that are positioned in the gaps 108 found between the spaced apart electrodes 102. The electrodes 102 are spaced apart in order to electrically isolate them from each other as well as to provide a space for separately routing the sense traces 106. The gap 108 is preferably made small so as to maximize the sensing area and to minimize optical differences between the space and the transparent electrodes.


As shown, the sense traces 106 are routed from each electrode 102 to the sides of the touch screen 100 where they are connected to the capacitive sensing circuit 104. The capacitive sensing circuit 104 includes one or more sensor ICs 110 that measure the capacitance at each electrode 102 and that report their findings or some form thereof to a host controller. The sensor ICs 110 may for example convert the analog capacitive signals to digital data and thereafter transmit the digital data over a serial bus to a host controller. Any number of sensor ICs may be used. For example, a single chip may be used for all electrodes, or multiple chips may be used for a single or group of electrodes. In most cases, the sensor ICs 110 report tracking signals, which are a function of both the position of the electrode 102 and the intensity of the capacitance at the electrode 102.


The electrodes 102, traces 106 and sensing circuit 104 are generally disposed on an optically transmissive member 112. In most cases, the optically transmissive member 112 is formed from a clear material such as glass or plastic. The electrodes 102 and traces 106 may be placed on the member 112 using any suitable patterning technique including for example, deposition, etching, printing and the like. The electrodes 102 and sense traces 106 can be made from any suitable transparent conductive material. By way of example, the electrodes 102 and traces 106 may be formed from indium tin oxide (ITO). In addition, the sensor ICs 110 of the sensing circuit 104 can be electrically coupled to the traces 106 using any suitable techniques. In one implementation, the sensor ICs 110 are placed directly on the member 112 (flip chip). In another implementation, a flex circuit is bonded to the member 112, and the sensor ICs 110 are attached to the flex circuit. In yet another implementation, a flex circuit is bonded to the member 112, a PCB is bonded to the flex circuit and the sensor ICs 110 are attached to the PCB. The sensor ICs may for example be capacitance sensing ICs such as those manufactured by Synaptics of San Jose, Calif., Fingerworks of Newark, Del. or Alps of San Jose, Calif.


The distribution of the electrodes 102 may be widely varied. For example, the electrodes 102 may be positioned almost anywhere in the plane of the touch screen 100. The electrodes 102 may be positioned randomly or in a particular pattern about the touch screen 100. With regard to the latter, the position of the electrodes 102 may depend on the coordinate system used. For example, the electrodes 102 may be placed in an array of rows and columns for Cartesian coordinates or an array of concentric and radial segments for polar coordinates. Within each array, the rows, columns, concentric or radial segments may be stacked uniformly relative to the others or they may be staggered or offset relative to the others. Additionally, within each row or column, or within each concentric or radial segment, the electrodes 102 may be staggered or offset relative to an adjacent electrode 102.


Furthermore, the electrodes 102 may be formed from almost any shape whether simple (e.g., squares, circles, ovals, triangles, rectangles, polygons, and the like) or complex (e.g., random shapes). Further still, the electrodes 102 may have identical shapes or different shapes. For example, one set of electrodes 102 may have a first shape while a second set of electrodes 102 may have a second shape that is different than the first shape. The shapes are generally chosen to maximize the sensing area and to minimize optical differences between the gaps and the transparent electrodes.


In addition, the size of the electrodes 102 may vary according to the specific needs of each device. In some cases, the size of the electrodes 102 corresponds to about the size of a fingertip. For example, the size of the electrodes 102 may be on the order of 4-5 mm². In other cases, the size of the electrodes 102 is smaller than the size of the fingertip so as to improve resolution of the touch screen 100 (the finger can influence two or more electrodes at any one time thereby enabling interpolation). Like the shapes, the sizes of the electrodes 102 may be identical or they may be different. For example, one set of electrodes 102 may be larger than another set of electrodes 102. Moreover, any number of electrodes 102 may be used. The number of electrodes 102 is typically determined by the size of the touch screen 100 as well as the size of each electrode 102. In most cases, it would be desirable to increase the number of electrodes 102 so as to provide higher resolution, i.e., more information can be used for such things as acceleration.
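
That interpolation might look like the following sketch, which assumes per-electrode signal strengths along one row and an electrode pitch of about 4.5 mm (chosen to match the 4-5 mm sizing above):

    # Weighted average of neighboring electrode signals gives a position
    # finer than the electrode spacing itself.
    def interpolate_row(signals, pitch_mm=4.5):
        """signals: per-electrode strengths along one row (assumed nonzero)."""
        total = sum(signals)
        index = sum(i * s for i, s in enumerate(signals)) / total
        return index * pitch_mm    # fractional electrode index -> mm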


Although the sense traces 106 can be routed a variety of ways, they are typically routed in a manner that reduces the distance they have to travel between their electrode 102 and the sensor circuit 104, and that reduces the size of the gaps 108 found between adjacent electrodes 102. The widths of the sense traces 106 are also widely varied. The widths are generally determined by the amount of charge being distributed there through, the number of adjacent traces 106, and the size of the gap 108 through which they travel. It is generally desirable to maximize the widths of adjacent traces 106 in order to maximize the coverage inside the gaps 108 thereby creating a more uniform optical appearance.


In the illustrated embodiment, the electrodes 102 are positioned in a pixelated array. As shown, the electrodes 102 are positioned in rows 116 that extend to and from the sides of the touch screen 100. Within each row 116, the identical electrodes 102 are spaced apart and positioned laterally relative to one another (e.g., juxtaposed). Furthermore, the rows 116 are stacked on top of each other thereby forming the pixelated array. The sense traces 106 are routed in the gaps 108 formed between adjacent rows 116. The sense traces 106 for each row are routed in two different directions. The sense traces 106 on one side of the row 116 are routed to a sensor IC 110 located on the left side and the sense traces 106 on the other side of the row 116 are routed to another sensor IC 110 located on the right side of the touch screen 100. This is done to minimize the gap 108 formed between rows 116. The gap 108 may for example be held to about 20 microns. As should be appreciated, the spaces between the traces can stack thereby creating a large gap between electrodes. If routed to one side, the size of the space would be substantially doubled thereby reducing the resolution of the touch screen. Moreover, the shape of the electrode 102 is in the form of a parallelogram, and more particularly a parallelogram with sloping sides.



FIG. 7 is a partial top view of a transparent multipoint touch screen 120, in accordance with one embodiment of the present invention. In this embodiment, the touch screen 120 is similar to the touch screen 100 shown in FIG. 6; however, unlike the touch screen 100 of FIG. 6, the touch screen 120 shown in FIG. 7 includes electrodes 122 with different sizes. As shown, the electrodes 122 located in the center of the touch screen 120 are larger than the electrodes 122 located at the sides of the touch screen 120. In fact, the height of the electrodes 122 gets correspondingly smaller when moving from the center to the edge of the touch screen 120. This is done to make room for the sense traces 124 extending from the sides of the more centrally located electrodes 122. This arrangement advantageously reduces the gap found between adjacent rows 126 of electrodes 122. Although the height of each electrode 122 shrinks, the height H of the row 126 as well as the width W of each electrode 122 stays the same. In one configuration, the height of the row 126 is substantially equal to the width of each electrode 122. For example, the height of the row 126 and the width of each electrode 122 may be about 4 mm to about 5 mm.



FIG. 8 is a front elevation view, in cross section, of a display arrangement 130, in accordance with one embodiment of the present invention. The display arrangement 130 includes an LCD display 132 and a touch screen 134 positioned over the LCD display 132. The touch screen may for example correspond to the touch screen shown in FIG. 6 or 7. The LCD display 132 may correspond to any conventional LCD display known in the art. Although not shown, the LCD display 132 typically includes various layers including a fluorescent panel, polarizing filters, a layer of liquid crystal cells, a color filter and the like.


The touch screen 134 includes a transparent electrode layer 136 that is positioned over a glass member 138. The glass member 138 may be a portion of the LCD display 132 or it may be a portion of the touch screen 134. In either case, the glass member 138 is a relatively thick piece of clear glass that protects the display 132 from forces, which are exerted on the touch screen 134. The thickness of the glass member 138 may for example be about 2 mm. In most cases, the electrode layer 136 is disposed on the glass member 138 using suitable transparent conductive materials and patterning techniques such as ITO and printing. Although not shown, in some cases, it may be necessary to coat the electrode layer 136 with a material of similar refractive index to improve the visual appearance of the touch screen. As should be appreciated, the gaps located between electrodes and traces do not have the same optical index as the electrodes and traces, and therefore a material may be needed to provide a more similar optical index. By way of example, index matching gels may be used.


The touch screen 134 also includes a protective cover sheet 140 disposed over the electrode layer 136. The electrode layer 136 is therefore sandwiched between the glass member 138 and the protective cover sheet 140. The protective sheet 140 serves to protect the under layers and provide a surface for allowing an object to slide thereon. The protective sheet 140 also provides an insulating layer between the object and the electrode layer 136. The protective cover sheet 140 may be formed from any suitable clear material such as glass and plastic. The protective cover sheet 140 is suitably thin to allow for sufficient electrode coupling. By way of example, the thickness of the cover sheet 140 may be between about 0.3-0.8 mm. In addition, the protective cover sheet 140 may be treated with coatings to reduce stiction when touching and reduce glare when viewing the underlying LCD display 132. By way of example, a low-stiction/anti-reflective coating 142 may be applied over the cover sheet 140. Although the electrode layer 136 is typically patterned on the glass member 138, it should be noted that in some cases it may be alternatively or additionally patterned on the protective cover sheet 140.



FIG. 9 is a top view of a transparent multipoint touch screen 150, in accordance with another embodiment of the present invention. By way of example, the touch screen 150 may generally correspond to the touch screen of FIGS. 2 and 4. Unlike the touch screen shown in FIGS. 6-8, the touch screen of FIG. 9 utilizes the concept of mutual capacitance rather than self capacitance. As shown, the touch screen 150 includes a two layer grid of spatially separated lines or wires 152. In most cases, the lines 152 on each layer are parallel to one another. Furthermore, although in different planes, the lines 152 on the different layers are configured to intersect or cross in order to produce capacitive sensing nodes 154, which each represent different coordinates in the plane of the touch screen 150. The nodes 154 are configured to receive capacitive input from an object touching the touch screen 150 in the vicinity of the node 154. When an object is proximate the node 154, the object steals charge thereby affecting the capacitance at the node 154.


To elaborate, the lines 152 on different layers serve two different functions. One set of lines 152A drives a current therethrough while the second set of lines 152B senses the capacitance coupling at each of the nodes 154. In most cases, the top layer provides the driving lines 152A while the bottom layer provides the sensing lines 152B. The driving lines 152A are connected to a voltage source (not shown) that separately drives the current through each of the driving lines 152A. That is, the stimulus is only happening over one line while all the other lines are grounded. They may be driven similarly to a raster scan. The sensing lines 152B are connected to a capacitive sensing circuit (not shown) that continuously senses all of the sensing lines 152B (always sensing).


When driven, the charge on the driving line 152A capacitively couples to the intersecting sensing lines 152B through the nodes 154 and the capacitive sensing circuit senses all of the sensing lines 152B in parallel. Thereafter, the next driving line 152A is driven, and the charge on the next driving line 152A capacitively couples to the intersecting sensing lines 152B through the nodes 154 and the capacitive sensing circuit senses all of the sensing lines 152B in parallel. This happens sequentially until all the lines 152A have been driven. Once all the lines 152A have been driven, the sequence starts over (continuously repeats). In most cases, the lines 152A are sequentially driven from one side to the opposite side.


The capacitive sensing circuit typically includes one or more sensor ICs that measure the capacitance in each of the sensing lines 152B and that report their findings to a host controller. The sensor ICs may for example convert the analog capacitive signals to digital data and thereafter transmit the digital data over a serial bus to a host controller. Any number of sensor ICs may be used. For example, a sensor IC may be used for all lines, or multiple sensor ICs may be used for a single or group of lines. In most cases, the sensor ICs report tracking signals, which are a function of both the position of the node 154 and the intensity of the capacitance at the node 154.


The lines 152 are generally disposed on one or more optically transmissive members 156 formed from a clear material such as glass or plastic. By way of example, the lines 152 may be placed on opposing sides of the same member 156 or they may be placed on different members 156. The lines 152 may be placed on the member 156 using any suitable patterning technique including for example, deposition, etching, printing and the like. Furthermore, the lines 152 can be made from any suitable transparent conductive material. By way of example, the lines may be formed from indium tin oxide (ITO). The driving lines 152A are typically coupled to the voltage source through a flex circuit 158A, and the sensing lines 152B are typically coupled to the sensing circuit, and more particularly the sensor ICs through a flex circuit 158B. The sensor ICs may be attached to a printed circuit board (PCB). Alternatively, the sensor ICs may be placed directly on the member 156 thereby eliminating the flex circuit 158B.


The distribution of the lines 152 may be widely varied. For example, the lines 152 may be positioned almost anywhere in the plane of the touch screen 150. The lines 152 may be positioned randomly or in a particular pattern about the touch screen 150. With regard to the latter, the position of the lines 152 may depend on the coordinate system used. For example, the lines 152 may be placed in rows and columns for Cartesian coordinates or concentrically and radially for polar coordinates. When using rows and columns, the rows and columns may be placed at various angles relative to one another. For example, they may be vertical, horizontal or diagonal.


Furthermore, the lines 152 may be formed from almost any shape whether rectilinear or curvilinear. The lines on each layer may be the same or different. For example, the lines may alternate between rectilinear and curvilinear. Further still, the opposing lines may have identical or different shapes. For example, the driving lines may have a first shape while the sensing lines may have a second shape that is different than the first shape. The geometry of the lines 152 (e.g., linewidths and spacing) may also be widely varied. The geometry of the lines within each layer may be identical or different, and further, the geometry of the lines for both layers may be identical or different. By way of example, the linewidths of the sensing lines 152B to driving lines 152A may have a ratio of about 2:1.


Moreover, any number of lines 152 may be used. It is generally believed that the number of lines is dependent on the desired resolution of the touch screen 150. The number of lines within each layer may be identical or different. The number of lines is typically determined by the size of the touch screen as well as the desired pitch and linewidths of the lines 152.


In the illustrated embodiment, the driving lines 152A are positioned in rows and the sensing lines 152B are positioned in columns that are perpendicular to the rows. The rows extend horizontally to the sides of the touch screen 150 and the columns extend vertically to the top and bottom of the touch screen 150. Furthermore, the linewidths for the sets of lines 152A and 152B are different and the pitches for the sets of lines 152A and 152B are equal to one another. In most cases, the linewidths of the sensing lines 152B are larger than the linewidths of the driving lines 152A. By way of example, the pitch of the driving and sensing lines 152 may be about 5 mm, the linewidths of the driving lines 152A may be about 1.05 mm and the linewidths of the sensing lines 152B may be about 2.10 mm. Moreover, the number of lines 152 in each layer is different. For example, there may be about 38 driving lines and about 50 sensing lines.
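
As a quick arithmetic check on those quoted dimensions (all approximate):

    # Worked example using the quoted geometry: 38 driving rows and 50
    # sensing columns at a 5 mm pitch.
    drive_lines, sense_lines, pitch_mm = 38, 50, 5.0
    nodes = drive_lines * sense_lines      # 1,900 capacitive sensing nodes
    height_mm = drive_lines * pitch_mm     # about 190 mm of rows
    width_mm = sense_lines * pitch_mm      # about 250 mm of columns
    print(nodes, height_mm, width_mm)      # -> 1900 190.0 250.0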


As mentioned above, in order to form semi-transparent conductors on glass, film or plastic, the lines may be patterned with an ITO material. This is generally accomplished by depositing an ITO layer over the substrate surface, and then by etching away portions of the ITO layer in order to form the lines. As should be appreciated, the areas with ITO tend to have lower transparency than the areas without ITO. This is generally less desirable for the user as the user can distinguish the lines from the spaces therebetween, i.e., the patterned ITO can become quite visible thereby producing a touch screen with undesirable optical properties. To further exacerbate this problem, the ITO material is typically applied in a manner that produces a relatively low resistance, and unfortunately low resistance ITO tends to be less transparent than high resistance ITO.


In order to prevent the aforementioned problem, the dead areas between the ITO may be filled with index matching materials. In another embodiment, rather than simply etching away all of the ITO, the dead areas (the uncovered spaces) may be subdivided into unconnected electrically floating ITO pads, i.e., the dead areas may be patterned with spatially separated pads. The pads are typically separated with a minimum trace width. Furthermore, the pads are typically made small to reduce their impact on the capacitive measurements. This technique attempts to minimize the appearance of the ITO by creating a uniform optical retarder. That is, by seeking to create a uniform sheet of ITO, it is believed that the panel will function closer to a uniform optical retarder and therefore non-uniformities in the visual appearance will be minimized. In yet another embodiment, a combination of index matching materials and unconnected floating pads may be used.



FIG. 10 is a partial front elevation view, in cross section, of a display arrangement 170, in accordance with one embodiment of the present invention. The display arrangement 170 includes an LCD display 172 and a touch screen 174 positioned over the LCD display 172. The touch screen may for example correspond to the touch screen shown in FIG. 9. The LCD display 172 may correspond to any conventional LCD display known in the art. Although not shown, the LCD display 172 typically includes various layers including a fluorescent panel, polarizing filters, a layer of liquid crystal cells, a color filter and the like.


The touch screen 174 includes a transparent sensing layer 176 that is positioned over a first glass member 178. The sensing layer 176 includes a plurality of sensor lines 177 positioned in columns (extend in and out of the page). The first glass member 178 may be a portion of the LCD display 172 or it may be a portion of the touch screen 174. For example, it may be the front glass of the LCD display 172 or it may be the bottom glass of the touch screen 174. The sensing layer 176 is typically disposed on the glass member 178 using suitable transparent conductive materials and patterning techniques. In some cases, it may be necessary to coat the sensing layer 176 with a material of similar refractive index to improve the visual appearance, i.e., to make it more uniform.


The touch screen 174 also includes a transparent driving layer 180 that is positioned over a second glass member 182. The second glass member 182 is positioned over the first glass member 178. The sensing layer 176 is therefore sandwiched between the first and second glass members 178 and 182. The second glass member 182 provides an insulating layer between the driving and sensing layers 176 and 180. The driving layer 180 includes a plurality of driving lines 181 positioned in rows (extend to the right and left of the page). The driving lines 181 are configured to intersect or cross the sensing lines 177 positioned in columns in order to form a plurality of capacitive coupling nodes 182. Like the sensing layer 176, the driving layer 180 is disposed on the glass member using suitable materials and patterning techniques. Furthermore, in some cases, it may be necessary to coat the driving layer 180 with material of similar refractive index to improve the visual appearance. Although the sensing layer is typically patterned on the first glass member, it should be noted that in some cases it may be alternatively or additionally patterned on the second glass member.


The touch screen 174 also includes a protective cover sheet 190 disposed over the driving layer 180. The driving layer 180 is therefore sandwiched between the second glass member 182 and the protective cover sheet 190. The protective cover sheet 190 serves to protect the underlying layers and to provide a surface on which an object can slide. The protective cover sheet 190 also provides an insulating layer between the object and the driving layer 180, and is suitably thin to allow for sufficient capacitive coupling. The protective cover sheet 190 may be formed from any suitable clear material such as glass or plastic. In addition, the protective cover sheet 190 may be treated with coatings to reduce stiction when touching and to reduce glare when viewing the underlying LCD display 172. By way of example, a low-stiction/anti-reflective coating may be applied over the cover sheet 190. Although the line layer is typically patterned on a glass member, it should be noted that in some cases it may be alternatively or additionally patterned on the protective cover sheet.


The touch screen 174 also includes various bonding layers 192. The bonding layers 192 bond the glass members 178 and 182 as well as the protective cover sheet 190 together to form the laminated structure and to provide rigidity and stiffness to the laminated structure. In essence, the bonding layers 192 help to produce a monolithic sheet that is stronger than each of the individual layers taken alone. In most cases, the first and second glass members 178 and 182, as well as the second glass member 182 and the protective cover sheet 190, are laminated together using a bonding agent such as glue. The compliant nature of the glue may be used to absorb geometric variations so as to form a singular composite structure with an overall geometry that is desirable. In some cases, the bonding agent includes an index matching material to improve the visual appearance of the touch screen 174.


With regard to configuration, each of the various layers may be formed with various sizes, shapes, and the like. For example, each of the layers may have the same thickness or a different thickness than the other layers in the structure. In the illustrated embodiment, the first glass member 178 has a thickness of about 1.1 mm, the second glass member 182 has a thickness of about 0.4 mm and the protective cover sheet 190 has a thickness of about 0.55 mm. The thickness of the bonding layers 192 typically varies in order to produce a laminated structure with a desired height. Furthermore, each of the layers may be formed with various materials. By way of example, each particular type of layer may be formed from the same or a different material. For example, any suitable glass or plastic material may be used for the glass members. In a similar manner, any suitable bonding agent may be used for the bonding layers 192.



FIGS. 11A and 11B are partial top view diagrams of a driving layer 200 and a sensing layer 202, in accordance with one embodiment. In this embodiment, each of the layers 200 and 202 includes dummy features 204 disposed between the driving lines 206 and the sensing lines 208. The dummy features 204 are configured to optically improve the visual appearance of the touch screen by more closely matching the optical index of the lines. While index matching materials may improve the visual appearance, it has been found that there still may exist some non-uniformities. The dummy features 204 are isolated and positioned in the gaps between each of the lines 206 and 208. Although they may be patterned separately, the dummy features 204 are typically patterned along with the lines 206 and 208. Furthermore, although they may be formed from different materials, the dummy features 204 are typically formed from the same transparent conductive material as the lines, as for example ITO, to provide the best possible index matching. As should be appreciated, the dummy features will more than likely still produce some gaps, but these gaps are much smaller than the gaps found between the lines (many orders of magnitude smaller). These gaps, therefore, have minimal impact on the visual appearance. While this may be the case, index matching materials may additionally be applied to the gaps between the dummy features to further improve the visual appearance of the touch screen. The distribution, size, number, dimension, and shape of the dummy features may be widely varied.



FIG. 12 is a simplified diagram of a mutual capacitance circuit 220, in accordance with one embodiment of the present invention. The mutual capacitance circuit 220 includes a driving line 222 and a sensing line 224 that are spatially separated thereby forming a capacitive coupling node 226. The driving line 222 is electrically coupled to a voltage source 228, and the sensing line 224 is electrically coupled to a capacitive sensing circuit 230. The driving line 222 is configured to carry a current to the capacitive coupling node 226, and the sensing line 224 is configured to carry a current to the capacitive sensing circuit 230. When no object is present, the capacitive coupling at the node 226 stays fairly constant. When an object 232 such as a finger is placed proximate the node 226, the capacitive coupling through the node 226 changes. The object 232 effectively shunts some of the field away so that the charge projected across the node 226 is reduced. The change in capacitive coupling changes the current that is carried by the sensing line 224. The capacitive sensing circuit 230 notes the current change and the position of the node 226 where the current change occurred and reports this information in a raw or in some processed form to a host controller. The capacitive sensing circuit does this for each node 226 at about the same time (as viewed by a user) so as to provide multipoint sensing.
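To make the shunting effect concrete, the following Python sketch models the charge transferred across a single node before and during a touch. It is a minimal illustration only; the coupling capacitance, drive voltage, and the assumed 25% worst-case shunt fraction are invented example values, not figures taken from the embodiments described herein.

```python
# Minimal numerical model of mutual-capacitance sensing at one node.
# All values (capacitance, voltage, shunt fraction) are illustrative
# assumptions, not parameters from the described embodiments.

C_NODE_BASELINE = 1.0e-12   # nominal drive-to-sense coupling, farads
V_DRIVE = 3.0               # stimulus on the driving line, volts

def sensed_charge(coverage: float) -> float:
    """Charge projected across the node for a finger covering the node
    by `coverage` (0.0 = no touch, 1.0 = full touch). The finger shunts
    part of the field to ground, reducing the effective coupling."""
    shunt_fraction = 0.25 * coverage        # assumed field loss at full touch
    c_effective = C_NODE_BASELINE * (1.0 - shunt_fraction)
    return c_effective * V_DRIVE            # Q = C * V

delta = sensed_charge(0.0) - sensed_charge(1.0)
print(f"charge reduction caused by a full touch: {delta:.3e} C")
```

The sensing circuit effectively looks for this reduction, node by node, to locate each touch.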


The sensing line 224 may contain a filter 236 for eliminating parasitic capacitance 237, which may for example be created by the large surface area of the row and column lines relative to the other lines and to the system enclosure at ground potential. Generally speaking, the filter rejects stray capacitance effects so that a clean representation of the charge transferred across the node 226, and nothing in addition to it, is outputted. That is, the filter 236 produces an output that is not dependent on the parasitic capacitance, but rather on the capacitance at the node 226. As a result, a more accurate output is produced.



FIG. 13 is a diagram of an inverting amplifier 240, in accordance with one embodiment of the present invention. The inverting amplifier 240 may generally correspond to the filter 236 shown in FIG. 12. As shown, the inverting amplifier includes a non-inverting input that is held at a constant voltage (in this case ground), an inverting input that is coupled to the node, and an output that is coupled to the capacitive sensing circuit 230. The output is coupled back to the inverting input through a capacitor. During operation, the input from the node may be disturbed by stray capacitance effects, i.e., parasitic capacitance. If so, the inverting amplifier is configured to drive the input back to the voltage it had before the disturbance. As such, the value of the parasitic capacitance does not matter.
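The reason the parasitic capacitance drops out can be seen from the ideal transfer relationship of such a virtual-ground charge amplifier, Vout = -(Cnode/Cfeedback) x Vdrive: because the inverting input is pinned at the virtual ground, the voltage across the parasitic capacitance never changes, so it never stores signal charge. The Python sketch below demonstrates this with assumed component values; the function name and all numbers are illustrative, not part of the described circuit.

```python
# Ideal virtual-ground charge amplifier (cf. FIG. 13), illustrative values.
# The inverting input is held at 0 V, so all signal charge from the node
# is forced onto the feedback capacitor:  Vout = -(C_node/C_fb) * V_drive.

def charge_amp_output(c_node, c_fb, v_drive, c_parasitic):
    # c_parasitic is accepted only to show that it does not enter the
    # result: the virtual ground keeps the voltage across it constant.
    del c_parasitic
    return -(c_node / c_fb) * v_drive

for c_par in (0.0, 10e-12, 100e-12):    # wildly different stray capacitances
    v_out = charge_amp_output(c_node=1e-12, c_fb=10e-12,
                              v_drive=3.0, c_parasitic=c_par)
    print(f"C_parasitic = {c_par:.0e} F  ->  Vout = {v_out:.3f} V")
```

All three cases print the same output voltage, which is the behavior the filter 236 is intended to provide.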



FIG. 14 is a block diagram of a capacitive sensing circuit 260, in accordance with one embodiment of the present invention. The capacitive sensing circuit 260 may for example correspond to the capacitive sensing circuits described in the previous figures. The capacitive sensing circuit 260 is configured to receive input data from a plurality of sensing points 262 (electrodes, nodes, etc.), to process the data and to output the processed data to a host controller.


The sensing circuit 260 includes a multiplexer 264 (MUX). The multiplexer 264 is a switch configured to perform time multiplexing. As shown, the MUX 264 includes a plurality of independent input channels 266 for receiving signals from each of the sensing points 262 at the same time. The MUX 264 stores all of the incoming signals at the same time, but sequentially releases them one at a time through an output channel 268.


The sensing circuit 260 also includes an analog to digital converter 270 (ADC) operatively coupled to the MUX 264 through the output channel 268. The ADC 270 is configured to digitize the incoming analog signals sequentially, one at a time. That is, the ADC 270 converts each of the incoming analog signals into outgoing digital signals. The input to the ADC 270 generally corresponds to a voltage having a theoretically infinite number of values. The voltage varies according to the amount of capacitive coupling at each of the sensing points 262. The output of the ADC 270, on the other hand, has a defined number of states. The states generally have predictable exact voltages or currents.
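A minimal software model of the MUX and ADC stages might look like the following sketch. The channel values, the 8-bit resolution (256 output states), and the full-scale voltage are assumptions chosen for illustration; they are not specified by the embodiment.

```python
# Illustrative model of time multiplexing (MUX 264) followed by
# analog-to-digital conversion (ADC 270). Resolution and full-scale
# voltage are assumed values.

V_REF = 3.3     # assumed full-scale input voltage, volts
N_BITS = 8      # assumed resolution: 2**8 = 256 discrete output states

def mux(channels):
    """Latch every input channel at the same time, then release the
    stored samples sequentially through one output channel."""
    snapshot = list(channels)       # simultaneous capture
    for sample in snapshot:         # sequential release
        yield sample

def adc(voltage):
    """Map a continuous input voltage onto one of 2**N_BITS states."""
    code = round((voltage / V_REF) * (2**N_BITS - 1))
    return max(0, min(2**N_BITS - 1, code))   # clamp to the valid range

analog_nodes = [0.12, 1.57, 3.30, 0.80]       # voltages from sensing points
digital_codes = [adc(v) for v in mux(analog_nodes)]
print(digital_codes)
```

The resulting digital codes are what the DSP stage described next operates on.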


The sensing circuit 260 also includes a digital signal processor 272 (DSP) operatively coupled to the ADC 270 through another channel 274. The DSP 272 is a programmable computer processing unit that works to clarify or standardize the digital signals via high speed mathematical processing. The DSP 272 is capable of differentiating between human-made signals, which have order, and noise, which is inherently chaotic. In most cases, the DSP performs filtering and conversion algorithms using the raw data. By way of example, the DSP may filter noise events from the raw data, calculate the touch boundaries for each touch that occurs on the touch screen at the same time, and thereafter determine the coordinates for each touch event. The coordinates of the touch events may then be reported to a host controller where they can be compared to previous coordinates of the touch events to determine what action to perform in the host device.



FIG. 15 is a flow diagram 280, in accordance with one embodiment of the present invention. The method generally begins at block 282 where a plurality of sensing points are driven. For example, a voltage is applied to the electrodes in self capacitance touch screens or through the driving lines in mutual capacitance touch screens. In the latter, each driving line is driven separately. That is, the driving lines are driven one at a time, thereby building up charge on all the intersecting sensing lines. Following block 282, the process flow proceeds to block 284 where the outputs (voltages) from all the sensing points are read. This block may include multiplexing and digitizing the outputs. For example, in mutual capacitance touch screens, all the sensing points on one row are multiplexed and digitized, and this is repeated until all the rows have been sampled. Following block 284, the process flow proceeds to block 286 where an image or other form of data (signal or signals) of the touch screen plane at one moment in time can be produced and thereafter analyzed to determine where the objects are touching the touch screen. By way of example, the boundaries for each unique touch can be calculated, and thereafter the coordinates thereof can be found. Following block 286, the process flow proceeds to block 288 where the current image or signal is compared to a past image or signal in order to determine a change in pressure, location, direction, speed and acceleration for each object on the plane of the touch screen. This information can subsequently be used to perform an action, as for example moving a pointer or cursor or making a selection, as indicated in block 290.
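For a mutual capacitance panel, blocks 282 through 288 can be sketched as a simple scan loop. In the sketch below, drive_row and read_column are hypothetical stand-ins for hardware access that the document does not specify, and the panel dimensions are assumed values.

```python
# Sketch of the scan loop of FIG. 15 for a mutual-capacitance panel.
# `drive_row` and `read_column` are hypothetical hardware stubs; the
# panel dimensions are assumptions for illustration.

NUM_ROWS, NUM_COLS = 16, 12     # assumed driving/sensing line counts

def drive_row(row):
    """Apply the stimulus voltage to one driving line (stub)."""
    pass

def read_column(col):
    """Return the digitized output of one sensing line (stub)."""
    return 0

def capture_image():
    """Blocks 282-286: drive the rows one at a time, read every sensing
    point, and assemble one image of the panel at a single moment."""
    image = [[0] * NUM_COLS for _ in range(NUM_ROWS)]
    for row in range(NUM_ROWS):          # block 282: drive lines separately
        drive_row(row)
        for col in range(NUM_COLS):      # block 284: read the sensing points
            image[row][col] = read_column(col)
    return image

previous, current = capture_image(), capture_image()
# Block 288: comparing `current` against `previous` yields per-object
# changes in pressure, location, direction, speed and acceleration.
```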



FIG. 16 is a flow diagram of a digital signal processing method 300, in accordance with one embodiment of the present invention. By way of example, the method may generally correspond to block 286 shown and described in FIG. 15. The method 300 generally begins at block 302 where the raw data is received. The raw data is typically in a digitized form, and includes values for each node of the touch screen. The values may be between 0 and 256 where 0 equates to the highest capacitive coupling (no touch pressure) and 256 equates to the least capacitive coupling (full touch pressure). An example of raw data at one point in time is shown in FIG. 17A. As shown in FIG. 17A, the values for each point are provided in gray scale, where points with the least capacitive coupling are shown in white, points with the highest capacitive coupling are shown in black, and points between the least and the highest capacitive coupling are shown in gray.


Following block 302, the process flow proceeds to block 304 where the raw data is filtered. As should be appreciated, the raw data typically includes some noise. The filtering process is configured to reduce the noise. By way of example, a noise algorithm may be run that removes points that are not connected to other points. Single or unconnected points generally indicate noise, while multiple connected points generally indicate one or more touch regions, which are regions of the touch screen that are touched by objects. An example of filtered data is shown in FIG. 17B. As shown, the single scattered points have been removed, thereby leaving several concentrated areas.
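One way such a noise algorithm could be realized is sketched below: an active point is kept only if at least one of its four neighbors is also active. The activity threshold is an assumed parameter; the document does not specify how connectivity is tested.

```python
# Sketch of the noise filter of block 304: drop active points that have
# no active neighbors, keeping only connected touch regions.
# THRESHOLD is an assumed parameter, not a value from the document.

THRESHOLD = 32   # assumed minimum value for a point to count as touched

def filter_noise(image):
    rows, cols = len(image), len(image[0])
    active = [[image[r][c] >= THRESHOLD for c in range(cols)]
              for r in range(rows)]
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if not active[r][c]:
                continue                    # background stays zero
            neighbors = ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
            if any(0 <= nr < rows and 0 <= nc < cols and active[nr][nc]
                   for nr, nc in neighbors):
                out[r][c] = image[r][c]     # connected point: keep it
            # an isolated point is discarded as noise
    return out
```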


Following block 304, the process flow proceeds to block 306 where gradient data is generated. The gradient data indicates the topology of each group of connected points. The topology is typically based on the capacitive values for each point. Points with the lowest values are steep while points with the highest values are shallow. As should be appreciated, steep points indicate touch points that occurred with greater pressure while shallow points indicate touch points that occurred with lower pressure. An example of gradient data is shown in FIG. 17C.


Following block 306, the process flow proceeds to block 308 where the boundaries for touch regions are calculated based on the gradient data. In general, a determination is made as to which points are grouped together to form each touch region. An example of the touch regions is shown in FIG. 17D.


In one embodiment, the boundaries are determined using a watershed algorithm. Generally speaking, the algorithm performs image segmentation, which is the partitioning of an image into distinct regions, as for example the touch regions of multiple objects in contact with the touch screen. The concept of the watershed initially comes from the area of geography, and more particularly topography, where a drop of water falling on a relief follows a descending path and eventually reaches a minimum, and where the watersheds are the divide lines of the domains of attraction of drops of water. Herein, the watershed lines represent the location of pixels that best separate the different objects touching the touch screen. Watershed algorithms can be widely varied. In one particular implementation, the watershed algorithm includes forming paths from low points to a peak (based on the magnitude of each point), classifying the peak as an ID label for a particular touch region, and associating each point (pixel) on the path with the peak. These steps are performed over the entire image map, thus carving out the touch regions associated with each object in contact with the touch screen.
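The path-to-peak variant described above can be sketched in a few lines of Python. The sketch is a simplified toy: it labels each point with the peak reached by steepest ascent, and it ignores plateaus and region merging, which a production watershed implementation would have to handle. The threshold is an assumed parameter.

```python
# Toy sketch of the path-to-peak watershed labeling described above:
# from every touched point, walk uphill to a local maximum and use that
# peak as the ID label of the point's touch region.

def segment(image, threshold=32):
    rows, cols = len(image), len(image[0])

    def uphill(r, c):
        """Follow the steepest ascending path and return its peak."""
        while True:
            best = (r, c)
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1),
                           (r - 1, c - 1), (r - 1, c + 1),
                           (r + 1, c - 1), (r + 1, c + 1)):
                if (0 <= nr < rows and 0 <= nc < cols
                        and image[nr][nc] > image[best[0]][best[1]]):
                    best = (nr, nc)
            if best == (r, c):       # no higher neighbor: this is the peak
                return best
            r, c = best

    labels = {}
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold:
                labels[(r, c)] = uphill(r, c)   # peak acts as region ID
    return labels   # maps each touched point to its region's peak

demo = [[0,  0,  0,   0],
        [0, 90, 40,   0],
        [0, 50,  0, 120],
        [0,  0,  0, 130]]
print(segment(demo))    # two peaks -> two touch regions
```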


Following block 308, the process flow proceeds to block 310 where the coordinates for each of the touch regions are calculated. This may be accomplished by performing a centroid calculation with the raw data associated with each touch region. For example, once the touch regions are determined, the raw data associated therewith may be used to calculate the centroid of the touch region. The centroid may indicate the central coordinate of the touch region. By way of example, the X and Y centroids may be found using the following equations:

Xc = ΣZ*x/ΣZ; and

Yc = ΣZ*y/ΣZ,

where Xc represents the x centroid of the touch region, Yc represents the y centroid of the touch region, x represents the x coordinate of each pixel or point in the touch region, y represents the y coordinate of each pixel or point in the touch region, and Z represents the magnitude (capacitance value) at each pixel or point.


An example of a centroid calculation for the touch regions is shown in FIG. 17E. As shown, each touch region represents a distinct x and y coordinate. These coordinates may be used to perform multipoint tracking as indicated in block 312. For example, the coordinates for each of the touch regions may be compared with previous coordinates of the touch regions to determine positioning changes of the objects touching the touch screen or whether or not touching objects have been added or subtracted or whether a particular object is being tapped.
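As a concrete illustration of the centroid equations above, the following Python sketch computes Xc and Yc for one touch region. The region's point coordinates and magnitudes are invented example values.

```python
# Centroid of one touch region (block 310):
#   Xc = sum(Z*x) / sum(Z),  Yc = sum(Z*y) / sum(Z),
# where Z is the capacitance magnitude at each point of the region.

def centroid(region):
    """`region` is a list of (x, y, z) tuples: the coordinates and the
    magnitude of every pixel or point belonging to one touch region."""
    total_z = sum(z for _, _, z in region)
    xc = sum(z * x for x, _, z in region) / total_z
    yc = sum(z * y for _, y, z in region) / total_z
    return xc, yc

# Example: a small 2x2 touch region with unequal touch pressures.
print(centroid([(4, 7, 120), (5, 7, 200), (4, 8, 90), (5, 8, 60)]))
```

Tracking then reduces to matching each frame's centroids against the previous frame's, which reveals movement, taps, and added or removed objects.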



FIGS. 18 and 19 are side elevation views of an electronic device 350, in accordance with multiple embodiments of the present invention. The electronic device 350 includes an LCD display 352 and a transparent touch screen 354 positioned over the LCD display 352. The touch screen 354 includes a protective sheet 356, one or more sensing layers 358, and a bottom glass member 360. In this embodiment, the bottom glass member 360 is the front glass of the LCD display 352. Further, the sensing layers 358 may be configured for either self or mutual capacitance as described above. The sensing layers 358 generally include a plurality of interconnects at the edge of the touch screen for coupling the sensing layer 358 to a sensing circuit (not shown). By way of example, the sensing layer 358 may be electrically coupled to the sensing circuit through one or more flex circuits 362, which are attached to the sides of the touch screen 354.


As shown, the LCD display 352 and touch screen 354 are disposed within a housing 364. The housing 364 serves to cover and support these components in their assembled position within the electronic device 350. The housing 364 provides a space for placing the LCD display 352 and touch screen 354 as well as an opening 366 so that the display screen can be seen through the housing 364. In one embodiment, as shown in FIG. 18, the housing 364 includes a facade 370 for covering the sides of the LCD display 352 and touch screen 354. Although not shown in great detail, the facade 370 is positioned around the entire perimeter of the LCD display 352 and touch screen 354. The facade 370 serves to hide the interconnects, leaving only the active area of the LCD display 352 and touch screen 354 in view.


In another embodiment, as shown in FIG. 19, the housing 364 does not include a facade 370, but rather a mask 372 that is printed on an interior portion of the top glass 356 of the touch screen 354 that extends between the sides of the housing 364. This particular arrangement makes the mask 372 look submerged in the top glass 356. The mask 372 serves the same function as the facade 370 but is a more elegant solution. In one implementation, the mask 372 is formed from a high temperature black polymer. In the illustrated embodiment of FIG. 19, the touch screen 354 is based on mutual capacitance sensing and thus the sensing layer 358 includes driving lines 376 and sensing lines 378. The driving lines 376 are disposed on the top glass 356 and the mask 372, and the sensing lines 378 are disposed on the bottom glass 360. The driving and sensing lines 376 and 378 are insulated from one another via a spacer 380. The spacer 380 may for example be a clear piece of plastic with optical matching materials retained therein or applied thereto.


In one embodiment and referring to both FIGS. 18 and 19, the electronic device 350 corresponds to a tablet computer. In this embodiment, the housing 364 also encloses various integrated circuit chips and other circuitry 382 that provide computing operations for the tablet computer. By way of example, the integrated circuit chips and other circuitry may include a microprocessor, motherboard, Read-Only Memory (ROM), Random-Access Memory (RAM), a hard drive, a disk drive, a battery, and various input/output support devices.


While this invention has been described in terms of several preferred embodiments, there are alterations, permutations, and equivalents, which fall within the scope of this invention. For example, although the touch screen was primarily directed at capacitive sensing, it should be noted that some or all of the features described herein may be applied to other sensing methodologies. It should also be noted that there are many alternative ways of implementing the methods and apparatuses of the present invention. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations, and equivalents as fall within the true spirit and scope of the present invention.

Claims
  • 1. A touch panel having a transparent capacitive sensing medium configured to detect multiple touches or near touches that occur at a same time and at distinct locations in a plane of the touch panel and to produce distinct signals representative of a location of the touches on the plane of the touch panel for each of the multiple touches; wherein the transparent capacitive sensing medium comprises a transparent electrode layer, the electrode layer including a plurality of electrically isolated electrodes and electrode traces formed from a transparent conductive material, each of the electrodes being placed at different locations in the plane of the touch panel, each of the electrodes having an individual trace for operatively coupling to self-capacitance monitoring circuitry.
  • 2. The touch panel as recited in claim 1 wherein the transparent sensing medium includes a pixilated array of transparent capacitance sensing nodes, wherein each of the different locations represents one of multiple locations along a first direction and one of multiple locations along a second direction corresponding to coordinates in the plane of the touch panel.
  • 3. The touch panel as recited in claim 1 further including one or more integrated circuits for monitoring the capacitance at each of the electrodes, the integrated circuits being operatively coupled to the electrodes via the traces.
  • 4. The touch panel as recited in claim 3, the one or more integrated circuits comprising: a multiplexer (MUX) that receives incoming analog signals from the electrodes; an analog to digital converter coupled to the MUX, the analog to digital converter being configured to convert the incoming analog signals into outgoing digital signals; and a digital signal processor (DSP) coupled to the analog to digital converter, the DSP filtering noise events from the digital signals, calculating touch boundaries for each touch that occurs on the touch panel at the same time and thereafter determining the coordinates for each touch.
  • 5. The touch panel as recited in claim 1 wherein the electrodes are placed in rows and columns.
  • 6. The touch panel as recited in claim 1 wherein the electrodes and traces are formed from indium tin oxide (ITO).
  • 7. A display arrangement comprising: a display having a screen for displaying a graphical user interface; and a transparent touch panel allowing the screen to be viewed therethrough and capable of recognizing multiple touch events that occur at different locations on the touch panel at a same time and to output touch event information to a host device to form a pixilated image; wherein the touch panel includes a multipoint sensing arrangement configured to simultaneously detect and monitor the touch events and a change in capacitive coupling associated with those touch events at distinct locations across the touch panel; and wherein the multipoint sensing arrangement provides a plurality of transparent self-capacitive sensing nodes that work independent of one another and that represent different locations on the touch panel.
  • 8. The display arrangement as recited in claim 7, wherein the capacitive sensing nodes are formed with a transparent conductive medium.
  • 9. The display arrangement as recited in claim 8, wherein the transparent conductive medium corresponds to indium tin oxide (ITO).
  • 10. The display arrangement as recited in claim 8, wherein each of the different locations represents one of multiple locations along a first direction and one of multiple locations along a second direction corresponding to coordinates in the plane of the transparent touch panel.
  • 11. The display arrangement as recited in claim 10, wherein the transparent conductive medium is patterned into electrically isolated electrodes and traces, each electrode representing a different coordinate in the plane of the touch panel, and the traces connecting the electrodes to a capacitive sensing circuit.
  • 12. The display arrangement as recited in claim 8, wherein the capacitive sensing nodes are coupled to a capacitive sensing circuit, and wherein the capacitive sensing circuit monitors decreases in capacitance that occur at each of the capacitive sensing nodes and the positions where the decreases occur.
  • 13. The display arrangement as recited in claim 7, wherein the capacitive sensing circuit comprises: a multiplexer (MUX) that receives incoming analog signals from the electrodes; an analog to digital converter coupled to the MUX, the analog to digital converter being configured to convert the incoming analog signals into outgoing digital signals; and a digital signal processor (DSP) coupled to the analog to digital converter, the DSP filtering noise events from the digital signals, calculating touch boundaries for each touch that occurs on the touch panel at the same time and thereafter determining the coordinates for each touch.
  • 14. A display arrangement comprising: a display having a screen for displaying a graphical user interface; and a transparent touch panel allowing the screen to be viewed therethrough and capable of recognizing multiple touch events that occur at different locations on the touch panel at a same time and to output touch event information to a host device; and wherein the touch panel comprises: a transparent conductive layer including an array of electrically isolated electrodes forming a plurality of transparent self-capacitive sensing nodes configured to simultaneously detect and monitor the touch events and a change in self-capacitive coupling associated with those touch events at distinct locations across the touch panel; and one or more sensor circuits operatively coupled to the electrodes for sensing the self-capacitance of the electrodes, wherein each of the distinct locations represents one of multiple locations along a first direction and one of multiple locations along a second direction corresponding to coordinates in the plane of the touch panel.
  • 15. The touch panel as recited in claim 14, wherein the one or more sensor circuits include an inverting amplifier.
  • 16. The touch panel as recited in claim 15, the transparent self-capacitive sensing nodes are formed on a single side of a substrate.
  • 17. The touch panel as recited in claim 15, the transparent conductive layer is disposed on a transparent substrate.
  • 18. The touch panel as recited in claim 17, wherein the transparent substrate is formed from glass.
  • 19. A touch panel having a transparent capacitive sensing medium configured to detect multiple touches or near touches that occur at a same time and at distinct locations in a plane of the touch panel and to produce distinct signals representative of a location of the touches on the plane of the touch panel for each of the multiple touches; wherein the transparent capacitive sensing medium comprises a transparent electrode layer, the electrode layer including a plurality of electrically isolated electrodes and electrode traces formed from a transparent conductive material, each of the electrodes being placed at different locations in the plane of the touch panel, each of the electrodes operatively coupled to self-capacitance monitoring circuitry, wherein each of the different locations represents one of multiple locations along a first direction and one of multiple locations along a second direction corresponding to coordinates in the plane of the touch panel.
  • 20. The touch panel as recited in claim 19, wherein the capacitive monitoring circuitry comprises a virtual ground charge amplifier.
  • 21. The touch panel as recited in claim 19, the transparent capacitive sensing medium formed on a single side of a substrate.
  • 22. The touch panel as recited in claim 19, the transparent capacitive sensing medium formed on a transparent substrate.
  • 23. The touch panel as recited in claim 22, wherein the transparent substrate is formed from glass.
  • 24. The touch panel as recited in claim 19, each of the plurality of electrically isolated electrodes has a single trace connected thereto.
  • 25. The touch panel as recited in claim 19, the transparent capacitive sensing medium formed on both sides of a single substrate.
  • 26. A display arrangement comprising: a display having a display screen; and a transparent touch panel allowing the screen to be viewed therethrough and capable of recognizing multiple touch events that occur at different locations on the touch panel at a same time and to output touch event information to a host device; wherein the touch panel includes a multipoint sensing arrangement configured to simultaneously detect and monitor the touch events and a change in self-capacitive coupling associated with those touch events at distinct locations across the touch panel; and further comprising a virtual ground charge amplifier coupled to the touch panel for detecting the touch events on the touch panel, wherein each of the distinct locations represents one of multiple locations along a first direction and one of multiple locations along a second direction corresponding to coordinates in the plane of the touch panel.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 13/717,573, filed Dec. 17, 2012, which is a divisional of U.S. patent application Ser. No. 13/345,347, filed Jan. 6, 2012, and issued on Apr. 9, 2013 as U.S. Pat. No. 8,416,209, which is a continuation of U.S. patent application Ser. No. 12/267,532, filed Nov. 7, 2008, abandoned, which is a divisional of U.S. patent application Ser. No. 10/840,862, filed May 6, 2004, and issued on Feb. 16, 2010 as U.S. Pat. No. 7,663,607, the entire disclosures of which are incorporated herein by reference.

US Referenced Citations (559)
Number Name Date Kind
2751555 Kirkpatrick Jun 1956 A
3333160 Gorski Jul 1967 A
3541541 Englebart Nov 1970 A
3644835 Thompson Feb 1972 A
3662105 Hurst et al. May 1972 A
3798370 Hurst Mar 1974 A
3974332 Abe et al. Aug 1976 A
4194083 Abe et al. Mar 1980 A
4233522 Grummer et al. Nov 1980 A
4246452 Chandler Jan 1981 A
4250495 Beckerman et al. Feb 1981 A
4266144 Bristol May 1981 A
4268815 Eventoff et al. May 1981 A
4277517 Smith, Jr. Jul 1981 A
4290052 Eichelberger et al. Sep 1981 A
4307383 Brienza Dec 1981 A
4313108 Yoshida Jan 1982 A
4345000 Kawazoe et al. Aug 1982 A
4363027 Brienza Dec 1982 A
4394643 Williams Jul 1983 A
4526043 Boie Jul 1985 A
4550221 Mabusth Oct 1985 A
4587378 Moore May 1986 A
4618989 Tsukune et al. Oct 1986 A
4623757 Marino Nov 1986 A
4639720 Rympalski et al. Jan 1987 A
4672364 Lucas Jun 1987 A
4672558 Beckes et al. Jun 1987 A
4686332 Greanias et al. Aug 1987 A
4692809 Beining et al. Sep 1987 A
4695827 Beining et al. Sep 1987 A
4707845 Krein et al. Nov 1987 A
4723056 Tamaru et al. Feb 1988 A
4733222 Evans Mar 1988 A
4734685 Watanabe Mar 1988 A
4740781 Brown Apr 1988 A
4746770 McAvinney May 1988 A
4771276 Parks Sep 1988 A
4772885 Uehara et al. Sep 1988 A
4788384 Bruere-Dawson et al. Nov 1988 A
4806709 Evans Feb 1989 A
4806846 Kerber Feb 1989 A
4853493 Schlosser et al. Aug 1989 A
4898555 Sampson Feb 1990 A
4910504 Eriksson Mar 1990 A
4914624 Dunthorn et al. Apr 1990 A
4916308 Meadows Apr 1990 A
4954823 Binstead Sep 1990 A
4968877 McAvinney et al. Nov 1990 A
5003519 Noirjean Mar 1991 A
5017030 Crews May 1991 A
5062198 Sun Nov 1991 A
5073950 Colbert et al. Dec 1991 A
5105186 May Apr 1992 A
5105288 Senda et al. Apr 1992 A
5113041 Blonder et al. May 1992 A
5117071 Greanias et al. May 1992 A
5178477 Gambaro Jan 1993 A
5189403 Franz et al. Feb 1993 A
5194862 Edwards Mar 1993 A
5224861 Glass et al. Jul 1993 A
5239152 Caldwell et al. Aug 1993 A
5241308 Young Aug 1993 A
5252951 Tannenbaum et al. Oct 1993 A
5281966 Walsh Jan 1994 A
5293430 Shiau et al. Mar 1994 A
5305017 Gerpheide Apr 1994 A
5345543 Capps et al. Sep 1994 A
5353135 Edwards Oct 1994 A
5374787 Miller Dec 1994 A
5376948 Roberts Dec 1994 A
5381160 Landmeier Jan 1995 A
5386219 Greanias et al. Jan 1995 A
5392058 Tagawa Feb 1995 A
5398310 Tchao et al. Mar 1995 A
5432671 Allavena Jul 1995 A
5442742 Greyson et al. Aug 1995 A
5457289 Huang et al. Oct 1995 A
5459463 Gruaz et al. Oct 1995 A
5463388 Boie et al. Oct 1995 A
5463696 Beernink et al. Oct 1995 A
5483261 Yasutake Jan 1996 A
5488204 Mead et al. Jan 1996 A
5495077 Miller et al. Feb 1996 A
5499026 Liao et al. Mar 1996 A
5513309 Meier et al. Apr 1996 A
5523775 Capps Jun 1996 A
5530455 Gillick et al. Jun 1996 A
5534892 Tagawa Jul 1996 A
5543588 Bisset et al. Aug 1996 A
5543589 Buchana et al. Aug 1996 A
5543590 Gillespie et al. Aug 1996 A
5543591 Gillespie et al. Aug 1996 A
5550659 Fujieda et al. Aug 1996 A
5552787 Schuler et al. Sep 1996 A
5563632 Roberts Oct 1996 A
5563727 Larson et al. Oct 1996 A
5563996 Tchao Oct 1996 A
5565658 Gerpheide et al. Oct 1996 A
5572205 Caldwell et al. Nov 1996 A
5576070 Yaniv Nov 1996 A
5579036 Yates, IV Nov 1996 A
5581681 Tchao et al. Dec 1996 A
5583946 Gourdol Dec 1996 A
5589856 Stein et al. Dec 1996 A
5590219 Gourdol Dec 1996 A
5592566 Pagallo et al. Jan 1997 A
5594806 Colbert Jan 1997 A
5594810 Gourdol Jan 1997 A
5596694 Capps Jan 1997 A
5612719 Beernink et al. Mar 1997 A
5623280 Akins et al. Apr 1997 A
5631805 Bonsall May 1997 A
5633955 Bozinovic et al. May 1997 A
5634102 Capps May 1997 A
5636101 Bonsall et al. Jun 1997 A
5638093 Takahashi et al. Jun 1997 A
5642108 Gopher et al. Jun 1997 A
5644657 Capps et al. Jul 1997 A
5648642 Miller et al. Jul 1997 A
5650597 Redmayne Jul 1997 A
5666113 Logan Sep 1997 A
5666502 Capps Sep 1997 A
5666552 Greyson et al. Sep 1997 A
5675361 Santilli Oct 1997 A
5677710 Thompson-Rohrlich Oct 1997 A
5677744 Yoneda et al. Oct 1997 A
5686973 Lee Nov 1997 A
5689253 Hargreaves et al. Nov 1997 A
5710844 Capps et al. Jan 1998 A
5729250 Bishop et al. Mar 1998 A
5730165 Philipp Mar 1998 A
5734742 Asaeda et al. Mar 1998 A
5734751 Saito Mar 1998 A
5736976 Cheung Apr 1998 A
5741990 Davies Apr 1998 A
5745116 Pisutha-Arnond Apr 1998 A
5745716 Tchao et al. Apr 1998 A
5748269 Harris et al. May 1998 A
5764218 Della Bona et al. Jun 1998 A
5764818 Capps et al. Jun 1998 A
5767457 Gerpheide et al. Jun 1998 A
5767842 Korth Jun 1998 A
5777596 Herbert Jul 1998 A
5790104 Shieh Aug 1998 A
5790106 Hirano et al. Aug 1998 A
5790107 Kasser et al. Aug 1998 A
5802516 Shwarts et al. Sep 1998 A
5805144 Scholder et al. Sep 1998 A
5808567 McCloud Sep 1998 A
5809166 Huang et al. Sep 1998 A
5809267 Moran et al. Sep 1998 A
5815141 Phares Sep 1998 A
5821690 Martens et al. Oct 1998 A
5821930 Hansen Oct 1998 A
5823782 Marcus et al. Oct 1998 A
5825351 Tam Oct 1998 A
5825352 Bisset et al. Oct 1998 A
5835079 Shieh Nov 1998 A
5838308 Knapp et al. Nov 1998 A
5841078 Miller et al. Nov 1998 A
5841415 Kwon et al. Nov 1998 A
5844506 Binstead Dec 1998 A
5847690 Boie et al. Dec 1998 A
5852487 Fujimori et al. Dec 1998 A
5854450 Kent Dec 1998 A
5854625 Frisch et al. Dec 1998 A
5856822 Du et al. Jan 1999 A
5861583 Schediwy et al. Jan 1999 A
5861875 Gerpheide Jan 1999 A
5867151 Nakai Feb 1999 A
5869790 Shigetaka et al. Feb 1999 A
5869791 Young Feb 1999 A
5880411 Gillespie et al. Mar 1999 A
5898434 Small et al. Apr 1999 A
5914465 Allen et al. Jun 1999 A
5917165 Platt et al. Jun 1999 A
5920298 McKnight Jul 1999 A
5920309 Bisset et al. Jul 1999 A
5923319 Bishop et al. Jul 1999 A
5929834 Inoue et al. Jul 1999 A
5933134 Shieh Aug 1999 A
5940055 Lee Aug 1999 A
5940064 Kai et al. Aug 1999 A
5942733 Allen et al. Aug 1999 A
5943043 Furuhata et al. Aug 1999 A
5943044 Martinelli et al. Aug 1999 A
5945980 Moissev et al. Aug 1999 A
5952998 Clancy et al. Sep 1999 A
5955198 Hashimoto et al. Sep 1999 A
5982352 Pryor Nov 1999 A
5986723 Nakamura et al. Nov 1999 A
6002389 Kasser Dec 1999 A
6002808 Freeman Dec 1999 A
6008800 Pryor Dec 1999 A
6020881 Naughton et al. Feb 2000 A
6020945 Sawai et al. Feb 2000 A
6023265 Lee Feb 2000 A
6028581 Umeya Feb 2000 A
6029214 Dorfman et al. Feb 2000 A
6031524 Kunert Feb 2000 A
6037882 Levy Mar 2000 A
6050825 Nichol et al. Apr 2000 A
6052339 Frenkel et al. Apr 2000 A
6057903 Colgan et al. May 2000 A
6061177 Fujimoto May 2000 A
6072494 Nguyen Jun 2000 A
6081259 Teterwak Jun 2000 A
6084576 Leu et al. Jul 2000 A
6107654 Yamazaki Aug 2000 A
6107997 Ure Aug 2000 A
6124848 Ballare et al. Sep 2000 A
6128003 Smith et al. Oct 2000 A
6131299 Raab et al. Oct 2000 A
6135958 Mikula-Curtis et al. Oct 2000 A
6137427 Binstead Oct 2000 A
6144380 Shwarts et al. Nov 2000 A
6163313 Aroyan et al. Dec 2000 A
6172667 Sayag Jan 2001 B1
6177918 Colgan et al. Jan 2001 B1
6188391 Seely et al. Feb 2001 B1
6191828 Kim et al. Feb 2001 B1
6198515 Cole Mar 2001 B1
6204897 Colgan et al. Mar 2001 B1
6208329 Ballare Mar 2001 B1
6211585 Sato et al. Apr 2001 B1
6222465 Kumar et al. Apr 2001 B1
6239389 Allen et al. May 2001 B1
6239788 Nohno et al. May 2001 B1
6239790 Martinelli et al. May 2001 B1
6243071 Shwarts et al. Jun 2001 B1
6246862 Grivas et al. Jun 2001 B1
6249606 Kiraly et al. Jun 2001 B1
6259490 Colgan et al. Jul 2001 B1
6271835 Hoeksma Aug 2001 B1
6285428 Kim et al. Sep 2001 B1
6288707 Philipp Sep 2001 B1
6289326 LaFleur Sep 2001 B1
6292178 Bernstein et al. Sep 2001 B1
6297811 Kent Oct 2001 B1
6310610 Beaton et al. Oct 2001 B1
6323846 Westerman et al. Nov 2001 B1
6323849 He et al. Nov 2001 B1
6337678 Fish Jan 2002 B1
6342938 Song et al. Jan 2002 B1
6347290 Bartlett Feb 2002 B1
6377009 Philipp Apr 2002 B1
6380931 Gillespie et al. Apr 2002 B1
6411287 Scharff et al. Jun 2002 B1
6414671 Gillespie et al. Jul 2002 B1
6417846 Lee Jul 2002 B1
6421039 Moon et al. Jul 2002 B1
6421234 Ricks et al. Jul 2002 B1
6425289 Igel et al. Jul 2002 B1
6452514 Philipp Sep 2002 B1
6457355 Philipp Oct 2002 B1
6459424 Resman Oct 2002 B1
6466036 Philipp Oct 2002 B1
6483498 Colgan et al. Nov 2002 B1
6501528 Hamada Dec 2002 B1
6501529 Kurihara et al. Dec 2002 B1
6504530 Wilson et al. Jan 2003 B1
6504713 Pandolfi et al. Jan 2003 B1
6515669 Mohri Feb 2003 B1
6522772 Morrison et al. Feb 2003 B1
6525547 Hayes Feb 2003 B2
6525749 Moran et al. Feb 2003 B1
6535200 Philipp Mar 2003 B2
6543684 White et al. Apr 2003 B1
6543947 Lee Apr 2003 B2
6549193 Huang et al. Apr 2003 B1
6570557 Westerman et al. May 2003 B1
6593916 Aroyan Jul 2003 B1
6602790 Kian et al. Aug 2003 B2
6610936 Gillespie et al. Aug 2003 B2
6624833 Kumar et al. Sep 2003 B1
6624835 Willig Sep 2003 B2
6628268 Harada et al. Sep 2003 B1
6639577 Eberhard Oct 2003 B2
D482368 DenToonder et al. Nov 2003 S
6650319 Hurst et al. Nov 2003 B1
6658994 McMillan Dec 2003 B1
6670894 Mehring Dec 2003 B2
6677932 Westerman Jan 2004 B1
6677934 Blanchard Jan 2004 B1
6680448 Kawashima et al. Jan 2004 B2
6690387 Zimmerman et al. Feb 2004 B2
6721375 Hammel Apr 2004 B1
6723929 Kent Apr 2004 B2
6724366 Crawford Apr 2004 B2
6757002 Oross et al. Jun 2004 B1
6762752 Perski et al. Jul 2004 B2
6784948 Kawashima et al. Aug 2004 B2
6785578 Johnson et al. Aug 2004 B2
6803906 Morrison et al. Oct 2004 B1
6825833 Mulligan et al. Nov 2004 B2
6842672 Straub et al. Jan 2005 B1
6846579 Anderson et al. Jan 2005 B2
6856259 Sharp Feb 2005 B1
6876355 Ahn et al. Apr 2005 B1
6888536 Westerman et al. May 2005 B2
6900795 Knight, III et al. May 2005 B1
6906692 Ishiyama Jun 2005 B2
6924789 Bick Aug 2005 B2
6927761 Badaye et al. Aug 2005 B2
6927763 LaMonica Aug 2005 B2
6942571 McAllister et al. Sep 2005 B1
6943779 Satoh Sep 2005 B2
6961049 Mulligan et al. Nov 2005 B2
6965375 Gettemy et al. Nov 2005 B1
6970160 Mulligan et al. Nov 2005 B2
6972401 Akitt et al. Dec 2005 B2
6977666 Hedrick Dec 2005 B1
6982432 Umemoto et al. Jan 2006 B2
6985801 Straub et al. Jan 2006 B1
6992659 Gettemy Jan 2006 B2
6995752 Lu Feb 2006 B2
7015894 Morohoshi Mar 2006 B2
7023427 Kraus et al. Apr 2006 B2
7030860 Hsu et al. Apr 2006 B1
7031228 Born et al. Apr 2006 B2
7038659 Rajkowski May 2006 B2
7042444 Cok May 2006 B2
7046235 Katoh May 2006 B2
7088342 Rekimoto Aug 2006 B2
7088343 Smith Aug 2006 B2
7098127 Ito Aug 2006 B2
7098897 Vakil et al. Aug 2006 B2
7109978 Gillespie et al. Sep 2006 B2
7129935 Mackey Oct 2006 B2
7133032 Cok Nov 2006 B2
7138984 Miles Nov 2006 B1
7151528 Taylor et al. Dec 2006 B2
7154481 Cross et al. Dec 2006 B2
7177001 Lee Feb 2007 B2
7184064 Zimmerman et al. Feb 2007 B2
7190416 Paukshto et al. Mar 2007 B2
7202856 Cok Apr 2007 B2
7230608 Cok Jun 2007 B2
7254775 Geaghan et al. Aug 2007 B2
7268770 Takahata et al. Sep 2007 B1
7274353 Chiu et al. Sep 2007 B2
7280167 Choi et al. Oct 2007 B2
7292229 Morag et al. Nov 2007 B2
7307231 Matsumoto et al. Dec 2007 B2
RE40153 Westerman et al. Mar 2008 E
7339579 Richter et al. Mar 2008 B2
7355592 Hong et al. Apr 2008 B2
7362313 Geaghan et al. Apr 2008 B2
7372455 Perski et al. May 2008 B2
7379054 Lee May 2008 B2
7453444 Geaghan Nov 2008 B2
7463246 Mackey Dec 2008 B2
7483016 Gettemy et al. Jan 2009 B1
7554624 Kusuda et al. Jun 2009 B2
7633484 Ito Dec 2009 B2
7663607 Hotelling et al. Feb 2010 B2
7683888 Kennedy Mar 2010 B1
7688315 Gettemy et al. Mar 2010 B1
7705834 Swedin Apr 2010 B2
7730401 Gillespie et al. Jun 2010 B2
7746326 Sato Jun 2010 B2
7755683 Sergio et al. Jul 2010 B2
7800589 Hurst et al. Sep 2010 B2
7812828 Westerman et al. Oct 2010 B2
7843439 Perski et al. Nov 2010 B2
7920129 Hotelling et al. Apr 2011 B2
8031180 Miyamoto et al. Oct 2011 B2
8125463 Hotelling et al. Feb 2012 B2
8130209 Chang Mar 2012 B2
8243027 Hotelling et al. Aug 2012 B2
8259078 Hotelling et al. Sep 2012 B2
8416209 Hotelling et al. Apr 2013 B2
8432371 Hotelling et al. Apr 2013 B2
8479122 Hotelling et al. Jul 2013 B2
8493330 Krah Jul 2013 B2
8552989 Hotelling et al. Oct 2013 B2
8605051 Hotelling et al. Dec 2013 B2
8654083 Hotelling et al. Feb 2014 B2
8743300 Chang et al. Jun 2014 B2
8804056 Chang et al. Aug 2014 B2
8872785 Hotelling et al. Oct 2014 B2
8928618 Hotelling et al. Jan 2015 B2
20010000961 Hikida et al. May 2001 A1
20010020578 Baier Sep 2001 A1
20010020986 Ikeda et al. Sep 2001 A1
20010020987 Ahn et al. Sep 2001 A1
20020015024 Westerman et al. Feb 2002 A1
20020049070 Bick Apr 2002 A1
20020084992 Agnew Jul 2002 A1
20020089496 Numao Jul 2002 A1
20020101410 Sakata et al. Aug 2002 A1
20020118848 Karpenstein Aug 2002 A1
20020140649 Aoyama et al. Oct 2002 A1
20020159015 Seo et al. Oct 2002 A1
20020167489 Davis Nov 2002 A1
20020185981 Dietz et al. Dec 2002 A1
20020185999 Tajima et al. Dec 2002 A1
20020186210 Itoh Dec 2002 A1
20020190964 Van Berkel Dec 2002 A1
20020191029 Gillespie et al. Dec 2002 A1
20020192445 Ezzell et al. Dec 2002 A1
20020196237 Fernando et al. Dec 2002 A1
20030006974 Clough et al. Jan 2003 A1
20030035479 Kan et al. Feb 2003 A1
20030067451 Tagg et al. Apr 2003 A1
20030069653 Johnson et al. Apr 2003 A1
20030076301 Tsuk et al. Apr 2003 A1
20030076303 Huppi Apr 2003 A1
20030076306 Zadesky et al. Apr 2003 A1
20030085882 Lu May 2003 A1
20030095095 Pihlaja May 2003 A1
20030095096 Robbin et al. May 2003 A1
20030098858 Perski et al. May 2003 A1
20030151600 Takeuchi et al. Aug 2003 A1
20030174128 Matsufusa Sep 2003 A1
20030179323 Abileah et al. Sep 2003 A1
20030201984 Falvo Oct 2003 A1
20030206162 Roberts Nov 2003 A1
20030206202 Moriya Nov 2003 A1
20030222857 Abileah Dec 2003 A1
20030234768 Rekimoto et al. Dec 2003 A1
20030234769 Cross et al. Dec 2003 A1
20030234770 MacKey Dec 2003 A1
20040022010 Shigetaka Feb 2004 A1
20040056839 Yoshihara Mar 2004 A1
20040080501 Koyama Apr 2004 A1
20040090429 Geaghan et al. May 2004 A1
20040095335 Oh et al. May 2004 A1
20040109097 Mai Jun 2004 A1
20040119701 Mulligan et al. Jun 2004 A1
20040141096 Mai Jul 2004 A1
20040150629 Lee Aug 2004 A1
20040155871 Perski et al. Aug 2004 A1
20040155991 Lowles et al. Aug 2004 A1
20040188150 Richard et al. Sep 2004 A1
20040189587 Jung et al. Sep 2004 A1
20040189612 Bottari et al. Sep 2004 A1
20040217945 Miyamoto et al. Nov 2004 A1
20040227736 Kamrath et al. Nov 2004 A1
20040239650 Mackey Dec 2004 A1
20040243747 Rekimoto Dec 2004 A1
20040263484 Mantysalo et al. Dec 2004 A1
20050007349 Vakil et al. Jan 2005 A1
20050012723 Pallakoff Jan 2005 A1
20050017737 Yakabe et al. Jan 2005 A1
20050046621 Kakikuranta Mar 2005 A1
20050052425 Zadesky et al. Mar 2005 A1
20050052427 Wu et al. Mar 2005 A1
20050052582 Mai Mar 2005 A1
20050062620 Schaefer Mar 2005 A1
20050073507 Richter et al. Apr 2005 A1
20050083307 Aufderheide et al. Apr 2005 A1
20050099402 Nakanishi et al. May 2005 A1
20050104867 Westerman et al. May 2005 A1
20050110768 Marriott et al. May 2005 A1
20050146511 Hill et al. Jul 2005 A1
20050162402 Watanachote Jul 2005 A1
20050170668 Park et al. Aug 2005 A1
20050231487 Ming Oct 2005 A1
20050237439 Mai Oct 2005 A1
20050243023 Reddy et al. Nov 2005 A1
20060007087 Choi et al. Jan 2006 A1
20060007165 Yang et al. Jan 2006 A1
20060012575 Knapp et al. Jan 2006 A1
20060017710 Lee et al. Jan 2006 A1
20060022955 Kennedy Feb 2006 A1
20060022956 Lengeling et al. Feb 2006 A1
20060026521 Hotelling et al. Feb 2006 A1
20060026535 Hotelling et al. Feb 2006 A1
20060026536 Hotelling et al. Feb 2006 A1
20060032680 Elias et al. Feb 2006 A1
20060033724 Chaudhri et al. Feb 2006 A1
20060044259 Hotelling et al. Mar 2006 A1
20060053387 Ording Mar 2006 A1
20060066582 Lyon et al. Mar 2006 A1
20060085757 Andre et al. Apr 2006 A1
20060097991 Hotelling et al. May 2006 A1
20060109222 Lee et al. May 2006 A1
20060132462 Geaghan Jun 2006 A1
20060145365 Halls et al. Jul 2006 A1
20060145983 Lee et al. Jul 2006 A1
20060146033 Chen et al. Jul 2006 A1
20060146034 Chen et al. Jul 2006 A1
20060197753 Hotelling Sep 2006 A1
20060227114 Geaghan et al. Oct 2006 A1
20060232564 Nishimura et al. Oct 2006 A1
20060232567 Westerman et al. Oct 2006 A1
20060238517 King et al. Oct 2006 A1
20060238518 Westerman et al. Oct 2006 A1
20060238519 Westerman et al. Oct 2006 A1
20060238520 Westerman et al. Oct 2006 A1
20060238521 Westerman et al. Oct 2006 A1
20060238522 Westerman et al. Oct 2006 A1
20060244736 Tseng Nov 2006 A1
20060278444 Binstead Dec 2006 A1
20060290863 HoeSup Dec 2006 A1
20070013678 Nakajima et al. Jan 2007 A1
20070018969 Chen et al. Jan 2007 A1
20070027932 Thibeault Feb 2007 A1
20070062739 Philipp et al. Mar 2007 A1
20070075977 Chen et al. Apr 2007 A1
20070085838 Ricks et al. Apr 2007 A1
20070109274 Reynolds May 2007 A1
20070152976 Townsend et al. Jul 2007 A1
20070159561 Chien Jul 2007 A1
20070176905 Shih et al. Aug 2007 A1
20070182706 Cassidy et al. Aug 2007 A1
20070216657 Konieck Sep 2007 A1
20070229464 Hotelling et al. Oct 2007 A1
20070236466 Hotelling Oct 2007 A1
20070247429 Westerman Oct 2007 A1
20070257890 Hotelling et al. Nov 2007 A1
20070262967 Rho Nov 2007 A1
20080048994 Lee et al. Feb 2008 A1
20080055221 Yabuta et al. Mar 2008 A1
20080055268 Yoo et al. Mar 2008 A1
20080055270 Cho et al. Mar 2008 A1
20080062139 Hotelling et al. Mar 2008 A1
20080062140 Hotelling et al. Mar 2008 A1
20080062147 Hotelling et al. Mar 2008 A1
20080062148 Hotelling et al. Mar 2008 A1
20080067528 Choi et al. Mar 2008 A1
20080074401 Chung et al. Mar 2008 A1
20080079697 Lee et al. Apr 2008 A1
20080129898 Moon Jun 2008 A1
20080131624 Egami et al. Jun 2008 A1
20080136980 Rho et al. Jun 2008 A1
20080150901 Lowles et al. Jun 2008 A1
20080157867 Krah Jul 2008 A1
20080158167 Hotelling et al. Jul 2008 A1
20080158181 Hamblin et al. Jul 2008 A1
20080165158 Hotelling et al. Jul 2008 A1
20080186288 Chang Aug 2008 A1
20080297476 Hotelling et al. Dec 2008 A1
20090066670 Hotelling et al. Mar 2009 A1
20090096757 Hotelling et al. Apr 2009 A1
20090096758 Hotelling et al. Apr 2009 A1
20090115743 Oowaki May 2009 A1
20090160816 Westerman et al. Jun 2009 A1
20090273581 Kim et al. Nov 2009 A1
20090303193 Lim et al. Dec 2009 A1
20100066650 Lee et al. Mar 2010 A1
20100103121 Kim et al. Apr 2010 A1
20100182273 Noguchi et al. Jul 2010 A1
20100188347 Mizuhashi et al. Jul 2010 A1
20100194699 Chang Aug 2010 A1
20100238134 Day et al. Sep 2010 A1
20100289770 Lee et al. Nov 2010 A1
20110187677 Hotelling Aug 2011 A1
20120105371 Hotelling et al. May 2012 A1
20120162104 Chang Jun 2012 A1
20120162584 Chang Jun 2012 A1
20130106780 Hotelling et al. May 2013 A1
20140062955 Hotelling Mar 2014 A1
20140139457 Hotelling May 2014 A1
20140152619 Hotelling et al. Jun 2014 A1
20140300577 Hotelling et al. Oct 2014 A1
20140300578 Hotelling Oct 2014 A1
Foreign Referenced Citations (153)
Number Date Country
2005246219 Dec 2005 AU
1243096 Oct 1988 CA
2 494 353 Feb 2004 CA
101241277 Aug 2008 CN
197 06 168 Aug 1998 DE
102 51 296 May 2004 DE
0 156 593 Oct 1985 EP
0 156 593 Oct 1985 EP
0 250 931 Jan 1988 EP
0 250 931 Jan 1988 EP
0 250 931 Jan 1988 EP
0 464 908 Jan 1992 EP
0 464 908 Jan 1992 EP
0 464 908 Jan 1992 EP
0 288 692 Jul 1993 EP
0 288 692 Jul 1993 EP
0 288 692 Jul 1993 EP
0 664 504 Jul 1995 EP
0 786 745 Jul 1997 EP
0 786 745 Jul 1997 EP
0 786 745 Jul 1997 EP
0 932 117 Jul 1999 EP
0 932 117 Jul 1999 EP
0 932 117 Jul 1999 EP
0 973 123 Jan 2000 EP
0 014 295 Jan 2002 EP
1 014 295 Jan 2002 EP
1 211 633 Jun 2002 EP
1 211 633 Jun 2002 EP
1 322 104 Jun 2003 EP
1 391 807 Feb 2004 EP
1 396 218 Mar 2004 EP
1 396 812 Mar 2004 EP
1 418 491 May 2004 EP
1 418 491 May 2004 EP
1 422 601 May 2004 EP
1 455 264 Sep 2004 EP
1 455 264 Sep 2004 EP
2 267 584 Dec 2010 EP
1 486 988 Sep 1977 GB
2 168 816 Jun 1986 GB
2 368 483 Jul 2004 GB
53-147626 Nov 1978 JP
58-166430 Oct 1983 JP
59-214941 Dec 1984 JP
60-123927 Jul 1985 JP
60-211529 Oct 1985 JP
61-131314 Jun 1986 JP
63-279316 Nov 1988 JP
02-030024 Jan 1990 JP
03-180922 Aug 1991 JP
03-294918 Dec 1991 JP
04-127314 Apr 1992 JP
05-053726 Mar 1993 JP
05-080923 Apr 1993 JP
05-224818 Sep 1993 JP
06-161661 Jun 1994 JP
07-036017 Feb 1995 JP
07-044305 Feb 1995 JP
07-110741 Apr 1995 JP
07-141086 Jun 1995 JP
08-016307 Jan 1996 JP
08-147092 Jun 1996 JP
08-242458 Sep 1996 JP
08-297267 Nov 1996 JP
09-054650 Feb 1997 JP
09-091079 Apr 1997 JP
09-096792 Apr 1997 JP
09-212302 Aug 1997 JP
09-292950 Nov 1997 JP
09-325852 Dec 1997 JP
10-003349 Jan 1998 JP
11-145141 May 1999 JP
11-505641 May 1999 JP
11-249813 Sep 1999 JP
2000-105670 Apr 2000 JP
2000-112642 Apr 2000 JP
2000-163031 Jun 2000 JP
2000-172437 Jun 2000 JP
2000-172447 Jun 2000 JP
2000-221932 Aug 2000 JP
2001-283228 Oct 2001 JP
2002-501271 Jan 2002 JP
2002-116017 Apr 2002 JP
2002-259052 Sep 2002 JP
2002-287660 Oct 2002 JP
2002-342014 Nov 2002 JP
2002-342033 Nov 2002 JP
2002-366304 Dec 2002 JP
2003-029899 Jan 2003 JP
2003-099192 Apr 2003 JP
2003-516015 May 2003 JP
2003-185688 Jul 2003 JP
2003-196023 Jul 2003 JP
2003-249738 Sep 2003 JP
2003-255855 Sep 2003 JP
2004-038919 Feb 2004 JP
2004-102985 Apr 2004 JP
2004-186333 Jul 2004 JP
2005-346047 Dec 2005 JP
2007-533044 Nov 2007 JP
2008-032756 Feb 2008 JP
2009-244958 Oct 2009 JP
10-0226812 Jul 1999 KR
10-2004-0002310 Jan 2004 KR
10-2004-0013029 Feb 2004 KR
10-2004-0022243 Mar 2004 KR
10-2005-0019799 Mar 2005 KR
10-2006-0089645 Aug 2006 KR
10-2006-0089645 Aug 2006 KR
10-2010-0127164 Dec 2010 KR
200302778 Aug 2003 TW
2004-21156 Oct 2004 TW
200529441 Sep 2005 TW
WO-8704553 Jul 1987 WO
WO-9213328 Aug 1992 WO
WO-9615464 May 1996 WO
WO-9618179 Jun 1996 WO
WO-9718547 May 1997 WO
WO-9723738 Jul 1997 WO
WO-9814863 Apr 1998 WO
WO-9938149 Jul 1999 WO
WO-0127868 Apr 2001 WO
WO-0139371 May 2001 WO
WO-03079176 Sep 2003 WO
WO-03088176 Oct 2003 WO
WO-2004013833 Feb 2004 WO
WO-2004013833 Feb 2004 WO
WO-2004023376 Mar 2004 WO
WO-2004023376 Mar 2004 WO
WO-2004053576 Jun 2004 WO
WO-2004061808 Jul 2004 WO
WO-2004061808 Jul 2004 WO
WO-2004114265 Dec 2004 WO
WO-2005064451 Jul 2005 WO
WO-2005114369 Dec 2005 WO
WO-2005114369 Dec 2005 WO
WO-2006023569 Mar 2006 WO
WO-2006054585 May 2006 WO
WO-2007115032 Oct 2007 WO
WO-2007115032 Oct 2007 WO
WO-2007146779 Dec 2007 WO
WO-2007146779 Dec 2007 WO
WO-2007146780 Dec 2007 WO
WO-2007146780 Dec 2007 WO
WO-2007146783 Dec 2007 WO
WO-2007146783 Dec 2007 WO
WO-2007146785 Dec 2007 WO
WO-2007146785 Dec 2007 WO
WO-2008085457 Jul 2008 WO
WO-2008085457 Jul 2008 WO
WO-2009035471 Mar 2009 WO
WO-2012087639 Jun 2012 WO
Non-Patent Literature Citations (239)
Entry
“Gesture Recognition,” (2006). Located at <http://www.fingerworks.com/gesture—recognition.html>, last visited Jul. 25, 2006, two pages.
3M (2002). MicroTouch Capacitive Touch Screens Datasheets, 3M Innovation, six pages.
Agrawal, R. et al. (Jul. 1986). “An Overview of Tactile Sensing,” Center for Research on Integrated Manufacturing: Robot Systems Division, The University of Michigan, 47 pages.
Anonymous. (May 8, 1992). “The Sensor Frame Graphic Manipulator,” NASA Phase II Final Report, 28 pages.
Anonymous. (Oct. 30, 2001). “Radiotelephone with Rotating Symbol Keypad and Multi-Directional Symbol Input,” located at www.vitgn.com/mobile—terminal.com, 12 pages.
Anonymous. “4-Wire Resistive Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-4resistive.html generated Aug. 5, 2005.
Anonymous. “5-Wire Resistive Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-resistive.html generated Aug. 5, 2005.
Anonymous. “A Brief Overview of Gesture Recognition” obtained from http://www. Dai. Ed. Ac.uk/Cvonline/LOCA—COPIES/COHEN/gesture—overview.Html, generated Apr. 20, 2004.
Anonymous. “Capacitive Position Sensing” obtained from http://www.synaptics.com/technology/cps.cfin generated Aug. 5, 2005.
Anonymous. “Capacitive Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-capacitive.html generated Aug. 5, 2005.
Anonymous. “Comparing Touch Technologies” obtained from http://www.touchscreens.com/intro-touchtypes.html generated Oct. 10, 2004.
Anonymous. “FingerWorks—Gesture Guide—Application Switching,” obtained from http://www.fingerworks.com/gesture—guide—apps.html, generated on Aug. 27, 2004, 1-pg.
Anonymous. “FingerWorks—Gesture Guide—Editing,” obtained from http://www.fingerworks.com/gesure—guide—editing.html, generated on Aug. 27, 2004, 1-pg.
Anonymous. “FingerWorks—Gesture Guide—File Operations,” obtained from http://www.fingerworks.com/gesture—guide—files.html, generated on Aug. 27, 2004, 1-pg.
Anonymous. “FingerWorks—Gesture Guide—Text Manipulation,” obtained from http://www.fingerworks.com/gesture—guide—text—manip.html, generated on Aug. 27, 2004, 2-pg.
Anonymous. “FingerWorks—Gesture Guide—Tips and Tricks,” obtained from http://www.fingerworks.com/gesture—guide—tips.html, generated Aug. 27, 2004, 2-pgs.
Anonymous. “FingerWorks—Gesture Guide—Web,” obtained from http://www.fingerworks.com/gesture—guide—web.html, generated on Aug. 27, 2004, 1-pg.
Anonymous. “FingerWorks—Guide to Hand Gestures for USB Touchpads,” obtained from http://www.fingerworks.com/igesture—userguide.html, generated Aug. 27, 2004, 1-pg.
Anonymous. “FingerWorks—iGesture—Technical Details,” obtained from http://www.fingerworks.com/igesture—tech.html, generated Aug. 27, 2004, 1-pg.
Anonymous. “FingerWorks—The Only Touchpads with Ergonomic Full-Hand Resting and Relaxation!” obtained from http://www.fingerworks.com/resting.html, Copyright 2001, 1-pg.
Anonymous. “FingerWorks—Tips for Typing on the Mini,” obtained from http://www.fingerworks.com/mini—typing.html, generated on Aug. 27, 2004, 2-pgs.
Anonymous. “GlidePoint®” obtained from http://www.cirque.com/technology/technology—gp.html generated Aug. 5, 2005.
Anonymous. “How do touchscreen monitors know where you're touching?” obtained from http://www.electronics.howstuffworks.com/question716.html generated Aug. 5, 2005.
Anonymous. “How does a touchscreen work?” obtained from http://www.touchscreens.com/intro-anatomy.html generated Aug. 5, 2005.
Anonymous. “iGesture Pad—The Multi Finger USB Touch Pad with Whole-Hand Gestures,” obtained from http://www.fingerworks.com/igesture.html, generated Aug. 27, 2004, 2-pgs.
Anonymous. “iGesture Products for Everyone (learn in minutes) Product Overview” FingerWorks.com downloaded Aug. 30, 2005.
Anonymous. “Infrared Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-infrared.html generated Aug. 5, 2005.
Anonymous. “Mouse Emulation” FingerWorks obtained from http://www.fingerworks.com/gesture—guide—mouse.html generated Aug. 30, 2005.
Anonymous. “Mouse Gestures in Opera” obtained from http://www.opera.com/products/desktop/mouse/index.dml generated Aug. 30, 2005.
Anonymous. “Mouse Gestures,” Optim oz, May 21, 2004.
Anonymous. “MultiTouch Overview” FingerWorks obtained from http://www.fingerworks.com/multoverview.html generated Aug. 30, 2005.
Anonymous. “Near Field Imaging Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-nfi.html generated Aug. 5, 2005.
Anonymous. “PenTouch Capacitive Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-pentouch.html generated Aug. 5, 2005.
Anonymous. “Surface Acoustic Wave Touchscreens” obtained from http://www.touchscreens.com/intro-touchtypes-saw.html generated Aug. 5, 2005.
Anonymous. “Symbol Commander” obtained from http://www.sensiva.com/symbolcommander/, generated Aug. 30, 2005.
Anonymous. “Tips for Typing” FingerWorks http://www.fingerworks.com/mini_typing.html generated Aug. 30, 2005.
Anonymous. “Touch Technologies Overview” 2001, 3M Touch Systems, Massachusetts.
Anonymous. “Touchscreen Technology Choices,” <http://www.elotouch.com/products/detech2.asp>, downloaded Aug. 5, 2005.
Anonymous. “Wacom Components—Technology” obtained from http://www.wacom-components.com/english/tech.asp generated on Oct. 10, 2004.
Anonymous. “Watershed Algorithm” http://rsb.info.nih.gov/ij/plugins/watershed.html generated Aug. 5, 2005.
Baxter, L.K. (1996). Capacitive Sensors: Design and Applications, vol. 1 of IEEE Press Series on Electronics Technology, John Wiley & Sons: New York, NY, (Table of Contents Only) three pages.
Bennion, S.I. et al. (Dec. 1981). “Touch Sensitive Graphics Terminal Applied to Process Control,” Computer Graphics 15(4):342-350.
Bier et al., “Toolglass and Magic Lenses: The See-Through Interface,” in James Kajiya, editor, Computer Graphics (SIGGRAPH '93 Proceedings), vol. 27, pp. 73-80, Aug. 1993.
Boie, R.A. (Mar. 1984). “Capacitive Impedance Readout Tactile Image Sensor,” Proceedings of 1984 IEEE International Conference on Robotics and Automation, pp. 370-378.
Buxton, W.A.S. (Mar./Apr. 1994). “Combined Keyboard/Touch Tablet Input Device,” Xerox Disclosure Journal 19(2):109-111.
Chun, K. et al. (Jul. 1985). “A High-Performance Silicon Tactile Imager Based on a Capacitive Cell,” IEEE Transactions on Electron Devices 32(7):1196-1201.
Cliff (Jul. 24, 2002). “Building a Pressure-Sensitive, Multi-Point TouchScreen?” Posted from the D-I-Y-Baby Department, one page.
Collberg, C. et al. (2002). “TetraTetris: A Study of Multi-User Touch-Based Interaction Using DiamondTouch,” located at cs.arizona.edu, eight pages.
Dannenberg, R.B. et al. (1989). “A Gesture Based User Interface Prototyping System,” ACM, pp. 127-132.
Davies, E.R. (Aug. 1987). “Lateral Histograms for Efficient Object Location: Speed versus Ambiguity,” Pattern Recognition Letters 6(3):189-198.
Davies, E.R. (1990). Machine Vision: Theory, Algorithms, Practicalities, Academic Press, Inc.: San Diego, CA, pp. xi-xxi (Table of Contents Only).
Davies, E.R. (1997). “Boundary Pattern Analysis,” Chapter 7 in Machine Vision: Theory, Algorithms, Practicalities, 2nd Edition, Academic Press, Inc.: San Diego, CA, pp. 171-191.
Davies, E.R. (1997). “Ellipse Detection,” Chapter 11 in Machine Vision: Theory, Algorithms, Practicalities, 2nd Edition, Academic Press, Inc.: San Diego, CA, pp. 271-290.
Davies, E.R. (1997). “Image Acquisition,” Chapter 23 in Machine Vision: Theory, Algorithms, Practicalities, 2nd Edition, Academic Press, Inc.: San Diego, CA, pp. 583-601.
Diaz-Marino, R.A. et al. (2003). “Programming for Multiple Touches and Multiple Users: A Toolkit for the DiamondTouch Hardware,” Proceedings of ACM UIST'03 User Interface Software and Technology, two pages.
Dietz, P. et al. (2001). “DiamondTouch: A Multi-User Touch Technology,” Proceedings of the 14th Annual ACM Symposium on User Interface Software and Technology, Nov. 11-14, 2001, Orlando, FL, pp. 219-226.
Douglas et al., The Ergonomics of Computer Pointing Devices (1997).
Esenther, A. et al. (Nov. 2002). “DiamondTouch SDK: Support for Multi-User, Multi-Touch Applications,” Mitsubishi Electric Research Laboratories, Inc., five pages.
European Search Report mailed Jul. 28, 2011, for EP Application No. 11159164.0, filed Jun. 8, 2007, eight pages.
European Search Report mailed Oct. 21, 2011, for EP Application No. 11159166.5, filed Jun. 8, 2007, seven pages.
European Search Report mailed Feb. 16, 2012, for EP Application No. 11183531.0, 11 pages.
European Search Report mailed Mar. 27, 2012, for EP Application No. 10178558.2, nine pages.
European Search Report received in EP 1 621 989 (at Beyer Weaver & Thomas, LLP) dated Mar. 27, 2006.
EVB Elektronik “TSOP6238 IR Receiver Modules for Infrared Remote Control Systems” dated Jan. 2004 1-pg.
Fearing, R.S. (Jun. 1990). “Tactile Sensing Mechanisms,” The International Journal of Robotics Research 9(3):3-23.
Final Office Action mailed Jul. 6, 2010, for U.S. Appl. No. 11/760,036, filed Jun. 8, 2007, 51 pages.
Final Office Action mailed Jul. 6, 2010, for U.S. Appl. No. 11/760,080, filed Jun. 8, 2007, 66 pages.
Final Office Action mailed Jul. 22, 2010, for U.S. Appl. No. 11/760,049, filed Jun. 8, 2007, 52 pages.
Final Office Action mailed Aug. 2, 2010, for U.S. Appl. No. 11/760,060, filed Jun. 8, 2007, 78 pages.
Final Office Action mailed Sep. 1, 2011, for U.S. Appl. No. 11/650,203, filed Jan. 3, 2007, nine pages.
Final Office Action mailed Oct. 17, 2011, for U.S. Appl. No. 11/818,395, filed Jun. 13, 2007, 16 pages.
Final Office Action mailed Dec. 16, 2011, for U.S. Appl. No. 11/760,036, filed Jun. 8, 2007, 53 pages.
Final Office Action mailed Jan. 30, 2012, for U.S. Appl. No. 11/760,049, filed Jun. 8, 2007, 64 pages.
Final Office Action mailed Feb. 27, 2012, for U.S. Appl. No. 11/760,080, filed Jun. 8, 2007, 62 pages.
Final Office Action mailed May 9, 2013, for U.S. Appl. No. 12/976,997, filed Dec. 22, 2010, 7 pages.
Fisher et al., “Repetitive Motion Disorders: The Design of Optimal Rate—Rest Profiles,” Human Factors, 35(2):283-304 (Jun. 1993).
Fukumoto, M. and Tonomura, Y., “Body Coupled Fingering: Wireless Wearable Keyboard,” CHI '97, pp. 147-154 (Mar. 1997).
Fukumoto et al., “ActiveClick: Tactile Feedback for Touch Panels,” In CHI 2001 Summary, pp. 121-122, 2001.
Hardy, “Fingerworks,” Mar. 7, 2002, BBC World Online.
Hector, J. et al. (May 2002). “Low Power Driving Options for an AMLCD Mobile Display Chipset,” Chapter 16.3 in SID 02 Digest (2002 SID International Symposium, Digest of Technical Papers), XXXIII(II):694-697.
Hillier, F.S. and Lieberman, G.J., Introduction to Operations Research (1986).
Hinckley, K. et al. (1998). “Interaction and Modeling Techniques for Desktop Two-Handed Input,” Proceedings of ACM UIST '98 Symposium on User Interface Software and Technology, pp. 49-58.
Hinckley, K. et al. (May 1999). “Touch-Sensing Input Devices,” CHI '99, pp. 223-230.
Hinckley, K. et al. (2000). “Sensing Techniques for Mobile Interaction,” CHI Letters 2(2):91-100.
Hlady, A.M. (1969). “A Touch Sensitive X-Y Position Encoder for Computer Input,” Fall Joint Computer Conference, pp. 545-551.
International Search Report dated Mar. 3, 2006 (PCT/US 05/03325).
International search report for International Application No. PCT/US2005/014364 mailed Jan. 12, 2005.
International Search Report mailed Mar. 6, 2008, for PCT Application No. PCT/2007/70733, filed Jun. 8, 2007, five pages.
International Search Report mailed Mar. 7, 2008, for PCT Application No. PCT/2007/70722, filed Jun. 8, 2007, three pages.
International Search Report mailed Jun. 24, 2008, for PCT Application No. PCT/US2007/026298, filed Dec. 21, 2007, two pages.
International Search Report mailed Jul. 18, 2008, for PCT Application No. PCT/2007/70725, filed Jun. 8, 2007, six pages.
International Search Report mailed Jul. 18, 2008, for PCT Application No. PCT/2007/70729, filed Jun. 8, 2007, five pages.
International Search Report mailed Oct. 16, 2008, for PCT Application No. PCT/US2007/088749, filed Dec. 21, 2007, four pages.
International Search Report mailed Jun. 15, 2012, for PCT/US2011/064455, filed Dec. 12, 2011, four pages.
International Search Report received in corresponding PCT application No. PCT/US2006/008349 dated Oct. 6, 2006.
Jacob et al., “Integrality and Separability of Input Devices,” ACM Transactions on Computer-Human Interaction, 1:3-26 (Mar. 1994).
Kanda, E. et al. (2008). “55.2: Integrated Active Matrix Capacitive Sensors for Touch Panel LTPS-TFT LCDs,” SID 08 Digest, pp. 834-837.
Kionix “KXP84 Series Summary Data Sheet” copyright 2005, dated Oct. 21, 2005, 4-pgs.
Kirk, D.E. (1970). “Numerical Determination of Optimal Trajectories,” Chapter 6 in Optimal Control Theory: An Introduction, Prentice Hall, Inc.: Englewood Cliffs, NJ, pp. 329-413, with Table of Contents, pp. vii-xi. (90 pages total).
Kling, M. et al. (Sep. 2003). “Interface Design: LCD Touch Interface for ETRAX 100LX,” Thesis Submitted in Partial Fulfillment of the Requirements for the Degree of Master of Science, Umeå University, Umeå, Sweden, 79 pages.
Ko, H. (Jul. 2000). “Open Systems Advanced Workstation Transition Report,” Technical Report 1822, U.S. Navy, SSC San Diego, CA, 82 pages.
Krein, P. et al. (May/Jun. 1990). “The Electroquasistatics of the Capacitive Touch Panel,” IEEE Transactions on Industry Applications 26(3):529-534.
Krueger, M. et al. (Jun. 10, 1988). “Videoplace, Responsive Environment, 1972-1990,” located at http://www.youtube.com/watch?v=dmmxVA5xhuo, last visited Aug. 5, 2011, two pages.
Lee, “A Fast Multiple-Touch-Sensitive Input Device,” Master's Thesis, University of Toronto (1984).
Lee, S.K. et al. (Apr. 1985). “A Multi-Touch Three Dimensional Touch-Sensitive Tablet,” CHI '85 Conference Proceedings: Human Factors in Computing Systems, pp. 21-25.
Leigh, J. et al. (2002). “Amplified Collaboration Environments,” VizGrid Symposium, Nov. 2002, Tokyo, Japan, nine pages.
Ljungstrand, P. et al. eds. (2002). UBICOMP2002, Adjunct Proceedings, 4th International Conference on Ubiquitous Computing, Sep. 29-Oct. 1, 2002, Göteborg, Sweden, 90 pages.
Magerkurth, C. et al. (2004). “Towards the Next Generation of Tabletop Gaming Experiences,” Graphics Interface 2004 (GI'04), May 17-19, 2004, Ontario, Canada, pp. 1-8.
Malik, S. et al. (2004). “Visual Touchpad: A Two-Handed Gestural Input Device,” ICMI '04 Proceedings of the 6th International Conference on Multimodal Interfaces, ACM, 8 pages.
Matsushita et al., “HoloWall: Designing a Finger, Hand, Body and Object Sensitive Wall,” In Proceedings of UIST '97, Oct. 1997.
Matsushita, N. et al. (2000). “Dual Touch: A Two-Handed Interface for Pen-Based PDAs,” CHI Letters 2(2):211-212.
McMillan, G.R. (Oct. 1998). “The Technology and Applications of Gesture-Based Control,” presented at the RTO Lecture Series on Alternative Control Technologies: Human Factor Issues, Oct. 14-15, 1998, Ohio, USA, pp. 4-1-4-11.
Mehta, N. et al. (May 1982). “Feature Extraction as a Tool for Computer Input,” Proceedings of ICASSP '82, May 3-5, 1982, Paris, France, pp. 818-820.
Mitchell, G. D. (Oct. 2003). “Orientation on Tabletop Displays,” Thesis Submitted in Partial Fulfillment of the Requirements for the Degree of Master of Science, Simon Fraser University, 119 pages.
Noda, K. et al. (2001). “Production of Transparent Conductive Films with Inserted SiO2 Anchor Layer, and Application to a Resistive Touch Panel,” Electronics and Communications in Japan Part 2 84(7):39-45.
Non-Final Office Action mailed May 14, 2008, for U.S. Appl. No. 10/840,862, filed May 6, 2004, six pages.
Non-Final Office Action mailed Dec. 24, 2008, for U.S. Appl. No. 10/840,862, filed May 6, 2004, nine pages.
Non-Final Office Action mailed Jun. 2, 2009, for U.S. Appl. No. 10/840,862, filed May 6, 2004, seven pages.
Non-Final Office Action mailed Nov. 12, 2009, for U.S. Appl. No. 10/840,862, filed May 6, 2004, eight pages.
Non-Final Office Action mailed Mar. 12, 2010, for U.S. Appl. No. 11/760,080, filed Jun. 8, 2007, 31 pages.
Non-Final Office Action mailed Apr. 22, 2010, for U.S. Appl. No. 11/760,036, filed Jun. 8, 2007, 37 pages.
Non-Final Office Action mailed Apr. 23, 2010, for U.S. Appl. No. 11/760,060, filed Jun. 8, 2007, 66 pages.
Non-Final Office Action mailed May 5, 2010, for U.S. Appl. No. 11/760,049, filed Jun. 8, 2007, 65 pages.
Non-Final Office Action mailed Jun. 21, 2010, for U.S. Appl. No. 11/650,203, filed Jan. 3, 2007, eight pages.
Non-Final Office Action mailed Jan. 25, 2011, for U.S. Appl. No. 11/818,395, filed Jun. 13, 2007, 31 pages.
Non-Final Office Action mailed Mar. 14, 2011, for U.S. Appl. No. 11/650,203, filed Jan. 3, 2007, nine pages.
Non-Final Office Action mailed May 13, 2011, for U.S. Appl. No. 12/267,540, filed Nov. 7, 2008, seven pages.
Non-Final Office Action mailed Jul. 8, 2011, for U.S. Appl. No. 12/267,532, filed Nov. 7, 2008, five pages.
Non-Final Office Action mailed Jul. 14, 2011, for U.S. Appl. No. 12/267,522, filed Nov. 7, 2008, six pages.
Non-Final Office Action mailed Aug. 4, 2011, for U.S. Appl. No. 11/760,036, filed Jun. 8, 2007, 45 pages.
Non-Final Office Action mailed Aug. 11, 2011, for U.S. Appl. No. 11/760,049, filed Jun. 8, 2007, 60 pages.
Non-Final Office Action mailed Sep. 1, 2011, for U.S. Appl. No. 11/760,060, filed Jun. 8, 2007, 76 pages.
Non-Final Office Action mailed Nov. 14, 2011, for U.S. Appl. No. 11/760,080, filed Jun. 8, 2007, 60 pages.
Non-Final Office Action mailed Feb. 17, 2012, for U.S. Appl. No. 13/251,099, filed Sep. 30, 2011, seven pages.
Non-Final Office Action mailed Jun. 20, 2012, for U.S. Appl. No. 13/345,347, filed Jan. 6, 2012, five pages.
Non-Final Office Action mailed Sep. 12, 2012, for U.S. Appl. No. 11/650,203, filed Jan. 3, 2007, nine pages.
Non-Final Office Action mailed Oct. 5, 2012, for U.S. Appl. No. 12/976,997, filed Dec. 22, 2010, six pages.
Non-Final Office Action mailed Mar. 29, 2013, for U.S. Appl. No. 13/717,573, filed Dec. 17, 2012, five pages.
Non-Final Office Action mailed May 30, 2013, for U.S. Appl. No. 13/251,099, filed Sep. 30, 2011, seven pages.
Non-Final Office Action mailed Jun. 27, 2013, for U.S. Appl. No. 11/760,080, filed Jun. 8, 2007, 48 pages.
Notice of Allowance mailed Apr. 27, 2012, for U.S. Appl. No. 11/760,036, filed Jun. 8, 2007, eight pages.
Notice of Allowance mailed Oct. 25, 2011, for U.S. Appl. No. 12/267,540, filed Nov. 7, 2008, seven pages.
Notice of Allowance mailed Mar. 27, 2012, for U.S. Appl. No. 11/760,060, filed Jun. 8, 2007, 17 pages.
Notice of Allowance mailed Jul. 12, 2012, for U.S. Appl. No. 13/251,099, filed Sep. 30, 2011, seven pages.
Notice of Allowance mailed Sep. 19, 2012, for U.S. Appl. No. 13/345,347, filed Jan. 6, 2012, seven pages.
Notice of Allowance mailed Oct. 29, 2012, for U.S. Appl. No. 13/345,347, filed Jan. 6, 2012, eight pages.
Notice of Allowance mailed Feb. 6, 2013, for U.S. Appl. No. 13/084,402, filed Apr. 11, 2011, 12 pages.
Notice of Allowance mailed Feb. 19, 2013, for U.S. Appl. No. 13/538,498, filed Jun. 29, 2012, 16 pages.
Notice of Allowance mailed Apr. 26, 2013, for U.S. Appl. No. 11/650,203, filed Jan. 3, 2007, 8 pages.
Notice of Allowance mailed May 28, 2013, for U.S. Appl. No. 11/760,049, filed Jun. 8, 2007, 10 pages.
Notice of Allowance mailed Jul. 19, 2013, for U.S. Appl. No. 13/717,573, filed Dec. 17, 2012, 9 pages.
Notice of Allowance mailed Oct. 10, 2013, for U.S. Appl. No. 11/760,080, filed Jun. 8, 2007, 10 pages.
Ogawa, H. et al. (1979). “Preprocessing for Chinese Character Recognition and Global Classification of Handwritten Chinese Characters,” Pattern Recognition 11:1-7.
Partial European Search Report mailed Mar. 15, 2011, for EP Application No. 10178661.4, filed Jun. 8, 2007, six pages.
Partial European Search Report mailed Oct. 21, 2011, for EP Application No. 11159165.7 filed Jun. 8, 2007, seven pages.
Partial European Search Report mailed Oct. 24, 2011, for EP Application No. 11159167.3 filed Jun. 8, 2007, eight pages.
Phipps, C.A. (Spring 2003). “A Metric to Measure Whole Keyboard Index of Difficulty Based on Fitts' Law,” A Dissertation Submitted in Partial Fulfillment of the Requirements for the Degree of Ph.D., 103 pages.
Quantum Research Group “QT510 / QWheel™ Touch Slider IC” copyright 2004-2005, 14-pgs.
Quantum Research Group Ltd. (1997). QT9701B2 Datasheet, 30 pages.
Quantum Research Group Ltd. (1999). QProx™ QT60320 32-Key Qmatrix™ Charge-Transfer IC Datasheet, pp. 1-14.
Quantum Research Group Ltd. (2001). QT60325, QT60485, QT60645 32, 48, 64 Key QMatrix™ Keypanel Sensor ICs Datasheet, 42 pages.
Quantum Research Group Ltd. (2002). QMatrix™ QT60040 4-Key Charge-Transfer IC Datasheet, pp. 1-9.
Quantum Research Group Ltd. (Oct. 10, 2002). Quantum Research Application Note AN-KD01: Qmatrix™ Panel Design Guidelines, four pages.
Quek, “Unencumbered Gestural Interaction,” IEEE Multimedia, 3:36-47 (Winter 1996).
Rabuffetti, M. (2002). “Touch-screen System for Assessing Visuo-motor Exploratory Skills in Neuropsychological Disorders of Spatial Cognition,” Medical & Biological Engineering & Computing 40:675-686.
Radwin, “Activation Force and Travel Effects on Overexertion in Repetitive Key Tapping,” Human Factors, 39(1):130-140 (Mar. 1997).
Raisamo, R. (Dec. 7, 1999). “Multimodal Human-Computer Interaction: A Constructive and Empirical Study,” Dissertation, University of Tampere, Finland, 86 pages.
Rekimoto et al., “ToolStone: Effective Use of the Physical Manipulation Vocabularies of Input Devices,” In Proc. of UIST 2000, 2000.
Rekimoto, J. (2002). “SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces,” CHI 2002, Apr. 20-25, 2002, CHI Letters 4(1):113-120.
Rekimoto, J. et al. (2003). “Pre-Sense: Interaction Techniques for Finger Sensing Input Devices,” CHI Letters 5(2):203-212.
Request for Ex Parte Reexamination of U.S. Patent No. 7,663,607, 106 pages.
Rong, J. et al. (2002). “AIAA 2002-4553: Hierarchical Agent Based System for General Aviation CD&R Under Free Flight,” AIAA Guidance, Navigation, and Control Conference and Exhibit, Aug. 5-8, 2002, Monterey, CA, pp. 1-11.
Rubine et al., “Programmable Finger-Tracking Instrument Controllers,” Computer Music Journal, vol. 14, No. 1 (Spring 1990).
Rubine, D. (Jul. 1991). “Specifying Gestures by Example,” Computer Graphics 25(4):329-337.
Rubine, D. et al. (1988). “The VideoHarp,” Proceedings of the 14th International Computer Music Conference, Cologne, W. Germany, Sep. 20-25, 1988, pp. 49-55.
Rubine, D. et al. (1991). “The Videoharp: An Optical Scanning MIDI Controller,” Contemporary Music Review 6(1):31-46.
Rubine, D.H. (Dec. 1991). “The Automatic Recognition of Gestures,” CMU-CS-91-202, Submitted in Partial Fulfillment of the Requirements of the Degree of Doctor of Philosophy in Computer Science at Carnegie Mellon University, 285 pages.
Rubine, D.H. (May 1992). “Combining Gestures and Direct Manipulation,” CHI '92, pp. 659-660.
Russell, D.M. et al. (2004). “The Use Patterns of Large, Interactive Display Surfaces: Case Studies of Media Design and Use for BlueBoard and MERBoard,” Proceedings of the 37th Hawaii International Conference on System Sciences 2004, IEEE, pp. 1-10.
Rutledge et al., “Force-To-Motion Functions for Pointing,” Human-Computer Interaction: INTERACT '90 (1990).
Sears, A. (Mar. 11, 1991). “Improving Touchscreen Keyboards: Design Issues and a Comparison with Other Devices,” Human-Computer Interaction Laboratory, pp. 1-19.
Sears, A. et al. (Jun. 1990). “A New Era for High-Precision Touchscreens,” Advances in Human-Computer Interaction, vol. 3, Tech Report HCIL-90-01, one page only.
Segen, J. et al. (1998). “Human-Computer Interaction Using Gesture Recognition and 3D Hand Tracking,” IEEE, pp. 188-192.
Shen, C. et al. (Jan. 2004). “DiamondSpin: An Extensible Toolkit for Around-the-Table Interaction,” CHI 2004, Apr. 24-29, 2004, Vienna, Austria, 10 pgs.
Siegel, D.M. et al. (1987). “Performance Analysis of a Tactile Sensor,” IEEE, pp. 1493-1499.
Singapore Examination Report mailed Jan. 11, 2010, for SG Patent Application No. 0607116-1, five pages.
Son, J.S. et al. (1996). “Comparison of Contact Sensor Localization Abilities During Manipulation,” Robotics and Autonomous Systems 17:217-233.
Stansfield, S.A. (Mar. 1990). “Haptic Perception With an Articulated, Sensate Robot Hand,” Sandia Report: SAND90-0085-UC-406, 37 pages.
Stauffer, R.N. ed. (Jun. 1983). “Progress in Tactile Sensor Development,” Robotics Today pp. 43-49.
Stumpe, B. (Mar. 16, 1977). “A New Principle for an X-Y Touch Screen,” CERN, 19 pages.
Stumpe, B. (Feb. 6, 1978). “Experiments to Find a Manufacturing Process for an X-Y Touch Screen: Report on a Visit to Polymer-Physik GmbH,” CERN, five pages.
Subutai Ahmad, “A Usable Real-Time 3D Hand Tracker,” Proceedings of the 28th Asilomar Conference on Signals, Systems and Computers—Part 2 (of 2), vol. 2 (Oct. 1994).
Sugiyama, S. et al. (Mar. 1990). “Tactile Image Detection Using a 1k-element Silicon Pressure Sensor Array,” Sensors and Actuators A21-A23(1-3):397-400.
Suzuki, K. et al. (Aug. 1990). “A 1024-Element High-Performance Silicon Tactile Imager,” IEEE Transactions on Electron Devices 37(8):1852-1860.
Texas Instruments “TSC2003 / I2C Touch Screen Controller” Data Sheet SBAS 162, dated Oct. 2001, 20-pgs.
TW Search Report mailed Jun. 27, 2011, for TW Patent Application No. 097100481, one page.
U.S. Appl. No. 10/789,676, filed Feb. 27, 2004 entitled “Shape Detecting Input Device”.
U.S. Appl. No. 11/015,978, filed Dec. 17, 2004.
U.S. Appl. No. 60/072,509, filed Jan. 26, 1998, by Westerman et al.
U.S. Appl. No. 60/333,770, filed Nov. 29, 2001, by Perski et al.
U.S. Appl. No. 60/406,662, filed Aug. 29, 2002, by Morag et al.
U.S. Appl. No. 60/501,484, filed Sep. 5, 2003, by Perski et al.
Van Kleek, M. (Feb. 2003). “Intelligent Environments for Informal Public Spaces: The Ki/o Kiosk Platform,” Thesis Submitted in Partial Fulfillment of the Requirements for the Degree of Master of Engineering, MIT, 108 pages.
Van Oversteegen, B.G.F.A.W. (Apr. 10, 1998). “Touch Screen Based Measuring Equipment: Design and Implementation,” Master's Thesis, Submitted to Technische Universiteit Eindhoven, Eindhoven, The Netherlands, 103 pages.
Vazquez, A.A. (Sep. 1990). “Touch Screen Use on Flight Simulator Instructor/Operator Stations,” Thesis Submitted in Partial Fulfillment of the Requirements for the Degree of Master of Science in Information Systems, 78 pages.
Vernier, F. et al. (2002). “Multi-User, Multi-Finger Drag & Drop of Multiple Documents,” located at http://www.edgelab.ca/CSCW/Workshop2002/fred_vernier, three pages.
Wacom Company Limited. (Nov. 12, 2003). Wacom intuos® 2 User's Manual for Windows®, English V4.1, 165 pages.
Wallergard, M. (2003). “Designing Virtual Environments for Brain Injury Rehabilitation,” Thesis, Lund University, Sweden, 98 pages.
Wellner, “The DigitalDesk Calculator: Tangible Manipulation on a Desk Top Display,” In ACM UIST '91 Proceedings, pp. 27-34, Nov. 1991.
Westerman, W. (Spring 1999). “Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface,” A Dissertation Submitted to the Faculty of the University of Delaware in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Electrical Engineering, 364 pages.
Westerman, W. et al. (2001). “Multi-Touch: A New Tactile 2-D Gesture Interface for Human-Computer Interaction,” Proceedings of the Human Factors and Ergonomics Society 45th Annual Meeting, pp. 632-636.
Williams, “Applications for a Switched-Capacitor Instrumentation Building Block” Linear Technology Application Note 3, Jul. 1985, pp. 1-16.
Wu, M. et al. (2003). “Multi-Finger and Whole Hand Gestural Interaction Techniques for Multi-User Tabletop Displays,” ACM pp. 193-202.
Yamada et al., “A Switched-Capacitor Interface for Capacitive Pressure Sensors” IEEE Transactions on Instrumentation and Measurement, vol. 41, No. 1, Feb. 1992, pp. 81-86.
Yee, K-P. (2004). “Two-Handed Interaction on a Tablet Display,” CHI'04, pp. 1493-1496.
Yeh et al., “Switched Capacitor Interface Circuit for Capacitive Transducers” 1985 IEEE.
Zhai et al., “Dual Stream Input for Pointing and Scrolling,” Proceedings of CHI '97 Extended Abstracts (1997).
Zimmerman et al., “Applying Electric Field Sensing to Human-Computer Interfaces,” In CHI '95 Proceedings, pp. 280-287, 1995.
Non-Final Office Action mailed Mar. 25, 2014, for U.S. Appl. No. 14/073,818, filed Nov. 6, 2013, six pages.
Notice of Allowance mailed Mar. 3, 2014, for U.S. Appl. No. 13/251,099, filed Sep. 30, 2011, eight pages.
Leeper, A.K. (May 21, 2002). “Integration of a clear capacitive touch screen with a ⅛-VGA FSTN-LCD to form an LCD-based touchpad,” Synaptics Inc., Society for Information Display, 3 pages.
Non-Final Office Action mailed Dec. 24, 2013, for Ex Parte Reexamination of U.S. Patent No. 7,663,607, 52 pages.
Non-Final Office Action mailed Apr. 18, 2014, for U.S. Appl. No. 11/818,395, filed Jun. 13, 2007, 17 pages.
Non-Final Office Action mailed May 16, 2014, for Ex Parte Reexamination of U.S. Patent No. 7,663,607, 34 pages.
Notice of Allowance mailed Apr. 14, 2014, for U.S. Appl. No. 12/976,997, filed Dec. 22, 2010, 8 pages.
Notice of Prior and Concurrent Proceedings under 37 C.F.R. § 1.565(a) for U.S. Ex Parte Reexamination Control No. 90/012,935, filed Jul. 30, 2013 (Reexamination of U.S. Patent No. 7,663,607), 317 pages. (Submitted in two parts).
Response to Non-Final Office Action in Ex Parte Reexamination dated Mar. 24, 2014, of U.S. Patent No. 7,663,607, 392 pages. (Submitted in two parts).
Final Office Action mailed Oct. 27, 2014, for U.S. Appl. No. 11/818,395, filed Jun. 13, 2007, 17 pages.
Non-Final Office Action mailed Jul. 14, 2014, for U.S. Appl. No. 14/308,595, filed Jun. 18, 2014, five pages.
Non-Final Office Action mailed Sep. 26, 2014, for U.S. Appl. No. 14/308,646, filed Jun. 18, 2014, five pages.
Notice of Allowability (Corrected) mailed Jun. 27, 2014, for U.S. Appl. No. 12/976,997, filed Dec. 22, 2010, five pages.
Notice of Allowance mailed Jul. 14, 2014, for U.S. Appl. No. 14/073,818, filed Nov. 6, 2013, seven pages.
Notice of Allowance mailed Oct. 31, 2014, for U.S. Appl. No. 14/308,595, filed Jun. 18, 2014, eight pages.
Notice of Allowance mailed Nov. 6, 2014, for U.S. Appl. No. 14/308,646, filed Jun. 18, 2014, eight pages.
Notice of Allowance mailed Dec. 23, 2014, for U.S. Appl. No. 14/456,831, filed Aug. 11, 2014, eight pages.
TW Search Report mailed Feb. 11, 2014, for TW Patent Application No. 110145112, one page.
Related Publications (1)
Number: US 20140078108 A1, Date: Mar. 2014, Country: US
Divisions (2)
Parent: U.S. Appl. No. 13/345,347 (Jan. 2012, US), Child: U.S. Appl. No. 13/717,573 (US)
Parent: U.S. Appl. No. 10/840,862 (May 2004, US), Child: U.S. Appl. No. 12/267,532 (US)
Continuations (2)
Parent: U.S. Appl. No. 13/717,573 (Dec. 2012, US), Child: U.S. Appl. No. 14/086,877 (US)
Parent: U.S. Appl. No. 12/267,532 (Nov. 2008, US), Child: U.S. Appl. No. 13/345,347 (US)