This relates generally to touch and hover sensing and more particularly to devices that can perform both touch and hover sensing.
Touch sensitive devices have become quite popular as input devices to computing systems because of their ease and versatility of operation as well as their declining price. A touch sensitive device can include a touch sensor panel, which can be a clear panel with a touch sensitive surface, and a display device such as a liquid crystal display (LCD) that can be positioned partially or fully behind the panel or integrated with the panel so that the touch sensitive surface can cover at least a portion of the viewable area of the display device. The touch sensitive device can allow a user to perform various functions by touching the touch sensor panel using a finger, stylus or other object at a location often dictated by a user interface (UI) being displayed by the display device. In general, the touch sensitive device can recognize a touch event and the position of the touch event on the touch sensor panel, and the computing system can then interpret the touch event in accordance with the display appearing at the time of the touch event, and thereafter can perform one or more actions based on the touch event.
Some touch sensitive devices can also recognize a hover event, i.e., an object near but not touching the touch sensor panel, and the position of the hover event at the panel. The touch sensitive device can then process the hover event in a manner similar to that for a touch event, where the computing system can interpret the hover event in accordance with the display appearing at the time of the hover event, and thereafter can perform one or more actions based on the hover event.
While touch and hover capabilities in a touch sensitive device are desirable, together they can present a challenge to cooperative performance for accurate, reliable detection of touch and hover events.
This relates to switching a touch and hover sensing device between a touch mode and a hover mode. During a touch mode, the device can be switched to sense one or more objects touching the device. During a hover mode, the device can be switched to sense one or more objects hovering over the device. The device can include a panel having multiple sensors for sensing a touching object and/or a hovering object and a touch and hover control system for switching the device between the touch and hover modes. The device's touch and hover control system can include a touch sensing circuit for coupling to the sensors to measure a capacitance indicative of a touching object during the touch mode, a hover sensing circuit for coupling to the sensors to measure a capacitance indicative of a hovering object during the hover mode, and a switching mechanism for switching the sensors to couple to either the touch sensing circuit or the hover sensing circuit. The device can switch modes based on a condition of the device, such as an expiration of a timer or a relative distance of an object from the panel. This switching can advantageously provide improved touch and hover sensing.
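By way of illustration only, a minimal sketch of such a switching arrangement is given below. This is not the patented implementation; the class, method, and circuit names are hypothetical, and the circuit objects are assumed to expose a measure() method.

```python
from enum import Enum, auto


class Mode(Enum):
    TOUCH = auto()
    HOVER = auto()


class TouchHoverController:
    """Illustrative controller that couples a bank of sensors to either a
    touch sensing circuit or a hover sensing circuit, one mode at a time."""

    def __init__(self, sensors, touch_circuit, hover_circuit):
        self.sensors = sensors
        self.touch_circuit = touch_circuit
        self.hover_circuit = hover_circuit
        self.mode = Mode.HOVER  # assumed initial mode for this sketch

    def switch_to(self, mode):
        # In hardware this could actuate a switch or send enable/disable
        # signals to the circuits; here it only records the coupling.
        self.mode = mode

    def measure(self):
        # Route the measurement through whichever circuit is currently coupled.
        circuit = self.touch_circuit if self.mode is Mode.TOUCH else self.hover_circuit
        return circuit.measure(self.sensors)
```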
In the following description of various embodiments, reference is made to the accompanying drawings which form a part hereof, and in which it is shown by way of illustration specific embodiments which can be practiced. It is to be understood that other embodiments can be used and structural changes can be made without departing from the scope of the various embodiments.
This relates to improved touch and hover sensing. Various aspects of touch and hover sensing can be addressed to improve detection of touch and hover events. In some embodiments, a touch and hover sensing device can ensure that a desired hover event is not masked by an incidental touch event, e.g., a hand holding the device, by compensating for the touch event in the sensing signal that represents both events. Conversely, in some embodiments, when both the hover and touch events are desired, the device can make adjustments to its sensors and/or the sensing signal to ensure that both events are detected. In some embodiments, the device can improve the accuracy of its determination of the device user interface location to which a hovering object is pointing by profiling the object shape. In some embodiments, the device can differentiate between object distance and area (or size) so as to properly process the corresponding sensing signal and subsequently perform the intended actions. In some embodiments, the device can improve detection of concurrent hover events. In some embodiments, the device can compensate for signal drift in the sensing signal by adjusting the baseline capacitance of the device. In some embodiments, the device can compensate for resistance from the touch and hover sensors by making adjustments to the sensors and/or the voltage patterns driving the device. In some embodiments, the device can compensate the sensing signal for sensitivity variations of the sensors (generally at issue during a hover event), by applying a gain factor as a function of the location of the hover event to the sensing signal. In some embodiments, the device can improve sensor switching between a touch mode and a hover mode by compensating for parasitic capacitance introduced by the switching components in the sensing signal. In some embodiments, the device can improve integration of a display with the sensors by reducing interference from the display at the sensors.
These and other approaches can advantageously provide improved touch and hover sensing.
The touch and hover sensing device 100 can operate based on self capacitance and/or mutual capacitance. In self capacitance, the self capacitance of the sensor panel 111 can be measured relative to some reference, e.g., ground. An object placed in the electric field near the sensor panel 111 can cause a change in the self capacitance of the sensor panel, which can be measured by various techniques. For example, the touch and hover control system 107 can drive each horizontal line 101 and vertical line 102 to create an electric field extending from the sensor panel 111. The touch and hover control system 107 can then measure the self capacitance on each horizontal line 101 and vertical line 102, where the strongest measurements, e.g., the greatest changes in self capacitance, on the horizontal lines and the vertical lines can indicate the (x,y) location of a hover event or a touch event.
In mutual capacitance, the mutual capacitance of the sensor panel 111 can be formed between the crossing or proximate horizontal and vertical lines 101, 102. An object placed in the electric field near the sensor panel 111 can cause a change in the mutual capacitance of the sensor panel, which can be measured by various techniques. For example, the touch and hover control system 107 can drive each horizontal line 101 to create an electric field extending from the sensor panel 111. The touch and hover control system 107 can then measure the change in mutual capacitance on the vertical lines 102, where the strongest measurements, e.g., the greatest changes in mutual capacitance, at the crossings or proximate locations of the horizontal and vertical lines can indicate the (x,y) location of a hover event or a touch event.
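For example, in the self capacitance scan just described, one simple (assumed) way to turn the per-line measurements into an (x,y) estimate is a weighted centroid of the capacitance changes on the horizontal and vertical lines. The sketch below assumes the per-line changes are already baseline-subtracted and is illustrative only.

```python
def estimate_location(row_deltas, col_deltas):
    """Estimate the (x, y) index of a touch or hover from per-line
    capacitance changes (one value per horizontal line and per vertical
    line). Uses a weighted centroid; assumes deltas are baseline-subtracted
    and non-negative."""
    def centroid(deltas):
        total = sum(deltas)
        if total == 0:
            return None  # nothing detected on this axis
        return sum(i * d for i, d in enumerate(deltas)) / total

    y = centroid(row_deltas)   # horizontal lines are stacked along y
    x = centroid(col_deltas)   # vertical lines are stacked along x
    return (x, y)


# Example: the strongest changes are near column index 2 and row index 1.
print(estimate_location([0.1, 0.9, 0.3], [0.0, 0.4, 1.0, 0.2]))
```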
In some embodiments, the touch and hover sensing device 100 can use self capacitance to detect a hover event and mutual capacitance to detect a touch event. In other embodiments, the device 100 can use self capacitance to detect all events. In still other embodiments, the device can use mutual capacitance to detect all events. Various capacitance configurations can be used according to the needs of the device.
As described herein, in some embodiments, a capacitance measurement can indicate an absolute capacitance and, in some embodiments, a capacitance measurement can indicate a change in capacitance.
Below are various aspects of touch and hover sensing that can be addressed to provide improved detection of touch and hover events according to various embodiments.
Touch Signal Compensation
An object touching a sensing device can generally produce a stronger signal than an object hovering over the device, such that the touch signal can mask or otherwise reduce detectability of the hover signal when they occur at the same time. This can be particularly problematic when the touch signal is merely incidental and the hover signal is of interest.
Although the touch and hover sensing device of
In addition or alternative to partitioning the device as in
In the example of
Although the examples of
Object Shape Profiling
An object hovering over a sensing device can point to an area on the device's UI display to cause an action. In some instances, there can be difficulty in determining specifically where the object is pointing so as to cause the intended action.
In some embodiments, when the touch and hover sensing device is held in a substantially upright pose, an accelerometer or a similar detector in the device can be used to determine where to shift the pointer location (1040). For example, the accelerometer can detect the direction of gravity with respect to the device. Assuming that the object pointing to the device is right-side up and not upside down, the pointer location can be shifted from the centroid location to another sensor location in the direction opposite gravity. The shifted pointer can then be used to estimate the pointed-to area on the UI display (1050).
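A hedged sketch of the accelerometer-based shift in (1040): project the gravity vector reported by the accelerometer onto the panel plane and move the pointer from the signal centroid by some amount in the opposite direction. The shift distance and axis conventions here are assumptions for illustration, not values from this description.

```python
import math


def shift_pointer(centroid_xy, gravity_xyz, shift_distance=1.0):
    """Shift a pointer location from the hover-signal centroid in the
    direction opposite gravity, projected onto the panel (x, y) plane.
    shift_distance is an illustrative tuning parameter (sensor units)."""
    gx, gy, _gz = gravity_xyz
    norm = math.hypot(gx, gy)
    if norm == 0:
        return centroid_xy  # gravity is normal to the panel; no in-plane cue
    # Unit vector pointing opposite the in-plane component of gravity.
    ux, uy = -gx / norm, -gy / norm
    cx, cy = centroid_xy
    return (cx + shift_distance * ux, cy + shift_distance * uy)


# Example: device upright with gravity along -y of the panel => shift toward +y.
print(shift_pointer((5.0, 5.0), (0.0, -9.8, 0.0)))
```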
In some embodiments, the hovering signal in the determined object area can be curve-fitted to determine the signal maximum, indicative of the portion of the object closest to the touch and hover sensing device and correspondingly where the object is pointing. Accordingly, the pointer location can be shifted from the centroid location to the sensor location corresponding to the signal maximum (1140). The shifted pointer can then be used to estimate the pointed-to area on the UI display (1150).
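One possible (assumed) realization of the curve fitting in (1140) is a parabolic fit around the strongest sample of the hover profile along an axis, whose vertex gives a sub-sensor estimate of the signal maximum:

```python
def parabolic_peak(profile):
    """Estimate the sub-sensor location of the signal maximum along one axis
    by fitting a parabola through the strongest sample and its two neighbors.
    Returns a fractional sensor index. Illustrative only."""
    i = max(range(len(profile)), key=profile.__getitem__)
    if i == 0 or i == len(profile) - 1:
        return float(i)  # peak at the edge; no neighbors to fit
    y0, y1, y2 = profile[i - 1], profile[i], profile[i + 1]
    denom = y0 - 2 * y1 + y2
    if denom == 0:
        return float(i)  # flat top; keep the sample index
    # Vertex of the parabola through (i-1, y0), (i, y1), (i+1, y2).
    return i + 0.5 * (y0 - y2) / denom


# Example: the true peak lies between sensor indices 1 and 2.
print(parabolic_peak([0.2, 0.8, 0.9, 0.3]))
```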
It is to be understood that other methods are also possible to profile an object shape according to the needs of the device.
Distance and Area Differentiation
A smaller object close to a sensing device and a larger object farther away from the device can generate similar sensing signals, making it difficult to differentiate between them and to determine their areas and/or distances from the device. This can adversely affect subsequent device actions based on the signals.
In addition or alternative to using capture devices and detectors, the sensing signals can be used to differentiate between object distances and areas, as described in
It is to be understood that other methods can be used to determine object area and distance based on object sensing signals according to various embodiments.
Concurrent Touch and Hover
As described previously, an object touching a sensing device can generally produce a stronger signal than an object hovering over the device, such that the touch signal can mask or otherwise reduce detectability of the hover signal when they occur at the same time. Unlike the instance of
An example application of concurrent touch and hover sensing can include using a thumb touch to select a button that changes the device to a particular operating mode while using a finger hover to select an action to perform during that operating mode.
Although the touch sensing device of
It is to be understood that other methods for detecting concurrent hover and touch events can also be used according to various embodiments.
Multi-Hover Detection
Detecting multiple hovering objects in a sensing device can be desirable for device actions that need multiple inputs.
In the example of
In some embodiments, the touch and hover sensing device can be partitioned into quadrants (or other partitions) such that multi-hover can be realized in each quadrant, thereby increasing the number of hovering objects that can be detected.
Signal Drift Compensation
When a sensing device experiences environmental changes, e.g., changes in ambient temperature, humidity, or pressure; operating changes, e.g., component start-up, shutdown, prolonged operation, or noise; or mechanical changes, e.g., component shifts, expansion, or contraction, the baseline capacitance of the device can change over time. Baseline capacitance refers to the capacitance of the device when there is no touch or hover at the device. As a result, capacitance measurements indicative of a touch or a hover at the device can similarly change. This is known as signal drift. Signal drift can adversely affect device action, particularly when the action is responsive to a particular capacitance measurement value or a particular range of capacitance values. To compensate for signal drift, the baseline capacitance can be reset periodically to take into account any environmental, operating, mechanical, and other changes. The new baseline can then be applied to a touch or hover capacitance measurement to correct the measurement.
In some embodiments, a touch and hover sensing device may not have either a cover or a docking station. In such embodiments, when a touch or hover is not detected, capacitance measurements can be taken and the device can be calibrated with the measurements as the new baseline. This can be done when the device is idle for an extended period, when the device is in use but between touch or hover detections, or in some otherwise non-touch or non-hover circumstance.
Alternatively, rather than waiting until there is no touch or hover at the device, a new baseline capacitance can be set during a touch or hover.
If there is no touch or hover at the device, a determination can be made whether the device is substantially stationary (2715). Typically, it can be more desirable to reset the baseline capacitance while the device is substantially stationary, to avoid the capacitance measurements being adversely affected by device motion. The device's motion can be determined using any suitable motion detector or detection algorithm according to the needs of the device. If the device is moving, resetting the baseline capacitance can be suspended until conditions are more favorable. If the device is not moving, capacitance measurements can be taken to compensate for signal drift (2720). A determination can be made whether the capacitance measurements indicate some unacceptable condition, e.g., the measurements are either negative or drifting in a negative direction (2725). If so, resetting the baseline capacitance can be suspended until conditions are more favorable. Otherwise, if the capacitance measurements are acceptable, the measurements can be set as the new baseline capacitance for the device, thereby compensating for the signal drift (2730).
If there is a touch or hover at the device, a determination can be made whether the touching or hovering object is substantially stationary (2750). If not, the device is likely in operation or the object is shaking such that resetting the baseline can be suspended until conditions are more favorable. If the touching or hovering object is substantially stationary, the object is likely touching or hovering to reset the baseline capacitance (as in
In some embodiments, the user can manually input a new baseline capacitance to compensate for the signal drift.
After the baseline capacitance has been reset, a capacitance can be measured indicative of either a touch or hover at the device (2780). The new baseline capacitance, compensated for signal drift, can be subtracted from the capacitance measurement to determine the capacitance change as a result of the touch or hover (2785).
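A rough sketch of this reset-and-subtract flow follows; the stationarity flag and the negative-drift test stand in for whatever motion detector and acceptance criteria a given device uses, and all names are illustrative.

```python
def maybe_reset_baseline(baseline, measurements, device_stationary):
    """Return an updated baseline if conditions allow a reset, else the old
    one. `measurements` are per-sensor capacitances taken with no touch or
    hover present; the acceptance test here is a simplified stand-in."""
    if not device_stationary:
        return baseline  # suspend the reset until the device stops moving
    drift = [m - b for m, b in zip(measurements, baseline)]
    if any(d < 0 for d in drift):
        return baseline  # negative drift looks unacceptable; keep old baseline
    return list(measurements)  # adopt the new measurements as the baseline


def compensated_measurement(raw, baseline):
    """Subtract the (drift-compensated) baseline from a raw capacitance
    measurement to obtain the change caused by a touch or hover."""
    return [r - b for r, b in zip(raw, baseline)]


# Example: baseline drifts upward slightly; a later touch raises one sensor.
baseline = maybe_reset_baseline([10.0, 10.0], [10.2, 10.1], device_stationary=True)
print(compensated_measurement([10.2, 12.6], baseline))
```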
It is to be understood that other methods can also be used for resetting the baseline capacitance to compensate for signal drift according to the needs of the device.
Sensor Resistance Compensation
As described previously, due to resistance from the conductive material of the touch and hover sensing device's sensor lines, the ability of a drive voltage to travel along a sensor line can be influenced by the drive voltage's frequency, with higher frequencies attenuating more along the line than lower frequencies. As a result, at higher frequency drive voltages, the sensors at the start of the sensor lines can see stronger drive voltages than sensors at the end of the sensor lines, thereby generating stronger electric fields and subsequent touch and hover signals. At lower frequency drive voltages, the sensors all along the sensor lines can be driven similarly, thereby generating acceptable electric fields and subsequent touch and hover signals everywhere. While higher frequency drive voltages are desirable, larger touch and hover sensing devices can have difficulty driving all the sensors along longer sensor lines. To compensate for the sensor lines' resistance, various sensor configurations can be used as described below.
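One way to see this frequency dependence is to approximate a sensor line as a lumped RC low-pass filter (a simplification assumed here for illustration, not a formula from this description), so the drive amplitude reaching the far end of the line falls off with frequency:

\[
\left|\frac{V_{\text{end}}}{V_{\text{drive}}}\right| \approx \frac{1}{\sqrt{1 + \left(2\pi f\, R_{\text{line}} C_{\text{line}}\right)^{2}}}
\]

where \(R_{\text{line}}\) and \(C_{\text{line}}\) are the total resistance and capacitance of the line and \(f\) is the drive frequency. Longer lines increase both \(R_{\text{line}}\) and \(C_{\text{line}}\), lowering the effective cutoff frequency and making high-frequency drive voltages harder to deliver to the far sensors.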
In this example, the horizontal sensor lines are ganged together. However, it is to be understood that the vertical sensor lines can be similarly ganged together according to the needs of the device.
Although the touch sensing device of
Sensitivity Variation Compensation
Touch or hover sensitivity can vary as a function of sensor location in a touch and hover sensing device. Sensor locations at the edges of the device can generally be less sensitive than sensor locations at the center of the device.
To compensate for the sensitivity variation, a gain factor as a function of the hover location can be applied to the capacitance measurement to ensure consistent hover signals at any location on the device.
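As a sketch of this gain-factor compensation (the gain values and the per-sensor map below are made-up placeholders; a real map would come from calibration of a specific panel):

```python
def apply_location_gain(measurements, gain_map):
    """Compensate per-sensor hover measurements for sensitivity variation by
    applying a gain factor that depends on sensor location.
    gain_map[i][j] is the calibration gain for the sensor at row i, col j."""
    return [
        [m * g for m, g in zip(meas_row, gain_row)]
        for meas_row, gain_row in zip(measurements, gain_map)
    ]


# Example 3x3 panel: edge sensors get more gain than the center sensor.
gain_map = [
    [1.3, 1.1, 1.3],
    [1.1, 1.0, 1.1],
    [1.3, 1.1, 1.3],
]
raw = [[0.5, 0.8, 0.5], [0.8, 1.0, 0.8], [0.5, 0.8, 0.5]]
print(apply_location_gain(raw, gain_map))
```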
It is to be understood that other methods are also available to compensate for sensitivity variations according to the needs of the device.
Touch and Hover Switching
As described previously, sensors formed from sensor lines of a touch and hover sensing device can sense both a touching object and a hovering object. In some embodiments, to sense a touching object, the sensors can be configured based on mutual capacitance. In some embodiments, to sense a hovering object, the sensors can be configured based on self capacitance. Switching a sensor between a touch mode and a hover mode can be accomplished through software, firmware, or hardware.
The switch 3539 can have a substantial capacitance that can interfere with a touch signal or a hover signal from the sensor 3512. The interference can be more adverse in the hover signal, where the hover sensing circuit can measure absolute capacitance. In contrast, the interference can be less adverse, and in some cases advantageous, in the touch signal, where the touch sensing circuit can measure differential capacitance (or changes in capacitance). For example, in some embodiments, the switch capacitance can be about 20 pF, which can be on the order of the dynamic range of the signal. The switch capacitance can be offset with device components as illustrated in
As an alternative to a switch for switching between touch and hover modes, logic can be used to switch between the modes.
There can be parasitic capacitance on the lines connecting the touch sensing circuit 3816 and the hover sensing circuit 3818 together, which can interfere with the touch signal and the hover signal from the sensor 3812. As described previously, the interference can be more adverse in the hover signal than in the touch signal. In one embodiment, to reduce the effects of the parasitic capacitance on the hover signal, characteristics of the touch sensing circuit and the hover sensing circuit can be adjusted so as to provide a high impedance state through a resistor at the touch sensing circuit. This can force the voltage path from the sensor to stay at the hover sensing circuit and can impede the parasitic capacitance at the touch sensing circuit from interfering. Other solutions are also available for reducing parasitic capacitance.
A determination can be made whether an object is far away from the device based on the hover measurement (3920). To perform the determination, the hover measurement can be compared to a threshold hover measurement. If the hover measurement is at or lower than the hover threshold, the object can be determined to be far away. If the hover measurement is higher than the hover threshold, the object can be determined to be close.
If the object is determined to be far away, the device can continue in hover mode, repeating a hover capacitance measurement (3910) and determining whether the object is still far away (3920) until the hover measurement exceeds the hover threshold, indicating that the object is touching or almost touching the device.
If however the object is determined to be close, the controller can switch to touch mode so that a touch sensing circuit of the device can measure a touch capacitance at the sensors (3930). In some embodiments, the controller can send a control signal to actuate a switch to couple with the touch sensing circuit. In some embodiments, the controller can send an enable signal to the touch sensing circuit and a disable signal to the hover sensing circuit.
A determination can be made whether an object is touching the device based on the touch measurement (3940). To perform the determination, the touch measurement can be compared to a threshold touch measurement. If the touch measurement is at or higher than the touch threshold, the object can be determined to be touching the device. If the touch measurement is lower than the touch threshold, the object can be determined not to be touching the device.
If the object is determined to be touching the device, the device can continue in touch mode, repeating a touch capacitance measurement (3930) and determining whether the object is still touching the device (3940) until the touch measurement falls below the touch threshold, indicating there is no longer a touch on the device.
If however the object is determined not to be touching the device, the measurement can be considered ambiguous since it is between the threshold hover measurement and the threshold touch measurement. As such, the device can switch between the two modes until such time that the measurement satisfies either of the thresholds. Accordingly, the controller can switch back to hover mode and the method can repeat (3910) through (3940).
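The threshold-driven switching of (3910) through (3940) could be sketched as the loop below; the measurement callables, the switch hook, and the threshold values are hypothetical placeholders rather than the actual circuits of this description.

```python
def run_threshold_switching(measure_hover, measure_touch, switch_to,
                            hover_threshold, touch_threshold, iterations=100):
    """Alternate between hover and touch modes based on capacitance thresholds.
    measure_hover/measure_touch return a scalar capacitance measurement;
    switch_to('hover' or 'touch') couples the sensors to the matching circuit.
    All names and thresholds here are illustrative placeholders."""
    mode = "hover"
    switch_to(mode)
    for _ in range(iterations):
        if mode == "hover":
            c = measure_hover()                  # (3910)
            if c > hover_threshold:              # (3920): object is close
                mode = "touch"
                switch_to(mode)                  # next pass measures touch (3930)
        else:
            c = measure_touch()                  # (3930)
            if c < touch_threshold:              # (3940): no touch, ambiguous region
                mode = "hover"
                switch_to(mode)                  # fall back to hover mode and repeat
```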
It is to be understood that the method of
If the timer has expired, the controller can reset the timer (4030). The controller can switch to another mode (4040). If the device was in the touch mode, the device can switch to the hover mode. In some embodiments, to perform the switching, the controller can send a control signal to actuate a switch to decouple from a touch sensing circuit and to couple to a hover sensing circuit. In other embodiments, to perform the switching, the controller can send an enable signal to the hover sensing circuit and a disable signal to the touch sensing circuit. If, however, the device was in the hover mode, the device can switch to the touch mode. In some embodiments, to perform the switching, the controller can send a control signal to actuate a switch to decouple from the hover sensing circuit and to couple to the touch sensing circuit. In other embodiments, to perform the switching, the controller can send an enable signal to the touch sensing circuit and a disable signal to the hover sensing circuit.
After switching to another mode, the controller can repeat the method (4010) through (4040), checking for expiration of the timer and, upon expiration, resetting the timer and switching to a different mode.
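Similarly, the timer-driven switching of (4010) through (4040) can be sketched as a loop that toggles modes whenever the timer expires; the interval and the switch_to hook are assumptions for illustration.

```python
import time


def run_timer_switching(switch_to, interval_s=0.01, duration_s=1.0):
    """Toggle between touch and hover modes each time the timer expires.
    switch_to('touch' or 'hover') is an assumed hook to the switching hardware;
    the interval and run duration are illustrative values."""
    mode = "touch"
    switch_to(mode)
    deadline = time.monotonic() + interval_s
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        if time.monotonic() >= deadline:                 # (4010)/(4020): timer expired?
            deadline = time.monotonic() + interval_s     # (4030): reset the timer
            mode = "hover" if mode == "touch" else "touch"
            switch_to(mode)                              # (4040): switch to the other mode
```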
It is to be understood that the method of
In touch and hover switching, some or all of the sensors can be switched between the two modes. For example, in some embodiments, a portion of the sensors can be switched to hover mode to couple to the hover sensing circuit and a portion of the sensors can be switched to touch mode to couple to the touch sensing circuit. This can implement panel partitioning as described previously. In other embodiments, all of the sensors can be switched to the hover mode to couple to the hover sensing circuit or to the touch mode to couple to the touch sensing circuit.
Display Integration
In some embodiments, a touch and hover sensing device can integrate a display device with the touch and hover sensing panel, where the display can provide a graphical user interface with various graphics selectable via a touch or hover signal from the panel to cause the device to perform actions associated with the selected graphics. Because of its proximity to the panel, the display can interfere with the touch or hover signals generated by the panel to introduce noise, decrease touch or hover sensitivity, or otherwise adversely affect the signals. This can then cause unintended device action.
Exemplary Touch and Hover Sensing Devices
The touch and hover control system 4306 can also include charge pump 4315, which can be used to generate the supply voltage for the transmit section 4314. The stimulation signals 4316 can have amplitudes higher than the maximum voltage by cascading two charge store devices, e.g., capacitors, together to form the charge pump 4315. Therefore, the stimulus voltage can be higher (e.g., 43V) than the voltage level a single capacitor can handle (e.g., 3.6 V). Although
Touch and hover sensor panel 4324 can include a capacitive sensing medium having sensors for detecting a touch event or a hover event at the panel. The sensors can be formed from a transparent conductive medium such as indium tin oxide (ITO) or antimony tin oxide (ATO), although other transparent and non-transparent materials such as copper can also be used. Each sensor can represent a capacitive sensing node and can be viewed as a picture element (pixel) 4326, which can be particularly useful when the sensor panel 4324 is viewed as capturing an “image” of touch or hover. (In other words, after the touch and hover control system 4306 has determined whether a touch event or a hover event has been detected at each sensor in the sensor panel, the pattern of sensors in the panel at which a touch event or a hover event occurred can be viewed as an “image” of touch or hover (e.g., a pattern of an object touching or hovering over the panel).)
Computing system 4300 can also include host processor 4328 for receiving outputs from the processor subsystems 4302 and performing actions based on the outputs that can include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device coupled to the host device, answering a telephone call, placing a telephone call, terminating a telephone call, changing the volume or audio settings, storing information related to telephone communications such as addresses, frequently dialed numbers, received calls, missed calls, logging onto a computer or a computer network, permitting authorized individuals access to restricted areas of the computer or computer network, loading a user profile associated with a user's preferred arrangement of the computer desktop, permitting access to web content, launching a particular program, encrypting or decoding a message, and/or the like. The host processor 4328 can also perform additional functions that may not be related to panel processing, and can be coupled to program storage 4332 and display device 4330 such as an LCD display for providing a UI to a user of the device. In some embodiments, the host processor 4328 can be a separate component from the touch and hover control system 4306, as shown. In other embodiments, the host processor 4328 can be included as part of the touch and hover control system 4306. In still other embodiments, the functions of the host processor 4328 can be performed by the processor subsystem 4302 and/or distributed among other components of the touch and hover control system 4306. The display device 4330 together with the touch and hover sensor panel 4324, when located partially or entirely under the sensor panel or when integrated with the sensor panel, can form a touch sensitive device such as a touch screen.
Note that one or more of the functions described above can be performed, for example, by firmware stored in memory (e.g., one of the peripherals) and executed by the processor subsystem 4302, or stored in the program storage 4332 and executed by the host processor 4328. The firmware can also be stored and/or transported within any computer readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “computer readable storage medium” can be any medium that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device. The computer readable storage medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, a portable computer diskette (magnetic), a random access memory (RAM) (magnetic), a read-only memory (ROM) (magnetic), an erasable programmable read-only memory (EPROM) (magnetic), a portable optical disc such as a CD, CD-R, CD-RW, DVD, DVD-R, or DVD-RW, or flash memory such as compact flash cards, secured digital cards, USB memory devices, memory sticks, and the like.
The firmware can also be propagated within any transport medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a “transport medium” can be any medium that can communicate, propagate or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The transport medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic or infrared wired or wireless propagation medium.
The mobile telephone, media player, and personal computer of
In the examples above, a capacitance measurement can be a measure of a capacitance at a particular time, i.e., an absolute capacitance, or a measure of a capacitance difference over a particular time period, i.e., a change in capacitance. Accordingly, in some embodiments, touch events or hover events can be detected by a measurement of absolute capacitance at sensing lines of a touch and hover sensing device. In other embodiments, touch events or hover events can be detected by a measurement of a change in capacitance at sensing lines of a touch and hover sensing device. In still other embodiments, touch events or hover events can be detected by a combination of measurements of absolute capacitance and a change in capacitance at sensing lines of a touch and hover sensing device. The particular measurement can be determined according to the particular function, e.g., signal compensation, signal detection, etc., being performed by the device.
Although embodiments have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of the various embodiments as defined by the appended claims.
This application is a continuation of U.S. patent application Ser. No. 12/895,643, filed Sep. 30, 2010, now U.S. Publication No. 2012-0050180 and published on Mar. 1, 2012, which claims the benefit of U.S. Provisional Application No. 61/377,829, filed Aug. 27, 2010, the contents of which are incorporated by reference herein in their entirety for all purposes.