System and method for detecting grip of a touch enabled device

Information

  • Patent Grant
  • Patent Number
    10,296,146
  • Date Filed
    Tuesday, December 22, 2015
  • Date Issued
    Tuesday, May 21, 2019
Abstract
A device includes a display, a controller configured to control the display, a sensor integrated with the display, and a circuit in communication with the sensor. The sensor is configured to sense touch input, and the circuit is configured to detect when a user is gripping the device based only on output from the sensor and a pre-defined model. Gripping is reported to the controller.
Description
BACKGROUND

Capacitive sensors are used for touch detection in many Human Interface Devices (HID) such as laptops, trackpads, MP3 players, computer monitors, and smart-phones. The capacitive sensor senses positioning and proximity of a conductive object, such as a conductive stylus or finger, used to interact with the HID. The capacitive sensor is often integrated with an electronic display to form a touch-screen. Capacitive sensors include antennas or lines constructed from different media, such as copper, Indium Tin Oxide (ITO) and printed ink. ITO is typically used to achieve transparency. Some capacitive sensors are grid based and operate to detect either mutual capacitance between the electrodes at different junctions in the grid or self-capacitance at lines of the grid.


SUMMARY

According to some embodiments of the present disclosure there is provided a system and method to identify when a user is gripping a touch-enabled computing device. The system and method described herein specifically relate to a touch-enabled computing device that includes a capacitive based digitizer sensor. Touch of a palm or thumb along an edge of the touch-screen, together with touch on a chassis of the device, indicates that a user is gripping the touch-enabled device, e.g. the touch-screen. Both touch along the edge of the touch-screen and touch on the chassis are determined from output sampled from the digitizer sensor. A controller reports a gripping state, e.g. grip or no grip, to one or more applications running on the computing device, to a central processing unit (CPU) or to a graphical processing unit (GPU) of the computing device. An application may use the report as input for running the application. Optionally, a GPU adjusts positioning of objects displayed on the screen to avoid displaying them in areas covered by the part of the hand holding the screen.


Unless otherwise defined, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of embodiments of the disclosure, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.





BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS

Some embodiments of the disclosure are herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the disclosure. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the disclosure may be practiced.


In the drawings:



FIG. 1 is a simplified schematic drawing of a user holding a touch-enabled computing device;



FIG. 2 is a simplified block diagram of an exemplary touch enabled computing device in accordance with some embodiments of the present disclosure;



FIG. 3 is a schematic representation of the relative effect on a grid based capacitive sensor with one finger touching the digitizer sensor in accordance with some embodiments of the present disclosure;



FIG. 4 is a schematic representation of the relative effect on a grid based capacitive sensor when two fingers are touching the digitizer sensor;



FIG. 5 is a simplified block diagram describing capacitance between a user and a touch enabled computing system in accordance with some embodiments of the present disclosure;



FIG. 6 is a simplified flow chart of an exemplary method for detecting capacitance between a user and a touch enabled device in accordance with some exemplary embodiments of the present disclosure;



FIG. 7 is a simplified schematic representation of a grounding state machine in accordance with some embodiments of the present disclosure;



FIGS. 8A, 8B and 8C are simplified drawings illustrating exemplary defined grip areas on a touch screen in accordance with some embodiments of the present disclosure; and



FIG. 9 is a simplified flow chart of an exemplary method for detecting grip of a touch-enabled computing device in accordance with some exemplary embodiments of the present disclosure.





DETAILED DESCRIPTION


FIG. 1 shows a hand 49 gripping a touch-enabled computing device 100. Computing device 100 typically includes a grid based capacitive digitizer sensor 50 integrated on an electronic display 45. Display 45 may include a frame 55 of black print to hide circuitry associated with digitizer sensor 50. When a user grips computing device 100, a thumb 48 is positioned on an edge of display 45 including digitizer sensor 50 and one or more fingers 44 are positioned on chassis 65. Chassis 65 is a back face or edge around display 45 and typically includes conductive material. Alternatively, a palm of hand 49 is positioned on an edge of display 45 and a thumb 48 is on chassis 65. A user may also grip computing device 100 with two hands. When two hands are used, a thumb 48 or palm may be positioned on two facing edges of display 45 and both hands hold chassis 65.


According to some embodiments of the present disclosure, touch on chassis 65 is detected based on a detected capacitance between a user and computing device 100, and touch on an edge of display 45 is detected based on identifying thumb 48 or palm input along an edge of digitizer sensor 50. Both the capacitance and the thumb 48 or palm input are detected based on output from digitizer sensor 50.


According to some embodiments of the present disclosure, a physical model is defined that relates output from digitizer sensor 50 to capacitance (CBD) between the user and the computing device 100. CBD may be monitored during user interaction with computing device 100. A threshold on running averages of detected CBD may be used to toggle between defining computing device 100 as being touched or untouched on chassis 65. Optionally, CBD below 250 pF indicates that chassis 65 is untouched and CBD above 250 pF indicates that chassis 65 is touched.


Capacitance between the user and the computing device (CBD) may be highly dynamic. High capacitance may be detected when the computing device is well grounded. For example, capacitance above 500 pF may be detected while a user holds a chassis of the computing device, capacitance above 50 pF may be detected while the computing device is connected to an external power source with a two-prong plug, and capacitance above 100 pF may be detected while the computing device is connected to an external power source with a three-prong plug. Low capacitance, e.g. below 30 pF, may be detected while the computing device is ungrounded. For example, low capacitance may be detected while a computing device is resting on a pillow and disconnected from an external power source.


According to some embodiments of the present disclosure, a gripping state is reported responsive to detecting CBD greater than a defined threshold and thumb 48 or palm input on an edge of digitizer sensor 50.


Reference is now made to FIG. 2, a simplified block diagram of an exemplary touch enabled computing device in accordance with some embodiments of the present disclosure. According to some embodiments of the present disclosure, a computing device 100 includes display 45 integrated with digitizer sensor 50. Digitizer sensor 50 typically includes a matrix formed with parallel conductive strips 58 arranged in rows and columns, with a capacitive connection in junction areas 59 formed between rows and columns.


Digitizer circuitry 25 applies mutual capacitive detection or self-capacitive detection for sensing a touch signal from touch of fingertip 46. Bringing a grounded finger 46 (or other part of the hand) close to the surface of digitizer sensor 50 changes the local electrostatic field and reduces the mutual capacitance at junctions 59 in the touched area. A change in mutual capacitance may be detected by digitizer circuitry 25 by applying a drive signal along one axis (the drive lines) of the matrix while sampling output on the other axis (the receive lines) to detect a coupled signal. Finger touch has the effect of reducing the amplitude of the measured signal.
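
As a rough illustration of this scan sequence, the following sketch (Python; the drive_fn and sample_fn hooks are hypothetical stand-ins for drive and sense hardware, which is not specified here) drives each line of one axis in turn, samples the other axis, and assembles the amplitudes into a heatmap:

    # Minimal sketch of a mutual-capacitance scan, assuming hypothetical
    # hardware hooks drive_fn(i) (impose the drive signal on drive line i)
    # and sample_fn() (return amplitudes coupled onto all receive lines).
    import numpy as np

    def scan_heatmap(drive_fn, sample_fn, n_drive, n_receive):
        heatmap = np.zeros((n_drive, n_receive))
        for i in range(n_drive):
            drive_fn(i)                  # drive one row line of the grid
            heatmap[i, :] = sample_fn()  # sample the coupled signal on all receive lines
        return heatmap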


Output from digitizer sensor 50 may be in the form of a heatmap that maps detected amplitudes of the coupled signals at each junction. In a heatmap, finger touch produces a negative blob at the finger touch location. The change in amplitude of the measured signal due to the finger touch, i.e. the relative effect, depends on the ground state of the system. While the system is floating, the relative effect is typically around 15%, e.g. the measured amplitude is 0.85 of the amplitude measured with no touch. While the system is connected to an external power supply, the relative effect may reach 20%, e.g. the measured amplitude is 0.80 of the amplitude measured with no touch. The largest relative effect is typically detected while the chassis is touched and may reach 25%, e.g. the measured amplitude is 0.75 of the amplitude measured with no touch.
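
The relative effect can be illustrated with a small worked example (the array values below are made up; the 0.85 figure corresponds to the 15% effect mentioned above):

    # Illustrative computation of the relative effect: the fractional drop of the
    # measured amplitude relative to the no-touch baseline, so a 15% effect means
    # the measured amplitude is 0.85 of the baseline.
    import numpy as np

    def relative_effect(baseline, measured):
        return (baseline - measured) / baseline

    baseline = np.full((4, 4), 100.0)   # baseline amplitudes with no touch
    measured = baseline.copy()
    measured[1, 1] = 85.0               # a touched junction, ~15% amplitude drop
    print(relative_effect(baseline, measured)[1, 1])   # ~0.15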


According to some embodiments of the present disclosure, digitizer circuitry 25 includes a dedicated engine 252 for estimating or determining CBD from the heatmap and a state machine 253 for defining touch or no touch on a chassis of the computing device based on the CBD. In some exemplary embodiments, CBD detection engine 252 is associated with memory that stores one or more look-up tables for relating detected touch signals to a level of capacitance between a user with fingertip 46 and device 100. Typically, state machine 253 is also associated with memory for storing parameters, e.g. thresholds and accumulated averages of CBD.


Typically, output from digitizer circuitry 25 is reported to host 22. The output provided by digitizer circuitry 25 may include coordinates of one or more fingertips 46. Optionally, a gripping state, e.g. grip or no grip, is reported, or CBD is reported. Optionally, palm or thumb touch along an edge of digitizer sensor 50 is reported.


Typically, digitizer circuitry 25 uses both analog and digital processing to process signals detected with digitizer sensor 50. Optionally, some and/or all of the functionalities of CBD detection engine 252 and state machine 253 are integrated in one or more processing units adapted for controlling operation of digitizer sensor 50. Optionally, some and/or all of the functionalities of digitizer circuitry 25, CBD detection engine 252 and state machine 253 are integrated and/or included in host 22.


Reference is now made to FIG. 3 showing a schematic representation of the relative effect on a grid based capacitive sensor with one finger touching the digitizer sensor. The relative effect as defined herein is the difference between the baseline amplitude detected with no touch input and the amplitude detected at a touched location. Only a portion of digitizer sensor 50 is shown for simplicity.


A presence of a finger at location 320 reduces mutual capacitance at junctions 59 in location 320. Due to the reduced mutual capacitance, when a drive signal 305 is imposed on drive lines 56, amplitudes detected on the touched receive lines 57 are lower than amplitudes detected on other receive lines 57. Reduced amplitudes due to the reduced mutual capacitances are represented by arrows 310. At the same time, potential may be induced on the finger from drive signal 305. This potential may be injected onto receive lines 57, which increases amplitudes of the outputs as represented by arrows 315. The output detected from the touched receive lines is therefore a summation of amplitude 310 and amplitude 315. Typically, output detected from a single finger touch produces a negative blob having an amplitude that varies based on the magnitude of the induced potential.


When more than one finger is touching the sensing surface, or when part of the hand is also touching the sensing surface, the potential induced on one finger or part of the hand spreads to other parts of the digitizer sensor touched by other fingers or other parts of the hand. FIG. 4 shows a schematic representation of the relative effect on a grid based capacitive sensor when two fingers are touching the digitizer sensor. Simultaneous finger touch at locations 320 may lead to a drive signal 305 transmitted on a drive line 56 crossing one location 320 spreading onto receive lines 57 that cross both locations 320. Potential induced on a finger that is not touching the current drive line may introduce positive blob ghosts 330. The induced potential also reduces the relative effect detected at touch locations 320, as discussed in reference to FIG. 3. The effect of the induced potential on the hand may also lead to blob deformation when a relatively large area is touched, e.g. due to palm input or when multiple touches occur in close proximity.


Reference is now made to FIG. 5 showing a simplified block diagram describing capacitance between a user and a touch enabled computing system in accordance with some embodiments of the present disclosure. Capacitance between a user and a touch enabled computing device is defined by touch capacitance (CT) between the user and the digitizer sensor due to the user touching the touch screen, body earth capacitance (CBE) between the user and earth ground, and device earth capacitance (CDE) between the computing device ground and earth ground. CBD is the series combination of CBE and CDE.


CT is typically a function of surface contact between the user and the digitizer sensor as well as physical parameters of the touch-screen. CT increases as more fingers touch the digitizer sensor or due to palm contact, and decreases with less contact. In some exemplary embodiments, CT may be estimated from the heatmap based on the following relationship:

CT=(NTotal)(CD)  Equation (1)


Where NTotal is the number of touched junctions 59 and CD is a constant that represents capacitance of the device due to physical properties of the conductive strips 58 and display 45, as well as the proximity between these elements. Optionally, Equation (1) may also include parameters that account for palm touch and for the size of each finger touch.
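
A minimal sketch of Equation (1), assuming an illustrative value for CD and a simple relative-effect threshold for deciding which junctions count as touched (neither value is given by the text):

    import numpy as np

    C_D = 0.5e-12     # assumed per-junction constant CD, in farads
    TOUCH_TH = 0.05   # assumed relative-effect threshold for a touched junction

    def estimate_ct(effect_map):
        # Equation (1): CT = NTotal * CD, with NTotal counted from the heatmap
        n_total = int(np.count_nonzero(effect_map > TOUCH_TH))
        return n_total * C_D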


CBE typically ranges between 100 pF and 300 pF. CDE can differ between devices based on their construction and the components included in the device. CDE also changes with changing conditions in the surrounding environment. As described herein above, CDE can increase significantly when the device is plugged into a power outlet, connected to another device, or when a user grounds the device by touching its chassis. The capacitance (CBD) between the device ground and the user can be defined by:

1/CBD=1/CBE+1/CDE  Equation (2)


And the total capacitance (CTotal) between the user and the device can be defined by:

CTotal=CBD+CT  Equation (3)
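
A worked example of Equations (2) and (3) with placeholder capacitances chosen to be consistent with the ranges mentioned in the text:

    C_BE = 200e-12   # body-to-earth capacitance, within the 100-300 pF range
    C_DE = 500e-12   # device-to-earth capacitance, e.g. a well grounded device
    C_T = 10e-12     # touch capacitance from Equation (1)

    C_BD = 1.0 / (1.0 / C_BE + 1.0 / C_DE)   # Equation (2): series combination
    C_TOTAL = C_BD + C_T                     # Equation (3)
    print(C_BD * 1e12, C_TOTAL * 1e12)       # ~142.9 pF and ~152.9 pF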


According to some embodiments of the present disclosure, CTotal may also be estimated based on the detected heatmap using the following relationship:

CTotal=(NH)(NV)(CE)/P  Equation (4)


Where:


CE is a second constant that can be obtained from empirical data or simulations and represents capacitive coupling at the junctions due to physical properties of the conductive strips 58 and the geometry between them. P is the peak relative effect at a location with a perceived maximum potential induced on the finger from the drive signal. For a single finger touch, P is typically the peak amplitude at the touch location. In cases where positive blob ghosts are present, P is the peak amplitude of the positive blob ghosts. P may also be defined by the following equation:

P=FEG−FED  Equation (5)


Where:


FEG is an ideal relative effect that is detected when the computing device has the same impedance to earth ground as the user, and FED is the detected relative effect; and


NH is a number of touched junctions along a row conductive line crossing the location at which P is detected and NV is a number of touched junctions along a column conductive line crossing the location of P. CBD can then be estimated based on the following relationship:

CBD=(NH)(NV)(CE)/(FEG−FED)−(NTotal)(CD)  Equation (6)


Where parameters CE, CD and FEG are pre-determined constants and parameters NH, NV, FED and NTotal are values determined from the sampled heatmap.
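
Equation (6) can be transcribed directly as a function; the constants below are placeholders rather than values given by the text:

    C_E = 1.0e-12   # assumed junction-coupling constant CE, in farads
    C_D = 0.5e-12   # assumed per-junction device constant CD, in farads
    FE_G = 0.30     # assumed ideal relative effect FEG for a well grounded device

    def estimate_cbd(n_h, n_v, fe_d, n_total):
        # Equation (6): CBD = (NH)(NV)(CE) / (FEG - FED) - (NTotal)(CD)
        return (n_h * n_v * C_E) / (FE_G - fe_d) - n_total * C_D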


Reference is now made to FIG. 6 showing a simplified flow chart of an exemplary method for detecting capacitance between a user and a touch enabled device in accordance with some exemplary embodiments of the present disclosure. Output from a grid based digitizer sensor is detected while a user is operating a touch enabled computing device (block 705). Optionally, CBD detection is applied once every few frames or on every detected frame. The detected output comprises touch signals at junctions of the digitizer sensor. A heatmap of the touch signals at each junction may be constructed. A processor or circuit 25 may be programmed to detect blobs, e.g. areas at which touch signals are identified (block 710). In some exemplary embodiments, geometry or characteristics of the blobs are examined to determine if the output from the frame is suitable for detecting CBD based on the defined model (block 720). Typically, frames that include blobs with heavy deformations or blobs with low relative effect are not used for detecting CBD.
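
The blob detection and frame screening of blocks 710 and 720 could be sketched as follows (a simple 4-connectivity grouping; both thresholds are assumptions, not values from the text):

    import numpy as np
    from collections import deque

    TOUCH_TH = 0.05   # assumed relative-effect threshold for a touched junction
    MIN_PEAK = 0.10   # assumed minimum peak effect; low-effect blobs are rejected

    def find_blobs(effect_map):
        touched = effect_map > TOUCH_TH
        visited = np.zeros_like(touched, dtype=bool)
        rows, cols = touched.shape
        blobs = []
        for r in range(rows):
            for c in range(cols):
                if touched[r, c] and not visited[r, c]:
                    queue, blob = deque([(r, c)]), []
                    visited[r, c] = True
                    while queue:
                        y, x = queue.popleft()
                        blob.append((y, x))
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ny, nx = y + dy, x + dx
                            if 0 <= ny < rows and 0 <= nx < cols and touched[ny, nx] and not visited[ny, nx]:
                                visited[ny, nx] = True
                                queue.append((ny, nx))
                    blobs.append(blob)
        return blobs

    def frame_acceptable(effect_map, blobs):
        # keep the frame only if at least one blob has a usable peak relative effect
        return any(max(effect_map[y, x] for y, x in blob) >= MIN_PEAK for blob in blobs)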


If the frame is accepted, peak amplitudes of the blobs are detected and a representative blob, on which the model for detecting CBD will be applied, is selected (block 730). Parameters NV, NH, and NTotal may be determined based on characterization of the spread of the detected touch signals (block 740). Typically, NV, NH, and NTotal are parameters that can be extracted from the heatmap. CBD may be estimated based on a pre-defined model relating CBD to FED, NV, NH, and NTotal (block 750).
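
One possible way to extract these model parameters from a relative-effect heatmap (blocks 730-740) is sketched below; the threshold is an assumption, and the resulting NH, NV, FED and NTotal would then be fed into Equation (6):

    import numpy as np

    TOUCH_TH = 0.05   # assumed relative-effect threshold for a touched junction

    def extract_model_parameters(effect_map):
        peak = np.unravel_index(np.argmax(effect_map), effect_map.shape)
        fe_d = float(effect_map[peak])           # detected relative effect FED at the peak
        touched = effect_map > TOUCH_TH
        n_h = int(touched[peak[0], :].sum())     # touched junctions along the row crossing the peak
        n_v = int(touched[:, peak[1]].sum())     # touched junctions along the column crossing the peak
        n_total = int(touched.sum())             # all touched junctions in the frame
        return n_h, n_v, fe_d, n_total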


Typically, touch on a chassis of the computing device is determined based on average CBD detected over time. Typically, an adaptive history buffer is used to update the average CBD. CBD may be detected every few frames, e.g. every 3-10 frames, during user touch interaction. Optionally, a rate at which CBD is detected depends on a current status of the computing device.
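
A minimal sketch of such a history buffer (the buffer length is an arbitrary illustrative choice):

    from collections import deque

    class CbdHistory:
        def __init__(self, max_len=30):
            self.samples = deque(maxlen=max_len)   # keep only the most recent CBD detections

        def add(self, cbd):
            self.samples.append(cbd)

        def average(self):
            return sum(self.samples) / len(self.samples) if self.samples else None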


Reference is now made to FIG. 7 showing a simplified schematic representation of a grounding state machine in accordance with some embodiments of the present disclosure. Optionally, a chassis of the computing device is assumed to be in an untouched state while no information is available. After a pre-defined number of CBD detections, the CBD is checked against a first threshold, an untouched to touched threshold (U2T_TH). If CBD is greater than the first threshold, the state changes to ‘Touched.’ U2T_TH may typically be set to a value above 250 pF, e.g. 400 pF. Average CBD values are gathered and, as long as CBD does not fall below a second threshold, a touched to untouched threshold (T2U_TH), the state of the machine is maintained. T2U_TH may typically be set to a value below 250 pF, e.g. 150 pF. Once CBD falls below the second threshold, the state of the machine is changed to ‘Untouched’. Average CBD values are gathered and, as long as CBD is below U2T_TH, the state of the machine is maintained. According to some embodiments, a state of the state machine is first defined based on an average of 4-10 CBD detections and then updated using more detections, e.g. 20-50 CBD detections.
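
A minimal sketch of this hysteresis between the ‘Untouched’ and ‘Touched’ states, using the example thresholds from the text (400 pF and 150 pF) and omitting the averaging schedule for brevity:

    U2T_TH = 400e-12   # untouched-to-touched threshold, example value from the text
    T2U_TH = 150e-12   # touched-to-untouched threshold, example value from the text

    class ChassisStateMachine:
        def __init__(self):
            self.touched = False             # chassis assumed untouched at start

        def update(self, avg_cbd):
            if not self.touched and avg_cbd > U2T_TH:
                self.touched = True          # state changes to 'Touched'
            elif self.touched and avg_cbd < T2U_TH:
                self.touched = False         # state changes to 'Untouched'
            return self.touched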


Reference is now made to FIGS. 8A, 8B and 8C showing simplified drawings illustrating exemplary defined grip areas on a touch screen in accordance with some embodiments of the present disclosure. In some exemplary embodiments, one or more areas 610 associated with gripping are pre-defined. A gripping state may be indicated while a user touches the chassis and places a thumb or a palm in one of areas 610. Thumb or palm touch outside of areas 610 may not be used to indicate gripping. In some exemplary embodiments, areas 610 are areas on a left and right edge of display 45 as shown in FIG. 8A. Optionally, areas 610 are adjusted as display 45 is rotated so that areas 610 will correspond to an area most likely to be gripped by a user (FIG. 8B). In some exemplary embodiments, area 610 is defined as a frame around digitizer sensor 50, and any thumb or palm touch around an edge of digitizer sensor 50 can be used as an indication of gripping.
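
One illustrative way to define and test areas 610 (the strip width and the landscape layout are assumptions; the text only requires that the areas follow the orientation of display 45):

    def grip_areas(width, height, edge=60, landscape=False):
        # return rectangles (x0, y0, x1, y1) along the edges most likely to be gripped
        if landscape:
            return [(0, 0, width, edge), (0, height - edge, width, height)]
        return [(0, 0, edge, height), (width - edge, 0, width, height)]

    def in_grip_area(x, y, areas):
        return any(x0 <= x < x1 and y0 <= y < y1 for x0, y0, x1, y1 in areas)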



FIG. 9 is a simplified flow chart of an exemplary method for detecting grip of a touch-enabled computing device in accordance with some exemplary embodiments of the present disclosure. Output from a digitizer sensor is detected at a defined refresh rate (block 805). From the output detected, touch input in a grip designated area, e.g. area 610, is identified (block 810). The touch input, e.g. a blob in a heatmap, is examined for features related to size and shape to identify whether the touch is from a thumb or palm (block 820). Typically, an oblong shape or a blob that is larger than a defined size is used as an indication of thumb or palm input.
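
The size and shape test of block 820 could be sketched as follows (the size and aspect-ratio thresholds are assumptions):

    MIN_PALM_JUNCTIONS = 12   # assumed minimum blob size for thumb or palm input
    MIN_ASPECT_RATIO = 2.0    # assumed elongation for an "oblong" blob

    def is_thumb_or_palm(blob):
        # blob is a list of (row, col) coordinates of touched junctions
        rows = [r for r, _ in blob]
        cols = [c for _, c in blob]
        height = max(rows) - min(rows) + 1
        width = max(cols) - min(cols) + 1
        oblong = max(height, width) / min(height, width) >= MIN_ASPECT_RATIO
        return len(blob) >= MIN_PALM_JUNCTIONS or oblong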


A state of the state machine is also checked to determine grip (block 830). Optionally, a current state of the state machine is examined only if thumb or palm input is detected. If the state machine also indicates that the chassis is being touched (block 840), gripping is reported (block 850). If the state machine indicates that the chassis is not being touched, no gripping is reported (block 845). Alternatively, a state machine is not used, and instead accumulated averages of CBD above a defined threshold provide an indication that the chassis is being touched. Typically, this process is repeated over a duration of user interaction with the computing device. Optionally, grip detection is activated by a specific command from an application running on the host. In some exemplary embodiments, the state machine is updated throughout user interaction with the computing device, and only when the state machine indicates that the chassis is being touched is thumb or palm input in grip designated areas sought to determine if the computing device is gripped.
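
Putting the steps of FIG. 9 together, a per-frame grip decision might look like the sketch below; the callables are hypothetical stand-ins for the grip-area test, the thumb/palm classifier and the chassis-touch indication described above:

    def detect_grip(blobs, in_grip_area, is_thumb_or_palm, chassis_touched):
        # report grip only when a thumb or palm blob lies in a grip-designated
        # area and the chassis (back face) is also detected as touched
        thumb_in_area = any(in_grip_area(blob) and is_thumb_or_palm(blob) for blob in blobs)
        return thumb_in_area and chassis_touched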


According to some exemplary embodiments, there is provided a device comprising: a display; a controller configured to control the display; a sensor integrated with the display, wherein the sensor is configured to sense touch input; and a circuit in communication with the sensor, the circuit configured to: detect when a user is gripping the device based only on output from the sensor and a pre-defined model; and report the gripping detected to the controller.


Optionally, the circuit is configured to: detect when the user is touching a pre-defined portion of the sensor with a thumb or palm based on the output; detect when the user is touching a chassis of the device based on the output; and detect that the user is gripping the device based on detecting that the user is both touching the pre-defined portion with the thumb or the palm and touching the chassis.


Optionally, the sensor is a capacitive based sensor having electrode junctions.


Optionally, the circuit is configured to detect capacitance (CBD) between the device ground and a user touching the sensor based on output from the sensor and the pre-defined model.


Optionally, the model is a physical model for CBD capacitive estimation based on a number of touched junctions and a relative effect, wherein the number of touched junctions and the relative effect are detected from the sampled output.


Optionally, the circuit is configured to detect that the user is touching the chassis based on the CBD detected.


Optionally, the circuit is configured to detect that the user is touching the chassis based on accumulated averages of the CBD being above a pre-defined threshold.


Optionally, the pre-defined portion of the sensor is along at least one edge of the display.


Optionally, the pre-defined portion of the sensor is dynamically adjusted based on an orientation of the display.


Optionally, the report is input to an application running on the device or to a graphical processing unit of the device.


According to some exemplary embodiments, there is provided a method including: detecting output from a sensor integrated with a display of a device, wherein the sensor is configured to sense touch input; detecting when a user is gripping the device based only on the output and a pre-defined model; and reporting the gripping detected to a controller of the device.


Optionally, the gripping is detected based on detecting from the output that the user is both touching a pre-defined portion of the sensor with a thumb or a palm and touching a chassis of the device.


Optionally, the sensor is a capacitive based sensor having electrode junctions.


Optionally, the method includes detecting capacitance (CBD) between the device ground and a user touching the sensor based on the output from the sensor and the pre-defined model.


Optionally, the model is a physical model for CBD capacitive estimation based on a number of touched junctions and a relative effect, wherein the number of touched junctions and the relative effect are detected from the sampled output.


Optionally, the method includes detecting that the user is touching the chassis based on the CBD detected.


Optionally, the method includes detecting that the user is touching the chassis based on accumulated averages of the CBD being above a pre-defined threshold.


Optionally, the pre-defined portion of the sensor is along at least one edge of the display.


Optionally, the pre-defined portion of the sensor is dynamically adjusted based on an orientation of the display.


Optionally, the report is input to an application running on the device or to a graphical processing unit of the device.


Certain features of the examples described herein, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the examples described herein, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination or as suitable in any other described embodiment of the disclosure. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.

Claims
  • 1. A device comprising: a front face and a back face, the back face opposite the front face; a display on the front face of the device; a controller configured to control the display; a grid based capacitive sensor integrated with the display, wherein the sensor is configured to sense touch input on the display; and a circuit in communication with the sensor, the circuit configured to: perform mutual capacitive detection based on scanning one axis of the grid based capacitive sensor; detect a heatmap based on the mutual capacitive detection; identify a thumb or palm on a pre-defined portion of the display based on the heatmap; detect capacitance (CBD) between the user and the device based on the heatmap; identify that a user is touching the back face of the device based on accumulated averages of the CBD being above a pre-defined threshold, wherein the accumulated averages are accumulated over a plurality of refresh cycles of the sensor including refresh cycles during which the thumb or palm is on the pre-defined portion of the display; detect that the user is gripping the device based on identifying the thumb or palm on the pre-defined portion of the display and based on detecting when the user is touching the back face of the device; and report the gripping detected to the controller.
  • 2. The device of claim 1, wherein the CBD is detected based on a number of touched junctions in the heatmap and amplitude of touch junctions in the heatmap.
  • 3. The device of claim 1, wherein the pre-defined portion of the sensor is along at least one edge of the display.
  • 4. The device of claim 1, wherein the pre-defined portion of the sensor is dynamically adjusted based on an orientation of the display.
  • 5. The device of claim 1, wherein the report is input to an application running on the device or to a graphical processing unit of the device.
  • 6. The device of claim 1, wherein the CBD is also detected based on pre-defined constants, wherein the pre-defined constants are related to physical properties of the grid based capacitive sensor and are defined based on empirical data.
  • 7. A method comprising: performing mutual capacitive detection based on scanning one axis of a grid based capacitive sensor integrated with a display of a device, wherein the display occupies at least a portion of a first surface of the device, wherein the first surface is opposite a second surface of the device, and wherein the sensor is configured to sense touch input on the display; detecting when the user is touching a pre-defined portion of the display with a thumb or palm based on a heatmap determined from the mutual capacitive detection; detecting capacitance (CBD) between the user and the device based on the heatmap; detecting when the user is touching the second surface of the device based on accumulated averages of the CBD being above a pre-defined threshold, wherein the accumulated averages are accumulated over a plurality of refresh cycles of the sensor including refresh cycles during which the thumb or palm is on the pre-defined portion of the display; detecting that the user is gripping the device based on identifying the thumb or palm on the pre-defined portion of the display and based on detecting when the user is touching the back face of the device; and reporting the gripping detected to a controller of the device.
  • 8. The method of claim 7, wherein the CBD is detected based on a number of touched junctions in the heatmap and amplitude of touch junctions in the heatmap.
  • 9. The method of claim 7, wherein the pre-defined portion of the sensor is along at least one edge of the display.
  • 10. The method of claim 7, wherein the pre-defined portion of the sensor is dynamically adjusted based on an orientation of the display.
  • 11. The method of claim 7, wherein the report is input to an application running on the device or to a graphical processing unit of the device.
  • 12. The method of claim 7, wherein the pre-defined threshold is 250 pF.
  • 13. The method of claim 7, wherein the CBD is also detected based on pre-defined constants, wherein the pre-defined constants are related to physical properties of the grid based capacitive sensor and are defined based on empirical data.
  • 14. The method of claim 7, wherein the CBD is also detected based on spatial spread of a touched area on the heatmap.
Related Publications (1)
Number Date Country
20170177110 A1 Jun 2017 US