Sensor baseline offset adjustment for a subset of sensor output values

Information

  • Patent Number
    11,294,503
  • Date Filed
    Thursday, October 4, 2018
  • Date Issued
    Tuesday, April 5, 2022
Abstract
An image jaggedness filter is disclosed that can be used to detect the presence of ungrounded objects such as water droplets or coins, and delay periodic baseline adjustments until these objects are no longer present. To do otherwise could produce inaccurate normalized baseline sensor output values. The application of a global baseline offset is also disclosed to quickly modify the sensor offset values to account for conditions such as rapid temperature changes. Background pixels not part of any touch regions can be used to detect changes to no-touch sensor output values and globally modify the sensor offset values accordingly. The use of motion dominance ratios and axis domination confidence values is also disclosed to improve the accuracy of locking onto dominant motion components as part of gesture recognition.
Description
FIELD OF THE INVENTION

This relates to touch sensor panels used as input devices for computing systems, and more particularly, to the normalization and post-processing of touch sensor data.


BACKGROUND OF THE INVENTION

Many types of input devices are presently available for performing operations in a computing system, such as buttons or keys, mice, trackballs, touch sensor panels, joysticks, touch screens and the like. Touch screens, in particular, are becoming increasingly popular because of their ease and versatility of operation as well as their declining price. Touch screens can include a touch sensor panel, which can be a clear panel with a touch-sensitive surface. The touch sensor panel can be positioned partially or completely in front of a display screen, or integrated partially or entirely within the display screen, so that at least a portion of the touch-sensitive surface covers at least a portion of the viewable area of the display screen. Touch screens can allow a user to make selections and move a cursor by simply touching the display screen via a finger or stylus. In general, the touch screen can recognize the touch and position of the touch on the display screen, and the computing system can interpret the touch and thereafter perform an action based on the touch event.


Touch sensor panels can be capable of detecting either single-touch events or multiple touch events, an example of which is described in Applicant's co-pending U.S. application Ser. No. 11/649,998 entitled “Proximity and Multi-Touch Sensor Detection and Demodulation,” filed on Jan. 3, 2007 and published as U.S. Patent Application Publication No. 2008/0158172, the contents of which are incorporated by reference herein in their entirety for all purposes.


To provide a more uniform response from the touch sensor panel given the same amount of touch, the sensor output values can be calibrated or normalized by using offset values to compensate the raw no-touch output values for each sensor in the panel so that all sensor output values are normalized to approximately the same value. A periodic local baseline offset adjustment algorithm can then be employed to locally update the sensor offset values to account for variables such as temperature drift. However, when ungrounded objects such as water droplets or coins are present on the touch sensor panel, the periodic local baseline offset adjustment algorithm can generate inaccurate normalized results. Furthermore, factors such as temperature changes can rapidly skew the normalized sensor output values. In addition, when processing touch data to recognize gestures, it can be difficult to clearly identify and lock onto a particular dominant motion component as a preliminary step in recognizing a particular gesture.


SUMMARY OF THE INVENTION

This relates to an image jaggedness filter that can be used to detect the presence of ungrounded objects such as water droplets or coins on a touch sensor panel, and delay periodic local offset adjustments until these objects have largely disappeared. To do otherwise could produce inaccurate normalized sensor output values. This also relates to the application of a global baseline offset to quickly normalize the sensor output values to account for conditions such as rapid temperature changes. Background pixels not part of any touch regions can be used to detect changes to no-touch sensor output values and compute a global baseline offset accordingly. This also relates to the use of motion dominance ratios and axis domination confidence values to improve the accuracy of locking onto dominant motion components as part of gesture recognition.





BRIEF DESCRIPTION OF THE DRAWINGS


FIGS. 1a-1c illustrate an exemplary periodic local baseline adjustment for a single row of pixels in a touch sensor panel according to embodiments of the invention.



FIG. 2a illustrates an exemplary touch sensor panel having water droplets on its touch surface and the resulting touch image having a high spatial frequency.



FIG. 2b illustrates an exemplary flow diagram of the use of the image jaggedness filter according to one embodiment of this invention.



FIG. 3 illustrates an exemplary image of touch on a touch sensor panel showing how a global baseline offset can be determined according to one embodiment of this invention.



FIG. 4a illustrates the computation of an exemplary periodic global baseline offset adjustment value for a single row of pixels (sensors) A-G in a touch sensor panel according to embodiments of the invention.



FIG. 4b illustrates an exemplary plot of the overall offset value for a single sensor over time including the total contributions of a local baseline offset and the contribution of a global baseline offset according to one embodiment of this invention.



FIG. 4c illustrates an exemplary flowchart or algorithm for implementing the global baseline offset algorithm according to embodiments of the invention.



FIG. 4d illustrates an exemplary plot of the overall offset value for a single sensor over time wherein the global baseline offset value is applied to the sensor offset value gradually according to embodiments of the invention.



FIG. 5 illustrates an exemplary motion component dominance algorithm that can be implemented by a processor executing firmware according to embodiments of the invention.



FIG. 6 illustrates an exemplary algorithm for computing an axis_domination_confidence value that can be implemented by a processor executing firmware according to embodiments of the invention.



FIG. 7 illustrates an exemplary computing system operable with a touch sensor panel to implement the image jaggedness filter, global baseline offset, and motion component dominance factors according to one embodiment of this invention.



FIG. 8a illustrates an exemplary mobile telephone that can include a touch sensor panel and computing system for implementing the image jaggedness filter, global baseline offset, and motion component dominance factors according to one embodiment of this invention.



FIG. 8b illustrates an exemplary digital media player that can include a touch sensor panel and computing system for implementing the image jaggedness filter, global baseline offset, and motion component dominance factors according to one embodiment of this invention.



FIG. 8c illustrates an exemplary personal computer that can include a touch sensor panel and computing system for implementing the image jaggedness filter, global baseline offset, and motion component dominance factors according to one embodiment of this invention.





DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

In the following description of preferred embodiments, reference is made to the accompanying drawings in which it is shown by way of illustration specific embodiments in which the invention can be practiced. It is to be understood that other embodiments can be used and structural changes can be made without departing from the scope of the embodiments of this invention.


This relates to an image jaggedness filter that can be used to detect the presence of ungrounded objects such as water droplets or coins, and delay periodic local baseline offset adjustments until these objects have largely disappeared. To do otherwise could produce inaccurate normalized sensor output values. This also relates to the application of a global baseline offset to quickly modify the sensor offset values to account for conditions such as rapid temperature changes. Background pixels not part of any touch regions can be used to detect changes to no-touch sensor output values and compute the global baseline offset accordingly. This also relates to the use of motion dominance ratios and axis domination confidence values to improve the accuracy of locking onto dominant motion components as part of gesture recognition.


Image Jaggedness Filter for Baseline Calculations

To provide a more uniform response from the touch sensor panel given the same amount of touch, touch sensor panel output values can be calibrated using offset values to adjust the raw no-touch output values for each sensor in the panel so that all touch sensor panel output values are normalized to approximately the same value. However, even with normalized sensor outputs, temperature drift and other factors can cause the sensor output values to change, which will tend to skew the normalized baseline. To account for these gradual changes to the normalized sensor output values, a periodic local baseline offset adjustment algorithm can be employed.



FIGS. 1a-1c illustrate an exemplary periodic local baseline adjustment for a single row of pixels (sensors) A-G in a touch sensor panel according to embodiments of the invention. Although not shown, it should be understood that each row in the touch sensor panel can also be subject to this periodic local baseline adjustment. The periodic local baseline offset adjustment algorithm can increment or decrement individual sensor offset values by one count or unit, or some small value to provide periodic fine-tuning of the offsets to track temperature drift or other shifts in the sensor output values.


As shown in FIG. 1a, to perform this periodic local baseline offset adjustment, a no-touch scan of the sensor panel is performed after a dynamic adjustment time interval has passed, and raw sensor output values 108 are obtained. The adjustment time interval is generally much longer than the frame period (the time it takes to scan the entire sensor panel once). Previously computed offset values for each sensor (see 110-A through 110-G) are then subtracted from the measured raw sensor output values 108 to normalize them. Ideally, as shown in FIG. 1a, the subtraction leaves all normalized sensor output values equal to the same baseline value 112.


However, as shown in FIG. 1b, if some of the no-touch measured raw sensor output values 114 shift due to a change in some condition such as a temperature increase, for example, then after subtraction of the offset values 110-A through 110-G, some of the normalized sensor output values may be equal to some value other than baseline value 112, such as value 116 in FIG. 1b. To adjust for this shift according to embodiments of the invention, all sensors having normalized sensor output values that are positive or negative as compared to the baseline 112 are identified. (In the example of FIG. 1b, the normalized sensor values for sensors B-E and G are positive.) For any sensors with positive normalized sensor output values, their corresponding offset values are incremented by P, where P may be one count, a small value, or a percentage of the positive deviation. In the example of FIG. 1b, P represents the full difference between value 116 and the original baseline 112, but it should be understood that if P represents less than the full difference, multiple periodic local baseline offset adjustments can eventually take up the full difference. Similarly, for any sensors with negative normalized sensor output values, their corresponding offset values are decremented by Q, where Q may be one count, a small value, or a percentage of the negative deviation. The algorithm then waits the duration of an adjustment period before scanning the panel again.


As shown in FIG. 1c, after the sensor offset values for sensors B-E and G have been adjusted, the normalized sensor output values should be closer to the original baseline 112. In the example of FIG. 1c, because the offset adjustment value P represented the full difference between value 116 and the original baseline 112, the normalized sensor output values equal the original baseline 112.
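The periodic local baseline offset adjustment of FIGS. 1a-1c can be sketched in a few lines of NumPy. This is illustrative only; the one-count step size, the array representation of a sensor row, and the function name are assumptions, not part of the disclosure.

```python
import numpy as np

def local_baseline_adjust(raw, offsets, baseline, step=1.0):
    """One pass of the periodic local baseline offset adjustment.

    Sensors whose normalized output (raw - offset) sits above the
    baseline have their offsets incremented by `step` (P in FIG. 1b);
    sensors below the baseline have them decremented (Q).
    """
    normalized = raw - offsets
    offsets = offsets + step * (normalized > baseline)  # positive deviations
    offsets = offsets - step * (normalized < baseline)  # negative deviations
    return offsets
```

Repeated application of this step at each adjustment interval gradually walks the normalized outputs back toward the baseline, as in FIG. 1c.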


Despite this normalization, in multi-touch sensor panels, certain pixels can generate false, erroneous or otherwise distorted readings when two or more simultaneous touch events are generated by the same poorly grounded object. Compensation of these distorted readings (so-called “negative pixels”) is described in U.S. application Ser. No. 11/963,578 entitled “Negative Pixel Compensation,” the contents of which are incorporated by reference herein in their entirety for all purposes. To compensate for these distorted readings, a predicted negative pixel value can first be computed as an indicator of pixels that are likely to be distorted. The predicted negative pixel value for any particular pixel can be computed by summing up the touch output values for pixels in the drive line of the particular pixel being considered, summing up the touch output values for pixels in the sense line of the particular pixel being considered, and then multiplying these two sums. A scaled function of the predicted negative pixel value can then be added to the measured touch output value for the pixel to compensate for artificially negative readings.
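The predicted-negative-pixel computation described above can be sketched as follows. This is a sketch under stated assumptions: rows are taken as drive lines and columns as sense lines, and the scale factor in the compensation step is illustrative.

```python
import numpy as np

def predicted_negative_pixels(image):
    """Predicted negative-pixel value for every pixel.

    For each pixel, sum the touch output values along its drive line
    (assumed here to be its row) and along its sense line (assumed to
    be its column), then multiply the two sums.
    """
    row_sums = image.sum(axis=1, keepdims=True)  # drive-line sums
    col_sums = image.sum(axis=0, keepdims=True)  # sense-line sums
    return row_sums * col_sums  # broadcasting yields a per-pixel product

def compensate(image, scale=0.01):
    """Add a scaled function of the prediction to the measured values.
    The scale factor is an illustrative assumption."""
    return image + scale * predicted_negative_pixels(image)
```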


However, due to physical design changes, state-of-the-art touch sensor panels can have a greater incidence of negative pixels than previous touch sensor panels. In trackpad embodiments, for example, negative pixels can appear more frequently due to the expected frequent usage of unplugged notebook computers, which can cause a higher incidence of touches by ungrounded objects. Thus, for a given image of touch, there can be a higher sum of negative and positive pixels than in previous designs.


Water droplets on a touch sensor panel can also appear as ungrounded objects. On trackpads, where user fingers and palms often touch the panel (sometimes inadvertently), water droplets can easily get smeared. Therefore, if the possible presence of water droplets can be detected, it would be preferable to hold off on any periodic local baseline offset adjustment until the water has dried off, because of the likely existence of corrupting negative pixels.


To suppress periodic local baseline offset adjustments in the presence of water droplets, extra filters can first be employed to detect the presence of water droplets. To detect water droplets, a jaggedness/irregularity filter can be used, as described in U.S. application Ser. No. 11/619,490 entitled “Irregular Input Identification” and U.S. application Ser. No. 11/756,211 entitled “Multi-touch Input Discrimination,” both of which are incorporated by reference herein in their entirety for all purposes. This jaggedness/irregularity filter can be used to find touch images having a high spatial frequency, such as those caused by water droplets.



FIG. 2a illustrates an exemplary touch sensor panel 200 having water droplets 202 on its touch surface. The sensors in row 204 can generate touch outputs as shown in plot 206. Plot 206 shows that water droplets 202, being ungrounded, can generate raw touch sensor output values having a high spatial frequency (a high frequency of occurrence of touch images in space), a certain jaggedness in the captured image, and a number of positive and negative pixels. Although not shown in FIG. 2a, a similar plot can be obtained for every row and column in touch sensor panel 200.



FIG. 2b illustrates an exemplary flow diagram of the use of the image jaggedness filter according to embodiments of the invention. In FIG. 2b, a jaggedness measure can be obtained at 208. To accomplish this, the jaggedness/irregularity filter as mentioned above can be applied to all rows and columns to generate a jaggedness measure for the entire image. In some embodiments, the jaggedness measures for all rows and columns can be averaged and normalized. Alternatively, a spatial Fourier transform can be used.
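The jaggedness filter itself is defined in the incorporated applications. As a rough stand-in for intuition only, a normalized mean absolute second difference along rows and columns behaves similarly: smooth finger-sized patches score near 0, while droplet-like high-spatial-frequency images score high. This proxy, including its normalization constant, is an assumption, not the disclosed filter.

```python
import numpy as np

def jaggedness_measure(image):
    """An assumed proxy for the image jaggedness measure, normalized
    to [0, 1]: average absolute second difference along every row and
    column, scaled by the image's dynamic range."""
    span = image.max() - image.min()
    if span == 0:
        return 0.0
    row_j = np.abs(np.diff(image, n=2, axis=1)).mean()
    col_j = np.abs(np.diff(image, n=2, axis=0)).mean()
    # second differences reach 4x the range on alternating patterns,
    # so divide by 4 * span to keep the result within [0, 1]
    return float(min(1.0, (row_j + col_j) / (2 * 4 * span)))
```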


If a moderate (relatively even) mix of negative and positive pixels is found, i.e. a mix within a particular mix threshold at 210, and a certain jaggedness threshold is exceeded at 212, indicating the presence of numerous poorly grounded objects such as water droplets, then the next periodic local baseline offset adjustment can be skipped at 214. For example, a "moderate" mix of negative and positive pixels may be defined as one in which the percentages of negative and positive pixels are within 40% of each other, such as a 30%/70% split. All other percentages would not be considered "moderate." Additionally, if the jaggedness measure is normalized to [0,1], with "0" being not jagged (no ungrounded objects) and "1" being completely jagged (many small ungrounded objects), then the jaggedness threshold could be set to 0.5.


If the jaggedness threshold is not exceeded at 212, but the number of positive and negative pixels is changing rapidly at 216 (which can occur when water droplets are evaporating), periodic local baseline offset adjustments can also be suppressed at 214. To determine whether the numbers of positive and negative pixels are changing rapidly, the sums of the negative and positive pixels can be passed through a (mathematical) low pass filter (LPF) that produces an auto-regressive average. Instantaneous values can then be subtracted from the average. If the difference is high (greater than a predetermined threshold, such as the instantaneous value being more than 25% different from the computed average), this indicates a change in the number of negative or positive pixels large enough to suppress periodic local baseline offset adjustments. On the other hand, if the number of positive and negative pixels is not changing rapidly at 216, then the next periodic local baseline offset adjustment can occur as scheduled at 218 (including the suppression of an initial baseline capture if fingers are detected at startup, as disclosed in U.S. application Ser. No. 11/650,112 entitled “Periodic Sensor Panel Baseline Adjustment,” the contents of which are incorporated by reference herein in their entirety for all purposes).
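The low-pass-filter test described above can be sketched as a simple auto-regressive (exponential moving) average. The filter coefficient is an assumption; the 25% relative threshold follows the example in the text.

```python
class PixelCountChangeDetector:
    """Tracks an auto-regressive average of a signed-pixel count and
    flags samples that differ from the average by more than a relative
    threshold (25% in the text's example)."""

    def __init__(self, alpha=0.1, rel_threshold=0.25):
        self.alpha = alpha                  # LPF coefficient (assumed)
        self.rel_threshold = rel_threshold  # 25% per the text
        self.avg = None

    def update(self, count):
        """Feed one per-frame count; returns True on a rapid change."""
        if self.avg is None:
            self.avg = float(count)
            return False
        rapid = abs(count - self.avg) > self.rel_threshold * abs(self.avg)
        # auto-regressive average: avg += alpha * (sample - avg)
        self.avg += self.alpha * (count - self.avg)
        return rapid
```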


If the mix of negative and positive pixels is not moderate at 210 (e.g. many more positive pixels than negative pixels, or vice versa), the jaggedness threshold is not exceeded at 222, and the mix of negative and positive pixels is changing rapidly at 216, periodic local baseline offset adjustments can be suppressed at 214. However, if the mix of negative and positive pixels is not changing rapidly at 216, periodic local baseline offset adjustments can be performed at 218.


After enough water evaporates, no significant number of negative pixels may remain, but some positive pixels may remain. If the positive pixels are scattered spatially, they can still cause the jaggedness measure to be above the threshold. Note that the jaggedness algorithm may only recognize that the jaggedness measure has exceeded a threshold—it does not see actual negative and positive pixels, so it cannot determine that there are few negative pixels remaining. Thus, if the mix of negative and positive pixels is not moderate at 210, but the jaggedness threshold is exceeded at 222, periodic local baseline offset adjustments can be performed at 218. In addition, to compensate for this effect, the increment/decrement rate of the adaptation algorithm can be sped up, so that the positive pixels are compensated more quickly and the effect is reduced.
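Taken together, the branches at 210, 212/222 and 216 can be collected into one decision function. This is a sketch: the inputs are assumed to be precomputed elsewhere, and the default threshold follows the 0.5 example above.

```python
def should_skip_local_adjustment(moderate_mix, jaggedness,
                                 counts_changing_rapidly,
                                 jaggedness_threshold=0.5):
    """Decision flow of FIG. 2b: True means suppress the next periodic
    local baseline offset adjustment."""
    if moderate_mix:
        # Even mix of negative and positive pixels: high jaggedness
        # indicates many small ungrounded objects, so suppress (214).
        if jaggedness > jaggedness_threshold:
            return True
        return counts_changing_rapidly
    # Uneven mix: high jaggedness is attributed to scattered leftover
    # positive pixels, so the adjustment proceeds (218).
    if jaggedness > jaggedness_threshold:
        return False
    return counts_changing_rapidly
```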


Global Baseline Offset

As described above, there are situations in which it can be preferable to delay periodic local baseline offset adjustments so that ungrounded touches do not cause erroneous adjustments to the sensor offset values. Additionally, with conventional keyboards having trackpads, inadvertent touch events can be commonplace while the keyboard is being utilized, presenting another situation where it can be preferable to keep the adaptation rate slower so that patches due to hovering or inadvertent touches do not get incorporated into the sensor offset values. However, it can still be desirable to quickly compensate for temperature or other global effects.


Therefore, in addition to the periodic local baseline offset adjustment algorithm described above that can cause sensor offset values to be incrementally adapted or changed on a pixel-by-pixel (local) basis, in other embodiments of the invention a global baseline offset can be applied to the offset values for all pixels. The global baseline offset can be used to effect changes much more quickly than the periodic local baseline offset adjustment algorithm to compensate for large temperature changes or the effects of other global conditions. In some embodiments, the full amount of this global baseline offset can be immediately applied to the offset values for all pixels. In other embodiments, the offset values for all pixels can be incremented or decremented gradually over time (but more often than the individual pixels can be incremented or decremented using local baseline offset adjustments), until the full amount of the global baseline offset has been applied.



FIG. 3 illustrates an exemplary image of touch on touch sensor panel 300 showing how a global baseline offset value can be determined according to embodiments of the invention. First, in some embodiments, unions of adjacent or nearby patches can be determined (see 302 and 304). To determine which patches should be grouped together, any number of methods can be used, such as computing the centroids of the patches and grouping together those patches whose centroids are closest together. The union of those patches can be formed based on the touch sensor output values within the patches. For example, for any two grouped patches, all pixels within those two patches having touch sensor output values above a certain threshold can be considered part of the union. These union areas can be blocked out from subsequent calculations so that only background pixels 306 remain. In other embodiments, unions need not be formed, and only the patches themselves can be excluded from the background pixels.


An average of all or a portion of the background pixels 306 can then be computed, and this average can then be used to globally modify the offset values for all pixels in the touch sensor panel. Because the background pixels 306 are untouched, the average of their untouched output values can provide an indication of rapid changes to the pixel outputs due to factors such as temperature. This average, or some adjustment value that is a function of this average, can then be added to or subtracted from the current sensor baseline to compute the global baseline offset value. This global baseline offset value can then be added to the current offset values for every pixel in the touch sensor panel to effect a global adjustment of the offset values. In some embodiments, this global baseline offset value can be applied immediately to the current offset values for every pixel. In other embodiments, the current offset values can be incremented or decremented gradually until the full global baseline offset value has been applied. To keep the normalized sensor output values from “running away” (e.g. getting excessively large or small) due to unintended artifacts of the algorithm such as an accumulation of roundoff error, the global baseline offset value can optionally decay to zero over time.
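The background-pixel averaging described above can be sketched as follows. The simple threshold used here to exclude patch pixels stands in for the patch/union exclusion of FIG. 3, and the immediate (rather than gradual) application is one of the two disclosed options; both choices are assumptions for illustration.

```python
import numpy as np

def global_baseline_offset(image, offsets, baseline, touch_threshold):
    """Compute a global baseline offset from the background pixels.

    Pixels whose normalized output exceeds `touch_threshold` are
    treated as belonging to a touch patch (or union of patches) and
    excluded; the average of the remaining background pixels, relative
    to the desired baseline, becomes one offset for every sensor.
    """
    normalized = image - offsets
    background = normalized[normalized <= touch_threshold]
    if background.size == 0:
        return 0.0  # no background pixels available this frame
    return float(background.mean() - baseline)

def apply_global_offset(offsets, g):
    """Immediate application: add the full global offset to every pixel."""
    return offsets + g
```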



FIG. 4a illustrates the computation of an exemplary periodic global baseline offset value for a single row of pixels (sensors) A-G in a touch sensor panel according to embodiments of the invention. Although not shown, it should be understood that each row in the touch sensor panel can be involved in the computation of this global baseline offset value. In the example of FIG. 4a, current no-touch (i.e. background) raw sensor output values 408 have risen substantially and in a fairly uniform manner from previous no-touch raw sensor output values 420 due to a change in some condition such as a temperature increase. As such, subtracting the previous sensor offset values 410-A through 410-G from the current raw sensor output values 408 results in normalized values 416 well above the original baseline 412, which can create errors in touch detection and interpretation. To perform a global baseline offset adjustment on all offset values in the touch sensor panel, an average of the background pixels can first be computed. In the example of FIG. 4a, the average is shown at 422. Next, the difference between this average and the original baseline 412 can be computed as the global baseline offset value 424. This global baseline offset value 424 can then be added to the previous sensor offset values 410-A through 410-G to produce updated sensor offset values and effect a global adjustment of the offset values.



FIG. 4b illustrates an exemplary plot of the overall offset value 400 for a single sensor over time including the total contributions of a local baseline offset 404 and the contribution of a global baseline offset 402 according to embodiments of the invention. In the example of FIG. 4b, the offset value 400, global baseline offset value 402, and the total contribution of the local baseline offset value 404 start near zero at 406, indicating that the raw no-touch sensor output value for that sensor is approximately equal to the desired baseline value. If a temperature shift or other environmental condition is detected at 408 resulting in a rapid increase in the average of the background pixels (e.g., a change of more than 25% over the span of a minute), the full amount of the calculated global baseline offset value 402 can be immediately added to the sensor offset value, causing the overall sensor offset value 400 to increase rapidly to a value 410 equal to the difference between the average of the background pixels and the original baseline as described above. The global baseline offset value 402 can then decay back to zero over time at 412 to ensure that the offset value does not get excessively large or small due to unintended artifacts of the algorithm.


However, if the increase in the raw sensor output values remains, even while the global baseline offset value 402 is decaying back down to zero, another mechanism is needed to ensure that an increase to the overall offset value does occur. To accomplish this, the local baseline offset adjustment algorithm described above can periodically incrementally increase the overall offset value 400 as the global baseline offset value 402 is decaying. Although each increment to the overall offset value 400 made by the local baseline offset adjustment algorithm is small, the total contribution of the local baseline offset value 404 gradually increases over time, as shown at 414 in FIG. 4b.
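The interplay shown in FIG. 4b, a global offset that jumps to the full shift and then decays while the local adjustment slowly takes over, can be illustrated with a small simulation. The decay factor, local step size, and per-period schedule are assumed parameters for illustration only.

```python
def simulate_overall_offset(shift, steps, decay=0.9, local_step=1.0):
    """Sketch of FIG. 4b dynamics for a persistent positive shift.

    The global contribution starts at the full shift and decays toward
    zero each period; the local adjustment adds one small increment per
    period whenever the overall offset still falls short of the shift.
    Returns the per-period overall offset (global + accumulated local).
    """
    global_part, local_part, history = float(shift), 0.0, []
    for _ in range(steps):
        history.append(global_part + local_part)
        global_part *= decay                  # global offset decays to zero
        if global_part + local_part < shift:  # residual error remains
            local_part = min(shift, local_part + local_step)
    return history
```

With a persistent shift, the overall offset stays near the shifted level throughout: the local contribution ramps up at roughly the rate the global contribution decays, matching the behavior at 414 in FIG. 4b.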



FIG. 4c illustrates an exemplary flowchart or algorithm for implementing the global baseline offset algorithm as described above according to embodiments of the invention.


Although not shown, similar adjustments to the overall sensor offset value of each pixel can be made in the negative direction if the average of the background pixels rapidly decreases.



FIG. 4d illustrates an exemplary plot of the overall offset value 400 for a single sensor over time wherein the global baseline offset value is applied to the sensor offset value gradually according to embodiments of the invention. In the example of FIG. 4d, the global baseline offset value 402 can be incrementally added to the sensor offset value, causing the overall sensor offset value 400 to increase gradually to a value 410 equal to the difference between the average of the background pixels and the original baseline as described above. It should be noted that although the global baseline offset value is applied incrementally, the increment period can be much faster than the local baseline offset adjustment described above. The global baseline offset value 402 can then decay back to zero over time at 412 to ensure that the offset value does not get excessively large or small due to unintended artifacts of the algorithm.


Motion Component Dominance Factors for Motion Locking

In the processing of touch images, after touch images (e.g. from two fingers) are captured, identified and tracked over multiple panel scans, motion components can be extracted. In the case of two fingers, motion components can include the X component, the Y component, a scale (zoom) component (the dot product of the two finger motion vectors), and a rotate component (the cross product of the two finger motion vectors). The extracted motion components can provide for two types of control. “Integral control” is defined herein as providing all four degrees of freedom (the ability to control all axes at once). “Separable control” is more limited, and restricts motion to one of (1) X-Y scrolling as a set, (2) zoom, or (3) rotate (i.e. a single axis at a time).



FIG. 5 illustrates an exemplary motion component dominance algorithm 500 that can be implemented by a processor executing firmware according to embodiments of the invention. After multiple images of touch are captured at 502, motion components such as the x-direction velocity (Vx), y-direction velocity (Vy), rotational velocity (Vr), and scaling velocity (Vs) can be extracted at 504. To implement separable control, embodiments of the invention can lock onto the first component (axis) with significant motion, and ignore the others. For example, if significant X-Y scrolling is detected first, subsequently detected zooming motions may be ignored until liftoff of the fingers. To lock onto the first component with significant motion, a low pass filter (LPF) can be applied to the computed velocities of the extracted motion components to compute the following at 506:

Smooth_translation_speed = (LPF(Vx)^2 + LPF(Vy)^2)^0.5
Smooth_rotate_speed = LPF(Vr)
Smooth_scale_speed = LPF(Vs)


Note that the smooth_translation_speed value includes Vx and Vy because of the desire to lock onto scrolling as a whole, not just the X and Y components. Of these three values, the dominant (largest) computed speed can be used, while the others can be ignored (zeroed or clipped out).
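The three smoothed speeds can be sketched with a first-order IIR filter standing in for the LPF. The filter form and coefficient are assumptions; the combination of Vx and Vy into a single translation speed follows the formula above.

```python
class MotionDominance:
    """Maintains low-pass-filtered motion component speeds."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha  # assumed first-order IIR coefficient
        self.vx = self.vy = self.vr = self.vs = 0.0

    def _lpf(self, state, sample):
        return state + self.alpha * (sample - state)

    def update(self, Vx, Vy, Vr, Vs):
        """Feed one frame of extracted velocities; returns the smoothed
        translation, rotate, and scale speeds per the formulas above."""
        self.vx = self._lpf(self.vx, Vx)
        self.vy = self._lpf(self.vy, Vy)
        self.vr = self._lpf(self.vr, Vr)
        self.vs = self._lpf(self.vs, Vs)
        translation = (self.vx ** 2 + self.vy ** 2) ** 0.5
        return translation, self.vr, self.vs
```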


However, in practice it can be difficult to lock on properly, because a scroll motion might initially look like a rotate motion, for example, or vice versa. Therefore, in embodiments of the invention, the three raw values described above can be utilized in conjunction with two new parameters, scale_dominance_ratio (SDR) and rotate_dominance_ratio (RDR), which can be used to apply weights to the various motion components and set a balance point for the motions so that a particular component can be locked onto more accurately. The SDR and RDR values can be established after the various finger contacts are identified at 508. The SDR and RDR values computed at 510 can be based on whether the detected contacts are identified as fingers and/or thumbs. For example, if a thumb is detected, it can be more likely that a user is using a thumb and finger to perform a scaling (zoom) or rotate operation rather than a translation or scroll operation, so the SDR and RDR values can be set to high values (e.g. 2.5) so that the Smooth_scale_speed or the Smooth_rotate_speed values dominate the Smooth_translation_speed value.


However, if two or more fingers are detected but no thumb, it is more likely that the user is performing a translation or scroll operation rather than a scaling or rotate operation, so the SDR and RDR values can be set to lower values to ensure that the Smooth_translation_speed value dominates. The multiple-finger, no-thumb SDR value can further be a function of the horizontal separation of the fingers, because a user is more likely to be performing a translation or scroll operation when the fingers are close together, and more likely to be performing a two-finger scaling operation when the fingers have a greater separation. Thus, for example, the SDR can be set to 0.25 if the finger separation is between 0 and 3 cm, can vary from 0.25 to 1.25 as the separation increases from 3 to 6 cm, and can be set to 1.25 for separations greater than 6 cm.


In further embodiments, an exception can be created for the SDR during a two-finger top-to-bottom translation because of the tendency for a user's fingers to draw together during the translation. The movement of the fingers towards each other during the translation should not be interpreted as a scaling operation. To prevent this, if a downward translation is detected along with a scale contraction, the SDR can be maintained at 0.25, even if the two-finger separation distance is large.
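The SDR selection described in the preceding paragraphs could be sketched as below, using the example values quoted above (2.5 for a detected thumb; 0.25 to 1.25 as a function of separation; 0.25 held during a downward translation with scale contraction). The function name, signature, and the use of linear interpolation over the 3-6 cm range are assumptions for illustration.

```python
def scale_dominance_ratio(separation_cm, thumb_present,
                          downward_translation=False,
                          scale_contracting=False):
    """Illustrative SDR selection (values from the text; shape assumed)."""
    if thumb_present:
        # Thumb detected: favor scale/rotate over translation.
        return 2.5
    # Exception: a two-finger downward translation with contracting
    # fingers keeps SDR pinned at 0.25 regardless of separation.
    if downward_translation and scale_contracting:
        return 0.25
    if separation_cm <= 3.0:
        return 0.25
    if separation_cm >= 6.0:
        return 1.25
    # Assumed linear interpolation from 0.25 to 1.25 over 3-6 cm.
    return 0.25 + (separation_cm - 3.0) / 3.0
```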


After the SDR and RDR values are computed at 510, the following pseudocode can then be implemented at 512, 514, 516 and 518:


Variables: scale_dominance_ratio (SDR), rotate_dominance_ratio (RDR)

If (smooth_translation_speed > SDR × smooth_scale_speed), then
Clip scale (Vx → pass, Vs → zero); leave scroll.  (A)
If (smooth_translation_speed > RDR × smooth_rotate_speed), then
Clip rotate (Vx → pass, Vr → zero); leave scroll.  (B)
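A runnable rendering of pseudocode (A) and (B) might look like the following; the function name and argument list are assumptions for this sketch.

```python
def clip_motion_components(vx, vy, vr, vs,
                           smooth_translation, smooth_rotate, smooth_scale,
                           sdr, rdr):
    """When smoothed translation dominates the SDR/RDR-weighted scale or
    rotate speeds, zero the scale/rotate velocities and pass scroll."""
    if smooth_translation > sdr * smooth_scale:
        vs = 0.0  # (A) clip scale, leave scroll
    if smooth_translation > rdr * smooth_rotate:
        vr = 0.0  # (B) clip rotate, leave scroll
    return vx, vy, vr, vs
```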


In other embodiments, where the movement of contacts along with contact identifications provides an ambiguous determination of which motion component to lock onto, locking onto a particular motion component can be delayed until enough motion has occurred to make a more accurate determination. To accomplish this, an axis_domination_confidence value can be computed to provide a representation of the unambiguousness of the motion component to be locked onto.



FIG. 6 illustrates an exemplary algorithm 600 for computing an axis_domination_confidence value that can be implemented by a processor executing firmware according to embodiments of the invention. If smooth_translation_speed < (smooth_scale_speed + smooth_rotate_speed) at 602, then

axis_domination_confidence = 1 − smooth_translation_speed / (smooth_scale_speed + smooth_rotate_speed)

at 604. Otherwise, at 606,

axis_domination_confidence = 1 − (smooth_scale_speed + smooth_rotate_speed) / smooth_translation_speed.
The axis_domination_confidence value as calculated above falls within the range [0, 1], where values approaching 1 represent a pure translation (and therefore high confidence in locking onto the X-Y motion components) and values approaching 0 indicate that the translation amount is about equal to the combined scale and rotation amount (and therefore low confidence in locking onto any motion component).
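The two-branch computation of FIG. 6 could be sketched as follows; the zero-speed guards are an assumption, since the case of all speeds being zero is not addressed above.

```python
def axis_domination_confidence(smooth_translation, smooth_scale,
                               smooth_rotate):
    """Ratio of the smaller aggregate speed to the larger, subtracted
    from 1: ~1 means unambiguous dominance, ~0 means translation roughly
    equals scale + rotate. Division-by-zero guards are assumed."""
    other = smooth_scale + smooth_rotate
    if smooth_translation < other:
        # Branch at 602/604: scale + rotate dominates.
        return 1.0 - (smooth_translation / other if other else 0.0)
    # Branch at 606: translation dominates (or speeds are equal).
    return 1.0 - (other / smooth_translation if smooth_translation else 0.0)
```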


After the axis_domination_confidence value is computed, in one embodiment the motion component locking decision can be delayed by an amount proportional to the inverse of the axis_domination_confidence value at 608. Thus, if the value is high, indicating high confidence, there can be little or no delay. However, if the value is low, indicating low confidence, the locking decision can be delayed to allow for the motion components to become less ambiguous.


In another embodiment, the axis_domination_confidence value (or the square of this value) can be multiplied by any non-clipped motion components (see, e.g., equations (A) and (B) above) at 610. This has the effect of slowing down the ultimate gesture decision. For example, if the axis_domination_confidence value is 1 and is multiplied by the unclipped motion component, the motion will be locked onto and integrated quickly in gesture detection algorithms. However, if no motion component has been locked onto and the dominant motion component is borderline, multiplying the motion components by a low axis_domination_confidence value can dampen the motion and extend the integration period. This can delay the triggering of a decision on which motion components to pass and which to clip, and ultimately the identification of gestures; during this delay, the motions can become less ambiguous. Once locked, the axis_domination_confidence value no longer needs to be applied.
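The damping step at 610 amounts to a simple weighting of the surviving components; a minimal sketch, with an assumed function name and a dictionary of named components for illustration:

```python
def damp_unclipped_components(components, confidence, squared=False):
    """Multiply non-clipped motion components by the confidence value
    (or its square) to slow gesture lock-on when motion is ambiguous."""
    w = confidence * confidence if squared else confidence
    return {name: v * w for name, v in components.items()}
```

With a confidence near 1 the components pass through nearly unchanged, so integration and lock-on proceed quickly; a low confidence shrinks every component, extending the integration period until the dominant axis becomes unambiguous.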


Embodiments of the invention described above can be implemented, for example, using touch sensor panels of the types described in U.S. application Ser. No. 11/650,049 entitled “Double-Sided Touch Sensitive Panel and Flex Circuit Bonding.” Sense channels of the types described in U.S. application Ser. No. 11/649,998 entitled “Proximity and Multi-Touch Sensor Detection and Demodulation” can be used, for example, to detect touch and hover events. The resulting image of touch can be further processed to determine the location of the touch events, the identification of finger contacts, and the identification of gestures as described, for example, in U.S. application Ser. No. 11/428,522 entitled “Identifying Contacts on a Touch Surface,” U.S. application Ser. No. 11/756,211 entitled “Multi-touch Input Discrimination,” and U.S. application Ser. No. 10/903,964 entitled “Gestures for Touch Sensitive Input Devices.” All of the preceding applications referred to in this paragraph are incorporated by reference herein in their entirety for all purposes.



FIG. 7 illustrates exemplary computing system 700 that can include one or more of the embodiments of the invention described above. Computing system 700 can include one or more panel processors 702 and peripherals 704, and panel subsystem 706. Peripherals 704 can include, but are not limited to, random access memory (RAM) or other types of memory or storage, watchdog timers and the like. Panel subsystem 706 can include, but is not limited to, one or more sense channels 708, channel scan logic 710 and driver logic 714. Channel scan logic 710 can access RAM 712, autonomously read data from the sense channels and provide control for the sense channels. In addition, channel scan logic 710 can control driver logic 714 to generate stimulation signals 716 at various frequencies and phases that can be selectively applied to drive lines of touch sensor panel 724 at a voltage established by charge pump 715. In some embodiments, panel subsystem 706, panel processor 702 and peripherals 704 can be integrated into a single application specific integrated circuit (ASIC).


Touch sensor panel 724 can include a capacitive sensing medium having a plurality of drive lines and a plurality of sense lines, although other sensing media can also be used. Each intersection, adjacency or near-adjacency of drive and sense lines can represent a capacitive sensing node and can be viewed as picture element (pixel) 726, which can be particularly useful when touch sensor panel 724 is viewed as capturing an “image” of touch. (In other words, after panel subsystem 706 has determined whether a touch event has been detected at each touch sensor in the touch sensor panel, the pattern of touch sensors in the multi-touch panel at which a touch event occurred can be viewed as an “image” of touch (e.g. a pattern of fingers touching the panel).) Each sense line of touch sensor panel 724 can drive sense channel 708 (also referred to herein as an event detection and demodulation circuit) in panel subsystem 706.


Computing system 700 can also include host processor 728 for receiving outputs from panel processor 702 and performing actions based on the outputs that can include, but are not limited to, moving an object such as a cursor or pointer, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing instructions, operating a peripheral device coupled to the host device, answering a telephone call, placing a telephone call, terminating a telephone call, changing the volume or audio settings, storing information related to telephone communications such as addresses, frequently dialed numbers, received calls, missed calls, logging onto a computer or a computer network, permitting authorized individuals access to restricted areas of the computer or computer network, loading a user profile associated with a user's preferred arrangement of the computer desktop, permitting access to web content, launching a particular program, encrypting or decoding a message, and/or the like. Host processor 728 can also perform additional functions that may not be related to panel processing, and can be coupled to program storage 732 and display device 730 such as an LCD display for providing a UI to a user of the device. Display device 730 together with touch sensor panel 724, when located partially or entirely under the touch sensor panel, or partially or entirely integrated with the touch sensor panel, can form touch screen 718.


Note that one or more of the functions described above can be performed by firmware stored in memory (e.g. one of the peripherals 704 in FIG. 7) and executed by panel processor 702, or stored in program storage 732 and executed by host processor 728. The firmware can also be stored and/or transported within any computer-readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a "computer-readable storage medium" can be any storage medium that can contain or store the program for use by or in connection with the instruction execution system, apparatus, or device. The computer readable storage medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, a portable computer diskette (magnetic), a random access memory (RAM) (magnetic), a read-only memory (ROM) (magnetic), an erasable programmable read-only memory (EPROM) (magnetic), a portable optical disc such as a CD, CD-R, CD-RW, DVD, DVD-R, or DVD-RW, or flash memory such as compact flash cards, secure digital cards, USB memory devices, memory sticks, and the like.


The firmware can also be propagated within any transport medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. In the context of this document, a "transport medium" can be any medium that can communicate, propagate or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The transport medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic or infrared wired or wireless propagation medium.



FIG. 8a illustrates exemplary mobile telephone 836 that can include touch sensor panel 824 and computing system 842 for implementing the image jaggedness filter, global baseline offset, and motion component dominance factors described above according to embodiments of the invention.



FIG. 8b illustrates exemplary digital media player 840 that can include touch sensor panel 824 and computing system 842 for implementing the image jaggedness filter, global baseline offset, and motion component dominance factors described above according to embodiments of the invention.



FIG. 8c illustrates exemplary personal computer 844 that can include touch sensor panel (trackpad) 824 and computing system 842 for implementing the image jaggedness filter, global baseline offset, and motion component dominance factors described above according to embodiments of the invention. The mobile telephone, media player, and personal computer of FIGS. 8a, 8b and 8c can advantageously benefit from the image jaggedness filter, global baseline offset, and motion component dominance factors described above because implementation of these features can improve the normalized outputs of the touch sensor panel and the recognition of gestures.


Although embodiments of this invention have been fully described with reference to the accompanying drawings, it is to be noted that various changes and modifications will become apparent to those skilled in the art. Such changes and modifications are to be understood as being included within the scope of embodiments of this invention as defined by the appended claims.

Claims
  • 1. An apparatus for adjusting sensor output values captured on a touch panel, the apparatus comprising: scan logic couplable to the touch panel and configured for performing a scan of the touch panel to capture the sensor output values for a plurality of sensors on the touch panel; and a processor coupled to the scan logic and configured for normalizing the sensor output values using a plurality of stored local baseline offset values, identifying a first subset of normalized sensor output values, less than a full set of the normalized sensor output values, that are different from a sensor output baseline value, and identifying a first subset of sensors, less than a full set of the plurality of sensors, corresponding to the first subset of normalized sensor output values, and selectively adjusting the sensor output values by individually adjusting a first subset of local baseline offset values, less than a full set of the plurality of stored local baseline offset values and corresponding to the first subset of sensors, and applying the plurality of stored local baseline offset values including the individually adjusted first subset of local baseline offset values to sensor output values captured in subsequent scans of the touch panel.
  • 2. The apparatus of claim 1, the processor further configured for individually adjusting the first subset of local baseline offset values by: increasing each local baseline offset value in the first subset of local baseline offset values whose corresponding normalized sensor output value was greater than the sensor output baseline value; and decreasing each local baseline offset value in the first subset of local baseline offset values whose corresponding normalized sensor output value was less than the sensor output baseline value.
  • 3. The apparatus of claim 2, wherein the increase or decrease in each local baseline offset value is an amount of adjustment equal to a difference between each local baseline offset value and the sensor output baseline value.
  • 4. The apparatus of claim 2, the processor further configured for increasing or decreasing each local baseline offset value by an amount equal to a fraction of the difference between each local baseline offset value and the sensor output baseline value, and repeating this adjustment over time.
  • 5. The apparatus of claim 1, the apparatus comprising at least one of a computing device, a media player, or a mobile telephone.
  • 6. A method for adjusting sensor output values captured on a touch panel, the method comprising: capturing the sensor output values for a plurality of sensors on the touch panel; normalizing the sensor output values using a plurality of stored local baseline offset values; identifying a first subset of normalized sensor output values, less than a full set of the normalized sensor output values, that are different from a sensor output baseline value, and identifying a first subset of sensors, less than a full set of the plurality of sensors, corresponding to the first subset of normalized sensor output values; and selectively adjusting the sensor output values by individually adjusting a first subset of local baseline offset values, less than a full set of the plurality of stored local baseline offset values and corresponding to the first subset of sensors, and applying the plurality of stored local baseline offset values including the individually adjusted first subset of local baseline offset values to sensor output values captured in subsequent scans of the touch panel.
  • 7. The method of claim 6, further comprising individually adjusting the first subset of local baseline offset values by: increasing each local baseline offset value in the first subset of local baseline offset values whose corresponding normalized sensor output value was greater than the sensor output baseline value; and decreasing each local baseline offset value in the first subset of local baseline offset values whose corresponding normalized sensor output value was less than the sensor output baseline value.
  • 8. The method of claim 7, wherein the increase or decrease in each local baseline offset value is an amount of adjustment equal to a difference between each local baseline offset value and the sensor output baseline value.
  • 9. The method of claim 7, further comprising increasing or decreasing each local baseline offset value by an amount equal to a fraction of the difference between each local baseline offset value and the sensor output baseline value, and repeating this adjustment over time.
  • 10. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by one or more processors of an electronic device, cause the electronic device to perform a method comprising: capturing the sensor output values for a plurality of sensors on the touch panel; normalizing the sensor output values using a plurality of stored local baseline offset values; identifying a first subset of normalized sensor output values, less than a full set of the normalized sensor output values, that are different from a sensor output baseline value, and identifying a first subset of sensors, less than a full set of the plurality of sensors, corresponding to the first subset of normalized sensor output values; and selectively adjusting the sensor output values by individually adjusting a first subset of local baseline offset values, less than a full set of the plurality of stored local baseline offset values and corresponding to the first subset of sensors, and applying the plurality of stored local baseline offset values including the individually adjusted first subset of local baseline offset values to sensor output values captured in subsequent scans of the touch panel.
  • 11. The non-transitory computer readable storage medium of claim 10, the method further comprising individually adjusting the first subset of local baseline offset values by: increasing each local baseline offset value in the first subset of local baseline offset values whose corresponding normalized sensor output value was greater than the sensor output baseline value; and decreasing each local baseline offset value in the first subset of local baseline offset values whose corresponding normalized sensor output value was less than the sensor output baseline value.
  • 12. The non-transitory computer readable storage medium of claim 11, wherein the increase or decrease in each local baseline offset value is an amount of adjustment equal to a difference between each local baseline offset value and the sensor output baseline value.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 15/006,987, filed Jan. 26, 2016 and published on Jun. 2, 2016 as U.S. Patent Publication No. 2016/0154529, which claims the benefit of U.S. patent application Ser. No. 12/238,342, filed Sep. 25, 2008 and published on Jul. 9, 2009 as U.S. Patent Publication No. 2009/0174676, which claims the benefit of U.S. Provisional Patent Application No. 61/019,222 filed on Jan. 4, 2008, the contents of which are incorporated herein by reference in their entirety for all purposes.

20130127739 Guard et al. May 2013 A1
20130141383 Woolley Jun 2013 A1
20130154996 Trend et al. Jun 2013 A1
20130173211 Hoch et al. Jul 2013 A1
20130176271 Sobel et al. Jul 2013 A1
20130176273 Li et al. Jul 2013 A1
20130215049 Lee Aug 2013 A1
20130215075 Lee et al. Aug 2013 A1
20130224370 Cok et al. Aug 2013 A1
20130234964 Kim et al. Sep 2013 A1
20130257785 Brown et al. Oct 2013 A1
20130257797 Wu et al. Oct 2013 A1
20130257798 Tamura et al. Oct 2013 A1
20130265276 Obeidat et al. Oct 2013 A1
20130271427 Benbasat et al. Oct 2013 A1
20130278447 Kremin Oct 2013 A1
20130278498 Jung et al. Oct 2013 A1
20130278525 Lim et al. Oct 2013 A1
20130278543 Hsu Oct 2013 A1
20130307821 Kogo Nov 2013 A1
20130308031 Theuwissen Nov 2013 A1
20130314342 Kim et al. Nov 2013 A1
20130320994 Brittain et al. Dec 2013 A1
20130321289 Dubery et al. Dec 2013 A1
20130328759 Al-Dahle et al. Dec 2013 A1
20130342479 Pyo et al. Dec 2013 A1
20140002406 Cormier et al. Jan 2014 A1
20140009438 Liu et al. Jan 2014 A1
20140022186 Hong et al. Jan 2014 A1
20140022201 Boychuk Jan 2014 A1
20140043546 Yamazaki et al. Feb 2014 A1
20140070823 Roziere Mar 2014 A1
20140071084 Sugiura Mar 2014 A1
20140078096 Tan et al. Mar 2014 A1
20140098051 Hong et al. Apr 2014 A1
20140104194 Davidson et al. Apr 2014 A1
20140104225 Davidson et al. Apr 2014 A1
20140104228 Chen et al. Apr 2014 A1
20140111707 Song et al. Apr 2014 A1
20140118270 Moses May 2014 A1
20140125628 Yoshida et al. May 2014 A1
20140132534 Kim May 2014 A1
20140132560 Huang et al. May 2014 A1
20140132860 Hotelling et al. May 2014 A1
20140145997 Tiruvuru May 2014 A1
20140152615 Chang et al. Jun 2014 A1
20140160058 Chen et al. Jun 2014 A1
20140160376 Wang et al. Jun 2014 A1
20140168540 Wang et al. Jun 2014 A1
20140192027 Ksondzyk et al. Jul 2014 A1
20140204043 Lin et al. Jul 2014 A1
20140204058 Huang et al. Jul 2014 A1
20140210779 Katsuta et al. Jul 2014 A1
20140232681 Yeh Aug 2014 A1
20140232955 Roudbari et al. Aug 2014 A1
20140240291 Nam Aug 2014 A1
20140247245 Lee Sep 2014 A1
20140253470 Havilio Sep 2014 A1
20140267070 Shahparnia et al. Sep 2014 A1
20140267128 Bulea et al. Sep 2014 A1
20140267146 Chang et al. Sep 2014 A1
20140267165 Roziere Sep 2014 A1
20140285469 Wright et al. Sep 2014 A1
20140306924 Lin Oct 2014 A1
20140347574 Tung et al. Nov 2014 A1
20140354301 Trend Dec 2014 A1
20140362030 Mo et al. Dec 2014 A1
20140362034 Mo et al. Dec 2014 A1
20140368436 Abzarian et al. Dec 2014 A1
20140368460 Mo et al. Dec 2014 A1
20140375598 Shen et al. Dec 2014 A1
20140375603 Hotelling et al. Dec 2014 A1
20140375903 Westhues et al. Dec 2014 A1
20150002176 Kwon et al. Jan 2015 A1
20150002448 Brunet et al. Jan 2015 A1
20150002464 Nishioka et al. Jan 2015 A1
20150009421 Choi Jan 2015 A1
20150015528 Vandermeijden Jan 2015 A1
20150026398 Kim Jan 2015 A1
20150042600 Lukanc et al. Feb 2015 A1
20150042607 Takanohashi Feb 2015 A1
20150049043 Yousefpor Feb 2015 A1
20150049044 Yousefpor Feb 2015 A1
20150062063 Cheng et al. Mar 2015 A1
20150077375 Hotelling et al. Mar 2015 A1
20150077394 Dai et al. Mar 2015 A1
20150091587 Shepelev et al. Apr 2015 A1
20150091849 Ludden Apr 2015 A1
20150103047 Hanauer et al. Apr 2015 A1
20150116263 Kim Apr 2015 A1
20150123939 Kim et al. May 2015 A1
20150227240 Hong et al. Aug 2015 A1
20150242028 Roberts et al. Aug 2015 A1
20150248177 Maharyta Sep 2015 A1
20150253907 Elias Sep 2015 A1
20150268789 Liao et al. Sep 2015 A1
20150268795 Kurasawa et al. Sep 2015 A1
20150309610 Rabii et al. Oct 2015 A1
20150324035 Yuan et al. Nov 2015 A1
20150338937 Shepelev et al. Nov 2015 A1
20150370387 Yamaguchi et al. Dec 2015 A1
20150378465 Shih et al. Dec 2015 A1
20160011702 Shih Jan 2016 A1
20160018348 Yau et al. Jan 2016 A1
20160022218 Hayes et al. Jan 2016 A1
20160034102 Roziere et al. Feb 2016 A1
20160041629 Rao Feb 2016 A1
20160048234 Chandran et al. Feb 2016 A1
20160062533 O'Connor Mar 2016 A1
20160077667 Chiang Mar 2016 A1
20160117032 Lin et al. Apr 2016 A1
20160139728 Jeon et al. May 2016 A1
20160154505 Chang Jun 2016 A1
20160154529 Westerman Jun 2016 A1
20160170533 Roziere Jun 2016 A1
20160195954 Wang et al. Jul 2016 A1
20160216808 Hotelling et al. Jul 2016 A1
20160224177 Krah Aug 2016 A1
20160224189 Yousefpor et al. Aug 2016 A1
20160246423 Fu Aug 2016 A1
20160253041 Park Sep 2016 A1
20160259448 Guarneri Sep 2016 A1
20160266676 Wang et al. Sep 2016 A1
20160266679 Shahparnia et al. Sep 2016 A1
20160282980 Chintalapoodi Sep 2016 A1
20160283023 Shin et al. Sep 2016 A1
20160299603 Tsujioka et al. Oct 2016 A1
20160357344 Benbasat et al. Dec 2016 A1
20170060318 Gu et al. Mar 2017 A1
20170090599 Kuboyama et al. Mar 2017 A1
20170090619 Yousefpor Mar 2017 A1
20170090622 Badaye et al. Mar 2017 A1
20170097703 Lee Apr 2017 A1
20170108968 Roziere Apr 2017 A1
20170139539 Yao et al. May 2017 A1
20170168626 Konicek Jun 2017 A1
20170220156 Blondin et al. Aug 2017 A1
20170229502 Liu Aug 2017 A1
20170269729 Chintalapoodi Sep 2017 A1
20170285804 Yingxuan et al. Oct 2017 A1
20170315646 Roziere Nov 2017 A1
20170357371 Kim Dec 2017 A1
20180067584 Zhu et al. Mar 2018 A1
20180224962 Mori Aug 2018 A1
20180275824 Li et al. Sep 2018 A1
20180307374 Shah et al. Oct 2018 A1
20180307375 Shah et al. Oct 2018 A1
20180367139 Pribisic et al. Dec 2018 A1
20190138152 Yousefpor et al. May 2019 A1
20190220115 Mori et al. Jul 2019 A1
20200333902 Li et al. Oct 2020 A1
20200341585 Li et al. Oct 2020 A1
20200387259 Krah Dec 2020 A1
Foreign Referenced Citations (175)
Number Date Country
1202254 Dec 1998 CN
1246638 Mar 2000 CN
1527274 Sep 2004 CN
1672119 Sep 2005 CN
1689677 Nov 2005 CN
1711520 Dec 2005 CN
1782837 Jun 2006 CN
1818842 Aug 2006 CN
1864124 Nov 2006 CN
1945516 Apr 2007 CN
101046720 Oct 2007 CN
101071354 Nov 2007 CN
101122838 Feb 2008 CN
101349957 Jan 2009 CN
101419516 Apr 2009 CN
201218943 Apr 2009 CN
101840293 Sep 2010 CN
102023768 Apr 2011 CN
102411460 Apr 2012 CN
103049148 Apr 2013 CN
103052930 Apr 2013 CN
103221910 Jul 2013 CN
103258492 Aug 2013 CN
103294321 Sep 2013 CN
103365500 Oct 2013 CN
103365506 Oct 2013 CN
103577008 Feb 2014 CN
103809810 May 2014 CN
103885627 Jun 2014 CN
104020908 Sep 2014 CN
104142757 Nov 2014 CN
104252266 Dec 2014 CN
105045446 Nov 2015 CN
102648446 Jan 2016 CN
105278739 Jan 2016 CN
105474154 Apr 2016 CN
105824461 Aug 2016 CN
11 2008 001 245 Mar 2010 DE
102011089693 Jun 2013 DE
112012004912 Aug 2014 DE
0 853 230 Jul 1998 EP
1 192 585 Apr 2002 EP
1 192 585 Apr 2002 EP
1 573 706 Feb 2004 EP
1 573 706 Feb 2004 EP
1 455 264 Sep 2004 EP
1 455 264 Sep 2004 EP
1 644 918 Dec 2004 EP
1 717 677 Nov 2006 EP
1 717 677 Nov 2006 EP
1745356 Jan 2007 EP
1918803 May 2008 EP
1 986 084 Oct 2008 EP
2 077 489 Jul 2009 EP
2144146 Jan 2010 EP
2148264 Jan 2010 EP
2224277 Sep 2010 EP
2 256 606 Dec 2010 EP
1455264 May 2011 EP
2756048 May 1998 FR
2896595 Jul 2007 FR
2949008 Feb 2011 FR
3004551 Oct 2014 FR
1 546 317 May 1979 GB
2 144 146 Feb 1985 GB
2 428 306 Jan 2007 GB
2 437 827 Nov 2007 GB
2 450 207 Dec 2008 GB
10-505183 May 1998 JP
2000-163031 Jun 2000 JP
3134925 Feb 2001 JP
2002-342033 Nov 2002 JP
2003-066417 Mar 2003 JP
2004-503835 Feb 2004 JP
2004-526265 Aug 2004 JP
2005-30901 Feb 2005 JP
2005-084128 Mar 2005 JP
2005-301373 Oct 2005 JP
2006-251927 Sep 2006 JP
2007-018515 Jan 2007 JP
2008-510251 Apr 2008 JP
2008-117371 May 2008 JP
2008-225415 Sep 2008 JP
2009-86240 Apr 2009 JP
2009-157373 Jul 2009 JP
2010-528186 Aug 2010 JP
10-2004-0002983 Jan 2004 KR
10-20040091728 Oct 2004 KR
10-20070002327 Jan 2007 KR
10-2008-0019125 Mar 2008 KR
10-2008-0041278 May 2008 KR
10-2011-0044670 Apr 2011 KR
10-2012-0085737 Aug 2012 KR
10-2013-0094495 Aug 2013 KR
10-2013-0117499 Oct 2013 KR
10-2014-0074454 Jun 2014 KR
10-1609992 Apr 2016 KR
200715015 Apr 2007 TW
200826032 Jun 2008 TW
2008-35294 Aug 2008 TW
M341273 Sep 2008 TW
M344522 Nov 2008 TW
M344544 Nov 2008 TW
M352721 Mar 2009 TW
201115442 May 2011 TW
201203069 Jan 2012 TW
201401129 Jan 2014 TW
201419071 May 2014 TW
WO-9935633 Jul 1999 WO
WO-9935633 Jul 1999 WO
2000073984 Dec 2000 WO
WO-01097204 Dec 2001 WO
2002080637 Oct 2002 WO
2003079176 Sep 2003 WO
2004013833 Feb 2004 WO
2004114265 Dec 2004 WO
2004013833 Aug 2005 WO
WO-2005114369 Dec 2005 WO
WO-2005114369 Dec 2005 WO
WO-2006020305 Feb 2006 WO
WO-2006020305 Feb 2006 WO
WO-2006023147 Mar 2006 WO
WO-2006023147 Mar 2006 WO
WO-2006104745 Oct 2006 WO
WO-2006104745 Oct 2006 WO
2006126703 Nov 2006 WO
WO-2006130584 Dec 2006 WO
WO-2006130584 Dec 2006 WO
WO-2007012899 Feb 2007 WO
WO-2007034591 Mar 2007 WO
2007054018 May 2007 WO
WO-2007066488 Jun 2007 WO
WO-2007089766 Aug 2007 WO
WO-2007089766 Aug 2007 WO
WO-2007115032 Oct 2007 WO
2007146780 Dec 2007 WO
WO-2007146785 Dec 2007 WO
WO-2007146785 Dec 2007 WO
2007115032 Jan 2008 WO
2008000964 Jan 2008 WO
WO-2008007118 Jan 2008 WO
WO-2008007118 Jan 2008 WO
2008030780 Mar 2008 WO
WO-2008047990 Apr 2008 WO
WO-2008076237 Jun 2008 WO
2008076237 Aug 2008 WO
2007146780 Sep 2008 WO
WO-2008108514 Sep 2008 WO
WO-2008135713 Nov 2008 WO
WO-2009046363 Apr 2009 WO
WO-2009103946 Aug 2009 WO
WO-2009132146 Oct 2009 WO
WO-2009132150 Oct 2009 WO
WO-2010088659 Aug 2010 WO
WO-2010117882 Oct 2010 WO
2011015795 Feb 2011 WO
2011071784 Jun 2011 WO
2011015795 Jul 2011 WO
WO-2011137200 Nov 2011 WO
2013093327 Jun 2013 WO
WO-2013158570 Oct 2013 WO
2014105942 Jul 2014 WO
WO-2014127716 Aug 2014 WO
WO-2015017196 Feb 2015 WO
WO-2015023410 Feb 2015 WO
WO-2015072722 May 2015 WO
WO-2015107969 Jul 2015 WO
WO-2015178920 Nov 2015 WO
WO-2016048269 Mar 2016 WO
2016066282 May 2016 WO
WO-2016069642 May 2016 WO
WO-2016126525 Aug 2016 WO
WO-2016144437 Sep 2016 WO
2017058413 Apr 2017 WO
WO-2017058415 Apr 2017 WO
Non-Patent Literature Citations (242)
Entry
First Action Interview Pilot Program Pre-Interview Communication, dated Apr. 4, 2019, for U.S. Appl. No. 15/686,969, filed Aug. 25, 2017, three pages.
Notice of Allowance dated Apr. 3, 2019, for U.S. Appl. No. 15/687,078, filed Aug. 25, 2017, eight pages.
Non-Final Office Action dated Jan. 2, 2019, for U.S. Appl. No. 15/522,737, filed Apr. 27, 2017, thirteen pages.
Non-Final Office Action dated Jan. 18, 2019, for U.S. Appl. No. 14/993,017, filed Jan. 11, 2016, 34 pages.
Non-Final Office Action dated Jan. 18, 2019, for U.S. Appl. No. 15/087,956, filed Mar. 31, 2016, twelve pages.
Notice of Allowance dated Dec. 31, 2018, for U.S. Appl. No. 14/318,157, filed Jun. 27, 2014, eight pages.
Notice of Allowance dated Mar. 11, 2019, for U.S. Appl. No. 15/087,956, filed Mar. 31, 2016, ten pages.
Non-Final Office Action dated Dec. 21, 2018, for U.S. Appl. No. 15/313,549, filed Nov. 22, 2016, thirteen pages.
Final Office Action dated Feb. 6, 2019, for U.S. Appl. No. 15/009,774, filed Jan. 28, 2016, fifteen pages.
Non-Final Office Action dated Feb. 11, 2019 , for U.S. Appl. No. 15/507,722, filed Feb. 28, 2017, fifteen pages.
Cassidy, R. (Feb. 23, 2007). “The Tissot T-Touch Watch—A Groundbreaking Timepiece,” located at <http://ezinearticles.com/?The-Tissot-T-Touch-Watch---A-Groundbreaking-Timepiece&id . . . >, last visited Jan. 23, 2009, two pages.
Chinese Search Report completed Dec. 14, 2011, for CN Patent Application No. ZL201020108330X, filed Feb. 2, 2010, with English Translation, 12 pages.
Chinese Search Report completed May 18, 2015, for CN Patent Application No. 201310042816.6, filed Feb. 2, 2010, two pages.
European Search Report dated Jul. 21, 2010, for EP Patent Application 10151969.2, three pages.
European Search Report dated Apr. 25, 2012, for EP Patent Application No. 08022505.5, 12 pages.
European Search Report dated Dec. 3, 2012, for EP Patent Application No. 12162177.5, seven pages.
European Search Report dated Feb. 13, 2013, for EP Patent Application No. 12192450.0, six pages.
European Search Report dated Aug. 31, 2015, for EP Application No. 15166813.4, eight pages.
European Search Report dated Jul. 27, 2017, for EP Application No. 14902458.0, four pages.
European Search Report dated Jan. 31, 2018, for EP Application No. 17183937.6, four pages.
Final Office Action dated Jan. 5, 2012, for U.S. Appl. No. 12/206,680, filed Sep. 8, 2008, 15 pages.
Final Office Action dated Jan. 3, 2013, for U.S. Appl. No. 11/818,498, filed Jun. 13, 2007, 17 pages.
Final Office Action dated Feb. 1, 2013, for U.S. Appl. No. 12/642,466, filed Dec. 18, 2009, nine pages.
Final Office Action dated Feb. 5, 2013, for U.S. Appl. No. 12/500,911, filed Jul. 10, 2009, 15 pages.
Final Office Action dated Apr. 30, 2013, for U.S. Appl. No. 12/494,173, filed Jun. 29, 2009, 7 pages.
Final Office Action dated May 22, 2013, for U.S. Appl. No. 12/206,680, filed Sep. 8, 2008, 16 pages.
Final Office Action dated Jun. 21, 2013, for U.S. Appl. No. 12/545,754, filed Aug. 21, 2009, 6 pages.
Final Office Action dated Jul. 19, 2013, for U.S. Appl. No. 12/545,604, filed Aug. 21, 2009, 17 pages.
Final Office Action dated Aug. 12, 2013, for U.S. Appl. No. 12/238,333, filed Sep. 25, 2008, 19 pages.
Final Office Action dated Aug. 13, 2013, for U.S. Appl. No. 12/238,342, filed Sep. 25, 2008, 14 pages.
Final Office Action dated Jan. 27, 2014, for U.S. Appl. No. 12/206,680, filed Sep. 8, 2008, 20 pages.
Final Office Action dated Apr. 23, 2014 for U.S. Appl. No. 12/847,987, filed Jul. 30, 2010, 16 pages.
Final Office Action dated May 9, 2014, for U.S. Appl. No. 12/642,466, filed Dec. 18, 2009, 13 pages.
Final Office Action dated Jul. 16, 2014, for U.S. Appl. No. 12/545,604, filed Aug. 21, 2009, 18 pages.
Final Office Action dated Oct. 22, 2014, for U.S. Appl. No. 12/238,342, filed Sep. 25, 2008, 16 pages.
Final Office Action dated Oct. 22, 2014, for U.S. Appl. No. 13/448,182, filed Apr. 16, 2012, 11 pages.
Final Office Action dated Apr. 22, 2015, for U.S. Appl. No. 12/238,333, filed Sep. 25, 2008, 23 pages.
Final Office Action dated Jun. 11, 2015, for U.S. Appl. No. 13/448,182, filed Apr. 16, 2012, 12 pages.
Final Office Action dated Nov. 12, 2015, for U.S. Appl. No. 14/082,074, filed Nov. 15, 2013, 22 pages.
Final Office Action dated Jan. 4, 2016, for U.S. Appl. No. 14/082,003, filed Nov. 15, 2013, 25 pages.
Final Office Action dated Jan. 29, 2016, for U.S. Appl. No. 12/642,466, filed Dec. 18, 2009, nine pages.
Final Office Action dated Apr. 8, 2016, for U.S. Appl. No. 13/899,391, filed May 21, 2013, ten pages.
Final Office Action dated May 9, 2016, for U.S. Appl. No. 14/318,157, filed Jun. 27, 2014, ten pages.
Final Office Action dated May 27, 2016, for U.S. Appl. No. 14/645,120, filed Mar. 11, 2015, twelve pages.
Final Office Action dated Jun. 14, 2016, for U.S. Appl. No. 14/550,686, filed Nov. 21, 2014, ten pages.
Final Office Action dated Sep. 29, 2016, for U.S. Appl. No. 14/558,529, filed Dec. 2, 2014, 22 pages.
Final Office Action dated Nov. 4, 2016, for U.S. Appl. No. 14/082,003, filed Nov. 15, 2013, 18 pages.
Final Office Action dated Jul. 26, 2017, for U.S. Appl. No. 14/318,157, filed Jun. 27, 2014, 10 pages.
Final Office Action dated Aug. 10, 2017, for U.S. Appl. No. 14/645,120, filed Mar. 11, 2015, twelve pages.
Final Office Action dated Aug. 21, 2017, for U.S. Appl. No. 14/550,686, filed Nov. 21, 2014, 11 pages.
Final Office Action dated Dec. 5, 2017, for U.S. Appl. No. 15/006,987, filed Jan. 26, 2016, 16 pages.
Final Office Action dated May 14, 2018, for U.S. Appl. No. 15/006,987, filed Jan. 26, 2016, 11 pages.
Final Office Action dated May 17, 2018, for U.S. Appl. No. 15/017,463, filed Feb. 5, 2016, 22 pages.
Final Office Action dated Jul. 27, 2018, for U.S. Appl. No. 15/097,179, filed Apr. 12, 2016, 11 pages.
Final Office Action dated Aug. 16, 2018, for U.S. Appl. No. 14/993,017, filed Jan. 11, 2016, 35 pages.
International Search Report dated Mar. 10, 2010, for PCT Application No. PCT/US2010/22868, filed Feb. 2, 2010, three pages.
International Search Report dated Jan. 14, 2011, for PCT Application No. PCT/US2010/029698, filed Apr. 1, 2010, 4 pages.
International Search Report dated May 2, 2011, for PCT Application No. PCT/US2010/058988, filed Dec. 3, 2010, five pages.
International Search Report dated Aug. 6, 2013, for PCT Application No. PCT/US2013/036662, filed Apr. 15, 2013, three pages.
International Search Report dated Jan. 29, 2015, for PCT Application No. PCT/US2014/047888, filed Jul. 23, 2014, six pages.
International Search Report dated May 9, 2016, for PCT Application No. PCT/US2016/015479, filed Jan. 28, 2016, five pages.
International Search Report dated May 11, 2016, for PCT Application No. PCT/US2016/016011, filed Feb. 1, 2016, six pages.
Lee, S.K. et al. (Apr. 1985). “A Multi-Touch Three Dimensional Touch-Sensitive Tablet,” Proceedings of CHI: ACM Conference on Human Factors in Computing Systems, pp. 21-25.
Malik, S. et al. (2004). “Visual Touchpad: A Two-Handed Gestural Input Device,” Proceedings of the 6th International Conference on Multimodal Interfaces, State College, PA, Oct. 13-15, 2004, ICMI '04, ACM pp. 289-296.
Non-Final Office Action dated Feb. 4, 2011, for U.S. Appl. No. 12/038,760, filed Feb. 27, 2008, 18 pages.
Non-Final Office Action dated Jun. 9, 2011, for U.S. Appl. No. 12/206,680, filed Sep. 8, 2008, 13 pages.
Non-Final Office Action dated Mar. 9, 2012, for U.S. Appl. No. 12/238,342, filed Sep. 25, 2008, 26 pgs.
Non-Final Office Action dated May 3, 2012, for U.S. Appl. No. 12/238,333, filed Sep. 25, 2008, 22 pgs.
Non-Final Office Action dated May 25, 2012, for U.S. Appl. No. 11/818,498, filed Jun. 13, 2007, 16 pages.
Non-Final Office Action dated Jun. 7, 2012, for U.S. Appl. No. 12/500,911, filed Jul. 10, 2009, 16 pages.
Non-Final Office Action dated Aug. 28, 2012, for U.S. Appl. No. 12/642,466, filed Dec. 18, 2009, nine pages.
Non-Final Office Action dated Sep. 26, 2012, for U.S. Appl. No. 12/206,680, filed Sep. 8, 2008, 14 pages.
Non-Final Office Action dated Oct. 5, 2012, for U.S. Appl. No. 12/545,754, filed Aug. 21, 2009, 10 pages.
Non-Final Office Action dated Nov. 23, 2012, for U.S. Appl. No. 12/545,557, filed Aug. 21, 2009, 11 pages.
Non-Final Office Action dated Nov. 28, 2012, for U.S. Appl. No. 12/494,173, filed Jun. 29, 2009, six pages.
Non-Final Office Action dated Jan. 7, 2013, for U.S. Appl. No. 12/545,604, filed Aug. 21, 2009, 12 pages.
Non-Final Office Action dated Jan. 7, 2013, for U.S. Appl. No. 12/238,333, filed Sep. 25, 2008, 20 pgs.
Non-Final Office Action dated Feb. 15, 2013, for U.S. Appl. No. 12/238,342, filed Sep. 25, 2008, 15 pages.
Non-Final Office Action dated Mar. 29, 2013 for U.S. Appl. No. 13/737,779, filed Jan. 9, 2013, nine pages.
Non-Final Office Action dated Sep. 6, 2013, for U.S. Appl. No. 12/847,987, filed Jul. 30, 2010, 15 pages.
Non-Final Office Action dated Sep. 10, 2013, for U.S. Appl. No. 12/545,754, filed Aug. 21, 2009, six pages.
Non-Final Office Action dated Sep. 30, 2013, for U.S. Appl. No. 12/206,680, filed Sep. 8, 2008, 18 pages.
Non-Final Office Action dated Nov. 8, 2013, for U.S. Appl. No. 12/642,466, filed Dec. 18, 2009, 12 pages.
Non-Final Office Action dated Dec. 19, 2013, for U.S. Appl. No. 12/545,604, filed Aug. 21, 2009, 17 pages.
Non-Final Office Action dated Jan. 2, 2014, for U.S. Appl. No. 12/545,754, filed Aug. 21, 2009, 11 pages.
Non-Final Office Action dated Jan. 3, 2014 , for U.S. Appl. No. 12/545,557, filed Aug. 21, 2009, 9 pages.
Non-Final Office Action dated Jan. 31, 2014, for U.S. Appl. No. 13/448,182, filed Apr. 16, 2012, 18 pages.
Non-Final Office Action dated Mar. 12, 2014, for U.S. Appl. No. 12/238,342, filed Sep. 25, 2008, 15 pages.
Non-Final Office Action dated Apr. 10, 2014, for U.S. Appl. No. 14/055,717, filed Oct. 16, 2013, 10 pages.
Non-Final Office Action dated Sep. 18, 2014, for U.S. Appl. No. 12/238,333, filed Sep. 25, 2008, 21 pages.
Non-Final Office Action dated Apr. 10, 2015, for U.S. Appl. No. 14/082,074, filed Nov. 15, 2013, 23 pages.
Non-Final Office Action dated May 4, 2015, for U.S. Appl. No. 12/642,466, filed Dec. 18, 2009, nine pages.
Non-Final Office Action dated May 8, 2015, for U.S. Appl. No. 14/082,003, filed Nov. 15, 2013, 25 pages.
Non-Final Office Action dated Aug. 20, 2015, for U.S. Appl. No. 14/550,686, filed Nov. 21, 2014, ten pages.
Non-Final Office Action dated Oct. 5, 2015, for U.S. Appl. No. 13/899,391, filed May 21, 2013, ten pages.
Non-Final Office Action dated Oct. 6, 2015, for U.S. Appl. No. 14/318,157, filed Jun. 27, 2014, seven pages.
Non-Final Office Action dated Oct. 27, 2015, for U.S. Appl. No. 14/645,120, filed Mar. 11, 2015, eight pages.
Non-Final Office Action dated Apr. 14, 2016, for U.S. Appl. No. 14/558,529, filed Dec. 2, 2014, twenty pages.
Non-Final Office Action dated May 25, 2016, for U.S. Appl. No. 14/082,003, filed Nov. 15, 2013, 23 pages.
Non-Final Office Action dated Jun. 1, 2016, for U.S. Appl. No. 14/615,186, filed Feb. 5, 2015, eight pages.
Non-Final Office Action dated Dec. 14, 2016, for U.S. Appl. No. 14/550,686, filed Nov. 21, 2014, eight pages.
Non-Final Office Action dated Dec. 16, 2016, for U.S. Appl. No. 14/645,120, filed Mar. 11, 2015, ten pages.
Non-Final Office Action dated Dec. 19, 2016, for U.S. Appl. No. 14/318,157, filed Jun. 27, 2014, eleven pages.
Non-Final Office Action dated Mar. 13, 2017, for U.S. Appl. No. 14/082,003, filed Nov. 15, 2013, 20 pages.
Non-Final Office Action dated Apr. 7, 2017, for U.S. Appl. No. 15/144,706, filed May 2, 2016, eight pages.
Non-Final Office Action dated Jun. 14, 2017, for U.S. Appl. No. 15/006,987, filed Jan. 26, 2016, 14 pages.
Non-Final Office Action dated Jun. 26, 2017, for U.S. Appl. No. 14/558,529, filed Dec. 2, 2014, six pages.
Non-Final Office Action dated Sep. 14, 2017, for U.S. Appl. No. 15/017,463, filed Feb. 5, 2016, 22 pages.
Non-Final Office Action dated Dec. 22, 2017, for U.S. Appl. No. 14/993,017, filed Jan. 11, 2016, 23 pages.
Non-Final Office Action dated Jan. 22, 2018, for U.S. Appl. No. 15/097,179, filed Apr. 12, 2016, 11 pages.
Non-Final Office Action dated Apr. 3, 2018, for U.S. Appl. No. 14/318,157, filed Jun. 27, 2014, twelve pages.
Non-Final Office Action dated Jun. 20, 2018, for U.S. Appl. No. 15/009,774, filed Jan. 28, 2016, seventeen pages.
Notice of Allowance dated Jun. 10, 2013, for U.S. Appl. No. 12/545,557, filed Aug. 21, 2009, 9 pages.
Notice of Allowance dated Aug. 19, 2013, for U.S. Appl. No. 12/500,911, filed Jul. 10, 2009, six pages.
Notice of Allowance dated Sep. 3, 2013, for U.S. Appl. No. 13/737,779, filed Jan. 9, 2013, 10 pages.
Notice of Allowance dated Apr. 11, 2014, for U.S. Appl. No. 12/545,557, filed Aug. 21, 2009, 9 pages.
Notice of Allowance dated Aug. 21, 2014, for U.S. Appl. No. 12/545,754, filed Aug. 21, 2009, ten pages.
Notice of Allowance dated Oct. 15, 2014, for U.S. Appl. No. 12/494,173, filed Jun. 29, 2009, eight pages.
Notice of Allowance dated Nov. 7, 2014, for U.S. Appl. No. 14/055,717, filed Oct. 16, 2013, six pages.
Notice of Allowance dated Mar. 16, 2015, for U.S. Appl. No. 14/312,489, filed Jun. 23, 2014, eight pages.
Notice of Allowance dated Dec. 1, 2015, for U.S. Appl. No. 12/238,333, filed Sep. 25, 2008, nine pages.
Notice of Allowance dated Jan. 8, 2016, for U.S. Appl. No. 13/448,182, filed Apr. 16, 2012, nine pages.
Notice of Allowance dated Dec. 2, 2016, for U.S. Appl. No. 14/615,186, filed Feb. 5, 2015, seven pages.
Notice of Allowance dated Sep. 20, 2017, for U.S. Appl. No. 14/082,003, filed Nov. 15, 2013, eight pages.
Notice of Allowance dated Sep. 20, 2017, for U.S. Appl. No. 15/144,706, filed May 2, 2016, nine pages.
Notice of Allowance dated Oct. 3, 2017, for U.S. Appl. No. 14/082,003, filed Nov. 15, 2013, nine pages.
Notice of Allowance dated Oct. 13, 2017, for U.S. Appl. No. 14/558,529, filed Dec. 2, 2014, eight pages.
Notice of Allowance dated Feb. 9, 2018, for U.S. Appl. No. 14/550,686, filed Nov. 21, 2014, 11 pages.
Notice of Allowance dated Mar. 1, 2018, for U.S. Appl. No. 14/645,120, filed Mar. 11, 2015, five pages.
Rekimoto, J. (2002). "SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces," CHI 2002, Apr. 20-25, 2002, 4(1):113-120.
Rubine, D.H. (Dec. 1991). “The Automatic Recognition of Gestures,” CMU-CS-91-202, Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Computer Science at Carnegie Mellon University, 285 pages.
Rubine, D.H. (May 1992). “Combining Gestures and Direct Manipulation,” CHI '92, pp. 659-660.
Search Report dated Apr. 29, 2009, for NL Application No. 2001672, with English translation of Written Opinion, eight pages.
Search Report dated Oct. 14, 2015, for TW Application No. 103116003, one page.
Search Report dated Nov. 12, 2015, for ROC (Taiwan) Patent Application No. 103105965, with English translation, two pages.
TW Search Report dated May 3, 2016, for TW Application No. 104115152, one page.
Westerman, W. (Spring 1999). “Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface,” A Dissertation Submitted to the Faculty of the University of Delaware in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Electrical Engineering, 364 pages.
Wilson, A.D. (Oct. 15, 2006). "Robust Computer Vision-Based Detection of Pinching for One and Two-Handed Gesture Input," ACM, UIST '06, Montreux, Switzerland, Oct. 15-18, 2006, pp. 255-258.
Yang, J-H. et al. (Jul. 2013). "A Noise-Immune High-Speed Readout Circuit for In-Cell Touch Screen Panels," IEEE Transactions on Circuits and Systems I: Regular Papers 60(7):1800-1809.
Advisory Action received for U.S. Appl. No. 11/818,498, dated May 17, 2013, 5 pages.
Advisory Action received for U.S. Appl. No. 11/818,498, dated Oct. 14, 2011, 5 pages.
Advisory Action received for U.S. Appl. No. 12/206,680, dated Apr. 16, 2012, 3 pages.
Advisory Action received for U.S. Appl. No. 12/238,333, dated Dec. 17, 2013, 3 pages.
Advisory Action received for U.S. Appl. No. 12/238,333, dated Oct. 21, 2015, 4 pages.
Advisory Action received for U.S. Appl. No. 12/500,911, dated May 17, 2013, 3 pages.
Advisory Action received for U.S. Appl. No. 12/642,466, dated May 23, 2013, 2 pages.
Advisory Action received for U.S. Appl. No. 14/082,003, dated Mar. 10, 2016, 3 pages.
Advisory Action received for U.S. Appl. No. 14/645,120, dated Nov. 25, 2016, 3 pages.
Advisory Action received for U.S. Appl. No. 15/017,463, dated Aug. 8, 2018, 3 pages.
Decision to Grant received for European Patent Application No. 16704768.7, dated May 23, 2019, 1 page.
Final Office Action received for U.S. Appl. No. 11/818,498, dated Jun. 10, 2011, 16 pages.
Final Office Action received for U.S. Appl. No. 12/038,760, dated Jul. 23, 2013, 20 pages.
Final Office Action received for U.S. Appl. No. 12/038,760, dated Jun. 8, 2011, 21 pages.
Final Office Action received for U.S. Appl. No. 12/110,024, dated Dec. 24, 2012, 21 pages.
Final Office Action received for U.S. Appl. No. 12/110,024, dated Jan. 19, 2012, 12 pages.
Final Office Action received for U.S. Appl. No. 12/110,075, dated Aug. 31, 2012, 15 pages.
Final Office Action received for U.S. Appl. No. 12/333,250, dated Dec. 15, 2011, 13 pages.
Final Office Action received for U.S. Appl. No. 14/157,737, dated Aug. 31, 2015, 28 pages.
Final Office Action received for U.S. Appl. No. 14/997,031, dated Jun. 14, 2018, 19 pages.
Final Office Action received for U.S. Appl. No. 15/090,555, dated Aug. 29, 2018, 18 pages.
Final Office Action received for U.S. Appl. No. 15/228,942, dated Apr. 17, 2019, 9 pages.
Final Office Action received for U.S. Appl. No. 15/313,549, dated Dec. 18, 2019, 24 pages.
Final Office Action received for U.S. Appl. No. 15/507,722, dated Sep. 13, 2019, 18 pages.
Final Office Action received for U.S. Appl. No. 15/522,737, dated Sep. 12, 2019, 15 pages.
Final Office Action received for U.S. Appl. No. 16/201,730, dated Nov. 1, 2019, 11 pages.
First Action Interview Office Action received for U.S. Appl. No. 15/686,969, dated Aug. 19, 2019, 7 pages.
First Action Interview received for U.S. Appl. No. 15/228,942, dated Nov. 26, 2018, 5 pages.
Gibilisco, Stan, “The Illustrated Dictionary of Electronics”, Eighth Edition, p. 173.
Intention to Grant received for European Patent Application No. 15166813.4, dated Sep. 20, 2019, 8 pages.
International Preliminary Report on Patentability received for PCT Patent Application No. PCT/US2010/022868, dated Jan. 27, 2011, 10 pages.
International Search Report received for PCT Patent Application No. PCT/US2008/078836, dated Mar. 19, 2009, 4 pages.
International Search Report received for PCT Patent Application No. PCT/US2009/041460, dated Jul. 17, 2009, 3 pages.
International Search Report received for PCT Patent Application No. PCT/US2009/041465, dated Aug. 5, 2009, 4 pages.
International Search Report received for PCT Patent Application No. PCT/US2014/039245, dated Sep. 24, 2014, 3 pages.
International Search Report received for PCT Patent Application No. PCT/US2014/056795, dated Dec. 12, 2014, 3 pages.
International Search Report received for PCT Patent Application No. PCT/US2015/057644, dated Jan. 8, 2016, 3 pages.
International Search Report received for PCT Patent Application No. PCT/US2016/048694, dated Oct. 31, 2016, 6 pages.
Non-Final Office Action received for U.S. Appl. No. 11/818,498, dated Dec. 13, 2010, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 12/038,760, dated Jan. 2, 2013, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 12/110,024, dated Jul. 3, 2012, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 12/110,024, dated Jul. 11, 2011, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 12/110,075, dated Jan. 25, 2012, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 12/110,075, dated Jul. 8, 2011, 15 pages.
Non-Final Office Action received for U.S. Appl. No. 12/110,075, dated Mar. 28, 2013, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 12/333,250, dated Aug. 17, 2011, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 14/157,737, dated Feb. 10, 2015, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 15/009,774, dated Sep. 4, 2019, 18 pages.
Non-Final Office Action received for U.S. Appl. No. 15/017,463, dated May 15, 2019, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 15/090,555, dated Nov. 3, 2017, 28 pages.
Non-Final Office Action received for U.S. Appl. No. 15/313,549, dated Jul. 10, 2019, 25 pages.
Non-Final Office Action received for U.S. Appl. No. 16/201,730, dated May 10, 2019, 9 pages.
Notice of Allowance received for U.S. Appl. No. 12/038,760, dated Nov. 8, 2013, 15 pages.
Notice of Allowance received for U.S. Appl. No. 12/110,024, dated May 23, 2013, 5 pages.
Notice of Allowance received for U.S. Appl. No. 12/110,075, dated Aug. 19, 2013, 8 pages.
Notice of Allowance received for U.S. Appl. No. 12/333,250, dated Aug. 28, 2012, 10 pages.
Notice of Allowance received for U.S. Appl. No. 12/545,604, dated Oct. 5, 2015, 9 pages.
Notice of Allowance received for U.S. Appl. No. 14/157,737, dated Dec. 14, 2015, 5 pages.
Notice of Allowance received for U.S. Appl. No. 14/329,719, dated Nov. 2, 2015, 9 pages.
Notice of Allowance received for U.S. Appl. No. 14/993,017, dated Jul. 12, 2019, 10 pages.
Notice of Allowance received for U.S. Appl. No. 15/090,555, dated Feb. 12, 2019, 7 pages.
Notice of Allowance received for U.S. Appl. No. 15/228,942, dated Aug. 30, 2019, 12 pages.
Notice of Allowance received for U.S. Appl. No. 15/686,969, dated Jan. 2, 2020, 8 pages.
Notice of Allowance received for U.S. Appl. No. 15/691,283, dated Jun. 5, 2019, 10 pages.
Notification of Grant received for Korean Patent Application No. 10-2016-7003645, dated May 31, 2019, 3 pages (1 page of English Translation and 2 pages of Official Copy).
Notification to Grant received for Chinese Patent Application No. 201610790093.1, dated Apr. 30, 2019, 4 pages (2 pages of English Translation and 2 pages of Official Copy).
O'Connor, Todd, “mTouch Projected Capacitive Touch Screen Sensing Theory of Operation”, Microchip TB3064, Microchip Technology Inc., pp. 1-16.
Office Action received for Australian Patent Application No. 2019200698, dated Nov. 23, 2019, 3 pages.
Office Action received for Chinese Patent Application No. 201310330348.2, dated Nov. 3, 2015, 7 pages (4 pages of English Translation and 3 pages of Official Copy).
Office Action received for Chinese Patent Application No. 201480081612.6, dated Jun. 4, 2019, 22 pages (11 pages of English Translation and 11 pages of Official Copy).
Office Action received for Chinese Patent Application No. 201580058366.7, dated May 28, 2019, 19 pages (10 pages of English Translation and 9 pages of Official Copy).
Office Action received for Chinese Patent Application No. 201680012966.4, dated Nov. 1, 2019, 19 pages (10 pages of English Translation and 9 pages of Official Copy).
Preinterview First Office Action received for U.S. Appl. No. 15/228,942, dated Sep. 13, 2018, 4 pages.
Restriction Requirement received for U.S. Appl. No. 12/238,333, dated Mar. 8, 2012, 6 pages.
Restriction Requirement received for U.S. Appl. No. 12/494,173, dated Aug. 8, 2012, 5 pages.
Restriction Requirement received for U.S. Appl. No. 13/899,391, dated Apr. 8, 2015, 6 pages.
Restriction Requirement received for U.S. Appl. No. 15/087,956, dated Feb. 13, 2018, 8 pages.
Restriction Requirement received for U.S. Appl. No. 15/097,179, dated Sep. 28, 2017, 6 pages.
Restriction Requirement received for U.S. Appl. No. 15/228,942, dated Mar. 21, 2018, 6 pages.
Restriction Requirement received for U.S. Appl. No. 15/691,283, dated Mar. 5, 2019, 6 pages.
Search Report received for Chinese Patent Application No. 200820133814.2, dated Jan. 10, 2011, 25 pages.
Search Report received for Chinese Patent Application No. 200920008199.7, dated Jan. 7, 2011, 14 pages.
Search Report received for Chinese Patent Application No. ZL2009201524013, completed on Jun. 3, 2011, 20 pages.
Search Report received for European Patent Application No. 08017396.6, dated Mar. 19, 2009, 7 pages.
Search Report received for Great Britain Patent Application No. GB0817242.1, dated Jan. 19, 2009, 2 pages.
Search Report received for Great Britain Patent Application No. GB0817242.1, dated Jan. 19, 2010, 2 pages.
Written Opinion received for PCT Patent Application No. PCT/US2010/022868, dated Mar. 10, 2010, 4 pages.
Examiner's Answer to Appeal Brief received for U.S. Appl. No. 11/818,498, dated Dec. 20, 2013, 17 pages.
Extended European Search Report received for European Patent Application No. 18197785.1, dated Apr. 5, 2019, 8 pages.
Final Office Action received for U.S. Appl. No. 15/017,463, dated Feb. 13, 2020, 22 pages.
Non-Final Office Action received for U.S. Appl. No. 15/313,549, dated Apr. 23, 2020, 33 pages.
Notice of Allowance received for U.S. Appl. No. 15/009,774, dated Mar. 20, 2020, 16 pages.
Notice of Allowance received for U.S. Appl. No. 15/507,722, dated Feb. 27, 2020, 9 pages.
Notice of Allowance received for U.S. Appl. No. 15/522,737, dated Mar. 6, 2020, 8 pages.
Patent Board Decision received for U.S. Appl. No. 11/818,498, dated Nov. 2, 2016, 8 pages.
Search Report received for Chinese Patent Application No. 201680008313.9, dated Jul. 5, 2019, 4 pages (2 pages of English Translation and 2 pages of Official Copy).
Supplemental Notice of Allowance received for U.S. Appl. No. 15/686,969, dated Feb. 21, 2020, 2 pages.
Lowe, Doug, “Electronics Components: How to Use an Op Amp as a Voltage Comparator”, Dummies, Available online at :<https://www.dummies.com/programming/electronics/components/electronics-components-how-to-use-an-pp-amp-as-a-voltage-comparator/>, 2012, 9 pages.
Notice of Allowance received for U.S. Appl. No. 15/009,774, dated Jul. 1, 2020, 6 pages.
Notice of Allowance received for U.S. Appl. No. 15/313,549, dated Oct. 21, 2020, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 16/921,817, dated Sep. 22, 2021, 24 pages.
Non-Final Office Action received for U.S. Appl. No. 16/924,047, dated Sep. 24, 2021, 17 pages.
Non-Final Office Action received for U.S. Appl. No. 17/003,133, dated Aug. 3, 2021, 22 pages.
Related Publications (1)
  Number: 20190034032 A1; Date: Jan 2019; Country: US

Provisional Applications (1)
  Number: 61019222; Date: Jan 2008; Country: US

Continuations (2)
  Parent: 15006987, Jan 2016, US; Child: 16152326, US
  Parent: 12238342, Sep 2008, US; Child: 15006987, US