A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
The present invention relates generally to the field of touch screen technology and more particularly to the use of known non-random patterns of touch events to increase the accuracy of the analysis of touch screen events.
The subject matter discussed in the background section should not be assumed to be prior art merely as a result of its mention in the background section. Similarly, a problem mentioned in the background section or associated with the subject matter of the background section should not be assumed to have been previously recognized in the prior art. The subject matter in the background section merely represents different approaches, which in and of themselves may also be inventions.
Various electronic devices today are typically operated by a user interacting with a touch screen. This is particularly characteristic of the recent generation of smartphones. Typically, touch screen displays respond to finger contact to activate the display for further processes. Contact may also be made using tools such as a stylus, or with other parts of the hand such as the palm and various parts of the finger. In many of these systems, touch type classification accuracy is not 100%. Put simply, sometimes the input type is confused. Thus, there is a need for mechanisms to help reduce such errors.
Embodiments of the present invention include a method of analyzing touch screen events based on characterization of features derived from each touch event. The method comprises detecting touch events on a touch sensitive surface; generating vibro-acoustic waveform signals using at least one sensor detecting each such touch event; converting the waveform signals into at least one domain signal; extracting distinguishing features from said domain signal; and classifying said features to analyze the domain signal by employing spatiotemporal event data to weight the analysis of the touch events.
Other aspects and advantages of the present invention can be seen on review of the drawings, the detailed description and the claims, which follow.
The included drawings are for illustrative purposes and serve only to provide examples of possible structures and process steps for the disclosed techniques. These drawings in no way limit any changes in form and detail that may be made to embodiments by one skilled in the art without departing from the spirit and scope of the disclosure.
Embodiments of the present invention take advantage of the fact that humans and user interfaces generate non-random patterns of events. Put simply, after one touch event, certain actions are more likely to follow than others. For example, if the system sees a palm touch, which then disappears (presumably lifted from the screen), and then sees a new touch event in the same position within 300 ms, it is much more likely that the user has re-rested their palm than moved a finger underneath their previous palm position at high velocity. Another example is a double touch with the knuckle (i.e., a “double knock”). These usually occur in quick succession—“knock knock”—often within 500 ms and within the same region of the screen. Thus if a touch event is classified as a knock, and then within 500 ms a new event occurs in a similar location, but the classification confidence is low (e.g., 60% nail, 40% knuckle), the classifier may add weight to a knuckle classification since this touch sequence is far more likely. Put succinctly, knowledge about the probabilities of follow-on touch events can be used to bias subsequent classification, adding weight to particular events. Additionally, the events need not be of the same touch type. For example, if a double knock event is frequently followed by a fingertip swipe and rarely by a third knock, then when a third event occurs (meeting the necessary spatiotemporal criteria), the classifier will weight the fingertip swipe as the more likely classification. Such temporal patterns may be “learned” over a period of time in response to users' actual touch sequences, thereby developing a personalized history of touch sequences for each individual user.
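The following is a minimal sketch, in Python, of the biasing idea described above. The function name, confidence values, and follow-on likelihoods are hypothetical illustrations of the knock example, not the patented implementation:

```python
def bias_with_sequence_prior(confidences, follow_on_likelihood):
    """Weight raw per-type confidences by follow-on likelihoods and renormalize."""
    weighted = {t: confidences[t] * follow_on_likelihood[t] for t in confidences}
    total = sum(weighted.values())
    return {t: w / total for t, w in weighted.items()}

# A new event within 500 ms of a knock, classified 60% nail / 40% knuckle ...
confidences = {"nail": 0.60, "knuckle": 0.40}
# ... but a knock is far more often followed by another knock (hypothetical values).
follow_on = {"nail": 0.20, "knuckle": 0.80}

print(bias_with_sequence_prior(confidences, follow_on))
# {'nail': 0.27..., 'knuckle': 0.72...} -- knuckle now outweighs nail, matching
# the expectation that a second knock in quick succession is far more likely.
```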
Applications of methods and apparatus according to one or more embodiments are described in this section. These examples are being provided solely to add context and aid in the understanding of the present disclosure. It will thus be apparent to one skilled in the art that the techniques described herein may be practiced without some or all of these specific details. In other instances, well known process steps have not been described in detail in order to avoid unnecessarily obscuring the present disclosure. Other applications are possible, such that the following examples should not be taken as definitive or limiting either in scope or setting.
In the following detailed description, references are made to the accompanying drawings, which form a part of the description and in which are shown, by way of illustration, specific embodiments. Although these embodiments are described in sufficient detail to enable one skilled in the art to practice the disclosure, it is understood that these examples are not limiting, such that other embodiments may be used and changes may be made without departing from the spirit and scope of the disclosure.
One or more embodiments may be implemented in numerous ways, including as a process, an apparatus, a system, a device, a method, a computer readable medium such as a computer readable storage medium containing computer readable instructions or computer program code, or as a computer program product comprising a computer usable medium having a computer readable program code embodied therein.
The disclosed embodiments may include a method for analyzing touch screen events based on characterization of features derived from each touch event. The method comprises detecting touch events on a touch sensitive surface; generating vibro-acoustic waveform signals using at least one sensor detecting each such touch event; converting the waveform signals into at least one domain signal; extracting distinguishing features from the domain signal; and classifying the features to analyze the domain signal by employing spatiotemporal event data to weight the analysis.
The disclosed embodiments may include a method of analyzing touch screen events based on characterization of features derived from each touch event. The method comprises detecting touch events on a touch sensitive surface; generating vibro-acoustic waveform signals using at least one sensor associated with the touch events; converting the waveform signals into at least one domain signal; extracting distinguishing features from the domain signal; and classifying the features to analyze the domain signal by employing spatiotemporal event data to weight analysis of the touch events.
The disclosed embodiments may include a computer readable medium containing instructions for classifying multiple touch event sequences in touch screen devices to improve the accuracy of determining the characteristics of the touch events imparted to a touch screen. Execution of the program instructions by a processor may cause the processor to analyze each touch event in the sequence to determine, within a level of confidence, what hand part caused the touch event; determine likelihoods of possible hand part touch event sequences; and combine the level of confidence for each touch event in the sequence with the likelihoods of possible hand part sequences to produce a most likely result of what hand part actually generated each touch event in the sequence. The hand part may be at least one of a fingertip, a finger knuckle, a fingernail and a hand palm. The instructions to determine likelihoods of possible hand part touch event sequences may use elapsed time over the multiple touch event sequences to determine the likelihoods. The instructions to combine the level of confidence for each touch event in the sequence with the likelihoods of possible hand part sequences may include weighting the confidence levels by the likelihoods. The instructions to combine may include instructions to multiply the confidence levels by the likelihoods.
In general, when using a touch screen or touch-sensitive device, touch screen events may occur when contact is made with the touch screen using tools such as a stylus, other parts of the hand such as the palm, and various parts of the finger, i.e., pad, nail, knuckle, etc. Each such different type of touch mechanism produces a different type of digital signature. Moreover, each user of a touch screen device may have his or her own unique touch event characteristics resulting from anatomical differences such as fleshiness, finger size, finger shape, BMI and the like. These differences in touch event characteristics, whether the result of different user anatomies or different touch mechanisms, may be used advantageously to improve touch screen technology by enabling different types of touch, reducing ambiguities, distinguishing between users, responding only to intentional touch events and the like. Such advantageous uses are derived from sophisticated sensor-based analysis of the touch event coupled with one or more algorithms designed to provide further analytical characteristics otherwise hidden or not readily apparent in the data generated by the touch event. By way of example, one such apparatus is disclosed in pending U.S. patent application Ser. No. 14/483,150 filed on Sep. 11, 2014 by the Applicant hereof and entitled “METHOD AND APPARATUS FOR DIFFERENTIATING TOUCH SCREEN USERS BASED ON TOUCH EVENT ANALYSIS”. This co-pending application discloses that when a user touches a touch screen, a mechanical force is applied to the screen, resulting in mechanical vibrations that may be captured by a variety of sensors such as impact sensors, vibration sensors, accelerometers, strain gauges, or acoustic sensors such as a microphone.
Once the vibro-acoustic signal has been captured, it can be converted into a series of features, for example: Average acoustic power, Standard Deviation, Variance, Skewness, Kurtosis, Absolute sum, Root Mean Square (RMS), Dispersion, Zero-crossings, Spectral centroid, Spectral density, Linear Prediction-based Cepstral Coefficients (LPCC), Perceptual Linear Prediction (PLP) Cepstral Coefficients, Cepstrum Coefficients, Mel-Frequency Cepstral Coefficients (MFCC), and Frequency phases (e.g., as generated by an FFT).
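The following is a minimal sketch, assuming NumPy, of how a few of the listed features might be computed from a captured waveform buffer. The function name and the particular selection of features are illustrative, not the disclosed implementation:

```python
import numpy as np

def vibro_acoustic_features(waveform: np.ndarray) -> dict:
    """Compute a handful of the listed features from a 1-D waveform buffer."""
    x = waveform.astype(float)
    power = x ** 2
    spectrum = np.abs(np.fft.rfft(x)) ** 2            # spectral density
    freqs = np.fft.rfftfreq(len(x))                   # normalized frequencies
    signs = np.signbit(x)
    return {
        "average_power": power.mean(),
        "std_dev": x.std(),
        "variance": x.var(),
        "absolute_sum": np.abs(x).sum(),
        "rms": np.sqrt(power.mean()),
        "zero_crossings": int(np.count_nonzero(signs[:-1] != signs[1:])),
        "spectral_centroid": float((freqs * spectrum).sum() / spectrum.sum()),
    }
```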
Simultaneously, many touch screen technologies are able to digitize several aspects of a touch event, such as the shape, size, capacitance, orientation, pressure, etc. These aspects may be used as distinguishing features themselves, or further features can be derived from them.
Because human fingers vary in their anatomical composition, their acoustic and touch properties can vary between humans. Moreover, the way users touch a screen can also be distinguishing (e.g., which finger, which part of the finger, how flat, how hard). Thus, the vibro-acoustic features and touch features contain properties that can be characteristic of different users and of different parts of a user's hands (e.g., fingertip, knuckle, nail).
It is thus possible to provide a classifier, running on a touch computing device, that upon receipt of a touch event makes a guess about which user is operating the device, or whether the user is authorized or has any personalized features. Alternatively, it is also possible to provide a classifier that, upon receipt of a touch event, makes a guess about what part of the finger was used to contact the screen.
In one exemplary embodiment thereof, the disclosed process may include the following operations and components:
(a) a sensing system that may be configured to continuously sample vibro-acoustic data, saving it into a buffer. This buffer can be of many lengths, for example, 50 milliseconds;
(b) a touch sensitive screen that may be configured to wait for a touch event to occur. Any number of touch technologies may be used. The touch sensitive screen may be configured to operate in parallel with the sensing system;
(c) when the touch sensitive screen detects a touch event, it may be configured to trigger a conversion, feature extraction, and classification process;
(d) the data from the vibro-acoustic buffer is retrieved. Because touch screens typically have some latency, it may be necessary to look backwards in the buffer to find the vibro-acoustic waveform that corresponds to the touch impact (e.g., if the touch screen has a 20 ms latency, it may be necessary to look back in the buffer by 20 ms to find the corresponding vibro-acoustic event). All or part of the buffer may be saved and passed to the next operations (a buffering sketch follows this list);
(e) conversion operations may be performed next. The waveform from the sensor is a time-domain representation of the vibro-acoustic signal. In addition to saving the waveform, the signal is converted into other forms. This includes filtering the waveform and transforming it into other forms, including frequency domain representations;
(f) feature extraction operations may be performed next, where touch screen controller data and vibro-acoustic data are analyzed to extract features that characterize different users. For the vibro-acoustic data, features are computed for all representations of the signal;
(g) these features are then passed to a classification unit, which uses the information to label the touch event with a user (in addition to whatever the touch sensitive screen reports, e.g., X/Y position, major/minor axes, pressure, etc.);
(h) the augmented touch event is then passed to the OS or end user applications, to associate a user with the touch event.
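As a rough illustration of steps (a) and (d), the following is a minimal sketch assuming a simple Python ring buffer; the constants and helper names are hypothetical and are not drawn from the disclosure:

```python
import time
from collections import deque

SAMPLE_RATE_HZ = 96_000       # example rate used later in this disclosure
BUFFER_MS = 50                # example buffer length from step (a)
TOUCH_LATENCY_S = 0.020       # example 20 ms touch screen latency from step (d)

# Rolling buffer of (timestamp, sample) pairs, continuously filled per step (a).
buffer = deque(maxlen=SAMPLE_RATE_HZ * BUFFER_MS // 1000)

def on_sample(sample: float) -> None:
    buffer.append((time.monotonic(), sample))

def waveform_for_touch(touch_time: float, window_s: float = 0.010) -> list:
    """Look back by the screen latency to recover the impact waveform (step (d))."""
    impact = touch_time - TOUCH_LATENCY_S
    return [s for (t, s) in buffer
            if impact - window_s / 2 <= t <= impact + window_s / 2]
```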
For some embodiments, a classifier may be configured to use one or more of the following features to perform its operations: location of touch contact (2D, or 3D in the case of curved glass or other non-planar geometry), size of touch contact (some touch technologies provide an ellipse of the touch contact with major and minor axes), rotation of the touch contact, surface area of the touch contact (e.g., in square mm or pixels), pressure of touch (available on some touch systems), and shear of touch (“shear stress”, also called “tangential force” in the literature, arises from a force vector perpendicular to the surface normal of a touch screen; this is similar to normal stress—what is commonly called pressure—which arises from a force vector parallel to the surface normal), number of touch contacts, capacitance of touch (if using a capacitive touch screen), swept frequency capacitance of touch (if using a swept frequency capacitive touch screen), and swept frequency impedance of touch (if using a swept frequency capacitive touch screen). The computation phase may also compute the derivative of the above features over a short period of time, for example, touch velocity and pressure velocity. Other features that the classifier may also use include shape of touch (some touch technologies can provide the actual shape of the touch, and not just a circle or ellipse), and image of the hand pose (as imaged by, e.g., an optical sensor, diffuse illuminated surface with camera, or near-range capacitive sensing).
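For the derivative features mentioned above, a minimal sketch of a finite-difference computation (the helper name and values are hypothetical) might look like:

```python
def feature_velocity(values, timestamps):
    """Finite-difference derivative of any scalar feature stream, e.g.
    pressure readings -> pressure velocity (hypothetical helper)."""
    return [(v1 - v0) / (t1 - t0)
            for v0, v1, t0, t1 in zip(values, values[1:],
                                      timestamps, timestamps[1:])]

# Pressure samples over 30 ms -> pressure velocity between readings.
print(feature_velocity([0.10, 0.25, 0.40], [0.000, 0.015, 0.030]))
# [10.0, 10.0] (pressure units per second)
```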
The classification engine may use any number of approaches, including but not limited to basic heuristics, decision trees, Support Vector Machines, Random Forests, Naïve Bayes, elastic matching, dynamic time warping, template matching, k-means clustering, the K-nearest neighbors algorithm, neural networks, multilayer perceptrons, multinomial logistic regression, Gaussian mixture models, and AdaBoost. Additionally, the results from several different classifiers may be combined through, for example, a voting scheme.
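As one illustration of the voting scheme mentioned above, a simple majority vote over several classifiers' labels could look like the following sketch (the helper name is ours):

```python
from collections import Counter

def majority_vote(labels):
    """Combine the labels emitted by several classifiers by majority vote."""
    return Counter(labels).most_common(1)[0][0]

print(majority_vote(["knuckle", "knuckle", "fingertip"]))  # knuckle
```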
For some embodiments, it may be possible to use different classifiers based on one or more features. For example, two classifiers could be employed: one for processing sensor waveforms with a high standard deviation, and another for waveforms with a low standard deviation.
By employing such classification, touch screen technologies are able to classify what type of touch event occurred. For example, some screens differentiate between finger touches and touches by the palm (which are often rejected as inadvertent input). Alternatively, some systems differentiate between finger touches and those made with a stylus. In the case of the aforementioned disclosure, the system can differentiate between, e.g., fingertips, knuckles, nails, styluses and other implements.
Referring to
When an object strikes a certain material, vibro-acoustic waves propagate outward through the material or along the surface of the material. Typically, interactive surfaces use rigid materials, such as plastic or glass, which both quickly distribute and faithfully preserve the signal. As such, when one or more fingers touch or contact the surface of the touch screen 100, vibro-acoustic responses are produced. The vibro-acoustic characteristics of each user's fingers, with their unique anatomical characteristics, produce unique responses for each user.
Referring back to
The OS 130 runs the computing system so that the function can be activated in line with the classification of the vibro-acoustic signals and the corresponding user. The vibro-acoustic classifier 120 includes a segmentation unit 122 to segment the vibro-acoustic signal into a digital representation; a conversion unit 124 to convert the digitized vibro-acoustic signal into an electrical signal; a feature extraction unit 126 to derive a series of features from the electrical signal; and a classification unit 128 to classify touch characteristics using the above-described features to analyze the touch event as will be further described below.
The segmentation unit 122 samples the vibro-acoustic signal, for example, at a sampling rate of 96 kHz, using a sliding window of 4096 samples of the vibro-acoustic signal. The conversion unit 124 then performs, for example, a Fourier Transform on the sampled time-dependent vibro-acoustic signal to produce an electrical signal having a frequency domain representation. For example, the Fourier Transform of this window may produce 2048 bands of frequency power.
The vibro-acoustic classifier 120 may further down-sample this data into additional vectors (i.e., buckets of ten), providing a different aliasing. In addition, additional time-domain features may be calculated from the vibro-acoustic signal, such as the average absolute amplitude, total absolute amplitude, standard deviation of the absolute amplitude, the center of mass for both the segmented input signal and the Fourier Transformed signal, and zero crossings.
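A minimal NumPy sketch of this window-to-bands conversion and bucket down-sampling, with hypothetical helper names, might look like:

```python
import numpy as np

WINDOW = 4096  # sliding window at a 96 kHz sampling rate, per the text above

def frequency_power_bands(window_samples: np.ndarray) -> np.ndarray:
    """Fourier Transform of a 4096-sample window -> 2048 bands of frequency power."""
    spectrum = np.fft.fft(window_samples[:WINDOW])
    return np.abs(spectrum[:WINDOW // 2]) ** 2        # 2048 power bands

def bucket_downsample(bands: np.ndarray, bucket: int = 10) -> np.ndarray:
    """Down-sample the bands into buckets of ten, giving a different aliasing."""
    usable = len(bands) - len(bands) % bucket          # 2048 -> 2040 usable bands
    return bands[:usable].reshape(-1, bucket).mean(axis=1)

bands = frequency_power_bands(np.random.randn(WINDOW))
print(len(bands), len(bucket_downsample(bands)))       # 2048 204
```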
The feature extraction unit 126 may also calculate a series of features from the frequency domain representation of the vibro-acoustic signals, such as the fundamental frequency of the impact waveform. The classification unit 128 classifies the vibro-acoustic signal using the features to, for example, distinguish which user generated the touch event, so that the computing system may selectively activate a function related to the identified user depending on the classified vibro-acoustic signals. To aid classification, the user can provide supplemental training samples to the vibro-acoustic classifier 120.
For some embodiments, the classification unit 128 may be implemented with a support vector machine (SVM) for feature classification. The SVM is a supervised learning model with associated learning algorithms that analyze data and recognize patterns, used for classification and regression analysis. The classification process may be made more accurate by exploiting or leveraging known spatiotemporal patterns generated from historical data relating to sequences of multiple touch events within relatively short time windows. By way of example,
These likelihood figures are used herein for purposes of illustration only; however, one can see that for two touch events only 100 milliseconds apart, it is far more likely that both events come from the same hand part (i.e., both finger or both knuckle), simply because it is less likely for a user to change hand parts in such a short period of time. On the other hand, if the time period were greater (e.g., >500 ms), the greater opportunity to change hand parts in that longer period would raise the likelihood of a finger-knuckle or knuckle-finger interpretation.
In any event, once these event sequence weights are accessed, they may be used to alter the classification by combining classification confidences with such sequence weights. One example of such combining, for example by multiplication, is shown in
By carrying out the multiplications of
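A minimal sketch of such a multiplicative combination, using hypothetical confidence values and sequence weights (not the figures referenced above), might look like:

```python
def most_likely_sequence(confidences, sequence_weights):
    """Multiply per-event classification confidences by each hand-part
    sequence weight and return the highest-scoring interpretation."""
    best, best_score = None, 0.0
    for sequence, weight in sequence_weights.items():
        score = weight
        for event, part in zip(confidences, sequence):
            score *= event[part]
        if score > best_score:
            best, best_score = sequence, score
    return best, best_score

# Two events 100 ms apart (hypothetical confidences and sequence weights).
confidences = [{"finger": 0.55, "knuckle": 0.45},
               {"finger": 0.40, "knuckle": 0.60}]
weights = {("finger", "finger"): 0.45, ("knuckle", "knuckle"): 0.45,
           ("finger", "knuckle"): 0.05, ("knuckle", "finger"): 0.05}
print(most_likely_sequence(confidences, weights))
# (('knuckle', 'knuckle'), 0.1215) -- the same-hand-part reading dominates.
```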
These and other aspects of the disclosure may be implemented by various types of hardware, software, firmware, etc. For example, some features of the disclosure may be implemented, at least in part, by machine-readable media that include program instructions, state information, etc., for performing various operations described herein. Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher-level code that may be executed by the computer using an interpreter. Examples of machine-readable media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks; magneto-optical media; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (“ROM”) and random access memory (“RAM”).
Any of the above embodiments may be used alone or together with one another in any combination. Although various embodiments may have been motivated by various deficiencies with the prior art, which may be discussed or alluded to in one or more places in the specification, the embodiments do not necessarily address any of these deficiencies. In other words, different embodiments may address different deficiencies that may be discussed in the specification. Some embodiments may only partially address some deficiencies or just one deficiency that may be discussed in the specification, and some embodiments may not address any of these deficiencies.
While various embodiments have been described herein, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of the present application should not be limited by any of the embodiments described herein, but should be defined only in accordance with the following and later-submitted claims and their equivalents.
This application is a continuation of and claims the benefit of priority under 35 U.S.C. § 120 of U.S. patent application Ser. No. 14/495,041, filed 24 Sep. 2014, entitled, “METHOD FOR IMPROVING ACCURACY OF TOUCH SCREEN EVENT ANALYSIS BY USE OF SPATIOTEMPORAL TOUCH PATTERNS”, which application is incorporated herein by reference in its entirety for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
2008028 | McCortney | Jul 1935 | A |
2430005 | Denneen | Nov 1947 | A |
3354531 | Pryor | Nov 1967 | A |
3441790 | McDonald | Apr 1969 | A |
4561105 | Crane | Dec 1985 | A |
4597932 | Kurihara | Jul 1986 | A |
4686332 | Greanias | Aug 1987 | A |
5483261 | Yasutake | Jan 1996 | A |
5544265 | Bozinovic | Aug 1996 | A |
5596656 | Goldberg | Jan 1997 | A |
5615285 | Beernink | Mar 1997 | A |
5625818 | Zarmer | Apr 1997 | A |
5666438 | Beernink | Sep 1997 | A |
5867163 | Kurtenbach | Feb 1999 | A |
5933514 | Ostrem | Aug 1999 | A |
6028593 | Rosenberg | Feb 2000 | A |
6118435 | Fujita | Sep 2000 | A |
6208330 | Hasegawa | Mar 2001 | B1 |
6212295 | Ostrem | Apr 2001 | B1 |
6222465 | Kumar | Apr 2001 | B1 |
6246395 | Goyins | Jun 2001 | B1 |
6252563 | Tada | Jun 2001 | B1 |
6323846 | Westerman | Nov 2001 | B1 |
6337698 | Keely, Jr. | Jan 2002 | B1 |
6492979 | Kent | Dec 2002 | B1 |
6504530 | Wilson et al. | Jan 2003 | B1 |
6643663 | Dabney | Nov 2003 | B1 |
6707451 | Nagaoka | Mar 2004 | B1 |
6748425 | Duffy | Jun 2004 | B1 |
6772396 | Cronin | Aug 2004 | B1 |
6933930 | Devige | Aug 2005 | B2 |
6943665 | Chornenky | Sep 2005 | B2 |
7050955 | Carmel | May 2006 | B1 |
7084884 | Nelson | Aug 2006 | B1 |
7098896 | Kushler | Aug 2006 | B2 |
7212197 | Schkolne | May 2007 | B1 |
7443396 | Ilic | Oct 2008 | B2 |
7581194 | Iwema | Aug 2009 | B2 |
7982724 | Hill | Jul 2011 | B2 |
8086971 | Radivojevic | Dec 2011 | B2 |
8144126 | Wright | Mar 2012 | B2 |
8154524 | Wilson | Apr 2012 | B2 |
8154529 | Sleeman | Apr 2012 | B2 |
8170346 | Ludwig | May 2012 | B2 |
8199126 | Taubman | Jun 2012 | B1 |
8253744 | Macura | Aug 2012 | B2 |
8269744 | Agari | Sep 2012 | B2 |
8327029 | Purser | Dec 2012 | B1 |
8441790 | Pance | May 2013 | B2 |
8547357 | Aoyagi | Oct 2013 | B2 |
8624878 | Sarwar | Jan 2014 | B2 |
8670632 | Wilson | Mar 2014 | B2 |
8674943 | Westerman | Mar 2014 | B2 |
8743091 | Bernstein | Jun 2014 | B2 |
8760395 | Kim | Jun 2014 | B2 |
8762332 | Keebler | Jun 2014 | B2 |
8769524 | Bhullar | Jul 2014 | B2 |
9013452 | Harrison et al. | Apr 2015 | B2 |
9019244 | Harrison | Apr 2015 | B2 |
9030498 | Galor | May 2015 | B2 |
9052772 | West | Jun 2015 | B2 |
9060007 | Keebler | Jun 2015 | B2 |
9182882 | Fowler | Nov 2015 | B2 |
9329688 | Harrison | May 2016 | B2 |
9329715 | Schwarz | May 2016 | B2 |
9377863 | Bychkov | Jun 2016 | B2 |
9557852 | Tsai | Jan 2017 | B2 |
9612689 | Harrison | Apr 2017 | B2 |
9696859 | Heller et al. | Jul 2017 | B1 |
9864453 | Munemoto | Jan 2018 | B2 |
10082935 | Harrison | Sep 2018 | B2 |
10606417 | Schwarz et al. | Mar 2020 | B2 |
10642404 | Harrison | May 2020 | B2 |
20020009227 | Goldberg | Jan 2002 | A1 |
20020057837 | Wilkinson | May 2002 | A1 |
20020070927 | Fujitsuka | Jun 2002 | A1 |
20020126161 | Kuzunuki | Sep 2002 | A1 |
20030048260 | Matusis | Mar 2003 | A1 |
20030110085 | Murren | Jun 2003 | A1 |
20030132922 | Philipp | Jul 2003 | A1 |
20030217873 | Paradiso | Nov 2003 | A1 |
20040012573 | Morrison | Jan 2004 | A1 |
20040021681 | Liao | Feb 2004 | A1 |
20040054711 | Multer | Mar 2004 | A1 |
20040141010 | Fitzmaurice | Jul 2004 | A1 |
20040160421 | Sullivan | Aug 2004 | A1 |
20040199867 | Brandenborg | Oct 2004 | A1 |
20040225730 | Brown | Nov 2004 | A1 |
20050083313 | Hardie-Bick et al. | Apr 2005 | A1 |
20050104867 | Westerman et al. | May 2005 | A1 |
20050131778 | Bennett | Jun 2005 | A1 |
20050146512 | Hill | Jul 2005 | A1 |
20050165596 | Adar | Jul 2005 | A1 |
20050278467 | Gupta | Dec 2005 | A1 |
20050289461 | Amado | Dec 2005 | A1 |
20060010400 | Dehlin | Jan 2006 | A1 |
20060026535 | Hotelling | Feb 2006 | A1 |
20060031746 | Toepfer | Feb 2006 | A1 |
20060152499 | Roberts | Jul 2006 | A1 |
20060173985 | Moore | Aug 2006 | A1 |
20060184617 | Nicholas | Aug 2006 | A1 |
20060217126 | Sohm | Sep 2006 | A1 |
20060230021 | Diab | Oct 2006 | A1 |
20060288329 | Gandhi | Dec 2006 | A1 |
20070011205 | Majjasie | Jan 2007 | A1 |
20070044010 | Sull | Feb 2007 | A1 |
20070075965 | Huppi | Apr 2007 | A1 |
20070085157 | Fadell | Apr 2007 | A1 |
20070100959 | Eichstaedt | May 2007 | A1 |
20070109279 | Sigona | May 2007 | A1 |
20070126716 | Haverly | Jun 2007 | A1 |
20070168367 | Dickinson | Jul 2007 | A1 |
20070176907 | Ishii | Aug 2007 | A1 |
20070186157 | Walker | Aug 2007 | A1 |
20070192674 | Bodin | Aug 2007 | A1 |
20070245020 | Ott, IV | Oct 2007 | A1 |
20070257767 | Beeson | Nov 2007 | A1 |
20070291297 | Harmon | Dec 2007 | A1 |
20080005666 | Sefton | Jan 2008 | A1 |
20080036743 | Westerman | Feb 2008 | A1 |
20080042978 | Perez-Noguera | Feb 2008 | A1 |
20080082941 | Goldberg | Apr 2008 | A1 |
20080103906 | Singh | May 2008 | A1 |
20080117168 | Liu | May 2008 | A1 |
20080126388 | Naaman | May 2008 | A1 |
20080141132 | Tsai | Jun 2008 | A1 |
20080155118 | Glaser | Jun 2008 | A1 |
20080158147 | Westerman | Jul 2008 | A1 |
20080158168 | Westerman | Jul 2008 | A1 |
20080158185 | Westerman | Jul 2008 | A1 |
20080168403 | Westerman | Jul 2008 | A1 |
20080180406 | Han | Jul 2008 | A1 |
20080244468 | Nishihara | Oct 2008 | A1 |
20080288347 | Sifry | Nov 2008 | A1 |
20080319932 | Yih | Dec 2008 | A1 |
20090025987 | Perski | Jan 2009 | A1 |
20090073144 | Chen | Mar 2009 | A1 |
20090095540 | Zachut | Apr 2009 | A1 |
20090150373 | Davis | Jun 2009 | A1 |
20090157206 | Weinberg | Jun 2009 | A1 |
20090174679 | Westerman | Jul 2009 | A1 |
20090178011 | Ording | Jul 2009 | A1 |
20090231275 | Odgers | Sep 2009 | A1 |
20090232355 | Minear | Sep 2009 | A1 |
20090254869 | Ludwig | Oct 2009 | A1 |
20090259628 | Farrell | Oct 2009 | A1 |
20090262637 | Badaye | Oct 2009 | A1 |
20090267893 | Kato | Oct 2009 | A1 |
20090315835 | De Goes | Dec 2009 | A1 |
20090318192 | Leblanc | Dec 2009 | A1 |
20100036967 | Caine | Feb 2010 | A1 |
20100060602 | Agari | Mar 2010 | A1 |
20100085216 | Ms | Apr 2010 | A1 |
20100094633 | Kawamura | Apr 2010 | A1 |
20100123666 | Wickholm | May 2010 | A1 |
20100127997 | Park | May 2010 | A1 |
20100194703 | Fedor | Aug 2010 | A1 |
20100214267 | Radivojevic | Aug 2010 | A1 |
20100225601 | Homma | Sep 2010 | A1 |
20100251112 | Hinckley | Sep 2010 | A1 |
20100265185 | Oksanen | Oct 2010 | A1 |
20100271322 | Kondoh | Oct 2010 | A1 |
20100274622 | Kennedy | Oct 2010 | A1 |
20100279738 | Kim | Nov 2010 | A1 |
20100289754 | Sleeman | Nov 2010 | A1 |
20100302184 | East | Dec 2010 | A1 |
20100306649 | Russ | Dec 2010 | A1 |
20100309158 | Iwayama | Dec 2010 | A1 |
20100309933 | Stark | Dec 2010 | A1 |
20110003550 | Klinghult | Jan 2011 | A1 |
20110007000 | Lim | Jan 2011 | A1 |
20110018825 | Kondo et al. | Jan 2011 | A1 |
20110057670 | Jordan | Mar 2011 | A1 |
20110057885 | Lehtovirta | Mar 2011 | A1 |
20110074544 | D Souza | Mar 2011 | A1 |
20110074701 | Dickinson | Mar 2011 | A1 |
20110080349 | Holbein et al. | Apr 2011 | A1 |
20110133934 | Tan | Jun 2011 | A1 |
20110134063 | Norieda | Jun 2011 | A1 |
20110134083 | Norieda | Jun 2011 | A1 |
20110141066 | Shimotani | Jun 2011 | A1 |
20110145706 | Wilson | Jun 2011 | A1 |
20110164029 | King | Jul 2011 | A1 |
20110167391 | Momeyer et al. | Jul 2011 | A1 |
20110169763 | Westerman | Jul 2011 | A1 |
20110169778 | Nungester | Jul 2011 | A1 |
20110173235 | Aman | Jul 2011 | A1 |
20110175813 | Sarwar | Jul 2011 | A1 |
20110175821 | King | Jul 2011 | A1 |
20110175832 | Miyazawa et al. | Jul 2011 | A1 |
20110187652 | Huibers | Aug 2011 | A1 |
20110202848 | Ismalon | Aug 2011 | A1 |
20110210943 | Zaliva | Sep 2011 | A1 |
20110231290 | Narcisse | Sep 2011 | A1 |
20110238613 | Shehory | Sep 2011 | A1 |
20110246463 | Carson, Jr. | Oct 2011 | A1 |
20110246503 | Bender | Oct 2011 | A1 |
20110248927 | Michaelis | Oct 2011 | A1 |
20110248948 | Griffin | Oct 2011 | A1 |
20110261083 | Wilson | Oct 2011 | A1 |
20110298798 | Krah | Dec 2011 | A1 |
20110310040 | Ben-Shalom | Dec 2011 | A1 |
20120001875 | Li | Jan 2012 | A1 |
20120007821 | Zaliva | Jan 2012 | A1 |
20120007836 | Wu et al. | Jan 2012 | A1 |
20120011106 | Reid | Jan 2012 | A1 |
20120019562 | Park | Jan 2012 | A1 |
20120051596 | Darnell | Mar 2012 | A1 |
20120056846 | Zaliva | Mar 2012 | A1 |
20120078942 | Cai | Mar 2012 | A1 |
20120096041 | Rao | Apr 2012 | A1 |
20120113017 | Benko et al. | May 2012 | A1 |
20120120000 | Lucic | May 2012 | A1 |
20120131139 | Siripurapu | May 2012 | A1 |
20120146938 | Worfolk | Jun 2012 | A1 |
20120150871 | Hua | Jun 2012 | A1 |
20120158629 | Hinckley et al. | Jun 2012 | A1 |
20120200517 | Nikolovski | Aug 2012 | A1 |
20120206330 | Cao | Aug 2012 | A1 |
20120256845 | Noble | Oct 2012 | A1 |
20120262407 | Hinckley | Oct 2012 | A1 |
20120274583 | Haggerty | Nov 2012 | A1 |
20120280827 | Kashiwagi | Nov 2012 | A1 |
20120280927 | Ludwig | Nov 2012 | A1 |
20120287056 | Ibdah | Nov 2012 | A1 |
20120287076 | Dao | Nov 2012 | A1 |
20120313969 | Szymczyk | Dec 2012 | A1 |
20120324349 | Pop-Lazarov | Dec 2012 | A1 |
20130009896 | Zaliva | Jan 2013 | A1 |
20130014248 | McLaughlin | Jan 2013 | A1 |
20130027404 | Sarnoff | Jan 2013 | A1 |
20130030782 | Yogeswaren et al. | Jan 2013 | A1 |
20130038554 | West | Feb 2013 | A1 |
20130091123 | Chen | Apr 2013 | A1 |
20130100071 | Wright et al. | Apr 2013 | A1 |
20130176264 | Alameh et al. | Jul 2013 | A1 |
20130176270 | Cattivelli et al. | Jul 2013 | A1 |
20130179773 | Lee | Jul 2013 | A1 |
20130187883 | Lim | Jul 2013 | A1 |
20130215070 | Sasaki | Aug 2013 | A1 |
20130234982 | Kang | Sep 2013 | A1 |
20130246861 | Colley | Sep 2013 | A1 |
20130257757 | Kim | Oct 2013 | A1 |
20130265269 | Sharma | Oct 2013 | A1 |
20130285942 | Ko | Oct 2013 | A1 |
20130287273 | Huang | Oct 2013 | A1 |
20130307814 | Chang | Nov 2013 | A1 |
20130307828 | Miller | Nov 2013 | A1 |
20130316813 | Derome | Nov 2013 | A1 |
20130328813 | Kuo | Dec 2013 | A1 |
20130335333 | Kukulski | Dec 2013 | A1 |
20140007002 | Chang | Jan 2014 | A1 |
20140009401 | Bajaj | Jan 2014 | A1 |
20140022189 | Sheng et al. | Jan 2014 | A1 |
20140032880 | Ka | Jan 2014 | A1 |
20140037951 | Shigetomi Kiyoe | Feb 2014 | A1 |
20140043295 | Alameh | Feb 2014 | A1 |
20140071095 | Godsill et al. | Mar 2014 | A1 |
20140082545 | Zhai et al. | Mar 2014 | A1 |
20140104191 | Davidson et al. | Apr 2014 | A1 |
20140104192 | Davidson | Apr 2014 | A1 |
20140104274 | Hilliges | Apr 2014 | A1 |
20140109004 | Sadhvani | Apr 2014 | A1 |
20140168116 | Sasselli | Jun 2014 | A1 |
20140208275 | Mongia | Jul 2014 | A1 |
20140210788 | Harrison et al. | Jul 2014 | A1 |
20140210791 | Hanauer et al. | Jul 2014 | A1 |
20140240271 | Land | Aug 2014 | A1 |
20140240293 | McCaughan et al. | Aug 2014 | A1 |
20140240295 | Harrison | Aug 2014 | A1 |
20140253477 | Shim | Sep 2014 | A1 |
20140267065 | Levesque | Sep 2014 | A1 |
20140267085 | Li | Sep 2014 | A1 |
20140289659 | Harrison et al. | Sep 2014 | A1 |
20140300559 | Tanimoto | Oct 2014 | A1 |
20140327626 | Harrison | Nov 2014 | A1 |
20140331313 | Kim et al. | Nov 2014 | A1 |
20140368436 | Abzarian | Dec 2014 | A1 |
20150002405 | Kuan et al. | Jan 2015 | A1 |
20150035759 | Harrison | Feb 2015 | A1 |
20150077378 | Duffield | Mar 2015 | A1 |
20150089435 | Kuzmin | Mar 2015 | A1 |
20150145820 | Huang | May 2015 | A1 |
20150227229 | Schwartz et al. | Aug 2015 | A1 |
20150242009 | Xiao | Aug 2015 | A1 |
20150253858 | Koukoumidis | Sep 2015 | A1 |
20150293592 | Cheong | Oct 2015 | A1 |
20160012348 | Johnson | Jan 2016 | A1 |
20160018942 | Kang | Jan 2016 | A1 |
20160062545 | Lai | Mar 2016 | A1 |
20160077615 | Schwarz | Mar 2016 | A1 |
20160077650 | Durojaiye et al. | Mar 2016 | A1 |
20160077664 | Harrison | Mar 2016 | A1 |
20160085324 | Schwarz | Mar 2016 | A1 |
20160085333 | Harrison | Mar 2016 | A1 |
20160085372 | Munemoto | Mar 2016 | A1 |
20160098185 | Xiao | Apr 2016 | A1 |
20160117015 | Veneri | Apr 2016 | A1 |
20160156837 | Rodzevski | Jun 2016 | A1 |
20160171192 | Holz | Jun 2016 | A1 |
20160224145 | Harrison | Aug 2016 | A1 |
20160231865 | Harrison | Aug 2016 | A1 |
20160299615 | Schwarz | Oct 2016 | A1 |
20170024892 | Harrison | Jan 2017 | A1 |
20170060279 | Harrison | Mar 2017 | A1 |
20170153705 | Kim | Jun 2017 | A1 |
Number | Date | Country |
---|---|---|
1797305 | Jul 2006 | CN |
1928781 | Mar 2007 | CN |
101111817 | Jan 2008 | CN |
101299174 | Nov 2008 | CN |
101339477 | Jan 2009 | CN |
101410781 | Apr 2009 | CN |
101424974 | May 2009 | CN |
101438218 | May 2009 | CN |
101566894 | Oct 2009 | CN |
101763190 | Jun 2010 | CN |
101763193 | Jun 2010 | CN |
101921610 | Dec 2010 | CN |
101968696 | Feb 2011 | CN |
102153776 | Aug 2011 | CN |
102349035 | Feb 2012 | CN |
102362249 | Feb 2012 | CN |
102708862 | Oct 2012 | CN |
102789332 | Nov 2012 | CN |
103150019 | Jun 2013 | CN |
104020878 | Sep 2014 | CN |
104160364 | Nov 2014 | CN |
105431799 | Mar 2016 | CN |
106200861 | Dec 2016 | CN |
107077242 | Aug 2017 | CN |
107924279 | Apr 2018 | CN |
108803933 | Nov 2018 | CN |
0938039 | Aug 1999 | EP |
1659481 | May 2006 | EP |
1762926 | Mar 2007 | EP |
2136358 | Dec 2009 | EP |
2280337 | Feb 2011 | EP |
3028125 | Mar 2017 | EP |
3195095 | Apr 2018 | EP |
3198386 | Apr 2018 | EP |
3341829 | Mar 2019 | EP |
2344894 | Jun 2000 | GB |
2468742 | Sep 2010 | GB |
H0969137 | Mar 1997 | JP |
2004213312 | Jul 2004 | JP |
2005018611 | Jan 2005 | JP |
2007524970 | Aug 2007 | JP |
2009543246 | Dec 2009 | JP |
2011028555 | Feb 2011 | JP |
2013519132 | May 2013 | JP |
2013532495 | Aug 2013 | JP |
20020075283 | Oct 2002 | KR |
20110061227 | Jun 2011 | KR |
20120100351 | Sep 2012 | KR |
9404992 | Mar 1994 | WO |
2006070044 | Jul 2006 | WO |
2008126347 | Oct 2008 | WO |
2009071919 | Jun 2009 | WO |
2011096694 | Aug 2011 | WO |
2012064034 | May 2012 | WO |
2012166277 | Dec 2012 | WO |
2013059488 | Apr 2013 | WO |
2013061998 | May 2013 | WO |
2014037951 | Mar 2014 | WO |
2014182435 | Nov 2014 | WO |
2015017831 | Nov 2015 | WO |
2016048848 | Mar 2016 | WO |
2016043957 | Sep 2016 | WO |
2017034752 | Mar 2017 | WO |
Entry |
---|
U.S. Appl. No. 13/958,427, CTFR—Final Rejection, dated Oct. 3, 2019, 2 pgs. |
U.S. Appl. No. 13/958,427, Non-Final Rejection, dated Apr. 6, 2020, 13 pgs. |
U.S. Appl. No. 14/486,800, Non-Final Rejection, dated Feb. 21, 2020, 26 pgs. |
U.S. Appl. No. 14/495,041, USPTO e-Office Action: Notice of Allowance and Fees Due (Ptol-85), dated Nov. 22, 2019, 9 pgs. |
Asano, Futoshi, Goto, Masataka, Itou, Katunobu, Asoh, Hideki; Real-Time Sound Source Localization and Separation System and Its Application to Automatic Speech Recognition; Proceedings of Eurospeech, 2001; p. 1013-1016; 2001. |
Benko, Hrvoje, Wilson, Andrew, Balakrishnan, Ravin; Sphere: Multi-Touch Interactions on a Spherical Display; Proceedings of UIST, 2008; pp. 77-86; 2008. |
Burges, Christopher; A Tutorial on Support Vector Machines for Pattern Recognition; Data Mining and Knowledge Discovery, 2; pp. 121-167; 1998. |
Cao, Xiang, Wilson, Andrew, Balakrishnan, Ravin, Hinckley, Ken, Hudson, Scott; ShapeTouch: Leveraging contact Shape on Interactive Surfaces; IEEE International Workshop on Horizontal Interactive Human Computer System (Tabletop); pp. 139-146; 2008. |
Deyle, Travis, Palinko, Szabolcs, Poole, Erika Shehan, Starner, Thad; Hambone: A Bio-Acoustic Gesture Interface; Proceedings of ISWC, 2007; pp. 1-8; 2007. |
Dietz Paul, Leigh, Darren; DiamondTouch: A Multi-User Touch Technology; ACM Symposium on User Interface Software & Technology (UIST); pp. 219-226; 2001. |
Dietz, Paul, Harsham, Bret, Forlines, Clifton, Leigh, Darren, Yerazunis, Wiliam, Shipman, Sam, Schmidt-Nielsen, Bent, Ryall, Kathy; DT Controls: Adding Identity to Physical Interfaces; ACM Symposium on User Interface Software & Technology (UIST); pp. 245-252; 2005. |
Final Office Action—dated Jun. 19, 2015—U.S. Appl. No. 13/958,427, filed Mar. 23, 2013. |
Gutwin, Carl, Greenberg, Saul, Blum, Roger, Dyck, Jeff, Tee, Kimberly, McEwan, Gregor; Supporting Informal Collaboration in Shared-Workspace Groupware; Journal of Universal Computer Science, 14(9); pp. 1411-1434; 2008. |
Hall, Mark, Frank, Eibe, Holmes, Geoffrey, Pfahringer, Bernhard, Reutemann, Peter, Witten, Ian; The WEKA Data Mining Software: An Update; SIGKDD Explorations, 11(1); pp. 10-18; 2009. |
Harrison, Chris, Hudson, Scott; Scratch Input: Creating Large, Inexpensive, Unpowered and Mobile Finger Input Surfaces; Proceedings of UIST, 2008; pp. 205-208; 2008. |
Harrison, Chris, Tan, Desney, Morris, Dan; Skinput: Appropriating the Body as an Input Surface; Proceedings of CHI, 2010; pp. 453-462; 2010. |
Hartmann Bjorn, Ringel Morris, Meredith, Benko, Hrvoje, Wilson, Andrew; Augmenting Interactive Tables with Mice & Keyboards; Proceedings of UIST, 2009; pp. 149-152; 2009. |
Hinckley, Ken, Song, Hyunyoung; Sensor Synaesthesia: Touch in Motion, and Motion in Touch; Proceedings of CHI, 2011; pp. 801-810; 2011. |
Hinckley, Ken, Yatani, Koji, Pahud, Michel, Coddington, Nicole, Rodenhouse, Jenny, Wilson, Andy, Benko, Hrvoje, Buxton, Bill; Pen + Touch=New Tools; Proceedings of UIST, 2010; pp. 27-36; 2010. |
Hinckley, Ken, Yatani, Koji, Pahud, Michel, Coddington, Nicole, Rodenhouse, Jenny, Wilson, Andy, Benko, Hrvoje, Buxton, Bill; Manual Deskterity: An Exploration of Simultaneous Pen + Touch Direct Input; Proceedings of CHI, 2010; pp. 2793-2802; 2010. |
Holz, Christian, Baudisch, Patrick; The Generalized Perceived Input Point Model and How to Double Touch Accuracy by Extracting Fingerprints; Proceedings of CHI, 2010; pp. 581-590; 2010. |
International Search Report and Written Opinion received for PCT Application No. PCT/US2014/049485 dated Nov. 17, 2014, 13 pages. |
Kaltenbrunner, Martin, Bencina, Ross; reacTIVision: A Computer-Vision Framework for Table-Based Tangible Interaction; Proceedings of TEI, 2007; pp. 69-74; 2007. |
Matsushita, Nobuyuki, Rekimoto, Jun; HoloWall: Designing a Finger, Hand, Body, and Object Sensitive Wall; Proceedings of UIST, 1997; pp. 209-210; 1997. |
Olwal, Alex, Wilson, Andrew; SurfaceFusion: Unobtrusive Tracking of Everyday Objects in Tangible User Interfaces; Proceedings of GI, 2008; pp. 235-242; 2008. |
Paradiso, Joseph, Hsiao, Kai-yuh, Strickon, Joshua, Lifton, Joshua, Adler, Ari; Sensor Systems for Interactive Surfaces; IBM Systems Journal, 39(3-4); pp. 892-914; 2000. |
Paradiso, Joseph, Leo, Che King; Tracking and Characterizing Knocks Atop Large Interactive Displays; Sensor Review, 25(2); pp. 134-143; 2005. |
Patten, James, Ishii, Hiroshi, Hines, Jim, Pangaro, Gian; Sensetable: A Wireless Object Tracking Platform for Tangible User Interfaces; Proceedings of CHI, 2001; pp. 253-260; 2001. |
Rekimoto, Jun, Saitoh, Masanori; Augmented Surfaces: A Spatially Continuous Work Space for Hybrid Computing Environments; Proceedings of CHI, 1999; pp. 378-385; 1999. |
Rekimoto, Jun, Sciammarella, Eduardo; ToolStone: Effective use of the Physical Manipulation Vocabularies of Input Devices; Proceedings of UIST, 2000; pp. 109-117; 2000. |
Rekimoto, Jun; SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces; Proceedings of CHI, 2002; pp. 113-120; 2002. |
U.S. Appl. No. 13/958,427, filed Aug. 2, 2013, titled: “Capture of Vibro-Acoustic Data Used to Determine Touch Types.” |
Vandoren, Peter, Van Laerhoven, Tom, Claesen, Luc, Taelman, Johannes, Di Fiore, Fabian, Van Reeth, Frank, Flerackers, Eddy; DIP-IT: Digital Infrared Painting on an Interactive Table; Proceedings of CHI, 2008; pp. 2901-2906; 2008. |
Wang, Feng, Ren, Xiangshi; Empirical Evaluation for Finger Input Properties in Multi-Touch Interaction; Proceedings of CHI, 2009; pp. 1063-1072; 2009. |
Non-Final Office Action received for U.S. Appl. No. 14/242,127 dated Jun. 2, 2015, 33 pages. |
Final Office Action received for U.S. Appl. No. 14/242,127 dated Sep. 18, 2015, 28 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/242,127 dated Dec. 28, 2015, 38 pages. |
Final Office Action received for U.S. Appl. No. 14/242,127 dated Mar. 31, 2016, 34 pages. |
Notice of Allowance received for U.S. Appl. No. 14/242,127 dated Apr. 13, 2016, 18 pages. |
Notice of Allowance received for U.S. Appl. No. 14/242,127 dated Sep. 2, 2016, 16 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/206,554 dated Sep. 21, 2016, 36 pages. |
Japanese Office Action for Japanese Patent Application No. 2014-537253 dated May 16, 2017, 5 pages (including English Translation). |
European Search Report received for European Patent Application Serial No. 12842495.9, dated Jul. 24, 2015, 7 Pages. |
Japanese Office Action for Japanese Patent Application No. 2017-049566 dated Jun. 5, 2018, 7 pages (including English Translation). |
Final Office Action received for U.S. Appl. No. 15/073,407, dated Dec. 20, 2016, 49 pages. |
Final Office Action received for U.S. Appl. No. 14/219,919, dated Aug. 26, 2016, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/191,329, dated on Feb. 2, 2017, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/486,800, dated Dec. 1, 2016, 29 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/684,407, dated Sep. 14, 2018, 24 pages. |
Final Office Action received for U.S. Appl. No. 14/834,434, dated May 1, 2017, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/751,589, dated Jun. 13, 2016, 20 pages. |
International Search Report and Written Opinion for PCT/US2016/044552; dated Oct. 17, 2016, 14 pages. |
International Search Report and Written Opinion for PCT/US2016/040194; dated Sep. 19, 2016, 7 pages. |
International Search Report and Written Opinion for PCT/US2015/051582; dated Feb. 26, 2016, 12 pages. |
International Search Report and Written Opinion for PCT/US2015/051106; dated Jan. 28, 2016, 9 pages. |
International Search Report and Written Opinion for PCT/US2015/047616; dated Jul. 1, 2016, 7 pages. |
European Patent Office Extended Search Report for EP 14 83 2247; dated Feb. 23, 2017, 11 pages. |
European Patent Office Extended Search Report for EP 14 79 4212; dated Nov. 9, 2016, 8 pages. |
International Search Report and Written Opinion received for PCT Application No. PCT/US2014/033380 dated Mar. 13, 2015, 7 pages. |
Non-Final Office Action—dated Oct. 2, 2015 U.S. Appl. No. 14/486,800, filed Sep. 15, 2014, 21 pages. |
Non-Final Office Action received dated Nov. 5, 2015 U.S. Appl. No. 13/887,711, 19 pages. |
International Search Report and Written Opinion for PCT/US2015/051355; dated Dec. 15, 2015, 9 pages. |
International Search Report and Written Opinion for PCT/US2015/047428; dated Nov. 27, 2015, 6 pages. |
International Search Report and Written Opinion for PCT/US2015/050570; dated Dec. 17, 2015, 8 pages. |
Non-Final Office Action—dated Oct. 7, 2015 U.S. Appl. No. 14/495,041, 14 pages. |
Non-Final Office Action dated Jun. 13, 2016 in U.S. Appl. No. 15/073,407, 49 pages. |
U.S. Appl. No. 13/849,698, filed Mar. 25, 2013, titled: “Method and System for Activating Different Interactive Functions Using Different Types of Finger Contacts.” 52 pages. |
U.S. Appl. No. 13/780,494, filed Feb. 28, 2013, titled: “Input Tools Having Vibro-Acoustically Distinct Regions and Computing Device for Use With the Same.” 34 pages. |
Final Office Action dated Jul. 12, 2017 in U.S. Appl. No. 14/495,041, 14 pages. |
Final Office Action dated Jul. 18, 2017 in U.S. Appl. No. 14/191,329, 17 pages. |
Final Office Action dated Jun. 8, 2016 in U.S. Appl. No. 14/495,041, 16 pages. |
Final Office Action dated Mar. 7, 2018 in U.S. Appl. No. 14/219,919, 21 pages. |
Final Office Action dated May 6, 2016 in U.S. Appl. No. 14/191,329, 17 pages. |
Final Office Action dated May 13, 2016 in U.S. Appl. No. 14/390,831, 6 pages. |
Final Office Action dated May 20, 2016 in U.S. Appl. No. 14/503,894, 17 pages. |
Final Office Action dated Nov. 9, 2016 in U.S. Appl. No. 14/612,089, 11 pages. |
Final Office Action dated Nov. 23, 2015 in U.S. Appl. No. 14/668,870, 14 pages. |
Final Office Action dated Sep. 6, 2017 in U.S. Appl. No. 14/486,800, 17 pages. |
International Search Report and Written Opinion dated Jul. 8, 2013 in International Application No. PCT/CA2013/000292, 9 pages. |
International Search Report and Written Opinion dated Jun. 6, 2012 in International Patent Application No. PCT/CA2012/050127, 10 pages. |
“Making it Easier to Share With Who You Want,” Facebook, Aug. 23, 2011, last updated on Dec. 12, 2012, retrieved from https://www.facebook.com/notes/facebook/making-it-easier-to-share-with-who-you-want/10150251867797131/, retrieved on Jun. 1, 2018, 14 pages. |
Cheng, B. et al., “SilentSense: Silent User Identification via Dynamics of Touch and Movement Behavioral Biometrics,” Cryptography and Security (cs CR); Human-Computer Interaction, pp. 9, Aug. 31, 2013, 9 pages. |
S. Furui, “Digital Speech Processing, synthesis, and recognition” Marcel Dekker, Inc. 2001. 40 pages. |
English Translation of Chinese Office Action dated Nov. 3, 2017 in Chinese Application No. 201480002856.0, 12 pages. |
English Translation of Final Rejection dated Dec. 12, 2014 in Korean Patent Application No. 10-2014-0027979, 3 pages. |
English Translation of First Office Action dated Oct. 11, 2017 in Chinese Patent Application No. 20150209998.0, 10 pages. |
English Translation of Notification of Reason for Refusal dated Jul. 10, 2014 in Korean patent application No. 10-2014-0027979, 3 pages. |
Final Office Action dated Jan. 5, 2018 in U.S. Appl. No. 14/503,894, 16 pages. |
Communication pursuant to Article 94(3) EPC dated Feb. 26, 2018 in European Patent Application No. 14785422.8, 7 pages. |
Communication pursuant to Article 94(3) EPC dated Mar. 5, 2018 in European Patent Application No. 14794212.2, 5 pages. |
Extended European Search Report dated Apr. 16, 2018 in European Application No. 15845310.0, 7 pages. |
Extended European Search Report dated Aug. 11, 2016 in European Patent Application No. 14785422.8, 8 Pages. |
Extended European Search Report dated Aug. 25, 2017 in European Patent Application No. 15748667.1, 10 pages. |
Extended European Search Report dated Jul. 22, 2014 in European Patent Application No. 12755563.9, 5 Pages. |
Extended European Search Report dated Mar. 16, 2018 in European Patent Application No. 15842839.1, 7 Pages. |
Extended European Search Report dated Mar. 19, 2018 in European Patent Application No. 15840819.5, 9 Pages. |
Extended European Search Report dated Mar. 19, 2018 in European Patent Application No. 15843933.1, 8 Pages. |
Extended European Search Report dated Mar. 27, 2018 in European Patent Application No. 15843989.3, 8 Pages. |
Extended European Search Report dated May 14, 2018 in European Patent Application No. 15847469.2, 11 pages. |
Weidong, S. et al., “SenGuard: Passive user identification on smartphones using multiple sensors,” IEEE 7th International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob), pp. 141-148, 2011. |
Final Office Action dated Feb. 9, 2016 in U.S. Appl. No. 14/486,800, 14 pages. |
Final Office Action dated Feb. 26, 2016 in U.S. Appl. No. 14/492,604, 16 pages. |
Non-Final Office Action dated Sep. 9, 2016 in U.S. Appl. No. 13/887,711, 24 pages. |
Non-Final Office Action dated Sep. 29, 2016 in U.S. Appl. No. 14/834,434, 12 pages. |
Pedro, L et al., “Augmenting touch interaction through acoustic sensing”, Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces, pp. 53-56, Nov. 13-16, 2011. |
Sarah, M. K. et al., “A Personal Touch—Recognizing Users Based on Touch Screen Behavior,” PhoneSense'12, Nov. 6, 2012, Toronto, ON, Canada, Nov. 6, 2012, 5 pages. |
“Swype Advanced Tips”, [http://www.swype.com/tips/advanced-tips], Jun. 25, 2014, retrieved via the Wayback Machine on Jun. 29, 2018, [https://web.archive.org/web/20140625073212/http://www.swype.com/tips/advanced-tips], 2 pages. |
“Swype Basics”, [http://www.swype.com/tips/swype-basics], retrieved via the Wayback Machine dated Jun. 14, 2014, retrieved via the Wayback Machine on Jun. 29, 2018, [https://web.archive.org/web/20140614200707/http://www.swype.com/tips/swype-basics], 2 pages. |
“Swype Tips”, [http://www.swype.com/category/tips], Jul. 2, 2014, retrieved via the Wayback Machine on Jun. 29, 2018, [https://web.archive.org/web/20140702102357/http://www.swype.com/category/tips], 2 pages. |
Non-Final Office Action dated Apr. 15, 2015 in U.S. Appl. No. 13/856,414, 17 pages. |
Non-Final Office Action dated Apr. 26, 2018 in U.S. Appl. No. 14/495,041, 15 pages. |
Non-Final Office Action dated Dec. 20, 2017 in U.S. Appl. No. 14/834,434, 12 pages. |
Non-Final Office Action dated Jul. 8, 2015 in U.S. Appl. No. 14/191,329, 18 pages. |
Non-Final Office Action dated Jul. 11, 2017 in U.S. Appl. No. 14/390,831, 79 pages. |
Non-Final Office Action dated Jul. 17, 2017 in U.S. Appl. No. 15/073,407, 8 pages. |
Non-Final Office Action dated Jul. 19, 2017 in U.S. Appl. No. 14/219,919, 20 pages. |
Non-Final Office Action dated Jun. 9, 2016 in U.S. Appl. No. 14/612,089, 11 pages. |
Non-Final Office Action dated May 7, 2018 in U.S. Appl. No. 14/191,329, 17 pages. |
Non-Final Office Action dated May 9, 2018 in U.S. Appl. No. 13/887,711, 27 pages. |
Non-Final Office Action dated Nov. 15, 2017 in U.S. Appl. No. 15/198,062, 24 pages. |
Non-Final Office Action dated Nov. 24, 2015 in U.S. Appl. No. 14/191,329, 31 pages. |
Non-Final Office Action dated Oct. 18, 2017 in U.S. Appl. No. 15/406,770, 12 pages. |
Non-Final Office Action dated Oct. 19, 2015 in U.S. Appl. No. 14/668,870, 6 pages. |
Non-Final Office Action dated Oct. 23, 2014 in U.S. Appl. No. 14/275,124, 10 pages. |
Non-Final Office Action dated Oct. 25, 2013 in U.S. Appl. No. 13/410,956, 8 pages. |
Non-Final Office Action dated Oct. 28, 2015 in U.S. Appl. No. 14/390,831, 22 pages. |
Non-Final Office Action dated Sep. 8, 2016 in U.S. Appl. No. 14/492,604, 14 pages. |
Notice of Allowance dated Jan. 26, 2015 in U.S. Appl. No. 13/849,698, 27 pages. |
Notice of Allowance dated Dec. 6, 2016 in U.S. Appl. No. 14/751,589, 27 pages.
Non-Final Office Action dated Jul. 30, 2018 in U.S. Appl. No. 15/406,770, 20 pages.
Notice of Allowance dated Feb. 2, 2015 in U.S. Appl. No. 13/780,494, 43 pages.
Non-Final Office Action dated Jun. 26, 2018 in U.S. Appl. No. 14/486,800, 25 pages.
Final Office Action dated Aug. 8, 2018 in U.S. Appl. No. 14/834,434, 19 pages.
Non-Final Office Action dated Sep. 2, 2014 in U.S. Appl. No. 13/863,193, 41 pages.
Final Office Action dated Mar. 4, 2015 in U.S. Appl. No. 13/863,193, 50 pages.
Non-Final Office Action dated Jan. 7, 2016 in U.S. Appl. No. 13/863,193, 58 pages.
Final Office Action dated Sep. 15, 2016 in U.S. Appl. No. 13/863,193, 50 pages.
Non-Final Office Action dated Apr. 6, 2017 in U.S. Appl. No. 13/863,193, 70 pages.
Final Office Action dated Jan. 9, 2018 in U.S. Appl. No. 13/863,193, 50 pages.
Notice of Allowance dated May 22, 2018 in U.S. Appl. No. 13/863,193, 73 pages.
Notice of Allowance dated Sep. 1, 2016 in U.S. Appl. No. 13/856,414, 28 pages.
Chinese Office Action for Chinese Patent Application No. 201510240522.3 dated Jun. 28, 2018, 30 pages (including English translation).
Japanese Office Action dated Aug. 1, 2018 for Japanese Patent Application No. 2017-049566, 9 pages (including English translation).
Korean Office Action dated Jan. 10, 2019 for Korean Patent Application No. 2014-7010323, 11 pages (including English translation).
Final Office Action received for U.S. Appl. No. 15/075,648 dated Dec. 21, 2018, 13 pages.
Non-Final Office Action received for U.S. Appl. No. 15/815,679 dated Sep. 28, 2018, 69 pages.
Final Office Action received for U.S. Appl. No. 15/198,062 dated Sep. 6, 2018, 32 pages.
Non-Final Office Action received for U.S. Appl. No. 16/126,175 dated Nov. 1, 2018, 86 pages.
Third Chinese Office Action received for Chinese Patent Application No. 201480022056.5 dated Jul. 19, 2018, 6 pages (including English translation).
Communication pursuant to Article 94(3) EPC for European Patent Application No. 14785422.8 dated Nov. 22, 2018, 5 pages.
Communication pursuant to Article 94(3) EPC for European Patent Application No. 15845310.0 dated Jan. 3, 2019, 4 pages.
Communication pursuant to Article 94(3) EPC for European Patent Application No. 15840819.5 dated Jan. 23, 2019, 6 pages.
Chinese First Office Action received for Chinese Patent Application No. 201510240372.6 dated Sep. 27, 2018, 18 pages (including English translation).
Communication pursuant to Article 94(3) EPC for European Patent Application No. 15843933.1 dated Jan. 23, 2019, 6 pages.
Chinese Search Report received for Chinese Patent Application No. 201580053216.7 dated Apr. 16, 2019, 2 pages.
European Search Report received for European Patent Application No. 16839786.7 dated Feb. 12, 2019, 8 pages.
Communication pursuant to Rules 70(2) and 70a(2) EPC received for European Patent Application No. 16839786.7 dated Mar. 1, 2019, 1 page.
Chinese Second Office Action received for Chinese Patent Application No. 201580000833.0 dated Jan. 15, 2018, 17 pages.
European Search Report received for European Patent Application No. 16818725.0 dated Dec. 21, 2018, 8 pages.
Communication pursuant to Rules 70(2) and 70a(2) EPC received for European Patent Application No. 16818725.0 dated Jan. 8, 2019, 1 page.
First Office Action received for Canadian Patent Application No. 2869699 dated Nov. 27, 2014, 3 pages.
Second Office Action received for Canadian Patent Application No. 2869699 dated Jun. 14, 2016, 4 pages.
Third Office Action received for Canadian Patent Application No. 2869699 dated Jan. 9, 2017, 3 pages.
First Examination Report received for Australian Patent Application No. 2012225130 dated Feb. 9, 2015, 4 pages.
First Office Action received for Canadian Patent Application No. 2802746 dated Apr. 9, 2013, 3 pages.
Final Office Action received for U.S. Appl. No. 14/684,407 dated Jun. 10, 2019, 26 pages.
Final Office Action received for U.S. Appl. No. 14/684,407 dated Sep. 20, 2019, 26 pages.
Final Office Action received for U.S. Appl. No. 14/495,041 dated Aug. 9, 2019, 26 pages.
Non-Final Office Action received for U.S. Appl. No. 14/834,434 dated Aug. 5, 2019, 19 pages.
Final Office Action received for U.S. Appl. No. 16/126,175 dated Aug. 2, 2019, 161 pages.
Communication pursuant to Article 94(3) EPC received for European Patent Application No. 14832247.2 dated May 3, 2019, 7 pages.
Final Office Action received for U.S. Appl. No. 15/075,648 dated May 31, 2019, 17 pages.
Non-Final Office Action received for U.S. Appl. No. 15/836,798 dated Jul. 5, 2019, 95 pages.
Non-Final Office Action received for U.S. Appl. No. 14/191,329 dated Jul. 16, 2019, 30 pages.
Chinese First Office Action received for Chinese Patent Application No. 201580051873.8 dated Jun. 21, 2019, 15 pages (including English translation).
Final Office Action received for U.S. Appl. No. 13/887,711 dated Jul. 25, 2019, 24 pages.
Chinese Second Office Action received for Chinese Patent Application No. 201510240372.6 dated May 15, 2019, 16 pages (including English translation).
"Mimio," http://www.mimio.com, retrieved Jul. 8, 2019, 8 pages.
U.S. Appl. No. 14/191,329, filed Feb. 26, 2014, titled "Using Capacitive Images for Touch Type Classification," 42 pages.
U.S. Appl. No. 13/887,711, filed May 6, 2013, titled "Using Finger Touch Types to Interact with Electronic Devices," 42 pages.
U.S. Appl. No. 14/492,604, filed Sep. 22, 2014, titled "Method and Apparatus for Improving Accuracy of Touch Screen Event Analysis by Use of Edge Classification," 35 pages.
U.S. Appl. No. 14/483,150, filed Sep. 11, 2014, titled "Method and Apparatus for Differentiating Touch Screen Users Based on Touch Event Analysis," 38 pages.
U.S. Appl. No. 14/495,041, filed Sep. 24, 2014, titled "Method for Improving Accuracy of Touch Screen Event Analysis by Use of Spatiotemporal Touch Patterns," 34 pages.
U.S. Appl. No. 14/242,127, filed Apr. 1, 2014, titled "Method and Apparatus for Classifying Touch Events on a Touch Sensitive Surface," 36 pages.
International Search Report and Written Opinion received for PCT Application No. PCT/US2014/034977 dated Sep. 18, 2014, 8 pages.
International Search Report and Written Opinion received for International Patent Application No. PCT/US2012/060865 dated Mar. 29, 2013, 10 pages.
Chinese Office Action for Chinese Patent Application No. CN201810617137.X dated Oct. 28, 2020, 12 pages (including English translation).
Non-Final Office Action received for U.S. Appl. No. 13/958,427 dated Nov. 10, 2016, 22 pages.
Final Office Action dated Nov. 28, 2014 in U.S. Appl. No. 13/849,698, 21 pages.
Non-Final Office Action dated Jun. 24, 2014 in U.S. Appl. No. 13/849,698, 21 pages.
Non-Final Office Action dated Oct. 16, 2014 in U.S. Appl. No. 13/780,494, 10 pages.
Final Office Action dated Jun. 30, 2017 in U.S. Appl. No. 13/958,427, 15 pages.
Final Office Action dated Mar. 28, 2016 in U.S. Appl. No. 13/958,427, 16 pages.
Non-Final Office Action dated Apr. 16, 2018 in U.S. Appl. No. 13/958,427, 14 pages.
Non-Final Office Action dated Oct. 8, 2015 in U.S. Appl. No. 13/958,427, 15 pages.
English Translation of Final Rejection dated Apr. 27, 2015 in Korean Patent Application No. 10-2014-0027979, 3 pages.
English Translation of First Office Action dated Feb. 27, 2017 in Chinese Patent Application No. 201480002879.1, 13 pages.
English Translation of First Office Action dated May 2, 2017 in Chinese Patent Application No. 201580000833.0, 9 pages.
English Translation of Second Office Action dated Jul. 6, 2017 in Chinese Patent Application No. 201480002879.1, 14 pages.
English Translation of Third Office Action dated Oct. 16, 2017 in Chinese Patent Application No. 201480002879.1, 4 pages.
Schwarz, J. et al., "Probabilistic Palm Rejection Using Spatiotemporal Touch Features and Iterative Classification," Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2009-2012, Apr. 26-May 1, 2014.
Hinckley et al., "Manual Deskterity: An Exploration of Simultaneous Pen + Touch Direct Input," Proceedings of CHI 2010, pp. 2793-2802.
Final Office Action issued for U.S. Appl. No. 15/206,554 dated Feb. 1, 2017, 20 pages.
Chinese Office Action for Chinese Patent Application No. 201280062500.7 dated Nov. 7, 2016, 9 pages.
Seo et al., "Audio Fingerprinting Based on Normalized Spectral Subband Centroids," Proc. ICASSP (U.S.A.), 2005, vol. 3, pp. 213-216, retrieved on May 29, 2017, 4 pages.
Kashino, K., "Audio fingerprinting: Techniques and applications," Acoustical Science and Technology, The Acoustical Society of Japan, Feb. 1, 2010, vol. 66, No. 2, pp. 71-76, retrieved on May 29, 2017, 6 pages.
Chinese Search Report dated Mar. 29, 2016 for Chinese Patent Application No. 201280062500.7, 1 page.
Chinese Office Action dated Apr. 15, 2016 for Chinese Patent Application No. 201280062500.7, 11 pages.
Japanese Office Action for Japanese Patent Application No. 2014-537253 dated Nov. 15, 2016, 3 pages.
Japanese Office Action for Japanese Patent Application No. 2014-537253 dated Apr. 26, 2016, 3 pages.
Communication pursuant to Article 94(3) EPC for European Patent Application No. 12842495.9 dated Jun. 18, 2018, 4 pages.
Non-Final Office Action received for U.S. Appl. No. 14/684,407 dated Jul. 8, 2016, 11 pages.
Final Office Action received for U.S. Appl. No. 14/684,407 dated Jan. 18, 2017, 20 pages.
Non-Final Office Action received for U.S. Appl. No. 14/684,407 dated Aug. 2, 2017, 14 pages.
Final Office Action received for U.S. Appl. No. 14/684,407 dated Mar. 12, 2018, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 14/612,089 dated May 31, 2017, 21 pages.
Final Office Action received for U.S. Appl. No. 13/887,711 dated Jun. 8, 2017, 33 pages.
Non-Final Office Action received for U.S. Appl. No. 15/075,648 dated Apr. 21, 2017, 8 pages.
Final Office Action received for U.S. Appl. No. 14/492,604 dated Mar. 17, 2017, 37 pages.
Non-Final Office Action received for U.S. Appl. No. 14/495,041 dated Nov. 25, 2016, 35 pages.
Non-Final Office Action received for U.S. Appl. No. 14/503,894 dated May 16, 2017, 33 pages.
Non-Final Office Action received for U.S. Appl. No. 13/958,427 dated Mar. 13, 2015, 50 pages.
Non-Final Office Action received for U.S. Appl. No. 13/887,711 dated Apr. 6, 2015, 36 pages.
Final Office Action received for U.S. Appl. No. 14/191,329 dated Aug. 7, 2015, 29 pages.
Non-Final Office Action received for U.S. Appl. No. 14/492,604 dated Oct. 1, 2015, 16 pages.
Non-Final Office Action received for U.S. Appl. No. 14/483,150 dated Dec. 18, 2015, 7 pages.
Non-Final Office Action received for U.S. Appl. No. 14/503,894 dated Dec. 30, 2015, 18 pages.
Non-Final Office Action dated Jan. 29, 2016 in U.S. Appl. No. 14/219,919, 11 pages.
Final Office Action dated Feb. 24, 2016 in U.S. Appl. No. 13/887,711, 23 pages.
International Search Report and Written Opinion for PCT Application No. PCT/US2015/014581 dated May 14, 2015, 7 pages.
Search Report dated Apr. 21, 2017 in Chinese Patent Application No. 201580000833.0, 1 page.
Kherallah, M. et al., "On-line handwritten digit recognition based on trajectory and velocity modeling," Pattern Recognition Letters, vol. 29, Issue 5, pp. 580-594, Apr. 1, 2008.
Non-Final Office Action dated Apr. 19, 2017 in U.S. Appl. No. 14/869,998, 7 pages.
Chinese Office Action for Chinese Patent Application No. 201280062500.7 dated Apr. 27, 2018, 19 pages (including English translation).
Chinese Office Action for Chinese Patent Application No. 201280062500.7 dated Oct. 10, 2018, 14 pages.
Office Action dated Mar. 30, 2018 for U.S. Appl. No. 15/886,562, 44 pages.
Office Action dated Aug. 10, 2018 for U.S. Appl. No. 15/886,562, 86 pages.
Office Action dated Jan. 28, 2019 for U.S. Appl. No. 15/836,798, 30 pages.
Chinese Office Action dated Apr. 21, 2017 for Chinese Patent Application No. 201480022056.5, 23 pages (including English translation).
Chinese Office Action dated Feb. 9, 2018 for Chinese Patent Application No. 201480022056.5, 19 pages (including English translation).
Communication pursuant to Article 94(3) EPC for European Patent Application No. 15842839.1 dated Apr. 9, 2019, 7 pages.
European Search Report dated Apr. 8, 2019 for European Patent Application No. 18195588.1, 7 pages.
Chinese Office Action for Chinese Patent Application No. 201280062500.7 dated Apr. 17, 2017, 15 pages (including English translation).
Prior Publication Data: Publication No. US 2020/0209996 A1; Date: Jul. 2020; Country: US.
Related U.S. Application Data:
Parent: U.S. Appl. No. 14/495,041, Sep. 2014, US.
Child: U.S. Appl. No. 16/798,139, US.