A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
The present invention relates generally to the field of touch screen technology and more particularly to the analysis of touch screen mechanical impact features and acoustic features to differentiate between different users.
The subject matter discussed in the background section should not be assumed to be prior art merely as a result of its mention in the background section. Similarly, a problem mentioned in the background section or associated with the subject matter of the background section should not be assumed to have been previously recognized in the prior art. The subject matter in the background section merely represents different approaches, which in and of themselves may also be inventions.
The following detailed description is made with reference to the technology disclosed. Preferred implementations are described to illustrate the technology disclosed, not to limit its scope, which is defined by the claims. Those of ordinary skill in the art will recognize a variety of equivalent variations on the description.
Various electronic devices today are typically operated by a user interacting with a touch screen. Some such devices, such as touch sensitive computer screens, are designed to be operated by multiple users, normally at different times. Other such devices, such as smart phones, are usually associated with a single user whose privacy could be seriously jeopardized if another user gains unauthorized access to such one-user devices.
With multiple-user devices, it is usually necessary to provide an input to the computer identifying the particular user who is operating the system so that appropriate user-related programs are presented for further processing by the correct user.
With single-user devices, exposure of detailed information to an unauthorized user would exacerbate the injury to the privacy of the authorized user; therefore, passwords, fingerprint readers, and entry codes are often used to prevent access by unauthorized users.
Unfortunately, such user ID inputs, passwords, fingerprint readings, entry codes, and other such extra ID or security operations are time-consuming and inconvenient, and they detract from the pleasure and efficiency of using such devices.
For some embodiments, methods for differentiating touch screen users include detecting a touch event from at least one user, generating a vibro-acoustic waveform signal associated with the touch event, converting the vibro-acoustic waveform signal into at least one converted waveform signal different from the vibro-acoustic waveform signal, extracting distinguishing features from the converted waveform signal, and using the extracted distinguishing features to associate the touch event with a particular user.
For some embodiments, apparatus for differentiating touch screen users include a touch sensitive surface configured for detecting a touch event from at least one user, a sensor configured to generate a vibro-acoustic waveform signal in response to occurrence of the touch event, a converter configured to convert the waveform signal into at least one converted waveform signal different from the vibro-acoustic waveform signal, a feature extractor configured for extracting distinguishing features from the converted waveform signal, and a classification unit configured to use the distinguishing features extracted by the extractor to associate the vibro-acoustic waveform signal with a particular user.
Other aspects and advantages of the present invention can be seen on review of the drawings, the detailed description and the claims, which follow.
The included drawings are for illustrative purposes and serve only to provide examples of possible structures and process steps for the disclosed techniques. These drawings in no way limit any changes in form and detail that may be made to embodiments by one skilled in the art without departing from the spirit and scope of the disclosure.
Applications of methods and apparatus according to one or more embodiments are described in this section. These examples are being provided solely to add context and aid in the understanding of the present disclosure. It will thus be apparent to one skilled in the art that the techniques described herein may be practiced without some or all of these specific details. In other instances, well known process steps have not been described in detail in order to avoid unnecessarily obscuring the present disclosure. Other applications are possible, such that the following examples should not be taken as definitive or limiting either in scope or setting.
In the following detailed description, references are made to the accompanying drawings, which form a part of the description and in which are shown, by way of illustration, specific embodiments. Although these embodiments are described in sufficient detail to enable one skilled in the art to practice the disclosure, it is understood that these examples are not limiting, such that other embodiments may be used and changes may be made without departing from the spirit and scope of the disclosure.
One or more embodiments may be implemented in numerous ways, including as a process, an apparatus, a system, a device, a method, a computer readable medium such as a computer readable storage medium containing computer readable instructions or computer program code, or as a computer program product comprising a computer usable medium having a computer readable program code embodied therein.
The disclosed embodiments may include methods of differentiating touch screen users based on characterization of features derived from the touch event acoustics and mechanical impact. Such methods may include detecting a touch event on a touch sensitive surface, generating a vibro-acoustic waveform signal using at least one sensor detecting such touch event, converting the waveform signal into at least a domain signal, extracting distinguishing features from said domain signal, and classifying said features to associate the features of the domain signal with a particular user.
The disclosed embodiments may include systems to differentiate touch screen users of a touch screen device. The systems may include a processor and one or more stored sequences of instructions which, when executed by the processor, cause the processor to detect a touch event on a touch sensitive surface, generate a vibro-acoustic waveform signal using at least one sensor detecting such touch event, convert the waveform signal into at least a domain signal, extract distinguishing features from said domain signal, and classify said features to associate the features of the domain signal with a particular user.
The disclosed embodiments may include apparatus to differentiate touch screen users of a touch screen device. The apparatus may include a touch sensitive surface for detecting a touch event from at least one user, at least one sensor generating a vibro-acoustic waveform signal from such touch event, a converter for converting the waveform signal into at least a domain signal, a feature extractor for extracting distinguishing features from said domain signal, and a classification unit which uses the distinguishing features of said extractor to associate the features of the domain signal with a particular user.
The disclosed embodiments may include a machine-readable medium carrying one or more sequences of instructions which, when executed by one or more processors, cause the one or more processors to detect a touch event on a touch sensitive surface, generate a vibro-acoustic waveform signal using at least one sensor detecting such touch event, convert the waveform signal into at least a domain signal, extract distinguishing features from said domain signal, and classify said features to associate the features of the domain signal with a particular user. The domain signal may be a time domain signal or a frequency domain signal.
In general, when a user touches a touch screen (i.e., a physical impact) of a computing system equipped with a touch screen (e.g., a smart phone), a mechanical force is applied to the touch screen, resulting in mechanical vibrations that propagate on and through the touch screen, as well as any contacting components (e.g., device chassis, electronics main board, enclosure). These mechanical vibrations may be captured by at least one of a variety of sensors, including impact sensors, vibration sensors, accelerometers, strain gauges, or acoustic sensors such as a condenser microphone, a piezoelectric microphone, a MEMS microphone, and the like.
Once the vibro-acoustic signal associated with the mechanical vibrations has been captured by a sensor, it can be converted into a series of features, for example: average acoustic power, standard deviation, variance, skewness, kurtosis, absolute sum, root mean square (RMS), dispersion, zero-crossings, spectral centroid, spectral density, linear prediction-based cepstral coefficients (LPCC), perceptual linear prediction (PLP), cepstral coefficients, cepstrum coefficients, mel-frequency cepstral coefficients (MFCC), and frequency phases (e.g., as generated by an FFT).
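By way of non-limiting illustration, several of these features can be computed directly from one buffered waveform. The following is a minimal Python sketch; the function name, dictionary keys, and 96 kHz sampling rate are illustrative assumptions, not requirements of this disclosure.

```python
import numpy as np

def vibro_acoustic_features(waveform, sample_rate=96000):
    """Compute a handful of the features listed above from one buffered waveform."""
    waveform = np.asarray(waveform, dtype=float)
    spectrum = np.abs(np.fft.rfft(waveform))
    freqs = np.fft.rfftfreq(len(waveform), d=1.0 / sample_rate)
    centered = waveform - waveform.mean()
    std = waveform.std()
    return {
        "average_power": float(np.mean(waveform ** 2)),
        "std_dev": float(std),
        "variance": float(np.var(waveform)),
        "skewness": float(np.mean((centered / std) ** 3)),
        "kurtosis": float(np.mean((centered / std) ** 4)),
        "absolute_sum": float(np.sum(np.abs(waveform))),
        "rms": float(np.sqrt(np.mean(waveform ** 2))),
        # count sign changes in the waveform
        "zero_crossings": int(np.count_nonzero(np.diff(np.signbit(waveform).astype(np.int8)))),
        # amplitude-weighted mean frequency of the spectrum
        "spectral_centroid": float(np.sum(freqs * spectrum) / np.sum(spectrum)),
    }
```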
Many touch screen technologies are able to digitize several aspects of a touch event, such as the shape, size, capacitance, orientation, pressure, etc. These may be used as distinguishing features, or further features can be derived from them. Further, because human fingers vary in their anatomical composition, their acoustic and touch properties can vary between humans. Moreover, the way users touch a touch screen can also be distinguishing (e.g., what finger, what part of the finger, how flat, how hard). Thus, the vibro-acoustic features and touch features contain properties that can be characteristic of different users.
It is therefore possible to include a classifier in a computing system configured with a touch screen that, upon receipt of a touch event, makes a determination about which user is operating the computing system, or whether the user is authorized or has any personalized features. Any single touch event may not yield sufficient confidence to identify which user is operating the device. The classifier may therefore withhold a conclusion until a sufficient level of confidence is reached, or a best guess can be forced at a predetermined period or event (e.g., after 10 touches, after 2 minutes, or when entering a privileged application).
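A minimal sketch of such deferred classification logic follows. It assumes a trained model exposing a scikit-learn-style predict_proba interface; the 0.9 threshold and 10-touch limit are only examples mirroring those above.

```python
import numpy as np

class DeferredUserClassifier:
    """Withholds a user label until enough evidence has accumulated."""

    def __init__(self, model, threshold=0.9, max_touches=10):
        self.model = model              # trained classifier with predict_proba()
        self.threshold = threshold      # confidence required to commit to a user
        self.max_touches = max_touches  # force a best guess after this many touches
        self.evidence = []

    def observe(self, feature_vector):
        probs = self.model.predict_proba([feature_vector])[0]
        self.evidence.append(probs)
        mean_probs = np.mean(self.evidence, axis=0)  # accumulate over touch events
        best = int(np.argmax(mean_probs))
        forced = len(self.evidence) >= self.max_touches
        if mean_probs[best] >= self.threshold or forced:
            return best    # sufficiently confident, or best guess forced
        return None        # withhold a conclusion for now
```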
For some embodiments, the classifier may include a sensing system configured to continuously sample vibro-acoustic data and save it into a buffer. The buffer can have any of a variety of lengths, for example, 50 milliseconds. The classifier may be coupled with a touch screen (or touch sensitive surface) configured to wait for a touch event to occur; any number of touch technologies are possible for the touch screen. When the touch screen detects a touch event, it triggers a conversion, feature extraction, and classification process.
When the touch event is detected, data from the vibro-acoustic buffer is retrieved. Because the touch screen may have some latency, it is often necessary to look backwards in the buffer to find the vibro-acoustic waveform that corresponds to the touch impact (e.g., if the touch screen has a 20 ms latency, it may be necessary to look back in the buffer 20 ms to find the corresponding vibro-acoustic event). All or part of the buffer may be saved and passed to the next process.
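One way to realize the rolling buffer and the backwards look is a deque sized from the sampling rate. This sketch assumes the 50 ms buffer and 20 ms latency used as examples above; the callback names and the 10 ms impact window are hypothetical.

```python
import collections

SAMPLE_RATE = 96000       # Hz (a rate used elsewhere in this description)
BUFFER_MS = 50            # rolling vibro-acoustic buffer length
TOUCH_LATENCY_MS = 20     # assumed touch screen reporting latency

ring = collections.deque(maxlen=SAMPLE_RATE * BUFFER_MS // 1000)

def on_sample(sample):
    ring.append(sample)   # sensing system continuously samples into the buffer

def on_touch_event(window_ms=10):
    """Look backwards past the latency to recover the impact waveform."""
    samples = list(ring)
    offset = SAMPLE_RATE * TOUCH_LATENCY_MS // 1000   # skip the latency gap
    length = SAMPLE_RATE * window_ms // 1000          # waveform length to keep
    end = len(samples) - offset
    return samples[max(0, end - length):end]
```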
The waveform from the sensor is a time-domain representation of the vibro-acoustic signal. During conversion, the signal is converted into other forms, including by filtering the waveform and by transforming it into other representations, such as frequency domain representations. During extraction, touch screen controller data and vibro-acoustic data are analyzed, and features that characterize different users are extracted. For the vibro-acoustic data, features are computed for all representations of the signal. These features are then passed to the classifier, which uses the information to label the touch event with a user (in addition to whatever the touch sensitive surface reports, e.g., X/Y position, major/minor axes, pressure, etc.). The augmented touch event may then be passed to the operating system (OS) or end user applications, to associate a user based on the touch event.
Referring to the drawings, when a user uses a finger to touch a surface of the touch screen 100, the touch event produces a vibro-acoustic response in the air and also mechanical vibrations inside the contacting surface (e.g., touch screen, enclosure, device chassis). Some embodiments of the present invention may utilize both sources of vibro-acoustic signal with one or more sensors (e.g., one for in-air acoustics, and one for mechanical vibrations, also referred to as structural acoustics). Several sensor types can be used including, for example, piezo bender elements, piezo film, accelerometers (e.g., linear variable differential transformer (LVDT), potentiometric, variable reluctance, piezoelectric, piezoresistive, capacitive, servo (force balance), or MEMS), displacement sensors, velocity sensors, vibration sensors, gyroscopes, proximity sensors, electric microphones, hydrophones, condenser microphones, electret condenser microphones, dynamic microphones, ribbon microphones, carbon microphones, piezoelectric microphones, fiber optic microphones, laser microphones, liquid microphones, and MEMS microphones.
Many touch screen computing systems have microphones and accelerometers built in (e.g., for voice and input sensing). These can be utilized without the need for additional sensors, or can work in concert with specialized sensors.
The sensor may capture a waveform, which is a time-domain representation of the vibro-acoustic signal. The signal may be converted into other forms. This includes filtering the waveform (e.g., Kalman filter, exponential moving average, 2 kHz high pass filter, One Euro filter, Savitzky-Golay filter). It also includes transformation into other representations (e.g., wavelet transform, derivative), including frequency domain representations (e.g., spectral plot, periodogram, method of averaged periodograms, Fourier transform, least-squares spectral analysis, Welch's method, discrete cosine transform (DCT), fast folding algorithm).
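A sketch of this conversion step, assuming SciPy is available, is shown below; the filter order and smoothing window length are arbitrary illustrative choices, not values taken from this disclosure.

```python
import numpy as np
from scipy.signal import butter, sosfilt, savgol_filter

def convert(waveform, fs=96000):
    """Derive filtered and frequency-domain representations of one waveform."""
    waveform = np.asarray(waveform, dtype=float)
    sos = butter(4, 2000, btype="highpass", fs=fs, output="sos")
    highpassed = sosfilt(sos, waveform)          # 2 kHz high pass filter
    smoothed = savgol_filter(waveform, 31, 3)    # Savitzky-Golay filter (31-sample window)
    derivative = np.diff(waveform)               # a simple transformed representation
    spectrum = np.abs(np.fft.rfft(highpassed))   # frequency domain (FFT magnitude)
    return highpassed, smoothed, derivative, spectrum
```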
The availability of the following touch features depends on the touch screen technology used, and a classifier can use none, some, or all of them. These features may include: location of touch contact (2D, or 3D in the case of curved glass or other non-planar geometry); size of touch contact (some touch technologies provide an ellipse of the touch contact with major and minor axes); rotation of the touch contact; surface area of the touch contact (e.g., in square mm or pixels); pressure of touch (available on some touch systems); shear of touch (shear stress, also called tangential force in the literature, arises from a force vector perpendicular to the surface normal of a touch screen, in contrast to normal stress, commonly called pressure, which arises from a force vector parallel to the surface normal); number of touch contacts; capacitance of touch (if using a capacitive touch screen); swept frequency capacitance of touch (if using a swept frequency capacitive touch screen); swept frequency impedance of touch (if using a swept frequency capacitive touch screen); and shape of touch (some touch technologies can provide the actual shape of the touch, and not just a circle or ellipse); and image of the hand pose (as imaged by, e.g., an optical sensor, a diffusely illuminated surface with a camera, or near-range capacitive sensing). It may be noted that the computation phase may also compute the derivative of one or more of the above features over a short period of time, for example, touch velocity and pressure velocity, as sketched below.
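For instance, touch velocity and pressure velocity can be derived as discrete derivatives of successive touch controller reports. The tuple layout below is a hypothetical one standing in for whatever the touch controller actually delivers.

```python
def touch_derivatives(reports):
    """reports: sequence of (timestamp_s, x, y, pressure) touch controller samples."""
    feats = []
    for (t0, x0, y0, p0), (t1, x1, y1, p1) in zip(reports, reports[1:]):
        dt = t1 - t0
        # touch velocity: distance moved between reports divided by elapsed time
        touch_velocity = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt
        # pressure velocity: rate of change of reported pressure
        pressure_velocity = (p1 - p0) / dt
        feats.append((touch_velocity, pressure_velocity))
    return feats
```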
For some embodiments, the time domain and frequency domain representations of the signal, including the 1st, 2nd, and 3rd order derivatives of such representations, may be used as features. For some embodiments, filtered versions of the time domain and frequency domain representations, and the 1st, 2nd, and 3rd order derivatives of such filtered versions, may also be used as features.
The following features may be computed on time domain and frequency domain representations of the signal, including 1st, 2nd and 3rd order derivatives of such representations, and further on filtered versions of the time domain and frequency domain representations and the 1st, 2nd and 3rd order derivatives of such filtered versions: average, standard deviation, standard deviation (normalized by overall amplitude), variance, skewness, kurtosis, sum, absolute sum, root mean square (RMS), crest factor, dispersion, entropy, power sum, center of mass, coefficient of variation, cross correlation (i.e., sliding dot product), zero-crossings, seasonality (i.e., cyclic variation), and DC bias. Template match scores against a set of known exemplar signals may also be computed, using methods such as convolution, the inverse filter matching technique, sum-squared difference (SSD), dynamic time warping, and elastic matching.
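Two of the template match scores named above, the sum-squared difference and the sliding dot product, can be sketched as follows, assuming NumPy arrays for the incoming waveform and the known exemplar.

```python
import numpy as np

def ssd_score(signal, template):
    """Sum-squared difference against a known exemplar (lower means closer)."""
    n = min(len(signal), len(template))
    return float(np.sum((np.asarray(signal[:n]) - np.asarray(template[:n])) ** 2))

def sliding_dot_product_score(signal, template):
    """Peak of the sliding dot product (cross correlation).
    Assumes len(signal) >= len(template)."""
    return float(np.max(np.correlate(signal, template, mode="valid")))
```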
The following features may be computed on frequency domain representations, including 1st, 2nd and 3rd order derivatives of such representations, and further on filtered versions of the frequency domain representations and the 1st, 2nd and 3rd order derivatives of such filtered versions: spectral centroid, spectral density, spherical harmonics, total average spectral energy, band energy ratio (e.g., for every octave), log spectral band ratios (e.g., for every pair of octaves, and every pair of thirds), additional vibro-acoustic features, linear prediction-based cepstral coefficients (LPCC), perceptual linear prediction (PLP), cepstral coefficients, cepstrum coefficients, mel-frequency cepstral coefficients (MFCC), and frequency phases (e.g., as generated by an FFT).
For some embodiments, all of the above features may be computed on the content of the entire buffer (e.g., 1 ms) and also for sub-regions (e.g., around the peak of the waveform, or the end of the waveform). For some embodiments, all of the above vibro-acoustic features may be combined to form hybrid features such as, for example, a ratio (e.g., zero-crossings/spectral centroid) or a difference (e.g., zero-crossings − spectral centroid).
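A hybrid feature is then simply an arithmetic combination of two base features; the sketch below assumes the feature dictionary produced by the earlier illustrative feature function, with hypothetical key names.

```python
def hybrid_features(feats):
    """Combine base vibro-acoustic features into hybrid ones (illustrative names)."""
    return {
        # ratio hybrid: zero-crossings / spectral centroid
        "zc_over_centroid": feats["zero_crossings"] / feats["spectral_centroid"],
        # difference hybrid: zero-crossings - spectral centroid
        "zc_minus_centroid": feats["zero_crossings"] - feats["spectral_centroid"],
    }
```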
The classification engine may use any number of approaches, including but not limited to basic heuristics, decision trees, support vector machines, random forests, naïve Bayes, elastic matching, dynamic time warping, template matching, k-means clustering, the k-nearest neighbors algorithm, neural networks, multilayer perceptrons, multinomial logistic regression, Gaussian mixture models, and AdaBoost.
For some embodiments, it may be possible to combine results from several different classifiers, for example, through a voting scheme. It may also be possible to use different classifiers based on one or more features. For example, two classifiers may be employed: one for processing sensor waveforms with a high standard deviation, and another for waveforms with a low standard deviation.
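Both ideas, majority voting and routing by a feature such as standard deviation, can be sketched as follows; the cutoff value and the scikit-learn-style classifier interface are assumptions.

```python
import numpy as np

def route_by_std(waveform, features, high_std_clf, low_std_clf, cutoff=0.1):
    """Pick one of two classifiers based on the waveform's standard deviation."""
    clf = high_std_clf if np.std(waveform) > cutoff else low_std_clf
    return clf.predict([features])[0]

def majority_vote(classifiers, features):
    """Combine several classifiers' results through a simple voting scheme."""
    labels = [clf.predict([features])[0] for clf in classifiers]
    return max(set(labels), key=labels.count)
```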
The touch screen 100 is an electronic visual display and also serves as an input/output device, supplementing or substituting for a keyboard, a mouse, and/or other types of devices. The touch screen 100 displays one or more interactive elements, such as graphical representations of services or applications designed to perform a specific function on the computing system. Touching the interactive elements with the finger parts of a user, including the conventional tip of the finger, causes the OS 130 to activate the application or service related to the interactive elements appropriate to the identified user. Fingers are diverse appendages, both in their motor capabilities and their anatomical compositions. Different users' fingers have different vibro-acoustic properties due to differences in bone density, fleshiness, skin quality, BMI, and the like. A single digit contains different parts such as one or more knuckles, a tip, a pad, and a fingernail. A user who tends to use the same finger part when activating a touch screen may add to the likelihood of a correct user identification.
The fingertip includes the fleshy mass on the palmar aspect of the extremity of the finger, as well as the finger sides up to the distal interphalangeal articulation. It also includes the very tip of the finger (i.e., the distal tip of the distal phalanx). However, the fingernail may not be included in an embodiment as part of the fingertip definition, as it is an anatomically distinct feature and region.
The fingernail may encompass all parts of the keratin (or artificial nail material), horn-like envelope covering the dorsal aspect of the terminal phalanges of fingers. The knuckle may include the immediate areas surrounding the bony joints of human fingers, including joints on the thumb, and both major and minor knuckles. The bony regions may be within a 1 cm radius surrounding the metacarpophalangeal joints and interphalangeal articulations.
When an object strikes a certain material, vibro-acoustic waves propagate outward through the material or along the surface of the material. Typically, interactive surfaces use rigid materials, such as plastic or glass, which both quickly distribute and faithfully preserve the signal. As such, when one or more fingers touch or contact the surface of the touch screen 100, vibro-acoustic responses are produced. The vibro-acoustic characteristics of the respective user fingers and their respective unique anatomical characteristics produce unique responses for each user.
Referring back to the drawings, the OS 130 runs the computing system so that the function can be activated in line with the classification of the vibro-acoustic signals and the corresponding user. The vibro-acoustic classifier 120 includes a segmentation unit 122 to segment the vibro-acoustic signal into a digital representation; a conversion unit 124 to convert the digitized vibro-acoustic signal into an electrical signal; a feature extraction unit 126 to derive a series of features from the electrical signal; and a classification unit 128 to classify each user using the above-described features to distinguish among multiple users.
The segmentation unit 122 may be configured to sample the vibro-acoustic signal, for example, at a sampling rate of 96 kHz, using a sliding window of 4096 samples of the vibro-acoustic signal. The conversion unit 124 may be configured to perform, for example, a Fourier Transform on the sampled time-dependent vibro-acoustic signal to produce an electrical signal having a frequency domain representation. For example, the Fourier Transform of this window may produce 2048 bands of frequency power.
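The segmentation and conversion just described can be sketched with NumPy. The 96 kHz rate, 4096-sample window, and 2048 frequency bands follow the example above; the 50% window overlap is an assumption for illustration.

```python
import numpy as np

SAMPLE_RATE = 96000   # Hz, per the example above
WINDOW = 4096         # samples per sliding window

def segment_and_convert(signal):
    """Slide a 4096-sample window over the signal and FFT each window."""
    for start in range(0, len(signal) - WINDOW + 1, WINDOW // 2):
        window = np.asarray(signal[start:start + WINDOW], dtype=float)
        # rfft of 4096 samples yields 2049 bins; keep 2048 bands of frequency power
        bands = np.abs(np.fft.rfft(window))[:2048]
        yield window, bands
```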
The vibro-acoustic classifier 120 may be configured to down-sample this data into additional vectors (i.e., buckets of ten), providing a different aliasing. In addition, additional time-domain features may be calculated from the vibro-acoustic signal, such as the average absolute amplitude, total absolute amplitude, standard deviation of the absolute amplitude, the center of mass for both the segmented input signal and the Fourier Transformed signal, and zero crossings.
The feature extraction unit 126 may be configured to calculate a series of features from the frequency domain representation of the vibro-acoustic signals, such as the fundamental frequency of the impact waveform. The classification unit 128 may be configured to classify the vibro-acoustic signal using the features to distinguish which user generated the touch event, so that the computing system may selectively activate a function related to the identified user depending on the classified vibro-acoustic signals. To aid the classification operation, a user can provide supplemental training samples to the vibro-acoustic classifier 120. For some embodiments, the classification unit 128 may be implemented with a support vector machine (SVM) for feature classification. The SVM is a supervised learning model with associated learning algorithms that analyze data and recognize patterns, and is used for classification and regression analysis.
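A minimal scikit-learn sketch of such an SVM-based classification unit follows. The feature dimensionality and the random arrays are hypothetical stand-ins for supplemental training samples gathered from real users.

```python
import numpy as np
from sklearn.svm import SVC

# Stand-in training data: rows are feature vectors from labeled touch events,
# labels identify which user produced each touch (hypothetical values).
rng = np.random.default_rng(0)
X_train = rng.random((40, 8))
y_train = np.repeat([0, 1], 20)            # two known users

clf = SVC(kernel="rbf", probability=True)  # support vector machine classifier
clf.fit(X_train, y_train)

new_touch = rng.random((1, 8))             # features from an incoming touch event
user = clf.predict(new_touch)[0]
confidence = clf.predict_proba(new_touch).max()
```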
These and other aspects of the disclosure may be implemented by various types of hardware, software, firmware, etc. For example, some features of the disclosure may be implemented, at least in part, by machine-readable media that include program instructions, state information, etc., for performing various operations described herein. Examples of program instructions include both machine code, such as produced by a compiler, and files containing higher-level code that may be executed by the computer using an interpreter. Examples of machine-readable media include, but are not limited to, magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM disks; magneto-optical media; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (“ROM”) and random access memory (“RAM”).
Any of the above embodiments may be used alone or together with one another in any combination. Although various embodiments may have been motivated by various deficiencies with the prior art, which may be discussed or alluded to in one or more places in the specification, the embodiments do not necessarily address any of these deficiencies. In other words, different embodiments may address different deficiencies that may be discussed in the specification. Some embodiments may only partially address some deficiencies or just one deficiency that may be discussed in the specification, and some embodiments may not address any of these deficiencies.
While various embodiments have been described herein, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of the present application should not be limited by any of the embodiments described herein, but should be defined only in accordance with the following and later-submitted claims and their equivalents.
This application is a continuation of and claims benefit under 35 U.S.C. § 120 to U.S. application No. 14/483,150, filed Sep. 11, 2014 by Julia Schwarz et al., now U.S. Pat. No. 9,329,715, issued May 3, 2016.
Number | Name | Date | Kind |
---|---|---|---|
2008028 | McCortney et al. | Jul 1935 | A |
2430005 | Denneen et al. | Nov 1947 | A |
3354531 | Pryor | Nov 1967 | A |
4561105 | Crane et al. | Dec 1985 | A |
4597932 | Kurihara et al. | Jul 1986 | A |
4686332 | Greanias et al. | Aug 1987 | A |
5483261 | Yasutake | Jan 1996 | A |
5544265 | Bozinovic et al. | Aug 1996 | A |
5596656 | Goldberg | Jan 1997 | A |
5615285 | Beernink | Mar 1997 | A |
5625818 | Zarmer et al. | Apr 1997 | A |
5666438 | Beernink et al. | Sep 1997 | A |
5867163 | Kurtenbach | Feb 1999 | A |
5933514 | Ostrem et al. | Aug 1999 | A |
6028593 | Rosenberg et al. | Feb 2000 | A |
6118435 | Fujita et al. | Sep 2000 | A |
6208330 | Hasegawa et al. | Mar 2001 | B1 |
6212295 | Ostrem et al. | Apr 2001 | B1 |
6222465 | Kumar et al. | Apr 2001 | B1 |
6246395 | Goyins et al. | Jun 2001 | B1 |
6252563 | Tada et al. | Jun 2001 | B1 |
6323846 | Westerman et al. | Nov 2001 | B1 |
6337698 | Kelly, Jr. et al. | Jan 2002 | B1 |
6492979 | Kent et al. | Dec 2002 | B1 |
6504530 | Wilson et al. | Jan 2003 | B1 |
6643663 | Dabney et al. | Nov 2003 | B1 |
6707451 | Nagaoka | Mar 2004 | B1 |
6748425 | Duffy et al. | Jun 2004 | B1 |
6772396 | Cronin et al. | Aug 2004 | B1 |
6933930 | Devige et al. | Aug 2005 | B2 |
6943665 | Chornenky | Sep 2005 | B2 |
7050955 | Carmel et al. | May 2006 | B1 |
7084884 | Nelson et al. | Aug 2006 | B1 |
7098896 | Kushler et al. | Aug 2006 | B2 |
7212197 | Schkolne et al. | May 2007 | B1 |
7443396 | Ilic | Oct 2008 | B2 |
7581194 | Iwema et al. | Aug 2009 | B2 |
7982724 | Hill | Jul 2011 | B2 |
8086971 | Radivojevic et al. | Dec 2011 | B2 |
8144126 | Wright | Mar 2012 | B2 |
8154524 | Wilson et al. | Apr 2012 | B2 |
8154529 | Sleeman et al. | Apr 2012 | B2 |
8170346 | Ludwig | May 2012 | B2 |
8199126 | Taubman | Jun 2012 | B1 |
8253744 | Macura et al. | Aug 2012 | B2 |
8269744 | Agari et al. | Sep 2012 | B2 |
8327029 | Purser | Dec 2012 | B1 |
8441790 | Pance et al. | May 2013 | B2 |
8547357 | Aoyagi | Oct 2013 | B2 |
8624878 | Sarwar et al. | Jan 2014 | B2 |
8670632 | Wilson | Mar 2014 | B2 |
8674943 | Westerman et al. | Mar 2014 | B2 |
8743091 | Bernstein | Jun 2014 | B2 |
8760395 | Kim et al. | Jun 2014 | B2 |
8762332 | Keebler et al. | Jun 2014 | B2 |
8769524 | Bhullar et al. | Jul 2014 | B2 |
9013452 | Harrison et al. | Apr 2015 | B2 |
9019244 | Harrison | Apr 2015 | B2 |
9030498 | Galor et al. | May 2015 | B2 |
9052772 | West | Jun 2015 | B2 |
9060007 | Keebler et al. | Jun 2015 | B2 |
9182882 | Fowler et al. | Nov 2015 | B2 |
9329688 | Harrison | May 2016 | B2 |
9329715 | Schwarz et al. | May 2016 | B2 |
9377863 | Bychkov et al. | Jun 2016 | B2 |
9557852 | Tsai et al. | Jan 2017 | B2 |
9612689 | Harrison et al. | Apr 2017 | B2 |
9696859 | Heller et al. | Jul 2017 | B1 |
9864453 | Munemoto et al. | Jan 2018 | B2 |
10082935 | Harrison et al. | Sep 2018 | B2 |
20020009227 | Goldberg et al. | Jan 2002 | A1 |
20020057837 | Wilkinson et al. | May 2002 | A1 |
20020070927 | Fujitsuka et al. | Jun 2002 | A1 |
20020126161 | Kuzunuki et al. | Sep 2002 | A1 |
20030048260 | Matusis | Mar 2003 | A1 |
20030110085 | Murren et al. | Jun 2003 | A1 |
20030132922 | Phillip | Jul 2003 | A1 |
20030217873 | Paradiso et al. | Nov 2003 | A1 |
20040012573 | Morrison et al. | Jan 2004 | A1 |
20040021681 | Liao | Feb 2004 | A1 |
20040054711 | Multer | Mar 2004 | A1 |
20040141010 | Fitzmaurice et al. | Jul 2004 | A1 |
20040160421 | Sullivan | Aug 2004 | A1 |
20040199867 | Brandenborg | Oct 2004 | A1 |
20040225730 | Brown et al. | Nov 2004 | A1 |
20050083313 | Hardie-Bick | Apr 2005 | A1 |
20050131778 | Bennett et al. | Jun 2005 | A1 |
20050146512 | Hill et al. | Jul 2005 | A1 |
20050289461 | Amado et al. | Dec 2005 | A1 |
20060010400 | Dehlin et al. | Jan 2006 | A1 |
20060026535 | Hotelling et al. | Feb 2006 | A1 |
20060031746 | Toepfer et al. | Feb 2006 | A1 |
20060152499 | Roberts | Jul 2006 | A1 |
20060173985 | Moore | Aug 2006 | A1 |
20060184617 | Nicholas et al. | Aug 2006 | A1 |
20060217126 | Sohm et al. | Sep 2006 | A1 |
20060230021 | Diab et al. | Oct 2006 | A1 |
20060288329 | Gandhi et al. | Dec 2006 | A1 |
20070011205 | Majjasie et al. | Jan 2007 | A1 |
20070044010 | Sull et al. | Feb 2007 | A1 |
20070075965 | Huppi et al. | Apr 2007 | A1 |
20070085157 | Fadell et al. | Apr 2007 | A1 |
20070100959 | Eichstaedt et al. | May 2007 | A1 |
20070109279 | Sigona | May 2007 | A1 |
20070126716 | Haverty | Jun 2007 | A1 |
20070168367 | Dickinson et al. | Jul 2007 | A1 |
20070186157 | Walker et al. | Aug 2007 | A1 |
20070192674 | Bodin et al. | Aug 2007 | A1 |
20070245020 | Ott, IV | Oct 2007 | A1 |
20070257767 | Beeson | Nov 2007 | A1 |
20070291297 | Harmon et al. | Dec 2007 | A1 |
20080005666 | Sefton et al. | Jan 2008 | A1 |
20080036743 | Westerman et al. | Feb 2008 | A1 |
20080042978 | Perez-Noguera | Feb 2008 | A1 |
20080082941 | Goldberg et al. | Apr 2008 | A1 |
20080103906 | Singh | May 2008 | A1 |
20080117168 | Liu et al. | May 2008 | A1 |
20080126388 | Naaman | May 2008 | A1 |
20080141132 | Tsai | Jun 2008 | A1 |
20080155118 | Glaser et al. | Jun 2008 | A1 |
20080158147 | Westerman et al. | Jul 2008 | A1 |
20080158168 | Westerman et al. | Jul 2008 | A1 |
20080158185 | Westerman | Jul 2008 | A1 |
20080168403 | Westerman et al. | Jul 2008 | A1 |
20080180406 | Han et al. | Jul 2008 | A1 |
20080244468 | Nishihara et al. | Oct 2008 | A1 |
20080288347 | Sifry | Nov 2008 | A1 |
20080319932 | Yih et al. | Dec 2008 | A1 |
20090025987 | Perski et al. | Jan 2009 | A1 |
20090073144 | Chen et al. | Mar 2009 | A1 |
20090095540 | Zachut et al. | Apr 2009 | A1 |
20090150373 | Davis et al. | Jun 2009 | A1 |
20090157206 | Weinberg et al. | Jun 2009 | A1 |
20090174679 | Westerman | Jul 2009 | A1 |
20090178011 | Ording et al. | Jul 2009 | A1 |
20090231275 | Odgers | Sep 2009 | A1 |
20090232355 | Minear et al. | Sep 2009 | A1 |
20090254869 | Ludwig et al. | Oct 2009 | A1 |
20090259628 | Farrell et al. | Oct 2009 | A1 |
20090262637 | Badaye et al. | Oct 2009 | A1 |
20090315835 | De Goes et al. | Dec 2009 | A1 |
20090318192 | Leblanc et al. | Dec 2009 | A1 |
20100036967 | Caine et al. | Feb 2010 | A1 |
20100060602 | Agari et al. | Mar 2010 | A1 |
20100085216 | Ms | Apr 2010 | A1 |
20100094633 | Kawamura et al. | Apr 2010 | A1 |
20100123666 | Wickholm et al. | May 2010 | A1 |
20100127997 | Park et al. | May 2010 | A1 |
20100194703 | Fedor et al. | Aug 2010 | A1 |
20100214267 | Radivojevic et al. | Aug 2010 | A1 |
20100225601 | Homma et al. | Sep 2010 | A1 |
20100251112 | Hinckley et al. | Sep 2010 | A1 |
20100265185 | Oksanen | Oct 2010 | A1 |
20100271322 | Kondoh et al. | Oct 2010 | A1 |
20100274622 | Kennedy et al. | Oct 2010 | A1 |
20100279738 | Kim et al. | Nov 2010 | A1 |
20100289754 | Sleeman et al. | Nov 2010 | A1 |
20100302184 | East et al. | Dec 2010 | A1 |
20100306649 | Russ et al. | Dec 2010 | A1 |
20100309158 | Iwayama et al. | Dec 2010 | A1 |
20100309933 | Stark et al. | Dec 2010 | A1 |
20110003550 | Klinghult et al. | Jan 2011 | A1 |
20110007000 | Lim | Jan 2011 | A1 |
20110018825 | Kondo | Jan 2011 | A1 |
20110057670 | Jordan | Mar 2011 | A1 |
20110057885 | Lehtovirta | Mar 2011 | A1 |
20110074544 | D'Souza | Mar 2011 | A1 |
20110074701 | Dickinson et al. | Mar 2011 | A1 |
20110080349 | Holbein et al. | Apr 2011 | A1 |
20110133934 | Tan et al. | Jun 2011 | A1 |
20110134063 | Norieda | Jun 2011 | A1 |
20110134083 | Norieda | Jun 2011 | A1 |
20110141066 | Shimotani et al. | Jun 2011 | A1 |
20110145706 | Wilson et al. | Jun 2011 | A1 |
20110164029 | King et al. | Jul 2011 | A1 |
20110167391 | Momeyer et al. | Jul 2011 | A1 |
20110169763 | Westerman et al. | Jul 2011 | A1 |
20110169778 | Nungester et al. | Jul 2011 | A1 |
20110173235 | Aman et al. | Jul 2011 | A1 |
20110175813 | Sarwar et al. | Jul 2011 | A1 |
20110175821 | King | Jul 2011 | A1 |
20110187652 | Huibers | Aug 2011 | A1 |
20110202848 | Ismalon | Aug 2011 | A1 |
20110210943 | Zaliva | Sep 2011 | A1 |
20110231290 | Narcisse et al. | Sep 2011 | A1 |
20110238613 | Shehory et al. | Sep 2011 | A1 |
20110246463 | Carson, Jr. et al. | Oct 2011 | A1 |
20110246503 | Bender et al. | Oct 2011 | A1 |
20110248927 | Michaelis et al. | Oct 2011 | A1 |
20110248948 | Griffin et al. | Oct 2011 | A1 |
20110261083 | Wilson | Oct 2011 | A1 |
20110298798 | Krah | Dec 2011 | A1 |
20110310040 | Ben-Shalom et al. | Dec 2011 | A1 |
20120001875 | Li | Jan 2012 | A1 |
20120007821 | Zaliva | Jan 2012 | A1 |
20120007836 | Wu et al. | Jan 2012 | A1 |
20120011106 | Reid et al. | Jan 2012 | A1 |
20120019562 | Park et al. | Jan 2012 | A1 |
20120051596 | Darnell et al. | Mar 2012 | A1 |
20120056846 | Zaliva | Mar 2012 | A1 |
20120078942 | Cai et al. | Mar 2012 | A1 |
20120096041 | Rao et al. | Apr 2012 | A1 |
20120113017 | Benko et al. | May 2012 | A1 |
20120120000 | Lucic et al. | May 2012 | A1 |
20120131139 | Siripurapu et al. | May 2012 | A1 |
20120146938 | Worfolk et al. | Jun 2012 | A1 |
20120150871 | Hua et al. | Jun 2012 | A1 |
20120158629 | Hinckley et al. | Jun 2012 | A1 |
20120200517 | Nikolovski | Aug 2012 | A1 |
20120206330 | Cao et al. | Aug 2012 | A1 |
20120262407 | Hinckley et al. | Oct 2012 | A1 |
20120274583 | Haggerty | Nov 2012 | A1 |
20120280827 | Kashiwagi et al. | Nov 2012 | A1 |
20120280927 | Ludwig | Nov 2012 | A1 |
20120287056 | Ibdah | Nov 2012 | A1 |
20120287076 | Dao et al. | Nov 2012 | A1 |
20120313969 | Szymczyk et al. | Dec 2012 | A1 |
20120324349 | Pop-Lazarov et al. | Dec 2012 | A1 |
20130009896 | Zaliva | Jan 2013 | A1 |
20130014248 | McLaughlin et al. | Jan 2013 | A1 |
20130027404 | Sarnoff | Jan 2013 | A1 |
20130038554 | West | Feb 2013 | A1 |
20130091123 | Chen et al. | Apr 2013 | A1 |
20130100071 | Wright et al. | Apr 2013 | A1 |
20130176264 | Alameh et al. | Jul 2013 | A1 |
20130176270 | Cattivelli et al. | Jul 2013 | A1 |
20130179773 | Lee | Jul 2013 | A1 |
20130187883 | Lim | Jul 2013 | A1 |
20130215070 | Sasaki | Aug 2013 | A1 |
20130234982 | Kang | Sep 2013 | A1 |
20130246861 | Colley et al. | Sep 2013 | A1 |
20130257757 | Kim | Oct 2013 | A1 |
20130265269 | Sharma et al. | Oct 2013 | A1 |
20130285942 | Ko | Oct 2013 | A1 |
20130287273 | Huang | Oct 2013 | A1 |
20130307814 | Chang | Nov 2013 | A1 |
20130307828 | Miller et al. | Nov 2013 | A1 |
20130316813 | Derome et al. | Nov 2013 | A1 |
20130328813 | Kuo et al. | Dec 2013 | A1 |
20130335333 | Kukulski et al. | Dec 2013 | A1 |
20140007002 | Chang et al. | Jan 2014 | A1 |
20140009401 | Bajaj et al. | Jan 2014 | A1 |
20140022189 | Sheng et al. | Jan 2014 | A1 |
20140032880 | Ka | Jan 2014 | A1 |
20140037951 | Shigetomi et al. | Feb 2014 | A1 |
20140071095 | Godsill | Mar 2014 | A1 |
20140082545 | Zhai et al. | Mar 2014 | A1 |
20140104191 | Davidson et al. | Apr 2014 | A1 |
20140104192 | Davidson et al. | Apr 2014 | A1 |
20140104274 | Hilliges et al. | Apr 2014 | A1 |
20140109004 | Sadhvani et al. | Apr 2014 | A1 |
20140168116 | Sasselli et al. | Jun 2014 | A1 |
20140208275 | Mongia et al. | Jul 2014 | A1 |
20140210788 | Harrison et al. | Jul 2014 | A1 |
20140210791 | Hanauer et al. | Jul 2014 | A1 |
20140240271 | Land et al. | Aug 2014 | A1 |
20140240295 | Harrison | Aug 2014 | A1 |
20140253477 | Shim et al. | Sep 2014 | A1 |
20140267065 | Levesque | Sep 2014 | A1 |
20140267085 | Li et al. | Sep 2014 | A1 |
20140289659 | Harrison et al. | Sep 2014 | A1 |
20140300559 | Tanimoto et al. | Oct 2014 | A1 |
20140327626 | Harrison et al. | Nov 2014 | A1 |
20140331313 | Kim et al. | Nov 2014 | A1 |
20140368436 | Abzarian et al. | Dec 2014 | A1 |
20150002405 | Kuan et al. | Jan 2015 | A1 |
20150035759 | Harrison et al. | Feb 2015 | A1 |
20150077378 | Duffield | Mar 2015 | A1 |
20150145820 | Huang et al. | May 2015 | A1 |
20150242009 | Xiao et al. | Aug 2015 | A1 |
20150253858 | Koukoumidis et al. | Sep 2015 | A1 |
20150293592 | Cheong et al. | Oct 2015 | A1 |
20160012348 | Johnson et al. | Jan 2016 | A1 |
20160018942 | Kang et al. | Jan 2016 | A1 |
20160062545 | Lai | Mar 2016 | A1 |
20160077615 | Schwarz et al. | Mar 2016 | A1 |
20160077650 | Durojaiye et al. | Mar 2016 | A1 |
20160077664 | Harrison et al. | Mar 2016 | A1 |
20160085324 | Schwarz et al. | Mar 2016 | A1 |
20160085333 | Christopher | Mar 2016 | A1 |
20160085372 | Munemoto et al. | Mar 2016 | A1 |
20160098185 | Xiao et al. | Apr 2016 | A1 |
20160117015 | Veneri et al. | Apr 2016 | A1 |
20160156837 | Rodzevski et al. | Jun 2016 | A1 |
20160171192 | Holz et al. | Jun 2016 | A1 |
20160224145 | Harrison et al. | Aug 2016 | A1 |
20160231865 | Harrison et al. | Aug 2016 | A1 |
20160299615 | Schwarz et al. | Oct 2016 | A1 |
20170024892 | Harrison et al. | Jan 2017 | A1 |
20170060279 | Harrison | Mar 2017 | A1 |
20170153705 | Kim et al. | Jun 2017 | A1 |
Number | Date | Country |
---|---|---|
1797305 | Jul 2006 | CN |
1928781 | Mar 2007 | CN |
101111817 | Jan 2008 | CN |
101299174 | Nov 2008 | CN |
101339477 | Jan 2009 | CN |
101410781 | Apr 2009 | CN |
101424974 | May 2009 | CN |
101438218 | May 2009 | CN |
101763190 | Jun 2010 | CN |
101763193 | Jun 2010 | CN |
101921610 | Dec 2010 | CN |
101968696 | Feb 2011 | CN |
102153776 | Aug 2011 | CN |
102362249 | Feb 2012 | CN |
102789332 | Nov 2012 | CN |
103150019 | Jun 2013 | CN |
104020878 | Sep 2014 | CN |
0 938 039 | Aug 1999 | EP |
1 659 481 | May 2006 | EP |
1 762 926 | Mar 2007 | EP |
2 136 358 | Dec 2009 | EP |
2 280 337 | Feb 2011 | EP |
2 344 894 | Jun 2000 | GB |
2 468 742 | Sep 2010 | GB |
H09-69137 | Mar 1997 | JP |
2004-213312 | Jul 2004 | JP |
2005-018611 | Jan 2005 | JP |
2007-524970 | Aug 2007 | JP |
2009-543246 | Dec 2009 | JP |
2011-028555 | Feb 2011 | JP |
2013-519132 | May 2013 | JP |
2013-532495 | Aug 2013 | JP |
10-2002-0075283 | Oct 2002 | KR |
10-2011-0061227 | Jun 2011 | KR |
10-2012-0100351 | Sep 2012 | KR |
94-04992 | Mar 1994 | WO |
2006070044 | Jul 2006 | WO |
2008126347 | Oct 2008 | WO |
2009071919 | Jun 2009 | WO |
2011096694 | Aug 2011 | WO |
2012064034 | May 2012 | WO |
2012166277 | Dec 2012 | WO |
2013059488 | Apr 2013 | WO |
2013061998 | May 2013 | WO |
2014037951 | Mar 2014 | WO |
2014182435 | Nov 2014 | WO |
Entry |
---|
Sarah, M. K. et al., “A Personal Touch—Recognizing Users Based on Touch Screen Behavior,” PhoneSense'12, Toronto, ON, Canada, Nov. 6, 2012, 5 pages. |
Schwarz, J. et al., “Probabilistic Palm Rejection Using Spatiotemporal Touch Features and Iterative Classification,” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2009-2012, Apr. 26-May 1, 2014. |
Search Report dated Apr. 21, 2017 in Chinese Patent Application No. 201580000833.0, 1 page. |
“Swype Advanced Tips”, [http://www.swype.com/tips/advanced-tips], Jun. 25, 2014, retrieved via the Wayback Machine on Jun. 29, 2018, [https://web.archive.org/web/20140625073212/http://www.swype.com/tips/advanced-tips], 2 pages. |
“Swype Basics”, [http://www.swype.com/tips/swype-basics], Jun. 14, 2014, retrieved via the Wayback Machine on Jun. 29, 2018, [https://web.archive.org/web/20140614200707/http://www.swype.com/tips/swype-basics], 2 pages. |
“Swype Tips”, [http://www.swype.com/category/tips], Jul. 2, 2014, retrieved via the Wayback Machine on Jun. 29, 2018, [https://web.archive.org/web/20140702102357/http://www.swype.com/category/tips], 2 pages. |
Kherallah, M. et al., “On-line handwritten digit recognition based on trajectory and velocity modeling,” Pattern Recognition Letters, vol. 29, Issue 5, pp. 580-594, Apr. 1, 2008. |
Non-Final Office Action dated Apr. 15, 2015 in U.S. Appl. No. 13/856,414, 17 pages. |
Non-Final Office Action dated Apr. 16, 2018 in U.S. Appl. No. 13/958,427, 14 pages. |
Non-Final Office Action dated Apr. 19, 2017 in U.S. Appl. No. 14/869,998, 7 pages. |
Non-Final Office Action dated Apr. 26, 2018 in U.S. Appl. No. 14/495,041, 15 pages. |
Non-Final Office Action dated Dec. 20, 2017 in U.S. Appl. No. 14/834,434, 12 pages. |
Non-Final Office Action dated Jul. 8, 2015 in U.S. Appl. No. 14/191,329, 18 pages. |
Non-Final Office Action dated Jul. 11, 2017 in U.S. Appl. No. 14/390,831, 79 pages. |
Non-Final Office Action dated Jul. 17, 2017 in U.S. Appl. No. 15/073,407, 8 pages. |
Non-Final Office Action dated Jul. 19, 2017 in U.S. Appl. No. 14/219,919, 20 pages. |
Non-Final Office Action dated Jun. 9, 2016 in U.S. Appl. No. 14/612,089, 11 pages. |
Non-Final Office Action dated May 7, 2018 in U.S. Appl. No. 14/191,329, 17 pages. |
Non-Final Office Action dated May 9, 2018 in U.S. Appl. No. 13/887,711, 27 pages. |
Non-Final Office Action dated Nov. 15, 2017 in U.S. Appl. No. 15/198,062, 24 pages. |
Non-Final Office Action dated Nov. 24, 2015 in U.S. Appl. No. 14/191,329, 31 pages. |
Non-Final Office Action dated Oct. 8, 2015 in U.S. Appl. No. 13/958,427, 15 pages. |
Non-Final Office Action dated Oct. 18, 2017 in U.S. Appl. No. 15/406,770, 12 pages. |
Non-Final Office Action dated Oct. 19, 2015 in U.S. Appl. No. 14/668,870, 6 pages. |
Non-Final Office Action dated Oct. 23, 2014 in U.S. Appl. No. 14/275,124, 10 pages. |
Non-Final Office Action dated Oct. 25, 2013 in U.S. Appl. No. 13/410,956, 8 pages. |
Non-Final Office Action dated Oct. 28, 2015 in U.S. Appl. No. 14/390,831, 22 pages. |
Non-Final Office Action dated Sep. 8, 2016 in U.S. Appl. No. 14/492,604, 14 pages. |
Notice of Allowance dated Jan. 26, 2015 in U.S. Appl. No. 13/849,698, 27 pages. |
Notice of Allowance dated Dec. 6, 2016 in U.S. Appl. No. 14/751,589, 27 pages. |
Non-Final Office Action dated Jul. 30, 2018 in U.S. Appl. No. 15/406,770, 20 pages. |
Notice of Allowance dated Feb. 2, 2015 in U.S. Appl. No. 13/780,494, 43 pages. |
Non-Final Office Action dated Jun. 26, 2018 in U.S. Appl. No. 14/486,800, 25 pages. |
Final Office Action dated Aug. 8, 2018 in U.S. Appl. No. 14/834,434, 19 pages. |
Non-Final Office Action dated Sep. 2, 2014 in U.S. Appl. No. 13/863,193, 41 pages. |
Final Office Action dated Mar. 4, 2015 in U.S. Appl. No. 13/863,193, 50 pages. |
Non-Final Office Action dated Jan. 7, 2016 in U.S. Appl. No. 13/863,193, 58 pages. |
Final Office Action dated Sep. 15, 2016 in U.S. Appl. No. 13/863,193, 50 pages. |
Non-Final Office Action dated Apr. 6, 2017 in U.S. Appl. No. 13/863,193, 70 pages. |
Final Office Action dated Jan. 9, 2018 in U.S. Appl. No. 13/863,193, 50 pages. |
Notice of Allowance dated May 22, 2018 in U.S. Appl. No. 13/863,193, 73 pages. |
Notice of Allowance dated Sep. 1, 2016 in U.S. Appl. No. 13/856,414, 28 pages. |
Chinese Office Action for Chinese Patent Application No. 201510240522.3 dated Jun. 28, 2018, 30 pages. |
Chinese Office Action for Chinese Patent Application No. 201280062500.7, dated Apr. 27, 2018, 19 pages. |
Chinese Office Action for Chinese Patent Application No. 201280062500.7, dated Oct. 10, 2018, 14 pages. |
Office Action dated Mar. 30, 2018 for U.S. Appl. No. 15/886,562, 44 pages. |
Office Action dated Aug. 10, 2018 for U.S. Appl. No. 15/886,562, 86 pages. |
Japanese Office Action dated Aug. 1, 2018 for Japanese Patent Application No. 2017-049566, 9 pages (including English translation). |
Korean Office Action dated Jan. 10, 2019 for Korean Patent Application No. 2014-7010323, 11 pages (including English translation). |
Office Action dated Jan. 28, 2019 for U.S. Appl. No. 15/836,798, 30 pages. |
U.S. Appl. No. 14/492,604, filed Sep. 22, 2014, titled: “Method and Apparatus for Improving Accuracy of Touch Screen Event Analysis by Use of Edge Classification.” 35 pages. |
U.S. Appl. No. 14/495,041, filed Sep. 24, 2014, titled: “Method for Improving Accuracy of Touch Screen Event Analysis by Use of Spatiotemporal Touch Patterns.” 34 pages. |
U.S. Appl. No. 14/483,150, filed Sep. 11, 2014, titled: “Method and Apparatus for Differentiating Touch Screen Users Based on Touch Event Analysis.” 38 pages. |
U.S. Appl. No. 14/242,127, filed Apr. 1, 2014, titled: “Method and Apparatus for Classifying Touch Events on a Touch Sensitive Surface.” 36 pages. |
U.S. Appl. No. 13/849,698, filed Mar. 23, 2013, titled: “Method and System for Activating Different Interactive Functions Using Different Types of Finger Contacts.” 52 pages. |
U.S. Appl. No. 13/780,494, filed Feb. 28, 2013, titled: “Input Tools Having Vibro-Acoustically Distinct Regions and Computing Device for Use With the Same.” 34 pages. |
Final Office Action dated Jul. 12, 2017 in U.S. Appl. No. 14/495,041, 14 pages. |
Final Office Action dated Jul. 18, 2017 in U.S. Appl. No. 14/191,329, 17 pages. |
Final Office Action dated Jun. 8, 2016 in U.S. Appl. No. 14/495,041, 16 pages. |
Final Office Action dated Jun. 30, 2017 in U.S. Appl. No. 13/958,427, 15 pages. |
Final Office Action dated Mar. 7, 2018 in U.S. Appl. No. 14/219,919, 21 pages. |
Final Office Action dated Mar. 28, 2016 in U.S. Appl. No. 13/958,427, 16 pages. |
Final Office Action dated May 6, 2016 in U.S. Appl. No. 14/191,329, 17 pages. |
Final Office Action dated May 13, 2016 in U.S. Appl. No. 14/390,831, 6 pages. |
Final Office Action dated May 20, 2016 in U.S. Appl. No. 14/503,894, 17 pages. |
Final Office Action dated Nov. 9, 2016 in U.S. Appl. No. 14/612,089, 11 pages. |
Final Office Action dated Nov. 23, 2015 in U.S. Appl. No. 14/668,870, 14 pages. |
Final Office Action dated Sep. 6, 2017 in U.S. Appl. No. 14/486,800, 17 pages. |
International Search Report and Written Opinion dated Jul. 8, 2013 in International Application No. PCT/CA2013/000292, 9 pages. |
International Search Report and Written Opinion dated Jun. 6, 2012 in International Patent Application No. PCT/CA2012/050127, 10 pages. |
“Making it Easier to Share With Who You Want,” Facebook, Aug. 23, 2011, last updated on Dec. 12, 2012 retrieved from https://www.facebook.com/notes/facebook/making-it-easier-to-share-with-who-you-want/10150251867797131/, retrieved on Jun. 1, 2018, 14 pages. |
Cheng, B. et al., “SilentSense: Silent User Identification via Dynamics of Touch and Movement Behavioral Biometrics,” Cryptography and Security (cs.CR); Human-Computer Interaction, 9 pages, Aug. 31, 2013. |
Furui, S., “Digital Speech Processing, Synthesis, and Recognition,” Marcel Dekker, Inc., 2001, 40 pages. |
English Translation of Chinese Office Action dated Nov. 3, 2017 in Chinese Application No. 201480002856.0, 12 pages. |
English Translation of Final Rejection dated Apr. 27, 2015 in Korean Patent Application No. 10-2014-0027979, 3 pages. |
English Translation of Final Rejection dated Dec. 12, 2014 in Korean Patent Application No. 10-2014-0027979, 3 pages. |
English Translation of First Office Action dated Feb. 27, 2017 in Chinese Application No. 201480002879.1, 13 pages. |
English Translation of First Office Action dated May 2, 2017 in Chinese Patent Application No. 201580000833.0, 9 pages. |
English Translation of First Office Action dated Oct. 11, 2017 in Chinese Patent Application No. 20150209998.0, 10 pages. |
English Translation of Notification of Reason for Refusal dated Jul. 10, 2014 in Korean patent application No. 10-2014-0027979, 3 pages. |
Final Office Action dated Jan. 5, 2018 in U.S. Appl. No. 14/503,894, 16 pages. |
English Translation of Second Office Action dated Jul. 6, 2017 in Chinese Application No. 201480002879.1, 14 pages. |
English Translation of Third Office Action dated Oct. 16, 2017 in Chinese Application No. 201480002879.1, 4 pages. |
Communication pursuant to Article 94(3) EPC dated Feb. 26, 2018 for European Patent Application No. 14785422.8, 7 pages. |
Communication pursuant to Article 94(3) EPC dated Mar. 5, 2018 for European Patent Application No. 14794212.2, 5 pages. |
Extended European Search Report dated Apr. 16, 2018 in European Application No. 15845310.0, 7 pages. |
Extended European Search Report dated Aug. 11, 2016 in European Patent Application No. 14785422.8, 8 pages. |
Extended European Search Report dated Aug. 25, 2017 in European Patent Application No. 15748667.1, 10 pages. |
Extended European Search Report dated Jul. 22, 2014 in European Patent Application No. 12755563.9, 5 pages. |
Extended European Search Report dated Mar. 16, 2018 in European Patent Application No. 15842839.1, 7 pages. |
Extended European Search Report dated Mar. 19, 2018 in European Patent Application No. 15840819.5, 9 pages. |
Extended European Search Report dated Mar. 19, 2018 in European Patent Application No. 15843933.1, 8 pages. |
Extended European Search Report dated Mar. 27, 2018 in European Patent Application No. 15843989.3, 8 pages. |
Extended European Search Report dated May 14, 2018 in European Patent Application No. 15847469.2, 11 pages. |
Weidong, S. et al., “SenGuard: Passive user identification on smartphones using multiple sensors,” IEEE 7th International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob), pp. 141-148, 2011. |
Final Office Action dated Feb. 9, 2016 in U.S. Appl. No. 14/486,800, 14 pages. |
Final Office Action dated Feb. 26, 2016 in U.S. Appl. No. 14/492,604, 16 pages. |
Non-Final Office Action dated Sep. 9, 2016 in U.S. Appl. No. 13/887,711, 24 pages. |
Non-Final Office Action dated Sep. 29, 2016 in U.S. Appl. No. 14/834,434, 12 pages. |
Pedro, L. et al., “Augmenting touch interaction through acoustic sensing,” Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces, pp. 53-56, Nov. 13-16, 2011. |
Final Office Action received for U.S. Appl. No. 15/075,648 dated Dec. 21, 2018, 13 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/815,679 dated Sep. 28, 2018, 69 pages. |
Final Office Action received for U.S. Appl. No. 15/198,062 dated Sep. 6, 2018, 32 pages. |
Chinese Office Action dated Apr. 21, 2017 for Chinese Patent Application No. 201480022056.5, 23 pages. (with Translation). |
Chinese Office Action dated Feb. 9, 2018 for Chinese Patent Application No. 201480022056.5, 19 pages. (with Translation). |
Non-Final Office Action received for U.S. Appl. No. 16/126,175 dated Nov. 1, 2018, 86 pages. |
Third Chinese Office Action received for Chinese Patent Application No. 201480022056.5 dated Jul. 19, 2018, 6 pages (with English translation). |
Communication pursuant to Article 94(3) EPC for European Patent Application No. 14785422.8 dated Nov. 22, 2018, 5 pages. |
Communication pursuant to Article 94(3) EPC for European Patent Application No. 15845310.0 dated Jan. 3, 2019, 4 pages. |
Communication pursuant to Article 94(3) EPC for European Patent Application No. 15840819.5 dated Jan. 23, 2019, 6 pages. |
Communication pursuant to Article 94(3) EPC for European Patent Application No. 15842839.1 dated Apr. 9, 2019, 7 pages. |
Chinese First Office Action received for Chinese Patent Application No. 201510240372.6 dated Sep. 27, 2018, 18 pages. |
Chinese Second Office Action received for Chinese Patent Application No. 201510240372.6 dated May 15, 2019, 16 pages. |
Communication pursuant to Article 94(3) EPC for European Patent Application No. 15843933.1 dated Jan. 23, 2019, 6 pages. |
Chinese Search Report received for Chinese Patent Application No. 201580053216.7, dated Apr. 16, 2019, 2 pages. |
European Search Report received for European Patent Application No. 16839786.7, dated Feb. 12, 2019, 8 pages. |
Communication pursuant to Rules 70(2) and 70a(2) EPC received for European Patent Application No. 16839786.7 dated Mar. 1, 2019, 1 page. |
Chinese Second Office Action received for Chinese Patent Application No. 201580000833.0 dated Jan. 15, 2018, 17 pages. |
European Search Report received for European Patent Application No. 16818725.0, dated Dec. 21, 2018, 8 pages. |
Communication pursuant to Rules 70(2) and 70a(2) EPC received for European Patent Application No. 16818725.0 dated Jan. 8, 2019, 1 page. |
First Office Action received for Canadian Patent Application No. 2869699, dated Nov. 27, 2014, 3 pages. |
Second Office Action received for Canadian Patent Application No. 2869699, dated Jun. 14, 2016, 4 pages. |
Third Office Action received for Canadian Patent Application No. 2869699, dated Jan. 9, 2017, 3 pages. |
First Examination report received for Australian Patent Application No. 2012225130, dated Feb. 9, 2015, 4 pages. |
First Office Action received for Canadian Patent Application No. 2802746, dated Apr. 9, 2013, 3 pages. |
Communication pursuant to Article 94(3) EPC received for European Patent Application No. 14832247.2 dated May 3, 2019, 7 pages. |
Final Office Action received for U.S. Appl. No. 15/075,648 dated May 31, 2019, 17 pages. |
European Search Report dated Apr. 8, 2019 for European Application No. 18195588.1, 7 pages. |
Office Action dated Jul. 5, 2019 for U.S. Appl. No. 15/836,798, 95 pages. |
Final Office Action received for U.S. Appl. No. 14/684,407 dated Jan. 18, 2017, 20 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/684,407 dated Aug. 2, 2017, 14 pages. |
Final Office Action received for U.S. Appl. No. 14/684,407 dated Mar. 12, 2018, 14 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/612,089 dated May 31, 2017, 21 pages. |
Final Office Action received for U.S. Appl. No. 15/073,407, dated Dec. 20, 2016, 49 pages. |
Non-Final Office Action received for U.S. Appl. No. 13/958,427, dated Nov. 10, 2016, 22 pages. |
Final Office Action received for U.S. Appl. No. 14/219,919, dated Aug. 26, 2016, 24 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/191,329, dated Feb. 2, 2017, 20 pages. |
Final Office Action received for U.S. Appl. No. 13/887,711, dated Jun. 8, 2017, 33 pages. |
Non-Final Office Action received for U.S. Appl. No. 15/075,648, dated Apr. 21, 2017, 8 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/486,800, dated Dec. 1, 2016, 29 pages. |
Final Office Action received for U.S. Appl. No. 14/492,604, dated Mar. 17, 2017, 37 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/495,041, dated Nov. 25, 2016, 35 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/503,894, dated May 16, 2017, 33 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/684,407, dated Sep. 14, 2018, 24 pages. |
Final Office Action received for U.S. Appl. No. 14/834,434, dated May 1, 2017, 18 pages. |
Non-Final Office Action received for U.S. Appl. No. 14/751,589, dated Jun. 13, 2016, 20 pages. |
International Search Report and Written Opinion received for PCT Application No. PCT/US2016/044552, dated Oct. 17, 2016, 14 pages.
International Search Report and Written Opinion received for PCT Application No. PCT/US2016/040194, dated Sep. 19, 2016, 7 pages.
International Search Report and Written Opinion received for PCT Application No. PCT/US2015/051582, dated Feb. 26, 2016, 12 pages.
International Search Report and Written Opinion received for PCT Application No. PCT/US2015/051106, dated Jan. 28, 2016, 9 pages.
International Search Report and Written Opinion received for PCT Application No. PCT/US2015/047616, dated Jul. 1, 2016, 7 pages.
European Patent Office Extended Search Report for EP 14 83 2247, dated Feb. 23, 2017, 11 pages.
European Patent Office Extended Search Report for EP 14 79 4212, dated Nov. 9, 2016, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 13/958,427, dated Mar. 13, 2015, 50 pages.
Final Office Action received for U.S. Appl. No. 13/958,427, dated Jun. 19, 2015, 17 pages.
Non-Final Office Action received for U.S. Appl. No. 13/887,711, dated Apr. 6, 2015, 36 pages.
Final Office Action received for U.S. Appl. No. 14/191,329, dated Aug. 7, 2015, 29 pages.
Non-Final Office Action received for U.S. Appl. No. 14/492,604, dated Oct. 1, 2015, 16 pages.
International Search Report and Written Opinion received for PCT Application No. PCT/US2014/049485 dated Nov. 17, 2014, 9 pages.
International Search Report and Written Opinion received for PCT Application No. PCT/US2014/033380 dated Mar. 13, 2015, 7 pages.
International Search Report and Written Opinion received for PCT Application No. PCT/US2014/034977 dated Sep. 18, 2014, 8 pages.
Non-Final Office Action received for U.S. Appl. No. 14/483,150 dated Dec. 18, 2015, 7 pages.
Non-Final Office Action received for U.S. Appl. No. 14/486,800, filed Sep. 15, 2014, dated Oct. 2, 2015, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 14/503,894, dated Dec. 30, 2015, 18 pages.
Non-Final Office Action received for U.S. Appl. No. 14/219,919, dated Jan. 29, 2016, 11 pages.
Non-Final Office Action received for U.S. Appl. No. 13/887,711, dated Nov. 5, 2015, 19 pages.
Final Office Action received for U.S. Appl. No. 13/887,711, dated Feb. 24, 2016, 23 pages.
International Search Report and Written Opinion received for PCT Application No. PCT/US2015/051355, dated Dec. 15, 2015, 9 pages.
International Search Report and Written Opinion received for PCT Application No. PCT/US2015/047428, dated Nov. 27, 2015, 6 pages.
International Search Report and Written Opinion received for PCT Application No. PCT/US2015/050570, dated Dec. 17, 2015, 8 pages.
International Search Report and Written Opinion received for PCT Application No. PCT/US2015/014581, dated May 14, 2015, 7 pages.
Non-Final Office Action received for U.S. Appl. No. 14/495,041, dated Oct. 7, 2015, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 15/073,407, dated Jun. 13, 2016, 49 pages.
Final Office Action received for U.S. Appl. No. 13/849,698, dated Nov. 28, 2014, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 13/849,698, dated Jun. 24, 2014, 21 pages.
Non-Final Office Action received for U.S. Appl. No. 13/780,494, dated Oct. 16, 2014, 10 pages.
U.S. Appl. No. 13/958,427, filed Aug. 2, 2013, titled: "Capture of Vibro-Acoustic Data Used to Determine Touch Types."
U.S. Appl. No. 14/191,329, filed Feb. 26, 2014, titled: "Using Capacitive Images for Touch Type Classification."
U.S. Appl. No. 13/887,711, filed May 6, 2013, titled: "Using Finger Touch Types to Interact with Electronic Devices."
Non-Final Office Action received for U.S. Appl. No. 14/242,127 dated Jun. 2, 2015, 33 pages.
Final Office Action received for U.S. Appl. No. 14/242,127 dated Sep. 18, 2015, 28 pages.
Non-Final Office Action received for U.S. Appl. No. 14/242,127 dated Dec. 28, 2015, 38 pages.
Final Office Action received for U.S. Appl. No. 14/242,127 dated Mar. 31, 2016, 34 pages.
Notice of Allowance received for U.S. Appl. No. 14/242,127 dated Apr. 13, 2016, 18 pages.
Notice of Allowance received for U.S. Appl. No. 14/242,127 dated Sep. 2, 2016, 16 pages.
Asano et al., "Real-Time Sound Source Localization and Separation System and Its Application to Automatic Speech Recognition", Proceedings of Eurospeech, 2001, pp. 1013-1016.
Benko et al., "Sphere: Multi-Touch Interactions on a Spherical Display", Proceedings of UIST, 2008, pp. 77-86.
Burges, Christopher J.C., "A Tutorial on Support Vector Machines for Pattern Recognition", Data Mining and Knowledge Discovery, vol. 2, 1998, pp. 121-167.
Cao et al., "ShapeTouch: Leveraging Contact Shape on Interactive Surfaces", IEEE International Workshop on Horizontal Interactive Human Computer System (TABLETOP), 2008, pp. 139-146.
Deyle et al., "Hambone: A Bio-Acoustic Gesture Interface", Proceedings of ISWC, 2007, pp. 1-8.
Dietz et al., "DT Controls: Adding Identity to Physical Interfaces", ACM Symposium on User Interface Software & Technology (UIST), 2005, pp. 245-252.
Dietz et al., "DiamondTouch: A Multi-User Touch Technology", ACM Symposium on User Interface Software & Technology (UIST), 2001, pp. 219-226.
Gutwin et al., "Supporting Informal Collaboration in Shared-Workspace Groupware", Journal of Universal Computer Science, vol. 14, No. 9, 2008, pp. 1411-1434.
Hall et al., "The WEKA Data Mining Software: An Update", SIGKDD Explorations, vol. 11, No. 1, 2009, pp. 10-18.
Harrison et al., "Skinput: Appropriating the Body as an Input Surface", Proceedings of CHI, Apr. 10-15, 2010, pp. 453-462.
Harrison et al., "Scratch Input: Creating Large, Inexpensive, Unpowered and Mobile Finger Input Surfaces", Proceedings of UIST, 2008, pp. 205-208.
Hartmann et al., "Augmenting Interactive Tables with Mice & Keyboards", Proceedings of UIST, 2009, pp. 149-152.
Hinckley et al., "Sensor Synaesthesia: Touch in Motion, and Motion in Touch", Proceedings of CHI, 2011, pp. 801-810.
Hinckley et al., "Pen + Touch = New Tools", Proceedings of UIST, 2010, pp. 27-36.
Hinckley et al., "Manual Deskterity: An Exploration of Simultaneous Pen + Touch Direct Input", Proceedings of CHI, 2010, pp. 2793-2802.
Holz et al., "The Generalized Perceived Input Point Model and How to Double Touch Accuracy by Extracting Fingerprints", Proceedings of CHI, 2010, pp. 581-590.
Kaltenbrunner et al., "reacTIVision: A Computer-Vision Framework for Table-Based Tangible Interaction", Proceedings of TEI, 2007, pp. 69-74.
Matsushita et al., "HoloWall: Designing a Finger, Hand, Body, and Object Sensitive Wall", Proceedings of UIST, 1997, pp. 209-210.
“Mimio”, http://www.mimio.com, retrieved Jul. 8, 2019, 8 pages. |
Olwal et al., “SurfaceFusion: Unobtrusive Tracking of Everyday Objects in Tangible User Interfaces”, Proceedings of GI, 2008, pp. 235-242. |
Paradiso et al., “Tracking and Characterizing Knocks Atop Large Interactive Displays”, Sensor Review, vol. 25, No. 2, 2005, pp. 134-143. |
Paradiso et al., “Sensor Systems for Interactive Surfaces”, IBM Systems Journal, vol. 39 No. 3&4, 2000, pp. 892-914. |
Patten et al., "Sensetable: A Wireless Object Tracking Platform for Tangible User Interfaces", Proceedings of CHI, 2001, pp. 253-260.
Rekimoto et al., "Augmented Surfaces: A Spatially Continuous Work Space for Hybrid Computing Environments", Proceedings of CHI, 1999, pp. 378-385.
Rekimoto et al., "ToolStone: Effective use of the Physical Manipulation Vocabularies of Input Devices", Proceedings of UIST, 2000, pp. 109-117.
Rekimoto et al., "SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces", Proceedings of CHI, 2002, pp. 113-120.
Vandoren et al., "DIP-IT: Digital Infrared Painting on an Interactive Table", Proceedings of CHI, 2008, pp. 2901-2906.
Wang et al., "Empirical Evaluation for Finger Input Properties in Multi-Touch Interaction", Proceedings of CHI, 2009, pp. 1063-1072.
International Search Report and Written Opinion received for International Patent Application No. PCT/US2012/060865 dated Mar. 29, 2013, 10 pages.
Non-Final Office Action received for U.S. Appl. No. 15/206,554 dated Sep. 21, 2016, 36 pages.
Final Office Action received for U.S. Appl. No. 15/206,554 dated Feb. 1, 2017, 20 pages.
Chinese Office Action for Chinese Patent Application No. 201280062500.7 dated Nov. 7, 2016, 9 pages.
Chinese Office Action for Chinese Patent Application No. 201280062500.7 dated Apr. 17, 2017, 15 pages.
Japanese Office Action for Japanese Patent Application No. 2014-537253 dated May 16, 2017, 5 pages.
Seo et al., "Audio Fingerprinting Based on Normalized Spectral Subband Centroids", Proc. ICASSP (U.S.A.), 2005, vol. 3, pp. 213-216. Retrieved on May 29, 2017, 4 pages.
Kunio, "Audio fingerprinting: Techniques and applications", Acoustical Science and Technology, The Acoustical Society of Japan, Feb. 1, 2010, vol. 66, No. 2, pp. 71-76. Retrieved on May 29, 2017, 6 pages.
European Search Report received for European Patent Application No. 12842495.9, dated Jul. 24, 2015, 7 pages.
Chinese Search Report received for Chinese Patent Application No. 201280062500.7, dated Mar. 29, 2016, 1 page.
Chinese Office Action received for Chinese Patent Application No. 201280062500.7, dated Apr. 15, 2016, 11 pages.
Japanese Office Action for Japanese Patent Application No. 2014-537253 dated Nov. 15, 2016, 3 pages.
Japanese Office Action for Japanese Patent Application No. 2014-537253 dated Apr. 26, 2016, 3 pages.
Communication pursuant to Article 94(3) EPC for EP Application No. 12842495.9 dated Jun. 18, 2018, 4 pages.
Japanese Office Action for Japanese Patent Application No. 2017-049566 dated Jun. 5, 2018, 7 pages.
Non-Final Office Action received for U.S. Appl. No. 14/684,407 dated Jul. 8, 2016, 11 pages.
Final Office Action received for U.S. Appl. No. 14/684,407 dated Jun. 10, 2019, 26 pages.
Non-Final Office Action received for U.S. Appl. No. 14/191,329 dated Jul. 16, 2019, 30 pages.
Chinese First Office Action received for Chinese Patent Application Serial No. 201580051873.8 dated Jun. 21, 2019, 15 pages (Including English Translation).
Final Office Action received for U.S. Appl. No. 13/887,711 dated Jul. 25, 2019, 24 pages.
Final Office Action received for U.S. Appl. No. 14/684,407 dated Sep. 20, 2019, 26 pages.
Final Office Action received for U.S. Appl. No. 14/495,041 dated Aug. 9, 2019, 26 pages.
Non-Final Office Action received for U.S. Appl. No. 14/834,434 dated Aug. 5, 2019, 19 pages.
Final Office Action received for U.S. Appl. No. 16/126,175 dated Aug. 2, 2019, 161 pages.
Prior Publication Data:

Number | Date | Country
---|---|---
20170024055 A1 | Jan 2017 | US

Related U.S. Application Data:

Relation | Number | Date | Country
---|---|---|---
Parent | 14483150 | Sep 2014 | US
Child | 15075648 | | US