The present disclosure relates generally to a touch device, and more particularly, to methods and apparatuses for detecting multi-touch swipes on the touch device.
Devices such as computing devices, mobile devices, kiosks, etc. often employ a touch screen interface with which a user can interact with the devices by touch input (e.g., touch by a user or an input tool such as a pen). Touch screen devices employing the touch screen interface provide convenience to users, as the users can directly interact with the touch screen. The touch screen devices receive the touch input, and execute various operations based on the touch input. For example, a user may touch an icon displayed on the touch screen to execute a software application associated with the icon, or a user may draw on the touch screen to create drawings. The user may also drag and drop items on the touch screen or may pan a view on the touch screen with two fingers. Thus, a touch screen device that is capable of accurately analyzing the touch input on the touch screen is needed to accurately execute desired operations. When multiple touches occur at the same time on the device, it may be difficult to accurately determine how those touches should connect to touches in a later or following time frame, and thus accurate methods for detecting multiple touches across multiple time frames are desired.
Disclosed are systems, apparatus and methods for tracking touch detections.
According to some aspects, disclosed is a method for touch detection, the method comprising: receiving first touch data comprising a first plurality of touch detections recorded at a first time; receiving second touch data comprising a second plurality of touch detections recorded at a second time; matching, for several matches, a plurality of the first plurality of touch detections to a corresponding plurality of the second plurality of touch detections, wherein the plurality of the first plurality of touch detections and the corresponding plurality of the second plurality of touch detections comprise a first set and a second set, and matching, for each match, further comprises: computing a rotation and translation matrix between the first set and the second set; applying the rotation and translation matrix to the first set to determine a result; and calculating a Euclidean distance between the result and the second set; and selecting a match, from the several matches, having a minimum Euclidean distance.
According to some aspects, disclosed is a device for touch detection, the device comprising: a touch sensor configured to: receive first touch data comprising a first plurality of touch detections recorded at a first time; and receive second touch data comprising a second plurality of touch detections recorded at a second time; and a processor coupled to the touch sensor and configured to: match, for several matches, a plurality of the first plurality of touch detections to a corresponding plurality of the second plurality of touch detections, wherein the plurality of the first plurality of touch detections and the corresponding plurality of the second plurality of touch detections comprise a first set and a second set, and the processor, for each match, is further configured to: compute a rotation and translation matrix between the first set and the second set; apply the rotation and translation matrix to the first set to determine a result; and calculate a Euclidean distance between the result and the second set; and select a match, from the several matches, having a minimum Euclidean distance.
According to some aspects, disclosed is a device for touch detection, the device comprising: means for receiving first touch data comprising a first plurality of touch detections recorded at a first time; means for receiving second touch data comprising a second plurality of touch detections recorded at a second time; means for matching, for several matches, a plurality of the first plurality of touch detections to a corresponding plurality of the second plurality of touch detections, wherein the plurality of the first plurality of touch detections and the corresponding plurality of the second plurality of touch detections comprise a first set and a second set, and the means for matching, for each match, further comprises: means for computing a rotation and translation matrix between the first set and the second set; means for applying the rotation and translation matrix to the first set to determine a result; and means for calculating a Euclidean distance between the result and the second set; and means for selecting a match, from the several matches, having a minimum Euclidean distance.
According to some aspects, disclosed is a non-transitory computer-readable storage medium including program code stored thereon, comprising program code to: receive first touch data comprising a first plurality of touch detections recorded at a first time; receive second touch data comprising a second plurality of touch detections recorded at a second time; match, for several matches, a plurality of the first plurality of touch detections to a corresponding plurality of the second plurality of touch detections, wherein the plurality of the first plurality of touch detections and the corresponding plurality of the second plurality of touch detections comprise a first set and a second set, and the program code to match, for each match, further comprises program code to: compute a rotation and translation matrix between the first set and the second set; apply the rotation and translation matrix to the first set to determine a result; and calculate a Euclidean distance between the result and the second set; and select a match, from the several matches, having a minimum Euclidean distance.
It is understood that other aspects will become readily apparent to those skilled in the art from the following detailed description, wherein it is shown and described various aspects by way of illustration. The drawings and detailed description are to be regarded as illustrative in nature and not as restrictive.
The detailed description set forth below in connection with the appended drawings is intended as a description of various configurations and is not intended to represent the only configurations in which the concepts described herein may be practiced. The detailed description includes specific details for the purpose of providing a thorough understanding of various concepts. However, it will be apparent to those skilled in the art that these concepts may be practiced without these specific details. In some instances, well known structures and components are shown in block diagram form in order to avoid obscuring such concepts.
Several aspects of touch screen devices will now be presented with reference to various apparatus and methods. These apparatus and methods will be described in the following detailed description and illustrated in the accompanying drawings by various blocks, modules, components, circuits, steps, processes, algorithms, etc. (collectively referred to as “elements”). These elements may be implemented using electronic hardware, computer software, or any combination thereof. Whether such elements are implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system.
By way of example, an element, or any portion of an element, or any combination of elements may be implemented with a “processing system” that includes one or more processors. Examples of processors include microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), state machines, gated logic, discrete hardware circuits, and other suitable hardware configured to perform the various functionality described throughout this disclosure. One or more processors in the processing system may execute software. Software shall be construed broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, etc., whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise.
Accordingly, in one or more exemplary embodiments, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or encoded as one or more instructions or code on a computer-readable medium. Computer-readable media includes computer storage media. Storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), and floppy disk where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of computer-readable media.
As used herein, a device or mobile device, sometimes referred to as a mobile station (MS) or user equipment (UE), refers to a device such as a cellular phone, mobile phone or other wireless communication device, personal communication system (PCS) device, personal navigation device (PND), Personal Information Manager (PIM), Personal Digital Assistant (PDA), laptop or other suitable mobile device that is capable of receiving wireless communication and/or navigation signals. The term “mobile device” is also intended to include devices which communicate with a personal navigation device (PND), such as by short-range wireless, infrared, wireline connection, or other connection—regardless of whether satellite signal reception, assistance data reception, and/or position-related processing occurs at the device or at the PND. Also, “mobile device” is intended to include all devices, including wireless communication devices, computers, laptops, etc. which are capable of communication with a server, such as via the Internet, WiFi, or other network, and regardless of whether satellite signal reception, assistance data reception, and/or position-related processing occurs at the device, at a server, or at another device associated with the network. Any operable combination of the above is also considered a “mobile device.”
Touch screen technology enables various types of uses. As discussed above, a user may touch a touch screen to execute various operations such as execution of an application. In one example, the touch screen provides a user interface with a direct touch such as a virtual-keyboard and user-directed controls. The user interface with the touch screen may provide proximity detection. The user may handwrite on the touch screen. In another example, the touch screen technology may be used for security features, such as surveillance, intrusion detection and authentication, and may be used for a use-environment control such as a lighting control and an appliance control. In another example, the touch screen technology may be used for healthcare applications (e.g., a remote sensing environment, prognosis and diagnosis).
Several types of touch screen technology are available today, with different designs, resolutions, sizes, etc. Examples of the touch screen technology with lower resolution include acoustic pulse recognition (APR), dispersive signal technology (DST), surface acoustic wave (SAW), traditional infrared (infrared or near infrared), waveguide infrared, optical, and force sensing. A typical mobile device includes a capacitive touch screen (e.g., a mutual projective-capacitance touch screen), which allows for higher resolution and a thin size of the screen. Further, a capacitive touch screen provides good accuracy, good linearity and good response time, as well as relatively low chances of false negatives and false positives. Therefore, the capacitive touch screen is widely used in mobile devices such as mobile phones and tablets. Examples of a capacitive touch screen used in mobile devices include an in-cell touch screen and an on-cell touch screen, which are discussed infra.
In some embodiments, the mobile device architecture 100 further includes a battery monitor and platform resource/power manager component 144 that is coupled to a battery charging circuit and power manager component 148 and to temperature compensated crystal oscillators (TCXOs), phase lock loops (PLLs), and clock generators component 146. The battery monitor and platform resource/power manager component 144 is also coupled to the application processor 102. The mobile device architecture 100 further includes sensors and user interface devices component 149 coupled to the application processor 102, and includes light emitters 150 and image sensors 152 coupled to the application processor 102. The image sensors 152 are also coupled to the multispectral multiview imaging core, correction/optimization/enhancement, multimedia processors and accelerators component 114.
The touch activity and status detection unit 216 receives the touch signal from the analog front end 214 and then communicates to the interrupt generator 218 of the presence of the user touch, such that the interrupt generator 218 communicates a trigger signal to the touch processor and decoder unit 220. When the touch processor and decoder unit 220 receives the trigger signal from the interrupt generator 218, the touch processor and decoder unit 220 receives the touch signal raw data from the analog front end 214 and processes the touch signal raw data to create touch data. The touch processor and decoder unit 220 sends the touch data to the host interface 224, and then the host interface 224 forwards the touch data to the multi-core application processor subsystem 206. The touch processor and decoder unit 220 is also coupled to the clocks and timing circuitry 222 that communicates with the analog front end 214.
In some embodiments, processing of the touch signal raw data is processed in the multi-core application processor subsystem 206 instead of in the decoder unit 220. In some such embodiments, the touch screen controller 204 or one or more components thereof, for example, the decoder unit 220, may be omitted. In other such embodiments, the touch screen controller 204 and/or all components thereof are included, but touch signal raw data is passed through to the multi-core application processor subsystem 206 without or with reduced processing. In some embodiments, processing of the touch signal raw data is distributed between the decoder unit 220 and the multi-core application processor subsystem 206.
The mobile touch screen device 200 also includes a display processor and controller unit 226 that sends information to the display interface 212, and is coupled to the multi-core application processor subsystem 206. The mobile touch screen device 200 further includes an on-chip and external memory 228, an application data mover 230, a multimedia and graphics processing unit (GPU) 232, and other sensor systems 234, which are coupled to the multi-core application processor subsystem 206. The on-chip and external memory 228 is coupled to the display processor and controller unit 226 and the application data mover 230. The application data mover 230 is also coupled to the multimedia and graphics processing unit 232.
The bottom electrode 310 is coupled to charge control circuitry 320. The charge control circuitry 320 controls a touch signal received from the top and bottom electrodes 308 and 310, and sends the controlled signal to a touch conversion unit 322, which converts the controlled signal to a proper signal for quantization. The touch conversion unit 322 sends the converted signal to the touch quantization unit 324 for quantization of the converted signal. The touch conversion unit 322 and the touch quantization unit 324 are also coupled to the touch scan control unit 302. The touch quantization unit 324 sends the quantized signal to a filtering/de-noising unit 326. After filtering/de-noising of the quantized signal at the filtering/de-noising unit 326, the filtering/de-noising unit 326 sends the resulting signal to a sense compensation unit 328 and a touch processor and decoder unit 330. The sense compensation unit 328 uses the signal from the filtering/de-noising unit 326 to perform sense compensation and provide a sense compensation signal to the charge control circuitry 320. In other words, the sense compensation unit 328 is used to adjust the sensitivity of the touch sensing at the top and bottom electrodes 308 and 310 via the charge control circuitry 320.
In some embodiments, the touch processor and decoder unit 330 communicates with clocks and timing circuitry 338, which communicates with the touch scan control unit 302. The touch processor and decoder unit 330 includes a touch reference estimation, baselining, and adaptation unit 332 that receives the resulting signal from the filtering/de-noising unit 326, a touch-event detection and segmentation unit 334, and a touch coordinate and size calculation unit 336. The touch reference estimation, baselining, and adaptation unit 332 is coupled to the touch-event detection and segmentation unit 334, which is coupled to the touch coordinate and size calculation unit 336. The touch processor and decoder unit 330 also communicates with a small coprocessor/multi-core application processor (with HLOS) 340, which includes a touch primitive detection unit 342, a touch primitive tracking unit 344, and a symbol ID and gesture recognition unit 346. The touch primitive detection unit 342 receives a signal from the touch coordinate and size calculation unit 336 to perform touch primitive detection, and then the touch primitive tracking unit 344 coupled to the touch primitive detection unit 342 performs the touch primitive tracking. The symbol ID and gesture recognition unit 346 coupled to the touch primitive tracking unit 344 performs recognition of a symbol ID and/or gesture.
Various touch sensing techniques are used in the touch screen technology. Touch capacitance sensing techniques may include electric field sensing, charge transfer, force sensing resistor, relaxation oscillator, capacitance-to-digital conversion (CDC), a dual ramp, sigma-delta modulation, and successive approximation with single-slope ADC. The touch capacitance sensing techniques used in today's projected-capacitance (P-CAP) touch screen controller may include a frequency based touch-capacitance measurement, a time based touch-capacitance measurement, and/or a voltage based touch-capacitance measurement.
In the frequency based measurement, a touch capacitor is used to create an RC oscillator, and then a time constant, a frequency, and/or a period are measured according to some embodiments. The frequency based measurement includes a first method using a relaxation oscillator, a second method using frequency modulation, and a third method using a synchronous demodulator. The first method using the relaxation oscillator uses a sensor capacitor as a timing element in an oscillator. In the second method using the frequency modulation, a capacitive sensing module uses a constant current source/sink to control an oscillator frequency. The third method using the synchronous demodulator measures a capacitor's AC impedance by exciting the capacitance with a sine wave source and measuring the capacitor's current and voltage with a four-wire ratiometric synchronous demodulator coupled to the capacitor.
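As a hedged illustration of the relaxation-oscillator approach, the sensed capacitance can be recovered from a measured oscillation period. The sketch below uses the 555-style astable period relationship T = ln(2) * (R1 + 2*R2) * C only as one representative oscillator; the function name and component values are hypothetical, not taken from the disclosure.

```python
# Hypothetical sketch: inverting a 555-style relaxation-oscillator period
# T = ln(2) * (R1 + 2*R2) * C to estimate the sensed capacitance C.
import math

def capacitance_from_period(period_s, r1_ohms, r2_ohms):
    """Recover C (farads) from the astable oscillation period (seconds)."""
    return period_s / (math.log(2) * (r1_ohms + 2 * r2_ohms))

# A finger touch adds capacitance, which lengthens the measured period.
c_baseline = capacitance_from_period(6.93e-5, 10_000, 45_000)  # about 1 nF
c_touched = capacitance_from_period(7.00e-5, 10_000, 45_000)
delta_c = c_touched - c_baseline  # a positive delta suggests a touch
```

A touch controller would compare `delta_c` against a noise-dependent threshold rather than against zero.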
The time based measurement measures charge/discharge time dependent on touch capacitance. The time based measurement includes methods using resistor capacitor charge timing, charge transfer, and capacitor charge timing using a successive approximation register (SAR). The method using resistor capacitor charge timing measures the sensor capacitor charge/discharge time with a constant voltage. In the method using charge transfer, the sensor capacitor is charged and the charge is integrated over several cycles; an ADC or a comparison to a reference voltage then determines the charge time. Many charge transfer techniques resemble sigma-delta ADC. In the method using capacitor charge timing with the SAR, the current through the sensor capacitor is varied to match a reference ramp.
The voltage based measurement monitors a magnitude of a voltage to sense user touch. The voltage based measurement includes methods using a charge time measuring unit, a charge voltage measuring unit, and a capacitance voltage divide. The method using the charge time measuring unit charges a touch capacitor with a constant current source, and measures the time to reach a voltage threshold. The method using the charge voltage measuring unit charges the capacitor from a constant current source for a known time and measures the voltage across the capacitor. The method using the charge voltage measuring unit requires a very low current, a high precision current source, and a high impedance input to measure the voltage. The method using the capacitance voltage divide uses a charge amplifier that converts the ratio of the sensor capacitor to a reference capacitor into a voltage (Capacitive-Voltage-Divide). The method using the capacitance voltage divide is the most common method for interfacing to precision low capacitance sensors.
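The charge-time and capacitance-voltage-divide relationships above can be sketched as simple calculations. This is an illustrative sketch, not the disclosed circuitry; the function names and component values are hypothetical.

```python
# Hypothetical sketch of two voltage-based touch-capacitance measurements.

def capacitance_from_charge_time(current_a, time_s, v_threshold):
    """Charge-time method: with a constant current I, C = I * t / V,
    where t is the time to reach the voltage threshold V."""
    return current_a * time_s / v_threshold

def divider_output(v_excite, c_sensor, c_reference):
    """Capacitance-voltage-divide: a charge amplifier converts the ratio of
    the sensor capacitor to a reference capacitor into an output voltage."""
    return v_excite * (c_sensor / c_reference)

# 1 uA charging for 3.3 ms to reach 3.3 V implies about 1 nF.
c_est = capacitance_from_charge_time(1e-6, 3.3e-3, 3.3)
# A 1.2 pF sensor against a 1.0 pF reference, excited at 1 V.
v_out = divider_output(1.0, 1.2e-12, 1.0e-12)
```

In practice the current source precision and input impedance limits noted above dominate the achievable accuracy of both calculations.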
In some embodiments, processing of the touch signal raw data can be processed by the multi-core application processor subsystem (with HLOS) 406 instead of in the touch screen controller 404. In some such embodiments, the touch screen controller 404 or one or more components thereof may be omitted. In other such embodiments, the touch screen controller 404 and/or all components thereof are included, but touch signal raw data is passed through to the multi-core application processor subsystem (with HLOS) 406 without or with reduced processing.
There are known challenges for accurate sensing of touch in the touch screen. For example, a touch capacitance can be small, depending on a touch medium. The touch capacitance is sensed over high output impedance. Further, a touch transducer often operates in platforms with a large parasitic or in a noisy environment. In addition, touch transducer operation can be skewed with offsets and its dynamic range may be limited by a DC bias.
Several factors may affect touch screen signal quality. On the touch screen panel, the signal quality may be affected by a touch-sense type, resolution, a touch sensor size, fill factor, touch panel module integration configuration (e.g., out-cell, on-cell, in-cell, etc.), and a scan overhead. A type of a touch medium such as a hand/finger or stylus and a size of touch as well as responsivity such as touch sense efficiency and a transconductance gain may affect the signal quality. Further, sensitivity, linearity, dynamic range, and a saturation level may affect the signal quality. In addition, noises such as no-touch signal noise (e.g., thermal and substrate noise), a fixed-pattern noise (e.g., touch panel spatial non-uniformity), and a temporal noise (e.g., EMI/RFI, supply noise, display noise, use noise, use-environment noise) may affect the signal quality. In some instances, temporal noise can include noise imposed on the ground plane, for example, by a poorly designed charger.
Often gesture input geometry includes multiple touch inputs. For example, a user may use a multi-finger swipe, such as a three-finger swipe, to signify a particular action. User input is tracked across sequential times, such that one point at a first time (e.g., time t) is tracked to one point at a second time (e.g., t+1). Any spurious points (i.e., points not matched to a tracked finger input) may be discarded. Detecting and tracking multiple touch inputs on a touch screen from one point in time to another point in time, however, may complicate touch detection algorithms.
For example, referring to
Referring to
Referring to
Generally, a touch detection algorithm may limit finger input (e.g., multi-finger fast swipes) to one or two degrees of freedom to represent a corresponding one or two hand swiping input. A fast swipe may be a movement whose distance or speed exceeds a predetermined threshold. Multiple fingers may be grouped together, such as two, three, four or five fingers. For this discussion, a thumb is considered a finger. For example, a user may use multiple fingers in a one-hand input for a multi-finger input gesture.
Typically, relative finger positions for a gesture remain constant. For example, when used as an input gesture, a middle finger may move “randomly” about a touch screen but will stay between inputs provided by an index finger and a ring finger. That is, fingers stay positioned relative to one another in a constant fashion. Typically, fingers from a single hand do not move independently of one another. For example, a right hand showing movement of an index finger to the right, a middle finger up, and a ring finger to the left is highly unlikely or impossible. A touch detection algorithm may consider only possible or likely trajectories from points that define typical finger movement such that fingers are constrained or fixed relative to one another.
When finger tips are used for input gestures, movement may be characterized by a translation and/or a rotation. Translation may be represented by a change in center of mass of finger tips and may be defined with a 2D matrix. Rotation may be represented by an angular change about this center of mass. Rotation may also be defined with another 2D matrix. A touch detection algorithm may similarly constrain trajectories using a Markov model to “tie fingers” together such that the fingers move in a group.
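This translation-plus-rotation representation can be sketched as follows. The sketch assumes 2D fingertip coordinates; the function name and example values are hypothetical, and the rotation is applied about the fingertips' center of mass as described above.

```python
# Illustrative sketch: model a swipe as a translation of the fingertips'
# center of mass plus a rotation about that center of mass.
import numpy as np

def apply_swipe(points, dx, dy, theta):
    """Rotate 2D fingertip points by theta about their center of mass,
    then translate them by (dx, dy)."""
    pts = np.asarray(points, dtype=float)
    com = pts.mean(axis=0)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    # Rotation about the center of mass leaves the center of mass fixed,
    # so the net center-of-mass motion is exactly the translation (dx, dy).
    return (pts - com) @ rot.T + com + np.array([dx, dy])

# Three fingertips swiped right by 10 units with a slight 5-degree twist.
before = [(0.0, 0.0), (1.0, 0.5), (2.0, 0.0)]
after = apply_swipe(before, 10.0, 0.0, np.deg2rad(5.0))
```

Because the transform is rigid, the distances between fingertips are preserved, which matches the constraint that fingers stay positioned relative to one another in a constant fashion.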
Trajectories derived from input points may be limited to a fixed displacement with little or no rotation. Alternatively, trajectories from input points may be limited to a fixed displacement combined with rotation. Alternatively, trajectories from input points may be limited to a rotation with little or no displacement. A threshold may be used to determine whether a trajectory represents sufficient translation versus little or no translation. A different threshold may be used to determine whether a trajectory represents sufficient rotation versus little or no rotation.
One degree of freedom is represented by a combination of translation and rotation. When swiping, fingers stay in a fixed position relative to each other with a determined linear displacement and/or a determined angular rotation. A second set of fingers from a second hand may represent a second degree of freedom. When a gesture is limited to one hand, a touch detection algorithm may similarly limit trajectories to a single degree of freedom. When a touch detection algorithm accepts gestures from two hands, the touch detection algorithm may limit points to trajectories showing two degrees of freedom. The touch detection algorithm may further constrain trajectories to exclude unlikely or impossible twisted hand motion. For example, a touch detection algorithm may constrain trajectories representing rotations to less than 360 degrees of rotation in one hand. Similarly, a touch detection algorithm may restrict trajectories from two hands that would otherwise require hands to pass through one another.
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
The aforementioned example steps as described in
Referring
At step 704, the example process may eliminate the longest links connecting points at time t+i with points at time t+i+1 until the number of links equals the smaller number of touch detections between time t+i and time t+i+1. For example, if there are five touch detections at time t+i and only two touch detections at time t+i+1, then step 704 eliminates the longest links until there are only two links, corresponding to the smaller number of touch detections at time t+i+1.
At step 706, the example process may then determine if there is a one-to-one correspondence between connections from points at time t+i to points at time t+i+1. If there is not a one-to-one correspondence, then the shortest link is eliminated, and the example process iterates back to step 702 and repeats through steps 702, 704, and 706, but with the previously eliminated edges and points no longer considered.
The example process ends when it is determined that there is a one-to-one correspondence between connections from points at time t+i to points at time t+i+1.
In some embodiments, this example process is repeated for each time t+i and t+i+1, for all recorded frames i. For example, there may be 500 recorded touch frames, and thus each frame pair (e.g., {0, 1}, {1, 2}, {2, 3} . . . {498, 499}) would need to be evaluated according to the described example process. The evaluated connections for each frame pair may then be connected to form a map or path of the user's swipes across the touch screen.
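The link-elimination steps above can be sketched in simplified form. This is a hedged approximation of steps 702, 704, and 706: the function name is hypothetical, and, unlike the description, only eliminated links (not points) are excluded on subsequent iterations.

```python
# Hypothetical sketch of the link-elimination matching between two frames.
import math

def match_frames(points_a, points_b):
    """Return (index_a, index_b) pairs linking frame t+i to frame t+i+1."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    k = min(len(points_a), len(points_b))
    # Step 702 (approximated): form every candidate link with its length.
    candidates = [(dist(p, q), i, j)
                  for i, p in enumerate(points_a)
                  for j, q in enumerate(points_b)]
    while len(candidates) >= k:
        # Step 704: drop the longest links until only k remain.
        candidates.sort()
        kept = candidates[:k]
        # Step 706: accept the links only if they are one-to-one.
        if (len({i for _, i, _ in kept}) == k
                and len({j for _, _, j in kept}) == k):
            return [(i, j) for _, i, j in kept]
        # Otherwise eliminate the shortest link and iterate.
        candidates = candidates[1:]
    return []  # no consistent one-to-one matching was found
```

Spurious points, such as a fifth detection with no counterpart in the next frame, simply end up with no surviving link and are discarded, consistent with the discussion above.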
The previous description described one implementation that inherently groups fingers together during a swipe. The following description describes a general approach that explicitly groups fingers together.
A count of the first plurality of touch detections sometimes does not equal a count of the second plurality of touch detections. For example, a count of the first plurality of touch detections may be greater than a count of the second plurality of touch detections. In other situations, a count of the first plurality of touch detections may be less than a count of the second plurality of touch detections. Often a mismatch may occur when extra detections caused by noise are included. Some embodiments operate on a fixed number of finger touch points (e.g., exactly two points, exactly three points or exactly four points) for gestures that use the specific number of finger points. For example, a three-point gesture may be a sweeping of a thumb, an index finger and a middle finger from left to right and then from top to bottom.
At 830, the device matches a plurality of the first plurality of touch detections to a corresponding plurality of the second plurality of touch detections for each of several candidate matches. Either the plurality of the first plurality of touch detections comprises a first set and the corresponding plurality of the second plurality of touch detections comprises a second set, or alternatively, the plurality of the first plurality of touch detections comprises the second set and the corresponding plurality of the second plurality of touch detections comprises the first set. The matching may comprise an exhaustive matching, and a selection may be made from the absolute minimum calculated Euclidean distance among all candidate matches. The Euclidean distance is the distance between two points that is given by the Pythagorean formula and that one would measure with a ruler. Alternatively, as described below, a threshold distance or a RANSAC (RANdom SAmple Consensus) algorithm may be used to limit the total number of match operations performed.
For each matching, the method further comprises computing, applying and calculating as described below.
At 840, the device computes a rotation and translation matrix between the first set and the second set. The rotation and translation matrix may comprise a single matrix or may be represented as two matrices. Alternatively, the rotation and translation matrix may be represented with two vectors: a direction vector indicating a linear displacement (how much and in what direction) and an angular vector indicating an angular displacement between the first touch data and the second touch data. For example, linear displacement may be identified with a vector between the center of mass of the first touch data and the center of mass of the second touch data. The angular displacement between the first touch data and the second touch data may identify a rotation between the first touch data and the second touch data, assuming the centers of mass overlap, to minimize the Euclidean distance. In some embodiments, a device computing comprises a device determining a translation between a center of mass of the first set and a center of mass of the second set, and also determining an angular displacement between the first set and the second set.
At 850, the device applies the rotation and translation matrix to the first set to determine a result. Applying the rotation and translation matrix may comprise multiplying each point in the first set by the rotation and translation matrix to form the result. At 860, the device calculates a Euclidian distance between the result and the second set.
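Steps 850 and 860 can be sketched as follows, assuming the rotation and translation are represented as an angle about the first set's center of mass plus an offset (one of the representations mentioned above); the helper names are illustrative only:

```python
import math

def apply_transform(points, angle, translation):
    """Rotate points about their center of mass by angle, then translate.
    This plays the role of applying a rotation and translation matrix
    to each point in the first set."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    c, s = math.cos(angle), math.sin(angle)
    result = []
    for x, y in points:
        dx, dy = x - cx, y - cy
        result.append((cx + c * dx - s * dy + translation[0],
                       cy + s * dx + c * dy + translation[1]))
    return result

def total_euclidean_distance(result, second_set):
    # Sum of per-point straight-line distances (step 860).
    return sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(result, second_set))
```

For a correct candidate match, the transformed first set lands on the second set and the total distance is near zero.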
At 870, the device selects a match, from the several candidate matches, having a minimum Euclidian distance. Selecting the match may instead comprise selecting a first match having a Euclidian distance less than a threshold distance; that is, the device selects the first match found under the threshold distance, so that an exhaustive matching is unnecessary. Alternatively, a RANSAC algorithm may be used to select candidate matches. A RANSAC algorithm may be applied as an iterative method to track finger positions from the plurality of touch detections, which may contain outliers. The method described above may be applied to two, three, four, or five fingers on one hand. The method may also be expanded to include multiple fingers from two hands.
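The minimum-distance selection and the threshold early exit can be sketched together. For brevity this sketch aligns candidates by translation only (a full version would also fit the rotation, as described above, and a RANSAC variant would sample pairings rather than enumerate them); all names are illustrative:

```python
import itertools
import math

def _aligned_distance(first_set, candidate):
    # Translate the first set so its center of mass matches the
    # candidate's, then sum per-point Euclidean distances.
    n = len(first_set)
    tx = sum(x for x, _ in candidate) / n - sum(x for x, _ in first_set) / n
    ty = sum(y for _, y in candidate) / n - sum(y for _, y in first_set) / n
    return sum(math.hypot(x2 - (x1 + tx), y2 - (y1 + ty))
               for (x1, y1), (x2, y2) in zip(first_set, candidate))

def select_match(first_set, second_set, threshold=None):
    """Try every pairing (permutation) of the second set against the
    first set and keep the one with minimum total distance.  If a
    threshold is given, return the first candidate below it, so an
    exhaustive search is unnecessary."""
    best, best_dist = None, float("inf")
    for candidate in itertools.permutations(second_set):
        d = _aligned_distance(first_set, list(candidate))
        if threshold is not None and d < threshold:
            return list(candidate), d
        if d < best_dist:
            best, best_dist = list(candidate), d
    return best, best_dist
```

With up to five fingers per hand, the exhaustive search covers at most 5! = 120 pairings, which keeps the enumeration tractable.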
At 935, the device matches the plurality of the first plurality of touch detections to a corresponding plurality of the second plurality of touch detections for each of several candidate matches for a second hand. The touch points used during step 930 may be removed before starting step 935. Either the plurality of the first plurality of touch detections comprises a third set and the corresponding plurality of the second plurality of touch detections comprises a fourth set, or alternatively, the plurality of the first plurality of touch detections comprises the fourth set and the corresponding plurality of the second plurality of touch detections comprises the third set.
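The removal of already-matched touch points before step 935 can be sketched as follows; the function name is illustrative:

```python
def remove_matched(touch_detections, matched_points):
    """Drop the touch points already matched for the first hand (step 930)
    so the second-hand matching (step 935) only considers the remainder."""
    matched = set(matched_points)
    return [p for p in touch_detections if p not in matched]
```

The remaining points then form the third and fourth sets for the second hand.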
At 945, the device computes a rotation and translation matrix between the third set and the fourth set. At 955, the device applies the rotation and translation matrix to the third set to determine a result. At 965, the device calculates a Euclidian distance between the result and the fourth set. At 975, the device selects a match, from the several candidate matches, having a minimum Euclidian distance.
The processor 1020 is coupled to the touch sensor 1010 and is configured to match and select. Specifically, the processor 1020 is configured to match, for several matches, a plurality of the first plurality of touch detections to a corresponding plurality of the second plurality of touch detections.
The processor 1020, for each match, is further configured to compute, apply, and calculate. That is, the processor 1020 is configured to: compute a rotation and translation matrix between the first set and the second set; apply the rotation and translation matrix to the first set to determine a result; and calculate a Euclidian distance between the result and the second set. Furthermore, the processor 1020 is configured to select a match, from the several matches, having a minimum Euclidian distance. Likewise, the processor 1020 acts as means for matching, computing, applying, calculating, and selecting.
The methodologies described herein may be implemented by various means depending upon the application. For example, these methodologies may be implemented in hardware, firmware, software, or any combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, electronic devices, other electronic units designed to perform the functions described herein, or a combination thereof.
For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. For example, software codes may be stored in a memory and executed by a processor unit. Memory may be implemented within the processor unit or external to the processor unit. As used herein the term “memory” refers to any type of long term, short term, volatile, nonvolatile, or other memory and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
If implemented in firmware and/or software, the functions may be stored as one or more instructions or code on a computer-readable medium. Examples include computer-readable media encoded with a data structure and computer-readable media encoded with a computer program. Computer-readable media includes physical computer storage media. A storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
In addition to storage on computer readable medium, instructions and/or data may be provided as signals on transmission media included in a communication apparatus. For example, a communication apparatus may include a transceiver having signals indicative of instructions and data. The instructions and data are configured to cause one or more processors to implement the functions outlined in the claims. That is, the communication apparatus includes transmission media with signals indicative of information to perform disclosed functions. At a first time, the transmission media included in the communication apparatus may include a first portion of the information to perform the disclosed functions, while at a second time the transmission media included in the communication apparatus may include a second portion of the information to perform the disclosed functions.
It is understood that the specific order or hierarchy of steps in the processes disclosed is an illustration of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged. Further, some steps may be combined or omitted. The accompanying method claims present elements of the various steps in a sample order, and are not meant to be limited to the specific order or hierarchy presented.
The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Moreover, nothing disclosed herein is intended to be dedicated to the public.
This application claims the benefit of and priority under 35 U.S.C. §119(e) to U.S. Provisional Application No. 61/812,195, filed Apr. 15, 2013, titled “ID TRACKING OF GESTURE TOUCH GEOMETRY” and which is incorporated herein by reference.
Number | Date | Country
---|---|---
61812195 | Apr. 2013 | US