Portable computing devices, such as mobile phones, portable and tablet computers, entertainment devices, handheld navigation devices, and the like, increasingly offer more functions and features, which can make it difficult for a user to navigate and select commands that are relevant to a function the user wants to initiate on a device. In addition to the traditional techniques used to interact with computing devices, such as a mouse, keyboard, and other input devices, touch sensors and touch-screen displays are commonly integrated in mobile phones and tablet computers, and are utilized both for display and for user-selectable touch and gesture inputs. A continuing design challenge with these types of portable devices having touch sensors and/or touch-screen displays is the touch signal processing needed to track touch and gesture inputs that are identified from successive frames of sensor image data. Touch contacts on a touch-screen display represent the motion trace of a gesture, such as when a user uses his or her fingers to contact a touch-screen display and gesture while maintaining the contact with the display. A failure to correctly track and interpret the motion trace of a touch contact for a gesture input can lead to the failure of gesture recognition operations and gesture tracking processing.
For a gesture motion that is relatively small or short, conventional tracking processing may statically match the spatially co-located touch contacts from successive frames. However, this approach is not effective for a gesture motion that is relatively large or long, such as is typical on a tablet computer or other type of slate form factor where fast gestures, such as flicks or panning, are involved. The tracking processing may not be sensitive to the complete motion of a gesture, which can result in a gesture “break”, where the gesture is recognized and processed as a much shorter motion than the actual gesture motion. Alternatively, if the tracking processing is overly sensitive to the gesture motion, this can lead to mis-tracking the touch contacts, such as when a user inputs text with a soft or virtual keyboard that is displayed on a touch-screen display.
This Summary introduces simplified concepts of multi-pass touch contact tracking, and the concepts are further described below in the Detailed Description and/or shown in the Figures. This Summary should not be considered to describe essential features of the claimed subject matter, nor used to determine or limit the scope of the claimed subject matter.
Multi-pass touch contact tracking is described. In embodiments, touch input sensor data is recognized as a series of components of a contact on a touch-screen display. The components can be determined to correlate to the contact based on multi-pass nearest-neighbor contact mapping that includes forward nearest-neighbor contact mapping of the components and reverse nearest-neighbor contact mapping of the components. The reverse nearest-neighbor contact mapping is initiated when unmapped components remain after the forward nearest-neighbor contact mapping. The components can then be associated to represent a tracking of the contact. Subsequent components of the contact can also be determined and associated with the previous components of the contact to further represent the tracking of the contact.
Embodiments of multi-pass touch contact tracking are described with reference to the following Figures. The same numbers may be used throughout to reference like features and components that are shown in the Figures:
Embodiments of multi-pass touch contact tracking are described. As noted above, touch and gesture inputs on a touch-screen display of a computing device, such as a mobile phone or portable computer, may not be accurately tracked and/or processed. Multi-pass touch contact tracking uses the prediction of components at touch positions from previous frames, and the tracking can be reliably achieved in the case of high motion while at the same time, mis-tracking in the case of a soft keyboard can be avoided. Touch contact tracking can be based on the predicted positions of previously identified components of a contact, with a set of rules defining the scope of validity of the tracking obtained by the prediction. In embodiments, multi-pass contact tracking is implemented for contact tracking with prediction based on nearest-neighbor distance matching (e.g., an algorithm or procedure), and a technique is implemented for nearest-neighbor distance matching in both forward and reverse directions.
Touch signal processing involves tracing the touch contacts identified from successive frames of touch input sensor data. When the motion of the touch contacts is relatively high, a prediction via a min-max distance determination can be implemented to sort the maximum distances in the different permutations of possible component mappings for the different finger contacts of a multi-finger gesture input. The computational cost is proportional to N!, where N is the number of fingers used for the gesture input. This may be prohibitive when N is more than ten, such as when two or more users are interacting with a touch-screen and/or playing a game. An alternative solution implements forward and reverse pass (i.e., multi-pass) nearest-neighbor distance matching.
The tracking accuracy and results are comparable to the min-max distance determination technique when the motion velocity is smaller, and the computational cost is proportional to N² instead of N!, which is less processing intensive. Embodiments provide a solution that computes the contact tracking with the multi-pass approach based on nearest-neighbor distance matching in both the forward and reverse directions, such as when the prediction of the contacts from the previous frames is well-defined. In the event that a prediction does not exist, the original solution to initiate the prediction via min-max distance determination can still be applied. Alternatively, some limiting approach can be implemented to avoid the min-max scheme altogether.
Embodiments of multi-pass touch contact tracking can include a prediction-based two-level procedure for component identification and contact tracking. A first-level procedure is used to establish an initial association of the components of a contact for gesture input tracking based on a prediction, and a second-level procedure is used to validate the association based on a nearest-neighbor contact mapping criteria to generate a final association of the components of a contact. Also related is a set of rules defining the operations for multi-finger touch and gesture recognition. This may be implemented for any slate-based device, tablet device, mobile phone, or computer with a touch-screen display, as well as for other similar technologies, such as surface, indirect touch, etc.
While features and concepts of multi-pass touch contact tracking can be implemented in any number of different devices, systems, environments, networks, and/or configurations, embodiments of multi-pass touch contact tracking are described in the context of the following example devices, systems, and methods.
In the example system 100, the computing device 102 includes a touch input module 114 (e.g., a lower-layer component) that is implemented to recognize touch input sensor data 116 as the gesture input 112 on the touch-screen display 110. The computing device also includes a gesture recognition application 118 (e.g., a higher-layer component) that receives the touch input sensor data from the touch input module as HID reports 120 (i.e., human interface device reports). The HID reports include time and position data, as well as determined touch contact tracking, that correlate to gesture inputs on the touch-screen display of the computing device. The gesture recognition application 118 is implemented to recognize and generate various gestures as determined from touch input data (e.g., the HID reports 120) associated with inputs or combinations of inputs, such as the gesture input 112. The gesture recognition application can generate various gestures, such as select gestures, hold gestures, motion gestures, tap gestures, and other types of gestures from various user-selectable inputs.
An input recognition system of the computing device 102 may include any type of input detection features and/or devices to distinguish the various types of inputs, such as sensors (capacitive or resistive), light sensing pixels, touch sensors, cameras, and/or a natural user interface that interprets user interactions, gestures, inputs, and motions. In implementations, the input recognition system can detect motion inputs from discernable variables, such as from a direction variable, from start region position variables and end region position variables, and/or from a motion rate variable (e.g., a particular number of pixels per second).
As described herein, a gesture input may be recognized as a user input with one or more fingers on a touch-screen display of a device, and the gesture input includes one or more contacts that each correlate to the input of a finger on the touch-screen display. In the
The gesture input data is received as a series of frames, and a frame includes a component that represents one touch position of a contact (e.g., along a gesture input that is one finger). For a two-finger gesture input, a frame can include a component of a first contact that correlates to the input of a first finger, and include a component of a second contact that correlates to the input of a second finger (and so on for more than a two-finger gesture input).
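The frame, component, and contact relationship described above can be sketched with simple data structures. This is an illustrative model only; the names `Component` and `Contact` are hypothetical and do not appear in this description.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Component:
    # One sensed touch position in a single frame.
    x: float
    y: float
    contact_id: Optional[int] = None  # assigned once mapped to a contact

@dataclass
class Contact:
    # A contact spans successive frames: the trace of one finger.
    contact_id: int
    components: List[Component] = field(default_factory=list)

# A frame of a two-finger gesture carries one component per finger.
frame_t = [Component(10.0, 20.0), Component(40.0, 22.0)]
```

Each received frame would contribute one `Component` per finger, and tracking appends each identified component to its `Contact`.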
In the
Therefore, a contact of a gesture input spans multiple frames and includes the components from each successive frame that have been identified as correlating to the contact, or to a section of the contact. A component represents a touch position of a contact in a frame (e.g., after the component has been identified as correlating to the contact). As described in embodiments, a component can be identified as correlating to a particular contact based on a nearest-neighbor contact mapping criteria that evaluates distance between component positions. However, if the nearest-neighbor contact mapping does not identify a component to one of the existing contacts, then a new contact of the gesture input can be generated to represent the tracking of an additional finger used to gesture on the touch-screen display.
The touch input module 114 recognizes the touch input sensor data 116 as the series of components of the two contacts 122, 124 of the gesture input 112 on the touch-screen display 110 of the computing device 102. In embodiments, the touch input module 114 is implemented to generate a sensor map 138 from the touch input sensor data 116 for each component of each contact. A sensor map represents an individual component of a contact, such as when a user initiates the gesture input 112 on the touch-screen display 110. In this example, the sensor map includes elements 140 shown as 8-bit hex values that represent the signal strength at an element position in the sensor map. A stronger sensor signal of the touch input sensor data indicates more touch contact with an element in the sensor map. The sensor map can be generated as a two-dimensional array, and array indices of the elements in the two-dimensional grid correlate to sensed touch contact from the gesture input on the touch-screen display. The stationary baseline level can be subtracted out so that the elements in an area around the sensor map that are not detected as part of the touch contact are normalized to a zero level.
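The baseline subtraction described above can be sketched as follows. This is a minimal illustration assuming a uniform stationary baseline per element; the function name is hypothetical.

```python
def normalize_sensor_map(raw, baseline):
    # Subtract the stationary baseline per element; values at or below
    # the baseline are clamped so non-touch elements read zero.
    return [[max(0, cell - base)
             for cell, base in zip(raw_row, base_row)]
            for raw_row, base_row in zip(raw, baseline)]

raw = [[3, 3, 3],
       [3, 9, 3],
       [3, 3, 3]]
baseline = [[3] * 3 for _ in range(3)]
sensor_map = normalize_sensor_map(raw, baseline)
# Only the touched element retains a nonzero signal strength.
```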
The computing device 102 also includes a contact tracking service 142 that is implemented to determine predicted touch contact tracking 144 that corresponds to one or more contacts of a gesture input on the touch-screen display 110, such as the gesture input 112. The contact tracking service can be implemented as computer-executable instructions, such as a software application, and executed by one or more processors to implement the various embodiments described herein. The contact tracking service can also be implemented as firmware on dedicated sensor device hardware in the computing device. In this example, the contact tracking service is shown implemented as a component of the touch input module 114. Alternatively, the contact tracking service may be implemented as an independent software application or service to predict touch contact tracking.
In embodiments, the contact tracking service 142 is implemented to perform various procedures and/or algorithms of multi-pass touch contact tracking. The contact tracking service can identify and predict components of the gesture input that are mapped (e.g., correlated, associated) as the two contacts. The components 126-130 represent a tracking of the first contact 122, and the components 132-136 represent a tracking of the second contact 124. The components that are identified as correlating to a particular contact are all assigned the same identifier. For example, the components 126-130 of the first contact 122 are all assigned the same first identifier, and the components 132-136 of the second contact 124 are all assigned the same second identifier, where the first and second identifiers are different to distinguish the separate contacts. As further described below, the contact tracking service can validate that a predicted component position correlates to a subsequent component of a contact based on a nearest-neighbor criteria that evaluates distance from the predicted component position to the additional components of the contact. Additionally, the contact tracking service can determine that components correlate to a particular contact based on a min-max distance determination between the components of the contact.
Example methods 200 and 1200 are described with reference to respective
At block 202, a gesture input is recognized on a touch-screen display. For example, the touch input module 114 (
At block 204, multi-pass touch contact tracking is determined that corresponds to the gesture input. At block 206, next component positions of gesture input contacts are predicted based on one or more previous components of the contacts. At block 208, next components of the contacts are predicted based on a forward nearest-neighbor contact mapping of the components of the contacts. At block 210, the next components of the contacts are predicted based on a reverse nearest-neighbor contact mapping of the components of the contacts. At block 212, the components of each contact of the gesture input are mapped to represent a tracking of each of the contacts. At block 214, a final association of the components of the contacts is validated based on the forward and reverse nearest-neighbor contact mapping.
For example, the contact tracking service 142 (
Touch input sensor data 116 is input at 402 (
In
Touch contacts that are not initial touch contacts (i.e., “no” from block 502), such as when at least two previous frames have been received, are input to a motion prediction module 314 for motion prediction at block 506 to generate touch contact predicted positions 316. These touch contact predicted positions, along with the connected components 508 of the current frame, are input into the touch contact map module 308 for nearest-neighbor contact mapping at block 510, which is based on the forward nearest-neighbor distance matching algorithm (e.g., procedure or determination). The result of the nearest-neighbor distance matching is checked against a criterion at 512 to determine whether the components of the current frame have been successfully mapped to previous components of a contact.
If the mapping of the components is successful (i.e., “yes” from block 512), then the mapped association is input to a touch contact merger module 318 at block 514. If the mapping of the components is not successful (i.e., “no” from block 512), then the components are input to the touch contact map module 308 for min-max contact mapping at block 516 and a two-level combinatory mapping is invoked. With the input from the motion prediction module 314 (i.e., as output at block 506) and the connected components 508, the min-max contact mapping attempts to establish a first level nearest-neighbor association between these two sets of component positions based on a min-max distance determination, along with a set of rules involving hand and/or finger kinematic dynamics.
The min-max scheme can be implemented to compute the distances between all pairs of components in a potential matching. For a configuration with N touch positions mapping to N components, the number of potential matchings equals N!, and for each matching, the distances are computed and the maximum distance is sorted. If the min-max distance determination is initiated when a forward nearest-neighbor distance determination fails to match components, the processing delay may be noticeable to the user as a glitch.
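The N! enumeration described above can be sketched as a brute-force search over permutations. This is an illustrative implementation with hypothetical names, not the claimed procedure itself.

```python
import itertools
import math

def min_max_mapping(prev_positions, components):
    # Try every permutation (N! of them) and keep the one whose
    # largest pairwise distance is smallest.
    n = len(prev_positions)
    best_perm, best_maxd = None, math.inf
    for perm in itertools.permutations(range(n)):
        maxd = max(math.dist(prev_positions[i], components[perm[i]])
                   for i in range(n))
        if maxd < best_maxd:
            best_perm, best_maxd = perm, maxd
    return best_perm, best_maxd

# Two contacts whose labels would cross under naive ordering:
# the min-max choice maps contact 0 to component 1 and vice versa.
perm, maxd = min_max_mapping([(0, 0), (10, 0)], [(11, 0), (1, 0)])
```

The inner loop runs N! times, which illustrates why this determination becomes prohibitive for ten or more fingers.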
Contacts that do not have a mapping established at the first level (e.g., at block 518) are routed, over all of the components, through a contact aliasing check at block 520 to determine possible aliasing, which may indicate merged contacts as determined at block 514 by the touch contact merger module 318. A single component may associate to multiple touch contact positions, which can occur when multiple fingers of a gesture input motion move close enough together to appear as a single component in the touch input sensor data. To detect a touch contact merger, for any unmapped component after a first-level contact association, a nearest-neighbor verification can be initiated for contact associations over all of the components, and any match indicates an aliased association between a single component and multiple touch positions.
The touch contact merger module 318 processes and resolves the merged components, such as to independently check whether two touch positions have the same [x,y] grid point coordinates. A contact merger may include multiple touch positions aliased to one component, as well as a scenario of merge-on-landing when a first touch position is already sensed or detected as a first finger touches on a touch-screen display and a user lands a second finger closely next to the first one. The two touch positions may then be detected as merged together in one larger component.
A failure of the forward nearest-neighbor contact mapping can be attributed to the non-uniform nature of the distance used for a nearest-neighbor distance determination. As shown in
In embodiments, the reverse nearest-neighbor contact mapping 616 is implemented to resolve the situation of a component left unmapped, with a reverse pass after the forward nearest-neighbor matching pass. The reverse pass starts with the unmapped component D, and determines the best match from all of the other components in the contacts. Once an optimal component match is determined, such as the component B in this example, all of the associations on component B that were previously established during the forward nearest-neighbor mapping are released. This results in newly unmapped components, such as component C, and the reverse nearest-neighbor contact mapping is applied to these newly unmapped components in the reverse direction until there are no more unmapped components.
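The forward and reverse passes described above can be sketched as follows. This is a simplified illustration under stated assumptions: a fixed matching radius stands in for the full mapping criteria, and a contact claimed during the reverse pass is treated as settled so that the release-and-rematch cascade terminates.

```python
import math

def multi_pass_mapping(predicted, components, radius):
    # Forward pass: each predicted contact position claims its nearest
    # unclaimed component within `radius`.
    mapping = {}          # contact index -> component index
    claimed = set()
    for ci, p in enumerate(predicted):
        candidates = [(math.dist(p, c), k)
                      for k, c in enumerate(components)
                      if k not in claimed and math.dist(p, c) <= radius]
        if candidates:
            _, k = min(candidates)
            mapping[ci] = k
            claimed.add(k)

    # Reverse pass: each component left unmapped claims its nearest
    # contact, releasing that contact's prior association; the released
    # component is re-matched in turn until none remain.
    fixed = set()         # contacts settled by the reverse pass
    unmapped = [k for k in range(len(components)) if k not in claimed]
    while unmapped:
        k = unmapped.pop()
        free = [ci for ci in range(len(predicted)) if ci not in fixed]
        if not free:
            break         # a genuinely new contact; caller assigns a new id
        _, ci = min((math.dist(components[k], predicted[ci]), ci)
                    for ci in free)
        released = mapping.get(ci)
        mapping[ci] = k
        fixed.add(ci)
        claimed.add(k)
        if released is not None and released != k:
            claimed.discard(released)
            unmapped.append(released)
    return mapping
```

With three contacts moving fast against the iteration order, the forward pass mis-claims components and leaves one unmapped; the reverse pass then cascades the releases and recovers the correct association.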
The forward nearest-neighbor contact mapping is initiated at 610 for each unmapped component to determine a matching contact. The result of the forward nearest-neighbor contact mapping is checked at 612 to determine whether all of the unmapped components of the current frame have been successfully mapped to previous components of a contact. If the mapping of the components is not successful (i.e., “no” from block 612), then all of the component to contact mappings are disconnected (e.g., released) and the reverse nearest-neighbor contact mapping is initiated at 616.
The use of processing resources for reverse nearest-neighbor contact mapping is minimal because the cost is proportional to the number of unmatched components, which is typically small. The propagation in the reverse direction is also minimal, as the worst case corresponds to three or four fingers of a user's hand staying together and moving at a fast speed along the direction that the gesture input spans. For any components that are determined as unmapped because a new finger has made contact with the touch-screen display, the multi-pass touch contact tracking does not compromise the validity of the new contact as a new input, because a newly determined touch contact is typically far enough from the predicted touch positions that the reverse nearest-neighbor contact mapping will exit after the first iteration.
If N=3 for example, one solution is a constant a1=2.5, a2=−2, and a3=0.5 determined via the simple constant acceleration condition. If N=2 for example, then a solution is a1=2 and a2=−1. In general, these coefficients may be time-dependent variables and a more advanced technique, such as a Kalman filter, can be utilized to determine the parameters through an iterative procedure.
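The prediction above amounts to a weighted sum of the last N positions. A minimal sketch using the stated coefficients (the function name is illustrative):

```python
def predict_position(history, coeffs=(2.5, -2.0, 0.5)):
    # history is a list of (x, y) positions, oldest to newest; coeffs
    # weight the newest position first. The defaults are the N=3
    # constant-acceleration coefficients from the text; (2.0, -1.0)
    # gives the N=2 constant-velocity case.
    recent = list(reversed(history[-len(coeffs):]))  # newest first
    return tuple(sum(a * p[d] for a, p in zip(coeffs, recent))
                 for d in range(2))
```

Note that the coefficients sum to one, so a stationary contact predicts in place, and uniform motion extrapolates exactly.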
For each component X̂_i(t) of the ith touch position generated in the prediction stage above, the nearest-neighbor contact mapping attempts to associate a component X(t) of the current frame. A nearest-neighbor contact mapping can be resolved as described herein and/or with other techniques and algorithms. A mapped association can be established when all of the components of a current frame are considered to determine a component with an X(t) that is within the two-dimensional decision region 802 centered around the predicted position X̂_i(t). The decision region can be constructed first with a round shape of radius r which corresponds to an area matching the actual touch shape of the ith contact at the frame t−1. The round shape region is then modified with a velocity-related expansion along the direction of the velocity, with an expansion factor λ proportional to the norm of the velocity. This expansion accounts for the error introduced by the inaccuracy of the velocity prediction. In general, λ can have an upper bound λ_max to avoid erroneous association between fingers of a gesture input motion that are close together along the direction of the velocity. In one implementation, λ = (1 + λ_max·|v|)/(1 + |v|); however, other choices for λ are also possible.
In practice, the procedure of verifying the nearest-neighbor contact mapping criterion can be performed in reverse. A difference vector d = X(t) − X̂_i(t) is computed first, and then a reverse scaling on d is performed with the factor 1/λ along the direction of the velocity vector v = X(t−1) − X(t−2). The norm of the resultant vector d̃ is then checked against the radius r of the decision region, and a value smaller than r indicates that an association has been determined. Another extension of the nearest-neighbor contact mapping is to implement a probabilistic approach, where a probability distribution function of the vector d is defined, and instead of passing a hard decision of which touch the component X(t) belongs to, a probability distribution is passed among all of the components. This information can be passed through the HID reports so that the gesture recognition application 118 (
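The reverse-scaling verification just described can be sketched as follows. The value of λ_max here is an illustrative bound, not a value given in the description, and the function name is hypothetical.

```python
import math

def in_decision_region(candidate, predicted, v, r, lam_max=3.0):
    # Check whether `candidate` lies inside the decision region centered
    # on `predicted`: a circle of radius r, expanded by factor lam along
    # the velocity v. Implemented by reverse-scaling the difference
    # vector d by 1/lam along v and comparing its norm to r.
    dx = candidate[0] - predicted[0]
    dy = candidate[1] - predicted[1]
    speed = math.hypot(v[0], v[1])
    if speed == 0:
        return math.hypot(dx, dy) < r   # no motion: plain round region
    lam = (1 + lam_max * speed) / (1 + speed)
    ux, uy = v[0] / speed, v[1] / speed  # unit vector along the velocity
    along = dx * ux + dy * uy            # component of d along v
    perp_x, perp_y = dx - along * ux, dy - along * uy
    # shrink the along-velocity part by 1/lam, keep the perpendicular part
    sx = perp_x + (along / lam) * ux
    sy = perp_y + (along / lam) * uy
    return math.hypot(sx, sy) < r
```

With a fast horizontal velocity, a candidate displaced 2.5·r along the motion is accepted, while the same displacement perpendicular to the motion is rejected, which is the intended asymmetry.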
In embodiments, criteria for determining that the nearest-neighbor contact mapping is successful can be established, such as to determine a successful mapping at block 512 (
A simple condition to detect this instance of an unreliable association is to determine that the total number of unassociated components after the nearest-neighbor mapping is greater than the difference between the number of components and the number of touches. Note that this is a sufficient condition in that, theoretically, there may be good mappings classified as a bad mapping. Since the majority of frames have reliable association predictions, the instances of an unreliable association (e.g., a failure case) will likely be a very small percentage (e.g., less than 5%) with a negligible computational cost. If a failure case is detected, an algorithm or procedure for combinatory contact mapping can be invoked to determine a correct association, as implemented by the touch contact map module 308 of the contact tracking service.
The combinatory contact mapping can establish a reliable mapping between a given set of contacts established in a previous frame and a given set of components in the current frame. When a prediction is determined and the nearest-neighbor contact mapping fails, the combinatory contact mapping can be used to establish a first-level association between the two sets (e.g., the touch positions and the components) by matching the prediction residuals between the prediction positions and that of the components. When a prediction is not determined at the initial phase of a contact life span, the combinatory contact mapping can also be used to establish the initial association. If an initial association of a contact does not exist, as may happen in the first two frames of a new contact, the touch contact map module 308 can then set the original position as the predicted position, and no prediction residual is present. The touch contact map module can include the combinatory contact mapping algorithm, the min-max distance determination algorithm for distance mapping, and a cross-trajectory suppressor for penalizing trajectories of gesture input motions that cross each other. The combinatory contact mapping algorithm can be implemented as described herein via a min-max algorithm and/or with other techniques and algorithms, such as the Hungarian Algorithm.
In embodiments, a nearest-neighbor verification for contact and component association can be implemented. The first-level association that is established with a min-max distance determination (as further described below) can be evaluated with a second-level verification process, which is similar to nearest-neighbor contact mapping. Specifically, the component X(t) of the current frame, after the establishment of the initial association to X̂_i(t) at the min-max mapping stage, is confirmed to belong to the ith contact if X(t) falls into a two-dimensional predicted region centered around the predicted position X̂_i(t) of the ith contact. Generally, a difference from the nearest-neighbor contact mapping is that instead of evaluating all of the possible components for a given X̂_i(t), only the component with the first-level association is selected for the verification process.
The min-max distance determination seeks to determine the mapping that has the smallest maximum distance between the components of a previous frame and the components of a current frame. In this example, a maximum distance 902 from a component 904 to a touch position 906 of a subsequent component has already been identified. A mapping is then determined from components 908 in a current frame to components 910 in a previous frame. For example, the determination for a first contact is whether to select the component association represented by the dashed line or the solid line at 912, and the determination for a second contact is whether to select the component association represented by the dashed line or the solid line at 914.
In an equation, P(i) defines a mapping from N to M, where N represents the set of numerical labels on the contacts from a previous frame, and M represents the labels of the components in a current frame. More specifically, P(i) is a function of i ∈ N (range 0…N−1) taking values in M (range 0…M−1), such that P(i) ≠ P(j) for i ≠ j. Furthermore, P denotes the entire set of all possible P(i), and the best P(i) within P is determined so that the mapping defines an association of current components with previous touch positions that makes the most sense.
For any P(i) in P, an array D(i, P), i ∈ N, denotes the distances for each pair i in the mapping. More specifically, for each pair i in the mapping, the distance D(i, P) is defined as the L2 distance between the position of the component of the current frame and the predicted position of the component of the previous frame if the prediction exists, or otherwise the position of the component of the previous frame. A descending sort of the array D(i, P) is then initiated and the result is denoted as SortedD(k, P), where 0 ≤ k < N and:
SortedD(0, P) ≥ SortedD(1, P) ≥ … ≥ SortedD(N−1, P)
The best P can be obtained by solving the following minimization problem: BestP = arg min over P ∈ P of (ED(P) + λ0*EC(P)); where:
The value ED is the contribution from the maximum distance matching, and the value MAXD is the maximum distance on the display screen (typically the diagonal distance). The layered maximum distance matching in this example accounts for the degeneracy of the configuration once the components with larger distances have been matched.
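The layered maximum distance matching can be sketched by comparing the descending-sorted distance arrays lexicographically, so that once the largest distances tie, the next-largest distance decides, and so on. This is an illustrative brute-force version with hypothetical names; the cross-trajectory penalty EC is omitted.

```python
import itertools
import math

def best_layered_mapping(prev, curr):
    # Among all permutations, pick the one whose descending-sorted
    # distance array SortedD is lexicographically smallest: ties on the
    # largest distance are broken by the next largest, and so on.
    def sorted_d(perm):
        return sorted((math.dist(prev[i], curr[perm[i]])
                       for i in range(len(prev))), reverse=True)
    return min(itertools.permutations(range(len(prev))), key=sorted_d)
```

Python's list comparison is already lexicographic, so `min` with this key implements the layered comparison directly.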
0 < (b1*c0 − b0*c1)/det < 1
0 < (a0*c1 − a1*c0)/det < 1
where det = b1*a0 − b0*a1, and:
a = (a0, a1) = x1 − x0
b = (b0, b1) = x2 − x3
c = (c0, c1) = x2 − x0
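The conditions above amount to a standard parametric intersection test for the trajectory segments x0→x1 and x2→x3: both parameters must fall strictly between 0 and 1. A direct sketch (the function name is illustrative):

```python
def segments_cross(x0, x1, x2, x3):
    # Points are (x, y) tuples; trajectories are segments x0->x1, x2->x3.
    a = (x1[0] - x0[0], x1[1] - x0[1])  # a = x1 - x0
    b = (x2[0] - x3[0], x2[1] - x3[1])  # b = x2 - x3
    c = (x2[0] - x0[0], x2[1] - x0[1])  # c = x2 - x0
    det = b[1] * a[0] - b[0] * a[1]
    if det == 0:
        return False  # parallel trajectories do not strictly cross
    t = (b[1] * c[0] - b[0] * c[1]) / det
    s = (a[0] * c[1] - a[1] * c[0]) / det
    return 0 < t < 1 and 0 < s < 1
```

For two diagonals of a square the parameters both evaluate to 0.5, so the trajectories cross; disjoint or parallel segments fail one of the conditions.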
At block 1202, touch input sensor data is recognized as a series of components of a contact on a touch-screen display. For example, the touch input module 114 (
At block 1204, the components are determined as correlating to the contact based on multi-pass nearest-neighbor contact mapping. For example, the contact tracking service 142 (
At block 1206, the forward nearest-neighbor contact mapping is initiated. For example, the contact tracking service 142 initiates the forward nearest-neighbor contact mapping to evaluate distance from one or more additional components of the contact to predicted component positions of the components. At block 1208, a determination is made as to whether any unmapped components remain after the forward nearest-neighbor contact mapping. If unmapped components remain (i.e., “yes” from block 1208), then at block 1210, mapped component associations that are mapped by the forward nearest-neighbor contact mapping are released and, at block 1212, the reverse nearest-neighbor contact mapping is initiated. For example, the contact tracking service 142 determines whether unmapped components remain after the forward nearest-neighbor contact mapping and, if yes, releases any mapped component associations and initiates the reverse nearest-neighbor contact mapping to evaluate distance from the predicted component positions to the one or more additional components of the contact.
At block 1214, the components of the contact are associated to represent a tracking of the contact and, at block 1216, a same identifier is then assigned to all of the components that are associated with the contact. For example, the contact tracking service 142 associates all of the components of the contact to represent a tracking of the contact and assigns the same identifier to all of the components.
The device 1300 includes communication devices 1302 that enable wired and/or wireless communication of device data 1304, such as received data, data that is being received, data scheduled for broadcast, data packets of the data, etc. The device data or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device. Media content stored on the device can include any type of audio, video, and/or image data. The device includes one or more data inputs 1306 via which any type of data, media content, and/or inputs can be received, such as user-selectable inputs and any other type of audio, video, and/or image data received from any content and/or data source.
The device 1300 also includes communication interfaces 1308, such as any one or more of a serial, parallel, network, or wireless interface. The communication interfaces provide a connection and/or communication links between the device and a communication network by which other electronic, computing, and communication devices communicate data with the device.
The device 1300 includes one or more processors 1310 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable instructions to control the operation of the device. Alternatively or in addition, the device can be implemented with any one or combination of software, hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 1312. In embodiments, the device 1300 can also include a touch input module 1314 that is implemented to recognize touch input sensor data. Although not shown, the device can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
The device 1300 also includes one or more memory devices 1316 (e.g., computer-readable storage media) that enable data storage, such as random access memory (RAM), non-volatile memory (e.g., read-only memory (ROM), flash memory, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable disc, and the like. The device may also include a mass storage media device.
Computer readable media can be any available medium or media that is accessed by a computing device. By way of example, and not limitation, computer readable media may comprise storage media and communication media. Storage media include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data. Storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store information and which can be accessed by a computer.
Communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism. Communication media also include any information delivery media. A modulated data signal has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.
A memory device 1316 provides data storage mechanisms to store the device data 1304, other types of information and/or data, and various device applications 1318. For example, an operating system 1320 can be maintained as a software application with the memory device and executed on the processors. The device applications may also include a device manager, such as any form of a control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, and so on. In this example, the device applications 1318 include a gesture recognition application 1322 and a contact tracking service 1324 that implement embodiments of multi-pass touch contact tracking as described herein.
The device 1300 also includes an audio and/or video processing system 1326 that generates audio data for an audio system 1328 and/or generates display data for a display system 1330. The audio system and/or the display system may include any devices that process, display, and/or otherwise render audio, video, display, and/or image data. Display data and audio signals can be communicated to an audio device and/or to a display device via an RF (radio frequency) link, S-video link, composite video link, component video link, DVI (digital video interface), analog audio connection, or other similar communication link. In implementations, the audio system and/or the display system are external components to the device. Alternatively, the audio system and/or the display system are integrated components of the example device, such as an integrated touch-screen display.
Although embodiments of multi-pass touch contact tracking have been described in language specific to features and/or methods, the subject of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as example implementations of multi-pass touch contact tracking.
This application claims priority to U.S. Provisional Application Ser. No. 61/449,538 filed Mar. 4, 2011 entitled “Multi-Pass Touch Contact Tracking” to Zhao et al., the disclosure of which is incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
4421997 | Forys | Dec 1983 | A |
5825352 | Bisset et al. | Oct 1998 | A |
5856822 | Du et al. | Jan 1999 | A |
5943043 | Furuhata et al. | Aug 1999 | A |
6008636 | Miller et al. | Dec 1999 | A |
6091406 | Kambara et al. | Jul 2000 | A |
6323846 | Westerman et al. | Nov 2001 | B1 |
6671406 | Anderson | Dec 2003 | B1 |
6741237 | Benard et al. | May 2004 | B1 |
6856259 | Sharp | Feb 2005 | B1 |
6977646 | Hauck et al. | Dec 2005 | B1 |
7053887 | Kraus et al. | May 2006 | B2 |
7174649 | Harris | Feb 2007 | B1 |
7254775 | Geaghan et al. | Aug 2007 | B2 |
7295191 | Kraus et al. | Nov 2007 | B2 |
7362313 | Geaghan et al. | Apr 2008 | B2 |
7375454 | Takasaki | May 2008 | B2 |
7580556 | Lee et al. | Aug 2009 | B2 |
7592999 | Rosenberg et al. | Sep 2009 | B2 |
7619618 | Westerman et al. | Nov 2009 | B2 |
7711450 | Im et al. | May 2010 | B2 |
7725014 | Lam et al. | May 2010 | B2 |
7728821 | Hillis et al. | Jun 2010 | B2 |
7746325 | Roberts | Jun 2010 | B2 |
7797115 | Tasher et al. | Sep 2010 | B2 |
7812828 | Westerman et al. | Oct 2010 | B2 |
7907750 | Ariyur et al. | Mar 2011 | B2 |
7938009 | Grant et al. | May 2011 | B2 |
7978182 | Ording et al. | Jul 2011 | B2 |
8061223 | Pan | Nov 2011 | B2 |
8217909 | Young | Jul 2012 | B2 |
8314780 | Lin et al. | Nov 2012 | B2 |
8493355 | Geaghan et al. | Jul 2013 | B2 |
20030164820 | Kent | Sep 2003 | A1 |
20040207606 | Atwood et al. | Oct 2004 | A1 |
20050012724 | Kent | Jan 2005 | A1 |
20050063566 | Beek et al. | Mar 2005 | A1 |
20060097991 | Hotelling et al. | May 2006 | A1 |
20060175485 | Cramer | Aug 2006 | A1 |
20070081726 | Westerman et al. | Apr 2007 | A1 |
20080041639 | Westerman et al. | Feb 2008 | A1 |
20080062140 | Hotelling et al. | Mar 2008 | A1 |
20080068229 | Chuang | Mar 2008 | A1 |
20080150909 | North et al. | Jun 2008 | A1 |
20080158185 | Westerman | Jul 2008 | A1 |
20080180399 | Cheng | Jul 2008 | A1 |
20080211778 | Ording et al. | Sep 2008 | A1 |
20080211782 | Geaghan et al. | Sep 2008 | A1 |
20080278453 | Reynolds et al. | Nov 2008 | A1 |
20080284899 | Haubmann et al. | Nov 2008 | A1 |
20080309624 | Hotelling | Dec 2008 | A1 |
20080309629 | Westerman et al. | Dec 2008 | A1 |
20090009483 | Hotelling et al. | Jan 2009 | A1 |
20090046073 | Pennington et al. | Feb 2009 | A1 |
20090096753 | Lim | Apr 2009 | A1 |
20090141046 | Rathnam et al. | Jun 2009 | A1 |
20090157206 | Weinberg et al. | Jun 2009 | A1 |
20090160763 | Cauwels et al. | Jun 2009 | A1 |
20090174679 | Westerman | Jul 2009 | A1 |
20090225036 | Wright | Sep 2009 | A1 |
20090241701 | Pan | Oct 2009 | A1 |
20090250268 | Staton et al. | Oct 2009 | A1 |
20090251435 | Westerman et al. | Oct 2009 | A1 |
20090251436 | Keskin | Oct 2009 | A1 |
20090267903 | Cady et al. | Oct 2009 | A1 |
20090273584 | Staton et al. | Nov 2009 | A1 |
20090303202 | Liu | Dec 2009 | A1 |
20090312009 | Fishel | Dec 2009 | A1 |
20100053099 | Vincent et al. | Mar 2010 | A1 |
20100073318 | Hu et al. | Mar 2010 | A1 |
20100103118 | Townsend et al. | Apr 2010 | A1 |
20100103121 | Kim et al. | Apr 2010 | A1 |
20100134429 | You et al. | Jun 2010 | A1 |
20100193258 | Simmons et al. | Aug 2010 | A1 |
20100214233 | Lee | Aug 2010 | A1 |
20100231508 | Cruz-Hernandez et al. | Sep 2010 | A1 |
20100277505 | Ludden et al. | Nov 2010 | A1 |
20100302211 | Huang | Dec 2010 | A1 |
20100309139 | Ng | Dec 2010 | A1 |
20100315266 | Gunawardana et al. | Dec 2010 | A1 |
20100315366 | Lee et al. | Dec 2010 | A1 |
20100315372 | Ng | Dec 2010 | A1 |
20110018822 | Lin et al. | Jan 2011 | A1 |
20110025629 | Grivna et al. | Feb 2011 | A1 |
20110042126 | Spaid et al. | Feb 2011 | A1 |
20110050620 | Hristov | Mar 2011 | A1 |
20110080348 | Lin et al. | Apr 2011 | A1 |
20110084929 | Chang et al. | Apr 2011 | A1 |
20110106477 | Brunner | May 2011 | A1 |
20110115709 | Cruz-Hernandez | May 2011 | A1 |
20110141054 | Wu | Jun 2011 | A1 |
20110261005 | Joharapurkar et al. | Oct 2011 | A1 |
20110267481 | Kagei | Nov 2011 | A1 |
20110298709 | Vaganov | Dec 2011 | A1 |
20110298745 | Souchkov | Dec 2011 | A1 |
20110299734 | Bodenmueller | Dec 2011 | A1 |
20110304577 | Brown | Dec 2011 | A1 |
20110304590 | Su et al. | Dec 2011 | A1 |
20120030624 | Migos | Feb 2012 | A1 |
20120032891 | Parivar | Feb 2012 | A1 |
20120065779 | Yamaguchi et al. | Mar 2012 | A1 |
20120068957 | Puskarich et al. | Mar 2012 | A1 |
20120075331 | Mallick | Mar 2012 | A1 |
20120105334 | Aumiller et al. | May 2012 | A1 |
20120131490 | Lin et al. | May 2012 | A1 |
20120146956 | Jenkinson | Jun 2012 | A1 |
20120187956 | Uzelac | Jul 2012 | A1 |
20120188176 | Uzelac | Jul 2012 | A1 |
20120188197 | Uzelac | Jul 2012 | A1 |
20120191394 | Uzelac | Jul 2012 | A1 |
20120206377 | Zhao | Aug 2012 | A1 |
20120206380 | Zhao | Aug 2012 | A1 |
20120268416 | Pirogov et al. | Oct 2012 | A1 |
20120280934 | Ha et al. | Nov 2012 | A1 |
20120319992 | Lee | Dec 2012 | A1 |
20130016045 | Zhao | Jan 2013 | A1 |
20130063167 | Jonsson | Mar 2013 | A1 |
20130113751 | Uzelac | May 2013 | A1 |
20130197862 | Uzelac et al. | Aug 2013 | A1 |
20130238129 | Rose et al. | Sep 2013 | A1 |
20130345864 | Park | Dec 2013 | A1 |
Number | Date | Country |
---|---|---|
1761932 | Apr 2006 | CN |
200947594 | Sep 2007 | CN |
101553777 | Oct 2009 | CN |
101661373 | Mar 2010 | CN |
101937296 | Jan 2011 | CN |
201828476 | May 2011 | CN |
2201903594 | Jul 2011 | CN |
202093112 | Dec 2011 | CN |
101545938 | Jan 2012 | CN |
202171626 | Mar 2012 | CN |
202196126 | Apr 2012 | CN |
102436334 | May 2012 | CN |
101982783 | Jul 2012 | CN |
19939159 | Mar 2000 | DE |
2284654 | Feb 2011 | EP |
2003303051 | Oct 2003 | JP |
1020050003155 | Jan 2005 | KR |
20050094359 | Sep 2005 | KR |
100763057 | Oct 2007 | KR |
1020080066416 | Jul 2008 | KR |
100941441 | Feb 2010 | KR |
20100067178 | Jun 2010 | KR |
20100077298 | Jul 2010 | KR |
20100129015 | Dec 2010 | KR |
101007049 | Jan 2011 | KR |
1020110011337 | Feb 2011 | KR |
101065014 | Sep 2011 | KR |
WO-2006042309 | Apr 2006 | WO |
WO-2010073329 | Jul 2010 | WO |
WO-2013063042 | May 2013 | WO |
Entry |
---|
“Actuation Force of Touch Screen”, Solutions @ Mecmesin, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileId=188971>,(Dec. 31, 2010), 1 page. |
“AO Touch Screen Tester”, retrieved from <http://www.ao-cs.com/Projects/touch%20screen%20tester%20project.html>, (Dec. 31, 2010), 1 page. |
“Capacitive Touch Sensors—Application Fields, Technology Overview and Implementation Example”, Fujitsu Microelectronics Europe GmbH; retrieved from http://www.fujitsu.com/downloads/MICRO/fme/articles/fujitsu-whitepaper-capacitive-touch-sensors.pdf on Jul. 20, 2011, (Jan. 12, 2010),12 pages. |
“Haptic-Actuator Controllers”, retrieved from <http://www.maxim-ic.com/products/data—converters/touch-interface/haptic-actuator.cfm> on May 4, 2011, 1 page. |
“How to Use the Precision Touch Testing Tool”, retrieved from <http://feishare.com/attachments/article/279/precision-touch-testind-tool-Windows8-hardware-certification.pdf>, (Apr. 15, 2012),10 pages. |
“Linearity Testing Solutions in Touch Panels”, retrieved from <advantech.com/machine-automation/ . . . /(%7BD05BC586-74DD-4BFA-B81A-2A9F7ED489F/>, (Nov. 15, 2011), 2 pages. |
“MAX11871”, retrieved from <http://www.maxim-ic.com/datasheet/index.mvp/id/7203> on May 4, 2011, 2 pages. |
“MicroNav Integration Guide Version 3.0”, retrieved from <http://www.steadlands.com/data/interlink/micronavintguide.pdf>, (Dec. 31, 2003),11 pages. |
“Microsoft Windows Simulator Touch Emulation”, retrieved from <blogs.msdn.com/b/visualstudio/archive/2011/09/30/microsoft-windows-simulator-touch-emulation.aspx>, (Sep. 30, 2011), 3 pages. |
“OptoFidelity Touch & Test”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileId=188969>, (Feb. 20, 2012), 2 pages. |
“OptoFidelity Touch and Test”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileId=188420>, (May 4, 2012), 2 pages. |
“OptoFidelity Two Fingers—robot”, video available at <http://www.youtube.com/watch?v=YppRASbXHfk&feature=player—embedded#!section>, (Sep. 15, 2010), 2 pages. |
PCT Search Report and Written Opinion, Application No. PCT/US2012/024781, (Sep. 3, 2012), 9 pages. |
“Projected Capacitive Test Fixture”, retrieved from <http://www.touch-intl.com/downloads/DataSheets%20for%20Web/6500443-PCT-DataSheet-Web.pdf>, (2009), 2 pages. |
“Resistive Touch Screen—Resistance Linearity Test”, video available at <http://www.youtube.com/watch?v=hb23GpQdXXU>, (Jun. 17, 2008), 2 pages. |
“Smartphone Automatic Testing Robot at UEI Booth”, video available at <http://www.youtube.com/watch?v=f-Q4ns-b9sA>, (May 9, 2012), 2 pages. |
“STM23S-2AN NEMA 23 Integrated Drive+Motor”, Retrieved from: <http://www.applied-motion.com/products/integrated-steppers/stm23s-2an> on Jan. 24, 2012, 3 pages. |
“Technology Comparison: Surface Acoustic Wave, Optical and Bending Wave Technology”, 3M Touch Systems, Available at <http://multimedia.3m.com/mws/mediawebserver?mwsId=66666UuZjcFSLXTtnXT2NXTaEVuQEcuZgVs6EVs6E666666--&fn=DST-Optical-SAW%20Tech%20Brief.pdf>, (2009), pp. 1-4. |
“Touch Panel Inspection & Testing Solution”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileId=188967>, (Dec. 31, 2010), 1 page. |
“Touch Panel Semi-Auto Handler Model 3810”, retrieved from <http://www.chromaus.com/datasheet/3810—en.pdf>, (Dec. 31, 2010), 2 pages. |
“TouchSense Systems Immersion”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileId=188486>, (Jun. 19, 2010), 20 pages. |
“Using Low Power Mode on the MPR083 and MPR084”, Freescale Semiconductor Application Note, Available at <http://cache.freescale.com/files/sensors/doc/app—note/AN3583.pdf>, (Nov. 2007), pp. 1-5. |
Asif, Muhammad et al., “MPEG-7 Motion Descriptor Extraction for Panning Camera Using Sprite Generated”, In Proceedings of AVSS 2008, Available at <http://www.ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4730384>, (Sep. 2008), pp. 60-66. |
Brodkin, Jon “Windows 8 hardware: Touchscreens, sensor support and robotic fingers”, <<http://arstechnica.com/business/news/2011/09/windows-8-hardware-touch-screens-sensor-support-and-robotic-fingers.ars>>, (Sep. 13, 2011),1 Page. |
Buffet, Y “Robot Touchscreen Analysis”, <<http://ybuffet.posterous.com/labsmotocom-blog-archive-robot-touchscreen-an>>, (Apr. 19, 2010), 2 Pages. |
Cravotta, Robert “The Battle for Multi-touch”, Embedded Insights, retrieved from <http://www.embeddedinsights.com/channels/2011/04/12/the-battle-for-multi-touch/> on May 4, 2011,(Apr. 12, 2011), 3 pages. |
Dillow, Clay “Liquid-Filled Robot Finger More Sensitive to Touch Than a Human's”, retrieved from <www.popsci.com/technology/article/2012-06/new-robot-finger-more-sensitive-touch-human> on Jul. 27, 2012, (Jun. 19, 2012), 3 pages. |
Hoggan, Eve et al., “Mobile Multi-Actuator Tactile Displays”, In 2nd international conference on Haptic and audio interaction design, retrieved from <http://www.dcs.gla.ac.uk/˜stephen/papers/HAID2.pdf>, (Oct. 29, 2007),12 pages. |
Hoshino, et al., “Pinching at finger tips for humanoid robot hand”, Retrieved at <<http://web.mit.edu/zoz/Public/HoshinoKawabuchiRobotHand.pdf>>, (Jun. 30, 2005), 9 Pages. |
Kastelan, et al., “Stimulation Board for Automated Verification of Touchscreen-Based Devices”, 22nd International Conference on Field Programmable Logic and Applications, Available at <https://www2.lirmm.fr/lirmm/interne/BIBLI/CDROM/MIC/2012/FPL—2012/Papers/PHD7.pdf>,(Aug. 29, 2012), 2 pages. |
Kastelan, et al., “Touch-Screen Stimulation for Automated Verification of Touchscreen-Based Devices”, In IEEE 19th International Conference and Workshops on Engineering of Computer Based Systems, (Apr. 11, 2012), pp. 52-55. |
Khandkar, Shahedul H., et al., “Tool Support for Testing Complex MultiTouch Gestures”, ITS 2010, Nov. 7-10, 2010, Saarbrucken, Germany, (Nov. 7, 2010), 10 pages. |
Kjellgren, Olof “Developing a remote control application for Windows CE”, Bachelor Thesis performed in Computer Engineering at ABE Robotics, Mälardalen University, Department of Computer Science and Electronics, Retrieved at <<http://www.idt.mdh.se/utbildning/exjobblfiles/TR0661.pdf>>, (May 30, 2007), 43 Pages. |
Kuosmanen, Hans “OptoFidelity Automating UI Testing”, video available at <http://youtube.com/watch?v=mOZ2r7ZvyTg&feature=player—embedded#!section>, (Oct. 14, 2010), 2 pages. |
Kuosmanen, Hans “Testing the Performance of Touch-Enabled Smartphone User Interfaces”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileId=188442>, (Dec. 31, 2008), 2 pages. |
Levin, Michael et al., “Tactile-Feedback Solutions for an Enhanced User Experience”, retrieved from <http://www.pbinterfaces.com/documents/Tactile—Feedback—Solutions.pdf>, (Oct. 31, 2009), pp. 18-21. |
McGlaun, Shane “Microsoft's Surface 2.0 Stress Testing Robot Called Patty Shown off for First Time”, Retrieved at <<http://www.slashgear.com/microsofts-surface-2-0-stress-testing-robot-called-patty-shown-off-for-first-time-19172971/>>, (Aug. 19, 2011), 1 Page. |
McMahan, William et al., “Haptic Display of Realistic Tool Contact via Dynamically Compensated Control of a Dedicated Actuator”, International Conference on Intelligent Robots and Systems, St. Louis, MO, Oct. 11-15, 2009, retrieved from <http://repository.upenn.edu/meam—papers/222>,(Dec. 15, 2009), 9 pages. |
Pratt, Susan “Factors Affecting Sensor Response”, Analog Devices, AN-830 Application Note, Available at <http://www.analog.com/static/imported-files/application—notes/5295737729138218742AN830—0.pdf>,(Dec. 2005), pp. 1-8. |
Takeuchi, et al., “Development of a Muti-fingered Robot Hand with Softness changeable Skin Mechanism”, International Symposium on and 2010 6th German Conference on Robotics (ROBOTIK), Retrieved at <<http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=05756853>>,(Jun. 7, 2010), 7 Pages. |
Terpstra, Brett “BetterTouchTool Makes Multi-touch Infinitely More Useful, for Free”, retrieved from <http://www.tuaw.com/2010/01/05/bettertouchtool-makes-multi-touch-infinitely-more-useful-for-fr/> on Jul. 20, 2012, (Jan. 5, 2010), 4 pages. |
Toto, Serkan “Video: Smartphone Test Robot Simulates Countless Flicking and Tapping”, retrieved from <techcrunch.com/2010/12/21/video-smartphone-test-robot-simulates-countless-flicking-and-tapping/>, (Dec. 21, 2010), 2 pages. |
Wimmer, Raphael et al., “Modular and Deformable Touch-Sensitive Surfaces Based on Time Domain Reflectometry”, In Proceedings of UIST 2011, Available at <http://www.medien.ifi.lmu.de/pubdb/publications/pub/wimmer2011tdrTouch/wimmer2011tdrTouch.pdf>,(Oct. 2011),10 pages. |
Zivkov, et al., “Touch Screen Mobile Application as Part of Testing and Verification System”, Proceedings of the 35th International Convention, (May 21, 2012), pp. 892-895. |
International Search Report and Written Opinion, Application No. PCT/US2011/055621, (Jun. 13, 2012), 8 pages. |
International Search Report, Application No. PCT/US2011/058855, (Nov. 1, 2011), 8 pages. |
Non-Final Office Action, U.S. Appl. No. 12/941,693, (Jul. 18, 2012), 19 pages. |
International Search Report Mailed Date Sep. 3, 2012, Application No. PCT/US2012/027642, Filing Date : Mar. 4, 2012, pp. 9. |
Binns, Francis Styrion, “Multi-“Touch” Interaction via Visual Tracking”, Retrieved at <<http://www.cs.bath.ac.uk/˜mdv/courses/CM30082/projects.bho/2008-9/Binns-FS-dissertation-2008-9.pdf>>, May 2009, pp. 81. |
Baraldi, et al., “WikiTable: Finger Driven Interaction for Collaborative Knowledge-Building Workspaces”, Retrieved at <<http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1640590>>, 2006 Conference on Computer Vision and Pattern Recognition Workshop, Jul. 5, 2006, pp. 6. |
Dillencourt, et al., “A General Approach to Connected-Component Labeling for Arbitrary Image Representations”, Retrieved at <<http://www.cs.umd.edu/˜hjs/pubs/DillJACM92.pdf>>, Journal of the Association for Computing Machinery, vol. 39, No. 2, Apr. 1992, pp. 253-280. |
Tao, et al., “An Efficient Cost Model for Optimization of Nearest Neighbor Search in Low and Medium Dimensional Spaces”, Retrieved at <<http://www.cais.ntu.edu.sg/˜jzhang/papers/ecmonns.pdf>>, IEEE Transactions on Knowledge and Data Engineering, vol. 16, No. 10, Oct. 2004, pp. 36. |
Westman, et al., “Color Segmentation by Hierarchical Connected Components Analysis with Image Enhancement by Symmetric Neighborhood Filter”, Retrieved at <<http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=118219>>, Proceedings of 10th International Conference on Pattern Recognition, Jun. 16-21, 1990, pp. 796-802. |
Input Testing Tool, U.S. Appl. No. 13/659,777, (Oct. 24, 2012),31 pages. |
Non-Final Office Action, U.S. Appl. No. 12/941,693, (May 16, 2013),13 pages. |
Non-Final Office Action, U.S. Appl. No. 13/183,377, (Jun. 21, 2013),10 pages. |
Non-Final Office Action, U.S. Appl. No. 13/293,060, (Jul. 12, 2013), 9 pages. |
PCT Search Report and Written Opinion, Application No. PCT/US2013/021787, (May 13, 2013), 9 pages. |
“Touch Quality Test Robot”, U.S. Appl. No. 13/530,692, filed Jun. 22, 2012, 20 pages. |
Benko, Hrvoje et al., “Resolving Merged Touch Contacts”, U.S. Appl. No. 12/941,693, filed Nov. 8, 2010, 22 pages. |
Cao, Xiang et al., “Evaluation of an On-line Adaptive Gesture Interface with Command Prediction”, In Proceedings of GI 2005, Available at <http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=DAB1B08F620C23464427932BAF1ECF49?doi=10.1.1.61.6749&rep=rep1&type=pdf>,(May 2005),8 pages. |
Cao, Xiang et al., “ShapeTouch: Leveraging Contact Shape on Interactive Surfaces”, In Proceedings of TABLETOP 2008, Available at <http://www.cs.toronto.edu/˜caox/tabletop2008—shapetouch.pdf>,(2008),pp. 139-146. |
Dang, Chi T., et al., “Hand Distinction for Multi-Touch Tabletop Interaction”, University of Augsburg; Institute of Computer Science; Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces, (Nov. 23-25, 2009), 8 pages. |
Tsuchiya, Sho et al., “Vib-Touch: Virtual Active Touch Interface for Handheld Devices”, In Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication, Available at <http://www.mech.nagoya-u.ac.jp/asi/en/member/shogo—okamoto/papers/tsuchiyaROMAN2009.pdf>,(Oct. 2009),pp. 12-17. |
Wilson, Andrew D., “TouchLight: An Imaging Touch Screen and Display for Gesture-Based Interaction”, In Proceedings of ICIM 2004, Available at <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.95.3647&rep=rep1&type=pdf>,(Oct. 2004),8 pages. |
PCT Search Report and Written Opinion, Application No. PCT/US2012/024780, (Sep. 3, 2012), 9 pages. |
Final Office Action, U.S. Appl. No. 12/941,693, (Nov. 26, 2012), 22 Pages. |
Non-Final Office Action, U.S. Appl. No. 13/152,991, (Mar. 21, 2013), 10 pages. |
Final Office Action, U.S. Appl. No. 13/152,991, (Sep. 20, 2013),14 pages. |
Final Office Action, U.S. Appl. No. 13/183,377, (Oct. 15, 2013),12 pages. |
Final Office Action, U.S. Appl. No. 13/293,060, (Sep. 25, 2013),10 pages. |
Non-Final Office Action, U.S. Appl. No. 13/293,060, (Nov. 29, 2013),11 pages. |
“PCT Search Report and Written Opinion”, Application No. PCT/US2013/046208, Sep. 27, 2013, 12 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/154,161, Jan. 3, 2014, 14 pages. |
“Non-Final Office Action”, U.S. Appl. No. 12/941,693, Nov. 18, 2013, 21 Pages. |
“Notice of Allowance”, U.S. Appl. No. 13/198,415, Dec. 26, 2013, 8 pages. |
“Foreign Office Action”, CN Application No. 201210018527.8, Feb. 24, 2014, 10 Pages. |
“Foreign Office Action”, CN Application No. 201210029859.6, Feb. 21, 2014, 15 Pages. |
“Final Office Action”, U.S. Appl. No. 13/530,692, Apr. 10, 2014, 16 pages. |
“Final Office Action”, U.S. Appl. No. 13/154,161, Apr. 22, 2014, 16 pages. |
“International Search Report & Written Opinion for PCT Application No. PCT/US2013/061067”, Mailed Date: Feb. 7, 2014, Filed Date: Sep. 21, 2013, 11 Pages. |
Non-Final Office Action, U.S. Appl. No. 13/530,692, Jan. 31, 2014, 14 pages. |
Non-Final Office Action, U.S. Appl. No. 13/152,991, Mar. 21, 2014, 18 pages. |
Non-Final Office Action, U.S. Appl. No. 13/198,036, Jan. 31, 2014, 14 pages. |
Non-Final Office Action, U.S. Appl. No. 13/099,288, Feb. 6, 2014, 13 pages. |
Non-Final Office Action, U.S. Appl. No. 13/183,377, Feb. 27, 2014, 12 pages. |
Number | Date | Country | |
---|---|---|---|
20120223894 A1 | Sep 2012 | US |
Number | Date | Country | |
---|---|---|---|
61449538 | Mar 2011 | US |