Multi-finger detection and component resolution

Information

  • Patent Grant
  • 8913019
  • Patent Number
    8,913,019
  • Date Filed
    Thursday, July 14, 2011
  • Date Issued
    Tuesday, December 16, 2014
Abstract
In embodiments of multi-finger detection and component resolution, touch input sensor data is recognized as a component of a multi-finger gesture on a touch-screen display. An ellipse is determined that approximately encompasses the component, and the ellipse has a primary axis and a secondary axis that are orthogonal. A distribution is then generated that projects sensor data elements from the primary axis based on detected intensity of the touch input sensor data. A histogram function can then be generated based on the distribution, where the histogram function indicates individual contacts of the component and separation of the individual contacts.
Description
BACKGROUND

Portable computing devices, such as mobile phones, portable and tablet computers, entertainment devices, handheld navigation devices, and the like increasingly offer more functions and features which can make it difficult for a user to navigate and select commands that are relevant to a function the user wants to initiate on a device. In addition to the traditional techniques used to interact with computing devices, such as a mouse, keyboard, and other input devices, touch sensors and touch-screen displays are commonly integrated in mobile phones and tablet computers, and are utilized both for display and user-selectable touch and gesture inputs. A continuing design challenge with these types of portable devices having touch sensors and/or touch-screen displays is the touch signal processing to track touch and gesture inputs that are identified from successive frames of sensor image data.


Touch contacts on a touch-screen display represent the motion trace of a gesture, such as when a user uses his or her fingers to contact a touch-screen display and gesture while maintaining the contact with the display. A failure to correctly track and interpret the motion trace of a touch contact for a gesture input can lead to the failure of gesture recognition operations and gesture tracking processing. For example, multi-finger gesture processing, such as for multi-finger tapping, attempts to detect and resolve when a connected component is associated with multiple fingers that are merged together. Conventional processing techniques either use temporal-domain prediction, which depends on finger touch timing and thus can be unreliable, or are based on determining the component contour, which is more susceptible to boundary noise in the touch input sensor data.


SUMMARY

This Summary introduces simplified concepts of multi-finger detection and component resolution, and the concepts are further described below in the Detailed Description and/or shown in the Figures. This Summary should not be considered to describe essential features of the claimed subject matter, nor used to determine or limit the scope of the claimed subject matter.


Multi-finger detection and component resolution is described. In embodiments, touch input sensor data is recognized as a component of a multi-finger gesture on a touch-screen display. An ellipse is determined that approximately encompasses the component, and the ellipse has a primary axis and a secondary axis that are orthogonal. A distribution is then generated that projects sensor data elements from the primary axis based on detected intensity of the touch input sensor data. A histogram function can then be generated based on the distribution, where the histogram function indicates individual contacts of the component and separation of the individual contacts.


In other embodiments, each individual contact of the component can be associated with a different finger input of the multi-finger gesture. Histogram function high points indicate the individual contacts of the component, and a histogram function low point indicates the separation of the individual contacts of the component. Additionally, a retrace of the histogram function can be performed to confirm the histogram function high points and to eliminate a false indication of an individual contact. The individual contacts of the component can be separated, and an individual ellipse can then be determined for each individual contact to map each of the individual contacts for association with the different finger inputs of the multi-finger gesture.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of multi-finger detection and component resolution are described with reference to the following Figures. The same numbers may be used throughout to reference like features and components that are shown in the Figures:



FIG. 1 illustrates an example system in which embodiments of multi-finger detection and component resolution can be implemented.



FIG. 2 illustrates an example multi-finger gesture and a connected component in accordance with one or more embodiments.



FIG. 3 illustrates examples of a distribution and a histogram function based on the component for the multi-finger gesture in accordance with one or more embodiments.



FIG. 4 illustrates an example of separating individual contacts of the component of the multi-finger gesture in accordance with one or more embodiments.



FIG. 5 illustrates example method(s) of multi-finger detection and component resolution in accordance with one or more embodiments.



FIG. 6 illustrates various components of an example device that can implement embodiments of multi-finger detection and component resolution.





DETAILED DESCRIPTION

Embodiments of multi-finger detection and component resolution are described. As noted above, touch and gesture inputs on a touch-screen display of a computing device, such as a mobile phone or portable computer, may not be accurately tracked and/or processed. One of the operational stages in multi-touch signal processing is to detect and resolve the multiple contacts in a single connected component. These multiple contacts correspond to multiple fingers when they are close enough together that a single connected component is observed. Correctly identifying and resolving these contacts is an important processing aspect of various multi-touch gestures, such as distinguishing a three-finger tapping gesture from two- or four-finger tapping. Multi-finger detection and component resolution is implemented as a static approach with single-frame dependency: multiple fingers that are merged in a single connected component are detected by analyzing only the data from the current frame in a statistical framework, which eliminates the timing dependency of a temporal-based solution.


While features and concepts of multi-finger detection and component resolution can be implemented in any number of different devices, systems, environments, and/or configurations, embodiments of multi-finger detection and component resolution are described in the context of the following example devices, systems, and methods.



FIG. 1 illustrates an example system 100 in which various embodiments of multi-finger detection and component resolution can be implemented. The example system includes a computing device 102, which may be any one or combination of a mobile phone, entertainment device, navigation device, user device, wireless device, portable device, tablet computer, dual-screen folding device, and the like. The computing device includes an integrated touch-screen display 104, which is implemented to sense a gesture input 106, such as a user-initiated gesture in a user interface that is displayed on the touch-screen display. In this example, the gesture input is a two-finger gesture across the touch-screen display in an approximate direction indicated by the arrow, but may be a single-finger gesture input, or a multi-finger gesture input or tap (e.g., a three- or four-finger tap). Any of the computing devices can be implemented with various components, such as one or more processors and memory devices, as well as any number and combination of differing components as further described with reference to the example device shown in FIG. 6 to implement embodiments of multi-finger detection and component resolution.


In the example system 100, the computing device 102 includes a touch input module 108 (e.g., a lower layer component) that is implemented to recognize touch input sensor data 110 as a multi-finger gesture input, such as the gesture input 106 on the touch-screen display 104. The computing device also includes a gesture recognition application 112 (e.g., a higher layer component) that receives the touch input sensor data from the touch input module as HID reports 114 (i.e., human interface device reports). The HID reports include time and position data, as well as determined touch contact tracking, that correlates to gesture inputs on the touch-screen display of the computing device. The gesture recognition application 112 is implemented to recognize and generate various gestures as determined from touch input data (e.g., the HID reports 114) associated with inputs or combinations of inputs, such as the gesture input 106. The gesture recognition application can generate various gestures, such as select gestures, hold gestures, motion gestures, tap gestures, and other types of gestures from various user-selectable inputs.


An input recognition system of the computing device 102 may include any type of input detection features and/or devices to distinguish the various types of inputs, such as sensors (capacitive or resistive), light sensing pixels, touch sensors, cameras, and/or a natural user interface that interprets user interactions, gestures, inputs, and motions. In implementations, the input recognition system can detect motion inputs from discernable variables, such as from a direction variable, from start region position variables and end region position variables, and/or from a motion rate variable (e.g., a particular number of pixels per second).


As described herein, a gesture input may be recognized as a user input with one or more fingers on a touch-screen display of a device, and the gesture input includes one or more contacts that each correlate to the input of a finger on the touch-screen display. In the FIG. 1 example, the two-finger gesture input 106 includes two contacts identified as a first contact 116 that correlates to a first finger of the gesture input, and a second contact 118 that correlates to a second finger of the gesture input.


The gesture input data is received as a series of frames, and a frame includes a component that represents one touch position of a contact (e.g., along a gesture input that is one finger). For a two-finger gesture input, a frame can include a component of a first contact that correlates to the input of a first finger, and include a component of a second contact that correlates to the input of a second finger (and so on for more than a two-finger gesture input).


In the FIG. 1 example, the first contact 116 of the gesture input 106 includes successive components, such as component 120, component 122, and component 124 at different touch positions along the first contact. Similarly, the second contact 118 of the gesture input 106 includes successive components, such as component 126, component 128, and component 130 at different touch positions along the second contact. Accordingly, a first frame of the two-finger gesture input includes the component 120 and the component 126 of the respective first and second contacts at N-2 in the series of components. Similarly, a next frame of the gesture input at N-1 includes the component 122 and the component 128 of the respective first and second contacts, and a current frame of the gesture input at N includes the component 124 and the component 130 of the respective first and second contacts.


Therefore, a contact of a gesture input spans multiple frames and includes the components from each successive frame that have been identified as correlating to the contact, or to a section of the contact. A component represents a touch position of a contact in a frame (e.g., after the component has been identified as correlating to the contact).
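
As a minimal sketch (the type and field names below are illustrative, not from the patent), the frame, contact, and component relationship might be represented as:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Component:
    """One touch position of a contact, observed in a single frame."""
    frame_index: int
    x: float
    y: float

@dataclass
class Contact:
    """A contact spans multiple frames and collects the components that have
    been identified as correlating to it (i.e., one finger of the gesture)."""
    contact_id: int
    components: List[Component] = field(default_factory=list)
```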


The touch input module 108 recognizes the touch input sensor data 110 as the series of components of the two contacts 116 and 118 of the gesture input 106 on the touch-screen display 104 of the computing device 102. In embodiments, the touch input module 108 is implemented to generate a sensor map 132 from the touch input sensor data 110 for each component of each contact. A sensor map represents an individual component of a contact, such as when a user initiates the gesture input 106 on the touch-screen display 104. In this example, the sensor map includes sensor data elements 134 shown as 8-bit hex values that represent the signal strength at an element position in the sensor map. A stronger sensor signal of the touch input sensor data indicates more touch contact with an element in the sensor map. The sensor map can be generated as a two-dimensional array, and array indices of the elements in the two-dimensional grid correlate to sensed touch contact from the gesture input on the touch-screen display. The stationary baseline level can be subtracted out so that the elements in an area around the sensor map that are not detected as part of the touch contact are normalized to a zero level.
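
As an illustrative sketch only (the array shapes and the baseline input are assumptions, not the touch input module's actual implementation), a normalized sensor map could be derived from raw sensor readings as follows:

```python
import numpy as np

def build_sensor_map(raw_frame: np.ndarray, baseline: np.ndarray) -> np.ndarray:
    """Subtract the stationary baseline from a raw sensor frame so that elements
    that are not part of the touch contact are normalized to a zero level."""
    diff = raw_frame.astype(np.int32) - baseline.astype(np.int32)
    return np.clip(diff, 0, None)  # stronger signal => larger element value

# Example: an 8x8 frame of 8-bit readings with a flat stationary baseline of 16.
raw = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)
baseline = np.full((8, 8), 16, dtype=np.uint8)
sensor_map = build_sensor_map(raw, baseline)
```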


The computing device 102 also includes a component resolution service 136 that is implemented for multi-finger detection and resolution of connected components 138. The component resolution service can be implemented as computer-executable instructions, such as a software application, and executed by one or more processors to implement the various embodiments described herein. The component resolution service can also be implemented as firmware on dedicated sensor device hardware in the computing device. In this example, the component resolution service is shown implemented as a component of the touch input module 108. Alternatively, the component resolution service may be implemented as an independent software application or service for multi-finger detection and component resolution.



FIG. 2 illustrates an example multi-finger gesture 200, such as a tapping gesture with three fingers. Embodiments of multi-finger detection and component resolution can be implemented by the computing device 102 and the various components described with reference to FIG. 1. For example, the touch input module 108 recognizes touch input sensor data as a component 202 of the multi-finger gesture on the touch-screen display of the device. In this example, the component 202 is represented as a sensor map, which is generated as a two-dimensional array of sensor data elements that correlate to the detected intensity of the multi-finger gesture on the touch-screen display. In an embodiment, the component resolution service 136 is implemented to determine an ellipse 204 that approximately encompasses the component 202, and the ellipse has a primary axis 206 and a secondary axis 208 that are orthogonal.


The detection of the multi-finger condition can be based on a Gaussian mixture model of the multiple contacts, and uses the standard Expectation-Maximization (EM) procedure to obtain the centroids of the individual contacts in the component. In an embodiment, the component resolution service 136 in the first stage can model the sensor map as a Gaussian distribution, and the ellipse 204 that approximately encompasses the component 202 is determined from the Gaussian distribution.


The component 202 can be modeled as a Gaussian distribution, and a Maximum Likelihood Estimation (MLE) is then performed on the connected component to estimate the two axes of the ellipse for the component shape. The center-of-mass and covariance matrix can be determined as in the following equations (1), (2), and (3):

\hat{\mu} = \frac{1}{N} \sum_{y=0}^{H-1} \sum_{x=0}^{W-1} s[x][y]\, x   (1)

\hat{\Sigma} = \frac{1}{N} \sum_{y=0}^{H-1} \sum_{x=0}^{W-1} s[x][y]\, (x - \hat{\mu})(x - \hat{\mu})^{T}   (2)

N = \sum_{y=0}^{H-1} \sum_{x=0}^{W-1} s[x][y]   (3)

The value of s[x][y] is the element value, treated as a histogram of all the samples at a particular grid position (x, y). The eigenproblem for the 2×2 matrix is then solved in equation (4):

\hat{\Sigma}\,\varphi = \Lambda\,\varphi   (4)


The parameter Λ = diag(λ0, λ1) is the 2×2 diagonal matrix of eigenvalues, and the parameter φ = (φ0, φ1) is the eigenvector matrix whose columns correspond to λ0 and λ1. For this 2×2 eigenproblem, there exists an exact solution, and the two eigenvalues can be determined by solving the following quadratic equation (5):

\lambda^{2} - \mathrm{Tr}(\hat{\Sigma})\,\lambda + |\hat{\Sigma}| = 0   (5)


The primary axis 206 is the direction of the eigenvector φ0 that corresponds to the larger eigenvalue λ0, and the secondary axis 208 is defined by the other eigenvector φ1.
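
A minimal numpy sketch of equations (1)-(5), assuming the sensor map s[x][y] is available as a non-negative two-dimensional array; np.linalg.eigh stands in for the closed-form solution of the 2×2 eigenproblem:

```python
import numpy as np

def component_axes(s: np.ndarray):
    """Estimate the centroid, covariance, and primary/secondary axes of a
    connected component from its sensor map s[x][y] (equations (1)-(5))."""
    W, H = s.shape                                    # s[x][y], x in [0, W-1], y in [0, H-1]
    xs, ys = np.meshgrid(np.arange(W), np.arange(H), indexing="ij")
    pts = np.stack([xs, ys], axis=-1).reshape(-1, 2).astype(float)  # index vectors x = (x, y)
    w = s.reshape(-1).astype(float)                   # s[x][y] treated as a histogram of samples

    N = w.sum()                                                                 # equation (3)
    mu = (w[:, None] * pts).sum(axis=0) / N                                     # equation (1)
    d = pts - mu
    cov = (w[:, None, None] * d[:, :, None] * d[:, None, :]).sum(axis=0) / N   # equation (2)

    # Eigen decomposition of the 2x2 covariance matrix (equations (4) and (5)).
    evals, evecs = np.linalg.eigh(cov)                # eigenvalues in ascending order
    primary = evecs[:, 1]                             # direction of the larger eigenvalue
    secondary = evecs[:, 0]
    return mu, cov, primary, secondary
```

In this sketch, the returned mu and cov correspond to μ̂ and Σ̂, and the two unit vectors correspond to φ0 and φ1.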



FIG. 3 illustrates examples 300 of a distribution and a histogram function based on the component 202 for the multi-finger gesture 200 described with reference to FIG. 2. In embodiments, the component resolution service 136 in the second stage is implemented to generate a distribution 302 that projects sensor data elements to the primary axis 206 of the ellipse 204 based on detected intensity of the touch input sensor data. The dimensional reduction projects the sensor data elements onto the primary axis in a weighted fashion, which emphasizes the off-axis contribution along with the sensor data element value.


The component resolution service 136 is also implemented to generate a histogram function 304 based on the distribution 302. In this example, the histogram function indicates individual contacts of the component and separation of the individual contacts. For example, histogram function high points 306 (also referred to as “peaks” in the graph) indicate the individual contacts of the connected component 202. Additionally, histogram function low points 308 (also referred to as “valleys” in the graph) indicate the separation of the individual contacts of the component.


The dimensional reduction of the sensor data elements to the primary axis 206 can be performed by projecting each sensor data element of the component 202 to the primary axis. The histogram function h(t) can be generated for each given integer t = int(x∥), where x∥ and x⊥ are defined as in the following equations (6) and (7):

x_{\parallel} = (x - \hat{\mu}) \cdot \varphi_{0}   (6)
x_{\perp} = (x - \hat{\mu}) \cdot \varphi_{1}   (7)


Therefore, x∥ represents the element location (relative to the centroid) projected along the primary axis, and x⊥ is the projection along the normal direction (e.g., the secondary axis). A good histogram function h(t) discriminates the high points 306 in the histogram function while minimizing false detections. One example of such a function h(t) can be obtained by first defining the following three functions f(t), n(t), and m(t), as shown in equations (8), (9), and (10):

f(t) = \sum_{x} \left( p[x][y] + \alpha\,|x_{\perp}| \right) \cdot \delta\!\left( t - \mathrm{int}(x_{\parallel}) \right)   (8)
n(t) = \sum_{x} \delta\!\left( t - \mathrm{int}(x_{\parallel}) \right)   (9)
m(t) = \max_{x} \left( p[x][y] \cdot \delta\!\left( t - \mathrm{int}(x_{\parallel}) \right) \right)   (10)


The final histogram function h(t) can be obtained by equation (11):

h(t) = \frac{f(t)\, m(t)}{n(t)}   (11)

Here α (set to sixty-four (64) in an implementation) is the mixing constant between the sensor unit and the displacement unit, and δ(t−int(x∥)) imposes a constraint in the domain of the summation or maximum to pick out the elements whose projection to the primary axis equals t. In these three functions, the function f(t) represents the general shape of the histogram function high points at a finger touch. The function n(t) is a normalization function which neutralizes the contribution from a different number of elements for a value t and the neighbor values (t−1 or t+1), in order to offset the noise impact on the boundaries for a given t. The function m(t) is a weighting function which reduces the contribution from a particular value of t that has a smaller peak value, which may be seen when two fingers of a gesture input stick together.
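
The sketch below (an illustration under stated assumptions, not the patent's code) builds the projection and the histogram function h(t) of equations (6)-(11), reusing mu, primary, and secondary from the previous sketch and treating p[x][y] as the sensor-map element value, with α = 64:

```python
import numpy as np

def histogram_function(s: np.ndarray, mu, primary, secondary, alpha: float = 64.0):
    """Project sensor elements onto the primary axis and compute
    h(t) = f(t) * m(t) / n(t) per equations (6)-(11)."""
    W, H = s.shape
    xs, ys = np.meshgrid(np.arange(W), np.arange(H), indexing="ij")
    pts = np.stack([xs, ys], axis=-1).reshape(-1, 2).astype(float)
    p = s.reshape(-1).astype(float)
    pts, p = pts[p > 0], p[p > 0]                # only elements that are part of the component

    x_par = (pts - mu) @ primary                 # equation (6): projection on the primary axis
    x_perp = (pts - mu) @ secondary              # equation (7): projection on the secondary axis
    t_idx = x_par.astype(int)                    # int(x_par)

    ts = np.arange(t_idx.min(), t_idx.max() + 1)
    h = np.zeros(len(ts))
    for i, t in enumerate(ts):
        sel = t_idx == t                         # delta(t - int(x_par)) picks out these elements
        if not sel.any():
            continue
        f = np.sum(p[sel] + alpha * np.abs(x_perp[sel]))   # equation (8)
        n = np.count_nonzero(sel)                          # equation (9)
        m = np.max(p[sel])                                 # equation (10)
        h[i] = f * m / n                                   # equation (11)
    return ts, h
```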


In embodiments, the component resolution service 136 is further implemented to perform a retrace of the histogram function to confirm the histogram function high points and to eliminate a false indication 310 of an individual contact. In the histogram function 304, the individual contacts can be located by tracking the local max and local min values of h(t) as t moves across the range of the entire component. A high point, or peak, of h(t) can be confirmed after a sufficient pull-back from a last local maximum. Likewise, a low point, or valley, of h(t) can be confirmed after a sufficient pull-up from a last local minimum. In an implementation, the condition for being sufficient is defined with a percentage of retrace in the opposite direction, such as 50% or more, which suffices to prevent a noise-induced, false confirmation of an individual contact in the histogram function h(t).
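
One possible realization of this retrace confirmation is sketched below; the exact retrace metric (how the 50% pull-back and pull-up are measured) is an assumption for illustration:

```python
def confirm_peaks(ts, h, retrace: float = 0.5):
    """Track local max/min of h(t) as t sweeps the component; confirm a peak only
    after a sufficient pull-back, and a valley only after a sufficient pull-up."""
    peaks, valleys = [], []
    last_max, last_max_t = h[0], ts[0]
    last_min, last_min_t = h[0], ts[0]
    rising = True
    for t, v in zip(ts[1:], h[1:]):
        if rising:
            if v > last_max:
                last_max, last_max_t = v, t
            elif last_max > 0 and v <= last_max * (1.0 - retrace):
                peaks.append(last_max_t)              # confirmed high point (individual contact)
                rising, last_min, last_min_t = False, v, t
        else:
            if v < last_min:
                last_min, last_min_t = v, t
            elif v >= last_min + retrace * (last_max - last_min):
                valleys.append(last_min_t)            # confirmed low point (contact separation)
                rising, last_max, last_max_t = True, v, t
    if rising and last_max > 0:
        peaks.append(last_max_t)                      # close out a trailing peak
    return peaks, valleys
```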


As is apparent in the illustration, the histogram function high points 306 (also referred to as peaks or local max) and the histogram function low points 308 (also referred to as valleys or local min) are laid out relative to the primary axis 206 in an interleaved fashion. Each of the histogram function high points 306 can be associated with a particular finger input, and each of the histogram function low points 308 indicates the separation of the individual contacts in the component. In an implementation, and for convenience of computation, the individual contacts can be processed as rectangular shapes, which may or may not fit well with the actual shape of the different finger inputs of the multi-finger gesture. These rectangles can then be used as initial seeds for additional iterations of the EM procedure in the Gaussian mixture model, where typically one additional iteration produces a better result. In the event that detection along the primary axis results in a single histogram function high point (e.g., a single peak in the histogram function), the same operation can be repeated on the secondary axis 208 of the ellipse 204, and a single touch is confirmed only when both directions render a single peak.
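
A rough sketch of such an EM refinement follows; it is simplified relative to a full implementation (uniform mixing weights, one iteration by default), and the per-contact seeds are assumed to come from the rectangular estimates described above:

```python
import numpy as np

def em_refine(pts, w, means, covs, iters: int = 1):
    """Refine per-contact centroids and covariances with EM iterations of a
    Gaussian mixture over the component's elements (pts: (M, 2) positions,
    w: sensor values used as sample weights)."""
    K = len(means)
    for _ in range(iters):
        # E-step: responsibility of each contact for each sensor element.
        resp = np.zeros((len(pts), K))
        for k in range(K):
            d = pts - means[k]
            maha = np.einsum("ij,jk,ik->i", d, np.linalg.inv(covs[k]), d)
            resp[:, k] = np.exp(-0.5 * maha) / (2 * np.pi * np.sqrt(np.linalg.det(covs[k])))
        resp /= resp.sum(axis=1, keepdims=True) + 1e-12
        # M-step: weighted update of each contact's centroid and covariance.
        for k in range(K):
            r = resp[:, k] * w
            Nk = r.sum()
            means[k] = (r[:, None] * pts).sum(axis=0) / Nk
            d = pts - means[k]
            covs[k] = (r[:, None, None] * d[:, :, None] * d[:, None, :]).sum(axis=0) / Nk
    return means, covs
```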



FIG. 4 illustrates an example 400 of separating the individual contacts of the component 202 of the multi-finger gesture 200 described with reference to FIG. 2, and based on the histogram function that is generated as described with reference to FIG. 3. In embodiments, the component resolution service 136 is implemented to separate the individual contacts 402, 404, and 406 of the component 202 and associate each touch input with a different finger input of the multi-finger gesture. The component resolution service 136 is also implemented to again perform the Gaussian distribution fitting via Maximum Likelihood Estimation (MLE) for each individual touch input to determine the centroid and covariance matrix of each contact in the component.


The component resolution service 136 is implemented to model the sensor map that correlates to an individual touch input as a Gaussian distribution, with a probability distribution function as in equation (12):

p(x) = \frac{1}{2\pi\,|\Sigma|^{1/2}} \exp\!\left( -\frac{1}{2} (x - \mu)^{T}\, \Sigma^{-1}\, (x - \mu) \right)   (12)

The variable x = (x, y) is an index vector into the two-dimensional sensor map. The parameter μ is the mean, and the parameter Σ is the 2×2 covariance matrix. The parameters μ and Σ can be determined so that the probability density function (Gaussian PDF) best fits the sensor map s[x][y] that represents the contact shape of the touch input. To do so, the component resolution service is implemented to perform the MLE to derive the following equations (13) and (14):

\hat{\mu} = \frac{1}{N} \sum_{i=0}^{N-1} x_{i}   (13)

\hat{\Sigma} = \frac{1}{N} \sum_{i=0}^{N-1} (x_{i} - \hat{\mu})(x_{i} - \hat{\mu})^{T}   (14)

The parameter N is the total number of sample points when performing the MLE. In this implementation, the value of s[x][y] is treated as a histogram of all the samples at a particular index point (x,y). As such, the parameter N can be derived as in the following equation (15):

N = \sum_{y=0}^{H-1} \sum_{x=0}^{W-1} s[x][y]   (15)


The equations (13) and (14) can be rewritten in terms of a weighted sum with s[x][y] as the weight, as in the following equations (16) and (17):

\hat{\mu} = \frac{1}{N} \sum_{y=0}^{H-1} \sum_{x=0}^{W-1} s[x][y]\, x   (16)

\hat{\Sigma} = \frac{1}{N} \sum_{y=0}^{H-1} \sum_{x=0}^{W-1} s[x][y]\, (x - \hat{\mu})(x - \hat{\mu})^{T}   (17)







Although the summations are now over the entire two-dimensional grid, the summation can be processed and determined quickly since s[x][y] of the sensor map is non-zero only near the touch input. Note that the parameter μ̂ is the center of mass of the touch input, and the covariance matrix Σ̂ designates the constant-level contours of the Gaussian distribution, which is the shape of an ellipse. In embodiments, the ellipse represents the contact shape of the touch input. Generally, the contact shape of the touch input is irregular, and the component resolution service is implemented to determine an ellipse of a size and rotation angle that approximately encompasses the elements 408 of the sensor map. The component resolution service determines the ellipse 410 (also referred to as the “best-fit ellipse”) from the Gaussian distribution.


In embodiments, the component resolution service 136 is implemented to determine the ellipse shape from the covariance matrix Σ̂, recognizing that the two main axes (e.g., minor axis 412 and major axis 414) of the ellipse correspond to the two eigenvectors of Σ̂, each with a length proportional to the square root of the corresponding eigenvalue. Accordingly, the following eigenproblem is solved as in equation (18):

\hat{\Sigma}\,\varphi = \Lambda\,\varphi   (18)


The parameter Λ = diag(λ0, λ1) is the 2×2 diagonal matrix of eigenvalues, and the parameter φ is the eigenvector matrix whose columns correspond to λ0 and λ1. For this 2×2 eigenproblem, there exists an exact solution, and the two eigenvalues can be determined by solving the following quadratic equation (19):

\lambda^{2} - \mathrm{Tr}(\hat{\Sigma})\,\lambda + |\hat{\Sigma}| = 0   (19)


As shown at 416, the ellipse 410 that corresponds to the contact shape of the touch input is defined by the two axis vectors 412 and 414, which are determined by scaling the two eigenvectors by the square root of the corresponding eigenvalues, Λ^{1/2}φ. The scaled eigenvectors Λ^{1/2}φ can be globally scaled so that the resulting angular contact geometry fits the actual contact shape of the touch input, and an appropriate constant-level contour is selected for the shape matching. In practice, a scaling factor α can also be selected so that the area of the scaled ellipse numerically matches the actual contact area of the touch input from s[x][y] of the sensor map. As shown at 418, the ellipse 410 can also be represented as a rectangle 420 that bounds the ellipse, where the rectangle is defined by a height, a width, and a rotation angle.
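
A brief sketch (the global scale factor and the angle convention are assumptions for illustration) of deriving the axis vectors and the bounding rectangle from a per-contact covariance matrix:

```python
import numpy as np

def ellipse_from_covariance(cov: np.ndarray, scale: float = 1.0):
    """Return the major/minor axis vectors of the contact ellipse (eigenvectors
    scaled by the square root of their eigenvalues, optionally globally scaled)
    and a bounding rectangle as (width, height, rotation angle in radians)."""
    evals, evecs = np.linalg.eigh(cov)               # eigenvalues in ascending order
    minor = scale * np.sqrt(evals[0]) * evecs[:, 0]  # minor axis vector
    major = scale * np.sqrt(evals[1]) * evecs[:, 1]  # major axis vector
    angle = np.arctan2(major[1], major[0])           # rotation of the major axis
    width, height = 2 * np.linalg.norm(major), 2 * np.linalg.norm(minor)
    return major, minor, (width, height, angle)
```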


Example method 500 is described with reference to FIG. 5 in accordance with one or more embodiments of multi-finger detection and component resolution. Generally, any of the services, functions, methods, procedures, components, and modules described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or any combination thereof. A software implementation represents program code that performs specified tasks when executed by a computer processor. The example methods may be described in the general context of computer-executable instructions, which can include software, applications, routines, programs, objects, components, data structures, procedures, modules, functions, and the like. The program code can be stored in one or more computer-readable storage media devices, both local and/or remote to a computer processor. The methods may also be practiced in a distributed computing environment by multiple computer devices. Further, the features described herein are platform-independent and can be implemented on a variety of computing platforms having a variety of processors.



FIG. 5 illustrates example method(s) 500 of multi-finger detection and component resolution. The order in which the method blocks are described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement a method, or an alternate method.


At block 502, touch input sensor data is recognized as a component of a multi-finger gesture on a touch-screen display. For example, the touch input module 108 at the computing device 102 (FIG. 1) recognizes the touch input sensor data 110 as a component 202 of the multi-finger gesture 200 (FIG. 2) on a touch-screen display. In embodiments, the component can be represented as a sensor map, which is generated as a two-dimensional array of the sensor data elements that correlate to the detected intensity of the multi-finger gesture on the touch-screen display. A stronger sensor signal of the touch input sensor data indicates more contact with a sensor data element in the sensor map. The sensor map can be modeled as a Gaussian distribution, and an ellipse that approximately encompasses the component can be determined from the Gaussian distribution.


At block 504, an ellipse is determined that approximately encompasses the component. For example, the component resolution service 136 at the computing device 102 determines the ellipse 204 that approximately encompasses the component 202, and the ellipse has a primary axis 206 and a secondary axis 208 that are orthogonal. At block 506, a distribution is generated that projects sensor data elements from the primary axis based on detected intensity of the touch input sensor data. For example, the component resolution service 136 generates the distribution 302 (FIG. 3) that projects sensor data elements to the primary axis 206 of the ellipse 204 based on detected intensity of the touch input sensor data. The dimensional reduction projects the sensor data elements onto the primary axis in a weighted fashion, which emphasizes the off-axis contribution along with the sensor data element value.


At block 508, a histogram function is generated based on the distribution. For example, the component resolution service 136 generates the histogram function 304 based on the distribution 302. The histogram function indicates individual contacts of the component and separation of the individual contacts. For example, histogram function high points 306 (also referred to as “peaks” in the graph) indicate the individual contacts of the connected component 202. Additionally, histogram function low points 308 (also referred to as “valleys” in the graph) indicate the separation of the individual contacts of the component.


At block 510, a retrace of the histogram function is performed to confirm the histogram function high points and to eliminate a false indication of an individual contact. For example, the component resolution service 136 performs a retrace of the histogram function to confirm the histogram function high points and to eliminate a false indication 310 of an individual contact. At block 512, each individual contact of the component is associated with a different finger input of the multi-finger gesture and, at block 514, the individual contacts of the component are separated. For example, the component resolution service 136 separates the individual contacts 402, 404, and 406 (FIG. 4) of the component 202 and associates each touch input with a different finger input of the multi-finger gesture 200. An individual ellipse can then be determined for each individual touch input to map each of the individual contacts for association with the different finger inputs of the multi-finger gesture.



FIG. 6 illustrates various components of an example device 600 that can be implemented as any of the devices, or services implemented by devices, described with reference to the previous FIGS. 1-5. In embodiments, the device may be implemented as any one or combination of a fixed or mobile device, in any form of a consumer, computer, portable, user, communication, phone, navigation, television, appliance, gaming, media playback, and/or electronic device. The device may also be associated with a user (i.e., a person) and/or an entity that operates the device such that a device describes logical devices that include users, software, firmware, hardware, and/or a combination of devices.


The device 600 includes communication devices 602 that enable wired and/or wireless communication of device data 604, such as received data, data that is being received, data scheduled for broadcast, data packets of the data, etc. The device data or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device. Media content stored on the device can include any type of audio, video, and/or image data. The device includes one or more data inputs 606 via which any type of data, media content, and/or inputs can be received, such as user-selectable inputs and any other type of audio, video, and/or image data received from any content and/or data source.


The device 600 also includes communication interfaces 608, such as any one or more of a serial, parallel, network, or wireless interface. The communication interfaces provide a connection and/or communication links between the device and a communication network by which other electronic, computing, and communication devices communicate data with the device.


The device 600 includes one or more processors 610 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable instructions to control the operation of the device. Alternatively or in addition, the device can be implemented with any one or combination of software, hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 612. In embodiments, the device 600 can also include a touch input module 614 that is implemented to recognize touch input sensor data. Although not shown, the device can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.


The device 600 also includes one or more memory devices 616 (e.g., computer-readable storage media) that enable data storage, such as random access memory (RAM), non-volatile memory (e.g., read-only memory (ROM), flash memory, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable disc, and the like. The device may also include a mass storage media device.


Computer readable media can be any available medium or media that is accessed by a computing device. By way of example, and not limitation, computer readable media may comprise storage media and communication media. Storage media include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data. Storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store information and which can be accessed by a computer.


Communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier wave or other transport mechanism. Communication media also include any information delivery media. A modulated data signal has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.


A memory device 616 provides data storage mechanisms to store the device data 604, other types of information and/or data, and various device applications 618. For example, an operating system 620 can be maintained as a software application with the memory device and executed on the processors. The device applications may also include a device manager, such as any form of a control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, and so on. In this example, the device applications 618 include a gesture recognition application 622 and a component resolution service 624 that implement embodiments of multi-finger detection and component resolution as described herein.


The device 600 also includes an audio and/or video processing system 626 that generates audio data for an audio system 628 and/or generates display data for a display system 630. The audio system and/or the display system may include any devices that process, display, and/or otherwise render audio, video, display, and/or image data. Display data and audio signals can be communicated to an audio device and/or to a display device via an RF (radio frequency) link, S-video link, composite video link, component video link, DVI (digital video interface), analog audio connection, or other similar communication link. In implementations, the audio system and/or the display system are external components to the device. Alternatively, the audio system and/or the display system are integrated components of the example device, such as an integrated touch-screen display.


Although embodiments of multi-finger detection and component resolution have been described in language specific to features and/or methods, the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as example implementations of multi-finger detection and component resolution.

Claims
  • 1. A method, comprising: recognizing touch input sensor data as a component of a multi-finger gesture on a touch-screen display;determining an ellipse that approximately encompasses the component, the ellipse having a primary axis and a secondary axis that are orthogonal;generating a distribution that projects sensor data elements from the primary axis based on detected intensity of the touch input sensor data; andgenerating a histogram function based on the distribution, using a ratio of a weighting function which reduces contribution from a location with a peak value and a normalization function which neutralizes contribution from a neighbor location, the ratio being the weighting function which reduces contribution from a location with a peak value divided by the normalization function which neutralizes contribution from a neighbor location, the histogram function indicating individual contacts of the component and separation of the individual contacts.
  • 2. A method as recited in claim 1, further comprising associating each individual contact of the component with a different finger input of the multi-finger gesture.
  • 3. A method as recited in claim 1, wherein histogram function high points indicate the individual contacts of the component, and a histogram function low point indicates the separation of the individual contacts of the component.
  • 4. A method as recited in claim 3, further comprising performing a retrace of the histogram function to confirm the histogram function high points and to eliminate a false indication of an individual contact.
  • 5. A method as recited in claim 1, wherein: the component is represented as a sensor map, which is generated as a two dimensional array of the sensor data elements that correlate to the detected intensity of the multi-finger gesture on the touch-screen display; anda stronger sensor signal of the touch input sensor data indicates more contact with a sensor data element in the sensor map.
  • 6. A method as recited in claim 5, further comprising modeling the sensor map as a Gaussian distribution, and wherein the ellipse that approximately encompasses the component is determined from the Gaussian distribution.
  • 7. A method as recited in claim 1, further comprising: separating the individual contacts of the component; anddetermining an individual ellipse for each individual contact to map each of the individual contacts for association with the different finger inputs of the multi-finger gesture.
  • 8. A computing device, comprising: a touch-screen display;a touch input module configured to recognize touch input sensor data as a component of a multi-finger gesture on the touch-screen display;at least a memory and a processor to implement a component resolution service configured to: determine an ellipse that approximately encompasses the component, the ellipse having a primary axis and a secondary axis that are orthogonal;generate a distribution that projects sensor data elements from the primary axis based on detected intensity of the touch input sensor data; andgenerate a histogram function based on the distribution, using a ratio of a weighting function which reduces contribution from a location with a peak value and a normalization function which neutralizes contribution from a neighbor location, the ratio being the weighting function which reduces contribution from a location with a peak value divided by the normalization function which neutralizes contribution from a neighbor location, the histogram function indicating individual contacts of the component and separation of the individual contacts.
  • 9. A computing device as recited in claim 8, wherein the component resolution service is further configured to associate each individual contact of the component with a different finger input of the multi-finger gesture.
  • 10. A computing device as recited in claim 8, wherein histogram function high points indicate the individual contacts of the component, and a histogram function low point indicates the separation of the individual contacts of the component.
  • 11. A computing device as recited in claim 10, wherein the component resolution service is further configured to perform a retrace of the histogram function to confirm the histogram function high points and to eliminate a false indication of an individual contact.
  • 12. A computing device as recited in claim 8, wherein: the component is represented as a sensor map, which is generated as a two dimensional array of the sensor data elements that correlate to the detected intensity of the multi-finger gesture on the touch-screen display; anda stronger sensor signal of the touch input sensor data indicates more contact with a sensor data element in the sensor map.
  • 13. A computing device as recited in claim 12, wherein the component resolution service is further configured to model the sensor map as a Gaussian distribution, and wherein the ellipse that approximately encompasses the component is determined from the Gaussian distribution.
  • 14. A computing device as recited in claim 8, wherein the component resolution service is further configured to: separate the individual contacts of the component; anddetermine an individual ellipse for each individual contact to map each of the individual contacts for association with the different finger inputs of the multi-finger gesture.
  • 15. One or more computer-readable storage media devices comprising instructions that are executable and, responsive to executing the instructions, a computing device: recognizes touch input sensor data as a component of a multi-finger gesture on a touch-screen display;determines an ellipse that approximately encompasses the component, the ellipse having a primary axis and a secondary axis that are orthogonal;generates a distribution that projects sensor data elements from the primary axis based on detected intensity of the touch input sensor data; andgenerates a histogram function based on the distribution, using a ratio of a weighting function which reduces contribution from a location with a peak value and a normalization function which neutralizes contribution from a neighbor location, the ratio being the weighting function which reduces contribution from a location with a peak value divided by the normalization function which neutralizes contribution from a neighbor location, the histogram function indicating individual contacts of the component and separation of the individual contacts.
  • 16. One or more computer-readable storage media devices as recited in claim 15, further comprising additional instructions that are executable and, responsive to executing the additional instructions, the computing device associates each individual contact of the component with a different finger input of the multi finger gesture.
  • 17. One or more computer-readable storage media devices as recited in claim 15, further comprising additional instructions that are executable and, responsive to executing the additional instructions, the computing device generates the histogram function to include high points that indicate the individual contacts of the component, and to include histogram function low points that indicate the separation of the individual contacts of the component.
  • 18. One or more computer-readable storage media devices as recited in claim 17, further comprising additional instructions that are executable and, responsive to executing the additional instructions, the computing device performs a retrace of the histogram function to confirm the histogram function high points and to eliminate a false indication of an individual contact.
  • 19. One or more computer-readable storage media devices as recited in claim 15, further comprising additional instructions that are executable and, responsive to executing the additional instructions, the computing device: separates the individual contacts of the component; anddetermines an individual ellipse for each individual contact to map each of the individual contacts for association with the different finger inputs of the multi-finger gesture.
  • 20. One or more computer-readable storage media devices as recited in claim 15, further comprising additional instructions that are executable and, responsive to executing the additional instructions, the computing device models a sensor map as a Gaussian distribution, wherein: the ellipse that approximately encompasses the component is determined from the Gaussian distribution;the component is represented as the sensor map, which is generated as a two dimensional array of the sensor data elements that correlate to the detected intensity of the multi-finger gesture on the touch-screen display; anda stronger sensor signal of the touch input sensor data indicates more contact with a sensor data element in the sensor map.
US Referenced Citations (139)
Number Name Date Kind
4421997 Forys Dec 1983 A
5493294 Morita Feb 1996 A
5825352 Bisset et al. Oct 1998 A
5856822 Du et al. Jan 1999 A
5943043 Furuhata et al. Aug 1999 A
6008636 Miller et al. Dec 1999 A
6091406 Kambara et al. Jul 2000 A
6323846 Westerman et al. Nov 2001 B1
6671406 Anderson Dec 2003 B1
6741237 Benard et al. May 2004 B1
6856259 Sharp Feb 2005 B1
6977646 Hauck et al. Dec 2005 B1
7053887 Kraus et al. May 2006 B2
7174649 Harris Feb 2007 B1
7254775 Geaghan et al. Aug 2007 B2
7295191 Kraus et al. Nov 2007 B2
7362313 Geaghan et al. Apr 2008 B2
7375454 Takasaki May 2008 B2
7489303 Pryor Feb 2009 B1
7580556 Lee et al. Aug 2009 B2
7592999 Rosenberg et al. Sep 2009 B2
7619618 Westerman et al. Nov 2009 B2
7711450 Im et al. May 2010 B2
7725014 Lam et al. May 2010 B2
7728821 Hillis et al. Jun 2010 B2
7746325 Roberts Jun 2010 B2
7797115 Tasher et al. Sep 2010 B2
7812828 Westerman et al. Oct 2010 B2
7907750 Ariyur et al. Mar 2011 B2
7938009 Grant et al. May 2011 B2
7978182 Ording et al. Jul 2011 B2
8061223 Pan Nov 2011 B2
8217909 Young Jul 2012 B2
8314780 Lin et al. Nov 2012 B2
8493355 Geaghan et al. Jul 2013 B2
8725443 Uzelac et al. May 2014 B2
8773377 Zhao et al. Jul 2014 B2
20030164820 Kent Sep 2003 A1
20040207606 Atwood et al. Oct 2004 A1
20050012724 Kent Jan 2005 A1
20050063566 Beek et al. Mar 2005 A1
20060097991 Hotelling et al. May 2006 A1
20060175485 Cramer Aug 2006 A1
20060227120 Eikman Oct 2006 A1
20070081726 Westerman et al. Apr 2007 A1
20080041639 Westerman et al. Feb 2008 A1
20080062140 Hotelling et al. Mar 2008 A1
20080068229 Chuang Mar 2008 A1
20080150909 North et al. Jun 2008 A1
20080158185 Westerman Jul 2008 A1
20080180399 Cheng Jul 2008 A1
20080211778 Ording et al. Sep 2008 A1
20080211782 Geaghan et al. Sep 2008 A1
20080252616 Chen Oct 2008 A1
20080278453 Reynolds et al. Nov 2008 A1
20080284899 Haubmann et al. Nov 2008 A1
20080309624 Hotelling Dec 2008 A1
20080309629 Westerman et al. Dec 2008 A1
20090009483 Hotelling et al. Jan 2009 A1
20090046073 Pennington et al. Feb 2009 A1
20090096753 Lim Apr 2009 A1
20090141046 Rathnam et al. Jun 2009 A1
20090157206 Weinberg et al. Jun 2009 A1
20090160763 Cauwels et al. Jun 2009 A1
20090174679 Westerman Jul 2009 A1
20090190399 Shibata et al. Jul 2009 A1
20090225036 Wright Sep 2009 A1
20090241701 Pan Oct 2009 A1
20090250268 Staton et al. Oct 2009 A1
20090251435 Westerman et al. Oct 2009 A1
20090251436 Keskin Oct 2009 A1
20090267903 Cady et al. Oct 2009 A1
20090273584 Staton et al. Nov 2009 A1
20090303202 Liu Dec 2009 A1
20090312009 Fishel Dec 2009 A1
20100053099 Vincent et al. Mar 2010 A1
20100060604 Zwart et al. Mar 2010 A1
20100073318 Hu et al. Mar 2010 A1
20100103118 Townsend et al. Apr 2010 A1
20100103121 Kim et al. Apr 2010 A1
20100117962 Westerman et al. May 2010 A1
20100134429 You et al. Jun 2010 A1
20100193258 Simmons et al. Aug 2010 A1
20100214233 Lee Aug 2010 A1
20100231508 Cruz-Hernandez et al. Sep 2010 A1
20100277505 Ludden et al. Nov 2010 A1
20100302211 Huang Dec 2010 A1
20100309139 Ng Dec 2010 A1
20100315266 Gunawardana et al. Dec 2010 A1
20100315366 Lee et al. Dec 2010 A1
20100315372 Ng Dec 2010 A1
20110001633 Lam et al. Jan 2011 A1
20110018822 Lin et al. Jan 2011 A1
20110025629 Grivna et al. Feb 2011 A1
20110042126 Spaid et al. Feb 2011 A1
20110050620 Hristov Mar 2011 A1
20110080348 Lin et al. Apr 2011 A1
20110084929 Chang et al. Apr 2011 A1
20110106477 Brunner May 2011 A1
20110115709 Cruz-Hernandez May 2011 A1
20110115747 Powell et al. May 2011 A1
20110141054 Wu Jun 2011 A1
20110242001 Zhang et al. Oct 2011 A1
20110248941 Abdo et al. Oct 2011 A1
20110261005 Joharapurkar et al. Oct 2011 A1
20110267481 Kagei Nov 2011 A1
20110298709 Vaganov Dec 2011 A1
20110298745 Souchkov Dec 2011 A1
20110299734 Bodenmueller Dec 2011 A1
20110304577 Brown Dec 2011 A1
20110304590 Su et al. Dec 2011 A1
20120030624 Migos Feb 2012 A1
20120032891 Parivar Feb 2012 A1
20120065779 Yamaguchi et al. Mar 2012 A1
20120065780 Yamaguchi et al. Mar 2012 A1
20120068957 Puskarich et al. Mar 2012 A1
20120075331 Mallick Mar 2012 A1
20120105334 Aumiller et al. May 2012 A1
20120131490 Lin et al. May 2012 A1
20120146956 Jenkinson Jun 2012 A1
20120153652 Yamaguchi et al. Jun 2012 A1
20120187956 Uzelac Jul 2012 A1
20120188176 Uzelac Jul 2012 A1
20120188197 Uzelac Jul 2012 A1
20120191394 Uzelac Jul 2012 A1
20120206377 Zhao Aug 2012 A1
20120206380 Zhao Aug 2012 A1
20120223894 Zhao Sep 2012 A1
20120268416 Pirogov et al. Oct 2012 A1
20120280934 Ha et al. Nov 2012 A1
20120280946 Shih et al. Nov 2012 A1
20120301009 Dabic Nov 2012 A1
20120319992 Lee Dec 2012 A1
20130063167 Jonsson Mar 2013 A1
20130113751 Uzelac May 2013 A1
20130197862 Uzelac et al. Aug 2013 A1
20130238129 Rose et al. Sep 2013 A1
20130345864 Park Dec 2013 A1
20140081793 Hoffberg Mar 2014 A1
Foreign Referenced Citations (35)
Number Date Country
1761932 Apr 2006 CN
1942853 Apr 2007 CN
200947594 Sep 2007 CN
101553777 Oct 2009 CN
101661373 Mar 2010 CN
101937296 Jan 2011 CN
201828476 May 2011 CN
2201903594 Jul 2011 CN
202093112 Dec 2011 CN
101545938 Jan 2012 CN
202171626 Mar 2012 CN
202196126 Apr 2012 CN
102436334 May 2012 CN
101982783 Jul 2012 CN
19939159 Mar 2000 DE
2077490 Jul 2009 EP
2284654 Feb 2011 EP
2003303051 Oct 2003 JP
2007323731 Dec 2007 JP
20050003155 Jan 2005 KR
20050094359 Sep 2005 KR
100763057 Oct 2007 KR
20080066416 Jul 2008 KR
100941441 Feb 2010 KR
1020100067178 Jun 2010 KR
20100077298 Jul 2010 KR
1020100129015 Dec 2010 KR
101007049 Jan 2011 KR
20110011337 Feb 2011 KR
101065014 Sep 2011 KR
WO-9938149 Jul 1999 WO
WO-2005114369 Dec 2005 WO
WO-2006042309 Apr 2006 WO
WO-2010073329 Jul 2010 WO
WO-2013063042 May 2013 WO
Non-Patent Literature Citations (101)
Entry
“PCT Search Report and Written Opinion”, Application No. PCT/US2012/027642, (Sep. 3, 2012), 9 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2012/024780, (Sep. 3, 2012), 9 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2012/024781, (Sep. 3, 2012), 9 pages.
“Final Office Action”, U.S. Appl. No. 12/941,693, (Nov. 26, 2012), 22 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/152,991, (Mar. 21, 2013), 10 pages.
“International Search Report”, Mailed Date: Jun. 13, 2012, Application No. PCT/US2011/055621, Filed Date: Oct. 10, 2011, pp. 8.
“International Search Report”, Application No. PCT/US2011/058855, (Nov. 1, 2011), 8 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/941,693, (Jul. 18, 2012), 19 pages.
Baraldi, Stefano et al., “WikiTable: Finger Driven Interaction for Collaborative Knowledge-Building Workspaces”, Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW '06), available at <<http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1640590>>,(Jul. 5, 2006),6 pages.
Benko, Hrvoje et al., “Resolving Merged Touch Contacts”, U.S. Appl. No. 12/914,693, (Nov. 8, 2010),22 pages.
Binns, Francis S., “Multi-“Touch” Interaction via Visual Tracking”, Bachelor of Science in Computer Science with Honours, The University of Bath, available at <<http://www.cs.bath.ac.uk/˜mdv/courses/CM30082/projects.bho/2008-9/Binns-FS-dissertation-2008-9.pdf>>,(May 2009),81 pages.
Cao, Xiang et al., “Evaluation of an On-line Adaptive Gesture Interface with Command Prediction”, In Proceedings of GI 2005, Available at <http://citeseerx.ist.psu.edu/viewdoc/download: jsessionid=DAB1B08F620C23464427932BAF1ECF49?doi=10.1.1.61.6749&rep=rep1&type=pdf>,(May 2005),8 pages.
Cao, Xiang et al., “ShapeTouch: Leveraging Contact Shape on Interactive Surfaces”, In Proceedings of TABLETOP 2008, Available at <http://www.cs.toronto.edu/˜caox/tabletop2008—shapetouch.pdf>,(2008),pp. 139-146.
Dang, Chi T., et al., “Hand Distinction for Multi-Touch Tabletop Interaction”, University of Augsburg; Institute of Computer Science; Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces, (Nov. 23-25, 2009),8 pags.
Dillencourt, Michael B., et al., “A General Approach to Connected-Component Labeling for Arbitrary Image Representations”, Journal of the Association for Computing Machinery, vol. 39, No. 2, available at <<http://www.cs.umd.edu/˜hjs/pubs/DillJACM92.pdf>>,(Apr. 1992),pp. 253-280.
Tao, Yufei et al., “An Efficient Cost Model for Optimization of Nearest Neighbor Search in Low and Medium Dimensional Spaces”, Knowledge and Data Engineering, vol. 16 Issue:10, retrieved from <<http://www.cais.ntu.edu.sg/˜jzhang/papers/ecmonns.pdf>> on Mar. 16, 2011,(Oct. 2004),16 pages.
Tsuchiya, Sho et al., “Vib-Touch: Virtual Active Touch Interface for Handheld Devices”, In Proceedings of The 18th IEEE International Symposium on Robot and Human Interactive Communication, Available at <http://www.mech.nagoya-u.ac.jp/asi/en/member/shogo—okamoto/papers/tsuchiyaROMAN2009.pdf>,(Oct. 20, 2009),pp. 12-17.
Westman, Tapani et al., “Color Segmentation by Hierarchical Connected Components Analysis with Image Enhancement by Symmetric Neighborhood Filter”, Pattern Recognition, 1990. Proceedings., 10th International Conference on Jun. 16-21, 1990, retrieved from <<http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=118219>> on Mar. 16, 2011,(Jun. 16, 1990),pp. 796-802.
Wilson, Andrew D., “TouchLight: An Imaging Touch Screen and Display for Gesture-Based Interaction”, In Proceedings of ICIM 2004, Available at <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.95.3647&rep=rep1&type=pdf>, (Oct. 2004),8 pages.
“Actuation Force of Touch Screen”, Solutions @ Mecmesin, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileld=188971>, (Dec. 31, 2010), 1 page.
“AO Touch Screen Tester”, retrieved from <http://www.ao-cs.com/Projects/touch%20screen%20tester%20project.html>, (Dec. 31, 2010),1 page.
“Capacitive Touch Sensors—Application Fields, Technology Overview and Implementation Example”, Fujitsu Microelectronics Europe GmbH; retrieved from http://www.fujitsu.com/downloads/MICRO/fme/articles/fujisu-whitepaper-capacitive-touch-sensors.pdf on Jul. 20, 2011, (Jan. 12, 2010), 12 pages.
“Haptic-Actuator Controllers”, retrieved from <http://www.maxim-ic.com/products/data—converters/touch-interface/haptic-actuator.cfm> on May 4, 2011, 1 page.
“How to Use the Precision Touch Testing Tool”, retrieved from <http://feishare.com/attachments/article/279/precision-touch-testing-tool-Windows8-hardware-certification.pdf>, (Apr. 15, 2012), 10 pages.
“Linearity Testing Solutions in Touch Panels”, retrieved from <advantech.com/machine-automation/ .../%7BD05BC586-74DD-4BFA-B81A-2A9F7ED489F/>, (Nov. 15, 2011), 2 pages.
“MAX11871”, retrieved from <http://www.maxim-ic.com/datasheet/index.mvp/id/7203> on May 4, 2011, 2 pages.
“MicroNav Integration Guide Version 3.0”, retrieved from <http://www.steadlands.com/data/interlink/micronavintguide.pdf>, (Dec. 31, 2003), 11 pages.
“Microsoft Windows Simulator Touch Emulation”, retrieved from <blogs.msdn.com/b/visualstudio/archive/2011/09/30/microsoft-windows-simulator-touch-emulation.aspx>, (Sep. 30, 2011), 3 pages.
“OptoFidelity Touch & Test”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileld=188969>, (Feb. 20, 2012), 2 pages.
“OptoFidelity Touch & Test”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileld=188420>, (May 4, 2012), 2 pages.
“OptoFidelity Two Fingers -robot” video available at <http://www.youtube.com/watch?v=YppRASbXHfk&feature=player—embedded#!section>, (Sep. 15, 2010), 2 pages.
“Projected Capacitive Test Fixture”, retrieved from <http://www.touch-intl.com/downloads/DataSheets%20for%20Web/6500443-PCT-DataSheet-Web.pdf>, (2009), 2 pages.
“Resistive Touch Screen—Resistance Linearity Test”, video available at <http://www.youtube.com/watch?v=hb23GpQdXXU>, (Jun. 17, 2008), 2 pages.
“Smartphone Automatic Testing Robot at UEI Booth”, video available at <http://www.youtube.com/watch?v=f-Q4ns-b9sA>, (May 9, 2012), 2 pages.
“STM23S-2AN NEMA 23 Integrated Drive+Motor”, retrieved from <http://www.applied-motion.com/products/integrated-steppers/stm23s-2an> on Jan. 24, 2012, 3 pages.
“Technology Comparison: Surface Acoustic Wave, Optical and Bending Wave Technology”, 3M Touch Systems, Available at <http://multimedia.3m.com/mws/mediawebserver?mwsld=66666UuZjcFSLXTtnXT2NXTaEVuQEcuZgVs6EVs6E666666--&fn=DST-Optical-SAW%20Tech%20Brief.pdf>, (2009), pp. 1-4.
“Touch Panel Inspection & Testing Solution”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileld=188967>, (Dec. 31, 2010),1 page.
“Touch Panel Semi-Auto Handler Model 3810”, retrieved from <http://www.chromaus.com/datasheet/3810—en.pdf>, (Dec. 31, 2010), 2 pages.
“TouchSense Systems Immersion”, retrieved from <http://www.ArticleOnePartners.com/idex/servefile?fileld=188486>, (Jun. 19, 2010), 20 pages.
“Using Low Power Mode on the MPR083 and MPR084”, Freescale Semiconductor Application Note, Available at <http://cache.freescale.com/files/sensors/doc/app—note/AN3583.pdf>, (Nov. 2007), pp. 1-5.
Asif, Muhammad et al., “MPEG-7 Motion Descriptor Extraction for Panning Camera Using Sprite Generated”, In Proceedings of AVSS 2008, Available at <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4730384>,(Sep. 2008), pp. 60-66.
Brodkin, Jon “Windows 8 hardware: Touchscreens, sensor support and robotic fingers”, <<http://arstechnica.com/business/news/2011/09/windows-8-hardware-touch-screens-sensor-support-and-robotic-fingers.ars>>, (Sep. 13, 2011),1 Page.
Buffet, Y “Robot Touchscreen Analysis”, <<http://ybuffet.posterous.com/labsmotocom-blog-archive-robot-touchscreen-an>>, (Apr. 19, 2010), 2 Pages.
Cravotta, Robert “The Battle for Multi-touch”, Embedded Insights, retrieved from <http://www.embeddedinsights.com/channels/2011/04/12/the-battle-for-multi-touch/> on May 4, 2011,(Apr. 12, 2011), 3 pages.
Dillow, Clay “Liquid-Filled Robot Finger More Sensitive to Touch Than a Human's”, retrieved from <www.popsci.com/technology/article/2012-06/new-robot-finger-more-sensitive-touch-human> on Jul. 27, 2012, (Jun. 19, 2012), 3 pages.
Hoggan, Eve et al., “Mobile Multi-Actuator Tactile Displays”, In 2nd international conference on Haptic and audio interaction design, retrieved from <http://www.dcs.gla.ac.uk/˜stephen/papers/HAID2.pdf>,(Oct. 29, 2007),12 pages.
Hoshino, et al., “Pinching at finger tips for humanoid robot hand”, Retrieved at <<http://web.mit.edu/zoz/Public/HoshinoKawabuchiRobotHand.pdf>>, (Jun. 30, 2005), 9 Pages.
Kastelan, et al., “Stimulation Board for Automated Verification of Touchscreen-Based Devices”, 22nd International Conference on Field Programmable Logic and Applications, Available at <https://www2.lirmm.fr/lirmm/interne/BIBLI/CDROM/MIC/2012/FPL—2012/Papers/PHD7.pdf>,(Aug. 29, 2012), 2 pages.
Kastelan, et al., “Touch-Screen Stimulation for Automated Verification of Touchscreen-Based Devices”, In IEEE 19th International Conference and Workshops on Engineering of Computer Based Systems, (Apr. 11, 2012), pp. 52-55.
Khandkar, Shahedul H., et al., “Tool Support for Testing Complex MultiTouch Gestures” ITS 2010, Nov. 7-10, 2010, Saarbrucken, Germany, (Nov. 7, 2010), 10 pages.
Kjellgren, Olof “Developing a remote control application for Windows CE”, Bachelor Thesis performed in Computer Engineering at ABE Robotics, Mälardalen University, Department of Computer Science and Electronics, Retrieved at <<http://www.idt.mdh.se/utbildning/exjobblfiles/TR0661.pdf>>, (May 30, 2007), 43 pages.
Kuosmanen, Hans “OptoFidelity Automating UI Testing”, video available at <http://www.youtube.com/watch?v=mOZ2r7ZvyTg&feature=player—embedded#!section>, (Oct. 14, 2010), 2 pages.
Kuosmanen, Hans “Testing the Performance of Touch-Enabled Smartphone User Interfaces”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileld=188442>, (Dec. 31, 2008), 2 pages.
Levin, Michael et al., “Tactile-Feedback Solutions for an Enhanced User Experience”, retrieved from >http://www.pbinterfaces.com/documents/Tactile—Feedback—Solutions.pdf>, (Oct. 31, 2009), pp. 18-21.
McGlaun, Shane “Microsoft's Surface 2.0 Stress Testing Robot Called Patty Shown off for First Time”, Retrieved at <<http://www.slashgear.com/microsofts-surface-2-0-stress-testing-robot-called-patty-shown-off-for-first-time-19172971/>>, (Aug. 19, 2011), 1 page.
McMahan, William et al., “Haptic Display of Realistic Tool Contact via Dynamically Compensated Control of a Dedicated Actuator”, International Conference on Intelligent Robots and Systems, St. Louis, MO, Oct. 11-15, 2009, retrieved from <http://repository.upenn.edu/meam—papers/222>,(Dec. 15, 2009), 9 pages.
Pratt, Susan “Factors Affecting Sensor Response”, Analog Devices, AN-830 Application Note, Available at <http://www.analog.com/static/imported-files/application—notes/5295737729138218742AN830—0.pdf>,(Dec. 2005), pp. 1-8.
Takeuchi, et al., “Development of a Multi-fingered Robot Hand with Softness changeable Skin Mechanism”, International Symposium on and 2010 6th German Conference on Robotics (ROBOTIK), Retrieved at <<http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=05756853>>,(Jun. 7, 2010), 7 Pages.
Terpstra, Brett “BetterTouchTool Makes Multi-touch Infinitely More Useful, for Free”, retrieved from <http://www.tuaw.com/2010/01/05/bettertouchtool-makes-multi-touch-infinitely-more-useful-for-fr/> on Jul. 20, 2012, (Jan. 5, 2010), 4 pages.
Toto, Serkan “Video: Smartphone Test Robot Simulates Countless Flicking and Tapping”, retrieved from <techcrunch.com/2010/12/21/video-smartphone-test-robot-simulates-countless-flicking-and-tapping/>, (Dec. 21, 2010), 2 pages.
Wimmer, Raphael et al., “Modular and Deformable Touch-Sensitive Surfaces Based on Time Domain Reflectometry”, In Proceedings of UIST 2011, Available at <http://www.medien.ifi.lmu.de/pubdb/publications/pub/wimmer2011tdrTouch/wimmer2011tdrTouch.pdf>, (Oct. 2011), 10 pages.
Zivkov, et al., “Touch Screen Mobile Application as Part of Testing and Verification System”, Proceedings of the 35th International Convention, (May 21, 2012), pp. 892-895.
“Input Testing Tool”, U.S. Appl. No. 13/659,777, (Oct. 24, 2012), 45 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/941,693, (May 16, 2013), 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/293,060, (Jul. 12, 2013), 9 pages.
“PCT Search Report and Written Opinion”, Application No. PCT/US2013/021787, (May 13, 2013), 9 pages.
“Touch Quality Test Robot”, U.S. Appl. No. 13/530,692, (Jun. 22, 2012), 40 pages.
“Final Office Action”, U.S. Appl. No. 13/293,060, Sep. 25, 2013, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/293,060, Nov. 29, 2013, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/156,243, Sep. 19, 2013, 12 pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/046208, Sep. 27, 2013, 12 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/099,288, Feb. 6, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/198,036, Jan. 31, 2014, 14 pages.
“Final Office Action”, U.S. Appl. No. 13/152,991, Sep. 20, 2013, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/154,161, Jan. 3, 2014, 14 pages.
“Non-Final Office Action”, U.S. Appl. No. 12/941,693, Nov. 18, 2013, 21 Pages.
“Notice of Allowance”, U.S. Appl. No. 13/156,243, Jan. 28, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/198,415, Dec. 26, 2013, 8 pages.
“Final Office Action”, U.S. Appl. No. 13/154,161, Apr. 22, 2014, 16 pages.
“Final Office Action”, U.S. Appl. No. 13/530,692, Apr. 10, 2014, 16 pages.
“Foreign Office Action”, CN Application No. 201210018527.8, Feb. 24, 2014, 10 Pages.
“Foreign Office Action”, CN Application No. 201210029859.6, Feb. 21, 2014, 15 Pages.
“Foreign Office Action”, CN Application No. 201210031164.1, Mar. 5, 2014, 14 Pages.
“International Search Report and Written Opinion”, Application No. PCT/US2013/061067, Feb. 7, 2014, 11 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/152,991, Mar. 21, 2014, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/530,692, Jan. 31, 2014, 14 pages.
“Restriction Requirement”, U.S. Appl. No. 13/205,319, May 8, 2014, 6 pages.
“Corrected Notice of Allowance”, U.S. Appl. No. 13/156,243, 4 pages.
“Foreign Office Action”, TW Application No. 101100606, Apr. 15, 2014, 10 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/099,288, Jun. 10, 2014, 22 pages.
“Extended European Search Report”, EP Application No. 11840170.2, Jul. 16, 2014, 10 pages.
“Final Office Action”, U.S. Appl. No. 13/152,991, Aug. 20, 2014, 14 pages.
“Final Office Action”, U.S. Appl. No. 13/198,036, Aug. 14, 2014, 17 pages.
“Foreign Notice of Allowance”, CN Application No. 201110349777.5, May 28, 2014, 6 pages.
“Foreign Office Action”, CN Application No. 201210031164.1, Sep. 11, 2014, 9 Pages.
“Non-Final Office Action”, U.S. Appl. No. 13/530,692, Aug. 25, 2014, 18 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/205,319, Sep. 9, 2014, 13 pages.
“Non-Final Office Action”, U.S. Appl. No. 13/293,060, Jul. 23, 2014, 12 pages.
“Notice of Allowance”, U.S. Appl. No. 12/941,693, Aug. 13, 2014, 8 pages.
“Notice of Allowance”, U.S. Appl. No. 13/362,238, Jul. 28, 2014, 11 pages.
“Supplemental Notice of Allowance”, U.S. Appl. No. 13/362,238, Sep. 18, 2014, 4 pages.
Related Publications (1)
Number Date Country
20130016045 A1 Jan 2013 US