Testing touch and near-touch device displays can be a challenging task because of the precision the tests demand and because each test scenario should be repeatable. These testing scenarios often exercise a wide range of touch and near-touch functionality, including linear motion inputs, rotational inputs, tapping inputs, and converging and diverging inputs such as “pinch” and “spread” gestures.
One way of testing such inputs is to use an individual stencil guide and actuator for each particular type of input to be tested. Stencils, however, are typically not adjustable which, in turn, means that each different type of input requires its own dedicated stencil.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter.
Various embodiments provide an input test tool that promotes precision testing, flexibility and repeatability over a wide variety of functionality tests that are utilized in both touch and near-touch input scenarios. The input test tool enables a variety of degrees of motion, including both linear and rotational motion, so that a device under test can be tested utilizing a number of different linear and/or rotational input scenarios.
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.
Overview
Various embodiments provide an input test tool that promotes precision testing, flexibility and repeatability over a wide variety of functionality tests that are utilized in both touch and near-touch input scenarios. The input test tool enables a variety of degrees of motion, including both linear and rotational motion, so that a device under test can be tested utilizing a number of different linear and/or rotational input scenarios. In one or more embodiments, the input test tool can be configured to operate in a manual mode. Alternately or additionally, the input test tool can operate in an automated mode.
In various embodiments, the input test tool is reconfigurable to promote a variety of different touch and near-touch tests. This can be done through the use of one or more actuators which are removably mountable on the input test tool to accommodate a variety of testing scenarios. In addition, a variety of degrees of motion can be utilized including, by way of example and not limitation, linear motion in the X-axis and Y-axis directions, as well as in the Z-axis direction for height adjustments. Further, one or more rotational degrees of motion can be provided. Rotational degrees of motion can be utilized to rotationally position a device under test and/or to provide an arc motion in which one or more actuators can be moved in an arc relative to a device display. The various tests that can be performed can include testing a display device, as described below, or any other structure that receives touch input, such as a touch pad.
In the discussion that follows, a section entitled “Example Input Testing Tool” describes an example input testing tool in accordance with one or more embodiments. Following this, a section entitled “X-axis Motion” describes an example input test scenario in accordance with one or more embodiments. Next, a section entitled “Y-axis Motion” describes an example input test scenario in accordance with one or more embodiments. Following this, a section entitled “Z-axis Motion” describes an example input test scenario in accordance with one or more embodiments. Next, a section entitled “R1-axis Motion” describes an example input test scenario in accordance with one or more embodiments. Following this, a section entitled “R2-axis Motion” describes an example input test scenario in accordance with one or more embodiments. Next, a section entitled “Cross Rail Bracket and Actuator” describes an example cross rail bracket in accordance with one or more embodiments. Following this, a section entitled “Pinch Hanger” describes an example pinch hanger in accordance with one or more embodiments. Next, a section entitled “Automated Mode Testing” describes an example embodiment in which testing can be conducted in an automated fashion in accordance with one or more embodiments. Last, a section entitled “Example Device” describes an example device in accordance with one or more embodiments.
Having provided an overview of various embodiments that are to be described below, consider now an example input testing tool in accordance with one or more embodiments.
Example Input Testing Tool
The accompanying figures illustrate an example input testing tool in accordance with one or more embodiments, generally at 10. The input testing tool 10 includes a frame 11 having a base 11a, a pair of generally opposed sidewalls 11b, 11c mounted to base 11a, and a rear wall 11d mounted to base 11a and both sidewalls 11b, 11c. Each sidewall 11b, 11c supports a respective side rail 12, 14.
The input testing tool 10 includes a transverse rail 16 slidably mounted on side rails 12, 14 for reciprocation along the side rails. A carriage 18 is slidably mounted on transverse rail 16 for reciprocation along the transverse rail and includes an upper portion 20 that resides generally above the transverse rail 16, and a lower portion 22 that resides generally beneath the transverse rail 16. The lower portion 22 of carriage 18 supports an actuator assembly 23 described in more detail below.
The input testing tool 10 also includes a platform 24 that is rotatably mounted on base 11a. The platform 24 includes a plurality of locking members, two of which are illustrated at 26. The locking members 26 enable a device under test to be secured to platform 24 during testing.
In operation, as noted above, the input test tool 10 enables a variety of degrees of motion, including both linear and rotational motion, so that a device under test can be tested utilizing a number of different linear and/or rotational input scenarios. In the illustrated and described embodiment, linear motion can occur along the X-axis, Y-axis, and Z-axis.
Linear motion along the X-axis occurs by moving carriage 18 along the transverse rail 16 in the direction of the double-headed arrow designated “X”. Linear motion along the Y-axis occurs by moving transverse rail 16 in the direction of the double-headed arrow designated “Y” along side rails 12, 14. Linear motion along the Z-axis occurs by moving at least a portion of carriage 18 in the direction of the double-headed arrow designated “Z”, thus changing the height of the carriage's lower portion 22 relative to the device under test that is mounted on platform 24.
In the illustrated and described embodiment, rotational motion can be achieved about a plurality of different rotation axes. In this specific example, a first rotation axis R1 enables platform 24 and hence, a device under test, to be rotated relative to carriage 18. A second rotation axis R2 enables the lower portion 22 and hence, the actuator assembly 23, to be rotated relative to the device under test mounted on platform 24. Example usage scenarios are described just below.
X-Axis Motion
In operation, the X-axis functions to enable alignment of actuators, such as actuator 30, with a desired x starting position on the display device of the device under test 36. Once aligned, movement in the Y-axis direction can be locked by a locking mechanism (not specifically shown). Once the actuators have been aligned with the test pattern 38, the actuator assembly 23 can be lowered to test the device under test. As an example, consider the accompanying figure.
There, the actuator assembly 23 has been lowered by virtue of a handle (not specifically shown) to bring actuator 30 into an operative testing position with respect to the display device of device under test 36. The operative testing position can be a touch-position in which the actuator 30 physically engages the display device. Alternately, the operative testing position can be a near-touch-position in which the actuator 30 does not physically engage the display device. Such would be the case in scenarios where the display device is configured with capacitive, optical, resistive and/or any other type of near-field sensing capabilities.
Once in the operative testing position, carriage 18 and, hence, actuator 30 can be moved in the X direction along the test pattern 38. This movement can be facilitated by virtue of four pre-loaded bearings coupled between the carriage 18 and transverse rail 16. As an example, consider the accompanying figure.
There, carriage 18 has been moved along the X-axis to the right. Once movement along the test pattern 38 has been completed, the carriage 18 can be returned to its original z position.
Y-Axis Motion
In operation, the Y-axis functions to enable alignment of actuators, such as actuators 30, with a desired y starting position on the display device of the device under test 36. Once aligned, movement in the X-axis direction can be locked by a locking mechanism (not specifically shown). Movement in the Y-axis direction can be utilized for swipe tests. Once the actuators have been aligned to the display device as, for example, by being aligned with a test pattern, the actuator assembly 23 can be lowered to test the device under test. As an example, consider the accompanying figure.
There, the actuator assembly 23 has been lowered by virtue of a handle (not specifically shown) to bring actuators 30 into an operative testing position with respect to the display device of device under test 36. The operative testing position can be a touch-position in which the actuators 30 physically engage the display device. Alternately, the operative testing position can be a near-touch-position in which the actuators 30 do not physically engage the display device.
Once in the operative testing position, carriage 18 and, hence, actuators 30 can be moved in the Y direction along the display device. This movement can be facilitated by virtue of four pre-loaded bearings coupled between the carriage 18 and the side rails 12, 14.
There, carriage 18 has been moved along the Y-axis in a direction toward the reader. Once movement along the display device has been completed, the carriage 18 can be returned to its original z position.
Z-Axis Motion
In the illustrated and described embodiment, the Z-axis is concentric with the R2-axis, as perhaps best illustrated in the accompanying figures.
In the illustrated and described embodiment, the Z-axis uses a counterbalance spring within linkage 34 to remain stationary once positioned. A threaded knob, not specifically shown, is included in linkage 34 and is used as a lock for motion in the Z-axis direction. A high precision ball spline and a bearing allow for Z-axis motion to not interfere with R2-axis motion.
R1-Axis Motion
Referring to the accompanying figures, the R1-axis provides rotational motion for platform 24. Rotation about the R1-axis enables platform 24 and, hence, a device under test mounted thereon, to be rotationally positioned relative to carriage 18 and the actuator assembly 23.
R2-Axis Motion
In the illustrated and described embodiment, the R2-axis provides rotational motion for the actuator assembly 23. This is achieved through the use of a rotational stage 32 that is mounted between transverse cross rail 16 and linkage 34. Rotational motion about the R2-axis allows for 360° of freedom which rotates the actuator assembly 23 about the Z-axis. Specifically, the R2-axis is concentric with the Z-axis. This is achieved by utilizing a Z-axis ball spline through the center of rotational stage 32.
In addition, the rotational stage 32 includes a locking mechanism so that the rotational movement of the actuator assembly 23 can be locked in place during linear motion and static tests.
In operation, once the actuator assembly 23 is aligned, movement in the X-axis and Y-axis directions can be locked. The actuator assembly 23 can then be lowered to test the device under test. As an example, consider the accompanying figure.
There, the actuator assembly 23 has been lowered by virtue of a handle (not specifically shown) to bring actuators 30 into an operative testing position with respect to the display device of device under test 36. The operative testing position can be a touch-position in which the actuators 30 physically engage the display device. Alternately, the operative testing position can be a near-touch-position in which the actuators 30 do not physically engage the display device.
Once in the operative testing position, lower portion 22 of carriage 18 can be rotationally moved about the R2-axis by virtue of handle 25. As an example, consider the accompanying figure.
Cross Rail Bracket and Actuator
As noted above, cross rail bracket 28 is used as a carriage to mount one or more actuators 30 onto the input test tool 10. The cross rail bracket is configured to enable the actuators 30 to be removably mounted thereon. Any suitable type of arrangement can be utilized to enable the actuators 30 to be removably mounted, an example of which is provided below. In the illustrated and described embodiment, a metric scale 50 is provided on the cross rail bracket 28 to enable precise alignment of the actuators.
In the illustrated and described embodiment, terminus 58 includes a conductive tip 64 that can be made from any suitable type of conductive material. In the illustrated and described embodiment, the conductive tip 64 is made from brass which is then wrapped with a conductive fabric. The conductive fabric can comprise any suitable type of conductive fabric. In the illustrated and described embodiment, the conductive fabric is formed from a shielding material such as that used for EMF shielding. A piece of heat shrink material 66 holds the conductive fabric in place on the tip. It is to be appreciated and understood that the tip can be formed from any suitable type of material, including materials that are not conductive in nature, such as various plastics as well as other materials.
Pinch Hanger
In operation, once the actuators are aligned in the X and Y directions, movement in the X-axis and Y-axis directions can be locked by a locking mechanism (not specifically shown). Once these directions are locked, the actuator assembly 23 can be lowered to test the device under test. As an example, consider the accompanying figure.
There, the actuator assembly 23 has been lowered by virtue of a handle (not specifically shown) to bring actuators 30 into an operative testing position with respect to the display device of device under test 36. The operative testing position can be a touch-position in which the actuators 30 physically engage the display device. Alternately, the operative testing position can be a near-touch-position in which the actuators 30 do not physically engage the display device.
Once in the operative testing position, the actuators 30 can be moved toward one another along the test pattern 70 to provide a pinch input. As an example, consider the accompanying figure.
Once movement along the test pattern 70 has been completed, the carriage 18 can be returned to its original z position.
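The converging motion just described can be sketched as a simple trajectory computation. The Python below is illustrative only; the function name, step count, and millimetre coordinate convention are assumptions and not part of the tool described above.

```python
# Illustrative sketch of the pinch input described above: two actuators,
# mounted on the cross rail bracket, are stepped toward one another to
# produce a "pinch" gesture. Units and step sizes are assumptions.

def pinch_trajectory(start_separation_mm, end_separation_mm, steps):
    """Yield (left_x, right_x) actuator positions, centred on x = 0,
    as the pair converges from start to end separation."""
    if steps < 2:
        raise ValueError("need at least two steps")
    delta = (start_separation_mm - end_separation_mm) / (steps - 1)
    for i in range(steps):
        sep = start_separation_mm - i * delta
        yield (-sep / 2.0, sep / 2.0)

# Converge from 60 mm apart to 10 mm apart in six discrete steps.
positions = list(pinch_trajectory(60.0, 10.0, steps=6))
print(positions[0])   # (-30.0, 30.0)
print(positions[-1])  # (-5.0, 5.0)
```

Reversing the start and end separations would yield the diverging “spread” gesture in the same way.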
Automated Mode Testing
As noted above, in one or more embodiments, the input test tool can be configured to operate in an automated mode. In these embodiments, suitably-configured motors, such as servo motors, stepper motors, and the like, can move the carriage 18 and lower portion 22 to achieve movement in the X-axis direction, Y-axis direction, Z-axis direction, and both R1 and R2 axes. In addition, a camera or cameras can be mounted on the input test tool to acquire a test pattern and suitably configured software can then appropriately position the actuator assembly 23 for testing a device under test.
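As a rough illustration of the automated mode, the following Python sketch models the motorized axes and a simple swipe sequence. All names here (AutomatedTestTool, MotorAxis, move_to, and so on) are hypothetical; a real implementation would drive the servo or stepper motors through a vendor motion-control API rather than merely recording positions.

```python
# Hypothetical sketch of an automated-mode controller for the input test
# tool described above. Positions are recorded rather than actuated.

from dataclasses import dataclass


@dataclass
class MotorAxis:
    """Models one motorized degree of freedom (X, Y, Z, R1, or R2)."""
    name: str
    position: float = 0.0  # millimetres for linear axes, degrees for rotary

    def move_to(self, target: float) -> None:
        # A real driver would step the motor; here we just record the pose.
        self.position = target


class AutomatedTestTool:
    """Coordinates the five axes to reproduce a manual test sequence."""

    def __init__(self) -> None:
        self.axes = {n: MotorAxis(n) for n in ("X", "Y", "Z", "R1", "R2")}

    def align(self, x: float, y: float) -> None:
        """Position the carriage over the desired start point, akin to
        the manual X/Y alignment described above."""
        self.axes["X"].move_to(x)
        self.axes["Y"].move_to(y)

    def lower_to_operative_position(self, z: float) -> None:
        """Lower the actuator assembly into the operative testing position."""
        self.axes["Z"].move_to(z)

    def linear_swipe(self, dx: float = 0.0, dy: float = 0.0) -> None:
        """Move the actuator along the display to trace a test pattern."""
        self.axes["X"].move_to(self.axes["X"].position + dx)
        self.axes["Y"].move_to(self.axes["Y"].position + dy)


tool = AutomatedTestTool()
tool.align(x=10.0, y=25.0)
tool.lower_to_operative_position(z=-5.0)
tool.linear_swipe(dx=40.0)
print(tool.axes["X"].position)  # 50.0
```

In a camera-guided configuration, the targets passed to `align` and `linear_swipe` would come from the testing software's analysis of the acquired test pattern.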
Example Methods
Step 2100 moves an actuator assembly of an input testing tool into an operative testing position relative to a display device of a device under test. In the illustrated and described embodiment, the actuator assembly includes one or more actuators, as noted above. Step 2102 tests the display device of the device under test by moving the actuator or actuators relative to the display device. As noted above, the operative testing position can be a touch-position in which the actuators physically engage the display device. Alternately, the operative testing position can be a near-touch-position in which the actuators do not physically engage the display device.
Movement of the actuator assembly into the operative testing position can occur in any suitable way. For example, in at least some embodiments, the actuator assembly can be moved in one or more of the X-axis direction or the Y-axis direction. Once positioned, the actuator assembly can be moved in the Z-axis direction and into the operative testing position.
The actuators can be moved to effect testing of the display device in any suitable way. For example, the actuator assembly and, hence, the actuators can be moved in the X-axis or Y-axis direction. Alternately or additionally, the actuator assembly can be rotationally moved relative to the display device to effect testing, examples of which are provided above.
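One way to picture the rotational test input is as an arc traced by an actuator offset a fixed radius from the rotation axis, as in the arc motion described earlier. The following sketch computes such an arc; the function name, units, and coordinate convention are illustrative assumptions.

```python
# Sketch of a rotational (R2-axis) test input: an actuator offset a
# fixed radius from the rotation axis traces an arc across the display
# as the lower carriage portion is rotated.

import math


def arc_points(center, radius_mm, start_deg, end_deg, steps):
    """Return (x, y) positions swept by an actuator rotating about
    `center` from start_deg to end_deg in `steps` discrete poses."""
    pts = []
    for i in range(steps):
        t = start_deg + (end_deg - start_deg) * i / (steps - 1)
        a = math.radians(t)
        pts.append((center[0] + radius_mm * math.cos(a),
                    center[1] + radius_mm * math.sin(a)))
    return pts


# A quarter-circle sweep: the arc starts on the +x axis and ends on the
# +y axis (up to floating-point rounding).
pts = arc_points(center=(0.0, 0.0), radius_mm=20.0,
                 start_deg=0.0, end_deg=90.0, steps=3)
print(pts[0], pts[-1])
```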
In automated scenarios, movement of the actuator assembly and the actuators can be performed under the influence of one or more motors, examples of which are provided above.
Step 2104 moves a movable carriage supporting an actuator assembly into a position relative to a display device of a device under test. The actuator assembly includes one or more actuators that are removably mounted thereon. The movable carriage is movable in an X-axis direction and a Y-axis direction. In the illustrated and described embodiments, the actuator assembly is also movable in the Z-axis direction. Step 2106 moves the actuator assembly into an operative testing position. In one or more embodiments, movement of the actuator assembly into the operative testing position occurs in the Z-axis direction. Step 2108 moves the one or more actuators relative to the display device.
As noted above, the operative testing position can be a touch-position or a near-touch-position.
Further, movement of the actuators can occur in any of the manners described above. For example, the actuators can be moved in the X-axis direction, the Y-axis direction, and/or relative to or around a rotation axis. Further, movement of the actuators can occur relative to each other to effect, for example, a pinch test.
In automated scenarios, movement of the carriage, actuator assembly and the actuators can be performed under the influence of one or more motors, examples of which are provided above.
Step 2110 acquires a test pattern on a display device of a device under test. This step can be performed by using one or more suitably-configured cameras. The cameras can capture a test pattern that is displayed on the display device and convey test pattern data to testing software for subsequent processing. Step 2112 moves an actuator assembly of an input testing tool into an operative testing position. The actuator assembly includes one or more actuators, such as those described above. Movement of the actuator assembly can occur through the use of one or more motors under the influence of testing software that utilizes the test pattern data captured by the cameras. Step 2114 tests the display device of the device under test by moving one or more actuators relative to the test pattern on the display device.
Testing of the display device by moving the actuators can occur in any suitable way, examples of which are provided above. In addition, the automated testing scenario can be used for touch-testing and for near-touch-testing.
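Steps 2110 through 2114 imply a mapping from the camera's pixel coordinates of the acquired test pattern to stage coordinates for positioning the actuator assembly. The sketch below assumes a simple linear calibration; a real system would be calibrated against fiducials on the platform, and all names and values are hypothetical.

```python
# Hypothetical sketch of camera-guided positioning: detected test-pattern
# points (pixels) are converted into stage coordinates (mm) that the
# motors can drive the actuator assembly to. The linear scale-and-offset
# calibration model is an assumption for illustration.

def pixels_to_stage(px, py, mm_per_px=0.1, origin_mm=(0.0, 0.0)):
    """Map a camera pixel coordinate to an (x, y) stage coordinate."""
    return (origin_mm[0] + px * mm_per_px, origin_mm[1] + py * mm_per_px)


def plan_moves(pattern_px):
    """Turn detected test-pattern points into a list of stage targets."""
    return [pixels_to_stage(px, py) for (px, py) in pattern_px]


# Example: a horizontal swipe pattern detected at three pixel locations.
targets = plan_moves([(100, 200), (300, 200), (500, 200)])
print(targets)  # [(10.0, 20.0), (30.0, 20.0), (50.0, 20.0)]
```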
Having described example operating environments in which various embodiments can be utilized, consider now a discussion of an example device that can be utilized in an automated testing mode, in accordance with one or more embodiments.
Example Device
Device 2200 includes communication devices 2202 that enable wired and/or wireless communication of device data 2204 (e.g., received data, data that is being received, data scheduled for broadcast, data packets of the data, etc.). The device data 2204 or other device content can include configuration settings of the device, media content stored on the device, and/or information associated with a user of the device. Media content stored on device 2200 can include any type of audio, video, and/or image data. Device 2200 includes one or more data inputs 2206 via which any type of data, media content, and/or inputs can be received, such as user-selectable inputs, messages, music, television media content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.
Device 2200 also includes communication interfaces 2208 that can be implemented as any one or more of a serial and/or parallel interface, a wireless interface, any type of network interface, a modem, and as any other type of communication interface. The communication interfaces 2208 provide a connection and/or communication links between device 2200 and a communication network by which other electronic, computing, and communication devices communicate data with device 2200.
Device 2200 includes one or more processors 2210 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable or readable instructions to control the operation of device 2200 and to implement the embodiments described above. Alternatively or in addition, device 2200 can be implemented with any one or combination of hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits which are generally identified at 2212. Although not shown, device 2200 can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
Device 2200 also includes computer-readable media 2214, such as one or more memory components, examples of which include random access memory (RAM), non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable compact disc (CD), any type of a digital versatile disc (DVD), and the like. Device 2200 can also include a mass storage media device 2216.
Computer-readable media 2214 provides data storage mechanisms to store the device data 2204, as well as various device applications 2218 and any other types of information and/or data related to operational aspects of device 2200. For example, an operating system 2220 can be maintained as a computer application with the computer-readable media 2214 and executed on processors 2210. The device applications 2218 can include a device manager (e.g., a control application, software application, signal processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, etc.), as well as other applications that can include web browsers, image processing applications, communication applications such as instant messaging applications, word processing applications, and a variety of other different applications. The device applications 2218 also include any system components or modules to implement embodiments of the techniques described herein. In this example, the device applications 2218 include an input testing module 2222 that is shown as a software module and/or computer application. Input testing module 2222 is representative of software that is used to control testing scenarios, as described above. Alternatively or in addition, input testing module 2222 can be implemented as hardware, software, firmware, or any combination thereof.
Device 2200 also includes an audio and/or video input-output system 2224 that provides audio data to an audio system 2226 and/or provides video data, as from one or more cameras, to a display system 2228 or input testing module 2222 for processing as described above. The audio system 2226 and/or the display system 2228 can include any devices that process, display, and/or otherwise render audio, video, and image data. Video signals and audio signals can be communicated from device 2200 to an audio device and/or to a display device via an RF (radio frequency) link, S-video link, composite video link, component video link, DVI (digital video interface), analog audio connection, or other similar communication link. In an embodiment, the audio system 2226 and/or the display system 2228 are implemented as external components to device 2200. Alternatively, the audio system 2226 and/or the display system 2228 are implemented as integrated components of example device 2200.
Various embodiments provide an input test tool that promotes precision testing, flexibility and repeatability over a wide variety of functionality tests that are utilized in both touch and near-touch input scenarios. The input test tool enables a variety of degrees of motion, including both linear and rotational motion, so that a device under test can be tested utilizing a number of different linear and/or rotational input scenarios.
Although the embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that the embodiments defined in the appended claims are not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed embodiments.
100941441 | Feb 2010 | KR |
20100067178 | Jun 2010 | KR |
20100077298 | Jul 2010 | KR |
20100129015 | Dec 2010 | KR |
20100135982 | Dec 2010 | KR |
101007049 | Jan 2011 | KR |
20110005946 | Jan 2011 | KR |
20110011337 | Feb 2011 | KR |
20110016349 | Feb 2011 | KR |
101065014 | Sep 2011 | KR |
WO-2006042309 | Apr 2006 | WO |
WO-20130063042 | May 2013 | WO |
Entry |
---|
“Capacitive Touch Sensors—Application Fields, Technology Overview and Implementation Example”, Fujitsu Microelectronics Europe GmbH; retrieved from http://www.fujitsu.com/downloads/MICRO/fme/articles/fujitsu-whitepaper-capacitive-touch-sensors.pdf on Jul. 20, 2011, Jan. 12, 2010, 12 pages. |
“Final Office Action”, U.S. Appl. No. 12/941,693, Nov. 18, 2013, 21 pages. |
“Final Office Action”, U.S. Appl. No. 12/941,693, Nov. 26, 2012, 22 pages. |
“Final Office Action”, U.S. Appl. No. 13/152,991, Sep. 20, 2013, 14 pages. |
“Final Office Action”, U.S. Appl. No. 13/183,377, Oct. 15, 2013, 12 pages. |
“Final Office Action”, U.S. Appl. No. 13/229,121, Nov. 21, 2013, 15 pages. |
“Final Office Action”, U.S. Appl. No. 13/229,214, Jul. 26, 2013, 25 pages. |
“Final Office Action”, U.S. Appl. No. 13/293,060, Sep. 25, 2013, 10 pages. |
“Haptic-Actuator Controllers”, retrieved from <http://www.maxim-ic.com/products/data_converters/touch-interface/haptic-actuator.cfm> on May 4, 2011, 1 page. |
“International Search Report and Written Opinion”, Application No. PCT/US2013/053621, Feb. 20, 2013, 10 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2012/053681, Feb. 27, 2013, 11 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2013/061067, Feb. 7, 2014, 11 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2011/055621, Jun. 13, 2012, 8 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2013/021787, May 13, 2013, 9 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2012/024780, Sep. 3, 2012, 9 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2012/024781, Sep. 3, 2012, 9 pages. |
“International Search Report and Written Opinion”, Application No. PCT/US2012/027642, Sep. 3, 2012, 9 pages. |
“International Search Report”, Application No. PCT/US2011/058855, Nov. 1, 2011, 8 pages. |
“MAX11871”, retrieved from <http://www.maxim-ic.com/datasheet/index.mvp/id/7203> on May 4, 2011, Mar. 25, 2011, 2 pages. |
“MOSS User Permissions and ‘Modify Shared Webpart’ Link”, retrieved from http://www.sharepointdev.net/sharepoint-general-question-answers-discussion/moss-user-permissions-modify-shared-webpart-link-13744.shtml on Aug. 8, 2011, 2009, 3 pages. |
“Non-Final Office Action”, U.S. Appl. No. 12/941,693, May 16, 2013, 13 pages. |
“Non-Final Office Action”, U.S. Appl. No. 12/941,693, Jul. 18, 2012, 19 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/099,288, Feb. 6, 2014, 13 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/152,991, Mar. 21, 2014, 18 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/152,991, Mar. 21, 2013, 10 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/154,161, Jan. 3, 2014, 14 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/156,243, Sep. 19, 2013, 12 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/183,377, Feb. 27, 2014, 12 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/183,377, Jun. 21, 2013, 10 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/198,036, Jan. 31, 2014, 14 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/228,283, Aug. 27, 2013, 12 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/229,121, Jun. 7, 2013, 13 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/229,214, Feb. 15, 2013, 23 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/293,060, Nov. 29, 2013, 11 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/293,060, Jul. 12, 2013, 9 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/530,692, Jan. 31, 2014, 14 pages. |
“Notice of Allowance”, U.S. Appl. No. 13/156,243, Jan. 28, 2014, 8 pages. |
“Notice of Allowance”, U.S. Appl. No. 13/198,415, Dec. 26, 2013, 8 pages. |
“Office Web Apps: Share a SkyDrive Folder”, retrieved from http://explore.live.com/office-web-apps-skydrive-share-using on Aug. 8, 2011, 2 pages. |
“Office Web Apps: Share files on SkyDrive”, retrieved from http://explore.live.com/office-web-apps-skydrive-share-files-using on Aug. 8, 2011, 1 page. |
“Public or Private Articles”, retrieved from <http://www.presspublisher.com/features/public-or-private-articles> on Aug. 8, 2011, 3 pages. |
“Setting Sharing Permissions for Google Docs and Google Sites”, retrieved from http://www.library.kent.edu/files/SMS_Google_Sharing_Permissions.pdf on Aug. 8, 2011, 8 pages. |
“Share Office documents in SkyDrive”, retrieved from http://office.microsoft.com/en-us/web-apps-help/share-office-documents-in-skydrive-HA101820121.aspx on Aug. 8, 2011, 3 pages. |
“Shared Folder Permissions”, retrieved from http://www.tech-faq.com/shared-folder-permissions.html on Aug. 8, 2011, 7 pages. |
“STM23S-2AN NEMA 23 Integrated Drive+Motor”, Retrieved from: <http://www.applied-motion.com/products/integrated-steppers/stm23s-2an> on Jan. 24, 2012, Jan. 24, 2010, 3 pages. |
“Technology Comparison: Surface Acoustic Wave, Optical and Bending Wave Technology”, 3M Touch Systems, Available at <http://multimedia.3m.com/mws/mediawebserver?mwsId=66666UuZjcFSLXTtnXT2NXTaEVuQEcuZgVs6EVs6E666666--&fn=DST-Optical-SAW%20Tech%20Brief.pdf>, 2009, pp. 1-4. |
“Using Low Power Mode on the MPR083 and MPR084”, Freescale Semiconductor Application Note, Available at <http://cache.freescale.com/files/sensors/doc/app_note/AN3583.pdf>, Nov. 2007, pp. 1-5. |
Asif, et al., “MPEG-7 Motion Descriptor Extraction for Panning Camera Using Sprite Generated”, In Proceedings of AVSS 2008, Available at <http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4730384>, Sep. 2008, pp. 60-66. |
Baraldi, et al., “WikiTable: Finger Driven Interaction for Collaborative Knowledge-Building Workspaces”, Proceedings of the 2006 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW '06), available at <<http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=1640590>>, Jul. 5, 2006, 6 pages. |
Benko, et al., “Resolving Merged Touch Contacts”, U.S. Appl. No. 12/914,693, Nov. 8, 2010, 22 pages. |
Binns, “Multi-“Touch” Interaction via Visual Tracking”, Bachelor of Science in Computer Science with Honours, The University of Bath, available at <<http://www.cs.bath.ac.uk/˜mdv/courses/CM30082/projects.bho/2008-9/Binns-FS-dissertation-2008-9.pdf>>, May 2009, 81 pages. |
Cao, et al., “Evaluation of an On-line Adaptive Gesture Interface with Command Prediction”, In Proceedings of GI 2005, Available at <http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=DAB1B08F620C23464427932BAF1ECF49?doi=10.1.1.61.6749&rep=rep1&type=pdf>, May 2005, 8 pages. |
Cao, et al., “ShapeTouch: Leveraging Contact Shape on Interactive Surfaces”, In Proceedings of Tabletop 2008, Available at <http://www.cs.toronto.edu/˜caox/tabletop2008_shapetouch.pdf>, 2008, pp. 139-146. |
Cravotta, “The Battle for Multi-touch”, Embedded Insights, retrieved from <http://www.embeddedinsights.com/channels/2011/04/12/the-battle-for-multi-touch/> on May 4, 2011, Apr. 12, 2011, 3 pages. |
Dang, et al., “Hand Distinction for Multi-Touch Tabletop Interaction”, University of Augsburg; Institute of Computer Science; Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces, Nov. 23-25, 2009, 8 pages. |
Dillencourt, et al., “A General Approach to Connected-Component Labeling for Arbitrary Image Representations”, Journal of the Association for Computing Machinery, vol. 39, No. 2, available at <<http://www.cs.umd.edu/˜hjs/pubs/DillJACM92.pdf>>, Apr. 1992, pp. 253-280. |
Li, et al., “Role Based Access Control for Social Network Sites”, Department of Computer Science, Sun Yat-sen University; retrieved from http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=05420153, Dec. 3-5, 2009, 6 pages. |
Pratt, “Factors Affecting Sensor Response”, Analog Devices, AN-830 Application Note, Available at <http://www.analog.com/static/imported-files/application_notes/5295737729138218742AN830_0.pdf>, Dec. 2005, pp. 1-8. |
Tao, et al., “An Efficient Cost Model for Optimization of Nearest Neighbor Search in Low and Medium Dimensional Spaces”, Knowledge and Data Engineering, vol. 16 Issue:10, retrieved from <<http://www.cais.ntu.edu.sg/˜jzhang/papers/ecmonns.pdf>> on Mar. 16, 2011, Oct. 2004, 16 pages. |
Tsuchiya, et al., “Vib-Touch: Virtual Active Touch Interface for Handheld Devices”, In Proceedings of The 18th IEEE International Symposium on Robot and Human Interactive Communication, Available at <http://www.mech.nagoya-u.ac.jp/asi/en/member/shogo_okamoto/papers/tsuchiyaROMAN2009.pdf>, Oct. 2009, pp. 12-17. |
Westman, et al., “Color Segmentation by Hierarchical Connected Components Analysis with Image Enhancement by Symmetric Neighborhood Filter”, Pattern Recognition, 1990. Proceedings., 10th International Conference on Jun. 16-21, 1990, retrieved from <<http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=118219>> on Mar. 16, 2011, Jun. 16, 1990, pp. 796-802. |
Wilson, “TouchLight: An Imaging Touch Screen and Display for Gesture-Based Interaction”, In Proceedings of ICIM 2004, Available at <http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.95.3647&rep=rep1&type=pdf>, Oct. 2004, 8 pages. |
Wimmer, et al., “Modular and Deformable Touch-Sensitive Surfaces Based on Time Domain Reflectometry”, In Proceedings of UIST 2011, Available at <http://www.medien.ifi.lmu.de/pubdb/publications/pub/wimmer2011tdrTouch/wimmer2011tdrTouch.pdf>, Oct. 2011, 10 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/664,840, Jul. 15, 2014, 14 pages. |
“Final Office Action”, U.S. Appl. No. 13/530,692, Apr. 10, 2014, 16 pages. |
Brodkin, Jon “Windows 8 hardware: Touchscreens, sensor support and robotic fingers”, <<http://arstechnica.com/business/news/2011/09/windows-8-hardware-touch-screens-sensor-support-and-robotic-fingers.ars>>, (Sep. 13, 2011), 1 page. |
Buffet, Y “Robot Touchscreen Analysis”, <<http://ybuffet.posterous.com/labsmotocom-blog-archive-robot-touchscreen-an>>, (Apr. 19, 2010), 2 pages. |
Hoshino, et al., “Pinching at finger tips for humanoid robot hand”, Retrieved at <<http://web.mit.edu/zoz/Public/HoshinoKawabuchiRobotHand.pdf>>, (Jun. 30, 2005), 9 pages. |
Kastelan, et al., “Stimulation Board for Automated Verification of Touchscreen-Based Devices”, 22nd International Conference on Field Programmable Logic and Applications, Available at <https://www2.lirmm.fr/lirmm/interne/BIBLI/CDROM/MIC/2012/FPL_2012/Papers/PHD7.pdf>, (Aug. 29, 2012), 2 pages. |
Kastelan, et al., “Touch-Screen Stimulation for Automated Verification of Touchscreen-Based Devices”, In IEEE 19th International Conference and Workshops on Engineering of Computer Based Systems, (Apr. 11, 2012), pp. 52-55. |
Kjellgren, Olof “Developing a remote control application for Windows CE”, Bachelor Thesis performed in Computer Engineering at ABB Robotics, Mälardalen University, Department of Computer Science and Electronics, Retrieved at <<http://www.idt.mdh.se/utbildning/exjobb/files/TR0661.pdf>>, (May 30, 2007), 43 pages. |
McGlaun, Shane “Microsoft's Surface 2.0 Stress Testing Robot Called Patty Shown off for First Time”, Retrieved at <<http://www.slashgear.com/microsofts-surface-2-0-stress-testing-robot-called-patty-shown-off-for-first-time-19172971/>>, (Aug. 19, 2011), 1 page. |
Takeuchi, et al., “Development of a Multi-fingered Robot Hand with Softness changeable Skin Mechanism”, International Symposium on and 2010 6th German Conference on Robotics (ROBOTIK), Retrieved at <<http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=05756853>>, (Jun. 7, 2010), 7 pages. |
Zivkov, et al., “Touch Screen Mobile Application as Part of Testing and Verification System”, Proceedings of the 35th International Convention, (May 21, 2012), pp. 892-895. |
“Touch Quality Test Robot”, U.S. Appl. No. 13/530,692, (Jun. 22, 2012), 20 pages. |
“Actuation Force of Touch Screen”, Solutions @ Mecmesin, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileId=188971>,(Dec. 31, 2010), 1 page. |
“AO Touch Screen Tester”, retrieved from <http://www.ao-cs.com/Projects/touch%20screen%20tester%20project.html>, (Dec. 31, 2010), 1 page. |
“Linearity Testing Solutions in Touch Panels”, retrieved from <advantech.com/machine-automation/.../%7BD05BC586-74DD-4BFA-B81A-2A9F7ED489F/>, (Nov. 15, 2011), 2 pages. |
“MicroNav Integration Guide Version 3.0”, retrieved from <http://www.steadlands.com/data/interlink/micronavintguide.pdf>, (Dec. 31, 2003), 11 pages. |
“Microsoft Windows Simulator Touch Emulation”, retrieved from <blogs.msdn.com/b/visualstudio/archive/2011/09/30/microsoft-windows-simulator-touch-emulation.aspx>, (Sep. 30, 2011), 3 pages. |
“OptoFidelity Touch & Test”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileId=188969>, (Feb. 20, 2012), 2 pages. |
“OptoFidelity Touch and Test”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileId=188420>, (May 4, 2012), 2 pages. |
“OptoFidelity Two Fingers—robot”, video available at <http://www.youtube.com/watch?v=YppRASbXHfk&feature=player_embedded#!section>, (Sep. 15, 2010), 2 pages. |
“Projected Capacitive Test Fixture”, retrieved from <http://www.touch-intl.com/downloads/DataSheets%20for%20Web/6500443-PCT-DataSheet-Web.pdf>, (2009), 2 pages. |
“Resistive Touch Screen—Resistance Linearity Test”, video available at <http://www.youtube.com/watch?v=hb23GpQdXXU>, (Jun. 17, 2008), 2 pages. |
“Smartphone Automatic Testing Robot at UEI Booth”, video available at <http://www.youtube.com/watch?v=f-Q4ns-b9sA>, (May 9, 2012), 2 pages. |
“Touch Panel Inspection & Testing Solution”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileId=188967>, (Dec. 31, 2010), 1 page. |
“Touch Panel Semi-Auto Handler Model 3810”, retrieved from <http://www.chromaus.com/datasheet/3810_en.pdf>, (Dec. 31, 2010), 2 pages. |
“TouchSense Systems Immersion”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileId=188486>, (Jun. 19, 2010), 20 pages. |
Dillow, Clay “Liquid-Filled Robot Finger More Sensitive to Touch Than a Human's”, retrieved from <www.popsci.com/technology/article/2012-06/new-robot-finger-more-sensitive-touch-human> on Jul. 27, 2012, (Jun. 19, 2012), 3 pages. |
Khandkar, Shahedul H., et al., “Tool Support for Testing Complex MultiTouch Gestures”, ITS 2010, Nov. 7-10, 2010, Saarbrücken, Germany, (Nov. 7, 2010), 10 pages. |
Kuosmanen, Hans “OptoFidelity Automating UI Testing”, video available at <http://www.youtube.com/watch?v=mOZ2r7ZvyTg&feature=player_embedded#!section>, (Oct. 14, 2010), 2 pages. |
Kuosmanen, Hans “Testing the Performance of Touch-Enabled Smartphone User Interfaces”, retrieved from <http://www.ArticleOnePartners.com/index/servefile?fileId=188442>, (Dec. 31, 2008), 2 pages. |
Levin, Michael et al., “Tactile-Feedback Solutions for an Enhanced User Experience”, retrieved from <http://www.pbinterfaces.com/documents/Tactile_Feedback_Solutions.pdf>, (Oct. 31, 2009), pp. 18-21. |
McMahan, William et al., “Haptic Display of Realistic Tool Contact via Dynamically Compensated Control of a Dedicated Actuator”, International Conference on Intelligent Robots and Systems, St. Louis, MO, Oct. 11-15, 2009, retrieved from <http://repository.upenn.edu/meam_papers/222>, (Dec. 15, 2009), 9 pages. |
Toto, Serkan “Video: Smartphone Test Robot Simulates Countless Flicking and Tapping”, retrieved from <techcrunch.com/2010/12/21/video-smartphone-test-robot-simulates-countless-flicking-and-tapping/>, (Dec. 21, 2010), 2 pages. |
“PCT Search Report and Written Opinion”, Application No. PCT/US2013/046208, Sep. 27, 2013, 12 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/530,692, Aug. 25, 2014, 18 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/664,840, Sep. 18, 2015, 17 pages. |
“Corrected Notice of Allowance”, U.S. Appl. No. 13/530,692, Apr. 23, 2015, 4 pages. |
“Final Office Action”, U.S. Appl. No. 13/664,840, Mar. 27, 2015, 15 pages. |
“Non-Final Office Action”, U.S. Appl. No. 13/905,093, Feb. 20, 2015, 18 pages. |
“Notice of Allowance”, U.S. Appl. No. 13/530,692, Mar. 3, 2015, 8 pages. |
Number | Date | Country | |
---|---|---|---|
20140111484 A1 | Apr 2014 | US |