Electronic devices are ubiquitous, and include cellular phones, eBook readers, tablet computers, desktop computers, portable media devices, and so forth. These electronic devices may utilize touch sensors for accepting user input.
The detailed description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items.
Overview
Described herein are techniques and apparatus for characterizing touches based upon resistive and capacitive input. A resistive sensor providing resistive input may be coupled to a capacitive touch sensor providing capacitive touch input. As used in this disclosure for illustration, and not by way of limitation, the resistive sensor may comprise an interpolating force-sensitive resistor (IFSR) array touch sensor. In another implementation, the IFSR array touch sensor may be configured to detect touches resistively as well as capacitively.
By utilizing resistive and capacitive input and analyzing and comparing the two, it becomes possible to detect near-touches, detect light touches, accurately characterize touches, and so forth. Furthermore, the IFSR array allows for measuring the pressure exerted by a touch on the sensor. The IFSR array may also be used in conjunction with an active stylus to determine a position of the active stylus relative to the IFSR array.
An IFSR array comprises two layers. A first layer incorporates columns of electrodes and a second layer comprises rows of electrodes. These layers are disposed generally perpendicular to one another, such that a grid pattern is formed. A resistive material fills gaps between the electrodes within the layers and acts as a linear resistor. When a user pushes upon the array, the resistance at various intersections between the rows and columns changes. Because of the linear responsiveness of the resistive material, it is possible for a touch controller to interpolate the location of a touch between intersections. Additionally, a magnitude of the applied force is discernable as well.
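For illustration only, the following sketch shows one way such interpolation may be computed: the touch location is taken as the force-weighted centroid of the readings at the junctions surrounding a touch, yielding positions finer than the electrode pitch. The function name and sample values are hypothetical and are not part of the disclosure.

    def interpolate_touch(forces):
        """Estimate a touch position between electrode intersections.

        `forces` maps (row, column) junction indices to measured force
        readings; the force-weighted centroid is what permits an IFSR
        array to report locations between intersections. Illustrative
        sketch only; a production controller would also filter noise.
        """
        total = sum(forces.values())
        if total == 0:
            return None  # no touch present
        row = sum(r * f for (r, _), f in forces.items()) / total
        col = sum(c * f for (_, c), f in forces.items()) / total
        return (row, col, total)  # interpolated position and magnitude

    # Example: a touch centered between junctions, biased toward (1, 1).
    readings = {(1, 1): 40.0, (1, 2): 20.0, (2, 1): 25.0, (2, 2): 15.0}
    print(interpolate_touch(readings))  # -> (1.4, 1.35, 100.0)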
Because the first and second layers are not directly in contact, direct capacitive effects are present as well. In some implementations, projected capacitive effects may be present instead of, or in addition to, direct capacitive effects. The resistive material, air, or other materials within the sensor act as a dielectric for the capacitor. When a charge is applied to a portion of one layer, such as a particular row, capacitive coupling introduces a charge within overlapping columns. As described herein, this may be utilized to provide for touch and hovering (near-touch) detection. Furthermore, because the degree of capacitive coupling varies depending upon the composition of the material, the capacitive effect may be utilized to characterize the object making the touch. For example, a human finger provides a greater capacitive coupling effect than an inanimate plastic or ceramic stylus.
Resistive and capacitive effects within a single sensor are discernable in a time domain of a signal generated by the IFSR array. These effects may vary based upon the physical implementation of the IFSR array. In one implementation, capacitive effects result in a brief voltage spike shortly after pulsing the electrodes, whereas resistive effects result in a generally steady state voltage over a longer time span. A timeframe of these effects may be short, such as on the order of microseconds.
The IFSR array may be configured to act in a capacitive mode only by placing a non-conductive layer between the first and second touch sensor layers. This implementation allows for the fabrication of an inexpensive capacitive sensor. Furthermore, this configuration may also be used in conjunction with the active stylus described below.
In some implementations, the IFSR array operating in resistive mode may be coupled with a discrete capacitive touch sensor. The two sensors may be arranged such that they are adjacent to and coincident with each other, such as one behind another. In another implementation, the two sensors may be distributed in an alternating row, grid, or other pattern such that they are side-by-side. Output signals from both touch sensors may be compared to provide the characterization capabilities described above. For example, a touch which is detected by the IFSR array but not by the capacitive sensor may be characterized as a stylus touch.
The IFSR array may also be used in conjunction with an active stylus. The active stylus contains electronics which allow it to interact with the IFSR touch sensor array. In one implementation, a signal may be generated by the touch sensor, which is then received by the active stylus. Position of the active stylus may be determined by comparing the received signal with a known scan pattern, timing, modulation of the signal, and so forth. In other implementations, the active stylus may generate the signal, which is received by the IFSR array, which then determines the position. The IFSR array may continue to operate in resistive, capacitive, or both modes while using the active stylus.
Illustrative Device
The device 100 incorporates an input module 108 which processes input from the touch sensor 104 to characterize touches. Characterization of the touches may include distinction of composition, location, force, area, duration, and so forth.
The touch sensor 104 may be configured to operate in several modes: projected capacitive sensing 110, direct capacitive sensing 112, resistive sensing 114, or a combination thereof. These modes may be used individually or combined in the same touch sequence. For the following discussion, it is useful to consider the approach and eventual contact of a user's 116 finger or hand to the touch sensor 104.
The projected capacitive sensing 110 mode provides for sensing an object which is not in contact with the touch sensor, or hovering, proximate to the touch sensor. For example, as the user's 116 finger approaches the touch sensor 104, the projected capacitive sensing 110 detects the finger without any contact with the surface of the touch sensor 104. Upon approach to the surface, the user's 116 finger distorts an electric field at a junction of rows and columns due to capacitive coupling, resulting in a detectable signal which may be interpreted as a touch. The user may thus hover, or place the finger proximate to, but not in contact with, the touch sensor 104.
As the user 116 continues to move the finger towards the touch sensor, it may eventually lightly contact the surface of the touch sensor 104. While operating in direct capacitive sensing 112 mode, even very light touches register. These very light touches may be so slight as to be imperceptible to the user 116, yet they result in deformation or compression of the touch sensor layers which affects the capacitance between the components therein. A slightly harder push results in increased changes to the direct capacitance until reaching a threshold level, at which point increasing pressure no longer results in a change in the output signal.
As the user 116 continues to press harder, resistive sensing due to the physical deformation or compression of the touch sensor 104 layers may predominate. While the direct capacitive effects reach a threshold level, the resistive material continues to provide variation which is proportionate to the force applied. Thus, the harder the touch, the greater the resistance change in the material of the touch sensor 104.
A stylus 118 may also be used for input. In some implementations, the stylus 118 may include active components, forming an active stylus 1200 as discussed below with regards to FIG. 12.
Peripherals 204 couple to the processor 202. An image processing unit 206 is shown coupled to one or more display components 102 (or “displays”). In some implementations, multiple displays may be present and coupled to the image processing unit 206. These multiple displays may be located in the same or different enclosures or panels. Furthermore, one or more image processing units 206 may couple to the multiple displays.
The display 102 may present content in a human-readable format to a user. The display 102 may be reflective, emissive, or a combination of both. Reflective displays utilize incident light and include electrophoretic displays, interferometric modulator displays, cholesteric displays, pre-printed materials, and so forth. Emissive displays do not rely on incident light and, instead, emit light. Emissive displays include backlit liquid crystal displays, time multiplexed optical shutter displays, light emitting diode displays, backlit pre-printed materials, and so forth. When multiple displays are present, these displays may be of the same or different types. For example, one display may be an electrophoretic display while another may be a liquid crystal display.
For convenience only, the display 102 is shown in a generally rectangular configuration. However, it is understood that the display 102 may be implemented in any shape, and may have any ratio of height to width. Also, for stylistic or design purposes, the display 102 may be curved or otherwise non-linearly shaped. Furthermore, the display 102 may be flexible and configured to fold or roll.
The content presented on the display 102 may take the form of electronic books or “eBooks.” For example, the display 102 may depict the text of the eBooks and also any illustrations, tables, or graphic elements that might be contained in the eBooks. The terms “book” and/or “eBook”, as used herein, include electronic or digital representations of printed works, as well as digital content that may include text, multimedia, hypertext, and/or hypermedia. Examples of printed and/or digital works include, but are not limited to, books, magazines, newspapers, periodicals, journals, reference materials, telephone books, textbooks, anthologies, instruction manuals, proceedings of meetings, forms, directories, maps, web pages, and so forth. Accordingly, the terms “book” and/or “eBook” may include any readable or viewable content that is in electronic or digital form.
The device 100 may have an input device controller 208 configured to accept input from a keypad, keyboard, or other user actuable controls 210. These user actuable controls 210 may have dedicated or assigned operations. For instance, the actuable controls may include page-turning buttons, navigational keys, a power on/off button, selection keys, a joystick, a touchpad, and so on.
The device 100 may also include a USB host controller 212. The USB host controller 212 manages communications between devices attached to a universal serial bus (“USB”) and the processor 202 and other peripherals.
The touch sensor controller 214 is configured to determine characteristics of interaction with the touch sensor 104. These characteristics may include the location of the touch on the touch sensor 104, magnitude of the force, shape of the touch, and so forth.
The USB host controller 212 may also couple to a wireless module 216 via the universal serial bus. The wireless module 216 may allow for connection to wireless local or wireless wide area networks (“WWAN”). Wireless module 216 may include a modem 218 configured to send and receive data wirelessly and one or more antennas 220 suitable for propagating a wireless signal. In other implementations, the device 100 may include a wired network interface.
The device 100 may also include an external memory interface (“EMI”) 222 coupled to external memory 224. The EMI 222 manages access to data stored in external memory 224. The external memory 224 may comprise Static Random Access Memory (“SRAM”), Pseudostatic Random Access Memory (“PSRAM”), Synchronous Dynamic Random Access Memory (“SDRAM”), Double Data Rate SDRAM (“DDR”), Phase-Change RAM (“PCRAM”), or other computer-readable storage media.
The external memory 224 may store an operating system 226 comprising a kernel 228 operatively coupled to one or more device drivers 230. The device drivers 230 are also operatively coupled to peripherals 204, such as the touch sensor controller 214. The external memory 224 may also store data 232, which may comprise content objects for consumption on eBook reader device 100, executable programs, databases, user settings, configuration files, device status, the input module 108, and so forth.
One or more batteries 234 provide operational electrical power to components of the device 100 for operation when the device is disconnected from an external power supply. The device 100 may also include one or more other, non-illustrated peripherals, such as a hard drive using magnetic, optical, or solid state storage to store information, a FireWire bus, a Bluetooth™ wireless network interface, a camera, a global positioning system, a PC Card component, and so forth.
Couplings, such as that between the touch sensor controller 214 and the USB host controller 212, are shown for emphasis. There are couplings between many of the components illustrated in FIG. 2, but graphical arrows are omitted for clarity of illustration.
Resistive and Capacitive Sensing with an IFSR Array
As shown at 302, prior to contact the hand “hovers” over the surface of the touch sensor 104, and may be sensed via projected capacitive sensing 110.
As the hand comes lightly into contact, as shown at 304, the slight (and perhaps imperceptible to the user) deformation or compression of the touch sensor 104 results in direct capacitive sensing 112. In some implementations, capacitive sensing (projected, direct, or both) reaches a threshold output level at about 50 grams of applied force. The touch may also be indicated by projected capacitive sensing 110, which may vary with increasing contact area where the touch is generated by a deformable object, such as a fingertip.
As the user presses harder, as shown at 306, the resistive mode begins to generate a signal which is proportional to the amount of force applied. In some implementations, the resistive sensing begins to output a signal at about 20 grams of applied force, which then increases proportionately as the applied force increases. Thus, medium touches are sensed by the direct capacitive sensing 112 and projected capacitive sensing 110 as well as by resistive sensing 114.
As the pressure of the touch by the user 116 continues to increase, as shown at 308, the resistive sensing 114 mode predominates. Signals from the projected capacitive sensing 110 and the direct capacitive sensing 112 modes have reached stable levels, unlike the resistive sensing 114 mode, which continues to vary, providing input as to how hard the touch is. As a result, the resistive mode 114 allows for continued determination of the magnitude of the applied force.
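The progression above may be summarized with a toy model, sketched below: the capacitive output rises and then clamps near the approximately 50 gram threshold noted above, while the resistive output begins near the approximately 20 gram onset and continues to grow with force. The curve shapes and normalization are illustrative assumptions only, and the model ignores the hover response of projected capacitive sensing 110.

    CAP_SATURATION_G = 50.0  # capacitive output plateaus near 50 g (per text)
    RES_ONSET_G = 20.0       # resistive output begins near 20 g (per text)

    def sensing_response(force_g):
        """Toy model of sensor outputs versus applied force in grams.

        Returns normalized (capacitive, resistive) outputs: capacitive
        sensing saturates at a threshold, while resistive sensing keeps
        increasing with force, which is why hard presses are measured
        resistively. Curve shapes are assumptions; only the thresholds
        come from the text above.
        """
        cap = min(force_g / CAP_SATURATION_G, 1.0)           # clamps at 50 g
        res = max(force_g - RES_ONSET_G, 0.0) / RES_ONSET_G  # keeps rising
        return cap, res

    for grams in (0, 10, 30, 50, 200):
        print(grams, sensing_response(grams))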
Also shown in FIG. 3 are a resistance measurement module 314 and a capacitance measurement module 316 within the touch sensor controller 214.
The touch sensor controller 214 scans the touch sensor 104. The resistance measurement module 314 quantifies any resistive readings from the sensor 104 while the capacitance measurement module 316 quantifies any capacitance readings. The touch sensor controller 214 generates touch data based on the measurements of modules 314 and 316 and the touch data is sent to the input module 108. The input module 108 may then characterize touches, initiate actions based upon the touches, and so forth.
In this figure, the X axis indicates increasing time 402, while the perpendicular Y axis indicates voltage 404 measured by the touch sensor controller 214. A signal 406 is shown plotted on the X-Y graph. The signal 406 is received by the touch sensor controller 214 during or after pulsing one or more electrodes in the touch sensor 104. Beginning at time T0, a user approaches and touches the touch sensor 104, with contact occurring at T1. As the touch nears the surface, the signal increases between times T0 and T1 owing to projected capacitive sensing 110. As the touch continues between times T1 and T2, the user increases force, presenting a light touch which produces direct capacitive effects that further increase the signal to a peak at T2. A capacitive effect 408 is thus reflected as a voltage spike in the time domain of the signal. At time T3, the user is in contact with and pressing on the touch sensor 104. The capacitive effects 408 have tapered off and the resistive effect 410 predominates, as characterized by a steady state voltage from time T3 to T4 representing a touch with relatively constant applied force. The magnitude of this voltage may be proportionate to the magnitude of the applied force. For example, a harder press may result in a greater steady state voltage.
The time scale may vary depending upon the electrical characteristics of the components involved. In one implementation, the timeframe of these effects may be such that the capacitive effect 408 has a duration on the order of microseconds.
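One plausible way to reduce such a profile to the two features described above, the early capacitive spike and the later resistive steady state, is sketched below. The sampling interval and window lengths are hypothetical placeholders; the disclosure states only that the spike duration is on the order of microseconds.

    import statistics

    def profile_features(samples, dt_us, spike_window_us=10.0,
                         steady_window_us=300.0):
        """Extract (spike_peak, steady_level) from a temporal voltage
        profile sampled every `dt_us` microseconds after a pulse.

        The spike peak is taken from an early window where capacitive
        effects 408 appear; the steady level is averaged over a late
        window where resistive effects 410 predominate. Window lengths
        are illustrative assumptions.
        """
        n_spike = max(1, int(spike_window_us / dt_us))
        n_steady = max(1, int(steady_window_us / dt_us))
        spike_peak = max(samples[:n_spike])
        steady_level = statistics.mean(samples[-n_steady:])
        return spike_peak, steady_level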
In comparison, at 508, a stylus is used to touch the touch sensor 104, rather than a human finger. The corresponding temporal voltage profile shows no signal from time T0 to time T1 due to the lack of projected capacitive coupling to the stylus. When the stylus contacts the surface at T1, the direct capacitive sensing 112 begins to produce a signal, and the resistive sensing 114 predominates from time T2 to T3. As a result of this difference, touches may be characterized. For example, as shown here it is possible to distinguish between a touch from a finger, which capacitively couples with elements within the sensor via projected capacitance, and a touch from an inanimate stylus, which does not provide capacitive coupling via projected capacitance sufficient to generate a signal.
As described above, the temporal voltage profile incorporates characteristics resulting from resistive and capacitive effects detected by the touch sensor 104. The following processes may be used to generate the temporal voltage profile and analyze the profile to characterize a touch.
At 602, the touch sensor controller 214 pulses a voltage to a first portion of an interpolating force-sensitive resistor array. As shown here, this array may comprise a set of rows and columns which together form an addressable array of junctions. For example, a pulse having voltage Vpulse may be applied to row number 1 in the IFSR array 104. At 604, the touch sensor controller 214 measures voltage Vout over time at a second portion of the IFSR array. For example, the touch sensor controller 214 may measure the voltage on column A.
At 606, the touch sensor controller 214 generates a temporal voltage profile 400 from the measured voltage over time which corresponds to a selected junction of the touch sensor 104. Continuing the above example, the resulting temporal voltage profile 400 thus corresponds to the intersection of row 1 and column A. As described above with regards to FIG. 4, this profile incorporates characteristics of the capacitive and resistive effects occurring at that junction.
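A minimal sketch of this pulse-and-measure scan appears below. The `pulse_row` and `read_column_trace` callables are hypothetical stand-ins for the controller's hardware drive and analog-to-digital capture primitives, which the disclosure does not specify.

    def scan_ifsr(pulse_row, read_column_trace, n_rows, n_cols):
        """Build a temporal voltage profile for every junction.

        For each junction, a voltage pulse is applied to the row (step
        602) and the voltage-versus-time trace is captured on the
        column (step 604); the trace is that junction's temporal
        voltage profile (step 606).
        """
        profiles = {}
        for row in range(n_rows):
            for col in range(n_cols):
                pulse_row(row)
                profiles[(row, col)] = read_column_trace(col)
        return profiles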
At 702, the input module 108 determines when a voltage spike associated with a capacitive coupling of a portion of the array is present within the temporal voltage profile 400. For example, this may occur when a finger capacitively couples with a junction within the touch sensor 104, altering the capacitance of the junction.
At 704, the input module 108 determines when a steady state voltage over a pre-determined period of time associated with a resistive contact within a portion of the array is present within the temporal voltage profile. For example, as described above, the physical distortion of the touch sensor which results in the resistive sensing 114 has a duration greater than that of the capacitive effect. The pre-determined period of time may be static, such as 300 microseconds, or dynamically adjusted.
At 706, when the voltage spike and the steady state voltage are present, the input module 108 categorizes the touch as a user touch. The user touch may comprise a finger, palm, knuckle, cheek, and so forth.
At 708, when the voltage spike is absent and the steady state voltage is present, the input module 108 categorizes the touch as a non-user touch. The non-user touch may comprise a stylus or other inanimate object. Thus, inanimate objects which do not generate appreciable capacitive coupling are distinguished.
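Steps 702 through 708 amount to a two-way decision rule, sketched below using the feature extraction from above. The voltage thresholds are hypothetical hardware-dependent placeholders; only the 300 microsecond example window comes from the text.

    SPIKE_THRESHOLD_V = 0.5    # hypothetical; depends on the hardware
    STEADY_THRESHOLD_V = 0.2   # hypothetical; depends on the hardware

    def categorize_touch(spike_peak, steady_level):
        """Categorize a touch per steps 702-708: a capacitive voltage
        spike plus a steady state voltage indicates a user touch
        (finger, palm, knuckle, and so forth); a steady state voltage
        without a spike indicates a non-user touch, such as an
        inanimate stylus."""
        spike = spike_peak >= SPIKE_THRESHOLD_V
        steady = steady_level >= STEADY_THRESHOLD_V
        if steady and spike:
            return "user touch"
        if steady:
            return "non-user touch"
        return "no touch"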
IFSR Array Configured for Capacitive Sensing
In some implementations, it may be desirable for the IFSR touch sensor array 104 to be configured to operate in a capacitive mode only. There are various ways to accomplish this, and one implementation is described next.
In a similar fashion to that used with regards to resistive sensing, the touch controller 214 may interpolate the location of one or more touches based upon the output of the touch sensor 104 operating in capacitive mode. By scanning the rows and columns and measuring the magnitude of the capacitive response, touches between intersections may thus be determined.
Combining Discrete IFSR and Capacitive Sensors
Both touch sensors may be configured to share a common field of view. The touch sensor controller 214 receives the signals from the touch sensors and provides data to the input module 108. The data from both touch sensors may be compared to provide the characterization capabilities described above. For example, a touch which is detected by the IFSR array 104 but not by the capacitive sensor 902 may be characterized as a stylus touch.
Inspection of the IFSR sensor signal 1006 for the touch sensor operating in resistive mode shows no signal from T0 to time T1, followed by the characteristic increase in signal amplitude from T1 to T2, leading to a steady state voltage which is relatively constant from T3 to T4. In contrast, inspection of the capacitive sensor signal 1008 shows the increase from time T0 to T1 resulting from projected capacitive coupling prior to contact, followed by an increase due to direct capacitive effects from time T1 to T2. The capacitive effects decrease from the peak at T2 to zero at T3.
By assessing the profiles associated with both signals, it is possible to characterize a touch as described above.
At 1106, when the capacitive touch sensor 902 and the IFSR touch sensor 104 report a touch at substantially the same position, the input module 108 categorizes the touch as a finger. At 1108, when the capacitive touch sensor 902 reports no touch and the IFSR touch sensor 104 reports a touch at the position, the input module 108 categorizes the touch as a non-finger. For example, a plastic stylus may not capacitively couple strongly enough to the capacitive touch sensor 902 to generate an input signal. However, the same plastic stylus may be pushing such that a resistive signal is generated with the IFSR touch sensor 104.
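A sketch of this comparison follows. The position tolerance standing in for “substantially the same position” is a hypothetical parameter, as is the shared coordinate convention.

    def categorize_by_agreement(cap_touches, ifsr_touches, tolerance=2.0):
        """Compare per-frame touch reports from the capacitive sensor
        902 and the IFSR array 104 per steps 1106-1108: an IFSR touch
        with a capacitive counterpart nearby is a finger; one without
        is a non-finger, such as a plastic stylus. Positions are (x, y)
        in shared sensor units; `tolerance` is an illustrative
        assumption."""
        results = []
        for ix, iy in ifsr_touches:
            near = any((ix - cx) ** 2 + (iy - cy) ** 2 <= tolerance ** 2
                       for cx, cy in cap_touches)
            results.append(((ix, iy), "finger" if near else "non-finger"))
        return results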
Active Stylus
The active stylus 1200 may comprise a stylus tip 1202 suitable for contact with the touch sensor 104, coupled to or integral with a stylus body 1204. The stylus body 1204 encloses a transducer 1206 configured to receive, transmit, or receive and transmit electromagnetic or electric signals. For example, the transducer may be configured to generate a radio frequency signal, an electric field from a capacitor, and so forth.
Coupled to the transducer 1206, a power and control module 1208 may contain a processor, memory, and other circuitry suitable for performing the functions of the active stylus 1200 as described herein. A battery, capacitor, or other storage device may provide power; alternatively, inductive or capacitive coupling with the device 100 may provide the electrical power to operate the active stylus 1200.
An orientation/motion sensor module 1210 may be present within the active stylus 1200. This module may include one or more accelerometers, gyroscopes, magnetic field sensors, gravimeters, and so forth. These sensors may be used to determine a position and orientation of the stylus relative to the touch sensor 104. In some implementations, the orientation may be determined independent of the touch sensor 104, such as by a gravimeter.
A communication module 1212 couples to the power and control module 1208 and allows the active stylus 1200 to communicate with the device 100. The communication module 1212 may exchange data with the device 100 via optical, acoustic, electromagnetic, hardwired, or other means. In some implementations the communication module 1212 may utilize the Bluetooth standard.
In one implementation, a signal may be generated by the touch sensor 104, which is then received by the active stylus 1200. Position of the active stylus may be determined by comparing the received signal with a known scan pattern, timing, modulation of the signal, and so forth.
In another implementation, the active stylus 1200 may generate the signal which is received by the IFSR array 104. The active stylus 1200 is configured to emit a signal which is received at one or more junctions within the IFSR array 104. By analyzing which junctions have received the signal, along with signal characteristics such as strength, the touch sensor controller 214 may determine the position of the active stylus 1200 relative to the touch sensor 104.
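For illustration, a strength-weighted centroid over the junctions that detected the stylus signal, analogous to the resistive interpolation described earlier, might be computed as follows. The function name and units are hypothetical.

    def stylus_position(strengths):
        """Interpolate the active stylus 1200 position from the signal
        strength received at each (row, column) junction. Sketch only;
        coordinates are in junction-index units."""
        total = sum(strengths.values())
        if total == 0:
            return None  # stylus signal not detected
        row = sum(r * s for (r, _), s in strengths.items()) / total
        col = sum(c * s for (_, c), s in strengths.items()) / total
        return row, col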
Regardless of whether the active stylus 1200 or the touch sensor 104 generates the signal, the IFSR array 104 may continue to operate in resistive, capacitive, or both modes while using the active stylus 1200. Thus the various capabilities of the touch sensor 104 to sense near touches, sense light touches, and characterize touches remain intact.
Touches associated with the active stylus 1200 may be interpolated in a similar fashion as the resistive and capacitive touches. This interpolation allows the touch sensor 104 to be scanned at a relatively low resolution while still maintaining tracking ability. Higher resolution may be required when tracking multiple touches close to one another.
At 1404, the active stylus 1200 determines the position of the stylus relative to the touch sensor array. For example, the signal generated by the touch sensor 104 may be modulated to indicate which intersection is active at a given moment.
At 1406, the active stylus 1200 may transmit to the device 100 the position of the active stylus 1200 for use by the input module 108. In some implementations, the active stylus 1200 may also transmit orientation information.
At 1504, the touch controller 214 receives from the active stylus 1200 an indication that the signal was received at the particular time. For example, a simple binary signal indicating the touch sensor's signal was detected is transmitted from the active stylus 1200 to the touch controller 214. The magnitude of the signal strength detected may also be sent in some implementations, and used by the touch controller 214 to interpolate position.
At 1506, based at least in part upon the indication, the touch controller 214 generates a position corresponding to the active stylus 1200. Unlike the method of FIG. 14, in this implementation the touch controller 214, rather than the active stylus 1200, determines the position.
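One plausible controller-side sketch follows: given the known scan schedule, the time at which the stylus reported detecting the signal identifies the junction that was being driven at that moment. The schedule format is a hypothetical illustration.

    def position_from_indication(indication_time_us, scan_schedule):
        """Map the stylus's detection time back to a junction (steps
        1504-1506). `scan_schedule` is a list of (start_us, end_us,
        row, col) entries describing which junction is driven when;
        here the controller, not the stylus, computes the position."""
        for start_us, end_us, row, col in scan_schedule:
            if start_us <= indication_time_us < end_us:
                return (row, col)
        return None  # indication fell outside the scan window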
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the claims. For example, the methodological acts need not be performed in the order or combinations described herein, and may be performed in any combination of one or more acts.
The present application claims priority to U.S. Provisional Application Ser. No. 61/230,592, filed on Jul. 31, 2009, entitled “Inventions Related to Touch Screen Technology” and U.S. application Ser. No. 12/380,350, filed on Feb. 26, 2009, entitled “Method and apparatus for providing input to a processor, and a sensor pad.” These pending applications are hereby incorporated by reference in their entirety, and the benefit of the filing dates of these pending applications is claimed to the fullest extent permitted.