Electronic devices that accept input from users are ubiquitous, and include cellular phones, eBook readers, tablet computers, desktop computers, portable media devices, and so forth. Increasingly, users desire these devices to accept input without the use of traditional keyboards or mice.
The detailed description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items.
Overview
Described herein are devices and techniques for distinguishing and characterizing touches upon a touch sensor. The touch sensor generates output corresponding to one or more touches at points on the touch sensor. The output from the touch sensor may be used to generate a touch profile. Touch profiles may comprise several characteristics such as area, magnitude of applied force, linear force distribution, temporal force distribution, location or distribution of the force, variation over time, duration, and so forth.
With sufficient numbers of sensing elements, or “sensels,” and resulting data, the touch profiles allow for distinction between different objects and characterization of those objects. For example, with sufficient force per unit area resolution, the relatively narrow tip of a stylus may be distinguished from a relatively blunt fingertip, even where the two exert the same overall force on the touch sensor.
Force sensing touch sensors also allow distinguishing intentional and unintentional touches by assessing the touch profile. For example, the linear force distribution of the touch profile of a user resting a portion of their hand on the touch sensor differs from a finger pressing a particular spot on the touch sensor.
Where an object contacts the force sensitive touch sensor at a sensel amidst a plurality of sensels, or moves over a sensel, the resulting touch profile allows the shapes of different objects, or at least of the portions in contact with the touch sensor, to be distinguished. For example, the relatively narrow tip of the stylus produces an abrupt spike in both the spatial and the temporal force distributions of the touch profile as compared to the relatively blunt tip of the finger.
Characterization of prior touches may be used to characterize future touches within a specific area on the touch sensor. For example, within an area in which the input was determined to be from a stylus, subsequent input in that area may be biased in favor of being characterized as from a stylus. This biasing of additional touches made within the area may vary over time. In some implementations, the biasing may decrease over a pre-determined period of time. This persistence of characterization allows rapid characterization of subsequent touches and improves responsiveness to the user. In one example, such persistence allows for easy entry of diacritical marks while the user is handwriting on the touch sensor. By persisting the characterization of a stylus as a stylus, the user is able to enter the dot atop a lowercase “i” or the horizontal bar through the body of a lowercase “t” without modifying writing style or other input behaviors.
To further facilitate characterization of touches, larger portions or regions of the touch sensor may be designated. Touches within these regions are biased more heavily in favor of a particular characterization associated with the region. Regions may be determined based on history of user input, orientation of content on a display in relation to the touch sensor, and so forth. For example, a perimeter or border of the touch sensor may be designated a first region. Within this first region, touches are most likely finger touches and may further be biased to be considered holding touches or associated with a particular subset of gestures. Meanwhile, touches in a second region which is surrounded by the first region may be characterized as most likely stylus touches associated with controls or handwriting.
While using a force sensitive touch sensor, the magnitude of the force may be a factor for generating output. For example, a harder touch may result in a change to the output. However, maintaining a constant force while the touch moves is difficult for some users due to varying geometries, biomechanical factors, and so forth. As a result, to avoid erroneous inputs during motion of a touch on the force sensitive touch sensor, the input may be modified. In some instances, this modification may include de-rating or de-emphasizing the force input. Thus, the user may have to push harder while the touch is moving to effect the same change compared to a lesser push while the touch is stationary. In some instances, force input may be disregarded altogether during motion of the touch. In some implementations, an overall touch detection threshold may be decreased during movement, improving tracking of the touch while it is in motion.
Touch sensors are used in a variety of devices ranging from handheld e-book reader devices to graphics tablets on desktop computers. Users interact with the devices in a variety of ways and in many different physical environments and orientations. Part of the user's palm may rest on the touch sensor, or the user may grip a portion of the touch sensor to hold the device, such as in a portable device where the surface area of the touch sensor predominates. These holding touches may be detected and prevented from generating erroneous output.
The holding touches may be recognized by their distinctive touch profile. In one implementation, a touch which maintains about the same contact area for a pre-determined time interval and exerts a relatively constant force may be categorized as a holding touch. Once characterized, input received from that area may be de-emphasized or disregarded while the holding touch is in progress. Once the holding touch releases, input from the area of the holding touch may be re-emphasized. In some implementations, this re-emphasis may occur over a pre-determined time interval. For example, a user may be holding a touch sensor and generating a holding touch. By re-emphasizing over a period of time, the user may shift their hands for a moment, and reassert the holding touch without generating erroneous input. Additionally, a history of touch profiles for touches determined to be holding touches or which result from a training phase may also be used to distinguish the holding touches.
Reducing power consumption in electronic devices offers several benefits such as extending battery life in portable devices, thermal management, and so forth. Touch sensors consume power while operational. In many implementations, touch sensors and/or associated components may be operated with different parameters. These parameters may include scan rate, scan area, and so forth. These parameters may directly affect the amount of power consumed. For example, when maximum responsiveness to user input is called for, the scan rate and scan resolution may be configured accordingly, resulting in significant power consumption. In many devices, users seldom constantly input data and seldom use all available areas on the touch sensor simultaneously. Thus, it is possible to alter parameters to reduce power consumption.
By monitoring the touch profiles and comparing changes to one or more pre-determined thresholds, it is possible to dynamically change the operational mode of the touch sensor and/or associated components. When a touch is determined to be a quiescent touch, or no touches are present, the touch sensor and/or associated components such as a touch controller may be placed into a lower power mode. For example, consider an eBook reader device with a touch sensitive screen. While the user is reading, the rate of touch input drops because the user is focused on reading. Holding touches may remain, but no active touches are present. The touch sensor may then be placed into a lower power mode.
In one implementation, this low power mode may comprise a decrease in the scan rate of the touch sensor, a change in which parts of the touch sensor are being scanned, an alteration in scan resolution, and so forth. When the user starts to touch the screen, such as to handwrite a comment or select a command, the touch sensor resumes a more wakeful mode, such as a normal power mode.
Furthermore, there may be different operational modes depending upon the characterization of the touch. For example, where the touches comprise handwriting input via the stylus, the touch sensor may be configured to operate at the highest scan rate and best resolution to capture the detailed input from the user. Where the input is less precise, such as selection of user controls or finger based input, the scan rate and resolution may be set to an intermediate value configured to provide adequate input.
In addition to fingers, users may wish to use implements such as styli, pucks, tokens, and so forth in conjunction with touch and near-touch sensors. These implements may incorporate magnets suitable to work in conjunction with the magnetic field sensors. For ease of description, these implements are referred to as “magnetic implements.” However, it is understood that the implement incorporates at least one magnet, but need not be entirely magnetic. These implements thus incorporate a magnet which, when used in conjunction with magnetic field sensors coupled to the device, provides for additional functionality and features.
The magnetic field sensors, such as a magnetometer, allow for the detection and characterization of an impinging magnetic field. For example, a magnetometer may allow for determining a field strength, angular bearing, polarity of the magnetic field, and so forth. In some implementations, the magnetometer may comprise a Hall-effect device. Magnetic fields, particularly in the environment within which electronic devices operate, are predictable and well understood. As a result, it becomes possible to use one or more magnetometers to determine presence and in some implementations the position, orientation, and so forth of the magnetic implement.
Touches are distinguishable based on the presence or absence of the magnetic field. For example, when no magnetic field meeting pre-defined criteria is present, a touch may be determined to be a finger touch; when a magnetic field having the pre-defined criteria is present, the touch may be determined to be from a magnetic implement such as a stylus. In another example, which end of a stylus is touching the touch sensor is distinguishable, independent of the touch profile, based on the polarity of the magnetic field detected. These pre-defined criteria of the magnetic field may include field strength, direction, and so forth.
Additionally, by using the position information of magnetic implements, near-touch sensing is possible. For example, movement of the stylus or other magnetic implement proximate to the magnetometer but not in contact with the touch sensor may still provide input.
Illustrative Device
Within or coupled to the device, an input module 106 accepts input from the touch sensor 102 and other sensors. For example, a user touch 108 on the touch sensor 102 is depicted here with a broken line. Also depicted is a stylus 110 having two opposing terminal structures, a stylus tip 112 and a stylus end 114. The stylus tip 112 is shown in contact with the touch sensor 102 as indicated by the stylus touch 116.
Returning to the sensors within the device 100, one or more magnetometers 118 are accessible to the input module 106. These magnetometers are configured to detect and in some implementations characterize impinging magnetic fields. One or more orientation sensors 120 such as accelerometers, gravimeters, and so forth may also be present. These sensors are discussed in more detail next with regards to
An image processing unit 206 is shown coupled to one or more display components 104 (or “displays”). In some implementations, multiple displays may be present and coupled to the image processing unit 206. These multiple displays may be located in the same or different enclosures or panels. Furthermore, one or more image processing units 206 may couple to the multiple displays.
The display 104 may present content in a human-readable format to a user. The display 104 may be reflective, emissive, or a combination of both. Reflective displays utilize incident light and include electrophoretic displays, interferometric modulator displays, cholesteric displays, and so forth. Emissive displays do not rely on incident light and, instead, emit light. Emissive displays include backlit liquid crystal displays, time multiplexed optical shutter displays, light emitting diode displays, and so forth. When multiple displays are present, these displays may be of the same or different types. For example, one display may be an electrophoretic display while another may be a liquid crystal display.
For convenience only, the display 104 is shown in
The content presented on the display 104 may take the form of electronic books or “eBooks.” For example, the display 104 may depict the text of the eBooks and also any illustrations, tables, or graphic elements that might be contained in the eBooks. The terms “book” and/or “eBook”, as used herein, include electronic or digital representations of printed works, as well as digital content that may include text, multimedia, hypertext, and/or hypermedia. Examples of printed and/or digital works include, but are not limited to, books, magazines, newspapers, periodicals, journals, reference materials, telephone books, textbooks, anthologies, instruction manuals, proceedings of meetings, forms, directories, maps, web pages, and so forth. Accordingly, the terms “book” and/or “eBook” may include any readable or viewable content that is in electronic or digital form.
The device 100 may have an input device controller 208 configured to accept input from a keypad, keyboard, or other user actuable controls 210. These user actuable controls 210 may have dedicated or assignable operations. For instance, the actuable controls may include page turning buttons, navigational keys, a power on/off button, selection keys, a joystick, a touchpad, and so on.
The device 100 may also include a USB host controller 212. The USB host controller 212 manages communications between devices attached to a universal serial bus (“USB”) and the processor 202 and other peripherals.
The touch sensor 102 may comprise various technologies including interpolating force-sensing resistance (IFSR) sensors, capacitive, magnetic, force sensitive resistors, acoustic, optical, and so forth. The touch sensor 102 may be configured such that user input through contact or gesturing relative to the device 100 may be received.
The touch sensor controller 214 is configured to determine characteristics of interaction with the touch sensor. These characteristics may include the location of the touch on the touch sensor, magnitude of the force, shape of the touch, and so forth. In some implementations, the touch sensor controller 214 may provide some or all of the functionality provided by the input module 106, described below.
The magnetometer 118 may be coupled to the USB host controller 212, or another interface. The magnetometer 118 allows for the detection and characterization of an impinging magnetic field. For example, the magnetometer 118 may be configured to determine a field strength, angular bearing, polarity of the magnetic field, and so forth. In some implementations, the magnetometer may comprise a Hall-effect device. Magnetic fields, particularly in the environment within which electronic devices operate, are predictable and well understood. As a result, it becomes possible to use one or more magnetometers to determine presence and in some implementations the position, orientation, and so forth of the magnetic implement. A plurality of magnetometers 118 may be used in some implementations.
One or more orientation sensors 120 may also be coupled to the USB host controller 212, or another interface. The orientation sensors 120 may include accelerometers, gravimeters, gyroscopes, proximity sensors, and so forth. Data from the orientation sensors 120 may be used at least in part to determine the orientation of the user relative to the device 100. Once an orientation is determined, input received by the device may be adjusted to account for the user's position. For example, as discussed below with regards to
The USB host controller 212 may also couple to a wireless module 216 via the universal serial bus. The wireless module 216 may allow for connection to wireless local or wireless wide area networks (“WWAN”). Wireless module 216 may include a modem 218 configured to send and receive data wirelessly and one or more antennas 220 suitable for propagating a wireless signal. In other implementations, the device 100 may include a wired network interface.
The device 100 may also include an external memory interface (“EMI”) 222 coupled to external memory 224. The EMI 222 manages access to data stored in external memory 224. The external memory 224 may comprise Static Random Access Memory (“SRAM”), Pseudostatic Random Access Memory (“PSRAM”), Synchronous Dynamic Random Access Memory (“SDRAM”), Double Data Rate SDRAM (“DDR”), Phase-Change RAM (“PCRAM”), or other computer-readable storage media.
The external memory 224 may store an operating system 226 comprising a kernel 228 operatively coupled to one or more device drivers 230. The device drivers 230 are also operatively coupled to peripherals 204, such as the touch sensor controller 214. The external memory 224 may also store data 232, which may comprise content objects for consumption on eBook reader device 100, executable programs, databases, user settings, configuration files, device status, and so forth. Executable instructions comprising an input module 106 may also be stored in the memory 224. The input module 106 is configured to receive data from the touch sensor controller 214 and generate input strings or commands. In some implementations, the touch sensor controller 214, the operating system 226, the kernel 228, one or more of the device drivers 230, and so forth, may perform some or all of the functions of the input module 106.
One or more batteries 236 provide operational electrical power to components of the device 100 for operation when the device is disconnected from an external power supply. The device 100 may also include one or more other, non-illustrated peripherals, such as a hard drive using magnetic, optical, or solid state storage to store information, a firewire bus, a Bluetooth™ wireless network interface, camera, global positioning system, PC Card component, and so forth.
Couplings, such as that between the touch sensor controller 214 and the USB host controller 212, are shown for emphasis. There are couplings between many of the components illustrated in
Illustrative Hand
Touch Profiles
The touch sensor 102 generates output corresponding to one or more touches at points on the touch sensor 102. The output from the touch sensor may be used to generate a touch profile which describes the touch. Touch profiles may comprise several characteristics such as shape of touch, linear force distribution, temporal force distribution, area of the touch, magnitude of applied force, location or distribution of the force, variation over time, duration, and so forth. The characteristics present within touch profiles may vary depending upon the output available from the touch sensor 102. For example, a touch profile generated by a projected capacitance touch sensor may have shape of touch and duration information, while a touch profile generated by an IFSR sensor may additionally supply force distribution information.
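As a concrete illustration, a touch profile might be represented by a data structure such as the following Python sketch. The TouchProfile container and its field names are illustrative assumptions, not a format prescribed by this description.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TouchProfile:
    """Illustrative container for the touch characteristics described above."""
    contact_area_mm2: float                   # area of the touch
    peak_force: float                         # magnitude of applied force
    centroid: Tuple[float, float]             # location of the force
    linear_force_distribution: List[float]    # force across sensels along one axis
    temporal_force_distribution: List[float]  # force sampled over time
    duration_ms: float                        # how long the touch has lasted
```

A profile from a projected capacitance sensor might leave the force fields unpopulated, while a profile from an IFSR sensor could fill all of them.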
The touch profiles may comprise the contact areas 400. As shown here, the stylus tip 112 when in contact with the touch sensor 102 generates a very small contact area which is roughly circular, while the stylus end 114 generates a larger roughly circular area. A contact area associated with one of the finger pads 314 is shown which is larger still and generally oblong.
Should the user's palm 302 come in contact with the touch sensor 102, the contact areas of the metacarpophalangeal joints 316, the hypothenar eminence 318, and the thenar eminence 320 may produce contact areas as shown. Other portions of the hand (omitted for clarity, and not by way of limitation) may come in contact with the touch sensor 102 during normal use. For example, when the user manipulates the stylus 110 to write on the touch sensor 102, the user may rest the hand which holds the stylus 110 on the touch sensor, resulting in sensing of the edge of the hypothenar eminence 319.
By monitoring the touches to the touch sensor 102 and building touch profiles, it becomes possible to dynamically adjust a user interface. For example, when the touch profile indicates small fingers such as found in a child, the user interface may automatically adjust to provide a simpler set of commands, reduce force thresholds to activate commands, and so forth.
Even when objects are distinguished, the objects themselves may produce intentional or unintentional touches. For example, the user may rest a thumb 312 or stylus on the touch sensor 102 without intending to initiate a command or enter data. It is thus worthwhile to distinguish intentional and unintentional touches to prevent erroneous input.
In this figure, time 602 is shown along an X axis while the magnitude of applied force 604 is shown along a Y axis. A pre-determined threshold of force F1 may be set such that when a touch exceeds the pre-determined threshold of force, an action is initiated. For example, the action may comprise changing an eBook page on the display 104.
While the pre-determined threshold of force allows for some distinction between intentional and unintentional touches, unintentional activation by exceeding the threshold may still occur. This figure depicts the temporal force distributions of an intentional touch 606 and an unintentional touch 608. Both of these touches may have applied forces which exceed threshold F1 and thus could result in initiation of an action. However, activation from the unintentional touch 608, such as from a user grasping the touch sensor 102, is undesirable.
By analyzing the temporal force distributions 606 and 608, it is possible to determine when a touch is intentional or unintentional. For example, as illustrated here, the intentional touch 606 exhibits a sharp increase in force at about time T1, lasting until about time T2, and then experiences a decline over time. In contrast, the unintentional touch 608 ramps up at T1, then remains at about the same magnitude past time T3. Thus, the intentional touch is clearly distinguished by its distinctive temporal force distribution.
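By way of illustration, the following sketch classifies a temporal force distribution along these lines: a force that declines after its peak is treated as intentional, while one that ramps up and holds near its peak is treated as unintentional. The plateau tolerance and the test itself are assumptions for illustration, not a prescribed method.

```python
def classify_temporal_profile(forces, threshold_f1, plateau_tolerance=0.1):
    """Label a temporal force distribution as intentional or unintentional.

    forces: force samples over time; threshold_f1: pre-determined activation
    force. The plateau_tolerance value is an illustrative assumption.
    """
    peak = max(forces)
    if peak < threshold_f1:
        return "below_threshold"  # never exceeds F1, so no action is initiated
    tail = forces[forces.index(peak):]
    mean_tail = sum(tail) / len(tail)
    # An intentional touch declines after its peak (606); an unintentional
    # touch, such as a grasp, stays near its peak magnitude (608).
    if mean_tail < (1 - plateau_tolerance) * peak:
        return "intentional"
    return "unintentional"
```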
At 702, the input module 106 detects a touch 108 on the touch sensor 102. At 704, the input module 106 determines a touch profile of the touch on the touch sensor. As described above, this touch profile may include a linear force distribution, a temporal force distribution, an area, a duration, or other characteristics. At 706, the input module 106 characterizes the touch based at least in part upon the touch profile. For example, a touch having a linear force distribution with a contact area below a pre-determined threshold and steep sides may be characterized as the stylus tip 112.
Interaction with the touch sensor 102 provides additional details in the touch profile which are suitable for distinguishing touches.
At 806, the stylus tip 112 is closest to the sensel C3. As a result, a corresponding peak in the applied force at C3 with a measured force of 9 is shown, flanked by forces of 1 at sensels C2 and C4. Because of the small contact area and the concentrated force of the stylus tip, the force is detectable at three of the sensels shown and reflects a well defined peak of force across those sensels. As the stylus tip 112 moves along, similar to above with regards to 802, the sensels at C3 and C4 each measure a force of 5, while C2 and C5 (not shown) are zero. This distribution of forces may be contrasted with the scenario in the next figure involving the blunter finger of the user.
At 904, the finger is closest to the sensel C3 and a corresponding peak in the applied force measurements along the sensels is apparent. The magnitude of the applied force 804 peaks at C3 with a force of 8, while the adjacent sensels C2 and C4 report a force of 5. The more distant sensels C1 and C5 (not shown) report no force. This distribution in force generated by the finger is thus readily distinguishable from the peak generated by the stylus tip 112. At least a portion of this distinction arises from a comparison of a ratio of force applied and distance between sensels. As a result, it is possible to distinguish between the objects based at least in part upon their interaction with the sensels. At 906, as the finger moves away from the sensel closest to C3 to the midpoint between C3 and C4, it demonstrates a force profile similar to that discussed above at 902.
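One way to capture this distinction in code is to compare the peak force against the forces reported by the neighboring sensels, as sketched below. The sharpness threshold is an illustrative assumption.

```python
def classify_peak(sensel_forces, sharpness_threshold=3.0):
    """Distinguish a narrow stylus tip from a blunt finger by peak sharpness.

    sensel_forces: forces along one row or column of sensels, e.g.,
    [0, 1, 9, 1, 0] for the stylus tip and [0, 5, 8, 5, 0] for the finger
    in the scenarios above. The threshold value is an assumption.
    """
    peak = max(sensel_forces)
    i = sensel_forces.index(peak)
    neighbors = [f for f in sensel_forces[max(i - 1, 0):i] + sensel_forces[i + 1:i + 2] if f > 0]
    if not neighbors:
        return "stylus_tip"  # force confined to a single sensel
    sharpness = peak / (sum(neighbors) / len(neighbors))
    return "stylus_tip" if sharpness >= sharpness_threshold else "finger"
```

With the example values, the stylus tip yields a sharpness of 9 and the finger a sharpness of 1.6, placing them on opposite sides of the threshold.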
The above scenario of
At 1008, the input module 106 assesses the reliability metric for the characterization. When a reliability threshold of characterization has been reached, the process proceeds to 1010. At 1010, the input module 106 continues the characterization so long as the touch maintains a minimum pre-determined force. When the reliability threshold of characterization has not been reached, at 1008 the process returns to 1006 and continues to characterize and re-characterize the touch.
For example, where the touch profile indicates a force spike consistent with the stylus tip 112, the input module 106 maintains the designation of the touch as being a stylus input while the touch moves around on the touch sensor. In some implementations, this may allow the input module 106 to cease the process of characterizing the ongoing touches, resulting in reduced computational overhead, reduced power consumption, and so forth.
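A minimal sketch of this loop follows, assuming a hypothetical characterize() function that returns a label and a reliability score; the threshold values are likewise assumptions.

```python
def track_touch(touch_stream, characterize,
                reliability_threshold=0.9, min_force=0.5):
    """Re-characterize a touch until the reliability threshold is met, then
    persist the label while the touch maintains a minimum force.

    touch_stream yields successive touch profiles; characterize() is a
    hypothetical stand-in for the input module's classification logic.
    """
    label, reliability = None, 0.0
    for profile in touch_stream:
        if reliability < reliability_threshold:
            # Reliability not yet reached: characterize and re-characterize.
            label, reliability = characterize(profile)
        elif profile.peak_force < min_force:
            # Minimum pre-determined force no longer maintained: start over.
            label, reliability = None, 0.0
        yield label
```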
Managing Touches
The input module 106 may characterize touches in order to more readily determine what input is intended by the user. Touches may be characterized as finger touches, stylus tip touches, stylus end touches, intentional touches, unintentional touches, holding touches, and so forth. Where a touch, or a series of touches, is ongoing, such as during entry of handwriting, it is advantageous to maintain some persistence as to the characterization between touches which are adjacent in space, time, or both. Advantages include minimizing the need to computationally re-characterize a touch, the ability to avoid calculation of contact area centroids, and so forth.
Furthermore, persistence of the characterization improves responsiveness and ease of use for the user. In the following examples, assume the user is entering text with the stylus 110 on the touch sensor 102. This text contains various letters, as well as several diacritical marks such as the dots atop several lowercase “i” characters and horizontal bars through lowercase “t” characters.
Proximate to the stylus tip 112 is a high stylus bias area 1102. Touches within this area are assigned a high bias as being deemed touches by the stylus tip 112. Within this area, touches may not be continuously re-characterized. This minimizes the processing requirements for characterization, while also improving the user interface experience by providing persistence for the entry of additional characters, diacritical marks, and so forth. For example, as shown here, the user may draw the vertical bodies of three exclamation points and follow up with placing the corresponding dots below the body of each exclamation point without manually actuating any further command.
The area of persistence may be envisioned in some implementations as a trailing area which fades over time. As shown here, a medium stylus bias area 1104 covers a first portion of the word “alliteration.” Within this area 1104, touches may be assigned a medium bias, indicating that they are likely to be stylus touches. Similarly, in an older portion of entered text, a low stylus bias area 1106 is present. Touches within this area are given a lesser bias. As time continues, the persistence continues to fade and the remaining areas are assigned no stylus bias 1108. Within these areas, the input module 106 again characterizes the touches without regard to prior activity. In some implementations, the biasing may fade based on distance from a pre-determined point, rather than time.
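One possible realization of this fading bias is sketched below; the time windows and bias values are illustrative assumptions.

```python
import time

def stylus_bias(last_stylus_input_time, now=None):
    """Return a bias in favor of characterizing a touch as a stylus touch,
    fading with time since the last input characterized as a stylus.
    The windows and bias strengths are illustrative assumptions.
    """
    now = time.monotonic() if now is None else now
    elapsed = now - last_stylus_input_time
    if elapsed < 2.0:
        return 0.9   # e.g., the high stylus bias area 1102
    if elapsed < 5.0:
        return 0.5   # e.g., the medium stylus bias area 1104
    if elapsed < 10.0:
        return 0.2   # e.g., the low stylus bias area 1106
    return 0.0       # no stylus bias 1108; characterize from scratch
```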
At 1208, the input module characterizes additional touches made within the area based at least in part upon the initial characterization. In some implementations, this characterization may comprise applying a biasing factor to the additional touches. For example, the touches made to place the dots under the exclamation point bodies within the high stylus bias area 1102 are characterized as stylus touches.
At 1210, the input module 106 decreases the bias over a pre-determined period of time given to touches within the area. For example, if the user of
To further facilitate characterization of touches, larger portions or regions of the touch sensor may be designated.
A second region 1304 is depicted in this illustration within the first region, shown here as a central area of the touch sensor 102. Similar to the areas above with respect to
Regions may be set based upon a variety of user and environmental factors. In one implementation, the input module 106 may adapt to usage patterns. For example, the input module 106 may learn that the user is likely to hold the device on the left side, and designate a corresponding region to bias touches within that region as unintentional touches. In another implementation, the regions may be designated based upon the orientation or nature of content presented by the device 100. For example, where an eBook is presented in landscape mode on the display 104, regions on the top and bottom of the touch sensor 102, where a user would hold the device 100 in landscape mode, may be designated as likely holding touch regions. Data from other sensors such as the orientation sensors 120 may also be used to determine the placement, size, shape, and so forth of the regions.
Biasing may vary from region to region. For example, touches within the first region may be biased heavily as being holding touches, while touches within the second region may be lightly biased as being stylus touches. Some regions may be configured with zero biasing, indicating each touch is to be characterized. The regions may manifest as other shapes: regions may be circular, elliptical, rectangular (as shown), or may be defined by an irregular or non-geometric boundary. In one implementation, a two-dimensional array of numbers describing the strength of the bias across the surface of the touch sensor 102 may be used. In some implementations, bias values for positions between locations of the array may also be interpolated.
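Such an interpolation might look like the following sketch, in which bias_grid holds bias strengths at regularly spaced positions across the sensor; the grid layout and coordinate convention are assumptions for illustration.

```python
import math

def bias_at(bias_grid, x, y):
    """Bilinearly interpolate a bias value from a two-dimensional array of
    bias strengths. bias_grid[row][col] holds the bias at integer grid
    positions; x and y are given in grid coordinates.
    """
    x0, y0 = int(math.floor(x)), int(math.floor(y))
    x1 = min(x0 + 1, len(bias_grid[0]) - 1)
    y1 = min(y0 + 1, len(bias_grid) - 1)
    fx, fy = x - x0, y - y0
    top = bias_grid[y0][x0] * (1 - fx) + bias_grid[y0][x1] * fx
    bottom = bias_grid[y1][x0] * (1 - fx) + bias_grid[y1][x1] * fx
    return top * (1 - fy) + bottom * fy
```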
At 1502, the input module 106 detects a touch on the touch sensor 102. At 1504, the input module 106 detects a motion of the touch on the touch sensor 102.
At 1506, the input module 106 modifies force input from the user which is associated with the touch while the touch is in motion. This modification may include de-rating or de-emphasizing the force input. Thus, a pre-determined command activation threshold force necessary to initiate a command may be increased during movement such that the user is required to push harder to activate the command. Likewise, when the touch is stationary the pre-determined command activation threshold force may be decreased, allowing for a lesser push to activate the command. In some instances force input may be disregarded altogether. This modification thus reduces the likelihood of erroneous input due to variations in the applied force during a moving touch.
In some implementations, an overall touch detection threshold may be decreased during movement, allowing a lighter or varying pressure touch to be continuously tracked while in motion. For example, the overall touch detection threshold may require a touch to apply a pressure of 20 grams or more to be considered a touch. Once the touch is recognized and the touch begins to move, the overall touch detection threshold may be reduced during movement, such as to 10 grams in this example. This temporary reduction improves tracking of a moving touch where applied force may vary due to changes in orientation, biomechanics, and so forth. Once motion stops, the overall touch detection threshold may resume the previous value, in this example 20 grams.
At 1508, the input module 106 detects cessation of motion of the touch on the touch sensor 102. At 1510, the input module 106 removes the modification of the touch input from the touch sensor. With the motion complete, force input again becomes fully available to receive user inputs.
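Both threshold adjustments may be summarized in a sketch such as the following, reusing the 20 gram and 10 gram detection values from the example above; the command activation values are additional assumptions.

```python
def thresholds(touch_in_motion,
               detect_static_g=20.0, detect_moving_g=10.0,
               command_static_g=50.0, command_moving_g=80.0):
    """Return (touch detection threshold, command activation threshold).

    During motion, the detection threshold is lowered to keep tracking a
    lighter touch, while the command activation threshold is raised so
    that force input is de-emphasized. The command values are assumptions.
    """
    if touch_in_motion:
        return detect_moving_g, command_moving_g
    return detect_static_g, command_static_g
```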
At 1602, the input module 106 determines a touch profile of a touch on the touch sensor 102. At 1604, the input module 106 determines the touch profile corresponds to a holding touch. In one implementation, this determination may comprise assessing whether the touch exceeds a pre-determined time interval and whether the temporal force distribution is relatively stable. A holding touch may be characterized by the touch profile indicating that the application of force to an area on the touch sensor remains within a pre-determined force boundary for a pre-determined time. For example, a touch of more than 10 seconds which maintains a force between 2 newtons and 3 newtons is considered stable. In other implementations, the pre-determined force boundary may comprise a percentage of variation over time. For example, the touch may be considered stable when the applied force is maintained within 5% of a moving average of the applied force.
At 1606, the input module 106 de-emphasizes input associated with the holding touch on the touch sensor. For example, touches within an area associated with the holding touch are disregarded.
At 1608, the input module 106 detects release of the holding touch. For example, the user may set down and release a grasp on the device. At 1610, the input module 106 re-emphasizes over a pre-determined time interval input associated with the area defined by the holding touch. This re-emphasis over the pre-determined time interval prevents erroneous input in situations such as where the user is holding the device, and momentarily releases and re-grasps the device.
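A sketch of the stability test using the example figures above (10 seconds, 2 to 3 newtons, 5% of a moving average) follows; the sampling arrangement and moving-average window size are assumptions.

```python
from collections import deque

def is_holding_touch(force_samples, sample_period_s,
                     min_duration_s=10.0, band=(2.0, 3.0), rel_tolerance=0.05):
    """Decide whether a series of force samples looks like a holding touch.

    The touch qualifies if it lasts at least min_duration_s and its force
    either stays within the absolute band or remains within rel_tolerance
    of a moving average, per the examples above.
    """
    if len(force_samples) * sample_period_s < min_duration_s:
        return False
    if all(band[0] <= f <= band[1] for f in force_samples):
        return True
    window = deque(maxlen=8)  # moving-average window size is an assumption
    for f in force_samples:
        window.append(f)
        average = sum(window) / len(window)
        if abs(f - average) > rel_tolerance * average:
            return False
    return True
```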
At 1702, the input module 106 determines a touch profile of a touch on the touch sensor 102. For example, the input module 106 may determine that a touch is a holding touch. Holding touches, unintentional touches, and so forth may be considered quiescent touches. In contrast, active touches include a moving stylus, active control inputs, and so forth.
At 1704, when the touch profile indicates a quiescent touch, or when no touches are present, the touch sensor 102 and/or associated components may be placed into a low power mode. For example, the low power mode may comprise a decrease in the scan rate of the sensor, changing which parts of the sensor are being scanned, altering scan resolution, and so forth.
At 1706, when the touch profile indicates an active touch, enter a normal power mode. For example, when the user starts to move a touch on the screen, such as to handwrite a comment or select a command, the touch sensor 102 resumes a more wakeful mode, such as the normal power mode.
Furthermore, there may be different operational modes depending upon the characterization of the touch or pre-determined settings. For example, where the touches comprise handwriting input via the stylus, the touch sensor may be configured to operate at the highest scan rate and best resolution to capture the detailed input from the user. Where the input is less precise, such as selection of user controls or finger based input, the scan rate and resolution may be set to an intermediate value configured to provide adequate input.
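The selection among these operational modes might be sketched as follows; the state names and scan rates are hypothetical values, not taken from this description.

```python
def select_scan_rate_hz(touch_state):
    """Map a characterized touch state to a touch sensor scan rate.
    All states and rates here are illustrative assumptions.
    """
    rates = {
        "stylus_handwriting": 240,  # highest scan rate, best resolution
        "finger_or_control": 120,   # intermediate value, adequate input
        "quiescent": 30,            # holding touch or no active touches
        "no_touch": 10,             # deepest low power scan
    }
    return rates.get(touch_state, 120)
```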
Magnetic Implements and Operation
In addition to fingers, users may wish to use implements such as styli, pucks, tokens, and so forth in conjunction with touch and near-touch sensors. These implements may incorporate magnets suitable to work in conjunction with the magnetic field sensors. For ease of description, these implements are referred to as “magnetic implements.” However, it is understood that the implement incorporates at least one magnet, but need not be entirely magnetic. These implements thus incorporate a magnet which, when used in conjunction with magnetic field sensors coupled to the device, provides for additional functionality and features.
The input module 106 may be configured to recognize which end of the stylus is in use, and modify input accordingly. For example, input determined to be from the stylus tip 112 may be configured to initiate a handwriting function on the device 100, while input determined to be from the stylus end 114 may be configured to highlight text. In other implementations, it may be recognized when the stylus 110 is placed flat relative to the touch sensor, such as when the stylus 110 lies upon the touch sensor 102. In this orientation, the input module 106 may be configured to wipe or erase contents on the display 104.
In some implementations, the permanent magnet 1802 may also be configured to hold the stylus 110 to the electronic device 100 or to an accessory such as a cover. Additionally, as described above, magnetic implements may embody a variety of form factors. For example, the magnetic implement may comprise a wrist band, a ring worn around a finger, a glove for a user's hand, a puck, and so forth. Furthermore, because the stylus itself contains no active parts, production cost is relatively low and reliability is significantly improved.
As shown in this illustration, the stylus 110 is positioned above the surface of the device 100. Shown at approximately the center of the device 100 is the magnetometer 118, which may be disposed beneath the display 104. In other implementations, the magnetometer 118 may reside in other locations within or adjacent to the device.
The magnetometer 118 senses the magnetic field 1804 generated by the permanent magnet 1802 within the stylus 110, and is configured to characterize the magnetic field. An angle θ1 is depicted describing an angle between a field line of the magnetic field 1804 and the Y axis of the device. A single angle θ1 is shown here for clarity, but it is understood that several angular comparisons may be made within the magnetometer 118. By analyzing the angular variation and utilizing known characteristics about the permanent magnet 1802, the device 100 is able to determine an angular bearing to the source. For example, assume that the magnetometer 118 is configured to read out in degrees, with the 12 o'clock position being 0 degrees and bearings increasing in a clockwise fashion. The device 100 may then determine that the stylus is located at an angular bearing of about 135 degrees relative to the magnetometer 118.
Furthermore, the magnetometer 118 may also determine a field strength measurement H1 as shown. When compared to a known source such as the permanent magnet 1802 within the stylus 110, it becomes possible to estimate distance to a magnetic field source based at least in part upon the field strength.
The input module 106 may also use data from the magnetometer 118 to determine a field orientation. The orientation of a magnetic field may be considered the determination of which end of the magnet is the North pole and which is the South pole. This field orientation may be used to disambiguate the angular bearing (for example, to determine that the bearing is 135 degrees and not 315 degrees), determine which end of the stylus 110 is proximate to the device, and so forth.
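These estimates might be sketched as follows. As simplifications, the sketch treats the in-plane field direction reported by the magnetometer as indicating the bearing to the source, and assumes the field of the known magnet falls off roughly with the cube of distance, as for a dipole; the reference values would come from a calibration such as the docking calibration described below.

```python
import math

def bearing_degrees(bx, by):
    """Angular bearing of the field source from in-plane field components,
    measured clockwise from the 12 o'clock position as in the example."""
    return math.degrees(math.atan2(bx, by)) % 360.0

def estimate_distance(field_strength, reference_strength, reference_distance):
    """Estimate distance to a known magnet by comparing the measured field
    strength (H1 in the figure) against a calibrated reference, assuming a
    dipole-like falloff proportional to the inverse cube of distance."""
    return reference_distance * (reference_strength / field_strength) ** (1.0 / 3.0)
```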
In some implementations, the input module 106 may provide a calibration routine whereby the user places the stylus in one or more known positions and/or orientations, and magnetometer 118 output is assessed. For example, the device 100 may be configured to calibrate field strength, position, and orientation information when the stylus 110 is docked with the device 100. This calibration may be useful to mitigate interference from other magnetic fields such as those generated by audio speakers, terrestrial magnetic field, adjacent electromagnetic sources, and so forth.
As described above, it is possible to determine an angular bearing of the magnetic field source, such as the permanent magnet 1802 within the stylus 110, relative to the magnetometer 118. In a similar fashion, it is possible, as shown here, to measure the angles of an impinging magnetic field 1804 to determine a tilt angle of the magnetic field source. Due to the closed loop nature of magnetic fields, which extend unbroken from a first pole to a second pole, better results are obtained from using longer magnets. For example, where the permanent magnet 1802 extends substantially along the stylus body 1812, better angular resolution is possible compared to a short magnet placed within the stylus tip 112. Also, a longer magnet reduces field flipping or ambiguity compared to a shorter magnet. Furthermore, distance to the object along the angular bearing may be determined by analyzing the strength of the magnetic field source at the magnetometer 118.
As shown here, the magnetic field 1804 impinges on the magnetometer 118 and angles θ2 and θ3 are described between the magnetic field lines 1804 and a defined reference plane. By comparing the field strength to estimate distance and by measuring the angles, it is thus possible to calculate a tilt angle of the stylus relative to the reference plane defined within the magnetometer 118, and thus the device 100. Additionally, as mentioned above, by determining the polarity of the magnetic field, it is possible to determine which end of the stylus is proximate to the device.
Additional magnetometers may be used to provide more accurate position information.
In addition to determining location based upon the angle of impinging magnetic fields, field strength H may be used to determine approximate location. For example, given the position of the stylus 110 and the corresponding permanent magnet 1802 adjacent to magnetometer 118(3), close to magnetometer 118(4), and most distant from magnetometer 118(1), the position of the magnetic field source may be triangulated based upon the field strengths.
Furthermore, as mentioned above, by observing the polarity of the magnetic field, it is possible to determine accurately which end of the stylus 110 is proximate to the device. This is particularly useful in situations where the touch sensor is not able to generate force data, such as with a projected capacitance touch sensor. By monitoring the magnetic field orientation, determination of whether a stylus tip 112 or a stylus end 114 is closest to the touch sensor is readily accomplished with a stylus having a permanent magnet within.
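A crude sketch of such a strength-based position estimate follows; it weights each magnetometer's location by the inverse of its estimated distance to the source, whereas a real implementation would more likely solve a least-squares problem. The tuple layout and the distance_from_strength helper are assumptions.

```python
def triangulate(magnetometers, distance_from_strength):
    """Rough two-dimensional position estimate of a magnetic field source.

    magnetometers: iterable of (x, y, field_strength) tuples, one per
    magnetometer 118(1)..118(4); distance_from_strength maps a measured
    field strength to an estimated distance, as sketched earlier.
    """
    total, wx, wy = 0.0, 0.0, 0.0
    for x, y, strength in magnetometers:
        w = 1.0 / distance_from_strength(strength)  # nearer sensors weigh more
        total += w
        wx += w * x
        wy += w * y
    return wx / total, wy / total
```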
At 2302, one or more magnetometers detect a magnetic field generated by a magnetic field source and generate data about the field. At 2304, an input module 106 determines a position of the magnetic field source based upon the data from the one or more magnetometers. For example, as described above with regards to
At 2306, output is modified at least in part based upon the position of the magnetic field source. As described above, the input generated by the magnetic field source may be near-touch. For example, the user may wave the magnetic stylus above the device 100 to initiate an action, such as changing a displayed page of an eBook. Or in another example, the tilt angle of the stylus may control how fast the display 104 scrolls pages.
At 2404, the input module 106 determines an angular bearing relative to the one or more magnetometers of a magnetic field source generating the magnetic field. For example, as described above the input module 106 may observe the angle with which the magnetic fields impinge upon the magnetometers and determine the angular bearing.
At 2406, a polarity or orientation of the magnetic field is determined. As described above, this orientation may allow for disambiguation of the angular bearing, provide information as to what part of the magnetic implement is proximate to the device, and so forth.
At 2408, a field strength of the magnetic field is determined at one or more of the magnetometers. At 2410, the input module 106 determines position and orientation of the magnetic field source based at least in part upon the angular bearing, the field strength, or both.
At 2412, the input module 106 receives input from the touch sensor 102 and calibrates the determination of the position of the magnetic field source. For example, when the stylus tip 112 of the magnetic stylus touches the touch sensor 102, the device 100 now has an accurate known location of the touch. This known location may be used to adjust the determination of the position via the magnetometers to improve accuracy.
At 2506, the input module 106 modifies input at least partly in response to the tilt angle. For example, when the user tilts the stylus at an extreme angle, the input may be configured to highlight text rather than enter text.
Depending upon the capabilities of the touch sensor 102, it may be difficult or impossible to characterize a touch. For example, when the touch sensor 102 does not support force sensing, use of a force distribution is unavailable to distinguish the tip of a stylus from the blunt end of the stylus. By using input from both the touch sensor 102 and the magnetometers 118, it becomes possible to more readily distinguish objects which are in contact with the touch sensor 102, particularly magnetic implements as compared to non-magnetic implements.
When at 2604 the input module 106 determines that a magnetic field is detected by the one or more magnetometers 118, the input module 106 may further compare position information. At 2608, when the position of the detected magnetic field corresponds to the location of the touch upon the touch sensor 102, the process continues to 2610. At 2610, the input module 106 categorizes the touch as a stylus (or magnetic implement) touch.
Returning to determination 2608, when the position of the detected magnetic field does not correspond to the location of the touch upon the touch sensor 102, the process continues to 2606, where the touch is categorized as a non-stylus (e.g., a finger).
At 2702, the input module 106 detects a touch at a location on the touch sensor 102. At 2704, the input module 106 determines whether a magnetic field is detected by the one or more magnetometers 118. When at 2704 no magnetic field is detected, at 2706 the input module categorizes the touch as a non-stylus or non-magnetic implement touch.
When at 2704 the input module 106 determines that a magnetic field is detected by the one or more magnetometers 118, the input module 106 may further compare position information. At 2708, when the position of the detected magnetic field corresponds to the location of the touch upon the touch sensor 102, the process continues to 2710. When at 2708 the position of the detected magnetic field does not correspond to the location of the touch upon the touch sensor 102, the process proceeds to 2706 and categorizes the touch as a non-stylus touch.
At 2710, the input module determines the polarity or orientation of the magnetic field. When at 2710 the magnetic field is in a first polarity, the process proceeds to 2712 and categorizes the touch as a first stylus touch. For example, the north magnetic pole of the stylus may be associated with the stylus tip 112, while the south magnetic pole may be associated with the stylus end 114. By determining the field polarity it is thus possible to distinguish which end of the stylus is proximate to the magnetometers 118, and thus the device 100. When at 2710 the magnetic field is in a second polarity, the process proceeds to 2714 and categorizes the touch as a second stylus touch.
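The categorization flow just described might be sketched as follows; the field object, its polarity attribute, and the positions_match helper are hypothetical stand-ins for the input module's internal logic.

```python
def categorize_touch(touch_location, field, positions_match):
    """Categorize a touch per the flow above.

    field is None when no qualifying magnetic field is detected; otherwise
    it carries a polarity of "north" or "south" (illustrative names).
    """
    if field is None:
        return "non_stylus"          # e.g., a finger touch
    if not positions_match(touch_location, field):
        return "non_stylus"          # field does not correspond to the touch
    if field.polarity == "north":
        return "first_stylus_touch"  # e.g., the stylus tip 112
    return "second_stylus_touch"     # e.g., the stylus end 114
```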
It may be useful to determine which end of the magnetic implement is proximate to the device, without determining the position of the magnetic implement via magnetometer. For example, the device 100 may have a touch sensor and single magnetic field sensor unable to determine angular bearing.
At 2802, the input module 106 detects a touch on the touch sensor 102. At 2804, the input module 106 determines whether a magnetic field is detected by the one or more magnetometers 118. When at 2804 no magnetic field is detected, at 2806 the input module 106 categorizes the touch as a non-stylus or non-magnetic implement touch.
When at 2804 a magnetic field is detected, the process continues to 2808. At 2808, the input module determines the polarity or orientation of the magnetic field. When at 2808 the magnetic field is in a first polarity, the process proceeds to 2810 and categorizes the touch as a first stylus touch. When at 2808 the magnetic field is in a second polarity, the process proceeds to 2812 and categorizes the touch as a second stylus touch.
The input module 106 is now able to more readily determine which end of a magnetic implement is generating the touch. For example, when the field is oriented in a first polarity, the input module 106 can determine that the touch corresponds to the stylus tip 112, while the second polarity indicates the stylus end 114 is closest to the device 100. Likewise, when a touch is sensed with no magnetic field present, the touch is not from the magnetic implement.
Conclusion
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the claims. For example, the methodological acts need not be performed in the order or combinations described herein, and may be performed in any combination of one or more acts.
The present application claims priority to U.S. Provisional Application Ser. No. 61/230,592, filed on Jul. 31, 2009, entitled “Inventions Related to Touch Screen Technology” and U.S. Provisional Application Ser. No. 61/263,015, filed on Nov. 20, 2009, entitled “Device and Method for Distinguishing a Pen or Stylus Contact from the Contact of a Finger or other Object Using Magnetic Sensing.” These pending applications are hereby incorporated by reference in their entirety, and the benefit of the filing dates of these pending applications is claimed to the fullest extent permitted.
Number | Name | Date | Kind |
---|---|---|---|
3944740 | Murase et al. | Mar 1976 | A |
4526043 | Boie et al. | Jul 1985 | A |
4587378 | Moore | May 1986 | A |
4952031 | Tsunoda et al. | Aug 1990 | A |
4983786 | Stevens et al. | Jan 1991 | A |
5105548 | Fowler | Apr 1992 | A |
5543589 | Buchana et al. | Aug 1996 | A |
5597183 | Johnson | Jan 1997 | A |
5666113 | Logan | Sep 1997 | A |
5761485 | Munyan | Jun 1998 | A |
5818430 | Heiser | Oct 1998 | A |
5825352 | Bisset et al. | Oct 1998 | A |
5847698 | Reavey et al. | Dec 1998 | A |
6029214 | Dorfman | Feb 2000 | A |
6072474 | Morimura et al. | Jun 2000 | A |
6128007 | Seybold | Oct 2000 | A |
6229502 | Schwab | May 2001 | B1 |
6392636 | Ferrari et al. | May 2002 | B1 |
6594606 | Everitt | Jul 2003 | B2 |
6707438 | Ishizuka et al. | Mar 2004 | B1 |
6762752 | Perski et al. | Jul 2004 | B2 |
6980202 | Carro | Dec 2005 | B2 |
6982699 | Lenssen et al. | Jan 2006 | B1 |
7123243 | Kawasaki et al. | Oct 2006 | B2 |
7166966 | Naugler, Jr. et al. | Jan 2007 | B2 |
7190348 | Kennedy et al. | Mar 2007 | B2 |
7199322 | Bourdelais et al. | Apr 2007 | B2 |
7324093 | Gettemy et al. | Jan 2008 | B1 |
7331245 | Nishimura | Feb 2008 | B2 |
7339577 | Sato et al. | Mar 2008 | B2 |
7471284 | Bathiche et al. | Dec 2008 | B2 |
7619616 | Rimas Ribikauskas et al. | Nov 2009 | B2 |
7760187 | Kennedy | Jul 2010 | B2 |
7800586 | Serban et al. | Sep 2010 | B2 |
7825905 | Philipp | Nov 2010 | B2 |
8089470 | Schediwy et al. | Jan 2012 | B1 |
8223278 | Kim et al. | Jul 2012 | B2 |
8243424 | Babu et al. | Aug 2012 | B1 |
8265717 | Gorsica et al. | Sep 2012 | B2 |
8316324 | Boillot | Nov 2012 | B2 |
8427424 | Hartmann et al. | Apr 2013 | B2 |
8466880 | Westerman et al. | Jun 2013 | B2 |
8558767 | Kwon | Oct 2013 | B2 |
8902174 | Peterson | Dec 2014 | B1 |
8947351 | Noble | Feb 2015 | B1 |
9069417 | Rimon | Jun 2015 | B2 |
9244562 | Rosenberg et al. | Jan 2016 | B1 |
20010013855 | Fricker et al. | Aug 2001 | A1 |
20020015024 | Westerman et al. | Feb 2002 | A1 |
20020080123 | Kennedy et al. | Jun 2002 | A1 |
20020109668 | Rosenberg et al. | Aug 2002 | A1 |
20020149572 | Schulz et al. | Oct 2002 | A1 |
20020180714 | Duret | Dec 2002 | A1 |
20030067449 | Yoshikawa et al. | Apr 2003 | A1 |
20030095115 | Brian et al. | May 2003 | A1 |
20030156098 | Shaw et al. | Aug 2003 | A1 |
20030210235 | Roberts | Nov 2003 | A1 |
20030234768 | Rekimoto et al. | Dec 2003 | A1 |
20040125087 | Taylor et al. | Jul 2004 | A1 |
20040174324 | Yamazaki et al. | Sep 2004 | A1 |
20050083316 | Brian et al. | Apr 2005 | A1 |
20050162402 | Watanachote | Jul 2005 | A1 |
20050174336 | Nakayama et al. | Aug 2005 | A1 |
20050200798 | Tanaka | Sep 2005 | A1 |
20050259087 | Hoshino et al. | Nov 2005 | A1 |
20060007172 | Baker | Jan 2006 | A1 |
20060007182 | Sato | Jan 2006 | A1 |
20060012580 | Perski et al. | Jan 2006 | A1 |
20060012581 | Haim | Jan 2006 | A1 |
20060028459 | Underwood et al. | Feb 2006 | A1 |
20060050062 | Ozawa et al. | Mar 2006 | A1 |
20060097991 | Hotelling et al. | May 2006 | A1 |
20060109252 | Kolmykov-Zotov et al. | May 2006 | A1 |
20060192726 | Huitema et al. | Aug 2006 | A1 |
20060198080 | Hawes et al. | Sep 2006 | A1 |
20060209045 | Su et al. | Sep 2006 | A1 |
20060236263 | Bathiche et al. | Oct 2006 | A1 |
20060244735 | Wilson | Nov 2006 | A1 |
20060293864 | Soss | Dec 2006 | A1 |
20070128948 | Nakanishi et al. | Jun 2007 | A1 |
20070152976 | Townsend et al. | Jul 2007 | A1 |
20070235231 | Loomis et al. | Oct 2007 | A1 |
20070236618 | Maag et al. | Oct 2007 | A1 |
20080018608 | Serban et al. | Jan 2008 | A1 |
20080018611 | Serban et al. | Jan 2008 | A1 |
20080030464 | Sohm et al. | Feb 2008 | A1 |
20080053293 | Georges et al. | Mar 2008 | A1 |
20080074400 | Gettemy et al. | Mar 2008 | A1 |
20080143679 | Harmon et al. | Jun 2008 | A1 |
20080158183 | Hotelling et al. | Jul 2008 | A1 |
20080160656 | Chanda et al. | Jul 2008 | A1 |
20080168403 | Westerman et al. | Jul 2008 | A1 |
20080180406 | Han et al. | Jul 2008 | A1 |
20080204426 | Hotelling | Aug 2008 | A1 |
20080211796 | Kim | Sep 2008 | A1 |
20080246723 | Baumbach | Oct 2008 | A1 |
20080254822 | Tilley | Oct 2008 | A1 |
20080296073 | McDermid | Dec 2008 | A1 |
20080303799 | Schwesig et al. | Dec 2008 | A1 |
20080309631 | Westerman | Dec 2008 | A1 |
20090095540 | Zachut et al. | Apr 2009 | A1 |
20090102805 | Meijer et al. | Apr 2009 | A1 |
20090120696 | Hayakawa et al. | May 2009 | A1 |
20090141008 | Johnson | Jun 2009 | A1 |
20090165296 | Carmi | Jul 2009 | A1 |
20090174679 | Westerman | Jul 2009 | A1 |
20090218310 | Zu et al. | Sep 2009 | A1 |
20090219258 | Geaghan et al. | Sep 2009 | A1 |
20090227295 | Kim | Sep 2009 | A1 |
20090237371 | Kim et al. | Sep 2009 | A1 |
20090237374 | Li et al. | Sep 2009 | A1 |
20090249236 | Westerman | Oct 2009 | A1 |
20090256817 | Perlin et al. | Oct 2009 | A1 |
20090289914 | Cho | Nov 2009 | A1 |
20090309616 | Klinghult et al. | Dec 2009 | A1 |
20090315848 | Ku et al. | Dec 2009 | A1 |
20100005427 | Zhang et al. | Jan 2010 | A1 |
20100006350 | Elias | Jan 2010 | A1 |
20100013780 | Ikeda et al. | Jan 2010 | A1 |
20100013797 | Kim et al. | Jan 2010 | A1 |
20100020043 | Park et al. | Jan 2010 | A1 |
20100026647 | Abe et al. | Feb 2010 | A1 |
20100039395 | Nurmi et al. | Feb 2010 | A1 |
20100056277 | Marks et al. | Mar 2010 | A1 |
20100090964 | Soo et al. | Apr 2010 | A1 |
20100117974 | Joguet et al. | May 2010 | A1 |
20100123670 | Philipp | May 2010 | A1 |
20100139990 | Westerman et al. | Jun 2010 | A1 |
20100156805 | Brand | Jun 2010 | A1 |
20100182285 | Tremblay | Jul 2010 | A1 |
20100199221 | Yeung et al. | Aug 2010 | A1 |
20100225604 | Homma | Sep 2010 | A1 |
20100267421 | Rofougaran | Oct 2010 | A1 |
20100277439 | Charlier et al. | Nov 2010 | A1 |
20100295780 | Vaisanen et al. | Nov 2010 | A1 |
20100295781 | Alameh et al. | Nov 2010 | A1 |
20110007021 | Bernstein et al. | Jan 2011 | A1 |
20110025619 | Joguet | Feb 2011 | A1 |
20110037709 | Cottarel et al. | Feb 2011 | A1 |
20110061947 | Krah et al. | Mar 2011 | A1 |
20110074701 | Dickinson | Mar 2011 | A1 |
20110096033 | Ko | Apr 2011 | A1 |
20110109577 | Lee et al. | May 2011 | A1 |
20110141009 | Izumi | Jun 2011 | A1 |
20110163992 | Cordeiro et al. | Jul 2011 | A1 |
20110242037 | Gruber | Oct 2011 | A1 |
20110254864 | Tsuchikawa et al. | Oct 2011 | A1 |
20110267265 | Stinson | Nov 2011 | A1 |
20110267280 | De Mers | Nov 2011 | A1 |
20110285657 | Shimotani et al. | Nov 2011 | A1 |
20120050181 | King et al. | Mar 2012 | A1 |
20120057064 | Gardiner et al. | Mar 2012 | A1 |
20120084691 | Yun | Apr 2012 | A1 |
20120105324 | Lee et al. | May 2012 | A1 |
20120173067 | Szczerba et al. | Jul 2012 | A1 |
20120174004 | Seder et al. | Jul 2012 | A1 |
20120206333 | Kim | Aug 2012 | A1 |
20120299848 | Homma et al. | Nov 2012 | A1 |
20120299849 | Homma et al. | Nov 2012 | A1 |
20120313880 | Geaghan et al. | Dec 2012 | A1 |
20120320247 | Kim et al. | Dec 2012 | A1 |
20120326994 | Miyazawa et al. | Dec 2012 | A1 |
20130002551 | Imoto et al. | Jan 2013 | A1 |
20140028557 | Otake et al. | Jan 2014 | A1 |
20140085202 | Hamalainen et al. | Mar 2014 | A1 |
20140267176 | Bathiche et al. | Sep 2014 | A1 |
20140285418 | Adachi | Sep 2014 | A1 |
20150109257 | Jalali | Apr 2015 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---
09282100 | Oct 2007 | JP |
WO2007141566 | Dec 2007 | WO |
WO2009008568 | Jan 2009 | WO |
WO2009021836 | Feb 2009 | WO |
Other References

Entry |
---|
Moscovich, et al., "Multi-finger Cursor Techniques", Department of Computer Science, Brown University, 2006, 7 pages. |
Ashbrook, et al., “Nenya: Subtle and Eyes-Free Mobile Input with a Magnetically-Tracked Finger Ring”, CHI 2011, May 7-12, 2011, 4 pages. |
Harrison, et al., "Abracadabra: Wireless, High-Precision, and Unpowered Finger Input for Very Small Mobile Devices", In Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology (Victoria, British Columbia, Canada, Oct. 4-7, 2009), UIST '09, ACM, New York, NY, 4 pages. |
Non-Final Office Action for U.S. Appl. No. 12/846,497, mailed on Dec. 14, 2012, Ilya D. Rosenberg et al., “Capacitive Sensing with Interpolating Force-Sensitive Resistor Array”, 26 pages. |
Office Action for U.S. Appl. No. 12/846,328, mailed on Dec. 24, 2012, Rosenberg et al., "Two-Sided Touch Sensor", 15 pages. |
Non-Final Office Action for U.S. Appl. No. 13/247,699, mailed on Feb. 1, 2013, Julien G. Beguin et al., "Interacting Through Noncontact Gestures", 22 pages. |
Office Action for U.S. Appl. No. 12/846,539, mailed on Feb. 15, 2013, Ilya D. Rosenberg et al., “Magnetic Touch Discrimination”, 20 pages. |
Office Action for U.S. Appl. No. 12/846,519, mailed on Apr. 24, 2013, Rosenberg et al., "Touch Sensing Techniques", 23 pages. |
Office Action for U.S. Appl. No. 12/846,497, mailed on Apr. 25, 2013, Rosenberg et al., "Capacitive Sensing with Interpolating Force-Sensitive Resistor Array", 27 pages. |
Office Action for U.S. Appl. No. 12/846,268, mailed on May 3, 2013, Rosenberg et al., "Gestures and Touches on Force-sensitive Input Devices", 15 pages. |
Office Action for U.S. Appl. No. 12/846,295, mailed on May 21, 2013, Rosenberg et al., "Visually Consistent Arrays", 14 pages. |
Office Action for U.S. Appl. No. 13/247,699, mailed on Jul. 19, 2013, Beguin et al., "Interacting Through Noncontact Gestures", 32 pages. |
Final Office Action for U.S. Appl. No. 12/846,539, mailed on Oct. 25, 2013, Ilya D. Rosenberg, “Magnetic Touch Discrimination”, 26 pages. |
Office Action for U.S. Appl. No. 12/846,268, mailed on Oct. 23, 2013, Ilya D. Rosenberg, “Gestures and Touches on Force-sensitive Input Devices”, 37 pages. |
Office Action for U.S. Appl. No. 12/846,428, mailed on Oct. 9, 2013, Ilya D. Rosenberg, “Hardware Enabled Interpolating Sensor and Display”, 25 pages. |
Office Action for U.S. Appl. No. 12/846,519, mailed on Nov. 14, 2013, Rosenberg et al., "Touch Sensing Techniques", 24 pages. |
Office Action for U.S. Appl. No. 12/846,328, mailed on Aug. 15, 2013, Rosenberg et al., "Two-Sided Touch Sensor", 18 pages. |
Wolf, et al., “Angles, Azimuths, and Bearings”, Pearson Prentice Hall, Elementary Surveying, 12th Edition, 2008, Chapter 7, pp. 165-184. |
Final Office Action for U.S. Appl. No. 12/846,295, mailed on Dec. 23, 2013, Ilya D. Rosenberg, “Visually Consistent Arrays including Conductive Mesh”, 16 pages. |
Office Action for U.S. Appl. No. 13/247,699, mailed on Jan. 31, 2014, Julien G. Beguin, “Interacting Through Noncontact Gestures”, 28 pages. |
Office Action for U.S. Appl. No. 12/846,328, mailed on Dec. 19, 2013, Ilya D. Rosenberg, “Two-Sided Touch Sensor”, 13 pages. |
Office Action for U.S. Appl. No. 12/846,428, mailed on Feb. 21, 2014, Rosenberg et al., "Hardware Enabled Interpolating Sensor and Display", 30 pages. |
Office Action for U.S. Appl. No. 12/846,497, mailed on Oct. 23, 2014, Ilya D. Rosenberg, “Capacitive Sensing with Interpolating Force-Sensitive Resistor Array”, 25 pages. |
Office Action for U.S. Appl. No. 12/846,268, mailed on Jul. 29, 2014, Ilya D. Rosenberg, "Gestures and Touches on Force-sensitive Input Devices", 32 pages. |
Office Action for U.S. Appl. No. 12/846,428, mailed on Aug. 21, 2014, Ilya D. Rosenberg, “Hardware Enabled Interpolating Sensor and Display”, 24 pages. |
Office Action for U.S. Appl. No. 12/846,295, mailed on Sep. 24, 2014, Rosenberg et al., "Visually Consistent Arrays including Conductive Mesh", 17 pages. |
Final Office Action for U.S. Appl. No. 13/247,699, mailed on Sep. 26, 2014, Julien G. Beguin, “Interacting Through Noncontact Gestures”, 30 pages. |
Final Office Action for U.S. Appl. No. 12/846,428, mailed on Dec. 1, 2014, Ilya D. Rosenberg, “Hardware Enabled Interpolating Sensor and Display”, 26 pages. |
Office Action for U.S. Appl. No. 12/846,268, mailed on Dec. 22, 2014, Ilya D. Rosenberg, “Gestures and Touches on Force-sensitive Input Devices”, 36 pages. |
Office Action for U.S. Appl. No. 12/846,539, mailed on Feb. 24, 2015, Ilya D. Rosenberg, “Magnetic Touch Discrimination”, 17 pages. |
Final Office Action for U.S. Appl. No. 12/846,368, mailed on Feb. 27, 2015, Ilya D. Rosenberg, "Touch Distinction", 49 pages. |
Office Action for U.S. Appl. No. 12/846,519, mailed on Mar. 11, 2015, Ilya D. Rosenberg, “Touch Sensing Techniques”, 35 pages. |
Final Office Action for U.S. Appl. No. 12/846,497, mailed on Mar. 20, 2015, Ilya D. Rosenberg, “Capacitive Sensing with Interpolating Force-Sensitive Resistor Array”, 37 pages. |
Final Office Action for U.S. Appl. No. 12/846,268, mailed on Apr. 2, 2015, Ilya D. Rosenberg, “Gestures and Touches on Force-sensitive Input Devices”, 37 pages. |
Office Action for U.S. Appl. No. 12/846,519, mailed on Nov. 18, 2015, Rosenberg et al., "Touch Sensing Techniques", 36 pages. |
Office Action for U.S. Appl. No. 13/247,699, mailed on Aug. 27, 2015, Beguin et al., "Interacting Through Noncontact Gestures", 24 pages. |
Office Action for U.S. Appl. No. 15/003,086, mailed on Dec. 15, 2016, Rosenberg et al., "Gestures and Touches on Force-sensitive Input Devices", 23 pages. |
Office Action for U.S. Appl. No. 12/846,497, mailed on Dec. 22, 2016, Rosenberg et al., “Capacitive Sensing with Interpolating Force-Sensitive Resistor Array”, 43 pages. |
Office Action for U.S. Appl. No. 12/846,497, mailed on Mar. 15, 2016, Rosenberg et al., "Capacitive Sensing with Interpolating Force-Sensitive Resistor Array", 37 pages. |
Office Action for U.S. Appl. No. 13/247,699, mailed on Mar. 24, 2016, Beguin et al., "Interacting Through Noncontact Gestures", 25 pages. |
Office Action for U.S. Appl. No. 15/003,086, mailed on Jun. 17, 2016, Rosenberg et al., "Gestures and Touches on Force-sensitive Input Devices", 11 pages. |
Office Action for U.S. Appl. No. 12/846,497, mailed on Sep. 23, 2016, Rosenberg et al., “Capacitive Sensing with Interpolating Force-Sensitive Resistor Array”, 43 pages. |
Provisional Applications

Number | Date | Country
---|---|---
61230592 | Jul 2009 | US
61263015 | Nov 2009 | US