Human interface devices, such as computer mice, keyboards, keypads, and the like, may provide multiple mechanisms for navigating a computer user interface. For example, a computer mouse having optical tracking and selection buttons to interact with a cursor on a user interface also may include a scroll wheel that allows a user to scroll displayed content independent of cursor motion. Some mice also may include a built-in touch sensor configured to allow a user to perform touch interactions via the mouse. Such touch sensors may have a structure similar to a touch sensor on a display, in that the touch sensor comprises a matrix of row and column sensing elements. Touch may be sensed by measuring capacitance from each row to each column, or from each column to ground and each row to ground.
Embodiments are disclosed that relate to human interface devices having touch sensors. For example, one disclosed embodiment provides a human interface device comprising a touch sensor having two or more touch sensing units, each touch sensing unit comprising a touch sensing pad and a charge accumulation capacitor in communication with the touch sensing pad. The charge accumulation capacitor may have a larger capacitance than the capacitance between the touch sensing pad and electrical ground when a human finger or thumb is proximate to the touch sensing pad. The human interface device further comprises a controller in communication with each touch sensing unit, the controller being configured to acquire a touch sensing sample from each touch sensing unit by iteratively charging the touch sensing pad of the touch sensing unit and transferring charge from the touch sensing pad of the touch sensing unit to the charge accumulation capacitor of the touch sensing unit until a threshold has been met, and detect a touch gesture based upon the touch sensing samples.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Various human interface devices, such as computer mice, keyboards, touch-sensitive displays and the like, may comprise a touch sensor to enable a user to interact with a computing device via a touch sensitive surface. Current touch sensors on computer mice may utilize an array of row and column sensing elements that are read by measuring a capacitance between each row and each column, or between each row/column and ground. However, such touch sensors may be subject to noise from other human interface device components. As the capacitances between the rows and columns may be relatively small, such noise may lead to problems in touch sensing. Additionally, such sensors may be unduly expensive and complex if it is desired to detect a relatively small number of specific gestures.
Thus, embodiments are disclosed herein that relate to touch sensors for human interface devices. Briefly, the disclosed touch sensors utilize the capacitance between a touch sensing pad and electrical ground, in combination with a charge accumulation capacitor, for each of a plurality of touch sensing units to sense touch. Further, rather than being formed from a matrix of horizontal and vertical conductors, the disclosed embodiments of touch sensors utilize a relatively smaller number of larger area touch sensing pads arranged, for example, along a path of a gesture to be detected via the sensor. A touch sensing capacitor may be formed based on the self-capacitance that exists between a touch sensing pad that may be, for example, located on a computer mouse, and electrical ground. The self-capacitance is altered by the proximity of a human finger or thumb to the touch sensing pad. Herein, the capacitor that is formed by the touch sensing pad and surfaces connected to electrical ground, and influenced by objects located between the touch sensing pad and electrical ground, if any, may be referred to as a touch sensing capacitor. Alternative touch sensors may rely on mutual capacitance, projected capacitance, or surface capacitance.
To sample a touch sensing unit, the capacitance between the touch sensing pad of the touch sensing unit and electrical ground is charged and then discharged onto the charge accumulation capacitor a plurality of times during each sample cycle. As the capacitance of the touch sensing capacitor increases when a finger is positioned over the touch sensing capacitor, more charge may be transferred to the charge accumulation capacitor during each charge/discharge cycle when touched than when not touched. Thus, a measure of the capacitance of the touch sensing capacitor may be determined from the sampling process. In one non-limiting example, the iterative charging and discharging of the touch sensing capacitor may be performed until a threshold voltage across the charge accumulation capacitor is reached. In such embodiments, a number of charge/discharge cycles used to reach the threshold voltage may represent the capacitance of the touch sensing capacitor, such that a smaller number of charge/discharge cycles represents a larger capacitance. In other embodiments, a selected threshold number of charge/discharge cycles may be performed, and then a time for the charge accumulation capacitor to drain may be determined as a measure of the capacitance of the touch sensing capacitor. It will be understood that these examples are intended to be illustrative, and not limiting.
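The threshold-count sampling mode described above can be illustrated with a minimal numerical sketch. The capacitance and voltage values below are illustrative assumptions, not values from this disclosure, and the simple charge-sharing model stands in for the actual circuit behavior.

```python
def cycles_to_threshold(c_touch, c_acc=10e-9, vdd=3.3, v_th=1.65,
                        max_cycles=100_000):
    """Count charge/discharge cycles until the charge accumulation
    capacitor reaches the threshold voltage."""
    v_acc = 0.0
    for n in range(1, max_cycles + 1):
        # Each cycle: the touch sensing capacitor is charged to Vdd,
        # then its charge redistributes onto the accumulation
        # capacitor (simple charge-sharing model).
        v_acc += (vdd - v_acc) * c_touch / (c_touch + c_acc)
        if v_acc >= v_th:
            return n
    return max_cycles

# A touched pad (larger self-capacitance) reaches the threshold in
# fewer cycles than an untouched pad.
untouched = cycles_to_threshold(c_touch=10e-12)  # assumed 10 pF
touched = cycles_to_threshold(c_touch=15e-12)    # assumed 15 pF
```

As the model shows, the cycle count varies inversely with the touch sensing capacitance, which is why a smaller count indicates a touch.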
The iterative charging of the touch sensing capacitor and discharging onto the charge accumulation capacitor, combined with the larger size of the touch sensing capacitors compared to those of column/row matrix touch sensors, may help to provide a higher signal to noise ratio than row/column type touch sensors, in which a single measurement of each relatively smaller sensing element is taken per scan. Further, information about the change in the capacitance of each touch sensing capacitor over time may be used to detect touch gestures, as described below.
As described in more detail below, the touch sensor 102 may be configured to detect a relatively small number of specific touch interactions, such as a thumb swipe gesture and/or a thumb tap gesture. As such, the touch sensor 102 may comprise a relatively low number of separate touch sensing elements arranged in a manner configured to allow detection of those gestures.
A touch sensor configured to detect specific gestures may be utilized, for example, where the specific gestures are mapped to specific operating system and/or application interactions. For example, some computing device operating systems may maintain menus and other user interface controls hidden during ordinary computing device use, but reveal these user interface controls upon performance of a specific touch gesture. As a more specific example, an operating system adapted for use with a touch sensitive display may be configured to reveal a menu bar, either vertically oriented at a side of the display or horizontally oriented at a bottom or top of a display, when a user performs a touch gesture (e.g. a tap or swipe gesture) on the touch sensitive display. Thus, gestures detectable by the touch sensor 102, such as a thumb swipe or tap, may be mapped to outputs configured to perform such interactions. As one non-limiting example, a thumb swipe gesture may be mapped to an output that is recognized by a host computing device as representing a keyboard input configured to reveal a hidden menu bar.
The charge accumulation capacitors are each connected to a corresponding first pin and second pin of a controller 310. In the depicted embodiment, the first charge accumulation capacitor 302 is shown as being connected to a first general purpose input/output (GPIO) pin 312 and a second GPIO pin 314, and the second charge accumulation capacitor 304 is shown as being connected to a third GPIO pin 316 and a fourth GPIO pin 318. Each corresponding touch sensing capacitor, resistor, and charge accumulation capacitor may be referred to herein as a touch sensing unit, wherein a first touch sensing unit is depicted at 320 and a second touch sensing unit is depicted at 322.
When a finger is close to or in contact with a touch sensing capacitor, the capacitance of the touch sensing capacitor increases. Thus, it takes fewer charge/discharge samples to reach a threshold amount of charge on the charge accumulation capacitor. By counting the number of charge/discharge cycles taken to reach the threshold, a touch state of each touch sensing capacitor may be determined.
The touch sensing units 320, 322 may be operated in any suitable manner. One non-limiting method is as follows. While described in the context of the first touch sensing unit 320, it will be understood that the second touch sensing unit 322 may be operated in a similar manner.
First, the controller 310 may set the first GPIO pin 312 and the second GPIO pin 314 to ground to discharge the touch sensing circuit, and set a counter to zero to initialize a sampling process. Next, the first GPIO pin 312 is set to logic HIGH (e.g., a binary “1” value, which may be represented by 5V, 3.3V, 1.8V, etc.), and the second GPIO pin 314 is set to HIGH-Z (high impedance) to charge the first touch sensing capacitor 202. As mentioned above, the touch sensing capacitor may hold more charge when in a touched state than when in an untouched state. Next, the first GPIO pin 312 is set to HIGH-Z and the second GPIO pin 314 is set to LOW (e.g., a binary “0” value, which may be represented by 0V) to transfer charge from the first touch sensing capacitor 202 to the first charge accumulation capacitor 302. The counter is then incremented to reflect the number of charge/discharge cycles that have been performed. To determine whether the threshold has been met, the first GPIO pin 312 may be set to LOW and the second GPIO pin 314 may be set to an input and sampled to see if the logic threshold of the input has been crossed. If the logic threshold has not been crossed, then the first GPIO pin 312 is again set to HIGH and the second GPIO pin 314 is set to HIGH-Z to again charge the first touch sensing capacitor 202 for another charge/discharge cycle. These steps may be repeated until the logic threshold of the second GPIO pin 314 in the input mode is met. Once the logic threshold is met, the current value of the counter is recorded as a measure of capacitance. The values obtained from each touch sensing unit may then be analyzed to detect the occurrence of any recognized touch gestures.
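The pin sequence above can be sketched as firmware-style pseudocode (actual controller firmware would typically be written in C). The `SimulatedController` class is a hypothetical test double standing in for real GPIO hardware, and the pin names and threshold model are assumptions made so the loop can run standalone.

```python
HIGH, LOW, HIGH_Z, INPUT = "HIGH", "LOW", "HIGH-Z", "INPUT"

class SimulatedController:
    """Models one touch sensing unit: the threshold input reads True
    once enough charge transfers have occurred."""
    def __init__(self, cycles_needed):
        self.cycles_needed = cycles_needed
        self.transfers = 0
        self.pins = {}

    def set_pin(self, pin, state):
        self.pins[pin] = state
        # Charge is transferred when the second pin is pulled LOW
        # while the first pin floats (HIGH-Z), per the sequence above.
        if pin == "B" and state == LOW and self.pins.get("A") == HIGH_Z:
            self.transfers += 1

    def read_pin(self, pin):
        # True once the accumulated charge crosses the logic threshold.
        return self.transfers >= self.cycles_needed

def acquire_sample(ctrl, max_cycles=100_000):
    """One sampling pass for a single touch sensing unit."""
    # Step 1: ground both pins to discharge, zero the counter.
    ctrl.set_pin("A", LOW)
    ctrl.set_pin("B", LOW)
    count = 0
    while count < max_cycles:
        # Step 2: charge the touch sensing capacitor.
        ctrl.set_pin("A", HIGH)
        ctrl.set_pin("B", HIGH_Z)
        # Step 3: transfer charge to the accumulation capacitor.
        ctrl.set_pin("A", HIGH_Z)
        ctrl.set_pin("B", LOW)
        count += 1
        # Step 4: test whether the logic threshold has been crossed.
        ctrl.set_pin("A", LOW)
        ctrl.set_pin("B", INPUT)
        if ctrl.read_pin("B"):
            break
    return count
```

The returned count is the sample value: a touched unit, needing fewer transfers to reach the threshold, yields a smaller count.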
The touch sensor data includes output 404 of a first touch sensor (sensor0), output 406 of a second touch sensor (sensor1), finger position 408 as a function of time along a horizontal path through the two sensors as measured from a left edge of sensor0 to a right edge of sensor1, and a difference 410 between the first and second sensor outputs.
In the depicted example, relative confidences of each of the examples may depend upon such factors as starting location, ending location, starting condition (e.g. wait or no wait before movement across sensors), and ending condition (e.g. pause on sensor after movement or no pause). These confidences may be based, for example, upon a likelihood that a user will perform an intended touch gesture in a deliberate manner. For example, referring to the top two example gestures 420 and 422, where a touch starts from beyond a left edge of sensor0 or within sensor0, and travels all the way past the right edge of sensor1, a likelihood that a user intended to perform a swipe gesture may be relatively high. Next referring to the third example gesture 424 from the top, where a user stops and pauses over sensor1, it may be somewhat less likely, but still relatively likely, that the user intended to perform a swipe gesture. Continuing, the fourth gesture 426 from the top may potentially be less likely to be a swipe gesture, as the gesture starts in an ambivalent location that is over a border between sensor0 and sensor1. Next, the bottommost gesture 428 may have an even lower likelihood of being a swipe, as the touch position is held in an ambivalent location before movement.
Any suitable rules may be applied to determine whether or not a detected sensor interaction is a gesture input. As one non-limiting example, it may be determined that a swipe occurred if the following conditions are met: (a) the samples from the touch sensing units have a crossover point; (b) the touch goes from one touch sensing capacitor to another, and not back; and (c) the touch interacted with each sensor for a threshold time. Further, a rate of change between a touched and not touched state (or vice versa) for each sensor may be considered, wherein rapid changes may be considered more likely to represent a gesture. Additionally, in some embodiments, gestures may be sensed only after all sensors are untouched for a period of time. It will be understood that these rules for determining swipes are presented for the purpose of example, and are not intended to be limiting in any manner.
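The three example swipe conditions can be expressed compactly as a rule check over two time-aligned sample traces. The normalization scheme and threshold values below are illustrative assumptions, not parameters from this disclosure.

```python
def is_swipe(s0, s1, touch_level=0.5, min_frames=3):
    """Apply the example swipe rules to two time-aligned traces of
    normalized touch strength (0 = untouched, 1 = fully touched)
    from sensor0 (s0) and sensor1 (s1)."""
    t0 = [v >= touch_level for v in s0]
    t1 = [v >= touch_level for v in s1]
    # Rule (c): the touch interacted with each sensor for a
    # threshold time.
    if sum(t0) < min_frames or sum(t1) < min_frames:
        return False
    # Rule (a): the two sample traces have a crossover point, i.e.
    # their difference changes sign.
    diff = [a - b for a, b in zip(s0, s1)]
    if not any(d1 * d2 < 0 for d1, d2 in zip(diff, diff[1:])):
        return False
    # Rule (b): the touch goes from one sensor to the other and not
    # back: sensor0's touch ends no later than sensor1's.
    last0 = max(i for i, t in enumerate(t0) if t)
    last1 = max(i for i, t in enumerate(t1) if t)
    return last0 <= last1
```

A trace that sweeps cleanly from sensor0 to sensor1 satisfies all three rules, while a touch that returns to its starting sensor fails rule (b).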
Rules may be used in any suitable manner to determine whether and how to provide outputs of detected gestures. For example, in some embodiments, rules may be used to determine a confidence value that is provided along with information regarding the touch gesture detected. This may allow an application (e.g. a computer program being executed on a computing device that receives the sensor output) to decide whether, and how, to use the output. Likewise, rules also may be used to determine whether or not to provide an output of a detected instance of the gesture.
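One way rules may yield a confidence value is to combine the factors discussed above multiplicatively. The zone names and weights below are toy assumptions chosen to reproduce the relative ordering of the example gestures, not values from this disclosure.

```python
def swipe_confidence(start_zone, end_zone, paused_before, paused_after):
    """Toy confidence score for a candidate swipe, combining starting
    location, ending condition, and pauses before/after movement."""
    score = 1.0
    if start_zone == "border":   # ambivalent starting location
        score *= 0.6
    if end_zone != "past_edge":  # movement stopped on a sensor
        score *= 0.8
    if paused_before:            # touch held before moving
        score *= 0.6
    if paused_after:             # pause on a sensor after movement
        score *= 0.9
    return score
```

A deliberate edge-to-edge swipe scores 1.0, while a gesture that starts on the sensor border after a pause scores much lower, matching the qualitative ordering of the examples.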
In some embodiments, more than one gesture may be detected by a touch sensor. For example, touch sensor 102 may be configured to detect both swipe and tap inputs. In this case, rules also may specify an order of priority for gestures. As one non-limiting example, touch sensor 102 may give swipe gestures a higher priority than tap gestures. In such an embodiment, rules may specify that a tap gesture is reported if the swipe criteria are not met and the tap criteria are met. Thus, in this example, the sensor data is not analyzed for a tap input unless the swipe criteria are not met. It will be understood that tap inputs may be assigned a higher priority in some embodiments, and that any set of allowed gesture inputs may be given any suitable priority order. Further, in some embodiments, a system may analyze multiple possible gestures and assign a confidence value to each, rather than utilizing a priority order.
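The priority ordering can be sketched as a simple cascade. The criteria functions below are deliberately simplified stand-ins (assumptions) for the full rule sets; only the control flow, which checks swipe before tap, reflects the priority scheme described above.

```python
def touched(trace, level=0.5):
    """Per-frame touch states for a normalized strength trace."""
    return [v >= level for v in trace]

def meets_swipe_criteria(s0, s1):
    # Simplified stand-in: both sensors saw touch during the window.
    return any(touched(s0)) and any(touched(s1))

def meets_tap_criteria(s0, s1, max_frames=4):
    # Simplified stand-in: a brief touch on exactly one sensor.
    n0, n1 = sum(touched(s0)), sum(touched(s1))
    return (n0 > 0) != (n1 > 0) and max(n0, n1) <= max_frames

def classify(s0, s1):
    """Swipe takes priority: tap criteria are checked only when the
    swipe criteria are not met."""
    if meets_swipe_criteria(s0, s1):
        return "swipe"
    if meets_tap_criteria(s0, s1):
        return "tap"
    return None
```

Because the cascade returns at the first match, sensor data that satisfies the swipe criteria is never evaluated against the tap criteria.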
As with swipe gestures, any suitable rules may be used to identify tap gestures.
Upon identifying a touch gesture, such as a swipe or tap, a human interface device may provide any suitable output to a computing device. For example, in some embodiments, specific gestures may be mapped to specific operating system or application interactions. As a more specific example, some operating systems, such as the WINDOWS 8 operating system and versions thereof, may utilize horizontal touch gestures on a touch-sensitive display to interact with the operating system (e.g. change between application views, reveal menus, etc.). Thus, in some embodiments, thumb swipe gestures may be mapped to such interactions. Further, the human interface device may be configured to provide an output to the computing device that is recognizable by the computing device as representing a specific gesture, without having to install any associated logic on the computing device to understand the human interface device output. As one non-limiting example, the human interface device may provide an output recognizable by a host computing device as a specific keystroke or combination of keystrokes from a keyboard that are configured to invoke the interaction.
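A gesture-to-keystroke mapping of the kind described above can be represented as a lookup table. The key combinations below are hypothetical placeholders, not combinations defined by this disclosure or by any particular operating system.

```python
# Hypothetical mapping from detected gestures to keystroke
# combinations reported to the host computing device.
GESTURE_TO_KEYS = {
    "thumb_swipe": ("MOD_GUI", "KEY_C"),   # e.g. reveal a hidden menu bar
    "thumb_tap": ("MOD_GUI", "KEY_TAB"),   # e.g. change application views
}

def output_for(gesture):
    """Return the keystroke combination to report for a detected
    gesture, or None when no mapping exists."""
    return GESTURE_TO_KEYS.get(gesture)
```

Because the host interprets the output as ordinary keyboard input, no device-specific driver logic is needed on the host to act on the gesture.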
The acquisition of touch sensor samples at 902 may be performed on a periodic basis, and may be performed at any suitable frequency. Suitable frequencies include, but are not limited to, frequencies of 30 Hz and higher.
Upon collecting touch sensor samples, method 900 comprises, at 914, detecting a touch gesture based upon a change in the touch sensor outputs over time. For example, in the above-described embodiments, changes in counts over time may be used to detect touch gestures. Any suitable changes in sensor outputs over time may be used to detect touch gestures, depending upon the gestures to be detected and/or a layout of sensor elements used. It will be understood that the examples described above are intended to be illustrative, and not limiting.
Method 900 further comprises, at 916, providing an output based upon a mapping of the touch gesture to a computing device function. It will be understood that a touch gesture may be mapped to any suitable computing device function, including but not limited to application-specific functions as well as operating system functions. Further, the output may be configured to be utilizable by a host computing device without the installation of any corresponding logic on the host computing device. For example, in some embodiments, an output may be configured to be recognizable as a specific keystroke or combination of keystrokes ordinarily performed via a keyboard. In other embodiments, an output may be mapped to a different gesture than that performed. As one more specific example, a thumb swipe gesture may be mapped to a horizontal user interface command (e.g. to reveal a menu bar, to change application windows, etc.).
While described herein in the context of a computer mouse, it will be understood that a touch sensor according to the present disclosure may be used with any suitable human interface device for a computing device, including but not limited to touch screens, track pads, keyboards, game controllers, wearable computing devices (e.g. wrist computing devices, head-mounted computing devices), interface sensors located on a smartphone, tablet, netbook, pocket PC, remote controls, set-top boxes, etc. The use of a relatively small number of touch sensing capacitors may allow larger touch sensing capacitors to be used than in a grid-style touch sensor. Likewise, the use of multiple cycles of charging a touch sensing capacitor and then transferring the charge to a charge accumulation capacitor may help to achieve relatively high signal to noise ratios compared to the use of grid-style touch sensors.
In some embodiments, the methods and processes described herein may be tied to a computing system of one or more computing devices. In particular, such methods and processes may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer-program product.
Computing system 1000 includes a logic machine 1002 and a storage machine 1004. Computing system 1000 may optionally include a display subsystem 1006, input subsystem 1008, communication subsystem 1010, and/or other components not shown.
Logic machine 1002 includes one or more physical devices configured to execute instructions. For example, the logic machine may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result.
The logic machine may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic machine may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic machine may be single-core or multi-core, and the instructions executed thereon may be configured for sequential, parallel, and/or distributed processing. Individual components of the logic machine optionally may be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic machine may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration.
Storage machine 1004 includes one or more physical devices configured to hold instructions executable by the logic machine to implement the methods and processes described herein. When such methods and processes are implemented, the state of storage machine 1004 may be transformed—e.g., to hold different data.
Storage machine 1004 may include removable and/or built-in devices. Storage machine 1004 may include optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage machine 1004 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices. Storage machine 1004 and logic machine 1002 may in some embodiments be incorporated in a controller on a human interface device.
It will be appreciated that storage machine 1004 includes one or more physical devices. However, aspects of the instructions described herein alternatively may be propagated by a communication medium (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration.
Aspects of logic machine 1002 and storage machine 1004 may be integrated together into one or more hardware-logic components. Such hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC), and complex programmable logic devices (CPLDs), for example.
The term “program” may be used to describe an aspect of computing system 1000 implemented to perform a particular function. In some cases, a program may be instantiated via logic machine 1002 executing instructions held by storage machine 1004. It will be understood that different programs may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same program may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The term program may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
When included, display subsystem 1006 may be used to present a visual representation of data held by storage machine 1004. This visual representation may take the form of a graphical user interface (GUI) with which a user may interact via a human interface device. As the herein described methods and processes change the data held by the storage machine, and thus transform the state of the storage machine, the state of display subsystem 1006 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 1006 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic machine 1002 and/or storage machine 1004 in a shared enclosure, or such display devices may be peripheral display devices.
Input subsystem 1008 may comprise or interface with one or more user-input devices such as a keyboard, game controller, mouse, touch sensor, button, optical position tracker, etc. In some embodiments, the input subsystem may comprise or interface with selected natural user input (NUI) componentry. Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board.
Communication subsystem 1010 may be configured to communicatively couple computing system 1000 with one or more other computing devices (e.g. to communicatively couple a human interface device to a host computing device). Communication subsystem 1010 may include wired and/or wireless communication devices compatible with one or more different communication protocols.
It will be understood that the configurations and/or approaches described herein are presented for the purpose of example, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.