Many computing devices utilize touch sensors as user input devices. Inputs made via a touch sensor may be translated into actions on a graphical user interface in various ways. For example, in some instances, a touch sensor may be used purely to track changes in finger location on the surface, such as to control movement of a cursor. In this case, the specific location of the touch on the touch sensor does not affect the specific location of the cursor on the graphical user interface. Such interpretation of touch inputs may be used, for example, with a touch pad for a laptop computer, where the touch sensor is not located directly over a display device.
In other instances, locations on a touch sensor may be mapped to corresponding locations on a graphical user interface. In such instances, a touch made to a touch sensor may affect a user interface element at a specific display screen location mapped to that touch sensor location. Such direct mapping may be used, for example, where a transparent touch sensor is located over a display.
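By way of a hypothetical illustration (not part of the original disclosure), such a direct mapping may be thought of as a simple proportional transform from touch sensor coordinates to display coordinates; the function name and dimensions below are invented for the example.

```python
def direct_map(touch_x, touch_y, sensor_w, sensor_h, screen_w, screen_h):
    """Hypothetical sketch of a direct (absolute) mapping: each sensor
    coordinate is scaled by the ratio of screen size to sensor size, so
    a touch made over a user interface element lands on that element."""
    return (touch_x * screen_w / sensor_w,
            touch_y * screen_h / sensor_h)

# A touch at the center of a 100x60 sensor maps to the center of a
# 1920x1080 display.
assert direct_map(50, 30, 100, 60, 1920, 1080) == (960.0, 540.0)
```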
Various embodiments are disclosed that relate to dynamically scaling a mapping between a touch sensor and a display screen. For example, one disclosed embodiment provides a method comprising setting a first user interface mapping that maps an area of the touch sensor to a first area of the display screen, receiving a user input from the user input device that changes a user interaction context of the user interface, and in response to the user input, setting a second user interface mapping that maps the area of the touch sensor to a second area of the display screen. The method further comprises providing to the display device an output of a user interface image representing the user input at a location based on the second user interface mapping.
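One way to picture this method is as a mapping object that is replaced when the user interaction context changes. The following sketch is a hypothetical illustration of that flow under invented names and dimensions, not the claimed implementation.

```python
from dataclasses import dataclass

@dataclass
class Mapping:
    """Maps the full touch sensor area onto a rectangular screen region."""
    x: float  # left edge of the mapped screen region, in pixels
    y: float  # top edge of the mapped screen region, in pixels
    w: float  # width of the mapped screen region
    h: float  # height of the mapped screen region

    def to_screen(self, u, v):
        """u, v are normalized sensor coordinates in [0, 1]."""
        return (self.x + u * self.w, self.y + v * self.h)

# First mapping: the whole sensor covers the whole 1920x1080 screen.
first = Mapping(0, 0, 1920, 1080)
# Second mapping, set when a user input changes the interaction context
# (e.g. navigating to text entry): the same sensor area now covers only
# the on-screen keyboard region.
second = Mapping(480, 700, 960, 300)

current = first
current = second                      # context changed; swap the mapping
print(current.to_screen(0.5, 0.5))    # output image location: (960.0, 850.0)
```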
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
As mentioned above, a touch sensor may be mapped to a graphical user interface such that specific locations on the touch sensor correspond to specific locations on the graphical user interface. Where such a touch sensor is located directly over a graphical user interface, as with a smart phone or notepad computer, selecting an appropriate location to make a desired touch input simply involves touching the surface directly over the desired user interface element.
However, finding a correct location on a touch sensor to make a touch input may be more difficult in situations where the touch sensor is not located directly over a graphical user interface.
In such a use environment, it may be desirable not to display an image of the user interface on the remote control device during use, to avoid the potentially disruptive user experience of having to look back and forth between the display screen and the remote control device. However, a user may have difficulty quickly selecting user interface elements on a relatively distant display screen when the touch sensor is not in the user's direct field of view. To help overcome such difficulties, current touch-sensitive devices may allow a user to zoom in on a portion of the user interface for more precision. However, this may obscure other areas of the user interface, and also may increase the complexity of interacting with the user interface.
Therefore, embodiments are disclosed herein that relate to facilitating the use of a touch-sensitive user input device by dynamically scaling a mapping of the touch sensor to an active portion of a user interface. Referring again to
Thus, according to the disclosed embodiments, when the user 102 navigates to the text entry user interface 110, the mapping of the touch sensor 118 to the display screen 116 may be dynamically adjusted such that a larger relative area of the touch sensor 118 is mapped to the areas of the display device 106 corresponding to active areas of the user interface 110. This may allow a user to have more precise control of user inputs.
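To make the precision benefit concrete, the hypothetical arithmetic below (all dimensions invented) compares how far the cursor travels for a 1 mm finger movement when the sensor is mapped to the full screen width versus only to the width of the active area.

```python
# Hypothetical dimensions: a 100 mm wide sensor, a 1920 px wide screen,
# and a 960 px wide active region of the user interface.
sensor_width_mm = 100
full_map_px = 1920       # sensor mapped to the entire screen width
active_map_px = 960      # sensor mapped only to the active region

# Screen pixels traversed by a 1 mm finger movement under each mapping.
px_per_mm_full = full_map_px / sensor_width_mm      # 19.2 px/mm
px_per_mm_active = active_map_px / sensor_width_mm  #  9.6 px/mm

# Narrowing the mapped area halves the gain, so the same finger motion
# produces finer cursor motion over the active controls.
print(px_per_mm_full, px_per_mm_active)
```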
In some embodiments, different areas of the touch sensor may be dynamically scaled to different degrees relative to a user interface. This may allow, for example, more-often used user interface controls to be allotted relatively more area on the touch sensor than less-often used controls of a similar size on the user interface. This may allow a user to select the more-often used controls with less precise touch inputs than the less-often used controls. Likewise, user interface controls with greater consequences for an incorrect selection may be allotted relatively less area on the touch sensor than a control of similar size but with lesser consequences for an incorrect selection. This may require a user to select higher-consequence actions more deliberately. As a more specific example, a mapping of a touch sensor may be scaled differently for a “pause” control and a “stop” control on a media playback user interface such that the “pause” control is easier to select, as accidentally selecting a “pause” control may be less consequential than accidentally selecting a “stop” control.
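One hypothetical way to realize such non-uniform scaling is a piecewise mapping in which each control's share of the sensor is weighted independently of its on-screen size; the control names and weights below are invented for illustration.

```python
# Hypothetical controls laid out left-to-right on the sensor. Each entry
# is (control_name, sensor_weight): "pause" receives twice the sensor
# share of "stop" even if both are the same size on screen.
controls = [("rewind", 1.0), ("pause", 2.0), ("stop", 1.0), ("ffwd", 1.0)]

def control_at(u, layout):
    """Return the control under normalized sensor coordinate u in [0, 1]."""
    total = sum(weight for _, weight in layout)
    edge = 0.0
    for name, weight in layout:
        edge += weight / total
        if u < edge:
            return name
    return layout[-1][0]

# "pause" occupies 40% of the sensor (0.2..0.6); "stop" only 20%.
assert control_at(0.45, controls) == "pause"
assert control_at(0.65, controls) == "stop"
```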
Continuing with
As mentioned above, in some embodiments, different areas of the touch sensor may be dynamically scaled to different degrees relative to a user interface, so that different user interface controls may be easier or harder to locate, for example by allotting more-often used controls relatively more sensor area than less-often used controls of a similar on-screen size.
In some embodiments, the user interface mapping may be configured to exhibit some hysteresis when a touch input moves between sub-regions. For example, after a user's finger crosses a boundary from a first sub-region of the touch sensor/user interface mapping into a second sub-region, thereby giving focus to a user interface element in the second sub-region, that element may retain focus when the finger crosses back toward the first sub-region, losing focus only once the cursor passes a threshold distance beyond the boundary. This may require more deliberate user inputs to move between user interface controls, and therefore may help to avoid inadvertent inputs. In other embodiments, a single boundary location may be used to recognize a switch between touch sensor sub-regions in either direction of movement. It will be understood that the degree of hysteresis between sub-regions may vary similarly to the mapping of the sub-regions. For example, a greater amount of hysteresis may be applied when moving into regions having a greater consequence of inadvertent selection than when moving into regions having a lesser consequence.
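Such hysteresis might be sketched as follows (hypothetical names and threshold): focus moves to the second sub-region as soon as the boundary is crossed, but returns only after the touch retreats a threshold distance back past the boundary.

```python
class HysteresisBoundary:
    """Focus switching with hysteresis at a boundary between sub-regions.

    Hypothetical sketch: crossing the boundary at `edge` moves focus
    forward immediately, but focus only returns after the touch retreats
    `margin` past the boundary, so grazing the edge does not flicker focus.
    """
    def __init__(self, edge, margin):
        self.edge = edge
        self.margin = margin
        self.focus = "first"          # which sub-region's element has focus

    def update(self, u):
        """u is the normalized touch coordinate along the crossing axis."""
        if self.focus == "first" and u >= self.edge:
            self.focus = "second"     # entering: switch at the boundary
        elif self.focus == "second" and u < self.edge - self.margin:
            self.focus = "first"      # returning: require extra travel back
        return self.focus

b = HysteresisBoundary(edge=0.5, margin=0.05)
print(b.update(0.52))   # "second" - crossed the boundary
print(b.update(0.48))   # "second" - within the hysteresis band, focus kept
print(b.update(0.44))   # "first"  - past the threshold, focus returns
```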
As mentioned above, dynamic scaling of a touch sensor to a user interface may be used with any suitable touch-sensitive input device, including but not limited to smart phones, portable media players, notepad computers, laptop computers, and dedicated remote control devices.
The use of two touch areas and two actuators allows a user to independently manipulate a separate cursor with each hand, as depicted in
The remote control device 600 further comprises a logic subsystem 612, and a data-holding subsystem 614 comprising instructions stored thereon that are executable by the logic subsystem 612 to perform various tasks, such as receiving user inputs and communicating the user inputs to a media presentation system, display system, etc. Examples of these components are discussed in more detail below.
The use of separate first and second touch areas each having an independently operable actuator may allow a user to enter text quickly with two thumbs or other digits, without lifting the digits off of the surface between letter entries. Further, as remote control device 600 may lack a display screen, a user is not distracted by looking down at the remote control device 600 during use, but rather may place full attention on the display device. These features may offer various advantages over other methods of entering text in a use environment in which the touch sensor may be located a distance from a display screen and out of direct view when a user is looking at the display screen. For example, some remote control devices utilize a directional pad (e.g. a control with up, down, left and right commands) to move a cursor on a displayed alphanumeric keyboard layout. However, such text entry may be slow and tedious. Other remote control devices may comprise a hard keyboard. A hard keyboard may improve the efficiency of text entry compared to the use of a directional pad, but also may increase the size, complexity, and cost of the input device. The inclusion of a hard keyboard also may force a user to split attention between looking down at the device and up at the display screen. In contrast, in the embodiment of
The first actuator 608 and second actuator 610 may utilize any suitable actuation mechanism. In some embodiments, the actuators 608, 610 may comprise physical buttons to provide tactile feedback when text is selected. In other embodiments, the actuators 608, 610 may utilize pressure sensors or other actuation mechanisms. Where pressure sensors or the like are utilized, the remote control device 600 may include a haptic feedback system 616, such as a vibration mechanism, to provide user feedback regarding registered inputs.
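Where a pressure sensor stands in for a physical button, the actuation logic might resemble the following hypothetical sketch, in which exceeding a pressure threshold registers the input and triggers a brief vibration as confirmation; the threshold value and function names are invented.

```python
PRESS_THRESHOLD = 0.6   # hypothetical normalized pressure level

def vibrate(duration_ms):
    """Stand-in for the device's haptic feedback system."""
    print(f"haptic pulse: {duration_ms} ms")

def on_pressure_sample(pressure, was_pressed):
    """Register a press on the rising edge through the threshold."""
    pressed = pressure >= PRESS_THRESHOLD
    if pressed and not was_pressed:
        vibrate(20)       # confirm the registered input to the user
        print("input registered")
    return pressed

state = False
for sample in (0.2, 0.5, 0.7, 0.8, 0.3):   # simulated pressure readings
    state = on_pressure_sample(sample, state)
```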
In the embodiment of
It will be understood that the number of displayed cursors, as well as the mapping of the touch sensor 602 to the display screen, may depend upon the number of fingers touching the touch sensor 602. For example, as depicted in
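As a hypothetical sketch of such behavior (the splitting policy and names are invented for illustration), the routine below produces one cursor per touch point and, when two digits are down, maps the left and right halves of the sensor to the left and right halves of the screen.

```python
def cursors_for_touches(touches, screen_w, screen_h):
    """Return one screen cursor position per touch point.

    touches: list of (u, v) normalized sensor coordinates in [0, 1].
    With two touches, the sensor is split into left and right halves,
    each mapped to the matching half of the screen (hypothetical policy).
    """
    if len(touches) == 2:
        (u1, v1), (u2, v2) = sorted(touches)           # leftmost touch first
        left = (u1 / 0.5 * screen_w / 2, v1 * screen_h)
        right = ((u2 - 0.5) / 0.5 * screen_w / 2 + screen_w / 2,
                 v2 * screen_h)
        return [left, right]
    # One touch (or none): the whole sensor maps to the whole screen.
    return [(u * screen_w, v * screen_h) for u, v in touches]

print(cursors_for_touches([(0.25, 0.5)], 1920, 1080))               # one cursor
print(cursors_for_touches([(0.25, 0.5), (0.75, 0.5)], 1920, 1080))  # two cursors
```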
As mentioned above, the display systems and touch-sensitive input devices described above, including but not limited to touch-sensitive device 104, display device 106, media presentation device 107, and remote control device 600, each may take the form of a computing system.
The computing system 1000 includes a logic subsystem 1002 and a data-holding subsystem 1004. The computing system 1000 may optionally include a display subsystem 1006, or may omit a display subsystem (as described with reference to the remote control device of
The logic subsystem 1002 may include one or more physical devices configured to execute one or more instructions. For example, the logic subsystem 1002 may be configured to execute one or more instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more devices, or otherwise arrive at a desired result.
The logic subsystem 1002 may include one or more processors that are configured to execute software instructions. Additionally or alternatively, the logic subsystem 1002 may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic subsystem 1002 may be single core or multicore, and the programs executed thereon may be configured for parallel or distributed processing. The logic subsystem 1002 may optionally include individual components that are distributed throughout two or more devices, which may be remotely located and/or configured for coordinated processing. One or more aspects of the logic subsystem 1002 may be virtualized and executed by remotely accessible networked computing devices configured in a cloud computing configuration.
The data-holding subsystem 1004 may include one or more physical, non-transitory devices comprising computer-readable media configured to store data and/or instructions executable by the logic subsystem to implement the herein described methods and processes. When such methods and processes are implemented, the state of the data-holding subsystem 1004 may be transformed (e.g., to hold different data).
The data-holding subsystem 1004 may include removable media and/or built-in devices. The data-holding subsystem 1004 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard disk drive, floppy disk drive, tape drive, MRAM, etc.), among others. The data-holding subsystem 1004 may include devices with one or more of the following characteristics: volatile, nonvolatile, dynamic, static, read/write, read-only, random access, sequential access, location addressable, file addressable, and content addressable. In some embodiments, the logic subsystem 1002 and the data-holding subsystem 1004 may be integrated into one or more common devices, such as an application specific integrated circuit or a system on a chip.
It is to be appreciated that the data-holding subsystem 1004 includes one or more physical, non-transitory devices. In contrast, in some embodiments, aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for at least a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.
When included, display subsystem 1006 may be used to present a visual representation of data held by data-holding subsystem 1004. As the herein described methods and processes change the data held by the data-holding subsystem, and thus transform the state of the data-holding subsystem, the state of display subsystem 1006 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 1006 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 1002 and/or data-holding subsystem 1004 in a shared enclosure, or such display devices may be peripheral display devices.
Communication subsystem 1008 may be configured to communicatively couple computing system 1000 with one or more other computing devices. Communication subsystem 1008 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As nonlimiting examples, the communication subsystem may be configured for communication via a wireless telephone network, a wireless local area network, a wired local area network, a wireless wide area network, a wired wide area network, etc. In some embodiments, the communication subsystem may allow computing system 1000 to send and/or receive messages to and/or from other devices via a network such as the Internet.
It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.