This disclosure relates to data processing on a computing device.
A user may often interact with a computing device (e.g., mobile phone, personal digital assistant, smart phone, or the like) to provide manual user input. For instance, a user may use a keyboard, mouse, trackpad, touchscreen, or other user interface to provide input during execution of one or more applications on the computing device.
In many instances, users may interact with mobile computing devices via touchscreen or trackpad devices by providing touch-based input (e.g., one or more touch gestures) recognized by these devices. Touchscreen devices allow a user to provide direct interaction with a computing device, while trackpad devices typically provide indirect interaction that has been modeled from mouse-based interfaces.
In general, this disclosure describes techniques for mapping trackpad interactions and operations to touchscreen events without the use of a touchscreen user interface. For example, a computing device (e.g., mobile computing device, desktop device) may include or be coupled to a pointing device, such as a trackpad device, but may or may not be coupled to a separate touchscreen device. Trackpad operations may be directly mapped to touchscreen events, which may be processed by applications that may be configured to process such events (e.g., applications and/or operating systems designed around a touchscreen user interface). For example, certain tap or multi-touch operations input via a trackpad device may be mapped to corresponding touchscreen events that may then be processed by such applications. In such fashion, a computing device may be capable of processing touchscreen-based events for applications that are configured to process such events during execution on the computing device, regardless of whether the computing device is or is not coupled to a touchscreen device.
In one example, a computer-readable storage medium includes instructions that, when executed, cause one or more processors of a computing device to: receive, via a trackpad device coupled to the computing device, touch-based input comprising one or more gestures, wherein the trackpad device is physically distinct from a display device coupled to the computing device; determine a trackpad operation based upon the touch-based input; determine a touchscreen event based upon a mapping of the trackpad operation to the touchscreen event, wherein the touchscreen event is determined without receiving any user input from a touchscreen device; and generate the touchscreen event for processing by an application executing on the computing device, wherein the application is designed to process touchscreen events initiated by touchscreen devices.
In one example, a method includes receiving, via a trackpad device coupled to a computing device, touch-based input comprising one or more gestures, wherein the trackpad device is physically distinct from a display device coupled to the computing device. The method further includes determining, by the computing device, a trackpad operation based upon the touch-based input, and determining, by the computing device, a touchscreen event based upon a mapping of the trackpad operation to the touchscreen event, wherein the touchscreen event is determined without receiving any user input from a touchscreen device. The method further includes generating, by the computing device, the touchscreen event for processing by an application executing on the computing device, wherein the application is designed to process touchscreen events initiated by touchscreen devices.
In one example, a computing device includes one or more processors, a trackpad driver, and an application. The trackpad driver is operable by the one or more processors to receive, via a trackpad device coupled to the computing device, touch-based input comprising one or more gestures, wherein the trackpad device is physically distinct from a display device that is also coupled to the computing device. The application is also operable by the one or more processors, the application being designed to process touchscreen events initiated by touchscreen devices. The computing device further includes means for determining a touchscreen event based upon a mapping from a trackpad operation that is based upon the touch-based input, wherein the touchscreen event is determined without receiving any user input from a touchscreen device. The computing device further includes means for generating the touchscreen event for processing by the application.
The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the disclosure will be apparent from the description and drawings, and from the claims.
Techniques of the present disclosure may allow a computing device (e.g., a mobile/portable device) to transform or map trackpad operations to corresponding touchscreen events, where the computing device may or may not include a touchscreen interface/device but at least includes, or is coupled to, a trackpad device. Certain applications and operating systems (e.g., the Android® operating system) have been designed around a touchscreen user interface, but it may be beneficial to allow such applications and operating systems to be implemented on more traditional devices (e.g., desktop/netbook/laptop devices) that include physical keyboards and/or pointing devices, such as trackpad devices. Trackpad operations may be directly mapped to touchscreen events, which may be processed by applications that may be configured to process such events. In such fashion, a computing device may be capable of processing touchscreen-based events for applications that are configured to process such events during execution on the computing device, regardless of whether the computing device includes or is coupled to a touchscreen.
For instance, a user may initially move a single finger on the trackpad device to cause a displayed pointer to move correspondingly on a display device of the computing device. The user may touch or tap a single finger on the trackpad device to deliver a simulated touchscreen finger tap at the current pointer location as displayed on the display device. The trackpad finger tap operation may thereby be mapped to a touchscreen finger tap event that may be generated for processing by an application that is designed to receive input via a touchscreen interface.
If the user touches two fingers on the trackpad device and moves them substantially together, those movements can be mapped into absolute touchscreen events relative to the current pointer location displayed at the start of movement, allowing the user to directly interact with graphical elements located at the current pointer location as if using a touchscreen device, and to perform traditional touchscreen operations (e.g., dragging/flinging content, scrolling). There may also be a mechanism to map multi-touch operations from the trackpad device to select certain touchscreen events, such as pinch-zoom events. The computing device may, for example, generate such touchscreen events upon determining that the user's two fingers are moving in different directions on the trackpad device, or even possibly by determining that the user is holding down a modifier key on the physical keyboard while moving the two fingers on the trackpad device. In such fashion, the trackpad can function as a “virtual” touchscreen input device.
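As an illustrative, non-limiting sketch of the gesture distinctions described above, the following Python fragment classifies a frame of trackpad contact data into an operation. The function name, movement threshold, and operation labels are assumptions for illustration only, not part of the disclosed implementation:

```python
# Illustrative sketch only: classify one frame of trackpad contact data
# into the operations described above. Names and thresholds are assumed.

def classify_operation(touch_deltas, modifier_held=False):
    """Classify per-finger movement deltas into a trackpad operation.

    touch_deltas: list of (dx, dy) tuples, one per finger on the trackpad.
    modifier_held: True if a keyboard modifier key is currently held.
    """
    if not touch_deltas:
        return "none"
    if len(touch_deltas) == 1:
        dx, dy = touch_deltas[0]
        # A lone finger with negligible movement reads as a tap;
        # otherwise it moves the displayed pointer.
        return "tap" if abs(dx) < 2 and abs(dy) < 2 else "pointer_move"
    # Two (or more) fingers: compare the first two movement vectors.
    (dx1, dy1), (dx2, dy2) = touch_deltas[0], touch_deltas[1]
    dot = dx1 * dx2 + dy1 * dy2
    if modifier_held or dot < 0:
        # Opposing movement (or a held modifier) selects the
        # multi-touch mapping, e.g. for pinch-zoom events.
        return "multi_touch_move"
    return "single_touch_move"  # fingers moving substantially together
```

For example, two fingers moving in opposite directions (`[(6, 0), (-6, 0)]`) would select the multi-touch mapping, while two fingers sliding the same way would select the single-touch mapping.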
Computing device 2 may be capable of executing one or more applications 10A-10N (collectively, applications 10). Applications 10 may include any number of applications that may be executed by computing device 2, such as a digital multimedia player application, a video game application, a web browser application, an email application, a word processing application, a spreadsheet application, and/or a document reader application, to name only a few. Applications 10 may execute on computing device 2 via an operating system that is also executed by computing device 2. This operating system and/or one or more of applications 10 may be designed around a touchscreen user interface, or be configured to process touchscreen events, even though computing device 2 may or may not include or be coupled to touchscreen 3.
As shown in
Computing device 2 includes trackpad driver 6, which manages the operational interface to trackpad device 4. Trackpad driver 6 may, in some examples, comprise one or more software/firmware modules that manage this interface. In some cases, trackpad driver 6 may include software that is executed as part of the operating system of computing device 2. Computing device 2 also includes a trackpad/touchscreen event dispatcher 8 that may be implemented or executed by computing device 2. In some cases, trackpad/touchscreen event dispatcher 8 may include software that is executed as part of the operating system of computing device 2.
The user input (e.g., single- and/or multi-touch gesture input) received via trackpad device 4 may be processed or determined by trackpad driver 6, and trackpad driver 6 may provide trackpad touch data corresponding to the received user input, which may comprise touch-based input (e.g., multi-touch input). This trackpad touch data may, in some cases, include raw touch data.
Trackpad driver 6 and/or trackpad/touchscreen event dispatcher 8 may determine trackpad operations (e.g., multi-touch movement operations) based upon the touch-based user input. For example, trackpad driver 6 and/or trackpad/touchscreen event dispatcher 8 may determine, based upon user input received via trackpad device 4, that the user has initiated tap, other single-touch, and/or multi-touch gestures.
Trackpad/touchscreen event dispatcher 8 may determine corresponding touchscreen events based upon mappings of the trackpad operations to these touchscreen events, wherein the touchscreen events are determined without receiving any user input from a touchscreen device. Trackpad/touchscreen event dispatcher 8 may then generate touchscreen events for processing by one or more of applications 10 that are designed to process touchscreen events initiated by touchscreen devices. These one or more of applications 10 and/or trackpad/touchscreen event dispatcher 8 may update the display of information displayed via display device 12, such as the content displayed via display device 12 at a current location of pointer 13 that is also displayed via display device 12. As shown in
For instance, trackpad/touchscreen event dispatcher 8 may map a trackpad tap operation into a touchscreen tap operation that is generated at the current location of pointer 13 that is displayed via display device 12. In some examples, a user may move two fingers (i.e., digits) of hand 5 from a first location to a second location on the surface of trackpad device 4. In these examples, trackpad/touchscreen event dispatcher 8 may map such a trackpad multi-touch movement operation from the first location to the second location into a touchscreen event (e.g., touchscreen single-touch event) comprising movement from the first location to the second location relative to the current location of pointer 13 at the start of movement. In this example, a multi-touch trackpad operation may be mapped into a single-touch touchscreen event processed by one or more of applications 10 relative to the current location of pointer 13. For example, the single-touch touchscreen event may be processed by one of applications 10 to perform certain actions or instructions, such as dragging or scrolling content that is displayed at the location of pointer 13. Numerous other examples of trackpad operation to touchscreen event mapping will be provided below.
In some cases, the user may interact with trackpad device 4 to cause movement of pointer 13 that is displayed via display device 12. For instance, in some cases, the user may move one finger of hand 5 on the surface of trackpad device 4 to cause movement of pointer 13. The single-touch input via trackpad device 4 may not necessarily be translated or mapped into a corresponding touchscreen event, but may be utilized to move pointer 13. Subsequent touchscreen events that are mapped from trackpad operations may be processed by one or more of applications 10 with respect to the current location of pointer 13.
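The distinction between pointer movement and mapped touchscreen events might be sketched as follows; `PointerState` and `route` are hypothetical names used only for illustration, not names from this disclosure:

```python
# Hypothetical sketch: single-finger movement repositions the pointer only,
# while mapped operations become touchscreen events anchored at the pointer.

class PointerState:
    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y

    def move(self, dx, dy, sensitivity=1.0):
        # Relative trackpad deltas update the absolute pointer position.
        self.x += dx * sensitivity
        self.y += dy * sensitivity

def route(operation, delta, pointer):
    """Return a touchscreen event dict, or None if only the pointer moved."""
    if operation == "pointer_move":
        pointer.move(*delta)
        return None  # no touchscreen event is generated for pointer moves
    # Every mapped event is delivered at the current pointer location.
    return {"type": operation, "x": pointer.x, "y": pointer.y}
```

This reflects the design choice above: single-touch input steers the pointer, and only subsequent operations are translated into touchscreen events at the pointer's current location.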
Certain aspects of the disclosure may provide one or more benefits. For example, certain aspects may allow computing device 2 to transform or map trackpad operations to corresponding touchscreen events, where computing device 2 includes, or is coupled to, trackpad device 4. Computing device 2 may implement or execute certain applications and operating systems (e.g., the Android® operating system) that have been designed around a touchscreen user interface, even though computing device 2 may or may not be coupled to optional touchscreen 3.
Trackpad operations may be directly mapped to touchscreen events, which may be processed by applications (e.g., one or more of applications 10) that may be configured to process such events. In such fashion, computing device 2, which may or may not include/be coupled to touchscreen device 3, may still be capable of processing touchscreen-based events for applications that are configured to process such events during execution on computing device 2 based upon input provided to trackpad device 4.
As indicated in
However, in examples where computing device 2 is coupled to, or does include, touchscreen 3, a user need not necessarily use touchscreen 3 in order to cause computing device 2 to generate touchscreen events for processing by applications 10. Instead, the user may interact with trackpad device 4 to cause trackpad driver 6 to provide trackpad touch data to event dispatcher 8, which maps one or more trackpad operations to touchscreen events, as described above, and provides such touchscreen events for processing by applications 10.
For instance, in one specific example, computing device 2 may comprise a tablet computer that includes a touchscreen interface. However, the tablet computer can be docked with a separate keyboard, which may include a trackpad device (e.g., trackpad device 4). When the tablet computer is docked, the user can interact with the trackpad device without needing to reach over to and use the touchscreen. In such fashion, the user need not use the touchscreen of the tablet computer, but can rather interact with the separate trackpad device to cause the tablet computer to generate and process touchscreen events, just as though the user were using the touchscreen. The trackpad device may thereby serve as a “virtual” touchscreen input device.
As shown in the specific example of
Processors 22 may be configured to implement functionality and/or process instructions for execution within computing device 2. Processors 22 may be capable of processing instructions stored in memory 24 or instructions stored on storage devices 28.
Memory 24 may be configured to store information within computing device 2 during operation. Memory 24 may, in some examples, be described as a computer-readable storage medium. In some examples, memory 24 is a temporary memory, meaning that a primary purpose of memory 24 is not long-term storage. Memory 24 may also, in some examples, be described as a volatile memory, meaning that memory 24 does not maintain stored contents when the computer is turned off. Examples of volatile memories include random access memories (RAM), dynamic random access memories (DRAM), static random access memories (SRAM), and other forms of volatile memories known in the art. In some examples, memory 24 may be used to store program instructions for execution by processors 22. Memory 24 may be used by software or applications running on computing device 2 (e.g., one or more of applications 10) to temporarily store information during program execution.
Storage devices 28 may also include one or more computer-readable storage media. Storage devices 28 may be configured to store larger amounts of information than memory 24. Storage devices 28 may further be configured for long-term storage of information. In some examples, storage devices 28 may comprise non-volatile storage elements. Examples of such non-volatile storage elements may include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories.
Computing device 2 also includes network interface 26. Computing device 2 may utilize network interface 26 to communicate with external devices via one or more networks, such as one or more wireless networks. In some examples, network interface 26 may include a Bluetooth® network interface module. In these examples, computing device 2 may utilize network interface 26 to wirelessly communicate with external trackpad device 4 via Bluetooth® communication. Display module 30 is operable to provide an interface to a display device (e.g., display device 12 shown in
Any applications implemented within or executed by computing device 2 (e.g., applications 10) may be implemented or contained within, operable by, executed by, and/or be operatively/communicatively coupled to processors 22, memory 24, network interface 26, and/or storage devices 28.
As shown in the example of
Example modules 32, 34, and 36 of trackpad/touchscreen event dispatcher 8 are shown in
Trackpad driver 6 may be operable by processors 22 to receive, via trackpad device 4, touch-based input comprising one or more gestures. One or more of applications 10, operable by processors 22, may be designed to process touchscreen events initiated by touchscreen devices, even though computing device 2 may not include or be otherwise coupled to a touchscreen device. Operation/event mapping module 32 may be operable by processors 22 to perform the mapping of a given trackpad operation to a touchscreen event at least by receiving trackpad touch data corresponding to the touch-based input that is provided by trackpad driver 6.
Event generation module 34 may be operable by processors 22 to generate the touchscreen event at least by providing mapped touch data corresponding to the touchscreen event to an application (e.g., one of applications 10, such as application 10A). Display module 30 may be operable by processors 22 to update content displayed via the display device (e.g., display device 12) based upon the processing of the touchscreen event by application 10A. Display module 30 may, in some cases, receive display information from trackpad/touchscreen event dispatcher (e.g., from display pointer module 36) and/or the application 10A.
Display pointer module 36 may be operable by processors 22 to determine a current location of pointer 13 that is displayed via display device 12. Event generation module 34 may be operable to generate the touchscreen event for processing by application 10A at the current location determined by display pointer module 36. In some cases, display pointer module 36 is further operable to update the current location of pointer 13 that is displayed via display device 12 based upon a second trackpad operation that corresponds to movement via trackpad device 4 based upon additional touch-based input. For instance, a user may initiate a single-touch gesture via trackpad device 4 to cause movement of pointer 13.
In one example, the trackpad operation may comprise a trackpad tap operation, where the user touches or taps a finger of hand 5 (
In some cases, trackpad/touchscreen event dispatcher 8 may provide touchscreen event data to any of applications 10, including application 10A, via an asynchronous event handling interface. In the example above, trackpad/touchscreen event dispatcher 8 may provide the touchscreen tap event data to application 10A via such an interface, and may also further provide location information for the current/present location of pointer 13, such that the event may be processed by application 10A with respect to this location. The event data may be provided by event generation module 34, and the pointer location information may be provided by display pointer module 36.
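One possible shape for such an asynchronous event-handling interface can be sketched with a standard queue, as below. The class name and event payload structure are assumptions for illustration, not the actual interface of dispatcher 8:

```python
# Sketch of an asynchronous event-handling interface between a dispatcher
# and an application: the producer enqueues without blocking, and the
# consumer drains events on its own schedule. Names are illustrative.
import queue

class AsyncEventInterface:
    def __init__(self):
        self._events = queue.Queue()

    def post(self, event):
        # The dispatcher enqueues and returns immediately; it never
        # blocks waiting on the application to handle the event.
        self._events.put(event)

    def drain(self, handler):
        # The application consumes all pending events when it is ready.
        while True:
            try:
                event = self._events.get_nowait()
            except queue.Empty:
                return
            handler(event)
```

An application could pass a handler callback to `drain()` from its own event loop, decoupling event generation (by the dispatcher) from event processing (by the application), and each posted event could carry the pointer-location information described above.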
In one example, the trackpad operation may comprise a trackpad multi-touch movement operation from a first location to a second location based upon the touch-based input via trackpad device 4. The touch-based input may comprise movement via trackpad device 4 of at least two user digits of hand 5 from the first location to the second location. Operation/event mapping module 32 may be operable to determine, based upon a mapping from the trackpad operation, a touchscreen single-touch event comprising movement from the first location to the second location relative to the current location of pointer 13 at the start of movement, according to this specific example. Thus, in this example, a multi-touch trackpad gesture received via trackpad device 4, where at least two of the user's fingers are moving substantially together across trackpad device 4, may be mapped to a single-touch touchscreen gesture event (e.g., simulating one finger/digit down on a touchscreen) that may be processed by application 10A.
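The relative-to-absolute translation in this example might be sketched as follows; the function name and data shapes are assumptions for illustration:

```python
# Illustrative only: translate a two-finger "move together" gesture into
# a simulated single-touch path in absolute display coordinates, anchored
# at the pointer location captured when the movement began.

def map_movement(pointer_at_start, cumulative_deltas):
    """Build the absolute path of the simulated touchscreen finger.

    pointer_at_start: pointer (x, y) at the start of the gesture.
    cumulative_deltas: running (dx, dy) offsets of the averaged contacts.
    """
    px, py = pointer_at_start
    # The simulated finger goes down at the pointer, then tracks the
    # trackpad movement point for point.
    return [(px, py)] + [(px + dx, py + dy) for dx, dy in cumulative_deltas]
```

For instance, a gesture beginning with the pointer at (50, 50) and moving steadily downward yields a simulated finger path starting at (50, 50) and descending with the trackpad contacts.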
In another example, the trackpad operation may comprise a trackpad multi-touch movement operation in multiple directions based upon the touch-based input, where the touch-based input comprises movement via trackpad device 4 of at least two user digits in the multiple directions. Operation/event mapping module 32 may be operable to determine, based upon a mapping from the trackpad operation, a touchscreen multi-touch event comprising movement in the multiple directions relative to the current location of pointer 13 at the start of movement. Thus, in this example, a multi-touch trackpad gesture received via trackpad device 4, where at least two of the user's fingers are moving in different directions across trackpad device 4, may be mapped to a multi-touch touchscreen gesture event (e.g., simulating two fingers/digits down on a touchscreen) that may be processed by application 10A (e.g., for operations such as pinch zoom).
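Synthesizing such a multi-touch event around the pointer might look like the sketch below. The event format and function name are assumptions, not the actual event structure used by the dispatcher:

```python
# Illustrative sketch: synthesize a two-contact touchscreen event around
# the current pointer location from diverging per-finger deltas.

def make_multi_touch_event(pointer_xy, delta_a, delta_b):
    """Place two simulated fingers at the pointer, offset by their deltas."""
    px, py = pointer_xy
    # Opposing deltas spread the contacts apart (zoom in) or pull them
    # together (zoom out) about the pointer location.
    finger_a = (px + delta_a[0], py + delta_a[1])
    finger_b = (px + delta_b[0], py + delta_b[1])
    return {"type": "multi_touch", "contacts": [finger_a, finger_b]}
```

With the pointer at (100, 100) and the fingers moving apart horizontally, the two simulated contacts land on either side of the pointer, which an application could interpret as a pinch-zoom about that point.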
The method includes receiving, via a trackpad device coupled to a computing device, touch-based input comprising one or more gestures, wherein the trackpad device is physically distinct from a display device coupled to the computing device (50). The method further includes determining, by the computing device, a trackpad operation based upon the touch-based input (52), and determining, by the computing device, a touchscreen event based upon a mapping of the trackpad operation to the touchscreen event, wherein the touchscreen event is determined without receiving any user input from a touchscreen device (54). The method also includes generating, by the computing device, the touchscreen event for processing by an application executing on the computing device, wherein the application is designed to process touchscreen events initiated by touchscreen devices (56). In some cases, the computing device may be further coupled to a separate touchscreen device.
The touch-based input may comprise multi-touch input, and the trackpad operation may include a multi-touch movement operation. In some examples, determining the touchscreen event may include receiving trackpad touch data corresponding to the touch-based input that is provided by a trackpad driver associated with the trackpad device, and generating the touchscreen event may include providing mapped touch data corresponding to the touchscreen event to the application.
The method of
In some examples, the trackpad operation may include a trackpad tap operation, and determining the touchscreen event may include determining a touchscreen tap event at the current location of the pointer. In some examples, the touch-based input may comprise movement via the trackpad device of at least two user digits from a first location to a second location, the trackpad operation may include a trackpad multi-touch movement operation from the first location to the second location based upon the touch-based input, and determining the touchscreen event may include determining a touchscreen single-touch event comprising movement from the first location to the second location relative to the current location of the pointer.
In some examples, the trackpad operation may include a trackpad multi-touch movement operation in multiple directions based upon the touch-based input, where the touch-based input comprises movement via the trackpad device of at least two user digits in the multiple directions. Determining the touchscreen event may include determining a touchscreen multi-touch event comprising movement in the multiple directions relative to the current location of the pointer.
As shown in
The mappings shown in
The first trackpad operation shown in
As one example,
In such fashion, touchscreen tap event 72 may be generated for processing by one or more of applications 10 without the use of an actual touchscreen device. In this case, trackpad 4 functions as a virtual touchscreen input device. In some examples, touchscreen tap event 72 may cause selection of an item presented for display at a current location of pointer 13 by one or more of applications 10. For example, if pointer 13 is displayed by, with, or over an icon or other graphical item, the generation of touchscreen tap event 72 may cause selection of this icon or other graphical item during execution of the one or more of applications 10.
Referring again to
Mapping module 32 may map this trackpad operation to a touchscreen single-touch gesture event corresponding to movement from the first location to the second location relative to the position of pointer 13. The current pointer location of pointer 13 may be monitored and/or managed by display pointer module 36 (
As one example,
Thus, trackpad 4 may function as a virtual touchscreen input device, in which trackpad multi-touch operation 80 may be mapped to touchscreen single-touch gesture event 82. As shown in
This touchscreen event 82 comprises an event that is generated by event generation module 34 upon processing of trackpad multi-touch operation 80, even though trackpad 4 does not comprise an actual touchscreen device, but instead functions as a virtual touchscreen input mechanism. In such fashion, a touchscreen single-touch gesture event may be generated for processing by one or more of applications 10 without the use of an actual touchscreen device. For instance, the generated touchscreen single-touch gesture event 82 may cause one or more of applications 10 to perform certain actions or execute corresponding instructions, such as dragging or scrolling content that may be presently displayed at the current location of pointer 13 on display device 12, just as would be the case if display device 12 were a touchscreen display device.
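How an application might consume such a movement event as a scroll can be sketched as follows, continuing the illustrative path format used above. Whether content follows the simulated finger or moves against it is an application design choice, so only the displacement itself is computed here:

```python
# Illustrative only: an application interpreting a single-touch movement
# path as a vertical scroll. The path is a list of absolute (x, y) points.

def vertical_scroll_delta(path):
    """Return the vertical displacement implied by a touch movement path."""
    (_, y_start) = path[0]
    (_, y_end) = path[-1]
    # Positive values indicate downward movement of the simulated finger;
    # the application decides how content responds to that movement.
    return y_end - y_start
```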
As shown in the example scenario of
Upon receipt and processing of touchscreen event 82 shown in
Thus, in this example, the processing of touchscreen single-touch movement gesture 82, for downward direction 81, causes the content shown in
Referring again to
Mapping module 32 may map this trackpad operation to a touchscreen multi-touch gesture event corresponding to movement in these different directions relative to the position of pointer 13. The current pointer location of pointer 13 may be monitored and/or managed by display pointer module 36 (
As one example,
Thus, trackpad 4 may function as a virtual touchscreen input device, in which trackpad multi-touch operation 90 may be mapped to touchscreen multi-touch gesture event 92. As shown in
This touchscreen event 92 comprises an event that is generated by event generation module 34 upon processing of trackpad multi-touch operation 90, where trackpad 4 functions as a virtual touchscreen input mechanism. In such fashion, a touchscreen multi-touch gesture event may be generated for processing by one or more of applications 10 without the use of an actual touchscreen device. For instance, the generated touchscreen multi-touch gesture event 92 may cause one or more of applications 10 to pinch zoom content that may be presently displayed at the current location of pointer 13 on display device 12, just as would be the case if display device 12 were a touchscreen display device.
In the example of
In some cases, mapping module 32 may map trackpad operation 90 to touchscreen event 92 based upon trackpad touch data that is received from trackpad driver 6 (
In some instances, when initiating multi-touch gestures comprising movement in different directions across trackpad 4, the user may also optionally select another input mechanism, such as a key on a keyboard that is part of or coupled to computing device 2, when performing multi-touch gestures of movement in these different directions via trackpad 4. In these instances, the data associated with such a selection may be provided (e.g., by a keyboard driver) to mapping module 32. Mapping module 32 may utilize this data in conjunction with the touch data provided by trackpad driver 6 when determining the event (e.g., touchscreen event 92) that is to be generated. For example, the data associated with a keyboard key selection may trigger mapping module 32 to determine that touchscreen multi-touch event 92 is to be generated by event generation module 34.
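The modifier-assisted selection described above might be sketched as below. The key name and returned event labels are illustrative assumptions only:

```python
# Sketch of modifier-assisted mapping: a key reported by a keyboard driver
# can force the multi-touch mapping even when the fingers move together.

def select_event_type(fingers_diverging, held_keys, trigger_key="ctrl"):
    """Choose the touchscreen event type for a two-finger trackpad movement.

    fingers_diverging: True if the contacts move in different directions.
    held_keys: set of key names currently reported by the keyboard driver.
    """
    if fingers_diverging or trigger_key in held_keys:
        return "multi_touch_move"   # e.g. mapped to a pinch-zoom event
    return "single_touch_move"      # e.g. mapped to a drag/scroll event
```

This mirrors the combination described above: the mapping module can weigh keyboard data alongside the trackpad touch data when deciding which touchscreen event to generate.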
Referring again to
Display pointer module 36 may process the information for the trackpad operation to update the current location of pointer 13 displayed via display device 12. Thus, in such fashion, a user can move and control the position of pointer 13 on display device 12 with respect to other content that is displayed, such that the user may subsequently interact with trackpad 4 to provide additional input with respect to content displayed by, with, or beneath pointer 13 (e.g., to select the content, to scroll the content, to zoom into or out of the content, as described above).
As shown in
For instance, a trackpad tap operation may be mapped to a touchscreen tap event that is generated and processed by application 10A, which manages the content displayed via display 12, to select the content displayed next to pointer 13. A trackpad multi-touch gesture corresponding to multiple fingers moving together across trackpad 4 may be mapped to a touchscreen single-touch event that is generated and processed by application 10A to scroll or drag the content displayed next to pointer 13. A trackpad multi-touch gesture corresponding to multiple fingers moving in different directions across trackpad 4 may be mapped to a touchscreen multi-touch event that is generated and processed by application 10A to perform a pinch-zoom (e.g., zoom in, zoom out) operation with respect to the content.
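The three example mappings above can be collected into one small lookup table; the operation and event names here are illustrative, not normative:

```python
# Summary of the example mappings: trackpad operation -> touchscreen event.

OPERATION_TO_EVENT = {
    "trackpad_tap":          "touchscreen_tap",           # select content
    "multi_touch_together":  "touchscreen_single_touch",  # drag or scroll
    "multi_touch_diverging": "touchscreen_multi_touch",   # pinch-zoom
}

def map_operation(operation):
    """Look up the touchscreen event generated for a trackpad operation."""
    return OPERATION_TO_EVENT.get(operation)
```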
The techniques described in this disclosure may be implemented, at least in part, in hardware, software, firmware, or any combination thereof. For example, various aspects of the described techniques may be implemented within one or more processors, including one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or any other equivalent integrated or discrete logic circuitry, as well as any combinations of such components. The term “processor” or “processing circuitry” may generally refer to any of the foregoing logic circuitry, alone or in combination with other logic circuitry, or any other equivalent circuitry. A control unit including hardware may also perform one or more of the techniques of this disclosure.
Such hardware, software, and firmware may be implemented within the same device or within separate devices to support the various techniques described in this disclosure. In addition, any of the described units, modules or components may be implemented together or separately as discrete but interoperable logic devices. Depiction of different features as modules or units is intended to highlight different functional aspects and does not necessarily imply that such modules or units must be realized by separate hardware, firmware, or software components. Rather, functionality associated with one or more modules or units may be performed by separate hardware, firmware, or software components, or integrated within common or separate hardware, firmware, or software components.
The techniques described in this disclosure may also be embodied or encoded in a computer-readable medium, such as a computer-readable storage medium, containing instructions. Instructions embedded or encoded in a computer-readable medium, including a computer-readable storage medium, may cause one or more programmable processors, or other processors, to implement one or more of the techniques described herein, such as when instructions included or encoded in the computer-readable medium are executed by the one or more processors. Computer-readable storage media may include random access memory (RAM), read only memory (ROM), programmable read only memory (PROM), erasable programmable read only memory (EPROM), electronically erasable programmable read only memory (EEPROM), flash memory, a hard disk, a compact disc ROM (CD-ROM), a floppy disk, a cassette, magnetic media, optical media, or other computer readable media. In some examples, an article of manufacture may comprise one or more computer-readable storage media.
Various aspects of the disclosure have been described. These and other aspects are within the scope of the following claims.