Embodiments of the present invention relate generally to computing systems and, more specifically, to mapping touchscreen gestures to ergonomic controls across application scenes.
With some software applications, a user may navigate and interact with the application by performing certain touch-oriented gestures via a touchscreen input mechanism. Employment of a touchscreen input mechanism is particularly common in video game applications, such as those downloaded for use on tablets or smart phones, since a touchscreen is the primary means of input for such devices. But while such applications are oftentimes designed for user navigation and interaction via a touchscreen input mechanism, other computing devices besides tablets and smart phones may also be used to run these programs.
Video gaming consoles are well-suited for running many video game applications, but considerably less so for touchscreen-oriented programs, such as video games originally designed for tablets. This is because gaming consoles typically include ergonomic mechanical navigation controls that greatly facilitate navigating and interacting with a video game application, but these controls are typically unavailable for use with touchscreen-oriented programs. Thus, a user must resort to using touchscreen-based controls on the integrated screen of the console, which results in a lower-quality gaming experience. To address this shortcoming, for some touchscreen-oriented programs, these mechanical navigation controls, e.g., buttons and joystick controllers, can be mapped to particular locations on the screen of the gaming console and used to mimic an actual user touch on the touchscreen. However, for video games that include multiple scenes, the advantages of using a video gaming console in this particular fashion are limited. Because the mechanical navigation controls can only be mapped to the on-screen touch controls of a single scene of the video game application, the same mapping must be used across all scenes in the video game. Thus, for any scene in the application that does not have touch controls identical to those of the mapped scene, navigation and other interactions must rely on the touch controls displayed on the touchscreen of the video gaming console.
As the foregoing illustrates, what is needed in the art is a more effective way to execute software applications that are designed for touchscreen interactions on computing devices with mechanical controls.
One embodiment of the present invention sets forth a method for implementing on-screen gestures associated with a software application. The method includes receiving a first control input that relates to a first scene associated with the software application, translating the first control input into a first set of instructions recognizable to the software application based on a first mapping of the first control input to at least one touch location within a region of the first scene, and providing the first set of instructions to an operating system that is configured to include the first set of instructions in the software application. The method also includes receiving a second control input that relates to a second scene associated with the software application, translating the second control input into a second set of instructions recognizable to the software application based on a second mapping of the second control input to at least one touch location within a region of the second scene, wherein the second mapping is different than the first mapping, and providing the second set of instructions to the operating system, wherein the operating system is configured to include the second set of instructions in the software application.
One advantage of the disclosed embodiments is that mechanical control inputs can be implemented as on-screen gestures in a software application that is normally controlled by touchscreen gestures. Such mechanical control inputs can be used to navigate and interact with a software application even when the application includes multiple scenes with different touchscreen controls. An additional advantage is that a user or third party can create a custom mapping for a particular application and make this mapping available to other users via cloud computing.
So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
For clarity, identical reference numbers have been used, where applicable, to designate identical elements that are common between figures. It is contemplated that features of one embodiment may be incorporated in other embodiments without further recitation.
In operation, I/O bridge 107 is configured to receive user input information from input devices 108, such as a keyboard, a mouse, or game console control buttons, and forward the input information to CPU 102 for processing via communication path 106 and memory bridge 105. Switch 116 is configured to provide connections between I/O bridge 107 and other components of the computer system 100, such as a network adapter 118 and various add-in cards 120 and 121.
As also shown, I/O bridge 107 is coupled to a system disk 114 that may be configured to store content and applications and data for use by CPU 102 and parallel processing subsystem 112. As a general matter, system disk 114 provides non-volatile storage for applications and data and may include fixed or removable hard disk drives, flash memory devices, and CD-ROM (compact disc read-only-memory), DVD-ROM (digital versatile disc-ROM), Blu-ray, HD-DVD (high definition DVD), or other magnetic, optical, or solid state storage devices. Finally, although not explicitly shown, other components, such as universal serial bus or other port connections, compact disc drives, digital versatile disc drives, film recording devices, and the like, may be connected to I/O bridge 107 as well.
In various embodiments, memory bridge 105 may be a Northbridge chip, and I/O bridge 107 may be a Southbridge chip. In addition, communication paths 106 and 113, as well as other communication paths within computer system 100, may be implemented using any technically suitable protocols, including, without limitation, AGP (Accelerated Graphics Port), HyperTransport, or any other bus or point-to-point communication protocol known in the art.
In some embodiments, parallel processing subsystem 112 comprises a graphics subsystem that delivers pixels to a display device 110 that may be any conventional cathode ray tube, liquid crystal display, light-emitting diode display, or the like. In such embodiments, the parallel processing subsystem 112 incorporates circuitry optimized for graphics and video processing, including, for example, video output circuitry. Such circuitry may be incorporated across one or more parallel processing units (PPUs) included within parallel processing subsystem 112, and one or more of these PPUs may be configured as a graphics processing unit (GPU). Alternatively, such circuitry may reside in a device or sub-system that is separate from parallel processing subsystem 112, such as memory bridge 105, I/O bridge 107, or add-in cards 120 or 121.
In other embodiments, the parallel processing subsystem 112 incorporates circuitry optimized for general purpose and/or compute processing. Again, such circuitry may be incorporated across one or more PPUs that are included within parallel processing subsystem 112 and configured to perform such general purpose and/or compute operations. In yet other embodiments, the one or more PPUs included within parallel processing subsystem 112 may be configured to perform graphics processing, general purpose processing, and compute processing operations.
System memory 104 includes at least one device driver 103 configured to manage the processing operations of the one or more PPUs within parallel processing subsystem 112. In various embodiments, parallel processing subsystem 112 may be integrated with one or more of the other elements of
It will be appreciated that the system shown herein is illustrative and that variations and modifications are possible. The connection topology, including the number and arrangement of bridges, the number of CPUs 102, and the number of parallel processing subsystems 112, may be modified as desired. For example, in some embodiments, system memory 104 is connected to CPU 102 directly rather than through memory bridge 105, and other devices communicate with system memory 104 via memory bridge 105 and CPU 102. In other alternative topologies, parallel processing subsystem 112 may be connected to I/O bridge 107 or directly to CPU 102, rather than to memory bridge 105. In still other embodiments, I/O bridge 107 and memory bridge 105 may be integrated into a single chip instead of existing as one or more discrete devices. Lastly, in certain embodiments, one or more components shown in
Integrated screen 201 is a display device, such as display device 110 in
Mechanical input controls 220 greatly facilitate navigation and interaction with a video game application being run with video gaming console 200. This is because mechanical input controls 220 are significantly more ergonomic and responsive than touchscreen controls typically used on electronic tablets and smart phones. As shown, mechanical input controls 220 may include one or more joystick controllers 221 and a plurality of control buttons 222. Joystick controllers 221 and control buttons 222 may be arranged in any other configuration than that illustrated in
OS 310 resides in physical memory of video gaming console 200 during operation, such as in system memory 104 in
Application 311 is any suitable software application configured to run on video gaming console 200. For example, in some embodiments, application 311 is a video game application. Alternatively, application 311 may be a drafting program or any other software application that may benefit by being navigated or interacted with by a user via mechanical input controls 220 of video gaming console 200 instead of via touchscreen controls displayed on integrated screen 201. In some embodiments, upon startup, application 311 generates an application client 312, which establishes a connection 313 to mapper service 330. As described below, application client 312 can send to mapper service 330 an application-specific mapping 340 associated with application 311. Application 311 may be a particular video game application or other application that is configured to cause touchscreen-based controls (such as an icon) to be displayed on integrated screen 201 for user navigation and interaction with application 311. In addition, application 311 typically includes multiple scenes (also referred to as pages), where each scene has a different configuration of touchscreen-based controls. Ordinarily, a user navigates between these multiple scenes using the touchscreen-based controls displayed on integrated screen 201, as illustrated in
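For illustration only, the following sketch shows one way the application-client side of connection 313 might be expressed in code. The channel interface, method names, and serialized payload are assumptions made for this sketch; the actual inter-process mechanism between application client 312 and mapper service 330 is not prescribed herein.

```kotlin
// Hypothetical channel abstraction standing in for connection 313.
// The real transport (binder, socket, pipe, etc.) is OS-specific.
interface MapperServiceChannel {
    fun publishMapping(packageName: String, serializedMapping: ByteArray)
}

class ApplicationClient(private val channel: MapperServiceChannel) {
    // Invoked once when application 311 starts: push this application's
    // application-specific mapping 340 to mapper service 330.
    fun onApplicationStart(packageName: String, serializedMapping: ByteArray) {
        channel.publishMapping(packageName, serializedMapping)
    }
}
```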
Referring back now to
Mapper service 330 is configured to run as a background process, and is generated by OS 310 during operation of video gaming console 200. When application 311 is started and connection 313 is established between application client 312 and mapper service 330, mapper service 330 creates service client 331, which is configured to communicate with application 311. For example, in some embodiments, service client 331 is configured to receive an application-specific mapping 340 from application client 312, which is associated with application 311. In addition, in various embodiments, service client 331 is configured to recognize control inputs (for example, when mechanical input controls 220 are manipulated by a user), translate these control inputs into instructions recognizable to application 311, and send the instructions to OS 310 to be included in or performed by application 311. Service client 331 bases the translation of the control inputs into instructions on a mapping of one or more mechanical control inputs to one or more respective touch locations within a region of the currently active scene of application 311, where an input indicates a user touch occurring at the corresponding touch location in the currently active scene. Such mappings may be included in application-specific mapping 340, and are described in greater detail below.
Application-specific mapping 340 includes multiple mappings of mechanical control inputs to screen touches or gestures for a particular application 311. The mechanical control inputs that are mapped are from, for example, mechanical input controls 220, and the screen touches or gestures are the touchscreen-based inputs for user navigation and interaction normally used in application 311, such as when a user touches integrated screen 201.
Because application 311 typically includes multiple scenes or pages, the application-specific mapping 340 for application 311 includes a separate mapping of mechanical control inputs for some or all of the different scenes of application 311. In this way, an input from a particular input device 360 (e.g., a specified motion of joystick controller 221 or depression of a particular control button 222) can be used for user navigation of multiple scenes of application 311. For example, referring back again to
In some embodiments, an application-specific mapping 340 for a particular application 311 includes a separate mapping of mechanical control inputs to screen entries or gestures for every page or scene in application 311. In other embodiments, application-specific mapping 340 includes a different mapping for multiple pages or scenes in application 311, but not necessarily for each page or scene in application 311.
In some embodiments, one or more control buttons or control gestures cause the currently active scene of application 311 to switch to a different scene of application 311. For example, according to mapping 510, when scene 501 is the active scene of application 311 and a user depresses button Y, the active scene of application 311 changes from scene 501 to scene 502. In some embodiments, such a scene change may also occur in conjunction with a screen touch. For example, according to mapping 510, when scene 501 is the active scene of application 311 and a user simultaneously presses buttons A, B, and Y, the active scene of application 311 changes from scene 501 to scene 502 and a user touch is reported to application 311 in the region defined by X2 to X3 and Y4 to Y5.
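For illustration, a minimal sketch of one possible in-memory form of application-specific mapping 340 follows. The class names, button identifiers, scene identifiers, and coordinate values are hypothetical placeholders chosen for this sketch; an entry may report a touch in a region, change the active scene, or do both, as described above.

```kotlin
// Hypothetical in-memory form of application-specific mapping 340: one table
// of entries per scene, where each entry reports a touch in a screen region,
// switches to another scene, or both. All values are placeholders.
data class TouchRegion(val xMin: Float, val yMin: Float, val xMax: Float, val yMax: Float)

data class MappingEntry(
    val touchRegion: TouchRegion? = null,  // where a user touch is reported, if any
    val nextScene: String? = null          // scene that becomes active, if any
)

val applicationSpecificMapping: Map<String, Map<Set<String>, MappingEntry>> = mapOf(
    "scene_501" to mapOf(
        setOf("BUTTON_A") to MappingEntry(TouchRegion(0.10f, 0.80f, 0.20f, 0.90f)),
        setOf("BUTTON_Y") to MappingEntry(nextScene = "scene_502"),
        setOf("BUTTON_A", "BUTTON_B", "BUTTON_Y") to
            MappingEntry(TouchRegion(0.30f, 0.55f, 0.40f, 0.65f), nextScene = "scene_502")
    ),
    "scene_502" to mapOf(
        // The same physical button maps to a different location in this scene.
        setOf("BUTTON_A") to MappingEntry(TouchRegion(0.70f, 0.10f, 0.80f, 0.20f))
    )
)
```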
In the embodiment illustrated in
Any other control button mapping may be used to indicate scene changes and other user touches. For example, in some embodiments, some or all scenes of application-specific mapping 340 each have a particular control button or control gesture associated therewith that is the same in each scene. Thus, no matter what scene is active in application 311, when a user depresses the particular control button or performs the control gesture associated with a particular scene, that particular scene becomes the active scene of application 311. In this way, a user may manually select a specific scene of application 311 regardless of what scene is currently the active scene of application 311. In such an embodiment, each mapping in application-specific mapping 340 includes a number of identical control button/control gesture entries, one for each scene mapped in this way. Furthermore, in such an embodiment, it is noted that service client 331 may also be notified of which scene is currently active when a particular control button or control gesture is actuated by a user to manually change scenes. Such notification enables service client 331 to use the appropriate mapping (e.g., mapping 510, 520, 530, or 540).
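A brief sketch of such identical scene-select entries, together with the notification of the newly active scene, might look as follows; the table contents and names are hypothetical and serve only to illustrate that the same button combination selects the same scene regardless of which scene is currently active.

```kotlin
// Hypothetical direct scene-select table: the same button combination selects
// the same scene no matter which scene is currently active.
val sceneSelectGestures: Map<Set<String>, String> = mapOf(
    setOf("BUTTON_X", "BUTTON_Y") to "scene_501",
    setOf("BUTTON_X", "BUTTON_A") to "scene_502"
)

class SceneTracker(initialScene: String) {
    var activeScene: String = initialScene
        private set

    // When a scene-select gesture is recognized, record the newly active
    // scene so the service client can switch to that scene's mapping.
    fun onControlGesture(pressedButtons: Set<String>) {
        sceneSelectGestures[pressedButtons]?.let { activeScene = it }
    }
}
```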
As described above, in some embodiments the active scene of application 311 is directly, or “manually,” selected by performing a control gesture or by depressing a control button that is mapped to switch to a particular scene. In other embodiments, the active scene of application 311 is changed using a control button or control gesture that is mapped to a touch location of the active scene of application 311 that corresponds to a change scene command or icon within the active scene. Thus, in such embodiments, the internal controls of application 311 may be used to perform the scene change. However, because a control button is depressed or a control gesture is performed to initiate such a scene change, service client 331 can still accurately track what scene is the active scene of application 311 and use the appropriate mapping when mechanical input controls 220 of video gaming console 200 are subsequently used.
In some embodiments, service client 331 can automatically determine what scene of application 311 is currently active. Specifically, in some embodiments, service client 331 determines the currently active scene of application 311 based on image data, such as data residing in a frame buffer of video gaming console 200. By examining the contents of such a frame buffer, service client 331 can detect previously established markers included in each scene to determine which scene is currently active, i.e., being displayed on integrated screen 201. Alternatively or additionally, service client 331 may use real-time image processing of data residing in a frame buffer to recognize what scene is currently active in application 311. For example, specific touchscreen control icons or other shapes in the currently active scene may be used by service client 331 to identify that scene automatically. Service client 331 then uses the appropriate mapping associated with the active scene when mechanical input controls 220 of video gaming console 200 are used. In some embodiments, service client 331 automatically determines the active scene of application 311 whenever mechanical control inputs from mechanical input controls 220 are received.
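For illustration, a minimal sketch of marker-based scene detection follows, under the assumptions that each scene renders a distinctive marker color at a known pixel and that the frame buffer is exposed as a flat array of packed RGB values; the marker positions and colors are hypothetical.

```kotlin
// Hypothetical marker-based scene detection: each scene is assumed to draw a
// known marker color at a fixed pixel, and the frame buffer is modeled as a
// flat array of packed 0xRRGGBB values in row-major order.
data class SceneMarker(val sceneId: String, val x: Int, val y: Int, val rgb: Int)

val sceneMarkers = listOf(
    SceneMarker("scene_501", x = 4, y = 4, rgb = 0xFF0000),
    SceneMarker("scene_502", x = 4, y = 4, rgb = 0x00FF00)
)

fun detectActiveScene(frameBuffer: IntArray, width: Int): String? {
    // Return the first scene whose marker pixel matches the frame buffer contents.
    return sceneMarkers.firstOrNull { marker ->
        frameBuffer[marker.y * width + marker.x] == marker.rgb
    }?.sceneId
}
```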
In some embodiments, service client 331 uses any combination of the above-described approaches for determining what scene of application 311 is currently active. For instance, the manual selection of an active scene may be used in combination with the fully automatic determination approach involving real-time image processing of data residing in a frame buffer of video gaming console 200. In one example embodiment, the manual selection approach may be used to override automatic scene determination by service client 331. In another embodiment, the manual selection approach is used in lieu of real-time image processing when conserving energy and/or computing resources is especially beneficial.
In some embodiments, one or more application-specific mappings 340 reside locally in video gaming console 200. For example, in some instances, mappings for mechanical input controls 220 are provided with a particular application 311. Alternatively, whenever a particular application 311 is first launched in video gaming console 200, application client 312 determines whether a suitable application-specific mapping 340 is present in video gaming console 200. If not, such as when mappings for mechanical input controls 220 are not provided with an application 311, application client 312 can access an appropriate application-specific mapping 340 from a local or remote database, such as mapping database 350. In some embodiments, application client 312 can access an appropriate application-specific mapping 340 via the Internet or other communication network.
In some embodiments, application client 312 is configured to store user-selected mappings for a particular application 311 in mapping database 350. In such embodiments, application client 312 records the user-selected mappings when a user utilizes user interface 370, which may be a drop-down menu that operates outside of application 311. Application client 312 then stores the mappings for application 311 in a dedicated mapping database 350, which may reside locally in video gaming console 200 and/or remotely, such as in a server accessible by other users of application 311. For reference, the application-specific mapping 340 may be stored with an appropriate package (pkg) name indicating the intended application 311.
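The following sketch illustrates one way user-selected mappings might be stored locally under the package name of the intended application 311; the directory layout, file naming, and serialization format are assumptions made for this sketch only.

```kotlin
import java.io.File

// Hypothetical local store for user-selected mappings: one file per target
// application, named after that application's package (pkg) name.
class LocalMappingStore(private val baseDir: File) {

    fun save(packageName: String, serializedMapping: String) {
        baseDir.mkdirs()
        // e.g. <baseDir>/com.example.game.mapping.json
        File(baseDir, "$packageName.mapping.json").writeText(serializedMapping)
    }

    fun load(packageName: String): String? =
        File(baseDir, "$packageName.mapping.json").takeIf { it.isFile }?.readText()
}
```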
Prior to implementation of the method steps, an application-specific mapping 340 is generated for one or more mechanical input controls 220 of video gaming console 200, where application-specific mapping 340 includes a mapping for two or more scenes of a particular application 311. The mapping for each scene associates at least one touch location within a region of the scene with a particular control input from mechanical input controls 220, where the control input is generated when a user actuates one of (or a combination of) mechanical input controls 220. Application-specific mapping 340 may be generated by a developer of application 311, a user of application 311, and/or a manufacturer of video gaming console 200, and may be stored locally in video gaming console 200 and/or in remote mapping database 350. Mapping database 350 may be available via a communication network, such as the Internet.
In some embodiments, the mappings for the two or more scenes of application 311 are accessed from a database associated with application 311, such as mapping database 350, prior to the method. For example, when application 311 is initially launched, application client 312 (or any other suitable control circuit or system) may search for a locally available application-specific mapping 340 for application 311. Typically, application client 312 is created when application 311 is first launched. If such a mapping is not stored locally in video gaming console 200, application client 312 is configured to search for application-specific mapping 340 in a remote mapping database 350.
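The launch-time lookup described above might be sketched as follows, with the local store and remote mapping database 350 represented by hypothetical interfaces, since no particular storage or network API is prescribed herein.

```kotlin
// Hypothetical lookup order at application launch: prefer a locally stored
// mapping, otherwise fall back to remote mapping database 350.
interface RemoteMappingDatabase {
    fun fetchMapping(packageName: String): String?   // e.g. retrieved over a network
}

class MappingResolver(
    private val localLookup: (String) -> String?,     // local lookup by pkg name
    private val remote: RemoteMappingDatabase
) {
    fun resolve(packageName: String): String? =
        localLookup(packageName) ?: remote.fetchMapping(packageName)
}
```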
As shown in
In step 602, service client 331 translates the first control input into a first set of instructions recognizable to application 311 based on a first mapping of the first control input to at least one touch location within a region of the first scene. For example, a touch location to which the first control input is mapped corresponds to a touchscreen-based control icon displayed on integrated screen 201 for user navigation and interaction with application 311. An example embodiment of step 602 is described in greater detail below in conjunction with
In step 603, service client 331 provides the first set of instructions to OS 310, where OS 310 is configured to include the first set of instructions in the software application. In this way, physical actuation of mechanical input devices by a user can be realized as on-screen touches in a software application that is configured for user interaction via a touchscreen.
In step 604, service client 331 receives a second control input that relates to a second scene associated with application 311, where the second control input is generated by one or more mechanical input devices.
In step 605, service client 331 translates the second control input into a second set of instructions recognizable to application 311 based on a second mapping of the second control input to at least one touch location within a region of the second scene, wherein the second mapping is different than the first mapping. It is noted that the first mapping and the second mapping are generally included in application-specific mapping 340. An example embodiment of step 605 is described in greater detail below in conjunction with
In step 606, service client 331 provides the second set of instructions to OS 310, wherein OS 310 is configured to include the second set of instructions in application 311. Thus, physical actuation of mechanical input devices by a user can be realized as on-screen touches in multiple scenes of application 311, even though each of the multiple scenes of application 311 has a different mapping of mechanical input devices to on-screen touches.
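For illustration, a compact sketch of the overall flow of method 600 follows. The mapper, the operating-system injection call, and the instruction type are hypothetical abstractions; the sketch shows only that the same code path translates a control input through whichever mapping belongs to the scene the input relates to before handing the result to the operating system.

```kotlin
// Hypothetical end-to-end flow of method 600: each control input is translated
// through the mapping of the scene it relates to, and the resulting touch
// instructions are handed to the operating system for injection into the app.
data class TouchInstruction(val x: Float, val y: Float)

interface SceneMapper {
    fun translate(sceneId: String, controlInput: Set<String>): List<TouchInstruction>
}

interface OperatingSystemBridge {
    fun injectTouches(instructions: List<TouchInstruction>)
}

class ServiceClient(
    private val mapper: SceneMapper,
    private val os: OperatingSystemBridge
) {
    // Steps 601-603 for one scene and 604-606 for another follow the same
    // path; only the mapping selected by the scene identifier differs.
    fun onControlInput(activeSceneId: String, controlInput: Set<String>) {
        val instructions = mapper.translate(activeSceneId, controlInput)
        os.injectTouches(instructions)
    }
}
```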
Prior to implementation of the method steps, and as described above in method 600, service client 331 receives a control input that relates to the currently active scene in application 311. Service client 331 is created by mapper service 330 after application 311 is started and a connection between mapper service 330 and application 311 is established. The received control input is generated by mechanical input controls 220 of video gaming console 200 when actuated by a user.
As shown in
In step 702, service client 331 determines a touch location from application-specific mapping 340. Because service client 331 tracks which scene of application 311 is active, service client 331 can determine a touch location from application-specific mapping 340 that corresponds to the received control input. It is noted that the touch location determined in step 702 may differ depending on which scene of application 311 is currently active, as illustrated by mappings 510, 520, 530, and 540 in
In step 703, service client 331 generates a set of instructions that are recognizable to application 311 and indicate to application 311 that a touch has occurred in the touch location determined in step 702. In this way, a control input can be translated into a set of instructions recognizable to application 311 indicating a user touch at a touch location within a region of the active scene of application 311.
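Steps 702 and 703 might be sketched as follows, with the control input reported as a touch at the center of the mapped region; the data shapes and the down/up event pair are illustrative assumptions rather than a prescribed instruction format.

```kotlin
// Hypothetical translation of one control input (steps 702-703): look up the
// touch region mapped to the input for the active scene, then emit a touch
// event pair at the center of that region.
data class Region(val xMin: Float, val yMin: Float, val xMax: Float, val yMax: Float)
data class TouchEvent(val action: String, val x: Float, val y: Float)  // "DOWN" or "UP"

fun translateControlInput(
    mappingForActiveScene: Map<Set<String>, Region>,
    controlInput: Set<String>
): List<TouchEvent> {
    // Step 702: determine the touch location for this input, if one is mapped.
    val region = mappingForActiveScene[controlInput] ?: return emptyList()
    val x = (region.xMin + region.xMax) / 2f
    val y = (region.yMin + region.yMax) / 2f
    // Step 703: instructions indicating to the application that a touch occurred.
    return listOf(TouchEvent("DOWN", x, y), TouchEvent("UP", x, y))
}
```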
In sum, the disclosed techniques provide a way to effectively execute software applications that are designed for touchscreen interactions on computing devices with mechanical controls. According to some embodiments, a separate mapping of mechanical control inputs to on-screen gestures is generated and stored for each of multiple scenes in a software application. Thus, for each of the multiple scenes, a different mapping of mechanical control inputs can be employed. Advantageously, multiple scenes of the software application can be navigated using a computing device with mechanical controls, even though the software application is configured for user interaction via a touchscreen.
One embodiment of the invention may be implemented as a program product for use with a computer system. The program(s) of the program product define functions of the embodiments (including the methods described herein) and can be contained on a variety of computer-readable storage media. Illustrative computer-readable storage media include, but are not limited to: (i) non-writable storage media (e.g., read-only memory devices within a computer such as compact disc read only memory (CD-ROM) disks readable by a CD-ROM drive, flash memory, read only memory (ROM) chips or any type of solid-state non-volatile semiconductor memory) on which information is permanently stored; and (ii) writable storage media (e.g., floppy disks within a diskette drive or hard-disk drive or any type of solid-state random-access semiconductor memory) on which alterable information is stored.
While the foregoing is directed to embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.