The present disclosure relates generally to the field of data processing systems and, more particularly, to a graphical user interface for controlling a game on a touch-screen device.
A better understanding of the present invention can be obtained from the following detailed description in conjunction with the following drawings, in which:
a-b illustrate a game controller.
The assignee of the present application has developed an online video gaming system. Certain embodiments of this system are described, for example, in U.S. patent application Ser. No. 12/538,081, filed Aug. 7, 2009, entitled “System and Method for Compressing Video Based on Latency Measurements and Other Feedback,” and U.S. application Ser. No. 12/359,150, filed Jan. 23, 2009, entitled “System And Method for Protecting Certain Types of Multimedia Data Transmitted Over A Communication Channel.” These applications are sometimes referred to herein as the “co-pending applications” and are incorporated herein by reference.
Described herein is a unique controller and touch-screen graphical user interface (GUI) for controlling online video games as described in the co-pending applications. While the controller and touch screen GUI described below may be used to control “online” games in one embodiment of the invention, the underlying principles of the invention are not limited to “online” games. For example, the controller and touch screen GUI described below may be used to control games being executed locally on the gaming apparatus to which they are connected (in the case of the controller) and/or on which they are displayed (in the case of the GUI).
As illustrated in
As illustrated in
The numerical designations in
In one embodiment, touching the joysticks 201-202 indicates to the touch screen device that the user is intending to manipulate the joysticks, and subsequent motion while continuously touching the joystick 201-202 is interpreted by the touch screen device as if the user had moved a physical joystick a similar distance from the center of the joystick. In one embodiment, a touch that is closer to the center of the joystick than to any other button is considered to be the user touching the joystick and activating it, and a touch closer to another button is considered a touch of the other button. In one embodiment, a touch that activates the joystick is considered to define the center point position of the joystick, so that any subsequent motion from the center point is considered to be a movement away from that center point. In another embodiment, a touch that activates the joystick is considered to have moved the joystick to the position of this first touch, and subsequent motion from this point is considered to have moved the joystick from that position. In one embodiment, there is an option whereby either the user or the game being played is able to select whether the first touch is interpreted as touching the joystick at its center point or as moving the joystick to the position of the touch. The physical joysticks 101 and 102 may be equipped with a press-down button capability, such that if the user presses down on a joystick, that downward press is detected. In one embodiment, the joysticks 201 and 202 are responsive to a “double tap”, in which a rapid touch-touch action (with a limited time duration between the taps, defined by the touch screen device, the game, or set by the user as a setting, so as to distinguish a “double tap” from a release of the joystick followed by a retouch by the finger) is defined to be analogous to pressing down the physical joysticks 101 and 102.
In one embodiment, such a “double tap” is only interpreted as such within a limited radius of the center of the joystick 201 or 202.
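The touch-to-joystick mapping described above can be sketched as follows. This is a minimal illustration only, not taken from any described embodiment's actual implementation; the class name, the 0.3-second double-tap window, and the 40-pixel double-tap radius are assumptions chosen for the example.

```python
import math

class TouchJoystick:
    """Sketch: a touch activates the joystick, and subsequent drag motion is
    reported as displacement from a reference point, as if a physical stick
    had been deflected the same distance."""

    def __init__(self, center, center_mode=True,
                 double_tap_window=0.3, double_tap_radius=40.0):
        self.center = center                        # (x, y) of the drawn joystick
        self.center_mode = center_mode              # True: first touch defines the center
        self.double_tap_window = double_tap_window  # max seconds between taps (assumed)
        self.double_tap_radius = double_tap_radius  # max distance from center (assumed)
        self.origin = None                          # reference point for displacement
        self.last_tap_time = None

    def touch_down(self, pos, now):
        # In "center" mode the first touch defines the joystick's center point;
        # otherwise the touch is treated as having moved the stick to pos,
        # so displacement is measured from the joystick's drawn center.
        self.origin = pos if self.center_mode else self.center
        # Double tap: a second touch near the center within the time window is
        # analogous to pressing down a physical joystick.
        pressed = (self.last_tap_time is not None
                   and now - self.last_tap_time <= self.double_tap_window
                   and math.dist(pos, self.center) <= self.double_tap_radius)
        self.last_tap_time = now
        return pressed

    def drag(self, pos):
        # Displacement from the reference point while the finger stays down.
        return (pos[0] - self.origin[0], pos[1] - self.origin[1])
```

For example, with `center_mode=False`, a first touch at (120, 100) followed by a drag to (130, 100) is reported as a deflection of (30, 0) from the joystick's drawn center, whereas in center mode the same drag would be a (10, 0) deflection from the first touch.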
In one embodiment, touch of the buttons in
A graphical mouse button 211 illustrated in
A graphical record button 212 is also provided. In one embodiment, when selected, the record button causes the online gaming service to begin recording the user's game video output, or game actions. The recorded game video or game actions may then be used for many purposes as described in the co-pending applications, including Brag Clips™ video recordings, which are recordings of previous game play that may be reviewed by the user and/or shared with others.
The button layout and functionality illustrated in
In addition, in one embodiment, when a user is actively controlling the joystick controls 201-202, the buttons 208, 205 surrounding the joystick controls and potentially other touch sensitive areas are “deactivated” while the user continues to manipulate the joystick. Thus, after the user touches the touch screen over the image of the joystick 201 or 202, and does not cease to touch the touch screen with that finger while moving the finger around, the touch screen device will consider that the joystick is still being manipulated, and despite the fact that the finger may pass over another button, the touch screen device will not interpret that as a touch of the button underneath the finger, until the finger is lifted from the touch screen and again touches the touch screen. This allows the user to touch the joystick 201 or 202 and then have a wider area of motion to manipulate the joystick than would normally be possible if the motion were constrained to avoid moving over a button 205 or 208, which would result in an erroneous interpretation as a touch of that button. Also, given the intensity of many games, it allows the user to vigorously manipulate the joystick without fear of inadvertently hitting a nearby button, which might result in an undesired behavior in the game (e.g., inadvertently shooting an ally while using the joystick to turn quickly with an extreme motion).
In one embodiment, the range over which the user is allowed to manipulate the joystick is limited to some specified distance from the center of the joystick, for example, to a circle of some radius around the joystick. The game could either cease to interpret the motion once the range of motion is exceeded (and the user could potentially realize the range had been exceeded because there was no corresponding action in the game beyond the allowable range of motion), or there could be some other indication to the user, such as an auditory alert (e.g., a beep) or a graphical indication (e.g., a flash on the screen).
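The deactivation and range-limit behavior just described can be sketched as follows. The 120-pixel radius, the control positions, and the returned event names are illustrative assumptions, not values from any described embodiment.

```python
import math

class ControllerSurface:
    """Sketch: the control closest to a touch wins the touch; while the
    finger that activated the joystick stays down, buttons under the moving
    finger are ignored, and motion beyond a maximum radius is flagged."""

    def __init__(self, joystick_center, buttons, max_radius=120.0):
        self.joystick_center = joystick_center  # (x, y) of joystick 201 or 202
        self.buttons = buttons                  # {name: (x, y)} button positions
        self.max_radius = max_radius            # allowed range of motion (assumed)
        self.joystick_held = False

    def touch_down(self, pos):
        # A touch closer to the joystick center than to any button activates
        # the joystick; otherwise it presses the nearest button.
        nearest = min(self.buttons, key=lambda b: math.dist(pos, self.buttons[b]))
        if math.dist(pos, self.joystick_center) <= math.dist(pos, self.buttons[nearest]):
            self.joystick_held = True
            return "joystick"
        return nearest

    def touch_move(self, pos):
        if not self.joystick_held:
            return None
        if math.dist(pos, self.joystick_center) > self.max_radius:
            return "out_of_range"  # e.g., sound a beep or flash the screen
        # Nearby buttons are deactivated: even if pos lies over a button,
        # the motion is reported only as joystick movement.
        return "joystick_move"

    def touch_up(self):
        # Lifting the finger releases the joystick and reactivates the buttons.
        self.joystick_held = False
```

Note that `touch_move` never consults `self.buttons` while the joystick is held, which is what allows a vigorous drag to sweep over the D-pad or action buttons without triggering them.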
In one embodiment, when a joystick 201 or 202 is interpreted as activated, the nearby buttons are dimmed to be less prominent than usual (or some other graphical indication is used) to remind the user that touching the nearby buttons (without first releasing the joystick) will not be interpreted as a button press. For example, when the joystick 201 is activated, the D-pad buttons 208 would be dimmed out, or when the joystick 202 is activated, one of the action buttons 205 would be dimmed out. When the user lifts his/her thumbs up from the joysticks, the D-pad buttons 208 or one of the action buttons 205 would become active again and would be restored to their normal appearance, providing a visual indication to the user that they can be actuated. In one embodiment, when the user lifts his/her fingers off of the graphical joysticks 201-202 as described, the graphical joysticks may be configured to be interpreted as (a) remaining in the last position they were in or (b) returning to a center position. In one embodiment, the state described in the preceding sentence is shown visually by the position in which the joystick 201 or 202 graphical image is drawn.
Similar to the joystick 201 and 202 actuation over nearby buttons described in the preceding three paragraphs, dragging after pressing a button such as the LT 207b or RT 207a button could, in one embodiment, deactivate nearby buttons to allow motion that overlaps them. As with the joystick 201 and 202 actuation, such dragging can be limited in range, and visual and/or auditory indicators can be provided to the user.
As mentioned above, in one embodiment the GUI shown in
In one embodiment, the game executing on the touch screen device, or a remotely-operated game, requests the particular configuration of buttons that best suits the needs of the game.
In one embodiment, a non-game application is used, and it requests an interface suitable for its operation.
In one embodiment, the touch interface described herein is rendered by the local touch screen device. In another embodiment, the touch interface described herein is rendered by a remotely-operating game or application such as that described in the co-pending applications. In another embodiment part of the touch interface described herein is rendered locally and part is rendered remotely.
As illustrated in
In one embodiment, the hosting service 310 stores the UI configuration using a meta-language that specifies the type, size, rotation, and location of each UI element to be rendered and the list of actions to be executed upon user interaction with the rendered UI element. There could be multiple actions associated with a UI widget as a function of the user interaction. For instance, one UI widget could support different actions being returned to the hosting service 310 for “press” or “swipe” independently.
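A sketch of what such a meta-language entry and its per-interaction action lookup might look like follows. The field names, widget geometry, and action strings are hypothetical illustrations, not the hosting service 310's actual format.

```python
# Hypothetical meta-language entry for one UI widget. Each widget carries
# its rendering attributes (type, size, rotation, location) plus a mapping
# from the kind of user interaction to the action returned to the service.
ui_config = [
    {
        "type": "button",
        "size": (80, 80),
        "rotation": 0,
        "location": (600, 340),
        "actions": {
            "press": "fire",    # action returned for a press
            "swipe": "reload",  # different action for a swipe on the same widget
        },
    },
]

def action_for(widget, interaction):
    """Return the action to report to the hosting service for a given user
    interaction with the rendered widget, or None if the widget does not
    respond to that interaction."""
    return widget["actions"].get(interaction)
```

The per-interaction dictionary is what lets a single rendered widget return different actions for “press” and “swipe” independently, as described above.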
In one embodiment, the various graphical elements illustrated herein and the associated functions may be generated by a general purpose or a special purpose processor executing instructions. For example, a processor within the touch screen device may execute instructions to generate the graphical buttons shown in
Elements of the disclosed subject matter may also be provided as a machine-readable medium for storing the machine-executable instructions. The machine-readable medium may include, but is not limited to, flash memory, optical disks, CD-ROMs, DVD ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, propagation media or other type of machine-readable media suitable for storing electronic instructions. For example, the present invention may be downloaded as a computer program which may be transferred from a remote computer (e.g., a server) to a requesting computer (e.g., a client) by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection).
It should also be understood that elements of the disclosed subject matter may also be provided as a computer program product which may include a machine-readable medium having stored thereon instructions which may be used to program a computer (e.g., a processor or other electronic device) to perform a sequence of operations. Alternatively, the operations may be performed by a combination of hardware and software. The machine-readable medium may include, but is not limited to, floppy diskettes, optical disks, CD-ROMs, magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, propagation media or other type of media/machine-readable medium suitable for storing electronic instructions. For example, elements of the disclosed subject matter may be downloaded as a computer program product, wherein the program may be transferred from a remote computer or electronic device to a requesting process by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection).
Additionally, although the disclosed subject matter has been described in conjunction with specific embodiments, numerous modifications and alterations are well within the scope of the present disclosure. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
This application is a continuation-in-part of and claims the benefit of U.S. patent application Ser. No. 13/016,785, entitled “Graphical User Interface, System and Method For Implementing A Game Controller On A Touch-Screen Device”, filed on Jan. 28, 2011, now U.S. Pat. No. 8,382,591, which claims the benefit of U.S. Provisional Application No. 61/351,268, entitled “Graphical User Interface, System and Method For Implementing A Game Controller On A Touch-Screen Device”, filed on Jun. 3, 2010.
“Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration” from foreign counterpart PCT Application No. PCT/US2011/038627, mailed Sep. 16, 2011, 8 pages.
Notice of Allowance from U.S. Appl. No. 13/016,785, mailed Jun. 26, 2012, 18 pages.
Bungie, “HALO 3 How-to: Saved Films, New ViDOC”, http://www.bungie.net/News/content.aspx?type=topnews&link=h3savedfilms, Sep. 20, 2007, pp. 1-8.
Duong, Ta Nguyen Binh, “A Dynamic Load Sharing Algorithm for Massively Multiplayer Online Games”, IEEE, 2003, pp. 131-136.
Kubota, Shuji, “High-Quality Frame-Synchronization for Satellite Video Signal Transmission”, IEEE Transactions on Aerospace and Electronic Systems, vol. 31, No. 1, Jan. 1995, pp. 430-440.
Nguyen, Cong Duc, “Optimal Assignment of Distributed Servers to Virtual Partitions for the Provision of Immersive Voice Communication in Massively Multiplayer Games”, Computer Communications 29, 2006, available online Nov. 15, 2005, pp. 1260-1270.
Wu, Dapeng, “Transporting Real-time Video over the Internet: Challenges and Approaches”, Proceedings of the IEEE, vol. 88, No. 12, Dec. 2000, pp. 1-18.
Frauenfelder, M., “G-Cluster Makes Games to Go”, EE Times, Nov. 6, 2001, http://www.thefeaturachives.com/13267.html, 3 pages.
International Search Report & Written Opinion from foreign counterpart PCT Application No. PCT/US2012/040940, mailed Aug. 23, 2012, 9 pages.
Number | Date | Country
---|---|---
20120242590 A1 | Sep 2012 | US
Number | Date | Country
---|---|---
61351268 | Jun 2010 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 13016785 | Jan 2011 | US
Child | 13155633 | | US