Field of the Invention
The present invention generally relates to electronic gaming. More specifically, the present invention relates to assisted aiming in electronic games.
Description of the Related Art
A real-world user or game player may be represented by a game character that can move about a virtual game environment. This environment is displayed on a display screen associated with an electronic game. The virtual game environment may be displayed from a perspective of the game character. The real-world game player typically controls the game character's actions using a game controller that includes one or more joysticks and/or buttons.
The game character encounters various scenarios throughout the course of the game, which may include combat in which the character is required to fight one or more threats or enemies. The enemies may approach the game character from one or more directions in the game environment (as reflected on the display screen) and then attack the game character. The real-world game player uses the game controller to move the game character and to cause the game character to attack or defend against the enemy using a variety of weapons or maneuvers.
In order to maneuver the game character through the game environment, the real-world game player typically uses an analog joystick or directional button on a controller. To approach an opponent or attack an enemy, the real-world game player pushes on the joystick or presses a directional button on the controller. The real-world game player may then press another controller button to initiate an attack action such as firing a gun, jabbing a sword, or throwing a punch.
The controller may include multiple controller buttons, each of which may be fixedly associated with an attack action. When the real-world game player presses a button, the game character may initiate the associated attack action regardless of whether the character is actually facing an enemy or is even near an enemy. Thus, in some instances, the game character may blindly swing or fire at a portion of the game environment whether or not an enemy or other object is present. In order for the attack action to affect the enemy or object, however, the game player generally must aim the attack action (e.g., a gunshot, sword jab, or punch) at the enemy or object with some degree of accuracy before initiating it.
A game controller 100 such as that illustrated in FIG. 1 may include two analog sticks 110 that operate with respect to character movement and aim control. Each analog stick 110 may rest in a neutral position or be moved into a non-neutral position by pushing the stick in a particular direction. Moving an analog stick 110 into a non-neutral position in a given direction may cause the controller 100 to output a corresponding directional command to the game platform, resulting in a corresponding movement in the video game environment. For example, one of the two analog sticks 110 may be used to move the game character around the game environment, while the other analog stick 110 may be used to control or direct a particular action, such as aiming an attack action toward a particular location in the game environment (e.g., a particular enemy).
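By way of illustration only, the following is a minimal sketch of such a dual-stick mapping, assuming normalized stick axes in the range -1.0 to 1.0 and a simple per-frame update; the names (`apply_sticks`, `move_speed`, `turn_speed`) and the deadzone value are hypothetical and do not appear in the disclosure.

```python
import math

DEADZONE = 0.15  # deflections smaller than this are treated as neutral

def apply_sticks(character, move_stick, aim_stick, dt):
    """Map one analog stick to character movement and the other to
    aim adjustment, per the division of labor described above."""
    mx, my = move_stick
    ax, ay = aim_stick

    # Movement stick: react only once the stick leaves its neutral zone.
    if math.hypot(mx, my) > DEADZONE:
        character["x"] += mx * character["move_speed"] * dt
        character["y"] += my * character["move_speed"] * dt

    # Aim stick: deflection steers the aim direction instead.
    if math.hypot(ax, ay) > DEADZONE:
        character["aim_yaw"] += ax * character["turn_speed"] * dt
        character["aim_pitch"] += ay * character["turn_speed"] * dt

# Example: a half-deflected move stick and a neutral aim stick.
hero = {"x": 0.0, "y": 0.0, "aim_yaw": 0.0, "aim_pitch": 0.0,
        "move_speed": 5.0, "turn_speed": 90.0}
apply_sticks(hero, move_stick=(0.5, 0.0), aim_stick=(0.0, 0.0), dt=1 / 60)
```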
Video game combat and other directional interactions are simplest when there are few enemies or objects simultaneously present in the game environment. In such cases, it is relatively easy for the real-world game player to maneuver the game character into, for example, an attack position, aim the attack action, and then activate the attack action. If there is only one enemy on the display screen, the game player can concentrate attention and other resources on that single enemy. Consequently, the game player can orient the game character to face the enemy and initiate an attack with relative ease.
As the number of enemies or other objects on the display screen increases, it becomes increasingly difficult for the game player to attack specific enemies. The game character may be surrounded by several enemies, each of which moves about the game environment. The increased number of enemies makes it difficult for the game player to maneuver the game character, aim a weapon, and then activate the weapon for each of the multiple enemies. For example, if the game character is attacked by multiple enemies simultaneously, any delay in response may result in injury to or the death of the game character.
These effects may, in some instances, be minimized by utilizing a control device like controller 100 of FIG. 1, which provides two analog sticks 110 for simultaneous control over movement and aim.
In those games or control environments with a single analog stick, the aim of the game character may be fixed to its position; the real-world game player must move and reorient the game character within the game environment in order to adjust the aim. Some games allow the player to switch the single analog stick between control over movement in the game environment and control over aim adjustment, or to adjust the aim and then lock it onto a particular object as a target. These solutions, however, require the real-world game player to perform an additional action, such as pressing yet another button on the controller, in order to lock onto a particular target. This additional step only slows the game player's response time. Referring again to the combat situation with multiple simultaneously attacking enemies, the game player may not have enough time to aim at and lock onto a series of targets.
Even in those games played with a controller that includes two analog sticks like that of FIG. 1, targeting can remain difficult. Although the dual analog control sticks 110 of a controller like controller 100 of FIG. 1 allow the real-world game player to move the game character and adjust its aim simultaneously, manually tracking several moving enemies with the aim stick still demands considerable time and dexterity.
Another problem associated with simultaneously confronting multiple enemies is that it becomes difficult for the game player to attack a succession of different enemies. Under the conventional attack method, the game player has to orient the character toward a first enemy and then attack that enemy. In order to subsequently attack a second enemy, the game player must manually maneuver the character so that the character is facing the second enemy. This can become quite cumbersome for the player, particularly if the second enemy is located at an awkward position relative to the character, such as behind the character or at a distance removed from the character. This often results in the player fumbling with the joystick and losing an attack opportunity. The requirement of re-orienting the character toward the second enemy also takes time, which can be detrimental in an action game where characters must quickly and successfully attack enemies or otherwise risk incurring damage from them.
The significance of the aforementioned problems only increases as the graphics processing power of video game systems increases. Modern video game systems are able to display and control an increasing number of enemy characters or other objects on the video game display at one time. Thus, it is becoming even more difficult and cumbersome for game players to target and attack specific enemies in a video game environment regardless of skill or controller design. Consequently, it would enrich the video game experience to allow players to efficiently and intelligently target and attack enemies or direct actions to game objects. There is, therefore, a need for improved methods for game aim assist.
The present invention provides for methods for game aim assist. In an electronic game, a game environment may be displayed from a third-person perspective with respect to a game character. The display of the game environment may include a defined focus area. During game play, the focus area may be evaluated for the presence of objects. One or more objects may be detected. An object may then be selected based on the location of the object with respect to the center of the focus area. The aim of the game character may then automatically be adjusted toward the selected object.
Various embodiments of the present invention include methods for game aim assist. Such methods may include displaying a game environment including a defined focus area, detecting one or more objects in the defined focus area, selecting one of the objects based on distance between the object and the center of the defined focus area, and automatically adjusting the aim of the game character toward the selected object. In some embodiments, selection of the object may be further based on distance between the object and a focal point of the game character, user action, a weapon operated by the game character, and the like.
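One way to read the selection step is as a nearest-to-center search over the detected objects. The sketch below is illustrative only; the screen-space coordinates, the dictionary fields, and the helper name `select_target` are assumptions, not the disclosed implementation.

```python
import math

def select_target(objects, focus_center):
    """Return the detected object whose screen position lies closest to
    the center of the defined focus area, or None if none were detected."""
    cx, cy = focus_center
    return min(objects,
               key=lambda o: math.hypot(o["screen_x"] - cx,
                                        o["screen_y"] - cy),
               default=None)

# Example: two detected objects; the first is nearer the focus center.
detected = [{"screen_x": 150, "screen_y": 110},
            {"screen_x": 320, "screen_y": 260}]
print(select_target(detected, focus_center=(160, 120)))
```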
Some embodiments may include selecting an object based on input from a controller with a single analog stick. Such input may include a button being pressed at a certain pressure or for a certain length of time. Some embodiments further include calibrating for new players or new game sessions. Calibration may occur with respect to button press times or button pressure settings. Embodiments of the present invention may further include resetting the display of the game environment after a period of inactivity.
Once the aim of the game character has been adjusted toward an object (e.g., an enemy), the game player may choose to initiate an attack action such as shooting a firearm, stabbing with a sword, or throwing a punch. The attack action may injure, kill, or otherwise neutralize the enemy object. Some embodiments of the present invention include automatically selecting the next target from the remaining objects.
Embodiments of the present invention include computer-readable storage media having embodied thereon programs that, when executed by a processor or computing device, perform methods for game aim assist.
An exemplary apparatus for game aim assist may include a display for indicating a perspective of a game environment including a defined focus area, a controller interface for manipulating actions of the game character based on user input, and a processor for detecting one or more objects in the defined focus area, selecting one object based on distance between the object and the center of the focus area, and automatically adjusting the aim of the game character toward the selected object.
Various electronic games allow a game player to control the actions of a game character. The game environment may be displayed from a third-person perspective with respect to such a game character. In embodiments of the present invention, the display of the game environment includes a defined focus area. During game play, one or more objects may be detected in the focus area. One of the objects may then be selected as a target based on the distance between the object and the center of the focus area. The aim of the game character may then be automatically adjusted toward the selected object, which allows the game player to direct an attack action on the object.
In step 210, a perspective of a game environment including a defined focus area may be displayed. The game environment may be displayed from a point of view that is close to (i.e., third-person) or that belongs to (i.e., first-person) a game character whose movement and actions may be controlled by the real-world game player. As the real-world game player moves the game character through the game environment, the display of the game environment may be adjusted to reflect the changes around the game character.
In some embodiments, the display may reset after a period of inactivity in game play. Resetting of the display may include a return to an original state, which may include a particular line-of-sight or directional orientation of the game character. For example, the real-world game player may direct the game character to look up toward the sky or to focus on an object at extremely close range. After a period of inactivity, the display may automatically reset such that the display reflects the game environment from the original perspective of or with respect to the game character (e.g., straight ahead with no intensified focus).
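A minimal sketch of such an inactivity reset is given below, assuming a monotonic clock and a simple camera record; the five-second threshold and all names are hypothetical.

```python
import time

RESET_AFTER_SECONDS = 5.0  # assumed threshold; a real title would tune this

class DisplayResetTimer:
    """Return the display to its original, straight-ahead perspective
    after a period of inactivity, as described above."""

    def __init__(self, camera):
        self.camera = camera
        self.last_input = time.monotonic()

    def on_player_input(self):
        # Call whenever the player moves a stick or presses a button.
        self.last_input = time.monotonic()

    def update(self):
        if time.monotonic() - self.last_input > RESET_AFTER_SECONDS:
            self.camera["pitch"] = 0.0   # stop looking up at the sky
            self.camera["zoom"] = 1.0    # release any intensified focus
```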
Returning to FIG. 2, in step 220, one or more objects (e.g., enemies) may be detected in the defined focus area. During game play, the focus area may be evaluated for the presence of such objects.
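As one hedged illustration of step 220, the focus area might be modeled as a screen-space rectangle and detection as a containment test; the rectangle model and the name `objects_in_focus_area` are assumptions rather than the disclosed implementation.

```python
def objects_in_focus_area(objects, focus_rect):
    """Keep only the objects whose projected screen positions fall inside
    the focus area, modeled here as a rectangle (left, top, right, bottom)."""
    left, top, right, bottom = focus_rect
    return [o for o in objects
            if left <= o["screen_x"] <= right
            and top <= o["screen_y"] <= bottom]
```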
In step 230, an object may be selected from the objects detected in step 220. The selection of the object may be based on the distance between a particular object and the center of the focus area. Referring again to the focus area, for example, the object located closest to its center may be selected.
Selection of the object may be further based on a location depth of the object (i.e., distance between the object and a focal point of the game character).
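Combining the two criteria, a sketch might score each candidate by a weighted sum of its distance from the focus-area center and its location depth; the weights, field names, and helper names below are illustrative assumptions.

```python
import math

def selection_score(obj, focus_center, center_weight=1.0, depth_weight=0.5):
    """Lower is better: combine the object's distance from the focus-area
    center with its location depth (distance from the game character's
    focal point). The weights are invented for illustration."""
    center_dist = math.hypot(obj["screen_x"] - focus_center[0],
                             obj["screen_y"] - focus_center[1])
    return center_weight * center_dist + depth_weight * obj["depth"]

def select_target(objects, focus_center):
    # Choose the candidate with the best (lowest) combined score.
    return min(objects,
               key=lambda o: selection_score(o, focus_center),
               default=None)
```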
Selection may alternatively be based on user input. For various reasons, a real-world game player may wish to select a particular object apart from its proximity to the center of the focus area and/or its proximity to the game character.
User input may include information concerning an amount of pressure on a button on the controller or a length of time that a button on the controller is pressed. For example, a user may press a button for a particular duration to indicate that the user wishes to select a different object than the object automatically selected. Pressure and time settings may be calibrated for each particular user and/or for each new game session.
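A minimal sketch of such per-player calibration follows, assuming press duration (rather than pressure) as the measured quantity; the 0.4-second default and the 1.5x heuristic are invented for illustration.

```python
class PressCalibration:
    """Track how long a given player holds the target-override button and
    derive a per-player (or per-session) threshold separating an ordinary
    tap from a deliberate long press."""

    DEFAULT_THRESHOLD = 0.4  # seconds; assumed fallback for a new session

    def __init__(self):
        self.samples = []

    def record_press(self, duration):
        self.samples.append(duration)

    def long_press_threshold(self):
        # Assumed heuristic: 1.5x the player's average press duration.
        if not self.samples:
            return self.DEFAULT_THRESHOLD
        return 1.5 * sum(self.samples) / len(self.samples)

def wants_override(duration, calibration):
    """A press held past the calibrated threshold signals that the player
    wants an object other than the automatically selected one."""
    return duration > calibration.long_press_threshold()
```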
Returning once again to FIG. 2, in step 240, the aim of the game character may be automatically adjusted toward the object selected in step 230. Automatic adjustment allows the game character to direct an action with some degree of accuracy toward the selected object without manual aiming by the real-world game player.
The speed at which the targeting display 330 moves toward a selected object may be customized by the user or game player or may be defined by the particular game title. An indication of when the aim has been adjusted toward a selected object may be provided. Such indications may include various visual, audio, and/or textual indications. One such example of an indication may be a change in color, shape, or size of the targeting display 330.
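The adjustable speed might be realized by moving the targeting display a fixed number of pixels per second toward the selected object and returning a lock-on flag that could drive the visual indication; the sketch below assumes screen-space coordinates, and all names are hypothetical.

```python
import math

def update_targeting_display(targeting, target, speed, dt):
    """Move the targeting display toward the selected object at the given
    speed (pixels per second); return True once it has locked on, at which
    point the display might change color, shape, or size as an indication."""
    dx = target["screen_x"] - targeting["x"]
    dy = target["screen_y"] - targeting["y"]
    dist = math.hypot(dx, dy)
    step = speed * dt
    if dist <= step:  # close enough to snap onto the target and lock
        targeting["x"], targeting["y"] = target["screen_x"], target["screen_y"]
        return True
    targeting["x"] += dx / dist * step  # otherwise, advance one step
    targeting["y"] += dy / dist * step
    return False
```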
In optional step 250 of FIG. 2, an attack action, such as shooting a firearm, stabbing with a sword, or throwing a punch, may be initiated toward the selected object. The attack action may injure, kill, or otherwise neutralize the selected object.
In optional step 260 of FIG. 2, a next object may be automatically selected from the remaining objects detected in the focus area (e.g., once the previously selected object has been neutralized).
In optional step 270, the aim of the game character may be automatically adjusted toward the selected next object. As in step 240, automatic adjustment allows the game character to direct an action with some degree of accuracy toward the selected next object without manual intervention, which may be complicated in game controllers like that illustrated in FIG. 1.
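Steps 260 and 270 together might look like the following sketch, which removes the neutralized object and re-runs the nearest-to-center selection; again, all names and fields are hypothetical.

```python
import math

def select_next_target(objects, current_target, focus_center):
    """Once the current object is neutralized (or the player cycles),
    pick the nearest remaining object to the focus-area center so the
    aim can be re-adjusted without manual re-orientation."""
    cx, cy = focus_center
    remaining = [o for o in objects if o is not current_target]
    return min(remaining,
               key=lambda o: math.hypot(o["screen_x"] - cx,
                                        o["screen_y"] - cy),
               default=None)
```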
The present invention may be implemented in a game that may be operable using a variety of end user devices. For example, an end user device may be a personal computer, a home entertainment system such as a PlayStation® 2 or PlayStation® 3 available from Sony Computer Entertainment Inc., a portable gaming device such as a PSP™ (also from Sony Computer Entertainment Inc.), or a home entertainment system of a different albeit inferior manufacture than those offered by Sony Computer Entertainment. The present methodologies described herein are fully intended to be operable on a variety of devices. The present invention may also be implemented with cross-title neutrality wherein an embodiment of the present system may be utilized across a variety of titles from various publishers.
It is noteworthy that any hardware platform suitable for performing the processing described herein is suitable for use with the invention. Computer-readable storage media refer to any medium or media that participate in providing instructions to a CPU for execution. Such media can take many forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, RAM, PROM, EPROM, a FLASH EPROM, and any other memory chip or cartridge.
Various forms of transmission media may be involved in carrying one or more sequences of one or more instructions to a CPU for execution. A bus carries the data to system RAM, from which a CPU retrieves and executes the instructions. The instructions received by system RAM can optionally be stored on a fixed disk either before or after execution by a CPU.
While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. The descriptions are not intended to limit the scope of the invention to the particular forms set forth herein. Thus, the breadth and scope of a preferred embodiment should not be limited by any of the above-described exemplary embodiments. It should be understood that the above description is illustrative and not restrictive. To the contrary, the present descriptions are intended to cover such alternatives, modifications, and equivalents as may be included within the spirit and scope of the invention as defined by the appended claims and otherwise appreciated by one of ordinary skill in the art. The scope of the invention should, therefore, be determined not with reference to the above description, but instead should be determined with reference to the appended claims along with their full scope of equivalents.
The present application is a continuation and claims the priority benefit of U.S. patent application Ser. No. 13/732,290 filed Dec. 31, 2012, issuing as U.S. Pat. No. 9,295,912, which is a continuation and claims the priority benefit of U.S. patent application Ser. No. 12/283,846 filed Sep. 15, 2008, issued as U.S. Pat. No. 8,342,926, which claims the priority benefit of U.S. provisional application No. 61/080,266 filed Jul. 13, 2008, the disclosures of which are incorporated herein by reference.
U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---|
4701130 | Whitney et al. | Oct 1987 | A |
4787051 | Olson | Nov 1988 | A |
4843568 | Krueger | Jun 1989 | A |
5128671 | Thomas, Jr. | Jul 1992 | A |
5440326 | Quinn | Aug 1995 | A |
5528265 | Harrison | Jun 1996 | A |
5947823 | Nimura | Sep 1999 | A |
6017272 | Rieder | Jan 2000 | A |
6072571 | Houlberg | Jun 2000 | A |
6157368 | Faeger | Dec 2000 | A |
6210273 | Matsuno | Apr 2001 | B1 |
6267674 | Kondo et al. | Jul 2001 | B1 |
6273818 | Komoto | Aug 2001 | B1 |
6283861 | Kawai et al. | Sep 2001 | B1 |
6319121 | Yamada et al. | Nov 2001 | B1 |
6375571 | Ohnuma et al. | Apr 2002 | B1 |
6375572 | Masuyama | Apr 2002 | B1 |
6409604 | Matsuno | Jun 2002 | B1 |
6413163 | Yamauchi et al. | Jul 2002 | B1 |
6419580 | Ito | Jul 2002 | B1 |
6504539 | Hiraki | Jan 2003 | B1 |
6533663 | Iwao et al. | Mar 2003 | B1 |
6652384 | Kondo et al. | Nov 2003 | B2 |
6729954 | Atsumi et al. | May 2004 | B2 |
6878065 | Yamamoto et al. | Apr 2005 | B2 |
6955296 | Lusher et al. | Oct 2005 | B2 |
6992596 | Cole et al. | Jan 2006 | B2 |
7121946 | Paul et al. | Oct 2006 | B2 |
7137891 | Neveu et al. | Nov 2006 | B2 |
7158118 | Liberty | Jan 2007 | B2 |
7239301 | Liberty et al. | Jul 2007 | B2 |
7262760 | Liberty | Aug 2007 | B2 |
7414611 | Liberty | Aug 2008 | B2 |
7455589 | Neveu et al. | Nov 2008 | B2 |
7470195 | Baldwin | Dec 2008 | B1 |
7489298 | Liberty et al. | Feb 2009 | B2 |
7489299 | Liberty et al. | Feb 2009 | B2 |
7585224 | Dyke-Wells | Sep 2009 | B2 |
7721227 | Ronkainen | May 2010 | B2 |
7782297 | Zalewski et al. | Aug 2010 | B2 |
8167718 | Haga et al. | May 2012 | B2 |
8210943 | Woodard | Jul 2012 | B1 |
8342926 | Garvin | Jan 2013 | B2 |
8827804 | Woodard | Sep 2014 | B2 |
9295912 | Garvin | Mar 2016 | B2 |
20020034979 | Yamamoto et al. | Mar 2002 | A1 |
20020085097 | Colmenarez | Jul 2002 | A1 |
20020103031 | Neveu et al. | Aug 2002 | A1 |
20030064803 | Komata | Apr 2003 | A1 |
20040212589 | Hall | Oct 2004 | A1 |
20040242321 | Overton | Dec 2004 | A1 |
20050093846 | Marcus et al. | May 2005 | A1 |
20050225530 | Evans et al. | Oct 2005 | A1 |
20060084509 | Novak | Apr 2006 | A1 |
20060192759 | Adams et al. | Aug 2006 | A1 |
20060239471 | Mao et al. | Oct 2006 | A1 |
20070002035 | Plut | Jan 2007 | A1 |
20070060231 | Neveu et al. | Mar 2007 | A1 |
20070060383 | Dohta | Mar 2007 | A1 |
20070060391 | Ikeda et al. | Mar 2007 | A1 |
20070082729 | Letovsky | Apr 2007 | A1 |
20080070684 | Haigh-Hutchinson | Mar 2008 | A1 |
20080188302 | Haga et al. | Aug 2008 | A1 |
20090017909 | Yamada | Jan 2009 | A1 |
20090325660 | Langridge | Dec 2009 | A1 |
20120322523 | Woodard | Dec 2012 | A1 |
20130196757 | Garvin | Aug 2013 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---|
0 913 175 | May 1999 | EP |
2388418 | Dec 2003 | GB |
07-178246 | Jul 1995 | JP |
11-197359 | Jul 1999 | JP |
2001-009156 | Jan 2001 | JP |
WO 2008056180 | May 2008 | WO |
Other Publications

Entry |
---|
U.S. Appl. No. 15/261,723, Bruce Woodard, Target Interface, filed Sep. 9, 2016. |
Bolt, R.A., “Put-that-there”: voice and gesture at the graphics interface, Computer Graphics, vol. 14, No. 3, (ACM SIGGRAPH Conference Proceedings) Jul. 1980, pp. 262-270. |
DeWitt, Thomas and Edelstein, Phil, “Pantomation: A System for Position Tracking,” Proceedings of the 2nd Symposium on Small Computers in the Arts, Oct. 1982, pp. 61-69. |
Diefendorff, Keith “Sony's Emotionally Charged Chip,” Microprocessor Report, vol. 13, No. 5, Apr. 19, 1999. |
FantaVision Game Manual, Sony Computer Entertainment, Inc. 2000. |
Halo 2, released Nov. 9, 2004 by Microsoft Game Studios; "My Thoughts on the Past, Present, and Future of the Halo Series—An Open Letter to Bungie", as evidenced by the web page http://nikon.bungie.org/misc/bungie_open_letter.html, downloaded from http://web.archive.org/web/20070129100341/http://nikon.bungie.org/misc/bungie_open_letter.html, with an archive.org verified date of Jan. 29, 2007. |
id Software, "The Story," QUAKE game manual, with a replacementdocs.com cited upload date of Aug. 24, 2005, downloaded from http://www.replacementdocs.com/request.php?3247. |
Wikipedia, The Free Encyclopedia, "Aimbot," http://en.wikipedia.org/wiki/Aimbot, updated Jun. 3, 2005, last accessed Jul. 5, 2005. |
U.S. Appl. No. 11/650,311 Final Office Action dated Apr. 13, 2011. |
U.S. Appl. No. 11/650,311 Office Action dated Sep. 29, 2010. |
U.S. Appl. No. 13/540,841 Final Office Action dated May 7, 2013. |
U.S. Appl. No. 13/540,841 Office Action dated Nov. 9, 2012. |
U.S. Appl. No. 12/283,846 Final Office Action dated Jan. 31, 2012. |
U.S. Appl. No. 12/283,846 Office Action dated Jun. 23, 2011. |
U.S. Appl. No. 13/732,290 Office Action dated Jan. 28, 2015. |
U.S. Appl. No. 15/261,723 Office Action dated Apr. 10, 2018. |
Prior Publication Data

Number | Date | Country |
---|---|---|---|
20160287990 A1 | Oct 2016 | US |
Provisional Application Priority Data

Number | Date | Country
---|---|---|---|
61080266 | Jul 2008 | US |
Related U.S. Application Data

Relation | Number | Date | Country
---|---|---|---|
Parent | 13732290 | Dec 2012 | US |
Child | 15083658 | US | |
Parent | 12283846 | Sep 2008 | US |
Child | 13732290 | US |