Placement of user information in a game space

Information

  • Patent Grant
  • Patent Number
    10,786,736
  • Date Filed
    Tuesday, May 11, 2010
  • Date Issued
    Tuesday, September 29, 2020
Abstract
The generation, association, and display of in-game tags are disclosed. Such tags introduce an additional dimension of community participation to both single and multiplayer games. Through such tags, players are empowered to communicate through filtered text messages and images as well as audio clips that other game players, including top rated players, have generated and placed at particular coordinates and/or in context of particular events within the game space. The presently described in-game tags and associated user generated content further allow for label based searches with respect to game play.
Description
BACKGROUND OF THE INVENTION
Field of the Invention

The present invention generally relates to interactive game play. More specifically, the present application relates to placement of user-generated content to aid a user with interactive game play.


Description of the Related Art

Improvements in processing power and graphics quality have led to increasingly complex interactive gaming environments. For example, the PlayStation®3's RSX graphics processor allows for freedom of graphics expression in creating next-generation, real-time 3D imagery. Working in tandem with Sony Computer Entertainment Inc.'s Cell Broadband Engine™ Architecture, RSX processor rendered graphics are unparalleled in quality and realism.


Increasingly complex gaming environments have, in turn, resulted in more complex story lines, game play objectives, missions and tasks, and capabilities associated with game play avatars. As a result, interactive game play has become more challenging even for experienced game players. If a game becomes too challenging, however, game players may forsake future game play out of frustration.


To help game players overcome obstacles or achieve goals in a variety of interactive games, various content providers have begun publishing game magazines. These magazines provide game players with a ‘walk thru’ that tells the reader/game player where to go and what to do in order to ‘win’ the game or obtain the highest possible score. Hints or suggestions with respect to special moves or avatar capabilities may also be described in these gaming magazines.


While these magazines may be informative, they suffer from a number of drawbacks. If the magazine is not published by an official source (e.g., an official partner of the game developer), the magazine may omit essential information. In some instances, an unofficial magazine may publish incorrect information. Incorrect information may also result from the tendency to rush and publish these magazines concurrently with the release of an interactive game title to allow for concurrent purchase—even if the magazine is published by an official source.


Game players may also discover ‘Easter Eggs’ or other secrets during the course of game play. These secrets may not be a part of even an official magazine due to the fact that some game design engineers ‘slip in’ these Easter Eggs without the knowledge of the magazine publisher. Many interactive games also allow for the creation of special moves that may not have initially been conceived of by the game developer. As a result, these special moves are not a part of the game play magazine—official or otherwise—as their development occurs after the magazine and associated game have gone to market.


Once game play magazines publish, subsequent editions tend not to be published. The lack of subsequent, updated editions further increases the amount of information that is effectively withheld from game players. Unique game play situations or circumstances may not become apparent until the interactive game is played by a large number of game players. These situations and circumstances may not be addressed in the gaming magazine thereby leaving game players at a loss as to how they may properly address the same.


In contrast, the Internet offers the opportunity for endless publishing and republishing of information. Notwithstanding endless publishing possibilities, websites on the Internet are often decentralized and unorganized. In some instances, there is no ‘official website’ as game developers may wish for game players to purchase a ‘for fee’ official magazine rather than access a free on-line website. Additionally, one website may offer one solution for one particular game play situation whereas another website may offer a solution for another situation. In order for a game player to obtain a complete ‘walk thru’ of a particular interactive game, the user may have to visit multiple websites on the Internet. Since these websites tend to be ‘unofficial,’ there is often an issue with the veracity or accuracy of the information displayed on these websites.


A further shortcoming of the aforementioned prior art solutions is that this information—regardless of source, thoroughness, or quality—lacks contextual relevance. Some game play environments include a variety of ‘acts’ or ‘levels’ of game play; these acts or levels often include a variety of subsidiary ‘scenes’ or ‘stages.’ For example, a game based on the D-Day military offensive may involve four scenes: crossing the English Channel; advancing up Omaha Beach; taking artillery positions at the head of the beach; and securing numerous military objectives in the French countryside. Game play advice concerning how to best maneuver an LCM Landing Craft while crossing the English Channel has no value to the game player that currently needs advice on how to best conduct a room-to-room search in the bombed out buildings of the nearby town of Bayeux. Locating the contextually appropriate game play advice may be time consuming if not confusing to a game player in the ‘heat of battle.’


The aforementioned prior art game play advice solutions are also wanting for lack of real-time provisioning of information. Many of today's interactive games are incredibly realistic, action-intensive simulations such as Warhawk from Sony Computer Entertainment America Inc. A game player often finds themselves ‘in the zone’ with respect to game play. If a game player is continually forced to interrupt game play (e.g., ‘pausing’ the game) in order to flip through pages of a game play magazine or click-thru various pages of content on the Internet, the game player will quickly find themselves losing their rhythm. In such complex game play environments, loss of that rhythm may be to the detriment of continued game play regardless of any hints or information that may have been acquired during the interruption.


Many games are also network or community-based with multiple players located around the country or around the world. Such games may occur in real-time. In certain of these games, the interruption of game play through ‘pause’ functionality may not be an option as may be available in a single-player game environment. The game player may be forced to drop out of a particular network game because the gaming environment cannot both exist in a timed-out/paused state for one game player yet continue in real-time for all others.


While some network or community-based games may allow for a ‘pause’ or other ‘time out’ feature, doing so may be to the detriment of the player invoking the interruption. In some games, for example, other game players may continue to advance through the game play environment by obtaining objects of value or reaching objectives within the environment. In other games, competing and non-paused players may position themselves to take retributive action on the ‘paused’ game player when they reenter the gaming environment. For example, a non-paused player may sneak up behind a ‘paused’ player in a combat environment and assassinate the ‘paused’ player at point-blank range as the ‘paused’ player is unable to observe or react to events in the game environment while in a paused state.


There is a need in the art for game play advice that is complete and up-to-date regardless of when a particular interactive gaming title is released. Further, there is a need for game play advice that is pervasive and easily accessible to game players. There is a still further need for game play advice that is contextually appropriate and provided in real-time when such information is needed most.


SUMMARY OF THE INVENTION

Embodiments of the present invention provide a system and methods for placement of user-generated content to aid a user with interactive game play.


A first claimed embodiment of the present invention includes a method for managing user-generated game play advice. An indication of a location within a game space using a virtual coordinate system is received. The location corresponds to the desirability for rendering of game play advice. Game play advice is received from a user and assigned to a location within a game space previously identified as desirous of game play advice by using a virtual coordinate system. Game play advice is then displayed during subsequent game play at the same location within the game space using the virtual coordinate system, the game play advice displayed in a manner that is appropriate with respect to a present context of game play.


A further claimed embodiment of the present invention includes a computer-readable storage medium having embodied thereon a program. The program is executable by a computer to perform a method like that described above.


In a third claimed embodiment, a system for managing user-generated game play advice is described. The system includes a content submission engine for receiving game play advice over a network and a virtual coordinate system engine for assigning the game play advice to a particular location within a game space. A context engine identifies a context of an event during game play. The context of the event corresponds to game play advice associated with the particular location within the game space. A display engine displays game play advice corresponding to the context of the event identified by the context engine and at the location of the event as identified by the virtual coordinate system.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an exemplary system for placement of user-generated content to aid a user with interactive game play.



FIG. 2 illustrates an exemplary method for receipt and subsequent display of user-generated game play advice using in-game tags.



FIG. 3 illustrates a game space including user-generated content.





DETAILED DESCRIPTION

The present invention allows for the generation, association, and display of in-game tags. Such tags introduce an additional dimension of community participation to both single and multiplayer games. Through such tags, players are empowered to communicate through filtered text messages and images as well as audio clips that other game players, including top rated players, have generated and placed at particular coordinates and/or in context of particular events within the game space. The presently described in-game tags and associated user generated content further allow for label based searches with respect to game play.


In this context, the elements identified throughout are exemplary and may include various alternatives, equivalents, or derivations thereof. Various combinations of hardware, software, and computer-executable instructions may be utilized. Program modules and engines may include routines, programs, objects, components, and data structures that effectuate the performance of particular tasks when executed by a processor, which may be general purpose or application specific. Computer-executable instructions and associated data structures stored in a computer-readable storage medium represent examples of programming means for executing the steps of the methods and/or implementing particular system configurations disclosed herein.



FIG. 1 illustrates an exemplary system 100 for placement of user-generated content to aid a user with interactive game play. The system 100 of FIG. 1 includes a content submission engine 110, content database 120, virtual spatial coordinate (VSC) engine 130, game event and context engine 140, and matching/display engine 150. While various engines and databases are described in the context of FIG. 1, an embodiment of the present invention may offer the functionality of each or certain of these engines and databases in a single ‘content management’ engine or database.


System 100 may be implemented in a network environment such as the Internet, a proprietary communications environment, or a combination of the two. In one example, system 100 is an integrated component of the PlayStation® Network. System 100 (or components thereof) may communicate with the network environment utilizing any number of network interfaces as are known in the art. Examples of such interfaces include a 1000BASE-T Ethernet port or an IEEE 802.11b/g network WiFi interface.


System 100 may be implemented in a computing device such as a server dedicated to managing user-generated content including maintenance of various databases. Alternatively, system 100 may be implemented in a computing device hosting a number of applications such as community maintenance, admission, and network game data distribution. System 100 may be dedicated to a single network game, a genre of games, or any number of games having no particular affiliation at all.


System 100 may also be implemented in a distributed peer-to-peer environment. In such an implementation, certain applications and/or responsibilities may be managed by a group of computing devices in the environment.


Various engines may be distributed to a community of users (e.g., players of a particular game or users in a general gaming network) through a push operation from a tasked server in the game community. Alternatively, various engines may be embodied in a computer-readable storage medium that also includes a particular game application (e.g., a disc). Distributed applications and engines may communicate directly via a group of peers or may be administered by a management server.


Content submission engine 110 is executable to allow a user to communicate with the system 100 over a network for generation of in-game tags and the corresponding submission of user generated content. In-game tags include custom information placed by a user during game play and can include text messages, web links, images, audio or video clips, and user profile information. In-game tags rely upon virtual space coordinates, which are governed by the virtual spatial coordinate engine 130 (described in further detail below) and which allow consistent positional information pertaining to the game space to be assigned to an in-game tag.
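
As an illustration only (the patent does not specify any data format), a tag record of the kind described here might be modeled as follows; all type and field names are hypothetical.

```typescript
// Hypothetical model of an in-game tag and its virtual space coordinate
// (VSC). Names are illustrative, not taken from the patent.

interface VirtualSpaceCoordinate {
  x: number;
  y: number;
  z?: number; // present only when the game space is three-dimensional
}

type TagContent =
  | { kind: "text"; message: string }
  | { kind: "link"; url: string }
  | { kind: "image" | "audio" | "video"; uri: string };

interface InGameTag {
  id: string;
  gameTitle: string;
  location: VirtualSpaceCoordinate; // assigned via the VSC engine 130
  content: TagContent[];            // stored in content database 120
  authorProfileId: string;          // profile of the submitting user
  contextId?: string;               // optional game event/context association
  ranking: number;                  // default ranking, later adjusted by feedback
}

// Example: a text warning placed in a passageway (cf. FIG. 3).
const exampleTag: InGameTag = {
  id: "tag-001",
  gameTitle: "Example WWII Shooter",
  location: { x: 120.5, y: 14.0, z: 3.2 },
  content: [{ kind: "text", message: "watch out for this guy" }],
  authorProfileId: "player-42",
  ranking: 0,
};
```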


Execution of content submission engine 110 may generate a user-interface for allowing user interaction with the system 100. The interface allows a user to assign user generated information to a particular virtual space coordinate (VSC) and a corresponding tag within the game space. The interface specifically allows for allocation of user generated content as might contemporaneously or previously have been stored in content database 120.


During game play, a user may navigate a particular portion of a game environment such as a particular passageway as illustrated in FIG. 3. After having played a particular game a number of times, a user might believe that they have particularly useful information to offer other players of the same game such as warnings about enemies entering that passageway or the best way to navigate the passageway and move onto a subsequent game environment. A user might wish to share that information with other game players.


By depressing a particular button (or combination of buttons) on a control device used in conjunction with game play, a user assigns a tag to that particular locale in the game space. Other means of assigning a tag are envisioned including gesture based assignment in those games utilizing motion based or gesture recognition controls. Audio commands may likewise be used to assign a tag in those games utilizing voice commands or having voice recognition capabilities (e.g., ‘drop tag’ or ‘assign tag’).
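
A minimal sketch, assuming a simple coordinate type, of how these input paths (button, gesture, voice) might converge on one tag-drop routine; the trigger names and handler are invented for illustration.

```typescript
// Hypothetical input binding: a button combination, a recognized gesture,
// or a voice command ("drop tag") all funnel into the same tag-drop call.
// Only the coordinate is captured here; content is attached after play.

type VSC = { x: number; y: number; z?: number };
type TagDropTrigger = "button-combo" | "gesture" | "voice-command";

const pendingTags: VSC[] = [];

function onTagDrop(trigger: TagDropTrigger, playerPosition: VSC): void {
  // Record the virtual space coordinate at the moment of the trigger.
  pendingTags.push({ ...playerPosition });
  console.log(`tag dropped via ${trigger} at`, playerPosition);
}

// e.g., the voice-recognition layer might call:
onTagDrop("voice-command", { x: 12.0, y: 40.5 });
```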


The particular locale in the game space has a VSC, which is the in-game equivalent to a global positioning system location. Through the use of a VSC, and as further described with respect to VSC engine 130, the particular tag will consistently be correlated to that portion of the game space. Whenever another game player (or the same game player) passes by that VSC after the tag has been assigned, the tag and any corresponding information in the content database 120 will be made accessible for review and study.


Content submission engine 110 allows a user to assign user generated information to a tag that was ‘dropped’ in the game space. It is difficult, if not impossible, to provide detailed information, hints, or other data during the course of game play. The content submission engine 110 provides the interface environment that allows for casual entry of that information following the completion of game play. The content submission engine 110 provides a post-game play listing of all tags that were dropped or assigned during game play and gives the user the means to provide an associated set of information to be stored in or retrieved from content database 120.


Through an interface generated by the content submission engine 110, a user may provide a detailed text message concerning information about the game play environment. The content may further include links to web pages that provide further game play information or information concerning upcoming tournaments, clans, and discussion groups. A tag might also be associated with screen shots or other images related to game play that might prove useful (such as maps) or of interest (such as ‘kill shots’). A tag can also be assigned to audio and video clips generated by a user that might provide a ‘replay’ of a particular portion of the game or verbal coaching as to game play. Profile information of the user providing the tag and corresponding user information may also be associated with a tag.
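
A sketch of the post-game annotation flow this interface supports, under the assumption that content is keyed by tag id in the content database; the helper names, URL, and content shapes are invented.

```typescript
// Hypothetical post-game annotation: every tag dropped during play is
// listed, and the user attaches content to each before it is stored in
// the content database for later retrieval at the tag's VSC.

type VSC = { x: number; y: number; z?: number };

interface DroppedTag { id: string; location: VSC; }

type TagContent =
  | { kind: "text"; message: string }
  | { kind: "link"; url: string }
  | { kind: "image" | "audio" | "video"; uri: string };

const contentDatabase = new Map<string, TagContent[]>();

function annotateTag(tag: DroppedTag, content: TagContent[]): void {
  // Associate the user's content with the tag id so it can be served
  // whenever another player passes this VSC.
  contentDatabase.set(tag.id, content);
}

annotateTag({ id: "tag-001", location: { x: 12, y: 40 } }, [
  { kind: "text", message: "through this door" },
  { kind: "link", url: "https://example.org/walkthrough" }, // placeholder URL
]);
```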


Entry of the game play information may be textual where a user enters a written description of the game play advice (e.g., ‘watch out for this guy’ or ‘through this door’ as shown in FIG. 3). Text-entry may occur through a virtual keyboard manipulated by a game controller coupled to a gaming platform. The gaming platform, in turn, is coupled to the system 100 via a network. Submission of game play advice may be audible and provided by speaking into a USB microphone headset. Combinations of game play advice submissions are also within the scope of the present invention (e.g., a video clip with audible narration).


In some embodiments, the content submission engine 110 allows the user to re-trace game play and generate tags after the completion of game play. Some games might be so intense that even the act of generating a mere tag might interfere with optimal game play. In such a game, the user can execute the content submission engine 110 after game play is complete and ‘re-trace’ their steps, as the game will have tracked what portions of the environment were and were not accessed during play. The user may then assign tags to particular portions of the game space using a VSC system and the information associated therewith.


Submission of game play advice may also be contextually relevant. Because many games are dynamic, especially first-person shooter type games, a scenario encountered in a particular environment during one round of game play (e.g., particular enemies) may differ significantly from a subsequent encounter in the exact same game space, depending on the scenario generated by the game play intelligence. In such an instance, providing a tag indicative of game play advice to a subsequent user when the event giving rise to the tag is not at hand may be distracting and actually detract from effective game play.


Game event and context engine 140 may track these particular nuanced events and, in conjunction with the matching and display engine 150, ensure that only contextually relevant tags are displayed. Information concerning context may automatically be displayed by the content submission engine 110. Alternatively, a user might identify specific contextual limitations during the information provisioning process.


In order to avoid inconsistent naming protocols that might otherwise complicate presentation of context sensitive game play advice, the content submission engine 110 may indicate that hints related to storming the beach at Omaha in a World War II combat simulation are all provided under the category of ‘Omaha Beach’ instead of a series of user generated titles such as ‘storming the beach,’ ‘Omaha,’ ‘chapter II,’ and others. The content submission engine 110 may work in conjunction with the game event and context engine 140 with respect to providing naming protocols.
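
One plausible way to enforce such a naming protocol is a lookup from user-entered labels to the canonical category; the mapping and function below are assumptions for illustration, reusing the labels from the example above.

```typescript
// Hypothetical normalization of user-entered category names to the
// canonical label supplied for the level ("Omaha Beach" in this example).

const CANONICAL_CATEGORIES: Record<string, string> = {
  "storming the beach": "Omaha Beach",
  "omaha": "Omaha Beach",
  "chapter ii": "Omaha Beach",
};

function canonicalCategory(userLabel: string): string {
  const key = userLabel.trim().toLowerCase();
  // Fall back to the user's own label when no canonical mapping exists.
  return CANONICAL_CATEGORIES[key] ?? userLabel;
}

console.log(canonicalCategory("Chapter II")); // "Omaha Beach"
```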


The content submission engine 110 may also allow for user corrections or annotations of game play advice. For example, a previous user might provide information concerning accessing a particular weapon, but erroneously identify the particular weapon or provide some other contextually inappropriate information. A subsequent user (or users) receiving that contextually inappropriate information may recognize the error, note that the information might be better presented in a subsequent stage or area of game play, or simply correct an otherwise minor error. The subsequent user may lodge a complaint or suggest that an entity tasked with quality assurance of game play advice review the submission and/or context of the same.


Content database 120 manages user-generated game play advice submitted through the content submission engine 110. Content database 120 may manage submitted game play advice by user, game title, nature of the advice, date, size, content of the advice (e.g., video, audio, text, combinations of content), context, and so forth. Content database 120 may include non-user generated game play advice (e.g., prestocked game play advice from the game publisher) that may be displayed by system 100.


Content database 120 may store all game play advice received through an interface generated by content submission engine 110. Alternatively, certain game play advice may expire over time or upon the occurrence of certain events. For example, the content database 120 may only retain the top-100 ranked game play advice submissions (as described in further detail herein). Once a particular instance of game play advice falls below a top-100 threshold, that particular instance may be deleted from the content database 120. Expiration may be temporal such that instances of game play advice that are not accessed for a particular period of time are removed from the content database 120. Instances of game play advice may also be removed from the game play advice content database 120 a predetermined number of days after having been submitted to the system 100.
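
A minimal sketch of the retention policies just described, combining the top-100 cutoff with temporal expiration; the time thresholds and field names are assumptions (only the top-100 figure comes from the text).

```typescript
// Hypothetical pruning pass over stored advice: drop entries that have
// gone unused or are too old, then keep only the top-100 by ranking.

interface StoredAdvice {
  id: string;
  ranking: number;      // higher is better
  lastAccessed: number; // epoch milliseconds
  submittedAt: number;  // epoch milliseconds
}

const TOP_N = 100;                              // from the text
const MAX_IDLE_MS = 90 * 24 * 60 * 60 * 1000;   // illustrative: 90 days unused
const MAX_AGE_MS = 365 * 24 * 60 * 60 * 1000;   // illustrative: 1 year old

function prune(advice: StoredAdvice[], now: number): StoredAdvice[] {
  return advice
    .filter((a) => now - a.lastAccessed <= MAX_IDLE_MS)
    .filter((a) => now - a.submittedAt <= MAX_AGE_MS)
    .sort((a, b) => b.ranking - a.ranking)
    .slice(0, TOP_N);
}
```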


System 100 may include a ranking engine (not shown) to manage the ranking of game play advice stored in content database 120. As described in co-pending patent publication numbers U.S. 2010-0041475 A1 for “Real-Time, Contextual Display of Ranked, User-Generated Game Play Advice” and U.S. 2009-0063463 A1 for “Ranking of User-Generated Game Play Advice,” the disclosures of each being incorporated herein by reference, when new game play advice is received, a ranking engine may assign a default ranking to a new instance of game play advice. This default ranking and any other ranking (including those generated as a result of user feedback) may be measured utilizing any rubric capable of distinguishing one instance of user-generated game play advice from another. In conjunction with a feedback engine and optional weighting engine, both of which are described in the aforementioned publications, the perceived quality of game play advice as adjudicated by a community of users may be more readily identified.


Virtual spatial coordinate engine 130, as noted above, operates as a global positioning system for a particular game space. Depending on the particular layout of the game environment, the VSC engine 130 may identify an X, Y, and (if appropriate) Z coordinate for the game space. This coordinate in the game space is then associated with individual instances of in-game tags such that the tags are consistently provided in the same game space as when they were originally assigned. The VSC engine 130 not only provides consistent presentation of information, but also accurate presentation, as more general descriptions such as ‘hallway by the door,’ ‘on the beach,’ or ‘Level II’ as might otherwise be utilized may not provide the specificity required to render useful game play advice. The VSC engine 130 may operate in conjunction with rendering and tracking information supplied by a particular game title and may thus itself be agnostic as to any particular game title.


Information concerning VSC data may be provided to the content submission engine 110 to allow for generation of content and matching to in-game tags. VSC data from engine 130 may likewise be provided to content database 120 to allow for proper retrieval and display of user content and in-game tags by matching and display engine 150. VSC data may also be used by game event and context engine 140 to assign proper game context to tags and associated content vis-à-vis the submission engine and the matching/display engine 150.


Game event and context engine 140 is tasked with providing game play advice in an appropriate context of game play such that it may be appropriately displayed by the matching and display engine 150. Content submission engine 110 allows for annotation of appropriate contexts of game play advice by means of an in-game tag. The game event and context engine 140 may identify the context of game play that would be appropriate for game play advice. For example, walking down an alleyway without threats, obstacles, or other encounters that would require tactical game play is not likely to warrant the need for hints or advice. Advancing up the beaches of Normandy on D-Day with heavy gun fire from German forces, obstacles and landmines on the beach, and advancing troops and equipment from the English Channel would clearly require quick and strategic thinking. In this instance, the game event and context engine 140 would, in conjunction with the matching and display engine 150, identify that tags providing game play advice are appropriate and feed that tag information to the display engine 150 such that tags may be displayed and content eventually accessed in content database 120.


A game developer may make initial determinations as to whether a particular task or level will provide certain challenges thus making advice warranted. The game event and context engine 140 may be programmed to correspond to such determinations. Further, the game developer may allow for the introduction of user generated game play advice in those contexts where the game developer provides their own default game play advice; these points may likewise be introduced into the game event and context engine 140. Game developers, too, may study game play feedback in network games with respect to identifying choke points or other areas where particular obstacles might prove to be more challenging in actual game play implementation than those obstacles were during the course of prerelease testing. A game developer may release an update to the game event and context engine 140 over a network that allows for introduction of user advice post-release. The content submission engine 110 may then access the game event and context engine 140 to allow for users to provide this information. These points may be with respect to levels, obstacles, events, enemies, and so forth.


As noted with respect to the content submission engine 110, the game event and context engine 140 may identify certain points of game play related to objects, challenges, or enemies as well as levels or stages as a whole. Game code or other metadata may be flagged with respect to objects or enemies and these flags may be recognized by the game event and context engine 140 upon execution of the game code by a gaming system or processing device. These flags or metadata may be tied to allowing for entry of game play advice. For example, in a World War II simulation, a player might be crossing a field. The field, without any enemies present, may not warrant the need for game play advice—submissions or providing of the same. Later in that same game environment (the field) a tank may enter the scene and begin firing upon the game player. With the introduction of the tank, providing or receiving game play advice may now be warranted. For the tank to appear in the scene would require the execution of code related to the tank. The code for introducing and intelligently controlling the tank by the game platform may be flagged or identified by the aforementioned metadata. Once that flagged code or metadata is recognized by the game event and context engine 140, a user may provide advice or receive the same.
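
A sketch of the flag-recognition idea in the tank example: entities whose game code or metadata carries an advice-relevance flag switch advice provisioning on. The entity shape and flag name are invented for illustration.

```typescript
// Hypothetical check performed as flagged code/metadata is loaded: advice
// may be provided or received only while a flagged entity is in the scene.

interface GameEntity {
  name: string;
  adviceRelevant: boolean; // flag set by the developer in code/metadata
}

function adviceAllowed(activeEntities: GameEntity[]): boolean {
  // An empty field warrants no advice; a flagged tank or sniper does.
  return activeEntities.some((e) => e.adviceRelevant);
}

// Crossing an empty field: no advice offered.
console.log(adviceAllowed([{ name: "field", adviceRelevant: false }])); // false

// A tank enters the scene: advice becomes available.
console.log(
  adviceAllowed([
    { name: "field", adviceRelevant: false },
    { name: "tank", adviceRelevant: true },
  ])
); // true
```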


The game event and context engine 140, in this regard, is not only responsible for identifying those points or instances of game play where a user may provide advice, but also those instances where providing advice is appropriate. For example, in the previously mentioned alleyway example, no challenges are present thus making the introduction of advice by the system inappropriate or unnecessary. Should a sniper suddenly begin firing upon the game player, then advice on how to deal with the sniper may be appropriate for the user to consider. The game event and context engine 140 may recognize that providing information related to the sniper is appropriate based on the game platform loading flagged code related to the sniper. Similar provisioning of advice may occur with respect to encountering objects and the like. The game event and context engine 140 may be tied to the game play advice display engine 150 to allow for timely and contextually appropriate display of that advice.


Game play advice display engine 150 is configured to allow for the eventual display of user-generated game play advice via in-game tags and VSC data. Display of this advice may be in further accordance with a ranking result generated by a ranking engine and in further consideration of determinations made by the game event and context engine 140. Game play advice display engine 150 acquires the advice from the game play advice content database 120 and, if appropriate, from a ranking database in which game play advice has been ranked by a ranking engine. The display engine 150 then displays the game play advice (or makes it available) in accordance with the VSC data from engine 130 as well as the determination by the game event and context engine 140 that display of advice related to a particular in-game tag and aspect of game play is appropriate.


By working in conjunction with the game event and context engine 140, the display engine 150 may display the highest ranked information but do so in the most appropriate context. For example, displaying information about a particular enemy may be inappropriate when the user has not encountered that enemy notwithstanding the fact that the user providing the information previously encountered that enemy at the same VSC coordinates.


The display engine 150 may utilize an asynchronous programming language to provide real-time (or substantially near real-time) updates to ranked game play advice for display to a community of users. The display engine 150 may, therefore, utilize a ladder ranking of game play advice with respect to determining which in-game tags to display. In such an embodiment, the highest quality advice is presented as that advice ranks at the top of a ladder. In some embodiments, the particular arrangement of the advice as it corresponds to a given tag may be subject to user or system preferences such as particular tags searched by a user or identified as being desirable by a user.
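
A sketch of how ladder-ranked, context-filtered tag selection might look, folding in the user preference filtering described next; the proximity radius, display limit, and field names are assumptions.

```typescript
// Hypothetical selection: of the tags near the player's current VSC, keep
// only those matching the current context (and any user topic filter),
// then order them ladder-style with the highest ranked first.

type VSC = { x: number; y: number; z?: number };

interface DisplayableTag {
  location: VSC;
  contextId: string;
  topic: string; // e.g., "sniper rifle"
  ranking: number;
}

function tagsToDisplay(
  all: DisplayableTag[],
  here: VSC,
  currentContextId: string,
  topicFilter?: string,
  limit = 3
): DisplayableTag[] {
  const near = (a: VSC, b: VSC) =>
    Math.hypot(a.x - b.x, a.y - b.y, (a.z ?? 0) - (b.z ?? 0)) < 5; // radius illustrative
  return all
    .filter((t) => near(t.location, here))
    .filter((t) => t.contextId === currentContextId)
    .filter((t) => !topicFilter || t.topic === topicFilter)
    .sort((a, b) => b.ranking - a.ranking) // ladder: best advice on top
    .slice(0, limit);
}
```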


For example, a user may consistently experience difficulty using a particular weapon during game play (e.g., a sniper rifle). Prior to game play, a user seeking advice may, through a corresponding search engine or other interface, inform system 100 that only those in-game tags and corresponding advice with respect to use of the sniper rifle are wanted. In this manner, the user is not inundated with data concerning the use of grenades, hand guns, and rocket launchers—all weapons with which the user might be quite prolific and for which advice is not needed.


Similar searching and screening of tags may be used with respect to advice from particular users or particular clans. This information may be derived from profile information provided during tag and advice generation. In some instances, a user providing game play advice may limit the accessibility of that advice to a limited number of users. A user wishing to access advice from a particular providing user may need to have been identified in advance of in-game tag access or otherwise provide a password or some indicia indicating that they are authorized to access in-game tags and corresponding advice generated by a particular user.


Display engine 150 may display advice in the context of a real-world virtual environment and/or a first- or third-person avatar. Game play advice may be expressly provided via an in-game tag as shown in FIG. 3. Game play advice may also be provided through a series of hyperlinks provided through the tag. Graphic images may also be utilized, especially in the context of game play advice that incorporates full motion video or still images. Links to audio files may be appropriate in the case of audio-rendered advice. All of the aforementioned means of providing game play advice to a community of users (and in accordance with an assigned default or feedback controlled ranking) may be managed by the display engine 150 and the game event and context engine 140.



FIG. 2 illustrates an exemplary method 200 for receipt and subsequent display of user-generated game play advice using in-game tags. The steps identified in FIG. 2 (and the order thereof) are exemplary and may include various alternatives, combinations, equivalents, or derivations thereof including but not limited to the order of execution of the same. The steps of the process of FIG. 2 (and its various alternatives) may be embodied in hardware or software including a computer-readable storage medium (e.g., optical disc, memory card, or hard drive) including instructions executable by the processor of a computing device.


In step 210, user-generated game play advice is received from a user in the community via an interface generated by the content submission engine 110. Upon receipt of the user-generated advice in step 210, the advice is processed by the system 100 as described in the context of FIG. 1 and stored in game play advice content database 120. Various rankings may also be assigned.


In step 220, the user-generated game play advice, which is associated with a tag, is assigned a particular context either by the user submitting the advice or by the game event and context engine 140 as well as being matched with a given tag using VSC coordinates. In some instances, the game event and context engine 140 may control the available contexts that a user assigns to the advice. In other instances, the game event and context engine 140 may make a determination as to the specific context of advice.


During subsequent game play (step 230), the same or a different game player may be navigating a particular game space. A previously generated tag may be identified by means of VSC coordinates at step 240 (i.e., a tag exists as to some particular game play advice at this particular locale in the game space). The context of a game event is then identified in step 250. Identification step 250 occurs as a result of the joint operation of the game event and context engine 140 and display engine 150 and may be similar to identification of an initial context of game play advice as occurs in the context of step 230 (but not otherwise displayed in FIG. 2). Once a particular context corresponding to a particular VSC has been identified in the environment, advice that is relevant to that particular context is identified. That advice is rendered in conjunction with display engine 150 at step 260. The display of advice may take into account user rankings and/or user defined search tags or other limitations.


The method 200 of FIG. 2 may operate in real-time (or substantially in real-time) using an asynchronous programming language. Through the use of an asynchronous language, small amounts of data may be continually exchanged with a database so that an entire user interface need not be reloaded in response to each user interaction. In such an embodiment, an XMLHttpRequest object may, for example, be utilized to fetch the most recent, contextually, and locally relevant game play advice from database 120 as referenced in FIG. 1. Relationships between rankings, user feedback, context, and game play advice may be reflected by metadata or header data stored in the various databases of system 100. Game play advice rankings and context determinations may thus be updated as feedback is received and new rankings are calculated.
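
In the spirit of the XMLHttpRequest example in the text, a browser-side sketch of such an incremental refresh might look as follows; the endpoint path, query parameters, and response handling are assumptions.

```typescript
// Hypothetical asynchronous refresh: fetch only the advice relevant to the
// current VSC and context so the interface need not be reloaded.

type VSC = { x: number; y: number; z?: number };

function fetchAdviceUpdates(
  vsc: VSC,
  contextId: string,
  onUpdate: (adviceJson: string) => void
): void {
  const xhr = new XMLHttpRequest();
  const query = `x=${vsc.x}&y=${vsc.y}&z=${vsc.z ?? 0}&context=${contextId}`;
  xhr.open("GET", `/advice/updates?${query}`, true); // asynchronous request
  xhr.onload = () => {
    if (xhr.status === 200) onUpdate(xhr.responseText); // small incremental payload
  };
  xhr.send();
}

// Poll on a schedule (the text mentions, e.g., five-minute intervals).
setInterval(
  () => fetchAdviceUpdates({ x: 12, y: 40 }, "omaha-beach", console.log),
  5 * 60 * 1000
);
```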


Updating of information displayed in FIG. 2 may also operate subject to a predetermined schedule. For example, a ranking engine may update rankings via user feedback at five minute intervals (or any other time period as may be determined by a system administrator). Similar updates may occur with respect to context. Once an update is complete as a result of a regularly scheduled ranking operation, the newly updated information may be pushed to the display engine 150 for display to the community of users in conjunction with appropriate VSC coordinates and context. The updated information may also be available for access in response to a user request or query.


While the present invention has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the true spirit and scope of the present invention. Various alternative systems may be utilized to implement the various methodologies described herein and various methods may be used to achieve certain results from the aforementioned systems.

Claims
  • 1. A computer-implemented method for providing only contextually relevant user-generated game play advice for a location within a dynamic game, the method comprising: executing, by a processor, first instructions stored in a memory to receive an indication of a location within a game space using a virtual coordinate system, the location corresponding to a desirability for rendering of game play advice; executing, by the processor, second instructions stored in the memory: (i) to receive the game play advice for a first location from a user, and (ii) to determine a first scenario generated by the dynamic game that corresponds to the game play advice for the first location; executing, by the processor, third instructions stored in the memory to assign the game play advice to the first location specifically for the first scenario generated by the dynamic game within the game space, the first location and the first scenario being previously identified as desirous of the game play advice; executing, by the processor, fourth instructions stored in the memory to, during subsequent game play, determine: (iii) the subsequent game play's location within the game space; and (iv) the subsequent game play's scenario at the subsequent game play's location; and executing, by the processor, fifth instructions stored in the memory to, in response to the subsequent game play being at the first location within the game space, automatically display at the first location any of the game play advice that is appropriate for the determined subsequent game play's scenario at the first location, such that the game play advice other than that which is appropriate for the subsequent game play's scenario at the first location is not displayed at the first location.
  • 2. The method of claim 1, wherein the game play advice is displayed in a three-dimensional virtual environment.
  • 3. The method of claim 2, wherein the virtual coordinate system uses X, Y, and Z coordinates.
  • 4. The method of claim 1, wherein the game play advice is textual.
  • 5. The method of claim 1, wherein the game play advice is visual.
  • 6. The method of claim 1, wherein the game play advice is audible.
  • 7. The method of claim 1, wherein executing to display the game play advice includes a consideration of a ranking of all available game play advice, and wherein only game play advice of a particular ranking is displayed at the location and with respect to the present context of game play, the game play advice further allowing for corrections to be made by the user.
  • 8. The method of claim 1, further comprising retracing the game play advice upon which the game play is based, the game play advice retraceable by the user after completion of the game play for revising or adding to the game play advice.
  • 9. A system for providing only contextually relevant user-generated game play advice for a location within a dynamic game, the system comprising: a processor; and a memory communicatively coupled with the processor, the memory storing instructions which when executed by the processor perform a method, the method comprising: receiving, via a content submission engine, the game play advice over a network from a user; assigning, via a virtual coordinate system engine, the game play advice to a location within the game space; identifying, via a context engine, a first scenario generated by the dynamic game that corresponds to the game play advice for the first location, and assigning the game play advice to the first location specifically for the first scenario generated by the dynamic game within the game space; during subsequent game play, determining, via the context engine: (iii) the subsequent game play's location within the game space; and (iv) the subsequent game play's scenario at the subsequent game play's location; and in response to the subsequent game play being at the first location within the game space, automatically displaying, via a display engine, at the first location any of the game play advice that is appropriate for the determined subsequent game play's scenario at the first location, such that the game play advice other than that which is appropriate for the subsequent game play's scenario at the first location is not displayed at the first location.
  • 10. The system of claim 9, wherein the method further comprises affecting, via a ranking engine, the game play advice displayed by the display engine notwithstanding the context of the event and the location of the event.
  • 11. The system of claim 10, wherein the method further comprises receiving, via a feedback engine, feedback from a community of users with respect to the quality of the game play advice displayed by the display engine, wherein the feedback engine and the ranking engine operate to allocate a new ranking to the game play advice in accordance with the feedback received from the community of users, the game play advice being subsequently displayed by the display engine in accordance with the new ranking.
  • 12. The system of claim 9, wherein the display engine operates using an asynchronous programming language to continually update displayed game play advice submissions in accordance with a most recent determination as to the context of the event.
  • 13. A method for providing only contextually relevant user-generated game play advice for a location within a dynamic game, the method comprising: receiving an indication of a location within a game space using a virtual coordinate system; receiving game play advice from a user; recognizing metadata associated with objects, challenges or enemies in the game play at the location that indicate that user generated advice is allowed; assigning the user generated game play advice to the location within the game space and assigning the user generated game play advice a tag based upon the recognized metadata; and automatically displaying game play advice during subsequent game play at the same location within the game space using the virtual coordinate system if recognized metadata in the subsequent game play is similar to the recognized metadata indicated by the tag associated with the game play advice.
  • 14. The method of claim 13, further comprising retracing the game play advice upon which the game play is based, the game play advice retraceable by the user after completion of the game play for revising or adding to the game play advice.
US Referenced Citations (466)
Number Name Date Kind
3147341 Gibson Sep 1964 A
3200193 Eiggs Aug 1965 A
3717345 Banville Feb 1973 A
3943277 Everly et al. Mar 1976 A
4051491 Toyoda Sep 1977 A
4051520 Davidse et al. Sep 1977 A
4068847 Lukkarila et al. Jan 1978 A
4090216 Constable May 1978 A
4116444 Mayer et al. Sep 1978 A
4133004 Fitts Jan 1979 A
4166429 Smorzaniuk Sep 1979 A
4166430 Johnson, Jr. Sep 1979 A
4203385 Mayer et al. May 1980 A
4241341 Thorson Dec 1980 A
4321635 Tsuyuguchi Mar 1982 A
4355334 Fitzgibbon et al. Oct 1982 A
4361850 Nishimura Nov 1982 A
4448200 Brooks et al. May 1984 A
4514727 Van Antwerp Apr 1985 A
4533937 Yamamoto et al. Aug 1985 A
4646075 Andrews et al. Feb 1987 A
4649504 Krouglicof et al. Mar 1987 A
4658247 Gharachorloo Apr 1987 A
4672564 Egli et al. Jun 1987 A
4675562 Herlein et al. Jun 1987 A
4677569 Nakano et al. Jun 1987 A
4683466 Holtey et al. Jul 1987 A
4685054 Manninen et al. Aug 1987 A
4685146 Fenster et al. Aug 1987 A
4709231 Sakaibara et al. Nov 1987 A
4727365 Bunker et al. Feb 1988 A
4737921 Goldwasser et al. Apr 1988 A
4757525 Matthews et al. Jul 1988 A
4764727 McConchie, Sr. Aug 1988 A
4807158 Blanton et al. Feb 1989 A
4817005 Kubota et al. Mar 1989 A
4843568 Krueger et al. Jun 1989 A
4860197 Langendorf et al. Aug 1989 A
4864515 Deck Sep 1989 A
4866637 Gonzalez-Lopez et al. Sep 1989 A
4901064 Deering Feb 1990 A
4905147 Logg Feb 1990 A
4905168 McCarthy et al. Feb 1990 A
4933864 Evans, Jr. et al. Jun 1990 A
4934908 Turrell et al. Jun 1990 A
4942538 Yuan et al. Jul 1990 A
4943938 Aoshima et al. Jul 1990 A
4952917 Yabuuchi Aug 1990 A
4956794 Zeevi et al. Sep 1990 A
4962540 Tsujiuchi et al. Oct 1990 A
4969036 Bhanu et al. Nov 1990 A
4980823 Liu Dec 1990 A
4991223 Bradley Feb 1991 A
4992972 Brooks et al. Feb 1991 A
5014327 Potter et al. May 1991 A
5034986 Karmann et al. Jul 1991 A
5045843 Hansen Sep 1991 A
5057744 Barbier et al. Oct 1991 A
5064291 Reiser Nov 1991 A
5067014 Bergen et al. Nov 1991 A
5128671 Thomas, Jr. Jul 1992 A
5128794 Mocker et al. Jul 1992 A
5162781 Cambridge Nov 1992 A
5194941 Grimaldi et al. Mar 1993 A
5208763 Hong et al. May 1993 A
5212888 Cary et al. May 1993 A
5222203 Obata Jun 1993 A
5227985 DeMenthon Jul 1993 A
5230623 Guthrie et al. Jul 1993 A
5253339 Wells et al. Oct 1993 A
5261820 Slye et al. Nov 1993 A
5265888 Yamamoto et al. Nov 1993 A
5268996 Steiner et al. Dec 1993 A
5269687 Mott et al. Dec 1993 A
5274560 LaRue Dec 1993 A
5297061 Dementhon et al. Mar 1994 A
5305389 Palmer Apr 1994 A
5307137 Jones et al. Apr 1994 A
5335557 Yasutake Aug 1994 A
5351090 Nakamura Sep 1994 A
5354202 Moncrief et al. Oct 1994 A
5361147 Katayama et al. Nov 1994 A
5363120 Drumm Nov 1994 A
5366376 Copperman et al. Nov 1994 A
5367615 Economy et al. Nov 1994 A
5369737 Gholizadeh et al. Nov 1994 A
5377997 Wilden et al. Jan 1995 A
5387943 Silver Feb 1995 A
5405151 Naka et al. Apr 1995 A
5446714 Yoshio et al. Aug 1995 A
5446798 Morita et al. Aug 1995 A
5448687 Hoogerhyde et al. Sep 1995 A
5450504 Calia Sep 1995 A
5469193 Giobbi et al. Nov 1995 A
5473736 Young Dec 1995 A
5526041 Glatt Jun 1996 A
5534917 MacDougall Jul 1996 A
5537638 Morita et al. Jul 1996 A
5548667 Tu Aug 1996 A
5550960 Shirman et al. Aug 1996 A
5555532 Sacha Sep 1996 A
5557684 Wang et al. Sep 1996 A
5559950 Cannon Sep 1996 A
5563989 Billyard Oct 1996 A
5572261 Cooper Nov 1996 A
5574836 Broemmelsiek Nov 1996 A
5577179 Blank Nov 1996 A
5577913 Moncrief et al. Nov 1996 A
5586231 Florent et al. Dec 1996 A
5590248 Zarge et al. Dec 1996 A
5598297 Yamanaka et al. Jan 1997 A
5611000 Szeliski et al. Mar 1997 A
5616078 Oh Apr 1997 A
5617407 Bareis Apr 1997 A
5630033 Purcell et al. May 1997 A
5631697 Nishimura et al. May 1997 A
5647019 Iino et al. Jul 1997 A
5649032 Burt et al. Jul 1997 A
5659671 Tannenbaum et al. Aug 1997 A
5660547 Copperman Aug 1997 A
5668646 Katayama et al. Sep 1997 A
5672820 Rossi et al. Sep 1997 A
5673374 Sakaibara et al. Sep 1997 A
5680487 Markandey Oct 1997 A
5684887 Lee et al. Nov 1997 A
5699497 Erdahl et al. Dec 1997 A
5704024 Voorhies et al. Dec 1997 A
5717148 Ely et al. Feb 1998 A
5717848 Watanabe et al. Feb 1998 A
5734384 Yanof et al. Mar 1998 A
5748865 Yamamoto et al. May 1998 A
5748867 Cosman et al. May 1998 A
5751928 Bakalash May 1998 A
5756354 Tzidon et al. May 1998 A
5757360 Nitta et al. May 1998 A
5760781 Kaufman et al. Jun 1998 A
5761401 Kobayashi et al. Jun 1998 A
5764803 Jacquin et al. Jun 1998 A
5769718 Rieder Jun 1998 A
5774124 Itoh et al. Jun 1998 A
5781194 Ponomarev et al. Jul 1998 A
5786801 Ichise Jul 1998 A
5793376 Tanaka et al. Aug 1998 A
5796952 Davis et al. Aug 1998 A
5798519 Vock et al. Aug 1998 A
5805170 Burch Sep 1998 A
5805745 Graf Sep 1998 A
5805782 Foran Sep 1998 A
5808617 Kenworthy et al. Sep 1998 A
5808619 Choi et al. Sep 1998 A
5812136 Keondjian Sep 1998 A
5812141 Kamen et al. Sep 1998 A
5818424 Korth Oct 1998 A
5818553 Koenck et al. Oct 1998 A
5825308 Rosenberg Oct 1998 A
5831623 Negishi et al. Nov 1998 A
5838366 Snape et al. Nov 1998 A
5852443 Kenworthy Dec 1998 A
5854632 Steiner Dec 1998 A
5856844 Batterman et al. Jan 1999 A
5864342 Kajiya et al. Jan 1999 A
5864742 Gasper et al. Jan 1999 A
5870097 Snyder et al. Feb 1999 A
5870098 Gardiner Feb 1999 A
5880736 Peercy et al. Mar 1999 A
5880856 Ferriere Mar 1999 A
5889505 Toyama et al. Mar 1999 A
5890122 Van Kleeck et al. Mar 1999 A
5894308 Isaacs Apr 1999 A
5899810 Smith May 1999 A
5903318 Demay et al. May 1999 A
5905894 De Bonet May 1999 A
5912830 Krech, Jr. et al. Jun 1999 A
5913727 Ahdoot Jun 1999 A
5914724 Deering et al. Jun 1999 A
5915972 Tada Jun 1999 A
5917937 Szeliski et al. Jun 1999 A
5922318 Bandman et al. Jul 1999 A
5923381 Demay et al. Jul 1999 A
5929860 Hoppe Jul 1999 A
5933150 Ngo et al. Aug 1999 A
5933535 Lee et al. Aug 1999 A
5935198 Blomgren Aug 1999 A
5949424 Cabral et al. Sep 1999 A
5953485 Abecassis Sep 1999 A
5959673 Lee et al. Sep 1999 A
5963209 Hoppe Oct 1999 A
5964660 James et al. Oct 1999 A
5966133 Hoppe Oct 1999 A
5977977 Kajiya et al. Nov 1999 A
5982352 Pryor Nov 1999 A
5982390 Stoneking et al. Nov 1999 A
5986668 Szeliski et al. Nov 1999 A
5987164 Szeliski et al. Nov 1999 A
5990901 Lawton et al. Nov 1999 A
6002738 Cabral et al. Dec 1999 A
6009188 Cohen et al. Dec 1999 A
6009190 Szeliski et al. Dec 1999 A
6010403 Adam et al. Jan 2000 A
6016150 Lengyel et al. Jan 2000 A
6018347 Willis Jan 2000 A
6018349 Szeliski et al. Jan 2000 A
6023523 Cohen et al. Feb 2000 A
6026182 Lee et al. Feb 2000 A
6031934 Ahmad et al. Feb 2000 A
6034691 Aono et al. Mar 2000 A
6034692 Gallery et al. Mar 2000 A
6034693 Kobayashi et al. Mar 2000 A
6035067 Ponticos Mar 2000 A
6037947 Nelson et al. Mar 2000 A
6040842 Wavish et al. Mar 2000 A
6044181 Szeliski et al. Mar 2000 A
6046744 Hoppe Apr 2000 A
6049619 Anandan et al. Apr 2000 A
6049636 Yang Apr 2000 A
6058397 Barrus et al. May 2000 A
6072494 Nguyen Jun 2000 A
6072504 Segen Jun 2000 A
6081274 Shiraishi Jun 2000 A
6100898 Malamy et al. Aug 2000 A
6101289 Kellner Aug 2000 A
6112240 Pogue et al. Aug 2000 A
6121953 Walker Sep 2000 A
6127936 Gendel et al. Oct 2000 A
6130673 Pulli et al. Oct 2000 A
6137492 Hoppe Oct 2000 A
6141013 Nelson et al. Oct 2000 A
6141041 Carlbom et al. Oct 2000 A
6155924 Nakagawa et al. Dec 2000 A
6157386 Wilde Dec 2000 A
6162123 Woolston Dec 2000 A
6172354 Adan et al. Jan 2001 B1
6175367 Parikh et al. Jan 2001 B1
6181384 Kurashige et al. Jan 2001 B1
6181988 Schneider et al. Jan 2001 B1
6199093 Yokoya Mar 2001 B1
6200138 Ando et al. Mar 2001 B1
6201581 Moriwake et al. Mar 2001 B1
6203426 Matsui et al. Mar 2001 B1
6208347 Migdal et al. Mar 2001 B1
6220962 Miyamoto et al. Apr 2001 B1
6222555 Christofferson et al. Apr 2001 B1
6229553 Duluk, Jr. et al. May 2001 B1
6233291 Shukhman et al. May 2001 B1
6252608 Snyder et al. Jun 2001 B1
6268875 Duluk, Jr. et al. Jul 2001 B1
6273814 Komoto Aug 2001 B1
6288730 Duluk, Jr. et al. Sep 2001 B1
6313841 Ogata et al. Nov 2001 B1
6313842 Tampieri Nov 2001 B1
6319129 Igarashi et al. Nov 2001 B1
6320580 Yasui et al. Nov 2001 B1
6323838 Thanasack et al. Nov 2001 B1
6330000 Fenney et al. Dec 2001 B1
6331851 Suzuki et al. Dec 2001 B1
6342885 Knittel et al. Jan 2002 B1
6348921 Zhao et al. Feb 2002 B1
6353272 van der Hoeven Mar 2002 B1
6356263 Migdal et al. Mar 2002 B2
6356288 Freeman et al. Mar 2002 B1
6361438 Morihira Mar 2002 B1
6366272 Rosenberg et al. Apr 2002 B1
6392647 Migdal et al. May 2002 B1
6396490 Gorman May 2002 B1
6400842 Fukuda Jun 2002 B2
6411298 Goto et al. Jun 2002 B1
6414960 Kuhn et al. Jul 2002 B1
6417836 Kumar et al. Jul 2002 B1
6421057 Lauer et al. Jul 2002 B1
6426720 Ross et al. Jul 2002 B1
6426755 Deering Jul 2002 B1
6456977 Wang Sep 2002 B1
6476807 Duluk, Jr. et al. Nov 2002 B1
6488505 Hightower Dec 2002 B1
6489955 Newhall, Jr. Dec 2002 B1
6496189 Yaron et al. Dec 2002 B1
6496598 Harman Dec 2002 B1
6504538 Freund et al. Jan 2003 B1
6529206 Ohki et al. Mar 2003 B1
6529875 Nakajima et al. Mar 2003 B1
6538666 Ozawa et al. Mar 2003 B1
6545663 Arbter et al. Apr 2003 B1
6554707 Sinclair et al. Apr 2003 B1
6563499 Waupotitsch et al. May 2003 B1
6571208 Kuhn et al. May 2003 B1
6572475 Okabe et al. Jun 2003 B1
6573890 Lengyel Jun 2003 B1
6577312 Deering et al. Jun 2003 B2
6578197 Peercy et al. Jun 2003 B1
6585599 Horigami et al. Jul 2003 B1
6594388 Gindele et al. Jul 2003 B1
6597363 Duluk, Jr. et al. Jul 2003 B1
6609976 Yamagishi et al. Aug 2003 B1
6611265 Hong et al. Aug 2003 B1
6639594 Zhang et al. Oct 2003 B2
6639609 Hayashi Oct 2003 B1
6646639 Greene et al. Nov 2003 B1
6646640 Nagy Nov 2003 B2
6650329 Koike Nov 2003 B1
6652376 Yoshida et al. Nov 2003 B1
6664955 Deering Dec 2003 B1
6664959 Duluk, Jr. et al. Dec 2003 B2
6680746 Kawai et al. Jan 2004 B2
6686924 Mang et al. Feb 2004 B1
6714236 Wada et al. Mar 2004 B1
6717576 Duluk, Jr. et al. Apr 2004 B1
6717579 Deslandes et al. Apr 2004 B1
6717599 Olano Apr 2004 B1
6720949 Pryor et al. Apr 2004 B1
6738059 Yoshinaga et al. May 2004 B1
6744442 Chan et al. Jun 2004 B1
6750867 Gibson Jun 2004 B1
6753870 Deering et al. Jun 2004 B2
6755654 Hightower Jun 2004 B2
6764403 Gavin Jul 2004 B2
6771264 Duluk et al. Aug 2004 B1
6771813 Katsuyama Aug 2004 B1
6778181 Kilgariff et al. Aug 2004 B1
6781594 Day Aug 2004 B2
6795068 Marks Sep 2004 B1
6798411 Gorman et al. Sep 2004 B1
6803910 Pfister et al. Oct 2004 B2
6803964 Post et al. Oct 2004 B1
6807296 Mishima Oct 2004 B2
6825851 Leather Nov 2004 B1
6850236 Deering Feb 2005 B2
6850243 Kilgariff et al. Feb 2005 B1
6853382 Van Dyke et al. Feb 2005 B1
6854632 Larsson Feb 2005 B1
6864895 Tidwell et al. Mar 2005 B1
6903738 Pfister et al. Jun 2005 B2
6912010 Baker et al. Jun 2005 B2
6917692 Murching et al. Jul 2005 B1
6928433 Goodman et al. Aug 2005 B2
6956871 Wang et al. Oct 2005 B2
6962527 Baba Nov 2005 B2
6995788 James Feb 2006 B2
7006101 Brown et al. Feb 2006 B1
7072792 Freifeld Jul 2006 B2
7079138 Day Jul 2006 B2
7081893 Cerny Jul 2006 B2
7085722 Luisi Aug 2006 B2
7101284 Kake et al. Sep 2006 B2
7113193 Marks Sep 2006 B2
7162314 Fay et al. Jan 2007 B2
7180529 Covannon et al. Feb 2007 B2
7194539 Hughes et al. Mar 2007 B2
7214133 Jen et al. May 2007 B2
7233904 Luisi Jun 2007 B2
7251315 Quinton Jul 2007 B1
7293235 Powers et al. Nov 2007 B1
7304667 Watanabe et al. Dec 2007 B2
7333150 Cooper Feb 2008 B2
7339589 Annunziata Mar 2008 B2
7589723 Wang et al. Sep 2009 B2
7636126 Mallinson Dec 2009 B2
7777746 Annunziata Aug 2010 B2
7800646 Martin Sep 2010 B2
7877262 Luisi Jan 2011 B2
7880746 Marks et al. Feb 2011 B2
7916215 Wu et al. Mar 2011 B2
7920209 Mallinson Apr 2011 B2
7965338 Chen Jun 2011 B2
8194940 Kiyohara et al. Jun 2012 B1
8204272 Marks Jun 2012 B2
8243089 Marks et al. Aug 2012 B2
8270684 Kiyohara et al. Sep 2012 B2
8284310 Mallinson Oct 2012 B2
8341145 Dodson et al. Dec 2012 B2
8798401 Johnson et al. Aug 2014 B1
9342817 Elliott May 2016 B2
20010048434 Brown Dec 2001 A1
20020018063 Donovan et al. Feb 2002 A1
20020041335 Taraci et al. Apr 2002 A1
20020047937 Wells Apr 2002 A1
20020068626 Takeda et al. Jun 2002 A1
20020080136 Kouadio Jun 2002 A1
20020107070 Nagy Aug 2002 A1
20020130866 Stuttard Sep 2002 A1
20020140703 Baker et al. Oct 2002 A1
20020162081 Solomon Oct 2002 A1
20020167518 Migdal et al. Nov 2002 A1
20030009748 Glanville et al. Jan 2003 A1
20030043163 Day Mar 2003 A1
20030045359 Leen et al. Mar 2003 A1
20030050112 Leen et al. Mar 2003 A1
20030058238 Doak et al. Mar 2003 A1
20030104868 Okita et al. Jun 2003 A1
20030112238 Cerny et al. Jun 2003 A1
20030117391 Olano Jun 2003 A1
20030142232 Albean Jul 2003 A1
20030179220 Dietrich, Jr. et al. Sep 2003 A1
20030216177 Aonuma et al. Nov 2003 A1
20040003370 Schenk et al. Jan 2004 A1
20040051716 Sevigny Mar 2004 A1
20040056860 Collodi Mar 2004 A1
20040100582 Stanger May 2004 A1
20040130550 Blanco et al. Jul 2004 A1
20040130552 Duluk, Jr. et al. Jul 2004 A1
20040166935 Gavin et al. Aug 2004 A1
20040219976 Campbell Nov 2004 A1
20040263636 Cutler et al. Dec 2004 A1
20040268413 Reid Dec 2004 A1
20050001836 Day Jan 2005 A1
20050019020 Sato et al. Jan 2005 A1
20050024379 Marks Feb 2005 A1
20050026689 Marks Feb 2005 A1
20050078116 Sloan et al. Apr 2005 A1
20050090302 Campbell Apr 2005 A1
20050090312 Campbell Apr 2005 A1
20050243094 Patel et al. Nov 2005 A1
20050246638 Whitten Nov 2005 A1
20050253965 Cooper Nov 2005 A1
20060015348 Cooper et al. Jan 2006 A1
20060039017 Park et al. Feb 2006 A1
20060047704 Gopalakrishnan Mar 2006 A1
20060071933 Green et al. Apr 2006 A1
20060209210 Swan et al. Sep 2006 A1
20060214943 Day Sep 2006 A1
20060238549 Marks Oct 2006 A1
20060290810 Mallinson Dec 2006 A1
20070035831 Gutierrez Novelo Feb 2007 A1
20070094335 Tu Apr 2007 A1
20070106760 Houh May 2007 A1
20070168309 Tzruya et al. Jul 2007 A1
20070191097 Johnson Aug 2007 A1
20070257928 Marks et al. Nov 2007 A1
20070279427 Marks Dec 2007 A1
20080052349 Lin Feb 2008 A1
20080070655 Tanabe Mar 2008 A1
20080215994 Harrison Sep 2008 A1
20080268956 Suzuki Oct 2008 A1
20080268961 Brook et al. Oct 2008 A1
20080274798 Walker Nov 2008 A1
20090007186 Hartwell Jan 2009 A1
20090017908 Miyamoto Jan 2009 A1
20090040222 Green et al. Feb 2009 A1
20090063463 Turner et al. Mar 2009 A1
20090088233 O'Rourke Apr 2009 A1
20090118015 Chang et al. May 2009 A1
20090131177 Pearce May 2009 A1
20090193453 Cansler Jul 2009 A1
20090209337 Vrignaud et al. Aug 2009 A1
20090227368 Wyatt Sep 2009 A1
20100029387 Luisi Feb 2010 A1
20100041475 Zalewski Feb 2010 A1
20100050090 Leebow Feb 2010 A1
20100053430 Mallinson Mar 2010 A1
20100179857 Kalaboukis et al. Jul 2010 A1
20100191827 Martin Jul 2010 A1
20100232656 Ryu Sep 2010 A1
20100325218 Castro et al. Dec 2010 A1
20110013810 Engstrom et al. Jan 2011 A1
20110052012 Bambha et al. Mar 2011 A1
20110064281 Chan Mar 2011 A1
20110066743 Hurley et al. Mar 2011 A1
20110181776 Mallinson Jul 2011 A1
20110205240 Marks et al. Aug 2011 A1
20110249072 Marks Oct 2011 A1
20110249144 Chang Oct 2011 A1
20120250950 Papakipos et al. Oct 2012 A1
20130013683 Elliott Jan 2013 A1
20130129142 Miranda-Steiner May 2013 A1
20140087877 Krishnan Mar 2014 A1
20160042251 Cordova-Diba et al. Feb 2016 A1
20160261669 Elliott Sep 2016 A1
Foreign Referenced Citations (132)
Number Date Country
1201180 Dec 1998 CN
1369849 Sep 2002 CN
1160580 Aug 2004 CN
1652063 Aug 2005 CN
1806236 Jul 2006 CN
1910619 Feb 2007 CN
ZL02103524.5 May 2007 CN
101198277 Jun 2008 CN
101375596 Feb 2009 CN
101401081 Apr 2009 CN
101553862 Oct 2009 CN
103002960 Mar 2013 CN
103635892 Mar 2014 CN
103706117 Apr 2014 CN
103002960 Nov 2016 CN
103706117 Jan 2017 CN
106964155 Jul 2017 CN
103635892 Oct 2017 CN
107491701 Dec 2017 CN
19905076 May 2000 DE
60235994.5-08 Apr 2010 DE
448411 Sep 1991 EP
0553973 Aug 1993 EP
615386 Sep 1994 EP
789296 Aug 1997 EP
850673 Jul 1998 EP
0947948 Oct 1999 EP
1029569 Aug 2000 EP
1176559 Jan 2002 EP
1229499 Aug 2002 EP
1419481 May 2004 EP
1479421 Nov 2004 EP
1541207 Jun 2005 EP
1541208 Jun 2005 EP
1630754 Mar 2006 EP
1650706 Apr 2006 EP
1419481 Apr 2010 EP
2569063 Mar 2013 EP
3608003 Feb 2020 EP
1419481 Apr 2010 FR
2351637 Jan 2001 GB
2411065 Aug 2005 GB
1419481 Apr 2010 GB
15201 Apr 2014 IN
59-002040 Jan 1984 JP
59-202779 Nov 1984 JP
61-131110 Jun 1986 JP
H01-229393 Sep 1989 JP
H01-308908 Dec 1989 JP
H04-151780 May 1992 JP
H05-27779 Apr 1993 JP
H05-336540 Dec 1993 JP
H06-089342 Mar 1994 JP
6266854 Sep 1994 JP
2006-301474 Oct 1994 JP
H06-301474 Oct 1994 JP
7-160412 Jun 1995 JP
2007271999 Oct 1995 JP
H07-253774 Oct 1995 JP
2007334664 Dec 1995 JP
2008-112449 May 1996 JP
H08-112449 May 1996 JP
2008-155140 Jun 1996 JP
H09-047576 Feb 1997 JP
H09-178426 Jul 1997 JP
9265379 Oct 1997 JP
10055454 Feb 1998 JP
H10-165649 Jun 1998 JP
11070273 Mar 1999 JP
11-179050 Jul 1999 JP
2000-020193 Jan 2000 JP
2000-070546 Mar 2000 JP
2000137828 May 2000 JP
2000-157724 Jun 2000 JP
2000311251 Jul 2000 JP
2000218036 Aug 2000 JP
2000233072 Aug 2000 JP
3384978 Sep 2000 JP
2000237453 Sep 2000 JP
2000-339491 Dec 2000 JP
2000-338993 Dec 2000 JP
2001029649 Feb 2001 JP
2001-198350 Jul 2001 JP
2002-140705 Aug 2001 JP
3244798 Oct 2001 JP
2002-052256 Feb 2002 JP
2002-153676 May 2002 JP
2002159749 Jun 2002 JP
2002177547 Jun 2002 JP
2002-304637 Oct 2002 JP
2001079263 Mar 2003 JP
3588351 Aug 2004 JP
2004-321797 Nov 2004 JP
2005-500629 Jan 2005 JP
2005-125095 May 2005 JP
2005-125098 May 2005 JP
3821822 Jun 2006 JP
2006-178948 Jul 2006 JP
3901960 Jan 2007 JP
2007-136215 Jun 2007 JP
3971380 Jun 2007 JP
2008165784 Jul 2008 JP
2008-278937 Nov 2008 JP
4381371 Oct 2009 JP
2010-088694 Apr 2010 JP
4616330 Oct 2010 JP
2013-528432 Jul 2013 JP
5889876 Feb 2016 JP
20000072753 Dec 2000 KR
20020065397 Aug 2002 KR
100606653 Jul 2006 KR
10-2013-0118209 Oct 2013 KR
1020170091779 Aug 2017 KR
101881787 Jul 2018 KR
561448 Nov 2003 TW
191667 Mar 2004 TW
1994018790 Aug 1994 WO
1998002223 Jan 1998 WO
1998053443 Nov 1998 WO
2000010130 Feb 2000 WO
2001029768 Apr 2001 WO
2001082626 Nov 2001 WO
2003017200 Feb 2003 WO
2005040900 May 2005 WO
2006033360 Mar 2006 WO
2006041993 Apr 2006 WO
2007001633 Jan 2007 WO
2008018943 Feb 2008 WO
2008058271 May 2008 WO
2008058271 Aug 2008 WO
2011142857 Nov 2011 WO
2013006584 Jan 2013 WO
Non-Patent Literature Citations (288)
Office actions dated Jul. 23, 2003, Jan. 10, 2004, and Mar. 29, 2005 in U.S. Appl. No. 09/935,123, filed Aug. 21, 2001.
Office Action dated Jul. 2, 2007 in U.S. Appl. No. 10/280,640, filed Oct. 24, 2002.
Office Action dated Mar. 31, 2005 in U.S. Appl. No. 10/268,495, filed Oct. 9, 2002.
Office actions dated Feb. 9, 2005, and Jul. 13, 2005 in U.S. Appl. No. 10/267,341, filed Oct. 8, 2002.
Office actions dated Jun. 30, 2003, and Dec. 16, 2003 in U.S. Appl. No. 10/267,176, filed Oct. 8, 2002.
Office actions dated Mar. 22, 2005, and Sep. 7, 2005 in U.S. Appl. No. 10/267,234, filed Oct. 8, 2002.
Office Action dated Dec. 5, 2003 in U.S. Appl. No. 09/621,578, filed Jul. 21, 2000.
Office actions dated Dec. 8, 2006, and Jun. 4, 2007 in U.S. Appl. No. 10/443,612, filed May 21, 2003.
Office actions dated Aug. 7, 2006, Feb. 22, 2007, Nov. 13, 2007, Jul. 9, 2008, Dec. 2, 2008, Oct. 29, 2009, Sep. 1, 2010, and Aug. 16, 2011 in U.S. Appl. No. 10/691,929, filed Oct. 22, 2003.
Office actions dated Apr. 7, 2006, Oct. 16, 2006, Apr. 2, 2007, Mar. 18, 2008, and Sep. 18, 2008 in U.S. Appl. No. 10/873,066, filed Jun. 21, 2004.
Office Action dated Oct. 6, 2005 in U.S. Appl. No. 10/927,918, filed Aug. 26, 2004.
Office actions dated Aug. 10, 2005, and Feb. 3, 2006 in U.S. Appl. No. 10/901,840, filed Jul. 28, 2004.
Office Action dated Mar. 23, 2009 in U.S. Appl. No. 11/165,473, filed Jun. 22, 2005.
Office Action dated Oct. 14, 2009 in U.S. Appl. No. 12/074,456, filed Mar. 3, 2008.
Office actions dated Feb. 12, 2008, Sep. 4, 2008, Jun. 10, 2009, and Dec. 24, 2009 in U.S. Appl. No. 11/222,207, filed Sep. 7, 2005.
Office actions dated Oct. 23, 2008 and Aug. 19, 2009 in U.S. Appl. No. 11/455,273, filed Sep. 8, 2005.
Office actions dated Oct. 17, 2007, May 1, 2008, Nov. 13, 2008, May 13, 2009, Dec. 16, 2009, May 25, 2010, and Nov. 9, 2010 in U.S. Appl. No. 11/455,273, filed Jun. 15, 2006.
Office Action dated Jun. 28, 2011 in U.S. Appl. No. 12/353,777, filed Jan. 14, 2009.
Office actions dated Jun. 9, 2010, and Nov. 8, 2010 in U.S. Appl. No. 12/287,317, filed Oct. 7, 2008.
Office Action dated Jul. 13, 2010 in U.S. Appl. No. 12/577,656, filed Oct. 12, 2009.
Office Action dated Sep. 16, 2010 in U.S. Appl. No. 12/615,942, filed Nov. 10, 2009.
Office actions dated Oct. 4, 2010, and Jan. 26, 2011 in U.S. Appl. No. 12/841,919, filed Jul. 22, 2010.
Office Action dated Aug. 18, 2011 in U.S. Appl. No. 12/842,353, filed Jul. 23, 2010.
Office Action dated Jun. 8, 2011 in U.S. Appl. No. 13/019,231, filed Feb. 1, 2011.
Agui, Takeshi et al., “Computer Graphics”, Shokodo Co., Ltd., Jul. 1992, 1st ed., pp. 80-101 (Environment Mapping).
Aguilera, S. et al., "Impaired Persons Facilities Based on a Multi-Modality Speech Processing System", Proc. On Speech & Language Tech., 1993.
Appeal Brief filed Feb. 1, 2008 for U.S. Appl. No. 10/959,695.
Appeal Brief filed Jun. 16, 2008 for U.S. Appl. No. 10/959,695.
Arons, B., “Authoring and Transcription Tools for Speech-Based Hypermedia”, Proc. Of American Voice I/O Society, 1991.
Arons, B., “Hyperspeech: Navigating in Speech-Only Hypermedia”, Proc. Of Hypertext, 1991.
Auslander et al., “Fast, Effective Dynamic Compilation,” SIGPLAN Notices ACM, 1996.
Balakrishnan et al., “Exploring Interactive Curve and Surface Manipulation Using a Bend and Twist Sensitive Input Strip,” Proc. Of 1999 ACM symp. On Interactive 3D Graphics.
Balakrishnan et al., “Performance Differences in the Fingers, Wrist, and Forearm in Computer Input Control,” Proc. Of 1997 ACM Conf. on Human Factors in Computing Systems.
Balakrishnan et al., "The PadMouse: Facilitating Selection and Spatial Positioning for the Non-Dominant Hand," Proc. Of 1998 ACM Conf. on Human Factors in Computing Systems.
Balakrishnan et al., "Exploring Bimanual Camera Control and Object Manipulation in 3D Graphics Interfaces," Proc. Of 1999 ACM Conf. on Human Factors in Computing Systems.
Bates, Jason, “Half-Life Review,” IGN, Nov. 25, 1998.
Bennacef, S.K., “A Spoken Language System for Information Retrieval”, Proc. Of ICSLP, 1994.
Beshers et al., “Generating Efficient Virtual Worlds for Visualization Using Partial Evaluation and Dynamic Compilation,” ACM 1997.
Bizarre Creations, Project Gotham Racing Manual, 2001, Microsoft Corporation, pp. 1-27, http://www.gamemanuals.net/download/2d54fbeb2d3e8ca2224ebad31c1b257f/Project_Gotham_Racing_%28EN%29.pdf.
Blinn, J.F. et al., “Texture and Reflection in Computer Generated Images”, Communications of the Association for Computing Machinery, ACM, Oct. 1, 1976, pp. 542-547, vol. 19, No. 10, New York, NY USA.
Blinn, J.F., “Light Reflection Functions for Simulation of Clouds and Dusty Surfaces,” ACM Graphics, vol. 16, No. 3, Jul. 1982.
Blinn, J.F., "Models of Light Reflection for Computer Synthesized Pictures", Proc. Siggraph 1977, Computer Graphics 11(2), pp. 192-198, Jul. 1977.
Calvert, Justin, SCEE's latest plans for its EyeToy peripheral will effectively turn the PlayStation 2 into a videophone. First screens inside. SCEE announces EyeToy: Chat, GameSpot, http://www.gamespot.com/news/6095429.html, May 5, 2004.
Chan, E., Ng, R., Sen, P., Proudfoot, K., and Hanrahan, P. 2002. Efficient partitioning of fragment shaders for multipass rendering on programmable graphics hardware. In Proceedings of the ACM SIGGRAPH/EUROGRAPHICS Conference on Graphics Hardware (Saarbrucken, Germany, Sep. 1-2, 2002).
Davenport, G. et al., “Cinematic Primitives for Multimedia”, IEEE Computer Graphics and Applications (Aug. 1991), vol. 11, No. 4, pp. 67-74.
Dorsey, Julie O'B. et al., Design and Simulation of Opera Lighting and Projection Effects, Program of Computer Graphics, Computer Graphics, Jul. 1991, vol. 25, No. 4, New York.
European Examination Report dated Jul. 27, 2010 in European patent application No. 04 256 331.2, filed Oct. 14, 2004.
Examiner's Answer to Appeal Brief, Apr. 14, 2008.
Fernando, R. and Kilgard, M. J. 2003. The Cg Tutorial: The Definitive Guide to Programmable Real-Time Graphics. Addison-Wesley Longman Publishing Co., Inc., Ch. 1, sections 1.2 and 1.4; Appendix C, section C.2.
Fitzmaurice et al., "Sampling, Synthesis, and Input Devices," Communications of the ACM, vol. 42, No. 8, Aug. 1999.
Foley et al., “Computer Graphics: Principles and Practice”, Oct. 1996, pp. 721-745.
Foley et al., "Computer Graphics: Principles and Practice", Second Edition in C, p. 731.
Gauvain, J.L. et al., "Spoken Language Component of the MASK Kiosk", Human Comfort and Security of Information Systems, 1995.
Gauvain, J.L. et al., "The LIMSI Continuous Speech Dictation System", Proc. ARPA Human Language & Technology, 1994.
Gauvain, J.L. et al., "The LIMSI Continuous Speech Dictation System: Evaluation on the ARPA Wall Street Journal Task", Proc. Of the IEEE-ICASSP, 1994.
Gauvain, J.L. et al., “Speech recognition for an Information Kiosk”, Proc. Of ICSLP, 1996.
Glorianna Davenport, Thomas Aguirre Smith, Natalio Pincever, “Cinematic Primitives for Multimedia,” Aug. 1991, IEEE Computer Graphics and Applications, vol. 11, No. 4, pp. 67-74.
Goddeau, D. et al., “Galaxy: A Human-Language Interface to On-Line Travel Information”, Proc. Of ICSLP, 1994.
Gran Turismo 3 ("GT3"), Sony Computer Entertainment, Released Apr. 28, 2001, User manual, p. 7.
Gran Turismo 3 (“GT3”), Wikipedia, Release Date Apr. 28, 2001, pp. 1, accessed Aug. 5, 2009.
Gueziec, A. et al., “Simplicial Maps for Progressive Transmission of Polygonal Surfaces”, Proceedings, VRML 98 Third Symposium on the Virtual Reality Modeling Language ACM, 1998, pp. 25-31, 131, New York, NY, USA.
Hayano, Masayuki, et al., “Mesh Simplification Using Edge Operation with Feature Detection”, Inf. Proc. Soc. Of Japan SIG Technical Report, Feb. 27, 1998, vol. 98, No. 16.
House, D., “Spoken-Language Access to Multimedia (SLAM): Masters Thesis”, Oregon Graduate Inst., Dept. of CS and Eng., 1995.
http://www.nintendo.com/games/detail/1OTtO06SP7M52gi5m8pD6CnahbW8CzxE.
internet.com, “Graphical User Interface”, available at http://www.webopedia.com; accessed Sep. 24, 2004. Last Modified May 17, 2004.
Konma, Toshihiro, "Rendering and Texture: Introduction to CG Creation in the Multimedia Age", Nikkei Bus. Pub., Inc., Nov. 1996, p. 237 (Bump Mapping).
Lamel, L.F. et al., "Recent Developments in Spoken Language Systems for Information Retrieval", ESCA ETRW Spoken Dialog Systems, 1995.
Language Industry Monitor, “Janet Baker's Optimism”, 1992.
Matsushita, Yasuyuki, “Special Effects: Interobject Reflection effect: Starting OpenGL Programming with Mesa 3D”, Itsutsubachi Res. Co., Ltd., Jan. 2000, pp. 148-153.
McCool et al., “Texture Shaders,” Eurographics Los Angeles, 1999.
Moller, T. & Haines, E., “Real-time rendering”, 1999, pp. 69-81, A.K. Peters Ltd.
Mostow, Jack, et al., “Towards a Reading Coach That Listens: Automated Detection of Oral Reading Errors”, Proc. Of the 11th Ntl. Conf. on A.I., 1993.
Nakamura, Hiroko, et al., “Adaptive Transmission of Polygonal Patch Datasets . . . ”, Inf. Proc. Soc. Of Japan SIG Technical Report, Sep. 8, 2000, vol. 2000, No. 8.
Nayar, Shree K., et al., Lighting Sensitive Display, ACM Transactions on Graphics, Oct. 2004, vol. 23, No. 4, pp. 963-979, New York.
Nvidia Corporation, “User Guide CgFX Plug-In for 3ds Max,” Nov. 13, 2002.
Palmer, Chris et al., "Tile Based Games FAQ," GameDev, Aug. 31, 2000.
Peercy, et al., “Interactive Multi-Pass Programmable Shading,” Computer Graphics Proceedings, SIGGRAPH 2000, Jul. 2000.
Phong, Bui Tuong, "Illumination for Computer Generated Pictures," Communications of the ACM, 18(6), pp. 311-317, Jun. 1975.
Pratt, David R., “A Software Architecture for the Construction and Management of Real-Time Virtual Worlds”, Jun. 1993, pp. 62-67.
Project Gotham Racing release information, Aug. 2, 2006, Gamespot.com, http://www.gamespot.com/xbox/driving/projectgothamracing/similar.html?mode=versions.
Project Gotham Racing Screenshot, Avault, Nov. 14, 2001, http://www.avault.com/consoles/reviews/xbox/avscreenshot.asp?pic=pgr&num=5.
Proudfoot, et al., "A Real-Time Procedural Shading System for Programmable Graphics Hardware," Computer Graphics Proceedings, SIGGRAPH 2001, Aug. 2001.
Road Blasters Path Markers, MobyGames, Jan. 25, 2007, http://www.mobygames.com/game/nes/readblasters/screenshots/gameShotId,35174/.
Road Blasters Release Information, GameSpot, Jan. 25, 2007, http://www.gamespot.com/nes/driving/roadblasters/index.html?q=roadblasters.
Rushmeier, et al., "Extending the Radiosity Method to Include Specularly Reflecting and Translucent Materials," ACM Transactions on Graphics, vol. 9, No. 1, Jan. 1990.
Russell, M. et al., "Applications of Automatic Speech Recognition to Speech and Language Development in Young Children", Proc. Of ICSLP, 1996.
Schlick, C., “A Survey of Shading and Reflectance Models,” Computer Graphics Forum, Jun. 1994, pp. 121-132, vol. 13, No. 2.
Schlick, C., "A Fast Alternative to Phong's Specular Model," Graphics Gems IV, pp. 385-386, 1994.
Screen Shot of a Civilization Building Game; Available at http://www.s2.com.br/s2arquivos/361/Imagens/2323Image.jpg (accessed Oct. 11, 2005).
Screen Shot of a Civilization Building Game; Available at http://www.s2.com.br/s2arquivos/361/Imagens/2324Image.jpg (accessed Oct. 11, 2005).
Screen Shot of a Flight Simulator; Available at http://orbit.medphys.ucl.ac.uk/images/gallery64.jpg (accessed Oct. 11, 2005).
Screen Shot of a Role Playing Game; Available at http://images.fok.nl/upload/lotrrotk2.jpg (accessed Oct. 11, 2005).
Screen Shot of a Role Playing Game; Available at http://images.fok.nl/upload/lotrrotk3.jpg (accessed Oct. 11, 2005).
Segen et al., “Gesture VR: Vision-Based 3D Hand Interface for Spatial Interaction,” Proceedings of Sixth ACM International Conference on Multimedia, 1998.
Spagnoletti, Simon, Philips Ambilight TV, Home Entertainment, Engadget, Jul. 8, 2004.
Tang et al., “Blending Structured Graphics and Layout”, Symposium on User Interface Software and Technology, Proceedings of the 7th Annual ACM Symposium on User Interface Software and Technology, Marina del Rey California, United States, pp. 167-173 (1994).
Taylor, Philip, “The MSDN Shader Workshop Application, Part 1,” Microsoft Corporation, Mar. 25, 2002.
Thalmann, et al., “Interactive Computer Animation”, 1996, Prentice Hall Europe, pp. 182-186.
The PlayStation 2 Books Riding Spirits Official Complete Guide (graphics), Japan, SoftBank Publishing, Sep. 6, 2003, First Edition, p. 005.
Voorhies, D., et al., "Reflection Vector Shading Hardware", Computer Graphics Proceedings, Annual Conference Series 1994, SIGGRAPH 94 Conference Proceedings, ACM, 1994, pp. 163-166, New York, NY, USA.
Ware et al., "Reaching for Objects in VR Displays: Lag and Frame Rate," ACM Transactions on Computer-Human Interaction, vol. 1, No. 4, Dec. 1994.
White, Stephen, “The Technology of Jak & Daxter,” Game Developer's Conference, Mar. 6, 2003.
Shiu, Y.C., et al., Pose Determination of Circular Cylinders Using Elliptical and Side Projections, Proceedings of the International Conference on Systems Engineering, Fairborn, Aug. 1-3, 1991; New York, IEEE, USA, pp. 265-268, ISBN: 0-7803-0173-0; Wright State University, Dept. of Electrical Engineering, Dayton, OH, 1991.
Nicewarner, Keith E., et al., Vision-Guided Grasping of a Strut for Truss Structure Assembly, Electrical, Computer and Systems Engineering Dept., Center for Intelligent Robotic Systems for Space Exploration, Rensselaer Polytechnic Institute, Troy, NY, Oct. 1992, pp. 86-93.
Nilsson, “ID3 Tag Version 2.3.0,” ID3v2: The Audience is Informed, Feb. 3, 1999; http://www.id3.org/id3v2.3.0 (last accessed Oct. 26, 2011).
Pearce, “Shadow Attenuation for Ray Tracing Transparent Objects”, “Graphics Gems”, 1991, pp. 397-399.
Woo et al., "A Survey of Shadow Algorithms", "IEEE Computer Graphics and Applications", 1990, IEEE Service Center, New York, NY, US, vol. 10, No. 6, pp. 13-32.
Ohashi et al., “A Gesture Recognition Method for a Stick Input System (<Special Issue> Human Interface and Interaction)”, IPSJ Journal 40(2), 567-576, Feb. 15, 1999.
First Examination Report dated Apr. 22, 2005 in Chinese Application No. 02103524.5 filed Feb. 5, 2002.
Second Examination Report dated Jul. 14, 2006 in Chinese Application No. 02103524.5 filed Feb. 5, 2002.
Notice of Allowance dated Jan. 12, 2007 in Chinese Application No. 02103524.5 filed Feb. 5, 2002.
First Examination Report dated Apr. 22, 2005 in Korean Application No. 10-2002-6770 filed Feb. 6, 2002.
Second Examination Report dated Oct. 21, 2005 in Korean Application No. 10-2002-6770 filed Feb. 6, 2002.
Notice of Allowance dated May 24, 2006 in Korean Application No. 10-2002-6770 filed Feb. 6, 2002.
First Examination Report dated Apr. 14, 2004 in Japanese Application No. 2002-28892 filed Feb. 6, 2002.
Notice of Allowance dated Jul. 20, 2004 in Japanese Application No. 2002-28892 filed Feb. 6, 2002.
First Examination Report dated Jan. 16, 2007 in Japanese Application No. 2003-522033 filed Aug. 16, 2002.
Notice of Allowance dated May 15, 2007 in Japanese Application No. 2003-522033 filed Aug. 16, 2002.
Notice of Allowance dated Oct. 14, 2003 in Taiwan Application No. 91118743 filed Aug. 20, 2002.
First Examination Report dated Oct. 4, 2005 in Japanese Application No. 2001-220048 filed Jul. 19, 2001.
Second Examination Report dated Apr. 18, 2006 in Japanese Application No. 2001-220048 filed Jul. 19, 2001.
Third Examination Report dated Sep. 5, 2006 in Japanese Application No. 2001-220048 filed Jul. 19, 2001.
Notice of Allowance dated Nov. 28, 2006 in Japanese Application No. 2001-220048 filed Jul. 19, 2001.
First Examination Report dated Oct. 11, 2005 in Japanese Application No. 2004-304868 filed Oct. 19, 2004.
Notice of Allowance dated May 23, 2006 in Japanese Application No. 2004-304868 filed Oct. 19, 2004.
First Examination Report dated Mar. 3, 2009 in Japanese Application No. 2005-351271 filed Dec. 5, 2005.
Second Examination Report dated May 26, 2009 in Japanese Application No. 2005-351271 filed Dec. 5, 2005.
Notice of Allowance dated Aug. 18, 2009 in Japanese Application No. 2005-351271 filed Dec. 5, 2005.
International Search Report dated Feb. 4, 2003 in PCT Application No. PCT/US02/26360.
International Preliminary Examination Report dated Dec. 15, 2003 in PCT Application No. PCT/US02/26360 filed Aug. 16, 2002.
European Search Report dated Apr. 2, 2004 in EP Application No. EP02001875.
Communication from the Examining Division dated May 24, 2007 in EP Application No. EP02001875.
Communication from the Examining Division dated Jun. 17, 2009 in EP Application No. EP02001875.
Supplementary European Search Report dated Aug. 22, 2006 in EP02768608.
Mark et al. “Compiling to a VLIW Fragment Pipeline”, In Proceedings of 2001 SIGGRAPH/Eurographics Workshop on Graphics Hardware, pp. 47-55.
Arvo, "Backward Ray Tracing", Computer Graphics Proceedings, Annual Conference Series, SIGGRAPH 1986, vol. 12, pp. 1-8.
International Search Report dated Feb. 6, 2006 in PCT Application No. PCT/US2005/035947.
Communication from Examining Division dated Jan. 12, 2007 in EP02768608.8.
Communication from Examining Division regarding Intention to grant a European Patent dated Nov. 12, 2009 in EP02768608.8.
Communication from Examining Division regarding Decision to grant a European Patent dated Mar. 18, 2010 in EP02768608.8.
Partial European Search Report dated Jul. 19, 2007 in EP 01306264.1.
European Search Report dated Oct. 12, 2007 in EP 01306264.1.
Communication from Examining Division dated Jul. 8, 2008 in EP 01306264.1.
Communication from Examining Division dated Oct. 16, 2009 in EP 01306264.1.
Moby Games, “RoadBlasters”, Released Jan. 1990 (screen shots).
Supplementary European Search Report dated Jan. 2, 2014 in EP 11780949.1.
International Search Report dated Sep. 24, 2012 in PCT Application No. PCT/US2012/015314.
Communication from Examining Division regarding European Search Report—Search Not Possible dated Dec. 22, 2004 in EP 04256331.2.
Communication from Examining Division dated Nov. 7, 2008 in EP 04256331.2.
Communication from Examining Division dated Jul. 27, 2010 in EP 04256331.2.
International Search Report dated Aug. 7, 2007 in PCT Application No. PCT/US2006/017574.
Communication from Examining Division regarding European Search Report—Search Not Possible dated Dec. 22, 2004 in EP 04256342.9.
1st Communication from Examining Division dated Apr. 28, 2006 in EP 04256342.9.
International Search Report dated Mar. 31, 2011 in PCT Application No. PCT/US2011/023780.
Rejection in JP application 2003-522033 dated Jan. 16, 2007.
Notice of Allowance in JP application 2003-522033 dated May 15, 2007.
Rejection in JP application 2001-220048 dated Oct. 4, 2005.
Rejection in JP application 2001-220048 dated Apr. 18, 2006.
Rejection in JP application 2001-220048 dated Sep. 5, 2006.
Notice of Allowance in JP application 2001-220048 dated Nov. 28, 2006.
Rejection in JP application 2004-304868 dated Oct. 11, 2005.
Notice of Allowance in JP application 2004-304868 dated May 23, 2006.
First Rejection in CN application 021035245 dated Apr. 22, 2005.
Second Rejection in CN application 021035245 dated Jul. 14, 2006.
Notice of Allowance in CN application 021035245 dated Jan. 12, 2007.
Rejection in JP application 2002-028892 dated Apr. 13, 2004.
Notice of Allowance in JP application 2002-028892 dated Jun. 10, 2004.
First Rejection in KR application 10-2002-0006770 dated Apr. 21, 2005.
Second Rejection in KR application 10-2002-0006770 dated Oct. 20, 2005.
Notice of Allowance in KR application 10-2002-0006770 dated May 24, 2006.
Rejection in JP application 2005-351271 dated Mar. 3, 2009.
Rejection in JP application 2005-351271 dated May 26, 2009.
Notice of Allowance in JP application 2005-351271 dated Aug. 18, 2009.
European Search Report for EP 04251842.3 dated Aug. 20, 2004.
1st Communication from the Examining Division for EP 04251842.3 dated Apr. 28, 2005.
Communication from the Examining Division re: Summons to Attend Oral Proceedings for EP 04251842.3 dated Jun. 14, 2006.
Communication from the Examining Division re: Decision to Refuse the Application for EP 04251842.3 dated Feb. 1, 2007.
Scott Osborne: "Suzuki Alstare Extreme Racing" IGN Insider (Online), Dec. 12, 2000; http://pc.ign.com/articles/164/164981p1.html (retrieved Jun. 23, 2004).
Robert Norberg: "Phoenix Fighters Official Website" Bitwise, Alive Mediasoft (online), Nov. 18, 1999; http://www.cs.umu.se/˜dva95rng/pf.html (retrieved Jun. 23, 2004).
Unknown: "Mad Driver v. Net" 3D-Level (Online), Oct. 21, 2002; http://www.3dlevel.com/maddriver.net.php (retrieved Jun. 23, 2004).
Mataj Jan: www.3dlevel.com (online), Feb. 21, 2002; http://www.3dlevel.com/index.php (retrieved Jun. 23, 2004).
Landrum, Caswell: "Pitstop II," EPYX, Synergistic Software, US Gold (Online) 1984; http://www.mobygames.com/game/versions/gameid,578 (retrieved Jun. 23, 2004).
Nintendo of America, Inc. Mario Kart 64: The Game Manual Archive. 1997. (http://www.gamemanuals.net/download/cb07befddcc6f305d53088749775dcc2/Mario%20Kart%2064.pdf).
IGN.com. Ridge Racer 64. 2000. (http://media.ign64.ign.com/media/011/011541/img_1221593.html).
Gamespot. Ridge Racer 64. 2000. (http://www.gamespot.com/n64/driving/ridgeracer64/review.html).
Gamespot. Ridge Racer 64 Release Date. 2000. (http://www.gamespot.com/n64/driving/ridgeracer64/index.html?q=ridge%20racer%2064).
Acclaim. Turok: Dinosaur Hunter. The Game Manual Archive. 1997. (http://www.gamemanuals.net/download/df6d1a913ff01c09207d682a242c1a15/Turok.pdf).
Andale, “Andale Counters Getting Started Manual,” Aug. 2005.
Office Action dated Jul. 3, 2014 in Chinese Application No. 201180034304.4 filed Jan. 11, 2013.
Office Action dated Sep. 16, 2014 in Japanese Application No. 2013-510086 filed Feb. 4, 2011.
Office Action dated Mar. 13, 2015 in Chinese Application No. 201180034304.4 filed Feb. 4, 2011.
Final Office Action dated Apr. 7, 2015 in Japanese Application No. 2013-510086 filed Feb. 4, 2011.
Fighting Studio, Playstation Complete Mastery Series 65 Tsuri-michi—Sea fishing episodes—Official guidebook (Japanese), Futabasha Publishers Ltd., Nov. 25, 1998, 2nd printing. 6 pages.
Osborne, S. “Suzuki Alstare Extreme Racing.” IGN Insider (Online), Dec. 12, 2000. [retrieved on Sep. 18, 2015]. Retrieved from the Internet URL: <http://www.ign.com/articles/2000/12/22/suzuki-alstare-extreme-racing>.
Office Action dated Jan. 29, 2016 in Chinese Application 201310484667.9 filed Oct. 17, 2013.
Office Action dated Apr. 7, 2016 in Chinese Application 201180034304.4 filed Feb. 4, 2011.
Office Action dated Sep. 18, 2015 in Chinese Application 201180034304.4 filed Feb. 4, 2011.
Office Action dated Sep. 30, 2015 in Chinese Application 201280033005.3 filed Jul. 2, 2012.
Office Action dated Aug. 8, 2016 in Chinese Application 201280033005.3 filed Jul. 2, 2012, pp. 1-6.
Office Action dated Jun. 20, 2016 in Korean Application 10-2012-7032205 filed Feb. 4, 2011, pp. 1-10.
Notice of Allowance dated Aug. 4, 2016 in Chinese Application 201180034304.4 filed Feb. 4, 2011, pp. 1-5.
Korean Application No. 10-2012-7032205, “Office Action,” dated Dec. 26, 2016, 3 pages [6 pages including translation].
Notice of Allowance dated Sep. 5, 2016 in Chinese Application 201310484667.9 filed Feb. 4, 2011, 5 pages.
“Office Action,” European Patent Application No. 11780949.1, dated Feb. 13, 2017, 7 pages.
Chinese Application No. 201280033005.3, “Office Action,” dated Feb. 28, 2017, 3 pages [5 pages including translation].
“Notice of Allowance,” China Patent Application No. 201280033005.3, dated Jun. 29, 2017, 2 pages [5 pages including translation].
“Office Action,” South Korea Patent Application No. 1020127032205, dated Apr. 28, 2017, 4 pages [8 pages including translation].
Notice of Allowance dated Jan. 26, 2016 in Japanese Application 2013-510086 filed Feb. 4, 2011. 3 pages.
“Office Action,” South Korea Patent Application No. 1020177021490, dated Sep. 20, 2017, 5 pages [10 pages including translation].
“Office Action,” South Korea Patent Application No. 1020177021490, dated Mar. 30, 2018, 4 pages [8 pages including translation].
“Notice of Allowance,” South Korea Patent Application No. 1020177021490, dated May 8, 2018, 2 pages [3 pages including translation].
“Summons,” European Patent Application No. 11780949.1, dated Feb. 14, 2019, 9 pages.
“Office Action,” India Patent Application No. 9585/CHENP/2012, dated Oct. 22, 2019, 6 pages.
“Minutes of Oral Proceeding”, European Patent Convention Application No. 11780949.1, dated Oct. 25, 2019, 4 pages.
“Office Action,” Brazil Patent Application No. BR1120120289241, dated Aug. 2, 2019, 4 pages.
International Search Report dated Dec. 18, 2015 in PCT Application No. PCT/US2015/050908, 11 pages.
International Search Report dated Dec. 11, 2015 in PCT Application No. PCT/US2015/050870, 13 pages.
Notice of Allowance dated Jan. 7, 2016, U.S. Appl. No. 13/220,536, filed Aug. 29, 2011.
“Extended European Search Report,” European Application No. 19197482.3, dated Nov. 6, 2019, 10 pages.
“Decision to Refuse,” European Patent Convention Application No. 11780949.1, dated Nov. 4, 2019, 16 pages.
“Office Action,” China Patent Application No. 201610912795.2, dated Oct. 21, 2019, 5 pages (11 pages including translation).
“Office Action,” Brazil Patent Application No. BR1120120289241, dated Mar. 11, 2020, 4 pages.
Office Action, dated Jan. 29, 2003, U.S. Appl. No. 09/778,183, filed Feb. 6, 2001.
Notice of Allowance, dated Jul. 1, 2003, U.S. Appl. No. 09/778,183, filed Feb. 6, 2001.
Notice of Allowance, dated Apr. 6, 2004, U.S. Appl. No. 09/935,123, filed Aug. 21, 2001.
Notice of Allowance, dated May 4, 2004, U.S. Appl. No. 09/621,578, filed Jul. 21, 2000.
Office Action, dated Feb. 21, 2006, U.S. Appl. No. 10/456,415, filed Jun. 5, 2003.
Final Office Action, dated Jul. 12, 2006, U.S. Appl. No. 10/456,415, filed Jun. 5, 2003.
Office Action, dated Mar. 13, 2007, U.S. Appl. No. 10/456,415, filed Jun. 5, 2003.
Advisory Action, dated Jul. 3, 2007, U.S. Appl. No. 10/691,929, filed Oct. 22, 2003.
Office Action, dated Dec. 15, 2008, U.S. Appl. No. 10/691,929, filed Oct. 22, 2003.
Office Action, dated Aug. 9, 2006, U.S. Appl. No. 10/691,929, filed Oct. 22, 2003.
Office Action, dated Dec. 8, 2006, U.S. Appl. No. 10/691,929, filed Oct. 22, 2003.
Notice of Allowance, dated Nov. 30, 2011, U.S. Appl. No. 10/691,929, filed Oct. 22, 2003.
Notice of Allowance, dated Jan. 6, 2012, U.S. Appl. No. 10/691,929, filed Oct. 22, 2003.
Office Action, dated Nov. 13, 2006, U.S. Appl. No. 10/737,143, filed Dec. 15, 2003.
Final Office Action, dated May 15, 2007, U.S. Appl. No. 10/737,143, filed Dec. 15, 2003.
Final Office Action, dated Aug. 6, 2008, U.S. Appl. No. 10/928,778, filed Aug. 26, 2004.
Office Action, dated Dec. 27, 2007, U.S. Appl. No. 10/928,778, filed Aug. 26, 2004.
Final Office Action, dated Jun. 19, 2007, U.S. Appl. No. 10/928,778, filed Aug. 26, 2004.
Office Action, dated Dec. 14, 2006, U.S. Appl. No. 10/928,778, filed Aug. 26, 2004.
Final Office Action, dated Jul. 6, 2006, U.S. Appl. No. 10/928,778, filed Aug. 26, 2004.
Office Action, dated Jan. 25, 2006, U.S. Appl. No. 10/928,778, filed Aug. 26, 2004.
Advisory Action, dated Dec. 19, 2008, U.S. Appl. No. 10/928,778, filed Aug. 26, 2004.
Notice of Allowance, dated Apr. 6, 2006, U.S. Appl. No. 10/927,918, filed Aug. 26, 2004.
Notice of Allowance, dated Apr. 6, 2006, U.S. Appl. No. 10/901,840, filed Jul. 28, 2004.
Final Office Action, dated May 9, 2007, U.S. Appl. No. 10/959,695, filed Oct. 6, 2004.
Office Action, dated Sep. 1, 2006, U.S. Appl. No. 10/959,695, filed Oct. 6, 2004.
Advisory Action, dated Aug. 13, 2007, U.S. Appl. No. 10/959,695, filed Oct. 6, 2004.
Notice of Allowance, dated Sep. 8, 2009, U.S. Appl. No. 11/165,473, filed Jun. 22, 2005.
Office Action, dated Mar. 11, 2008, U.S. Appl. No. 11/256,520, filed Oct. 20, 2005.
Final Office Action, dated Sep. 16, 2008, U.S. Appl. No. 11/256,520, filed Oct. 20, 2005.
Advisory Action, dated Jan. 6, 2009, U.S. Appl. No. 11/256,520, filed Oct. 20, 2005.
Final Office Action, dated Sep. 2, 2009, U.S. Appl. No. 11/448,454, filed Jun. 6, 2006.
Office Action, dated Jan. 30, 2009, U.S. Appl. No. 11/448,454, filed Jun. 6, 2006.
Final Office Action, dated May 28, 2008, U.S. Appl. No. 11/448,454, filed Jun. 6, 2006.
Office Action, dated Oct. 9, 2007, U.S. Appl. No. 11/448,454, filed Jun. 6, 2006.
Notice of Allowance, dated Sep. 24, 2010, U.S. Appl. No. 11/744,816, filed May 4, 2007.
Office Action, dated Jul. 8, 2010, U.S. Appl. No. 11/744,816, filed May 4, 2007.
Office Action, dated Sep. 22, 2006, U.S. Appl. No. 11/442,226, filed May 26, 2006.
Office Action, dated Apr. 10, 2007, U.S. Appl. No. 11/442,226, filed May 26, 2006.
Final Office Action, dated Oct. 5, 2007, U.S. Appl. No. 11/442,226, filed May 26, 2006.
Appeal Brief, dated Jan. 23, 2008, U.S. Appl. No. 11/442,227, filed May 26, 2006.
Appeal Brief, dated Mar. 20, 2008, U.S. Appl. No. 11/442,228, filed May 26, 2006.
Examiner's Answer, dated Jun. 11, 2008, U.S. Appl. No. 11/442,229, filed May 26, 2006.
Reply Brief, dated Aug. 5, 2008, U.S. Appl. No. 11/442,230, filed May 26, 2006.
BPAI Decision, dated Apr. 23, 2010, U.S. Appl. No. 11/442,230, filed May 26, 2006.
Advisory Action, dated May 17, 2011, U.S. Appl. No. 12/287,317, filed Oct. 7, 2008.
Office Action, dated Mar. 15, 2012, U.S. Appl. No. 12/287,317, filed Oct. 7, 2008.
Notice of Allowance, dated Aug. 6, 2012, U.S. Appl. No. 12/287,317, filed Oct. 7, 2008.
Notice of Allowance, dated Nov. 16, 2010, U.S. Appl. No. 12/577,656, filed Oct. 12, 2009.
Notice of Allowance, dated Nov. 29, 2010, U.S. Appl. No. 12/615,942, filed Nov. 10, 2009.
Notice of Allowance, dated Mar. 21, 2012, U.S. Appl. No. 13/019,231, filed Feb. 1, 2011.
Office Action, dated Dec. 29, 2011, U.S. Appl. No. 13/019,231, filed Feb. 1, 2011.
Notice of Allowance, dated Sep. 28, 2011, U.S. Appl. No. 13/019,231, filed Feb. 1, 2011.
Office Action, dated Mar. 13, 2012, U.S. Appl. No. 13/080,649, filed Apr. 5, 2011.
Final Office Action, dated Jun. 6, 2012, U.S. Appl. No. 13/080,649, filed Apr. 5, 2011.
Notice of Allowance, dated Jul. 25, 2012, U.S. Appl. No. 13/080,649, filed Apr. 5, 2011.
Office Action, dated Jan. 17, 2013, U.S. Appl. No. 13/220,536, filed Aug. 29, 2011.
Office Action, dated May 31, 2013, U.S. Appl. No. 13/220,536, filed Aug. 29, 2011.
Office Action, dated Sep. 24, 2013, U.S. Appl. No. 13/220,536, filed Aug. 29, 2011.
Office Action, dated Jan. 24, 2014, U.S. Appl. No. 13/220,536, filed Aug. 29, 2011.
Final Office Action, dated May 23, 2014, U.S. Appl. No. 13/220,536, filed Aug. 29, 2011.
Notice of Allowance, dated Jan. 30, 2012, U.S. Appl. No. 13/163,621, filed Jun. 17, 2011.
Office Action, dated Dec. 23, 2014, U.S. Appl. No. 13/220,536, filed Aug. 29, 2011.
Final Office Action, dated Apr. 1, 2015, U.S. Appl. No. 13/220,536, filed Aug. 29, 2011.
Office Action, dated Jul. 27, 2015, U.S. Appl. No. 13/220,536, filed Aug. 29, 2011.
“Notice of Allowance,” China Patent Application No. 201610912795.2, dated Jul. 6, 2020, 2 pages (5 pages including translation).
Related Publications (1)
Number Date Country
20110281648 A1 Nov 2011 US