Placement of user information in a game space

Information

  • Patent Grant
  • Patent Number
    11,478,706
  • Date Filed
    Wednesday, September 23, 2020
  • Date Issued
    Tuesday, October 25, 2022
Abstract
The generation, association, and display of in-game tags are disclosed. Such tags introduce an additional dimension of community participation to both single and multiplayer games. Through such tags, players are empowered to communicate through filtered text messages and images as well as audio clips that other game players, including top rated players, have generated and placed at particular coordinates and/or in context of particular events within the game space. The presently described in-game tags and associated user generated content further allow for label based searches with respect to game play.
Description
BACKGROUND OF THE INVENTION
Field of the Invention

The present invention generally relates to interactive game play. More specifically, the present application relates to placement of user-generated content to aid a user with interactive game play.


Description of the Related Art

Improvements in processing power and graphics quality have led to increasingly complex interactive gaming environments. For example, the PlayStation®3's RSX graphics processor allows for freedom of graphics expression in creating next-generation, real-time 3D imagery. Working in tandem with Sony Computer Entertainment Inc.'s Cell Broadband Engine™ Architecture, RSX processor rendered graphics are unparalleled in quality and realism.


Increasingly complex gaming environments have, in turn, resulted in more complex story lines, game play objectives, missions and tasks, and capabilities associated with game play avatars. As a result, interactive game play has become more challenging even for experienced game players. If a game becomes too challenging, however, game players may forsake future game play out of frustration.


To help game players overcome obstacles or achieve goals in a variety of interactive games, various content providers have begun publishing game magazines. These magazines provide game players with a ‘walk thru’ that tells the reader/game player where to go and what to do in order to ‘win’ the game or obtain the highest possible score. Hints or suggestions with respect to special moves or avatar capabilities may also be described in these gaming magazines.


While these magazines may be informative, they suffer from a number of drawbacks. If the magazine is not published by an official source (e.g., an official partner of the game developer), the magazine may omit essential information. In some instances, an unofficial magazine may publish incorrect information. Incorrect information may also result from the tendency to rush and publish these magazines concurrently with the release of an interactive game title to allow for concurrent purchase—even if the magazine is published by an official source.


Game players may also discover ‘Easter Eggs’ or other secrets during the course of game play. These secrets may not be a part of even an official magazine due to the fact that some game design engineers ‘slip in’ these Easter Eggs without the knowledge of the magazine publisher. Many interactive games also allow for the creation of special moves that may not have initially been conceived of by the game developer. As a result, these special moves are not a part of the game play magazine, official or otherwise, as their development occurs after the magazine and associated game have gone to market.


Once game play magazines publish, subsequent editions tend not to be published. The lack of subsequent, updated editions may further increase the amount of information withheld from game players. Unique game play situations or circumstances may not become apparent until the interactive game is played by a large number of game players. These situations and circumstances may not be addressed in the gaming magazine, thereby leaving game players at a loss as to how they may properly address the same.


In contrast, the Internet offers the opportunity for endless publishing and republishing of information. Notwithstanding endless publishing possibilities, websites on the Internet are often decentralized and unorganized. In some instances, there is no ‘official website’ as game developers may wish for game players to purchase a ‘for fee’ official magazine rather than access a free on-line website. Additionally, one website may offer one solution for one particular game play situation whereas another website may offer a solution for another situation. In order for a game player to obtain a complete ‘walk thru’ of a particular interactive game, the user may have to visit multiple websites on the Internet. Since these websites tend to be ‘unofficial,’ there is often an issue with the veracity or accuracy of the information displayed on these websites.


A further shortcoming of the aforementioned prior art solutions is that this information, regardless of source, thoroughness, or quality, lacks contextual relevance. Some game play environments include a variety of ‘acts’ or ‘levels’ of game play; these acts or levels often include a variety of subsidiary ‘scenes’ or ‘stages.’ For example, a game based on the D-Day military offensive may involve four scenes: crossing the English Channel; advancing up Omaha Beach; taking artillery positions at the head of the beach; and securing numerous military objectives in the French countryside. Game play advice concerning how to best maneuver an LCM Landing Craft while crossing the English Channel has no value to the game player that currently needs advice on how to best conduct a room-to-room search in the bombed out buildings of the nearby town of Bayeux. Locating the contextually appropriate game play advice may be time consuming if not confusing to a game player in the ‘heat of battle.’


The aforementioned prior art game play advice solutions also lack real-time provisioning of information. Many of today's interactive games are incredibly realistic, action-intensive simulations such as Warhawk from Sony Computer Entertainment America Inc. A game player often finds themselves ‘in the zone’ with respect to game play. If a game player is continually forced to interrupt game play (e.g., ‘pausing’ the game) in order to flip through pages of a game play magazine or click through various pages of content on the Internet, the game player will quickly find themselves losing their rhythm. In such complex game play environments, loss of that rhythm may be to the detriment of continued game play regardless of any hints or information that may have been acquired during the interruption.


Many games are also network or community-based with multiple players located around the country or around the world. Such games may occur in real-time. In certain of these games, the interruption of game play through ‘pause’ functionality may not be an option as may be available in a single-player game environment. The game player may be forced to drop out of a particular network game because the gaming environment cannot both exist in a timed-out/paused state for one game player yet continue in real-time for all others.


While some network or community-based games may allow for a ‘pause’ or other ‘time out’ feature, doing so may be to the detriment of the player invoking the interruption. In some games, for example, other game players may continue to advance through the game play environment by obtaining objects of value or reaching objectives within the environment. In other games, competing and non-paused players may position themselves to take retributive action on the ‘paused’ game player when they re-enter the gaming environment. For example, a non-paused player may sneak up behind a ‘paused’ player in a combat environment and assassinate the ‘paused’ player at point-blank range as the ‘paused’ player is unable to observe or react to events in the game environment while in a paused state.


There is a need in the art for game play advice that is complete and up-to-date regardless of when a particular interactive gaming title is released. Further, there is a need for game play advice that is pervasive and easily accessible to game players. There is a still further need for game play advice that is contextually appropriate and provided in real-time when such information is needed most.


SUMMARY OF THE INVENTION

Embodiments of the present invention provide a system and methods for placement of user-generated content to aid a user with interactive game play.


A first claimed embodiment of the present invention includes a method for managing user-generated game play advice. An indication of a location within a game space is received using a virtual coordinate system. The location corresponds to a point where rendering of game play advice is desirable. Game play advice is received from a user and assigned, using the virtual coordinate system, to the location within the game space previously identified as desirous of game play advice. Game play advice is then displayed during subsequent game play at the same location within the game space using the virtual coordinate system, the game play advice being displayed in a manner appropriate with respect to a present context of game play.


A further claimed embodiment of the present invention includes a computer-readable storage medium having embodied thereon a program. The program is executable by a computer to perform a method like that described above.


In a third claimed embodiment, a system for managing user-generated game play advice is described. The system includes a content submission engine for receiving game play advice over a network and a virtual coordinate system engine for assigning the game play advice to a particular location within a game space. A context engine identifies a context of an event during game play. The context of the event corresponds to game play advice associated with the particular location within the game space. A display engine displays game play advice corresponding to the context of the event identified by the context engine and at the location of the event as identified by the virtual coordinate system.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an exemplary system for placement of user-generated content to aid a user with interactive game play.



FIG. 2 illustrates an exemplary method for receipt and subsequent display of user-generated game play advice using in-game tags.



FIG. 3 illustrates a game space including user-generated content.





DETAILED DESCRIPTION

The present invention allows for the generation, association, and display of in-game tags. Such tags introduce an additional dimension of community participation to both single and multiplayer games. Through such tags, players are empowered to communicate through filtered text messages and images as well as audio clips that other game players, including top rated players, have generated and placed at particular coordinates and/or in context of particular events within the game space. The presently described in-game tags and associated user generated content further allow for label based searches with respect to game play.


In this context, the elements identified throughout are exemplary and may include various alternatives, equivalents, or derivations thereof. Various combinations of hardware, software, and computer-executable instructions may be utilized. Program modules and engines may include routines, programs, objects, components, and data structures that effectuate the performance of particular tasks when executed by a processor, which may be general purpose or application specific. Computer-executable instructions and associated data structures stored in a computer-readable storage medium represent examples of programming means for executing the steps of the methods and/or implementing particular system configurations disclosed herein.



FIG. 1 illustrates an exemplary system 100 for placement of user-generated content to aid a user with interactive game play. The system 100 of FIG. 1 includes a content submission engine 110, content database 120, virtual spatial coordinate (VSC) engine 130, game event and context engine 140, and matching/display engine 150. While various engines and databases are described in the context of FIG. 1, an embodiment of the present invention may offer the functionality of each or certain of these engines and databases in a single ‘content management’ engine or database.


System 100 may be implemented in a network environment such as the Internet, a proprietary communications environment, or a combination of the two. In one example, system 100 is an integrated component of the PlayStation® Network. System 100 (or components thereof) may communicate with the network environment utilizing any number of network interfaces as are known in the art. Examples of such interfaces include a 1000BASE-T Ethernet port or an IEEE 802.11b/g WiFi network interface.


System 100 may be implemented in a computing device such as a server dedicated to managing user-generated content including maintenance of various databases. Alternatively, system 100 may be implemented in a computing device hosting a number of applications such as community maintenance, admission, and network game data distribution. System 100 may be dedicated to a single network game, a genre of games, or any number of games having no particular affiliation at all.


System 100 may also be implemented in a distributed peer-to-peer environment. In such an implementation, certain applications and/or responsibilities may be managed by a group of computing devices in the environment.


Various engines may be distributed to a community of users (e.g., players of a particular game or users in a general gaming network) through a push operation from a tasked server in the game community. Alternatively, various engines may be embodied in a computer-readable storage medium that also includes a particular game application (e.g., a disc). Distributed applications and engines may communicate directly via a group of peers or may be administered by a management server.


Content submission engine 110 is executable to allow a user to communicate with the system 100 over a network for generation of in-game tags and the corresponding submission of user generated content. In-game tags include custom information placed by a user during game play and can include text messages, web links, images, audio or video clips, and user profile information. In-game tags rely upon virtual space coordinates, which are governed by the virtual spatial coordinate engine 130 and described in further detail below, and which allow for consistent positional information pertaining to the game space to be assigned to an in-game tag.
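

The patent does not prescribe any particular data model for these tags; purely by way of illustration, the following sketch (in TypeScript, with all names hypothetical) shows one way the enumerated content types could be coupled to a virtual space coordinate:

    // Hypothetical sketch only; the patent does not prescribe a data model.
    interface VirtualSpaceCoordinate {
      x: number;
      y: number;
      z?: number; // present only when the game space is three-dimensional
    }

    type TagContent =
      | { kind: "text"; message: string }
      | { kind: "link"; url: string }
      | { kind: "image"; imageId: string }
      | { kind: "audio" | "video"; clipId: string };

    interface InGameTag {
      tagId: string;
      coordinate: VirtualSpaceCoordinate; // assigned via the VSC engine 130
      content: TagContent[];              // stored in content database 120
      authorProfileId?: string;           // optional submitter profile info
      context?: string;                   // event identifier, discussed below
    }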


Execution of content submission engine 110 may generate a user-interface for allowing user interaction with the system 100. The interface allows a user to assign user generated information to a particular virtual space coordinate (VSC) and a corresponding tag within the game space. The interface specifically allows for allocation of user generated content as might contemporaneously or previously have been stored in content database 120.


During game play, a user may navigate a particular portion of a game environment such as a particular passageway as illustrated in FIG. 3. After having played a particular game a number of times, a user might believe that they have particularly useful information to offer other players of the same game such as warnings about enemies entering that passageway or the best way to navigate the passageway and move onto a subsequent game environment. A user might wish to share that information with other game players.


By depressing a particular button (or combination of buttons) on a control device used in conjunction with game play, a user assigns a tag to a particular locale in the game space. Other means of assigning a tag are envisioned, including gesture based assignment in those games utilizing motion based or gesture recognition controls. Audio commands may likewise be used to assign a tag in those games utilizing voice commands or having voice recognition capabilities (e.g., ‘drop tag’ or ‘assign tag’).


The particular locale in the game space has a VSC, which is the in-game equivalent to a global positioning system location. Through the use of a VSC, and as further described with respect to VSC engine 130, the particular tag will consistently be correlated to that portion of the game space. Whenever another game player (or the same game player) passes by that VSC after the tag has been assigned, the tag and any corresponding information in the content database 120 will be made accessible for review and study.
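

As a minimal sketch of this lookup, reusing the hypothetical types above and a brute-force search (a real engine would more likely use spatial indexing), tags could be surfaced whenever a player's current VSC falls within some display radius of a dropped tag:

    // Illustrative only: surface tags near the player's current VSC.
    function distance(a: VirtualSpaceCoordinate, b: VirtualSpaceCoordinate): number {
      return Math.hypot(a.x - b.x, a.y - b.y, (a.z ?? 0) - (b.z ?? 0));
    }

    function tagsNearPlayer(
      player: VirtualSpaceCoordinate,
      tags: InGameTag[],
      radius: number
    ): InGameTag[] {
      return tags.filter((tag) => distance(player, tag.coordinate) <= radius);
    }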


Content submission engine 110 allows a user to assign user generated information to a tag that was ‘dropped’ in the game space. It is difficult, if not impossible, to provide detailed information, hints, or other data during the course of game play. The content submission engine 110 provides the interface environment that allows for casual entry of that information following the completion of game play. The content submission engine 110 provides a post-game play listing of all tags that were dropped or assigned during game play and gives the user the means to provide an associated set of information to be stored in or retrieved from content database 120.


Through an interface generated by the content submission engine 110, a user may provide a detailed text message concerning information about the game play environment. The content may further include links to web pages that provide further information related to game play, such as information concerning upcoming tournaments, clans, and discussion groups. A tag might also be associated with screen shots or other images related to game play that might prove useful (such as maps) or of interest (such as ‘kill shots’). A tag can also be assigned to audio and video clips generated by a user that might provide a ‘replay’ of a particular portion of the game or verbal coaching as to game play. Profile information of the user providing the tag and corresponding user information may also be associated with a tag.


Entry of the game play information may be textual where a user enters a written description of the game play advice (e.g., ‘watch out for this guy’ or ‘through this door’ as shown in FIG. 3). Text entry may occur through a virtual keyboard manipulated by a game controller coupled to a gaming platform. The gaming platform, in turn, is coupled to the system 100 via a network. Submission of game play advice may be audible and provided by speaking into a USB microphone headset. Combinations of game play advice submissions are also within the scope of the present invention (e.g., a video clip with audible narration).


In some embodiments, the content submission engine 110 allows the user to re-trace game play and generate tags after the completion of game play. Some games might be so intense that even the act of generating a mere tag might interfere with optimal game play. In such a game, the user can execute the content submission engine 110 after game play is complete and ‘re-trace’ their steps, as the game will have tracked what portions of the environment were and were not accessed during play. The user may then assign tags to particular portions of the game space using a VSC system and the information associated therewith.


Submission of game play advice may also be contextually relevant. As many games are dynamic, especially first-person shooter type games, a particular scenario encountered in a particular environment during one round of game play (e.g., particular enemies) may differ significantly from a subsequent encounter in the exact same game space, depending on the particular scenario generated by the game play intelligence. In such an instance, providing a tag indicative of game play advice to a subsequent user when the event giving rise to the tag is not at hand may be distracting and may actually detract from effective game play.


Game event and context engine 140 may track these particular nuanced events and, in conjunction with the matching and display engine 150, ensure that only contextually relevant tags are displayed. Information concerning context may automatically be provided by the content submission engine 110. Alternatively, a user might identify specific contextual limitations during the information provisioning process.


In order to avoid inconsistent naming protocols that might otherwise complicate presentation of context-sensitive game play advice, the content submission engine 110 may indicate that hints related to storming the beach at Omaha in a World War II combat simulation are all provided under the category of ‘Omaha Beach’ instead of a series of user generated titles such as ‘storming the beach,’ ‘Omaha,’ ‘chapter II,’ and others. The content submission engine 110 may work in conjunction with the game event and context engine 140 with respect to providing naming protocols.
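

One simple way to realize such a naming protocol, sketched here under the assumption of a fixed lookup table (the patent does not specify a mechanism), is to fold free-form user titles into a developer-defined canonical category:

    // Hypothetical canonicalization table for category names.
    const canonicalCategory: Record<string, string> = {
      "storming the beach": "Omaha Beach",
      "omaha": "Omaha Beach",
      "chapter ii": "Omaha Beach",
    };

    function categorize(userTitle: string): string {
      return canonicalCategory[userTitle.trim().toLowerCase()] ?? userTitle;
    }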


The content submission engine 110 may also allow for user corrections or annotations of game play advice. For example, a previous user might provide information concerning accessing a particular weapon but erroneously identify the particular weapon or provide some other contextually inappropriate information. A subsequent user (or users) receiving that contextually inappropriate information may recognize the error or that the information might be better presented in a subsequent stage or area of game play (or simply correct an otherwise minor error). The subsequent user may lodge a complaint or suggest that an entity tasked with quality assurance of game play advice review the submission and/or context of the same.


Content database 120 may store all game play advice received through an interface generated by content submission engine 110. Alternatively, certain game play advice may expire over time or upon the occurrence of certain events. For example, the content database 120 may only retain the top-100 ranked game play advice submissions (as described in further detail herein). Once a particular instance of game play advice falls below a top-100 threshold, that particular instance may be deleted from the content database 120. Expiration may be temporal such that instances of game play advice that are not accessed for a particular period of time are removed from the content database 120. Instances of game play advice may also be removed from the game play advice content database 120 a predetermined number of days after having been submitted to the system 100.
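

A minimal sketch of these retention policies follows; the thresholds, field names, and time windows are assumptions for illustration, not values taken from the patent:

    // Illustrative pruning pass combining the policies described above.
    interface StoredAdvice {
      tag: InGameTag;
      rank: number;         // 1 = highest ranked submission
      lastAccessed: number; // epoch milliseconds
      submittedAt: number;  // epoch milliseconds
    }

    function prune(
      db: StoredAdvice[],
      now: number,
      maxRank = 100,                      // top-100 threshold
      idleMs = 90 * 24 * 3600 * 1000,     // hypothetical idle window
      maxAgeMs = 365 * 24 * 3600 * 1000   // hypothetical absolute lifetime
    ): StoredAdvice[] {
      return db.filter(
        (a) =>
          a.rank <= maxRank &&
          now - a.lastAccessed <= idleMs &&
          now - a.submittedAt <= maxAgeMs
      );
    }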


System 100 may include a ranking engine (not shown) to manage the ranking of game play advice stored in content database 120. As described in co-pending patent publication numbers U.S. 2010-0041475 A1 for “Real-Time, Contextual Display of Ranked, User-Generated Game Play Advice” and U.S. 2009-0063463 A1 for “Ranking of User-Generated Game Play Advice,” the disclosures of each being incorporated herein by reference, when new game play advice is received, a ranking engine may assign a default ranking to a new instance of game play advice. This default ranking and any other ranking (including those generated as a result of user feedback) may be measured utilizing any rubric capable of distinguishing one instance of user-generated game play advice from another. In conjunction with a feedback engine and optional weighting engine, both of which are described in the aforementioned publications, the perceived quality of game play advice as adjudicated by a community of users may be more readily identified.
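

By way of a rough sketch only (the incorporated publications describe the actual ranking and weighting mechanics; the default score and per-vote weight below are invented for illustration), a feedback-driven ranking update might look like:

    // Hypothetical feedback-driven score update.
    const DEFAULT_SCORE = 1000; // default ranking assigned to new advice

    function updatedScore(
      current: number,
      communityVotes: number[], // e.g. +1 / -1 votes from the community
      weight = 5                // hypothetical per-vote weight
    ): number {
      return communityVotes.reduce((score, vote) => score + vote * weight, current);
    }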


Virtual spatial coordinate engine 130, as noted above, operates as a global positioning system for a particular game space. Depending on the particular layout of the game environment, the VSC engine 130 may identify an X, Y, and (if appropriate) Z coordinate for the game space. This coordinate in the game space is then associated with individual instances of in-game tags such that the tags are consistently provided in the same game space as when they were originally assigned. The VSC engine 130 not only provides consistent presentation of information, but also accurate presentation as more general descriptions such as ‘hallway by the door,’ ‘on the beach,’ or ‘Level II’ as might otherwise be utilized may not provide the specificity required to render useful game play advice. The VSC engine 130 may operate in conjunction with information concerning the rendering and tracking of user information for a particular game title and may thus be agnostic as to any particular game title.


Information concerning VSC data may be provided to the content submission engine 110 to allow for generation of content and matching to in-game tags. VSC data from engine 130 may likewise be provided to content database 120 to allow for proper retrieval and display of user content and in-game tags by matching and display engine 150. VSC data may also be used by game event and context engine 140 to assign proper game context to tags and associated content vis-à-vis the submission engine and the matching/display engine 150.


Game event and context engine 140 is tasked with providing game play advice in an appropriate context of game play such that it may be appropriately displayed by the matching and display engine 150. Content submission engine 110 allows for annotation of appropriate contexts of game play advice by means of an in-game tag. The game event and context engine 140 may identify the context of game play that would be appropriate for game play advice. For example, walking down an alleyway without threats, obstacles, or other encounters that would require tactical game play is not likely to warrant the need for hints or advice. Advancing up the beaches of Normandy on D-Day with heavy gun fire from German forces, obstacles and landmines on the beach, and advancing troops and equipment from the English Channel would clearly require quick and strategic thinking. In this instance, the game event and context engine 140 would, in conjunction with the matching and display engine 150, identify that tags providing game play advice are appropriate and feed that tag information to the display engine 150 such that tags may be displayed and content eventually accessed in content database 120.


A game developer may make initial determinations as to whether a particular task or level will provide certain challenges thus making advice warranted. The game event and context engine 140 may be programmed to correspond to such determinations. Further, the game developer may allow for the introduction of user generated game play advice in those contexts where the game developer provides their own default game play advice; these points may likewise be introduced into the game event and context engine 140. Game developers, too, may study game play feedback in network games with respect to identifying choke points or other areas where particular obstacles might prove to be more challenging in actual game play implementation than those obstacles were during the course of pre-release testing. A game developer may release an update to the game event and context engine 140 over a network that allows for introduction of user advice post-release. The content submission engine 110 may then access the game event and context engine 140 to allow for users to provide this information. These points may be with respect to levels, obstacles, events, enemies, and so forth.


As noted with respect to the content submission engine 110, the game event and context engine 140 may identify certain points of game play related to objects, challenges, or enemies as well as levels or stages as a whole. Game code or other metadata may be flagged with respect to objects or enemies and these flags may be recognized by the game event and context engine 140 upon execution of the game code by a gaming system or processing device. These flags or metadata may be tied to allowing for entry of game play advice. For example, in a World War II simulation, a player might be crossing a field. The field, without any enemies present, may not warrant the need for game play advice—submissions or providing of the same. Later in that same game environment (the field) a tank may enter the scene and begin firing upon the game player. With the introduction of the tank, providing or receiving game play advice may now be warranted. For the tank to appear in the scene would require the execution of code related to the tank. The code for introducing and intelligently controlling the tank by the game platform may be flagged or identified by the aforementioned metadata. Once that flagged code or metadata is recognized by the game event and context engine 140, a user may provide advice or receive the same.
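

A minimal sketch of this gating follows, assuming entities carry a developer-set eligibility flag (the flag and field names are hypothetical):

    // Illustrative only: advice becomes available once a flagged entity loads.
    interface EntityMetadata {
      entityId: string;        // e.g. a hypothetical "tank_01"
      adviceEligible: boolean; // flag set by the game developer
    }

    function adviceWarranted(loadedEntities: EntityMetadata[]): boolean {
      // An empty field warrants no advice; a flagged tank entering the
      // scene makes providing or receiving advice appropriate.
      return loadedEntities.some((e) => e.adviceEligible);
    }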


The game event and context engine 140, in this regard, is not only responsible for identifying those points or instances of game play where a user may provide advice, but also those instances where providing advice is appropriate. For example, in the previously mentioned alleyway example, no challenges are present thus making the introduction of advice by the system inappropriate or unnecessary. Should a sniper suddenly begin firing upon the game player, then advice on how to deal with the sniper may be appropriate for the user to consider. The game event and context engine 140 may recognize that providing information related to the sniper is appropriate based on the game platform loading flagged code related to the sniper. Similar provisioning of advice may occur with respect to encountering objects and the like. The game event and context engine 140 may be tied to the game play advice display engine 150 to allow for timely and contextually appropriate display of that advice.


Game play advice display engine 150 is configured to allow for the eventual display of user-generated game play advice via in-game tags and VSC data. Display of this advice may be in further accordance with a ranking result generated by a ranking engine and in further consideration of determinations made by the game event and context engine 140. Game play advice display engine 150 acquires information from the game play advice content database 120 (the advice) and a ranking database (if appropriate), which has ranked game play advice as determined by a ranking engine, and displays the game play advice (or makes available the game play advice) in accordance with the VSC data from engine 130 as well as the game event and context engine 140's determination that the display of advice related to a particular in-game tag and aspect of game play is appropriate.


By working in conjunction with the game event and context engine 140, the display engine 150 may display the highest ranked information but do so in the most appropriate context. For example, displaying information about a particular enemy may be inappropriate when the user has not encountered that enemy notwithstanding the fact that the user providing the information previously encountered that enemy at the same VSC coordinates.


The display engine 150 may utilize an asynchronous programming language to provide real-time (or substantially near real-time) updates to ranked game play advice for display to a community of users. The display engine 150 may, therefore, utilize a ladder ranking of game play advice with respect to determining which in-game tags to display. In such an embodiment, the highest quality advice is presented as that advice ranks at the top of a ladder. In some embodiments, the particular arrangement of the advice as it corresponds to a given tag may be subject to user or system preferences such as particular tags searched by a user or identified as being desirable by a user.


For example, a user may consistently experience difficulty using a particular weapon during game play (e.g., a sniper rifle). Prior to game play, a user seeking advice may, through a corresponding search engine or other interface, inform system 100 that only those in-game tags and corresponding advice with respect to use of the sniper rifle are wanted. In this manner, the user is not inundated with data concerning the use of grenades, hand guns, and rocket launchers, all weapons with which the user might be quite prolific and for which advice is not needed.
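

Sketched below under the assumption that such preferences arrive as a simple label match against textual tag content (the patent leaves the search mechanism open):

    // Illustrative label-based screening of tags prior to game play.
    function filterByLabel(tags: InGameTag[], wanted: string): InGameTag[] {
      const needle = wanted.toLowerCase(); // e.g. "sniper rifle"
      return tags.filter((tag) =>
        tag.content.some(
          (c) => c.kind === "text" && c.message.toLowerCase().includes(needle)
        )
      );
    }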


Similar searching and screening of tags may be used with respect to advice from particular users or particular clans. This information may be derived from profile information provided during tag and advice generation. In some instances, a user providing game play advice may limit the accessibility of that advice to a limited number of users. A user wishing to access advice from a particular providing user may need to have been identified in advance of in-game tag access or otherwise provide a password or some indicia indicating that they are authorized to access in-game tags and corresponding advice generated by a particular user.


Display engine 150 may display advice in the context of a real-world virtual environment and/or a first- or third-person avatar. Game play advice may be expressly provided via an in-game tag as shown in FIG. 3. Game play advice may also be provided through a series of hyperlinks provided through the tag. Graphic images may also be utilized, especially in the context of game play advice that incorporates full motion video or still images. Links to audio files may be appropriate in the case of audio-rendered advice. All of the aforementioned means of providing game play advice to a community of users (and in accordance with an assigned default or feedback controlled ranking) may be managed by the display engine 150 and the game event and context engine 140.



FIG. 2 illustrates an exemplary method 200 for receipt and subsequent display of user-generated game play advice using in-game tags. The steps identified in FIG. 2 (and the order thereof) are exemplary and may include various alternatives, combinations, equivalents, or derivations thereof including but not limited to the order of execution of the same. The steps of the process of FIG. 2 (and its various alternatives) may be embodied in hardware or software including a computer-readable storage medium (e.g., optical disc, memory card, or hard drive) including instructions executable by the processor of a computing device.


In step 210, user-generated game play advice is received from a user in the community via an interface generated by the content submission engine 110. Upon receipt of the user-generated advice in step 210, the advice is processed by the system 100 as described in the context of FIG. 1 and stored in game play advice content database 120. Various rankings may also be assigned.


In step 220, the user-generated game play advice, which is associated with a tag, is assigned a particular context either by the user submitting the advice or by the game event and context engine 140 as well as being matched with a given tag using VSC coordinates. In some instances, the game event and context engine 140 may control the available contexts that a user assigns to the advice. In other instances, the game event and context engine 140 may make a determination as to the specific context of advice.


Following subsequent game play (230), the same or a different game player may be navigating a particular game space. A previously generated tag may be identified by means of VSC coordinates at step 240 (i.e., a tag exists as to some particular game play advice at this particular locale in the game space). The context of a game event is then identified in step 250. Identification step 250 occurs as a result of the joint operation of the game event and context engine 140 and display engine 150 and may be similar to the identification of an initial context of game play advice as occurs in the context of step 230 (but not otherwise displayed in FIG. 2). Once a particular context corresponding to a particular VSC has been identified in an environment, advice that is relevant to that particular context is identified. That advice is rendered in conjunction with display engine 150 at step 260. The display of advice may take into account user rankings and/or user-defined search tags or other limitations.
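

Reusing the hypothetical helpers sketched earlier, steps 240 through 260 could be approximated as a small pipeline (ranking and user search limits are left to the display engine and omitted here):

    // Illustrative pipeline for steps 240-250; step 260 rendering omitted.
    function adviceToRender(
      player: VirtualSpaceCoordinate,
      currentContext: string,
      tags: InGameTag[],
      radius = 5 // hypothetical display radius in VSC units
    ): InGameTag[] {
      // Step 240: identify previously generated tags by VSC proximity.
      const nearby = tagsNearPlayer(player, tags, radius);
      // Step 250: keep only tags matching the present context of game play.
      return nearby.filter((t) => !t.context || t.context === currentContext);
    }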


The method 200 of FIG. 2 may operate in real-time (or substantially in real-time) using an asynchronous programming language. Through the use of an asynchronous language, small amounts of data may be continually exchanged with a database so that an entire user interface need not be reloaded in response to each user interaction. In such an embodiment, an XMLHttpRequest object may, for example, be utilized to fetch the most recent, contextually and locally relevant game play advice from database 120 as referenced in FIG. 1. Relationships between rankings, user feedback, context, and game play advice may be reflected by metadata or header data stored in the various databases of system 100. Game play advice rankings and context determinations may thus be updated as feedback is received and new rankings are calculated.
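

As a sketch of that exchange, assuming a hypothetical /advice endpoint and using the modern fetch() API in place of a raw XMLHttpRequest:

    // Illustrative asynchronous refresh; endpoint and payload are assumptions.
    async function refreshAdvice(
      coordinate: VirtualSpaceCoordinate,
      context: string
    ): Promise<InGameTag[]> {
      const query =
        `x=${coordinate.x}&y=${coordinate.y}` +
        `&context=${encodeURIComponent(context)}`;
      const response = await fetch(`/advice?${query}`);
      // Only the small advice payload travels; the interface is not reloaded.
      return (await response.json()) as InGameTag[];
    }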


Updating of information displayed in FIG. 2 may also operate subject to a predetermined schedule. For example, a ranking engine may update rankings via user feedback at five minute intervals (or any other time period as may be determined by a system administrator). Similar updates may occur with respect to context. Once an update is complete as a result of a regularly scheduled ranking operation, the newly updated information may be pushed to the display engine 150 for display to the community of users in conjunction with appropriate VSC coordinates and context. The updated information may also be available for access in response to a user request or query.


While the present invention has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the true spirit and scope of the present invention. Various alternative systems may be utilized to implement the various methodologies described herein and various methods may be used to achieve certain results from the aforementioned systems.

Claims
  • 1. A computer-implemented method for providing only contextually relevant user-generated game play advice for a location within a dynamic game, the method comprising: executing, by a processor, instructions stored in a memory to: receive an indication of the location within a game space; receive the user-generated game play advice for a first location from a user; determine a first scenario generated by the dynamic game that corresponds to the user-generated game play advice for the first location; assign the user-generated game play advice to the first location specifically for the first scenario generated by the dynamic game within the game space; during subsequent game play, determine: the subsequent game play's location within the game space; and the subsequent game play's scenario at the subsequent game play's location; and in response to the subsequent game play being at the first location within the game space, automatically display at the first location any of the user-generated game play advice.
  • 2. The computer-implemented method of claim 1, further comprising retracing the user-generated game play advice upon which game play is based, the user-generated game play advice retraceable by the user after completion of the game play for revising or adding to the user-generated game play advice.
  • 3. The computer-implemented method of claim 1, wherein the user-generated game play advice is displayed in a three-dimensional virtual environment.
  • 4. The computer-implemented method of claim 3, wherein a virtual coordinate system uses X, Y, and Z coordinates.
  • 5. The computer-implemented method of claim 1, wherein the user-generated game play advice is textual.
  • 6. The computer-implemented method of claim 1, wherein the user-generated game play advice is visual.
  • 7. The computer-implemented method of claim 1, wherein the user-generated game play advice is audible.
  • 8. The computer-implemented method of claim 1, wherein automatically displaying the user-generated game play advice includes a consideration of a ranking of all available game play advice, and wherein only game play advice of a particular ranking is displayed at the first location and with respect to a present context of game play, the user-generated game play advice further allowing for corrections to be made by the user.
  • 9. A system for providing only contextually relevant user-generated game play advice for a location within a dynamic game, the system comprising: a processor; and a memory communicatively coupled with the processor, the memory storing instructions which when executed by the processor perform a method, the method comprising: receiving the user-generated game play advice over a network from a user; assigning the user-generated game play advice to the location within the game space; identifying a first scenario generated by the dynamic game that corresponds to the user-generated game play advice for a first location, and assigning the user-generated game play advice to the first location specifically for the first scenario generated by the dynamic game within the game space; during subsequent game play, determining: the subsequent game play's location within the game space; and the subsequent game play's scenario at the subsequent game play's location; and in response to the subsequent game play being at the first location within the game space, automatically displaying at the first location any of the user-generated game play advice.
  • 10. The system of claim 9, wherein the method further comprises affecting, via a ranking engine, the user-generated game play advice displayed by a display engine notwithstanding a context of an event and a location of the event.
  • 11. The system of claim 10, wherein the method further comprises receiving, via a feedback engine, feedback from a community of users with respect to a quality of the user-generated game play advice displayed by the display engine, wherein the feedback engine and the ranking engine operate to allocate a new ranking to the user-generated game play advice in accordance with the feedback received from the community of users, the user-generated game play advice being subsequently displayed by the display engine in accordance with the new ranking.
  • 12. The system of claim 10, wherein the display engine operates using an asynchronous programming language to continually update displayed game play advice submissions in accordance with a most recent determination as to the context of the event.
  • 13. A method for providing only contextually relevant user-generated game play advice for a location within a dynamic game, the method comprising: receiving an indication of a location within a game space; receiving game play advice from a user; recognizing metadata associated with objects, challenges or enemies in game play at the location that indicates that user-generated advice is allowed; assigning the user-generated game play advice to the location within the game space and assigning the user-generated game play advice a tag based upon the recognized metadata; and automatically displaying game play advice during subsequent game play at the location within the game space.
  • 14. The method of claim 13, further comprising retracing the user-generated game play advice upon which the game play is based, the user-generated game play advice retraceable by the user after completion of the game play for revising or adding to the user-generated game play advice.
  • 15. The method of claim 13, wherein the user-generated game play advice is displayed in a three-dimensional virtual environment.
  • 16. The method of claim 15, wherein a virtual coordinate system uses X, Y, and Z coordinates.
  • 17. The method of claim 13, wherein the user-generated game play advice is textual.
  • 18. The method of claim 13, wherein the user-generated game play advice is visual.
  • 19. The method of claim 13, wherein the user-generated game play advice is audible.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present continuation application claims the priority benefit of U.S. Non-Provisional patent application Ser. No. 12/777,268 filed on May 11, 2010 and titled, “Placement of User Information in a Game Space,” which is hereby incorporated by reference in its entirety.

US Referenced Citations (451)
Number Name Date Kind
3147341 Gibson, Jr. Sep 1964 A
3200193 Biggs et al. Aug 1965 A
3717345 Banville Feb 1973 A
3943277 Everly et al. Mar 1976 A
4051491 Toyoda Sep 1977 A
4051520 Davidse et al. Sep 1977 A
4068847 Lukkarila et al. Jan 1978 A
4090216 Constable May 1978 A
4116444 Mayer et al. Jun 1978 A
4133004 Fitts Jan 1979 A
4166429 Smorzaniuk Sep 1979 A
4166430 Johnson, Jr. Sep 1979 A
4203385 Mayer et al. May 1980 A
4241341 Thorson Dec 1980 A
4321635 Tsuyuguchi Mar 1982 A
4355334 Fitzgibbon et al. Oct 1982 A
4361850 Nishimura Nov 1982 A
4448200 Brooks et al. May 1984 A
4514727 Van Antwerp Apr 1985 A
4533937 Yamamoto et al. Aug 1985 A
4646075 Andrews et al. Feb 1987 A
4649504 Krouglicof et al. Mar 1987 A
4658247 Gharachorloo Apr 1987 A
4672564 Egli et al. Jun 1987 A
4675562 Herlein et al. Jun 1987 A
4677569 Nakano et al. Jun 1987 A
4683466 Holtey et al. Jul 1987 A
4685054 Manninen et al. Aug 1987 A
4685146 Fenster et al. Aug 1987 A
4709231 Sakaibara et al. Nov 1987 A
4727365 Bunker et al. Feb 1988 A
4737921 Goldwasser et al. Apr 1988 A
4757525 Matthews et al. Jul 1988 A
4764727 McConchie, Sr. Aug 1988 A
4807158 Blanton et al. Feb 1989 A
4817005 Kubota et al. Mar 1989 A
4843568 Krueger et al. Jun 1989 A
4860197 Langendorf et al. Aug 1989 A
4864515 Deck Sep 1989 A
4866637 Gonzalez-Lopez et al. Sep 1989 A
4901064 Deering Feb 1990 A
4905147 Logg Feb 1990 A
4905168 McCarthy et al. Feb 1990 A
4933864 Evans, Jr. et al. Jun 1990 A
4934908 Turrell et al. Jun 1990 A
4942538 Yuan et al. Jul 1990 A
4943938 Aoshima et al. Jul 1990 A
4952917 Yabuuchi Aug 1990 A
4956794 Zeevi et al. Sep 1990 A
4962540 Tsujiuchi et al. Oct 1990 A
4969036 Bhanu et al. Nov 1990 A
4980823 Liu Dec 1990 A
4991223 Bradley Feb 1991 A
4992972 Brooks et al. Feb 1991 A
5014327 Potter et al. May 1991 A
5034986 Karmann et al. Jul 1991 A
5045843 Hansen Sep 1991 A
5057744 Barbier et al. Oct 1991 A
5064291 Reiser Nov 1991 A
5067014 Bergen et al. Nov 1991 A
5128671 Thomas, Jr. Jul 1992 A
5128794 Mocker et al. Jul 1992 A
5162781 Cambridge Nov 1992 A
5194941 Grimaldi et al. Mar 1993 A
5208763 Hong et al. May 1993 A
5212888 Cary et al. May 1993 A
5222203 Obata Jun 1993 A
5227985 DeMenthon Jul 1993 A
5230623 Guthrie et al. Jul 1993 A
5253339 Wells et al. Oct 1993 A
5261820 Slye et al. Nov 1993 A
5265888 Yamamoto et al. Nov 1993 A
5268996 Steiner et al. Dec 1993 A
5269687 Mott et al. Dec 1993 A
5274560 LaRue Dec 1993 A
5297061 Dementhon et al. Mar 1994 A
5305389 Palmer Apr 1994 A
5307137 Jones et al. Apr 1994 A
5335557 Yasutake Aug 1994 A
5351090 Nakamura Sep 1994 A
5354202 Moncrief et al. Oct 1994 A
5361147 Katayama et al. Nov 1994 A
5363120 Drumm Nov 1994 A
5366376 Copperman et al. Nov 1994 A
5367615 Economy et al. Nov 1994 A
5369737 Gholizadeh et al. Nov 1994 A
5377997 Wilden et al. Jan 1995 A
5387943 Silver Feb 1995 A
5405151 Naka et al. Apr 1995 A
5446714 Yoshio et al. Aug 1995 A
5446798 Morita et al. Aug 1995 A
5448687 Hoogerhyde et al. Sep 1995 A
5450504 Calia Sep 1995 A
5469193 Giobbi et al. Nov 1995 A
5473736 Young Dec 1995 A
5526041 Glatt Jun 1996 A
5534917 MacDougall Jul 1996 A
5537638 Morita et al. Jul 1996 A
5548667 Tu Aug 1996 A
5550960 Shirman Aug 1996 A
5555532 Sacha Sep 1996 A
5557684 Wang et al. Sep 1996 A
5559950 Cannon Sep 1996 A
5563989 Billyard Oct 1996 A
5572261 Cooper Nov 1996 A
5574836 Broemmelsiek Nov 1996 A
5577179 Blank Nov 1996 A
5577913 Moncrief et al. Nov 1996 A
5586231 Florent et al. Dec 1996 A
5590248 Zarge et al. Dec 1996 A
5598297 Yamanaka et al. Jan 1997 A
5611000 Szeliski et al. Mar 1997 A
5616078 Oh Apr 1997 A
5617407 Bareis Apr 1997 A
5630033 Purcell et al. May 1997 A
5631697 Nishimura et al. May 1997 A
5647019 Iino et al. Jul 1997 A
5649032 Burt et al. Jul 1997 A
5659671 Tannenbaum et al. Aug 1997 A
5660547 Copperman Aug 1997 A
5668646 Katayama et al. Sep 1997 A
5672820 Rossi et al. Sep 1997 A
5673374 Sakaibara et al. Sep 1997 A
5680487 Markandey Oct 1997 A
5684887 Lee et al. Nov 1997 A
5699497 Erdahl et al. Dec 1997 A
5704024 Voorhies et al. Dec 1997 A
5717848 Watanabe et al. Feb 1998 A
5734384 Yanof et al. Mar 1998 A
5748865 Yamamoto et al. May 1998 A
5748867 Cosman et al. May 1998 A
5751928 Bakalash May 1998 A
5756354 Tzidon et al. May 1998 A
5757360 Nitta et al. May 1998 A
5760781 Kaufman et al. Jun 1998 A
5761401 Kobayashi et al. Jun 1998 A
5764803 Jacquin et al. Jun 1998 A
5769718 Rieder Jun 1998 A
5774124 Itoh et al. Jun 1998 A
5781194 Ponomarev et al. Jul 1998 A
5786801 Ichise Jul 1998 A
5793376 Tanaka et al. Aug 1998 A
5796952 Davis et al. Aug 1998 A
5798519 Vock et al. Aug 1998 A
5805170 Burch Sep 1998 A
5805745 Graf Sep 1998 A
5805782 Foran Sep 1998 A
5808617 Kenworthy et al. Sep 1998 A
5808619 Choi et al. Sep 1998 A
5812136 Keondjian Sep 1998 A
5812141 Kamen et al. Sep 1998 A
5818424 Korth Oct 1998 A
5818553 Koenck et al. Oct 1998 A
5825308 Rosenberg Oct 1998 A
5831623 Negishi et al. Nov 1998 A
5838366 Snape et al. Nov 1998 A
5852443 Kenworthy Dec 1998 A
5854632 Steiner Dec 1998 A
5856844 Batterman et al. Jan 1999 A
5864342 Kajiya et al. Jan 1999 A
5864742 Gasper et al. Jan 1999 A
5870097 Snyder et al. Feb 1999 A
5870098 Gardiner Feb 1999 A
5880736 Peercy et al. Mar 1999 A
5880856 Ferriere Mar 1999 A
5889505 Toyama et al. Mar 1999 A
5890122 Van Kleeck et al. Mar 1999 A
5894308 Isaacs Apr 1999 A
5899810 Smith May 1999 A
5903318 Demay et al. May 1999 A
5905894 De Bonet May 1999 A
5912830 Krech, Jr. et al. Jun 1999 A
5913727 Ahdoot Jun 1999 A
5914724 Deering et al. Jun 1999 A
5915972 Tada Jun 1999 A
5917937 Szeliski et al. Jun 1999 A
5923318 Zhai et al. Jul 1999 A
5923381 Demay et al. Jul 1999 A
5929860 Hoppe Jul 1999 A
5933150 Ngo et al. Aug 1999 A
5933535 Lee et al. Aug 1999 A
5935198 Blomgren Aug 1999 A
5949424 Cabral et al. Sep 1999 A
5953485 Abecassis Sep 1999 A
5959673 Lee et al. Sep 1999 A
5963209 Hoppe Oct 1999 A
5964660 James et al. Oct 1999 A
5966133 Hoppe Oct 1999 A
5977977 Kajiya et al. Nov 1999 A
5982352 Pryor Nov 1999 A
5982390 Stoneking et al. Nov 1999 A
5986668 Szeliski et al. Nov 1999 A
5987164 Szeliski et al. Nov 1999 A
5990901 Lawton et al. Nov 1999 A
6002738 Cabral et al. Dec 1999 A
6009188 Cohen et al. Dec 1999 A
6009190 Szeliski et al. Dec 1999 A
6010403 Adam et al. Jan 2000 A
6016150 Lengyel et al. Jan 2000 A
6018347 Willis Jan 2000 A
6018349 Szeliski et al. Jan 2000 A
6023523 Cohen et al. Feb 2000 A
6026182 Lee et al. Feb 2000 A
6031934 Ahmad et al. Feb 2000 A
6034691 Aono et al. Mar 2000 A
6034692 Gallery et al. Mar 2000 A
6034693 Kobayashi et al. Mar 2000 A
6035067 Ponticos Mar 2000 A
6037947 Nelson et al. Mar 2000 A
6040842 Wavish et al. Mar 2000 A
6044181 Szeliski et al. Mar 2000 A
6046744 Hoppe Apr 2000 A
6049619 Anandan et al. Apr 2000 A
6049636 Yang Apr 2000 A
6058397 Barrus et al. May 2000 A
6072494 Nguyen Jun 2000 A
6072504 Segen Jun 2000 A
6081274 Shiraishi Jun 2000 A
6100898 Malamy et al. Aug 2000 A
6101289 Kellner Aug 2000 A
6112240 Pogue et al. Aug 2000 A
6121953 Walker Sep 2000 A
6127936 Gendel et al. Oct 2000 A
6130673 Pulli et al. Oct 2000 A
6137492 Hoppe Oct 2000 A
6141013 Nelson et al. Oct 2000 A
6141041 Carlborn et al. Oct 2000 A
6155924 Nakagawa et al. Dec 2000 A
6157386 Wilde Dec 2000 A
6162123 Woolston Dec 2000 A
6172354 Adan et al. Jan 2001 B1
6175367 Parikh et al. Jan 2001 B1
6181384 Kurashige et al. Jan 2001 B1
6181988 Schneider et al. Jan 2001 B1
6199093 Yokoya Mar 2001 B1
6200138 Ando et al. Mar 2001 B1
6201581 Moriwake et al. Mar 2001 B1
6203426 Matsui et al. Mar 2001 B1
6208347 Migdal et al. Mar 2001 B1
6220962 Miyamoto et al. Apr 2001 B1
6222555 Christofferson et al. Apr 2001 B1
6229553 Duluk, Jr. et al. May 2001 B1
6233291 Shukhman et al. May 2001 B1
6252608 Snyder et al. Jun 2001 B1
6268875 Duluk, Jr. et al. Jul 2001 B1
6273814 Komoto Aug 2001 B1
6288730 Duluk, Jr. et al. Sep 2001 B1
6313841 Ogata et al. Nov 2001 B1
6313842 Tampieri Nov 2001 B1
6319129 Igarashi et al. Nov 2001 B1
6320580 Yasui et al. Nov 2001 B1
6323838 Thanasack Nov 2001 B1
6330000 Fenney et al. Dec 2001 B1
6331851 Suzuki et al. Dec 2001 B1
6342885 Knittel et al. Jan 2002 B1
6348921 Zhao et al. Feb 2002 B1
6353272 van der Hoeven Mar 2002 B1
6356263 Migdal et al. Mar 2002 B2
6356272 Matsumoto et al. Mar 2002 B1
6356288 Freeman et al. Mar 2002 B1
6361438 Morihira Mar 2002 B1
6366272 Rosenberg et al. Apr 2002 B1
6392647 Migdal et al. May 2002 B1
6396490 Gorman May 2002 B1
6400842 Fukuda Jun 2002 B2
6411298 Goto et al. Jun 2002 B1
6414960 Kuhn et al. Jul 2002 B1
6417836 Kumar et al. Jul 2002 B1
6421057 Lauer et al. Jul 2002 B1
6426720 Ross et al. Jul 2002 B1
6426755 Deering Jul 2002 B1
6456977 Wang Sep 2002 B1
6476807 Duluk, Jr. et al. Nov 2002 B1
6488505 Hightower Dec 2002 B1
6489955 Newhall, Jr. Dec 2002 B1
6496189 Yaron et al. Dec 2002 B1
6496598 Harmon Dec 2002 B1
6504538 Freund et al. Jan 2003 B1
6529206 Ohki et al. Mar 2003 B1
6529875 Nakajima et al. Mar 2003 B1
6538666 Ozawa et al. Mar 2003 B1
6545663 Arbter et al. Apr 2003 B1
6554707 Sinclair et al. Apr 2003 B1
6563499 Waupotitsch et al. May 2003 B1
6571208 Kuhn et al. May 2003 B1
6572475 Okabe et al. Jun 2003 B1
6573890 Lengyel Jun 2003 B1
6577312 Deering et al. Jun 2003 B2
6578197 Peercy et al. Jun 2003 B1
6585599 Horigami et al. Jul 2003 B1
6594388 Gindele et al. Jul 2003 B1
6597363 Duluk, Jr. et al. Jul 2003 B1
6609976 Yamagishi et al. Aug 2003 B1
6611265 Hong et al. Aug 2003 B1
6639594 Zhang et al. Oct 2003 B2
6639609 Hayashi Oct 2003 B1
6646639 Greene et al. Nov 2003 B1
6646640 Nagy Nov 2003 B2
6650329 Koike Nov 2003 B1
6652376 Yoshida et al. Nov 2003 B1
6664955 Deering Dec 2003 B1
6664959 Duluk, Jr. et al. Dec 2003 B2
6680746 Kawai et al. Jan 2004 B2
6686924 Mang et al. Feb 2004 B1
6714236 Wada et al. Mar 2004 B1
6717576 Duluk, Jr. et al. Apr 2004 B1
6717579 Deslandes et al. Apr 2004 B1
6717599 Olano Apr 2004 B1
6720949 Pryor et al. Apr 2004 B1
6738059 Yoshinaga et al. May 2004 B1
6744442 Chan et al. Jun 2004 B1
6750867 Gibson Jun 2004 B1
6753870 Deering et al. Jun 2004 B2
6755654 Hightower Jun 2004 B2
6764403 Gavin Jul 2004 B2
6771264 Duluk, Jr. et al. Aug 2004 B1
6771813 Katsuyama Aug 2004 B1
6778181 Kilgariff et al. Aug 2004 B1
6781594 Day Aug 2004 B2
6795068 Marks Sep 2004 B1
6798411 Gorman et al. Sep 2004 B1
6803910 Pfister et al. Oct 2004 B2
6803964 Post et al. Oct 2004 B1
6807296 Mishima Oct 2004 B2
6825851 Leather Nov 2004 B1
6850236 Deering Feb 2005 B2
6850243 Kilgariff et al. Feb 2005 B1
6853382 Van Dyke et al. Feb 2005 B1
6854632 Larsson Feb 2005 B1
6864895 Tidwell et al. Mar 2005 B1
6903738 Pfister et al. Jun 2005 B2
6912010 Baker et al. Jun 2005 B2
6917692 Murching et al. Jul 2005 B1
6928433 Goodman et al. Aug 2005 B2
6956871 Wang et al. Oct 2005 B2
6962527 Baba Nov 2005 B2
6995788 James Feb 2006 B2
7006101 Brown et al. Feb 2006 B1
7072792 Freifeld Jul 2006 B2
7079138 Day Jul 2006 B2
7081893 Cerny Jul 2006 B2
7085722 Luisi Aug 2006 B2
7101284 Kake et al. Sep 2006 B2
7113193 Marks Sep 2006 B2
7162314 Fay et al. Jan 2007 B2
7180529 Covannon et al. Feb 2007 B2
7194539 Hughes et al. Mar 2007 B2
7214133 Jen et al. May 2007 B2
7233904 Luisi Jun 2007 B2
7251315 Quinton Jul 2007 B1
7293235 Powers et al. Nov 2007 B1
7304667 Watanabe et al. Dec 2007 B2
7333150 Cooper Feb 2008 B2
7339589 Annunziata Mar 2008 B2
7589723 Wang et al. Sep 2009 B2
7636126 Mallinson Dec 2009 B2
7777746 Annunziata Aug 2010 B2
7877262 Luisi Jan 2011 B2
7880746 Marks et al. Feb 2011 B2
7916215 Wu et al. Mar 2011 B2
7920209 Mallinson Apr 2011 B2
7965338 Chen Jun 2011 B2
8133115 Campbell Mar 2012 B2
8204272 Marks Jun 2012 B2
8243089 Marks et al. Aug 2012 B2
8284310 Mallinson Oct 2012 B2
8289325 Green et al. Oct 2012 B2
8798401 Johnson et al. Aug 2014 B1
10786736 Weising Sep 2020 B2
20010048434 Brown Dec 2001 A1
20020018063 Donovan et al. Feb 2002 A1
20020041335 Taraci et al. Apr 2002 A1
20020047937 Wells Apr 2002 A1
20020068626 Takeda et al. Jun 2002 A1
20020080136 Kouadio Jun 2002 A1
20020107070 Nagy Aug 2002 A1
20020130866 Stuttard Sep 2002 A1
20020140703 Baker et al. Oct 2002 A1
20020162081 Solomon Oct 2002 A1
20020167518 Migdal et al. Nov 2002 A1
20030009748 Glanville et al. Jan 2003 A1
20030043163 Day Mar 2003 A1
20030045359 Leen et al. Mar 2003 A1
20030050112 Leen et al. Mar 2003 A1
20030058238 Doak et al. Mar 2003 A1
20030104868 Okita et al. Jun 2003 A1
20030112238 Cerny et al. Jun 2003 A1
20030117391 Olano Jun 2003 A1
20030142232 Albean Jul 2003 A1
20030179220 Dietrich et al. Sep 2003 A1
20030216177 Aonuma et al. Nov 2003 A1
20040003370 Schenk et al. Jan 2004 A1
20040051716 Sevigny Mar 2004 A1
20040056860 Collodi Mar 2004 A1
20040100582 Stanger May 2004 A1
20040130550 Blanco Jul 2004 A1
20040130552 Duluk, Jr. et al. Jul 2004 A1
20040166935 Gavin et al. Aug 2004 A1
20040219976 Campbell Nov 2004 A1
20040263636 Cutler et al. Dec 2004 A1
20040268413 Reid et al. Dec 2004 A1
20050001836 Day Jan 2005 A1
20050019020 Sato et al. Jan 2005 A1
20050024379 Marks Feb 2005 A1
20050026689 Marks Feb 2005 A1
20050078116 Sloan et al. Apr 2005 A1
20050090302 Campbell Apr 2005 A1
20050090312 Campbell Apr 2005 A1
20050243094 Patel et al. Nov 2005 A1
20050246638 Whitten Nov 2005 A1
20050253965 Cooper Nov 2005 A1
20060015348 Cooper et al. Jan 2006 A1
20060039017 Park Feb 2006 A1
20060047704 Gopalakrishnan Mar 2006 A1
20060071933 Green et al. Apr 2006 A1
20060209210 Swan et al. Sep 2006 A1
20060214943 Day Sep 2006 A1
20060238549 Marks Oct 2006 A1
20060290810 Mallinson Dec 2006 A1
20070035831 Gutierrez Novelo Feb 2007 A1
20070094335 Tu Apr 2007 A1
20070106760 Houh et al. May 2007 A1
20070168309 Tzruya et al. Jul 2007 A1
20070191097 Johnson Aug 2007 A1
20070257928 Marks et al. Nov 2007 A1
20070279427 Marks Dec 2007 A1
20080070655 Tanabe Mar 2008 A1
20080215994 Harrison Sep 2008 A1
20080268956 Suzuki Oct 2008 A1
20080268961 Brook et al. Oct 2008 A1
20080274798 Walker et al. Nov 2008 A1
20090007186 Hartwell Jan 2009 A1
20090017908 Miyamoto Jan 2009 A1
20090040222 Green et al. Feb 2009 A1
20090063463 Turner et al. Mar 2009 A1
20090088233 O'Rourke Apr 2009 A1
20090118015 Chang et al. May 2009 A1
20090131177 Pearce May 2009 A1
20090193453 Cansler et al. Jul 2009 A1
20090209337 Vrignaud et al. Aug 2009 A1
20090227368 Wyatt Sep 2009 A1
20100029387 Luisi Feb 2010 A1
20100041475 Zalewski Feb 2010 A1
20100053430 Mallinson Mar 2010 A1
20100179857 Kalaboukis et al. Jul 2010 A1
20110181776 Mallinson Jul 2011 A1
20110205240 Marks et al. Aug 2011 A1
20110249072 Marks Oct 2011 A1
20110281648 Weising Nov 2011 A1
20130129142 Miranda-Steiner May 2013 A1
20140087877 Krishnan Mar 2014 A1
Foreign Referenced Citations (130)
Number Date Country
112012028924 Feb 2021 BR
1201180 Dec 1998 CN
1369849 Sep 2002 CN
1652063 Aug 2005 CN
1806236 Jul 2006 CN
1910619 Feb 2007 CN
1317666 May 2007 CN
101198277 Jun 2008 CN
101375596 Feb 2009 CN
101401081 Apr 2009 CN
101553862 Oct 2009 CN
103002960 Mar 2013 CN
103706117 Apr 2014 CN
103002960 Nov 2016 CN
103706117 Jan 2017 CN
106964155 Jul 2017 CN
107491701 Dec 2017 CN
106964155 Oct 2020 CN
107491701 Oct 2020 CN
19905076 May 2000 DE
60235994.508 Apr 2010 DE
0448411 Sep 1991 EP
0553973 Aug 1993 EP
0615386 Sep 1994 EP
0789296 Aug 1997 EP
0850673 Jul 1998 EP
0947948 Oct 1999 EP
1029569 Aug 2000 EP
1176559 Jan 2002 EP
1229499 Aug 2002 EP
1419481 May 2004 EP
1479421 Nov 2004 EP
1541207 Jun 2005 EP
1541208 Jun 2005 EP
1630754 Mar 2006 EP
1650706 Apr 2006 EP
1419481 Apr 2010 EP
2569063 Mar 2013 EP
3608003 Feb 2020 EP
1419481 Apr 2010 FR
2351637 Jan 2001 GB
2411065 Aug 2005 GB
1419481 Apr 2010 GB
9902CHENP2012 Apr 2014 IN
S59002040 Jan 1984 JP
S59202779 Nov 1984 JP
S61131110 Jun 1986 JP
H01229393 Sep 1989 JP
H01308908 Dec 1989 JP
H04151780 May 1992 JP
H0527779 Apr 1993 JP
H05336540 Dec 1993 JP
H06089342 Mar 1994 JP
H06266854 Sep 1994 JP
2006301474 Oct 1994 JP
H06301474 Oct 1994 JP
H07160412 Jun 1995 JP
H07253774 Oct 1995 JP
H07271999 Oct 1995 JP
H07334664 Dec 1995 JP
H08112449 May 1996 JP
H08155140 Jun 1996 JP
H09047576 Feb 1997 JP
H09178426 Jul 1997 JP
H09265379 Oct 1997 JP
H10055454 Feb 1998 JP
H10165649 Jun 1998 JP
H11070273 Mar 1999 JP
11179050 Jul 1999 JP
2000020193 Jan 2000 JP
2000070546 Mar 2000 JP
2000137828 May 2000 JP
2000157724 Jun 2000 JP
2000311251 Jul 2000 JP
2000218036 Aug 2000 JP
2000233072 Aug 2000 JP
3384978 Sep 2000 JP
2000237453 Sep 2000 JP
2000338993 Dec 2000 JP
2000339491 Dec 2000 JP
3244798 Jan 2001 JP
2001029649 Feb 2001 JP
2001198350 Jul 2001 JP
2002052256 Feb 2002 JP
2002140705 May 2002 JP
2002153676 May 2002 JP
2002159749 Jun 2002 JP
2002177547 Jun 2002 JP
2002304637 Oct 2002 JP
2001079263 Mar 2003 JP
3588351 Nov 2004 JP
2004321797 Nov 2004 JP
2005500629 Jan 2005 JP
2005125095 May 2005 JP
2005125098 May 2005 JP
2006178948 Jul 2006 JP
3821822 Sep 2006 JP
3901960 Apr 2007 JP
2007136215 Jun 2007 JP
3971380 Sep 2007 JP
2008165784 Jul 2008 JP
2008278937 Nov 2008 JP
4381371 Dec 2009 JP
2010088694 Apr 2010 JP
4616330 Oct 2010 JP
2013528432 Jul 2013 JP
5889876 Mar 2016 JP
20000072753 Dec 2000 KR
20020065397 Aug 2002 KR
100606653 Jul 2006 KR
1020130118209 Oct 2013 KR
1020170091779 Aug 2017 KR
101881787 Jul 2018 KR
561448 Nov 2003 TW
191667 Mar 2004 TW
WO1994018790 Aug 1994 WO
WO1998002223 Jan 1998 WO
WO1998053443 Nov 1998 WO
WO2000010130 Feb 2000 WO
WO2001029768 Apr 2001 WO
WO2001082626 Nov 2001 WO
WO2003017200 Feb 2003 WO
WO2005040900 May 2005 WO
WO2006033360 Mar 2006 WO
WO2006041993 Apr 2006 WO
WO2007001633 Jan 2007 WO
WO2008018943 Feb 2008 WO
WO2008058271 May 2008 WO
WO2008058271 Aug 2008 WO
WO2011142857 Nov 2011 WO
Non-Patent Literature Citations (117)
International Preliminary Examination Report dated Dec. 15, 2003 in PCT Application No. PCT/US2002/026360.
European Search Report dated Apr. 2, 2004 in EP Application No. EP02001875.0.
Communication from Examining Division dated May 24, 2007 in EP Application No. 02001875.0.
Communication from Examining Division dated Jun. 17, 2009 in EP Application No. 02001875.0.
Supplementary European Search Report dated Aug. 1, 2006 in EP Application No. 02768608.8.
Mark et al. “Compiling to a VLIW Fragment Pipeline.” In Proceedings of 2001 SIGGRAPH/Eurographics Workshop on Graphics Hardware, pp. 47-55. 9 pages.
Arvo, J. “Backward Ray Tracing.” In Developments in Ray Tracing, Computer Graphics, Proceedings of ACM SIGGRAPH 86 Course Notes, vol. 12, Aug. 1986. pp. 259-263. Retyped from original. 2 pages.
International Search Report dated Feb. 6, 2006 in PCT Application No. PCT/US2005/035947.
Communication from Examining Division dated Jan. 12, 2007 in EP Application No. 02768608.8.
Communication from Examining Division dated Nov. 12, 2009 regarding Intention to Grant a European Patent in EP Application No. 02768608.8.
Communication from Examining Division dated Mar. 18, 2010 regarding Decision to Grant a European Patent in EP Application No. 02768608.8.
Partial European Search Report dated Jul. 19, 2007 in EP Application No. EP01306264.1.
European Search Report dated Oct. 12, 2007 in EP Application No. EP01306264.1.
Communication from Examining Division dated Jul. 8, 2008 in EP Application No. 01306264.1.
Communication from Examining Division dated Oct. 16, 2009 in EP Application No. 01306264.1.
Supplementary European Search Report dated Jan. 2, 2014 in EP Application No. 11780949.1.
Communication from Examining Division dated Dec. 22, 2004 regarding European Search Report—Search Not Possible in EP Application No. 04256331.2.
Communication from Examining Division dated Nov. 7, 2008 in EP Application No. 04256331.2.
Communication from Examining Division dated Jul. 27, 2010 in EP Application No. 04256331.2.
International Search Report dated Aug. 7, 2007 in PCT Application No. PCT/US2006/017574.
Communication from Examining Division dated Dec. 22, 2004 regarding European Search Report—Search Not Possible in EP Application No. 04256342.9.
Communication from Examining Division dated Apr. 28, 2006 in EP Application No. 04256342.9.
International Search Report dated Mar. 31, 2011 in PCT Application No. PCT/US2011/023780.
European Search Report dated Aug. 20, 2004 for EP Application No. EP04251842.3.
Communication from Examining Division dated Apr. 28, 2005 in EP Application No. 04251842.3.
Communication from the Examining Division dated Jun. 14, 2006 regarding Summons to Attend Oral Proceedings for EP Application No. 04251842.3.
Communication from Examining Division dated Feb. 1, 2007 regarding Decision to Refuse the Application in EP Application No. 04251842.3.
Office Action dated Jul. 3, 2014 in Chinese Application No. 201180034304.4 filed Jan. 11, 2013.
Final Office Action dated Apr. 7, 2015 in Japanese Application No. 2013-510086 filed Feb. 4, 2011.
Office Action dated Sep. 18, 2015 in Chinese Application 201180034304.4 filed Feb. 4, 2011.
Office Action dated Jan. 29, 2016 in Chinese Application 201310484667.9 filed Oct. 17, 2013, 11 pages.
Office Action dated Apr. 7, 2016 in Chinese Application 201180034304.4 filed Feb. 4, 2011, 4 pages.
Office Action dated Jun. 20, 2016 in Korean Application 10-2012-7032205 filed Feb. 4, 2011, 10 pages.
Notice of Allowance dated Aug. 4, 2016 in Chinese Application 201180034304.4 filed Feb. 4, 2011, 5 pages.
“Notice of Allowance”, Chinese Patent Application No. 201310484667.9, dated Sep. 5, 2016, 5 pages.
“Office Action,” European Patent Application No. 11780949.1, dated Feb. 13, 2017, 7 pages.
Korean Application No. 10-2012-7032205, “Office Action,” dated Dec. 26, 2016, 3 pages [6 pages including translation].
Chinese Application No. 201280033005.3, “Office Action,” dated Feb. 28, 2017, 3 pages [5 pages including translation].
“Office Action,” South Korea Patent Application No. 1020127032205, dated Apr. 28, 2017, 4 pages [8 pages including translation].
“Office Action,” South Korea Patent Application No. 1020177021490, dated Sep. 20, 2017, 5 pages [10 pages including translation].
“Office Action,” South Korea Patent Application No. 1020177021490, dated Mar. 30, 2018, 4 pages [8 pages including translation].
“Notice of Allowance,” South Korea Patent Application No. 1020177021490, dated May 8, 2018, 2 pages [3 pages including translation].
“Summons,” European Patent Application No. 11780949.1, Feb. 14, 2019, 9 pages.
International Search Report dated Dec. 18, 2015 in PCT Application No. PCT/US2015/050908, 11 pages.
International Search Report dated Dec. 11, 2015 in PCT Application No. PCT/US2015/050870, 13 pages.
“Office Action,” Brazil Patent Application No. BR1120120289241, dated Aug. 2, 2019, 4 pages.
“Office Action,” India Patent Application No. 9585/CHENP/2012, dated Oct. 22, 2019, 6 pages.
“Minutes of Oral Proceeding”, European Patent Convention Application No. 11780949.1, Oct. 25, 2019, 4 pages.
“Extended European Search Report,” European Application No. 19197482.3, dated Nov. 6, 2019, 10 pages.
“Decision to Refuse,” European Patent Convention Application No. 11780949.1, dated Nov. 4, 2019, 16 pages.
“Office Action,” China Patent Application No. 201610912795.2, dated Oct. 21, 2019, 5 pages (11 pages including translation).
Aguilera et al. “Impaired Persons Facilities Based on a Multi-Modality Speech Processing System.” In Proceedings on Speech & Language Technology, 1993. 4 pages.
Arons, B. “Authoring and Transcription Tools for Speech-Based Hypermedia.” In Proceedings of American Voice I/O Society, 1991. 5 pages.
Arons, B. “Hyperspeech: Navigating in Speech-Only Hypermedia.” In Proceedings of the Third Annual ACM Conference on Hypertext, ACM, 1991. pp. 133-146.
Bennacef, S.K. “A Spoken Language System for Information Retrieval.” In ICSLP, vol. 94, pp. 1271-1274. 1994. 4 pages.
Gauvain et al. “Speech recognition for an Information Kiosk.” In Spoken Language, 1996. ICSLP 96. Proceedings. Fourth International Conference on, vol. 2, pp. 849-852. IEEE, 1996.
Gauvain et al. “Spoken Language Component of the MASK Kiosk.” In Human Comfort and Security of Information Systems, pp. 93-103. Springer Berlin Heidelberg, 1997. 11 pages.
Gauvain et al. “The LIMSI Continuous Speech Dictation System.” In Proceedings of the Workshop on Human Language Technology, pp. 319-324. Association for Computational Linguistics, Apr. 1994. 6 pages.
Gauvain et al. “The LIMSI Continuous Speech Dictation System: Evaluation on the ARPA Wall Street Journal Task.” In Acoustics, Speech, and Signal Processing, 1994. ICASSP-94., 1994 IEEE International Conference on, vol. 1, IEEE, 1994, 4 pp.
Goddeau et al. "Galaxy: A Human-Language Interface to On-Line Travel Information." In Third International Conference on Spoken Language Processing, 1994, pp. 707-710.
House, D. "Spoken-Language Access to Multimedia (SLAM): Master's Thesis." Oregon Graduate Inst., Dept. of CS and Eng., 1995. 59 pages.
Lamel et al. “Recent Developments in Spoken Language Systems for Information Retrieval.” In Spoken Dialogue Systems—Theories and Applications. 1995, 4 pages.
Language Industry Monitor, “Janet Baker's Optimism.” 1992. [retrieved on Oct. 25, 2011]. Retrieved from the Internet: <URL: http://www.lim.nl/monitor/dragon.html>, 2 pages.
Mostow et al. "Towards a Reading Coach That Listens: Automated Detection of Oral Reading Errors." Proceedings of the 11th National Conference on Artificial Intelligence, 1993, pp. 392-397.
Russell et al. “Applications of Automatic Speech Recognition to Speech and Language Development in Young Children.” In Fourth International Conference on Spoken Language, 1996. ICSLP 96. Vol. 1, pp. 176-179. IEEE, 1996. 4 pages.
“Office Action,” Brazil Patent Application No. BR1120120289241, dated Mar. 11, 2020, 4 pages [8 pages including translation].
“Notice of Allowance,” China Patent Application No. 201610912795.2, dated Jul. 6, 2020, 2 pages (5 pages including translation).
Agui et al. “Computer Graphics.” Shokodo Co., Ltd., Jul. 1992, 1st ed., pp. 80-101 (Environment Mapping).
Auslander et al. “Fast, Effective Dynamic Compilation.” In ACM SIGPLAN Notices, vol. 31, No. 5, pp. 149-159. ACM, 1996. 11 pages.
Balakrishnan et al. “Exploring Interactive Curve and Surface Manipulation Using a Bend and Twist Sensitive Input Strip.” In Proceedings of the 1999 Symposium on Interactive 3D Graphics, pp. 111-118. ACM, 1999. 9 pages.
Balakrishnan et al. “Performance Differences in the Fingers, Wrist, and Forearm in Computer Input Control.” In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems, pp. 303-310. ACM, 1997. 10 pages.
Balakrishnan et al. “The PadMouse: Facilitating Selection and Spatial Postioning for the Non-Dominant Hand.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 9-16. ACM Press/Addison-Wesley Publishing Co., 1998. 9 pages.
Balakrishnan et al. “Exploring Bimanual Camera Control and Object Manipulation in 3D Graphics Interfaces.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 56-62. ACM, 1999. 8 pages.
Beshers et al. “Generating Efficient Virtual Worlds for Visualization Using Partial Evaluation and Dynamic Compilation.” In ACM SIGPLAN Notices, vol. 32, No. 12, pp. 107-115. ACM, 1997. 9 pages.
Blinn et al. “Texture and Reflection in Computer Generated Images.” Communications of the Association for Computing Machinery, ACM, Oct. 1, 1976, pp. 542-547, vol. 19, No. 10, New York, NY USA.
Blinn, J.F. “Light Reflection Functions for Simulation of Clouds and Dusty Surfaces.” ACM SIGGRAPH Computer Graphics, pp. 21-29, vol. 16, Issue 3, Jul. 1982.
Blinn, J.F. “Models of Light Reflection for Computer Synthesized Pictures.” In ACM SIGGRAPH Computer Graphics, vol. 11, No. 2, pp. 192-198. ACM, 1977.
Calvert, J. "SCEE announces EyeToy:Chat." Game Spot, May 5, 2004. [retrieved Jun. 5, 2006]. Retrieved from the Internet: <URL: http://www.gamespot.com/news/6095429.html>.
Chan et al. "Efficient Partitioning of Fragment Shaders for Multipass Rendering on Programmable Graphics Hardware." In Proceedings of the ACM SIGGRAPH/Eurographics Conference on Graphics Hardware, pp. 69-78, 156. Eurographics Association, 2002 (Saarbrücken, Germany, Sep. 1-2, 2002).
Davenport et al. “Cinematic Primitives for Multimedia.” IEEE Computer Graphics and Applications (Aug. 1991), vol. 11, No. 4, pp. 67-74.
Dorsey et al. "Design and Simulation of Opera Lighting and Projection Effects." In ACM SIGGRAPH Computer Graphics, vol. 25, No. 4, pp. 41-50. ACM, 1991.
European Search Report dated Jul. 27, 2010 in EP Application No. 04256331.2, filed Oct. 14, 2004. 4 pages.
Fernando et al. "The Cg Tutorial: The Definitive Guide to Programmable Real-Time Graphics." Addison-Wesley Longman Publishing Co., Inc. Ch. 1, sections 1.2 and 1.4; Appendix C, section C.2. 40 pages.
Fitzmaurice et al. "Sampling, Synthesis, and Input Devices." Communications of the ACM, vol. 42, No. 8, pp. 55-63, Aug. 1999.
Foley et al. “Computer Graphics: Principles and Practice.” Second Edition in C, Oct. 1996, pp. 721-745.
Shiu et al. "Pose Determination of Circular Cylinders Using Elliptical and Side Projections." In Systems Engineering, 1991. IEEE International Conference on. New York: IEEE. pp. 265-268.
Nicewarner et al. "Vision-Guided Grasping of a Strut for Truss Structure Assembly." In Intelligent Robotic Systems for Space Exploration, 1992. Proceedings. Fourth Annual Conference on, pp. 86-93. IEEE, 1992.
Gueziec et al. “Simplicial Maps for Progressive Transmission of Polygonal Surfaces.” In Proceedings of the Third Symposium on the Virtual Reality Modeling Language ACM, 1998, pp. 25-31, 131, New York, NY, USA.
Hayano et al. “Mesh Simplification Using Edge Operation with Feature Detection.” Inf. Proc. Soc. of Japan SIG Technical Report, Feb. 27, 1998, vol. 98, No. 16, pp. 72-76.
Konma, T. "Rendering and Texture: Introduction to CG Creation in the Multimedia Age." Nikkei Bus. Pub., Inc. Nov. 1996, No. 122, p. 237 (Bump Mapping).
Nilsson, “ID3 Tag Version 2.3.0.” ID3v2: The Audience is Informed, Feb. 3, 1999; [retrieved on Oct. 26, 2011]. Retrieved from the Internet: <URL: http://www.id3.org/id3v2.3.0>.
Matsushita, Yasuyuki, “Special Effects: Interobject Reflection Effect: Starting OpenGL Programming with Mesa 3D.” Itsutsubachi Res. Co., Ltd., Jan. 2000, vol. 3, No. 1, pp. 148-153. 6 pages.
McCool et al. “Texture Shaders.” In Proceedings of the ACM SIGGRAPH/Eurographics Workshop on Graphics Hardware, pp. 117-126. ACM, 1999. 11 pages.
Akenine-Möller et al. "Real-Time Rendering." A.K. Peters Ltd., 2008 (first published 1999), pp. 68-81.
Nakamura et al. “Adaptive Transmission of Polygonal Patch Datasets Taking into Account Shape and Color Distribution Features.” Inf. Proc. Soc. of Japan SIG Technical Report, Sep. 8, 2000, vol. 2000, No. 8, pp. 25-30.
Nayar et al. “Lighting Sensitive Display.” ACM Transactions on Graphics, Oct. 2004, vol. 23, No. 4, pp. 963-979, New York.
NVIDIA Corporation, "User Guide CgFX Plug-In for 3ds Max." Nov. 13, 2002, 26 pages.
Palmer et al. “Tile Based Games FAQ.” GAMEDEV, Aug. 31, 2000. [retrieved on Oct. 25, 2011]. Retrieved from the Internet: <URL: http://www.ggdn.net/GameDev/3d/BlockGames/article728.asp.htm>.
Peercy et al. “Interactive Multi-Pass Programmable Shading.” In Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques, pp. 425-432. ACM Press/Addison-Wesley Publishing Co., 2000. 8 pages.
Phong, Bui Tuong, “Illumination for Computer Generated Pictures.” Communications of the ACM, 18(6), pp. 311-317, Jun. 1975.
Pratt, David R. "A Software Architecture for the Construction and Management of Real-Time Virtual Worlds." Dissertation, Naval Postgraduate School, Monterey, California, Jun. 1993, pp. 62-67.
Proudfoot et al. “A Real-Time Procedural Shading System for Programmable Graphics Hardware.” In Proceedings of the 28th Annual Conference on Computer Graphics and Interactive Techniques, pp. 159-170. ACM, Aug. 2001. 12 pages.
Rushmeier et al. “Extending the Radiosity Method to Include Specularly Reflecting and Translucent Materials.” ACM Transaction on Graphics, vol. 9, No. 1, pp. 1-27, Jan. 1990.
Schlick, C. “A Survey of Shading and Reflectance Models.” In Computer Graphics Forum, vol. 13, No. 2, pp. 121-131. Blackwell Science Ltd., 1994. 10 pages.
Schlick, C. “A Fast Alternative to Phong's Specular Model.” In Graphics Gems IV, Morgan Kaufmann, pp. 385-386, Academic Press Professional, Inc., 1994.
Segen et al. “Gesture VR: Vision-Based 3D Hand Interface for Spatial Interaction.” In Proceedings of the Sixth ACM International Conference on Multimedia, pp. 455-464. ACM, 1998.
Pearce, A. “Shadow Attenuation for Ray Tracing Transparent Objects.” In Graphics Gems, pp. 397-399. Academic Press Professional, Inc., 1990.
Tang et al. “Blending Structured Graphics and Layout.” In Proceedings of the 7th Annual ACM Symposium on User Interface Software and Technology, pp. 167-173. ACM, 1994. 8 pages.
Taylor, Philip, “The MSDN Shader Workshop Application, Part 1.” Microsoft Corporation, Mar. 25, 2002. [retrieved on Oct. 26, 2011]. Retrieved from the Internet: <URL: http://msdn.microsoft.com/en-us/library/ms810474(d=printer)>.
Magnenat-Thalmann et al. “Interactive Computer Animation.” No. VRLAB-BOOK-2007-003. pp. 182-186. Prentice Hall, 1996.
The PlayStation 2 Books Riding Spirits Official Complete Guide (graphics), Japan, SoftBank Publishing, Sep. 6, 2003, First Edition, 5 pages.
Voorhies et al. "Reflection Vector Shading Hardware." In Proceedings of the 21st Annual Conference on Computer Graphics and Interactive Techniques, pp. 163-166. ACM, 1994. 4 pages.
Ware et al. "Reaching for Objects in VR Displays: Lag and Frame Rate." ACM Transactions on Computer-Human Interaction (TOCHI), vol. 1, No. 4, Dec. 1994, pp. 331-356.
White, S. “The Technology of Jak & Daxter.” Game Developer's Conference, Mar. 6, 2003, [retrieved on Mar. 29, 2007]. Retrieved from the Internet: <URL: http://www.gamasutra.com/gdcarchive/2003/White_Stephen.ppt>.
Woo et al. “A Survey of Shadow Algorithms.” Computer Graphics and Applications, IEEE 10, No. 6 (1990): 13-32.
Ohashi et al. "A Gesture Recognition Method for a Stick Input System." Transactions of the Information Processing Society of Japan 40, No. 2 (1999) [retrieved on Mar. 19, 2014]. Retrieved from the Internet: <URL: http://ci.nii.ac.jp/naid/110002764810>. 12 pages.
International Search Report dated Feb. 4, 2003 in PCT Application No. PCT/US2002/026360.
Related Publications (1)
Number Date Country
20210001224 A1 Jan 2021 US
Continuations (1)
Number Date Country
Parent 12777268 May 2010 US
Child 17029990 US