Some embodiments of the disclosed subject matter relate to video games, and more specifically to gesture-based music games.
Although video games and video game consoles are prevalent in many homes, game controllers, with their myriad buttons and joysticks, are sometimes intimidating and confusing to people who do not often play video games. For these people, using a game controller to interact with the game can be an obstacle to enjoying it. Also, where the game is a dance and/or full-body motion game, an additional controller is often used in the form of a dance mat or dance pad. These dance mats have specific input sections (similar to buttons on a traditional controller) that typically react to pressure from the user's feet. These mats, however, typically take up a lot of space and are often single-use controllers: they are used only for dance games and are typically rolled up and stored when not in use.
At least some of the embodiments described in the present disclosure relate generally to video games and systems used therewith. More specifically, some of the embodiments described herein relate to camera-based interactive game systems that include music and allow for user interaction with the game system in a manner that is responsive to the music.
In general, in an aspect, embodiments of the disclosed subject matter can include a method including: displaying, on a display, a multi-part visual cue that instructs a player to perform a gesture at a specified time, the multi-part visual cue including a first part indicating the gesture that is to be performed by the player at the specified time, and a second part providing an indication of i) the specified time, and ii) a preparation period before the specified time, wherein the distance between the first and the second parts is variable over time; receiving, from a video camera, position information associated with positions of at least part of the player over time; determining a first displacement of the at least part of the player using the position information; determining whether the first displacement of the at least part of the player matches a first target displacement criterion associated with the multi-part visual cue; and, when the first displacement matches the first target displacement criterion within a timing window of the specified time, altering a gameplay characteristic of the video game.
In general, in another aspect, embodiments of the disclosed subject matter can include a video game system including a memory storing computer executable instructions and one or more processors coupled to the memory and configured to execute the instructions such that the one or more processors: cause the display of a multi-part visual cue that instructs a player to perform a gesture at a specified time, the multi-part visual cue including a first part indicating the gesture that is to be performed by the player at the specified time, and a second part providing an indication of i) the specified time, and ii) a preparation period before the specified time, wherein the distance between the first and the second parts is variable over time; receive, from a video camera, position information associated with positions of at least part of the player over time; determine a first displacement of the at least part of the player using the position information; determine whether the first displacement of the at least part of the player matches a first target displacement criterion associated with the multi-part visual cue; and, when the first displacement matches the first target displacement criterion within a timing window of the specified time, alter a gameplay characteristic of the video game.
In general, in still another aspect, embodiments of the disclosed subject matter can include a non-transitory computer readable medium comprising instructions that, when executed by one or more processors, cause the one or more processors to: display, on a display, a multi-part visual cue that instructs a player to perform a gesture at a specified time, the multi-part visual cue including a first part indicating the gesture that is to be performed by the player at the specified time, and a second part providing an indication of i) the specified time, and ii) a preparation period before the specified time, wherein the distance between the first and the second parts is variable over time; receive, from a video camera, position information associated with positions of at least part of the player over time; determine a first displacement of the at least part of the player using the position information; determine whether the first displacement of the at least part of the player matches a first target displacement criterion associated with the multi-part visual cue; and, when the first displacement matches the first target displacement criterion within a timing window of the specified time, alter a gameplay characteristic of the video game.
In general, in yet another aspect, embodiments of the disclosed subject matter can include a computerized method including: displaying, on a display, a primary cursor that is controlled by a player using a video camera; detecting an interaction between the primary cursor and an object displayed on the display; in response to detecting the interaction, constraining the primary cursor to the object; and displaying a secondary cursor that is controlled by the player via the video camera system to manipulate the object.
At least some of the embodiments described herein can provide one or more of the following capabilities. One or more users can interact with a video game system more efficiently than in the past. One or more users can interact with a video game more naturally than in the past. One or more users can interact with a video game in a more immersive manner than with prior techniques. One or more users can interact with a video game using body gestures, including controlling the progression of the video game. User gestures can be recognized more efficiently when compared with prior techniques. One or more users can interact with a virtual world within a video game more intuitively than in the past. A video game can determine whether users have performed gestures at a specified time more accurately than in the past. A video game can determine whether users have performed a wider variety of gestures than in the past. These and other capabilities will be more fully understood after a review of the following figures, detailed description, and claims.
Embodiments of the disclosed subject matter can provide techniques for a player to interact with a video game, such as a music-based video game, using a video camera-based controller. For example, a video game can provide an experience where a player reacts to and/or creates music by moving his or her body. The player's movements can be captured by the video camera-based controller system such that the player can interact with the game using partial- and full-body gestures. In some aspects of the game, the user can be expected to create music by performing certain gestures, and in other aspects of the game, the user can be expected to react to pre-existing music. Certain specialized gestures can be used by the game such as swipe, push, sustain, and path gestures. The player can be prompted to perform certain gestures using two-part, on-screen cues.
In another aspect of the disclosed subject matter, a video game can provide a virtual world populated by objects. A player can interact with these objects using a cursor whose movements are controlled by the player's movements. For example, the video game can use the video camera-based controller to track the movement of either or both of the player's hands (or other body part), and move an on-screen cursor in a way that mimics the movement of the player's hands (or other body part) in two dimensions or three dimensions. A player can interact with objects by, for example, moving the cursor near to or on top of objects in the virtual world (other ways of interacting with objects are possible, e.g., by leaving the cursor on top of an object for a predetermined time). In some embodiments, a two-element cursor can be used: when a player's primary cursor moves near to or on top of an object in the virtual world, the primary cursor can “stick” onto this object, and a secondary cursor can appear. This secondary cursor can be configured to track the movement of the player's hand (or other body part). In this mode, the secondary cursor's motion can now be used to manipulate or change the state of the object that the primary cursor is constrained to.
Other embodiments are possible.
Referring to
While the foregoing paragraph describes the use of a video camera-based sensor 106, this is exemplary only. Other methods for tracking a player's body are possible. For example, in some embodiments, the video camera system 106 can be used with transducers and/or markers attached to the player's body in three dimensions. In other embodiments, the entertainment system 100 can use infrared pointing devices or other motion tracking peripherals. In still other embodiments, the system 106 may not even include a camera (e.g., it could track position/movement using lasers). Regardless of the specific hardware used, preferably the entertainment system 100 can determine the position of a player over time in two or three dimensions so that information such as motion, velocity, acceleration, and/or displacement can be derived. Additionally, determining position in three dimensions typically makes the techniques described herein easier to implement due to the additional information provided to the game console 104. In some embodiments, it can be desirable to scale the player position information to compensate for different size players.
Referring now to
Although the KINECT provides a framework for determining positional information of a user's body, it typically does not provide a means for interpreting the user's movements, including determining whether a user has completed a gesture (as described below) in time with the music, or operating a two-cursor graphic (also as described below).
Referring still to
As used herein, the terms "joint," "bone," "body part," "location on the body," "skeleton" and similar terms are not limited to their respective dictionary definitions and are intended to have the meaning one of skill in the art of motion capture, Kinect-based gaming, and animation would ascribe to them. For example, a skeleton derived from a video camera system can comprise bones, but the number of bones and their positions can be a function of the motion capture equipment and/or animation rig and do not necessarily correlate to the number and positions of bones in a human skeleton. Similarly, a joint is not limited to the point where two bones come together. For example, a joint can be at a distal endpoint of a single bone (e.g., a fingertip or head) or can be located midway along a bone (e.g., in the middle of a femur). Joints can also represent regions of the player's body, such as the player's torso or head (even though these do not correspond to a specific bone in the human body).
An example of the KINECT skeleton is shown in
One of the benefits provided by the skeleton-based system is that the skeletal model can be used to calculate scale vectors based on two or more joints. This provides a spatially relative system, e.g., what is the positional distance from body part X to body part Y compared to the positional distance from body part X to body part Z, instead of an absolute coordinate system.
It should be appreciated that the KINECT system typically provides sets of skeleton data representing the position of a player at an instant in time (e.g., each set of skeleton data received from KINECT can include the X/Y/Z coordinates of respective joints). The game running on game platform 104 can then combine the multiple sets of skeleton data to determine motion. The operation of camera 106, and how the data provided therefrom can be processed is described in further detail in U.S. application Ser. No. 12/940,794, filed on Nov. 5, 2010, and Ser. No. 13/828,035, filed on Mar. 14, 2013, both of which are incorporated by reference herein in their entirety. In particular, paragraphs 4-50 (among other paragraphs) of the published application for U.S. application Ser. No. 12/940,794 (i.e., U.S. Pub. No. 2011/0306396) describe the operation of video camera sensors that can track the position of different parts of a player's body. Also, pages 2-9 (among other pages) of the application as filed for U.S. application Ser. No. 13/828,035 also describe the operation of video camera sensors that can track the position of different parts of a player's body.
During gameplay, the game console 104 can output audio such as a musical soundtrack via the audio speakers 116. At the same time, the game console 104 can cause the display screen 108 to display cues that instruct the player 102 to perform certain gestures at specific points in time. The cues displayed by display screen 108 can be timed to coincide with musically significant events in the musical soundtrack played through the audio speakers 116. For example, the cues can correspond to downbeats or to particular climaxes or crescendos in the musical soundtrack.
Using the camera 106, the game console 104 can track in real-time the movement of one or more parts of the body of the player 102 such as the player's left and right hands. By analyzing the positional information (e.g., analyzing the skeleton data received over time) received from the camera 106, the game console 104 can determine whether the player has successfully completed the requested gestures, and can alter a characteristic of the gameplay as a result. For example, the game console 104 can award points depending on how well the player 102 executes gestures in time with the music, or output visual or auditory special effects depending on the actions of the player 102. Once the player has completed (or missed) one cue, the game console 104 can cause the display screen 108 to display the next cue for the next gesture. In some embodiments, the game console 104 can display two or more cues simultaneously, which can be completed by one player using both hands (or using two different body parts), or by two separate players when the game console 104 is operating in a two-player mode. In some embodiments, the game console 104 can be configured to display a succession of cues as part of a song, and can keep track of the player's performance in completing the indicated gestures. At the end of the song, the game console 104 can be configured to display a cumulative score for the player that is based on the player's performance. A gesture can include one or more movements of one or more joints and/or body parts, during one or more times and/or time windows.
Some cues can take the form of a multi-part cue, wherein one part indicates a gesture to be performed by a player and another part indicates a timing of the gesture, including the occurrence of a preparation period leading up to the time at which the gesture is to be performed. Some of the parts of the cue can be stationary, partially fixed, and/or moving. In some embodiments, multiple parts of the cue can collide, or the distance between them can increase or decrease, to indicate when the player is to perform a gesture. For example, in one embodiment a two part cue indicating a gesture to be performed can include a first part that indicates a gesture to be performed by the right and/or left hand of the player and a second part that indicates when the gesture is to be performed and a preparation period leading up to that time. When the first and second parts collide, this can provide an indication of the time when the player is to perform the gesture. Also, the movement of one part along a trajectory can give the player an indication of the time when the player is to perform the gesture. Additionally, in some embodiments the “gesture” to be performed can include keeping a body part motionless for a period of time (e.g., holding the player's right hand still in an extended position for a period of time).
The dart 206 can take on the appearance of any recognizable shape. In the exemplary embodiment depicted in
In the embodiment shown in
Although the swipe gesture should ideally be performed at the trigger time, the game can be configured to determine that the gesture is successfully completed if the player moves his or her hand in the appropriate direction within a time window of the trigger time (e.g., within a short time before or after the trigger time). In other words, if t is the trigger time, the game can be configured to determine that the gesture is successfully completed if the player moves his or her hand in the appropriate direction anytime between t−Δt1 and t+Δt2, where Δt1 and Δt2 are typically designer-configurable durations of time, but can also be preprogrammed and/or user-adjustable. In one embodiment, Δt1 and Δt2 can be 100 ms. In other embodiments, Δt1 and Δt2 can be of different lengths. Thus, at a high-level, the player can still “get credit” for performing the gesture if the player performs the gesture slightly before or after the trigger time t. In some embodiments, the amount of credit that the player earns can be a function of when the player performed the gesture with respect to the trigger time t. For example, the player can earn more points if they perform the gesture closer to the trigger time t (and, if the gesture takes time to execute, the game can be configured to evaluate, for example, whether the beginning time, the middle time, or the ending time is close to the trigger time).
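By way of non-limiting illustration, the sketch below shows one way the timing-window check and timing-based scoring described above could be expressed. The 100 ms defaults, the linear scoring curve, and the function names are assumptions introduced here for clarity rather than details of any particular embodiment.

```python
def in_timing_window(gesture_time, trigger_time, dt_before=0.1, dt_after=0.1):
    """Return True if the gesture occurred within the window around the trigger time.

    dt_before and dt_after correspond to the designer-configurable durations
    referred to above as delta-t1 and delta-t2 (100 ms in the example).
    """
    return (trigger_time - dt_before) <= gesture_time <= (trigger_time + dt_after)


def timing_score(gesture_time, trigger_time, max_points=100, dt_before=0.1, dt_after=0.1):
    """Award more points the closer the gesture is performed to the trigger time."""
    if not in_timing_window(gesture_time, trigger_time, dt_before, dt_after):
        return 0
    window = dt_before if gesture_time < trigger_time else dt_after
    closeness = 1.0 - abs(gesture_time - trigger_time) / window
    return int(max_points * closeness)
```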
The game space 202 can further include a plurality of particles 210 interspersed throughout, which can appear as motes or particles of light and color, or which can exhibit the appearance of other small shapes. These particles 210 can move through game space 202 (or stay still) and can be altered to enhance gameplay and/or provide feedback to the user. For example, the particles 210 can be programmed to respond to the motions of the player's hand 208. As another example, the particles 210 can move in the direction of the player's hand. Furthermore, if the player successfully completes the swipe gesture, the particles 210 can change color, increase their brightness or size, or swirl in an agitated state. The number of particles does not have to be constant, and can increase or decrease throughout the game, whether in response to the player's actions or independent of the player's actions. For example, if the player completes a gesture or a number of gestures correctly, the number of particles can increase. The particles 210 can therefore provide a more immersive and interactive experience for the player 102.
While
In operation, referring to
At stage 250 the game can determine the current time. This can be done using, for example, a timer or clock in the game console 104. The time can also be measured using a beat clock, such that time is indexed using a current beat or fraction of a beat instead of real time (e.g., seconds or milliseconds). Indexing time using a beat clock can differ from indexing using real time because the beat can be variable through the duration of a song.
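As a hypothetical illustration of the beat-clock concept, the following sketch converts a real-time timestamp into a (possibly fractional) beat index using a simple tempo map. The tempo-map layout and the example values are assumptions and are not part of the embodiments described above.

```python
# Hypothetical tempo map: (start_time_in_seconds, beats_per_minute) entries, in order.
TEMPO_MAP = [(0.0, 120.0), (30.0, 100.0), (60.0, 140.0)]

def real_time_to_beats(t, tempo_map=TEMPO_MAP):
    """Convert a time in seconds into a (possibly fractional) beat index.

    Because the tempo can change during a song, equal spans of real time can
    correspond to different numbers of beats, which is why a beat clock can
    differ from a real-time clock.
    """
    beats = 0.0
    for i, (start, bpm) in enumerate(tempo_map):
        if t <= start:
            break
        end = tempo_map[i + 1][0] if i + 1 < len(tempo_map) else float("inf")
        beats += (min(t, end) - start) * bpm / 60.0
    return beats
```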
At stage 252, in embodiments where a time window is used, the game can determine whether the time window for the swipe gesture has started yet, e.g., whether time t−Δt1 has arrived. If yes, the process 248 can continue to stage 254. Otherwise, the process 248 can continue back to stage 250.
At stage 254, the process 248 can check the velocity of a predetermined point, joint, and/or reference point on the player's body (in some embodiments, these can be points in a Kinect skeleton, as discussed above) such as the player's hand 208 (or other body part, such as the player's head, left or right elbow, shoulder, knee or foot). The velocity that the process 248 checks can include a subset of component velocities related to the player's hand 208 (or other body part), e.g., velocity in the X, Y, or Z direction. In some embodiments, the game can be configured to check the velocity of multiple predetermined points on the player's body (e.g., both of the player's hands) in stage 254. In some embodiments, camera 106 does not provide velocity information directly, but only positional information of parts, joints, and/or other reference points of the player's body at successive points in time (e.g., provides successive frames of positional data). Game console 104 can then compute velocity by sampling the position of a specific body part, joint, and/or reference point between 2 frames, and then dividing the displacement between these two positions by the time between the frames. The frames chosen need not be consecutive (e.g., game console 104 can consider only every third or fourth frame). In some embodiments, the frames that are chosen can be aligned with the beat of the music. In some of these embodiments, whether used in relation to this Figure or other Figures, the term “velocity” used herein need not refer to strict velocities (e.g., distance divided by a constant amount of real time) because “velocities” can be calculated using a beat clock, in which the duration of a beat can be variable throughout the course of a song.
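A minimal sketch of the frame-differencing approach described above might look like the following, assuming each skeleton frame stores joint positions as (x, y, z) tuples keyed by joint name; the data layout and names are illustrative only.

```python
def joint_velocity(frames, joint, i, j, frame_dt):
    """Estimate the velocity of one joint between skeleton frames i and j.

    frames: list of skeleton frames, each a dict mapping joint names to
            (x, y, z) positions.
    frame_dt: time between consecutive frames (in seconds, or in beats if a
              beat clock is used).
    The two frames need not be consecutive (i < j).
    """
    x0, y0, z0 = frames[i][joint]
    x1, y1, z1 = frames[j][joint]
    elapsed = (j - i) * frame_dt
    return ((x1 - x0) / elapsed, (y1 - y0) / elapsed, (z1 - z0) / elapsed)
```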
While this figure and other figures herein discuss measuring velocity, other characteristics can be used as well. For example, throughout the embodiments described herein, rather than determining the velocity, a displacement, speed, trajectory, and/or acceleration can be calculated instead. In each instance, this can be an instantaneous, average, mean, and/or median value (e.g., velocity can be derived by averaging the velocity computed over several different frames from the video camera).
At stage 256, the process 248 can check if the measured velocity of the player's hand 208 matches one or more pre-programmed swipe gesture criteria. Swipe gesture criteria can include, for example, the direction of the velocity of the hand 208 (e.g., left, right, up, down, towards the screen, or away from the screen). Swipe gesture criteria can also include, for example, a threshold magnitude of the velocity of the player's hand 208, such that a gesture is only completed if the player's hand 208 is moving at a certain speed (this threshold speed can be computed relative to some body unit indicative of the size of the player's body). In yet other embodiments, evaluating whether the velocity of the player's hand matches the swipe gesture criteria can include taking the dot product of the direction of the corresponding joint with the direction of the cue, and then determining whether the magnitude is greater than a threshold. In other embodiments, the game can compute the square of the cosine of the angle between the direction of the corresponding joint and the direction of the cue, which can narrow the "correct" band and can help prevent the game from responding to flailing arms. In this instance, since the swipe cue 204 is directing the player to move his hand towards the left, the swipe gesture criteria can require that the player's hand move in the appropriate direction (e.g., towards the left) with a certain minimum speed and/or distance. If the game determines that the velocity of, or distance traveled by, the player's hand 208 matches the swipe gesture criteria, the process 248 can branch to stage 262; otherwise, the process 248 can continue to stage 258. In some embodiments, if the process 248 checks the velocity of both of the player's hands in stage 254, the process 248 can be configured to determine whether the velocity of either hand satisfies the gesture criteria in stage 256. If there are multiple cues displayed, the game can check whether each cue that is available to a particular player was satisfied by the player's movement. If there are two cues in the same direction, one cue can be configured to be completed by the player's left hand, while the other cue can be configured to be completed by the player's right hand; in this way, only one cue will be satisfied by a swipe by one hand.
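One possible, non-limiting rendering of the direction and magnitude tests described above is sketched below. The threshold values, the assumption that the cue direction is a unit vector, and the use of the squared cosine are illustrative choices rather than requirements of any embodiment.

```python
import math

def matches_swipe(velocity, cue_direction, min_speed, min_cos_sq=0.5):
    """Check a hand velocity against a swipe cue.

    velocity: (vx, vy, vz) of the tracked joint.
    cue_direction: unit vector of the cue direction, e.g., (-1.0, 0.0, 0.0)
                   for a leftward swipe.
    min_speed: minimum speed, ideally scaled to a body unit indicative of
               the player's size.
    min_cos_sq: how tightly the motion must align with the cue; using the
                squared cosine narrows the "correct" band.
    """
    speed = math.sqrt(sum(c * c for c in velocity))
    if speed == 0.0 or speed < min_speed:
        return False
    dot = sum(v * d for v, d in zip(velocity, cue_direction))
    if dot <= 0.0:
        return False  # moving against the cue direction
    cos_sq = (dot / speed) ** 2  # cue_direction is assumed to be unit length
    return cos_sq >= min_cos_sq
```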
At stage 258, the process 248 can check whether the time window for the swipe gesture has expired (e.g., whether time t+Δt2 has arrived). If the time window has not yet expired, the process 248 can branch back to stage 254. Otherwise, the process can continue to stage 260.
At stage 260, the process 248 can indicate that the player 102 has missed the swipe gesture (e.g., the velocity of the player's hand never matched the swipe gesture criteria during the time window, or matched for less than a threshold time).
At stage 262, the process 248 can determine that the player 102 has completed the swipe gesture. The process 248 can, for example, credit the player 102 with “points,” unlock other features within the game, and/or alter a gameplay characteristic based on the determination that the player has completed the gesture.
In some embodiments, the process 248 can require that the player's movement match the gesture criteria for some minimum threshold amount of time (or fractional number of beats). This may require, for example, checks of the player's movement at multiple points in time, rather than just a single point in time.
In the embodiments shown in
While
In operation, referring to
At stage 350, the game can determine the current time. This can be done using, for example, a counter or clock in game console 104. The time can also be measured using a beat clock, such that time is indexed using a current beat or fraction of a beat instead of real time (e.g., seconds or milliseconds). Indexing time using a beat clock can differ from indexing using real time because the beat can be variable through the duration of a song.
At stage 352, in embodiments where a time window is used, the game can determine whether the time window for the push gesture has started yet, e.g., whether time t−Δt1 has arrived. If yes, the process 348 can continue to stage 354. Otherwise, the process 348 can continue back to stage 350.
At stage 354, the process 348 can check the velocity of the player's hand 308 (or other body part, such as the player's head, left or right elbow, shoulder, knee or foot). The velocity that the process 348 checks can include a subset of component velocities related to the player's hand 308 (or other body part), e.g., in the X, Y, or Z direction. In some embodiments, the game can be configured to check the velocity of both of the player's hands in stage 354, perhaps by using some of the procedure discussed above in relation to
At stage 356, the game can check if the measured velocity of the player's hand 308 matches one or more pre-programmed push gesture criteria. Push gesture criteria can include, for example, the direction of the velocity of the hand 308 (e.g., left, right, up, down, towards the screen, or away from the screen). Push gesture criteria can also include, for example, a threshold magnitude of the velocity of the player's hand 308, such that a gesture is only completed if the player's hand 308 is moving at a certain absolute speed. In this instance, since the push cue 304 is directing the player to move his hand towards the screen, the push gesture criteria can require that the player's hand move in the appropriate direction (e.g., towards the screen) with a certain minimum speed. If the game determines that the velocity of, or distance traveled by, the player's hand 308 matches the push gesture criteria, the process 348 can branch to stage 362; otherwise, the process 348 can continue to stage 358. In some embodiments, if the process 348 checks the velocity of both of the player's hands in stage 354, the process 348 can be configured to determine whether the velocity of either hand satisfies the gesture criteria in stage 356.

At stage 358, the process 348 can check whether the time window for the push gesture has expired (e.g., whether time t+Δt2 has arrived). If the time window has not yet expired, the process 348 can branch back to stage 354. Otherwise, the process can continue to stage 360.
At stage 360, the process 348 can indicate that the player 102 has missed the push gesture (e.g., the velocity of the player's hand never matched the push gesture criteria during the time window, or matched for less than a threshold time).
At stage 362, the process 348 can indicate that the player 102 has completed the push gesture. The process 348 can, for example, credit the player 102 with “points,” unlock other features within the game, and/or alter a gameplay characteristic based on the determination that the player has completed the gesture.
In some embodiments, the process 348 can require that the player's movement match the gesture criteria for some minimum threshold amount of time (or fractional number of beats). This may require, for example, checks of the player's movement at multiple points in time, rather than just a single point in time.
In the embodiment shown in
During a second time period, as depicted in
In some embodiments, the player holds his hand 408 in this position until the inner sustain cue 404 completely fills up the hollow circle formed by outer sustain cue 402, which signifies to the player that the sustain cue has been completed. This time can be referred to as the release time. At this point, the player can move his hand again. Just as with the beginning of the sustain cue, although the sustain gesture should ideally be completed at the release time, the game can be configured to determine that the gesture is successfully completed even if the player begins moving his hand slightly before the release time.
In operation, referring to
At stage 450, the game can determine the current time. This can be done using, for example, a timer or clock in the game console 104. The time can also be measured using a beat clock, such that time is indexed using a current beat or fraction of a beat instead of real time (e.g., seconds or milliseconds). Indexing time using a beat clock can differ from indexing using real time because the beat can be variable through the duration of a song.
At stage 452, in embodiments where time windows are used, the game can determine whether the time window for the start of the sustain gesture has started yet, i.e., whether trigger time t has arrived. If yes, the process 448 can continue to stage 454. Otherwise, the process 448 can continue back to stage 450.
At stage 454, the process 448 can check the velocity of the player's hand 408 (or other body part, such as the player's head, left or right elbow, shoulder, knee or foot). The velocity that the process 448 checks can include a subset of component velocities related to the player's hand 408 (or other body part), e.g., in the X, Y, or Z direction. In some embodiments, the game can be configured to check the velocity of both of the player's hands in stage 454, perhaps by using some of the procedure discussed above in relation to
At stage 456, the process 448 can check if the measured velocity of the player's hand 408 matches one or more of the sustain gesture criteria. The sustain gesture criteria can require, for example, that the velocity of the player's hand 408 be below a certain threshold magnitude in any direction. If the game determines that the velocity of the player's hand 408 matches the sustain gesture criteria, the process 448 can branch to stage 460; otherwise, the process 448 can continue to stage 458. In some embodiments, if the process 448 checks the velocity of both of the player's hands in stage 454, the process 448 can be configured to determine whether the velocity of either hand satisfies the gesture criteria in stage 456.
At stage 458, the process 448 checks whether the time window for the start of the sustain gesture has expired yet, i.e., whether time t+Δt has arrived. If the time window has expired, the game branches to stage 468, otherwise the game continues back to stage 456.
At stage 468, the process 448 determines that the player has missed the sustain gesture.
At stage 460, the game re-checks the time counter. This can be provided by a timer or clock in the game console 104. This can also be provided using a beat clock, as discussed above.
At stage 462, the game checks whether the time for the end of the sustain gesture (e.g., the release time) has arrived yet. If the release time has arrived, the process 448 branches to stage 470, otherwise, the process 448 continues to stage 464.
At stage 470, the game can indicate that the player 102 has completed the sustain gesture.
At stage 464, the game again checks the velocity of the player's hand 408 (or other body part). For example, the process 448 can use a process similar to that described in relation to stage 454. Stage 464 can also be modified in the ways discussed in relation to stage 454.
At stage 466, the game can check whether the velocity of the player's hand 408 (or other body part) matches further sustain gesture criteria. Such further sustain gesture criteria can require, for example, that the velocity of the player's hand 408 be below a certain threshold magnitude. The further sustain gesture criteria checked in stage 466 can be the same as those in stage 456, or they can be different from those in stage 456. If the velocity of the player's hand matches the further sustain gesture criteria, the process 448 can branch back to stage 460. Otherwise, the process 448 can branch to stage 468 and determine that the player has missed the gesture. Stage 466 can also be modified in the ways discussed in relation to stage 456.
The loop involving stages 460, 462, 464 and 466 can continue until the time for the end of the sustain gesture arrives (e.g., the release time, or a time slightly before the release time). At that point, the process 448 branches to stage 470, in which the process 448 can indicate that the player 102 has completed the sustain gesture. In some embodiments, the player can get partial credit for a sustain that was held for only some of the time between the start and release time. Also in some embodiments, the player's motion can go through a smoothing filter to produce a running average. This smoothing filter can remove jitter, which can be a side effect of using camera 106 to capture the player's movements. Smoothing filters can be used at any stage of any gesture recognizer (e.g., smooth the joint position, or smooth the velocity, or smooth the dot product of the velocity with the cue direction).
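The smoothing described above could, for example, be implemented as a running average over the last several frames, as in the non-limiting sketch below; the window size and class name are arbitrary illustrative choices.

```python
from collections import deque

class RunningAverageFilter:
    """Smooth a stream of joint positions (or velocities) to reduce camera jitter."""

    def __init__(self, window=5):
        self.samples = deque(maxlen=window)

    def update(self, sample):
        """Add a new (x, y, z) sample and return the running average so far."""
        self.samples.append(sample)
        n = len(self.samples)
        return tuple(sum(axis) / n for axis in zip(*self.samples))
```

As noted above, such a filter could be applied at any stage of a gesture recognizer, for example to the raw joint position, to the derived velocity, or to the dot product of the velocity with the cue direction.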
In one embodiment, the progress of current position cue 506 through the path cue 504 is timed to coincide with the musical score being played by speakers 116. For example, each path segment 504a, 504b, 504c . . . 504i can correspond to one musical beat or fraction of a beat, such that the player 102 must complete one path segment for each beat or fraction of a beat.
While
In operation, referring to
At stage 550, the process 548 can set a cumulative error meter to zero. This error meter can be a software variable kept track of by game console 104.
At stage 552, the process 548 can check a time counter. This can be provided by a timer or clock in the game console 104. The time can also be measured using a beat clock, such that time is indexed using a current beat or fraction of a beat instead of real time (e.g., seconds or milliseconds). Indexing time using a beat clock can differ from indexing using real time because the beat can be variable through the duration of a song. Depending on the time counter, the process 548 can identify which path segment (e.g., 504a, 504b, 504c, etc.) corresponds to the current time, and determine the direction that corresponds to this identified path segment.
At stage 554, the process 548 can check the velocity of the player's hand 508 (or other body part, such as the player's head, left or right elbow, shoulder, knee or foot). The velocity that the process 548 checks can include a subset of component velocities related to the player's hand 508 (or other body part), e.g., in the X, Y, or Z direction. In some embodiments, the game can be configured to check the velocity of both of the player's hands in stage 554, perhaps by using some of the procedure discussed above in relation to
At stage 556, the process 548 can determine the difference between the velocity of the player's hand 508 (e.g., the direction in which the player's hand is moving) and the direction which corresponds to the current identified path segment to output an instantaneous error score. To compute this instantaneous error score, the game can measure the angle between the line defined by the motion of the player's hand 508 and the line defined by the corresponding path segment. Additionally or alternatively, the game can compute the cosine of said angle, the square of the cosine of said angle, the sine of said angle, the square of the sine of said angle, and/or some other mathematical function of said angle to output the instantaneous error score. In the embodiments where the process 548 checks the velocity of both of the player's hands in stage 554, the process 548 can be configured to determine which hand to track for the purposes of determining whether the player has completed the path gesture. For example, if the player's right hand is moving in the direction indicated by the currently identified path segment, but the player's left hand is not moving in the indicated direction, the process 548 can track the player's right hand. In other embodiments, the process 548 can also be configured to continue focusing on only one hand for the duration of one path gesture after making an initial determination to follow that hand.
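By way of non-limiting example, the instantaneous error score could be computed from the angle between the hand's motion and the current path segment as sketched below; the use of two-dimensional vectors and of a squared-cosine error measure are assumptions made here for illustration.

```python
import math

def instantaneous_error(hand_velocity, segment_direction):
    """Score how far the hand's motion deviates from the current path segment.

    Returns a value between 0.0 (moving exactly along the segment) and 1.0,
    using one minus the squared cosine of the angle between the two
    directions, with backward motion treated as maximal error.
    """
    vx, vy = hand_velocity
    dx, dy = segment_direction
    v_len = math.hypot(vx, vy)
    d_len = math.hypot(dx, dy)
    if v_len == 0.0 or d_len == 0.0:
        return 1.0  # no usable motion counts as maximal error
    cos_angle = (vx * dx + vy * dy) / (v_len * d_len)
    return 1.0 - max(cos_angle, 0.0) ** 2
```

A cumulative error meter could then simply accumulate these instantaneous values as the player traces the path, and the total could be compared against a maximum error threshold once the last path segment has passed.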
At stage 558, the process 548 can increment the cumulative error meter based on the instantaneous error score determined in stage 556.
At stage 560, the game checks whether the time for the last path segment has passed. If yes, the process 548 can branch to stage 562; otherwise, the process 548 can branch to stage 552.
At stage 562, the game can check the value in the cumulative error meter. If the cumulative error meter is below a max error threshold, the game can branch to stage 566. Otherwise, the game can branch to stage 564.
At stage 564, the process 548 can indicate that the player has failed to complete the path gesture.
At stage 566, the process 548 can indicate that the player has successfully completed the path gesture.
In some embodiments, a path segment can be sufficiently long that the process 548 can be configured to check the movement of the player's hand more than once during the same path segment. In other words, the process 548 can be configured to repeat stages 552, 554, 556 and 558 multiple times for the same path segment. Also in some embodiments, the player can succeed or fail at individual segments of the path gesture, not only the whole path.
Now, turning to the music played by the game, at certain points in the game, the musical soundtrack played by speakers 116 can branch into one of several alternate paths. In some embodiments, the soundtrack played by the speakers can be a mix of several soundtracks, in which the soundtracks can be pieces of instrumentation, such as percussion section, string section, guitar, vocals, and keyboard. Branching into different alternate paths can include muting and unmuting selected soundtracks. These alternate paths can feature substantially the same melody as each other, but can be differentiated from each other in that they can be played in different styles, for example, rock, classical, or jazz. The alternate paths can also feature solo performances by different types of instruments, for example: an electric guitar, piano, or trumpet. The choice gesture is a feature of the game by which a player 102 can select, through a push gesture and a swipe gesture, which of these alternate paths to branch to.
The instruments do not have to be melody-type instruments. For example, bass and drums are not typically considered melody instruments. The alternate tracks can play the same musical role as one another, but in different styles and/or different orchestrations (e.g., vocal section, percussion section, melody section, bass section). Some gestures allow a player to pick a genre for all of the instruments. Other gestures let a player pick different instruments and styles from one another for a single “track” or musical role (e.g., a player can pick a piano, synthesizer, or harpsichord, each playing its own style). In other words, the choice by the player does not necessarily only change the instrument, but can also change what music is played and/or its style. The choice by the player can also change the cues for the subsequent sections to match the new musical track (e.g., cue authoring can change along with the music).
As discussed in relation to push gestures above, the dart 600 can be traveling towards the push cue 602 at a first point in time. When the dart 600 contacts the push cue 602 at a second point in time, the player can be prompted to perform the push gesture. Immediately after performing the push gesture, the player can be prompted to choose which of the swipe gestures to perform. If the player follows swipe cue 604, the game can branch to the path associated with the piano 610. Similarly, if the player follows swipe cue 606, the game can branch to the path associated with the trumpet 612. Finally, if the player follows the remaining swipe cue, the game can branch to the path associated with the electric guitar 614. In some embodiments the push gesture can be omitted and the player can simply swipe in the chosen direction.
Although a piano, trumpet and electric guitar were selected for this example, the choice of these three instruments was arbitrary and any instrument can be substituted in place of instruments 610, 612, and 614 (e.g., a set of drums, violin, or clarinet). Furthermore, associating a representation of a particular instrument with a particular musical path does not necessarily signify that the path features a prominent solo by that instrument, but can instead signify that the path is played in a musical style commonly associated with that instrument. For example, the path associated with the electric guitar may not feature a solo by an electric guitar, but can be played in a hard rock style. Similarly, the path associated with a violin may not feature a solo by a violin, but can be played in a classical or orchestral style.
At certain points in the game, the player 102 can have an opportunity to score bonus points, or unlock other features of the game, or otherwise change an aspect of the gameplay by participating in a special feature. For example, the feature can be a “mini-game” that is presented to the player 102 at certain points or stages in the game in response to, for example, a user gesture, timing, and/or an accomplishment. One example of a feature is described in connection with
In order to “capture” an edge of the polyhedron 702, the player 102 can complete the swipe gesture associated with that edge. In this instance, in order to “capture” the edges 714 and 716, the player can complete the swipe gestures associated with swipe cues 704 and 706 respectively. The player can complete both swipe cues 704 and 706 by moving both of his hands in the appropriate direction at the same time.
After the second point in time, new swipe cues 718 and 720 can appear, as well as their associated darts, 722 and 724. As before, swipe cue 718 is associated with one edge of the polyhedron 702, in this case, edge 726, while swipe cue 720 is associated with another edge, in this case, edge 728. Once again, the player 102 can capture edges 726 and 728 by successfully completing the swipe gestures associated with swipe cues 718 and 720 respectively. The player can complete both swipe cues 718 and 720 by moving both of his hands in the appropriate direction at the same time.
The polyhedron 702 can continue rotating in different directions, e.g., right, left, up or down, at successive points in time until the player 102 has had at least one opportunity to capture all the edges. If the player 102 successfully captures all the edges of the polyhedron 702, the game can change at least one aspect of its gameplay, for example, unlocking special bonus new features, giving player 102 bonus points, or displaying special visual effects.
While the operation of the game has been described in the context of one-player operation, the game can also be played in two-player mode, in which two players play the game simultaneously. This can be accomplished by having both players stand side by side in front of the Kinect sensor 106 such that both players are within the Kinect sensor 106's field of view.
In two-player mode, the two players can complete both swipe cues. In one embodiment of the game, the player on the left completes the swipe cue on the left, i.e., swipe cue 802, while the player on the right completes the swipe cue on the right, i.e., swipe cue 804. In another embodiment of the game, the player on the left completes the swipe cue in the first color, while the player on the right completes the swipe cue in the second color, regardless of where in the game space 202 the swipe cues appear. In these two embodiments, the game can be configured to keep track of which player is assigned to which cue, and to keep track of each player's individual score, which is based on each player's individual performance with respect to completing his assigned cues, as well as a group score which reflects both players' performance as a team.
In another embodiment, the first color and the second color are the same color, and it does not matter which player completes which swipe cue, as long as both cues are completed. The game can require that one cue be completed by one player while the other cue is completed by the other player. In yet another embodiment, both cues can be completed by either player. In these last two embodiments, the game can be configured to keep track of a group score which reflects both players' performance as a team rather than individual scores. There may also be cues that need to be completed by both players; if such a cue is not completed by both players, either no player gets a score or no group bonus is given.
While
Turning to another aspect of the disclosed subject matter, the game system can include an on-screen cursor system that can allow a player to interact with two- or three-dimensional virtual worlds. In some embodiments, the two- or three-dimensional virtual world can contain a protagonist character whose actions are influenced by the player's movements. In other embodiments, the two- or three-dimensional virtual world will not contain a protagonist character. In either of the preceding embodiments, the two- or three-dimensional virtual world can contain a cursor in addition to or in place of the protagonist. The cursor is typically a player-controlled virtual object that can move around in the virtual world in two- or three-dimensions. The cursor system can take various forms including, for example: a one-element cursor and/or a “two-element” cursor.
In some embodiments, the cursor's location in the virtual world can be independent of the view of the world. For example, “looking around” in the virtual world by changing the view can be controlled independently from the cursor. In other words, the cursor does not have to control the view the player sees and the cursor can move independently of the field of view of the virtual world. As a further example, the player can pan left/right in the virtual world by stepping left/right in front of camera 106. In embodiments with a protagonist, based on the player moving left/right or forward/backward, the protagonist can look to the left or right or forward or backward, rotate the view, and/or can move his or her body in the virtual world. If the player lowers his or her body in front of the sensor system, the protagonist can also lower its body. The system can also be configured to detect crawling when the player's body rests on or close to the ground. The system can be made such that the protagonist jumps when the system detects the player jumping. If the player walks in place, the protagonist can walk forward in the virtual world. The speed of the protagonist's movement can depend on how fast the player “walks.” Alternatively or additionally, based on the player moving left/right or forward/backward, the virtual camera that determines the viewpoint in the virtual world can pan, truck, dolly, or move on another path.
For the one-element cursor, the screen can display a cursor that moves around the field of view based on the position of, for example, one of the player's hands, as detected by camera 106 (e.g., as shown in
As the one-element cursor moves around the virtual world, it can interact with virtual objects. Various techniques are possible to determine when and how the one-element cursor interacts with objects in the virtual world. For example:
In the foregoing examples, “within a predefined distance” can be determined using some or all of the three axes. For example, in a three-dimensional virtual world, the one-element cursor can interact with an object if it merely aligns with an object in the X/Y (but not Z) direction. In some embodiments, the nature of the interaction can be different depending on how it is triggered. For example, if the cursor is close to the object it can interact one way, and if it stays close for a certain period of time (perhaps without moving significantly), it can interact in a different way. An interaction can refer to any change in the object, for example, causing the virtual object to open, close, grow, shrink, move in some way, emit sounds, or otherwise change its appearance and behavior.
The game system can determine a Z-position of the one-element cursor using various techniques. For example, the Z-position of the one-element cursor can be determined by the Z-position of the player's hand, as determined by a depth image provided by camera 106. The Z-position of the cursor can also be controlled as a function of another of the player's joints (e.g., if the player is controlling the cursor with their right hand, the Z-position can be controlled by moving their left hand up and down). The Z-position can also be controlled as a function of the player's body position (e.g., as the player moves closer to the camera system, the cursor can move proportionally in the virtual world). The Z-position of the cursor can also be determined as a function of the Z-position of other nearby (in the X-Y direction) objects in the virtual world.
Typically, the cursor is controlled by following a single hand (e.g., the left or right), but there can be anomalies if a player brings their other hand up and then puts the original hand down. To handle this smoothly, the game can use a weighted position between the player's left and right hands. For example, the game can allow the player to raise both hands and have the cursor at a weighted position between the hands, depending on which hand is closer, etc. If both hands are raised, the cursor would be somewhere in the middle of the player's two hands.
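A non-limiting sketch of one possible weighting scheme is shown below. Basing the weights on how far forward or how high each hand is raised is an assumption introduced here for illustration; other weighting functions could be used.

```python
def weighted_cursor_position(left_hand, right_hand, left_weight, right_weight):
    """Blend the two hand positions into a single cursor position.

    left_hand, right_hand: (x, y, z) joint positions.
    left_weight, right_weight: non-negative weights, e.g., based on how far
    forward or how high each hand is raised; equal weights place the cursor
    midway between the two hands.
    """
    total = left_weight + right_weight
    if total == 0.0:
        return left_hand  # neither hand active; fall back to a single hand
    wl, wr = left_weight / total, right_weight / total
    return tuple(wl * l + wr * r for l, r in zip(left_hand, right_hand))
```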
The game can also use “cursor holders,” which can be visually indicated as white semi-circles that face in the direction of the hand they belong to (e.g., a right-oriented semi-circle can indicate the player's right hand). The purpose of the cursor holders can be to show the player where the game thinks their hand is. The cursor holders can follow the same logic as the cursor for mapping the player's physical hand position to the virtual space. That is, the hand frames and the near/far/wall cursor planes (described below) can all affect the cursor holders. These cursor holders can be useful, for example, when the player raises both hands, which can result in the cursor being positioned somewhere in the middle of the player's two hands (as described above), and therefore somewhere in the middle of two cursor holders. By displaying the cursor between two cursor holders, the player can visually see how the movement of both of his hands affects the position of the cursor.
For a two-element cursor, the player can move a primary cursor similar to the one-element cursor. When the primary cursor is near a certain object (e.g., within a predetermined distance of a wheel in the virtual world, either in two or three dimensions), the primary cursor's motion can become constrained in some way. For example, it can "stick" to the object or slide alongside the object. In this interaction mode, because the player no longer has full control over the primary cursor, a secondary cursor (called the "secondary element") can appear near the primary cursor and remain freely controlled by the player's hand or other body parts. This secondary element can be visually connected to the primary cursor, and can appear to pull or push the primary cursor. In this mode, the secondary cursor's motion can now be used to manipulate or change the state of the object that the primary cursor is constrained to. The user can move his or her hand or other body parts to slide or rotate the object in the virtual world. The speed of movement of the object can be determined by the speed of the player's body movement.
If the player moves his or her hand sufficiently far away, thus moving the secondary cursor away from the object, then the interaction mode can disengage. Then, the primary cursor can be detached from the object and the secondary cursor can disappear, returning to the one-element cursor mode. Alternatively or additionally, moving the secondary cursor away from the object can cause the primary cursor to detach from the object if the secondary cursor is moved with sufficient speed.
In
In another example, when the player moves the primary cursor within a predetermined distance of a slider object, the primary cursor can become locked onto the slider object such that it can only slide on a single linear axis (thus causing the secondary cursor to appear). The player continues to move his hand around the slider object (thus moving the secondary cursor), but the primary cursor itself is constrained to move only along the slider axis. In this way, the player can move the slider object back and forth, by “pulling” the cursor along the slider, even if the secondary cursor is not moving parallel to a slider axis.
An object to which the primary cursor locks can be moved or rotated in all three dimensions. In this case, the player can slide an object in 1-D (e.g., like a light switch) or rotate an object in 2- or 3-D based on his or her movements.
In some embodiments, the two-element cursor can be implemented as described in the following non-limiting example.
The cursor can be treated as a virtual “cage” to which a virtual “core” is attached. Doing so can, for example, smooth the movement of the virtual cursor and make its movement more appealing to the player. While the cage follows the position of the player's hand, the virtual core can be configured to be tethered to the cage using a virtual elastic tether. That is, the core can be programmed to move around in the virtual space as if it was a mass with inertia that is attracted to the cage. In some embodiments, the core can be visible all the time, or most of the time, while the cage can be invisible most of the time (and only turn visible when certain conditions are met, as described in more detail below). The core and cage can follow the player's hand. The core can collide with objects in the virtual world using regular physics (e.g., the core can bounce or hang on objects, and objects can be affected by the core). In this embodiment, the visible core can appear to move more naturally, rather than following the player's hand exactly.
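The elastic tether between the cage and the core could, for example, be modeled as a damped spring, as in the non-limiting sketch below; the stiffness and damping constants and the per-axis integration scheme are illustrative assumptions rather than details of any embodiment.

```python
def update_core(core_pos, core_vel, cage_pos, dt, stiffness=40.0, damping=8.0):
    """Advance the core one time step toward the cage as a damped spring.

    core_pos, core_vel, cage_pos: (x, y, z) tuples in virtual-world units.
    dt: simulation time step in seconds.
    Returns the new (position, velocity) of the core.
    """
    new_pos, new_vel = [], []
    for p, v, target in zip(core_pos, core_vel, cage_pos):
        accel = stiffness * (target - p) - damping * v  # spring pull minus damping
        v = v + accel * dt
        p = p + v * dt
        new_pos.append(p)
        new_vel.append(v)
    return tuple(new_pos), tuple(new_vel)
```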
In some embodiments, instead of bouncing off objects, the core can be configured to “stick” to objects after a collision. The core can become stuck to objects using magnetization based on two-dimensional and/or three-dimensional proximity. Alternatively, the core can become stuck to objects if it is held in proximity to an object for a predetermined time period. The core can also be configured to stick to only certain types of objects but not others.
Once the core is stuck, the cage can become visible (if it wasn't before). In some embodiments, the cage can appear as the secondary cursor which can be manipulated by the player's hand (or other body part). As described above, the player can manipulate the secondary cursor or cage in order to change the appearance or some other aspect or property of the object to which the core is stuck. In other embodiments of the two-element cursor, the secondary cursor can be comprised of a second cage/core combination.
When the core is stuck to an object and the player moves his or her hand slightly away from the object, the core can become separated from the cage. In that case, the game can be configured to display particles (e.g., a stream of motes of light or small shapes) that are emitted from the core and are attracted to the cage to indicate the separation. In some embodiments, the cage can be non-visible even if the core is stuck to an object, such that particles emitted from the core are attracted to the virtual position of the player's hand without there being a visual indicator of where the player's hand is. In yet other embodiments, the core need not be stuck to a fixed point in the virtual world but can instead be constrained to move along a predefined path in the virtual world (e.g., a spline). The core can appear to be pulled along this predefined path by the attraction between the virtual location of the player's hand and the core. One example of this is a “mix switch,” which can appear as a toggle switch or slider that allows the player to switch between different audio treatments. Since the mix switch can activate only along a single axis, the core can be constrained to move only along that axis, while the cage can move freely with the user's hand.
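As a minimal sketch of constraining the core to a predefined path, the path can be approximated by sampled points and the core snapped to whichever sample lies closest to the cage; this sampled-point representation is an assumption of the sketch rather than a description of any particular spline implementation.

```python
# Sketch: the core is constrained to a predefined path, represented here by a
# list of sampled points; the core snaps to the sample nearest the cage (hand).
import math

def constrain_to_path(cage_pos, path_points):
    return min(path_points, key=lambda p: math.dist(p, cage_pos))

# A "mix switch" is simply a path along one axis, so the core can only move
# along that axis while the cage follows the hand freely.
mix_switch_path = [(x / 10.0, 0.5) for x in range(11)]
core_pos = constrain_to_path((0.37, 0.9), mix_switch_path)   # -> (0.4, 0.5)
```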
In one embodiment, the way in which the core moves can be quantized to the beat of the music. For example, when the player moves his or her hand across the strings of a harp, the core can strum the notes of the harp and appear to “hang up,” or pause, at each string, playing 8th notes. Meanwhile, the cage (which can be invisible) can still track the user's hand. This can address the problem of indicating musical quantization in the UI while still giving the user feedback on his or her hand position.
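A minimal sketch of this quantized “hang up” behavior, assuming an 8th note is half a beat and that a beat clock for the song is available, is:

```python
# Sketch: the core advances at most one harp string per 8th note, while the
# cage continues to track the hand every frame. Treating an 8th note as half a
# beat and the song_beats clock are assumptions of this sketch.

def advance_core_string(core_string, hand_string, song_beats, last_advance_beat):
    EIGHTH_NOTE = 0.5
    if core_string != hand_string and song_beats - last_advance_beat >= EIGHTH_NOTE:
        step = 1 if hand_string > core_string else -1
        return core_string + step, song_beats      # the note for this string plays here
    return core_string, last_advance_beat          # "hang up" until the next 8th note
```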
In some embodiments, if the player moves his hand such that the cage moves more than a predetermined distance (in two- or three-dimensional space) away from the core, the core can “break away” from the object to which it is stuck.
In operation, referring to
At stage 1002, the process 1000 can define one “unit” based on the distance between two or more joints in the skeleton provided by camera 106. For example, one unit can be defined to be the distance between the player's head and the player's hip. This unit distance can be used for further computations, as described below. In other embodiments, different computations can use a different set of joints for measurement—some of these alternative embodiments are discussed in further detail with regard to specific computations below. By defining one unit in a way that is relative to a player's body dimensions, the process 1000 can be configured to function for people of different sizes.
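A minimal sketch of defining one unit from the skeleton, assuming hypothetical joint names and example coordinates, is:

```python
# Sketch: define one "unit" from the head-to-hip distance of the tracked
# skeleton so that later thresholds scale with the player's body size.
# The joint names and 3-D coordinates are assumptions; real skeleton APIs differ.
import math

def body_unit(skeleton):
    return math.dist(skeleton["head"], skeleton["hip_center"])

skeleton = {"head": (0.0, 1.6, 2.5), "hip_center": (0.0, 1.0, 2.5)}
unit = body_unit(skeleton)   # 0.6 for this example skeleton
```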
At stage 1004, the process 1000 can determine the position and size of a hand frame. A hand frame can include a region defined by a plurality of coordinates, either in two dimensions or in three dimensions, which encompasses an expected range of motion of a player's hand. In some embodiments, the process 1000 can determine the position and size of two hand frames: one for each of the player's hands. For example, a left hand frame can be a fixed number of units high (e.g., 1 unit), and a fixed number of units wide (e.g., 1 unit).
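A minimal sketch of constructing such a hand frame is shown below; anchoring the frame at the player's shoulder is an assumption of the sketch rather than a requirement of the process.

```python
# Sketch: a 1-unit-by-1-unit left hand frame, placed relative to the left
# shoulder (an assumed anchor point).

def left_hand_frame(skeleton, unit):
    shoulder_x, shoulder_y = skeleton["shoulder_left"][:2]
    width, height = 1.0 * unit, 1.0 * unit
    # The frame extends outward (to the player's left) and downward from the shoulder.
    return {"x": shoulder_x - width, "y": shoulder_y - height, "w": width, "h": height}
```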
If multiple measurements of the distances discussed in relation to
Returning to
At stage 1008, the cursor's position in the virtual space can be determined based on the player's hand position. For example, the X and Y position of the cursor can be based on the normalized position of the player's hand in the hand frame. For example, if the normalized position of the player's hand on the X axis is 0, the X position of the cursor can be at the left side of the screen displayed to the user (e.g., a viewport). Similarly, if the normalized position of the player's hand is 1, the X position of the cursor can be at the right side of the screen displayed to the user. A similar mapping system can be used for the Y dimension, where Y=0 corresponds to the bottom of the screen displayed to the user, and Y=1 corresponds to the top of the screen displayed to the user. In embodiments where the virtual space is in three dimensions, and camera 106 is capable of tracking the Z position of the player's hand, the cursor's Z-position in the virtual space can be based on the normalized Z-position of the player's hand, as described above for the X and Y dimensions.
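A minimal sketch of this 0-to-1 mapping, assuming screen coordinates in which Y grows downward, is:

```python
# Sketch of the mapping described above: normalize the hand position within its
# hand frame, then map that normalized position onto the viewport.

def _clamp01(v):
    return max(0.0, min(1.0, v))

def normalized_hand_position(hand_xy, frame):
    nx = _clamp01((hand_xy[0] - frame["x"]) / frame["w"])
    ny = _clamp01((hand_xy[1] - frame["y"]) / frame["h"])
    return nx, ny

def cursor_screen_position(nx, ny, viewport_w, viewport_h):
    x = nx * viewport_w              # 0 -> left edge of the screen, 1 -> right edge
    y = (1.0 - ny) * viewport_h      # 0 -> bottom of the screen, 1 -> top
    return x, y
```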
With reference to
A ray can be projected in the virtual space from the camera through the X, Y position of the cursor. Said another way, a ray can be cast from the camera through the viewport plane. Two examples of such a ray are illustrated in
For example, say a virtual scene has a wall that is 10 meters away from the camera and spans the left half of the viewport, and another wall that is 20 meters away and spans the right half of the viewport. Both walls can be marked as far-plane objects. Suppose further that the player's left hand starts out extended, all the way to the left of the hand frame, and gradually moves to the right. Under the embodiments described in the previous paragraph, the cursor will initially map to the far-plane object on the left, the 10 meter wall. As the player's hand moves to the right, the ray cast from the camera eventually crosses onto the 20 meter wall, and the cursor can jump to the depth of that wall. Even though the player's hand is at z=1.0 the whole time, the cursor should appear at different depths in the virtual space, depending on the positions of near-plane and far-plane objects.
In embodiments using automatic Z-position control, as the cursor moves around the virtual world, it can take on the Z-position that matches the object that it is in front of (relative to the point of view of the player). For example, if the cursor is in front of a “near” object, the cursor can be mapped (in the Z-direction) to the near plane. Likewise, if the object it is in front of is in the far plane, the cursor can be mapped (in the Z-direction) to the far plane.
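A minimal sketch of this automatic Z-position behavior, assuming the ray cast returns a nearest-first list of hit objects tagged with a plane and a depth, is:

```python
# Sketch of automatic Z-position: cast a ray from the virtual camera through
# the cursor's X,Y position and give the cursor the depth of the first
# near-plane or far-plane object the ray hits. The hit-list representation and
# attribute names are assumptions.

def cursor_depth(ray_hits, default_depth):
    """ray_hits: objects intersected by the ray, ordered nearest-first, each
    with a .plane tag ("near" or "far") and a .depth in the virtual space."""
    for obj in ray_hits:
        if obj.plane in ("near", "far"):
            return obj.depth      # e.g., the 10 m or the 20 m wall in the example above
    return default_depth
```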
In yet other embodiments, objects in the virtual space can additionally be marked as “wall objects.” Wall objects, such as wall object 1310 in
Returning again to
For example, at stage 1010, the overall left-right position of the player's body can be determined based on one or more joints in the skeleton, such as the center hip position. The overall left-right position can be normalized into the range of 0 to 1 based on the visual range of camera 106: if the player is all the way at the leftmost edge of the camera's field of view, the overall X-value can be 0, and if the player is all the way at the rightmost edge of the camera's field of view, the value can be 1. The process 1000 can also determine the overall height of the player's body (e.g., whether the player is crouching or standing straight), as well as the overall depth of the player's body (e.g., whether the player is standing close to or far away from camera 106).
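A minimal sketch of this normalization, assuming the horizontal bounds of the camera's field of view are known in the same coordinate space as the hip position, is:

```python
# Sketch: normalize the overall left-right body position into 0..1 across the
# camera's horizontal field of view. The field-of-view bounds are assumed inputs.

def overall_body_x(hip_center_x, fov_left_x, fov_right_x):
    t = (hip_center_x - fov_left_x) / (fov_right_x - fov_left_x)
    return max(0.0, min(1.0, t))   # 0 = leftmost edge of the camera's view, 1 = rightmost
```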
At stage 1012, the process 1000 can cause the viewpoint of the virtual camera to track the overall position of the player's body. For example, if the player's body is situated all the way to the left of the visual range of camera 106 (e.g., the overall X-value from stage 1010 is 0), the virtual camera can be configured to pan to the left. If the player's body is crouching, the virtual camera can be configured to pan down. If the player's body is standing close to camera 106, the viewpoint can be configured to zoom in. Alternatively, user gestures can be used to control the viewpoint of the camera. For example, the player can perform a zoom-in gesture with both hands: if the player wants to zoom in on a part of the virtual world that is behind the cursor, the player can stand with both hands outstretched in front of his or her body and then part the arms wide to the side (e.g., similar to the motion used in the breaststroke style of swimming). The reverse can be used to zoom out: the player can begin with each arm stretched to the side and then swing both arms in toward the middle of the body until they are stretched out in front of the player's chest. In embodiments with a protagonist, the “zoom in” and “zoom out” gestures can actually move the protagonist and/or camera toward or away from the location of focus, or can simply change the view while the protagonist stays still. Alternatively, if the player is moving the cursor with one hand, the player can move the cursor to something that can be zoomed in on, raise the other hand (which “primes” the zoom-in and brings in additional user interface elements), and then spread both hands apart, as described previously, to zoom in.
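As one non-limiting sketch, the two-handed zoom-in gesture could be detected from the recent history of the distance between the player's hands; the thresholds and the history-based approach are assumptions for illustration.

```python
# Sketch: detect the two-handed "zoom in" gesture, in which both hands start
# close together in front of the body and then spread wide apart.
# Thresholds are expressed in body units and are assumed values.

START_TOGETHER = 0.4   # hands closer than this primes the gesture
SPREAD_APART = 1.5     # hands farther than this completes the zoom in

def detect_zoom_in(separation_history):
    """separation_history: recent left-to-right hand distances, oldest first."""
    if len(separation_history) < 2:
        return False
    half = len(separation_history) // 2
    started_together = any(d < START_TOGETHER for d in separation_history[:half])
    ended_apart = separation_history[-1] > SPREAD_APART
    return started_together and ended_apart

# Swapping the two conditions (start apart, end together) would detect "zoom out".
```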
In some embodiments, if it is possible to navigate left and right in the virtual world, for example by stepping left and right to cause the virtual camera to pan, then the overall left-right position can affect the position of the cursor by shifting the cursor's location. For example, assume that the normalized position of the player's left hand in the left hand frame is (0,0), and that this maps to a cursor position of (X,Y) in the virtual space (ignoring Z for the time being). Assume that the player leaves his or her hand in the same position relative to the body and then steps to the left, panning the camera to the left. Though the normalized position of the player's left hand in the left hand frame is still (0,0), the overall left-right position of the player's body decreases, so the cursor's X position would decrease accordingly. In some embodiments, the cursor can remain at the same location on the screen but may correspond to a different place in the virtual world. In other embodiments, however, if the normalized position of the player's left hand in the left hand frame remains constant even though the overall left-right position of the player's body decreases, the cursor's X position can be configured to remain constant.
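A minimal sketch of the first behavior (the cursor's world position shifting with the camera pan), assuming a simple linear pan model and hypothetical parameter names, is:

```python
# Sketch: the cursor's world X combines the camera pan (driven by the overall
# body position) with the hand's normalized position in its hand frame.

def cursor_world_x(normalized_hand_x, overall_body_x, pan_range, view_width):
    camera_pan = overall_body_x * pan_range        # stepping left/right pans the view
    return camera_pan + normalized_hand_x * view_width
```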
Other embodiments are within the scope and spirit of the disclosed subject matter.
For example, while portions of the foregoing description have focused on comparing velocity to a target criterion to determine whether a player successfully performed a gesture, other embodiments are possible. For example, instantaneous velocity, average velocity, median velocity, or displacement can also be compared to target criteria. As discussed above, “velocity” can be computed in terms of beats rather than real time (e.g., seconds), and the duration of a beat may vary throughout a song. The term “displacement” is sometimes used in this application to refer to velocities that can be relative to either kind of time.
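A minimal sketch of expressing a displacement as a velocity in beats rather than seconds, assuming a constant tempo for simplicity, is:

```python
# Sketch: express a measured displacement as units per beat rather than units
# per second, so the same motion is judged consistently regardless of tempo.

def velocity_in_beats(displacement_units, elapsed_seconds, tempo_bpm):
    elapsed_beats = elapsed_seconds * tempo_bpm / 60.0
    return displacement_units / elapsed_beats

# 0.5 units over 0.25 s at 120 BPM is 0.5 units over half a beat: 1.0 units per beat.
print(velocity_in_beats(0.5, 0.25, 120.0))
```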
Throughout this application, reference is made to beat information. Beat information (e.g., a beat timeline) can come from many sources, including song metadata stored in MIDI files. For example, MIDI files typically include a way to encode a fixed tempo for an entire track, but can also have a tempo track that encodes detailed tempo information throughout the duration of a song. Using this detailed tempo information, it is possible to accurately map from measure:beat:tick to minutes:seconds, even if the tempo varies during the course of a song.
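A minimal sketch of such a mapping, using a MIDI-style tempo track represented as (tick, microseconds-per-quarter-note) pairs and an assumed tick resolution, is:

```python
# Sketch: convert a tick position to seconds using a tempo track of
# (tick, microseconds_per_quarter_note) changes. The resolution and the
# example tempo track are assumptions for this sketch.

TICKS_PER_QUARTER = 480

def ticks_to_seconds(target_tick, tempo_track):
    seconds = 0.0
    entries = tempo_track + [(target_tick, None)]      # sentinel to close the last segment
    for (tick, usec_per_qn), (next_tick, _) in zip(entries, entries[1:]):
        if tick >= target_tick:
            break
        span_ticks = min(next_tick, target_tick) - tick
        seconds += (span_ticks / TICKS_PER_QUARTER) * (usec_per_qn / 1_000_000.0)
    return seconds

# 120 BPM for the first two quarter notes, then 60 BPM: tick 1440 maps to 2.0 seconds.
tempo_track = [(0, 500_000), (960, 1_000_000)]
print(ticks_to_seconds(1440, tempo_track))
```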
The above-described techniques can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. The implementation can be as a computer program product, i.e., a computer program tangibly embodied in a machine-readable storage device, for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, a game console, or multiple computers or game consoles. A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or game console or on multiple computers or game consoles at one site or distributed across multiple sites and interconnected by a communication network.
Method steps can be performed by one or more programmable processors executing a computer or game program to perform functions by operating on input data and generating output. Method steps can also be performed by, and apparatus can be implemented as, a game platform such as a dedicated game console, e.g., PLAYSTATION® 2, PLAYSTATION® 3, or PSP® manufactured by Sony Corporation; NINTENDO WII™, NINTENDO DS®, NINTENDO DSi™, or NINTENDO DS LITE™ manufactured by Nintendo Corp.; or XBOX® or XBOX 360® manufactured by Microsoft Corp.; or special purpose logic circuitry, e.g., an FPGA (field programmable gate array), an ASIC (application-specific integrated circuit), or other specialized circuit. Modules can refer to portions of the computer or game program and/or the processor/special circuitry that implements that functionality.
Processors suitable for the execution of a computer program include, by way of example, special purpose microprocessors, and any one or more processors of any kind of digital computer or game console. Generally, a processor receives instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer or game console are a processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer or game console also includes, or is operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks. Data transmission and instructions can also occur over a communications network. Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in special purpose logic circuitry.
To provide for interaction with a user, the above described techniques can be implemented on a computer or game console having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, a television, or an integrated display, e.g., the display of a PSP®, or Nintendo DS.
The techniques described herein can be implemented using one or more modules. As used herein, the term “module” refers to computing software, firmware, hardware, and/or various combinations thereof. At a minimum, however, modules are not to be interpreted as software that is not implemented on hardware, firmware, or recorded on a non-transitory processor readable recordable storage medium (i.e., modules are not software per se). Indeed “module” is to be interpreted to always include at least some physical, non-transitory hardware such as a part of a processor or computer. Two different modules can share the same physical hardware (e.g., two different modules can use the same processor and network interface). The modules described herein can be combined, integrated, separated, and/or duplicated to support various applications. Also, a function described herein as being performed at a particular module can be performed at one or more other modules and/or by one or more other devices instead of or in addition to the function performed at the particular module. Further, the modules can be implemented across multiple devices and/or other components local or remote to one another. Additionally, the modules can be moved from one device and added to another device, and/or can be included in both devices.
The above described techniques can be implemented in a distributed computing system that includes a back-end component, e.g., as a data server, and/or a middleware component, e.g., an application server, and/or a front-end component, e.g., a client computer or game console having a graphical user interface through which a user can interact with an example implementation, or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet, and include both wired and wireless networks.
The computing/gaming system can include clients and servers or hosts. A client and server (or host) are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
This application incorporates by reference in their entirety U.S. application Ser. Nos. 12/940,794 and 13/828,035. To the extent that any portion of these two applications is inconsistent with the description herein, the description herein shall control.
This application claims priority to U.S. Provisional Patent Application Ser. No. 61/794,570, filed on Mar. 15, 2013, the contents of which are hereby incorporated by reference in their entirety.