The present invention relates generally to video games. More specifically, the present invention relates to automatically generating video game content based on data provided by a source external to the game.
Music-based video games are video games that rely on music for a dominant gameplay characteristic. These games have, in many cases, received a high degree of critical acclaim. However, even highly acclaimed music-based video games have not, to date, been as commercially successful as video games from other genres, nor have they been as commercially successful as recorded music products, such as compact discs and albums issued by popular musical artists.
At least one barrier to wider consumption of music-based video games has been the way in which those products are created, marketed, and distributed. Music-based video games are unusual in that, due to the strong emphasis on music in the game, a player's enjoyment of a music-based video game is directly related to the player's enjoyment of the specific music on which the video game is based. Consumer tastes in music vary widely, so a song or artist that is enjoyed by one consumer might be unappealing to a majority of other consumers. Consequently, music-based video games are subject to consumers' highly fragmented taste in music.
Historically, music-based video games generally have not been created based upon the music of a specific popular recording artist or the music under the control of the player of the video game, but rather on a collection of music licensed from a variety of artists or custom-produced for a “general audience” of video game consumers. This approach attempts to provide “something for everyone”, but in practice, the lack of focus fails to provide a critical mass of musical content that will be strongly appealing to any one individual's taste. To truly provide something for everyone, the content of the game should be dynamically configurable and based on the musical content selected by the player of the game.
The present invention provides systems and methods for creating video game content from music content, whether provided via an article of manufacture such as a compact disc (CD), digital versatile disc (DVD), or memory device such as a hard drive, read-only memory (ROM), or random access memory (RAM), or provided via wireless or wired network connections. The game code may be distributed with a device designed specifically for playing music, e.g., an MP3 player.
In summary, the invention is a music-based video game that creates itself from the game player's own favorite music. The inventive video game uses technology that automatically analyzes any song file selected by the player of the game and extracts the rhythm and structural data necessary to create a game level based on the selected song. This turns the game player's personal music collection into an interactive gaming experience. The gaming environment and challenges are created in response to the analyzed song content. In one embodiment, proper gameplay is required in order for the song to be heard correctly.
In one aspect, the invention relates to a method for dynamically creating video game content using musical content supplied from a source other than the game. Musical content is analyzed to identify at least one musical event. A salient musical property associated with the identified event is determined. A video game event synchronized to the identified musical event and reflective of the determined salient musical property associated with the identified event is created. In some embodiments, the determined salient musical property is timbre, pitch range, or loudness. In other embodiments, the musical event is output to the player when the player successfully executes the created game event. In other embodiments, the musical event is modified before it is output to the player based on the player's performance. In still other embodiments, the visual content of video games can be altered responsive to the determined salient musical property of musical events. In these embodiments, the video game can be any genre of game.
In another aspect, the present invention relates to a method for dynamically creating video game content using musical content from a source other than the game. Musical content is analyzed to identify at least one musical event. A video game event synchronized to the identified musical event is created. The at least one musical event is modified responsive to player input. The modified musical event is output.
In a further aspect, the present invention relates to a portable music and video device housing a memory for storing executable instructions and a processor for executing the instructions, the memory comprising instructions that cause the processor to execute a video game stored in the memory and having a game event that is synchronized to a musical event of musical content supplied from a source other than the video game and to display the video game on a display of the portable music device. In some embodiments, the device is an iPod. In other embodiments, the device is a PSP.
In still further aspects, the present invention relates to a method for altering at least one visual property of a video game responsive to musical content from a source other than the video game. A salient musical property associated with a musical event is determined and a visual property of the game is altered responsive to the determined property.
The foregoing and other objects, features, and advantages of the present invention, as well as the invention itself, will be more fully understood from the following description of various embodiments, when read together with the accompanying drawings, in which:
As used herein, creating a video game refers to creating a game level, a portion of a game level, an entire game that includes several game levels, the contents of the environment displayed to the user, the game elements used to generate the score of a game player, or any combination of those elements. As used in this specification, the term “music-based video game” refers to a game in which one or more of the dominant gameplay mechanics of the game are based on player interaction with musical content. One example of a music-based video game is Karaoke Revolution, sold by Konami Digital Entertainment, in which one of the dominant gameplay mechanics is reproducing, by a player's voice, the pitch and timing of notes from popular songs. Another example of a music-based video game is BeatMania, also sold by Konami, in which game players attempt to strike controller buttons in time to a musical composition. These and other examples are discussed below. In contrast, certain video games have historically utilized the likenesses of popular recording artists and/or music from popular recording artists for the games' soundtracks, but the gameplay itself was not based on player interaction with the soundtrack. One example of such a game is Def Jam Vendetta, sold by Electronic Arts. This is a wrestling game featuring popular hip-hop artists as wrestlers and music from those artists on the soundtrack. The gameplay itself, however, is based simply on wrestling and is not, therefore, “music-based” as that term is used in this specification.
Referring now to
It is, of course, understood that the display of three-dimensional “virtual” space is an illusion achieved by mathematically “rendering” two-dimensional images from objects in a three-dimensional “virtual space” using a “virtual camera,” just as a physical camera optically renders a two-dimensional view of real three-dimensional objects. Animation may be achieved by displaying a series of two-dimensional views in rapid succession, similar to motion picture films that display multiple still photographs per second.
To generate the three-dimensional space, each object in the three-dimensional space is typically modeled as one or more polygons, each of which has associated visual features such as texture, transparency, lighting, shading, anti-aliasing, z-buffering, and many other graphical attributes. The combination of all the polygons with their associated visual features can be used to model a three-dimensional scene. A virtual camera may be positioned and oriented anywhere within the scene. In many cases, the camera is under the control of the viewer, allowing the viewer to scan objects. Movement of the camera through the three-dimensional space results in the creation of animations that give the appearance of navigation by the user through the three-dimensional environment.
A software graphics engine may be provided which supports three-dimensional scene creation and manipulation. A graphics engine generally includes one or more software modules that perform the mathematical operations necessary to “render” the three-dimensional environment, which means that the graphics engine applies texture, transparency, and other attributes to the polygons that make up a scene. Graphic engines that may be used in connection with the present invention include Realimation, manufactured by Realimation Ltd. of the United Kingdom and the Unreal Engine, manufactured by Epic Games. Although a graphics engine may be executed using solely the elements of a computer system recited above, in many embodiments a graphics hardware accelerator is provided to improve performance. Generally, a graphics accelerator includes video memory that is used to store image and environment data while it is being manipulated by the accelerator.
Graphics accelerators suitable for use in connection with the present invention include: the VOODOO 3 line of graphics boards manufactured by 3dfx Interactive, Inc. of San Jose, Calif.; the RAGE line of graphics boards, manufactured by ATI Technologies, Inc. of Thornhill, Ontario, Canada; the VIPER, STEALTH, and SPEEDSTAR lines of graphics boards manufactured by S3, Inc. of Santa Clara, Calif.; the MILLENIUM line of graphics boards manufactured by Matrox Electronic Systems, Ltd. of Dorval, Quebec, Canada; and the TNT, TNT2, RIVA, VANTA, and GEFORCE256 lines of graphics boards manufactured by NVIDIA Corporation, of Santa Clara, Calif.
The special abilities of the graphics system are made available to programs via an application programming interface (API). DIRECT3D, a standard API manufactured by Microsoft Corporation of Redmond, Wash. may be used and provides some level of hardware independence. The API allows a program to specify the location, arrangement, alignment, and visual features of polygons that make up a three-dimensional scene. The API also allows the parameters associated with a virtual camera to be controlled and changed.
In other embodiments, a three-dimensional engine may not be used. Instead, a two-dimensional interface may be used. In such an embodiment, video footage of a band can be used in the background of the video game. In others of these embodiments, traditional two-dimensional computer-generated representations of a band may be used in the game. In still further embodiments, the background may be only slightly related, or unrelated, to the band. For example, the background may be a still photograph or an abstract pattern of colors. In these embodiments, the lane 110 may be represented as a linear element of the display, such as a horizontal, vertical or diagonal element.
The cues appear to flow toward the game player and are distributed on the lane 110 in a manner having some relationship to musical content associated with the game level. For example, the cues may represent note information (gems spaced more closely together for shorter notes and further apart for longer notes), pitch (gems placed on the left side of the lane for notes having lower pitch and on the right side of the lane for higher pitch), volume (gems may glow more brightly for louder tones), duration (gems may be “stretched” to represent that a note or tone is sustained), articulation, timbre, or any other time-varying aspects of the musical content. The elements 120 result from the analysis of the musical content associated with the game level. As described below, the elements 120 may be dynamically created from musical content provided by the player. Although shown in
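By way of non-limiting illustration only, the relationship between note information and gem appearance described above might be expressed in software along the lines of the following sketch; the names (Note, Gem, gem_for_note) and the pitch range used for placement are hypothetical and are not part of the described embodiments.

```python
# Illustrative sketch only; field and function names are hypothetical.
from dataclasses import dataclass

@dataclass
class Note:
    start_sec: float
    duration_sec: float
    pitch_hz: float
    volume: float        # 0.0 (quiet) .. 1.0 (loud)

@dataclass
class Gem:
    lane_position: float   # 0.0 = left edge of lane 110, 1.0 = right edge
    glow: float            # brighter gems for louder tones
    length: float          # "stretched" gems for sustained notes
    time_sec: float

def gem_for_note(note: Note, min_hz: float = 80.0, max_hz: float = 2000.0) -> Gem:
    """Map time-varying aspects of a note to the visual attributes of a gem."""
    # Lower-pitched notes toward the left of the lane, higher-pitched toward the right.
    span = max_hz - min_hz
    position = min(max((note.pitch_hz - min_hz) / span, 0.0), 1.0)
    return Gem(lane_position=position,
               glow=note.volume,
               length=note.duration_sec,
               time_sec=note.start_sec)

if __name__ == "__main__":
    print(gem_for_note(Note(start_sec=2.5, duration_sec=0.75, pitch_hz=440.0, volume=0.9)))
```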
Player interaction with the game element 120 may be required in a number of different ways. In one embodiment, the player may have to “shoot” the game element 120 by pressing a game controller button in synchronicity with the passage of the game element 120 under a target marker 140, 142, 144, much like the game play mechanics in two rhythm-action games published by Sony Computer Entertainment America for the PlayStation 2 console: FreQuency and Amplitude. In another embodiment, the player operates a “scoop” that slides back and forth along the lane 110 (or other visual display of the musical time axis). The player must keep the scoop aligned with the game elements as they flow toward the player, much like one of the game play mechanics featured in a rhythm-action game published by Koei, Gitaroo-man. The player may interact with the game using a traditional controller, such as a PlayStation 2 Controller. In other embodiments, the player may use a computer keyboard to interact with the game. In still other embodiments, the player may use specialized controllers to interact with the game, such as a Guitar Hero SG Controller, manufactured by RedOctane of Sunnyvale, Calif. or a USB microphone of the sort manufactured by Logitech International of Switzerland.
As the game elements 120 move along the lane 110, the musical data represented by the game elements 120 may be substantially simultaneously played as audible music. In some embodiments, audible music is only played (or only played at full or original fidelity) if the player successfully “performs the musical content” by shooting or scooping the game elements 120. In certain of the embodiments shown in
In addition to modification of the audio aspects of game events based on the player's performance, the visual appearance of those events may also be modified based on the player's proficiency with the game. For example, failure to execute a game event properly may cause game interface elements to appear more dimly. Alternatively, successfully executing game events may cause game interface elements to glow more brightly. Similarly, for embodiments such as
Referring now to
Referring now to
In other embodiments, the gaming platform may provide additional input devices allowing the player to “karaoke” more than just the vocal track. In embodiments in which the gaming platform is provided with a camera, the camera may be used to capture movements of the player such as the position and movements of the player's hands, allowing the player to attempt to play along with the drum track for a musical composition while singing. In other embodiments, the gaming platform may provide an input device having foot-actuable elements, such as dance pads of the sort manufactured by Red Octane of Sunnyvale, Calif. In these embodiments, the player's performance may be determined based on execution of vocal game events as well as “dance” game events.
Other examples of “sing-along” video games include Karaoke Revolution, sold by Konami Digital Entertainment; SingStar, sold by Sony Computer Entertainment; and Get On Da Mic, sold by Eidos.
Referring now to
Other examples of “dance along” video games include Dance Dance Revolution, sold by Konami Digital Entertainment; EyeToy: Groove, sold by Sony Computer Entertainment; and Bust A Groove, sold by Square Enix. Further examples include “In the Groove,” sold by RedOctane; “Pump It Up,” sold by Andamiro; “Dance Factory,” sold by Codemasters; and Goo Goo Soundy, sold by Konami.
Referring now to
In another embodiment, the music-based video game features gameplay like that found in Rez, a “musical shooter” sold by Sega. In these games, the player navigates through a game environment. The player controls a targeting device to choose and shoot targets that exist in the game environment. As the player shoots targets, musical events are triggered that contribute to a soundtrack for the game. The gameplay for these types of games is similar to other “shooter” type games, with the exception that shooting targets directly and explicitly contributes to the musical accompaniment provided by the game.
Referring now to
Still referring to
In other embodiments, the musical content may be accessed via a network, such as a personal area network (PAN), local area network (LAN), a metropolitan area network (MAN), or a wide area network (WAN) such as the Internet using a variety of connections including standard telephone lines, LAN or WAN links (e.g., T1, T3, 56 kb, X.25), broadband connections (ISDN, Frame Relay, ATM), and wireless connections (Bluetooth, GSM, CDMA, W-CDMA). A variety of communication protocols may be employed (e.g., TCP/IP, IPX, SPX, NetBIOS, NetBEUI, SMB, Ethernet, ARCNET, Fiber Distributed Data Interface (FDDI), RS232, IEEE 802.11, IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and direct asynchronous connections). Further, in these embodiments the received musical content may be encrypted using any one of a number of well-known encryption protocols, e.g., DES, triple DES, AES, RC4 or RC5. In these embodiments, the musical content may be downloaded to the portable music device as needed.
Still referring to
In one particular embodiment, identifying the musical events (step 604) and determining the musical properties associated with the events is performed by preprocessing the musical content to emphasize the attacks in the music (audio sound). The emphasized audio signal can be expressed by the ratio (Ps/Pl) of a short-term power Ps to a long-term power Pl in the audio signal. After thresholding, the peak of the emphasized signal (short-term power Ps / long-term power Pl) during each selected period is chosen as a potential musical event. This technique is described in greater detail in U.S. Pat. No. 6,699,123 B2.
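A minimal, non-limiting sketch of such attack-emphasis preprocessing is shown below; the window lengths, analysis period, and threshold are assumed values chosen only for illustration and are not taken from the cited patent.

```python
# Illustrative sketch of attack emphasis via a short-term/long-term power ratio (Ps/Pl).
# Window lengths, analysis period, and threshold are assumed values.
import numpy as np

def moving_average(x: np.ndarray, win: int) -> np.ndarray:
    kernel = np.ones(win) / win
    return np.convolve(x, kernel, mode="same")

def detect_events(audio: np.ndarray, sr: int,
                  short_ms: float = 10.0, long_ms: float = 500.0,
                  period_ms: float = 250.0, threshold: float = 2.0):
    """Return sample indices of at most one candidate musical event per analysis period."""
    power = audio.astype(float) ** 2
    ps = moving_average(power, max(1, int(sr * short_ms / 1000)))   # short-term power Ps
    pl = moving_average(power, max(1, int(sr * long_ms / 1000)))    # long-term power Pl
    emphasis = ps / (pl + 1e-12)                                    # emphasized signal Ps/Pl
    emphasis[emphasis < threshold] = 0.0                            # thresholding

    period = max(1, int(sr * period_ms / 1000))
    events = []
    for start in range(0, len(emphasis), period):
        frame = emphasis[start:start + period]
        if frame.max() > 0.0:                                       # peak within this period
            events.append(start + int(frame.argmax()))
    return events

if __name__ == "__main__":
    sr = 22050
    t = np.arange(sr) / sr
    audio = 0.05 * np.sin(2 * np.pi * 220 * t)
    audio[sr // 2:sr // 2 + 200] += 0.8                             # a synthetic "attack"
    print(detect_events(audio, sr))
```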
In other embodiments, known techniques for extracting audio events or transients from an audio signal containing music may be used to identify musical events and determine properties associated with those events. Some of these approaches decompose the audio signal into frequency sub-bands by either using the Short-Time Fourier Transform (STFT) or by using a bank of bandpass filters and finding the envelopes of the resultant signals. Thereafter, a derivative and half-wave rectifier provides a signal that can be compared to a threshold to find locations of audio events or transients. Further details of these techniques are described in: Klapuri, Anssi. “Musical meter estimation and music transcription.” Paper presented at the Cambridge Music Processing Colloquium, Cambridge University, UK, 2003; Paulus, Jouni and Anssi Klapuri. “Measuring the similarity of rhythmic patterns.” Third International Conference on Music Information Retrieval (ISMIR 2002), Paris, France, Oct. 13-17, 2002; and Scheirer, Eric. “Tempo and beat analysis of acoustic musical signals.” Journal of the Acoustical Society of America 103, no. 1 (1998): 588-601.
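The sub-band approach summarized above may be sketched, purely for illustration, as follows, using SciPy's Butterworth filters; the filter order, band edges, envelope window, and threshold are assumptions rather than values prescribed by the cited works.

```python
# Illustrative sketch of sub-band onset detection: bandpass filter bank, envelope,
# derivative, half-wave rectification, and thresholding. All parameters are assumed.
import numpy as np
from scipy.signal import butter, sosfilt

def band_onset_strength(audio: np.ndarray, sr: int, low_hz: float, high_hz: float,
                        env_win_ms: float = 20.0) -> np.ndarray:
    sos = butter(4, [low_hz, high_hz], btype="bandpass", fs=sr, output="sos")
    band = sosfilt(sos, audio)
    win = max(1, int(sr * env_win_ms / 1000))
    envelope = np.convolve(np.abs(band), np.ones(win) / win, mode="same")
    derivative = np.diff(envelope, prepend=envelope[0])
    return np.maximum(derivative, 0.0)                      # half-wave rectification

def detect_transients(audio: np.ndarray, sr: int, threshold: float = 1e-4):
    bands = [(60, 250), (250, 2000), (2000, 8000)]          # illustrative band edges
    strength = sum(band_onset_strength(audio, sr, lo, hi) for lo, hi in bands)
    return np.flatnonzero(strength > threshold)             # sample indices above threshold

if __name__ == "__main__":
    sr = 22050
    audio = np.zeros(sr)
    audio[5000:5200] = np.random.randn(200) * 0.5           # a noisy transient
    print(detect_transients(audio, sr)[:5])
```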
Other known techniques for extracting audio events or transients from an audio signal containing music focus on finding transients of a particular type such as percussive sounds. For examples of these techniques, see Zils, Aymeric, François Pachet, Olivier Delerue and Fabien Gouyon. “Automatic Extraction of Drum Tracks from Polyphonic Music Signals” Proceedings of International Conference on Web Delivering of Music, Darmstadt, Germany, 2002.
In still other embodiments, the tempo and beats of an audio musical signal may be determined by applying a large set of resonant comb filters to the audio sub-bands described above. Some of these techniques, which may be used to identify musical events and determine properties associated with those events, are described in the following: Klapuri, Anssi. “Musical meter estimation and music transcription.” Paper presented at the Cambridge Music Processing Colloquium, Cambridge University, UK, 2003; Paulus, Jouni and Anssi Klapuri. “Measuring the similarity of rhythmic patterns.” Third International Conference on Music Information Retrieval (ISMIR 2002), Paris, France, Oct. 13-17, 2002; and Scheirer, Eric. “Tempo and beat analysis of acoustic musical signals.” Journal of the Acoustical Society of America 103, no. 1 (1998): 588-601.
In further embodiments, musical events are determined and properties of those events identified using a multi-agent system based on detected transients. Details of these techniques can be found in: Dixon, Simon. “Automatic Extraction of Tempo and Beat from Expressive Performances.” Journal of New Music Research 30, no. 1 (2001): 39-58; and Dixon, Simon. “A Lightweight Multi-Agent Musical Beat Tracking System.” Proceedings of the Pacific Rim International Conference on Artificial Intelligence (PRICAI 2000), Melbourne, Australia, 2000. In certain of these embodiments, autocorrelation of low-level features (i.e., transients) is used to create a pool of weighted candidates for tempo and phase. For example, see Gouyon, Fabien and P. Herrera. “A beat induction method for musical audio signals.” Proceedings of the 4th WIAMIS Special Session on Audio Segmentation and Digital Music, London, UK, 2003. In still further of these embodiments, a more complex probabilistic modeling system known as particle filtering is used to identify musical events and determine properties associated with those events. Details of these techniques can be found in: Hainsworth, Stephen and Malcolm D. Macleod. “Beat Tracking with Particle Filtering Algorithms.” Proceedings of the IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, Mohonk, N.Y., Oct. 19-22, 2003; and Hainsworth, Stephen W. and Malcolm D. Macleod. “Particle Filtering Applied to Musical Tempo Tracking.” EURASIP Journal on Applied Signal Processing 2004, no. 15 (2004): 2385-2395.
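To illustrate the autocorrelation-based tempo candidates mentioned above, the following non-limiting sketch autocorrelates an onset-strength signal and reports the strongest lags as weighted tempo candidates; the frame rate and tempo range are assumed values, not parameters taken from the cited works.

```python
# Illustrative sketch: tempo candidates from autocorrelation of an onset-strength signal.
# Frame rate and tempo range are assumed values.
import numpy as np

def tempo_candidates(onset_strength: np.ndarray, frame_rate: float,
                     min_bpm: float = 60.0, max_bpm: float = 180.0, top_k: int = 3):
    """Return (bpm, weight) pairs for the strongest autocorrelation lags."""
    x = onset_strength - onset_strength.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]       # autocorrelation, lags >= 0
    min_lag = int(frame_rate * 60.0 / max_bpm)
    max_lag = int(frame_rate * 60.0 / min_bpm)
    lags = np.arange(min_lag, min(max_lag + 1, len(ac)))
    order = np.argsort(ac[lags])[::-1][:top_k]              # strongest lags first
    return [(60.0 * frame_rate / lags[i], float(ac[lags][i])) for i in order]

if __name__ == "__main__":
    frame_rate = 100.0                                      # onset-strength frames per second
    onsets = np.zeros(1000)
    onsets[::50] = 1.0                                      # impulses every 0.5 s => 120 BPM
    print(tempo_candidates(onsets, frame_rate))
```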
Additionally, techniques may be used that infer larger time-scale information from a musical audio signal. These techniques are useful for identifying unique or repetitive song sections. For example, in popular music, some of these song sections are the introduction, verse, chorus, and bridge sections. In certain techniques, a pitch contour is extracted, a similarity matrix is computed, and a clustering algorithm finds similar sequences. Details of these techniques are found in: Dannenberg, Roger B. and Ning Hu. “Discovering Musical Structure in Audio Recordings.” 2nd International Conference on Music and Artificial Intelligence (ICMAI 2002), Edinburgh, Scotland, Sep. 12-14, 2002; Dannenberg, Roger B. “Listening to ‘Naima’: An Automated Structural Analysis of Music from Recorded Audio.” Proceedings of the International Computer Music Conference, International Computer Music Association, San Francisco, 2002; and Dannenberg, Roger B. and Ning Hu. “Pattern Discovery Techniques for Music Audio.” Third International Conference on Music Information Retrieval (IRCAM), Paris, France, 2002.
More general approaches do not depend on pitch contours and instead work on a summarized spectral analysis of the audio stream and a more sophisticated probabilistic clustering algorithm. Details of these approaches are described in: Foote, Jonathan. “Automatic Audio Segmentation using a Measure of Audio Novelty.” Proceedings of the IEEE International Conference on Multimedia and Expo, vol. 1, 2000; Foote, Jonathan and Shingo Uchihashi. “The Beat Spectrum: A New Approach to Rhythm Analysis.” Proceedings of the International Conference on Multimedia and Expo (ICME), 2001; Foote, Jonathan and Matt Cooper. “Media Segmentation using Self-Similarity Decomposition.” Proceedings SPIE Storage and Retrieval for Multimedia Databases, vol. 5021, San Jose, Calif., January 2003; and Jehan, Tristan. “Perceptual Segment Clustering For Music Description and Time-Axis Redundancy Cancellation.” Proceedings of the International Symposium on Music Information Retrieval (ISMIR), Barcelona, Spain, October 2004.
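The self-similarity idea underlying these approaches can be illustrated with the short sketch below, which builds a similarity matrix from summarized spectral frames; the feature summary (coarse log-magnitude spectra) and the cosine similarity measure are simplified assumptions and are not the specific methods of the cited works.

```python
# Illustrative sketch of a self-similarity matrix computed from summarized spectral frames.
# Feature extraction and similarity measure are assumed, simplified choices.
import numpy as np

def spectral_frames(audio: np.ndarray, sr: int, frame_sec: float = 0.5, n_bins: int = 64):
    """Summarize each frame by a coarse log-magnitude spectrum."""
    frame_len = int(sr * frame_sec)
    frames = []
    for start in range(0, len(audio) - frame_len + 1, frame_len):
        spectrum = np.abs(np.fft.rfft(audio[start:start + frame_len]))
        coarse = spectrum[:(len(spectrum) // n_bins) * n_bins]
        coarse = coarse.reshape(n_bins, -1).mean(axis=1)        # average into n_bins bands
        frames.append(np.log1p(coarse))
    return np.array(frames)

def self_similarity(features: np.ndarray) -> np.ndarray:
    """Cosine similarity between every pair of frames; repeated sections show up as blocks."""
    norms = np.linalg.norm(features, axis=1, keepdims=True) + 1e-12
    unit = features / norms
    return unit @ unit.T

if __name__ == "__main__":
    sr = 8000
    t = np.arange(sr * 4) / sr
    audio = np.concatenate([np.sin(2 * np.pi * 220 * t), np.sin(2 * np.pi * 440 * t)])
    sim = self_similarity(spectral_frames(audio, sr))
    # Frames 0 and 1 belong to the same "section"; frame 12 belongs to a different one.
    print(sim.shape, round(sim[0, 1], 2), round(sim[0, 12], 2))
```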
Still referring
Referring back to
Further, aspects of the game may be altered to reflect musical events and their properties. For example, referring back to
In some embodiments, the described technique of altering background game content responsive to musical properties of identified game events may be applied to game types other than rhythm-action games, such as first person shooters, adventure games, real-time strategy games, role-playing games, turn-based strategy games, platformers, racing simulation games, sports simulation games, survival-horror games, stealth-action games, and puzzle games. For example, a player of a first-person shooter game may provide musical content in which the musical events are determined to have slow, dark properties. In response, the lighting in the first-person shooter may reflect those musical properties, by dimming light sources in the game or selecting a more muted palette of colors to use on objects in the game.
In some embodiments, only a single game event may occur at one time. In these embodiments, overlapping musical events are resolved so that only a single game element is displayed on the screen for any particular time. An event's time extent is its “game event period”. The minimum time between two events is determined. For example, in one embodiment, the minimum time between two events is a duration equaling 100 milliseconds; this is called the event's “shadow period”. No event may occur in another event's shadow period. In this manner, a final event signal (a final event array) having a series of final events is generated. When each of the final events is reproduced, one video game object is displayed. In another embodiment, more than one video game object can be displayed. For example, a change in lighting and a gem can be created using the same subset of events from the final events.
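A minimal sketch of resolving overlapping events with such a shadow period follows; the function name is hypothetical, and the 100 millisecond value is simply the example given above.

```python
# Illustrative sketch of resolving overlapping events with a 100 ms "shadow period",
# so that only one final event survives per shadow window. Names are hypothetical.
from typing import List

def final_events(event_times_sec: List[float], shadow_sec: float = 0.100) -> List[float]:
    """Keep an event only if it falls outside the previous final event's shadow period."""
    finals = []
    for t in sorted(event_times_sec):
        if not finals or t - finals[-1] >= shadow_sec:
            finals.append(t)
    return finals

if __name__ == "__main__":
    # Events at 1.00 s and 1.05 s overlap; only the earlier one becomes a final event.
    print(final_events([1.00, 1.05, 1.20, 2.00]))   # -> [1.0, 1.2, 2.0]
```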
Each final event may be mapped to a specific type of game element. For example, a subset of contiguous final events can be mapped to require a specific sequence of buttons to be pressed by the player in a specific rhythm to successfully execute one or more musical events. In one embodiment, the shape of each gem displayed for each of the final events is determined based on a predetermined sequence or on a weighted random distribution.
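Purely as an illustration of the weighted random distribution alternative, the following sketch assigns a shape to each final event; the shapes, weights, and seeding behavior are assumptions made for the example.

```python
# Illustrative sketch of assigning a gem shape to each final event using a weighted
# random distribution. Shapes and weights are assumed values.
import random
from typing import List, Tuple

SHAPES = ["circle", "square", "star"]
WEIGHTS = [0.5, 0.3, 0.2]          # assumed weights favoring simpler shapes

def assign_shapes(final_event_times: List[float], seed: int = 0) -> List[Tuple[float, str]]:
    rng = random.Random(seed)      # seeded so a given song always produces the same level
    return [(t, rng.choices(SHAPES, weights=WEIGHTS, k=1)[0]) for t in final_event_times]

if __name__ == "__main__":
    print(assign_shapes([1.0, 1.2, 2.0]))
```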
In one embodiment, features that are salient to the user (e.g., pitch, timbre, and loudness) are used to determine the proper button assignment for the gem. A user interface can be provided to the game player prior to analyzing the musical content to allow the game player to define which features of the musical content are of most interest to the game player. In other embodiments, the mapping between musical properties and game input is predetermined.
As an example, assume a music game created by the invention involves three buttons that the game player must press in synchrony with musical events, and some salient property of each musical event determines which button the player must press for that event. In the case where that musical property is pitch, musical events of “generally low pitch” may be assigned to a first button, those of “generally high pitch” may be assigned to the third button, and those of moderate pitch may be assigned to the second button, which is between the first button and the third button. This configuration can also be mapped to the iPod clickwheel. The clickwheel can be thought of as a clock face: the first button can be nine o'clock, the second button can be twelve o'clock, and the third button can be three o'clock.
Similarly, if loudness/volume is selected as a salient musical property, then musical events of “generally low volume” may be assigned to the first button, those of “generally high volume” may be assigned to the third button, and those of moderate volume may be assigned to the second button.
Finally, in the case where the salient musical property is timbre, musical events of one type, e.g., “noisy” events (like a snare drum, for example), are assigned to one button, and musical events of another type, e.g., “boomy” events (like a kick drum), are assigned to another button.
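The three button mappings described above might be sketched, for illustration only, as follows; the pitch and volume thresholds, and the use of spectral flatness as a stand-in measure of timbre, are assumptions rather than prescribed values.

```python
# Illustrative sketch of the three-button mappings described above.
# Thresholds and the spectral-flatness stand-in for timbre are assumed.
def button_for_pitch(pitch_hz: float) -> int:
    if pitch_hz < 200:      # "generally low pitch"  -> first button  (nine o'clock)
        return 1
    if pitch_hz < 1000:     # moderate pitch         -> second button (twelve o'clock)
        return 2
    return 3                # "generally high pitch" -> third button  (three o'clock)

def button_for_volume(volume: float) -> int:
    if volume < 0.33:       # "generally low volume"
        return 1
    if volume < 0.66:       # moderate volume
        return 2
    return 3                # "generally high volume"

def button_for_timbre(spectral_flatness: float) -> int:
    # Flat (noise-like) spectra resemble a snare drum; peaked spectra resemble a kick drum.
    return 1 if spectral_flatness > 0.5 else 2

if __name__ == "__main__":
    print(button_for_pitch(440.0), button_for_volume(0.9), button_for_timbre(0.7))
```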
It should be understood that these button mappings can apply to a traditional PlayStation, PlayStation 2, Xbox, Xbox 360, or Nintendo GameCube controller. Only a PlayStation 2 example will be provided for simplicity. Assuming three buttons are used, a first game event is mapped to the L1 button, a second game event is mapped to the R1 button, and a third game event is mapped to the R2 button. In another embodiment, the first game event is mapped to the “square” button, the second game event is mapped to the “triangle” button, and the third game event is mapped to the “circle” button.
In addition to mapping musical events to single game objects with characteristics reflective of musical properties of the associated musical events, musical events can be mapped to a group of game objects. For example, and referring again to the game Amplitude sold by Sony Computer Entertainment America, game play rewards are based upon the game player's successful execution of a group (also referred to as a phrase) of notes. The analysis of the musical content can reveal phrases or groups of notes of interest, such as a riff that is repeated or a series of notes that recur. Alternatively, the groupings can be assigned post-analysis by the software of the invention. Said another way, “phrases” can either be a function of the musical analysis (e.g., the music analysis engine successfully identifying phrases by identifying repeating patterns in the audio), or the gameplay phrases could have nothing to do with identified sequences of musical events in the musical content. For example, an arbitrary number of musical events in sequence could be identified as a phrase.
Other musical events that can be mapped to game events can include section changes (e.g., from verse to chorus). In one embodiment, these changes translate into visual changes in the background environment (e.g., the three dimensional space surrounding the characters) of the video game. These changes can include lighting, coloring, texturing, and other visual effects, stage appearance, character appearance, character animation, particle system parameters, and the like.
Said another way, the process of generating the video game environment is a dynamic process whereby properties of the supplied musical content are directly connected to graphical properties of the game environment. The properties of the video game environment are not necessarily governed by gameplay. One example includes having the loudness of the supplied musical content cause the video game environment lighting to increase or decrease in brightness. In another example, the frequency distribution of the music changes the color of the lighting being applied to the environment. In another example, the loudness of the supplied musical content affects some property of the animation of objects in the environment (e.g., animated performing musicians start “rocking out harder” when the music is louder) or deforms surfaces in the environment.
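A non-limiting sketch of driving environment lighting directly from the supplied musical content follows; the use of RMS level for loudness, the spectral centroid as a stand-in for the frequency distribution, and all scaling constants are assumptions made for illustration.

```python
# Illustrative sketch of driving environment lighting directly from the supplied music:
# loudness controls brightness and the spectral centroid controls color. Constants assumed.
import numpy as np

def lighting_for_frame(audio_frame: np.ndarray, sr: int):
    """Return (brightness 0..1, rgb color) for one frame of the supplied music."""
    rms = float(np.sqrt(np.mean(audio_frame ** 2)))
    brightness = min(1.0, rms * 5.0)                         # louder music -> brighter lights

    spectrum = np.abs(np.fft.rfft(audio_frame))
    freqs = np.fft.rfftfreq(len(audio_frame), d=1.0 / sr)
    centroid = float((spectrum * freqs).sum() / (spectrum.sum() + 1e-12))
    cool = min(1.0, centroid / 4000.0)                       # brighter spectra -> cooler color
    color = (1.0 - cool, 0.4, cool)                          # simple red-to-blue blend
    return brightness, color

if __name__ == "__main__":
    sr = 22050
    t = np.arange(1024) / sr
    print(lighting_for_frame(0.3 * np.sin(2 * np.pi * 440 * t), sr))
```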
The described systems and methods may execute on a wide variety of gaming platforms or devices. The gaming platform may be a personal computer, such as any one of a number of machines manufactured by Dell Corporation of Round Rock, Tex., the Hewlett-Packard Corporation of Palo Alto, Calif., or Apple Computer of Cupertino, Calif. Although games manufactured to be played on personal computers are often referred to as “computer games” or “PC games,” the term “video game” is used throughout this description to refer to games manufactured to be played on any platform or gaming device, including personal computers.
In other embodiments the game platform is a console gaming platform, such as GameCube, manufactured by Nintendo Corp. of Japan, PlayStation 2, manufactured by Sony Corporation of Japan, or Xbox 360, manufactured by Microsoft Corporation of Redmond, Wash. In still other embodiments, the game platform is a portable device, such as GameBoy Advance, manufactured by Nintendo, the PSP, manufactured by Sony or the N-Gage, manufactured by Nokia Corporation of Finland.
In other embodiments, the described systems and methods may execute on an electronic device such as a portable music/video player. Examples of such players include the iPod series of players, manufactured by Apple Computer or the line of MP3 players manufactured by Creative Labs. In still further embodiments, the described methods may operate on a cellular telephone.
The software can be provided to the gaming device in many ways. For example, the software can be embedded in the memory of the gaming device and provided with the purchase of the gaming device. Alternatively, the software can be purchased and downloaded to the gaming device, either via a wireless network or a wired network. Additionally, the software can be provided on a tangible medium that is read by the gaming device. In one embodiment, the video game generation software can be preprogrammed into a portable music/video device such as an iPod, PSP, or another portable music/video device.
In still other embodiments, the software is offered for download. In some specific embodiments, the software may be offered for download from a source traditionally associated with the download of music products, such as the iTunes Store, operated by Apple Computer of Cupertino, Calif. In such an embodiment, the software may be stored on a general purpose computer as part of the iTunes application. The iTunes application and downloaded software can generate the video game and transfer the game to an iPod during a synchronization process. In another embodiment, the iPod itself can receive the downloaded software and generate the video game itself.
The following examples illustrate various game play scenarios on a variety of portable music devices. The examples are not exhaustive of all possible combinations and configurations of game play within the spirit and scope of the invention.
With reference back to
The provided music is read from the CD and analyzed to generate game events for the game. The analysis identifies musical events and, in some embodiments, determines musical properties for each. In another embodiment, additional analysis is performed on the selected musical content. For example, in addition to a pitch-focused analysis, a rhythm-focused analysis may be performed on the selected musical content. The additional analysis can be performed prior to the start of gameplay or dynamically during gameplay, as described in more detail below.
In one embodiment, gameplay consists of the game-player capturing the gems as they approach the game-player from the band member while the gems are within the target markers 140, 142, 144. Capturing the gems correctly provides unaltered playback of the selected music that is synchronized to the captured gem. If the gems are not captured correctly, the playback of the selected music is altered, or omitted entirely, to indicate that the gem was not captured.
In another embodiment, the game-player is able to switch between band members 102, 104, 106. Switching to another band member also switches the underlying analysis method and therefore the resulting placement and number of gems. Continuing with the example from above, if the game-player switches, using the input device, from the vocalist 102 to the drummer 106, the gems are now generated in response to the results of the rhythm-related analysis. As previously stated, this analysis can be performed as part of the game generation or dynamically during game play.
Additionally, this type of analysis and game generation provides for a multiplayer environment. Both head-to-head and cooperative gameplay can be provided. For example, a first player can select to capture gems associated with the vocalist 102 and a second player can select to capture gems associated with the drummer 106. The player that captures the most gems correctly is declared the winner. Alternatively, the first player's and the second player's score can be aggregated to provide an overall score. The overall score is used to determine whether the team of players advances to another game level.
With reference back to
In one embodiment, gameplay consists of the game-player capturing the gems 230 as they approach the beat blaster 210. Capturing the gems correctly provides unaltered playback of the selected music that is synchronized to the captured gem. If the gems are not captured correctly, the playback of the selected music is altered to indicate that the gem was not captured.
In another embodiment, the game-player is able to switch among a plurality of lanes. Each lane corresponds to a respective type of analysis performed on the selected musical content. As such, the resulting placement and number of gems can be different for each lane. Also, like the example provided with reference to
With reference back to
In one embodiment, the game player is in possession of an iPod portable music/video player. As is known, the iPod is a portable music and video device having a housing that contains various computational components, for example, a processor, memory, and software for playing the music and video stored within the memory. It is assumed that the iPod includes one or more stored music files purchased or otherwise obtained by the game player. From the menu options provided by the iPod, the game player navigates to and selects an option labeled, for example, “play video game.” In response, the iPod displays a splash screen or the like to the game player on the display of the iPod presenting the name of the video game and the proper credits.
Next, the game player selects the music file that is used to create the video game. In response, the iPod processes the selected music as described above and displays a game level to the game player on the display of the iPod. It should be understood that because the computer that executes the associated iTunes application has, in most cases, greater processing power, the selecting of music, processing thereof, and generating of the game event data can occur at the computer associated with the iPod, rather than at the iPod itself, with said game event data subsequently being transmitted to the iPod.
Once the video game data is generated, play begins. In this example, assume that the video game is a rhythm-based musical game similar to that described above with reference to
In another embodiment, the clickwheel senses the motion of the game player's finger. As such, a scooping style game play is used. As described with reference to
The frequency at which gems appear to the game player for capture can be a function of the music selected, the difficulty setting of the game provided by the player prior to game generation (e.g., novice, skilled, and advanced), or the type of analysis program used by the software. It should be understood that any combination of the previous factors can be used.
When the user captures a gem correctly, the portion of the selected music that corresponds to the gem is played to the user without modification. If the gem is not captured, the portion of the selected music that corresponds to that gem is modified prior to being played back to the game player. In one embodiment, if the user successfully captures a series of gems, an extended portion of the selected music is played back to the game player without modification. Examples of modification can include, but are not limited to, adding reverberation to the selected music, filtering the selected music, playing only a portion of the selected music, adjusting the volume of the supplied music, and the like. In some embodiments, the music is played back without modification even when the player is not playing correctly.
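For illustration only, the following sketch alters the audio segment associated with a missed gem by low-pass filtering and attenuating it, while a captured gem's segment is passed through unchanged; the cutoff frequency and attenuation factor are assumed values.

```python
# Illustrative sketch of altering playback when a gem is missed: the corresponding audio
# segment is low-pass filtered and attenuated; a captured gem's segment passes unchanged.
# Cutoff frequency and attenuation are assumed values.
import numpy as np
from scipy.signal import butter, sosfilt

def playback_segment(segment: np.ndarray, sr: int, captured: bool) -> np.ndarray:
    if captured:
        return segment                                  # unmodified playback on success
    sos = butter(2, 800.0, btype="lowpass", fs=sr, output="sos")
    return 0.4 * sosfilt(sos, segment)                  # muffled and quieter on a miss

if __name__ == "__main__":
    sr = 22050
    t = np.arange(sr // 4) / sr
    segment = np.sin(2 * np.pi * 880 * t)
    print(np.abs(playback_segment(segment, sr, captured=False)).max())
```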
Although the previous example has been given with respect to an iPod music and video player, the concepts can be applied to other portable music and video players, for example, the Zen, MuVo, and Nomad players sold by Creative Technology, Ltd. of Singapore.
In one embodiment, the game player is in possession of a PSP portable music/video player. As is known, the PSP is a portable music and video device having a housing that contains various computational components, for example, a processor, memory, and software for playing the music and video stored within the memory. Also, the PSP includes an interface for receiving a Universal Media Disc (UMD). In this example, it is assumed that the PSP includes one or more stored music files purchased or otherwise obtained by the game player. In one embodiment, a UMD that includes the analysis and video game generation software described above is inserted into the PSP. In another embodiment, the software is shipped with the PSP device. In yet another embodiment, the software is stored on a Memory Stick that is inserted in the PSP. At the start of the video game, the PSP displays a splash screen or the like to the game player on the display of the PSP presenting the name of the video game and the proper credits.
Next, the game player selects the music file that is used to create the video game. In response, the PSP processes the selected music as described above and displays a game level to the game player on the display of the PSP. The music selectable by the user is stored on a Memory Stick that is inserted in the PSP. In other embodiments, the PSP is networked to another storage device or PSP and accesses the music therefrom.
Once the video game is generated, play begins. In this example, assume that the video game is a rhythm-based musical game similar to that described above with reference to
When the user captures a gem correctly, the portion of the selected music that corresponds to the gem is played to the user without modification. If the gem is not captured, the portion of the selected music that corresponds to that gem is modified prior to being played back to the game player. In one embodiment, if the user successfully captures a series of gems, an extended portion of the selected music is played back to the game player without modification.
Further, in other embodiments the video game features both multiplayer and head-to-head game play as described above in connection with EXAMPLE 1. In a multiplayer embodiment, the players cooperate to complete a game level, with each player using his or her own portable device in communication with the others using IrDA or Bluetooth technologies, or with the players using a single portable device having multiple controllers connected thereto.
This “battle of the bands” style game play may also be used in head-to-head competition that occurs across a network. For example, teams of multiple game players can form a “band” and compete against other “bands.” The band that captures the most game events for a given musical content correctly is deemed the winner.
Also, single game player head-to-head style game play can be used. In such a game play style, each individual game player is charged with capturing game events. The player who captures more game events correctly is deemed the winner. The players can compete against each other on a single music and video player or using multiple music and video players that communicate using known networking techniques.