Instrument game system and method

Abstract
A game system and method that uses an instrument as an input encourages a user to play along with the game's soundtrack on an instrument (e.g. guitar, bass, etc.). The game cues the player to play notes and/or chords on the instrument at the appropriate times, and data is collected from the instrument via a connection between the instrument and the apparatus running the game. The game then scores the user based on the note/chord and timing information it receives.
Description
APPENDIX A

Appendix A (8 pages) is an example of the source code that implements the adaptive adjustment process described herein and Appendix A forms part of the specification.


FIELD

A system and method for game playing is described. In more detail, a system and method for using an instrument as an input to a game and the game with the instrument input is disclosed.


BACKGROUND

Video games generally are well known, as are video games and gaming systems with music-type games. The game systems may be personal computer/gaming console systems (such as the Microsoft® Xbox® or Sony® PlayStation®2) or stand-alone gaming consoles such as might be found in an arcade. Examples of these types of games include Dance Dance Revolution, in which a user attempts to follow a dance routine set to music and is scored based on the accuracy of the user's dance routine compared to the exemplary dance routine, and Guitar Hero, in which the user has a controller (that looks like a guitar), plays along with a song and is scored based on how closely the user can play the song as compared to the exemplary song. It is desirable to provide a game system and method in which an instrument is the input controller to the game system, and it is to this end that the present invention is directed.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A illustrates an example of an implementation of a game system;



FIG. 1B illustrates an example of a user interface of a stringed instrument example of the game system;



FIG. 2 illustrates an example of a user interface of a stringed instrument example of the game system;



FIGS. 3A and 3B illustrate examples of the user interface of a note moving toward the play area of the stringed instrument example of the game system;



FIG. 3C illustrates another example of the user interface of FIG. 1B;



FIGS. 4A and 4B illustrate an example of a hit event and a miss event of the stringed instrument example of the game system;



FIG. 5 illustrates an example of the string, fret and time variables of the stringed instrument example of the game system;



FIG. 6A illustrates an example of an action indicator interface of the stringed instrument example of the game system;



FIG. 6B illustrates an example of another embodiment of an action indicator interface of the stringed instrument example of the game system;



FIG. 7 illustrates an example of another action indicator interface of the stringed instrument example of the game system;



FIG. 8 illustrates yet another example of an action indicator interface of the stringed instrument example of the game system;



FIG. 9 illustrates yet another example of an action indicator interface of the stringed instrument example of the game system;



FIG. 10 illustrates an example of a performance meter user interface of the stringed instrument example of the game system;



FIG. 11 illustrates a method for scoring notes in the stringed instrument example of the game system;



FIGS. 12A and 12B illustrate a hit scoring event and a miss scoring event of the stringed instrument example of the game system;



FIG. 13 illustrates an example of a method for scoring the notes of the stringed instrument example of the game system;



FIG. 14 illustrates an example of the stringed instrument example of the game system in which several time windows are used to score a note;



FIG. 15 illustrates an example of the user interface for selecting a level of difficulty of the stringed instrument example of the game system;



FIG. 16 illustrates examples of a sequence of notes with different difficulty levels in the stringed instrument example of the game system;



FIG. 17 illustrates examples of another sequence of notes with different difficulty levels in the stringed instrument example of the game system;



FIG. 18 illustrates an example of a musical arrangement of the stringed instrument example of the game system;



FIG. 19 illustrates an example of a menu in the stringed instrument example of the game system;



FIG. 20 illustrates an example of a select arrangement user interface of the stringed instrument example of the game system;



FIGS. 21A and 21B illustrate an audio and video selection user interface of the game system;



FIG. 22 illustrates a sound input device and gain user interface of the game system;



FIG. 23 illustrates an example of a hardware implementation of a video game system that incorporates the stringed instrument example of the game system;



FIG. 24 illustrates further details of an analysis module of the exemplary embodiment of the game system shown in FIG. 1A;



FIG. 25 illustrates an adaptive adjustment process that can be implemented by the game system;



FIG. 26 illustrates a song being split into phrases;



FIG. 27 illustrates the difficulty assigned to each phrase of a song;



FIG. 28 illustrates a level up logic of the adaptive adjustment process;



FIG. 29 illustrates a level down logic of the adaptive adjustment process;



FIG. 30 illustrates a new phrase logic of the adaptive adjustment process;



FIG. 31 illustrates disparity phrases that are handled by the adaptive adjustment process;



FIG. 32 illustrates a persistent dynamic difficulty table that is part of the adaptive adjustment process;



FIG. 33 illustrates a process for updating the dynamic difficulty table as part of the adaptive adjustment process;



FIGS. 34A-C illustrate examples of a level up action result on a dynamic difficulty table, a level down action result on the dynamic difficulty table and a dynamic difficulty table at the end of a song, respectively;



FIG. 35 illustrates counting phrase repetitions as part of the adaptive adjustment process;



FIG. 36 illustrates spatial timing that is part of the game;



FIG. 37 illustrates rotational timing that is part of the game; and



FIG. 38 illustrates container growth timing that is part of the game.





DETAILED DESCRIPTION OF ONE OR MORE EMBODIMENTS

The game system and method are particularly applicable to a personal computer based, guitar based game system and method with the exemplary user interface described below, and it is in this context that the system and method will be described. It will be appreciated, however, that the system and method have greater utility because: 1) the game system can be implemented with other musical or melodic instruments, such as a bass, violin, banjo, piano, voice, clarinet, steel drums, etc.; 2) it can be implemented on other gaming apparatus, such as gaming consoles or stand-alone gaming units (such as the Microsoft® Xbox® system, the Sony® PlayStation®, the Nintendo® Wii, etc.); 3) it can be implemented in peer-to-peer, ASP model or client/server architectures or as an Internet game; and 4) it can be implemented using other user interfaces and features that are not specifically described below in the exemplary embodiments, which are provided to illustrate the game system and method. Now, an example of an implementation of the game system that illustrates its functions and principles is described in more detail.



FIG. 1A illustrates an example of an implementation of a game system 80 where the game system is implemented as a software based stand-alone system. The system 80 may include a game unit 81, such as a cabinet or stand-alone unit, and an instrument 82, such as any musical or melodic instrument including, for example, a bass, violin, banjo, piano, voice, clarinet, steel drums, etc., that is used as an input to the game unit 81 via an interface 82a such as a USB cable, an amplifier cord with adapter for a computer sound card, a networking cable carrying musical data information, a microphone, etc. The game unit may include a display 83 that is capable of displaying the user interface of the game to the user (an example of which is described below in more detail with reference to FIG. 1B), one or more processing units 84, a storage unit 86 (that may be a combination of a persistent storage device, such as a hard disk drive or ROM, and a memory such as SRAM or DRAM), and an operating system 88 that controls the operation of the game system and a game module 90 that reside in the storage unit. The game module, in this embodiment, may be a plurality of lines of computer code. In other embodiments, the game module may also be implemented in hardware or a combination of hardware and software. The game module may include modules for game administration (level difficulty functions), the musical instrument interface and game scoring. When the game system is active, the game module is loaded into the memory and then executed by the one or more processing units to implement the functions and operations of the game system described in more detail below. The game system permits a user to play an instrument along with an arrangement displayed on the display (using the instrument as an input to the game system via the interface) and then scores the user based on the accuracy with which the user plays the arrangement shown on the display, as described in more detail below.



FIG. 1B illustrates an example of a user interface 100 of a stringed instrument example of the game system. The example of the user interface is for illustration purposes only, as the game system may use other user interfaces and is not limited to any particular user interface design. The example user interface may include background graphics 102 that may consist of a number of images or a virtual environment and may be two dimensional or three dimensional. An example of a two dimensional background graphic with a single image (i.e. wallpaper) or a series of images (i.e. a movie, animation, music video, etc.) is shown in FIG. 1B. FIG. 2 shows another embodiment of a user interface 200 that is three dimensional and may include an animated character or characters 202 possibly playing a virtual instrument 204 and surrounded by virtual props 206 (audio equipment, stage, audience, etc.).


Returning to FIG. 1B, the exemplary user interface may further include an action indicator interface 104 that may include a note field 106, one or more notes 108 superimposed on top of the note field 106 and a play area 110. In one embodiment of the game system, the horizontal position of a note in the action indicator interface 104 indicates the time to play the note (cue time), the vertical position of the note indicates the string to play it on (cue string), and the number inside each note indicates the fret that is to be pressed down (cue fret) on the string to generate the correct pitch. In this embodiment, the one or more notes 108 move horizontally towards the play area 110 and the play area 110 is stationary. FIGS. 3A and 3B show this horizontal motion of the notes relative to the play area 110, with FIG. 3A at an earlier time than FIG. 3B. In the game system, the action indicator interface 104 cues the user to play the appropriate note at a specific time. When the note and the play area overlap, the user is to play the appropriate note.



FIG. 3C illustrates the expected user response to the action indicator interface 104. The top row of the user interface corresponds to the user playing the bottom string on a guitar (cue string). The number inside the note corresponds to the user holding down a particular fret of the guitar, such as the 2nd fret, with his/her finger (cue fret). The overlap of the note with the play area indicates that the user should play the cued string with the cued fret pressed at that instant (cue time), thereby producing a note that matches the arrangement note if played correctly and at the correct time.


If the user plays the cued note at the cued time, a “Hit” is awarded. If the user does not play the cued note, or waits too long to play the cued note, a “Miss” is awarded. FIGS. 4A and 4B show a hit event (when the user plays the correct note at the correct time) and a miss event (when the user fails to play the correct note at the correct time), respectively. In some embodiments of the game system, if the note is judged as a “Hit”, the note graphical symbol may change its appearance (i.e. glow, explode, turn a bright color, etc.); otherwise, if the note is judged as a “Miss”, the graphical symbol for the note may change its appearance differently (i.e. fade out, shrink, turn a dark color, etc.).


In some embodiments of the game system, the user interface 100 shown in FIG. 1B may show different size and/or length notes 108 wherein the size and/or length of a note shows the player how long to hold the note with note 108 illustrating a “short note” and note 109 illustrating a “long note”.


In some embodiments of the game system, a note 111 may take on a special characteristic (i.e. glowing star, blinking, moving in a wavy path) from other notes in the note field 106 and may be “hit” by the user (played correctly) for an additional score or otherwise enhance the player's in-game status. In some embodiments, the judgment of the last played note is displayed on-screen in a textual format 113 for the player to receive immediate feedback of the played note. The judgment 113 may read “Great”, “Good”, “Fair”, “Poor”, “Miss”, etc.


The action indicator interface 104 of the game system may be modified in various ways in other embodiments of the game system. For example, each note 108 (shown as a circle in the example in FIG. 1B) may use other graphical representation (i.e. squares, stars, arrows, etc.)


As another example, the user interface described above, in which the horizontal position of the note indicates the time to play it (cue time), the vertical position indicates the string to play it on (cue string), and the number inside the note indicates the fret to be pressed down (cue fret) on the string to generate the correct pitch, is only one way to cue stringed musical instrument play. The variables that cue the play (which string, which fret, and what time) may be arranged spatially (horizontally spaced, vertically spaced, depth-wise spaced) and demarcated by notation (using numbers, letters, colors, shapes, etc.) in many permutations, as shown in FIG. 5. Examples of these different user interfaces are shown in FIGS. 6-9. FIG. 6A shows an embodiment of the action indicator interface 104 with a note field 602, one or more notes 604, and a play area 606 wherein the horizontal position of the note indicates the cue time, the vertical position of the note represents the cue fret, and the number inside the note represents the cue string. FIG. 7 shows another embodiment of the action indicator interface 104 with a note field 702, one or more notes 704, and a play area 706 wherein the depth-wise position of the note indicates the cue time, the horizontal position indicates the cue string, and the numbers inside the notes represent the cue fret. FIG. 8 shows another embodiment of the action indicator interface 104 with a note field 802 and one or more notes 804 in which the horizontal position represents the cue fret, the vertical position represents the cue string, and the numbers inside the notes represent the cue time (i.e. the number of seconds to wait before playing the appropriate string/fret combination). Finally, FIG. 9 shows another embodiment of the action indicator interface 104 with a note field 902, one or more notes 904, and a play area 906 in which the horizontal position represents the cue fret, the depth-wise position represents the cue string, and the vertical position represents the cue time (i.e. when to play the note depends on how fast the note falls and the string/fret combination cued by where the note falls in the play area). Any of the embodiments shown in FIGS. 1-9 may be further modified by using unique colors, characters, or shapes instead of numbers to cue the appropriate string/note/time. For instance, the six colors red, green, blue, cyan, yellow, and magenta can be used to demarcate the cue string on a 6 string guitar. Also, for instance, the characters of the note may be used to demarcate the cue note (i.e. “C#”, “D”, “B flat”, etc.). In addition to spacing along the traditional axes (i.e. horizontal, vertical, depth-wise), additional embodiments may space along non-traditional axes (i.e. diagonal). Additionally, there is no requirement that any or all axes be orthogonal (i.e. two axes may be parallel, near parallel, or otherwise not angled at 90 degrees).



FIG. 6B illustrates an example of another embodiment of an action indicator interface 104 of the stringed instrument example of the game system. In this embodiment, the interface may have one or more string representations (FIG. 0a.), such as the six strings in FIG. 6B that indicate each string of a 6 string guitar. For a 4 string bass, the interface would show four strings represented on the screen. Like an actual stringed instrument, the top indicated string is thicker than the bottom string.


The interface may also have one or more lane markers (FIG. 0b.) in which the lane markers extend out of each fret and represent which fret the notes (FIG. 0g.) belong to. The space between two lane markers is known as the noteway.


The interface may also have one or more lane highlights (FIG. 0c.) wherein the highlighted lanes represent the currently active area of the guitar. The four lanes that are highlighted represent the recommended hand position over which the user should place their hand. The left most highlighted lane is where the index finger should align, the next lane to the right is for the middle finger, the third lane from the left is for the ring finger, and the right most lane is for the pinky.


The interface may also have lane numbers (FIG. 0d.) wherein the numbers 3-5-7-9-12-15-17-19-21 on the noteway are associated with the inlays typically found on electric and acoustic guitars. The numbers help identify which fret a particular note is on. The left most lane is considered fret 1, and the fret number increases consecutively rightward.


The interface may also have one or more frets (FIG. 0e.) wherein the space between two frets represents the area users are expected to hold down with their finger when a note appears on an active fret. There are two states of the fret, active and inactive. Active states are displayed in brighter colors than the inactive state to help convey the state. Active frets represent the area the next series of notes will appear on.


The interface may also have one or more fingerboard inlays (FIG. 0f.) wherein the dots are visual representations of the fingerboard inlays that appear on guitars. The inlay shapes vary from guitar to guitar, but the common positions where the inlays appear are on the 3rd, 5th, 7th, 9th, 12th, 15th, 17th, 19th, and 21st frets. The visual representation of the inlays on screen helps the user understand the spatial relation of the on-screen guitar.


The interface may also have one or more notes (FIG. 0g.) wherein the notes represent the string and fret position the player should hold (or “fret”) with their finger. The different colors of the notes match the string color—Red notes are for the red string, the yellow notes are for the yellow string, etc. The notes also are differentiated by different heights to help convey which string they belong to.


The interface may also have one or more containers (FIG. 0h.) wherein the containers represent on the fingerboard the location of the upcoming notes. When the container and the associated note collide, it indicates to the users the timing to pluck (or pick/strum) the string.


The interface may also have one or more stems (FIG. 0i.) wherein the stems serve two purposes: 1) to ground each note on the noteway to help reinforce the spatial position of the notes on the noteway; and 2) to reinforce the height of the note. The different heights of the note represent the string that the note belongs to.


The game system user interface may also include, in some embodiments, a performance feedback interface and several components of the user interface 100 may loosely define a mechanism for a player to receive feedback on their performance in the game. In some embodiments, the user interface 100 may include a score window 112 or other graphic that is used to present the player's current score during their performance. In some embodiments, a performance meter 114 may also be used to present the player's current performance which is a measure of the number of hit notes and missed notes and, if the player's performance falls below a predetermined level, the game may end. FIG. 10 shows alternative embodiments of performance meters. In some embodiments, the performance meter is a bar graph filled or unfilled with colors or patterns based on the player's performance, shown by 1000/1002 in FIG. 10. In some embodiments, the performance meter resembles an analog meter, where the needle moves up and down to indicate the player's performance, shown by 1004 in FIG. 10.


The user interface 100 of the game system in some embodiments may further comprise a chord track 116 that may be, for example, located above the note field 106. During game play, chord information appears in the chord track 116 and scrolls from right to left towards the stationary play area 110. Each piece of chord data lines up with the corresponding note(s) 108 in the note field 106 to show the association between the two.


The user interface 100 of the game system in some embodiments may further comprise a signal feedback interface wherein several components in the user interface 100 loosely define a mechanism for the player to receive the quality and power of the instrument signal being fed into the game. In some embodiments, a volume meter 118 shows the total volume of the player's instrument during instrument play. In some embodiments, a signal plot 120 shows a plot of the power of the player's signal vs. the note so that the signal plot will show peaks at the tone(s) the player is playing.


The user interface 100 of the game system in some embodiments may further comprise a combo feedback interface wherein several components in the user interface 100 loosely define a mechanism for the player to receive information on the number of correctly played notes that have been “hit” in a row (i.e. a combo of correct notes). In some embodiments, textual information 122 displays the current combo number. In some embodiments, a combo bar 124 is used, containing a graphical representation of the number of combo notes played together 126, as well as a display of a score multiplier 128 gained because of successful combos.



FIG. 11 illustrates a method for scoring notes in the stringed instrument example of the game system. If the player plays the arrangement note within the time window allotted around the cued time, the arrangement note is scored as a “Hit” (1102). If the wrong note is played (1104), or the arrangement note is played but not within the time window (1106), no judgment is given. Therefore, it is possible for the player to play several wrong notes but still receive a “Hit” after finally playing the correct arrangement note. If the arrangement note is never played, then a “Miss” is scored.



FIGS. 12A and 12B illustrate a “hit” scoring event and a “miss” scoring event, respectively. As shown in FIG. 12A, the arrangement note “G” has been cued (1202) accompanied by a time window that is shown (1204). A “Hit” is scored in FIG. 12A because the performance by the user contains the note “G” (1206) within the time window (1204). In FIG. 12B, the arrangement note “G” has also been cued (1208) with a time window (1210). However, a “Miss” is scored in FIG. 12B because no note “G” is played in the user performance in the time window. Generally, the live instrument performance of the player will be a continuous signal (with pitches) that therefore is converted in a known manner into notes with time tags so that the game system is able to compare the notes of the arrangement with the notes of the live instrument performance. To accomplish this conversion, the system (such as the analysis module described with reference to FIG. 24 below) may determine the periodicity component of the pitch so that the periodicity component can be converted into a frequency which can then be converted into a note.
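The pitch-to-note conversion mentioned above can be sketched in code. This is a minimal illustration only, not the claimed implementation: it assumes the signal analysis has already extracted a periodicity (seconds per cycle) and applies the standard equal-tempered mapping with A4 = 440 Hz; the function names are hypothetical.

```python
import math

# Names of the 12 pitch classes, indexed by MIDI note number mod 12.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def frequency_to_note(freq_hz):
    """Map a detected frequency to the nearest equal-tempered note name.

    Uses the standard MIDI convention in which A4 (MIDI note 69) = 440 Hz.
    """
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"

def periodicity_to_note(period_s):
    """Invert the periodicity (seconds per cycle) to a frequency, then a note."""
    return frequency_to_note(1.0 / period_s)
```

For example, a detected periodicity of 1/440 of a second maps to the note A4, and 196 Hz (the open G string of a guitar) maps to G3.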



FIG. 13 shows which of the performance notes by the user (1302) will be judged/scored if the player plays multiple arrangement notes within the time window. For example, a note “G” in the arrangement has been cued (1304) accompanied by a time window (1306). The player has played “G” twice within the time window (1306), at time 1308 and time 1310. However, time 1308 is closer in time to the arrangement note 1304 and is therefore the one selected for scoring.
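The judging rule of FIGS. 11-13 can be sketched as follows. This is a hedged sketch under assumed names (`judge_note`, a `performance` list of (note, time) pairs), not the patented method itself: wrong notes inside the window are simply ignored, and among several correct notes in the window the one closest in time to the cue is selected.

```python
def judge_note(cued_note, cue_time, window, performance):
    """Judge one arrangement note against the live performance.

    performance -- list of (note, time) pairs detected from the instrument.
    Returns ("Hit", time_of_selected_note) or ("Miss", None).
    """
    # Keep only correct notes that fall inside the time window; wrong
    # notes receive no judgment at all.
    candidates = [t for (n, t) in performance
                  if n == cued_note and abs(t - cue_time) <= window]
    if not candidates:
        return ("Miss", None)
    # Of several correct notes in the window, score the one closest
    # in time to the cue (as in FIG. 13).
    best = min(candidates, key=lambda t: abs(t - cue_time))
    return ("Hit", best)
```

So a player who plays a wrong note and then the correct note within the window still receives a "Hit" for the correct note.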


In some embodiments, there may be several time windows associated with an arrangement note 1402, as shown in FIG. 14 wherein four different time windows are shown. Each successive time window allows the player a greater time tolerance for playing the correct arrangement note. In some embodiments, the scoring may be done by giving higher scores to the user performance notes that fall in the smaller time windows. For instance, notes in windows 1404, 1406, 1408, and 1410 may be judged as “Great”, “Good”, “Fair”, and “Poor” and be given scores of 4, 3, 2, and 1, respectively. Also, there is no requirement that the time window be symmetrical, as more of a window can be given after the exact cued time 1402 than before it, or vice versa.
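The nested time windows can be sketched as a lookup from timing error to rating. The window half-widths below are illustrative values chosen for the sketch, not figures from the specification; the ratings and point values follow the Great/Good/Fair/Poor scheme described above.

```python
# Illustrative windows: (half-width in seconds, rating, points),
# ordered from the smallest (strictest) window outward.
TIERS = [(0.05, "Great", 4),
         (0.10, "Good", 3),
         (0.20, "Fair", 2),
         (0.40, "Poor", 1)]

def rate_timing(time_error):
    """Rate a correctly played note by its distance from the cued time."""
    for half_width, rating, points in TIERS:
        if abs(time_error) <= half_width:
            return (rating, points)
    return ("Miss", 0)  # outside every window
```

An asymmetric window, as the text allows, would simply test the positive and negative error against different bounds.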


In some embodiments, the scoring of notes can be done independent of the octave of the note so that notes played one or more octaves higher or lower than the cued note will still be scored positively (i.e. a “Hit”). In these embodiments, the note of the live instrument performance data point is adjusted by one or more octaves and the adjusted note is then compared to the arrangement note. The live instrument performance is scored positively if the adjusted note is equal to the arrangement note and is not scored if the adjusted note does not match the arrangement note.
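Octave-independent matching reduces to comparing pitch classes. A minimal sketch, assuming notes are represented as MIDI note numbers (an assumption of this sketch, not the specification):

```python
def octave_independent_match(played_midi, cued_midi):
    """True if the played note equals the cued note in any octave.

    Two MIDI note numbers name the same pitch class exactly when they
    differ by a whole number of octaves (a multiple of 12 semitones).
    """
    return (played_midi - cued_midi) % 12 == 0
```

For instance, E5 (MIDI 76) still "hits" a cued E4 (MIDI 64), while F4 (MIDI 65) does not.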


The game system may include a level scoring module. In the game, each level is scored based on the number of “Hits” vs. “Misses” awarded to the player. In embodiments with multiple time windows, “Hits” would be subdivided further into “Great”, “Good”, “Fair”, “Poor”, etc. In some embodiments, scoring for a level is done by multiplying the number of judged notes by a multiplier assigned to each rating (i.e. Miss-0, Poor-1, Fair-2, Good-3, Great-4). In some embodiments, a score above a certain amount will unlock one or more items (i.e. a song, a new character, a new character outfit or guitar, etc.). In some embodiments, a score below a certain amount will “fail” the player and thus not allow the player to progress to the next level.
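The level scoring described above, with the example multipliers Miss-0 through Great-4, can be sketched as:

```python
# Example multipliers from the description (Miss-0, Poor-1, Fair-2,
# Good-3, Great-4).
MULTIPLIERS = {"Miss": 0, "Poor": 1, "Fair": 2, "Good": 3, "Great": 4}

def level_score(judgments):
    """Total a level: each judged note contributes its rating's multiplier."""
    return sum(MULTIPLIERS[j] for j in judgments)
```

A threshold comparison on this total would then drive the unlock or "fail" behavior mentioned above.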


The game system may also adjust the difficulty of each level of the game. For example, as shown in FIG. 15, the same song may be played with several different level difficulties using a select difficulty screen 1500.


In the game system, different arrangements of musical pieces can be used to give more difficult and challenging experiences of playing the same musical piece, as shown in FIG. 16. The piece shown, “Mary Had a Little Lamb”, has its rhythmic components shown by 1602. An “Easy” arrangement of the piece 1604 may be composed by cueing only every 4th note. An arrangement more difficult than the Easy arrangement, denoted “Normal” 1606, cues every 2nd note. An arrangement more difficult than Normal, denoted “Hard” 1608, cues the player to play every note in the melody. An arrangement more difficult than Hard, denoted “Expert” 1610, cues the player to add grace notes 1612 and other extra note runs 1614 to the original musical piece. Furthermore, as an arrangement is made more difficult, the time window for each note may be made smaller than the time window for the same note in an easier version of the arrangement.
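The Easy/Normal/Hard progression of FIG. 16 amounts to subsampling the melody. A sketch under assumed names (`arrange`, notes as plain strings); the Expert case, which adds grace notes, would require inserting notes rather than filtering and is omitted:

```python
def arrange(melody, keep_every):
    """Build an arrangement by cueing only every Nth note of the melody.

    keep_every=4 corresponds to the "Easy" arrangement, 2 to "Normal"
    and 1 to "Hard" (every note cued).
    """
    return [note for i, note in enumerate(melody) if i % keep_every == 0]
```

For the opening of "Mary Had a Little Lamb" (E D C D E E E), the Normal arrangement cues E C E E.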


An alternate arrangement technique is illustrated in FIG. 17. The piece shown, “Mary Had a Little Lamb”, has its rhythmic components shown by 1702. An “Easy” arrangement of the piece 1704 may be composed by cueing every note in the melody. An arrangement more difficult than Easy, denoted as “Normal” 1706, cues additional harmonies to be played on other strings in synchronization with the original melody. An arrangement more difficult than Normal, denoted as “Hard” 1708, cues even more additional harmonies to be played on other strings in synchronization with the original melody. In this way, the difficulty of any arrangement can be adjusted by the game system.


In addition, arrangements of songs do not have to follow the traditional melodies, as shown in FIG. 18. In particular, arrangements may be designed where musical theory fundamentals (i.e. scales, chords, arpeggios, etc.) are cued instead. The piece shown, “Mary Had a Little Lamb”, has its rhythmic components shown by 1802. While the melody is shown in 1804, an equally valid series of notes consists of a major scale 1806 in the same key as the musical selection. In some embodiments, more difficult arrangements of musical pieces contain a more difficult series of notes to be played together in sequence (i.e. guitar riffs).



FIG. 19 illustrates an example of a progression of menu screens in the stringed instrument example of the game system wherein 1902 shows a non-interactive progression of screens, which may include a splash screen 1904 that displays the game developer's logo, a logo screen 1906 that displays the game logo, a demonstration screen 1908 that shows the game being autoplayed or played by a computer, and a list of high scores 1910. The user is taken to the interactive progression of screens 1912 after the user interacts with the game (i.e. presses Enter on the keyboard). The main menu 1914 lists the available options. The select difficulty screen 1916 allows the player to select their desired song difficulty (FIG. 15). The select music screen allows the player to select a song to play (FIG. 20). The game play screen 1920 is the main game screen (FIG. 1B), which may be paused and then resumed. After game play, the player is taken to an evaluation screen 1922 to review their performance. From the main menu 1914, the player may select the setup instrument screen 1924 to tune their instrument and set up an appropriate sound input device and signal gain (FIG. 22). Also from the main menu 1914, the user may select other options 1926, which give them the ability to adjust video options 1928 (fullscreen or windowed game, screen resolution, etc.) (FIG. 21B) or audio options 1930 (music volume, sound effects volume, etc.) (FIG. 21A).



FIG. 23 illustrates an example of a hardware implementation of a video game system that incorporates the stringed instrument example of the game system. The game system may include a system bus 2302, a ROM 2306 that holds the operating system and a memory 2308 (such as a RAM) that holds the game program 2309. The game system may also include an external storage 2310 that can be a computer's hard drive, an optical disk, a flash memory drive, etc. The game system also has a sound module 2312 that connects to the speaker 2314 and a video module 2316 that processes graphics and connects to the display 2318, which can be a computer monitor, TV, or arcade screen. The game system may also have a peripheral input 2320 that takes input from the user's keyboard, mouse, buttoned game controllers, racing wheels, etc., and a sound input 2322 that takes input from the user's musical instrument and can be a USB cable, microphone, amplifier cord with adapter for a computer sound card, networking cable carrying musical data information, etc. The game system may also have a network interface 2324 that takes data in and sends data out over a network for networked play; it can send or receive game data (player profiles, “Hits”, “Misses”, etc.), sound data (from a musical instrument), or music data (i.e. .mp3 or .ogg data from a music file).



FIG. 24 illustrates further details of an analysis module 2400 that is part of the exemplary embodiment shown in FIG. 1A. The analysis module may receive an instrument input that is fed into a processing unit 2401, such as a digital signal processing unit (DSP), that detects one or more notes (and a time tag for each note) in the live instrument input using known pitch conversion and note detection processes (described above) programmed into the DSP. The note and time tag information may be fed into a compare module 2402 (implemented as one or more lines of computer code in one embodiment) that queries the arrangement storage at a particular time period and then compares the live performance notes and time tags to a set of arrangement performance notes and time tags that may be stored in a buffer 2403 as shown. The comparison may be done by determining if the notes match and, if the notes match, then finding the live instrument note with the smallest time error. The compare module then may output a time error to a score module 2404 (implemented as one or more lines of computer code in one embodiment) that generates score data which is output to the player and also output to a performance module 2405 (implemented as one or more lines of computer code in one embodiment) that outputs performance data that indicates the performance level of the particular player. Now, an adaptive adjustment process that may be part of certain embodiments of the instrument game system is described in more detail.
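The note-matching step performed by the compare module 2402 can be sketched as follows. This is a minimal illustration, not the patented implementation: the `compare_notes` name, the pitch-name tuples, and the 0.25-second timing window are all assumptions.

```python
def compare_notes(live_notes, arrangement_notes, window=0.25):
    """Match each arrangement note (pitch, time) against the detected live notes.

    A note counts as a hit if a live note with the same pitch falls within the
    timing window; among matching live notes, the one with the smallest time
    error is chosen, mirroring the compare module described for FIG. 24.
    Returns a list of (pitch, time_error) pairs, with None marking a miss.
    """
    results = []
    for pitch, t_arr in arrangement_notes:
        errors = [abs(t_live - t_arr) for p_live, t_live in live_notes
                  if p_live == pitch and abs(t_live - t_arr) <= window]
        results.append((pitch, min(errors)) if errors else (pitch, None))
    return results
```

The resulting time errors would then feed a score module; the window value would in practice be tuned per difficulty level.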


Adaptive Adjustment Process



FIG. 25 illustrates an adaptive adjustment process 250 that can be implemented by the game system. The idea is that the game's difficulty adapts to the player's skill level so that the player is always challenged but not overwhelmed. The adaptive adjustment process described below may be implemented in hardware (a programmed hardware device), in software (code running on a computer that is executing the instrument game system), or a combination of hardware and software. To accomplish the adaptive adjustment, the process begins when a song is played (251); the adaptive adjustment process then tracks the player's performance (252) and adjusts the difficulty based on the player's performance (253), as described below in more detail, until the song is completed (254). The adaptive adjustment process moves phrases up in difficulty if the player does well with a particular phrase and moves phrases down in difficulty if the player does poorly with a particular phrase, as described below in more detail. The above steps are performed in real time during game play so that the game play adapts to the player's skill level. The adaptive adjustment process may also perform some actions off-line (not in real time), including tracking and saving each player's progressing skill set (255) and saving the levels for each phrase reached by each player (256), so that the game is always adapting to the player's skill set.



FIG. 26 illustrates a song being split into phrases. In the adaptive adjustment process, each song is broken up into phrases so that the player's progress can be tracked for each phrase, which is smaller than an entire song. Each phrase is a piece of melodic and rhythmic material that repeats (e.g., the hook of “Smoke on the Water”), and phrases may be repeated many times throughout a section. In the example in FIG. 26, the A Phrase repeats 4 times during the verse, while the B Phrase spans more measures than the A Phrase and only repeats twice over the chorus. In the example, the particular song also has a C Phrase that is part of the outro of the song.



FIG. 27 illustrates the difficulty levels assigned to each phrase of a song; the adaptive adjustment process assigns a set of difficulty levels to each phrase. In particular, when making levels, the system starts with the simplest abbreviation of a phrase and then makes small changes on each iteration of that phrase's difficulty until the full part is represented. Since some phrases are naturally more difficult than others, each phrase uses as many difficulty levels as necessary in order to create a smooth transition from level 0 to the full part. Thus, in the example shown in FIG. 27, the C phrase is more complex than the other phrases and is built with a larger number of difficulty levels (10).



FIG. 28 illustrates a level up logic of the adaptive adjustment process, which is the process by which a player's difficulty for a particular phrase is increased. The player must get N% (e.g., 100%) of the notes in a phrase in order to level up (increase the difficulty of) that phrase. The leveling up takes place on the next instance of that same phrase. If the player only gets M%-O% (e.g., 50-99%) of the notes, the phrase does not level up. In the example shown in FIG. 28, the player levels up (levels 0-2), but gets stuck on level 3 for three instances of the phrase before continuing to increase in difficulty.



FIG. 29 illustrates a level down logic of the adaptive adjustment process, which is the process by which a player's difficulty for a particular phrase is reduced. During this process, if the player only gets X%-Y% (e.g., 25-49%) of the notes, the phrase is marked with a warning. If the player doesn't perform above M% (e.g., 50%) on the next instance of that phrase, then the player is dropped down a level for that phrase. If the player does perform above M% (e.g., 50%) on the next instance of that phrase, the warning marker is erased. The leveling down takes place on the next instance of that same phrase. If the player performs below X% (e.g., 25%) for a particular phrase, the game drops them down a level immediately. In the example shown in FIG. 29, the player scores below X% (e.g., 25%) and drops immediately, remains stable on level 3, but then starts to slip again, so the phrase is marked, and after scoring below M% (e.g., 50%) a second time the player is dropped down again.
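Taken together, the level up and level down rules can be sketched as a single per-instance function. This is a sketch using the example thresholds from the text (N=100%, M=50%, X=25%); the function name and the treatment of exactly 50% as a pass are assumptions.

```python
def adjust_phrase(level, pct_hit, warned):
    """Return (new_level, warned) for one instance of a phrase.

    Rules from the text: level up at 100% of notes hit; 50-99% holds the
    level and clears any warning; 25-49% marks a warning (or drops a level
    if already warned); below 25% drops a level immediately.
    """
    if pct_hit >= 100:
        return level + 1, False          # level up on the next instance
    if pct_hit < 25:
        return max(level - 1, 0), False  # immediate level down
    if pct_hit < 50:
        if warned:
            return max(level - 1, 0), False  # warned phrase failed again
        return level, True               # mark the phrase with a warning
    return level, False                  # 50-99%: hold level, erase warning
```

In the game this would be evaluated each time an instance of the phrase finishes, with the warning flag stored per phrase.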



FIG. 30 illustrates a new phrase logic of the adaptive adjustment process. The New Phrase Logic keeps the player from having to start back at level one every time they encounter a phrase they have never played before. When the game sees a new phrase, the adaptive adjustment process calculates its level based on the average of all previously completed phrases and references a matching table (an example of which is shown in FIG. 30) that indicates the level at which the new phrase should be presented. In the example in FIG. 30, the new phrase is the Chorus and the new phrase level of the chorus is calculated by taking the average of the previously played phrases (the verse), rounding that number down, and referencing the new phrase matching table. In this example, the New Phrase Logic has leveled the chorus to level 2.



FIG. 31 illustrates disparity phrases that are handled by the adaptive adjustment process. Disparity is a marker placed on a phrase when there are very few instances of the phrase. Similar to the new phrase logic, a disparity phrase's level is calculated by taking the average of all previously completed phrases in a song and referencing a matching table. However, while the level for a New Phrase is only calculated once, the level for a Disparity Phrase is recalculated every time. Once the player levels up a Disparity Phrase above the average calculated for it, the player's level is remembered instead. By constantly recalculating the average, the system levels out the disparity in leveling up infrequent phrases. In the example in FIG. 31, the average for all the phrases in the song is calculated, the result is rounded down, and the Disparity Matching Table is referenced to place the disparity phrase at level 2.
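Both the New Phrase Logic and the Disparity Phrase Logic reduce to an average-then-lookup step, which can be sketched as follows. The table contents here are hypothetical placeholders; the real matching tables are the ones shown in FIGS. 30 and 31.

```python
import math

# Hypothetical matching table: index = floor(average level of previously
# completed phrases), value = level at which to present the phrase.
MATCHING_TABLE = [0, 1, 1, 2, 3, 3]

def starting_level(completed_levels):
    """Derive a presentation level for a new or disparity phrase.

    Takes the average of all previously completed phrase levels, rounds
    down, and references the matching table, as described for FIGS. 30-31.
    """
    if not completed_levels:
        return 0  # nothing completed yet: start at the simplest level
    avg = math.floor(sum(completed_levels) / len(completed_levels))
    avg = min(avg, len(MATCHING_TABLE) - 1)  # clamp to the table range
    return MATCHING_TABLE[avg]
```

For a New Phrase this would run once; for a Disparity Phrase it would be re-run on every instance until the player's own level for the phrase exceeds the computed value.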



FIG. 32 illustrates a persistent dynamic difficulty table that is part of the adaptive adjustment process. In the game, each player has a Persistent Dynamic Difficulty Table saved with their profile. The table tracks the number of repetitions a player must perform with 100% accuracy in order to level up a phrase from a given level, and the table has a min and a max for each level as shown. The values for Min # of Repetitions and Max # of Repetitions should be stored in an easily modifiable array.
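A minimal in-memory shape for such a table might look like the sketch below. The dict layout, the choice to seed the current requirement from the max, and the function names are assumptions; the patent only specifies that each level carries a min, a max, and a float-valued requirement that is rounded up for display.

```python
import math

def make_difficulty_table(min_reps, max_reps):
    """Build one row per difficulty level from parallel min/max arrays.

    Keeping the bounds in plain lists keeps them 'easily modifiable',
    as the text suggests for the Min/Max # of Repetitions values.
    """
    return [{"min": lo, "max": hi, "required": float(hi)}
            for lo, hi in zip(min_reps, max_reps)]

def displayed_reps(row):
    """The table retains float values, but required repetitions are rounded up."""
    return math.ceil(row["required"])
```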



FIG. 33 illustrates a process for updating 3300 the dynamic difficulty table as part of the adaptive adjustment process that occurs for each song and starts when a song starts (3302). When a song starts, the game plays a particular phrase at level N (3304) and then determines if the difficulty of the particular phrase should be leveled up (3306). If the player's skill at the particular phrase warrants a level up (see description above), then the process assigns a value of R+S (the number of repetitions of the particular phrase at level N and a success value) to the phrase (3308) and then determines if the song has ended (3310) and loops back to the next phrase (3304) if the song has not ended. If the player's skill at the particular phrase does not warrant a level up, then the process determines if the difficulty of the particular phrase should be leveled down (3312). If the player's skill at the particular phrase warrants a level down, then the process assigns a value of R+F (the number of repetitions of the particular phrase at level N and a failure value) to the phrase (3314) and then determines if the song has ended (3310) and loops back to the next phrase (3304) if the song has not ended. In this manner, each phrase of the song is completed and “scored”.


When the song is completed, the process determines if the repetition value for each phrase is greater than a max change value (1 in this example) (3316). If the repetition value for each phrase is greater than the max change value, then the repetition value is changed to the max change value (3318) and the new value of R is written into the dynamic difficulty table for the particular phrase at level N and higher (3320). If the repetition value for each phrase is not greater than the max change value, the process determines if the repetition value for each phrase is greater than zero (3322). If the repetition value for each phrase is greater than zero, then the original value of R is written into the dynamic difficulty table for the particular phrase at level N and higher (3320). If the repetition value for each phrase is not greater than zero, then the process determines if the repetition value for each phrase is less than the negative max change value (3324). If the repetition value for each phrase is less than the negative max change value, then the repetition value is changed to the negative max change value (3326) and the new repetition value is written into the dynamic difficulty table for the particular phrase at level N and lower (3328). If the repetition value for each phrase is not less than the negative max change value, then the original value of the repetition value is written into the dynamic difficulty table for the particular phrase at level N and lower (3328). Using this process, the dynamic difficulty table for each phrase of each song is updated, examples of which are shown in FIGS. 34A-C described below.
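One plausible reading of the clamp-and-write-back steps (3316-3328), combined with the propagation directions shown in FIG. 34 (a level up lowers the requirement at that level and below, a level down raises it at that level and above), is sketched below. The sign convention (positive delta means fewer required repetitions) and the max change value of 1 are assumptions drawn from the example.

```python
def update_required(required, level, delta, max_change=1.0):
    """Clamp the accumulated repetition delta R and write it into the table.

    required: list of required-repetition values, one per difficulty level.
    level:    the level N at which the phrase was played this song.
    delta:    accumulated R (successes add, failures subtract) for the song.
    """
    # Steps 3316-3326: clamp R to the +/- max change value.
    delta = max(-max_change, min(max_change, delta))
    new = list(required)
    if delta > 0:
        # Level up: propagate the reduction to level N and below (FIG. 34A).
        for i in range(level + 1):
            new[i] = max(new[i] - delta, 0.0)
    elif delta < 0:
        # Level down: propagate the increase to level N and above (FIG. 34B).
        for i in range(level, len(new)):
            new[i] = new[i] - delta
    return new
```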



FIGS. 34A-C illustrate examples of a level up action result on a dynamic difficulty table, a level down action result on the dynamic difficulty table and a dynamic difficulty table at the end of a song, respectively. Leveling up a phrase modifies the required number of repetitions at that level and every level below, while leveling down a phrase modifies the required number of repetitions at that level and every level above. While the table retains the float information, the required repetitions are rounded up. In addition, changes to the Persistent Dynamic Difficulty Table are recorded upon the completion of a song as shown in FIG. 33 above. In FIG. 34A, the player total level up (2×LUM) for level 2 gets propagated down to the lower levels so that levels 0-2 are given a value of “1”. In FIG. 34B, the player total level down (1×LDM) for level 4 gets propagated up to the higher levels so that levels 4-6 have the same “2.5” value. FIG. 34C is an example of the table for a particular player at the end of a song.

    • Because the skills required by different arrangement types vary, a player needs a separate persistent dynamic difficulty table for each arrangement type. Thus, for each player, there may be three persistent dynamic difficulty tables in total: a Single Note/Lead table, a Combo table, and a Chord/Rhythm table.



FIG. 35 illustrates counting phrase repetitions as part of the adaptive adjustment process. In particular, successfully completed repetitions of a given phrase are saved per song with the player's profile, but the repetitions completed counter resets after reaching the required amount. In addition, the repetitions completed counter resets if the player levels down from a given phrase.


The adaptive adjustment process handles redundant levels, which are levels in a phrase that are copies of the previous level. Redundant Levels are used to match the development of phrases when there are no options to build the density slowly enough. In the normal/event mode of the game, leveling up TO a redundant level should NOT trigger level up messaging and feedback, while leveling up FROM a redundant level to a non-redundant level should trigger all normal level up messaging. In the leveler mode of the game, the leveler should skip over redundant levels and display the next non-blank phrase.


The adaptive adjustment process also handles linked phrases, which are phrases that have their difficulty levels related to each other in a parent/child relationship; the linked phrases form a linked group. In these groups, there is one Parent Phrase and one or more Child Phrases. When a Parent Phrase levels up, the Child Phrases at levels less than or equal to the Parent Phrase's level are leveled up X amount. When a Parent Phrase levels down, all Child Phrases at levels greater than or equal to the Parent Phrase's level are leveled down Y amount. When a Child Phrase levels up or down, there is no change to the Parent Phrase or the other Child Phrases. The X and Y variables are controlled on a group-by-group basis as editable properties from the editor tool.
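The linked-group rules can be sketched as a small function over a group's level map. The function name is an assumption, and the text does not say whether children are compared against the parent's level before or after the change; the pre-change level is assumed here.

```python
def level_linked_group(levels, parent, delta, x=1, y=1):
    """Apply a Parent Phrase level change to its linked group.

    levels: dict mapping phrase name -> current difficulty level.
    delta:  +1 when the parent levels up, -1 when it levels down.
    Children at levels <= the parent's level follow an up move by X;
    children at levels >= the parent's level follow a down move by Y.
    A child leveling on its own would not call this at all.
    """
    parent_level = levels[parent]
    new = dict(levels)
    new[parent] = max(parent_level + delta, 0)
    for name, lvl in levels.items():
        if name == parent:
            continue
        if delta > 0 and lvl <= parent_level:
            new[name] = lvl + x
        elif delta < 0 and lvl >= parent_level:
            new[name] = max(lvl - y, 0)
    return new
```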


The adaptive adjustment process also handles empty levels, which are levels with no notes in them. Empty levels are often used in conjunction with Linked Phrases or Disparity to hold off the introduction of a phrase until the later levels. During the normal/event mode of the game, empty levels should NOT automatically level up; empty levels should only be leveled up through Linked or Disparity relationships. In the play next mode of the game, play next should not recommend practicing a phrase level that is an empty level. In the leveler mode of the game, the user should NOT be able to manually select sections for the Leveler that are empty. In the accelerator mode, the user should NOT be able to manually select sections for the Accelerator that are empty.


The adaptive adjustment process also handles ignore phrases, which are flagged phrases that do not advance or demote the dynamic difficulty (i.e., playing an Ignore Phrase perfectly won't level up the phrase). They are usually instances of a phrase that are not the complete phrase, for design reasons. In the play next mode of the game, the play next mode should NOT recommend challenges that use ignore phrases and should instead recommend a more common instance of the same phrase.


The adaptive adjustment process also handles level jumps, which are flags on levels of a phrase that halt the advancement of the dynamic difficulty. In the game, there can be more than one Level Jump per phrase. The level jump stops the advancement of the dynamic difficulty at the level before the level that is marked (i.e., if the 8th level of an A phrase is a level jump, then the player cannot advance past the 7th level).


Each level jump has a challenge (level jump challenge) to unlock the level jump. The level jump challenges are a combination of the Free Speed and Accelerator challenges, and players need to complete the level jump challenges in order to unlock Level Jumps in Normal Mode. In the game, there are two ways to unlock a level jump challenge: 1) leveling up to the Level Jump from Normal Mode; and 2) leveling up to the Level Jump from Leveler. The player must complete the Level Jump Challenge above X% in order to unlock the Level Jump in Normal Mode. Once the player completes the Level Jump Challenge, the Level Jump disappears. In the normal game play mode, all notes of a Level Jump and notes of levels past a Level Jump are counted as bonus points, the messaging indicates that a Level Jump Challenge has been unlocked, and level jumps need to be removed once completed. In the leveler mode of the game, level jump challenges are unlockable from Leveler, but not playable in Leveler, and the messaging indicates that a Level Jump Challenge has been unlocked.


Multilayer Timing


The instrument game system may also have, in certain embodiments, multi-layer timing (which contributes to an anticipatory display). Multi-layer timing is a method of providing more than one indicator for conveying timing information to the end user in the user interface of the game. The multi-layer timing may include spatial timing, rotational timing and container growth timing, which are described below in more detail.



FIG. 36 illustrates spatial timing that is part of the game. In particular, as the notes come down the noteway, the distance from the fingerboard and the speed at which the note approaches the fingerboard convey the amount of time before the note needs to be struck.



FIG. 37 illustrates rotational timing that is part of the game. In particular, as the notes come down the noteway, the notes rotate 90 degrees to line up with the “containers” (described above) that appear on the fingerboard. This rotation begins a measure before the note lines up with the fingerboard, providing a secondary timing cue for the note. In FIG. 37, a default state (FIG. 2a.) is shown with the note rectangle being vertical. One measure away from the fingerboard (and one measure before the exact moment the note needs to be played), the note begins to rotate (FIG. 2b.). In the moment just before the note is to be played, the note has almost fully rotated (FIG. 2c.).



FIG. 38 illustrates container growth timing that is part of the game. In particular, the container that is on the fingerboard starts off small, and gradually grows as the note gets closer. The container will grow to its maximum size a measure before the note lands on the fingerboard.
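The rotational and container growth timing layers are both interpolations over the note's remaining time to the fingerboard, and can be sketched as follows. The text fixes only the endpoints (rotation spans the final measure; the container reaches full size a measure early), so the minimum scale and the two-measure growth start are assumptions.

```python
def rotation_degrees(time_to_hit, measure_len):
    """Rotational timing: the note rotates 90 degrees over the final measure."""
    if time_to_hit >= measure_len:
        return 0.0  # more than a measure out: still in the default state
    return 90.0 * (1.0 - max(time_to_hit, 0.0) / measure_len)

def container_scale(time_to_hit, measure_len, min_scale=0.2):
    """Container growth timing: the container grows to full size a measure
    before the note lands. Growth starting two measures out is an assumption."""
    if time_to_hit <= measure_len:
        return 1.0  # already at maximum size
    start = 2 * measure_len
    frac = min(max((start - time_to_hit) / measure_len, 0.0), 1.0)
    return min_scale + (1.0 - min_scale) * frac
```

A renderer would evaluate both functions each frame from the note's scheduled strike time minus the current song position.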


While the foregoing has been with reference to a particular embodiment of the invention, it will be appreciated by those skilled in the art that changes in this embodiment may be made without departing from the principles and spirit of the invention, the scope of which is defined by the appended claims.

Claims
  • 1. A non-transitory computer readable storage medium having a computer program stored thereon for instructing a user on playing a musical instrument, wherein the computer program instructs a processing element to perform the following steps: present, on an electronic display, a graphical representation of a plurality of notes or chords of a song to be played on the musical instrument and by the user during a session, wherein the session corresponds to a single playing of the song, wherein each said note or chord is associated with a predetermined data file for the note or chord; divide the song into a plurality of phrases, wherein each phrase includes at least one of said plurality of notes or chords; assign at least a first difficulty level and at least a second difficulty level to each phrase; advance, on the electronic display, the graphical representation of said plurality of notes or chords to present each of the plurality of phrases of the song so as to instruct playing, on the musical instrument and by the user, the song, wherein at least one of said plurality of phrases is presented on the electronic display at said first difficulty level; receive an output corresponding to an electronic signal for each of said plurality of notes or chords produced by the musical instrument as played by the user; compare the received output against said predetermined data file for each of said plurality of notes or chords produced by the musical instrument as played by the user; for each phrase including at least one note or chord of said plurality of notes or chords, determine an accuracy of play, by the user, of the phrase, wherein the accuracy of play is at least in part based on said step of comparing the received output against said predetermined data file, wherein the accuracy of play is determined in real time as the user plays the at least one note or chord of each phrase of the song during the session; and upon the accuracy of play being above a predetermined threshold for said at least one of said plurality of phrases, automatically adjust the advanced graphical representation of the presented at least one of said plurality of phrases from the first difficulty level to the second difficulty level, wherein the automatic adjustment of the presented at least one of said plurality of phrases from the first difficulty level to the second difficulty level is performed during the session.
  • 2. The computer readable storage medium of claim 1, wherein the first difficulty level is an initial difficulty level that represents a simplest abbreviation of a phrase.
  • 3. The computer readable storage medium of claim 1, wherein the first difficulty level is based on an average difficulty level of all successfully completed phrases previously played by the user in the single session of the song.
  • 4. The computer readable storage medium of claim 1, wherein the first difficulty level is based on a difficulty level attained by the user during a previous playing of the song in a prior session.
  • 5. The computer readable storage medium of claim 1, wherein the first and second difficulty levels are based on a skill level of the user.
  • 6. The computer readable storage medium of claim 1, wherein each of said plurality of phrases further includes a third difficulty level for the phrase that can be presented during the session.
  • 7. The computer readable storage medium of claim 1, wherein the predetermined data file for each said note or chord includes note tag information and tag time information.
  • 8. The computer readable storage medium of claim 1, wherein the session begins when the graphical representation of said plurality of notes or chords of the song is presented on the electronic display, and wherein the session ends when the advancement of the plurality of notes or chords of the song on the electronic display is completed.
  • 9. A method for instructing a user on playing a musical instrument comprising the steps of: presenting, by at least one processing element and on an electronic display, a graphical representation of a plurality of notes or chords of a song to be played on the musical instrument and by the user during a session, wherein the session corresponds to a single playing of the song, wherein each said note or chord is associated with a predetermined data file for the note or chord; dividing, by at least one processing element, the song into a plurality of phrases, wherein each phrase includes at least one of said plurality of notes or chords; assigning, by at least one processing element, at least a first difficulty level and at least a second difficulty level to each phrase; advancing, by at least one processing element and on the electronic display, the graphical representation of said plurality of notes or chords to present each of the plurality of phrases of the song so as to instruct playing, on the musical instrument and by the user, the song, wherein at least one of said plurality of phrases is presented on the electronic display at said first difficulty level; receiving, by at least one processing element, an output corresponding to an electronic signal for each of said plurality of notes or chords produced by the musical instrument as played by the user; comparing, by at least one processing element, the received output against said predetermined data file for each of said plurality of notes or chords produced by the musical instrument as played by the user; for each phrase including at least one note or chord of said plurality of notes or chords, determining, by at least one processing element, an accuracy of play, by the user, of the phrase, wherein the accuracy of play is at least in part based on said step of comparing the received output against said predetermined data file, wherein the accuracy of play is determined in real time as the user plays the at least one note or chord of each phrase of the song during the session; and upon the accuracy of play being above a predetermined threshold for said at least one of said plurality of phrases, automatically adjusting, by at least one processing element, the advanced graphical representation of the presented at least one of said plurality of phrases from the first difficulty level to the second difficulty level, wherein the automatic adjustment of the presented at least one of said plurality of phrases from the first difficulty level to the second difficulty level is performed during the session.
  • 10. The method of claim 9, wherein the first difficulty level is an initial difficulty level that represents a simplest abbreviation of a phrase.
  • 11. The method of claim 9, wherein the first difficulty level is based on an average difficulty level of all successfully completed phrases previously played by the user in the single session of the song.
  • 12. The method of claim 9, wherein the first difficulty level is based on a difficulty level attained by the user during a previous playing of the song in a prior session.
  • 13. The method of claim 9, wherein the first and second difficulty levels are based on a skill level of the user.
  • 14. The method of claim 9, wherein each of said plurality of phrases further includes a third difficulty level for the phrase that can be presented during the session.
  • 15. The method of claim 9, wherein the predetermined data file for each said note or chord includes note tag information and tag time information.
  • 16. The method of claim 9, wherein the session begins when the graphical representation of said plurality of notes or chords of the song is presented on the electronic display, and wherein the session ends when the advancement of the plurality of notes or chords of the song on the electronic display is completed.
  • 17. A system for instructing a user on playing a musical instrument comprising: a processing element; and a memory element associated with a non-transitory computer readable storage medium, wherein the non-transitory computer readable storage medium has a computer program stored thereon, wherein the computer program instructs the processing element to perform the following steps: present, on an electronic display, a graphical representation of a plurality of notes or chords of a song to be played on the musical instrument and by the user during a session, wherein the session corresponds to a single playing of the song, wherein each said note or chord is associated with a predetermined data file for the note or chord; divide the song into a plurality of phrases, wherein each phrase includes at least one of said plurality of notes or chords; assign at least a first difficulty level and at least a second difficulty level to each phrase; advance, on the electronic display, the graphical representation of said plurality of notes or chords to present each of the plurality of phrases of the song so as to instruct playing, on the musical instrument and by the user, the song, wherein at least one of said plurality of phrases is presented on the electronic display at said first difficulty level; receive an output corresponding to an electronic signal for each of said plurality of notes or chords produced by the musical instrument as played by the user; compare the received output against said predetermined data file for each of said plurality of notes or chords produced by the musical instrument as played by the user; for each phrase including at least one note or chord of said plurality of notes or chords, determine an accuracy of play, by the user, of the phrase, wherein the accuracy of play is at least in part based on said step of comparing the received output against said predetermined data file, wherein the accuracy of play is determined in real time as the user plays the at least one note or chord of each phrase of the song during the session; and upon the accuracy of play being above a predetermined threshold for said at least one of said plurality of phrases, automatically adjust the advanced graphical representation of the presented at least one of said plurality of phrases from the first difficulty level to the second difficulty level, wherein the automatic adjustment of the presented at least one of said plurality of phrases from the first difficulty level to the second difficulty level is performed during the session.
  • 18. The system of claim 17, wherein the first difficulty level is selected from the group consisting of (1) an initial difficulty level that represents a simplest abbreviation of a phrase; (2) an average difficulty level of all successfully completed phrases previously played by the user in the single session of the song; and (3) a difficulty level attained by the user during a previous playing of the song in a prior session.
  • 19. The system of claim 17, wherein the predetermined data file for each said note or chord includes note tag information and tag time information.
  • 20. The system of claim 17, wherein the session begins when the graphical representation of said plurality of notes or chords of the song is presented on the electronic display, and wherein the session ends when the advancement of the plurality of notes or chords of the song on the electronic display is completed.
PRIORITY CLAIM

This application claims priority under 35 USC 120 and is a continuation-in-part of U.S. patent application Ser. No. 11/865,681, filed on Oct. 1, 2007, entitled “Instrument Game System and Method,” which in turn claims priority under 35 USC 119(e) and 120 to U.S. Provisional Patent Application Ser. No. 60/902,066, filed on Feb. 20, 2007, entitled “A Music Video Game with Stringed Instrument Input,” which is incorporated herein by reference.

20020004191 Tice et al. Jan 2002 A1
20020157521 Shahal Oct 2002 A1
20020194984 Pachet Dec 2002 A1
20030066414 Jameson Apr 2003 A1
20030096556 Fong May 2003 A1
20030151628 Salter Aug 2003 A1
20040055441 Katsuta Mar 2004 A1
20040123726 Kato et al. Jul 2004 A1
20040158462 Rutledge et al. Aug 2004 A1
20040192430 Burak et al. Sep 2004 A1
20040233239 Lahdesmaki Nov 2004 A1
20040255758 Klefenz et al. Dec 2004 A1
20050034591 Chang Feb 2005 A1
20050235812 Fallgatter Oct 2005 A1
20050252362 McHale et al. Nov 2005 A1
20050255914 McHale et al. Nov 2005 A1
20060020469 Rast Jan 2006 A1
20060050901 Cohen et al. Mar 2006 A1
20060107819 Salter May 2006 A1
20060107826 Knapp et al. May 2006 A1
20060151803 Wesner et al. Jul 2006 A1
20060196343 Yung Sep 2006 A1
20060236850 Shaffer Oct 2006 A1
20060245599 Regnier Nov 2006 A1
20070000375 Harrison, Jr. Jan 2007 A1
20070003073 Iriarte Jan 2007 A1
20070044638 Egan Mar 2007 A1
20070051226 Diaz Mar 2007 A1
20070066403 Conkwright Mar 2007 A1
20070163427 Rigopulos et al. Jul 2007 A1
20070232374 Lopiccolo et al. Oct 2007 A1
20070234885 Schmidt et al. Oct 2007 A1
20070245881 Egozy et al. Oct 2007 A1
20070256540 Salter Nov 2007 A1
20070256541 McCauley Nov 2007 A1
20070256543 Evans et al. Nov 2007 A1
20070256551 Knapp et al. Nov 2007 A1
20080078281 Katsuta Apr 2008 A1
20080113797 Egozy May 2008 A1
20080122922 Geng May 2008 A1
20080173162 Williams Jul 2008 A1
20080200224 Parks Aug 2008 A1
20080205669 Michelet Aug 2008 A1
20080318673 Rofougaran Dec 2008 A1
20090038467 Brennan Feb 2009 A1
20090064851 Morris et al. Mar 2009 A1
20090098918 Teasdale et al. Apr 2009 A1
20090100988 Villa et al. Apr 2009 A1
20090100991 Lazovic Apr 2009 A1
20090104956 Kay et al. Apr 2009 A1
20090191932 Chiu et al. Jul 2009 A1
20100009755 Burckart et al. Jan 2010 A1
20100016079 Jessop Jan 2010 A1
20100087254 Sullivan Apr 2010 A1
20100137048 Kim et al. Jun 2010 A1
20100137049 Epstein Jun 2010 A1
20100216549 Salter Aug 2010 A1
20100313736 Lenz Dec 2010 A1
20110003638 Lee et al. Jan 2011 A1
20110028218 Gomes et al. Feb 2011 A1
20110203442 Raveendran Aug 2011 A1
20110207513 Cross et al. Aug 2011 A1
20110247479 Helms et al. Oct 2011 A1
20110259176 Pillhofer et al. Oct 2011 A1
20120090446 Moreno Apr 2012 A1
20120266738 Evans et al. Oct 2012 A1
20130036897 Parks Feb 2013 A1
20130065656 Parks Mar 2013 A1
20140109750 Reynolds et al. Apr 2014 A1
Foreign Referenced Citations (6)
Number Date Country
1465150 Jun 2004 EP
WO 9858364 Dec 1998 WO
WO 0195052 Dec 2001 WO
WO 2006042358 Apr 2006 WO
2008103269 Aug 2008 WO
2012125763 Sep 2012 WO
Non-Patent Literature Citations (28)
Entry
PCT International Search Report of PCT/US12/29123; dated Jun. 20, 2012 (2 pages).
PCT Written Opinion of PCT/US12/29123; dated Jun. 20, 2012 (5 pages).
Cakewalk © Pro Audio™ User's Guide Copyright © 1999 (604 pages).
Webpage: http://web.archive.org/web/20070920223307/http://www.seventhstring.com/tuner/tuner.html, dated Sep. 20, 2007 (5 pages).
Webpage: http://web.archive.org/web/2007071209222/http://www.nch.com.au/twelvekeys, dated Jul. 12, 2007 (1 page).
Webpage: http://web.archive.org/web/20080309172950/http://www.owlnet.rice.edu/˜elec301/Projects02/realTime/TheProgram.htm, dated Mar. 9, 2008 (1 page).
Webpage: http://marketplace.xbox.com/en-US/Product/Karaoke-Revolution/66acd000-77fe-1000-9115-d8024b4e0803, dated Aug. 27, 2012 (3 pages).
Webpage: http://web.archive.org/20070705093418/http://en.wikipedia.org/wiki/Karaoke_Revolution, dated Jul. 5, 2007 (4 pages).
Webpage: http://web.archive.org/web/20070929061546/http://www.noteworthysoftware.com, dated Sep. 29, 2007 (2 pages).
Webpage: http://web.archive.org/web/20070218101428/http://www.sloud.com/technology/query_by_humming, dated Sep. 27, 2007 (2 pages).
Webpage: http://web.archive.org/web/20070928013629/http:/www.ultrastarstuff.com/ dated Sep. 27, 2007 (2 pages).
Webpage: http://web.archive.org/web/20070809101756/http://www.starplaymusic.com/index.php, dated Aug. 9, 2007 (2 pages).
PCT/US 08/01965, International Preliminary Report, dated Sep. 3, 2009 (2 pages).
PCT/US 08/01965, Written Opinion, Aug. 1, 2008 (5 pages).
PCT/US 08/01965, International Search Report, Jun. 9, 2008 (3 pages).
PCT/US 08/01965, Written Opinion, Jun. 9, 2008 (6 pages).
Miracle Piano Teaching System, from Wikipedia, 1990; http://en.wikipedia.org/wiki/Miracle_Piano (2 pages).
Miracle Piano Teaching System, Nintendo Entertainment System User's Guide, 1990; 1991 The Software Toolworks, Inc. (34 pages).
Teach Me Piano (formerly Discovering Keyboards), Voyetra Technologies, 1997 (copyright 1996) (2 pages).
Karaoke Revolution, Konami, Nov. 2003 (6 pages).
International Search Report dated Jul. 8, 2010 for PCT application PCT/US2009/065413.
Greg Johnson. www.littleBigStar.net. Games with music instruments/make music by gaming. 2007. Accessed Jul. 5, 2010.
JamOrigin. www.jamorigin.com. Amazing Guitar Recognition Engine. 2010. Accessed Jul. 5, 2010.
Offbeat-Guitarist-The Guide. http://offbeatforum.com/entry.php?2-OFFBEAT-Guitarist-The-Guide.com. 2010. Accessed Jul. 5, 2010.
DJ Hero 101. www.djhero.com. Available at http://www.djhero.com/game/djhero101. Accessed Feb. 3, 2010.
DJ Hero. Wikipedia. Available at http://en.wikipedia.org/wiki/Dj_hero. Accessed Feb. 3, 2010.
Guitar Hero. Wikipedia. Available at http://en.wikipedia.org/wiki/Guitar_Hero. Accessed Feb. 3, 2010.
U.S. Appl. No. 60/902,066, entitled “Music Video Game with Stringed Instrument Input,” filed Feb. 20, 2007.
Related Publications (1)
Number Date Country
20110207513 A1 Aug 2011 US
Provisional Applications (1)
Number Date Country
60902066 Feb 2007 US
Continuation in Parts (1)
Number Date Country
Parent 11865681 Oct 2007 US
Child 13047727 US