The present invention relates to video games and, more specifically, to rhythm-action games that simulate the experience of playing in a band.
Music making is often a collaborative effort among many musicians who interact with each other. One form of musical interaction may be provided by a video game genre known as “rhythm-action,” which involves a player performing phrases from an assigned, prerecorded musical composition using a video game's input device to simulate a musical performance. If the player performs a sufficient percentage of the notes or cues displayed for the assigned part, the player may score well for that part and win the game. If the player fails to perform a sufficient percentage, the player may score poorly and lose the game. Two or more players may compete against each other, such as by each one attempting to play back different, parallel musical phrases from the same song simultaneously, by playing alternating musical phrases from a song, or by playing similar phrases simultaneously. The player who plays the highest percentage of notes correctly may achieve the highest score and win.
Two or more players may also play with each other cooperatively. In this mode, players may work together to play a song, such as by playing different parts of a song, either on similar or dissimilar instruments. One example of a rhythm-action game with different instruments is the ROCK BAND® series of games, developed by Harmonix Music Systems, Inc. and published by Electronic Arts, Inc. and MTV Games. ROCK BAND® simulates a band experience by allowing players to play a rhythm-action game using various simulated instruments, e.g., a simulated guitar, a simulated bass guitar, a simulated drum set, or by singing into a microphone. Other examples of rhythm-action games, focused specifically on singing or vocal performances, are the KARAOKE REVOLUTION® series of games published by Konami Digital Entertainment, the SINGSTAR® series published by SONY Computer Entertainment, and LIPS™ published by Microsoft Corporation. An example of prior art systems and methods for comparing a received vocal input's pitch and timing to the pitch of a particular vocal track is U.S. Pat. No. 7,164,076 to McHale et al.
Prior rhythm-action games directed to vocal performance typically allow one or more players to sing the main vocal part of a song, i.e., the vocal melody. Often the interfaces of these games are similar to traditional karaoke interfaces in that the lyrics appear as words on a display in a sequence and some indication is given to the player of which lyrics should be sung when. For example, in Microsoft's LIPS™ game, a lyrical phrase is displayed in white text in the center of the screen and, when a word is supposed to be sung, that word's text changes color from white to yellow. Naturally, at the end of the phrase, the text of the entire phrase is yellow. While the current phrase is being performed, the next phrase is displayed in grey text below the current phrase; at the end of the current phrase, the new phrase is shifted up and the text is changed from grey to white.
Beyond what is offered by traditional karaoke systems, many vocal-oriented rhythm-action games also indicate to the player the pitch the player is expected to sing. In LIPS™, a series of stationary hollow horizontal cues or “note tubes” are arranged vertically according to the pitch the person is expected to sing; higher notes are displayed as tubes located higher on the display than tubes representing lower notes. The length of a note tube generally indicates the duration of the lyric or syllable, and the tubes fill in with color only when the player is singing on key. When the next phrase is to be sung, the prior set of tubes disappears and the next set of tubes is displayed.
In SINGSTAR®, a similar pitch-relative stationary tube system is used—that is, hollow tubes show what the player is expected to sing—but the input from the player also paints tubes on the screen reflecting the player's pitch. This has the effect of filling in the hollow tubes when the player is on key and coloring in areas above and below the tube when the player's voice is sharp or flat, respectively, relative to the expected pitch.
KARAOKE REVOLUTION® presents pitch differently than LIPS™ or SINGSTAR®. In KARAOKE REVOLUTION®, a lane is displayed to the player with note tubes within it that scroll from right to left, with lyrics that scroll under the corresponding note tubes. Both the lyrics and note tubes pass through the vertical plane of a target marker, or “Now Bar,” that indicates when the lyric is supposed to be sung and at what pitch. Additionally, an arrow-shaped pitch indicator moves vertically within the lane with respect to the note tube to indicate how sharp or flat the player's voice is compared to the expected pitch represented by the note tube; when the player is on key, the indicator aligns with the note tube and gives off “sparks.” Unfortunately, the display method used in KARAOKE REVOLUTION® is incompatible with certain displays and causes the lyrics of a song to “tear” and blur, thereby interfering with the player's enjoyment of the game.
Another problem with prior rhythm-action games is the handling of multiple singers. Often only one player is allowed to provide the vocals for a group. Where multiple players can sing, even as a group, players' performances are isolated with respect to each other—that is, for each player a separate lane is presented on the display. This is true even when both players are singing the exact same part. Furthermore, though these existing rhythm-action games provide a single user with the ability to sing as part of a band, or sing the same vocal parts as another player, i.e., both players sing the melody as a duet, or one sings lead and the other sings backup vocals, none allow players to dynamically switch which vocal part the player is singing. In existing rhythm-action games where players can sing simultaneously, players are locked into a particular part. For example, in KARAOKE REVOLUTION® PRESENTS AMERICAN IDOL® ENCORE, during a “True Duet” where two players sing simultaneously, before the song starts, one player must choose to sing “lead vocals” while the second player chooses “backup.” Once gameplay begins, each player is required to sing only the phrases assigned to the part he or she initially selected, and is penalized as “missing” his or her assigned part if he or she sings the part assigned to the other player. These part assignments remain until the end of gameplay, thus preventing players from experimenting with different parts unless they physically exchange microphones. The present invention overcomes these deficiencies in several ways.
The present invention, implemented in various ways such as computerized methods executed by a game platform, executable instructions tangibly embodied in a computer readable storage medium, systems with an apparatus configured to perform particular functions, apparatuses with means for performing the functions, and other forms of technology, provides players of a rhythm-action game a playing experience that more closely resembles that of a cooperative band. This is achieved through several aspects of the invention.
In one aspect, there are methods, systems with an apparatus configured to perform particular functions, computer program products, and apparatuses that provide means for dynamically determining a musical part performed by a player of a rhythm-action game. In one aspect of a rhythm-action game, microphones are not tied to a particular part and therefore any player can play any of a number of parts, e.g., melody or harmony, lead or rhythm, guitar or bass, without switching instruments. This is accomplished by displaying, on a display, a plurality of target music data associated with a musical composition, receiving a music performance input data via the input device, determining which of the plurality of target music data has a degree of matching with the music performance input data, and assigning the music performance input data to the determined target music data.
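By way of non-limiting illustration, the dynamic part determination described above can be sketched as follows. The Python code below is only an example of the general approach; the function names, the semitone-based tolerance, and the example values are illustrative assumptions rather than a definitive implementation.

```python
# Illustrative sketch: assign a sung pitch to whichever displayed part it best matches.

def degree_of_matching(input_pitch, target_pitch, tolerance=1.0):
    """Return a matching degree in [0, 1]; zero when outside the tolerance window."""
    error = abs(input_pitch - target_pitch)        # distance in semitones
    if error > tolerance:
        return 0.0
    return 1.0 - (error / tolerance)               # closer pitch -> higher degree

def assign_input_to_part(input_pitch, targets, tolerance=1.0):
    """Pick the target music data (e.g., melody or harmony) that best matches the input.

    `targets` maps a part name to its expected pitch at the current time.
    Returns None when nothing matches, so no assignment is made.
    """
    best_part, best_degree = None, 0.0
    for part, expected_pitch in targets.items():
        degree = degree_of_matching(input_pitch, expected_pitch, tolerance)
        if degree > best_degree:
            best_part, best_degree = part, degree
    return best_part

# Example: a sung pitch near MIDI 62 is assigned to the harmony part, not the melody.
print(assign_input_to_part(62.2, {"melody": 67.0, "harmony": 62.0}))  # -> "harmony"
```

Because the assignment is recomputed as new input arrives, a player can drift from one part to another mid-song without any menu interaction.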
Beneficially, there are various embodiments of the methods, systems, computer program products, and apparatuses of the aspect. For example, in some embodiments, a score is generated by the game platform based on the degree of matching between the music performance input data and the target music data. In some embodiments, determining the degree of matching is based on a score assigned to the music performance input data with respect to the target music data. In other embodiments, determining the degree of matching is based on the music performance input data being within a tolerance threshold of the target music data. In still other embodiments, determining the degree of matching is based on the proximity of a visual representation of the music performance input data to a visual cue associated with the target music data. Determining the degree of matching can also take other factors into account. For example, determining the degree of matching can include ignoring an octave difference between the music performance input data and the target music data. This is accomplished, in some versions, by determining a first number, e.g., a MIDI number corresponding to the input (but the invention is not limited to MIDI), and determining a second number, such as the MIDI number corresponding to the target music data, i.e., to the expected input. Then, a modulo operation is performed on the first and second numbers to determine a difference between the music performance input data and the target music data. In some implementations, the modulo operation involves determining an above-number based on the first number that is within an octave above the second number. Then an above-difference between the above-number and the second number is determined. A below-number based on the first number that is within an octave below the second number is also determined, as is a below-difference between the below-number and the second number. Finally, the minimum of the above-difference and the below-difference is determined to provide the minimal pitch difference between the input and the expected input.
In some implementations, it is determined if the music performance input data is within a tolerance threshold of each of at least two target music data of the plurality of target music data. In these scenarios, alternative approaches to determining the degree of matching are provided based on the tolerance threshold. For example, determining the degree of matching may include assigning a score to each of the at least two target music data. Alternatively or additionally, determining the degree of matching can be based on determining that the music performance input data is no longer within the tolerance threshold of one of the at least two target music data.
An alternate embodiment instead displays a plurality of target music data associated with a musical composition on the display, similarly receives a music performance input data via an input device, but bases the degree of matching solely on determining that the music performance input data is within a tolerance threshold of one of the plurality of target music data. If it is, the music performance input data is assigned to the determined target music data.
Various implementations allow for input from various sources and display target music data accordingly. For example, in some versions, determined target music data is a vocal part of the musical composition. Or, alternatively, the determined target music data can be an instrumental part of the musical composition, such as a guitar part, a keyboard part, a drum part, or a bass guitar part of the musical composition. The sources for each input often correspond to the parts, e.g., for vocal parts, the input device is a microphone. Where the part is a guitar part, a simulated guitar is used and so forth.
With dynamic part determination, often the music performance input data is assigned to only one of the plurality of target music data at a time. Similarly, each of the plurality of target music data can have only one music performance input data assigned to it at a time. For example, while the music performance input data is assigned to the determined target music data, a second music performance input data would be prevented from being assigned to the determined target music data.
To prevent random assignments where the degree of matching is low with all target music data, one implementation involves displaying, on a display, a plurality of target music data associated with a musical composition. A music performance input data is received via an input device such as a microphone, and it is determined that none of the plurality of target music data has a degree of matching with the music performance input data. In this scenario, assignment of the music performance input data to any of the plurality of target music data is prevented until the music performance input data has a degree of matching with one of the plurality of target music data.
In another aspect, there are methods, systems with an apparatus configured to perform particular functions, computer program products, and apparatuses that provide means for biasing a musical performance input of a player of a rhythm-action game to a part in the game. In one aspect this is accomplished by providing, by a game platform, a history of a degree of matching between a prior music performance input data and a prior music data associated with a first part in a musical composition. Then, on a display, a plurality of target music data, each associated with a respective part in the musical composition, is displayed, with one of the plurality being associated with the first part. Music performance input data is received by the game platform via an input device, such as a microphone (although it could be a simulated guitar, drum, keyboard, or other simulated instrument), and, based on the history, the received music performance input is assigned to the target music data of the plurality that is associated with the first part.
Some implementations of the above methods, systems, computer program products, and apparatuses for biasing a musical performance input of a player of a rhythm-action game to a part in the game provide additional or alternative functionality. For example, in some versions, it is further determined that the music performance input data is within a tolerance threshold of each of at least two target music data of the plurality of target music data. Also, in some versions, the received music performance input data is compared to each of the plurality of target music data and a score is assigned to each comparison, e.g., one part that the music performance data was not close to may have a low score whereas a different part that the music performance input data was close to may have a high score.
Several methodologies are provided for determining the history of the degree of matching. For example, in some versions, the history of the degree of matching is determined based on a score assigned to the prior music performance input data with respect to the prior music data. Alternatively or additionally, the history of the degree of matching can be determined based on whether the prior music performance input data is within a tolerance threshold of the prior music data, or within a tolerance threshold that overlaps with a second tolerance threshold. Or, in some cases, the history of the degree of matching is determined based on the proximity of a visual representation of the prior music performance input data to a visual cue associated with the prior music data. In still other cases, the history is based on silence—that is, the history of the degree of matching is based on the prior music performance input data being silence when the prior music data indicated no music performance input data should be received.
The history of the degree of matching is typically stored in memory, such as a memory buffer, and often multiple memory buffers are used, one for each part or target music data. Often the information is stored for a limited time, such as ten seconds, but alternatively it can be stored song to song or gaming session to gaming session, e.g., a particular player always tries to sing the melody.
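A minimal sketch of such a history, assuming a rolling ten-second window and one buffer per part (the buffer length, frame rate, and identifiers below are illustrative assumptions), might look like the following:

```python
from collections import deque

HISTORY_SECONDS = 10       # assumed length of the rolling history window
FRAMES_PER_SECOND = 30     # assumed number of pitch-analysis frames per second

class PartHistory:
    """One bounded buffer of recent matching degrees per part (e.g., melody, harmony)."""

    def __init__(self):
        self.buffers = {}

    def record(self, part, degree):
        buf = self.buffers.setdefault(
            part, deque(maxlen=HISTORY_SECONDS * FRAMES_PER_SECOND))
        buf.append(degree)

    def bias(self, part):
        """Average recent degree of matching for a part (zero if no history)."""
        buf = self.buffers.get(part)
        return sum(buf) / len(buf) if buf else 0.0

def assign_with_bias(candidate_degrees, history):
    """Favor the part the player has recently been matching when parts are ambiguous."""
    return max(candidate_degrees,
               key=lambda part: candidate_degrees[part] + history.bias(part))
```

In this sketch, a singer who has been tracking the harmony line is nudged toward the harmony part when the melody and harmony cues momentarily converge.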
In another aspect, there are methods, systems with an apparatus configured to perform particular functions, computer program products, and apparatuses that provide means for scoring a musical performance after a period of ambiguity in a rhythm-action game. In one aspect this is accomplished by displaying, on a display in communication with a game platform, a first target music data and a second target music data associated with a musical composition. The first target music data has a tolerance threshold that overlaps with a tolerance threshold of the second target music data. Then, a music performance input data is received via an input device, also in communication with the game platform. The game platform determines that the music performance input data is within the first target music data tolerance threshold and within the second target music data tolerance threshold. When this occurs, the game platform determines a first score based on a first degree of matching between the music performance input and the first target music data and determines a second score based on a second degree of matching between the music performance input and the second target music data. The game platform then assigns the music performance input data to the first target music data or the second target music data, often to whichever has the higher score, when the difference between the first score and second score is greater than a predetermined value.
Advantageously, some implementations of the above methods, systems, computer program products, and apparatuses for scoring a musical performance after a period of ambiguity in a rhythm-action game provide additional or alternative functionality. For example, in some implementations, the music performance input data is assigned to the part associated with the higher score. Also, in some versions, the first and second degrees of matching are determined by comparing a pitch component of the music performance input data to a pitch component of the respective first and second target music data.
In some embodiments, when the score that will be assigned is ambiguous, one of the scores, e.g., the first score, is displayed on the display. Once the assigned score is known, the assigned score, if it is not the first score, is displayed, and the first score ceases to be displayed.
Some versions of the above methods, systems, computer program products, and apparatuses display, on the display, at least a first target music data and a second target music data associated with a musical composition, the first target music data having a tolerance threshold that overlaps a tolerance threshold of the second target music data. A music performance input data is received via the input device, such as a microphone, and it is determined that the music performance input data is within the first target music data tolerance threshold and also within the second target music data tolerance threshold. Then, the music performance input data is assigned to the first target music data when the first target music data tolerance threshold and the second target music data tolerance threshold no longer overlap. In these scenarios, a score, based on a degree of matching between the music performance input data and the first target music data, is also typically assigned to the music performance input data.
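One possible way to resolve such a period of ambiguity, sketched below in Python under assumed identifiers and an assumed score margin, is to keep provisional scores for both candidate targets and commit the input only when the scores diverge sufficiently or the tolerance thresholds stop overlapping:

```python
RESOLVE_MARGIN = 5.0   # assumed predetermined score difference needed to commit

def windows_overlap(first_pitch, second_pitch, tolerance):
    """Two tolerance windows overlap when the target pitches are close enough."""
    return abs(first_pitch - second_pitch) <= 2 * tolerance

def resolve_ambiguity(first_score, second_score, first_pitch, second_pitch,
                      tolerance=1.0):
    """Return 'first', 'second', or None while the assignment is still ambiguous."""
    if not windows_overlap(first_pitch, second_pitch, tolerance):
        # The cues have separated; assign to the better-scoring target.
        return "first" if first_score >= second_score else "second"
    if abs(first_score - second_score) > RESOLVE_MARGIN:
        # The scores have diverged enough to commit even while the cues overlap.
        return "first" if first_score > second_score else "second"
    return None   # still ambiguous: keep scoring both targets provisionally
```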
In another aspect, there are methods, systems with an apparatus configured to perform particular functions, computer program products, and apparatuses that provide means for scoring a musical performance involving multiple parts in a rhythm-action game. In one aspect this is accomplished by displaying, on a display in signal communication with a game platform, target musical data associated with a musical composition. The game platform receives a first music performance input data, with the first music performance input data being associated with a first part in the musical composition. The game platform also receives a second music performance input data, the second music performance input data associated with a second part in the musical composition. The game platform then calculates a first score based on the first music performance input data and a second score based on the second music performance input data. It then calculates a final or modified score based on the first score and the second score.
Some of the above methods, systems, computer program products, and apparatuses for scoring a musical performance involving multiple parts in a rhythm-action game provide additional or alternative functionality. For example, in some versions, one of the scores is selected as a preferred score; selecting the first score involves determining that the first score is higher than the second score. Alternatively, selecting the first score can involve determining that the first score has priority in being selected over the second score. With respect to modifying the score, in some implementations, modifying the preferred score includes increasing the preferred score by a percentage of the second score. Alternatively or additionally, modifying the preferred score can be based on a third score calculated based on a degree of matching between a third music performance input data and a third part. Additionally, the first and second parts can be associated with a musical player, and a final score, i.e., the modified preferred score, is assigned to the musical player.
Because microphones or instruments are not tied to parts, in some cases, the first music performance input data and the second music performance input data are received by the game platform from the same input device. In other cases, the first music performance input data and the second music performance input data are received by the game platform from different input devices. Also, the first score can be associated with a melody of the musical composition and the second score is associated with a harmony of the musical composition, or vice versa.
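As a concrete, non-limiting sketch of the score combination described above, assuming the preferred score is the higher of the two and the bonus fraction is fifty percent (both values are assumptions for illustration only):

```python
BONUS_FRACTION = 0.5   # assumed percentage of the non-preferred score that is added

def combined_vocal_score(first_score, second_score):
    """Increase the preferred (higher) score by a percentage of the other score."""
    preferred, other = max(first_score, second_score), min(first_score, second_score)
    return preferred + BONUS_FRACTION * other

# Example: a melody scored 800 and a harmony scored 600 combine to 800 + 0.5 * 600 = 1100.
print(combined_vocal_score(800, 600))
```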
In another aspect, there are methods, systems with an apparatus configured to perform particular functions, computer program products, and apparatuses that provide means for dynamically displaying a pitch range in a rhythm-action game. In one aspect this is accomplished by a game platform dividing a musical composition into a plurality of portions each comprising one or more notes. The musical composition can be divided into portions based on verses, phrases, a length of time, at least a predetermined number of musical notes, or a combination thereof. Then the game platform determines a pitch range between a highest note and a lowest note for each portion. Then the game platform determines a display density for each portion based on the pitch range of each portion, or alternatively, a display density for the entire song based on the greatest pitch range of all portions. Then, the game platform displays each portion within a viewable area. The viewable area has a density that is alterable based on the portion to be displayed or a position that is alterable based on the portion to be displayed, or has both an alterable position and alterable pitch density.
In some versions of the above methods, systems, computer program products, and apparatuses for dynamically displaying a pitch range in a rhythm-action game, the center of the viewable area's position is substantially equidistant between the lowest note and highest note of the portion. Beneficially, the position of the viewable area can be altered before displaying a new portion such that the viewable area appears to slide from the prior position to a position where the center of the viewable area is substantially equidistant between the high note of the new portion and the low note of the new portion. Alternatively, the position of the viewable area can be altered before displaying a new portion such that the viewable area appears to slide from the prior position to a position where the viewable area displays the highest note of the new portion and the lowest note of the new portion.
Implementations with variable densities sometimes also include altering the density of the viewable area before displaying a new portion such that the viewable area appears to zoom in before displaying the new portion. Or, in the alternative, the density of the viewable area is altered before displaying a new portion such that the viewable area appears to zoom out before displaying the new portion.
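By way of illustration, the per-portion position and density computation may be sketched as follows; the padding value and identifiers are assumptions, and the sliding and zooming transitions would be handled by the renderer:

```python
DISPLAY_SEMITONE_PADDING = 2   # assumed margin shown above/below the extreme notes

def viewable_area(portion_notes):
    """Given the pitches (e.g., MIDI numbers) in one portion, return the center
    and span the viewable area should use for that portion."""
    low, high = min(portion_notes), max(portion_notes)
    center = (low + high) / 2.0                          # equidistant from the extremes
    span = (high - low) + 2 * DISPLAY_SEMITONE_PADDING   # wider range -> denser display
    return center, span

def plan_song_display(portions):
    """One (center, span) per portion; the display slides between centers and
    zooms between spans as each new portion is shown."""
    return [viewable_area(notes) for notes in portions]

# Example: a narrow verse followed by a wide-ranging chorus.
print(plan_song_display([[55, 57, 60], [60, 64, 72]]))
```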
In another aspect, there are methods, systems with an apparatus configured to perform particular functions, computer program products, and apparatuses that provide means for preventing an unintentional deploy of a bonus in a video game. In one aspect this is accomplished by displaying, on a display in communication with a game platform, a target music data of a musical composition. The game platform receives a music performance input data via a microphone, and also determines if the music performance input data has a predetermined degree of matching with a vocal cue. If so, the music performance input data is prevented from executing an improvisation deploy.
In some versions of the above methods, systems, computer program products, and apparatuses for preventing an unintentional deploy of a bonus in a video game, it is further determined that an improvisation deploy value exceeds a predetermined threshold. In some of the embodiments, it is determined that the music performance input data is at least a predetermined volume for a predetermined duration. In some implementations, though, the music performance input data may satisfy the predetermined volume for the predetermined duration, yet not count towards a threshold input that executes the improvisation deploy because the music performance input has a degree of matching with the target music data. Some embodiments further involve receiving a second music performance input data via a second microphone and executing the improvisation deploy if the second music performance input data does not have a predetermined degree of matching with the first target music data.
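One possible gating check, sketched below with assumed threshold values and identifiers, ensures that loud, sustained input triggers the improvisation deploy only when it does not match a displayed vocal cue:

```python
DEPLOY_VOLUME = 0.6      # assumed minimum input level (normalized 0..1)
DEPLOY_DURATION = 0.5    # assumed seconds the level must be sustained
MATCH_THRESHOLD = 0.5    # assumed degree of matching that counts as singing the cue

def may_deploy(bonus_available, volume, sustained_seconds, degree_of_matching):
    """Return True only for deliberate, non-matching input while a bonus is stored."""
    if not bonus_available:
        return False                      # nothing accumulated to deploy
    if volume < DEPLOY_VOLUME or sustained_seconds < DEPLOY_DURATION:
        return False                      # not a deliberate, sustained input
    if degree_of_matching >= MATCH_THRESHOLD:
        return False                      # the player is singing the cue, not deploying
    return True
```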
Alternatively, there are implementations that display, on the display, a first target music data of a musical composition and receive a first performance input data via the microphone. Then, the implementation determines if the first performance input data has a predetermined degree of matching with the first vocal cue, and prevents the first performance input from executing an improvisation deploy if the first performance input is within a tolerance threshold of the first target music data.
In any of the scenarios above involving preventing an unintentional deploy of a bonus in a video game, the first target music data can be a melody target music data or part and the second target music data is a harmony target music data or part. Alternatively, the first target music data can be a harmony target music data or part and the second target music data can be a melody target music data or part. Furthermore, some versions determine if the first performance input is associated with the first vocal cue by determining if a pitch component of the first performance input has a degree of matching with the first vocal cue. Some versions, however, determine if the first performance input is associated with the first vocal cue by determining if the first input matches the first vocal cue within a tolerance threshold of the first vocal cue.
In another aspect, there are methods, systems with an apparatus configured to perform particular functions, computer program products, and apparatuses that provide means for displaying song lyrics and vocal cues in a rhythm-action game. In one aspect this is accomplished by displaying, on a display in communication with a game platform, a vocal cue. The vocal cue moves on the display in synchronization with a timing component of a musical composition towards a target marker. Lyrics are also displayed, but instead of moving with the vocal cue, the lyrics are displayed in a fixed position. The lyrics maintain their position until the vocal cue has moved to a particular position with respect to the target marker.
In some versions of the above methods, systems, computer program products, and apparatuses for displaying song lyrics and vocal cues in a rhythm-action game, the particular position is where the first vocal cue is aligned with a vertical plane of the target marker. Moving, in some versions, includes altering the horizontal position of the vocal cue from the right side of the display, through a vertical plane of the target marker, to the left side of the display. The particular position that the vocal cue moves to, which is what triggers the release of the lyric from the fixed position, in some cases is to the left of the vertical plane of the target marker or directly on top of it. Additionally, some implementations further involve displaying a second vocal cue associated with the first vocal cue, the second vocal cue moving through a plane of the target marker; and displaying, on the display, a second lyric.
In any of the above examples for displaying song lyrics and vocal cues in a rhythm-action game, the coloration of the lyric can be altered depending on the position of the vocal cue. For example, the lyric can be highlighted if the vocal cue is aligned with a vertical plane of the target marker, or de-highlighted if the vocal cue is past the vertical plane of the target marker, or appear deactivated before the vocal cue reaches the target marker (i.e., the vocal cue is located to the right of the target marker).
In another aspect, there are methods, systems with an apparatus configured to perform particular functions, computer program products, and apparatuses that provide means for providing a practice mode for multiple musical parts in a rhythm-action game. In one aspect this is accomplished by displaying, on a display in communication with a game platform, a first and second target musical data associated with a musical composition. The game platform receives a selection by the user of the first target musical data to be performed and produces an audio output associated with the first and second target musical data. The game platform also produces a synthesized tone associated with the first target musical data. In some versions, the target music data that is not selected is dimmed and made less visible.
The implementations above of the practice mode may also include receiving music performance input data and scoring the music performance input data with respect to only the first target musical data. Additionally, some embodiments of the practice mode produce the synthesized tone at a volume louder than the audio associated with the first and second target musical data. The first target musical data can be the melody and the second target musical data can be the harmony, or vice versa.
In another aspect, there are methods, systems with an apparatus configured to perform particular functions, computer program products, and apparatuses that provide means for selectively displaying song lyrics in a rhythm-action game. In one aspect this is accomplished by determining a number of vocal cues to be displayed on a display in communication with a game platform, where the vocal cues are each associated with a lyric. Given a number of areas available to display a set of lyrics, determined either before run-time or at run-time, the game platform determines, based on a lyric priority associated with each lyric, which of the lyrics associated with each vocal cue to display when the number of vocal cues exceeds the number of areas available.
The methods, systems, computer program products, and apparatuses for selectively displaying song lyrics also allow for variations. For example, in some versions, one of the vocal cues and an associated lyric are associated with a lead vocal part in a musical composition. Alternatively, a plurality of the vocal cues and their corresponding lyrics can be associated with a plurality of harmony vocal parts in the musical composition. In some embodiments, the lyric priority of each of the plurality of lyrics is predetermined. In other embodiments, the lyric priority of each of the plurality of lyrics is assigned randomly. In some implementations, the number of areas available to display the set of lyrics is predetermined before execution of the computerized method. In other implementations, the number of areas available to display the set of lyrics is determined at run-time.
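A minimal sketch of the lyric selection, assuming each lyric carries a numeric priority (higher wins) and that the identifiers below are illustrative only:

```python
def select_lyrics(cues, areas_available):
    """`cues` is a list of (lyric_text, priority) pairs; return the lyrics to display."""
    if len(cues) <= areas_available:
        return [lyric for lyric, _ in cues]          # everything fits
    ranked = sorted(cues, key=lambda cue: cue[1], reverse=True)
    return [lyric for lyric, _ in ranked[:areas_available]]

# Example: two display areas, three simultaneous cues; the lowest-priority
# harmony lyric is the one not displayed.
print(select_lyrics([("lead line", 3), ("harmony one", 2), ("harmony two", 1)], 2))
```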
In another aspect, there are methods, systems with an apparatus configured to perform particular functions, computer program products, and apparatuses that provide means for displaying an input at multiple octaves in a rhythm-action game. In one aspect this is accomplished by receiving, by a game platform via a microphone, a music performance input data and displaying, on a display in communication with the game platform, a first pitch marker reflective of the music performance input data. Then, substantially simultaneously with the display of the first pitch marker, a second pitch marker is displayed at an offset, typically vertical, from the first pitch marker, the offset indicative of an octave difference between the first pitch marker and the second pitch marker. In some versions, the second pitch marker is indicative of an octave above the first pitch marker; in others, the second pitch marker is indicative of an octave below the first pitch marker. As above, a vocal cue is displayed that includes a pitch component. Then a first score is calculated for the first pitch marker based on a comparison between the first pitch marker and the pitch component of the vocal cue, and a second score is calculated for the second pitch marker based on a comparison between the second pitch marker and the pitch component of the vocal cue. In some implementations, the second pitch marker is displayed only if the music performance input data has a degree of matching with a target music data when the octave of the music performance input data is not used to determine the degree of matching. In some of these implementations, the degree of matching is based on a tolerance threshold.
In another aspect, there are methods, systems with an apparatus configured to perform particular functions, computer program products, and apparatuses that provide means for displaying a harmonically relevant pitch guide in a rhythm-action game. In one aspect this is accomplished by analyzing, by a game platform, target music data associated with a musical composition to determine a musical scale within the target music data. Then a bounded space, such as a lane to display vocal cues in, is displayed that includes a plurality of interval demarcations based on the scale, and a background comprising a color scheme based on preselected pitches of the scale. Then the game platform displays the target music data in a manner indicative of the harmonically relevant pitches with respect to the pitch guide.
In some versions of the above methods, systems, computer program products, and apparatuses for displaying a harmonically relevant pitch guide, the preselected pitches are harmonically relevant. Specifically, in some embodiments, the preselected pitches of the scale are the root, third, and fifth pitches of the scale. In some implementations, an uppermost pitch and a lowermost pitch of the target music data are determined. Then the lane's upper bound is based on the uppermost pitch of the target music data and its lower bound is based on the lowermost pitch of the target music data. Additionally or alternatively, the color scheme can include a first color for intervals not matching the preselected pitches and a second color for intervals that do match the preselected pitches, or the color scheme can include shading the background according to the preselected pitches, such as being shaded with a first color for harmonically relevant pitches and with a second color for pitches that are not harmonically relevant.
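For illustration only, the following sketch colors each semitone row of the lane according to whether its pitch class is the root, third, or fifth of an assumed major scale; scale detection itself is not shown and the identifiers are hypothetical:

```python
HARMONIC_DEGREES = {0, 4, 7}   # root, major third, perfect fifth, in semitones

def pitch_guide_rows(target_pitches, root_pitch_class):
    """Return (midi_note, color) rows spanning the part's pitch range."""
    low, high = min(target_pitches), max(target_pitches)
    rows = []
    for note in range(low, high + 1):
        interval = (note - root_pitch_class) % 12
        color = "highlight" if interval in HARMONIC_DEGREES else "background"
        rows.append((note, color))
    return rows

# Example: a C-major part spanning C4..G4 highlights the C, E, and G rows.
print(pitch_guide_rows([60, 62, 64, 65, 67], root_pitch_class=0))
```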
Advantageously, for any of the aspects above, a pitch arrow can be associated with the music performance input data to indicate how the player is performing. Specifically, the pitch arrow points up if the music performance input data is flat compared to the assigned target music data and the pitch arrow points down if the music performance input data is sharp compared to the assigned target music data. Alternatively or additionally, the pitch arrow points towards the assigned target music data and the arrow is positioned above the target music data if the music performance input data is sharp compared to the assigned target music data or the arrow is positioned below the target music data if the music performance input data is flat compared to the assigned target music data.
Beneficially, in some implementations where a music performance input is assigned to a part, target music data, or is scored, doing so may alter a visual property of the part, the target music data associated with the part, the arrow associated with the music performance input, or any of these. For example, the target music data (part, arrow, etc.) may glow or flash. Alternatively, assignment or scoring may alter an audio property of the game, such as causing a crowd to cheer, or making the received performance input data or the audio associated with the target music data become louder or have a distortion effect applied to it.
Other aspects and advantages of the present invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating the principles of the invention by way of example only.
The foregoing and other objects, features, and advantages of the present invention, as well as the invention itself, will be more fully understood from the following description of various embodiments, when read together with the accompanying drawings, in which:
Architecture
The game platform 100 is typically in electrical and/or signal communication with a display 105. This may be a television, an LCD monitor, projector, or the like. The game platform is also typically in electrical or signal communication with one or more controllers or input devices. In
Though reference is made to the game platform 100 generally, the game platform, in some embodiments such as that depicted in
Prior art versions of some of these modules are found in U.S. Pat. No. 7,164,076 as they relate to processing vocal input. The data extractor module 170 extracts pitch data and timestamps stored in song data records, which may be stored in storage 140, RAM 145, ROM 150, on memory card or disc media in communication with the game platform 100, or accessible via a network connection. The digital signal processor module 175 extracts pitch frequency data from the digital data stream using known pitch extraction techniques. In some embodiments, a time-based autocorrelation filter is used to determine the input signal's periodicity. The periodicity is then refined to include a fractional periodicity component. This period is converted into frequency data, which is then converted into a semitone value or index using known conversion techniques. The semitone value may be similar to a MIDI note number, but may have both integer and fractional components (e.g., 50.3). While the pitch data is typically represented by semitones, pitch data can be converted into any desired units (e.g., Hertz) for comparison with the sampled pitch data from a microphone input.
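For reference, the conversion from a detected frequency to a fractional semitone value can be expressed with the standard equal-temperament relationship (440 Hz corresponding to MIDI note 69); the sketch below is illustrative and not tied to any particular module:

```python
import math

def frequency_to_semitone(freq_hz, reference_hz=440.0, reference_midi=69):
    """Convert a pitch frequency in Hz to a fractional semitone (MIDI-style) value."""
    return reference_midi + 12.0 * math.log2(freq_hz / reference_hz)

# Example: 261.63 Hz (middle C) maps to approximately 60.0.
print(round(frequency_to_semitone(261.63), 2))
```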
The comparison module 180 compares the timestamps of data records with the sample time associated with the pitch sample. The comparison module 180 selects a data record, from a number of data records stored in a buffer, that has a timestamp that most closely matches the sample time, then compares the pitch value stored in that data record (i.e., the correct pitch) with the pitch sample associated with the sample time. In some embodiments, the comparison includes determining the absolute value of the difference between the correct pitch value and the sample pitch data. The performance evaluation module 185 takes the results of the comparison module 180 and generates performance evaluation data based on the pitch error and other settings, e.g., the difficulty chosen by the player. This information includes a tolerance threshold, which can be compared against the pitch error to determine a performance rating. If the pitch error falls within the tolerance threshold, then a “hit” will be recorded, and if the pitch error falls outside the tolerance threshold, then a “miss” will be recorded. The hit/miss information is then used to compute a score and to drive or trigger the various performance feedback mechanisms described herein (e.g., pitch arrow, performance meter, crowd meter, etc.). Though pitch extraction and determining a hit or miss for one part are known, the modules, their functions, and their programming are improved by the present invention and provide new functionality described herein.
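The comparison and evaluation steps described above might be sketched as follows, with the record format, tolerance value, and identifiers assumed for illustration:

```python
def evaluate_sample(sample_time, sample_pitch, data_records, tolerance):
    """`data_records` is a list of (timestamp, correct_pitch) tuples; return 'hit' or 'miss'."""
    # Select the data record whose timestamp most closely matches the sample time.
    timestamp, correct_pitch = min(data_records,
                                   key=lambda record: abs(record[0] - sample_time))
    # The pitch error is the absolute difference between the correct and sampled pitch.
    pitch_error = abs(correct_pitch - sample_pitch)
    return "hit" if pitch_error <= tolerance else "miss"

# Example: at t = 1.02 s the nearest record expects semitone 60; singing 60.4
# within a 0.5-semitone tolerance registers a hit.
print(evaluate_sample(1.02, 60.4, [(0.5, 58.0), (1.0, 60.0), (1.5, 62.0)], 0.5))
```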
In some embodiments, execution of game software limits the game platform 100 to a particular purpose, e.g., playing the particular game. In these scenarios, the game platform 100 combined with the software, in effect, becomes a particular machine while the software is executing. In some embodiments, though other tasks may be performed while the software is running, execution of the software still limits the game platform 100 and may negatively impact performance of the other tasks. While the game software is executing, the game platform directs output related to the execution of the game software to the display 105, thereby controlling the operation of the display. The game platform 100 also can receive inputs provided by one or more players, perform operations and calculations on those inputs, and direct the display to depict a representation of the inputs received and other data such as results from the operations and calculations, thereby transforming the input received from the players into a visual representation of the input and/or the visual representation of an effect caused by the player.
Game Interface
While
One or more of the players of the game may be represented on screen by an avatar 205a, 205b, 205c (collectively 205), rendered by the graphics processor 165. In some embodiments, an avatar 205 may be a computer-generated image. In other embodiments, an avatar 205 may be a digital image, such as a video capture of a person. An avatar 205 may be modeled on a famous figure or, in some embodiments, the avatar may be modeled on the game player associated with the avatar. In cases where additional players enter the game, the screen may be altered to display an additional avatar 205 and/or music interface for each player.
In
The cues 220 are distributed in the lane 200 in a manner having some relationship to musical content associated with the song being audibly played. For example, the cues 220 may represent pitch (cues displayed towards the bottom of the lane represent notes having a lower pitch and cues towards the top of the lane, e.g., 220c, represent notes having a higher pitch), volume (cues may glow more brightly for louder tones), duration (cues may be “stretched” to represent that a note or tone is sustained), note information (cues spaced more closely together for shorter notes and further apart for longer notes), articulation, timbre, or any other time-varying aspects of the musical content. The cues 220 may be tubes, cylinders, circles, or any geometric shape and may have other visual characteristics, such as transparency, color, or variable brightness.
As the cues 220 move through the lane and intersect the Now Bar 225, the player is expected to sing the musical data represented by the vocal cues. To assist the player, in some embodiments the music data represented by the note tubes 220 may be substantially simultaneously reproduced as audible music or tones. For example, during regular play or during a practice mode, to assist a player with a particular part, the pitch that the player is expected to sing is audibly reproduced, thereby assisting the player by allowing them to hear a pitch to match.
In certain embodiments, successfully performing the musical content triggers or controls the animations of avatars 205. Additionally, the visual appearance of interface elements, e.g., the cues 220, may also be modified based on the player's proficiency with the game. For example, failure to execute a game event properly may cause game interface elements 220 to appear more dimly. Alternatively, successfully executing game events may cause game interface elements 220 to glow more brightly. Similarly, the player's failure to execute game events may cause their associated avatar 205 to appear embarrassed or dejected, while successful performance of game events may cause their associated avatar to appear happy and confident. In other embodiments, successfully executing cues 220 in the lane 200 causes the avatar 205 associated with that performance to appear to sing in a particular manner. For example, where the singer is on key for a sustained period, the avatar 205 may appear to “belt out” the vocal part. In some embodiments, when two or more parts are being successfully sung, e.g., human players are singing the same or different vocal parts and the players are both on key for their respective parts (or the same part), avatars 205 will visually be depicted standing closer together or leaning into each other and singing. In some embodiments the avatars 205 will be depicted sharing a microphone on stage. Successful execution of a number of successive cues 220 may cause the corresponding avatar 205 to execute a “flourish,” such as kicking their leg, pumping their fist, winking at the crowd, spinning around, or if the avatar is depicted with an instrument, performing a guitar “windmill,” throwing drum sticks, or the like.
Player interaction with a cue 220 may be required in a number of different ways. In one embodiment, there is one vocal “player” for the game, i.e., vocal input from any number of different microphones is represented as one member of the group, or by one avatar 205, even though many real-world players may provide vocal input. For example, in one scenario, a real-world player can choose a microphone 110a as their instrument from an instrument selection screen. When gameplay starts, the game platform detects, in some cases via peripheral interfaces 155, that multiple microphones are in electrical and/or signal communication with the game platform (e.g., plugged into the game platform or connected wirelessly). When a vocal cue 220 is displayed to the players, not only can the person that chose the microphone 110a at the instrument selection screen sing and provide vocal input, but so can anyone else singing into a microphone 110b that is in signal communication with the game platform 100. Any singing input into the other microphones 110b provided by real-world players can be treated as coming from the person that chose the microphone 110a, as additional input, or as complementary input to the person that chose the microphone 110a. In some implementations, additional or complementary input provides a bonus score to the person that selected the microphone. Thus, several real-world people singing can be treated, along with the person that chose the microphone, as one “player” or one “instrument” in the game.
Beneficially, no one player singing into a microphone 110 is necessarily tied to a vocal part, e.g., a melody or harmony part. In a multi-vocal part game, e.g., one that allows players to sing melody and harmony parts simultaneously, the player that chose the microphone 110a can sing a harmony part while another player that has a microphone 110b can sing a melody part, or vice versa, or the two can switch dynamically during gameplay, even during a single phrase. Not tying players to particular parts is applicable to other instruments as well, e.g., guitars 115, and not limited to vocal input. For example, in a game where there are multiple guitar parts, e.g., lead guitar and rhythm guitar, two players each playing a simulated guitar 115a, 115b can play with one player performing the lead guitar part, the other playing the rhythm guitar part, or vice versa, or they can switch which part they are each playing dynamically during the game, even within a phrase. Similarly, where there are two or more keyboard parts displayed, two or more players with simulated keyboards (not shown) can each play different parts, e.g., parts played on the higher keys on the keyboard, in the middle of the keyboard, or parts involving the lower keys. Additionally or alternatively, combinations of parts can be played by a single player and additional players can play additional parts, e.g., one person plays high and middle key parts and the other plays low parts, or one person could perform high and low while another performs middle parts, or other combinations.
Referring back to vocal input, player interaction with a cue 220 may comprise singing a pitch and/or a lyric associated with a cue 220. In one aspect, multiple vocal parts 220a, 220b, 220c are displayed substantially simultaneously in the same lane 200, with different vocal parts depicted using different colors. Additionally or alternatively, pitch indicators 230a, 230b, 230c (collectively 230) assigned to each microphone in communication with the game platform 100 have a particular shape, e.g., triangle, circle, square, or various stylized arrows, or other shapes. In some embodiments, the pitch markers assigned to each microphone have a distinctive coloring or shaping allowing players to distinguish between them.
As an example, in
Still referring to
In addition to multiple vocal cues, multiple sets of lyrics can be sung. Still referring to
Still referring to
Also in
In some embodiments, a separate performance meter (not shown) may be displayed for each player. This separate performance meter may comprise a simplified indication of how well the player is doing. In one embodiment, the separate performance meter may comprise an icon which indicates whether a player is doing great, well, or poorly. For example, the icon for “great” may comprise a word such as “Fab” being displayed, “good” may be a thumbs up, and “poor” may be a thumbs down. In other embodiments, a player's lane may flash or change color to indicate good or poor performance.
Still referring to
In some embodiments, if a given amount of bonuses is accumulated, a player may activate the bonus to trigger an in-game effect. An in-game effect may comprise a graphical display change including, without limitation, an increase or change in crowd animation, avatar animation, performance of a special trick by the avatar, lighting change, setting change, or change to the display of the lane of the player. An in-game effect may also comprise an aural effect, such as an increase in volume, a crowd cheer, and/or an explosion or other aural signifier that the bonus has been activated. In embodiments where instruments are used, an effect could be a guitar modulation, feedback, distortion, screech, flange, wah-wah, echo, or reverb. An in-game effect may also comprise a score effect, such as a score multiplier or bonus score addition. In some embodiments, the in-game effect may last a predetermined amount of time for a given bonus activation. In some implementations, the singer may trigger or deploy the bonus by providing any manner of vocal input, e.g., percussion sounds such as tapping the microphone, speaking, screaming, wailing, growling, etc. This triggering or deployment is also called an “improvisation deploy.”
In some embodiments, bonuses may be accumulated and/or deployed in a continuous manner. In other embodiments, bonuses may be accumulated and/or deployed in a discrete manner. For example, instead of the continuous bar 245 shown in
Dynamic Musical Part Determination
One feature of the game is that input received from a simulated instrument, e.g., 115, or microphone, e.g., 110, is not forcibly associated with a particular part for that instrument or microphone for the duration of a song. Specifically, players providing input can dynamically switch between melody and harmony parts, or lead and rhythm parts, different percussion parts where two or more drums are present, or, in the case of simulated guitars, even guitar parts and bass guitar parts, during gameplay. Though examples herein typically refer to microphones, vocal input, and vocal parts, the technology is applicable to guitars, bass guitars, drums, keyboards, and other simulated instruments as well. Furthermore, references to melody and harmony are not limiting; rather, in discussing two or more parts, the game can present any two or more parts to the players, e.g., two harmony parts and no melody, or two or more parts in general that are not designated as melody or harmony.
Using singing as an example, contrary to prior art games where a person chooses that they want to sing lead vocals or backup vocals at the beginning of a song and is forced to remain with that selection for the duration of the song, in the present invention, microphones 110 are not tied to a particular part. For example, a player can sing melody vocals and then, during the song, begin singing harmony vocals with no additional input to the game (e.g., the player does not need to pause the game, press a button, or manipulate a menu to switch parts). Instead, one aspect of the present invention dynamically determines which part a player is singing and associates input from that microphone 110 with that part “on the fly.” As an example, in
When comparing the pitch of an input to the expected pitch represented by a note tube (target music data), a degree of matching is determined based on how close or how far the input is from the expected pitch.
In some implementations 342, the degree of matching is non-linear as the input pitch 310 gets closer to the pitch of the note tube 300, i.e., as the input pitch gets closer to the expected pitch, the degree of matching increasingly increases. In other implementations 343, there is no degree of matching between the input pitch 310 and the note tube 300 unless the input pitch is within a tolerance threshold 325 of the note tube. In some implementations there is a constant, rather than zero, degree of matching for any input 310 outside the tolerance threshold 325. In either implementation 342 or 343, the degree of matching is non-linear once the input pitch is within the tolerance threshold 325, i.e., as the input pitch gets closer to the pitch of the note tube, the degree of matching increasingly increases. Other implementations (not shown) combine these approaches; for example, there is no degree of matching until the input pitch 310 is within the tolerance threshold 325, and then the degree of matching and distance are linearly related. Other relationships correlating the distance between the input pitch and the note tube pitch with the degree of matching are also contemplated.
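The relationships described above can be sketched, purely for illustration, as the following matching functions, where the exponent and tolerance values are assumed examples:

```python
def linear_matching(error_semitones, tolerance=1.0):
    """Degree of matching falls off linearly with pitch distance."""
    return max(0.0, 1.0 - error_semitones / tolerance)

def nonlinear_matching(error_semitones, tolerance=1.0, exponent=2.0):
    """Degree of matching rises increasingly quickly as the error approaches zero."""
    return max(0.0, 1.0 - error_semitones / tolerance) ** exponent

def gated_matching(error_semitones, tolerance=1.0, outside_value=0.0):
    """No (or a constant) degree of matching outside the tolerance threshold."""
    if error_semitones > tolerance:
        return outside_value
    return nonlinear_matching(error_semitones, tolerance)
```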
Additionally, in some versions of the implementations above, the pitch distance can be determined modulo octave. Specifically, the octave of the input pitch 310 from the player is not taken into account when determining distance from the expected pitch of the note tube 300. For example, if the pitch of the note tube 300 is a C4 and the input 310 from the player is a C5, the two pitches are a full octave apart. Using MIDI note numbers, for example, the input pitch has a MIDI note number of 72 and the note tube's pitch has a MIDI note number of 60. However, in a modulo octave implementation, the distance would be zero since the difference in octave is not considered, e.g., there is a 12 MIDI note number difference between 60 and 72, but modulo octave, in this case, modulo 12, the difference is zero. As another example, if the pitch of the note tube 300 is C4, i.e., MIDI note number 60, and the input 310 from a player is the B below C5, i.e., MIDI note number 71, the distance can be computed as an 11 MIDI note number difference. However, the distance between C5, i.e., 72, and B4, i.e., 71, is only 1, so preferentially the distance is instead determined to be 1. The eleven MIDI note numbers correspond to 11 half steps pitch-wise, i.e., C4 to D (1 step) to E (1 step) to F (1/2 step) to G (1 step) to A (1 step) to B (1 step). The distance, however, is only one half step, i.e., B to C, because the implementation is modulo octave.
Another way of calculating the modulo octave operation is to determine the MIDI note number of the input pitch, increment or decrement it by octaves until the MIDI note numbers of the pitches above and below the target music data MIDI note number are known, and then determine the minimum difference between the target pitch and the modulo pitches above and below it. As an example, an input pitch has a MIDI note number of 86 (D6) and the note tube's pitch has a MIDI note number of 60 (C4). The input pitch's MIDI note number is decremented by an octave until it is within an octave of the MIDI note number of the target music data, i.e., 86 (D6) is decremented to 74 (D5), and, since 74 is still not within an octave of 60 (C4), the MIDI note number is decremented again to 62 (D4), which is within an octave above the MIDI note number of the target music data (D4 is an “above-MIDI note number”). Then, because the difference could be smaller if another octave decrement is performed, the MIDI note number of the input pitch is decremented again to 50 (D3) so it is below the MIDI note number of the target music data (D3 being a “below-MIDI note number”). Then the minimum of the difference between the MIDI note number of the target music data and the above-MIDI note number, and the difference between the MIDI note number of the target music data and the below-MIDI note number, is determined, i.e., |62−60| = 2 and |50−60| = 10 (absolute values are used so that the sign of the difference is ignored). Thus, because the difference between the above-MIDI note number and the target music data MIDI note number is smaller, the singer's input is scored as if it were sung at the above-MIDI note number, i.e., 62 (D4). The same principle is applied when the singer's input is below the pitch of the target music data; the input pitch is incremented octave by octave until the pitches within an octave below and an octave above the target music data's pitch are determined, and the minimum difference is determined.
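The bracketing procedure just described translates almost directly into code. The following Python sketch is one possible rendering; the function name and the decision to return the octave-shifted note alongside the distance are illustrative choices.

```python
def modulo_octave_distance(input_midi, target_midi):
    """Return (distance, adjusted_input): the smallest half-step distance
    between the input pitch and the target pitch ignoring octave, plus the
    octave-shifted input note that would be used for scoring.

    Follows the bracketing approach described above: shift the input by
    octaves until one candidate lies within an octave above the target and
    one lies below it, then keep whichever is closer.
    """
    adjusted = input_midi
    # Shift by octaves until the candidate sits within [target, target + 12).
    while adjusted - target_midi >= 12:
        adjusted -= 12
    while adjusted - target_midi < 0:
        adjusted += 12
    above = adjusted          # the candidate within an octave above the target
    below = adjusted - 12     # the candidate one octave lower, below the target
    if abs(above - target_midi) <= abs(below - target_midi):
        return abs(above - target_midi), above
    return abs(below - target_midi), below


# Example from the text: input D6 (86) against target C4 (60)
# -> distance 2, scored as if sung at D4 (62).
print(modulo_octave_distance(86, 60))   # (2, 62)
```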
As a result, in some implementations, the degree of matching is not directly correlated to the distance of the input pitch 310 from the note tube 300, i.e., to the difference expressed in MIDI note numbers. As an input pitch 310 gets farther away from the pitch of the note tube 300, after crossing the halfway point of the octave, the input pitch begins getting closer to the octave above the pitch of the note tube, and thus getting closer to the pitch, modulo octave, of the note tube. As a result, since octave differences are not considered, the degree of matching actually increases after the input pitch exceeds the halfway mark. Though MIDI note numbers and “pitch steps” are used above to describe the modulo octave example, the invention is not limited to this form of “distance” and, in many implementations, a difference in frequency between the input pitch and the note tube pitch is used in distance calculations.
Advantageously, in some implementations, the accumulation of points for a given note tube 300 is directly correlated to the degree of matching between the pitch of the input 310 and the pitch of the note tube 300. A high degree of matching generates a large number of points and a low degree of matching generates a low number of points, or zero points. Where the relationship between the degree of matching and the distance is non-linear, the closer the input 310 is to the pitch of the note tube 300, the faster the score accumulates. Also, in some versions, a constant number of points, or no points, is accumulated when the input pitch 310 is anywhere outside the tolerance threshold 325 of the note tube 300 because there is a constant or zero degree of matching.
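As a rough illustration of per-frame score accumulation driven by the degree of matching, consider the following Python sketch; it uses a simple linear degree of matching for brevity, and the per-frame point value and tolerance are invented constants rather than values from this description.

```python
def accumulate_score(current_score, input_pitch, target_pitch,
                     points_per_frame=10.0, tolerance=1.0):
    """Add points for one display frame (e.g., 1/60 second) in proportion to
    the degree of matching between the input pitch and the note tube pitch.
    A linear degree of matching is used here for brevity; the per-frame point
    value and tolerance are illustrative constants."""
    distance = abs(input_pitch - target_pitch)
    match = max(0.0, 1.0 - distance / tolerance)   # 0 outside the threshold, 1 dead-on
    return current_score + points_per_frame * match


# Called once per frame while a note tube is active:
score = 0.0
for sung in [60.0, 60.2, 60.9, 62.5]:           # sampled input pitches (MIDI numbers)
    score = accumulate_score(score, sung, 60)    # target is C4
print(score)   # close inputs add most of the 10 points; the last adds nothing
```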
Referring back to
One of the benefits of the present invention is that when the player wants to shift parts dynamically, the game platform 100 allows them to do so, even in the middle of a phrase. When a player's input 310 is outside the tolerance threshold 325 for a particular cue 300, 305, the input 310 is no longer assigned to that part and the player is considered not to be singing that cue. For example, in
To appreciate the dynamic part determination, consider in
In embodiments with instruments, the degree of matching can be based on how close a provided input is to a particular part over a period of time. If, for example, a lead guitar part and a rhythm guitar part both have a sequence of target music data, e.g., green gem, green gem, and then the lead guitar has a blue gem while the rhythm guitar has a third green gem, the player performing a third input corresponding to a green gem indicates that the player is attempting to play the rhythm guitar part and not the lead guitar part. Beneficially, allowance is made for the player to make mistakes. Continuing the prior example, assume the fourth and fifth inputs for the lead guitar part are also blue gems and the rhythm guitar part is two more green gems. If the player plays a third input corresponding to a blue gem (indicating the lead guitar part) but mistakenly provides an input corresponding to the green gem on the fourth input (which would indicate the player is attempting the rhythm guitar part), then when the fifth input is provided as corresponding to a blue gem (again, lead guitar), the degree of matching allows for the mistaken green gem input and still indicates that the player is attempting the lead guitar part. In some embodiments, the degree of matching for instruments takes into consideration the proximity of the gem when determining if a mistake was made. For example, in the prior example, if the green gem is separated from the blue gem by a red gem, the player providing input corresponding to the green gem on the fourth input may not be determined to be a mistake because the green gem is too far, gem-wise, from the blue gem. In that scenario, the player would be assigned to the rhythm part. If, however, the fourth input corresponded to the red gem, because the red gem is next to the blue gem, that is, in close proximity, the game platform determines that the fourth input, which does not correspond to either part, is a mistake and keeps the player associated with the lead guitar part.
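One way to realize this windowed comparison, purely as an illustration, is sketched below in Python; the part names, the window length, and the tie handling are assumptions, since the text above describes the idea only in general terms.

```python
def likely_instrument_part(recent_inputs, part_sequences):
    """Guess which instrument part the player is attempting by counting, over
    a recent window of inputs, how many inputs match each part's expected gems.

    recent_inputs: list of gem colors the player actually played, newest last.
    part_sequences: dict mapping part name -> list of expected gem colors for
    the same window.  Counting matches over a window tolerates an occasional
    mistaken input without reassigning the player.
    """
    scores = {}
    for part, expected in part_sequences.items():
        scores[part] = sum(1 for played, cue in zip(recent_inputs, expected)
                           if played == cue)
    # Keep the part with the most matches; a tie could fall back to history.
    return max(scores, key=scores.get)


# Example from the text: lead = green, green, blue, blue, blue;
# rhythm = green, green, green, green, green.
# Player plays green, green, blue, green (mistake), blue -> still lead guitar.
print(likely_instrument_part(
    ["green", "green", "blue", "green", "blue"],
    {"lead":   ["green", "green", "blue", "blue", "blue"],
     "rhythm": ["green", "green", "green", "green", "green"]}))   # "lead"
```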
Biasing a Music Performance Input to a Part
Dynamic part determination, however, presents an interesting problem of its own: to which part is the input 310 assigned when it is within the tolerance thresholds of two parts, i.e., 325 for the melody 300 and 330 for the harmony 305, such as at t2? In scenarios where it is ambiguous which part the player is singing, including cases where the harmony and melody parts use the exact same pitch, a method of determining which part the player is likely singing is necessary to bias the player's input 310 towards a particular part and ensure proper scoring.
Still referring to
In some implementations, for cases like at t2 where the part a player is trying to input is ambiguous, i.e., the singer could be attempting to sing 300 or 305, historical data is examined to determine which part the player is intending to sing, in effect making a player's input 310 “sticky” to a particular part the singer sang before. Historical data can be any information collected prior to the period of determination, e.g., a prior degree of matching between a prior input and a prior part, a score for each part based on prior degrees of matching between prior performance and prior cues, prior performance data from prior songs, etc. As an example, the game platform may store scoring data accumulated for a particular time period or window, e.g., the last 10 seconds, and store that scoring data in memory in locations such as melody score memory 335 and harmony score memory 340 (though again, the game may simply refer to these as part 1 score memory, part 2 score memory, etc., where there is no designated melody or harmony). Though not depicted, historical information for any number of parts and for any length of time can be stored and used in this calculation. Parts can be of any type (e.g., a second harmony part or an instrument part) and time periods may be of any length (e.g., seconds, the length of the entire current song, or the span of multiple song performances in the past). One use of the historical information is demonstrated with respect to t2.
During t0-t2, the player is accumulating score for the melody part while the singer is within the tolerance threshold 325 (or alternatively at varying rates depending on the accuracy of the input, within the threshold, relative to the melody 300). By t1, the player's input 310 has not entered the threshold 330 for the harmony cue 305, and therefore the player has not accumulated any score for the harmony part, but has accumulated score for the melody part (though scoring per “part” can be based on generating a score for a cue, for a series of cues, or for a portion of a cue, score can be kept per part, even across phrases). Approaching t2, the player continues to accumulate score for the melody cue 300 because input 310 is still within threshold 325 (and the accumulation may slow as the singer gets further from the center of the tube 300). The score information for the melody cue 300 is stored periodically, e.g., every 60th of a second, in the melody score memory 335. As the player's input 310 approaches t2, it also enters threshold 330 for the harmony cue 305 and the player begins accumulating score for the harmony part. This score information for the harmony cue 305 is stored in the harmony score memory 340. Since it is ambiguous which part the player is trying to sing due to the overlapping thresholds 325 and 330, the game platform allows for the possibility that the player could be singing either part and therefore generates a score for both parts simultaneously depending on the degree of matching between the input 310 and each respective cue 300, 305. However, the player is still assigned to the melody part for the period where there is ambiguity because the score in the melody score memory 335 is higher than the score in the harmony score memory 340 (because the singer was singing the melody prior to the ambiguity period).
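A minimal sketch of this “sticky” assignment logic, assuming a simple tolerance test and a per-part score memory such as 335 and 340, might look like the following; the function and parameter names are illustrative.

```python
def assign_part(input_pitch, cues, score_memory, tolerance=1.0):
    """Decide which part an ambiguous vocal input belongs to.

    cues: dict of part name -> the part's current target pitch (MIDI number).
    score_memory: dict of part name -> score accumulated over a recent window
    (e.g., the last 10 seconds), i.e., the historical data described above.
    Returns the assigned part, or None if the input matches no part.  Names,
    the tolerance value, and the tie-breaking rule are illustrative only.
    """
    candidates = [part for part, pitch in cues.items()
                  if abs(input_pitch - pitch) <= tolerance]
    if not candidates:
        return None                 # outside every part's tolerance threshold
    if len(candidates) == 1:
        return candidates[0]        # unambiguous: history is not consulted
    # Ambiguous (e.g., at t2): bias toward the part with the higher recent score.
    return max(candidates, key=lambda part: score_memory.get(part, 0.0))


# Overlapping thresholds at t2: the singer was on the melody, so the melody
# score memory is higher and the input stays assigned to the melody.
print(assign_part(64.2, {"melody": 64, "harmony": 65},
                  {"melody": 880.0, "harmony": 120.0}))   # "melody"
```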
Note, because the player may switch parts at any time, the historical information is typically consulted only when it is ambiguous which part is being sung. For example, at t3 in some implementations, there is no ambiguity as to which part the player is singing: input pitch 310 is outside the tolerance threshold 325 and therefore it is determined that the player cannot be singing the melody cue 300 and must be singing the harmony cue 305. In some implementations, where the input falls outside the tolerance threshold of a cue, the historical data stored in the melody score memory 335 and the harmony score memory 340 is not consulted and the player's input is no longer assigned to that part regardless of history. However, should the player's input 310 again enter the threshold 325 of the part 300, and it once again becomes ambiguous which part is being sung, the historical data is consulted and the player's input 310 could be reassigned to the original part 300.
Biasing a player is also particularly useful when parts directly overlap, e.g., when the melody and harmony have the same pitch. Without biasing a player to a part, a scenario could result where, before the parts converged, a first player was singing the melody and a second player was singing the harmony. Upon convergence, because both players would be within the threshold of the part they were not singing, they could conceivably be scored for singing the other part. Naturally, this is not desirable: if the player singing the melody was consistently accurate before the convergence, and therefore accumulating bonus points, but was then awarded no points during the convergence because the singer was scored only for the harmony part, this would ruin the player's enjoyment of the game. Instead, by determining that the first player was singing the melody before and is likely singing the melody now based on the historical performance data, the player is still associated with the melody during the converged, overlapping parts. Biasing the singer to the melody allows the singer to continue accruing score and bonus points for the melody. Likewise, if a player was singing the harmony part and the two parts converged, it would be undesirable to score the singer for only the melody part, thereby negatively impacting his score for the harmony portion.
Another example of parts converging, specifically overlapping, is depicted in
Scoring Musical Performances During and After Periods of Ambiguity
In some cases, referring back to
In some implementations, a score is determined based on which of multiple parts was performed the most completely for a given time frame, e.g., for a phrase. In some of these implementations, any additional input is treated as a bonus or additional score. For example, in
In some embodiments, the most completely performed part forms the basis of the score assigned to the vocal “player.” In this case, the performance of the harmony cues 510, 515 is more complete for the harmony part than the performance of the melody cues 500 was for the melody part. As a result, in some embodiments, the player(s) are awarded 66% of the possible score for the harmony part for the phrase and nothing for the melody. In other embodiments, additional parts that were performed, but not as completely as the most-completely-performed part, are converted into bonus points that are added to the score. For example, the harmony part may be awarded 66% of the possible points for performing the harmony cues, and then 50% of the points possible for the melody are added to that. Or, in some implementations, a fraction of the less-completed score is awarded, e.g., 10% of the possible points for the other parts; in this case 50% multiplied by 10%, so 5% of the possible points for the melody, is added to the harmony score. In some embodiments, the duration of a cue may play a part in the determination of how complete the performance of a particular part was. For example, performing cue 500 is considered performing seventy-five percent (75%) of the melody part because the cue 500 is longer than cues 510 and is deemed “worth more.” In some embodiments, performing only a portion of the cues is considered completing the part 100%, or a completion bonus is added to the amount performed to achieve 100% completion. For example, sustaining a pitch for a particular duration may be heavily weighted (considered a success) and thus performing 500 and only a fraction of 505 is sufficient to achieve 100% completion.
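The base-plus-bonus arithmetic described above can be illustrated with a short Python sketch; the 10% bonus fraction mirrors the example in the text, while the function name and the per-part point totals used in the example are assumptions.

```python
def phrase_score(completion, possible_points, bonus_fraction=0.10):
    """Score a phrase when multiple parts were partially performed.

    completion: dict of part -> fraction of that part's cues performed (0..1).
    possible_points: dict of part -> maximum points available for that part.
    The most-completely-performed part forms the base score; each other part
    contributes a fraction of its earned points as a bonus (10% here, matching
    the illustrative figure in the text).
    """
    best = max(completion, key=completion.get)
    score = completion[best] * possible_points[best]
    for part, done in completion.items():
        if part != best:
            score += done * possible_points[part] * bonus_fraction
    return score


# Example from the text: harmony 66% complete, melody 50% complete,
# each part worth 1000 points -> 660 base + 50 bonus = 710.
print(phrase_score({"harmony": 0.66, "melody": 0.50},
                   {"harmony": 1000, "melody": 1000}))
```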
Allowing for a completion metric and supplemental scoring is beneficial in that it encourages additional players to sing in order to enjoy the game and achieve a high score. The supplemental scoring is accomplished by several functions. First, the target music data is displayed on the display 105 by the game platform 100. The game platform 100 then receives music performance input from the player and the player's input is associated with the first music performance, e.g., the player is singing the melody of the song according to the displayed target music data. Then, a new set of target music data is displayed on the display 105 and a new, second set of input is received by the game platform 100. The game platform 100 calculates a score, e.g., via the singing analysis module 130, for the first input based on the first music performance input data and calculates a second score based on the second music performance input data. Depending on which score is higher, first or second, the game platform chooses one as the preferred score. For illustration purposes, assume the first score was higher. That score, the score for the first part, becomes the effective score for both parts since it was the most complete. However, in some implementations, the second score is not discarded; instead, the scores that were not selected to be the preferred score are modified via a score multiplier and the preferred score is modified based on the non-preferred score and the multiplier. In other implementations, rather than picking a preferred score and adding to it or modifying it, a “final score” is determined based on both scores, e.g., they are combined, added, the first is multiplied by the second, the second provides an incremental increase, or other means of combination are used.
In some implementations, the phrase performance meters 240 reflect the performance so far for a part, e.g., when a first harmony is sixty-six percent complete, sixty-six percent of the corresponding phrase performance meter, e.g., 240b, fills. Similarly, when fifty percent of the melody is completed, fifty percent of the melody performance meter, e.g., 240a, is filled. In some implementations, phrase performance does not directly map to filling a meter. For example, performing sixty percent of a part is “good enough” to consider the phrase two-thirds complete, or eighty percent is good enough to consider the phrase one hundred percent complete where there are four tubes to sing (each tube thereby counting for twenty-five percent). In some embodiments, the most-completely performed part, or most filled phrase performance meter, contributes to a counter that fills the score multiplier indicators 250, e.g., the more complete the performance of a part, the more the meter is filled. In some implementations, less-complete performances of other parts also contribute to filling the score multiplier indicators 250.
Pitch Guide That Displays Multiple Octaves and Harmonically Relevant Pitches
One aspect provides an improved method of displaying vocal cues. To increase the player's appreciation of the relative difference between pitches represented by the vertical position of a pitch cue, the shading on the backdrop behind the vocal cues divides the space into octave-sized regions. For example, in
Another beneficial aspect is that horizontal lines 625a, 625b (collectively 625) (and denoted by 605, 610, 615 as well) in each lane 600, 620 indicate pitches that make musical sense in the context of the song. It is typical for the note tubes pictured to line up with one of the background lines because the note tubes represent pitches in the song, and the pitches in the song are typically related musically. As an example, the songs depicted in lanes 600 and 620 could both be in the key of A, in which case the lines 625 indicate A, C#, and E, though for other songs these lines may refer to different pitches. For example, in the key of Cm, these lines would refer to C, E♭ (E flat), and G. Some songs use a wider range of pitches than others, and the background and line 625 spacing can pan and scale to accommodate variable ranges. For example, lane 600 has a wider pitch range (three octaves) than lane 620, so the backdrops have different vertical scales.
What is harmonically relevant depends on the song. In some embodiments, the horizontal lines reflect specific notes of a scale or chord. For example, the specific notes of a major chord may be the tonic, the 3rd, and the 5th of the scale. In other embodiments, the horizontal lines reflect specific notes of a minor chord, e.g., a tonic, minor 3rd, and 5th of the scale. Optionally the notes could include a 7th or other notes of the scale. In some embodiments the horizontal lines reflect specific notes of a particular mode of a scale, e.g., the Ionian, Dorian, Phrygian, Lydian, Mixolydian, Aeolian, or Locrian modes, or the like. Beneficially, these embodiments can be combined. For example, the game platform causes the display to display the horizontal lines of a first phrase of a song as reflecting the notes of a major chord, the horizontal lines of the next phrase are displayed reflecting the notes of a minor chord, and then a third phrase is again displayed reflecting a major chord. Optionally a mode can be substituted for any of the chords in the preceding example. Beneficially, the lines 625a, 625b indicate the harmonically relevant pitches. In some implementations, 625a, 625b are not lines, but are perceivable gaps in the coloring or shading of a section.
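As an illustration of how the harmonically relevant pitches for a phrase might be derived from its key, consider the following Python sketch; the helper name and the restriction to major and minor triads are assumptions made for brevity.

```python
# Map of pitch-class names to semitone offsets (C = 0).
PITCH_CLASSES = {"C": 0, "C#": 1, "Db": 1, "D": 2, "D#": 3, "Eb": 3, "E": 4,
                 "F": 5, "F#": 6, "Gb": 6, "G": 7, "G#": 8, "Ab": 8, "A": 9,
                 "A#": 10, "Bb": 10, "B": 11}

def harmonic_line_pitch_classes(tonic, quality="major"):
    """Return the pitch classes at which to draw the horizontal lines for a
    phrase, using the tonic, 3rd, and 5th of the key, matching the examples
    above: key of A -> A, C#, E; key of Cm -> C, Eb, G."""
    root = PITCH_CLASSES[tonic]
    third = 4 if quality == "major" else 3    # major 3rd = 4 half steps, minor 3rd = 3
    return [root % 12, (root + third) % 12, (root + 7) % 12]


print(harmonic_line_pitch_classes("A"))           # [9, 1, 4] -> A, C#, E
print(harmonic_line_pitch_classes("C", "minor"))  # [0, 3, 7] -> C, Eb, G
```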
Before a song begins, the game platform 100 determines, in some implementations by a song analysis module (not displayed), which pitches to demarcate as harmonically relevant. Advantageously, the game platform 100 can also change the demarcations during a song on a per-phrase basis if applicable, e.g., if the song has multiple keys. The game platform 100 analyzes the song or phrase (i.e., analyzes the musical data of the song) to determine a scale within the song. The lane is displayed with a number of interval demarcations based on the scale. Also, a background to the lane is displayed with a color scheme that is based on preselected pitches of the scale. Then the song data (or target music data) is displayed. Beneficially, the display of the pitch range for any phrase can be dynamic, that is, a phrase with low notes can be displayed with a given pitch range and note density and another phrase can be displayed with a different pitch range and note density.
Dynamically Displaying a Pitch Range
Dynamically displaying a pitch range allows the game to display a lane of a constant size, but to shift the displayed area to different upper and lower pitches, and to “zoom in” and “zoom out” of the displayed pitch range to display different pitch or note tube densities. In some embodiments pitch density is the spacing between the note tubes. In other implementations it corresponds to the thickness of note tubes. In still other implementations, pitch density is a combination of tube spacing and tube thickness. Where the pitch density of the current portion is different from the pitch density of a prior portion, the spacing between note tubes or gems of the displayed pitches is changed. Advantageously, some implementations utilize both dynamic range functionalities, that is, both shifting the displayed pitch area and dynamically altering the pitch density. Beneficially, these determinations can be made before gameplay begins or during gameplay on a portion-by-portion basis.
To shift the displayed area, in some implementations, the game platform 100 divides a song into portions. In some implementations this is performed by a song analysis module (not displayed). Portions can be phrases or other musically significant divisions, e.g., bars or groupings of two or more notes. The game platform 100 then determines a deviation between the highest note and the lowest note for each portion to determine the pitch range for that portion. When displayed, the lowest note of a portion of the song typically aligns with the bottom of the lane and the highest note of a portion typically aligns with the top of the lane, even if there is a higher note later in the song. The game platform 100 then determines a pitch density for the entire song, which is used to determine the size of the lane in some implementations, based on the largest pitch range of all portions. This allows the highest note and lowest note of every portion to fit within the viewable area. Then, each portion is displayed via display logic of the game platform 100 within the lane on the display 105. Beneficially, because some portions will have notes that are higher than notes in other portions (or notes lower than those in other portions), the viewable area that displays the notes for that portion can change position according to the changes in pitches.
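A minimal sketch of the per-portion range computation, assuming each portion is simply a list of MIDI note numbers, could look like the following; the names and the data layout are illustrative.

```python
def portion_ranges(portions):
    """For each portion (a list of note pitches, e.g., MIDI numbers), return
    (low, high) so the display window can align the lowest note with the
    bottom of the lane and the highest note with the top.

    The lane's overall size is driven by the widest range across all portions,
    so every portion is guaranteed to fit within the viewable area.
    """
    ranges = [(min(notes), max(notes)) for notes in portions]
    lane_span = max(high - low for low, high in ranges)   # pitch density for the song
    return ranges, lane_span


# Three portions in different registers: the window shifts per portion,
# while the lane is sized for the widest portion (here, 10 half steps).
ranges, span = portion_ranges([[60, 64, 67], [62, 65, 72], [55, 59, 60]])
print(ranges, span)   # [(60, 67), (62, 72), (55, 60)] 10
```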
In some implementations, the “zoom value,” or the deviation between the highest and lowest notes, is changeable. In
In some implementations, the game platform 100, via display logic, directs the display 105 to visually zoom in and zoom out when displaying portions of different densities. This is accomplished similarly to the steps above with respect to a shiftable display: the song is divided into portions, and the pitch range for each portion is determined. Then the game platform 100 determines a display density for each portion based on the pitch range of that portion. Then the portion is displayed with a pitch density that is alterable based on the displayed portion. For example, transitioning between portions 645 and 650 alters the appearance of the viewable area such that the portion appears to zoom out to show the greater pitch range of 650 (the highest note of 650 could not be displayed in 645 because it is outside the viewable area displayed in 645). Then, because 655 does not have the high pitches that 650 does, the viewable area zooms in and displays a pitch density that corresponds to the highest and lowest notes of 655. In some implementations, the visual effect of transitioning from 645 to 650 is that the note tubes appear to shrink in the vertical dimension to allow a wider span, or a larger number, of note tubes to be displayed in a single lane.
Referring back to
Beneficially, dynamic pitch range is not limited to vocal parts; some embodiments use dynamic pitch range for instrumental parts such as a keyboard, guitar, or drums as well.
Again, the musical composition is divided into portions 668, 670, 672. Then the game platform 100 determines the pitch range for each portion, and each portion is displayed with the appropriate number of sub-lanes. The amount to zoom in, or in this case, the number of sub-lanes to display, is determined based on the left-most and right-most gems, e.g., a span of five sub-lanes in portion 668, four sub-lanes in portion 670, and three sub-lanes in portion 672. Additionally or alternatively, the game platform 100 determines a limited area of the total twelve sub-lanes to display, e.g., only the low portion, or only the middle portion, or only the high portion, or combinations of low and middle, middle and high, or other combinations. For portion 668, the game platform 100 determines that only the five sub-lanes of section 662 are required. Therefore, during portion 668, the display window's position is altered such that only 662's sub-lanes are displayed, and a lane similar to 674 is displayed to the player. Then, as gameplay progresses, the game platform determines that for portion 670, only the sub-lanes of section 664 need to be displayed and the display window shifts horizontally to the right to display only the sub-lanes of 664. Then, similarly, for portion 672, the display window shifts to the right again and displays only the sub-lanes of section 666, i.e., as shown in 676 of
Beneficially, continuing the above example, not only is the position of the display window altered by moving from left to right as gameplay transitions between portions, but the zoom value used to display each portion changes as well. Section 662 is rendered as a five sub-lane lane 674 in
Referring now to
Displaying multiple arrows is accomplished when the game platform 100 receives music performance input data via a microphone. The game platform 100 displays the player's input on the display as a pitch marker that is reflective of the music performance input data, e.g., that indicates the relative pitch of the performance. Then, the game platform 100 displays, on the display 105, substantially simultaneously with the display of the first pitch marker, a second pitch marker at a vertical offset from the first pitch marker. The offset is indicative of an octave difference between the first pitch marker and the second pitch marker, such as being an octave above the first pitch marker or an octave below the first pitch marker. Alternatively, if the lane is displayed to show multiple octaves, the second pitch marker can be displayed at any octave offset from the first pitch marker.
The singing analysis module 130 of the game platform 100 will, in some embodiments, calculate a score for the first pitch marker based on a comparison between the first pitch marker and the pitch component of the vocal cue. Additionally, the singing analysis module 130 of the game platform 100 calculates a second score for the second pitch marker based on a comparison between the second pitch marker and the pitch component of the vocal cue. Alternatively, the score can be calculated for the second pitch marker if the input has a degree of matching with a different part, e.g., a harmony line an octave below the melody line the player was singing.
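A small sketch of displaying and checking an octave-offset second pitch marker follows; shifting the second marker toward the cue, and the tolerance used for the comparison, are illustrative assumptions.

```python
def octave_markers(input_midi, cue_midi, tolerance=1.0):
    """Return two (marker_pitch, matched) pairs: one for the pitch actually
    sung and one for the same pitch shifted an octave toward the cue, each
    checked against the cue's pitch.  Shifting toward the cue (rather than
    always up or always down) is an illustrative choice."""
    offset = 12 if cue_midi > input_midi else -12
    second = input_midi + offset
    return [(input_midi, abs(input_midi - cue_midi) <= tolerance),
            (second, abs(second - cue_midi) <= tolerance)]


# A player sings C4 (60) against a cue at C5 (72): the first marker misses,
# but the octave-shifted marker matches and can be scored instead.
print(octave_markers(60, 72))   # [(60, False), (72, True)]
```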
Practice Mode for Multiple Musical Parts
To assist players that wish to sing or perform different parts, a practice mode is provided where multiple parts are displayed, but only one is scored. In
Specifically, the practice mode is enabled by displaying, on the display 105, a note tube or target music data associated with the song or musical composition. The game platform 100 receives, via an input interface or menu (or alternatively determines based on the player's performance using dynamic part determination), a particular part the player wants to perform, such as the part associated with 800, 805, 810. Then the game platform 100 produces an audio output, via the sound processing module 135, associated with the vocal cues, e.g., singing or music for the selected part. Optionally, audible sounds, lyrics, or notes can be played for all parts. Further, the game platform 100 can produce a synthesized tone associated with the selected part that helps the player match their voice to the pitch of the audible tone. As the player practices the part, a score is calculated based on the degree of matching between the player's input and the note tube of the selected part. If a different part is selected for practice, an audible tone for that part is played and the other, non-selected parts are then dimmed (including the first-selected part if it was not chosen again). Allowing the player to practice a section and not be dynamically assigned to a different part improves the player's enjoyment of the game.
Prior art games display scrolling vocal cues and scrolling text. Unfortunately, this can cause blurring of the text on certain displays 105 and/or screen “tearing.” One aspect of the invention provides an improved way of displaying lyrics and target music data.
Displaying Song Lyrics and Vocal Cues
This is accomplished by the game platform 100 displaying, on the display 105 via display logic, a vocal cue and moving the vocal cue on the display towards a target marker in synchronization with the timing of the song. Then, the game platform 100 displays, on the display 105, a lyric associated with the vocal cue in a fixed position until the vocal cue has moved to a particular position with respect to the target marker, such as over or past the target marker.
Beneficially, a “queued” word also has its color changed so the player knows it is not the lyric being sung now, but will be the next lyric to be sung. For example, the lyric that should currently be sung, “or,” would in some versions be colored green. “Madam,” the next lyric or syllable, is colored white. Then the remaining lyrics, as well as lyrics already sung, are colored grey. By having the lyrics remain static while the note tubes still move, this aspect of the invention overcomes the blurring and tearing of the lyric text experienced on some displays with prior art vocal games.
Selectively Displaying Song Lyrics
In one aspect of the invention, a way of determining which lyrics to display to players in a multi-vocal-part game is provided. In
Advantageously, each set of lyrics is assigned a priority by the game platform 100. Typically the two lyric lines with the highest priorities are displayed, although in some implementations the priority is determined randomly at run-time. In some embodiments, the priority is assigned to each lyric line by the game developer, while in others, it can be chosen by the players before gameplay begins or during gameplay.
Determining which lyrics to display begins with the game platform 100 determining how many vocal cues will need to be displayed. Then, based either on limitations provided by the game developer before the game is executed or determined at run-time, the game platform 100 determines the number of areas available to display lyrics. When the number of spaces to display lyrics is less than the number of vocal cues (or, correspondingly, the number of vocal cues is greater than the available spaces), the game platform 100 determines which lyrics have the highest priority and displays the lyrics, one set per display area, in priority order until the available spaces have been exhausted.
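The priority-based selection can be illustrated with a short Python sketch; the tuple layout, the random tie-breaking, and the example priorities are assumptions rather than details from this description.

```python
import random

def lyrics_to_display(lyric_lines, available_areas):
    """Choose which lyric lines to show when there are more vocal parts with
    lyrics than on-screen areas to hold them.

    lyric_lines: list of (priority, text) pairs; higher priority is shown first.
    Ties are broken randomly at run time, as described above.
    """
    ordered = sorted(lyric_lines,
                     key=lambda line: (line[0], random.random()),
                     reverse=True)
    return [text for _priority, text in ordered[:available_areas]]


# Three parts have lyrics but only two display areas exist:
# the two highest-priority lyric lines are shown.
print(lyrics_to_display([(3, "melody line"), (2, "harmony 1"), (1, "harmony 2")], 2))
```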
In
In some embodiments, a lyric's priority can be determined randomly at run time if multiple sets of lyrics have the same priority assigned to them by the developer. Beneficially, this adds to the player's enjoyment because different lyrics are displayed for each session and later gameplay sessions do not play out exactly like prior gameplay sessions. In some embodiments, the priority determination is done at the beginning of a song, while in others the determination is made on a per-phrase or per-bar basis. Notably, the parts associated with each set of lyrics do not need to be designated as the melody part and two harmony parts. In some embodiments there are three or more harmony parts with no melody part. In other embodiments there are three or more parts that are not designated as a melody or harmony and instead are just considered different parts. Other combinations, e.g., two melodies and one harmony, and the like, are also contemplated.
Preventing an Unintentional Deploy of a Bonus
As described earlier, either vocally or with an instrument, a deployable bonus can become available (see bonus meter 245 in
Continuing the example, during deploy period 1105, the player is singing “shout” successfully and therefore the player's input 1110 has a degree of matching with vocal cue 1115. As explained above, the degree of matching can be a measure of how close the input pitch 1110 is to the vocal cue 1115, or the input pitch 1110 can simply be within the tolerance threshold (not displayed) of the vocal cue 1115, or a combination of these depending on the implementation. When the game platform 100 determines, e.g., via the singing analysis module 130, that an input has a degree of matching with a part being displayed, the game platform prevents that input from satisfying the bonus deploy criteria (the certain-volume-for-a-certain-time criteria) and effectively blocks that input from triggering the bonus deploy (in some embodiments, it accomplishes this by not counting the volume or duration of the input towards the internal meter that determines if the volume or duration of an input satisfies the triggering criteria). Then, because the melody part 1115 does not have a vocal cue or lyrics to be sung, the bonus can be deployed during the melody part's period of silence (indicated at deploy period 1100). During 1100, if any player, including the person who just sang the melody, sings either harmony part 1120, 1125 successfully, that person is also prevented from deploying the bonus. However, if the person provides input that is above a certain volume for a certain period, and that input does not have a degree of matching with either cue/part 1120, 1125, then the bonus will be deployed.
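A minimal sketch of this blocking behavior, assuming simple volume, duration, and pitch-tolerance thresholds, might look like the following; all names and threshold values are illustrative.

```python
def may_deploy_bonus(volume, duration, input_pitch, displayed_cues,
                     min_volume=0.5, min_duration=0.5, tolerance=1.0):
    """Return True if a loud, sustained input should trigger the bonus deploy.

    An input that matches any currently displayed vocal cue is blocked from
    triggering the deploy, so singing a displayed part never fires the bonus
    by accident.  The threshold values are illustrative only.
    """
    loud_and_long_enough = volume >= min_volume and duration >= min_duration
    if not loud_and_long_enough:
        return False
    matches_a_part = any(abs(input_pitch - cue) <= tolerance
                         for cue in displayed_cues)
    return not matches_a_part   # block the deploy while the input matches a displayed part


# Singing a displayed harmony cue during the melody's rest does not deploy the
# bonus, but an unpitched shout far from both displayed cues does.
print(may_deploy_bonus(0.9, 0.8, 67.0, [67, 72]))   # False (matches a cue)
print(may_deploy_bonus(0.9, 0.8, 40.0, [67, 72]))   # True
```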
The above-described techniques can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. The implementation can be as a computer program product, i.e., a computer program tangibly embodied in a machine-readable storage device, for execution by, or to control the operation of, data processing apparatus, e.g., a programmable processor, a computer, a game console, or multiple computers or game consoles. A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or game console or on multiple computers or game consoles at one site or distributed across multiple sites and interconnected by a communication network.
Method steps can be performed by one or more programmable processors executing a computer or game program to perform functions of the invention by operating on input data and generating output. Method steps can also be performed by, and apparatus can be implemented as, a game platform such as a dedicated game console, e.g., PLAYSTATION® 2, PLAYSTATION® 3, or PSP® manufactured by Sony Corporation; WII™, NINTENDO DS®, NINTENDO DSi™, or NINTENDO DS LITE™ manufactured by Nintendo Corp.; or XBOX® or XBOX 360® manufactured by Microsoft Corp.; or special purpose logic circuitry, e.g., an FPGA (field programmable gate array), an ASIC (application-specific integrated circuit), or other specialized circuit. Modules can refer to portions of the computer or game program and/or the processor/special circuitry that implements that functionality.
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer or game console. Generally, a processor receives instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer or game console are a processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer also includes, or is operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical, or optical disks. Data transmission and instructions can also occur over a communications network. Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, the above described techniques can be implemented on a computer or game console having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, a television, or an integrated display, e.g., the display of a PSP® or Nintendo DS. The display can in some instances also be an input device such as a touch screen. Other typical inputs include simulated instruments, microphones, or game controllers. Alternatively input can be provided by a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer or game console. Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input.
The above described techniques can be implemented in a distributed computing system that includes a back-end component, e.g., as a data server, and/or a middleware component, e.g., an application server, and/or a front-end component, e.g., a client computer or game console having a graphical user interface through which a user can interact with an example implementation, or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet, and include both wired and wireless networks.
The computing/gaming system can include clients and servers or hosts. A client and server (or host) are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
The invention has been described in terms of particular embodiments. The alternatives described herein are examples for illustration only and are not intended to be limiting in any way. The steps of the invention can be performed in a different order and still achieve desirable results. Other embodiments are within the scope of the following claims.
20020019258 | Kim et al. | Feb 2002 | A1 |
20020022520 | Oe et al. | Feb 2002 | A1 |
20020022522 | Yamada | Feb 2002 | A1 |
20020025841 | Nobe et al. | Feb 2002 | A1 |
20020025842 | Nobe et al. | Feb 2002 | A1 |
20020025853 | Kojima et al. | Feb 2002 | A1 |
20020027899 | Ikeda | Mar 2002 | A1 |
20020032054 | Hosoya | Mar 2002 | A1 |
20020041385 | Onodera | Apr 2002 | A1 |
20020052236 | Kohira et al. | May 2002 | A1 |
20020054127 | Omori et al. | May 2002 | A1 |
20020055383 | Onda et al. | May 2002 | A1 |
20020055386 | Yotsugi et al. | May 2002 | A1 |
20020061776 | Wada et al. | May 2002 | A1 |
20020065121 | Fukunaga et al. | May 2002 | A1 |
20020085833 | Miyauchi | Jul 2002 | A1 |
20020091455 | Williams | Jul 2002 | A1 |
20020091847 | Curtin | Jul 2002 | A1 |
20020094865 | Araki et al. | Jul 2002 | A1 |
20020094866 | Takeda et al. | Jul 2002 | A1 |
20020105229 | Tanaka | Aug 2002 | A1 |
20020119811 | Yabe et al. | Aug 2002 | A1 |
20020128736 | Yoshida et al. | Sep 2002 | A1 |
20020142818 | Nakatsuka et al. | Oct 2002 | A1 |
20020142824 | Kazaoka et al. | Oct 2002 | A1 |
20020142827 | Aida et al. | Oct 2002 | A1 |
20020142834 | Sobue | Oct 2002 | A1 |
20020151337 | Yamashita et al. | Oct 2002 | A1 |
20020160824 | Goto et al. | Oct 2002 | A1 |
20020169014 | Egozy et al. | Nov 2002 | A1 |
20020187835 | Nakayama et al. | Dec 2002 | A1 |
20020198045 | Okubo | Dec 2002 | A1 |
20030003431 | Maeda | Jan 2003 | A1 |
20030003991 | Kuraishi | Jan 2003 | A1 |
20030003992 | Furuya | Jan 2003 | A1 |
20030011620 | Moriyama | Jan 2003 | A1 |
20030014262 | Kim | Jan 2003 | A1 |
20030017872 | Oishi et al. | Jan 2003 | A1 |
20030028598 | Moller et al. | Feb 2003 | A1 |
20030032478 | Takahama et al. | Feb 2003 | A1 |
20030045334 | Hosokawa | Mar 2003 | A1 |
20030069071 | Britt et al. | Apr 2003 | A1 |
20030078086 | Matsuyama et al. | Apr 2003 | A1 |
20030078102 | Okita et al. | Apr 2003 | A1 |
20030099461 | Johnson | May 2003 | A1 |
20030104868 | Okita et al. | Jun 2003 | A1 |
20030109298 | Oishi et al. | Jun 2003 | A1 |
20030164084 | Redmann et al. | Sep 2003 | A1 |
20030218626 | Greene | Nov 2003 | A1 |
20030232644 | Takahashi et al. | Dec 2003 | A1 |
20030232645 | Suda et al. | Dec 2003 | A1 |
20040021684 | B. Millner | Feb 2004 | A1 |
20040054725 | Moller et al. | Mar 2004 | A1 |
20040063479 | Kimura | Apr 2004 | A1 |
20040063480 | Wang | Apr 2004 | A1 |
20040072620 | Nagata et al. | Apr 2004 | A1 |
20040077405 | Watanabe | Apr 2004 | A1 |
20040082380 | George et al. | Apr 2004 | A1 |
20040082386 | George et al. | Apr 2004 | A1 |
20040089139 | Georges et al. | May 2004 | A1 |
20040092303 | George et al. | May 2004 | A1 |
20040092304 | George et al. | May 2004 | A1 |
20040092305 | George et al. | May 2004 | A1 |
20040092306 | George et al. | May 2004 | A1 |
20040092307 | George et al. | May 2004 | A1 |
20040092313 | Saito et al. | May 2004 | A1 |
20040092314 | George et al. | May 2004 | A1 |
20040093354 | Xu et al. | May 2004 | A1 |
20040098582 | Mori | May 2004 | A1 |
20040109000 | Chosokabe | Jun 2004 | A1 |
20040113360 | George et al. | Jun 2004 | A1 |
20040116069 | Fadavi-Ardekani et al. | Jun 2004 | A1 |
20040116184 | George et al. | Jun 2004 | A1 |
20040116185 | George et al. | Jun 2004 | A1 |
20040123726 | Kato et al. | Jul 2004 | A1 |
20040127282 | Naobayashi | Jul 2004 | A1 |
20040127291 | George et al. | Jul 2004 | A1 |
20040132518 | Uehara et al. | Jul 2004 | A1 |
20040132531 | George et al. | Jul 2004 | A1 |
20040152514 | Kasai et al. | Aug 2004 | A1 |
20040154460 | Virolainen et al. | Aug 2004 | A1 |
20040181592 | Samra et al. | Sep 2004 | A1 |
20040186720 | Kemmochi | Sep 2004 | A1 |
20040204211 | Suzuki | Oct 2004 | A1 |
20040204238 | Aoki | Oct 2004 | A1 |
20040205204 | Chafe | Oct 2004 | A1 |
20040207774 | Gothard | Oct 2004 | A1 |
20040209673 | Shiraishi | Oct 2004 | A1 |
20040229685 | Smith et al. | Nov 2004 | A1 |
20040236543 | Stephens | Nov 2004 | A1 |
20040239678 | Tsunashima et al. | Dec 2004 | A1 |
20040243482 | Laut | Dec 2004 | A1 |
20040254016 | Shimazaki | Dec 2004 | A1 |
20040259631 | Katz et al. | Dec 2004 | A1 |
20040259632 | Crittenden et al. | Dec 2004 | A1 |
20040259644 | McCauley | Dec 2004 | A1 |
20050027381 | George et al. | Feb 2005 | A1 |
20050027383 | Nagata et al. | Feb 2005 | A1 |
20050045025 | Wells et al. | Mar 2005 | A1 |
20050049047 | Kitao | Mar 2005 | A1 |
20050059480 | Soukup et al. | Mar 2005 | A1 |
20050060231 | Soukup et al. | Mar 2005 | A1 |
20050070349 | Kimura | Mar 2005 | A1 |
20050070359 | Rodriquez et al. | Mar 2005 | A1 |
20050073427 | Gothard | Apr 2005 | A1 |
20050075165 | George et al. | Apr 2005 | A1 |
20050082559 | Hasan Zaidi et al. | Apr 2005 | A1 |
20050101364 | Onoda et al. | May 2005 | A1 |
20050106546 | Strom | May 2005 | A1 |
20050115383 | Chang | Jun 2005 | A1 |
20050120865 | Tada | Jun 2005 | A1 |
20050120868 | Hinman et al. | Jun 2005 | A1 |
20050143174 | Goldman et al. | Jun 2005 | A1 |
20050164779 | Okuniewicz | Jul 2005 | A1 |
20050181864 | Britt et al. | Aug 2005 | A1 |
20050215319 | Rigopulos et al. | Sep 2005 | A1 |
20050221892 | Takase | Oct 2005 | A1 |
20050227767 | Shimomura et al. | Oct 2005 | A1 |
20050229769 | Resnikoff | Oct 2005 | A1 |
20050235809 | Kageyama | Oct 2005 | A1 |
20050250565 | Nojiri et al. | Nov 2005 | A1 |
20050252362 | McHale et al. | Nov 2005 | A1 |
20050255914 | McHale et al. | Nov 2005 | A1 |
20050255923 | Aoki | Nov 2005 | A1 |
20050273319 | Dittmar et al. | Dec 2005 | A1 |
20060003839 | Lawrence et al. | Jan 2006 | A1 |
20060009282 | George et al. | Jan 2006 | A1 |
20060009979 | McHale et al. | Jan 2006 | A1 |
20060026304 | Price | Feb 2006 | A1 |
20060030382 | Okamura et al. | Feb 2006 | A1 |
20060052161 | Soukup et al. | Mar 2006 | A1 |
20060052162 | Soukup et al. | Mar 2006 | A1 |
20060052163 | Aida | Mar 2006 | A1 |
20060052167 | Boddicker et al. | Mar 2006 | A1 |
20060052169 | Britt et al. | Mar 2006 | A1 |
20060058099 | Soukup et al. | Mar 2006 | A1 |
20060058101 | Rigopulos | Mar 2006 | A1 |
20060063573 | Ishikawa et al. | Mar 2006 | A1 |
20060068911 | Pirich et al. | Mar 2006 | A1 |
20060107819 | Salter | May 2006 | A1 |
20060107822 | Bowen | May 2006 | A1 |
20060135253 | George et al. | Jun 2006 | A1 |
20060152622 | Tan et al. | Jul 2006 | A1 |
20060154710 | Serafat | Jul 2006 | A1 |
20060166744 | Igarashi et al. | Jul 2006 | A1 |
20060175758 | Riolo | Aug 2006 | A1 |
20060189879 | Miyajima et al. | Aug 2006 | A1 |
20060191401 | Ueshima et al. | Aug 2006 | A1 |
20060204214 | Shah et al. | Sep 2006 | A1 |
20060218239 | Umezawa et al. | Sep 2006 | A1 |
20060218288 | Umezawa et al. | Sep 2006 | A1 |
20060247046 | Choi et al. | Nov 2006 | A1 |
20060252503 | Salter | Nov 2006 | A1 |
20060258450 | Ishihata et al. | Nov 2006 | A1 |
20060266200 | Goodwin | Nov 2006 | A1 |
20060287106 | Jensen | Dec 2006 | A1 |
20060288842 | Sitrick et al. | Dec 2006 | A1 |
20060290810 | Mallinson | Dec 2006 | A1 |
20070015571 | Walker et al. | Jan 2007 | A1 |
20070026943 | Yoshimura | Feb 2007 | A1 |
20070059670 | Yates | Mar 2007 | A1 |
20070060312 | Dempsey et al. | Mar 2007 | A1 |
20070081562 | Ma | Apr 2007 | A1 |
20070088812 | Clark | Apr 2007 | A1 |
20070111802 | Ishihara et al. | May 2007 | A1 |
20070119292 | Nakamura | May 2007 | A1 |
20070140510 | Redmann | Jun 2007 | A1 |
20070155494 | Wells et al. | Jul 2007 | A1 |
20070162497 | Pauws | Jul 2007 | A1 |
20070163427 | Rigopulos et al. | Jul 2007 | A1 |
20070163428 | Salter | Jul 2007 | A1 |
20070168415 | Matahira et al. | Jul 2007 | A1 |
20070175317 | Salter | Aug 2007 | A1 |
20070178973 | Camhi | Aug 2007 | A1 |
20070201815 | Griffin | Aug 2007 | A1 |
20070218444 | Konetski et al. | Sep 2007 | A1 |
20070226293 | Sakurada et al. | Sep 2007 | A1 |
20070232374 | Lopiccolo et al. | Oct 2007 | A1 |
20070234284 | Tanner et al. | Oct 2007 | A1 |
20070234881 | Takehisa | Oct 2007 | A1 |
20070234885 | Schmidt et al. | Oct 2007 | A1 |
20070243915 | Egozy et al. | Oct 2007 | A1 |
20070245881 | Egozy et al. | Oct 2007 | A1 |
20070256540 | Salter | Nov 2007 | A1 |
20070256541 | McCauley | Nov 2007 | A1 |
20070260984 | Marks et al. | Nov 2007 | A1 |
20070265095 | Jonishi | Nov 2007 | A1 |
20070270223 | Nonaka et al. | Nov 2007 | A1 |
20070273700 | Nash et al. | Nov 2007 | A1 |
20070297755 | Holt et al. | Dec 2007 | A1 |
20080009346 | Jessop et al. | Jan 2008 | A1 |
20080026355 | Petef | Jan 2008 | A1 |
20080053295 | Goto et al. | Mar 2008 | A1 |
20080076497 | Kiskis et al. | Mar 2008 | A1 |
20080096654 | Mondesir et al. | Apr 2008 | A1 |
20080101762 | Kellock et al. | May 2008 | A1 |
20080102958 | Kitamura et al. | May 2008 | A1 |
20080113698 | Egozy | May 2008 | A1 |
20080113797 | Egozy | May 2008 | A1 |
20080115657 | Wakiyama | May 2008 | A1 |
20080125229 | Jonishi | May 2008 | A1 |
20080146342 | Harvey et al. | Jun 2008 | A1 |
20080155421 | Ubillos et al. | Jun 2008 | A1 |
20080184870 | Toivola | Aug 2008 | A1 |
20080200224 | Parks | Aug 2008 | A1 |
20080202321 | Goto et al. | Aug 2008 | A1 |
20080220864 | Brosius et al. | Sep 2008 | A1 |
20080222685 | McCarthy et al. | Sep 2008 | A1 |
20080268943 | Jacob | Oct 2008 | A1 |
20080273755 | Hildreth | Nov 2008 | A1 |
20080276175 | Kim et al. | Nov 2008 | A1 |
20080280680 | Dutilly et al. | Nov 2008 | A1 |
20080288866 | Spencer et al. | Nov 2008 | A1 |
20080289477 | Salter | Nov 2008 | A1 |
20080311969 | Kay et al. | Dec 2008 | A1 |
20080311970 | Kay et al. | Dec 2008 | A1 |
20090010335 | Harrison et al. | Jan 2009 | A1 |
20090013253 | Laefer et al. | Jan 2009 | A1 |
20090015653 | Baek | Jan 2009 | A1 |
20090038467 | Brennan | Feb 2009 | A1 |
20090069096 | Nishimoto | Mar 2009 | A1 |
20090073117 | Tsurumi et al. | Mar 2009 | A1 |
20090075711 | Brosius et al. | Mar 2009 | A1 |
20090082078 | Schmidt et al. | Mar 2009 | A1 |
20090083281 | Sarig et al. | Mar 2009 | A1 |
20090088249 | Kay et al. | Apr 2009 | A1 |
20090098918 | Teasdale et al. | Apr 2009 | A1 |
20090100992 | Elion | Apr 2009 | A1 |
20090104956 | Kay et al. | Apr 2009 | A1 |
20090122146 | Zalewski et al. | May 2009 | A1 |
20090135135 | Tsurumi | May 2009 | A1 |
20090158220 | Zalewski et al. | Jun 2009 | A1 |
20090165632 | Rigopulos et al. | Jul 2009 | A1 |
20090177742 | Rhoads et al. | Jul 2009 | A1 |
20090186698 | Ludden | Jul 2009 | A1 |
20090188371 | Chiu et al. | Jul 2009 | A1 |
20090189775 | Lashina et al. | Jul 2009 | A1 |
20090191932 | Chiu et al. | Jul 2009 | A1 |
20090215533 | Zalewski et al. | Aug 2009 | A1 |
20090222392 | Martin et al. | Sep 2009 | A1 |
20090228544 | Demers et al. | Sep 2009 | A1 |
20090231425 | Zalewski | Sep 2009 | A1 |
20090233714 | Toro | Sep 2009 | A1 |
20090241758 | Neubacker | Oct 2009 | A1 |
20090258686 | McCauley et al. | Oct 2009 | A1 |
20090258700 | Bright et al. | Oct 2009 | A1 |
20090258703 | Brunstetter | Oct 2009 | A1 |
20090260508 | Elion | Oct 2009 | A1 |
20090265668 | Esser et al. | Oct 2009 | A1 |
20090282335 | Alexandersson | Nov 2009 | A1 |
20090300676 | Harter, Jr. | Dec 2009 | A1 |
20090310027 | Fleming | Dec 2009 | A1 |
20090317783 | Noguchi | Dec 2009 | A1 |
20090318228 | Hughes | Dec 2009 | A1 |
20100009749 | Chrzanowski, Jr. et al. | Jan 2010 | A1 |
20100009750 | Egozy et al. | Jan 2010 | A1 |
20100029386 | Pitsch et al. | Feb 2010 | A1 |
20100035688 | Picunko | Feb 2010 | A1 |
20100041477 | Kay et al. | Feb 2010 | A1 |
20100062405 | Zboray et al. | Mar 2010 | A1 |
20100064238 | Ludwig | Mar 2010 | A1 |
20100080528 | Yen et al. | Apr 2010 | A1 |
20100087240 | Egozy et al. | Apr 2010 | A1 |
20100100848 | Ananian et al. | Apr 2010 | A1 |
20100113117 | Ku et al. | May 2010 | A1 |
20100120470 | Kim et al. | May 2010 | A1 |
20100137049 | Epstein | Jun 2010 | A1 |
20100144436 | Marks et al. | Jun 2010 | A1 |
20100151948 | Vance et al. | Jun 2010 | A1 |
20100160038 | Youm et al. | Jun 2010 | A1 |
20100161432 | Kumanov et al. | Jun 2010 | A1 |
20100186579 | Schnitman | Jul 2010 | A1 |
20100192106 | Watanabe et al. | Jul 2010 | A1 |
20100209003 | Toebes et al. | Aug 2010 | A1 |
20100216598 | Nicolas et al. | Aug 2010 | A1 |
20100228740 | Cannistraro et al. | Sep 2010 | A1 |
20100245241 | Kim et al. | Sep 2010 | A1 |
20100247081 | Victoria Pons et al. | Sep 2010 | A1 |
20100255827 | Jordan et al. | Oct 2010 | A1 |
20100261146 | Kim | Oct 2010 | A1 |
20100265398 | Johnson et al. | Oct 2010 | A1 |
20100283723 | Konishi | Nov 2010 | A1 |
20100299405 | Socher et al. | Nov 2010 | A1 |
20100300264 | Foster | Dec 2010 | A1 |
20100300265 | Foster et al. | Dec 2010 | A1 |
20100300266 | Stoddard et al. | Dec 2010 | A1 |
20100300267 | Stoddard et al. | Dec 2010 | A1 |
20100300268 | Applewhite et al. | Dec 2010 | A1 |
20100300269 | Applewhite | Dec 2010 | A1 |
20100300270 | Applewhite et al. | Dec 2010 | A1 |
20100300272 | Scherf | Dec 2010 | A1 |
20100304810 | Stoddard | Dec 2010 | A1 |
20100304811 | Schmidt et al. | Dec 2010 | A1 |
20100304812 | Stoddard et al. | Dec 2010 | A1 |
20100304863 | Applewhite et al. | Dec 2010 | A1 |
20100304865 | Picunko et al. | Dec 2010 | A1 |
20100306655 | Mattingly et al. | Dec 2010 | A1 |
20110010667 | Sakai et al. | Jan 2011 | A1 |
20110021273 | Buckley et al. | Jan 2011 | A1 |
20110028214 | Bright et al. | Feb 2011 | A1 |
20110039659 | Kim et al. | Feb 2011 | A1 |
20110047471 | Lord et al. | Feb 2011 | A1 |
20110066940 | Asghari Kamrani et al. | Mar 2011 | A1 |
20110098106 | He et al. | Apr 2011 | A1 |
20110098109 | Leake et al. | Apr 2011 | A1 |
20110118621 | Chu | May 2011 | A1 |
20110140931 | Geurts et al. | Jun 2011 | A1 |
20110151975 | Mori | Jun 2011 | A1 |
20110159938 | Umeda | Jun 2011 | A1 |
20110185309 | Challinor et al. | Jul 2011 | A1 |
20110195779 | Lau | Aug 2011 | A1 |
20110197740 | Chang et al. | Aug 2011 | A1 |
20110237324 | Clavin et al. | Sep 2011 | A1 |
20110238676 | Liu et al. | Sep 2011 | A1 |
20110251840 | Cook et al. | Oct 2011 | A1 |
20110256929 | Dubrofsky et al. | Oct 2011 | A1 |
20110257771 | Bennett et al. | Oct 2011 | A1 |
20110283236 | Beaumier et al. | Nov 2011 | A1 |
20110306396 | Flury et al. | Dec 2011 | A1 |
20110306397 | Fleming et al. | Dec 2011 | A1 |
20110306398 | Boch et al. | Dec 2011 | A1 |
20110312397 | Applewhite et al. | Dec 2011 | A1 |
20110312415 | Booth et al. | Dec 2011 | A1 |
20120021833 | Boch et al. | Jan 2012 | A1 |
20120052947 | Yun | Mar 2012 | A1 |
20120063617 | Ramos | Mar 2012 | A1 |
20120069131 | Abelow | Mar 2012 | A1 |
20120094730 | Egozy | Apr 2012 | A1 |
20120108305 | Akiyama et al. | May 2012 | A1 |
20120108334 | Tarama et al. | May 2012 | A1 |
20120143358 | Adams et al. | Jun 2012 | A1 |
Number | Date | Country |
---|---|---|
468071 | Jun 2010 | AT |
200194329 | Oct 2001 | AU |
741239 | Nov 2001 | AU |
2003285918 | May 2004 | AU |
2010229693 | Nov 2011 | AU |
2587415 | May 2005 | CA |
2609587 | Dec 2005 | CA |
2720723 | Nov 2009 | CA |
2757238 | Sep 2010 | CA |
2760210 | Dec 2010 | CA |
19716937 | Mar 1998 | DE |
69804915 | Sep 2002 | DE |
69726507 | Nov 2004 | DE |
69832379 | Aug 2006 | DE |
69739885 | Jul 2010 | DE |
0903169 | Mar 1999 | EP |
919267 | Jun 1999 | EP |
972550 | Jan 2000 | EP |
974382 | Jan 2000 | EP |
974954 | Jan 2000 | EP |
978301 | Feb 2000 | EP |
982055 | Mar 2000 | EP |
992928 | Apr 2000 | EP |
992929 | Apr 2000 | EP |
993847 | Apr 2000 | EP |
0997870 | May 2000 | EP |
1003130 | May 2000 | EP |
1022672 | Jul 2000 | EP |
1029565 | Aug 2000 | EP |
1029566 | Aug 2000 | EP |
1029570 | Aug 2000 | EP |
1029571 | Aug 2000 | EP |
1031363 | Aug 2000 | EP |
1031904 | Aug 2000 | EP |
1033157 | Sep 2000 | EP |
1033158 | Sep 2000 | EP |
1043745 | Oct 2000 | EP |
1043746 | Oct 2000 | EP |
1048330 | Nov 2000 | EP |
1061501 | Dec 2000 | EP |
1064974 | Jan 2001 | EP |
1064975 | Jan 2001 | EP |
1066866 | Jan 2001 | EP |
1079368 | Feb 2001 | EP |
1 081 680 | Mar 2001 | EP |
1081679 | Mar 2001 | EP |
1082981 | Mar 2001 | EP |
1082982 | Mar 2001 | EP |
1082983 | Mar 2001 | EP |
1088573 | Apr 2001 | EP |
1 096 468 | May 2001 | EP |
1114659 | Jul 2001 | EP |
1122703 | Aug 2001 | EP |
1125607 | Aug 2001 | EP |
1125613 | Aug 2001 | EP |
1127599 | Aug 2001 | EP |
1130569 | Sep 2001 | EP |
1132889 | Sep 2001 | EP |
1134723 | Sep 2001 | EP |
1136107 | Sep 2001 | EP |
1138357 | Oct 2001 | EP |
1139293 | Oct 2001 | EP |
1145744 | Oct 2001 | EP |
1145745 | Oct 2001 | EP |
1145748 | Oct 2001 | EP |
1145749 | Oct 2001 | EP |
1150276 | Oct 2001 | EP |
1151770 | Nov 2001 | EP |
1151773 | Nov 2001 | EP |
1157723 | Nov 2001 | EP |
1159992 | Dec 2001 | EP |
1160762 | Dec 2001 | EP |
1161974 | Dec 2001 | EP |
1 174 856 | Jan 2002 | EP |
1170041 | Jan 2002 | EP |
1178427 | Feb 2002 | EP |
1184061 | Mar 2002 | EP |
1187427 | Mar 2002 | EP |
1192976 | Apr 2002 | EP |
1195721 | Apr 2002 | EP |
1197947 | Apr 2002 | EP |
1199702 | Apr 2002 | EP |
1199703 | Apr 2002 | EP |
1 201 277 | May 2002 | EP |
1206950 | May 2002 | EP |
1208885 | May 2002 | EP |
1214959 | Jun 2002 | EP |
1220539 | Jul 2002 | EP |
1228794 | Aug 2002 | EP |
1245255 | Oct 2002 | EP |
1249260 | Oct 2002 | EP |
1258274 | Nov 2002 | EP |
1264622 | Dec 2002 | EP |
1270049 | Jan 2003 | EP |
1270050 | Jan 2003 | EP |
1271294 | Jan 2003 | EP |
1279425 | Jan 2003 | EP |
1287864 | Mar 2003 | EP |
1306112 | May 2003 | EP |
1413340 | Apr 2004 | EP |
000181482-0005 | Sep 2004 | EP |
1503365 | Feb 2005 | EP |
1533010 | May 2005 | EP |
1542132 | Jun 2005 | EP |
1552864 | Jul 2005 | EP |
1552865 | Jul 2005 | EP |
1569171 | Aug 2005 | EP |
1604711 | Dec 2005 | EP |
1609513 | Dec 2005 | EP |
1630746 | Mar 2006 | EP |
1666109 | Jun 2006 | EP |
1696385 | Aug 2006 | EP |
1699017 | Sep 2006 | EP |
1731204 | Dec 2006 | EP |
1743680 | Jan 2007 | EP |
1 758 387 | Feb 2007 | EP |
1 825 896 | Aug 2007 | EP |
000859418-0008 | Feb 2008 | EP |
000890447-0040 | Apr 2008 | EP |
000890447-0046 | Apr 2008 | EP |
2000190 | Dec 2008 | EP |
2001569 | Dec 2008 | EP |
2027577 | Feb 2009 | EP |
2206539 | Jul 2010 | EP |
2206540 | Jul 2010 | EP |
2301253 | Mar 2011 | EP |
2411101 | Feb 2012 | EP |
2494432 | Sep 2012 | EP |
200705530 | Jan 2009 | FI |
20096276 | Dec 2009 | FI |
2118809 | Nov 1983 | GB |
2425730 | Nov 2006 | GB |
2465918 | Jun 2010 | GB |
2471871 | Jan 2011 | GB |
1018021 | Oct 2002 | HK |
1023734 | Feb 2006 | HK |
01685 | Jan 2012 | IT |
7185131 | Jul 1995 | JP |
3014386 | Aug 1995 | JP |
2552427 | Nov 1996 | JP |
11053563 | Feb 1999 | JP |
11128534 | May 1999 | JP |
11128535 | May 1999 | JP |
11151380 | Jun 1999 | JP |
11156054 | Jun 1999 | JP |
2922509 | Jul 1999 | JP |
11219443 | Aug 1999 | JP |
2951948 | Sep 1999 | JP |
2982147 | Nov 1999 | JP |
11313979 | Nov 1999 | JP |
3003851 | Jan 2000 | JP |
2000014931 | Jan 2000 | JP |
2000037490 | Feb 2000 | JP |
3017986 | Mar 2000 | JP |
3031676 | Apr 2000 | JP |
2000107447 | Apr 2000 | JP |
2000107458 | Apr 2000 | JP |
2000112485 | Apr 2000 | JP |
2000116938 | Apr 2000 | JP |
3053090 | Jun 2000 | JP |
2000157723 | Jun 2000 | JP |
3066528 | Jul 2000 | JP |
2000218046 | Aug 2000 | JP |
3088409 | Sep 2000 | JP |
2000237454 | Sep 2000 | JP |
2000237455 | Sep 2000 | JP |
2000245957 | Sep 2000 | JP |
2000245964 | Sep 2000 | JP |
2000245967 | Sep 2000 | JP |
2000250534 | Sep 2000 | JP |
2000288254 | Oct 2000 | JP |
2000293292 | Oct 2000 | JP |
2000293294 | Oct 2000 | JP |
2000300838 | Oct 2000 | JP |
2000300851 | Oct 2000 | JP |
2000308759 | Nov 2000 | JP |
2000317144 | Nov 2000 | JP |
2000325665 | Nov 2000 | JP |
2000350861 | Dec 2000 | JP |
2001000610 | Jan 2001 | JP |
2001009149 | Jan 2001 | JP |
2001009152 | Jan 2001 | JP |
2001009157 | Jan 2001 | JP |
2001046739 | Feb 2001 | JP |
2001062144 | Mar 2001 | JP |
2001070637 | Mar 2001 | JP |
2001070640 | Mar 2001 | JP |
2001070652 | Mar 2001 | JP |
2001075579 | Mar 2001 | JP |
2001096059 | Apr 2001 | JP |
2001096061 | Apr 2001 | JP |
2001129244 | May 2001 | JP |
2001145777 | May 2001 | JP |
2001145778 | May 2001 | JP |
3179769 | Jun 2001 | JP |
2001162049 | Jun 2001 | JP |
2001170352 | Jun 2001 | JP |
2001175254 | Jun 2001 | JP |
3187758 | Jul 2001 | JP |
2001190834 | Jul 2001 | JP |
2001190835 | Jul 2001 | JP |
2001190844 | Jul 2001 | JP |
2001198351 | Jul 2001 | JP |
2001198352 | Jul 2001 | JP |
2001198354 | Jul 2001 | JP |
3202733 | Aug 2001 | JP |
2001212369 | Aug 2001 | JP |
2001218980 | Aug 2001 | JP |
2001222280 | Aug 2001 | JP |
2001224850 | Aug 2001 | JP |
2001231904 | Aug 2001 | JP |
2001232059 | Aug 2001 | JP |
2001232062 | Aug 2001 | JP |
2001-252470 | Sep 2001 | JP |
3204652 | Sep 2001 | JP |
2001252467 | Sep 2001 | JP |
2001259224 | Sep 2001 | JP |
2001269482 | Oct 2001 | JP |
2001273517 | Oct 2001 | JP |
2001293246 | Oct 2001 | JP |
2001293254 | Oct 2001 | JP |
2001293256 | Oct 2001 | JP |
2001299975 | Oct 2001 | JP |
2001312260 | Nov 2001 | JP |
2001312740 | Nov 2001 | JP |
2001314645 | Nov 2001 | JP |
2001321565 | Nov 2001 | JP |
2001344049 | Dec 2001 | JP |
2001353374 | Dec 2001 | JP |
3245139 | Jan 2002 | JP |
2002000936 | Jan 2002 | JP |
2002018123 | Jan 2002 | JP |
2002018134 | Jan 2002 | JP |
2002028368 | Jan 2002 | JP |
3258647 | Feb 2002 | JP |
3261110 | Feb 2002 | JP |
2002045567 | Feb 2002 | JP |
2002056340 | Feb 2002 | JP |
2002066127 | Mar 2002 | JP |
2002066128 | Mar 2002 | JP |
2002084292 | Mar 2002 | JP |
3270928 | Apr 2002 | JP |
2002116752 | Apr 2002 | JP |
2002140727 | May 2002 | JP |
2002143567 | May 2002 | JP |
2002153673 | May 2002 | JP |
3306021 | Jul 2002 | JP |
2002204426 | Jul 2002 | JP |
3310257 | Aug 2002 | JP |
3317686 | Aug 2002 | JP |
3317956 | Aug 2002 | JP |
2002224435 | Aug 2002 | JP |
2002239223 | Aug 2002 | JP |
2002239233 | Aug 2002 | JP |
3320700 | Sep 2002 | JP |
3321111 | Sep 2002 | JP |
2002263229 | Sep 2002 | JP |
3333773 | Oct 2002 | JP |
3338005 | Oct 2002 | JP |
2002282417 | Oct 2002 | JP |
2002282418 | Oct 2002 | JP |
2002292123 | Oct 2002 | JP |
2002292139 | Oct 2002 | JP |
2002301263 | Oct 2002 | JP |
3345591 | Nov 2002 | JP |
3345719 | Nov 2002 | JP |
2002325975 | Nov 2002 | JP |
3351780 | Dec 2002 | JP |
2002360937 | Dec 2002 | JP |
3361084 | Jan 2003 | JP |
3370313 | Jan 2003 | JP |
3371132 | Jan 2003 | JP |
2003000951 | Jan 2003 | JP |
2003010541 | Jan 2003 | JP |
2003010542 | Jan 2003 | JP |
2003019346 | Jan 2003 | JP |
2003030686 | Jan 2003 | JP |
2003058317 | Feb 2003 | JP |
3392833 | Mar 2003 | JP |
2003117233 | Apr 2003 | JP |
2003126548 | May 2003 | JP |
3417555 | Jun 2003 | JP |
3417918 | Jun 2003 | JP |
3420221 | Jun 2003 | JP |
2003175279 | Jun 2003 | JP |
3425548 | Jul 2003 | JP |
3425552 | Jul 2003 | JP |
3433918 | Aug 2003 | JP |
3439187 | Aug 2003 | JP |
2003236244 | Aug 2003 | JP |
3442730 | Sep 2003 | JP |
3448043 | Sep 2003 | JP |
2003256552 | Sep 2003 | JP |
3458090 | Oct 2003 | JP |
3470119 | Nov 2003 | JP |
2003334387 | Nov 2003 | JP |
3491759 | Jan 2004 | JP |
2004016315 | Jan 2004 | JP |
2004016388 | Jan 2004 | JP |
3496874 | Feb 2004 | JP |
3500379 | Feb 2004 | JP |
3500383 | Feb 2004 | JP |
2004033266 | Feb 2004 | JP |
2004097610 | Apr 2004 | JP |
2004105309 | Apr 2004 | JP |
2004121397 | Apr 2004 | JP |
3526302 | May 2004 | JP |
2004141261 | May 2004 | JP |
3534345 | Jun 2004 | JP |
2004164519 | Jun 2004 | JP |
2004166994 | Jun 2004 | JP |
3545755 | Jul 2004 | JP |
3545983 | Jul 2004 | JP |
3546206 | Jul 2004 | JP |
3547374 | Jul 2004 | JP |
2004192069 | Jul 2004 | JP |
2004201937 | Jul 2004 | JP |
3561456 | Sep 2004 | JP |
3566195 | Sep 2004 | JP |
3573288 | Oct 2004 | JP |
3576994 | Oct 2004 | JP |
3582716 | Oct 2004 | JP |
2004283249 | Oct 2004 | JP |
2004298469 | Oct 2004 | JP |
2004321245 | Nov 2004 | JP |
3597465 | Dec 2004 | JP |
2004337256 | Dec 2004 | JP |
3611807 | Jan 2005 | JP |
2005046445 | Feb 2005 | JP |
2005049913 | Feb 2005 | JP |
3626711 | Mar 2005 | JP |
3634273 | Mar 2005 | JP |
2005095440 | Apr 2005 | JP |
3656118 | Jun 2005 | JP |
3686906 | Aug 2005 | JP |
3699660 | Sep 2005 | JP |
2005261586 | Sep 2005 | JP |
3702269 | Oct 2005 | JP |
2005287830 | Oct 2005 | JP |
2005301578 | Oct 2005 | JP |
3715513 | Nov 2005 | JP |
2005319025 | Nov 2005 | JP |
3727275 | Dec 2005 | JP |
2006020758 | Jan 2006 | JP |
3753425 | Mar 2006 | JP |
2006075264 | Mar 2006 | JP |
2006116046 | May 2006 | JP |
2006116047 | May 2006 | JP |
2006192157 | Jul 2006 | JP |
3804939 | Aug 2006 | JP |
3816931 | Aug 2006 | JP |
3822887 | Sep 2006 | JP |
3831695 | Oct 2006 | JP |
3869175 | Jan 2007 | JP |
2007029589 | Feb 2007 | JP |
3890445 | Mar 2007 | JP |
2007504901 | Mar 2007 | JP |
2008018287 | Jan 2008 | JP |
2008168143 | Jul 2008 | JP |
2009531153 | Sep 2009 | JP |
2010509000 | Mar 2010 | JP |
200100287533 | Apr 2001 | KR |
20050047024 | May 2005 | KR |
2010146213 | May 2012 | RU |
173496 | Sep 2011 | SG |
340049 | Mar 2009 | TW |
200951764 | Dec 2009 | TW |
201006526 | Feb 2010 | TW |
322023 | Mar 2010 | TW |
201116318 | May 2011 | TW |
WO-9717598 | May 1997 | WO |
WO-9938588 | Aug 1999 | WO |
WO-0163592 | Aug 2001 | WO |
WO-0230535 | Apr 2002 | WO |
WO-2004002590 | Jan 2004 | WO |
WO-2004002594 | Jan 2004 | WO |
WO-2004024256 | Mar 2004 | WO |
WO-2004024263 | Mar 2004 | WO |
WO-2004027631 | Apr 2004 | WO |
WO-2004030779 | Apr 2004 | WO |
WO-2004039055 | May 2004 | WO |
WO-2004052483 | Jun 2004 | WO |
WO-2004053800 | Jun 2004 | WO |
WO-2004082786 | Sep 2004 | WO |
WO-2004087272 | Oct 2004 | WO |
WO-2004101093 | Nov 2004 | WO |
WO-2004107270 | Dec 2004 | WO |
WO-2005027062 | Mar 2005 | WO |
WO-2005027063 | Mar 2005 | WO |
WO-2005030354 | Apr 2005 | WO |
WO-2005099842 | Oct 2005 | WO |
WO-2005107902 | Nov 2005 | WO |
WO-2005113096 | Dec 2005 | WO |
WO-2005114648 | Dec 2005 | WO |
WO-2006006274 | Jan 2006 | WO |
WO-2006075494 | Jul 2006 | WO |
WO-2007055522 | May 2007 | WO |
WO-2007070738 | Jun 2007 | WO |
WO-2007078639 | Jul 2007 | WO |
WO-2007115299 | Oct 2007 | WO |
WO-2007111247 | Oct 2007 | WO |
WO-2007130582 | Nov 2007 | WO |
WO-2008001088 | Jan 2008 | WO |
WO-2008145952 | Dec 2008 | WO |
WO-2009021124 | Feb 2009 | WO |
WO-2010018485 | Feb 2010 | WO |
WO-2010036989 | Apr 2010 | WO |
WO-2011067469 | Jun 2011 | WO |
WO-2011155958 | Dec 2011 | WO |
Entry |
---|
“Karaoke Revolution,” In Wikipedia Online Encyclopedia. Wikipedia, Retrieved from the Internet: <URL: http://en.wikipedia.org/wiki/Karaoke_Revolution>, 5 pages (retrieved on Aug. 3, 2010). |
“Lips,” In Wikipedia Online Encyclopedia. Wikipedia, Retrieved from the Internet: <URL: http://en.wikipedia.org/wiki/Lips_(video_game)>, 4 pages (retrieved on Aug. 3, 2010). |
“SingStar,” In Wikipedia Online Encyclopedia. Wikipedia, Retrieved from the Internet: <URL: http://en.wikipedia.org/wiki/SingStar>, 10 pages (retrieved on Aug. 3, 2010). |
TablEdit Tablature Editor, software having music data display and synthesis functions, Retrieved from the Internet: <URL: http://www.tabledit.com/index.shtml>, 2 pages (retrieved on Mar. 5, 2010). |
Non-Final Office Action for U.S. Appl. No. 12/474,800. Mailing date: Apr. 14, 2010, 47 pages. |
Response to Non-Final Office Action for U.S. Appl. No. 12/474,800, dated Oct. 13, 2010, 14 pages. |
Non-Final Office Action for U.S. Appl. No. 12/474,751. Mailing date: Mar. 12, 2010, 16 pages. |
Response to Non-Final Office Action for U.S. Appl. No. 12/474,751, dated Sep. 10, 2010, 4 pages. |
Non-Final Office Action for U.S. Appl. No. 12/474,948. Mailing date: Jul. 8, 2010, 14 pages. |
Non-Final Office Action for U.S. Appl. No. 12/474,899. Mailing date: Jan. 19, 2011, 7 pages. |
International Search Report issued for PCT/US2010/054300, dated May 31, 2011 (5 pages). |
Kuwayama, Y. Trademarks & Symbols, vol. 2: Symbolical Designs, Van Nostrand Reinhold Company, (Nov. 4, 1980). 4 pages. |
Microsoft Office Online Clip Art, http://office.microsoft.com/en-us/clipart/results.aspx?Scope=MC,MM,MP,MS&PoleAssetID=MCJ04316180000&Query=Icons&CTT=6&Origin=EC01017435m (Feb. 21, 2007) (1 page). |
Microsoft PowerPoint Handbook, (1 page) (1992). |
Thalmann, “L'animation par ordinateur” http://web.archive.org/web/20060421045510/http://vrlab.epfl.ch/{thalmann/CG/infogr.4.pdf, Apr. 21, 2006 (52 pages). |
“BVH File Specification”, Character Studio, http://web.archive.org/web/20060321075406/http://character-studio.net/bvh_file_specification.htm, Mar. 21, 2006 (16 pages). |
Amplitude for Playstation. Retrieved from the Internet: www.target.com/gp/detail.html/601-0682676-9911341?asin=B0000859TM&AFID. Retrieved on Feb. 22, 2005. 1 page. |
Amplitude Review by Ryan Davis. Retrieved from the Internet: www.gamespot.com/ps2/puzzle/amplitude/printable_6023980.html. Retrieved on Jun. 11, 2012. 10 pages. |
Amplitude. Retrieved from the Internet: www.gamesquestdirect.com/71171972582.html. Retrieved on Jun. 8, 2012. 4 pages. |
Amplitude: Sony's Rocking Rhythm Game Outdoes Itself on All Fronts by Douglass C. Perry. Retrieved from the Internet: http://ps2.ign.com/articles/390/390620p1.html. Retrieved on Jun. 8, 2012. 6 pages. |
Association of British Scrabble Players. “Rolling System” ABSP, http://www.absp.org.uk/results/ratings_detail.shtml. Retrieved May 25, 2011 (4 pages). |
Beat Planet Music (Import) Review by Christian Nutt. Retrieved from the Internet: www.gamespot.com/ps/action/beatplanetmusic/printable_2546762.html. Retrieved on Jun. 11, 2012. 3 pages. |
Beatmania IIDX 9 Style. Retrieved from the Internet: www.play-asia.com/paOS-13-71-8-iu.html. Retrieved on Feb. 22, 2005. 2 pages. |
Beatmania PlayStation Review from www.GamePro.com/sony/psx/games/reviews/89.shtml. Retrieved on Feb. 22, 2005. 1 page. |
Beatmania Review. Retrieved from the Internet: www.gamesarefun.com/gamesdb/review.php?reviewid=294. Retrieved on Jun. 11, 2012. 1 page. |
Beatmania IIDX 7 Style. Retrieved from the Internet: www.lik-sang.com/info.php?category=27&products_id=4061. Retrieved on Feb. 22, 2005. 1 page. |
Bishop, Sam; “Frequency: If you decide to pick up this game, you better give up on the idea of getting a full night of sleep.” via www.ign.com [online], Nov. 26, 2001 [retrieved on Mar. 1, 2006]. Retrieved from the Internet <URL: http://ps2.ign.com/articles/166/166450p1.html>. Retrieved on Jun. 8, 2012. 8 pages. |
Bust A Groove Review by Jeff Gerstmann. Retrieved from the Internet: www.gamespot.com/ps/puzzle/bustagroove/printable_2546923.html. Retrieved on Jun. 11, 2012. 9 pages. |
Bust A Groove. Retrieved from the Internet: www.buyritegames.com/product_information.asp?rc=frgl&number=PS-BUSTA2. Retrieved on Feb. 22, 2005. 1 page. |
Bust A Groove. Retrieved from the Internet: www.estarland.com/index.asp?page=Playstation&cat=F&product=6257&q. Retrieved on Jun. 11, 2012. 2 pages. |
Bust A Groove: 989 Studios Best Game of the Year is a Funky Dance Sim that's Got the Fever by Doug Perry. Retrieved from the Internet: http://psx.ign.com/articles/152/152308p1.html. Retrieved on Jun. 8, 2012. 5 pages. |
Dance Dance Revolution Review by Andy Chien. Retrieved from the Internet: www.gamingage.com/reviews/archive/old_reviews/psx/ddr. Retrieved on Feb. 22, 2005. 3 pages. |
Dance Dance Revolution Review by Ryan Davis. Retrieved from the Internet: www.gamespot.com/ps/puzzle/dancedancerevolution/printable_2699724.html. Retrieved on Jun. 11, 2012. 9 pages. |
Dance Dance Revolution, Konami via www.ign.com [online], Apr. 4, 2001 [retrieved on Mar. 1, 2006]. Retrieved from the Internet <URL: http://psx.ign.com/articles/161/161525p1.html>. Retrieved on Jun. 14, 2012. 7 pages. |
Dance Dance Revolution. Retrieved from the Internet: www.ebgames.com/ebx/product/224789.asp. Retrieved on Feb. 22, 2005. 2 pages. |
Dancing with the Stars Game Manual (1 page). |
Dave H, et al. StepMania Tutorial. Nov. 3, 2004. <http://web.archive.org/web/200411031145/www.stepmania.com/stepmania/wiki.php?pagename=Tutorial>. Retrieved on Jun. 19, 2012. 7 pages. |
Def Jam Vendetta Review by Alex Navarro. Retrieved from the Internet: www.gamespot.com/ps2/action/defjamvendetta/printable_6024297.html. Retrieved on Jun. 11, 2012. 10 pages. |
Def Jam Vendetta. Retrieved from the Internet: www.ebgames.com/ebx/product/232378.asp. Retrieved on Feb. 22, 2005. 2 pages. |
Def Jam Vendetta: Rapper's Delight or Fight-Night Fright? Smash Sumthin' and Find Out by Jon Robinson. Mar. 31, 2003. Retrieved from the Internet: http://ps2.ign.com/articles/391/391713p1.html. Retrieved on Jun. 8, 2012. 6 pages. |
Digital Play: Reloaded. Opening Reception. Museum of the Moving Image. Mar. 19, 2005. <http://web.archive.org/web/20050319060247/http://www.movingimage.us/site/screenings/content/2005/digital_play_reloaded.html>. 1 page. |
Donkey Konga Review by Ryan Davis. Retrieved from the Internet: www.gamespot.com/gamecube/puzzle/donkeykonga/printable_6108977.html. Retrieved on Jun. 11, 2012. 11 pages. |
Donkey Konga. Retrieved from the Internet: www.ebgames.com/ebx/product/244024.asp. Retrieved on Jun. 11, 2012. 2 pages. |
Donkey Konga: Could a Game Featuring Donkey Kong and Mini-Bongos Ever Fail? Our Full Review by Juan Castro. Retrieved from the Internet: cube.ign.com/articles/550/550723p1.html. Retrieved on Jun. 8, 2012. 6 pages. |
DrumMania w/ Drum Set. Retrieved from the Internet: www.estarland.com/index.asp?page=Playstation2&cat=RD&product=181268&q. Retrieved on Jun. 11, 2012. 2 pages. |
Frequency PS2. Retrieved from the Internet: www.walmart.com/catalog/product.gsp?dest=9999999997&product_id=1635738&s. Retrieved on Feb. 22, 2005. 2 pages. |
Frequency Review by Ryan Davis. Retrieved from the Internet: www.gamespot.com/ps2/puzzle/frequency/printable_2827476.html. Retrieved on Jun. 19, 2012. 9 pages. |
Get on Da Mic Overview by Matt Gonzales. Retrieved from the Internet: www.gamechronicles.com/reviews/ps2/getondamic/body.htm. Retrieved on Jun. 11, 2012. 3 pages. |
Get on Da Mic Review by Jeff Gerstmann. Retrieved from the Internet: www.gamespot.com/ps2/puzzle/getondamic/printable_6110242.html. Retrieved on Jun. 11, 2012. 10 pages. |
Get on Da Mic. Retrieved from the Internet: www.ebgames.com/ebx/product/245102.asp. Retrieved on Jun. 11, 2012. 2 pages. |
Gitaroo Man. Retrieved from the Internet: www.estarland.com/index.asp?page=Playstation2&cat=PZ&product=676&q. Retrieved on Jun. 14, 2012. 2 pages. |
Gitaroo-Man Review by David Smith. Retrieved from the Internet: http://ps2.ign.com/articles/354/354413p1.html. Retrieved on Jun. 11, 2012. 4 pages. |
Gitaroo-Man Review by Ryan Davis. Retrieved from the Internet: www.gamespot.com/ps2/puzzle/gitarooman/printable_2847915.html. Retrieved on Jun. 19, 2012. 9 pages. |
Gitaroo-Man. Retrieved from the Internet: www.buyritegames.com/product_information.asp?rc=frgl&number=PS2-GITARO. Retrieved on Feb. 22, 2005. 1 page. |
Guitar Freaks (Import) Review by Sam Kennedy. Retrieved from the Internet: www.gamespot.com/ps/action/guitarfreaks/printable_2545966.html. Retrieved on Jun. 11, 2012. 10 pages. |
Guitar Freaks Review by Wade Monnig. Retrieved from the Internet: www.gamesarefun.com/gamesdb/review.php?reviewid=301. Retrieved on Jun. 11, 2012. 3 pages. |
Guitar Freaks Sony. Retrieved from the Internet: www.gameexpress.com/product_detail.cfm?UPC=SCPS45422. Retrieved on Feb. 22, 2005. 1 page. |
Guitar Freaks with Guitar. Retrieved from the Internet: www.buyritegames.com/product_information.asp?rc=frgl&number=PSJ-GUilWG. Retrieved on Feb. 22, 2005. 1 page. |
Guitar Hero (video game)—Wikipedia, the free encyclopedia—(Publisher—RedOctane) Release Date Nov. 2005. 25 pages. |
Guitar Hero—Wikipedia, the free encyclopedia—Nov. 2005. http://en.wikipedia.org/w/index.php?title=Guitar_Hero&oldid=137778068. Retrieved on May 22, 2012. 5 pages. |
GuitarFreaks—Wikipedia, the free encyclopedia—(Publisher—Konami, Konami Digital Entertainment) Release Date 1998. Accessed on Mar. 19, 2009. 5 pages. |
International Search Report, PCT/US2006/062287, Mailed on May 10, 2007. 2 pages. |
Ipodgames.com Tips. Dec. 4, 2004. <http://web.archive.org/web/20041204032612/www.ipodgames.com/tips.html>. 1 page. |
Karaoke Revolution Review by Jeff Gerstmann. Retrieved from the Internet: www.gamespot.com/ps2/puzzle/karaokerevolution/printable_6081709.html. Retrieved on Jun. 14, 2012. 10 pages. |
Karaoke Revolution. Retrieved from the Internet: www.ebgames.com/ebx/product/24806.asp. Retrieved on Feb. 22, 2005. 2 pages. |
DrumMania (Import) Review by Jeff Gerstmann. Retrieved from the Internet: www.gamespot.com/ps2/action/drummania/printable_2546356.html. Retrieved on Jun. 11, 2012. 9 pages. |
DrumMania OST. Retrieved from the Internet: www.lik-sang.com/info.php?category=264&products_id=4793. Retrieved on Feb. 22, 2005. 2 pages. |
DrumMania Review by Wynfwad. Retrieved from the Internet: www.gamefaqs.com/console/ps2/review/R56573.html. Retrieved on Jun. 11, 2012. 2 pages. |
ESRB Game Ratings: Game Rating & Descriptor Guide via www.esrb.org [online], Retrieved from the Internet: <URL: http://www.esrb.org/esrbratings_guide.asp#symbols>. Retrieved on Jun. 14, 2012. 3 pages. |
Eye Toy Groove with Camera (Playstation 2). Retrieved from the Internet: www.jr.com/JRProductPage.process?Product Code=PS2+97400&JRSource=google. Retrieved on Feb. 22, 2005. 1 page. |
Eye Toy Groove with Eye Toy Camera PS2. Retrieved from the Internet: www.walmart.com/catalog/product.gsp?dest=9999999997&product_id=2607013&s. Retrieved on Feb. 22, 2005. 1 page. |
Eye Toy: Groove—The Little Camera That Could Comes Back with a Few New Tricks by Ed Lewis. Retrieved from the Internet: http://ps2.ign.com/articles/507/507854p1.html. Retrieved on Jun. 8, 2012. 8 pages. |
Eye Toy: Groove Review by Ryan Davis. Retrieved from the Internet: www.gamespot.com/ps2/puzzle/eyetoygroove/printable_6094754.html. Retrieved on Jun. 11, 2012. 10 pages. |
Frequency—Pre-Played. Retrieved from the Internet www.ebgames.com/ebx/product/203370.asp. Retrieved on Feb. 22, 2005. 2 pages. |
Frequency PS2 Review from GamePro.com, written by Dan Electro on Nov. 26, 2001. Retrieved from the Internet: www.gamepro.com/sony/ps2/games/reviews/18464.shtml. Retrieved on Jun. 11, 2012. 2 pages. |
Karaoke Revolution: The Revolution will be Televised by Ed Lewis. Retrieved from the Internet: http://ps2.ign.com/articles/458/458064p1.html. Retrieved on Jun. 11, 2012. 7 pages. |
Lohman, “Rockstar vs. Guitar Hero,” (The Rebel Yell). Nov. 13, 2008, accessed on Mar. 19, 2009. 5 pages. |
Mad Maestro!—Pre-Played. Retrieved from the Internet: www.ebgames.com/ebx/product/217604.asp. Retrieved on Feb. 22, 2005. 2 pages. |
Mad Maestro! by Ryan Davis. Retrieved from the Internet: www.gamespot.com/ps2/puzzle/madmaestro/printable_2856821.html. Retrieved on Jun. 19, 2012. 9 pages. |
Mad Maestro: The First Orchestra-conducting Sim on US Soil—Is It All It Could Have Been? by David Smith. Retrieved from the Internet: http://ps2.ign.com/articles/355/355561p1.html. Retrieved on Jun. 11, 2012. 6 pages. |
Mojib Ribbon Playtest by Anoop Gantayat. Retrieved from the Internet: http://ps2.ign.com/articles/442/442204p1.html. Retrieved on Jun. 11, 2012. 4 pages. |
Mojib Ribbon—Review. Retrieved from the Internet: www.ntsc-uk.com/review.php?platform=ps2&game=MojibRibbon. Retrieved on Jun. 14, 2012. 2 pages. |
Mojib Ribbon. Retrieved from the Internet: www.lik-sang.com/info.php?category=27&products_id=3805&PHPSESSID=b9eQca. Retrieved on Feb. 22, 2005. 1 page. |
Mojib Ribbon. Retrieved from the Internet: www.ncsxshop.com/cgi-bin/shop/SCPS-11033.html. Retrieved on Jun. 14, 2012. 2 pages. |
NCSX.com; Game Synopsis of Guitar Freaks and DrumMania Masterpiece Gold, with a date of Mar. 8, 2007, and with an Archive.org Wayback Machine Verified date of May 17, 2007, downloaded from http://web.archive.org/web/20070517210234/http://www.ncsx.com/2007/030507/guitarfreaks_gold.htm (4 pages). |
PaRappa the Rapper 2. Retrieved from the Internet: www.amazon.com/exec/obidos/tg/detail/-/B00005UNWD/104-4695527-8827110. Retrieved on Feb. 22, 2005. 2 pages. |
PaRappa the Rapper Review by Jer Horwitz. Retrieved from the Internet: www.gamespot.com/ps/puzzle/parappatherapper/printable_2548866.html. Retrieved on Jun. 14, 2012. 9 pages. |
Parappa the Rapper. Retrieved from the Internet: www.estarland.com/index.asp?page=Playstation&cat=F&product=6871&q. Retrieved on Jun. 11, 2012. 2 pages. |
Parappa the Rapper: PaRappa the Rapper is finally here, but does it live up to the hype? by Adam Douglas. Retrieved from the Internet: http://psx.ign.com/articles/150/150490p1.html. Retrieved on Jun. 11, 2012. 2 pages. |
PopCap Games Site Review via www.download-free-games.com [retrieved on Mar. 2, 2006]. Retrieved from the Internet <URL: http://www.download-free-games.com/reviews/popcap_games.htm>. 2 pages. |
Ramsey, A. Guitar Freaks and Drum Mania Masterpiece Gold FAQ v. 1.04, Apr. 2, 2007, downloaded from http://www.gamefaqs.com/console/ps2/file/937670/47326. 3 pages. |
RedOctane. “Guitar Hero 2 Manual” Activision Publishing, Inc. (2006) (13 pages). |
Rez PlayStation 2. Retrieved from the Internet: http://global.yesasia.com/en/PrdDept.aspx/pid-1002847668. Retrieved on Jun. 14, 2012. 1 page. |
Rez Review by Jeff Gerstmann. Retrieved from the Internet: www.gamespot.com/ps2/action/rez/printable_2838815.html. Retrieved on Jun. 11, 2012. 9 pages. |
Rez. Retrieved from the Internet: www.estarland.com/index.asp?page=Playstation2&cat=RD&product=5426&q. Retrieved on Jun. 14, 2012. 2 pages. |
Rez: You May Not Understand This Review. We May Not Either. But you should certainly play this game by David Smith. Retrieved from the Internet: http://ps2.ign.com/articles/166/166546p1.html. Retrieved on Jun. 11, 2012. 3 pages. |
SingStar Party (SingStar2) Bundle. Retrieved from the Internet: www.gameswarehouse.com.au/longpage.asp?gameid=10329. Retrieved on Feb. 22, 2005. 2 pages. |
SingStar Party. Retrieved from the Internet: www.argos.co.uk/webapp/wcs/stores/servlet/ProductDisplay?storeId=10001&langId. Retrieved on Feb. 22, 2005. 1 page. |
SingStar Review (PS2) by James Hamer-Morton. Retrieved from the Internet: http://ps2.twomtown.net/en_uk/articles/art.print.php?id=5279. Retrieved on Jun. 11, 2012. 5 pages. |
SingStar Review by Luke Van Leuveren. Retrieved from the Internet: http://palgn.com.au/article.php?id=1282. Retrieved on Jun. 11, 2012. 5 pages. |
Space Channel 5 Special Edition Review by Brad Shoemaker. Retrieved from the Internet: www.gamespot.com/ps2/puzzle/spacechannel5part2/printable_6085137.html. Retrieved on Jun. 11, 2012. 10 pages. |
Space Channel 5. Retrieved from the Internet: www.lik-sang.com/info.php?products_id=2050&likref=fro_gle4. Retrieved on Feb. 22, 2005. 1 page. |
Space Channel 5: Description. Retrieved from the Internet: www.buyritegames.com/product_information.asp?rc=frgl&number=DC-SPACEC5. Retrieved on Feb. 22, 2005. 1 page. |
Space Channel 5: Special Edition by Jason Thompson. Retrieved from the Internet: www.popmatters.com/multimedia/reviews/s/space-channel-5.shtml. Retrieved on Jun. 8, 2012. 2 pages. |
Taiko Drum Master Review by Justin Calvert. Retrieved from the Internet: www.gamespot.com/ps2/puzzle/taikodrummaster/printable_6111767.html. Retrieved on Jun. 14, 2012. 10 pages. |
Taiko Drum Master w/ Drum. Retrieved from the Internet: www.ebgames.com/ebx/product/244015.asp. Retrieved on Jun. 14, 2012. 2 pages. |
Taiko no Tatsujin. Retrieved from the Internet: http://games.channel.aol.com/review.adp?gameID=7569. Retrieved on Feb. 22, 2005. 3 pages. |
Vib Ribbon (PSX): Homepage, Screenshots by James Anthony. http://www.vib-ribbon.com/vibhtml/english/index.html. Retrieved on Jun. 14, 2012. 1 page. |
Vib-Ribbon (Import) Review by Jeff Gerstmann. Retrieved from the Internet: www.gamespot.com/ps/puzzle/vibribbon/printable_2550100.html. Retrieved on Jun. 14, 2012. 9 pages. |
Vib-Ribbon. Retrieved from the Internet: www.ncsxshop.com/cgi-bin/shop/SCPS-45469.html. Retrieved on Feb. 22, 2005. 1 page. |
Video MJ the Experience Kinect: announce—Oct. 2010: (http://www.youtube.com/watch?v=xLbiPicu0MM). |
Video “Dance Online-Dance lessons gameplay” <http://www.youtube.com/watch?v=B4phOjfVNLk>, Jul. 27, 2010. |
Video “E3 2010 Live Demo”, where Ubi talked about MJ:TE for Kinect (<http://www.gametrailers.com/video/e3-2010-michael-jackson/101449>). |
Video MJ the Experience Kinect: release Apr. 2011, http://www.youtube.com/watch?v=N7oyxHIP48A. |
Video 'Don't Stop' Gameplay Trailer: <http://www.gametrailers.com/video/dont-stop-michael-jackson/707336> (Nov. 10, 2010). |
Video ‘Ghosts’ Gameplay Trailer: <http://www.gametrailers.com/video/ghosts-gameplay-michael-jackson/706825> (Oct. 27, 2010). |
Video <http://www.bing.com/videos/search?q=dance+instruction+game&mid=E69356CFA1B6719FF5C8E69356CFA1B6719FF5C8&view=detail&FORM=VIRE5> (uploaded Jul. 27, 2010). |
Video Alvin and the Chipmunks Chipwrecked—Majesco Sales: release—Nov. 2011 (http://www.youtube.com/watch?v=xKeW3CUt14A&feature=related). |
Video Dream Dance & Cheer (Released Sep. 13, 2009 for Wii) <http://www.youtube.com/watch?v=oi9vQjT1x5Q>. |
Video Just Dance—Ubisoft Paris; <http://www.youtube.com/watch?v=t7f22xQCEpY>; (Nov. 17, 2009). |
Video Just Dance 2—Ubisoft; <http://www.youtube.com/watch?v=kpaW9sM—M2Q> (Oct. 12, 2010). |
Video Just Dance 2: Oct. 2010 (http://youtu.be/2ChliUgqLtw). |
Video Just Dance: Nov. 2009 (http://youtu.be/rgBo-JnwYBw). |
Video Kidz Bop Dance Party! The Video Game (Released Sep. 14, 2010 on Wii) <http://www.youtube.com/watch?v=I8VD9EvFdeM>. |
Video Let's Cheer—Take 2 Interactive: release—Nov. 2011; announce—Jun. 2011 (http://www.youtube.com/watch?v=uv1IMBIw2Dw&feature=related). |
Video Michael Jackson: The Experience—Ubisoft, Ubisoft Paris, Ubisoft Montpelier; <http://www.youtube.com/watch?v=AS3-SuYhSBk>. |
Video MJ—Paris Week game demo—Oct. 29, 2010 <http://www.dailymotion.com/video/xfg4oe_pgw-10-michael-jackson-experience-kinect_videogames?start=13>. |
Video MJ the Experience Wii: Nov. 2010 (http://www.youtube.com/watch?v=8ZA59JY8Y—w). |
Video MJ:TE Kinect from Aug. 19, 2010 at <http://www.youtube.com/watch?v=6AiGmSnN6gQ>; Michael Jackson The Experience Video Game—Kinect for Xbox 360—Gamescom 2010 HD. |
Video MJ:TE on Wii (Nov. 2010); <http://www.youtube.com/watch?v=gmIMNGWxgvo>. |
Video N.Y. Comic Con '10—Billie Jean Gameplay Demo: <http://www.gametrailers.com/video/nycc-10-michael-jackson/706056> (Oct. 13, 2010). |
Video Tokyo Game Show '10—'Every Step' Trailer: http://www.gametrailers.com/video/tgs-10-michael-jackson/704548 (Sep. 15, 2010). |
Video Victorious: Time to Shine—D3 publishers: Nov. 2011 announce: Sep. 2011 (http://www.youtube.com/watch?v=ud69OK02KGg&feature=fvst). |
Video We Cheer 2 (Released Nov. 3, 2009 for Wii) <http://www.youtube.com/watch?v=-4oalxqnbll>. |
Video Britney's Dance Beat (Released May 8, 2002 for PS2); <http://www.youtube.com/watch?v=-KR1dRGNXw>. |
Video Dance Dance Revolution: Hottest Party 3 (Released Sep. 27, 2009 for Wii) <http://www.youtube.com/watch?v=zk20hEzGmUY>. |
Video Dance on Broadway—Ubisoft, Longtail Studios; <http://www.youtube.com/watch?v='eYaPdT4z-M>; (Jun. 6, 2010). |
Video Dance on Broadway: Jun. 2010 (<http://youtu.be/Wi9Y5HHcytY>). |
Video Dance Summit 2001: Bust a Groove (Released Nov. 2, 2000 for PS2); <http://www.youtube.com/watch?v=E8NjTGHYQcM>. |
Video Dancing With the Stars—Activision, Zoe Mode; <http://www.youtube.com/watch?v=C7zBVfEJO˜:gp (Oct. 2007). |
Video Dancing with the Stars: Oct. 2007 (http://www.youtube.com/watch?v=8UChG2v5DI). |
Video Dancing with the Stars: We Dance—Activision, Zoe Mode; <http://www.youtube.com/watch?v=31GOb-CT8vs> (Oct. 2008). |
Video DDR Hottest Party; <http://www.youtube.com/watch?v=zk20hEzGmUY> (Sep. 2007). |
Video E3 2010 Live Demo <http://www.gametrailers.com/video/e3-2010-michael-jackson/101449>; (Jun. 14, 2010). |
Video Eyetoy Groove for PS2; <http://www.youtube.com/watch?v=c80aa0U_fuE> (Apr. 2004). |
Video Gamescom '10—Billie Jean Demo <http://www.gametrailers.com/video/gc-10-michael-jackson/703294>; (1:58-1:13) (Aug. 20, 2010). |
Video Gamescom '10—Working Day and Night Demo <http://www.gametrailers.com/video/gc-10-michael-jackson/703295> (Aug. 20, 2010). |
Video Grease Dance—505 Games: release—Oct. 2011 (http://www.youtube.com/watch?v=PaGBHSB2urg). |
Video Hannah Montana: Spotlight World Tour (Released Nov. 6, 2007 for Wii); <http://www.youtube.com/watch?v=WtyuU2NaL3Q>. |
Virginia Tech Multimedia Music Dictionary: “P: Phrase” Virginia Tech University, http://www.music.vt.edu/musicdictionary/textp/Phrase.html. Retrieved May 25, 2011 (7 pages). |
Taiko Drum Master Game Manual, Namco Ltd. For PlayStation 2 (Oct. 25, 2004, 18 pages). |
U.S. Appl. No. 29/393,964, filed Jun. 10, 2011. 2 pages. |
U.S. Appl. No. 29/393,967, filed Jun. 10, 2011. 2 pages. |
U.S. Appl. No. 29/393,968, filed Jun. 10, 2011. 2 pages. |
U.S. Appl. No. 29/393,970, filed Jun. 10, 2011. 2 pages. |
U.S. Appl. No. 29/393,973, filed Jun. 10, 2011. 2 pages. |
U.S. Appl. No. 29/393,975, filed Jun. 10, 2011. 2 pages. |
Number | Date | Country |
---|---|---|
20100304863 A1 | Dec 2010 | US |