MUSIC GAME SYSTEM, COMPUTER PROGRAM OF SAME, AND METHOD OF GENERATING SOUND EFFECT DATA

Abstract
A music game system (1) is provided with a sound input device (9) which inputs sound, a speaker (8) which outputs game sound, and an external storage device (20) which stores sound effect data (27) for causing the speaker to output sound effects of different musical intervals and sequence data (29) in which a relationship between a player's operation and the sound effect to be output correspondingly is described. The music game system determines the musical interval representing an input sound based on sound data of the sound input by the sound input device (9), generates, based on the musical interval determination result, multiple tone data having musical intervals different from that of the sound data so as to form a musical scale, and stores a set of the multiple tone data as at least a part of the sound effect data (27).
Description
TECHNICAL FIELD

The present invention relates to a music game system and the like in which a sound input by a player is reflected in game contents.


BACKGROUND ART

Music game machines in which game content changes based on a sound input by a player are well known. For example, music game machines that reflect an input sound in the behavior of characters (refer to Patent Literature 1) and music game machines that capture and score a player's singing so that players can compete (refer to Patent Literature 2) are known. Patent Literature 1: JP-A-2002-136764 and Patent Literature 2: JP-A-H10-268876.


SUMMARY OF INVENTION
Technical Problem

All of the above game machines change game content by capturing a player's voice: after a musical interval of the player's voice is detected, processing is performed so that the behavior of characters is changed based on a result of comparison with a reference musical interval. However, no game machine is configured to reflect a sound input by the player in game content as a raw material so that the game can be enjoyed based on the input sound.


The present invention aims to provide a music game system capable of determining a sound input by a player and forming a musical scale based on a determination result, a computer program thereof, and a method of generating sound effect data.


Solution to Problem

The music game system of the present invention is a game system comprising: a sound input device which inputs sound; an audio output device which outputs game sound; a sound effect data storage device which stores sound effect data for causing the audio output device to output sound effects of different musical intervals; a sequence data storage device which stores sequence data in which a relationship between a player's operation and the sound effect to be output correspondingly is described; a musical interval determination device which determines the musical interval representing an input sound based on sound data of the sound input by the sound input device; a musical scale generating device which generates multiple tone data which have musical intervals different from that of the sound data, based on a musical interval determination result of the musical interval determination device, so as to form a musical scale; and a sound effect data storage control device which causes the sound effect data storage device to store the multiple tone data generated by the musical scale generating device as at least a part of the sound effect data.


The computer program of the present invention is a computer program for a music game system comprising: a sound input device which inputs sound; an audio output device which outputs game sound; a sound effect data storage device which stores sound effect data for causing the audio output device to output sound effects of different musical intervals; and a sequence data storage device which stores sequence data in which a relationship between a player's operation and the sound effect to be output correspondingly is described; wherein the computer program causes the music game system to function as: a musical interval determination device which determines the musical interval representing an input sound based on sound data of the sound input by the sound input device; a musical scale generating device which generates multiple tone data which have musical intervals different from that of the sound data, based on a musical interval determination result of the musical interval determination device, so as to form a musical scale; and a sound effect data storage control device which causes the sound effect data storage device to store the multiple tone data generated by the musical scale generating device as at least a part of the sound effect data.


In the present invention, sound data is generated by the musical interval determination device based on a sound input into the sound input device by a player, and the musical interval representing the sound data is determined. Then, multiple tone data having different musical intervals are generated by the musical scale generating device from the sound data whose musical interval has been determined, based on the musical interval determination result of the sound data. The multiple tone data form a musical scale. The multiple tone data are stored in the sound effect data storage device as sound effect data and are used as sound effects to be output in response to a player's operation. Thus, a musical scale is formed based on a sound input arbitrarily by the player; therefore, a melody can be played based on an input sound, or an input sound can be reflected in game content as a raw material so that the player can enjoy the game with a sound input by the player.


As one aspect of the music game system of the present invention, the musical interval determination device determines the musical interval of the sound by identifying a frequency representing the sound data of the sound input by the sound input device. According to this aspect, the musical interval of the sound is determined by, for example, identifying the frequency at which the distribution is maximum as a representative value with reference to a frequency spectrum of the sound data.


As one aspect of the music game system of the present invention, the musical scale generating device generates a musical scale of at least one octave. According to this aspect, a melody can be played by generating a musical scale. If a large number of pieces of tone data are generated, the musical scale grows in breadth and the number of melodies that can be played increases, so that game content can be made more advanced.
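As an illustration only (no particular tuning is prescribed by the invention), the tones of a one-octave scale can be derived from a base frequency using the equal-tempered semitone ratio, where each step multiplies the frequency by 2^(1/12):

```python
def octave_frequencies(base_freq):
    """Return the 13 equal-tempered frequencies from base_freq up one octave.

    Each semitone step multiplies the frequency by 2**(1/12), so the
    thirteenth tone is exactly double the base frequency.
    """
    return [base_freq * 2.0 ** (n / 12.0) for n in range(13)]
```

For example, starting from 440.0 Hz, the list ends at 880.0 Hz, one octave above.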


As one aspect, the music game system of the present invention further comprises an input device which has at least one operating device, wherein the sound effect following a description of the sequence data is played by the audio output device based on operations of the player through the input device. According to this aspect, by operating the operating device, the player can reproduce a sound effect constituted of a musical scale formed by using a sound input by the player. Therefore, an input sound can be reflected in game content as a raw material so that the player can enjoy the game with a sound input by the player.


The method of the present invention is a method of generating sound effect data comprising: a musical interval determination step which determines the musical interval representing an input sound based on sound data of the sound input by a sound input device; a musical scale generating step which generates multiple tone data which have musical intervals different from that of the sound data, based on a musical interval determination result of the musical interval determination step, so as to form a musical scale; and a storing step which causes a storage device to store the multiple tone data generated in the musical scale generating step as sound effect data for output from an audio output device.


The present invention thus provides a method of generating sound effect data in a music game system and a computer program thereof, and achieves similar operational effects. The present invention is not limited to music game systems and is also applicable to various electronic devices such as electronic musical instruments.


Advantageous Effects of Invention

In a music game system according to the present invention and a computer program thereof, as described above, sound data is generated by the musical interval determination device based on a sound input into the sound input device by a player, and the musical interval representing the sound data is determined. Then, multiple tone data having different musical intervals are generated by the musical scale generating device from the sound data whose musical interval has been determined, based on the musical interval determination result of the sound data. The multiple tone data form a musical scale. The multiple tone data are stored in the sound effect data storage device as sound effect data and are used as sound effects to be output in response to a player's operation. Thus, a musical scale is formed based on a sound input arbitrarily by the player; therefore, a melody can be played based on an input sound, or an input sound can be reflected in game content as a raw material so that the player can enjoy the game with a sound input by the player. A similar effect is achieved by the method of generating sound effect data.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1 is a diagram showing an appearance of a game machine according to one aspect of the present invention.



FIG. 2 is a functional block diagram of the game machine according to one aspect of the present invention.



FIG. 3 is an enlarged view of an operation instruction screen displayed as part of a game screen.



FIG. 4 is a diagram showing one example of contents of sound effect data.



FIG. 5 is a diagram showing one example of contents of sequence data.



FIG. 6 is a flowchart showing a sequence processing routine executed by a game controller.



FIG. 7 is a flowchart showing a musical interval determination processing routine executed by the game controller.



FIG. 8 is a flowchart showing a musical scale generating processing routine executed by the game controller.



FIG. 9 is a graph showing one example of sound data.



FIG. 10 is a graph showing a frequency spectrum of the sound data in FIG. 9.



FIG. 11 is a graph showing tone data obtained by frequency conversion of the sound data in FIG. 9.





DESCRIPTION OF EMBODIMENTS

An embodiment obtained by applying the present invention to a mobile game machine will be described below. As shown in FIG. 1, a game machine 1 includes a housing 2 that can be held by a player (user) by hand, a first monitor 3 arranged on the right side of the housing 2, a second monitor 4 arranged on the left side of the housing 2, a plurality of push-button switches 5 arranged in the upper part of the first monitor 3, and a cross key 6 arranged in the lower part of the first monitor 3. A transparent touch panel 7 is laid on the surface of the first monitor 3. The touch panel 7 is a well-known input device that, when touched by the player with a touch pen or the like, outputs a signal in accordance with the touch position. In addition, the game machine 1 is provided with various input devices and output devices included in an ordinary mobile game machine, such as a power switch, a volume operation switch, and a power lamp, but an illustration thereof is omitted in FIG. 1.


As shown in FIG. 2, a control unit 10 as a computer is provided inside the game machine 1. The control unit 10 includes a game controller 11 as a control body, a pair of display controllers 12, 13 that operate according to output from the game controller 11, and an audio output controller 14. The game controller 11 is configured as a unit combining a microprocessor and various peripheral devices, such as internal storage devices (for example, a ROM and a RAM), necessary for the operation of the microprocessor. The display controllers 12, 13 render images in accordance with image data provided from the game controller 11 into frame buffers and output video signals corresponding to the rendered images to the monitors 3, 4 respectively, thereby causing the monitors 3, 4 to display predetermined images. The audio output controller 14 causes a speaker 8 to play predetermined sounds (including music and the like) by generating audio playback signals in accordance with audio playback data provided from the game controller 11 and outputting them to the speaker 8.


The push-button switches 5, the cross key 6, and the touch panel 7 described above are connected to the game controller 11 as input devices, and a sound input device (microphone) 9 is also connected thereto. Besides these, various input devices may be connected to the game controller 11. Further, an external storage device 20 is connected to the game controller 11. As the external storage device 20, a storage medium capable of retaining data without power feeding is used, such as a magnetic storage device or a nonvolatile semiconductor memory device such as an EEPROM. The storage medium of the external storage device 20 is removable from the game machine 1.


A game program 21 and game data 22 are stored in the external storage device 20. The game program 21 is a computer program needed to play a music game in the game machine 1 according to a predetermined procedure and contains a sequence control module 23, a musical interval determination module 24, and a musical scale generating module 25 to realize functions according to the present invention. When the game machine 1 is started, the game controller 11 performs various initial settings necessary for operation as the game machine 1 by executing an operation program stored in its internal storage device, and then sets up the environment to play the music game according to the game program 21 by reading the game program 21 from the external storage device 20 and executing it. A sequence process portion 15 is generated in the game controller 11 when the sequence control module 23 of the game program 21 is executed by the game controller 11. Similarly, a musical interval determination portion 16 is generated in the game controller 11 when the musical interval determination module 24 is executed, and a musical scale generating portion 17 is generated when the musical scale generating module 25 is executed.


The sequence process portion 15, the musical interval determination portion 16, and the musical scale generating portion 17 are logical devices realized by combining computer hardware and computer programs. The sequence process portion 15 performs music game processing such as issuing operation instructions to the player in time with playback of the music (musical piece) selected by the player and generating sound effects in accordance with the player's operations. The musical interval determination portion 16 decides a representative value of the frequency by capturing a sound input into the sound input device 9 by the player and performing predetermined processing, described later, thereon. The musical scale generating portion 17 generates multiple tone data by changing the musical interval based on the representative value decided by the musical interval determination portion 16. These pieces of tone data form a musical scale of a predetermined octave number and constitute sound effects. In addition to the above modules 23 to 25, various program modules necessary for playing the music game are contained in the game program 21 and logical devices corresponding to such modules are generated in the game controller 11, but an illustration thereof is omitted.


Various kinds of data to be referenced when the music game is played according to the game program 21 are contained in the game data 22. For example, music data 26, sound effect data 27, and image data 28 are contained in the game data 22. The music data 26 is data needed to cause the speaker 8 to play and output a musical piece intended for the game. Though one kind of the music data 26 is shown in FIG. 2, the player can actually select the musical piece from a plurality of musical pieces. The game data 22 holds the plurality of pieces of the music data 26, each with information attached to identify the corresponding musical piece. The sound effect data 27 is data in which a plurality of kinds of sound effects to be output from the speaker 8 in response to a player's operation are recorded, each associated with a unique code. The sound effects include instrument sounds and other various kinds of sounds. Vocal sounds for causing the speaker 8 to output text are also included as a kind of sound effect. The sound effect data 27 is prepared, for each kind, for a predetermined octave number by changing the musical interval. The image data 28 is data for causing the monitors 3 and 4 to display a background image of the game screen, various objects, icons and the like.


Further, sequence data 29 is contained in the game data 22. The sequence data 29 is data that defines operations and the like to be instructed to the player. At least one piece of the sequence data 29 is prepared for one piece of the music data 26.


Next, an overview of the music game played in the game machine 1 will be provided. As shown in FIG. 1, an operation instruction screen 100 of the game is displayed in the first monitor 3 and an information screen 110 of the game is displayed in the second monitor 4 while the music game is played in the game machine 1. As shown also in FIG. 3, the operation instruction screen 100 displays a first lane 101, a second lane 102, and a third lane 103 extending in the vertical direction and visually separated from one another, for example by division lines 104. An operation reference portion 105 is displayed at the bottom end of each of the lanes 101, 102 and 103. Objects 106 as operation indicators are displayed in the lanes 101, 102 and 103 according to the sequence data 29 while the music game is played, that is, while playback of a musical piece is in progress.


The objects 106 appear at the top end of the lanes 101, 102 and 103 at appropriate times of the musical piece and are scrolled downward, as indicated by an arrow A in FIG. 3, with the progress of the musical piece. The player is requested to perform a touch operation on the lane 101, 102 or 103 in which the object 106 is displayed, through an operation member such as a touch pen 120, coinciding with the arrival of the object 106 at the operation reference portion 105. When the player performs a touch operation, the difference between the time when the object 106 and the operation reference portion 105 match and the time when the player performs the touch operation is detected. The player's operation is evaluated more highly as this difference decreases. Moreover, a sound effect corresponding to each of the objects 106 is played by the speaker 8 in accordance with the touch operation. In the example of FIG. 3, the object 106 is immediately before arriving at the operation reference portion 105 in the second lane 102, and the player should perform a touch operation on the second lane 102 coinciding with its arrival. Anywhere inside the second lane 102 may be touched. That is, three operating devices are formed in the present embodiment by the combination of the lanes 101, 102 and 103 displayed in the first monitor 3 and the touch panel 7 laid thereon. Incidentally, each of the lanes 101, 102 and 103 may be used below as a term representing the corresponding operating device.


The sound effect corresponding to each of the objects 106 played in accordance with a touch operation is selected from the plurality of sound effects recorded in the sound effect data 27. As shown in FIG. 4, the sound effect data 27 contains original data 27a pre-recorded in the game data 22 and user data 27b obtained based on sound input into the sound input device 9 by the player. The original data 27a and the user data 27b each have a plurality of sound effects A1, B1, . . . recorded therein. Taking the sound effect A1 as an example, the sound effect A1 has a set of tone data sd_000, sd_001, sd_002, . . . recorded therein, associating each tone constituting a musical scale with a unique code. The other sound effects B1, C1, . . . have similar tone data. The user data 27b is similar to the original data 27a in the structure of the tone data included in the sound effects A1, B1, . . . , but differs from the pre-recorded original data 27a in that its tone data is generated based on sound input into the sound input device 9 by the player.
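As an illustration only (the names and the in-memory layout are hypothetical, not taken from the embodiment), the structure of FIG. 4 could be represented as nested mappings, with each sound effect holding its tone data keyed by unique code:

```python
# Hypothetical in-memory layout of the sound effect data 27 of FIG. 4.
# The b"..." placeholders stand in for actual waveform data.
sound_effect_data = {
    "original": {                       # corresponds to the original data 27a
        "A1": {"sd_000": b"...", "sd_001": b"...", "sd_002": b"..."},
        "B1": {"sd_010": b"...", "sd_011": b"..."},
    },
    "user": {                           # corresponds to the user data 27b
        "A2": {"sd_101": b"...", "sd_105": b"...", "sd_106": b"..."},
    },
}

def lookup_tone(data, source, effect, code):
    """Fetch one piece of tone data by (source, sound effect, unique code)."""
    return data[source][effect][code]
```

Both halves share one structure, so a sequence record only needs to name the source ("original" or "user"), the sound effect, and the unique code.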


Next, the sequence data 29 will be described in detail. As shown in FIG. 5, the sequence data 29 contains an initial setting portion 29a and an operation sequence portion 29b. The initial setting portion 29a describes, as initial settings for the game to be played, information specifying play conditions that differ from musical piece to musical piece, such as the tempo of the music (for example, a BPM value) and the sound effects to be generated when the lanes 101 to 103 are each operated.


In the operation sequence portion 29b, on the other hand, operation specifying information 29c and sound effect switching instruction information 29d are described. In the operation specifying information 29c, operation times of the lanes 101 to 103 are associated with information specifying one of the lanes 101 to 103. That is, as partially illustrated in FIG. 5, the operation specifying information 29c is configured as a set of records, each associating a time (operation time) at which an operation should be performed during the musical piece with information specifying the operating device (lane). As the operation time, values indicating the bar number in the musical piece, the beat number, and the time in the beat are described, delimited by commas. The time in a beat is the elapsed time from the start of the beat; if the time length of a beat is equally divided into n unit times, the time in a beat is represented by the number of units from the start of the beat. If, for example, n = 100 and a time one quarter of the way through the second beat of the first bar of a musical piece should be specified as the operation time, the operation specifying information 29c is described as "01, 2, 025". When the first lane 101 should be specified as the operating device, "button1" is described; when the second lane 102 should be specified, "button2" is described; and when the third lane 103 should be specified, "button3" is described. In the example of FIG. 5, the operation times and operating devices are specified such that the first lane 101 is touched at the start (000) of the first beat of the first bar, the second lane 102 is touched at the start (000) of the second beat of the first bar, and the third lane 103 is touched when "025" has passed after the start of the second beat of the first bar.
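Assuming 4/4 time, n = 100 unit times per beat, and a known BPM (all assumptions for illustration; the sequence data itself only fixes the "bar, beat, units" format), an operation time of this kind can be converted to seconds from playback start as follows:

```python
def operation_time_to_seconds(op_time, bpm, beats_per_bar=4, units_per_beat=100):
    """Convert a "bar, beat, units" operation time string to seconds.

    Example from the text: "01, 2, 025" is a quarter of the way through
    the second beat of the first bar.
    """
    bar, beat, units = (int(v) for v in op_time.split(","))
    beat_len = 60.0 / bpm  # seconds per beat
    beats_elapsed = (bar - 1) * beats_per_bar + (beat - 1) + units / units_per_beat
    return beats_elapsed * beat_len
```

At 120 BPM, for instance, each beat lasts 0.5 s, so "01, 2, 025" corresponds to (1 + 0.25) x 0.5 = 0.625 s into the musical piece.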


The sound effect switching instruction information 29d is inserted at suitable positions in the operation specifying information 29c. The sound effect switching instruction information 29d associates a time in the musical piece at which the sound effects should be changed with the tone data of the sound effects to be generated when the lanes 101 to 103 are each operated, thereby changing the sound effects generated when a specified lane is touched in the subsequent operation specifying information 29c. The time in the musical piece is described in the same format as the operation time of the operation specifying information 29c. The sound effect switching instruction information 29d specifies, for each lane, tone data of either the original data 27a or the user data 27b recorded in the sound effect data 27. The sound effect switching instruction information 29d is inserted at the time in the musical piece at which the sound effects should be switched, and the setting of the sound effects is maintained until the next sound effect switching instruction information 29d instructs otherwise.


The sequence process portion 15 of the game controller 11 controls the display of each of the lanes 101 to 103 so that the object 106 and the operation reference portion 105 match at the above operation time specified by the operation specifying information 29c. The sequence process portion 15 also exercises control so that the sound effects generated when the player touches the specified lanes 101 to 103 are switched at the time in a musical piece specified by the sound effect switching instruction information 29d.


Next, processing of the game controller 11 when a music game is played on the game machine 1 will be described. After completing the initial settings necessary to play the music game by reading the game program 21, the game controller 11 waits for an instruction from the player to start the game. The instruction to start the game includes, for example, an operation identifying the musical piece to be played in the game or the data to be used in the game, such as a selection of the degree of difficulty. The procedure for receiving such instructions may be the same as in a well-known music game or the like.


If the start of the game is instructed, the game controller 11 reads the music data 26 corresponding to the musical piece selected by the player and outputs it to the audio output controller 14 to cause the speaker 8 to play the musical piece. Accordingly, the control unit 10 functions as a musical piece playback device. In synchronization with playback of the musical piece, the game controller 11 also reads the sequence data 29 corresponding to the player's selection, generates image data necessary for rendering the operation instruction screen 100 and the information screen 110 while referencing the image data 28, and outputs the image data to the display controllers 12 and 13 to cause the monitors 3 and 4 to display the operation instruction screen 100 and the information screen 110 respectively. Further, while the music game is played, the game controller 11 repeatedly executes the sequence processing routine shown in FIG. 6 at a predetermined period as processing necessary for the display of the operation instruction screen 100 and the like.


When the sequence processing routine shown in FIG. 6 is started, in step S1, the sequence process portion 15 of the game controller 11 first obtains the current time in the musical piece. For example, timekeeping by an internal clock of the game controller 11 is started at the playback start time of the musical piece, and the current time is obtained from the value of the internal clock. In subsequent step S2, the sequence process portion 15 obtains, from the sequence data 29, data of the operation times present in the time length corresponding to the display range of the operation instruction screen 100. The display range is set, as an example, to the time range corresponding to two bars of the musical piece from the current time toward the future.


In the next step S3, the sequence process portion 15 calculates the coordinates of all the objects 106 to be displayed in the lanes 101 to 103 of the operation instruction screen 100. The calculation is carried out, as an example, as described below. Which of the lanes 101 to 103 each object 106 is arranged in is determined based on the designation of the lanes 101 to 103 associated with each operation time contained in the display range, that is, the designation of one of "button1" to "button3" in the example of FIG. 5. The position of each object 106 in the time-axis direction (namely, the direction of movement of the object 106) from the operation reference portion 105 is determined in accordance with the difference between its operation time and the current time. Accordingly, the coordinates needed to arrange each object 106 along the time axis from the operation reference portion 105 in its specified lane 101 to 103 can be obtained.
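One way to sketch this positional calculation (the screen units and the treatment of the display window as a single time span are assumptions, not taken from the embodiment) is to place each object along its lane in proportion to how far its operation time lies ahead of the current time:

```python
def object_position(op_time, now, window, lane_length):
    """Distance of an object from the top of its lane, in screen units.

    `window` is the time span shown in the lane (e.g. two bars of the
    musical piece). An object whose operation time equals `now` sits at
    the operation reference portion (lane_length), and one a full
    `window` away sits at the top of the lane (0).
    """
    fraction_remaining = (op_time - now) / window
    return lane_length * (1.0 - fraction_remaining)
```

Repeating the routine at a fixed period then makes each object scroll downward and reach the operation reference portion exactly at its operation time.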


After the calculation of the coordinates of the objects 106 is completed, the sequence process portion 15 proceeds to step S4 to determine whether sound effect switching instruction information 29d is present in the data obtained from the sequence data 29. If the sound effect switching instruction information 29d is present, the sequence process portion 15 obtains the current time in step S5 and compares the current time with the time in the musical piece specified by the sound effect switching instruction information 29d to determine whether the current time corresponds to the timing of the switching instruction. If the current time corresponds to the timing of the switching instruction, in step S6, the sequence process portion 15 changes the sound effects generated for the respective lanes 101 to 103 specified by the subsequent operation specifying information 29c to the sound effects specified by the sound effect switching instruction information 29d. In the example shown in FIG. 5, after the start of the third beat of the first bar of the musical piece, the tone data sd_101, sd_105 and sd_106 of the sound effect A2 of the user data 27b of the sound effect data 27 are allocated to the lanes 101, 102 and 103 respectively, and if the player touches the lane 101, 102 or 103, the corresponding tone data is played. If the sound effect switching instruction information 29d is not present in step S4, or if the current time does not correspond to the switching timing in step S5, the sequence process portion 15 proceeds to step S7.
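The switching of steps S4 to S6 amounts to keeping, for each lane, the tone data named by the most recent switching instruction whose time has passed. A minimal sketch, with hypothetical record shapes:

```python
def apply_switching(lane_tones, switch_records, now):
    """Update the lane-to-tone-code mapping according to elapsed switches.

    switch_records: list of (time_in_piece, {lane: tone_code}) pairs,
    sorted by time. Entries at or before `now` take effect, and a later
    entry overrides an earlier one, matching the rule that a setting is
    maintained until the next switching instruction.
    """
    for time_in_piece, mapping in switch_records:
        if time_in_piece <= now:
            lane_tones.update(mapping)
    return lane_tones
```

With the FIG. 5 example, a record at the third beat of the first bar would reassign the three lanes to sd_101, sd_105 and sd_106 of the sound effect A2.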


When switching of the sound effects is completed, the sequence process portion 15 proceeds to the next step S7 and generates image data necessary for rendering the operation instruction screen 100 based on the coordinates of the objects 106 calculated in step S3. More specifically, the sequence process portion 15 generates image data in which the objects 106 are arranged at the calculated coordinates. The images of the objects 106 may be obtained from the image data 28.


In subsequent step S8, the sequence process portion 15 outputs the image data to the display controller 12. Accordingly, the operation instruction screen 100 is displayed in the first monitor 3. When the processing in step S8 is completed, the sequence process portion 15 terminates this sequence processing routine. By performing the above processing repeatedly, the objects 106 are scrolled through the lanes 101 to 103 in such a way that they arrive at the operation reference portion 105 at the operation times described in the sequence data 29.


Next, processing by the musical interval determination portion 16 and the musical scale generating portion 17 when a sound effect is created based on a sound input by a player into the game machine 1 will be described. A sound effect is created when, for example, its creation is instructed by the player in a waiting state in which no music game is played. When the creation of a sound effect is started, the musical interval determination portion 16 first executes the musical interval determination processing routine shown in FIG. 7, and the musical scale generating portion 17 then executes the musical scale generating processing routine shown in FIG. 8 based on the result of the musical interval determination processing routine.


When the musical interval determination processing routine in FIG. 7 is started, in step S11, the musical interval determination portion 16 of the game controller 11 obtains the sound input by the player. If the player inputs sound while the sound input device 9 is ready to capture sound, raw sound data is generated. In subsequent step S12, the musical interval determination portion 16 performs A/D conversion on the raw sound data. The analog signal of the raw sound data is thereby converted into a digital signal to create sound data of the input sound. FIG. 9 shows an example of sound data. The sound data in FIG. 9 is a digital waveform of a guitar sound; the horizontal axis and the vertical axis represent the duration and the dynamic range respectively. Incidentally, well-known technology may be used for the A/D conversion.


Then, in step S13, the musical interval determination portion 16 obtains a frequency spectrum of the sound data. FIG. 10 shows a frequency spectrum generated by a fast Fourier transform of the sound data obtained in step S12. The horizontal axis and the vertical axis represent the frequency and the degree of distribution of each frequency respectively. Incidentally, the generation of a frequency spectrum is not limited to calculation based on the fast Fourier transform, and various well-known technologies may be used. In subsequent step S14, the musical interval determination portion 16 determines the representative value from the frequency spectrum obtained in step S13. The representative value is defined as the frequency at which the distribution in the frequency spectrum is largest. In the graph in FIG. 10, the frequency at the peak indicated by the arrow p is the representative value. Based on the frequency of the representative value determined as described above, the musical interval of the sound data based on the sound input by the player is determined. The representative value may also be calculated from the data of a band q spanning both sides of the crest containing the maximum peak. Calculating the representative value from such a fixed band is useful when the peak is ambiguous, for example when the peak frequency has some width. When the processing in step S14 is completed, the musical interval determination portion 16 terminates this musical interval determination processing routine. With the above processing, the representative value of the sound data based on the sound input by the player is determined, and thereby the inherent musical interval of that sound.
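The peak-picking idea of steps S13 and S14 can be sketched as follows. This is an illustration only, not the patent's implementation: a naive DFT stands in for the fast Fourier transform for brevity, and all names are assumed:

```python
import cmath
import math

def dft(samples):
    """Naive discrete Fourier transform (an FFT would normally be used)."""
    n = len(samples)
    return [sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

def representative_frequency(samples, sample_rate):
    """Return the frequency whose magnitude is largest in the spectrum,
    corresponding to the representative value at the peak p."""
    spectrum = dft(samples)
    half = len(samples) // 2                 # keep only non-negative frequencies
    magnitudes = [abs(c) for c in spectrum[:half]]
    peak_bin = max(range(1, half), key=lambda k: magnitudes[k])  # skip DC
    return peak_bin * sample_rate / len(samples)

rate = 8000
tone = [math.sin(2 * math.pi * 1000 * t / rate) for t in range(256)]
freq = representative_frequency(tone, rate)
assert abs(freq - 1000.0) < 1e-6
```

The band-q variant described above would average or integrate the magnitudes over a fixed band around the crest instead of taking the single largest bin.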


Once the representative value has been obtained by the musical interval determination processing routine, the musical scale generating portion 17 executes the musical scale generating processing routine in FIG. 8. In step S21, the musical scale generating portion 17 generates multiple tone data forming a musical scale from the sound data whose representative value has been determined. The musical scale generating portion 17 frequency-converts the sound data based on the representative value so that the representative value of each piece of tone data becomes the frequency of the corresponding tone of a musical scale spanning a predetermined number of octaves. FIG. 11 shows an example of frequency-converted tone data; the waveform in FIG. 11 is obtained by converting the sound data in FIG. 9 one octave upward. Then, in step S22, the musical scale generating portion 17 stores the set of generated tone data in the sound effect data 27, specifically in the user data 27b of the sound effect data 27. When the processing in step S22 is completed, the musical scale generating portion 17 terminates this musical scale generating processing routine. With the above processing, multiple tone data whose representative values have mutually different frequencies are generated from the sound data whose representative value has been determined, so as to form a musical scale. The set of tone data forming the musical scale is stored in the user data 27b of the sound effect data 27 as a sound effect.
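The frequency conversion in step S21 can be illustrated with a simple resampling sketch. The patent does not specify the conversion method; this version changes duration along with pitch (as simple speed-change conversion does), and the function names, the equal-tempered scale, and the 440 Hz representative value are all assumptions:

```python
def shift_pitch(samples, ratio):
    """Resample with linear interpolation so that all frequencies in the
    signal are multiplied by `ratio` (duration shrinks correspondingly)."""
    out = []
    pos = 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += ratio
    return out

def generate_scale(samples, rep_freq, target_freqs):
    """One piece of tone data per target tone; each tone's representative
    value is moved from rep_freq to the target frequency."""
    return [shift_pitch(samples, f / rep_freq) for f in target_freqs]

# equal-tempered chromatic scale spanning one octave upward from the input
targets = [440.0 * 2 ** (k / 12) for k in range(13)]
scale = generate_scale([0.0, 1.0, 0.0, -1.0] * 100, 440.0, targets)
assert len(scale) == 13
# the octave-up tone (ratio 2) is half as long as the 400-sample original
assert len(scale[-1]) == 200
```

The resulting list corresponds to the set of tone data stored in the user data 27b as a sound effect.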


In the above embodiment, the external storage device 20 of the game machine 1 functions as a sound effect data storage device and a sequence data storage device. Also, the control unit 10 functions as a musical interval determination device by causing the musical interval determination portion 16 to perform the processing in steps S11 to S14 in FIG. 7, functions as a musical scale generating device by causing the musical scale generating portion 17 to perform the processing in step S21 in FIG. 8, and functions as a sound effect data storage control device by causing the musical scale generating portion 17 to perform the processing in step S22 in FIG. 8.


The present invention is not limited to the above embodiment and can be carried out in various forms. For example, the present embodiment has been described by taking the music game machine 1 as an example of an apparatus that functions as a musical interval determination device, a musical scale generating device, and a sound effect data storage control device, but the invention is not limited to this example. For example, the present invention may be applied to various electronic devices such as electronic musical instruments. If the present invention is applied to an electronic musical instrument, a melody can be played based on any sound input by the player.


A music game system according to the present invention is not limited to game systems realized as mobile game machines and may be realized in any appropriate form, such as home video game machines, commercial game machines installed in commercial facilities, and game systems realized using a network. The input device is not limited to the example using a touch panel; input devices configured in various ways, such as push buttons, levers, and track balls, can be used.

Claims
  • 1. A music game system comprising: a sound input device which inputs sound; an audio output device which outputs game sound; a sound effect data storage device which stores sound effect data to cause the audio output device to output each of sound effects of different musical intervals; a sequence data storage device which stores sequence data in which a relationship between a player's operation and the sound effect to be output correspondingly is described; a musical interval determination device which determines the musical interval representing an input sound based on sound data of the sound input by the sound input device; a musical scale generating device which generates multiple tone data which respectively have musical intervals different from the sound data, based on a musical interval determination result of the musical interval determination device, so as to form a musical scale; and a sound effect data storage control device which causes the sound effect data storage device to store the multiple tone data generated by the musical scale generating device as at least a part of the sound effect data.
  • 2. The music game system of claim 1, wherein the musical interval determination device determines the musical interval of the sound by identifying a frequency representing the sound data of the sound input by the sound input device.
  • 3. The music game system of claim 1, wherein the musical scale generating device generates a musical scale of at least one octave.
  • 4. The music game system of claim 1, further comprising: an input device which has at least one operating device; wherein the sound effect following a description of the sequence data is played by the audio output device based on operations of the player through the input device.
  • 5. A storage medium storing a computer program for a music game system comprising: a sound input device which inputs sound; an audio output device which outputs game sound; a sound effect data storage device which stores sound effect data to cause the audio output device to output each of sound effects of different musical intervals; a sequence data storage device which stores sequence data in which a relationship between a player's operation and the sound effect to be output correspondingly is described; wherein the computer program causes the music game system to function as: a musical interval determination device which determines the musical interval representing an input sound based on sound data of the sound input by the sound input device; a musical scale generating device which generates multiple tone data which respectively have musical intervals different from the sound data, based on a musical interval determination result of the musical interval determination device, so as to form a musical scale; and a sound effect data storage control device which causes the sound effect data storage device to store the multiple tone data generated by the musical scale generating device as at least a part of the sound effect data.
  • 6. A method of generating sound effect data comprising: a musical interval determination step which determines the musical interval representing an input sound based on sound data of the sound input by a sound input device; a musical scale generating step which generates multiple tone data which respectively have musical intervals different from the sound data, based on a musical interval determination result of the musical interval determination step, so as to form a musical scale; and a storing step which causes a storage device to store the multiple tone data generated by the musical scale generating step as sound effect data for output from an audio output device.
Priority Claims (1)
Number Date Country Kind
2009-210571 Sep 2009 JP national
PCT Information
Filing Document Filing Date Country Kind 371c Date
PCT/JP2010/065337 9/7/2010 WO 00 3/8/2012