Referring to
The game apparatus 12 is connected with a receiving unit 18 via a connection terminal. The receiving unit 18 receives operation data transmitted wirelessly from the controller 14. More specifically, the controller 14 uses a wireless communication technique such as Bluetooth (registered trademark) to transmit operation data to the game apparatus 12 to which the receiving unit 18 is connected.
In addition, an optical disc 20, as an example of an information storage medium replaceably used in the game apparatus 12, is attached to or detached from the game apparatus 12. Provided on an upper main surface of the game apparatus 12 are a power ON/OFF switch for the game apparatus 12, a reset switch for game processing, and an OPEN switch for opening the upper cover of the game apparatus 12. When the player presses the OPEN switch, the cover is opened, whereby the optical disc 20 can be attached to or detached from the game apparatus 12. Moreover, an external memory card 22 is detachably attached to the game apparatus 12 as required. A flash memory or the like contained in the memory card 22 stores saved data and the like.
The game apparatus 12 executes a game program stored in the optical disc 20 and displays results of the execution as a game image on the monitor 16. The game apparatus 12 may also use saved data stored in the external memory card 22 to reproduce the state of a game executed in the past and display the game image on the monitor 16. A speaker 24 (see
The CPU 26 is connected via a memory controller 30 with a GPU (Graphics Processing Unit) 32, the main memory 28, a DSP (Digital Signal Processor) 34, and an ARAM (Audio RAM) 36. The memory controller 30 is connected via a predetermined bus with a controller I/F (Interface) 38, a video I/F 40, an external memory I/F 42, an audio I/F 44, and a disc I/F 46, which are connected with the receiving unit 18, the monitor 16, the external memory card 22, the speaker 24, and the disc drive 48, respectively.
The GPU 32 performs image processing under instructions from the CPU 26. For example, the GPU 32 is formed by a semiconductor chip that performs calculations required for displaying 3D graphics. For image processing, the GPU 32 uses a memory dedicated to image processing and part of the storage area of the main memory 28. The GPU 32 generates game image data and movie images to be displayed, and outputs them to the monitor 16 via the memory controller 30 and the video I/F 40 as appropriate.
The main memory 28 is a storage area used by the CPU 26, and appropriately stores a game program and data required by the CPU 26 for game processing. For instance, the main memory 28 stores the game program, various kinds of data, etc. read by the CPU 26 from the optical disc 20.
The DSP 34 serves as a sound processor and is connected with the ARAM 36 for storage of sound data, etc. The ARAM 36 is used for predetermined processes (e.g. storing a previously read game program and sound data). The DSP 34 reads the sound data (sound wave data) stored in the ARAM 36, generates data for sound output based on the sound control data from the CPU 26, the sound wave data and the like, and outputs the sound from the speaker 24 provided in the monitor 16 via the memory controller 30 and the audio I/F 44.
The memory controller 30 centrally controls data transfer and is connected with the above-mentioned I/Fs. The controller I/F 38 is formed by four controller I/Fs, for example, and communicably connects the game apparatus 12 with an external device via connectors possessed by those controller I/Fs. For instance, the receiving unit 18 is engaged with such a connector and connected to the game apparatus 12 via the controller I/F 38. As described above, the receiving unit 18 receives operation data from the controller 14 and outputs it to the CPU 26 via the controller I/F 38. In another embodiment, instead of the receiving unit 18, the game apparatus 12 may contain an internal receiving module for receiving operation data transmitted from the controller 14. In this case, the transmission data received by the receiving module is output to the CPU 26 via a predetermined bus.
The video I/F 40 is connected with the monitor 16 on which a game image is displayed according to an image signal from the video I/F 40. The external memory I/F 42 is connected with the external memory card 22. The CPU 26 accesses a flash memory, etc. provided in the external memory card 22 via the memory controller 30.
The audio I/F 44 is connected with the speaker 24 contained in the monitor 16. The audio I/F 44 provides the speaker 24 with an audio signal corresponding to sound data read from the ARAM 36 or generated by the DSP 34, or to sound data output directly from the disc drive 48. The speaker 24 outputs the sound.
The disc I/F 46 is connected with the disc drive 48 which reads data stored in the optical disc 20 in a predetermined reading position. The read data is written into the main memory 28 via the disc I/F 46 and the memory controller 30, etc., or is output to the audio I/F 44.
The controller 14 has a housing 50 formed by plastic molding, for example. The housing 50 is approximately rectangular, with its longer sides in the back-and-forth direction (Z-axis direction shown in
The housing 50 is provided with a plurality of operating buttons. Provided on an upper surface of the housing 50 are a cross key 52a, an X button 52b, a Y button 52c, an A button 52d, a select switch 52e, a menu switch 52f, and a start switch 52g. Meanwhile, a lower surface of the housing 50 has a concave portion, and a B button 52i is provided on a rear-side inclined surface of the concave portion. Each of these buttons (switches) 52 is given a function according to a game program executed by the game apparatus 12. In addition, a power switch 52h for remotely turning on/off the game apparatus 12 is provided on the upper surface of the housing 50.
Moreover, a connector 54 is provided on a rear surface of the housing 50. The connector 54 is, for example, a 32-pin edge connector used for connection of another device to the controller 14. A plurality of LEDs 56 are provided at a rear side of the upper surface of the housing 50. The controller 14 is given a controller type (number) for discrimination from other controllers 14. When the controller 14 transmits operation data to the game apparatus 12, one LED 56 corresponding to the currently set controller type of the controller 14 lights up.
The acceleration sensor 60 detects, out of the accelerations applied to its detection part, the linear acceleration component along each sensing axis, as well as the acceleration of gravity. The acceleration sensor 60 detects accelerations in the directions of at least two axes orthogonal to each other. For example, a biaxial or triaxial acceleration sensor detects the accelerations applied to its detection part as accelerations of straight-line components along the respective axes. More specifically, in this embodiment, a triaxial acceleration sensor is used to detect the accelerations of the controller 14 in the directions of three axes: the up-and-down direction (Y-axis direction in
For the acceleration sensor 60, a biaxial acceleration sensor may be used to detect the accelerations in any combination of directions of two axes among the up-and-down direction, the right-and-left direction and the back-and-forth direction, depending on the kind of a required operation signal. In this embodiment, the acceleration sensor 60 detects the state of an operation such as the user's stroke performance on the controller 14. As shown in
Data on the accelerations detected by the acceleration sensor 60 is output to the communication part 58. The acceleration sensor 60 is typically a capacitance-type acceleration sensor. The acceleration sensor 60 has a maximum sampling rate of 200 samples per second, for example.
The communication part 58 includes a microcomputer 62, a memory 64, a wireless module 66 and an antenna 68. The microcomputer 62 controls the wireless module 66 for transmitting acquired data wirelessly while using the memory 64 as a storage area during the process.
The data output from the operating part 52 and the acceleration sensor 60 to the microcomputer 62 is temporarily stored in the memory 64. Wireless transmission from the communication part 58 to the receiving unit 18 is carried out in a predetermined cycle. Since the game process is generally performed every 1/60 second, the wireless transmission needs to be carried out in a shorter cycle. When the timing for transmission to the receiving unit 18 has come, the microcomputer 62 outputs the data stored in the memory 64 to the wireless module 66 as operation data. The wireless module 66 uses the Bluetooth (registered trademark) technique to modulate a carrier wave of a predetermined frequency with the operation data and emit the resulting weak radio wave signal through the antenna 68. That is, the operation data is modulated by the wireless module 66 into a weak radio wave signal and transmitted from the controller 14. The weak radio wave signal is received by the receiving unit 18 of the game apparatus 12. By demodulating or decoding the received weak radio wave signal, the game apparatus 12 can obtain the operation data. The CPU 26 of the game apparatus 12 performs the game processing based on the operation data acquired from the controller 14.
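As an illustration of the buffering and transmission timing described above, the following minimal Python sketch batches acceleration samples and sends them more often than once per 1/60-second game frame. The class and parameter names (ControllerLink, report_interval, the 5 ms interval) are assumptions for illustration only and do not appear in the specification.

```python
import time
from collections import deque

class ControllerLink:
    """Minimal sketch of the controller-side buffering described above.

    Acceleration samples arrive at roughly 200 Hz and are batched into one
    operation-data report that is sent more often than once per
    1/60-second game frame.  All names here are illustrative.
    """

    def __init__(self, send, report_interval=0.005):  # 5 ms, shorter than 1/60 s
        self.send = send                   # callable that "transmits" a report
        self.report_interval = report_interval
        self.samples = deque()             # buffered (ax, ay, az) triples
        self.buttons = 0                   # bitmask of pressed buttons
        self._last_send = time.monotonic()

    def on_sample(self, ax, ay, az):
        """Called by the acceleration sensor at its own sampling rate."""
        self.samples.append((ax, ay, az))
        now = time.monotonic()
        if now - self._last_send >= self.report_interval:
            # One report carries every sample gathered since the last send,
            # so the game apparatus sees several samples per game frame.
            self.send({"accel": list(self.samples), "buttons": self.buttons})
            self.samples.clear()
            self._last_send = now

# Example: print each report instead of transmitting it over the air.
link = ControllerLink(send=print)
for i in range(10):
    link.on_sample(0.0, -1.0, 0.1 * i)
    time.sleep(0.005)
```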
The shape of the controller 14, and the shapes, number and layout of the operating switches 52, as shown in
Through the use of the controller 14, the player can perform game operations such as moving and rotating the controller 14 in addition to conventional typical game operations such as pressing the operating switches.
The sound output control apparatus 10 allows a sound to be output by swinging the controller 14 as if carrying out stroke performance on a guitar or the like. More specifically, with the game system 10, the user can carry out simulative guitar stroke performance by holding the controller 14 and performing a wristy stroke operation with it.
The acceleration sensor 60 detects the acceleration with which the above-mentioned stroke performance is carried out on the controller 14. The inventors have found that the value of the sum of the accelerations in the Z-axis direction and in the X-axis direction detected while the controller 14 is operated with a stroke motion varies depending on the position of the operated controller 14, and that assigning a specific value to a specific string allows simulative playing with a sense of operation like stroke performance.
More specifically, as shown in
In the game apparatus 12, it is determined whether a change in the stroke value has formed a predetermined relationship with the threshold value of each string. When the predetermined relationship has been formed, that is, when the stroke value has passed through the value associated with a string, that string is assumed to have been plucked and the sound associated with it is output.
As stated above, sound output is controlled according to a predetermined relationship between a change in the stroke value and a predetermined plurality of threshold values. Therefore, by setting the threshold values appropriately in advance in correspondence with the six guitar strings, for example, as in this embodiment, it is possible to reproduce, with a simple process, simulative playing in which a plurality of strings sound with time differences as on a real guitar.
In addition, different sounds can be associated with the threshold values compared against a change in the stroke value, allowing different sounds to be output in sequence by a single stroke operation. By setting predetermined different tones for the six guitar strings, for example, as in this embodiment, it is possible to reproduce, by a simple operation, simulative playing that produces the same chords as a real guitar.
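For illustration, a minimal Python sketch of the stroke value and of the "passing through a threshold" test described above; the threshold value of -0.2 used in the example is an assumption, not a value from the specification.

```python
def stroke_value(accel_x, accel_z):
    """Stroke value = sum of the X-axis and Z-axis accelerations."""
    return accel_x + accel_z

def passed_through(prev_stroke, curr_stroke, string_threshold):
    """True if the stroke value crossed the string's threshold value
    between the previous and current frames (in either direction)."""
    low, high = sorted((prev_stroke, curr_stroke))
    return low <= string_threshold < high

# With an illustrative threshold of -0.2 for one string, a stroke moving
# from about -0.9 to about 0.4 passes through it, so that string would sound.
print(passed_through(stroke_value(-0.4, -0.5), stroke_value(0.1, 0.3), -0.2))  # True
```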
A storage area 84 of the program storage area 80 stores an operation data acquisition program. By this program, the operation data from the controller 14 is acquired into the main memory 28 via the receiving unit 18 and the controller I/F 38. As mentioned above, the controller 14 transmits operation data in a cycle shorter than one frame of the game apparatus 12 (e.g. 1/60 second). In addition, the sampling cycle of the acceleration sensor 60 in the controller 14 is set to a cycle shorter than one frame of the game apparatus 12 (e.g. 1/200 second). The data transmitted on a single occasion from the controller 14 therefore includes acceleration values from a plurality of detection timings. Thus, in this embodiment, the game apparatus 12 can acquire operation data in which a plurality of pieces of operation information (acceleration, etc.) are included in one frame. The CPU 26 can execute the sound output control process using a plurality of pieces of operation information as required.
A storage area 86 stores a chord sound setting program. This program makes it possible to set chord sounds. The guitar chords that can be set in this embodiment include C, G7, Am, F, Dm, etc. In this embodiment, a chord is selected using the cross key 52a out of the operating switches 52, for example. In addition, the tone of each string is set according to the set chord. That is, the sound of each string is taken as a component of the chord. Accordingly, it is possible to produce chords in the same manner as a real guitar does.
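As a concrete illustration of assigning a tone to each string according to the set chord, the sketch below maps chord names to one pitch per string. The MIDI note numbers and the open-position voicings are assumptions for illustration; the specification does not define the actual tone data.

```python
# Illustrative mapping from chord name to one MIDI note per string
# (index 0 = 6th string ... index 5 = 1st string); None = string not sounded.
# Standard open-position guitar voicings are assumed for illustration.
CHORD_TONES = {
    "C":  [None, 48, 52, 55, 60, 64],    # x-C-E-G-C-E
    "G7": [43, 47, 50, 55, 59, 65],      # G-B-D-G-B-F
    "Am": [None, 45, 52, 57, 60, 64],    # x-A-E-A-C-E
    "F":  [None, None, 53, 57, 60, 65],  # x-x-F-A-C-F (small-barre shape)
    "Dm": [None, None, 50, 57, 62, 65],  # x-x-D-A-D-F
}

def set_chord(chord_name):
    """Return the tone (pitch) assigned to each string for the set chord,
    keyed by string number 6..1, mirroring the tone data storage area."""
    tones = CHORD_TONES[chord_name]
    return {6 - i: note for i, note in enumerate(tones)}

print(set_chord("C"))   # {6: None, 5: 48, 4: 52, 3: 55, 2: 60, 1: 64}
```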
A storage area 88 stores a sound emission control program. This program controls the presence or absence of sound output. More specifically, in this embodiment, the sound output is switched on or off depending on whether the A button 52d out of the operating switches 52 is pressed or not. That is, a sound is output when the A button 52d is pressed, and no sound is output when it is not pressed. Therefore, the user can choose to produce or not to produce the sound of a predetermined string by pressing or releasing the A button 52d during the operation. Conversely, in another embodiment, the sound may be muted when the A button 52d is pressed.
A storage area 90 stores a tone selection program. This program makes it possible to select the tone to be output according to a change in acceleration. Specifically, this program determines whether or not a change in the stroke value has formed the predetermined relationship with the threshold value associated with each string. When the predetermined relationship has been established, the sound set for that string is selected. In practice, it is determined whether or not the threshold value of any string lies between the stroke value of the current frame and the stroke value of the previous frame; that is, it is investigated which string has been plucked between the previous frame and the current frame. For a normal downstroke, it is determined whether or not the threshold value of each string is equal to or more than the stroke value of the previous frame and less than the stroke value of the current frame. For an upstroke, it is determined whether or not the threshold value of each string is equal to or more than the stroke value of the current frame and less than the stroke value of the previous frame.
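The determination just described can be sketched as follows; the inequalities mirror the downstroke and upstroke conditions above, while the threshold values themselves are illustrative assumptions.

```python
def plucked_strings(prev_stroke, curr_stroke, thresholds, allow_upstroke=True):
    """Return the string numbers judged to have been plucked between the
    previous and current frames, using the comparisons described above.

    `thresholds` maps string number -> threshold value; the values used
    below are illustrative, not taken from the specification.
    """
    plucked = []
    for string, t in thresholds.items():
        # Downstroke: threshold is >= previous stroke value and < current one.
        if prev_stroke <= t < curr_stroke:
            plucked.append(string)
        # Upstroke: threshold is >= current stroke value and < previous one.
        elif allow_upstroke and curr_stroke <= t < prev_stroke:
            plucked.append(string)
    return plucked

THRESHOLDS = {6: -1.0, 5: -0.6, 4: -0.2, 3: 0.2, 2: 0.6, 1: 1.0}
print(plucked_strings(-0.8, 0.4, THRESHOLDS))   # downstroke: [5, 4, 3]
print(plucked_strings(0.4, -0.8, THRESHOLDS))   # upstroke:   [5, 4, 3]
```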
A storage area 92 stores a sound output program. By this program, control data for sound output is generated and a sound is output based on the control data.
A storage area 94 of the data storage area 82 is an operation data buffer that stores operation data transmitted from the controller 14. As stated above, since operation data including a plurality of pieces of operation information is received from the controller 14 at least once within one frame of the game apparatus 12, the received operation data is sequentially stored in the storage area 94. The operation data includes acceleration data indicating the accelerations along the X, Y and Z axes detected by the acceleration sensor 60 and button operation data indicating the presence or absence of each button operation on the operating part 52.
A storage area 96 stores a historical record of acceleration. This area holds the values of accelerations for a predetermined number of frames. In this embodiment, since the acceleration values of X and Z axes are used, the accelerations of X and Z axes are obtained from the operation data buffer and stored in the storage area 96. Since a plurality of acceleration values are obtained from one frame as stated above, the mean value of the plurality of values may be taken. Alternatively, the maximum value or the minimum value may be employed.
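A short sketch of collapsing the several samples of one frame into a single per-frame value, using the mean, maximum or minimum as mentioned above; the function and variable names are illustrative.

```python
from statistics import mean

def per_frame_acceleration(samples, how="mean"):
    """Collapse the several (ax, ay, az) samples received during one game
    frame into single X-axis and Z-axis values for the history record.

    `how` selects the aggregation mentioned above: the mean of the values,
    or alternatively the maximum or minimum.  Names are illustrative.
    """
    pick = {"mean": mean, "max": max, "min": min}[how]
    ax = pick(s[0] for s in samples)
    az = pick(s[2] for s in samples)
    return ax, az

samples = [(0.25, -1.0, 0.5), (0.75, -1.0, 1.0), (0.5, -1.0, 1.5)]
print(per_frame_acceleration(samples))          # (0.5, 1.0)
print(per_frame_acceleration(samples, "max"))   # (0.75, 1.5)
```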
A storage area 98 stores button operation information that indicates a button being used in the current frame, based on the button operation data obtained from the operation data buffer 94.
A storage area 100 stores a historical record of stroke value. The stroke value represents the sum of the acceleration values in the X-axis direction and the Z-axis direction, as mentioned above. The storage area 100 stores the stroke values of at least the current and previous frames.
A storage area 102 stores a string threshold value table read from the optical disc 20. The string threshold value table holds the threshold values indicative of the positions of the strings for comparison with a change in stroke value. As shown in
A storage area 104 stores a set chord. For instance, the chord is C by default and is changed according to the operation of the cross key 52a. As an example, the portion of the cross key 52a designating the upward direction is associated with G7, the portion for the downward direction with F, the portion for the rightward direction with Am, and the portion for the leftward direction with Dm. When any of those direction-designating portions is pressed down, the chord associated with that portion is selected. Also, when the portion indicating the direction opposite to that of the selected chord is pressed down, the chord is deselected and the default C is selected.
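One possible reading of this chord selection logic is sketched below; the direction names and the handling of the "opposite direction" rule are assumptions for illustration.

```python
# Cross-key direction -> chord, as described above; pressing the direction
# opposite to the currently selected chord restores the default C.
DIRECTION_TO_CHORD = {"up": "G7", "down": "F", "right": "Am", "left": "Dm"}
OPPOSITE = {"up": "down", "down": "up", "left": "right", "right": "left"}
DEFAULT_CHORD = "C"

def update_chord(current_chord, pressed_direction):
    """Return the newly set chord after a cross-key press (illustrative)."""
    if pressed_direction is None:
        return current_chord
    # Which direction, if any, selected the current chord?
    current_direction = next(
        (d for d, c in DIRECTION_TO_CHORD.items() if c == current_chord), None
    )
    if current_direction is not None and OPPOSITE[pressed_direction] == current_direction:
        return DEFAULT_CHORD          # opposite direction pressed: deselect
    return DIRECTION_TO_CHORD[pressed_direction]

chord = DEFAULT_CHORD
chord = update_chord(chord, "up")     # -> "G7"
chord = update_chord(chord, "down")   # opposite of "up" -> back to "C"
print(chord)                          # C
```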
A storage area 106 stores tone data. The tone data indicates the tone of each string. The storage area 106 stores data for designating sound data on tone of each string constituting the chord selected according to the chord setting.
The processes of the succeeding steps S3 to S9 are executed frame by frame. The CPU 26 acquires acceleration information and button operation information in the step S3. More specifically, the CPU 26 reads operation data from the operation data buffer 94, acquires the acceleration values in the X-axis and Z-axis directions and stores them in the acceleration historical record storage area 96, and stores the operation information of each button 52 in the button operation information storage area 98.
Subsequently, the CPU 26 executes a playing process in a step S5. By the playing process, sound is output according to the user's stroke operation of the controller 14. One example of operation of the playing process is provided in detail in
In a step S21 of
Then, in a step S23, the CPU 26 sets the tone of each string based on the set chord, and stores the information indicative of the tone of each string in the tone data storage area 106.
In a step S25, the CPU 26 reads the acceleration values of the current frame in the X-axis and Z-axis directions, and calculates a stroke value by adding those values together. The calculated stroke value of the current frame is stored in the stroke value historical record storage area 100.
Subsequently, in a step S27, the CPU 26 determines whether the A button has been pressed or not, based on the data in the button operation information storage area 98. That is, the CPU 26 determines whether the user has designated sound output or not. If “NO” in the step S27, that is, if the user does not intend to output any sound, the CPU 26 terminates the playing process and then returns the process to the step S7 of
On the other hand, if “YES” in the step S27, the CPU 26 determines whether each string has been plucked or not, based on a change in the stroke value. That is, in a step S29, the CPU 26 sets an initial value in a variable N that designates the target string (“6”, indicating the sixth string, in this embodiment).
Subsequently, in a step S31, the CPU 26 reads the threshold value of the string corresponding to the value N from the threshold value table storage area 102. Then, in a step S33, the CPU 26 determines whether or not the threshold value of the N-th string is equal to or more than the stroke value of the previous frame and is less than the stroke value of the current frame. Here, the CPU 26 determines whether or not the stroke value has passed through the threshold value of the N-th string from bottom up, that is, whether or not the N-th string has been plucked by a downstroke from top down. If “YES” in the step S33, the CPU 26 moves the process to a step S39 to output the sound of the N-th string.
On the other hand, if “NO” in the step S33, the CPU 26 determines in a step S35 whether it is possible to perform an upstroke or not. This determination is made on the basis of the flag indicating the possibility of an upstroke stored in the data storage area 82. If the sound output is to be permitted in an upstroke as well as a downstroke, the above mentioned flag is provided with information indicative of the permission at a time of initial setting, for example. The setting on the possibility of an upstroke can be changed on a setting screen or the like not shown, by the user's manipulation of the operating button 52.
If “YES” in the step S35, the CPU 26 determines in a step S37 whether or not the threshold value of the N-th string is equal to or more than the stroke value of the current frame and is less than the stroke value of the previous frame. Here, the CPU 26 determines whether or not the stroke value has passed through the threshold value of the N-th string from top down, that is, whether or not the N-th string has been plucked by an upstroke from bottom up. If “YES” in the step S37, the CPU 26 moves the process to the step S39 to output the sound of the N-th string.
On the other hand, if “NO” in the step S37, that is, if the N-th string as a process target has not been plucked, the CPU 26 moves the process to a step S45 to exercise control over the next string. In addition, if “NO” in the step S35 as well, the CPU 26 moves the process to the step S45.
In the step S39, the CPU 26 sets the sound volume from the sum of the acceleration values in the X-axis and Z-axis directions. More specifically, the CPU 26 uses the current and previous stroke values, each being the sum of the X-axis and Z-axis accelerations, to calculate the sound volume based on the absolute value of the difference between the stroke value of the current frame and the stroke value of the previous frame. Since the stroke value indicates the “position” of the stroke operation and a difference in the “position” represents the “speed” of the stroke operation, this “speed” is converted to a scale of sound volume. The more quickly the stroke operation is performed, the larger the sound volume becomes.
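A minimal sketch of this volume calculation; the gain factor and the 0-to-1 volume scale are assumptions, since the specification only states that a faster stroke produces a larger volume.

```python
def stroke_volume(prev_stroke, curr_stroke, gain=0.5, max_volume=1.0):
    """Convert stroke 'speed' (change in stroke value between frames) into a
    sound volume, as described above.  The gain and the 0..1 scale are
    illustrative assumptions."""
    speed = abs(curr_stroke - prev_stroke)
    return min(max_volume, gain * speed)

print(stroke_volume(-0.8, 0.4))   # moderate stroke -> about 0.6
print(stroke_volume(-1.5, 1.5))   # fast stroke     -> clipped to 1.0
```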
In a step S41, the CPU 26 generates control data for outputting the sound of the N-th string. More specifically, the CPU 26 reads the tone data set to the N-th string from the storage area 106, and generates control data including information for designating output of the tone at a set sound volume.
Then, in a step S43, the CPU 26 outputs the sound of the N-th string. More specifically, the CPU 26 provides the DSP 34 via the memory controller 30 with the control data for outputting the sound of the N-th string. According to the control data, the DSP 34 uses sound waveform data stored in the ARAM 36 to generate data for outputting the sound, and provides the data to the audio I/F 44 via the memory controller 30. The audio I/F 44 provides the speaker 24 with an audio signal for outputting the sound, based on the data for outputting the sound. This allows the sound of the N-th string to be output from the speaker 24.
In a step S45, the CPU 26 subtracts 1 from the value of the variable N and sets the next string as a process target. Then, in a step S47, the CPU 26 determines whether the value of the variable N has reached 0 or not. That is, the CPU 26 determines whether comparison of all the threshold values of the strings to a change in stroke value has been completed or not. If “NO” in the step S47, the CPU 26 returns the process to the step S31 to perform a process on the next string. As stated above, the plucked string is identified according to a change in stroke value, and the sound of the string is output. On the other hand, if “YES” in the step S47, the CPU 26 terminates the playing process and returns the process to the step S7 of
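Putting steps S21 through S47 together, one frame of the playing process might look like the following self-contained sketch. The threshold values, chord tones, gain, and the play_tone stand-in are illustrative assumptions; only the control flow follows the steps described above.

```python
# Self-contained sketch of the per-frame playing process (steps S21-S47).
THRESHOLDS = {6: -1.0, 5: -0.6, 4: -0.2, 3: 0.2, 2: 0.6, 1: 1.0}   # illustrative
CHORD_TONES = {"C": {6: None, 5: 48, 4: 52, 3: 55, 2: 60, 1: 64}}  # illustrative

def play_tone(note, volume):
    # Hypothetical stand-in for handing control data to the DSP / audio I/F.
    print(f"note {note} at volume {volume:.2f}")

def playing_process(chord, ax, az, prev_stroke, a_pressed, allow_upstroke=True):
    """One frame of the playing process; returns the new 'previous' stroke value."""
    tones = CHORD_TONES[chord]                 # S21/S23: chord and string tones
    stroke = ax + az                           # S25: stroke value of this frame
    if not a_pressed:                          # S27: no sound output requested
        return stroke
    for n in range(6, 0, -1):                  # S29/S45/S47: strings 6 down to 1
        t = THRESHOLDS[n]                      # S31: threshold of the N-th string
        down = prev_stroke <= t < stroke       # S33: plucked by a downstroke?
        up = allow_upstroke and stroke <= t < prev_stroke   # S35/S37: upstroke?
        if (down or up) and tones[n] is not None:
            volume = min(1.0, 0.5 * abs(stroke - prev_stroke))   # S39 (gain assumed)
            play_tone(tones[n], volume)        # S41/S43: generate and output sound
    return stroke

# Two successive frames of a downstroke with the A button held:
prev = playing_process("C", ax=-0.4, az=-0.5, prev_stroke=-1.2, a_pressed=True)
prev = playing_process("C", ax=0.3, az=0.4, prev_stroke=prev, a_pressed=True)
```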
In the step S7 of
In the step S9, the CPU 26 determines whether or not to end the simulative playing. For example, the CPU 26 determines whether any operating switch 52 for designating the end of the simulative playing has been pressed or not, based on the button operation information. If “NO” in the step S9, the CPU 26 returns the process to the step S3 to continue the sound output control according to the user's stroke operation. If “YES”, the CPU 26 ends the sound output control process.
According to this embodiment, the controller 14 to be operated by the user is provided with the acceleration sensor 60 for detecting accelerations along at least two axes, and a sound is controlled according to a change in the sum of the accelerations along the two axes detected by the acceleration sensor 60, that is, according to the state of the stroke operation. This allows simulative playing by performing a wristy operation like stroke performance on a guitar. Because the sound output can be controlled according to the state of an operation such as the swing of the controller 14, this also allows performances that cannot be reproduced by a simple pointing operation, making it possible to output entertaining sounds that were not possible before.
In another embodiment, in the display process in the step S7 of
However, since the sound of each string can be output according to a change in the stroke value by moving the controller 14 as if carrying out stroke performance, even a first-time user can easily grasp a sense of operation for successful sound output by performing a stroke operation with the controller 14 several times. Therefore, the screen display as shown in
Since the position of the pick icon 110 indicates the position of the stroke operation in the screen of
Moreover, as shown in
In addition, the sound output control apparatus 10 structured as a guitar simulative playing system has been described in relation to each of the above-mentioned embodiments. However, as a matter of course, the sound output control apparatus 10 also allows simulative stroke performance on stringed instruments other than a guitar whose strings are plucked with the fingers, such as a ukulele or a sitar.
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.
Number | Date | Country | Kind |
---|---|---|---
2006-124830 | Apr 2006 | JP | national |