The present invention relates to a music game apparatus which displays images following the motion of an operation article and the related arts.
A music conducting game apparatus is disclosed in Patent document 1 (Japanese Patent Published Application No. 2002-263360). This music conducting game apparatus is provided with a phototransmitter unit at the tip of a baton controller, and a photoreceiver unit in a lower position of a monitor. The motion of the baton controller is detected by such a configuration.
When a game is started, an operation guidance image is displayed on the monitor in order to instruct the direction and timing of swinging the baton controller while the sound of music performance is output. This sound of performance is output irrespective of the manipulation of the baton controller. On the other hand, a baton responsive sound is output only when the baton controller is manipulated in accordance with the direction and timing as instructed. This baton responsive sound corresponds to fragments into which a certain performance part is divided by a predetermined length. As a result, each time the player manipulates the baton controller in accordance with the direction and timing as instructed, the corresponding baton responsive sound is output.
Patent document 2 (Japanese Patent Published Application No. Hei 10-143151) discloses a conducting apparatus. In this conducting apparatus, while a mouse is manipulated in the same manner as a baton, music parameters such as a tempo, an accent and dynamics are calculated with reference to the trajectory of the mouse. Then, the music parameters as calculated are reflected in the music and image as output. For example, in the case where the motion picture of a steam train is displayed, the speed of the steam train is controlled to follow the tempo as calculated, the variation of the speed is controlled to follow the accent as calculated, and the amount of smoke of the steam train is controlled to follow the dynamics as calculated.
As explained above, the main purpose of the music conducting game apparatus of Patent document 1 is apparently that the player performs the music. Likewise, in the conducting apparatus of Patent document 2, since the main purpose is that the player performs the music, the moving information of the mouse is converted into music parameters which are then reflected in the music and image as output.
As has been discussed above, in the case of the conventional apparatuses whose main purpose is the playing of music by the player, the image which is displayed (a steam train in the case of the above example) is not interesting enough, and little importance is attached to providing images that the player can enjoy.
Furthermore, with respect to the baton controller and the mouse which are the operation articles manipulated by the player, there are the following facts. The baton controller of Patent document 1 is provided with the phototransmitter unit, and thereby it is indispensable to use an electronic circuit. Accordingly, the cost of the baton controller rises, and the circuit can be a cause of trouble. Still further, the manipulability is degraded. Particularly, since the baton controller is used by swinging, it is desirable to dispense with an electronic circuit and simplify the configuration. In addition to this, the mouse of Patent document 2 can be moved only on a plane surface so that there are substantial restrictions on the manipulation, and moreover it has the same problems as the baton controller of Patent document 1.
Accordingly, it is an object of the present invention to provide a music game apparatus and the related arts in which the player can enjoy images, which are displayed in synchronization with the manipulation of an operation article, together with music by manipulating the operation article having a simple structure, while automatically playing the music without relation to the player.
In accordance with an aspect of the present invention, a music game apparatus operable to automatically play music, comprises: a stroboscope operable to irradiate an operation article manipulated by a player with light in a predetermined cycle; an imaging unit operable to generate a lighted image signal and an unlighted image signal by capturing images of the operation article respectively when said stroboscope is lighted and unlighted; a differential signal generating unit operable to generate a differential signal between the lighted image signal and the unlighted image signal; a state information calculating unit operable to calculate the state information of the operation article on the basis of the differential signal; a guide control unit operable to control the display of a guide for the manipulation of a cursor, which moves in association with the operation article, in a timing on the basis of the music; a cursor control unit operable to control the display of the cursor on the basis of the state information of the operation article; and a follow-up image control unit operable to control the display of an image in accordance with guidance by the guide when the cursor is correctly manipulated by the operation article in correspondence with the guide, wherein said follow-up image control unit determines whether or not the cursor is correctly manipulated by the operation article in correspondence with the guide, on the basis of the state information of the operation article and the information about the guide.
In accordance with this configuration, if the cursor is correctly manipulated in correspondence with the guide, the display of the image is controlled in accordance with the guidance by the guide. In this case, since the cursor is manipulated in correspondence with the guidance by the guide, the display of the image is controlled in accordance with the manipulation of the cursor. In other words, since the cursor moves in association with the operation article, the display of the image is controlled in accordance with the manipulation of the operation article. The state information of the operation article is obtained by capturing the image of the operation article, which is intermittently lighted by the stroboscope. Because of this, no circuit which is driven by a power supply need be provided within the operation article for obtaining the state information of the operation article. Furthermore, this music game apparatus serves to automatically play music.
As a result, while automatically playing music without relation to the player, the player can enjoy, together with the music, images which are displayed in synchronization with the manipulation of the operation article by manipulating the operation article having a simple structure.
Also, since the guide is controlled in the timing on the basis of music, the operation article is manipulated in synchronization with music as long as the player manipulates the cursor in correspondence with the guide. Accordingly, the player can enjoy the manipulation of the operation article in synchronization with music.
In this case, the “manipulation” of the operation article means moving the operation article itself (for example, changing the position thereof), but does not mean pressing a switch, moving an analog stick, and so forth.
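For illustration only, the per-frame flow implied by this configuration can be sketched in C as follows; every identifier, the 32×32 image size and the rectangular guide area are assumptions introduced for the sketch and do not form part of the configuration described above.

```c
/* Illustrative sketch only: one frame of the flow described above.  All
 * names, the 32 x 32 sensor size and the rectangular guide area are
 * assumptions made for this example, not an actual implementation.        */
#include <stdbool.h>

#define W 32
#define H 32

typedef struct { int x, y; } StateInfo;   /* state information (here: a position) */

/* stand-in for the imaging unit; a real implementation is hardware-specific */
static void capture(unsigned char img[H][W], bool strobe_lighted)
{
    (void)img; (void)strobe_lighted;      /* lighted / unlighted image signal */
}

/* differential signal -> state information of the operation article */
static StateInfo state_from_differential(unsigned char lit[H][W],
                                         unsigned char unlit[H][W])
{
    StateInfo s = { 0, 0 };
    int best = -1;
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++) {
            int d = (int)lit[y][x] - (int)unlit[y][x];
            if (d > best) { best = d; s.x = x; s.y = y; }
        }
    return s;
}

void game_frame(bool guide_active, int gx0, int gy0, int gx1, int gy1)
{
    static unsigned char lit[H][W], unlit[H][W];

    capture(lit, true);                   /* image with the stroboscope lighted   */
    capture(unlit, false);                /* image with the stroboscope unlighted */

    StateInfo s = state_from_differential(lit, unlit);

    /* cursor control: the cursor is displayed at a position derived from s */

    /* follow-up image control: if the cursor is inside the guided area while
       the guide (timed to the music) is active, the follow-up image is
       controlled in accordance with the guidance.                           */
    if (guide_active &&
        s.x >= gx0 && s.x <= gx1 && s.y >= gy0 && s.y <= gy1) {
        /* control the display of the follow-up image (omitted) */
    }
}
```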
In the above music game apparatus, the guide is operable to guide the cursor to a destination position in a manipulation timing, and wherein said follow-up image control unit is operable to control the display of the image in correspondence with the direction of the destination position as guided by the guide when the cursor is correctly manipulated by the operation article in correspondence with the guide.
In accordance with this configuration, when the player manipulates the operation article in order to move the cursor to the destination position guided by the position guide in the manipulation timing guided by the guide, the display of images is controlled in correspondence with the direction toward the destination position of the cursor guided by the guide. As a result, it is possible to enjoy, together with music, the images which are synchronized with the cursor which is moving in association with the motion of the operation article.
In the above music game apparatus, said state information calculating unit is operable to calculate the position of the operation article as the state information on the basis of the differential signal, and wherein said follow-up image control unit is operable to determine that the cursor, which moves in association with the operation article, is correctly manipulated in correspondence with the guide if the position of the operation article as calculated by said state information calculating unit is located in an area corresponding to the guidance by the guide within a period corresponding to the guidance by the guide.
In accordance with this configuration, it is possible to determine the correctness of the manipulation of the cursor on the basis of the position of the operation article which can be calculated by a simple process.
In the above music game apparatus, the guide is operable to guide the moving path, moving direction and manipulation timing of the cursor.
In accordance with this configuration, when the player manipulates the operation article in order to move the cursor in the manipulation timing guided by the guide, in the moving direction guided by the guide and along the moving path guided by the guide, the display of images is controlled in correspondence with the guide. As a result, it is possible to enjoy, together with music, the images which are synchronized with the cursor which is moving in association with the motion of the operation article.
In the above music game apparatus, said state information calculating unit is operable to calculate the position of the operation article as the state information on the basis of the differential signal, and wherein said follow-up image control unit is operable to determine that the cursor, which moves in association with the operation article, is correctly manipulated in correspondence with the guide if the position of the operation article as calculated by said state information calculating unit is moved through a plurality of predetermined areas guided by the guide in a predetermined order guided by the guide within a period guided by the guide.
In accordance with this configuration, it is possible to determine the correctness of the manipulation of the cursor on the basis of the position of the operation article which can be calculated by a simple process.
In the above music game apparatus, the guide is displayed in each of a plurality of positions which are determined in advance on a screen, and wherein the guide control unit is operable to change the appearance of the guide in a timing on the basis of the music.
In accordance with this configuration, the player can easily recognize the position and the direction to which the cursor is to be moved with reference to the change of the position guide in appearance.
In the present specification, the appearance of the guide is related to either or both of the shape and color of the guide.
In the above music game apparatus, the guide is expressed in an image with which it is possible to visually recognize the motion from a first predetermined position to a second predetermined position on a screen, and wherein the guide control unit is operable to control the display of the guide in a timing on the basis of the music.
In accordance with this configuration, the player can clearly recognize the direction and path of the cursor to be moved.
For example, the guide is expressed by the change in appearance of a plurality of objects which are arranged in a path having a start point at the first predetermined position and an end point at the second predetermined position on the screen.
In accordance with this configuration, the player can easily recognize the direction and path of the cursor to be moved with reference to the change in appearance of the plurality of objects.
For example, the guide is expressed by an object moving from the first predetermined position to the second predetermined position on the screen.
In accordance with this configuration, the player can easily recognize the direction and path of the cursor to be moved with reference to the motion of the object.
For example, the guide is expressed by the change in appearance of a path having a start point at the first predetermined position and an end point at the second predetermined position on the screen.
In accordance with this configuration, the player can easily recognize the direction and path of the cursor to be moved with reference to the change in appearance of the path.
In the above music game apparatus, the state information of the operation article as calculated by said state information calculating unit is any one of, or any combination of two or more of, speed information, moving direction information, moving distance information, velocity vector information, acceleration information, movement locus information, area information, and positional information.
In accordance with this configuration, since a variety of information can be used as the state information of the operation article for determining whether or not the cursor is correctly manipulated in correspondence with the guides, the possibility of expression of guides is greatly expanded, and thereby the design freedom of the game content is also greatly increased.
The novel features of the invention are set forth in the appended claims. The invention itself, however, as well as other features and advantages thereof, will be best understood by reading the detailed description of specific embodiments in conjunction with the accompanying drawings.
(b) is a timing diagram showing register data.
In what follows, an embodiment of the present invention will be explained in conjunction with the accompanying drawings. Meanwhile, like references indicate the same or functionally similar elements throughout the drawings, and therefore redundant explanation is not repeated.
The housing 19 of the music game apparatus 1 includes an imaging unit 13 therein. The imaging unit 13 includes four infrared light emitting diodes 15 and an infrared filter 17. The light emission units of the infrared light emitting diodes 15 are exposed from the infrared filter 17.
The music game apparatus 1 is supplied with a DC power voltage from an AC adapter 92. Alternatively, a battery cell (not shown in the figure) can be used to supply the DC power voltage in place of the AC adapter 92.
The television monitor 90 includes a screen 91 at the front side thereof. The television monitor 90 and the music game apparatus 1 are connected by an AV cable 93. Incidentally, as illustrated in
When the player 94 turns on the power switch (not shown in the figure) which is provided in the back side of the music game apparatus 1, a game screen is displayed on the screen 91. The player 94 manipulates the operation article 150 in accordance with the guidance of a game screen to run a game. In the present specification, the “manipulation” of the operation article 150 means moving the operation article itself (for example, changing the position thereof), but does not mean pressing a switch, moving an analog stick, and so forth.
The infrared light emitting diodes 15 of the imaging unit 13 intermittently emit infrared light. The infrared light emitted from the infrared light emitting diodes 15 is reflected by the reflection sheet (to be described below) attached to this operation article 150, and input to the imaging device (to be described below) located inside the infrared filter 17. In this way, the image of the operation article 150 is intermittently captured. Accordingly, the music game apparatus 1 can intermittently acquire an image signal of the operation article 150 which is moved by the player 94. The music game apparatus 1 analyzes the image signals and reflects the analysis result in the game. The reflection sheet which is used in the present embodiment is for example a retroreflective sheet.
As illustrated in
The image sensor 43 is a low resolution CMOS image sensor (for example, 32 pixels×32 pixels: gray scale). However, this image sensor 43 may be replaced with an image sensor having a larger number of pixels, a CCD, or the like. In the following explanation, it is assumed that the image sensor 43 comprises 32 pixels×32 pixels.
In addition, a plurality (four in this embodiment) of the infrared light emitting diodes 15 is attached to the unit base 35 so that their light output directions are directed upward. Infrared light is emitted to an area over the imaging unit 13 by these infrared light emitting diodes 15. In addition, the infrared filter 17 (a filter capable of passing only infrared light therethrough) is attached to the upper portion of the unit base 35 in order to cover the above opening 41. Then, the infrared light emitting diodes 15 are repeatedly turned on (lighted) and off (non-lighted) in a continuous manner, as will be described below, so that they serve as a stroboscope. Here, the “stroboscope” is a generic term used to refer to a device serving to intermittently irradiate a moving object. Accordingly, the above image sensor 43 serves to capture an image of an object which is moving in the scope of imaging, i.e., the operation article 150 in the case of the embodiment. Incidentally, as illustrated in
In this case, the imaging unit 13 is incorporated in the housing 19 in order that the light receiving surface of the image sensor 43 is inclined from the horizontal surface at a predetermined angle (for example, 90 degrees). Also, the scope of imaging of the image sensor 43 is for example within 60 degrees as determined by the concave lens 39 and the convex lens 37.
The high speed processor 200 is connected to the bus 53. Furthermore, the ROM 51 is connected to the bus 53. Accordingly, the high speed processor 200 can access the ROM 51 through the bus 53 to read and execute a game program as stored in the ROM 51, and read and process image data and music data as stored in the ROM 51 in order to generate a video signal and an audio signal, which are then output through the video signal output terminal 47 and the audio signal output terminal 49 respectively.
The operation article 150 is irradiated with infrared light emitted from the infrared light emitting diodes 15, and reflects the infrared light by the reflection sheet 155. The image sensor 43 detects the reflected light from this retroreflective sheet 155, and outputs an image signal which includes an image of the retroreflective sheet 155. The analog image signal output from the image sensor 43 is converted into digital data by an A/D converter (to be described below) incorporated in the high speed processor 200. This process is performed also in the periods without infrared light. The high speed processor 200 analyzes this digital data, and reflects the analysis result in the game processing.
The CPU 201 takes control of the entire system and performs various types of arithmetic operations in accordance with the program stored in the memory (the internal memory 207, or the ROM 51). The CPU 201 is a bus master of the first bus 218 and the second bus 219, and can access the resources connected to the respective buses.
The graphics processor 202 is also a bus master of the first bus 218 and the second bus 219, and generates a video signal on the basis of the data as stored in the internal memory 207 or the ROM 51, and outputs the video signal through the video signal output terminal 47. The graphics processor 202 is controlled by the CPU 201 through the first bus 218. Also, the graphics processor 202 has the functionality of outputting an interrupt request signal 220 to the CPU 201.
The sound processor 203 is also a bus master of the first bus 218 and the second bus 219, and generates an audio signal on the basis of the data as stored in the internal memory 207 or the ROM 51, and outputs the audio signal through the audio signal output terminal 49. The sound processor 203 is controlled by the CPU 201 through the first bus 218. Also, the sound processor 203 has the functionality of outputting an interrupt request signal 220 to the CPU 201.
The DMA controller 204 serves to transfer data from the ROM 51 to the internal memory 207. Also, the DMA controller 204 has the functionality of outputting, to the CPU 201, an interrupt request signal 220 indicative of the completion of the data transfer. The DMA controller 204 is also a bus master of the first bus 218 and the second bus 219. The DMA controller 204 is controlled by the CPU 201 through the first bus 218.
The internal memory 207 may be implemented with one or any necessary combination of a mask ROM, an SRAM (static random access memory) and a DRAM in accordance with the system requirements. A battery 217 is provided if an SRAM has to be powered by the battery for maintaining the data contained therein. In the case where a DRAM is used, a so-called refresh cycle is periodically performed to maintain the data contained therein.
The first bus arbiter circuit 205 accepts a first bus use request signal from the respective bus masters of the first bus 218, performs bus arbitration among the requests for the first bus 218, and issues a first bus use permission signal to one of the respective bus masters. Each bus master is permitted to access the first bus 218 after receiving the first bus use permission signal. Here, the first bus use request signal and the first bus use permission signal are shown as the first bus arbitration signals 222 in
The second bus arbiter circuit 206 accepts a second bus use request signal from the respective bus masters of the second bus 219, performs bus arbitration among the requests for the second bus 219, and issues a second bus use permission signal to one of the respective bus masters. Each bus master is permitted to access the second bus 219 after receiving the second bus use permission signal. Here, the second bus use request signal and the second bus use permission signal are shown as the second bus arbitration signals 223 in
The input/output control circuit 209 serves to perform the communication with an external input/output device(s) and/or an external semiconductor device(s) through input/output signals. The read and write operations of the input/output signals are performed by the CPU 201 through the first bus 218. Also, the input/output control circuit 209 has the functionality of outputting an interrupt request signal 220 to the CPU 201.
This input/output control circuit 209 outputs an LED control signal LEDC for controlling the infrared light emitting diodes 15.
The timer circuit 210 has the functionality of periodically outputting an interrupt request signal 220 to the CPU 201 on the basis of a time interval as preset. The settings such as the time interval are performed by the CPU 201 through the first bus 218.
The ADC 208 converts analog input signals into digital signals. The digital signals are read by the CPU 201 through the first bus 218. Also, the ADC 208 has the functionality of outputting an interrupt request signal 220 to the CPU 201.
This ADC 208 receives pixel data (analog) from the image sensor 43 and converts it into digital data.
The PLL circuit 214 generates a high frequency clock signal by multiplication of the sinusoidal signal as obtained from a quartz oscillator 216.
The clock driver 213 amplifies the high frequency clock signal as received from the PLL circuit 214 to a sufficient signal level to supply the respective blocks with the clock signal 225.
The low voltage detection circuit 215 monitors the power potential Vcc and issues the reset signal 226 of the PLL circuit 214 and the reset signal 227 to the other circuit elements of the entire system when the power potential Vcc falls below a certain voltage. Also, in the case where the internal memory 207 is implemented with an SRAM requiring the power supply from the battery 217 for maintaining data, the low voltage detection circuit 215 serves to issue a battery backup control signal 224 when the power potential Vcc falls below the certain voltage.
The external memory interface circuit 212 has the functionality of connecting the second bus 219 to the external bus 53 and the functionality of controlling the bus cycle length of the second bus by issuing a cycle end signal 228.
The DRAM refresh cycle control circuit 211 periodically and unconditionally gets the ownership of the first bus 218 to perform the refresh cycle of the DRAM at a certain interval. Needless to say, the DRAM refresh cycle control circuit 211 is provided in the case where the internal memory 207 includes a DRAM.
In what follows, with reference to
Referring to
The middle point of the analog pixel data D (X, Y) as described above is determined by a reference voltage given to a reference voltage terminal Vref of the image sensor 43. For this reason, in association with the image sensor 43, for example, a reference voltage generation circuit 59 made of a resistance voltage divider is provided in order to supply a reference voltage which is always kept at a certain level to the reference voltage terminal Vref.
The respective digital signals for controlling the image sensor 43 are input to and output from the high speed processor 200 through the I/O ports thereof. These I/O ports are digital ports capable of controlling input and output operations and connected to the input/output control circuit 209 inside of this high speed processor 200.
More specifically speaking, a reset signal “reset” is output to the image sensor 43 from the I/O port of the high speed processor 200 for resetting the image sensor 43. In addition, a pixel data strobe signal PDS and a frame status flag signal FSF are output from the image sensor 43, and supplied to the input ports of the high speed processor 200.
The pixel data strobe signal PDS is a strobe signal as shown in
Also, the high speed processor 200 outputs, from the I/O ports, a command (or a command associated with data) to be set in a control register (not shown in the figure) of the image sensor 43, outputs a register setting clock CLK which periodically and alternately takes high and low levels, and supplies the register setting clock CLK to the image sensor 43.
Incidentally, the infrared light emitting diodes 15 as used are four infrared light emitting diodes 15a, 15b, 15c and 15d which are connected in parallel with each other as illustrated in
These infrared light emitting diodes 15 are turned on (lighted) and off (non-lighted) by the LED drive circuit 75. The LED drive circuit 75 receives the frame status flag signal FSF as described above from the image sensor 43, and this frame status flag signal FSF is passed through a differentiating circuit 67, which is made up of a resistor 69 and a capacitor 71, and given to the base of the PNP transistor 77. The base of this PNP transistor 77 is connected further to a pull-up resistor 79 which usually pulls up the base of the PNP transistor 77 to a high level. Then, when the frame status flag signal FSF is pulled down to a low level, the low level signal is input to the base through the differentiating circuit 67 so that the PNP transistor 77 is turned on only for the low level period of the frame status flag signal FSF.
The emitter of the PNP transistor 77 is grounded through resistors 73 and 65. On the other hand, the connecting point between the emitter resistors 73 and 65 is connected to the base of an NPN transistor 81. The collector of this NPN transistor 81 is connected commonly to the anodes of the respective infrared light emitting diodes 15a to 15d. The emitter of the NPN transistor 81 is connected directly to the base of another NPN transistor 61. The collector of the NPN transistor 61 is connected commonly to the cathodes of the respective infrared light emitting diodes 15a to 15d, while the emitter of the NPN transistor 61 is grounded.
This LED drive circuit 75 turns on the infrared light emitting diodes 15a to 15d only within the period when the LED control signal LEDC which is output from the I/O port of the high speed processor 200 is activated (in a high level) while the frame status flag signal FSF which is output from the image sensor 43 is in a low level.
When the frame status flag signal FSF is pulled down to the low level as shown in
The LED drive circuit 75 turns on the infrared light emitting diodes 15 only during the period when the LED control signal LEDC is activated as shown in
Accordingly, useless power consumption can be restricted. Furthermore, since the frame status flag signal FSF is given also to the coupling capacitor 71, the transistor 77 is necessarily turned off after a certain period even when the flag signal FSF is fixed at a low level due to the runaway of the image sensor 43 or the like, so that the infrared light emitting diodes 15 are also necessarily turned off after the certain period.
It is therefore possible to arbitrarily and freely change the exposure period of the image sensor 43 by adjusting the mark duration of the frame status flag signal FSF.
Furthermore, the lighting period, non-lighting period, cycles of lighting/non-lighting period and so forth of the infrared light emitting diodes 15, i.e., of the stroboscope can be arbitrarily and freely set and changed by adjusting the mark durations and the frequencies of the frame status flag signal FSF and LED control signal LEDC.
As has been discussed above, when the operation article 150 is irradiated with the infrared light emitted from the infrared light emitting diodes 15, the image sensor 43 is exposed to the light reflected from the operation article 150. In response to this, the above pixel data D (X, Y) is output from the image sensor 43. More specifically speaking, during the period in which the frame status flag signal FSF as shown in
The high speed processor 200 acquires digital pixel data through the ADC 208 while monitoring the frame status flag signal FSF and the pixel data strobe signal PDS.
In this case, the pixel data is sequentially output as the zeroth line, the first line, . . . and the thirty-first line as illustrated in
Next, the details of the game played with the music game apparatus 1 will be explained with specific examples.
Incidentally, in the case of the present embodiment, the position guides “G1” to “G4” are displayed in the form of blooms, the evaluation objects 107 to 109 are displayed in the form of heart-shaped objects, the dance object 106 is displayed in the form of a male-female pair, and the cursor 105 is displayed in the form of the operation article 150. In the following description, the term “position guides G” is used to generally represent the position guides “G1” to “G4”.
The cursor 105 serves to indicate the position of the operation article 150 on the screen 91, and moves on the screen 91 to follow the motion of the operation article 150. Accordingly, as seen from the player 94, the manipulation of the operation article 150 is equivalent to the manipulation of the cursor 105. The position guide “G” serves to guide the manipulation timing and destination position of the cursor 105 (the operation article 150) in terms of the timings relative to the music which is automatically played. Direction guides “g1” to “g5”, which will be described below, serve to guide the manipulation timing and moving direction of the cursor 105 (the operation article 150) in terms of the timings relative to the music which is automatically played. Path guides “rg1” to “rg10”, which will be described below, serve to guide the manipulation timing, moving direction and moving path of the cursor 105 (the operation article 150) in terms of the timings relative to the music which is automatically played. The evaluation objects 107 to 109 serve to indicate the evaluation of the manipulation of the cursor 105 (the operation article 150) by the player 94 in a visual way. In the following description, the term “direction guides g” is used to generally represent the direction guides “g1” to “g5”. In the same manner, the term “path guides rg” is used to generally represent the path guides “rg1” to “rg10”.
In addition to this, the direction in which the cursor 105 is to be moved is guided also by the direction guides “g1” to “g5”. Particularly, the direction guides “g1” to “g5” sequentially appear in the order that the direction guide “g1” appears first, the direction guide “g2” appears second, the direction guide “g3” appears third, the direction guide “g4” appears fourth, and then the direction guide “g5” appears fifth. Accordingly, the direction in which the cursor 105 is to be moved is guided by the direction in which the direction guides “g1” to “g5” appear in sequence. In this case, while each of the direction guides “g1” to “g5” is displayed as a graphic form representing a small sphere just after it appears, the sphere gradually increases in size as time passes, and when the size is maximized an animation is performed as if the sphere shatters into fragments. Accordingly, the direction toward the graphic form of the sphere which appears is the direction in which the cursor 105 is to be moved.
The player 94 has to move the cursor 105 to the area in which the position guide “G” is displayed within a predetermined period in which the bloom serving as the position guide “G” is opened. In other words, the position guide “G” serves to guide the manipulation timing of the cursor 105 by the animation that the bloom is opened. Also, the player 94 has to move the cursor 105 to the area in which the position guide “G” is displayed as an opening bloom within a predetermined period after the last direction guide “g” appears as the graphic form of the sphere. In other words, the manipulation timing of the cursor 105 is guided also by the direction guide “g”.
In addition to this, the position guide “G” serves also to indicate in advance the manipulation direction of the cursor 105. That is to say, if the bud of the bloom serving as the position guide “G” begins to open, it enables the player 94 to know the direction in which the cursor 105 is to be moved next. Furthermore, the direction guide “g” serves also to indicate in advance the manipulation direction of the cursor 105. Namely, since the direction guide “g” appears in advance of the manipulation timing, the player 94 can know the direction in which the cursor 105 is to be moved next also by the direction guide “g”.
This will be explained with reference to a specific example. In the case of the example as illustrated in
The player 94 has to move the cursor 105 to the area in which the position guide “G2” is displayed within a predetermined period in which the bloom serving as the position guide “G2” is opened. Also, the player 94 has to move the cursor 105 to the area in which the position guide “G2” is displayed as an opening bloom within a predetermined period after the last direction guide “g5” appears as the graphic form of the sphere. In other words, the manipulation timing of the cursor 105 is guided also by the direction guide “g”.
The player 94 appropriately manipulates the operation article 150 in accordance with the instruction by the position guide “G2” and the direction guides “g1” to “g5” in order to move the cursor 105 from the position of the position guide “G1” to the position of the position guide “G2”. As a result, animation is performed such that all the evaluation objects 107 to 109 flash. Incidentally, if the cursor 105 is manipulated at the most appropriate timing, animation is performed such that all the evaluation objects 107 to 109 flash, and if the cursor 105 is manipulated at a timing which is not the most appropriate but within an acceptable range, animation is performed such that only the evaluation object 108 flashes. Meanwhile, each of the position guides “G1”, “G3” and “G4” is displayed in the form of the bud of the bloom because the current time is out of the time slot for guiding the manipulation timing and destination position of the cursor 105. Also, the direction guide “g” does not appear between the position guide “G2” and the position guide “G4”, between the position guide “G4” and the position guide “G3” and between the position guide “G3” and the position guide “G1”, because the current time is out of the time slot for guiding the manipulation timing and destination position of the cursor 105.
When the player 94 appropriately manipulates the cursor 105 in accordance with the guidance given by the position guide “G2” and the direction guides “g1” to “g5”, the animation of dance is performed in the direction corresponding to the moving direction of the cursor 105 (the direction from the position guide “G1” to the position guide “G2”, i.e., the right direction as seen toward the screen 91). For example, the animation of the dance object 106 turning in the counter-clockwise direction is performed, while the background 110 is scrolled in the left direction as seen toward the screen 91. By this process, although the dance object 106 is positioned in the center of the screen 91, it appears that the dance object 106 is turning in the counter-clockwise direction while moving in the right direction.
When the player 94 manipulates the operation article 150 in accordance with the position guides “G1” to “G4” and the path guides “rg1” to “rg10” in order to move the cursor 105 in an appropriate manner, the animation of the dance object 106 (for example, an animation in which the dance object turns widely in the counter-clockwise direction) is performed in correspondence with the path guides “rg1” to “rg10”.
Meanwhile, as discussed above, the object illustrated in each of
In this case, each of the dance object 106, the position guide “G”, the evaluation objects 107 to 109, the cursor 105, the direction guide “g” and the path guide “rg” in the game screens as illustrated in
Next, the scrolling of the background 110 will be explained. First, the background screen will be explained.
In this description, in the case where the block “0” to the block “1023” are generally referred to, they are referred to simply as the “block”; in the case where the array element PA[0] to the array element PA[1023] are generally referred to, they are referred to as the “array element PA”; and in the case where the array element CA[0] to the array element CA[1023] are generally referred to, they are referred to as the “array element CA”.
Incidentally, data (pixel pattern data) for designating the pixel pattern of the corresponding block is assigned to the array element PA. This pixel pattern data consists of the color information of the respective pixels of the 8 pixels×8 pixels for making up a block. On the other hand, the information for designating the color palette and the depth value for use in the corresponding block is assigned to the array element CA. A color palette consists of the predetermined number of color information entries. The depth value indicates the depth position of the pixels, and if a plurality of pixels overlap each other in the same position only the pixel having the largest depth value is displayed.
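The roles of the array elements PA and CA can be pictured with the following C sketch; the concrete types, bit widths and the representation of the pixel pattern as palette indices are assumptions made only for the illustration.

```c
/* Illustrative layout of the background block data.  Only the roles follow
 * the description above (PA = pixel pattern, CA = palette designation and
 * depth value); the concrete types and widths are assumptions.             */
#include <stdint.h>

#define NUM_BLOCKS 1024              /* block "0" .. block "1023"            */
#define BLOCK_W    8                 /* each block is 8 x 8 pixels           */
#define BLOCK_H    8

/* PA[i]: pixel pattern data, i.e. the color information of the 8 x 8
 * pixels making up block i (here modelled as palette indices).             */
typedef struct {
    uint8_t pixel[BLOCK_H][BLOCK_W];
} PixelPattern;

/* CA[i]: designation of the color palette and the depth value of block i.  */
typedef struct {
    uint8_t palette;                 /* which color palette the block uses   */
    uint8_t depth;                   /* larger depth wins when pixels overlap*/
} BlockAttr;

PixelPattern PA[NUM_BLOCKS];
BlockAttr    CA[NUM_BLOCKS];

/* When pixels of two blocks fall on the same screen position, only the one
 * with the larger depth value is displayed.                                 */
int front_block(int a, int b)
{
    return (CA[a].depth >= CA[b].depth) ? a : b;
}
```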
For example, if it is assumed that the portion thereof (hatched portion) scrolled out of the screen 91 consists of the block “64”, the block “96”, . . . , the block “896” and the block “928” of
In order to make the background look smooth and continuous, it is necessary to update the relevant array elements PA and the relevant array elements CA in advance of displaying them at the right edge of the screen 91. In this case, it is necessary to update the relevant array elements PA and the relevant array elements CA while displaying the left edge of the screen 91, and thereby the image near the left edge of the screen 91 becomes incoherent. Accordingly, as illustrated in
Incidentally, the scroll process in the rightward direction is performed in the same manner as the scroll process in the leftward direction. Also, in the case of the present embodiment, since the range of scrolling is limited within ±16 pixels in the longitudinal direction (vertical direction) of the background screen 140, there is no mask at the top and bottom edges of the screen 91.
As has been discussed above, the background 110 is scrolled by scrolling the background screen 140.
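As a purely illustrative sketch, the bookkeeping for such a leftward scroll might be organized as follows, assuming (only for this example) that the 1024 blocks are arranged as 32 columns by 32 rows and that a caller-supplied routine rewrites PA and CA for each block of the column that has just scrolled out of view.

```c
/* Illustrative bookkeeping for the leftward scroll, assuming the 1024 blocks
 * are arranged as 32 columns x 32 rows of 8 x 8 pixels.  The arrangement and
 * the caller-supplied update routine are assumptions for this sketch.       */
#define BLOCKS_PER_ROW 32
#define BLOCK_ROWS     32

/* Rewrites PA[] and CA[] of one block so that, when its column wraps around
 * and reappears at the opposite edge, it already shows the new part of the
 * background (a mask hides the edge columns while this happens).            */
typedef void (*BlockUpdateFn)(int block_index);

/* Update every block of the column that has just scrolled out of view. */
void update_scrolled_out_column(int column, BlockUpdateFn update)
{
    for (int row = 0; row < BLOCK_ROWS; row++)
        update(row * BLOCKS_PER_ROW + column);
}
```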
Next, the details of the game process by the music game apparatus 1 will be explained.
The first musical score data 305 shown in
“Note On” is a command to output sound, and “Wait” is a command to set a waiting time. The waiting time is the time period to elapse before reading the next command (the time period between one musical note and the next musical note). The note number is information for designating the frequency of sound vibration (pitch). The waiting time information item is information for designating a waiting time to be set. The instrument designation information item is information for designating a musical instrument whose tone quality is to be used. The velocity value is information for designating the magnitude of sound, i.e., a sound volume. The gate time is information for designating a period for which a musical note is to be continuously output.
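For illustration, one entry of such musical score data could be encoded as in the following C sketch; the field widths, the time units and the enum values are assumptions and are not taken from the disclosure.

```c
/* Illustrative encoding of one entry of the musical score data; the field
 * widths, time units and enum values are assumptions for this sketch.      */
#include <stdint.h>

typedef enum {
    CMD_NOTE_ON,        /* output a sound                                    */
    CMD_WAIT            /* set a waiting time before the next command        */
} ScoreCommand;

typedef struct {
    ScoreCommand command;
    uint8_t  note_number;   /* designates the pitch (frequency of vibration) */
    uint16_t wait_time;     /* waiting time until the next command (Wait)    */
    uint8_t  instrument;    /* which instrument's tone quality is used       */
    uint8_t  velocity;      /* magnitude of sound, i.e. the sound volume     */
    uint16_t gate_time;     /* how long the musical note is kept sounding    */
} MusicControlInfo;
```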
Returning to
The instrument designation information item of the second musical score data 306 is a number indicating that the second musical score data 306 is the musical score data for displaying guides (the position guide “G”, the direction guide “g” and the path guide “rg”), rather than a number indicating the instrument (tone quality) whose sound is to be output.
Accordingly, “Note On” is not a command to output sound but a command to designate starting the animation of the position guide “G” or designate starting the display of the direction guide “g” and the path guide “rg”. Accordingly, the note number is not a command to designate the frequency of sound vibration (pitch) but information used to designate which of the animations of the position guides “G” is to be started and designate where the direction guide “g” and the path guide “rg” are displayed. This point will be explained in detail.
Meanwhile, for example, the note number “81” is dummy data placed on top of the second musical score data 306 (refer to
Next is the explanation of the main process performed by the high speed processor 200.
[Pixel Data Group Acquisition Process] The CPU 201 acquires digital pixel data by converting analog pixel data which is output from the image sensor 43, and assigns it to the array element P[X][Y]. Meanwhile, it is assumed that the horizontal axis (in the lateral direction or the row direction) of the image sensor 43 is the X-axis and the vertical axis (in the longitudinal direction or the column direction) is the Y-axis.
[Differential Data Calculation Process] The CPU 201 calculates the differential data between the pixel data P[X][Y] acquired when the infrared light emitting diodes 15 are turned on and the pixel data P[X][Y] acquired when the infrared light emitting diodes 15 are turned off, and the differential data is assigned to the array element Dif[X][Y]. In what follows, the advantages of obtaining the differential data will be explained with reference to drawings. In this case, the pixel data represents the luminance value. Accordingly, the differential data also represents the luminance value.
As has been discussed above, the operation article 150 is irradiated with infrared light in order to capture an image by the reflected infrared light which is incident on the image sensor 43 through the infrared filter 17. In the case where an image of the operation article 150 is stroboscopically captured by the use of an ordinary light source in an ordinary indoor environment, an ordinary image sensor (corresponding to the image sensor 43 of
Incidentally, although the image of
Next,
Because of this, the infrared filter 17 is used as illustrated in
For this purpose, the difference is calculated between the pixel data of the image signal with the illumination as shown in
For the reason as described above, the CPU 201 acquires differential data by calculating the difference between the pixel data acquired when the infrared light emitting diodes 15 are turned on and the pixel data acquired when the infrared light emitting diodes 15 are turned off.
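A minimal sketch of this differential-data calculation is given below; the 32×32 size follows the embodiment, while clamping negative values to zero is an assumption made only for the sketch.

```c
/* A minimal sketch of the differential-data calculation: the pixel data
 * captured with the diodes turned off is subtracted from the pixel data
 * captured with the diodes turned on.  The 32 x 32 size follows the
 * embodiment; clamping negative values to zero is an assumption.           */
#define SENSOR_W 32
#define SENSOR_H 32

void calc_differential(const unsigned char p_on[SENSOR_W][SENSOR_H],   /* P[X][Y], LEDs on  */
                       const unsigned char p_off[SENSOR_W][SENSOR_H],  /* P[X][Y], LEDs off */
                       int dif[SENSOR_W][SENSOR_H])                    /* Dif[X][Y]         */
{
    for (int X = 0; X < SENSOR_W; X++)
        for (int Y = 0; Y < SENSOR_H; Y++) {
            int d = (int)p_on[X][Y] - (int)p_off[X][Y];
            dif[X][Y] = (d > 0) ? d : 0;    /* ambient light cancels out; any
                                               negative residue is treated as 0 */
        }
}
```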
[Target Point Extraction Process] The CPU 201 obtains the coordinates of the target point of the operation article 150 on the basis of the differential data Dif[X][Y] as calculated. This will be explained in detail.
As illustrated in
In this case, the CPU 201 finds the differential data of the maximum luminance value from the differential data of 32 pixels×32 pixels as scanned, and compares the maximum luminance value to a predetermined threshold value “Th”. Then, if the maximum luminance value is larger than the predetermined threshold value “Th”, the CPU 201 calculates the coordinates of the target point of the operation article 150 on the basis of the coordinates of the pixel having the maximum luminance value. This point will be explained in detail.
As illustrated in
Next, as illustrated in
Next, as illustrated in
Next, as illustrated in
The CPU 201 converts the coordinates (Xc, Yc) (=(13, 7)) of the target point which is calculated as described above into the coordinates (xc, yc) in the screen 91. The CPU 201 performs the process of calculating the coordinates (xc, yc) of the target point as described above each time the frame is updated. Then, the CPU 201 assigns “xc” and “yc” respectively to the array elements Px[M] and Py[M]. Meanwhile, “M” is an integer and incremented by one each time the frame displayed on the screen 91 is updated.
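The following sketch illustrates one possible form of this target point extraction; using the maximum-luminance pixel itself as the target point and assuming a 256×224 resolution for the screen 91 are simplifications made only for the example, whereas the embodiment derives the target point from the maximum-luminance pixel through the steps described with reference to the drawings.

```c
/* A sketch of the target point extraction: find the maximum luminance in the
 * differential data, compare it with the threshold "Th", and convert the
 * resulting sensor coordinates (Xc, Yc) into screen coordinates (xc, yc).
 * Using the maximum-luminance pixel itself as the target point and the
 * 256 x 224 screen resolution are assumptions made only for this example.  */
#include <stdbool.h>

#define SENSOR_W 32
#define SENSOR_H 32
#define SCREEN_W 256            /* assumed resolution of the screen 91       */
#define SCREEN_H 224

bool extract_target_point(const int dif[SENSOR_W][SENSOR_H], int threshold_Th,
                          int *xc, int *yc)
{
    int max_val = -1, Xc = 0, Yc = 0;

    for (int X = 0; X < SENSOR_W; X++)
        for (int Y = 0; Y < SENSOR_H; Y++)
            if (dif[X][Y] > max_val) { max_val = dif[X][Y]; Xc = X; Yc = Y; }

    if (max_val <= threshold_Th)
        return false;           /* the operation article was not detected    */

    *xc = Xc * SCREEN_W / SENSOR_W;     /* sensor -> screen conversion       */
    *yc = Yc * SCREEN_H / SENSOR_H;
    return true;                /* the caller stores *xc, *yc in Px[M], Py[M] */
}
```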
[Target Point Existence Area Determination Process (1)] The CPU 201 determines which of areas a1 to a4 includes the target point of the operation article 150 on the screen 91. This point will be explained in detail.
[Target Point Existence Area Determination Process (2)] The CPU 201 determines which of areas A1 to A4 includes the target point of the operation article 150 on the screen 91. This point will be explained in detail.
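As an illustration, such an existence-area determination could be modeled as below, treating the areas as axis-aligned rectangles; the actual areas a1 to a4 and A1 to A4 are defined by the layout of the game screen.

```c
/* A sketch of the existence-area determination: decide which of four areas
 * contains the target point (xc, yc) of the current frame M.  Treating the
 * areas as axis-aligned rectangles is an assumption; the actual areas a1 to
 * a4 and A1 to A4 are defined by the layout of the game screen.             */
typedef struct { int left, top, right, bottom; } Area;

/* Returns 1..4 when the point lies in area[0..3], or 0 when it lies in none. */
int determine_area(const Area area[4], int xc, int yc)
{
    for (int i = 0; i < 4; i++)
        if (xc >= area[i].left && xc <= area[i].right &&
            yc >= area[i].top  && yc <= area[i].bottom)
            return i + 1;
    return 0;
}

/* usage: J1[M] = determine_area(a, xc, yc); J2[M] = determine_area(A, xc, yc); */
```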
[Cursor Control Process] The CPU 201 registers (stores in the internal memory 207) the coordinates (xc, yc) of the current target point of the operation article 150 as the coordinates of the cursor 105 to be displayed in the next frame.
[Guide Type Registration Process] The CPU 201 assigns a note number (refer to
[Guide Control Process] The CPU 201 registers the animation information of the position guide “G”, the animation information of the direction guide “g” and the animation information of the path guide “rg” with reference to the array element NN[J] (guide display number “J”=0, 1) in accordance with the note number assigned to the array element NN[J]. This point will be explained in detail.
Each of the note numbers in this table is a note number which is used to control the display of a guide and shown in
In this case, the display timing information is information indicative of when an object is to be displayed on the screen 91. For example, the guide number “55” indicates that the position guide “G2” is to be displayed at the coordinates (x1, y1) in the next frame following the frame which is currently displayed since the display timing information of the position guide “G” is “0”. Also, for example, since the display timing information of the direction guide “g” is 0, 6, 12, . . . and 24, the guide number “55” indicates that the direction guide “g1” is to be displayed at the coordinates (x3, y1) in the next frame following the frame which is currently displayed, that the direction guide “g2” is to be displayed at the coordinates (x4, y1) 6 frames after, . . . and that the direction guide “g5” is to be displayed at the coordinates (x7, y1) 24 frames after.
The animation table pointed to by the animation table storage location information “address 0” is an example of the animation table of the position guide “G”, the animation table pointed to by the animation table storage location information “address 1” is an example of the animation table of the direction guide “g”, the animation table pointed to by the animation table storage location information “address 2” is an example of the animation table of the path guide “rg”, and the animation table pointed to by the animation table storage location information “address 3” is an example of the animation table of the position guide “G” which is used when the player 94 successfully manipulates the cursor 105.
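Purely for illustration, one table entry of the kind described above (for example, the guide number “55”) might be laid out as in the following sketch; the field names, the count of five direction guides and the representation of the storage location information as plain pointers are assumptions.

```c
/* Illustrative layout of one guide-table entry looked up by note number, in
 * the spirit of the "guide number 55" example above.  The field names, the
 * count of five direction guides and the use of plain pointers for the
 * storage location information are assumptions for this sketch.            */
#include <stdint.h>

typedef struct { int16_t x, y; } Coord;

typedef struct {
    uint8_t     note_number;          /* e.g. 55                             */

    /* position guide "G": where and when its opening animation starts       */
    Coord       pos_guide_xy;         /* e.g. (x1, y1) for the guide "G2"    */
    uint16_t    pos_guide_timing;     /* frames after the current frame      */
    const void *pos_guide_anim;       /* animation table (e.g. "address 0")  */

    /* direction guides "g1".."g5": positions and staggered display timings  */
    Coord       dir_guide_xy[5];      /* e.g. (x3, y1), (x4, y1), ...        */
    uint16_t    dir_guide_timing[5];  /* e.g. 0, 6, 12, 18, 24 frames        */
    const void *dir_guide_anim;       /* animation table (e.g. "address 1")  */
} GuideTableEntry;
```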
[Dance Control Process] The CPU 201 determines whether or not the player 94 correctly manipulates the cursor 105 in correspondence with the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”. More specific description is as follows.
The CPU 201 determines whether or not the cursor 105 (i.e., the target point of the operation article 150) is located in the area which is currently designated by the position guide “G” and the direction guide “g” on the basis of the result of determination J1[M] performed by the target point existence area determination process (1) (refer to
Also, the CPU 201 determines whether or not the cursor 105 (i.e., the target point of the operation article 150) is moved along the path which is currently designated by the position guide “G” and the path guide “rg” on the basis of the result of determination J2[M] performed by the target point existence area determination process (2) (refer to
From the above results, in the case where it is determined that the player 94 correctly manipulates the cursor 105 in correspondence with the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”, the CPU 201 registers (stores in the predetermined area of the internal memory 207) the dance animation information corresponding to the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”. In the same manner as the table of
The dance animation information is designed in the same manner as the animation information of
Meanwhile, the second musical score data 306 of
Also, in the case where it is determined that the player 94 correctly manipulates the cursor 105 in correspondence with the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”, the CPU 201 registers the storage location information (“address 3” in the case of the example of
Furthermore, in the case where it is determined that the player 94 correctly manipulates the cursor 105 in correspondence with the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”, the CPU 201 registers (stores in the predetermined area of the internal memory 207) the animation information for the evaluation objects 107 to 109. This animation information is designed in the same manner as the animation information of
Still further, in the case where it is determined that the player 94 correctly manipulates the cursor 105 in correspondence with the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”, the CPU 201 performs the scrolling corresponding to the position guide “G” and the direction guide “g” or the position guide “G” and the path guide “rg”. More specifically speaking, the CPU 201 changes the center position of the background screen 140 to scroll the background 110 (refer to
Incidentally,
As illustrated in
The CPU 201 starts the process of determining whether or not the cursor 105 is correctly manipulated in correspondence with the direction guide “g” and the position guide “G” a predetermined period (for example, 30 frames) after starting the display of the direction guide “g”, and completes the process at the corresponding time point of T1 to T3 in which the corresponding note number of the first musical score data 305 is read. Then, if it is determined that the cursor 105 is correctly manipulated in correspondence with the direction guide “g” and the position guide “G”, the CPU 201 registers the dance animation information at the time point when the determination period ends. Accordingly, in this case, dance animation is performed on the basis of the dance animation information which is registered.
Meanwhile, the following is the reason why the time point of starting reading the second musical score data 306 is earlier than the time point T1, T2 . . . of starting reading the first musical score data 305 by a predetermined time “t”. Namely, since the player 94 starts the manipulation of the operation article 150 after the guidance by the direction guide “g” and the position guide “G” is started, the direction guide “g” and the position guide “G” are displayed earlier than the timing of the music for the purpose of absorbing this time lag.
The timing of displaying the path guide “rg” is provided in the same manner as the timing of displaying the direction guide “g”. However, for example, the determination of whether or not the cursor 105 is correctly manipulated in correspondence with the path guide “rg” is performed between the start and end of the guidance by the path guide “rg” (for example, for 60 frames).
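The two determinations described above can be sketched as follows, operating on the per-frame results J1[M] and J2[M]; the window boundaries and the simple in-order matching are assumptions made for the sketch.

```c
/* A sketch of the two correctness checks, operating on the per-frame records
 * J1[M] and J2[M].  The window boundaries and the simple in-order matching
 * are assumptions made for this sketch.                                     */
#include <stdbool.h>

/* Position/direction-guide check: within the determination window, was the
 * cursor (target point) ever inside the area designated by the position
 * guide "G"?                                                                */
bool guided_position_reached(const int J1[], int start_frame, int end_frame,
                             int target_area)
{
    for (int m = start_frame; m <= end_frame; m++)
        if (J1[m] == target_area)
            return true;
    return false;
}

/* Path-guide check: within the window, did the cursor pass through the
 * designated areas in the designated order?  (Frames spent in other areas
 * are simply skipped in this sketch.)                                       */
bool guided_path_followed(const int J2[], int start_frame, int end_frame,
                          const int path[], int path_len)
{
    int next = 0;
    for (int m = start_frame; m <= end_frame && next < path_len; m++)
        if (J2[m] == path[next])
            next++;
    return next == path_len;
}
```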
[Image Display Process] The CPU 201 provides the graphics processor 202 of
The CPU 201 calculates the display coordinates of the respective sprites forming the cursor 105 on the basis of the coordinate information (the coordinate information of the target point of the operation article 150) which is registered by the cursor control process. Then, the CPU 201 provides the graphics processor 202 with the display coordinate information, the color palette information, the depth value, the size information and the pixel pattern data storage location information of the respective sprites for forming the cursor 105. The graphics processor 202 generates the image signal of the cursor 105 on the basis of the respective information, and outputs it to the video signal output terminal 47.
Also, the CPU 201 acquires the size information of the object for forming the animation image of each guide (the position guide “G”, the direction guide “g” or the path guide “rg”) and the size information of the sprite for forming the object with reference to the animation table on the basis of the animation table storage location information contained in the animation information which is registered by the guide control process. Then, the CPU 201 calculates the display coordinates of the respective sprites for forming the object on the basis of the above respective information and the display coordinate information contained in the animation information as registered. Furthermore, the CPU 201 calculates the pixel pattern data storage location information of the respective sprites for forming the object on the basis of the reference number of the position guide “G” to be displayed next, the size information of the object and sprite contained in the animation table, and the animation image data storage location information of the position guide “G” contained in the animation table.
Still further, the CPU 201 provides the graphics processor 202 with the color palette information, the depth value and the size information of the respective sprites for forming the position guide “G” together with the pixel pattern data storage location information and the display coordinate information of the respective sprites with reference to the animation table. In this case, the CPU 201 provides the graphics processor 202 with the above respective information on the basis of the display timing information of the position guide “G” contained in the animation information as registered and the information on the number of duration frames of the animation table.
For the direction guide “g” and the path guide “rg”, the information to be given to the graphics processor 202 by the CPU 201 has a similar content and is acquired in a similar manner as for the position guide “G”. However, since the direction guides “g1” to “g4” and the path guides “rg1” to “rg10” are sequentially displayed in a plurality of positions which are designated by the display coordinate information contained in the animation information in the timing which is designated by the display timing information contained in the animation information, the CPU 201 provides the information on the direction guides “g1” to “g4” and the path guides “rg1” to “rg10” to the graphics processor 202, when starting displaying each of the direction guides “g1” to “g4” and each of the path guides “rg1” to “rg10”, with reference to the display coordinate information and the display timing information contained in the animation information as registered.
The graphics processor 202 generates the image signals of the guides (the position guide “G”, the direction guide “g”, the path guide “rg”) on the basis of the above information which is given as described above, and outputs them to the video signal output terminal 47.
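The way the animation table is consulted to select the pixel pattern of each sprite of a guide for the current video frame may be sketched, for illustration only, as follows. The layout of anim_table_t, the helper guide_pattern_addr and the bytes_per_sprite parameter are assumptions of this sketch; the actual table contents are those referred to in the description above.

```c
/* Illustrative sketch: the animation table layout below is an assumption. */
#include <stdint.h>

typedef struct {
    uint16_t n_frames;        /* number of animation frames                */
    uint16_t duration_frames; /* video frames each animation frame lasts   */
    uint16_t obj_w, obj_h;    /* size information of the object            */
    uint16_t spr_w, spr_h;    /* size information of one sprite            */
    uint32_t image_base;      /* animation image data storage location     */
} anim_table_t;

/* Select the pixel pattern address of sprite (sx, sy) of the object for the
 * animation frame corresponding to `elapsed` video frames of display. */
uint32_t guide_pattern_addr(const anim_table_t *t, uint32_t elapsed,
                            int sx, int sy, int bytes_per_sprite)
{
    uint32_t frame = (elapsed / t->duration_frames) % t->n_frames;
    int sprites_per_row = t->obj_w / t->spr_w;
    int sprites_per_obj = sprites_per_row * (t->obj_h / t->spr_h);
    uint32_t index = frame * (uint32_t)sprites_per_obj
                   + (uint32_t)(sy * sprites_per_row + sx);
    return t->image_base + index * (uint32_t)bytes_per_sprite;
}
```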
Also, the CPU 201 acquires the size information of the dance object 106 for forming the dance animation image and the size information of the sprite for forming the dance object 106 with reference to the dance animation table on the basis of the dance animation table storage location information contained in the dance animation information which is registered by the dance control process. Then, the CPU 201 calculates the display coordinates of the respective sprites for forming the dance object 106 on the basis of the above respective information and the display coordinate information contained in the dance animation information as registered. Furthermore, the CPU 201 calculates the pixel pattern data storage location information of the respective sprites for forming the dance object 106 on the basis of the reference number of the dance object 106 to be displayed next, the size information of the dance object 106 and the sprite contained in the dance animation table, and the dance animation image data storage location information contained in the dance animation table.
Still further, the CPU 201 provides the graphics processor 202 with the color palette information, the depth value and the size information of the respective sprites for forming the dance object 106 together with the pixel pattern data storage location information and the display coordinate information of the respective sprites with reference to the dance animation table. In this case, the CPU 201 provides the above respective information to the graphics processor 202 on the basis of the display timing information contained in the dance animation information as registered and the information on the number of duration frames of the dance animation table.
Still further, the CPU 201 acquires the information required for generating image signals on the basis of the animation information and the animation table for the evaluation objects 107 to 109 which are registered by the dance control process, and provides the information to the graphics processor 202. Incidentally, in this case, the information to be given to the graphics processor 202 by the CPU 201 has a similar content and is acquired in a similar manner as for the dance object 106.
The graphics processor 202 generates the image signals of the dance object 106 and the evaluation objects 107 to 109 on the basis of the above information which is given as described above, and outputs them to the video signal output terminal 47.
[Music Playback] The playback of music is performed by an interrupt operation. The CPU 201 reads and interprets the music control information of
Then, if the command contained in the music control information as read is “Note On”, the CPU 201 provides the sound processor 203 with the head address from which the wave data is stored in accordance with the frequency of sound vibration (pitch) designated by the note number contained in the music control information and the instrument (tone quality) designated by the instrument designation information, the head address from which the envelope data as required is stored, the pitch control information corresponding to the frequency of sound vibration (pitch) designated by the note number, and the volume information contained in the music control information.
In what follows, the pitch control information will be explained. The pitch control information is used to perform the pitch conversion by changing the frequency of reading the wave data. Namely, the sound processor 203 periodically reads the pitch control information at a certain interval and accumulates the pitch control information. The sound processor 203 then processes this result of accumulation to obtain the address pointer to the wave data. Accordingly, if the pitch control information is set to a large value, the address pointer is quickly incremented by the large value to raise the frequency of the wave data. Conversely, if the pitch control information is set to a small value, the address pointer is slowly incremented by the small value to lower the frequency of the wave data. In this way, the sound processor 203 performs the pitch conversion of wave data.
Next, the sound processor 203 reads the wave data stored in the location pointed to by the head address as given from the ROM 51, while incrementing the address pointer on the basis of the pitch control information as given. Then, the sound processor 203 generates an audio signal by multiplying the wave data, which is successively read, by the envelope data and the volume data. In this way, an audio signal having the tone quality of the musical instrument, the frequency of sound vibration (pitch) and the sound volume which are designated by the first musical score data 305 is generated and output to the audio signal output terminal 49.
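The pitch conversion by accumulation of the pitch control information and the multiplication by the envelope and volume data described above may be sketched, for illustration only, in C. The 16.16 fixed-point format, the channel_t structure, the 8-bit sample width and the fact that the envelope is indexed in step with the wave data are all simplifying assumptions of this sketch, not the actual operation of the sound processor 203.

```c
/* Illustrative sketch only: formats and names below are assumptions. */
#include <stddef.h>
#include <stdint.h>

/* 16.16 fixed-point address pointer: accumulating a larger pitch value
 * advances through the wave data faster and therefore raises the pitch;
 * a smaller value advances more slowly and lowers the pitch. */
typedef struct {
    const int8_t  *wave;      /* wave data of the designated instrument    */
    const uint8_t *envelope;  /* envelope data                             */
    size_t         length;    /* number of wave samples (looped here)      */
    uint32_t       pointer;   /* accumulated address pointer (16.16)       */
    uint32_t       pitch;     /* pitch control information (16.16 step)    */
    uint8_t        volume;    /* volume information (0..255)               */
} channel_t;

/* Produce one output sample: read the wave data at the accumulated address,
 * multiply by envelope and volume, then advance the pointer by the pitch. */
int16_t channel_step(channel_t *ch)
{
    size_t  idx = (ch->pointer >> 16) % ch->length;
    int32_t s   = ch->wave[idx];
    s = (s * ch->envelope[idx]) >> 8;   /* apply envelope   */
    s = (s * ch->volume) >> 8;          /* apply volume     */
    ch->pointer += ch->pitch;           /* pitch conversion */
    return (int16_t)(s * 64);           /* scale to 16 bits */
}
```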
On the other hand, the CPU 201 manages the gate times contained in the music control information as read. Accordingly, the CPU 201 outputs an instruction to the sound processor 203 in order that, when a gate time elapses, the output of the corresponding musical tone is terminated. In response to this, the sound processor 203 terminates the output of the corresponding musical tone as designated.
Music is played back as described above on the basis of the first musical score data 305 and output through a speaker (not shown in the figure) of the television monitor 90.
Next, the entire process flow of the music game apparatus 1 of
In step S6, it is determined whether or not the CPU 201 waits for the video system synchronous interrupt. The CPU 201 provides the graphics processor 202 with the image information for updating the display screen of the television monitor 90 after the start of the vertical blanking period (step S7). Accordingly, after the process necessary for updating the display screen is completed, the process is halted until the next video system synchronous interrupt is issued. If “YES” is determined in step S6, i.e., while waiting for a video system synchronous interrupt (while there is no video system synchronous interrupt), the same step S6 is repeated. Conversely, if “NO” is determined in step S6, i.e., if the CPU 201 gets out of the state of waiting for a video system synchronous interrupt (if there is a video system synchronous interrupt), the process proceeds to step S7. In step S7, the CPU 201 provides the graphics processor 202 with the image information required for generating the game screen (refer to
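The relationship between the game process, step S6 and step S7 may be sketched, for illustration only, as the following loop. The flag name vsync_occurred and the helpers run_game_process and update_screen are assumptions of this sketch, not the actual interface of the high speed processor 200.

```c
/* Illustrative sketch only: flag and helper names are assumptions. */
#include <stdbool.h>

volatile bool vsync_occurred;   /* set by the video system synchronous interrupt */

void run_game_process(void);    /* step S3: game process (provided elsewhere)    */
void update_screen(void);       /* step S7: pass image information to the
                                   graphics processor during blanking            */

void main_loop(void)
{
    for (;;) {
        run_game_process();
        while (!vsync_occurred) {
            /* step S6: repeat while there is no video system synchronous
             * interrupt (the display update must wait for blanking). */
        }
        vsync_occurred = false;
        update_screen();               /* step S7 */
    }
}
```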
In step S12, the CPU 201 initializes the musical score data pointer for music. In step S13, the CPU 201 sets an execution stand-by counter for music to “t”.
In step S14, the CPU 201 performs the initial settings of the image sensor 43. In step S15, the CPU 201 initializes various flags and various counters.
In step S16, the CPU 201 sets the timer circuit 210 as an interrupt source for outputting sound. Incidentally, as an interrupt handler, the sound processor 203 performs a process to output sound from the speaker of the television monitor 90.
As has been discussed above, as illustrated in
Returning to
Returning to
Thereafter, in step S28, the setting data is set to a command “RUN” for indicating the completion of initialization and causing the image sensor 43 to start outputting data, followed by step S29 in which the command “RUN” is transmitted. As has been discussed above, the sensor initialization process is performed in step S14 of
In step S51, the process of extracting a target point is performed. More specifically speaking, the CPU 201 acquires differential data by calculating the difference between the pixel data acquired when the infrared light emitting diodes 15 are turned on and the pixel data acquired when the infrared light emitting diodes 15 are turned off. Then, the CPU 201 finds the maximum value of the differential data and compares it with the predetermined threshold value “Th”. Furthermore, if the maximum value of the differential data is greater than the predetermined threshold value “Th”, the CPU 201 converts the coordinates of the pixel having the differential data corresponding to the maximum value into the coordinates on the screen 91 of the television monitor 90 and sets the coordinates of the target point of the operation article 150 to the coordinates as converted.
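The extraction of the target point in step S51 may be sketched, for illustration only, as follows. The 32x32 sensor size follows the array bounds used in the flowcharts; the simple proportional conversion to screen coordinates and the function name extract_target_point are assumptions of this sketch, and the refinement of the target point from the extent of the bright region (steps S104 to S117) is omitted here and sketched separately further below.

```c
/* Illustrative sketch of step S51: sizes and conversion are assumptions. */
#include <stdbool.h>

#define SENSOR_W 32
#define SENSOR_H 32

/* Compute differential data and locate its maximum.  Returns false when the
 * maximum does not exceed the threshold Th, i.e. no target point is found. */
bool extract_target_point(const int on[SENSOR_H][SENSOR_W],
                          const int off[SENSOR_H][SENSOR_W],
                          int Th, int screen_w, int screen_h,
                          int *xc, int *yc)
{
    int max = 0, mx = 0, my = 0;
    for (int y = 0; y < SENSOR_H; ++y) {
        for (int x = 0; x < SENSOR_W; ++x) {
            int dif = on[y][x] - off[y][x];   /* lit frame minus unlit frame */
            if (dif > max) { max = dif; mx = x; my = y; }
        }
    }
    if (max <= Th)
        return false;
    /* Convert sensor coordinates into coordinates on the screen 91. */
    *xc = mx * screen_w / SENSOR_W;
    *yc = my * screen_h / SENSOR_H;
    return true;
}
```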
In step S52, the CPU 201 determines which of the areas a1 to a4 in
In step S53, the CPU 201 determines which of the areas A1 to A4 in
If “YES” is determined in step S74, the CPU 201 determines in step S75 whether or not X=−1, i.e., whether or not it is the initial pixel. As has been discussed above, since the initial pixel of each line is set as a dummy pixel, if “YES” is determined in this step S75, the current pixel data is not acquired, but the element index “X” is incremented in the following step S77.
If “NO” is determined in step S75, since it is the second or later pixel data constructing the line, the current pixel data is acquired and saved in a temporary register (not shown in the figure) in steps S76 and S78. Thereafter, the process proceeds to step S62 of
In step S62 of
In the following step S63, “X” is incremented, and if “X” is smaller than “32”, the process from step S61 to step S63 is repeatedly performed. If “X” is equal to “32”, i.e., if the acquisition process of pixel data reaches the end of the current line, “X” is set to “−1” in the following step S65, “Y” is incremented in step S66, and the acquisition process of pixel data is repeated from the top of the next line.
If “Y” is equal to “32” in step S67, i.e., if the acquisition process of pixel data reaches the last pixel data array element P[Y][X], the process proceeds to step S51 of
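The acquisition loop of steps S61 to S67 may be sketched, for illustration only, as follows. The loop is restructured as nested for loops, and the helpers sensor_read_pixel and sensor_next_pixel are assumptions of this sketch standing in for the actual strobing of the image sensor 43.

```c
/* Illustrative sketch of steps S61 to S67: sensor helpers are assumptions. */
int  sensor_read_pixel(void);   /* returns the pixel data currently output */
void sensor_next_pixel(void);   /* advances the sensor to the next pixel   */

#define SENSOR_W 32
#define SENSOR_H 32

void acquire_frame(int P[SENSOR_H][SENSOR_W])
{
    for (int Y = 0; Y < SENSOR_H; ++Y) {
        /* X starts at -1 because the initial pixel of each line is a dummy
         * pixel whose data is not acquired (steps S74 and S75). */
        for (int X = -1; X < SENSOR_W; ++X) {
            if (X >= 0)
                P[Y][X] = sensor_read_pixel();  /* steps S76, S78 and S62 */
            sensor_next_pixel();
        }
    }
}
```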
In step S82, the CPU 201 scans all the array elements Dif[X][Y]. In step S83, the CPU 201 finds the maximum value of all the array elements Dif[X][Y]. If the maximum value is greater than the predetermined threshold value “Th”, the CPU 201 proceeds to step S85, and if the maximum value is less than or equal to the predetermined threshold value “Th”, the CPU 201 proceeds to step S4 of
In step S85, the CPU 201 calculates the coordinates (Xc, Yc) of the target point of the operation article 150 on the basis of the coordinates corresponding to the maximum value. In step S86, the CPU 201 increments the value of the count “M” by one (M=M+1).
In step S87, the CPU 201 converts the coordinates (Xc, Yc) of the target point on the image sensor 43 into the coordinates (xc, yc) on the screen 91 of the television monitor 90. In step S88, the CPU 201 assigns “xc” to the array element Px[M] as the x-coordinate of the M-th target point, and “yc” to the array element Py[M] as the y-coordinate of the M-th target point.
In step S104, the CPU 201 assigns to “m” the X-coordinate which is obtained in step S83 in correspondence with the maximum value. In step S105, the CPU 201 decrements “m” by one. If the differential data Dif[m][n] is greater than the predetermined threshold value “Th”, the CPU 201 proceeds to step S107, otherwise proceeds to step S108 (step S106). In step S107, the CPU 201 assigns the current value of “m” to “ml”. The endmost X-coordinate of the differential data greater than the predetermined threshold value “Th” is obtained by repeating steps S105 to S107 while scanning the X-axis from the maximum value position in the negative direction.
In step S108, the CPU 201 calculates the center coordinate between the X-coordinate “mr” and the X-coordinate “ml”, and assigns it to the X-coordinate (Xc) of the target point. In step S109, the CPU 201 assigns “Xc” which is obtained in step S108 and the Y-coordinate which is obtained in step S83 in correspondence with the maximum value, respectively to “m” and “n”. In step S110, the CPU 201 increments “n” by one (n=n+1). If the differential data Dif[m][n] is greater than the predetermined threshold value “Th”, the CPU 201 proceeds to step S112, otherwise proceeds to step S113 (step S111). In step S112, the CPU 201 assigns the current value of “n” to “md”. The endmost Y-coordinate of the differential data greater than the predetermined threshold value “Th” is obtained by repeating steps S110 to S112 while scanning the Y-axis from the maximum value position in the positive direction.
In step S113, the CPU 201 assigns to “n” the Y-coordinate which is obtained in step S83 in correspondence with the maximum value. In step S114, the CPU 201 decrements “n” by one. If the differential data Dif[m][n] is greater than the predetermined threshold value “Th”, the CPU 201 proceeds to step S116, otherwise proceeds to step S117 (step S115). In step S116, the CPU 201 assigns the current value of “n” to “mu”. The endmost Y-coordinate of the differential data greater than the predetermined threshold value “Th” is obtained by repeating steps S114 to S116 while scanning the Y-axis from the maximum value position in the negative direction.
In step S117, the CPU 201 calculates the center coordinate between the Y-coordinate “md” and the Y-coordinate “mu”, and assigns it to the Y-coordinate (Yc) of the target point. As has been described above, the coordinates (Xc, Yc) of the target point of the operation article 150 are calculated.
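The refinement of the target point described in the steps above may be sketched, for illustration only, as follows: starting from the coordinates of the maximum differential value, the extent of the region exceeding the threshold “Th” is scanned along each axis in both directions, and the center of each extent becomes the corresponding coordinate of the target point. The positive-direction X scan (yielding “mr”), which precedes step S104 in the flowchart, is included here as an assumption for completeness; the function name refine_target_point is likewise an assumption of this sketch.

```c
/* Illustrative sketch of steps S100(?) to S117: names are assumptions. */
#define SENSOR_W 32
#define SENSOR_H 32

void refine_target_point(const int Dif[SENSOR_W][SENSOR_H],
                         int max_x, int max_y, int Th, int *Xc, int *Yc)
{
    int mr = max_x, ml = max_x, md = max_y, mu = max_y;
    int m, n;

    /* scan the X-axis in the positive and negative directions from the
     * position of the maximum value, keeping the endmost coordinates */
    for (m = max_x + 1; m < SENSOR_W && Dif[m][max_y] > Th; ++m) mr = m;
    for (m = max_x - 1; m >= 0       && Dif[m][max_y] > Th; --m) ml = m;
    *Xc = (mr + ml) / 2;                      /* step S108 */

    /* scan the Y-axis from (Xc, max_y) in both directions */
    for (n = max_y + 1; n < SENSOR_H && Dif[*Xc][n] > Th; ++n) md = n;
    for (n = max_y - 1; n >= 0       && Dif[*Xc][n] > Th; --n) mu = n;
    *Yc = (md + mu) / 2;                      /* step S117 */
}
```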
In step S121, the CPU 201 registers the x-coordinate Px[M] and the y-coordinate Py[M] of the target point of the operation article 150 as the display coordinates of the cursor 105 on the screen 91.
The CPU 201 repeats the process between step S122 and step S144 twice. In this case, “j” represents a guidance display number “J” (refer to
In step S123, the CPU 201 checks a guidance start flag GF[j] (refer to step S194 of
In step S128, the CPU 201 checks the note number NN[j], and if it is the note number designating the turn of the cursor 105 (refer to
Incidentally, as apparent from step S128 and step S131, in the case where the note number NN[j] designates the turn of the cursor 105, the success determination of the manipulation of the cursor 105 is performed from just after the display of the position guide “G” and the path guide “rg” is started (the frame counter C[j] is “0”), irrespective of the value of the frame counter C[j]. On the other hand, in the case where the note number NN[j] is a note number other than the note number designating the turn of the cursor 105, the success determination of the manipulation of the cursor 105 is performed a predetermined number “f1” of frames (for example, 30 frames) after the display of the position guide “G” and the direction guide “g” is started (the frame counter C[j] is “0”) (refer to
By the way, as a result of the determination in step S131, if the manipulation of the cursor 105 has succeeded, the CPU 201 proceeds to step S133, otherwise proceeds to step S140 (step S132). In step S133, the CPU 201 registers the dance animation information with reference to the note number NN[j] and a dance speed flag DF (refer to step S193, step S190 and step S192 of
In step S134, the CPU 201 checks the note number NN[j], and if it is the note number designating the turn of the cursor 105, the process proceeds to step S137, otherwise proceeds to step S135. In step S135, the CPU 201 checks the frame counter C[j]. If the frame counter C[j] is greater than or equal to a predetermined number “f2” of frames, the CPU 201 proceeds to step S137, otherwise proceeds to step S138 (step S136). In step S137, the CPU 201 adds “3” to a score “S”. On the other hand, in step S138, “1” is added to the score “S”.
Incidentally, the following is the reason why “3” is added to the score “S” in step S137 while “1” is added to the score “S” in step S138. In the case where the cursor 105 is located in the area of the position guide “G” within a predetermined period (for example, 10 frames) from the time point at which the predetermined number “f2” of frames (for example, 50 frames) has elapsed after starting displaying the position guide “G” and the direction guide “g” (the frame counter C[j] is “0”), it is determined that the cursor 105 is manipulated in the best timing, so that “3” is added. On the other hand, in the case where the cursor 105 is located in the area of the position guide “G” after the predetermined number “f1” of frames have elapsed but before the predetermined number “f2” of frames have elapsed, it is determined that the cursor 105 is manipulated in an ordinarily successful manner, so that “1” is added. Also, when the manipulation is performed in correspondence with the position guide “G” and the path guide “rg” (the guide designating the turn of the cursor 105), “3” is equally added to the score “S”.
Next, in step S139, the CPU 201 checks the frame counter C[j]. If the frame counter C[j] is equal to a predetermined number “f3” (for example, 60 frames), the CPU 201 proceeds to step S142 otherwise proceeds to step S141 (step S140). In step S141, the CPU 201 increments the frame counter C[j] by one. On the other hand, in step S142, the CPU 201 sets the frame counter C[j] to “0”. In step S143, the CPU 201 turns off the guidance start flag GF[j]. Incidentally, the predetermined number “f3” is used to define the end of success determination.
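For illustration only, the success determination and scoring of steps S128 to S143 may be condensed into the following sketch. The constants (f1 = 30, f2 = 50, f3 = 60 frames and a 10-frame “best” window) follow the examples given above, while the helper predicates cursor_in_guide_area and note_designates_turn, the function score_guidance, and the exact branch ordering are assumptions of this sketch rather than the actual flowchart.

```c
/* Illustrative sketch of the scoring logic: names and ordering are assumed. */
#include <stdbool.h>

enum { F1 = 30, F2 = 50, F3 = 60, BEST_WINDOW = 10 };

bool cursor_in_guide_area(int j);     /* is the cursor 105 inside guide j?  */
bool note_designates_turn(int nn);    /* does NN[j] designate a turn?       */

/* Called once per video frame for guidance display number j while the
 * guidance start flag GF[j] is on.  Returns the points to add to the
 * score S (0 when the manipulation has not succeeded in this frame). */
int score_guidance(int j, int nn, int *C /* frame counter C[j] */)
{
    int  points = 0;
    bool turn   = note_designates_turn(nn);

    /* For a turn the determination runs for the whole guidance period;
     * otherwise it starts only after f1 frames have elapsed. */
    if ((turn || *C >= F1) && cursor_in_guide_area(j)) {
        bool best = *C >= F2 && *C < F2 + BEST_WINDOW;
        points = (turn || best) ? 3 : 1;
    }

    if (*C == F3) *C = 0;   /* end of the success determination (step S142) */
    else          ++*C;     /* step S141 */
    return points;
}
```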
On the other hand, in step S162, the CPU 201 reads and interprets the command pointed to by the musical score data pointer for music. If the command is “Note On”, the process proceeds to step S164 (step S163). On the other hand, if the command is not “Note On”, i.e., “Waiting”, the process proceeds to step S165. In step S165, the CPU 201 sets a waiting time to the execution stand-by counter for music.
On the other hand, in step S164, the CPU 201 instructs the sound processor 203 to start outputting a sound corresponding to the note number which is read. In step S166, the CPU 201 increments the musical score data pointer for music.
In step S167, the CPU 201 checks the remaining sound outputting time corresponding to the note number associated with the outputting sound. If the remaining sound outputting time is “0”, the process proceeds to step S169 otherwise proceeds to step S151 of
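The interrupt-driven interpretation of the musical score data for music (steps S162 to S169) may be sketched, for illustration only, as follows. The score entry layout, the helpers sound_note_on and sound_note_off, and the function music_step are assumptions of this sketch standing in for the actual musical score data 305 and the interface of the sound processor 203.

```c
/* Illustrative sketch of the score interpreter: layout is an assumption. */
#include <stdint.h>

typedef enum { CMD_NOTE_ON, CMD_WAITING } command_t;

typedef struct {
    command_t command;
    uint8_t   note_number;   /* pitch, when the command is Note On      */
    uint8_t   gate_time;     /* interrupts until the tone is terminated */
    uint8_t   waiting_time;  /* interrupts to wait, for a Waiting entry */
} score_entry_t;

void sound_note_on(uint8_t note_number);   /* assumed interface of the */
void sound_note_off(uint8_t note_number);  /* sound processor 203      */

/* Called on every timer interrupt for the musical score data for music. */
void music_step(const score_entry_t *score, unsigned *pointer,
                unsigned *standby, unsigned *remaining_gate, uint8_t *active)
{
    if (*standby > 0) {
        --*standby;                               /* execution stand-by   */
    } else {
        const score_entry_t *e = &score[*pointer];
        if (e->command == CMD_NOTE_ON) {
            sound_note_on(e->note_number);        /* step S164            */
            *active = e->note_number;
            *remaining_gate = e->gate_time;
        } else {
            *standby = e->waiting_time;           /* step S165            */
        }
        ++*pointer;                               /* step S166            */
    }
    /* gate time management (steps S167 to S169) */
    if (*remaining_gate > 0 && --*remaining_gate == 0)
        sound_note_off(*active);
}
```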
On the other hand, in step S182, the CPU 201 reads and interprets the command pointed to by the musical score data pointer for guide. If the command is “Note On”, the CPU 201 proceeds to step S184 (step S183). On the other hand, if the command is not “Note On”, i.e., “Waiting”, the CPU 201 proceeds to step S197. In step S197, the CPU 201 sets the execution stand-by counter for guide to a waiting time.
If the note number designates the end of music, the CPU 201 proceeds to step S196 otherwise proceeds to step S185 (step S184). In step S196, the CPU 201 turns on the music end flag.
On the other hand, if the note number designates the start of music, the CPU 201 proceeds to step S195, otherwise proceeds to step S186 (step S185). If the guidance display number “J” is “1”, the CPU 201 sets the guidance display number “J” to “0” in step S188; conversely, if the guidance display number “J” is not “1” (i.e., it is “0”), the CPU 201 sets the guidance display number “J” to “1” in step S187. Since the guidance of a certain position guide “G” and the guidance of another position guide “G” are started at different timings but can be overlappingly continued in a certain period, the guidance display number “J” is used to perform the game process in step S3 of
By the way, if the note number designates that a high speed dance animation is to be performed, the CPU 201 proceeds to step S190 otherwise proceeds to step S191 (step S189). In step S190, the CPU 201 sets the dance speed flag DF to “1” (a high speed dance animation).
On the other hand, if the note number designates that a low speed dance animation is to be performed, the CPU 201 proceeds to step S192 otherwise proceeds to step S193 (step S191). In step S192, the CPU 201 sets the dance speed flag DF to “0” (a low speed dance animation).
By the way, in the case where the note number is none of the note number designating the end of music, the note number designating the start of music, the note number designating a high speed dance animation and the note number designating a low speed dance animation, such a note number is a note number which designates a type of guide (
In step S195, the CPU 201 increments the musical score data pointer for guide.
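For illustration only, the branching of steps S183 to S196 on the special note numbers may be sketched as follows. The reserved note-number values, the helper register_guide_animation and the function interpret_guide_note are assumptions of this sketch (the actual values are those defined by the figures referred to above), and the management of the musical score data pointer for guide is omitted.

```c
/* Illustrative sketch of the guide-score note dispatch: values assumed. */
#include <stdbool.h>

enum {                       /* assumed reserved note numbers            */
    NN_MUSIC_END   = 0,
    NN_MUSIC_START = 1,
    NN_DANCE_FAST  = 2,
    NN_DANCE_SLOW  = 3
    /* any other value designates a type of guide */
};

void register_guide_animation(int note_number, int J);  /* steps S193/S194 */

void interpret_guide_note(int note_number, bool *music_end,
                          int *J, int *dance_speed_flag_DF)
{
    if (note_number == NN_MUSIC_END) { *music_end = true; return; }   /* S196 */
    if (note_number == NN_MUSIC_START) return;                        /* S195 */

    *J = (*J == 1) ? 0 : 1;              /* steps S186 to S188: toggle J */

    if (note_number == NN_DANCE_FAST)      *dance_speed_flag_DF = 1;  /* S190 */
    else if (note_number == NN_DANCE_SLOW) *dance_speed_flag_DF = 0;  /* S192 */
    else register_guide_animation(note_number, *J);  /* guide-type numbers */
}
```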
Next is the explanation of another example of the direction guide “g”.
As has been discussed above, in the case of the present embodiment, if the cursor 105 is correctly manipulated in correspondence with the guides (the position guide “G”, the direction guide “g” and the path guide “rg”), the display of images (the dance object 106 and the background 110 in the above example) is controlled in accordance with the guidance by the guides. In this case, since the cursor 105 is correctly manipulated in correspondence with the guidance by the guides, the display of images is controlled in accordance with the manipulation of the cursor 105. In other words, since the cursor 105 moves in association with the operation article 150, the display of the images is controlled in accordance with the manipulation of the operation article 150. Also, the image of the operation article 150, which is intermittently lighted by the stroboscope, is captured by the imaging unit 13 in order to obtain the state information of the operation article 150. Because of this, no circuit which is driven by a power supply need be provided within the operation article 150 for obtaining the state information of the operation article 150. Furthermore, this music game apparatus 1 serves to automatically play music.
As a result, while music is automatically played irrespective of the manipulation by the player 94, the player 94 can enjoy, together with the music, images which are displayed in synchronization with the manipulation of the operation article 150, by manipulating the operation article 150 having a simple structure.
Also, since the display timings of the guides are controlled on the basis of the music, the operation article 150 is manipulated in synchronization with the music as long as the player 94 manipulates the cursor 105 in correspondence with the guides. Accordingly, the player 94 can enjoy the manipulation of the operation article 150 in synchronization with the music.
In this case, for example, in correspondence with the note numbers “55” and “67”, the high speed processor 200 scrolls the background 110 to the left, and prepares the dance animation information and the dance animation table for turning the dance object 106 in the counter clockwise direction. Also, for example, in correspondence with the note numbers “45” and “64”, the high speed processor 200 scrolls the background 110 to the right, and prepares the dance animation information and the dance animation table for turning the dance object 106 in the clockwise direction. Furthermore, for example, in correspondence with the note numbers “76” and “77”, the high speed processor 200 scrolls the background 110 in the downward direction, and prepares the dance animation information and the dance animation table for turning the dance object 106 in the counter clockwise direction. Still further, for example, in correspondence with the note numbers “65” and “74”, the high speed processor 200 scrolls the background 110 in the upward direction, and prepares the dance animation information and the dance animation table for turning the dance object 106 in the clockwise direction. Still further, for example, the dance animation information and the dance animation table for widely turning the dance object 106 in the clockwise direction are prepared in correspondence with the note number “53”. Still further, for example, the dance animation information and the dance animation table for widely turning the dance object 106 in the counter clockwise direction are prepared in correspondence with the note number “57”.
Incidentally, the above types of the note numbers (refer to
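For illustration only, the correspondence between note numbers and follow-up images described above may be summarized as a lookup table; the enumeration and field names below are assumptions of this sketch, while the note-number values follow the examples given in the text.

```c
/* Illustrative sketch of the note-number correspondence described above. */
typedef enum { SCROLL_NONE, SCROLL_LEFT, SCROLL_RIGHT,
               SCROLL_UP, SCROLL_DOWN } scroll_t;
typedef enum { TURN_CW, TURN_CCW, TURN_CW_WIDE, TURN_CCW_WIDE } turn_t;

typedef struct { int note_number; scroll_t scroll; turn_t turn; } follow_up_t;

static const follow_up_t follow_up_table[] = {
    { 55, SCROLL_LEFT,  TURN_CCW      }, { 67, SCROLL_LEFT,  TURN_CCW      },
    { 45, SCROLL_RIGHT, TURN_CW       }, { 64, SCROLL_RIGHT, TURN_CW       },
    { 76, SCROLL_DOWN,  TURN_CCW      }, { 77, SCROLL_DOWN,  TURN_CCW      },
    { 65, SCROLL_UP,    TURN_CW       }, { 74, SCROLL_UP,    TURN_CW       },
    { 53, SCROLL_NONE,  TURN_CW_WIDE  }, { 57, SCROLL_NONE,  TURN_CCW_WIDE },
};
```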
Also, in the case of the present embodiment, the position guide “G” serves to guide the manipulation timing and the destination position of the cursor 105. In addition, when the cursor 105 is correctly manipulated by the operation article 150 in correspondence with the guidance of the position guide “G”, the high speed processor 200 controls the display of images (the dance object 106 and the background 110 in the case of the above example) in correspondence with the direction toward the destination position which is guided by the position guide “G”.
Accordingly, when the player 94 manipulates the operation article 150 in order to move the cursor 105 to the destination position guided by the position guide “G” in the manipulation timing guided by the position guide “G”, the display of images is controlled in correspondence with the direction toward the destination position guided by the position guide “G”. As a result, it is possible to enjoy, together with music, the images which are synchronized with the cursor 105 which is moving in association with the motion of the operation article 150 (refer to
Furthermore, in the case of the present embodiment, the path guide “rg” serves to guide the moving path, the moving direction and the manipulation timing of the cursor 105. Accordingly, when the player 94 manipulates the operation article 150 in order to move the cursor 105 in the manipulation timing guided by the path guide “rg”, in the moving direction guided by the path guide “rg” and along the moving path guided by the path guide “rg”, the display of images (the dance object in the case of the above example) is controlled in correspondence with the path guide “rg”. As a result, it is possible to enjoy, together with music, the images which are synchronized with the cursor 105 which is moving in association with the motion of the operation article 150 (refer to
Furthermore, in the case of the present embodiment, if the position of the target point of the operation article 150 is found in the area guided by the position guide “G” within the period guided by the position guide “G”, it is determined that the cursor 105 which is moving in association with the operation article 150 is correctly manipulated in correspondence with the guidance of the position guide “G” (refer to
Furthermore, in the case of the present embodiment, the position guide “G” is displayed in each of a plurality of positions which are determined in advance on the screen 91. Then, the high speed processor 200 changes the appearance of the position guide “G” at a timing based on the music (the animation in which a bloom opens in the case of the example of
Furthermore, in the case of the present embodiment, the direction guide “g” and the path guide “rg” are expressed in images with which it is possible to visually recognize the motion from the first predetermined position to the second predetermined position on the screen 91. As has been discussed above, the manipulation of the cursor 105 is guided not only by the position guide “G” but also by the direction guide “g” and the path guide “rg”. Accordingly, the player 94 can clearly recognize the direction and path of the cursor 105 to be moved. More specific description is as follows.
The direction guide “g” and the path guide “rg” are expressed by the change in appearance of a plurality of objects (in the form of spheres in the case of the examples of
Also, the direction guide “g” is expressed by the motion of an object (in the form of a bird in the case of the example of
In addition to this, the direction guide “g” is expressed by the change in appearance of the path having a start point at the first predetermined position and an end point at the second predetermined position on the screen 91 (refer to
Furthermore, in the case of the present embodiment, the high speed processor 200 can calculate, as the state information of the operation article 150, any one of or any combination of two or more of speed information, moving direction information, moving distance information, velocity vector information, acceleration information, movement locus information, area information, and positional information. As has been discussed above, since a variety of information can be used as the state information of the operation article 150 for determining whether or not the cursor 105 is correctly manipulated in correspondence with the guides (the position guide “G”, the direction guide “g” and the path guide “rg”), the possibility of expression of guides is greatly expanded, and thereby the design freedom of the game content is also greatly increased.
Furthermore, in the case of the present embodiment, the state information of the operation article 150 can be obtained by intermittently emitting infrared light to the operation article 150 to which the reflection sheet 155 is attached and capturing the image thereof. Because of this, no circuit which is driven by a power supply need be provided within the operation article 150 for obtaining the state information of the operation article 150. Accordingly, it is possible to improve the manipulability and reliability of the operation article 150, and to reduce the cost.
Meanwhile, the present invention is not limited to the embodiments as described above, but can be applied in a variety of aspects without departing from the spirit thereof, and for example the following modifications may be effected.
(1) In the above embodiments, the dance object 106 and the background 110 are explained as images (follow-up images) which are controlled to follow the motion of the operation article 150. However, the present invention is not limited thereto, but any arbitrary object can be selected as such a follow-up image. Also, instead of scrolling the background 110 in order to express the motion of the dance object 106, it is possible to move the dance object 106 itself in the up, down, right and left directions.
(2) While the direct manipulation of the cursor 105 is guided by both the position guide “G” and the direction guide “g” in the above embodiments, it is possible to perform the guidance only by one of them. In the case where only the direction guide “g” is used to instruct the manipulation of the cursor 105, it is preferred that the position guide “G” in the form of a still image is arranged at each of the start point and the end point of the direction guide “g”. Also, while both the position guide “G” and the path guide “rg” are used to guide the turning manipulation of the cursor 105, it is possible to perform the guidance only by the path guide “rg”. Furthermore, while the guides (the position guide “G”, the direction guide “g” and the path guide “rg”) are expressed by animation in the above description, the present invention is not limited thereto. Still further, the implementation of a guide is not limited to those as described above.
(3) While the operation article 150 comprising the stick 152 and the reflection ball 151 is employed as an operation article in the above embodiments, the configuration of the operation article is not limited to those as described above as long as a reflecting object is provided.
(4) While the coordinates of the operation article 150 are calculated as illustrated in
(5) While an arbitrary type of a processor can be used as the high speed processor 200 of
While the present invention has been described in terms of embodiments, it is apparent to those skilled in the art that the invention is not limited to the embodiments as described in the present specification. The present invention can be practiced with modification and alteration within the spirit and scope which are defined by the appended claims. Accordingly, the description of this application is thus to be regarded as illustrative instead of limiting in any way on the present invention.
Number | Date | Country | Kind
---|---|---|---
2003-325381 | Sep 2003 | JP | national

Filing Document | Filing Date | Country | Kind | 371(c) Date
---|---|---|---|---
PCT/JP04/14025 | 9/17/2004 | WO | | 12/6/2006

Number | Date | Country
---|---|---
60515267 | Oct 2003 | US