The disclosure of Japanese Patent Application No. 2007-111177 is incorporated herein by reference.
1. Field of the Invention
The present invention relates to a game controller, a storage medium storing a game program, and a game apparatus, and particularly to the game controller including a plurality of load sensors, the storage medium storing a game program for performing game processing with the game controller, and the game apparatus.
2. Description of the Related Art
Conventionally, there is known a load detecting device including a plurality of load sensors in the field of medical equipment for the purpose of training such as rehabilitation.
For example, Japanese Patent Publication Laid-Open No. 62-34016 discloses a variable load display apparatus including the two load sensors. In the variable load display apparatus, legs are ridden on the load sensors respectively, and a balance between right and left is measured by displaying load values detected from the two load sensors.
Japanese Patent Publication Laid-Open No. 7-275307 discloses a barycentric movement training apparatus including three load detecting means. In the barycentric movement training apparatus, legs are ridden on a detecting board in which the three load detecting means are provided. A barycentric position is computed and displayed by computation of signals detected from the three load detecting means, thereby conducting barycentric movement training.
On the other hand, in a conventional general-purpose game controller, a cross key, for example, is provided, and at least four-directional instructions can be issued.
When application of the load-sensor apparatuses of Japanese Patent Publication Laid-Open Nos. 62-34016 and 7-275307 to a game controller is considered, the technique disclosed in Japanese Patent Publication Laid-Open No. 62-34016 enables instructions only in the right and left directions using the outputs of the right and left sensors, and the technique disclosed in Japanese Patent Publication Laid-Open No. 7-275307 enables instructions only in three directions using the load values of the three load detecting means.
Thus, unfortunately, the techniques disclosed in Japanese Patent Publication Laid-Open Nos. 62-34016 and 7-275307 can hardly be used as a general-purpose game controller, in which manipulations in at least four directions are required.
In the technique disclosed in Japanese Patent Publication Laid-Open No. 62-34016, because the values detected from the two load sensors are used directly, only simple measurement can be performed, and it is difficult to make an interesting game even if the technique is applied to game processing.
In the technique disclosed in Japanese Patent Publication Laid-Open No. 7-275307, the barycentric position is computed by the three load detecting means and an image indicating the barycentric position is displayed. However, various quantities of load values are not computed from the signals of the three load detecting means.
In view of the foregoing, an object of the present invention is to provide a novel game controller.
Another object of the present invention is to provide a game controller that can perform various manipulations using the load sensor.
Still another object of the present invention is to provide a novel storage medium storing a game program, in which a game controller including a plurality of load sensors is used, and a game apparatus.
Still another object of the present invention is to provide a storage medium storing a game program, which can compute a quantity of load values necessary for game processing to perform game processing using a game controller including a plurality of load sensors, and a game apparatus.
In the present invention, the following configurations are adopted to solve the problems. A parenthetic reference numeral and supplementary explanation indicate correlation with the later-described embodiments for the purpose of the easy understanding of the present invention, and do not restrict the present invention.
In a first aspect of the present invention, a game controller used in a game machine, includes a support portion on which player's legs are ridden; at least four load sensors disposed at predetermined intervals below the support portion; and a communication means for transmitting a load value as manipulation data detected from each of the four load sensors.
In the first aspect of the present invention, the game controller (10) is used as the manipulation means or input means of the game machine (52). The game controller includes the support portion (16) on which the player's legs are ridden, and at least the four load sensors (14) are disposed at predetermined intervals below the support portion. The load applied by the player riding on the support portion is detected by the four load sensors. The communication means (20, 42, S7) transmits the load value as the manipulation data detected from each of the four load sensors. Accordingly, in the game machine, the game can be performed based on the load values detected by the four load sensors.
According to the first aspect of the present invention, the four load sensors are provided, and the load value as the manipulation data detected from each of the four load sensors is transmitted to the game machine, so that the game controller that can perform various manipulations using the load applied by the player can be provided.
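For illustration only, the manipulation data of the first aspect can be pictured as a small record carrying one value per load sensor. The sketch below is a hypothetical layout; the structure name, field names, and transport function are assumptions and not part of the embodiments.

```c
/* Illustrative sketch only: one plausible layout for the manipulation data of
 * the first aspect, assuming each of the four load sensors 14 yields a 16-bit
 * value.  The type and function names are hypothetical, not from the embodiment. */
#include <stdint.h>
#include <stdio.h>

struct manipulation_data {
    uint16_t load[4];                 /* one load value per load sensor 14 */
};

/* Stand-in for the communication means (20, 42, S7): here it only prints. */
static void send_to_game_machine(const struct manipulation_data *d)
{
    printf("loads: %u %u %u %u\n",
           (unsigned)d->load[0], (unsigned)d->load[1],
           (unsigned)d->load[2], (unsigned)d->load[3]);
}

int main(void)
{
    struct manipulation_data d = { { 120, 118, 250, 247 } };  /* sample values */
    send_to_game_machine(&d);
    return 0;
}
```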
In a second aspect of the present invention, preferably a game controller according to the first aspect of the present invention further includes a power supply unit that supplies electric power to the load sensor; and a power supply control means for controlling power supply from the power supply unit to the load sensor, wherein the communication means includes a reception determining means for determining whether or not a load obtaining command is received from the game machine, the power supply control means supplies electric power from the power supply unit to the load sensor when the reception determining means determines that the load obtaining command is received, and the power supply control means stops electric power supply from the power supply unit to the load sensor when the reception determining means determines that the load obtaining command is not received.
In the second aspect of the present invention, the power supply from the power supply unit (30) to the load sensor is controlled by the power supply control means (20, 26). Specifically, the electric power is supplied to the load sensor when the reception determining means (20, S1) determines that the load obtaining command is received from the game machine, and the electric power supply is stopped when the reception determining means determines that the load obtaining command is not received. Accordingly, the electric power is supplied to detect the load only when the load is required, so that power consumption for operating the load sensor can be suppressed.
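A minimal sketch of the power control of the second aspect is given below, assuming a simple polling loop; the function names stand in for the microcomputer 20, the DC-DC converter 26, and the communication path, and are not taken from the embodiments.

```c
/* Minimal sketch (not actual firmware) of the second aspect: power reaches the
 * load sensors only while a load obtaining command is being serviced. */
#include <stdbool.h>
#include <stdio.h>

static bool load_obtaining_command_received(void) { return false; } /* stub: reception determining means */
static void dcdc_enable(bool on)   { printf("DC-DC %s\n", on ? "on" : "off"); }
static void read_and_send_loads(void) { /* detect loads and transmit them as manipulation data */ }

static void controller_poll_once(void)
{
    if (load_obtaining_command_received()) {
        dcdc_enable(true);          /* supply power to sensors, amplifiers, and ADC */
        read_and_send_loads();      /* transmit the detected load values            */
    } else {
        dcdc_enable(false);         /* stop the supply; suppress power consumption  */
    }
}

int main(void) { controller_poll_once(); return 0; }
```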
A third aspect of the present invention is a game controller according to the first aspect of the present invention, preferably the communication means includes a wireless communication unit receiving wirelessly the load obtaining command from the game machine; and a processing means for imparting the load value as the manipulation data detected from each of the four load sensors to the wireless communication unit when the wireless communication unit receives the load obtaining command, and the wireless communication unit wirelessly transmits the manipulation data received from the processing means to the game machine.
In the third aspect of the invention, the wireless communication unit (42) conducts wireless communication with the game machine. The processing means (20) imparts the load value as the manipulation data detected from each of the four load sensors to the wireless communication unit when the load obtaining command is received and the wireless communication unit transmits the manipulation data to the game machine. Accordingly, the wireless game controller that can wirelessly transmit and receive the data to and from the game machine can be provided.
A fourth aspect of the present invention is a game controller according to the second aspect of the present invention, preferably the communication means includes a wireless communication unit receiving wirelessly the load obtaining command from the game machine; and a processing means for imparting the load value as the manipulation data detected from each of the four load sensors to the wireless communication unit when the wireless communication unit receives the load obtaining command, and the wireless communication unit transmits wirelessly the manipulation data received from the processing means to the game machine.
In the fourth aspect of the present invention, similarly to the third aspect of the present invention, the wireless game controller that can wirelessly transmit and receive the data to and from the game machine can be provided.
A fifth aspect of the present invention is a game controller according to the first aspect of the present invention, preferably the communication means includes a connector unit detachable from a different type of game controller, and the communication means transmits the load value to the game machine through the different type of game controller attached to the connector unit.
In the fifth aspect of the present invention, the game controller is connected to the different type of game controller (54) by the connector unit (24). The communication means transmits the load value to the game machine through the different type of game controller. Accordingly, the extended type game controller that is used while attached to the different type of game controller by the connector unit can be provided.
A sixth aspect of the present invention is a game controller according to the first aspect of the present invention, preferably the communication means includes a command determining means for determining which load obtaining command in a plurality of types of load obtaining commands is received from the game machine; and a manipulation data computing means for computing a predetermined quantity of pieces of manipulation data according to the load obtaining command determined by the command determining means from the load value detected from each of the four load sensors.
In the sixth aspect of the present invention, the command determining means (20, S621 to S625) determines the type of load obtaining command from the game machine. The manipulation data computing means (20, S627, S631, S633) computes the predetermined quantity of pieces of manipulation data according to the determined command. Accordingly, because the quantity of pieces of manipulation data according to the command can be transmitted to the game machine, various quantities of pieces of manipulation data can be imparted to the game machine according to contents of the game.
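The following hedged sketch shows one way the microcomputer could map a received load obtaining command to a quantity of manipulation data; the three command types and the particular groupings of sensors are illustrative assumptions, not the embodiment's protocol.

```c
/* Hedged sketch of the sixth aspect: the microcomputer decides, from the type of
 * load obtaining command, how many pieces of manipulation data to compute from
 * the four detected load values. */
#include <stdio.h>

enum load_command { CMD_FOUR_VALUES, CMD_TWO_SUMS, CMD_ONE_TOTAL };

/* Returns how many values were written to out[]. */
static int compute_manipulation_data(enum load_command cmd,
                                     const double detected[4], double out[4])
{
    switch (cmd) {                          /* command determining means        */
    case CMD_FOUR_VALUES:                   /* send every sensor value          */
        for (int i = 0; i < 4; i++) out[i] = detected[i];
        return 4;
    case CMD_TWO_SUMS:                      /* e.g. one pair and the other pair */
        out[0] = detected[0] + detected[2];
        out[1] = detected[1] + detected[3];
        return 2;
    case CMD_ONE_TOTAL:                     /* single total load                */
        out[0] = detected[0] + detected[1] + detected[2] + detected[3];
        return 1;
    }
    return 0;
}

int main(void)
{
    const double detected[4] = { 12.0, 11.5, 25.0, 24.5 };
    double out[4];
    int n = compute_manipulation_data(CMD_TWO_SUMS, detected, out);
    for (int i = 0; i < n; i++) printf("value %d: %.1f\n", i, out[i]);
    return 0;
}
```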
In a seventh aspect of the present invention, preferably a game controller according to the first aspect of the present invention further includes a manipulation button provided in a surface different from an upper surface of the support portion to be manipulated by the player's legs.
In the seventh aspect of the invention, the manipulation button (40) manipulated by the player's legs is provided in the surface different from the upper surface of the support portion on which the player rides. When the manipulation button is manipulated, the communication means transmits the manipulation data of the manipulation button to the game machine. Accordingly, the game controller in which button manipulation can be performed by the legs can be provided.
In an eighth aspect of the present invention, a storage medium stores a game program executed in a computer of a game machine that performs a game using a game controller including a plurality of load sensors, wherein the game program causes the computer to execute a detection value obtaining step of obtaining a detection value outputted from each of the plurality of load sensors; a quantity determining step of determining a quantity of load values necessary for game processing; a load value computing step of computing the quantity of load values determined by the quantity determining step from a plurality of detection values; and a game processing step of performing game processing based on the load value.
In the eighth aspect of the present invention, the game program is executed by the computer (82) of the game machine (52), and the game is performed using the game controller (10) including the plurality of load sensors (14) in the game machine. The detection value outputted from each of the plurality of load sensors is obtained in the detection value obtaining step (S49, S399). The quantity of load values necessary for the game processing is determined in the quantity determining step (S53 to S57, S403, S405). The determined quantity of load values are computed from the plurality of detection values in the load value computing step (S101, S151, S153, S181, S311, S351, S431, S471, S511, S543). The game processing is performed based on the load value in the game processing step (S59, S61, S63, S407, S409).
According to the eighth aspect of the present invention, the quantity of load values necessary for the game processing is determined, and the necessary quantity of load values is computed from the plurality of load detection values. Therefore, various quantities of load values can be used in the game by combining the values of the plurality of load sensors, and novel play can be proposed.
A ninth aspect of the present invention is a storage medium storing a game program according to the eighth aspect of the present invention, preferably the game program causes the computer to further execute a selection step of causing a player to select a type of game, and the quantity determining step determines the necessary quantity according to the type of game by determining the type of game.
In the ninth aspect of the present invention, the type of game is selected by the player in the selection step (S43). In the quantity determining step (S53 to S57), the type of game is determined, and the necessary quantity according to the type of game is determined. According to the ninth aspect of the present invention, the necessary quantity of load values can appropriately be set in each game, and the necessary quantity of load values can be computed in each playing game to perform the game processing, so that various games can be proposed.
A tenth aspect of the present invention is a storage medium storing a game program according to the eighth aspect of the present invention, preferably the game program causes the computer to further execute a difference computing step of computing difference of the detection values outputted from the plurality of load sensors; and a correction step of correcting the detection value based on the difference computed by the difference computing step.
In the tenth aspect of the present invention, the difference of the detection values outputted from the plurality of load sensors is computed in the difference computing step (S41, S77, S85). In the correction step (S51, S401), the detection value is corrected based on the computed difference. Accordingly, a fluctuation in load detection value caused by an attitude and a standing position, etc. of the player can properly be corrected.
An eleventh aspect of the present invention is a storage medium storing a game program according to the tenth aspect of the present invention, preferably the difference computing step includes a first difference computing step of dividing the detection values into first two sets to compute difference between the first two sets, the detection value being outputted from each of the plurality of load sensors; and a second difference computing step of dividing the detection values into second two sets to compute difference between the second two sets, the detection value being outputted from each of the plurality of load sensors, the second two sets being different from the first two sets, and the correction step corrects the detection value based on the first difference and the second difference.
In the eleventh aspect of the present invention, the plurality of detection values are divided into the first two sets to compute the difference between the first two sets in the first difference computing step (S77). The plurality of detection values are divided into the second two sets different from the first two sets to compute the difference between the second two sets in the second difference computing step (S85). In the correction step, the detection value is corrected based on the first difference and the second difference. The plurality of load detection values are corrected based on the two types of differences of the different combinations, so that accuracy of load value correction can be improved.
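One plausible reading of this two-difference correction is sketched below, under the assumption that the four detection values are split into left/right and front/rear sets and that each measured imbalance, taken during a calibration pose, is distributed back over the sensors. The sensor ordering and the specific correction rule are assumptions, not the embodiment's method.

```c
/* A minimal sketch of one way the correction of the tenth and eleventh aspects
 * could work.  Assumed sensor order: v[0]=front-left, v[1]=front-right,
 * v[2]=rear-left, v[3]=rear-right. */
#include <stdio.h>

static void correct_detection_values(double v[4])
{
    double d_lr = (v[0] + v[2]) - (v[1] + v[3]);   /* first difference: left minus right  */
    double d_fr = (v[0] + v[1]) - (v[2] + v[3]);   /* second difference: front minus rear */

    /* Distribute each measured imbalance equally over the sensors of each set,
     * so that both differences vanish while the total load is preserved. */
    v[0] -= d_lr / 4.0 + d_fr / 4.0;
    v[2] -= d_lr / 4.0 - d_fr / 4.0;
    v[1] += d_lr / 4.0 - d_fr / 4.0;
    v[3] += d_lr / 4.0 + d_fr / 4.0;
}

int main(void)
{
    double v[4] = { 30.0, 24.0, 21.0, 25.0 };      /* sample detection values */
    correct_detection_values(v);
    printf("%.2f %.2f %.2f %.2f\n", v[0], v[1], v[2], v[3]);
    return 0;
}
```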
A twelfth aspect of the present invention is a storage medium storing a game program according to the eighth aspect of the present invention, preferably the game program causes the computer to further execute a load value comparing step of comparing the load values computed by the load value computing step, and the game processing step performs the game processing based on the load value that is determined to be the largest value as a result of comparison in the load value comparing step.
In the twelfth aspect of the present invention, the computed load values are compared with one another in the load value comparing step (S211 to S217, S271 to S277, S353 to S357, S433 to S437, S545 to S549), and the game processing is performed based on the maximum load value in the game processing step (S219 to S233, S279 to S285, S359 to S373, S439 to S453, S551 to S557).
According to the twelfth aspect of the present invention, the largest load value is selected from among the plurality of load values, and the game processing can be performed based on the selected load value. Accordingly, a novel manipulation can be realized by simple processing in which a character is moved in one of the vertical and horizontal directions according to the selected load value.
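A minimal sketch of the twelfth aspect follows, assuming the four computed load values are associated one-to-one with the up, down, left, and right directions; the mapping itself is an assumption made for illustration.

```c
/* Sketch of the twelfth aspect: the largest of the computed load values decides
 * the direction used by the game processing. */
#include <stdio.h>

enum direction { DIR_UP, DIR_DOWN, DIR_LEFT, DIR_RIGHT };

static enum direction pick_direction(const double load[4])
{
    int largest = 0;
    for (int i = 1; i < 4; i++)            /* load value comparing step          */
        if (load[i] > load[largest])
            largest = i;
    return (enum direction)largest;        /* game processing uses the largest   */
}

int main(void)
{
    const double load[4] = { 10.0, 14.0, 35.5, 12.0 };  /* sample computed load values */
    printf("move direction index: %d\n", pick_direction(load));
    return 0;
}
```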
A thirteenth aspect of the present invention is a storage medium storing a game program according to the eighth aspect of the present invention, preferably the quantity determining step determines the necessary quantity according to a scene by determining the scene in the game.
In the thirteenth aspect of the present invention, the necessary quantity is determined according to the scene by determining the scene in the game in the quantity determining step (S403,S405). According to the thirteenth aspect of the invention, the necessary quantity of load values can appropriately be set in each scene of the game, and the necessary quantity of load values can be computed in each scene of the game to perform the game processing. Therefore, the game can be played by various manipulations.
In a fourteenth aspect of the present invention, a storage medium stores a game program executed in a computer of a game machine that performs a game using a game controller including the plurality of load sensors, wherein the game program causes the computer to execute a quantity determining step of determining a quantity of load values necessary for game processing; a command transmitting step of transmitting a command according to the quantity determined by the quantity determining step to the game controller; a load value obtaining step of obtaining the quantity of load values according to the command from the game controller; and a game processing step of performing game processing based on the load value obtained by the load value obtaining step.
In the fourteenth aspect of the present invention, the game program is executed in the computer (82) of the game machine (52), and the game is performed using the game controller (10) including the plurality of load sensors (14) in the game machine. The quantity of load values necessary for the game processing is determined in the quantity determining step (S53 to S57, S403, S405). The command according to the determined quantity is transmitted to the game controller in the command transmitting step (S581, S585, S589, S601, S605). The quantity of load values according to the command is obtained from the game controller in the load value obtaining step (S583, S587, S591, S603, S607). The game processing is performed based on the obtained load value in the game processing step (S59, S61, S63, S407, S409).
According to the fourteenth aspect of the present invention, the command according to the necessary quantity is transmitted to the game controller, and the necessary quantity of load values according to the command can be obtained from the game controller. Therefore, the game processing can be performed based on the various quantities of load values according to contents of the game. Accordingly, various quantities of load values can be used in the game by a combination of values of the plurality of load sensors, and novel play can be proposed.
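A complementary sketch of the fourteenth aspect, seen from the game machine side, is given below: the needed quantity of load values selects a load obtaining command, the command is sent to the game controller, and exactly that many values are read back. The command names and the transport functions are hypothetical stand-ins, not the actual protocol.

```c
/* Sketch of the fourteenth aspect: command transmitting step and load value
 * obtaining step on the game machine side. */
#include <stdint.h>
#include <stdio.h>

enum load_command { CMD_FOUR_VALUES, CMD_TWO_SUMS, CMD_ONE_TOTAL };

static void send_command(enum load_command cmd) { printf("send command %d\n", (int)cmd); }
static uint16_t receive_value(void)             { return 123; }   /* stub transport */

static int obtain_load_values(int quantity, uint16_t values[4])
{
    enum load_command cmd = (quantity == 1) ? CMD_ONE_TOTAL
                          : (quantity == 2) ? CMD_TWO_SUMS
                          :                   CMD_FOUR_VALUES;
    send_command(cmd);                          /* command transmitting step     */
    for (int i = 0; i < quantity; i++)
        values[i] = receive_value();            /* load value obtaining step     */
    return quantity;                            /* game processing step follows  */
}

int main(void)
{
    uint16_t v[4];
    int n = obtain_load_values(2, v);
    printf("obtained %d load values\n", n);
    return 0;
}
```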
In a fifteenth aspect of the present invention, a game apparatus executing a game played by a load of a player includes a manipulation means including a support portion on which player's legs are ridden and a plurality of load sensors detecting a load applied to the support portion; a quantity determining means for determining a quantity of load values necessary for game processing; a load value computing means for computing the quantity of load values based on a detection value detected by each of the plurality of load sensors, the quantity of load values being determined by the quantity determining means; and a game processing means for performing the game processing based on the load value computed by the load value computing means.
In the fifteenth aspect of the present invention, the game apparatus (50,52) includes the manipulation means (10) to perform the game played by the load of the player, and the manipulation means includes the support portion (16) on which the player's legs are ridden and the plural load sensors (14) detecting the load applied to the support portion. The quantity determining means (82, S53 to S57, S403 to S405, 20, S621 to S625) determines the quantity of load values necessary for the game processing. The load value computing means (82, S101, S151, S153, S181, S311, S351, S431, S471, S511, S543, 20, S627, S631, S633) computes the determined quantity of load values based on the detection values of the plurality of load sensors. The game processing means (82, S59, S61, S63, S407, S409) performs the game processing based on the computed load value.
According to the fifteenth aspect of the present invention, the four load sensors are provided in the manipulation means, the quantity of load values necessary for the game processing is determined, and the necessary quantity of load values is computed from the plurality of load detection values, so that the game processing can be performed based on the various quantities of load values by various combinations of values of the plurality of load sensors. Accordingly, a novel game played by various manipulations according to the load of the player can be performed.
According to the present invention, the load of the player is detected by at least the four load sensors and the detected load value is set to the manipulation data to perform the game processing, so that the game controller that can perform various manipulations using the load sensors can be provided.
The necessary quantity is determined and the necessary quantity of load values is computed, so that various quantities of load values can be used in the game processing by various combinations of values of the plural load sensors. Accordingly, a novel game played by the load of the player using the game controller including the plurality of load sensors can be proposed.
The above described objects and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
With reference to
The board 12 is formed as a substantially rectangular solid, and the board 12 has a substantially square shape when viewed from above. For example, one side of the square is set in a range of about 30 cm to 50 cm. The upper surface of the board 12, on which the player rides, is formed flat. The side faces at the four corners of the board 12 are formed so as to partially project in a cylindrical shape.
In the board 12, the four load sensors 14 are arranged at predetermined intervals. In the embodiment, the four load sensors 14 are arranged in peripheral portions of the board 12, specifically, at the four corners. The interval between the load sensors 14 is set to an appropriate value such that the player's intention can accurately be detected from the load applied to the board 12 in a game manipulation.
As can be seen from
The support plate 16 includes an upper-layer plate 16a that constitutes an upper surface and an upper side face, a lower-layer plate 16b that constitutes a lower surface and a lower side face, and an intermediate-layer plate 16c provided between the upper-layer plate 16a and the lower-layer plate 16b. For example, the upper-layer plate 16a and the lower-layer plate 16b are formed by plastic molding and integrated with each other by bonding. For example, the intermediate-layer plate 16c is formed by pressing one metal plate. The intermediate-layer plate 16c is fixed onto the four load sensors 14. The upper-layer plate 16a has a lattice-shaped rib (not shown) in a lower surface thereof, and the upper-layer plate 16a is supported by the intermediate-layer plate 16c while the rib is interposed.
Accordingly, when the player rides on the board 12, the load is transmitted to the support plate 16, the load sensor 14, and the leg 18. As shown by an arrow in
The load sensor 14 is formed by, e.g., a strain gage (strain sensor) type load cell, and the load sensor 14 is a load transducer that converts the input load into an electric signal. In the load sensor 14, a strain inducing element 14a is deformed to generate a strain according to the input load. The strain is converted into a change in electric resistance by a strain sensor 14b adhering to the strain inducing element 14a, and the change in electric resistance is converted into a change in voltage. Accordingly, the load sensor 14 outputs a voltage signal indicating the input load from an output terminal when the voltage is imparted to the load sensor 14 from a power supply terminal.
Other types of load sensors, such as a tuning-fork vibrating type, a string vibrating type, an electrostatic capacity type, a piezoelectric type, a magnetostriction type, and a gyroscope type, may be used as the load sensor 14.
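Because the load cell ultimately reports a voltage, the digitized reading has to be mapped back to a load. A hedged sketch of such a linear conversion is shown below; the calibration constants are invented for illustration and would in practice come from calibration of the actual load sensor 14.

```c
/* Hedged sketch: converting an AD-converted sensor reading back to a load.
 * A strain-gauge load cell is close to linear, so a zero offset and a scale
 * factor are assumed here; both constants are illustrative. */
#include <stdio.h>

#define ADC_ZERO_OFFSET  512.0   /* ADC counts with no load (assumed)      */
#define KG_PER_COUNT       0.1   /* kilograms per ADC count (assumed)      */

static double adc_to_kilograms(int adc_count)
{
    return (adc_count - ADC_ZERO_OFFSET) * KG_PER_COUNT;
}

int main(void)
{
    printf("%.1f kg\n", adc_to_kilograms(812));   /* -> 30.0 kg with these constants */
    return 0;
}
```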
The game controller 10 includes a microcomputer 20 that controls an operation of the game controller 10. The microcomputer 20 includes a ROM and a RAM (not shown) and controls the operation of the game controller 10 according to a program stored in the ROM.
An AD converter 22, a connector 24, and a DC-DC converter 26 are connected to the microcomputer 20. In
The connector 24 is provided such that the game controller 10 conducts communication with a game machine 52 (see
A battery 30 that supplies the electric power is also accommodated in the game controller 10. However, in the embodiment, the electric power is supplied to the microcomputer 20 from an external device, such as the game machine 52 and the different type of controller 54, which is connected using the connector 24. On the other hand, the electric power is supplied from the battery 30 to the load sensor 14, the amplifier 28, and the AD converter 22 through the DC-DC converter 26. The DC-DC converter 26 converts a direct-current voltage from the battery 30 into a different voltage to impart the converted voltage to the load sensor 14, the amplifier 28, and the AD converter 22.
The electric power may be supplied to the load sensor 14, the AD converter 22, and the amplifier 28 if needed such that the microcomputer 20 controls the DC-DC converter 26. That is, when the microcomputer 20 determines that a need to operate the load sensor 14 to detect the load arises, the microcomputer 20 may control the DC-DC converter 26 to supply the electric power to each load sensor 14, each amplifier 28, and the AD converter 22.
Once the electric power is supplied, each load sensor 14 outputs a signal indicating the input load. The signal is amplified by each amplifier 28, and the analog signal is converted into digital data by the AD converter 22. Then, the digital data is inputted to the microcomputer 20. Identification information on each load sensor 14 is imparted to the detection value of each load sensor 14, allowing for distinction among the detection values of the load sensors 14. Thus, the microcomputer 20 can obtain the pieces of data indicating the detection values of the four load sensors 14 at the same time.
On the other hand, when the microcomputer 20 determines that the need to operate the load sensor 14 does not arise, i.e., when the microcomputer 20 determines it is not the time the load is detected, the microcomputer 20 controls the DC-DC converter 26 to stop the supply of the electric power to the load sensor 14, the amplifier 28, and the AD converter 22. Thus, in the game controller 10, the load sensor 14 is operated to detect the load only when needed, so that the power consumption for detecting the load can be suppressed.
Typically, the time the load detection is required means the time the game machine 52 (
The data indicating the detection value from the load sensor 14 is transmitted as the manipulation data (input data) of the game controller 10 from the microcomputer 20 to the game machine 52 (
The controller 54 is a game controller of a different type from the game controller 10. In the embodiment, the controller 54 is a main game controller of the game machine 52, and the game controller 10 is prepared as an extended unit of the controller 54 in order to utilize a wireless communication function of the controller 54 with the game machine 52. The game controller 10 is connected to the controller 54 by the connector 24 that is located at the front end of the cable 32 extending from the board 12. For the purpose of distinction, sometimes the controller 54 is referred to as “remote control”.
The game machine 52 includes a housing 56 having a substantially rectangular solid, and a disk slot 58 is provided in a front face of the housing 56. An optical disk 60 that is of an example of an information storage medium in which the game program and the like are stored is inserted from the disk slot 58 and placed on a disk drive 62 (see
A power button 64a and a reset button 64b are provided in the front face and in the upper portion of the housing 56 of the game machine 52, and an eject button 64c is provided in the front face and in the lower portion of the housing 56. An external memory card connector cover 66 is provided between the reset button 64b and the eject button 64c and near the disk slot 58. An external memory card connector 68 (see
A general-purpose SD card can be used as the memory card, and other general-purpose memory cards such as Memory Stick and MultiMediaCard (registered trademark) can also be used.
Although not shown in
The electric power of the game machine 52 is supplied by a general AC adaptor (not shown). The AC adaptor is inserted into a standard wall socket in the home, and the game machine 52 converts the home-use power supply (commercial power supply) into a low DC voltage signal suitable for driving the game machine 52. In another embodiment, a battery may be used as the power supply.
In order for a user or player to play a game (or an application other than a game) with the game system 50, the user turns on the power of the game machine 52, appropriately selects the optical disk 60 in which a program of a video game (or of another application to be played) is stored, and loads the optical disk 60 into the disk drive 62 of the game machine 52. Accordingly, the game machine 52 starts executing the video game or the other application based on the program recorded on the optical disk 60. The user manipulates the remote control 54 or the game controller 10 to impart input to the game machine 52. For example, the game or the other application can be started by manipulating one of the input means 80, such as the various manipulation buttons provided on the remote control 54, or by using the game controller 10. In addition to the manipulation of the input means 80, the movement of the remote control 54 itself or the use of the game controller 10 can move a moving picture object (player object) in a different direction or change the viewpoint (camera position) of the user in the 3D game world.
Alternatively, the program of the video game or another application may be stored (installed) in the internal memory (flash memory 70) of the game machine 52 and executed from the internal memory. In such cases, the program stored in the storage medium such as the optical disk 60 may be installed in the internal memory, or a downloaded program may be installed in the internal memory.
The external main memory 86 is used to store the programs such as the game program or various kinds of data, and the external main memory 86 is used as a work area or a buffer area of the CPU 82. The ROM/RTC 88 is a so-called boot ROM into which a program starting up the game machine 52 is incorporated, and a time circuit is provided to count time in the ROM/RTC 88. The disk drive 62 reads the program or texture data or the like from the optical disk 60, and the disk drive 62 writes the program or texture data or the like in an internal main memory 84e or an external main memory 86 under the control of the CPU 82.
An input and output processor 84a, a GPU (Graphics Processor Unit) 84b, a DSP (Digital Signal Processor) 84c, a VRAM 84d, and an internal main memory 84e are provided in the system LSI 84 and connected to one another by an internal bus (not shown).
The input and output processor (I/O processor) 84a performs the transmission and reception of the data or the download of the data.
The GPU 84b constitutes a part of a drawing means and receives a graphics command (graphics generation command) from the CPU 82 to generate game image data according to the command. In addition to the graphics command, the CPU 82 imparts an image producing program necessary to produce the game image data to the GPU 84b.
Although not shown, as described above, the VRAM 84d is connected to the GPU 84b. The GPU 84b accesses the VRAM 84d to obtain the data (image data such as polygon data and texture data) necessary to execute the graphics generation command. The CPU 82 writes the image data necessary for the graphics generation in the VRAM 84d through the GPU 84b. The GPU 84b accesses the VRAM 84d to produce the game image data for drawing.
In the embodiment, the explanation will be made in a case where the GPU 84b produces the game image data. However, in a case where any application except for the game application is executed, the GPU 84b produces the image data for the application.
The DSP 84c acts as an audio processor that produces audio data corresponding to the sound, voice, or music outputted from the speaker 76a using sound data or sound waveform (tone) data stored in the internal main memory 84e or external main memory 86.
The game image data and audio data produced in the above-described ways are read by the AVIC 90 and outputted to the monitor 76 and speaker 76a through the AV connector 72. Accordingly, a game screen is displayed on the monitor 76, and the sound (music) necessary for the game is outputted from the speaker 76a.
A flash memory 70, a wireless communication module 92, and a wireless controller module 94 are connected to the input and output processor 84a. An extended connector 96 and the memory card connector 68 are also connected to the input and output processor 84a. An antenna 92a is connected to the wireless communication module 92 and an antenna 94a is connected to the wireless controller module 94.
The input and output processor 84a can conduct communication with another game apparatus and various servers connected to a network through the wireless communication module 92. However, the input and output processor 84a can directly conduct communication with another game apparatus without the network. The input and output processor 84a periodically accesses the flash memory 70 to detect the presence or absence of data (referred to as “transmission data”) necessary to be transmitted to the network, and the input and output processor 84a can transmit the transmission data to the network through the wireless communication module 92 and antenna 92a when the transmission data exists. The input and output processor 84a receives data (referred to as “reception data”) transmitted from another game apparatus through the network, antenna 92a, and wireless communication module 92, and the input and output processor 84a can store the reception data in the flash memory 70. However, the reception data is directly destroyed in the case where the reception data does not satisfy a predetermined condition. The input and output processor 84a receives data (referred to as “download data”) downloaded from a download server through the network, antenna 92a, and wireless communication module 92, and the input and output processor 84a can store the download data in the flash memory 70.
The input and output processor 84a receives the input data (manipulation data) transmitted from the remote control 54 through the antenna 94a and wireless controller module 94, and the input and output processor 84a stores (temporarily stores) the input data in the buffer area in the internal main memory 84e or external main memory 86. The input data is erased from the buffer area after being used in processing (for example, game processing) of the CPU 82.
In the embodiment, as described above, the wireless controller module 94 conducts communication with the remote control 54 pursuant to the Bluetooth standard.
Moreover, the extended connector 96 and the memory card connector 68 are connected to the input and output processor 84a. The extended connector 96 is a connector used for an interface such as a USB and an SCSI, and a medium such as an external storage medium or a peripheral device such as a controller different from the remote control 54 can be connected to the extended connector 96. The wired LAN can also be used instead of the wireless communication module 92 by connecting a wired LAN adaptor to the extended connector 96. An external storage medium such as the memory card can be connected to the memory card connector 68. Accordingly, the input and output processor 84a can access the storage medium to store or read the data through the extended connector 96 or memory card connector 68.
Although the detailed description is omitted, the power button 64a, the reset button 64b, and the eject button 64c are provided in the game machine 52 (housing 56) as shown in
Although the electric power is supplied to the system LSI 84 even in the standby mode, the GPU 84b, the DSP 84c, and the VRAM 84d are not driven to reduce the power consumption by stopping clock supply to the GPU 84b, the DSP 84c, and the VRAM 84d.
Although not shown, a fan is provided in the housing 56 of the game machine 52 to discharge heat of ICs such as the CPU 82 and the system LSI 84 to the outside. The fan is also stopped in the standby mode.
When a setting in which the standby mode is not utilized is selected, the electric power supply to all the circuit components is completely stopped when the power button 64a is turned off.
Switching between the normal mode and the standby mode can be performed remotely by turning on/off a power switch 80h (see
The reset button 64b is also connected to the system LSI 84. When the reset button 64b is pressed, the system LSI 84 restarts a start-up program of the game machine 52. The eject button 64c is connected to the disk drive 62. When the eject button 64c is pressed, the optical disk 60 is ejected from the disk drive 62.
With reference to
The cross key 80a is a four-directional push switch, and the cross key 80a includes manipulation portions of four directions shown by arrows, i.e., forward (or upward), backward (or downward), rightward, and leftward directions. For example, the player can provide the instruction of moving direction of a manipulable character or object (player character or player object) or cursor by manipulating one of the manipulation portions.
The (1) button 80b and the (2) button 80c are push-button switches. For example, the (1) button 80b and the (2) button 80c are used in the game manipulation such that the viewpoint position or the viewpoint direction, i.e., the position or view angle of a virtual camera, is adjusted when the three-dimensional game image is displayed. Alternatively, the (1) button 80b and the (2) button 80c may be used to perform the same manipulations as the A button 80d and B trigger switch 80i or a supplementary manipulation.
The A button switch 80d is a push-button switch used to cause the player character or player object to perform motions except for the directional instruction, i.e., any action such as punch, throw, grasp (obtaining), ride, and jump. For example, in an action game, the user can provide the instructions such as the jump, punch, and movement of a weapon. In a role-playing game (RPG) or a simulation RPG, the user can provide the instructions such as obtaining of an item and selection and determination of the weapon or command. The A button switch 80d is also used to instruct the determination of an icon indicated by a pointer (indicated image) or a button image on the game screen. For example, when the icon or button image is determined, the instruction or command (game command) previously set corresponding to the icon or button image can be inputted.
Similarly, the (−) button 80e, the HOME button 80f, the (+) button 80g, and the power button 80h are push-button switches. For example, the (−) button 80e is used to select a game mode. The HOME button 80f is used to display a game menu (menu screen). The (+) button 80g is used to start (resume) the game or suspend the game. The power switch 80h is used to remotely turn on/off the power of the game machine 52.
In the embodiment, a power switch for turning on/off the remote control 54 itself is not provided. The remote control 54 is turned on by manipulating one of the input means 80 of the remote control 54, and the remote control 54 is automatically turned off when it is not manipulated for a predetermined time (for example, 30 seconds) or longer.
The B trigger switch 80i is also a push-button switch, and is mainly used to perform the input emulating a trigger such as shooting or specify the position selected by the remote control 54. When the B trigger switch 80i is continuously pressed, the motion or a parameter of the player object can be kept at a constant state. In a certain state, the B trigger switch 80i acts as the normal B button, and the B trigger switch 80i is used to delete the action or command or the like determined by the A button 80d.
As shown in
The remote control 54 includes an imaging information computation unit 104 (see
The shape of the remote control 54 and the shape, the quantity, and the installation position, etc. of each input means 80 are shown in
A power supply circuit 124 supplies the electric power to each component of the remote control 54. Typically the power supply circuit 124 is a battery exchangeably accommodated in the housing 98. The power supply circuit 124 can also supply the electric power to the extended units (such as the game controller 10) connected through the external extended connector 100.
Although not shown in
The processor 112 controls the whole of the remote control 54. The processor 112 transmits (inputs) pieces of information (input information) inputted by the input means 80, acceleration sensor 116, and imaging information computation unit 104 and pieces of information (such as data from the game controller 10) obtained through the external extended connector 100 to the game machine 52 through the wireless module 118 and antenna 118a in the form of the input data (manipulation data). At this point, the processor 112 uses the memory 114 as the work area or buffer area. The manipulation signals (pieces of manipulation data) from the input means 80 (80a to 80i) are inputted to the processor 112, and the processor 112 temporarily stores the pieces of manipulation data in the memory 114.
The acceleration sensor 116 detects acceleration in each of three axes of a longitudinal direction (y-axis direction), a crosswise direction (x-axis direction), and a fore-and-aft direction (z-axis direction) of the remote control 54. Typically, an electrostatic capacity type acceleration sensor is used as the acceleration sensor 116. However, an acceleration sensor of a different type may be used.
For example, the acceleration sensor 116 detects the acceleration (ax, ay, az) for the x-axis, y-axis, and z-axis at first predetermined time intervals, and the acceleration sensor 116 inputs the detected acceleration data to the processor 112. For example, the acceleration sensor 116 detects the acceleration in each axial direction in a range of −2.0 g to 2.0 g (g is gravitational acceleration, hereinafter the same). The processor 112 detects the acceleration data imparted from the acceleration sensor 116 at second predetermined time intervals, and the processor 112 temporarily stores the acceleration data in the memory 114.
The processor 112 produces input data (manipulation data) including at least one of the manipulation data, the acceleration data, and later-mentioned marker coordinate data, and the processor 112 transmits the produced input data to the game machine 52 at third predetermined time intervals (for example, 5 ms). The processor 112 can add the data received from the game controller 10 through the external extended connector 100 to the input data.
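For illustration, the input data assembled by the processor 112 can be pictured as a single record in which button, acceleration, marker coordinate, and extended-unit data travel together; the field names and sizes below are assumptions, not the actual packet format.

```c
/* Illustrative sketch of the input data the remote control 54 sends at the third
 * predetermined time intervals.  The layout is an assumption for illustration. */
#include <stdint.h>
#include <stdio.h>

struct input_data {
    uint16_t buttons;        /* manipulation data from the input means 80        */
    int16_t  accel[3];       /* acceleration data for the x-, y-, and z-axes     */
    uint16_t marker[2][2];   /* marker coordinate data for markers 78m and 78n   */
    uint16_t ext_load[4];    /* data from the game controller 10, if connected   */
};

int main(void)
{
    printf("input data size: %zu bytes\n", sizeof(struct input_data));
    return 0;
}
```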
Although not shown in
At this point, those skilled in the art easily understand from the description of the present invention that the computer such as the processor (for example, CPU 82) of the game machine 52 and the processor (for example, processor 112) of the remote control 54 can perform processing to estimate or compute (determine) further information on the remote control 54 based on the acceleration data outputted from the acceleration sensor 116.
For example, in the case where the uni-axial acceleration sensor 116 is mounted on the remote control 54 to perform the processing on the computer side while the remote control 54 is assumed to be in the static state, namely, in the case where the processing is performed while the acceleration detected by the acceleration sensor 116 is assumed to be formed only by the gravitational acceleration, whether or not an attitude of the remote control 54 is inclined with respect to the gravitational direction or how much the attitude of the remote control 54 is inclined with respect to the gravitational direction can be understood based on the detected acceleration data when the remote control 54 is actually in the static state. Specifically, on the basis of the state in which the acceleration sensor 116 has a vertically-downward detection axis, whether or not the attitude of the remote control 54 is inclined by the application of 1 g (gravitational acceleration) and how much the attitude of the remote control 54 is inclined by a magnitude of the acceleration can be understood.
In the case where the multi-axis acceleration sensor 116 is mounted on the remote control 54, how much the attitude of the remote control 54 is inclined with respect to the gravitational direction can be understood in detail by performing processing to the acceleration data of each axis. In this case, the processor 112 may perform processing for computing data of an inclination angle of the remote control 54 based on the output of the acceleration sensor 116, or processing for roughly estimating the inclination may be performed without performing the processing for computing the data of the inclination angle based on the output from the acceleration sensor 116. Thus, the inclination, attitude, or position of the remote control 54 can be determined by the combination of the acceleration sensor 116 and the processor 112.
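As a concrete example of the inclination estimate discussed above, the sketch below recovers the angle between one body axis of the remote control 54 and the gravitational direction from a static three-axis reading; the choice of axis and the function name are assumptions.

```c
/* Sketch: when the remote control is (approximately) static, the measured
 * acceleration is mostly gravity, so the angle between a chosen body axis and
 * the vertical can be recovered from the acceleration data. */
#include <math.h>
#include <stdio.h>

static const double PI = 3.14159265358979323846;

/* Angle, in degrees, between the y-axis of the sensor and the gravitational
 * direction, given a static reading (ax, ay, az) in units of g. */
static double inclination_from_gravity(double ax, double ay, double az)
{
    double norm = sqrt(ax * ax + ay * ay + az * az);
    if (norm == 0.0)
        return 0.0;                       /* no usable reading */
    return acos(ay / norm) * 180.0 / PI;
}

int main(void)
{
    /* Gravity measured entirely along z: the y-axis is horizontal (~90 deg). */
    printf("%.1f deg\n", inclination_from_gravity(0.0, 0.0, -1.0));
    return 0;
}
```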
On the other hand, assuming that the acceleration sensor 116 is in a dynamic state, because the acceleration is detected according to the motion of the acceleration sensor in addition to the gravitational acceleration component, the direction of the motion and the like can be understood when the gravitational acceleration component is removed by predetermined processing. Specifically, in the case where the remote control 54 on which the acceleration sensor 116 is mounted is moved by the user while dynamically accelerated, various motions and/or positions of the remote control 54 can be computed by processing the acceleration data produced by the acceleration sensor 116.
Even if the acceleration sensor 116 is assumed to be in the dynamic state, the inclination can be understood with respect to the gravitational direction when the acceleration corresponding to the motion of the acceleration sensor 116 is removed by predetermined processing. In another embodiment, the acceleration sensor 116 may include a built-in signal processing unit or another dedicated processing unit in order to perform desired processing to the acceleration signal (acceleration data) outputted from the built-in acceleration detection means before the acceleration data is outputted to the processor 112. For example, in the case where the acceleration sensor 116 is used to detect the static acceleration (for example, gravitational acceleration), the built-in or dedicated processing unit may convert the detected acceleration data into the corresponding inclination angle (or other preferable parameter).
The wireless module 118 uses, e.g., the Bluetooth technique to modulate a carrier wave having a predetermined frequency using the input data, and the wireless module 118 radiates the weak radio signal from the antenna 118a. That is, the input data is modulated into the weak radio signal by the wireless module 118 and transmitted from the antenna 118a (remote control 54). The weak radio signal is received by the wireless controller module 94 provided in the game machine 52. Demodulation and decoding are performed to the received weak radio signal, which allows the game machine 52 (CPU 82) to obtain the input data from the remote control 54. The CPU 82 can perform the application processing (game processing) according to the obtained input data and the application program (game program).
As described above, the imaging information computation unit 104 is provided in the remote control 54. The imaging information computation unit 104 includes an infrared filter 104a, a lens 104b, an imaging device 104c, and an image processing circuit 104d. The infrared filter 104a transmits only the infrared ray in the light incident from ahead of the remote control 54. As described above, the markers 78m and 78n disposed near (around) the display screen of the monitor 76 are infrared LEDs that output the infrared ray ahead of the monitor 76. Accordingly, the images of the markers 78m and 78n can be taken more exactly by providing the infrared filter 104a. The infrared ray transmitted through the infrared filter 104a is outputted to the imaging device 104c through the lens 104b. The imaging device 104c is a solid-state imaging device, such as a CMOS sensor or a CCD, which images the infrared ray collected by the lens 104b. Accordingly, the imaging device 104c images only the infrared ray transmitted through the infrared filter 104a to produce the image data. Hereinafter the image taken by the imaging device 104c is referred to as taken image. The image data produced by the imaging device 104c is processed by the image processing circuit 104d. The image processing circuit 104d computes the position of the imaging target (markers 78m and 78n) in the taken image and outputs each coordinate value indicating the position as imaging data (later-mentioned marker coordinate data) to the processor 112 at fourth predetermined time intervals. The processing performed in the image processing circuit 104d is described later.
In the case where the position and orientation of the remote control 54 are out of the range, the game manipulation cannot be performed based on the position and orientation of the remote control 54. Hereinafter the range is referred to as “manipulable range”.
In the case where the remote control 54 is grasped in the manipulable range, the images of the markers 78m and 78n are taken by the imaging information computation unit 104. That is, the taken image obtained by the imaging device 104c includes the images (target images) of the markers 78m and 78n that are of the imaging target.
Because the target image appears as a high-brightness portion in the image data of the taken image, the image processing circuit 104d detects the high-brightness portion as a candidate of the target image. Then, the image processing circuit 104d determines whether or not the high-brightness portion is the target image based on the size of the detected high-brightness portion. Sometimes the taken image includes not only images 78m′ and 78n′ corresponding to the two markers 78m and 78n that are of the target image but also the image except for the target image due to the sunlight from a window or a fluorescent light. The processing of the determination whether or not the high-brightness portion is the target image is performed in order to distinguish the images 78m′ and 78n′ that are of the target image from other images to exactly detect the target image. Specifically, the determination whether or not the detected high-brightness portion has the size within a predetermined range is made in the determination processing. When the high-brightness portion has the size within the predetermined range, it is determined that the high-brightness portion indicates the target image. On the contrary, when the high-brightness portion does not have the size within the predetermined range, it is determined that the high-brightness portion indicates the image except for the target image.
Then, the image processing circuit 104d computes the position of the high-brightness portion for the high-brightness portion in which it is determined as a result of the determination processing that the high-brightness portion indicates the target image. Specifically, a barycentric position of the high-brightness portion is computed. Hereinafter the coordinate of the barycentric position is referred to as marker coordinate. The barycentric position can be computed in more detail compared with resolution of the imaging device 104c. At this point, it is assumed that the image taken by the imaging device 104c has the resolution of 126×96 and the barycentric position is computed in a scale of 1024×768. That is, the marker coordinate is expressed by an integer number of (0, 0) to (1024, 768).
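A simplified sketch of this marker coordinate computation is given below: bright pixels are collected, blobs of implausible size are rejected, and the barycentric position is scaled up to the 1024×768 marker coordinate scale. The thresholds and the single-blob scan are illustrative simplifications of the image processing circuit 104d.

```c
/* Sketch of the marker coordinate computation: brightness threshold, size check,
 * and barycentric (centroid) position scaled to the marker coordinate scale. */
#include <stdio.h>

#define IMG_W 126                  /* sensor resolution per the text            */
#define IMG_H 96
#define BRIGHTNESS_THRESHOLD 200   /* assumed 8-bit brightness cut-off          */
#define MIN_PIXELS 2               /* assumed valid target-image sizes          */
#define MAX_PIXELS 200

/* Returns 1 and writes the marker coordinate if a valid target image is found. */
static int marker_centroid(unsigned char img[IMG_H][IMG_W], int *mx, int *my)
{
    long sum_x = 0, sum_y = 0, count = 0;
    for (int y = 0; y < IMG_H; y++)
        for (int x = 0; x < IMG_W; x++)
            if (img[y][x] >= BRIGHTNESS_THRESHOLD) {   /* high-brightness portion */
                sum_x += x;
                sum_y += y;
                count++;
            }
    if (count < MIN_PIXELS || count > MAX_PIXELS)      /* size check: not a marker */
        return 0;
    *mx = (int)(sum_x * 1024 / (count * IMG_W));       /* scale to 0..1024 */
    *my = (int)(sum_y *  768 / (count * IMG_H));       /* scale to 0..768  */
    return 1;
}

int main(void)
{
    static unsigned char img[IMG_H][IMG_W];            /* all dark by default      */
    img[48][63] = img[48][64] = img[49][63] = 255;     /* one small bright blob    */
    int mx, my;
    if (marker_centroid(img, &mx, &my))
        printf("marker coordinate: (%d, %d)\n", mx, my);
    return 0;
}
```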
The position in the taken image is expressed by a coordinate system (XY-coordinate system) in which an origin is set to an upper left of the taken image, a downward direction is set to a positive Y-axis direction, and a rightward direction is set to a positive X-axis direction.
In the case where the target image is correctly detected, two marker coordinates are computed because the two high-brightness portions are determined as the target image by the determination processing. The image processing circuit 104d outputs the pieces of data indicating the two computed marker coordinates. As described above, the outputted pieces of marker coordinate data are added to the input data by the processor 112 and transmitted to the game machine 52.
When the game machine 52 (CPU 82) detects the marker coordinate data from the received input data, the game machine 52 can compute the position (indicated coordinate) indicated by the remote control 54 on the screen of the monitor 76 and the distances between the remote control 54 and the markers 78m and 78n based on the marker coordinate data. Specifically, the position toward which the remote control 54 is orientated, i.e., the indicated position is computed from the position at the midpoint of the two marker coordinates. When the coordinate of the position indicated by the remote control 54 is computed from the marker coordinate, the coordinate system of the taken image of
In the embodiment, the remote control 54 performs the predetermined computation processing to the imaging data to detect the marker coordinate, and the marker coordinate data is transmitted to the game machine 52. Alternatively, in another embodiment, the imaging data is transmitted as the input data from the remote control 54 to the game machine 52, and the CPU 82 of the game machine 52 may perform the predetermined computation processing to the imaging data to detect the marker coordinate and the coordinate of the indicated position.
The distance between the target images in the taken image is changed according to the distances between the remote control 54 and the markers 78m and 78n. The distance between the markers 78m and 78n, a width of the taken image, and the view angle θ2 of the imaging device 104c are previously determined, so that the game machine 52 can compute the current distances between the remote control 54 and the markers 78m and 78n by computing the distance between the marker coordinates.
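Under a simple pinhole-camera assumption, the relation described above may be written as the following sketch; the marker spacing and the view angle value are illustrative figures and are not taken from the specification.

import math

def distance_to_markers(m1, m2,
                        marker_spacing_m=0.20,   # assumed physical distance between the markers 78m and 78n
                        image_width=1024,        # width of the taken image in marker-coordinate units
                        view_angle_deg=41.0):    # assumed view angle theta2 of the imaging device 104c
    """Estimate the distance from the remote control 54 to the markers 78m and 78n."""
    pixel_spacing = math.hypot(m1[0] - m2[0], m1[1] - m2[1])   # distance between the marker coordinates
    # real-world width that the full image width would cover at the distance of the markers
    covered_width = marker_spacing_m * image_width / pixel_spacing
    # half the covered width subtends half the view angle
    return (covered_width / 2) / math.tan(math.radians(view_angle_deg) / 2)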
In the game system 50, the game controller 10 is used for the game manipulation by the load applied by the player. The connector 24 of the game controller 10 is connected to the external extended connector 100 of the remote control 54, thereby connecting the game controller 10 and the remote control 54. This enables the game controller 10 to transmit and receive the data to and from the game machine 52 through the remote control 54.
As described above, the game controller 10 can detect the load only when the game machine 52 requires the load. Specifically, when the game machine 52 requires the load detected by the game controller 10, the CPU 82 of the game machine 52 wirelessly transmits the load obtaining command for the game controller 10 to the remote control 54. When the processor 112 of the remote control 54 receives the load obtaining command from the game machine 52, the processor 112 transmits the load obtaining command to the game controller 10 through the external extended connector 100. When the microcomputer 20 of the game controller 10 receives the load obtaining command through the connector 24 and cable 32, the microcomputer 20 controls the DC-DC converter 26 to supply the electric power to the load sensor 14, amplifier 28, and AD converter 22. Therefore, the signal indicating the load applied to each load sensor 14 is outputted, and the signal is amplified by each amplifier 28 and imparted to the AD converter 22. The AD converter 22 converts the signal into the digital data and outputs the digital data to the microcomputer 20. Accordingly, the microcomputer 20 can obtain the load detection value data detected by each of the four load sensors 14.
The microcomputer 20 transmits the obtained load detection value data to the remote control 54 through the cable 32 and connector 24. The load detection value data may directly be transmitted, or the load detection value data may be transmitted after predetermined computation processing is performed to the detection value or computation processing is performed according to the load obtaining command. The processor 112 of the remote control 54 stores the load detection value data in the memory 114 when receiving the load detection value data from the game controller 10 through the external extended connector 100. The processor 112 produces the input data (manipulation data) including the load detection value data and transmits the input data to the game machine 52 through the wireless module 118. The CPU 82 of the game machine 52 obtains the load value of the game controller 10 from the received input data, and the CPU 82 can use the load value for the game processing. Accordingly, the player can perform various game manipulations by the load applied to the game controller 10.
For example, in the case of a game performed based on the simple total value of the four load values detected by the four load sensors 14, the player can take any position with respect to the four load sensors 14 of the game controller 10, that is, the player can play the game while riding on any position of the board 12 with any orientation. However, depending on the type of the game, it is necessary to perform processing while determining toward which direction the load value detected by each load sensor 14 is orientated when viewed from the player. That is, it is necessary to understand a positional relationship between the four load sensors 14 of the game controller 10 and the player. For example, the positional relationship between the four load sensors 14 and the player is previously defined, and it may be assumed that the player rides on the board 12 such that the predetermined positional relationship is obtained. Typically, there is defined such a positional relationship that two load sensors 14 each exist at the front and the back of and on the right and left sides of the player riding on the center of the board 12, i.e., such a positional relationship that the load sensors 14 exist in the right front, left front, right rear, and left rear directions from the center of the player respectively when the player rides on the center of the board 12 of the game controller 10. In the typical game in which the screen of the monitor 76 is located at the front of the player, as shown in
A spot may be provided on the board 12 in order to provide the player with information on such an arrangement of the game controller 10 that the predetermined positional relationship is obtained. For example, in order that the two adjacent load sensors 14 on a predetermined one side of the board 12 are disposed at the front of the player, i.e., on the side of the monitor 76, the spot may be provided in a predetermined portion such as the upper surface or side face along the predetermined one side of the board 12. Alternatively, the cable 32 of the connector 24 may be configured to be extracted from a predetermined portion in the side face or lower surface along the predetermined one side of the board 12, and the position from which the cable 32 is extracted may be set to the spot.
The game controller 10 and the game machine 52 can understand which direction each load detection value corresponds to when viewed from the player, based on the identification information on each load sensor 14 included in the load detection value data and the previously-set (stored) information on the arrangement of the load sensors 14. Accordingly, the intention of the game manipulation performed by the player, such as the front, rear, right, and left manipulation directions, can be understood.
The arrangement of the load sensors 14 relative to the player need not be previously defined; the arrangement may instead be set by the player's input in the initial setting or the like. For example, the load is obtained while a screen in which the player is instructed to ride on the portion in a predetermined direction (such as the right front, left front, right rear, or left rear direction) when viewed from the player is displayed. Therefore, the positional relationship between each load sensor 14 and the player can be specified, and the information on the arrangement can be generated and stored by the setting.
A positional relationship in which the load sensors 14 exist at the front and back of and on right and left sides of the player respectively may be assumed in another embodiment. In this case, the game controller 10 is disposed such that one predetermined corner of the board 12 exists on the side of the monitor 76 while a predetermined diagonal line is parallel to the screen. The spot may be provided in the upper surface or side face at the one predetermined corner of the board 12.
In the game system 50, the quantity of load values necessary for the game processing is determined, and the determined quantity of load values is computed from the four load detection values. The game processing is performed based on the necessary quantity of load computation values. Because the necessary quantity of load values is computed from the four load detection values to perform the game processing, novel play can be proposed with the game controller 10 including the load sensors 14, and various games can be performed.
In the embodiment, the game is performed such that the necessary quantity of load values is kept constant.
A game selection program is stored in a memory area 204. The game selection program is used to select the game (mode) to be performed. For example, one game is selected from the plurality of games (the total load game, the right and left balance game, and the four-directional balance game) by the input of the player. In the case where the selection is made by the player input, a game selection screen having icons corresponding to the plurality of games is displayed, and the icon is selected by the position indication using the imaging information computation unit 104 of the remote control 54, the indication using the cross key 80a, the indication using the game controller 10, or the like. Alternatively, the game may be selected according to the performance order previously determined in the program, or the game may randomly be selected.
A command transmission program is stored in a memory area 206. The command transmission program is used to transmit the load obtaining command to the game controller 10. The load obtaining command is transmitted when the load value is required (load obtaining timing). The load obtaining timing may be set so as to come at regular time intervals, or the load obtaining timing may be set only when a predetermined game status or an event is generated. When the game controller 10 receives the load obtaining command, in the game controller 10, the load is detected by the load sensor 14 and the load detection value is transmitted to the game machine 52.
A load detection value obtaining program is stored in a memory area 208. The load detection value obtaining program is used to receive and obtain the load detection value transmitted from the game controller 10.
A correction program is stored in a memory area 210. The correction program is used to correct the obtained load detection value. For example, even if the player believes that the player rides on the center of the board 12, namely, even if the player believes that the player rides on the board 12 such that the barycenter is located in the center of the board 12, sometimes a variation in the load values detected by the four load sensors 14 is generated depending on individual characteristics such as the attitude of the player, a standing position, a physical feature (such as a difference in length of the legs), and a habit. Accordingly, in the embodiment, the load detection value is appropriately corrected to accurately recognize the game manipulation by the load of the player. The correction is performed based on the differences among the load detection values of the four load sensors 14. Specifically, the detection value is corrected based on a correction value computed by a correction value computing program described below.
The correction value computing program is stored in a memory area 212. The correction value computing program is used to compute the correction value for correcting the load detection value. The correction value computation is performed by the initial setting before the game is started. For example, the image in which the player is instructed to ride on the center of the board 12 of the game controller 10 is displayed on the monitor 76, and the four load detection values detected by the four load sensors 14 are obtained. Two kinds of differences are computed from the four load detection values in different combinations, and the correction value is computed based on the two kinds of differences. Specifically, the four load sensors 14 are divided into first two sets, i.e., into the right and the left, the right load value and the left load value are computed, and a first difference is computed by taking the difference between the right load value and the left load value. A first correction value is computed to correct each of the load detection values divided into the right and left based on the first difference. The four load sensors 14 are divided into second two sets, i.e., into the upper and lower portions, the upper load value and the lower load value are computed, and a second difference is computed by taking the difference between the upper load value and the lower load value. A second correction value is computed to correct each of the load detection values divided into the upper and lower portions based on the second difference. Then, a final correction value of each load detection value is computed based on the first correction value and the second correction value. Each load detection value is corrected based on each final correction value.
For example, assuming that the left load has the value of 60 and the right load has the value of 40, the first difference becomes 20, and the first correction value is computed by equally dividing the first difference into four. That is, the first correction value for the upper left load sensor 14a and lower left load sensor 14b becomes −5 (=−20/4), and the first correction value for the upper right load sensor 14c and lower right load sensor 14d becomes 5 (=20/4). Assuming that the upper load has the value of 30 and the lower load has the value of 70, the second difference becomes 40, and the second correction value is computed by equally dividing the second difference into four. That is, the second correction value for the upper left load sensor 14a and upper right load sensor 14c becomes 10 (=40/4), and the second correction value for the lower left load sensor 14b and lower right load sensor 14d becomes −10 (=−40/4). The correction values finally set for the four load sensors 14 are computed based on the first correction value and the second correction value. Specifically, the final correction value is computed by adding the first correction value and the second correction value. That is, the final correction value for the upper left load sensor 14a becomes +5 (=−5+10), the final correction value for the lower left load sensor 14b becomes −15 (=−5−10), the final correction value for the upper right load sensor 14c becomes +15 (=5+10), and the final correction value for the lower right load sensor 14d becomes −5 (=5−10).
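The arithmetic of this worked example can be reproduced by the following minimal sketch; the sensor labels and argument names are introduced only for the illustration.

def compute_final_corrections(upper_left, lower_left, upper_right, lower_right):
    """Return the final correction values for the load sensors 14a, 14b, 14c, and 14d."""
    first_diff = (upper_left + lower_left) - (upper_right + lower_right)    # left load value - right load value
    # first correction value: equally divide the first difference into four
    c1 = {'14a': -first_diff / 4, '14b': -first_diff / 4,
          '14c': +first_diff / 4, '14d': +first_diff / 4}
    second_diff = (lower_left + lower_right) - (upper_left + upper_right)   # lower load value - upper load value
    # second correction value: equally divide the second difference into four
    c2 = {'14a': +second_diff / 4, '14c': +second_diff / 4,
          '14b': -second_diff / 4, '14d': -second_diff / 4}
    # final correction value = first correction value + second correction value
    return {sensor: c1[sensor] + c2[sensor] for sensor in c1}

# The example above (left 60 / right 40, upper 30 / lower 70) gives +5, -15, +15, and -5:
print(compute_final_corrections(upper_left=20, lower_left=40, upper_right=10, lower_right=30))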
A necessary quantity determining program is stored in a memory area 214. The necessary quantity determining program is used to determine the quantity of load values necessary for the game processing. In the embodiment, because the total load game, the right and left balance game, and the four-directional balance game, etc. as described above are performed, the necessary quantity of load values is determined in each game or game mode or the like, and information such as a quantity table in which the necessary quantity of load values is set in each game or game mode or the like is previously stored. Accordingly, the necessary quantity of load values can be determined by the identification information such as a game name or the type or mode of the game. In another embodiment, the necessary quantity of load values may be changed according to the scene or status or the like of the game. In such cases, the necessary quantity of load values is determined by the scene of the game or the like.
A load value computing program is stored in a memory area 216. The load value computing program is used to compute the quantity of load values necessary for the game processing based on the load detection values from the four load sensors 14. In the case where the correction is performed by the correction program, the load value is computed based on the corrected load detection value. Specifically, the summation (total load value) of the four load detection values is computed in the case of the total load game, the right and left load values are computed in the case of the right and left balance game, and the left load value, the right load value, the upper load value, and the lower load value are computed in the case of the four-directional balance game. Because the necessary quantity of load values is computed from the four load detection values, the game processing can be performed using various quantities of load values according to the game. Depending on the game, sometimes the load detection value (corrected load detection value) is directly used. In such cases, the load detection value (corrected load detection value) is directly computed as the load computation value.
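As a sketch of how the necessary quantity might be looked up in the quantity table and the corresponding load computation values produced, the following example assumes a table keyed by game name and corrected load detection values keyed by sensor position; both namings are assumptions for the illustration.

QUANTITY_TABLE = {'total load game': 1,
                  'right and left balance game': 2,
                  'four-directional balance game': 4}   # assumed contents of the quantity table

def compute_load_values(game, v):
    """v: corrected load detection values keyed 'upper_left', 'lower_left', 'upper_right', 'lower_right'."""
    quantity = QUANTITY_TABLE[game]                      # necessary quantity of load values
    if quantity == 1:                                    # total load game: the summation only
        return [sum(v.values())]
    left  = v['upper_left'] + v['lower_left']
    right = v['upper_right'] + v['lower_right']
    if quantity == 2:                                    # right and left balance game
        return [left, right]
    upper = v['upper_left'] + v['upper_right']
    lower = v['lower_left'] + v['lower_right']
    return [left, right, upper, lower]                   # four-directional balance game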
A game processing program is stored in a memory area 218. The game processing program is used to perform the game processing based on the load computation value. In the embodiment, the game processing is performed for the total load game, the right and left balance game, and the four-directional balance game, etc.
A memory area 220 is an input data buffer in which the pieces of input data (manipulation data) from the game controller 10 and remote control 54 are stored. A memory area 222 is a selection game memory area where the identification information on the game selected by the game selection program is stored.
The load detection values of the four load sensors 14 obtained from the input data buffer 220 using the load detection value obtaining program are stored in a memory area 224. The correction values, i.e., the final correction values for the four load sensors 14 computed using the correction value computing program are stored in a memory area 226. The load detection values corrected using the correction program are stored in a memory area 228.
The quantity table indicating the quantity of load values necessary for the game processing is stored in a memory area 230. In the embodiment, the necessary quantity of load values is stored while correlated with the name or type of the game or the like. The load computation value obtained using the load value computing program is stored in a memory area 232.
A squat flag is stored in a memory area 234. The squat flag indicates whether the player is in action or at rest in the total load game. For example, the squat flag is turned on when the change in the summation (total load value) of the four load detection values is not lower than a predetermined value. A squat success counter indicating the number of times the player has successfully performed the squat is stored in a memory area 236. A time counter is stored in a memory area 238. In the squat determination of the total load game, the time counter is used to measure a time for a one-time squat. In the right and left balance game and the four-directional balance game, the time counter is used to measure a time during which each load computation value falls within the predetermined target range.
If “YES” in the step S1, the microcomputer 20 controls the DC-DC converter 26 to supply the electric power to the four load sensors 14 in a step S3. At the same time, the electric power is also supplied to each amplifier 28 and the AD converter 22. Accordingly, each load sensor 14 imparts the signal to the AD converter 22 through each amplifier 28 according to the detected load, and the AD converter 22 produces the data indicating the load detection value of each load sensor 14 and imparts the data to the microcomputer 20.
In a step S5, the microcomputer 20 obtains the load detection values from the four load sensors 14. Specifically, the microcomputer 20 obtains the pieces of data indicating the four load detection values from the AD converter 22 and stores the pieces of data in an internal memory (not shown).
In a step S7, the microcomputer 20 transmits the obtained four pieces of load detection value data to the game machine 52. In the embodiment, the load detection value data is transmitted to the remote control 54 through the connector 24, and the load detection value data is transmitted from the remote control 54 to the game machine 52.
On the other hand, if “NO” in the step S1, that is, when the load obtaining command is not received from the game machine 52, the microcomputer 20 controls the DC-DC converter 26 to stop the electric power supplied to the four load sensors 14 in a step S9. The electric power supplied to each amplifier 28 and the AD converter 22 is also stopped at the same time. When the step S7 or S9 is ended, the processing is ended. Thus, in the game controller 10, the electric power is supplied from the battery 30 to the load sensors 14 and the like only when the load detection is required, so that the power consumption can be suppressed to a low level.
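A minimal sketch of the steps S1 to S9 follows; the helper objects for the DC-DC converter 26 and the AD converter 22 and their method names are hypothetical and serve only to show the flow.

def microcomputer_cycle(command_received, dcdc, adc, transmit):
    """One pass of the processing by the microcomputer 20."""
    if command_received:                                           # step S1: load obtaining command received?
        dcdc.power_on()                                            # step S3: power the sensors, amplifiers, and ADC
        load_values = [adc.read(channel) for channel in range(4)]  # step S5: four load detection values
        transmit(load_values)                                      # step S7: transmit toward the game machine 52
    else:
        dcdc.power_off()                                           # step S9: stop the power to suppress consumption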
In a step S21, the processor 112 determines whether or not the load obtaining command is received from the game machine 52 through the wireless module 118. If “YES” in the step S21, the processor 112 transmits the load obtaining command to the game controller 10 through the connector 100 in a step S23. Therefore, in the game controller 10, the load value is detected as described above and the load detection value data is transmitted to the remote control 54.
When the step S23 is ended, or if “NO” in the step S21, the processor 112 determines whether or not the load detection value data is received from the game controller 10 through the connector 100 in a step S25. If “YES” in the step S25, the processor 112 stores the received four pieces of load detection value data in the memory 114 in a step S27. In a step S29, the processor 112 produces the input data (manipulation data) including the four pieces of load detection value data and transmits the input data to the game machine 52 through the wireless module 118. This enables the four load detection values to be imparted from the game controller 10 to the game machine 52. The transmission is performed when the load detection value data is received from the game controller 10. Alternatively, the transmission may be performed at the predetermined time when the remote control 54 transmits the input data including the manipulation data of the input means 80, the acceleration data detected by the acceleration sensor 116, and the marker coordinate data from the imaging information computation unit 104. When the step S29 is ended or if “NO” in the step S25, the processing is ended.
In a step S41, the CPU 82 performs correction value computing processing. The correction value computing processing is performed according to the correction value computing program, and
In a step S71 of
In a step S73, the CPU 82 computes the summation of the two load detection values on the left side, i.e., the left load value. In a step S75, the CPU 82 computes the summation of the two load detection values on the right side, i.e., the right load value. In a step S77, the CPU 82 computes the difference (first difference) between the left load value and the right load value. In a step S79, the CPU 82 computes the correction values of the load detection values from the four load sensors 14 based on the computed first difference. The correction value is the first correction value obtained by dividing the four load sensors 14 into the left and right, and the first correction value is computed by equally dividing the first difference into four and by allocating the equally-divided difference to each load sensor 14. Accordingly, absolute values of the first correction values for the load sensors 14 are equal to one another, and the left side differs from the right side in the sign.
In a step S81, the CPU 82 computes the summation of the two load detection values on the upper side, i.e., the upper load value. In a step S83, the CPU 82 computes the summation of the two load detection values on the lower side, i.e., the lower load value. In a step S85, the CPU 82 computes the difference (second difference) between the upper load value and the lower load value. In a step S87, the CPU 82 computes the correction values of the load detection values from the four load sensors 14 based on the computed second difference. The correction value is the second correction value obtained by vertically dividing the four load sensors 14 into the two sets, and the second correction value is computed by equally dividing the second difference into four and by allocating the equally-divided difference to each load sensor 14. Accordingly, absolute values of the second correction values for the load sensors 14 are equal to one another, and the upper side differs from the lower side in the sign.
In a step S89, the CPU 82 computes the final correction values of the four load sensors 14 based on the two computed correction values. Specifically, the first correction value and the second correction value are added to each other for each load sensor 14, thereby computing the finally-set correction values. In a step S91, the CPU 82 writes the final correction value of each of the four load sensors 14 in the correction value memory area 226. When the step S91 is ended, the correction value computing processing is ended and the processing returns to a step S43 of
In the step S43 of
Then, the CPU 82 starts the processing for the selected game. In a step S45, the CPU 82 determines whether or not it is the load obtaining timing. The load obtaining timing is a time when the load value is required in the game processing. In the case where the load is required at regular time intervals, the processing is configured to determine that it is the load obtaining timing at regular time intervals. Alternatively, the load obtaining timing may be a time at which a predetermined event is generated or a time at which a predetermined status is generated in the game. The processing in the step S45 is performed at regular time intervals until the CPU 82 determines that it is the load obtaining timing.
If “YES” in the step S45, the CPU 82 transmits the load obtaining command to the game controller 10 in a step S47. Specifically, the CPU 82 transmits the load obtaining command to the remote control 54 through the wireless controller module 94, etc. In response to the transmission of the load obtaining command, the manipulation data including the four pieces of load detection value data is transmitted from the game controller 10 (remote control 54) through the processing performed by the remote control 54 and the game controller 10. The four pieces of load detection value data are received through the wireless controller module 94, etc. and stored in the input data buffer 220. In a step S49, the CPU 82 obtains the four pieces of load detection value data from the game controller 10. Specifically, the CPU 82 reads the four pieces of load detection value data from the input data buffer 220 and stores the four pieces of load detection value data in the load detection value memory area 224.
In a step S51, the CPU 82 corrects the four load detection values based on the correction values stored in the correction value memory area 226. Specifically, the CPU 82 adds the final correction values for the four load sensors 14 to the four load detection values respectively, and the CPU 82 stores the computed values in the memory area 228 for the corrected load detection values.
In steps S53 to S57, the CPU 82 determines the quantity of load values necessary for the game processing. In the embodiment, the necessary quantity of load values is kept constant in each selection game, so that the necessary quantity of load values corresponding to the selection game stored in the memory area 222 can be specified by referring to the quantity table stored in the memory area 230.
In the step S53, the CPU 82 determines whether or not the necessary quantity of load values is one. If “YES” in the step S53, the CPU 82 performs the game processing 1 in a step S59. In the embodiment, the necessary quantity of load values is one in the total load game, and
On the other hand, if “NO” in the step S53, the CPU 82 determines whether or not the necessary quantity of load values is two in the step S55. If “YES” in the step S55, the CPU 82 performs the game processing 2 in a step S61. In the embodiment, the necessary quantity of load values is two in the right and left balance game, and
If “NO” in the step S55, the CPU 82 determines whether or not the necessary quantity of load values is four in the step S57. If “YES” in the step S57, the CPU 82 performs the game processing 3 in a step S63. In the embodiment, the necessary quantity of load values is four in the four-directional balance game, and
If “NO” in the step S57, the CPU 82 performs another piece of game processing in a step S65.
A game end determination is made in each game processing. When the game is not ended, the processing returns to the step S45. Accordingly, the pieces of processing from the step S45 are repeated and the game advances until the CPU 82 determines that the game is ended. On the other hand, when the CPU 82 determines that the game is ended, the game processing in the step S59, S61, S63, or S65 is ended.
In a step S103, the CPU 82 stores the summation (total load value) in the memory. Specifically, the summation is written in the load computation value memory area 232. A history of the summation is stored in the memory area 232.
On the basis of the summation, it is determined whether or not the squat is performed. The change in the summation is increased while the player is performing the squat, that is, the change from the previous load obtaining timing becomes a predetermined value or more. Accordingly, when the change in the summation is not lower than the predetermined value, it can be determined that the squat is being performed, and the summation at that time is recorded. When the squat is finished, the change from the previous load obtaining timing becomes lower than the predetermined value. Accordingly, when the change in the summation becomes lower than the predetermined value, it is considered that the one-time squat is finished, and it is determined whether or not the squat is actually performed based on the recorded waveform of the temporal change in the summation. The squat determination is made by a determination of a vertical length in the waveform and a determination of a horizontal length in the waveform. That is, the determination of whether or not the squat is performed is made by such squat conditions that the difference between the maximum value and the minimum value of the summation in the squat is not lower than a predetermined value and that the elapsed time during the squat is not lower than a predetermined value.
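The determination described above may be sketched as follows; the three threshold values are assumptions, and the history of the summation is processed in one pass here merely to show the two waveform conditions.

CHANGE_THRESHOLD = 5.0    # assumed threshold on the change in the summation between load obtaining timings
HEIGHT_THRESHOLD = 20.0   # assumed minimum vertical length (max - min of the summation)
LENGTH_THRESHOLD = 10     # assumed minimum horizontal length (number of load obtaining timings)

def count_squats(totals):
    """totals: the history of the summation (total load value) at each load obtaining timing."""
    successes, squat_flag, waveform = 0, False, []
    for previous, current in zip(totals, totals[1:]):
        if abs(current - previous) >= CHANGE_THRESHOLD:    # large change: the squat is being performed
            squat_flag = True
            waveform.append(current)
        elif squat_flag:                                    # change settled: the one-time squat is finished
            tall_enough = max(waveform) - min(waveform) >= HEIGHT_THRESHOLD
            long_enough = len(waveform) >= LENGTH_THRESHOLD
            if tall_enough and long_enough:
                successes += 1                              # count the number of squat success times
            squat_flag, waveform = False, []
    return successes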
Specifically, in a step S105, the CPU 82 determines whether or not the difference between the current summation and the previous summation is not lower than the predetermined value, namely, whether the player is performing the squat or is at rest. If “YES” in the step S105, namely, in the case where it can be considered that the squat is being performed, the CPU 82 turns on the squat flag of the memory area 234 in a step S107.
In a step S109, the CPU 82 increments the time counter of the memory area 238. This enables an elapsed time to be measured when the squat flag is turned on.
In a step S111, the CPU 82 determines whether or not the game is ended. Examples of the game end condition include that the squat is performed the predetermined number of times and that a predetermined time-limit elapses since the game is started. If “NO” in the step S111, the processing returns to the step S45 of
On the other hand, if “NO” in the step S105, the CPU 82 determines whether or not the squat flag of the memory area 234 is turned on in a step S113. At this point, the CPU 82 determines whether or not the state in which the squat is performed is changed to the rest state, that is, whether or not the one-time squat is ended. If “YES” in the step S113, the CPU 82 turns off the squat flag of the memory area 234 in a step S115. In a step S117, the CPU 82 resets the time counter of the memory area 238. However, because the time counter records the elapsed time from when the squat flag is turned on until the squat flag is turned off, i.e., the time of the current squat, the value indicated by the time counter is stored in another predetermined area of the data memory area 202 before the time counter is reset so that the elapsed time can be used for the squat determination.
In a step S119, the CPU 82 detects the maximum value and the minimum value in the summation history stored in the memory area 232 while the squat flag is turned on, and computes the difference between the maximum value and the minimum value. In a step S121, the CPU 82 determines whether or not the difference between the maximum value and the minimum value is not lower than a predetermined value. That is, the CPU 82 determines whether or not the vertical length in the waveform of the summation is not lower than the predetermined value. If “YES” in the step S121, the CPU 82 determines whether or not the time count is not lower than the predetermined value in a step S123. The determination is made based on the time that elapsed while the squat flag was turned on, which is stored in the predetermined area before the time counter is reset in the step S117. That is, the CPU 82 determines whether or not the horizontal length in the waveform of the summation is not lower than the predetermined value. If “YES” in the step S123, that is, when the CPU 82 recognizes that the squat is performed, the CPU 82 increments the squat success counter of the memory area 236, namely, the CPU 82 counts the number of squat success times in a step S125. When the step S125 is ended, the processing goes to the step S111. If “NO” in the step S121, or if “NO” in the step S123, because the CPU 82 cannot recognize that the squat is performed, the processing goes directly to the step S111. If “NO” in the step S113, that is, when the CPU 82 recognizes that the player is not performing the squat but is at rest, the processing also goes to the step S111.
If “YES” in the step S111, that is, when the game end condition is satisfied, the CPU 82 turns off the squat flag of the memory area 234 in a step S127, and the CPU 82 resets the time counter of the memory area 238 in a step S129. In a step S131, the CPU 82 performs score processing based on the number of squat success times. The number of squat success times is recorded in a squat success counter of the memory area 236, and the score of the player is computed based on the number of squat success times. In a step S133, the CPU 82 resets the squat success counter of the memory area 236 to end the game processing 1.
The method for computing the right and left load values is not limited to the embodiment. Alternatively, the right and left load values can be computed using the summation of the four load detection values. For example, the summation (total load value) of the four load detection values and the summation (right load value) of the two load detection values on the right side are computed, and the summation (left load value) of two load detection values on the left side may be computed from the difference (or ratio) between the load values.
In a step S155, the CPU 82 determines whether or not the right and left load values fall within respective predetermined ranges as targets. If “YES” in the step S155, the CPU 82 increments the time counter of the memory area 238 in a step S157. Therefore, the time during which the horizontal balance maintains the target state is measured.
In a step S159, the CPU 82 determines whether or not a predetermined time (for example, three seconds) elapses based on the time counter value of the memory area 238. That is, the CPU 82 determines whether or not the horizontal balance state in which the right and left load values fall within the predetermined ranges respectively is maintained for three seconds. If “YES” in the step S159, that is, when the target horizontal balance state is maintained for the predetermined time, the CPU 82 performs game clear processing in a step S161, and the CPU 82 resets the time counter of the memory area 238 in a step S163. If “NO” in the step S159, the processing goes directly to a step S167.
If “NO” in the step S155, that is, when the target horizontal balance is not achieved, the CPU 82 resets the time counter of the memory area 238 in a step S165. When the step S165 is ended, the processing goes to the step S167.
In the step S167, the CPU 82 determines whether or not the game is ended. Examples of the game end condition include that a predetermined time elapses since the game is started, that the maintenance of the target horizontal balance state is not achieved, and that the predetermined number of right and left balance games is cleared. If “NO” in the step S167, the processing returns to the step S45 of
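A sketch of the target-range check and the time counter of the right and left balance game (steps S155 to S167) follows; the target ranges and the number of load obtaining timings corresponding to three seconds are assumptions.

TARGET_LEFT = (45.0, 55.0)     # assumed target range for the left load value
TARGET_RIGHT = (45.0, 55.0)    # assumed target range for the right load value
TIMINGS_FOR_CLEAR = 180        # assumed number of load obtaining timings corresponding to three seconds

def balance_step(left, right, time_counter):
    """One load obtaining timing; returns (cleared, new time counter value)."""
    in_target = (TARGET_LEFT[0] <= left <= TARGET_LEFT[1] and
                 TARGET_RIGHT[0] <= right <= TARGET_RIGHT[1])
    if not in_target:
        return False, 0                       # step S165: the target balance is broken, reset the counter
    time_counter += 1                         # step S157: measure how long the balance is maintained
    if time_counter >= TIMINGS_FOR_CLEAR:     # step S159: maintained for the predetermined time?
        return True, 0                        # steps S161 and S163: game clear, reset the counter
    return False, time_counter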
In a step S183, the CPU 82 determines whether or not the four load computation values fall within respective predetermined ranges as targets. If “YES” in the step S183, the CPU 82 increments the time counter of the memory area 238 in a step S185. Therefore, the time during which the four-directional balance maintains the target state is measured.
In a step S187, the CPU 82 determines whether or not the predetermined time (for example, three seconds) elapses based on the time counter value of the memory area 238. That is, the CPU 82 determines whether or not the four-directional balance state in which the vertical and horizontal load values fall within the predetermined ranges respectively is maintained for three seconds. If “YES” in the step S187, that is, when the target four-directional balance state is maintained for the predetermined time, the CPU 82 performs the game clear processing in a step S189, and the CPU 82 resets the time counter of the memory area 238 in a step S191. If “NO” in the step S187, the processing goes directly to a step S195.
If “NO” in the step S183, that is, when the target four-directional balance state is not achieved, the CPU 82 resets the time counter of the memory area 238 in a step S193. When the step S193 is ended, the processing goes to the step S195.
In the step S195, the CPU 82 determines whether or not the game is ended. Examples of the game end condition include that a predetermined time elapses since the game is started, that the maintenance of the target four-directional balance state is not achieved, and that the predetermined number of four-directional balance games is cleared. If “NO” in the step S195, the processing returns to the step S45 of
According to the embodiment, the quantity of load values necessary for the game processing is determined, and the necessary quantity of load values is computed from the load detection values of the four load sensors 14, so that the game processing can be performed using the various quantities of load values according to the game. Therefore, the novel play can be proposed with the load applied by the player.
When the player rotates the waist on the game controller 10, the detected load value is changed according to the rotation. Accordingly, in the game processing, the waist rotation of the player is determined by the load value. The four load detection values are directly computed as the four load computation values to determine the waist rotation. The four load computation values are compared to one another, and the game processing is performed based on the load value having the determined maximum value. Specifically, the waist of the player character is moved toward the maximum load detection value. That is, the waist of the player character is moved toward the left front direction in the case where the upper-left load sensor 14a has the maximum load detection value, the waist of the player character is moved toward the left rear direction in the case where the lower-left load sensor 14b has the maximum load detection value, the waist of the player character is moved toward the right front direction in the case where the upper-right load sensor 14c has the maximum load detection value, and the waist of the player character is moved toward the right rear direction in the case where the lower-right load sensor 14d has the maximum load detection value. The history of the waist position is recorded. It is determined whether or not the waist movement indicates the rotation in a constant direction. When it is determined that the waist is rotated in the constant direction, the hoop can be rotated.
A flag N for recording the histories of the waist positions of the player and player character is stored in a memory area 242. It is determined whether or not the change in waist position recorded in the flag N indicates the rotation in a constant direction. The variable N of the flag N has an initial value of 1, and the variable N is incremented every load obtaining timing. In the embodiment, the variable N is up to 4, namely, it is determined whether or not the waist is rotated in the constant direction during four load obtaining timings. For example, numerical values of 1 to 4 are allocated to the load sensors 14 in the arrangement order of the four load sensors 14 respectively, and the allocated values are recorded in the flag N. In the embodiment, the numerical values called movement numbers of 1 to 4 are allocated in the clockwise direction of the upper left, upper right, lower right, and lower left. When the movement numbers recorded in the flags 1 to 4 in sequence are changed in the ascending (or descending) order, it is determined that the waist is rotated in the constant direction, namely, the hoop is successfully rotated. A rotating number counter is stored in a memory area 244. Therefore, the number of successful rotations of the hoop is counted.
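The waist-position and rotation determination described above may be sketched as follows; the dictionary keys are introduced only for the illustration, and the ascending/descending check is one simple reading of the condition in the text.

# Movement numbers allocated clockwise: 1 upper left, 2 upper right, 3 lower right, 4 lower left
SENSOR_ORDER = ('upper_left', 'upper_right', 'lower_right', 'lower_left')

def waist_movement_number(loads):
    """loads: the four load computation values; the maximum value decides the waist position."""
    return SENSOR_ORDER.index(max(loads, key=loads.get)) + 1

def is_constant_rotation(flags):
    """flags: the movement numbers recorded at the flags 1 to 4 over four load obtaining timings."""
    ascending  = all(b > a for a, b in zip(flags, flags[1:]))
    descending = all(b < a for a, b in zip(flags, flags[1:]))
    return ascending or descending            # the waist is rotated in the constant direction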
Because the four load values are required in the game processing of the hoop game, the game processing is performed as the game processing 3 in the step S63 of
In steps S213 to S217, it is determined which load value indicates the maximum value, and the waist position is determined based on the load value having the determined maximum value.
Specifically, in a step S213, the CPU 82 determines whether or not the upper-left load value is the maximum. If “YES” in the step S213, the CPU 82 moves the waist of the player character toward the left front direction in a step S219. The position in the left front direction is stored in the waist position memory area 240. In a step S221, the CPU 82 sets (stores) the movement number 1 indicating the left front direction to the flag N of the memory area 242.
On the other hand, if “NO” in the step S213, the CPU 82 determines whether or not the upper-right load value is the maximum in a step S215. If “YES” in the step S215, the CPU 82 moves the waist of the player character toward the right front direction in a step S223. The position in the right front direction is stored in the waist position memory area 240. In a step S225, the CPU 82 sets (stores) the movement number 2 indicating the right front direction to the flag N of the memory area 242.
If “NO” in the step S215, the CPU 82 determines whether or not the lower-right load value is the maximum in a step S217. If “YES” in the step S217, the CPU 82 moves the waist of the player character toward the right rear direction in a step S227. The position in the right rear direction is stored in the waist position memory area 240. In a step S229, the CPU 82 sets (stores) the movement number 3 indicating the right rear direction to the flag N of the memory area 242.
If “NO” in the step S217, namely, when the lower-left load value is the maximum, the CPU 82 moves the waist of the player character toward the left rear direction in a step S231. The position in the left rear direction is stored in the waist position memory area 240. In a step S233, the CPU 82 sets (stores) the movement number 4 indicating the left rear direction to the flag N of the memory area 242.
When the step S221, S225, S229, or S233 is ended, the processing goes to a step S235 of
If “NO” in the step S235, the CPU 82 increments the variable N in a step S237, and the processing goes to a step S249. In the step S249, the CPU 82 determines whether or not the game is ended. Examples of the game end condition include that a predetermined time elapses since the game is started and that the hoop is not successfully rotated. If “NO” in the step S249, the processing returns to the step S45 of
On the other hand, if “YES” in the step S235, the CPU 82 determines whether or not the four movement numbers set in the flags 1 to 4 of the memory area 242 are in the ascending (or descending) order in a step S239. That is, the CPU 82 determines whether or not the waist is rotated in the constant direction. If “YES” in the step S239, the CPU 82 performs hoop rotation processing in a step S241. Therefore, the hoop is controlled so as to be rotated around a torso of the player character. Because the hoop is required to rotate in the constant direction, it is determined to be “NO” in the step S239 when the orientation (ascending or descending order) of the change in the movement number is changed from the previous rotation. In a step S243, the CPU 82 increments the rotating number counter of the memory area 244.
If “NO” in the step S239, the CPU 82 performs hoop rotation failure processing in a step S245. Therefore, the hoop is controlled such that the rotation thereof is stopped.
When the step S243 or S245 is ended, the CPU 82 sets the variable N to the initial value of 1 for the purpose of the next rotation in a step S247. Then, the processing goes to the step S249.
If “YES” in the step S249, the CPU 82 performs the score processing based on the number of rotations stored in the memory area 244 in a step S251. The score is computed according to the number of successful rotations. In a step S253, the CPU 82 resets the rotating number counter of the memory area 244 and the flag N of the memory area 242 to end the game processing 3.
In the quiz game, similarly to the hoop game, the four load detection values are directly computed as the four load computation values. The four load computation values are compared to one another, and the game processing is performed based on the load value having the determined maximum value. Specifically, the answer corresponding to the maximum load detection value is selected, and it is determined whether or not the selected answer is correct. On the game controller 10, the player puts the player's weight in the direction corresponding to the answer that the player considers to be correct, or rides a leg on the portion corresponding to the answer. Therefore, the load value of the load sensor 14 corresponding to the answer is caused to become the maximum to reply to the question. Thus, the game controller 10 is used to perform the game manipulation using the load, which allows the game to be played by selecting the answer from a plurality of choices like the general-purpose game controller including the conventional cross key, stick, manipulation button, and the like.
Because the four load values are required in the game processing of the quiz game, the game processing is performed as the game processing 3 in the step S63 of
In steps S273 to S277, it is determined which load value indicates the maximum value, and the answer by the player is selected based on the load value having the determined maximum value.
Specifically, in a step S273, the CPU 82 determines whether or not the upper-left load value is the maximum. If “YES” in the step S273, the CPU 82 selects the answer 1 corresponding to the upper-left load sensor 14a in a step S279. The identification information indicating the answer 1 is stored in the answer memory area 252.
On the other hand, if “NO” in the step S273, the CPU 82 determines whether or not the upper-right load value is the maximum in a step S275. If “YES” in the step S275, the CPU 82 selects the answer 2 corresponding to the upper-right load sensor 14c in a step S281. The identification information indicating the answer 2 is stored in the answer memory area 252.
If “NO” in the step S275, the CPU 82 determines whether or not the lower-left load value is the maximum in a step S277. If “YES” in the step S277, the CPU 82 selects the answer 3 corresponding to the lower-left load sensor 14b in a step S283. The identification information indicating the answer 3 is stored in the answer memory area 252.
If “NO” in the step S277, that is, when the lower-right load value is the maximum, the CPU 82 selects the answer 4 corresponding to the lower-right load sensor 14d in a step S285. The identification information indicating the answer 4 is stored in the answer memory area 252.
When the step S279, S281, S283, or S285 is ended, the CPU 82 compares the selected answer to the correct answer based on the answer stored in the memory area 252 and the correct answer data stored in the memory area 250 in a step S287. In a step S289, the CPU 82 determines whether or not the selected answer is correct. If “YES” in the step S289, the CPU 82 performs correct answer processing in a step S291. For example, the player's score is computed by adding the point according to the question. On the other hand, if “NO” in the step S289, the CPU 82 performs incorrect answer processing in a step S293. For example, the player's score is computed by subtracting the point according to the question.
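A compact sketch of the answer selection and judgment of the quiz game (steps S273 to S293) follows; the keys of the load dictionary are assumptions for the illustration.

# Correspondence between the load sensors and the answers used in the steps S279 to S285
ANSWER_BY_SENSOR = {'upper_left': 1, 'upper_right': 2, 'lower_left': 3, 'lower_right': 4}

def select_and_judge(loads, correct_answer):
    """loads: the four load computation values; returns the selected answer and whether it is correct."""
    selected = ANSWER_BY_SENSOR[max(loads, key=loads.get)]   # the maximum load value selects the answer
    return selected, selected == correct_answer              # comparison of the steps S287 and S289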
When the step S291 or S293 is ended, the CPU 82 determines whether or not the game is ended in a step S295. Examples of the game end condition include that the predetermined number of questions have been asked, that the predetermined number of correct or incorrect answers is obtained, and that the time-limit elapses. If “NO” in the step S295, the processing returns to the step S45 of
Because the four load values are required in the game processing of the ski game, the game processing is performed as the game processing 3 in the step S63 of
In a step S313, the CPU 82 determines whether or not the upper load value is larger than the lower load value based on the load computation value of the memory area 232. If “YES” in the step S313, the CPU 82 computes the acceleration of the player character based on the upper load value in a step S315. The computed acceleration is stored in the memory area 260. In a step S317, the CPU 82 controls the movement speed of the player character based on the computed acceleration of the memory area 260. The current movement speed is computed based on the previous movement speed stored in the memory area 262 and the acceleration, and the current movement speed is stored in the memory area 262.
On the other hand, if “NO” in the step S313, the CPU 82 computes the deceleration of the player character based on the lower load value in a step S319. The computed deceleration is stored in the memory area 260. In a step S321, the CPU 82 controls the movement speed of the player character based on the computed deceleration of the memory area 260. The current movement speed is computed based on the previous movement speed stored in the memory area 262 and the deceleration, and the current movement speed is stored in the memory area 262.
When the step S317 or S321 is ended, in a step S323, the CPU 82 determines whether or not the right load value is larger than the left load value based on the load computation value of the memory area 232. If “YES” in the step S323, the CPU 82 turns the player character in the right direction based on the right load value in a step S325. The turning movement is controlled based on the movement speed of the memory area 262. A turning radius may be computed based on the right load value. The current position of the player character is computed based on the previous position stored in the character position memory area 264, the movement speed, and the turning radius in the right direction, etc.
On the other hand, if “NO” in the step S323, the CPU 82 turns the player character in the left direction based on the left load value in a step S327. The turning movement is controlled based on the movement speed of the memory area 262. The turning radius may be computed based on the left load value. The current position of the player character is computed based on the previous position stored in the character position memory area 264, the movement speed, and the turning radius in the left direction, etc.
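The speed and turning control of the ski game (steps S313 to S327) may be sketched as follows; the gain constants are assumptions, and the turning radius computation is omitted.

def ski_step(upper, lower, left, right, speed, accel_gain=0.01, decel_gain=0.01):
    """One load obtaining timing of the ski game; returns the new movement speed and the turning direction."""
    if upper > lower:
        speed += accel_gain * upper      # step S315: acceleration computed from the upper load value
    else:
        speed -= decel_gain * lower      # step S319: deceleration computed from the lower load value
    speed = max(speed, 0.0)
    turn = 'right' if right > left else 'left'   # steps S323 to S327: the larger side decides the turn
    return speed, turn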
When the step S325 or S327 is ended, in a step S329, the CPU 82 determines whether or not the player character reaches a goal. Specifically, the CPU 82 determines whether or not the position of the player character moved in the step S325 or S327 becomes a position within a region indicating a predetermined goal point previously stored. If “NO” in the step S329, the processing returns to the step S45 of
Because the four load values are required in the game processing of the moving game, the game processing is performed as the game processing 3 in the step S63 of
In steps S353 to S357, it is determined which load computation value is the maximum, and the moving direction and movement amount of the player character are controlled based on the load value having the determined maximum value.
Specifically, in a step S353, the CPU 82 determines whether or not the upper load value is the maximum. If “YES” in the step S353, the CPU 82 computes the movement amount of the player character based on the upper load value in a step S359. For example, the movement amount of the player character is increased with increasing load value. The movement amount computed based on the load computation value is stored in the memory area 270. In a step S361, the CPU 82 moves the player character in the upward direction according to the computed movement amount. The current position of the player character is computed based on the previous position stored in the character position memory area 272 and the upward movement amount stored in the memory area 270.
On the other hand, if “NO” in the step S353, the CPU 82 determines whether or not the lower load value is the maximum in a step S355. If “YES” in the step S355, the CPU 82 computes the movement amount of the player character based on the lower load value in a step S363. In a step S365, the CPU 82 moves the player character in the downward direction according to the computed movement amount of the memory area 270. The current position of the player character is computed based on the previous position stored in the character position memory area 272 and the downward movement amount stored in the memory area 270.
If “NO” in the step S355, the CPU 82 determines whether or not the right load value is the maximum in a step S357. If “YES” in the step S357, the CPU 82 computes the movement amount of the player character based on the right load value in a step S367. In a step S369, the CPU 82 moves the player character in the right direction according to the computed movement amount of the memory area 270. The current position of the player character is computed based on the previous position stored in the character position memory area 272 and the rightward movement amount stored in the memory area 270.
If “NO” in the step S357, that is, when the left load value is the maximum, the CPU 82 computes the movement amount of the player character based on the left load value in a step S371. In a step S373, the CPU 82 moves the player character in the left direction according to the computed movement amount of the memory area 270. The current position of the player character is computed based on the previous position stored in the character position memory area 272 and the leftward movement amount stored in the memory area 270.
When the step S361, S365, S369, or S373 is ended, the CPU 82 determines whether or not the game is ended in a step S375. Examples of the game end condition include that the character position enters a predetermined region and that a predetermined time-limit elapses. If “NO” in the step S375, the processing returns to the step S45 of
In the moving game, the movement target is the player character (player object). Alternatively, the moving processing in the moving game can also be applied to the movement of a cursor or a pointer and the movement of a viewpoint or a point of gaze of a virtual camera or the like.
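The moving processing described above (steps S353 to S373), which applies equally to a cursor or a viewpoint, may be sketched as follows; screen-style coordinates with the Y axis growing downward and the gain constant are assumptions.

MOVE_DIRECTION = {'upper': (0, -1), 'lower': (0, 1), 'right': (1, 0), 'left': (-1, 0)}

def move_step(loads, position, gain=0.05):
    """loads: the four load computation values keyed 'upper', 'lower', 'right', 'left'."""
    key = max(loads, key=loads.get)        # the maximum load value decides the moving direction
    amount = gain * loads[key]             # the movement amount increases with the load value
    dx, dy = MOVE_DIRECTION[key]
    return (position[0] + dx * amount, position[1] + dy * amount)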
In each of the above-described embodiments, the necessary quantity of load values is kept constant in each game. However, in another embodiment, the necessary quantity of load values may be changed within one game according to the status or the scene or the like. In this embodiment, the necessary quantity of load values is determined according to the status or scene of the game, and the game processing is performed based on the necessary quantity of load values. Various quantities of load values can be computed according to the status or scene of the game to perform various game manipulations.
As shown in
In a step S393, the CPU 82 performs correction value computing processing. The correction value computing processing is similar to the step S41 of
The pieces of processing of steps S395, S397, S399, and S401 are similar to those of steps S45, S47, S49, and S51 of
In steps S403 and S405, the necessary quantity of load values is determined according to the scene and status of the game. The determination is made based on the scene flag of the memory area 286.
Specifically, in the step S403, the CPU 82 determines whether or not the scene is the field based on the scene flag. If “YES” in the step S403, namely, when it is determined that the four load values are required, the CPU 82 performs the moving processing based on the four values in a step S407. The four values mean the four load computation values.
On the other hand, if “NO” in the step S403, the CPU 82 determines whether or not the scene is the battle scene based on the scene flag in a step S405. If “YES” in the step S405, namely, when it is determined that the two load values are required, the CPU 82 performs the battle processing based on the two values in a step S409. The two values mean the two load computation values.
If “NO” in the step S405, namely, in the case of scenes other than the field and the battle, the CPU 82 performs other pieces of processing in a step S411.
When the step S407, S409, or S411 is ended, the CPU 82 determines whether or not the game is ended in a step S413. Examples of the game end condition include that the player character loses the battle in the battle processing and that the predetermined number of enemy characters are struck down. If “NO” in the step S413, the processing returns to the step S395. Accordingly, the action game in which the necessary quantity of load values is changed according to the scene and status is continued. On the other hand, if “YES” in the step S413, the game processing is ended.
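A minimal sketch of the scene-dependent branch of the steps S403 to S411 is given below; the scene-flag constants and the placeholder functions are hypothetical names introduced only for this sketch, not names from the embodiment.

FIELD, BATTLE = "field", "battle"   # hypothetical encodings of the scene flag (memory area 286)

def moving_processing_four_values(loads):
    """Placeholder for the moving processing of the step S407."""

def battle_processing_two_values(loads):
    """Placeholder for the battle processing of the step S409."""

def other_processing(loads):
    """Placeholder for the other pieces of processing of the step S411."""

def action_game_frame(scene_flag, loads):
    # Steps S403 and S405: the scene flag decides the necessary quantity of load values.
    if scene_flag == FIELD:
        return moving_processing_four_values(loads)   # four load computation values
    if scene_flag == BATTLE:
        return battle_processing_two_values(loads)    # two load computation values
    return other_processing(loads)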
When the step S441, S445, S449, or S453 is ended, the CPU 82 determines whether or not the player character encounters the enemy character in a step S455. Specifically, the CPU 82 determines whether or not the position of the player character of the memory area 282 and the position of the enemy character of the memory area 284 are brought close to each other within the predetermined range. The movement of the enemy character is controlled by the program, and the computed position of the enemy character is stored in the memory area 284.
If “YES” in the step S455, the CPU 82 sets the scene flag to the battle by storing the value indicating the battle in the scene flag memory area 286 in a step S457. On the other hand, if “NO” in the step S455, the moving processing based on the four values is ended, and the processing goes to the step S413 of
In a step S473, the CPU 82 determines whether or not the right load value is larger than the left load value. If “YES” in the step S473, the CPU 82 performs motion processing in a step S475 in order that the player character attacks against the enemy character with the sword in the player character's right hand. For example, the motion processing is performed based on previously-stored motion data with which the player character wields the sword in the right hand. On the other hand, if “NO” in the step S473, the CPU 82 performs motion processing in a step S477 in order that the player character protects against the enemy character with the shield in the left hand. The motion processing is also performed based on previously-stored motion data with which the player character puts the shield in the left hand forward.
When step S475 or S477 is ended, the CPU 82 performs other pieces of processing in a step S479. Examples of other pieces of processing include enemy character attack processing and defense processing, and other pieces of processing are performed according to the program.
In a step S481, the CPU 82 performs HP subtraction processing to the player character and enemy character based on the attack processing or the defense processing. For example, when it is determined that one character hits the other character, the HP of the other character is subtracted by a predetermined value in the case where the other character does not protect against the one character, and the HP of the other character is not subtracted in the case where the other character protects against the one character. The computed HPs of the player character and enemy character are stored in the memory area 288.
In a step S483, the CPU 82 determines whether or not the battle is ended. For example, the CPU 82 determines that the battle is ended when one of the HPs becomes zero. When the HP of the player character becomes zero, it is determined that the player character loses the battle. When the HP of the enemy character becomes zero, it is determined that the player character wins the battle. If “YES” in the step S483, the CPU 82 sets the scene flag to the field by storing the value indicating the field in the scene flag memory area 286 in a step S485. On the other hand, if “NO” in the step S483, the battle processing based on the two values is directly ended (battle scene is continued), and the processing returns to the step S413 of
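As one way to picture the two-value battle processing of the steps S473 to S483, the sketch below compares the right and left load values and performs a simplified HP subtraction; the hit determination described above is reduced to an always-hit assumption, and the damage value is an arbitrary illustrative number.

def battle_step(right_load, left_load, player_hp, enemy_hp, damage=10):
    # Step S473: compare the right load value with the left load value.
    if right_load > left_load:
        # Step S475: attack motion with the sword in the right hand.
        enemy_hp = max(0, enemy_hp - damage)     # simplified: the attack always hits
        defending = False
    else:
        # Step S477: defensive motion with the shield in the left hand.
        defending = True
    # Steps S479 and S481: the enemy also attacks; no HP is subtracted while defending.
    if not defending:
        player_hp = max(0, player_hp - damage)
    # Step S483: the battle is ended when either HP becomes zero.
    battle_ended = player_hp == 0 or enemy_hp == 0
    return player_hp, enemy_hp, battle_ended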
In the embodiment of the action game, the necessary quantity of load values is determined according to the scene of the game, and the moving processing and battle processing of the player character are performed based on the necessary quantity of load values. However, in another embodiment, the game processing may be performed while the barycentric position and the necessary quantity of load values are used according to the game scene. In the following embodiment of the role-playing game, the movement of the player character is controlled based on the barycentric position, and the battle of the player character is controlled based on the necessary quantity of load values.
XG=((c+d)−(a+b))×m [Formula 1]
YG=((a+c)−(b+d))×n [Formula 2]
Here, m and n are constants. XY is the screen coordinate system, the origin (0, 0) is set at the center of the screen, and −1≦X≦1 and −1≦Y≦1.
Thus, the XG is computed based on the difference between the right load value and the left load value, and the YG is computed based on the difference between the upper load value and the lower load value.
The moving direction and movement speed of the player character are controlled based on the coordinate of the barycenter. Assuming that the origin of the screen coordinate system is the position of the player character and the center of the board 12 of the game controller 10, the distance to the barycentric position and the orientation from the origin toward the barycentric position are used in the movement control. Specifically, a vector V connecting the center (0, 0) of the screen and the barycenter (XG, YG) is computed, and the movement speed of the player character is computed based on the magnitude of the vector V. The player character is moved at the computed movement speed in the orientation of the vector V. In the case of the two-dimensional virtual game space, the player character may be moved at the computed movement speed in the orientation of the vector V computed using the screen coordinate system. On the other hand, in the case of the three-dimensional virtual game space, for example, the screen coordinate system is considered to be a plane coordinate system in a three-dimensional coordinate system of the game space, and the player character may be moved on the plane at the computed movement speed in the orientation of the vector V. From the viewpoint of display, as shown in
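Putting Formula 1, Formula 2, and the vector-based movement control together, a minimal sketch might look like the following; the sensor labels a to d follow the formulas above, while the default constants m and n, the speed scaling, and the function names are assumptions of the sketch only.

import math

def barycenter(a, b, c, d, m=1.0, n=1.0):
    # Formula 1 and Formula 2: barycentric coordinate in the screen coordinate system.
    xg = ((c + d) - (a + b)) * m     # right load minus left load
    yg = ((a + c) - (b + d)) * n     # upper load minus lower load
    return xg, yg

def move_by_barycenter(position, a, b, c, d, speed_gain=1.0):
    # Vector V from the origin (center of the screen and of the board 12) to the barycenter.
    xg, yg = barycenter(a, b, c, d)
    length = math.hypot(xg, yg)
    if length == 0.0:
        return position                      # barycenter at the center: no movement
    speed = length * speed_gain              # movement speed grows with |V| (scaling assumed)
    ux, uy = xg / length, yg / length        # orientation of the vector V
    x, y = position
    return (x + ux * speed, y + uy * speed)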
If “YES” in the step S403, namely, in the case of the field scene, the CPU 82 performs the moving processing based on the barycenter in a step S501.
In a step S513, the CPU 82 computes the difference (referred to as the horizontal difference) between the right load value and the left load value. In a step S515, the CPU 82 computes the difference (referred to as the vertical difference) between the upper load value and the lower load value. In a step S517, the CPU 82 computes the X and Y coordinates of the barycentric position with respect to the center position (origin) based on the horizontal difference and the vertical difference. The computation is performed according to the above-mentioned Formula 1 and Formula 2. The computed barycentric coordinate is stored in the memory area 300.
In a step S519, the CPU 82 computes the vector connecting the origin and the barycenter, and the CPU 82 stores the vector in the memory area 302. In a step S521, the CPU 82 computes the movement speed of the player character based on the magnitude (length) of the vector and stores the movement speed in the memory area 304. In a step S523, the CPU 82 performs the moving processing for moving the player character at the computed movement speed in the orientation of the vector. The position of the player character is computed based on the orientation of the vector, the movement speed, and the previous position, and is stored in the memory area 282.
In a step S525, the CPU 82 determines whether or not the player character encounters the enemy character. The movement of the enemy character is controlled by the program, and the computed position of the enemy character is stored in the memory area 284. Accordingly, it is determined whether the positions of the player character stored in the memory area 282 and enemy character stored in the memory area 284 are brought close to each other within the predetermined range. If “YES” in the step S525, the CPU 82 sets the scene flag to the battle by storing the value indicating the battle in the scene flag memory area 286 in a step S527. When the step S527 is ended, or if “NO” in the step S525, the moving processing based on the barycenter is ended, and the processing returns to the step S413 of
In a step S543, the CPU 82 computes the upper load value, the lower load value, the right load value, and the left load value based on the corrected load detection value of the memory area 228 and stores the load values in the load computation value memory area 232.
In steps S545 to S549, it is determined which load computation value indicates the maximum value. In a step S545, the CPU 82 determines whether or not the upper load value is the maximum. If “YES” in the step S545, namely, when the CPU 82 determines that the command of “fight” corresponding to the upward direction is selected, the CPU 82 performs the attack processing with the weapon in a step S551. Therefore, the player character attacks against the enemy character.
On the other hand, if “NO” in the step S545, the CPU 82 determines whether or not the lower load value is the maximum in a step S547. If “YES” in the step S547, namely, when the CPU 82 determines that the command of “protect” corresponding to the downward direction is selected, the CPU 82 performs the defense processing in a step S553. Therefore, the player character takes a defensive pose to protect against the attack of the enemy character.
If “NO” in the step S547, the CPU 82 determines whether or not the right load value is the maximum in a step S549. If “YES” in the step S549, namely, when the CPU 82 determines that the command of “magic” corresponding to the right direction is selected, the CPU 82 performs the magic working processing in a step S555. Therefore, the player character damages the enemy character with the magic.
If “NO” in the step S549, namely, when the CPU 82 determines that the command of “escape” corresponding to the left direction is selected while the left load value is the maximum, the CPU 82 performs the processing for escaping from the battle scene in a step S557. For example, the player character tries to escape from the battle. The battle is ended in the case where the player character escapes successfully from the battle, and the battle is continued in the case where the player character fails to escape from the battle. The success or failure of the escape may be determined by the difference between the HPs, a random number, or the like.
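The command selection of the steps S545 to S557 simply maps the direction of the maximum load value to a battle command; a hedged sketch follows, in which the string labels are illustrative and not part of the embodiment.

def select_battle_command(up, down, right, left):
    # Steps S545 to S549: the direction whose load value is the maximum selects the command.
    loads = {
        "fight": up,       # step S551: attack with the weapon
        "protect": down,   # step S553: defense
        "magic": right,    # step S555: magic working processing
        "escape": left,    # step S557: escape from the battle scene
    }
    return max(loads, key=loads.get)

For example, select_battle_command(12.0, 3.0, 5.0, 4.0) would return "fight", because the upper load value is the largest of the four.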
When the step S551, S553, S555, or S557 is ended, the CPU 82 performs other pieces of processing in a step S559. Specifically, examples of other pieces of processing include enemy character attack processing, defense processing, and magic processing.
In a step S561, the CPU 82 performs the HP subtraction processing to the player character and enemy character based on the attack processing, the defense processing, and the magic processing. The HP is subtracted by the predetermined value according to the attack and magic of the opponent. In the case where the opponent protects against the attack, the HP of the opponent is subtracted by a predetermined value that is smaller than in the case where the opponent does not protect against the attack. The computed HPs of the player character and enemy character are stored in the memory area 288.
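The HP subtraction of the step S561 can be pictured as follows; the two damage values are arbitrary illustrative numbers, chosen only to show that protecting reduces, but does not eliminate, the subtraction.

def subtract_hp(hp, protecting, full_damage=10, reduced_damage=3):
    # Step S561: the opponent's attack or magic subtracts a predetermined value from the HP;
    # the subtracted value is smaller when the character protects against the attack.
    return max(0, hp - (reduced_damage if protecting else full_damage))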
In a step S563, the CPU 82 determines whether or not the battle is ended. Examples of the battle end condition include that the HP of the player character or enemy character becomes zero and that the player character escapes successfully from the battle. In the case where the player character loses the battle, it is determined that the game is ended in the step S413 of
In the role-playing game, the movement target is the player character. Alternatively, the moving processing can also be applied to the movement of the cursor or pointer and the movement of the viewpoint or point of gaze of the virtual camera and the like.
In each of the above-described embodiments, the game machine 52 obtains the four load detection values from the game controller 10 and computes the necessary quantity of load computation values. However, in another embodiment, the game machine 52 informs the game controller 10 of the necessary quantity of load computation values, and the game controller 10 may compute the necessary quantity of load computation values in response to the notification and transmit the necessary quantity of load computation values to the game machine 52.
Specifically, the game machine 52 determines the necessary quantity of load values at the load obtaining timing, and the game machine 52 transmits the load obtaining command for obtaining the necessary quantity of load values to the game controller 10. The game controller 10 that receives the command determines the type of the command to compute the necessary quantity of load values according to the command, and the game controller 10 transmits the computed load values to the game machine 52. In the game machine 52, the game processing is performed based on the received necessary quantity of load values.
In the case where it is the load obtaining timing in the step S45, the necessary quantity of load values is determined in the steps S53 to S57. Specifically, in the step S53, the CPU 82 determines whether or not the necessary quantity of load values is one. If “YES” in the step S53, namely, in the case of the total load game, the CPU 82 transmits a total load command to the game controller 10 in a step S581. The total load command is used to obtain the summation (total load value) of the four load detection values. The transmission to the game controller 10 is performed through the remote control 54; that is, the total load command is transmitted from the game machine 52 to the remote control 54 and then from the remote control 54 to the game controller 10. In response to the total load command, the game controller 10 detects the four load values and computes the summation of the four load detection values. The summation is transmitted as the input data through the remote control 54 and received by the wireless controller module 94 of the game machine 52. Accordingly, in a step S583, the CPU 82 obtains the summation from the game controller 10 through the input data buffer 220. In the step S59, the CPU 82 performs the game processing 1 based on the summation. Because the summation is received from the game controller 10, the load computing processing (step S101 of
On the other hand, if “NO” in the step S53, the CPU 82 determines whether or not the necessary quantity of load values is two in the step S55. If “YES” in the step S55, namely, in the case of the right and left balance game, the CPU 82 transmits a horizontal load command to the game controller 10 in a step S585. The horizontal load command is used to obtain the right and left load values. In response to the horizontal load command, the game controller 10 detects the four load values and computes the right and left load values. The right and left load values are transmitted to the game machine 52. Accordingly, in a step S587, the CPU 82 obtains the right and left load values from the game controller 10 through the input data buffer 220. In the step S61, the CPU 82 performs the game processing 2 based on the right and left load values. The load computing processing (steps S151 and S153 of
If “NO” in the step S55, the CPU 82 determines whether or not the necessary quantity of load values is four in the step S57. If “YES” in the step S57, namely, in the cases of the four-directional balance game, the hoop game, and the like, the CPU 82 transmits a four-directional load command to the game controller 10 in a step S589. The four-directional load command is used to obtain the four load detection values. In response to the four-directional load command, the game controller 10 detects the four load values and transmits the four load detection values to the game machine 52. Accordingly, in a step S591, the CPU 82 obtains the four load detection values from the game controller 10 through the input data buffer 220. In the step S63, the CPU 82 performs the game processing 3 based on the four load detection values. In the embodiment, in the case where the four load values are required, because the four load detection values are obtained from the game controller 10, the load computing processing (step S181 of
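On the game machine side, the branch of the steps S53 to S57 with the transmissions of the steps S581, S585, and S589 can be sketched as follows; the command identifiers and the two callables are hypothetical names introduced only for this sketch.

# Hypothetical command identifiers used only in this sketch.
TOTAL_LOAD_CMD = 1               # obtain the summation of the four load detection values
HORIZONTAL_LOAD_CMD = 2          # obtain the right and left load values
FOUR_DIRECTIONAL_LOAD_CMD = 3    # obtain the four load detection values as they are

def request_load_values(quantity, send_command, receive_input_data):
    # send_command: forwards a command to the game controller 10 via the remote control 54
    # receive_input_data: returns the input data answered by the game controller 10
    if quantity == 1:
        send_command(TOTAL_LOAD_CMD)             # steps S581 and S583: total load game
    elif quantity == 2:
        send_command(HORIZONTAL_LOAD_CMD)        # steps S585 and S587: right and left balance game
    else:
        send_command(FOUR_DIRECTIONAL_LOAD_CMD)  # steps S589 and S591: four-directional games
    return receive_input_data()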
In the case where it is the load obtaining timing in the step S395, the necessary quantity of load values is determined by determining the scene in the steps S403 to S405. Specifically, in the step S403, the CPU 82 determines whether or not the field is set in the scene flag of the memory area 286. If “YES” in the step S403, namely, in the case where the four load values are required, the CPU 82 transmits a four-directional load obtaining command to the game controller 10 in a step S601. In response to the four-directional load obtaining command, the game controller 10 transmits the four load detection values to the game machine 52, and the four load detection values are received by the wireless controller module 94. Accordingly, in a step S603, the CPU 82 obtains the four load detection values from the game controller 10 through the input data buffer 220. In the step S407, the CPU 82 performs the moving processing based on the four values. In the embodiment, because the four load detection values are obtained from the game controller 10 in the case where the four load values are required, the load computing processing (step S431 of
On the other hand, if “NO” in the step S403, the CPU 82 determines whether or not the battle is set in the scene flag of the memory area 286 in the step S405. If “YES” in the step S405, namely, in the case where the two load values are required, the CPU 82 transmits a horizontal load obtaining command to the game controller 10 in a step S605. In response to the horizontal load obtaining command, the game controller 10 transmits the right and left load values. Accordingly, in a step S607, the CPU 82 obtains the right and left load values from the game controller 10 through the input data buffer 220. In the step S409, the CPU 82 performs the battle processing based on the two values. Because the right and left load values are received, the load computing processing (step S471 of
After the load detection values are obtained from the four load sensors 14 in the step S5, the microcomputer 20 determines whether or not the command from the game machine 52 is the total load command in a step S621. If “YES” in the step S621, namely, in the case where one load value is required, the microcomputer 20 computes the summation of the four load detection values in a step S627. In a step S629, the microcomputer 20 transmits the computed summation to the game machine 52. The transmission to the game machine 52 is performed through the remote control 54; that is, the computed values are transmitted to the remote control 54 through the connector 24 and then transmitted as the input data from the remote control 54 to the game machine 52.
On the other hand, if “NO” in the step S621, the microcomputer 20 determines whether or not the command is the horizontal load command in a step S623. If “YES” in the step S623, namely, in the case where the two load values are required, the microcomputer 20 computes the left load value, i.e., the summation of the two load detection values on the left side, in a step S631. In a step S633, the microcomputer 20 computes the right load value, i.e., the summation of the two load detection values on the right side. In a step S635, the microcomputer 20 transmits the right load value data and the left load value data to the game machine 52 through the remote control 54.
If “NO” in the step S623, the microcomputer 20 determines whether or not the command is the four-directional load command in a step S625. If “YES” in the step S625, the microcomputer 20 transmits the four load detection values to the game machine 52 through the remote control 54 in a step S637. In another embodiment, the microcomputer 20 may compute the four load computation values, such as the upper load value, the lower load value, the left load value, and the right load value, from the four load detection values and transmit the four load computation value data to the game machine 52.
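The controller-side handling of the steps S621 to S637 then reduces to the dispatch below; the command identifiers match the sketch above, and the ordering of the four detection values is an assumption of the sketch rather than a fact of the embodiment.

TOTAL_LOAD_CMD = 1
HORIZONTAL_LOAD_CMD = 2
FOUR_DIRECTIONAL_LOAD_CMD = 3

def answer_load_command(command, detections):
    # detections is assumed here to be ordered (upper_left, lower_left, upper_right, lower_right).
    ul, ll, ur, lr = detections
    if command == TOTAL_LOAD_CMD:
        # Steps S627 and S629: transmit the summation of the four load detection values.
        return [ul + ll + ur + lr]
    if command == HORIZONTAL_LOAD_CMD:
        # Steps S631 to S635: transmit the left load value and the right load value.
        return [ul + ll, ur + lr]
    if command == FOUR_DIRECTIONAL_LOAD_CMD:
        # Step S637: transmit the four load detection values as they are.
        return list(detections)
    raise ValueError("unknown load obtaining command")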
Although not shown in
The processor 112 of the remote control 54 obtains the four load detection values from the game controller 10, and the processor 112 may compute the quantity of load computation values according to the command. The processor 112 of the remote control 54 may also perform the correction value computing processing and the load detection value correction processing.
In each of the above-described embodiments, when the necessary quantity of the load values is two, the four load sensors 14 are equally and horizontally divided into two sets to compute the summation of the two load detection values adjacent to each other on the left side and the summation of the two load detection values adjacent to each other on the right side. However, the patterns of the combinations of the plurality of load detection values may appropriately be changed when the load computation values are computed. For example, the two load computation values of the summation (upper load value) of the two load detection values adjacent to each other on the upper side and the summation (lower load value) of the two load detection values adjacent to each other on the lower side may be computed. The plurality of load detection values may unequally be combined. For example, the two load computation values of the one load detection value and the summation of the three load detection values may be computed, or the three load computation values of the summation of the two load detection values and each of the two remaining load detection values may be computed.
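For concreteness, a few of the grouping patterns mentioned above can be written out as follows; the sensor ordering is the same assumption as in the sketch above, and the function names are illustrative only.

def left_right_values(ul, ll, ur, lr):
    return [ul + ll, ur + lr]        # equal horizontal split: left load value and right load value

def upper_lower_values(ul, ll, ur, lr):
    return [ul + ur, ll + lr]        # equal vertical split: upper load value and lower load value

def one_versus_three_values(ul, ll, ur, lr):
    return [ul, ll + ur + lr]        # unequal split: one detection value and the summation of the other three

def three_values(ul, ll, ur, lr):
    return [ul + ll, ur, lr]         # three values: one summation and the two remaining detection values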
In each of the above-described embodiments, only the load sensors 14 are provided in the game controller 10 as the manipulation means or input means manipulated by the player. However, in another embodiment, as shown in
When the foot manipulation button 40 is pressed down by the player's leg, a manipulation signal corresponding to the foot manipulation button 40 is inputted to the microcomputer 20, and the manipulation data is transmitted to the game machine 52. Accordingly, the game processing is performed based on the foot manipulation data.
The functions allocated to the foot manipulation buttons 40 are appropriately set. For example, the two foot manipulation buttons 40 may be the start button for starting the game and the select button for selecting the game. Alternatively, the two foot manipulation buttons 40 may be the A button for providing the instructions such as the predetermined motion and determination and the B button for providing the instructions such as the predetermined motion and cancellation. According to the game controller 10, the player can perform the game manipulation by the way the load is applied, and the player can also press down the foot manipulation button 40 to manipulate the game using the function allocated to the foot manipulation button 40.
In each of the above-described embodiments, the game controller 10 conducts communication with the game machine 52 through the remote control 54. The remote control 54 is a different type of game controller, and the remote control 54 can wirelessly conduct communication with the game machine 52. However, in another embodiment, the game controller 10 may directly conduct communication with the game machine 52.
In the embodiment shown in
In the embodiment shown in
In each of the above-described embodiments, the board 12 (support plate 16) of the game controller 10 is formed in the square or rectangular shape when viewed from above. However, the shape of the board 12 (support plate 16) can appropriately be changed. For example, as shown in
In each of the above-described embodiments, the four load sensors 14 are arranged in the peripheral portion of the board 12 (support plate 16). In another embodiment, the four load sensors 14 may be arranged inside the peripheral portion as shown in
In each of the above-described embodiments, the intermediate-layer plate 16c included in the support plate 16 is supported by the four load sensors 14. However, in another embodiment, the intermediate-layer plate 16c may be divided into four as shown in
In each of the above-described embodiments, the one game controller 10 is connected to the game machine 52. However, in another embodiment, the game system 50 may be configured such that the plurality of game controllers 10 are directly connected to the game machine 52 or such that the plurality of game controllers 10 are connected to the game machine 52 through the plurality of remote controls 54, a junction device, or the network. In such cases, the game for the plurality of players can be performed.
In each of the above-described embodiments, the four load sensors 14 are provided in the game controller 10. However, in another embodiment, more than four load sensors 14 may be provided in the game controller 10. In such cases, a greater variety of quantities of load computation values can be computed to perform the game processing.
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.