The disclosure of Japanese Patent Application No. 2007-261798 is incorporated herein by reference.
1. Field of the Invention
The invention relates to a storage medium storing a load detecting program and a load detecting apparatus. More specifically, the present invention relates to a storage medium storing a load detecting program and a load detecting apparatus which perform processing by detecting load values imposed on a support plate on which a foot of a player is put.
2. Description of the Related Art
Conventionally, a load detecting apparatus equipped with a sensor for detecting a load of a subject is known in the field of medical equipment, for the purpose of exercise such as rehabilitation.
For example, in Patent Document 1 (Japanese Patent Application Laid-Open No. 62-34016 [G01G 19/00, A61B 5/10, A61H 1/00, G01G 23/37]), a variable load display apparatus provided with two load sensors is disclosed. In this apparatus, the right and left feet are put on the respective load sensors, one foot on each. From the displayed load values detected by the two load sensors, a balance between the right and left feet is measured.
Furthermore, in Patent Document 2 (Japanese Patent Application Laid-Open No. 7-275307 [A61H 1/02, A61B 5/11, A63B 23/04]), a center-of-gravity shift training apparatus with three load detecting means is disclosed. In this apparatus, both feet are put on a detection plate provided with the three load detecting means. By an arithmetic operation on the signals detected from the three load detecting means, a position of the center of gravity is calculated and displayed, whereby training for shifting the center of gravity is performed.
However, in the above-described Patent Documents 1 and 2, although changes of the load in a state in which the foot of the subject is put on the detection plate provided with the load detecting means (the balance between right and left and the shift of the center of gravity) can be measured, there is a problem in that it is difficult to accurately determine a motion of putting the foot onto and down from the detection plate, as in a step-up-and-down exercise. For example, consider a case in which a subject with one foot on the detection plate is made to put that foot down from the detection plate onto the ground according to an instruction on the screen. At this time, if the change of the load put on the plate is measured at the moment when the subject takes the foot off the detection plate, the load value becomes approximately 0. However, at this point the subject has merely taken the foot off the detection plate and has not yet finished the motion of putting the foot on the ground; if the motion is determined here, a time lag occurs between the timing when the motion is actually performed and the timing when the determination of the motion is made in the apparatus. Thus, it becomes impossible to accurately determine the motion of the subject, which gives the subject a strong uncomfortable feeling and an unnatural impression. Since the above-described Patent Documents 1 and 2 only disclose the measurement of changes of the load with both feet put on the plate, it is impossible to detect the timing when the foot is put on the ground, so that such problems cannot be solved.
Therefore, it is a primary object of the present invention to provide a novel storage medium storing a load detecting program and a load detecting apparatus.
Another object of the present invention is to provide a storage medium storing a load detecting program and a load detecting apparatus capable of accurately determining the timing of a motion by a player, such as putting a foot onto and down from the stepping board.
The present invention employs the following features in order to solve the above-described problems. It should be noted that the reference numerals inside the parentheses and the supplements show one example of a corresponding relationship with the embodiments described later for easy understanding of the present invention, and do not limit the present invention.
A first invention is a storage medium storing a load detecting program to be executed by a computer of a load detecting apparatus having a support board allowing a player to put the feet on. The load detecting program causes the computer to execute a load value detecting step, a motion instructing step, an elapsed time counting step, a judgment timing deciding step, and a first motion determining step. The load value detecting step detects a load value put on the support board. The motion instructing step instructs the player to perform a first motion. The elapsed time counting step counts an elapsed time from when the motion instructing step gives an instruction of the first motion. The judgment timing deciding step decides whether or not a first motion judgment timing has come on the basis of the elapsed time. The first motion determining step determines whether or not the first motion is performed on the basis of the load value detected by the load value detecting step when the judgment timing deciding step decides the judgment timing has come.
In the first invention, the load detecting program is executed in a computer (40, 42) of a load detecting apparatus (10, 12), and causes the load detecting apparatus to function as an apparatus for determining a motion of a player (user) on the basis of the detected load values, for example. The load detecting apparatus has a support board (36) for allowing the player to put the feet on, and the support board has a load sensor (36b), for example. A load value detecting step (S35, S75, S115, S163) detects a load value put on the support board. A load value depending on the motion made by the player with respect to the support board is detected. A motion instructing step (S151) instructs the player to perform a first motion. For example, the instruction of the motion may be performed by displaying panels (400) representing the motion on a screen, and a suitable timing may be shown by the timing of the movement or stop of the panels. An elapsed time counting step (S153, S155) counts an elapsed time (T4) from when the instruction of the first motion is given. A judgment timing deciding step (S159, S161) decides whether or not the first motion judgment timing has come on the basis of the elapsed time. That is, it is decided whether or not a time suitable for determining the execution of the first motion has elapsed since the instruction. A first motion determining step (S165) determines whether or not the first motion is performed on the basis of the detected load value when it is decided that the judgment timing has come.
According to the first invention, since the judgment timing is decided by the elapsed time from the start of the instruction of the motion, it is possible to determine the execution of the motion by properly deciding the judgment timing of the motion by the player.
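As one way to picture the flow of these steps, the following minimal sketch is given for this description only; the function names, the frame-based time step, and the return convention are assumptions made for illustration and are not part of the claimed program.

    # Minimal per-frame sketch of the first invention's steps. read_load_value()
    # and motion_is_performed() are hypothetical stand-ins for the load value
    # detecting step and the first motion determining step.
    FRAME = 1.0 / 60.0   # in the embodiment, time is counted frame by frame

    def instruct_player(message):
        print(message)                                  # motion instructing step

    def run_first_motion_judgment(read_load_value, motion_is_performed,
                                  judgment_time, time_limit):
        instruct_player("Perform the first motion")
        elapsed = 0.0                                   # elapsed time counting step
        while elapsed <= time_limit:
            load = read_load_value()                    # load value detecting step
            if elapsed >= judgment_time:                # judgment timing deciding step
                return motion_is_performed(load)        # first motion determining step
            elapsed += FRAME
        return False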
A second invention is a storage medium storing a load detecting program according to the first invention, and the load detecting program causes the computer to further execute a load determining step for determining whether or not the load value detected by the load value detecting step becomes a predetermined state. The motion instructing step instructs the player to perform a second motion, the elapsed time counting step counts an elapsed time from when the instruction of the second motion is given, and the judgment timing deciding step decides that the first motion judgment timing has come when the elapsed time from the instruction of the first motion reaches the elapsed time from when the instruction of the second motion is given to when the load determining step determines that the load value becomes the predetermined state.
In the second invention, a motion instructing step (S31, S71, S111) gives an instruction of a second motion, and an elapsed time counting step (S33, S49, S73, S89, S113, S129) counts an elapsed time from when the instruction of the second motion is given. A load determining step (S37, S41, S77, S81, S117, S121) determines whether or not the detected load value becomes a predetermined state. The predetermined state is a state that a condition for determining that the second motion is executed is satisfied, for example. That is, a judgment condition of a ratio of a load value to a body weight value, and a judgment condition in relation to a position of the center of gravity are decided in advance, and judgment of the condition is performed on the basis of the detected load value. The judgment timing deciding step decides that the first motion judgment timing has come when the elapsed time (T4) from the instruction of the first motion reaches the elapsed time (T3) from when the instruction of the second motion is given to when the load determining step determines that the load value becomes the predetermined state. The judgment timing of the first motion can be decided on the basis of the second motion judgment timing, so that it is possible to make a suitable judgment with simple processing.
A third invention is a storage medium storing a load detecting program according to the second invention, and the judgment timing deciding step decides that the first motion judgment timing has come, in a case that the load determining step does not determine that the load value becomes the predetermined state after the instruction of the second motion, when the elapsed time from the instruction of the first motion becomes a predetermined time.
In the third invention, in the judgment timing deciding step, in a case that the load determining step does not determine that the load value becomes the predetermined state from the instruction of the second motion, that is, it is not determined that the second motion is performed by the player, when the elapsed time (T4) from the instruction of the first motion becomes a predetermined time (PS), it is decided that the first motion judgment timing has come. For example, the predetermined time is set to a value suitable for performing the first motion. Even if determination of the second motion is not performed, it is possible to decide the first motion judgment timing on the basis of the suitable timing set in advance.
A fourth invention is a storage medium storing a load detecting program according to the second invention, and the first motion is a motion, from a state in which one foot of the player is put on the support board and the other foot is put on the ground, of putting the one foot down from the support board, and the second motion is a motion of putting only one foot down from the support board from a state in which the player rides on the support board.
In the fourth invention, it is possible to properly decide judgment timing of a motion such as a step-up-and-down exercise.
A fifth invention is a storage medium storing a load detecting program according to the first invention, and the load detecting program causes the computer to further execute a notifying step for notifying the player that the first motion is performed when the first motion determining step determines that the first motion is performed.
In the fifth invention, the notifying step (S171) notifies the player that the first motion is performed by a sound output, an image display, etc. It is possible to easily inform the player whether or not the instructed motion is executed.
A sixth invention is a load detecting apparatus having a support board allowing a player to put the feet on, and comprises a load value detecting means, a motion instructing means, an elapsed time counting means, a judgment timing deciding means, and a first motion determining means. The load value detecting means detects a load value put on the support board. The motion instructing means instructs the player to perform a first motion. The elapsed time counting means counts an elapsed time from when the motion instructing means gives an instruction of the first motion. The judgment timing deciding means decides whether or not the first motion judgment timing has come on the basis of the elapsed time. The first motion determining means determines whether or not the first motion is performed on the basis of the load value detected by the load value detecting means when the judgment timing deciding means decides the judgment timing has come.
The sixth invention is a load detecting apparatus to which the storage medium storing a load detecting program according to the first invention is applied, and has an advantage similar to the first invention.
According to the present invention, since an elapsed time from when an instruction of a motion is given is counted, and whether or not it is the judgment timing of the motion is decided on the basis of the elapsed time, it is possible to properly decide the judgment timing as to whether or not the player performs a motion of putting the feet on and down from the plate. Thus, it is possible to accurately determine whether or not the motion is performed.
The above described objects and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
Referring to
The game apparatus 12 includes a roughly rectangular parallelepiped housing 14, and the housing 14 is furnished with a disk slot 16 on a front surface. An optical disk 18 as one example of an information storage medium storing game program, etc. is inserted from the disk slot 16 to be loaded into a disk drive 54 (see
Furthermore, on a front surface of the housing 14 of the game apparatus 12, a power button 20a and a reset button 20b are provided at the upper part thereof, and an eject button 20c is provided below them. In addition, a connector cover for external memory card 28 is provided between the reset button 20b and the eject button 20c, and in the vicinity of the disk slot 16. Inside the connector cover for external memory card 28, a connector for external memory card 62 (see
It should be noted that a general-purpose SD card can be employed as the memory card, but other general-purpose memory cards, such as MemoryStick and Multimedia Card (registered trademark), can also be employed.
The game apparatus 12 has an AV cable connector 58 (see
Furthermore, power is supplied to the game apparatus 12 by means of a general AC adapter (not illustrated). The AC adapter is inserted into a standard wall socket for home use, and the game apparatus 12 transforms the house current (commercial power supply) into a low DC voltage signal suitable for driving. In another embodiment, a battery may be utilized as a power supply.
In the game system 10, a user or a player turns the power of the game apparatus 12 on for playing the game (or applications other than the game). Then, the user selects an appropriate optical disk 18 storing a program of a video game (or other applications the player wants to play), and loads the optical disk 18 into the disk drive 54 of the game apparatus 12. In response thereto, the game apparatus 12 starts to execute a video game or other applications on the basis of the program recorded in the optical disk 18. The user operates the controller 22 in order to apply an input to the game apparatus 12. For example, by operating any one of the operating buttons of the input means 26, a game or other application is started. Besides the operation on the input means 26, by moving the controller 22 itself, it is possible to move a moving image object (player object) in different directions or change the perspective of the user (camera position) in a 3-dimensional game world.
The external main memory 46 is utilized as a work area and a buffer area of the CPU 40 by storing programs such as a game program and various data. The ROM/RTC 48, which is a so-called boot ROM, stores a program for activating the game apparatus 12, and is provided with a time circuit for counting time. The disk drive 54 reads program data, texture data, etc. from the optical disk 18, and writes them into the internal main memory 42e described later or the external main memory 46 under the control of the CPU 40.
The system LSI 42 is provided with an input-output processor 42a, a GPU (Graphics Processor Unit) 42b, a DSP (Digital Signal Processor) 42c, a VRAM 42d and an internal main memory 42e, and these are connected with one another by internal buses although illustration is omitted.
The input-output processor (I/O processor) 42a executes transmission and reception of data and also executes downloading of data. The transmission, reception, and downloading of data are explained in detail later.
The GPU 42b forms a part of a drawing means, and receives a graphics command (construction command) from the CPU 40 to generate game image data according to the command. In addition to the graphics command, the CPU 40 also applies to the GPU 42b an image generating program required for generating the game image data.
Although illustration is omitted, the GPU 42b is connected with the VRAM 42d as described above. The GPU 42b accesses the VRAM 42d to acquire data (image data: data such as polygon data, texture data, etc.) required to execute the construction instruction. Additionally, the CPU 40 writes image data required for drawing to the VRAM 42d via the GPU 42b. The GPU 42b accesses the VRAM 42d to create game image data for drawing.
In this embodiment, a case in which the GPU 42b generates game image data is explained, but in a case of executing an arbitrary application other than the game application, the GPU 42b generates image data for that arbitrary application.
Furthermore, the DSP 42c functions as an audio processor, and generates audio data corresponding to a sound, a voice, music, or the like to be output from the speaker 34a by means of the sound data and the sound wave (tone) data stored in the internal main memory 42e and the external main memory 46.
The game image data and audio data generated as described above are read by the AV IC 56, and output to the monitor 34 and the speaker 34a via the AV connector 58. Accordingly, a game screen is displayed on the monitor 34, and a sound (music) necessary for the game is output from the speaker 34a.
Furthermore, the input-output processor 42a is connected with a flash memory 44, a wireless communication module 50 and a wireless controller module 52, and is also connected with an expansion connector 60 and a connector for external memory card 62. The wireless communication module 50 is connected with an antenna 50a, and the wireless controller module 52 is connected with an antenna 52a.
The input-output processor 42a can communicate with other game apparatuses and various servers connected to a network via the wireless communication module 50. It should be noted that it is possible to communicate directly with another game apparatus without going through the network. The input-output processor 42a periodically accesses the flash memory 44 to detect the presence or absence of data that needs to be transmitted to the network (referred to as data to be transmitted), and transmits it to the network via the wireless communication module 50 and the antenna 50a in a case that data to be transmitted is present. Furthermore, the input-output processor 42a receives data (referred to as received data) transmitted from other game apparatuses via the network, the antenna 50a and the wireless communication module 50, and stores the received data in the flash memory 44. If the received data does not satisfy a predetermined condition, the received data is discarded as it is. In addition, the input-output processor 42a can receive data (download data) downloaded from a download server via the network, the antenna 50a and the wireless communication module 50, and store the download data in the flash memory 44.
Furthermore, the input-output processor 42a receives input data transmitted from the controller 22 and the load controller 36 via the antenna 52a and the wireless controller module 52, and (temporarily) stores it in the buffer area of the internal main memory 42e or the external main memory 46. The input data is erased from the buffer area after being utilized in game processing by the CPU 40.
In this embodiment, as described above, the wireless controller module 52 makes communications with the controller 22 and the load controller 36 in accordance with Bluetooth standards.
Furthermore, for the sake of the drawings,
In addition, the input-output processor 42a is connected with the expansion connector 60 and the connector for external memory card 62. The expansion connector 60 is a connector for interfaces such as USB and SCSI, and can be connected with a medium such as an external storage and with peripheral devices such as another controller. Furthermore, the expansion connector 60 can be connected with a cable LAN adaptor so that the cable LAN can be utilized in place of the wireless communication module 50. The connector for external memory card 62 can be connected with an external storage such as a memory card. Thus, the input-output processor 42a, for example, accesses the external storage via the expansion connector 60 or the connector for external memory card 62 to store and read data.
Although a detailed description is omitted, as shown in
Although the system LSI 42 is supplied with power even in the standby mode, the supply of clocks to the GPU 42b, the DSP 42c and the VRAM 42d is stopped so that they are not driven, realizing a reduction in power consumption.
Although illustration is omitted, inside the housing 14 of the game apparatus 12, a fan is provided for exhausting heat of ICs such as the CPU 40 and the system LSI 42 to the outside. In the standby mode, the fan is also stopped.
However, in a case that the standby mode is not desired to be utilized, by making the standby mode unusable, the power supply to all the circuit components is completely stopped when the power button 20a is turned off.
Furthermore, switching between the normal mode and the standby mode can be performed by turning on and off the power switch 26h (see
The reset button 20b is also connected with the system LSI 42. When the reset button 20b is pushed, the system LSI 42 restarts the activation program of the game apparatus 12. The eject button 20c is connected to the disk drive 54. When the eject button 20c is pushed, the optical disk 18 is removed from the disk drive 54.
Each of
Referring to
The cross key 26a is a four-directional push switch, including operation parts for four directions: front (or upper), back (or lower), right and left. By operating any one of the operation parts, it is possible to instruct a moving direction of a character or object (player character or player object) that is operable by the player, or to instruct a moving direction of a cursor.
The 1 button 26b and the 2 button 26c are respectively push button switches, and are used for adjusting a viewpoint position and a viewpoint direction when displaying a 3D game image, i.e., a position and an angle of view of a virtual camera. Alternatively, the 1 button 26b and the 2 button 26c can be used for the same operations as the A-button 26d and the B-trigger switch 26i or for auxiliary operations.
The A-button switch 26d is a push button switch, and is used for causing the player character or the player object to take an action other than one instructed by a directional instruction, specifically arbitrary actions such as hitting (punching), throwing, grasping (acquiring), riding, jumping, etc. For example, in an action game, it is possible to give an instruction to jump, punch, move a weapon, and so forth. Also, in a role playing game (RPG) or a simulation RPG, it is possible to give an instruction to acquire an item, select and determine a weapon or command, and so forth.
The −button 26e, the HOME button 26f, the +button 26g, and the power supply switch 26h are also push button switches. The −button 26e is used for selecting a game mode. The HOME button 26f is used for displaying a game menu (menu screen). The +button 26g is used for starting (re-starting) or pausing the game. The power supply switch 26h is used for turning on/off a power supply of the game apparatus 12 by remote control.
Note that in this embodiment, a power supply switch for turning on/off the controller 22 itself is not provided; the controller 22 is set to an on-state by operating any one of the switches or buttons of the input means 26 of the controller 22, and when it is not operated for a certain period of time (30 seconds, for example) or more, the controller 22 is automatically set to an off-state.
The B-trigger switch 26i is also a push button switch, and is mainly used for inputting a trigger such as shooting and for designating a position selected by the controller 22. In a case that the B-trigger switch 26i is held down, it is possible to keep movements and parameters of the player object constant. In some cases, the B-trigger switch 26i functions in the same way as a normal B-button, and is used for canceling the action determined by the A-button 26d.
As shown in
In addition, the controller 22 has an imaged information arithmetic section 80 (see
Note that, the shape of the controller 22 and the shape, number and setting position of each input means 26 shown in
The processor 70 is in charge of an overall control of the controller 22, and transmits (inputs) information (input information) inputted by the input means 26, the acceleration sensor 74, and the imaged information arithmetic section 80 as input data, to the game apparatus 12 via the radio module 76 and the antenna 78. At this time, the processor 70 uses the memory 72 as a working area or a buffer area.
An operation signal (operation data) from the aforementioned input means 26 (26a to 26i) is inputted to the processor 70, and the processor 70 stores the operation data once in the memory 72.
Moreover, the acceleration sensor 74 detects accelerations of the controller 22 in the directions of three axes: the vertical direction (y-axis direction), the lateral direction (x-axis direction), and the forward and rearward direction (z-axis direction). The acceleration sensor 74 is typically an acceleration sensor of an electrostatic capacity type, but an acceleration sensor of another type may also be used.
For example, the acceleration sensor 74 detects the accelerations (ax, ay, and az) in the directions of the x-axis, y-axis, and z-axis for each first predetermined time, and inputs the data of the accelerations thus detected (acceleration data) to the processor 70. For example, the acceleration sensor 74 detects the acceleration in the direction of each axis in a range from −2.0 g to 2.0 g (g indicates gravitational acceleration; the same applies hereafter). The processor 70 detects the acceleration data given from the acceleration sensor 74 for each second predetermined time, and stores it in the memory 72 once. The processor 70 creates input data including at least one of the operation data, the acceleration data, and marker coordinate data described later, and transmits the input data thus created to the game apparatus 12 for each third predetermined time (5 msec, for example).
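As a rough illustration only, the packaging of the operation data, acceleration data, and marker coordinate data into one input data record can be sketched as follows; the record layout, field names, and clamping behavior are assumptions, not the actual input data format of the controller 22.

    # Hypothetical sketch of one input-data record sent every reporting period
    # (5 msec in the embodiment). The accelerations are expressed in units of g
    # and clamped to the sensor's assumed -2.0 g to +2.0 g detection range.
    from dataclasses import dataclass, field
    from typing import List, Tuple

    @dataclass
    class InputData:
        buttons: int                                   # state of the input means 26
        accel: Tuple[float, float, float]              # (ax, ay, az) in g
        markers: List[Tuple[int, int]] = field(default_factory=list)  # marker coordinates

    def make_input_data(buttons, ax, ay, az, markers):
        clamp = lambda v: max(-2.0, min(2.0, v))
        return InputData(buttons, (clamp(ax), clamp(ay), clamp(az)), list(markers))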
In this embodiment, although omitted in
The radio module 76 modulates a carrier of a predetermined frequency by the input data, by using a technique of Bluetooth, for example, and emits its weak radio wave signal from the antenna 78. Namely, the input data is modulated to the weak radio wave signal by the radio module 76 and transmitted from the antenna 78 (controller 22). The weak radio wave signal is received by the radio controller module 52 provided to the aforementioned game apparatus 12. The weak radio wave thus received is subjected to demodulating and decoding processing. This makes it possible for the game apparatus 12 (CPU 40) to acquire the input data from the controller 22. Then, the CPU 40 performs game processing, following the input data and the program (game program).
In addition, as described above, the controller 22 is provided with the imaged information arithmetic section 80. The imaged information arithmetic section 80 is made up of an infrared rays filter 80a, a lens 80b, an imager 80c, and an image processing circuit 80d. The infrared rays filter 80a passes only infrared rays out of the light incident from the front of the controller 22. As described above, the markers 340m and 340n placed near (around) the display screen of the monitor 34 are infrared LEDs that output infrared light toward the front of the monitor 34. Accordingly, by providing the infrared rays filter 80a, it is possible to image the markers 340m and 340n more accurately. The lens 80b condenses the infrared rays passing through the infrared rays filter 80a and emits them to the imager 80c. The imager 80c is a solid-state imager, such as a CMOS sensor or a CCD, for example, and images the infrared rays condensed by the lens 80b. Accordingly, the imager 80c images only the infrared rays passing through the infrared rays filter 80a to generate image data. Hereafter, the image imaged by the imager 80c is called an "imaged image". The image data generated by the imager 80c is processed by the image processing circuit 80d. The image processing circuit 80d calculates a position of an object to be imaged (the markers 340m and 340n) within the imaged image, and outputs coordinate values indicative of the position to the processor 70 as imaged data for each fourth predetermined time. It should be noted that a description of the processing in the image processing circuit 80d is made later.
The board 36a is formed substantially in a rectangle, and has a substantially rectangular shape when viewed from above. For example, a short side of the rectangle is on the order of about 30 cm, and a long side thereof is on the order of about 50 cm. The upper surface of the board 36a, on which the player rides, is formed flat. The side faces at the four corners of the board 36a are formed so as to partially project in a cylindrical shape.
In the board 36a, the four load sensors 36b are arranged at predetermined intervals. In this embodiment, the four load sensors 36b are arranged in peripheral portions of the board 36a, specifically, at the four corners. The interval between the load sensors 36b is set to an appropriate value such that the player's intention in a game manipulation can accurately be detected from the load applied to the board 36a.
The support plate 360 includes an upper-layer plate 360a that constitutes an upper surface and an upper side face, a lower-layer plate 360b that constitutes a lower surface and a lower side face, and an intermediate-layer plate 360c provided between the upper-layer plate 360a and the lower-layer plate 360b. For example, the upper-layer plate 360a and the lower-layer plate 360b are formed by plastic molding and integrated with each other by bonding. For example, the intermediate-layer plate 360c is formed by pressing one metal plate. The intermediate-layer plate 360c is fixed onto the four load sensors 36b. The upper-layer plate 360a has a lattice-shaped rib (not shown) in a lower surface thereof, and the upper-layer plate 360a is supported by the intermediate-layer plate 360c while the rib is interposed. Accordingly, when the player rides on the board 36a, the load is transmitted to the support plate 360, the load sensor 36b, and the leg 362. As shown by an arrow in
The load sensor 36b is formed by, e.g., a strain gage (strain sensor) type load cell, and the load sensor 36b is a load transducer that converts the input load into an electric signal. In the load sensor 36b, a strain inducing element 370a is deformed to generate a strain according to the input load. The strain is converted into a change in electric resistance by a strain sensor 370b adhering to the strain inducing element 370a, and the change in electric resistance is converted into a change in voltage. Accordingly, the load sensor 36b outputs a voltage signal indicating the input load from an output terminal.
Other types of load sensors, such as a tuning-fork vibrating type, a string vibrating type, an electrostatic capacity type, a piezoelectric type, a magnetostriction type, and a gyroscopic type, may be used as the load sensor 36b.
Returning to
The load controller 36 includes a microcomputer 100 that controls an operation of the load controller 36. The microcomputer 100 includes a CPU, a ROM and a RAM (not shown), and the CPU controls the operation of the load controller 36 according to a program stored in the ROM.
The microcomputer 100 is connected with the power button 36c, the A/D converter 102, a DC-DC converter 104 and a wireless module 106. In addition, the wireless module 106 is connected with an antenna 106a. Furthermore, the four load sensors 36b are displayed as a load cell 36b in
Furthermore, the load controller 36 is provided with a battery 110 for power supply. In another embodiment, an AC adapter may be connected in place of the battery to supply commercial power. In such a case, a power supply circuit for converting the alternating current into direct current and for stepping down and rectifying the voltage has to be provided in place of the DC-DC converter. In this embodiment, the power supply to the microcomputer 100 and the wireless module 106 is made directly from the battery. That is, power is constantly supplied to a part of the components (the CPU) inside the microcomputer 100 and to the wireless module 106 to detect whether or not the power button 36c is turned on, and whether or not a power-on (load detection) command is transmitted from the game apparatus 12. On the other hand, power from the battery 110 is supplied to the load sensor 36b, the A/D converter 102 and the amplifier 108 via the DC-DC converter 104. The DC-DC converter 104 converts the voltage level of the direct current from the battery 110 into a different voltage level, and applies it to the load sensor 36b, the A/D converter 102 and the amplifier 108.
The electric power may be supplied to the load sensor 36b, the A/D converter 102, and the amplifier 108 only when needed, with the microcomputer 100 controlling the DC-DC converter 104. That is, when the microcomputer 100 determines that a need to operate the load sensor 36b to detect the load arises, the microcomputer 100 may control the DC-DC converter 104 to supply the electric power to each load sensor 36b, the A/D converter 102, and each amplifier 108.
Once the electric power is supplied, each load sensor 36b outputs a signal indicating the input load. The signal is amplified by each amplifier 108, and the analog signal is converted into digital data by the A/D converter 102. Then, the digital data is inputted to the microcomputer 100. Identification information on each load sensor 36b is imparted to the detection value of each load sensor 36b, allowing for distinction among the detection values of the load sensors 36b. Thus, the microcomputer 100 can obtain the pieces of data indicating the detection values of the four load sensors 36b at the same time.
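A minimal sketch of this collection step might look as follows; adc_read() is a hypothetical access function and the identification labels are arbitrary, and only the idea of tagging each detection value with its sensor's identification comes from the text.

    # Hypothetical sketch: obtain the four A/D-converted detection values, each
    # tagged with its load sensor's identification so they can be told apart.
    def read_all_load_sensors(adc_read):
        """adc_read(sensor_id) is an assumed function returning one detection value."""
        sensor_ids = ("left_front", "left_back", "right_front", "right_back")
        return {sensor_id: adc_read(sensor_id) for sensor_id in sensor_ids}

    # usage sketch: values = read_all_load_sensors(lambda sensor_id: 0.0)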
On the other hand, when the microcomputer 100 determines that the need to operate the load sensor 36b does not arise, i.e., when the microcomputer 100 determines that it is not the time to detect the load, the microcomputer 100 controls the DC-DC converter 104 to stop the supply of the electric power to the load sensor 36b, the A/D converter 102 and the amplifier 108. Thus, in the load controller 36, the load sensor 36b is operated to detect the load only when needed, so that the power consumption for detecting the load can be suppressed.
Typically, the time when the load detection is required means the time when the game apparatus 12 (
Alternatively, the microcomputer 100 may determine that it is the time to detect the load at regular time intervals and control the DC-DC converter 104 accordingly. In the case where the microcomputer 100 periodically detects the load, information on the period may initially be imparted from the game apparatus 12 to the microcomputer 100 of the load controller 36, or may be previously stored in the microcomputer 100.
The data indicating the detection value from the load sensor 36b is transmitted as the manipulation data (input data) of the load controller 36 from the microcomputer 100 to the game apparatus 12 (
Additionally, the wireless module 106 can communicate by a radio standard (Bluetooth, wireless LAN, etc.) the same as that of the radio controller module 52 of the game apparatus 12. Accordingly, the CPU 40 of the game apparatus 12 can transmit a load obtaining command to the load controller 36 via the radio controller module 52, etc. The microcomputer 100 of the load controller 36 can receive a command from the game apparatus 12 via the wireless module 106 and the antenna 106a, and transmit input data including load detecting values (or load calculating values) of the respective load sensors 36b to the game apparatus 12.
For example, in the case of a game performed based on the simple total value of the four load values detected by the four load sensors 36b, the player can take any position with respect to the four load sensors 36b of the load controller 36; that is, the player can play the game while riding on any position of the board 36a with any orientation. However, depending on the type of the game, it is necessary to perform processing while determining toward which direction, when viewed from the player, the load value detected by each load sensor 36b is oriented. That is, it is necessary to understand the positional relationship between the four load sensors 36b of the load controller 36 and the player. For example, the positional relationship between the four load sensors 36b and the player may be defined in advance, and it may be assumed that the player rides on the board 36a such that the predetermined positional relationship is obtained. Typically, a positional relationship is defined such that two load sensors 36b exist at each of the front and the back of, and on the right and left sides of, the player riding on the center of the board 36a, i.e., such that the load sensors 36b exist in the right front, left front, right rear, and left rear directions from the center of the player when the player rides on the center of the board 36a of the load controller 36. In this case, in this embodiment, the board 36a of the load controller 36 has a rectangular shape in plan view, and the power button 36c is provided on one side (a long side) of the rectangle; therefore, using the power button 36c as a mark, the player is informed in advance that he or she should ride on the board 36a such that the long side on which the power button 36c is provided is positioned in a predetermined direction (front, back, left or right). Thus, the load value detected by each load sensor 36b becomes a load value in a predetermined direction (right front, left front, right back and left back) when viewed from the player. Accordingly, the load controller 36 and the game apparatus 12 can understand to which direction, seen from the player, each load detection value corresponds, on the basis of the identification information of each load sensor 36b included in the load detection value data and the arrangement data set (stored) in advance for indicating the position or direction of each load sensor 36b with respect to the player. This makes it possible to grasp the intention of a game operation by the player, such as an operating direction from front to back or from side to side, for example.
The arrangement of the load sensors 36b relative to the player need not be defined in advance; the arrangement may instead be set by the player's input in an initial setting, a setting in the game, or the like. For example, the load is obtained while a screen instructing the player to ride on a portion in a predetermined direction when viewed from the player (such as the right front, left front, right rear, or left rear) is displayed. Therefore, the positional relationship between each load sensor 36b and the player can be specified, and the information on the arrangement set in this manner can be generated and stored. Alternatively, a screen for selecting an arrangement of the load controller 36 may be displayed on the screen of the monitor 34 to allow the player to select, by an input with the controller 22, in which direction the mark (the power button 36c) exists when viewed from the player, and in response to the selection, arrangement data of each load sensor 36b may be generated and stored.
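For illustration only, arrangement data of this kind might be stored as a simple table mapping each load sensor's identification to a direction seen from the player; the identification numbers, direction names, and table contents below are assumptions made for this sketch, not the stored format of the embodiment.

    # Hypothetical arrangement data, keyed by the side of the player on which
    # the mark (the power button 36c) is located, as selected by the player.
    ARRANGEMENTS = {
        "front": {0: "left_front", 1: "right_front", 2: "left_back", 3: "right_back"},
        "back":  {0: "right_back", 1: "left_back", 2: "right_front", 3: "left_front"},
    }

    def direction_of(sensor_id, power_button_side):
        # returns the direction of the sensor when viewed from the player
        return ARRANGEMENTS[power_button_side][sensor_id]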
Additionally, although
In the case where the position and orientation of the controller 22 are out of the range, the game manipulation cannot be performed based on the position and orientation of the controller 22. Hereinafter the range is referred to as “manipulable range”.
In the case where the controller 22 is grasped in the manipulable range, the images of the markers 340m and 340n are taken by the imaged information arithmetic section 80. That is, the imaged image obtained by the imager 80c includes the images (target images) of the markers 340m and 340n that are of the imaging target.
Because the target image appears as a high-brightness portion in the image data of the imaged image, the image processing circuit 80d detects the high-brightness portion as a candidate for the target image. Then, the image processing circuit 80d determines whether or not the high-brightness portion is the target image based on the size of the detected high-brightness portion. Sometimes the imaged image includes not only the images 340m′ and 340n′ corresponding to the two markers 340m and 340n, which are the target images, but also images other than the target image due to sunlight from a window or light from a fluorescent lamp. The determination processing of whether or not the high-brightness portion is the target image is performed in order to distinguish the images 340m′ and 340n′ of the markers 340m and 340n, which are the target images, from other images and to exactly detect the target image. Specifically, in the determination processing, it is determined whether or not the detected high-brightness portion has a size within a predetermined range. When the high-brightness portion has a size within the predetermined range, it is determined that the high-brightness portion indicates the target image. On the contrary, when the high-brightness portion does not have a size within the predetermined range, it is determined that the high-brightness portion indicates an image other than the target image.
Then, the image processing circuit 80d computes the position of the high-brightness portion for each high-brightness portion determined to indicate the target image as a result of the determination processing. Specifically, a position of the center of gravity of the high-brightness portion is computed. Hereinafter, the coordinate of the position of the center of gravity is referred to as a marker coordinate. The position of the center of gravity can be computed with finer precision than the resolution of the imager 80c. Here, it is assumed that the image taken by the imager 80c has a resolution of 126×96 and the position of the center of gravity is computed on a scale of 1024×768. That is, the marker coordinate is expressed by integers from (0, 0) to (1024, 768).
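A rough sketch of this detection, written here only to illustrate the idea, follows; the brightness threshold and size range are assumed values, the scaling to the 1024×768 marker-coordinate space is an assumption about how the finer scale is obtained, and the actual image processing circuit 80d is dedicated hardware, not this code.

    # Illustrative sketch: find connected high-brightness blobs, keep only those
    # within an assumed size range, and return the centre of gravity of each
    # surviving blob scaled to the 1024 x 768 marker-coordinate space.
    def detect_marker_coordinates(pixels, width, height,
                                  brightness_threshold=200,
                                  min_size=2, max_size=200):
        visited = [[False] * width for _ in range(height)]
        marker_coordinates = []
        for y in range(height):
            for x in range(width):
                if visited[y][x] or pixels[y][x] < brightness_threshold:
                    continue
                stack, blob = [(x, y)], []
                visited[y][x] = True
                while stack:                      # 4-neighbour flood fill
                    cx, cy = stack.pop()
                    blob.append((cx, cy))
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                        if 0 <= nx < width and 0 <= ny < height and not visited[ny][nx] \
                                and pixels[ny][nx] >= brightness_threshold:
                            visited[ny][nx] = True
                            stack.append((nx, ny))
                if min_size <= len(blob) <= max_size:       # size-based determination
                    gx = sum(p[0] for p in blob) / len(blob)
                    gy = sum(p[1] for p in blob) / len(blob)
                    # scale the centre of gravity to the finer marker-coordinate scale
                    marker_coordinates.append((int(gx * 1024 / width), int(gy * 768 / height)))
        return marker_coordinates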
The position in the imaged image is expressed by a coordinate system (XY-coordinate system) in which an origin is set to an upper left of the imaged image, a downward direction is set to a positive Y-axis direction, and a rightward direction is set to a positive X-axis direction.
In the case where the target image is correctly detected, two marker coordinates are computed because the two high-brightness portions are determined as the target image by the determination processing. The image processing circuit 80d outputs the pieces of data indicating the two computed marker coordinates. As described above, the outputted pieces of marker coordinate data are added to the input data by the processor 70 and transmitted to the game apparatus 12.
When the game apparatus 12 (CPU 40) detects the marker coordinate data from the received input data, the game apparatus 12 can compute the position (indicated coordinate) indicated by the controller 22 on the screen of the monitor 34 and the distances between the controller 22 and the markers 340m and 340n based on the marker coordinate data. Specifically, the position toward which the controller 22 is orientated, i.e., the indicated position is computed from the position at the midpoint of the two marker coordinates. Accordingly, the controller 22 functions as a pointing device for instructing an arbitrary position within the screen of the monitor 34. The distance between the target images in the imaged image is changed according to the distances between the controller 22 and the markers 340m and 340n, and therefore, by computing the distance between the marker coordinates, the game apparatus 12 can compute the current distances between the controller 22 and the markers 340m and 340n.
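A simple sketch of these two computations is given below; the mapping from the 1024×768 marker-coordinate space to screen pixels is assumed to be linear here, and the screen size is an example value, so this is only an approximation of what an actual implementation would do.

    # Sketch of using the two marker coordinates: the indicated position is taken
    # from the midpoint of the two coordinates, and the spacing between them
    # serves as a cue for the distance between the controller 22 and the markers.
    import math

    def indicated_position(m0, m1, screen_w=640, screen_h=480):
        mid_x = (m0[0] + m1[0]) / 2.0
        mid_y = (m0[1] + m1[1]) / 2.0
        # map from the 1024 x 768 marker-coordinate space to the screen (assumed linear)
        return (mid_x * screen_w / 1024.0, mid_y * screen_h / 768.0)

    def marker_separation(m0, m1):
        # larger separation means the controller is closer to the markers
        return math.hypot(m0[0] - m1[0], m0[1] - m1[1])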
In the game system 10, a game is played by the player's motion of putting the feet on and taking them off the load controller 36. In this embodiment, a step-up-and-down exercise game is performed. The step-up-and-down exercise is an exercise of repetitively stepping a foot onto and off the board. As shown in
The motion of stepping up onto and down from the load controller 36 is performed according to an instruction on the game screen described below, and can be changed as necessary. Accordingly,
More specifically, in each panel 400, two footprints, right and left, are drawn, and by changing a color, a shape, a pattern, etc. of the footprints, the stepping-up-and-down motion to be performed by the right and left feet is represented. Additionally, in the basic appearance of the panel 400, the base color is drawn in white, and the outline of the footprints is drawn in gray, for example. If no motion is required to be executed, the panel 400 in this basic appearance is used. In
Each panel 400 is constructed so as to sequentially appear from the upper end of the screen, move downward, and disappear at the lower end of the screen. At a predetermined position below the center of the screen, a frame 402 is fixedly arranged. The frame 402 is provided on the moving path of the panels 400, and a panel 400 is stopped within the frame 402 for a set amount of time. The frame 402 thus designates the panel 400 of the motion to be currently executed. That is, out of the plurality of panels 400, the panel 400 that has moved into the position of the frame 402 indicates the motion to be currently executed.
In addition, on the screen, a plurality of characters 404 are displayed at the right and left of the panel 400, for example. These characters 404 are controlled so as to make their actions according to the instruction by the panel 400 and the frame 402. In
The player performs a motion on the load controller 36 according to a motion instruction. In the game apparatus 12, it is determined whether or not the instructed motion is performed by the player on the basis of a load value detected by the load controller 36. If it is determined that the motion is performed, the player can gain a score. In addition, the timing of the motion is judged, and if the timing is good, a high score can be gained.
As shown in
Furthermore, the panel 400 is stopped within the frame 402 after a predetermined time PS elapses from the start of the movement, and then continues to be stopped until the time limit TA expires. Providing a stopping period in the movement of the panels can clearly show the player when the instruction of each motion is started and ended. The panel stopping time PS is set in advance to a proper value, by experiments, etc., so as to be a timing suitable for a motion of stepping up and down, for example. For example, the timing when the foot moving for stepping up or down actually touches the load controller 36 or the ground (floor) may be adopted. Thus, the player can perform a motion according to the moving state of the panels 400, such that he or she moves the foot down and onto another place while the panels 400 move, and completely places the foot so as to prepare for the next position while the panel 400 is stopped.
If it is not determined that the motion is performed between when the instruction panel 400 starts to move and when the time limit TA expires, it is determined that the execution of the motion fails. In the case of a failure judgment, the player is not scored.
On the other hand, if it is determined that the motion is performed by the time the time limit TA expires, it is determined that the execution of the motion succeeds, and the player is scored. Accordingly, for example, even if the motion is first performed with a foot different from the instructed foot, a success judgment is made if the motion is then done with the correct foot by the time the time limit TA expires.
In addition, in this embodiment, a score corresponding to the timing at which it is determined that a motion is performed may be given. Among the success judgments, a motion performed with good timing is called a perfect judgment, and the rest are called OK judgments. As times for discriminating the perfect judgment from the OK judgment, perfect judgment times Tp0 and Tp1 are set. When the elapsed time at which it is determined that the motion is performed falls in the range from Tp0 to Tp1, the perfect judgment is made. That is, the period when the elapsed time falls in the range from Tp0 to Tp1 is a perfect judgment area, and the period when the elapsed time falls in the range from 0 to Tp0 and the period when the elapsed time falls in the range from Tp1 to TA are OK judgment areas.
The perfect judgment times Tp0 and Tp1 are decided, by experiments, to be values suitable for a motion of stepping up and down. In a case that the panel stopping time PS is determined to be a timing suitable for the motion as described above, predetermined times before and after the panel stopping time PS are adopted as the perfect judgment area. In this case, the player can easily obtain a perfect judgment if the foot is put on the load controller 36 or the ground at the right timing, that is, when the instruction panel 400 on the screen is stopped.
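The relation among the time limit TA and the perfect judgment times Tp0 and Tp1 can be summarized by the following small sketch, written for this description only; the function and its return labels are illustrative and not part of the game program.

    # Sketch of classifying the elapsed time at which a motion is detected,
    # using the time limit TA and the perfect-judgment window [Tp0, Tp1].
    def classify_timing(elapsed, tp0, tp1, ta):
        if elapsed > ta:
            return "failure"            # no detection within the time limit
        if tp0 <= elapsed <= tp1:
            return "perfect"            # within the perfect judgment area
        return "ok"                     # success, but outside the perfect window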
As to the motions of the step-up-and-down exercise at the first to third steps, whether or not each motion is performed can be determined from the detected load value, and whether the judgment is the OK judgment or the perfect judgment can be determined from the timing when the motion is performed. However, as to the motion at the fourth step of putting the left foot, which was put on the load controller 36, down onto the floor to thereby put both feet on the floor, it is difficult to determine the timing when the motion is performed from the detected load value. The reason is that, at the time when the foot is taken off the load controller 36, the load value already becomes approximately zero, and it is impossible to observe from a change in the load value that the foot is put on the floor, that is, that the motion at the fourth step is finished.
Here, in this embodiment, the judgment timing of a motion at the fourth step is decided on the basis of an elapsed time from when the instruction of the motion is given. More specifically, as shown in
In a case that the player does not perform a motion according to the instruction, it is impossible to measure the elapsed time T3 at the third step; in this case, therefore, when a predetermined time elapses from the start of the instruction at the fourth step, the judgment timing of the motion at the fourth step is decided, and the judgment of the motion at the fourth step is made on the basis of the load value. As described above, since the stopping time PS of the panel 400 is set so as to be suitable for a motion of stepping up and down, and the perfect judgment area is set before and after PS, it is expected that the player puts the feet together in response to the stop of the panel 400. Accordingly, in this embodiment, by utilizing the panel stopping time PS as the judgment timing of the motion at the fourth step, it is possible to determine the motion at an appropriate timing. That is, in a case that the elapsed time T3 at the third step cannot be detected, the judgment of the motion at the fourth step is performed after the elapsed time from the start of the movement of the panel 400 becomes equal to or more than the panel stopping time PS.
Thus, on the basis of the elapsed time from the instruction of the motion at the fourth step, the judgment timing of the motion at the fourth step is decided, and therefore, it is possible to appropriately decide the judgment timing of a motion by the player.
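Summarized as a sketch (the variable and function names are illustrative only, and T3 is treated as unavailable when the third-step motion was not detected):

    # Fourth-step timing decision: the judgment timing comes when the elapsed
    # time T4 reaches the elapsed time T3 measured at the third step, or, if T3
    # could not be measured, when T4 reaches the panel stopping time PS.
    def fourth_step_judgment_timing_has_come(t4, t3, panel_stop_time):
        if t3 is not None:                 # third-step motion was detected
            return t4 >= t3
        return t4 >= panel_stop_time       # fall back to the preset stopping time PS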
Additionally,
A memory area 504 stores a load value detecting program. The program is for detecting a load value of the load controller 36. For example, when a load value is required, a load obtaining command is transmitted to the load controller 36 via the radio controller module 52, and a load value of each load sensor 36b is detected from the data of the load controller 36 received by the radio controller module 52. At the time of a judgment of a motion, the load values are fetched at an interval of a constant period, such as one frame (1/60 second), for example.
A memory area 506 stores a motion instructing program. The program is for instructing a player of a motion to be executed. The movements and stops of the plurality of panels 400 for instructing a series of motions are controlled on the basis of the success-or-failure judgment time TA and the panel stopping time PS, etc. as described above.
The elapsed time counting program is a program for measuring a lapse of time after the instruction of a motion. More specifically, the time when a panel 400 starts to move from the position upwardly adjacent to the frame 402 into the frame 402 is the time when the motion corresponding to the panel 400 is instructed, and therefore, the time is counted from when the panel 400 starts to move into the frame 402.
A memory area 510 stores a judgment timing deciding program. The program is for deciding, on the basis of an elapsed time, whether or not it is the judgment timing of a motion. In this embodiment, the judgment timing of the motion at the fourth step of the step-up-and-down exercise is decided. More specifically, as described above, it is determined whether or not the elapsed time T4 from the instruction of the motion at the fourth step is equal to or more than the elapsed time T3 measured for the motion at the third step. Alternatively, it is determined whether or not the elapsed time T4 is equal to or more than the panel stopping time PS.
A memory area 512 stores a load determining program. The program is for determining whether or not the detected load value becomes a predetermined state. The predetermined state is a state in which a condition for determining each motion of the step-up-and-down exercise is satisfied. The judgment condition of each motion relates to a ratio of the load value to the body weight value of the player and to a position of the center of gravity of the load values. In this embodiment, it is determined whether or not the load value detected for the motion at the third step of the step-up-and-down exercise becomes the predetermined state, and the elapsed time T3 from when the instruction of that motion is started to when it is determined that the load value becomes the predetermined state is adopted as the judgment timing of the motion at the fourth step.
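For illustration, such a condition check might be sketched as below; the ratio threshold and the sign test on the center-of-gravity position are assumed example values, not the conditions actually used in the embodiment.

    # Hypothetical load-determining predicate: the total load is compared with a
    # ratio of the body weight, and the right-left centre-of-gravity position XG
    # (Equation 1, given later) is compared with a threshold. Both thresholds
    # are assumptions chosen only to illustrate the kind of condition described.
    def load_state_reached(total_load, body_weight, xg,
                           ratio_threshold=0.5, xg_threshold=0.0):
        return total_load >= ratio_threshold * body_weight and xg < xg_threshold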
A memory area 514 stores a motion determining program. The program is, when it is determined that the judgment timing of a motion has come, for determining whether or not the motion is performed on the basis of the load value. In this embodiment, when it is determined that the judgment timing of a motion at the fourth step of the step-up-and-down exercise has come, it is determined whether or not the motion is performed on the basis of the detected load value.
A memory area 516 stores a motion's completion notifying program. The program is for notifying the player that the motion is performed when it is determined that the instructed motion is performed. This makes it possible to easily inform the player whether or not the motion is executed. In this embodiment, by outputting from the speaker 34a a predetermined sound indicating that execution of the instructed motion has been determined, the player is informed that the instructed motion is performed. Furthermore, the notification may also be made by an image display, such as a change in the color of the panel 400 corresponding to the instructed motion or a display of letters representing success on the screen, for example.
A memory area 518 of the data memory area 502 stores a body weight value of the player. The body weight value is calculated by summing load values of all the load sensors 36b detected when the player rides still on the load controller 36. Additionally, when the body weight value is measured, a screen for instructing the player to gently ride on the load controller 36 with both feet is displayed before start of the step-up-and-down exercise.
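As a minimal sketch of such a measurement (assuming, for illustration, that the load values are sampled over a short window while the player stands still; the window length and names are not taken from the text):

```python
def measure_body_weight(samples):
    """Estimate the body weight value from load-sensor samples taken while
    the player rides still on the load controller with both feet.

    samples: non-empty iterable of (a, b, c, d) tuples, one per frame.
    """
    totals = [sum(frame) for frame in samples]  # sum of all four load sensors
    return sum(totals) / len(totals)            # average over the window
```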
A memory area 520 stores a time counter. The time counter is a counter for counting an elapsed time from an instruction of each motion. In this embodiment, the count is performed at an interval of a preset time (1 frame).
A memory area 522 stores a load value of each of the load sensors 36b detected by the load detecting program. When judgment of the condition of a motion is performed by the load determining program or the motion determining program, a ratio between a sum of the load values and a body weight value, and a position of the center of gravity are calculated.
A memory area 524 stores a position of the center of gravity. The position of the center of gravity is the position of the center of gravity of the load values of the respective load sensors 36b of the load controller 36. In this embodiment, as can be understood from the fact that the two footprints are arranged along the long side of the rectangular instruction panel 400 shown in
When the load value detected by the load sensor 36b at the left front of the player is a, the load value detected by the load sensor 36b at the left back is b, the load value detected by the load sensor 36b at the right front is c, and the load value detected by the load sensor 36b at the right back is d, the position of the center of gravity in the right and left direction, XG, is calculated by Equation 1 below.
XG=((c+d)−(a+b))*m [Equation 1]
Here, m is a constant, and set to a value satisfying −1≦XG≦1.
Additionally, although not utilized in this embodiment, in another embodiment the judgment may be made on the basis of a position of the center of gravity in the back and forth direction depending on the motion. In this case, the position of the center of gravity in the back and forth direction, YG, is calculated by the following Equation 2.
YG=((a+c)−(b+d))*n [Equation 2]
Here, n is a constant, and set to a value satisfying −1≦YG≦1.
Thus, a position of the center of gravity in a right and left direction XG is calculated on the basis of the difference between the load value (c+d) at the right of the player and the load value (a+b) at the left of the player, and the position of the center of gravity in a back-and-forth direction YG is calculated on the basis of the difference between the load value (a+c) in front of the player and the load value (b+d) at the rear of the player.
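The two equations translate directly into code. The sketch below is illustrative only; in particular, taking m = n = 1/(a+b+c+d) is merely one assumed choice of constants that keeps XG and YG within −1 to 1, since the embodiment only requires some constants satisfying those ranges.

```python
def center_of_gravity(a, b, c, d):
    """Compute the right-left (XG) and back-forth (YG) positions of the
    center of gravity from the four load values per Equations 1 and 2.

    a: left-front, b: left-back, c: right-front, d: right-back load value.
    """
    total = a + b + c + d
    if total == 0:
        return 0.0, 0.0           # no load on the board: treat as centered
    m = n = 1.0 / total           # assumed constants keeping the range [-1, 1]
    xg = ((c + d) - (a + b)) * m  # Equation 1: right minus left
    yg = ((a + c) - (b + d)) * n  # Equation 2: front minus back
    return xg, yg
```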
It should be noted that the direction in which each load sensor 36b lies as viewed from the player (right front, right back, left front or left back in this embodiment) can be grasped from the arrangement data, which is decided in advance or set by the player and stored as described above.
A memory area 526 stores an elapsed time counted by the elapsed time counting program. More specifically, an elapsed time from when the motion instruction at the first step of the step-up-and-down exercise is given to when it is determined that the motion is performed is stored as T1. For example, the time when the motion instruction is given is the time when the panel 400 of the motion starts to move into the frame 402, and the time when it is determined that the motion is performed is the time when a motion completion sound is output. Similarly, an elapsed time as to the motion at the second step is stored as T2, and an elapsed time as to the motion at the third step is stored as T3. Furthermore, as the elapsed time T4 of the motion at the fourth step, the elapsed time from when the instruction of the motion at the fourth step is given to the present is stored.
A memory area 528 stores a panel stopping time PS indicating a time during which the instruction panel 400 is stopped. A memory area 530 stores the success-or-failure judgment time TA indicating a time limit for judging a motion to be currently executed. A memory area 532 stores the perfect judgment times Tp0 and Tp1 for defining the perfect judgment area. The panel stopping time PS, the success-or-failure judgment time TA and the perfect judgment times Tp0 and Tp1 are read from the optical disk 18. In this embodiment, a common value suitable for the respective motions at the first to fourth steps is set in each of the PS, the TA, the Tp0 and the Tp1 such that the step-up-and-down exercise is performed at a constant rhythm. It should be noted that in another embodiment, different values may be set to the PS, the TA, the Tp0 and the Tp1 for each motion.
A memory area 534 stores a result of the game. As a game result, a score of the player, an evaluation (perfect, OK or failure), etc. of the respective motions at the first to the fourth steps are stored.
In a succeeding step S3, the CPU 40 writes the body weight value to the external main memory 46. Thus, the body weight value of the player is stored in the memory area 518.
Then, in a step S5, the CPU 40 displays the instruction panels 400. More specifically, the CPU 40 generates a game screen shown in
Succeedingly, in a step S7, the CPU 40 executes first step processing for judging a motion at the first step of the step-up-and-down exercise. The detail of the first step processing is shown in
In a step S15, the CPU 40 determines whether or not the game is to be ended. For example, it is determined whether or not the step-up-and-down exercise has been performed for a predetermined time period or a predetermined number of times. If “NO” in the step S15, the process returns to the step S7 to judge each of the motions of the step-up-and-down exercise again. On the other hand, in a case that it is determined that the game end condition is satisfied in the step S15, the CPU 40 executes game end processing in a step S17 to end the game processing of the step-up-and-down exercise. For example, the sum of the scores obtained by the successes of the respective motions of the step-up-and-down exercise is calculated, and the score and a result of the evaluation corresponding to the score are displayed, and so forth.
The processing in succeeding steps S33-S43 is executed at a set interval of time (one frame) until it is determined that the motion at the first step is performed on the basis of the load values in the step S41, or until it is determined that the motion at the first step is not performed within the time limit in the step S43.
In the step S33, the CPU 40 executes time counting processing. For example, by incrementing the time counter, the value of the time counter of the memory area 520 is updated. By the time counting processing, it is possible to count an elapsed time from when the motion instruction is given.
Furthermore, in the step S35, the CPU 40 executes load value fetching processing. More specifically, the CPU 40 transmits a load obtaining command to the load controller 36 via the radio controller module 52, etc. In response thereto, input data including the detected load values is transmitted from the load controller 36. The CPU 40 detects the load values of the respective load sensors 36b from the input data received by the radio controller module 52, and stores them in the memory area 522.
It is determined whether or not the instructed motion on the panel 400 is performed on the basis of the detected load values. The judgment of the motion is performed on the basis of a ratio of the load values to the body weight value, and a position of the center of gravity.
More specifically, in the step S37, the CPU 40 determines whether or not the load value is 25-75% of the body weight value. The load value compared with the condition here is the sum of the load values of the respective load sensors 36b stored in the memory area 522. The ratio of the sum of the load values to the body weight value in the memory area 518 is calculated, and it is determined whether or not the ratio falls within the range of 25-75% as a judgment condition of the first step. The judgment condition relating to the ratio is set to an appropriate value by experiments in advance and stored. The motion at the first step in this embodiment is a motion of putting the right foot on the load controller 36. When this motion of the right foot is performed, the left foot remains on the ground, so that roughly half of the player's weight is put on the load controller 36. Thus, in view of the difference in the balance of the loads put on the right and left feet due to each player's habit, etc., if the sum of the detected loads is 25-75% of the body weight value, it can be determined that one foot is put on the load controller 36.
If “YES” in the step S37, that is, if the condition of the ratio of the load values is satisfied, the CPU 40 calculates a position of the center of gravity in order to perform the judgment of a condition of the position of the center of gravity in the step S39 and stores it in the memory area 524. The position of the center of gravity is calculated on the basis of the load values of the respective load sensors 36b stored in the memory area 522 according to the above-described Equation 1.
Then, in the step S41, the CPU 40 determines whether or not the position of the center of gravity falls in the range of 0.01 to 1 as a judgment condition at the first step. The motion at the first step in this embodiment is a motion of putting the right foot on the load controller 36, and the right foot is put on the right side of the load controller 36 when viewed from the player, and therefore, the position of the center of gravity appears on the right side of the load controller 36 when viewed from the player. Accordingly, if the calculated position of the center of gravity falls in the range of 0.01 to 1, it can be determined that the right foot is put on the load controller 36.
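Putting the two first-step conditions together, the check can be sketched as follows (illustrative only; the function name is an assumption, and the center of gravity is computed with the same assumed constant m = 1/(sum of loads) as above):

```python
def first_step_performed(a, b, c, d, body_weight):
    """Judge the motion at the first step: the right foot put on the board.

    Condition 1: the summed load is 25-75% of the body weight value.
    Condition 2: the right-left center of gravity XG falls in 0.01 to 1.
    """
    load_sum = a + b + c + d
    if not (0.25 * body_weight <= load_sum <= 0.75 * body_weight):
        return False                     # not "one foot on the board"
    xg = ((c + d) - (a + b)) / load_sum  # Equation 1 with m = 1/load_sum
    return 0.01 <= xg <= 1.0             # center of gravity on the right
```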
If “NO” in the step S41, that is, if the condition of the position of the center of gravity is not satisfied, it can be determined that the motion at the first step is not performed. The same is true if “NO” in the step S37, that is, if the condition of the ratio of the load values is not satisfied. In these cases, the CPU 40 determines in the step S43 whether or not the predetermined time TA has elapsed from the start of the movement of the panel. The elapsed time from the start of the movement of the panel to the present can be obtained from the value of the time counter of the memory area 520. Furthermore, the predetermined time TA is the success-or-failure judgment time of the memory area 530, that is, the time limit. If “NO” in the step S43, that is, if the elapsed time falls within the time limit of the motion judgment at the first step, the process returns to the step S33. Accordingly, until the time limit expires, the motion judgment at the first step is continued on the basis of the detected load values.
On the other hand, if “YES” in the step S43, that is, if the time limit expires without the motion at the first step being performed, the CPU 40 executes failure processing in a step S45. Since it is determined that the player could not perform the motion at the first step instructed by the panel 400, a score is not given to the player. Furthermore, data indicating the failure judgment as to the motion at the first step is stored in the game result memory area 534. Additionally, the failure of the motion may be displayed on the screen by letters such as FAILURE.
Furthermore, if “YES” in the step S41, that is, if it is determined that the motion at the first step is performed, the CPU 40 informs the player of this with a motion completion sound in a step S47. More specifically, the CPU 40 generates audio data for outputting a motion completion sound on the basis of predetermined sound data by utilizing the DSP 42c, etc., and outputs the sound from the speaker 34a via the AV IC 56, etc. Thus, the player is easily informed that the motion at the first step is successful.
In a succeeding step S49, the CPU 40 detects an elapsed time T1 from the start of the movement of the instruction panel 400 to the notification with the motion completion sound, that is, detects the elapsed time T1 from when the motion instruction is given to when it is determined that the motion is performed on the basis of the value of the time counter in the memory area 520, and stores the same in the memory area 526.
Then, in a step S51, the CPU 40 determines whether or not the elapsed time T1 is equal to or more than Tp0 and equal to or less than Tp1. Tp0 and Tp1 are threshold values for the perfect judgment, and are stored in the memory area 532. If “YES” in the step S51, that is, if the elapsed time T1 is a value within the perfect judgment area, the CPU 40 executes perfect success processing in a step S53. More specifically, a score higher than that in the OK judgment is given to the player, and is added to the score data of the player in the game result memory area 534. Furthermore, evaluation data indicating the perfect judgment as to the motion at the first step is also stored in the memory area 534. Additionally, the fact that the motion is perfect may be displayed on the screen by letters such as PERFECT.
On the other hand, if “NO” in the step S51, that is, if the motion is not performed at timing within the perfect judgment area, the CPU 40 executes OK success processing in a step S55. More specifically, a score lower than that in the perfect judgment is given to the player, and is added to the score data of the player in the game result memory area 534. Furthermore, evaluation data indicating the OK judgment as to the motion at the first step is also stored in the memory area 534. Additionally, the fact that the motion is OK may be displayed on the screen by letters such as OK.
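As a compact illustration of this branch (the concrete score values 200 and 100 are placeholders assumed for the sketch; the embodiment only states that the perfect score is higher than the OK score):

```python
def evaluate_step(elapsed, tp0, tp1, prev_perfect=True):
    """Return an evaluation ("perfect" or "OK") and a score for one step.

    elapsed: time from the motion instruction to the completion judgment.
    tp0, tp1: thresholds of the perfect judgment area (Tp0 <= elapsed <= Tp1).
    prev_perfect: whether the required earlier step was judged perfect
                  (always True for the first step).
    """
    if prev_perfect and tp0 <= elapsed <= tp1:
        return "perfect", 200   # placeholder score, higher than the OK score
    return "OK", 100            # placeholder score
```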
When the processing in the step S45, the step S53 or the step S55 is ended, the first step processing is ended, and the process proceeds to the second step processing in the step S9.
Additionally, the second step processing is executed after the predetermined time TA elapses from the start of the movement of the previous instruction panel 400.
When starting the second step processing, the CPU 40 starts movement processing of the instruction panels 400 in a step S71. The movement processing of the panels 400 is similar to that in the above-described step S31. Here, the panel 400 for instructing a motion at the second step is the panel 400b shown in
The processing in succeeding steps S73-S83 is executed at a set interval of time (one frame) until it is determined that the motion at the second step is performed on the basis of the load values in the step S81, or until it is determined that the motion at the second step is not performed within the time limit in the step S83.
In the step S73, the CPU 40 executes time counting processing similar to the above-described step S33. Furthermore, in the step S75, the CPU 40 executes load value fetching processing similar to the above-described step S35.
Then, in the step S77, the CPU 40 determines whether or not the load value is equal to or more than 95% of the body weight value. The judgment condition of the ratio of the load value to the body weight value with respect to the motion at the second step is set to equal to or more than 95% in advance. The motion at the second step is a motion of further putting the left foot on the load controller 36 from a state in which the right foot is already put thereon, and when the motion at the second step is completed, almost all of the player's weight is put on the load controller 36. If the sum of the detected loads is equal to or more than 95% of the body weight value, it can be determined that both feet are put on the load controller 36.
If “YES” in the step S77, the CPU 40 calculates a position of the center of gravity similar to the above-described step S39 in the step S79. Then, in the step S81, the CPU 40 determines whether or not the position of the center of gravity falls in the range of −0.7 to 0.7 as a judgment condition of the second step. By the motion at the second step, both of the feet are put on the load controller 36, so that the position of the center of gravity appears at approximately the center of the load controller 36. Accordingly, in view of the difference in the position of the center of gravity due to a habit for each player, etc., if the calculated position of the center of gravity falls in the range of −0.7 to 0.7, it can be determined that the motion at the second step is completed and both of the feet are put on the load controller 36.
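The second-step check follows the same pattern as the first-step sketch, with the thresholds quoted above (again illustrative only):

```python
def second_step_performed(a, b, c, d, body_weight):
    """Judge the motion at the second step: both feet put on the board.

    Condition 1: the summed load is at least 95% of the body weight value.
    Condition 2: the center of gravity XG falls in -0.7 to 0.7.
    """
    load_sum = a + b + c + d
    if load_sum < 0.95 * body_weight:
        return False                     # both feet not yet on the board
    xg = ((c + d) - (a + b)) / load_sum  # Equation 1 with m = 1/load_sum
    return -0.7 <= xg <= 0.7             # center of gravity near the middle
```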
If “NO” in the step S81, or if “NO” in the step S77, it can be determined that the motion at the second step is not performed. In these cases, the CPU 40 determines whether or not the predetermined time TA elapses from the start of the movement of the panel similar to the above-described step S43 in the step S83. If “NO” in the step S83, the process returns to the step S73.
On the other hand, if “YES” in the step S83, that is, if the time limit expires without execution of the motion at the second step being determined, the CPU 40 executes failure processing similar to the above-described step S45 in a step S85. Here, since the motion judgment is as to the second step, the evaluation data indicating a failure judgment as to the motion at the second step is stored in the game result memory area 534.
Furthermore, if “YES” in the step S81, that is, if it is determined that the motion at the second step is performed, the CPU 40 informs the player of this with a motion completion sound, similar to the above-described step S47, in a step S87.
In a succeeding step S89, the CPU 40 detects an elapsed time T2 from the start of the movement of the instruction panel 400 at the second step to the notification with the motion completion sound, that is, detects the elapsed time T2 from when the instruction of the motion at the second step is given to when it is determined that the motion is performed, on the basis of the value of the time counter of the memory area 520, and stores it in the memory area 526.
Then, in a step S91, the CPU 40 determines whether or not the elapsed time T2 is equal to or more than Tp0 and equal to or less than Tp1, and whether or not the determination result at the first step is perfect. In this embodiment, in order to obtain the perfect judgment at the second step, the perfect judgment must have been obtained at the first step, and in addition the timing when the motion at the second step is performed must be within the perfect judgment area. More specifically, it is determined whether or not the elapsed time T2 falls in the perfect judgment area. In addition, with reference to the evaluation data as to the first step stored in the game result memory area 534, it is determined whether or not the data indicates the perfect judgment.
If “YES” in the step S91, that is, if the perfect judgment is performed as to the second step, the CPU 40 executes perfect success processing similar to the above-described step S53 in a step S93. Here, since the motion judgment is as to the second step, the evaluation data indicating the perfect judgment as to the motion at the second step is stored in the game result memory area 534.
On the other hand, if “NO” in the step S91, the CPU 40 executes OK success processing similar to the above-described step S55 in a step S95. Here, since the motion judgment is as to the second step, the evaluation data indicating the OK judgment as to the motion at the second step is stored in the game result memory area 534.
After completion of the step S85, the step S93 or the step S95, the second step processing is ended, and the process proceeds to the third step processing in the step S11.
Additionally, the third step processing is executed after the predetermined time TA elapses from the start of the movement of the previous instruction panel 400 similar to the above-described second step processing.
When the third step processing is started, in a step S111, the CPU 40 starts movement processing of the instruction panels 400. The movement processing of the panels 400 is similar to that in the above-described step S31. Here, the panel 400 for instructing the motion at the third step is the panel 400c shown in
The processing in succeeding steps S113-S123 is executed at a set interval of time (one frame) until it is determined that the motion at the third step is performed on the basis of the load values in the step S121, or until it is determined that the motion at the third step is not performed within the time limit in the step S123.
In the step S113, the CPU 40 executes time counting processing similar to the above-described step S33. Furthermore, in the step S115, the CPU 40 executes load value fetching processing similar to the above-described step S35.
Then, in the step S117, the CPU 40 determines whether or not the load value is 25 to 75% of the body weight value, similar to the above-described step S37. The judgment condition of the ratio of the load value to the body weight value with respect to the third step is set to 25 to 75% in advance. The motion at the third step is a motion of putting the right foot down from the load controller 36 from a state in which both feet are put on the load controller 36. When the motion at the third step is completed, the right foot is put on the ground, so that only the load of the left foot is put on the load controller 36. Accordingly, if the sum of the detected loads is 25 to 75% of the body weight value, it can be determined that the right foot is put down from the load controller 36. Here, similar to the motion at the first step, the motion at the third step results in one foot being put on the load controller 36 while the other foot is put on the ground, so the condition of the ratio of the load value at the third step is the same as that of the above-described first step.
If “YES” in the step S117, the CPU 40 calculates a position of the center of gravity similar to the above-described step S39 in the step S119. Then, in the step S121, the CPU 40 determines whether or not the position of the center of gravity falls in the range of −1 to −0.01 as the judgment condition of the third step. By the motion at the third step, the right foot is put down from the load controller 36, and only the left foot remains on the load controller 36, so that the position of the center of gravity appears on the left side of the load controller 36 when viewed from the player. Accordingly, if the calculated position of the center of gravity falls in the range of −1 to −0.01, it can be determined that the motion at the third step is completed, and the right foot is put down.
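Since the judgments at the first to third steps differ only in their thresholds, the per-step checks sketched above can equivalently be driven by a small table. The following sketch simply gathers the ranges quoted in this embodiment (the fourth step, which uses no center-of-gravity range, is handled separately further below):

```python
# Ratio-of-load and center-of-gravity ranges for the first to third steps.
STEP_CONDITIONS = {
    1: {"ratio": (0.25, 0.75), "xg": (0.01, 1.0)},          # right foot put on
    2: {"ratio": (0.95, float("inf")), "xg": (-0.7, 0.7)},  # both feet put on
    3: {"ratio": (0.25, 0.75), "xg": (-1.0, -0.01)},        # right foot put down
}


def step_performed(step, a, b, c, d, body_weight):
    """Check the judgment condition of the first to third steps."""
    ratio_lo, ratio_hi = STEP_CONDITIONS[step]["ratio"]
    xg_lo, xg_hi = STEP_CONDITIONS[step]["xg"]
    load_sum = a + b + c + d
    if not (ratio_lo * body_weight <= load_sum <= ratio_hi * body_weight):
        return False
    xg = ((c + d) - (a + b)) / load_sum  # Equation 1 with m = 1/load_sum
    return xg_lo <= xg <= xg_hi
```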
If “NO” in the step S121, or if “NO” in the step S117, it can be determined that the motion at the third step is not performed. In these cases, the CPU 40 determines whether or not the predetermined time TA elapses from the start of the movement of the panel similar to the above-described step S43 in the step S123. If “NO” in the step S123, the process returns to the step S113.
On the other hand, if “YES” in the step S123, that is, if the time limit expires without execution of the motion at the third step being determined, the CPU 40 executes failure processing similar to the above-described step S45 in a step S125. Here, since the motion judgment is as to the third step, the evaluation data indicating a failure judgment as to the motion at the third step is stored in the game result memory area 534.
Furthermore, if “YES” in the step S121, that is, if it is determined that the motion at the third step is performed, the CPU 40 notifies the player of this with a motion completion sound, similar to the above-described step S47, in a step S127.
In a succeeding step S129, the CPU 40 detects an elapsed time T3 from the start of the movement of the instruction panel 400 at the third step to the notification with the motion completion sound, that is, detects the elapsed time T3 from when the instruction of the motion at the third step is given to when it is determined that the motion is performed on the basis of the value of the time counter of the memory area 520, and stores the same in the memory area 526. The judgment timing of the motion at the third step is utilized as a judgment timing of a motion at the fourth step.
Then, in a step S131, the CPU 40 determines whether or not the elapsed time T3 is equal to or more than Tp0 and equal to or less than Tp1, and whether or not the determination result at the first step is perfect similar to the above-described step S91. In this embodiment, in order to obtain the perfect judgment at the third step, it is necessary that the timing when the motion at the third step is performed is within the perfect judgment area, and the perfect judgment is obtained at the first step.
If “YES” in the step S131, that is, if the perfect judgment is performed with respect to the motion at the third step, the CPU 40 executes perfect success processing similar to the above-described step S53 in a step S133. Here, since the motion judgment is as to the third step, the evaluation data indicating the perfect judgment as to the motion at the third step is stored in the game result memory area 534.
On the other hand, if “NO” in the step S131, the CPU 40 executes OK success processing similar to the above-described step S55 in a step S135. Here, since the motion judgment is as to the third step, the evaluation data indicating the OK judgment as to the motion at the third step is stored in the game result memory area 534.
After completion of the step S125, the step S133 or the step S135, the third step processing is ended, and the process proceeds to the fourth step processing in the step S13.
Additionally, the fourth step processing is executed after the predetermined time TA elapses from the start of the movement of the previous instruction panel 400 similar to the above-described second step processing and third step processing.
When the fourth step processing is started, the CPU 40 starts movement processing of the instruction panels 400 in a step S151. The movement processing of the panels 400 is similar to that in the above-described step S31. Here, the panel 400 for instructing the motion at the fourth step is the panel 400d shown in
The processing in succeeding steps S153-S167 is executed at a set interval of time (one frame) until it is determined that the motion at the fourth step is performed on the basis of the load value in the step S165, or until it is determined that the motion at the fourth step is not performed within the time limit in the step S167.
In the step S153, the CPU 40 executes time counting processing similar to the above-described step S33. Then, in the step S155, the CPU 40 detects an elapsed time T4 from the start of the movement of the instruction panel 400 at the fourth step on the basis of the value of the time counter in the memory area 520 and stores the same in the memory area 526. The judgment timing of the motion at the fourth step is decided on the basis of the elapsed time T4 described above.
In the succeeding step S157, the CPU 40 determines whether or not the elapsed time T3 is detected with reference to the memory area 526. If “YES” in the step S157, that is, if it is determined that the motion at the third step is performed, the judgment timing of the motion at the third step is utilized as the judgment timing of the motion at the fourth step. That is, in the step S159, it is determined whether or not the elapsed time T4 is equal to or more than the elapsed time T3.
On the other hand, if “NO” in the step S157, that is, if it is not determined that the motion at the third step is performed, it is impossible to decide the judgment timing of the motion at the fourth step on the basis of the judgment timing of the motion at the third step. Thus, as described above, by utilizing the panel stopping time PS set to a timing suitable for the motion of stepping up and down, it is determined whether or not the judgment timing of the motion at the fourth step has come. That is, in the step S161, it is determined whether or not the elapsed time T4 is equal to or more than the panel stopping time PS. If “NO” in the step S161, since the judgment timing of the motion at the fourth step has not come, the process returns to the step S153.
Furthermore, if “YES” in the step S159 or if “YES” in the step S161, that is, if the judgment timing of the motion at the fourth step has come, the CPU 40 executes load value fetching processing similar to the above-described step S35 in the step S163.
Then, in the step S165, the CPU 40 determines whether or not the load value is equal to or less than 5% of the body weight value similar to the above-described step S37. A judgment condition of a ratio of the load value to the body weight value with respect to the fourth step is set to be equal to or less than 5% in advance. The motion at the fourth step is a motion of putting the left foot down from the load controller 36. When the motion at the fourth step is completed, both of the feet are put on the ground, so that the load put on the load controller 36 is substantially zero. Thus, when it is determined that the judgment timing of the motion at the fourth step has come on the basis of the elapsed time, if the sum of the detected loads is equal to or less than 5% of the body weight value, it can be determined that the left foot is put down from the load controller 36.
Here, since the motion at the fourth step is performed to bring about a state in which neither foot of the player is put on the load controller 36, in the judgment of the motion at the fourth step, only the condition of the ratio of the load value to the body weight value is taken into account, without considering the position of the center of gravity.
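A sketch of the fourth-step check under the threshold quoted above (illustrative only); unlike the first to third steps, no center-of-gravity range is examined:

```python
def fourth_step_performed(a, b, c, d, body_weight):
    """Judge the motion at the fourth step: both feet put down on the ground.

    Only the ratio condition is checked; the position of the center of
    gravity is not used, since no foot remains on the load controller.
    """
    load_sum = a + b + c + d
    return load_sum <= 0.05 * body_weight  # at most 5% of the body weight
```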
On the other hand, if “NO” in the step S165, it can be determined that the motion at the fourth step is not performed. Furthermore, if “NO” in the step S159, since the elapsed time T4 has not reached the judgment timing of the motion, the motion judgment on the basis of the load value is not performed. In these cases, the CPU 40 determines whether or not the predetermined time TA elapses from the start of the movement of the panel similar to the above-described step S43 in the step S167. That is, it is determined whether or not the elapsed time T4 is equal to or more than the predetermined time TA. If “NO” in the step S167, that is, if the elapsed time T4 is within the time limit of the motion at the fourth step, the process returns to the step S153.
On the other hand, if “YES” in the step S167, that is, if the time limit expires without execution of the motion at the fourth step being determined, the CPU 40 executes failure processing similar to the above-described step S45 in a step S169. Here, since the motion judgment is as to the fourth step, the evaluation data indicating the failure judgment as to the motion at the fourth step is stored in the game result memory area 534.
Furthermore, if “YES” in the step S165, that is, if it is determined that the motion at the fourth step is performed, the CPU 40 notifies this with a motion completion sound similar to the above-described step S47 in a step S171.
In a succeeding step S173, the CPU 40 determines whether or not the determination result at the third step is perfect on the basis of the evaluation data at the third step in the game result memory area 534. In this embodiment, in order to obtain the perfect judgment at the fourth step, it is necessary to obtain the perfect judgment at the third step. Since the judgment timing of the motion at the fourth step is decided on the basis of the judgment timing of the motion at the third step or the panel stopping time PS, it is not determined whether or not the timing when the motion at the fourth step is performed is within the perfect judgment area.
If “YES” in the step S173, that is, if the perfect judgment is performed as to the motion at the fourth step, the CPU 40 executes perfect success processing similar to the above-described step S53 in a step S175. Here, since the motion judgment is as to the fourth step, the evaluation data indicating the perfect judgment as to the motion at the fourth step is stored in the game result memory area 534.
On the other hand, if “NO” in the step S173, the CPU 40 executes OK success processing similar to the above-described step S55 in a step S177. Here, since the motion judgment is as to the fourth step, the evaluation data indicating the OK judgment as to the motion at the fourth step is stored in the game result memory area 534.
After completion of the step S169, the step S175 or the step S177, the fourth step processing is ended, and the process proceeds to the step S15 in
According to this embodiment, since the judgment timing of the motion at the fourth step of putting the left foot down from the load controller 36 to bring about a state that both of the feet are not put on is decided on the basis of the elapsed time from when the instruction of the motion at the fourth step is given, it is possible to suitably decide the judgment timing of the motion by the player, and thus determine whether or not the motion is performed.
Furthermore, since the judgment timing of the motion at the fourth step is decided on the basis of the judgment timing of the motion at the third step, which merely puts one foot down from the load controller 36 from a state in which both feet are put thereon, it is possible to make a proper judgment with simple processing.
In addition, since, in a case that it is not determined that the motion at the third step is performed, the judgment timing of the motion at the fourth step is decided on the basis of the panel stopping time PS suitably set for the motion, it is possible to perform an appropriate judgment on the basis of the suitable time set in advance.
Additionally, in the above-described embodiment, since a motion at the fourth step is a motion of putting both of the feet down from the controller, the fact that the load values detected at the judgment timing at the fourth step are approximately zero is a condition for deciding that the fourth step is successful. However, in another embodiment, whether or not a load is put on even once during the judgment of the first to third steps may be decided as a condition for a success of the fourth step.
If “YES” in the step S165 in this case, the CPU 40 further determines, in a step S201, whether or not a load has been put on the load controller 36 even once during the judgment of the first to third steps. If “YES” in the step S201, the motion at the fourth step is regarded as successful, and the process proceeds to the step S171 described above.
On the other hand, if “NO” in the step S201, it is regarded that the player did not ride on the load controller 36 during the judgment of the first to third steps, the motion at the fourth step is regarded as unsuccessful, and the process proceeds to the step S169.
According to the embodiment in which this additional condition is imposed, it is possible to prevent the motion at the fourth step from being determined as successful when the player has not ridden on the load controller 36 at all during the first to third steps.
Furthermore, in each of the above-described embodiments, the judgment timing of the motion at the fourth step is decided on the basis of the judgment timing (elapsed time T3) of the motion at the third step. However, in another embodiment, an average value of the judgment timings (T1, T2 and T3) of the respective motions at the first to third steps may be calculated, and the judgment timing of the motion at the fourth step may be decided on the basis of the average value. Alternatively, the judgment timings T3 at the third step detected in the past, that is, a history of the judgment timing T3 at the third step, may be stored, and the judgment timing of the motion at the fourth step may be decided on the basis of the average value of the past judgment timings at the third step. If so, it is possible to accurately decide the judgment timing at the fourth step. In addition, it is possible to solve the problem that the judgment timing at the fourth step cannot be accurately decided when it is not determined that the motion at the third step directly before is performed.
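Either alternative reduces to averaging previously measured judgment timings, as in the following sketch (the names and the fallback argument are illustrative assumptions):

```python
def fourth_step_timing_from_history(timings, fallback):
    """Decide the fourth-step judgment timing from earlier judgment timings.

    timings: for example [T1, T2, T3] of the current round, or the stored
             history of third-step judgment timings T3 from past rounds.
    fallback: value used when no timing is available (for example the panel
              stopping time PS).
    """
    if not timings:
        return fallback
    return sum(timings) / len(timings)  # average of the available timings
```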
Furthermore, in each of the above-described embodiments, a motion of the step-up-and-down exercise is determined, but a motion instructed to the player can arbitrarily be changed. For example, as shown in
At a judgment of the motion at the second step, in the step S77 of the second step processing shown in
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.
Number | Name | Date | Kind |
---|---|---|---|
588172 | Peters | Aug 1897 | A |
688076 | Ensign | Dec 1901 | A |
D188376 | Hotkins et al. | Jul 1960 | S |
3184962 | Gay | May 1965 | A |
3217536 | Motsinger et al. | Nov 1965 | A |
3424005 | Brown | Jan 1969 | A |
3428312 | Machen | Feb 1969 | A |
3712294 | Muller | Jan 1973 | A |
3752144 | Weigle, Jr. | Aug 1973 | A |
3780817 | Videon | Dec 1973 | A |
3826145 | McFarland | Jul 1974 | A |
3869007 | Haggstrom et al. | Mar 1975 | A |
4058178 | Shinohara et al. | Nov 1977 | A |
4104119 | Schilling | Aug 1978 | A |
4136682 | Pedotti | Jan 1979 | A |
4246783 | Steven et al. | Jan 1981 | A |
4296931 | Yokoi | Oct 1981 | A |
4337050 | Engalitcheff, Jr. | Jun 1982 | A |
4404854 | Krempl et al. | Sep 1983 | A |
4488017 | Lee | Dec 1984 | A |
4494754 | Wagner, Jr. | Jan 1985 | A |
4558757 | Mori et al. | Dec 1985 | A |
4569519 | Mattox et al. | Feb 1986 | A |
4574899 | Griffin | Mar 1986 | A |
4577868 | Kiyonaga | Mar 1986 | A |
4598717 | Pedotti | Jul 1986 | A |
4607841 | Gala | Aug 1986 | A |
4630817 | Buckley | Dec 1986 | A |
4660828 | Weiss | Apr 1987 | A |
4680577 | Straayer et al. | Jul 1987 | A |
4688444 | Nordstrom | Aug 1987 | A |
4691694 | Boyd et al. | Sep 1987 | A |
4711447 | Mansfield | Dec 1987 | A |
4726435 | Kitagawa et al. | Feb 1988 | A |
4739848 | Tulloch | Apr 1988 | A |
4742832 | Kauffmann et al. | May 1988 | A |
4742932 | Pedragosa | May 1988 | A |
4800973 | Angel | Jan 1989 | A |
4838173 | Schroeder et al. | Jun 1989 | A |
4855704 | Betz | Aug 1989 | A |
4880069 | Bradley | Nov 1989 | A |
4882677 | Curran | Nov 1989 | A |
4893514 | Gronert et al. | Jan 1990 | A |
4907797 | Gezari et al. | Mar 1990 | A |
4927138 | Ferrari | May 1990 | A |
4970486 | Gray et al. | Nov 1990 | A |
4982613 | Becker | Jan 1991 | A |
D318073 | Jang | Jul 1991 | S |
5044956 | Behensky et al. | Sep 1991 | A |
5049079 | Furtado et al. | Sep 1991 | A |
5052406 | Nashner | Oct 1991 | A |
5054771 | Mansfield | Oct 1991 | A |
5065631 | Ashpitel et al. | Nov 1991 | A |
5076584 | Openiano | Dec 1991 | A |
5089960 | Sweeney, Jr. | Feb 1992 | A |
5103207 | Kerr et al. | Apr 1992 | A |
5104119 | Lynch | Apr 1992 | A |
5116296 | Watkins et al. | May 1992 | A |
5118112 | Bregman et al. | Jun 1992 | A |
5151071 | Jain et al. | Sep 1992 | A |
5195746 | Boyd et al. | Mar 1993 | A |
5197003 | Moncrief et al. | Mar 1993 | A |
5199875 | Trumbull | Apr 1993 | A |
5203563 | Loper, III | Apr 1993 | A |
5207426 | Inoue et al. | May 1993 | A |
5259252 | Kruse et al. | Nov 1993 | A |
5269318 | Nashner | Dec 1993 | A |
5299810 | Pierce et al. | Apr 1994 | A |
5303715 | Nashner et al. | Apr 1994 | A |
5360383 | Boren | Nov 1994 | A |
5362298 | Brown et al. | Nov 1994 | A |
5368546 | Stark et al. | Nov 1994 | A |
5405152 | Katanics et al. | Apr 1995 | A |
5431569 | Simpkins et al. | Jul 1995 | A |
5462503 | Benjamin et al. | Oct 1995 | A |
5466200 | Ulrich et al. | Nov 1995 | A |
5469740 | French et al. | Nov 1995 | A |
5474087 | Nashner | Dec 1995 | A |
5476103 | Nahsner | Dec 1995 | A |
5507708 | Ma | Apr 1996 | A |
5541621 | Nmngani | Jul 1996 | A |
5541622 | Engle et al. | Jul 1996 | A |
5547439 | Rawls et al. | Aug 1996 | A |
5551445 | Nashner | Sep 1996 | A |
5551693 | Goto et al. | Sep 1996 | A |
5577981 | Jarvik | Nov 1996 | A |
D376826 | Ashida | Dec 1996 | S |
5584700 | Feldman et al. | Dec 1996 | A |
5584779 | Knecht et al. | Dec 1996 | A |
5591104 | Andrus et al. | Jan 1997 | A |
5613690 | McShane et al. | Mar 1997 | A |
5623944 | Nashner | Apr 1997 | A |
5627327 | Zanakis | May 1997 | A |
D384115 | Wilkinson et al. | Sep 1997 | S |
5669773 | Gluck | Sep 1997 | A |
5689285 | Asher | Nov 1997 | A |
5690582 | Ulrich et al. | Nov 1997 | A |
5697791 | Nashner et al. | Dec 1997 | A |
5713794 | Shimojima et al. | Feb 1998 | A |
5721566 | Rosenberg et al. | Feb 1998 | A |
5746684 | Jordan | May 1998 | A |
5785630 | Bobick et al. | Jul 1998 | A |
D397164 | Goto | Aug 1998 | S |
5788618 | Joutras | Aug 1998 | A |
5792031 | Alton | Aug 1998 | A |
5800314 | Sakakibara et al. | Sep 1998 | A |
5805138 | Brawne et al. | Sep 1998 | A |
5813958 | Tomita | Sep 1998 | A |
5814740 | Cook et al. | Sep 1998 | A |
5820462 | Yokoi et al. | Oct 1998 | A |
5825308 | Rosenberg | Oct 1998 | A |
5837952 | Oshiro et al. | Nov 1998 | A |
D402317 | Goto | Dec 1998 | S |
5846086 | Bizzi et al. | Dec 1998 | A |
5853326 | Goto et al. | Dec 1998 | A |
5854622 | Brannon | Dec 1998 | A |
5860861 | Lipps et al. | Jan 1999 | A |
5864333 | O'Heir | Jan 1999 | A |
5872438 | Roston | Feb 1999 | A |
5886302 | Germanton et al. | Mar 1999 | A |
5888172 | Andrus et al. | Mar 1999 | A |
5889507 | Engle et al. | Mar 1999 | A |
D407758 | Isetani et al. | Apr 1999 | S |
5890995 | Bobick et al. | Apr 1999 | A |
5897457 | Mackovjak | Apr 1999 | A |
5897469 | Yalch | Apr 1999 | A |
5901612 | Letovsky | May 1999 | A |
5902214 | Makikawa et al. | May 1999 | A |
5904639 | Smyser et al. | May 1999 | A |
D411258 | Isetani et al. | Jun 1999 | S |
5912659 | Rutledge et al. | Jun 1999 | A |
5919092 | Yokoi et al. | Jul 1999 | A |
5921780 | Myers | Jul 1999 | A |
5921899 | Rose | Jul 1999 | A |
5929782 | Stark et al. | Jul 1999 | A |
5947824 | Minami et al. | Sep 1999 | A |
5976063 | Joutras et al. | Nov 1999 | A |
5980256 | Carmein | Nov 1999 | A |
5980429 | Nashner | Nov 1999 | A |
5984785 | Takeda et al. | Nov 1999 | A |
5987982 | Wenman et al. | Nov 1999 | A |
5989157 | Walton | Nov 1999 | A |
5993356 | Houston et al. | Nov 1999 | A |
5997439 | Ohsuga et al. | Dec 1999 | A |
6001015 | Nishiumi et al. | Dec 1999 | A |
6007428 | Nishiumi et al. | Dec 1999 | A |
6010465 | Nashner | Jan 2000 | A |
D421070 | Jang et al. | Feb 2000 | S |
6037927 | Rosenberg | Mar 2000 | A |
6038488 | Barnes et al. | Mar 2000 | A |
6044772 | Gaudette et al. | Apr 2000 | A |
6063046 | Allum | May 2000 | A |
6086518 | MacCready, Jr. | Jul 2000 | A |
6102803 | Takeda et al. | Aug 2000 | A |
6102832 | Tani | Aug 2000 | A |
D431051 | Goto | Sep 2000 | S |
6113237 | Ober et al. | Sep 2000 | A |
6147674 | Rosenberg et al. | Nov 2000 | A |
6152564 | Ober et al. | Nov 2000 | A |
D434769 | Goto | Dec 2000 | S |
D434770 | Goto | Dec 2000 | S |
6155926 | Miyamoto et al. | Dec 2000 | A |
6162189 | Girone et al. | Dec 2000 | A |
6167299 | Galchenkov et al. | Dec 2000 | A |
6190287 | Nashner | Feb 2001 | B1 |
6200253 | Nishiumi et al. | Mar 2001 | B1 |
6203432 | Roberts et al. | Mar 2001 | B1 |
6216542 | Stockli et al. | Apr 2001 | B1 |
6216547 | Lehtovaara | Apr 2001 | B1 |
6220865 | Macri et al. | Apr 2001 | B1 |
D441369 | Goto | May 2001 | S |
6225977 | Li | May 2001 | B1 |
6227968 | Suzuki et al. | May 2001 | B1 |
6228000 | Jones | May 2001 | B1 |
6231444 | Goto | May 2001 | B1 |
6239806 | Nishiumi et al. | May 2001 | B1 |
6241611 | Takeda et al. | Jun 2001 | B1 |
6244987 | Ohsuga et al. | Jun 2001 | B1 |
D444469 | Goto | Jul 2001 | S |
6264558 | Nishiumi et al. | Jul 2001 | B1 |
6280361 | Harvey et al. | Aug 2001 | B1 |
D447968 | Pagnacco et al. | Sep 2001 | S |
6295878 | Berme | Oct 2001 | B1 |
6296595 | Stark et al. | Oct 2001 | B1 |
6325718 | Nishiumi et al. | Dec 2001 | B1 |
6330837 | Charles et al. | Dec 2001 | B1 |
6336891 | Fedrigon et al. | Jan 2002 | B1 |
6353427 | Rosenberg | Mar 2002 | B1 |
6354155 | Berme | Mar 2002 | B1 |
6357827 | Brightbill et al. | Mar 2002 | B1 |
6359613 | Poole | Mar 2002 | B1 |
D456410 | Ashida | Apr 2002 | S |
D456854 | Ashida | May 2002 | S |
D457570 | Brinson | May 2002 | S |
6387061 | Nitto | May 2002 | B1 |
6388655 | Leung | May 2002 | B1 |
6389883 | Berme et al. | May 2002 | B1 |
6394905 | Takeda et al. | May 2002 | B1 |
6402635 | Nesbit et al. | Jun 2002 | B1 |
D459727 | Ashida | Jul 2002 | S |
D460506 | Tamminga et al. | Jul 2002 | S |
6421056 | Nishiumi et al. | Jul 2002 | B1 |
6436058 | Krahner et al. | Aug 2002 | B1 |
D462683 | Ashida | Sep 2002 | S |
6450886 | Oishi et al. | Sep 2002 | B1 |
6454679 | Radow | Sep 2002 | B1 |
6461297 | Pagnacco et al. | Oct 2002 | B1 |
6470302 | Cunningham et al. | Oct 2002 | B1 |
6482010 | Marcus et al. | Nov 2002 | B1 |
6510749 | Pagnacco et al. | Jan 2003 | B1 |
6514145 | Kawabata et al. | Feb 2003 | B1 |
6515593 | Stark et al. | Feb 2003 | B1 |
6516221 | Hirouchi et al. | Feb 2003 | B1 |
D471594 | Nojo | Mar 2003 | S |
6543769 | Podoloff et al. | Apr 2003 | B1 |
6563059 | Lee | May 2003 | B2 |
6568334 | Gaudette et al. | May 2003 | B1 |
6616579 | Reinbold et al. | Sep 2003 | B1 |
6624802 | Klein et al. | Sep 2003 | B1 |
6632158 | Nashner | Oct 2003 | B1 |
6636161 | Rosenberg | Oct 2003 | B2 |
6636197 | Goldenberg et al. | Oct 2003 | B1 |
6638175 | Lee et al. | Oct 2003 | B2 |
6663058 | Peterson et al. | Dec 2003 | B1 |
6676520 | Nishiumi et al. | Jan 2004 | B2 |
6676569 | Radow | Jan 2004 | B1 |
6679776 | Nishiumi et al. | Jan 2004 | B1 |
6685480 | Nishimoto et al. | Feb 2004 | B2 |
6695694 | Ishikawa et al. | Feb 2004 | B2 |
6697049 | Lu | Feb 2004 | B2 |
6719667 | Wong et al. | Apr 2004 | B2 |
6726566 | Komata | Apr 2004 | B2 |
6764429 | Michalow | Jul 2004 | B1 |
6797894 | Montagnino et al. | Sep 2004 | B2 |
6811489 | Shimizu et al. | Nov 2004 | B1 |
6813966 | Dukart | Nov 2004 | B2 |
6817973 | Merril et al. | Nov 2004 | B2 |
D500100 | Van Der Meer | Dec 2004 | S |
6846270 | Etnyre | Jan 2005 | B1 |
6859198 | Onodera et al. | Feb 2005 | B2 |
6872139 | Sato et al. | Mar 2005 | B2 |
6872187 | Stark et al. | Mar 2005 | B1 |
6888076 | Hetherington | May 2005 | B2 |
6913559 | Smith | Jul 2005 | B2 |
6936016 | Berme et al. | Aug 2005 | B2 |
D510391 | Merril et al. | Oct 2005 | S |
6975302 | Ausbeck, Jr. | Dec 2005 | B1 |
6978684 | Nurse | Dec 2005 | B2 |
6991483 | Milan et al. | Jan 2006 | B1 |
D514627 | Merril et al. | Feb 2006 | S |
7004787 | Milan | Feb 2006 | B2 |
D517124 | Merril et al. | Mar 2006 | S |
7011605 | Shields | Mar 2006 | B2 |
7033176 | Feldman et al. | Apr 2006 | B2 |
7038855 | French et al. | May 2006 | B2 |
7040986 | Koshima et al. | May 2006 | B2 |
7070542 | Reyes et al. | Jul 2006 | B2 |
7083546 | Zillig et al. | Aug 2006 | B2 |
7100439 | Carlucci | Sep 2006 | B2 |
7121982 | Feldman | Oct 2006 | B2 |
7126584 | Nishiumi et al. | Oct 2006 | B1 |
7127376 | Nashner | Oct 2006 | B2 |
7163516 | Pagnacco et al. | Jan 2007 | B1 |
7179234 | Nashner | Feb 2007 | B2 |
7195355 | Nashner | Mar 2007 | B2 |
7202424 | Carlucci | Apr 2007 | B2 |
7202851 | Cunningham et al. | Apr 2007 | B2 |
7270630 | Patterson | Sep 2007 | B1 |
7307619 | Cunningham et al. | Dec 2007 | B2 |
7308831 | Cunningham et al. | Dec 2007 | B2 |
7331226 | Feldman et al. | Feb 2008 | B2 |
7335134 | Lavelle | Feb 2008 | B1 |
RE40427 | Nashner | Jul 2008 | E |
7416537 | Stark et al. | Aug 2008 | B1 |
7530929 | Feldman et al. | May 2009 | B2 |
7722501 | Nicolas et al. | May 2010 | B2 |
7938751 | Nicolas et al. | May 2011 | B2 |
8075449 | Lee | Dec 2011 | B2 |
20010001303 | Ohsuga et al. | May 2001 | A1 |
20010007825 | Harada et al. | Jul 2001 | A1 |
20010018363 | Goto et al. | Aug 2001 | A1 |
20010050683 | Ishikawa et al. | Dec 2001 | A1 |
20020055422 | Airmet et al. | May 2002 | A1 |
20020080115 | Onodera et al. | Jun 2002 | A1 |
20020185041 | Herbst | Dec 2002 | A1 |
20030054327 | Evensen | Mar 2003 | A1 |
20030069108 | Kaiserman et al. | Apr 2003 | A1 |
20030107502 | Alexander | Jun 2003 | A1 |
20030176770 | Merril et al. | Sep 2003 | A1 |
20030193416 | Ogata et al. | Oct 2003 | A1 |
20040038786 | Kuo et al. | Feb 2004 | A1 |
20040041787 | Graves | Mar 2004 | A1 |
20040077464 | Feldman et al. | Apr 2004 | A1 |
20040099513 | Hetherington | May 2004 | A1 |
20040110602 | Feldman | Jun 2004 | A1 |
20040127337 | Nashner | Jul 2004 | A1 |
20040147317 | Ito et al. | Jul 2004 | A1 |
20040163855 | Carlucci | Aug 2004 | A1 |
20040180719 | Feldman et al. | Sep 2004 | A1 |
20040259688 | Stabile | Dec 2004 | A1 |
20050070154 | Milan | Mar 2005 | A1 |
20050076161 | Albanna et al. | Apr 2005 | A1 |
20050130742 | Feldman et al. | Jun 2005 | A1 |
20050202384 | DiCuccio et al. | Sep 2005 | A1 |
20060097453 | Feldman et al. | May 2006 | A1 |
20060161045 | Merril et al. | Jul 2006 | A1 |
20060205565 | Feldman et al. | Sep 2006 | A1 |
20060211543 | Feldman et al. | Sep 2006 | A1 |
20060217233 | Lee | Sep 2006 | A1 |
20060217243 | Feldman et al. | Sep 2006 | A1 |
20060223634 | Feldman et al. | Oct 2006 | A1 |
20060258512 | Nicolas et al. | Nov 2006 | A1 |
20070021279 | Jones | Jan 2007 | A1 |
20070027369 | Pagnacco et al. | Feb 2007 | A1 |
20070155589 | Feldman et al. | Jul 2007 | A1 |
20070219050 | Merril | Sep 2007 | A1 |
20080012826 | Cunningham et al. | Jan 2008 | A1 |
20080228110 | Berme | Sep 2008 | A1 |
20080261696 | Yamazaki et al. | Oct 2008 | A1 |
20090093315 | Matsunaga et al. | Apr 2009 | A1 |
Number | Date | Country |
---|---|---|
40 04 554 | Aug 1991 | DE |
195 02 918 | Aug 1996 | DE |
297 12 785 | Jan 1998 | DE |
20 2004 021 792 | May 2011 | DE |
20 2004 021 793 | May 2011 | DE |
0 275 665 | Jul 1988 | EP |
0 299 738 | Jan 1989 | EP |
0 335 045 | Oct 1989 | EP |
0 519 836 | Dec 1992 | EP |
1 043 746 | Oct 2000 | EP |
1 120 083 | Aug 2001 | EP |
1 127 599 | Aug 2001 | EP |
1 870 141 | Dec 2007 | EP |
2 472 929 | Jul 1981 | FR |
2 587 611 | Mar 1987 | FR |
2 604 910 | Apr 1988 | FR |
2 647 331 | Nov 1990 | FR |
2 792 182 | Oct 2000 | FR |
2 801 490 | Jun 2001 | FR |
2 811 753 | Jan 2002 | FR |
2 906 365 | Mar 2008 | FR |
1 209 954 | Oct 1970 | GB |
2 288 550 | Oct 1995 | GB |
44-23551 | Oct 1969 | JP |
55-95758 | Dec 1978 | JP |
54-73689 | Jun 1979 | JP |
55-113472 | Sep 1980 | JP |
55-113473 | Sep 1980 | JP |
55-125369 | Sep 1980 | JP |
55-149822 | Nov 1980 | JP |
55-152431 | Nov 1980 | JP |
60-79460 | Jun 1985 | JP |
60-153159 | Oct 1985 | JP |
61-154689 | Jul 1986 | JP |
62-34016 | Feb 1987 | JP |
62-034016 | Feb 1987 | JP |
63-158311 | Oct 1988 | JP |
63-163855 | Oct 1988 | JP |
63-193003 | Dec 1988 | JP |
02-102651 | Apr 1990 | JP |
2-238327 | Sep 1990 | JP |
3-25325 | Feb 1991 | JP |
3-103272 | Apr 1991 | JP |
03-107959 | Nov 1991 | JP |
6-063198 | Mar 1994 | JP |
6-282373 | Oct 1994 | JP |
7-213741 | Aug 1995 | JP |
7-213745 | Aug 1995 | JP |
07-239957 | Sep 1995 | JP |
7-241281 | Sep 1995 | JP |
7-241282 | Sep 1995 | JP |
7-275307 | Oct 1995 | JP |
7-302161 | Nov 1995 | JP |
8-43182 | Feb 1996 | JP |
08-131594 | May 1996 | JP |
8-182774 | Jul 1996 | JP |
08-182774 | Jul 1996 | JP |
08-184474 | Jul 1996 | JP |
8-184474 | Jul 1996 | JP |
8-215176 | Aug 1996 | JP |
08-244691 | Sep 1996 | JP |
2576247 | Jan 1997 | JP |
9-120464 | May 1997 | JP |
9-168529 | Jun 1997 | JP |
9-197951 | Jul 1997 | JP |
9-305099 | Nov 1997 | JP |
11-309270 | Nov 1999 | JP |
2000-146679 | May 2000 | JP |
U3068681 | May 2000 | JP |
U3069287 | Jun 2000 | JP |
2000-254348 | Sep 2000 | JP |
3172738 | Jun 2001 | JP |
2001-178845 | Jul 2001 | JP |
2001-178965 | Jul 2001 | JP |
2001-286451 | Oct 2001 | JP |
2002-017934 | Jan 2002 | JP |
2002-112984 | Apr 2002 | JP |
2002-157081 | May 2002 | JP |
2002-253534 | Sep 2002 | JP |
2003-79599 | Mar 2003 | JP |
2003-235834 | Aug 2003 | JP |
2004-216142 | Aug 2004 | JP |
2005-168963 | Jun 2005 | JP |
3722678 | Nov 2005 | JP |
2005-334083 | Dec 2005 | JP |
2006-110211 | Apr 2006 | JP |
3773455 | May 2006 | JP |
2006-167094 | Jun 2006 | JP |
3818488 | Sep 2006 | JP |
2006-284539 | Oct 2006 | JP |
U3128216 | Dec 2006 | JP |
2008-49117 | Mar 2008 | JP |
WO 9111221 | Aug 1991 | WO |
WO 9212768 | Aug 1992 | WO |
WO 9840843 | Sep 1998 | WO |
WO 0012041 | Mar 2000 | WO |
WO 0057387 | Sep 2000 | WO |
WO 0069523 | Nov 2000 | WO |
WO 0229375 | Apr 2002 | WO |
WO 02057885 | Jul 2002 | WO |
WO 2004051201 | Jun 2004 | WO |
WO 2004053629 | Jun 2004 | WO |
WO 2005043322 | May 2005 | WO |
WO 2008099582 | Aug 2008 | WO |
Entry |
---|
Interface, Inc.—Advanced Force Measurement—SM Calibration Certificate Installation Information, 1984. |
Hugh Stewart, “Isometric Joystick: A Study of Control by Adolescents and Young Adults with Cerebral Palsy,” The Australian Occupational Therapy Journal, Mar. 1992, vol. 39, No. 1, pp. 33-39. |
Raghavendra S, Rao, et al., “Evaluation of an Isometric and a Position Joystick in a Target Acquisition Task for Individuals with Cerebral Palsy,” IEEE Transactions on Rehabilitation Engineering, vol. 8, No. 1, Mar. 2000, pp. 118-125. |
D. Sengupta, et al., “Comparative Evaluation of Control Surfaces for Disabled Patients,”Proceedings of the 27th Annual Conference on Engineering in Medicine and Biology, vol. 16, Oct. 6-10, 1974, p. 356. |
Ian Bogost, “The Rhetoric of Exergaming,”The Georgia Institute of Technology, 9 pages (date unknown). |
Ludonauts, “Body Movin',” May 24, 2004, http://web.archive.org/web/20040611131903/http:/www.ludonauts.com; retrieved Aug. 31, 2010, 4 pages. |
Atari Gaming Headquarters—AGH's Atari Project Puffer Page, http://www.atarihq.com/othersec/puffer/index.html, retrieved Sep. 19, 2002, 4 pages. |
Michael Antonoff, “Real estate is cheap here, but the places you'd most want to visit are still under construction,” Popular Science, Jun. 1993, pp. 33-34. |
Steve Aukstakalnis and David Blatner, “The Art and Science of Virtual Reality—Silicon Mirage,” 1992, pp. 197-207. |
Electronics, edited by Michael Antonoff, “Video Gaines—Virtual Violence: Boxing Without Bruises,” Popular Science, Apr. 1993, p. 60. |
Stuart F. Brown, “Video cycle race,” Popular Science, May 1989, p. 73. |
Scanning the Field for Ideas, “Chair puts Player on the Joystick,” Machine Design, No. 21, Oct. 24, 1991, XP 000255214, 1 page. |
Francis Hamit, “Virtual Reality and the Exploration of Cyberspace,” University of MD Baltimore County, 1993, 4 pages. |
Innovation in Action—Biofeed back Motor Control, Active Leg Press—IsoLegPress, 2 pages (date unknown). |
Ric Manning, “Videogame players get a workout with the Exertainment,” The Gizmo Page from the Courier Journal Sep. 25, 1994, 1 page. |
Tech Lines, Military—Arcade aces and Aviation—Winging it, Popular Mechanics, Mar. 1982, p. 163. |
Sarju Shah, “Mad Catz Universal MC2 Racing Wheel: Mad Catz MC2 Universal,” Game Spot, posted Feb. 18, 2005, 3 pages. |
Joe Skorupa, “Virtual Fitness,” Sports Science, Popular Mechanics, Oct. 1994, 3 pages. |
AGH Musuem—Suncom Aerobics Joystick; http://atarihq.com/museum/2678/hardware/aerobics.html, (retrieved date unknown) 1 page. |
Nintendo Zone—The History of Nintendo (1889-1997), retrieved Aug. 24, 1998 pp. 1, 9-10. |
The Legible City, Computergraphic Installation with Dirk Groeneveld, Manhattan version (1989), Amsterdam version (1990), Karlsruhe version (1991), 3 pages. |
The New Exertainment System. It's All About Giving Your Members Personal Choices, Life Fitness, Circle Reader Service Card No. 28, 1995, 1 page. |
The Race Begins with $85, Randal Windracer, Circle Reader Service Card No. 34, 1990, 1 page. |
Universal S-Video/Audio Cable; Product #5015, MSRP 9.99; http://www.madcatz.com/Defaultasp?Page=133&CategoryImg=Universa—Cables, retrieved May 12, 2005, 1 page. |
Tom Dang, et al., “Interactive Video Exercise System for Pediatric Brain Injury Rehabilitation,” Assistive Technology Research Center, Rehabilitation Engineering Service, National Rehabilitation Hospital, Proceedings of the RESNA 20th Annual Conference, Jun. 1998, 3 pages. |
Linda S. Miller, “Upper Limb Exerciser,” Biometrics Ltd—Unique Solutions for Clinical and Research Applications, 6 pages (date unknown). |
Raymond W. McGorry, “A system for the measurement of grip forces and applied moments during hand tool use,” Liberty Mutual Research Center for Safety and Health, Applied Ergonomics 32 (2001) 271-279. |
NordicTrack's Aerobic Cross Trainer advertisment as shown in “Big Ideas—For a Little Money: Great Places to Invest $1,000 or Less,” Kiplinger's Personal Finance Magazine, Jul. 1994, 3 pages. |
Maurice R. Masliah, “Measuring the Allocation of Control in 6 Degree of Freedom Human-Computer Interaction Tasks,” Graduate Department of Mechanical and Industrial Engineering, University of Toronto, 2001, 177 pages. |
Leigh Arm Roman, “Boing! Combines Arcade Fun with Physical Training,” Memphis—Health Care News: Monitoring the Pulse of Our Health Care Community, Sep. 20, 1996, One Section, 1 page. |
“No More Couch Potato Kids,” as shown in Orange Coast, Sep. 1994, p. 16. |
Gary L. Downey, et al., “Design of an Exercise Arcade for Children with Disabilities,” Resna, Jun. 26-30, 1998, pp. 405-407. |
Frank Serpas, et al., “Forward-dynamics Simulation of Anterior Crueiate Ligament Forces Developed During Isokinetic Dynamometry,” Computer Methods in Biomechanics and Biomedical Engineering, vol. 5 (1), 2002, pp. 33-43. |
Carolyn Cosmos, “An ‘Out of Wheelchair Experience’”, The Washington Post, May 2, 2000, 3 pages. |
“Look Ma! No Hands!”, The Joyboard—Power Body Control, (date unknown). |
David H. Ahl, “Controller update,” Creative Computing, vol. 9, No. 12, Dec. 1983, p. 142. |
Ian Bogost, “Water Cooler Games—The Prehistory of Wii Fit,” Videogame Theory, Criticism, Design, Jul. 15, 2007, 2 pages. |
Jeremy Reimer, “A history of the Amiga, part 2: The birth of Amiga,” last updated Aug. 12, 2007, 2 pages. |
The Amiga Joyboard (1982) image, Photos: Fun with plastic—peripherals that changed gaming; http://news.cnet.com/2300-27076—3-10001507-2.html (retrieved Jul. 23, 2010), 1 page. |
The Amiga Power System Joyboard, Amiga history guide, http://www.amigahistory.co.uk/joyboard.html (retrieved Jul. 23, 2010), 2 pages. |
“Joyboard,” Wikipedia—the free encyclopedia, http://en.wikipedia.org/wiki/Joyboard (retrieved Jul. 26, 2010), 2 pages. |
“Dance Dance Revolution,” Wikipedia—the free encyclopedia, http://en.wikipedia.org/wiki/Dance Dance Revolution (retrieved Jul. 23, 2010), 9 pages. |
“Cure for the couch potato,” Kansas City Star (MO), Jan. 2, 2005, WLNR 22811884, 1 page. |
JC Fletcher, “Virtually Overlooked: The Power Pad games,” Joystiq, http://www.joystiq.com/2007/09/20/virtually-overlooked-the-power-pad-games/ (retrieved Jul. 26, 2010), 3 pages. |
Family Fun Fitness, Nintendo Entertainment System, BANDAI, (date unknown). |
“Power Pad/Family Fun and Fitness/Family Trainer,” http://www.gamersgraveyard.com/repository/nes/peripherals/powerpad.html (retrieved Jul. 26, 2010), 2 pages. |
“Power Pad Information,” Version 1.0 (Sep. 23, 1999) http://www.gamersgraveyard.com/repository/nes/peripherals/powerpad.txt (retrieved Jul. 26, 2010), 2 pages. |
Wii+Power+-Pad.jpg (image), http://bpl.blogger.com/—J5LEiGp54I/RpZbNpnLDgl/AAAAAAAAAic/Gum6DD3Umjg/s1600-h/Wii+Power+Pad.jpg (retrieved Jul. 26, 2010), 1 page. |
Vs. Slalom—Videogame by Nintendo, KLOV—Killer List of Video Games, http://www.arcade-museum.com/game—detail.php?game—id=10368 (retrieved Jul. 26, 2010), 3 pages. |
“Nintendo Vs. System,” Wikipedia—the free encyclopedia, http://en.wikipedia.org/wiki/Nintendo—Vs.—System (retrieved Jul. 26, 2010), 3 pages. |
Vs. Slalom—Step Up to the Challenge, Nintendo, (date unknown). |
Vs. Slalom—Live the Thrill, Nintendo, (date unknown). |
Vs. Slalom—Operation Manual, MDS(MGS), Nintendo, 4 pages, (date unknown). |
HyperspaceArcade.com—Specialists in Arcade Video Game Repair and Restoration, http://www.hyperspacearcade.com/VSTypes.html (retrieved Jul. 3, 2010), 3 pages. |
Vs. Slalom—Attachment Pak Manual; for Installation in: VS. UniSystem (UPRIGHT) and VS. DualSystem (UPRIGHT), TM of Nintendo of America Inc., 1986, 15 pages. |
Leiterman, “Project Puffer: Jungle River Cruise,” Atari, Inc., 1982, 2 pages. |
Leiterman, “Project Puffer: Tumbleweeds,” Atari, Inc., 1982, 1 page. |
Jerry Smith, “Other Input Devices,” Human Interface Technology Laboratory, 2 pages, (date unknown). |
Trevor Meers, “Virtually There: VR Entertainment Transports Players to Entrancing New Worlds,” Smart Computing, vol. 4, Issue 11, Nov. 1993, 6 pages. |
“Dance Aerobics,” Moby Games, Feb. 12, 2008, 2 pages. |
“Hard Drivin',” KLOV—Killer List of Video Games, The International Arcade Museum, http://www.arcade-museum.com, 6 pages, (date unknown). |
“The World's First Authentic Driving Simulation Game!”, Hard Drivin'—Get Behind the Wheel and Feel the Thrill (image), Atari Games Corporation, 1 page, (date unknown). |
Electronic Entertainment Expo (E3) Overview, Giant Bomb—E3 2004 (video game concept), http://www.giantbomb.com/e3-2004/92-3436/ (retrieved Sep. 3, 2010), 3 pages. |
Guang Yang Amusement, Product Name: Live Boxer, 1 page, (date unknown). |
Family Fun Fitness: Basic Set (Control Mat and Athletic World Game Pak), Nintendo Entertainment System, Bandai, (date unknown). |
Roll & Rocker (image), 1 page, (date unknown). |
Roll & Rocker, Enteractive (image), 2 pages, (date unknown). |
Michael Goldstein, “Revolution on Wheels—Thatcher Ulrich,” Nov.-Dec. 1994, 3 pages. |
“Playboy on the Scene: Ride On!”, 1 page, (date unknown). |
Candace Putnam, “Software for Hardbodies: A virtual-reality hike machine takes you out on the open road,” Design, 1 page, (date unknown). |
Rachel, “No-Sweat Exercise—Can you get healthier without really trying?” Fitness, 1 page, (date unknown). |
Fitness article, Sep. 1994, pp. 402-404. |
“Wired Top 10: Best Selling Toys in Jun. 1994,” Wired, Sep. 1994, 1 page. |
“Top Skater,” Sega Amusements U.S.A., Inc., 1 page, (date unknown). |
Katharine Alter, et al., “Video Games for Lower Extremity Strength Training in Pediatric Brain Injury Rehabilitation,” National Rehabilitation Hospital, 18 pages, (date unknown). |
Cateye Recumbent GameBike Pro: Latest Technology in Exercise Bikes, beyondmoseying.com High Performance Exercise Equipment, 2 pages (advertisement; no date). |
Fitness Fun, while Exercising and Getting FIT for Kids, Teens and Adults, (advertisement, no date). |
Warranty Information and Your Joyboard: How it Works, Amiga Corporation, date unknown, 2 pages. |
Complaint for Patent Infringement, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Northern Division (Apr. 2, 2010), 317 pages. |
Plaintiff IA Labs CA, LLC's Opening Claim Construction Brief, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Southern Division (Dec. 13, 2010), 36 pages. |
Nintendo Co., Ltd. and Nintendo of America Inc.'s Opening Claim Construction Brief, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Southern Division (Dec. 13, 2010), 55 pages. |
Plaintiff IA Labs CA, LLC's Response Claim Construction Brief, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Southern Division (Jan. 6, 2011), 49 pages. |
Nintendo Co., Ltd. and Nintendo of America Inc.'s Closing Claim Construction Brief, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Southern Division (Jan. 6, 2011), 25 pages. |
Expert Report of Lee Rawls, Nov. 2, 2010, 37 pages (redacted). |
Nintendo Co., Ltd. and Nintendo of America's Opposition to IA Labs CA, LLC's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), including the Appendix of Exhibits and Exhibits A-R, 405 pages. |
Declaration of R. Lee Rawls in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), including Exhibits 1, 3-12, 193 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), 7 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Appendix of Exhibits, 2 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 1, 36 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 2, 40 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 3, 85 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 4, 10 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 5, 9 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 6, 17 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 7, 16 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 8, 45 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 9, 4 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 10, 22 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 11, 27 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 12, 3 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 13, 7 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 14, 22 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 15, 45 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 16, 42 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 17, 19 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 18, 27 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 19, 13 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 20, 29 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 21, 25 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 22, 11 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 23, 20 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 24, 7 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 25, 80 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 26, 32 pages. |
U.S. Appl. No. 74/402,755, filed Jun. 14, 1993, 43 pages. |
“AccuSway Dual Top: For Balance and Postural Sway Measurement,” AMTI: Force and Motion, ISO 9001:2000, 2 pages. |
Borzelli G., Cappozzo A., and Papa E., “Inter- and intra-individual variability of ground reaction forces during sit-to-stand with principal component analysis,” Medical Engineering & Physics 21 (1999), pp. 235-240. |
Chiari L., Cappello A., Lenzi D., and Della Croce U., “An Improved Technique for the Extraction of Stochastic Parameters from Stabilograms,” Gait and Posture 12 (2000), pp. 225-234. |
Cutlip R., Hsiao H., Garcia R., Becker E., Mayeux B., “A comparison of different postures for scaffold end-frame disassembly,” Applied Ergonomics 31 (2000), pp. 507-513. |
Davis K.G., Marras W.S., Waters T.R., “Evaluation of spinal loading during lowering and lifting,” The Ohio State University, Biodynamics Laboratory, Clinical Biomechanics vol. 13, No. 3, 1998, pp. 141-152. |
Rolf G. Jacob, Mark S. Redfern, Joseph M. Furman, “Optic Flow-induced Sway in Anxiety Disorders Associated with Space and Motion Discomfort,” Journal of Anxiety Disorders, vol. 9, No. 5, 1995, pp. 411-425. |
Jorgensen M.J., Marras W.S., “The effect of lumbar back support tension on trunk muscle activity,” Clinical Biomechanics 15 (2000), pp. 292-294. |
Deborah L. King and Vladimir M. Zatsiorsky, “Extracting gravity line displacement from stabilographic recordings,” Gait & Posture 6 (1997), pp. 27-38. |
Kraemer W.J., Volek J.S., Bush J.A., Gotshalk L.A., Wagner P.R., Gómez A.L., Zatsiorsky V.M., Duarte M., Ratamess N.A., Mazzetti S.A., Selle B.J., “Influence of compression hosiery on physiological responses to standing fatigue in women,” The Human Performance Laboratory, Medicine & Science in Sports & Exercise, 2000, pp. 1849-1858. |
Papa E. and Cappozzo A., “A telescopic inverted-pendulum model of the musculo-skeletal system and its use for the analysis of the sit-to-stand motor task,” Journal of Biomechanics 32 (1999), pp. 1205-1212. |
Balance System, BalanceTrak 500, & Quantrem, ZapConnect.com: Medical Device Industry Portal, http://www.zapconnect.com/products/index/cfm/fuseaction/products, 2 pages. (Retrieved Apr. 5, 2011). |
Bertec: Dominate Your Field, Physician's Quick Guide, Version 1.0.0, Feb. 2010, 13 pages. |
Bertec: Dominate Your Field, Balancecheck Screener, Version 1.0.0, Feb. 2010, 35 pages. |
Bertec: Dominate Your Field, Balancecheck Trainer, Version 1.0.0, Feb. 2010, 37 pages. |
Bertec Corporation—BALANCECHECK Standard Screener Package, http://bertec.com/products/balance-systems/standard-screener.html, 1 page. (Retrieved Apr. 12, 2011). |
Bertec Corporation—Balance Systems: Balancecheck Advanced balance assessment & training products for the balance professional, http://bertec.com/products/balance-systems.html, 1 page. (Retrieved Mar. 31, 2011). |
Bertec Corporation—Balancecheck Mobile Screener Package: Portable balance screening with full functionality, http://bertec.com/products/balance-systems/mobile-screener .html, 1 page. (Retrieved Mar. 31, 2011). |
Bertec Corporation—Balancecheck Standard Screener & Trainer Package: Advanced balance screening and rehabilitation system, http://bertec.com/products/balance-systems/standard-screener-trainer.html, 1 page. (Retrieved Mar. 31, 2011). |
U.S. Appl. No. 75/136,330, filed Jul. 19, 1996, 47 pages. |
Bertec: Dominate Your Field, Digital Acquire 4, Version 4.0.10, Mar. 2011, 22 pages. |
Bertec: Dominate Your Field, Bertec Force Plates, Version 1.0.0, Sep. 2009, 31 pages. |
Bertec: Dominate Your Field, Product Information: Force Plate FP4060-08:Product Details and Specifications, 4 pages. |
Bertec: Dominate Your Field, Product Information: Force Plate FP4060-10:Product Details and Specifications, 2 pages. |
U.S. Appl. No. 73/542,230, filed Jun. 10, 1985, 52 pages. |
Brent L. Arnold and Randy J. Schmitz, “Examination of Balance Measures Produced by the Biodex Stability System,” Journal of Athletic Training, vol. 33(4), 1998, pp. 323-327. |
Trademark Registration No. 1,974,115 filed Mar. 28, 1994, 8 pages. |
ICS Balance Platform, Fall Prevention: Hearing Assessment, Fitting Systems, Balance Assessment, Otometrics: Madsen, Aurical, ICS, 2 pages. |
U.S. Appl. No. 75/471,542, filed Apr. 16, 1998, 102 pages. |
VTI Force Platform, Zapconnect.com: Medical Device Industry Portal, http://zapconnect.com/products/index.cfm/fuseaction/products, 2 pages. (Retrieved Apr. 5, 2011). |
Amin M., Girardi M., Konrad H.R., Hughes L., “A Comparison of Electronystagmography Results with Posturography Findings from the BalanceTrak 500,” Otology & Neurotology, 23(4), 2002, pp. 488-493. |
Girardi M., Konrad H.R., Amin M., Hughes L.F., “Predicting Fall Risks in an Elderly Population: Computer Dynamic Posturography Versus Electronystagmography Test Results,” Laryngoscope, 111(9), 2001, 1528-32. |
Dr. Guido Pagnacco, Publications, 1997-2008, 3 pages. |
College of Engineering and Applied Science: Electrical and Computer Engineering, University of Wyoming, Faculty: Guido Pagnacco, http://wwweng.uwyo.edu/electrical/faculty/Pagnacco.html, 2 pages. (Retrieved Apr. 20, 2011). |
EyeTracker, IDEAS, DIFRA, 510(k) Summary: premarket notification, Jul. 5, 2007, 7 pages. |
Vestibular technologies, copyright 2000-2004, 1 page. |
Scopus preview—Scopus—Author details (Pagnacco, Guido), http://www.scopus.com/authid/detail.url?authorId=6603709393, 2 pages. (Retrieved Apr. 20, 2011). |
Vestibular Technologies Company Page, “Vestibular technologies: Helping People Regain their Balance for Life,” http://www.vestibtech.com/AboutUs.html, 2 pages. (Retrieved Apr. 20, 2011). |
GN Otometrics Launches ICS Balance Platform: Portable system for measuring postural sway, http://audiologyonline.com/news/pf—news—detail.asp?news—id=3196, 1 page. (Retrieved Mar. 31, 2011). |
U.S. Appl. No. 75/508,272, filed Jun. 25, 1998, 36 pages. |
U.S. Appl. No. 75/756,991, filed Jul. 21, 1999, 9 pages. |
U.S. Appl. No. 76/148,037, filed Oct. 17, 2000, 78 pages. |
Vestibular technologies, VTI Products: BalanceTRAK User's Guide, Preliminary Version 0.1, 2005, 34 pages. |
U.S. Appl. No. 76/148,037, filed Oct. 17, 2000, 57 pages. |
Vestibular Technologies, Waybackmachine, http://vestibtech.com/balancetrak500.html, 7 pages. (Retrieved Mar. 30, 2011). |
Vestibular Technologies, 2004 Catalog, 32 pages. |
The Balance Trak 500—Normative Data, 8 pages. |
State of Delaware: The Official Website of the First State, Division of Corporations—Online Services, http://delecorp.delaware.gov/tin/controller, 2 pages. (Retrieved Mar. 21, 2011). |
Memorandum in Support of Plaintiff IA Labs' Motion for Partial Summary Judgment on Defendants' Affirmative Defense and Counterclaim that U.S. Patent No. 7,121,982 is Invalid Under 35 U.S.C. §§ 102 and 103, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (Apr. 27, 2011), 17 pages. |
Addlesee, M.D., et al., “The ORL Active Floor,” IEEE—Personal Communications, Oct. 1997. |
Baek, Seongmin, et al., “Motion Evaluation for VR-based Motion Training,” Eurographics 2001, vol. 20, No. 3, 2001. |
Biodex Medical Systems, Inc.—Balance System SD Product Information—http://www.biodex.com/rehab/balance/balance—300feat.htm. |
Chen, I-Chun, et al., “Effects of Balance Training on Hemiplegic Stroke Patients,” Chang Gung Medical Journal, vol. 25, No. 9, pp. 583-590, Sep. 2002. |
Dingwell, Jonathan, et al., “A Rehabilitation Treadmill with Software for Providing Real-Time Gait Analysis and Visual Feedback,” Transactions of the ASME, Journal of Biomechanical Engineering, 118 (2), pp. 253-255, 1996. |
Geiger, Ruth Ann, et al., “Balance and Mobility Following Stroke: Effects of Physical Therapy Interventions With and Without Biofeedback/Forceplate Training,” Physical Therapy, vol. 81, No. 4, pp. 995-1005, Apr. 2001. |
Harikae, Miho, “Visualization of Common People's Behavior in the Barrier Free Environment,” Graduate Thesis—Master of Computer Science and Engineering in the Graduate School of the University of Aizu, Mar. 1999. |
Hodgins, J.K., “Three-Dimensional Human Running,” Proceedings: 1996 IEEE International Conference on Robotics and Automation, vol. 4, Apr. 1996. |
Kim, Jong Yun, et al., “Abstract—A New VR Bike System for Balance Rehabilitation Training,” Proceedings: 2001 IEEE Seventh International Conference on Virtual Systems and Multimedia, Oct. 2001. |
McComas, Joan, et al., “Virtual Reality Applications for Prevention, Disability Awareness, and Physical Therapy Rehabilitation in Neurology: Our Recent Work,” School of Rehabilitation Sciences, University of Ottawa—Neurology Report, vol. 26, No. 2, pp. 55-61, 2002. |
NeuroCom International, Inc.—Balance Manager Systems/Products—http://resourcesonbalance.com/neurocom/products/index.aspx. |
NeuroCom International, Inc.—Neurogame—http://resourcesonbalance.com/neurocom/products/NeuroGames.aspx. |
Nicholas, Deborah S., “Balance Retraining After Stroke Using Force Platform Feedback,” Physical Therapy, vol. 77, No. 5, pp. 553-558, May 1997. |
Nintendo Co., Ltd.—Aerobic Exercise Rhythm Boxing—http://www.nintendo.co.jp/wii/rfnj/training/aerobics/aerobics07.html. |
Redfern, Mark, et al., “Visual Influences of Balance,” Journal of Anxiety Disorders, vol. 15, pp. 81-94, 2001. |
Sackley, Catherine, “Single Blind Randomized Controlled Trial of Visual Feedback After Stroke: Effects on Stance Symmetry and Function,” Disability and Rehabilitation, vol. 19, No. 12, pp. 536-546, 1997. |
Tossavainen, Timo, et al., “Postural Control as Assessed with Virtual Reality,” Acta Otolaryngol, Suppl 545, pp. 53-56, 2001. |
Tossavainen, Timo, et al., “Towards Virtual Reality Simulation in Force Platform Posturography,” MEDINFO, pp. 854-857, 2001. |
Tsutsuguchi, Ken, et al., “Human Walking Animation Based on Foot Reaction Force in the Three-Dimensional Virtual World,” The Journal of Visualization and Computer Animation, vol. 11, pp. 3-16, 2000. |
Wong, Alice, et al., “The Development and Clinical Evaluation of a Standing Biofeedback Trainer,” Journal of Rehabilitation Research and Development, vol. 34, No. 3, pp. 322-327, Jul. 1997. |
Yang, Ungyeon, et al., “Implementation and Evaluation of ‘Just Follow Me’: An Immersive, VR-Based, Motion-Training System,” Presence, vol. 11, No. 3, pp. 304-323, 2002. |
Search Report (2 pgs.) dated May 27, 2011 issued in German Application No. 20 2004 021 793.7. |
Japanese Office Action issued for corresponding Japanese Patent Application No. 2007-261798, dated Dec. 26, 2012. |
Number | Date | Country
---|---|---
20090094442 A1 | Apr. 2009 | US