The technology herein relates to a computer readable storage medium storing an information processing program, and an information processing apparatus. More particularly, the technology herein relates to a computer readable storage medium storing an information processing program which performs information processing using an input device including a load detecting means, and an information processing apparatus.
Conventionally, there is a mat controller which is operated using a foot or feet to play a game. For example, Non-Patent Document 1 (“Instructions for ‘Wii Family Trainer’,” NAMCO BANDAI Games Inc., p. 9, p. 11) discloses a mat controller including upward, downward, rightward and leftward movement operating buttons for, e.g., selecting items, a decision operating button for deciding (or canceling) an item, and the like. The mat controller is provided with a plurality of buttons corresponding to respective operations arranged at predetermined positions on a mat. The user steps on a button corresponding to a desired operation to perform the operation.
As described above, the mat controller of Non-Patent Document 1 is provided with a plurality of buttons at predetermined positions on a mat, and is not operated based on a center-of-gravity position on the mat.
Therefore, an object of certain example embodiments is to provide a computer readable storage medium storing an information processing program which applies a load to an input device including a load detecting means and performs a process based on a center-of-gravity position of the load, and an information processing apparatus.
Certain example embodiments have the following features to attain the object mentioned above. Note that reference numerals, additional descriptions and the like inside parentheses in this section indicate correspondence to embodiments described below for the sake of easy understanding, and do not limit the present invention.
An embodiment of the present invention is directed to a computer readable storage medium storing an information processing program executable by a computer (CPU 40) of an information processing apparatus (game apparatus 12) for processing a signal which is based on a load value output from a first input device (load controller 36) including an input surface (platform 36a) and a load detecting means (load sensors 364) for detecting the load value applied to the input surface. The program causes the computer to function as a center-of-gravity position detecting means (step S30 of
According to the embodiment of the present invention, when the magnitude of a load applied to an input portion of the first input device including the load detecting means is smaller than a predetermined value, a process can be performed based on a center-of-gravity position of the load. In other words, when the user stands on the first input device, an input to the first input device by the user is not accepted, and when the user does not stand on the first input device, an input to the first input device by the user is accepted. As a result, it is possible to prevent an operation not intended by the user, and the confusion that would result from such an operation.
In the embodiment of the present invention, the first input device may output the load value detected by the load detecting means. In this case, the center-of-gravity position detecting means calculates the center-of-gravity position based on the load value detected by the load detecting means. The load value determining means determines whether or not the load value detected by the load detecting means is smaller than the predetermined value.
With this configuration, the information processing apparatus can calculate the center-of-gravity position based on the load value output from the first input device, and can execute a predetermined process based on the calculated center-of-gravity position.
In the embodiment of the present invention, the processing means may execute one of different processes corresponding to respective areas set on the input surface, depending on in which of the areas the center-of-gravity position is located.
With this configuration, different processes can be executed, depending on the area on the input surface in which the center-of-gravity position is located.
In the embodiment of the present invention, the information processing program may cause the computer to further function as a weight information acquiring means. The weight information acquiring means acquires weight information of a user. Moreover, the load value determining means may include a threshold changing means. The threshold changing means changes the predetermined value based on the weight information of the user. The load value determining means then determines whether or not the load value is smaller than the predetermined value changed by the threshold changing means. Here, the weight information of the user may be a previously stored weight of the user or a weight input by the user.
With this configuration, the threshold for determination of the load value determining means can be changed based on the weight information of the user.
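The weight-dependent threshold can be pictured as follows. This is a minimal sketch: the linear scaling rule and the 10% fraction are assumptions made for illustration, not values taken from the embodiment.

```python
# Sketch: changing the load threshold based on the user's weight information.
# The fraction used here is a hypothetical calibration value.

def adjust_threshold(user_weight_kg, fraction=0.1):
    """Return a load threshold scaled to the user's stored or entered weight."""
    return user_weight_kg * fraction

def load_below_threshold(load_kg, user_weight_kg):
    """Determine whether the detected load is smaller than the adjusted threshold."""
    return load_kg < adjust_threshold(user_weight_kg)
```

Under this assumed rule, a 60 kg user would get a 6 kg threshold and an 80 kg user an 8 kg threshold, so that a light touch of the foot is judged consistently regardless of body weight.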
In the embodiment of the present invention, the first input device may include a plurality of load detecting means, and output a plurality of load values detected by the plurality of load detecting means. In this case, the center-of-gravity position detecting means calculates the center-of-gravity position based on the load values detected by the plurality of load detecting means. The load value determining means determines whether or not a sum of the load values detected by the plurality of load detecting means is smaller than the predetermined value.
With this configuration, the center-of-gravity position of a load can be obtained from load values detected by a plurality of load detecting means.
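One way to picture the computation: the center-of-gravity position is a load-weighted average of the sensor positions, and the determination compares the sum of the load values against the threshold. The sketch below is illustrative only; the corner sensor layout, the normalized coordinate convention, and the 7 kg threshold are assumptions, not details of the embodiment.

```python
# Sketch: center-of-gravity position from four corner load sensors, and the
# load-sum determination. Assumed layout: tl/tr/bl/br = top-left, top-right,
# bottom-left, bottom-right; coordinates normalized to [-1, 1] on each axis.

def center_of_gravity(tl, tr, bl, br):
    """Return the (x, y) center-of-gravity position of the applied load."""
    total = tl + tr + bl + br
    if total == 0:
        return (0.0, 0.0)                 # no load: report the platform center
    x = ((tr + br) - (tl + bl)) / total   # right minus left, normalized
    y = ((tl + tr) - (bl + br)) / total   # top minus bottom, normalized
    return (x, y)

def sum_below_threshold(tl, tr, bl, br, threshold=7.0):
    """The load value determination: true while the total load stays below
    the (hypothetical) threshold, i.e. while the user is not standing on it."""
    return (tl + tr + bl + br) < threshold
```

A load placed evenly yields the platform center `(0.0, 0.0)`, while pressing only the two right-hand corners yields `(1.0, 0.0)`.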
In the embodiment of the present invention, the information processing apparatus may process a signal output from a second input device different from the first input device, in addition to the signal output from the first input device. In this case, the processing means, when the result of the determination by the load value determining means is negative, executes the predetermined process based on the signal output from the second input device.
With this configuration, the user can perform inputting using two input devices, i.e., the first input device and the second input device. Moreover, when the result of determination by the load value determining means is negative, then if a process is executed based on an input to the second input device entered by the user, it is possible to prevent an input which is not intended by the user. As a result, even when two input devices can be used to perform inputting, the user can easily perform an intended operation.
In the embodiment of the present invention, the information processing program may cause the computer to further function as an area setting means (S2). The area setting means sets on the input surface an input area (380, 382) corresponding to an input type of the second input device. The processing means, when the result of the determination by the load value determining means is positive, executes the predetermined process based on an input of the input type corresponding to the input area in which the center-of-gravity position is located.
With this configuration, when the first input device and the second input device can both be used to perform inputting, a similar operation can be performed with either device. Specifically, by setting an input area corresponding to an input type of the second input device, the same type of input that can be entered using the second input device can also be entered using the first input device.
In the embodiment of the present invention, the information processing program may cause the computer to further function as an operation displaying means (S10). The operation displaying means displays on a display device a display indicating that an operation using the first input device is available, when the result of the determination by the load value determining means is positive.
With this configuration, when an operation using the first input device is effective, a display indicating that the operation is available (a message or an image, e.g., an image indicating which area on the input surface corresponds to which input operation) can be provided. As a result, the user can easily determine whether or not an operation using the first input device is available.
In the embodiment of the present invention, the processing means may execute a menu operation process for selecting and deciding an item as the predetermined process.
With this configuration, the first input device can be used to perform a menu operation.
In the embodiment of the present invention, the information processing program may cause the computer to further function as an area setting means (S2). The area setting means sets a first area (380) including one or more areas and a second area (382) including one or more areas on the input surface of the first input device. The processing means executes a first process (cursor moving process) as the predetermined process when the center-of-gravity position is located in the first area, and executes a second process (item deciding process) as the predetermined process when the center-of-gravity position is located in the second area.
With this configuration, the user can perform different processes (the first and second processes) using the first input device.
In the embodiment of the present invention, the area setting means may set the second area to be narrower than the first area.
With this configuration, the user can cause the computer to more easily execute the second process than the first process. As a result, satisfactory operability is obtained even when an input operation is performed using the first input device.
In the embodiment of the present invention, the area setting means may set a distance between a predetermined position on the input surface and the second area to be longer than a distance between the predetermined position and the first area.
With this configuration, the user can cause the computer to more easily perform the first process than the second process.
In the embodiment of the present invention, the area setting means may further set a third area (invalid input area) on the input surface. In this case, the processing means executes the predetermined process when the center-of-gravity position is located in the first area or the second area, and does not execute the predetermined process when the center-of-gravity position is located in the third area.
With this configuration, an invalid input area can be set on the input surface. As a result, when the center-of-gravity position is located in the third area, an input by the first input device can be caused to be invalid.
In the embodiment of the present invention, the processing means may execute as the first process a process of moving a cursor used to select an item, and execute as the second process a process (decision of an item, transition of screens, etc.) including a process of deciding the item selected by the cursor.
With this configuration, the user can perform a process of moving a cursor, a process of deciding an item selected by the cursor, and the like, using the first input device.
In the embodiment of the present invention, the area setting means may set as the third area a boundary between each of the areas included in the first area and the second area.
With this configuration, the user can easily perform an operation based on each area. Specifically, by setting the third area between each area of the first area and the second area, the areas can be clearly distinguished from each other, thereby making it possible to prevent the user from entering an erroneous input.
In the embodiment of the present invention, the area setting means may set the third area at a center portion of the input surface, the first area above, below, to the right of, and to the left of the third area in the shape of a cross, and the second area in oblique directions from the third area.
With this configuration, the user can use the first input device to perform an input operation similar to a cross-key operation or a button operation which is performed using a conventional controller held by a hand.
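The cross-shaped layout just described can be sketched as a simple classifier over the normalized center-of-gravity position. The dead-zone size and the labels below are illustrative assumptions, not values from the embodiment.

```python
# Sketch: map a center-of-gravity position to the areas described above.
# Third area: dead zone at the center (input invalid). First area:
# cross-shaped arms (direction input). Second area: oblique corners
# (decision input). Coordinates assumed normalized to [-1, 1].

def classify(x, y, dead=0.25):
    """Return the input corresponding to center-of-gravity position (x, y)."""
    ax, ay = abs(x), abs(y)
    if ax < dead and ay < dead:
        return None                            # third area: input ignored
    if ax >= dead and ay >= dead:
        return "decide"                        # second area: oblique corners
    if ax >= ay:
        return "right" if x > 0 else "left"    # first area: horizontal arms
    return "up" if y > 0 else "down"           # first area: vertical arms
```

Under these assumptions, leaning forward reads as "up", leaning toward a corner as a decision, and a centered stance as no input, mirroring a cross-key plus decision-button controller.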
According to the embodiment of the present invention, even when an operation is performed by applying a load to an input device including a load detecting means, the user can easily perform a desired operation. Specifically, an input from the input device is accepted only when the user does not stand on the input device. As a result, it is possible to prevent an operation not intended by the user, and the resulting confusion.
These and other objects, features, aspects and advantages of certain example embodiments will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings.
(Overall Configuration of Game System)
Next, an example will be described with reference to the accompanying drawings.
The game apparatus 12 includes a housing 14 in the shape of substantially a rectangular parallelepiped. A disc slot 16 is provided in a front surface of the housing 14. An optical disc 18 which is an exemplary information storage medium storing a game program or the like is inserted and loaded through the disc slot 16 into a disc drive 54 (see
Also, a power button 20a and a reset button 20b are provided in an upper portion of the front surface of the game apparatus 12, and an eject button 20c is provided in a lower portion thereof. Moreover, a connector cover 28 for an external memory card is provided between the reset button 20b and the eject button 20c and in the vicinity of the disc slot 16. A connector 62 for an external memory card (see
Note that, as the memory card, a general-purpose SD card can be used, or alternatively, other general-purpose memory cards, such as a memory stick® and a multimedia card®, can be used.
An AV cable connector 58 (see
Note that power for the game apparatus 12 is supplied from a typical AC adapter (not shown). The AC adapter is plugged into a standard home wall socket. The game apparatus 12 converts home power supply (commercial power supply) into a low DC voltage signal suitable for driving. In other examples, a battery may be used as a power supply.
In the game system 10, when a user or users desire to play a game (or an application other than a game), the user initially powers ON the game apparatus 12, then selects an appropriate optical disc 18 storing a program of a video game (or another application which the user desires to play), and loads the optical disc 18 into the disc drive 54 of the game apparatus 12. In response, the game apparatus 12 starts executing the video game or other application based on a program recorded on the optical disc 18. The user operates the controller 22 so as to give an input to the game apparatus 12. For example, by operating any of a plurality of input means 26, the game or other application is started. Also, in addition to operations to the input means 26, by moving the controller 22 itself, a moving image object (user object) can be moved in different directions, or a point of view (camera position) of the user in a 3D game world can be changed.
The external main memory 46 stores a program, such as a game program or the like, or various kinds of data, or serves as a work area or a buffer area for the CPU 40. The ROM/RTC 48 is a so-called boot ROM, in which a program for booting the game apparatus 12 is incorporated and a clock circuit for counting time is provided. The disc drive 54 reads out program data, texture data or the like from the optical disc 18, and writes data into an internal main memory 42e (described below) or the external main memory 46 by a control of the CPU 40.
The system LSI 42 includes an input/output processor 42a, a GPU (Graphics Processor Unit) 42b, a DSP (Digital Signal Processor) 42c, a VRAM 42d, and the internal main memory 42e, which are connected to each other via an internal bus (not shown).
The input/output processor (I/O processor) 42a executes transmission/reception or downloading of data. The data transmission/reception or downloading will be described in detail below.
The GPU 42b, which is a part of a drawing means, receives a graphics command (drawing command) from the CPU 40, and generates game image data in accordance with the command. Note that the CPU 40 gives the GPU 42b an image generating program required for generation of the game image data in addition to the graphics command.
The VRAM 42d is connected to the GPU 42b as described above, though not shown. The GPU 42b accesses the VRAM 42d to acquire data (image data: polygon data, texture data, etc.) required for execution of the drawing command. Note that the CPU 40 writes image data required for drawing into the VRAM 42d via the GPU 42b. The GPU 42b accesses the VRAM 42d to generate game image data for drawing.
Note that it is assumed in this example that the GPU 42b generates game image data. When any application other than game applications is executed, the GPU 42b generates image data for the application.
The DSP 42c, which functions as an audio processor, generates audio data corresponding to sound, speech or music which is to be output from the loudspeakers 34a, using sound data, sound waveform (tone color) data or the like stored in the internal main memory 42e, the external main memory 46 or the like.
The image data and audio data thus generated are read out by the AV IC 56, and are then output via the AV connector 58 to the monitor 34 and the loudspeakers 34a, respectively. Therefore, a game screen is displayed on the monitor 34 while sound (music) required for a game is output from the loudspeakers 34a.
The flash memory 44, a wireless communication module 50, and a wireless controller module 52 as well as an extension connector 60 and the external memory card connector 62 are connected to the input/output processor 42a. An antenna 50a is connected to the wireless communication module 50. An antenna 52a is connected to the wireless controller module 52.
The input/output processor 42a can communicate with other game apparatuses or various servers connected to a network via the wireless communication module 50. Note that the input/output processor 42a can also communicate directly with other game apparatuses without going through a network. The input/output processor 42a regularly accesses the flash memory 44 to detect the presence or absence of data (transmission data) that needs to be transmitted to the network. If there is transmission data, the input/output processor 42a transmits it via the wireless communication module 50 and the antenna 50a to the network. The input/output processor 42a also receives data (received data) transmitted from another game apparatus via the network, the antenna 50a and the wireless communication module 50, and stores the received data into the flash memory 44. Note that, in a certain case, the received data is simply discarded. Moreover, the input/output processor 42a receives data (downloaded data) downloaded from a download server via the network, the antenna 50a and the wireless communication module 50, and stores the downloaded data into the flash memory 44.
The input/output processor 42a also receives input data transmitted from the controller 22 or the load controller 36 via the antenna 52a and the wireless controller module 52, and stores (temporarily stores) the input data into a buffer area of the internal main memory 42e or the external main memory 46. The input data is utilized in a game process performed by the CPU 40 before being erased from the buffer area.
Note that, in this example, as described above, the wireless controller module 52 communicates with the controller 22 and the load controller 36 in accordance with the Bluetooth standard.
In
Also, the extension connector 60 and the external memory card connector 62 are connected to the input/output processor 42a. The extension connector 60 is a connector for interface, such as USB or SCSI. A medium (e.g., an external storage medium, etc.) or a peripheral device (e.g., another controller, etc.) can be connected to the extension connector 60. A wired LAN adapter can be connected to the extension connector 60, so that a wired LAN can be used instead of the wireless communication module 50. An external storage medium, such as a memory card or the like, can be connected to the external memory card connector 62. Therefore, for example, the input/output processor 42a can access an external storage medium via the extension connector 60 or the external memory card connector 62 to save or read out data.
As also shown in
Although power is supplied to the system LSI 42 even in the standby mode, a clock is not supplied to the GPU 42b, the DSP 42c or the VRAM 42d so that they are not driven, resulting in a decrease in power consumption.
Moreover, a fan for emitting heat of ICs, such as the CPU 40, the system LSI 42 and the like, is provided in the housing 14 of the game apparatus 12, though not shown. The fan is also stopped in the standby mode.
Note that, when the standby mode is not desired, the system LSI 42 can be set so that the standby mode is not used; in this case, power supply to all circuit components is completely stopped when the power button 20a is turned OFF.
The normal mode and the standby mode can be switched by turning ON/OFF a power switch 26h (see
The reset button 20b is also connected to the system LSI 42. When the reset button 20b is pushed down, the system LSI 42 restarts the boot program of the game apparatus 12. The eject button 20c is connected to the disc drive 54. When the eject button 20c is pushed down, the optical disc 18 is ejected from the disc drive 54.
Referring to
The cross-key 26a is a 4-direction push switch which includes operation portions corresponding to four directions indicated by arrows, i.e., forward (or upward), backward (or downward), rightward and leftward directions. By operating one of the operation portions, the user can indicate a movement direction of a character or an object (a user character or a user object) which the user can operate, or a movement direction of a cursor.
The 1-button 26b and the 2-button 26c are each a push button switch. For example, the 1-button 26b and the 2-button 26c are used to perform a game operation, such as adjustment of a viewpoint position or a viewpoint direction (i.e., a position or an angle of view of a virtual camera) when a three-dimensional game image is displayed. Also, the 1-button 26b and the 2-button 26c may be used to perform the same operations as those of the A-button 26d and the B-trigger switch 26i or supplementary operations.
The A-button switch 26d is a push button switch which causes a user character or a user object to perform an action other than direction specification, i.e., hitting (punching), throwing, catching (getting), riding, jumping or the like. For example, in an action game, an instruction to jump, punch, move a weapon or the like can be input.
Also, in a role playing game (RPG) or a simulation RPG, an instruction to get an item, select a weapon or a command, make a decision or the like can be input.
The “−” button 26e, the HOME button 26f, the “+” button 26g and the power switch 26h are also push button switches. The “−” button 26e is used to select a game mode. The HOME button 26f is used to display a game menu (menu screen). The “+” button 26g is used to, for example, start (resume) or temporarily stop a game. The power switch 26h is used to turn ON/OFF a power supply of the game apparatus 12 by a remote operation.
Note that, in this example, a power switch for turning ON/OFF the controller 22 itself is not provided, and the controller 22 is turned ON by operating any of the input means 26 of the controller 22, and is automatically turned OFF if no operation of the input means 26 is performed for a predetermined period of time (e.g., 30 sec) or more.
The B-trigger switch 26i is also a push button switch, and is mainly used to provide an input mimicking a trigger for shooting a bullet or the like, or designate a position selected by the controller 22. When the B-trigger switch 26i continues to be pressed, an operation or a parameter of a user object can be maintained in a predetermined state. Also, in a certain case, the B-trigger switch 26i functions in a manner similar to that of a typical B-button, i.e., is used to cancel an action decided by the A-button 26d.
Also, as shown in
Moreover, the controller 22 has an image capture information computing unit 80 (see
Note that the shape of the controller 22 and the shapes, number and installation positions of the input means 26 shown in
The processor 70 controls the overall operation of the controller 22. The processor 70 transmits (inputs) information (input information) received from the input means 26, the acceleration sensor 74 and the image capture information computing unit 80, as input data, to the game apparatus 12 via the radio module 76 and the antenna 78. In this case, the processor 70 employs the memory 72 as a work area or a buffer area.
An operation signal (operation data) from the input means 26 (26a to 26i) is input to the processor 70. The processor 70 temporarily stores the operation data in the memory 72.
Also, the acceleration sensor 74 detects an acceleration along each of three axes in a vertical direction (y-axis direction), a lateral direction (x-axis direction) and a front-to-rear direction (z-axis direction) of the controller 22. The acceleration sensor 74 is typically of a capacitance type or may be of other types.
For example, the acceleration sensor 74 detects accelerations (ax, ay, az) along the x-axis, the y-axis and the z-axis and outputs data of the accelerations (acceleration data) to the processor 70 at first predetermined intervals. For example, the acceleration sensor 74 detects an acceleration in each axial direction within the range of −2.0 g to 2.0 g (g herein indicates the gravitational acceleration). The processor 70 detects acceleration data received from the acceleration sensor 74 at second predetermined intervals and temporarily stores the acceleration data in the memory 72. The processor 70 generates input data containing at least one of operation data, acceleration data, and marker coordinate data (described below), and transmits the generated input data to the game apparatus 12 at third predetermined intervals (e.g., 5 msec).
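As a rough illustration of how such input data might be assembled, the sketch below clamps each axis reading to the stated ±2.0 g range and bundles it with operation data. The record layout and field names are assumptions made for illustration, not the actual transmission format.

```python
# Sketch: clamp raw accelerations to the sensor range and build an
# input-data record for transmission. Field names are illustrative.

G_RANGE = 2.0  # the sensor detects -2.0 g to +2.0 g per axis

def clamp(a):
    """Limit one axis reading to the detectable range."""
    return max(-G_RANGE, min(G_RANGE, a))

def make_input_data(operation_bits, accel, marker_coords=None):
    """Bundle operation data, clamped accelerations, and (optional)
    marker coordinate data into one hypothetical input-data record."""
    ax, ay, az = (clamp(v) for v in accel)
    return {"op": operation_bits, "accel": (ax, ay, az),
            "marker": marker_coords}
```

An out-of-range reading such as 9.8 m/s² mistakenly passed in g units would simply saturate at 2.0 g in this sketch.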
Note that, in this example, the acceleration sensor 74 is provided at a portion in the vicinity of the cross-key 26a of the printed board in the housing 22a, though not shown in
The radio module 76 uses, for example, the Bluetooth® technique to modulate carrier waves having a predetermined frequency with input data, and emits a resultant weak radio wave signal from the antenna 78. In other words, the input data is modulated by the radio module 76 into the weak radio wave signal, which is in turn transmitted from the antenna 78 (controller 22). The weak radio wave signal is received by the wireless controller module 52 of the game apparatus 12. The received weak radio waves are subjected to demodulation and decoding processes, and therefore, the game apparatus 12 can acquire the input data from the controller 22. Thereafter, the CPU 40 performs a game process based on the acquired input data and a program (game program).
Moreover, as described above, the controller 22 includes the image capture information computing unit 80. The image capture information computing unit 80 includes an infrared filter 80a, a lens 80b, an image capturing device 80c, and an image processing circuit 80d. The infrared filter 80a passes only an infrared part of light entering from the front of the controller 22. As described above, the markers 340m and 340n provided in the vicinity of (around) the display screen of the monitor 34 are infrared LEDs which output infrared light toward the front of the monitor 34. Therefore, by providing the infrared filter 80a, images of the markers 340m and 340n can be more correctly captured. The lens 80b collects infrared light passing through the infrared filter 80a and emits the infrared light to the image capturing device 80c. The image capturing device 80c, which is, for example, a solid-state image capturing device, such as a CMOS sensor or a CCD sensor, receives infrared light collected by the lens 80b. Therefore, the image capturing device 80c captures only infrared light passing through the infrared filter 80a to generate image data. Hereinafter, the image captured by the image capturing device 80c is referred to as a captured image. The image data generated by the image capturing device 80c is processed by the image processing circuit 80d. The image processing circuit 80d calculates a position of a target object (the markers 340m and 340n) in the captured image, and outputs coordinates indicating the calculated position, as marker coordinate data, to the processor 70 at fourth predetermined intervals. Note that the process of the image processing circuit 80d will be described below.
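The marker-position calculation can be pictured as finding the centroids of bright infrared clusters in the captured image. The flood-fill approach and brightness threshold below are illustrative assumptions about one way this could be done, not a description of the actual circuit.

```python
# Sketch: locate infrared markers as centroids of bright pixel clusters.
from collections import deque

def marker_positions(image, threshold=200):
    """image: 2D list of grayscale intensities. Returns (x, y) centroids,
    one per connected cluster of pixels at or above the threshold."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                # flood-fill one connected bright cluster
                queue, pts = deque([(x, y)]), []
                seen[y][x] = True
                while queue:
                    cx, cy = queue.popleft()
                    pts.append((cx, cy))
                    for nx, ny in ((cx + 1, cy), (cx - 1, cy),
                                   (cx, cy + 1), (cx, cy - 1)):
                        if (0 <= nx < w and 0 <= ny < h
                                and not seen[ny][nx]
                                and image[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            queue.append((nx, ny))
                # centroid of the cluster = average pixel position
                centroids.append((sum(p[0] for p in pts) / len(pts),
                                  sum(p[1] for p in pts) / len(pts)))
    return centroids
```

With the two markers 340m and 340n in view, such a routine would return two coordinate pairs, which could then be forwarded as marker coordinate data.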
The platform 36a is in the shape of substantially a rectangular parallelepiped, and is in the shape of substantially a rectangle as viewed from the top. For example, the short side and the long side of the rectangle are set to about 30 cm and about 50 cm, respectively. The platform 36a has a flat upper surface on which the user stands. The platform 36a has four corner side surfaces each partially sticking out in a cylindrical shape.
In the platform 36a, the four load sensors 364 are arranged at predetermined intervals. In this example, the four load sensors 364 are arranged in a periphery of the platform 36a, specifically, at the respective four corners. The intervals at which the load sensors 364 are arranged are set to appropriate values which allow accurate detection of what is intended by a game operation depending on the way in which the user puts a load onto the platform 36a.
The support plate 360 includes an upper plate 360a forming an upper surface and an upper side surface portion, a lower plate 360b forming a lower surface and a lower side surface portion, and an intermediate plate 360c provided between the upper plate 360a and the lower plate 360b. The upper plate 360a and the lower plate 360b are formed by, for example, plastic molding, and are integrated using an adhesive or the like.
The intermediate plate 360c is, for example, formed of a single metal plate by press forming. The intermediate plate 360c is fixed onto the four load sensors 364. The upper plate 360a has a grid-patterned rib (not shown) on a lower surface thereof. The upper plate 360a is supported by the intermediate plate 360c with the rib being interposed therebetween. Therefore, when the user stands on the platform 36a, the load is transferred to the support plate 360, the load sensors 364 and the legs 362. As indicated with arrows in
Each load sensor 364 is, for example, a strain gauge (strain sensor) load cell, which is a load transducer which converts an input load to an electrical signal. In the load sensor 364, a structural member 365 is deformed, depending on an input load, resulting in strain. The strain is converted into a change in electrical resistance and is then converted into a change in voltage by a strain sensor 366 attached to the structural member. Therefore, the load sensor 364 outputs a voltage signal indicating the input load from an output terminal thereof.
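Reading such a sensor amounts to inverting its voltage response. Assuming a linear characteristic (the offset and gain below are hypothetical calibration constants, not values from the embodiment), the conversion might look like:

```python
# Sketch: convert a load cell's output voltage back into a load value,
# assuming a linear response. v_zero and volts_per_kg are hypothetical
# calibration constants determined per sensor.

def volts_to_load(v, v_zero=0.5, volts_per_kg=0.002):
    """Invert the assumed linear voltage-vs-load characteristic."""
    return (v - v_zero) / volts_per_kg
```

Under these assumed constants, a 0.5 V reading corresponds to no load and 0.7 V to about 100 kg; in the actual apparatus the amplified voltage would first pass through the A/D converter 102 before any such conversion.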
Note that the load sensor 364 may be of other types, such as a tuning fork type, a string vibration type, a capacitance type, a piezoelectric type, a magnetostrictive type, and a gyroscopic type.
Referring back to
The load controller 36 includes a microcomputer 100 for controlling the operation of the load controller 36. The microcomputer 100 includes a CPU, a ROM, a RAM and the like (not shown). The CPU controls the operation of the load controller 36 in accordance with a program stored in the ROM.
The power button 36c, an A/D converter 102, a DC-DC converter 104, and a radio module 106 are connected to the microcomputer 100. Moreover, an antenna 106a is connected to the radio module 106. The four load sensors 364 are connected via respective amplifiers 108 to the A/D converter 102.
The load controller 36 also accommodates a battery 110 for supplying power. In other examples, an AC adapter may be connected to the load controller 36 instead of the battery so that commercial power is supplied to the load controller 36. In this case, a power supply circuit which rectifies the alternating current into direct current and steps down the voltage needs to be provided instead of the DC-DC converter. In this example, power is supplied directly from the battery 110 to the microcomputer 100 and the radio module 106. In other words, power is invariably supplied to a portion (CPU) of the components of the microcomputer 100 and the radio module 106 so as to determine whether or not the power button 36c has been pushed down or whether or not a command to power ON (detection of a load) has been transmitted from the game apparatus 12. On the other hand, power is supplied to the load sensors 364, the A/D converter 102 and the amplifiers 108 from the battery 110 via the DC-DC converter 104. The DC-DC converter 104 converts the voltage value of the direct current from the battery 110 to a different voltage value, and supplies the resultant power to the load sensors 364, the A/D converter 102 and the amplifiers 108.
The supply of power to the load sensors 364, the A/D converter 102 and the amplifiers 108 may be performed as required by the microcomputer 100 controlling the DC-DC converter 104. Specifically, the microcomputer 100, when determining that it is necessary to operate the load sensors 364 to detect a load, may control the DC-DC converter 104 to supply power to the load sensors 364, the A/D converter 102 and the amplifiers 108.
When power is supplied to the load sensors 364, each load sensor 364 outputs a signal indicating a load input thereto. The signals are amplified by the respective amplifiers 108, and are converted from analog signals into digital data by the A/D converter 102. The digital data is input to the microcomputer 100. A detected value of each load sensor 364 is given identification information of the load sensor 364. Therefore, each load sensor 364 can be identified from a corresponding detected value. Thus, the microcomputer 100 can acquire data indicating the load values detected at the same time by the four load sensors 364.
On the other hand, the microcomputer 100, when determining that it is not necessary to operate the load sensors 364 (i.e., it is not the timing of load detection), controls the DC-DC converter 104 to stop the supply of power to the load sensors 364, the A/D converter 102 and the amplifiers 108. Thus, the load controller 36 can operate the load sensors 364 to detect a load only when it is required, whereby power consumption for load detection can be suppressed.
The load detection is typically required when the game apparatus 12 (
Alternatively, the microcomputer 100 may determine that load detection timing occurs at predetermined intervals and control the DC-DC converter 104. When the load detection is thus periodically performed, cycle information may be initially supplied and stored from the game apparatus 12 into the microcomputer 100 of the load controller 36 or may be previously stored in the microcomputer 100, for example.
Data indicating detected values from the load sensors 364 is transmitted as operation data (input data) of the load controller 36 from the microcomputer 100 via the radio module 106 and the antenna 106a to the game apparatus 12 (
Note that the radio module 106 can perform communication in accordance with the same wireless standard (Bluetooth®, wireless LAN, etc.) as that of the wireless controller module 52 of the game apparatus 12. Therefore, the CPU 40 of the game apparatus 12 can transmit the load acquisition command via the wireless controller module 52 and the like to the load controller 36. The microcomputer 100 of the load controller 36 can receive the command via the radio module 106 and the antenna 106a from the game apparatus 12, and transmit input data containing a detected load value (or a calculated load value) of each load sensor 364 to the game apparatus 12.
For example, in a game which is executed based on a simple sum of four load values detected by the four load sensors 364, the user is permitted to stand at any position with respect to the four load sensors 364 of the load controller 36, i.e., the user is permitted to stand on the platform 36a at any position and in any orientation to play a game. In some kinds of games, however, the direction of a load value detected by each load sensor 364 as viewed from the user needs to be identified, i.e., a positional relationship between the four load sensors 364 of the load controller 36 and the user needs to be recognized. In this case, for example, the positional relationship between the four load sensors 364 and the user may be previously defined, and the user may be instructed to stand on the platform 36a in a manner which allows the predetermined positional relationship. Typically, a positional relationship in which there are two load sensors 364 in front of, behind, to the right of, and to the left of the user standing at a middle of the platform 36a, i.e., a positional relationship in which, when the user stands at a middle of the platform 36a of the load controller 36, there is a load sensor 364 in front right, front left, rear right and rear left directions with respect to the user as a center, is defined. In the case of this example, the platform 36a of the load controller 36 is in the shape of a rectangle as viewed from the top and the power button 36c is provided at one side (long side) of the rectangle. Therefore, it is previously ruled that the user should stand on the platform 36a using the power button 36c as a guide in a manner which allows the long side at which the power button 36c is provided to be located in a predetermined direction (front, rear, left or right). In this case, a load value detected by each load sensor 364 is located in a predetermined direction (front right, front left, rear right, and rear left) as viewed from the user.
Therefore, the load controller 36 and the game apparatus 12 can find out a direction in which each detected load value is located as viewed from the user, based on the identification information of the load sensors 364 contained in the detected load value data and arrangement data indicating previously set (stored) positions or directions of the load sensors 364 with respect to the user. As a result, for example, it is possible to recognize what is intended by a game operation input by the user, such as forward, backward, rightward and leftward operation directions and the like.
Note that the arrangement of the load sensors 364 with respect to the user may not be previously defined, and may be input and set by the user during initial setting or during a game. For example, a screen may be displayed which instructs the user to stand on a portion in a predetermined direction (front left, front right, rear left, rear right, etc.) as viewed from the user, and load values may be acquired, so that a positional relationship between the load sensors 364 and the user can be specified, and therefore, arrangement data may be generated and stored based on these settings. Alternatively, a screen for selecting an arrangement of the load controllers 36 may be displayed on the monitor 34 to cause the user to select a direction in which a guide (the power button 36c) is located as viewed from the user by an input using the controller 22. Arrangement data of the load sensors 364 may be generated and stored based on the selection.
Note that, in
Note that when the position or orientation of the controller 22 is out of the range, a game operation cannot be performed based on the position and orientation of the controller 22. Hereinafter, the range is referred to as an “operable range”.
When the controller 22 is held within the operable range, the image capture information computing unit 80 captures an image of each of the markers 340m and 340n. Specifically, an image captured by the image capturing device 80c contains the image (target image) of each of the markers 340m and 340n whose images are targets to be taken.
The target image appears as a high luminance portion in the image data of the captured image. Therefore, the image processing circuit 80d initially detects the high luminance portion as a candidate for the target image. Next, the image processing circuit 80d determines whether or not the high luminance portion is the target image, based on a size of the detected high luminance portion. The captured image may contain an image caused by sunlight through a window or light of a fluorescent tube in a room in addition to target images 340m′ and 340n′ of the two markers 340m and 340n. The process of determining whether or not the high luminance portion is the target image is executed so as to distinguish the target images 340m′ and 340n′ of the two markers 340m and 340n from other images to correctly detect the target images. Specifically, in the determination process, it is determined whether or not the detected high luminance portion has a size within a predetermined range. When the high luminance portion has a size within the predetermined range, the high luminance portion is determined to represent the target image. Conversely, when the size of the high luminance portion does not fall within the predetermined range, the high luminance portion is determined to be an image other than the target images.
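The determination process above can be sketched as follows. This is an illustrative Python sketch, not the circuit's actual implementation: it flood-fills connected high-luminance pixels into candidate portions and keeps only those whose size falls within the predetermined range, rejecting small noise (e.g., light of a fluorescent tube) and overly large bright regions. The function name and parameters are assumptions for illustration.

```python
from collections import deque

def detect_target_images(image, threshold, min_size, max_size):
    """Detect candidate target images in a captured luminance image.

    image: 2D list of luminance values.
    A connected region of pixels at or above `threshold` is a candidate;
    it is kept as a target image only if its pixel count lies within
    [min_size, max_size], mirroring the size-based determination process.
    """
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    targets = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                # flood-fill one connected high-luminance portion
                blob, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    blob.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx]
                                and image[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # size-based determination: keep only plausible marker images
                if min_size <= len(blob) <= max_size:
                    targets.append(blob)
    return targets
```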
Moreover, for a high luminance portion which has been determined to represent the target image as a result of the determination process, the image processing circuit 80d calculates a position of the high luminance portion. Specifically, a position of the center of gravity of the high luminance portion is calculated. Coordinates of the center-of-gravity position of the high luminance portion are herein referred to as marker coordinates. The marker coordinates can be calculated with a scale finer than the resolution of the image capturing device 80c. Here, it is assumed that an image captured by the image capturing device 80c has a resolution of 126×96, and the marker coordinates are calculated with a scale of 1024×768. The marker coordinates are represented with integer values in the range of (0, 0) to (1024, 768).
Note that it is assumed that the position of a captured image is represented by a coordinate system (XY coordinate system) in which an upper left point of the captured image is the origin, a downward direction is the positive direction of the Y-axis, and a rightward direction is the positive direction of the X-axis.
When the target image is correctly detected, the determination process determines two high luminance portions as the target images, so that two marker coordinate points are calculated. The image processing circuit 80d outputs data indicating the two calculated marker coordinate points. The output data of the marker coordinates (marker coordinate data) is incorporated into input data by the processor 70 as described above, and is transmitted to the game apparatus 12.
The game apparatus 12 (the CPU 40), when detecting marker coordinate data from received input data, can calculate a position (coordinates) on the screen of the monitor 34 pointed by the controller 22, and a distance between the controller 22 and each of the markers 340m and 340n, based on the marker coordinate data. Specifically, a position pointed by the controller 22 (pointed position) is calculated from a middle point between the two marker coordinate points. Therefore, the controller 22 functions as a pointing device which points any position within the screen of the monitor 34. The pointed position of the controller 22 is ideally a position where a straight line extending from the front end surface of the controller 22 in a longitudinal direction of the controller 22 intersects the screen of the monitor 34. Also, a distance between the target images in the captured image varies, depending on the distances between the controller 22 and the markers 340m and 340n. Therefore, by calculating a distance between the two marker coordinate points, the game apparatus 12 can find out the distances between the controller 22 and the markers 340m and 340n.
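The two calculations described above, the pointed position as the middle point between the two marker coordinate points and the controller-to-marker distance inferred from the separation of the target images, can be sketched as follows. The function names are hypothetical; mapping the pixel separation to a physical distance would require additional calibration constants not given here.

```python
import math

def pointed_position(m1, m2):
    """Pointed position: the middle point between the two marker
    coordinate points (XY coordinates of the captured image)."""
    return ((m1[0] + m2[0]) / 2.0, (m1[1] + m2[1]) / 2.0)

def marker_separation(m1, m2):
    """Distance between the two target images in the captured image.
    This separation shrinks as the controller moves away from the
    markers, so it can be mapped to a controller-to-marker distance."""
    return math.hypot(m2[0] - m1[0], m2[1] - m1[1])
```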
(Menu Operation)
Next, a menu operation which is performed using the load controller 36 of this example will be described. This example is a game in which the user stands on the load controller 36 and performs various exercises (e.g., “yoga”, “strength training”, etc.). The user initially selects and decides an exercise which the user is to perform, from a menu screen.
In this example, the user moves the cursor 351 to a desired item image on the menu screen, thereby selecting and deciding an exercise type which the user is to perform. Here, the user can perform a menu operation (operation of selecting and deciding an item image) by two methods. Specifically, the two methods are a method of using the controller 22 and a method of using the load controller 36. In the method of using the controller 22, a menu operation is performed by using the controller 22 to point a position on the screen of the monitor 34 (pointing operation), or by using the cross-key 26a of the input means 26 of the controller 22 (cross-key operation). In the method of using the load controller 36, a menu operation is performed by the user stepping on a predetermined position on the load controller 36. In this example, the user does not stand on the load controller 36 and instead steps on the load controller 36 using one foot to perform a menu operation. When the user stands on the load controller 36 (i.e., the user stands on the load controller 36 using both feet), a menu operation using the load controller 36 is not accepted, and only a menu operation using the controller 22 is accepted. Hereinafter, the menu operation using the load controller 36 will be described in detail.
The user performs a menu operation by stepping on the platform 36a of the load controller 36. Specifically, for example, the user stands behind the load controller 36, facing the long side at which the power button 36c is provided, and steps on a predetermined position on the platform 36a using one foot (i.e., the user does not completely stand on the platform 36a). On the platform 36a of the load controller 36, movement operation areas for moving the cursor 351 in upward, downward, rightward and leftward directions, and a determination operation area for deciding an item or the like, are previously defined.
When the user puts their foot on a predetermined area shown in
SG=((a+c)−(b+d))×m (1)
TG=((c+d)−(a+b))×n (2)
where a indicates the load value of the rear right load sensor 364a, b indicates the load value of the rear left load sensor 364b, c indicates the load value of the front right load sensor 364c, d indicates the load value of the front left load sensor 364d, and m and n indicate constants, and −1≦SG≦1 and −1≦TG≦1. Thus, SG is calculated based on a difference between the sum of the load values of the load sensors 364a and 364c on the right side and the sum of the load values of the load sensors 364b and 364d on the left side. Similarly, TG is calculated based on a difference between the sum of the load values of the load sensors 364c and 364d on the front side and the sum of the load values of the load sensors 364a and 364b on the rear side. Note that the expressions for calculating the center-of-gravity position are only for illustrative purposes. The center-of-gravity position may be calculated by other methods.
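Expressions (1) and (2) can be sketched in code as follows. The constant values chosen for m and n are illustrative assumptions only (the specification does not give concrete values); the clamp enforces the stated ranges −1 ≦ SG ≦ 1 and −1 ≦ TG ≦ 1.

```python
def center_of_gravity(a, b, c, d, m=0.01, n=0.01):
    """Center-of-gravity position per Expressions (1) and (2).

    a: rear right load sensor 364a, b: rear left 364b,
    c: front right 364c, d: front left 364d (load values in kg).
    m and n are constants; the defaults here are illustrative only.
    """
    sg = ((a + c) - (b + d)) * m   # right-left balance, Expression (1)
    tg = ((c + d) - (a + b)) * n   # front-rear balance, Expression (2)
    # keep -1 <= SG <= 1 and -1 <= TG <= 1
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(sg), clamp(tg)
```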
Based on the center-of-gravity position thus calculated, the cursor 351 may be moved in a predetermined direction, or an item selected using the cursor 351 may be decided. For example, when the calculated center-of-gravity position falls within the input area 380c of
Here, as shown in
Operations corresponding to the input areas 382a to 382d have influence on events of a game, such as decision of an item, changing of screens, and the like. If the input areas 382a to 382d are set to be relatively small, the user needs to deliberately strongly step on the four corners of the platform 36a, so that it is possible to prevent the user from performing an erroneous input (i.e., when the user moves the cursor 351, decision of an item, changing of screens, or the like may occur although the user does not intend to do so). On the other hand, operations corresponding to the input areas 380a to 380d relate to movement of the cursor 351. Even if the user erroneously steps on the platform 36a to move the cursor 351, an item is not decided or screens are not changed, i.e., the state of a game is not affected. Therefore, even when the user erroneously steps on the direction key areas 370a to 370d of the platform 36a, the user can continue to perform a menu operation without being bothered.
In the ST coordinate space, an area (referred to as an invalid input area) other than the input areas 380a to 380d and the input areas 382a to 382d is set. As shown in
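The mapping from a center-of-gravity position to a movement operation area, a determination operation area, or the invalid input area can be sketched as follows. The coordinates used for the areas here are hypothetical placeholders, not the positions and sizes defined in the specification; they only illustrate the idea of small decision areas at the corners, direction areas along the axes, and an invalid area everywhere else.

```python
# Hypothetical ST-coordinate layout (illustrative values only):
# direction input areas 380a-380d on the axes, decision input areas
# 382a-382d kept small at the corners so the user must deliberately
# step on a corner to decide an item or change screens.
INPUT_AREAS = [
    ("380a", (-0.2, 0.2, 0.5, 1.0)),    # up (front)
    ("380b", (-0.2, 0.2, -1.0, -0.5)),  # down (rear)
    ("380c", (0.5, 1.0, -0.2, 0.2)),    # right
    ("380d", (-1.0, -0.5, -0.2, 0.2)),  # left
    ("382a", (0.8, 1.0, 0.8, 1.0)),     # front right corner
    ("382b", (-1.0, -0.8, 0.8, 1.0)),   # front left corner
    ("382c", (0.8, 1.0, -1.0, -0.8)),   # rear right corner
    ("382d", (-1.0, -0.8, -1.0, -0.8)), # rear left corner
]

def classify_center_of_gravity(sg, tg, areas=INPUT_AREAS):
    """Return the name of the input area containing (sg, tg), or
    "invalid" when the point lies in the invalid input area."""
    for name, (s0, s1, t0, t1) in areas:
        if s0 <= sg <= s1 and t0 <= tg <= t1:
            return name
    return "invalid"
```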
As described above, in this example, two menu operation methods are available (the operation method of using the load controller 36 and a foot and the operation method of using the controller 22 and a hand). When the user stands on the platform 36a of the load controller 36, a menu operation using the load controller 36 is not accepted and only a menu operation using the controller 22 is accepted. It is determined whether or not the user stands on the platform 36a of the load controller 36, based on whether or not the sum of load values detected by the four load sensors 364 is larger than or equal to a predetermined value. For example, when the sum of the load values is larger than or equal to 10 kg, it is determined that the user stands on the platform 36a of the load controller 36, and therefore, only a menu operation of the controller 22 is effective. A reason why such a process is employed will be hereinafter described.
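The standing determination described above reduces to a simple threshold test on the sum of the four load values; a minimal sketch (function name is an assumption) follows.

```python
STANDING_THRESHOLD_KG = 10.0  # example threshold from the description

def accept_load_controller_menu_input(load_values):
    """Return True when a menu operation using the load controller is
    accepted. When the sum of the four detected load values reaches the
    threshold, the user is judged to be standing on the platform, and
    only menu operations using the hand-held controller are accepted."""
    return sum(load_values) < STANDING_THRESHOLD_KG
```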
If it is assumed that the operation method of using the load controller 36 and a foot can be used even when the user stands on the platform 36a of the load controller 36, a menu operation is likely to be performed by the load controller 36 without the user's intention. For example, when the user standing on the platform 36a of the load controller 36 is performing a menu operation using the controller 22, then if the calculated center-of-gravity position accidentally falls within any input area (382a to 382d, 380a to 380d), the game apparatus 12 recognizes that a menu operation has been performed using the load controller 36. Specifically, when the user stands on the load controller 36, then if the operation of the load controller 36 using a foot and the operation of the controller 22 using a hand are both available, an operation which is not intended by the user may be recognized.
Also, when the user tries to perform an operation using a foot while standing on the load controller 36, it is often difficult for the user to satisfactorily control the center-of-gravity position. For example, when the user puts their right foot on a desired position while standing on the load controller 36, the center-of-gravity position does not necessarily move to the position on which the right foot is put, and therefore, it is difficult for the user to perform an operation as they intend. In order to move the center-of-gravity position while standing on the load controller 36, the user needs to shift their weight using their whole body. Therefore, when the user moves the center-of-gravity position while standing on the load controller 36, the user needs to perform an operation in a manner different from that when the user steps on a desired portion without standing on the load controller 36. Therefore, when performing an operation using a foot, the user may be confused as to what operation to perform.
Therefore, by not accepting a menu operation of the load controller 36 and accepting only a menu operation of the controller 22 while the user stands on the load controller 36, it is possible to prevent both operations which are not intended by the user and such confusion.
Note that, when the user does not stand on the load controller 36, both the load controller 36 and the controller 22 provide effective menu operations. In this case, if both the load controller 36 and the controller 22 simultaneously provide inputs, a menu operation is performed based on predetermined priorities. Any order of priority may be set. For example, the order of priority may be set as follows: a cross-key operation using the cross-key 26a of the input means 26>a screen pointing operation using the controller 22>an operation using the load controller 36.
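The example order of priority for simultaneous inputs can be expressed as a simple first-match selection; a sketch under the assumption that each pending input is either a value or None:

```python
def select_menu_input(cross_key, pointing, load_controller):
    """Resolve simultaneous menu inputs by the example's priorities:
    cross-key operation > screen pointing operation > load controller.
    Each argument is the pending input of that method, or None."""
    for candidate in (cross_key, pointing, load_controller):
        if candidate is not None:
            return candidate
    return None
```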
Note that, in this example, the aforementioned display image 352 (see
Next, a process of a menu operation performed in the game apparatus 12 will be described in detail. Firstly, main data which is used in the menu operation process will be described with reference to
The load value data 502 relates to load values detected by the four load sensors 364. Specifically, the load value data 502 includes a set of a load value a detected by the load sensor 364a, a load value b detected by the load sensor 364b, a load value c detected by the load sensor 364c, and a load value d detected by the load sensor 364d. The sum load value 504 is the sum of the load values detected by the four load sensors 364. Specifically, the sum load value 504 is the addition of the load values a, b, c and d.
The current center-of-gravity position 506 is coordinate data indicating a center-of-gravity position calculated based on load values currently detected by the four load sensors 364. Specifically, the current center-of-gravity position 506 is calculated by Expressions (1) and (2) using the load value data 502, i.e., the load values a, b, c and d.
The immediately previous center-of-gravity position 508 is coordinate data indicating a center-of-gravity position calculated based on load values immediately previously detected by the four load sensors 364. Load values detected by the four load sensors 364 vary depending on a position on which the user steps. The center-of-gravity position also varies depending on the load values. The immediately previous center-of-gravity position 508 is one that is calculated immediately before the current center-of-gravity position 506. Specifically, when the current center-of-gravity position is changed, the current center-of-gravity position 506 is stored as the immediately previous center-of-gravity position 508 into the external main memory 46, and a changed center-of-gravity position is stored as a new current center-of-gravity position 506 into the external main memory 46.
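The bookkeeping between the current center-of-gravity position 506 and the immediately previous center-of-gravity position 508 can be sketched as follows (a plain in-memory sketch; the specification stores these values in the external main memory 46).

```python
class CenterOfGravityStore:
    """Holds the current center-of-gravity position 506 and the
    immediately previous center-of-gravity position 508."""

    def __init__(self):
        self.current = None    # current center-of-gravity position 506
        self.previous = None   # immediately previous position 508

    def update(self, position):
        # When the position changes, the old current value becomes the
        # immediately previous one, and the newly calculated position
        # becomes the new current one.
        if position != self.current:
            self.previous = self.current
            self.current = position
```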
The input area data 510 indicates positions and sizes of the aforementioned input areas (the input areas 380a to 380d and the input areas 382a to 382d). In other words, the input area data 510 includes a set of the positions and sizes of the input areas. For example, data indicating the position of the input area 380a may be coordinate data indicating a center of the input area 380a, and data indicating the size of the input area 380a may be the length of a diagonal line of the input area 380a.
The cursor position 512 indicates a current position of the cursor 351.
Next, a menu operation process performed in the game apparatus 12 will be described in detail with reference to
Initially, in step S1, the CPU 40 executes a preprocess for a menu operation. The preprocess for a menu operation is a process which is executed before the menu operation is performed using the load controller 36, i.e., a process for correcting a zero point. The zero-point correction is a process of setting the load value detected by each load sensor 364 in the absence of an applied load to 0 (kg). Even when a load is not applied to a load sensor 364, the load sensor 364 may detect a load value of, for example, 0.1 (kg) due to a change in environment or the like. The process of step S1 redefines the detected load value as 0 (kg).
Hereinafter, the preprocess for a menu operation in step S1 will be described in detail with reference to
In the menu operation preprocess, initially, in step S20, recognition of the load controller 36 is executed. Here, it is determined whether or not the load controller 36 is correctly connected and is normally ready to operate. Next, the CPU 40 executes a process of step S21.
In step S21, the CPU 40 determines whether or not the preprocess (the zero-point correction process of step S26) has already been executed. The CPU 40, when executing a process of step S26 described below, stores the fact that the preprocess has been executed, into the external main memory 46. In step S21, the CPU 40 determines whether or not the preprocess has already been executed, by referring to the external main memory 46. When the result of determination is negative, the CPU 40 next executes a process of step S22. When the result of determination is positive, the CPU 40 ends the menu operation preprocess. Note that the processes of steps S22 to S26 are executed only when the menu operation process of
In step S22, the CPU 40 displays on the monitor 34 a message which instructs the user not to stand on the load controller 36 (to put nothing on the load controller 36). For example, the CPU 40 displays a message that “Please do not stand on the load controller.”
In step S23, the CPU 40 transmits a load acquisition command to the load controller 36 to acquire load values (or data indicating load values) of the four load sensors 364. Thereafter, the CPU 40 stores the acquired load values of the load sensors 364 as the load value data 502 into the external main memory 46. Next, the CPU 40 executes a process of step S24.
In step S24, the CPU 40 calculates a sum of the load values acquired in step S23. Specifically, the CPU 40 acquires the load value data 502 by referring to the external main memory 46 to calculate the addition of the load values detected by the four load sensors 364. Thereafter, the CPU 40 stores the calculated sum value as the sum load value 504 into the external main memory 46. Next, the CPU 40 executes a process of step S25.
In step S25, the CPU 40 determines whether or not the sum load value is smaller than a predetermined value. Specifically, the CPU 40 acquires the sum load value 504 calculated in step S24 by referring to the external main memory 46. Thereafter, the CPU 40 determines whether or not the acquired sum load value is smaller than, for example, 10 kg. If the result of determination is positive, the CPU 40 next executes a process of step S26. If the result of determination is negative, the CPU 40 returns to the process of step S22.
In step S26, the CPU 40 executes the zero-point correction. Specifically, the CPU 40, when determining that the sum load value is smaller than the predetermined value in step S25, determines that a load is not applied to the load controller 36, and defines the load value currently detected by each load sensor 364 as 0 (kg). After step S26, the CPU 40 ends the preprocess, and returns to the process of the flowchart of
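The zero-point correction of steps S25 and S26 can be sketched as follows: only when the sum of the currently detected load values is below the threshold (the platform is judged to be empty) are the current readings stored as offsets and redefined as 0 (kg). Class and method names are assumptions for illustration.

```python
class ZeroPointCorrection:
    """Sketch of the zero-point correction (steps S25 and S26)."""

    def __init__(self):
        self.offsets = (0.0, 0.0, 0.0, 0.0)

    def calibrate(self, idle_values, threshold_kg=10.0):
        """Store the idle readings as the zero point, but only when the
        platform is judged to be empty (sum below the threshold, as in
        step S25). Returns True when calibration took place."""
        if sum(idle_values) < threshold_kg:
            self.offsets = tuple(idle_values)
            return True
        return False

    def corrected(self, raw_values):
        """Apply the stored offsets to subsequent raw readings."""
        return tuple(r - o for r, o in zip(raw_values, self.offsets))
```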
Referring back to
Next, a process of step S3 is executed. In the process of step S3, it is determined whether or not a menu operation is being performed using the controller 22. When a menu operation is being performed using the controller 22, the menu operation of the controller 22 is executed with priority. In step S3, the CPU 40 determines whether or not an operation is being performed using the cross-key 26a (and other buttons) of the controller 22. When determining that an operation is being performed using the cross-key 26a or the like, the CPU 40 next executes a process of step S4. On the other hand, when an operation is not being performed using the cross-key 26a or the like, the CPU 40 next executes a process of step S5.
In step S4, the CPU 40 executes a menu operation based on an operation performed using the cross-key 26a or the like. In step S4, an input by the input means 26 of the controller 22 is accepted, and a process corresponding to the input is executed. Specifically, when an operation is performed using the cross-key 26a, a process of moving the cursor 351 is performed. When an operation has been performed using the input means 26 other than the cross-key 26a, a process of deciding an item corresponding to each button (the “+” button 26g, the A-button 26d, the B-switch 26i, etc.) is performed, for example. Specifically, for example, when an operation is performed using the cross-key 26a, the CPU 40 moves the cursor 351 in an upward or downward (or rightward or leftward) direction, depending on the operation of the cross-key 26a. Also, for example, when the “+” button 26g is pushed down, the CPU 40 decides an item selected by the cursor 351 (an exercise type corresponding to an item image 350). Note that a process which is performed, depending on the type of each button, is not particularly limited. Also, the buttons and the processes are associated with each other in any manner (e.g., the 1-button 26b is used to decide an item, the B-switch 26i is used to return to the previous screen, etc.). After step S4, the CPU 40 executes a process of step S12.
On the other hand, when an operation is not being performed using the cross-key 26a or the like (the result of determination in step S3 is negative), the CPU 40 determines whether or not an item image 350 is pointed by the controller 22 in step S5. Specifically, the CPU 40 determines whether or not a position pointed by the controller 22 falls within an area where an item image 350 is displayed. When the position pointed by the controller 22 falls within an area where an item image 350 is displayed, the CPU 40 determines that an item is pointed by the controller 22. Conversely, when the position pointed by the controller 22 does not fall within an area where an item image 350 is displayed, the CPU 40 determines that an item is not pointed by the controller 22. When the result of determination in step S5 is positive, the CPU 40 next executes a process of step S6. When the result of determination is negative, the CPU 40 next executes a process of step S7.
In step S6, the CPU 40 executes a menu process based on a pointing operation using the controller 22. Specifically, the CPU 40 moves the cursor 351 to an item image 350 pointed by the controller 22.
In step S7, the CPU 40 acquires load values from the load controller 36. Specifically, the CPU 40 transmits a load acquisition command to the load controller 36 to acquire load values (or data indicating the load values) of the four load sensors 364. Thereafter, the CPU 40 stores the acquired load values of the load sensors 364 as the load value data 502 into the external main memory 46. Next, the CPU 40 executes a process of step S8.
In step S8, the CPU 40 calculates a sum of the load values acquired in step S7. Specifically, the CPU 40 acquires the load value data 502 by referring to the external main memory 46 and calculates the addition of the load values detected by the four load sensors 364. Thereafter, the CPU 40 stores the calculated sum value as the sum load value 504 into the external main memory 46. Next, the CPU 40 executes a process of step S9.
In step S9, the CPU 40 determines whether or not the sum load value is smaller than a predetermined value. Specifically, the CPU 40 acquires the sum load value 504 by referring to the external main memory 46 and determines whether or not the sum load value is smaller than, for example, 10 kg. The process of step S9 determines whether or not the user is standing on the load controller 36; a sum load value smaller than the threshold indicates that the user is not standing on it. When the result of determination is positive, the CPU 40 next executes a process of step S10. When the result of determination is negative, the CPU 40 returns to the process of step S3.
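The acquisition, summation and threshold check of steps S7 through S9 reduce to a simple sum-and-compare. The following is a minimal sketch: the 10 kg figure comes from the text above, while the function name and the list representation of the four sensor values are illustrative, not from the specification.

```python
def is_load_below_threshold(sensor_loads, threshold_kg=10.0):
    """Steps S8-S9: sum the four load-sensor values and decide
    whether the total falls below the threshold used to judge
    that the user is not standing on the load controller."""
    total = sum(sensor_loads)      # sum load value (step S8)
    return total < threshold_kg    # determination of step S9

# Nobody standing on the platform: only small residual loads remain.
print(is_load_below_threshold([1.2, 0.8, 1.1, 0.9]))
```

When the function returns False, processing returns to step S3 as described above.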
In step S10, the CPU 40 displays on the monitor 34 a display (the display image 352 of
In step S11, the CPU 40 executes a menu operation process based on a center-of-gravity position. Hereinafter, the menu operation process based on a center-of-gravity position in step S11 will be described in detail with reference to
Initially, in step S30, the CPU 40 calculates a center-of-gravity position from load values. Specifically, the CPU 40 acquires the load value data 502 by referring to the external main memory 46. Next, the CPU 40 calculates a center-of-gravity position based on the acquired load values of the load sensors 364. The calculation of a center-of-gravity position is executed in accordance with Expressions (1) and (2) above. Thereafter, the CPU 40 stores the current center-of-gravity position 506 stored in the external main memory 46 as a new immediately previous center-of-gravity position 508 into the external main memory 46. Moreover, the CPU 40 stores the calculated center-of-gravity position as a new current center-of-gravity position 506 into the external main memory 46. As a result, a center-of-gravity position calculated in the current step S30 is stored as the current center-of-gravity position 506 into the external main memory 46, and a center-of-gravity position calculated in the previous step S30 is stored as the immediately previous center-of-gravity position 508 into the external main memory 46. Next, the CPU 40 executes a process of step S31.
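Expressions (1) and (2) are given earlier in the specification and are not reproduced in this section. As a sketch of what step S30 computes, the following shows one common formulation for a platform with one load sensor at each corner; the sensor naming and axis conventions are assumptions, not necessarily those of Expressions (1) and (2).

```python
def center_of_gravity(tl, tr, bl, br):
    """Compute an (S, T) center-of-gravity position from four corner
    load values (top-left, top-right, bottom-left, bottom-right),
    normalized so that each axis lies in the range [-1, 1]."""
    total = tl + tr + bl + br
    if total == 0:
        return (0.0, 0.0)  # no applied load: fall back to the origin
    s = ((tr + br) - (tl + bl)) / total  # left-right (S) axis
    t = ((tl + tr) - (bl + br)) / total  # front-back (T) axis
    return (s, t)
```

With equal loads on all four sensors the result is the origin; shifting load entirely to the right-hand sensors drives S toward 1.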
In step S31, the CPU 40 determines whether or not the center-of-gravity position has been changed. Specifically, the CPU 40 determines whether or not a current center-of-gravity position calculated in the immediately previous step S30 has been changed from the previously calculated center-of-gravity position. More specifically, the CPU 40 acquires the current center-of-gravity position 506 and the immediately previous center-of-gravity position 508 by referring to the external main memory 46. When the current center-of-gravity position 506 is different from the immediately previous center-of-gravity position 508, the CPU 40 determines that the center-of-gravity position has been changed, and next executes a process of step S32. When the current center-of-gravity position 506 is the same as the immediately previous center-of-gravity position 508, the CPU 40 determines that the center-of-gravity position has not been changed, and ends the menu operation process based on a center-of-gravity position.
In step S32, the CPU 40 determines whether or not an area in which a center-of-gravity position is located has been changed. Specifically, the CPU 40 determines in which of the input areas (380a to 380d and 382a to 382d) and the invalid input areas a current center-of-gravity position is located, based on the current center-of-gravity position 506. The CPU 40 also determines in which of the input areas and the invalid input areas the immediately previous center-of-gravity position is located, based on the immediately previous center-of-gravity position 508. Thereafter, the CPU 40 determines whether or not an area in which the current center-of-gravity position is located (one of the input areas and the invalid input areas) has been changed from an area in which the immediately previous center-of-gravity position is located. For example, when the current center-of-gravity position is located in the input area 380c and the immediately previous center-of-gravity position is located in an invalid input area, the CPU 40 determines that the area in which the center-of-gravity position is located has been changed. On the other hand, when the current center-of-gravity position is located in the input area 380c and the immediately previous center-of-gravity position is located in the input area 380c, the CPU 40 determines that the area in which the center-of-gravity position is located has not been changed. When the result of determination is positive, the CPU 40 next executes a process of step S33. When the result of determination is negative, the CPU 40 ends the menu operation process, and returns to the process shown in the flowchart of
The process of step S32 determines whether or not the center-of-gravity position has been changed from one area to another area (these areas are any of the input areas and the invalid input areas of
Next, in step S33, the CPU 40 executes a menu operation process corresponding to an area in which a center-of-gravity position is located. Specifically, for example, when a current center-of-gravity position is located in the input area 380a, the CPU 40 moves the cursor 351 in an upward direction. Also, for example, when a current center-of-gravity position is located in the input area 382a or the input area 382c, the CPU 40 decides an item selected by the cursor 351. Also, when a current center-of-gravity position is located in an invalid input area, the CPU 40 does not execute a menu operation. Thereafter, the CPU 40 ends the menu operation process, and returns to the process shown in the flowchart of
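Steps S31 through S33 together amount to: classify both the current and the immediately previous center-of-gravity positions into areas, and act only when the area has changed. The sketch below assumes placeholder rectangle geometry; the real input areas 380a to 380d and 382a to 382d are defined by the figures, and only two example areas are modeled here.

```python
# Placeholder area layout in (s_min, t_min, s_max, t_max) form.
AREAS = {
    "up":     (-0.2, 0.5, 0.2, 1.0),  # input area moving the cursor up
    "decide": (0.5, 0.5, 1.0, 1.0),   # input area deciding an item
}

def area_of(pos):
    """Return the name of the area containing pos, or None when pos
    lies in an invalid input area (the classification of step S32)."""
    s, t = pos
    for name, (s0, t0, s1, t1) in AREAS.items():
        if s0 <= s <= s1 and t0 <= t <= t1:
            return name
    return None

def menu_operation(current, previous):
    """Steps S31-S33: act only when the center-of-gravity position has
    moved into a different area, so that holding one position does not
    repeat the same operation every frame."""
    if current == previous:
        return None                      # step S31: position unchanged
    cur, prev = area_of(current), area_of(previous)
    if cur == prev:
        return None                      # step S32: same area as before
    return cur                           # step S33: operation for the new area
```

Moving from an invalid input area into the "up" area yields one cursor-up operation; staying inside the "up" area yields nothing further, matching the behavior described above.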
Referring back to
In this manner, a menu operation using the load controller 36 and a foot is performed. Thus, in a game system using an input device such as the load controller 36, in which input is performed based on a center-of-gravity position, a menu operation can be performed depending on the center-of-gravity position. As described above, if an input operation using the load controller 36 were accepted while the user is standing on it, the operability could be degraded. Therefore, if an input operation using the load controller 36 is accepted only when the user is not standing on the load controller 36, the user can easily perform a desired operation. Likewise, even when both the load controller 36 and another input device operated by hand are available to the game apparatus 12, accepting an input operation from the load controller 36 only while the user is not standing on it improves the operability for the user. In other words, it is possible to prevent the user from making an unintended input when two input devices are available.
Note that, in the aforementioned example, load values (or data indicating the load values) are acquired from the load controller 36, and a center-of-gravity position is calculated in the game apparatus 12. Alternatively, the game apparatus 12 may detect a center-of-gravity position based on an input signal (data indicating load values or a center-of-gravity position) from the load controller 36. For example, the load controller 36 may calculate and transmit load values and a center-of-gravity position to the game apparatus 12.
Also, in the aforementioned example, it is determined whether or not the sum load value is smaller than a predetermined value, and a menu operation using the load controller 36 is enabled only when the result of determination is positive. In another example, the threshold used in this determination may be changed depending on the weight of the user. The weight of the user may be stored in the game apparatus 12 in advance and selected by the user before an operation. Alternatively, the weight of the user may be input for each operation.
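One way to realize the weight-dependent threshold just mentioned is to derive it as a fraction of the stored weight. The 25% fraction and the function name below are illustrative assumptions, not values from the specification.

```python
def standing_threshold(user_weight_kg, fraction=0.25):
    """Scale the step-S9 threshold to the user's stored weight: a
    fixed cutoff such as 10 kg suits an average adult differently
    than a child, so a proportional threshold adapts to both."""
    return user_weight_kg * fraction
```

A 40 kg user would then get the 10 kg cutoff used in the example above, while an 80 kg user would get 20 kg.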
Also, in the aforementioned example, the input areas 382a to 382d used to decide an item or the like are set to be smaller than the input areas 380a to 380d used to move the cursor 351. Alternatively, distances between a predetermined point in the ST coordinate space (e.g., the origin O) and the input areas 382a to 382d may be set to be longer than distances between that point and the input areas 380a to 380d. Specifically, a center-of-gravity position is determined by differences between load values detected by the load sensors 364, and therefore, if a center-of-gravity position in the absence of a load applied to the load controller 36 is set as an origin, the difference between load values needs to be increased (i.e., a larger load needs to be applied) with increasing distance from the origin. Conversely, when the applied load is small, the load difference is small, so that the calculated center-of-gravity position is closer to the origin.
Also, in the aforementioned example, the load controller 36 includes the four load sensors 364. In other examples, the load controller 36 may include three or fewer load sensors, or five or more load sensors. Also, as long as a center-of-gravity position of applied loads can be obtained, the load controller 36 may be any device and, unlike the aforementioned example, need not include a plurality of load sensors.
Also, in the aforementioned example, only when an area in which a current center-of-gravity position is located has been changed from an area in which the immediately previous center-of-gravity position is located, a menu operation is executed (steps S32 and S33). Alternatively, even when an area in which a current center-of-gravity position is located is the same as an area in which the immediately previous center-of-gravity position is located, a menu operation may be executed. In this case, for example, when the user continues to step on the downward key area 370c of the platform 36a, the cursor 351 is continuously moved downward.
Certain example embodiments are also applicable to an information processing apparatus other than a game apparatus. For example, an input device, such as the load controller 36, which can enter an input based on a center-of-gravity position may be connected to a personal computer, and the input device can be used to perform a menu operation similar to that described above. Also, in such a personal computer, a pointer indicating a position on a display screen may be caused to correspond to a center-of-gravity position, and the pointer may be operated depending on a change in the center-of-gravity position, thereby causing the personal computer to execute a predetermined process.
Also, in the aforementioned example, as shown in
Also, sizes (or distances from the origin) of the input areas 382a to 382d and the input areas 380a to 380d may be changed depending on the situation of an operation. For example, on an operation screen on which an item decision operation or the like is used more often than an item selection operation, the input areas 382a to 382d used for the item decision operation or the like may be set to be larger than the input areas 380a to 380d used for the item selection operation. An example of such an operation screen is a setting wizard of an information processing apparatus. Specifically, in a setting wizard in which default settings are previously defined, most users accept the default settings; however, some users enter their own settings. In this case, most users only make decisions on the setting screens displayed in succession, and therefore, if the input areas 382a to 382d used for the item decision operation or the like are larger than the input areas 380a to 380d used for the item selection operation, the operability is improved.
Also, the input operation using the load controller 36 is not limited to the aforementioned menu operation. For example, the load controller 36 can be used to move an object provided in a game space, or cause the object to perform a predetermined motion. In other words, the load controller 36 can be used to perform an operation similar to that which is performed using the conventional input means 26 (a cross-key, a button, etc.). A process performed by this operation is not particularly limited.
As described above, certain example embodiments can execute a process based on a center-of-gravity position, and are applicable to, for example, a game apparatus and a game program.
While certain embodiments have been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is understood that numerous other modifications and variations can be devised without departing from the scope of the invention.
| Number | Date | Country | Kind |
|---|---|---|---|
| 2009-054979 | Mar 2009 | JP | national |
This Application is a continuation of Ser. No. 12/510,437 filed Jul. 28, 2009, now U.S. Pat. No. 8,079,251 which claims the benefit of Japanese Patent Application No. 2009-054979 filed Mar. 9, 2009, the contents of all of which are herein incorporated by reference.
| Number | Name | Date | Kind |
|---|---|---|---|
| 588172 | Peters | Aug 1897 | A |
| 688076 | Ensign | Dec 1901 | A |
| D188376 | Hotkins et al. | Jul 1960 | S |
| 3184962 | Gay | May 1965 | A |
| 3217536 | Motsinger et al. | Nov 1965 | A |
| 3424005 | Brown | Jan 1969 | A |
| 3428312 | Machen | Feb 1969 | A |
| 3712294 | Muller | Jan 1973 | A |
| 3752144 | Weigle, Jr. | Aug 1973 | A |
| 3780817 | Videon | Dec 1973 | A |
| 3826145 | McFarland | Jul 1974 | A |
| 3869007 | Haggstrom et al. | Mar 1975 | A |
| 4058178 | Shinohara et al. | Nov 1977 | A |
| 4104119 | Schilling | Aug 1978 | A |
| 4136682 | Pedotti | Jan 1979 | A |
| 4246783 | Steven et al. | Jan 1981 | A |
| 4296931 | Yokoi | Oct 1981 | A |
| 4337050 | Engalitcheff, Jr. | Jun 1982 | A |
| 4404854 | Krempl et al. | Sep 1983 | A |
| 4488017 | Lee | Dec 1984 | A |
| 4494754 | Wagner, Jr. | Jan 1985 | A |
| 4558757 | Mori et al. | Dec 1985 | A |
| 4569519 | Mattox et al. | Feb 1986 | A |
| 4574899 | Griffin | Mar 1986 | A |
| 4577868 | Kiyonaga | Mar 1986 | A |
| 4598717 | Pedotti | Jul 1986 | A |
| 4607841 | Gala | Aug 1986 | A |
| 4630817 | Buckleu | Dec 1986 | A |
| 4660828 | Weiss | Apr 1987 | A |
| 4680577 | Straayer et al. | Jul 1987 | A |
| 4688444 | Nordstrom | Aug 1987 | A |
| 4691694 | Boyd et al. | Sep 1987 | A |
| 4711447 | Mansfield | Dec 1987 | A |
| 4726435 | Kitagawa et al. | Feb 1988 | A |
| 4739848 | Tulloch | Apr 1988 | A |
| 4742832 | Kauffmann et al. | May 1988 | A |
| 4742932 | Pedragosa | May 1988 | A |
| 4800973 | Angel | Jan 1989 | A |
| 4838173 | Schroeder et al. | Jun 1989 | A |
| 4855704 | Betz | Aug 1989 | A |
| 4880069 | Bradley | Nov 1989 | A |
| 4882677 | Curran | Nov 1989 | A |
| 4893514 | Gronert et al. | Jan 1990 | A |
| 4907797 | Gezari et al. | Mar 1990 | A |
| 4927138 | Ferrari | May 1990 | A |
| 4970486 | Gray et al. | Nov 1990 | A |
| 4982613 | Becker | Jan 1991 | A |
| D318073 | Jang | Jul 1991 | S |
| 5044956 | Behensky et al. | Sep 1991 | A |
| 5049079 | Furtado et al. | Sep 1991 | A |
| 5052406 | Nashner | Oct 1991 | A |
| 5054771 | Mansfield | Oct 1991 | A |
| 5065631 | Ashpitel et al. | Nov 1991 | A |
| 5089960 | Sweeney, Jr. | Feb 1992 | A |
| 5103207 | Kerr et al. | Apr 1992 | A |
| 5104119 | Lynch | Apr 1992 | A |
| 5116296 | Watkins et al. | May 1992 | A |
| 5118112 | Bregman et al. | Jun 1992 | A |
| 5151071 | Jain et al. | Sep 1992 | A |
| 5195746 | Boyd et al. | Mar 1993 | A |
| 5197003 | Moncrief et al. | Mar 1993 | A |
| 5199875 | Trumbull | Apr 1993 | A |
| 5203563 | Loper, III | Apr 1993 | A |
| 5207426 | Inoue et al. | May 1993 | A |
| 5259252 | Kruse et al. | Nov 1993 | A |
| 5269318 | Nashner | Dec 1993 | A |
| 5299810 | Pierce et al. | Apr 1994 | A |
| 5303715 | Nashner et al. | Apr 1994 | A |
| 5360383 | Boren | Nov 1994 | A |
| 5362298 | Brown et al. | Nov 1994 | A |
| 5368546 | Stark et al. | Nov 1994 | A |
| 5405152 | Katanics et al. | Apr 1995 | A |
| 5431569 | Simpkins et al. | Jul 1995 | A |
| 5462503 | Benjamin et al. | Oct 1995 | A |
| 5466200 | Ulrich et al. | Nov 1995 | A |
| 5469740 | French et al. | Nov 1995 | A |
| 5474087 | Nashner | Dec 1995 | A |
| 5476103 | Nahsner | Dec 1995 | A |
| 5507708 | Ma | Apr 1996 | A |
| 5541621 | Nmngani | Jul 1996 | A |
| 5541622 | Engle et al. | Jul 1996 | A |
| 5547439 | Rawls et al. | Aug 1996 | A |
| 5551445 | Nashner | Sep 1996 | A |
| 5551693 | Goto et al. | Sep 1996 | A |
| 5577981 | Jarvik | Nov 1996 | A |
| D376826 | Ashida | Dec 1996 | S |
| 5584700 | Feldman et al. | Dec 1996 | A |
| 5584779 | Knecht et al. | Dec 1996 | A |
| 5591104 | Andrus et al. | Jan 1997 | A |
| 5613690 | McShane et al. | Mar 1997 | A |
| 5623944 | Nashner | Apr 1997 | A |
| 5627327 | Zanakis | May 1997 | A |
| D384115 | Wilkinson et al. | Sep 1997 | S |
| 5669773 | Gluck | Sep 1997 | A |
| 5689285 | Asher | Nov 1997 | A |
| 5690582 | Ulrich et al. | Nov 1997 | A |
| 5697791 | Nashner et al. | Dec 1997 | A |
| 5713794 | Shimojima et al. | Feb 1998 | A |
| 5721566 | Rosenberg et al. | Feb 1998 | A |
| 5746684 | Jordan | May 1998 | A |
| 5785630 | Bobick et al. | Jul 1998 | A |
| D397164 | Goto | Aug 1998 | S |
| 5788618 | Joutras | Aug 1998 | A |
| 5792031 | Alton | Aug 1998 | A |
| 5800314 | Sakakibara et al. | Sep 1998 | A |
| 5805138 | Brawne et al. | Sep 1998 | A |
| 5813958 | Tomita | Sep 1998 | A |
| 5814740 | Cook et al. | Sep 1998 | A |
| 5820462 | Yokoi et al. | Oct 1998 | A |
| 5825308 | Rosenberg | Oct 1998 | A |
| 5837952 | Oshiro et al. | Nov 1998 | A |
| D402317 | Goto | Dec 1998 | S |
| 5846086 | Bizzi et al. | Dec 1998 | A |
| 5853326 | Goto et al. | Dec 1998 | A |
| 5854622 | Brannon | Dec 1998 | A |
| 5860861 | Lipps et al. | Jan 1999 | A |
| 5864333 | O'Heir | Jan 1999 | A |
| 5872438 | Roston | Feb 1999 | A |
| 5886302 | Germanton et al. | Mar 1999 | A |
| 5888172 | Andrus et al. | Mar 1999 | A |
| 5889507 | Engle et al. | Mar 1999 | A |
| D407758 | Isetani et al. | Apr 1999 | S |
| 5890995 | Bobick et al. | Apr 1999 | A |
| 5897457 | Mackovjak | Apr 1999 | A |
| 5897469 | Yalch | Apr 1999 | A |
| 5901612 | Letovsky | May 1999 | A |
| 5902214 | Makikawa et al. | May 1999 | A |
| 5904639 | Smyser et al. | May 1999 | A |
| D411258 | Isetani et al. | Jun 1999 | S |
| 5912659 | Rutledge et al. | Jun 1999 | A |
| 5919092 | Yokoi et al. | Jul 1999 | A |
| 5921780 | Myers | Jul 1999 | A |
| 5921899 | Rose | Jul 1999 | A |
| 5929782 | Stark et al. | Jul 1999 | A |
| 5947824 | Minami et al. | Sep 1999 | A |
| 5976063 | Joutras et al. | Nov 1999 | A |
| 5980256 | Carmein | Nov 1999 | A |
| 5980429 | Nashner | Nov 1999 | A |
| 5984785 | Takeda et al. | Nov 1999 | A |
| 5987982 | Wenman et al. | Nov 1999 | A |
| 5989157 | Walton | Nov 1999 | A |
| 5993356 | Houston et al. | Nov 1999 | A |
| 5997439 | Ohsuga et al. | Dec 1999 | A |
| 6001015 | Nishiumi et al. | Dec 1999 | A |
| 6007428 | Nishiumi et al. | Dec 1999 | A |
| 6010465 | Nashner | Jan 2000 | A |
| D421070 | Jang et al. | Feb 2000 | S |
| 6037927 | Rosenberg | Mar 2000 | A |
| 6038488 | Barnes et al. | Mar 2000 | A |
| 6044772 | Gaudette et al. | Apr 2000 | A |
| 6063046 | Allum | May 2000 | A |
| 6086518 | MacCready, Jr. | Jul 2000 | A |
| 6102803 | Takeda et al. | Aug 2000 | A |
| 6102832 | Tani | Aug 2000 | A |
| D431051 | Goto | Sep 2000 | S |
| 6113237 | Ober et al. | Sep 2000 | A |
| 6147674 | Rosenberg et al. | Nov 2000 | A |
| 6152564 | Ober et al. | Nov 2000 | A |
| D434769 | Goto | Dec 2000 | S |
| D434770 | Goto | Dec 2000 | S |
| 6155926 | Miyamoto et al. | Dec 2000 | A |
| 6162189 | Girone et al. | Dec 2000 | A |
| 6167299 | Galchenkov et al. | Dec 2000 | A |
| 6190287 | Nashner | Feb 2001 | B1 |
| 6200253 | Nishiumi et al. | Mar 2001 | B1 |
| 6203432 | Roberts et al. | Mar 2001 | B1 |
| 6216542 | Stockli et al. | Apr 2001 | B1 |
| 6216547 | Lehtovaara | Apr 2001 | B1 |
| 6220865 | Macri et al. | Apr 2001 | B1 |
| D441369 | Goto | May 2001 | S |
| 6225977 | Li | May 2001 | B1 |
| 6227968 | Suzuki et al. | May 2001 | B1 |
| 6228000 | Jones | May 2001 | B1 |
| 6231444 | Goto | May 2001 | B1 |
| 6239806 | Nishiumi et al. | May 2001 | B1 |
| 6241611 | Takeda et al. | Jun 2001 | B1 |
| 6244987 | Ohsuga et al. | Jun 2001 | B1 |
| D444469 | Goto | Jul 2001 | S |
| 6264558 | Nishiumi et al. | Jul 2001 | B1 |
| 6280361 | Harvey et al. | Aug 2001 | B1 |
| D447968 | Pagnacco et al. | Sep 2001 | S |
| 6295878 | Berme | Oct 2001 | B1 |
| 6296595 | Stark et al. | Oct 2001 | B1 |
| 6325718 | Nishiumi et al. | Dec 2001 | B1 |
| 6330837 | Charles et al. | Dec 2001 | B1 |
| 6336891 | Fedrigon et al. | Jan 2002 | B1 |
| 6353427 | Rosenberg | Mar 2002 | B1 |
| 6354155 | Berme | Mar 2002 | B1 |
| 6357827 | Brightbill et al. | Mar 2002 | B1 |
| 6359613 | Poole | Mar 2002 | B1 |
| D456410 | Ashida | Apr 2002 | S |
| D456854 | Ashida | May 2002 | S |
| D457570 | Brinson | May 2002 | S |
| 6387061 | Nitto | May 2002 | B1 |
| 6388655 | Leung | May 2002 | B1 |
| 6389883 | Berme et al. | May 2002 | B1 |
| 6394905 | Takeda et al. | May 2002 | B1 |
| 6402635 | Nesbit et al. | Jun 2002 | B1 |
| D459727 | Ashida | Jul 2002 | S |
| D460506 | Tamminga et al. | Jul 2002 | S |
| 6421056 | Nishiumi et al. | Jul 2002 | B1 |
| 6436058 | Krahner et al. | Aug 2002 | B1 |
| D462683 | Ashida | Sep 2002 | S |
| 6454679 | Radow | Sep 2002 | B1 |
| 6461297 | Pagnacco et al. | Oct 2002 | B1 |
| 6470302 | Cunningham et al. | Oct 2002 | B1 |
| 6482010 | Marcus et al. | Nov 2002 | B1 |
| 6510749 | Pagnacco et al. | Jan 2003 | B1 |
| 6514145 | Kawabata et al. | Feb 2003 | B1 |
| 6515593 | Stark et al. | Feb 2003 | B1 |
| 6516221 | Hirouchi et al. | Feb 2003 | B1 |
| D471594 | Nojo | Mar 2003 | S |
| 6543769 | Podoloff et al. | Apr 2003 | B1 |
| 6563059 | Lee | May 2003 | B2 |
| 6568334 | Gaudette et al. | May 2003 | B1 |
| 6616579 | Reinbold et al. | Sep 2003 | B1 |
| 6624802 | Klein et al. | Sep 2003 | B1 |
| 6632158 | Nashner | Oct 2003 | B1 |
| 6636161 | Rosenberg | Oct 2003 | B2 |
| 6636197 | Goldenberg et al. | Oct 2003 | B1 |
| 6638175 | Lee et al. | Oct 2003 | B2 |
| 6663058 | Peterson et al. | Dec 2003 | B1 |
| 6676520 | Nishiumi et al. | Jan 2004 | B2 |
| 6676569 | Radow | Jan 2004 | B1 |
| 6679776 | Nishiumi et al. | Jan 2004 | B1 |
| 6697049 | Lu | Feb 2004 | B2 |
| 6719667 | Wong et al. | Apr 2004 | B2 |
| 6726566 | Komata | Apr 2004 | B2 |
| 6764429 | Michalow | Jul 2004 | B1 |
| 6797894 | Montagnino et al. | Sep 2004 | B2 |
| 6811489 | Shimizu et al. | Nov 2004 | B1 |
| 6813966 | Dukart | Nov 2004 | B2 |
| 6817973 | Merril et al. | Nov 2004 | B2 |
| D500100 | Van Der Meer | Dec 2004 | S |
| 6846270 | Etnyre | Jan 2005 | B1 |
| 6859198 | Onodera et al. | Feb 2005 | B2 |
| 6872139 | Sato et al. | Mar 2005 | B2 |
| 6872187 | Stark et al. | Mar 2005 | B1 |
| 6888076 | Hetherington | May 2005 | B2 |
| 6913559 | Smith | Jul 2005 | B2 |
| 6936016 | Berme et al. | Aug 2005 | B2 |
| D510391 | Merril et al. | Oct 2005 | S |
| 6975302 | Ausbeck, Jr. | Dec 2005 | B1 |
| 6978684 | Nurse | Dec 2005 | B2 |
| 6991483 | Milan et al. | Jan 2006 | B1 |
| D514627 | Merril et al. | Feb 2006 | S |
| 7004787 | Milan | Feb 2006 | B2 |
| D517124 | Merril et al. | Mar 2006 | S |
| 7011605 | Shields | Mar 2006 | B2 |
| 7033176 | Feldman et al. | Apr 2006 | B2 |
| 7038855 | French et al. | May 2006 | B2 |
| 7040986 | Koshima et al. | May 2006 | B2 |
| 7070542 | Reyes et al. | Jul 2006 | B2 |
| 7083546 | Zillig et al. | Aug 2006 | B2 |
| 7100439 | Carlucci | Sep 2006 | B2 |
| 7121982 | Feldman | Oct 2006 | B2 |
| 7126584 | Nishiumi et al. | Oct 2006 | B1 |
| 7127376 | Nashner | Oct 2006 | B2 |
| 7163516 | Pagnacco et al. | Jan 2007 | B1 |
| 7179234 | Nashner | Feb 2007 | B2 |
| 7195355 | Nashner | Mar 2007 | B2 |
| 7202424 | Carlucci | Apr 2007 | B2 |
| 7202851 | Cunningham et al. | Apr 2007 | B2 |
| 7270630 | Patterson | Sep 2007 | B1 |
| 7307619 | Cunningham et al. | Dec 2007 | B2 |
| 7308831 | Cunningham et al. | Dec 2007 | B2 |
| 7331226 | Feldman et al. | Feb 2008 | B2 |
| 7335134 | LaVelle | Feb 2008 | B1 |
| RE40427 | Nashner | Jul 2008 | E |
| 7416537 | Stark et al. | Aug 2008 | B1 |
| 7526071 | Drapeau | Apr 2009 | B2 |
| 7530929 | Feldman et al. | May 2009 | B2 |
| 7722501 | Nicolas et al. | May 2010 | B2 |
| 7938751 | Nicolas et al. | May 2011 | B2 |
| 8079251 | Miyanaga | Dec 2011 | B2 |
| 8140228 | McCabe et al. | Mar 2012 | B2 |
| 8152744 | Mukumoto | Apr 2012 | B2 |
| 20010001303 | Ohsuga et al. | May 2001 | A1 |
| 20010018363 | Goto et al. | Aug 2001 | A1 |
| 20010026162 | Nagai et al. | Oct 2001 | A1 |
| 20010050683 | Ishikawa et al. | Dec 2001 | A1 |
| 20020055422 | Airmet et al. | May 2002 | A1 |
| 20020080115 | Onodera et al. | Jun 2002 | A1 |
| 20020185041 | Herbst | Dec 2002 | A1 |
| 20030054327 | Evensen | Mar 2003 | A1 |
| 20030069108 | Kaiserman et al. | Apr 2003 | A1 |
| 20030107502 | Alexander | Jun 2003 | A1 |
| 20030176770 | Merril et al. | Sep 2003 | A1 |
| 20030184517 | Senzui et al. | Oct 2003 | A1 |
| 20030193416 | Ogata et al. | Oct 2003 | A1 |
| 20040038786 | Kuo et al. | Feb 2004 | A1 |
| 20040041787 | Graves | Mar 2004 | A1 |
| 20040077464 | Feldman et al. | Apr 2004 | A1 |
| 20040099513 | Hetherington | May 2004 | A1 |
| 20040110602 | Feldman | Jun 2004 | A1 |
| 20040127337 | Nashner | Jul 2004 | A1 |
| 20040140137 | Selig et al. | Jul 2004 | A1 |
| 20040158380 | Farber et al. | Aug 2004 | A1 |
| 20040163855 | Carlucci | Aug 2004 | A1 |
| 20040180719 | Feldman et al. | Sep 2004 | A1 |
| 20040259688 | Stabile | Dec 2004 | A1 |
| 20050070154 | Milan | Mar 2005 | A1 |
| 20050076161 | Albanna et al. | Apr 2005 | A1 |
| 20050130742 | Feldman et al. | Jun 2005 | A1 |
| 20050202384 | DiCuccio et al. | Sep 2005 | A1 |
| 20060097453 | Feldman et al. | May 2006 | A1 |
| 20060161045 | Merril et al. | Jul 2006 | A1 |
| 20060205565 | Feldman et al. | Sep 2006 | A1 |
| 20060211543 | Feldman et al. | Sep 2006 | A1 |
| 20060217243 | Feldman et al. | Sep 2006 | A1 |
| 20060223634 | Feldman et al. | Oct 2006 | A1 |
| 20060258512 | Nicolas et al. | Nov 2006 | A1 |
| 20070021279 | Jones | Jan 2007 | A1 |
| 20070027369 | Pagnacco et al. | Feb 2007 | A1 |
| 20070155589 | Feldman et al. | Jul 2007 | A1 |
| 20070219050 | Merril | Sep 2007 | A1 |
| 20080012826 | Cunningham et al. | Jan 2008 | A1 |
| 20080228110 | Berme | Sep 2008 | A1 |
| 20080245972 | Drapeau | Oct 2008 | A1 |
| 20080261696 | Yamazaki et al. | Oct 2008 | A1 |
| 20090093305 | Okamoto et al. | Apr 2009 | A1 |
| 20090093315 | Matsunaga et al. | Apr 2009 | A1 |
| 20090094442 | Okamoto et al. | Apr 2009 | A1 |
| 20090107207 | Yamazaki et al. | Apr 2009 | A1 |
| 20090171500 | Matsumoto et al. | Jul 2009 | A1 |
| 20100137063 | Shirakawa et al. | Jun 2010 | A1 |
| 20100224420 | Miyanaga | Sep 2010 | A1 |
| 20100245236 | Takayama | Sep 2010 | A1 |
| 20100265173 | Matsunaga | Oct 2010 | A1 |
| 20110074665 | Konishi | Mar 2011 | A1 |
| 20110077088 | Hayashi et al. | Mar 2011 | A1 |
| 20110077899 | Hayashi et al. | Mar 2011 | A1 |
| Number | Date | Country |
|---|---|---|
| 40 04 554 | Aug 1991 | DE |
| 195 02 918 | Aug 1996 | DE |
| 297 12 785 | Jan 1998 | DE |
| 20 2004 021 792 | May 2011 | DE |
| 20 2004 021 793 | May 2011 | DE |
| 0 275 665 | Jul 1988 | EP |
| 0 299 738 | Jan 1989 | EP |
| 0 335 045 | Oct 1989 | EP |
| 0 519 836 | Dec 1992 | EP |
| 1 043 746 | Oct 2000 | EP |
| 1 120 083 | Aug 2001 | EP |
| 1 257 599 | Aug 2001 | EP |
| 1 870 141 | Dec 2007 | EP |
| 2 472 929 | Jul 1981 | FR |
| 2 587 611 | Mar 1987 | FR |
| 2 604 910 | Apr 1988 | FR |
| 2 647 331 | Nov 1990 | FR |
| 2 792 182 | Oct 2000 | FR |
| 2 801 490 | Jun 2001 | FR |
| 2 811 753 | Jan 2002 | FR |
| 2 906 365 | Mar 2008 | FR |
| 1 209 954 | Oct 1970 | GB |
| 2 288 550 | Oct 1995 | GB |
| 44-23551 | Oct 1969 | JP |
| 55-95758 | Dec 1978 | JP |
| 54-73689 | Jun 1979 | JP |
| 55-113472 | Sep 1980 | JP |
| 55-113473 | Sep 1980 | JP |
| 55-125369 | Sep 1980 | JP |
| 55-149822 | Nov 1980 | JP |
| 55-152431 | Nov 1980 | JP |
| 60-79460 | Jun 1985 | JP |
| 60-153159 | Oct 1985 | JP |
| 61-154689 | Jul 1986 | JP |
| 62-34016 | Feb 1987 | JP |
| 63-158311 | Oct 1988 | JP |
| 63-163855 | Oct 1988 | JP |
| 63-193003 | Dec 1988 | JP |
| 02-102651 | Apr 1990 | JP |
| 2-238327 | Sep 1990 | JP |
| 3-25325 | Feb 1991 | JP |
| 3-103272 | Apr 1991 | JP |
| 03-107959 | Nov 1991 | JP |
| 6-063198 | Mar 1994 | JP |
| 6-282373 | Oct 1994 | JP |
| 7-213741 | Aug 1995 | JP |
| 7-213745 | Aug 1995 | JP |
| 7-241281 | Sep 1995 | JP |
| 7-241282 | Sep 1995 | JP |
| 7-302161 | Nov 1995 | JP |
| 8-43182 | Feb 1996 | JP |
| 08-131594 | May 1996 | JP |
| 08-182774 | Jul 1996 | JP |
| 8-182774 | Jul 1996 | JP |
| 08-184474 | Jul 1996 | JP |
| 8-184474 | Jul 1996 | JP |
| 8-215176 | Aug 1996 | JP |
| 08-244691 | Sep 1996 | JP |
| 2576247 | Jan 1997 | JP |
| 9-120464 | May 1997 | JP |
| 9-168529 | Jun 1997 | JP |
| 9-197951 | Jul 1997 | JP |
| 9-305099 | Nov 1997 | JP |
| 11-309270 | Nov 1999 | JP |
| 2000-146679 | May 2000 | JP |
| U3068681 | May 2000 | JP |
| U3069287 | Jun 2000 | JP |
| 2000-254348 | Sep 2000 | JP |
| 3172738 | Jun 2001 | JP |
| 2001-178845 | Jul 2001 | JP |
| 2001-286451 | Oct 2001 | JP |
| 2002-112984 | Apr 2002 | JP |
| 2002-157081 | May 2002 | JP |
| 2002-253534 | Sep 2002 | JP |
| 2003-79599 | Mar 2003 | JP |
| 2003-235834 | Aug 2003 | JP |
| 2003280807 | Oct 2003 | JP |
| 3722678 | Nov 2005 | JP |
| 2005-334083 | Dec 2005 | JP |
| 3773455 | May 2006 | JP |
| 2006-167094 | Jun 2006 | JP |
| 3818488 | Sep 2006 | JP |
| 2006-284539 | Oct 2006 | JP |
| U3128216 | Dec 2006 | JP |
| 2008-49117 | Mar 2008 | JP |
| 2008071300 | Mar 2008 | JP |
| 2008264195 | Nov 2008 | JP |
| WO 9111221 | Aug 1991 | WO |
| WO 9212768 | Aug 1992 | WO |
| WO 9840843 | Sep 1998 | WO |
| WO 0012041 | Mar 2000 | WO |
| WO 0057387 | Sep 2000 | WO |
| WO 0069523 | Nov 2000 | WO |
| WO 0229375 | Apr 2002 | WO |
| WO 02057885 | Jul 2002 | WO |
| WO 2004051201 | Jun 2004 | WO |
| WO 2004053629 | Jun 2004 | WO |
| WO 2005043322 | May 2005 | WO |
| WO 2008099582 | Aug 2008 | WO |
| Entry |
|---|
| Addlesee, M.D., et al., “The ORL Active Floor,” IEEE—Personal Communications, Oct. 1997. |
| Baek, Seongmin, et al., “Motion Evaluation for VR-based Motion Training,” Eurographics 2001, vol. 20, No. 3, 2001. |
| Biodex Medical Systems, Inc.—Balance System SD Product Information—http://www.biodex.com/rehab/balance/balance—300feat.htm. |
| Chen, I-Chun, et al., “Effects of Balance Training on Hemiplegic Stroke Patients,” Chang Gung Medical Journal, vol. 25, No. 9, pp. 583-590, Sep. 2002. |
| Dingwell, Jonathan, et al., “A Rehabilitation Treadmill with Software for Providing Real-Time Gait Analysis and Visual Feedback,” Transactions of the ASME, Journal of Biomechanical Engineering, 118 (2), pp. 253-255, 1996. |
| Geiger, Ruth Ann, et al., “Balance and Mobility Following Stroke: Effects of Physical Therapy Interventions With and Without Biofeedback/Forceplate Training,” Physical Therapy, vol. 81, No. 4, pp. 995-1005, Apr. 2001. |
| Harikae, Miho, “Visualization of Common People's Behavior in the Barrier Free Environment,” Graduate Thesis—Master of Computer Science and Engineering in the Graduate School of the University of Aizu, Mar. 1999. |
| Hodgins, J.K., “Three-Dimensional Human Running,” Proceedings: 1996 IEEE International Conference on Robotics and Automation, vol. 4, Apr. 1996. |
| Kim, Jong Yun, et al., “Abstract—A New VR Bike System for Balance Rehabilitation Training,” Proceedings: 2001 IEEE Seventh International Conference on Virtual Systems and Multimedia, Oct. 2001. |
| McComas, Joan, et al., “Virtual Reality Applications for Prevention, Disability Awareness, and Physical Therapy Rehabilitation in Neurology: Our Recent Work,” School of Rehabilitation Sciences, University of Ottawa—Neurology Report, vol. 26, No. 2, pp. 55-61, 2002. |
| NeuroCom International, Inc.—Balance Manager Systems/Products—http://resourcesonbalance.com/neurocom/products/index.aspx. |
| NeuroCom International, Inc.—Neurogames—http://resourcesonbalance.com/neurocom/products/NeuroGames.aspx. |
| Nicholas, Deborah S, “Balance Retraining After Stroke Using Force Platform Feedback,” Physical Therapy, vol. 77, No. 5, pp. 553-558, May 1997. |
| Nintendo Co., Ltd.—Aerobic Exercise Rhythm Boxing—http://www.nintendo.co.jp/wii/rfnj/training/aerobics/aerobics07.html. |
| Redfern, Mark, et al., “Visual Influences of Balance,” Journal of Anxiety Disorders, vol. 15, pp. 81-94, 2001. |
| Sackley, Catherine, “Single Blind Randomized Controlled Trial of Visual Feedback After Stroke: Effects on Stance Symmetry and Function,” Disability and Rehabilitation, vol. 19, No. 12, pp. 536-546, 1997. |
| Tossavainen, Timo, et al., “Postural Control as Assessed with Virtual Reality,” Acta Otolaryngol, Suppl 545, pp. 53-56, 2001. |
| Tossavainen, Timo, et al., “Towards Virtual Reality Simulation in Force Platform Posturography,” MEDINFO, pp. 854-857, 2001. |
| Tsutsuguchi, Ken, et al., “Human Walking Animation Based on Foot Reaction Force in the Three-Dimensional Virtual World,” The Journal of Visualization and Computer Animation, vol. 11, pp. 3-16, 2000. |
| Wong, Alice, et al., “The Development and Clinical Evaluation of a Standing Biofeedback Trainer,” Journal of Rehabilitation Research and Development, vol. 34, No. 3, pp. 322-327, Jul. 1997. |
| Yang, Ungyeon, et al., “Implementation and Evaluation of ‘Just Follow Me’: An Immersive, VR-Based, Motion-Training System,” Presence, vol. 11, No. 3, pp. 304-323, 2002. |
| Interface, Inc.—Advanced Force Measurement—SM Calibration Certificate Installation Information, 1984. |
| Hugh Stewart, “Isometric Joystick: A Study of Control by Adolescents and Young Adults with Cerebral Palsy,” The Australian Occupational Therapy Journal, Mar. 1992, vol. 39, No. 1, pp. 33-39. |
| Raghavendra S. Rao, et al., “Evaluation of an Isometric and a Position Joystick in a Target Acquisition Task for Individuals with Cerebral Palsy,” IEEE Transactions on Rehabilitation Engineering, vol. 8, No. 1, Mar. 2000, pp. 118-125. |
| D. Sengupta, et al., “Comparative Evaluation of Control Surfaces for Disabled Patients,” Proceedings of the 27th Annual Conference on Engineering in Medicine and Biology, vol. 16, Oct. 6-10, 1974, p. 356. |
| Ian Bogost, “The Rhetoric of Exergaming,” The Georgia Institute of Technology, 9 pages (date unknown). |
| Ludonauts, “Body Movin',” May 24, 2004, http://web.archive.org/web/20040611131903/http:/www.ludonauts.com; retrieved Aug. 31, 2010, 4 pages. |
| Atari Gaming Headquarters—AGH's Atari Project Puffer Page, http://www.atarihq.com/othersec/puffer/index.html, retrieved Sep. 19, 2002, 4 pages. |
| Michael Antonoff, “Real estate is cheap here, but the places you'd most want to visit are still under construction,” Popular Science, Jun. 1993, pp. 33-34. |
| Steve Aukstakalnis and David Blatner, “The Art and Science of Virtual Reality—Silicon Mirage,” 1992, pp. 197-207. |
| Electronics, edited by Michael Antonoff, “Video Games—Virtual Violence: Boxing Without Bruises,” Popular Science, Apr. 1993, p. 60. |
| Stuart F. Brown, “Video cycle race,” Popular Science, May 1989, p. 73. |
| Scanning the Field for Ideas, “Chair puts Player on the Joystick,” Machine Design, No. 21, Oct. 24, 1991, XP 000255214, 1 page. |
| Francis Hamit, “Virtual Reality and the Exploration of Cyberspace,” University of MD Baltimore County, 1993, 4 pages. |
| Innovation in Action—Biofeedback Motor Control, Active Leg Press—IsoLegPress, 2 pages (date unknown). |
| Ric Manning, “Videogame players get a workout with the Exertainment,” The Gizmo Page from the Courier Journal Sep. 25, 1994, 1 page. |
| Tech Lines, Military—Arcade aces and Aviation—Winging it, Popular Mechanics, Mar. 1982, p. 163. |
| Sarju Shah, “Mad Catz Universal MC2 Racing Wheel: Mad Catz MC2 Universal,” Game Spot, posted Feb. 18, 2005, 3 pages. |
| Joe Skorupa, “Virtual Fitness,” Sports Science, Popular Mechanics, Oct. 1994, 3 pages. |
| AGH Museum—Suncom Aerobics Joystick; http://atarihq.com/museum/2678/hardware/aerobics.html, (retrieved date unknown) 1 page. |
| Nintendo Zone—The History of Nintendo (1889-1997), retrieved Aug. 24, 1998, pp. 1, 9-10. |
| The Legible City, Computergraphic Installation with Dirk Groeneveld, Manhattan version (1989), Amsterdam version (1990), Karlsruhe version (1991), 3 pages. |
| The New Exertainment System. It's All About Giving Your Members Personal Choices, Life Fitness, Circle Reader Service Card No. 28, 1995, 1 page. |
| The Race Begins with $85, Randal Windracer, Circle Reader Service Card No. 34, 1990, 1 page. |
| Universal S-Video/Audio Cable; Product #5015, MSRP 9.99; http://www.madcatz.com/Default.asp?Page=133&CategoryImg=Universal_Cables, retrieved May 12, 2005, 1 page. |
| Tom Dang, et al., “Interactive Video Exercise System for Pediatric Brain Injury Rehabilitation,” Assistive Technology Research Center, Rehabilitation Engineering Service, National Rehabilitation Hospital, Proceedings of the RESNA 20th Annual Conference, Jun. 1998, 3 pages. |
| Linda S. Miller, “Upper Limb Exerciser,” Biometrics Ltd—Unique Solutions for Clinical and Research Applications, 6 pages (date unknown). |
| Raymond W. McGorry, “A system for the measurement of grip forces and applied moments during hand tool use,” Liberty Mutual Research Center for Safety and Health, Applied Ergonomics 32 (2001) 271-279. |
| NordicTrack's Aerobic Cross Trainer advertisement as shown in “Big Ideas—For a Little Money: Great Places to Invest $1,000 or Less,” Kiplinger's Personal Finance Magazine, Jul. 1994, 3 pages. |
| Maurice R. Masliah, “Measuring the Allocation of Control in 6 Degree of Freedom Human-Computer Interaction Tasks,” Graduate Department of Mechanical and Industrial Engineering, University of Toronto, 2001, 177 pages. |
| Leigh Ann Roman, “Boing! Combines Arcade Fun with Physical Training,” Memphis—Health Care News: Monitoring the Pulse of Our Health Care Community, Sep. 20, 1996, One Section, 1 page. |
| “No More Couch Potato Kids,” as shown in Orange Coast, Sep. 1994, p. 16. |
| Gary L. Downey, et al., “Design of an Exercise Arcade for Children with Disabilities,” Resna, Jun. 26-30, 1998, pp. 405-407. |
| Frank Serpas, et al., “Forward-dynamics Simulation of Anterior Cruciate Ligament Forces Developed During Isokinetic Dynamometry,” Computer Methods in Biomechanics and Biomedical Engineering, vol. 5 (1), 2002, pp. 33-43. |
| Carolyn Cosmos, “An ‘Out of Wheelchair Experience’”, The Washington Post, May 2, 2000, 3 pages. |
| “Look Ma! No Hands!”, The Joyboard—Power Body Control, (date unknown). |
| David H. Ahl, “Controller update,” Creative Computing, vol. 9, No. 12, Dec. 1983, p. 142. |
| Ian Bogost, “Water Cooler Games—The Prehistory of Wii Fit,” Videogame Theory, Criticism, Design, Jul. 15, 2007, 2 pages. |
| Jeremy Reimer, “A history of the Amiga, part 2: The birth of Amiga,” last updated Aug. 12, 2007, 2 pages. |
| The Amiga Joyboard (1982) image, Photos: Fun with plastic—peripherals that changed gaming; http://news.cnet.com/2300-27076_3-10001507-2.html (retrieved Jul. 23, 2010), 1 page. |
| The Amiga Power System Joyboard, Amiga history guide, http://www.amigahistory.co.uk/joyboard.html (retrieved Jul. 23, 2010), 2 pages. |
| “Joyboard,” Wikipedia—The free encyclopedia, http://en.wikipedia.org/wiki/Joyboard (retrieved Jul. 26, 2010), 2 pages. |
| “Dance Dance Revolution,” Wikipedia—the free encyclopedia, http://en.wikipedia.org/wiki/Dance_Dance_Revolution (retrieved Jul. 23, 2010), 9 pages. |
| “Cure for the couch potato,” Kansas City Star (MO), Jan. 2, 2005, WLNR 22811884, 1 page. |
| JC Fletcher, “Virtually Overlooked: The Power Pad games,” Joystiq, http://www.joystiq.com/2007/09/20/virtually-overlooked-the-power-pad-games/ (retrieved Jul. 26, 2010), 3 pages. |
| Family Fun Fitness, Nintendo Entertainment System, BANDAI, (date unknown). |
| “Power Pad/Family Fun and Fitness/Family Trainer,” http://www.gamersgraveyard.com/repository/nes/peripherals/powerpad.html (retrieved Jul. 26, 2010), 2 pages. |
| “Power Pad Information,” Version 1.0 (Sep. 23, 1999) http://www.gamersgraveyard.com/repository/nes/peripherals/powerpad.txt (retrieved Jul. 26, 2010), 2 pages. |
| Wii+Power+Pad.jpg (image), http://bpl.blogger.com/_J5LEiGp54I/RpZbNpnLDgl/AAAAAAAAAic/Gum6DD3Umjg/s1600-11/Wii+Power+Pad.jpg (retrieved Jul. 26, 2010), 1 page. |
| Vs. Slalom—Videogame by Nintendo, KLOV—Killer List of Video Games, http://www.arcade-museum.com/game_detail.php?game_id=10368 (retrieved Jul. 26, 2010), 3 pages. |
| “Nintendo Vs. System,” Wikipedia—the free encyclopedia, http://en.wikipedia.org/wiki/Nintendo_Vs._System (retrieved Jul. 26, 2010), 3 pages. |
| Vs. Slalom—Step Up to the Challenge, Nintendo, (date unknown). |
| Vs. Slalom—Live the Thrill, Nintendo, (date unknown). |
| Vs. Slalom—Operation Manual, MDS(MGS), Nintendo, 4 pages, (date unknown). |
| HyperspaceArcade.com—Specialists in Arcade Video Game Repair and Restoration, http://www.hyperspacearcade.com/VSTypes.html (retrieved Jul. 3, 2010), 3 pages. |
| Vs. Slalom—Attachment Pak Manual; for Installation in: VS. UniSystem (Upright) and VS. DualSystem (Upright), TM of Nintendo of America Inc., 1986, 15 pages. |
| Leiterman, “Project Puffer: Jungle River Cruise,” Atari, Inc., 1982, 2 pages. |
| Leiterman, “Project Puffer: Tumbleweeds,” Atari, Inc., 1982, 1 page. |
| Jerry Smith, “Other Input Devices,” Human Interface Technology Laboratory, 2 pages, (date unknown). |
| Trevor Meers, “Virtually There: VR Entertainment Transports Players to Entrancing New Worlds,” Smart Computing, vol. 4, Issue 11, Nov. 1993, 6 pages. |
| “Dance Aerobics,” Moby Games, Feb. 12, 2008, 2 pages. |
| “Hard Drivin',” KLOV—Killer List of Video Games, The International Arcade Museum, http://www.arcade-museum.com, 6 pages, (date unknown). |
| “The World's First Authentic Driving Simulation Game!”, Hard Drivin'—Get Behind the Wheel and Feel the Thrill (image), Atari games Corporation, 1 page, (date unknown). |
| Electronic Entertainment Expo (E3) Overview, Giant Bomb—E3 2004 (video game concept), http://www.giantbomb.com/e3-2004/92-3436/ (retrieved Sep. 3, 2010), 3 pages. |
| Guang Yang Amusement, Product Name: Live Boxer, 1 page, (date unknown). |
| Family Fun Fitness: Basic Set (Control Mat and Athletic World Game Pak), Nintendo Entertainment System, Bandai, (date unknown). |
| Roll & Rocker (image), 1 page, (date unknown). |
| Roll & Rocker, Enteractive (image), 2 pages, (date unknown). |
| Michael Goldstein, “Revolution on Wheels—Thatcher Ulrich,” Nov.-Dec. 1994, 3 pages. |
| “Playboy on the Scene: Ride On!”, 1 page, (date unknown). |
| Candace Putnam, “Software for Hardbodies: A virtual-reality hike machine takes you out on the open road,” Design, 1 page, (date unknown). |
| Rachel, “No-Sweat Exercise—Can you get healthier without really trying?” Fitness, 1 page, (date unknown). |
| Fitness article, Sep. 1994, pp. 402-404. |
| “Wired Top 10: Best Selling Toys in Jun. 1994,” Wired Sep. 1994, 1 page. |
| “Top Skater,” Sega Amusements U.S.A., Inc., 1 page, (date unknown). |
| Katharine Alter, et al., “Video Games for Lower Extremity Strength Training in Pediatric Brain Injury Rehabilitation,” National Rehabilitation Hospital, 18 pages, (date unknown). |
| Cateye Recumbent GameBike Pro: Latest Technology in Exercise Bikes, beyondmoseying.com High Performance Exercise Equipment, 2 pages (advertisement; no date). |
| Fitness Fun, while Exercising and Getting Fit for Kids, Teens and Adults, (advertisement, no date). |
| Warranty Information and Your Joyboard: How it Works, Amiga Corporation, date unknown, 2 pages. |
| Complaint for Patent Infringement, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Northern Division (Apr. 2, 2010), 317 pages. |
| Plaintiff IA Labs CA, LLC's Opening Claim Construction Brief, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Southern Division (Dec. 13, 2010), 36 pages. |
| Nintendo Co., Ltd. and Nintendo of America Inc.'s Opening Claim Construction Brief, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Southern Division (Dec. 13, 2010), 55 pages. |
| Plaintiff IA Labs CA, LLC's Response Claim Construction Brief, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Southern Division (Jan. 6, 2011), 49 pages. |
| Nintendo Co., Ltd. and Nintendo of America Inc.'s Closing Claim Construction Brief, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Southern Division (Jan. 6, 2011), 25 pages. |
| Expert Report of Lee Rawls, Nov. 2, 2010, 37 pages (redacted). |
| Nintendo Co., Ltd. and Nintendo of America's Opposition to IA Labs CA, LLC's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), including the Appendix of Exhibits and Exhibits A-R, 405 pages. |
| Declaration of R. Lee Rawls in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to IA Labs CA, LLC's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), including Exhibits 1, 3-12, 193 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), 7 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Appendix of Exhibits, 2 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 1, 36 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 2, 40 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 3, 85 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 4, 10 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 5, 9 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 6, 17 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 7, 16 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 8, 45 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 9, 4 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 10, 22 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 11, 27 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 12, 3 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 13, 7 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 14, 22 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 15, 45 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 16, 42 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 17, 19 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 18, 27 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 19, 13 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 20, 29 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 21, 25 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 22, 11 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 23, 20 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 24, 7 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 25, 80 pages. |
| Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. P. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 26, 32 pages. |
| Trademark U.S. Appl. No. 74/402,755, filed Jun. 14, 1993, 43 pages. |
| “AccuSway Dual Top: For Balance and Postural Sway Measurement,” AMTI: Force and Motion, ISO 9001:2000, 2 pages. |
| Borzelli G., Cappozzo A., and Papa E., “Inter- and intra-individual variability of ground reaction forces during sit-to-stand with principal component analysis,” Medical Engineering & Physics 21 (1999), pp. 235-240. |
| Chiari L., Cappello A., Lenzi D., and Della Croce U., “An Improved Technique for the Extraction of Stochastic Parameters from Stabilograms,” Gait and Posture 12 (2000), pp. 225-234. |
| Cutlip R., Hsiao H., Garcia R., Becker E., Mayeux B., “A comparison of different postures for scaffold end-frame disassembly,” Applied Ergonomics 31 (2000), pp. 507-513. |
| Davis K.G., Marras W.S., Waters T.R., “Evaluation of spinal loading during lowering and lifting,” The Ohio State University, Biodynamics Laboratory, Clinical Biomechanics vol. 13, No. 3, 1998, pp. 141-152. |
| Rolf G. Jacob, Mark S. Redfern, Joseph M. Furman, “Optic Flow-induced Sway in Anxiety Disorders Associated with Space and Motion Discomfort,” Journal of Anxiety Disorders, vol. 9, No. 5, 1995, pp. 411-425. |
| Jorgensen M.J., Marras W.S., “The effect of lumbar back support tension on trunk muscle activity,” Clinical Biomechanics 15 (2000), pp. 292-294. |
| Deborah L. King and Vladimir M. Zatsiorsky, “Extracting gravity line displacement from stabilographic recordings,” Gait & Posture 6 (1997), pp. 27-38. |
| Kraemer W.J., Volek J.S., Bush J.A., Gotshalk L.A., Wagner P.R., Gómez A.L., Zatsiorsky V.M., Duarte M., Ratamess N.A., Mazzetti S.A., Selle B.J., “Influence of compression hosiery on physiological responses to standing fatigue in women,” The Human Performance Laboratory, Medicine & Science in Sports & Exercise, 2000, pp. 1849-1858. |
| Papa E. and Cappozzo A., “A telescopic inverted-pendulum model of the musculo-skeletal system and its use for the analysis of the sit-to-stand motor task,” Journal of Biomechanics 32 (1999), pp. 1205-1212. |
| Balance System, BalanceTrak 500, & Quantrem, ZapConnect.com: Medical Device Industry Portal, http://www.zapconnect.com/products/index.cfm/fuseaction/products, 2 pages. (Retrieved Apr. 5, 2011). |
| BERTEC: Dominate Your Field, Physician's Quick Guide, Version 1.0.0, Feb. 2010, 13 pages. |
| BERTEC: Dominate Your Field, Balancecheck Screener, Version 1.0.0, Feb. 2010, 35 pages. |
| BERTEC: Dominate Your Field, Balancecheck Trainer, Version 1.0.0, Feb. 2010, 37 pages. |
| BERTEC Corporation—Balancecheck Standard Screener Package, http://bertec.com/products/balance-systems/standard-screener.html, 1 page. (Retrieved Apr. 12, 2011). |
| BERTEC Corporation—Balance Systems: Balancecheck Advanced balance assessment & training products for the balance professional, http://bertec.com/products/balance-systems.html, 1 page. (Retrieved Mar. 31, 2011). |
| BERTEC Corporation—Balancecheck Mobile Screener Package: Portable balance screening with full functionality, http://bertec.com/products/balance-systems/mobile-screener.html, 1 page. (Retrieved Mar. 31, 2011). |
| BERTEC Corporation—Balancecheck Standard Screener & Trainer Package: Advanced balance screening and rehabilitation system, http://bertec.com/products/balance-systems/standard-screener-trainer.html, 1 page. (Retrieved Mar. 31, 2011). |
| Trademark U.S. Appl. No. 75/136,330, filed Jul. 19, 1996, 47 pages. |
| BERTEC: Dominate Your Field, Digital Acquire 4, Version 4.0.10, Mar. 2011, 22 pages. |
| BERTEC: Dominate Your Field, Bertec Force Plates, Version 1.0.0, Sep. 2009, 31 pages. |
| BERTEC: Dominate Your Field, Product Information: Force Plate FP4060-08: Product Details and Specifications, 4 pages. |
| BERTEC: Dominate Your Field, Product Information: Force Plate FP4060-10: Product Details and Specifications, 2 pages. |
| Trademark U.S. Appl. No. 73/542,230, filed Jun. 10, 1985, 52 pages. |
| Brent L. Arnold and Randy J. Schmitz, “Examination of Balance Measures Produced by the Biodex Stability System,” Journal of Athletic Training, vol. 33(4), 1998, pp. 323-327. |
| Trademark Registration No. 1,974,115 filed Mar. 28, 1994, 8 pages. |
| ICS Balance Platform, Fall Prevention: Hearing Assessment, Fitting Systems, Balance Assessment, Otometrics: Madsen, Aurical, ICS, 2 pages. |
| Trademark U.S. Appl. No. 75/471,542, filed Apr. 16, 1998, 102 pages. |
| VTI Force Platform, Zapconnect.com: Medical Device Industry Portal, http://zapconnect.com/products/index.cfm/fuseaction/products, 2 pages. (Retrieved Apr. 5, 2011). |
| Amin M., Girardi M., Konrad H.R., Hughes L., “A Comparison of Electronystagmography Results with Posturography Findings from the BalanceTrak 500,” Otology Neurotology, 23(4), 2002, pp. 488-493. |
| Girardi M., Konrad H.R., Amin M., Hughes L.F., “Predicting Fall Risks in an Elderly Population: Computer Dynamic Posturography Versus Electronystagmography Test Results,” Laryngoscope, 111(9), 2001, pp. 1528-1532. |
| Dr. Guido Pagnacco, Publications, 1997-2008, 3 pages. |
| College of Engineering and Applied Science: Electrical and Computer Engineering, University of Wyoming, Faculty: Guido Pagnacco, http://wwweng.uwyo.edu/electrical/faculty/Pagnacco.html, 2 pages. (Retrieved Apr. 20, 2011). |
| EyeTracker, IDEAS, DIFRA, 510(k) Summary: premarket notification, Jul. 5, 2007, 7 pages. |
| Vestibular technologies, copyright 2000-2004, 1 page. |
| Scopus preview—Scopus—Author details (Pagnacco, Guido), http://www.scopus.com/authid/detail.url?authorId=6603709393, 2 pages. (Retrieved Apr. 20, 2011). |
| Vestibular Technologies Company Page, “Vestibular technologies: Helping People Regain their Balance for Life,” http://www.vestibtech.com/AboutUs.html, 2 pages. (Retrieved Apr. 20, 2011). |
| GN Otometrics Launches ICS Balance Platform: Portable system for measuring postural sway, http://audiologyonline.com/news/pf_news_detail.asp?news_id=3196, 1 page. (Retrieved Mar. 31, 2011). |
| Trademark U.S. Appl. No. 75/508,272, filed Jun. 25, 1998, 36 pages. |
| Trademark U.S. Appl. No. 75/756,991, filed Jul. 21, 1999, 9 pages. |
| Trademark U.S. Appl. No. 76/148,037, filed Oct. 17, 2000, 78 pages. |
| Vestibular technologies, VTI Products: BalanceTRAK User's Guide, Preliminary Version 0.1, 2005, 34 pages. |
| Trademark U.S. Appl. No. 76/148,037, filed Oct. 17, 2000, 57 pages. |
| Vestibular Technologies, Wayback Machine, http://vestibtech.com/balancetrak500.html, 7 pages. (Retrieved Mar. 30, 2011). |
| Vestibular Technologies, 2004 Catalog, 32 pages. |
| The Balance Trak 500—Normative Data, 8 pages. |
| State of Delaware: The Official Website of the First State, Division of Corporations—Online Services, http://delecorp.delaware.gov/tin/controller, 2 pages. (Retrieved Mar. 21, 2011). |
| Memorandum in Support of Plaintiff IA Labs' Motion for Partial Summary Judgment on Defendants' Affirmative Defense and Counterclaim That U.S. Patent No. 7,121,982 is Invalid Under 35 U.S.C. §§ 102 and 103, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (Apr. 27, 2011), 17 pages. |
| “Instructions for ‘Wii Family Trainer’,” NAMCO BANDAI Games Inc., p. 9, p. 11 (Partial Translation—discussed at p. 1 of the specification). |
| Search Report (2 pgs.) dated May 27, 2011 issued in German Application No. 20 2004 021 793.7. |
| Notice of Reasons for Rejection from Japanese Application No. 2009-054979, issued Dec. 10, 2012. |
| Number | Date | Country |
|---|---|---|
| 20120048004 A1 | Mar 2012 | US |
| | Number | Date | Country |
|---|---|---|---|
| Parent | 12510437 | Jul 2009 | US |
| Child | 13289129 | | US |