The disclosure of Japanese Patent Application No. 2008-334795, filed on Dec. 26, 2008, is incorporated herein by reference.
1. Field of the Invention
The present invention relates to a biological information management system, and in particular, to a biological information management system including a first information processing apparatus and a second information processing apparatus capable of communicating with the first information processing apparatus.
2. Description of the Background Art
Conventionally, there is a system that transmits biological information, obtained by an apparatus, to another apparatus. For example, Japanese Patent Laid-open Publication No. 2004-118339 discloses a system in which a pedometer counts the number of steps taken, a mobile phone is connected to the pedometer to obtain step count data, and the step count data is transmitted from the mobile phone to a server through the Internet. Further, Japanese Patent Laid-open Publication No. H11-151211 discloses a system in which a terminal apparatus obtains data of blood pressure, body temperature, and weight, which are measured by health measuring instruments, and transmits the data to a center apparatus through a public line.
However, the system disclosed in Japanese Patent Laid-open Publication No. 2004-118339 has a problem that it is difficult to accurately determine the condition of a user because no message created by the user is transmitted to the server together with the obtained step count data.
Further, in the system disclosed in Japanese Patent Laid-open Publication No. H11-151211, the measured data transmitted to the terminal apparatus is transmitted to the center apparatus with automatic dialing once a day, and if the communication fails, the transmission process is repeated until the communication succeeds. However, the measured data cannot be transmitted at a timing desired by a user, and no message created by the user can be transmitted together with the measured data. Thus, it is impossible to accurately determine the condition of the user.
Therefore, an object of the present invention is to provide a biological information management system capable of transmitting biological information together with a message.
The present invention has the following features to attain the object mentioned above. It is noted that reference characters and the like in parentheses are merely provided to facilitate the understanding of the present invention in relation to the drawings, rather than limiting the scope of the present invention in any way.
A biological information management system of the present invention comprises a first information processing apparatus (12) and a second information processing apparatus (90) capable of communicating with the first information processing apparatus.
The first information processing apparatus includes: biological information obtaining means (40, S82) for obtaining biological information; first message creation means (40, S70 to S79) for creating a first message based on an input operation of an operator of the first information processing apparatus; and first message transmission means (40, S84) for transmitting the biological information obtained by the biological information obtaining means and the first message created by the first message creation means to the second information processing apparatus.
The second information processing apparatus includes: first message reception means (S141) for receiving the biological information and the first message from the first information processing apparatus; and storage means (S142) for storing the biological information and the first message which are received by the first message reception means.
Thus, it is possible for an operator of the second information processing apparatus to know the condition of the operator of the first information processing apparatus because the biological information of the operator can be transmitted together with the message created by the operator of the first information processing apparatus.
Here, the “biological information” may include information regarding a body, such as weight, body fat percentage, and the like; information regarding exercise, such as the number of steps taken, exercise time, and the like; and information regarding vital signs, such as blood pressure, body temperature, heart rate, and the like.
Further, the “first information processing apparatus” and the “second information processing apparatus” may be arbitrary information processing apparatuses, and, for example, may be stationary game apparatuses, hand-held game apparatuses, mobile phones, personal computers, PDAs, and the like.
Further, the “first message” is not limited to a message including only characters, but may be a message including choices, or a message including a character input section or a numeric input section.
Upon receiving a transmission instruction of the first message from the operator of the first information processing apparatus, the first message transmission means may automatically obtain the biological information by the biological information obtaining means and may transmit the obtained biological information together with the first message to the second information processing apparatus.
Thus, when the operator of the first information processing apparatus transmits the message, inputting of the biological information is omitted, and the biological information is assuredly transmitted.
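The flow described above can be sketched as follows. This is a minimal illustrative sketch, not an implementation from the specification; the function and field names (`send_with_biological_info`, `biological_info`, and so on) are hypothetical.

```python
# Hypothetical sketch: upon a transmission instruction, the terminal
# obtains the stored biological information automatically and bundles
# it with the message, so the operator never has to re-enter it.

def send_with_biological_info(message, storage, transmit):
    """Attach automatically obtained biological information to the
    operator's message and hand the bundle to the transmit function."""
    biological_info = storage.get("biological_info", {})  # obtained, not typed in
    payload = {"message": message, "biological_info": biological_info}
    transmit(payload)
    return payload

sent = []
storage = {"biological_info": {"weight_kg": 62.5, "steps": 8400}}
payload = send_with_biological_info("Feeling fine today.", storage, sent.append)
```

Because the biological information is read from storage at transmission time, it accompanies every message regardless of whether the operator remembered it.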
The biological information management system may further comprise a biological information measuring apparatus (36, 92) including: biological information measuring means (36b) for measuring biological information; and biological information transmission means (106) for transmitting the biological information measured by the biological information measuring means to the first information processing apparatus, and the biological information obtaining means may include biological information reception means (50) for receiving the biological information from the biological information measuring apparatus.
Thus, the operator of the first information processing apparatus can carry only the biological information measuring means to a desired place where biological information is to be measured.
Here, the “biological information measuring means” may include a weighing apparatus, a pedometer, a blood-pressure gauge, a clinical thermometer, and the like.
Further, the communication between the “first information processing apparatus” and the “biological information measuring means” may be wired or wireless communication. Further, its communication method is arbitrary. For example, it may be Bluetooth (registered trademark), infrared communication, or another wireless communication method. Further, the communication may be performed through a public communication line such as the Internet, or through a local network. Further, the communication may be performed without using a communication cable.
Further, the biological information obtaining means may include biological information measuring means for measuring biological information.
Thus, it is easy to obtain the biological information because the first information processing apparatus includes the biological information measuring means.
Further, the biological information measuring means may measure a weight, or count the number of steps taken, as the biological information.
Further, the first information processing apparatus may also include process means (40, S21) for executing a process using the biological information measuring means, and process time counting means (40, S22) for counting a time of the process executed by the process means; and the biological information obtaining means may also obtain the time counted by the process time counting means.
Thus, the time, for which the operator of the first information processing apparatus uses the biological information measuring means, can be transmitted as the biological information to the second information processing apparatus.
Further, the first information processing apparatus may also include storage means (40) for storing the biological information obtained by the biological information obtaining means and measurement date information of the biological information, and the first message transmission means may refer to the measurement date information and may transmit biological information of a predetermined period.
Thus, it is possible for the operator of the second information processing apparatus to have knowledge of the transition of the biological information, of the operator of the first information processing apparatus, for the predetermined period.
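Selecting the biological information of a predetermined period by referring to the stored measurement date information can be sketched as follows. This is an illustrative sketch under assumed record and field names (`measured_on`, `weight_kg`), which the specification does not prescribe.

```python
from datetime import date, timedelta

def select_recent(records, today, days):
    """Return only the records whose stored measurement date falls
    within the last `days` days (the predetermined period)."""
    cutoff = today - timedelta(days=days)
    return [r for r in records if r["measured_on"] >= cutoff]

records = [
    {"measured_on": date(2008, 12, 1), "weight_kg": 63.0},
    {"measured_on": date(2008, 12, 20), "weight_kg": 62.4},
    {"measured_on": date(2008, 12, 25), "weight_kg": 62.1},
]
recent = select_recent(records, today=date(2008, 12, 26), days=7)
```

Only the two measurements of the last week are transmitted, which lets the receiver follow the transition of the biological information over that period.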
Further, the first message transmission means may transmit the biological information and the first message as one file to the second information processing apparatus.
Thus, it becomes easy to manage data at the second information processing apparatus.
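Bundling the message and the biological information into one file might look like the following sketch. JSON is used here purely as an illustrative container format; the specification does not name a file format.

```python
import json

def pack_as_one_file(message, biological_info):
    """Serialize the message and the biological information into a
    single file body, so the receiver manages one object, not two."""
    return json.dumps({"message": message, "biological_info": biological_info})

def unpack(file_body):
    """Restore the message and biological information at the receiver."""
    return json.loads(file_body)

body = pack_as_one_file("Slept well.", {"steps": 10200})
restored = unpack(body)
```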
Further, the second information processing apparatus may also include: second message creation means (S30 to S38) for creating a second message based on an input operation of an operator of the second information processing apparatus; and second message transmission means (S41) for transmitting the second message created by the second message creation means to the first information processing apparatus, the first information processing apparatus may also include second message reception means (50) for receiving the second message from the second information processing apparatus, and the first message creation means may create the first message based on the second message received by the second message reception means.
Thus, it becomes possible to exchange detailed information between the operator of the first information processing apparatus and the operator of the second information processing apparatus because the first information processing apparatus creates the first message based on the second message transmitted from the second information processing apparatus, and transmits the first message together with the biological information to the second information processing apparatus.
The “second message” is not limited to a message including only characters, but may be a message including choices, or a message including a character input section or a numeric input section.
Further, the second message may be a question message, the first information processing apparatus may also include question execution means (40, S90, S100, S110) for executing a question based on the question message received by the second message reception means, and the first message creation means may create, as the first message, an answer message to the question executed by the question execution means.
Thus, it becomes possible to exchange on-target information between the operator of the first information processing apparatus and the operator of the second information processing apparatus because the first information processing apparatus creates the answer message to the question message transmitted from the second information processing apparatus, and transmits the answer message to the second information processing apparatus.
A computer-readable storage medium (44) of the present invention stores a computer program (AP1) for a biological information management system comprising a first information processing apparatus (12) and a second information processing apparatus (90) capable of communicating with the first information processing apparatus.
The computer program causes a computer (40) of the first information processing apparatus to function as: biological information obtaining means (S82) for obtaining biological information; first message creation means (S70 to S79) for creating a first message based on an input operation of an operator of the first information processing apparatus; and first message transmission means (S84) for transmitting the biological information obtained by the biological information obtaining means and the first message created by the first message creation means to the second information processing apparatus.
An information processing apparatus of the present invention is used as a first information processing apparatus (12) in a biological information management system comprising the first information processing apparatus and a second information processing apparatus (90) capable of communicating with the first information processing apparatus. The information processing apparatus comprises: biological information obtaining means (40, S82) for obtaining biological information; first message creation means (40, S70 to S79) for creating a first message based on an input operation of an operator of the first information processing apparatus; and first message transmission means (40, S84) for transmitting the biological information obtained by the biological information obtaining means and the first message created by the first message creation means to the second information processing apparatus.
A health guidance support system of the present invention is a health guidance support system for a healthcare professional to perform health guidance for a healthcare recipient. The health guidance support system comprises: a healthcare professional terminal (90) operated by the healthcare professional; and a healthcare recipient terminal (12) operated by the healthcare recipient and capable of communicating with the healthcare professional terminal.
The healthcare professional terminal includes: question message creation means (S30 to S38) for creating a question message in accordance with an input operation of the healthcare professional; and question message transmission means (S41) for transmitting the question message from the healthcare professional terminal to the healthcare recipient terminal.
The healthcare recipient terminal includes: question message reception means (50) for receiving the question message from the healthcare professional terminal; question message displaying means (40, 34, S90, S100, S110) for displaying the received question message to the healthcare recipient; answer message creation means (40, S70 to S79) for creating an answer message to the question message in accordance with an input operation of the healthcare recipient; and answer message transmission means (40, S84) for, upon receiving a transmission instruction from the healthcare recipient, automatically obtaining biological information (weight data, exercise time data, step count data) regarding health of the healthcare recipient from a storage unit (44) that is provided in or connected to the healthcare recipient terminal, and transmitting the biological information together with the created answer message to the healthcare professional terminal.
The healthcare professional terminal further includes: answer message reception means (S141) for receiving the answer message together with the biological information from the healthcare recipient terminal; and answer message displaying means (S145) for displaying the received answer message and the biological information to the healthcare professional.
Thus, when the healthcare recipient transmits the answer message, inputting of the biological information is omitted, and the biological information is assuredly transmitted. In addition, because the healthcare professional can fully confirm the biological information required for health guidance, the healthcare professional can properly and efficiently perform health guidance.
The healthcare recipient terminal may be a game apparatus; and the answer message transmission means may obtain the biological information regarding the health of the healthcare recipient from saved data (D2, D3) of a game executed previously in the healthcare recipient terminal.
Thus, the game apparatus can be used for health guidance, and the user of the game apparatus can readily take health guidance. In addition, because the biological information is obtained from the saved data of the game executed previously in the game apparatus, inputting of the biological information by the user is omitted.
According to the present invention, a biological information management system capable of transmitting biological information together with a message can be provided.
These and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
The following will describe an embodiment of the present invention with reference to the drawings.
(Game System)
First, a game system used in the present embodiment will be described. As shown in
The game apparatus 12 includes a parallelepiped-shaped housing 14 provided with a disc slot 16 on a front surface thereof. Through the disc slot 16, an optical disc 18, which is an example of an information storage medium storing a game program and the like, is inserted, and mounted to a disc drive 54 (see
Further, on an upper portion of the front surface of the housing 14 of the game apparatus 12, a power button 20a and a reset button 20b are provided, and on a lower portion thereof, an eject button 20c is provided. In addition, an external memory card connector cover 28 is provided between the reset button 20b and the eject button 20c and adjacent to the disc slot 16. Inside the external memory card connector cover 28, an external memory card connector 62 (see
As the memory card, a general-purpose SD card can be used, but other general-purpose memory cards, such as a memory stick and a Multi-Media Card (registered trademark), can also be used.
On a rear surface of the housing 14 of the game apparatus 12, an AV connector 58 (see
Electric power is supplied to the game apparatus 12 through a general AC adapter (not shown). The AC adapter is inserted into a household standard wall socket, and converts household power (commercial power) into a low DC-voltage signal suitable for driving the game apparatus 12. In an alternative embodiment, a battery may be used as a power source.
In the game system 10, in order for a user or a player to play a game (it is not limited thereto, and may be another application), the user turns on the game apparatus 12, then selects an appropriate optical disc 18 storing the video game (or the other application that the user desires to play), and loads the optical disc 18 onto the disc drive 54 of the game apparatus 12. Accordingly, the game apparatus 12 starts to execute the video game or the other application based on a program stored on the optical disc 18. The user operates the controller 22 for performing an input to the game apparatus 12. For example, by operating any one of the input means 26, the user starts the game or the other application. Further, in addition to operating the input means 26, by moving the controller 22, an image object (player object) can be moved in a different direction, or a viewpoint (camera position) of the user in a 3D game world can be changed.
The external main memory 46 stores a program, such as a game program and the like, and various data, and is used as a work region and a buffer region for the CPU 40. The ROM/RTC 48 includes a ROM (so-called boot ROM) that stores a program for starting up the game apparatus 12; and a clock circuit for counting time. The disc drive 54 reads program data and texture data from the optical disc 18, and writes these data into a later-described internal main memory 42e or the external main memory 46 under the control of the CPU 40.
The system LSI 42 is provided with an input-output processor (I/O processor) 42a, a GPU (Graphics Processing Unit) 42b, a DSP (Digital Signal Processor) 42c, a VRAM 42d, and the internal main memory 42e. Although not shown in the drawings, these components are connected to each other through an internal bus.
The input-output processor 42a performs transmission and reception of data to and from each component connected to the input-output processor 42a, and downloads data. The transmission and reception of data and the download of data will be described in detail later.
The GPU 42b forms a part of drawing means, receives a graphics command (command for generating graphics) from the CPU 40, and generates an image according to the graphics command. In addition to the graphics command, the CPU 40 provides the GPU 42b with an image generation program required for generating game image data.
Although not shown in the drawings, the VRAM 42d is connected to the GPU 42b as described above. Prior to executing the graphics command, the GPU 42b accesses the VRAM 42d and obtains required data (image data: data such as polygon data, texture data, and the like). The CPU 40 writes image data, required for drawing an image, into the VRAM 42d through the GPU 42b. The GPU 42b accesses the VRAM 42d and generates game image data for drawing an image.
The present embodiment describes a case where the GPU 42b generates game image data. However, in a case of executing an arbitrary application other than the game application, the GPU 42b generates image data for the arbitrary application.
Further, the DSP 42c functions as an audio processor, and generates audio data, which correspond to sound, voice, and music outputted from the speakers 34a, by using sound data and sound waveform (tone color) data which are stored in the internal main memory 42e and the external main memory 46.
The image data and the audio data generated thus are read by the AVIC 56. The AVIC 56 outputs the image data and the audio data to the monitor 34 and the speakers 34a, respectively, through the AV connector 58. Thus, a game image is displayed on the monitor 34, and sound (music) required for the game is outputted from the speakers 34a.
Further, the input-output processor 42a is connected to a flash memory 44, a wireless communication module 50, a wireless controller module 52, an extension connector 60, and the external memory card connector 62. The wireless communication module 50 is connected to an antenna 50a, and the wireless controller module 52 is connected to an antenna 52a.
The input-output processor 42a is capable of communicating with another game apparatus connected to a network and various servers connected to the network, through the wireless communication module 50. Further, the input-output processor 42a is also capable of communicating directly with another game apparatus, without passing through the network. The input-output processor 42a periodically accesses the flash memory 44 to detect whether or not there is data (referred to as transmission data) required to be transmitted to the network. If there is the transmission data, the input-output processor 42a transmits the transmission data to the network through the wireless communication module 50 and the antenna 50a. The input-output processor 42a receives data (referred to as reception data) transmitted from the other game apparatus through the network, the antenna 50a, and the wireless communication module 50, and stores the reception data in the flash memory 44. In a predetermined case, the input-output processor 42a discards the reception data. In addition, the input-output processor 42a receives data downloaded from a download server through the network, the antenna 50a, and the wireless communication module 50, and stores the downloaded data in the flash memory 44.
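The periodic check of the flash memory for queued transmission data can be sketched as a simple polling routine. This is an illustrative model only; the actual processor-level behavior is not specified at this granularity, and all names here are hypothetical.

```python
def poll_and_transmit(flash, send):
    """Model of one polling cycle: if transmission data is queued in
    flash storage, send each item to the network and clear the queue.
    Returns how many items were transmitted this cycle."""
    pending = flash.get("transmission_data", [])
    for item in pending:
        send(item)  # stands in for the wireless communication module
    flash["transmission_data"] = []
    return len(pending)

sent = []
flash = {"transmission_data": [{"to": "server", "body": "step counts"}]}
count = poll_and_transmit(flash, sent.append)
```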
Further, the input-output processor 42a receives input data transmitted from the controller 22 and the load controller 36 through the antenna 52a and the wireless controller module 52, and stores (temporarily stores) the input data in the buffer region of the internal main memory 42e or the external main memory 46. The input data in the buffer region is deleted after being used by a game process executed by the CPU 40.
In the present embodiment, as described above, the wireless controller module 52 communicates with the controller 22 and the load controller 36 in accordance with the Bluetooth standard.
For convenience's sake, the controller 22 and the load controller 36 are shown together as one unit in
The input-output processor 42a is connected to the extension connector 60 and the external memory card connector 62. The extension connector 60 is a connector for an interface such as a USB and an SCSI, and enables connection of a medium such as an external storage medium and connection of a peripheral apparatus such as another controller. Further, instead of the wireless communication module 50, a wired LAN can be used by connecting a wired LAN adapter to the extension connector 60. To the external memory card connector 62, an external storage medium such as a memory card can be connected. Thus, for example, the input-output processor 42a is capable of accessing the external storage medium through the extension connector 60 or the external memory card connector 62 for storing data in the external storage medium and reading data from the external storage medium.
Although not described in detail, the game apparatus 12 (housing 14) is provided with the power button 20a, the reset button 20b, and the eject button 20c as shown in
Even in the standby mode, electric power is supplied to the system LSI 42, but a supply of electric power to the GPU 42b, the DSP 42c, and the VRAM 42d is stopped so as not to drive them, thereby reducing the power consumption.
Further, although not shown in the drawings, a fan is provided in the housing 14 of the game apparatus 12 for discharging heat of the CPU 40, the system LSI 42, and the like. In the standby mode, the fan is stopped.
When the standby mode is not desired, a setting can be made so as not to use the standby mode; in this case, a supply of electric power to all the circuit components is stopped when the power button 20a is turned off.
Further, switching between the normal mode and the standby mode can be performed by means of remote operation such as by turning on/off a power switch 26h (see
The reset button 20b is also connected to the system LSI 42. When the reset button 20b is pressed, the system LSI 42 restarts a boot program of the game apparatus 12. The eject button 20c is connected to the disc drive 54. When the eject button 20c is pressed, the optical disc 18 is ejected from the disc drive 54.
(Controller)
Referring to
The cross key 26a is a cross-shaped four-direction push switch, and includes operation portions corresponding to four directions, frontward (or upward), rearward (or downward), rightward, and leftward, indicated by arrows. By operating any one of the operation portions, the player can perform an instruction of a direction in which a character or an object (a player character or a player object) operable by the player moves, and can perform an instruction of a direction in which a cursor moves.
The 1 button 26b and the 2 button 26c are push-button switches. For example, the 1 button 26b and the 2 button 26c are used for game operations, such as adjustment of a viewing position and a viewing direction, namely, the viewing position and the viewing angle of a virtual camera, when displaying a three-dimensional game image, and the like. Alternatively, the 1 button 26b and the 2 button 26c may be used to perform the same operation as the A button 26d and the B trigger switch 26i, or to perform an auxiliary operation.
The A button 26d is a push-button switch, and is used for performing an instruction, other than a direction instruction, causing the player character or the player object to perform a motion, namely, an arbitrary action such as hitting (punching), throwing, grabbing (obtaining), riding, jumping, and the like. For example, in an action game, an instruction for jumping, punching, moving a weapon, and the like can be performed. Further, in a role-playing game (RPG) or a simulation RPG, an instruction for obtaining an item, selecting or deciding on a weapon or a command, and the like can be performed.
The − button 26e, the HOME button 26f, the + button 26g, and the power switch 26h are also push-button switches. The − button 26e is used for selecting a game mode. The HOME button 26f is used for displaying a game menu (menu screen). The + button 26g is used for starting (restarting) or pausing a game. The power switch 26h is used for turning on/off the game apparatus 12.
In the present embodiment, the controller 22 is not provided with a power switch for turning on/off the controller 22. The controller 22 is turned on by operating any one of the input means 26 of the controller 22, and is automatically turned off when the input means 26 are not operated for a certain time period (e.g. 30 seconds).
The B trigger switch 26i is also a push-button switch, and is used mainly for performing an input simulating a trigger for shooting a bullet, and for designating a position selected by using the controller 22. When the B trigger switch 26i is continuously pressed, a motion and a parameter of the player object can be maintained in a constant state. In a predetermined case, the B trigger switch 26i functions similarly to a normal B button, and is used for canceling an action decided by using the A button 26d, and the like.
Further, as shown in
Further, the controller 22 includes an imaging information calculation section 80 (see
The shapes of the controller 22, and the shapes, the number, and the installation positions of the input means 26 as shown in
The processor 70 conducts the entire control of the controller 22, and transmits (inputs) information (input information) inputted by the input means 26, the acceleration sensor 74, and the imaging information calculation section 80, as input data, to the game apparatus 12 through the wireless module 76 and the antenna 78. At this time, the processor 70 uses the memory 72 as a work region and a buffer region.
Operation signals (operation data) from the aforementioned input means 26 (26a-26i) are inputted to the processor 70, and the processor 70 temporarily stores the operation data in the memory 72.
The acceleration sensor 74 detects acceleration in three directions, i.e., a vertical direction (y axis), a horizontal direction (x axis), and a front-rear direction (z axis). The acceleration sensor 74 is typically an electrostatic capacitance type acceleration sensor, but may be of another type.
For example, the acceleration sensor 74 detects acceleration (ax, ay, az) along the x axis, the y axis, and the z axis every first predetermined time period, and inputs data (acceleration data) of the detected acceleration to the processor 70. For example, the acceleration sensor 74 detects acceleration in each axial direction in a range between −2.0 g and 2.0 g (g denotes the gravitational acceleration; hereinafter, it is the same). The processor 70 detects the acceleration data, provided by the acceleration sensor 74, every second predetermined time period, and temporarily stores the acceleration data in the memory 72. The processor 70 generates input data including at least one of the operation data, the acceleration data, and later-described marker coordinate data, and transmits the generated input data to the game apparatus 12 every third predetermined time period (e.g. 5 msec).
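The periodic assembly of an input-data packet from the latest operation data and the clamped three-axis acceleration can be sketched as follows. The names and the packet layout are hypothetical; only the ±2.0 g measurable range is taken from the description above.

```python
def clamp_axis(value, limit=2.0):
    """Clamp a raw reading to the sensor's measurable range of ±2.0 g."""
    return max(-limit, min(limit, value))

def make_input_packet(operation_data, raw_acceleration):
    """Combine the latest operation data with clamped (ax, ay, az)
    acceleration data into one input-data packet for the console."""
    ax, ay, az = (clamp_axis(v) for v in raw_acceleration)
    return {"operation": operation_data, "acceleration": (ax, ay, az)}

# A swing that momentarily exceeds the range is reported at the limit.
packet = make_input_packet({"button_a": True}, (0.1, -2.7, 1.0))
```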
Although not shown in
By using, for example, the Bluetooth (registered trademark) technology, the wireless module 76 modulates a carrier wave of a predetermined frequency with the operation data and radiates the resultant weak radio signal from the antenna 78. In other words, the input data is modulated by the wireless module 76 into the weak radio signal and transmitted from the antenna 78 (controller 22). The weak radio signal is received by the wireless controller module 52 provided in the aforementioned game apparatus 12. The received weak radio signal is demodulated and decoded, and thus the game apparatus 12 (CPU 40) can obtain the input data from the controller 22. Then, the CPU 40 performs the game process in accordance with the obtained input data and a program (game program).
Further, as described above, the controller 22 is provided with the imaging information calculation section 80. The imaging information calculation section 80 includes an infrared filter 80a, a lens 80b, an image pickup element 80c, and an image processing circuit 80d. The infrared filter 80a allows, among light incident on the front surface of the controller 22, only infrared light to pass therethrough. As described above, the markers 340m and 340n located adjacent to the display screen of the monitor 34 are infrared LEDs that emit infrared light forward from the monitor 34. Thus, images of the markers 340m and 340n can be accurately taken by providing the infrared filter 80a. The lens 80b converges the infrared light that has passed through the infrared filter 80a, and outputs the infrared light to the image pickup element 80c. The image pickup element 80c is a solid-state image pickup element such as a CMOS sensor or a CCD. The image pickup element 80c takes an image of the infrared light collected by the lens 80b. In other words, the image pickup element 80c takes an image of only the infrared light that has passed through the infrared filter 80a. Then, the image pickup element 80c generates image data of the image. Hereinafter, an image taken by the image pickup element 80c is referred to as a taken image. The image data generated by the image pickup element 80c is processed by the image processing circuit 80d. The image processing circuit 80d calculates the position of a target (the markers 340m and 340n) whose image is to be taken, and outputs each coordinate value, representing the position, as imaging data to the processor 70 every fourth predetermined time period. The processing by the image processing circuit 80d will be described later.
(Load Controller)
The stand 36a is formed in a generally parallelepiped shape, and has a generally rectangular shape in top view. For example, the short side of the rectangle is set to about 30 cm, and the long side thereof is set to about 50 cm. The stand 36a has a flat top surface on which the player stands. The stand 36a has at four corners thereof side surfaces that are formed so as to partially project to have a cylindrical shape.
In the stand 36a, the four load sensors 36b are disposed at predetermined intervals. In the present embodiment, the four load sensors 36b are disposed at the periphery of the stand 36a, specifically, at the four corners thereof, respectively. The intervals among the load sensors 36b are set appropriately such that the load sensors 36b can accurately detect the intention of a game operation which is expressed by a manner of exerting a load on the stand 36a by the player.
The support plate 360 includes an upper plate 360a that forms the top surface and a side surface upper portion, a lower plate 360b that forms a bottom surface and a side surface lower portion, and a mid plate 360c provided between the upper plate 360a and the lower plate 360b. The upper plate 360a and the lower plate 360b are formed, for example, by plastic molding, and integrated with each other by means of adhesion. The mid plate 360c is formed, for example, from one metallic plate by press molding. The mid plate 360c is fixed on the four load sensors 36b. The upper plate 360a has a grid-shaped rib (not shown) on the lower surface thereof, and is supported on the mid plate 360c through the rib. Thus, when the player stands on the stand 36a, the load is transmitted through the support plate 360, the load sensors 36b, and the legs 362. As shown by arrows in
The load sensors 36b are load converters that convert inputted loads into electric signals, for example, strain gauge (strain sensor) type load cells. In each load sensor 36b, according to the inputted load, a strain-generating body 370a deforms to generate strain. The strain is converted into a change of electric resistance by a strain sensor 370b attached to the strain-generating body 370a, and further converted into a voltage change. Thus, each load sensor 36b outputs a voltage signal indicative of the inputted load, from its output terminal.
Each load sensor 36b may be a load sensor of another type, such as a tuning fork vibration type, a string vibration type, an electrostatic capacitance type, a piezoelectric type, a magnetic strain type, and a gyro type.
Referring back to
The load controller 36 includes a microcomputer 100 for controlling the operation of the load controller 36. The microcomputer 100 includes a CPU, a ROM, a RAM, and the like, which are not shown in the drawings. The CPU controls the operation of the load controller 36 in accordance with a program stored in the ROM.
The microcomputer 100 is connected to the power button 36c, an AD converter 102, a DC-DC converter 104, and a wireless module 106. The wireless module 106 is connected to an antenna 106a. The four load sensors 36b are shown as load cells 36b in
Further, a battery 110 is contained in the load controller 36 for supplying electric power. In an alternative embodiment, instead of the battery, an AC adapter may be connected to the load controller 36 for supplying commercial power thereto. In this case, instead of the DC-DC converter, a power circuit, which converts alternating current into direct current and lowers the resulting direct current voltage, needs to be provided. In the present embodiment, electric power is supplied directly from the battery to the microcomputer 100 and the wireless module 106. In other words, the electric power is always supplied to the wireless module 106 and some components (the CPU) in the microcomputer 100 to detect whether or not the power button 36c is turned on and whether or not a command for turning on the power (load detection) is transmitted from the game apparatus 12. Meanwhile, the electric power is supplied from the battery 110 through the DC-DC converter 104 to the load sensors 36b, the AD converter 102, and the amplifiers 108. The DC-DC converter 104 converts a voltage value of direct current from the battery 110 into a different voltage value, and provides the resultant direct current to the load sensors 36b, the AD converter 102, and the amplifiers 108.
A supply of electric power to the load sensors 36b, the AD converter 102, and the amplifiers 108 may be conducted according to need by controlling the DC-DC converter 104 by the microcomputer 100. In other words, when it is determined that the load sensors 36b need to be activated to detect loads, the microcomputer 100 may control the DC-DC converter 104 so as to supply electric power to the load sensors 36b, the AD converter 102, and the amplifiers 108.
When the electric power is supplied, each load sensor 36b outputs a signal indicative of the inputted load. The signal is amplified by the corresponding amplifier 108, converted from the analog signal into digital data by the AD converter 102, and inputted to the microcomputer 100. Identification information of each load sensor 36b is assigned to a detection value of each load sensor 36b such that it is possible to identify by which load sensor 36b the detection value is detected. As described above, the microcomputer 100 can obtain data indicative of each load detection value of the four load sensors 36b at the same time.
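A minimal sketch of how the detection values might be tagged with sensor identification so the receiver can tell which load sensor 36b produced each value; the identifier strings and the fixed polling order are assumptions for illustration.

```python
# Assumed fixed order in which the microcomputer polls the four sensors.
SENSOR_IDS = ("front_left", "front_right", "rear_left", "rear_right")

def read_load_values(raw_samples):
    """raw_samples: the four digital detection values from the AD
    converter, in polling order. Each value is tagged with its
    sensor's identification information."""
    return dict(zip(SENSOR_IDS, raw_samples))
```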
On the other hand, when it is determined that the load sensors 36b do not need to be activated, namely, that it is not a timing of load detection, the microcomputer 100 controls the DC-DC converter 104 so as to stop the supply of electric power to the load sensors 36b, the AD converter 102, and the amplifiers 108. As described above, in the load controller 36, because the load sensors 36b are activated to detect loads only when necessary, power consumption for load detection can be reduced.
A time when load detection is needed is typically a time when the game apparatus 12 (see
Alternatively, the microcomputer 100 may determine a timing of load detection every constant time period, and control the DC-DC converter 104. When such periodical load detection is conducted, information regarding the constant time period may be initially provided from the game apparatus 12 to the microcomputer 100 of the load controller 36 and stored therein, or may be stored in the microcomputer 100 in advance.
Data indicative of the detection values from the load sensors 36b is transmitted as operation data (input data) of the load controller 36 from the microcomputer 100 to the game apparatus 12 (see
It is noted that the wireless module 106 is set so as to perform communication according to the same wireless standard (Bluetooth, a wireless LAN, and the like) as that for the wireless controller module 52 of the game apparatus 12. Thus, the CPU 40 of the game apparatus 12 is capable of transmitting a load obtaining command to the load controller 36 through the wireless controller module 52 and the like. The microcomputer 100 of the load controller 36 is capable of receiving the command from the game apparatus 12 through the wireless module 106 and the antenna 106a, and transmitting input data including a load detection value (or a load calculation value) of each load sensor 36b to the game apparatus 12.
In the case of a game executed based on the total of four load values detected by the four load sensors 36b, the player can stand at any position on the load controller 36 with respect to the four load sensors 36b, in other words, the player can play the game while standing at any position on the stand 36a and in any facing direction. However, depending on types of games, a process needs to be executed while identifying which direction a load value detected by each load sensor 36b comes from with respect to the player, namely, it is necessary to know the positional relation between the four load sensors 36b of the load controller 36 and the player. In this case, for example, a positional relation between the four load sensors 36b and the player may be defined in advance, and it may be postulated that the player stands on the stand 36a so as to meet this predetermined positional relation. Typically, a positional relation in which two load sensors 36b are present on each of right and left sides or each of front and rear sides of the player standing at the center of the stand 36a, namely, a positional relation in which, when the player stands at the center of the stand 36a of the load controller 36, the load sensors 36b are present in the front right direction, the front left direction, the rear right direction, and the rear left direction from the player, respectively, is defined. In this case, in the present embodiment, because the stand 36a of the load controller 36 is formed in a rectangular shape in plan view and the power button 36c is provided at one side (long side) of the rectangle, it is defined in advance, using the power button 36c as a mark, that the player stands on the stand 36a such that the long side at which the power button 36c is provided is present in a predetermined direction (front, rear, left, or right) from the player. 
By doing so, a load value detected by each load sensor 36b becomes a load value in a predetermined direction (right front, left front, right rear, and left rear) from the player. Thus, the load controller 36 and the game apparatus 12 can identify which direction from the player each load detection value corresponds to, based on the identification information of each load sensor 36b which is included in the load detection value data and preset (prestored) position data indicative of a direction from the player to each load sensor 36b. Accordingly, it is possible to know the intention of a game operation, such as operation directions of front, rear, left, and right, which is expressed by the player.
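Under the predefined positional relation above, inferring the player's operation direction from the four tagged load values could look like the following sketch; the key names and the tie handling are assumptions, not the patent's actual process.

```python
def lean_direction(loads):
    """Infer the player's operation direction from four corner load
    values keyed by their direction from the player."""
    right = loads["front_right"] + loads["rear_right"]
    left = loads["front_left"] + loads["rear_left"]
    front = loads["front_left"] + loads["front_right"]
    rear = loads["rear_left"] + loads["rear_right"]
    lr = "right" if right > left else "left" if left > right else "center"
    fb = "front" if front > rear else "rear" if rear > front else "center"
    return lr, fb
```

Comparing left/right and front/rear sums in this way yields the operation directions (front, rear, left, and right) expressed by the player's weight shift.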
The position of each load sensor 36b with respect to the player may not be defined in advance, and may be set by an input performed by the player at initial setting or at setting during a game. For example, the positional relation of each load sensor 36b with respect to the player can be specified by displaying an image instructing the player to stand on a portion present in a predetermined direction (left front, right front, left rear, or right rear) from the player, and obtaining load values. Position data obtained by this setting may be generated and stored. Alternatively, a screen for selecting a position on the load controller 36 may be displayed on the monitor 34, and the player may be made to select in which direction from the player a mark (the power button 36c) is present, by an input with the controller 22. In accordance with this selection, position data of each load sensor 36b may be generated and stored.
(Game Play)
(Pointing)
When the position and the pointing direction of the controller 22 are out of this range, it becomes impossible to perform the game operations based on the position and the pointing direction of the controller 22. Hereinafter, this range is referred to as the “operable range”.
When the controller 22 is held in the operable range, an image of each of the markers 340m and 340n is taken by the imaging information calculation section 80. In other words, a taken image obtained by the image pickup element 80c includes the image (target image) of each of the markers 340m and 340n that are targets whose images are to be taken.
In the imaging data of the taken image, the target images appear as regions that have a high brightness. Thus, first, the image processing circuit 80d detects the high brightness regions as candidates of the target images. Next, the image processing circuit 80d determines whether or not the high brightness regions indicate the target images, based on the sizes of the detected high brightness regions. The taken image may include images other than the target images due to sunlight incoming through a window and light from a fluorescent lamp in a room, in addition to the images 340m′ and 340n′ of the two markers 340m and 340n that are target images. The process of determining whether or not the high brightness regions indicate the target images is executed for distinguishing the images 340m′ and 340n′ of the markers 340m and 340n, which are target images, from the other images, and accurately detecting the target images. Specifically, in the determination process, whether or not the detected high brightness regions have a size in a predetermined range is determined. When the high brightness regions have a size in the predetermined range, it is determined that the high brightness regions indicate target images. When the high brightness regions do not have a size in the predetermined range, it is determined that the high brightness regions indicate images other than the target images.
Further, the image processing circuit 80d calculates the positions of the high brightness regions that are determined to indicate the target images as the result of the determination process. Specifically, the image processing circuit 80d calculates the positions of the centers of the high brightness regions. Here, the coordinates of the positions of the centers are referred to as marker coordinates. The positions of the centers can be calculated on a scale smaller than the resolution of the image pickup element 80c. Here, the resolution of an image taken by the image pickup element 80c is 126×96, and the positions of the centers are calculated on a scale of 1024×768. In other words, the coordinates of the positions of the centers are represented by integer values of (0,0) to (1024,768).
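The size-range determination and the center calculation described above might be sketched as follows. The size bounds are assumptions for illustration; the sensor resolution of 126×96 and the 1024×768 marker-coordinate scale follow the description.

```python
CAMERA_W, CAMERA_H = 126, 96   # image pickup element resolution (as described)
SCALE_W, SCALE_H = 1024, 768   # finer scale used for marker coordinates
MIN_PIX, MAX_PIX = 2, 40       # assumed size range for a target image

def marker_coordinates(regions):
    """regions: (pixel_count, cx, cy) high-brightness blobs with centers
    in sensor pixels. Keep only blobs whose size falls in the
    predetermined range and report their centers on the finer scale."""
    coords = []
    for pixels, cx, cy in regions:
        if MIN_PIX <= pixels <= MAX_PIX:
            coords.append((round(cx * SCALE_W / CAMERA_W),
                           round(cy * SCALE_H / CAMERA_H)))
    return coords
```

A large blob such as a sunlit window is rejected by the size test, while the two marker images survive and yield two marker coordinates.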
It is noted that a position in the taken image is represented by a coordinate system (xy coordinate system) whose origin is at the upper left corner of the taken image, whose y axis positive direction is the downward direction, and whose x axis positive direction is the rightward direction.
When the target images are correctly detected, two high brightness regions are determined to indicate target images by the determination process, and thus the marker coordinates of two positions are calculated. The image processing circuit 80d outputs data indicative of the calculated marker coordinates of the two positions. The outputted data (marker coordinate data) of the marker coordinates is caused to be included in input data by the processor 70, and transmitted to the game apparatus 12 as described above.
When detecting the marker coordinate data from the received input data, the game apparatus 12 (CPU 40) can calculate the pointing position (pointing coordinate) of the controller 22 on the screen of the monitor 34 and the distances between the controller 22 and the markers 340m and 340n based on the marker coordinate data. Specifically, based on the position of the midpoint between the two marker coordinates, the position to which the controller 22 points, namely, the pointing position of the controller 22, is calculated. Thus, the controller 22 functions as a pointing device to point at an arbitrary position on the screen of the monitor 34. Because the distance between the target images in the taken image changes in accordance with the distances between the controller 22 and the markers 340m and 340n, it is possible for the game apparatus 12 to know the distances between the controller 22 and the markers 340m and 340n by calculating the distance between the two marker coordinates.
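The midpoint and distance calculations can be sketched as below; the proportionality constant `k` relating marker separation to physical distance is an assumed placeholder, not a value from the description.

```python
def pointing_and_distance(m1, m2, k=1000.0):
    """m1, m2: the two marker coordinates. The pointing position is
    derived from their midpoint; the controller-to-marker distance is
    taken to vary inversely with the marker separation (k is an
    assumed proportionality constant)."""
    mid = ((m1[0] + m2[0]) / 2, (m1[1] + m2[1]) / 2)
    sep = ((m1[0] - m2[0]) ** 2 + (m1[1] - m2[1]) ** 2) ** 0.5
    distance = k / sep if sep else float("inf")
    return mid, distance
```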
The healthcare professional terminal 90 and the game apparatus 12 are capable of communicating with each other, for example, through the Internet. The healthcare professional terminal 90 creates a question message based on an input operation of the healthcare professional, and transmits the question message to the game apparatus 12. The question message is a message containing questions that the healthcare professional asks the healthcare recipient, for example, for confirming the health condition and the number of meals of the healthcare recipient.
The game apparatus 12 creates an answer message based on an input operation of the healthcare recipient, and transmits the answer message to the healthcare professional terminal 90 together with biological information of the healthcare recipient. The answer message is a message containing answers to the questions contained in the question message received from the healthcare professional terminal 90. The biological information is various information useful for health guidance, including: information regarding a body, such as weight, body fat percentage, and the like; information regarding exercise, such as the number of steps taken, exercise time, and the like; and information regarding vital signs, such as blood pressure, body temperature, heart rate, and the like. The biological information is automatically transmitted by the game apparatus 12 when the answer message is transmitted to the healthcare professional terminal 90. Thus, an operation, such as an operation of attaching the biological information to the answer message by the healthcare recipient, is not needed.
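The automatic bundling of the answer message with the biological information might look like the following sketch; the dictionary layout is an assumption, since the description does not specify a wire format.

```python
def build_transmission(answer_message, weight_data, exercise_data, step_data):
    """Bundle the answer message with the biological information so
    that the healthcare recipient never performs a manual attach step."""
    return {
        "answer": answer_message,
        "biological": {
            "weight": weight_data,            # e.g. (date, kg) pairs
            "exercise_time": exercise_data,   # e.g. (date, minutes) pairs
            "steps": step_data,               # e.g. (date, count) pairs
        },
    }
```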
(Flow of Biological Information)
A health guidance support application is an application for causing the game apparatus 12 to function as a part of the health guidance support system, and causes the CPU 40 of the game apparatus 12 to execute a process of receiving a question message from the healthcare professional terminal 90, a process of creating an answer message, a process of obtaining biological information, and a process of transmitting the answer message and the biological information. The weight data and the exercise time data are obtained from saved data of the weight measuring application and the exercise game application, which are stored in the flash memory 44 of the game apparatus 12, and the step count data is obtained from the pedometer 92 by means of communication. When the step count data has been already obtained from the pedometer 92 by means of communication and stored in the flash memory 44, the step count data can be obtained from the flash memory 44.
(Data in Flash Memory)
In addition to the applications, various data, such as registered user information D1, weight measuring application saved data D2, exercise game application saved data D3, step count data D4, unanswered question message data D5, transmitted answer message data D6, and the like, are stored in the flash memory 44.
The registered user information D1 is information (name, nickname, icon, and the like) regarding the user of the game apparatus 12, and in the present embodiment, three users A to C are registered as users of the game apparatus 12. In other words, as the registered user information D1, user information D1a of the user A, user information D1b of the user B, and user information D1c of the user C are stored in the flash memory 44.
The weight measuring application saved data D2 is saved data that is generated or updated when the weight measuring application AP2 is executed in the game apparatus 12, and includes weight data of each user. Here, the weight measuring application saved data D2 includes weight data D2a of the user A, weight data D2b of the user B, and weight data D2c of the user C. In the weight data, for example, measuring results of a plurality of times for past several months as well as the latest measuring results are registered together with their measurement dates.
The exercise game application saved data D3 is saved data that is generated or updated when the exercise game application AP3 is executed in the game apparatus 12, and includes exercise time data of each user. Here, the exercise game application saved data D3 includes exercise time data D3a of the user A, exercise time data D3b of the user B, and exercise time data D3c of the user C. In the exercise time data, for example, measuring results of a plurality of times for past several months as well as the latest measuring results are registered together with their measurement dates.
The step count data D4 is data to be transmitted to the game apparatus 12 from the pedometer 92 carried by the user, and is generated or updated when the step count data is transmitted from the pedometer 92 to the game apparatus 12. The pedometer 92 has a step count function of counting the number of steps taken by the user, a step count data storing function of storing the counted number of steps and the measurement data as step count data in a built-in storage unit, and a step count data transmission function of transmitting the step count data stored in the built-in storage unit to the game apparatus 12. Here, the step count data D4 includes step count data D4a of the user A, step count data D4b of the user B, and step count data D4c of the user C.
The unanswered question message data D5 is a group of unanswered question messages received by the game apparatus 12 from the healthcare professional terminal 90. Upon receiving a question message from the healthcare professional terminal 90, the game apparatus 12 stores the question message as a part of the unanswered question message data D5 in the flash memory 44.
The transmitted answer message data D6 is a group of answer messages transmitted from the game apparatus 12 to the healthcare professional terminal 90.
(Weight Measuring Process in Game Apparatus)
First, an operation of the game apparatus 12 when the weight measuring application AP2 is executed in the game apparatus 12 will be described.
When the weight measuring process is started, at a step S10, the CPU 40 refers to the registered user information D1, and selects a user whose weight is to be measured, from among the users registered to the game apparatus 12. The selection of the user is performed based on an input operation (e.g. an input signal from the controller 22) of the user. It is noted that when the user whose weight is to be measured has not been registered to the game apparatus 12, a process of registering the user is executed according to need.
At a step S11, the CPU 40 determines, based on a signal from the load controller 36, whether or not the load controller 36 has detected a load. Then, when the load controller 36 has detected a load, the processing proceeds to a step S12. When the load controller 36 has not detected a load, the CPU 40 waits until a load is detected.
At the step S12, the CPU 40 measures, based on the signal from the load controller 36, the weight of the user standing on the load controller 36.
At a step S13, the CPU 40 stores the weight measured at the step S12 together with the measurement date as weight data of the weighed user (i.e. the user selected at the step S10) in the flash memory 44. It is noted that when weight data of the user has been present in the flash memory 44, the latest measuring result is added to the weight data. Then, the weight measuring process ends.
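Steps S10 to S13 can be summarized in the following sketch, with the user-selection, load-waiting, and measuring operations passed in as callables; all names here are illustrative, not the actual implementation of the weight measuring application AP2.

```python
def weight_measuring_process(users, select, wait_for_load, measure,
                             storage, today):
    """S10: select the user; S11: wait until the load controller
    detects a load; S12: measure the weight; S13: append
    (date, weight) to the user's weight data, creating it if absent."""
    user = select(users)                                   # S10
    wait_for_load()                                        # S11
    weight = measure()                                     # S12
    storage.setdefault(user, []).append((today, weight))   # S13
    return storage
```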
(Exercise Time Measuring Process in Game Apparatus)
The following will describe an operation of the game apparatus 12 when the exercise game application AP3 is executed in the game apparatus 12.
When the exercise time measuring process is started, at a step S20, the CPU 40 refers to the registered user information D1, and selects a user who is to play an exercise game, from among the users registered to the game apparatus 12. The selection of the user is performed based on an input operation (e.g. an input signal from the controller 22) of the user. It is noted that when the user who is to play the exercise game has not been registered to the game apparatus 12, the process of registering the user is executed according to need.
At a step S21, the CPU 40 starts the exercise game using the controller 22 or the load controller 36. Examples of the exercise game include: a boxing game in which a player shadow-boxes holding the controllers 22 with his or her hands; a hiking game in which a player repeatedly steps on the load controller 36; and the like.
At a step S22, the CPU 40 starts counting an exercise time.
At a step S23, the CPU 40 determines whether or not the exercise game has ended. When the exercise game has ended, the processing proceeds to a step S24. When the exercise game continues, the CPU 40 waits until the exercise game ends.
At the step S24, the CPU 40 stops counting the exercise time.
At a step S25, the CPU 40 stores the counted exercise time together with the measurement date as exercise time data of the user who plays the exercise game (i.e. the user selected at the step S20) in the flash memory 44. It is noted that when exercise time data of the user has been present in the flash memory 44, the latest measuring result is added to the exercise time data. Then, the exercise time measuring process ends. Instead of storing the counted exercise time as exercise time data, a value obtained by multiplying the counted exercise time by a coefficient according to an exercise load exerted on the user playing the exercise game may be stored as exercise time data.
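The optional scaling of the counted exercise time by an exercise-load coefficient reduces to a one-line computation; the coefficient values themselves are assumptions, as the description does not specify them.

```python
def stored_exercise_time(counted_seconds, load_coefficient=1.0):
    """Scale the counted time by a coefficient reflecting the exercise
    load before storing it; a coefficient of 1.0 stores the raw time."""
    return counted_seconds * load_coefficient
```

For instance, a strenuous boxing game might use a larger coefficient than a light stepping game, so the same counted time is stored as a larger effective exercise time.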
(Question Process in Healthcare Professional Terminal)
The following will describe an operation of the game apparatus 12 when a question message creation application is executed in the healthcare professional terminal 90. The question message creation application is typically supplied to the healthcare professional terminal 90 through a computer-readable storage medium such as an optical disc and the like or through the Internet, and installed in a hard disk of the healthcare professional terminal 90.
When the question process is started, at a step S30, the CPU of the healthcare professional terminal 90 selects a format of a question or a message to be created, based on an input operation by the healthcare professional. In the present embodiment, a question message transmitted from the healthcare professional terminal 90 to the game apparatus 12 contains a group of one or more questions or messages. Specifically, the healthcare professional can create a question message by arbitrarily combining a “multiple-choice question”, a “character-input-type question”, a “numeric-input-type question”, and a “message” according to need. The “multiple-choice question” is a question including a question section Q1 and a choice section A1 as shown in
At a step S31, the CPU of the healthcare professional terminal 90 determines whether or not the format selected at the step S30 is the “multiple-choice question”. When the format is the “multiple-choice question”, the processing proceeds to a step S32. When the format is not the “multiple-choice question”, the processing proceeds to a step S33.
At the step S32, the CPU of the healthcare professional terminal 90 executes a multiple-choice question creation process. Specifically, in accordance with an input operation by the healthcare professional, the CPU of the healthcare professional terminal 90 creates a sentence in the question section Q1 and choices in the choice section A1 as shown in
At the step S33, the CPU of the healthcare professional terminal 90 determines whether or not the format selected at the step S30 is the “character-input-type question”. When the format is the “character-input-type question”, the processing proceeds to a step S34. When the format is not the “character-input-type question”, the processing proceeds to a step S35.
At the step S34, the CPU of the healthcare professional terminal 90 executes a character-input-type question creation process. Specifically, in accordance with an input operation by the healthcare professional, the CPU of the healthcare professional terminal 90 creates a sentence in the question section Q2 as shown in
At the step S35, the CPU of the healthcare professional terminal 90 determines whether or not the format selected at the step S30 is the “numeric-input-type question”. When the format is the “numeric-input-type question”, the processing proceeds to a step S36. When the format is not the “numeric-input-type question”, the processing proceeds to a step S37.
At the step S36, the CPU of the healthcare professional terminal 90 executes a numeric-input-type question creation process. Specifically, in accordance with an input operation by the healthcare professional, the CPU of the healthcare professional terminal 90 creates sentences in the question section Q3 and phrases in the numeric input section A3 as shown in
At the step S37, the CPU of the healthcare professional terminal 90 executes a message creation process. Specifically, in accordance with an input operation by the healthcare professional, the CPU of the healthcare professional terminal 90 creates sentences in the message section M1 as shown in
At the step S38, the CPU of the healthcare professional terminal 90 determines, based on an input operation by the healthcare professional, whether or not to create the next question or message. When the CPU of the healthcare professional terminal 90 creates the next question or message, the processing returns to the step S30. When the CPU of the healthcare professional terminal 90 does not create the next question or message, the processing proceeds to a step S39.
At the step S39, the CPU of the healthcare professional terminal 90 determines, based on an input operation by the healthcare professional, whether to transmit the created question message to the game apparatus 12 of the healthcare recipient. When the CPU of the healthcare professional terminal 90 transmits the created question message, the processing proceeds to a step S40. When the CPU of the healthcare professional terminal 90 does not transmit the created question message, the question process ends.
At the step S40, the CPU of the healthcare professional terminal 90 generates a transmission file by forming the created question message (a group of one or more questions or messages) into one file.
At a step S41, the CPU of the healthcare professional terminal 90 transmits the transmission file generated at the step S40 to the game apparatus 12 of the healthcare recipient. Then, the question process ends. Regarding transmission of a file from the healthcare professional terminal 90 to the game apparatus 12, for example, a generated transmission file can be transmitted as an attached file of an e-mail message by using a general transfer protocol. In this case, the e-mail message may be blank or a fixed phrase may be contained in the e-mail message.
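Forming the created questions and messages into one transmission file (step S40) could be sketched as follows; JSON is chosen only for illustration, since the description does not specify the file format.

```python
import json

def make_transmission_file(entries):
    """entries: the created questions/messages in order, each a dict
    such as {"format": "multiple-choice", "question": ...,
    "choices": [...]}. They are formed into one file body so the whole
    question message travels as a single attachment."""
    return json.dumps({"questions": entries})
```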
(Main Process in Game Apparatus)
The following will describe an operation of the game apparatus 12 when the health guidance support application AP1 is executed in the game apparatus 12.
When the main process is started, at a step S50, the CPU 40 refers to the registered user information D1, and selects a user who is to use the health guidance support application, from among the users registered to the game apparatus 12. The selection of the user is performed based on an input operation (e.g. an input signal from the controller 22) of the user. It is noted that when the user who is to use the health guidance support application has not been registered to the game apparatus 12, the process of registering the user is executed according to need.
At a step S51, the CPU 40 loads data (unanswered question message data D5 to the user, and the like) of the user, selected at the step S50, from the flash memory 44 to the internal main memory 42e according to need.
At a step S52, the CPU 40 displays a menu screen on the monitor 34. In the menu screen, for example, an “ANSWER QUESTION!” button B1, a “SEND E-MAIL” button B2, an “ANSWER RECORD” button B3, and a “RECEIVE STEP COUNT DATA” button B4 are displayed as shown in
At a step S53, the CPU 40 determines whether or not the “ANSWER QUESTION!” button B1 has been selected. When the “ANSWER QUESTION!” button B1 has been selected, the processing proceeds to a step S54. When the “ANSWER QUESTION!” button B1 has not been selected, the processing proceeds to a step S55.
At the step S54, the CPU 40 executes an answer process. The answer process is a process of creating an answer message to a question message from the healthcare professional, and of transmitting the answer message to the healthcare professional terminal 90. The answer process will be described in detail later. When the answer process ends, the processing returns to the step S52.
At the step S55, the CPU 40 determines whether or not the “SEND E-MAIL” button B2 has been selected. When the “SEND E-MAIL” button B2 has been selected, the processing proceeds to a step S56. When the “SEND E-MAIL” button B2 has not been selected, the processing proceeds to a step S57.
At the step S56, the CPU 40 executes an e-mail transmission process. The e-mail transmission process is a process of creating a message (discussion about health maintenance) to the healthcare professional based on an input operation (e.g. an input signal from the controller 22) of the user, and of transmitting the message to the healthcare professional terminal 90. When the e-mail transmission process ends, the processing returns to the step S52.
At the step S57, the CPU 40 determines whether or not the “ANSWER RECORD” button B3 has been selected. When the “ANSWER RECORD” button B3 has been selected, the processing proceeds to a step S58. When the “ANSWER RECORD” button B3 has not been selected, the processing proceeds to a step S59.
At the step S58, the CPU 40 executes an answer record display process. The answer record display process is a process for the user to freely view the transmitted answer message data D6 stored in the flash memory 44. Thus, according to need, the user can confirm the contents of answer messages transmitted previously. When the answer record display process ends, the processing returns to the step S52.
At the step S59, the CPU 40 determines whether or not the “RECEIVE STEP COUNT DATA” button B4 has been selected. When the “RECEIVE STEP COUNT DATA” button B4 has been selected, the processing proceeds to a step S60. When the “RECEIVE STEP COUNT DATA” button B4 has not been selected, the processing returns to the step S52.
At the step S60, the CPU 40 executes a step count data reception process. The step count data reception process is a process of transferring the step count data stored in the pedometer 92 to the flash memory 44 of the game apparatus 12. The step count data reception process will be described in detail later. When the step count data reception process ends, the processing returns to the step S52.
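The menu handling at the steps S52 to S60 amounts to mapping each button to its process and returning to the menu when the process ends. A minimal Python sketch (the button strings follow the description; the handler return values are placeholders, not from the disclosure):

```python
# Sketch of the menu loop at steps S52-S60: each button selection is
# dispatched to its process, and control returns to the menu when the
# process ends. The handler return values are placeholders.
def run_menu(selections, handlers):
    """Dispatch each selected button to its handler; an unknown selection
    simply redisplays the menu (i.e. is ignored here)."""
    results = []
    for button in selections:
        handler = handlers.get(button)
        if handler is not None:
            results.append(handler())
    return results

handlers = {
    "ANSWER QUESTION!": lambda: "answer process",              # step S54
    "SEND E-MAIL": lambda: "e-mail transmission process",      # step S56
    "ANSWER RECORD": lambda: "answer record display",          # step S58
    "RECEIVE STEP COUNT DATA": lambda: "step count reception", # step S60
}
```
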
(Answer Process in Game Apparatus)
The following will describe in detail the answer process at the step S54 in
When the answer process is started, at a step S70, the CPU 40 selects a question message to the user currently operating the game apparatus 12 as a to-be-answered question message from the unanswered question message data D5 stored in the flash memory 44. The CPU 40 reads out from the flash memory 44 the first question or message among the one or more questions or messages contained in the to-be-answered question message.
At a step S71, the CPU 40 determines whether or not the question or message read out from the flash memory 44 is a “multiple-choice question”. When the question or message is a “multiple-choice question”, the processing proceeds to a step S72. When the question or message is not a “multiple-choice question”, the processing proceeds to a step S73.
At the step S72, the CPU 40 executes a multiple-choice answer process. The multiple-choice answer process is a process of displaying the “multiple-choice question” to the user, and of causing the user to input an answer to the question. The multiple-choice answer process will be described in detail later. When the multiple-choice answer process ends, the processing proceeds to a step S78.
At the step S73, the CPU 40 determines whether or not the question or message read out from the flash memory 44 is a “character-input-type question”. When the question or message is a “character-input-type question”, the processing proceeds to a step S74. When the question or message is not a “character-input-type question”, the processing proceeds to a step S75.
At the step S74, the CPU 40 executes a character-input-type answer process. The character-input-type answer process is a process of displaying the “character-input-type question” to the user, and of causing the user to input an answer to the question. The character-input-type answer process will be described in detail later. When the character-input-type answer process ends, the processing proceeds to the step S78.
At the step S75, the CPU 40 determines whether or not the question or message read out from the flash memory 44 is a “numeric-input-type question”. When the question or message is a “numeric-input-type question”, the processing proceeds to a step S76. When the question or message is not a “numeric-input-type question”, the processing proceeds to a step S77.
At the step S76, the CPU 40 executes a numeric-input-type answer process. The numeric-input-type answer process is a process of displaying the “numeric-input-type question” to the user, and of causing the user to input an answer to the question. The numeric-input-type answer process will be described in detail later. When the numeric-input-type answer process ends, the processing proceeds to the step S78.
At the step S77, the CPU 40 executes a message display process. The message display process is a process of displaying a “message” to the user. The message display process will be described in detail later. When the message display process ends, the processing proceeds to the step S78.
At the step S78, the CPU 40 determines whether or not there is the next question or message in the current to-be-answered question message. When there is the next question or message, the processing proceeds to a step S79. When there is not the next question or message (i.e. when inputs of answers to all the questions or messages contained in the to-be-answered question message have been completed), the processing proceeds to a step S80.
At the step S79, the CPU 40 reads out from the flash memory 44 the next question or message contained in the current to-be-answered question message. Then, the processing proceeds to a step S71.
At the step S80, the CPU 40 displays a sending confirmation screen on the monitor 34, for example, as shown in
At a step S81, the CPU 40 determines whether or not the “SEND” button B8 has been selected. When the “SEND” button B8 has been selected, the processing proceeds to a step S82. When the “SEND” button B8 has not been selected, the processing proceeds to a step S85.
At the step S82, the CPU 40 reads out from the flash memory 44 biological information of the user (i.e. the user selected at the step S50 in
At a step S83, the CPU 40 generates a transmission file by combining the created answer message and the biological information, read out from the flash memory 44 at the step S82, into one file.
At a step S84, the CPU 40 transmits the transmission file generated at the step S83 to the healthcare professional terminal 90. Then, the answer process ends. Regarding transmission of a file from the game apparatus 12 to the healthcare professional terminal 90, for example, a generated transmission file can be transmitted as an attached file of an e-mail message by using a general transfer protocol of e-mail. In this case, the e-mail message may be blank or a fixed phrase may be contained in the e-mail message.
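The combination at the steps S82 and S83 can be sketched as follows; the JSON framing and key names are assumptions for illustration, since the disclosure only states that the answer message and the biological information are combined into one file.

```python
# Sketch of steps S82-S83: combine the created answer message and the
# biological information read from storage into a single transmission
# file. The JSON framing and key names are illustrative assumptions.
import json

def combine_for_transmission(answer_message, biological_info):
    """Form one file (bytes) from the answers and the stored biological data."""
    return json.dumps({
        "answers": answer_message,
        "biological": biological_info,
    }).encode("utf-8")

blob = combine_for_transmission(
    [{"question": 1, "answer": "good"}],
    {"weight": 62.5, "steps": {"2008-12-01": 6500}},
)
```

Because the biological information is read from the flash memory and merged automatically, the user never types it in, which matches the effect described for this embodiment.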
At the step S85, the CPU 40 determines whether or not the “BACK” button B7 has been selected. When the “BACK” button B7 has been selected, the processing proceeds to a step S86. When the “BACK” button B7 has not been selected, the processing returns to the step S80.
At the step S86, the CPU 40 reads out again from the flash memory 44 the final question or message contained in the current to-be-answered question message. Then, the processing returns to the step S71. When the “BACK” button B7 is selected in the sending confirmation screen, the display may return to the first question or message instead of returning to the final question or message.
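The dispatch at the steps S71 to S77 routes each question or message by its type; only answerable types collect an input, while a plain “message” is only displayed. A minimal Python sketch under assumed data shapes:

```python
# Sketch of the dispatch at steps S71-S77: each question or message is
# routed by its type; only answerable types collect an answer, while a
# plain "message" is only displayed. Data shapes are assumptions.
ANSWERABLE = ("multiple-choice", "character-input", "numeric-input")

def collect_answers(questions, inputs_by_index):
    """Walk the to-be-answered question message in order, pairing each
    answerable question with the user's input for it."""
    collected = []
    for i, q in enumerate(questions):
        if q["type"] in ANSWERABLE:
            collected.append({"index": i, "answer": inputs_by_index[i]})
    return collected
```
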
(Multiple-Choice Answer Process in Game Apparatus)
The following will describe in detail the multiple-choice answer process at the step S72 in
When the multiple-choice answer process is started, at a step S90, the CPU 40 displays a multiple-choice answer screen as shown in
At a step S91, the CPU 40 determines whether or not any one of the choices displayed in the choice section A1 has been selected by the user. When any one of the choices has been selected, the processing proceeds to a step S92. When any one of the choices has not been selected, the processing proceeds to a step S93. The selection of the choice by the user is performed by operating the controller 22.
At the step S92, the CPU 40 highlights the choice selected by the user. Then, the processing returns to the step S90. Examples of the highlighting include putting a checkmark to the head of the designated choice as shown in
At the step S93, the CPU 40 determines whether or not the “NEXT” button B6 has been selected by the user. When the “NEXT” button B6 has been selected, the multiple-choice answer process ends, and the processing proceeds to the step S78 in
At the step S94, the CPU 40 determines whether or not the “PREVIOUS” button B5 has been selected by the user. When the “PREVIOUS” button B5 has been selected, the processing proceeds to a step S95. When the “PREVIOUS” button B5 has not been selected, the processing returns to the step S90.
At the step S95, the CPU 40 determines whether or not there is a previous question or message in the current to-be-answered question message. When there is a previous question or message, the processing proceeds to a step S96. When there is not a previous question or message (i.e. when the question or message being currently displayed is the first question or message in the current to-be-answered question message), the multiple-choice answer process ends, and the processing returns to the step S52 (i.e. the display returns to the menu display screen).
At the step S96, the CPU 40 reads out from the flash memory 44 the previous question or message contained in the current to-be-answered question message. Then, the multiple-choice answer process ends, and the processing returns to the step S71 in
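The highlighting at the steps S90 to S92 can be sketched as follows; the “[x]” marker stands in for the checkmark put to the head of the designated choice and is an assumption about the rendering.

```python
# Sketch of steps S90-S92: selecting a choice highlights it, modeled
# here as a mark at the head of the designated choice; selecting a
# different choice moves the mark. The "[x]" marker is an assumption.
def render_choices(choices, picked=None):
    """Return display strings with the selected choice highlighted."""
    return [("[x] " if c == picked else "[ ] ") + c for c in choices]
```
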
(Character-Input-Type Answer Process in Game Apparatus)
The following will describe in detail the character-input-type answer process at the step S74 in
When the character-input-type answer process is started, at a step S100, the CPU 40 displays the character-input-type answer screen as shown in
At a step S101, the CPU 40 determines whether or not the character input section A2 has been designated by the user. When the character input section A2 has been designated, the processing proceeds to a step S102. When the character input section A2 has not been designated, the processing proceeds to a step S103. The designation of the character input section A2 by the user is performed by operating the controller 22.
At the step S102, the CPU 40 accepts a character input performed by the user using a software keyboard. Then, the processing returns to the step S100. The character input using the software keyboard is performed by the user designating sequentially desired keys in the keyboard displayed on the screen by using the pointer P.
At the step S103, the CPU 40 determines whether or not the “NEXT” button B6 has been selected by the user. When the “NEXT” button B6 has been selected, the character-input-type answer process ends, and the processing proceeds to the step S78 in
At the step S104, the CPU 40 determines whether or not the “PREVIOUS” button B5 has been selected by the user. When the “PREVIOUS” button B5 has been selected, the processing proceeds to a step S105. When the “PREVIOUS” button B5 has not been selected, the processing returns to the step S100.
At the step S105, the CPU 40 determines whether or not there is a previous question or message in the current to-be-answered question message. When there is a previous question or message, the processing proceeds to a step S106. When there is not a previous question or message, the character-input-type answer process ends, and the processing returns to the step S52 in
At the step S106, the CPU 40 reads out from the flash memory 44 the previous question or message contained in the current to-be-answered question message. Then, the character-input-type answer process ends, and the processing returns to the step S71 in
(Numeric-Input-Type Answer Process in Game Apparatus)
The following will describe in detail the numeric-input-type answer process at the step S76 in
When the numeric-input-type answer process is started, at a step S110, the CPU 40 displays a numeric-input-type answer screen as shown in
At a step S111, the CPU 40 determines whether or not the numeric input section A3 has been designated by the user. When the numeric input section A3 has been designated, the processing proceeds to a step S112. When the numeric input section A3 has not been designated, the processing proceeds to a step S113. The designation of the numeric input section A3 by the user is performed by operating the controller 22.
At the step S112, the CPU 40 accepts a numeric input from the user using a software keyboard (a numeric keypad only). Then, the processing returns to the step S110. The numeric input using the software keyboard is performed by the user designating sequentially desired keys in the numeric keypad displayed on the screen by using the pointer P.
At the step S113, the CPU 40 determines whether or not the “NEXT” button B6 has been selected by the user. When the “NEXT” button B6 has been selected, the numeric-input-type answer process ends, and the processing proceeds to the step S78 in
At the step S114, the CPU 40 determines whether or not the “PREVIOUS” button B5 has been selected by the user. When the “PREVIOUS” button B5 has been selected, the processing proceeds to a step S115. When the “PREVIOUS” button B5 has not been selected, the processing returns to the step S110.
At the step S115, the CPU 40 determines whether or not there is a previous question or message in the current to-be-answered question message. When there is a previous question or message, the processing proceeds to a step S116. When there is not a previous question or message, the numeric-input-type answer process ends, and the processing returns to the step S52 in
At the step S116, the CPU 40 reads out from the flash memory 44 the previous question or message contained in the current to-be-answered question message. Then, the numeric-input-type answer process ends, and the processing returns to the step S71 in
(Message Display Process In Game Apparatus)
The following will describe in detail the message display process at the step S77 in
When the message display process is started, at a step S120, the CPU 40 displays a message display screen as shown in
At a step S121, the CPU 40 determines whether or not the “NEXT” button B6 has been selected by the user. When the “NEXT” button B6 has been selected, the message display process ends, and the processing proceeds to the step S78 in
At the step S122, the CPU 40 determines whether or not the “PREVIOUS” button B5 has been selected by the user. When the “PREVIOUS” button B5 has been selected, the processing proceeds to a step S123. When the “PREVIOUS” button B5 has not been selected, the processing returns to the step S120.
At the step S123, the CPU 40 determines whether or not there is a previous question or message in the current to-be-answered question message. When there is a previous question or message, the processing proceeds to a step S124. When there is not a previous question or message, the message display process ends, and the processing returns to the step S52 in
At the step S124, the CPU 40 reads out from the flash memory 44 the previous question or message contained in the current to-be-answered question message. Then, the message display process ends, and the processing returns to the step S71 in
In the flow charts of
(Step Count Data Reception Process in Game Apparatus)
The following will describe in detail the step count data reception process at the step S60 in
When the step count data reception process is started, at a step S130, the CPU 40 starts communication with the pedometer 92.
At a step S131, the CPU 40 receives step count data from the pedometer 92. At this time, the CPU 40 may receive all step count data stored in the pedometer 92, or may receive only a part of the step count data stored in the pedometer 92 (e.g. the step count data measured during a predetermined period (e.g. during the past one month) of the user (i.e. the user selected at the step S50 in
At a step S132, the CPU 40 stores the step count data, received from the pedometer 92, in the flash memory 44.
At a step S133, the CPU 40 ends the communication with the pedometer 92. Then, the step count data reception process ends.
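The reception at the steps S130 to S133 may transfer either all stored step counts or only those within a recent window, e.g. the past month. A minimal Python sketch of the windowing (the dict-of-dates record layout is an assumption):

```python
# Sketch of the partial reception described for step S131: keep only the
# step counts measured during a predetermined period (e.g. the past one
# month). The dict-of-date records layout is an illustrative assumption.
from datetime import date, timedelta

def recent_step_counts(records, today, days=30):
    """Keep only the records measured within the past `days` days."""
    cutoff = today - timedelta(days=days)
    return {d: n for d, n in records.items() if d >= cutoff}
```
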
(Answer Reception Process in Healthcare Professional Terminal)
The following will describe an operation of the healthcare professional terminal 90 when an answer message reception application is executed in the healthcare professional terminal 90. The answer message reception application is typically supplied to the healthcare professional terminal 90 through a computer-readable storage medium such as an optical disc, or through the Internet, and installed in the hard disk of the healthcare professional terminal 90. The function of the aforementioned question message creation application and the function of the answer message reception application may be realized by one application.
When the answer reception process is started, at a step S140, the CPU of the healthcare professional terminal 90 accesses a mail server, and determines whether or not the mail server has received an answer message from the healthcare recipient (i.e. the game apparatus 12). When the mail server has received an answer message from the healthcare recipient, the processing proceeds to a step S141. When the mail server has not received any answer messages from the healthcare recipient, the processing proceeds to a step S143.
At the step S141, the CPU of the healthcare professional terminal 90 receives the answer message (specifically, a file including an answer message and biological information as described above) received by the mail server from the healthcare recipient.
At a step S142, the CPU of the healthcare professional terminal 90 stores the answer message received at the step S141 in the hard disk of the healthcare professional terminal 90.
At the step S143, the CPU of the healthcare professional terminal 90 displays a list of answer messages, stored in the hard disk, on the monitor of the healthcare professional terminal 90.
At a step S144, the CPU of the healthcare professional terminal 90 determines whether or not there is a view request from the operator (the healthcare professional) for any one of the answer messages in the list displayed on the monitor. When there is a view request for any one of the answer messages, the processing proceeds to a step S145. When there is not a view request for any one of the answer messages, the answer reception process ends.
At the step S145, the CPU of the healthcare professional terminal 90 displays the answer message, for which the view request is performed by the operator, on the monitor together with the corresponding biological information.
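The reception flow at the steps S140 to S145 can be sketched as follows, with the mail server replaced by an in-memory inbox so that the store/list/view logic stands on its own; the message fields ("from", "answers", "biological") are assumptions.

```python
# Sketch of steps S140-S145 with the mail server replaced by an
# in-memory inbox. Message fields are illustrative assumptions.
class AnswerStore:
    """Persist received answer files, then list them or view one."""
    def __init__(self):
        self._messages = []

    def receive(self, inbox):
        # steps S140-S142: pull every pending answer and store it
        while inbox:
            self._messages.append(inbox.pop(0))

    def listing(self):
        # step S143: one summary entry per stored answer message
        return [m["from"] for m in self._messages]

    def view(self, index):
        # step S145: the answer together with its biological information
        m = self._messages[index]
        return m["answers"], m["biological"]
```
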
As described above, according to the present embodiment, when the answer message to the question message from the healthcare professional is transmitted from the game apparatus 12 to the healthcare professional terminal 90, the biological information of the healthcare recipient is automatically read out from the flash memory 44 and transmitted to the healthcare professional terminal 90 together with the answer message. Thus, when the healthcare recipient transmits the answer message, manual input of the biological information is unnecessary, and the biological information is reliably transmitted. In addition, because the healthcare professional can fully confirm the biological information required for health guidance, the healthcare professional can properly and efficiently perform health guidance.
The present embodiment has described the case where the load controller 36 and the pedometer 92 are used as apparatuses for measuring biological information. The present invention is not limited thereto. For example, biological information (blood pressure data and body temperature data) may be obtained from a blood-pressure gauge and a clinical thermometer.
Further, in the present embodiment, the game apparatus 12 obtains biological information from an external apparatus such as the load controller 36 and the pedometer 92. However, the present invention is not limited thereto, and the game apparatus 12 may have a biological information measuring function. As such a game apparatus, for example, there is a game apparatus having a load sensor for measuring a load exerted by the user standing on the game apparatus.
Further, for the communication between the game apparatus 12 and the load controller 36 and the communication between the game apparatus 12 and the pedometer 92, arbitrary communication methods (e.g. Bluetooth, infrared communication, other wireless communication methods, wired communication methods, public communication lines such as the Internet, and wired lines such as cables) can be used.
Further, the present embodiment has described the case where a general-purpose personal computer is used as the healthcare professional terminal 90 and the game apparatus 12 is used as the healthcare recipient terminal. However, the present invention is not limited thereto, and arbitrary information processing apparatuses (personal computers, stationary game apparatuses, hand-held game machines, mobile phones, PDAs) can be used as the healthcare professional terminal and the healthcare recipient terminal.
Further, in the present embodiment, transmission and reception of messages (a question message and an answer message) between the game apparatus 12 and the healthcare professional terminal 90 are performed by using a well-known protocol of e-mail. However, the present invention is not limited thereto, and another method using the Internet or a wired or wireless communication method may be used.
Further, the data structures of a question message and an answer message are not limited to the structures described in the above embodiment, and arbitrary data structures can be used. The formats of a question and a message are not limited to “multiple-choice question”, “character-input-type question”, “numeric-input-type question”, and “message” which are described in the above embodiment.
Further, the present embodiment has described the case where an answer message and biological information are combined into one file. However, the present invention is not limited thereto.
Further, the present embodiment has described the health guidance support system in which a healthcare nurse or the like performs health guidance for a healthcare recipient. However, the present invention is not limited thereto, and is also applicable to purposes other than health guidance.
While the invention has been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It is understood that numerous other modifications and variations can be devised without departing from the scope of the invention.
Number | Date | Country | Kind |
---|---|---|---|
2008-334795 | Dec 2008 | JP | national |
Number | Name | Date | Kind |
---|---|---|---|
588172 | Peters | Aug 1897 | A |
688076 | Ensign | Dec 1901 | A |
D188376 | Hotkins et al. | Jul 1960 | S |
3184962 | Gay | May 1965 | A |
3217536 | Motsinger et al. | Nov 1965 | A |
3424005 | Brown | Jan 1969 | A |
3428312 | Machen | Feb 1969 | A |
3712294 | Muller | Jan 1973 | A |
3752144 | Weigle, Jr. | Aug 1973 | A |
3780817 | Videon | Dec 1973 | A |
3826145 | McFarland | Jul 1974 | A |
3869007 | Haggstrom et al. | Mar 1975 | A |
4058178 | Shinohara et al. | Nov 1977 | A |
4104119 | Schilling | Aug 1978 | A |
4136682 | Pedotti | Jan 1979 | A |
4246783 | Steven et al. | Jan 1981 | A |
4296931 | Yokoi | Oct 1981 | A |
4337050 | Engalitcheff, Jr. | Jun 1982 | A |
4404854 | Krempl et al. | Sep 1983 | A |
4488017 | Lee | Dec 1984 | A |
4494754 | Wagner, Jr. | Jan 1985 | A |
4558757 | Mori et al. | Dec 1985 | A |
4569519 | Mattox et al. | Feb 1986 | A |
4574899 | Griffin | Mar 1986 | A |
4577868 | Kiyonaga | Mar 1986 | A |
4598717 | Pedotti | Jul 1986 | A |
4607841 | Gala | Aug 1986 | A |
4630817 | Buckleu | Dec 1986 | A |
4660828 | Weiss | Apr 1987 | A |
4680577 | Straayer et al. | Jul 1987 | A |
4688444 | Nordstrom | Aug 1987 | A |
4691694 | Boyd et al. | Sep 1987 | A |
4711447 | Mansfield | Dec 1987 | A |
4726435 | Kitagawa et al. | Feb 1988 | A |
4739848 | Tulloch | Apr 1988 | A |
4742832 | Kauffmann et al. | May 1988 | A |
4742932 | Pedragosa | May 1988 | A |
4800973 | Angel | Jan 1989 | A |
4838173 | Schroeder et al. | Jun 1989 | A |
4855704 | Betz | Aug 1989 | A |
4880069 | Bradley | Nov 1989 | A |
4882677 | Curran | Nov 1989 | A |
4893514 | Gronert et al. | Jan 1990 | A |
4907797 | Gezari et al. | Mar 1990 | A |
4927138 | Ferrari | May 1990 | A |
4970486 | Gray et al. | Nov 1990 | A |
4982613 | Becker | Jan 1991 | A |
D318073 | Jang | Jul 1991 | S |
5044956 | Behensky et al. | Sep 1991 | A |
5049079 | Furtado et al. | Sep 1991 | A |
5052406 | Nashner | Oct 1991 | A |
5054771 | Mansfield | Oct 1991 | A |
5065631 | Ashpitel et al. | Nov 1991 | A |
5089960 | Sweeney, Jr. | Feb 1992 | A |
5103207 | Kerr et al. | Apr 1992 | A |
5104119 | Lynch | Apr 1992 | A |
5116296 | Watkins et al. | May 1992 | A |
5118112 | Bregman et al. | Jun 1992 | A |
5151071 | Jain et al. | Sep 1992 | A |
5195746 | Boyd et al. | Mar 1993 | A |
5197003 | Moncrief et al. | Mar 1993 | A |
5199875 | Trumbull | Apr 1993 | A |
5203563 | Loper, III | Apr 1993 | A |
5207426 | Inoue et al. | May 1993 | A |
5259252 | Kruse et al. | Nov 1993 | A |
5269318 | Nashner | Dec 1993 | A |
5299810 | Pierce et al. | Apr 1994 | A |
5303715 | Nashner et al. | Apr 1994 | A |
5360383 | Boren | Nov 1994 | A |
5362298 | Brown et al. | Nov 1994 | A |
5368546 | Stark et al. | Nov 1994 | A |
5405152 | Katanics et al. | Apr 1995 | A |
5431569 | Simpkins et al. | Jul 1995 | A |
5462503 | Benjamin et al. | Oct 1995 | A |
5466200 | Ulrich et al. | Nov 1995 | A |
5469740 | French et al. | Nov 1995 | A |
5474087 | Nashner | Dec 1995 | A |
5476103 | Nasher et al. | Dec 1995 | A |
5507708 | Ma | Apr 1996 | A |
5541621 | Nmngani | Jul 1996 | A |
5541622 | Engle et al. | Jul 1996 | A |
5547439 | Rawls et al. | Aug 1996 | A |
5551445 | Nashner | Sep 1996 | A |
5551693 | Goto et al. | Sep 1996 | A |
5577981 | Jarvik | Nov 1996 | A |
D376826 | Ashida | Dec 1996 | S |
5584700 | Feldman et al. | Dec 1996 | A |
5584779 | Knecht et al. | Dec 1996 | A |
5591104 | Andrus et al. | Jan 1997 | A |
5613690 | McShane et al. | Mar 1997 | A |
5623944 | Nashner | Apr 1997 | A |
5627327 | Zanakis | May 1997 | A |
D384115 | Wilkinson et al. | Sep 1997 | S |
5669773 | Gluck | Sep 1997 | A |
5689285 | Asher | Nov 1997 | A |
5690582 | Ulrich et al. | Nov 1997 | A |
5697791 | Nasher et al. | Dec 1997 | A |
5713794 | Shimojima et al. | Feb 1998 | A |
5721566 | Rosenberg et al. | Feb 1998 | A |
5746684 | Jordan | May 1998 | A |
5785630 | Bobick et al. | Jul 1998 | A |
D397164 | Goto | Aug 1998 | S |
5788618 | Joutras | Aug 1998 | A |
5792031 | Alton | Aug 1998 | A |
5800314 | Sakakibara et al. | Sep 1998 | A |
5805138 | Brawne et al. | Sep 1998 | A |
5813958 | Tomita | Sep 1998 | A |
5814740 | Cook et al. | Sep 1998 | A |
5820462 | Yokoi et al. | Oct 1998 | A |
5825308 | Rosenberg | Oct 1998 | A |
5837952 | Oshiro et al. | Nov 1998 | A |
D402317 | Goto | Dec 1998 | S |
5846086 | Bizzi et al. | Dec 1998 | A |
5853326 | Goto et al. | Dec 1998 | A |
5854622 | Brannon | Dec 1998 | A |
5860861 | Lipps et al. | Jan 1999 | A |
5864333 | O'Heir | Jan 1999 | A |
5872438 | Roston | Feb 1999 | A |
5886302 | Germanton et al. | Mar 1999 | A |
5888172 | Andrus et al. | Mar 1999 | A |
5889507 | Engle et al. | Mar 1999 | A |
D407758 | Isetani et al. | Apr 1999 | S |
5890995 | Bobick et al. | Apr 1999 | A |
5897457 | Mackovjak | Apr 1999 | A |
5897469 | Yalch | Apr 1999 | A |
5901612 | Letovsky | May 1999 | A |
5902214 | Makikawa et al. | May 1999 | A |
5904639 | Smyser et al. | May 1999 | A |
D411258 | Isetani et al. | Jun 1999 | S |
5912659 | Rutledge et al. | Jun 1999 | A |
5919092 | Yokoi et al. | Jul 1999 | A |
5921780 | Myers | Jul 1999 | A |
5921899 | Rose | Jul 1999 | A |
5929782 | Stark et al. | Jul 1999 | A |
5947824 | Minami et al. | Sep 1999 | A |
5976063 | Joutras et al. | Nov 1999 | A |
5980256 | Carmein | Nov 1999 | A |
5980429 | Nashner | Nov 1999 | A |
5984785 | Takeda et al. | Nov 1999 | A |
5987982 | Wenman et al. | Nov 1999 | A |
5989157 | Walton | Nov 1999 | A |
5993356 | Houston et al. | Nov 1999 | A |
5997439 | Ohsuga et al. | Dec 1999 | A |
6001015 | Nishiumi et al. | Dec 1999 | A |
6007428 | Nishiumi et al. | Dec 1999 | A |
6010465 | Nashner | Jan 2000 | A |
D421070 | Jang et al. | Feb 2000 | S |
6037927 | Rosenberg | Mar 2000 | A |
6038488 | Barnes et al. | Mar 2000 | A |
6044772 | Gaudette et al. | Apr 2000 | A |
6063046 | Allum | May 2000 | A |
6086518 | MacCready, Jr. | Jul 2000 | A |
6102803 | Takeda et al. | Aug 2000 | A |
6102832 | Tani | Aug 2000 | A |
D431051 | Goto | Sep 2000 | S |
6113237 | Ober et al. | Sep 2000 | A |
6147674 | Rosenberg et al. | Nov 2000 | A |
6152564 | Ober et al. | Nov 2000 | A |
D434769 | Goto | Dec 2000 | S |
D434770 | Goto | Dec 2000 | S |
6155926 | Miyamoto et al. | Dec 2000 | A |
6162189 | Girone et al. | Dec 2000 | A |
6167299 | Galchenkov et al. | Dec 2000 | A |
6190287 | Nashner | Feb 2001 | B1 |
6200253 | Nishiumi et al. | Mar 2001 | B1 |
6203432 | Roberts et al. | Mar 2001 | B1 |
6216542 | Stockli et al. | Apr 2001 | B1 |
6216547 | Lehtovaara | Apr 2001 | B1 |
6220865 | Macri et al. | Apr 2001 | B1 |
D441369 | Goto | May 2001 | S |
6225977 | Li | May 2001 | B1 |
6227968 | Suzuki et al. | May 2001 | B1 |
6228000 | Jones | May 2001 | B1 |
6231444 | Goto | May 2001 | B1 |
6239806 | Nishiumi et al. | May 2001 | B1 |
6241611 | Takeda et al. | Jun 2001 | B1 |
6244987 | Ohsuga et al. | Jun 2001 | B1 |
D444469 | Goto | Jul 2001 | S |
6264558 | Nishiumi et al. | Jul 2001 | B1 |
6280361 | Harvey et al. | Aug 2001 | B1 |
D447968 | Pagnacco et al. | Sep 2001 | S |
6295878 | Berme | Oct 2001 | B1 |
6296595 | Stark et al. | Oct 2001 | B1 |
6325718 | Nishiumi et al. | Dec 2001 | B1 |
6330837 | Charles et al. | Dec 2001 | B1 |
6336891 | Fedrigon et al. | Jan 2002 | B1 |
6353427 | Rosenberg | Mar 2002 | B1 |
6354155 | Berme | Mar 2002 | B1 |
6357827 | Brightbill et al. | Mar 2002 | B1 |
6359613 | Poole | Mar 2002 | B1 |
D456410 | Ashida | Apr 2002 | S |
D456854 | Ashida | May 2002 | S |
D457570 | Brinson | May 2002 | S |
6387061 | Nitto | May 2002 | B1 |
6388655 | Leung | May 2002 | B1 |
6389883 | Berme et al. | May 2002 | B1 |
6394905 | Takeda et al. | May 2002 | B1 |
6402635 | Nesbit et al. | Jun 2002 | B1 |
D459727 | Ashida | Jul 2002 | S |
D460506 | Tamminga et al. | Jul 2002 | S |
6421056 | Nishiumi et al. | Jul 2002 | B1 |
6436058 | Krahner et al. | Aug 2002 | B1 |
D462683 | Ashida | Sep 2002 | S |
6454679 | Radow | Sep 2002 | B1 |
6461297 | Pagnacco et al. | Oct 2002 | B1 |
6470302 | Cunningham et al. | Oct 2002 | B1 |
6482010 | Marcus et al. | Nov 2002 | B1 |
6510749 | Pagnacco et al. | Jan 2003 | B1 |
6514145 | Kawabata et al. | Feb 2003 | B1 |
6515593 | Stark et al. | Feb 2003 | B1 |
6516221 | Hirouchi et al. | Feb 2003 | B1 |
D471594 | Nojo | Mar 2003 | S |
6543769 | Podoloff et al. | Apr 2003 | B1 |
6563059 | Lee | May 2003 | B2 |
6568334 | Gaudette et al. | May 2003 | B1 |
6616579 | Reinbold et al. | Sep 2003 | B1 |
6624802 | Klein et al. | Sep 2003 | B1 |
6632158 | Nashner | Oct 2003 | B1 |
6636161 | Rosenberg | Oct 2003 | B2 |
6636197 | Goldenberg et al. | Oct 2003 | B1 |
6638175 | Lee et al. | Oct 2003 | B2 |
6663058 | Peterson et al. | Dec 2003 | B1 |
6676520 | Nishiumi et al. | Jan 2004 | B2 |
6676569 | Radow | Jan 2004 | B1 |
6679776 | Nishiumi et al. | Jan 2004 | B1 |
6697049 | Lu | Feb 2004 | B2 |
6719667 | Wong et al. | Apr 2004 | B2 |
6726566 | Komata | Apr 2004 | B2 |
6764429 | Michalow | Jul 2004 | B1 |
6797894 | Montagnino et al. | Sep 2004 | B2 |
6811489 | Shimizu et al. | Nov 2004 | B1 |
6813966 | Dukart | Nov 2004 | B2 |
6817973 | Merril et al. | Nov 2004 | B2 |
D500100 | van der Meer | Dec 2004 | S |
6846270 | Etnyre | Jan 2005 | B1 |
6859198 | Onodera et al. | Feb 2005 | B2 |
6872139 | Sato et al. | Mar 2005 | B2 |
6872187 | Stark et al. | Mar 2005 | B1 |
6888076 | Hetherington | May 2005 | B2 |
6913559 | Smith | Jul 2005 | B2 |
6936016 | Berme et al. | Aug 2005 | B2 |
D510391 | Merril et al. | Oct 2005 | S |
6975302 | Ausbeck, Jr. | Dec 2005 | B1 |
6978684 | Nurse | Dec 2005 | B2 |
6991483 | Milan et al. | Jan 2006 | B1 |
D514627 | Merril et al. | Feb 2006 | S |
7004787 | Milan | Feb 2006 | B2 |
D517124 | Merril et al. | Mar 2006 | S |
7011605 | Shields | Mar 2006 | B2 |
7033176 | Feldman et al. | Apr 2006 | B2 |
7038855 | French et al. | May 2006 | B2 |
7040986 | Koshima et al. | May 2006 | B2 |
7070542 | Reyes et al. | Jul 2006 | B2 |
7083546 | Zillig et al. | Aug 2006 | B2 |
7100439 | Carlucci | Sep 2006 | B2 |
7121982 | Feldman | Oct 2006 | B2 |
7126584 | Nishiumi et al. | Oct 2006 | B1 |
7127376 | Nashner | Oct 2006 | B2 |
7163516 | Pagnacco et al. | Jan 2007 | B1 |
7179234 | Nashner | Feb 2007 | B2 |
7195355 | Nashner | Mar 2007 | B2 |
7202424 | Carlucci | Apr 2007 | B2 |
7202851 | Cunningham et al. | Apr 2007 | B2 |
7270630 | Patterson | Sep 2007 | B1 |
7307619 | Cunningham et al. | Dec 2007 | B2 |
7308831 | Cunningham et al. | Dec 2007 | B2 |
7331226 | Feldman et al. | Feb 2008 | B2 |
7335134 | LaVelle | Feb 2008 | B1 |
RE40427 | Nashner | Jul 2008 | E |
7416537 | Stark et al. | Aug 2008 | B1 |
7530929 | Feldman et al. | May 2009 | B2 |
7722501 | Nicolas et al. | May 2010 | B2 |
7938751 | Nicolas et al. | May 2011 | B2 |
20010001303 | Ohsuga et al. | May 2001 | A1 |
20010018363 | Goto et al. | Aug 2001 | A1 |
20010050683 | Ishikawa et al. | Dec 2001 | A1 |
20020055422 | Airmet et al. | May 2002 | A1 |
20020080115 | Onodera et al. | Jun 2002 | A1 |
20020185041 | Herbst | Dec 2002 | A1 |
20030054327 | Evensen | Mar 2003 | A1 |
20030069108 | Kaiserman et al. | Apr 2003 | A1 |
20030107502 | Alexander et al. | Jun 2003 | A1 |
20030176770 | Merril et al. | Sep 2003 | A1 |
20030182067 | Asano et al. | Sep 2003 | A1 |
20030193416 | Ogata et al. | Oct 2003 | A1 |
20040038786 | Kuo et al. | Feb 2004 | A1 |
20040041787 | Graves | Mar 2004 | A1 |
20040077464 | Feldman et al. | Apr 2004 | A1 |
20040099513 | Hetherington | May 2004 | A1 |
20040110602 | Feldman | Jun 2004 | A1 |
20040127337 | Nashner | Jul 2004 | A1 |
20040163855 | Carlucci | Aug 2004 | A1 |
20040180719 | Feldman et al. | Sep 2004 | A1 |
20040259688 | Stabile | Dec 2004 | A1 |
20050070154 | Milan | Mar 2005 | A1 |
20050076161 | Albanna et al. | Apr 2005 | A1 |
20050130742 | Feldman et al. | Jun 2005 | A1 |
20050202384 | DiCuccio et al. | Sep 2005 | A1 |
20060097453 | Feldman et al. | May 2006 | A1 |
20060161045 | Merril et al. | Jul 2006 | A1 |
20060181021 | Seelig et al. | Aug 2006 | A1 |
20060205565 | Feldman et al. | Sep 2006 | A1 |
20060211543 | Feldman et al. | Sep 2006 | A1 |
20060217243 | Feldman et al. | Sep 2006 | A1 |
20060223634 | Feldman et al. | Oct 2006 | A1 |
20060258512 | Nicolas et al. | Nov 2006 | A1 |
20070021279 | Jones | Jan 2007 | A1 |
20070027369 | Pagnacco et al. | Feb 2007 | A1 |
20070155589 | Feldman et al. | Jul 2007 | A1 |
20070219050 | Merril | Sep 2007 | A1 |
20080012826 | Cunningham et al. | Jan 2008 | A1 |
20080076978 | Ouchi et al. | Mar 2008 | A1 |
20080228110 | Berme | Sep 2008 | A1 |
20080261696 | Yamazaki et al. | Oct 2008 | A1 |
20090093315 | Matsunaga et al. | Apr 2009 | A1 |
20100137063 | Shirakawa et al. | Jun 2010 | A1 |
20110070953 | Konishi | Mar 2011 | A1 |
20110077899 | Hayashi et al. | Mar 2011 | A1 |
20110207534 | Meldeau | Aug 2011 | A1 |
Number | Date | Country |
---|---|---|
40 04 554 | Aug 1991 | DE |
195 02 918 | Aug 1996 | DE |
297 12 785 | Jan 1998 | DE |
20 2004 021 792 | May 2011 | DE |
20 2004 021 793 | May 2011 | DE |
0 275 665 | Jul 1988 | EP |
0 299 738 | Jan 1989 | EP |
0 335 045 | Oct 1989 | EP |
0 519 836 | Dec 1992 | EP |
1 043 746 | Nov 2000 | EP |
1 120 083 | Aug 2001 | EP |
1 257 599 | Aug 2001 | EP |
1 394 707 | Mar 2004 | EP |
1 870 141 | Dec 2007 | EP |
2 472 929 | Jul 1981 | FR |
2 587 611 | Mar 1987 | FR |
2 604 910 | Apr 1988 | FR |
2 647 331 | Nov 1990 | FR |
2 792 182 | Oct 2000 | FR |
2 801 490 | Jun 2001 | FR |
2 811 753 | Jan 2002 | FR |
2 906 365 | Mar 2008 | FR |
1 209 954 | Oct 1970 | GB |
2 288 550 | Oct 1995 | GB |
44-23551 | Oct 1969 | JP |
55-95758 | Dec 1978 | JP |
54-73689 | Jun 1979 | JP |
55-113472 | Sep 1980 | JP |
55-113473 | Sep 1980 | JP |
55-125369 | Sep 1980 | JP |
55-149822 | Nov 1980 | JP |
55-152431 | Nov 1980 | JP |
60-79460 | Jun 1985 | JP |
60-153159 | Oct 1985 | JP |
61-154689 | Jul 1986 | JP |
62-34016 | Feb 1987 | JP |
63-158311 | Oct 1988 | JP |
63-163855 | Oct 1988 | JP |
63-193003 | Dec 1988 | JP |
02-102651 | Apr 1990 | JP |
2-238327 | Sep 1990 | JP |
3-25325 | Feb 1991 | JP |
3-103272 | Apr 1991 | JP |
03-107959 | Nov 1991 | JP |
6-063198 | Mar 1994 | JP |
6-282373 | Oct 1994 | JP |
07-213741 | Aug 1995 | JP |
7-213745 | Aug 1995 | JP |
7-241281 | Sep 1995 | JP |
7-241282 | Sep 1995 | JP |
7-275307 | Oct 1995 | JP |
07-302161 | Nov 1995 | JP |
8-43182 | Feb 1996 | JP |
08-131594 | May 1996 | JP |
08-182774 | Jul 1996 | JP |
08-184474 | Jul 1996 | JP |
8-215176 | Aug 1996 | JP |
08-244691 | Sep 1996 | JP |
2576247 | Jan 1997 | JP |
9-120464 | May 1997 | JP |
9-168529 | Jun 1997 | JP |
9-197951 | Jul 1997 | JP |
9-305099 | Nov 1997 | JP |
11-151211 | Jun 1999 | JP |
11-309270 | Nov 1999 | JP |
2000-146679 | May 2000 | JP |
U3068681 | May 2000 | JP |
U3069287 | Jun 2000 | JP |
2000-254348 | Sep 2000 | JP |
3172738 | Jun 2001 | JP |
2001-178845 | Jul 2001 | JP |
2001-286451 | Oct 2001 | JP |
2002-049697 | Feb 2002 | JP |
2002-112984 | Apr 2002 | JP |
2002-157081 | May 2002 | JP |
2002-253534 | Sep 2002 | JP |
2002-366664 | Dec 2002 | JP |
2003-79599 | Mar 2003 | JP |
2003-235834 | Aug 2003 | JP |
2004-118339 | Apr 2004 | JP |
3722678 | Nov 2005 | JP |
2005-334083 | Dec 2005 | JP |
3773455 | May 2006 | JP |
2006-155462 | Jun 2006 | JP |
2006-167094 | Jun 2006 | JP |
3818488 | Sep 2006 | JP |
2006-263101 | Oct 2006 | JP |
2006-284539 | Oct 2006 | JP |
U3128216 | Dec 2006 | JP |
2008-49117 | Mar 2008 | JP |
2008-197784 | Aug 2008 | JP |
2008-310606 | Dec 2008 | JP |
WO 9111221 | Aug 1991 | WO |
WO 9212768 | Aug 1992 | WO |
WO 9840843 | Sep 1998 | WO |
WO 0012041 | Mar 2000 | WO |
WO 0057387 | Sep 2000 | WO |
WO 0069523 | Nov 2000 | WO |
WO 0229375 | Apr 2002 | WO |
WO 02057885 | Jul 2002 | WO |
WO 03067484 | Aug 2003 | WO |
WO 2004051201 | Jun 2004 | WO |
WO 2004053629 | Jun 2004 | WO |
WO 2005043322 | May 2005 | WO |
WO 2008034965 | Mar 2008 | WO |
WO 2008099582 | Aug 2008 | WO |
Entry |
---|
Vs. Slalom—Operation Manual, MDS(MGS), Nintendo, 4 pages, (date unknown). |
HyperspaceArcade.com—Specialists in Arcade Video Game Repair and Restoration, http://www.hyperspacearcade.com/VSTypes.html (retrieved Jul. 3, 2010), 3 pages. |
Vs. Slalom—Attachment Pak Manual; For Installation in: VS. UniSystem (UPRIGHT) and VS. DualSystem (UPRIGHT), TM of Nintendo of America Inc., 1986, 15 pages. |
Leiterman, “Project Puffer: Jungle River Cruise,” Atari, Inc., 1982, 2 pages. |
Leiterman, “Project Puffer: Tumbleweeds,” Atari, Inc., 1982, 1 page. |
Jerry Smith, “Other Input Devices,” Human Interface Technology Laboratory, 2 pages, (date unknown). |
Trevor Meers, “Virtually There: VR Entertainment Transports Players to Entrancing New Worlds,” Smart Computing, vol. 4, Issue 11, Nov. 1993, 6 pages. |
“Dance Aerobics,” Moby Games, Feb. 12, 2008, 2 pages. |
Electronic Entertainment Expo (E3) Overview, Giant Bomb—E3 2004 (video game concept), http://www.giantbomb.com/e3-2004/92/3436/ (retrieved Sep. 3, 2010), 3 pages. |
Michael Goldstein, “Revolution on Wheels—Thatcher Ulrich,” Nov.-Dec. 1994, 3 pages. |
Fitness article, Sep. 1994, p. 402-404. |
“Wired Top 10: Best Selling Toys in Jun. 1994,” Wired Sep. 1994, 1 page. |
Warranty Information and Your Joyboard: How it Works, Amiga Corporation, date unknown, 2 pages. |
Complaint for Patent Infringement, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Northern Division (Apr. 2, 2010), 317 pages. |
Plaintiff IA Labs CA, LLC's Opening Claim Construction Brief, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Southern Division (Dec. 13, 2010), 36 pages. |
Nintendo Co., Ltd. and Nintendo of America's Opposition to IA Labs CA, LLC's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), including the Appendix of Exhibits and Exhibits A-R, 405 pages. |
Declaration of R. Lee Rawls in Support of Nintendo Co., Ltd. and Nintendo of America Inc.'s Opposition to IA Labs CA. LLC's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), including Exhibits 1, 3-12, 193 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), 7 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Appendix of Exhibits, 2 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 1, 36 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 2, 40 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 3, 85 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 4, 10 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 5, 9 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 6, 17 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 7, 16 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 8, 45 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 9, 4 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 10, 22 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 11, 27 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 12, 3 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 13, 7 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 14, 22 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 15, 45 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 16, 42 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 17, 19 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 18, 27 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 19, 13 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 20, 29 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 21, 25 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 22, 11 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 23, 20 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 24, 7 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 25, 80 pages. |
Declaration of Tyler C. Peterson Pursuant to Fed. R. Civ. p. 56(D) in Support of Nintendo Co., Ltd. and Nintendo of American Inc.'s Opposition to Plaintiff's Motion for Partial Summary Judgment, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (May 16, 2011), Exhibit 26, 32 pages. |
U.S. Trademark Application No. 74/402,755 filed Jun. 14, 1993, 43 pages. |
“AccuSway Dual Top: For Balance and Postural Sway Measurement,” AMTI: Force and Motion, ISO 9001:2000, 2 pages. |
Borzelli G., Cappozzo A., and Papa E., “Inter- and intra-individual variability of ground reaction forces during sit-to-stand with principal component analysis,” Medical Engineering & Physics 21 (1999), pp. 235-240. |
Chiari L., Cappello A., Lenzi D., and Della Croce U., “An Improved Technique for the Extraction of Stochastic Parameters from Stabilograms,” Gait and Posture 12 (2000), pp. 225-234. |
Cutlip R., Hsiao H., Garcia R., Becker E., Mayeux B., “A comparison of different postures for scaffold end-frame disassembly,” Applied Ergonomics 31 (2000), pp. 507-513. |
Davis K.G., Marras W.S., Waters T.R., “Evaluation of spinal loading during lowering and lifting,” The Ohio State University, Biodynamics Laboratory, Clinical Biomechanics vol. 13, No. 3, 1998, pp. 141-152. |
Rolf G. Jacob, Mark S. Redfern, Joseph M. Furman, “Optic Flow-induced Sway in Anxiety Disorders Associated with Space and Motion Discomfort,” Journal of Anxiety Disorders, vol. 9, No. 5, 1995, pp. 411-425. |
Jorgensen M.J., Marras W.S., “The effect of lumbar back support tension on trunk muscle activity,” Clinical Biomechanics 15 (2000), pp. 292-294. |
Deborah L. King and Vladimir M. Zatsiorsky, “Extracting gravity line displacement from stabilographic recordings,” Gait & Posture 6 (1997), pp. 27-38. |
Kraemer W.J., Volek J.S., Bush J.A., Gotshalk L.A., Wagner P.R., Gómez A.L., Zatsiorsky V.M., Duarte M., Ratamess N.A., Mazzetti S.A., Selle B.J., “Influence of compression hosiery on physiological responses to standing fatigue in women,” The Human Performance Laboratory, Medicine & Science in Sports & Exercise, 2000, pp. 1849-1858. |
Papa E. and Cappozzo A., “A telescopic inverted-pendulum model of the musculo-skeletal system and its use for the analysis of the sit-to-stand motor task,” Journal of Biomechanics 32 (1999), pp. 1205-1212. |
Balance System, BalanceTrak 500, & Quantrem, ZapConnect.com: Medical Device Industry Portal, http://www.zapconnect.com/products/index.cfm/fuseaction/products, 2 pages. (Retrieved Apr. 5, 2011). |
Bertec: Dominate Your Field, Physician's Quick Guide, Version 1.0.0, Feb. 2010, 13 pages. |
Bertec: Dominate Your Field, Balancecheck Screener, Version 1.0.0, Feb. 2010, 35 pages. |
Bertec: Dominate Your Field, Balancecheck Trainer, Version 1.0.0, Feb. 2010, 37 pages. |
Bertec Corporation—Balancecheck Standard Screener Package, http://bertec.com/products/balance-systems/standard-screener.html, 1 page. (Retrieved Apr. 12, 2011). |
Bertec Corporation—Balance Systems: Balancecheck Advanced balance assessment & training products for the balance professional, http://bertec.com/products/balance-systems.html, 1 page. (Retrieved Mar. 31, 2011). |
Bertec Corporation—Balancecheck Mobile Screener Package: Portable balance screening with full functionality, http://bertec.com/products/balance-systems/mobile-screener.html, 1 page. (Retrieved Mar. 31, 2011). |
Bertec Corporation—Balancecheck Standard Screener & Trainer Package: Advanced balance screening and rehabilitation system, http://bertec.com/products/balance-systems/standard-screener-trainer.html, 1 page. (Retrieved Mar. 31, 2011). |
U.S. Trademark Application No. 75/136,330 filed Jul. 19, 1996, 47 pages. |
Bertec: Dominate Your Field, Digital Acquire 4, Version 4.0.10, Mar. 2011, 22 pages. |
Bertec: Dominate Your Field, Bertec Force Plates, Version 1.0.0, Sep. 2009, 31 pages. |
U.S. Trademark Application No. 73/542,230 filed Jun. 10, 1985, 52 pages. |
Brent L. Arnold and Randy J. Schmitz, “Examination of Balance Measures Produced by the Biodex Stability System,” Journal of Athletic Training, vol. 33(4), 1998, pp. 323-327. |
Trademark Registration No. 1,974,115 filed Mar. 28, 1994, 8 pages. |
U.S. Trademark Application No. 75/471,542 filed Apr. 16, 1998, 102 pages. |
VTI Force Platform, Zapconnect.com: Medical Device Industry Portal, http://zapconnect.com/products/index.cfm/fuseaction/products, 2 pages. (Retrieved Apr. 5, 2011). |
Amin M., Girardi M., Konrad H.R., Hughes L., “A Comparison of Electronystagmography Results with Posturography Findings from the BalanceTrak 500,” Otology Neurotology, 23(4), 2002, pp. 488-493. |
Girardi M., Konrad H.R., Amin M., Hughes L.F., “Predicting Fall Risks in an Elderly Population: Computer Dynamic Posturography Versus Electronystagmography Test Results,” Laryngoscope, 111(9), 2001, pp. 1528-1532. |
Dr. Guido Pagnacco, Publications, 1997-2008, 3 pages. |
College of Engineering and Applied Science: Electrical and Computer Engineering, University of Wyoming, Faculty: Guido Pagnacco, http://wwweng.uwyo.edu/electrical/faculty/Pagnacco.html, 2 pages. (Retrieved Apr. 20, 2011). |
EyeTracker, IDEAS, DIFRA, 510(k) Summary: premarket notification, Jul. 5, 2007, 7 pages. |
Vestibular technologies, copyright 2000-2004, 1 page. |
Scopus preview—Scopus—Author details (Pagnacco, Guido), http://www.scopus.com/authid/detail.url?authorId=6603709393, 2 pages. (Retrieved Apr. 20, 2011). |
Vestibular Technologies Company Page, “Vestibular technologies: Helping People Regain their Balance for Life,” http://www.vestibtech.com/AboutUs.html, 2 pages. (Retrieved Apr. 20, 2011). |
GN Otometrics Launches ICS Balance Platform: Portable system for measuring postural sway, http://audiologyonline.com/news/pf_news_detail.asp?news_id=3196, 1 page. (Retrieved Mar. 31, 2011). |
U.S. Trademark Application No. 75/508,272 filed Jun. 25, 1998, 36 pages. |
U.S. Trademark Application No. 75/756,991 filed Jul. 21, 1999, 9 pages. |
U.S. Trademark Application No. 76/148,037 filed Oct. 17, 2000, 78 pages. |
Vestibular technologies, VTI Products: BalanceTRAK User's Guide, Preliminary Version 0.1, 2005, 34 pages. |
U.S. Trademark Application 76/148,037 filed Oct. 17, 2000, 57 pages. |
Vestibular Technologies, Waybackmachine, http://vestibtech.com/balancetrak500.html, 7 pages. (Retrieved Mar. 30, 2011). |
Vestibular Technologies, 2004 Catalog, 32 pages. |
State of Delaware: The Official Website of the First State, Division of Corporations—Online Services, http://delecorp.delaware.gov/tin/controller, 2 pages. (Retrieved Mar. 21, 2011). |
Memorandum in Support of Plaintiff IA Labs' Motion for Partial Summary Judgment on Defendants' Affirmative Defense and Counterclaim That U.S. Patent No. 7,121,982 is Invalid Under 35 U.S.C. §§ 102 and 103, IA Labs CA, LLC, (Plaintiff) v. Nintendo Co., Ltd. et al., (Defendant), United States District Court for the District of Maryland Southern Division (Apr. 27, 2011), 17 pages. |
Interface, Inc.—Advanced Force Measurement—SM Calibration Certificate Installation Information, 1984. |
Hugh Stewart, “Isometric Joystick: A Study of Control by Adolescents and Young Adults with Cerebral Palsy,” The Australian Occupational Therapy Journal, Mar. 1992, vol. 39, No. 1, pp. 33-39. |
Raghavendra S. Rao, et al., “Evaluation of an Isometric and a Position Joystick in a Target Acquisition Task for Individuals with Cerebral Palsy,” IEEE Transactions on Rehabilitation Engineering, vol. 8, No. 1, Mar. 2000, pp. 118-125. |
D. Sengupta, et al., “Comparative Evaluation of Control Surfaces for Disabled Patients,” Proceedings of the 27th Annual Conference on Engineering in Medicine and Biology, vol. 16, Oct. 6-10, 1974, p. 356. |
Ludonauts, “Body Movin',” May 24, 2004, http://web.archive.org/web/20040611131903/http:/www.ludonauts.com; retrieved Aug. 31, 2010, 4 pages. |
Atari Gaming Headquarters—AGH's Atari Project Puffer Page, http://www.atarihq.com/othersec/puffer/index.html, retrieved Sep. 19, 2002, 4 pages. |
Michael Antonoff, “Real estate is cheap here, but the places you'd most want to visit are still under construction,” Popular Science, Jun. 1993, pp. 33-34. |
Steve Aukstakalnis and David Blatner, “The Art and Science of Virtual Reality—Silicon Mirage,” 1992, pp. 197-207. |
Electronics, edited by Michael Antonoff, “Video Games—Virtual Violence: Boxing Without Bruises,” Popular Science, Apr. 1993, p. 60. |
Stuart F. Brown, “Video cycle race,” Popular Science, May 1989, p. 73. |
Scanning the Field for Ideas, “Chair puts Player on the Joystick,” Machine Design, No. 21, Oct. 24, 1991, XP000255214, 1 page. |
Francis Hamit, “Virtual Reality and the Exploration of Cyberspace,” University of MD Baltimore County, 1993, 4 pages. |
Innovation in Action—Biofeedback Motor Control, Active Leg Press—IsoLegPress, 2 pages (date unknown). |
Ric Manning, “Videogame players get a workout with the Exertainment,” The Gizmo Page from the Courier Journal Sep. 25, 1994, 1 page. |
Tech Lines, Military—Arcade aces and Aviation—Winging it, Popular Mechanics, Mar. 1982, p. 163. |
Sarju Shah, “Mad Catz Universal MC2 Racing Wheel: Mad Catz MC2 Universal,” Game Spot, posted Feb. 18, 2005, 3 pages. |
Joe Skorupa, “Virtual Fitness,” Sports Science, Popular Mechanics, Oct. 1994, 3 pages. |
Nintendo Zone—The History of Nintendo (1889-1997), retrieved Aug. 24, 1998, pp. 1, 9-10. |
The Legible City, Computergraphic Installation with Dirk Groeneveld, Manhattan version (1989), Amsterdam version (1990), Karlsruhe version (1991), 3 pages. |
The New Exertainment System. It's All About Giving Your Members Personal Choices, Life Fitness, Circle Reader Service Card No. 28, 1995, 1 page. |
The Race Begins with $85, Randal Windracer, Circle Reader Service Card No. 34, 1990, 1 page. |
Universal S-Video/Audio Cable; Product #5015, MSRP 9.99; http://www.madcatz.com/Default.asp?Page=133&CategoryImg=Universal_Cables, retrieved May 12, 2005, 1 page. |
Tom Dang, et al., “Interactive Video Exercise System for Pediatric Brain Injury Rehabilitation,” Assistive Technology Research Center, Rehabilitation Engineering Service, National Rehabilitation Hospital, Proceedings of the RESNA 20th Annual Conference, Jun. 1998, 3 pages.
Raymond W. McGorry, “A system for the measurement of grip forces and applied moments during hand tool use,” Liberty Mutual Research Center for Safety and Health, Applied Ergonomics 32 (2001) 271-279.
NordicTrack's Aerobic Cross Trainer advertisement as shown in “Big Ideas—For a Little Money: Great Places to Invest $1,000 or Less,” Kiplinger's Personal Finance Magazine, Jul. 1994, 3 pages.
Maurice R. Masliah, “Measuring the Allocation of Control in 6 Degree of Freedom Human-Computer Interaction Tasks,” Graduate Department of Mechanical and Industrial Engineering, University of Toronto, 2001, 177 pages.
Leigh Ann Roman, “Boing! Combines Arcade Fun with Physical Training,” Memphis—Health Care News: Monitoring the Pulse of Our Health Care Community, Sep. 20, 1996, One Section, 1 page.
“No More Couch Potato Kids,” as shown in Orange Coast, Sep. 1994, p. 16.
Gary L. Downey, et al., “Design of an Exercise Arcade for Children with Disabilities,” Resna, Jun. 26-30, 1998, pp. 405-407.
Frank Serpas, et al., “Forward-dynamics Simulation of Anterior Cruciate Ligament Forces Developed During Isokinetic Dynamometry,” Computer Methods in Biomechanics and Biomedical Engineering, vol. 5 (1), 2002, pp. 33-43.
Carolyn Cosmos, “An ‘Out of Wheelchair Experience’”, The Washington Post, May 2, 2000, 3 pages.
David H. Ahl, “Controller update,” Creative Computing, vol. 9, No. 12, Dec. 1983, p. 142.
Ian Bogost, “Water Cooler Games—The Prehistory of Wii Fit,” Videogame Theory, Criticism, Design, Jul. 15, 2007, 2 pages.
Jeremy Reimer, “A history of the Amiga, part 2: The birth of Amiga,” last updated Aug. 12, 2007, 2 pages.
The Amiga Joyboard (1982) image, Photos: Fun with plastic—peripherals that changed gaming; http://news.cnet.com/2300-27076_3-10001507-2.html (retrieved Jul. 23, 2010), 1 page.
The Amiga Power System Joyboard, Amiga history guide, http://www.amigahistory.co.uk/joyboard.html (retrieved Jul. 23, 2010), 2 pages.
“Joyboard,” Wikipedia—the free encyclopedia, http://en.wikipedia.org/wiki/Joyboard (retrieved Jul. 26, 2010), 2 pages.
“Dance Dance Revolution,” Wikipedia—the free encyclopedia, http://en.wikipedia.org/wiki/Dance_Dance_Revolution (retrieved Jul. 23, 2010), 9 pages.
“Cure for the couch potato,” Kansas City Star (MO), Jan. 2, 2005, WLNR 22811884, 1 page.
JC Fletcher, “Virtually Overlooked: The Power Pad games,” Joystiq, http://www.joystiq.com/2007/09/20/virtually-overlooked-the-power-pad-games/ (retrieved Jul. 26, 2010), 3 pages.
Family Fun Fitness, Nintendo Entertainment System, BANDAI (date unknown).
“Power Pad/Family Fun and Fitness/Family Trainer,” http://www.gamersgraveyard.com/repository/nes/peripherals/powerpad.html (retrieved Jul. 26, 2010), 2 pages.
“Power Pad Information,” Version 1.0 (Sep. 23, 1999), http://www.gamersgraveyard.com/repository/nes/peripherals/powerpad.txt (retrieved Jul. 26, 2010), 2 pages.
Wii+Power+Pad.jpg (image), http://bpl.blogger.com/_J5LEiGp54I/RpZbNpnLDgl/AAAAAAAAAic/Gum6DD3Umjg/s1600-h/Wii+Power+Pad.jpg (retrieved Jul. 26, 2010), 1 page.
Vs. Slalom—Videogame by Nintendo, KLOV—Killer List of Video Games, http://www.arcade-museum.com/game_detail.php?game_id=10368 (retrieved Jul. 26, 2010), 3 pages.
“Nintendo Vs. System,” Wikipedia—the free encyclopedia, http://en.wikipedia.org/wiki/Nintendo_Vs._System (retrieved Jul. 26, 2010), 3 pages.
Family Fun Fitness: Basic Set (Control Mat and Athletic World Game Pak), Nintendo Entertainment System, Bandai (date unknown).
Roll & Rocker, Enteractive (image), 2 pages (date unknown).
Candace Putnam, “Software for Hardbodies: A virtual-reality hike machine takes you out on the open road,” Design, 1 page (date unknown).
“Top Skater,” Sega Amusements U.S.A., Inc., 1 page (date unknown).
Nintendo Co., Ltd. and Nintendo of America Inc.'s Opening Claim Construction Brief, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Southern Division (Dec. 13, 2010), 55 pages.
Plaintiff IA Labs CA, LLC's Response Claim Construction Brief, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Southern Division (Jan. 6, 2011), 49 pages.
Nintendo Co., Ltd. and Nintendo of America Inc.'s Closing Claim Construction Brief, IA Labs CA, LLC v. Nintendo Co., Ltd. and Nintendo of America, Inc., United States District Court for the District of Maryland Southern Division (Jan. 6, 2011), 25 pages.
Expert Report of Lee Rawls, Nov. 2, 2010, 37 pages (redacted).
Addlesee, M.D., et al., “The ORL Active Floor,” IEEE Personal Communications, Oct. 1997.
Baek, Seongmin, et al., “Motion Evaluation for VR-based Motion Training,” Eurographics 2001, vol. 20, No. 3, 2001.
Biodex Medical Systems, Inc.—Balance System SD Product Information—http://www.biodex.com/rehab/balance/balance_300feat.htm.
Chen, I-Chun, et al., “Effects of Balance Training on Hemiplegic Stroke Patients,” Chang Gung Medical Journal, vol. 25, No. 9, pp. 583-590, Sep. 2002.
Dingwell, Jonathan, et al., “A Rehabilitation Treadmill with Software for Providing Real-Time Gait Analysis and Visual Feedback,” Transactions of the ASME, Journal of Biomechanical Engineering, 118 (2), pp. 253-255, 1996.
Geiger, Ruth Ann, et al., “Balance and Mobility Following Stroke: Effects of Physical Therapy Interventions With and Without Biofeedback/Forceplate Training,” Physical Therapy, vol. 81, No. 4, pp. 995-1005, Apr. 2001.
Harikae, Miho, “Visualization of Common People's Behavior in the Barrier Free Environment,” Graduate Thesis—Master of Computer Science and Engineering in the Graduate School of the University of Aizu, Mar. 1999.
Hodgins, J.K., “Three-Dimensional Human Running,” Proceedings: 1996 IEEE International Conference on Robotics and Automation, vol. 4, Apr. 1996.
Kim, Jong Yun, et al., “Abstract—A New VR Bike System for Balance Rehabilitation Training,” Proceedings: 2001 IEEE Seventh International Conference on Virtual Systems and Multimedia, Oct. 2001.
McComas, Joan, et al., “Virtual Reality Applications for Prevention, Disability Awareness, and Physical Therapy Rehabilitation in Neurology: Our Recent Work,” School of Rehabilitation Sciences, University of Ottawa—Neurology Report, vol. 26, No. 2, pp. 55-61, 2002.
Nicholas, Deborah S., “Balance Retraining After Stroke Using Force Platform Feedback,” Physical Therapy, vol. 77, No. 5, pp. 553-558, May 1997.
Redfern, Mark, et al., “Visual Influences of Balance,” Journal of Anxiety Disorders, vol. 15, pp. 81-94, 2001.
Sackley, Catherine, “Single Blind Randomized Controlled Trial of Visual Feedback After Stroke: Effects on Stance Symmetry and Function,” Disability and Rehabilitation, vol. 19, No. 12, pp. 536-546, 1997.
Tossavainen, Timo, et al., “Postural Control as Assessed with Virtual Reality,” Acta Otolaryngol, Suppl 545, pp. 53-56, 2001.
Tossavainen, Timo, et al., “Towards Virtual Reality Simulation in Force Platform Posturography,” MEDINFO, pp. 854-857, 2001.
Tsutsuguchi, Ken, et al., “Human Walking Animation Based on Foot Reaction Force in the Three-Dimensional Virtual World,” The Journal of Visualization and Computer Animation, vol. 11, pp. 3-16, 2000.
Wong, Alice, et al., “The Development and Clinical Evaluation of a Standing Biofeedback Trainer,” Journal of Rehabilitation Research and Development, vol. 34, No. 3, pp. 322-327, Jul. 1997.
Yang, Ungyeon, et al., “Implementation and Evaluation of ‘Just Follow Me’: An Immersive, VR-Based, Motion-Training System,” Presence, vol. 11, No. 3, pp. 304-323, 2002.
Search Report (2 pgs.) dated May 27, 2011 issued in German Application No. 20 2004 021 793.7.
Number | Date | Country
---|---|---
20100169110 A1 | Jul 2010 | US