This nonprovisional application is based on Japanese Patent Application No. 2011-197032 filed with the Japan Patent Office on Sep. 9, 2011, the entire contents of which are hereby incorporated by reference.
The invention generally relates to a game system, a portable game device, a method of controlling an information processing unit, and a non-transitory storage medium encoded with a computer readable program for controlling an information processing unit.
Some conventional portable game devices are provided with an angular velocity sensor, which is one type of inertial sensor. For example, an angle of rotation at the time when a user moves the game device is sensed by the angular velocity sensor. Then, a virtual camera in a virtual space is moved in accordance with the sensed angle of rotation, and an image resulting from image pick-up of a virtual object in the virtual space with that virtual camera is generated. Thus, by moving the portable game device to move the position of the virtual camera, a virtual object viewed from different viewpoints can be displayed.
Meanwhile, in the case of a portable game device, various operation positions for operating the game device can be taken; for example, there are a case where the game device is placed and operated on a flat surface such as a desk and a case where the game device is held in the hands at an angle close to perpendicular.
In a conventional game device, game processing utilizing motion of the device has been performed; however, game processing has not been changed in consideration of the operation position of the operated device.
An exemplary embodiment provides a game system, a portable game device, a method of controlling an information processing unit, and a non-transitory storage medium encoded with a computer readable program for controlling an information processing unit, capable of changing game processing in consideration of an operation position of an operated device.
An exemplary embodiment provides a game system. The game system includes an operation apparatus and an information processing unit for transmitting and receiving data to and from the operation apparatus. The operation apparatus includes a display portion for displaying information, an operation portion for accepting a user's operation input, and a sensor portion for obtaining position data in a space where the operation apparatus is present. The information processing unit includes a viewpoint mode switching unit for setting a first viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a first virtual camera in a virtual space on a game when the position data of the operation apparatus obtained by the sensor portion is in a first range and setting a second viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a second virtual camera when the position data of the operation apparatus is in a second range, and a game processing unit for performing game processing in accordance with the operation input accepted by the operation portion.
According to the exemplary embodiment, the viewpoint mode switching unit for setting a first viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a first virtual camera in a virtual space on a game when position data of an axis of a prescribed direction in a reference direction in the space where the operation apparatus is present is in a first range and setting a second viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a second virtual camera when position data of the operation apparatus is in a second range is provided, so that change in game processing in accordance with the viewpoint mode in accordance with the operation position can be made and resultant game processing can be performed. Therefore, zest of a game can be increased. In addition, since switching between the viewpoint modes can be made in accordance with the operation position, an operation of a button or the like for switching between the viewpoint modes is not necessary, a desired viewpoint mode can readily be set, and operability is also good.
In an exemplary embodiment, the operation portion has a direction input portion for accepting at least an input of a direction, and the game processing unit moves a first object provided in the virtual space in accordance with the input of the direction accepted by the direction input portion.
In an exemplary embodiment, the game processing unit controls the position and the direction of image pick-up of the second virtual camera in accordance with the position data of the operation apparatus in the second viewpoint mode.
According to the exemplary embodiment, in the second viewpoint mode, the position and the direction of image pick-up of the second virtual camera are controlled in accordance with the position data of the operation apparatus. Thus, a viewpoint is further changed in accordance with the operation position and hence zest of a game is further increased.
In an exemplary embodiment, the game processing unit provides a second object of which movement in the virtual space is controlled, and the game processing unit controls movement of the second object in accordance with the operation input accepted by the operation portion at prescribed timing in the first viewpoint mode and controls movement thereof in accordance with the position data of the operation apparatus at prescribed timing in the second viewpoint mode.
In an exemplary embodiment, the game processing unit provides a second object of which movement in the virtual space is controlled, and the game processing unit controls movement of the second object in accordance with the operation input accepted by the operation portion at prescribed timing in the first viewpoint mode, and controls movement thereof in accordance with the operation input accepted by the operation portion at prescribed timing and controls movement thereof in accordance with the position data of the operation apparatus at timing different from the prescribed timing in the second viewpoint mode.
According to the exemplary embodiment, in the second viewpoint mode, movement of the second object is controlled in accordance with the position data of the operation apparatus. Therefore, game processing in coordination with a manner of operation (motion) of the operation apparatus can be performed and hence zest of a game is further increased.
In an exemplary embodiment, in the virtual space, the first virtual camera is set at a position higher than positions of the first object and the second virtual camera.
According to the exemplary embodiment, by setting the first virtual camera at a position higher than the position of the second virtual camera, an overhead view image can be obtained and displayed so that zest of a game is increased owing to different viewpoints.
In an exemplary embodiment, a case where the position data of the operation apparatus is in the first range refers to a case where an orientation of a display surface of the display portion is close to horizontal, and a case where it is in the second range refers to a case where an orientation of the display surface of the display portion is close to vertical.
According to the exemplary embodiment, since the viewpoint mode can be switched between a case where an orientation of the display surface of the display portion is close to horizontal and a case where an orientation of the display surface of the display portion is close to vertical, a desired viewpoint mode can readily be set and operability is good.
In an exemplary embodiment, the sensor portion corresponds to an inertial sensor.
In an exemplary embodiment, the inertial sensor includes an acceleration sensor and an angular velocity sensor.
According to the exemplary embodiment, since the sensor portion is implemented by an acceleration sensor and an angular velocity sensor, the operation position and the manner of operation (motion) of the operation apparatus can both be detected and independently be reflected on game processing. Therefore, zest of a game is increased.
In an exemplary embodiment, the information processing unit is provided in a main body apparatus provided separately from the operation apparatus, the operation apparatus further includes a communication portion for transmitting and receiving data to and from the main body apparatus, the communication portion of the operation apparatus transmits the operation input accepted by the operation portion and the position data obtained by the sensor portion to the main body apparatus and receives game image data in accordance with the game processing performed by the information processing unit of the main body apparatus, and the display portion of the operation apparatus displays a game image in accordance with the received game image data.
According to the exemplary embodiment, since a portable game device or the operation apparatus is provided separately from the main body apparatus, operability is good.
An exemplary embodiment provides a game system. The game system includes an operation apparatus and an information processing unit for transmitting and receiving data to and from the operation apparatus. The operation apparatus includes a display portion for displaying information, an operation portion for accepting a user's operation input, and a sensor portion for obtaining first position data of an axis of a first prescribed direction in a reference direction in a space where the operation apparatus is present and second position data around an axis of a second prescribed direction. The information processing unit performs game processing in accordance with the operation input accepted by the operation portion when the first position data of the operation apparatus obtained by the sensor portion is in a first range and performs game processing in accordance with the operation input accepted by the operation portion and the second position data of the operation apparatus obtained by the sensor portion when the first position data of the operation apparatus obtained by the sensor portion is in a second range.
According to the exemplary embodiment, when the first position data of the operation apparatus is in the first range, game processing in accordance with the operation input accepted by the operation portion is performed, and when the first position data of the operation apparatus is in the second range, game processing in accordance with the operation input accepted by the operation portion and the second position data of the operation apparatus obtained by the sensor portion is performed. Namely, since inputs accepted in performing game processing in accordance with the operation position of the operation apparatus are different, zest of a game is increased.
An exemplary embodiment provides a portable game device. The portable game device includes an operation apparatus and an information processing unit for transmitting and receiving data to and from the operation apparatus. The operation apparatus includes a display portion for displaying information, an operation portion for accepting a user's operation input, and a sensor portion for obtaining position data in a space where the operation apparatus is present. The information processing unit includes a viewpoint mode switching unit for setting a first viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a first virtual camera in a virtual space on a game when the position data of the operation apparatus obtained by the sensor portion is in a first range and setting a second viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a second virtual camera when the position data of the operation apparatus is in a second range, and a game processing unit for performing game processing in accordance with the operation input accepted by the operation portion.
An exemplary embodiment provides a method of controlling an information processing unit, and the method of controlling an information processing unit is a method of controlling an information processing unit operating in cooperation with an operation apparatus having a display portion for displaying information. The method of controlling an information processing unit includes the steps of accepting a user's operation input, obtaining position data in a space where the operation apparatus is present, setting a first viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a first virtual camera in a virtual space on a game when the obtained position data of the operation apparatus is in a first range and a second viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a second virtual camera when the position data of the operation apparatus is in a second range, and performing game processing in accordance with the accepted operation input.
According to the exemplary embodiment, owing to the step of setting a first viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a first virtual camera in a virtual space on a game when position data of an axis of a prescribed direction in a reference direction in the space where the operation apparatus is present is in a first range and a second viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a second virtual camera when position data of the operation apparatus is in a second range, change in game processing in accordance with the viewpoint mode in accordance with the operation position can be made and resultant game processing can be performed, and therefore zest of a game can be increased. In addition, since switching between the viewpoint modes can be made in accordance with the operation position, an operation of a button or the like for switching between the viewpoint modes is not necessary, a desired viewpoint mode can readily be set, and operability is also good.
In an exemplary embodiment, the step of accepting a user's operation input includes the step of accepting at least an input of a direction, and in the step of performing game processing, a first object provided in the virtual space is moved in accordance with the accepted input of the direction.
In an exemplary embodiment, in the step of performing game processing, in the second viewpoint mode, the position and the direction of image pick-up of the second virtual camera are controlled in accordance with the position data of the operation apparatus.
According to the exemplary embodiment, in the second viewpoint mode, the position and the direction of image pick-up of the second virtual camera are controlled in accordance with the position data of the operation apparatus. Thus, a viewpoint is further changed in accordance with the operation position and hence zest of a game is further increased.
In an exemplary embodiment, the step of performing game processing includes the steps of providing a second object of which movement in the virtual space is controlled, controlling movement of the second object in accordance with the operation input accepted at prescribed timing in the first viewpoint mode, and controlling movement of the second object in accordance with the position data of the operation apparatus at prescribed timing in the second viewpoint mode.
In an exemplary embodiment, the step of performing game processing includes the steps of providing a second object of which movement in the virtual space is controlled, controlling movement of the second object in accordance with the operation input accepted at prescribed timing in the first viewpoint mode, and controlling movement of the second object in accordance with the operation input accepted at prescribed timing and controlling movement thereof in accordance with the position data of the operation apparatus at timing different from the prescribed timing in the second viewpoint mode.
According to the exemplary embodiment, in the second viewpoint mode, movement of the second object is controlled in accordance with the position data of the operation apparatus. Therefore, game processing in coordination with a manner of operation (motion) of the operation apparatus can be performed and hence zest of a game is further increased.
In an exemplary embodiment, in the virtual space, the first virtual camera is set at a position higher than positions of the first object and the second virtual camera.
According to the exemplary embodiment, by setting the first virtual camera at a position higher than the position of the second virtual camera, an overhead view image can be obtained and displayed, so that zest of a game is increased owing to different viewpoints.
In an exemplary embodiment, a case where the position data of the operation apparatus is in the first range refers to a case where an orientation of a display surface of the display portion is close to horizontal, and a case where it is in the second range refers to a case where an orientation of the display surface of the display portion is close to vertical.
According to the exemplary embodiment, since the viewpoint mode can be switched between a case where an orientation of the display surface of the display portion is close to horizontal and a case where an orientation of the display surface of the display portion is close to vertical, a desired viewpoint mode can readily be set and operability is good.
An exemplary embodiment provides a method of controlling an information processing unit. The information processing unit operates in cooperation with an operation apparatus having a display portion for displaying information. The method of controlling an information processing unit includes the steps of accepting a user's operation input, obtaining first position data of an axis of a first prescribed direction in a reference direction in a space where the operation apparatus is present and second position data around an axis of a second prescribed direction, and performing game processing in accordance with the accepted operation input when the obtained first position data of the operation apparatus is in a first range and performing game processing in accordance with the accepted operation input and the obtained second position data of the operation apparatus when the obtained first position data of the operation apparatus is in a second range.
According to the exemplary embodiment, when the first position data of the operation apparatus is in the first range, game processing in accordance with the accepted operation input is performed, and when the first position data of the operation apparatus is in the second range, game processing in accordance with the accepted operation input and the obtained second position data of the operation apparatus is performed. Namely, since inputs accepted in performing game processing in accordance with the operation position of the operation apparatus are different, zest of a game is increased.
An exemplary embodiment provides a non-transitory storage medium encoded with a computer readable program for controlling an information processing unit and executable by a computer of the information processing unit. The information processing unit operates in cooperation with an operation apparatus having a display portion for displaying information. The program for controlling an information processing unit includes instructions for accepting a user's operation input, instructions for obtaining position data in a space where the operation apparatus is present, instructions for setting a first viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a first virtual camera in a virtual space on a game when the obtained position data of the operation apparatus is in a first range and setting a second viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a second virtual camera when the position data of the operation apparatus is in a second range, and instructions for performing game processing in accordance with the accepted operation input.
According to the exemplary embodiment, owing to the step of setting a first viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a first virtual camera in a virtual space on a game when position data of an axis of a prescribed direction in a reference direction in the space where the operation apparatus is present is in a first range and a second viewpoint mode for displaying on the display portion a display image picked up based on a position and a direction of image pick-up of a second virtual camera when position data of the operation apparatus is in a second range, change in game processing in accordance with the viewpoint mode in accordance with the operation position can be made and resultant game processing can be performed, and therefore zest of a game can be increased. In addition, since switching between the viewpoint modes can be made in accordance with the operation position, an operation of a button or the like for switching between the viewpoint modes is not necessary, a desired viewpoint mode can readily be set, and operability is also good.
In an exemplary embodiment, the instructions for accepting a user's operation input include instructions for accepting at least an input of a direction, and the instructions for performing game processing cause a first object provided in the virtual space to move in accordance with the accepted input of the direction.
In an exemplary embodiment, the instructions for performing game processing control the position and the direction of image pick-up of the second virtual camera in accordance with the position data of the operation apparatus in the second viewpoint mode.
According to the exemplary embodiment, in the second viewpoint mode, the position and the direction of image pick-up of the second virtual camera are controlled in accordance with the position data of the operation apparatus. Thus, a viewpoint is further changed in accordance with the operation position and hence zest of a game is further increased.
In an exemplary embodiment, the instructions for performing game processing include instructions for providing a second object of which movement in the virtual space is controlled, instructions for controlling movement of the second object in accordance with the operation input accepted at prescribed timing in the first viewpoint mode, and instructions for controlling movement of the second object in accordance with the position data of the operation apparatus at prescribed timing in the second viewpoint mode.
In an exemplary embodiment, the instructions for performing game processing include instructions for providing a second object of which movement in the virtual space is controlled, instructions for controlling movement of the second object in accordance with the operation input accepted at prescribed timing in the first viewpoint mode, and instructions for controlling movement of the second object in accordance with the operation input accepted at prescribed timing and instructions for controlling movement thereof in accordance with the position data of the operation apparatus at timing different from the prescribed timing in the second viewpoint mode.
According to the exemplary embodiment, in the second viewpoint mode, movement of the second object is controlled in accordance with the position data of the operation apparatus. Therefore, game processing in coordination with a manner of operation (motion) of the operation apparatus can be performed and hence zest of a game is further increased.
In an exemplary embodiment, in the virtual space, the first virtual camera is set at a position higher than positions of the first object and the second virtual camera.
According to the exemplary embodiment, by setting the first virtual camera at a position higher than the position of the second virtual camera, an overhead view image can be obtained and displayed, so that zest of a game is increased owing to different viewpoints.
In an exemplary embodiment, a case where the position data of the operation apparatus is in the first range refers to a case where an orientation of a display surface of the display portion is close to horizontal, and a case where it is in the second range refers to a case where an orientation of the display surface of the display portion is close to vertical.
According to the exemplary embodiment, since the viewpoint mode can be switched between a case where an orientation of the display surface of the display portion is close to horizontal and a case where an orientation of the display surface of the display portion is close to vertical, a desired viewpoint mode can readily be set and operability is good.
An exemplary embodiment provides a non-transitory storage medium encoded with a computer readable program for controlling an information processing unit and executable by a computer of the information processing unit. The information processing unit operates in cooperation with an operation apparatus having a display portion for displaying information. The program for controlling an information processing unit includes instructions for accepting a user's operation input, instructions for obtaining first position data of an axis of a first prescribed direction in a reference direction in a space where the operation apparatus is present and second position data around an axis of a second prescribed direction, and instructions for performing game processing in accordance with the accepted operation input when the obtained first position data of the operation apparatus is in a first range and performing game processing in accordance with the accepted operation input and the obtained second position data of the operation apparatus when the obtained first position data of the operation apparatus is in a second range.
According to the exemplary embodiment, when the first position data of the operation apparatus is in the first range, game processing in accordance with the accepted operation input is performed, and when the first position data of the operation apparatus is in the second range, game processing in accordance with the accepted operation input and the obtained second position data of the operation apparatus is performed. Namely, since inputs accepted in performing game processing in accordance with the operation position of the operation apparatus are different, zest of a game is increased.
The foregoing and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
This embodiment will be described in detail with reference to the drawings. The same or corresponding elements in the drawings have the same reference characters allotted and description thereof will not be repeated.
[Configuration of Appearance of Game Device]
A game device according to one example (one embodiment) will be described below.
Referring to
As shown in
(Description of Lower Housing)
Initially, a construction of lower housing 11 will be described. As shown in
As shown in
As shown in
Operation buttons 14A to 14I are input devices for providing a prescribed input. As shown in
Slide pad 15 is a device for indicating a direction. A key top of slide pad 15 is constructed to slide in parallel to the inner surface of lower housing 11. Slide pad 15 functions in accordance with a program executed by game device 10. For example, when game device 10 executes a game in which a prescribed object appears in a three-dimensional virtual space, slide pad 15 functions as an input device for moving the prescribed object in the three-dimensional virtual space. In this case, the prescribed object is moved in the direction in which the key top of slide pad 15 slides. It is noted that a component allowing an analog input based on inclination by a prescribed amount in any of the up, down, left, right, and diagonal directions may be employed as slide pad 15. In the present example, it is assumed that slide pad 15 is used as direction indication input means for movement or the like of a player object in the game processing which will be described later. In addition, it is assumed that slide pad 15 is also used as indication input means for correcting a direction of movement of a ball object.
It is noted that cross-shaped button 14A can also be used instead of slide pad 15.
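As a concrete illustration of this kind of direction input (the embodiment itself prescribes no code), the following Python sketch maps an analog pad displacement to movement of an object on a plane; the function name, the dead zone, and the speed constant are assumptions for illustration only.

```python
import math

def move_player(pos, pad_x, pad_y, speed=0.1):
    """Move a player coordinate on the XY plane by an analog pad input.

    pad_x and pad_y are assumed to be normalized to [-1.0, 1.0]; the
    object moves in the slid direction, scaled by the slide amount.
    """
    if math.hypot(pad_x, pad_y) < 0.1:   # small dead zone at the center
        return pos
    x, y = pos
    return (x + pad_x * speed, y + pad_y * speed)

# Sliding the pad halfway up and right moves the object diagonally.
print(move_player((0.0, 0.0), 0.5, 0.5))   # (0.05, 0.05)
```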
Further, microphone hole 18 is provided in the inner surface of lower housing 11. A microphone (see
Though not shown, an L button 14J and an R button 14K are provided on the upper side surface of lower housing 11. L button 14J and R button 14K can function, for example, as a shutter button of the image pick-up portion (a shooting indication button). Further, though not shown, a sound volume button 14L is provided on a left side surface of lower housing 11. Sound volume button 14L is used for adjusting volume of a speaker provided in game device 10.
Furthermore, as shown in
In addition, as shown in
Moreover, as shown in
Though not shown, a rechargeable battery serving as a power supply of game device 10 is accommodated in lower housing 11, and the battery can be charged through a terminal provided in a side surface (for example, the upper side surface) of lower housing 11.
(Description of Upper Housing)
A construction of upper housing 21 will now be described. As shown in
As shown in
Upper LCD 22 is a display capable of displaying an image that can stereoscopically be viewed. In addition, in the present embodiment, an image for left eye and an image for right eye are displayed by using substantially the same display region. Specifically, a display is adapted to a scheme in which an image for left eye and an image for right eye are alternately displayed in a lateral direction in a prescribed unit (for example, one column by one column). Alternatively, a display may be adapted to a scheme in which an image for left eye and an image for right eye are alternately displayed in a time-divided manner. In addition, in the present embodiment, a display can allow stereoscopic vision with naked eyes. Then, a lenticular type or parallax barrier type display is used such that an image for left eye and an image for right eye alternately displayed in a lateral direction can be viewed as decomposed by left and right eyes respectively. In the present embodiment, it is assumed that a parallax barrier type display is employed as upper LCD 22. Upper LCD 22 displays an image that can stereoscopically be viewed with naked eyes (hereinafter referred to as a stereoscopic image) by using an image for right eye and an image for left eye. Namely, upper LCD 22 can display a stereoscopic image with stereoscopic effect as felt by a user by having the user's left and right eyes visually recognize the image for left eye and the image for right eye respectively with the use of a parallax barrier. In addition, upper LCD 22 can inactivate the parallax barrier above, and when the parallax barrier is inactivated, an image can two-dimensionally be displayed (a two-dimensional image in a sense contrary to stereoscopic vision described above can be displayed; that is, such a display mode that the same displayed image is viewed both by right and left eyes). Thus, upper LCD 22 is a display capable of switching between stereoscopic display for displaying a stereoscopic image that can stereoscopically be viewed (a stereoscopic display mode) and two-dimensional display for two-dimensionally displaying an image (displaying a two-dimensional image) (a two-dimensional display mode). Switching of display can be made by processing by a CPU 311 or by 3D adjustment switch 25 which will be described later.
Outer image pick-up portion 23 is a collective denotation of two image pick-up portions (23a and 23b) provided on an outer surface (a rear surface opposite to the main surface where upper LCD 22 is provided) 21D of upper housing 21. Directions of image pick-up of outer image pick-up portion (left) 23a and outer image pick-up portion (right) 23b are both a direction of normal extending outward from outer surface 21D. Outer image pick-up portion (left) 23a and outer image pick-up portion (right) 23b can be used as stereo cameras by means of a program executed by game device 10. Outer image pick-up portion (left) 23a and outer image pick-up portion (right) 23b each include image pick-up elements (for example, a CCD image sensor, a CMOS image sensor, or the like) having prescribed common resolution and a lens. A lens may have a zoom mechanism.
Inner image pick-up portion 24 is an image pick-up portion provided in an inner surface (main surface) 21B of upper housing 21 and having a direction of normal extending inward from the inner surface as a direction of image pick-up. Inner image pick-up portion 24 includes image pick-up elements (for example, a CCD image sensor, a CMOS image sensor, or the like) having prescribed resolution and a lens. A lens may have a zoom mechanism.
Three-dimension adjustment switch 25 is a slide switch used for switching between the display modes of upper LCD 22 as described above and for adjusting stereoscopic effect of a stereoscopic image displayed on upper LCD 22. As will clearly be described later, however, in the present embodiment, by way of example, an image displayed on upper LCD 22 is switched between a stereoscopic image and a two-dimensional image in accordance with an angle of inclination of lower housing 11, regardless of 3D adjustment switch 25. Specifically, when game device 10 is set at an angle close to perpendicular, that is, when the angle of inclination at which lower housing 11 is inclined with respect to the horizontal plane increases and exceeds a threshold value, a two-dimensional image is displayed.
Three-dimension indicator 26 indicates whether upper LCD 22 is in a stereoscopic display mode or not. Three-dimension indicator 26 is implemented by an LED and it illuminates when the stereoscopic display mode of upper LCD 22 is active. It is noted that 3D indicator 26 may illuminate only when upper LCD 22 is in the stereoscopic display mode and program processing for displaying a stereoscopic image is being performed. As shown in
In addition, a speaker hole 21E is provided in the inner surface of upper housing 21. Voice and sound from a speaker 44 which will be described later is output through this speaker hole 21E.
[Internal Configuration of Game Device 10]
An internal electrical configuration of game device 10 will now be described with reference to
Information processing unit 31 is information processing means including CPU (Central Processing Unit) 311 for executing a prescribed program, a GPU (Graphics Processing Unit) 312 for image processing, and the like. CPU 311 of information processing unit 31 performs processing in accordance with a program by executing the program stored in a memory in game device 10 (for example, external memory 45 connected to external memory I/F 33 or internal memory for data saving 35). It is noted that a program executed by CPU 311 of information processing unit 31 may be obtained from other equipment through communication with other equipment. In addition, information processing unit 31 includes a VRAM (Video RAM) 313. GPU 312 of information processing unit 31 generates an image in response to an instruction from CPU 311 of information processing unit 31 and renders the image in VRAM 313. Then, GPU 312 of information processing unit 31 outputs the image rendered in VRAM 313 to upper LCD 22 and/or lower LCD 12, so that upper LCD 22 and/or lower LCD 12 display(s) the image.
Main memory 32, external memory I/F 33, external memory I/F for data saving 34, and internal memory for data saving 35 are connected to information processing unit 31. External memory I/F 33 is an interface for removably connecting external memory 45. In addition, external memory I/F for data saving 34 is an interface for removably connecting external memory for data saving 46.
Main memory 32 is volatile storage means used as a work area or a buffer area of (CPU 311 of) information processing unit 31. Namely, main memory 32 temporarily stores various types of data used for processing based on the program above or temporarily stores a program obtained from the outside (external memory 45, other equipment, and the like). In the present embodiment, for example, a PSRAM (Pseudo-SRAM) is employed as main memory 32.
External memory 45 is non-volatile storage means for storing a program executed by information processing unit 31. External memory 45 is implemented, for example, by a read-only semiconductor memory. As external memory 45 is connected to external memory I/F 33, information processing unit 31 can read a program stored in external memory 45. As information processing unit 31 executes the read program, prescribed processing is performed. External memory for data saving 46 is implemented by a non-volatile readable and writable memory (for example, a NAND type flash memory) and it is used for storing prescribed data. For example, external memory for data saving 46 stores an image picked up by outer image pick-up portion 23 or an image picked up by other equipment. As external memory for data saving 46 is connected to external memory I/F for data saving 34, information processing unit 31 reads an image stored in external memory for data saving 46 so that upper LCD 22 and/or lower LCD 12 can display the image.
Internal memory for data saving 35 is implemented by a non-volatile readable and writable memory (for example, a NAND type flash memory) and used for storing prescribed data. For example, internal memory for data saving 35 stores data or a program downloaded through wireless communication via wireless communication module 36.
Wireless communication module 36 has a function to connect to a wireless LAN, for example, under a scheme complying with the IEEE 802.11b/g standards. In addition, local communication module 37 has a function to establish wireless communication with a game device of the same type under a prescribed communication scheme (for example, communication based on a proprietary protocol or infrared communication). Wireless communication module 36 and local communication module 37 are connected to information processing unit 31. Information processing unit 31 can transmit and receive data to and from other equipment over the Internet through wireless communication module 36, or transmit and receive data to and from another game device of the same type through local communication module 37.
In addition, RTC 38 and power supply circuit 41 are connected to information processing unit 31. RTC 38 counts time and outputs the time to information processing unit 31. Information processing unit 31 calculates the current time (date) based on the time counted by RTC 38.
Acceleration sensor 39, which is one type of inertial sensor, is further connected to information processing unit 31. Acceleration sensor 39 senses magnitude of acceleration in a linear direction (linear acceleration) along each of three axes (the x, y, and z axes). Acceleration sensor 39 is provided inside lower housing 11. As shown in
For example, in a static state, that is, in a case where processing is performed assuming that the acceleration detected by acceleration sensor 39 is only the acceleration of gravity, whether game device 10 (in the present example, mainly the z axis perpendicular to the inner surface (main surface) of lower housing 11) is inclined with respect to the direction of gravity, and to what extent, can be determined based on the detected acceleration data. Specifically, with a state in which an axis of detection of the acceleration sensor extends in the vertically downward direction defined as the reference, whether the game device is inclined or not can be determined based simply on whether 1 G (the acceleration of gravity) is applied along that axis, and the extent of inclination can be determined based on its magnitude.
In the case of a multi-axis acceleration sensor, the extent to which the game device is inclined with respect to the direction of gravity can be determined in further detail by further processing the acceleration data for each axis. Even on the premise that acceleration sensor 39 is in a dynamic state, inclination with respect to the direction of gravity can be determined by eliminating, through prescribed processing, the acceleration attributable to motion of the acceleration sensor.
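As a rough illustration of the inclination calculation described above, the Python sketch below assumes a static state in which the sensed three-axis acceleration, expressed in G, is gravity alone; it returns the unsigned angle between the z axis and the direction of gravity, whereas the signed range of −180° to 180° used later would additionally require examining the signs of the other axes. The function name and units are illustrative assumptions.

```python
import math

def inclination_deg(ax, ay, az):
    """Unsigned angle between the device z axis and sensed gravity.

    Assumes a static state, so the three-axis acceleration (in G)
    consists of gravity alone; 0 when the device lies flat on a desk,
    approaching 90 as the housing is raised toward vertical.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        raise ValueError("no acceleration sensed")
    # az / g is the cosine of the angle between gravity and the z axis.
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))

print(inclination_deg(0.0, 0.0, 1.0))   # flat on a desk  -> 0.0
print(inclination_deg(0.0, 1.0, 0.0))   # held upright    -> 90.0
```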
Angular velocity sensor (a gyro sensor) 40 representing one type of an inertial sensor is connected to information processing unit 31. Angular velocity sensor 40 senses an angular velocity produced around three axes (in the present embodiment, xyz axes) of game device 10 and outputs position data (angular velocity data) indicating the sensed angular velocity to information processing unit 31. Angular velocity sensor 40 is provided, for example, in lower housing 11. Information processing unit 31 calculates motion of game device 10 as it receives the angular velocity data output from angular velocity sensor 40.
In the present example, acceleration sensor 39 is employed, by way of example, in sensing an operation position of game device 10. Specifically, it is assumed that the extent to which the direction of the z axis of game device 10 (the direction perpendicular to the inner surface (main surface) of lower housing 11) is inclined with respect to the direction of gravity, that is, an angle of inclination, is calculated based on the acceleration data, and the position is sensed based on that angle of inclination.
Alternatively, angular velocity sensor 40 is employed, by way of example, in sensing motion of game device 10. Specifically, it is assumed that an angular velocity produced around the direction of gravity of game device 10 is calculated based on the angular velocity data, and the extent to which the game device has rotated from the reference, that is, the angle of rotation (motion), is sensed based on that angular velocity.
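A minimal sketch of how such an angle of rotation could be accumulated from angular velocity data is given below; it assumes yaw samples in degrees per second arriving at a fixed tick length dt, and uses plain rectangular integration, an illustrative simplification rather than the embodiment's actual method.

```python
def integrate_yaw(samples, dt):
    """Accumulate an angle of rotation around the gravity axis.

    samples holds yaw angular velocities in degrees per second, one per
    sensor tick of dt seconds; rectangular integration keeps it simple.
    """
    angle = 0.0
    for w in samples:
        angle += w * dt
    return angle

# Ten ticks at 90 deg/s, 1/60 s apart -> 15 degrees of rotation.
print(integrate_yaw([90.0] * 10, 1.0 / 60.0))
```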
Power supply circuit 41 controls electric power from a power supply provided in game device 10 (the rechargeable battery above accommodated in lower housing 11) and causes supply of electric power to each component of game device 10.
In addition, an I/F circuit 42 is connected to information processing unit 31. A microphone 43 and speaker 44 are connected to I/F circuit 42. Specifically, speaker 44 is connected to I/F circuit 42 through a not-shown amplifier. Microphone 43 senses user's voice and sound and outputs an audio signal to I/F circuit 42. The amplifier amplifies the audio signal from I/F circuit 42 and causes speaker 44 to output the voice and sound. In addition, touch panel 13 described above is also connected to I/F circuit 42. I/F circuit 42 includes an audio control circuit for controlling microphone 43 and speaker 44 (amplifier) and a touch panel control circuit for controlling the touch panel. The audio control circuit carries out A/D conversion and D/A conversion of an audio signal or converts an audio signal to audio data in a prescribed format. The touch panel control circuit generates touch position data in a prescribed format based on a signal from touch panel 13 and outputs the data to information processing unit 31. The touch position data indicates a coordinate at a position where an input has been made onto an input surface of touch panel 13.
Operation button 14 is constituted of operation buttons 14A to 14L above and connected to information processing unit 31. In addition, slide pad 15 is connected to information processing unit 31.
Operation data indicating a status of input onto operation buttons 14A to 14I (whether a button has been pressed or not) or indicating a slide direction is output from operation button 14 or slide pad 15 to information processing unit 31. As information processing unit 31 obtains the operation data from operation button 14 or slide pad 15, it performs processing in accordance with the input onto operation buttons 14A to 14L above or slide pad 15.
It is noted that CPU 311 obtains operation data from operation button 14 or slide pad 15 once every prescribed period of time. Alternatively, inputs from both of operation button 14 and slide pad 15 can also be accepted at a prescribed time.
Lower LCD 12 and upper LCD 22 are connected to information processing unit 31. Lower LCD 12 and upper LCD 22 each display an image in response to an instruction from (GPU 312 of) information processing unit 31. In the present embodiment, information processing unit 31 causes upper LCD 22 to display a stereoscopic image (an image that can stereoscopically be viewed).
Specifically, information processing unit 31 is connected to an LCD controller (not shown) of upper LCD 22 and controls the LCD controller to turn ON/OFF the parallax barrier. When the parallax barrier of upper LCD 22 is turned ON, an image for right eye and an image for left eye stored in VRAM 313 of information processing unit 31 are output to upper LCD 22. More specifically, the LCD controller reads the image for right eye and the image for left eye from VRAM 313 by alternately repeating processing for reading pixel data for one line in a vertical direction with regard to the image for right eye and processing for reading pixel data for one line in the vertical direction with regard to the image for left eye. Thus, the image for right eye and the image for left eye are each divided into strip-like images in which pixels are aligned vertically for each one line, and an image in which the strip-like images of the image for right eye and the strip-like images of the image for left eye are alternately arranged is displayed on the screen of upper LCD 22. Then, as the image is visually recognized by the user through the parallax barrier of upper LCD 22, the user's right and left eyes visually recognize the image for right eye and the image for left eye respectively.
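The following toy sketch illustrates this kind of column-by-column interleaving, with small nested lists standing in for the images for left eye and right eye stored in VRAM 313; the helper name and the even/odd column assignment are assumptions for illustration.

```python
def interleave_columns(left, right):
    """Compose a parallax-barrier frame from two same-sized images.

    left and right are lists of rows; left-eye pixels fill the even
    columns and right-eye pixels the odd ones, mimicking the one-line
    vertical strips read alternately by the LCD controller.
    """
    return [
        [l_row[x] if x % 2 == 0 else r_row[x] for x in range(len(l_row))]
        for l_row, r_row in zip(left, right)
    ]

L = [["L"] * 4, ["L"] * 4]
R = [["R"] * 4, ["R"] * 4]
print(interleave_columns(L, R))   # [['L','R','L','R'], ['L','R','L','R']]
```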
Outer image pick-up portion 23 and inner image pick-up portion 24 are connected to information processing unit 31. Outer image pick-up portion 23 and inner image pick-up portion 24 each pick up an image in response to an instruction from information processing unit 31 and output data of the picked-up image to information processing unit 31.
Three-dimension adjustment switch 25 is connected to information processing unit 31. Three-dimension adjustment switch 25 transmits an electric signal in accordance with a position of a slider 25a to information processing unit 31.
In addition, 3D indicator 26 is connected to information processing unit 31. Information processing unit 31 controls illumination of 3D indicator 26. For example, in a case where upper LCD 22 is in the stereoscopic display mode, information processing unit 31 causes 3D indicator 26 to illuminate. The foregoing is the description of the internal configuration of game device 10.
[Outlines of Information Processing]
Outlines of information processing will now be described. By way of example of information processing, game device 10 is assumed to perform game processing. In the present example, a case where a tennis game is played will be described by way of example of game processing. Game processing in the present embodiment is performed as a prescribed game program is executed in game device 10.
Referring to
Then, a case is shown where the user can use slide pad 15 to operate operable player object OBJ1 and play a tennis game against opposition object OBJ2 on the tennis court in the virtual space based on prescribed rules of the tennis game. Specifically, prescribed game processing proceeds by repeating what are called services and strokes as necessary.
Here, positions of two virtual cameras 100 are shown. One virtual camera 100 is arranged at a position distant by a prescribed distance r0 from a coordinate V representing the position of player object OBJ1, and this virtual camera shoots player object OBJ1 and the like. Here, a case where virtual camera 100 is arranged at a coordinate P relative to the position of player object OBJ1 (a first user viewpoint mode) is shown.
Another virtual camera 100 is provided above coordinate P (in the positive direction of the Z axis of the world coordinate system) and is arranged at a position allowing entire tennis court object OBJ0, including player object OBJ1 and opposition object OBJ2, to be looked down upon. Here, a case where virtual camera 100 is arranged at a coordinate Q (an overhead viewpoint mode) is shown. It is noted that, in the present example, in the case of the overhead viewpoint mode, virtual camera 100 is assumed to be fixed at a fixed point; however, it may instead be moved, without being fixed, to another position from which the entirety can be looked down upon.
Though description will be given later, in the present example, the position of virtual camera 100 is switched in accordance with the position of game device 10. In addition, the number of virtual cameras is also switched. Specifically, acceleration sensor 39 is used to calculate an angle at which the direction of the z axis of game device 10 is inclined with respect to the direction of gravity (an angle of inclination), and in a case where this angle of inclination is smaller than a prescribed angle, for example, the overhead viewpoint mode, in which the coordinate position of virtual camera 100 is set at coordinate Q, is set. Though a case where only a single virtual camera 100 is provided at coordinate Q is shown here, it is assumed that two virtual cameras are provided in order to permit a stereoscopic image (not shown).
On the other hand, when an angle of inclination of the direction of the z axis of game device 10 in the direction of gravity is equal to or greater than a first prescribed angle, for example, the first user viewpoint mode in which a coordinate position of virtual camera 100 is set at coordinate P is set. Though description will be given later, when an angle of inclination of the direction of the z axis of game device 10 in the direction of gravity is equal to or greater than a second prescribed angle, a second user viewpoint mode is set.
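A hedged sketch of this threshold-based mode selection follows; the concrete values chosen for the first and second prescribed angles (S° and T°) are placeholders, since the text does not specify them.

```python
OVERHEAD, FIRST_USER, SECOND_USER = "overhead", "first user", "second user"

def viewpoint_mode(theta, s=30.0, t=60.0):
    """Select a viewpoint mode from angle of inclination theta (degrees).

    s and t stand in for the first and second prescribed angles S and T.
    """
    if theta < s:
        return OVERHEAD       # display surface close to horizontal
    if theta < t:
        return FIRST_USER
    return SECOND_USER        # display surface close to vertical

print(viewpoint_mode(10.0), viewpoint_mode(45.0), viewpoint_mode(80.0))
```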
In the case of the overhead viewpoint mode, since the virtual camera is arranged at a position where entire tennis court object OBJ0 including player object OBJ1 and opposition object OBJ2 can be looked down upon, a game operation can be performed at a viewpoint at which the entire state can be grasped. It is noted that, since two virtual cameras are arranged in the case of the overhead viewpoint mode, a stereoscopic image is displayed on upper LCD 22.
In the case of the user viewpoint mode, since the virtual camera is arranged behind player object OBJ1, a game operation can be performed from a viewpoint of player object OBJ1. In the case of the user viewpoint mode, since a single virtual camera is arranged, a two-dimensional image is displayed on upper LCD 22. By doing so, a screen is readily viewed even when input based on an angle of rotation is made. In another embodiment, however, a stereoscopic image may be displayed also in the user viewpoint mode, depending on the purpose of the game or an operation method.
The virtual camera is arranged such that an angle of depression and an angle of elevation of the virtual camera with player object OBJ1 serving as the reference are different between the first user viewpoint mode and the second user viewpoint mode.
Further, in the case of the user viewpoint mode, virtual camera 100 is arranged at a position in accordance with arrangement of player object OBJ1. In the present example, it is assumed that arrangement is such that an orientation (a direction of line of sight) of player object OBJ1 on an XY plane is the same as a direction of shooting of the virtual camera. It is noted that this direction of line of sight is used as a direction parameter for controlling a reference direction of travel of the ball object in a case where ball object OBJ4 is hit.
A stereoscopic image displayed on upper LCD 22 will now be described.
In the present embodiment, as described above, upper LCD 22 can display a stereoscopic image. It is assumed that a stereoscopic image in the present embodiment is generated by using two virtual cameras arranged in the virtual space and having respective directions of image pick-up set in the case of the overhead viewpoint mode as described above.
The two virtual cameras arranged in the virtual space are used for generating the image for left eye and the image for right eye described above, respectively. One of the two virtual cameras arranged in the virtual space is arranged as a left virtual camera for generating an image for left eye, and the other thereof is arranged as a right virtual camera for generating an image for right eye.
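As a sketch of such an arrangement (under the assumption, not stated in the text, that the pair is obtained by offsetting a reference camera along its rightward direction), the following fragment computes left and right virtual camera positions; the gap value is an illustrative choice.

```python
def stereo_pair(cam, right_dir, half_gap=0.03):
    """Place left/right virtual cameras around a reference position.

    cam is the reference camera coordinate and right_dir a unit vector
    pointing to the camera's right; each eye camera is offset by
    half_gap along it. The gap value is an illustrative choice.
    """
    left = tuple(c - d * half_gap for c, d in zip(cam, right_dir))
    right = tuple(c + d * half_gap for c, d in zip(cam, right_dir))
    return left, right

print(stereo_pair((0.0, 0.0, 5.0), (1.0, 0.0, 0.0)))
```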
Referring to
Referring to
In a case where the hand holding game device 10 holds it in parallel to the horizontal plane, which has the direction of gravity as its perpendicular (that is, game device 10 (mainly lower housing 11) lies in parallel to the horizontal plane), the angle of inclination at which the direction of the z axis of game device 10 is inclined with respect to the direction of gravity is 0°.
Alternatively, in a case where the hand holding game device 10 is lifted to incline game device 10, an angle of inclination at which the direction of the z axis of game device 10 is inclined in the direction of gravity (a positive value) increases.
By way of example, in a case where angle of inclination θ is smaller than a first prescribed angle (S°) (within a range of −180°≦θ<S°), the overhead viewpoint mode is set. Namely, the case where angle of inclination θ is smaller than the first prescribed angle refers to a case where the orientation of lower LCD 12 of lower housing 11 is close to parallel to the horizontal plane. In a case where angle of inclination θ is equal to or greater than the first prescribed angle (S°) (within a range of S°≦θ<T°), the first user viewpoint mode is set. Alternatively, in a case where angle of inclination θ is equal to or greater than a second prescribed angle (T°), the second user viewpoint mode is set. The case where angle of inclination θ is equal to or greater than the second prescribed angle (within a range of T°≦θ<180°) refers to a case where the orientation of lower LCD 12 of lower housing 11 is close to perpendicular to the horizontal plane.
Though a case where switching from the overhead viewpoint mode to the first user viewpoint mode is made with the first prescribed angle (S°) serving as a threshold value has been described in the present example, the threshold value used when switching back from the first user viewpoint mode to the overhead viewpoint mode need not be the same. For example, by setting a prescribed angle (R°) smaller than the first prescribed angle (S°) as that threshold value, continuous switching around the first prescribed angle is prevented and influence on the user's operability can be avoided. Namely, the threshold value at which switching is made can be changed in correspondence with the current viewpoint mode. For example, in a case where the first user viewpoint mode is set, the first user viewpoint mode may be maintained while angle of inclination θ is within a range of R°≦θ<T°, and switching to the overhead viewpoint mode may be made when angle of inclination θ satisfies θ<R°. Though adjustment of the threshold value has been described for the case of the overhead viewpoint mode and the first user viewpoint mode, such adjustment is similarly applicable to the case of the first user viewpoint mode and the second user viewpoint mode.
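A minimal sketch of this hysteresis between the overhead and first user viewpoint modes is shown below; the values chosen for R° and S° are placeholders, since the text leaves them unspecified.

```python
def next_mode(current, theta, r=25.0, s=30.0):
    """Overhead/first-user switching with hysteresis.

    Switching up to the first user viewpoint mode uses threshold s (S
    degrees), while switching back down uses the lower threshold r (R
    degrees), so small wobbles around S do not toggle the mode.
    """
    if current == "overhead":
        return "first user" if theta >= s else "overhead"
    return "overhead" if theta < r else "first user"

mode = "overhead"
for theta in (29.0, 31.0, 28.0, 24.0):   # wobbling near the thresholds
    mode = next_mode(mode, theta)
    print(theta, mode)   # stays "first user" at 28.0, reverts at 24.0
```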
Referring to
Virtual camera 100 is arranged at a coordinate P0 in the case of the first user viewpoint mode. Alternatively, virtual camera 100 is arranged at a coordinate P1 in the case of the second user viewpoint mode.
With coordinate V0 of player object OBJ1 defined as the reference, coordinate P1 is located at a position distant by r0 in the horizontal direction (on the XY plane), and coordinate P0 is located at a position distant by r0 at an angle of elevation of w° from the horizontal direction (the XY plane).
Then, in the case of the first user viewpoint mode, a direction of shooting of virtual camera 100 is in a direction from coordinate P0 of virtual camera 100 toward coordinate V0 of the player object.
Alternatively, in the case of the second user viewpoint mode, a direction of shooting of virtual camera 100 is in a direction from coordinate P1 of virtual camera 100 toward coordinate V0 of the player object.
It is noted that, in the first and second user viewpoint modes, the direction of shooting on the XY plane is set to be the same as an orientation of player object OBJ1 (the direction of line of sight).
In the first user viewpoint mode, shooting with virtual camera 100 is carried out from a position at angle of elevation w° above the horizontal direction, whereas in the second user viewpoint mode, shooting is carried out from a position at angle of elevation 0°. In the latter case, shooting is carried out at a height in the vicinity of the eyes of player object OBJ1, and hence a game operation at the viewpoint of the player is allowed. Thus, the sense of realism increases.
It is noted that the value of the angle of elevation is by way of example and is not limited thereto.
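A minimal sketch of the camera arrangement described above, assuming an XY horizontal plane with the Z axis pointing up; the function name and the concrete values are hypothetical.

    # Hypothetical sketch: arrange virtual camera 100 at distance r0 from
    # coordinate V0 of the player object, opposite to its orientation on
    # the XY plane, raised according to angle of elevation w_deg.
    import math

    def camera_position(v0, facing_deg, r0, w_deg):
        rad_f = math.radians(facing_deg)
        rad_w = math.radians(w_deg)
        horizontal = r0 * math.cos(rad_w)   # offset on the XY plane
        return (v0[0] - horizontal * math.cos(rad_f),
                v0[1] - horizontal * math.sin(rad_f),
                v0[2] + r0 * math.sin(rad_w))

    v0 = (0.0, 0.0, 0.0)
    p0 = camera_position(v0, 90.0, 10.0, 30.0)  # first user viewpoint mode
    p1 = camera_position(v0, 90.0, 10.0, 0.0)   # second user viewpoint mode

The direction of shooting is then simply the vector from p0 (or p1) toward v0, matching the description above.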
(Setting of Direction of Line of Sight of Player Object OBJ1)
A scheme for setting a direction of line of sight of player object OBJ1 will now be described.
Referring to
On the other hand, in a case where player object OBJ1 is located in a court region on the left in tennis court object OBJ0 at the time of service, reference line rf1 is used as valid. It is noted that, at the time of service, an initial position of player object OBJ1 is set in advance. In addition, during stroke, as shown in
Then,
It is noted that a direction of line of sight of player object OBJ1 is used as a direction parameter for controlling a reference direction of travel in a case where player object OBJ1 hits ball object OBJ4. By way of example, when there is no direction indication input through slide pad 15 in a case where player object OBJ1 hits ball object OBJ4, movement of ball object OBJ4 is controlled with a direction of line of sight of player object OBJ1 serving as a direction parameter. Namely, ball object OBJ4 moves in the direction of line of sight of player object OBJ1.
Referring to
Referring to
Referring to
Referring to
Specifically, by way of example, when game device 10 is rotated by angle of rotation Sk to the left, player object OBJ1 is rotated accordingly by angle of rotation Sk to the left. As the orientation of the player object varies, virtual camera 100 also similarly rotates.
Referring to
Referring to
Referring to
Specifically, in a case where an orientation (a direction of line of sight) of player object OBJ1 is varied in the reference field of view, it is assumed that an angle of rotation detected when game device 10 is rotated around the direction of gravity and an angle of rotation by which the orientation (the direction of line of sight) of player object OBJ1 is varied from a centerline have the same value.
On the other hand, in a case where the orientation (the direction of line of sight) of player object OBJ1 is varied in the region outside the reference field of view, a greater amount of operation is required as the distance from the reference field of view becomes greater.
Specifically, in a case of such movement that the orientation (the direction of line of sight) of player object OBJ1 is in a direction away from the reference field of view, prescribed weighting is applied such that a greater amount of rotation of game device 10 around the direction of gravity is required. As a result of this processing, within the reference field of view, operability in game processing is improved because the orientation (the direction of line of sight) of player object OBJ1 varies in coordination with the angle of rotation of game device 10; outside the reference field of view, weighting the amount of operation for rotating game device 10 prevents the direction of line of sight from easily being oriented toward the outside of the reference field of view, which also improves operability in game processing. In a case where the orientation (the direction of line of sight) of player object OBJ1 is in the region outside the reference field of view, movement of player object OBJ1 (change in orientation) may be controlled such that the orientation (the direction of line of sight) of player object OBJ1 is gradually accommodated in the reference field of view.
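The weighting described in this passage might, purely as an assumed sketch, look as follows; the half-width of the reference field of view and the gain are invented values.

    # Hypothetical sketch of the weighted rotation: inside the reference
    # field of view the line of sight follows the device one-to-one, while
    # a movement away from the field of view is damped in proportion to
    # the distance from its edge.
    FIELD_HALF_DEG = 45.0   # illustrative half-width of the field of view
    WEIGHT_GAIN = 0.05      # illustrative damping per degree outside

    def rotate_sight(sight_deg, device_delta_deg):
        outside = abs(sight_deg) - FIELD_HALF_DEG
        if outside > 0 and sight_deg * device_delta_deg > 0:
            # Moving farther away: require a greater amount of operation.
            device_delta_deg /= 1.0 + WEIGHT_GAIN * outside
        return sight_deg + device_delta_deg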
Referring to
It is noted that the movement parameter can include other various parameters, such as an initial hitting angle, in connection with control of movement of the ball object, and the various parameters may be adjusted (corrected) in such a manner that simulation in accordance with the algorithm for controlling movement is carried out first so as to achieve movement to a target position.
On the other hand, when ball object OBJ4 is hit (button 14B is selected) together with a direction indication input or the like through slide pad 15, the movement parameter (the direction parameter) is corrected and control is carried out such that the ball object moves in a direction displaced from its reference direction of travel. In the present example, for example, when right is input as a direction indication input through slide pad 15, the movement parameter (the direction parameter) of ball object OBJ4 is corrected and ball object OBJ4 is controlled to move to the right of the reference direction of travel. It is noted that the direction of travel of the ball object may vary from the reference direction of travel in accordance with variation in the orientation (the direction of line of sight) of the player object.
Here, it is assumed that, with regard to control of movement of ball object OBJ4, in the user viewpoint mode, an input of an angle of rotation around a reference axis with the direction of gravity of game device 10 being defined as the reference axis is also accepted together with a direction indication input through slide pad 15.
Namely, in the user viewpoint mode, even when no direction indication input through slide pad 15 is provided, in a case where game device 10 is rotated to the right with the direction of gravity being defined as the reference axis (the angle of rotation being positive), for example, it is determined that right has been input as the direction indication input, an amount of correction of the movement parameter (the direction parameter) is reflected in accordance with the angle of rotation, and control is carried out such that the ball object moves to the right of its reference direction of travel.
As a result of such processing, a manner of operation of game device 10 is accepted as a prescribed input and reflected on the game processing. Thus, zest of the game increases.
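As an illustrative sketch only, the rotation-based correction described above might be computed as follows; the gain constants are hypothetical.

    # Hypothetical sketch: the angle of rotation around the direction of
    # gravity is treated as a direction indication input and corrects the
    # ball's direction parameter, together with any slide pad input.
    PAD_GAIN = 10.0          # illustrative degrees per unit of pad input
    ROTATION_GAIN = 0.5      # illustrative degrees per degree of rotation

    def ball_direction(sight_deg, pad_x, rotation_deg):
        # pad_x in [-1, 1]; rotation_deg positive for rotation to the right.
        return sight_deg + pad_x * PAD_GAIN + rotation_deg * ROTATION_GAIN

    ball_direction(0.0, 0.0, 20.0)   # rotated right: ball displaced right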
Referring to
Referring to
Referring to
In the present example, a case is shown where player object OBJ1 is arranged at an initial position at the time of service, the orientation (the direction of line of sight) of player object OBJ1 is set based on that position, and the player object faces the direction of opposition object OBJ2 as the initial direction of line of sight.
Referring to
Referring to
Referring to
By way of example, in the game processing, a position of virtual camera 100 is changed in accordance with the operation position of operated game device 10. Specifically, when the angle in the direction of the z axis of game device 10 (angle of inclination θ), with the direction of gravity of game device 10 being defined as a reference direction, is in a first range, a virtual camera position in accordance with the overhead viewpoint mode is set, and when it is in a second range, switching to a virtual camera position in accordance with the user viewpoint mode is made. Namely, switching between the viewpoint modes can be made in accordance with an operation position so as to change game processing. Thus, zest of a game can be increased. In addition, since switching between the viewpoint modes can be made in accordance with the operation position, an operation of a button or the like for switching between the viewpoint modes is not necessary and setting of a desired viewpoint mode can readily be made. Thus, operability is also good.
In addition, in the game device, a manner of operation of operated game device 10 is reflected on the game processing. Specifically, a direction of line of sight of the player object (that is, a direction of shooting of the virtual camera) is moved in accordance with an angle of rotation with the direction of gravity of game device 10 being defined as the reference axis. Since the direction of line of sight (the direction of shooting of the virtual camera) changes in coordination with motion of game device 10 as a result of such processing, operability of the game is improved and zest increases.
Moreover, an angle of rotation with the direction of gravity of game device 10 being defined as the reference axis is accepted as a direction indication input and it is reflected on control of movement of the ball object. As a result of this processing, a manner of operation of game device 10 is accepted as a prescribed input and game processing in coordination with motion of the game device is performed. Thus, zest increases.
[Data Stored in Main Memory]
Before description of a specific operation of CPU 311 of game device 10, data stored in main memory 32 as CPU 311 executes a game program will now be described.
Referring to
As described above, operation data 501 is data obtained by CPU 311 from operation button 14 and slide pad 15 once in a prescribed period of time (for example, once in a processing unit time period above) and then stored.
Acceleration data 502 is data indicating acceleration along the directions of the xyz axes obtained from acceleration sensor 39 every processing unit time above, and it is stored in main memory 32. In the present example, an angle of inclination (a position) by which game device 10 is inclined with respect to the direction of gravity is sensed based on the data indicating acceleration along the directions of the xyz axes.
Angular velocity data 503 is data indicating a velocity of an angle of rotation around each of the xyz axes obtained from angular velocity sensor 40 every processing unit time above, and it is stored in main memory 32. It is assumed that angular velocity sensor 40 according to the present embodiment senses an angular velocity around each of the xyz axes, with rotation clockwise being defined as positive and rotation counterclockwise being defined as negative. In the present example, an angular velocity around the direction of gravity is calculated based on the data indicating a velocity of an angle of rotation around each of the xyz axes, and an angle of rotation (motion), that is, how much rotation has been made around the reference axis, is sensed based on that angular velocity.
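By way of an assumed illustration, sensing from the two sensors might be sketched as follows; the axis and sign conventions and the value of DT are hypothetical, not those of the actual device.

    # Hypothetical sketch: derive the angle of inclination from the
    # gravity vector measured by acceleration sensor 39, and accumulate
    # angular velocity sensor 40's output around the direction of
    # gravity into an angle of rotation.
    import math

    DT = 1.0 / 60.0   # assumed length of one processing unit time

    def inclination_deg(ax, ay, az):
        # Angle of the device z axis relative to the horizontal plane,
        # assuming (ax, ay, az) is the measured gravity vector.
        return math.degrees(math.atan2(az, math.hypot(ax, ay)))

    def accumulate_rotation(angle_deg, omega_dps):
        # omega_dps: angular velocity around the direction of gravity,
        # clockwise positive as in the present embodiment.
        return angle_deg + omega_dps * DT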
Object data 504 is data on tennis court object OBJ0, net object OBJ3, player object OBJ1, opposition object OBJ2, and ball object OBJ4. Specifically, it is data indicating a shape, a position, an orientation, and the like of each object. The object data is assumed to be set to initial values in advance. Then, a value of the data is updated every processing unit time in accordance with object movement control.
Virtual camera data 505 is data indicating the number and arrangement of virtual cameras in accordance with the overhead viewpoint mode or the user viewpoint mode, and it includes data on their positions and directions of shooting. As described above, for example, in the user viewpoint mode, a value of data on a position and a direction of shooting of the virtual camera is updated every processing unit time in accordance with object movement control.
Various programs 601 are a variety of programs executed by CPU 311. For example, the game program described above is stored in main memory 32 as various programs 601. In addition, an algorithm for controlling movement of the ball object, the player object, the opposition object, and the like in accordance with a manner of operation of the game device, a control algorithm relating to the direction of line of sight of the player object and arrangement setting of a virtual camera, and the like are also included in the game programs to be stored in main memory 32.
[Game Processing]
A specific operation of CPU 311 of game device 10 in the present embodiment will now be described. Initially, when power of game device 10 is turned on, a boot program (not shown) is executed by CPU 311, whereby a game program stored in internal memory for data saving 35 is read and stored in main memory 32. Then, as the game program stored in main memory 32 is executed by CPU 311, processing shown in the flowchart below is performed.
Referring to
Then, CPU 311 performs viewpoint setting processing in accordance with a position of game device 10 (step S4). Details of the viewpoint setting processing will be described later.
Then, CPU 311 performs input acceptance processing (step S6). Specifically, processing for accepting input from operation button 14 or slide pad 15 and processing for accepting input from angular velocity sensor 40 are performed. Details of the input acceptance processing will also be described later.
Then, CPU 311 performs player object movement processing (step S8). The player object movement processing is performed based on a prescribed movement control algorithm and on data accepted in the input acceptance processing. Details of the object movement processing will also be described later.
Then, CPU 311 performs ball object movement processing (step S9). The ball object movement processing is performed based on a prescribed movement control algorithm and on data accepted in the input acceptance processing. Details of the object movement processing will also be described later.
Then, CPU 311 performs game determination processing (step S10). For example, game winning and losing determination or the like based on tennis rules is made based on a position of ball object OBJ4 in tennis court object OBJ0.
Then, CPU 311 updates a position of the virtual camera (step S11). Specifically, the position of the virtual camera is updated in accordance with switching between the viewpoint modes or movement of the player object updated in accordance with the object movement processing.
Then, CPU 311 converts the position of the player object or the like into a three-dimensional camera coordinate system with a position of the arranged virtual camera being defined as the reference (step S12).
Then, CPU 311 converts the resultant three-dimensional camera coordinate system into a two-dimensional projection plane coordinate system (step S13). It is noted that designation of a texture, clipping (removal of portions that are not visible), and the like are also carried out here. Thus, two-dimensional image data obtained by shooting the player object or the like with the virtual camera can be obtained.
Then, CPU 311 performs processing for generating image data serving as a game image (step S14).
Then, CPU 311 outputs the image data to upper LCD 22 for display of the game image (step S15). It is noted that, in the case of the user viewpoint mode, a two-dimensional image is displayed. Alternatively, in the case of the overhead viewpoint mode, a stereoscopic image is displayed.
Then, whether the game has ended or not is determined (step S16). When it is determined that the game has not ended (NO in step S16), the process again returns to step S4. It is noted that the processing above is repeated every processing unit time.
On the other hand, when the game has ended (YES in step S16), the process ends (end).
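The overall flow of steps S4 through S16 might be summarized by the following sketch; every function here is an empty stand-in for the processing described above, and all names are hypothetical.

    # Hypothetical outline of the main loop (steps S4-S16).
    def set_viewpoint(state): pass          # step S4
    def accept_inputs(state): pass          # step S6
    def move_player_object(state): pass     # step S8
    def move_ball_object(state): pass       # step S9
    def judge_game(state): pass             # step S10
    def update_camera(state): pass          # step S11
    def render_and_display(state): pass     # steps S12-S15

    def run_game(state):
        while not state["ended"]:           # step S16
            set_viewpoint(state)
            accept_inputs(state)
            move_player_object(state)
            move_ball_object(state)
            judge_game(state)
            update_camera(state)
            render_and_display(state)

    run_game({"ended": True})   # stub state; a real game updates "ended"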
The viewpoint setting processing will now be described.
Referring to
Specifically, CPU 311 obtains acceleration data along directions of xyz axes from acceleration sensor 39.
Then, CPU 311 calculates an angle of inclination by which the direction of the z axis of game device 10 is inclined in the direction of gravity (step S22).
Then, CPU 311 determines whether the angle of inclination is within the first range or not (step S24). For example, as described with reference to
When CPU 311 determines that the angle of inclination is within the first range (YES in step S24), it sets the overhead viewpoint mode (step S26). Then, the process ends (return).
On the other hand, when CPU 311 determines in step S24 that the angle of inclination is not within the first range (NO in step S24), it determines whether the angle of inclination is within the second range or not (step S28). For example, as described with reference to
When CPU 311 determines that the angle of inclination is within the second range (YES in step S28), it sets the first user viewpoint mode (step S30). Then, the process ends (return).
On the other hand, when CPU 311 determines in step S28 that the angle of inclination is not within the second range (NO in step S28), it determines whether the angle of inclination is within a third range or not (step S32). For example, as described with reference to
When CPU 311 determines that the angle of inclination is within the third range (YES in step S32), it sets the second user viewpoint mode (step S34). Then, the process ends (return).
When the angle of inclination is included in none of the ranges (NO in step S32), the angle of inclination is processed as invalid (return).
As a result of such processing, the viewpoint mode of the virtual camera is set in accordance with the operation position of game device 10.
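A compact sketch of steps S20 through S34, reusing the illustrative thresholds introduced earlier; the range boundaries are assumptions, not actual values.

    # Hypothetical sketch of the viewpoint setting processing: classify
    # angle of inclination theta into the first, second, or third range.
    S_DEG, T_DEG = 30.0, 70.0   # illustrative prescribed angles

    def set_viewpoint_mode(theta_deg):
        if -180.0 <= theta_deg < S_DEG:     # first range  (step S26)
            return "overhead"
        if S_DEG <= theta_deg < T_DEG:      # second range (step S30)
            return "first_user"
        if T_DEG <= theta_deg < 180.0:      # third range  (step S34)
            return "second_user"
        return None                         # none: processed as invalid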
The input acceptance processing will now be described.
Referring to
Then, when CPU 311 determines in step S60 that the overhead viewpoint mode has been set (YES in step S60), it performs processing for accepting a slide pad input and a shot input (step S62). Then, the process ends (return). It is assumed in the present example that the shot input refers to selection of button 14B among operation buttons 14.
On the other hand, when CPU 311 determines in step S60 that the overhead viewpoint mode has not been set (NO in step S60), that is, when the user viewpoint mode has been set, it obtains angular velocity data (step S64).
Then, CPU 311 calculates an angle of rotation to the left and right of a main body based on the angular velocity data (step S66). Specifically, an angle of rotation around the reference axis with the direction of gravity being defined as the reference axis is calculated. A direction of rotation can be determined based on a positive or negative sign of the angle of rotation.
Then, CPU 311 performs processing for accepting a slide pad input, a shot input, and an angle of rotation input. Then, the process ends (return).
Namely, in the case of the overhead viewpoint mode, only a slide pad input and a shot input are considered valid, while in the user viewpoint mode, an angle of rotation to the left and right of the game device is also considered valid as an input, in addition to a slide pad input and a shot input.
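Purely as an assumed sketch, that acceptance logic might be written as follows; the dictionary layout and names are hypothetical.

    # Hypothetical sketch of the input acceptance processing: the angle
    # of rotation input is accepted only outside the overhead viewpoint
    # mode.
    def accept_inputs(mode, pad_input, shot_pressed, rotation_deg):
        accepted = {"pad": pad_input, "shot": shot_pressed}   # step S62
        if mode != "overhead":
            # steps S64-S66: the left/right rotation of the main body,
            # with the direction given by the sign of the angle.
            accepted["rotation_deg"] = rotation_deg
        return accepted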
Referring to
Then, when CPU 311 determines in step S70 that the overhead viewpoint mode has been set (YES in step S70), it determines whether an input in accordance with the input acceptance processing has been made or not (step S72).
When CPU 311 determines in step S72 that an input has been made (YES in step S72), it determines whether a shot input has been made or not (step S74). Specifically, whether an input indication through button 14B for hitting ball object OBJ4 has been given or not is determined. Such determination can be made based on operation data 501 in main memory 32.
When it is determined in step S74 that a shot input has been made (YES in step S74), whether or not ball object OBJ4 is present in an area in which player object OBJ1 can hit a shot is determined (step S76). Specifically, it is assumed that an area in which ball object OBJ4 can be hit is set in advance within a prescribed range, based on a position of player object OBJ1. Determination processing as to whether ball object OBJ4 is present in that range or not is performed.
When CPU 311 determines in step S76 that ball object OBJ4 is present in the area in which player object OBJ1 can hit a shot (YES in step S76), it corrects a ball object movement parameter based on a shot input and a slide pad input (step S78). Specifically, the movement parameter (the direction parameter) for controlling the direction of travel of the ball object toward the direction of line of sight of player object OBJ1 in accordance with the shot input is corrected. In addition, when a slide pad input has been made, the movement parameter (the direction parameter) is further corrected in accordance with the direction input indication. It is noted that movement parameters other than the direction parameter, for example such parameters used for controlling movement of the ball object as a ball hitting angle and spin, may also be corrected.
Then, the process ends (return).
On the other hand, when CPU 311 determines that ball object OBJ4 is not present in the area in which player object OBJ1 can hit a shot (NO in step S76), the process ends (return). In this case, since a shot cannot be hit, for example a trajectory of ball object OBJ4 does not change and such animation as player object OBJ1 with a racket making an air shot can be provided.
Namely, when a shot input has been made, a slide pad input is used for controlling movement of ball object OBJ4 and not used for controlling movement of player object OBJ1. It is noted that, at the time of a shot input as well, a slide pad input may be reflected on control of movement of player object OBJ1, with a degree of reflection being lowered.
On the other hand, when CPU 311 determines in step S74 that a shot input has not been made (NO in step S74), it controls movement in accordance with a prescribed algorithm for controlling movement of player object OBJ1 based on a slide pad input (step S79). Specifically, a direction of movement and an amount of movement are determined in accordance with a direction input indication through the slide pad and player object OBJ1 moves. It is noted that setting of an orientation (a direction of line of sight) of the player object is controlled as will be described later.
Then, the process ends (return).
When CPU 311 determines in step S70 that the overhead viewpoint mode has not been set (NO in step S70), that is, when the user viewpoint mode has been set, it determines whether an input in accordance with the input acceptance processing has been made or not (step S80).
When CPU 311 determines in step S80 that an input has been made (YES in step S80), it determines whether a shot input has been made or not (step S82). Specifically, whether an input indication through button 14B for hitting ball object OBJ4 has been made or not is determined. Such determination can be made based on operation data 501 in main memory 32.
When it is determined in step S82 that a shot input has been made (YES in step S82), whether or not ball object OBJ4 is present in an area in which player object OBJ1 can hit a shot is determined (step S84). Specifically, it is assumed that an area in which ball object OBJ4 can be hit is set in advance within a prescribed range, based on a position of player object OBJ1. Determination processing as to whether ball object OBJ4 is present in that range or not is performed.
When CPU 311 determines in step S84 that ball object OBJ4 is present in the area in which player object OBJ1 can hit a shot (YES in step S84), it corrects a ball object movement parameter based on a shot input, a slide pad input, and an angle of rotation input (step S86). Specifically, the movement parameter (the direction parameter) for controlling the direction of travel of the ball object toward the direction of line of sight of player object OBJ1 in accordance with the shot input is corrected. In addition, when a slide pad input has been made, the movement parameter (the direction parameter) is further corrected in accordance with the direction input indication. Moreover, the movement parameter (the direction parameter) is corrected also based on the angle of rotation input. For example, when game device 10 is rotated in the right direction (the angle of rotation being positive), it is determined that right has been input as the direction indication input and the movement parameter (the direction parameter) is corrected in accordance with the angle of rotation. It is noted that movement parameters other than the direction parameter, for example such parameters used for controlling movement of the ball object as a ball hitting angle and spin, may also be corrected.
Though a case where, when a shot input has been made, a slide pad input and an angle of rotation input are reflected on correction of the ball object movement parameter has been described in the present example, such movement control as also correcting the movement parameter (the direction parameter) of player object OBJ1 to vary the orientation (the direction of line of sight) of the player object may be carried out. In that case, the direction of travel of the ball object may simply be varied in conformity with the variation in the orientation (the direction of line of sight) of the player object.
Then, the process ends (return).
On the other hand, when CPU 311 determines that ball object OBJ4 is not present in the area in which player object OBJ1 can hit a shot (NO in step S84), the process ends (return). In this case, since a shot cannot be hit, for example a trajectory of ball object OBJ4 does not change and such animation as player object OBJ1 with a racket making an air shot can be provided.
Namely, when a shot input has been made, a slide pad input is used for controlling movement of ball object OBJ4 and not used for controlling movement of player object OBJ1. It is noted that, at the time of a shot input as well, a slide pad input may be reflected on control of movement of player object OBJ1, with a degree of reflection being lowered.
On the other hand, when CPU 311 determines in step S82 that a shot input has not been made (NO in step S82), it controls movement in accordance with a prescribed algorithm for controlling movement of the player object based on a slide pad input and an angle of rotation input (step S88). Specifically, a direction of movement and an amount of movement are determined in accordance with a direction input indication through the slide pad and the player object moves. Moreover, the orientation (the direction of line of sight) of player object OBJ1 is further controlled based on the angle of rotation input. It is noted that, in the present embodiment, as described above, the angle of rotation input is reflected on the operation in the case where the overhead viewpoint mode has not been set in step S70. In other embodiments, however, depending on the contents of the game, an operation method in which, to the contrary, the angle of rotation input is reflected on the operation only when the overhead viewpoint mode has been set may be employed.
It is noted that setting of an orientation (a direction of line of sight) of the player object is controlled as will be described later.
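The branching of the object movement processing described above might be sketched as follows; all names here are illustrative stand-ins for the actual steps.

    # Hypothetical sketch of steps S70-S88: a shot input corrects the
    # ball movement parameter (if the ball is in the hittable area),
    # and otherwise the inputs move the player object itself.
    def process_objects(mode, inputs, ball_in_hittable_area):
        if inputs.get("shot"):
            if not ball_in_hittable_area:
                return "air shot"              # trajectory unchanged
            sources = ["shot", "pad"]          # steps S78 / S86
            if mode != "overhead":
                sources.append("rotation")     # user viewpoint mode only
            return ("correct ball parameter", sources)
        sources = ["pad"]                      # steps S79 / S88
        if mode != "overhead":
            sources.append("rotation")         # also turns the object
        return ("move player object", sources)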
Referring to
Then, CPU 311 sets the reference field of view based on the object position, so as to accommodate the reference line (step S42).
Then, CPU 311 sets the center of the reference field of view as the direction of line of sight (step S44).
Specifically, as described with reference to
Then, the process ends (return).
With the direction of line of sight being defined as the reference, processing for moving the ball object and arrangement of the virtual camera are determined.
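As a minimal assumed sketch, the setting of steps S40 through S44 might look as follows; the court-side test and the angles are invented for illustration only.

    # Hypothetical sketch: choose the valid reference line for the
    # player object's court region, set the reference field of view so
    # as to accommodate it, and take its center as the direction of
    # line of sight.
    def direction_of_line_of_sight(player_x, half_width_deg=45.0):
        reference_line_deg = 10.0 if player_x < 0 else -10.0   # step S40
        field = (reference_line_deg - half_width_deg,          # step S42
                 reference_line_deg + half_width_deg)
        return (field[0] + field[1]) / 2.0                     # step S44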
Referring to
Then, when CPU 311 determines in step S90 that the movement parameter has been corrected (YES in step S90), it controls movement of the ball object in accordance with a prescribed movement control algorithm based on the corrected movement parameter (step S92). For example, when ball object OBJ4 is hit by the racket of player object OBJ1 in response to a shot input, ball object OBJ4 is controlled to move in the direction of line of sight of player object OBJ1. On the other hand, in a case where a slide pad input or an angle of rotation input to the right has been made, the ball object is controlled to move to the right of the direction of line of sight.
On the other hand, when CPU 311 determines in step S90 that the movement parameter has not been corrected (NO in step S90), movement of the ball object is controlled based on the previous movement parameter (step S94). When correction is not to be made, such movement control as maintaining the direction of travel of the ball object is carried out in accordance with the previously set movement parameter.
Then, the process ends (return).
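A short assumed sketch of steps S90 through S94; the parameter layout is hypothetical.

    # Hypothetical sketch: move the ball with the corrected movement
    # parameter if one exists, otherwise with the previous parameter.
    import math

    def move_ball(ball, corrected_params):
        params = corrected_params or ball["params"]   # steps S92 / S94
        ball["params"] = params
        rad = math.radians(params["direction_deg"])
        ball["x"] += params["speed"] * math.cos(rad)
        ball["y"] += params["speed"] * math.sin(rad)
        return ball

    ball = {"x": 0.0, "y": 0.0,
            "params": {"direction_deg": 90.0, "speed": 0.4}}
    move_ball(ball, None)                                    # step S94
    move_ball(ball, {"direction_deg": 100.0, "speed": 0.4})  # step S92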
(Variation)
A case where an angle of rotation input is reflected on ball object movement control when both a shot input and an angle of rotation input have been made has been described above as the ball object movement processing; however, the timing of the inputs does not have to be the same. For example, a period during which an angle of rotation input is accepted may be provided for a prescribed period after a shot input is made, so that the angle of rotation input is reflected on ball object movement control when it is made during that period.
Referring to
When CPU 311 determines in step S84 that ball object OBJ4 is present in the area in which player object OBJ1 can hit a shot (YES in step S84), it corrects a ball object movement parameter based on a shot input and a slide pad input (step S86#). Specifically, the movement parameter (the direction parameter) for controlling the direction of travel of the ball object toward the direction of line of sight of player object OBJ1 is corrected in accordance with the shot input. In addition, when a slide pad input has been made, the movement parameter (the direction parameter) is further corrected in accordance with the direction input indication.
Then, the process ends (return).
In addition, when CPU 311 determines in step S82 that a shot input has not been made (NO in step S82), it then determines whether a shot input was made previously and, if so, whether an input other than a shot input has been made within a prescribed period since that previous shot input (step S100). For example, determination as to whether an input has been made within the prescribed period may be made by counting each processing unit time after the shot input as one count and determining, when an input other than the shot input is made, whether the number of counts is within a prescribed number.
When CPU 311 determines that a shot input was made previously and that an input other than a shot input has been made within the prescribed period since that previous shot input (YES in step S100), it determines whether the input other than a shot input is an angle of rotation input or not (step S102).
When CPU 311 determines that the input other than a shot input is the angle of rotation input (YES in step S102), it corrects a ball object movement parameter based on the angle of rotation input (step S104). Specifically, the movement parameter (the direction parameter) is corrected based on the angle of rotation input. For example, when game device 10 is rotated in the right direction (the angle of rotation being positive), it is determined that right has been input as the direction indication input and the movement parameter (the direction parameter) is corrected in accordance with the angle of rotation. It is noted that such other parameters as used for controlling movement of the ball object may also be corrected, without being limited to the direction parameter.
Though a case where, when a shot input has been made, an angle of rotation input is reflected on correction of a ball object movement parameter has been described in the present example, such movement control as correcting also the movement parameter (the direction parameter) of player object OBJ1 to vary an orientation (a direction of line of sight) of the player object may be carried out. In that case, the direction of travel of the ball object should only be varied in conformity with variation in orientation (direction of line of sight) of the player object.
Then, the process ends (return).
On the other hand, when CPU 311 determines that a shot input was not made previously, or that an input other than a shot input has not been made within the prescribed period since the previous shot input (NO in step S100), it controls movement in accordance with a prescribed algorithm for controlling movement of the player object based on a slide pad input and an angle of rotation input (step S88). Similarly, when it is determined in step S102 that the input other than a shot input is not an angle of rotation input (NO in step S102), movement is controlled in accordance with the prescribed algorithm for controlling movement of the player object based on a slide pad input and an angle of rotation input (step S88). Specifically, a direction of movement and an amount of movement are determined in accordance with a direction input indication through the slide pad and the player object moves. Moreover, the orientation (the direction of line of sight) of player object OBJ1 is further controlled based on the angle of rotation input.
As a result of this processing, for example, when the angle of rotation input has been made as an input other than a shot input during the prescribed period after the shot input was made, the movement parameter is corrected in accordance with the angle of rotation input and the angle of rotation input can be reflected on ball object movement control.
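Under the stated assumptions, the acceptance period of this variation might be sketched as follows; WINDOW_COUNTS and the gain are invented values.

    # Hypothetical sketch of steps S100-S104: a late angle of rotation
    # input still corrects the ball parameter if it arrives within a
    # prescribed number of counts after the shot input.
    WINDOW_COUNTS = 15       # illustrative length of the acceptance period
    ROTATION_GAIN = 0.5      # illustrative correction per degree

    def late_rotation(counts_since_shot, rotation_deg, params):
        if counts_since_shot is None or counts_since_shot > WINDOW_COUNTS:
            return params, False               # NO in step S100
        if rotation_deg == 0.0:
            return params, False               # NO in step S102
        corrected = dict(params)               # step S104
        corrected["direction_deg"] += rotation_deg * ROTATION_GAIN
        return corrected, True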
In addition, though a case of application to portable game device 10 has been described above, application is not limited to the portable game device. For example, application to a portable telephone, a personal handyphone system (PHS), and such a portable information terminal as a PDA is also possible. Moreover, application to a stationary game machine and a personal computer is also possible. A specific example will be described below.
Referring to
Referring to
<Other Forms>
Though game device 10 has been exemplified as a representative example of an information processing apparatus in the embodiment described above, the information processing apparatus is not limited thereto. Namely, an application executable by a personal computer may be provided as a program. Here, the program may be incorporated as a partial function of various applications executed on the personal computer.
In addition, though a case where a tennis game is played has been described as game processing in the present example, the game processing is not limited to the tennis game and is applicable also to other games. For example, game processing in which a viewpoint mode is switched in accordance with an operation position of the game device may be performed also in a golf game. For example, switching between a top view of a course and a shot view may be made in accordance with a position of the game device. In the shot view, a direction of ball hitting may be controlled in accordance with a position of the game device, and in the top view, a direction of a shot may be operated in response to an operation of the slide pad or the touch panel. Here, in the top view, an icon or the like for operation may be displayed, while in the shot view, the number of function indicators such as icons may be decreased in order to improve the sense of realism, so that functional display is varied in accordance with an operation method. Further, in the golf game as well, after a ball is hit, a manner of operation of the game device, such as motion of rotation in the right direction, may be reflected on the game processing so that a parameter used for ball movement control is changed.
While certain example systems, methods, devices, and apparatuses have been described herein, it is to be understood that the appended claims are not to be limited to the systems, methods, devices, and apparatuses disclosed, but on the contrary, are intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.