The disclosure of Japanese Patent Application No. 2010-294536, filed on Dec. 29, 2010, is incorporated herein by reference.
1. Field of the Invention
The present invention relates to a computer-readable storage medium, a display control apparatus, a display control method, and a display control system. The present invention particularly relates to a computer-readable storage medium, a display control apparatus, a display control method, and a display control system for performing stereoscopically visible display.
2. Description of the Background Art
There is a known conventional method for performing stereoscopic display by using two images having parallax therebetween. For example, Japanese Laid-Open Patent Publication No. 2003-107603 discloses a technique for performing stereoscopic display, in which technique two virtual cameras are set within a three-dimensional virtual space, and images of subjects, such as virtual objects, are rendered based on the respective virtual cameras.
In the technique disclosed in Japanese Laid-Open Patent Publication No. 2003-107603, the stereoscopic effect can be enhanced by increasing a distance between the two virtual cameras. The stereoscopic display can be switched to planar display by setting the distance between the two virtual cameras to 0. The rendering is performed by using the same virtual cameras regardless of whether the display is stereoscopic display or planar display.
A main object of the present invention is to provide a computer-readable storage medium, a display control apparatus, a display control method, and a display control system, which are novel.
The present invention may have the following exemplary features to attain the object mentioned above.
In one configuration example of a computer-readable storage medium of the present invention, the computer-readable storage medium has stored therein a display control program for causing a computer of a display control apparatus to act as: adjustment means for adjusting a degree of stereopsis; virtual camera control means for setting an angle of view of a virtual camera in accordance with the degree of stereopsis adjusted by the adjustment means; and rendering means for rendering a virtual object placed in a virtual space, based on the virtual camera of which the angle of view has been set by the virtual camera control means.
It should be noted that the term “degree of stereopsis” refers to the degree of effect by which a user feels, when viewing a screen, that the virtual space extends in a direction perpendicular to the screen.
According to the above configuration example, when the degree of stereopsis is changed, the angle of view of the virtual camera is changed and the virtual object placed in the virtual space is rendered with the changed angle of view. In this manner, a change in stereoscopic effect can be accentuated by a change in the angle of view of the virtual camera.
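As a non-limiting sketch of the configuration described above, the adjustment of the angle of view in accordance with the degree of stereopsis could be expressed as a simple linear mapping. The function name and the minimum and maximum angles below are illustrative assumptions, not values taken from the embodiment:

```python
def angle_of_view(stereopsis, min_fov=10.0, max_fov=60.0):
    # Clamp the degree of stereopsis to [0.0, 1.0], then map it linearly
    # to a vertical angle of view in degrees: a greater degree of
    # stereopsis yields a wider angle of view, accentuating the
    # perspective (depth) effect of the rendered image.
    s = max(0.0, min(1.0, stereopsis))
    return min_fov + (max_fov - min_fov) * s
```

With such a mapping, adjusting the degree of stereopsis from its minimum to its maximum smoothly widens the angle of view, consistent with the behavior described for the virtual camera control means.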
As another configuration example, the virtual camera control means may set the angle of view of the virtual camera to a first angle of view if the degree of stereopsis is adjusted to a first value by the adjustment means, and set the angle of view of the virtual camera to a second angle of view if the degree of stereopsis is adjusted to a second value by the adjustment means. Moreover, the rendering means may render a stereoscopically visible image if the angle of view of the virtual camera is set to the first angle of view by the virtual camera control means, and render a planarly visible image if the angle of view of the virtual camera is set to the second angle of view by the virtual camera control means.
As another further configuration example, the first value may be greater than the second value, and the first angle of view may be greater than the second angle of view.
As another further configuration example, the virtual camera control means may increase the angle of view of the virtual camera in accordance with an increase in the degree of stereopsis adjusted by the adjustment means.
As another further configuration example, the virtual camera control means may change a distance between a pair of virtual cameras in accordance with the degree of stereopsis adjusted by the adjustment means, and the rendering means may render a stereoscopically visible image by rendering, based on the pair of virtual cameras, an image for left eye and an image for right eye, in each of which the virtual object is captured.
As another further configuration example, the virtual camera control means may increase the distance between the pair of virtual cameras in accordance with an increase in the degree of stereopsis adjusted by the adjustment means.
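The positioning of the pair of virtual cameras described above can be sketched as follows. The function name and the symmetric placement of the two cameras about a central camera position are illustrative assumptions:

```python
def eye_camera_positions(center, right_dir, separation):
    # Place the left-eye and right-eye virtual cameras symmetrically
    # about a central camera position, each offset along the camera's
    # right-direction unit vector by half the inter-camera distance.
    half = separation / 2.0
    left_cam = tuple(c - half * r for c, r in zip(center, right_dir))
    right_cam = tuple(c + half * r for c, r in zip(center, right_dir))
    return left_cam, right_cam
```

Setting the separation to zero collapses both cameras onto the central position, so the image for left eye and the image for right eye coincide and the display becomes planar.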
As another further configuration example, the pair of virtual cameras may each be a perspective-projection virtual camera. If the degree of stereopsis is adjusted to a minimum value by the adjustment means, the virtual camera control means may set the angle of view of each of the pair of perspective-projection virtual cameras to a minimum value such that the angle of view is set close to an angle of view for orthogonal projection, and set the distance between the pair of perspective-projection virtual cameras to zero or a value close to zero.
As another further configuration example, the virtual camera control means may change a distance from the virtual object to the virtual camera in accordance with the degree of stereopsis adjusted by the adjustment means.
As another further configuration example, the virtual camera control means may move the virtual camera, such that the greater the degree of stereopsis adjusted by the adjustment means, the closer the virtual camera is to the virtual object.
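One way to realize "the greater the degree of stereopsis, the closer the virtual camera" while keeping the subject's framed size constant is the dolly-zoom relation sketched below. This exact rule is an assumption for illustration; the embodiment does not prescribe it:

```python
import math

def compensated_distance(fov_deg, framed_half_height):
    # Camera distance at which a subject of the given half-height spans
    # the full vertical field of view. Widening the angle of view while
    # moving the camera closer by this rule keeps the subject's apparent
    # size constant, so only the sense of perspective depth changes
    # (a "dolly zoom").
    half_fov = math.radians(fov_deg) / 2.0
    return framed_half_height / math.tan(half_fov)
```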
In one configuration example of a display control apparatus of the present invention, the display control apparatus includes: adjustment means for adjusting a degree of stereopsis; virtual camera control means for setting an angle of view of a virtual camera in accordance with the degree of stereopsis adjusted by the adjustment means; and rendering means for rendering a virtual object placed in a virtual space, based on the virtual camera of which the angle of view has been set by the virtual camera control means.
In one configuration example of a display control method of the present invention, the display control method includes: an adjustment step of adjusting a degree of stereopsis; a virtual camera control step of setting an angle of view of a virtual camera in accordance with the degree of stereopsis adjusted in the adjustment step; and a rendering step of rendering a virtual object placed in a virtual space, based on the virtual camera of which the angle of view has been set in the virtual camera control step.
In one configuration example of a display control system of the present invention, the display control system includes: adjustment means for adjusting a degree of stereopsis; virtual camera control means for setting an angle of view of a virtual camera in accordance with the degree of stereopsis adjusted by the adjustment means; and rendering means for rendering a virtual object placed in a virtual space, based on the virtual camera of which the angle of view has been set by the virtual camera control means.
In another configuration example of the computer-readable storage medium of the present invention, the computer-readable storage medium has stored therein a display control program for causing a computer of a display control apparatus to act as: adjustment means for adjusting a degree of stereopsis; virtual camera control means for setting a type of virtual camera in accordance with the degree of stereopsis adjusted by the adjustment means; and rendering means for rendering a virtual object placed in a virtual space, based on the virtual camera of which the type has been set by the virtual camera control means.
As another configuration example, the virtual camera control means may set a first virtual camera if the degree of stereopsis is adjusted to a first value by the adjustment means, and set a second virtual camera which is a different type of virtual camera from the first virtual camera if the degree of stereopsis is adjusted to a second value by the adjustment means. Moreover, the rendering means may render, based on the first virtual camera, a stereoscopically visible image in which the virtual object is captured, and render, based on the second virtual camera, a planarly visible image in which the virtual object is captured.
It should be noted that the term “stereoscopically visible image” refers to an image that is generated to show a virtual object in a stereoscopic manner (i.e., in a manner that allows a user to feel as if the virtual object were placed at the front or at the back of the screen), such that the user can view the virtual object with naked eyes or through glasses or a head-mounted display. Typically, the term “stereoscopically visible image” refers to multiple images that are obtained by rendering a virtual object with respective different viewpoints, or refers to an image that results from combining such multiple images. The term “planarly visible image” refers to an image that is generated to show a virtual object in a planar manner (i.e., in a manner that does not make the user feel as if the virtual object were placed at the front or at the back of the screen).
According to the above configuration example, rendering of a planarly visible image and rendering of a stereoscopically visible image are performed by using different types of virtual cameras, respectively. For example, a difference in appearance between a planarly visible image and a stereoscopically visible image can be accentuated by changing the type of virtual camera to be used.
As another further configuration example, the virtual camera control means may set a perspective-projection virtual camera if the degree of stereopsis is adjusted to the first value by the adjustment means, and set one of an orthogonal-projection virtual camera and a pseudo orthogonal-projection virtual camera if the degree of stereopsis is adjusted to the second value by the adjustment means. Moreover, the rendering means may render, based on the perspective-projection virtual camera, a stereoscopically visible image in which the virtual object is captured, and render, based on one of the orthogonal-projection virtual camera and the pseudo orthogonal-projection virtual camera, a planarly visible image in which the virtual object is captured.
According to the above configuration example, in the case of rendering a stereoscopically visible image, a perspective-projection virtual camera is used, whereas in the case of rendering a planarly visible image, an orthogonal-projection virtual camera or a pseudo orthogonal-projection virtual camera (i.e., a perspective-projection virtual camera of which the angle of view is similar to one for orthogonal projection) is used. Accordingly, the stereoscopic effect that is obtained in a case where a stereoscopically visible image is displayed is much greater than that obtained in a case where a planarly visible image is displayed. Assume, for example, a case where existing application software (such as a game program) capable of planar display for displaying an image that is generated based on an orthogonal-projection virtual camera is upgraded into application software capable of both planar display and stereoscopic display. In the case of such upgraded application software, the planar display may maintain the way the images look in the existing application software, and the stereoscopic display may allow a user to feel the sense of depth realized by perspective projection in addition to the sense of depth realized by parallax. In this manner, the stereoscopic effect can be accentuated.
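The notion of a pseudo orthogonal-projection virtual camera can be illustrated numerically: if the angle of view is narrowed while the camera is pulled back so that the subject plane keeps a constant framed height, the projection converges to orthogonal projection. The helper functions below are an illustrative sketch; the square aspect ratio and the unit framed half-height are assumptions:

```python
import math

def projected_x(x, z, fov_deg):
    # Normalized screen x of a point at lateral offset x and distance z
    # in front of a perspective-projection camera (square aspect assumed).
    f = 1.0 / math.tan(math.radians(fov_deg) / 2.0)
    return f * x / z

def pseudo_ortho_x(x, depth_offset, fov_deg):
    # Screen x when the camera is pulled back so that the subject plane
    # keeps a constant framed height as the angle of view narrows. As
    # fov_deg approaches zero this converges to orthogonal projection:
    # the depth offset no longer shifts the projected position.
    subject_distance = 1.0 / math.tan(math.radians(fov_deg) / 2.0)
    return projected_x(x, subject_distance + depth_offset, fov_deg)
```

At a 1-degree angle of view the depth offset barely affects the projected position (near-orthogonal behavior), whereas at a 60-degree angle of view the same point is visibly foreshortened.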
As another further configuration example, the first value may be greater than the second value.
As another further configuration example, the virtual camera control means may set an angle of view of the perspective-projection virtual camera in accordance with the degree of stereopsis adjusted by the adjustment means, and the rendering means may render, based on the perspective-projection virtual camera of which the angle of view has been set by the virtual camera control means, a stereoscopically visible image in which the virtual object is captured.
As another further configuration example, the virtual camera control means may set an angle of view of the perspective-projection virtual camera to a minimum value if the degree of stereopsis is adjusted to a minimum value by the adjustment means, such that the angle of view of the perspective-projection virtual camera is set close to an angle of view for orthogonal projection, and increase the angle of view of the perspective-projection virtual camera in accordance with an increase in the degree of stereopsis adjusted by the adjustment means.
According to the above configuration example, if, for example, a user adjusts the degree of stereopsis by gradually increasing it to a desired degree, the angle of view of the virtual camera changes smoothly. Therefore, the appearance of a displayed image does not suddenly change, which allows the user to perform the adjustment comfortably.
As another further configuration example, the virtual camera control means may set a distance between a pair of perspective-projection virtual cameras in accordance with the degree of stereopsis adjusted by the adjustment means, and the rendering means may render a stereoscopically visible image by rendering, based on the pair of perspective-projection virtual cameras, an image for left eye and an image for right eye, in each of which the virtual object is captured.
As another further configuration example, if the degree of stereopsis is adjusted to the first value by the adjustment means, the rendering means may render a stereoscopically visible image by rendering, based on a pair of perspective-projection virtual cameras, an image for left eye and an image for right eye, in each of which the virtual object is captured, and if the degree of stereopsis is adjusted to the second value by the adjustment means, the rendering means may render, based on a single orthogonal-projection virtual camera, a planarly visible image in which the virtual object is captured.
As another further configuration example, the virtual camera control means may increase a distance between the pair of perspective-projection virtual cameras in accordance with an increase in the degree of stereopsis adjusted by the adjustment means.
As another further configuration example, the virtual camera control means may change a distance from the virtual object to the virtual camera in accordance with the degree of stereopsis adjusted by the adjustment means.
As another further configuration example, the virtual camera control means may move the virtual camera, such that the greater the degree of stereopsis adjusted by the adjustment means, the closer the virtual camera is to the virtual object.
In one configuration example of a display control apparatus of the present invention, the display control apparatus includes: adjustment means for adjusting a degree of stereopsis; virtual camera control means for setting a type of virtual camera in accordance with the degree of stereopsis adjusted by the adjustment means; and rendering means for rendering a virtual object placed in a virtual space, based on the virtual camera of which the type has been set by the virtual camera control means.
In one configuration example of a display control method of the present invention, the display control method includes: an adjustment step of adjusting a degree of stereopsis; a virtual camera control step of setting a type of virtual camera in accordance with the degree of stereopsis adjusted in the adjustment step; and a rendering step of rendering a virtual object placed in a virtual space, based on the virtual camera of which the type has been set in the virtual camera control step.
In one configuration example of a display control system of the present invention, the display control system includes: adjustment means for adjusting a degree of stereopsis; virtual camera control means for setting a type of virtual camera in accordance with the degree of stereopsis adjusted by the adjustment means; and rendering means for rendering a virtual object placed in a virtual space, based on the virtual camera of which the type has been set by the virtual camera control means.
The present invention provides a computer-readable storage medium, a display control apparatus, a display control method, and a display control system, which are novel.
These and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
Hereinafter, a game apparatus according to one embodiment of the present invention will be described.
Initially, an external structure of the game apparatus 10 will be described with reference to
Initially, a structure of the lower housing 11 will be described. As shown in
As shown in
As shown in FIG. 1, the game apparatus 10 includes the touch panel 13 as an input device. The touch panel 13 is mounted on the screen of the lower LCD 12. In the present embodiment, a resistive film type touch panel is used as the touch panel 13. However, the touch panel 13 is not limited to a resistive film type touch panel, but may be any type of touch panel. For example, a touch panel of electrostatic capacitance type may be used as the touch panel 13. In the present embodiment, the touch panel 13 has the same resolution (detection accuracy) as that of the lower LCD 12. However, the resolution of the touch panel 13 and the resolution of the lower LCD 12 need not be the same. The insertion opening 17 (indicated by dashed lines in
The operation buttons 14A to 14L are each an input device for performing a predetermined input. As shown in
The analog stick 15 is a device for indicating a direction. The analog stick 15 has a keytop which is configured to slide parallel to the inner side surface of the lower housing 11. The analog stick 15 acts in accordance with a program executed by the game apparatus 10. For example, when a game in which a predetermined object appears in a three-dimensional virtual space is executed by the game apparatus 10, the analog stick 15 acts as an input device for moving the predetermined object in the three-dimensional virtual space. In this case, the predetermined object moves in a direction in which the keytop of the analog stick 15 is slid. It should be noted that any component that enables an analog input by being tilted by a predetermined amount in any direction among up, down, left, right, and diagonal directions may be used as the analog stick 15.
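As an illustrative sketch of the analog stick behavior described above (the function name, the normalization of the slide amounts to [-1.0, 1.0] per axis, and movement in the horizontal plane of the virtual space are all assumptions):

```python
def move_object(position, stick_x, stick_y, speed):
    # Move an object in the horizontal (x-z) plane of the virtual space
    # in the direction in which the analog stick's keytop is slid; the
    # slide amounts stick_x and stick_y are assumed normalized to
    # [-1.0, 1.0] per axis.
    return (position[0] + stick_x * speed,
            position[1],
            position[2] + stick_y * speed)
```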
Further, the microphone hole 18 is provided in the inner side surface of the lower housing 11. Under the microphone hole 18, a microphone 42 (see
As shown in
As shown in
As shown in
A rechargeable battery (not shown) which is the power source for the game apparatus 10 is accommodated in the lower housing 11, and the battery can be charged through a terminal provided at a side surface (for example, the upper side surface) of the lower housing 11.
Next, a structure of the upper housing 21 will be described. As shown in
As shown in
The upper LCD 22 is a display device capable of displaying a stereoscopically visible image. In the present embodiment, an image for left eye and an image for right eye are displayed by using substantially the same display area. Specifically, the upper LCD 22 may be a display device using a method in which the image for left eye and the image for right eye are alternately displayed in the horizontal direction in predetermined units (for example, every other line). Alternatively, a display device using a method in which the image for left eye and the image for right eye are alternately displayed for a predetermined time period may be used. Further, in the present embodiment, the upper LCD 22 is a display device capable of displaying an image which is stereoscopically visible with naked eyes. A lenticular lens type display device or a parallax barrier type display device is used for enabling the image for left eye and the image for right eye, which are alternately displayed in the horizontal direction, to be separately viewed by the left eye and the right eye, respectively. In the present embodiment, the upper LCD 22 of a parallax barrier type is used. The upper LCD 22 displays, by using the image for right eye and the image for left eye, an image (a stereoscopic image) which is stereoscopically visible with naked eyes. That is, the upper LCD 22 allows a user to view, by means of a parallax barrier, the image for left eye with the user's left eye and the image for right eye with the user's right eye. In this manner, a stereoscopic image (a stereoscopically visible image) exerting a stereoscopic effect for the user can be displayed. Further, the upper LCD 22 may disable the parallax barrier. 
When the parallax barrier is disabled, an image can be displayed in a planar manner (i.e., it is possible to display an image not in the above-described stereoscopically visible manner but in a planarly visible manner; specifically, a display mode is used in which the same displayed image is viewed by both left and right eyes). Thus, the upper LCD 22 is a display device capable of switching between a stereoscopic display mode for displaying a stereoscopically visible image and a planar display mode for displaying an image in a planar manner (i.e., for displaying a planarly visible image). The switching of the display mode is performed by the 3D adjustment switch 25, which will be described below.
Two imaging sections (23a and 23b) provided at the outer side surface (the back surface, which is the reverse of the main surface at which the upper LCD 22 is provided) 21D of the upper housing 21 are collectively referred to as the outer imaging section 23. The directions in which the outer imaging section (left) 23a and the outer imaging section (right) 23b capture images, respectively, both extend outward from, and normal to, the outer side surface 21D. The outer imaging section (left) 23a and the outer imaging section (right) 23b can be used as a stereo camera in accordance with a program executed by the game apparatus 10. Each of the outer imaging section (left) 23a and the outer imaging section (right) 23b includes an imaging device (such as a CCD image sensor or a CMOS image sensor) having the same predetermined resolution, and a lens. The lens may have a zooming mechanism.
The inner imaging section 24 is provided at the inner side surface (the main surface) 21B of the upper housing 21, and acts as an imaging section which captures an image in a direction that extends inward from and normal to the inner side surface. The inner imaging section 24 includes an imaging device (such as a CCD image sensor or a CMOS image sensor) having a predetermined resolution, and a lens. The lens may have a zooming mechanism.
The 3D adjustment switch 25 is a slide switch, and is used for switching the display mode of the upper LCD 22 as described above. Further, the 3D adjustment switch 25 is used for adjusting the stereoscopic effect of a stereoscopically visible image (stereoscopic image) that is displayed on the upper LCD 22 (i.e., used for adjusting the degree of stereopsis). The 3D adjustment switch 25 has a slider 25a which is slidable to any position in a predetermined direction (in the longitudinal direction along the right side surface). The display mode of the upper LCD 22 is set in accordance with the position of the slider 25a. Also, the appearance of a stereoscopic image in the display (i.e., the degree of stereopsis) is adjusted in accordance with the position of the slider 25a.
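The combined role of the 3D adjustment switch 25, selecting the display mode and adjusting the degree of stereopsis from a single slider position, could be sketched as follows. The threshold value, the normalization of the slider position to [0.0, 1.0], and the linear mapping are illustrative assumptions:

```python
def mode_from_slider(position, planar_threshold=0.1):
    # Interpret a normalized slider position in [0.0, 1.0]. Below the
    # threshold the planar display mode is selected with a degree of
    # stereopsis of zero; otherwise the stereoscopic display mode is
    # selected, with a degree of stereopsis that grows linearly with
    # the slider position.
    if position < planar_threshold:
        return "planar", 0.0
    return "stereoscopic", (position - planar_threshold) / (1.0 - planar_threshold)
```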
As shown in
The 3D indicator 26 indicates whether the upper LCD 22 is in the stereoscopic display mode. The 3D indicator 26 is an LED, and is lit up when the stereoscopic display mode of the upper LCD 22 is enabled. The 3D indicator 26 may be lit up only in a case where program processing for displaying a stereoscopically visible image is performed when the upper LCD 22 is in the stereoscopic display mode.
Further, speaker holes 21E are formed in the inner side surface of the upper housing 21. A sound from a below-described speaker 43 is outputted through the speaker holes 21E.
Next, an internal electrical configuration of the game apparatus 10 will be described with reference to
The information processing section 31 is information processing means which includes a CPU (Central Processing Unit) 311 for executing a predetermined program, a GPU (Graphics Processing Unit) 312 for performing image processing, and the like. The CPU 311 of the information processing section 31 executes a program stored in a memory (for example, the external memory 44 connected to the external memory I/F 33 or the internal data storage memory 35) inside the game apparatus 10, thereby performing processing in accordance with the program. The program executed by the CPU 311 of the information processing section 31 may be obtained from another device through communication with the other device. The information processing section 31 further includes a VRAM (Video RAM) 313. The GPU 312 of the information processing section 31 generates an image in accordance with an instruction from the CPU 311 of the information processing section 31, and renders the image in the VRAM 313. The GPU 312 of the information processing section 31 outputs the image rendered in the VRAM 313, to the upper LCD 22 and/or the lower LCD 12, and the image is displayed on the upper LCD 22 and/or the lower LCD 12.
The main memory 32, the external memory I/F 33, the external data storage memory I/F 34, and the internal data storage memory 35 are connected to the information processing section 31. The external memory I/F 33 is an interface for detachably connecting to the external memory 44. The external data storage memory I/F 34 is an interface for detachably connecting to the external data storage memory 45.
The main memory 32 is volatile storage means used as a work area and a buffer area for (the CPU 311 of) the information processing section 31. That is, the main memory 32 temporarily stores various types of data used for the aforementioned processing based on a program, and temporarily stores a program obtained from the outside (i.e., from the external memory 44, another device, or the like), for example. In the present embodiment, for example, a PSRAM (Pseudo-SRAM) is used as the main memory 32.
The external memory 44 is nonvolatile storage means for storing a program executed by the information processing section 31. The external memory 44 is structured as, for example, a read-only semiconductor memory. When the external memory 44 is connected to the external memory I/F 33, the information processing section 31 can load a program stored in the external memory 44. Predetermined processing is performed when the program loaded by the information processing section 31 is executed. The external data storage memory 45 is structured as a non-volatile readable and writable memory (for example, a NAND flash memory), and is used for storing predetermined data. For example, images captured by the outer imaging section 23 and/or images captured by another device are stored in the external data storage memory 45. When the external data storage memory 45 is connected to the external data storage memory I/F 34, the information processing section 31 can load an image stored in the external data storage memory 45, and display the image on the upper LCD 22 and/or the lower LCD 12.
The internal data storage memory 35 is structured as a non-volatile readable and writable memory (for example, a NAND flash memory), and is used for storing predetermined data. For example, data and/or programs downloaded via the wireless communication module 36 by wireless communication are stored in the internal data storage memory 35.
The wireless communication module 36 has a function of connecting to a wireless LAN by a method compliant with, for example, the IEEE 802.11b/g standard. The local communication module 37 has a function of performing wireless communication with the same type of game apparatus by a predetermined communication method (for example, communication based on a unique protocol or infrared communication). The wireless communication module 36 and the local communication module 37 are connected to the information processing section 31. The information processing section 31 can perform data transmission to and data reception from another device via the Internet by using the wireless communication module 36, and perform data transmission to and data reception from the same type of another game apparatus by using the local communication module 37.
The acceleration sensor 39 is connected to the information processing section 31. The acceleration sensor 39 detects magnitudes of acceleration (linear acceleration) in the directions of respective straight lines along three axes (xyz axes). The acceleration sensor 39 is provided inside the lower housing 11. In the acceleration sensor 39, as shown in
The RTC 38 and the power supply circuit 40 are connected to the information processing section 31. The RTC 38 counts time, and outputs the time to the information processing section 31. The information processing section 31 calculates the current time (date) based on the time counted by the RTC 38. The power supply circuit 40 controls power from the power source (i.e., the rechargeable battery accommodated in the lower housing 11 as described above) of the game apparatus 10, and supplies the power to each component of the game apparatus 10.
The I/F circuit 41 is connected to the information processing section 31. The microphone 42 and the speaker 43 are connected to the I/F circuit 41. Specifically, the speaker 43 is connected to the I/F circuit 41 through an amplifier which is not shown. The microphone 42 detects a voice uttered by a user, and outputs a sound signal to the I/F circuit 41, accordingly. The amplifier amplifies a sound signal from the I/F circuit 41, and a resultant sound is outputted from the speaker 43. The touch panel 13 is connected to the I/F circuit 41. The I/F circuit 41 includes a sound control circuit for controlling the microphone 42 and the speaker 43 (amplifier), and a touch panel control circuit for controlling the touch panel. For example, the sound control circuit performs A/D conversion and D/A conversion on sound signals, and also converts sound signals into a predetermined form of sound data. The touch panel control circuit generates a predetermined form of touch position data based on a signal outputted from the touch panel 13, and outputs the touch position data to the information processing section 31. The touch position data indicates coordinates of a position, on an input surface of the touch panel 13, at which an input has been performed. The touch panel control circuit reads a signal outputted from the touch panel 13 and generates touch position data once in every predetermined period. The information processing section 31 obtains the touch position data to recognize a position, on the touch panel 13, at which an input has been performed.
Operation buttons 14 include the above-described operation buttons 14A to 14L, and are connected to the information processing section 31. The operation buttons 14 output, to the information processing section 31, operation data indicating input states of the respective operation buttons 14A to 14L (i.e., indicating whether the operation buttons 14A to 14L have been pressed). The information processing section 31 obtains the operation data from the operation buttons 14 to perform processing in accordance with the inputs performed via the operation buttons 14.
The lower LCD 12 and the upper LCD 22 are connected to the information processing section 31. The lower LCD 12 and the upper LCD 22 each display an image in accordance with an instruction from (the GPU 312 of) the information processing section 31. In the present embodiment, the information processing section 31 causes the upper LCD 22 to display a stereoscopic image (i.e., a stereoscopically visible image).
Specifically, the information processing section 31 is connected to an LCD controller (not shown) of the upper LCD 22, and causes the LCD controller to set the parallax barrier to ON or OFF. When the parallax barrier is set to ON in the upper LCD 22, an image for right eye and an image for left eye, which are stored in the VRAM 313 of the information processing section 31, are outputted to the upper LCD 22. More specifically, the LCD controller alternately repeats reading of pixel data of the image for right eye for one line in the vertical direction, and reading of pixel data of the image for left eye for one line in the vertical direction, thereby reading, from the VRAM 313, the image for right eye and the image for left eye. Thus, an image to be displayed is divided into images for right eye and images for left eye, each of which is a rectangle-shaped image having one line of pixels aligned in the vertical direction. Then, an image, in which the rectangle-shaped images for right eye that are obtained through the division and the rectangle-shaped images for left eye that are obtained through the division are alternately arranged, is displayed on the screen of the upper LCD 22. A user views the image through the parallax barrier in the upper LCD 22, so that the images for right eye are viewed by the user's right eye and the images for left eye are viewed by the user's left eye. In this manner, a stereoscopically visible image is displayed on the screen of the upper LCD 22.
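The alternating vertical-line reads described above can be sketched as follows. This is a simplified illustration, not the LCD controller's actual implementation: each image is represented as a list of one-pixel-wide vertical columns, and the assignment of even columns to the right-eye image is an assumption for illustration.

```python
def interleave_columns(right_img, left_img):
    # right_img / left_img: lists of vertical pixel columns of equal width.
    # The composite image takes alternating one-line-wide strips from each
    # source image, mirroring the controller's alternating reads.
    assert len(right_img) == len(left_img)
    return [right_img[x] if x % 2 == 0 else left_img[x]
            for x in range(len(right_img))]
```

Viewed through the parallax barrier, the even (right-eye) columns reach only the right eye and the odd (left-eye) columns only the left eye.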
The outer imaging section 23 and the inner imaging section 24 are connected to the information processing section 31. The outer imaging section 23 and the inner imaging section 24 each capture an image in accordance with an instruction from the information processing section 31, and output data of the captured image to the information processing section 31.
The 3D adjustment switch 25 is connected to the information processing section 31. The 3D adjustment switch 25 transmits, to the information processing section 31, an electrical signal in accordance with the position of the slider 25a.
The 3D indicator 26 is connected to the information processing section 31. The information processing section 31 controls lighting-up of the 3D indicator 26. For example, the information processing section 31 lights up the 3D indicator 26 when the upper LCD 22 is in the stereoscopic display mode. The game apparatus 10 has the internal configuration as described above.
Next, operations of the game apparatus 10 will be briefly described with reference to
As shown in
It should be noted that, in the planar display mode, a perspective-projection virtual camera may be used. In this case, the angle of view of the perspective-projection virtual camera may be set to 0 or a value close to 0. This allows the perspective-projection virtual camera to be used as an orthogonal-projection virtual camera or a pseudo orthogonal-projection virtual camera.
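The effect of a very small angle of view can be illustrated numerically. Under perspective projection, apparent size falls off with depth; if the angle of view is made small while the camera is pulled back so the subject keeps the same apparent size, that fall-off nearly disappears, approximating orthogonal projection. The following sketch (function names and the pinhole-projection model are illustrative, not from the original) demonstrates this:

```python
import math

def apparent_size(height, depth_from_camera, fov_deg):
    # Simple perspective projection: apparent size is inversely
    # proportional to depth, scaled by the angle of view.
    return height / (depth_from_camera * math.tan(math.radians(fov_deg) / 2))

def depth_falloff(fov_deg, depth_offset=2.0):
    # Place the camera so an object at the target depth has unit apparent
    # size, then measure how much an object depth_offset units farther
    # away shrinks. A ratio near 1.0 means near-orthogonal behavior.
    d = 1.0 / math.tan(math.radians(fov_deg) / 2)
    return apparent_size(1.0, d + depth_offset, fov_deg) / apparent_size(1.0, d, fov_deg)
```

With a 60-degree angle of view the farther object shrinks to about 46% of the nearer one's size; with a 1-degree angle of view (camera pulled far back) it retains about 98%, i.e., the perspective-projection camera behaves as a pseudo orthogonal-projection camera.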
In the case of upgrading an existing video game (e.g., a video game of the past) to one capable of both planar display and stereoscopic display, it is desired in the planar display to keep the way the images look in the existing video game, rather than improving the verisimilitude of the images, to allow the user to feel nostalgic.
As shown in
As described above, in the stereoscopic display mode, the virtual space is rendered by using two virtual cameras that are placed with a distance therebetween. This generates parallax between the image for right eye and the image for left eye. As a result, the parallax allows the user to feel a sense of depth as shown in
In this case, the distance between the right virtual camera and the left virtual camera and the angle of view of each virtual camera are both smaller than in a case where the degree of stereopsis is set to the maximum value. When the distance between the right virtual camera and the left virtual camera is reduced, the parallax between the image for right eye and the image for left eye is reduced, accordingly. Therefore, as shown in
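The proportionality between camera separation and parallax stated above can be sketched with a parallel-camera model (the function and its parameters are illustrative, not taken from the original):

```python
def screen_disparity(camera_separation, focal_length, depth):
    # For two parallel virtual cameras, the horizontal offset between the
    # two projections of a point is proportional to the camera separation
    # and inversely proportional to the point's depth.
    return camera_separation * focal_length / depth
```

Halving the distance between the right and left virtual cameras halves the disparity of every point, which is why a smaller inter-camera distance yields a weaker sense of depth.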
In the present embodiment, the degree of stereopsis is determined in accordance with a 3D volume value which is based on an output signal from the 3D adjustment switch 25. The 3D volume value is a real number in the range of 0.0 to 1.0, for example. When the slider 25a is in the first position (see
In another embodiment, the 3D volume value may be 0.0 when the slider 25a is in the third position (see
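A linear mapping from slider position to the 3D volume value, as described above, might look like the following. The numeric endpoints of the position range are assumptions for illustration; the actual positions of the slider 25a are hardware-specific.

```python
def to_3d_volume(slider_pos, min_pos=0.2, max_pos=1.0):
    # Positions at or below min_pos (the planar-display region of the
    # slider) map to 0.0; positions between min_pos and max_pos map
    # linearly onto the 0.0-1.0 range of the 3D volume value.
    if slider_pos <= min_pos:
        return 0.0
    return min(1.0, (slider_pos - min_pos) / (max_pos - min_pos))
```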
The angle of view of each of the right virtual camera and the left virtual camera and the distance between the right virtual camera and the left virtual camera are determined in accordance with the above-described 3D volume value. As shown in
Described below is an example of formulas which are used for determining the setting of a virtual camera in accordance with the 3D volume value. In the formulas shown below, m3DVol (0.0 f to 1.0 f) represents the 3D volume value; mFov2Len represents a distance from a rendering target (i.e., a virtual object) to the virtual camera in a case where the 3D volume value is minimum; and mMaxFov represents the angle of view of the virtual camera in a case where the 3D volume value is maximum. It should be noted that if the angle of view of the virtual camera is simply increased, then the size of the rendering target displayed on the screen decreases. Therefore, in the present embodiment, as shown in the formulas below, the greater the angle of view of the virtual camera (i.e., an angle of view setting value shown below), the smaller the virtual camera coordinates (i.e., the virtual camera is placed closer to the rendering target). In this manner, the size of the rendering target displayed on the screen is maintained.
Angle of view setting value=1.0 f+(mMaxFov×m3DVol)
Virtual camera coordinates=mFov2Len/angle of view setting value−((angle of view setting value×angle of view coefficient)/mFov2Len)
The setting of a perspective-projection virtual camera is determined based on calculation results of the above formulas.
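The two formulas above can be transcribed directly as follows. The variable names follow the description; `fov_coeff` stands for the "angle of view coefficient" appearing in the second formula, whose value is not given in the text.

```python
def camera_setting(m3DVol, mMaxFov, mFov2Len, fov_coeff):
    # Angle of view setting value grows linearly with the 3D volume value.
    fov_setting = 1.0 + (mMaxFov * m3DVol)
    # The camera coordinate shrinks as the angle of view grows, moving the
    # camera toward the rendering target so that the on-screen size of the
    # target is maintained.
    cam_coord = (mFov2Len / fov_setting
                 - (fov_setting * fov_coeff) / mFov2Len)
    return fov_setting, cam_coord
```

With m3DVol = 0.0 the angle of view setting value is 1.0 and the camera sits farthest from the target; raising m3DVol both widens the angle of view and moves the camera closer.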
It is desired that each of the minimum angle of view and the minimum distance is a value close to 0 so that, when the user operates the slider 25a to switch the display mode from the planar display mode to the stereoscopic display mode (or from the stereoscopic display mode to the planar display mode), the appearance of a displayed image changes smoothly. A virtual camera of which the angle of view is close to 0 may be called a pseudo orthogonal-projection virtual camera. The above effect is produced effectively by employing a switch similar to the 3D adjustment switch 25 shown in
It should be noted that the same perspective-projection virtual cameras may be used for both the stereoscopic display mode and the planar display mode. In this case, in the planar display mode, the angle of view of each perspective-projection virtual camera may be set to the minimum value for the purpose of using each perspective-projection virtual camera as an orthogonal-projection virtual camera or as a pseudo orthogonal-projection virtual camera. In the stereoscopic display mode, the angle of view of each perspective-projection virtual camera may be increased in accordance with an increase in the degree of stereopsis.
According to the above-described control of the angle of view and the distance, an image for left eye rendered based on the left virtual camera and an image for right eye rendered based on the right virtual camera change as shown in
The display control program is a program for generating a planarly visible image and a stereoscopically visible image as described above. For example, at the start of the game, the display control program is loaded from the external memory 44 or the like into the main memory 32 together with the virtual object data which is described below.
The virtual object data relates to a virtual object in a virtual space. The virtual object data indicates, for example, the shape and color of the virtual object. The virtual object data also contains data indicating the position and orientation of the virtual object in the virtual space. These data are updated at any time in accordance with the progress in the game.
The virtual camera data relates to the virtual cameras which are used to render the virtual space. The virtual camera data contains, for example, data indicating the position and orientation of each virtual camera. The virtual camera data contains data relating to an orthogonal-projection virtual camera used in generating a planarly visible image and data relating to perspective-projection virtual cameras (i.e., the right virtual camera and the left virtual camera) used in generating a stereoscopically visible image. The data relating to the perspective-projection virtual cameras contains: the angle of view of the right virtual camera and the angle of view of the left virtual camera; and the distance between the right virtual camera and the left virtual camera (or positional coordinates of the right virtual camera and positional coordinates of the left virtual camera).
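The virtual camera data described above can be summarized as a small data structure (a sketch; the field names are illustrative, not taken from the original):

```python
from dataclasses import dataclass

@dataclass
class PerspectiveCameraPair:
    # Data for the stereoscopic (perspective-projection) cameras.
    right_fov: float     # angle of view of the right virtual camera
    left_fov: float      # angle of view of the left virtual camera
    separation: float    # distance between the two cameras (the positional
                         # coordinates of each camera could be stored instead)

@dataclass
class VirtualCameraData:
    # Data for the orthogonal-projection camera used in planar display,
    # plus the perspective-projection pair used in stereoscopic display.
    ortho_position: tuple
    ortho_orientation: tuple
    stereo: PerspectiveCameraPair
```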
As described above, the 3D volume value is obtained at any time based on an output signal from the 3D adjustment switch 25, and stored in the main memory 32.
The operation data is obtained at any time based on output signals from input devices such as the operation buttons 14, and stored in the main memory 32.
First, at step S11 in
At step S12, the CPU 311 determines, based on an output signal from the 3D adjustment switch 25, whether the stereoscopic display mode is currently selected (i.e., whether the slider 25a is located at a position between the first position (see
At step S13, the CPU 311 obtains the 3D volume value based on an output signal from the 3D adjustment switch 25.
At step S14, the CPU 311 determines, based on the obtained 3D volume value, the angle of view of each of the pair of perspective-projection virtual cameras (i.e., the right virtual camera and the left virtual camera) and the distance between the right virtual camera and the left virtual camera (or the positional coordinates of the right virtual camera and the positional coordinates of the left virtual camera). These may be determined by referring to a table as shown in
At step S15, the CPU 311 renders images (an image for right eye and an image for left eye) by using the pair of perspective-projection virtual cameras. Typically, the GPU 312 performs this rendering process in accordance with a rendering instruction from the CPU 311. The images generated in this manner are combined as necessary and then displayed at a predetermined timing on the upper LCD 22 in a stereoscopic manner.
At step S16, the CPU 311 uses a single orthogonal-projection virtual camera to render an image. Typically, the GPU 312 performs this rendering process in accordance with a rendering instruction from the CPU 311. The image generated in this manner is displayed at a predetermined timing on the upper LCD 22 in a planar manner.
At step S17, the CPU 311 obtains operation data based on output signals from the operation buttons 14 or the like.
At step S18, the CPU 311 controls a virtual object (in the present embodiment, the rider) based on the obtained operation data.
At step S19, the CPU 311 determines whether the game has ended. If the game has ended, the execution of the display control program is ended. Otherwise, the processing returns to step S12.
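Steps S12 through S19 form a per-frame loop (step S11, loading, happens once beforehand), which can be sketched as follows. Every name in this sketch is a placeholder for an operation described above; for testability, each frame's device state is supplied as a tuple rather than read from real hardware.

```python
def run_game(frames_input):
    # Each element of frames_input stands for one frame's device state:
    # (stereo_selected, volume, ops, ended).
    rendered = []
    for stereo_selected, volume, ops, ended in frames_input:
        if stereo_selected:                       # S12: check display mode
            fov = 1.0 + 0.5 * volume              # S13-S14: volume -> settings
            rendered.append(("stereo", fov))      # S15: left/right render
        else:
            rendered.append(("planar", None))     # S16: orthogonal render
        handle_input(ops)                         # S17-S18: buttons -> object
        if ended:                                 # S19: loop until game ends
            break
    return rendered

def handle_input(ops):
    # S17-S18: obtain operation data and move the virtual object
    # (stubbed out in this sketch).
    pass
```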
As described above, according to the present embodiment, perspective-projection virtual cameras are used in the stereoscopic display mode, and an orthogonal-projection virtual camera is used in the planar display mode. Accordingly, as compared to the planar display mode, the stereoscopic effect for the user is enhanced in the stereoscopic display mode, which allows the user to feel not only the sense of depth realized by parallax but also the sense of depth realized by perspective projection.
Further, according to the present embodiment, in the stereoscopic display mode, the angle of view of each virtual camera changes in accordance with the degree of stereopsis (i.e., in accordance with the position of the slider 25a). This allows the user to feel, when the degree of stereopsis changes, not only a change in the sense of depth realized by parallax but also a change in the sense of depth realized by perspective projection. In this manner, changes in the stereoscopic effect can be accentuated.
Still further, according to the present embodiment, the perspective-projection virtual cameras may be used in both the stereoscopic display mode and the planar display mode. In the planar display mode, by setting the angle of view of each perspective-projection virtual camera to the minimum value, each perspective-projection virtual camera can be used as an orthogonal-projection virtual camera or a pseudo orthogonal-projection virtual camera. In the stereoscopic display mode, the angle of view of each perspective-projection virtual camera is increased in accordance with an increase in the degree of stereopsis. This allows the user to feel, when the degree of stereopsis changes, not only a change in the sense of depth realized by parallax (i.e., the distance between the left and right virtual cameras) but also a change in the sense of depth realized by the angle of view of each perspective-projection virtual camera. In this manner, changes in the stereoscopic effect can be accentuated.
The above embodiment describes an example where the virtual space is rendered by using an orthogonal-projection virtual camera in the planar display mode. However, in another embodiment, in the planar display mode, the virtual space may be rendered by using a perspective-projection virtual camera of which the angle of view is similar to one for orthogonal projection (i.e., a pseudo orthogonal-projection virtual camera). Moreover, in the planar display mode, the distance between a pair of virtual cameras used in the stereoscopic display mode may be set to 0 (i.e., the positional coordinates of the right virtual camera and the positional coordinates of the left virtual camera may be set to be identical with each other). Then, only one of these virtual cameras may be used to render the virtual space.
Further, the above embodiment describes an example where a pair of perspective-projection virtual cameras is used in the stereoscopic display mode. However, in another embodiment, a single perspective-projection virtual camera may be used in the stereoscopic display mode in such a manner that the position of the single perspective-projection virtual camera is moved as necessary to render two images (i.e., an image for right eye and an image for left eye), the respective viewpoints of which are different from each other.
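The single-camera variant can be sketched as follows: the one perspective-projection camera is offset to each eye position in turn, producing two images whose viewpoints differ (function and parameter names are illustrative):

```python
def render_stereo_single_camera(center_x, half_separation, render_view):
    # Offset the single perspective-projection camera to the left eye
    # position, render, then offset it to the right eye position and
    # render again. The two viewpoints differ by the full separation.
    left_image = render_view(camera_x=center_x - half_separation)
    right_image = render_view(camera_x=center_x + half_separation)
    return left_image, right_image
```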
Still further, the above embodiment describes an example where the virtual cameras are switched between the orthogonal-projection type used in the planar display mode and the perspective-projection type used in the stereoscopic display mode (i.e., switching between two types of cameras). However, the present invention is not limited thereto. Without switching the virtual cameras, changing virtual camera parameters (e.g., the angle of view and the like) between the planar display mode and the stereoscopic display mode will suffice. This allows the sense of depth realized by parallax to be changed between the planar display mode and the stereoscopic display mode, and in addition, allows the appearance of a displayed image to be changed in accordance with the virtual camera parameters. Alternatively, multiple types of perspective-projection virtual cameras, each type having a different angle of view, may be prepared. Based on an obtained 3D volume value, a corresponding type of perspective-projection virtual camera may be set.
Still further, the above embodiment describes an example where the game, which the user plays by operating a rider, is executed. However, in another embodiment, any application software that is not game software may be executed.
Still further, the above embodiment describes an example where a CPU (i.e., the CPU 311) performs processing of the display control program. However, in another embodiment, at least a part of the processing of the display control program may be realized solely by hardware components.
Still further, the above embodiment describes a handheld game apparatus which includes a display device (i.e., the upper LCD 22) capable of stereoscopic display. However, in another embodiment, any information processing apparatus (e.g., a stationary game apparatus, desktop PC, laptop PC, or mobile phone) capable of outputting image signals to a display device capable of stereoscopic display may be used.
Still further, the above embodiment describes an example where a single CPU (i.e., the CPU 311) performs processing of the display control program. However, in another embodiment, the processing of the display control program may be distributed to a plurality of processors. The plurality of processors may be included in a single information processing apparatus, or may be separately provided in a plurality of information processing apparatuses, respectively, such as a server apparatus and a client apparatus.
While the invention has been described in detail, the foregoing description is in all aspects illustrative and not restrictive. It will be understood that numerous other modifications and variations can be devised without departing from the scope of the invention.
Number | Date | Country | Kind |
---|---|---|---
2010-294536 | Dec 2010 | JP | national |