Japanese Patent Application No. 2012-080193 filed on Mar. 30, 2012, is hereby incorporated by reference in its entirety.
The present invention relates to a stereoscopic device and an image generation method.
JP-A-2003-85586 discloses a technique that projects an image with a small amount of distortion onto a curved screen.
According to one aspect of the invention, there is provided a stereoscopic device that allows a user to observe a stereoscopic image by observing an image projected onto a curved screen from a presumed viewing position, the stereoscopic device comprising:
a virtual camera setting section that sets a right virtual camera and a left virtual camera in a virtual three-dimensional space;
a virtual plane setting section that sets a plurality of virtual planes in front of the right virtual camera and the left virtual camera in a line-of-sight direction;
a virtual plane image generation section that generates a virtual plane image that corresponds to the right virtual camera and a virtual plane image that corresponds to the left virtual camera by perspective projection transformation of the virtual three-dimensional space onto the plurality of virtual planes; and
a projection image generation section that generates a projection image that is projected onto the curved screen using a pixel position correspondence relationship and the virtual plane images, the pixel position correspondence relationship defining a relationship between a pixel position of the plurality of virtual planes and a pixel position of the projection image so that the virtual plane images are observed without distortion when the user observes the curved screen from the presumed viewing position.
According to another aspect of the invention, there is provided an image generation method that generates a projection image that can be observed as a stereoscopic image when a user observes the projection image projected onto a curved screen from a presumed viewing position, the method comprising:
setting a right virtual camera and a left virtual camera in a virtual three-dimensional space;
setting a plurality of virtual planes in front of the right virtual camera and the left virtual camera in a line-of-sight direction;
generating a virtual plane image that corresponds to the right virtual camera and a virtual plane image that corresponds to the left virtual camera by perspective projection transformation of the virtual three-dimensional space onto the plurality of virtual planes; and
generating the projection image using a pixel position correspondence relationship and the virtual plane images, the pixel position correspondence relationship defining a relationship between a pixel position of the plurality of virtual planes and a pixel position of the projection image so that the virtual plane images are observed without distortion when the user observes the curved screen from the presumed viewing position.
Several embodiments of the invention may implement a stereoscopic device that projects an image onto a curved screen.
According to one embodiment of the invention, there is provided a stereoscopic device that allows a user to observe a stereoscopic image by observing an image projected onto a curved screen from a presumed viewing position, the stereoscopic device comprising:
a virtual camera setting section that sets a right virtual camera and a left virtual camera in a virtual three-dimensional space;
a virtual plane setting section that sets a plurality of virtual planes in front of the right virtual camera and the left virtual camera in a line-of-sight direction;
a virtual plane image generation section that generates a virtual plane image that corresponds to the right virtual camera and a virtual plane image that corresponds to the left virtual camera by perspective projection transformation of the virtual three-dimensional space onto the plurality of virtual planes; and
a projection image generation section that generates a projection image that is projected onto the curved screen using a pixel position correspondence relationship and the virtual plane images, the pixel position correspondence relationship defining a relationship between a pixel position of the plurality of virtual planes and a pixel position of the projection image so that the virtual plane images are observed without distortion when the user observes the curved screen from the presumed viewing position.
According to another embodiment of the invention, there is provided an image generation method that generates a projection image that can be observed as a stereoscopic image when a user observes the projection image projected onto a curved screen from a presumed viewing position, the method comprising:
setting a right virtual camera and a left virtual camera in a virtual three-dimensional space;
setting a plurality of virtual planes in front of the right virtual camera and the left virtual camera in a line-of-sight direction;
generating a virtual plane image that corresponds to the right virtual camera and a virtual plane image that corresponds to the left virtual camera by perspective projection transformation of the virtual three-dimensional space onto the plurality of virtual planes; and
generating the projection image using a pixel position correspondence relationship and the virtual plane images, the pixel position correspondence relationship defining a relationship between a pixel position of the plurality of virtual planes and a pixel position of the projection image so that the virtual plane images are observed without distortion when the user observes the curved screen from the presumed viewing position.
According to the stereoscopic device and the image generation method, the right virtual camera and the left virtual camera are set in the virtual three-dimensional space, and a plurality of virtual planes are set in the line-of-sight direction of the right virtual camera and the left virtual camera. Note that the range covered by the virtual planes need not necessarily include the entire field of view of the virtual camera. It suffices that the virtual planes cover the range projected onto the curved screen. The virtual plane image that corresponds to the right virtual camera and the virtual plane image that corresponds to the left virtual camera are generated by perspective projection transformation of the virtual three-dimensional space onto the virtual planes. The projection image is generated using the pixel position correspondence relationship and the virtual plane images, the pixel position correspondence relationship defining the relationship between the pixel position of the virtual planes and the pixel position of the projection image so that the virtual plane images are observed without distortion when the user observes the curved screen from the presumed viewing position. The projection image thus generated is projected onto the curved screen, and can be observed as a stereoscopic image when the user observes the curved screen from the presumed viewing position.
When observing a 2D image (non-stereoscopic image) displayed on a curved screen, the curved screen may be recognized as a wall. According to the stereoscopic image obtained by applying the technology according to the embodiment of the invention, it is possible to create a sense of openness, as if the wall of the curved screen had been removed.
In the stereoscopic device,
the virtual plane image generation section may generate the virtual plane image that corresponds to the right virtual camera and the virtual plane image that corresponds to the left virtual camera by perspective projection transformation using the plurality of virtual planes as a screen common to the right virtual camera and the left virtual camera.
According to the above feature, the perspective projection transformation based on the right virtual camera and the left virtual camera is performed using the virtual planes as a common perspective projection transformation screen. This makes it unnecessary to separately set the perspective projection transformation screen corresponding to each of the right virtual camera and the left virtual camera.
The stereoscopic device may further comprise:
a projection device that projects the projection image so that a center of the projection image coincides with an intersection position of the presumed viewing direction of the user and the curved screen.
According to the above feature, the projection image is projected so that the center of the projection image coincides with the intersection position of the presumed viewing direction of the user and the curved screen. This makes it possible to suppress distortion of the stereoscopic image in the presumed viewing direction as compared with the end of the curved screen, and improve the visibility of the stereoscopic image when the user observes the stereoscopic image.
In the stereoscopic device,
the virtual plane setting section may set two or three virtual planes so that the virtual planes are connected to each other.
According to the above feature, two or three virtual planes are set so that the virtual planes are connected to each other. This makes it possible to minimize an increase in drawing load that may occur when performing the stereoscopic image drawing calculation process since it suffices to provide only two or three virtual planes.
In the stereoscopic device,
the virtual plane setting section may change a connection angle of the virtual planes.
According to the above feature, the connection angle of the virtual planes can be changed.
In the stereoscopic device,
the virtual plane setting section may gradually change a position of the virtual planes so that a relative angle with respect to the virtual camera gradually changes.
According to the above feature, the position of the virtual planes can be gradually changed so that the relative angle with respect to the virtual camera gradually changes.
The stereoscopic device according to the embodiment of the invention utilizes two or three virtual planes as a substitute (approximation) for the curved screen in order to reduce the drawing load. Therefore, the curved screen and the virtual planes are matched via partial scaling (e.g., in the way a Mercator projection maps the globe onto a cylinder). If the connection angle of the virtual planes and the relative angle with respect to the virtual camera are fixed, the degree of scaling differs from one position to another, but does not change over time. On the other hand, an accurately calculated parallax image is required in order to implement a stereoscopic image. Therefore, an area that can be relatively easily observed as a stereoscopic image, and an area that is difficult to observe as a stereoscopic image, may be present on the curved screen depending on the degree of scaling.
In order to deal with such a situation, the relationship between the curved screen and the virtual plane is changed by changing the connection angle of the virtual planes or the relative angle with respect to the virtual camera to adjust the degree of scaling. This makes it possible to generate an appropriate projection image so that an area of the curved screen that is desired to be observed as a stereoscopic image can be easily observed as a stereoscopic image.
The stereoscopic device may further comprise:
an attention point setting section that sets an attention point that moves in the virtual three-dimensional space,
the virtual plane setting section changing the position of the virtual planes corresponding to a position of the attention point.
According to the above feature, the position of the virtual planes is changed corresponding to the position of the attention point in the virtual three-dimensional space. Therefore, the image at the attention point can be relatively easily observed as a stereoscopic image.
Exemplary embodiments of the invention are described below with reference to the drawings. Note that the embodiments to which the invention can be applied are not limited to the following exemplary embodiments.
The direction and the height of the player's seat 1002 are adjusted so that the center area of the screen 1004 is present in a presumed viewing direction of the player who sits on the player's seat 1002. In one embodiment of the invention, the presumed viewing direction is the front direction of the player who sits on the player's seat 1002. The curved screen 1004 is formed to be convex in the front direction (presumed viewing direction) of the player who sits on the player's seat 1002.
The projector 1006, which is a projection device, is supported by a housing frame 1014 and posts 1012 provided behind the player's seat 1002, and is disposed above the player's seat 1002 at a position at which the projector 1006 does not interfere with the player who sits on the player's seat 1002, so that the center area of the screen 1004 is present in the projection direction from the projection center of the projector 1006. Specifically, the projector 1006 is disposed so that the projection direction from the projection center intersects (faces) the intersection position of the presumed viewing direction of the player and the curved screen. A wide-angle lens is attached to the projector 1006 as a projection lens. The projection image is projected onto the entire plane of projection of the screen 1004 through the wide-angle lens.
The control board 1020 includes a microprocessor (e.g., CPU, GPU, and DSP), an ASIC, and an IC memory (e.g., VRAM, RAM, and ROM). The control board 1020 performs various processes for implementing the car racing game based on a program and data stored in the IC memory, and operation signals input from the steering wheel 1008, the accelerator pedal 1010, the brake pedal 1009, and the like.
The player sits on the player's seat 1002, and plays the car racing game by operating the steering wheel 1008, the accelerator pedal 1010, and the brake pedal 1009 while observing the game screen displayed on the screen 1004 and listening to the game sound output from the speaker.
The car racing game according to one embodiment of the invention is designed so that a background object (e.g., racetrack) is disposed in a virtual three-dimensional space to form a game space. Various objects such as a player's car operated by the player and a car operated by another player are disposed in the game space. A virtual viewpoint (virtual camera) is also disposed in the game space at the position of the driver of the player's car. The projector 1006 projects (displays) an image (stereoscopic image) of the game space based on the virtual viewpoint onto (on) the screen 1004 as the game image.
In one embodiment of the invention, a right-eye stereoscopic image and a left-eye stereoscopic image are alternately displayed by time division so that the player can observe a stereoscopic image. The player wears stereoscopic glasses (not illustrated in the drawings) that utilize a film-type patterned retarder technique, a time-division technique (e.g., liquid crystal shutter technique), or a spectroscopic technique, and observes the image displayed on (projected onto) the screen 1004 as a stereoscopic image.
In one embodiment of the invention, the virtual three-dimensional space (game space) and the real space have a 1:1 scaling relationship. Note that the scaling relationship between the virtual three-dimensional space (game space) and the real space may be arbitrarily set depending on the game. For example, when the game is designed so that a small player character (e.g., insect) travels around the game world, the scaling relationship between the virtual three-dimensional space (game space) and the real space may be set to 1:25.
The stereoscopic image generation principle according to one embodiment of the invention is described below. A presumed position of each eye of the player (viewer) in a state in which the player sits on the player's seat 1002 is referred to as “presumed viewing position”. A virtual viewpoint 10 (right-eye virtual viewpoint 10a and left-eye virtual viewpoint 10b) is disposed in the virtual three-dimensional space (game space) at a position corresponding to the presumed viewing position (see the drawings).
As illustrated in the drawings, two virtual planes 20a and 20b that approximate the plane of projection of the curved screen 1004 are set in front of the virtual viewpoint 10 in the line-of-sight direction, the virtual planes 20a and 20b being connected in the horizontal direction and set to face the virtual viewpoint 10.
Note that the range covered by the virtual planes 20a and 20b need not necessarily include the entire field of view of the virtual viewpoint 10. It suffices that the virtual planes 20a and 20b cover a range that corresponds to the range displayed on (projected onto) the screen 1004.
When the virtual planes 20a and 20b have been set as described above, a plane image 32 is generated by utilizing the virtual planes 20a and 20b as a perspective projection transformation screen. More specifically, the plane image 32 is generated by perspective projection transformation of the virtual three-dimensional space onto the virtual plane 20 based on the virtual viewpoint 10 (see the drawings).
As illustrated in the drawings, a plane image 30a is generated by perspective projection transformation onto the virtual plane 20a, and a plane image 30b is generated by perspective projection transformation onto the virtual plane 20b; the plane images 30a and 30b are combined to generate the plane image 32.
As illustrated in the drawings, the right-eye plane image 32a is generated based on the right-eye virtual viewpoint 10a, and the left-eye plane image 32b is generated based on the left-eye virtual viewpoint 10b, using the virtual planes 20a and 20b as a perspective projection transformation virtual screen common to the two viewpoints.
The reason why the virtual planes 20a and 20b are commonly used as the perspective projection transformation virtual screen for the right-eye virtual viewpoint 10a and the left-eye virtual viewpoint 10b is described below. Since an image drawn in the virtual plane is corrected to fit the screen 1004, the virtual screen need not necessarily be common to the right-eye virtual viewpoint 10a and the left-eye virtual viewpoint 10b. However, using a virtual screen common to the right-eye virtual viewpoint 10a and the left-eye virtual viewpoint 10b simplifies the drawing program and makes it easy to intuitively determine the drawing state, which facilitates adjustment and debugging.
A perspective projection transformation process for generating a two-dimensional image is normally designed so that the virtual screen is orthogonal to the line-of-sight direction, and the intersection point of the virtual screen and the line-of-sight direction coincides with the center of the rectangular virtual screen, by setting the line-of-sight direction, the vertical angle of view, and the horizontal angle of view. However, when applying such a method to the right-eye virtual viewpoint 10a and the left-eye virtual viewpoint 10b, vertical parallax due to keystone (trapezoidal) distortion occurs since the virtual screen differs between the right-eye virtual viewpoint 10a and the left-eye virtual viewpoint 10b (see the drawings).
Such a situation can be avoided by setting a virtual screen common to the right-eye virtual viewpoint 10a and the left-eye virtual viewpoint 10b (see the drawings).
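Although the expression itself is not reproduced here, a form consistent with the definitions below can be sketched as follows (an illustrative reconstruction, not necessarily the literal formula of the embodiment). Taking the common screen to lie in the plane z = 0, a point is projected along the ray from the virtual viewpoint onto the screen plane; writing (xs, ys) for the projected position and (u, v) for that position normalized to [-1, 1]:

$$x_s=\frac{x\,z_c-x_c\,z}{z_c-z},\qquad y_s=\frac{y\,z_c-y_c\,z}{z_c-z},\qquad u=\frac{2x_s}{w},\qquad v=\frac{2y_s}{h},$$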
where w is the width (horizontal dimension) of the common screen, h is the height (vertical dimension) of the common screen, (x, y, z) are the coordinates of a point in the coordinate system whose origin is the center of the common screen, and (xc, yc, zc) are the position coordinates of the virtual viewpoint in the same coordinate system (see the drawings).
A versatile CG drawing system such as OpenGL (registered trademark) or DirectX (registered trademark) utilizes the direction of the virtual viewpoint (virtual camera) in order to determine the direction (normal) of the virtual screen. It is necessary to set the right-eye virtual viewpoint 10a and the left-eye virtual viewpoint 10b to perpendicularly face the virtual plane 20a (see the drawings).
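In such a system, the common-screen setup described above corresponds to an asymmetric (off-axis) view frustum built from the screen rectangle and each eye position. The following sketch is a hypothetical helper (the function name and the example values are assumptions, not taken from the embodiment) that computes glFrustum-style bounds for a common screen lying in the plane z = 0, viewed by a camera that perpendicularly faces the screen:

```python
def off_axis_frustum(w, h, eye, near, far):
    """Asymmetric view frustum for a common screen in the plane z = 0.

    The screen is centered at the origin; eye = (xc, yc, zc) with zc > 0,
    looking along -z (i.e., perpendicular to the virtual plane).  Returns
    (left, right, bottom, top, near, far) in the order expected by an
    OpenGL-style glFrustum call.
    """
    xc, yc, zc = eye
    s = near / zc  # project the screen edges onto the near plane
    return ((-w / 2 - xc) * s, (w / 2 - xc) * s,
            (-h / 2 - yc) * s, (h / 2 - yc) * s, near, far)

# Example: one common screen, two eyes separated by 65 mm (assumed values).
w, h = 4.0, 2.5  # common-screen size in meters (assumed)
for label, eye in (("right eye", (0.0325, 0.0, 2.0)),
                   ("left eye", (-0.0325, 0.0, 2.0))):
    print(label, off_axis_frustum(w, h, eye, near=0.1, far=500.0))
```

Only the frustum bounds differ between the two eyes; the view matrix is a pure translation, which is exactly the common-screen condition described above.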
When the virtual screen has thus been determined, it suffices to take account of only the position of the right-eye virtual viewpoint 10a and the position of the left-eye virtual viewpoint 10b. The following description illustrates an example in which the right-eye virtual viewpoint 10a and the left-eye virtual viewpoint 10b face front for convenience of explanation.
The plane image 32 (32a and 32b) is generated by perspective projection transformation of the virtual three-dimensional space onto the virtual plane 20. When the plane image 32 is projected directly onto the curved screen (plane of projection) 1004, the resulting image is distorted. The amount of distortion is relatively small in the vicinity of the contact position of the virtual plane 20 and the screen (plane of projection) 1004, and increases as the distance from the contact position increases.
In order to correct the distortion, a coordinate transformation process is performed on the plane image to generate a projection image 40. More specifically, the relationship (correspondence relationship) between the position of the projection image 40 and the projection position in the virtual plane 20 (i.e., the position of the plane image 32) is calculated, and the projection image 40 is generated using the calculated relationship and the plane image 32. The above relationship may be calculated for the right-eye plane image 32a based on the right-eye virtual viewpoint 10a and the left-eye plane image 32b based on the left-eye virtual viewpoint 10b using an identical method.
As illustrated in the drawings, each pixel of the projection image 40 corresponds to a position on the screen (plane of projection) 1004. The position in the virtual plane 20 that corresponds to a given pixel of the projection image 40 is obtained as the intersection of the virtual plane 20 and a straight line that connects the presumed viewing position and the position on the screen 1004 that corresponds to that pixel.
The position in the virtual plane 20 that corresponds to each pixel of the projection image 40 is calculated in the same manner as described above. A stereoscopic image without distortion can be implemented by setting the color of each pixel of the projection image 40 to be the color at the corresponding pixel position of the plane image 32.
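The correspondence calculation and the color transfer can be sketched as follows. Purely for concreteness, the sketch assumes a cylindrical screen centered on the viewer, an ideal projector that fills the screen uniformly in azimuth and height, and a single flat virtual plane standing in for the connected planes 20a and 20b; all names and parameters are illustrative assumptions rather than the embodiment's implementation:

```python
import numpy as np

def build_pixel_map(width, height, eye, screen_radius, screen_height,
                    half_angle, plane_dist, plane_w, plane_h):
    """Correspondence from projection-image pixels to plane-image positions.

    Pixel (i, j) is assumed to land on the cylinder at azimuth theta in
    [-half_angle, half_angle] and height y.  For each screen point, the ray
    from the presumed viewing position `eye` through that point is intersected
    with a virtual plane z = -plane_dist facing the viewer, and the hit point
    is converted to normalized (u, v) coordinates of the plane image.
    """
    eye = np.asarray(eye, dtype=float)
    jj, ii = np.meshgrid(np.arange(height), np.arange(width), indexing="ij")
    theta = (ii / (width - 1) - 0.5) * 2.0 * half_angle
    y = (0.5 - jj / (height - 1)) * screen_height
    # Point on the cylindrical screen (the viewer looks along -z).
    screen_pt = np.stack([screen_radius * np.sin(theta),
                          y,
                          -screen_radius * np.cos(theta)], axis=-1)
    ray = screen_pt - eye                     # viewing ray: eye -> screen point
    t = (-plane_dist - eye[2]) / ray[..., 2]  # intersect plane z = -plane_dist
    hit = eye + t[..., None] * ray
    u = hit[..., 0] / plane_w + 0.5           # normalized plane-image column
    v = 0.5 - hit[..., 1] / plane_h           # normalized plane-image row
    return np.stack([u, v], axis=-1)          # shape (height, width, 2)

def apply_pixel_map(plane_image, pixel_map):
    """Resample the plane image through the map to get the projection image."""
    h, w = plane_image.shape[:2]
    u = np.clip((pixel_map[..., 0] * (w - 1)).astype(int), 0, w - 1)
    v = np.clip((pixel_map[..., 1] * (h - 1)).astype(int), 0, h - 1)
    return plane_image[v, u]
```

The nearest-neighbor lookup in apply_pixel_map keeps the sketch short; a practical renderer would interpolate between neighboring plane-image pixels.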
The relationship between the pixel position of the projection image 40 and the pixel position of the plane image 32 thus calculated is referred to as “pixel position correspondence map 50”. A right-eye pixel position correspondence map 50a that corresponds to the right-eye virtual viewpoint 10a, and a left-eye pixel position correspondence map 50b that corresponds to the left-eye virtual viewpoint 10b are generated as the pixel position correspondence map 50.
When the drawing environment including the positions of the right eye and the left eye, the virtual plane, and the screen shape is symmetrical, the left-eye pixel position correspondence map 50b is obtained by reversing the right-eye pixel position correspondence map 50a. In this case, since it suffices to provide one pixel position correspondence map 50, it is possible to save the pixel position correspondence map calculation time and the memory capacity for storing the pixel position correspondence map 50.
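Under such a symmetric setup, the mirror relationship can be expressed directly on the map from the previous sketch (again an illustrative helper that assumes that map layout):

```python
def mirror_pixel_map(right_map):
    """Left-eye map as the horizontal mirror of the right-eye map.

    Valid only for a left-right symmetric rig: the map (height, width, 2) is
    flipped left-to-right, and the stored u coordinate is mirrored so that it
    still points into the (mirrored) plane image.
    """
    left_map = right_map[:, ::-1].copy()
    left_map[..., 0] = 1.0 - left_map[..., 0]
    return left_map
```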
Since the pixel position correspondence map 50 is fixed when the size and the position of the virtual plane 20 do not change, and the positions of the right eye and the left eye of the viewer do not change to a large extent, the pixel position correspondence map 50 may be generated in advance before generating an image, or may be generated when generating an image.
When the pixel position correspondence map 50 is generated at the time of image generation, and provided that the process can be performed at a sufficiently high speed and the positions of the right eye and the left eye of the viewer can be accurately and quickly detected by head tracking or eye tracking, the pixel position correspondence map 50 can be generated in real time based on the positions of the right-eye virtual viewpoint 10a and the left-eye virtual viewpoint 10b. This makes it possible to deal with a situation in which the viewer has moved his head, and to display a more natural stereoscopic image.
Even if the process cannot be performed in real time, it is possible to generate a stereoscopic image that is more appropriate for the viewer by acquiring the positions of the right eye and the left eye of the viewer during the initial setting process, and generating the pixel position correspondence map 50 corresponding to the positions of the right eye and the left eye of the viewer.
It is also possible to produce an effective presentation in which the space appears to gradually expand as compared with a dome screen image (i.e., an image that appears stuck to the wall of a dome) by initially setting the positions of the right-eye virtual viewpoint 10a and the left-eye virtual viewpoint 10b at the center between the right eye and the left eye, gradually moving the right-eye virtual viewpoint 10a and the left-eye virtual viewpoint 10b to the positions of the right eye and the left eye, respectively, and applying the corresponding pixel position correspondence map 50. Note that the pixel position correspondence map 50 may be generated in advance corresponding to the positions of the virtual viewpoints 10a and 10b, or may be generated at a high speed during the process.
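This gradual transition can be sketched as a simple interpolation of both virtual viewpoints from the head center toward the respective eye positions, regenerating (or selecting a precomputed) pixel position correspondence map at each step. The helper and example values below are assumptions for illustration:

```python
def interpolated_viewpoints(center, right_eye, left_eye, step, steps):
    """Viewpoint positions for frame `step` of the opening-up transition.

    At step 0 both virtual viewpoints sit at the head center, so the two
    projection images coincide (a flat, wall-like image); at step == steps
    they reach the true eye positions and full stereoscopy is obtained.
    """
    a = step / float(steps)
    lerp = lambda p, q: tuple(pc + a * (qc - pc) for pc, qc in zip(p, q))
    return lerp(center, right_eye), lerp(center, left_eye)

# Example: halfway through a 60-frame transition, 65 mm interocular (assumed).
center, r_eye, l_eye = (0.0, 0.0, 2.0), (0.0325, 0.0, 2.0), (-0.0325, 0.0, 2.0)
print(interpolated_viewpoints(center, r_eye, l_eye, step=30, steps=60))
```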
When the projection image 40 (i.e., the projection image that corresponds to the right-eye virtual viewpoint and the projection image that corresponds to the left-eye virtual viewpoint) has been generated using the plane image 32 (32a and 32b) and the pixel position correspondence map 50 (50a and 50b), the projection image 40 is projected from the projector 1006.
The projection image 40 is thus generated by the above stereoscopic image generation principle. It is reasonable to generate the projection image 40 as described below instead of generating the plane image 32 in advance. As illustrated in the drawings, each pixel of the projection image 40 is drawn by referring to the pixel position correspondence map 50 and acquiring the color at the corresponding position in the virtual plane 20, without generating the plane image 32 in advance.
When using the above drawing process, since the pixel position correspondence map 50 is generated corresponding to each pixel of the projection image 40, it is necessary to provide a map having a size corresponding to the number of pixels of the projection image 40. Therefore, it may be difficult to store such a large map in the memory depending on the hardware conditions. Moreover, it takes time to calculate the map.
The above problems may be solved by utilizing the vertex coordinates of a polygon. The details of the method that utilizes the vertex coordinates of a polygon are described below. As illustrated in the drawings, the projection image 40 is divided into a mesh of polygons, the pixel position correspondence relationship is calculated only for the vertices of the mesh, and the corresponding position of the plane image 32 is stored as the texture coordinates (U, V) of each vertex.
In this case, a normal textured polygon drawing process implemented by a GPU may be used. Specifically, the texture coordinates (U, V) of each vertex of each polygon are referred to, and the polygon drawing process is performed so that the pixel value of the plane image 32 is acquired based on the coordinates (U, V) obtained by interpolating the coordinate value of each vertex. This makes it possible to store the pixel position relationship using a significantly small memory capacity as compared with the case of storing a map that corresponds to the number of pixels of the projection image 40. Moreover, since the amount of calculations is reduced, the coordinate position relationship data can be generated at a higher speed.
Since the above method utilizes the interpolation process, the accuracy may decrease as compared with the case of providing a map (pixel position correspondence map 50) that corresponds to the number of pixels of the projection image 40. However, since the number of meshes can be arbitrarily increased or decreased, the coordinate relationship data can be appropriately generated while giving priority to the accuracy of the image or the memory capacity/calculation time.
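The vertex-based method can be sketched as follows: the projection image is covered by a coarse quad mesh, the correspondence is evaluated only at the mesh vertices, and the plane-image position becomes each vertex's texture coordinates (U, V) for the rasterizer to interpolate. The builder below reuses the map layout of the earlier sketch and is illustrative only:

```python
import numpy as np

def build_correspondence_mesh(pixel_map, nx=32, ny=24):
    """Coarse textured mesh that replaces the full per-pixel map.

    Vertex positions are normalized projection-image coordinates; texture
    coordinates (U, V) are the plane-image positions sampled from the map at
    the vertices only, so just (nx + 1) * (ny + 1) correspondences are stored
    and the GPU interpolates everything in between.
    """
    h, w = pixel_map.shape[:2]
    xs = np.linspace(0, w - 1, nx + 1).round().astype(int)
    ys = np.linspace(0, h - 1, ny + 1).round().astype(int)
    positions, uvs, triangles = [], [], []
    for y in ys:
        for x in xs:
            positions.append((x / (w - 1), y / (h - 1)))  # projection image
            uvs.append(tuple(pixel_map[y, x]))            # (U, V) plane image
    for row in range(ny):                                 # two triangles/cell
        for col in range(nx):
            i = row * (nx + 1) + col
            triangles += [(i, i + 1, i + nx + 1),
                          (i + 1, i + nx + 2, i + nx + 1)]
    return positions, uvs, triangles
```

Raising nx and ny trades memory and setup time for accuracy, which matches the trade-off described above.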
The operation input section 110 allows the player to input an operation instruction, and outputs an operation signal that corresponds to the operation instruction to the processing section 200. The function of the operation input section 110 is implemented by a button switch, a lever, a mouse, a keyboard, a sensor, and the like. In the configuration described above, the steering wheel 1008, the accelerator pedal 1010, and the brake pedal 1009 correspond to the operation input section 110.
The processing section 200 controls the entire game system 1, and performs various calculation processes (e.g., game process and image generation process) based on a program and data read from the storage section 300, the operation signal input from the operation input section 110, and the like. The processing section 200 includes a game calculation section 210 that mainly performs a game execution calculation process, a stereoscopic image generation section 220, a sound generation section 230, and a communication control section 240.
The game calculation section 210 controls the car racing game by performing a process in accordance with a game control program 312 based on a game operation performed by the player using the operation input section 110, and the like.
The stereoscopic image generation section 220 includes a virtual viewpoint setting section 221, a virtual plane setting section 222, a perspective projection transformation section 223, and a projection image generation section 224, and generates a stereoscopic image displayed on the image display section 120 based on the results of calculations performed by the game calculation section 210. More specifically, the stereoscopic image generation section 220 generates a right-eye stereoscopic image based on the right-eye virtual viewpoint 10a and a left-eye stereoscopic image based on the left-eye virtual viewpoint 10b at a given image generation timing (e.g., at intervals of 1/120th of a second), and outputs image signals of the generated images to the image display section 120. The function of the stereoscopic image generation section 220 is implemented by a processor (e.g., digital signal processor (DSP)), a control program, a drawing frame IC memory (e.g., frame buffer), and the like.
The virtual viewpoint setting section 221 sets the virtual viewpoint 10 (right-eye virtual viewpoint 10a and left-eye virtual viewpoint 10b) that corresponds to each eye of the player (viewer) in the game space.
The virtual plane setting section 222 sets the virtual planes 20a and 20b in the game space in accordance with the above principle.
The perspective projection transformation section 223 generates the plane image 32 by perspective projection transformation of the game space onto the virtual plane 20 based on the virtual viewpoint 10. More specifically, the perspective projection transformation section 223 generates the plane images 30a and 30b by perspective projection transformation of the game space onto the virtual planes 20a and 20b based on the right-eye virtual viewpoint 10a, and combines the plane images 30a and 30b to generate the right-eye plane image 32a (see the drawings). The left-eye plane image 32b is generated in the same manner based on the left-eye virtual viewpoint 10b.
The plane image 32 thus generated is stored in a plane image buffer 330. The plane image buffer 330 includes a storage area for the right-eye plane image 32a and a storage area for the left-eye plane image 32b.
The projection image generation section 224 generates the projection image that corresponds to the right-eye virtual viewpoint and the projection image that corresponds to the left-eye virtual viewpoint using the plane image 32 generated by the perspective projection transformation section 223 and the pixel position correspondence map 50. The projection image thus generated is displayed on the image display section 120 as a stereoscopic image. More specifically, the projection image generation section 224 generates the right-eye projection image by setting, for each pixel of the right-eye projection image, the color information of the corresponding pixel of the right-eye plane image 32a, referring to a right-eye pixel position correspondence map 321. Likewise, the projection image generation section 224 generates the left-eye projection image by setting, for each pixel of the left-eye projection image, the color information of the corresponding pixel of the left-eye plane image 32b, referring to a left-eye pixel position correspondence map 322.
The projection image thus generated is stored in a projection image buffer 340. The projection image buffer 340 includes a storage area for the right-eye projection image and a storage area for the left-eye projection image.
The image display section 120 is a curved display that alternately displays the right-eye stereoscopic image and the left-eye stereoscopic image by time division based on the image signal input from the stereoscopic image generation section 220. The function of the image display section 120 is implemented by a combination of the curved screen 1004 and the projector 1006 illustrated in the drawings.
The sound generation section 230 generates a game sound (e.g., effect sound and background music (BGM)) used during the game, and outputs a sound signal of the generated game sound to the sound output section 130. The sound output section 130 outputs the game sound (e.g., effect sound and background music (BGM)) based on the sound signal input from the sound generation section 230. The function of the sound output section 130 is implemented by a speaker or the like.
The storage section 300 stores a system program that implements a function that causes the processing section 200 to integrally control the game system, a program and data necessary for causing the processing section 200 to execute the game, and the like. The storage section 300 is used as a work area for the processing section 200, and temporarily stores the results of calculations performed by the processing section 200 according to various programs, data input from the operation input section 110, and the like. The function of the storage section 300 is implemented by a storage device such as an IC memory, a hard disk, a CD-ROM, a DVD, an MO, a RAM, or a VRAM. In one embodiment, the storage section 300 stores the game control program 312, the right-eye pixel position correspondence map 321, the left-eye pixel position correspondence map 322, the plane image buffer 330, and the projection image buffer 340.
In the loop A process, the game calculation section 210 controls the game according to an operation instruction input by the player using the operation input section 110 (step A3). The stereoscopic image generation section 220 then generates a stereoscopic image.
Specifically, the virtual viewpoint setting section 221 sets the virtual viewpoint 10 (right-eye virtual viewpoint 10a and left-eye virtual viewpoint 10b) in the game space (step A5). The virtual plane setting section 222 sets the virtual planes 20a and 20b in the game space (step A7).
The perspective projection transformation section 223 then generates the right-eye plane image 32a by perspective projection transformation of the game space onto the virtual planes 20a and 20b based on the right-eye virtual viewpoint 10a (step A9). The projection image generation section 224 generates the right-eye projection image 40a based on the right-eye plane image 32a referring to the right-eye pixel position correspondence map 50a (step A11). The perspective projection transformation section 223 generates the left-eye plane image 32b by perspective projection transformation of the game space onto the virtual planes 20a and 20b based on the left-eye virtual viewpoint 10b (step A13). The projection image generation section 224 generates the left-eye projection image 40b based on the left-eye plane image 32b referring to the left-eye pixel position correspondence map 50b (step A15).
The right-eye projection image 40a and the left-eye projection image 40b are displayed on the image display section 120 (step A17). The game calculation section 210 then determines whether or not to terminate the game. When the game calculation section 210 has determined to terminate the game (step A19: YES), the game calculation section 210 terminates the loop A process to complete the game process.
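Steps A5 to A17 can be summarized, using the apply_pixel_map helper sketched earlier, as one per-frame pass. The rendering and display callables below are injected placeholders, since the actual drawing is performed by the embodiment's GPU and projector pipeline:

```python
def render_frame(render_plane_image, display_time_division,
                 right_eye, left_eye, map_right, map_left):
    """One iteration of steps A5-A17 (illustrative; callables are assumed).

    render_plane_image(eye) stands in for the perspective projection
    transformation of the game space onto the virtual planes 20a and 20b;
    display_time_division alternately shows the two projection images.
    """
    plane_r = render_plane_image(right_eye)        # step A9: plane image 32a
    proj_r = apply_pixel_map(plane_r, map_right)   # step A11: projection 40a
    plane_l = render_plane_image(left_eye)         # step A13: plane image 32b
    proj_l = apply_pixel_map(plane_l, map_left)    # step A15: projection 40b
    display_time_division(proj_r, proj_l)          # step A17
```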
According to one embodiment of the invention, the right-eye and left-eye virtual viewpoints 10 (10a and 10b) are set in the virtual three-dimensional space (game space), and a plurality of virtual planes 20a and 20b that cover at least the range, within the field of view of each virtual viewpoint 10, that corresponds to the range displayed on the screen 1004 are set in the line-of-sight direction of each virtual viewpoint 10. The plane images 32a and 32b are generated by perspective projection transformation of the virtual three-dimensional space onto the virtual planes 20a and 20b based on the virtual viewpoint 10. The projection image 40 is generated using the pixel position correspondence map 50 and the plane images 32a and 32b, the pixel position correspondence map 50 defining the relationship between the pixel position of the virtual planes 20a and 20b and the pixel position of the projection image 40 so that the plane images 32a and 32b are observed without distortion when observing the screen 1004 from the presumed viewing position. The projection image 40 thus generated is projected onto the curved screen 1004 through the wide-angle lens, and can be observed as a stereoscopic image when observing the screen 1004 from the presumed viewing position.
The following advantageous effect is obtained as a secondary effect. Specifically, when observing a 2D image (non-stereoscopic image) displayed on a curved screen, the curved screen may be recognized as a wall. According to the stereoscopic image obtained by applying the technology according to one embodiment of the invention, it is possible to create a sense of openness, as if the wall of the curved screen had been removed.
According to one embodiment of the invention, the screen 1004 is formed to be convex in the presumed viewing direction of the player who sits on the player's seat 1002, and the projector 1006 is disposed so that the direction from the projection center of the projector 1006 intersects the intersection position of the presumed viewing direction and the screen 1004. This makes it possible to suppress distortion of the stereoscopic image in the presumed viewing direction as compared with the end of the screen 1004, and improve the visibility of the stereoscopic image when the player who sits on the player's seat 1002 observes the stereoscopic image.
The virtual planes 20a and 20b are connected in the horizontal direction, and are set to face the virtual viewpoint 10. This makes it possible to minimize an increase in drawing load that may occur when performing the drawing calculation process on the stereoscopic image that is displayed on a dome screen.
The embodiments to which the invention can be applied are not limited to the above embodiments. Various modifications and variations may be appropriately made without departing from the scope of the invention.
The above embodiments have been described taking an example in which the relative positional relationship between the virtual planes 20a and 20b and the screen 1004 is fixed. Note that the relative positional relationship between the virtual planes 20a and 20b and the screen 1004 may be set variably.
The virtual planes 20a and 20b are set as planes that approximate the plane of projection of the curved screen 1004. Therefore, the curved screen 1004 and the virtual planes 20a and 20b are matched via partial scaling (e.g., in the way a Mercator projection maps the globe onto a cylinder). On the other hand, an accurately calculated parallax image is required in order to implement a stereoscopic image. Therefore, an area that can be easily observed as a stereoscopic image, and an area that is difficult to observe as a stereoscopic image, may be present on the screen (plane of projection) 1004 depending on the degree of scaling. In the example illustrated in the drawings, the degree of scaling is small in the vicinity of the contact position of the virtual planes 20a and 20b and the screen 1004, and increases with the distance from the contact position, so that an area near the contact position is relatively easy to observe as a stereoscopic image.
A position to which it is desired that the player pay attention, and a position to which the player tends to pay attention, differ depending on the game situation. For example, when the game is a racing game, the player may pay attention to the front direction when the player's car travels on a straight, level road, and may pay attention to an upper position in the front direction when the player's car travels on a straight uphill road. The player may pay attention to a position on the right side of the front direction when the player's car travels along a right-hand curve, since the player turns his eyes to a position ahead of the curve. Therefore, the virtual planes 20a and 20b (i.e., the relative positional relationship between the virtual planes 20a and 20b and the screen (plane of projection) 1004) may be set so that the attention point that may change depending on the game situation coincides with the contact position of the virtual plane 20a or 20b and the screen (plane of projection) 1004.
For example, the connection angle of the virtual planes 20a and 20b may be set variably. The connection angle may be gradually decreased when the attention point moves toward the right end or the left end of the screen 1004. In this case, the contact position is shifted toward the right end or the left end of the screen 1004. The connection angle may be gradually increased when the attention point moves toward the center of the screen 1004. In this case, the contact position is shifted toward the center of the screen 1004. Note that it is necessary to change the size of the virtual planes 20a and 20b along with a change in the connection angle, since the virtual planes 20a and 20b must continue to cover the range that corresponds to the range displayed on the screen 1004.
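The relationship between the contact position and the connection angle can be sketched for a horizontal cross-section, assuming a cylindrical screen centered on the viewpoint (a simplifying assumption; all names and values are illustrative):

```python
import math

def tangent_planes(radius, contact_phi, half_angle):
    """Horizontal geometry of two virtual planes tangent to a cylindrical screen.

    The screen is a cylinder of `radius` centered on the viewpoint; the two
    planes touch it at azimuths +/-contact_phi and must cover the screen out
    to +/-half_angle.  Returns the connection (dihedral) angle and, for the
    right plane, its seam and outer-edge endpoints in the horizontal (x, z)
    plane.  Moving contact_phi outward decreases the connection angle and
    shifts the contact position toward the end of the screen, and the plane
    must grow to keep covering the screen.
    """
    def on_tangent(theta):
        # Intersection of the viewing ray at azimuth theta with the tangent
        # line that touches the circle at azimuth contact_phi.
        r = radius / math.cos(theta - contact_phi)
        return (r * math.sin(theta), -r * math.cos(theta))

    connection_angle = math.pi - 2.0 * contact_phi
    seam = on_tangent(0.0)          # where the two planes connect
    outer = on_tangent(half_angle)  # outer edge needed to cover the screen
    return connection_angle, seam, outer

# Example: contact point moved from 20 deg to 40 deg off-center (assumed).
for phi in (math.radians(20), math.radians(40)):
    ang, seam, outer = tangent_planes(2.0, phi, math.radians(60))
    print(round(math.degrees(ang), 1), seam, outer)
```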
As illustrated in the drawings, the position of the virtual planes 20a and 20b may also be gradually changed so that the relative angle with respect to the virtual viewpoint 10 gradually changes; in this case, the contact position of the virtual planes 20a and 20b and the screen 1004 shifts corresponding to the change in the relative angle.
Note that it is also possible to change both the connection angle and the relative angle of the virtual planes 20a and 20b with respect to the virtual viewpoint 10.
The above embodiments have been described taking an example in which two virtual planes 20a and 20b are provided. Note that an arbitrary number of virtual planes may be set. For example, three virtual planes 20 may be connected in the shape of the character “C” when viewed from above (see the drawings).
Although only some embodiments of the present invention have been described in detail above, those skilled in the art will readily appreciate that many modifications are possible in the embodiments without materially departing from the novel teachings and advantages of this invention. Accordingly, all such modifications are intended to be included within the scope of this invention.
Number | Date | Country | Kind
---|---|---|---
2012-080193 | Mar. 30, 2012 | JP | national