The present invention relates to an image processing technique for a virtual reality space using a head mount display.
In recent years, virtual reality (VR) techniques using a head mount display (HMD) have become widespread. In a VR game that provides virtual reality through an HMD, a user can feel as if he/she actually exists in a virtual three-dimensional (3D) space.
[Patent Literature 1] Japanese Patent No. 5565258
[Patent Literature 2] Japanese Patent No. 5869177
In VR games, the high level of immersion makes simulator sickness likely to occur, which makes game play difficult.
One of the reasons for simulator sickness is considered to be a gap between an image displayed on an HMD and the sense of a user. For example, in a case where the user's head is turned to the right, if the direction of the virtual camera in a virtual 3D space is turned to the right together with the motion of the user's head, a gap does not occur between the image and the sense of the user. If the motion of the virtual camera is delayed or if the direction of the virtual camera is fixed and the virtual camera does not move together with the motion of the user's head, a gap occurs between the image and the sense of the user.
The present invention has been made in view of the above and aims to reduce the occurrence of simulator sickness in a VR space using an HMD.
To solve the above problem, the game device according to the present invention is a game device that provides a game for a user to play with a head mount display on, the game device including: an input unit configured to input a direction of a head of the user from the head mount display; and a camera control unit configured to control a virtual camera in a virtual space by reflecting the direction of the head of the user on a direction of a player character, which is operated by the user. The camera control unit controls the virtual camera by reflecting the direction of the head of the user on the direction of the player character even in a case where an operation to the player character by the user is restricted.
According to the present invention, even in a case where operations to a player character are restricted, the camera control unit reflects the motion of the user's head on the virtual camera, so that it is possible to reduce the occurrence of simulator sickness in a VR space using an HMD.
Hereinafter, an explanation is given of embodiments of the present invention with reference to the drawings.
[Game System]
The game system illustrated in
In the VR game according to the present embodiment, an image captured from the perspective of a player character in the virtual 3D space is displayed on the HMD 300. A typical example of such a VR game is a first-person shooter (FPS). The user can move the player character by operating the controller 200. The virtual camera is controlled such that images are captured from the first-person perspective of the player character in accordance with the position and the direction of the player character. The virtual camera is also controlled based on motion of the HMD 300 (i.e., motion of the user's head) in addition to the position and the direction of the player character. For example, when the user turns his/her head to the right, the HMD 300 detects the motion of the user's head and transmits, to the game machine 100, HMD information that causes the direction of the virtual camera to turn to the right. The user can look over the virtual 3D space by moving his/her head.
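For illustration, the following is a minimal C++ sketch of this control during normal play. The types HmdInfo and VirtualCamera, their field names, and the use of yaw/pitch/roll angles in degrees are assumptions made for the sketch and are not taken from the actual disclosure.

```cpp
#include <cstdio>

// Hypothetical contents of the "HMD information" sent to the game machine 100:
// the orientation of the user's head, in degrees.
struct HmdInfo {
    float headYawDeg;    // left/right turn of the head
    float headPitchDeg;  // up/down tilt of the head
    float headRollDeg;   // sideways tilt of the head
};

// Hypothetical first-person camera state in the virtual 3D space.
struct VirtualCamera {
    float yawDeg;
    float pitchDeg;
    float rollDeg;
};

// During normal play, the camera follows the player character's facing
// direction and then adds the head orientation reported by the HMD.
VirtualCamera updateCameraNormalPlay(float characterYawDeg, const HmdInfo& hmd) {
    VirtualCamera cam{};
    cam.yawDeg   = characterYawDeg + hmd.headYawDeg;
    cam.pitchDeg = hmd.headPitchDeg;
    cam.rollDeg  = hmd.headRollDeg;
    return cam;
}

int main() {
    HmdInfo hmd{30.0f, 0.0f, 0.0f};                          // user turns the head 30 degrees to the right
    VirtualCamera cam = updateCameraNormalPlay(90.0f, hmd);  // player character faces 90 degrees
    std::printf("camera yaw = %.1f deg\n", cam.yawDeg);      // prints 120.0
    return 0;
}
```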
There may be a case where a game program controls a player character without accepting operations by a user during the game. For example, in conventional non-VR games, during a cutscene where a boss character appears, the position and the direction of a player character are fixed and a scene where a boss character appears is displayed. In a case where the game program controls the player character without accepting operations by the user, unless motion of the HMD 300 is reflected on the control of the virtual camera, a gap occurs between images and the sense of the user. Therefore, in the present embodiment, the game program reflects motion of the HMD 300 on the virtual camera even in a situation where the player character is controlled without accepting operations by the user.
[Game Device]
The character control unit 51 normally controls the position and the direction of the player character in accordance with operations by the user. During an event, the character control unit 51 ignores operations by the user and controls the player character in accordance with event data. An event is executed when a predetermined condition is met in the game, and while the event is in progress the game proceeds in accordance with the event data. Event data is data relating to the contents of an event, such as data indicating motion of the player character, data indicating motion of an object, and data indicating motion of the virtual camera. An example of an event is an event where a boss character appears. The event data of such an event includes data for showing a cutscene where the boss character appears, such as motions and dialogues of the boss character. Generally, during an event where a boss character appears, the virtual camera is controlled to capture the boss character.
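One possible in-memory layout of such event data is sketched below. The keyframe fields and structure names are assumptions for illustration and are not taken from the actual game program.

```cpp
#include <string>
#include <vector>

// Scripted pose of a character or of the virtual camera at a point in time.
struct Keyframe {
    float timeSec;           // time from the start of the event
    float posX, posY, posZ;  // position in the virtual 3D space
    float yawDeg;            // facing direction
};

// Contents of one event, e.g. the appearance of a boss character.
struct EventData {
    std::vector<Keyframe> playerCharacterMotion;  // motion of the player character
    std::vector<Keyframe> objectMotion;           // motion of an object such as the boss character
    std::vector<Keyframe> virtualCameraMotion;    // motion of the virtual camera
    std::vector<std::string> dialogueLines;       // dialogues shown during the cutscene
};
```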
The camera control unit 52 normally sets the direction of the virtual camera, based on the direction of the player character and the direction of the user's head, which is included in HMD information received from the HMD 300. During an event, the camera control unit 52 decides the direction of the virtual camera by adding the direction of the user's head to the direction of the virtual camera that is set based on event data.
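As a sketch, continuing the hypothetical HmdInfo and VirtualCamera types from the sketch above (which are assumptions, not part of the disclosure), the event-time rule amounts to adding the head direction to the camera direction scripted by the event data:

```cpp
// During an event, the final camera direction is the direction set based on
// the event data plus the head direction reported by the HMD 300.
VirtualCamera updateCameraDuringEvent(const VirtualCamera& eventCamera, const HmdInfo& hmd) {
    VirtualCamera cam = eventCamera;   // direction set based on event data
    cam.yawDeg   += hmd.headYawDeg;    // add the direction of the user's head
    cam.pitchDeg += hmd.headPitchDeg;
    cam.rollDeg  += hmd.headRollDeg;
    return cam;
}
```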
The rendering unit 53 generates a two-dimensional (2D) image by capturing the virtual 3D space using the virtual camera. The rendering unit 53 generates a right-eye image and a left-eye image having parallax, so that, when the images are displayed on the displays 301A and 301B of the HMD 300, respectively, the user can see a 3D stereoscopic image on the HMD 300.
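As a sketch of how the parallax could be produced, the two eye viewpoints may be offset from the virtual camera position by half of an interpupillary distance, and the scene rendered once from each viewpoint. The vector type, the axis convention, and the IPD parameter are assumptions for the sketch.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

struct EyeViewpoints {
    Vec3 leftEye;
    Vec3 rightEye;
};

// The eye positions are offset from the camera position along the camera's
// horizontal right axis by half of the assumed interpupillary distance.
EyeViewpoints computeEyeViewpoints(Vec3 cameraPos, float cameraYawDeg, float ipdMeters) {
    const float rad = cameraYawDeg * 3.14159265f / 180.0f;
    // Assumed convention: yaw 0 faces +z and increases clockwise, so the
    // camera's right axis in the horizontal plane is (cos, 0, -sin).
    const Vec3 right{std::cos(rad), 0.0f, -std::sin(rad)};
    const float halfIpd = ipdMeters * 0.5f;
    return {
        {cameraPos.x - right.x * halfIpd, cameraPos.y, cameraPos.z - right.z * halfIpd},  // left eye
        {cameraPos.x + right.x * halfIpd, cameraPos.y, cameraPos.z + right.z * halfIpd},  // right eye
    };
}
```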
An explanation is given of an execution example of camera control in the present embodiment. Here, camera control is explained using an example of an event of holding a ladder.
During the event of holding the ladder 61, the character control unit 51 rotates the direction of the player character by 180 degrees so that the player character changes from the standing posture to the posture of holding the ladder 61. The character control unit 51 does not accept operations by the user until the player character assumes the posture of holding the ladder 61. The camera control unit 52 rotates the direction of the virtual camera by 180 degrees in accordance with the motion of the player character. When the direction of the virtual camera is rotated, the camera control unit 52 adds the direction of the user's head, which is detected by the HMD 300, to the direction of the virtual camera that is set based on the event data, so as to decide the final direction of the virtual camera. The control of the virtual camera is explained below.
For example, suppose that, in the situation of
Note that, although only the yaw angle is reflected in the explanation above, the pitch angle or the roll angle may be reflected as well.
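As a worked illustration with hypothetical numbers (the 30-degree head turn and the angles are assumptions, since the drawings are not reproduced here), reusing the sketch functions introduced above:

```cpp
// Hypothetical numbers for the ladder event: the event data has rotated the
// camera yaw from 0 degrees to 180 degrees, and the user keeps the head
// turned 30 degrees to the right during the event.
int main() {
    VirtualCamera eventCamera{180.0f, 0.0f, 0.0f};
    HmdInfo hmd{30.0f, 0.0f, 0.0f};
    VirtualCamera finalCamera = updateCameraDuringEvent(eventCamera, hmd);
    // finalCamera.yawDeg == 210.0f: the displayed image keeps following the
    // user's head even though the player character itself cannot be operated.
    return 0;
}
```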
As explained above, according to the present embodiment, in a case where operations to a player character are restricted and a virtual camera is controlled in accordance with event data, motion of the user's head, which is detected by the HMD 300, is reflected on the direction of the virtual camera that is set based on the event data. Therefore, even in a situation where the user cannot operate the player character as he/she wants, the image displayed on the HMD 300 moves together with the motion of his/her head. As a result, it is possible to reduce the occurrence of simulator sickness.
[Second Embodiment]
Since the virtual camera is controlled in accordance with event data during an event, there is a space that is not supposed to be seen. For example, in a case where event data for an event in woods is executed, trees are arranged in the virtual 3D space only in the directions within the perspective of the virtual camera that is set based on the event data. The virtual 3D space in the other directions is not supposed to be seen, and therefore no trees are arranged there. Because objects are not arranged in a space that is not supposed to be seen, it is possible to eliminate unnecessary data and to reduce the processing load for rendering. In a case where the direction of the user's head is reflected on the virtual camera even during an event as described in the first embodiment, there is a possibility that a space that is not supposed to be seen is displayed. Therefore, in the present embodiment, when a space that is not supposed to be seen is captured by the virtual camera, the screen is blacked out.
[Game Device]
As with the first embodiment, the character control unit 51 normally controls the player character in accordance with operation information and, during an event, controls the player character in accordance with event data.
As with the first embodiment, the camera control unit 52 reflects motion of the user's head, which is detected by the HMD 300, on the virtual camera even in a situation where the user cannot operate the player character.
The determining unit 55 determines whether the direction of the virtual camera is within a blackout-process range. The blackout-process range is a range preset by the game developers that contains a space that is not supposed to be seen, such as a space where rendering is omitted.
The rendering unit 53 blacks out the screen to be output to the HMD 300 in a case where the determining unit 55 determines that the direction of the virtual camera is within the blackout-process range.
The processes of
The camera control unit 52 inputs motion of the user's head detected by the HMD 300 (Step S11).
The camera control unit 52 adds the direction of the user's head to the direction of the virtual camera that is set based on event data, so as to calculate the direction of the virtual camera (Step S12).
The determining unit 55 determines whether the direction of the virtual camera is within the blackout-process range (Step S13). The blackout-process range may be specified based on event data. For example, a reference line and a threshold value are predetermined. In a case where the angle between the reference line and the direction of the virtual camera exceeds the threshold value, the determining unit 55 determines that the direction of the virtual camera is within the blackout-process range. For example, the reference line is a line connecting the player character and the boss character. Alternatively, the determining unit 55 regards the north in the virtual 3D space as 0 degrees and, clockwise, regards the east as 90 degrees, the south as 180 degrees, and the west as 270 degrees. Then, the blackout-process range may be specified using directions in the virtual 3D space.
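A minimal sketch of the determination in Step S13, assuming yaw angles in degrees and an example threshold of 45 degrees (the function names and the threshold value are assumptions):

```cpp
#include <cmath>

// Map any angle to (-180, 180] so that the comparison uses the smaller of the
// two angles between the camera direction and the reference line.
static float normalizeDeg(float a) {
    while (a > 180.0f)   a -= 360.0f;
    while (a <= -180.0f) a += 360.0f;
    return a;
}

// The camera direction is treated as being within the blackout-process range
// when its angle from the reference line (e.g. the line connecting the player
// character and the boss character) exceeds the threshold.
bool isWithinBlackoutRange(float cameraYawDeg, float referenceYawDeg, float thresholdDeg) {
    const float diff = std::fabs(normalizeDeg(cameraYawDeg - referenceYawDeg));
    return diff > thresholdDeg;
}

// Example with a 45-degree threshold:
//   isWithinBlackoutRange(50.0f, 0.0f, 45.0f) -> true  (blackout process is performed)
//   isWithinBlackoutRange(30.0f, 0.0f, 45.0f) -> false (normal rendering)
```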
In a case where the determining unit 55 determines that the direction of the virtual camera is within the blackout-process range (YES in Step S13), the rendering unit 53 performs the blackout process (Step S14). Specifically, the rendering unit 53 starts decreasing the luminance of the image in which the virtual 3D space is rendered. In a case where the direction of the virtual camera moves out of the blackout-process range after the blackout process is initiated, the rendering unit 53 cancels the blackout process and raises the luminance of the image back to the original luminance. Instead of the blackout process, the rendering unit 53 may increase the luminance of the image to white out the image. Alternatively, the rendering unit 53 may perform a mosaic process on the image to decrease the resolution of the image. Alternatively, the rendering unit 53 may display a pattern in the image. For example, the rendering unit 53 displays a large arrow in the image so as to indicate the direction in which the user should preferably look.
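One possible way to realize such a gradual blackout is sketched below with an assumed per-frame fade rate; the state structure and the fade logic are assumptions, not the actual implementation of the rendering unit 53.

```cpp
#include <algorithm>

// Luminance scale applied to the rendered image before it is output to the
// HMD: 1.0 keeps the original luminance, 0.0 is fully black.
struct BlackoutState {
    float luminanceScale = 1.0f;
};

// Called once per frame. While the camera direction is within the
// blackout-process range the luminance is faded down; once the direction
// leaves the range it is faded back up to the original luminance.
void stepBlackout(BlackoutState& state, bool withinBlackoutRange, float fadePerFrame) {
    if (withinBlackoutRange) {
        state.luminanceScale = std::max(0.0f, state.luminanceScale - fadePerFrame);
    } else {
        state.luminanceScale = std::min(1.0f, state.luminanceScale + fadePerFrame);
    }
    // A whiteout, a mosaic process, or drawing an arrow pattern could be
    // substituted for the luminance fade at this point.
}
```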
An explanation is given of an execution example of the blackout process in the present embodiment.
The screen of
Even in a case where the character control unit 51 fixes the player character, the user can move his/her head and look over the virtual 3D space as he/she wants.
When the user turns his/her head further to the left, the direction of the virtual camera is turned further to the left as well. As a result, the direction of the virtual camera enters the blackout-process range 85, and the rendering unit 53 initiates the blackout process.
The blackout-process range may be divided into multiple ranges so that a percentage of luminance decrease can be set for each range. For example, luminance is decreased by 50% from 45 degrees to 60 degrees, by 75% from 60 degrees to 75 degrees, and by 100% at 75 degrees or more.
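A sketch of such divided ranges using the example percentages above; taking the deviation angle from the reference line as the input is an assumption of the sketch.

```cpp
// Luminance decrease (in percent) as a function of how far the camera
// direction deviates from the reference line, using the example values above.
float luminanceDecreasePercent(float deviationDeg) {
    if (deviationDeg >= 75.0f) return 100.0f;  // fully blacked out
    if (deviationDeg >= 60.0f) return 75.0f;
    if (deviationDeg >= 45.0f) return 50.0f;
    return 0.0f;                               // outside the blackout-process range
}
```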
Note that, although the determination is made based on only the yaw angle in the explanation above, the determination may also be made based on the pitch angle and the roll angle.
As explained above, according to the present embodiment, by setting a blackout-process range in the virtual 3D space and performing a blackout process on an output image when the direction of the virtual camera is within the blackout-process range, it is possible not to display a space that is not supposed to be seen.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
JP2016-113579 | Jun 2016 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2017/020959 | 6/6/2017 | WO | 00

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO2017/213131 | 12/14/2017 | WO | A
References Cited

U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
9256069 | Wada | Feb 2016 | B2
10525352 | Noda | Jan 2020 | B2
20080180438 | Sasaki | Jul 2008 | A1
20150325027 | Herman et al. | Nov 2015 | A1
20170076496 | Inomata | Mar 2017 | A1

Foreign Patent Documents

Number | Date | Country
---|---|---
2003325969 | Nov 2003 | JP
5869177 | Jan 2016 | JP
5914739 | Apr 2016 | JP

Other Publications

Boucher, Robin, "Example of measures against VR sickness," GREE Advent Calendar 2015, Day 5, Dec. 5, 2015.
Number | Date | Country
---|---|---
20200306636 A1 | Oct 2020 | US