This disclosure relates to position designation in a virtual space, and more particularly, to identifying, in a virtual reality (VR) space or an augmented reality (AR) space, an object that is a target of an operator's operation so that the operator can operate on that object.
Patent Literatures 1 and 2 describe a technology of determining, based on a line of sight of an operator wearing a head-mounted display (HMD), a point on which the operator focuses his or her gaze, and displaying at that point a cursor or a pointer indicating the point of gaze.
However, with the technology described in Patent Literatures 1 and 2, it is difficult to designate a part of an object in the virtual space that has a small apparent area as viewed from the operator. This disclosure helps to enable easy designation of a predetermined position of an object in a virtual space.
According to at least one embodiment of this disclosure, there are provided a virtual space position designation method and device in which a provisional line of sight for designating a position in the virtual space is output not from the position of an eye of an operator in the virtual space but from a position separated from the position of the eye by a certain first distance in an up-down direction. An angle α is formed in a vertical direction between the provisional line of sight and an actual line of sight output from the position of the eye of the operator, so that the provisional line of sight intersects with the actual line of sight at a position separated by a certain second distance in a horizontal direction.
According to some embodiments of this disclosure, the predetermined position of the object in the virtual space can be easily designated.
Other features and advantages of this disclosure are made clear from the description of at least one embodiment of this disclosure, the accompanying drawings, and the appended claims.
First, details of at least one embodiment of this disclosure are listed and described. At least one embodiment of this disclosure has at least the following configuration.
(Item 1)
A virtual space position designation method for displaying a pointer indicating a place corresponding to a target of operation in a virtual space. The virtual space position designation method includes determining an actual line of sight and a provisional line of sight, the actual line of sight connecting a position A of an eye of an operator in the virtual space and a position C separated from the position A by a distance x in a horizontal direction of the virtual space, the provisional line of sight connecting the position C and a position B separated from the position A by a distance y1 in a vertical direction of the virtual space. The method further includes displaying a pointer indicating a position corresponding to a target of operation at a point at which the provisional line of sight intersects with an object in the virtual space. The method further includes rendering the virtual space including the pointer based on the actual line of sight. The method further includes moving the provisional line of sight based on movement of the actual line of sight.
(Item 2)
A virtual space position designation method for displaying a pointer indicating a place corresponding to a target of operation in a virtual space according to Item 1, in which the position B is set at a position higher than the position A by the distance y1 in the vertical direction of the virtual space, and the position C is set at a position lower than the position A by a distance y2 in the vertical direction of the virtual space.
(Item 3)
A virtual space position designation method for displaying a pointer indicating a place corresponding to a target of operation in a virtual space according to Item 1, in which the position B is set at a position lower than the position A by the distance y1 in the vertical direction of the virtual space, and the position C is set at a position higher than the position A by a distance y2 in the vertical direction of the virtual space.
(Item 4)
A virtual space position designation method according to any one of Items 1 to 3, in which the displayed pointer is modified and displayed in such a form that the pointer adheres to a surface of the object in the virtual space. The virtual space position designation method described in Item 4 is capable of displaying, in an emphasized manner, whether the pointer is present on an upper surface of the object or on another surface. Thus, operability is improved.
(Item 5)
A system for executing the method of any one of Items 1 to 4.
(Item 6)
A non-transitory computer readable medium having recorded thereon a program for causing the system to implement the method of any one of Items 1 to 4.
(Item 7)
A virtual space position designation system configured to display a pointer indicating a place corresponding to a target of operation in a virtual space. The virtual space position designation system includes an initial line-of-sight calculation means for determining an actual line of sight and a provisional line of sight, the actual line of sight connecting a position A of an eye of an operator in the virtual space and a position C separated from the position A by a distance x in a horizontal direction of the virtual space, the provisional line of sight connecting the position C and a position B separated from the position A by a distance y1 in a vertical direction of the virtual space. The system further includes a pointer display means for displaying a pointer indicating a position corresponding to a target of operation at a point at which the provisional line of sight intersects with an object in the virtual space. The system further includes a field-of-view image generation means for rendering the virtual space including the pointer based on the actual line of sight. The system further includes a line-of-sight movement means for moving the provisional line of sight based on movement of the actual line of sight.
(Item 8)
A virtual space position designation system configured to display a pointer indicating a place corresponding to a target of operation in a virtual space according to Item 7, in which the initial line-of-sight calculation means is configured to set the position B at a position higher than the position A by the distance y1 in the vertical direction of the virtual space and set the position C at a position lower than the position A by a distance y2 in the vertical direction of the virtual space.
(Item 9)
A virtual space position designation system configured to display a pointer indicating a place corresponding to a target of operation in a virtual space according to Item 7, in which the initial line-of-sight calculation means is configured to set the position B at a position lower than the position A by the distance y1 in the vertical direction of the virtual space and set the position C at a position higher than the position A by a distance y2 in the vertical direction of the virtual space.
(Item 10)
A virtual space position designation system according to any one of Items 7 to 9, in which the pointer displayed by the pointer display means is modified and displayed in such a form that the pointer adheres to a surface of the object in the virtual space. The virtual space position designation system described in Item 10 is capable of displaying, in an emphasized manner, whether the pointer is present on an upper surface of the object or on another surface. Thus, operability is improved.
Now, at least one embodiment of this disclosure is described with reference to the drawings. In at least one embodiment, the description is based on the premise of the following immersive virtual space. In the immersive virtual space, a head-mounted display (HMD) including various sensors (for example, an acceleration sensor and an angular velocity sensor) and capable of measuring its own posture data is used, and this posture data is used to scroll an image displayed on the head-mounted display (HMD), to thereby achieve movement of a line of sight in the virtual space. However, this disclosure can also be applied to a case in which a virtual space is displayed on a normal display and the line of sight in the virtual space is moved based on input performed on a keyboard, a mouse, a joystick, or other devices. Further, the virtual space herein is a three-dimensional virtual space, but the virtual space is not necessarily limited thereto.
Further, in the drawings, like components are denoted by like reference symbols.
In
A point D is a point vertically separated from the point C by the first distance y1. Therefore, the straight line AC is parallel to a straight line BD. A straight line BC and the straight line BD intersect with each other at the point B at an angle α, and the straight line BC and the straight line AC intersect with each other at the point C at the angle α. Therefore, the straight line BC indicates a view of looking downward at an angle β-α. The straight line BC corresponds to a provisional line of sight for designating the object being a target of operation.
The positional relationship among the points A, B, C, and D may be inverted upside down, and the straight line AC representing the actual line of sight may indicate an upward-looking view.
In at least one embodiment, a pointer is displayed at a point at which the straight line BC corresponding to the provisional line of sight for designating the object being the target of operation intersects with the object, and the object designated by the pointer is set as the target of operation. As initial settings, the height y0, the first distance y1, the second distance x, the third distance y2, and the angles α and β may be set in accordance with characteristics of, for example, the object being the target of operation or a game that uses the object.
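By way of a non-limiting illustration, the following Python sketch places the points A, B, C, and D in a vertical plane for assumed values of y0, y1, y2, and x, confirms that the straight line AC is parallel to the straight line BD, and computes the angles α and β; the specific numeric values are assumptions chosen for the example only, not prescribed initial settings.

```python
import numpy as np

y0, y1, y2, x = 1.6, 0.5, 0.3, 2.0               # assumed initial settings

A = np.array([0.0, y0])                           # eye position (horizontal, vertical)
B = A + np.array([0.0, y1])                       # origin of the provisional line of sight
C = A + np.array([x, -y2])                        # intersection of the two lines of sight
D = C + np.array([0.0, y1])                       # point C shifted vertically by y1

def angle_deg(u, v):
    """Angle between two vectors, in degrees."""
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# AC is parallel to BD, because B and D are obtained from A and C
# by the same vertical shift y1 (the 2-D cross product is zero).
assert abs((C - A)[0] * (D - B)[1] - (C - A)[1] * (D - B)[0]) < 1e-12

beta = angle_deg(C - A, np.array([1.0, 0.0]))     # downward angle of the actual line AC
alpha = angle_deg(C - A, C - B)                   # angle between AC and the provisional BC
print(f"beta = {beta:.2f} deg, alpha = {alpha:.2f} deg")
```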
Now, with reference to
In
In
In the case of
In
In
With use of the straight line BC corresponding to the provisional line of sight for designating the object being the target of operation instead of the straight line AC corresponding to the actual line of sight, for example in
So far, description has been given only of the change in distance between the operator and the object O, that is, movement of the operator or the object O in a Z-axis direction of
First, as a first example of the movement of the line of sight of the operator, there is given movement of the line of sight of the operator in a right-left horizontal direction. This movement occurs when the operator turns his or her head, i.e., the field of view is moved right or left due to rotation in a yaw angle direction about a Y axis of
Next, as a second example of the movement of the line of sight of the operator, there is given a case in which the operator himself or herself moves in the up-down direction in the virtual space, that is, the operator himself or herself moves in the Y-axis direction of
Further, as a third example of the movement of the line of sight of the operator, there is given a case in which the head is inclined to right or left, that is, rotation is obtained in a roll angle direction about the Z axis of
In contrast, as a fourth example of the movement of the line of sight of the operator, there is given a case in which the head is shaken up and down, that is, the line of sight of the operator is swung in the up-down direction due to rotation in a pitch angle direction about the X axis of
As the processing for the fourth example, many methods can be conceived, and at least three representative methods are described below.
A first method is described with respect to
A second method is described with respect to
A third method is described with respect to
In the first method, a change in the angle β for looking downward does not cause a change in the position of the pointer P on the object O. Therefore, when the pointer P is mainly moved only in the right-left direction in the virtual space, an unconscious change in the angle β for looking downward does not affect the position of the pointer P on the object O, which is convenient. On the other hand, in order to move the pointer P on the object O in the up-down direction of the field of view, the distance between the operator and the object O needs to be changed in the virtual space, or the operator himself or herself needs to move in the up-down direction; hence, this method is not suitable for a case in which the pointer P needs to be moved in the virtual space or on the object O in the up-down direction of the field of view.
In contrast, in the second method and the third method, a change in the angle β for looking downward causes a change in the position of the pointer P on the object O, and hence those methods are suitable for the case in which the pointer P needs to be moved in the virtual space or on the object O in the up-down direction of the field of view. However, when the angle β for looking downward changes, both the position P′ of the point of gaze on the object O and the pointer P on the object O move, but they move differently, and hence the operation may become difficult.
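As a non-limiting sketch of this contrast, the following Python fragment compares a first-method-style update, in which the provisional line of sight ignores changes in the downward-looking angle β caused by pitch rotation, with a pitch-following update; the pitch-following rule is an assumption standing in for the second and third methods, whose exact definitions depend on the drawings.

```python
import math

def pointer_depth_first_method(b_height, base_angle):
    """First method: pointer depth uses a fixed provisional angle; pitch is ignored."""
    return b_height / math.tan(base_angle)

def pointer_depth_pitch_following(b_height, base_angle, pitch_delta):
    """Pointer depth follows the current pitch (an assumed method-2/3-style rule)."""
    return b_height / math.tan(base_angle + pitch_delta)

# Shaking the head up and down changes pitch_delta (radians) but does not
# change the first method's result; the pitch-following result moves.
for pitch in (-0.1, 0.0, 0.1):
    fixed = pointer_depth_first_method(2.1, math.radians(25))
    moving = pointer_depth_pitch_following(2.1, math.radians(25), pitch)
    print(f"pitch {pitch:+.1f}: first method {fixed:.2f}, pitch-following {moving:.2f}")
```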
As described above, at least one embodiment has the effect that a user is able to easily perform operations on the upper surface of the object O, for example, raising or crushing the object O. Further, when the pointer P is modified and displayed in such a form that the pointer P adheres to the surface of the object O, whether the pointer P is present on the upper surface of the object O or on another surface can be displayed in an emphasized manner. Thus, the operability is improved.
As specific examples,
In those examples, the pointer P is a three-dimensional object having a thickness, and hence whether the pointer P is present on the upper surface of the object O or on another surface is displayed in a more emphasized manner. However, even when the pointer P has no thickness, modifying and displaying the pointer P in such a form that the pointer P adheres to the surface of the object O can make clear whether the pointer P is present on the upper surface of the object O or on another surface.
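As a non-limiting sketch of one way to render such adherence, the following Python fragment builds an orthonormal basis from the surface normal at the point struck by the provisional line of sight, so that a flat pointer drawn in that basis lies on the surface; the basis construction is an assumption for illustration, not the embodiment's actual rendering code.

```python
import numpy as np

def pointer_basis(normal):
    """Build an orthonormal basis whose third axis is the surface normal,
    so a flat pointer drawn in the first two axes lies on the surface."""
    n = normal / np.linalg.norm(normal)
    # Pick a helper axis not nearly parallel to the normal.
    helper = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(helper, n)
    u /= np.linalg.norm(u)
    v = np.cross(n, u)
    return u, v, n

# An upper surface (normal pointing up) versus a side surface (normal pointing
# sideways) yields visibly different pointer orientations, which is what the
# emphasis between "upper surface" and "another surface" relies on.
for name, n in (("upper", np.array([0.0, 1.0, 0.0])), ("side", np.array([1.0, 0.0, 0.0]))):
    u, v, w = pointer_basis(n)
    print(name, "->", u, v, w)
```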
Step S1501 is an initial line-of-sight calculation step. In this step, the actual line of sight, which connects the position A of the eye of the operator in the virtual space and the position C separated from the position A by the distance x in the horizontal direction of the virtual space, and the provisional line of sight, which connects the position C and the position B separated from the position A by the distance y1 in the vertical direction of the virtual space, are determined as initial values.
Step S1502 is a pointer display step. The pointer P representing a place corresponding to the target of operation is displayed at a point at which the provisional line of sight intersects with the object in the virtual space. When the pointer P is displayed as in
Step S1503 is a field-of-view image generation step. In the field of view that is based on the actual line of sight, the virtual space including the pointer is rendered.
Step S1504 is a line-of-sight movement step. The provisional line of sight is moved along with the movement of the actual line of sight, which occurs when the head of the operator is turned so that the field of view is moved to the right or left, when the operator himself or herself moves in the virtual space in the horizontal direction, or when the operator himself or herself moves in the virtual space in the up-down direction. The processing for the case in which the line of sight of the operator is swung in the up-down direction, which occurs when the head is shaken up and down or the like, is as described above with reference to the description of
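A non-limiting Python sketch of steps S1501 to S1504 as a single update loop follows; the yaw rotation and operator movement are mocked inputs, whereas in the embodiment they would come from the sensors described below, and the flat upper surface used for the intersection is an assumption made to keep the example self-contained.

```python
import numpy as np

def actual_and_provisional(a, x, y1, y2, yaw):
    """S1501: derive the actual line AC and provisional line BC from A."""
    forward = np.array([np.sin(yaw), 0.0, np.cos(yaw)])   # Y up, Z toward the object
    c = a + forward * x + np.array([0.0, -y2, 0.0])       # position C
    b = a + np.array([0.0, y1, 0.0])                      # position B
    return (a, c - a), (b, c - b)                         # (origin, direction) pairs

def place_pointer(ray, surface_height):
    """S1502: pointer P where the provisional ray meets a horizontal surface."""
    origin, direction = ray
    if abs(direction[1]) < 1e-9:
        return None                                       # ray parallel to the surface
    t = (surface_height - origin[1]) / direction[1]
    return origin + t * direction if t > 0 else None

a, yaw = np.array([0.0, 1.6, 0.0]), 0.0
for frame in range(3):
    actual, provisional = actual_and_provisional(a, 2.0, 0.5, 0.3, yaw)
    p = place_pointer(provisional, surface_height=0.8)
    print(f"frame {frame}: pointer {p}")                  # S1503 would render the view here
    yaw += 0.05                                           # S1504: the provisional line
    a = a + np.array([0.0, 0.0, 0.1])                     # follows the moving actual line
```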
In
The head-mounted display (HMD) 1610 includes a display 1612 and a sensor 1614. The display 1612 is a non-transmissive display device configured to completely cover a field of view of a user. The user can see only a screen displayed on the display 1612. The user wearing the non-transmissive head-mounted display (HMD) 1610 entirely loses his or her field of view of the outside world. Therefore, there is obtained a display mode in which the user is completely immersed in the virtual space displayed by an application executed in the control circuit unit 1620. In at least one embodiment, the display 1612 is a partially transmissive display device.
The sensor 1614 included in the head-mounted display (HMD) 1610 is fixed near the display 1612. The sensor 1614 includes a geomagnetic sensor, an acceleration sensor, and/or an inclination (angular velocity or gyro) sensor. With use of at least one of those sensors, various movements of the head-mounted display (HMD) 1610 (display 1612) worn on the head of the user can be detected. Particularly in the case of the angular velocity sensor, as illustrated in
With reference to
Referring back to
Instead, the control circuit unit 1620 may be mounted on the head-mounted display (HMD) 1610 as an object operation device. In this case, the control circuit unit 1620 can perform all or only a part of the functions of the object operation device. When only a part of the functions is performed by the control circuit unit 1620 mounted on the HMD 1610, the remaining functions may be performed by the head-mounted display (HMD) 1610 or by a server computer (not shown) via a network.
The position tracking camera (position sensor) 1630 included in the system 1600 is connected to the control circuit unit 1620 so as to enable communication therebetween, and has a function of tracking the position of the head-mounted display (HMD) 1610. The position tracking camera (position sensor) 1630 is implemented with use of an infrared sensor or a plurality of optical cameras. The system 1600 includes the position tracking camera (position sensor) 1630 configured to detect the position of the head-mounted display (HMD) on the user's head, and thus the system 1600 can accurately associate the virtual space position of the virtual camera with the user immersed in the three-dimensional virtual space.
More specifically, the position tracking camera (position sensor) 1630 detects over time actual space positions of a plurality of detection points, which are virtually provided on the head-mounted display (HMD) 1610, as in
Referring back to
The movement detection unit 1910 measures the movement data of the head-mounted display (HMD) 1610 worn on the head of the user based on the input of the movement information from the sensor 1614 or the position tracking camera (position sensor) 1630. In this disclosure, in particular, the angle information detected over time by the inclination sensor 1614 and the position information detected over time by the position tracking camera (position sensor) 1630 are determined.
The field-of-view movement unit 1920 determines the field-of-view information based on three-dimensional virtual space information stored in a space information storage unit 1950, and on detection information of a field-of-view direction of the virtual camera, which is based on the angle information detected by the inclination sensor 1614 and the position information detected by the position sensor 1630. An actual line-of-sight movement unit 1922 included in the field-of-view movement unit 1920 determines the actual line of sight in the three-dimensional virtual space, that is, the movement of the straight line AC, based on the field-of-view information. When the actual line of sight can be moved by detecting the movement of the eyeballs and using some auxiliary input, the field-of-view movement unit 1920 and the actual line-of-sight movement unit 1922 further perform processing for this operation. The actual line-of-sight movement unit 1922 performs processing corresponding to the line-of-sight movement step S1504 together with a virtual line-of-sight movement unit 1946 to be described later, and can be treated as a line-of-sight movement unit as a whole. The processing for the case in which the line of sight of the operator is swung in the up-down direction, which occurs when the head is shaken up and down or the like, is as described above with reference to
The field-of-view image generation unit 1930 generates a field-of-view image based on the field-of-view information and the position of the pointer P transmitted from the pointer control unit 1940, and performs processing corresponding to the field-of-view image generation step S1503.
The pointer control unit 1940 is a unit that controls the pointer in the field-of-view image. Specifically, the pointer control unit 1940 includes an initial line-of-sight calculation unit 1942, a pointer display unit 1944, and the virtual line-of-sight movement unit 1946.
The initial line-of-sight calculation unit 1942 sets the initial values of both of the actual line of sight, that is, the straight line AC, and the provisional line of sight, that is, the straight line BC, and performs processing corresponding to the initial line-of-sight calculation step S1501.
The pointer display unit 1944 places the pointer P at a point at which the provisional line of sight, that is, the straight line BC, intersects with the object O, and performs processing corresponding to the pointer display step S1502. When the pointer is displayed as in
The virtual line-of-sight movement unit 1946 moves the provisional line of sight, that is, the straight line BC, in accordance with the movement of the actual line of sight, that is, the straight line AC. The virtual line-of-sight movement unit 1946 performs processing corresponding to the line-of-sight movement step S1504 together with the actual line-of-sight movement unit 1922 described above, and can be treated as the line-of-sight movement unit as a whole. The processing for the case in which the line of sight of the operator is swung in the up-down direction, which occurs when the head is shaken up and down or the like, is as described above with reference to
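By way of a non-limiting structural sketch, the following Python skeleton mirrors how the units described above might be composed; the class and attribute names echo the reference numerals of the description but are assumptions made for illustration, not the actual implementation.

```python
class PointerControlUnit:                        # unit 1940
    """Groups the pointer-related units described above."""
    def __init__(self, initial_sight, display, mover):
        self.initial_line_of_sight = initial_sight   # unit 1942, step S1501
        self.pointer_display = display               # unit 1944, step S1502
        self.virtual_sight_mover = mover             # unit 1946, step S1504 (with 1922)

class ObjectOperationDevice:                     # control circuit unit 1620
    """Composes the movement, field-of-view, rendering, and pointer units."""
    def __init__(self, movement, view_mover, image_gen, pointer_ctrl):
        self.movement_detection = movement           # unit 1910
        self.field_of_view_mover = view_mover        # unit 1920 (contains 1922)
        self.view_image_generator = image_gen        # unit 1930, step S1503
        self.pointer_control = pointer_ctrl          # unit 1940
```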
The respective elements in
This disclosure has been described above with reference to at least one embodiment, but this disclosure is not limited to the details mentioned above. A person skilled in the art would understand that various modifications can be made to the at least one embodiment as long as the modifications do not deviate from the spirit and scope of this disclosure described in the appended claims.
The present application is a National Stage of PCT International Application No. PCT/JP2016/066812, filed Jun. 6, 2016, which claims priority to Japanese Patent Application No. 2015-119250, filed Jun. 12, 2015.