The present invention pertains to a data processing technique, and particularly pertains to an information processing apparatus and an adjustment screen display method.
Image display systems that enable a user wearing a head-mounted display (HMD) to view a target space from any viewpoint are becoming widespread. For example, electronic content that realizes virtual reality (VR) by taking a virtual three-dimensional space as a display target and causing a head-mounted display to display an image corresponding to the user's line-of-sight direction is known. Using a head-mounted display makes it possible to heighten the sense of immersion in a video and improve operability for an application such as a game. In addition, walkthrough systems have also been developed that allow a user wearing a head-mounted display to virtually walk around a space displayed as a video by physically moving.
In a case where the distance between the lens for a left eye and the lens for a right eye (hereinafter, also referred to as an “inter-lens distance”) which are provided in a head-mounted display is not set appropriately, display by the head-mounted display may appear blurred to a user. Accordingly, the user needs to appropriately set the inter-lens distance for the head-mounted display.
The present invention is made in light of such a problem, and one objective of the present invention is to provide a technique for assisting setting of the inter-lens distance for a head-mounted display.
In order to solve the above-described problem, an information processing apparatus according to a certain aspect of the present invention includes an adjustment screen generation unit that generates an adjustment screen for allowing a user who is wearing a head-mounted display to adjust an inter-lens distance for the head-mounted display, and a display control unit that causes the head-mounted display to display the adjustment screen. The adjustment screen generation unit disposes, in the adjustment screen, a lens image indicating a lens in the head-mounted display and also disposes, in the adjustment screen, a pupil image that indicates a pupil of the user in reference to an eye tracking result.
Another aspect of the present invention is an adjustment screen display method. In this method, a computer executes a step for generating an adjustment screen for allowing a user who is wearing a head-mounted display to adjust an inter-lens distance for the head-mounted display, and a step for causing the head-mounted display to display the adjustment screen. In the adjustment screen, a lens image indicating a lens in the head-mounted display is disposed, and a pupil image that indicates a pupil of the user on the basis of an eye tracking result is also disposed.
Note that any combination of the above components, and the expression of the present invention converted among a system, a computer program, a recording medium on which the computer program is readably recorded, a data structure, and the like, are also effective as aspects of the present invention.
By virtue of the present invention, it is possible to assist setting of an inter-lens distance for a head-mounted display.
The present embodiment pertains to an image display system that displays an application image on a head-mounted display that is worn on the head of a user. The head-mounted display may also be referred to as a VR headset.
The output mechanism section 102 includes a housing 108 shaped to cover the left and right eyes of a user in a state where the user wears the head-mounted display 100, and is internally provided with a display panel that faces the eyes when the head-mounted display 100 is worn. The display panel of the head-mounted display 100 in the embodiment is assumed to be non-transparent. In other words, the head-mounted display 100 in the embodiment is a non-transmissive head-mounted display.
The inside of the housing 108 is further provided with eyepiece lenses (a left lens 114 and a right lens 116 that are described below) that are positioned between the display panel and the user's eyes when the head-mounted display 100 is worn and that enlarge the user's viewing angle. The head-mounted display 100 may further be provided with speakers or earphones at positions corresponding to the user's ears when worn. In addition, the head-mounted display 100 incorporates a motion sensor and detects translational motion and rotational motion of the head of the user who is wearing the head-mounted display 100, as well as the position or orientation of the head at each point in time.
In addition, the head-mounted display 100 is provided with a stereo camera 110 on the front surface of the housing 108. The stereo camera 110 captures a video of the surrounding real space with a field of view corresponding to the user's line of sight. If the captured image is displayed immediately, it is possible to realize what is generally called video see-through, in which the situation of the real space in the direction the user is facing can be seen as it is. Moreover, it is possible to realize augmented reality (AR) if a virtual object is drawn over an image of a real object appearing in the captured image. Note that the number of cameras is not limited to any specific number; the head-mounted display 100 may be provided with one camera, or may be provided with three or more cameras.
In addition, the head-mounted display 100 is provided with an adjustment dial 112 on an upper section of the housing 108. The adjustment dial 112 is a member for adjusting the inter-lens distance for the head-mounted display 100. The user turns the adjustment dial 112 to thereby lengthen or shorten the inter-lens distance for the head-mounted display 100.
The image generation apparatus 200 is an information processing apparatus that identifies the position of a viewpoint and the direction of a line of sight according to the position and orientation of the head of the user who is wearing the head-mounted display 100, generates a display image having a field of view corresponding thereto, and outputs the display image to the head-mounted display 100. The image generation apparatus 200 may be a stationary game device, a personal computer (PC), or a tablet terminal. While the image generation apparatus 200 can execute various applications pertaining to VR or AR, in the embodiment, it is assumed that the image generation apparatus 200 generates a display image of a virtual world that is a game stage for causing an electronic game (hereinafter, also referred to as a “VR game”) to progress, and causes the head-mounted display 100 to display this display image.
Note that the image generation apparatus 200 may generate a moving image for the purpose of enjoyment or information provision, irrespective of whether for a virtual world or the real world, and cause the head-mounted display 100 to display this moving image. In addition, the image generation apparatus 200 may cause the head-mounted display 100 to display a panoramic image having a wide angle of view centered on the user's viewpoint, whereby it is possible to give the user a deep sense of immersion in the display world.
The controller 140 is an input apparatus (for example, a game controller) that is held in the user's hand and receives operations from the user. An operation by the user includes an operation for controlling image generation in the image generation apparatus 200 and an operation for controlling image display in the head-mounted display 100. The controller 140 is connected to the image generation apparatus 200 by wireless communication and transmits data indicating operations by the user to the image generation apparatus 200. As a variation, one of or both the head-mounted display 100 and the controller 140 may be connected to the image generation apparatus 200 by wired communication via a signal cable or the like.
The image generation apparatus 200 obtains the viewpoint position and line-of-sight direction (hereinafter, these may be inclusively referred to as a “viewpoint”) of the user 12 from the head-mounted display 100 at a predetermined rate, and changes the position and direction of the view screen 14 according to the viewpoint. As a result, the head-mounted display 100 can be caused to display an image with a field of view corresponding to the user's viewpoint. In addition, if the image generation apparatus 200 generates a stereo image having parallax and causes the left and right regions of the display panel of the head-mounted display 100 to display the stereo image, the user 12 can stereoscopically view the virtual space. As a result, the user 12 can experience virtual reality as if the user 12 were in a room in the display world.
The communication unit 232 includes a peripheral interface such as a universal serial bus (USB) or Institute of Electrical and Electronics Engineers (IEEE) 1394 interface, or a network interface such as a wired local area network (LAN) or wireless LAN interface. The storage unit 234 includes, inter alia, a hard disk drive or a non-volatile memory. The output unit 236 outputs data to the head-mounted display 100. The input unit 238 accepts input of data from the head-mounted display 100, and also accepts input of data from the controller 140. The recording medium driving unit 240 drives a removable recording medium such as a magnetic disk, an optical disc, or a semiconductor memory.
The CPU 222 executes an operating system stored in the storage unit 234 and thereby controls the entirety of the image generation apparatus 200. In addition, the CPU 222 executes various programs (for example, a VR game application or the like) that have been read out from the storage unit 234 or a removable recording medium and loaded into the main memory 226, or downloaded via the communication unit 232. The GPU 224 has a geometry engine function and a rendering processor function, performs a drawing process in accordance with a drawing command from the CPU 222, and outputs a drawing result to the output unit 236. One of or both the CPU 222 and the GPU 224 can be referred to as a processor. The main memory 226 includes a random access memory (RAM), and stores data or a program that is necessary for processing.
The CPU 120 processes information obtained from each unit in the head-mounted display 100 via the bus 128, and also supplies the audio output unit 126 or the display unit 124 with audio data or a display image obtained from the image generation apparatus 200. The main memory 122 stores data or a program necessary for processing by the CPU 120.
The display unit 124 includes a display panel that is a liquid-crystal panel, an organic electroluminescence (EL) panel, or the like, and displays an image in front of the eyes of the user who is wearing the head-mounted display 100. The display unit 124 displays a pair of stereo images on a left eye display panel that is provided in front of the user's left eye and a right eye display panel that is provided in front of the user's right eye, to thereby realize stereoscopic vision.
The display unit 124 also includes a pair of lenses that are used for expanding the user's viewing angle and are positioned between the user's eyes and the display panel when the head-mounted display 100 is being worn. The pair of lenses include the left lens 114 and the right lens 116. The left lens 114 is provided between the left eye display panel and the user's left eye, and the right lens 116 is provided between the right eye display panel and the user's right eye. The adjustment dial 112 is mechanically or electrically connected to the left lens 114 and the right lens 116, and adjusts the inter-lens distance between the left lens 114 and the right lens 116. The inter-lens distance is, for example, the distance between the center of the left lens 114 and the center of the right lens 116.
The audio output unit 126 includes speakers or earphones provided at positions corresponding to the user's ears when the head-mounted display 100 is being worn, and allows the user to hear audio. The communication unit 132 is an interface for sending data to and receiving data from the image generation apparatus 200, and realizes communication by using a known wireless communication technology such as Bluetooth (registered trademark).
The motion sensor 134 includes a gyro sensor and an acceleration sensor, and obtains the angular velocity and acceleration of the head-mounted display 100. The eye tracking sensor 136 is a publicly known sensor used for eye tracking. Eye tracking, which can also be called line-of-sight measurement, is a technique for detecting the position of, motion of, and line-of-sight direction of a user's pupil (in other words, the eyeball). For example, the eye tracking sensor 136 uses infrared rays or the like to detect the position of and motion of the user's pupil.
Data transmitted from the head-mounted display 100 to the image generation apparatus 200 via the communication unit 132 includes the following content.
Description will be given regarding features of the image display system 10 according to the embodiment. The image display system 10 provides an adjustment screen which is a user interface that allows a user who is wearing the head-mounted display 100 to adjust the inter-lens distance for the head-mounted display 100. Lens images indicating the left lens 114 and the right lens 116 of the head-mounted display 100 are disposed in the adjustment screen, according to the orientation of the head-mounted display 100. In addition, pupil images indicating the user's pupils (left eye and right eye) are disposed in the adjustment screen, in reference to an eye tracking result. As a result, assistance is given such that adjustment of the inter-lens distance for the head-mounted display 100 by the user is facilitated.
The image generation apparatus 200 is provided with a data processing unit 250 and a data storage unit 252. The data storage unit 252 corresponds to the storage unit 234 described above.
The data processing unit 250 executes various kinds of data processing. The data processing unit 250 transmits data to and receives data from the head-mounted display 100 and the controller 140 via the communication unit 232, the output unit 236, and the input unit 238 described above.
The data processing unit 250 includes a system unit 260, an App execution unit 262, and a display control unit 264. The functions of the plurality of functional blocks included in the data processing unit 250 may be implemented by a computer program. It may be that a processor in the image generation apparatus 200 (for example, the CPU 222 and the GPU 224) reads out the abovementioned computer program which is stored in storage in the image generation apparatus 200 (for example, the storage unit 234) into the main memory 226 and executes the computer program to thereby exhibit the functionality of the above-described plurality of functional blocks.
The App execution unit 262 reads out data pertaining to an application (a VR game in the embodiment) selected by the user from the data storage unit 252, and executes the application selected by the user. In reference to a camera image obtained by the system unit 260, the position and orientation of the head-mounted display 100 that are obtained by the system unit 260, and the user's line-of-sight direction that is measured by the system unit 260, the App execution unit 262 generates a VR image indicating a result of executing the VR game. The VR image includes a left eye image and a right eye image.
The display control unit 264 transmits data for various VR images generated by the App execution unit 262 to the head-mounted display 100 and causes the display unit 124 in the head-mounted display 100 to display the VR images. The display unit 124 in the head-mounted display 100 displays the left eye image on the left eye display panel and displays the right eye image on the right eye display panel.
The system unit 260 executes processing for a system that pertains to the head-mounted display 100. The system unit 260 provides a common service to a plurality of applications (for example, a plurality of VR games) that are for the head-mounted display 100. The common service includes provision of camera images, provision of information regarding the position and orientation of the head-mounted display 100, and provision of line-of-sight measurement results, for example. In addition, the system unit 260 executes processing pertaining to basic settings for the head-mounted display 100, and executes processing for assisting adjustment of the inter-lens distance in the embodiment.
The system unit 260 includes an inter-lens distance obtainment unit 272, a line-of-sight measurement unit 276, a deviation detection unit 278, and an adjustment screen generation unit 280.
The inter-lens distance obtainment unit 272 obtains the inter-lens distance for the head-mounted display 100, in reference to the amount of rotation or rotation angle of the adjustment dial 112, which is transmitted from the head-mounted display 100.
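By way of illustration, the sketch below shows one possible mapping from the rotation angle of the adjustment dial 112 to an inter-lens distance, assuming a hypothetical linear relation; the constants, limits, and function name are illustrative assumptions and do not appear in the embodiment.

```python
# Hypothetical sketch: converting the reported rotation angle of the adjustment dial
# into an inter-lens distance. The linear mapping and its constants (58-72 mm of
# travel over 0-360 degrees) are illustrative assumptions.

MIN_LENS_DISTANCE_MM = 58.0   # assumed mechanical lower limit
MAX_LENS_DISTANCE_MM = 72.0   # assumed mechanical upper limit
MAX_DIAL_ANGLE_DEG = 360.0    # assumed full travel of the adjustment dial


def inter_lens_distance_from_dial(dial_angle_deg: float) -> float:
    """Map a dial rotation angle (degrees) to an inter-lens distance (millimeters)."""
    ratio = min(max(dial_angle_deg / MAX_DIAL_ANGLE_DEG, 0.0), 1.0)
    return MIN_LENS_DISTANCE_MM + ratio * (MAX_LENS_DISTANCE_MM - MIN_LENS_DISTANCE_MM)


print(inter_lens_distance_from_dial(180.0))  # 65.0, the midpoint of the assumed range
```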
In reference to a detection value from the eye tracking sensor 136 in the head-mounted display 100, the line-of-sight measurement unit 276 uses a publicly known eye tracking technology to detect the position of, motion of, and line-of-sight direction of the pupils of the user who is wearing the head-mounted display 100.
The deviation detection unit 278 detects the magnitude of deviation between the positions of the lenses in the head-mounted display 100 and the positions of the user's pupils detected by the line-of-sight measurement unit 276.
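One way such a deviation could be computed is sketched below, assuming that the lens centers and pupil centers are available as two-dimensional coordinates in a common, millimeter-scaled frame; the Euclidean metric and the names used are assumptions for illustration.

```python
import math
from typing import Tuple

Point = Tuple[float, float]  # (x, y) in millimeters, in an assumed HMD-fixed frame


def deviation_mm(lens_center: Point, pupil_center: Point) -> float:
    """Euclidean deviation between a lens center and the corresponding pupil center."""
    return math.hypot(pupil_center[0] - lens_center[0],
                      pupil_center[1] - lens_center[1])


# Left and right deviations are evaluated independently (illustrative coordinates).
left_dev = deviation_mm((-32.0, 0.0), (-30.5, 0.5))
right_dev = deviation_mm((32.0, 0.0), (33.0, -0.2))
```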
The adjustment screen generation unit 280 generates data for an adjustment screen that allows the user to adjust the inter-lens distance for the head-mounted display 100. The adjustment screen is described in detail below.
The adjustment screen generation unit 280 outputs the generated data for the adjustment screen to the display control unit 264. The display control unit 264 transmits the data for the adjustment screen generated by the adjustment screen generation unit 280 to the head-mounted display 100, and causes the display unit 124 in the head-mounted display 100 to display the adjustment screen.
The adjustment screen generation unit 280 disposes an HMD image 302, which represents the head-mounted display 100, in the adjustment screen 300. The HMD image 302 includes a left lens image 304a that indicates the left lens 114 and a right lens image 304b that indicates the right lens 116. The left lens image 304a and the right lens image 304b may be images that represent the portions of the HMD image 302 corresponding to the left lens 114 and the right lens 116 as if those portions had been cut out. When the position or orientation of the HMD image 302 changes, the positions of the left lens image 304a and the right lens image 304b also change. Note that the elements in the adjustment screen 300 are arranged at positions that are inverted left to right, so as to appear as if reflected in a mirror. In the following, when the left lens image 304a and the right lens image 304b are referred to generically, they may be referred to simply as a lens image.
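The mirror-like presentation can be pictured with the following sketch, which simply flips horizontal coordinates about the vertical center line of the screen; the coordinate convention and screen width are assumptions for illustration.

```python
# Illustrative sketch of the mirror-like arrangement: horizontal coordinates of the
# screen elements are flipped about the vertical center line of the adjustment screen,
# so that the HMD image and the eye images appear as if reflected in a mirror.

SCREEN_WIDTH_PX = 1920  # assumed width of the adjustment screen in pixels


def mirror_x(x: float, screen_width: float = SCREEN_WIDTH_PX) -> float:
    """Flip an x coordinate about the vertical center of the adjustment screen."""
    return screen_width - x


# A pupil detected toward the user's left side is drawn on the right of the mirrored screen.
left_pupil_screen_x = mirror_x(600.0)  # 1320.0
```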
When the user uses the adjustment dial 112 to change the inter-lens distance between the left lens 114 and the right lens 116, the adjustment screen generation unit 280 updates the adjustment screen 300 such that the interval between the left lens image 304a and the right lens image 304b widens or narrows accordingly. Inter-lens distance indicators 308 are a pair of objects that suggest the magnitude of the inter-lens distance. In a case where the inter-lens distance is changed, the adjustment screen generation unit 280 updates the adjustment screen 300 such that the interval between the inter-lens distance indicators 308 widens or narrows in tandem with the left lens image 304a and the right lens image 304b.
In the embodiment, the best positions for the left lens 114 and the right lens 116 are at locations where the center position of the left lens 114 matches the center position of the user's left eye and the center position of the right lens 116 matches the center position of the user's right eye. The adjustment screen 300 is configured to prompt the user to make such adjustments that the left lens 114 and the right lens 116 approach the best positions.
Specifically, the size of the left lens image 304a is designed such that the entirety of the left eye image 306a fits within a circle for the left lens image 304a, if deviation between the center of the left lens image 304a and the center of the left eye image 306a is within a predetermined threshold. In other words, the size of the left lens image 304a is designed such that at least a portion of the left eye image 306a protrudes from the circle for the left lens image 304a (is hidden behind the HMD image 302 on the screen) in a case where the abovementioned deviation exceeds the abovementioned threshold.
Similarly, the size of the right lens image 304b is designed such that the entirety of the right eye image 306b fits within a circle for the right lens image 304b, if deviation between the center of the right lens image 304b and the center of the right eye image 306b is within a predetermined threshold. In other words, the size of the right lens image 304b is designed such that at least a portion of the right eye image 306b protrudes from the circle for the right lens image 304b (is hidden behind the HMD image 302 on the screen) in a case where the abovementioned deviation exceeds the abovementioned threshold. The abovementioned thresholds that pertain to deviation may be determined by means of experimentation using the image display system 10 or the knowledge of a developer. The thresholds in the embodiment are ±3 millimeters for both left and right.
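The relation between the threshold and the image sizes can be expressed geometrically as in the sketch below: if the eye image and the lens image are modeled as circles, the eye image stays entirely inside the lens circle exactly when the center-to-center deviation plus the eye image radius does not exceed the lens image radius. The radii used are illustrative assumptions; only the ±3 millimeter threshold follows the embodiment.

```python
# Sizing the lens image radius as (eye image radius + allowed deviation) makes
# "the eye image does not protrude" equivalent to "the deviation is within the threshold."

DEVIATION_THRESHOLD_MM = 3.0  # threshold named in the embodiment
EYE_IMAGE_RADIUS_MM = 5.0     # assumed rendered radius of an eye image
LENS_IMAGE_RADIUS_MM = EYE_IMAGE_RADIUS_MM + DEVIATION_THRESHOLD_MM


def eye_fits_in_lens(deviation_mm: float) -> bool:
    """True if the whole eye image stays inside the lens image circle."""
    return deviation_mm + EYE_IMAGE_RADIUS_MM <= LENS_IMAGE_RADIUS_MM


assert eye_fits_in_lens(2.9) and not eye_fits_in_lens(3.1)
```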
In addition, a normal range 312 (a range indicated by a broken line) is disposed in the adjustment screen 300.
Note that the adjustment screen generation unit 280 may determine that the position of the user's left pupil is in an appropriate range in a case where the center of the user's left pupil detected by the line-of-sight measurement unit 276 is positioned within the range of a circle having a radius of approximately 3 millimeters from the center of the left lens 114 in the head-mounted display 100. Similarly, the adjustment screen generation unit 280 may determine that the position of the user's right pupil is in an appropriate range in a case where the center of the user's right pupil detected by the line-of-sight measurement unit 276 is positioned within the range of a circle having a radius of approximately 3 millimeters from the center of the right lens 116 in the head-mounted display 100.
The adjustment screen generation unit 280 disposes a correct/incorrect example 310 in the adjustment screen 300. Disposed in an upper level of the correct/incorrect example 310 is an image that illustrates an example of a correct positional relation between the left lens image 304a, the right lens image 304b, the left eye image 306a, and the right eye image 306b. In addition, disposed in a lower level of the correct/incorrect example 310 is an image that illustrates an example of an incorrect positional relation between the left lens image 304a, the right lens image 304b, the left eye image 306a, and the right eye image 306b. The lower level in the correct/incorrect example 310 illustrates an example in which the inter-lens distance has been widened too much.
In the adjustment screen 300, adjustment ends when the left eye image 306a fits within the circle for the left lens image 304a and the right eye image 306b fits within the circle for the right lens image 304b. When adjustment ends, the user selects (presses) the end button 314. Note that it may be that the end button 314 is hidden in the adjustment screen 300 initially, and the adjustment screen generation unit 280 displays the end button 314 when the position and orientation of the head-mounted display 100 as well as the position of the user's pupils are correctly adjusted.
Operation by the image generation apparatus 200 according to the above configuration will be described.
The inter-lens distance obtainment unit 272 in the image generation apparatus 200 obtains the inter-lens distance for the head-mounted display 100, in reference to the amount of rotation or rotation angle of the adjustment dial 112 in the head-mounted display 100 (S10).
In reference to a measurement value obtained by the eye tracking sensor 136 in the head-mounted display 100, the line-of-sight measurement unit 276 in the image generation apparatus 200 detects the positions of, motion of, and line-of-sight directions of the pupils of the user who is wearing the head-mounted display 100 (S11). In S11, the deviation detection unit 278 in the image generation apparatus 200 detects deviation between the positions of the lenses in the head-mounted display 100 and the positions of the user's pupils, and specifically detects the magnitude of deviation between the alignment of the left lens 114 and the right lens 116 in the head-mounted display 100 and the alignment of the user's left and right pupils.
The adjustment screen generation unit 280 in the image generation apparatus 200 generates data for an adjustment screen in reference to, inter alia, the inter-lens distance for the head-mounted display 100 obtained in S10, the positions of the user's pupils measured in S11, and the deviation, detected in S11, between the positions of the lenses in the head-mounted display 100 and the positions of the user's pupils (S12). The display control unit 264 in the image generation apparatus 200 causes the head-mounted display 100 to display the adjustment screen (S13).
While viewing the adjustment screen 300 that is displayed by the head-mounted display 100, the user adjusts the position or orientation (can also be said to be the fit) of the head-mounted display 100 or turns the adjustment dial 112 in the head-mounted display 100, such that the left eye image 306a fits within the circle for the left lens image 304a and the right eye image 306b fits within the circle for the right lens image 304b. The user selects the end button 314 in the adjustment screen 300 when adjustment of the inter-lens distance ends.
When the end button 314 in the adjustment screen 300 is selected (Y in S14), the display control unit 264 ends display of the adjustment screen 300, and the image generation apparatus 200 ends the inter-lens distance adjustment assistance process. If the end button 314 is not selected (N in S14), the processing returns to S10. While the processing of S10 through S13 is repeated, the adjustment screen generation unit 280 successively updates the display content of the adjustment screen 300 in response to, inter alia, a change in the position or orientation of the head-mounted display 100 or a change in the inter-lens distance.
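The flow of S10 through S14 can be summarized as in the skeletal sketch below; the collaborating objects and their method names are hypothetical placeholders for the units of the image generation apparatus 200 and the head-mounted display 100, and the Euclidean deviation metric is an assumption.

```python
import math


def run_inter_lens_adjustment(hmd, screen_generator, display_control):
    """Skeletal loop for steps S10 to S14; all collaborators are hypothetical objects."""
    while True:
        lens_distance = hmd.obtain_inter_lens_distance()    # S10: dial rotation -> distance
        pupils = hmd.measure_pupils()                        # S11: eye tracking result
        lenses = hmd.lens_positions()
        # Per-eye deviation as the distance between lens center and pupil center (assumed).
        deviation = [math.dist(l, p) for l, p in zip(lenses, pupils)]
        screen = screen_generator.generate(lens_distance, pupils, deviation)  # S12
        display_control.display(screen)                      # S13
        if display_control.end_button_selected():            # S14: end button pressed
            display_control.close_adjustment_screen()
            break
```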
For example, in a case where the position or orientation of the head-mounted display 100 has changed, the deviation detection unit 278 detects deviation between the positions of the lenses in the head-mounted display 100 after the change and the positions of the user's pupils, in other words, detects the positional relation between the positions of the lenses in the head-mounted display 100 after the change and the positions of the user's pupils. According to the deviation (the positional relation) between the positions of the lenses and the pupils that is successively detected by the deviation detection unit 278, the adjustment screen generation unit 280 generates a new adjustment screen 300 in which the positions of the left lens image 304a and the right lens image 304b have been changed. In addition, in a case where the user has performed an operation for changing the positions of the lenses (in other words, rotation of the adjustment dial 112), the adjustment screen generation unit 280 generates a new adjustment screen 300 in which the positions of the left lens image 304a and the right lens image 304b have been changed.
Specifically, as the magnitude of the deviation between the positions of the lenses in the head-mounted display 100 and the positions of the user's pupils, the deviation detection unit 278 detects the angle formed between the LSD line 322 and the IPD line 324.
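One way to obtain such an angle is sketched below, assuming that each line is given by two endpoints in a common two-dimensional frame; treating the lines as undirected, the smallest angle between them is returned. The helper name and example coordinates are illustrative.

```python
import math
from typing import Tuple

Point = Tuple[float, float]


def angle_between_lines_deg(a1: Point, a2: Point, b1: Point, b2: Point) -> float:
    """Smallest angle, in degrees, between the undirected lines a1-a2 and b1-b2."""
    ax, ay = a2[0] - a1[0], a2[1] - a1[1]
    bx, by = b2[0] - b1[0], b2[1] - b1[1]
    angle = math.degrees(math.atan2(abs(ax * by - ay * bx), ax * bx + ay * by))
    return min(angle, 180.0 - angle)


# Example: a horizontal lens line versus a slightly tilted pupil line (about 1.8 degrees).
tilt = angle_between_lines_deg((-32, 0), (32, 0), (-31, -1.0), (33, 1.0))
```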
By virtue of the image generation apparatus 200 according to the embodiment, the adjustment screen 300 that includes lens images and pupil images is provided to a user who is wearing the head-mounted display 100, whereby it is possible to assist setting of an appropriate inter-lens distance for the head-mounted display 100. In addition, in a case where a user changes the lens positions in the head-mounted display 100, the positions of the lens images in the adjustment screen 300 are changed, whereby it is possible to effectively assist setting of an appropriate inter-lens distance.
In addition, by virtue of the image generation apparatus 200 according to the embodiment, the positions of the lens images in the adjustment screen 300 are changed in a case where the orientation of the head-mounted display 100 has changed and thus the positional relation between the lenses in the head-mounted display 100 and the user's pupils has changed. As a result, it is possible to assist appropriate adjustment of the orientation of the head-mounted display 100. In addition, in a case where the deviation between the positions of the lenses in the head-mounted display 100 and the positions of the user's pupils has become large, content suggesting that the deviation is large can be displayed in the adjustment screen 300, thereby prompting the user to resolve the deviation between the positions of the lenses and the positions of the pupils. In addition, the correct/incorrect example 310 is disposed in the adjustment screen 300, whereby it is possible to effectively assist setting of an appropriate inter-lens distance and appropriate adjustment of the orientation of the head-mounted display 100.
The present invention has been described above in reference to an embodiment. The embodiment is an example, and a person skilled in the art would understand that various variations can be made to combinations of respective components or processing processes of the embodiment, and that these variations are within the scope of the present invention.
A variation will be described. Although not described in the embodiment above, in a case where a state in which the positions of the user's pupils are within an appropriate range with respect to the positions of the lenses in the head-mounted display 100 has continued for a predetermined threshold amount of time or longer, the adjustment screen generation unit 280 in the image generation apparatus 200 may place, in the adjustment screen 300, content suggesting that the inter-lens distance for the head-mounted display 100 is appropriate. An appropriate value for the abovementioned time threshold may be determined by means of experimentation using the image display system 10 or the knowledge of a developer; in the following, 1.3 seconds is assumed.
It may be that, in a case where the magnitude of deviation between the positions of the lenses in the head-mounted display 100 and the positions of the user's pupils, which is successively detected by the deviation detection unit 278, is less than or equal to a predetermined threshold (for example, ±3 millimeters), the adjustment screen generation unit 280 determines that the positions of the user's pupils with respect to the positions of the lenses are within the appropriate range. In addition, the adjustment screen generation unit 280 may determine that the positions of the user's pupils are within the appropriate range in a case where the centers of the user's pupils detected by the line-of-sight measurement unit 276 are positioned within the range of a circle having a radius of approximately 3 millimeters from the centers of the lenses in the head-mounted display 100. The adjustment screen generation unit 280 may make this determination for each of the user's left and right pupils.
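The time-threshold check of this variation can be sketched as follows, assuming a monotonic clock and that the result of the check simply indicates whether the feedback content should be shown; only the 1.3-second and 3-millimeter values follow the variation, and everything else is an illustrative assumption.

```python
import time
from typing import Optional

DWELL_THRESHOLD_S = 1.3    # time threshold assumed in this variation
RADIUS_THRESHOLD_MM = 3.0  # appropriate-range radius around each lens center


class DwellChecker:
    """Tracks how long a pupil has stayed within the appropriate range."""

    def __init__(self) -> None:
        self._entered_at: Optional[float] = None  # timestamp when the range was entered

    def update(self, deviation_mm: float, now: Optional[float] = None) -> bool:
        """Return True once the pupil has stayed in range for DWELL_THRESHOLD_S or longer."""
        now = time.monotonic() if now is None else now
        if deviation_mm <= RADIUS_THRESHOLD_MM:
            if self._entered_at is None:
                self._entered_at = now
            return now - self._entered_at >= DWELL_THRESHOLD_S
        self._entered_at = None  # the pupil left the range, so reset the timer
        return False
```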
In a case of detecting that the position of a user's pupil has departed from the appropriate range while the feedback objects 334 are being displayed in the second mode, the adjustment screen generation unit 280 deletes the feedback objects 334 from the adjustment screen 300 and returns the adjustment screen 300 to the state before the feedback objects 334 were displayed.
The user, upon confirming that the feedback objects 334 in the adjustment screen 300 are displayed in the second mode, selects (presses) the end button 314 in the adjustment screen 300, and ends adjustment of the inter-lens distance. By virtue of the present variation, it is possible to provide visual feedback (the feedback objects 334) to a user when the inter-lens distance is being adjusted, whereby the user can intuitively and correctly determine whether the inter-lens distance is appropriate.
Another variation will be described. In a case such as where the user has closed his/her eyes, the line-of-sight measurement unit 276 in the image generation apparatus 200 may fail to detect the position of at least one of the user's left eye and right eye. The adjustment screen generation unit 280 may dispose only the right eye image 306b in the adjustment screen 300 in a case where the position of the user's left eye is not detected, and may dispose only the left eye image 306a in the adjustment screen 300 in a case where the position of the user's right eye is not detected. In addition, in a case where neither the position of the user's left eye nor that of the right eye is detected, the adjustment screen generation unit 280 need not dispose either the left eye image 306a or the right eye image 306b in the adjustment screen 300. In a case of not disposing at least one of the left eye image 306a and the right eye image 306b in the adjustment screen 300, the adjustment screen generation unit 280 may cause the adjustment screen 300 to display, to the user, advice pertaining to adjustment of the inter-lens distance. This advice may be, for example, “Please press the OK button in a case where you can clearly see the screen, even if an eye is not being displayed.”
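The branching of this variation may be pictured with the following sketch, in which only the detected pupils are drawn and an advice message is added when an eye image is omitted; the dictionary layout and the helper name are placeholders for the actual screen data.

```python
# Illustrative sketch: dispose only the eye images whose pupils were detected, and add
# advice text when at least one eye image is omitted from the adjustment screen.

def build_eye_images(left_detected: bool, right_detected: bool) -> dict:
    screen = {"eye_images": [], "advice": None}
    if left_detected:
        screen["eye_images"].append("left_eye_image_306a")
    if right_detected:
        screen["eye_images"].append("right_eye_image_306b")
    if not (left_detected and right_detected):
        screen["advice"] = ("Please press the OK button in a case where you can clearly "
                            "see the screen, even if an eye is not being displayed.")
    return screen
```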
Yet another variation will be described. At least some functions among a plurality of functions implemented by the image generation apparatus 200 in the above-described embodiment may be implemented by the head-mounted display 100, or may be implemented by a server that is connected to the image generation apparatus 200 via a network. For example, the head-mounted display 100 may be provided with a function for generating various kinds of screens or image data in reference to a camera image or a sensor measurement value. In addition, the server may be provided with a function for generating various kinds of screens or image data in reference to a camera image or a sensor measurement value, and the head-mounted display 100 may display a screen or an image generated by the server.
Any combination of the embodiment and variations described above is also valid as an embodiment of the present disclosure. A new embodiment arising from such a combination has the effects of each of the combined embodiments and variations. In addition, a person skilled in the art would also understand that the functions to be fulfilled by the respective constituent features described in the claims are realized by the respective components described in the embodiment and variations alone, or through cooperation thereof.
The present invention can be applied to an apparatus or a system for assisting the adjustment of an inter-lens distance for a head-mounted display.
Number | Date | Country | Kind
---|---|---|---
2022-023962 | Feb 2022 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2022/026988 | 7/7/2022 | WO |