This application claims the benefit of Japanese Priority Patent Application JP 2022-091633 filed Jun. 6, 2022, the entire contents of which are incorporated herein by reference.
The present disclosure relates to an image display system and an image display method that display an image by projecting the image onto the retina.
A technology for projecting an image onto the human retina by using the Maxwellian view is being put into practical use in the field of wearable displays (refer to, for example, PCT Patent Publication No. WO2009/066465 (hereinafter, referred to as Patent Document 1)). In this technology, light representative of an image is converged at the center of the pupil of a user, and the image is formed on the retina in two-dimensional form. With this, the result of image formation is less likely to be affected by the crystalline lens, and an image can visually be recognized by users with substantially the same quality irrespective of personal visual acuity and the focus position, that is, a focus-free capability is easily obtained (refer to, for example, Mitsuru Sugawara and six other persons, “Every aspect of advanced retinal imaging laser eyewear: principle, free focus, resolution, laser safety, and medical welfare applications,” SPIE OPTO, Feb. 22, 2018, Proceedings Volume 10545, MOEMS and Miniaturized Systems XVII, 105450O (hereinafter, referred to as Non-Patent Document 1)).
According to Non-Patent Document 1, it has been known that the focus-free capability and visual resolution variously change depending on the state of a beam. For example, when the beam is adjusted in such a direction as to increase the visual resolution, the focus-free capability is degraded, which may possibly make it difficult for a user to view an image depending on his or her visual acuity. Conversely, when the beam is adjusted in such a direction as to enhance the focus-free capability, the visual resolution is relatively decreased. As described above, the resolution and the focus-free capability are in a trade-off relation. Therefore, it is difficult to strike a balance between the two.
Further, when viewing an object in the real world, a person focuses on the object by changing the thickness of the crystalline lens according to the distance to the object. Meanwhile, when the above-described technology is used, the image to be visually recognized is less likely to be affected by the thickness of the crystalline lens. Therefore, even when the thickness of the crystalline lens physiologically changes according to the distance in the image, the appearance of the image remains unchanged, that is, a convergence-accommodation conflict occurs. Due to this, the sense of presence is likely to be degraded.
In view of the above circumstances, the present disclosure has been made, and it is desirable to provide a technology that allows a user to visually recognize a realistic image with increased quality during the use of a display technology for projecting an image onto the retina.
According to an embodiment of the present disclosure, there is provided an image display system including an eyeball state acquisition section, a beam control section, and an image projection section. The eyeball state acquisition section acquires information regarding a state of an eye of a user. The beam control section adjusts, according to the information regarding the state of the eye, a beam state of image laser light for forming pixels of a display image. The image projection section projects the image laser light onto a retina of the user.
According to another embodiment of the present disclosure, there is provided an image display method performed by an image display system. The image display method includes acquiring information regarding a state of an eye of a user, adjusting, according to the information regarding the state of the eye, a beam state of image laser light for forming pixels of a display image, and projecting the image laser light onto a retina of the user.
It should be noted that any combinations of the above-mentioned constituent elements and any conversions of expressions of the present disclosure between, for example, methods, devices, systems, computer programs, and recording media having computer programs recorded therein are also effective as the embodiments of the present disclosure.
According to the present disclosure, a realistic image can visually be recognized with increased quality during the use of a display technology for projecting an image onto the retina.
A first embodiment of the present disclosure relates to a display technology for projecting an image onto the retina of a user by using a laser beam scanning method. The laser beam scanning method is a method for forming an image on a target by performing a two-dimensional scan with laser light corresponding to pixels, with the use of a deflection mirror.
The image data output unit 10 outputs data of a display target image to the light emission unit 12. The light emission unit 12 acquires the data and generates laser light indicating the colors of individual pixels forming the image. The laser light includes, for example, red (R), green (G), and blue (B) components. The scanning mirror 52 is controlled to change its angle around two axes and displaces, in two-dimensional directions, the destination of the laser light generated by the light emission unit 12. The light emission unit 12 sequentially changes the color of the laser light in synchronism with the oscillation of the scanning mirror 52. Thus, the image is formed with pixels colored at each specific time point.
The reflective mirror 100 receives the laser light reflected from the scanning mirror 52, and reflects the laser light toward an eyeball 102 of the user. The reflective mirror 100 is designed such that the laser light two-dimensionally scanned by the scanning mirror 52 is converged on the pupil of the eyeball 102 and is then two-dimensionally scanned over a retina 104. This increases the possibility of achieving a focus-free capability, which allows users to visually recognize an image with substantially the same quality at all times irrespective of unaided visual acuity and the focus position.
In practice, it is conceivable that the structure depicted in
The present embodiment improves the quality of a visually recognized image during the application of the display technology that uses the method depicted in
In other words, there exists an incident beam diameter that gives a local minimum to the beam spot on the retina. In order to increase resolution when an image displayed by retinal direct drawing is visually recognized, it is preferable that the incident beam diameter be adjusted for the local minimum. In the example of
NA=sin θ
As seen from
Meanwhile, when the numerical aperture is −0.0012 or −0.0029 as indicated in
Moreover, as can be seen from the above-described results, as the fineness (resolution) of a visually recognized image is increased, the focus-free capability is more impaired. This phenomenon theoretically occurs because the conditions for the optimal beam diameter and the beam divergence angle vary in relation to the refractive power of a user's eye. In the present embodiment, the beam state is automatically adjusted according to the state of the eyes of each individual user, so that each user can view highly fine images with higher resolution. Since any user can then view such a fine image regardless of visual acuity, the focus-free capability is achieved as well.
The image laser light 57 includes, for example, three different laser light waves corresponding to R, G, and B. However, the wavelength and the number of waves are not limited to particular ones as long as they indicate the colors corresponding to pixel values. The beam control section 54 adjusts at least either the beam diameter or divergence angle of the image laser light 57 according to the state of the user's eye. Hereinafter, both the beam diameter and the divergence angle, or either one of them, may generically be referred to as the "beam state." An example of specific means for adjusting the beam state will be described later. The adjusted image laser light 57 is reflected from the scanning mirror 52 and finally enters the eyeball 102 of the user.
For example, a microelectromechanical systems (MEMS) mirror is employed as the scanning mirror 52. The MEMS mirror is a small-size, low-power-consumption device that is able to accurately control angular changes around two axes by being electromagnetically driven. However, the method for driving the scanning mirror 52 is not limited to a particular one. The scanning mirror 52 includes a control mechanism for controlling its angle in synchronism with the colors of the image laser light 57, which forms the pixels of the projection image. The control mechanism may alternatively be included in the image data output unit 10.
In the present embodiment, the light emission unit 12 further has a function of acquiring information regarding the state of the eyeball 102 of the user. More specifically, the light emission unit 12 acquires predetermined parameters which are related to the structure of the eyeball 102 and which can be used to estimate the unaided visual acuity. Examples of the predetermined parameter include the thickness of a crystalline lens 108 and the length of the eyeball in the depth direction (eye axial length). The light emission unit 12 estimates the unaided visual acuity of the user on the basis of such visual acuity estimation parameters, and emits the image laser light 57 in the optimal beam state, which varies from one user to another.
Specifically, the light emission unit 12 includes a reference laser light source 56, a beam splitter 58, a reference laser light transmission filter 62, a distance acquisition section 60, and an eyeball state acquisition section 64. The reference laser light source 56 outputs reference laser light for acquiring the visual acuity estimation parameters. The beam splitter 58 superimposes the reference laser light on the image laser light and introduces the resulting laser light to the scanning mirror 52. The reference laser light transmission filter 62 transmits therethrough light having the wavelength of the reference laser light. The distance acquisition section 60 detects the reflected reference laser light and acquires the distance to the point where the reference laser light is reflected. The eyeball state acquisition section 64 acquires the visual acuity estimation parameters from information regarding the acquired distance and estimates the unaided visual acuity.
In the above example, the reference laser light source 56 and the beam splitter 58 function as a reference light emission section. As reference laser light 59, the reference laser light source 56 generates, for example, near-infrared laser light having a pulse width of 100 picoseconds to several nanoseconds. The beam splitter 58 is disposed in such a manner as to make the reference laser light 59 join the path of the image laser light 57 and to introduce the reference laser light 59 to the scanning mirror 52. With such a beam splitter 58, the reference laser light 59 is, along a common path shared with the image laser light 57, reflected from the scanning mirror 52 and delivered to the eyeball 102, for example, through a reflective mirror. It should be noted that, when paths are substantially common to each other, they may be regarded as the "common path" even though they are slightly displaced from each other.
The distance acquisition section 60 detects the reference laser light 59 reflected from tissues in the eyeball 102, and thus acquires the distance to the point where the reference laser light is reflected. Part of the reference laser light 59 delivered to the pupil of the eyeball 102 is transmitted through various tissues in the eyeball and delivered to the retina 104, while the other reference laser light 59 is reflected from the surfaces of the tissues. For example, part of the reference laser light 59 is reflected from a surface (front surface) of the crystalline lens 108 facing the pupil, and the remaining reference laser light 59 is transmitted through the front surface of the crystalline lens 108. Part of the transmitted laser light is reflected from a surface (back surface) of the crystalline lens 108 facing the retina, and the remaining transmitted laser light is transmitted through the back surface of the crystalline lens 108. The last remaining laser light is delivered to and reflected from the retina 104.
The distance acquisition section 60 detects, at light-receiving elements thereof, photons that are reflected from a plurality of surfaces in the above-described manner, and derives the distance to each surface on the basis of the time lag between the emission and detection of the reference laser light 59. The distance acquisition section 60 includes, for example, a direct time-of-flight (dToF) sensor and is driven in synchronism with the emission of the reference laser light 59. More specifically, in response to a synchronization signal S inputted from the distance acquisition section 60, the reference laser light source 56 periodically generates a pulse of the reference laser light 59. The distance acquisition section 60 repeatedly measures, for a predetermined period of time, the time lag between a time point when the reference laser light 59 is emitted, which is based on the time point when the synchronization signal S is outputted, and a time point when reflected light 61 of the reference laser light 59 is detected.
When the time lag between the emission of the reference laser light 59 and the detection of the reflected light 61 is Δt and the speed of light is c, a distance C between the light-receiving element of the distance acquisition section 60 and the reflective surface is theoretically determined by the following equation.
C=c×Δt/2
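As an illustrative numeric check of the above equation (this helper is a hypothetical sketch, not part of the disclosed system; it assumes propagation at the vacuum speed of light, whereas the refractive indices of the ocular media would in practice alter the apparent distance):

```python
# Sketch: one-way distance from a round-trip time lag, C = c * dt / 2.
C_LIGHT = 299_792_458.0  # speed of light in vacuum, m/s


def distance_from_time_lag(dt_seconds: float) -> float:
    """Return the one-way distance C for a measured round-trip lag dt."""
    return C_LIGHT * dt_seconds / 2.0


# A lag of about 160 ps corresponds to roughly 24 mm,
# which is on the order of a typical eye axial length.
d = distance_from_time_lag(160e-12)
```

This also illustrates why picosecond-scale pulse widths and timing, as mentioned above, are needed to resolve millimeter-scale structures in the eyeball.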
It should be noted that a port through which the laser light is emitted from the scanning mirror 52 is substantially the same as the light-receiving surface of the distance acquisition section 60. In the present embodiment, however, the method of measuring the distance to the point of reflection by detecting the reflection of the reference laser light is not limited to dToF.
In the present embodiment described above, there are a plurality of reflective surfaces from which the reference laser light is reflected. Therefore, the number of photons detected by the distance acquisition section 60 takes a local maximum at a plurality of time points t1, t2, t3, which correspond to the distances to the reflective surfaces. For example, the distance acquisition section 60 accumulates and counts temporal changes in the number of detected photons from the time point when the laser light is emitted, and thus acquires the time lags Δt1, Δt2, Δt3, . . . that cause the number of photons to take the local maximum. As a result, distances C1, C2, C3, . . . to the plurality of reflective surfaces can be determined highly accurately on the basis of the above equation.
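The peak-finding step described above can be sketched as follows. The histogram values, bin width, and threshold are illustrative assumptions only; the disclosure does not specify concrete values or a particular peak-detection algorithm.

```python
# Sketch: recover reflection time lags from an accumulated photon-count
# histogram by locating local maxima, then convert them to distances.
C_LIGHT = 299_792_458.0  # m/s


def local_maxima_lags(counts, bin_width_s, threshold=1):
    """Return time lags (s) at which the photon count takes a local maximum."""
    lags = []
    for i in range(1, len(counts) - 1):
        if (counts[i] >= threshold
                and counts[i] > counts[i - 1]
                and counts[i] >= counts[i + 1]):
            lags.append(i * bin_width_s)
    return lags


def lags_to_distances(lags):
    """Apply C = c * dt / 2 to each detected lag."""
    return [C_LIGHT * dt / 2.0 for dt in lags]


# Toy histogram with three peaks, e.g., the lens front surface,
# the lens back surface, and the retina.
hist = [0, 1, 9, 2, 0, 1, 7, 1, 0, 0, 12, 3, 0]
lags = local_maxima_lags(hist, bin_width_s=10e-12)
dists = lags_to_distances(lags)
```

Accumulating detections over many pulses before peak finding, as described above, suppresses shot noise and makes the local maxima stand out.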
It should be noted that, although the path of the reference laser light 59 is linear in
The eyeball state acquisition section 64 acquires, from the distance acquisition section 60, a data set C including the distances C1, C2, C3, . . . to the plurality of reflective surfaces, and acquires the visual acuity estimation parameters by using the acquired data set C. For example, the eyeball state acquisition section 64 acquires, as the thickness of the crystalline lens 108, the difference between the distance to the front surface of the crystalline lens 108 and the distance to the back surface of the crystalline lens 108. The eyeball state acquisition section 64 estimates unaided visual acuity VA of the user on the basis of the acquired thickness, and reports the estimated unaided visual acuity VA to the beam control section 54.
The beam control section 54 adjusts the image laser light 57 until the beam state becomes suitable for the reported unaided visual acuity VA. It should be noted that unaided acuity acquisition by the reference laser light source 56, the distance acquisition section 60, and the eyeball state acquisition section 64 may be performed only once as an initial operation when the user begins to view images such as content by using the image display system according to the present embodiment.
Alternatively, the above-mentioned unaided acuity acquisition may be performed while the user is viewing images, at regular intervals or at a predetermined timing such as a timing when a scene changes. In the present embodiment, image display processing and visual acuity estimation processing do not interfere with each other and are compatible with each other within the same system. Hence, the user is able to enjoy viewing images under optimal conditions without having trouble.
Further, the distance acquisition section 60, the eyeball state acquisition section 64, and the beam control section 54 may in practice be integrated into one or two devices or divided into four or more devices. Moreover, some of the depicted functions of the distance acquisition section 60 and the beam control section 54 may be included in the eyeball state acquisition section 64. Alternatively, some or all of the functions of the eyeball state acquisition section 64 may be included in the distance acquisition section 60 or in the beam control section 54.
The distance acquisition section 60 includes a synchronization signal output block 72, a detection block 70, and a computation block 74. The synchronization signal output block 72 outputs a synchronization signal to the reference laser light source 56. The detection block 70 detects the reflected reference laser light. The computation block 74 acquires the distance to a reflection position according to the result of the detection. The synchronization signal output block 72 generates the synchronization signal for starting the generation of the pulse of the reference laser light, as mentioned above, and gives the generated synchronization signal to the reference laser light source 56. The detection block 70 includes an array of light-receiving elements, detects the reflected reference laser light when the eyeball is hit by the pulse of the reference laser light generated by the reference laser light source 56 in response to the synchronization signal, and reports temporal changes in the number of detections to the computation block 74.
The computation block 74 determines the time point when the pulse of the reference laser light is emitted, according to the timing of the synchronization signal generated by the synchronization signal output block 72. Then, as mentioned earlier, the computation block 74 accumulates and counts the number of times the reflected light is detected, for a plurality of pulses, in relation to the elapsed time from the time point of the emission, and thus determines the plurality of time lags Δt1, Δt2, Δt3, . . . that give the local maximum to the number of detections. Further, the computation block 74 uses the above equation to derive distance values that correspond to the respective determined time lags.
The eyeball state acquisition section 64 includes an unaided visual acuity acquisition block 78 that estimates the unaided visual acuity of the user by using the derived distance values. More specifically, the unaided visual acuity acquisition block 78 determines one or more unaided visual acuity estimation parameters such as the crystalline lens thickness and the eye axial length according to the difference between the distance values. Then, the eyeball state acquisition section 64 estimates the unaided visual acuity of the user according to the values of the determined unaided visual acuity estimation parameters. For the purpose of estimating the unaided visual acuity, the unaided visual acuity acquisition block 78 has an unaided visual acuity table retained in an internal memory thereof, for example. The unaided visual acuity table defines the association between the unaided visual acuity estimation parameters and the unaided visual acuity.
The beam control section 54 includes a beam state determination block 80 and an adjustment block 82. The beam state determination block 80 determines the optimal value of the beam state. The adjustment block 82 adjusts the beam state of the image laser light according to the result of the determination. The beam state determination block 80 acquires an estimated value of the user's unaided visual acuity from the eyeball state acquisition section 64, and determines the beam state optimal for the user's unaided visual acuity. For the purpose of determining the beam state, the beam state determination block 80 has a beam state table retained in an internal memory thereof, for example. The beam state table defines the association between the unaided visual acuity and the optimal beam state.
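The two-stage lookup described above, from an eyeball parameter to an estimated unaided visual acuity and from the acuity to a beam state, might be sketched as follows. All table entries are hypothetical placeholders, since the disclosure does not give concrete values for either table.

```python
# Sketch: the unaided visual acuity table and the beam state table,
# with two-stage lookup. Numeric entries are illustrative assumptions.
ACUITY_TABLE = [            # (upper bound of lens thickness in mm, estimated acuity)
    (3.6, 0.1),
    (3.8, 0.5),
    (4.0, 1.0),
]

BEAM_STATE_TABLE = {        # acuity -> (beam diameter in mm, divergence angle in rad)
    0.1: (0.4, 0.0029),
    0.5: (0.6, 0.0012),
    1.0: (1.0, 0.0),
}


def estimate_acuity(lens_thickness_mm: float) -> float:
    """Map a crystalline lens thickness to an estimated unaided visual acuity."""
    for upper_bound, acuity in ACUITY_TABLE:
        if lens_thickness_mm <= upper_bound:
            return acuity
    return ACUITY_TABLE[-1][1]


def optimal_beam_state(lens_thickness_mm: float):
    """Return the beam state associated with the estimated acuity."""
    return BEAM_STATE_TABLE[estimate_acuity(lens_thickness_mm)]


state = optimal_beam_state(3.7)  # falls in the 3.8 mm bracket -> acuity 0.5
```

In an actual system, the tables could be populated from measured optical data rather than the placeholder brackets used here.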
The adjustment block 82 adjusts the image laser light to obtain the beam state determined by the beam state determination block 80. For the purpose of adjusting the image laser light, the adjustment block 82 includes means for adjusting at least either the beam diameter or the beam divergence angle. For example, the adjustment block 82 includes a beam expander for use in adjusting the beam diameter.
The beam expander is a well-known device that includes a first lens for expanding the diameter of incident laser light and a second lens for turning such expanded laser light into parallel light and that is able to change the magnification of the beam diameter by adjusting the distance between the first and second lenses (refer to, for example, “beam expander,” [online], Edmund Optics Japan, [search on May 9, 2022], Internet <URL: https://www.edmundoptics.jp/knowledge-center/application-notes/lasers/beam-expanders>). However, the means for adjusting the beam diameter is not limited to the beam expander.
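For illustration, a simple two-lens expander obeys the textbook relation that the nominal magnification is the ratio of the two focal lengths; the following sketch uses that general relation, not values from the disclosure, and the lens-separation adjustment mentioned above would additionally trade collimation for divergence.

```python
# Sketch: output beam diameter of a two-lens beam expander.
# Nominal magnification M = f2 / f1 (standard optics relation);
# shifting the separation away from f1 + f2 changes the divergence.
def expanded_beam_diameter(input_diameter_mm: float,
                           f1_mm: float,
                           f2_mm: float) -> float:
    """Return the expanded beam diameter for focal lengths f1 and f2."""
    return input_diameter_mm * (f2_mm / f1_mm)


# A 0.5 mm input beam with f1 = 10 mm, f2 = 30 mm is expanded 3x to 1.5 mm.
d_out = expanded_beam_diameter(0.5, 10.0, 30.0)
```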
Further, the adjustment block 82 includes, for example, a device that is used for adjusting the beam divergence angle and that is equipped with any of a diffusion lens, a condenser lens, a liquid lens, and photonic crystals. When the diffusion lens or the condenser lens is used, the divergence angle can be adjusted by shifting the lens position in the axial direction. In this instance, the adjustment block 82 needs to mechanically move the lens by using an actuator, and the time required for the adjustment is approximately 100 msec to 1 sec.
The liquid lens is a device that changes the refraction of incident light by deforming the interface between a polar liquid such as water and a nonpolar liquid such as silicone oil, which are enclosed in a holder, in response to a change in the applied voltage (refer to, for example, Japanese Patent Laid-open No. 2011-90054). When provided with a device equipped with the liquid lens, the adjustment block 82 is able to control the divergence angle simply by changing the applied voltage, and thus significantly increase the response speed of the adjustment.
A device equipped with the photonic crystals uses the combination of a photonic crystal with a fixed period of refractive index profile and a photonic crystal with a continuously varying period of refractive index profile, changes the difference between the above-mentioned periods by shifting the position of an electrode to be driven, and thus adjusts a beam emission angle (refer to, for example, Japanese Patent Laid-open No. 2013-211542). When provided with the device equipped with the photonic crystals, the adjustment block 82 is also able to control the divergence angle simply by changing the electrode to be driven. It should be noted that the means for adjusting the beam state in the present embodiment is not limited to the above-described one.
The local maxima can be obtained more clearly by increasing the number of reference laser pulse emissions and adding the results of detection to the histogram. The time lags Δt1, Δt2, Δt3 giving the local maxima correspond to the distances C1, C2, C3 to the surfaces causing the relevant reflections. For ease of understanding,
The unaided visual acuity acquisition block 78 acquires data regarding a plurality of distance values obtained in the above manner, and determines, for example, the distance difference between the back and front surfaces of the crystalline lens, i.e., C2−C1, as the thickness of the crystalline lens. Alternatively, the unaided visual acuity acquisition block 78 determines the distance difference between the retina and the front surface of the crystalline lens, i.e., C3−C1, as the eye axial length. It should be noted that the type and number of the unaided visual acuity estimation parameters are not limited to particular ones as long as the unaided visual acuity can be estimated on the basis of the distances to the reflective surfaces of the eyeball. Further, it is sufficient if the crystalline lens thickness and the eye axial length are indices for deriving the unaided visual acuity, and they may not be exact numerical values based on the general definition.
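The parameter derivation described above reduces to simple differences of the measured distances; a minimal sketch follows, with hypothetical distance values in millimeters. Note that because every distance is measured from the light-receiving element, the device-to-eye offset cancels in each difference.

```python
# Sketch: visual acuity estimation parameters from measured distances.
# C1: lens front surface, C2: lens back surface, C3: retina (all in mm).
def lens_thickness(c1: float, c2: float) -> float:
    """Crystalline lens thickness as the back/front surface distance difference."""
    return c2 - c1


def eye_axial_length(c1: float, c3: float) -> float:
    """Eye axial length index, measured from the lens front surface rather
    than by the exact anatomical definition."""
    return c3 - c1


# Hypothetical measurement: thickness of about 3.8 mm, axial index of 24.0 mm.
t = lens_thickness(15.0, 18.8)
length = eye_axial_length(15.0, 39.0)
```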
Hence, when the unaided visual acuity table 110 depicted in
Meanwhile, it has been known that the unaided visual acuity also depends on the eye axial length (axial error). More specifically, when the eye axial length is greater than a normal value, near-sightedness, the condition in which incident light forms an image in front of the retina, occurs. On the other hand, when the eye axial length is smaller than the normal value, far-sightedness, the condition in which incident light forms an image behind the retina, occurs. Hence, the unaided visual acuity table may include data indicating the association between the eye axial length and the unaided visual acuity. Alternatively, the unaided visual acuity table may include data indicating the association between the unaided visual acuity and the combination of the crystalline lens thickness and the eye axial length or include data indicating the association between the unaided visual acuity and any other parameter.
In a case where only the beam diameter is to be adjusted, as depicted in
The beam state determination block 80 references the beam state table 112 to determine the optimal beam state according to the user's unaided visual acuity obtained by the unaided visual acuity acquisition block 78 of the eyeball state acquisition section 64, and reports the determined optimal beam state to the adjustment block 82. It should be noted that numerical values depicted in
That is, the distance acquisition section 60 detects the reflected reference laser light at a position circumscribing the port through which the above-mentioned laser light is emitted. As long as the reflected reference laser light is detected, the shape of the opening 66 or the surface on which the light-receiving elements are arrayed is not limited to a particular shape. In the present embodiment, the laser beam scanning method is adopted to sequentially irradiate individual pixels with laser light. Hence, even when the light-receiving surface of the distance acquisition section is positioned to circumscribe the laser light emitting port as depicted in
Further, as depicted in
The image/reference laser light source 120 not only generates the image laser light on the basis of the image data I acquired from the image data output unit 10, but also generates the pulse of the reference laser light according to the synchronization signal S from the distance acquisition section 60. The other operations of the image/reference laser light source 120 may be similar to the operations depicted in
According to the present embodiment described above, in the image display system which projects an image onto the retina by using the laser beam scanning method, the reference laser light is emitted to the eyeball with the use of an image laser light emission mechanism, and the reflected reference laser light is detected, thereby acquiring the state of the eyeball. More specifically, the image display system estimates the unaided visual acuity of the user by acquiring the predetermined parameters affecting the unaided visual acuity. Subsequently, the image display system optimizes the beam state of the image laser light, and thus enables the user to visually recognize a fine image with the highest resolution possible irrespective of the user's unaided visual acuity.
Further, since the mechanism for displaying an image is used to emit the reference laser light to the eyeball, it is not necessary to use a special device for measuring visual acuity. That is, measurements can be made automatically while the user wears, for example, a wearable display for viewing images, and the results of measurements can immediately be reflected in the beam state. Consequently, accurate adjustments can be made without the user knowing. Moreover, since the reference laser light is emitted along the same path as the image laser light, the eyeball can accurately be measured without the trouble of performing calibration, for example.
In the first embodiment, the beam state is adjusted to ensure visual recognition of images with the highest resolution possible, without regard to the unaided visual acuity. However, in a second embodiment, the resolution of an image to be visually recognized is changed to match changes in a biological focal distance. More specifically, in the second embodiment, the distance at which the user focuses is estimated on the basis of the crystalline lens thickness, and the beam state is adjusted on a real time basis according to the estimated distance.
When the user views (focuses on) the nearby object 130b as depicted in
When a display technology based on retinal direct drawing is adopted, as mentioned earlier, images are visually recognized in the same state no matter where the user's focus is. In other words, even when the user focuses on the image of an object, the user feels that the image is out of focus, that is, a convergence-accommodation conflict occurs. In the present embodiment, the crystalline lens thickness is acquired in parallel with image display, and then, the focal distance is estimated on a real time basis according to the acquired crystalline lens thickness. Subsequently, in the three-dimensional space of the display target, that is, in a virtual space or in the space of a live captured image of the display target, the image of an object placed at a position corresponding to the focal distance is visually recognized with high resolution, and the image of an object placed at a distant position not corresponding to the focal distance is visually recognized with low resolution. In this manner, the convergence-accommodation conflict is resolved, and a more realistic image representation is achieved.
It should be noted that the image laser light source 50 and the reference laser light source 56 may be replaced by the image/reference laser light source 120 depicted in
Temporal changes in the distances C1, C2, C3, . . . can be acquired by continuously emitting the pulse of the reference laser light. When the crystalline lens thickness changes, the relevant distance values also change. Therefore, the distance acquisition section 60 determines the data set C, for example, at predetermined intervals and sequentially supplies the determined data set C to the eyeball state acquisition section 64. The eyeball state acquisition section 64 determines temporal changes in the thickness of the crystalline lens 108 according to the data set C. The eyeball state acquisition section 64 estimates a focal distance Df at predetermined intervals on the basis of the result of the determination and reports the estimated focal distance Df to the beam control section 54. The eyeball state acquisition section 64 may additionally report information regarding the focal distance Df to the image data output unit 10 as appropriate.
The beam control section 54 adjusts the image laser light 57 in such a manner that the image of an object at a position corresponding to the reported focal distance Df looks clear and that the image of an object at a distant position not corresponding to the focal distance Df looks blurry. Therefore, the beam control section 54 acquires depth information Di corresponding to a display image from the image data output unit 10 through the image laser light source 50. Subsequently, the beam control section 54 collates the depth information Di with the focal distance Df and determines a region of the image plane where the image is clarified and a region of the image plane where the image is blurred.
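The collation of the depth information Di with the focal distance Df can be illustrated with a minimal sketch; the function name and the tolerance value are assumptions for illustration:

```python
import numpy as np

# Illustrative sketch of the collation step: given a per-pixel depth map Di
# and the estimated focal distance Df, split the image plane into a region
# to be clarified and a region to be blurred.

def clarify_mask(depth_map, df, tolerance=0.5):
    """True where an object lies close enough to the focal distance Df
    to be rendered sharply; False where it should look blurry."""
    return np.abs(depth_map - df) <= tolerance
```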
A resolution adjustment principle described in conjunction with the first embodiment can be applied as a method for clarifying and blurring images. More specifically, the beam control section 54 makes adjustments such that the beam state of image laser light representing an image varies from one image region to another. As a result of such adjustments, the resolution can be controlled with the pixels considered as the smallest units in order to express focal point changes that match changes in the crystalline lens. In the present embodiment, the image data output unit 10 may further acquire the information regarding the focal distance Df and vary the resolution from one set of image data to another according to the acquired focal distance information.
For example, the image data output unit 10 decreases, from the beginning, the resolution of a region to be blurred by the beam control section 54, and then generates display image data. As regards the region to be blurred by the adjustments of the beam state of the image laser light, visual recognition is not significantly affected even when an image generation process is simplified to decrease the resolution. Consequently, it is possible to reduce the load on the image generation process in the image data output unit 10 without causing image quality degradation. In this case, the image data output unit 10 collates the depth information Di corresponding to the display image, with the focal distance Df, determines the region where the image generation process can be simplified, and generates the display image.
The input/output interface 28 is connected to a communication section 32, a storage section 34, an output section 36, an input section 38, and a recording medium drive section 40. The communication section 32 establishes communication, for example, with a server. The storage section 34 includes, for example, a hard disk drive or a non-volatile memory. The output section 36 outputs data to the image laser light source 50. The input section 38 receives, as input, data from the eyeball state acquisition section 64. The recording medium drive section 40 drives a removable recording medium such as a magnetic disk, an optical disk, or a semiconductor memory. The communication section 32 includes a peripheral device interface such as a universal serial bus (USB) or Institute of Electrical and Electronics Engineers (IEEE) 1394, and a network interface such as a wired or wireless local area network (LAN).
The CPU 23 controls the whole of the image data output unit 10 by executing an operating system stored in the storage section 34. The CPU 23 also executes various programs that are read from a removable recording medium and loaded into the main memory 26 or that are downloaded through the communication section 32. The GPU 24 functions as a geometry engine and as a rendering processor, performs a drawing process according to a drawing command from the CPU 23, and outputs the result of the drawing process to the output section 36. The main memory 26 includes a random-access memory (RAM) and stores programs and data necessary for processing.
It should be noted that the functional block configuration of the distance acquisition section 60 is not depicted in
For example, in a case where the eyeball is positioned in the laser light path at a distance of 5 cm from the laser light emitting port, the time lag Δt between the emission of reference laser light and the detection of the reflected reference laser light is determined as indicated below.
Δt=1/(3.0×10^8 [m/sec])×0.1 [m]≈0.33 [nsec]
When the frame rate of the projection image is 30 fps (frame/sec), the number P of times the laser pulse is emitted per frame is determined as indicated below.
P=(1/30) [sec]/0.33 [nsec]≈1×10^8 [dots]
When the resolution of the projection image is 1280×720 pixels, the number p of times the reference laser light is emitted per pixel is determined as indicated below.
p=1×10^8 [dots]/(1280×720)[pixel]≈108.5 [dots/pixel]
Ideally, supposing that approximately 500 measurements are required to determine a distance with practical accuracy and that the reflected laser light pulses can be detected by all light-receiving elements, it is sufficient to dispose approximately five light-receiving elements on the light-receiving surface of the distance acquisition section 60.
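The arithmetic above can be verified with a short script; the variable names are illustrative:

```python
# Check of the numbers above: round-trip time of the reference pulse,
# pulses per frame at 30 fps, and pulses per pixel at 1280x720.

C = 3.0e8            # speed of light [m/s]
round_trip = 0.1     # 5 cm to the eyeball and back [m]

dt = round_trip / C                  # ~0.33 nsec per measurement
P = (1.0 / 30) / dt                  # ~1x10^8 pulses per frame
p = P / (1280 * 720)                 # ~108.5 pulses per pixel
```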
When the above-described configuration is adopted, the focal distance acquisition block 140 is able to acquire the data set C regarding distance values at significantly shorter intervals than frame display intervals and acquire the crystalline lens thickness with high temporal resolution at such shorter intervals. Consequently, even if the focal distance changes during a period of time when an image corresponding to one frame is projected, the beam state can be adjusted accordingly, so that the focus can be adjusted appropriately in an image. However, the frequency of beam state adjustments is not specifically limited. The beam state may be adjusted, for example, at one-frame intervals or longer intervals.
It should be noted that the distance acquisition section 60 may measure changes in the crystalline lens thickness by using a method other than the use of a dToF sensor. For example, a camera for capturing the image of an eyeball is adopted as the distance acquisition section 60. Then, the distance acquisition section 60 measures changes in the crystalline lens thickness on the basis of Purkinje-Sanson image changes in a captured image of the eyeball to which the reference laser light is emitted (refer to, for example, Japanese Patent Laid-open No. 2000-139841). In this case, it is sufficient if an additional path is provided such that the reference laser light obliquely enters the crystalline lens.
The image data output unit 10 includes a focal distance acquisition section 142, an image generation section 144, a depth information generation section 146, and an output section 148. The focal distance acquisition section 142 acquires information regarding the focal distance. The image generation section 144 generates a display target image. The depth information generation section 146 generates the depth information regarding an image. The output section 148 outputs the display image data and the depth information. The focal distance acquisition section 142 acquires the information regarding the focal distance from the eyeball state acquisition section 64 continuously, for example, at predetermined intervals.
The image generation section 144 generates frame data of a still image or video to be displayed. In this instance, the image generation section 144 may acquire image data that has been generated in advance, for example, from an external device such as a server or from an internal storage device. Alternatively, the image generation section 144 may draw an image by itself with the use of a program and model data stored, for example, in the internal storage device. The depth information generation section 146 generates the depth information that indicates the distance distribution of objects to be represented in the plane of the image generated by the image generation section 144.
In a case where the image generation section 144 draws an image by itself, the depth information generation section 146 generates the depth information by acquiring, from the image generation section 144, distance information obtained during the drawing process. In a case where an image that has been generated in advance is to be reproduced, the depth information generation section 146 may acquire the depth information that is supplied in association with the relevant image data. Alternatively, the depth information generation section 146 may generate the depth information corresponding to an image that has been generated in advance, for example, by image analysis or deep learning.
In a case where the resolution of the image data itself is to be changed according to changes in the focal distance obtained from the crystalline lens, the image generation section 144 collates the depth information generated by the depth information generation section 146, with the focal distance information acquired by the focal distance acquisition section 142, and reflects the result of the collation in the image generation process. That is, the image generation section 144 generates an image in such a manner that the resolution of an object to be depicted in the image increases as the position of the object is closer to the focal distance (the resolution of the object decreases as the position of the object is farther away from the focal distance). Therefore, the image generation section 144 retains a resolution table indicative of the association between an object distance (depth) based on the focal distance and a target resolution value, for example, in an internal memory.
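A resolution table of the kind described above might be organized as follows. The distance ranges and resolution levels are placeholder values for illustration, not taken from the embodiment:

```python
# Hypothetical resolution table: association between an object's offset from
# the focal plane (|depth - Df|) and a target resolution level, as the image
# generation section 144 might retain it in an internal memory.

RESOLUTION_TABLE = [
    # (upper bound of offset [m], target resolution)
    (0.5,          "High"),    # within 0.5 m of the focal distance
    (2.0,          "Medium"),
    (float("inf"), "Low"),
]

def target_resolution(depth, df):
    """Look up the target resolution for an object at the given depth."""
    offset = abs(depth - df)
    for upper, level in RESOLUTION_TABLE:
        if offset <= upper:
            return level
```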
The image generation section 144 preferably reflects the focal distance information that has just been obtained, in the resolution of the image to be generated next. However, the rate of focal distance acquisition by the eyeball state acquisition section 64 and the rate of resolution distribution update by the image generation section 144 may be set independently of each other. In a case where the focal distance is not to be reflected in the image data, the function of the focal distance acquisition section 142 may be removed. The output section 148 associates data of the image generated by the image generation section 144 with data regarding the depth information generated by the depth information generation section 146, and outputs the resulting data to the image laser light source 50. In a case where a video is to be displayed, the output section 148 outputs data of the video at a predetermined frame rate.
The beam control section 54 includes a beam state determination block 150 and an adjustment block 152. The beam state determination block 150 determines the optimal value of the beam state. The adjustment block 152 adjusts the beam state of the image laser light according to the result of the determination. The beam state determination block 150 collates the depth information acquired through the image laser light source 50, with the focal distance information acquired from the eyeball state acquisition section 64, and thus determines the optimal value of the beam state for each pixel or region of the display image.
More specifically, the beam state determination block 150 determines the beam state in such a manner that the acquired visual acuity for an object to be depicted in an image increases as the position of the object is closer to the focal distance (the acquired visual acuity decreases as the position of the object is farther away from the focal distance). Therefore, the beam state determination block 150 retains a beam state table indicative of the association between an object distance (depth) based on the focal distance and a target beam state value, for example, in an internal memory.
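The per-pixel determination based on such a beam state table can be sketched as below. The table entries (beam diameters and divergence angles) are placeholder values, and the function name is an assumption:

```python
import numpy as np

# Sketch of per-pixel beam state determination: a beam state table maps a
# distance range (offset from Df) to a beam diameter and divergence angle.

BEAM_STATE_TABLE = [
    # (upper bound of offset [m], beam diameter [mm], divergence angle [mrad])
    (0.5,          0.8, 0.2),   # sharpest acquired visual acuity near the focal plane
    (2.0,          1.2, 0.6),
    (float("inf"), 1.8, 1.2),
]

def beam_state_per_pixel(depth_map, df):
    """Return per-pixel (diameter, divergence) arrays for a float depth map."""
    diameter = np.empty_like(depth_map)
    divergence = np.empty_like(depth_map)
    offset = np.abs(depth_map - df)
    # Fill from the widest range inward so tighter ranges overwrite wider ones.
    for upper, dia, div in reversed(BEAM_STATE_TABLE):
        mask = offset <= upper
        diameter[mask] = dia
        divergence[mask] = div
    return diameter, divergence
```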
The adjustment block 152 adjusts the image laser light by means similar to that described in conjunction with the first embodiment, in order to obtain the beam state determined by the beam state determination block 150. However, the adjustment block 152 in the present embodiment identifies which pixel in the display image is formed by the image laser light emitted from the image laser light source 50, and then adjusts the image laser light to obtain the beam state determined for the relevant pixel. It is desirable that, when the distribution of the beam state determined by the beam state determination block 150 changes in response to a change in the focal distance, the adjustment block 152 respond immediately and reflect such a change in the beam state.
For example, in a case where the original resolution is “High,” the image generation section 144 adjusts the resolution only in such a direction as to decrease the resolution. In a case where the original resolution is “Medium,” the image generation section 144 adjusts the resolution in such a direction as to either increase or decrease the resolution depending on the region. In a case where the original resolution is “Low,” the image generation section 144 adjusts the resolution only in such a direction as to increase the resolution.
Further, it is assumed that the axis of “Distance Range” typically represents the depth direction from a virtual viewpoint with respect to an image. However, in a case, for example, where the motion of the user's eyeball can be acquired separately, the distance range may be set in another direction as well. For example, a gaze point detector may be added to the configuration of the image display system, and the distribution of the resolution may be set within the range of horizontal distances with a gaze point in an image as a reference point. The gaze point detector is a well-known device that emits reference light such as infrared rays to the eyeball and that identifies the gaze point from an eyeball motion on the basis of a captured image of the eyeball.
It should be noted that the distance acquisition section 60 and eyeball state acquisition section 64 in the present embodiment may act as the gaze point detector. More specifically, when light-receiving elements are two-dimensionally arrayed on the light-receiving surface of the distance acquisition section 60, the reflected reference laser light is indicated as an image expressing a two-dimensional brightness distribution. Hence, the eyeball state acquisition section 64 may identify the gaze point by acquiring the eyeball motion from the image expressing the two-dimensional brightness distribution.
Alternatively, the distance acquisition section 60 may acquire, as a two-dimensional distribution, the above-mentioned distance to the reflective surface according to the two-dimensional array of the light-receiving elements. Accordingly, the eyeball state acquisition section 64 may acquire the two-dimensional distribution of crystalline lens thicknesses. The crystalline lens thickness is greatest at the center of the crystalline lens, which corresponds to the optical axis, and decreases toward the edge of the crystalline lens. Hence, the eyeball state acquisition section 64 may detect the eyeball motion on the basis of changes in the two-dimensional distribution of crystalline lens thicknesses and may thus identify the gaze point.
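Detecting eyeball motion from the thickness distribution could be sketched as follows: since the thickness peaks at the lens center on the optical axis, tracking the peak between successive maps gives the shift of the gaze direction. Function names are assumptions for this sketch:

```python
import numpy as np

# Illustrative sketch: eyeball motion from the two-dimensional distribution
# of crystalline lens thicknesses measured by the light-receiving array.

def lens_center(thickness_map):
    """Array index of the thickness maximum, i.e., the lens center."""
    return np.unravel_index(np.argmax(thickness_map), thickness_map.shape)

def gaze_shift(prev_map, curr_map):
    """Displacement of the lens center between two successive measurements."""
    py, px = lens_center(prev_map)
    cy, cx = lens_center(curr_map)
    return (cy - py, cx - px)
```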
According to the settings indicated in
Accordingly, the image generation section 144 does not have to place unnecessary processing load on a region that is to be finally blurred by the beam state adjustment. Further, when a region close to a position corresponding to the focal distance is surely provided with high resolution in the image data, the effect of the beam state adjustment can clearly be expressed without being negated. It should be noted that numerical values depicted in
As indicated, for example, in
In a case where both the beam diameter and the divergence angle are to be adjusted, both of them are regarded as variables. Then, experiments or calculations are performed to determine the combination of the beam diameter and the divergence angle in such a manner that the distribution of resolution can visually be recognized without a sense of discomfort, and the association between the beam diameter and the divergence angle is defined. The beam state determination block 150 references the beam state table 170 to determine the optimal beam state for each pixel according to the focal distance which is acquired by the focal distance acquisition block 140 of the eyeball state acquisition section 64 on a real time basis, and to the depth information outputted from the image data output unit 10. Then, the beam state determination block 150 reports the determined optimal beam state to the adjustment block 152.
It should be noted that the contents of the beam state table 170 may be changed according to the user's unaided visual acuity. As described in conjunction with the first embodiment, the beam state where the highest resolution is visually recognized varies with the user's unaided visual acuity. Therefore, when the constituent elements used in the present embodiment are combined with those used in the first embodiment, setup can be made such that an object placed at the focal distance is visually recognized with the highest resolution irrespective of unaided visual acuity, and that, with such a state described above as a reference state, a distant object looks appropriately blurry.
In the above case, the beam state determination block 150 retains a plurality of beam state tables associated with unaided visual acuity, for example, in an internal memory. Then, the beam state determination block 150 first acquires the user's unaided visual acuity as described in conjunction with the first embodiment, and reads out the beam state table associated with the acquired user's unaided visual acuity. Subsequently, the adjustment block 152 uses the read-out beam state table to adjust the beam state according to the focal distance.
The setup of “Distance Range” in the depicted beam state table 170 may be made only in the depth direction as described in conjunction with the resolution table 160 depicted in
In the above example, the data 182 regarding the depth information is in the form of a depth image in which the distance values of an object appearing in the frame 180 are expressed as pixel values with brightness that increases with a decrease in the distance values. However, the data format of depth information is not limited to a particular format.
The beam control section 54 references the beam state table according to the information regarding the user's focal distance that has just been obtained, and adjusts the beam state in such a manner that the resolution (acquired visual acuity) varies with the distance from a position corresponding to the focal distance. As a result, while the user focuses on a car 186a in front, the car 186a and its surroundings are clearly visible as indicated in a display image 184a, but a car 186b in the back and its surroundings look blurry. While the user focuses on the car 186b in the back, the car 186b and its surroundings are clearly visible as indicated in a display image 184b, but the car 186a in front and its surroundings look blurry.
It should be noted that, in the depicted example, resolution adjustments are not made to the data of the frame 180 that is outputted from the image data output unit 10. However, as mentioned earlier, the distribution may be given to the resolution of the data of the frame 180 as depicted in
According to the present embodiment described above, in the image display system which projects an image onto the retina by using the laser beam scanning method, the reference laser light is emitted to the eyeball in parallel with emission of the image laser light, and the reflected reference laser light is detected, thereby acquiring the state of the eyeball. More specifically, the image display system measures the thickness of the user's crystalline lens and estimates the focal distance in real time on the basis of the measurement result. The image display system controls the beam state on the basis of the estimated focal distance in such a manner that the resolution (acquired visual acuity) varies in the image plane. Thus, an appearance similar to the appearance in the real world can be reproduced on a display. This can resolve the convergence-accommodation conflict.
The present disclosure has been described above on the basis of the embodiments. It is to be understood by persons skilled in the art that the foregoing embodiments are illustrative, that a combination of the constituent elements and processes described in conjunction with the foregoing embodiments can be variously modified, and that such modifications can be made without departing from the spirit and scope of the present disclosure.
For example, in the first embodiment, the reference laser light is emitted to measure the numerical values of the crystalline lens thickness and other structural properties of the eye, and then, the unaided visual acuity is estimated to determine the optimal beam state. Alternatively, however, the unaided visual acuity may also be acquired by other means. For example, the unaided visual acuity may be inputted by the user through an undepicted input device. In such a case, for example, the image data output unit 10 may use the image laser light to display an unaided visual acuity input screen prompting the user to input data, and the eyeball state acquisition section 64 may receive the data inputted by the user. As long as the same process is subsequently performed as described in conjunction with the first embodiment, the resulting effect is similar to the effect obtained in the first embodiment.
Further, in the second embodiment, the reference laser light is emitted to measure changes in the crystalline lens thickness and estimate the focal distance, and the beam state is thus changed. Alternatively, however, in a case where the image display system includes the gaze point detector as mentioned earlier, the beam state may be changed only on the basis of the position of the gaze point in the image plane. In this case, for example, the beam control section 54 adjusts the beam state in such a manner that the highest acquired visual acuity is obtained in a region within a predetermined range including the gaze point in the image plane, and that the acquired visual acuity decreases with an increase in the distance from the gaze point.
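The gaze-point-only variant admits a minimal sketch: the acquired visual acuity is highest within a region around the gaze point and falls off with distance from it in the image plane. The radius and falloff shape here are illustrative assumptions:

```python
import numpy as np

# Sketch of the gaze-point-based adjustment: a per-pixel relative acquired
# visual acuity that is 1.0 within the gaze region and decreases with
# distance from the gaze point.

def acuity_map(width, height, gaze_x, gaze_y, radius=20.0):
    """Relative acquired visual acuity per pixel, peaking at the gaze point."""
    ys, xs = np.mgrid[0:height, 0:width]
    dist = np.hypot(xs - gaze_x, ys - gaze_y)
    excess = np.maximum(dist - radius, 0.0)   # full acuity inside the radius
    return 1.0 / (1.0 + excess / radius)
```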
The above alternative can also give a user a sense that a spot viewed by the user is in focus, and makes it possible to reduce discomfort caused by the convergence-accommodation conflict, in a simplified manner. It should be noted that, as mentioned earlier, the distance acquisition section 60 and the eyeball state acquisition section 64 may have the function of the gaze point detector. Also, in this case, the image data output unit 10 may generate image data by changing the distribution of resolution in the image plane in correspondence with the beam state adjustment. This reduces the load on the image generation process without significantly affecting visual recognition.
Number | Date | Country | Kind |
---|---|---|---|
2022-091633 | Jun 2022 | JP | national |