This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2023-108563, filed on Jun. 30, 2023, the entire contents of which are incorporated herein by reference.
The present disclosure relates to a display device.
Conventionally, there is known a technique of displaying a three-dimensional virtual image by emitting a right-eye image and a left-eye image having parallax with each other via a light guide plate.
A related technique is disclosed in US 2021/0294101 A.
However, in the related art, a right-eye waveguide is installed at a position corresponding to the right eye, and a left-eye waveguide is installed at a position corresponding to the left eye. Therefore, in a case where the position of the user's eyes shifts to the left or right across the center, both the right eye and the left eye may be located at positions corresponding to the right-eye waveguide, or both the left eye and the right eye may be located at positions corresponding to the left-eye waveguide, and there is an issue that a three-dimensional virtual image cannot be appropriately displayed.
An object of the present disclosure is to provide a display device capable of appropriately displaying a three-dimensional virtual image even when the positions of the user's eyes change.
According to the present disclosure, a display device includes an image light emitter, a light guide body, a memory and a processor. The image light emitter emits image light beams of a left-eye image and a right-eye image. The light guide body has an emission surface from which light incident from the image light emitter is emitted. The processor is coupled to the memory and configured to control the image light emitter. The light guide body includes a holographic element, and a light guide portion. The image light beams emitted from the image light emitter are incident on the holographic element. The light guide portion encloses the holographic element. The processor is configured to: identify positions of both eyes of a user based on a captured image of an imager that captures both eyes of the user; and adjust positions where the image light beams are incident on the holographic element according to the positions of the both eyes of the user.
Among the accompanying drawings are diagrams illustrating a relationship between a pupillary distance and a propagation distance.
Hereinafter, a display device according to an embodiment of the present disclosure will be described in detail with reference to the accompanying drawings.
As illustrated in
The display device 1 is a light guide plate type hologram display, and projects a right-eye image and a left-eye image emitted from the image light emitting unit 10 via the light guide body 20 to display a three-dimensional virtual image. For example, by disposing the display device 1 on a dashboard of a vehicle, the display device 1 can also be used as a light-guide plate type hologram HUD (head-up display). Hereinafter, a specific configuration of the display device 1 will be described.
The image light emitting unit 10 emits image light beams of a left-eye image (two-dimensional image) and a right-eye image (two-dimensional image). For example, the image light emitting unit 10 includes a small display device that displays a left-eye image and a right-eye image.
The light guide body 20 has an emission surface that emits the image light beams incident from the image light emitting unit 10. The image light beams emitted from the emission surface are guided to both eyes of the user, and the user can visually recognize a three-dimensional virtual image. The light guide body 20 includes a holographic element 40 on which light emitted from the image light emitting unit 10 is incident and a light guide portion 50 enclosing the holographic element 40.
As illustrated in
In this case, each of the incident holographic element 41, the turning holographic element 42, and the emission holographic element 43 is configured as a transmission diffractive optical element.
In the present embodiment, the case where the holographic element 40 includes three holographic elements is illustrated, but the present invention is not limited thereto. For example, the holographic element 40 may include two holographic elements. Specifically, the holographic element 40 may include at least two of the incident holographic element 41, the turning holographic element 42, and the emission holographic element 43.
In this case, when the holographic element 40 includes two holographic elements of the incident holographic element 41 and the turning holographic element 42, the turning holographic element 42 may have the function of the emission holographic element 43.
In addition, in a case where the holographic element 40 includes two holographic elements of the turning holographic element 42 and the emission holographic element 43, the turning holographic element 42 may have the function of the incident holographic element 41.
In addition, in a case where the holographic element 40 includes two holographic elements of the incident holographic element 41 and the emission holographic element 43, the incident holographic element 41 may have the function of the turning holographic element 42.
In the example of
The image light beams incident on the turning holographic element 42 are diffracted while being expanded (enlarged) in a light propagation direction (x direction in the example of
Further, the image light incident on the emission holographic element 43 is diffracted while being expanded (enlarged) in a light propagation direction (y direction in the example of
In addition, the light guide portion 50 of the present embodiment includes a pair of (two) glass plates facing each other, and the incident holographic element 41, the turning holographic element 42, and the emission holographic element 43 are sandwiched between the pair of glass plates.
Returning to
More specifically, the control unit 30 determines the emission positions at which the image light beams from the emission holographic element 43 are emitted from the emission surface 50S of the light guide portion 50 on the basis of the positions of both eyes of the user. That is, the control unit 30 determines the emission position of the image light beam corresponding to the left-eye image so that the image light beam corresponding to the left-eye image is incident on the pupil of the left eye of the user, and determines the emission position of the image light beam corresponding to the right-eye image so that the image light beam corresponding to the right-eye image is incident on the pupil of the right eye of the user.
Then, the control unit 30 determines the turning positions at which the image light beams are emitted from the turning holographic element 42 on the basis of the emission positions of the image light beams from the emission surface 50S of the light guide portion 50.
Furthermore, the control unit 30 determines the incident positions where the image light beams are incident on the incident holographic element 41 on the basis of the turning positions for each of the image light beam corresponding to the left-eye image and the image light beam corresponding to the right-eye image.
Then, the control unit 30 performs adjustment so that the image light beams (the image light beam corresponding to the left-eye image and the image light beam corresponding to the right-eye image) emitted by the image light emitting unit 10 are incident on the determined incident positions. Furthermore, the control unit 30 can adjust positions where the image light beams are incident on the holographic element 40 according to the positions of both eyes of the user and control the image light emitting unit 10 to change the left-eye image and the right-eye image. Hereinafter, a specific configuration of the control unit 30 will be described.
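For reference, the determination flow just described can be outlined by the following minimal sketch before turning to that configuration. All function names are hypothetical, and each geometric step is reduced to a placeholder; the actual offsets (α1, β1, α2, β2) are derived later in this description.

```python
# Minimal sketch of the determination order performed by the control unit 30.
# The function names are hypothetical and the geometry is reduced to
# placeholders; the real offsets follow from the relations described below.

def identify_eye_positions(captured_image):
    # Placeholder: an eye tracker would return pupil coordinates PRe and PLe.
    return (100.0, 40.0), (165.0, 40.0)

def determine_emission_positions(p_re, p_le):
    # Emission positions PRo and PLo on the emission surface 50S.
    return p_re, p_le

def determine_turning_positions(p_ro, p_lo):
    # Turning positions PRt and PLt (y offsets alpha1 / beta1 applied here).
    return p_ro, p_lo

def determine_incident_positions(p_rt, p_lt):
    # Incident positions PRi and PLi (x offsets alpha2 / beta2 applied here).
    return p_rt, p_lt

def adjust_emitter(p_ri, p_li):
    print("adjust the image light emitting unit 10 so that light enters", p_ri, p_li)

p_re, p_le = identify_eye_positions(captured_image=None)
p_ro, p_lo = determine_emission_positions(p_re, p_le)
p_rt, p_lt = determine_turning_positions(p_ro, p_lo)
p_ri, p_li = determine_incident_positions(p_rt, p_lt)
adjust_emitter(p_ri, p_li)
```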
In the present embodiment, the control unit 30 is configured as a so-called computer device.
As illustrated in
The processor 301 is, for example, a central processing unit (CPU). The processor 301 executes the program to integrally control the operation of the control unit 30 and implement various functions of the control unit 30. Various functions of the control unit 30 will be described later.
The ROM 302 is a non-volatile memory, and stores various types of information including programs and the like executed by the processor 301.
The RAM 303 is a volatile memory that provides a work area for the processor 301.
The device I/F unit 304 is an interface for connecting to an external device (for example, the image light emitting unit 10, the imaging unit 60, and the like).
The bus 305 communicably connects the processor 301, the ROM 302, the RAM 303, and the device I/F unit 304.
As illustrated in
The binocular position identifying unit 310 identifies the positions of both eyes of the user based on the captured image captured by the imaging unit 60. More specifically, the binocular position identifying unit 310 measures coordinates of the pupil positions of the right eye and the left eye as right-eye coordinates PRe and left-eye coordinates PLe by eye tracking using the imaging unit 60. In the example of
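The embodiment does not fix a particular eye-tracking algorithm. As one conventional sketch only, OpenCV's Haar cascade detector can return approximate eye positions from the captured image of the imaging unit 60; an actual pupil tracker would refine these to the pupil centers PRe and PLe.

```python
# One possible (assumed) way to obtain rough eye positions from the captured
# image of the imaging unit 60; not the method prescribed by the embodiment.
import cv2

def eye_centers(bgr_image):
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
    eyes = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    centers = sorted((x + w / 2.0, y + h / 2.0) for (x, y, w, h) in eyes)
    # Assumption: the first two detections (sorted by x in image coordinates)
    # correspond to the user's two eyes.
    return centers[:2]
```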
The emission position determination unit 311 determines emission positions at which the image light beams are emitted from the emission holographic element 43 on the basis of the positions of both eyes of the user identified by the binocular position identifying unit 310.
In the example of
In the example of
The turning position determination unit 312 illustrated in
In the example of
In addition, it is assumed that the light emitted from the left-eye turning position of the turning holographic element 42 to the emission holographic element 43 passes through the emission holographic element 43 inside the light guide portion 50 and reaches the left-eye emission position PLo of the emission holographic element 43 while repeating total reflection.
That is, the right-eye emission position PRo=(x1, y1) is a position where the light emitted from the right-eye turning position PRt of the turning holographic element 42 to the emission holographic element 43 reaches the emission holographic element 43 by repeating total reflection in the light propagation direction (y direction).
Here, the right-eye turning position can be expressed as PRt=(x1, y1+α1). α1 is a distance corresponding to the number of total reflections (in the example of
Similarly, the left-eye emission position PLo=(x2, y2) is a position that the light emitted from the left-eye turning position PLt of the turning holographic element 42 to the emission holographic element 43 reaches by repeating total reflection in the light propagation direction (y direction).
Here, the left-eye turning position can be expressed as PLt=(x2, y2+β1). β1 is a distance corresponding to the number of total reflections (in the example of
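A small numeric sketch of this turning-position calculation follows. It assumes that the lateral shift per total reflection cycle is 2·d·tan(θ) (d: plate thickness, θ: propagation angle measured from the surface normal), which is a common waveguide approximation and not a value stated in the embodiment; the numbers are example values only.

```python
# Sketch of the turning-position calculation: the turning position lies
# upstream of the emission position in the y direction by
# alpha1 = n_bounces * (per-bounce lateral shift).  The per-bounce shift
# 2*d*tan(theta) is an assumed approximation.
import math

def per_bounce_shift(d_mm, theta_deg):
    return 2.0 * d_mm * math.tan(math.radians(theta_deg))

def turning_position(emission_xy, n_bounces, d_mm, theta_deg):
    x, y = emission_xy
    alpha1 = n_bounces * per_bounce_shift(d_mm, theta_deg)
    return (x, y + alpha1)

p_ro = (100.0, 40.0)  # right-eye emission position PRo in mm (example value)
print(turning_position(p_ro, n_bounces=3, d_mm=3.0, theta_deg=50.0))
```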
The incident position determination unit 313 illustrated in
In the example of
That is, the right-eye turning position PRt=(x1, y1+α1) is a position that the light emitted from the right-eye incident position PRi of the incident holographic element 41 to the turning holographic element 42 reaches by repeating total reflection in the light propagation direction (x direction), and the right-eye incident position can be expressed as PRi=(x1+α2, y1+α1). Here, α2 is a distance corresponding to the number of total reflections (in the example of
Similarly, the left-eye turning position PLt=(x2, y2+β1) is a position that the light emitted from the left-eye incident position of the incident holographic element 41 to the turning holographic element 42 reaches by repeating total reflection in the light propagation direction (x direction), and the left-eye incident position can be expressed as PLi=(x2+β2, y2+β1). Here, β2 is a distance corresponding to the number of total reflections (in the example of
Here, if the thickness of the light guide portion 50 (size in the z direction in the example of
Based on the above-described incident condition, it can be considered that a position shifted in the x direction by c×4×n (n: natural number) from the right-eye incident position PRi is the right-eye turning position PRt (α2=−c×4×n).
Similarly, it can be considered that a position shifted in the x direction by c×4×n from the left-eye incident position PLi becomes the left-eye turning position PLt (β2=−c×4×n). Note that, in a case where n cannot be expressed as a natural number, the right-eye incident position PRi and the left-eye incident position PLi may be adjusted so as to be a natural number.
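The incident-position calculation can be sketched as follows using the c×4×n relation above. Snapping n to a natural number by moving the incident position is one possible reading of the adjustment just mentioned, and the extent of the incident holographic element is an assumed example.

```python
# Sketch of the incident-position calculation: the incident position is offset
# from the turning position by -c*4*n in the x direction (n: natural number),
# with c the distance defined from the thickness of the light guide portion 50.
import math

def incident_position(turning_xy, c, x_min, x_max):
    """Return (incident_xy, n) with the incident x inside [x_min, x_max]."""
    x_t, y_t = turning_xy
    # smallest natural n that brings the incident position onto the element
    n = max(1, math.ceil((x_t - x_max) / (4.0 * c)))
    x_i = x_t - 4.0 * c * n
    if x_i < x_min:
        raise ValueError("incident holographic element cannot be reached")
    return (x_i, y_t), n

p_rt = (100.0, 65.0)  # right-eye turning position PRt in mm (example value)
print(incident_position(p_rt, c=2.5, x_min=0.0, x_max=30.0))
```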
Returning to
As this adjustment method, various methods can be used. For example, in a case where the image light emitting unit 10 is a small display device, the incident positions of the image light beams on the incident holographic element 41 can be adjusted by changing the display positions of the right-eye image and the left-eye image displayed on the small display device. Furthermore, for example, the incident positions of the image light beams on the incident holographic element 41 can be adjusted by moving the small display device without changing the display position of the image on the small display device.
Furthermore, the image light emitting unit 10 may be, for example, a projector such as a liquid crystal on silicon (LCOS) using a laser light source, and can also adjust the emission directions of the image light beams so that the image light beams emitted from the projector enter the incident positions PRi and PLi of the incident holographic element 41. Furthermore, the image light emitting unit 10 may be in a mode of generating a right-eye image and a left-eye image using, for example, a computer-generated hologram (CGH) technology, and can adjust the emission directions of the image light beams so that the image light beams generated using the CGH are incident on the incident positions of the incident holographic element 41.
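As a sketch of the first adjustment method (shifting the display positions of the right-eye and left-eye images on the small display), a linear pixel-per-millimeter mapping between the display plane and the incident holographic element is assumed below; the constant k_px_per_mm is a hypothetical property of the projection optics and is not given in the embodiment.

```python
# Sketch of adjusting the display position so that the image light reaches the
# determined incident position PRi or PLi. The linear mapping is an assumption.

def display_offset_px(target_incident_x_mm, nominal_incident_x_mm, k_px_per_mm):
    # Positive result: shift the displayed image by this many pixels in +x.
    return round((target_incident_x_mm - nominal_incident_x_mm) * k_px_per_mm)

# Example: the right-eye image nominally lands at x = 20 mm on the incident
# holographic element; the newly determined incident position is x = 30 mm.
print(display_offset_px(30.0, 20.0, k_px_per_mm=12.0))  # -> 120 px shift
```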
The image light change control unit 315 controls the image light emitting unit 10 to change the left-eye image and the right-eye image according to the positions of both eyes of the user identified by the binocular position identifying unit 310. Note that the image light change control unit 315 may be omitted.
As described above, the display device 1 of the present embodiment adjusts the positions where the image light beams are incident on the holographic element 40 such that the image light beams emitted from the image light emitting unit 10 are guided to both eyes according to the positions of both eyes of the user, and thus, it is possible to appropriately display a three-dimensional virtual image even if the positions of both eyes of the user change.
Although embodiments of the present disclosure have been described above, these embodiments described above have been presented as examples, and are not intended to limit the scope of the invention. These novel embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These novel embodiments and modifications thereof are included in the scope and gist of the invention and are included in the invention described in the claims and the equivalent scope thereof.
Furthermore, the effects of the embodiments described in the present specification are merely examples and are not limited, and other effects may be provided.
Hereinafter, modifications will be described.
For example, as illustrated in
The light shielding member 70 includes a plurality of regions 71 and 72 that can transition between a state where light is transmitted and a state where light is blocked. The region 71 corresponds to a user A (corresponds to both eyes of the user A), and the region 72 corresponds to a user B (corresponds to both eyes of the user B). The setting positions of these regions 71 and 72 can be changed according to the positions of the corresponding user's eyes.
The light shielding member 70 is disposed on the emission surface side of the emission holographic element 43.
Similarly, the light shielding member 80 also includes a plurality of regions 81 and 82 that can transition between a state where light is transmitted and a state where light is blocked. The region 81 corresponds to the user A (corresponds to both eyes of the user A), and the region 82 corresponds to the user B (corresponds to both eyes of the user B). The light shielding member 80 is disposed between the turning holographic element 42 and the emission holographic element 43.
In the form of
Similarly, in the form of
As described above, the control unit 30 controls the light shielding member so that the plurality of regions on the light shielding member, which correspond to the plurality of users on a one-to-one basis, transmit light in a time division manner, and controls the image light emitting unit 10 so that the image light beams for each user (image light according to the positions of both eyes of that user) are also emitted in a time division manner. This makes it possible to prevent crosstalk in which an image intended for the other eye enters one eye.
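The time-division control can be sketched as follows. The LightShield and Emitter classes are hypothetical stand-ins for the light shielding members 70/80 and the image light emitting unit 10, and the sub-frame period is an assumed example value.

```python
# Sketch of time-division control for multiple users: in each sub-frame, only
# the region for one user transmits light and only that user's images are emitted.
import itertools
import time

class LightShield:
    def open_only(self, region_id):
        print(f"shield: transmit region for user {region_id}, block the others")

class Emitter:
    def emit_for(self, user_id):
        print(f"emitter: emit left/right images for user {user_id}")

def run_time_division(users, frame_period_s=1 / 120):
    shield, emitter = LightShield(), Emitter()
    for user_id in itertools.islice(itertools.cycle(users), 6):  # a few sub-frames
        shield.open_only(region_id=user_id)   # e.g. region 71 for user A
        emitter.emit_for(user_id)             # image light for that user only
        time.sleep(frame_period_s)            # next user in the next sub-frame

run_time_division(["A", "B"])
```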
In the incident holographic element 41, the size of the incident region (incident surface), which indicates the area where the image light beams are incident, is desirably at least equal to or larger than the sizes of the right eye and the left eye. This is because, when the size of the incident surface is smaller than the sizes of the right eye and the left eye, a plurality of image light beams enter the pupil, and crosstalk may occur.
For example, a propagation distance indicating an interval of the image light beams emitted from the emission surface of the light guide portion 50 may be set to a value at which crosstalk does not occur according to a pupillary distance indicating a distance between a pupil of a right eye and a pupil of a left eye of a user.
For example, as illustrated in
For example, in a case where the pupillary distance is assumed to be 60 to 70 mm, the propagation distance is set to a value larger than 35 mm and smaller than 60 mm.
Further, in a case where the pupillary distance is assumed to be 50 to 80 mm, the propagation distance is set to a value larger than 40 mm and smaller than 50 mm.
Here, the propagation distance LP is determined according to the thickness d of the light guide portion 50. More specifically, for light guide portions 50 of the same material (that is, the same critical angle of total reflection), the propagation distance LP becomes shorter as the thickness d of the light guide portion 50 is smaller, and becomes longer as the thickness d of the light guide portion 50 is larger.
Therefore, since the pupillary distance is generally said to be 60 to 70 mm, the occurrence of crosstalk can be suppressed by setting the propagation distance LP to a value larger than 35 mm and smaller than 60 mm as described above. In addition, to also cover cases where the pupillary distance is less than 60 mm or more than 70 mm, the pupillary distance may be assumed to be 50 to 80 mm; in that case, the occurrence of crosstalk can be more reliably suppressed by setting the propagation distance LP to a value larger than 40 mm and smaller than 50 mm.
As described above, when the pupillary distance is 60 to 70 mm, the thickness d of the light guide portion 50 may be set such that the propagation distance LP becomes a value larger than 35 mm and smaller than 60 mm. In addition, when the pupillary distance is 50 to 80 mm, the thickness d of the light guide portion 50 may be set such that the propagation distance LP becomes a value larger than 40 mm and smaller than 50 mm. In other cases, the thickness of the light guide portion 50 is similarly set so as to obtain a desired propagation distance.
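The two numeric ranges above follow from requiring that the propagation distance LP be smaller than the minimum pupillary distance while 2·LP exceeds the maximum pupillary distance. The sketch below reproduces those ranges; the relation LP = 2·d·tan(θ) used to back out a plate thickness is the usual thin-waveguide approximation and is an assumption here (θ: propagation angle from the surface normal).

```python
# Crosstalk condition: LP < min pupillary distance and 2*LP > max pupillary distance.
import math

def lp_range(ipd_min_mm, ipd_max_mm):
    return ipd_max_mm / 2.0, ipd_min_mm          # open interval (lower, upper)

def thickness_for_lp(lp_mm, theta_deg):
    # Assumed waveguide geometry: LP = 2 * d * tan(theta).
    return lp_mm / (2.0 * math.tan(math.radians(theta_deg)))

print(lp_range(60, 70))  # -> (35.0, 60): LP larger than 35 mm and smaller than 60 mm
print(lp_range(50, 80))  # -> (40.0, 50): LP larger than 40 mm and smaller than 50 mm
print(round(thickness_for_lp(45.0, 55.0), 1), "mm plate thickness (assumed geometry)")
```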
For example, the light guide portion 50 may further include a reflection member disposed in a portion facing the incident holographic element 41 and the turning holographic element 42.
For example, as illustrated in
In this case, assuming that the propagation angle of the image light beam in the portion of the light guide portion 50 where the mirror coat is disposed is θ1 and the propagation angle of the image light beam in the portion of the light guide portion 50 where the mirror coat is not disposed is θ2 (=the propagation angle at the time of total reflection), for example, θ1>θ2 may be satisfied.
In other words, in the portion where the mirror coat is disposed, any propagation angle can be set without being restricted by the critical angle of total reflection that applies to the portion of the light guide portion 50 where the mirror coat is not disposed.
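A small sketch of this point: without the mirror coat, propagation relies on total internal reflection, so the propagation angle (measured here from the surface normal) must be at least the critical angle arcsin(1/n) of the glass/air interface; with the mirror coat, that constraint disappears. The refractive index is an assumed example value.

```python
# Without a mirror coat, propagation requires total internal reflection; with a
# mirror coat, any propagation angle can be used.
import math

def critical_angle_deg(n_glass=1.5):
    return math.degrees(math.asin(1.0 / n_glass))     # about 41.8 deg for n = 1.5

def propagates(theta_from_normal_deg, mirror_coated, n_glass=1.5):
    if mirror_coated:
        return True                                    # no critical-angle limit
    return theta_from_normal_deg >= critical_angle_deg(n_glass)

print(round(critical_angle_deg(), 1))
print(propagates(30.0, mirror_coated=False), propagates(30.0, mirror_coated=True))
```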
For example, the display device may also be configured such that, for the left-eye image light region and the right-eye image light region corresponding to the incident image light beams, the emission region on the emission surface of the emission holographic element is enlarged and the regions are emitted in a state where they are rotated by a predetermined angle based on the eye movement frequency predicted according to the viewing usage.
JP 2011-180178 A discloses a configuration in which an image light beam is incident on a user at a steep angle and an image light beam is incident at a shallow angle.
Based on this disclosure, the inventor has conceived a configuration in which the left-eye image light region and the right-eye image light region corresponding to the incident image light beams are emitted in a state where they are rotated by a predetermined angle on the emission surface of the emission holographic element based on the eye movement frequency predicted according to the viewing usage.
As illustrated in
As illustrated in
For example, in a case where the predicted eye movement frequency is larger in the vertical direction (y-direction when viewed from the emission surface 50S of the light guide body 20 (the light guide portion 50)) than in the horizontal direction (x-direction when viewed from the emission surface 50S of the light guide portion 50) as in the case of viewing a vertically long display, as illustrated in
Further, for example, in a case where the predicted movement frequency of eyes of the user is substantially equal in the vertical direction and in the horizontal direction, as illustrated in
Furthermore, for example, in a case where the predicted movement frequency of eyes of the user is larger in the horizontal direction than in the vertical direction, as illustrated in
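One way to select the rotation angle from the predicted eye-movement frequencies is sketched below. The specific angles assigned to the three cases and the threshold ratio are assumed examples for illustration only, since the corresponding figures are not reproduced here.

```python
# Sketch of choosing a rotation angle for the image light regions from the
# predicted eye-movement frequencies; the angle mapping is an assumed example.

def region_rotation_deg(vertical_freq, horizontal_freq, ratio=1.2):
    if vertical_freq > horizontal_freq * ratio:
        return 90.0    # mostly vertical movement (e.g. a vertically long display)
    if horizontal_freq > vertical_freq * ratio:
        return 0.0     # mostly horizontal movement: keep regions side by side
    return 45.0        # roughly equal movement in both directions

print(region_rotation_deg(vertical_freq=8.0, horizontal_freq=2.0))   # -> 90.0
```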
In the above description, the emission images at one emission position PRo or PLo of the emission holographic element 43 corresponding to the position of the user's eyes have been described, but the same image is emitted from a plurality of emission positions of the emission holographic element 43.
Therefore, in the sixth modification, the emission holographic element 43 is formed such that an emission image at an emission position of the emission holographic element corresponding to the position of the user's eye and an emission image at an emission position adjacent to that emission position are emitted continuously on the emission surface 50S of the light guide body 20 (the light guide portion 50).
That is, for example, the size of the image light beam incident from the image light emitting unit 10 on the incident holographic element 41 is set such that the right-eye image light region formed by being emitted from the emission position of the emission holographic element 43 corresponding to the position of the user's right eye and another right-eye image light region formed by being emitted from the emission position adjacent to the emission position are continuously displayed.
As illustrated in
Therefore, this is effectively equivalent to expansion of the right-eye image light region AGR (=AGR1+AGR2), and the control unit 30 can suppress the switching frequency of the right-eye image.
Similarly, the left-eye image light region AGL1 corresponding to the emission image at the emission position of the emission holographic element corresponding to the position of the user's eye and the left-eye image light region AGL2 corresponding to the emission image at the adjacent emission position are continuously emitted on the emission surface of the light guide body (light guide portion). This is effectively equivalent to expansion of the left-eye image light region AGL (=AGL1+AGL2), and the control unit 30 can suppress the switching frequency of the left-eye image.
As a result, the positions of the pupils of the user hardly deviate from the corresponding left-eye image light region AGL and right-eye image light region AGR, and the control unit 30 can suppress the switching frequency of the left-eye image and the right-eye image.
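The benefit of the merged region can be illustrated with a small sketch: the control unit 30 only needs to switch the image once the pupil leaves the combined region AGR1+AGR2 (or AGL1+AGL2), which happens less often than leaving a single region. The widths and positions are assumed example values in millimeters.

```python
# Sketch: with the merged region (AGR1 + AGR2), the pupil stays inside longer,
# so the control unit 30 switches the right-eye (or left-eye) image less often.

def needs_switch(pupil_x, region_start, region_width, merged=True):
    width = region_width * (2 if merged else 1)      # AGR1 + AGR2 when merged
    return not (region_start <= pupil_x <= region_start + width)

print(needs_switch(58.0, region_start=40.0, region_width=10.0, merged=False))  # True
print(needs_switch(58.0, region_start=40.0, region_width=10.0, merged=True))   # False
```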
In addition, similarly to the fifth modification, in a case where it is assumed that a virtual plane PLN1 parallel to the emission surface 50S of the light guide portion 50 is disposed in front of the user's face, the emission holographic element 43 may be configured such that the left-eye image light region AGL and the right-eye image light region AGR corresponding to the incident image light beams are emitted on the emission surface of the emission holographic element 43 in a state where they are rotated by a predetermined angle β.
According to this configuration, the effect of the fifth modification can be obtained in addition to the effect of the sixth modification.
In addition, in the above-described embodiment, an example has been described in which the incident holographic element 41, the turning holographic element 42, and the emission holographic element 43 are transmission type holograms that transmit and diffract light, but the incident holographic element 41, the turning holographic element 42, and the emission holographic element 43 can also be formed of reflection type holograms that reflect and diffract light.
The above-described embodiment can be arbitrarily combined with the above-described modifications, or the above-described modifications may be arbitrarily combined.
Furthermore, the display device 1 according to the above-described embodiment can also be used as a head mounted display, an eyeglass-type display, or the like.
The following technical aspects are disclosed by the above description of the embodiment.
A display device including:
The display device according to First Aspect, wherein
The display device according to First Aspect or Second Aspect, wherein
The display device according to Third Aspect, wherein
The display device according to Fourth Aspect, wherein
The display device according to Fifth Aspect, wherein
The display device according to Fifth Aspect, wherein
The display device according to any one of First Aspect to Seventh Aspect, wherein
The display device according to any one of First Aspect to Eighth Aspect, wherein
The display device according to any one of Third Aspect, Fourth Aspect, and Sixth Aspect, wherein
The display device according to Tenth Aspect, wherein
The display device according to any one of Third Aspect, Fourth Aspect, and Sixth Aspect, wherein
The display device according to Twelfth Aspect, wherein
The display device according to any one of Third Aspect, Fourth Aspect, and Sixth Aspect, wherein
The display device according to any one of Third Aspect, Fourth Aspect, and Sixth Aspect,
According to the present disclosure, a three-dimensional virtual image can be appropriately displayed even when the positions of the user's eyes change. Note that the effects described herein are not necessarily limited, and may be any of the effects described in the present specification.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the methods and systems described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.