The present disclosure relates to an image display device and a head-up display system including the image display device to display a virtual image.
Conventionally, a head-up display system includes an image display device that displays an image. Such a head-up display system is a vehicle information projection system that performs augmented reality (AR) display using the image display device. For example, the head-up display system projects light representing a virtual image onto a windshield of a vehicle, allowing a driver to visually recognize the virtual image together with a real view of the outside world of the vehicle.
As a device for displaying a virtual image, U.S. Pat. No. 10,429,645 describes an optical element including a waveguide (light guide body) for expanding an exit pupil in two directions. The optical element may utilize a diffractive optical element to expand the exit pupil. In addition, WO 2018/198587 A describes a head-mounted display that performs augmented reality (AR) display using a volume hologram diffraction grating.
However, when the wavelength of the light emitted from the image display device changes, distortion occurs in the displayed virtual image.
An object of the present disclosure is to provide an image display device and a head-up display system that reduce distortion of a virtual image.
An image display device of the present disclosure includes: a display that emits a light flux that forms an image visually recognized by an observer as a virtual image; a light guide body that guides the light flux to a light-transmitting member; a controller that controls the image displayed by the display; and a sensor that detects a physical quantity used to obtain a wavelength of the light flux. The light guide body includes an incident surface on which the light flux from the display is incident and an emission surface from which the light flux is emitted from the light guide body. The light flux incident on the incident surface of the light guide body is changed in a traveling direction in the light guide body, and is emitted from the emission surface so as to expand a visual field area by being replicated in a horizontal direction and a vertical direction of the virtual image visually recognized by the observer. The controller controls a position and a shape of the image displayed by the display based on the physical quantity detected by the sensor.
Further, a head-up display system of the present disclosure includes: the above-described image display device; and the light-transmitting member that reflects a light flux emitted from the light guide body, in which the head-up display system displays the virtual image so as to be superimposed on a real view visually recognizable through the light-transmitting member.
According to the image display device and the head-up display system of the present disclosure, it is possible to reduce the distortion of the virtual image.
First, an outline of the present disclosure will be described with reference to
The coupling region 21, the first expansion region 23, and the second expansion region 25 each have diffraction power for diffracting image light, and an embossed hologram or a volume hologram is formed in each of them. The embossed hologram is, for example, a diffraction grating. The volume hologram is formed by, for example, interference fringes in a dielectric film. The coupling region 21 changes, by its diffraction power, the traveling direction of the image light incident from the outside toward the first expansion region 23.
In the first expansion region 23, for example, diffraction grating elements are located, and the image light is replicated by being divided, by the diffraction power, into image light traveling in the first direction and image light traveling toward the second expansion region 25. For example, in
In the second expansion region 25, for example, diffraction grating elements are located, and the image light is replicated by being divided, by the diffraction power, into image light traveling in the second direction and image light emitted from the second expansion region 25 to the outside. For example, in
Next, a difference between a pupil expansion type HMD and a head-up display (hereinafter, referred to as an HUD) will be described with reference to
As illustrated in
On the other hand, as illustrated in
When the temperature of the light source 11b of the display 11 changes, the wavelength of the image light emitted from the display 11 drifts. The diffraction pitch d of the grating that diffracts the light flux constituting the image light, the incident angle α, the diffraction angle β, and the wavelength λ of the light flux satisfy the following relational expression.
d(sin α−sin β)=mλ
where m is the diffraction order.
Therefore, the wavelength of the image light emitted from the light source 11b is monitored, and the display position of the image on the display 11 is corrected.
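For illustration only, the following sketch evaluates this relation with assumed example values of the pitch, incident angle, and wavelengths that are not taken from the present disclosure; it shows how a small wavelength drift shifts the diffraction angle β.

```python
import math

def diffraction_angle_deg(pitch_nm, incident_deg, wavelength_nm, order=1):
    """Solve d(sin(alpha) - sin(beta)) = m * lambda for the diffraction angle beta."""
    sin_beta = math.sin(math.radians(incident_deg)) - order * wavelength_nm / pitch_nm
    return math.degrees(math.asin(sin_beta))

# Assumed example values: 1000 nm pitch, 10 degree incidence, green light.
for wl in (520.0, 530.0):
    beta = diffraction_angle_deg(pitch_nm=1000.0, incident_deg=10.0, wavelength_nm=wl)
    print(f"wavelength {wl:.0f} nm -> diffraction angle {beta:.2f} deg")
# A 10 nm drift changes the diffraction angle by roughly 0.6 degrees in this example,
# which is why the display position and shape of the image are corrected.
```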
As described above, the HUD differs from the HMD in that the image light emitted from the light guide body 13 is reflected by the windshield 5 and then made incident on the visual recognition region Ac, so the diffraction pitch is not constant. In the HUD, therefore, the distortion of the virtual image caused by a change in wavelength of the image light is more conspicuous.
As illustrated in
Since distortion occurs when the light flux L2 emitted from the light guide body 13 is reflected by the windshield 5, the image displayed on the display 11 is deformed in advance in a direction opposite to the distortion, so that the observer can visually recognize an image without distortion. For example, when a deformed quadrangular image 12 is displayed in a display region 11a of the display 11 as illustrated in
However, as the temperature of the light source 11b of the display 11 rises, the wavelength of the light flux L1 emitted from the display 11 changes. In a case where a narrow band light source such as a laser element is used as the light source 11b, for example, the wavelength becomes longer as the temperature rises. As a result, as illustrated in
As a result, before the temperature of the display 11 rises, the rectangular virtual image Iva illustrated in
Hereinafter, a first embodiment will be described with reference to
[1-1. Configuration]
[1-1-1. Overall Configuration of Image Display Device and Head-up Display System]
A specific embodiment of a head-up display system 1 (hereinafter, referred to as an HUD system 1) of the present disclosure will be described.
Hereinafter, directions related to the HUD system 1 will be described based on the X axis, the Y axis, and the Z axis illustrated in
As illustrated in
The image display device 2 includes a display 11, a light guide body 13, a controller 15, a storage 17, and a sensor 19. The display 11 emits a light flux that forms an image visually recognized by the observer as the virtual image Iv. The light guide body 13 divides and replicates a light flux L1 emitted from the display 11, and guides the replicated light flux L2 to the windshield 5. The light flux L2 reflected by the windshield 5 is visually recognized as the virtual image Iv superimposed on a real view visible through the windshield 5.
The display 11 displays an image based on control by the external controller 15. As the display 11 including the light source 11b, for example, a liquid crystal display with a backlight, an organic light-emitting diode display, a plasma display, or the like can be used. A laser element may also be used as the light source 11b. Alternatively, the display 11 may generate an image by using a projector or a scanning laser together with a screen that diffuses or reflects light. The display 11 can display image content including various types of information such as a road guidance display, a distance to a vehicle ahead, a remaining battery level of the vehicle, and a current vehicle speed. As described above, the display 11 emits the light flux L1 including the image content visually recognized by the observer D as the virtual image Iv.
The controller 15 can be implemented by a circuit including a semiconductor element or the like, and can be configured by, for example, a microcomputer, a CPU, an MPU, a GPU, a DSP, an FPGA, or an ASIC. The controller 15 includes the storage 17, and implements predetermined functions by reading the data and programs stored in the storage 17 and performing various arithmetic processing. The controller 15 performs correction by changing the position and shape of the image displayed on the display 11 according to the detection value of the sensor 19.
The storage 17 is a storage medium that stores the programs and data necessary for implementing the functions of the controller 15. The storage 17 can be implemented by, for example, a hard disk drive (HDD), an SSD, a RAM, a DRAM, a ferroelectric memory, a flash memory, a magnetic disk, or a combination thereof. The storage 17 stores an image representing the virtual image Iv and shape data used when the image is displayed on the display 11. The storage 17 also stores a first lookup table in which a wavelength, a display position, and a shape of an image are associated with each other, and a second lookup table in which the amount of light detected by the sensor 19 and the wavelength of light are associated with each other. The controller 15 determines the shape of the image displayed on the display 11 based on the detection value of the sensor 19, reads the determined display image and shape data from the storage 17, and outputs them to the display 11.
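As a minimal sketch, the second lookup table could be held and interpolated as follows; the table entries below are assumed example values, not data of the present disclosure.

```python
from bisect import bisect_left

# Second lookup table (assumed values): normalized detected light amount -> wavelength [nm].
AMOUNTS = [0.40, 0.50, 0.60, 0.70]
WAVELENGTHS = [515.0, 520.0, 525.0, 530.0]

def wavelength_from_light_amount(amount):
    """Linearly interpolate between table entries; clamp outside the table range."""
    i = bisect_left(AMOUNTS, amount)
    if i == 0:
        return WAVELENGTHS[0]
    if i == len(AMOUNTS):
        return WAVELENGTHS[-1]
    a0, a1 = AMOUNTS[i - 1], AMOUNTS[i]
    w0, w1 = WAVELENGTHS[i - 1], WAVELENGTHS[i]
    return w0 + (w1 - w0) * (amount - a0) / (a1 - a0)

print(wavelength_from_light_amount(0.63))  # ~ 526.5 nm
```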
The sensor 19 receives a light flux that is emitted from the light guide body 13 and is not visually recognized by the observer D. For example, a light flux emitted from the display 11 and propagating in an optical path branched from the optical path to the visual recognition region AD of the observer D is received. The sensor 19 detects a physical quantity used to obtain the wavelength of the light flux L1. The sensor 19 is, for example, a light detector, detects a wavelength and a light amount of received light, and transmits the detected value to the controller 15. The sensor 19 is located, for example, on a straight line connecting the display 11 and the coupling region 21.
[1-1-2. Light Guide Body]
A configuration of the light guide body 13 will be described with reference to
The emission surface 27 faces the second expansion region 25. The first main surface 13a faces the windshield 5. In the present embodiment, the incident surface 20 is included in the coupling region 21, but may be included in the first main surface 13a which is a surface facing the coupling region 21. The emission surface 27 may be included in the second expansion region 25.
The coupling region 21, the first expansion region 23, and the second expansion region 25 have different diffraction powers, and a diffraction grating or a volume hologram is formed in each region. The coupling region 21, the first expansion region 23, and the second expansion region 25 have different diffraction angles of image light. In addition, the light guide body 13 has a configuration in which the incident light flux is totally reflected inside. The light guide body 13 is made of, for example, a glass or resin plate whose surface is mirror-finished. The shape of the light guide body 13 is not limited to a planar shape, and may be a curved shape. As such, the light guide body 13 includes a diffraction grating or a volume hologram that diffracts light in part. The coupling region 21, the first expansion region 23, and the second expansion region 25 are three-dimensional regions in a case where a volume hologram is included.
The coupling region 21 is a region where the light flux L1 emitted from the display 11 is incident from the incident surface 20 and the traveling direction of the light flux L1 is changed. The coupling region 21 has diffraction power and changes the propagation direction of the incident light flux L1 to the direction of the first expansion region 23. In the present embodiment, coupling is a state of propagating in the light guide body 13 under the total reflection condition. As illustrated in
As illustrated in
The second expansion region 25 expands the light flux L1 in the second direction perpendicular to the first direction, for example, and emits the expanded light flux L2 from the emission surface 27. The light guide body 13 is located such that the second direction is the negative direction of the Z axis. The light flux L1 propagated from the first expansion region 23 propagates in the second direction while repeating total reflection between the first main surface 13a and the second main surface 13b, and is replicated by the diffraction grating of the second expansion region 25 formed on the second main surface 13b and emitted to the outside of the light guide body 13 via the emission surface 27.
Therefore, when viewed from the viewpoint of the observer D, the light guide body 13 expands the light flux L1 incident on the incident surface 20 and changed in the traveling direction in the horizontal direction (X-axis direction) of the virtual image Iv visually recognized by the observer D, and then further expands the light flux L1 in the vertical direction (Y-axis direction) of the virtual image Iv to emit the light flux L2 from the emission surface 27.
[1-1-3. Pupil Expansion Order]
In the light guide body 13 located as described above, the order of pupil expansion of the first embodiment will be described with reference to
The light flux L1 of the image light incident on the light guide body 13 is changed in the propagation direction to the first expansion region 23 in which pupil expansion is performed in the horizontal direction (negative direction of the X axis) as the first direction by the diffraction element formed in the coupling region 21. Therefore, the light flux L1 is obliquely incident on the coupling region 21, and then propagates in the direction of the first expansion region 23 under the action of the wave number vector k1 illustrated in
The light flux L1 propagating to the first expansion region 23 extending in the first direction is divided into the light flux L1 propagating in the first direction and the light flux L1 replicated and changed in the propagation direction to the second expansion region 25 by the diffraction element formed in the first expansion region 23 while repeating total reflection. At this time, the replicated light flux L1 propagates in the direction of the second expansion region 25 under the action of the wave number vector k2 illustrated in
The light flux L1 changed in the propagation direction to the second expansion region 25 extending along the negative direction of the Z axis as the second direction is divided into the light flux L1 propagating in the second direction and the light flux L2 replicated and emitted from the second expansion region 25 to the outside of the light guide body 13 via the emission surface 27 by the diffraction element formed in the second expansion region 25. At this time, the replicated light flux L2 propagates in the direction of the emission surface 27 under the action of the wave number vector k3 illustrated in
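The following sketch illustrates this redirection with assumed two-dimensional in-plane vectors; the closed-loop condition k1 + k2 + k3 = 0 used here is a common design choice for two-dimensional exit-pupil expanders and is an assumption of the example, not an explicit statement of the present description.

```python
import numpy as np

# Assumed in-plane grating vectors (arbitrary units) in the X-Z plane of the light guide body.
k1 = np.array([-1.0, 0.0])          # coupling region: turn toward the first expansion region (-X)
k2 = np.array([0.5, -0.9])          # first expansion region: turn toward the second expansion region (-Z)
k3 = -(k1 + k2)                     # second expansion region: chosen so the loop closes (k1 + k2 + k3 = 0)

k_in = np.array([0.2, 0.1])         # assumed in-plane wave vector component of the incident light flux

after_coupling = k_in + k1          # propagating toward the first expansion region
after_first = after_coupling + k2   # propagating toward the second expansion region
after_second = after_first + k3     # out-coupled component

print(after_coupling)   # [-0.8  0.1] -> dominated by -X (first direction)
print(after_first)      # [-0.3 -0.8] -> dominated by -Z (second direction)
print(after_second)     # [ 0.2  0.1] -> equal to k_in, so the emitted light keeps the incident direction
```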
[1-1-4. Detection of Wavelength Change of Light]
The sensor 19 directly or indirectly detects a change in wavelength of the light flux L1. The sensor 19 includes, for example, a filter whose transmittance differs depending on the wavelength, as illustrated in
In addition, as illustrated in
In addition, as illustrated in
By these wavelength detection methods, the wavelength of the light flux L1 can be detected from the amount of light detected by the sensor 19. Only one of these wavelength detection methods may be used, or a combination thereof may be used.
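As a minimal sketch of the filter-based method, the wavelength can be estimated from the ratio of a filtered detector signal to an unfiltered reference signal, assuming a hypothetical linear transmittance curve; none of the numbers below are taken from the present disclosure.

```python
# Hypothetical filter transmittance model: T(lambda) = T0 + SLOPE * (lambda - 520 nm).
T0 = 0.50      # assumed transmittance at 520 nm
SLOPE = 0.01   # assumed transmittance change per nm

def estimate_wavelength_nm(filtered_amount, reference_amount):
    """Estimate the wavelength from the ratio of filtered to unfiltered light amounts."""
    transmittance = filtered_amount / reference_amount
    return 520.0 + (transmittance - T0) / SLOPE

# Example: the filtered detector reads 0.56 of the reference detector's level.
print(estimate_wavelength_nm(filtered_amount=0.56, reference_amount=1.0))  # ~ 526 nm
```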
Next, a flow of image correction processing of the image display device 2 will be described with reference to
In step S1, when an image is displayed from the display 11, the controller 15 acquires the light amount detected by the sensor 19 and acquires the wavelength of the light flux L1 emitted from the display 11 by referring to the second lookup table stored in the storage 17 in which the light amount and the wavelength are associated with each other.
Next, in step S2, the controller 15 refers to the first lookup table stored in the storage 17 in which the wavelength of the light flux L1 and the display position and shape of the image displayed from the display 11 are associated with each other. Next, in step S3, the controller 15 controls the image displayed from the display 11 based on the reference result of the first lookup table. For example, as illustrated in
For example, when the wavelength of the light flux L1 emitted from the display 11 becomes longer due to the temperature rise of the display 11, as illustrated in
As a result, the light flux L1 incident on the light guide body 13 can be diffracted in each of the coupling region 21, the first expansion region 23, and the second expansion region 25 and emitted from the light guide body 13 at the same emission angle as before the temperature rise, and the observer D can see the virtual image Iv with reduced distortion.
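A minimal sketch of steps S2 and S3 described above is given below; the first-lookup-table entries and the correction format (a pixel offset and a shape scale applied to the displayed image) are assumptions for illustration only.

```python
# First lookup table (assumed values): wavelength [nm] -> (display position offset in pixels,
# scale factor adjusting the pre-deformed image shape).
FIRST_LUT = {520.0: ((0, 0), 1.00), 525.0: ((-3, 1), 0.99), 530.0: ((-6, 2), 0.98)}

def step_s2_s3(wavelength_nm, base_corners):
    """Return corrected corner coordinates of the displayed image for the given wavelength."""
    # S2: refer to the first lookup table (nearest entry for simplicity).
    nearest = min(FIRST_LUT, key=lambda w: abs(w - wavelength_nm))
    (ox, oy), scale = FIRST_LUT[nearest]
    # S3: shift and scale the image displayed in the display region 11a.
    return [((x * scale) + ox, (y * scale) + oy) for (x, y) in base_corners]

# Corners of the (already pre-deformed) image in display pixels, assumed values.
corners = [(100, 50), (700, 60), (690, 400), (110, 390)]
print(step_s2_s3(527.0, corners))
```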
The virtual image Iv in the first embodiment will be described with reference to
In the five dots Dt1 to Dt5 in
When the wavelength of the light flux L1 emitted from the display 11 is increased to 530 nm, coordinates of the dots Dt1 to Dt5 as the virtual image Iv viewed by the observer D are illustrated in
Furthermore, according to
Therefore, by controlling the position and shape of the image displayed on the display 11 according to the change in wavelength of the light flux L1 emitted from the display 11 by the image display device 2 of the first embodiment, the dots Dt1 to Dt5 illustrated in
Next, a first modification of the first embodiment will be described with reference to
The side surface 13c of the light guide body 13 is on the extension of the first expansion region 23 in the first direction. When the side surface 13c is roughened, the light flux L1 incident on the side surface 13c is scattered and emitted to the outside of the light guide body 13. The sensor 19 of the first modification receives the light flux L1 emitted from the side surface 13c of the light guide body 13 by repeating total reflection without being diffracted in the first expansion region 23. Thus, the image display device 2 of the first modification can also have the same function as that of the first embodiment. In addition, as illustrated in
Next, a second modification of the first embodiment will be described with reference to
A side surface 13d of the light guide body 13 is on the extension of the second expansion region 25 in the second direction. When the side surface 13d is roughened, the light flux L1 incident on the side surface 13d is scattered and emitted to the outside of the light guide body 13. The sensor 19 of the second modification receives the light flux L1 emitted from the side surface 13d of the light guide body 13 by repeating total reflection without being diffracted in the second expansion region 25. Thus, the image display device 2 of the second modification can also have the same function as that of the first embodiment. In addition, as illustrated in
Next, a third modification of the first embodiment will be described with reference to
The light flux L1 of off-axis light that is not incident on the second expansion region 25 propagates in the second direction on both sides of the second expansion region 25 in the first direction. The light guide body 13 includes a diffractor 26, and for example, in
Next, a fourth modification of the first embodiment will be described with reference to
The sensor 19A includes an incident slit 31, a collimating lens 33, a transmission grating 35, a focus lens 37, and an image sensor 39. The incident light is dispersed by the transmission grating 35, and the image sensor 39 detects the amount of light for each wavelength of the dispersed light.
Since the sensor 19A directly detects the wavelength of the light flux L1 emitted from the display 11, it is possible to accurately perform image correction according to the wavelength.
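As a sketch, the output of the image sensor 39 could be converted to a wavelength as follows; the linear pixel-to-wavelength calibration and the sample intensity profile are assumptions for illustration.

```python
# Assumed calibration of the dispersed spectrum on the image sensor 39:
# pixel column 0 corresponds to 500 nm and each column spans 0.25 nm.
START_NM = 500.0
NM_PER_PIXEL = 0.25

def peak_wavelength_nm(column_intensities):
    """Return the wavelength of the brightest column of the dispersed light."""
    peak_column = max(range(len(column_intensities)), key=lambda i: column_intensities[i])
    return START_NM + NM_PER_PIXEL * peak_column

# Assumed intensity profile with its maximum at column 104 (-> 526 nm).
profile = [0.0] * 200
profile[104] = 1.0
print(peak_wavelength_nm(profile))  # -> 526.0
```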
[1-2. Effects, Etc.]
The image display device 2 of the present disclosure includes the display 11 that emits the light flux L1 that forms an image visually recognized by the observer D as the virtual image Iv, and the light guide body 13 that guides the light flux L1 to the windshield 5. The image display device 2 further includes the controller 15 that controls an image displayed by the display 11, and the sensor 19 that detects a physical quantity used to obtain the wavelength of the light flux L1. The light guide body 13 includes the incident surface 20 on which the light flux L1 from the display 11 is incident and the emission surface 27 from which the light flux L1 is emitted from the light guide body 13. The light flux L1 incident on the incident surface 20 of the light guide body 13 is changed in the traveling direction in the light guide body 13, and is replicated in the horizontal direction and the vertical direction of the virtual image Iv visually recognized by the observer D to be emitted from the emission surface 27 so as to expand the visual field area. The controller 15 controls the position and shape of the image displayed by the display 11 based on the physical quantity detected by the sensor 19.
Even if the wavelength of the light flux L1 emitted from the display 11 changes, the sensor 19 detects a physical quantity used to obtain the wavelength of the light flux L1, and the controller 15 controls the position and shape of the image displayed by the display 11 based on the detected physical quantity. As a result, even if the traveling direction in the light guide body 13 changes due to the change in wavelength of the light flux L1, it is possible to display a virtual image with reduced distortion by controlling the position and shape of the image displayed on the display 11.
The sensor 19 is an optical sensor, and detects a physical quantity of light by receiving a part of the light flux L1 that is not visually recognized by the observer. As a result, it is possible to obtain a physical quantity for obtaining the wavelength of the light flux while maintaining the brightness of the virtual image.
The sensor 19 may detect the wavelength of the received light. Since the sensor 19 directly detects the wavelength of light, wavelength detection accuracy can be improved.
The sensor 19 may detect the amount of received light, and the controller 15 determines the wavelength of the light of the light flux L1 based on the amount of light detected by the sensor 19. Since a light amount sensor is used as the sensor 19, the space for disposing the sensor 19 can be reduced, and the cost can be reduced.
Further, by projecting light emitted from the HUD system 1 onto the windshield 5 of the vehicle 3, the virtual image Iv suitable for the observer D riding on the vehicle 3 can be displayed.
Hereinafter, a second embodiment will be described with reference to
The sensor 19B used in the display device 2B of the second embodiment is a temperature sensor instead of a sensor that detects light, and detects the temperature of the display 11B or the temperature of the light source 11b. The sensor 19B may be located inside the display 11B or may be located on the outer surface of the display 11B. In addition, a third lookup table in which the temperature detected by the sensor 19B and the wavelength of the light flux L1 emitted from the display 11B are associated with each other in advance is stored in the storage 17.
Based on the temperature detected by the sensor 19B, the controller 15 determines the wavelength of the light flux L1 emitted from the display 11B with reference to the third lookup table stored in the storage 17. The sensor 19B can detect the wavelength of the light flux L1 with higher accuracy by measuring the temperature at a position as close as possible to the light emission point of the light source 11b.
The controller 15 controls the position and shape of the image displayed from the display 11B based on the determined wavelength as in the first embodiment. As a result, even if the traveling direction in the light guide body 13 changes due to the change in wavelength of the light flux L1, the virtual image Iv with reduced distortion can be displayed.
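A minimal sketch of this temperature-based determination is shown below; the third-lookup-table entries are assumed example values, not data of the present disclosure.

```python
# Third lookup table (assumed values): light source temperature [deg C] -> wavelength [nm].
THIRD_LUT = [(25.0, 520.0), (45.0, 521.5), (65.0, 523.0), (85.0, 524.5)]

def wavelength_from_temperature(temp_c):
    """Linear interpolation over the third lookup table, clamped at the ends."""
    if temp_c <= THIRD_LUT[0][0]:
        return THIRD_LUT[0][1]
    if temp_c >= THIRD_LUT[-1][0]:
        return THIRD_LUT[-1][1]
    for (t0, w0), (t1, w1) in zip(THIRD_LUT, THIRD_LUT[1:]):
        if t0 <= temp_c <= t1:
            return w0 + (w1 - w0) * (temp_c - t0) / (t1 - t0)

print(wavelength_from_temperature(55.0))  # -> 522.25
```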
Next, a modification of the above-described second embodiment will be described. The modification of the second embodiment can also correct both the change in wavelength of the light flux L1 and the change in diffraction angle due to the temperature change of the coupling region 21, the first expansion region 23, and the second expansion region 25 by combining the second embodiment and the second modification of the first embodiment.
When the temperature of each of the coupling region 21, the first expansion region 23, and the second expansion region 25 rises, the pitch of the diffraction grating expands, and the diffraction angle of the light flux decreases. The sensor 19 detects the change in the diffraction angle due to this influence, and the sensor 19B detects the change in wavelength of the light flux L1. A correction parameter for the change in wavelength of the light flux L1 is prepared based on the detection value of the sensor 19B, and a correction parameter for the change in the diffraction angle is prepared based on the detection value of the sensor 19. A fourth lookup table of the display position and shape of the image corresponding to the two parameters is stored in advance in the storage 17. The controller 15 can correct the distortion of the virtual image with higher accuracy by controlling the position and shape of the image displayed from the display 11B based on the detection values of the sensors 19 and 19B and the third and fourth lookup tables.
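A sketch of this two-parameter correction is given below, modeling the fourth lookup table as a small grid indexed by a wavelength-correction parameter and a diffraction-angle-correction parameter; all entries are assumed example values.

```python
# Fourth lookup table (assumed values): (wavelength parameter index, diffraction-angle
# parameter index) -> (display position offset in pixels, shape scale).
FOURTH_LUT = {
    (0, 0): ((0, 0), 1.00), (0, 1): ((-1, 0), 1.00),
    (1, 0): ((-3, 1), 0.99), (1, 1): ((-4, 1), 0.99),
    (2, 0): ((-6, 2), 0.98), (2, 1): ((-7, 2), 0.98),
}

def correction(wavelength_param, angle_param):
    """Look up the correction for the two parameters derived from sensors 19B and 19."""
    return FOURTH_LUT[(wavelength_param, angle_param)]

# Example: sensor 19B indicates the second wavelength step, sensor 19 the second angle step.
print(correction(wavelength_param=1, angle_param=1))  # -> ((-4, 1), 0.99)
```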
Hereinafter, a third embodiment will be described with reference to
The display mode is set according to the type of the image displayed from the display 11. For example, about five display mode patterns are set according to the ratio of the light emission amounts of red, blue, and green. In addition, a fifth lookup table in which the detection value of the sensor 19 and the wavelength of the light flux L1 are associated with each other for each display mode is stored in the storage 17 in advance.
In step S11, the controller 15 acquires information on the display mode of the image displayed from the display 11. In step S12, the controller 15 acquires a detection value from the sensor 19. In step S13, the controller 15 refers to a fifth lookup table corresponding to the acquired display mode, and determines the wavelength of the light flux L1 from the acquired detection value of the sensor 19.
Next, as in the first embodiment, steps S2 and S3 are performed to control the position and shape of the image displayed from the display 11, whereby the distortion of the virtual image can be reduced.
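A minimal sketch of steps S11 to S13 is shown below; the display mode names, their calibrations, and the form of the fifth lookup table are assumptions for illustration only.

```python
# Fifth lookup table (assumed values): for each display mode, a simple linear calibration
# from the sensor 19 detection value to the wavelength [nm] of the light flux L1.
FIFTH_LUT = {
    "navigation": (520.0, 10.0),   # (wavelength at detection value 0.5, nm per unit detection value)
    "speed_only": (521.0, 9.0),
    "warning":    (522.5, 8.0),
}

def step_s11_s13(display_mode, detection_value):
    # S11/S12: the display mode and the sensor detection value are acquired.
    base_nm, slope_nm = FIFTH_LUT[display_mode]   # S13: select the table for this mode
    return base_nm + slope_nm * (detection_value - 0.5)

print(step_s11_s13("warning", 0.6))  # ~ 523.3 nm
```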
As described above, the embodiment has been described as an example of the technology disclosed in the present application. However, the technology in the present disclosure is not limited to this, and is applicable to embodiments in which changes, replacements, additions, omissions, and the like are appropriately made. Thus, in the following, other embodiments will be exemplified.
In the above embodiment, the sensor 19, the sensor 19A, or the sensor 19B is used. However, a plurality of sensors 19, sensors 19A, or sensors 19B may be combined, and the controller 15 may determine the wavelength of the light flux L1 based on each detection value.
In the above embodiment, the virtual image Iv is visually recognized by the observer D by reflecting the divided and replicated light flux L2 on the windshield 5, but the present invention is not limited thereto. A combiner may be used instead of the windshield 5, and the virtual image Iv may be visually recognized by the observer D by reflecting the divided and replicated light flux L2 on the combiner.
In the above embodiment, the first direction in which the light flux L1 is expanded in the first expansion region 23 and the second direction in which the light flux L1 is expanded in the second expansion region 25 are orthogonal to each other, but the present invention is not limited thereto. In expanding the light flux L1 in the first direction in the first expansion region 23, the component expanding in the horizontal direction only needs to be larger than the component expanding in the direction along the Z axis, and in expanding the light flux L1 in the second direction in the second expansion region 25, the component expanding in the direction along the Z axis only needs to be larger than the component expanding in the horizontal direction.
In the above embodiment, the light flux L1 incident on the incident surface 20 is expanded in the vertical direction after being expanded in the horizontal direction of the virtual image Iv by the light guide body 13, but the present invention is not limited thereto. The light guide body 13 may expand the light flux L1 incident on the incident surface 20 and changed in the traveling direction in the vertical direction (Y-axis direction) of the virtual image Iv visually recognized by the observer D when viewed from the viewpoint of the observer D, and then further expand the light flux L1 in the horizontal direction (X-axis direction) of the virtual image Iv to emit the light flux L2 from the emission surface 27.
In the above embodiment, the case where the HUD system 1 is applied to the vehicle 3 such as an automobile has been described. However, the object to which the HUD system 1 is applied is not limited to the vehicle 3. The object to which the HUD system 1 is applied may be, for example, a train, a motorcycle, a ship, or an aircraft, or may be a stationary amusement machine. In the case of an amusement machine, the light flux from the display 11 is reflected by a transparent curved plate serving as the light-transmitting member instead of the windshield 5. Further, the real view visually recognizable by a user through the transparent curved plate may be a video displayed from another video display device. That is, a virtual image by the HUD system 1 may be displayed so as to be superimposed on a video displayed from another video display device. As described above, any one of the windshield 5, the combiner, and the transparent curved plate may be adopted as the light-transmitting member in the present disclosure.
(1) An image display device of the present disclosure includes: a display that emits a light flux that forms an image visually recognized by an observer as a virtual image; a light guide body that guides the light flux to a light-transmitting member; a controller that controls the image displayed by the display; and a sensor that detects a physical quantity used to obtain a wavelength of the light flux. The light guide body includes an incident surface on which the light flux from the display is incident and an emission surface from which the light flux is emitted from the light guide body. The light flux incident on the incident surface of the light guide body is changed in a traveling direction in the light guide body, and is emitted from the emission surface so as to expand a visual field area by being replicated in a horizontal direction and a vertical direction of the virtual image visually recognized by the observer. The controller controls a position and a shape of the image displayed by the display based on the physical quantity detected by the sensor.
As a result, even if the wavelength of the light flux emitted from the display changes, the sensor detects a physical quantity used to obtain the wavelength of the light flux, and the controller controls the position and shape of the image displayed by the display based on the detected physical quantity. As a result, even if the traveling direction in the light guide body changes due to the change in wavelength of the light flux, it is possible to display a virtual image with reduced distortion by controlling the position and shape of the image displayed on the display.
(2) In the image display device of (1), the sensor is a light detector and detects a physical quantity of light.
(3) In the image display device of (2), the sensor detects a wavelength of received light.
(4) In the image display device of (3), the sensor is an image sensor having a diffraction grating.
(5) In the image display device of (2), the sensor detects an amount of received light, and the controller determines a wavelength of light of the light flux based on the light amount detected by the sensor.
(6) In the image display device of (5), the sensor includes a filter whose transmittance changes according to a wavelength.
(7) In the image display device according to any one of (1) to (6), the light guide body includes a region that guides a part of the light flux to the emission surface and a region that guides a part of the light flux to the sensor.
(8) In the image display device of (1), the sensor is a temperature detector, and detects a temperature of the display as the physical quantity, and the controller determines a wavelength of light of the light flux based on the temperature of the display.
(9) In the image display device according to any one of (1) to (8), the light guide body includes a coupling region that changes a traveling direction of a light flux incident on the incident surface, a first expansion region that replicates the light flux changed in the traveling direction in the coupling region in a first direction in the light guide body, and a second expansion region that replicates the light flux replicated in the first expansion region in a second direction intersecting the first direction in the light guide body, the coupling region, the first expansion region, and the second expansion region have different diffraction powers and diffraction angles, respectively, and the light flux replicated in the second expansion region is emitted from the emission surface.
(10) In the image display device of (9), at least one of the coupling region, the first expansion region, and the second expansion region includes a volume hologram.
(11) In the image display device of (9), the coupling region, the first expansion region, and the second expansion region are regions having diffraction structures, and have different magnitudes of wave number vectors of the respective diffraction structures.
(12) In the image display device according to any one of (1) to (11), the controller controls a position and a shape of an image so as to reduce distortion of the image due to a light flux emitted from the light guide body.
(13) A head-up display system of the present disclosure includes: the image display device according to any one of (1) to (12); and the light-transmitting member that reflects a light flux emitted from the light guide body, in which the head-up display system displays the virtual image so as to be superimposed on a real view visually recognizable through the light-transmitting member.
(14) In the head-up display of (13), the light-transmitting member is a windshield of a moving body.
The present disclosure is applicable to an image display device used in a head-up display system.
This is a continuation application of International Application No. PCT/JP2022/016246, with an international filing date of Mar. 30, 2022, which claims priority of Japanese Patent Application No. 2021-098795 filed on Jun. 14, 2021, the content of which is incorporated herein by reference.