The present application claims priority to United Kingdom Patent Application No. 2116786.1 filed on Nov. 22, 2021, the contents of which are hereby incorporated by reference in their entirety.
Embodiments discussed herein are generally related to optics, head-up displays (HUDs), and augmented reality (AR) systems, and in particular, to configurations and arrangements of optical elements and devices to enhance and/or improve visual ergonomics by improving stereoscopic depth of field.
Optical systems of see-through head-up displays provide the ability to present information and graphics to an observer without requiring the observer to look away from a given viewpoint or otherwise refocus his or her eyes. In such systems, the observer views an external scene through a combiner. The combiner allows light from the external scene to pass through while also redirecting an image artificially generated by a projector so that the observer can see both the external light as well as the projected image at the same time. The projected image can include one or more virtual objects that augment the observer's view of the external scene, which is also referred to as augmented reality (AR).
An optical system, in accordance with an example of the present disclosure, is compact and can produce augmented reality in a head-up display with a relatively high stereoscopic depth of field, overcoming limitations on the usable size of virtual objects that do not exceed a stereo-threshold and are not perceived as inclined by an observer. In an example, the optical system includes a picture generation unit, a correcting optical unit, and a combiner. The correcting optical unit is configured to create, in a direction of a horizontal field of view, a monotonic variation of an optical path length of light rays propagating from the picture generation unit. The combiner is configured to redirect light rays propagating from the correcting optical unit toward an eye box, thereby producing one or more virtual images observable from the eye box. The optical system provides a virtual image surface inclined in the direction of the horizontal field of view for displaying the one or more virtual images at different distances from an observer, such that a virtual image on a first side of the virtual image surface appears closer to the eye box than a virtual image on a second side of the virtual image surface. The virtual image surface has a non-zero angle between projections on a horizontal plane defined by a first axis and a second axis, the first axis being perpendicular to the virtual image surface and extending through an arbitrary intersection point on the virtual image surface, and the second axis being parallel to a line of sight and extending through the arbitrary intersection point on the virtual image surface. The techniques provided herein are particularly useful in the context of an automotive vehicle. Numerous configurations and variations and other example use cases will be appreciated in light of this disclosure.
As noted above, certain types of optical systems provide a head-up display using a combiner that combines light from the external (e.g., real world) environment with artificially generated images, including virtual objects or symbols that are projected into the field of view of an observer. Such a display is also referred to as an augmented reality head-up display (AR HUD). To provide a natural, fully integrated three-dimensional (stereoscopic) visual perception of the virtual objects within the external environment, an AR HUD can be arranged such that an observer perceives the virtual objects at different distances along a virtual image surface, which provides a sense of depth of those objects within the augmented reality scene. As used herein, the term virtual image surface refers to an imaginary surface (planar or non-planar) upon which virtual objects and other virtual images appear to lie when viewed from the eye box inside the field of view area. In general, the visual ergonomics of an AR HUD improve as the stereoscopic depth of field increases. For example, a large stereoscopic depth of field increases the number of virtual objects that can simultaneously appear to be at different distances in front of an observer.
A standard AR HUD achieves stereoscopic depth of field by inclining a virtual image surface with respect to a road or ground surface (that is, inclining the virtual image surface in a direction of a vertical field of view). In such an AR HUD, virtual objects displayed in a lower portion of the field of view appear to be closer to the observer than virtual objects in the upper portion of the field of view. However, some existing AR HUDs have a relatively small stereoscopic depth of field due, for example, to structural limitations of the HUD on the maximum size of the vertical field of view (FoV) and the spatial orientation of the virtual image surface. One structural limitation on the maximum size of the vertical FoV of the HUD relates to the combiner size. In existing automotive HUDs, the combiner inclination angle is more than 60°, so the combiner is at least twice as large as its projection on the vertical plane. Increasing the vertical field of view therefore increases the combiner size and the numerical aperture, and hence rapidly increases aberrations (especially astigmatism). Such limitations, as well as human binocular vision limitations, can also limit the maximum usable size of the virtual objects that are not perceived as inclined. For instance, virtual image surfaces inclined in the direction of the vertical field of view enable only a limited stereoscopic depth of field due to the limited size of the vertical field of view of the HUD. With a virtual image surface inclined in the direction of the vertical field of view, the stereoscopic depth of field can be increased by increasing the inclination angle of the virtual image surface. However, increasing the inclination angle relative to the direction of the vertical field of view reduces the usable size and/or height of the virtual objects (which are not perceived as inclined) displayed on the virtual image surface. This decrease in the usable size of the virtual objects restricts improvement of the stereoscopic depth of field in existing HUDs.
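As a rough, non-limiting illustration of this geometric constraint, the following minimal sketch estimates how the combiner length scales with the inclination angle and the vertical field of view. The 700 mm eye-box-to-combiner distance and the specific angle and FoV values below are assumed example numbers, not values taken from the present disclosure.

```python
import math

def combiner_length(vertical_projection_mm: float, inclination_deg: float) -> float:
    """Physical length of a flat combiner needed to present a given
    vertical projection (aperture) when the combiner is inclined."""
    return vertical_projection_mm / math.cos(math.radians(inclination_deg))

# At an inclination of 60 degrees, cos(60 deg) = 0.5, so the combiner must be
# twice as long as its projection on the vertical plane; steeper angles make
# the ratio even larger.
for angle in (60.0, 65.0, 70.0):
    print(f"{angle:.0f} deg -> length / projection = {combiner_length(1.0, angle):.2f}")

def vertical_projection(distance_mm: float, vertical_fov_deg: float) -> float:
    """Approximate vertical aperture needed at a given distance from the eye box
    to cover a given vertical field of view (flat-combiner approximation)."""
    return 2.0 * distance_mm * math.tan(math.radians(vertical_fov_deg) / 2.0)

# A larger vertical FoV requires a taller projection, which in turn scales the
# combiner length and numerical aperture (assumed 700 mm eye-box-to-combiner distance).
for fov in (3.0, 5.0, 7.0):
    proj = vertical_projection(700.0, fov)
    print(f"FoV {fov:.0f} deg -> projection {proj:.0f} mm, "
          f"combiner length at 65 deg: {combiner_length(proj, 65.0):.0f} mm")
```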
To this end, in accordance with an example of the present disclosure, an optical system is provided that is relatively compact and can produce augmented reality in a head-up display with a relatively high stereoscopic depth of field, overcoming the limitations on the usable size of the virtual objects (which are not perceived as inclined) appearing in the field of view area. An example optical system includes a picture generation unit, a correcting optical unit, and a combiner. The correcting optical unit is configured to create, in a direction of a horizontal field of view, a monotonic variation of an optical path length of light rays propagating from the picture generation unit. The combiner is configured to redirect light rays propagating from the correcting optical unit toward an eye box, thereby producing one or more virtual images observable from the eye box.
The optical system thus provides a virtual image surface inclined in the direction of the horizontal field of view for displaying the one or more virtual images at different distances from the observer such that one or more virtual images on a first side of the virtual image surface appear closer to the eye box than one or more virtual images on a second side of the virtual image surface. By inclining the virtual image surface in the direction of the horizontal field of view (horizontal inclination), the stereoscopic depth of field is increased at least twofold in comparison to existing techniques, which incline the virtual image surface in the direction of the vertical field of view (vertical inclination). This benefit arises because the horizontal field of view of existing AR HUDs is always at least twice as wide as the vertical field of view.
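The at-least-twofold benefit can be illustrated with a simple geometric sketch that assumes a planar virtual image surface passing through a point on the line of sight. The 10 m anchor distance, 75° surface inclination, and 10° horizontal versus 4° vertical fields of view are assumed example values, not parameters of the disclosed system.

```python
import math

def depth_extent(anchor_m: float, incline_deg: float, fov_deg: float) -> float:
    """Depth spanned by a flat virtual image surface that passes through a point
    at `anchor_m` on the line of sight and whose normal is tilted by `incline_deg`
    within the plane of a field of view of angular width `fov_deg`."""
    a = math.radians(incline_deg)
    h = math.radians(fov_deg) / 2.0
    near = anchor_m * math.cos(a) / math.cos(h - a)    # distance at the near edge
    far = anchor_m * math.cos(a) / math.cos(-h - a)    # distance at the far edge
    return far - near

# Same anchor distance and inclination angle, different FoV extents along the
# direction of inclination (assumed illustrative numbers).
horizontal = depth_extent(10.0, 75.0, 10.0)   # inclined across a 10 deg horizontal FoV
vertical = depth_extent(10.0, 75.0, 4.0)      # inclined across a 4 deg vertical FoV
print(f"horizontal inclination: {horizontal:.1f} m of depth")
print(f"vertical inclination:   {vertical:.1f} m of depth")
print(f"ratio: {horizontal / vertical:.1f}x")
```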
To produce an inclined virtual image surface, in some examples, the correcting optical unit includes a specific combination of optical elements. For example, the inclination of the virtual image surface can be achieved by inclining a lens through which the optical image passes, by using an optical surface with an asymmetrical shape forming a wedge with adjacent optical surfaces, or by using a combination of an inclined lens and an optical surface with an asymmetrical shape. In some examples, the combiner includes a holographic optical element with positive optical power, which in combination with the correcting optical unit further increases the stereoscopic depth of field. Various other examples will be apparent in light of the present disclosure.
As described in further detail below, the optical system 104 produces an inclined virtual image surface that is non-perpendicular to a line of sight through the system 104 such that virtual objects on the left side of the virtual image 108 are displayed closer to a viewer than virtual objects on the right side of the virtual image 108, or such that virtual objects on the right side of the field of view of the system 104 are displayed closer to a viewer than virtual objects on the left side of the field of view of the system 104, depending on the angle of inclination of the virtual image surface in the direction of the horizontal field of view. The line of sight is a line extending from the center of the eye box area 106 into the center of the field of view area of the optical system 104. According to embodiments of the present disclosure, the optical system 104 is designed to occupy a relatively small and compact area (by volume) so as to be easily integrated into the structure of the vehicle 102. Several examples of the optical system 104 are described below with respect to
The optical system 104 is arranged such that the optical image 210 output by the PGU 202 passes through the correcting optical unit 204, which produces one or more modified optical images 212. The modified optical images 212 are directed to the combiner 206, which redirects them toward an eye box 208 outside of the optical system 104. The combiner 206 is further configured to permit at least some of the external light 214 to pass through the combiner 206 and combine with the redirected optical image to produce an augmented reality scene 216 visible from the eye box 208. In some examples, the augmented reality scene 216 includes an augmented reality display of the virtual image 108 of
For example, such as shown in
As noted above, it is possible to improve stereoscopic depth of field by increasing the inclination angle of the virtual image surface 310. However, increasing the inclination angle of the virtual image surface 310 leads to a decrease in the usable size of the virtual object. As shown in
Another technique to improve stereoscopic depth of field without affecting the usable size of the virtual object is to increase the field of view. As shown in
As noted above, in accordance with an embodiment of the present disclosure, the virtual image surface 310 is an imaginary surface upon which virtual objects projected from the PGU 202 appear to lie. It will be understood that the virtual image surface 310 can be inclined such as shown in
2.1. First Example Correcting Optical Unit of AR Optical System
In some examples, the optical system 104 operates in monochromatic mode at a wavelength of 532 nm. Referring to
With reference to
Δd = L_far − L_near
The stereoscopic depth of field in an angular measure is the angle η in milliradians (mrad), defined as the difference between the angle ε of convergence on the nearest point to a viewer and the angle θ of convergence on the farthest point from the viewer.
η = ε − θ
The stereoscopic depth of field can be estimated in a number of scenes (e.g., Scene 1, Scene 2, etc.) of a 3D virtual image placed between the nearest point to a viewer and the farthest point to a viewer. The size of each scene in a 3D virtual image is an area at a predetermined distance from a viewer where the displayed virtual objects (e.g., the letters “A” and “B” in
The usable size of the virtual object, which is not perceived as inclined, is limited by the stereo-threshold of human vision, such as shown in
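The quantities Δd and η, together with the stereo-threshold-limited usable size, can be illustrated with the following sketch. The 65 mm interpupillary distance, the 7.5 m and 15 m viewing distances, and the 150 arc-second threshold used below are illustrative assumptions rather than values prescribed by the disclosure.

```python
import math

ARCSEC = math.pi / (180.0 * 3600.0)   # radians per arc second

def convergence_mrad(distance_m: float, ipd_m: float = 0.065) -> float:
    """Binocular convergence angle, in milliradians, for a point at the given
    distance (assumed 65 mm interpupillary distance)."""
    return 2.0 * math.atan(ipd_m / (2.0 * distance_m)) * 1e3

# Stereoscopic depth of field: linear measure Δd = L_far − L_near and angular
# measure η = ε − θ, for assumed nearest/farthest distances of 7.5 m and 15 m.
L_near, L_far = 7.5, 15.0
eta = convergence_mrad(L_near) - convergence_mrad(L_far)
print(f"Δd = {L_far - L_near:.1f} m, η = {eta:.2f} mrad")

def usable_depth_m(distance_m: float, threshold_arcsec: float = 150.0,
                   ipd_m: float = 0.065) -> float:
    """Largest front-to-back extent of a virtual object at `distance_m` whose
    disparity variation stays below the stereo-threshold, so the object is not
    perceived as inclined (small-angle approximation)."""
    delta = threshold_arcsec * ARCSEC
    return delta * distance_m ** 2 / (ipd_m - delta * distance_m)

for d in (5.0, 10.0, 15.0):
    print(f"at {d:4.1f} m: usable depth ~ {usable_depth_m(d):.2f} m")
```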
2.2. Second Example Correcting Optical Unit of AR Optical System
The freeform mirror 1106 has an asymmetrical surface profile forming a wedge with adjacent optical surfaces of the optical element 1104 and the output lens 1108. The cross-sectional shape of the freeform mirror 1106 in the direction of a vertical field of view can, in some examples, be close to a parabolic cylinder surface, such as shown in
Referring to
2.3. Third Example Correcting Optical Unit of AR Optical System
The freeform surface of the output lens 1408 in the direction of ray propagation has an asymmetrical freeform profile and forms a wedge with adjacent optical surfaces of the output lens 1408 and the mirror 1406 in the direction of the horizontal field of view. The cross-sectional shape of the freeform surface of the output lens 1408 in the direction of a vertical field of view can, in some examples, have a symmetrical shape close to a sphere with a radius of about −280 mm, such as shown in
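Purely as an illustration of how such a profile might be parameterized, the sketch below models a freeform sag as a spherical cross-section of radius −280 mm in the vertical direction plus odd-order terms in the horizontal direction that break symmetry and act as a wedge. The asymmetry coefficients are hypothetical placeholders, not coefficients of the disclosed output lens 1408.

```python
import math

def spherical_sag(y_mm: float, radius_mm: float = -280.0) -> float:
    """Sag of a spherical cross-section of the given radius, approximating the
    vertical cross-sectional shape described above."""
    c = 1.0 / radius_mm
    return c * y_mm ** 2 / (1.0 + math.sqrt(1.0 - (c * y_mm) ** 2))

def freeform_sag(x_mm: float, y_mm: float) -> float:
    """Illustrative freeform sag: the spherical vertical profile plus
    hypothetical odd-order terms in x (horizontal direction) that produce an
    asymmetrical, wedge-like profile; coefficients are placeholders."""
    A1, A3 = 5.0e-2, 1.0e-6   # assumed wedge / asymmetry terms (not from the disclosure)
    return spherical_sag(y_mm) + A1 * x_mm + A3 * x_mm ** 3

for x in (-20.0, 0.0, 20.0):
    print(f"x = {x:+5.1f} mm -> sag {freeform_sag(x, 10.0):+7.3f} mm")
```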
Referring to
2.4. Further Examples
In some examples, the optical system 104 can operate in monochromatic mode or in full-color mode with chromatism correction.
Table 1 shows example geometrical characteristics of the optical system 104.
Table 2 shows the stereoscopic depth of field parameters of the optical system 104, according to an example of the present disclosure. The usable size of the virtual object is listed for a stereo-threshold of 150 arc sec.
In some examples, to achieve a larger stereoscopic depth of field, the combiner 206 includes a holographic optical element (HOE). An advantage of the holographic combiner in AR HUD optical systems is the ability to provide a wide field of view while maintaining the compactness of the AR HUD.
Another advantage, according to some examples of the present disclosure, is that the optical system 104 is suitable for integration into side-view AR HUDs, where a viewer observes the real world surrounding the vehicle 102 at an angle to the direction of travel, such as shown in
The following examples describe further example embodiments, from which numerous permutations and configurations will be apparent.
Example 1 provides an optical system for an augmented reality head-up display. The optical system includes a picture generation unit; a correcting optical unit configured to create, in a direction of a horizontal field of view, a monotonic variation of an optical path length of light rays propagating from the picture generation unit; and a combiner configured to redirect light rays propagating from the correcting optical unit toward an eye box, thereby producing one or more virtual images observable from the eye box; wherein the optical system provides a virtual image surface inclined in the direction of the horizontal field of view for displaying the one or more virtual images at different distances from the eye box, the virtual image surface having a non-zero angle between projections on a horizontal plane defined by a first axis and a second axis, the first axis being perpendicular to the virtual image surface and extending through an arbitrary intersection point on the virtual image surface, the second axis being parallel to a line of sight and extending through the arbitrary intersection point on the virtual image surface, such that a virtual image on a first side of the virtual image surface appears closer to the eye box than a virtual image on a second side of the virtual image surface.
Example 2 includes the subject matter of Example 1, wherein the correcting optical unit includes at least one optical element having at least one optical surface inclined in the direction of the horizontal field of view.
Example 3 includes the subject matter of any one of Examples 1 and 2, wherein the correcting optical unit includes at least one optical element having at least one optical surface with an asymmetrical cross-sectional profile.
Example 4 includes the subject matter of Example 1, wherein the correcting optical unit includes a combination of at least one optical element inclined in the direction of the horizontal field of view and at least one optical surface with an asymmetrical cross-sectional profile.
Example 5 includes the subject matter of any one of Examples 1-4, wherein the combiner includes a holographic optical element with a positive optical power.
Example 6 includes the subject matter of any one of Examples 1-5, wherein the correcting optical unit includes a telecentric lens located between the picture generation unit and the optical element, the optical element including a lens with a cylindrical surface and an aspherical surface; a mirror located between the optical element and the combiner, the mirror including a cylindrical surface; and an output lens located between the mirror and the combiner, the output lens including a first surface and a second surface, the first surface having an aspherical cross-sectional profile, the second surface having a spherical cross-sectional profile.
Example 7 includes the subject matter of any one of Examples 1-5, wherein the correcting optical unit includes a telecentric lens located between the picture generation unit and the optical element, the optical element including a lens with a cylindrical surface and an aspherical surface; a mirror located between the optical element and the combiner, the mirror including the at least one optical surface, the at least one optical surface having a freeform shape with an asymmetrical cross-sectional profile; and an output lens located between the mirror and the combiner, the output lens including a first surface and a second surface, the first surface having an aspherical cross-sectional profile, the second surface having a spherical cross-sectional profile.
Example 8 includes the subject matter of any one of Examples 1-5, wherein the correcting optical unit includes a telecentric lens located between the picture generation unit and the optical element, the optical element including a lens with a cylindrical surface and an aspherical surface; a mirror located between the optical element and the combiner, the mirror including a cylindrical surface; and an output lens located between the mirror and the combiner, the output lens including a first surface and a second surface, the first surface having a freeform shape with an asymmetrical cross-sectional profile in the direction of the horizontal field of view, the second surface having a spherical cross-sectional profile.
Example 9 includes the subject matter of any one of Examples 1-8, wherein the inclined virtual image surface is approximately planar.
Example 10 includes the subject matter of any one of Examples 1-9, wherein the correcting optical unit is implemented for a side-view perception functionality and provides the inclined virtual image surface being aligned with a direction of travel of a vehicle.
Example 11 provides an optical system for an augmented reality head-up display. The optical system includes a picture generation unit configured to generate an optical image; a correcting optical unit configured to create, in a direction of a horizontal field of view, a monotonic variation of a plurality of optical path lengths of light rays in the optical image propagating from the picture generation unit, thereby producing a plurality of modified optical images; and a combiner configured to redirect the modified optical images propagating from the correcting optical unit toward an eye box, thereby producing one or more virtual images observable from the eye box; wherein the optical system provides a virtual image surface inclined in the direction of the horizontal field of view for displaying the one or more virtual images at different distances from the eye box, the virtual image surface having a non-zero angle between projections on a horizontal plane defined by a first axis and a second axis, the first axis being perpendicular to the virtual image surface and extending through an arbitrary intersection point on the virtual image surface, the second axis being parallel to a line of sight and extending through the arbitrary intersection point on the virtual image surface, such that a virtual image on a first side of the virtual image surface appears closer to the eye box than a virtual image on a second side of the virtual image surface.
Example 12 includes the subject matter of Example 11, wherein the correcting optical unit includes at least one optical element having at least one optical surface inclined in the direction of the horizontal field of view.
Example 13 includes the subject matter of any one of Examples 11 and 12, wherein the correcting optical unit includes at least one optical element having at least one optical surface with an asymmetrical cross-sectional profile.
Example 14 includes the subject matter of Example 11, wherein the correcting optical unit includes a combination of at least one optical element inclined in the direction of the horizontal field of view and at least one optical surface with an asymmetrical cross-sectional profile.
Example 15 includes the subject matter of any one of Examples 11-14, wherein the combiner includes a holographic optical element with a positive optical power.
Example 16 includes the subject matter of any one of Examples 11-15, wherein the correcting optical unit includes a telecentric lens located between the picture generation unit and the optical element, the optical element including a lens with a cylindrical surface and an aspherical surface; a mirror located between the optical element and the combiner, the mirror including a cylindrical surface; and an output lens located between the mirror and the combiner, the output lens including a first surface and a second surface, the first surface having an aspherical cross-sectional profile, the second surface having a spherical cross-sectional profile.
Example 17 includes the subject matter of any one of Examples 11-15, wherein the correcting optical unit includes a telecentric lens located between the picture generation unit and the optical element, the optical element including a lens with a cylindrical surface and an aspherical surface; a mirror located between the optical element and the combiner, the mirror including the at least one optical surface, the at least one optical surface having a freeform shape with an asymmetrical cross-sectional profile; and an output lens located between the mirror and the combiner, the output lens including a first surface and a second surface, the first surface having an aspherical cross-sectional profile, the second surface having a spherical cross-sectional profile.
Example 18 includes the subject matter of any one of Examples 11-15, wherein the correcting optical unit includes a telecentric lens located between the picture generation unit and the optical element, the optical element including a lens with a cylindrical surface and an aspherical surface; a mirror located between the optical element and the combiner, the mirror including a cylindrical surface; and an output lens located between the mirror and the combiner, the output lens including a first surface and a second surface, the first surface having a freeform shape with an asymmetrical cross-sectional profile in the direction of the horizontal field of view, the second surface having a spherical cross-sectional profile.
Example 19 includes the subject matter of any one of Examples 11-18, wherein the inclined virtual image surface is approximately planar.
Example 20 includes the subject matter of any one of Examples 11-19, wherein the correcting optical unit is implemented for a side-view perception functionality and provides the inclined virtual image surface being aligned with a direction of travel of a vehicle.
The foregoing description and drawings of various embodiments are presented by way of example only. These examples are not intended to be exhaustive or to limit the present disclosure to the precise forms disclosed. Alterations, modifications, and variations will be apparent in light of this disclosure and are intended to be within the scope of the present disclosure as set forth in the claims. Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. Any references to examples, components, elements or acts of the systems and methods herein referred to in the singular can also embrace examples including a plurality, and any references in plural to any example, component, element or act herein can also embrace examples including only a singularity. References in the singular or plural form are not intended to limit the presently disclosed systems or methods, their components, acts, or elements. The use herein of “including,” “comprising,” “having,” “containing,” “involving,” and variations thereof is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. References to “or” can be construed as inclusive so that any terms described using “or” can indicate any of a single, more than one, and all of the described terms. In addition, in the event of inconsistent usages of terms between this document and documents incorporated herein by reference, the term usage in the incorporated references is supplementary to that of this document; for irreconcilable inconsistencies, the term usage in this document controls.