This disclosure relates to optical systems such as optical systems in electronic devices having displays.
Electronic devices can include displays that provide images near the eyes of a user. Such electronic devices are often virtual or augmented reality headsets with displays having optical elements that allow users to view the displays. If care is not taken, the components used to display images can be bulky and might not exhibit desired levels of optical performance.
An electronic device may have a display system for providing image light to eye boxes. The display system may include waveguides. Projectors may generate image light containing a virtual object. Input couplers may couple the image light into the waveguides. Output couplers may couple the image light out of the waveguides and towards the eye boxes. The eye boxes may have a field of view (FOV). The output couplers may also pass world light from external objects to the eye boxes within the FOV.
A first bias lens may transmit world light to the output coupler on a corresponding one of the waveguides. The output coupler may transmit the world light. A second bias lens may transmit the world light and the image light to the eye box. An electrically adjustable tint layer may transmit the world light towards the output coupler. The tint layer may be planar and may be layered onto a planar surface of the first bias lens and/or onto a planar surface of an additional lens having a curvature. If desired, the tint layer may be separated from the first bias lens by an air gap. If desired, the tint layer may be tilted at a non-parallel angle with respect to a surface of the waveguide.
If desired, the tint layer may be curved and may be separated from both the first bias lens and the waveguide by air gaps. The interfaces within the tint layer may be selected to minimize reflectivity for the image light. When the tint layer is planar, the tint layer may have one or more characteristics that enhance its planarity and parallelism with the waveguide, including increased substrate thicknesses, a set of spacers between the substrates of the tint layer, and/or spacers between the tint layer and the waveguide. If desired, one or more of the air gaps may be filled with low-index materials.
System 10 of
The operation of system 10 may be controlled using control circuitry 16. Control circuitry 16 may include storage and processing circuitry for controlling the operation of system 10. Control circuitry 16 may include storage such as hard disk drive storage, nonvolatile memory (e.g., electrically-programmable-read-only memory configured to form a solid state drive), volatile memory (e.g., static or dynamic random-access-memory), etc. Processing circuitry in control circuitry 16 may include one or more processors (e.g., microprocessors, microcontrollers, digital signal processors, baseband processors, etc.), power management units, audio chips, graphics processing units, application specific integrated circuits, and other integrated circuits. Software code may be stored on storage in control circuitry 16 and run on processing circuitry in control circuitry 16 to implement operations for system 10 (e.g., data gathering operations, operations involving the adjustment of components using control signals, image rendering operations to produce image content to be displayed for a user, etc.).
System 10 may include input-output circuitry such as input-output devices 12. Input-output devices 12 may be used to allow data to be received by system 10 from external equipment (e.g., a tethered computer, a portable device such as a handheld device or laptop computer, or other electrical equipment) and to allow a user to provide head-mounted device 10 with user input. Input-output devices 12 may also be used to gather information on the environment in which system 10 (e.g., head-mounted device 10) is operating. Output components in devices 12 may allow system 10 to provide a user with output and may be used to communicate with external electrical equipment. Input-output devices 12 may include sensors and other components 18 (e.g., image sensors for gathering images of real-world objects that are digitally merged with virtual objects on a display in system 10, accelerometers, depth sensors, light sensors, haptic output devices, speakers, batteries, wireless communications circuits for communicating between system 10 and external electronic equipment, etc.).
Projectors 26 may include liquid crystal displays, organic light-emitting diode displays, laser-based displays, or displays of other types. Projectors 26 may include light sources, emissive display panels, transmissive display panels that are illuminated with illumination light from light sources to produce image light, reflective display panels such as digital micromirror device (DMD) panels and/or liquid crystal on silicon (LCOS) display panels that are illuminated with illumination light from light sources to produce image light 30, etc.
Optical systems 22 may form lenses that allow a viewer (see, e.g., a viewer's eyes at eye box 24) to view images on display(s) 20. There may be two optical systems 22 (e.g., for forming left and right lenses) associated with respective left and right eyes of the user. A single display 20 may produce images for both eyes or a pair of displays 20 may be used to display images. In configurations with multiple displays (e.g., left and right eye displays), the focal length and positions of the lenses formed by system 22 may be selected so that any gap present between the displays will not be visible to a user (e.g., so that the images of the left and right displays overlap or merge seamlessly).
If desired, optical system 22 may contain components (e.g., an optical combiner formed from reflective components, diffractive components, a waveguide, a direct view optical combiner, etc.) to allow real-world light (sometimes referred to as world light) from real-world (external) objects such as real-world (external) object 28 to be combined optically with virtual (computer-generated) images such as virtual images in image light 30. In this type of system, which is sometimes referred to as an augmented reality system, a user of system 10 may view both real-world content (e.g., world light from object 28) and computer-generated content that is overlaid on top of the real-world content. Camera-based augmented reality systems may also be used in device 10 (e.g., in an arrangement in which a camera captures real-world images of object 28 and this content is digitally merged with virtual content at optical system 22).
System 10 may, if desired, include wireless circuitry and/or other circuitry to support communications with a computer or other external equipment (e.g., a computer that supplies display 20 with image content). During operation, control circuitry 16 may supply image content to display 20. The content may be remotely received (e.g., from a computer or other content source coupled to system 10) and/or may be generated by control circuitry 16 (e.g., text, other computer-generated content, etc.). The content that is supplied to display 20 by control circuitry 16 may be viewed by a viewer at eye box 24.
If desired, system 10 may include an optical sensor. The optical sensor may be used to gather optical sensor data associated with a user's eyes at eye box 24. The optical sensor may, for example, be a gaze tracking sensor that gathers optical sensor data such as gaze image data (gaze tracking image data or gaze tracking sensor data) from a user's eye at eye box 24. Control circuitry 16 may process the optical sensor data to identify and track the direction of the user's gaze in real time. Control circuitry 16 may perform any desired operations based on the tracked direction of the user's gaze over time.
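As a purely illustrative sketch (not part of the system described above), the following Python snippet shows one simple way a gaze direction could be estimated from pupil and glint positions detected in a gaze tracking image; the function name, the linear calibration model, and the example values are assumptions introduced here for illustration only.

```python
import numpy as np

def estimate_gaze_offset(pupil_center, glint_center, calibration_gain):
    """Estimate a 2D gaze offset from pupil and glint image coordinates.

    A minimal pupil-center/corneal-reflection (PCCR) sketch: the vector from
    the detected corneal glint to the detected pupil center is scaled by a
    per-user 2x2 calibration matrix (obtained from a calibration routine,
    not shown) to give an approximate gaze offset. All values are assumed.
    """
    offset = np.asarray(pupil_center, dtype=float) - np.asarray(glint_center, dtype=float)
    return calibration_gain @ offset

# Example usage with made-up numbers:
gain = np.array([[0.05, 0.0], [0.0, 0.05]])  # hypothetical calibration matrix
gaze = estimate_gaze_offset((312, 240), (300, 236), gain)
print(gaze)  # approximate gaze offset in normalized display units
```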
As shown in
Infrared emitter(s) 8 may direct light 4 towards optical system 22. Optical system 22 may direct the light 4 emitted by infrared emitter(s) 8 towards eye box 24. Light 4 may reflect off portions (regions) of the user's eye at eye box 24 as reflected light 4R (sometimes referred to herein as reflected sensor light 4R, which is a reflected version of light 4). Optical system 22 may receive reflected light 4R and may direct reflected light 4R towards infrared sensor(s) 6. Infrared sensor(s) 6 may receive reflected light 4R from optical system 22 and may gather (e.g., generate, measure, sense, produce, etc.) optical sensor data in response to the received reflected light 4R. Infrared sensor(s) 6 may include an image sensor or camera (e.g., an infrared image sensor or camera), for example. Infrared sensor(s) 6 may include, for example, one or more image sensor pixels (e.g., arrays of image sensor pixels). The optical sensor data may include image sensor data (e.g., image data, infrared image data, one or more images, etc.). Infrared sensor(s) 6 may pass the optical sensor data to control circuitry 16 for further processing. Infrared sensor(s) 6 and infrared emitter(s) 8 may be omitted if desired.
If desired, waveguide 32 may also include one or more layers of holographic recording media (sometimes referred to herein as holographic media, grating media, or diffraction grating media) on which one or more diffractive gratings are recorded (e.g., holographic phase gratings, sometimes referred to herein as holograms, surface relief gratings, etc.). A holographic recording may be stored as an optical interference pattern (e.g., alternating regions of different indices of refraction) within a photosensitive optical material such as the holographic media. The optical interference pattern may create a holographic phase grating that, when illuminated with a given light source, diffracts light to create a three-dimensional reconstruction of the holographic recording. The holographic phase grating may be a non-switchable diffractive grating that is encoded with a permanent interference pattern or may be a switchable diffractive grating in which the diffracted light can be modulated by controlling an electric field applied to the holographic recording medium. Multiple holographic phase gratings (holograms) may be recorded within (e.g., superimposed within) the same volume of holographic medium if desired. The holographic phase gratings may be, for example, volume holograms or thin-film holograms in the grating medium. The grating medium may include photopolymers, gelatin such as dichromated gelatin, silver halides, holographic polymer dispersed liquid crystal, or other suitable holographic media.
Diffractive gratings on waveguide 32 may include holographic phase gratings such as volume holograms or thin-film holograms, meta-gratings, or any other desired diffractive grating structures. The diffractive gratings on waveguide 32 may also include surface relief gratings (SRGs) formed on one or more surfaces of the substrates in waveguide 32 (e.g., as modulations in thickness of a SRG medium layer). The diffractive gratings may, for example, include multiple multiplexed gratings (e.g., holograms) that at least partially overlap within the same volume of grating medium (e.g., for diffracting different colors of light and/or light from a range of different input angles at one or more corresponding output angles). Other light redirecting elements such as louvered mirrors may be used in place of diffractive gratings in waveguide 32 if desired.
As shown in
Optical system 22 may include one or more optical couplers (e.g., light redirecting elements) such as input coupler 34, cross-coupler 36, and output coupler 38. In the example of
Waveguide 32 may guide image light 30 down its length via total internal reflection. Input coupler 34 may be configured to couple image light 30 from projector 26 into waveguide 32 (e.g., within a total internal reflection (TIR) range of the waveguide within which light propagates down the waveguide via TIR), whereas output coupler 38 may be configured to couple image light 30 from within waveguide 32 (e.g., propagating within the TIR range) to the exterior of waveguide 32 and towards eye box 24 (e.g., at angles outside of the TIR range). Input coupler 34 may include an input coupling prism, an edge or face of waveguide 32, a lens, a steering mirror or liquid crystal steering element, diffractive grating structures (e.g., volume holograms, SRGs, etc.), partially reflective structures (e.g., louvered mirrors), or any other desired input coupling elements.
As an example, projector 26 may emit image light 30 in direction +Y towards optical system 22. When image light 30 strikes input coupler 34, input coupler 34 may redirect image light 30 so that the light propagates within waveguide 32 via total internal reflection towards output coupler 38 (e.g., in direction +X within the TIR range of waveguide 32). When image light 30 strikes output coupler 38, output coupler 38 may redirect image light 30 out of waveguide 32 towards eye box 24 (e.g., back along the Y-axis). In implementations where cross-coupler 36 is formed on waveguide 32, cross-coupler 36 may redirect image light 30 in one or more directions as it propagates down the length of waveguide 32 (e.g., towards output coupler 38 from a direction of propagation as coupled into the waveguide by the input coupler). In redirecting image light 30, cross-coupler 36 may also perform pupil expansion on image light 30 in one or more directions. In expanding pupils of the image light, cross-coupler 36 may, for example, help to reduce the vertical size of waveguide 32 (e.g., in the Z direction) relative to implementations where cross-coupler 36 is omitted. Cross-coupler 36 may therefore sometimes also be referred to herein as pupil expander 36 or optical expander 36. If desired, output coupler 38 may also expand image light 30 upon coupling the image light out of waveguide 32.
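For background (an illustrative calculation, not a value taken from the description above), total internal reflection in waveguide 32 occurs only for light striking the waveguide surfaces beyond the critical angle set by the refractive indices of the waveguide material and the surrounding air:

$$\theta_c = \arcsin\!\left(\frac{n_{\mathrm{air}}}{n_{\mathrm{waveguide}}}\right) \approx \arcsin\!\left(\frac{1.0}{1.5}\right) \approx 41.8^\circ$$

With an assumed waveguide index of about 1.5, input coupler 34 would therefore need to redirect image light 30 to angles of roughly 42° or more from the surface normal for the light to be guided toward output coupler 38 via TIR.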
Input coupler 34, cross-coupler 36, and/or output coupler 38 may be based on reflective and refractive optics or may be based on diffractive (e.g., holographic) optics. In arrangements where couplers 34, 36, and 38 are formed from reflective and refractive optics, couplers 34, 36, and 38 may include one or more reflectors (e.g., an array of micromirrors, partial mirrors, louvered mirrors, or other reflectors). In arrangements where couplers 34, 36, and 38 are based on diffractive optics, couplers 34, 36, and 38 may include diffractive gratings (e.g., volume holograms, surface relief gratings, etc.).
The example of
The operation of optical system 22 on image light 30 is shown in
Image light 30 may include images of virtual objects, sometimes referred to herein as virtual object images or simply as virtual objects. Projector 26 may receive image data that includes the virtual object images (e.g., pixels of image data at different pixel locations that form the virtual object images). Output coupler 38 may serve to overlay the virtual object images with world light from real-world object 28 within the field of view (FOV) of eye box 24. The control circuitry for system 10 may provide image data to projector 26 that places the virtual object images at desired locations within the FOV at eye box 24 (e.g., such that the virtual object images are overlaid with desired real-world objects in the scene/environment in front of system 10).
Optical system 22 may include one or more lenses 40 that overlap output coupler 38. For example, optical system 22 may include at least a first lens 40A and a second lens 40B. Lens 40B may be interposed between waveguide 32 and real-world object 28. Lens 40A may be interposed between waveguide 32 and eye box 24. Lenses 40 are transparent and allow world light from real-world object 28 to pass to eye box 24 for viewing by the user. At the same time, the user can view virtual object images directed out of waveguide 32 and through lens 40A to eye box 24. Lenses 40A and 40B may sometimes also be referred to herein as lens elements.
The strength (sometimes referred to as the optical power, power, or diopter) of lens 40A can be selected to place virtual object images in image light 30 at a desired image distance (depth) from eye box 24 (sometimes referred to herein as a virtual object distance, virtual object image distance, virtual image distance (VID), virtual object depth, virtual image depth, or image depth). For example, it may be desirable to place virtual objects (virtual object images) such as text, icons, moving images, characters, effects, or other content or features at a certain virtual image distance (e.g., to integrate the virtual object image within, onto, into, or around the real-world objects in front of system 10). The placement of the virtual object at that distance can be accomplished by appropriate selection of the strength of lens 40A. Lens 40A may be a negative lens for users whose eyes do not have refractive errors. The strength of lens 40A (e.g., a larger net negative power) can therefore be selected to adjust the distance (depth) of the virtual object. Lens 40A may therefore sometimes be referred to herein as bias lens 40A or bias- (B-) lens 40A.
If desired, lens 40B may have a complementary power value (e.g., a positive power with a magnitude that matches the magnitude of the negative power of lens 40A). Lens 40B may therefore sometimes be referred to herein as bias+ (B+) lens 40B, complementary lens 40B, or compensation lens 40B. For example, if lens 40A has a power of −2.0 diopter, lens 40B may have an equal and opposite power of +2.0 diopter (as an example). In this type of arrangement, the positive power of lens 40B cancels the negative power of lens 40A. As a result, the overall power of lenses 40A and 40B taken together will be 0 diopter. This allows a viewer to view real-world objects such as real-world object 28 without optical influence from lenses 40A and 40B. For example, a real-world object 28 located far away from system 10 (effectively at infinity), may be viewed as if lenses 40A and 40B were not present.
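As an illustrative calculation using the example powers above (and treating the image light coupled out of waveguide 32 as collimated, which is an assumption made here only for illustration), the complementary powers cancel for world light while the negative bias lens alone sets the virtual image distance:

$$P_{40A} + P_{40B} = -2.0\ \mathrm{D} + 2.0\ \mathrm{D} = 0\ \mathrm{D}, \qquad d_{\mathrm{VID}} \approx \frac{1}{\lvert P_{40A} \rvert} = \frac{1}{2.0\ \mathrm{D}} = 0.5\ \mathrm{m}$$

In other words, with these example values the virtual objects would appear at roughly 0.5 m from eye box 24, while world light passes through the lens pair with no net optical power.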
For a user with satisfactory uncorrected vision, this type of complementary lens arrangement therefore allows virtual objects to be placed in close proximity to the user (e.g., at a virtual image distance of 0.5-5 m, at least 0.1 m, at least 1 m, at least 2 m, less than 20 m, less than 10 m, less than 5 m, or other suitable near-to-midrange distance from device 10) while simultaneously allowing the user to view real-world objects without modification by the optical components of the optical system. For example, a real-world object located at a distance of 2 m from device 10 (e.g., a real-world object being labeled by a virtual text label at a virtual image distance of 2 m) will optically appear to be located 2 m from device 10. This is merely illustrative and, if desired, lenses 40A and 40B need not be complementary lenses (e.g., lenses 40A and 40B may have any desired optical powers).
In addition, some users may require vision correction. Vision correction may be provided using tunable lenses, fixed (e.g., removable) lenses (sometimes referred to as supplemental lenses, vision correction lenses, removable lenses, or clip-on lenses), and/or by adjusting the optical power of lens 40A and/or lens 40B to implement the desired vision correction. In general, the vision correction imparted to the lens(es) may include corrections for ametropia (eyes with refractive errors) such as corrections for nearsightedness (myopia), corrections for farsightedness (hyperopia), corrections for astigmatism, corrections for skewed vision, corrections to help accommodate age-related reductions in the range of accommodation exhibited by the eyes (sometimes referred to as presbyopia), and/or corrections for other vision disorders.
Lenses 40A and 40B may be provided with any desired optical powers and any desired shapes (e.g., may be plano-convex lenses, plano-concave lenses, plano-freeform lenses, freeform-convex lenses, freeform-concave lenses, convex-concave lenses, freeform-freeform lenses, etc.). Implementations in which the optical power(s) of lenses 40A and/or 40B are fixed (e.g., upon manufacture) are described herein as an example. If desired, one or both of lenses 40A and/or 40B may be electrically adjustable to impart different optical powers or power profiles over time (e.g., lenses 40A and/or 40B may be adjustable/tunable liquid crystal lenses).
In some operating conditions, such as when system 10 is operated outdoors, in rooms with bright lighting, or in other environments having relatively high light levels, world light from real-world objects 28 can overpower or wash out virtual objects presented to eye box 24 in image light 30, thereby limiting the contrast and visibility of the virtual objects when viewed at eye box 24. To reduce the brightness of the world light and maximize the contrast of the images (virtual objects) in image light 30 when viewed at eye box 24, optical system 22 may include a light-absorbing layer such as tint layer 42. Tint layer 42 may be disposed within the optical path between real-world objects 28 and output coupler 38. The world light from real-world objects 28 may pass through tint layer 42 prior to reaching eye box 24 (e.g., tint layer 42 may transmit the world light without transmitting image light 30). Tint layer 42 may absorb some of the real-world light, thereby reducing its brightness and increasing the contrast of virtual objects in image light 30 at eye box 24. If desired, the tint layer may also absorb real-world light even when no virtual image is being displayed, functioning much like switchable sunglasses.
Tint layer 42 may be a fixed tint layer or may be a dynamically adjustable tint layer. When implemented as a fixed tint layer, tint layer 42 has a fixed transmission profile that absorbs the same amount of incident world light over time. Fixed tint layers may be formed from a polymer film containing dye and/or pigment (as an example). When implemented as a dynamically (electrically) adjustable tint layer, tint layer 42 has a dynamically (electrically) adjustable transmission profile. In these implementations, tint layer 42 may be controlled by control signals from control circuitry 16. Implementations in which tint layer 42 is a dynamically adjustable tint layer are described herein as an example. However, in general, tint layer 42 as described herein may be replaced with a fixed tint layer.
Electrically adjustable tint layers (sometimes referred to as electrically adjustable light modulators or electrically adjustable light modulator layers) may be formed from an organic or inorganic electrochromic light modulator layer or a guest-host liquid crystal light modulator layer. When implemented using organic electrochromic tint materials, the active tint materials in the tint layer may be formed from one or more polymer layers which change their absorption upon being oxidized or reduced by charge from adjacent electrodes, or the active tint materials in the tint layer may be made from one or more species of organic small molecules, which diffuse in a liquid or gel medium and change their absorption upon being oxidized or reduced by charge from adjacent electrodes. When implemented using inorganic electrochromic tint materials, the active tint materials may be formed from one or more metal oxides, which change their absorption upon being oxidized or reduced by charge from adjacent electrodes, and may include counter-ions. During operation of system 10, the electrically adjustable tint layer may be dynamically placed in a high transmission mode (sometimes referred to herein as a clear state) when it is desired to enhance the visibility of real-world objects or in a lower transmission mode (sometimes referred to herein as a dark state) when it is desired to reduce scene brightness and thereby help enhance the viewability of image light from projector 26 (e.g., to allow virtual objects such as virtual objects in image light 30 to be viewed without being overwhelmed by bright environmental light). If desired, tint layer 42 may also be controlled to exhibit intermediate levels of transmission and/or transmission levels that vary across the field of view of eye box 24.
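A minimal sketch, assuming a hypothetical ambient light reading in lux and a hypothetical tint driver interface, of the kind of control policy control circuitry 16 could apply when switching tint layer 42 between its clear and dark states; the threshold, transmission values, and function name are illustrative assumptions rather than details from the description above.

```python
def select_tint_transmission(ambient_lux, virtual_content_active,
                             clear_transmission=0.9, dark_transmission=0.3,
                             bright_threshold_lux=5000.0):
    """Return a target transmission for an electrically adjustable tint layer.

    Hypothetical policy: darken the tint layer when the scene is bright and
    virtual content is being displayed; otherwise keep it clear. All numeric
    values here are illustrative assumptions.
    """
    if virtual_content_active and ambient_lux > bright_threshold_lux:
        return dark_transmission
    return clear_transmission


# Example: bright outdoor scene while a virtual object is displayed.
target = select_tint_transmission(ambient_lux=20000.0, virtual_content_active=True)
# target == 0.3, so the control circuitry would drive the tint layer toward its dark state.
```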
Tint layer 42 may be planar (e.g., having a lateral surface that lies in a flat plane) or may be curved (e.g., having a lateral surface that is curved and non-planar). Tint layer 42 may be disposed at any desired location within optical system 22 between real-world objects 28 (e.g., the scene in front of system 10) and output coupler 38 on waveguide 32.
As shown in
In general, surface 48 may be planar (flat), may be curved in one dimension (e.g., may be a developable surface that is bent about a single axis and can therefore be flattened into a plane without distortion), may be curved in two dimensions (e.g., may be a non-developable surface that is bent about multiple axes such that the surface cannot be flattened into a plane without distortion), may be spherically curved, may be aspherically curved, may be elliptically curved, may be cylindrically curved, may be toroidally curved, may be convex, may be concave, may be freeform curved, or may exhibit two or more of these curvatures. Similarly, surface 50 may be planar, may be curved in one dimension, may be curved in two dimensions, may be spherically curved, may be aspherically curved, may be elliptically curved, may be cylindrically curved, may be toroidally curved, may be convex, may be concave, may be freeform curved, or may exhibit two or more of these curvatures. Surface 50 may have the same curvature as surface 48 (e.g., surface 50 may extend parallel to surface 48) or may have a different curvature from surface 48 (e.g., surfaces 48 and 50 may be non-parallel). For the sake of illustration, in the example of
As shown in
In general, surface 52 may be planar, may be curved in one dimension, may be curved in two dimensions, may be spherically curved, may be aspherically curved, may be elliptically curved, may be cylindrically curved, may be toroidally curved, may be convex, may be concave, may be freeform curved, or may exhibit two or more of these curvatures. Similarly, surface 54 may be planar, may be curved in one dimension, may be curved in two dimensions, may be spherically curved, may be aspherically curved, may be elliptically curved, may be cylindrically curved, may be toroidally curved, may be convex, may be concave, may be freeform curved, or may exhibit two or more of these curvatures. Surface 52 may have the same curvature as surface 54 (e.g., surface 52 may extend parallel to surface 54) or may have a different curvature from surface 54 (e.g., surfaces 52 and 54 may be non-parallel).
In the example of
If desired, the curvature of surface 54 of lens 46 may impart optical power to the reflected image light produced by tint layer 42. The optical power may serve to defocus (blur) the reflected light (e.g., virtual images in the reflected image light) at the location of eye box 24, thereby helping to reduce the visibility of ghost image artifacts at eye box 24 (e.g., by placing the reflected light outside of the eye's accommodation range). However, the concave curvature of surface 54 in the example of
To mitigate these issues, surface 54 may be provided with a convex curvature as shown in the example of
In the examples of
As shown in
The example of
The curvature of tint layer 42 (
If desired, tint layer 42 may be planar and may be tilted at a non-parallel angle with respect to lateral surface 44 of waveguide 32.
As shown in
The examples of
As shown in
The example of
Tint layer 42 may, for example, be curved in one dimension (e.g., may have a one-dimensional curvature), may be curved in two dimensions (e.g., may have a two-dimensional curvature), may be spherically curved (e.g., may have a spherical curvature), may be aspherically curved (e.g., may have an aspheric curvature), may be elliptically curved (e.g., may have an elliptical curvature), may be cylindrically curved (e.g., may have a cylindrical curvature), may be toroidally curved (e.g., may have a toroidal curvature), may be freeform curved (e.g., may have a freeform curvature), or may exhibit two or more of these curvatures. The curvature of tint layer 42 may be selected to impart the reflected image light with the desired optical power to defocus the reflected image light without causing the reflected image light to compete with the primary image light at the eye box (e.g., tint layer 42 of
As shown in
If desired, anti-reflective coatings (ARCs) such as anti-reflective coatings 68 may be disposed on one or both of the lateral surfaces of substrates 70 opposite tint material 64. In implementations where tint layer 42 is curved (see, e.g.,
In other implementations, substrates 70 may be formed from plastic that is molded or bent into a curved shape or tint layer 42 may be formed between two curved molded plastic substrates. The example of
In general, the reflectance of tint layer 42 is produced by the various interfaces of tint layer 42. The reflectance of these interfaces may be reduced to reduce the reflectivity of tint layer 42 and thus the amount of reflected image light directed back towards the eye box (e.g., minimizing the production of ghost image artifacts at the eye box). For example, anti-reflective coatings 68 may be configured to minimize reflection at the interface between substrates 70 and air. Additionally or alternatively, index matching layers 82 may be selected to effectively match the refractive index of electrode layers 66 (e.g., ITO) to the refractive index of substrates 70 (e.g., glass), thereby forming a smooth refractive index transition that minimizes reflection of image light. Index matching layers 82 may, for example, be multi-layer stacks of thin-films (e.g., thin-film interference filters) having refractive indices and/or thicknesses that are selected to produce optical reflections and transmissions that cause electrodes 66 and substrates 70 to exhibit the same effective refractive index. Additionally or alternatively, index matching layers 82 may be selected to effectively match the refractive index of electrode layers 66 to the refractive index of tint material 64.
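For reference, the normal-incidence Fresnel reflectance at an interface between media of refractive indices $n_1$ and $n_2$ illustrates why matching the effective indices suppresses the reflections that produce ghost images (the index values below are typical assumed values for ITO and glass, not values given in the description):

$$R = \left(\frac{n_1 - n_2}{n_1 + n_2}\right)^2, \qquad R_{\mathrm{ITO/glass}} \approx \left(\frac{1.9 - 1.5}{1.9 + 1.5}\right)^2 \approx 1.4\%$$

Index matching layers 82 that cause electrodes 66 and substrates 70 to present the same effective refractive index drive this interface reflectance toward zero.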
The examples of
Tint layer 42 and waveguide 32 may be configured to align the primary image light with the reflected image light by maximizing the parallelism between the lateral surfaces of tint layer 42 and the lateral surfaces of waveguide 32. In practice, non-parallelism can have many sources, such as imperfect waveguide flatness, imperfect tint layer flatness, imperfect tint cell gap uniformity, and a non-parallel orientation of the tint layer relative to the waveguide. Tint layer non-flatness can, for example, be produced by stress imbalances in glass treatments or glass coatings (e.g., chemical strengthening, the anti-reflective coatings, electrode coatings, etc.) and/or from stress imbalances in the cell assembly process (e.g., carrier tape delamination, epoxy stress (such as from curing shrinkage, CTE mismatch with glass, or non-uniform seal width), EC gel vacuum filling stress, UV plug injection pressure, and flexible printed circuit thickness).
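As a rough, illustrative estimate (not a value from the description above), a small tilt or wedge angle $\alpha$ between the reflecting surfaces of tint layer 42 and waveguide 32 angularly separates the reflected image from the primary image by approximately twice that angle:

$$\Delta\theta \approx 2\alpha, \qquad \text{e.g., } \alpha = 0.1^\circ \;\Rightarrow\; \Delta\theta \approx 0.2^\circ \approx 12\ \text{arcmin}$$

Even sub-degree departures from parallelism can therefore displace the reflected image by many times the eye's angular resolution, which is why the mitigations described below aim to keep the tint layer flat and parallel to the waveguide.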
To mitigate these issues and maximize the parallelism between the lateral surfaces of tint layer 42 and the lateral surfaces of waveguide 32, the thickness of the substrates 70 in tint layer 42 may be increased, additional spacers may be disposed between tint layer 42 and waveguide 32, and/or additional spacers may be disposed within tint layer 42.
As shown in
If desired, tint layer 42 may be separated from lateral surface 44 of waveguide 32 by air gap 86. Tint layer 42 may be mounted to lateral surface 44 of waveguide 32 using a peripheral ring of adhesive such as peripheral seal 84 (or using any other desired spacers at the edge or along the lateral periphery of tint layer 42). A set of one or more additional spacers 88 (e.g., epoxy beads) may be disposed within air gap 86 between lateral surfaces 92 and 44 (e.g., at uniform/regular intervals or other intervals across the lateral surface area of lateral surfaces 92 and 44). Spacers 88 may help to ensure that air gap 86 exhibits a uniform width across lateral surfaces 92 and 44, thereby helping to reinforce the flatness of tint layer 42 across its lateral area and thus maximizing parallelism between lateral surfaces 90 and 92 of tint layer 42 and lateral surfaces 44 and 94 of waveguide 32. This increased level of parallelism may, for example, serve to align the reflected image light with the primary image light, thereby preventing the formation of visible ghost image artifacts.
Additionally or alternatively, tint layer 42 may itself include additional spacers for maximizing parallelism, as shown in the example of
If desired, low-index materials can be used to fill one or more of the air gaps described herein. In the example of
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of U.S. Provisional Patent Application No. 63/425,547, filed Nov. 15, 2022, which is hereby incorporated by reference herein in its entirety.