This disclosure relates to optical systems such as optical systems in electronic devices having displays.
Electronic devices can include displays that provide images near the eyes of a user. Such electronic devices are often virtual or augmented reality headsets with displays having optical elements that allow users to view the displays overlaid with world light. If care is not taken, such optical systems might not exhibit desired levels of optical performance for viewing the displays and/or the world light.
An electronic device may have a display system for providing image light to eye boxes. The display system may include waveguides. Projectors may generate image light containing a virtual object. Input couplers may couple the image light into the waveguides. Output couplers may couple the image light out of the waveguides and towards the eye boxes. The eye boxes may have a field of view (FOV). The output couplers may also pass world light from external objects to the eye boxes within the FOV.
A first lens may transmit the world light to the output coupler. The output coupler may transmit the world light. A second lens may transmit the world light and the image light to the eye box. One or more surfaces of the first and second lenses may collectively have a first region overlapping a first range of elevation angles, a second region overlapping a second range of elevation angles lower than the first range of elevation angles, a corridor region overlapping a third range of elevation angles between the first and second ranges of elevation angles, and blending regions around the corridor region and/or the second region. The first range of elevation angles may overlap the FOV. At least some of the third range of elevation angles may overlap the FOV. At least some of the second range of elevation angles may overlap the FOV or the second range of elevation angles may be non-overlapping with respect to the FOV.
The first region may have a first radius of curvature to impart the world light and optionally the image light with a first optical power. The second region may have a second radius of curvature to impart the world light and optionally the image light with a second optical power. The corridor region may have gradient optical power and constant astigmatism. The blending regions may have variable astigmatism. The second region may be shifted downwards in elevation angle, the corridor may be elongated, and/or the blending regions may be formed away from the FOV to prevent the blending regions from introducing astigmatism to the image light at the eye box.
System 10 may include a head-mounted device such as head-mounted device 10 having one or more displays such as display 20.
The operation of system 10 may be controlled using control circuitry 16. Control circuitry 16 may include storage and processing circuitry for controlling the operation of system 10. Control circuitry 16 may include storage such as hard disk drive storage, nonvolatile memory (e.g., electrically-programmable-read-only memory configured to form a solid state drive), volatile memory (e.g., static or dynamic random-access-memory), etc. Processing circuitry in control circuitry 16 may include one or more processors (e.g., microprocessors, microcontrollers, digital signal processors, baseband processors, etc.), power management units, audio chips, graphics processing units, application specific integrated circuits, and other integrated circuits. Software code may be stored on storage in control circuitry 16 and run on processing circuitry in control circuitry 16 to implement operations for system 10 (e.g., data gathering operations, operations involving the adjustment of components using control signals, image rendering operations to produce image content to be displayed for a user, etc.).
System 10 may include input-output circuitry such as input-output devices 12. Input-output devices 12 may be used to allow data to be received by system 10 from external equipment (e.g., a tethered computer, a portable device such as a handheld device or laptop computer, or other electrical equipment) and to allow a user to provide head-mounted device 10 with user input. Input-output devices 12 may also be used to gather information on the environment in which system 10 (e.g., head-mounted device 10) is operating. Output components in devices 12 may allow system 10 to provide a user with output and may be used to communicate with external electrical equipment. Input-output devices 12 may include sensors and other components 18 (e.g., image sensors for gathering images of real-world objects that are digitally merged with virtual objects on a display in system 10, accelerometers, depth sensors, light sensors, haptic output devices, speakers, batteries, wireless communications circuits for communicating between system 10 and external electronic equipment, etc.).
Projectors 26 may include liquid crystal displays, light-emitting diode displays, laser-based displays, or displays of other types. Projectors 26 may include light sources, emissive display panels (e.g., micro light-emitting diode (uLED) panels), transmissive display panels (spatial light modulators) that are illuminated with illumination light from light sources to produce image light, reflective display panels (spatial light modulators) such as digital micromirror device (DMD) panels and/or liquid crystal on silicon (LCOS) display panels that are illuminated with illumination light from light sources to produce image light 30, etc.
Optical systems 22 may form lenses that allow a viewer (see, e.g., a viewer's eyes at eye box 24) to view images on display(s) 20. There may be two optical systems 22 (e.g., for forming left and right lenses) associated with respective left and right eyes of the user. A single display 20 may produce images for both eyes or a pair of displays 20 may be used to display images. In configurations with multiple displays (e.g., left and right eye displays), the focal length and positions of the lenses formed by system 22 may be selected so that any gap present between the displays will not be visible to a user (e.g., so that the images of the left and right displays overlap or merge seamlessly).
If desired, optical system 22 may contain components (e.g., an optical combiner formed from reflective components, diffractive components, a waveguide, a direct view optical combiner, etc.) to allow real-world light (sometimes referred to as world light, external light, or scene light) from real-world objects such as real-world (external) object 28 from the scene (environment) in front of or around device 10 to be combined optically with virtual (computer-generated) images such as virtual images in the image light 30 emitted by projector(s) 26. In this type of system, which is sometimes referred to as an augmented reality system, a user of system 10 may view both real-world content (e.g., world light from real-world object 28) and computer-generated content that is overlaid on top of the real-world content. Camera-based augmented reality systems may also be used in device 10 (e.g., in an arrangement in which a camera captures real-world images of real-world object 28 and this content is digitally merged with virtual content at optical system 22).
System 10 may, if desired, include wireless circuitry and/or other circuitry to support communications with a computer or other external equipment (e.g., a computer that supplies display 20 with image content). During operation, control circuitry 16 may supply image content to display 20. The content may be remotely received (e.g., from a computer or other content source coupled to system 10) and/or may be generated by control circuitry 16 (e.g., text, other computer-generated content, etc.). The content that is supplied to display 20 by control circuitry 16 may be viewed by a viewer at eye box 24.
If desired, system 10 may include an optical sensor. The optical sensor may be used to gather optical sensor data associated with a user's eyes at eye box 24. The optical sensor may, for example, be a gaze tracking sensor that gathers optical sensor data such as gaze image data (gaze tracking image data or gaze tracking sensor data) from a user's eye at eye box 24. Control circuitry 16 may process the optical sensor data to identify and track the direction of the user's gaze in real time. Control circuitry 16 may perform any desired operations based on the tracked direction of the user's gaze over time.
If desired, waveguide 32 may also include one or more layers of holographic recording media (sometimes referred to herein as holographic media, grating media, or diffraction grating media) on which one or more diffractive gratings are recorded (e.g., holographic phase gratings, sometimes referred to herein as holograms, surface relief gratings, etc.). A holographic recording may be stored as an optical interference pattern (e.g., alternating regions of different indices of refraction) within a photosensitive optical material such as the holographic media. The optical interference pattern may create a holographic phase grating that, when illuminated with a given light source, diffracts light to create a three-dimensional reconstruction of the holographic recording. The holographic phase grating may be a non-switchable diffractive grating that is encoded with a permanent interference pattern or may be a switchable diffractive grating in which the diffracted light can be modulated by controlling an electric field applied to the holographic recording medium. Multiple holographic phase gratings (holograms) may be recorded within (e.g., superimposed within) the same volume of holographic medium if desired. The holographic phase gratings may be, for example, volume holograms or thin-film holograms in the grating medium. The grating medium may include photopolymers, gelatin such as dichromated gelatin, silver halides, holographic polymer dispersed liquid crystal, or other suitable holographic media.
Diffractive gratings on waveguide 32 may include holographic phase gratings such as volume holograms or thin-film holograms, meta-gratings, or any other desired diffractive grating structures. The diffractive gratings on waveguide 32 may also include surface relief gratings (SRGs) formed on one or more surfaces of the substrates in waveguide 32 (e.g., as modulations in thickness of a SRG medium layer), gratings formed from patterns of metal structures (e.g., meta structures or surfaces), etc. The diffractive gratings may, for example, include multiple multiplexed gratings (e.g., holograms) that at least partially overlap within the same volume of grating medium (e.g., for diffracting different colors of light and/or light from a range of different input angles at one or more corresponding output angles according to the Bragg matching conditions of the holograms). Other light redirecting elements such as louvered mirrors may be used in place of diffractive gratings in waveguide 32 if desired.
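The Bragg matching conditions mentioned above determine which wavelength and angle combinations a given volume hologram diffracts efficiently. As an illustrative, non-limiting sketch (the formula is the textbook Bragg condition for a volume phase grating; the index, pitch, and angle below are hypothetical values not specified by this disclosure):

```python
import math

def bragg_wavelength_nm(n_medium: float, pitch_nm: float,
                        bragg_angle_deg: float, order: int = 1) -> float:
    """Bragg-matched vacuum wavelength for a volume phase grating,
    via m * wavelength = 2 * n * pitch * sin(theta_B), where theta_B
    is measured between the incident ray and the grating planes
    inside the medium."""
    theta_b = math.radians(bragg_angle_deg)
    return 2.0 * n_medium * pitch_nm * math.sin(theta_b) / order

# Example: 300 nm fringe spacing in an n = 1.5 medium probed at a
# 30-degree internal Bragg angle is matched near 450 nm (blue).
print(f"{bragg_wavelength_nm(1.5, 300.0, 30.0):.0f} nm")
```

Multiplexed holograms exploit this selectivity: each grating recorded in the same volume responds only near its own Bragg-matched wavelength/angle pair.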
As shown in the accompanying drawings, optical system 22 may include a waveguide such as waveguide 32.
Optical system 22 may include one or more optical couplers (e.g., light redirecting elements) such as input coupler 34, cross-coupler 31, and output coupler 38. In an illustrative example, one or more of these couplers may be formed on or in waveguide 32.
Waveguide 32 may guide image light 30 down its length via total internal reflection. Input coupler 34 may be configured to couple image light 30 from projector 26 into waveguide 32 (e.g., within a total-internal reflection (TIR) range of the waveguide within which light propagates down the waveguide via TIR), whereas output coupler 38 may be configured to couple image light 30 from within waveguide 32 (e.g., propagating within the TIR range) to the exterior of waveguide 32 and towards eye box 24 (e.g., at angles outside of the TIR range). Input coupler 34 may include an input coupling prism, an edge or face of waveguide 32, a lens, a steering mirror or liquid crystal steering element, diffractive grating structures (e.g., volume holograms, SRGs, etc.), partially reflective structures (e.g., louvered mirrors), or any other desired input coupling elements.
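For background, the TIR range referenced above follows from Snell's law: rays that strike the waveguide faces at internal angles beyond the critical angle are trapped and guided down the waveguide. A minimal sketch, assuming a hypothetical substrate index (this disclosure does not specify one):

```python
import math

def tir_critical_angle_deg(n_waveguide: float, n_surround: float = 1.0) -> float:
    """Critical angle (measured from the surface normal) above which
    light is guided by total internal reflection, from Snell's law:
    sin(theta_c) = n_surround / n_waveguide."""
    return math.degrees(math.asin(n_surround / n_waveguide))

# Example: a waveguide substrate with index 1.8 in air guides rays
# striking its faces at internal angles steeper than ~33.7 degrees.
print(f"critical angle: {tir_critical_angle_deg(1.8):.1f} deg")
```

Input coupler 34 must therefore redirect incoming image light 30 to angles beyond this critical angle, and output coupler 38 must redirect it back out of that range.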
As an example, projector 26 may emit image light 30 in direction +Y towards optical system 22. When image light 30 strikes input coupler 34, input coupler 34 may redirect image light 30 so that the light propagates within waveguide 32 via total internal reflection towards output coupler 38 (e.g., in direction +X within the TIR range of waveguide 32). When image light 30 strikes output coupler 38, output coupler 38 may redirect image light 30 out of waveguide 32 towards eye box 24 (e.g., back along the Y-axis). In implementations where cross-coupler 31 is formed on waveguide 32, cross-coupler 31 may redirect image light 30 in one or more directions as it propagates down the length of waveguide 32 (e.g., towards output coupler 38 from a direction of propagation as coupled into the waveguide by the input coupler). In redirecting image light 30, cross-coupler 31 may also perform pupil expansion on image light 30 in one or more directions. In expanding pupils of the image light, cross-coupler 31 may, for example, help to reduce the vertical size of waveguide 32 (e.g., in the Z direction) relative to implementations where cross-coupler 31 is omitted. Cross-coupler 31 may therefore sometimes also be referred to herein as pupil expander 31 or optical expander 31. If desired, output coupler 38 may also expand image light 30 upon coupling the image light out of waveguide 32.
Input coupler 34, cross-coupler 31, and/or output coupler 38 may be based on reflective and refractive optics or may be based on diffractive (e.g., holographic) optics. In arrangements where couplers 34, 31, and 38 are formed from reflective and refractive optics, couplers 34, 31, and 38 may include one or more reflectors (e.g., an array of micromirrors, partial mirrors, louvered mirrors, or other reflectors). In arrangements where couplers 34, 31, and 38 are based on diffractive optics, couplers 34, 31, and 38 may include diffractive gratings (e.g., volume holograms, surface relief gratings, etc.).
These examples are merely illustrative.
The operation of optical system 22 on image light 30 is illustrated in the accompanying drawings.
Image light 30 may include images of virtual objects, sometimes referred to herein as virtual object images, virtual images, or simply as virtual objects. Projector 26 may receive image data that includes the virtual object images (e.g., pixels of image data at different pixel locations that form the virtual object images). Output coupler 38 may serve to overlay the virtual object images with world light 42 from real-world object 28 within the field of view (FOV) of eye box 24. The control circuitry for system 10 may provide image data to projector 26 that places the virtual object images at desired locations within the FOV at eye box 24 (e.g., such that the virtual object images are overlaid with desired real-world objects in the scene/environment in front of system 10).
Optical system 22 may include one or more lenses 40 that overlap output coupler 38 (sometimes referred to herein as bias lens(es) 40). For example, optical system 22 may include at least a first lens 40A and a second lens 40B. Lens 40B may be interposed between waveguide 32 and real-world object 28 (e.g., the scene or environment in front of device 10). Lens 40A may be interposed between waveguide 32 and eye box 24 (e.g., the user's eye while wearing device 10). Lenses 40 are transparent and allow world light 42 from real-world object 28 to pass to eye box 24 for viewing by the user. At the same time, the user can view virtual object images in the image light 30 directed out of waveguide 32 and through lens 40A to eye box 24.
The strength (sometimes referred to as the optical power, power, or diopter) of lens 40A can be selected to place virtual object images in image light 30 at a desired image distance (depth) from eye box 24 (sometimes referred to herein as a virtual object distance, virtual object image distance, virtual image distance (VID), virtual object depth, virtual image depth, or image depth). For example, it may be desirable to place virtual objects (virtual object images) such as text, icons, moving images, characters, effects, or other content or features at a certain virtual image distance (e.g., to integrate the virtual object image within, onto, into, or around the real-world objects in front of system 10). The placement of the virtual object at that distance can be accomplished by appropriate selection of the strength of lens 40A. Lens 40A may be a negative lens for users whose eyes do not have refraction errors. The strength of lens 40A (e.g., a larger or smaller net negative power) can therefore be selected to adjust the distance (depth) of the virtual object, with a stronger negative power placing the virtual object at a closer depth. Lens 40A may therefore sometimes be referred to herein as bias lens 40A or bias− (B−) lens 40A.
If desired, lens 40B may have a complementary power value (e.g., a positive power with a magnitude that matches the magnitude of the negative power of lens 40A). Lens 40B may therefore sometimes be referred to herein as bias+ (B+) lens 40B, complementary lens 40B, or compensation lens 40B. For example, if lens 40A has a power of −2.0 diopters, lens 40B may have an equal and opposite power of +2.0 diopters. In this type of arrangement, the positive power of lens 40B cancels the negative power of lens 40A, so the overall power of lenses 40A and 40B taken together is 0 diopters. This allows a viewer to view real-world objects such as real-world object 28 without optical influence from lenses 40A and 40B. For example, a real-world object 28 located far away from system 10 (effectively at infinity) may be viewed as if lenses 40A and 40B were not present.
For a user with satisfactory uncorrected vision, this type of complementary lens arrangement therefore allows virtual objects to be placed in close proximity to the user (e.g., at a virtual image distance of 0.5-5 m, at least 0.1 m, at least 1 m, at least 2 m, less than 20 m, less than 10 m, less than 5 m, or another suitable near-to-midrange distance from device 10) while simultaneously allowing the user to view real-world objects without modification by the optical components of the optical system. For example, a real-world object located at a distance of 2 m from device 10 (e.g., a real-world object being labeled by a virtual text label at a virtual image distance of 2 m) will optically appear to be located 2 m from device 10. This is merely illustrative and, if desired, lenses 40A and 40B need not be complementary lenses (e.g., lenses 40A and 40B may have any desired optical powers).
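The complementary-lens arithmetic above can be made concrete with the thin-lens approximation. In the following minimal sketch, the lenses are treated as ideal thin lenses in close contact (powers add), and the image light leaving the waveguide is assumed to be collimated so that a bias lens of power P places virtual images at 1/|P|; the numbers echo the −2.0/+2.0 diopter example from the text:

```python
def net_power_d(p_a: float, p_b: float) -> float:
    """Approximate combined power (diopters) of two thin lenses in
    close contact: the powers simply add."""
    return p_a + p_b

def virtual_image_distance_m(p_bias: float) -> float:
    """Distance (m) at which a negative bias lens of power p_bias
    places a virtual image of collimated light: VID = 1 / |P|."""
    return 1.0 / abs(p_bias)

# A -2.0 D bias lens (40A) paired with a +2.0 D compensation lens
# (40B) leaves world light unaffected (net 0 D) while placing the
# virtual object images 0.5 m from the eye box.
print(net_power_d(-2.0, +2.0))           # 0.0
print(virtual_image_distance_m(-2.0))    # 0.5
```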
In addition, some users may require vision correction. Vision correction may be provided using tunable lenses, fixed (e.g., removable) lenses (sometimes referred to as supplemental lenses, vision correction lenses, removable lenses, or clip-on lenses), and/or by adjusting the optical power of lens 40A and/or lens 40B to implement the desired vision correction. In general, the vision correction imparted to the lens(es) may include corrections for ametropia (eyes with refractive errors) such as corrections for nearsightedness (myopia), corrections for farsightedness (hyperopia), corrections for astigmatism, corrections for skewed vision, corrections to help accommodate age-related reductions in the range of accommodation exhibited by the eyes (sometimes referred to as presbyopia), and/or corrections for other vision disorders.
Lenses 40A and 40B may be provided with any desired optical powers and any desired shapes (e.g., may be plano-convex lenses, plano-concave lenses, plano-freeform lenses, freeform-convex lenses, freeform-concave lenses, convex-concave lenses, etc.). Implementations in which the optical power(s) of lenses 40A and/or 40B are fixed (e.g., upon manufacture) are described herein as an example. If desired, one or both of lenses 40A and/or 40B may be electrically adjustable to impart different optical powers or power profiles over time (e.g., lenses 40A and/or 40B may be adjustable/tunable liquid crystal lenses).
Lens 40A may provide image light 30 coupled out of waveguide 32 (e.g., by output coupler 38) with a negative optical power that places the virtual object images in image light 30 at a corresponding virtual image distance (VID) from eye box 24.
The vergence-accommodation conflict (VAC) is a well-documented phenomenon affecting the comfort of viewing three-dimensional images generated by near-to-eye displays such as system 10. Some systems (e.g., extended-reality (XR) systems) mitigate VAC by placing virtual objects at a fixed virtual image distance (VID), where lens 40A is provided with a focal length that places the virtual objects at the fixed VID and the fixed VID is selected to minimize viewing discomfort when viewing virtual objects projected within the working range of the system. In such systems, VAC is generally greatest at short VIDs such as 0.5 m or less.
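VAC is commonly quantified as the dioptric difference between the distance at which the eyes converge and the distance at which they must focus. The following minimal sketch uses that convention with hypothetical distances (the convention and numbers are illustrative context, not values from this disclosure):

```python
def vac_diopters(vergence_distance_m: float, focal_distance_m: float) -> float:
    """Dioptric mismatch between where the eyes converge (driven by
    stereo disparity) and where they must accommodate (the fixed VID)."""
    return abs(1.0 / vergence_distance_m - 1.0 / focal_distance_m)

# With a fixed VID of 2.0 m, a virtual object drawn stereoscopically
# at 0.5 m produces 1.5 D of conflict, while one drawn at 4.0 m
# produces only 0.25 D -- the conflict is worst for near content.
print(vac_diopters(0.5, 2.0))  # 1.5
print(vac_diopters(4.0, 2.0))  # 0.25
```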
In augmented reality systems such as optical system 22, the virtual object images are overlaid with world light 42 from real-world objects 28 that may be located at a variety of distances from eye box 24 and at a variety of angles within FOV 60.
However, in most everyday real-world conditions, real-world objects 28 in the lower part of FOV 60 of eye box 24 tend to be closer to eye box 24 than objects in the upper part of FOV 60. For example, real-world objects 28 located at relatively low elevation angles such as angles within low-elevation-angle portion (subset) 48 of FOV 60 (e.g., negative elevation angles that are less than a first threshold angle with respect to optical axis 43 of lenses 40A and 40B) may typically be located at a relatively close distance such as distance D1 from eye box 24. Portion 48 of FOV 60 may therefore sometimes be referred to herein as near-field portion 48 or low angle portion 48 of FOV 60. Eye box 24 may receive world light 42B from real-world objects 28 located within near-field portion 48 of FOV 60 (e.g., external objects often or typically located at relatively close distances such as distance D1).
On the other hand, real-world objects 28 located at relatively high elevation angles such as angles within high-elevation-angle portion (subset) 44 of FOV 60 (e.g., positive elevation angles that are greater than a second threshold angle with respect to optical axis 43) may typically be located at a relatively far distance such as distance D2 from eye box 24. Portion 44 of FOV 60 may therefore sometimes be referred to herein as far-field portion 44 or high angle portion 44 of FOV 60. Eye box 24 may receive world light 42A from real-world objects 28 located within far-field portion 44 of FOV 60 (e.g., external objects often or typically located at relatively far distances such as distance D2). Real-world objects 28 may also be present within intermediate-elevation-angle portion (subset) 46 of FOV 60 (e.g., elevation angles at and around optical axis 43 that are less than the elevation angles associated with far-field portion 44 but greater than the elevation angles associated with near-field portion 48 of FOV 60). Real-world objects 28 located within intermediate portion 46 of FOV 60 may typically or often be located at intermediate distances between distances D1 and D2.
Consider one practical example in which a user of device 10 is reading a book. The book is held in hand at around D1=0.5 m from the observer and occupies the lower portion of the FOV (e.g., near-field portion 48), whereas the background scene at D2=10-100+ m from the observer is located in the upper part of the FOV (e.g., far-field portion 44). As another example, when a user is driving a car, the lower portion of the driver's FOV (e.g., near-field portion 48) is occupied by controls and displays for the car, which are located around D1=40 cm from the driver. On the other hand, the car directly ahead of the driver is typically located around the middle of the FOV (e.g., within intermediate portion 46) and around 10 m from the driver, and the background scene is located around the top of the FOV (e.g., within far-field portion 44) and around D2=10-100+ m from the driver. As yet another example, when an observer is manipulating or preparing ingredients for a meal in their kitchen, the ingredients are typically within arm's reach (e.g., within D1=40 cm) and within the bottom portion of the observer's FOV (e.g., near-field portion 48). The observer may be following a written or displayed recipe that is just beyond arm's reach (e.g., around 70 cm away) and that occupies the middle of the observer's FOV (e.g., intermediate portion 46). At the same time, the observer may be watching television or observing children in the background at a distance of D2=several meters, occupying the top portion of the observer's FOV (e.g., far-field portion 44).
In some implementations, lens 40A has a fixed virtual image distance (VID) that is invariant (constant) across all of field of view 60. This configures the virtual object images in image light 30 to be provided to eye box 24 at the same fixed VID regardless of the angular location of the virtual object image within FOV 60. However, real-world objects 28 will often be present at distances that are different from the fixed VID unless located at or around an elevation angle of zero degrees. For other portions of the FOV, real-world object 28 is therefore likely to be at a different distance from eye box 24 than the virtual object image at the fixed VID. In that case, the user will be unable to properly focus on both the virtual object image and the real-world object at the same time, one of the two objects will appear out of focus, and the display will cause viewing discomfort for the user. At the same time, if desired, lenses 40A and/or 40B may be used to provide a progressive prescription to allow the user to view real-world objects 28 at different distances with respect to eye box 24 (e.g., to allow the user to properly and comfortably focus on real-world objects 28 within near-field portion 48 at a relatively close distance, real-world objects 28 within far-field portion 44 at a relatively far distance, and real-world objects within intermediate portion 46 at intermediate distances).
To help mitigate these issues, lens 40A and/or lens 40B may exhibit a progressive prescription in which the lens(es) exhibit different optical powers at different elevation angles or in different regions of FOV 60 (e.g., the lens(es) may be configured to impart image light 30 and/or world light 42 with different optical powers at different points within the eye box and/or at different angles within FOV 60). The different optical powers may, for example, configure lens 40A to provide virtual object images at different respective VIDs within the different regions of FOV 60 (e.g., at least within regions 44, 46, and 48) to more closely match the expected locations of real-world objects 28, and/or may configure lenses 40A and 40B to collectively allow a user to focus on real-world objects 28 in world light 42 within the different regions of FOV 60 (e.g., at least within regions 44, 46, and 48), thereby minimizing focus conflict and viewing discomfort (e.g., even if the user requires a progressive prescription to view real-world objects 28).
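As an illustrative, non-limiting sketch of what such an elevation-dependent power profile could look like, the following assigns a constant power to the far-field region, a constant stronger power to the near-field region, and a smooth gradient along the corridor; all region boundaries and power values are hypothetical, since this disclosure does not specify numeric values:

```python
def target_power_d(elevation_deg: float,
                   far_power_d: float = -0.5,    # hypothetical far-field power
                   near_add_d: float = 2.0,      # hypothetical near-field "add"
                   corridor_top_deg: float = 5.0,
                   corridor_bottom_deg: float = -15.0) -> float:
    """Illustrative progressive power profile versus elevation angle:
    constant in the far-field region, constant (stronger) in the
    near-field region, and a linear ramp along the corridor."""
    if elevation_deg >= corridor_top_deg:
        return far_power_d
    if elevation_deg <= corridor_bottom_deg:
        return far_power_d + near_add_d
    # Linear ramp through the corridor between the two regions.
    t = (corridor_top_deg - elevation_deg) / (corridor_top_deg - corridor_bottom_deg)
    return far_power_d + t * near_add_d

for angle_deg in (15, 5, -5, -15, -25):
    print(angle_deg, round(target_power_d(angle_deg), 2))
```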
As shown in the accompanying drawings, one or more surfaces of lens(es) 40 may collectively include multiple regions that impart different optical powers, such as a region 68 and a region 70.
Region 68 may overlap far-field portion 44 of FOV 60 and may have a first radius of curvature to impart the world light (and optionally the image light) with a first optical power. Region 68 may therefore sometimes be referred to herein as far-field region 68.
Region 70 may overlap near-field portion 48 of FOV 60 and may have a second radius of curvature to impart the world light (and optionally the image light) with a second optical power. Region 70 may therefore sometimes be referred to herein as near-field region 70.
Lens(es) 40 may also have a corridor region 62 that extends from far-field region 68 to near-field region 70. Corridor region 62 may, for example, overlap intermediate portion 46 of FOV 60. Corridor region 62 may exhibit a gradient of optical power that transitions between the first optical power of far-field region 68 and the second optical power of near-field region 70 while exhibiting constant astigmatism.
Lens(es) 40 may also include blending regions 64 that are laterally located between near-field region 70 and far-field region 68 and that surround at least a portion of both sides of corridor region 62. Blending regions 64 may sometimes also be referred to herein as boundary regions 64, progressive blending regions 64, or transition regions 64. Blending regions 64 exhibit changing (non-constant or variable) astigmatism, as illustrated by the multiple contour lines 66 of constant astigmatism within each blending region 64.
Blending regions 64 can impart substantial astigmatism to light that passes through lens(es) 40 within blending regions 64. If blending regions 64 overlap FOV 60, this astigmatism can undesirably distort the image light and the world light received at eye box 24.
To mitigate these issues, the geometry of lens(es) 40 can be shaped such that blending regions 64 do not overlap FOV 60, or overlap no more than an insubstantial portion of FOV 60.
Near-field region 70 may at least partially overlap FOV 60 (e.g., corridor region 62 may be elongated to exhibit length L2, which helps to push blending regions 64 away from or out of FOV 60, thereby preventing blending regions 64 from introducing astigmatism to the image light at eye box 24).
This example is merely illustrative.
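The benefit of elongating corridor region 62 can be appreciated through Minkwitz's theorem, a classical result for progressive lens surfaces (cited here as general optics background rather than as language from this disclosure): near the corridor, surface astigmatism grows laterally at roughly twice the rate at which power changes along the corridor, so lengthening the corridor lowers the power gradient and, in turn, the astigmatism at a given lateral offset. A minimal sketch with hypothetical numbers:

```python
def minkwitz_astigmatism_d(add_power_d: float,
                           corridor_length_mm: float,
                           lateral_offset_mm: float) -> float:
    """Rough Minkwitz estimate: astigmatism ~= 2 * (power gradient
    along the corridor) * (lateral distance from the corridor)."""
    power_gradient_d_per_mm = add_power_d / corridor_length_mm
    return 2.0 * power_gradient_d_per_mm * lateral_offset_mm

# Doubling the corridor length halves the astigmatism at a given
# lateral offset, pushing the worst blending-region astigmatism
# further from the display FOV.
print(minkwitz_astigmatism_d(2.0, 10.0, 5.0))  # 2.0 D
print(minkwitz_astigmatism_d(2.0, 20.0, 5.0))  # 1.0 D
```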
In some examples, as shown in the accompanying drawings, lens 40A may have an optical axis 82 and lens 40B may have an optical axis 84.
In practice, lens 40A may be misaligned or offset with respect to lens 40B. In these implementations, optical axis 82 is offset or misaligned with respect to optical axis 84 by offset 80. This offset may be due to requirements given by the form factor of system 10 (e.g., to accommodate the presence of other components and/or to allow system 10 to be comfortably worn on a user's head) and/or to accommodate a particular interpupillary distance (IPD) of the user.
If care is not taken, offset 80 may cause undesirable refraction of the world light relative to the image light and/or eye box 24. This may cause some of the light to reach eye box 24 at an incorrect position/angle, may cause world light from undesired angles to be directed to eye box 24, may cause misalignment between virtual objects and real world objects when viewed at the eye box, and/or may cause undesirable light loss.
To mitigate these issues, an optical wedge may be incorporated into lens 40A. The optical wedge may mitigate or counteract refraction of the world light (e.g., world light 42) by lens 40B due to offset 80. For example, a surface 52 of lens 40A may be tilted by an angle 86 to form the optical wedge, redirecting the world light towards eye box 24 despite offset 80.
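For a sense of the magnitudes involved, the prismatic deviation produced when light passes through a lens away from its optical axis can be estimated with Prentice's rule, and a thin-prism approximation then yields a compensating wedge angle. This is an illustrative sketch with hypothetical power, offset, and index values, not a design formula from this disclosure:

```python
import math

def prentice_prism_diopters(lens_power_d: float, decentration_mm: float) -> float:
    """Prentice's rule: prism (prism diopters) = decentration (cm) x power (D)."""
    return abs(lens_power_d) * (decentration_mm / 10.0)

def wedge_angle_deg(prism_diopters: float, n_lens: float = 1.5) -> float:
    """Thin-prism wedge angle producing a matching deviation: the
    deviation angle is atan(prism / 100), and a thin prism deviates
    light by (n - 1) times its wedge angle."""
    deviation_deg = math.degrees(math.atan(prism_diopters / 100.0))
    return deviation_deg / (n_lens - 1.0)

# A +2.0 D world-facing lens decentered 3 mm relative to the eye-side
# lens axis deviates world light by 0.6 prism diopters (~0.34 deg);
# an n = 1.5 wedge of roughly 0.69 degrees could counteract it.
prism = prentice_prism_diopters(2.0, 3.0)
print(round(prism, 2), round(wedge_angle_deg(prism), 2))
```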
If desired, the projector may distort, warp, or otherwise adjust the image data used to generate image light 30 in a manner that compensates for or mitigates any bending of image light 30 by the tilted surface 52 of lens 40A. Additionally or alternatively, one or more diffractive gratings may be layered onto surface 52 (e.g., a planar surface, a curved surface, or a surface tilted at angle 86). The diffractive grating(s) (e.g., surface relief gratings, volume holograms, thin film holograms, metasurfaces, etc.) may diffract the world light transmitted by lens 40B onto output angles that serve to compensate for or reverse any prismatic bending of the world light after transmission through lens 40B given offset 80 (e.g., the diffractive grating(s) may perform similar redirection of the world light via diffraction as performed via refraction by tilting surface 52 by angle 86). If desired, a combination of refraction (e.g., tilting surface 52) and diffraction may be used to redirect the light towards eye box 24.
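The grating-based alternative described above can be sized with the textbook grating equation; the wavelength and steering angle below are hypothetical:

```python
import math

def grating_pitch_um(wavelength_nm: float, theta_in_deg: float,
                     theta_out_deg: float, order: int = 1) -> float:
    """Grating pitch that steers light from theta_in to theta_out in
    the given diffraction order, from the grating equation:
    pitch * (sin(theta_out) - sin(theta_in)) = order * wavelength."""
    d_sin = (math.sin(math.radians(theta_out_deg)) -
             math.sin(math.radians(theta_in_deg)))
    return (order * wavelength_nm * 1e-3) / d_sin  # nm -> um

# Steering 550 nm world light by ~0.34 degrees (the prismatic
# deviation estimated above) calls for a coarse pitch near 93 um.
print(round(grating_pitch_um(550.0, 0.0, 0.34), 1))
```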
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of U.S. Provisional Patent Application No. 63/433,069, filed Dec. 16, 2022, which is hereby incorporated by reference herein in its entirety.