The present disclosure relates to visual display devices and related components, modules, and methods.
Visual displays provide information to one or more viewers, including still images, video, data, etc. Visual displays have applications in diverse fields including entertainment, education, engineering, science, professional training, and advertising, to name just a few examples. Some visual displays, such as TV sets, display images to several users, and some visual display systems, such as near-eye displays (NEDs), are intended for individual users.
An artificial reality system generally includes an NED (e.g., a headset or a pair of glasses) configured to present content to a user. The near-eye display may display virtual objects or combine images of real objects with virtual objects, as in virtual reality (VR), augmented reality (AR), or mixed reality (MR) applications. For example, in an AR system, a user may view images of virtual objects (e.g., computer-generated images (CGIs)) superimposed on the surrounding environment seen through a “combiner” component. The combiner of a wearable display is typically transparent to external light but includes light-routing optics to direct the display light into the user's field of view.
Because a head-mounted display (HMD) or NED is usually worn on the head of a user, a large, bulky, unbalanced, and/or heavy display device with a heavy battery would be cumbersome and uncomfortable for the user to wear. Consequently, head-mounted display devices can benefit from a compact and efficient configuration, including efficient light sources and illuminators providing illumination of a display panel, high-throughput combiner components, ocular lenses, and other optical elements in the image forming train.
Exemplary embodiments will now be described in conjunction with the drawings, which are not to scale, in which like elements are indicated with like reference numerals, and in which:
While the present teachings are described in conjunction with various embodiments and examples, it is not intended that the present teachings be limited to such embodiments. On the contrary, the present teachings encompass various alternatives and equivalents, as will be appreciated by those of skill in the art. All statements herein reciting principles, aspects, and embodiments of this disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
As used herein, the terms “first”, “second”, and so forth are not intended to imply sequential ordering, but rather are intended to distinguish one element from another, unless explicitly stated. Similarly, sequential ordering of method steps does not imply a sequential order of their execution, unless explicitly stated.
The term “eyebox” as used herein refers to a viewing area, i.e. a geometrical region in which a user's eye may be located to observe a good-quality image produced by the NED; for the embodiments described herein, the eyebox is typically located near an output region of the lightguide. In the context of this specification, the terms “viewing area” and “eyebox” are used interchangeably. The term “diffraction efficiency” as used herein refers to the performance of a diffractive optical element, e.g. a diffraction grating, in terms of its power throughput. In particular, the diffraction efficiency may be a measure of the optical power diffracted into a given direction relative to the power incident onto the diffractive element. In the examples described herein, the diffraction efficiency is typically a measure of the optical power diffracted by the grating, or a segment thereof, into the first order of diffraction relative to the power incident onto the grating or the segment thereof. The term “output efficiency” as used herein refers to the fraction of the optical power of a light source of a display apparatus that is available to the user for viewing images. The terms “grating pitch” and “grating period” are used herein interchangeably.
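By way of illustration only, and with hypothetical symbols introduced solely for clarity, the diffraction efficiency and the output efficiency defined above may be written as power ratios:

```latex
% Illustrative definitions; P_1, P_inc, P_eyebox, and P_source are hypothetical symbols.
\[
\eta_{\mathrm{diff}} = \frac{P_{1}}{P_{\mathrm{inc}}},
\qquad
\eta_{\mathrm{out}} = \frac{P_{\mathrm{eyebox}}}{P_{\mathrm{source}}},
\]
```

where P_1 is the optical power diffracted into the first order of diffraction, P_inc is the power incident onto the grating or segment thereof, P_eyebox is the image-light power delivered to the eyebox, and P_source is the power emitted by the light source of the display apparatus.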
An aspect of the present disclosure relates to a display system comprising a pupil replicating lightguide configured to convey images to a viewing area or eyebox. The lightguide is configured to receive image light emitted by an image light source and to convey the image light to the eyebox for presenting to a user in an angular domain within a field-of-view (FOV) of the display. The term “field of view”, when used in relation to a display system, may refer to an angular range of light propagation supported by the system or visible to the user. A two-dimensional (2D) FOV may be defined by angular ranges in two orthogonal planes. For example, a 2D FOV of an NED device may be defined by two one-dimensional (1D) FOVs, which may be a vertical FOV, for example ±20° relative to a horizontal plane, and a horizontal FOV, for example ±30° relative to a vertical plane. With respect to a FOV of an NED, the “vertical” and “horizontal” planes or directions may be defined relative to the head of a standing person wearing the NED. Otherwise the terms “vertical” and “horizontal” may be used in the present disclosure with reference to two orthogonal planes of an optical system or device being described, without implying any particular relationship to the environment in which the optical system or device is used, or any particular orientation thereof to the environment.
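For instance, assuming a rectilinear (flat image plane) mapping of the FOV, an assumption made here for illustration only, the two example 1D FOVs above combine into a diagonal FOV of

```latex
% Diagonal half-angle for an example +/-30 deg by +/-20 deg FOV, rectilinear mapping assumed:
\[
\theta_{\mathrm{diag}} = \arctan\!\sqrt{\tan^{2}30^{\circ}+\tan^{2}20^{\circ}} \approx 34.3^{\circ},
\]
```

i.e. the example 60°×40° FOV corresponds to a diagonal FOV of approximately 69°.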
AR and VR displays may use pupil-replicating lightguides to carry images to an eyebox and/or to illuminate display panels that generate images to be displayed. A pupil-replicating lightguide may include grating structures for in-coupling a light beam into the lightguide, and/or for out-coupling portions of the light beam along a surface of the lightguide. In accordance with this disclosure, a grating structure of a pupil-replicating lightguide may include an in-coupling diffractive optical element with a spatially variable grating pitch or grating period, a spatially variable blazing angle, etc.
Embodiments described herein relate to a pupil-replicating lightguide operable to project an image into an eye pupil of a viewer (“user”) at a plurality of positions of a viewer's eye within an eyebox of the display system. Such pupil-replicating projection lightguides may include an input diffractive optical element (DOE) with a spatially variable pitch (SVP), which may be referred to herein as an SVP-DOE. Typically, e.g. for an NED or a heads-up display, having a large eyebox is advantageous, as it allows accommodating users with different interpupillary distances and generally relaxes many requirements on the display and on the positioning of the user's head relative to the display. However, supporting a large eyebox in a conventional lightguide-based NED having uniform-pitch in-coupling and out-coupling gratings may require a large-area out-coupler, e.g. a large-area out-coupling (“output”) grating, with only a small portion of the out-coupled light actually reaching the user's eye pupil for any specific eye position within the eyebox. Projecting or focusing the image light into the eye pupil of the user at a plurality of positions may enable increasing the lightguide throughput as compared to a traditional pupil-replicating lightguide, thus making the image appear brighter to the viewer.
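The potential efficiency benefit may be appreciated from a simple geometrical estimate; the pupil and eyebox dimensions below are hypothetical and used for illustration only. When a conventional out-coupler fills the eyebox approximately uniformly, the fraction of the out-coupled light that enters the eye pupil scales roughly as the ratio of the pupil area to the eyebox area:

```latex
% Illustrative estimate with assumed dimensions: 4 mm eye pupil, 12 mm x 10 mm eyebox.
\[
\frac{A_{\mathrm{pupil}}}{A_{\mathrm{eyebox}}}
  \approx \frac{\pi\,(2\,\mathrm{mm})^{2}}{12\,\mathrm{mm}\times 10\,\mathrm{mm}}
  \approx \frac{12.6\,\mathrm{mm}^{2}}{120\,\mathrm{mm}^{2}} \approx 0.1,
\]
```

i.e. on the order of 10%, which illustrates the loss that projecting the image light toward a plurality of eye pupil positions may help to reduce.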
Accordingly, an aspect of the present disclosure provides a display apparatus comprising a lightguide for relaying image light to an eyebox, the lightguide comprising a substrate, an in-coupler, and an out-coupler integral with the substrate, the in-coupler comprising a diffractive optical element (DOE) having a spatially variable pitch (SVP). In some implementations, the DOE may comprise a holographic optical element (HOE). In some implementations, the DOE may be configured to have a positive optical power. In some implementations, the DOE may be configured to function as an off-axis optical lens.
An aspect of the present disclosure provides a lightguide for a display apparatus, the lightguide comprising: a substrate for relaying image light to an eyebox; an input diffractive optical element (DOE) configured to couple the image light into the substrate, the input DOE having a spatially variable pitch to provide the DOE with a non-zero optical power; and a grating out-coupler supported by the substrate for out-coupling portions of the image light from the substrate toward the eyebox. The substrate may comprise two opposing surfaces for guiding the image light within the substrate by total internal reflection (TIR) from the surfaces. In some implementations, the input DOE may be disposed at one of the two opposing surfaces, and may be configured so as not to diffract rays of in-coupled image light that are incident thereon after a TIR at the other one of the two opposing surfaces.
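As a numerical illustration, for an assumed substrate refractive index of n = 1.5, a typical value for optical glass or plastic and not a limitation of the present disclosure, the critical angle of TIR at a substrate-air interface is

```latex
\[
\theta_{c} = \arcsin\!\left(\frac{1}{n}\right) = \arcsin\!\left(\frac{1}{1.5}\right) \approx 41.8^{\circ},
\]
```

so that in-coupled rays impinging on the opposing surfaces at angles of incidence greater than approximately 41.8° remain guided within the substrate.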
In some implementations, the input DOE comprises a holographic optical element (HOE) having a positive optical power. In these or other implementations, the HOE may be configured to operate as a focusing lens. In any of these implementations, the HOE may be configured to operate as an off-axis lens. In any of the above implementations, the HOE may be configured to transmit therethrough, substantially without diffraction, light incident thereon at angles outside an angular acceptance range of the input DOE. The angular acceptance range of the input DOE may be outside of an angular range of TIR at the opposing surfaces of the substrate.
In any of the above implementations, the grating out-coupler may comprise a first optical diffraction grating (ODG). In at least some of such implementations, the input DOE and the first ODG may be disposed along the two opposing surfaces with an overlap. In at least some of such implementations, the first ODG may have a constant grating pitch. In at least some of such implementations, the grating out-coupler may comprise a second ODG, with the first and second ODGs having opposite slant angles.
In any of the above implementations of the lightguide, the input DOE may have a positive optical power.
A further aspect of the present disclosure provides a display apparatus comprising: a first lightguide comprising a substrate for relaying image light to a viewing area; an input diffractive optical element (DOE) configured to couple the image light into the lightguide, the input DOE having a spatially variable pitch to provide the DOE with a positive optical power; and a grating out-coupler supported by the substrate for out-coupling the image light from the substrate toward the viewing area.
In some implementations, the display apparatus may comprise a curved shell substrate of an optically transparent material, the curved shell substrate comprising the first lightguide including the input DOE and the grating out-coupler. In some of such implementations, the first lightguide may be disposed in a cavity within the curved shell substrate with gaps between opposing surfaces of the substrate and the material of the shell. In some of such implementations, the display apparatus may comprise a second lightguide disposed within the curved shell substrate and comprising a substrate, an input DOE, and a diffractive out-coupler, wherein the substrates of the first and second lightguides are at an angle to each other.
In any of the above implementations, the display apparatus may comprise an image projector for directing the image light toward the input DOE, the image projector disposed at a top portion of the substrate when the display apparatus is in use by a standing person.
In any of the above implementations of the display apparatus, the grating out-coupler may comprise a diffraction grating having a substantially constant grating pitch.
A further aspect of the present disclosure provides a lightguide for a display apparatus, the lightguide comprising: a substrate for relaying image light to an eyebox; an input diffractive optical element (DOE) configured to couple the image light into the substrate, the input DOE having a spatially variable pitch; and a diffractive out-coupler supported by the substrate for out-coupling portions of the image light from the substrate toward the eyebox, the diffractive out-coupler comprising an optical diffraction grating having a substantially constant grating pitch. In any of the above implementations of the lightguide, the DOE may be configured to have a positive optical power.
The substrate 125, which may be e.g. a slab of a material that is transparent to visible light and may include one or more layers, has two opposing surfaces 121 and 122, e.g. the main outer surfaces of the substrate. The substrate 125 is configured for guiding the image light 150 within the substrate in a zig-zag fashion by reflections from the surfaces 121 and 122. In the illustrated embodiment the surfaces 121 and 122 are parallel to each other and to the (x,y) plane of a Cartesian coordinate system (x,y,z) 55. The z-axis direction is generally orthogonal to the surfaces 121 and 122.
The DOE 110 is configured to couple rays of the image light 150 spanning the acceptance angle 155 into the substrate 125, so that the in-coupled light impinges on the surfaces 121, 122 at angles of incidence exceeding a critical angle of total internal reflection (TIR) at the surfaces 121, 122. The DOE 110 has a spatially variable grating pitch, i.e. a grating pitch that varies along the surface 121 or 122, e.g. along the y-axis of the coordinate system 55. In some embodiments, the pitch of the DOE 110 may vary so that the DOE 110 operates as an optical lens having a positive, i.e. focusing, optical power. In some embodiments, the DOE 110 may be configured to form a real or virtual image at some distance from the DOE 110. In some embodiments, the DOE 110 may be configured to form an image at or near an eyebox 140. In some embodiments, the pitch of the DOE 110 may spatially vary so that the edge rays 151, 153 of the image light 150, upon in-coupling into the substrate 125, propagate along converging directions. The ODG 130, which may have a constant, i.e. spatially uniform, grating pitch, is configured to diffract laterally offset portions of the image light 150 incident thereon out of the substrate 125 toward the eyebox 140. In an example embodiment, the grating pitch of the ODG 130 may be selected e.g. to diffract a central ray 152 of the image light 150 in a direction perpendicular to the surfaces 121, 122. The DOE 110 may cooperate with the ODG 130 so that after a first diffraction from the ODG 130, the edge rays 151, 153 of the image light 150 converge at a “focus” location 141, defining a FOV 166 of the display apparatus 100.
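The in-coupling and out-coupling described above may be illustrated with the first-order grating equation. The following sketch is provided for illustration only; the wavelength, refractive index, and pitch values are assumed and do not correspond to any particular embodiment.

```python
# Illustrative sketch only: first-order grating-equation check that an assumed local
# in-coupler pitch sends a normally incident ray into TIR inside the substrate, and
# that a uniform out-coupler pitch returns that ray toward the eyebox near normal
# incidence. All numerical values are hypothetical.
import math

WAVELENGTH_NM = 520.0   # assumed green-channel wavelength
N_SUBSTRATE = 1.5       # assumed substrate refractive index
THETA_C = math.degrees(math.asin(1.0 / N_SUBSTRATE))  # TIR critical angle, ~41.8 deg

def diffracted_angle_deg(theta_in_deg, pitch_nm, n_in=1.0, n_out=N_SUBSTRATE, m=1):
    """Grating equation: n_out*sin(theta_m) = n_in*sin(theta_in) + m*lambda/pitch."""
    s = (n_in * math.sin(math.radians(theta_in_deg)) + m * WAVELENGTH_NM / pitch_nm) / n_out
    if abs(s) > 1.0:
        return None  # evanescent order: no propagating diffracted ray
    return math.degrees(math.asin(s))

# In-coupling: an assumed local pitch of 380 nm diffracts a normally incident ray
# to ~66 deg inside the substrate, above the ~41.8 deg critical angle.
theta_guided = diffracted_angle_deg(0.0, pitch_nm=380.0)
print(f"guided angle {theta_guided:.1f} deg vs critical angle {THETA_C:.1f} deg")

# Out-coupling: with a uniform out-coupler pitch equal to the local in-coupler pitch,
# the -1st order of the guided central ray exits into air near 0 deg, i.e. normally.
sin_exit = N_SUBSTRATE * math.sin(math.radians(theta_guided)) - WAVELENGTH_NM / 380.0
print(f"sine of exit angle in air: {sin_exit:.3f}")
```

With these assumed values, the central ray is guided at approximately 66°, above the critical angle, and exits toward the eyebox at approximately normal incidence, consistent with the example embodiment described above.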
In some embodiments, the display apparatus 100 may be configured to project at least a fraction of the image light 150 through a pupil 222 of a user's eye 220, as illustrated in
Referring back to
When observed in the z-axis direction normal to the surfaces 121, 122, the DOE 110 may at least partially overlap with both the ODG 130 (as indicated at 132) and the eyebox 140, allowing for a more compact and efficient configuration. The DOE 110 may be configured to operate as an off-axis optical lens, so that the image projector 103 may be positioned outside of the FOV 166 for any viewing location within the eyebox 140. The term “off-axis” may be used here to refer to an optical lens that is configured for off-axis incidence, i.e. to redirect, focus, defocus, collimate, etc. off-axis light beams.
The DOE 110 may further be configured to have a suitably narrow angular bandwidth of diffraction, so that rays of the in-coupled image light 150 which impinge upon subsequent locations of the DOE 110 at corresponding angles of TIR, i.e. outside the angular acceptance range 155, are reflected from the substrate surface 121 substantially without being affected by the DOE 110.
In some embodiments, the DOE 110 may have a spatially varying angular bandwidth of diffraction, i.e. the range of angles of incidence outside of which the diffraction efficiency of the DOE 110 is suitably small. In some embodiments, the DOE 110 may be configured so that a ray of the image light that is in-coupled by the DOE 110 at a first location thereon, propagates within the substrate at a TIR angle that is outside of the local diffraction bandwidth at the locations of subsequent incidences. The local diffraction bandwidth may depend on at least one of the local grating pitch or a local tilt angle of diffractive fringes or grooves.
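Stated in illustrative terms, with hypothetical notation introduced here only for clarity, if the local angular acceptance of the DOE 110, expressed as angles of incidence inside the substrate, is bounded by some angle θ_a smaller than the TIR critical angle θ_c, the condition for the in-coupled light to be substantially unaffected on subsequent bounces may be written as

```latex
% Illustrative condition; \eta_{110}(\theta) and \theta_{a} are hypothetical notation.
\[
\eta_{110}(\theta) \ll 1 \quad \text{for} \quad \theta \ge \theta_{c} > \theta_{a},
\]
```

where η_110(θ) denotes the local diffraction efficiency of the DOE 110 at an internal angle of incidence θ.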
In the case of the conventional lightguide of
The input portions of the substrates 325 and 326 are illustrated in
In some embodiments, the DOE 110 of
Turning briefly back to
In some embodiments, the image projector 103 may be a pixelated image projector 410 that includes a display panel 418, as schematically illustrated in
One complication of using a variable-pitch DOE as an in-coupler of a lightguide is that the replicated output “focal points” of the image light, e.g. 141-143, may be at different distances from the lightguide, as schematically illustrated in
Referring to
In some embodiments, the display apparatus 700 may be configured to operate as an augmented reality (AR) display, e.g. an AR NED, to combine for a viewer the outside scenery carried by ambient light 766 with images carried by image light 750. In such embodiments, a second portion 722 of the curved shell substrate 720, which is distal from the eyebox 740 and faces the outside scenery when in use, may also be shaped to have an optical power, e.g. to at least partially compensate the optical power of the first portion 721. In some embodiments, the two portions 721, 722 of the curved shell substrate 720 may cooperate to have a target optical power, e.g. to function as a prescription lens, or to have a zero optical power.
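As an illustration, treating the two portions 721, 722 as thin optical elements in close proximity, an approximation made here only for clarity, their optical powers add approximately:

```latex
% Thin-element approximation; P_{721}, P_{722}, and P_{rx} are hypothetical symbols.
\[
P_{\mathrm{total}} \approx P_{721} + P_{722},
\]
```

so that a zero net see-through power corresponds to P_722 ≈ −P_721, while a target prescription power P_rx corresponds to P_722 ≈ P_rx − P_721.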
In some embodiments, two or more slab lightguides may be disposed within a same curved optically transparent shell substrate, with their orientation generally following the curvature of the shell substrate. Referring to
Referring to
The purpose of the eye-tracking cameras 1004 is to determine the position and/or orientation of both eyes of the user. The eyebox illuminators 1006 illuminate the eyes at the corresponding eyeboxes 1012, enabling the eye-tracking cameras 1004 to obtain images of the eyes and providing reference reflections, i.e. glints. The glints may function as reference points in the captured eye images, facilitating determination of the eye gaze direction from the position of the eye pupil images relative to the glint images. To avoid distracting the user with the light of the eyebox illuminators 1006, the latter may be made to emit light invisible to the user. For example, infrared light may be used to illuminate the eyeboxes 1012.
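By way of a simplified illustration of the glint-based gaze determination described above, and not as a description of any particular eye tracking algorithm of the present disclosure, a gaze direction estimate may be derived from the offset of the pupil image relative to a glint image, for example as sketched below; the function name and calibration gains are hypothetical.

```python
# Illustrative sketch only: mapping the pupil-minus-glint image offset to an
# approximate gaze angle, in the spirit of pupil-center / corneal-reflection
# eye tracking. The calibration gains are hypothetical placeholders that would
# normally be obtained from a per-user calibration procedure.
import numpy as np

def gaze_offset_deg(pupil_px, glint_px, gain_deg_per_px=(0.12, 0.12)):
    """Convert the pupil-minus-glint vector (pixels) into approximate gaze angles (degrees)."""
    offset_px = np.asarray(pupil_px, dtype=float) - np.asarray(glint_px, dtype=float)
    return offset_px * np.asarray(gain_deg_per_px)

# Example: pupil image offset by (18, 6) pixels from the glint -> roughly (2.2, 0.7) deg
print(gaze_offset_deg(pupil_px=(340, 212), glint_px=(322, 206)))
```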
Embodiments of the present disclosure may include, or be implemented in conjunction with, an artificial reality system. An artificial reality system adjusts sensory information about the outside world obtained through the senses, such as visual information, audio, touch (somatosensation) information, acceleration, balance, etc., in some manner before presentation to a user. By way of non-limiting examples, artificial reality may include virtual reality (VR), augmented reality (AR), mixed reality (MR), hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include entirely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, somatic or haptic feedback, or some combination thereof. Any of this content may be presented in a single channel or in multiple channels, such as in a stereo video that produces a three-dimensional effect for the viewer. Furthermore, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, for example, create content in artificial reality and/or are otherwise used in (e.g., perform activities in) artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a wearable display such as an HMD connected to a host computer system, a standalone HMD, a near-eye display having a form factor of eyeglasses, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
Turning to
In some embodiments, the front body 1102 includes locators 1108 and an inertial measurement unit (IMU) 1110 for tracking acceleration of the HMD 1100, and position sensors 1112 for tracking position of the HMD 1100. The IMU 1110 is an electronic device that generates data indicating a position of the HMD 1100 based on measurement signals received from one or more of position sensors 1112, which generate one or more measurement signals in response to motion of the HMD 1100. Examples of position sensors 1112 include: one or more accelerometers, one or more gyroscopes, one or more magnetometers, another suitable type of sensor that detects motion, a type of sensor used for error correction of the IMU 1110, or some combination thereof. The position sensors 1112 may be located external to the IMU 1110, internal to the IMU 1110, or some combination thereof.
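For illustration only, the sketch below shows how acceleration measurement signals might be integrated into a coarse position estimate; a practical tracker for the HMD 1100 would additionally use gyroscope data, gravity compensation, and external corrections such as those obtained from the locators 1108, and all values shown are hypothetical.

```python
# Illustrative sketch only: naive dead-reckoning that twice-integrates accelerometer
# samples into a position estimate. Real IMU-based tracking adds orientation from
# gyroscopes, gravity removal, and drift correction; all values are hypothetical.
import numpy as np

def integrate_position(accel_samples, dt, v0=None, p0=None):
    """Integrate acceleration samples (m/s^2) at a fixed interval dt (s) into a position (m)."""
    v = np.zeros(3) if v0 is None else np.asarray(v0, dtype=float).copy()
    p = np.zeros(3) if p0 is None else np.asarray(p0, dtype=float).copy()
    for a in accel_samples:
        v = v + np.asarray(a, dtype=float) * dt  # velocity update
        p = p + v * dt                           # position update
    return p

# 100 samples of a constant 0.5 m/s^2 forward acceleration over 1 s -> ~0.25 m travelled
samples = [(0.5, 0.0, 0.0)] * 100
print(integrate_position(samples, dt=0.01))
```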
The locators 1108 are tracked by an external imaging device of a virtual reality system, such that the virtual reality system can track the location and orientation of the entire HMD 1100. Information generated by the IMU 1110 and the position sensors 1112 may be compared with the position and orientation obtained by tracking the locators 1108, for improved tracking accuracy of position and orientation of the HMD 1100. Accurate position and orientation are important for presenting appropriate virtual scenery to the user as the latter moves and turns in 3D space.
The HMD 1100 may further include a depth camera assembly (DCA) 1111, which captures data describing depth information of a local area surrounding some or all of the HMD 1100. The depth information may be compared with the information from the IMU 1110, for better accuracy of determination of position and orientation of the HMD 1100 in 3D space.
The HMD 1100 may further include an eye tracking system 1114 for determining orientation and position of the user's eyes in real time. The obtained position and orientation of the eyes allows the HMD 1100 to determine the gaze direction of the user and to adjust the image generated by the display system 1180 accordingly. The determined gaze direction and vergence angle may be used to adjust the display system 1180 to reduce the vergence-accommodation conflict. The determined gaze direction and vergence may also be used for exit pupil steering of the display, as disclosed herein. Furthermore, the determined vergence and gaze angles may be used for interaction with the user, highlighting objects, bringing objects to the foreground, creating additional objects or pointers, etc. An audio system may also be provided, including e.g. a set of small speakers built into the front body 1102.
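As a simplified illustration of the vergence determination mentioned above, with hypothetical numbers, the vergence angle may be computed as the angle between the gaze direction vectors of the two eyes reported by an eye tracking system such as 1114:

```python
# Illustrative sketch only: vergence angle as the angle between the left-eye and
# right-eye gaze direction vectors. The eye separation and fixation distance used
# in the example are hypothetical.
import numpy as np

def vergence_angle_deg(gaze_left, gaze_right):
    """Angle in degrees between two gaze direction vectors (normalized internally)."""
    l = np.asarray(gaze_left, dtype=float)
    r = np.asarray(gaze_right, dtype=float)
    l = l / np.linalg.norm(l)
    r = r / np.linalg.norm(r)
    return float(np.degrees(np.arccos(np.clip(np.dot(l, r), -1.0, 1.0))))

# Eyes 64 mm apart fixating a point 0.5 m straight ahead -> vergence of ~7.3 deg
left_gaze = (0.032, 0.0, 0.5)    # from the left eye toward the fixation point (m)
right_gaze = (-0.032, 0.0, 0.5)  # from the right eye toward the fixation point (m)
print(vergence_angle_deg(left_gaze, right_gaze))
```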
Display embodiments described above are intended to be merely non-limiting illustrative examples. Many variations and modifications are possible. For instance, monochromatic or polychromatic image light may be used. For color images, the image light of different color channels may be spatially and/or temporally multiplexed. Two or more stacked lightguides may be used to guide different color channels. The grating pitch of the input and/or output couplers of the same lightguide may be tuned to accommodate different color channels in a time-multiplexed manner. More than one diffraction grating with parallel or non-parallel grating vectors may be used to couple light out of the lightguide. Some embodiments may utilize more than one input DOE and/or more than one ODG, e.g. to support a 2D FOV. When two or more diffraction gratings are used for the out-coupling, the gratings may be superimposed to form a 2D grating structure, e.g. at a same outer surface of the lightguide's substrate. In some embodiments, different in-coupling gratings or different out-coupling gratings may be disposed at the opposite outer surfaces of the substrate. Embodiments in which one or more input DOEs or one or more output gratings are disposed in the bulk of the substrate are also within the scope of the present disclosure.
The present disclosure is not to be limited in scope by the specific embodiments described herein. Indeed, other various embodiments and modifications, in addition to those described herein, will be apparent to those of ordinary skill in the art from the foregoing description and accompanying drawings. Thus, such other embodiments and modifications are intended to fall within the scope of the present disclosure. Further, although the present disclosure has been described herein in the context of a particular implementation in a particular environment for a particular purpose, those of ordinary skill in the art will recognize that its usefulness is not limited thereto and that the present disclosure may be beneficially implemented in any number of environments for any number of purposes. Accordingly, the claims set forth below should be construed in view of the full breadth and spirit of the present disclosure as described herein.