The present disclosure generally relates to optical display systems and devices, and in particular to waveguide displays and components therefor.
Head mounted displays (HMD), helmet mounted displays, near-eye displays (NED), and the like are being used increasingly for displaying virtual reality (VR) content, augmented reality (AR) content, mixed reality (MR) content, etc. Such displays are finding applications in diverse fields including entertainment, education, training and biomedical science, to name just a few examples. The displayed VR/AR/MR content can be three-dimensional (3D) to enhance the experience and to match virtual objects to real objects observed by the user. Eye position, gaze direction, and/or head orientation of the user may be tracked in real time, and the displayed imagery may be dynamically adjusted depending on the user's head orientation and gaze direction, to provide a better experience of immersion into a simulated or augmented environment.
Compact display devices are desired for head-mounted displays. Because a display of HMD or NED is usually worn on the head of a user, a large, bulky, unbalanced, and/or heavy display device would be cumbersome and may be uncomfortable for the user to wear.
Projector-based displays provide images in angular domain, which can be observed by a user's eye directly, without an intermediate screen or a display panel. An imaging waveguide may be used to carry the image in angular domain to the user's eye. The lack of a screen or a display panel in a projector display enables size and weight reduction of the display.
Embodiments disclosed herein will be described in greater detail with reference to the accompanying drawings which represent example embodiments thereof, in which like elements are indicated with like reference numerals, and wherein:
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular optical and electronic circuits, optical and electronic components, techniques, etc. in order to provide a thorough understanding of the present invention. However, it will be apparent to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known methods, devices, and circuits are omitted so as not to obscure the description of the example embodiments. All statements herein reciting principles, aspects, and embodiments, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
Note that as used herein, the terms “first”, “second”, and so forth are not intended to imply sequential ordering, but rather are intended to distinguish one element from another, unless explicitly stated. Similarly, sequential ordering of method or process steps does not imply a sequential order of their execution, unless explicitly stated.
Furthermore, the following abbreviations and acronyms may be used in the present document: HMD (Head Mounted Display); NED (Near Eye Display); VR (Virtual Reality); AR (Augmented Reality); MR (Mixed Reality); LED (Light Emitting Diode); FOV (Field of View); TIR (Total Internal Reflection). The terms “NED” and “HMD” may be used herein interchangeably.
Example embodiments may be described hereinbelow with reference to polychromatic light that is comprised of three distinct color channels. The color channel with the shortest wavelengths may be referred to as the blue (B) channel or color, and may represent the blue channel of an RGB color scheme. The color channel with the longest wavelengths may be referred to as the red (R) channel or color and may represent the red channel of the RGB color scheme. The color channel with wavelengths between the red and blue color channels may be referred to as the green (G) channel or color, and may represent the green channel of the RGB color scheme. The blue light or color channel may correspond to wavelengths of about 500 nanometers (nm) or shorter, the red light or color channel may correspond to wavelengths of about 600 nm or longer, and the green light or color channel may correspond to a wavelength range of 500 nm to 565 nm. It will be appreciated however that the embodiments described herein may be adapted for use with polychromatic light comprised of any combination of two or more, or preferably three or more color channels, which may represent non-overlapping portions of a relevant optical spectrum.
An aspect of the present disclosure relates to a display system comprising a waveguide and an image light source coupled thereto, wherein the waveguide is configured to receive image light emitted by the image light source and to convey the image light received in a field of view (FOV) of the waveguide to an eyebox for presenting to a user. The waveguide may be configured to prevent undesired ambient light from being directed into the eye of the user. The term “field of view” (FOV), when used in relation to a display system, may define an angular range of light propagation supported by the system or visible to the user. A two-dimensional (2D) FOV may be defined by angular ranges in two orthogonal planes. For example, a 2D FOV of a NED device may be defined by two one-dimensional (1D) FOVs, which may be a vertical FOV, for example +\−20° relative to a horizontal plane, and a horizontal FOV, for example +\−30° relative to the vertical plane. With respect to a FOV of a NED, the “vertical” and “horizontal” planes or directions may be defined relative to the head of a standing person wearing the NED. Otherwise the terms “vertical” and “horizontal” may be used in the present disclosure with reference to two orthogonal planes of an optical system or device being described, without implying any particular relationship to the environment in which the optical system or device is used, or any particular orientation thereof to the environment.
An aspect of the present disclosure relates to a waveguide for conveying image light from an image light source to an eyebox with a target FOV spanning an angular range Γ. The waveguide may comprise a substrate for propagating the image light therein by total internal reflection, an input coupler supported by the substrate and configured to couple the image light into the waveguide, and an output coupler supported by the substrate and configured to couple the image light out of the waveguide for propagating toward the eyebox. The output coupler may comprise a first output grating having a pitch p1 that does not exceed λ/(1+sin(Γ/2)),
where λ may be the shortest wavelength of visible light.
In some implementations the input coupler comprises an input grating having a pitch that does not exceed p1.
In some implementations p1 may be equal or smaller than λ/2.
In some implementations the substrate may have a refractive index of at least 2.3. In some implementations the substrate may have a refractive index of at least 2.4. In some implementations the substrate may have a refractive index of at least 2.5.
In some implementations the output coupler may further comprise a second output grating configured to cooperate with the first output grating to diffract the image light out of the waveguide, wherein the second output grating may have a pitch that does not exceed p1. In some implementations the first output grating and the second output grating cooperate for diffracting the image light out of the waveguide at an output angle equal to an angle of incidence thereof upon the waveguide. In some implementations the first and second output gratings may be disposed at opposite faces of the waveguide.
In some implementations the waveguide may be configured for conveying to the eyebox at least one of a red color (R) channel and a green color (G) channel, and the pitch p may be equal or smaller than λ/(1+sin(Γ/2)),
where λ may be a wavelength of blue light. In some implementations the wavelength λ may be smaller than 500 nm.
In some implementations the pitch p may be equal or less than 300 nm. In some implementations the pitch p may be equal or less than 280 nm.
In some implementations wherein the eyebox extends over a length 2a in a first direction, and wherein the first output grating extends over a length 2b in the first direction and is disposed at a distance d from the eyebox, the pitch p may satisfy the condition p≤λ/(1+sin(α)), wherein α=atan[(b+a)/d].
An aspect of the present disclosure relates to a near-eye display (NED) device comprising: a light source configured to emit image light comprising a plurality of color channels, and a first waveguide optically coupled to the light source and configured to convey a portion of the image light from the light source to an eyebox within a target field of view (FOV) spanning an angular range Γ. The first waveguide may comprise an input coupler for receiving the portion of the image light, and an output coupler for coupling said portion out of the first waveguide toward the eyebox. The output coupler may comprise a first output grating having a pitch p1 that does not exceed λ/(1+sin(Γ/2)),
where λ is a wavelength of a shortest-wavelength color channel of the image light.
In some implementations of the NED device, the first waveguide may comprise dielectric material with an index of refraction of at least 2.3. In some implementations of the NED device, the first waveguide may comprise dielectric material with an index of refraction of at least 2.4. In some implementations of the NED device, the waveguide may comprise dielectric material with an index of refraction of at least 2.5.
In some implementations of the NED device, the output coupler may further comprise a second output grating configured to cooperate with the first output grating to diffract the image light out of the first waveguide at an output angle equal to an incidence angle of the image light upon the input coupler, wherein the second output grating has a pitch not exceeding p1.
In some implementations of the NED device, λ is a wavelength of blue light, and the first waveguide may be configured to convey to the eyebox at least one of a red color channel of the image light or a green color channel of the image light.
In some implementations of the NED device, λ≤500 nm, and the first waveguide may be configured to convey to the eyebox a red color channel of the image light with wavelengths equal or longer than 600 nm.
In some implementations the NED device may comprise a waveguide stack including the first waveguide, wherein each waveguide of the waveguide stack comprises an output grating with a pitch of at most p1.
In some implementations the image light may comprise RGB light comprising a red color channel, a green color channel, and a blue color channel, and the first waveguide is configured to convey to the eyebox each of the red, green, and blue color channels.
An aspect of the disclosure relates to a waveguide for conveying image light comprising a plurality of color channels from an image light source to an eyebox, the waveguide comprising: a substrate for propagating the image light therein by total internal reflection; an input coupler supported by the substrate for receiving the image light; and, an output coupler supported by the substrate for coupling the image light out of the waveguide toward the eyebox. The output coupler may comprise a first output grating having a pitch p that does not exceed 300 nm. In some implementations the substrate may have an index of refraction of at least 2.3. In some implementations the substrate may have an index of refraction of at least 2.4. In some implementations the substrate may have an index of refraction of at least 2.5. In some implementations the waveguide may be configured for conveying to the eyebox at least one of a red color (R) channel of the image light and a green color (G) channel of the image light.
An aspect of the present disclosure relates to a waveguide for conveying image light from an image light source to an eyebox with a target field of view (FOV) spanning an angular range Γ. The waveguide may comprise a substrate for propagating the image light therein by total internal reflection, an input coupler supported by the substrate and configured to couple the image light into the waveguide, and an output coupler supported by the substrate and configured to couple the image light out of the waveguide for propagating toward the eyebox. The output coupler may comprise a first output grating having a pitch p1 that does not exceed λ/(1+sin(Γ/2)),
where λ is a wavelength of blue light. In some implementations the pitch p1 does not exceed λ/2.
In some implementations λ may be 500 nm. In some implementations λ may be 450 nm.
Example embodiments of the present disclosure will now be described with reference to a waveguide display. Generally a waveguide display may include an image light source such as an electronic display assembly, a controller, and an optical waveguide configured to transmit image light from the electronic display assembly to an exit pupil for presenting images to a user. The image light source may also be referred to herein as a display projector, an image projector, or simply as a projector. Example display systems that may incorporate a waveguide display, and wherein features and approaches disclosed here may be used, include, but are not limited to, a near-eye display (NED), a head-up display (HUD), a head-down display, and the like.
With reference to
In some embodiments the image light source 110 may include a pixelated electronic display 114 that may be optically followed by an optics block 116. The electronic display 114 may be any suitable electronic display configured to display images, such as for example but not limited to a liquid crystal display (LCD), an organic light emitting display (OLED), an inorganic light emitting display (ILED), an active-matrix organic light-emitting diode (AMOLED) display, or a transparent organic light emitting diode (TOLED) display. In some embodiments the electronic display 114 may be in the form of a linear array of light sources, such as light-emitting diodes (LEDs), laser diodes (LDs), or the like, with each light source configured to emit polychromatic light. In some embodiments it may include a two-dimensional (2D) pixel array, with each pixel configured to emit polychromatic light.
The optics block 116 may include one or more optical components configured to suitably condition the image light emitted by the electronic display 114. This may include, without limitation, expanding, collimating, correcting for aberrations, and/or adjusting the direction of propagation of the image light emitted by the electronic display 114, or any other suitable conditioning as may be desired for a particular system and electronic display. The one or more optical components in the optics block 116 may include, without limitation, one or more lenses, mirrors, apertures, gratings, or a combination thereof. In some embodiments the optics block 116 may include one or more adjustable elements operable to scan the beam of light emitted by the electronic display 114 with respect to its propagation angle.
The waveguide assembly 120 may be in the form of, or include, a waveguide 123 comprising an in-coupler 130 and an out-coupler 140. In some embodiments a waveguide stack composed of two or more waveguides stacked one over another may be used in place of the waveguide 123. The input coupler 130 may be disposed at a location where it can receive the image light 111 from the image light source 110. The input coupler 130, which may also be referred to herein as the in-coupler 130, is configured to couple the image light 111 into the waveguide 123, where it propagates toward the output coupler 140. The output coupler 140, which may also be referred to herein as the out-coupler, may be offset from the input coupler 130 and configured to de-couple the image light from the waveguide 123 for propagating in a desired direction, such as for example toward a user's eye 166. The out-coupler 140 may be greater in size than the in-coupler 130 to expand the image beam as it leaves the waveguide, and to support a larger exit pupil than that of the image light source 110. In some embodiments the waveguide assembly 120 may be partially transparent to outside light, and may be used in AR applications. The waveguide 123 may be configured to convey a 2D FOV from the input coupler 130 to the output coupler 140, and ultimately to the eye 166 of the user. Here and in the following description a Cartesian coordinate system (x,y,z) is used for convenience, in which the (x,y) plane is parallel to the main faces of the waveguide assembly 120 through which the assembly receives and/or outputs the image light, and the z-axis is orthogonal thereto. The 2D FOV of waveguide 123 may be defined by a 1D FOV in the (y,z) plane and a 1D FOV in the (x,z) plane, which may also be referred to as the vertical and horizontal FOVs, respectively.
Referring now to
Waveguide 210 may be a slab waveguide formed of a substrate 205, which may be for example in the form of a thin plate of an optical material that is transparent to visible light, such as glass or suitable plastic or polymer as non-limiting examples. Opposing main faces 211, 212 of waveguide 210, through which image light may enter or leave the waveguide, may be nominally parallel to each other. The refractive index n of the substrate material may be greater than that of surrounding media, and may be for example in the range of 1.4 to 2.6. In some embodiments, high-index materials having an index of refraction equal or greater than about 2.3 may be used for the substrate 205. In some embodiments these materials may have an index of refraction n greater than about 2.4. In some embodiments these materials may have an index of refraction n greater than about 2.5. Non-limiting examples of such materials are lithium niobate (LiNbO3), titanium dioxide (TiO2), gallium nitride (GaN), aluminum nitride (AlN), silicon carbide (SiC), CVD diamond, and zinc sulfide (ZnS).
An in-coupler 230 may be provided in or upon the waveguide 210 and may be in the form of one or more diffraction gratings. An out-coupler 240, which may also be in the form of one or more diffraction gratings, is laterally offset from the in-coupler 230, for example along the y-axis. In the illustrated embodiment the out-coupler 240 is located at the same face 211 of the waveguide 210 as the in-coupler 230, but in other embodiments it may be located at the opposite face 212 of the waveguide. Some embodiments may have two input gratings that may be disposed at opposing faces 211, 212 of the waveguide, and/or two output gratings that may be disposed at opposing faces 211, 212 of the waveguide. The gratings embodying couplers 230, 240 may be any suitable diffraction gratings, including volume and surface-relief gratings, such as for example blazed gratings. The gratings may also be volume holographic gratings. In some embodiments they may be formed in the material of the waveguide itself. In some embodiments they may be fabricated in a different material or materials that may be affixed to a face or faces of the waveguide at desired locations. In the example embodiment illustrated in
The in-coupler 230 may be configured to provide the waveguide 210 with an input FOV 234, which may also be referred to herein as the acceptance angle. The input FOV 234, which depends on the wavelength, defines a range of angles of incidence α for which the light incident upon the in-coupler 230 is coupled into the waveguide and propagates toward the out-coupler 240. In the context of this specification, "coupled into the waveguide" means coupled into the guided modes of the waveguide or modes that have suitably low radiation loss, so that light coupled into the waveguide becomes trapped therein by total internal reflection (TIR), and propagates within the waveguide with suitably low attenuation until it is engaged by an out-coupler. Thus waveguide 210 may trap light of a particular wavelength λ by means of TIR, and guide the trapped light toward the out-coupler 240, provided that the angle of incidence of the light upon the in-coupler 230 from the outside of the waveguide is within the input FOV 234 of the waveguide 210. The input FOV 234 of the waveguide is determined at least in part by a pitch p of the in-coupler grating 230 and by the refractive index n of the waveguide. For a given grating pitch p, the first-order diffraction angle β of the light incident upon the grating 230 from the air at an angle of incidence α in the (y, z) plane may be found from a diffraction equation (1):
n·sin(β)+sin(α)=λ/p. (1)
Here the angle of incidence α and the diffraction angle β are positive if the corresponding rays are on the same side of the normal 207 to the opposing faces 211, 212 of the waveguide, and are negative otherwise. Equation (1) may be easily modified for embodiments in which the waveguide 210 is surrounded by cladding material with refractive index nc>1. Equation (1) holds for rays of image light with a plane of incidence normal to the grooves of the in-coupler grating, i.e. when the grating vector of the in-coupler grating lies within the plane of incidence of the image light.
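As a numerical illustration of the diffraction equation (1), the following sketch (the helper name and values are illustrative, not part of the disclosure) solves for the first-order diffraction angle β given the angle of incidence α:

```python
import math

def diffraction_angle(alpha_deg, lam_over_p, n):
    """First-order diffraction angle beta inside the waveguide, solved from
    equation (1): n*sin(beta) + sin(alpha) = lambda/p. Returns None when the
    diffracted order is evanescent (|sin(beta)| would exceed 1)."""
    s = (lam_over_p - math.sin(math.radians(alpha_deg))) / n
    if abs(s) > 1.0:
        return None
    return math.degrees(math.asin(s))

# A ray incident at alpha = asin(0.3) on a grating with lambda/p = 1.3 in a
# substrate with n = 1.5 is diffracted exactly at the critical TIR angle:
beta = diffraction_angle(math.degrees(math.asin(0.3)), 1.3, 1.5)
beta_c = math.degrees(math.asin(1 / 1.5))  # critical angle asin(1/n)
```

This ray marks one edge of the input FOV, since a slightly larger λ/p or smaller α would diffract below the critical angle and escape the waveguide.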
The TIR condition for the diffracted light within the waveguide, referred to hereinafter as the in-coupled light, is defined by the TIR equation (2):
n·sin(β)≥1, (2)
where the equality corresponds to a critical TIR angle βc=asin(1/n). The input FOV 234 of the waveguide spans between a first FOV angle of incidence α1 and a second FOV angle of incidence α2, which may be referred to herein as the FOV edge angles. The first FOV angle of incidence α1, corresponding to the right-most incident ray 111b, may be found from equation (3): sin(α1)=λ/p−n·sin(βmax). (3)
The second FOV angle of incidence α2, corresponding to the left-most incident ray 111a, may be found from equation (4): sin(α2)=λ/p−1. (4)
The width w=|α1−α2| of the input 1D FOV of the waveguide 210 at a particular wavelength can be estimated from equations (3) and (4). Generally the input FOV of a waveguide increases as the refractive index of the waveguide increases relative to that of the surrounding media. By way of example, for a substrate of index n surrounded by air and for βmax=75°, λ/p=1.3, the width w of the input FOV of the waveguide is about 26° for n=1.5, about 43° for n=1.8, and about 107° for n=2.4.
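The worked example above can be reproduced with a short sketch of equations (3) and (4) (function name is illustrative; sin(α1) is clamped at −1 to account for grazing incidence at high index):

```python
import math

def input_fov_width(n, lam_over_p, beta_max_deg=75.0):
    """Width w = |alpha1 - alpha2| of the 1D input FOV, per equations (3)
    and (4): sin(alpha1) = lambda/p - n*sin(beta_max), clamped at -1 for
    grazing incidence, and sin(alpha2) = lambda/p - 1."""
    s1 = max(-1.0, lam_over_p - n * math.sin(math.radians(beta_max_deg)))
    s2 = lam_over_p - 1.0
    return abs(math.degrees(math.asin(s1)) - math.degrees(math.asin(s2)))

# Reproduces the worked example: ~26, ~43, and ~107 degrees.
widths = [round(input_fov_width(n, 1.3)) for n in (1.5, 1.8, 2.4)]
```

For n=2.4 the clamp is active, i.e. one FOV edge is already at grazing incidence, which is why the width jumps so sharply at high index.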
As can be seen from equations (3) and (4), the input FOV 234 of waveguide 210 is a function of the wavelength λ of input light, so that the input FOV 234 shifts its position in the angle space as the wavelength changes; for example, it shifts towards the out-coupler 240 as the wavelength increases. Thus it can be challenging to provide a sufficiently wide FOV for polychromatic image light.
Referring to
In some embodiments the gratings embodying the in-coupler 230 and the out-coupler 240 may be configured so that the vector sum of their grating vectors kg is equal to substantially zero:
|Σkg|≅0. (5)
Here the summation in the left hand side (LHS) of equation (5) is performed over grating vectors kg of all gratings that diffract the input light traversing the waveguide, including the one or more gratings of the in-coupler 230, and the one or more gratings of the out-coupler 240. A grating vector kg is a vector that is directed normally to the equal-phase planes of the grating, i.e. its "grooves", and whose magnitude is inversely proportional to the grating pitch p, |kg|=2π/p. Under conditions of equation (5), rays of the image light exit the waveguide by means of the out-coupler 240 at the same angle at which they entered the in-coupler 230, provided that the waveguide 210 is an ideal slab waveguide with parallel opposing faces 211, 212, and the FOV of the waveguide is defined by its input FOV. In practical implementations equation (5) will hold with some accuracy, within an error threshold that may be allowed for a particular display system. In an example embodiment with a single one-dimensional (1D) input grating and a 1D output grating, the grating pitch of the out-coupler 240 may be substantially equal to the grating pitch of the in-coupler 230.
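The closure condition of equation (5) can be checked numerically for a symmetric three-grating layout (the pitch value, axis choice, and orientation convention below are assumptions for illustration; with the two output-grating vectors at ±(180°−ϕ) from g0, exact closure requires p1 = p2 = 2·p0·cos ϕ):

```python
import math

def grating_vector(pitch, angle_deg):
    """2D grating vector of magnitude 2*pi/pitch, oriented at angle_deg
    measured from the +y axis (the assumed direction of g0)."""
    g = 2 * math.pi / pitch
    a = math.radians(angle_deg)
    return (g * math.sin(a), g * math.cos(a))

p0 = 380.0   # in-coupler pitch, nm (illustrative value)
phi = 60.0   # output-grating orientation angle, degrees
p1 = 2 * p0 * math.cos(math.radians(phi))  # output pitch giving closure

g0 = grating_vector(p0, 0.0)             # in-coupler vector, along +y
g1 = grating_vector(p1, 180.0 - phi)     # first output grating
g2 = grating_vector(p1, -(180.0 - phi))  # second output grating

# Vector sum of all three grating vectors is (numerically) zero:
ksum = (g0[0] + g1[0] + g2[0], g0[1] + g1[1] + g2[1])
```

Note that for ϕ=60° this yields p1=p0, consistent with the observation that the output pitch may be substantially equal to the input pitch.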
kx=(2π/λ)·n·sin(θx), ky=(2π/λ)·n·sin(θy). (6)
Here n is the refractive index of the substrate where light is propagating, and the angles θx and θy define the direction of light propagation in the plane of the waveguide in projection on the x-axis and y-axis, respectively. These angles may also represent the coordinates of angle space in which a 2D FOV of the waveguide may be defined. The (kx, ky) plane may be referred to herein as the k-space, and the 2D wavevector k=(kx, ky) as the k-vector.
In the k-space, the in-coupled light may be graphically represented by a TIR ring 500. The TIR ring 500 is an area of the k-space bounded by a TIR circle 501 and a maximum-angle circle 502. The TIR circle 501 corresponds to the critical TIR angle βc. The maximum-angle circle 502 corresponds to a maximum propagation angle βmax for in-coupled light. States within the TIR circle 501 represent uncoupled light, i.e. the in-coming light that is incident upon the in-coupler 430 or the light coupled out of the waveguide by one of the out-coupler gratings 441, 442. Without normalization, the radius rTIR of the TIR circle 501 and the radius rmax of the outer circle 502 may be defined by the following equations: rTIR=2π/λ, rmax=(2π/λ)·n·sin(βmax). (7)
The greater the refractive index n, the broader is the angular range of input light of a wavelength λ that can be coupled into the waveguide.
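Membership in the TIR ring can be sketched as follows (a minimal illustration; the radii are taken as r_TIR = 2π/λ and r_max = (2π/λ)·n·sin βmax, consistent with the critical-angle and maximum-angle definitions referenced by equations (7)):

```python
import math

def in_tir_ring(kx, ky, lam, n, beta_max_deg=75.0):
    """True if the in-plane k-vector lies inside the TIR ring, i.e.
    r_TIR = 2*pi/lam <= |k| <= r_max = (2*pi/lam)*n*sin(beta_max)."""
    k0 = 2 * math.pi / lam
    r = math.hypot(kx, ky)
    return k0 <= r <= k0 * n * math.sin(math.radians(beta_max_deg))

# A guided ray at beta = 50 degrees in an n = 2.4 substrate at 450 nm
# falls inside the ring; a near-normal (uncoupled) ray does not.
k0 = 2 * math.pi / 450
guided = in_tir_ring(0.0, k0 * 2.4 * math.sin(math.radians(50)), 450, 2.4)
```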
Arrows labeled g0, g1, and g2 in
The position, size, and shape of each partial FOV 520, 530 in the angle space, and thus the full 2D FOV of the waveguide, depend on the wavelength λ of the input light, on the ratios of the pitches p0, p1, and p2 of the input and output gratings to that wavelength, and on the relative orientation of the gratings. Thus, the 2D FOV of the waveguide may be suitably shaped and positioned in the angle space for a particular color channel or channels by selecting the pitch sizes and the relative orientation of the gratings. In some embodiments of waveguide 410, the output gratings 441, 442 may have the same pitch, p1=p2, and be symmetrically oriented relative to the input grating. In such embodiments the grating vectors g1, g2 of the first and second output gratings may be oriented at angles of +\−ϕ relative to the grating vector g0 of the in-coupler. By way of non-limiting example, the grating orientation angle ϕ may be in the range of 50 to 70 degrees, for example 60 to 66 degrees, and may depend on the refractive index of the waveguide.
In some embodiments a display waveguide of a NED, such as the display waveguide 410 of
Referring now to
pi≤λ/2, (8)
where pi is the grating's pitch, which defines the length g of the grating vector gi as g=2π/pi, i=1, 2. If condition (8) is fulfilled, a single diffraction of even a glancing ray of ambient light will trap that ray within the waveguide by TIR, thereby preventing the ambient light of wavelengths equal or greater than λ from being diffracted by the output gratings toward the eyebox at an angle different from its angle of incidence.
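The single-diffraction trapping argument can be checked numerically (a sketch; by equations (1) and (2) the once-diffracted ray satisfies n·sin β = λ/p − sin α, and TIR requires n·sin β ≥ 1, so a glancing ray with sin α = 1 is trapped whenever λ/p ≥ 2):

```python
import math

def once_diffracted_trapped(lam, pitch, alpha_deg=90.0):
    """True if ambient light of wavelength lam, incident at alpha_deg, is
    trapped by TIR after a single diffraction: n*sin(beta) = lam/pitch -
    sin(alpha) must be >= 1, per equations (1) and (2)."""
    return lam / pitch - math.sin(math.radians(alpha_deg)) >= 1.0

# With pitch <= lam/2 even a glancing (90-degree) ray is trapped;
# a slightly coarser grating lets it diffract back out toward the eyebox.
trapped = once_diffracted_trapped(450, 225)  # pitch = lam/2
leaks = once_diffracted_trapped(450, 260)    # pitch > lam/2
```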
Referring to
λ/pi−sin(γ)≥1,
where i=1 or 2. This condition provides a corresponding condition (9) on the pitch pi of the out-coupler gratings 741, 742:
pi≤ξλ (9)
where scaling parameter ξ<1 is defined by the MRF angle γ: ξ=1/(1+sin(γ)). (10)
Referring to
and in equation (9) the MRF angle γ≅θm. By way of example, for a=35 mm, b=10 mm, d=7 mm, θm≅83°. For a smaller output coupler with a=20 mm and a condition that the ambient ray does not reach the center of the eyebox, so that b may be set to 0, equation (11) yields θm≅76°.
In some embodiments it may be sufficient to prevent ambient light from appearing within a target FOV that is supported by the HMD. In such embodiments, MRF angle γ may be defined by a characteristic FOV width Γ of the NED, for example its diagonal width.
γ≅c·Γ/2, (12A)
which corresponds to a condition
pi≤λ/(1+sin(c·Γ/2)). (12B)
Here Γ is a characteristic width of a target FOV of the display, and c is a fraction of the target FOV that is to remain free of the ambient light leakage described above. In embodiments configured to support a rectangular 2D FOV, Γ may be the diagonal width of its 2D FOV. In some embodiments it may be sufficient that the central 90% of the target diagonal FOV is free of the ambient leakage, corresponding to c=0.9. In some embodiments it may be sufficient that the central 80% of the target diagonal FOV is free of the ambient leakage, corresponding to c=0.8. By way of example, the supported 2D FOV may be 40 by 60 degrees, and Γ may be about 72 degrees, which corresponds to p≤0.63λ for c=1 and p≤0.67λ for c=0.8, or, for λ=450 nm (blue light), p≤280 nm and p≤300 nm, respectively. In some embodiments the output gratings may be configured with pitch pi that satisfies equation (12B) with parameter c greater than 1, for example c=1.1 or 1.2, so that the leakage of ambient light with wavelengths equal or greater than λ is suppressed in an angular range broader than the target FOV of the display.
Conditions (8) to (12B) limit the pitch of the output gratings for a specific wavelength of ambient light. If any one of them is fulfilled for the shortest wavelengths of a visible spectrum of ambient light that may be incident upon the waveguide, it will also be fulfilled for all longer wavelengths of the visible spectrum. The term "visible spectrum" may refer here to a portion of a spectrum of electromagnetic radiation that is visible to a typical human eye under normal lighting conditions, such as 3 candelas per square meter (cd/m2) and higher (photopic vision), which spans from about 420 nm to about 700 nm. For the purpose of lessening the appearance of the rainbow artifact, the shortest wavelength of the visible spectrum, which may also be referred to as the shortest wavelength of visible light, may correspond to the wavelength of about 420 nm. In some embodiments it may be sufficient that one or more of the conditions (8) to (12B) is fulfilled for a wavelength of the blue color range of visible light, where the photopic vision sensitivity of the human eye falls to less than 1-5% of its peak value at 555 nm, e.g. for λ≥450 nm. In some embodiments it may therefore be sufficient that condition (9) with the scaling factor defined according to equations (8), (10), (11), or (12A) is fulfilled for blue light. In some embodiments the output gratings may be configured with a pitch satisfying one of the above cited conditions for λ=450 nm. In some embodiments the output gratings may be configured with a pitch satisfying one of the above cited conditions for λ=500 nm.
By way of example, in embodiments where the MRF angle γ, corresponding to the fraction c·Γ of the FOV that is to remain free of once-diffracted ambient light of wavelength λ, is 60 degrees, the pitch of the output gratings could be about 0.54λ or less. If the MRF angle γ is 45 degrees, the pitch of the output gratings could be about 0.6λ or less. If the MRF angle γ is 30 degrees, the pitch of the output gratings could be about ⅔λ or less. If the MRF angle γ is 20 degrees, the pitch of the output gratings could be about 0.745λ or less. For blue light with wavelength of 450 nm, the corresponding values may be about 241 nm, 263 nm, 300 nm, and 335 nm, respectively.
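The pitch bounds quoted above are consistent with a scaling parameter ξ = 1/(1 + sin γ) in condition (9); the following short check (the algebraic form of ξ is inferred from the worked values, since only the numbers are given here) reproduces them to within a nanometer:

```python
import math

def max_pitch(lam, gamma_deg):
    """Upper bound on grating pitch from condition (9), p <= xi*lam, with
    the scaling parameter taken as xi = 1/(1 + sin(gamma)) (form inferred
    from the worked examples in the text)."""
    return lam / (1.0 + math.sin(math.radians(gamma_deg)))

# For blue light at 450 nm and MRF angles of 60, 45, 30, and 20 degrees,
# within a nanometer of the values quoted above.
bounds = [max_pitch(450, g) for g in (60, 45, 30, 20)]
```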
As follows from equations (7), the inner radius of the TIR ring in the k-plane depends on the wavelength λ, and thus the TIR rings 500 for light of different wavelength may only partially overlap, or not overlap at all, depending on the wavelengths and the refractive index of the waveguide. The greater the refractive index of the waveguide, the broader is the range of in-plane k-vectors in which two different wavelengths of image light may be coupled by the waveguide and guided to the eyebox, and therefore the broader is the FOV that the display system employing the waveguide can support.
The width of the polychromatic TIR ring 511, which limits the FOV that may be supported at the two wavelengths simultaneously, increases as the refractive index n of the waveguide rises above a minimum value of λR/λB.
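The overlap condition can be checked numerically. The hypothetical helper below assumes, following the description of equations (7), that the guided band for each wavelength spans in-plane wavenumbers from 2π/λ (the air light line) to 2πn/λ (the TIR limit); the width of the band guided at both a blue and a red wavelength then vanishes exactly at n = λR/λB:

```python
import math

def polychromatic_ring_width(n: float, lam_blue_nm: float, lam_red_nm: float) -> float:
    """Width (rad/nm) of the in-plane k-vector band guided at both wavelengths,
    assuming each TIR ring spans 2*pi/lambda .. 2*pi*n/lambda."""
    inner = 2 * math.pi / lam_blue_nm      # largest inner radius: shortest wavelength
    outer = 2 * math.pi * n / lam_red_nm   # smallest outer radius: longest wavelength
    return max(0.0, outer - inner)         # zero means the rings do not overlap

# The rings just touch at n = lam_red / lam_blue (~1.44 for 650 nm / 450 nm)
print(polychromatic_ring_width(650 / 450, 450, 650))  # ~0: no usable overlap
print(polychromatic_ring_width(2.4, 450, 650) > 0)    # high-index waveguide: overlap
```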
In some embodiments, a single waveguide made of optically transparent high-index material may be used in a display system to convey multiple color channels of RGB light from an image source to an eyebox of a NED, with the same input and output gratings used for at least one of the Red and Green color channels, as well as the Blue color channel. In some embodiments a condition on a minimum value of the refractive index n of the waveguide may be estimated by requiring that the in-coupler grating couples rays of the longest-wavelength color channel (Red) incident at corners of the FOV into the waveguide. This corresponds to a condition
where p0 is the pitch of the in-coupler, and Γ is a width of the FOV in the direction of the diffraction vector of the in-coupler. A corresponding condition on the refractive index n may be expressed as
By way of example, to fully support a 60×40 degrees rectangular 2D FOV, which corresponds to Γ=72 degrees when the grating vector of the in-coupler is directed along a diagonal of the FOV, for λR=650 nm, p0=300 nm, and βmax=75 degrees, the refractive index n of the waveguide should exceed 2.8. In some embodiments slight vignetting of images at a corner of a rectangular 2D FOV may be allowed without significantly degrading the viewer's experience. By way of a corresponding example, a waveguide with the refractive index n≈2.6 may support a 60×40 degrees 2D FOV in embodiments where some loss of the red spectrum is allowed at a corner of the FOV, starting about 20-25 degrees away from the center of the FOV.
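The worked example can be reproduced numerically. Since the conditions referenced above are not reproduced here, the sketch below assumes the in-coupling requirement takes the form n·sin βmax ≥ sin(Γ/2) + λR/p0, an expression inferred from the stated parameters and result:

```python
import math

def min_refractive_index(lam_red_nm: float, pitch_nm: float,
                         fov_gamma_deg: float, beta_max_deg: float) -> float:
    """Minimum waveguide index n for the in-coupler to diffract red-channel
    rays incident at the FOV corners into guided (TIR) angles, assuming
    n * sin(beta_max) >= sin(Gamma/2) + lambda_R / p0."""
    return ((math.sin(math.radians(fov_gamma_deg / 2)) + lam_red_nm / pitch_nm)
            / math.sin(math.radians(beta_max_deg)))

# 60x40 deg rectangular FOV -> Gamma = 72 deg along the diagonal
n = min_refractive_index(650, 300, 72, 75)
print(f"required index n > {n:.2f}")  # exceeds 2.8 for this example
```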
In some embodiments, a single waveguide made of optically transparent high-index material with the refractive index of about 2.3, or preferably 2.4 or greater may be used in a display system to convey RGB light from an image source to an eyebox of a NED. In some embodiments, a single waveguide made of optically transparent high-index material with the refractive index of about 2.5-2.6 or greater may be used.
In the embodiment described above with reference to
In some embodiments two or more waveguides may be stacked one over the other, with the input and output gratings of the individual waveguides optimized for different wavelength ranges. In some embodiments, a stack of three waveguides may be used, one per color of RGB light. In some embodiments, one or more of the colors may be conveyed over two different waveguides. In some embodiments, a stack of two waveguides may be used to convey RGB light, so that one of the waveguides conveys light of two of the three color bands, for example Red and Green, and the other conveys the remaining color band, for example Blue. In some embodiments light of the green color band may be carried by both waveguides. In some embodiments, the output gratings of each waveguide may be configured to satisfy condition (9) with the scaling factor according to equations (10) or (12) for at least a portion of the visible spectrum, so as to reduce ambient light leakage into a pre-defined fraction of the supported FOV of the display.
Referring to
In embodiments with multiple output/redirecting gratings, such as those illustrated in
Embodiments of the present disclosure may include, or be implemented in conjunction with, an artificial reality system. An artificial reality system adjusts sensory information about the outside world obtained through the senses, such as visual information, audio, touch (somatosensation) information, acceleration, balance, etc., in some manner before presentation to a user. By way of non-limiting examples, artificial reality may include virtual reality (VR), augmented reality (AR), mixed reality (MR), hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include entirely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, somatic or haptic feedback, or some combination thereof. Any of this content may be presented in a single channel or in multiple channels, such as in a stereo video that produces a three-dimensional effect for the viewer. Furthermore, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, for example, create content in artificial reality and/or are otherwise used in (e.g., perform activities in) artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a wearable display such as an HMD connected to a host computer system, a standalone HMD, a near-eye display having a form factor of eyeglasses, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
Referring to
In some embodiments, the front body 1102 includes locators 1108 and an inertial measurement unit (IMU) 1110 for tracking acceleration of the HMD 1100, and position sensors 1112 for tracking position of the HMD 1100. The IMU 1110 is an electronic device that generates data indicating a position of the HMD 1100 based on measurement signals received from one or more of position sensors 1112, which generate one or more measurement signals in response to motion of the HMD 1100. Examples of position sensors 1112 include: one or more accelerometers, one or more gyroscopes, one or more magnetometers, another suitable type of sensor that detects motion, a type of sensor used for error correction of the IMU 1110, or some combination thereof. The position sensors 1112 may be located external to the IMU 1110, internal to the IMU 1110, or some combination thereof.
The locators 1108 are traced by an external imaging device of a virtual reality system, such that the virtual reality system can track the location and orientation of the entire HMD 1100. Information generated by the IMU 1110 and the position sensors 1112 may be compared with the position and orientation obtained by tracking the locators 1108, for improved tracking accuracy of position and orientation of the HMD 1100. Accurate position and orientation are important for presenting appropriate virtual scenery to the user as the latter moves and turns in 3D space.
The HMD 1100 may further include a depth camera assembly (DCA) 1111, which captures data describing depth information of a local area surrounding some or all of the HMD 1100. To that end, the DCA 1111 may include a laser radar (LIDAR), or a similar device. The depth information may be compared with the information from the IMU 1110, for better accuracy of determination of position and orientation of the HMD 1100 in 3D space.
The HMD 1100 may further include an eye tracking system for determining orientation and position of the user's eyes in real time. The determined position of the user's eyes allows the HMD 1100 to perform (self-) adjustment procedures. The obtained position and orientation of the eyes also allows the HMD 1100 to determine the gaze direction of the user and to adjust the image generated by the display system 1180 accordingly. In one embodiment, the vergence, that is, the convergence angle of the user's gaze, is determined. The determined gaze direction and vergence angle may also be used for real-time compensation of visual artifacts dependent on the angle of view and eye position. Furthermore, the determined vergence and gaze angles may be used for interaction with the user, highlighting objects, bringing objects to the foreground, creating additional objects or pointers, etc. An audio system may also be provided including e.g. a set of small speakers built into the front body 1102.
Referring to
As described above with reference to
The I/O interface 1115 is a device that allows a user to send action requests and receive responses from the console 1190. An action request is a request to perform a particular action. For example, an action request may be an instruction to start or end capture of image or video data or an instruction to perform a particular action within an application. The I/O interface 1115 may include one or more input devices, such as a keyboard, a mouse, a game controller, or any other suitable device for receiving action requests and communicating the action requests to the console 1190. An action request received by the I/O interface 1115 is communicated to the console 1190, which performs an action corresponding to the action request. In some embodiments, the I/O interface 1115 includes an IMU that captures calibration data indicating an estimated position of the I/O interface 1115 relative to an initial position of the I/O interface 1115. In some embodiments, the I/O interface 1115 may provide haptic feedback to the user in accordance with instructions received from the console 1190. For example, haptic feedback can be provided when an action request is received, or the console 1190 communicates instructions to the I/O interface 1115 causing the I/O interface 1115 to generate haptic feedback when the console 1190 performs an action.
The console 1190 may provide content to the HMD 1100 for processing in accordance with information received from one or more of: the IMU 1110, the DCA 1111, the eye tracking system 1125, and the I/O interface 1115. In the example shown in
The application store 1155 may store one or more applications for execution by the console 1190. An application is a group of instructions that, when executed by a processor, generates content for presentation to the user. Content generated by an application may be in response to inputs received from the user via movement of the HMD 1100 or the I/O interface 1115. Examples of applications include: gaming applications, presentation and conferencing applications, video playback applications, or other suitable applications.
The tracking module 1160 may calibrate the AR/VR system 1150 using one or more calibration parameters and may adjust one or more calibration parameters to reduce error in determination of the position of the HMD 1100 or the I/O interface 1115. Calibration performed by the tracking module 1160 also accounts for information received from the IMU 1110 in the HMD 1100 and/or an IMU included in the I/O interface 1115, if any. Additionally, if tracking of the HMD 1100 is lost, the tracking module 1160 may re-calibrate some or all of the AR/VR system 1150.
The tracking module 1160 may track movements of the HMD 1100 or of the I/O interface 1115, the IMU 1110, or some combination thereof. For example, the tracking module 1160 may determine a position of a reference point of the HMD 1100 in a mapping of a local area based on information from the HMD 1100. The tracking module 1160 may also determine positions of the reference point of the HMD 1100 or a reference point of the I/O interface 1115 using data indicating a position of the HMD 1100 from the IMU 1110 or using data indicating a position of the I/O interface 1115 from an IMU included in the I/O interface 1115, respectively. Furthermore, in some embodiments, the tracking module 1160 may use portions of data indicating a position of the HMD 1100 from the IMU 1110 as well as representations of the local area from the DCA 1111 to predict a future location of the HMD 1100. The tracking module 1160 provides the estimated or predicted future position of the HMD 1100 or the I/O interface 1115 to the processing module 1165.
The processing module 1165 may generate a 3D mapping of the area surrounding some or all of the HMD 1100 (“local area”) based on information received from the HMD 1100. In some embodiments, the processing module 1165 determines depth information for the 3D mapping of the local area based on information received from the DCA 1111 that is relevant for techniques used in computing depth. In various embodiments, the processing module 1165 may use the depth information to update a model of the local area and generate content based in part on the updated model.
The processing module 1165 executes applications within the AR/VR system 1150 and receives position information, acceleration information, velocity information, predicted future positions, or some combination thereof, of the HMD 1100 from the tracking module 1160. Based on the received information, the processing module 1165 determines content to provide to the HMD 1100 for presentation to the user. For example, if the received information indicates that the user has looked to the left, the processing module 1165 generates content for the HMD 1100 that mirrors the user's movement in a virtual environment or in an environment augmenting the local area with additional content. Additionally, the processing module 1165 performs an action within an application executing on the console 1190 in response to an action request received from the I/O interface 1115 and provides feedback to the user that the action was performed. The provided feedback may be visual or audible feedback via the HMD 1100 or haptic feedback via the I/O interface 1115.
In some embodiments, based on the eye tracking information (e.g., orientation of the user's eyes) received from the eye tracking system 1125, the processing module 1165 determines resolution of the content provided to the HMD 1100 for presentation to the user with the image projector(s) 1114. The processing module 1165 may provide the content to the HMD 1100 having a maximum pixel resolution in a foveal region of the user's gaze. The processing module 1165 may provide a lower pixel resolution in the periphery of the user's gaze, thus lessening power consumption of the AR/VR system 1150 and saving computing resources of the console 1190 without compromising a visual experience of the user. In some embodiments, the processing module 1165 can further use the eye tracking information to adjust where objects are displayed for the user's eye to prevent vergence-accommodation conflict and/or to offset optical distortions and aberrations.
The hardware used to implement the various illustrative logics, logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some steps or methods may be performed by circuitry that is specific to a given function.
The present disclosure is not to be limited in scope by the specific embodiments described herein. Indeed, other various embodiments and modifications, in addition to those described herein, will be apparent to those of ordinary skill in the art from the foregoing description and accompanying drawings. Thus, such other embodiments and modifications are intended to fall within the scope of the present disclosure. Further, although the present disclosure has been described herein in the context of a particular implementation in a particular environment for a particular purpose, those of ordinary skill in the art will recognize that its usefulness is not limited thereto and that the present disclosure may be beneficially implemented in any number of environments for any number of purposes. Accordingly, the claims set forth below should be construed in view of the full breadth and spirit of the present disclosure as described herein.