The accompanying drawings illustrate a number of exemplary embodiments and are a part of the specification. Together with the following description, these drawings demonstrate and explain various principles of the present disclosure.
Throughout the drawings, identical reference characters and descriptions indicate similar, but not necessarily identical, elements. While the exemplary embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the present disclosure covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.
Artificial reality devices, such as virtual reality headsets, can be used to simulate and/or reproduce a variety of virtual and remote environments. For example, stereoscopic images can be displayed on an electronic display inside a headset to create the illusion of depth, and head tracking sensors can be used to estimate what portion of the virtual environment is being viewed by the user. However, because existing headsets are often unable to correctly render or otherwise compensate for vergence and accommodation conflicts, such simulation can cause visual fatigue and discomfort for users. Augmented reality and mixed reality headsets may display a virtual image overlapping with real-world images. To create a comfortable viewing experience, virtual images generated by such headsets are typically displayed at distances that match, in real time, the eye's accommodation to the real-world images being viewed.
Vergence-accommodation conflict is a common problem in artificial reality systems, including virtual, augmented, and mixed reality systems. “Accommodation” is a process of adjusting the focal length of an eye lens. During accommodation, the optics of an eye are adjusted to keep an object in focus on the retina as its distance from the eye varies. “Vergence” is the simultaneous movement or rotation of both eyes in opposite directions to obtain or maintain binocular vision and is connected to accommodation of the eye. Under normal conditions, when human eyes look at a new object at a distance different from an object they had been looking at, the eyes automatically change focus (by changing the shape of their lenses) to provide accommodation at the new vergence distance of the object.
In accordance with various embodiments, disclosed display devices may include gradient-index liquid crystal (GRIN LC) lenses that utilize variations in liquid crystal alignment to refract light in a manner similar to conventional lenses. A GRIN LC lens, as disclosed herein, may include an electrode array that provides variations in voltages applied to a liquid crystal layer of the lens, with the variations producing a voltage gradient(s) proceeding from a center of the lens outward. Voltages applied to the liquid crystal layer may be selectively changed so as to generate different lens powers corresponding to active display conditions and/or user eye orientation. Accordingly, GRIN LC lenses, as disclosed herein, may address the vergence-accommodation conflict by compelling a user's eyes to focus at a focal distance coinciding with a vergence location of a virtual object displayed by the display device. Moreover, since the lens diopter is not determined solely by a surface shape of the GRIN LC lens, the thickness of the disclosed GRIN LC lenses may be significantly reduced in comparison to conventional lenses.
GRIN LC lenses having large diameters may be desirable in various devices to provide a sufficient aperture. However, as the lens diameter increases, the necessary lens thickness and required voltage drop may also increase. Additionally, the required reset time may be excessively long in such larger diameter lenses. In order to produce such larger diameter lenses, Fresnel resets may be included in the lens structure. The Fresnel resets may allow for thinner GRIN LC lenses that have sufficiently fast response times. However, transition regions between Fresnel reset sections may diffract and scatter light in undesired directions, causing unpleasant image artifacts and/or distortions. While dark masking layers may be used to block scattered light at the transition regions, such masking layers may be visible to viewers so as to interfere with their viewing experience.
In accordance with embodiments disclosed herein, a lens system may include a GRIN LC lens having one or more curved surfaces. In some examples, the GRIN LC lens element may include a curved incident side surface and a curved exit side surface. The curved incident side surface may, for example, be convex in shape with a radius of curvature selected to reduce incident angles of light received from a display screen. Reducing incident light angles may help reduce perceptible light diffraction and scattering. For example, the curved surface(s) may prevent light that is diffracted and/or scattered at Fresnel reset regions of a GRIN LC lens from reaching an eye-box of a head-mounted display device. Additionally, the curved surface(s) of the GRIN LC lens may have little or substantially no effect on the lens focal distance. In various examples, optical elements may be shaped to conform to one or more curved surfaces of the GRIN LC lenses. Such optical elements may be placed abutting a GRIN LC lens and, in some examples, may be laminated onto at least one curved surface of the GRIN LC lens.
GRIN LC lenses with curved surfaces, as disclosed herein, may obviate the need for a dark masking layer or other image-noise-blocking layer to block undesirable light scattering. Thus, optical characteristics of GRIN LC lenses having Fresnel resets may be improved, resulting in reduced light scattering and increased clarity in comparison to other lenses. Visible lines, such as those evident on a dark masking layer, may likewise be absent from the disclosed lens systems. Accordingly, the lens systems described herein may have minimal space requirements while providing high quality display characteristics, making them suitable for use in a variety of display systems, including various head-mounted display systems.
Features from any of the embodiments described herein may be used in combination with one another in accordance with the general principles described herein. These and other embodiments, features, and advantages will be more fully understood upon reading the following detailed description in conjunction with the accompanying drawings and claims.
The following will provide, with reference to
As such, instead of changing power or focal length to accommodate for the further vergence distance dv associated with virtual object 110, each eye 102 maintains accommodation at a closer accommodation distance da associated with electronic screen(s) 108. Thus, the vergence distance dv may not be equal to the accommodation distance da for the human eye for objects displayed on 2-dimensional electronic screens. This discrepancy between vergence distance dv and accommodation distance da is commonly referred to as “vergence-accommodation conflict.” A user experiencing only vergence or accommodation, but not both simultaneously, with respect to a virtual object may undesirably experience eye fatigue and discomfort during use.
“Optical series,” as used herein, may refer to relative positioning of a plurality of optical elements such that light, for each optical element of the plurality of optical elements, is transmitted by that optical element before being transmitted by another optical element of the plurality of optical elements. For embodiments described herein, optical elements may be aligned in various arrangements without regard to a specific ordering within an optical series. For example, optical element A placed before optical element B, or optical element B placed before optical element A, may both be in optical series with each other. An optical series may represent a combination of optical elements having individual optical properties that are compounded with each other when placed in series.
As used herein, a material or element that is “transparent” or “optically transparent” may, for a given thickness, have a transmissivity within the visible light spectrum of at least approximately 70%, e.g., approximately 70, 80, 90, 95, 97, 98, 99, or 99.5%, including ranges between any of the foregoing values, and less than approximately 10% bulk haze, e.g., approximately 0.5, 1, 2, 4, 6, or 8% bulk haze, including ranges between any of the foregoing values. In accordance with some embodiments, a “fully transparent” material or element may have (a) a transmissivity (i.e., optical transmittance) within the visible light spectrum of at least approximately 90%, e.g., approximately 90, 95, 97, 98, 99, or 99.5%, including ranges between any of the foregoing values, (b) less than approximately 5% bulk haze, e.g., approximately 0.1, 0.25, 0.5, 1, 2, or 4% bulk haze, including ranges between any of the foregoing values, (c) less than approximately 30% reflectivity, e.g., approximately 1, 2, 5, 10, 15, 20, or 25% reflectivity, including ranges between any of the foregoing values, and (d) at least 70% optical clarity, e.g., approximately 70, 80, 90, 95, 97, 98, 99, or 99.5% optical clarity, including ranges between any of the foregoing values. Transparent and fully transparent materials will typically exhibit very low optical absorption and minimal optical scattering. In some embodiments, “transparency” may refer to internal transparency, i.e., exclusive of Fresnel reflections.
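The numeric thresholds above can be collected into a short illustrative check (Python). The function name and its percentage-valued inputs are hypothetical conveniences for illustration only and are not part of any standard optics API or of the disclosed embodiments:

```python
# Illustrative sketch only: encodes the transparency thresholds stated
# above. Names and inputs are hypothetical.

def classify_transparency(transmissivity_pct, bulk_haze_pct,
                          reflectivity_pct=None, clarity_pct=None):
    """Classify a material per the definitions above.

    transmissivity_pct: visible-spectrum transmittance, in percent
    bulk_haze_pct: bulk haze, in percent
    reflectivity_pct, clarity_pct: needed only for the "fully
    transparent" test; if omitted, only "transparent" is evaluated.
    """
    fully = (reflectivity_pct is not None and clarity_pct is not None
             and transmissivity_pct >= 90 and bulk_haze_pct < 5
             and reflectivity_pct < 30 and clarity_pct >= 70)
    if fully:
        return "fully transparent"
    if transmissivity_pct >= 70 and bulk_haze_pct < 10:
        return "transparent"
    return "neither"
```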
As used herein, the terms “haze” and “clarity” may refer to an optical phenomenon associated with the transmission of light through a material, and may be attributed, for example, to the refraction of light within the material, e.g., due to secondary phases or porosity and/or the reflection of light from one or more surfaces of the material. As will be appreciated by those skilled in the art, haze may be associated with an amount of light that is subject to wide angle scattering (i.e., at an angle greater than 2.5° from normal) and a corresponding loss of transmissive contrast, whereas clarity may relate to an amount of light that is subject to narrow angle scattering (i.e., at an angle less than 2.5° from normal) and an attendant loss of optical sharpness or “see through quality.”
A material or element that is “reflective” or “optically reflective” may, for example, have a transmissivity within the visible light spectrum of less than approximately 2%, e.g., less than 2, 1, 0.5, 0.2, or 0.1%, including ranges between any of the foregoing values.
As used herein, the term “approximately” in reference to a particular numeric value or range of values may, in certain embodiments, mean and include the stated value as well as all values within 10% of the stated value. Thus, by way of example, reference to the numeric value “50” as “approximately 50” may, in certain embodiments, include values equal to 50±5, i.e., values within the range 45 to 55.
Varifocal block 232 may include one or more varifocal structures in optical series. A varifocal structure is an optical device that is configured to dynamically adjust its focus in accordance with instructions from a varifocal system. In some examples, varifocal block 232 may include a GRIN LC lens as disclosed herein (see, e.g.,
The orientations of liquid crystal molecules 344 in each region of liquid crystal layer 342 may be oriented by, for example, progressively changing a voltage applied to liquid crystal layer 342 at the respective regions. For example, a voltage applied to the peripheral region of liquid crystal layer 342 may be higher or lower than a voltage applied to the central region of liquid crystal layer 342, with voltages between the central and peripheral regions progressively increasing or decreasing proceeding from the central region to the peripheral region. While rod-shaped liquid crystal molecules are illustrated in the example shown in
Incident light 346 may pass through liquid crystal layer 342, where the light is refracted by liquid crystal molecules 344. Liquid crystal molecules 344 in different regions of liquid crystal layer 342 may be oriented at varied angles so as to refract light at correspondingly different angles within each region. For example, as shown in
In some examples, different voltage profiles may be applied to liquid crystal layer 342 to change optical characteristics of GRIN LC lens 340 as needed. For example, voltages may be selectively applied by an electrode array of GRIN LC lens 340 to reorient liquid crystal molecules 344 so as to change the location of focal point F1 and an optical power of GRIN LC lens 340. In at least one embodiment, liquid crystal molecules 344 may also be selectively oriented to produce a negative diopter in GRIN LC lens 340 so as to spread incoming light outward in a manner similar to a concave lens. In this example, the negative power may be accomplished by orienting liquid crystal molecules 344 within various regions of liquid crystal layer 342 to refract light outward to an increasingly greater extent proceeding from a central region outward toward the periphery.
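The positive- and negative-power behavior described above can be sketched numerically with the standard parabolic thin-lens optical-path-difference profile, OPD(r) = −r²/(2f) relative to the lens center. The function names and numeric values below are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

# Sketch: target OPD profile for a thin GRIN LC lens of focal length f
# (meters). Positive f converges light (OPD falls off parabolically
# toward the periphery); negative f produces the diverging,
# concave-lens-like behavior described above.

def target_opd(r, focal_length):
    """Parabolic thin-lens OPD, relative to the lens center."""
    return -r**2 / (2.0 * focal_length)

r = np.linspace(0.0, 0.01, 5)        # radial positions out to 10 mm
converging = target_opd(r, 0.5)      # +2 diopter lens (illustrative)
diverging = target_opd(r, -0.5)      # -2 diopter lens (illustrative)
# Converging: OPD decreases toward the periphery; diverging: increases.
```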
As shown in
A bus line 462 may be electrically coupled to at least one of driving electrodes 458 to provide selected voltages to driving electrodes 458. For example, bus line 462 may be electrically coupled to the illustrated driving electrode 458 by a via interconnect 463 extending directly between bus line 462 and the driving electrode 458. Voltages at other driving electrodes 458 may be different than the voltage applied by bus line 462 due to, for example, reductions in voltages across the inter-electrode resistors connecting other driving electrodes to the driving electrode 458 coupled to the bus line 462. Voltages applied to each of driving electrodes 458 may be controllably varied to produce desired lensing of light passing through liquid crystal layer 442. In various examples, GRIN LC lens 440 may include multiple bus lines that are each electrically coupled to different electrodes to provide separate driving zones and/or Fresnel reset regions, as discussed in more detail below. Additionally, multiple bus lines within a particular driving zone and/or Fresnel reset may be used to apply different voltages to separate driving electrodes 458 so as to provide a voltage gradient(s) between the driving electrodes 458.
According to at least one embodiment, an insulating layer 460 may be disposed over driving electrodes 458 and bus line 462. Insulating layer 460 may also surround portions of bus line 462 not directly coupled to a driving electrode 458 such that portions of insulating layer 460 are disposed between bus line 462 and other driving electrodes 458. In some examples, portions of insulating layer 460 may also be disposed in gaps G1 defined between adjacent driving electrodes 458. Insulating layer 460 may include one or more dielectric layers, which may include a stoichiometric or non-stoichiometric oxide, fluoride, oxyfluoride, nitride, oxynitride, sulfide, SiO2, TiO2, Al2O3, Y2O3, HfO2, ZrO2, Ta2O5, Cr2O3, AlF3, MgF2, NdF3, LaF3, YF3, CeF3, YbF3, Si3N4, ZnS, and/or ZnSe.
A floating electrode array including a plurality of floating electrodes 464 may be disposed on insulating layer 460 so that insulating layer 460 is disposed between driving electrodes 458/bus line 462 and floating electrodes 464. As shown in
A first alignment layer 466A may be formed over floating electrodes 464 and portions of insulating layer 460 exposed in gap regions between adjacent floating electrodes 464. First alignment layer 466A may contact liquid crystal layer 442 and may enable proper orientation of liquid crystal molecules within liquid crystal layer 442. First alignment layer 466A may include any material and surface texture suitable for aligning liquid crystal molecules in a desired manner. For example, first alignment layer 466A may be formed of a polyimide (PI) material that is rubbed on the surface facing liquid crystal layer 442. In at least one example, first alignment layer 466A may be formed of a PI layer having a surface that is modified by irradiation with ultraviolet (UV) light to promote curing or partial curing of the PI material. Following UV irradiation, the surface of first alignment layer 466A may be mechanically rubbed in selected directions (e.g., horizontally, circularly, etc.) to provide a substantially consistent surface structure producing predictable surface alignment of liquid crystal molecules in liquid crystal layer 442. Any other suitable material or combination of materials may be included in first alignment layer 466A, including, for example, polymers (e.g., perfluoropolyether films), metal-oxides, and/or carbon nanotubes.
GRIN LC lens 440 may also include a second alignment layer 466B facing first alignment layer 466A. In some embodiments, second alignment layer 466B may be formed in the same or similar manner as first alignment layer 466A and may include the same or similar materials (e.g., PI). Additionally or alternatively, second alignment layer 466B may include any other suitable materials formed using any suitable technique providing a surface configured to adequately align liquid crystal molecules within liquid crystal layer 442 in combination with first alignment layer 466A.
Liquid crystal layer 442 may be disposed between first and second alignment layers 466A and 466B, as illustrated in
In various embodiments, GRIN LC lens 440 may additionally include at least one common electrode 468 disposed between second alignment layer 466B and second lens substrate 456B. In one example, common electrode 468 may be formed as a unitary layer overlapping all or substantially all of liquid crystal layer 442, driving electrodes 458, and floating electrodes 464. In certain examples, GRIN LC lens 440 may include multiple common electrodes 468 that together cover or substantially cover liquid crystal layer 442. An electric field may be generated between common electrode 468 and driving electrodes 458 and/or floating electrodes 464 when selected voltages are applied to common electrode 468 and driving electrodes 458. In various examples, common electrode 468 may be held at a single selected voltage and, in combination with driving electrodes 458 and/or floating electrodes 464, may enable a range of voltage differentials to be selectively applied to regions of liquid crystal layer 442. Accordingly, driving electrodes 458 may, in combination with common electrode 468, generate variable electric fields that reorient liquid crystal molecules in liquid crystal layer 442 to produce a desired lens phase profile.
Driving electrodes 458, floating electrodes 464, common electrode 468, and bus line 462 may include one or more electrically conductive materials, such as a semiconductor (e.g., a doped semiconductor), metal, carbon nanotube, graphene, oxidized graphene, fluorinated graphene, hydrogenated graphene, other graphene derivatives, carbon black, transparent conductive oxides (TCOs, e.g., indium tin oxide (ITO), zinc oxide (ZnO), indium gallium zinc oxide (IGZO), etc.), conducting polymers (e.g., PEDOT), and/or other electrically conductive material. In some embodiments, the electrodes may include a metal such as nickel, aluminum, gold, silver, platinum, palladium, tantalum, tin, copper, indium, gallium, zinc, alloys thereof, and the like. Further example transparent conductive oxides include, without limitation, aluminum-doped zinc oxide, fluorine-doped tin oxide, indium-doped cadmium oxide, indium zinc oxide, indium zinc tin oxide, indium gallium tin oxide, indium gallium zinc oxide, indium gallium zinc tin oxide, strontium vanadate, strontium niobate, strontium molybdate, and calcium molybdate. In some examples, the electrodes and/or bus line may each include one or more layers, grids, nanowires, etc. of any suitable transparent conductive material, such as transparent conductive oxides, graphene, etc. Driving electrodes 458, floating electrodes 464, common electrode 468, and/or bus line 462 may have an optical transmissivity of at least approximately 50% (e.g., approximately 50%, approximately 60%, approximately 70%, approximately 80%, approximately 90%, approximately 95%, approximately 97%, approximately 98%, approximately 99%, or approximately 99.5%, including ranges between any of the foregoing values).
Electrode patterns for GRIN LC lenses, as disclosed herein, may be configured to produce desired lens profiles when operated. For example, modeling may be utilized to determine and/or optimize various design parameters, such as the shapes of the electrodes, the number of driving electrodes, the number of Fresnel reset regions, the types of resistors coupling adjacent electrodes, and/or the number of bus lines utilized to produce adequate lens shapes and provide a sufficient range of lens power while minimizing visual aberrations and delays in response time that might be perceptible to a wearer.
A “director,” as used herein, may refer to an axis oriented in an average direction of long molecular axes of all liquid crystal molecules in a liquid crystal bulk or selected region thereof. Individual liquid crystal molecules may be more or less aligned with this directional axis. Accordingly, liquid crystal molecules, such as rod-like liquid crystal molecules, may be generally oriented such that their moments of inertia are roughly aligned along the director.
A GRIN LC lens design may include concentric ring-shaped electrodes (see, e.g.,
In various embodiments, the slope of the optical path difference (OPD) vs. voltage curve 502 of a liquid crystal material, as disclosed herein, may not remain constant but may instead become substantially steeper at regions corresponding to lower voltage values. In at least one example, the nonlinearity of OPD vs. voltage curve 502 may be addressed by segmenting curve 502 into a number of different linear sections that together may better approximate the profile of curve 502 in a manner that has little or no impact on perceptible optical characteristics of the resulting GRIN LC lens. As shown in
While seven linear sections are shown in the illustrated example, curve 502 may be segmented into any other suitable number of linear sections. The number of linear sections may determine the number of interconnections and bus lines required to drive the GRIN LC lens. In the example illustrated in
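The segmentation strategy above can be sketched as a piecewise-linear fit, with one bus-line voltage at each segment boundary. The response curve below is synthetic, chosen only to be steeper at low voltage as described; it stands in for curve 502, which is not reproduced here:

```python
import numpy as np

# Sketch: approximate a nonlinear OPD-vs-voltage response with a small
# number of linear sections, each spanning the interval between two
# bus-line voltages. The response function is synthetic/illustrative.

def opd_response(v):
    """Synthetic OPD response (arbitrary units): steep at low voltage."""
    return 1.0 / (1.0 + v**2)

def piecewise_linear_fit(v, n_sections):
    """Sample the curve at n_sections + 1 bus-line voltages and
    interpolate linearly in between."""
    knots = np.linspace(v.min(), v.max(), n_sections + 1)
    return np.interp(v, knots, opd_response(knots))

v = np.linspace(0.0, 5.0, 501)
exact = opd_response(v)
err7 = np.max(np.abs(piecewise_linear_fit(v, 7) - exact))
err2 = np.max(np.abs(piecewise_linear_fit(v, 2) - exact))
# More sections -> smaller worst-case deviation from the true curve.
```

As in the text, more linear sections track the curve more closely, at the cost of more interconnections and bus lines.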
As shown in
In the example of
In at least one example, a first voltage may be applied by bus line BL1 to driving electrode 658(1) and a lower or higher voltage may be applied by bus line BL3 to driving electrode 658(3). A voltage having a value between that of bus lines BL1 and BL3 may be applied by bus line BL2 to driving electrode 658(2). In some examples, voltages of driving electrodes 658 may decrease or increase linearly or substantially linearly between pairs of bus lines (see, e.g., linear sections LS1-LS7 between pairs of bus lines B1-B8 shown in
In at least one embodiment, amounts of voltage drop or increase between adjacent driving electrodes 658 and/or between neighboring bus lines may be substantially constant. Because the radial width of driving electrodes 658 progressively decreases proceeding from the center of driving electrode array 670 outward, the voltage changes may likewise change at progressively smaller intervals proceeding radially outward. The decreasing radial intervals between driving electrodes 658 may result in progressively greater changes in liquid crystal orientation proceeding radially outward along the GRIN LC lens so that a selected lens curvature (e.g., a spherical curvature) is applied to light passing through the GRIN LC lens. For example, in one embodiment, bus line BL1 may apply approximately 4 V to the center-most driving electrode 658(1) and bus line BL3 may apply approximately 0 V to the outer-most driving electrode 658(3). In this example, bus line BL2 may apply approximately 2 V to driving electrode 658(2), which is disposed at a location between driving electrodes 658(1) and 658(3). Driving electrode 658(2) may be located such that the number of driving electrodes 658 located between driving electrodes 658(1) and 658(2) is the same or nearly the same as the number of driving electrodes 658 located between driving electrodes 658(2) and 658(3). Any other suitable number, distribution, and/or configuration of driving electrodes 658 may be utilized in various examples.
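The 4 V / 2 V / 0 V example above can be sketched as follows. The ring count (21 electrodes) and the assumption of equal-valued inter-electrode resistors, which divide the voltage evenly between pinned electrodes, are illustrative:

```python
import numpy as np

# Sketch of the bus-line scheme described above: three bus lines pin
# the voltages of the center, middle, and edge driving electrodes, and
# the resistor ladder between neighboring electrodes divides the
# voltage so that intermediate electrodes fall linearly between the
# pinned values. Electrode count is hypothetical.

n_electrodes = 21                      # hypothetical ring count
pinned = {0: 4.0, 10: 2.0, 20: 0.0}    # bus lines BL1, BL2, BL3

indices = sorted(pinned)
electrode_v = np.interp(np.arange(n_electrodes), indices,
                        [pinned[i] for i in indices])
# Equal resistors between equal numbers of electrodes give equal
# voltage steps within each section (here 0.2 V per electrode).
```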
In some embodiments, voltage drops between different pairs of bus lines may have different slopes so as to produce a desired lens profile in the GRIN LC lens. Any suitable combination of voltage values may be applied to bus line BL1-BL3 to produce selected electrical field gradients in an overlapping liquid crystal layer. For example, a total voltage drop between bus lines BL2 and BL3 may be more or less steep than a total voltage drop between bus lines BL1 and BL2.
Driving electrode array 670 may be divided into a plurality of Fresnel reset sections. In the example shown in
In the embodiment of
Driving electrode array 670 may be utilized to provide GRIN LC lens 641 with a segregated Fresnel structure. The GRIN LC lens may include any appropriate type of Fresnel structure, such as a Fresnel zone plate lens including areas that have a phase difference of a half-wave to adjacent areas, a diffractive Fresnel lens having a segmented parabolic phase profile where the segments are small and can result in significant diffraction, or a refractive Fresnel lens having a segmented parabolic profile where the segments are large enough so that diffraction effects are minimized. Other structures may also be used.
In some embodiments, the driving electrode array 670 may be utilized in a refractive Fresnel GRIN LC lens having a segmented parabolic profile, where the segments are large enough that the resulting diffraction angle is smaller than the angular resolution of human eyes (i.e., diffraction effects are not observable by human eyes). Such a refractive Fresnel LC lens may be referred to as a segmented phase profile (SPP) LC lens.
For a positive thin lens, the optical path difference (OPD) can be approximated with a Maclaurin series by a parabolic profile as shown in Equation (1):

OPD(r) = r²/(2f),  (1)
where r is the lens radius (i.e., half of the lens aperture) and f is the focal length. The OPD of an LC lens is proportional to the cell thickness d and the birefringence Δn of the LC material, as shown in Equation (2):

OPD = d·Δn.  (2)
The response time τ of an electrically controlled birefringence (ECB) LC cell, which is the time the material requires to recover to its original state, is quadratically dependent on the cell thickness d (τ ∝ d²), as shown in Equation (3):

τ = γd²/(K22·π²),  (3)
where γ and K22 are the rotational viscosity and the splay elastic constant of the LC material, respectively. As equations (1)-(3) show, there is typically a tradeoff between the aperture size and response time. Thus, designing a GRIN LC lens with large aperture and reasonable response time has conventionally presented challenges. In the disclosed embodiments, by introducing phase resets (i.e., Fresnel resets) in the parabolic phase profile, the aperture size of the LC lens may be increased without compromising the response time.
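The tradeoff expressed by Equations (1)-(3) can be illustrated with a short numerical sketch. All material and geometry values below (design Δn, aperture, focal length, number of resets) are assumptions chosen for illustration, not values from the disclosure:

```python
# Sketch of the Equations (1)-(3) tradeoff. Dividing the parabolic
# profile into N Fresnel reset sections reduces the required cell
# thickness roughly N-fold and, since tau ~ d**2, the response time
# roughly N**2-fold. All numeric values are illustrative assumptions.

delta_n = 0.27             # LC birefringence (illustrative)
aperture = 25e-3           # m, lens aperture (illustrative)
f = 1.0                    # m, focal length (1 diopter, illustrative)

r = aperture / 2.0
opd_max = r**2 / (2.0 * f)              # Equation (1): parabolic OPD
d_monolithic = opd_max / delta_n        # Equation (2): required thickness

n_resets = 5
d_fresnel = d_monolithic / n_resets     # each segment carries ~1/N of OPD
speedup = (d_monolithic / d_fresnel)**2 # Equation (3): tau ~ d**2
# Five resets give up to a 25x faster response for the same aperture.
```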
The total OPD step for a designed wavelength λ is n·λ, where n is the number of waves at the designed wavelength. For a non-designed wavelength, the total OPD step may be the same, but the lens power may be different for different wavelengths. Equation (4) relates the OPD step for the designed wavelength to that for a non-designed wavelength:

n·λ = (m + φ/(2π))·λi,  (4)

where n is the number of waves at the designed wavelength, λi is the non-designed wavelength, m is an integer, and φ is the phase condition. For a half-wave condition, φ = π, so that

λi = n·λ/(m + 1/2)

for each half-wave condition. As such, light having wavelengths of 790 nm, 668 nm, 579 nm, 511 nm, 457 nm, and 414 nm may cause a half-wave condition if a designed wavelength is 543.5 nm. Quarter-wave condition wavelengths are given by

λi = n·λ/(m ± 1/4).

Hence, light having wavelengths of 696 nm, 644 nm, 599 nm, 561 nm, 527 nm, 496 nm, and 470 nm may cause a quarter-wave condition if a designed wavelength is 543.5 nm.
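The listed wavelengths are consistent with a total OPD step of n = 8 waves at the 543.5 nm design wavelength; the sketch below reproduces them under that inferred assumption (n = 8 is not stated explicitly in the text):

```python
# Sketch: half-wave wavelengths satisfy n*lam = (m + 1/2)*lam_i, and
# quarter-wave wavelengths satisfy n*lam = (m +/- 1/4)*lam_i, per
# Equation (4). The value n = 8 is inferred from the listed
# wavelengths, not stated in the text.

lam = 543.5   # nm, designed wavelength
n = 8         # waves in the total OPD step (assumed)

total = n * lam
half_wave = [total / (m + 0.5) for m in range(5, 11)]
quarter_wave = sorted(
    (total / (m + s) for m in range(6, 10) for s in (0.25, 0.75)),
    reverse=True)
# half_wave is approximately [790, 669, 580, 512, 458, 414] nm,
# matching the half-wave list above to within about 1 nm.
```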
The five Fresnel reset sections FS1-FS5 of GRIN LC lens 840 may enable the corresponding LC cell thickness of GRIN LC lens 840 to be reduced up to five times, resulting in an LC cell thickness as low as approximately 14 μm. Likewise, the response time of the illustrated GRIN LC lens may be improved by a factor of up to 25. That is, the introduction of the Fresnel resets in the GRIN LC lens phase profile may enable the optical power of GRIN LC lens 840 to be adjusted sufficiently fast to keep pace with human eye accommodation (e.g., accommodation may occur in approximately 300 ms) such that the vergence-accommodation conflict may be substantially or fully resolved. The number of Fresnel resets/segments in a particular lens may be determined based on specific configurations of the Fresnel structure and the GRIN LC lens requirements, such as the desired optical power, lens aperture, switching time, and/or image quality of the GRIN LC lens.
In some examples, each bus line 1062 may be coupled to a corresponding driving electrode 1058 within each of a plurality of Fresnel reset sections (see, e.g., Fresnel reset sections FS1 and FS2 in
As illustrated in
As illustrated in
In various embodiments, a curved GRIN LC lens may include one or more curved surfaces, such as a pair of curved surfaces including an incident side surface and an exit side surface. Additionally, in at least one example, layers within the curved GRIN LC lens may also be curved. For example, a liquid crystal layer and/or one or more substrates abutting the liquid crystal layer may also follow curved profiles.
In some examples, at least a portion of GRIN LC module 1240 may abut at least a portion of eye-tracking module 1282 and/or prescription lens member 1283. For example, as shown in
As shown in
First substrate 1280A and/or second substrate 1280B may include various electrical components and layers (e.g., alignment layers, insulation layers, etc.) for driving and orienting liquid crystal molecules in liquid crystal layer 1242 to produce a desired lensing effect on light passing through liquid crystal layer 1242. Wiring, electrodes, and various electronic components and layers within first substrate 1280A and second substrate 1280B may be laid out along a curved path conforming to the corresponding curved shape of first substrate 1280A and/or second substrate 1280B.
Incident side surface 1243A and exit side surface 1243B may each have any suitable curved shape. In some examples, incident side surface 1243A and/or exit side surface 1243B may have a partial spherical shape with a radius of curvature that is greater than 0 mm and less than infinity (i.e., not planar). For example, incident side surface 1243A and/or exit side surface 1243B may have a radius of curvature of between approximately 10 mm and approximately 1000 mm (e.g., approximately 10 mm, approximately 20 mm, approximately 30 mm, approximately 40 mm, approximately 50 mm, approximately 60 mm, approximately 70 mm, approximately 80 mm, approximately 90 mm, approximately 100 mm, approximately 150 mm, approximately 200 mm, approximately 250 mm, approximately 300 mm, approximately 350 mm, approximately 400 mm, approximately 450 mm, approximately 500 mm, approximately 600 mm, approximately 700 mm, approximately 800 mm, approximately 900 mm, approximately 1000 mm).
In various examples, the increased curvature of portions of GRIN LC module 1240, particularly incident side surface 1243A, in comparison to a flat GRIN LC lens, may decrease an angle of incidence of light impacting and/or passing through GRIN LC module 1240. For example, an angle of incidence for a curved GRIN LC lens, as described herein, may be approximately 20° at a maximum field of view of 48°. In contrast, an angle of incidence for a conventional flat GRIN LC lens may be approximately 34° at a maximum field of view of 48°. The angle of incidence of the incoming light may thus be improved significantly with curved GRIN LC lens modules.
This reduction in angle of incidence may be particularly substantial when the incident light is directed toward GRIN LC module 1240 by another optical member, such as a pancake lens, having a larger diameter than GRIN LC module 1240. Reducing the angle of incidence of light entering GRIN LC module 1240 may reduce undesired scattering and/or diffraction of light. Such reductions in light scattering and/or diffraction may be particularly noticeable near peripherally outer regions of GRIN LC module 1240, where Fresnel reset sections may be narrower and more closely clustered.
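The geometric effect described above can be illustrated with a simplified two-dimensional ray sketch. All values below are hypothetical and do not reproduce the disclosed angle figures; the sketch only demonstrates that, for an inward-sloping ray, a convex incident surface tilts the local surface normal toward the ray and thereby reduces the angle of incidence relative to a flat surface:

```python
import math

def incidence_on_sphere(p, d, R):
    """Angle of incidence (degrees) of the ray p + t*d on a convex
    spherical surface with its vertex at the origin and center at (0, R).
    2-D sketch in the meridional (x, z) plane; z is the optical axis;
    d must be a unit vector."""
    px, pz = p
    dx, dz = d
    cx, cz = 0.0, R                          # sphere center
    ox, oz = px - cx, pz - cz
    # |p + t*d - C|^2 = R^2 -> t^2 + b*t + c = 0 (a = 1 for unit d)
    b = 2 * (ox * dx + oz * dz)
    c = ox * ox + oz * oz - R * R
    t = (-b - math.sqrt(b * b - 4 * c)) / 2  # first (front-cap) hit
    qx, qz = px + t * dx, pz + t * dz        # intersection point
    nx, nz = (qx - cx) / R, (qz - cz) / R    # normal toward the source
    return math.degrees(math.acos(-(dx * nx + dz * nz)))

theta = math.radians(30)                     # hypothetical field angle
ray_dir = (-math.sin(theta), math.cos(theta))  # sloping inward, toward +z
curved = incidence_on_sphere((20.0, -10.0), ray_dir, R=60.0)
flat = 30.0  # flat surface: incidence simply equals the field angle
print(f"flat: {flat:.1f} deg, curved: {curved:.1f} deg")
```

For this hypothetical geometry the curved surface receives the ray at roughly 17°, versus 30° for a flat surface, because the local normal at the peripheral hit point is tilted toward the incoming ray.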
As additionally shown in
In order to reflect the light correctly and avoid image artifacts, the polarization state of light within pancake lens 1385 may be precisely controlled. In some examples, a polarizer 1386 may be disposed at or near the exit-side surface of pancake lens 1385 such that polarizer 1386 is positioned between pancake lens 1385 and GRIN LC module 1340. In at least one example, polarizer 1386 may be laminated to the concave exit surface of pancake lens 1385. Polarizer 1386 may ensure that light passing from pancake lens 1385 to GRIN LC module 1340 is properly polarized, thus facilitating suitable lensing of the light during passage through GRIN LC module 1340.
Light 1348 exiting GRIN LC module 1340 may pass through an intermediate region 1381 and/or through eye-tracking module 1382 and prescription lens member 1383. Intermediate region 1381 may include an air gap defined between GRIN LC module 1340 and an adjacent member, such as eye-tracking module 1382 and/or prescription lens member 1383. Additionally or alternatively, intermediate region 1381 may include a solid, optically clear support and/or lens member on which GRIN LC module 1340 is laminated and/or otherwise affixed. GRIN LC module 1340 may focus light 1348 towards an eye-box of a display device, with prescription lens member 1383 and/or another optical element further adjusting the image as needed to meet a particular user's needs.
Because light from pancake lens 1385 is directed towards GRIN LC module 1340 at various inward sloping angles, the curved shape of incident side surface 1343A of GRIN LC module 1340 may allow such incoming light to be received at incident side surface 1343A with a decreased angle of incidence in comparison to more conventional flat GRIN LC lens modules. Accordingly, undesired scattering and/or diffraction of light may be significantly reduced, particularly near peripherally outer regions of GRIN LC module 1340, where Fresnel reset sections may be narrower and more closely clustered.
While GRIN LC module 1340 is shown and discussed, a switchable liquid crystal module or any other suitable type of liquid crystal module may optionally be utilized in conjunction with components of the illustrated system 1300. GRIN LC module 1340, or any other module used in place of GRIN LC module 1340, may have a center thickness of approximately 3 mm or less. GRIN LC module 1340 may be integrated within the cavity of pancake lens 1385 using lamination, bonding, or any other suitable method. In some examples, GRIN LC module 1340 may be connected to a driving circuit using flex cables and/or any other suitable electronic connection. Any other suitable elements may also be included within and/or adjacent to the cavity of pancake lens 1385, such as, for example, an eye-tracking illumination module, an eye-tracking imaging module, color-correction films (e.g., chromatic aberration correction/Pancharatnam-Berry phase films), one or more additional liquid crystal modules (e.g., additional GRIN LC modules), and/or any other planar component or layer that provides functional value to display system 1300.
Line 1404 represents light having a wavelength of 400 nm that enters the GRIN LC lens at an incident angle of 0°. Line 1406 represents light having a wavelength of 400 nm that enters the GRIN LC lens at an incident angle of 10°. Line 1408 represents light having a wavelength of 400 nm that enters the GRIN LC lens at an incident angle of −10°. Line 1410 represents light having a wavelength of 450/470 nm that enters the GRIN LC lens at an incident angle of 0°.
At step 1920 in
GRIN LC lenses with curved surfaces, as described herein, may have reduced light scattering, particularly in fly-back regions between Fresnel reset zones. Accordingly, GRIN LC lens systems, as described herein, may not require the inclusion of a dark masking layer or other image noise blocking layer to reduce undesirable light scattering. Optical characteristics of GRIN LC lenses having Fresnel resets may thus be improved, resulting in reduced light scattering and increased clarity in comparison to other lenses. Thus, visible lines, such as those evident on a dark masking layer, may not be present in the disclosed lens systems. Accordingly, the lens systems described herein may have minimal space requirements while providing high quality display characteristics, making them suitable for use in a variety of display systems, including various head-mounted display systems.
Example 1: A lens system includes a lens having a liquid crystal module, an incident side surface on a first side of the liquid crystal module, and an exit side surface on a second side of the liquid crystal module, where at least one of the incident side surface and the exit side surface includes a curved surface.
Example 2: The lens system of Example 1, where both the incident side surface and the exit side surface have a curved surface.
Example 3: The lens system of Example 2, where a radius of curvature of the incident side surface is approximately the same as a radius of curvature of the exit side surface.
Example 4: The lens system of any of Examples 1-3, where the liquid crystal module includes a driving electrode array, a common electrode, and a lens liquid crystal layer disposed between the driving electrode array and the common electrode.
Example 5: The lens system of any of Examples 1-4, where the liquid crystal module extends along a curved path.
Example 6: The lens system of any of Examples 1-5, where the lens is a first lens and the lens system further includes a second lens overlapping the first lens.
Example 7: The lens system of Example 6, where the second lens has a curved surface facing the incident side surface or exit side surface of the first lens.
Example 8: The lens system of Example 7, where the curved surface of the second lens has a radius of curvature that is approximately the same as a radius of curvature of the facing incident side surface or exit side surface of the first lens.
Example 9: The lens system of any of Examples 6-8, where the second lens is a pancake lens.
Example 10: The lens system of any of Examples 6-9, further including a reflective polarizer positioned between the second lens and the first lens.
Example 11: The lens system of any of Examples 1-10, where the lens includes a plurality of Fresnel reset sections concentrically arranged between a center and an outer periphery of the lens.
Example 12: The lens system of any of Examples 1-11, where the incident side surface is a convex surface and the exit side surface is a concave surface.
Example 13: The lens system of any of Examples 1-12, where the concave exit side surface is laminated onto a convex surface of an abutting optical element.
Example 14: A display device including a display screen having a plurality of light emitting elements and a lens system that receives light emitted from the display screen, the lens system having a lens including a liquid crystal module, an incident side surface on a first side of the liquid crystal module, and an exit side surface on a second side of the liquid crystal module, where at least one of the incident side surface and the exit side surface has a curved surface.
Example 15: The display device of Example 14, where the lens is a first lens and the lens system further includes a second lens overlapping the first lens.
Example 16: The display device of Example 15, where the second lens is disposed between the display screen and the first lens.
Example 17: The display device of any of Examples 14-16, where the incident side surface includes a convex surface facing the display screen.
Example 18: A method includes providing a lens having a liquid crystal module, an incident side surface on a first side of the liquid crystal module, and an exit side surface on a second side of the liquid crystal module, where at least one of the incident side surface and the exit side surface includes a curved surface. The method also includes positioning another curved surface of an optical element adjacent the at least one curved surface of the lens.
Example 19: The method of Example 18, where the optical element is a second lens.
Example 20: The method of any of Examples 18 and 19, further including positioning a display screen such that light emitted from the display screen passes through the optical element to the lens.
HMD 2005 may present content to a user. In some examples, HMD 2005 may be an embodiment of HMD 200 described above with reference to
Eye-tracking system 236 may track eye position and eye movement of a user of HMD 2005. A camera or other optical sensor, which may be part of eye-tracking system 236 inside HMD 2005, may capture image information of a user's eye(s), and eye-tracking system 236 may use the captured information to determine interpupillary distance, interocular distance, a three dimensional (3D) position of each eye relative to HMD 2005 (e.g., for distortion adjustment purposes), including a magnitude of torsion and rotation (i.e., roll, pitch, and yaw), and gaze directions for each eye.
In some embodiments, infrared light may be emitted within HMD 2005 and reflected from each eye. The reflected light may be received or detected by the camera and analyzed to extract eye rotation from changes in the infrared light reflected by each eye. Many methods for tracking the eyes of a user may be used by eye-tracking system 236. Accordingly, eye-tracking system 236 may track up to six degrees of freedom of each eye (i.e., 3D position, roll, pitch, and yaw), and at least a subset of the tracked quantities may be combined from two eyes of a user to estimate a gaze point (i.e., a 3D location or position in the virtual scene where the user is looking). For example, eye-tracking system 236 may integrate information from past measurements, measurements identifying a position of a user's head, and 3D information describing a scene presented by electronic display 208. Thus, information for the position and orientation of the user's eyes may be used to determine the gaze point in a virtual scene presented by HMD 2005 where the user is currently looking.
Varifocal block 232 may adjust its focal length (and thus its optical power) by adjusting a focal length of one or more varifocal structures. As noted above with reference to
Vergence processing module 2030 may determine a vergence distance of a user's gaze based on the gaze point or an estimated intersection of the gaze lines determined by eye-tracking system 236. Vergence is the simultaneous movement or rotation of both eyes in opposite directions to maintain single binocular vision, which is naturally and automatically performed by the human eye. Thus, a location where a user's eyes are verged is where the user is currently looking and is also typically the location where the user's eyes are currently focused. For example, vergence processing module 2030 may triangulate the gaze lines to estimate a distance or depth from the user associated with intersection of the gaze lines. Then the depth associated with intersection of the gaze lines may be used as an approximation for the accommodation distance, which identifies a distance from the user where the user's eyes are directed. Thus, the vergence distance may allow determination of a location where the user's eyes should be focused.
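The triangulation performed by a module such as vergence processing module 2030 can be sketched as a least-squares intersection of two gaze rays. This is a hypothetical illustration; the function and variable names are not from the disclosure:

```python
import numpy as np

def vergence_distance(eye_l, eye_r, gaze_l, gaze_r):
    """Estimate vergence distance as the depth of the least-squares
    intersection point of two 3-D gaze rays from the eyes' midpoint.
    eye_*: 3-vectors (eye centers); gaze_*: unit gaze directions."""
    a = np.asarray(gaze_l, dtype=float)
    b = np.asarray(gaze_r, dtype=float)
    p = np.asarray(eye_r, dtype=float) - np.asarray(eye_l, dtype=float)
    # Normal equations for min over (t, s) of |eye_l + t*a - eye_r - s*b|
    m = np.array([[a @ a, -(a @ b)], [a @ b, -(b @ b)]])
    t, s = np.linalg.solve(m, np.array([p @ a, p @ b]))
    verged = (np.asarray(eye_l) + t * a + np.asarray(eye_r) + s * b) / 2
    midpoint = (np.asarray(eye_l) + np.asarray(eye_r)) / 2
    return float(np.linalg.norm(verged - midpoint))

# Hypothetical check: eyes 64 mm apart fixating a point 1 m away.
fix = np.array([0.0, 0.0, 1.0])
eye_l, eye_r = np.array([-0.032, 0.0, 0.0]), np.array([0.032, 0.0, 0.0])
g_l = (fix - eye_l) / np.linalg.norm(fix - eye_l)
g_r = (fix - eye_r) / np.linalg.norm(fix - eye_r)
print(round(vergence_distance(eye_l, eye_r, g_l, g_r), 3))  # 1.0
```

In practice, measured gaze lines rarely intersect exactly, which is why the midpoint of the closest approach between the two rays is used as the verged point.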
Locators 230 may be objects located in specific positions on HMD 2005 relative to one another and relative to a specific reference point on HMD 2005. A locator 230 may be a light emitting diode (LED), a corner cube reflector, a reflective marker, a type of light source that contrasts with an environment in which HMD 2005 operates, or some combination thereof.
IMU 226 may be an electronic device that generates fast calibration data based on measurement signals received from one or more of head tracking sensors 2035, which generate one or more measurement signals in response to motion of HMD 2005. Examples of head tracking sensors 2035 include accelerometers, gyroscopes, magnetometers, other sensors suitable for detecting motion, sensors suitable for correcting error associated with IMU 226, or some combination thereof.
Based on the measurement signals from head tracking sensors 2035, IMU 226 may generate fast calibration data indicating an estimated position of HMD 2005 relative to an initial position of HMD 2005. For example, head tracking sensors 2035 may include multiple accelerometers to measure translational motion (forward/back, up/down, left/right) and multiple gyroscopes to measure rotational motion (e.g., pitch, yaw, and roll). IMU 226 may, for example, rapidly sample the measurement signals and calculate the estimated position of HMD 2005 from the sampled data. Alternatively, IMU 226 may provide the sampled measurement signals to console 2020, which determines the fast calibration data.
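As a rough illustration of how an estimated position could be derived from sampled motion signals, the following sketch Euler-integrates a one-dimensional accelerometer stream. Real IMU processing would additionally handle orientation, gravity compensation, and sensor bias; this sketch and its names are hypothetical:

```python
def integrate_imu(accels, dt):
    """Dead-reckon 1-D position from accelerometer samples by simple
    Euler integration: acceleration -> velocity -> position."""
    v = p = 0.0
    for a in accels:
        v += a * dt   # integrate acceleration into velocity
        p += v * dt   # integrate velocity into position
    return p

# 1 m/s^2 constant acceleration sampled at 100 Hz for 1 s:
# the closed form gives 0.5 m, and Euler integration lands close by.
pos = integrate_imu([1.0] * 100, dt=0.01)
print(round(pos, 3))
```

The small overshoot relative to the closed-form 0.5 m is the discretization error of forward Euler integration, which shrinks as the sample rate increases; this is one reason an IMU "rapidly samples" its measurement signals.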
IMU 226 may additionally receive one or more calibration parameters from console 2020. As further discussed below, the one or more calibration parameters may be used to maintain tracking of HMD 2005. Based on a received calibration parameter, IMU 226 may adjust one or more of the IMU parameters (e.g., sample rate). In some embodiments, certain calibration parameters may cause IMU 226 to update an initial position of the reference point to correspond to a next calibrated position of the reference point. Updating the initial position of the reference point as the next calibrated position of the reference point may help to reduce accumulated error associated with determining the estimated position. The accumulated error, also referred to as drift error, may cause the estimated position of the reference point to “drift” away from the actual position of the reference point over time.
Scene rendering module 2040 may receive contents for the virtual scene from a virtual reality engine 2045 and provide display content for display on electronic display 208. Scene rendering module 2040 may include a hardware central processing unit (CPU), graphics processing unit (GPU), and/or a controller/microcontroller. Additionally, scene rendering module 2040 may adjust the content based on information from eye-tracking system 236, vergence processing module 2030, IMU 226, and head tracking sensors 2035. Scene rendering module 2040 may determine a portion of the content to be displayed on electronic display 208, based on one or more of eye-tracking system 236, tracking module 2055, head tracking sensors 2035, or IMU 226. For example, scene rendering module 2040 may determine a virtual scene, or any part of the virtual scene, to be displayed to the viewer's eyes. Scene rendering module 2040 may also dynamically adjust the displayed content based on the real-time configuration of varifocal block 232. In addition, based on the information of the determined lens center shift provided by varifocal block 232, scene rendering module 2040 may determine a shift of the virtual scene to be displayed on electronic display 208.
Imaging device 2010 may provide a monitoring function for HMD 2005 and may generate slow calibration data in accordance with calibration parameters received from console 2020. Slow calibration data may include one or more images showing observed positions of locators 230 that are detectable by imaging device 2010. Imaging device 2010 may include one or more cameras, one or more video cameras, other devices capable of capturing images including one or more locators 230, or some combination thereof. Slow calibration data may be communicated from imaging device 2010 to console 2020, and imaging device 2010 may receive one or more calibration parameters from console 2020 to adjust one or more imaging parameters (e.g., focal length, focus, frame rate, ISO, sensor temperature, shutter speed, aperture, etc.).
Input/output interface 2015 may be a device that allows a user to send action requests to console 2020. An action request may be a request to perform a particular action. For example, an action request may be used to start or end an application or to perform a particular action within the application. Input/output interface 2015 may include one or more input devices such as a keyboard, a mouse, a game controller, or any other suitable device. An action request received by input/output interface 2015 may be communicated to console 2020, which performs an action corresponding to the action request. In some embodiments, input/output interface 2015 may provide haptic feedback to the user in accordance with instructions received from console 2020. For example, haptic feedback may be provided by input/output interface 2015 when an action request is received, or console 2020 may communicate instructions to input/output interface 2015 causing input/output interface 2015 to generate haptic feedback when console 2020 performs an action.
Console 2020 may provide content to HMD 2005 for presentation to the user in accordance with information received from imaging device 2010, HMD 2005, or input/output interface 2015. In one embodiment, as shown in
Application store 2050 may store one or more applications for execution by console 2020. An application may be a group of instructions that, when executed by a processor, generate content for presentation to the user. Content generated by an application may be in response to inputs received from the user via movement of HMD 2005 and/or input/output interface 2015. Examples of applications include gaming applications, conferencing applications, video playback applications, and/or other suitable applications.
Tracking module 2055 may calibrate varifocal system 2000 using one or more calibration parameters and may adjust one or more calibration parameters to reduce error in determining position of HMD 2005. For example, tracking module 2055 may adjust the focus of imaging device 2010 to obtain a more accurate position for observed locators 230 on HMD 2005. Moreover, calibration performed by tracking module 2055 may also account for information received from IMU 226. Additionally, when tracking of HMD 2005 is lost (e.g., imaging device 2010 loses line of sight of at least a threshold number of locators 230), tracking module 2055 may re-calibrate some or all of varifocal system 2000 components.
Additionally, tracking module 2055 may track the movement of HMD 2005 using slow calibration information from imaging device 2010, and determine positions of a reference point on HMD 2005 using observed locators from the slow calibration information and a model of HMD 2005. Tracking module 2055 may also determine positions of the reference point on HMD 2005 using position information from the fast calibration information from IMU 226 on HMD 2005. Additionally, tracking module 2055 may use portions of the fast calibration information, the slow calibration information, or some combination thereof, to predict a future location of HMD 2005, which is provided to virtual reality engine 2045.
Virtual reality engine 2045 may function as a controller to execute applications within varifocal system 2000 and may receive position information, acceleration information, velocity information, predicted future positions, or some combination thereof for HMD 2005 from tracking module 2055. Based on the received information, virtual reality engine 2045 may determine content to provide to HMD 2005 for presentation to the user, such as a virtual scene, one or more virtual objects to overlay onto a real-world scene, etc. In some embodiments, virtual reality engine 2045 may maintain focal capability information of varifocal block 232. Focal capability information is information that describes what focal distances are available to varifocal block 232. Focal capability information may include, e.g., a range of focus that varifocal block 232 is able to accommodate (e.g., 0 to 4 diopters) and/or combinations of settings for each activated LC lens that map to particular focal planes. In some examples, virtual reality engine 2045 may operate a GRIN LC lens(es) of varifocal block 232 by controlling voltages applied to driving electrodes and/or common electrodes of the GRIN LC lens(es).
Virtual reality engine 2045 may provide information to varifocal block 232, such as the accommodation and/or convergence parameters including what focal distances are available to varifocal block 232. Virtual reality engine 2045 may generate instructions for varifocal block 232 that cause varifocal block 232 to adjust its focal distance to a particular location. Virtual reality engine 2045 may generate the instructions based on focal capability information and, e.g., information from vergence processing module 2030, IMU 226, and head tracking sensors 2035, and provide the instructions to varifocal block 232 to configure and/or adjust the adaptive lens assembly. Virtual reality engine 2045 may use the information from vergence processing module 2030, IMU 226, and/or head tracking sensors 2035 to select a focal plane to present content to the user. Additionally, virtual reality engine 2045 may perform an action within an application executing on console 2020 in response to an action request received from input/output interface 2015 and may provide feedback to the user that the action was performed. The provided feedback may, for example, include visual and/or audible feedback via HMD 2005 and/or haptic feedback via input/output interface 2015.
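The mapping from a vergence estimate to an available focal plane described above might be sketched as follows. The focal-plane set and function names are hypothetical, chosen only to match the 0 to 4 diopter example:

```python
def select_focal_plane(vergence_distance_m, focal_planes_diopters):
    """Pick the available focal plane nearest the accommodation demand.
    Demand in diopters is the reciprocal of the vergence distance."""
    demand = 1.0 / max(vergence_distance_m, 1e-6)  # guard tiny distances
    return min(focal_planes_diopters, key=lambda d: abs(d - demand))

# Hypothetical focal capability information: discrete planes over 0-4 D.
planes = [0.0, 1.0, 2.0, 3.0, 4.0]
print(select_focal_plane(0.55, planes))  # ~1.82 D demand -> 2.0
print(select_focal_plane(0.1, planes))   # 10 D demand -> saturates at 4.0
```

A demand outside the block's range simply saturates at the nearest available plane, mirroring the idea that the varifocal block can only adjust within its focal capability range.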
Embodiments of the present disclosure may include or be implemented in conjunction with various types of artificial-reality systems. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, for example, a virtual reality, an augmented reality, a mixed reality, a hybrid reality, or some combination and/or derivative thereof. Artificial-reality content may include completely computer-generated content or computer-generated content combined with captured (e.g., real-world) content. The artificial-reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional (3D) effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, for example, create content in an artificial reality and/or are otherwise used in (e.g., to perform activities in) an artificial reality.
Artificial-reality systems may be implemented in a variety of different form factors and configurations. Some artificial-reality systems may be designed to work without near-eye displays (NEDs). Other artificial-reality systems may include an NED that also provides visibility into the real world (such as, e.g., augmented-reality system 2100 in
Turning to
In some embodiments, augmented-reality system 2100 may include one or more sensors, such as sensor 2140. Sensor 2140 may generate measurement signals in response to motion of augmented-reality system 2100 and may be located on substantially any portion of frame 2110. Sensor 2140 may represent one or more of a variety of different sensing mechanisms, such as a position sensor, an inertial measurement unit (IMU), a depth camera assembly, a structured light emitter and/or detector, or any combination thereof. In some embodiments, augmented-reality system 2100 may or may not include sensor 2140 or may include more than one sensor. In embodiments in which sensor 2140 includes an IMU, the IMU may generate calibration data based on measurement signals from sensor 2140. Examples of sensor 2140 may include, without limitation, accelerometers, gyroscopes, magnetometers, other suitable types of sensors that detect motion, sensors used for error correction of the IMU, or some combination thereof.
In some examples, augmented-reality system 2100 may also include a microphone array with a plurality of acoustic transducers 2120(A)-2120(J), referred to collectively as acoustic transducers 2120. Acoustic transducers 2120 may represent transducers that detect air pressure variations induced by sound waves. Each acoustic transducer 2120 may be configured to detect sound and convert the detected sound into an electronic format (e.g., an analog or digital format). The microphone array in
In some embodiments, one or more of acoustic transducers 2120(A)-(J) may be used as output transducers (e.g., speakers). For example, acoustic transducers 2120(A) and/or 2120(B) may be earbuds or any other suitable type of headphone or speaker.
The configuration of acoustic transducers 2120 of the microphone array may vary. While augmented-reality system 2100 is shown in
Acoustic transducers 2120(A) and 2120(B) may be positioned on different parts of the user's ear, such as behind the pinna, behind the tragus, and/or within the auricle or fossa. Or, there may be additional acoustic transducers 2120 on or surrounding the ear in addition to acoustic transducers 2120 inside the ear canal. Having an acoustic transducer 2120 positioned next to an ear canal of a user may enable the microphone array to collect information on how sounds arrive at the ear canal. By positioning at least two of acoustic transducers 2120 on either side of a user's head (e.g., as binaural microphones), augmented-reality system 2100 may simulate binaural hearing and capture a 3D stereo sound field around a user's head. In some embodiments, acoustic transducers 2120(A) and 2120(B) may be connected to augmented-reality system 2100 via a wired connection 2130, and in other embodiments acoustic transducers 2120(A) and 2120(B) may be connected to augmented-reality system 2100 via a wireless connection (e.g., a BLUETOOTH connection). In still other embodiments, acoustic transducers 2120(A) and 2120(B) may not be used at all in conjunction with augmented-reality system 2100.
Acoustic transducers 2120 on frame 2110 may be positioned in a variety of different ways, including along the length of the temples, across the bridge, above or below display devices 2115(A) and 2115(B), or some combination thereof. Acoustic transducers 2120 may also be oriented such that the microphone array is able to detect sounds in a wide range of directions surrounding the user wearing the augmented-reality system 2100. In some embodiments, an optimization process may be performed during manufacturing of augmented-reality system 2100 to determine relative positioning of each acoustic transducer 2120 in the microphone array.
In some examples, augmented-reality system 2100 may include or be connected to an external device (e.g., a paired device), such as neckband 2105. Neckband 2105 generally represents any type or form of paired device. Thus, the following discussion of neckband 2105 may also apply to various other paired devices, such as charging cases, smart watches, smart phones, wrist bands, other wearable devices, hand-held controllers, tablet computers, laptop computers, other external compute devices, etc.
As shown, neckband 2105 may be coupled to eyewear device 2102 via one or more connectors. The connectors may be wired or wireless and may include electrical and/or non-electrical (e.g., structural) components. In some cases, eyewear device 2102 and neckband 2105 may operate independently without any wired or wireless connection between them. While
Pairing external devices, such as neckband 2105, with augmented-reality eyewear devices may enable the eyewear devices to achieve the form factor of a pair of glasses while still providing sufficient battery and computation power for expanded capabilities. Some or all of the battery power, computational resources, and/or additional features of augmented-reality system 2100 may be provided by a paired device or shared between a paired device and an eyewear device, thus reducing the weight, heat profile, and form factor of the eyewear device overall while still retaining desired functionality. For example, neckband 2105 may allow components that would otherwise be included on an eyewear device to be included in neckband 2105 since users may tolerate a heavier weight load on their shoulders than they would tolerate on their heads. Neckband 2105 may also have a larger surface area over which to diffuse and disperse heat to the ambient environment. Thus, neckband 2105 may allow for greater battery and computation capacity than might otherwise have been possible on a stand-alone eyewear device. Since weight carried in neckband 2105 may be less invasive to a user than weight carried in eyewear device 2102, a user may tolerate wearing a lighter eyewear device and carrying or wearing the paired device for greater lengths of time than a user would tolerate wearing a heavy standalone eyewear device, thereby enabling users to more fully incorporate artificial-reality environments into their day-to-day activities.
Neckband 2105 may be communicatively coupled with eyewear device 2102 and/or to other devices. These other devices may provide certain functions (e.g., tracking, localizing, depth mapping, processing, storage, etc.) to augmented-reality system 2100. In the embodiment of
Acoustic transducers 2120(I) and 2120(J) of neckband 2105 may be configured to detect sound and convert the detected sound into an electronic format (analog or digital). In the embodiment of
Controller 2125 of neckband 2105 may process information generated by the sensors on neckband 2105 and/or augmented-reality system 2100. For example, controller 2125 may process information from the microphone array that describes sounds detected by the microphone array. For each detected sound, controller 2125 may perform a direction-of-arrival (DOA) estimation to estimate a direction from which the detected sound arrived at the microphone array. As the microphone array detects sounds, controller 2125 may populate an audio data set with the information. In embodiments in which augmented-reality system 2100 includes an inertial measurement unit, controller 2125 may compute all inertial and spatial calculations from the IMU located on eyewear device 2102. A connector may convey information between augmented-reality system 2100 and neckband 2105 and between augmented-reality system 2100 and controller 2125. The information may be in the form of optical data, electrical data, wireless data, or any other transmittable data form. Moving the processing of information generated by augmented-reality system 2100 to neckband 2105 may reduce weight and heat in eyewear device 2102, making it more comfortable to the user.
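As a simplified illustration of direction-of-arrival estimation, the far-field two-microphone model relates inter-microphone delay to angle by tau = d*sin(theta)/c and can be inverted directly. Practical DOA estimation over a full array is considerably more involved; the values here are hypothetical:

```python
import math

def doa_from_delay(tau_s, mic_spacing_m, c=343.0):
    """Direction of arrival (degrees from broadside) for a far-field
    source and a two-microphone pair, from the measured delay tau."""
    s = max(-1.0, min(1.0, c * tau_s / mic_spacing_m))  # clamp for asin
    return math.degrees(math.asin(s))

# A source 30 degrees off broadside with 20 cm microphone spacing
# produces a delay of d*sin(30 deg)/c (~0.29 ms); invert to recover it.
tau = 0.2 * math.sin(math.radians(30)) / 343.0
print(round(doa_from_delay(tau, 0.2), 1))  # 30.0
```

Estimating the delay itself (e.g., by cross-correlating the two microphone signals) is the harder part in practice; the geometry above only converts a known delay into an angle.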
Power source 2135 in neckband 2105 may provide power to eyewear device 2102 and/or to neckband 2105. Power source 2135 may include, without limitation, lithium ion batteries, lithium-polymer batteries, primary lithium batteries, alkaline batteries, or any other form of power storage. In some cases, power source 2135 may be a wired power source. Including power source 2135 on neckband 2105 instead of on eyewear device 2102 may help better distribute the weight and heat generated by power source 2135.
As noted, some artificial-reality systems may, instead of blending an artificial reality with actual reality, substantially replace one or more of a user's sensory perceptions of the real world with a virtual experience. One example of this type of system is a head-worn display system, such as virtual-reality system 2200 in
Artificial-reality systems may include a variety of types of visual feedback mechanisms. For example, display devices in augmented-reality system 2100 and/or virtual-reality system 2200 may include one or more liquid crystal displays (LCDs), light emitting diode (LED) displays, microLED displays, organic LED (OLED) displays, digital light projector (DLP) micro-displays, liquid crystal on silicon (LCoS) micro-displays, and/or any other suitable type of display screen. These artificial-reality systems may include a single display screen for both eyes or may provide a display screen for each eye, which may allow for additional flexibility for varifocal adjustments or for correcting a user's refractive error. Some of these artificial-reality systems may also include optical subsystems having one or more lenses (e.g., concave or convex lenses, Fresnel lenses, adjustable liquid lenses, etc.) through which a user may view a display screen. These optical subsystems may serve a variety of purposes, including to collimate (e.g., make an object appear at a greater distance than its physical distance), to magnify (e.g., make an object appear larger than its actual size), and/or to relay light (to, e.g., the viewer's eyes). These optical subsystems may be used in a non-pupil-forming architecture (such as a single lens configuration that directly collimates light but results in so-called pincushion distortion) and/or a pupil-forming architecture (such as a multi-lens configuration that produces so-called barrel distortion to nullify pincushion distortion).
In addition to or instead of using display screens, some of the artificial-reality systems described herein may include one or more projection systems. For example, display devices in augmented-reality system 2100 and/or virtual-reality system 2200 may include micro-LED projectors that project light (using, e.g., a waveguide) into display devices, such as clear combiner lenses that allow ambient light to pass through. The display devices may refract the projected light toward a user's pupil and may enable a user to simultaneously view both artificial-reality content and the real world. The display devices may accomplish this using any of a variety of different optical components, including waveguide components (e.g., holographic, planar, diffractive, polarized, and/or reflective waveguide elements), light-manipulation surfaces and elements (such as diffractive, reflective, and refractive elements and gratings), coupling elements, etc. Artificial-reality systems may also be configured with any other suitable type or form of image projection system, such as retinal projectors used in virtual retina displays.
The artificial-reality systems described herein may also include various types of computer vision components and subsystems. For example, augmented-reality system 2100 and/or virtual-reality system 2200 may include one or more optical sensors, such as two-dimensional (2D) or 3D cameras, structured light transmitters and detectors, time-of-flight depth sensors, single-beam or sweeping laser rangefinders, 3D LiDAR sensors, and/or any other suitable type or form of optical sensor. An artificial-reality system may process data from one or more of these sensors to identify a location of a user, to map the real world, to provide a user with context about real-world surroundings, and/or to perform a variety of other functions.
The artificial-reality systems described herein may also include one or more input and/or output audio transducers. Output audio transducers may include voice coil speakers, ribbon speakers, electrostatic speakers, piezoelectric speakers, bone conduction transducers, cartilage conduction transducers, tragus-vibration transducers, and/or any other suitable type or form of audio transducer. Similarly, input audio transducers may include condenser microphones, dynamic microphones, ribbon microphones, and/or any other type or form of input transducer. In some embodiments, a single transducer may be used for both audio input and audio output.
In some embodiments, the artificial-reality systems described herein may also include tactile (i.e., haptic) feedback systems, which may be incorporated into headwear, gloves, body suits, handheld controllers, environmental devices (e.g., chairs, floormats, etc.), and/or any other type of device or system. Haptic feedback systems may provide various types of cutaneous feedback, including vibration, force, traction, texture, and/or temperature. Haptic feedback systems may also provide various types of kinesthetic feedback, such as motion and compliance. Haptic feedback may be implemented using motors, piezoelectric actuators, fluidic systems, and/or a variety of other types of feedback mechanisms. Haptic feedback systems may be implemented independent of other artificial-reality devices, within other artificial-reality devices, and/or in conjunction with other artificial-reality devices.
By providing haptic sensations, audible content, and/or visual content, artificial-reality systems may create an entire virtual experience or enhance a user's real-world experience in a variety of contexts and environments. For instance, artificial-reality systems may assist or extend a user's perception, memory, or cognition within a particular environment. Some systems may enhance a user's interactions with other people in the real world or may enable more immersive interactions with other people in a virtual world. Artificial-reality systems may also be used for educational purposes (e.g., for teaching or training in schools, hospitals, government organizations, military organizations, business enterprises, etc.), entertainment purposes (e.g., for playing video games, listening to music, watching video content, etc.), and/or for accessibility purposes (e.g., as hearing aids, visual aids, etc.). The embodiments disclosed herein may enable or enhance a user's artificial-reality experience in one or more of these contexts and environments and/or in other contexts and environments.
In some embodiments, the systems described herein may also include an eye-tracking subsystem designed to identify and track various characteristics of a user's eye(s), such as the user's gaze direction. The phrase “eye tracking” may, in some examples, refer to a process by which the position, orientation, and/or motion of an eye is measured, detected, sensed, determined, and/or monitored. The disclosed systems may measure the position, orientation, and/or motion of an eye in a variety of different ways, including through the use of various optical-based eye-tracking techniques, ultrasound-based eye-tracking techniques, etc. An eye-tracking subsystem may be configured in a number of different ways and may include a variety of different eye-tracking hardware components or other computer-vision components. For example, an eye-tracking subsystem may include a variety of different optical sensors, such as two-dimensional (2D) or 3D cameras, time-of-flight depth sensors, single-beam or sweeping laser rangefinders, 3D LiDAR sensors, and/or any other suitable type or form of optical sensor. In this example, a processing subsystem may process data from one or more of these sensors to measure, detect, determine, and/or otherwise monitor the position, orientation, and/or motion of the user's eye(s).
As noted, the eye-tracking systems or subsystems disclosed herein may track a user's eye position and/or eye movement in a variety of ways. In one example, one or more light sources and/or optical sensors may capture an image of the user's eyes. The eye-tracking subsystem may then use the captured information to determine the user's inter-pupillary distance, interocular distance, and/or a 3D position of each eye (e.g., for distortion adjustment purposes), including a magnitude of torsion and rotation (i.e., roll, pitch, and yaw) and/or gaze directions for each eye. In one example, infrared light may be emitted by the eye-tracking subsystem and reflected from each eye. The reflected light may be received or detected by an optical sensor and analyzed to extract eye rotation data from changes in the infrared light reflected by each eye.
The eye-tracking subsystem may use any of a variety of different methods to track the eyes of a user. For example, a light source (e.g., infrared light-emitting diodes) may emit a dot pattern onto each eye of the user. The eye-tracking subsystem may then detect (e.g., via an optical sensor coupled to the artificial reality system) and analyze a reflection of the dot pattern from each eye of the user to identify a location of each pupil of the user. Accordingly, the eye-tracking subsystem may track up to six degrees of freedom of each eye (i.e., 3D position, roll, pitch, and yaw), and at least a subset of the tracked quantities from the user's two eyes may be combined to estimate a gaze point (i.e., a 3D location or position in a virtual scene where the user is looking) and/or an IPD.
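As a simplified illustration of the reflection-analysis step, the bright glints produced by an emitted dot pattern can be localized in a camera frame and averaged to approximate a pupil location. This is a hypothetical sketch (the function name, threshold, and centroid approach are assumptions; practical systems also fit ellipses to the pupil and model corneal geometry):

```python
import numpy as np

def pupil_center_from_glints(image, threshold=0.8):
    """Approximate a pupil location by averaging the pixel positions of
    bright glints (reflections of an IR dot pattern) in a normalized
    grayscale frame. Returns (x, y) or None if no glints are found."""
    ys, xs = np.nonzero(image >= threshold)
    if len(xs) == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 8x8 frame with two bright glints at (x=2, y=3) and (x=6, y=3).
frame = np.zeros((8, 8))
frame[3, 2] = 1.0
frame[3, 6] = 1.0
print(pupil_center_from_glints(frame))  # (4.0, 3.0)
```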
In some cases, the distance between a user's pupil and a display may change as the user's eye moves to look in different directions. This varying distance, which may be referred to as "pupil swim," may contribute to distortion perceived by the user because light focuses in different locations as the distance between the pupil and the display changes. Accordingly, measuring distortion at different eye positions and pupil distances relative to the display, and generating a distortion correction for each, may allow pupil-swim distortion to be mitigated: the system may track the 3D position of each of the user's eyes and apply the distortion correction corresponding to that 3D position at a given point in time. Furthermore, as noted above, knowing the position of each of the user's eyes may also enable the eye-tracking subsystem to make automated adjustments for a user's IPD.
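The per-position correction lookup described above can be sketched as a calibration table keyed by 3D eye position. The table contents, key format, and nearest-neighbor selection below are hypothetical (a real system would interpolate over a dense calibration grid and store full distortion meshes rather than a single coefficient):

```python
import numpy as np

# Hypothetical calibration: 3D eye position (mm) -> radial distortion coefficient.
calibration = {
    (0.0, 0.0, 12.0): 0.10,   # eye centered, 12 mm eye relief
    (2.0, 0.0, 12.0): 0.13,   # eye shifted temporally
    (0.0, 0.0, 14.0): 0.08,   # eye farther from the lens
}

def correction_for_eye_position(eye_pos):
    """Select the distortion correction calibrated nearest to the tracked
    3D eye position (nearest-neighbor sketch)."""
    positions = np.array(list(calibration.keys()))
    dists = np.linalg.norm(positions - np.asarray(eye_pos), axis=1)
    nearest = tuple(positions[np.argmin(dists)])
    return calibration[nearest]

print(correction_for_eye_position((1.8, 0.1, 12.2)))  # 0.13
```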
In some embodiments, a display subsystem may include a variety of additional subsystems that may work in conjunction with the eye-tracking subsystems described herein. For example, a display subsystem may include a varifocal subsystem, a scene-rendering module, and/or a vergence-processing module. The varifocal subsystem may cause left and right display elements to vary the focal distance of the display device. In one embodiment, the varifocal subsystem may physically change the distance between a display and the optics through which it is viewed by moving the display, the optics, or both. Additionally, moving or translating two lenses relative to each other may also be used to change the focal distance of the display. Thus, the varifocal subsystem may include actuators or motors that move displays and/or optics to change the distance between them. This varifocal subsystem may be separate from or integrated into the display subsystem. The varifocal subsystem may also be integrated into or separate from its actuation subsystem and/or the eye-tracking subsystems described herein.
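The effect of translating two lenses relative to each other can be illustrated with the standard two-thin-lens formula, 1/f = 1/f1 + 1/f2 - d/(f1*f2): changing the separation d changes the combined focal length, which is the principle a varifocal actuator exploits. The focal lengths below are illustrative values, not parameters of the disclosed system:

```python
def combined_focal_length(f1_mm, f2_mm, separation_mm):
    """Effective focal length of two thin lenses separated by distance d,
    using 1/f = 1/f1 + 1/f2 - d/(f1*f2)."""
    return 1.0 / (1.0 / f1_mm + 1.0 / f2_mm - separation_mm / (f1_mm * f2_mm))

print(round(combined_focal_length(50.0, 50.0, 0.0), 1))   # 25.0
print(round(combined_focal_length(50.0, 50.0, 10.0), 1))  # 27.8
```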
In one example, the display subsystem may include a vergence-processing module configured to determine a vergence depth of a user's gaze based on a gaze point and/or an estimated intersection of the gaze lines determined by the eye-tracking subsystem. Vergence may refer to the simultaneous movement or rotation of both eyes in opposite directions to maintain single binocular vision, which may be naturally and automatically performed by the human eye. Thus, a location where a user's eyes are verged is where the user is looking and is also typically the location where the user's eyes are focused. For example, the vergence-processing module may triangulate gaze lines to estimate a distance or depth from the user associated with intersection of the gaze lines. The depth associated with intersection of the gaze lines may then be used as an approximation for the accommodation distance, i.e., the distance from the user at which the user's eyes are directed. Thus, the vergence distance may indicate both where and at what depth the user's eyes are focused, thereby providing information (such as an object or plane of focus) for rendering adjustments to the virtual scene.
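The triangulation step can be illustrated with simple 2D geometry in the horizontal plane: for a point at depth d, the inward rotation angles of the two eyes satisfy tan(theta_L) + tan(theta_R) = IPD/d. The function name and 2D simplification below are hypothetical (a full system would intersect 3D gaze rays):

```python
import math

def vergence_depth(ipd_m, left_yaw_rad, right_yaw_rad):
    """Triangulate the depth at which two gaze lines cross, given each
    eye's yaw from straight ahead (positive toward the nose)."""
    total = math.tan(left_yaw_rad) + math.tan(right_yaw_rad)
    if total <= 0:
        return math.inf  # parallel or diverging gaze: effectively at infinity
    return ipd_m / total

# A 64 mm IPD verging on a midline point 1 m away: each eye rotates
# inward by atan(0.032 / 1.0).
theta = math.atan(0.032 / 1.0)
print(round(vergence_depth(0.064, theta, theta), 3))  # 1.0
```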
The vergence-processing module may coordinate with the eye-tracking subsystems described herein to make adjustments to the display subsystem to account for a user's vergence depth. When the user is focused on something at a distance, the user's pupils may be slightly farther apart than when the user is focused on something close. The eye-tracking subsystem may obtain information about the user's vergence or focus depth, and the display subsystem may move its display elements closer together when the user's eyes focus or verge on something close and farther apart when the user's eyes focus or verge on something at a distance.
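The pupil-separation effect can be approximated with a small geometric sketch: as each eye rotates inward to verge on a near point, its pupil translates inward along the eyeball surface, so the effective pupil separation shrinks slightly. The function, eye-radius value, and small-angle model below are illustrative assumptions only:

```python
import math

def display_separation_mm(ipd_mm, vergence_depth_m, eye_radius_mm=12.0):
    """Approximate display-element separation for a given vergence depth:
    each pupil moves inward by roughly r * sin(theta) as the eye rotates
    by theta to verge on a midline point."""
    half_angle = math.atan((ipd_mm / 2000.0) / vergence_depth_m)  # radians
    inset = eye_radius_mm * math.sin(half_angle)
    return ipd_mm - 2.0 * inset

# Near focus (0.5 m) calls for slightly closer elements than far focus.
print(round(display_separation_mm(64.0, 0.5), 1))    # 62.5
print(round(display_separation_mm(64.0, 100.0), 1))  # 64.0
```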
The eye-tracking information generated by the above-described eye-tracking subsystems may also be used, for example, to modify various aspects of how different computer-generated images are presented. For example, a display subsystem may be configured to modify, based on information generated by an eye-tracking subsystem, at least one aspect of how the computer-generated images are presented. For instance, the computer-generated images may be modified based on the user's eye movement, such that if a user is looking up, the computer-generated images may be moved upward on the screen. Similarly, if the user is looking to the side or down, the computer-generated images may be moved to the side or downward on the screen. If the user's eyes are closed, the computer-generated images may be paused or removed from the display and resumed once the user's eyes are open again.
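The behavior described above can be sketched as a small state-update routine. The state dictionary, step size, and screen-coordinate convention (y increases downward, so looking up decreases y) are hypothetical choices made for illustration:

```python
def update_overlay(state, gaze_direction, eyes_open, step_px=20):
    """Move, pause, or resume computer-generated content based on
    eye-tracking output: pause when the eyes are closed, otherwise
    shift the content in the gaze direction."""
    if not eyes_open:
        state["paused"] = True
        return state
    state["paused"] = False
    dx, dy = {"up": (0, -step_px), "down": (0, step_px),
              "left": (-step_px, 0), "right": (step_px, 0)}.get(
                  gaze_direction, (0, 0))
    state["x"] += dx
    state["y"] += dy
    return state

s = {"x": 0, "y": 0, "paused": False}
update_overlay(s, "up", True)
print(s)  # {'x': 0, 'y': -20, 'paused': False}
```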
The above-described eye-tracking subsystems can be incorporated into one or more of the various artificial reality systems described herein in a variety of ways. For example, one or more eye-tracking system components may be incorporated into augmented-reality system 2100 in
The process parameters and sequence of the steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps beyond those disclosed.
The preceding description has been provided to enable others skilled in the art to best utilize various aspects of the exemplary embodiments disclosed herein. This exemplary description is not intended to be exhaustive or to be limited to any precise form disclosed. Many modifications and variations are possible without departing from the spirit and scope of the present disclosure. The embodiments disclosed herein should be considered in all respects illustrative and not restrictive. Reference should be made to any claims appended hereto and their equivalents in determining the scope of the present disclosure.
Unless otherwise noted, the terms “connected to” and “coupled to” (and their derivatives), as used in the specification and/or claims, are to be construed as permitting both direct and indirect (i.e., via other elements or components) connection. In addition, the terms “a” or “an,” as used in the specification and/or claims, are to be construed as meaning “at least one of.” Finally, for ease of use, the terms “including” and “having” (and their derivatives), as used in the specification and/or claims, are interchangeable with and have the same meaning as the word “comprising.”
This application claims the benefit of and priority to U.S. Provisional Patent Application No. 63/483,176, filed 3 Feb. 2023, and titled GRADIENT-INDEX LIQUID CRYSTAL LENS HAVING CURVED SURFACE SHAPE, the disclosure of which is incorporated, in its entirety, by this reference.
Number | Date | Country
---|---|---
63483176 | Feb 2023 | US