The present disclosure relates to visual display devices and related components, modules, and methods.
Visual displays provide information to viewers, including still images, video, and data. Visual displays have applications in diverse fields including entertainment, education, engineering, science, professional training, and advertising, to name just a few examples. Some visual displays, such as TV sets, display images to several users at a time, while some visual display systems, such as near-eye displays (NEDs), are intended for individual users.
An artificial reality system generally includes an NED (e.g., a headset or a pair of glasses) configured to present content to a user. The near-eye display may display virtual objects or combine images of real objects with virtual objects, as in virtual reality (VR), augmented reality (AR), or mixed reality (MR) applications. For example, in an AR system, a user may view images of virtual objects (e.g., computer-generated images (CGIs)) superimposed with the surrounding environment by seeing through a “combiner” component. The combiner of a wearable display is typically transparent to external light but includes some light routing optics to direct the display light into the user's field of view.
Human sight has a rather wide overall field of view (FOV). For an AR/VR system to mimic human sight, the operational field of view needs to approach that of human vision. A straightforward approach to achieving the full vision FOV would require wide numerical aperture optics and large pixel counts, which increase the size and weight of a display, complicate its processing electronics, and increase its power demands. Because an HMD or NED is usually worn on the head of a user, a large, bulky, unbalanced, and/or heavy display device with heavy electro-optical modules and a heavy battery would be cumbersome and uncomfortable for the user to wear. Consequently, head-mounted display devices can benefit from a compact and efficient configuration enabling FOV expansion up to natural FOV limits.
Exemplary embodiments will now be described in conjunction with the drawings, in which:
While the present teachings are described in conjunction with various embodiments and examples, it is not intended that the present teachings be limited to such embodiments. On the contrary, the present teachings encompass various alternatives and equivalents, as will be appreciated by those of skill in the art. All statements herein reciting principles, aspects, and embodiments of this disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
As used herein, the terms “first”, “second”, and so forth are not intended to imply sequential ordering, but rather are intended to distinguish one element from another, unless explicitly stated. Similarly, sequential ordering of method steps does not imply a sequential order of their execution, unless explicitly stated.
In accordance with this disclosure, a field of view (FOV) of a near-eye display using a pupil-replicating lightguide may be expanded by providing a beam redirector downstream of the lightguide for controllably redirecting image light out-coupled by the lightguide. Such a configuration allows different FOV portions to be displayed at different moments of time, expanding the overall FOV by time sequencing through different FOV portions.
In accordance with the present disclosure, there is provided a lightguide assembly for a near-eye display. The lightguide assembly comprises a lightguide body for receiving and propagating image light carrying an image in angular domain to be displayed by the near-eye display, and a beam redirector downstream of the lightguide body. The lightguide body includes an out-coupling structure for out-coupling spaced apart portions of the image light. The beam redirector is configured for switchably redirecting the image light portions out-coupled by the out-coupling structure.
In embodiments where the beam redirector comprises a Pancharatnam-Berry phase (PBP) grating, the latter may be a passive PBP grating, or an active PBP grating comprising a liquid crystal (LC) layer switchable by application of an electric field to the LC layer. The beam redirector may further include a switchable polarization rotator upstream of the PBP grating for switching polarization of the image light portions between two mutually orthogonal polarization states. The switchable polarization rotator may include a switchable waveplate, which may be based on liquid crystals. In some embodiments, the out-coupling structure may include an out-coupling grating, e.g. a polarization volume hologram (PVH) grating, and/or a plurality of slanted partial bulk reflectors.
In accordance with the present disclosure, there is provided a near-eye display (NED) comprising an image projector for providing first and second portions of an image in angular domain to be displayed by the near-eye display, and a lightguide assembly described above. The NED may further include a controller operably coupled to the image projector and the beam redirector. The controller may be configured to operate as follows. During a first time interval, the controller may cause the image projector to display the first portion of the image in angular domain, and cause the first beam redirector to redirect the image light portions by a first angle. During a second, subsequent time interval, the controller may cause the image projector to display the second portion of the image in angular domain, and cause the first beam redirector to redirect the image light portions by a second, different angle.
In embodiments where the first image portion corresponds to a first field of view (FOV) portion of the image, and the second image portion corresponds to a second, adjacent FOV portion of the image, a difference between the first and second angles of the first beam redirector may be such that the first and second FOV portions partially overlap with one another. In some embodiments, the NED may further include an additional beam redirector in a path of external light upstream of the lightguide assembly, for controllably redirecting the external light to offset a redirection of the external light by the first beam redirector.
The controller may be configured to do the following. During a first time interval, the controller may cause the image projector to display the first portion of the image in angular domain and cause the first beam redirector to redirect the image light portions by a first angle. During a second, subsequent time interval, the controller may cause the image projector to display the second portion of the image in angular domain and cause the first beam redirector to redirect the image light portions by a second, different angle. The controller may be configured to cause the additional (i.e. the second) beam redirector to offset the redirection of the external light by the first beam redirector during both the first and the second time intervals.
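The two-interval control sequence described above may be sketched, purely for illustration, as follows. The class names, method names, and angle values are hypothetical and are not part of the disclosure; a real controller would drive the projector and the redirector hardware rather than plain objects:

```python
from dataclasses import dataclass

@dataclass
class Projector:
    """Stand-in for the image projector; records the displayed FOV portion."""
    displayed: str = ""
    def display(self, portion: str) -> None:
        self.displayed = portion

@dataclass
class Redirector:
    """Stand-in for a switchable beam redirector; records its deflection angle."""
    angle_deg: float = 0.0
    def redirect(self, angle_deg: float) -> None:
        self.angle_deg = angle_deg

class Controller:
    """Coordinates the projector, the main redirector, and (optionally)
    the matching world-side redirector during each time interval."""
    def __init__(self, projector, main_redirector, matching_redirector=None):
        self.projector = projector
        self.main = main_redirector
        self.matching = matching_redirector

    def run_interval(self, portion: str, angle_deg: float) -> None:
        # Display one FOV portion while deflecting the out-coupled image light.
        self.projector.display(portion)
        self.main.redirect(angle_deg)
        if self.matching is not None:
            # Offset the shift the main redirector would impose on external light.
            self.matching.redirect(-angle_deg)

controller = Controller(Projector(), Redirector(), Redirector())
controller.run_interval("first FOV portion", -10.0)   # first time interval
controller.run_interval("second FOV portion", +10.0)  # second, subsequent interval
```

In this sketch the matching redirector is always driven to the opposite angle, so the net deflection seen by external light is zero during both intervals.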
In accordance with the present disclosure, there is further provided a method for displaying an image to a user. The method comprises the following: during a first time interval, causing an image projector to emit image light carrying a first portion of the image, and causing a first beam redirector to redirect the image light by a first angle; and during a second, subsequent time interval, causing the image projector to emit image light carrying a second, different portion of the image, and causing the first beam redirector to redirect the image light by a second, different angle. The first image portion may correspond to a first field of view (FOV) portion of the image, and the second image portion may correspond to a second, adjacent FOV portion of the image. A difference between the first and second angles of the first beam redirector may be selected such that the first and second FOV portions partially overlap with one another. The method may further include using a second beam redirector upstream in a path of external light with respect to the first beam redirector for controllably redirecting the external light to offset a redirection of the external light by the first beam redirector, thereby avoiding splitting of the outside imagery observed through the near-eye display (NED).
Referring now to
The image light 108 is shown propagating in a straight line for simplicity and generality. In some implementations, the lightguide body 102 may propagate the image light 108 by a series of zigzag reflections from its outer surfaces. The lightguide body 102 has an out-coupling structure 104 for out-coupling spaced apart portions 106 of image light 108 propagating within the lightguide body 102. The portions 106 carry an image in angular domain.
A beam redirector 110 is disposed downstream of the lightguide body 102. The beam redirector 110 switchably redirects the image light portions 106, i.e. deflects all the image light portions 106 by one of a set of pre-defined switchable angles. By having the image projector 180 display different FOV portions in a time-sequential manner, the overall FOV of an image conveyed by the image light 108 propagating in the lightguide assembly 100 may be considerably expanded.
For example, during a first time interval, the image light 108 may carry a first FOV portion, or in other words a first portion of the image in angular domain, and the beam redirector 110 may redirect the image light portions 106 by an angle −α, as illustrated with dashed lines 111. During a second, subsequent time interval, the image light 108 may carry a second FOV portion, e.g. an adjacent FOV portion, and the beam redirector 110 may redirect the image light portions 106 by an angle +α, as illustrated with dotted lines 112. In this manner, the overall FOV may be expanded by the angle of 2α. The required coordination of operation of the image projector 180 and the beam redirector 110 may be provided by a controller 190 operably coupled to both and configured to perform the above steps during the first and second time intervals.
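The geometry of this time-sequential expansion can be checked with simple interval arithmetic: a portion spanning the angular range ±w/2, deflected by −α and then +α, yields an overall FOV of w + 2α with an overlap of w − 2α. The following sketch is illustrative only; the function name and the example values are not taken from the disclosure:

```python
def overall_fov(portion_fov_deg: float, alpha_deg: float):
    """Combine two identical FOV portions deflected by -alpha and +alpha.

    Returns (overall_fov, overlap) in degrees; assumes 2*alpha is smaller
    than portion_fov so the portions overlap rather than leaving a gap.
    """
    half = portion_fov_deg / 2.0
    first = (-half - alpha_deg, half - alpha_deg)    # portion deflected by -alpha
    second = (-half + alpha_deg, half + alpha_deg)   # portion deflected by +alpha
    overall = second[1] - first[0]                   # = portion_fov + 2*alpha
    overlap = first[1] - second[0]                   # = portion_fov - 2*alpha
    return overall, overlap

# e.g. 40-degree portions deflected by +/-10 degrees give a 60-degree
# overall FOV with a 20-degree overlap between the portions:
print(overall_fov(40.0, 10.0))  # (60.0, 20.0)
```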
The first and second FOV portions may overlap, to avoid gaps in the overall FOV, or in other words in the entire image in angular domain, carried by the image light 108. This is illustrated in
For augmented reality (AR) and similar applications where the external world remains at least partially visible to the NED wearer, the lightguide assembly 100/NED 170 may further include a second, matching beam redirector 110′ on the opposite side of the lightguide body 102, i.e. on the distal or “world” side of the lightguide body 102. The purpose of the matching beam redirector 110′ is to compensate or offset the shift that the beam redirector 110 would otherwise impose on the outside world view, which would cause the outside world to appear to “double” or even split into multiple overlapping images. The controller 190 may be configured to operate both beam redirectors 110, 110′ in a coordinated manner, such that one always compensates the other.
In the embodiment shown in
ϕ(x)=πx/T=πx sin θ/λo (1)
where T is pitch of the grating 200, and θ is a diffraction angle given by
θ=sin−1(λo/T) (2)
The azimuthal angle ϕ varies continuously across the surface of an LC layer 204 as illustrated in
P(x)=2ϕ(x)=2πx sin θ/λo (3)
when R=λo/2, R being the retardation of the LC layer 204.
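Equations (1) to (3) may be evaluated numerically as follows. The wavelength and pitch values below are illustrative only and are not taken from the disclosure:

```python
import math

def azimuthal_angle(x: float, pitch: float) -> float:
    """Eq. (1): phi(x) = pi * x / T, the LC director azimuthal angle."""
    return math.pi * x / pitch

def diffraction_angle(wavelength: float, pitch: float) -> float:
    """Eq. (2): theta = arcsin(lambda_o / T), in radians."""
    return math.asin(wavelength / pitch)

def pbp_phase(x: float, pitch: float) -> float:
    """Eq. (3): P(x) = 2 * phi(x), valid when the retardation R = lambda_o / 2."""
    return 2.0 * azimuthal_angle(x, pitch)

wavelength = 0.52e-6   # 520 nm green light, illustrative value
pitch = 2.0e-6         # 2 um grating pitch, illustrative value
theta = diffraction_angle(wavelength, pitch)

# The PBP phase accumulates by a full 2*pi over one grating period T:
assert abs(pbp_phase(pitch, pitch) - 2.0 * math.pi) < 1e-12
```

Note that Eq. (1) is consistent with Eq. (2): since sin θ = λo/T, the two forms πx/T and πx sin θ/λo are identical.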
The PBP LC grating 200 may be a passive device or an active device. The passive version of the PBP LC grating 200 may be preceded by an active waveplate or polarization rotator for switching the polarization of the impinging light. In the active PBP LC device, the orientation of the LC molecules may be controlled by an electric field applied to the LC layer 204. In a normal (undriven) state, the orientation of the LC molecules 202 is determined by an alignment layer, which has been illuminated with polarized light, typically UV light, such that the LC molecules 202 are aligned in accordance with the polarization direction of the UV light used to cure the alignment layer. The pattern of the UV light, used to obtain the required spatial distribution of polarization, can be generated by employing optical interference, for example. In the driven state, the LC molecules 202 are oriented almost perpendicular to the Z axis.
The operation of an active PBP LC device is illustrated in
Optical performance of a beam redirector based on PBP LC gratings is illustrated in
Optical performance of a beam redirector module based on active PBP LC grating(s) is illustrated in
In some embodiments, active PBP LC gratings may be combined with switchable waveplates and/or passive PBP LC gratings. Furthermore, PBP grating based redirectors may be assembled in binary stacks enabling several switchable beam deflection angles. By way of non-limiting examples, a stack of two PBP gratings may provide 4 switching angles, a stack of three PBP gratings may provide 8 switching angles, and so on. PVH gratings will be considered further below with reference to
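The binary-stack count of switchable deflection angles may be illustrated with a short enumeration. The function below is a hypothetical sketch using a small-angle sum as an approximation of the combined deflection; the deflection values are illustrative:

```python
from itertools import product

def stack_angles(grating_deflections_deg):
    """Enumerate net deflection angles of a binary stack of switchable
    PBP gratings, each toggled between +d and -d degrees.

    The small-angle sum of per-grating deflections is used here purely
    as an illustrative approximation of the combined deflection.
    """
    angles = set()
    for signs in product((-1, +1), repeat=len(grating_deflections_deg)):
        angles.add(sum(s * d for s, d in zip(signs, grating_deflections_deg)))
    return sorted(angles)

# A stack of two gratings provides 4 switchable angles, a stack of three
# provides 8, when the deflections are chosen so no combinations coincide:
print(stack_angles([5.0, 10.0]))        # [-15.0, -5.0, 5.0, 15.0]
print(len(stack_angles([5.0, 10.0, 20.0])))  # 8
```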
Referring now to
In operation, the PVH grating 504 out-couples only light of one handedness of polarization, in this example right-circular polarized (RCP) image light portions 506. The lightguide assembly 500 further includes a beam redirector module 510 disposed downstream of the lightguide body 502 for controllably redirecting the image light portions 506. In the embodiment shown, the beam redirector module 510 includes a switchable half-wave plate (sHWP) 511 that switches the polarization of the image light portions 506 between RCP light and an opposite polarization, left circular polarized (LCP). The beam redirector module 510 further includes a PBP grating 512 that redirects the image light portions 506 to the left for LCP light and to the right for RCP light, as illustrated, enabling the switching of FOV portions and thereby expanding the overall FOV as explained above with reference to
For augmented reality (AR) and similar applications of the NED 570, where the external world remains at least partially visible to the user of the NED 570, the lightguide assembly 500 may include a matching beam redirector 510′ on the opposite side of the lightguide body 502, i.e. on the distal or “world” side of the lightguide body 502. The purpose of the matching beam redirector 510′ is to compensate, offset, or undo the shift that the beam redirector 510 would otherwise impose on external light 508′, which would cause the outside scenery to appear to “double”. In the embodiment shown, the matching beam redirector 510′ includes a matching PBP grating 512′ that splits the external light into LCP and RCP light propagating at opposite angles relative to the direction of the impinging external light 508′, as illustrated. The matching PBP grating 512′ is followed by a matching switchable half-wave plate (sHWP) 511′ that switches the polarization of the external light 508′ from RCP to LCP and vice versa, in sync with the sHWP 511 disposed on the inner (proximal) side of the lightguide body 502. The controller 590 may be operably coupled to the 2D scanner 580 and both redirector modules 510, 510′ and configured to cause the matching beam redirector 510′ to offset the redirection of the external light 508′ by the main beam redirector 510 at any moment of time.
Turning to
Non-limiting examples of PVH gratings usable in lightguide assemblies of this disclosure will now be presented. Referring to
Boundary LC molecules 707b at the top surface 705 of the LC layer 704 may be oriented at an angle to the top surface 705. The boundary LC molecules 707b may have a spatially varying azimuthal angle, e.g. linearly varying along X-axis parallel to the top surface 705, as shown in
The boundary LC molecules 707b define relative phases of the helical structures 708 having the helical period p. The helical structures 708 form a volume grating comprising helical fringes 714 tilted at an angle ϕ, as shown in
The helical nature of the fringes 714 of the volume grating makes the PVH grating 700 preferentially responsive to light of a polarization having one particular handedness, e.g. left- or right-circular polarization, while being substantially non-responsive to light of the opposite handedness of polarization. Thus, the helical fringes 714 make the PVH grating 700 polarization-selective, causing the PVH grating 700 to diffract light of only one handedness of circular polarization. This is illustrated in
The polarization selectivity of the PVH grating 700 results from the effective refractive index of the grating being dependent on the relationship between the handedness, or chirality, of the impinging light beam and the handedness, or chirality, of the grating fringes 714. Changing the handedness of the impinging light may be used to switch the performance of the PVH grating 700. It is further noted that the sensitivity of the PVH grating 700 to right circular polarized light in particular is only meant as an illustrative example. When the handedness of the helical fringes 714 is reversed, the PVH grating 700 may be made sensitive to left circular polarized light. Thus, the operation of the PVH grating 700 may be controlled by controlling the polarization state of the impinging light beam 720. Furthermore, in some embodiments the PVH grating 700 may be made tunable by application of an electric field across the LC layer 704, which distorts or erases the periodic helical structures 708.
Turning to
Still referring to
In some embodiments, the method 800 may further include using a second beam redirector upstream in a path of external light with respect to the first beam redirector such as, for example, the matching beam redirector 110′ of
Referring now to
The purpose of the eye-tracking cameras 904 is to determine the position and/or orientation of both eyes of the user. The eyebox illuminators 910 illuminate the eyes at the corresponding eyeboxes 912, allowing the eye-tracking cameras 904 to obtain images of the eyes, as well as to provide reference reflections, i.e. glints. The glints may function as reference points in the captured eye image, facilitating determination of the eye gazing direction from the position of the eye pupil images relative to the glint positions. To avoid distracting the user with the light of the eyebox illuminators 910, the latter may be made to emit light invisible to the user. For example, infrared light may be used to illuminate the eyeboxes 912.
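The glint-referenced pupil measurement described above may be sketched, purely for illustration, as a pupil-center offset relative to the glint centroid. The function name and coordinate values are hypothetical; an actual eye tracker would additionally calibrate this offset to gaze angles per user:

```python
def gaze_offset(pupil_center, glints):
    """Toy sketch of glint-referenced gaze estimation.

    Expresses the pupil center in image coordinates relative to the
    centroid of the glint positions, so the measure is robust to small
    translations of the camera or the eye image as a whole.
    """
    gx = sum(p[0] for p in glints) / len(glints)  # glint centroid, x
    gy = sum(p[1] for p in glints) / len(glints)  # glint centroid, y
    return (pupil_center[0] - gx, pupil_center[1] - gy)

# pupil image at (102, 98) with two glints straddling (100, 100):
print(gaze_offset((102, 98), [(99, 100), (101, 100)]))  # (2.0, -2.0)
```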
Turning to
In some embodiments, the front body 1002 includes locators 1008 and an inertial measurement unit (IMU) 1010 for tracking acceleration of the HMD 1000, and position sensors 1012 for tracking position of the HMD 1000. The IMU 1010 is an electronic device that generates data indicating a position of the HMD 1000 based on measurement signals received from one or more of position sensors 1012, which generate one or more measurement signals in response to motion of the HMD 1000. Examples of position sensors 1012 include: one or more accelerometers, one or more gyroscopes, one or more magnetometers, another suitable type of sensor that detects motion, a type of sensor used for error correction of the IMU 1010, or some combination thereof. The position sensors 1012 may be located external to the IMU 1010, internal to the IMU 1010, or some combination thereof.
The locators 1008 are tracked by an external imaging device of a virtual reality system, such that the virtual reality system can track the location and orientation of the entire HMD 1000. Information generated by the IMU 1010 and the position sensors 1012 may be compared with the position and orientation obtained by tracking the locators 1008, for improved tracking accuracy of the position and orientation of the HMD 1000. Accurate position and orientation are important for presenting appropriate virtual scenery to the user as the latter moves and turns in 3D space.
The HMD 1000 may further include a depth camera assembly (DCA) 1011, which captures data describing depth information of a local area surrounding some or all of the HMD 1000. The depth information may be compared with the information from the IMU 1010, for better accuracy of determination of position and orientation of the HMD 1000 in 3D space.
The HMD 1000 may further include an eye tracking system 1014 for determining the orientation and position of the user's eyes in real time. The obtained position and orientation of the eyes allows the HMD 1000 to determine the gaze direction of the user and to adjust the image generated by the display system 1080 accordingly. The determined gaze direction and vergence angle may be used to adjust the display system 1080 to reduce the vergence-accommodation conflict. The direction and vergence may also be used for exit pupil steering of the display, as disclosed herein. Furthermore, the determined vergence and gaze angles may be used for interaction with the user, highlighting objects, bringing objects to the foreground, creating additional objects or pointers, etc. An audio system may also be provided, including e.g. a set of small speakers built into the front body 1002.
Embodiments of the present disclosure may include, or be implemented in conjunction with, an artificial reality system. An artificial reality system adjusts sensory information about the outside world obtained through the senses, such as visual information, audio, touch (somatosensation) information, acceleration, and balance, in some manner before presentation to a user. By way of non-limiting examples, artificial reality may include virtual reality (VR), augmented reality (AR), mixed reality (MR), hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include entirely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, somatic or haptic feedback, or some combination thereof. Any of this content may be presented in a single channel or in multiple channels, such as in a stereo video that produces a three-dimensional effect to the viewer. Furthermore, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, for example, create content in artificial reality and/or are otherwise used in (e.g., perform activities in) artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a wearable display such as an HMD connected to a host computer system, a standalone HMD, a near-eye display having a form factor of eyeglasses, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
The present disclosure is not to be limited in scope by the specific embodiments described herein. Indeed, other various embodiments and modifications, in addition to those described herein, will be apparent to those of ordinary skill in the art from the foregoing description and accompanying drawings. Thus, such other embodiments and modifications are intended to fall within the scope of the present disclosure. Further, although the present disclosure has been described herein in the context of a particular implementation in a particular environment for a particular purpose, those of ordinary skill in the art will recognize that its usefulness is not limited thereto and that the present disclosure may be beneficially implemented in any number of environments for any number of purposes. Accordingly, the claims set forth below should be construed in view of the full breadth and spirit of the present disclosure as described herein.
This application claims priority from U.S. Provisional Patent Application No. 63/341,416 entitled “Active Eyebox Solutions and Applications” filed on May 12, 2022, and U.S. Provisional Patent Application No. 63/392,425 entitled “Field of View Expansion by Image Light Redirection” filed on Jul. 26, 2022, both of which are incorporated herein by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
6400885 | Hu et al. | Jun 2002 | B1 |
7884977 | Mori | Feb 2011 | B2 |
8086044 | Feng et al. | Dec 2011 | B2 |
8878773 | Bozarth | Nov 2014 | B1 |
9230515 | Liu | Jan 2016 | B2 |
9274597 | Karakotsios et al. | Mar 2016 | B1 |
9557568 | Ouderkirk et al. | Jan 2017 | B1 |
9664824 | Simmonds et al. | May 2017 | B2 |
10108014 | Vallius et al. | Oct 2018 | B2 |
10217286 | Angel et al. | Feb 2019 | B1 |
10295723 | Lee et al. | May 2019 | B1 |
10466484 | Yoon et al. | Nov 2019 | B1 |
10466779 | Liu | Nov 2019 | B1 |
10502963 | Noble et al. | Dec 2019 | B1 |
10527855 | Alexander | Jan 2020 | B2 |
10571699 | Parsons | Feb 2020 | B1 |
10712576 | McEldowney | Jul 2020 | B1 |
10775633 | Lee et al. | Sep 2020 | B1 |
10782526 | Van Heugten | Sep 2020 | B2 |
10838132 | Calafiore et al. | Nov 2020 | B1 |
10885843 | Lu et al. | Jan 2021 | B1 |
10890823 | Jiang et al. | Jan 2021 | B1 |
11120728 | Nagasaki et al. | Sep 2021 | B2 |
11176367 | Fix et al. | Nov 2021 | B1 |
11393430 | Nagasaki et al. | Jul 2022 | B2 |
11428938 | Yaroshchuk | Aug 2022 | B2 |
20040227838 | Atarashi et al. | Nov 2004 | A1 |
20060146012 | Arneson | Jul 2006 | A1 |
20070188837 | Shimizu et al. | Aug 2007 | A1 |
20080143820 | Peterson | Jun 2008 | A1 |
20080212161 | Valette et al. | Sep 2008 | A1 |
20080212942 | Gordon et al. | Sep 2008 | A1 |
20080309649 | Kojima et al. | Dec 2008 | A1 |
20090040580 | Mukawa | Feb 2009 | A1 |
20090196460 | Jakobs et al. | Aug 2009 | A1 |
20110234750 | Lai et al. | Sep 2011 | A1 |
20120188467 | Escuti | Jul 2012 | A1 |
20120218481 | Popovich et al. | Aug 2012 | A1 |
20120249957 | Shibata et al. | Oct 2012 | A1 |
20120250980 | Gillard et al. | Oct 2012 | A1 |
20120254369 | Gillard et al. | Oct 2012 | A1 |
20120257005 | Browne | Oct 2012 | A1 |
20130099700 | Kreye et al. | Apr 2013 | A1 |
20130182066 | Ishimoto | Jul 2013 | A1 |
20140037213 | Niederberger et al. | Feb 2014 | A1 |
20140049452 | Maltz | Feb 2014 | A1 |
20140098010 | Travis | Apr 2014 | A1 |
20140300966 | Travers et al. | Oct 2014 | A1 |
20150243718 | Kwon et al. | Aug 2015 | A1 |
20150253591 | Kato et al. | Sep 2015 | A1 |
20160029883 | Cox | Feb 2016 | A1 |
20160085300 | Robbins et al. | Mar 2016 | A1 |
20160241892 | Cole et al. | Aug 2016 | A1 |
20160342205 | Shigeta et al. | Nov 2016 | A1 |
20170307886 | Stenberg et al. | Oct 2017 | A1 |
20180046859 | Jarvenpaa | Feb 2018 | A1 |
20180073686 | Quilici et al. | Mar 2018 | A1 |
20180081322 | Robbins et al. | Mar 2018 | A1 |
20180143586 | Narducci | May 2018 | A1 |
20180196263 | Vallius et al. | Jul 2018 | A1 |
20180232048 | Popovich et al. | Aug 2018 | A1 |
20180237696 | Tuffin et al. | Aug 2018 | A1 |
20180239177 | Oh | Aug 2018 | A1 |
20180275409 | Gao et al. | Sep 2018 | A1 |
20180307048 | Alexander et al. | Oct 2018 | A1 |
20180364487 | Yeoh et al. | Dec 2018 | A1 |
20190041634 | Popovich et al. | Feb 2019 | A1 |
20190079292 | Alexander | Mar 2019 | A1 |
20190086674 | Sinay et al. | Mar 2019 | A1 |
20190094981 | Bradski et al. | Mar 2019 | A1 |
20190147564 | Yuan et al. | May 2019 | A1 |
20190243134 | Perreault | Aug 2019 | A1 |
20190243209 | Perreault | Aug 2019 | A1 |
20190310456 | Meng et al. | Oct 2019 | A1 |
20190317450 | Yaroshchuk | Oct 2019 | A1 |
20190361241 | Amitai | Nov 2019 | A1 |
20200041787 | Popovich et al. | Feb 2020 | A1 |
20200043398 | Salazar | Feb 2020 | A1 |
20200081252 | Jamali et al. | Mar 2020 | A1 |
20200116995 | Chi et al. | Apr 2020 | A1 |
20200116996 | Lee et al. | Apr 2020 | A1 |
20200143741 | Tsuboi et al. | May 2020 | A1 |
20200159084 | Choi | May 2020 | A1 |
20200183159 | Danziger | Jun 2020 | A1 |
20200183174 | Noui et al. | Jun 2020 | A1 |
20200271936 | Leibovici et al. | Aug 2020 | A1 |
20200336645 | Fukuda | Oct 2020 | A1 |
20200368616 | Delamont | Nov 2020 | A1 |
20200371388 | Geng et al. | Nov 2020 | A1 |
20200412965 | Yoshida | Dec 2020 | A1 |
20210011284 | Andreev et al. | Jan 2021 | A1 |
20210041948 | Berkner-Cieslicki et al. | Feb 2021 | A1 |
20210055555 | Chi et al. | Feb 2021 | A1 |
20210191122 | Yaroshchuk et al. | Jun 2021 | A1 |
20210199958 | Huang et al. | Jul 2021 | A1 |
20210199970 | Huang et al. | Jul 2021 | A1 |
20210208397 | Lu et al. | Jul 2021 | A1 |
20210209364 | Park et al. | Jul 2021 | A1 |
20210405374 | Komanduri et al. | Dec 2021 | A1 |
20210405380 | Urness et al. | Dec 2021 | A1 |
20220004001 | Danziger et al. | Jan 2022 | A1 |
20220197376 | Boyle et al. | Jun 2022 | A1 |
20220299754 | Gollier et al. | Sep 2022 | A1 |
20220350219 | Danziger | Nov 2022 | A1 |
20220382061 | Schultz | Dec 2022 | A1 |
20220382064 | Rohn et al. | Dec 2022 | A1 |
20220394234 | Etigson et al. | Dec 2022 | A1 |
20220397956 | Lundell et al. | Dec 2022 | A1 |
20220413302 | Meitav et al. | Dec 2022 | A1 |
20220413603 | Held et al. | Dec 2022 | A1 |
20230014577 | Gollier | Jan 2023 | A1 |
20230057514 | Fix et al. | Feb 2023 | A1 |
Number | Date | Country |
---|---|---|
210323583 | Apr 2020 | CN |
113075793 | Jul 2021 | CN |
115698819 | Feb 2023 | CN |
2767852 | Aug 2014 | EP |
2422680 | Aug 2006 | GB |
2585211 | Jan 2021 | GB |
H0682851 | Mar 1994 | JP |
20170094350 | Aug 2017 | KR |
20180135646 | Dec 2018 | KR |
20210004776 | Jan 2021 | KR |
2019178398 | Sep 2019 | WO |
2021030093 | Feb 2021 | WO |
2021091622 | May 2021 | WO |
2021242667 | Dec 2021 | WO |
2022052949 | Mar 2022 | WO |
Entry |
---|
Draper C.T., et al., “Holographic Waveguide Head-Up Display with 2-D Pupil Expansion and Longitudinal Image Magnification,” Applied Optics, Feb. 10, 2019, vol. 58, No. 5, pp. A251-A257. |
Palto S.P., “Dynamic and Photonic Properties of Field-Induced Gratings in Flexoelectric LC Layers,” Crystals, 2021, vol. 11, 894, 13 pages. |
Pogue R.T., et al., “Electrically Switchable Bragg Gratings from Liquid Crystal/Polymer Composites,” Applied Spectroscopy, 2000, vol. 54, No. 1, pp. 12A-28A. |
Smalley D.E., et al., “Status of Leaky Mode of Holography,” Photonics, 2021, 8, 292, 22 pages. |
Xiang J., et al., “Electrooptic Response of Chiral Nematic Liquid Crystals with Oblique Helicoidal Director,” Physical Review Letters, 2014, 112, 217801, 14 pages. |
Zhan T., et al., “High-Efficiency Switchable Optical Elements for Advanced Head-Up Displays,” Journal of the Society for Information Display, Mar. 21, 2019, vol. 27, No. 4, pp. 223-231. |
International Search Report and Written Opinion for International Application No. PCT/US2022/051388, mailed Apr. 6, 2023, 12 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2022/051487, mailed Apr. 11, 2023, 8 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2022/051608, mailed Apr. 5, 2023, 11 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2022/051751, mailed Apr. 11, 2023, 14 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2022/051758, mailed Mar. 22, 2023, 9 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2022/051809, mailed Apr. 5, 2023, 10 pages. |
Invitation to Pay Additional Fees for International Application No. PCT/US2022/051814, mailed Apr. 11, 2023, 14 pages. |
Jolly S., et al., “Near-to-Eye Electroholography via Guided-Wave Acousto-Optics for Augmented Reality,” Proceedings of SPIE, vol. 10127, Mar. 2, 2017, 11 pages. |
Maimone A., et al., “Holographic Optics for Thin and Lightweight Virtual Reality,” Facebook Reality Labs, ACM Trans. Graph. Article 67, vol. 39, No. 4, Jul. 2020, 14 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2022/051781, mailed Apr. 18, 2023, 13 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2022/051801 mailed Apr. 14, 2023, 11 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2022/051805 mailed Apr. 14, 2023, 10 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2022/051755, mailed Apr. 26, 2023, 9 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2022/051814, mailed Jun. 2, 2023, 20 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2023/021926, mailed Sep. 4, 2023, 12 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2023/022012, mailed Sep. 1, 2023, 11 pages. |
Aalizadeh M., et al., “Toward Electrically Tunable, Lithography-Free, Ultra-Thin Color Filters Covering the Whole Visible Spectrum,” Scinetific Reports, vol. 8, No. 1, Jul. 27, 2018, 11 pages. |
Chang A. S. P., “Tunable Liquid Crystal-Resonant Grating Filter Fabricated by Nanoimprint Lithography,” IEEE Photonics Technology Letters, vol. 19, No. 19, Oct. 1, 2007, pp. 1457-1459. |
Kollosche M., et al., “Voltage-Controlled Compression for Period Tuning of Optical Surface Relief Gratings,” Optics Letters, vol. 36, No. 8, Apr. 15, 2011, pp. 1389-1391. |
Lee K. M., et al., “Color-Tunable Mirrors Based on Electrically Regulated Bandwidth Broadening in Polymer-Stabilized Cholesteric Liquid Crystals,” ACS Photonics, Sep. 17, 2014, pp. 1033-1041. |
Lin I-T., et al., “Electro-Responsive Surfaces with Controllable Wrinkling Patterns for Switchable Light reflection-Diffusion-Grating Devices,” Marterials Today, vol. 41, Dec. 2020, 11 pages. |
Shih W-C., et al., “High-Resolution Electrostatic Analog Tunable Grating With a Single-Mask Fabrication Process,” Journal of Microelectromechanical Systems, vol. 15, No. 4, Aug. 2006, pp. 763-769. |
Sirleto L., et al., “Electro-Optical Switch and Continuously Tunable Filter based on a Bragg Grating in a Planar Waveguide with a Liquid Crystal Overlayer,” Optical Engineering, vol. 41, No. 11, Nov. 2002, pp. 2890-2898. |
Xiang J., et al., “Electrically Tunable Selective Reflection of Light from Ultraviolet to Visible and Infrared by Heliconical Cholesterics,” Advanced Materials, vol. 27, Issue19, May 20, 2015, 5 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2022/051388, mailed Jun. 20, 2024, 10 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2022/051487, mailed Jun. 20, 2024, 7 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2022/051608, mailed Jun. 20, 2024, 9 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2022/051755, mailed Jun. 20, 2024, 7 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2022/051758, mailed Jun. 20, 2024, 7 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2022/051781, mailed Jun. 20, 2024, 11 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2022/051801, mailed Jun. 20, 2024, 10 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2022/051805, mailed Jun. 20, 2024, 9 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2022/051814, mailed Jun. 20, 2024, 16 pages. |
Number | Date | Country | |
---|---|---|---|
20230367123 A1 | Nov 2023 | US |
Number | Date | Country | |
---|---|---|---|
63341416 | May 2022 | US | |
63392425 | Jul 2022 | US |