The invention pertains to near eye displays. More particularly, this invention pertains to near eye three-dimensional (3D) displays.
Since the advent of the smartphone, the great utility of a versatile, always-available device capable of general-purpose computing and multimedia communication has been realized by the public at large. Nonetheless, a pronounced drawback of smartphones is their relatively small screen size: smartphone display screens are a small fraction of the size of even small laptop computer screens.
It is now contemplated that smartphones will eventually be replaced, or indispensably supplemented, by augmented reality glasses that will, among other things, provide users with a relatively large field of view 3D imagery output system that is accessible at will, whether for business or entertainment purposes.
Beyond merely exceeding the screen size afforded by a laptop, and without the encumbrance of carrying a laptop, augmented reality glasses will enable new mixed reality applications that seamlessly integrate the real world and virtual content. This not only preserves the user's engagement with the real world, discouraging the social phenomenon of withdrawal from real world interaction that is sometimes associated with excessive smartphone use, but also enables new types of augmentation of the physical world, such as, for example: automatically generated, contextually relevant information overlaid on automatically recognized real world objects; communication between remotely situated persons through 3D avatars of each party displayed to the other party; and mixed reality games that include virtual content behaving realistically, e.g., respecting the boundaries of physical objects in the real world.
One aspect of augmented reality glasses is that the virtual content is displayed via transparent eyepieces. One type of transparent eyepiece is based on waveguides that include see-through diffractive optical elements for controlling the propagation of light that carries virtual imagery. One issue with such waveguide eyepieces is the low efficiency with which they transfer light carrying virtual imagery to a user's eyes. Low efficiency leads to higher power consumption, and thus to shorter battery life and more demanding thermal management requirements.
Additionally, in order to enhance the realism of virtual content, it is desirable to display content at different depths. Properly displaying content at a certain distance from the user calls for curving the wavefront of the light used to generate virtual images of the content, with the curvature inversely related to the virtual image distance. In order to achieve multiple virtual image distances with waveguide-based eyepieces, a stack of waveguides, each having different out-coupling optics, is used. This approach practically limits the available virtual image distances to a small finite number, e.g., two chosen distances.
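This inverse relationship can be illustrated with a short sketch; the function name and sample distances below are illustrative assumptions, not part of the disclosure:

```python
# Illustrative sketch: wavefront curvature, expressed in diopters,
# is the inverse of the virtual image distance in meters.
def wavefront_curvature(virtual_image_distance_m: float) -> float:
    """Return the wavefront curvature (diopters) needed to make virtual
    content appear at the given distance from the viewer."""
    return 1.0 / virtual_image_distance_m

# Content intended to appear 2 m away calls for 0.5 D of curvature;
# content at 0.5 m calls for 2.0 D.
for distance_m in (0.5, 1.0, 2.0):
    print(f"{distance_m} m -> {wavefront_curvature(distance_m):.2f} D")
```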
Embodiments described herein improve the efficiency with which 3D imagery is coupled through an optics train to a user's eye(s), and are moreover more versatile in their ability to control the depth of virtual images.
Embodiments of the invention provide augmented reality glasses including near eye displays that include a source of imagewise intensity modulated light coupled to a spatial phase modulator that can impart a spatially varied phase modulation across a beam of light received from the source of imagewise intensity modulated light. The spatial phase modulator is further coupled to an eye coupling optic. The source of imagewise amplitude modulated light can, for example, take the form of a light source coupled to a 2D pixelated amplitude modulator, such as a Liquid Crystal on Silicon (LCoS) modulator or a Digital Micromirror Device (DMD) modulator, or an emissive 2D display panel such as an Organic Light Emitting Diode (OLED) display panel. The spatial phase modulator can take the form of an LCoS modulator as well. The eye coupling optic can take the form of an off-axis volume holographic diffraction grating which receives light at a relatively high incidence angle compared to the angle at which it redirects light toward a user's eye, thereby allowing parts of the near eye display to be positioned to the side of the user's eye. In some embodiments, a path of light between the light source and the eye coupling optic can reach the amplitude modulator before reaching the spatial phase modulator. The near eye display can further comprise a beam splitter disposed between the amplitude modulator and the spatial phase modulator.
The drawings illustrate the design and utility of preferred embodiments of the present invention, in which similar elements are referred to by common reference numerals. In order to better appreciate how the above-recited and other advantages and objects of the present inventions are obtained, a more particular description of the present inventions briefly described above will be rendered by reference to specific embodiments thereof, which are illustrated in the accompanying drawings.
Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
The source of imagewise amplitude modulated light 102 is optically coupled to a spatial phase modulator 108. In the present specification, the term “optically coupled” can include propagation along an optical path that may include free space and/or one or more optical elements, such as lens(es), mirror(s), and light pipe(s), for example. The spatial phase modulator 108 can, for example, include a Zero-Twist Electrically Controlled Birefringence Liquid Crystal (ZTECBLC) modulator. The spatial phase modulator 108 can be configured as a single Fresnel lens, as a grid of Fresnel lenses, or as a superposition of multiple Fresnel lenses in a non-grid arrangement. The single Fresnel lens configuration can be used to impart a common wavefront curvature to all of the light received from the source of imagewise amplitude modulated light 102. The grid and non-grid multiple Fresnel lens configurations can be used to impart different wavefront curvatures to different regions of the light received from the source of imagewise amplitude modulated light. In each case, the wavefront curvature is the inverse of a virtual image distance. Setting the wavefront curvature to the inverse of the virtual image distance helps create a more realistic impression that the virtual imagery output by the imagewise amplitude modulator 106 lies at the virtual image distance relative to the user's position.
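A non-limiting sketch of how such a Fresnel lens phase pattern might be computed follows; the panel resolution, pixel pitch, wavelength, and focal length used are assumed example values, not taken from the disclosed embodiments:

```python
import numpy as np

def fresnel_lens_phase(n: int, pitch_m: float, focal_length_m: float,
                       wavelength_m: float) -> np.ndarray:
    """Return an n x n phase pattern (radians, wrapped to [0, 2*pi))
    realizing a thin lens of the given focal length.

    A negative focal length yields the diverging (negative power)
    lens used to make virtual imagery appear at a finite distance."""
    half = n * pitch_m / 2.0
    coords = np.linspace(-half, half, n)
    x, y = np.meshgrid(coords, coords)
    r_squared = x**2 + y**2
    # Paraxial thin-lens phase profile; the mod operation wraps it
    # into the concentric Fresnel zones displayed on the modulator.
    phase = -np.pi * r_squared / (wavelength_m * focal_length_m)
    return np.mod(phase, 2.0 * np.pi)

# Example: a -1 m focal length pattern (virtual image at about 1 m)
# for 532 nm light on a 1080 x 1080 panel with 6.4 micron pixel pitch.
pattern = fresnel_lens_phase(1080, 6.4e-6, -1.0, 532e-9)
```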
The spatial phase modulator 108 is optically coupled to eye coupling optics. The eye coupling optics can, for example, take the form of a holographic volume diffraction grating, or a specular eyepiece that includes refractive and/or reflective surfaces. The imagewise amplitude modulated light and the spatial phase modulated light may correspond to the amplitude and phase modulated components of a hologram, such as a computer generated hologram, for example.
The source of imagewise amplitude modulated light 202 is optically coupled to an active zone plate modulator 208. The active zone plate modulator 208 can reconfigurably form zone plates of varying focal lengths by presenting zone plate patterns that include alternating light and dark rings. The active zone plate modulator 208 can be a reflective light modulator or a transmissive light modulator, and can be implemented using a DMD modulator, an LCoS modulator, or a transmissive liquid crystal (LC) modulator, for example. The active zone plate modulator 208 can be used to present a single zone plate pattern, a grid of multiple zone plate patterns, or a non-grid superposition of zone plate patterns. A single zone plate pattern can be used to impart wavefront curvature to imagewise modulated light that is received from the source of imagewise amplitude modulated light 202. On the other hand, multiple zone plate patterns can be used to impart different wavefront curvatures to different portions of the imagewise modulated light that is received from the source of imagewise modulated light 202. In each case, the wavefront curvature corresponds to the inverse of a virtual image distance of imagery presented by the near eye display 200. Depth perception of imagery presented by the near eye display system is enhanced by curving the wavefront of light used to present images based on the intended distance to virtual content (e.g., animate and inanimate objects) in the presented images. In the case that multiple zone plate patterns are implemented, a first portion of the imagewise modulated light that carries the image of a first virtual object (e.g., a book) can be diverged by a first zone plate pattern so as to have a first curvature corresponding to an inverse of a first intended distance to the first virtual object, and a second portion of the imagewise modulated light that carries the image of a second virtual object (e.g., an avatar) can be diverged by a second zone plate pattern so as to have a second curvature corresponding to an inverse of a second intended distance to the second virtual object. The active zone plate modulator 208 is coupled to eye coupling optics 210.
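A minimal sketch of how a binary zone plate pattern of a chosen focal length might be generated follows; the resolution, pixel pitch, wavelength, and example depths are assumptions for illustration only:

```python
import numpy as np

def zone_plate(n: int, pitch_m: float, focal_length_m: float,
               wavelength_m: float) -> np.ndarray:
    """Return an n x n binary zone plate pattern (1 = light ring,
    0 = dark ring) suitable for an amplitude modulator such as a DMD
    or LCoS device.

    Zone boundaries fall at radii r_k = sqrt(k * wavelength * f), so
    the ring spacing, and hence the focal length, is set entirely by
    the displayed pattern."""
    half = n * pitch_m / 2.0
    coords = np.linspace(-half, half, n)
    x, y = np.meshgrid(coords, coords)
    r_squared = x**2 + y**2
    zone_index = np.floor(r_squared / (wavelength_m * abs(focal_length_m)))
    return (zone_index % 2 == 0).astype(np.uint8)

# Two patterns with different focal lengths can place two virtual
# objects (e.g., a book and an avatar) at different apparent depths.
book_pattern = zone_plate(512, 6.4e-6, 0.5, 532e-9)
avatar_pattern = zone_plate(512, 6.4e-6, 2.0, 532e-9)
```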
Virtual content that is displayed using the left and right image generators 312, 314 can include one or more virtual objects at different depths. The eye tracking cameras 324, 326 can be used to determine which particular virtual object a user is looking at. Based on the intended depth of the particular virtual object that the user is looking at, the spatial phase modulator 108 or the active zone plate modulator 208 can be used to form a negative power lens that imparts a diverging (convex toward user) wavefront curvature to light received from the source of imagewise amplitude modulated light 102 or 202. The radius of curvature of the light is suitably set equal to the depth of the particular virtual object that the user is looking at, as determined by the eye tracking cameras 324, 326. The depth of each particular virtual object can be determined by one or more programs, e.g., augmented reality programs that generate the virtual objects. Furthermore, the negative lens pattern formed by the spatial phase modulator 108 or the active zone plate modulator 208 can be transversely shifted to deflect light such that after being redirected by the eyepieces 308, 310 the light will be incident on the user's pupil.
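The effect of such a transverse shift can be sketched with the small-angle relation below, in which a decentered lens acts as the centered lens plus a prism; the focal length and deflection angle are assumed example values:

```python
import math

def pattern_shift_for_deflection(focal_length_m: float,
                                 deflection_rad: float) -> float:
    """Return the transverse shift of a lens pattern that deflects the
    beam by the given angle.

    A lens decentered by d acts as the centered lens plus a prism;
    for small angles the deflection magnitude is approximately |d / f|,
    so the required shift is d = f * deflection (signs depend on the
    lens power and the chosen axis conventions)."""
    return focal_length_m * deflection_rad

# Example: steering light by 2 degrees with a -0.5 m focal length
# negative lens pattern, as described above.
shift_m = pattern_shift_for_deflection(-0.5, math.radians(2.0))
```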
One of skill in the art will appreciate that while the disclosure refers to tracking a user's pupil in specific embodiments, imaging or otherwise determining the position of other eye anatomy may be used. For example, a retinal image may be obtained and aggregated over time to provide a retinal map, wherein any single retinal image obtained at a given time by an eye tracking camera 324, 326 corresponds to a gaze direction of the eye. That is, as the pupil moves with a change in gaze direction, it provides a variably positioned aperture through which the eye tracking camera 324, 326 receives image data of the retina, the variable position corresponding to the new gaze direction.
The reflective amplitude modulator 502 can, for example, be an LCoS modulator or a DMD modulator. The reflective phase modulator 504 can, for example, be a Zero-Twist Electrically Controlled Birefringence (ZTECB) LCoS modulator. According to an alternative embodiment, a second reflective amplitude modulator that is used to form one or more zone plate patterns, as discussed above in the context of
Light exiting the output face 822 of the RGB combiner cube can pass through an optional beam expander 836 that may include a negative lens 838 followed by a positive lens 840, which may be arranged in a Galilean telescope configuration so as to output collimated light. Alternatively, only the negative lens 838 is provided. According to a further alternative embodiment, in lieu of the beam expander 836, laser beam shaping optics may be provided, for example, laser beam shaping optics configured to produce one or more beams of substantially uniform rectangular cross section.
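The Galilean telescope relationship underlying such a beam expander can be sketched as follows; the focal lengths are assumed example values, not taken from the disclosed embodiments:

```python
def galilean_expander(f_negative_m: float, f_positive_m: float):
    """Return (magnification, lens separation) for a Galilean beam
    expander: a negative lens followed by a positive lens.

    Collimated input emerges collimated and expanded when the lenses
    are separated by f_positive + f_negative (f_negative < 0, so the
    track length is shorter than f_positive)."""
    magnification = f_positive_m / abs(f_negative_m)
    separation_m = f_positive_m + f_negative_m
    return magnification, separation_m

# Example: a -10 mm negative lens followed by a +40 mm positive lens
# expands the beam 4x over a 30 mm track.
mag, sep = galilean_expander(-0.010, 0.040)
print(f"magnification {mag}x, separation {sep * 1000:.0f} mm")
```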
Light exiting the beam expander 836, or, in the absence thereof, light exiting the light engine 834, enters an input face 842 of a beam splitter 844 and propagates to a partial reflector 846 that is embedded within the beam splitter 844 and is, in some embodiments, oriented at 45 degrees. The partial reflector 846 can, for example, be a neutral density 50% reflector. Light is reflected by the partial reflector 846 to an LCoS amplitude modulator 830. An optical path 850 is indicated in
Light reflected by the LCoS phase modulator 832 is reflected by the partial reflector 846 toward an optical path folding mirror 858 which reflects the light through a protective optical window 860 toward a volume holographic eyepiece 862. The volume holographic eyepiece 862 includes gratings or other light redirecting features 864 oriented so as to diffract light toward the user's eye position 866. The near eye display 800 can be used in the augmented reality glasses 300 shown in
The processor 1102 is further coupled to a left spatial phase/zone plate modulator driver 1122 and a right spatial phase/zone plate modulator driver 1124. The GPU 1104 is coupled to a left spatial amplitude modulator driver 1126 and to a right spatial amplitude modulator driver 1128 such that left eye imagery and right eye imagery can be output from the left frame buffer 1120 and the right frame buffer 1118 to the left spatial amplitude modulator driver 1126 and the right spatial amplitude modulator driver 1128, respectively. The left spatial phase/zone plate modulator driver 1122 is coupled to the left spatial phase or zone plate modulator 108L, 208L, and the right spatial phase/zone plate modulator driver 1124 is coupled to the right spatial phase or zone plate modulator 108R, 208R, such that each modulator driver drives (e.g., controls) the respective modulator.
According to one mode of operation, the processor 1102 receives from the eye tracking cameras 324, 326 information indicative of a direction in which the user is looking. The processor 1102 accesses information from the Z-buffer 1116 indicative of the depth of virtual content corresponding to, or closest to, the direction in which the user is looking. The processor 1102 then transmits a Fresnel lens pattern, or zone plate pattern, having a focal length based on that depth to the spatial phase/zone plate modulator drivers 1122, 1124. The focal length of the Fresnel lens or zone plate pattern transmitted to the drivers 1122, 1124 is set in consideration of the optical power of any other optical elements (e.g., the eyepieces 308, 310) in the path between the spatial phase or zone plate modulators 108L, 208L, 108R, 208R and the user's eye position, such that the curvature of the wavefront of light reaching the user's eye will be the inverse of the value from the Z-buffer associated with (corresponding to, or closest to) the direction in which the user is looking. Furthermore, the Fresnel lens or zone plate pattern that is transmitted to the spatial phase/zone plate modulator drivers is, in some embodiments, shifted (as shown, for example, in
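This focal length computation can be sketched as follows, under the simplifying assumption that the pattern and the downstream optics behave as thin elements whose powers add; the function and parameter names are illustrative only:

```python
def pattern_focal_length(z_depth_m: float,
                         other_optics_power_diopters: float) -> float:
    """Return the focal length (m) for the Fresnel lens or zone plate
    pattern given the Z-buffer depth of the fixated virtual content.

    The wavefront reaching the eye should have a diverging curvature
    equal in magnitude to the inverse of the Z-buffer depth.  Treating
    the pattern and the remaining optics (e.g., the eyepieces) as thin
    elements whose powers add, the pattern supplies whatever power the
    rest of the optics train does not."""
    target_power = -1.0 / z_depth_m  # negative power: diverging wavefront
    pattern_power = target_power - other_optics_power_diopters
    return 1.0 / pattern_power

# Example: fixated content 1.5 m away, with the rest of the optics
# train contributing +0.25 D of power.
f_m = pattern_focal_length(1.5, 0.25)
```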
According to an alternative embodiment, the processor 1102 accesses information from the Z-buffer indicative of the depths of multiple virtual objects. The processor 1102 then generates a grid of Fresnel lens patterns or zone plate patterns wherein each of the multiple Fresnel lens patterns or zone plate patterns has a focal length selected to set the curvature of light reaching the user's eye position to the inverse of the distance of a corresponding virtual object, based on information accessed from the Z-buffer. According to a variation on the preceding embodiment, the multiple Fresnel lens patterns or zone plate patterns are positioned in a non-grid arrangement.
Optical coupling referred to hereinabove can include coupling through free space propagation of light between optical components that are relatively positioned such that light propagating from one component is received by a second component. Imagewise intensity modulated light, imagewise modulated light, amplitude modulated light, and imagewise amplitude modulated light are used interchangeably in the present application to indicate image data encoded in light whose amplitude (i.e., intensity for a given wavelength) may change as an image changes over time.
This application is a continuation of U.S. patent application Ser. No. 17/548,900, filed on Dec. 13, 2021, which is a continuation of U.S. patent application Ser. No. 16/650,800, filed on Mar. 25, 2020, now U.S. Pat. No. 11,231,581, which is a national stage application under 35 U.S.C. § 371 of International Application No. PCT/US2018/052882, filed on Sep. 26, 2018, which published in English as WO 2019/067559 A1 on Apr. 4, 2019 and which claims priority benefit of U.S. Patent Application No. 62/564,024 filed on Sep. 27, 2017. The entire contents of each of the above-identified patent applications are incorporated by reference herein.
Number | Name | Date | Kind |
---|---|---|---|
5521748 | Sarraf | May 1996 | A |
6850221 | Tickle | Feb 2005 | B1 |
8498053 | Futterer et al. | Jul 2013 | B2 |
8547615 | Leister | Oct 2013 | B2 |
9081426 | Armstrong | Jul 2015 | B2 |
9215293 | Miller | Dec 2015 | B2 |
9279984 | Aiki et al. | Mar 2016 | B2 |
9335548 | Cakmakci et al. | May 2016 | B1 |
9348143 | Gao et al. | May 2016 | B2 |
9417452 | Schowengerdt et al. | Aug 2016 | B2 |
9470906 | Kaji et al. | Oct 2016 | B2 |
9547174 | Gao et al. | Jan 2017 | B2 |
9671566 | Abovitz et al. | Jun 2017 | B2 |
9740006 | Gao | Aug 2017 | B2 |
9740167 | Leister et al. | Aug 2017 | B2 |
9791700 | Schowengerdt et al. | Oct 2017 | B2 |
9851563 | Gao et al. | Dec 2017 | B2 |
9857591 | Welch et al. | Jan 2018 | B2 |
9874749 | Bradski | Jan 2018 | B2 |
10845761 | Maimone | Nov 2020 | B2 |
10877437 | Gelman et al. | Dec 2020 | B2 |
11022939 | Maimone | Jun 2021 | B2 |
11106038 | Roggatz | Aug 2021 | B2 |
20050007550 | Turkov et al. | Jan 2005 | A1 |
20060028436 | Armstrong | Feb 2006 | A1 |
20070081123 | Lewis | Apr 2007 | A1 |
20070097277 | Hong et al. | May 2007 | A1 |
20080239420 | McGrew | Oct 2008 | A1 |
20090128901 | Tilleman et al. | May 2009 | A1 |
20100014136 | Haussler | Jan 2010 | A1 |
20120127062 | Bar-Zeev et al. | May 2012 | A1 |
20120162549 | Gao et al. | Jun 2012 | A1 |
20130082922 | Miller | Apr 2013 | A1 |
20130117377 | Miller | May 2013 | A1 |
20130125027 | Abovitz | May 2013 | A1 |
20130208234 | Lewis | Aug 2013 | A1 |
20130222384 | Futterer | Aug 2013 | A1 |
20130242262 | Lewis | Sep 2013 | A1 |
20130321899 | Haussler | Dec 2013 | A1 |
20140016051 | Kroll et al. | Jan 2014 | A1 |
20140071539 | Gao | Mar 2014 | A1 |
20140177023 | Gao et al. | Jun 2014 | A1 |
20140218468 | Gao et al. | Aug 2014 | A1 |
20140267420 | Schowengerdt | Sep 2014 | A1 |
20140306866 | Miller et al. | Oct 2014 | A1 |
20150016777 | Abovitz et al. | Jan 2015 | A1 |
20150103306 | Kaji et al. | Apr 2015 | A1 |
20150178939 | Bradski et al. | Jun 2015 | A1 |
20150205126 | Schowengerdt | Jul 2015 | A1 |
20150222883 | Welch | Aug 2015 | A1 |
20150222884 | Cheng | Aug 2015 | A1 |
20150241707 | Schowengerdt | Aug 2015 | A1 |
20150268415 | Schowengerdt et al. | Sep 2015 | A1 |
20150302652 | Miller et al. | Oct 2015 | A1 |
20150309263 | Abovitz et al. | Oct 2015 | A2 |
20150326570 | Publicover et al. | Nov 2015 | A1 |
20150346490 | TeKolste et al. | Dec 2015 | A1 |
20150346495 | Welch et al. | Dec 2015 | A1 |
20160011419 | Gao | Jan 2016 | A1 |
20160026253 | Bradski et al. | Jan 2016 | A1 |
20160033771 | Tremblay et al. | Feb 2016 | A1 |
20160037146 | McGrew | Feb 2016 | A1 |
20160216515 | Bouchier | Jul 2016 | A1 |
20160223986 | Archambeau | Aug 2016 | A1 |
20160379606 | Kollin et al. | Dec 2016 | A1 |
20170094265 | Mullins et al. | Mar 2017 | A1 |
20170123204 | Sung et al. | May 2017 | A1 |
20170184848 | Vallius | Jun 2017 | A1 |
20170199496 | Grata et al. | Jul 2017 | A1 |
20170219828 | Tsai et al. | Aug 2017 | A1 |
20170262054 | Lanman et al. | Sep 2017 | A1 |
20180120563 | Kollin et al. | May 2018 | A1 |
20180259904 | Georgiou | Sep 2018 | A1 |
20190227492 | Kroll et al. | Jul 2019 | A1 |
20190250439 | Urey et al. | Aug 2019 | A1 |
20190369403 | Leister | Dec 2019 | A1 |
20200043391 | Maimone | Feb 2020 | A1 |
20200049995 | Urey | Feb 2020 | A1 |
20200166754 | Leister et al. | May 2020 | A1 |
20200233214 | Jia et al. | Jul 2020 | A1 |
Number | Date | Country |
---|---|---|
H 10-507534 | Jul 1998 | JP |
2006-091333 | Apr 2006 | JP |
2013-235256 | Nov 2013 | JP |
2016-206495 | Dec 2016 | JP |
WO 1998018039 | Apr 1998 | WO |
WO 2008068900 | Jun 2008 | WO |
WO 2016156614 | Oct 2016 | WO |
WO 2017158073 | Sep 2017 | WO |
WO 2019067559 | Apr 2019 | WO |
Entry |
---|
Azuma, “A Survey of Augmented Reality,” Presence: Teleoperators and Virtual Environments Aug. 1997, 6(4): 355-385. |
Azuma, “Predictive Tracking for Augmented Reality,” Dissertation for the degree of Doctor of Philosophy in the Department of Computer Science, University of North Carolina-Chapel Hill, School of Computer Science, Feb. 1995, 262 pages. |
Bimber et al., “Spatial Augmented Reality—Merging Real and Virtual Worlds,” 1st ed., Aug. 2005, 393 pages. |
hitl.washington.edu [online], “ARToolKit,” available on or before Oct. 13, 2005, via Internet Archive: Wayback Machine URL <https://web.archive.org/web/20051013062315/http://www.hitl.washington.edu:80/artoolkit/documentation/hardware.htm>, retrieved on Dec. 3, 2021, URL <http://www.hitl.washington.edu:80/artoolkit/documentation/hardware.htm>, 3 pages. |
International Preliminary Report on Patentability for PCT Application No. PCT/US2018/052882, dated Mar. 31, 2020, 10 pages. |
International Search Report and Written Opinion in International Appln. No. PCT/US2018/052882, dated Dec. 11, 2018, 11 pages. |
Jacob, “Eye Tracking in Advanced Interface Design,” Virtual Environments and Advanced Interface Design, 1995: 258-288. |
Office Action in European Appln. No. 18860556.2, dated Jun. 13, 2023, 5 pages. |
Office Action in Japanese Appln. No. 2020-514743, dated Jul. 15, 2022, 8 pages (with English translation). |
Tanriverdi et al., “Interacting with eye movements in virtual environments,” Paper, Presented at Proceedings of the ACM CHI 2000 Human Factors in Computing Systems Conference, The Hague, Netherlands, Apr. 1-6, 2000: 8 pages. |
Office Action in Korean Appln. No. 10-2020-7010604, dated Jul. 17, 2023, 22 pages (with English translation). |
Number | Date | Country | |
---|---|---|
20230341683 A1 | Oct 2023 | US |
Number | Date | Country | |
---|---|---|
62564024 | Sep 2017 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 17548900 | Dec 2021 | US |
Child | 18343497 | | US |
Parent | 16650800 | | US |
Child | 17548900 | | US |