This disclosure relates generally to the field of optics, and in particular but not exclusively, relates to near-to-eye optical systems.
A head mounted display (“HMD”) is a display device worn on or about the head. HMDs usually incorporate some sort of near-to-eye optical system to display an image within a few centimeters of the human eye. Single eye displays are referred to as monocular HMDs while dual eye displays are referred to as binocular HMDs. Some HMDs display only a computer generated image (“CGI”), while other types of HMDs are capable of superimposing CGI over a real-world view. This latter type of HMD is often referred to as augmented reality because the viewer's image of the world is augmented with an overlaying CGI, also referred to as a heads-up display (“HUD”).
HMDs have numerous practical and leisure applications. Aerospace applications permit a pilot to see vital flight control information without taking their eyes off the flight path. Military applications include tactical displays of maps and thermal imaging. Other application fields include video games, transportation, and telecommunications. Because the technology is still in its infancy, new practical and leisure applications are certain to emerge as it evolves; however, many of these applications are currently limited by the cost, size, field of view, and efficiency of the conventional optical systems used to implement existing HMDs.
Non-limiting and non-exhaustive embodiments of the invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
Embodiments of an apparatus, system and method for fabrication of a curved near-to-eye display are described herein. In the following description numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Curvatures may be selected that cause light 210, emitted normal from curved optical surface 200, to converge in front of eye 120 or behind eye 120. These offset convergence points result in tradeoffs between the field of view and the size of the eye box. Typically, if the image convergence point is positioned coincident with eye 120, a large field of view but a small eye box results; when the image convergence point is offset directly in front of or behind eye 120, a larger eye box results at the expense of the field of view. However, by using a curved as opposed to flat optical surface, an increased field of view and eye box can be achieved at the same time. A variety of different curvatures may be selected according to application and design goals.
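The tradeoff above can be illustrated with a short numeric sketch. The radius, chord width, and pupil offset below are illustrative assumptions, not dimensions from this disclosure: rays emitted normal to a spherical surface converge at its center of curvature, and moving the pupil off that point trades field of view for eye box size.

```python
import math

# Illustrative geometry (all dimensions assumed): rays emitted normal to a
# spherical surface of radius R converge at its center of curvature.
R = 30.0   # radius of curvature of the curved optical surface, mm
W = 40.0   # chord width of the display surface, mm

# Total field of view subtended by the converging bundle of normal rays.
fov_deg = math.degrees(2 * math.asin((W / 2) / R))

def eye_box_diameter(offset_mm: float) -> float:
    """Spot diameter of the ray bundle at a plane offset from the
    convergence point (similar triangles): a larger offset gives a larger
    eye box, but marginal rays miss the pupil, trimming the field of view."""
    return W * abs(offset_mm) / R

print(round(fov_deg, 1), round(eye_box_diameter(5.0), 2))
```

With these assumed numbers, a convergence point coincident with the pupil yields an ~84° field of view but a point-like eye box, while a 5 mm offset opens the eye box to several millimeters at the cost of marginal rays.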
Curved optical surface 200 may be implemented as a transmissive/emissive surface.
In one embodiment, sub-layers 310, 320, and 330 represent independent material layers that are laminated or otherwise fused together. In other embodiments, two or all three of the sub-layers are formed in different regions or layers of one or two substrate materials. Accordingly, sub-layers 310, 320, and 330 may be physically different material layers or relative regions/layers within a single material layer. For example, in one embodiment, sub-layer 310, which includes microlenses 315, may be formed on a frontside of a substrate layer while light emitting pixels 325 are formed in the interior or backside of the same substrate layer. Sub-layer 330 may then be disposed onto the backside of this substrate layer. In another embodiment, all three sub-layers may be built up consecutively, one on the other, via conventional fabrication techniques including deposition, etching, epitaxy, polymer spin-on, reflow, etc.
In one embodiment, curved optical structure 300 is partially or semi-transparent and substrate 305 is fabricated of a clear substrate material (e.g., polymer, glass, quartz, thermoplastic, etc.). The components of each sub-layer are made sufficiently small and the interstitial gaps are sufficiently large to result in a semi-transparent structure that passes external light 340. Substrate 305 may be flexible and therefore conformable to a selected shape and curvature using an appropriate frame assembly. Alternatively, substrate 305 may be rigid and its fixed curvature imparted during fabrication.
The array of light emitting pixels 325 is offset behind the array of microlenses 315. Each light emitting pixel 325 is aligned with a corresponding microlens 315 and positioned substantially at the focal distance of its associated microlens 315. Collectively, a light emitting pixel and microlens pair may be considered a pixel unit of curved optical structure 300. Although each light emitting pixel 325 emits non-collimated, divergent light 345, its associated microlens 315 collimates divergent light 345, which is output from the associated microlens 315 as collimated output light 347. While the light output from a given light emitting pixel 325 is collimated upon emission from its associated microlens 315, due to the curvature of curved optical structure 300, the output light of a given pixel unit may not be collimated relative to surrounding pixel units. However, since light 347 output from each pixel is collimated, each individual pixel will appear in focus and displayed at infinity. In one embodiment, the curvature of curved optical structure 300 is designed to substantially match arc 205 of eye 120, thereby increasing the eye box and field of view of output light 347 relative to a flat surface. In this manner, curved optical structure 300 is well suited for (but not limited to) near-to-eye applications, such as HMDs or HUDs.
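A thin-lens sketch shows why placing each pixel at its microlens's focal distance makes the pixel appear in focus at infinity. The focal length and pixel aperture below are assumed values for illustration, not dimensions from the disclosure.

```python
# Thin-lens sketch: a point source at the focal distance of a lens emerges
# collimated (virtual image at infinity). A pixel of finite width instead
# leaves a small residual divergence of roughly pixel_size / focal_length.
# Both values below are illustrative assumptions.
f_um = 50.0       # assumed microlens focal length, micrometers
pixel_um = 2.0    # assumed emitting aperture of one light emitting pixel

residual_mrad = (pixel_um / f_um) * 1000.0  # residual beam divergence, mrad
print(residual_mrad)
```

The smaller this residual divergence, the closer the virtual image sits to true infinity; a point emitter (pixel_um → 0) would emerge perfectly collimated.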
Sub-layer 310 may be fabricated via a variety of techniques. For example, a clear polymer material may be spun onto the surface of sub-layer 320, patterned into a checkerboard-like pattern, and then reflowed to allow surface tension of the fluid to gather the liquid polymer into a lens shape. Of course, other fabrication techniques may be used. Although the size may vary according to application, each microlens 315 may be approximately 100 to 200 μm in diameter. Larger or smaller microlenses 315 may be used.
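The reflow step can be sanity-checked with a back-of-envelope calculation; the pad height, pad diameter, and refractive index below are assumed values, not from the disclosure. The patterned polymer pad melts into a spherical cap of equal volume, and the cap's radius of curvature then sets a plano-convex focal length of approximately f = R/(n-1).

```python
d_um, h_um, n = 150.0, 10.0, 1.5   # assumed pad diameter, pad height, index
a = d_um / 2.0                     # base radius of the reflowed cap

# Reflow conserves volume: cylinder pi*a^2*h -> spherical cap of height t,
# V(t) = pi*t*(3*a^2 + t^2)/6. Solve t*(3*a^2 + t^2)/6 = a^2*h by bisection.
target = a * a * h_um
vol_err = lambda t: t * (3.0 * a * a + t * t) / 6.0 - target
lo, hi = 1e-6, d_um
for _ in range(100):
    mid = (lo + hi) / 2.0
    lo, hi = (mid, hi) if vol_err(mid) < 0.0 else (lo, mid)
t = (lo + hi) / 2.0

R = (a * a + t * t) / (2.0 * t)    # radius of curvature of the reflowed cap
focal = R / (n - 1.0)              # thin plano-convex lens: f = R / (n - 1)
print(round(t, 1), round(R), round(focal))
```

Under these assumptions the focal length lands in the tens-to-hundreds of micrometers, which is consistent with the sub-layer spacings discussed elsewhere in this description; real reflow lenses also depend on contact angle and wetting, which this volume-only sketch ignores.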
Sub-layer 320 may be implemented using a variety of different light emitting technologies integrated onto a transparent or semi-transparent substrate. For example, sub-layer 320 may be formed of a sheet of transparent organic light emitting diodes (“OLEDs”). Sub-layer 330 includes control circuitry for controlling light emitting pixels 325. For example, sub-layer 330 may be a transparent (or partially transparent) thin film transistor layer for active matrix addressing of light emitting pixels 325.
The two curved optical structures 300 are secured into an eye glass arrangement that can be worn on a head 375 of a user. The left and right ear assemblies 365 and 370 rest over the user's ears while nose assembly 360 rests over the user's nose. The frame assembly is shaped and sized to position each curved optical structure 300 in front of a corresponding eye 120 of the user with the sub-layer 310 facing eyes 120. Of course, other frame assemblies may be used (e.g., single, contiguous visor assembly for both eyes).
The illustrated embodiment is capable of displaying an augmented reality to the user. As such, each curved optical structure 300 is partially transparent and permits the user to see a real world image via external light 340. Left and right (binocular embodiment) CGIs may be generated by an image processor (not illustrated) coupled to drive the array of light emitting pixels 325 via the control circuitry within sub-layer 330. Left and right CGIs are then output by the left and right curved optical structures 300. Since the output light of each pixel unit is collimated and the curvature of curved optical structure 300 is partially or substantially matched to arc 205 of eyes 120, the CGIs are virtually projected at infinity and delivered to eyes 120 with a large eye box and field of view. Although the human eye is typically incapable of bringing objects within a few centimeters into focus, since the output light is virtually displayed at infinity it is readily in focus. The CGIs are seen by the user as virtual images superimposed over the real world as an augmented reality.
In one embodiment, sub-layers 410, 420, and 430 represent independent material layers that are laminated or otherwise fused together. In other embodiments, two or all three of the sub-layers are formed in different regions or layers of one or two substrate materials. Accordingly, sub-layers 410, 420, and 430 may be physically different material layers or relative regions/layers within a single material layer. For example, in one embodiment, sub-layer 410, which includes microlenses 415, may be formed on a frontside of a substrate layer while light scattering centers 425 are formed in the interior or on the backside of the same substrate layer. Sub-layer 430 may then be disposed onto the backside of this substrate layer. In another embodiment, all three sub-layers may be built up consecutively, one on the other, via conventional fabrication techniques including deposition, etching, epitaxy, polymer spin-on, reflow, etc.
In one embodiment, curved optical structure 400 is partially or semi-transparent and substrate 405 is fabricated of a clear substrate material (e.g., polymer, glass, quartz, thermoplastic, etc.). The components of each sub-layer are made sufficiently small and the interstitial gaps are sufficiently large to result in a semi-transparent structure that passes external light 340. Substrate 405 may be flexible and therefore conformable to a selected shape and curvature using an appropriate frame assembly. Alternatively, substrate 405 may be rigid and its fixed curvature imparted during fabrication.
The array of scattering centers 425 is offset behind the array of microlenses 415. Each scattering center 425 is aligned with a corresponding microlens 415 and positioned substantially at the focal point of its associated microlens 415. Each opaque structure 435 is positioned behind a corresponding scattering center 425 and microlens 415. Collectively, a microlens, scattering center, and opaque structure are referred to herein as a pixel unit 440 of curved optical structure 400. Opaque structures 435 are positioned to block external light 340 from interfering with their associated pixel unit 440.
During operation, an image source 450 illuminates the backside of curved optical structure 400 facing eye 120 with a CGI 455. In one embodiment, CGI 455 is made up of collimated light rays. Microlenses 415 are large enough (e.g., relative to the wavelengths of CGI 455) that the light rays pass through microlenses 415 without significant scattering. However, scattering centers 425 residing behind microlenses 415 are purposely designed to be small in order to cause scattering of the incident light rays of CGI 455. In one embodiment, scattering centers 425 are reflective regions that are substantially spherical in shape. Thus, when the backside of curved optical structure 400 is illuminated by image source 450, scattering centers 425 appear to “glow” by scattering the incident light of CGI 455 in a variety of directions. Since the scattering centers 425 are positioned at the focal points of microlenses 415, this scattered and divergent light is re-collimated and emitted towards eye 120 as output light 450. Opaque structures 435 block a significant portion of external light 340 from reaching microlenses 415, thereby reducing washout of the CGI light.
While each scattering center 425 generates non-collimated, divergent light 452, its associated microlens 415 re-collimates divergent light 452, which is output from the associated microlens 415 as collimated output light 450. While the light output from a given pixel unit 440 is collimated, due to the curvature of curved optical structure 400, the output light of a given pixel unit 440 may not be collimated relative to surrounding pixel units 440. However, since light 450 output from each pixel unit 440 is collimated, each individual pixel will appear in focus and displayed at infinity. In one embodiment, the curvature of curved optical structure 400 may be designed to partially or substantially match arc 205 of eye 120, thereby increasing the eye box and field of view of output light 450 relative to a flat surface. In this manner, curved optical structure 400 is also well suited for (but not limited to) near-to-eye applications, such as HMDs or HUDs.
Sub-layer 410 may be fabricated via a variety of techniques. For example, a clear polymer material may be spun onto the surface of sub-layer 420, patterned into a checkerboard-like pattern, and then reflowed to allow surface tension of the fluid to gather the liquid polymer into a lens shape. Of course, other fabrication techniques may be used. Although the size may vary according to application, each microlens 415 may be approximately 100 to 200 μm in diameter. Larger or smaller microlenses 415 may be used, so long as microlenses 415 are not so small relative to the wavelengths of CGI 455 as to cause significant scattering.
Sub-layer 420 may be implemented by forming small reflective spherical elements within a transparent or partially transparent substrate layer (e.g., clear polymer, thermoplastic, quartz, glass, etc.). For example, scattering centers 425 may be metallic elements with a curved surface, such as spheres, partial spheres, 3-dimensional (“3D”) ellipses, partial 3D ellipses, irregular shapes, etc. Scattering centers 425 should be small enough relative to the wavelengths of CGI 455 to purposely cause scattering, and should be offset from microlenses 415 so as to be positioned approximately at their focal points. For example, scattering centers 425 may have a diameter of approximately 1-2 μm and be offset from microlenses 415 by approximately 50 μm. It should be appreciated that these are merely approximate and representative dimensions. Other dimensions may be used.
The two curved optical structures 400 are secured into an eye glass arrangement that can be worn on a head 375 of a user. The left and right ear assemblies 610 and 615 rest over the user's ears while nose assembly 605 rests over the user's nose. The frame assembly is shaped and sized to position each curved optical structure 400 in front of a corresponding eye 120 of the user with the sub-layer 410 facing eyes 120. Of course, other frame assemblies may be used (e.g., single, contiguous visor for both eyes).
The illustrated embodiment is capable of displaying an augmented reality to the user. As such, each curved optical structure 400 is partially transparent and permits the user to see a real world image via external light 340. Left and right (binocular embodiment) CGIs 455 may be generated by an image processor (not illustrated) coupled to drive the image sources 450. Left and right CGIs 455 are then scattered, re-collimated, and output by the left and right curved optical structures 400 towards the eyes 120. Since the output light of each pixel unit is collimated and the curvature of curved optical structure 400 is partially or substantially matched to arc 205 of eyes 120, the CGIs 455 are virtually projected at infinity and delivered to eyes 120 with a large eye box and field of view. Although the human eye is typically incapable of bringing objects within a few centimeters into focus, since the output light is virtually displayed at infinity it is readily in focus. The CGIs 455 are seen by the user as virtual images superimposed over the real world as an augmented reality.
The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.