The present invention relates generally to night vision systems and, more particularly but not exclusively, to night vision systems that are made compact by the design of one or both of the optical system and the light detector module.
Existing night vision systems, which include optics, a detector, a display, and a power supply, are bulky, because state-of-the-art devices are optimized for resolution and sensitivity rather than for size and weight. Three types of night vision technology are currently deployed. The first uses an image intensifier, which amplifies light approximately 30,000 to 1,000,000 times, usually over the wavelength range of 0.4 to 1 micron. Since moonlight or starlight is usually present at night, a scene can be made visible provided there is enough amplification with a high signal-to-noise ratio. The image intensifier includes a photocathode, one or more microchannel plates, and a phosphor screen that emits green light, to which the human eye is most sensitive. Light is absorbed by the photocathode and converted to electrons, which are amplified by the microchannel plate. The electrons are then accelerated toward the phosphor screen at high voltage (~600-900 V), resulting in the generation of more light. In a typical night vision system, the image intensifier is sandwiched between the imaging optics and the eyepiece, resulting in a bulky tubular structure. The image intensifier can also be coupled directly to a CCD detector using intermediate optics or a fiber bundle.
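The amplification chain just described (photocathode, microchannel plate, phosphor screen) can be sketched as a product of stage gains. This is a back-of-envelope illustration only; the function name and every numeric value below are assumptions chosen for the example, not figures from this disclosure:

```python
# Back-of-envelope model of an image intensifier chain (illustrative only):
# photocathode quantum efficiency -> MCP electron gain -> phosphor screen
# photons emitted per accelerated electron.
def intensifier_photon_gain(qe, mcp_gain, phosphor_photons_per_electron):
    """Approximate photons out per photon in for the whole chain."""
    return qe * mcp_gain * phosphor_photons_per_electron

# Assumed example values: 20% QE, 10^4 MCP gain, ~20 photons per electron.
gain = intensifier_photon_gain(qe=0.20, mcp_gain=1e4, phosphor_photons_per_electron=20)
print(f"overall photon gain ~ {gain:,.0f}x")
```

With these assumed values the chain lands at a gain of a few tens of thousands, consistent with the ~30,000 to 1,000,000 times amplification range cited above.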
A second night vision technology is active illumination at near infrared wavelengths (~0.7 to 1 micron) combined with detection by a conventional silicon-based focal plane array, such as a CCD or CMOS sensor. The technique is used extensively indoors. For outdoor applications, the viewing range is limited by the intensity and directionality of the illumination source. A third technology is thermal imaging. Objects at ambient temperature emit long-wavelength infrared radiation at 7.5 to 15 microns. The radiation can be detected using InSb, InGaAs, HgCdTe, quantum well infrared photodetector (QWIP), or microbolometer focal plane arrays. In many applications, the thermal image is combined with a visible image acquired using a conventional silicon focal plane array to provide a thermal map of an object or scene.
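To see why the 7.5-15 micron band is the one used for ambient-temperature objects, one can integrate Planck's law over that band. The sketch below (with assumed helper names and a simple midpoint-rule integration) compares the long-wave infrared radiance of a ~300 K blackbody with its negligible visible-band radiance:

```python
import math

# Integrate Planck's spectral radiance for a ~300 K blackbody over the
# 7.5-15 micron LWIR band, versus the visible band. SI constants.
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance B(lambda, T) in W / (m^2 * sr * m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.expm1(H * C / (wavelength_m * KB * temp_k))
    return a / b

def band_radiance(lo_um, hi_um, temp_k, steps=1000):
    """Midpoint-rule integral of radiance over [lo_um, hi_um], W/(m^2*sr)."""
    lo, hi = lo_um * 1e-6, hi_um * 1e-6
    dl = (hi - lo) / steps
    return sum(planck_radiance(lo + (i + 0.5) * dl, temp_k) * dl for i in range(steps))

lwir = band_radiance(7.5, 15.0, 300.0)
vis = band_radiance(0.4, 0.7, 300.0)
print(f"300 K radiance, 7.5-15 um: {lwir:.1f} W/(m^2*sr)")
print(f"300 K radiance, 0.4-0.7 um: {vis:.3e} W/(m^2*sr)")  # vanishingly small
```

Roughly half of a 300 K blackbody's total emission falls in the 7.5-15 micron band, while the visible-band emission is essentially zero, which is why thermal imagers operate in the LWIR.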
In one of its aspects, the present invention provides lightweight, compact, portable night vision systems with a direct line of sight, utilizing freeform optics and compact integration of detectors and displays. For example, one particularly useful optical design of the present invention incorporates wedge-shaped lenses having freeform surfaces in both the collection optics and the display optics. In this regard, exemplary designs of the present invention may include a wedge-shaped, freeform objective as the collection optics and a wedge-shaped, freeform eyepiece as the display optics. The collection optics of the present invention may capture an image of a scene for amplification by an image intensifier. The image intensifier may electrically communicate, through appropriate electronics such as a computer, with the microdisplay for displaying the intensified image. As used herein, the term “electrically communicate” is defined to include both wired and wireless communication and combinations thereof. The appropriate electronics, such as a computer, may be a physically separate unit from the microdisplay and image intensifier, or may be incorporated as part of one or both of the image intensifier and the microdisplay. The display optics of the present invention may be configured to form an image of the microdisplay at an eye pupil for viewing by a user.
For instance, in one exemplary configuration the present invention may provide a compact night vision system including an imaging device configured to receive light from a scene and a microdisplay disposed in electrical communication with the imaging device for generating an image to be viewed by a user. The image may be relayed to the user by a wedge-shaped, freeform eyepiece which has a first freeform surface positioned to receive light from the microdisplay, a second freeform surface configured to receive the light transmitted into the body of the eyepiece from the first freeform surface and configured to reflect the received light at the second surface, and a third freeform surface configured to receive the light reflected by the second freeform surface and configured to transmit the light out of the eyepiece. The first, second, and third freeform surfaces of the eyepiece may be positioned to provide the wedge-shape to the eyepiece.
In one desirable exemplary configuration, the imaging device may be a camera, such as a thermal imaging camera. Alternatively, the imaging device may include a wedge-shaped, freeform objective and an image intensifier positioned to receive light transmitted out of the objective. The image intensifier may be disposed in electrical communication with the microdisplay. The freeform objective may include a first freeform surface configured to receive light from a scene, a second freeform surface configured to receive the light transmitted into the body of the objective from the first freeform surface of the objective and configured to reflect the received light at the second surface of the objective, and a third freeform surface configured to receive the light reflected by the second freeform surface of the objective and configured to transmit the light out of the objective for delivery to the image intensifier. The first, second and third freeform surfaces of the objective may be positioned to provide the wedge-shape to the objective.
In another of its aspects, the present invention may provide a light detector module, which may be particularly compact, and therefore well-suited to night vision systems. The detector module may include a photocathode, a microchannel plate (MCP), a lens array, and a detector array.
In one desirable configuration, the light detector module may include, from a first end to a second end: a photocathode layer, a microchannel plate, a lens array, and a detector array, where each element of the detector array is disposed in registration with a respective lens of the lens array. The lens array may include a plurality of microlenses coated with a phosphor layer and may include a barrier structure disposed between two adjacent microlenses of the microlens array. In addition, the lens array may include an Einzel lens array, and the detector array may include a Faraday cup array, a delta-doped CCD, an electrometer array, or a focal plane array.
The foregoing summary and the following detailed description of exemplary embodiments of the present invention may be further understood when read in conjunction with the appended drawings, in which:
Referring now to the figures, wherein like elements are numbered alike throughout, the present invention may provide particularly compact and portable night vision systems, where the compact and portable features are due in part to the design of one or both of the optical system 100 and the light detector module 400. For example, in one of its aspects the present invention may provide an optical system layout that is compact due to the inclusion of freeform optics, such as a wedge-shaped, freeform lens 110, 120 (e.g., a prism-lens).
In exemplary configurations of the present invention as illustrated in
In the capture path 10, photons from a scene may pass through the objective stop and reach the detector (e.g., the image intensifier 20) through consecutive refraction and reflections by the prism-like objective 120. A principal advantage of the exemplary configurations of
Turning to
In yet another exemplary configuration in accordance with the present invention, as schematically illustrated in
An optical prescription of the exemplary freeform eyepiece 110 is provided below. Surface S1 may be an anamorphic aspheric surface, with sag defined by:

$$z=\frac{c_{x}x^{2}+c_{y}y^{2}}{1+\sqrt{1-(1+K_{x})c_{x}^{2}x^{2}-(1+K_{y})c_{y}^{2}y^{2}}}+AR\big[(1-AP)x^{2}+(1+AP)y^{2}\big]^{2}+BR\big[(1-BP)x^{2}+(1+BP)y^{2}\big]^{3}+CR\big[(1-CP)x^{2}+(1+CP)y^{2}\big]^{4}+DR\big[(1-DP)x^{2}+(1+DP)y^{2}\big]^{5}$$
where z is the sag of the freeform surface measured along the z-axis of a local x, y, z coordinate system, cx and cy are the vertex curvature in x and y axes, respectively, Kx and Ky are the conic constant in x and y axes, respectively, AR, BR, CR and DR are the rotationally symmetric portion of the 4th, 6th, 8th, and 10th order deformation from the conic, AP, BP, CP, and DP are the non-rotationally symmetric components of the 4th, 6th, 8th, and 10th order deformation from the conic.
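A minimal numerical sketch of evaluating this anamorphic aspheric sag follows; the function name and all coefficient values are hypothetical placeholders, not the actual optical prescription of this disclosure:

```python
import math

# Sag of an anamorphic asphere: biconic base plus 4th-10th order
# deformation terms, per the equation above. Coefficients are placeholders.
def aas_sag(x, y, cx, cy, Kx, Ky, AR=0.0, BR=0.0, CR=0.0, DR=0.0,
            AP=0.0, BP=0.0, CP=0.0, DP=0.0):
    base = (cx * x**2 + cy * y**2) / (
        1.0 + math.sqrt(1.0 - (1 + Kx) * cx**2 * x**2 - (1 + Ky) * cy**2 * y**2))
    u = lambda p: (1 - p) * x**2 + (1 + p) * y**2   # deformation argument
    deform = AR * u(AP)**2 + BR * u(BP)**3 + CR * u(CP)**4 + DR * u(DP)**5
    return base + deform

# Sanity check: with cx == cy, Kx == Ky == 0, and no deformation terms,
# the surface reduces to an ordinary conic of curvature c.
print(aas_sag(1.0, 2.0, cx=0.01, cy=0.01, Kx=0.0, Ky=0.0))
```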
Surface S2 of the freeform eyepiece 110 may be an XY polynomial surface defined by:

$$z=\frac{cr^{2}}{1+\sqrt{1-(1+k)c^{2}r^{2}}}+\sum_{j}C_{j}x^{m}y^{n}$$
where z is the sag of the freeform surface measured along the z-axis of a local x, y, z coordinate system, c is the vertex curvature (CUY), r is the radial distance, k is the conic constant, and Cj is the coefficient for x^m y^n.
Surface S3 may be an aspheric surface with a rotationally symmetric kinoform diffractive optical element, with the sag of the aspheric surface defined by:

$$z=\frac{cr^{2}}{1+\sqrt{1-(1+k)c^{2}r^{2}}}+Ar^{4}+Br^{6}+Cr^{8}+Dr^{10}+Er^{12}+Fr^{14}+Gr^{16}+Hr^{18}+Jr^{20}$$
where z is the sag of the surface measured along the z-axis of a local x, y, z coordinate system, c is the vertex curvature, k is the conic constant, A through J are the 4th, 6th, 8th, 10th, 12th, 14th, 16th, 18th, and 20th order deformation coefficients, respectively.
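This rotationally symmetric sag can likewise be sketched in a few lines. Only the refractive sag is modeled (the kinoform diffractive phase profile is omitted), and the coefficient values are illustrative assumptions rather than values from the disclosure:

```python
import math

# Even-asphere sag: conic base plus r^4 through r^20 deformation terms,
# per the equation above (the coefficient letters skip 'I', as is
# conventional). Coefficient values are placeholders.
def asphere_sag(r, c, k, A=0.0, B=0.0, C=0.0, D=0.0, E=0.0,
                F=0.0, G=0.0, H=0.0, J=0.0):
    base = c * r**2 / (1.0 + math.sqrt(1.0 - (1 + k) * c**2 * r**2))
    coeffs = (A, B, C, D, E, F, G, H, J)            # 4th..20th order
    deform = sum(a * r**(4 + 2 * i) for i, a in enumerate(coeffs))
    return base + deform

# k = -1 gives a parabolic base (sag = c*r^2/2) plus the A*r^4 term.
print(asphere_sag(2.0, c=0.05, k=-1.0, A=1e-5))
```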
For the freeform objective 120, each of the freeform surfaces S4 through S6 may be an XY polynomial surface defined by:

$$z=\frac{cr^{2}}{1+\sqrt{1-(1+k)c^{2}r^{2}}}+\sum_{j}C_{j}x^{m}y^{n}$$
where z is the sag of the freeform surface measured along the z-axis of a local x, y, z coordinate system, c is the vertex curvature (CUY), r is the radial distance, k is the conic constant, and Cj is the coefficient for x^m y^n. The optical prescriptions for these surfaces (S4 through S6) are listed in Table 4, while the surface decenters with respect to the global origin, which coincides with the center of the eye box, are listed in Table 5.
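The XY polynomial surfaces can be evaluated the same way. The exponent-to-index mapping j = [(m+n)² + m + 3n]/2 + 1 used below is the common optical-design-software convention for XY polynomial surfaces; it is an assumption here, since the excerpt does not state the mapping, and the coefficient values are placeholders rather than the Table 4 prescription:

```python
import math

# Assumed index convention for the C_j coefficients of an XY polynomial
# surface; enumerates monomials up to 10th order as j = 1..66.
def xy_index(m, n):
    return ((m + n) ** 2 + m + 3 * n) // 2 + 1      # always an integer

def xyp_sag(x, y, c, k, coeffs):
    """Conic base in r = sqrt(x^2 + y^2) plus sum of C_j * x^m * y^n."""
    r2 = x * x + y * y
    base = c * r2 / (1.0 + math.sqrt(1.0 - (1 + k) * c * c * r2))
    poly = sum(coeffs.get(xy_index(m, n), 0.0) * x**m * y**n
               for m in range(11) for n in range(11) if m + n <= 10)
    return base + poly

print(xy_index(0, 0), xy_index(1, 0), xy_index(0, 1), xy_index(0, 10))  # 1 2 3 66
```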
In another of its aspects, the present invention provides a light detector module 400, which may be particularly compact, and therefore well-suited to night vision systems. Light detector modules of the present invention are expected to provide a space-saving advantage over conventional systems, which typically use a fiber bundle array or relay optics to couple an image intensifier 20 to a CCD. The use of a fiber bundle array or relay optics requires additional separation between the image intensifier and the CCD, leading to an undesirable increase in size.
Exemplary light detector modules of the present invention can effect night vision imaging by measuring visible and near infrared photons using an image intensifier in combination with a silicon-based focal plane array or delta-doped CCD. In particular, in an exemplary configuration in accordance with the present invention, the light detector module 400 may include a photocathode 410, a microchannel plate (MCP) 420, a lens array 430, and a detector array 440.
Turning to the photocathode 410 in more detail, the photocathode 410 converts incident light into electrons by the photoelectric effect. The quality of the photocathode 410 may be characterized by the quantum efficiency (QE), which is defined to be the percentage of incident photons that are converted to electrons. QE is generally wavelength dependent. Depending on the required spectral sensitivity, different photocathode materials can be used. Examples of suitable photocathode materials for use in the present invention include alkali, multi-alkali alloys (lithium, sodium, potassium, rubidium, cesium, antimony, silver) and semiconductor (GaAs, GaAsP, InGaAs, Cs—Te).
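The QE definition above can be illustrated by converting incident optical power at a single wavelength into photocurrent. The power, wavelength, and QE values below are assumed examples, not measured photocathode data:

```python
# Photocurrent from a photocathode, per the QE definition above
# (QE = fraction of incident photons converted to electrons). SI constants.
PLANCK = 6.62607015e-34       # J*s
C_LIGHT = 2.99792458e8        # m/s
Q_ELECTRON = 1.602176634e-19  # C

def photocathode_current(power_w, wavelength_m, qe):
    photon_energy = PLANCK * C_LIGHT / wavelength_m   # J per photon
    photons_per_s = power_w / photon_energy
    return photons_per_s * qe * Q_ELECTRON            # amperes

# Assumed example: 1 nW at 850 nm on a 15%-QE photocathode.
i = photocathode_current(1e-9, 850e-9, 0.15)
print(f"photocurrent ~ {i:.2e} A")
```

Doubling the QE doubles the photocurrent at fixed power and wavelength, which is why QE is the figure of merit for photocathode material selection.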
The MCP 420 may be positioned to receive electrons created by the photocathode 410. The MCP 420 then amplifies the electron signal, usually by more than 10^4 times. The MCP 420 may include a thin metal oxide coating to prevent ion feedback to the photocathode 410. Suitable MCPs for use in light detector modules of the present invention include the LongLife™ MCP (Photonis, Sturbridge, Mass.) and the F1551 MCP (Hamamatsu Photonics, Bridgewater, N.J.).
After the electrons received by the MCP 420 are amplified, the resulting electrons may be accelerated by a constant voltage and subsequently collide with a phosphor material. In one exemplary configuration of the present invention, the phosphor material may be provided as a phosphor layer 432 on the microlenses 434.
Turning to the lens array 430 in more detail, in one particularly useful configuration of the present invention, the lens array 430 may include a plurality of microlenses 434, each of which may be coated with a phosphor layer 432.
The lens array 430 may be made of glass or polymer using techniques such as resist reflow, gray scale lithography, embossing, and casting. The material of the lens array 430 may desirably have low optical loss at the emission wavelength of the phosphor. A barrier structure 436 may also be provided on the lens array 430, which may include a conducting material, such as metal or semiconductor. The barrier structure 436 may remove excess charge build up on the phosphor layer 432 and separate the phosphor layer 432 into different spatial regions, such that light emitted from each spatial region is collected mainly into the single lens 434 adjacent the spatial region and respective pixel of the focal plane array 440. The barrier structure 436 may reduce pixel cross talk, by preventing light emitted from neighboring phosphor spatial regions from reaching the same pixel. The barrier structure 436 may be fabricated by conventional microfabrication techniques such as photolithography, sputtering, and etching.
In one exemplary configuration of the detector array of the present invention, the lens array 430 may be fabricated directly on top of the focal plane array 440, with a separate transparent substrate 435, phosphor layer 432, and a barrier structure 436 mounted on top. In another configuration, the detector array 440 may contain the barrier structure 436, the phosphor layer 432, and the lens array 430 directly fabricated on top of the focal plane array 440, which may be fabricated using conventional microfabrication techniques.
In another exemplary configuration of a light detector module in accordance with the present invention, the detector module 500 may include a photocathode 510, a microchannel plate 520, and a micro Faraday cup array 540.
In yet another exemplary configuration of a light detector module in accordance with the present invention, the detector module 600 may include a photocathode 610, a microchannel plate 620, and a delta-doped CCD 630.
Moreover, any of the light detector modules disclosed herein, such as those illustrated in
These and other advantages of the present invention will be apparent to those skilled in the art from the foregoing specification. Accordingly, it will be recognized by those skilled in the art that changes or modifications may be made to the above-described embodiments without departing from the broad inventive concepts of the invention. It should therefore be understood that this invention is not limited to the particular embodiments described herein, but is intended to include all changes and modifications that are within the scope and spirit of the invention as set forth in the claims.
A number of patent and non-patent publications are cited herein; the entire disclosure of each of these publications is incorporated herein by reference.
This application is a continuation of U.S. application Ser. No. 16/217,158, filed on Dec. 12, 2018, which is a divisional application of U.S. application Ser. No. 15/017,763, filed on Feb. 8, 2016, which claims the benefit of priority of U.S. Provisional Application No. 62/113,656, filed on Feb. 9, 2015, the entire contents of which application(s) are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3632184 | King | Jan 1972 | A |
3992084 | Nakamura | Nov 1976 | A |
4468101 | Ellis | Aug 1984 | A |
4669810 | Wood | Jun 1987 | A |
4753522 | Nishina | Jun 1988 | A |
4863251 | Herloski | Sep 1989 | A |
5109469 | Duggan | Apr 1992 | A |
5172272 | Aoki | Dec 1992 | A |
5172275 | Dejager | Dec 1992 | A |
5416315 | Filipovich | May 1995 | A |
5436763 | Chen | Jul 1995 | A |
5526183 | Chen | Jun 1996 | A |
5572229 | Fisher | Nov 1996 | A |
5621572 | Fergason | Apr 1997 | A |
5625495 | Moskovich | Apr 1997 | A |
5699194 | Takahashi | Dec 1997 | A |
5701202 | Takahashi | Dec 1997 | A |
5706136 | Okuyama | Jan 1998 | A |
5818632 | Stephenson | Oct 1998 | A |
5880711 | Tamada | Mar 1999 | A |
5880888 | Schoenmakers | Mar 1999 | A |
5917656 | Hayakawa | Jun 1999 | A |
5959780 | Togino | Sep 1999 | A |
6008781 | Furness | Dec 1999 | A |
6023373 | Inoguchi | Feb 2000 | A |
6028606 | Kolb | Feb 2000 | A |
6034823 | Togino | Mar 2000 | A |
6198577 | Kedar | Mar 2001 | B1 |
6201646 | Togino | Mar 2001 | B1 |
6236521 | Nanba | May 2001 | B1 |
6239915 | Takagi | May 2001 | B1 |
6243199 | Hansen | Jun 2001 | B1 |
6271972 | Kedar | Aug 2001 | B1 |
6384983 | Yamazaki | May 2002 | B1 |
6396639 | Togino | May 2002 | B1 |
6404561 | Isono | Jun 2002 | B1 |
6404562 | Ota | Jun 2002 | B1 |
6433376 | Kim | Aug 2002 | B2 |
6433760 | Vaissie | Aug 2002 | B1 |
6493146 | Inoguchi | Dec 2002 | B2 |
6510006 | Togino | Jan 2003 | B1 |
6563648 | Gleckman | May 2003 | B2 |
6646811 | Inoguchi | Nov 2003 | B2 |
6653989 | Nakanishi | Nov 2003 | B2 |
6671099 | Nagata | Dec 2003 | B2 |
6731434 | Hua | May 2004 | B1 |
6829113 | Togino | Dec 2004 | B2 |
6963454 | Martins | Nov 2005 | B1 |
6999239 | Martins | Feb 2006 | B1 |
7152977 | Ruda | Dec 2006 | B2 |
7177083 | Holler | Feb 2007 | B2 |
7230583 | Tidwell | Jun 2007 | B2 |
7249853 | Weller-Brophy | Jul 2007 | B2 |
7405881 | Shimizu | Jul 2008 | B2 |
7414791 | Urakawa | Aug 2008 | B2 |
7522344 | Curatu | Apr 2009 | B1 |
8467133 | Miller | Jun 2013 | B2 |
8503087 | Amirparviz | Aug 2013 | B1 |
8511827 | Hua | Aug 2013 | B2 |
9201193 | Smith | Dec 2015 | B1 |
9239453 | Cheng | Jan 2016 | B2 |
9310591 | Hua | Apr 2016 | B2 |
9720232 | Hua | Aug 2017 | B2 |
9874760 | Hua | Jan 2018 | B2 |
10176961 | Hua | Jan 2019 | B2 |
20010009478 | Yamazaki | Jul 2001 | A1 |
20010048561 | Heacock | Dec 2001 | A1 |
20020015116 | Park | Feb 2002 | A1 |
20020060850 | Takeyama | May 2002 | A1 |
20020063913 | Nakamura | May 2002 | A1 |
20020067467 | Dorval | Jun 2002 | A1 |
20020114077 | Javidi | Aug 2002 | A1 |
20020181115 | Massof | Dec 2002 | A1 |
20030076591 | Ohmori | Apr 2003 | A1 |
20030090753 | Takeyama | May 2003 | A1 |
20040136097 | Park | Jul 2004 | A1 |
20040164927 | Suyama | Aug 2004 | A1 |
20040196213 | Tidwell | Oct 2004 | A1 |
20040218243 | Yamazaki | Nov 2004 | A1 |
20040233551 | Takahashi | Nov 2004 | A1 |
20050036119 | Ruda | Feb 2005 | A1 |
20050179868 | Seo | Aug 2005 | A1 |
20050248849 | Urey | Nov 2005 | A1 |
20060028400 | Lapstun | Feb 2006 | A1 |
20060119951 | McGuire | Jun 2006 | A1 |
20070109505 | Kubara | May 2007 | A1 |
20070246641 | Baun | Oct 2007 | A1 |
20080036853 | Shestak | Feb 2008 | A1 |
20080094720 | Yamazaki | Apr 2008 | A1 |
20080291531 | Heimer | Nov 2008 | A1 |
20090115842 | Saito | May 2009 | A1 |
20090168010 | Vinogradov | Jul 2009 | A1 |
20090256943 | Kondo | Oct 2009 | A1 |
20100091027 | Oyama | Apr 2010 | A1 |
20100109977 | Yamazaki | May 2010 | A1 |
20100208372 | Heimer | Aug 2010 | A1 |
20100271698 | Kessler | Oct 2010 | A1 |
20100289970 | Watanabe | Nov 2010 | A1 |
20110037951 | Hua | Feb 2011 | A1 |
20110043644 | Munger | Feb 2011 | A1 |
20110075257 | Hua | Mar 2011 | A1 |
20110090389 | Saito | Apr 2011 | A1 |
20110221656 | Haddick | Sep 2011 | A1 |
20120013988 | Hutchin | Jan 2012 | A1 |
20120019557 | Aronsson | Jan 2012 | A1 |
20120050891 | Seidl | Mar 2012 | A1 |
20120057129 | Durnell | Mar 2012 | A1 |
20120081800 | Cheng | Apr 2012 | A1 |
20120113092 | Bar-Zeev | May 2012 | A1 |
20120160302 | Citron | Jun 2012 | A1 |
20120162549 | Gao | Jun 2012 | A1 |
20120242697 | Border | Sep 2012 | A1 |
20120262802 | Huang | Oct 2012 | A1 |
20130100524 | Magarill | Apr 2013 | A1 |
20130112705 | McGill | May 2013 | A1 |
20130182317 | Takahashi | Jul 2013 | A1 |
20130187836 | Cheng | Jul 2013 | A1 |
20130222896 | Komatsu | Aug 2013 | A1 |
20130258461 | Sato | Oct 2013 | A1 |
20130285885 | Nowatzyk | Oct 2013 | A1 |
20130286053 | Fleck | Oct 2013 | A1 |
20130300634 | White | Nov 2013 | A1 |
20130329304 | Hua | Dec 2013 | A1 |
20140009845 | Cheng | Jan 2014 | A1 |
20140035959 | Lapstun | Feb 2014 | A1 |
20140049833 | Totani | Feb 2014 | A1 |
20140071539 | Gao | Mar 2014 | A1 |
20140300869 | Hirsch | Oct 2014 | A1 |
20140347361 | Alpaslan | Nov 2014 | A1 |
20140361957 | Hua | Dec 2014 | A1 |
20150168802 | Bohn | Jun 2015 | A1 |
20150177445 | Takagi | Jun 2015 | A1 |
20150201176 | Graziosi | Jul 2015 | A1 |
20150208061 | Yang | Jul 2015 | A1 |
20150212321 | Zhao | Jul 2015 | A1 |
20150277129 | Hua | Oct 2015 | A1 |
20150346495 | Welch | Dec 2015 | A1 |
20150363978 | Maimone | Dec 2015 | A1 |
20160011419 | Gao | Jan 2016 | A1 |
20160085075 | Cheng | Mar 2016 | A1 |
20160239985 | Haddick et al. | Aug 2016 | A1 |
20160320620 | Maimone | Nov 2016 | A1 |
20170078652 | Hua | Mar 2017 | A1 |
20170102545 | Hua | Apr 2017 | A1 |
20170202633 | Liu | Jul 2017 | A1 |
20180045949 | Hua | Feb 2018 | A1 |
Number | Date | Country |
---|---|---|
1252133 | May 2000 | CN |
101359089 | Feb 2009 | CN |
101424788 | May 2009 | CN |
0408344 | Jan 1991 | EP |
1102105 | May 2001 | EP |
2928034 | Aug 2009 | FR |
02200074 | Aug 1990 | JP |
H03101709 | Apr 1991 | JP |
08160345 | Jun 1996 | JP |
H09218375 | Aug 1997 | JP |
H09297282 | Nov 1997 | JP |
H1013861 | Jan 1998 | JP |
H10307263 | Nov 1998 | JP |
H11326820 | Nov 1999 | JP |
2000249974 | Sep 2000 | JP |
2001013446 | Jan 2001 | JP |
2001066543 | Mar 2001 | JP |
2001145127 | May 2001 | JP |
2001238229 | Aug 2001 | JP |
2002148559 | May 2002 | JP |
2003241100 | Aug 2003 | JP |
2006091333 | Apr 2006 | JP |
2006276884 | Oct 2006 | JP |
2007101930 | Apr 2007 | JP |
2010072188 | Apr 2010 | JP |
2014505381 | Feb 2014 | JP |
9923647 | May 1999 | WO |
2004079431 | Sep 2004 | WO |
2007002694 | Jan 2007 | WO |
2007085682 | Aug 2007 | WO |
2007002694 | Dec 2007 | WO |
2007140273 | Dec 2007 | WO |
2008089417 | Jul 2008 | WO |
2011134169 | Nov 2011 | WO |
2012064546 | May 2012 | WO |
2012118573 | Sep 2012 | WO |
2013112705 | Aug 2013 | WO |
2014062912 | Apr 2014 | WO |
2015134738 | Sep 2015 | WO |
2015134740 | Sep 2015 | WO |
2015184409 | Dec 2015 | WO |
2016033317 | Mar 2016 | WO |
2018052590 | Mar 2018 | WO |
Entry |
---|
US 9,207,443 B2, 12/2015, Cheng (withdrawn) |
US 9,213,186 B2, 12/2015, Cheng (withdrawn) |
US 9,880,387 B2, 01/2018, Hua (withdrawn) |
‘Fresnel Lenses’ downloaded from http://www.fresneltech.com on Jun. 8, 2011. Copyright Fresnel Technologies, Inc., 2003. |
Azuma, R., et al., ‘Recent advances in augmented reality’, IEEE Computer Graphics App;. 21, 34-47 (2001). |
Bajura, M., et al., “Merging virtual objects with the real world: seeing ultrasound imagery within the patient” in Proceedings of ACM SIGGRAPH (ACM, Chicago, 1992), pp. 203-210. |
Biocca, et al., “Virtual eyes can rearrange your body: adapting to visual displacement in see-through, head-mounted displays”, Presence: Teleoperators and Virtual Environments 7, 262-277 (1998). |
Bunkenburg, J. ‘Innovative Diffractive Eyepiece for Helmet-Mounted Display.’ SPIE vol. 3430. pp. 41-49 Jul. 1998. |
C. Curatu, H. Hua, and J. P. Rolland, “Projection-based headmounted display with eye-tracking capabilities,” Proc. SPIE 5875, 587050J (2005). |
Cakmakci, O., et al., ‘Head-Worn Displays: A Review’. Journal of Display Technology, vol. 2, No. 3, Sep. 2006, pp. 199-216. |
Caudell, T., et al., “Augmented reality: an application of heads-up display technology to manual manufacturing processes” in Proceedings of Hawaii International Conferences on Systems Sciences (Hawaii, 1992), pp. 659-669. |
Cruz-Neira et al., ‘Surround-Screen Projection-Based Virtual Reality: the Design and Implementation of the CAVE,’ Proceedings of the 20th Annual Conference on Computer Graphics and Interactive Techniques pp. 135-142, ACM SIGGRAPH, ACM Press (1993). |
Examination Report dated Apr. 29, 2011 from corresponding GB Application No. GB1012165.5. |
H. Hua, C. Gao, and J. P. Rolland, ‘Study of the Imaging properties of retroreflective materials used in head-mounted projective displays (HMPDs),’ Proc. SPIE4711, 194-201 (2002). |
H. Hua, C. Gao, F. Biocca, and J. P. Rolland, “An ultra-light and compact design and implementation of head-mounted projective displays,” in Proceedings of IEEE VR 2001, pp. 175-182. |
H. Hua, L. Brown, and C. Gao, “A new collaborative infrastructure: SCAPE,” in Proceedings of IEEE VR 2003 (IEEE, 2003), pp. 171-179. |
H. Hua, L. Brown, and C. Gao, “SCAPE: supporting stereoscopic collaboration in augmented and projective environments,” IEEE Comput. Graphics Appl. 24, 66-75 (2004). |
H. Hua, L. Brown, and C. Gao, “System and interface framework for SCAPE as a collaborative infrastructure,” Presence: Teleoperators and Virtual Environments 13, 234-250 (2004). |
H. Hua, Y. Ha, and J. P. Rolland, ‘Design of an ultra-light and compact projection lens,’ Appl. Opt. 42, 1-12 (2003), pp. 97-107. |
H. Hua., A. Girardot, C. Gao. J. P. Rolland. ‘Engineering of head-mounted projective displays’. Applied Optics. 39 (22), pp. 3814-3824. (2000). |
H. Hua and C. Gao, “A polarized head-mounted projective display,” in Proceedings of IEEE and ACM International Symposium on Mixed and Augmented Reality 2005 (IEEE, 2005), pp. 32-35. |
Hua et al., ‘Design of a Bright Polarized Head-Mounted Projection Display’ Applied Optics 46:2600-2610 (2007). |
International Search Report dated Mar. 9, 2009 with regard to International Patent Application No. PCT/ US2009/031606. |
J. L. Pezzaniti and R. A. Chipman, “Angular dependence of polarizing beam-splitter cubes,” Appl. Opt. 33, 1916-1929 (1994). |
J. P. Rolland, F. Biocca, F. Hamza-Lup, Y. Ha, and R. Martins, “Development of head-mounted projection displays for distributed, collaborative, augmented reality applications,” Presence: Teleoperators and Virtual Environments 14, 528-549 (2005). |
J. P. Rolland and Hong Hua. “Head-mounted display systems,” in Encyclopedia of Optical Engineering. R. Barry Johnson and Ronald O. Driggers, Eds, (2005). |
Krueerke, Daniel, “Speed May Give Ferroelectric LCOS Edge in Projection Race,” Display Devices Fall '05. Copyright 2005 Dempa Publications, Inc. pp. 29-31. |
L. Brown and H. Hua, “Magic lenses for augmented virtual environments,” IEEE Comput. Graphics Appl. 26, 64-73 (2006). |
L. Davis, J. P. Rolland, F. Hamza-Lup, Y. Ha, J. Norfleet, and C. Imielinska, ‘Enabling a continuum of virtual environment experiences,’ IEEE Comput. Graphics Appl. 23, pp. 10-12 Mar./Apr. 2003. |
M. Inami, N. Kawakami, and S. Tachi, ‘Optical camouflage using retro-reflective projection technology,’ in Proceedings of ISMAR 2003 {ISMAR, 2003). |
M. Inami, N. Kawakami, D. Sekiguchi, Y. Yanagida, T. Maeda, and S. Tachi, “Visuo-haptic display using head-mounted projector,” in Proceedings of IEEE Virtual Reality 2000, pp. 233-240. |
M. Robinson. J. Chen, and G. Sharp, Polarization Engineering for LCD Projection. John Wiley & Sons, Ltd. England, 2005. |
N. Kawakami, M. Inami, D. Sekiguchi, Y. Yangagida, T. Maeda, and S. Tachi, ‘Object-oriented displays: a new type of display systemsfrom immersive display to object-oriented displays,’ in Proceedings of IEEE SMC 1999, IEEE International Conference on Systems, Man, and Cybernetics, vol. 5, pp. 1066-1069. |
R. Azuma, A Survey of Augmented Reality in Presence; Teleoperators and Virtual Environments 6. 4, 355-385, (1997). |
R. Kijima, K. Haza, Y. Tada, and T. Ojika, “Distributed display approach using PHMD with infrared camera,” in Proceedings of IEEE 2002 Virtual Reality Annual International Symposium (IEEE, 2002), pp. 1-8. |
R. Kijima and T. Ojika, “Transition between virtual environment and workstation environment with projective headmounted display,” in Proceedings of IEEE VR 1997 (IEEE, 1997), pp. 130-137. |
R. Martins, V. Shaoulov, Y. Ha, and J. P. Rolland, “Projection based head-mounted displays for wearable computers,” Proc. SPIE 5442, 104-110 (2004). |
R. N. Berry, L. A. Riggs, and C. P. Duncan, “The relation of vernier and depth discriminations to field brightness,” J. Exp. Psychol. 40, 349-354 (1950). |
Rolland, J.P., et al., ‘Optical versus video see-through head mounted displays in medical visualization’, Presence' Teleoperators and Virtual Environments 9, 287-309 (2000). |
Winterbottom, M., et al., ‘Helmet-Mounted Displays for use in Air Force Training and Simulation’, Human Effectiveness Directorate, Nov. 2005, pp. 1-54. |
Written Opinion of the International Searching Authority dated Mar. 9, 2009 with regard to International Patent Application No. PCT/US2009/031606. |
Y. Ha, H. Hua, R. Martins, and J. P. Rolland, “Design of a wearable wide-angle projection color display,” in Proceedings of International Optical Design Conference 2002 (IODC, 2002), pp. 67-73. |
Zhang, R., “8.3: Design of a Compact Light Engine for FLCOS Microdisplays in a p-HMPD system”, Society for Information Display 2008 International Symposium, Seminar and Exhibition (SID2008), Los Angeles, CA, May 2008. |
Zhang, R., et al., “Design of a Polarized Head-Mounted Projection Display Using Ferroelectric Liquid-Crystal-on-Silicon Microdisplays”, Applied Optics, vol. 47, No. 15, May 20, 2008, pp. 2888-2896. |
Zhang, R., et al., “Design of a Polarized Head-Mounted Projection Display using FLCOS Microdisplays”, Proc. of SPIE vol. 6489, 64890B-1. (2007). |
“OLED-XL Microdisplays,” eMagin 5 pages (2010). |
A. Jones, I. McDowall, Yamada H., M. Bolas, P. Debevec, Rendering for an Interactive 360° Light Field Display ACM Transactions on Graphics (TOG)—Proceedings of ACM SIGGRAPH 2007, 26(3), 2007. |
A. Malmone, and H. Fuchs, “Computational augmented reality eyeglasses,” Proc. of ISMAR 2012. |
A. Castro, Y. Frauel, and B. Javidi, “Integral imaging with large depth of field using an asymmetric phase mask,” Journal of Optics Express, vol. 15, Issue 16, pp. 10266-10273 (Aug. 2007). |
A. T. Duchowski, “Incorporating the viewer's Point-Of-Regard (POR) in gaze-contingent virtual environments”, SPIE-Int. Soc. Opt. Eng. Proceedings of Spie—the International Society for Optical Engineering, vol. 3295, 1998, pp. 332-343. |
Akeley et al., “A Stereo Display Prototype with Multiple Focal Distances,” ACM Trans. Graphics 23:804-813 (2004). |
Blundell, B. G., and Schwarz, A. J., “The classification of volumetric display systems: characteristics and predictability of the image space,” IEEE Transaction on Visualization and Computer Graphics, 8(1), pp. 66-75, 2002. |
C. B. Burckhardt, “Optimum parameters and resolution limitation of integral photography,” J. Opt. Soc. Am. 58, 71-76 (1968). |
C. Manh Do, R. Mart□ ez-Cuenca, and B. Javidi, “Three-dimensional object-distortion-tolerant recognition for integral imaging using independent component analysis,” Journal of Optical Society of America A 26, issue 2, pp. 245-251 (Feb. 1, 2009). |
Chih-Wei Chen, Myungjin Cho, Yi-Pai Huang, and Bahram Javidi, “Improved viewing zones for projection type integral imaging 3D display using adaptive liquid crystal prism array,” IEEE Journal of Display Technology, 2014. |
Christopher M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, Inc. New York, NY 1995. |
Curatu, C., J.P. Rolland, and Hong Hua, “Dual purpose lens for an eye-tracked projection head-mounted display,” Proceedings of International Optical Design Conference, Vancouver, Canada, Jun. 2006. |
D. Cheng, Y.Wang, H. Hua, and M. M. Talha, Design of an optical see-through headmounted display with a low f-number and large field of view using a free-form prism, App. Opt. 48 (14), pp. 2655-2668, 2009. |
D. Cheng, Y. Wang, H. Hua, and M. M. Talha, “Design of an optical see-through head-mounted display with a low f-number and large field of view using a freeform prism,” Appl. Opt., 48(14):2655-2668, 2009. |
D. Cheng, Y. Wang, H. Hua, J. Sasian, “Design of a wide-angle, lightweight head-mounted display using free-form optics tiling,” Opt. Lett., 36(11):2098-100, 2011. |
D.M. Hoffman, A.R. Girshick, K. Akeley, and M.S. Banks, “Vergence-Accommodation Conflicts Hinder Visual Performance and Cause Visual Fatigue,” J. Vision, 8(3), 1-30, (2008). |
Davis et al., “Accommodation to Large Disparity Stereograms,” Journal of AAPOS 6:377-384 (2002). |
Downing et al., “A Three-Color, Solid-State, Three-Dimensional Display,” Science 273:1185-1189 (1996). |
Duchowski, A., “Eyetracking Methodology: theory and practice,” Publisher: Springer, 2003. |
Duchowski, A.T., and A. Coltekin, “Foveated gaze-contingent displays for peripheral LOD management, 3D visualization, and stereo imaging,” ACM Trans., on Mult. Comp., Comm., and App. 3, 1-21, (2007). |
Edgar et al., “Visual Accommodation Problems with Head-Up and Helmet-Mounted Displays?,” Displays 15:68-75 (1994). |
European Search Report dated Aug. 14, 2015 in corresponding EP application 13740989.2. |
F. Okano, H. Hoshino, J. Arai, and I. Yuyama, “Real-time pickup method for a three-dimensional image based on integral photography,” Appl. Opt. 36, 1598-1603 (1997). |
Favalora et al., “100 Million-Voxel Volumetric Display,” Proc. SPIE 4712:300-312 (2002). |
G. Wetzstein et al., “Tensor Displays: Compressive light field synthesis using multilayer displays with directional backlighting,” ACM Transactions on Graphics, 31(4), 2012. |
GB Examination Report corresponding to GB 1012165.5 dated Jun. 28, 2011. |
Geisler, W.S., J.S. Perry and J. Najemnik, “Visual search: The role of peripheral information measured using gaze-contingent displays,” J. Vision 6, 858-873 (2006). |
Graham-Rowe, “Liquid Lenses Make a Splash,” Nature Photonics, pp. 2-4 (2006). |
H. Hua, X. Hu, and C. Gao, “A high-resolution optical see-through head-mounted display with eyetracking capability,” Optics Express, Nov. 2013. |
H. Hua, “Sunglass-like displays become a reality with freeform optical technology,” SPIE Newsroom, 2012. |
H. Mukawa, K. Akutsu, I. Matsumura, S. Nakano, T. Yoshida, M. Kuwahara, and K. Aiki, A full-color eyewear display using planar waveguides with reflection volume holograms, J. Soc. Inf. Display 19 (3), pp. 185-193, 2009. |
H. Hoshi, N. Taniguchi, H. Morishima, T. Akiyama, S. Yamazaki and A. Okuyama, “Off-axial HMD optical system consisting of aspherical surfaces without rotational symmetry,” SPIE vol. 2653, 234 (1996). |
H. Hua, C. Pansing, and J.P. Rolland, “Modeling of an eye-imaging system for optimizing illumination schemes in an eye-tracked head-mounted display,” Appl. Opt., 46(31):7757-75, Oct. 2007. |
H. Hua, P. Krishnaswamy, and J.P. Rolland, ‘Video-based eyetracking methods and algorithms in head-mounted displays,’ Opt. Express, 14(10):4328-50, May 2006. |
Heanue et al., “Volume Holographic Storage and Retrieval of Digital Data,” Science 265:749-752 (1994). |
Hua, “Merging the Worlds of Atoms and Bits: Augmented Virtual Environments,” Optics and Photonics News 17:26-33 (2006). |
Hua, H. “Integration of eye tracking capability into optical see-through head-mounted displays,” Proceedings of SPIE (Electronic Imaging 2001), pp. 496-503, Jan. 2001. |
Hua et al, “Compact eyetracked optical see-through head-mounted display”, Proc. SPIE 8288, Stereoscopic Displays and Applications XXIII, 82881F (Feb. 9, 2012). |
Inoue et al., “Accommodative Responses to Stereoscopic Three-Dimensional Display,” Applied Optics, 36:4509-4515 (1997). |
International Search Report and Written Opinion dated Nov. 24, 2015 in corresponding PCT application PCT/US2015/047163. |
International Search Report dated Feb. 10, 2011 from PCT/CN2010/072376. |
International Search Report dated Jan. 29, 2014 in corresponding international application PCT/US2013/065422. |
International Search Report dated Jun. 18, 2010 in corresponding international application PCT/US2010/031799. |
J. Hong, S. Min, and B. Lee, “Integral floating display systems for augmented reality,” Applied Optics, 51(18):4201-9, 2012. |
J. S. Jang and B. Javidi, “Large depth-of-focus time-multiplexed three-dimensional integral imaging by use of lenslets with non-uniform focal lengths and aperture sizes,” Opt. Lett. vol. 28, pp. 1924-1926 (2003). |
J. Arai, et al., “Depth-control method for integral imaging,” Feb. 1, 2008 / vol. 33, No. 3 / Optics Letters. |
J. E. Melzer, ‘Overcoming the field-of-view/resolution invariant in head-mounted displays,’ Proc. SPIE vol. 3362, 1998, p. 284. |
J. G. Droessler, D. J. Rotier, “Tilted cat helmet-mounted display,” Opt. Eng., vol. 29, 849 (1990). |
J. P. Rolland, “Wide-angle, off-axis, see-through head-mounted display,” Opt. Eng., vol. 39, 1760 (2000). |
J. S. Jang, F. Jin, and B. Javidi, “Three-dimensional integral imaging with large depth of focus by use of real and virtual image fields,” Opt. Lett. 28:1421-23, 2003. |
J. Y. Son, W.H. Son, S.K. Kim, K.H. Lee, B. Javidi, “Three-Dimensional Imaging for Creating Real-World-Like Environments,” Proceedings of the IEEE, vol. 101, issue 1, pp. 190-205, Jan. 2013. |
Jisoo Hong, et al., “Three-dimensional display technologies of recent interest: Principles, Status, and Issues,” Applied Optics (Dec. 1, 2011) 50(34):106. |
K. Iwamoto, K. Tanie, T. T. Maeda, “A head-mounted eye movement tracking display and its image display method”, Systems & Computers in Japan, vol. 28, No. 7, Jun. 30, 1997, pp. 89-99. Publisher: Scripta Technica, USA. |
K. Iwamoto, S. Katsumata, K. Tanie, “An eye movement tracking type head mounted display for virtual reality system:—evaluation experiments of a prototype system”, Proceedings of 1994 IEEE International Conference on Systems, Man, and Cybernetics. Humans, Information and Technology (Cat. No. 94CH3571-5). IEEE. Part vol. 1, 1994, pp. 13-18 vol. 1. New York, NY, USA. |
Kuiper et al., “Variable-Focus Liquid Lens for Miniature Cameras,” Applied Physics Letters 85:1128-1130 (2004). |
Kuribayashi, et al., “A Method for Reproducing Apparent Continuous Depth in a Stereoscopic Display Using “Depth-Fused 3D” Technology” Journal of the Society for Information Display 14:493-498 (2006). |
L. G. Brown, ‘Applications of the Sensics panoramic HMD,’ SID Symposium Digest vol. 39, 2008, p. 77. |
Laurence R. Young, David Sheena, “Survey of eye movement recording methods”, Behavior Research Methods & Instrumentation, 7(5), 397-429, 1975. |
Liu et al., ‘A Novel Prototype for an Optical See-Through Head-Mounted Display with Addressable Focus Cues,’ IEEE Transactions on Visualization and Computer Graphics 16:381-393 (2010). |
Liu et al., “A Systematic Method for Designing Depth-Fused Multi-Focal Plane Three-Dimensional Displays,” Optics Express 18:11562-11573 (2010). |
Liu et al., “An Optical See-Through head Mounted Display with Addressable Focal Planes,” IEEE Computer Society, pp. 33-42 (2008). |
Liu et al., “Time-Multiplexed Dual-Focal Plane Head-Mounted Display with a Liquid Lens,” Optics Letters 34:1642-1644 (2009). |
Loschky, L.C. and Wolverton, G.S., “How late can you update gaze-contingent multiresolutional displays without detection?” ACM Trans. Mult. Comp. Comm. and App. 3, Nov. 2007. |
Love et al., “High-speed switchable lens enables the development of a volumetric stereoscopic display,” Optics Express 17(18):15716-15725 (Aug. 2009). |
M. Martínez-Corral, H. Navarro, R. Martínez-Cuenca, G. Saavedra, and B. Javidi, “Full parallax 3-D TV with programmable display parameters,” Opt. Phot. News 22, 50-50 (2011). |
M. D. Missig and G. M. Morris, “Diffractive optics applied to eyepiece design,” Appl. Opt. 34, 2452-2461 (1995). |
M. Daneshpanah, B. Javidi, and E. Watson, “Three dimensional integral imaging with randomly distributed sensors,” Journal of Optics Express, vol. 16, Issue 9, pp. 6368-6377, Apr. 21, 2008. |
M. Gutin, ‘Automated design and fabrication of ocular optics,’ Proc. SPIE, vol. 7060, 2008. |
M. L. Thomas, W. P. Siegmund, S. E. Antos, and R. M. Robinson, “Fiber optic development for use on the fiber optic helmet-mounted display”, Helmet-Mounted Displays, J. T. Carollo, ed., Proc. SPIE 116, 90-101, 1989. |
M. Lucente, “Interactive three-dimensional holographic displays: seeing the future in depth,” Computer Graphics, 31(2), pp. 63-67, 1997. |
McQuaide et al., “A Retinal Scanning Display System That Produces Multiple Focal Planes with a Deformable Membrane Mirror,” Displays 24:65-72 (2003). |
Mon-Williams et al., “Binocular Vision in a Virtual World: Visual Deficits Following the Wearing of a Head-Mounted Display,” Ophthalmic Physiol. Opt. 13:387-391 (1993). |
O. Cakmakci, B. Moore, H. Foroosh, and J. P. Rolland, “Optimal local shape description for rotationally non-symmetric optical surface design and analysis,” Opt. Express 16, 1583-1589 (2008). |
Optical Research Associates, http://www.opticalres.com, 2 pages (obtained Jan. 26, 2011). |
P. A. Blanche, et al, “Holographic three-dimensional telepresence using large-area photorefractive polymer”, Nature, 468, 80-83, Nov. 2010. |
P. Gabbur, H. Hua, and K. Barnard, ‘A fast connected components labeling algorithm for real-time pupil detection,’ Mach. Vision Appl., 21(5):779-787, 2010. |
R. Martínez-Cuenca, H. Navarro, G. Saavedra, B. Javidi, and M. Martínez-Corral, “Enhanced viewing-angle integral imaging by multiple-axis telecentric relay system,” Optics Express, vol. 15, Issue 24, pp. 16255-16260, Nov. 21, 2007. |
R. Schulein, C. Do, and B. Javidi, “Distortion-tolerant 3D recognition of underwater objects using neural networks,” Journal of Optical Society of America A, vol. 27, No. 3, pp. 461-468, Mar. 2010. |
R. Schulein, M. DaneshPanah, and B. Javidi, “3D imaging with axially distributed sensing,” Journal of Optics Letters, vol. 34, Issue 13, pp. 2012-2014, Jul. 1, 2009. |
R.J. Jacob, “The use of eye movements in human-computer interaction techniques: what you look at is what you get”, ACM Transactions on Information Systems, 9(2), 152-69, 1991. |
Reingold, E.M., L.C. Loschky, G.W. McConkie and D.M. Stampe, “Gaze-contingent multiresolutional displays: An integrative review,” Hum. Factors 45, 307-328 (2003). |
Rolland, J. P., A. Yoshida, L. D. Davis and J. H. Reif, “High-resolution inset head-mounted display,” Appl. Opt. 37, 4183-93 (1998). |
Rolland et al., “Multifocal Planes Head-Mounted Displays,” Applied Optics 39:3209-3215 (2000). |
S. Bagheri and B. Javidi, “Extension of Depth of Field Using Amplitude and Phase Modulation of the Pupil Function,” Journal of Optics Letters, vol. 33, No. 7, pp. 757-759, Apr. 1, 2008. |
S. Hong, J. Jang, and B. Javidi, “Three-dimensional volumetric object reconstruction using computational integral imaging,” Journal of Optics Express, on-line Journal of the Optical Society of America, vol. 12, No. 3, pp. 483-491, Feb. 9, 2004. |
S. Hong and B. Javidi, “Distortion-tolerant 3D recognition of occluded objects using computational integral imaging,” Journal of Optics Express, vol. 14, Issue 25, pp. 12085-12095, Dec. 11, 2006. |
S. Kishk and B. Javidi, “Improved Resolution 3D Object Sensing and Recognition using time multiplexed Computational Integral Imaging,” Optics Express, on-line Journal of the Optical Society of America, vol. 11, No. 26, pp. 3528-3541, Dec. 29, 2003. |
Schowengerdt, B. T., and Seibel, E. J., “True 3-D scanned voxel displays using single or multiple light sources,” Journal of SID, 14(2), pp. 135-143, 2006. |
Sheedy et al., “Performance and Comfort on Near-Eye Computer Displays,” Optometry and Vision Science 79:306-312 (2002). |
Shibata et al., “Stereoscopic 3-D Display with Optical Correction for the Reduction of the Discrepancy Between Accommodation and Convergence,” Journal of the Society for Information Display 13:665-671 (2005). |
Shiwa et al., “Proposal for a 3-D Display with Accommodative Compensation: 3DDAC,” Journal of the Society for Information Display 4:255-261 (1996). |
Sullivan, “A Solid-State Multi-Planar Volumetric Display,” SID Symposium Digest of Technical Papers 34:354-356 (2003). |
Suyama, S., Ohtsuka, S., Takada, H., Uehira, K., and Sakai, S., “Apparent 3D image perceived from luminance-modulated two 2D images displayed at different depths,” Vision Research, 44: 785-793, 2004. |
T. Okoshi, “Optimum design and depth resolution of lens-sheet and projection-type three-dimensional displays,” Appl. Opt. 10, 2284-2291 (1971). |
T. Ando, K. Yamasaki, M. Okamoto, and E. Shimizu, “Head Mounted Display using holographic optical element,” Proc. SPIE, vol. 3293, 183 (1998). |
Tibor Balogh, “The HoloVizio System,” Proceedings of SPIE, vol. 6055, 2006. |
Varioptic, “Video Auto Focus and Optical Image Stabilization,” http://www.varioptic.com/en/home.html, 2 pages (2008). |
Wann et al., Natural Problems for Stereoscopic Depth Perception in Virtual Environments, Vision Res. 35:2731-2736 (1995). |
Wartenberg, Philipp, “EyeCatcher, the Bi-directional OLED Microdisplay,” Proc. of SID 2011. |
Watt et al., “Focus Cues Affect Perceived Depth,” J Vision 5:834-862 (2005). |
Written Opinion dated Feb. 10, 2011 from PCT/CN2010/072376. |
Written Opinion dated Jun. 18, 2010 in corresponding international application PCT/US2010/031799. |
X. Hu and H. Hua, “Design and assessment of a depth-fused multi-focal-plane display prototype,” Journal of Display Technology, Dec. 2013. |
Xiao Xiao, Bahram Javidi, Manuel Martinez-Corral, and Adrian Stern, “Advances in Three-Dimensional Integral Imaging: Sensing, Display, and Applications,” Applied Optics, 52(4):546-560, 2013. |
Xin Shen, Yu-Jen Wang, Hung-Shan Chen, Xiao Xiao, Yi-Hsin Lin, and Bahram Javidi, “Extended depth-of-focus 3D micro integral imaging display using a bifocal liquid crystal lens,” Optics Letters, vol. 40, issue 4, pp. 538-541 (Feb. 9, 2015). |
Xinda Hu and Hong Hua, “High-resolution optical see-through multi-focal-plane head-mounted display using freeform optics,” Optics Express, 22(11):13896-13903, Jun. 2014. |
Y. Takaki, Y. Urano, S. Kashiwada, H. Ando, and K. Nakamura, “Super multi-view windshield display for long-distance image information presentation,” Opt. Express, 19, 704-16, 2011. |
Yamazaki et al, “Thin wide-field-of-view HMD with free-form-surface prism and applications”, Proc. SPIE 3639, Stereoscopic Displays and Virtual Reality Systems VI, 453 (May 24, 1999). |
Yano, S., Emoto, M., Mitsuhashi, T., and Thwaites, H., “A study of visual fatigue and visual comfort for 3D HDTV/HDTV images,” Displays, 23(4), pp. 191-201, 2002. |
Xin et al., “Design of Secondary Optics for IRED in active night vision systems,” Jan. 10, 2013, vol. 21, No. 1, Optics Express, pp. 1113-1120. |
S. Nikzad, Q. Yu, A. L. Smith, T. J. Jones, T. A. Tombrello, S. T. Elliott, “Direct detection and imaging of low-energy electrons with delta-doped charge-coupled devices,” Applied Physics Letters, vol. 73, p. 3417, 1998. |
European Search Report dated Apr. 28, 2016 from EP application 13847218.8. |
Xinda Hu et al: “48.1: Distinguished Student Paper: A Depth-Fused Multi-Focal-Plane Display Prototype Enabling Focus Cues in Stereoscopic Displays”, SID International Symposium Digest of Technical Papers, vol. 42, No. 1, Jun. 1, 2011, pp. 691-694, XP055266326. |
A. Yabe, “Representation of freeform surface suitable for optimization,” Applied Optics, 2012. |
Armitage, David, Ian Underwood, and Shin-Tson Wu. Introduction to Microdisplays. Chichester, England: Wiley, 2006. |
S. Feiner, 2002, “Augmented reality: A new way of seeing,” Scientific American, No. 54, 2002. |
K. Ukai and P.A. Howarth, “Visual fatigue caused by viewing stereoscopic motion images: background, theories, and observations,” Displays, 29(2), pp. 106-116, 2008. |
B. T. Schowengerdt, M. Murari, E. J. Seibel, “Volumetric display using scanned fiber array,” SID Symposium Digest of Technical Papers, 2010. |
H. Hua and B. Javidi, “A 3D integral imaging optical see-through head-mounted display”, Optics Express, 22(11): 13484-13491, 2014. |
W. Song, Y. Wang, D. Cheng, Y. Liu, “Light field head-mounted display with correct focus cue using micro structure array,” Chinese Optics Letters, 12(6): 060010, 2014. |
T. Peterka, R. Kooima, D. Sandin, A. Johnson, J. Leigh, T. DeFanti, “Advances in the Dynallax solid-state dynamic parallax barrier autostereoscopic visualization display system,” IEEE Trans. Vis. Comput. Graphics, 14(3): 487-499, 2008. |
Hu, X., Development of the Depth-Fused Multi-Focal Plane Display Technology, Ph.D. Dissertation, College of Optical Sciences, University of Arizona, 2014. |
S. Ravikumar, K. Akeley, and M. S. Banks, “Creating effective focus cues in multi-plane 3D displays,” Opt. Express 19, 20940-20952, 2011. |
X. Hu and H. Hua, “Design and tolerance of a free-form optical system for an optical see-through multi-focal-plane display,” Applied Optics, 54(33): 9990-9, 2015. |
Dewen Cheng et al., “Large field-of-view and high resolution free-form head-mounted display,” SPIE-OSA, vol. 7652, Jun. 2018. |
Huang et al., “An integral-imaging-based head-mounted light field display using a tunable lens and aperture array,” Journal of the Society for Information Display, Mar. 1, 2017, pp. 199-201. |
G. Lippmann, “Epreuves reversibles donnant la sensation du relief,” Journal of Physics (Paris) 7, 821-825 (1908). |
Full Certified Translation of Reference JP008160345. |
Full Certified Translation of Reference JP 02200074. |
Cheol-Joong Kim et al, “Depth plane adaptive integral imaging using a varifocal liquid lens array”, Applied Optics, OSA, vol. 54, No. 10, Apr. 1, 2015, pp. 2565-2571. |
Xin Shen et al: “Large depth of focus dynamic micro integral imaging for optical see-through augmented reality display using a focus-tunable lens”, Applied Optics, vol. 57, No. 7, Mar. 1, 2018 (Mar. 1, 2018), p. B184. |
Martinez-Cuenca R et al: “Progress in 3-D Multiperspective Display by Integral Imaging”, Proceedings of the IEEE, IEEE, New York, US, vol. 97, No. 6, Jun. 1, 2009, pp. 1067-1077. |
Kim Cheoljoong et al: “Depth-enhanced integral imaging display system with time-multiplexed depth planes using a varifocal liquid lens array”, Proceedings of SPIE, IEEE, US, vol. 9385, Mar. 11, 2015 (Mar. 11, 2015), pp. 93850D-93850D. |
Huan Deng et al: “The Realization of Computer Generated Integral Imaging Based on Two Step Pickup Method”, Photonics and Optoelectronic (SOPO), 2010 Symposium on, IEEE, Piscataway, NJ, USA, Jun. 19, 2010 (Jun. 19, 2010), pp. 1-3. |
H. Hua, “Enabling focus cues in head-mounted displays,” Proceedings of the IEEE 105(5), 805-824 (2017). |
G. E. Favalora, “Volumetric 3D displays and application infrastructure,” Computer, 38(8), 37-44 (2005). |
H. Yu, K. Lee, J. Park, and Y. Park, “Ultrahigh-definition dynamic 3D holographic display by active control of volume speckle fields,” Nature Photonics 11(3), 186 (2017). |
G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Letters 41(11), 2486-2489 (2016). |
S. B. Kim and J. H. Park, “Optical see-through Maxwellian near-to-eye display with an enlarged eyebox,” Opt. Letters 43(4), 767-770 (2018). |
D. Lanman and D. Luebke, “Near-eye light field displays,” ACM Trans. Graph. 32(6), 1-10 (2013). |
H. Huang and H. Hua, “High-performance integral-imaging-based light field augmented reality display using freeform optics,” Opt. Express 26(13), 17578-17590 (2018). |
B. Liu, X. Sang, X. Yu, X. Gao, L. Liu, C. Gao, P. Wang, Y. Le, and J. Du, “Time-multiplexed light field display with 120-degree wide viewing angle,” Opt. Express 27(24), pp. 35728-35739 (2019). |
H. Huang and H. Hua, “Generalized methods and strategies for modeling and optimizing the optics of 3D head-mounted light field displays,” Opt. Express 27(18), 25154-25171 (2019). |
H. Huang and H. Hua, “Systematic characterization and optimization of 3D light field displays,” Opt. Express 25(16), 18508-18525 (2017). |
J. H. Park, S. W. Min, S. Jung, and B. Lee, “Analysis of viewing parameters for two display methods based on integral photography,” Applied Optics 40(29), 5217-5232 (2001). |
X. Wang, Y. Qin, H. Hua, Y. H. Lee, and S. T. Wu, “Digitally switchable multi-focal lens using freeform optics,” Opt. Express 26(8):11007-11017 (2018). |
X. Wang, and H. Hua. “Digitally Switchable Micro Lens Array for Integral Imaging.” SID Symposium Digest of Technical Papers. vol. 51. No. 1. (2020). |
M. Xu and H. Hua, “Finite-depth and varifocal head-mounted display based on geometrical lightguide,” Opt. Express 28(8), 12121-12137 (2020). |
Jason Geng: “Three-dimensional display technologies”, Advances in Optics and Photonics, vol. 5, No. 4, Nov. 22, 2013 (Nov. 22, 2013), pp. 456-535. |
Number | Date | Country
---|---|---
20200328058 A1 | Oct 2020 | US

Number | Date | Country
---|---|---
62113656 | Feb 2015 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 15017763 | Feb 2016 | US
Child | 16217158 | | US

Relation | Number | Date | Country
---|---|---|---
Parent | 16217158 | Dec 2018 | US
Child | 16790221 | | US