The present invention relates generally to a wearable 3D augmented reality display, and more particularly, but not exclusively, to a wearable 3D augmented reality display comprising 3D integral imaging (InI) optics.
An augmented reality (AR) display, which allows overlaying 2D or 3D digital information on a person's real-world view, has long been portrayed as a transformative technology that will redefine the way we perceive and interact with digital information. Although several types of AR display devices have been explored, a desired form of AR display is a lightweight optical see-through head-mounted display (OST-HMD), which enables optical superposition of digital information onto the direct view of the physical world while maintaining see-through vision of the real world. With the rapidly increasing bandwidth of wireless networks, the miniaturization of electronics, and the prevalence of cloud computing, one of the current challenges is to realize an unobtrusive AR display that integrates the functions of OST-HMDs, smartphones, and mobile computing within the volume of a pair of eyeglasses.
Such an AR display, if available, would have the potential to revolutionize many fields of practice and permeate the fabric of daily life, including the medical, defense and security, manufacturing, transportation, education, and entertainment fields. For example, in medicine AR technology may enable a physician to see CT images of a patient superimposed onto the patient's abdomen while performing surgery; in mobile computing it can allow a tourist to access reviews of restaurants in his or her sight while walking down the street; in military training it can allow fighters to be effectively trained in environments that blend 3D virtual objects into live training environments.
The most critical barriers to AR technology are typically defined by the displays. The lack of high-performance, compact, and low-cost AR displays limits the ability to explore the full range of benefits potentially offered by AR technology. In recent years, a significant research and market drive has been toward overcoming the cumbersome, helmet-like form factor of OST-HMD systems, primarily focusing on achieving compact and lightweight form factors. Several optical technologies have been explored, resulting in significant advances in OST-HMDs. For instance, the well-advertised Google Glass® is a very compact, lightweight (˜36 grams), monocular OST-HMD, providing the benefits of encumbrance-free, instant access to digital information. Although it has demonstrated a promising and exciting future for AR displays, the current version of Google Glass® has a very narrow field of view (approximately 15° diagonally) with an image resolution of 640×360 pixels, and thus offers limited ability to effectively augment the real-world view in many applications.
Despite such promise, a number of problems remain with existing OST-HMDs, such as the visual discomfort of AR displays. Thus, it would be an advance in the art to provide OST-HMDs that offer increased visual comfort while achieving low-cost, high-performance, lightweight, and true 3D OST-HMD systems.
In one of its aspects the present invention may provide a 3D augmented reality display having a microdisplay for providing a virtual 3D image for display to a user. For example, the optical approach of the present invention may uniquely combine the optical path of an AR display system with that of a micro-InI subsystem to provide a 3D lightfield optical source. This approach offers the potential to achieve an AR display invulnerable to the accommodation-convergence discrepancy problem. Benefiting from freeform optical technology, the approach can also create a lightweight and compact OST-HMD solution.
In this regard, in one exemplary configuration of the present invention, display optics may be provided to receive optical radiation from the microdisplay and may be configured to create a 3D lightfield, that is, a true optically reconstructed 3D real or virtual object from the received radiation. (As used herein the term “3D lightfield” is defined to mean the radiation field of a 3D scene comprising a collection of light rays appearing to be emitted by the 3D scene to create the perception of a 3D scene.) An eyepiece in optical communication with the display optics may also be included, with the eyepiece configured to receive the 3D lightfield from the display optics and deliver the received radiation to an exit pupil of the system to provide a virtual display path. The eyepiece may include a selected surface configured to receive the 3D lightfield from the display optics and reflect the received radiation to an exit pupil of the system to provide a virtual display path. The selected surface may also be configured to receive optical radiation from a source other than the microdisplay and to transmit such optical radiation to the exit pupil to provide a see-through optical path. The eyepiece may include a freeform prism shape. In one exemplary configuration the display optics may include integral imaging optics.
The foregoing summary and the following detailed description of the exemplary embodiments of the present invention may be further understood when read in conjunction with the appended drawings, in which:
Despite current commercial development of HMDs, very limited effort has been made to address the challenge of minimizing the visual discomfort of AR displays, which is a critical concern in applications requiring an extended period of use. One of the key factors causing visual discomfort is the accommodation-convergence discrepancy between the displayed digital information and the real-world scene, a fundamental problem inherent to most existing AR displays. The accommodation cue refers to the focusing action of the eye, whereby the ciliary muscles change the refractive power of the crystalline lens and thereby minimize the amount of blur for the fixated depth of the scene. Associated with changes in eye accommodation is the retinal image blur cue, which refers to the image blurring that varies with the distance from the eye's fixation point to points nearer or farther away. The accommodation and retinal image blurring effects together are known as focus cues. The convergence cue refers to the rotation of the eyes to bring the visual axes inward or outward to intersect at a 3D object of interest at a near or far distance.
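For orientation, both cues can be quantified simply: accommodation demand in diopters is the reciprocal of the fixation distance in meters, and the convergence angle follows from the interpupillary distance (IPD). The following small sketch uses an assumed, typical 65 mm IPD and hypothetical fixation distances:

```python
import math

# Accommodation demand (diopters) and convergence angle (degrees) versus fixation
# distance; the 65 mm interpupillary distance is an assumed, typical value.
ipd_m = 0.065
for dist_m in (0.3, 4.0):                      # hypothetical near and far fixations
    accommodation_D = 1.0 / dist_m             # diopters = 1 / distance(m)
    convergence_deg = 2.0 * math.degrees(math.atan(ipd_m / 2.0 / dist_m))
    print(f"{dist_m} m: {accommodation_D:.2f} D accommodation, "
          f"{convergence_deg:.2f} deg convergence")
```

A fixed-distance 2D image source pins the accommodation demand at a single value while convergence still varies with the rendered depth, which is the mismatch discussed next.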
The accommodation-convergence mismatch problem stems from the fact that the image source in most existing AR displays is a 2D flat surface located at a fixed distance from the eye. Consequently, this type of AR display lacks the ability to render correct focus cues for digital information that is to be overlaid on real objects located at distances other than that of the 2D image source. This causes the following three accommodation-convergence conflicts. (1) There exists a mismatch of accommodation cues between the 2D image plane and the real-world scene (
In one of its aspects the present invention relates to a novel approach to OST-HMD design that combines 3D lightfield creation technology and freeform optical technology. The 3D lightfield creation technology of the present invention reconstructs the radiation field of a 3D scene by creating a collection of light rays appearing to be emitted by the 3D scene, thereby creating the perception of the 3D scene. Thus, as used herein the term “3D lightfield” is defined to mean the radiation field of a 3D scene comprising a collection of light rays appearing to be emitted by the 3D scene to create the perception of a 3D scene. The reconstructed 3D scene creates a 3D image source for the HMD viewing optics, which enables the replacement of the typical 2D display surface with a 3D source and thus potentially overcomes the accommodation-convergence discrepancy problem. Any optical system capable of generating a 3D lightfield may be used in the devices and methods of the present invention. For instance, one exemplary configuration of the present invention uses micro integral imaging (micro-InI) optics for creating a full-parallax 3D lightfield to optically create the perception of the 3D scene. (Persons skilled in the art will be aware that integral imaging (InI) is a multi-view imaging and display technique that captures or displays the lightfields of a 3D scene by utilizing an array of pinholes, lenses, or microlenses. When used as a display technique, a microlens array is combined with a display device that provides a set of elemental images, each having information on a different perspective of the 3D scene. The microlens array in combination with the display device renders ray bundles emitted by different pixels of the display device, and these ray bundles from different pixels intersect and optically create the perception of a 3D point that appears to emit light and occupy 3D space. This method allows the reconstruction of a true 3D image of the 3D scene with full parallax information in all directions.) Other optical systems capable of generating a 3D lightfield which may be used with the present invention include, but are not limited to, holographic displays (M. Lucente, “Interactive three-dimensional holographic displays: seeing the future in depth,” Computer Graphics, 31(2), pp. 63-67, 1997; P. A. Blanche, et al., “Holographic three-dimensional telepresence using large-area photorefractive polymer,” Nature, 468, 80-83, Nov. 2010), multi-layer computational lightfield displays (G. Wetzstein et al., “Tensor Displays: Compressive light field synthesis using multilayer displays with directional backlighting,” ACM Transactions on Graphics, 31(4), 2012), and volumetric displays (Blundell, B. G., and Schwarz, A. J., “The classification of volumetric display systems: characteristics and predictability of the image space,” IEEE Transactions on Visualization and Computer Graphics, 8(1), pp. 66-75, 2002; J. Y. Son, W. H. Son, S. K. Kim, K. H. Lee, B. Javidi, “Three-Dimensional Imaging for Creating Real-World-Like Environments,” Proceedings of the IEEE, Vol. 101, issue 1, pp. 190-205, January 2013).
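To make the elemental-image geometry just described concrete, the following minimal sketch (a pinhole model with assumed pitch, gap, and point location, not the patent's prescription) computes, for each lenslet, the display pixel whose ray through that lenslet's center passes through a target 3D point, and verifies that all such rays intersect at the point:

```python
import numpy as np

pitch = 1.0      # lenslet pitch, mm (assumed)
gap = 3.0        # display-to-lenslet-array gap, mm (assumed)
point = np.array([0.3, 0.0, 12.0])   # 3D point to reconstruct; z measured from the array

for i in range(-2, 3):                           # five lenslets along x
    lens = np.array([i * pitch, 0.0, 0.0])       # lenslet (pinhole) center
    # Display pixel behind this lenslet whose ray through the pinhole hits the point:
    t = -gap / (point[2] - lens[2])
    pixel = lens + t * (point - lens)            # lies on the display plane z = -gap
    # Extend the ray from that pixel through the pinhole out to the point's depth:
    s = (point[2] - pixel[2]) / (lens[2] - pixel[2])
    hit = pixel + s * (lens - pixel)
    print(f"lenslet {i:+d}: pixel x = {pixel[0]:+.4f} mm -> ray reaches {hit.round(6)}")
```

Every ray reaches the same coordinates, illustrating how ray bundles from different pixels intersect to create the perception of a single 3D point.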
A micro-InI system has the potential to achieve full-parallax 3D object reconstruction and visualization in a very compact form factor suitable for a wearable system. It can dramatically alleviate most of the limitations of a conventional autostereoscopic InI display, owing to the benefit of well-constrained viewing positions, and can be effectively utilized to address the accommodation-convergence discrepancy problem in conventional HMD systems. The micro-InI unit can reconstruct a miniature 3D scene through the intersection of propagated ray cones from a large number of recorded perspective images of the 3D scene. By taking advantage of freeform optical technology, the approach of the present invention can result in a compact, lightweight, goggle-style AR display that is potentially less vulnerable to the accommodation-convergence discrepancy problem and visual fatigue. Responding to the accommodation-convergence discrepancy problem of existing AR displays, we developed an AR display technology with the ability to render the true lightfield of an optically reconstructed 3D scene, and thus accurate focus cues, for digital information placed across a large depth range.
Creating a lightweight and compact OST-HMD solution that is invulnerable to the accommodation-convergence discrepancy problem requires addressing two cornerstone issues. The first is to provide the capability of displaying a 3D scene with correctly rendered focus cues, matched to the scene's intended distance and correlated with the eye convergence depth in an AR display, rather than on a fixed-distance 2D plane. The second is to create an optical design of an eyepiece with a form factor as compelling as a pair of eyeglasses.
A block diagram of a 3D OST-HMD system in accordance with the present invention is illustrated in
In one of its aspects, the present invention provides an innovative OST-HMD system that integrates the 3D micro-InI method for full-parallax 3D scene optical visualization with freeform optical technology for the OST-HMD viewing optics. This approach enables the development of a compact 3D InI optical see-through HMD (InI-OST-HMD) with full-parallax lightfield rendering capability, which is anticipated to overcome the persistent accommodation-convergence discrepancy problem and to substantially reduce the visual discomfort and fatigue experienced by users.
Full-parallax lightfield creation method. An important step in addressing the accommodation-convergence discrepancy problem is to provide the capability of correctly rendering the focus cues of digital information regardless of its distance from the viewer, rather than rendering digital information on a fixed-distance 2D surface. Among the different non-stereoscopic display methods, we chose to use an InI method that allows the reconstruction of the full-parallax lightfield of a 3D scene, appearing to be emitted by the scene, as seen from constrained or unconstrained viewing zones. Compared with all other techniques, an InI technique requires a minimal amount of hardware complexity, which makes it possible to integrate it with an OST-HMD optical system and create a wearable true 3D AR display.
Although the InI method is promising, improvements are still desirable due to three major limitations: (1) low lateral and longitudinal resolution; (2) narrow depth of field (DOF); and (3) limited viewing angle. These limitations stem from the limited imaging capability and finite aperture of the microlenses, the poor spatial resolution of large-size displays, and the trade-off between wide viewing angle and high spatial resolution. Conventional InI systems typically yield low lateral and depth resolutions and narrow DOF. These limitations, however, can be alleviated in a wearable InI-HMD system of the present invention. First, microdisplays with large pixel counts and very fine pixels (e.g., ~5 μm pixel size) may be used in the present invention to replace the large-pixel display devices (~200-500 μm pixel size) used in conventional InI displays, offering at least 50× gain in spatial resolution,
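As a quick check of the resolution-gain arithmetic quoted above, the linear gain implied by the cited pixel pitches can be computed directly; the quoted "at least 50×" figure lies within the resulting 40-100× range, depending on which conventional panel pitch is assumed:

```python
# Linear spatial-resolution gain from replacing large-pixel panels with a
# fine-pitch microdisplay (pixel sizes as cited in the text above).
micro_px_um = 5.0
for panel_px_um in (200.0, 500.0):
    gain = panel_px_um / micro_px_um
    print(f"{panel_px_um:.0f} um panel vs {micro_px_um:.0f} um microdisplay: {gain:.0f}x")
```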
The lightfields of the miniature 3D scene reconstructed by a micro-InI unit may be relayed by eyepiece optics into the eye for viewing. The eyepiece optics not only effectively couples the 3D lightfields into the eye (exit) pupil but may also magnify the 3D scene to create a virtual 3D display appearing to be at a finite distance from the viewer.
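The relay described here can be illustrated with a simple paraxial thin-lens model; the focal length and object depths below are assumed values for illustration, not the freeform eyepiece's actual prescription. Note how points of the miniature scene at slightly different depths inside the focal length map to virtual images at substantially different distances, so a 3D source yields an extended-depth virtual 3D display:

```python
# Paraxial sketch: a thin lens of focal length f forms magnified virtual images
# of points placed inside its focal length (all values assumed for illustration).
def gaussian_image(f_mm, obj_dist_mm):
    """Gaussian lens equation 1/v - 1/u = 1/f, with u < 0 for a real object."""
    u = -obj_dist_mm
    v = 1.0 / (1.0 / f_mm + 1.0 / u)   # v < 0 indicates a virtual image
    return v, v / u                     # image distance and lateral magnification

f = 27.5                                # assumed eyepiece focal length, mm
for d in (24.0, 26.0):                  # assumed depths of reconstructed scene points
    v, m = gaussian_image(f, d)
    print(f"point {d} mm from eyepiece -> virtual image {abs(v):.0f} mm away, {m:.1f}x")
```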
As an example,
Among the different methods for HMD designs, freeform optical technology demonstrates great promise in designing compact HMD systems.
Rather than requiring multiple elements, the design naturally folds the optical path within a three-surface prism structure of the eyepiece 640, which helps substantially reduce the overall volume and weight of the optics when compared with designs using rotationally symmetric elements.
To enable see-through capability for AR systems, surface 2 of the eyepiece 640 may be coated as a beam-splitting mirror. A freeform corrector lens 650 may be added to provide a wearable 3D augmented reality display 690 having improved see-through capability. The corrector lens 650 may include two freeform surfaces and may be attached to surface 2 of the eyepiece 640 to correct the viewing axis deviation and undesirable aberrations introduced by the freeform prism eyepiece 640 to the real-world scene. The rays from the virtual lightfield generated by the 3D InI unit 630 are reflected by surface 2 of the prism eyepiece 640, while the rays from a real-world scene are transmitted through the freeform eyepiece 640 and corrector lens 650,
Thus, in devices of the present invention, the freeform eyepiece 640 may image the lightfield of a 3D surface AOB, rather than a 2D image surface. In such an InI-HMD system 600, 690, the freeform eyepiece 640 can reconstruct the lightfield of a virtual 3D object A′O′B′ at a location optically conjugate to the lightfield of a real object, while in a conventional HMD system the eyepiece creates a magnified 2D virtual display which is optically conjugate to the 2D microdisplay surface.
A proof-of-concept monocular prototype of an InI OST-HMD according to the configuration of
System Prescription for Display Path
In Table 1, surfaces #2-#4 specify the freeform eyepiece 640. Surfaces #2 and #4 of Table 1 represent the same physical surface and correspond to eyepiece surface 1, in
System Prescription for Optical See-Through Path
In Table 2, surfaces #2 and #3 are eyepiece surfaces 1 and 3, modeled the same as in the display path. Surfaces #4 and #5 specify the freeform corrector lens 650. Surface #4 is an exact replica of surface #3 (eyepiece surface 2).
As used in the system prescription Tables, e.g., Table 1 or Table 2, the term “XY Poly” refers to a surface which may be represented by the equation

$$z = \frac{c\,r^2}{1 + \sqrt{1 - (1+k)\,c^2 r^2}} + \sum_j C_j\, x^m y^n,$$

where z is the sag of the free-form surface measured along the z-axis of a local x, y, z coordinate system, c is the vertex curvature (CUY), r is the radial distance, k is the conic constant, and C_j is the coefficient for x^m y^n.
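For illustration, the sag equation above can be evaluated numerically as in the following sketch; the curvature, conic constant, and polynomial coefficients here are placeholders, not the prescription values of Table 1 or Table 2:

```python
import math

# Sketch: evaluate the XY-polynomial freeform sag z(x, y) defined above.
# All coefficient values below are placeholders, not the patent's prescription.
def xy_poly_sag(x, y, c, k, coeffs):
    """coeffs maps exponent pairs (m, n) to the coefficients C_j of x^m y^n."""
    r2 = x * x + y * y
    base = c * r2 / (1.0 + math.sqrt(1.0 - (1.0 + k) * c * c * r2))  # conic base term
    poly = sum(C * x**m * y**n for (m, n), C in coeffs.items())      # XY polynomial
    return base + poly

# Example evaluation with made-up coefficients:
print(xy_poly_sag(1.0, 2.0, c=0.01, k=-1.0, coeffs={(0, 2): 1e-3, (2, 1): -5e-5}))
```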
For demonstration purposes, a 3D scene including the number “3” and the letter “D” was simulated. In visual space, the objects “3” and “D” were located ~4 meters and 30 cm away from the eye position, respectively. To clearly demonstrate the effects of focusing, these character objects, instead of being rendered in plain solid colors, were rendered with black line textures. An array of 18×11 elemental images of the 3D scene was simulated (
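The elemental-image array itself can be computed by straightforward pinhole projection, as in the following simplified sketch. The 18×11 array size matches the text, but the lenslet pitch, gap, pixel size, and the toy point clouds standing in for the “3” and “D” objects are assumed, illustrative values; for simplicity the toy scene is placed directly in front of the lenslet array rather than modeling the eyepiece magnification:

```python
import numpy as np

nx, ny = 18, 11            # elemental images, matching the array size in the text
pitch = 1.0                # lenslet pitch, mm (assumed)
gap = 3.0                  # display-to-array gap, mm (assumed)
px = 0.01                  # microdisplay pixel size, mm (assumed)
epx = int(pitch / px)      # pixels per elemental image

# Toy point clouds standing in for the far "3" (~4 m) and near "D" (~30 cm) objects:
scene = [(np.array([x, 0.5, 4000.0]), 255) for x in np.linspace(-2, -1, 50)]
scene += [(np.array([x, -0.5, 300.0]), 128) for x in np.linspace(1, 2, 50)]

images = np.zeros((ny, nx, epx, epx), dtype=np.uint8)
for j in range(ny):
    for i in range(nx):
        # Lenslet (pinhole) center for elemental image (i, j):
        lens = np.array([(i - nx / 2 + 0.5) * pitch, (j - ny / 2 + 0.5) * pitch, 0.0])
        for p, val in scene:
            t = -gap / (p[2] - lens[2])        # scale factor to the display plane
            q = lens + t * (p - lens)          # projection through the pinhole
            u = int((q[0] - lens[0]) / px + epx / 2)
            v = int((q[1] - lens[1]) / px + epx / 2)
            if 0 <= u < epx and 0 <= v < epx:
                images[j, i, v, u] = val       # record this perspective sample
```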
The invention described and claimed herein is not to be limited in scope by the specific embodiments herein disclosed, since these embodiments are intended as illustrations of several aspects of the invention. Any equivalent embodiments are intended to be within the scope of this invention. Indeed, various modifications of the invention in addition to those shown and described herein will become apparent to those skilled in the art from the foregoing description. Such modifications are also intended to fall within the scope of the appended claims.
A number of patent and non-patent publications are cited in the specification; the entire disclosure of each of these publications is incorporated by reference herein.
This application is a continuation of U.S. application Ser. No. 16/400,370 filed on May 1, 2019, which is a continuation of U.S. application Ser. No. 15/122,492 filed on Aug. 30, 2016, which is a 371 application of International Application No. PCT/US15/18948 filed Mar. 5, 2015, which claims the benefit of priority of U.S. Provisional Application No. 61/948,226 filed on Mar. 5, 2014, the entire contents of which application(s) are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3632184 | King | Jan 1972 | A |
3992084 | Nakamura | Nov 1976 | A |
4468101 | Ellis | Aug 1984 | A |
4669810 | Wood | Jun 1987 | A |
4753522 | Nishina | Jun 1988 | A |
4863251 | Herloski | Sep 1989 | A |
5109469 | Duggan | Apr 1992 | A |
5172272 | Aoki | Dec 1992 | A |
5172275 | Dejager | Dec 1992 | A |
5416315 | Filipovich | May 1995 | A |
5436763 | Chen | Jul 1995 | A |
5526183 | Chen | Jun 1996 | A |
5572229 | Fisher | Nov 1996 | A |
5621572 | Fergason | Apr 1997 | A |
5625495 | Moskovich | Apr 1997 | A |
5699194 | Takahashi | Dec 1997 | A |
5701202 | Takahashi | Dec 1997 | A |
5706136 | Okuyama | Jan 1998 | A |
5818632 | Stephenson | Oct 1998 | A |
5880711 | Tamada | Mar 1999 | A |
5880888 | Schoenmakers | Mar 1999 | A |
5917656 | Hayakawa | Jun 1999 | A |
5959780 | Togino | Sep 1999 | A |
6008781 | Furness | Dec 1999 | A |
6023373 | Inoguchi | Feb 2000 | A |
6028606 | Kolb | Feb 2000 | A |
6034823 | Togino | Mar 2000 | A |
6198577 | Kedar | Mar 2001 | B1 |
6201646 | Togino | Mar 2001 | B1 |
6236521 | Nanba | May 2001 | B1 |
6239915 | Takagi | May 2001 | B1 |
6243199 | Hansen | Jun 2001 | B1 |
6271972 | Kedar | Aug 2001 | B1 |
6384983 | Yamazaki | May 2002 | B1 |
6396639 | Togino | May 2002 | B1 |
6404561 | Isono | Jun 2002 | B1 |
6404562 | Ota | Jun 2002 | B1 |
6433376 | Kim | Aug 2002 | B2 |
6433760 | Vaissie | Aug 2002 | B1 |
6493146 | Inoguchi | Dec 2002 | B2 |
6510006 | Togino | Jan 2003 | B1 |
6563648 | Gleckman | May 2003 | B2 |
6646811 | Inoguchi | Nov 2003 | B2 |
6653989 | Nakanishi | Nov 2003 | B2 |
6671099 | Nagata | Dec 2003 | B2 |
6731434 | Hua | May 2004 | B1 |
6829113 | Togino | Dec 2004 | B2 |
6963454 | Martins | Nov 2005 | B1 |
6999239 | Martins | Feb 2006 | B1 |
7152977 | Ruda | Dec 2006 | B2 |
7177083 | Holler | Feb 2007 | B2 |
7230583 | Tidwell | Jun 2007 | B2 |
7249853 | Weller-Brophy | Jul 2007 | B2 |
7405881 | Shimizu | Jul 2008 | B2 |
7414791 | Urakawa | Aug 2008 | B2 |
7522344 | Curatu | Apr 2009 | B1 |
8467133 | Miller | Jun 2013 | B2 |
8503087 | Amirparviz | Aug 2013 | B1 |
8511827 | Hua | Aug 2013 | B2 |
9201193 | Smith | Dec 2015 | B1 |
9239453 | Cheng | Jan 2016 | B2 |
9310591 | Hua | Apr 2016 | B2 |
9720232 | Hua | Aug 2017 | B2 |
9874760 | Hua | Jan 2018 | B2 |
20010009478 | Yamazaki | Jul 2001 | A1 |
20010048561 | Heacock | Dec 2001 | A1 |
20020015116 | Park | Feb 2002 | A1 |
20020060850 | Takeyama | May 2002 | A1 |
20020063913 | Nakamura | May 2002 | A1 |
20020067467 | Dorval | Jun 2002 | A1 |
20020114077 | Javidi | Aug 2002 | A1 |
20020181115 | Massof | Dec 2002 | A1 |
20030076591 | Ohmori | Apr 2003 | A1 |
20030090753 | Takeyama | May 2003 | A1 |
20040136097 | Park | Jul 2004 | A1 |
20040164927 | Suyama | Aug 2004 | A1 |
20040196213 | Tidwell | Oct 2004 | A1 |
20040218243 | Yamazaki | Nov 2004 | A1 |
20040233551 | Takahashi | Nov 2004 | A1 |
20050036119 | Ruda | Feb 2005 | A1 |
20050179868 | Seo | Aug 2005 | A1 |
20050248849 | Urey | Nov 2005 | A1 |
20060028400 | Lapstun | Feb 2006 | A1 |
20060119951 | McGuire | Jun 2006 | A1 |
20070109505 | Kubara | May 2007 | A1 |
20070246641 | Baun | Oct 2007 | A1 |
20080036853 | Shestak | Feb 2008 | A1 |
20080094720 | Yamazaki | Apr 2008 | A1 |
20080291531 | Heimer | Nov 2008 | A1 |
20090115842 | Saito | May 2009 | A1 |
20090168010 | Vinogradov | Jul 2009 | A1 |
20090256943 | Kondo | Oct 2009 | A1 |
20100091027 | Oyama | Apr 2010 | A1 |
20100109977 | Yamazaki | May 2010 | A1 |
20100208372 | Heimer | Aug 2010 | A1 |
20100271698 | Kessler | Oct 2010 | A1 |
20100289970 | Watanabe | Nov 2010 | A1 |
20110037951 | Hua | Feb 2011 | A1 |
20110043644 | Munger | Feb 2011 | A1 |
20110075257 | Hua | Mar 2011 | A1 |
20110090389 | Saito | Apr 2011 | A1 |
20110221656 | Haddick | Sep 2011 | A1 |
20120013988 | Hutchin | Jan 2012 | A1 |
20120019557 | Aronsson | Jan 2012 | A1 |
20120050891 | Seidl | Mar 2012 | A1 |
20120057129 | Durnell | Mar 2012 | A1 |
20120081800 | Cheng | Apr 2012 | A1 |
20120113092 | Bar-Zeev | May 2012 | A1 |
20120160302 | Citron | Jun 2012 | A1 |
20120162549 | Gao | Jun 2012 | A1 |
20120242697 | Border | Sep 2012 | A1 |
20120262802 | Huang | Oct 2012 | A1 |
20130100524 | Magarill | Apr 2013 | A1 |
20130112705 | McGill | May 2013 | A1 |
20130182317 | Takahashi | Jul 2013 | A1 |
20130187836 | Cheng | Jul 2013 | A1 |
20130222896 | Komatsu | Aug 2013 | A1 |
20130258461 | Sato | Oct 2013 | A1 |
20130285885 | Nowatzyk | Oct 2013 | A1 |
20130286053 | Fleck | Oct 2013 | A1 |
20130300634 | White | Nov 2013 | A1 |
20130329304 | Hua | Dec 2013 | A1 |
20140009845 | Cheng | Jan 2014 | A1 |
20140035959 | Paul | Feb 2014 | A1 |
20140049833 | Totani | Feb 2014 | A1 |
20140071539 | Gao | Mar 2014 | A1 |
20140300869 | Hirsch | Oct 2014 | A1 |
20140347361 | Alpaslan | Nov 2014 | A1 |
20140361957 | Hua | Dec 2014 | A1 |
20150168802 | Bohn | Jun 2015 | A1 |
20150177445 | Takagi | Jun 2015 | A1 |
20150201176 | Graziosi | Jul 2015 | A1 |
20150208061 | Yang | Jul 2015 | A1 |
20150212321 | Zhao | Jul 2015 | A1 |
20150262424 | Tabaka | Sep 2015 | A1 |
20150277129 | Hua | Oct 2015 | A1 |
20150346495 | Welch | Dec 2015 | A1 |
20150363978 | Maimone | Dec 2015 | A1 |
20160011419 | Gao | Jan 2016 | A1 |
20160085075 | Cheng | Mar 2016 | A1 |
20160239985 | Haddick et al. | Aug 2016 | A1 |
20160320620 | Maimone | Nov 2016 | A1 |
20170078652 | Hua | Mar 2017 | A1 |
20170102545 | Hua | Apr 2017 | A1 |
20170202633 | Liu | Jul 2017 | A1 |
20180045949 | Hua | Feb 2018 | A1 |
Number | Date | Country |
---|---|---|
1252133 | May 2000 | CN |
101359089 | Feb 2009 | CN |
101424788 | May 2009 | CN |
0408344 | Jan 1991 | EP |
1102105 | May 2001 | EP |
2928034 | Aug 2009 | FR |
02200074 | Aug 1990 | JP |
H03101709 | Apr 1991 | JP |
08160345 | Jun 1996 | JP |
H09218375 | Aug 1997 | JP |
H09297282 | Nov 1997 | JP |
H1013861 | Jan 1998 | JP |
H10307263 | Nov 1998 | JP |
H11326820 | Nov 1999 | JP |
2000249974 | Sep 2000 | JP |
2001013446 | Jan 2001 | JP |
2001066543 | Mar 2001 | JP |
2001145127 | May 2001 | JP |
2001238229 | Aug 2001 | JP |
2002148559 | May 2002 | JP |
2003241100 | Aug 2003 | JP |
2006091333 | Apr 2006 | JP |
2006276884 | Oct 2006 | JP |
2007101930 | Apr 2007 | JP |
2010072188 | Apr 2010 | JP |
2014505381 | Feb 2014 | JP |
9923647 | May 1999 | WO |
2004079431 | Sep 2004 | WO |
2007002694 | Jan 2007 | WO |
2007085682 | Aug 2007 | WO |
2007002694 | Dec 2007 | WO |
2007140273 | Dec 2007 | WO |
2008089417 | Jul 2008 | WO |
2011134169 | Nov 2011 | WO |
2012064546 | May 2012 | WO |
2012118573 | Sep 2012 | WO |
2013112705 | Aug 2013 | WO |
2014062912 | Apr 2014 | WO |
2015134738 | Sep 2015 | WO |
2015134740 | Sep 2015 | WO |
2015184409 | Dec 2015 | WO |
2016033317 | Mar 2016 | WO |
2018052590 | Mar 2018 | WO |
Entry |
---|
US 9,207,443 B2, 12/2015, Cheng (withdrawn) |
US 9,213,186 B2, 12/2015, Cheng (withdrawn) |
US 9,880,387 B2, 01/2018, Hua (withdrawn) |
Hong Hua, Chunyu Gao, “A compact eyetracked optical see-through head-mounted display,” Proc. SPIE 8288, Stereoscopic Displays and Applications XXIII, 82881F (Feb. 25, 2012); doi: 10.1117/12.909523, pp. 1-9 (Year: 2012). |
‘Fresnel Lenses’ downloaded from http://www.fresneltech.com on Jun. 8, 2011. Copyright Fresnel Technologies, Inc., 2003. |
Azuma, R., et al., ‘Recent advances in augmented reality’, IEEE Computer Graphics Appl. 21, 34-47 (2001). |
Bajura, M., et al., “Merging virtual objects with the real world: seeing ultrasound imagery within the patient” in Proceedings of ACM SIGGRAPH (ACM, Chicago, 1992), pp. 203-210. |
Biocca, et al., “Virtual eyes can rearrange your body: adapting to visual displacement in see-through, head-mounted displays”, Presence: Teleoperators and Virtual Environments 7, 262-277 (1998). |
Bunkenburg, J. ‘Innovative Diffractive Eyepiece for Helmet-Mounted Display.’ SPIE vol. 3430. pp. 41-49 Jul. 1998. |
C. Curatu, H. Hua, and J. P. Rolland, “Projection-based headmounted display with eye-tracking capabilities,” Proc. SPIE 5875, 587050J (2005). |
Cakmakci, O., et al., ‘Head-Worn Displays: A Review’. Journal of Display Technology, vol. 2, No. 3, Sep. 2006, pp. 199-216. |
Caudell, T., et al., “Augmented reality: an application of heads-up display technology to manual manufacturing processes” in Proceedings of Hawaii International Conferences on Systems Sciences (Hawaii, 1992), pp. 659-669. |
Cruz-Neira et al., ‘Surround-Screen Projection-Based Virtual Reality: the Design and Implementation of the CAVE,’ Proceedings of the 20th Annual Conference on Computer Graphics and Interactive Techniques pp. 135-142, ACM SIGGRAPH, ACM Press (1993). |
Examination Report dated Apr. 29, 2011 from corresponding GB Application No. GB1012165.5. |
H. Hua, C. Gao, and J. P. Rolland, ‘Study of the Imaging properties of retroreflective materials used in head-mounted projective displays (HMPDs),’ Proc. SPIE 4711, 194-201 (2002). |
H. Hua, C. Gao, F. Biocca, and J. P. Rolland, “An ultra-light and compact design and implementation of head-mounted projective displays,” in Proceedings of IEEE VR 2001, pp. 175-182. |
H. Hua, L. Brown, and C. Gao, “A new collaborative infrastructure: SCAPE,” in Proceedings of IEEE VR 2003 (IEEE, 2003), pp. 171-179. |
H. Hua, L. Brown, and C. Gao, “SCAPE: supporting stereoscopic collaboration in augmented and projective environments,” IEEE Comput. Graphics Appl. 24, 66-75 (2004). |
H. Hua, L. Brown, and C. Gao, “System and interface framework for SCAPE as a collaborative infrastructure,” Presence: Teleoperators and Virtual Environments 13, 234-250 (2004). |
H. Hua, Y. Ha, and J. P. Rolland, ‘Design of an ultra-light and compact projection lens,’ Appl. Opt. 42(1), pp. 97-107 (2003). |
H. Hua, A. Girardot, C. Gao, and J. P. Rolland, ‘Engineering of head-mounted projective displays,’ Applied Optics 39(22), pp. 3814-3824 (2000). |
H. Hua and C. Gao, “A polarized head-mounted projective display,” in Proceedings of IEEE and ACM International Symposium on Mixed and Augmented Reality 2005 (IEEE, 2005), pp. 32-35. |
Hua et al., ‘Design of a Bright Polarized Head-Mounted Projection Display’ Applied Optics 46:2600-2610 (2007). |
International Search Report dated Mar. 9, 2009 with regard to International Patent Application No. PCT/US2009/031606. |
J. L. Pezzaniti and R. A. Chipman, “Angular dependence of polarizing beam-splitter cubes,” Appl. Opt. 33, 1916-1929 (1994). |
J. P. Rolland, F. Biocca, F. Hamza-Lup, Y. Ha, and R. Martins, “Development of head-mounted projection displays for distributed, collaborative, augmented reality applications,” Presence: Teleoperators and Virtual Environments 14, 528-549 (2005). |
J. P. Rolland and Hong Hua. “Head-mounted display systems,” in Encyclopedia of Optical Engineering. R. Barry Johnson and Ronald O. Driggers, Eds, (2005). |
Krueerke, Daniel, “Speed May Give Ferroelectric LCDs Edge in Projection Race,” Display Devices Fall '05. Copyright 2005 Dempa Publications, Inc. pp. 29-31. |
L. Brown and H. Hua, “Magic lenses for augmented virtual environments,” IEEE Comput. Graphics Appl. 26, 64-73 (2006). |
L. Davis, J. P. Rolland, F. Hamza-Lup, Y. Ha, J. Norfleet, and C. Imielinska, ‘Enabling a continuum of virtual environment experiences,’ IEEE Comput. Graphics Appl. 23, pp. 10-12 Mar./Apr. 2003. |
M. Inami, N. Kawakami, and S. Tachi, ‘Optical camouflage using retro-reflective projection technology,’ in Proceedings of ISMAR 2003 (ISMAR, 2003). |
M. Inami, N. Kawakami, D. Sekiguchi, Y. Yanagida, T. Maeda, and S. Tachi, “Visuo-haptic display using head-mounted projector,” in Proceedings of IEEE Virtual Reality 2000, pp. 233-240. |
M. Robinson. J. Chen, and G. Sharp, Polarization Engineering for LCD Projection. John Wiley & Sons, Ltd. England, 2005. |
N. Kawakami, M. Inami, D. Sekiguchi, Y. Yanagida, T. Maeda, and S. Tachi, ‘Object-oriented displays: a new type of display systems - from immersive display to object-oriented displays,’ in Proceedings of IEEE SMC 1999, IEEE International Conference on Systems, Man, and Cybernetics, vol. 5, pp. 1066-1069. |
R. Azuma, “A Survey of Augmented Reality,” Presence: Teleoperators and Virtual Environments 6(4), 355-385 (1997). |
R. Kijima, K. Haza, Y. Tada, and T. Ojika, “Distributed display approach using PHMD with infrared camera,” in Proceedings of IEEE 2002 Virtual Reality Annual International Symposium (IEEE, 2002), pp. 1-8. |
R. Kijima and T. Ojika, “Transition between virtual environment and workstation environment with projective headmounted display,” in Proceedings of IEEE VR 1997 (IEEE, 1997), pp. 130-137. |
R. Martins, V. Shaoulov, Y. Ha, and J. P. Rolland, “Projection based head-mounted displays for wearable computers,” Proc. SPIE 5442, 104-110 (2004). |
R. N. Berry, L. A. Riggs, and C. P. Duncan, “The relation of vernier and depth discriminations to field brightness,” J. Exp. Psychol. 40, 349-354 (1950). |
Rolland, J.P., et al., ‘Optical versus video see-through head mounted displays in medical visualization’, Presence: Teleoperators and Virtual Environments 9, 287-309 (2000). |
Winterbottom, M., et al., ‘Helmet-Mounted Displays for use in Air Force Training and Simulation’, Human Effectiveness Directorate, Nov. 2005, pp. 1-54. |
Written Opinion of the International Searching Authority dated Mar. 9, 2009 with regard to International Patent Application No. PCT/US2009/031606. |
Y. Ha, H. Hua, R. Martins, and J. P. Rolland, “Design of a wearable wide-angle projection color display,” in Proceedings of International Optical Design Conference 2002 (IODC, 2002), pp. 67-73. |
Zhang, R., “8.3: Design of a Compact Light Engine for FLCOS Microdisplays in a p-HMPD system”, Society for Information Display 2008 International Symposium, Seminar and Exhibition (SID2008), Los Angeles, CA, May 2008. |
Zhang, R., et al., “Design of a Polarized Head-Mounted Projection Display Using Ferroelectric Liquid-Crystal-on-Silicon Microdisplays”, Applied Optics, vol. 47, No. 15, May 20, 2008, pp. 2888-2896. |
Zhang, R., et al., “Design of a Polarized Head-Mounted Projection Display using FLCOS Microdisplays”, Proc. of SPIE vol. 6489, 64890B-1. (2007). |
“OLED-XL Microdisplays,” eMagin 5 pages (2010). |
A. Jones, I. McDowall, H. Yamada, M. Bolas, P. Debevec, “Rendering for an Interactive 360° Light Field Display,” ACM Transactions on Graphics (TOG)—Proceedings of ACM SIGGRAPH 2007, 26(3), 2007. |
A. Maimone and H. Fuchs, “Computational augmented reality eyeglasses,” Proc. of ISMAR 2012. |
A. Castro, Y. Frauel, and B. Javidi, “Integral imaging with large depth of field using an asymmetric phase mask,” Journal of Optics Express, vol. 15, Issue 16, p. 10266-10273 (Aug. 2007). |
A. T. Duchowski, “Incorporating the viewer's Point-of-Regard (POR) in gaze-contingent virtual environments,” Proceedings of SPIE - the International Society for Optical Engineering, vol. 3295, 1998, pp. 332-343. |
Akeley et al., “A Stereo Display Prototype with Multiple Focal Distances,” ACM Trans. Graphics 23:804-813 (2004). |
Blundell, B. G., and Schwarz, A. J., “The classification of volumetric display systems: characteristics and predictability of the image space,” IEEE Transaction on Visualization and Computer Graphics, 8(1), pp. 66-75, 2002. |
C. B. Burckhardt, “Optimum parameters and resolution limitation of integral photography,” J. Opt. Soc. Am. 58, 71-76 (1968). |
C. Manh Do, R. Martínez-Cuenca, and B. Javidi, “Three-dimensional object-distortion-tolerant recognition for integral imaging using independent component analysis,” Journal of Optical Society of America A 26, issue 2, pp. 245-251 (Feb. 1, 2009). |
Chih-Wei Chen, Myungjin Cho, Yi-Pai Huang, and Bahram Javidi, “Improved viewing zones for projection type integral imaging 3D display using adaptive liquid crystal prism array,” IEEE Journal of Display Technology, 2014. |
Christopher M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, Inc. New York, NY 1995. |
Curatu, C., J.P. Rolland, and Hong Hua, “Dual purpose lens for an eye-tracked projection head-mounted display,” Proceedings of International Optical Design Conference, Vancouver, Canada, Jun. 2006. |
D. Cheng, Y.Wang, H. Hua, and M. M. Talha, Design of an optical see-through headmounted display with a low f-number and large field of view using a free-form prism, App. Opt. 48 (14), pp. 2655-2668, 2009. |
D. Cheng, Y. Wang, H. Hua, and M. M. Talha, “Design of an optical see-through head-mounted display with a low f-number and large field of view using a freeform prism,” Appl. Opt., 48(14):2655-2668, 2009. |
D. Cheng, Y. Wang, H. Hua, J. Sasian, “Design of a wide-angle, lightweight head-mounted display using free-form optics tiling,” Opt. Lett., 36(11):2098-100, 2011. |
D.M. Hoffman, A.R. Girshick, K. Akeley, and M.S. Banks, “Vergence-Accommodation Conflicts Hinder Visual Performance and Cause Visual Fatigue,” J. Vision, 8(3), 1-30, (2008). |
Davis et al., “Accommodation to Large Disparity Stereograms,” Journal of AAPOS 6:377-384 (2002). |
Downing et al., “A Three-Color, Solid-State, Three-Dimensional Display,” Science 273:1185-1189 (1996). |
Duchowski, A., “Eyetracking Methodology: theory and practice,” Publisher: Springer, 2003. |
Duchowski, A.T., and A. Coltekin, “Foveated gaze-contingent displays for peripheral LOD management, 3D visualization, and stereo imaging,” ACM Trans. on Mult. Comp., Comm., and App. 3, 1-21 (2007). |
Edgar et al., “Visual Accommodation Problems with Head-Up and Helmet-Mounted Displays?,” Displays 15:68-75 (1994). |
European Search Report dated Aug. 14, 2015 in corresponding EP application 13740989.2. |
F. Okano, H. Hoshino, J. Arai, and I. Yuyama, “Real-time pickup method for a three-dimensional image based on integral photography,” Appl. Opt. 36, 1598-1603 (1997). |
Favalora et al., “100 Million-Voxel Volumetric Display,” Proc. SPIE 4712:300-312 (2002). |
G. Wetzstein et al., “Tensor Displays: Compressive light field synthesis using multilayer displays with directional backlighting,” ACM Transactions on Graphics, 31(4), 2012. |
GB Examination Report corresponding to GB 1012165.5 dated Jun. 28, 2011. |
Geisler, W.S., J.S. Perry and J. Najemnik, “Visual search: The role of peripheral information measured using gaze-contingent displays,” J. Vision 6, 858-873 (2006). |
Graham-Rowe, “Liquid Lenses Make a Splash,” Nature-Photonics pp. 2-4 (2006). |
H. Hua, X. Hu, and C. Gao, “A high-resolution optical see-through head-mounted display with eyetracking capability,” Optics Express, Nov. 2013. |
H. Hua, “Sunglass-like displays become a reality with freeform optical technology,” SPIE Newsroom, 2012. |
H. Mukawa, K. Akutsu, I. Matsumura, S. Nakano, T. Yoshida, M. Kuwahara, and K. Aiki, A full-color eyewear display using planar waveguides with reflection volume holograms, J. Soc. Inf. Display 19 (3), pp. 185-193, 2009. |
H. Hoshi, N. Taniguchi, H. Morishima, T. Akiyama, S. Yamazaki and A. Okuyama, “Off-axial HMD optical system consisting of aspherical surfaces without rotational symmetry,” SPIE vol. 2653, 234 (1996). |
H. Hua, C. Pansing, and J.P. Rolland, “Modeling of an eye-imaging system for optimizing illumination schemes in an eye-tracked head-mounted display,” Appl. Opt., 46(31):7757-75, Oct. 2007. |
H. Hua, P. Krishnaswamy, and J.P. Rolland, ‘Video-based eyetracking methods and algorithms in head-mounted displays,’ Opt. Express, 14(10):4328-50, May 2006. |
Heanue et al., “Volume Holographic Storage and Retrieval of Digital Data,” Science 265:749-752 (1994). |
Hidenori Kuribayashi, Munekazu Date, Shiro Suyama, Toyohiko Hatada, J. of the SID 14/5, 2006, pp. 493-498. |
Hua, “Merging the Worlds of Atoms and Bits: Augmented Virtual Environments,” Optics and Photonics News 17:26-33 (2006). |
Hua, H., C. Pansing, and J. P. Rolland, “Modeling of an eye-imaging system for optimizing illumination schemes in an eye-tracked head-mounted display,” Applied Optics, 46(32): 1-14, Nov. 2007. |
Hua, H. “Integration of eye tracking capability into optical see-through head-mounted displays,” Proceedings of SPIE (Electronic Imaging 2001), pp. 496-503, Jan. 2001. |
Hua et al., “Compact eyetracked optical see-through head-mounted display”, Proc. SPIE 8288, Stereoscopic Displays and Applications XXIII, 82881F (Feb. 9, 2012). |
Inoue et al., “Accommodative Responses to Stereoscopic Three-Dimensional Display,” Applied Optics, 36:4509-4515 (1997). |
International Search Report and Written Opinion dated Nov. 24, 2015 in corresponding PCT application PCT/US2015/047163. |
International Search Report dated Feb. 10, 2011 from PCT/CN2010/072376. |
International Search Report dated Jan. 29, 2014 in corresponding international application PCT/US2013/065422. |
International Search Report dated Jun. 18, 2010 in corresponding international application PCT/US2010/031799. |
J. Hong, S. Min, and B. Lee, “Integral floating display systems for augmented reality,” Applied Optics, 51(18):4201-9, 2012. |
J. S. Jang and B. Javidi, “Large depth-of-focus time-multiplexed three-dimensional integral imaging by use of lenslets with non-uniform focal lengths and aperture sizes,” Opt. Lett. vol. 28, pp. 1924-1926 (2003). |
J. Arai, et al., “Depth-control method for integral imaging,” Optics Letters, vol. 33, No. 3, Feb. 1, 2008. |
J. E. Melzer, ‘Overcoming the field-of-view/resolution invariant in head-mounted displays,’ Proc. SPIE vol. 3362, 1998, p. 284. |
J. G. Droessler, D. J. Rotier, “Tilted cat helmet-mounted display,” Opt. Eng., vol. 29, 849 (1990). |
J. P. Rolland, “Wide-angle, off-axis, see-through head-mounted display,” Opt. Eng., vol. 39, 1760 (2000). |
J. S. Jang, F. Jin, and B. Javidi, “Three-dimensional integral imaging with large depth of focus by use of real and virtual image fields,” Opt. Lett. 28:1421-23,2003. |
J. Y. Son, W.H. Son, S.K. Kim, K.H. Lee, B. Javidi, “Three-Dimensional Imaging for Creating Real-World-Like Environments,” Proceedings of IEEE Journal, vol. 101, issue 1, pp. 190-205, Jan. 2013. |
Jisoo Hong, et al., “Three-dimensional display technologies of recent interest: Principles, Status, and Issues,” Applied Optics (Dec. 1, 2011) 50(34):106. |
K. Iwamoto, K. Tanie, T. T. Maeda, “A head-mounted eye movement tracking display and its image display method”, Systems & Computers in Japan, vol. 28, No. 7, Jun. 30, 1997, pp. 89-99. Publisher: Scripta Technica, USA. |
K. Iwamoto, S. Katsumata, K. Tanie, “An eye movement tracking type head mounted display for virtual reality system:—evaluation experiments of a prototype system”, Proceedings of 1994 IEEE International Conference on Systems, Man, and Cybernetics. Humans, Information and Technology (Cat. No. 94CH3571-5). IEEE. Part vol. 1, 1994, pp. 13-18 vol. 1. New York, NY, USA. |
Kuiper et al., “Variable-Focus Liquid Lens for Miniature Cameras,” Applied Physics Letters 85:1128-1130 (2004). |
Kuribayashi, et al., “A Method for Reproducing Apparent Continuous Depth in a Stereoscopic Display Using “Depth-Fused 3D” Technology” Journal of the Society for Information Display 14:493-498 (2006). |
L. G. Brown, ‘Applications of the Sensics panoramic HMD,’ SID Symposium Digest vol. 39, 2008, p. 77. |
Laurence R. Young, David Sheena, “Survey of eye movement recording methods”, Behavior Research Methods & Instrumentation, 7(5), 397-429, 1975. |
Liu et al., ‘A Novel Prototype for an Optical See-Through Head-Mounted Display with Addressable Focus Cues,’ IEEE Transactions on Visualization and Computer Graphics 16:381-393 (2010). |
Liu et al., “A Systematic Method for Designing Depth-Fused Multi-Focal Plane Three-Dimensional Displays,” Optics Express 18:11562-11573 (2010). |
Liu et al., “An Optical See-Through head Mounted Display with Addressable Focal Planes,” IEEE Computer Society, pp. 33-42 (2008). |
Liu et al., “Time-Multiplexed Dual-Focal Plane Head-Mounted Display with a Liquid Lens,” Optics Letters 34:1642-1644 (2009). |
Loschky, L.C. and Wolverton, G.S., “How late can you update gaze-contingent multiresolutional displays without detection?” ACM Trans. Mult. Comp. Comm. and App. 3, Nov. 2007. |
Love et al., “High-Speed Switchable Lens Enables the Development of a Volumetric Stereoscopic Display,” Optics Express, vol. 17, No. 18, pp. 15716-15725 (Aug. 2009). |
M. Martínez-Corral, H. Navarro, R. Martínez-Cuenca, G. Saavedra, and B. Javidi, “Full parallax 3-D TV with programmable display parameters,” Opt. Phot. News 22, 50-50 (2011). |
M. D. Missig and G. M. Morris, “Diffractive optics applied to eyepiece design,” Appl. Opt. 34, 2452-2461 (1995). |
M. Daneshpanah, B. Javidi, and E. Watson, “Three dimensional integral imaging with randomly distributed sensors,” Journal of Optics Express, vol. 16, Issue 9, pp. 6368-6377, Apr. 21, 2008. |
M. Gutin: ‘Automated design and fabrication of ocular optics,’ Proc. SPIE vol. 7060, 2008. |
M. L. Thomas, W. P. Siegmund, S. E. Antos, and R. M. Robinson, “Fiber optic development for use on the fiber optic helmet-mounted display”, Helmet-Mounted Displays, J. T. Carollo, ed., Proc. SPIE 1116, 90-101, 1989. |
M. Lucente, “Interactive three-dimensional holographic displays: seeing the future in depth,” Computer Graphics, 31(2), pp. 63-67, 1997. |
McQuaide et al., “A Retinal Scanning Display System That Produces Multiple Focal Planes with a Deformable Membrane Mirror,” Displays 24:65-72 (2003). |
Mon-Williams et al., “Binocular Vision in a Virtual World: Visual Deficits Following the Wearing of a Head-Mounted Display,” Ophthalmic Physiol. Opt. 13:387-391 (1993). |
O. Cakmakci, B. Moore, H. Foroosh, and J. P. Rolland, “Optimal local shape description for rotationally non-symmetric optical surface design and analysis,” Opt. Express 16, 1583-1589 (2008). |
Optical Research Associates, http://www.opticalres.com, 2 pages (obtained Jan. 26, 2011). |
P. A. Blanche, et al., “Holographic three-dimensional telepresence using large-area photorefractive polymer”, Nature, 468, 80-83, Nov. 2010. |
P. Gabbur, H. Hua, and K. Barnard, ‘A fast connected components labeling algorithm for real-time pupil detection,’ Mach. Vision Appl., 21(5):779-787, 2010. |
R. Martínez-Cuenca, H. Navarro, G. Saavedra, B. Javidi, and M. Martínez-Corral, “Enhanced viewing-angle integral imaging by multiple-axis telecentric relay system,” Optics Express, vol. 15, Issue 24, pp. 16255-16260, Nov. 21, 2007. |
R. Schulein, C. Do, and B. Javidi, “Distortion-tolerant 3D recognition of underwater objects using neural networks,” Journal of Optical Society of America A, vol. 27, No. 3, pp. 461-468, Mar. 2010. |
R. Schulein, M. DaneshPanah, and B. Javidi, “3D imaging with axially distributed sensing,” Journal of Optics Letters, vol. 34, Issue 13, pp. 2012-2014, Jul. 1, 2009. |
R.J. Jacob, “The use of eye movements in human-computer interaction techniques: what you look at is what you get”, ACM Transactions on Information Systems, 9(2), 152-69, 1991. |
Reingold, E.M., L.C. Loschky, G.W. McConkie and D.M. Stampe, “Gaze-contingent multiresolutional displays: An integrative review,” Hum. Factors 45, 307-328 (2003). |
Rolland, J. P., A. Yoshida, L. D. Davis and J. H. Reif, “High-resolution inset head-mounted display,” Appl. Opt. 37, 4183-93 (1998). |
Rolland et al., “Multifocal Planes Head-Mounted Displays,” Applied Optics 39:3209-3215 (2000). |
S. Bagheri and B. Javidi, “Extension of Depth of Field Using Amplitude and Phase Modulation of the Pupil Function,” Journal of Optics Letters, vol. 33, No. 7, pp. 757-759, Apr. 1, 2008. |
S. Hong, J. Jang, and B. Javidi,“Three-dimensional volumetric object reconstruction using computational integral imaging,” Journal of Optics Express, on-line Journal of the Optical Society of America, vol. 12, No. 3, pp. 483-491, Feb. 9, 2004. |
S. Hong and B. Javidi, “Distortion-tolerant 3D recognition of occluded objects using computational integral imaging,” Journal of Optics Express, vol. 14, Issue 25, p. 12085-12095, Dec. 11, 2006. |
S. Kishk and B. Javidi, “Improved Resolution 3D Object Sensing and Recognition using time multiplexed Computational Integral Imaging,” Optics Express, on-line Journal of the Optical Society of America, vol. 11, No. 26, pp. 3528-3541, Dec. 29, 2003. |
Schowengerdt, B. T., and Seibel, E. J., “True 3-D scanned voxel displays using single or multiple light sources,” Journal of SID, 14(2), pp. 135-143, 2006. |
Schowengerdt et al., “True 3-D Scanned Voxel Displays Using Single or Multiple Light Sources,” J. Soc. Info. Display 14:135-143 (2006). |
Sheedy et al., “Performance and Comfort on Near-Eye Computer Displays,” Optometry and Vision Science 79:306-312 (2002). |
Shibata et al., “Stereoscopic 3-D Display with Optical Correction for the Reduction of the Discrepancy Between Accommodation and Convergence,” Journal of the Society for Information Display 13:665-671 (2005). |
Shiwa et al., “Proposal for a 3-D Display with Accommodative Compensation: 3DDAC,” Journal of the Society for Information Display 4:255-261 (1996). |
Sullivan, “A Solid-State Multi-Planar Volumetric Display,” SID Symposium Digest of Technical Papers 34:354-356 (2003). |
Suyama, S., Ohtsuka, S., Takada, H., Uehira, K., and Sakai, S., “Apparent 3D image perceived from luminance-modulated two 2D images displayed at different depths,” Vision Research, 44: 785-793, 2004. |
T. Okoshi, “Optimum design and depth resolution of lens-sheet and projection-type three-dimensional displays,” Appl. Opt. 10, 2284-2291 (1971). |
T. Ando, K. Yamasaki, M. Okamoto, and E. Shimizu, “Head Mounted Display using holographic optical element,” Proc. SPIE, vol. 3293, 183 (1998). |
Tibor Balogh, “The HoloVizio System,” Proceedings of SPIE, vol. 6055, 2006. |
Varioptic, “Video Auto Focus and Optical Image Stabilization,” http://www.varioptic.com/en/home.html, 2 pages (2008). |
Wann et al., Natural Problems for Stereoscopic Depth Perception in Virtual Environments, Vision Res. 35:2731-2736 (1995). |
Wartenberg, Philipp, “EyeCatcher, the Bi-directional OLED Microdisplay,” Proc. of SID 2011. |
Watt et al., “Focus Cues Affect Perceived Depth,” J Vision 5:834-862 (2005). |
Written Opinion dated Feb. 10, 2011 from PCT/CN2010/072376. |
Written Opinion dated Jun. 18, 2010 in corresponding international application PCT/US2010/031799. |
X. Hu and H. Hua, “Design and assessment of a depth-fused multi-focal-plane display prototype,” Journal of Display Technology, Dec. 2013. |
Xiao Xiao, Bahram Javidi, Manuel Martinez-Corral, and Adrian Stern, “Advances in Three-Dimensional Integral Imaging: Sensing, Display, and Applications,” Applied Optics, 52(4): 546-560, 2013. |
Xin Shen, Yu-Jen Wang, Hung-Shan Chen, Xiao Xiao, Yi-Hsin Lin, and Bahram Javidi, “Extended depth-of-focus 3D micro integral imaging display using a bifocal liquid crystal lens,” Optics Letters, vol. 40, issue 4, pp. 538-541 (Feb. 9, 2015). |
Xinda Hu and Hong Hua, “High-resolution optical see-through multi-focal-plane head-mounted display using freeform optics,” Optics Express, 22(11): 13896-13903, Jun. 2014. |
Y. Takaki, Y. Urano, S. Kashiwada, H. Ando, and K. Nakamura, “Super multi-view windshield display for long-distance image information presentation,” Opt. Express, 19, 704-16, 2011. |
Yamazaki et al., “Thin wide-field-of-view HMD with free-form-surface prism and applications”, Proc. SPIE 3639, Stereoscopic Displays and Virtual Reality Systems VI, 453 (May 24, 1999). |
Yano, S., Emoto, M., Mitsuhashi, T., and Thwaites, H., “A study of visual fatigue and visual comfort for 3D HDTV/HDTV images,” Displays, 23(4), pp. 191-201, 2002. |
Xin et al., “Design of Secondary Optics for IRED in active night vision systems,” Jan. 10, 2013, vol. 21, No. 1, Optics Express, pp. 1113-1120. |
S. Nikzad, Q. Yu, A. L. Smith, T. J. Jones, T. A. Tombrello, S. T. Elliott, “Direct detection and imaging of low-energy electrons with delta-doped charge-coupled devices,” Applied Physics Letters, vol. 73, p. 3417, 1998. |
European Search Report dated Apr. 28, 2016 from EP application 13847218.8. |
Xinda Hu et al: “48.1: Distinguished Student Paper: A Depth-Fused Multi-Focal-Plane Display Prototype Enabling Focus Cues in Stereoscopic Displays”, SID International Symposium Digest of Technical Papers, vol. 42, No. 1, Jun. 1, 2011, pp. 691-694, XP055266326. |
Hu and Hua, “Design and tolerance of a freeform optical system for an optical see-through multi-focal plane display,” Applied Optics, 2015. |
A. Yabe, “Representation of freeform surface suitable for optimization,” Applied Optics, 2012. |
Armitage, David, Ian Underwood, and Shin-Tson Wu. Introduction to Microdisplays. Chichester, England: Wiley, 2006. |
Hoshi, et al., “Off-axial HMD optical system consisting of aspherical surfaces without rotational symmetry,” Proc. SPIE 2653, Stereoscopic Displays and Virtual Reality Systems III, 234 (Apr. 10, 1996). |
S. Feiner, 2002, “Augmented reality: A new way of seeing,” Scientific American, No. 54, 2002. |
K. Ukai and P.A. Howarth, “Visual fatigue caused by viewing stereoscopic motion images: background, theories, and observations,” Displays, 29(2), pp. 106-116, 2008. |
B. T. Schowengerdt, M. Murari, E. J. Seibel, “Volumetric display using scanned fiber array,” SID Symposium Digest of Technical Papers, 2010. |
H. Hua and B. Javidi, “A 3D integral imaging optical see-through head-mounted display”, Optics Express, 22(11): 13484-13491, 2014. |
W. Song, Y. Wang, D. Cheng, Y. Liu, “Light field head-mounted display with correct focus cue using micro structure array,” Chinese Optics Letters, 12(6): 060010, 2014. |
T. Peterka, R. Kooima, D. Sandin, A. Johnson, J. Leigh, T. DeFanti, “Advances in the Dynallax solid-state dynamic parallax barrier autostereoscopic visualization display system,” IEEE Trans. Visual. Comput. Graphics, 14(3): 487-499, 2008. |
Hu, X., Development of the Depth-Fused Multi-Focal Plane Display Technology, Ph.D. Dissertation, College of Optical Sciences, University of Arizona, 2014. |
S. Ravikumar, K. Akeley, and M. S. Banks, “Creating effective focus cues in multi-plane 3D displays,” Opt. Express 19, 20940-20952, 2011. |
X. Hu and H. Hua, “Design and tolerance of a free-form optical system for an optical see-through multi-focal-plane display,” Applied Optics, 54(33): 9990-9, 2015. |
Dewen Cheng et al., “Large field-of-view and high resolution free-form head-mounted display,” SPIE-OSA, vol. 7652, Jun. 2018. |
Huang et al., “An integral-imaging-based head-mounted light field display using a tunable lens and aperture array,” Journal of the Society for Information Display, Mar. 1, 2017, pp. 199-201. |
G. Lippmann, “Epreuves reversibles donnant la sensation du relief,” Journal of Physics (Paris) 7, 821-825 (1908). |
Full Certified Translation of Reference JP008160345. |
Full Certified Translation of Reference JP 02200074. |
H. Hua, “Enabling focus cues in head-mounted displays,” Proceedings of the IEEE 105(5), 805-824 (2017). |
G. E. Favalora, “Volumetric 3D displays and application infrastructure,” Computer, 38(8), 37-44 (2005). |
H. Yu, K. Lee, J. Park, and Y. Park, “Ultrahigh-definition dynamic 3D holographic display by active control of volume speckle fields,” Nature Photonics 11(3), 186 (2017). |
G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Letters 41(11), 2486-2489 (2016). |
S. B. Kim and J. H. Park, “Optical see-through Maxwellian near-to-eye display with an enlarged eyebox,” Opt. Letters 43(4), 767-770 (2018). |
D. Lanman and D. Luebke, “Near-eye light field displays,” ACM Trans. Graph. 32(6), 1-10 (2013). |
H. Huang and H. Hua, “High-performance integral-imaging-based light field augmented reality display using freeform optics,” Opt. Express 26(13), 17578-17590 (2018). |
B. Liu, X. Sang, X. Yu, X. Gao, L. Liu, C. Gao, P. Wang, Y. Le, and J. Du, “Time-multiplexed light field display with 120-degree wide viewing angle,” Opt. Express 27(24), pp. 35728-35739 (2019). |
H. Huang and H. Hua, “Generalized methods and strategies for modeling and optimizing the optics of 3D head-mounted light field displays,” Opt. Express 27(18), 25154-25171 (2019). |
H. Huang and H. Hua, “Systematic characterization and optimization of 3D light field displays,” Opt. Express 25(16), 18508-18525 (2017). |
J. H. Park, S. W. Min, S. Jung, and B. Lee, “Analysis of viewing parameters for two display methods based on integral photography,” Applied Optics 40(29), 5217-5232 (2001). |
X. Wang, Y. Qin, H. Hua, Y. H. Lee, and S. T. Wu, “Digitally switchable multi-focal lens using freeform optics,” Opt. Express 26(8): 11007-11017 (2018). |
X. Wang, and H. Hua. “Digitally Switchable Micro Lens Array for Integral Imaging.” SID Symposium Digest of Technical Papers. vol. 51. No. 1. (2020). |
M. Xu and H. Hua, “Finite-depth and varifocal head-mounted display based on geometrical lightguide,” Opt. Express 28(8), 12121-12137 (2020). |
Jason Geng: “Three-dimensional display technologies”, Advances in Optics and Photonics, vol. 5, No. 4, Nov. 22, 2013 (Nov. 22, 2013), pp. 456-535. |
Cheol-Joong Kim et al., “Depth plane adaptive integral imaging using a varifocal liquid lens array”, Applied Optics, OSA, vol. 54, No. 10, Apr. 1, 2015, pp. 2565-2571. |
Xin Shen et al: “Large depth of focus dynamic micro integral imaging for optical see-through augmented reality display using a focus-tunable lens”, Applied Optics, vol. 57, No. 7, Mar. 1, 2018 (Mar. 1, 2018), p. B184. |
Martinez-Cuenca R et al: “Progress in 3-D Multiperspective Display by Integral Imaging”, Proceedings of the IEEE, IEEE, New York, US, vol. 97, No. 6, Jun. 1, 2009, pp. 1067-1077. |
Kim Cheoljoong et al: “Depth-enhanced integral imaging display system with time-multiplexed depth planes using a varifocal liquid lens array”, Proceedings of SPIE, IEEE, US, vol. 9385, Mar. 11, 2015 (Mar. 11, 2015), pp. 93850D-93850D. |
Huan Deng et al: “The Realization of Computer Generated Integral Imaging Based on Two Step Pickup Method”, Photonics and Optoelectronics (SOPO), 2010 Symposium on, IEEE, Piscataway, NJ, USA, Jun. 19, 2010, pp. 1-3. |
Number | Date | Country | |
---|---|---|---|
20210006773 A1 | Jan 2021 | US |
Number | Date | Country | |
---|---|---|---|
61948226 | Mar 2014 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16400370 | May 2019 | US |
Child | 17022602 | US | |
Parent | 15122492 | US | |
Child | 16400370 | US |