Near-eye display devices are configured to present images to a user via a display that is positioned close to the user's eyes. For example, a head-mounted augmented reality display device may be worn on a user's head to position a near-eye display directly in front of the user's eyes. The near-eye display device may include an imaging device for tracking a gaze of the user based on light reflected from the user's eyes.
Embodiments are provided for a see-through head-mounted display system. In one embodiment, the see-through head-mounted display system includes a freeform prism, a display device configured to emit display light through the freeform prism to an eye of a user, and an imaging device configured to receive gaze-detection light reflected from the eye and directed through the freeform prism. The see-through head-mounted display system may also include an illumination prism through which the display light is transmitted and the gaze-detection light is emitted and received. In this way, the display light and the gaze-detection light may share optics in order to provide a compact structure. The arrangement described herein may also enable the imaging device to be positioned at a back focal plane of the system, so that the imaging system is object-space telecentric. The telecentricity of the imaging system enables the system to maintain accuracy in gaze direction determinations regardless of changes in the distance between an eye of the user and elements of the see-through head-mounted display system.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Near-eye display systems may include components for tracking a gaze of a user in addition to components for displaying an image to the user. In near-eye display systems that utilize a waveguide or similar display technology, a glint source and a camera for a gaze-tracking system may be positioned externally to the waveguide, such that light transmitted by the glint source and received by the camera does not pass through the waveguide. Such systems may be sensitive to longitudinal and transverse eye movement, such as changes in the distance between an eye and one or more components of the near-eye display. For example, as the eye moves away from the near-eye display, an image of the eye received at the camera may appear smaller, thereby distorting the apparent relative positions of the eye and a glint or other tracking reflection directed onto the eye for gaze tracking.
Thus, embodiments are disclosed for utilizing a freeform prism in a near-eye display system such that a display and a gaze-tracking system may utilize the same optical components for transmitting and/or receiving light. The shared optical configuration allows the gaze-tracking system to be object-space telecentric, such that embodiments that position the gaze-tracking system at a back focal plane of the freeform prism are less sensitive to longitudinal and transverse eye movement than non-telecentric configurations. The freeform prism enables the eye image to be collimated and focused at a camera lens of the gaze-tracking system with a magnification that does not vary with eye position. Accordingly, movement of the eye relative to the near-eye display system may have little effect on the accuracy of the gaze-tracking system.
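The benefit of object-space telecentricity can be made concrete with first-order optics. The following sketch is a generic paraxial argument, not a prescription taken from this disclosure: it shows that when the aperture stop sits at the back focal plane of the optics, the chief rays in object space are parallel to the optical axis, so the image centroid of an eye feature does not shift with eye relief.

```latex
% Generic paraxial model (illustrative; not the disclosure's prescription).
% Thin lens of focal length f; aperture stop at the back focal plane.
% Trace a chief ray (y, u) from object height h at eye relief z, with
% transfer y' = y + u d and refraction u' = u - y/f:
\begin{align*}
  y_L &= h + u_0 z                && \text{height at the lens}\\
  u_1 &= u_0 - y_L/f              && \text{angle after the lens}\\
  0   &= y_L + f\,u_1 = f\,u_0    && \text{chief ray crosses the axis at the stop}\\
  \Rightarrow u_0 &= 0            && \text{object-space chief rays parallel to the axis}\\
  y_s &= h\,(1 - s/f)             && \text{centroid on a sensor at distance } s\text{, independent of } z
\end{align*}
```

Only the blur of a defocused feature grows with eye relief; its centroid, which is what glint-based gaze tracking measures, stays fixed.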
The see-through head-mounted display device 100 includes a right eye camera 204a and a left eye camera 204b, schematically illustrated in the figures.
The see-through displays 200a and 200b and the eye cameras 204a and 204b may be positioned at a viewing location relative to the eye via one or more securing mechanisms of the frame 104.
During use, a distance between the eye, the displays 200a and 200b, and the eye cameras 204a and 204b may change. As discussed above, unless the system is object-space telecentric, a size of the images of the eye from the eye cameras 204a and 204b may be altered responsive to such changes in distance between the eye and the displays/cameras. Thus, an entrance pupil of each of the eye cameras 204a and 204b may optionally be positioned at the back focal plane of the optics of the display devices 200a and 200b, respectively, in order to maintain accurate gaze tracking regardless of depth changes between the eye, the optics, and the eye cameras.
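As a numerical illustration of the preceding point, the short script below compares the chief-ray centroid on the sensor for an aperture stop at the back focal plane versus a stop at the lens. It is a minimal paraxial sketch; the focal length, sensor distance, feature height, and eye-relief values are illustrative assumptions, not parameters from this disclosure.

```python
import numpy as np

def chief_ray_height(h, z, f, s, stop="back_focal_plane"):
    """Paraxial chief-ray height on the sensor for a thin lens of focal
    length f, an object feature at height h and eye relief z, and a
    sensor at distance s behind the lens.

    The chief ray is the ray from the feature that crosses the axis at
    the aperture stop; transfer: y' = y + u*d, refraction: u' = u - y/f.
    """
    if stop == "back_focal_plane":
        # Requiring y = 0 at a stop placed f behind the lens gives
        # (h + u0*z) + f*(u0 - (h + u0*z)/f) = f*u0 = 0, so u0 = 0.
        u0 = 0.0
    elif stop == "lens":
        # Requiring y = 0 at the lens gives h + u0*z = 0.
        u0 = -h / z
    else:
        raise ValueError(stop)
    y_lens = h + u0 * z
    u_after = u0 - y_lens / f
    return y_lens + u_after * s

f, s, h = 20.0, 30.0, 2.0        # mm; illustrative values only
for z in (15.0, 20.0, 25.0):     # varying eye relief, mm
    tele = chief_ray_height(h, z, f, s, "back_focal_plane")
    non_tele = chief_ray_height(h, z, f, s, "lens")
    print(f"z = {z:4.1f} mm  telecentric: {tele:+.3f} mm  stop-at-lens: {non_tele:+.3f} mm")
```

With the stop at the back focal plane, the printed centroid is identical for every eye relief; with the stop at the lens, it drifts as the eye moves, which is the distortion described above.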
Turning now to an example configuration, a see-through display 300, which may be representative of the displays 200a and 200b, may present images to an eye 302 of a user 304 via a display subsystem 306.
The display subsystem 306 may direct light through an illumination prism 308 including a beam splitter 310 (e.g., a polarizing beam splitter or hot mirror) configured to transmit the display light from the display device. For example, the beam splitter 310 may be configured to pass visible light while reflecting infrared light. In some embodiments, the display subsystem 306 may include a reflective micro-display, such as a liquid crystal on silicon (LCoS) display. In other embodiments, the display subsystem may include an emissive micro-display, such as an organic light emitting diode (OLED) array display, an inorganic light emitting diode (iLED) array display, and/or any other suitable micro-display. The beam splitter 310 may include a polarizing beam splitter, and an illumination light source 312 may be configured to emit illumination light, LL, into the illumination prism 308 (e.g., from an optical wedge 314 to the polarizing beam splitter). The illumination light source 312 may comprise one or more light sources, such as an RGB LED array, one or more white LEDs (e.g., with a color filter arrangement), and/or any other suitable illumination light source configuration. Because the polarizing beam splitter splits the illumination light into beams of different polarization states, it may be configured to reflect a portion of the polarized illumination light toward the LCoS display to illuminate the display. The display may reflect the illumination light to generate the display light, LD.
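The polarization bookkeeping in the preceding paragraph can be illustrated with an idealized Jones-calculus model. The sketch below is a simplified, hypothetical model (an ideal polarizing beam splitter, unpolarized light approximated by a single Jones vector, and an LCoS pixel treated as a mirror with a switchable half-wave retarder); it is not the disclosure's design, only a demonstration of how an "on" pixel rotates the reflected polarization so that it transmits through the beam splitter as display light.

```python
import numpy as np

# Jones vectors in the (s, p) basis.
S = np.array([1.0, 0.0])   # s-polarized component
P = np.array([0.0, 1.0])   # p-polarized component

# Idealized polarizing beam splitter: reflects s toward the LCoS,
# transmits p onward toward the freeform prism.
REFLECT_S = np.array([[1.0, 0.0], [0.0, 0.0]])
TRANSMIT_P = np.array([[0.0, 0.0], [0.0, 1.0]])

def lcos_reflect(jones, pixel_on):
    """LCoS pixel modeled as a mirror plus a switchable half-wave
    retarder: an 'on' pixel rotates the polarization by 90 degrees."""
    swap = np.array([[0.0, 1.0], [1.0, 0.0]])
    return swap @ jones if pixel_on else jones

illumination = (S + P) / np.sqrt(2)     # illumination light (approximation)
toward_lcos = REFLECT_S @ illumination  # splitter reflects s toward the display
for pixel_on in (True, False):
    display_light = TRANSMIT_P @ lcos_reflect(toward_lcos, pixel_on)
    power = float(display_light @ display_light)  # relative power toward prism
    state = "on " if pixel_on else "off"
    print(f"pixel {state}: relative output power = {power:.2f}")
```

Running the sketch shows the "on" pixel passing half the (unpolarized) input power toward the prism and the "off" pixel passing none, which is the switching behavior the paragraph describes.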
The see-through display 300 includes a freeform prism 316 for directing light from the illumination prism 308 to the eye 302 of the user 304. The freeform prism 316 may have positive optical power and comprise at least three surfaces, each surface being non-planar and non-spherical. For example, the freeform prism 316 may include a user-facing surface 318 having a total internal reflective coating in order to direct the light into the user's eye 302. The freeform prism 316 may also include an outward-facing surface 320 opposite the user-facing total internal reflective surface, the outward-facing surface 320 having a coating that is highly reflective for infrared light and partially reflective for visible light. For example, the outward-facing surface 320 may be more reflective for infrared light than for visible light. In some embodiments, the outward-facing surface 320 may be configured to reflect substantially all infrared light and a portion of visible light.
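The excerpt does not give a surface prescription for the freeform prism 316, but non-planar, non-spherical surfaces of this kind are commonly specified as an XY-polynomial sag added to a base conic. The generic form below is provided only for orientation; the curvature, conic constant, and coefficients are unspecified placeholders, not values from this disclosure.

```latex
% Generic XY-polynomial freeform sag (a common convention; not the
% prescription of surfaces 318 and 320). c is the base curvature, k the
% conic constant, C_{ij} are freeform coefficients, and r^2 = x^2 + y^2.
z(x, y) \;=\; \frac{c\,r^{2}}{1 + \sqrt{1 - (1 + k)\,c^{2} r^{2}}}
\;+\; \sum_{i,j} C_{ij}\, x^{i} y^{j}
```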
The see-through display 300 may include one or more compensators positioned around the freeform prism 316.
The see-through display 300 also includes an eye tracking system comprising an imaging device, such as an eye tracking camera 330, and one or more glint sources 332 (e.g., one or more infrared light sources) configured to produce light for reflection from the user's eye. Emitted gaze-detection light, LG(E), from the glint sources 332 may travel along a gaze-detection optical path (indicated by rays originating at one of the glint sources 332) through at least a portion of the optics utilized for the display optical path.
The eye tracking system (e.g., the eye tracking camera 330 and/or the glint sources 332) may detect a location of the eye and/or anatomical structures thereof (e.g., a pupil of the eye 302). The eye tracking system may also detect a location of reflections from the glint sources 332 in the image data acquired via the eye tracking camera 330, and from this information determine a direction in which the eye is gazing.
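While the excerpt does not spell out the estimation algorithm, glint-plus-pupil tracking of this kind is often reduced to mapping the pupil-glint offset through a per-user calibration. The sketch below is one minimal, hypothetical realization, assuming a single glint and an affine calibrated mapping; practical systems typically use multiple glints and higher-order fits.

```python
import numpy as np

def pupil_glint_vector(pupil_px, glint_px):
    """Offset between pupil center and glint centroid, in image pixels."""
    return np.asarray(pupil_px, float) - np.asarray(glint_px, float)

def calibrate(offsets, gaze_angles):
    """Fit an affine map from pupil-glint offsets to gaze angles (degrees),
    using frames captured while the user fixated known targets."""
    A = np.hstack([offsets, np.ones((len(offsets), 1))])  # rows: [dx, dy, 1]
    coeffs, *_ = np.linalg.lstsq(A, gaze_angles, rcond=None)
    return coeffs                                         # 3x2 coefficient matrix

def estimate_gaze(coeffs, pupil_px, glint_px):
    dx, dy = pupil_glint_vector(pupil_px, glint_px)
    return np.array([dx, dy, 1.0]) @ coeffs               # (azimuth, elevation)

# Hypothetical calibration data: offsets observed at four known targets.
offsets = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0], [10.0, 8.0]])
angles = np.array([[0.0, 0.0], [15.0, 0.0], [0.0, 10.0], [15.0, 10.0]])
coeffs = calibrate(offsets, angles)

# Example frame: pupil at (105, 84), glint at (100, 80) -> ~(7.5, 5.0) degrees.
print(estimate_gaze(coeffs, pupil_px=(105.0, 84.0), glint_px=(100.0, 80.0)))
```

Because the telecentric configuration keeps the pupil and glint centroids fixed as eye relief changes, a mapping of this kind remains valid without recalibrating for depth.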
At 512, the method 500 includes receiving the gaze-detection light as reflected from the eye and directed through the freeform prism to a camera.
In some embodiments, the methods and processes described herein may be tied to a computing system of one or more computing devices. In particular, such methods and processes may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer-program product.
Computing system 600 includes a logic machine 602 and a storage machine 604. Computing system 600 may optionally include a display subsystem 606, input subsystem 608, communication subsystem 610, and/or other components not shown.
Logic machine 602 includes one or more physical devices configured to execute instructions. For example, the logic machine may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result.
The logic machine may include one or more processors configured to execute software instructions. Additionally or alternatively, the logic machine may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. Processors of the logic machine may be single-core or multi-core, and the instructions executed thereon may be configured for sequential, parallel, and/or distributed processing. Individual components of the logic machine optionally may be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic machine may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration.
Storage machine 604 includes one or more physical devices configured to hold instructions executable by the logic machine to implement the methods and processes described herein. When such methods and processes are implemented, the state of storage machine 604 may be transformed—e.g., to hold different data.
Storage machine 604 may include removable and/or built-in devices. Storage machine 604 may include optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., RAM, EPROM, EEPROM, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage machine 604 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
It will be appreciated that storage machine 604 includes one or more physical devices. However, aspects of the instructions described herein alternatively may be propagated by a communication medium (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration.
Aspects of logic machine 602 and storage machine 604 may be integrated together into one or more hardware-logic components. Such hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC), and complex programmable logic devices (CPLDs), for example.
When included, display subsystem 606 may be used to present a visual representation of data held by storage machine 604. This visual representation may take the form of a graphical user interface (GUI). As the herein described methods and processes change the data held by the storage machine, and thus transform the state of the storage machine, the state of display subsystem 606 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 606 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic machine 602 and/or storage machine 604 in a shared enclosure, or such display devices may be peripheral display devices. In some embodiments, the display subsystem may include a near-eye display such as see-through display 300.
When included, input subsystem 608 may comprise or interface with one or more user-input devices such as a keyboard, mouse, touch screen, or game controller. In some embodiments, the input subsystem may comprise or interface with selected natural user input (NUI) componentry. Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board. Example NUI componentry may include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity.
When included, communication subsystem 610 may be configured to communicatively couple computing system 600 with one or more other computing devices. Communication subsystem 610 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network. In some embodiments, the communication subsystem may allow computing system 600 to send and/or receive messages to and/or from other devices via a network such as the Internet.
It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and non-obvious combinations and sub-combinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.