This disclosure relates generally to the field of optics, and in particular but not exclusively, relates to near-to-eye optical systems.
A head mounted display (“HMD”) is a display device worn on or about the head. HMDs usually incorporate some sort of near-to-eye optical system to display an image within a few centimeters of the human eye. Single eye displays are referred to as monocular HMDs while dual eye displays are referred to as binocular HMDs. Some HMDs display only a computer generated image (“CGI”), while other types of HMDs are capable of superimposing CGI over a real-world view. This latter type of HMD is often referred to as augmented reality because the viewer's image of the world is augmented with an overlaying CGI, also referred to as a heads-up display (“HUD”).
HMDs have numerous practical and leisure applications. Aerospace applications permit a pilot to see vital flight control information without taking their eye off the flight path. Public safety applications include tactical displays of maps and thermal imaging. Other application fields include video games, transportation, and telecommunications. New practical and leisure applications are certain to be found as the technology evolves; however, many of these applications are currently limited by the cost, size, field of view, eyebox, and efficiency of the conventional optical systems used to implement existing HMDs.
Non-limiting and non-exhaustive embodiments of the invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
Embodiments of a system and technique for extending an eyebox of an image display apparatus using whole image scanning are described herein. In the following description numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
During operation, light source 205 generates whole image 201 by simultaneously projecting each pixel in a two-dimensional array of image pixels 235 that collectively represent the entire image.
Whole image 201 is projected onto scanning mirror 215, which reflects whole image 201 towards (or into) light bending optics 230, which deliver whole image 201 to eye 120. Scanning mirror 215 is physically mounted to an actuator 220, which is controlled by scanning controller 225. Under control of scanning controller 225, whole image 201 can be scanned across eye 120 via physical adjustments to the position of scanning mirror 215. Repetitiously scanning whole image 201 across multiple different locations results in an eyebox area 240 that is larger than whole image 201 itself. This has the effect of increasing eyebox area 240 relative to a conventional raster scanning technique.
Actuator 220 may be implemented using a variety of different electro-mechanical devices that can manipulate the position of scanning mirror 215 along one, two, or even three dimensions. For example, in one embodiment, actuator 220 is a micro-electro-mechanical system (“MEMS”) scanner and scanning mirror 215 is a reflective surface (e.g., metallic coating) disposed on the MEMS scanner. In one embodiment, the MEMS scanner is repositionable in two independent axes. For example, the MEMS scanner may be capable of controlled movement in two translational axes (e.g., X and Y axes, X and Z axes, etc.), capable of controlled movement about two rotational axes (e.g., R1 and R2), or capable of controlled movement in both a translational axis and a rotational axis together. Adjustments to the position of scanning mirror 215 via actuator 220 under control of scanning controller 225 enable the whole image 201 output from image source 205 to be redirected along multiple optical paths (e.g., optical paths 250 and 255). By sequentially and repetitiously cycling through the different optical paths (e.g., cycling through the set of optical paths 30 or 60 times per second), eyebox area 240 is enlarged since whole image 201 is viewable from a larger range of locations.
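The repetitious cycling through optical paths described above can be summarized as a simple control loop. The sketch below is illustrative only: the mirror positions, actuator command interface, and dwell timing are assumptions, not details taken from this disclosure.

```python
import time

# Hypothetical bias-relative mirror positions (e.g., rotations about two
# axes, in degrees); each one redirects the whole image along a different
# optical path, and together they enlarge the effective eyebox.
SCAN_POSITIONS = [(-1.0, 0.0), (0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
CYCLE_HZ = 60  # traverse the full set of positions 60 times per second


def scan_whole_image(set_mirror_position, cycles=1):
    """Cycle the scanning mirror through each optical path in turn.

    `set_mirror_position` stands in for the actuator command that
    repositions the MEMS scanner; the whole image is projected from
    each position for one dwell interval.
    """
    dwell = 1.0 / (CYCLE_HZ * len(SCAN_POSITIONS))
    for _ in range(cycles):
        for rx, ry in SCAN_POSITIONS:
            set_mirror_position(rx, ry)  # redirect the whole image
            time.sleep(dwell)            # hold until the next position
```

With four positions cycled at 60 Hz, each position is held for roughly 4 ms, fast enough that the viewer perceives a single enlarged eyebox rather than discrete image copies.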
Light bending optics 230 are provided so that whole image 201 may be generated at a location peripheral to eye 120 (e.g., the temple region of the user) and transported to a position in front of eye 120 for emission to eye 120 in a near-to-eye configuration. Light bending optics 230 may be implemented with a variety of different optical devices. The illustrated embodiment includes a simple reflective surface positioned in front of eye 120. In one embodiment, the reflective surface may be partially transparent so that external light from an external scene may pass through to eye 120 and the whole image 201 (e.g., computer generated image) may be superimposed over the external scene, thereby displaying an augmented reality to the user. In other embodiments, light bending optics 230 may include a variety of other structures including planar waveguides, light pipes, fused fiber bundles, partially reflective lenses, or otherwise.
Gaze tracking system 505 is provided to continuously monitor the movement of eye 120, to determine a gazing direction (e.g., location of the pupil) of eye 120 in real-time, and to provide feedback signals to actuator 220. The real-time feedback control can be used to dynamically adjust a bias position of scanning mirror 215 so that whole image 401 can be translated to track the movement of eye 120. In one embodiment, the feedback control can also be used to adjust a pre-distortion applied to image 201 to compensate for any dynamic image distortion that may occur due to positional changes of scanning mirror 215. Via appropriate feedback control, the central scanning position of whole image 401 can be made to move with eye 120 in a complementary manner to further increase the size of eyebox 410 and/or the field of view of image 401 displayed to eye 120. For example, if eye 120 looks left, then image 401 may be shifted to the left to track the eye movement and remain in the user's central vision. Gaze tracking system 505 may also be configured to implement various other functions. For example, gaze tracking system 505 may be used to implement a user interface controlled by eye motions that enables the user to select objects within their vision and issue other commands.
In one embodiment, gaze tracking camera 510 can be positioned to acquire a direct image of eye 120. In other embodiments, gaze tracking camera 510 can be positioned to acquire eye images via reflection from light bending optics 230.
In a process block 705, gaze tracking camera 510 captures gazing image 615 of eye 120. Gazing image 615 may be acquired as a direct image or as a reflection off one or more reflective surfaces. A new gazing image 615 may be continually acquired as a video stream of images. In a process block 710, gazing image 615 is analyzed by gaze tracking controller 610 to determine the current gazing direction of eye 120. The gazing direction may be determined based upon the location of the pupil within gazing image 615. With the real-time gazing direction determined, gaze tracking controller 610 can provide feedback control signals to scanning controller 225 to adjust the bias point of actuator 220 and scanning mirror 215 in real-time (process block 715). The bias point is a position about which scanning mirror 215 is repetitiously adjusted; however, the bias point can be any position in the repetitious cycle of motion and need not be a center position. In one embodiment, gaze tracking controller 610 may further provide a feedback control signal to CGI engine 605 to facilitate real-time pre-distortion correction to compensate for any image distortion that may occur from scanning mirror 215, diffractive lens 405, or light bending optics 230. Real-time adjustments to scanning mirror 215 may cause dynamically changing optical distortion. Accordingly, the feedback control signal to CGI engine 605 may be used to impart compensating pre-distortion. Pre-distortion may include applying various types of complementary optical correction effects such as keystone, barrel, and pincushion corrections.
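As a rough illustration of the barrel and pincushion corrections mentioned above, radial distortion can be modeled with a single radial coefficient applied to pixel coordinates before display, so that the opposite distortion introduced by the optics cancels it. The coordinate convention and coefficient below are assumptions for illustration, not the disclosure's own correction model.

```python
import numpy as np

def predistort(points, k):
    """Radially pre-distort normalized (x, y) pixel coordinates.

    Maps each point at radius r to radius r * (1 + k * r**2):
    k < 0 pulls points inward (barrel), k > 0 pushes them outward
    (pincushion), so a complementary k cancels the optics' distortion.
    """
    pts = np.asarray(points, dtype=float)
    r2 = np.sum(pts ** 2, axis=1, keepdims=True)  # squared radius per point
    return pts * (1.0 + k * r2)
```

For example, a point at (0.5, 0) with k = -0.2 moves inward to (0.475, 0), while the image center is left unchanged.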
In a process block 720, the CGI is generated by CGI engine 605 and output from image source 205 as whole image 201. In a process block 725, scanning controller 225 instructs actuator 220 to commence repetitious scanning of whole image 201 across eye 120 about the bias point set in process block 715. In a process block 730, a new gazing image is captured of eye 120 and used to determine whether eye 120 has changed its gaze. If eye movement is detected (decision block 735), then process 700 returns to process block 710 and continues from there. Each time eye 120 moves, the gazing direction is determined and a new bias point is selected based upon the gazing direction. The whole image is then repetitiously scanned about the bias point.
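The process blocks above form a feedback loop, which can be sketched as follows. All component interfaces here (image capture, gaze estimator, scanner object) are hypothetical stand-ins for gaze tracking camera 510, gaze tracking controller 610, and scanning controller 225.

```python
def run_gaze_feedback_loop(capture_image, estimate_gaze, scanner, frames):
    """One simplified iteration of the described process per captured frame."""
    last_gaze = None
    for _ in range(frames):
        image = capture_image()           # process blocks 705 / 730: capture
        gaze = estimate_gaze(image)       # process block 710: pupil location
        if gaze != last_gaze:             # decision block 735: eye moved?
            scanner.set_bias_point(gaze)  # process block 715: new bias point
            last_gaze = gaze
        scanner.scan_whole_image()        # process blocks 720-725: rescan
```

Note that the bias point is only updated when the estimated gaze changes, while the repetitious whole-image scan runs every frame about whatever bias point is current.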
In the illustrated embodiment, light bending optics 820 include a light pipe or housing in which scanning mirror 215 and a partially transparent redirecting mirror are embedded. Whole image 201 is directed onto the partially transparent mirror disposed in front of eye 120, where whole image 201 exits towards eye 120. Although the illustrated embodiment of HMD 800 includes light pipes for implementing the light bending optics, it should be appreciated that embodiments are not limited in this regard. Rather, a variety of different light bending optics, including free space air and a redirecting mirror, may be used.
The two near-to-eye optical systems 801 are secured into an eye glass arrangement that can be worn on the head of a user. The left and right ear arms 810 and 815 rest over the user's ears while nose assembly 805 rests over the user's nose. The frame assembly is shaped and sized to position an emission surface of light bending optics 820 in front of a corresponding eye 120 of the user. Of course, other frame assemblies may be used (e.g., single, contiguous visor for both eyes, integrated headband or goggles type eyewear, etc.).
The illustrated embodiment of HMD 800 is capable of displaying an augmented reality to the user. Light bending optics 820 permit the user to see a real world image via external scene light 825. Left and right (binocular embodiment) CGIs 830 may be generated by one or two CGI engines (not illustrated) coupled to respective light sources 205. CGIs 830 are seen by the user as virtual images superimposed over the real world as an augmented reality. In some embodiments, external scene light 825 may be blocked or selectively blocked to provide a head mounted virtual reality display.
The processes explained above are described in terms of computer software and hardware. The techniques described may constitute machine-executable instructions embodied within a tangible machine (e.g., computer) readable storage medium that, when executed by a machine, will cause the machine to perform the operations described. Additionally, the processes may be embodied within hardware, such as an application specific integrated circuit (“ASIC”) or the like.
A tangible machine-readable storage medium includes any mechanism that provides (i.e., stores) information in a form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.). For example, a machine-readable storage medium includes recordable/non-recordable media (e.g., read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.).
The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.
Number | Name | Date | Kind |
---|---|---|---|
4806011 | Bettinger | Feb 1989 | A |
5093567 | Staveley | Mar 1992 | A |
5334991 | Wells et al. | Aug 1994 | A |
5539422 | Heacock et al. | Jul 1996 | A |
5696521 | Robinson et al. | Dec 1997 | A |
5715337 | Spitzer et al. | Feb 1998 | A |
5771124 | Kintz et al. | Jun 1998 | A |
5815126 | Fan et al. | Sep 1998 | A |
5844530 | Tosaki | Dec 1998 | A |
5886822 | Spitzer | Mar 1999 | A |
5896232 | Budd et al. | Apr 1999 | A |
5943171 | Budd et al. | Aug 1999 | A |
5949583 | Rallison et al. | Sep 1999 | A |
6023372 | Spitzer et al. | Feb 2000 | A |
6081304 | Kuriyama et al. | Jun 2000 | A |
6091546 | Spitzer | Jul 2000 | A |
6172657 | Kamakura et al. | Jan 2001 | B1 |
6201629 | McClelland et al. | Mar 2001 | B1 |
6204974 | Spitzer | Mar 2001 | B1 |
6222677 | Budd et al. | Apr 2001 | B1 |
6349001 | Spitzer | Feb 2002 | B1 |
6353492 | McClelland et al. | Mar 2002 | B2 |
6353503 | Spitzer et al. | Mar 2002 | B1 |
6356392 | Spitzer | Mar 2002 | B1 |
6384982 | Spitzer | May 2002 | B1 |
6538799 | McClelland et al. | Mar 2003 | B2 |
6618099 | Spitzer | Sep 2003 | B1 |
6690516 | Aritake et al. | Feb 2004 | B2 |
6701038 | Rensing et al. | Mar 2004 | B2 |
6723354 | Spitzer | Apr 2004 | B1 |
6724354 | Spitzer et al. | Apr 2004 | B1 |
6738535 | Kanevsky et al. | May 2004 | B2 |
6747611 | Budd et al. | Jun 2004 | B1 |
6829095 | Amitai | Dec 2004 | B2 |
6879443 | Spitzer et al. | Apr 2005 | B2 |
7158096 | Spitzer | Jan 2007 | B1 |
7242527 | Spitzer et al. | Jul 2007 | B2 |
7391573 | Amitai | Jun 2008 | B2 |
7457040 | Amitai | Nov 2008 | B2 |
7576916 | Amitai | Aug 2009 | B2 |
7577326 | Amitai | Aug 2009 | B2 |
7637617 | Liu et al. | Dec 2009 | B2 |
7643214 | Amitai | Jan 2010 | B2 |
7663805 | Zaloum et al. | Feb 2010 | B2 |
7672055 | Amitai | Mar 2010 | B2 |
7724441 | Amitai | May 2010 | B2 |
7724442 | Amitai | May 2010 | B2 |
7724443 | Amitai | May 2010 | B2 |
7830607 | Hotta et al. | Nov 2010 | B2 |
7843403 | Spitzer | Nov 2010 | B2 |
7900068 | Weststrate et al. | Mar 2011 | B2 |
8004765 | Amitai | Aug 2011 | B2 |
20010048554 | Yona et al. | Dec 2001 | A1 |
20020097498 | Melville et al. | Jul 2002 | A1 |
20030090439 | Spitzer et al. | May 2003 | A1 |
20050174651 | Spitzer et al. | Aug 2005 | A1 |
20060192306 | Giller et al. | Aug 2006 | A1 |
20060192307 | Giller et al. | Aug 2006 | A1 |
20080219025 | Spitzer et al. | Sep 2008 | A1 |
20090122414 | Amitai | May 2009 | A1 |
20100046070 | Mukawa | Feb 2010 | A1 |
20100103078 | Mukawa et al. | Apr 2010 | A1 |
20100149073 | Chaum et al. | Jun 2010 | A1 |
20100278480 | Vasylyev et al. | Nov 2010 | A1 |
20110213664 | Osterhout et al. | Sep 2011 | A1 |
Number | Date | Country |
---|---|---|
2272980 | Jun 1994 | GB |
10-2011-0065375 | Jun 2011 | KR |
WO9605533 | Feb 1996 | WO |
WO 0133282 | May 2001 | WO |
Entry |
---|
Kasai, I., Tanijiri, Y., Endo, T., and Ueda, H., “A Forgettable Near Eye Display,” Optics Technology Division, Minolta Co., Ltd.; The Fourth International Symposium on Wearable Computers, 2000, pp. 115-118. |
Cakmakci, O. and Rolland, J., “Head-Worn Displays: A Review,” Journal of Display Technology, vol. 2, No. 3, Sep. 2006, pp. 199-216. |
Levola, Tapani, “Diffractive Optics for Virtual Reality Displays”, Academic Dissertation, Joensuu 2005, University of Joensuu, Department of Physics, Vaisala Laboratory, 26 pages. |
Mukawa, Hiroshi et al., “Distinguished Paper: A Full Color Eyewear Display using Holographic Planar Waveguides”, SID Symposium Digest of Technical Papers—May 2008—vol. 39, Issue 1, pp. 89-92. |
Microvision Color Eyewear Program Brief, Microvision, Inc., http://www.microvision.com/pdfs/program_brief.pdf <retrieved from Internet May 12, 2011>, 2 pages. |
Microvision Technology: PicoP Display Engine, Microvision, Inc., http://www.microvision.com/technology/picop.html <retrieved from Internet May 12, 2011>, 1 page. |
Microvision Technology: MEMS Scanning Mirror, Microvision, Inc., http://www.microvision.com/technology/mems.html <retrieved from Internet May 12, 2011>, 1 page. |
Microvision Bar Code Scanners: MEMS Technology Overview, Microvision, Inc., http://www.microvision.com/barcode/mems_scanner.html <retrieved from Internet May 12, 2011>, 2 pages. |
PCT/US2012/041153; PCT International Search Report and Written Opinion of the International Searching Authority, mailed Jan. 31, 2013, 11 pages. |
PCT/US2012/041153; PCT International Preliminary Report on Patentability, mailed Jan. 23, 2014, 8 pages. |
Number | Date | Country | |
---|---|---|---|
20130016413 A1 | Jan 2013 | US |