This disclosure relates generally to the field of optics, and in particular but not exclusively, relates to near-to-eye optical systems.
A head mounted display (“HMD”) is a display device worn on or about the head. HMDs usually incorporate some sort of near-to-eye optical system to emit a light image within a few centimeters of the human eye. Single eye displays are referred to as monocular HMDs while dual eye displays are referred to as binocular HMDs. Some HMDs display only a computer generated image (“CGI”), while other types of HMDs are capable of superimposing CGI over a real-world view. This latter type of HMD can serve as the hardware platform for realizing augmented reality. With augmented reality the viewer's image of the world is augmented with an overlaying CGI, also referred to as a heads-up display (“HUD”).
HMDs have numerous practical and leisure applications. Aerospace applications permit a pilot to see vital flight control information without taking their eye off the flight path. Public safety applications include tactical displays of maps and thermal imaging. Other application fields include video games, transportation, and telecommunications. New practical and leisure applications are certain to be found as the technology evolves; however, many of these applications are limited by the cost, size, weight, field of view, and efficiency of the conventional optical systems used to implement existing HMDs.
Non-limiting and non-exhaustive embodiments of the invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
Embodiments of a system and a method of operation for a head mounted display (“HMD”) are described herein. In the following description numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Wearable glasses 100 may include a controller 105 disposed in right temple arm 140 and a computer generated image (“CGI”) engine 110 disposed in left temple arm 130. Controller 105 and CGI engine 110 may be disposed in other locations in wearable glasses 100. Controller 105 may include an integrated circuit with hardware, firmware, or software logic. CGI engine 110 may include a processor and graphics engine for rendering image data. In one embodiment, controller 105 and CGI engine 110 are combined in a single integrated chip. Controller 105 may be used to receive, transmit, and process data and to communicate with CGI engine 110.
The illustrated embodiment of wearable glasses 100 includes a miniature projector 115 and a camera 120 disposed in frame 125. Miniature projector 115 and camera 120 may be located in different locations in the frame and more than one camera or projector may be utilized in some embodiments. CGI engine 110 generates images that are transmitted to miniature projector 115 for projection. Miniature projector 115 may project a surface projection 150 to facilitate user input. The illustrated embodiment shows surface projection 150 as a numeric keypad showing the numbers one through nine. However, surface projection 150 could be a full QWERTY keyboard, a document, a webpage, or other projectable images.
Camera 120 may be used to monitor user interaction with surface projection 150. For calibration purposes, miniature projector 115 may project a calibration image for the user to interact with. Camera 120 may then take images of the user interacting with specific points on the calibration image, and those images could be processed to establish spatial data points to compare against the user's interaction with future projections. Using the spatial data points established in calibration, camera 120 may monitor a number input sequence by a user where the user is interacting with surface projection 150. Camera 120 may report the number input sequence to controller 105 for processing, and controller 105 may instruct CGI engine 110 to generate an image based on the user interaction. In one embodiment, camera 120 may monitor user gestures (e.g. a wave of a user's hand) as inputs for browsing a webpage, viewing pictures, flipping pages, or otherwise.
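By way of illustration only, the calibration described above can be viewed as solving for a mapping from camera pixel coordinates to coordinates on the projected keypad, after which each observed touch can be looked up against the keypad layout. The sketch below assumes four calibration touches at the corners of surface projection 150 and a 3x3 arrangement of the digits one through nine; the function names and numeric values are hypothetical and are not part of the disclosure.

```python
import numpy as np

# Hypothetical sketch: map a fingertip position seen by camera 120 onto the
# 3x3 numeric keypad of surface projection 150, using four calibration touches
# at the keypad corners. Names and numeric values are illustrative only.

def fit_affine(cam_pts, pad_pts):
    """Least-squares affine map from camera pixels to normalized keypad coords."""
    A = np.hstack([cam_pts, np.ones((len(cam_pts), 1))])  # rows of [x, y, 1]
    M, *_ = np.linalg.lstsq(A, pad_pts, rcond=None)       # solve A @ M ~= pad_pts
    return M                                              # 3x2 affine matrix

def lookup_key(M, cam_xy):
    """Return the keypad digit (1-9) under the observed fingertip, or None."""
    u, v = np.append(cam_xy, 1.0) @ M          # normalized keypad coordinates
    if not (0.0 <= u < 1.0 and 0.0 <= v < 1.0):
        return None                            # touch landed off the keypad
    col, row = int(u * 3), int(v * 3)          # 3x3 grid: row 0 holds 1, 2, 3
    return row * 3 + col + 1

# Calibration: the user touches the four corners of the projected keypad.
cam_corners = np.array([[102, 80], [410, 95], [398, 330], [95, 318]], float)
pad_corners = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
M = fit_affine(cam_corners, pad_corners)
print(lookup_key(M, np.array([250.0, 200.0])))  # expected to land near the center key
```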
When a first user (wearing wearable glasses 100) wants to show a second user a document, video, or picture, the first user can simply project the content onto a surface for viewing. Eye 260 of the first user of the wearable glasses can look through lens 145 and see surface projection 150. Surface projection 150 may also be user-interactive (e.g. typing on virtual keyboard) in addition to simply displaying content for viewing. Further, surface projection 150 may be user-interactive with users not wearing wearable glasses 100. For example, a person wearing wearable glasses 100 may want to project a virtual keyboard for another person to type on or project a schematic drawing program for another person to modify a schematic. Camera 120 may monitor the user interaction with surface projection 150 as an input to controller 105. The shared nature of wearable glasses 100 may also allow for multi-player gaming applications with one or more users wearing wearable glasses 100.
The illustrated embodiment of eyepiece 300 also includes eye-ward optics 355 on an eye-ward side of eyepiece 300 for focusing private projections (intended only for the eyes of the user) into an eye 360 of a user wearing wearable glasses that include eyepiece 300. Eyepiece 300 may include projection optics 365 on an external scene-side of eyepiece 300 for focusing public projections, such as surface projection 150, onto a surface. Projection optics 365 and eye-ward optics 355 may not be necessary for a highly collimated source (e.g. laser). The illustrated embodiment of projection optics 365 includes a selectable blocking element 370 that, when activated, prevents a public projection from being externally projected. The selectable blocking element 370 may be a mechanical shutter, a neutral density filter controlled by a voltage (e.g. an LCD shutter), or otherwise.
Illumination module 305 launches CGI light (which may be generically referred to as display light) along a forward path 380 toward NBS 315. Light relay 375 has a transparent structure to permit the CGI light to pass through along forward path 380. Light relay 375 may be fabricated of a solid transparent material (e.g., glass, quartz, acrylic, clear plastic, PMMA, ZEONEX-E48R, etc.) or be implemented as a solid housing having an inner air gap through which the CGI light passes. Light relay 375 operates to protect the optical path, but may not use total internal reflection (“TIR”) to guide or confine the CGI light.
After the CGI light along forward path 380 passes through light relay 375, NBS 315 reflects a first portion of the CGI light towards the external scene-side of eyepiece 300 and passes a second portion of the CGI light. In one embodiment, NBS 315 is a 45 degree 50/50 NBS, meaning it reflects 50 percent of light and passes the other 50 percent of light. The CGI light passed by NBS 315 continues along forward path 380 and end reflector 325 reflects back the CGI light along a reverse path 385. The CGI light along reverse path 385 encounters NBS 315, which reflects a portion of the CGI light along reverse path 385 toward an eye-ward side of eyepiece 300. The CGI light reflected toward the eye-ward side of eyepiece 300 may encounter eye-ward optics 355. Eye-ward optics 355 may rotate and/or focus the CGI light for eye 360. In one embodiment, NBS 315 may pass a smaller portion of light than it reflects to adjust the brightness of the CGI light for projection into eye 360.
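For an ideal, lossless splitter, the fractions of launched CGI light directed to the external scene-side (on forward path 380) and to the eye-ward side (after reflection from end reflector 325 along reverse path 385) follow directly from the reflectance of NBS 315. The short sketch below works through that arithmetic under the assumption of a lossless splitter and a perfectly reflecting end reflector; it is illustrative only.

```python
# Illustrative only: light budget for a lossless beam splitter (NBS 315) with an
# ideal end reflector 325. The public image sees one reflection on the forward
# pass; the eye-ward image is transmitted forward and reflected on the way back.

def split_fractions(reflectance):
    transmittance = 1.0 - reflectance          # lossless splitter assumption
    public_fraction = reflectance              # reflected toward the scene side
    eye_ward_fraction = transmittance * reflectance
    return public_fraction, eye_ward_fraction

print(split_fractions(0.5))   # (0.5, 0.25): a 50/50 NBS delivers 25% of the light to the eye
```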
Illumination module 405 launches polarized CGI light along forward path 480 toward selectable waveplate retarder 410. Selectable waveplate retarder 410 configures the light to a selected polarization orientation (e.g. “S” or “P” polarization) in response to a received signal. The illustrated embodiment shows a control signal (CTRL) controlling the selection/configuring of the polarization orientation by selectable waveplate retarder 410. The control signal received by selectable waveplate retarder 410 may be a voltage, current, optical signal, or otherwise. The control signal received by selectable waveplate retarder 410 may be received from controller 105 or CGI engine 110. Selectable waveplate retarder 410 may be made from a layer of nematic liquid crystal that configures the polarization of light. In one embodiment, selectable waveplate retarder 410 is a one-half wavelength polarization rotator that configures the polarization orientation of light when receiving a voltage signal.
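The action of a switchable half-wave retarder on the polarization state can be illustrated with standard Jones calculus. The minimal model below assumes that, when the control signal is asserted, selectable waveplate retarder 410 behaves as a half-wave plate with its fast axis at 45 degrees (which exchanges the “S” and “P” orientations), and that it passes light unchanged otherwise; the basis labels and signal names are illustrative assumptions.

```python
import numpy as np

# Minimal Jones-calculus model of selectable waveplate retarder 410 (assumed
# behavior): control asserted -> half-wave plate with its fast axis at 45
# degrees, which swaps "S" and "P"; control deasserted -> light is unchanged.

S = np.array([1.0, 0.0])                 # "S" polarization (one basis vector)
P = np.array([0.0, 1.0])                 # "P" polarization (orthogonal vector)

HALF_WAVE_45 = np.array([[0.0, 1.0],
                         [1.0, 0.0]])    # Jones matrix, global phase dropped
IDENTITY = np.eye(2)

def retarder(jones_in, ctrl_asserted):
    matrix = HALF_WAVE_45 if ctrl_asserted else IDENTITY
    return matrix @ jones_in

print(retarder(S, ctrl_asserted=True))   # [0. 1.] -> rotated to "P"
print(retarder(S, ctrl_asserted=False))  # [1. 0.] -> unchanged "S"
```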
In one embodiment, selectable waveplate retarder 410 alternates between rotating the polarized CGI light and not rotating the polarized CGI light. When this happens at a high enough modulation rate, it may appear to the human eye that there is both a public, external projection and a private, near-to-eye projection at the same time. For example, if the signal controlling selectable waveplate retarder 410 were a 120 Hertz square wave, the human eye may perceive both a private and a public projection, even though the CGI light would be a public projection a portion of the time and a private projection for the other portion of the time. In one embodiment, the portion of time that the public projection or the private projection is projected is tuned as a way to control the brightness of each projection. In one embodiment, the polarized CGI light launched by illumination module 405 includes public CGI light and private CGI light. The public CGI light and the private CGI light may be interlaced and launched in sync with the signal that controls selectable waveplate retarder 410. The in-sync signal (which may be at 60 Hertz or higher) may cause the selectable waveplate retarder to give the public CGI light a first polarization orientation (e.g., “S”) and the private CGI light a second polarization orientation (e.g., “P”).
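The time-division operation described above amounts to synchronizing the interlaced public and private frames with the square wave that drives selectable waveplate retarder 410, with the duty cycle of that square wave trading brightness between the two projections. The sketch below shows one way such a schedule could be expressed; the rates, the duty-cycle parameter, and the mapping between control state and projection target are assumptions rather than the disclosed implementation.

```python
# Illustrative scheduler: one modulation period is split between a public
# interval (retarder control asserted, assumed here to steer light outward) and
# a private interval (control deasserted, steering light eye-ward). The duty
# cycle trades brightness between the two projections. Values are assumptions.

MODULATION_HZ = 120                      # fast enough to appear simultaneous
PERIOD_S = 1.0 / MODULATION_HZ

def split_period(public_duty=0.5):
    """Split one modulation period between the public and private frames."""
    public_ms = PERIOD_S * public_duty * 1000.0
    private_ms = PERIOD_S * (1.0 - public_duty) * 1000.0
    return [("public", True, public_ms),
            ("private", False, private_ms)]

# A 30% duty cycle dims the public projection relative to the private one.
for name, ctrl, dwell_ms in split_period(public_duty=0.3):
    print(f"{name}: retarder ctrl={ctrl}, dwell {dwell_ms:.2f} ms")
```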
There may be several practical or leisure applications for eyepiece 400 when it is configured to interlace public and private CGI light at a modulation rate imperceptible to the human eye. In one embodiment, a document could be displayed publicly for more than one person to see, while a private projection of confidential information pertaining to the publicly projected document is projected into the eye of the user/wearer of eyepiece 400. In one embodiment, the public projection is a virtual keyboard and the private projection is a word-processing display of the text that has been entered, allowing the user to view what she has written. This arrangement keeps a document or an email authored by the user confidential. In one embodiment, a game board could be projected publicly for players to view, while a solution to the game board could be projected privately into the eye of the user/wearer of eyepiece 400. Interlacing public and private CGI light may also yield power efficiencies because one projector can be used for projecting both public and private CGI light.
In process block 605, a projection target is selected for projection of polarized CGI light. Either a surface external to the HMD (public projection) or the eye of a user of the HMD (private projection) is selected as the projection target. An input from a user interface disposed in the HMD may determine which projection target is selected. Data associated with the CGI that will be displayed may also determine which projection target is selected. In process block 610, polarized CGI light is emitted (or launched) from an illumination module (e.g. illumination module 405). In process block 615, a polarization orientation of the polarized CGI light is manipulated to steer the CGI light to the selected projection target. One way of manipulating the polarization orientation is with selectable waveplate retarder 410. Manipulating the polarization orientation of the CGI light may direct it to the projection target because polarizing optics (e.g. PBS 415) reflect light having one polarization orientation and pass light having a second polarization orientation. In process block 620, a camera is activated to monitor a user interaction with the projection. The camera can be activated to monitor a user interaction with a private projection as well as a public projection. For example, if the user is viewing a private projection of pictures, the camera may monitor hand motions or gestures that enable browsing through the pictures. For public projections, the camera may monitor a projected virtual keyboard that the user (or another person) uses to input data.
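Expressed as control logic, process blocks 605 through 620 reduce to selecting a target, launching the polarized CGI light, setting the retarder so the PBS steers the light to that target, and enabling the camera when interaction is expected. The sketch below models only that decision flow; the function and field names are hypothetical.

```python
# Hypothetical sketch of process 600 (blocks 605-620): only the decision logic
# is modeled; hardware actions are represented by the returned plan.

def plan_projection(user_selected_target=None, cgi_is_shareable=False,
                    interactive=True):
    # 605: select the projection target (an external surface or the user's eye).
    if user_selected_target is not None:
        target = user_selected_target          # explicit user-interface input
    else:
        target = "public" if cgi_is_shareable else "private"

    return {
        "emit_polarized_cgi": True,            # 610: launch from the illumination module
        "retarder_ctrl": target == "public",   # 615: steer via polarization (assumed mapping)
        "camera_on": interactive,              # 620: monitor user interaction
        "target": target,
    }

print(plan_projection(cgi_is_shareable=True))
```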
In process block 705, a determination is made regarding whether two displays (or projections) will be utilized. If two displays will be utilized, a public and a private image are provided (process block 710). In process block 715, hardware is configured to display the public and the private image. In process block 705, if only one display will be utilized, an image is provided (process block 720), and a determination is made as to whether that image will be displayed publicly or privately (process block 725). If the image is to be displayed privately, hardware is configured to display the image privately (process block 730). If the image is to be displayed publicly, hardware is configured to display the image publicly (process block 735). At process block 740, a determination is made as to whether a displayed image will be user-interactive. If the display (or projection) will be user-interactive, then a camera (located on the HMD) is turned on to monitor the user input or user interaction with the display (process block 745).
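The branching of process blocks 705 through 745 can likewise be summarized as a small decision routine. As with the previous sketch, only the control flow stated above is modeled and the names are assumptions.

```python
# Hypothetical sketch of process 700 (blocks 705-745): decide how many
# projections to drive, whether each is public or private, and whether to
# enable the camera for interaction.

def configure_displays(two_displays, image_is_public=False, interactive=False):
    config = {"projections": [], "camera_on": False}

    if two_displays:                           # 705 -> 710, 715
        config["projections"] = ["public", "private"]
    elif image_is_public:                      # 705 -> 720, 725 -> 735
        config["projections"] = ["public"]
    else:                                      # 705 -> 720, 725 -> 730
        config["projections"] = ["private"]

    if interactive:                            # 740 -> 745
        config["camera_on"] = True             # camera monitors the interaction
    return config

print(configure_displays(two_displays=False, image_is_public=True, interactive=True))
```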
The processes explained above are described in terms of computer software and hardware. The techniques described may constitute machine-executable instructions embodied within a tangible or non-transitory machine-readable (e.g., computer-readable) storage medium that, when executed by a machine, will cause the machine to perform the operations described. Additionally, the processes may be embodied within hardware, such as an application specific integrated circuit (“ASIC”) or otherwise.
A tangible non-transitory machine-readable storage medium includes any mechanism that provides (i.e., stores) information in a form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.). For example, a machine-readable storage medium includes recordable/non-recordable media (e.g., read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.).
The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.