Wearable systems can integrate various elements, such as miniaturized computers, input devices, sensors, image displays, wireless communication devices, and image and audio processors, into a device that can be worn by a user. Such systems can provide a mobile and lightweight solution to communicating, computing, and interacting with a user's environment. With the advance of technologies associated with wearable systems and miniaturized optical elements, it has become possible to consider wearable compact optical display systems that augment the user's experience of a real-world environment.
In one example, by placing an image display element or component close to the user's eye(s), an artificial or virtual computer-generated image can be displayed over the user's view of the real-world environment. One or more such image display elements can be incorporated into optical display systems and referred to generally as near-eye displays, head-mounted displays (“HMDs”), or heads-up displays (“HUDs”). Depending upon the size of the display element and the distance to the user's eye, the artificial image may fill or nearly fill the user's field of view.
In a first aspect, a display system includes an image generator configured to generate a virtual image and a first beam splitter coupled to the image generator. The virtual image and a real-world view are viewable through the first beam splitter from a viewing location. The display system also includes a second beam splitter coupled to the first beam splitter and a camera coupled to the second beam splitter. The camera is configured to image the viewing location. Further, a controller is coupled to the image generator and the camera. The controller is configured to control an operation of the display system based on the image of the viewing location.
In a second aspect, a display system includes a display panel configured to generate a light pattern. One or more optical components are coupled to the display panel. The one or more optical components are configured to transmit the light pattern, external light from a real-world environment, and reflected light from a viewing location. Further, the light pattern is viewable from the viewing location through the one or more optical components as a virtual image superimposed over the real-world environment. The display system also includes an optical sensor coupled to the one or more optical components and configured to receive the external light to obtain an image of the real-world environment and to receive the reflected light to obtain an image of the viewing location. A processor is coupled to the display panel and the optical sensor and is configured to process the image of the real-world environment and the image of the viewing location.
In a third aspect, a method includes generating a light pattern using a display panel and forming a computer-generated image from the light pattern utilizing one or more optical components. The computer-generated image is viewable from a viewing location. The method also includes receiving external light from a real-world environment through the one or more optical components and incident on an optical sensor. The real-world environment is viewable from the viewing location. In addition, the method includes obtaining an image of the real-world environment from the received external light, receiving light reflected from the viewing location and incident on the optical sensor, and obtaining an image of the viewing location from the received reflected light. Further, the method includes controlling the generation of the light pattern based on the image of the viewing location.
The present disclosure generally relates to an optical display system that enables a user to observe the user's real-world surroundings or environment and to view a computer-generated virtual image. In some cases, the virtual image overlays a portion of the user's field of view of the real world.
In accordance with one example, the display system of the present disclosure includes a see-through wearable computer system, such as an HMD that displays a computer-generated virtual image that may be overlaid on a portion of the user's field of view of the real-world environment or surroundings. Thus, while the user of the HMD is going about his or her daily activities, such as walking, driving, exercising, etc., the user may be able to see a displayed image generated by the HMD at the same time that the user is looking out at his or her real-world surroundings.
The virtual image may include, for example, graphics, text, and/or video that provide content, such as data, alerts, or indications relating to the user's real-world environment. The content of the virtual image can relate to any number of contexts, including, but not limited to, the user's current environment, an activity in which the user is currently engaged, a biometric status of the user, and any audio, video, or textual communications that have been directed to the user. The virtual image may also be part of an interactive user interface and include menus, selection boxes, navigation icons, or other user interface features that enable the user to interact with the display system and other devices.
The virtual image can be updated or modified dynamically in response to a change in the context, such as a change in the user's real-world field of view, a change in the user's current activity, a received communication, a preset alarm or reminder, an interaction with a user interface or menu, etc. In one example of the present disclosure, the virtual image can be changed or modified in response to gaze tracking of the user. More particularly, in the present example, the display system is configured to track the gaze of the user and to identify one or more locations in the user's real-world view or in the virtual image where the user's eye is focused. Based on such identified location(s), the virtual image can be modified to relate to a feature in the real-world view or to some portion of the virtual image.
Referring now to
Generally, the processor 32 controls the image generator 26 to generate a light pattern that is directed through the optical component(s) 28 to form the virtual image that is viewable by the user 22. In addition, the processor 32 and the optical sensor 30 are configured to obtain a representation of the real-world environment and to track the gaze of the user 22. In response to the gaze tracking, the processor 32 is further configured to control the light pattern generated by the image generator 26 to update or modify the virtual image viewable by the user 22. The virtual image may be updated or modified in response to the gaze tracking by changing the location, size, brightness, content, and/or other properties thereof.
For example, in response to the processor 32 determining that a user's eye is focused on a sign on a sidewalk for a particular bus route, the virtual image can be modified to provide additional information for the bus route, such as an estimated time of arrival for the next bus, an estimated travel time to arrive at a predetermined destination if riding the bus, a required fare for the bus, etc. In another example, in response to the processor 32 determining that the user's eye is focused on an icon in the virtual image that indicates a new email or text message has been received, the virtual image can be modified to display the email or text message.
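As a concrete illustration of this kind of gaze-driven update, the following sketch maps a detected gaze target to a change in the virtual image. All of the names here (GazeHit, show_panel, the specific target strings) are hypothetical placeholders for whatever interface the processor 32 and image generator 26 actually expose; the sketch only shows the dispatch pattern.

```python
from dataclasses import dataclass

@dataclass
class GazeHit:
    kind: str      # "real_world" or "virtual_image"
    target: str    # e.g. "bus_stop_sign" or "new_message_icon"

def update_virtual_image(hit: GazeHit, display) -> None:
    """Modify the displayed virtual image based on where the user's eye is focused."""
    if hit.kind == "real_world" and hit.target == "bus_stop_sign":
        # Overlay transit details for the route the user is looking at.
        display.show_panel("transit_info",
                           fields=["next_arrival", "travel_time", "fare"])
    elif hit.kind == "virtual_image" and hit.target == "new_message_icon":
        # Expand the notification icon into the message body.
        display.show_panel("message_view")
    else:
        # Nothing actionable under the gaze point; remove transient overlays.
        display.clear_transient_panels()
```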
In the present example, the data storage 34 can be any suitable device or computer readable medium that is capable of storing data and instructions that can be executed by the processor 32 to control the image generator 26, to obtain the representation of the real-world environment, to track the gaze of the user 22, and to control other components of the display system 20. The power supply 36 provides electrical power to the various components of the display system 20 and can be any suitable rechargeable or non-rechargeable power supply. Further, the I/O components 38 may include switches, dials, buttons, touch screens, etc. that allow the user 22 to interact with the display system 20. The I/O components 38 may also include, for example, speakers, microphones, biometric sensors, environmental sensors, and transmitters and/or receivers for communicating with other devices, servers, networks, and the like.
Further, the processor 32 can also be configured to control other components of the display system 20 in response to the gaze tracking. For example, in response to the processor 32 identifying that the user's eye(s) are focused on an icon in the virtual image that indicates a voicemail has been received, the I/O components 38 may be controlled to play the voicemail to the user through a speaker. In another example, in response to the processor 32 identifying that the user's eye(s) are focused on an icon in the virtual image to return a missed call, the I/O components 38 may be controlled to initiate a call to the missed call phone number.
In
In the illustrated optical system 50, the proximal portion 56 includes a proximal beam splitter 64 that has faces generally parallel to XY, XZ, and YZ planes. In
The proximal beam splitter 64 of
As seen in
In one embodiment, the proximal beam splitter 64, the distal beam splitter 80, and the light pipe 82 are made of glass. Alternatively, some or all of such optical components may be made partially or entirely of plastic, which can also function to reduce the weight of the optical system 50. A suitable plastic material is Zeonex® E48R cyclo-olefin optical-grade polymer, which is available from Zeon Chemicals L.P., Louisville, Ky. Another suitable plastic material is polymethyl methacrylate (“PMMA”).
The distal portion 58 further includes a display panel 86 and a light source 88 optically coupled to the distal beam splitter 80. In the present example, the display panel 86 is generally vertically oriented and coupled to a right side of the distal beam splitter 80 and the light source 88 is coupled to a back side of the distal beam splitter.
The display panel 86 is configured to generate a light pattern from which the virtual image is formed. The display panel 86 may be an emissive display such as an Organic Light Emitting Diode (“OLED”) display. Alternatively, the display panel 86 may be a Liquid-Crystal on Silicon (“LCOS”) or a micro-mirror display such as a Digital Light Projector (“DLP”) that generates the light pattern by spatially modulating light from a light source, such as the light source 88. The light source 88 may include, for example, one or more light-emitting diodes (“LEDs”) and/or laser diodes. The light pattern generated by the display panel 86 can be monochromatic or may include multiple colors, such as red, green, and blue, to provide a color gamut for the virtual image.
In one example of the optical system 50 in use, the light source 88 emits light toward the distal beam-splitting interface 84, which reflects the light toward the display panel 86. The display panel 86 generates a light pattern by spatially modulating the incident light to provide spatially modulated light reflected toward the distal beam-splitting interface 84. The distal beam-splitting interface 84 transmits the spatially modulated light through the light pipe 82 and toward the proximal beam splitter 64. The proximal beam-splitting interface 70 transmits the spatially-modulated light so that it reaches the image former 68. The image former 68 reflects the spatially-modulated light back toward the proximal beam-splitting interface 70, which reflects the spatially-modulated light toward the viewing location 54 so that the virtual image is viewable along the viewing axis 60.
As a general matter, the reflection and/or transmission of light by and/or through the beam splitters 64, 80 or other optical components of the optical system 50 may refer to the reflection and/or transmission of substantially all of the light or of a portion of the light. Consequently, such terms and descriptions should be interpreted broadly in the present disclosure.
In some embodiments, the proximal and/or distal beam splitters 64, 80 may be polarizing beam splitters, such that the beam splitters preferentially transmit p-polarized light and preferentially reflect s-polarized light, for example. More particularly, in one embodiment, the proximal beam splitter 64 is a polarizing beam splitter that preferentially transmits p-polarized light and preferentially reflects s-polarized light. With this configuration, the external light that is viewable along the viewing axis 60 is generally p-polarized and the light that is viewable along the viewing axis as the virtual image is generally s-polarized. In the present example, the distal beam splitter 80 may be a non-polarizing beam splitter that transmits a portion of the incident light and reflects a portion of the incident light independent (or largely independent) of polarization. The light source 88 may provide s-polarized light that is partly reflected by the distal beam-splitting interface 84 toward the display panel 86. The display panel 86 spatially modulates the incident s-polarized light and also changes its polarization. Thus, in this example, the display panel 86 converts the incident s-polarized light into a spatially-modulated light pattern of p-polarized light. At least a portion of the p-polarized light is transmitted through the distal beam-splitting interface 84, through the light pipe 82, and through the polarizing proximal beam-splitting interface 70 to the image former 68.
In the present example, the image former 68 includes a reflector 90, such as a concave mirror or Fresnel reflector, and a quarter-wave plate 92. The p-polarized light passes through the quarter-wave plate 92 and is reflected by the reflector 90 back through the quarter-wave plate toward the proximal beam-splitting interface 70. After the light pattern interacts with the image former 68 in this way, the polarization is changed from p-polarization to s-polarization and the s-polarized, spatially-modulated light is reflected by the proximal beam-splitting interface 70 toward the viewing location 54 so that the virtual image is viewable along the viewing axis 60.
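As a rough numerical check of this double-pass polarization conversion, the sketch below applies textbook Jones matrices in a fixed lab frame (x standing for the s direction, y for the p direction). The conventions and the 45-degree fast-axis angle are illustrative assumptions, not parameters of the disclosed system.

```python
import numpy as np

S_POL = np.array([1.0, 0.0])   # x component: s-polarized light
P_POL = np.array([0.0, 1.0])   # y component: p-polarized light

def quarter_wave_plate(theta):
    """Jones matrix of a quarter-wave plate with its fast axis at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    retarder = np.array([[1.0, 0.0], [0.0, 1.0j]])   # pi/2 phase on the slow axis
    return rot @ retarder @ rot.T

# In this fixed-frame convention an ideal mirror at normal incidence only adds
# a global phase, so the double pass reduces to two passes through the plate.
mirror = -np.eye(2)
qwp = quarter_wave_plate(np.pi / 4)   # fast axis at 45 degrees

out = qwp @ mirror @ qwp @ P_POL
print(np.abs(out))   # ~[1, 0]: the p-polarized input returns s-polarized
```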
Referring back to
In an example of the optical system 50 in use, external light from the real world enters through the viewing window 66 and is reflected by the proximal beam-splitting interface 70, through the light pipe 82, and toward the distal beam splitter 80. The distal beam-splitting interface 84 reflects the incident external light to the optical sensor 94 to obtain an image of the real-world environment.
Similarly, light reflected from the user's eye 62 can be directed to the optical sensor 94 to obtain an image of the user's eye. In one example, light from the light source 88, the external light, and/or some other light source is reflected from the user's eye 62 toward the proximal beam splitter 64. The proximal beam-splitting interface 70 reflects the incident reflected light toward the image former 68, which reflects the light back through the proximal beam-splitting interface 70, through the light pipe 82, and toward the distal beam splitter 80. The distal beam-splitting interface 84 reflects the incident reflected light toward the optical sensor 94 to obtain an image of the user's eye 62.
In one example, the light reflected from the user's eye 62 is IR light generated by the light source 88 or some other light source coupled to the optical system 50. In this example, the optical sensor 94 can include an IR filter or otherwise be sensitive to IR light. Thus, the reflected light from the user's eye 62 received by the optical sensor 94 can be distinguished from other light that may be incident on the optical sensor, for example, based on wavelength, which in turn allows the optical system 50 to more accurately track the user's gaze. Alternatively or in conjunction, the light source 88 may emit light that is modulated at predetermined frequencies and/or intensities and reflected from the user's eye 62 to the optical sensor 94 so that reflected modulated light can be distinguished from other non-modulated light incident on the optical sensor.
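One way such modulation could be exploited in software is a lock-in style correlation over a short sequence of sensor frames, as sketched below. The function name, arguments, and the assumption that the sensor frame rate adequately samples the modulation frequency are illustrative, not details taken from the disclosure.

```python
import numpy as np

def demodulate_eye_reflection(frames, frame_rate_hz, mod_freq_hz):
    """Return a per-pixel amplitude image of light varying at mod_freq_hz.

    frames: sequence of grayscale frames, shape (n_frames, height, width).
    Unmodulated ambient light averages toward zero in the correlation, while
    light from the modulated source (reflected by the eye) survives.
    """
    stack = np.asarray(frames, dtype=float)
    t = np.arange(stack.shape[0]) / frame_rate_hz
    ref_i = np.cos(2 * np.pi * mod_freq_hz * t)   # in-phase reference
    ref_q = np.sin(2 * np.pi * mod_freq_hz * t)   # quadrature reference
    i_comp = np.tensordot(ref_i, stack, axes=1) / stack.shape[0]
    q_comp = np.tensordot(ref_q, stack, axes=1) / stack.shape[0]
    return np.hypot(i_comp, q_comp)
```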
Various modifications can be made to the optical system 50 of
Referring now to
Following the block 102, control passes to a block 104, which generates a light pattern from which a virtual image can be formed. Referring to
At a block 108, external light that represents the real-world environment is received, such as by the optical sensor 94 described above and, at a block 110, the received external light can be processed to obtain an image or other representation of the real world. The representation can include still images and/or video. Similarly, at blocks 112 and 114, light reflected from the object 62, such as the user's eye, at the viewing location 54 can be received by the optical sensor 94 and the received reflected light can be processed to obtain an image or other representation of the user's eye.
At a block 116, the image of the user's eye is processed with the image of the real world and/or the virtual image to track the user's gaze. In one example, the image of the user's eye is processed to extract one or more features of the eye, such as an eye pupil centroid, an eye pupil diameter, an eye pupil periphery, etc. A gaze direction can be determined from the extracted feature(s) and the gaze direction can be correlated to the real-world environment and/or the virtual image.
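A plausible implementation of this feature extraction, using OpenCV, is sketched below. The dark-pupil threshold value and the calibration.map interface are assumptions made for illustration; an actual system would calibrate the mapping from pupil position to gaze direction for the specific optics and user.

```python
import cv2
import numpy as np

def estimate_gaze(eye_image, calibration, pupil_threshold=40):
    """Extract the pupil centroid and diameter, then map them to a gaze point."""
    gray = cv2.cvtColor(eye_image, cv2.COLOR_BGR2GRAY)
    # Under IR illumination the pupil is typically the darkest region.
    _, mask = cv2.threshold(gray, pupil_threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)    # pupil periphery
    moments = cv2.moments(pupil)
    if moments["m00"] == 0:
        return None
    cx = moments["m10"] / moments["m00"]          # pupil centroid (pixels)
    cy = moments["m01"] / moments["m00"]
    diameter = 2.0 * np.sqrt(cv2.contourArea(pupil) / np.pi)
    # Correlate the centroid with the real-world view and/or the virtual image
    # using a previously fitted calibration (hypothetical interface).
    gaze_x, gaze_y = calibration.map(cx, cy)
    return {"centroid": (cx, cy), "diameter": diameter, "gaze": (gaze_x, gaze_y)}
```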
Thereafter, control passes to a block 118, and the display system can be controlled in response to the gaze tracking. In one example, the block 118 may control the display system by updating or modifying the virtual image to relate to a feature in the real-world view or to some indicia in the virtual image. In another example, the block 118 may control other components of the display system in response to the gaze tracking, such as by playing an audio message through a speaker or initiating a phone call.
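Taken together, the blocks above describe a repeating pipeline. A minimal sketch of that loop follows; every helper object and method name is hypothetical and merely stands in for the hardware and processing steps described above.

```python
def run_display_loop(panel, sensor, tracker, controller):
    """Hypothetical loop corresponding to flowchart 100 (blocks 102-118)."""
    while controller.running():
        pattern = controller.next_light_pattern()      # block 104: generate light pattern
        panel.display(pattern)                         # virtual image formed by the optics
        world_image = sensor.capture_external()        # blocks 108-110: real-world image
        eye_image = sensor.capture_reflected()         # blocks 112-114: image of the eye
        gaze = tracker.track(eye_image, world_image)   # block 116: gaze tracking
        controller.apply(gaze)                         # block 118: control display and other components
```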
Various modifications can be made to the flowchart 100 of
Referring now to
The right-side display device 132 may be attached to the frame 146 by a mount 156 and the left-side display device 134 may be attached to the frame 148 by a mount 158. The mounts 156, 158 position the display devices 132, 134 so that their respective viewing axes 160, 162 are generally aligned with the user's right eye 136 and left eye 138, respectively. Thus, as shown in
Although
As noted above, the HMD 130 may function as a wearable computing device. In this regard, the HMD may include a processor 170, which can be located inside of or attached to part of the head-mountable support 140. For example, the processor 170 can be located inside of the side-piece 152, as shown in
In one embodiment, the processor 170 is configured to control display panels in the display devices 132, 134 in order to control the virtual images that are generated and displayed to the user. Further, the processor 170 is configured to control optical sensors and to receive images or video captured by the optical sensors. The processor 170 may be communicatively coupled to the display devices 132, 134 by wires inside of the head-mountable support 140, for example. Alternatively, the processor 170 may communicate with the display devices 132, 134 through external wires or through a wireless connection.
The HMD 130 may also include other components that are operatively coupled to the processor 170 to provide desired functionality. For example, the HMD 130 may include one or more touchpads, microphones, and sensors, which are exemplified in
The processor 170 may control the content of the virtual images generated by the display devices 132, 134 in response to various inputs. Such inputs may come from the touchpad 172, the microphone 174, the sensor 176, and/or wired or wireless communication interfaces of the HMD 130. The processor 170 may also control the content of the virtual images in response to gaze tracking, as described generally above. In this way, the processor 170 may control the content of the virtual images so that it is appropriate for the user's current surroundings and/or tasks in which the user is involved.
The above detailed description describes various features and functions of the disclosed systems, devices, and methods with reference to the accompanying FIGS. In the FIGS., similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, FIGS., and claims are not meant to be limiting. Other embodiments can be utilized, and other changes can be made, without departing from the spirit or scope of the subject matter presented herein. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the FIGS., can be arranged, substituted, combined, separated, and designed in a wide variety of different configurations, all of which are explicitly contemplated herein.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.