This disclosure relates generally to head mounted displays, and in particular but not exclusively, relates to binocular head mounted displays.
A head mounted display (“HMD”) is a display device worn on or about the head. HMDs usually incorporate some sort of near-to-eye optical system to display an image within a few centimeters of the human eye. Single-eye displays are referred to as monocular HMDs, while dual-eye displays are referred to as binocular HMDs. Some HMDs display only a computer generated image (“CGI”), while other types of HMDs are capable of superimposing CGI over a real-world view. This latter type of HMD is often referred to as augmented reality because the viewer's image of the world is augmented with an overlaid CGI, and is also referred to as a heads-up display (“HUD”).
HMDs have numerous practical and leisure applications. Aerospace applications permit a pilot to see vital flight control information without taking their eyes off the flight path. Public safety applications include tactical displays of maps and thermal imaging. Other application fields include video games, transportation, and telecommunications. Due to the infancy of this technology, new practical and leisure applications are certain to be found as the technology evolves; however, many of these applications are currently limited by the cost, size, field of view, and efficiency of the conventional optical systems used to implement existing HMDs, as well as other technological hurdles that have yet to be adequately solved before HMDs will achieve widespread adoption in the marketplace.
Non-limiting and non-exhaustive embodiments of the invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
Embodiments of an apparatus and method of sensing head mounted display (“HMD”) deformation are described herein. In the following description numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
One technological hurdle to overcome to further encourage marketplace adoption of HMD technology is identifying and compensating for binocular HMD deformation. Deformation of a binocular HMD can lead to deleterious misalignment between the left and right image displays of the binocular HMD. These misalignments can result in a blurred or otherwise compromised image as perceived by the user, which ultimately leads to a poor user experience (disorientation, dizziness, etc.). Deformation can occur due to a variety of reasons including misuse, poor user fit, nonsymmetrical facial features, harsh environmental factors (e.g., thermal warping), or otherwise.
For example, if a binocular HMD is too narrow for a given user's head, the user's head will exert outward forces on each of the ear arms of the binocular HMD, causing the ear arms to spread and thereby flexing the frontal display section about the nose bridge. To a lesser extent, the opposite effect can occur if the user's head is too narrow: the ears apply an inward compressing force to the ear arms. Additionally, if the user's ears are not symmetrical (i.e., one ear is higher than the other), a torsion force can be applied to the ear arms, causing the left and right sides of the binocular HMD to twist about the nose bridge. Both of these rotational deformations can result in misalignment between the right and left displays of a binocular HMD.
Displays 105 may be implemented using a variety of different binocular display technologies. For example, displays 105 may be implemented as semi-transparent optical elements or opaque optical elements. The semi-transparent optical element embodiments permit external light to pass through to the eyes of a user to provide a real-world view, but also display a superimposed computer generated image (“CGI”) over the real-world view. This type of display technology is often referred to as a heads-up display (“HUD”) or augmented reality. The semi-transparent optical element embodiments can further be divided into emissive embodiments (e.g., a sheet of transparent organic light emitting diodes (“OLEDs”)) or reflective embodiments.
As discussed above, binocular HMD 100 may deform during use or over time due to a number of environmental factors or use scenarios. As illustrated in
Flex sensor 115 is disposed in or on the frame to monitor and measure the degree of deformation about the rotational flex axis. In one embodiment, flex sensor 115 is disposed along the top ridge of frontal display section 132 and straddles nose bridge 145. In one embodiment, flex sensor 115 operates by measuring tensile or compressive forces/stresses as the frame is deformed. In one embodiment, flex sensor 115 is coupled to deformation controller 125, which monitors an output signal from flex sensor 115 that is indicative of the magnitude of the deformation. For example, flex sensor 115 may be implemented with a plurality of piezoelectric crystals coupled in series along a strip. In this embodiment, deformation controller 125 may bias flex sensor 115 with a constant current and continuously measure the voltage drop across flex sensor 115. As the frame is deformed or flexed about the rotational flex axis, tensile or compressive forces are exerted on flex sensor 115, causing a resultant voltage change. The magnitude and sign of this voltage change are indicative of the magnitude and direction of the flexing.
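By way of illustration only, the following sketch shows one way such a voltage reading could be converted into a signed flex estimate. The calibration constants and function names are assumptions introduced here for illustration and are not taken from this disclosure.

```python
# Minimal sketch, not this disclosure's implementation: converting the voltage
# measured across a constant-current-biased piezoelectric strip into a signed
# flex estimate. REFERENCE_VOLTAGE_V and VOLTS_PER_DEGREE are assumed
# calibration values.

REFERENCE_VOLTAGE_V = 1.20   # voltage across the strip when the frame is undeformed
VOLTS_PER_DEGREE = 0.015     # assumed sensitivity: voltage change per degree of flex


def estimate_flex_degrees(measured_voltage_v: float) -> float:
    """Signed flex about the rotational flex axis.

    The sign of the voltage change indicates the direction of flexing
    (tensile vs. compressive strain on the strip); its magnitude indicates
    how far the frame has flexed.
    """
    return (measured_voltage_v - REFERENCE_VOLTAGE_V) / VOLTS_PER_DEGREE


if __name__ == "__main__":
    print(f"Estimated flex: {estimate_flex_degrees(1.26):+.2f} degrees")  # prints +4.00 degrees
```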
It should be appreciated that flex sensor 115 may be positioned at other locations in or on the frame. For example, a flex sensor may straddle the bottom side of nose bridge 145 and run along a portion of lower display supports 140. Alternatively (or additionally), flex sensors may be incorporated into each ear arm 130. All of these locations (as well as other possible locations) hold potential for measuring flex about the rotational flex axis with varying degrees of sensitivity. In embodiments that use multiple flex sensors, each flex sensor may be coupled to a distinct deformation controller 125, or all flex sensors may be coupled to a single centralized deformation controller 125, as illustrated.
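As a rough illustration of the multi-sensor case, the sketch below fuses readings from several hypothetical flex-sensor locations into a single estimate at a centralized controller. The location names, weights, and data structure are assumptions for illustration, not part of this disclosure.

```python
# Minimal sketch, under assumed names: a centralized deformation controller
# fusing several flex-sensor readings (nose bridge, lower supports, ear arms)
# into one estimate by weighting each sensor by an assumed relative sensitivity.
from dataclasses import dataclass
from typing import List


@dataclass
class FlexReading:
    location: str          # e.g. "nose_bridge", "left_ear_arm"
    flex_degrees: float    # signed flex estimate from that sensor
    sensitivity: float     # relative weight reflecting that location's sensitivity


def fuse_flex_readings(readings: List[FlexReading]) -> float:
    """Weighted average of per-sensor flex estimates about the rotational flex axis."""
    total_weight = sum(r.sensitivity for r in readings)
    return sum(r.flex_degrees * r.sensitivity for r in readings) / total_weight


if __name__ == "__main__":
    readings = [
        FlexReading("nose_bridge", +0.8, 1.0),
        FlexReading("left_ear_arm", +0.6, 0.5),
        FlexReading("right_ear_arm", +0.7, 0.5),
    ]
    print(f"Fused flex estimate: {fuse_flex_readings(readings):+.2f} degrees")
```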
As discussed above, binocular HMD 100 may also deform about a rotational torsion axis (or longitudinal axis) that passes through nose bridge 145. Referring to
Torsion sensors 120 are disposed in or on the frame to monitor and measure the degree of deformation about the rotational torsion axis. In the illustrated embodiment, torsion sensors 120 are disposed at opposite sides of frontal display section 132. Torsion sensors 120 operate by measuring the gravity vector at these two locations. In the illustrated embodiment, torsion sensors 120 are coupled to deformation controller 125, which monitors the output signals from each torsion sensor 120, compares the gravity vectors measured by the two torsion sensors 120, and based upon the difference between the two measured gravity vectors determines the direction and magnitude of torsional deformation about the rotational torsion axis. Since nose bridge 145 is typically the weakest link in frontal display section 132, the rotational torsion axis will typically pass longitudinally (or horizontally) through nose bridge 145. Torsion sensors 120 may each be implemented using accelerometers to measure the gravity vector. In one embodiment, the accelerometers are three-dimensional accelerometers. In one embodiment, the accelerometers are implemented using microelectromechanical systems (“MEMS”) disposed in or on the frame. As the frame is deformed or twisted about the rotational torsion axis, the left and right sides of the frame will pivot in opposite directions relative to each other. As the two sides pivot in opposite directions, torsion sensors 120 will begin to measure different gravity vectors. The difference between these gravity vectors can be used to determine the magnitude and direction of the relative twisting between the right and left displays 105. Since the accelerometers measure the acceleration due to gravity, these sensors (which typically measure dynamic motions) can be used to measure static deformations in the HMD frame.
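One way to turn the two measured gravity vectors into a signed twist estimate is sketched below. The axis convention, constants, and function names are assumptions for illustration rather than this disclosure's implementation.

```python
# Minimal sketch (assumed geometry): estimating the relative twist between the
# left and right sides of the frame from two 3-axis accelerometer readings.
# Both gravity vectors are projected onto the plane perpendicular to the
# rotational torsion axis (taken here as the frame's x axis through the nose
# bridge); the signed angle between the projections approximates the twist.
import numpy as np

TORSION_AXIS = np.array([1.0, 0.0, 0.0])  # assumed longitudinal axis through the nose bridge


def signed_twist_degrees(g_left: np.ndarray, g_right: np.ndarray) -> float:
    """Signed angle (degrees) between the two measured gravity vectors about
    the torsion axis; zero means the two sides of the frame agree."""
    def project(v: np.ndarray) -> np.ndarray:
        v = v - np.dot(v, TORSION_AXIS) * TORSION_AXIS   # remove component along the axis
        return v / np.linalg.norm(v)

    pl, pr = project(g_left), project(g_right)
    cos_a = np.clip(np.dot(pl, pr), -1.0, 1.0)
    sign = np.sign(np.dot(np.cross(pl, pr), TORSION_AXIS))
    return float(np.degrees(np.arccos(cos_a)) * (sign if sign != 0 else 1.0))


if __name__ == "__main__":
    # Both sensors would see gravity along -z if the frame were untwisted.
    g_l = np.array([0.0, 0.1, -9.8])    # left side pivoted slightly one way
    g_r = np.array([0.0, -0.1, -9.8])   # right side pivoted the opposite way
    print(f"Relative twist: {signed_twist_degrees(g_l, g_r):+.2f} degrees")
```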
It should be appreciated that torsion sensors 120 may be positioned at other locations in or on the frame than illustrated. For example, torsion sensors 120 may be positioned at the upper outer corners of frontal display section 132, disposed in ear arms 130 near frontal display section 132, disposed anywhere along lower display supports 140, or otherwise. However, locations that increase the lever arm from the central pivot point of the torsional rotation may serve to increase the sensitivity of the sensors. Although the
Rotational deformations of binocular HMD 100 may be reduced via appropriate design of the frame and selection of materials. By selecting stiffer materials and bulking up the frame strength about nose bridge 145, rotational deformations that cause right display 105A to become misaligned relative to left display 105B can be reduced. For example, binocular HMD 100 may be fabricated of plastics molded around the sensor and control systems, hollow metal frame members in which or on which the sensor and control systems are disposed, or otherwise. However, stiffer materials and/or bulkier frame designs may be heavy, uncomfortable, or aesthetically/functionally displeasing. Thus, the deformation sensor and controller systems disclosed herein provide active monitoring of frame deformation that can be used to generate fault signals that shut down displays 105 to prevent user disorientation upon reaching a threshold degree of deformation, issue a user warning and instructions to straighten or otherwise recalibrate the frame, or even provide active feedback to image controller 210 for real-time image compensation to counteract the effects of the mechanical deformation.
In a process block 405, flex sensor 115 is activated. In an embodiment where flex sensor 115 is implemented using piezoelectric crystals, activating flex sensor 115 may include applying a bias current across the sensor strip. In a process block 410, the output from flex sensor 115 is monitored in real-time. In one embodiment, the monitored output is the voltage across flex sensor 115. In a process block 415, the monitored output is compared to a reference value. The comparison may be executed by deformation controller 125, and the reference value may be a measured output from flex sensor 115 when the frame was known to be in an aligned or non-deformed state. The magnitude and sign of the difference between the monitored output value and the reference value may then be used to determine the direction and magnitude of the frame deformation about the rotational flex axis.
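A minimal sketch of this monitoring loop follows, assuming a hypothetical read_flex_voltage() hardware accessor; it is illustrative only and not the disclosed controller's code.

```python
# Minimal sketch of process blocks 405-415 under assumed interfaces: the
# deformation controller reads the flex-sensor output voltage and compares it
# against a reference captured while the frame was known to be undeformed.
import time


def monitor_flex(read_flex_voltage, reference_voltage_v: float, period_s: float = 0.05):
    """Yield signed deviation values (volts) relative to the aligned state.

    Positive deviations correspond to flex in one direction about the
    rotational flex axis, negative deviations to flex in the other.
    """
    while True:                                                    # block 410: real-time monitoring
        deviation = read_flex_voltage() - reference_voltage_v      # block 415: compare to reference
        yield deviation
        time.sleep(period_s)


if __name__ == "__main__":
    import itertools
    import random

    fake_sensor = lambda: 1.20 + random.uniform(-0.02, 0.02)       # stand-in for the real sensor
    for deviation in itertools.islice(monitor_flex(fake_sensor, 1.20, period_s=0.0), 5):
        print(f"deviation: {deviation:+.3f} V")
```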
If binocular HMD 100 is configured to perform real-time active correction (decision block 420), then process 400 continues to a process block 425. In process block 425, the magnitude and/or sign of the difference value is used to apply active image correction to the CGIs displayed by displays 105. In this manner, image correction can be used to overcome mechanical frame deformations and bring the right and left CGIs back into alignment despite the continued presence of the physical deformations. In one embodiment, image controller 210 performs the necessary image adjustments, which may include horizontal or vertical translations, keystoning, etc.
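As an illustration only, the sketch below maps a measured deformation onto a per-eye corrective warp combining translation and keystone terms. The calibration constants and the linear mapping are assumptions introduced here, not the disclosed image controller's method.

```python
# Minimal sketch of process block 425 under assumed calibration constants:
# turning a measured deformation into a per-eye corrective homography. The
# mapping from degrees of flex/twist to pixels of translation and keystone is
# a hypothetical linear calibration.
import numpy as np

PIXELS_PER_DEGREE_FLEX = 12.0       # assumed: horizontal shift per degree of flex
PIXELS_PER_DEGREE_TWIST = 6.0       # assumed: vertical shift per degree of twist
KEYSTONE_PER_DEGREE_TWIST = 0.002   # assumed: projective keystone per degree of twist


def correction_homography(flex_deg: float, twist_deg: float, right_eye: bool) -> np.ndarray:
    """3x3 homography applied to one eye's CGI; the other eye gets the mirror correction."""
    sign = 1.0 if right_eye else -1.0
    tx = sign * flex_deg * PIXELS_PER_DEGREE_FLEX        # horizontal translation
    ty = sign * twist_deg * PIXELS_PER_DEGREE_TWIST      # vertical translation
    k = sign * twist_deg * KEYSTONE_PER_DEGREE_TWIST     # keystone (projective) term
    return np.array([
        [1.0, 0.0, tx],
        [0.0, 1.0, ty],
        [k,   0.0, 1.0],
    ])


if __name__ == "__main__":
    H_right = correction_homography(flex_deg=0.8, twist_deg=1.2, right_eye=True)
    print(np.round(H_right, 4))
```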
If the frame deformation becomes too great for active image correction, or if the particular embodiment of binocular HMD 100 does not support active image correction, then it may be determined whether the deformation exceeds a threshold amount. Thus, in decision block 430, it is determined whether the difference value exceeds a threshold value. If not, process 400 returns to process block 410 for continued monitoring. If so, process 400 continues to a process block 435. In process block 435, a fault signal is issued to warn the user. The fault signal may disable displays 105 or even display a warning message to the user on displays 105. In one embodiment, the warning message may include an indication of how to correct the deformation. As the user realigns the frame by applying counter forces to correct the frame deformation, displays 105 may display an alignment indicator to guide the user in real-time, and the system may eventually return to regular operation once the deformation is reduced below a threshold value. Other types of warning or error signals may also be issued and/or displayed, such as an audible warning.
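The threshold and fault logic of decision block 430 and process block 435 might look like the following sketch; the limits, the hysteresis between fault and recovery, and the callback names are assumptions for illustration.

```python
# Minimal sketch of decision block 430 and process block 435, using assumed
# threshold values and callback names. issue_fault() stands in for whatever
# the deformation controller does to disable displays 105 or show a warning.
CORRECTABLE_LIMIT_DEG = 1.5    # assumed: beyond this, a fault is issued
RECOVERY_LIMIT_DEG = 1.0       # assumed: fault clears once deformation drops below this


def evaluate_deformation(deformation_deg: float, fault_active: bool,
                         issue_fault, clear_fault) -> bool:
    """Return the new fault state given the current deformation magnitude."""
    magnitude = abs(deformation_deg)
    if not fault_active and magnitude > CORRECTABLE_LIMIT_DEG:
        issue_fault(magnitude)       # block 435: warn the user and/or disable displays
        return True
    if fault_active and magnitude < RECOVERY_LIMIT_DEG:
        clear_fault()                # resume regular operation once the frame is realigned
        return False
    return fault_active


if __name__ == "__main__":
    state = False
    for reading in (0.4, 1.8, 1.6, 0.9, 0.3):
        state = evaluate_deformation(reading, state,
                                     issue_fault=lambda m: print(f"FAULT: {m:.1f} deg"),
                                     clear_fault=lambda: print("fault cleared"))
```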
Simultaneously with the operation of flex sensor 115, torsion sensors 120 may also monitor the deformation of binocular HMD 100 about the rotational torsion axis. Thus, in a process block 440, torsion sensors 120 are activated. In a process block 445, deformation controller 125 receives and monitors the outputs from torsion sensors 120. In one embodiment, the outputs may be a series of voltages that are indicative of the three-dimensional gravity vectors measured by each torsion sensor 120. In a process block 450, deformation controller 125 compares the measured gravity vectors (e.g., subtracts one gravity vector from the other). Based upon the comparison, if active correction is enabled (decision block 455), then active image correction may be applied in process block 425 to compensate for deformation about the rotational torsion axis and bring the right and left CGIs back into alignment. If active correction is not enabled and/or the deformation difference exceeds a threshold value (decision block 460), then the fault signal may be issued, as described above, in process block 435. Otherwise, process 400 returns to process block 445 for continued real-time deformation tracking about the rotational torsion axis.
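For completeness, one pass through the torsion branch of process 400 could be structured as sketched below; all helper names and the threshold value are assumptions, and the stand-in comparison in the example simply returns a fixed value.

```python
# Minimal sketch of the torsion branch of process 400 (blocks 440-460) under
# assumed helper names: read_gravity_left/right() stand in for the torsion
# sensor outputs, signed_twist_degrees() for the gravity-vector comparison,
# and apply_image_correction()/issue_fault() for blocks 425 and 435.
TORSION_FAULT_LIMIT_DEG = 2.0   # assumed threshold for decision block 460


def torsion_step(read_gravity_left, read_gravity_right, signed_twist_degrees,
                 active_correction_enabled: bool, apply_image_correction, issue_fault):
    """One pass of the torsion monitoring loop; returns the measured twist."""
    twist = signed_twist_degrees(read_gravity_left(), read_gravity_right())  # blocks 445-450
    if active_correction_enabled:                                            # decision block 455
        apply_image_correction(twist)                                        # process block 425
    elif abs(twist) > TORSION_FAULT_LIMIT_DEG:                               # decision block 460
        issue_fault(twist)                                                   # process block 435
    return twist


if __name__ == "__main__":
    torsion_step(
        read_gravity_left=lambda: (0.0, 0.1, -9.8),
        read_gravity_right=lambda: (0.0, -0.1, -9.8),
        signed_twist_degrees=lambda gl, gr: 1.2,                    # stand-in comparison
        active_correction_enabled=True,
        apply_image_correction=lambda t: print(f"correcting {t:+.1f} deg"),
        issue_fault=lambda t: print(f"FAULT at {t:+.1f} deg"),
    )
```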
The processes explained above are described in terms of computer software and hardware. The techniques described may constitute machine-executable instructions embodied within a non-transitory machine (e.g., computer) readable storage medium that, when executed by a machine, will cause the machine to perform the operations described. Additionally, the processes may be embodied within hardware, such as an application specific integrated circuit (“ASIC”) or the like.
A machine-readable storage medium includes any mechanism that provides (i.e., stores) information in a form accessible by a machine (e.g., a computer, a network device, a personal digital assistant, any device with a set of one or more processors, etc.). For example, a machine-readable storage medium includes recordable/non-recordable media (e.g., read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, etc.).
The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.