Augmented reality (AR) overlays virtual image objects/information onto real-world image objects/information and displays the combined AR image in a user's field of view (FOV). Categories of AR technology include Projection AR (the virtual information is projected onto the real objects in the user's environment using external projectors); Handheld AR (the virtual information is displayed on a handheld device—e.g., mobile phones, tablets, etc.); and Head-Mounted Display (HMD) AR (the virtual information is displayed on an optical head-mounted display—e.g., smart glasses).
A conventional AR HMD displays stereoscopic three-dimensional (3D) objects. Typically, the conventional techniques provide each eye with images from slightly different angles. Television screens and motion picture theaters provide 3D images where the light source is far from the eye, and objects appear in focus. However, in an HMD the display is near the eye, which requires additional optical components to focus objects that are virtually far away.
Because conventional 3D augmented reality is generated near the eye, additional optical components are also needed to properly focus the image(s). However, AR introduces an additional complexity: the focus of the real-world image objects needs to be preserved. Conventional solutions to this problem include waveguide optics and bird-bath optical combiners. Both of these conventional designs suffer from the dual, competing problems of a narrow FOV and a heavy device that includes a display projector and a display screen.
Embodying systems include a head-mounted, augmented reality device that combines a lens system with a transparent display screen to provide a wide field of view in a lightweight package. An embodying combination of lens and display screen results in a device that presents an AR image with a periodic component in either the spatial domain or the time domain.
Control processor 110 can include processor unit 112 and memory unit 114. The control processor can be in direct communication with data store 120. In some implementations, the control processor can be in indirect communication with the data store across an electronic communication network if the data store is located remotely from the head-mounted unit. Memory unit 114 can provide the control processor with local cache memory.
Processor unit 112 can execute executable instructions 122, which can cause the processor to access virtual image records 124. The control processor controls pixels in the transparent display to generate the virtual image, and provides control signals to the lens system. The virtual image can be combined with a real-world image viewed through transparent display screen 106 to create an AR image.
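By way of a non-limiting illustration only, the following Python sketch shows how such a control flow could be organized; the class and member names (ControlProcessor, get_record, set_pixels, apply_focus, active_pixels, focus_settings) are hypothetical and are not part of the embodying hardware.

```python
# Hypothetical sketch only: illustrates the control flow of a processor unit
# driving a transparent display and a lens system from a selected virtual image record.
class ControlProcessor:
    def __init__(self, display, lens_system, data_store):
        self.display = display        # transparent display screen (e.g., 106/206)
        self.lens = lens_system       # micro-lens or focus-tunable lens (e.g., 208)
        self.data_store = data_store  # data store 120 holding virtual image records 124

    def present_frame(self, record_id):
        # Retrieve the selected virtual image record from the data store.
        record = self.data_store.get_record(record_id)
        # Activate only the pixels needed for the virtual image; all other
        # pixels remain transparent so real-world image light passes through.
        self.display.set_pixels(record.active_pixels)
        # Signal the lens system so the near-eye virtual image is focused
        # at the intended (virtual) distance.
        self.lens.apply_focus(record.focus_settings)
```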
The Statue of Liberty depiction represents real world object 210. Light reflected off the real world object creates an image, which enters the AR HMD unit through transparent display 206. Lens system 208 acts on the real world image to provide an augmented reality view to the user's eye(s).
In some implementations, lens system 208 can be implemented as a micro-lens positioned between transparent display 206 and a user's eye(s).
The white box represents a transparent pixel, which allows the real object image's light to pass through to a user's eye(s) unaffected. The black box represents a pixel that is activated to present a portion of selected virtual image 124A. The spatial location of the checkerboard pattern varies with the selected virtual image record. Pixels of the selected virtual image are spatially interspersed with transparent pixels (i.e., non-activated pixels) so that a portion of the real object image light and the selected virtual image reach a user's eye(s).
The micro-lens focuses the virtual image at a distance/infinity. The end result of the combination is an interweaving of virtual image pixels and real-world image light passing through the transparent pixels. Because each pixel is smaller than what the eye can resolve, the images are fused when perceived by the eye.
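By way of a non-limiting illustration only, the spatial interleaving can be sketched as follows in Python/NumPy; the function and its conventions (a virtual image supplied as a pixel array, with zero-valued pixels treated as transparent, non-activated pixels) are hypothetical assumptions, not a description of the embodying display hardware.

```python
# Hypothetical sketch only: builds a checkerboard frame in which virtual-image
# pixels are spatially interspersed with transparent (non-activated) pixels.
import numpy as np

def interleave_virtual_image(virtual_image: np.ndarray) -> np.ndarray:
    """Return a display frame where activated pixels carrying the virtual image
    alternate with transparent pixels in a checkerboard pattern."""
    h, w = virtual_image.shape[:2]
    frame = np.zeros_like(virtual_image)     # 0 = transparent (non-activated) pixel
    rows, cols = np.indices((h, w))
    active = (rows + cols) % 2 == 0          # checkerboard of activated pixel positions
    frame[active] = virtual_image[active]    # activated pixels present the virtual image
    return frame
```

Applied to a full-resolution virtual image, such a function would leave every other pixel non-activated, so real-world image light and virtual-image pixels both reach the user's eye(s), consistent with the checkerboard pattern described above.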
In other implementations, lens system 208 can be implemented as a focus-tunable lens. In this implementation, the periodicity of the display is not in a spatial position domain, but in a time-based domain. In accordance with this implementation, the focus-tunable lens is toggled on/off at a high frequency. The frequency of modulation is a frame rate faster than the frame rate at which a user's eye(s) perceives light—e.g., at a rate of about 30 frames/second or greater.
When the system is off, the real object image light passes unimpaired through the transparent display. Similarly, when the focus-tunable lens is toggled off, it acts as a window. When the system is on, the transparent display blocks the real-world image light, and instead each pixel of the transparent display is activated to generate a selected virtual image. The focus-tunable lens is toggled on to focus the virtual image. Because the system is toggled on/off at a modulation frequency higher than can be perceived by the user's eye(s), the real-world (unimpaired) image and the selected virtual image are interwoven in time. The result of this time-dependent interweaving is that the user's brain perceives the two images as a single image, thereby creating an augmented reality visualization.
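By way of a non-limiting illustration only, the time-multiplexed operation can be sketched as follows in Python; the display and lens objects, their attributes and methods, the stop callback, and the 120 Hz toggle rate are hypothetical stand-ins assumed for the example, not an actual device interface.

```python
# Hypothetical sketch only: alternates between a transparent (real-world) state
# and a virtual-image state faster than the eye can perceive, so the two images fuse.
import time

DISPLAY_RATE_HZ = 120                  # assumed toggle rate; each image repeats at 60 Hz,
FRAME_PERIOD = 1.0 / DISPLAY_RATE_HZ   # well above the ~30 frames/second perceived by the eye

def run_time_multiplexed(display, tunable_lens, virtual_image, stop):
    show_virtual = False
    while not stop():
        if show_virtual:
            display.show(virtual_image)          # every pixel driven with the virtual image
            tunable_lens.focus_enabled = True    # lens focuses the near-eye virtual image
        else:
            display.clear()                      # display transparent: real light passes unimpaired
            tunable_lens.focus_enabled = False   # lens acts as a plain window
        show_virtual = not show_virtual          # interweave the two states in time
        time.sleep(FRAME_PERIOD)
```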
In either implementation (transparent display with micro-lens, or transparent display with focus-tunable lens), the resulting visual effect on a user's eye is to simultaneously see the two images (real image 210 and a selected virtual image record 124A).
Embodying AR HMD systems improve over conventional approaches (that use waveguide optics) by implementing a streamlined unit that combines a transparent display screen with a lens element to create an augmented reality image using periodically alterable features of the combination. Embodying AR HMD systems create an augmented reality image with a field of view greater than conventional approaches, while being scalable to allow for larger visual coverage.
Although specific hardware and methods have been described herein, note that any number of other configurations may be provided in accordance with embodiments of the invention. Thus, while there have been shown, described, and pointed out fundamental novel features of the invention, it will be understood that various omissions, substitutions, and changes in the form and details of the illustrated embodiments, and in their operation, may be made by those skilled in the art without departing from the spirit and scope of the invention. Substitutions of elements from one embodiment to another are also fully intended and contemplated. The invention is defined solely with regard to the claims appended hereto, and equivalents of the recitations therein.