The present disclosure relates to virtual reality and augmented reality imaging and visualization systems.
Modern computing and display technologies have facilitated the development of systems for so-called “virtual reality” or “augmented reality” experiences, wherein digitally reproduced images or portions thereof are presented to a user in a manner wherein they seem to be, or may be perceived as, real. A virtual reality, or “VR”, scenario typically involves presentation of digital or virtual image information without transparency to other actual real-world visual input; an augmented reality, or “AR”, scenario typically involves presentation of digital or virtual image information as an augmentation to visualization of the actual world around the user. For example, referring to
An innovative aspect of the subject matter disclosed herein is implemented in an optical system comprising an image projection system, a waveguide, and a control system. The image projection system is configured to emit a coherent beam of light at a plurality of wavelengths in the visible spectral range. The waveguide comprises a first edge, a second edge and a pair of reflective surfaces disposed between the first and the second edges. The pair of reflective surfaces is separated by a gap having a gap height d. The waveguide comprises a material having a refractive index n. The pair of reflective surfaces has a reflectivity r. The beam emitted from the image projection system is coupled into the waveguide at an input angle θ. The input light can be coupled through one of the first or the second edge or through one of the reflective surfaces. The control system is configured to vary at least one parameter selected from the group consisting of: a wavelength from the plurality of wavelengths, the gap height d, the refractive index n and the reflectivity r. The variation of the at least one parameter is correlated with variation in the input angle θ.
In various embodiments of the optical system, the image projection system can be configured to vary the input angle θ of the emitted beam at a scan rate. The control system can be configured to modulate the at least one parameter at a modulation rate substantially equal to the scan rate. The control system can be configured to modulate the at least one parameter, the modulation rate configured such that the equation 2nd cos θ=mλ is satisfied for all values of the input angle θ, wherein m is an integer and λ is the wavelength of the beam. In various embodiments, the at least one parameter can be a wavelength from the plurality of wavelengths. In some embodiments, the at least one parameter can be the gap height d. In various embodiments, the at least one parameter can be the refractive index n. In some embodiments, the at least one parameter can be the reflectivity r. In various embodiments, the image projection system can comprise a fiber. In various embodiments, the emitted beam can be collimated. The plurality of wavelengths can comprise wavelengths in the red, green and blue spectral regions. The waveguide can comprise an acousto-optic material, a piezo-electric material, an electro-optic material or a micro-electro mechanical system (MEMS). The waveguide can be configured as an exit pupil expander that expands and multiplies the emitted beam. The waveguide can be configured to expand the beam to a spot size greater than 1 mm. Various embodiments of the optical system discussed herein can be integrated in an augmented reality (AR) device, a virtual reality (VR) device, a near-to-eye display device, or eyewear comprising at least one of a frame, one or more lenses, or ear stems.
An innovative aspect of the subject matter disclosed herein is implemented in an optical system comprising an image projection system, a plurality of stacked waveguides, and a control system. The image projection system is configured to emit a coherent beam of light at a plurality of wavelengths in the visible spectral range. Each waveguide of the plurality of stacked waveguides comprises a first edge, a second edge and a pair of reflective surfaces disposed between the first and the second edges. The pair of reflective surfaces is separated by a gap having a gap height d. Each waveguide comprises a material having a refractive index n. The pair of reflective surfaces has a reflectivity r. The control system is configured to vary at least one parameter selected from the group consisting of: a wavelength from the plurality of wavelengths, the gap height d, the refractive index n and the reflectivity r. The beam emitted from the image projection system is coupled into a waveguide at an input angle θ. The input light can be coupled through one of the first or the second edge or through one of the reflective surfaces. The variation of the at least one parameter is correlated with variation in the input angle θ.
In various embodiments, each waveguide of the plurality of stacked waveguides can have an associated depth plane. The beam emitted from each waveguide can appear to originate from that waveguide's associated depth plane. Different waveguides from the plurality of stacked waveguides can have different associated depth planes. Various embodiments of the optical system discussed above can be integrated in an augmented reality (AR) device, a virtual reality (VR) device, a near-to-eye display device, or eyewear comprising at least one of a frame, one or more lenses, or ear stems.
Details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages will become apparent from the description, the drawings, and the claims. Neither this summary nor the following detailed description purports to define or limit the scope of the inventive subject matter.
The drawings are provided to illustrate certain example embodiments and are not intended to limit the scope of the disclosure. Like numerals refer to like parts throughout.
Overview
In order for a three-dimensional (3D) display to produce a true sensation of depth, and more specifically, a simulated sensation of surface depth, it is desirable for each point in the display's visual field to generate the accommodative response corresponding to its virtual depth. If the accommodative response to a display point does not correspond to the virtual depth of that point, as determined by the binocular depth cues of convergence and stereopsis, the human eye may experience an accommodation conflict, resulting in unstable imaging, harmful eye strain, headaches, and, in the absence of accommodation information, almost a complete lack of surface depth.
VR and AR experiences can be provided by display systems having displays in which images corresponding to a plurality of depth planes are provided to a viewer. The images may be different for each depth plane (e.g., provide slightly different presentations of a scene or object) and may be separately focused by the viewer's eyes, thereby helping to provide the user with depth cues based on the accommodation of the eye required to bring into focus different image features for the scene located on different depth planes and/or based on observing different image features on different depth planes being out of focus. As discussed elsewhere herein, such depth cues provide credible perceptions of depth.
The local processing and data module 71 may comprise a hardware processor, as well as digital memory, such as non-volatile memory (e.g., flash memory), both of which may be utilized to assist in the processing, caching, and storage of data. The data may include data a) captured from sensors (which may be, e.g., operatively coupled to the frame 64 or otherwise attached to the user 60), such as image capture devices (e.g., cameras), microphones, inertial measurement units, accelerometers, compasses, global positioning system (GPS) units, radio devices, and/or gyroscopes; and/or b) acquired and/or processed using remote processing module 72 and/or remote data repository 74, possibly for passage to the display 62 after such processing or retrieval. The local processing and data module 71 may be operatively coupled by communication links 76 and/or 78, such as via wired or wireless communication links, to the remote processing module 72 and/or remote data repository 74 such that these remote modules are available as resources to the local processing and data module 71. In addition, remote processing module 72 and remote data repository 74 may be operatively coupled to each other.
In some embodiments, the remote processing module 72 may comprise one or more processors configured to analyze and process data and/or image information. In some embodiments, the remote data repository 74 may comprise a digital data storage facility, which may be available through the internet or other networking configuration in a “cloud” resource configuration. In some embodiments, all data is stored and all computations are performed in the local processing and data module, allowing fully autonomous use from a remote module.
The human visual system is complicated and providing a realistic perception of depth is challenging. Without being limited by theory, it is believed that viewers of an object may perceive the object as being three-dimensional due to a combination of vergence and accommodation. Vergence movements (e.g., rotational movements of the pupils toward or away from each other to converge the lines of sight of the eyes to fixate upon an object) of the two eyes relative to each other are closely associated with focusing (or “accommodation”) of the lenses of the eyes. Under normal conditions, changing the focus of the lenses of the eyes, or accommodating the eyes, to change focus from one object to another object at a different distance will automatically cause a matching change in vergence to the same distance, under a relationship known as the “accommodation-vergence reflex.” Likewise, a change in vergence will trigger a matching change in accommodation, under normal conditions. Display systems that provide a better match between accommodation and vergence may form more realistic or comfortable simulations of three-dimensional imagery.
Waveguide Stack Assembly
With continued reference to
In some embodiments, the image injection devices 200, 202, 204, 206, 208 are discrete displays that each produce image information for injection into a corresponding waveguide 182, 184, 186, 188, 190, respectively. In some other embodiments, the image injection devices 200, 202, 204, 206, 208 are the output ends of a single multiplexed display which may, e.g., pipe image information via one or more optical conduits (such as fiber optic cables) to each of the image injection devices 200, 202, 204, 206, 208.
A controller 210 controls the operation of the stacked waveguide assembly 178 and the image injection devices 200, 202, 204, 206, 208. In some embodiments, the controller 210 includes programming (e.g., instructions in a non-transitory computer-readable medium) that regulates the timing and provision of image information to the waveguides 182, 184, 186, 188, 190. In some embodiments, the controller may be a single integral device, or a distributed system connected by wired or wireless communication channels. The controller 210 may be part of the processing modules 71 or 72 (illustrated in
The waveguides 182, 184, 186, 188, 190 may be configured to propagate light within each respective waveguide by total internal reflection (TIR). The waveguides 182, 184, 186, 188, 190 may each be planar or have another shape (e.g., curved), with major top and bottom surfaces and edges extending between those major top and bottom surfaces. In the illustrated configuration, the waveguides 182, 184, 186, 188, 190 may each include light extracting optical elements 282, 284, 286, 288, 290 that are configured to extract light out of a waveguide by redirecting the light, propagating within each respective waveguide, out of the waveguide to output image information to the eye 304. Extracted light may also be referred to as outcoupled light, and light extracting optical elements may also be referred to as outcoupling optical elements. An extracted beam of light is outputted by the waveguide at locations at which the light propagating in the waveguide strikes a light redirecting element. The light extracting optical elements 282, 284, 286, 288, 290 may, for example, be reflective and/or diffractive optical features. While illustrated disposed at the bottom major surfaces of the waveguides 182, 184, 186, 188, 190 for ease of description and drawing clarity, in some embodiments, the light extracting optical elements 282, 284, 286, 288, 290 may be disposed at the top and/or bottom major surfaces, and/or may be disposed directly in the volume of the waveguides 182, 184, 186, 188, 190. In some embodiments, the light extracting optical elements 282, 284, 286, 288, 290 may be formed in a layer of material that is attached to a transparent substrate to form the waveguides 182, 184, 186, 188, 190. In some other embodiments, the waveguides 182, 184, 186, 188, 190 may be a monolithic piece of material and the light extracting optical elements 282, 284, 286, 288, 290 may be formed on a surface and/or in the interior of that piece of material.
With continued reference to
The other waveguide layers (e.g., waveguides 188, 190) and lenses (e.g., lenses 196, 198) are similarly configured, with the highest waveguide 190 in the stack sending its output through all of the lenses between it and the eye for an aggregate focal power representative of the closest focal plane to the person. To compensate for the stack of lenses 198, 196, 194, 192 when viewing/interpreting light coming from the world 144 on the other side of the stacked waveguide assembly 178, a compensating lens layer 180 may be disposed at the top of the stack to compensate for the aggregate power of the lens stack 198, 196, 194, 192 below. Such a configuration provides as many perceived focal planes as there are available waveguide/lens pairings. Both the light extracting optical elements of the waveguides and the focusing aspects of the lenses may be static (e.g., not dynamic or electro-active). In some alternative embodiments, either or both may be dynamic using electro-active features.
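In a thin-lens approximation, the aggregate focal power seen by light from a given waveguide is simply the sum of the powers of the lenses between that waveguide and the eye, and the compensating lens cancels the total for world light. A minimal sketch of this bookkeeping (the diopter values and the five-waveguide/four-lens arrangement are illustrative assumptions, not values from the disclosure):

```python
# Thin-lens sketch: each waveguide's output passes through all lenses
# between it and the eye; in this approximation their powers (in
# diopters) simply add. Values below are illustrative only.
lens_powers = [-0.5, -0.5, -0.5, -0.5]  # hypothetical powers of lenses 192, 194, 196, 198

def aggregate_power(waveguide_index):
    """Total power seen by light exiting waveguide i (0 = closest to eye)."""
    # Waveguide i sends its light through the i lenses below it;
    # waveguide 0 delivers collimated light directly (optical infinity).
    return sum(lens_powers[:waveguide_index])

# The highest waveguide sees the full stack -> closest perceived focal plane.
powers = [aggregate_power(i) for i in range(5)]

# A compensating lens at the top of the stack cancels the whole stack
# for light coming from the world.
compensating_power = -sum(lens_powers)
```

The highest waveguide accumulates the most negative power, so its image appears closest to the viewer, matching the description above.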
With continued reference to
In some embodiments, the light extracting optical elements 282, 284, 286, 288, 290 are diffractive features that form a diffraction pattern, or “diffractive optical element” (also referred to herein as a “DOE”). Preferably, the DOEs have a relatively low diffraction efficiency so that only a portion of the light of the beam is deflected away toward the eye 304 with each intersection of the DOE, while the rest continues to move through a waveguide via total internal reflection. The light carrying the image information is thus divided into a number of related exit beams that exit the waveguide at a multiplicity of locations and the result is a fairly uniform pattern of exit emission toward the eye 304 for this particular collimated beam bouncing around within a waveguide.
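The effect of a low diffraction efficiency can be quantified: if a fraction η of the guided light is deflected at each DOE intersection, the k-th exit beam carries η(1−η)^(k−1) of the input, so a small η spreads the light over many exits with only a gradual fall-off. A sketch (the efficiency value is an illustrative assumption):

```python
def exit_beam_fractions(efficiency, num_intersections):
    """Fraction of the input light leaving at each DOE intersection when
    a fixed fraction is diffracted out per intersection and the
    remainder continues via total internal reflection."""
    fractions = []
    remaining = 1.0
    for _ in range(num_intersections):
        fractions.append(remaining * efficiency)
        remaining *= 1.0 - efficiency
    return fractions

# Illustrative: 5% efficiency over 20 intersections.
fractions = exit_beam_fractions(0.05, 20)
# Low efficiency -> many nearly equal exit beams, i.e. a fairly uniform
# pattern of exit emission across the eyebox.
ratio = fractions[-1] / fractions[0]
```

With these assumed numbers the last exit beam still carries roughly 38% of the intensity of the first, illustrating why a relatively low diffraction efficiency yields a fairly uniform exit pattern.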
In some embodiments, one or more DOEs may be switchable between “on” states in which they actively diffract, and “off” states in which they do not significantly diffract. For instance, a switchable DOE may comprise a layer of polymer dispersed liquid crystal, in which microdroplets comprise a diffraction pattern in a host medium, and the refractive index of the microdroplets can be switched to substantially match the refractive index of the host material (in which case the pattern does not appreciably diffract incident light) or the microdroplet can be switched to an index that does not match that of the host medium (in which case the pattern actively diffracts incident light).
In some embodiments, the number and distribution of depth planes and/or depth of field may be varied dynamically based on the pupil sizes and/or orientations of the eyes of the viewer. In some embodiments, a camera 500 (e.g., a digital camera) may be used to capture images of the eye 304 to determine the size and/or orientation of the pupil of the eye 304. The camera 500 can be used to obtain images for use in determining the direction the wearer 60 is looking (e.g., eye pose) or for biometric identification of the wearer (e.g., via iris identification). In some embodiments, the camera 500 may be attached to the frame 64 (as illustrated in
For example, depth of field may change inversely with a viewer's pupil size. As a result, as the sizes of the pupils of the viewer's eyes decrease, the depth of field increases, such that a plane that is not discernible because its location is beyond the depth of focus of the eye may become discernible and appear more in focus with the reduction of pupil size and commensurate increase in depth of field. Likewise, the number of spaced-apart depth planes used to present different images to the viewer may be decreased with decreased pupil size. For example, a viewer may not be able to clearly perceive the details of both a first depth plane and a second depth plane at one pupil size without adjusting the accommodation of the eye away from one depth plane and to the other depth plane. These two depth planes may, however, be sufficiently in focus at the same time to the user at another pupil size without changing accommodation.
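The inverse relation between pupil size and depth of field can be sketched with a simple defocus model: the angular blur of a point scales roughly with the pupil diameter times the dioptric defocus between where the eye is accommodated and where the point lies. A hypothetical sketch (the blur threshold and pupil sizes are illustrative assumptions, not values from the disclosure):

```python
def angular_blur_rad(pupil_diameter_m, accommodation_diopters, object_diopters):
    """Small-angle geometric blur: pupil diameter times dioptric defocus."""
    return pupil_diameter_m * abs(accommodation_diopters - object_diopters)

BLUR_THRESHOLD = 1e-3  # ~3.4 arcmin; an illustrative discriminability limit

def planes_distinguishable(pupil_diameter_m, plane1_diopters, plane2_diopters):
    """Does plane 2 look noticeably out of focus while the eye is
    accommodated on plane 1?"""
    return angular_blur_rad(pupil_diameter_m, plane1_diopters, plane2_diopters) > BLUR_THRESHOLD

# With a 4 mm pupil, a 0.5 D separation between depth planes produces
# noticeable blur; with a 1.5 mm pupil, the same separation stays below
# the threshold, so fewer depth planes are needed.
large_pupil = planes_distinguishable(4e-3, 1.0, 1.5)
small_pupil = planes_distinguishable(1.5e-3, 1.0, 1.5)
```

This reproduces the behavior described above: shrinking the pupil enlarges the depth of field, merging depth planes that were separately resolvable at a larger pupil size.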
In some embodiments, the display system may vary the number of waveguides receiving image information based upon determinations of pupil size and/or orientation, or upon receiving electrical signals indicative of particular pupil sizes and/or orientations. For example, if the user's eyes are unable to distinguish between two depth planes associated with two waveguides, then the controller 210 may be configured or programmed to cease providing image information to one of these waveguides. Advantageously, this may reduce the processing burden on the system, thereby increasing the responsiveness of the system. In embodiments in which the DOEs for a waveguide are switchable between on and off states, the DOEs may be switched to the off state when the waveguide does not receive image information.
In some embodiments, it may be desirable to have an exit beam meet the condition of having a diameter that is less than the diameter of the eye of a viewer. However, meeting this condition may be challenging in view of the variability in size of the viewer's pupils. In some embodiments, this condition is met over a wide range of pupil sizes by varying the size of the exit beam in response to determinations of the size of the viewer's pupil. For example, as the pupil size decreases, the size of the exit beam may also decrease. In some embodiments, the exit beam size may be varied using a variable aperture.
The relayed and exit-pupil expanded light is optically coupled from the distribution waveguide apparatus into the one or more primary planar waveguides 1. The primary planar waveguide 1 relays light along a second axis, preferably orthogonal to the first axis (e.g., horizontal or X-axis in view of
The optical system may include one or more sources of colored light (e.g., red, green, and blue laser light) 110 which may be optically coupled into a proximal end of a single mode optical fiber 9. A distal end of the optical fiber 9 may be threaded or received through a hollow tube 8 of piezoelectric material. The distal end protrudes from the tube 8 as a fixed-free flexible cantilever 7. The piezoelectric tube 8 can be associated with four quadrant electrodes (not illustrated). The electrodes may, for example, be plated on the outside, outer surface or outer periphery or diameter of the tube 8. A core electrode (not illustrated) is also located in a core, center, inner periphery or inner diameter of the tube 8.
Drive electronics 12, for example electrically coupled via wires 10, drive opposing pairs of electrodes to bend the piezoelectric tube 8 in two axes independently. The protruding distal tip of the optical fiber 7 has mechanical modes of resonance. The frequencies of resonance can depend upon a diameter, length, and material properties of the optical fiber 7. By vibrating the piezoelectric tube 8 near a first mode of mechanical resonance of the fiber cantilever 7, the fiber cantilever 7 is caused to vibrate, and can sweep through large deflections.
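The dependence of the cantilever's resonance on its diameter, length, and material can be made concrete with the standard Euler–Bernoulli result for a fixed-free cylindrical beam, f1 = (β1²/2π)·√(EI/(ρAL⁴)), which for a circular cross-section of diameter D reduces to f1 = β1²·D·√(E/ρ)/(8πL²). A sketch under assumed fused-silica properties (the fiber dimensions are hypothetical, not taken from the disclosure):

```python
import math

def cantilever_f1(length_m, diameter_m, youngs_modulus_pa, density_kg_m3):
    """First-mode resonance of a fixed-free cylindrical cantilever
    (Euler-Bernoulli beam theory; beta_1^2 = 3.516 for the first mode).
    For a circular cross-section, sqrt(I/A) = D/4."""
    beta1_sq = 3.516
    return (beta1_sq * diameter_m / (8 * math.pi * length_m**2)
            * math.sqrt(youngs_modulus_pa / density_kg_m3))

# Assumed values: fused silica (E ~ 73 GPa, rho ~ 2200 kg/m^3),
# 125 um cladding diameter, 3 mm protruding tip -- illustrative only.
f1 = cantilever_f1(3e-3, 125e-6, 73e9, 2200)
```

With these assumed dimensions the first resonance lands in the kilohertz range, shortening the tip or thickening the fiber raises it, which is why the tube 8 is driven near this frequency to excite large deflections.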
By stimulating resonant vibration in two axes, the tip of the fiber cantilever 7 is scanned biaxially in an area-filling two-dimensional (2D) scan. By modulating an intensity of light source(s) 110 in synchrony with the scan of the fiber cantilever 7, light emerging from the fiber cantilever 7 forms an image. Descriptions of such a set-up are provided in U.S. Patent Publication No. 2014/0003762, which is incorporated by reference herein in its entirety.
A component 6 of an optical coupler subsystem collimates the light emerging from the scanning fiber cantilever 7. The collimated light is reflected by mirrored surface 5 into the narrow distribution planar waveguide 3 which contains the at least one diffractive optical element (DOE) 4. The collimated light propagates vertically (relative to the view of
At each point of intersection with the DOE 4, additional light is diffracted toward the entrance of the primary waveguide 1. By dividing the incoming light into multiple outcoupled sets, the exit pupil of the light can be expanded vertically by the DOE 4 in the distribution planar waveguide 3 and/or the eyebox can be expanded. This vertically expanded light coupled out of distribution planar waveguide 3 enters the edge of the primary planar waveguide 1.
Light entering primary waveguide 1 propagates horizontally (relative to the view of
At each point of intersection between the propagating light and the DOE 2, a fraction of the light is diffracted toward the exit surface of the primary waveguide 1, allowing the light to escape TIR and emerge from the exit surface of the primary waveguide 1. In some embodiments, the radially symmetric diffraction pattern of the DOE 2 additionally imparts a divergence to the diffracted light such that it appears to originate from a focal depth, thereby shaping the light wavefront (e.g., imparting a curvature) of the individual beam as well as steering the beam at an angle that matches the designed focal depth.
Accordingly, these different pathways can cause the light to be coupled out of the primary planar waveguide 1 by a multiplicity of DOEs 2 at different angles and focal depths, and/or with different fill patterns at the exit pupil. Different fill patterns at the exit pupil can be beneficially used to create a light field display with multiple depth planes. Each layer in the waveguide assembly or a set of layers (e.g., 3 layers) in the stack may be employed to generate a respective color (e.g., red, blue, green). Thus, for example, a first set of three layers may be employed to respectively produce red, blue and green light at a first focal depth. A second set of three layers may be employed to respectively produce red, blue and green light at a second focal depth. Multiple sets may be employed to generate a full 3D or 4D color image light field with various focal depths.
Other Components of AR Systems
In many implementations, the AR system may include other components in addition to the wearable display system 80 (or optical systems 100). The AR devices may, for example, include one or more haptic devices or components. The haptic device(s) or component(s) may be operable to provide a tactile sensation to a user. For example, the haptic device(s) or component(s) may provide a tactile sensation of pressure and/or texture when touching virtual content (e.g., virtual objects, virtual tools, other virtual constructs). The tactile sensation may replicate a feel of a physical object which a virtual object represents, or may replicate a feel of an imagined object or character (e.g., a dragon) which the virtual content represents. In some implementations, haptic devices or components may be worn by the user (e.g., a user wearable glove). In some implementations, haptic devices or components may be held by the user.
The AR system may, for example, include one or more physical objects which are manipulable by the user to allow input or interaction with the AR system. These physical objects are referred to herein as totems. Some totems may take the form of inanimate objects, for example a piece of metal or plastic, a wall, or a surface of a table. Alternatively, some totems may take the form of animate objects, for example a hand of the user. As described herein, the totems may not actually have any physical input structures (e.g., keys, triggers, joystick, trackball, rocker switch). Instead, the totem may simply provide a physical surface, and the AR system may render a user interface so as to appear to a user to be on one or more surfaces of the totem. For example, the AR system may render an image of a computer keyboard and trackpad to appear to reside on one or more surfaces of a totem. For instance, the AR system may render a virtual computer keyboard and virtual trackpad to appear on a surface of a thin rectangular plate of aluminum which serves as a totem. The rectangular plate does not itself have any physical keys or trackpad or sensors. However, the AR system may detect user manipulation or interaction or touches with the rectangular plate as selections or inputs made via the virtual keyboard and/or virtual trackpad.
Examples of haptic devices and totems usable with the AR devices, HMD, and display systems of the present disclosure are described in U.S. Patent Publication No. 2015/0016777, which is incorporated by reference herein in its entirety.
Optical Systems with Exit Pupil Expander
An optical system (e.g., wearable display system 80 or the optical system 100) comprising a waveguide (e.g., planar waveguide 1) that is configured to output incoupled light propagating through the waveguide via total internal reflection can be associated with an exit pupil configured such that light rays that exit the system through the exit pupil can be viewed by a user. An exit pupil larger than the pupil size of the user's eyes wastes some light, but allows for some tolerance in side-to-side movement of the user's head or eye. The optical system can also be associated with an eyebox which corresponds to the volume where the user can place his/her eye without sacrificing full field of view (FOV) and/or the full resolution of the optical system.
Various embodiments of optical systems (e.g., wearable display system 80 or the optical system 100) can include additional waveguides (e.g., the distribution waveguide apparatus 3 illustrated in
Various embodiments of an optical system (e.g., wearable display system 80 or the optical system 100) can comprise a waveguide (e.g., planar waveguide 1) having two reflective surfaces—a first reflective surface and a second reflective surface. An incoming light beam incident on the first reflective surface at an angle θ can be coupled into the waveguide such that it propagates through the waveguide via total internal reflection at the first and the second reflective surfaces. Each occurrence of total internal reflection at the first and the second reflective surfaces can be considered to produce a copy of the incoming light beam. Accordingly, multiple copies of the incoming light beam can be produced as the light beam propagates through the waveguide. The incoupled light beam that propagates through the waveguide can be coupled out of the waveguide through the second reflective surface. Each copy of the incoupled light beam can be considered to be a kaleidoscopic copy or a mirror image of the incoupled light beam. Accordingly, the light that is coupled out of the second reflective surface of the waveguide can be considered to include a beamlet array including a plurality of light beams that are copies of the incoupled light beam. Each of the plurality of light beams can have a beam diameter that is equal to the beam diameter of the incoupled light beam. Each of the plurality of light beams of the beamlet array can appear to originate from a virtual source that is disposed on a side of the reflective surface from which the incoupled light beam is totally internally reflected. Accordingly, each reflective surface of the waveguide produces a set of mirror-image copies of the input light source that emits the incoming light beam. The set of mirror-image copies appears to be on a side of a respective reflective surface. This is explained further below with reference to
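The geometry of these kaleidoscopic copies can be sketched directly: for a gap height d between the reflective surfaces and an internal propagation angle θ, each round trip between the surfaces advances the beam laterally by 2d·tan θ, producing a regular array of exit beamlets, while the virtual sources sit at even multiples of d behind the reflective surface. A sketch (the gap height and angle are illustrative assumptions):

```python
import math

def beamlet_exit_positions(d, theta_rad, num_copies, x0=0.0):
    """Lateral exit positions of successive beamlet copies: each round
    trip between the two reflective surfaces advances the beam by
    2 * d * tan(theta)."""
    step = 2 * d * math.tan(theta_rad)
    return [x0 + k * step for k in range(num_copies)]

def virtual_source_depths(d, num_copies):
    """Mirror-image virtual sources appear at depths that are even
    multiples of the gap height behind the reflective surface."""
    return [2 * k * d for k in range(1, num_copies + 1)]

# Illustrative: 0.5 mm gap, 45-degree internal propagation angle.
xs = beamlet_exit_positions(0.5e-3, math.radians(45), 5)
depths = virtual_source_depths(0.5e-3, 3)
```

The equal spacing of `xs` is what makes the output a regular beamlet array; each copy has the same diameter as the incoupled beam, only its exit position (and apparent source depth) differs.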
A pivotable optical system, such as, for example, a human eye viewing one of the two surfaces of the waveguide 710 (e.g., second reflective surface 712a as illustrated in
The point spread function (PSF) of the beamlet array output from the waveguide can depend on the characteristics of the input light source that outputs the incoupled light beam 701. This is explained herein with reference to
In embodiments of optical systems (e.g., optical system 100) in which light from a scanning projector (e.g., a projection system including a fiber cantilever 7 illustrated in
Although lens-based exit pupil expander systems can achieve a desired output aperture size, they can be bulky and heavy, making them impractical to integrate with near-to-eye display systems. As discussed above, a waveguide having a refractive index ‘n’ and thickness ‘d’ can function as an EPE when the optical path length difference between adjacent beams of the beamlet array output from the waveguide, Γ=2nd cos θ, is an integral multiple of the wavelength λ of the incident light. Accordingly, waveguides can provide a compact way of increasing the exit pupil of an optical system without contributing to its weight or bulk.
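For fixed n, d, and λ, the condition Γ = 2nd cos θ = mλ is satisfied only at a discrete set of input angles, which is what produces the angular filter grid discussed below. A sketch that solves for those angles (the waveguide parameters are illustrative assumptions):

```python
import math

def resonant_angles_deg(n, d, wavelength, max_angle_deg=60.0):
    """Angles theta for which the optical path length difference
    Gamma = 2*n*d*cos(theta) is an integer multiple m of the wavelength."""
    angles = []
    m_max = int(2 * n * d / wavelength)  # largest order, at theta = 0
    m_min = int(2 * n * d * math.cos(math.radians(max_angle_deg)) / wavelength)
    for m in range(m_min + 1, m_max + 1):
        cos_theta = m * wavelength / (2 * n * d)
        if -1.0 <= cos_theta <= 1.0:
            angles.append(math.degrees(math.acos(cos_theta)))
    return sorted(angles)

# Illustrative: n = 1.5, d = 50 um, lambda = 532 nm (green).
angles = resonant_angles_deg(1.5, 50e-6, 532e-9)
# Only these discrete angles output a phase-synchronized beamlet array;
# intermediate scan angles fall between the grid points.
```

Each listed angle corresponds to one spot of the 2D grid of focused spots; a scanning projector sweeping between these angles is what motivates the dynamic parameter control described next.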
However, as noted for
As the incident angle at which input light is incoupled into the waveguide varies within the solid angle Θ, the beamlet array output from the waveguide is angularly filtered by a discrete two-dimensional (2D) grid of focused spots, as described above with reference to
In optical systems including a scanning projector with a small aperture size as a source of optical signal and a waveguide as an exit pupil expander, it is advantageous to control one or more of the optical and/or mechanical properties of the display system and/or the input beam to maintain the intensity of projected images at an intensity level above a threshold. The optical and/or mechanical properties can include the spacing between the reflective surfaces of the waveguide (also referred to as the thickness ‘d’), the index of refraction ‘n’ of the waveguide or the wavelength λ of the input optical signal. The optical and/or mechanical properties of the display system and/or the input beam can be controlled to be in synchrony with the variations of the input beam's scan angle such that the discrete two-dimensional (2D) grid of focused spots can be angularly shifted in a manner such that every scan angle of the projector will produce a beamlet array that has a compact tightly focused PSF (similar to the PSF depicted in
The output beam produced by an optical system comprising a waveguide that splits a scanned input beam into a regular two-dimensional beamlet array including a plurality of light beams can have a beam diameter that is greater than the beam diameter of individual ones of the plurality of light beams of the beamlet array when one or more of the physical or optical properties of the waveguide and/or the wavelength of the scanned input beam is varied approximately at a frequency of the scan rate. Varying one or more of the physical or optical properties of the waveguide and/or the wavelength of the scanned input beam at a frequency of the scan rate can advantageously control the relative phase shift between the light beams in the beamlet array such that the output beam has a continuous wavefront with a uniform phase. Such embodiments of the optical system can be considered to function as an optical phased array that is capable of forming and steering output beams with larger beam diameters. In such optical systems, the projector's scanning technology steers the input beam between the preferred angles of the waveguide's angular filter grid (which corresponds to the 2D grid of focused spots), and the modulation technologies employed to vary one or more of the physical or optical properties of the waveguide and/or the wavelength of the scanned input beam at the frequency of the scan rate are responsible for steering the angular filter grid between the different angles of the input beam.
In various embodiments, the waveguide can be configured such that the beamlet array output from the waveguide forms a light beam having a continuous wavefront with a uniform phase and a beam diameter that is larger than the beam diameter of the individual beams in the beamlet array without dynamically varying (e.g., by utilizing one or more holographic structures) one or more of the physical or optical properties of the waveguide and/or the wavelength of the scanned input beam at a frequency of the scan rate. Systems and methods that can dynamically or non-dynamically achieve phase synchronization between the various light beams of the beamlet array for different scanned angles of the input light beam are discussed below.
1. Dynamic Phase Synchronization
A variety of techniques and methods, discussed below, can be used to vary one or more of the physical or optical properties of the waveguide and/or the wavelength of the scanned input beam at the frequency of the scan rate to dynamically achieve phase synchronization between the various light beams of the beamlet array for different scanned angles of the input light beam. In various embodiments, the optical system can comprise a control system that is configured to control one or more of the physical or optical properties of the waveguide (e.g., refractive index, distance between the reflective surfaces of the waveguide) and/or the wavelength of the input beam. The control system can include feedback loops to continuously maintain phase synchronization between the individual light beams of the beamlet array.
1.1. Index of Refraction
As discussed above, to maintain phase synchronization between the individual light beams of the beamlet array the optical path length difference Γ=2nd cos θ should be an integral multiple of the wavelength λ. Accordingly, if the index of refraction of the material of the waveguide is varied at a frequency of the scan rate (or at the frequency at which θ varies) such that the optical path length difference Γ=2nd cos θ is an integral multiple of the wavelength λ for all input angles θ, then phase synchronization between the individual light beams of the beamlet array can be maintained for all input angles θ.
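The condition Γ=2nd cos θ=mλ can be illustrated with a short numerical sketch. The following is purely illustrative and not part of the disclosure; the function name, the example parameter values (0.5 mm gap, n≈1.5, 520 nm light), and the choice of rounding to the nearest interference order m are assumptions. It solves the condition for the refractive index n closest to a nominal value at each scan angle:

```python
import math

def required_index(theta_rad, d, wavelength, n_nominal):
    # Pick the interference order m that the nominal index would give
    # at this scan angle, then solve 2 * n * d * cos(theta) = m * lambda for n.
    m = round(2 * n_nominal * d * math.cos(theta_rad) / wavelength)
    return m * wavelength / (2 * d * math.cos(theta_rad))

# Illustrative values: 0.5 mm gap, n ~ 1.5, 520 nm light, scan angles 0-5 degrees.
d = 500e-6
wavelength = 520e-9
for deg in (0.0, 1.0, 2.0, 5.0):
    n = required_index(math.radians(deg), d, wavelength, 1.5)
    print(f"theta = {deg:3.1f} deg -> n = {n:.6f}")
```

The required index stays within roughly λ/(4d) of the nominal value, since adjacent orders m are that closely spaced in n.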
n0 at scan angle θm+1 such that the term
2n0d cos θm+1 is equal to mλ, wherein m is an integer. The point 901e of
n0 at scan angle θm+2 such that the term
2n0d cos θm+2 is equal to (m+1)λ, wherein m is an integer. In
n0 is substantially equal to
n0. Only some of the possible values of refractive index ‘n’ at which 2nd cos θ is an integral multiple of the wavelength λ are depicted in
Refractive index of the material of the waveguide can be varied by a variety of techniques including but not limited to varying parameters of an electrical or optical field, varying temperature of the material of the waveguide, varying chemical compositions and/or concentrations of various materials comprised in the waveguide, by piezo-optic effects, etc. For example, the waveguide can comprise a crystalline and/or liquid crystal material whose index of refraction can be varied with the application of electric fields via a number of different electro-optic effects. As another example, the waveguide can comprise a liquid solution whose index of refraction can be varied by controlling the mixing and relative concentrations of its solutes. As another example, the waveguide can comprise a chemically active substrate whose index of refraction can be varied by controlling the rate and/or the results of chemical reactions within the material comprising the waveguide. For example, in some embodiments, the rate and/or the results of chemical reactions within the material comprising the waveguide can be controlled by application of electric field, application of optical field or both. As another example, in some embodiments, the rate and/or the results of chemical reactions within the material comprising the waveguide can be controlled by the use of chemical pumps. Changes in optical wavelength can also produce changes in the refractive index. Accordingly, in various embodiments, the change in the refractive index can be correlated to the wavelength λ of light that is incident on the waveguide. For example, the wavelength λ of the incident light can vary due to a variety of factors including but not limited to modulation of the incident light, non-linearity and/or dispersion of the waveguide. 
For example, in various embodiments, the wavelength of the incident light λ can change due to modulation by an amount Δλ that is about 1%-10% of the wavelength of the unmodulated incident light. Accordingly, a controller configured to vary the refractive index of the material of the waveguide can be configured to take into consideration the change in the wavelength λ of the incident light when calculating the amount Δn by which the refractive index is to be changed. In various embodiments, the controller can include a feedback loop that is configured to dynamically calculate a change in the wavelength λ of the incident light and calculate the amount Δn by which the refractive index is to be changed based on that change in wavelength, such that phase synchronization between the various light beams of the beamlet array for different scanned angles of the input light beam can be achieved.
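The first-order coupling between a wavelength shift and the compensating index change can be sketched as follows (illustrative only; the function name and example values are assumptions). Differentiating n = mλ/(2d cos θ) at a fixed order m and angle gives Δn = n·Δλ/λ:

```python
def index_correction_for_wavelength_shift(n, wavelength, delta_wavelength):
    # From n = m * lambda / (2 * d * cos(theta)), a first-order shift in the
    # source wavelength requires dn = n * (dlambda / lambda) to keep the
    # same interference order m phase-matched.
    return n * delta_wavelength / wavelength

# A 1% shift of 520 nm light in an n = 1.5 waveguide:
dn = index_correction_for_wavelength_shift(1.5, 520e-9, 5.2e-9)
print(f"dn = {dn:.4f}")  # dn = 0.0150
```

A 1%-10% wavelength modulation, as mentioned above, therefore maps to an index correction on the order of 0.015-0.15 for n = 1.5, which a feedback loop would apply on top of the angle-driven modulation.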
1.2. Reflector Plane Spacing
Various embodiments of the waveguide can be configured such that the spacing (also referred to as the thickness of the waveguide) between the reflective surfaces (e.g., reflective surfaces 712a and 712b of waveguide 710) need not be fixed but instead can be varied. For example, in various embodiments of the waveguide, the space between the reflective surfaces can be occupied by a fluid or air. The waveguide can comprise a controller that moves one or both of the reflective surfaces with respect to each other to vary the distance between the reflective surfaces and/or the thickness of the space including the fluid or air at the frequency of the scan rate (or at the frequency at which θ varies) such that the optical path length difference Γ=2nd cos θ is an integral multiple of the wavelength λ for all input angles θ.
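The spacing modulation can be sketched numerically in the same way as the index modulation (illustrative only; the function name and values are assumptions, not part of the disclosure). Solving 2nd cos θ = mλ for the spacing d nearest a nominal value:

```python
import math

def required_spacing(theta_rad, n, wavelength, d_nominal):
    # Interference order m the nominal spacing would give at this angle,
    # then solve 2 * n * d * cos(theta) = m * lambda for d.
    m = round(2 * n * d_nominal * math.cos(theta_rad) / wavelength)
    return m * wavelength / (2 * n * math.cos(theta_rad))

# Illustrative values: n = 1.5, 520 nm light, 0.5 mm nominal spacing.
for deg in (0.0, 1.0, 2.0, 5.0):
    d = required_spacing(math.radians(deg), 1.5, 520e-9, 500e-6)
    print(f"theta = {deg:3.1f} deg -> d = {d*1e6:.4f} um")
```

Adjacent orders m are spaced by λ/(2n cos θ) in d, so the controller only ever needs sub-micron adjustments around the nominal spacing.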
d0 at scan angle θm+1 such that the term
2nd0 cos θm+1 is equal to mλ, wherein m is an integer. The spacing between the reflective surfaces of the waveguide at point 905e of
d0 at scan angle θm+2 such that the term
2nd0 cos θm+2 is equal to (m+1)λ, wherein m is an integer. In
d0 is substantially equal to
d0. Only some of the possible values of the spacing ‘d’ between the reflective surfaces of the waveguide at which 2nd cos θ is an integral multiple of the wavelength λ are depicted in
As discussed above, the variation of the spacing ‘d’ between the reflective surfaces of the waveguide can be synchronized with the variation in the scan angle θ. The variation of the spacing between the reflective surfaces of the waveguide can be periodic as depicted in
In another embodiment, the waveguide can comprise a plurality of layers that are spaced apart from each other. Each of the plurality of layers can be configured to be switched between a reflective state and a transmissive state. A pair of reflective surfaces with any desired spacing between them can be obtained by selectively configuring two of the plurality of layers to be in a reflective state and configuring the remaining plurality of layers to be in a transmissive state. In such embodiments, each of the plurality of layers can be switched between the reflective state and the transmissive state using electro-magnetic control systems. This is explained in greater detail below with reference to
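The layer-selection idea above can be sketched with a small search (purely illustrative; the exhaustive search, the function name, and the example stack parameters are assumptions, not part of the disclosure). Given a stack of equally spaced switchable layers, two layers are chosen whose separation best approximates the desired reflector spacing:

```python
def choose_layers(target_d, layer_pitch, num_layers):
    # Choose two layers of an equally spaced switchable stack whose
    # separation best approximates the target reflector spacing; all
    # other layers would be left in their transmissive state.
    best = None
    for i in range(num_layers):
        for j in range(i + 1, num_layers):
            err = abs((j - i) * layer_pitch - target_d)
            if best is None or err < best[0]:
                best = (err, i, j)
    return best[1], best[2]

# Illustrative values: 10 um layer pitch, 100 layers, target spacing 557 um.
print(choose_layers(557e-6, 10e-6, 100))  # (0, 56)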
In some embodiments of the waveguide the reflective surfaces can comprise a piezoelectric material. In such embodiments, the spacing between the reflective surfaces can be varied by inducing mechanical expansion or contraction of the waveguide via the application of an electric field.
1.3. Wavelength
Various embodiments of the waveguide can be configured such that the wavelength of the incident light (e.g., light beam 701) can be varied at a frequency of the scan rate (or at the frequency at which θ varies) such that the optical path length difference Γ=2nd cos θ is an integral multiple of the wavelength λ for all input angles θ.
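The wavelength modulation can be sketched in the same form as the index and spacing cases (illustrative only; the function name and values are assumptions). Solving 2nd cos θ = mλ for the wavelength nearest a nominal source wavelength:

```python
import math

def required_wavelength(theta_rad, n, d, wavelength_nominal):
    # Interference order m nearest the nominal wavelength at this angle,
    # then solve 2 * n * d * cos(theta) = m * lambda for lambda.
    m = round(2 * n * d * math.cos(theta_rad) / wavelength_nominal)
    return 2 * n * d * math.cos(theta_rad) / m

# Illustrative values: n = 1.5, 0.5 mm gap, 520 nm nominal wavelength.
for deg in (0.0, 1.0, 2.0, 5.0):
    lam = required_wavelength(math.radians(deg), 1.5, 500e-6, 520e-9)
    print(f"theta = {deg:3.1f} deg -> lambda = {lam*1e9:.4f} nm")
```

Adjacent orders m are spaced by roughly λ²/(2nd cos θ) in wavelength, so only a small tuning range around the nominal wavelength is required.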
λ0 at scan angle θm+1 such that the term 2nd cos θm+1 is equal to mλ0, wherein m is an integer. The wavelength λ at point 910e of
λ0 at scan angle θm+2 such that the term
2nd cos θm+2 is equal to (m+1)λ0, wherein m is an integer. In
λ0 is substantially equal to
λ0. Only some of the possible values of the wavelength λ of the incident light at which 2nd cos θ is an integral multiple of the wavelength λ are depicted in
As discussed above, the variation of the wavelength λ of the incident light can be synchronized with the variation in the scan angle θ. The variation of the wavelength λ of the incident light can be periodic as depicted in
As discussed above, changes in optical wavelength can also produce changes in the refractive index. Accordingly, in various embodiments, the change in the wavelength λ of light that is incident on the waveguide can be correlated to the change in the refractive index ‘n’. For example, a controller configured to vary the wavelength λ of incident light can be configured to take into consideration the change in the refractive index of the material of the waveguide. In various embodiments, the controller can include a feedback loop that is configured to dynamically calculate a change in the wavelength λ of the incident light based on the change in the refractive index Δn of the waveguide such that phase synchronization between the various light beams of the beamlet array for different scanned angles of the input light beam can be achieved.
In general, for dynamic phase synchronization, the angular spacing, in radians, between the angles that meet the phase synchronization condition is approximately equal to the light's wavelength divided by the width of the waveguide. For waveguide widths and beam diameters of approximately 100 to 1000 microns, the angular shift can be between about 0.001 and 0.01 radians (or a percentage change of about 0.1% to 1%). To maintain phase synchronization, the angular shift can be compensated by decreasing the waveguide's index of refraction by an amount in the range between about 0.001 and about 0.01; by increasing or decreasing the spacing between the reflective surfaces (or the width of the waveguide) by approximately 1 micron; or by increasing or decreasing the wavelength of the incident light by about 1 to about 10 nm.
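These magnitudes can be checked with a short order-of-magnitude calculation (illustrative only; the choice of 520 nm for visible light is an assumption):

```python
# Angular spacing between phase-synchronized angles is roughly
# lambda / width, per the estimate above.
wavelength = 520e-9
for width_um in (100, 300, 1000):
    width = width_um * 1e-6
    delta_theta = wavelength / width  # angular grid spacing, radians
    print(f"width = {width_um:5d} um: delta_theta ~ {delta_theta:.2e} rad")
```

For visible wavelengths and widths of 100 to 1000 microns this gives angular spacings on the order of 0.0005 to 0.005 radians, consistent with the approximate range quoted above.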
2. Non-Dynamic Phase Synchronization
Phase synchronization can also be achieved without using any of the dynamic approaches discussed above. For example, the waveguide can comprise a plurality of holographic structures, each of the plurality of holographic structures providing a phase synchronized output for each incident angle. Accordingly, a phase synchronized output can be obtained as the incident angle of the input beam varies without actively co-modulating the spacing between the reflective surfaces of the waveguide, the refractive index of the waveguide or the wavelength of the incident light at the scan rate of the input beam.
A first of the plurality of holographic structures that provides a phase synchronized output for a first incident angle can be recorded on a thick holographic medium by interfering a first reference beam incident on the holographic medium from a first side of the holographic medium at the first incident angle and a second reference beam incident on the holographic medium from a second side of the holographic medium opposite the first side. The first reference beam can be configured to have the characteristics of the light beam output from a scanning projector. For example, the first reference beam can be collimated in some embodiments. The first reference beam can have a beam diameter of less than or equal to about 100 microns (e.g., less than or equal to 90 microns, less than or equal to 80 microns, less than or equal to 70 microns, less than or equal to 60 microns, less than or equal to 50 microns, less than or equal to 40 microns, less than or equal to 30 microns, less than or equal to 25 microns, less than or equal to 20 microns, or values therebetween). The second reference beam can be configured to have the characteristics of the phase synchronized beamlet array that is output from the waveguide when the first reference beam is incident on the waveguide at the first incident angle. For example, the second reference beam can be a collimated beam having a continuous wavefront with a uniform phase similar to the beamlet array depicted in
Multiple holographic structures are recorded on the same holographic medium by varying the incidence angle of the first reference beam. For example, the incidence angle of the first reference beam can be continuously varied between about ±30-degrees. As another example, the incidence angle of the first reference beam can be varied between about ±30-degrees in discrete steps that are less than or equal to about 1 degree (e.g., less than or equal to 0.9 degrees, less than or equal to 0.8 degrees, less than or equal to 0.7 degrees, less than or equal to 0.6 degrees, less than or equal to 0.5 degrees, less than or equal to 0.4 degrees, less than or equal to 0.3 degrees, less than or equal to 0.2 degrees, less than or equal to 0.1 degrees, less than or equal to 0.05 degrees, or values therebetween). The angle of incidence of the second reference beam can also be varied corresponding to the variation of the incidence angle of the first reference beam.
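The discrete stepping of the recording angle can be sketched as follows (illustrative only; the function name is an assumption, and the 0.5-degree step is one example value within the stated ≤1-degree range):

```python
def recording_angles(max_deg=30.0, step_deg=0.5):
    # Discrete incidence angles for the first reference beam, spanning
    # -max_deg .. +max_deg in uniform steps of step_deg.
    count = int(round(2 * max_deg / step_deg)) + 1
    return [-max_deg + k * step_deg for k in range(count)]

angles = recording_angles()
print(len(angles), angles[0], angles[-1])  # 121 -30.0 30.0
```

One holographic structure would be recorded per angle in this list, with the second reference beam's angle varied correspondingly.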
Accordingly, at least one holographic structure is recorded on the holographic medium for each combination of the angle of incidence of the first reference beam and the angle of incidence of the second reference beam. The waveguide comprising a plurality of holographic structures that are recorded in this manner can be configured to output a phase synchronized beamlet array for an input beam incident at the different angles θ within the solid angle Θ swept by the scanning projector. Furthermore, the diameter of the output beamlet array can be greater than the diameter of the input beam. In such embodiments, angular selectivity is built into the waveguide such that it is not necessary to dynamically synchronize the phase between the various beams of the beamlet array as the angle of incidence of the input light is varied. Thus, in such embodiments, one or more parameters of the waveguide (e.g., refractive index, spacing between the reflective surfaces of the waveguide) and/or the wavelength λ of the incident light need not be varied at the frequency of the scan rate to achieve phase synchronization between the various light beams of the beamlet array output from the waveguide.
Each of the processes, methods, and algorithms described herein and/or depicted in the attached figures may be embodied in, and fully or partially automated by, code modules executed by one or more physical computing systems, hardware computer processors, application-specific circuitry, and/or electronic hardware configured to execute specific and particular computer instructions. For example, computing systems can include general purpose computers (e.g., servers) programmed with specific computer instructions or special purpose computers, special purpose circuitry, and so forth. A code module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language. In some implementations, particular operations and methods may be performed by circuitry that is specific to a given function.
Further, certain implementations of the functionality of the present disclosure are sufficiently mathematically, computationally, or technically complex that application-specific hardware or one or more physical computing devices (utilizing appropriate specialized executable instructions) may be necessary to perform the functionality, for example, due to the volume or complexity of the calculations involved or to provide results substantially in real-time. For example, a video may include many frames, with each frame having millions of pixels, and specifically programmed computer hardware is necessary to process the video data to provide a desired image processing task or application in a commercially reasonable amount of time.
Code modules or any type of data may be stored on any type of non-transitory computer-readable medium, such as physical computer storage including hard drives, solid state memory, random access memory (RAM), read only memory (ROM), optical disc, volatile or non-volatile storage, combinations of the same and/or the like. The methods and modules (or data) may also be transmitted as generated data signals (e.g., as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (e.g., as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). The results of the disclosed processes or process steps may be stored, persistently or otherwise, in any type of non-transitory, tangible computer storage or may be communicated via a computer-readable transmission medium.
Any processes, blocks, states, steps, or functionalities in flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing code modules, segments, or portions of code which include one or more executable instructions for implementing specific functions (e.g., logical or arithmetical) or steps in the process. The various processes, blocks, states, steps, or functionalities can be combined, rearranged, added to, deleted from, modified, or otherwise changed from the illustrative examples provided herein. In some embodiments, additional or different computing systems or code modules may perform some or all of the functionalities described herein. The methods and processes described herein are also not limited to any particular sequence, and the blocks, steps, or states relating thereto can be performed in other sequences that are appropriate, for example, in serial, in parallel, or in some other manner. Tasks or events may be added to or removed from the disclosed example embodiments. Moreover, the separation of various system components in the implementations described herein is for illustrative purposes and should not be understood as requiring such separation in all implementations. It should be understood that the described program components, methods, and systems can generally be integrated together in a single computer product or packaged into multiple computer products. Many implementation variations are possible.
The processes, methods, and systems may be implemented in a network (or distributed) computing environment. Network environments include enterprise-wide computer networks, intranets, local area networks (LAN), wide area networks (WAN), personal area networks (PAN), cloud computing networks, crowd-sourced computing networks, the Internet, and the World Wide Web. The network may be a wired or a wireless network or any other type of communication network.
The systems and methods of the disclosure each have several innovative aspects, no single one of which is solely responsible or required for the desirable attributes disclosed herein. The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. Various modifications to the implementations described in this disclosure may be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of this disclosure. Thus, the claims are not intended to be limited to the implementations shown herein, but are to be accorded the widest scope consistent with this disclosure, the principles and the novel features disclosed herein.
Certain features that are described in this specification in the context of separate implementations also can be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation also can be implemented in multiple implementations separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination. No single feature or group of features is necessary or indispensable to each and every embodiment.
Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. In addition, the articles “a,” “an,” and “the” as used in this application and the appended claims are to be construed to mean “one or more” or “at least one” unless specified otherwise.
As used herein, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: A, B, or C” is intended to cover: A, B, C, A and B, A and C, B and C, and A, B, and C. Conjunctive language such as the phrase “at least one of X, Y and Z,” unless specifically stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc. may be at least one of X, Y or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y and at least one of Z to each be present.
Similarly, while operations may be depicted in the drawings in a particular order, it is to be recognized that such operations need not be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Further, the drawings may schematically depict one or more example processes in the form of a flowchart. However, other operations that are not depicted can be incorporated in the example methods and processes that are schematically illustrated. For example, one or more additional operations can be performed before, after, simultaneously, or between any of the illustrated operations. Additionally, the operations may be rearranged or reordered in other implementations. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products. Additionally, other implementations are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results.
This application is a continuation of U.S. application Ser. No. 16/672,175 filed on Nov. 1, 2019 entitled “SYSTEMS AND METHODS FOR OPTICAL SYSTEMS WITH EXIT PUPIL EXPANDER,” which is a continuation of U.S. application Ser. No. 15/710,055 filed on Sep. 20, 2017 entitled “SYSTEMS AND METHODS FOR OPTICAL SYSTEMS WITH EXIT PUPIL EXPANDER,” which claims the priority benefit of U.S. Provisional Patent Application No. 62/397,759 filed on Sep. 21, 2016 entitled “SYSTEMS AND METHODS FOR OPTICAL SYSTEMS WITH EXIT PUPIL EXPANDER.” The applications recited above are each incorporated by reference herein in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
6850221 | Tickle | Feb 2005 | B1 |
D514570 | Ohta | Feb 2006 | S |
8760762 | Kelly et al. | Jun 2014 | B1 |
9081426 | Armstrong | Jul 2015 | B2 |
9215293 | Miller | Dec 2015 | B2 |
D752529 | Loretan et al. | Mar 2016 | S |
9348143 | Gao et al. | May 2016 | B2 |
D759657 | Kujawski et al. | Jul 2016 | S |
9417452 | Schowengerdt et al. | Aug 2016 | B2 |
9470906 | Kaji et al. | Oct 2016 | B2 |
9547174 | Gao et al. | Jan 2017 | B2 |
9671566 | Abovitz et al. | Jun 2017 | B2 |
9740006 | Gao | Aug 2017 | B2 |
9791700 | Schowengerdt et al. | Oct 2017 | B2 |
9851563 | Gao et al. | Dec 2017 | B2 |
9857591 | Welch et al. | Jan 2018 | B2 |
9874749 | Bradski | Jan 2018 | B2 |
10481399 | Macnamara | Nov 2019 | B2 |
11119323 | Macnamara | Sep 2021 | B2 |
20060028436 | Armstrong | Feb 2006 | A1 |
20070081123 | Lewis | Apr 2007 | A1 |
20110075963 | Choi et al. | Mar 2011 | A1 |
20120127062 | Bar-Zeev et al. | May 2012 | A1 |
20120162549 | Gao et al. | Jun 2012 | A1 |
20130082922 | Miller | Apr 2013 | A1 |
20130117377 | Miller | May 2013 | A1 |
20130125027 | Abovitz | May 2013 | A1 |
20130208234 | Lewis | Aug 2013 | A1 |
20130242262 | Lewis | Sep 2013 | A1 |
20140071539 | Gao | Mar 2014 | A1 |
20140177023 | Gao et al. | Jun 2014 | A1 |
20140218468 | Gao et al. | Aug 2014 | A1 |
20140267420 | Schowengerdt | Sep 2014 | A1 |
20140306866 | Miller et al. | Oct 2014 | A1 |
20150016777 | Abovitz et al. | Jan 2015 | A1 |
20150103306 | Kaji et al. | Apr 2015 | A1 |
20150178939 | Bradski et al. | Jun 2015 | A1 |
20150205126 | Schowengerdt | Jul 2015 | A1 |
20150222883 | Welch | Aug 2015 | A1 |
20150222884 | Cheng | Aug 2015 | A1 |
20150268415 | Schowengerdt et al. | Sep 2015 | A1 |
20150302652 | Miller et al. | Oct 2015 | A1 |
20150309263 | Abovitz et al. | Oct 2015 | A2 |
20150326570 | Publicover et al. | Nov 2015 | A1 |
20150346490 | TeKolste et al. | Dec 2015 | A1 |
20150346495 | Welch et al. | Dec 2015 | A1 |
20160011419 | Gao | Jan 2016 | A1 |
20160026253 | Bradski et al. | Jan 2016 | A1 |
20160103324 | Arakawa et al. | Apr 2016 | A1 |
20160231568 | Saarikko et al. | Aug 2016 | A1 |
20170176835 | Gupta | Jun 2017 | A1 |
20170184897 | Rho | Jun 2017 | A1 |
20170331071 | Han | Nov 2017 | A1 |
20170364194 | Jang | Dec 2017 | A1 |
20180120566 | Macnamara | May 2018 | A1 |
20200142197 | Macnamara | May 2020 | A1 |
Number | Date | Country |
---|---|---|
102654590 | Sep 2012 | CN |
104049926 | Sep 2014 | CN |
104169749 | Nov 2014 | CN |
105700143 | Jun 2016 | CN |
105842843 | Aug 2016 | CN |
105934902 | Sep 2016 | CN |
WO 2013188464 | Dec 2013 | WO |
WO 2015081313 | Jun 2015 | WO |
WO 2016105285 | Jun 2016 | WO |
WO 2018057528 | Mar 2018 | WO |
Entry |
---|
International Search Report and Written Opinion for PCT Application No. PCT/US2017/52314, dated Nov. 29, 2017. |
International Preliminary Report on Patentability for PCT Application No. PCT/US2017/52314, dated Mar. 26, 2019. |
ARToolKit: https://web.archive.org/web/20051013062315/http://www.hitl.washington.edu:80/artoolkit/documentation/hardware.htm, archived Oct. 13, 2005. |
Azuma, “A Survey of Augmented Reality,” Teleoperators and Virtual Environments 6, 4 (Aug. 1997), pp. 355-385. https://web.archive.org/web/20010604100006/http://www.cs.unc.edu/˜azuma/ARpresence.pdf. |
Azuma, “Predictive Tracking for Augmented Realty,” TR95-007, Department of Computer Science, UNC-Chapel Hill, NC, Feb. 1995. |
Bimber, et al., “Spatial Augmented Reality—Merging Real and Virtual Worlds,” 2005 https://web.media.mit.edu/˜raskar/book/BimberRaskarAugmentedRealityBook.pdf. |
Jacob, “Eye Tracking in Advanced Interface Design,” Human-Computer Interaction Lab Naval Research Laboratory, Washington, D.C. / paper/ in Virtual Environments and Advanced Interface Design, ed. by W. Barfield and T.A. Furness, pp. 258-288, Oxford University Press, New York (1995). |
Tanriverdi and Jacob, “Interacting With Eye Movements in Virtual Environments,” Department of Electrical Engineering and Computer Science, Tufts University, Medford, MA—paper/Proc. ACM CHI 2000 Human Factors in Computing Systems Conference, pp. 265-272, Addison-Wesley/ACM Press (2000). |
Number | Date | Country | |
---|---|---|---|
20220082839 A1 | Mar 2022 | US |
Number | Date | Country | |
---|---|---|---|
62397759 | Sep 2016 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16672175 | Nov 2019 | US |
Child | 17473755 | US | |
Parent | 15710055 | Sep 2017 | US |
Child | 16672175 | US |