The present disclosure relates to optical elements such as waveguides for eyepieces that can be included in virtual reality and augmented reality imaging and visualization systems, and to methods of fabricating such optical elements. Various waveguide designs, for example, are configured to redirect UV laser light used to cut the waveguides from sheets of material to reduce the amount of such UV laser light that is coupled into the waveguide during fabrication and that would otherwise degrade the optical quality of the waveguide.
Modern computing and display technologies have facilitated the development of systems for so called “virtual reality” or “augmented reality” experiences, wherein digitally reproduced images or portions thereof are presented to a user in a manner wherein they seem to be, or may be perceived as, real. A virtual reality, or “VR”, scenario typically involves presentation of digital or virtual image information without transparency to other actual real-world visual input; an augmented reality, or “AR”, scenario typically involves presentation of digital or virtual image information as an augmentation to visualization of the actual world around the user. An advanced augmented reality scenario can involve virtual objects that are integrated into, and responsive to, the natural world. For example, some AR content may be blocked by or otherwise be perceived as interacting with objects in the real world.
Systems and methods disclosed herein address various challenges related to display technologies, including AR and VR technology. Some challenges in fabricating waveguides, for example, are addressed herein.
Methods of fabricating optical elements such as waveguides for eyepieces for AR and VR head mounted displays are disclosed. Some such optical elements (e.g., waveguides) are configured to include diffraction grating(s), diffractive optical element(s), or other surface feature(s) that direct light that is coupled into the optical element (e.g., waveguide) out of the optical element/waveguide. In some examples, this light is UV light that is employed to singulate the optical element or waveguide from a sheet of material. A number of examples of waveguides, eyepieces, and display systems that include waveguides and/or eyepieces, as well as methods of forming the same, are provided herein.
Example implementations described herein have several features, no single one of which is indispensable or solely responsible for their desirable attributes. A variety of example systems and methods are provided below.
Example 1: A display system configured to be disposed on a user's head and/or face so as to present images to a user's eye, said display system including:
Example 2: The display system of Example 1, wherein the plurality of surface features includes a diffraction grating.
Example 3: The display system of Example 1, wherein the plurality of surface features includes a blazed grating.
Example 4: The display system of any of the examples above, wherein at least one of said surface features has at least one sloping surface.
Example 5: The display system of any of the examples above, wherein at least one of said surface features has two sloping surfaces.
Example 6: The display system of any of the examples above, wherein a plurality of said surface features have a sawtooth shape.
Example 7: The display system of any of the examples above, wherein a plurality of said surface features have a triangular cross-section orthogonal to the portion of the edge closest thereto.
Example 8: The display system of any of the examples above, wherein a plurality of said surface features are asymmetric.
Example 9: The display system of any of Examples 1-7, wherein a plurality of said surface features are symmetric.
Example 10: The display system of any of the examples above, wherein the plurality of surface features has an average height in a range from 10 to 500 nanometers (nm).
Example 11: The display system of any of the examples above, wherein the plurality of surface features has an average peak-to-peak spacing in a range from 100 to 500 nm.
Example 12: The display system of any of the examples above, wherein the plurality of surface features has an average full width at half maximum (FWHM) in a range from 75 to 250 nm.
Example 13: The display system of any of the examples above, wherein the plurality of surface features extends out, on one or both of the top and bottom major surfaces, no more than 2 mm from the edge.
Example 14: The display system of any of the examples above, wherein the plurality of surface features extends out, on one or both of the top and bottom major surfaces, no more than 1.8 mm from the edge.
Example 15: The display system of any of the examples above, wherein the plurality of surface features extends out, on one or both of the top and bottom major surfaces, no more than 1.6 mm from the edge.
Example 16: The display system of any of the examples above, wherein the plurality of surface features extends to within 0.2 mm from said edge.
Example 17: The display system of any of the examples above, wherein the plurality of surface features extends to within 0.1 mm from said edge.
Example 18: The display system of any of the examples above, wherein the plurality of surface features extends to within 0.05 mm from said edge.
Example 19: The display system of any of the examples above, further including a light absorbing material disposed on the edge and one or more of the plurality of surface features.
Example 20: The display system of Example 19, wherein the light absorbing material covers an area beyond portions of the top and bottom major surfaces having the plurality of surface features.
Example 21: The display system of any of Examples 1-18, wherein said plurality of surface features are not covered by dark material.
Example 22: The display system of any of Examples 1-18 or 21, wherein said plurality of surface features are not covered by black or grey material.
Example 23: The display system of any of Examples 1-18, 21 or 22, wherein said plurality of surface features are not covered by opaque material.
Example 24: The display system of any of the examples above, wherein the waveguide is part of a stack of waveguides.
Example 25: The display system of any of the examples above, wherein the eyepiece is disposed on a frame configured to be supported on the head or face of the user.
Example 26: The display system of any of the examples above, wherein the eyepiece forms part of eyewear configured to be worn by the user.
Example 27: The display system of any of the examples above, wherein the eyepiece is transparent such that the user can see through the eyepiece to view the environment in front of the user and the eyepiece.
Example 28: A method of forming said display system of any of the examples above, including forming said eyepiece by using a laser beam to cut a waveguide from a sheet of material having said plurality of surface features formed therein.
Example 29: A method of forming said display system of any of the examples above, wherein said laser beam cuts along an edge of said plurality of surface features.
Example 30: A method of forming said display system of any of Examples 1-28, wherein said laser beam cuts along a path surrounded on both sides by said plurality of surface features.
Example 31: A method of forming said display system of any of the Examples 1-28, wherein said laser beam cuts along a path surrounding said plurality of surface features such that said plurality of surface features are disposed between said path and said main region.
Example 32: A display system configured to be disposed on a user's head and/or face so as to present images to a user's eye, said display system including:
Example 33: The display system of Example 32, wherein said surface features are not covered by dark material.
Example 34: The display system of Example 32 or 33, wherein said surface features are not covered by black or grey material.
Example 35: The display system of any of Examples 32-34, wherein said surface features are not covered by opaque material.
Example 36: A display system configured to be disposed on a user's head and/or face so as to present images to a user's eye, said display system including:
Example 37: The display system of Example 36, wherein at least one of said plurality of surface features within said peripheral region of said waveguide is (i) configured to couple light propagating in said waveguide from said main region toward said peripheral region out of said waveguide out from said peripheral region and (ii) configured to couple light propagating in said waveguide from said edge toward said main region out of said waveguide out from said peripheral region.
Example 38: The display system of Example 36 or 37, wherein surface features further from said edge have steeper sloping surfaces than surface features closer to said edge.
Example 39: A display system configured to be disposed on a user's head and/or face so as to present images to a user's eye, said display system including:
Example 40: A display system configured to be disposed on a user's head and/or face so as to present images to a user's eye, said display system including:
Example 41: The display system of Example 40, wherein said second plurality of surface features have a cross-sectional shape that is a mirror image of that of said first plurality of surface features.
Example 42: The display system of Example 40 or 41, wherein surface features further from said edge in said first plurality of surface features have steeper sloping surfaces than surface features in said first plurality of surface features closer to said edge.
Example 43: The display system of any of Examples 40-42, wherein surface features further from said edge in said second plurality of surface features have steeper sloping surfaces than surface features in said second plurality of surface features closer to said edge.
Example 44: The display system of any of Examples 40-43, wherein said first and second plurality of surface features have a sawtooth shape.
Example 45: The display system of any of Examples 40-44, wherein said first and second plurality of surface features have a triangular cross-section orthogonal to said portion of said edge closest thereto.
Example 46: The display system of any of Examples 40-45, wherein said first plurality of surface features slope more in a first direction and said second plurality of surface features slope more in a second direction different from the first direction.
Example 47: The display system of any of Examples 40-46, wherein said first plurality of surface features are asymmetric.
Example 48: The display system of any of Examples 40-47, wherein said second plurality of surface features are asymmetric.
Example 49: A method of forming a waveguide for an eyepiece for a display system, said method including:
Example 50: The method of Example 49, further including forming said plurality of surface features in said substrate.
Example 51: The method of Example 49, further including forming said plurality of surface features in said substrate using nano-imprinting.
Example 52: The method of any of Examples 49-51, wherein the plurality of surface features is configured such that at least some of said laser light that is coupled into said waveguide is directed out of said waveguide by said plurality of surface features.
Example 53: The method of any of Examples 49-52, wherein the plurality of surface features includes a diffraction grating.
Example 54: The method of any of Examples 49-53, wherein the plurality of surface features includes a blazed grating.
Example 55: The method of any of Examples 49-54, wherein a plurality of said surface features have at least one sloping surface.
Example 56: The method of any of Examples 49-55, wherein a plurality of said surface features have two sloping surfaces.
Example 57: The method of any of Examples 49-56, wherein a plurality of said surface features have a sawtooth shape.
Example 58: The method of any of Examples 49-57, wherein a plurality of said surface features have a triangular cross-section.
Example 59: The method of any of Examples 49-58, wherein a plurality of surface features are asymmetric.
Example 60: The method of any of the Examples 49-58, wherein a plurality of surface features are symmetric.
Example 61: The method of any of Examples 49-60, wherein the plurality of surface features have a height in a range from 10 to 500 nanometers (nm).
Example 62: The method of any of the Examples 49-61, wherein the plurality of surface features have an average peak-to-peak spacing in a range from 100 to 500 nm.
Example 63: The method of any of the Examples 49-62, wherein the plurality of surface features have a full width at half maximum (FWHM) in a range from 75 to 250 nm.
Example 64: The method of any of the Examples 49-63, wherein the substrate is cut by the laser such that the plurality of surface features extends out, on one or both of the top and bottom major surfaces, no more than 2 mm from the path cut by the laser.
Example 65: The method of any of the Examples 49-64, wherein the substrate is cut by the laser such that the plurality of surface features extends out, on one or both of the top and bottom major surfaces, no more than 1.8 mm from the path cut by the laser.
Example 66: The method of any of the Examples 49-65, wherein the substrate is cut by the laser such that the plurality of surface features extends out, on one or both of the top and bottom major surfaces, no more than 1.6 mm from the path cut by the laser.
Example 67: The method of any of the Examples 49-66, wherein the substrate is cut by the laser such that the plurality of surface features extends to at least within 0.2 mm from said path cut by the laser.
Example 68: The method of any of the Examples 49-67, wherein the substrate is cut by the laser such that the plurality of surface features extends to at least within 0.1 mm from said path cut by the laser.
Example 69: The method of any of the Examples 49-68, wherein the substrate is cut by the laser such that the plurality of surface features extends to at least within 0.05 mm from said path cut by the laser.
Example 70: The method of any of the Examples 49-69, further including depositing light absorbing material on the edge and one or more of the plurality of surface features.
Example 71: The method of Example 70, further including covering an area beyond portions of the top and bottom major surfaces having the plurality of surface features with the light absorbing material.
Example 72: The method of any of the Examples 49-69, wherein said plurality of surface features are not covered by dark material.
Example 73: The method of any of the Examples 49-69, wherein said plurality of surface features are not covered by black or grey material.
Example 74: The method of any of the Examples 49-69, wherein said plurality of surface features are not covered by opaque material.
Example 75: The method of any of the Examples 49-74, wherein the waveguide is included as part of a stack of waveguides.
Example 76: The method of any of the Examples 49-75, wherein the eyepiece is disposed on a frame configured to be supported on the head or face of the user.
Example 77: The method of any of the Examples 49-76, wherein the eyepiece is included in eyewear configured to be worn by the user.
Example 78: The method of any of the Examples 49-77, wherein the eyepiece is transparent such that the user can see through the eyepiece to view the environment in front of the user and the eyepiece.
Example 79: The method of any of the Examples 49-78, wherein said laser beam cuts along an edge of said plurality of surface features.
Example 80: The method of forming said display system of any of the Examples 49-78, wherein said laser beam cuts along a path surrounded on both sides by a plurality of said surface features.
Example 81: The method of any of the Examples 49-78, wherein said laser beam cuts along a path surrounding said plurality of surface features such that said plurality of surface features are disposed between said path and said main region.
Example 82: The display system of any of Examples 32-35, 36-38, 39 or 40-48, wherein the plurality of surface features includes a diffraction grating.
Example 83: The display system of any of Examples 32-35, 36-38, 39 or 40-48, wherein the plurality of surface features includes a blazed grating.
Example 84: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-83, wherein at least one of said surface features has at least one sloping surface.
Example 85: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-84, wherein at least one of said surface features has two sloping surfaces.
Example 86: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-85, wherein a plurality of said surface features have a sawtooth shape.
Example 87: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-86, wherein a plurality of said surface features have a triangular cross-section orthogonal to the portion of the edge closest thereto.
Example 88: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-87, wherein a plurality of said surface features are asymmetric.
Example 89: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82, wherein a plurality of said surface features are symmetric.
Example 90: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-89, wherein the surface features have an average height in a range from 10 to 500 nanometers (nm).
Example 91: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-90, wherein the plurality of surface features have an average peak-to-peak spacing in a range from 100 to 500 nm.
Example 92: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-91, wherein the plurality of surface features have an average full width at half maximum (FWHM) in a range from 75 to 250 nm.
Example 93: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-92, wherein the plurality of surface features extends out, on one or both of the top and bottom major surfaces, no more than 2 mm from the edge.
Example 94: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-93, wherein the plurality of surface features extends out, on one or both of the top and bottom major surfaces, no more than 1.8 mm from the edge.
Example 95: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-94, wherein the plurality of surface features extends out, on one or both of the top and bottom major surfaces, no more than 1.6 mm from the edge.
Example 96: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-95, wherein the plurality of surface features extends to within 0.2 mm from said edge.
Example 97: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-96, wherein the plurality of surface features extends to within 0.1 mm from said edge.
Example 98: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-97, wherein the plurality of surface features extends to within 0.05 mm from said edge.
Example 99: The display system of any of Examples 36-38, 39, 40-48, or 82-98, further including a light absorbing material disposed on the edge and one or more of the plurality of surface features.
Example 100: The display system of Example 99, wherein the light absorbing material covers an area beyond portions of the top and bottom major surfaces having the plurality of surface features.
Example 101: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-98, wherein said plurality of surface features are not covered by dark material.
Example 102: The display system of any of Examples 32-35, 36-38, 39, 40-48, 82-98 or 101, wherein said plurality of surface features are not covered by black or grey material.
Example 103: The display system of any of Examples 32-35, 36-38, 39, 40-48, 82-98, 101 or 102, wherein said plurality of surface features are not covered by opaque material.
Example 104: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-103, wherein the waveguide is part of a stack of waveguides.
Example 105: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-104, wherein the eyepiece is disposed on a frame configured to be supported on the head or face of the user.
Example 106: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-105, wherein the eyepiece forms part of eyewear configured to be worn by the user.
Example 107: The display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-106, wherein the eyepiece is transparent such that the user can see through the eyepiece to view the environment in front of the user and the eyepiece.
Example 108: The method of forming said display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-107, including forming said eyepiece by using a laser beam to cut a waveguide from a sheet of material having said plurality of surface features formed therein.
Example 109: The method of forming said display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-108, wherein said laser beam cuts along an edge of said plurality of surface features.
Example 110: The method of forming said display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-108, wherein said laser beam cuts along a path surrounded on both sides by said plurality of surface features.
Example 111: The method of forming said display system of any of Examples 32-35, 36-38, 39, 40-48, or 82-108, wherein said laser beam cuts along a path surrounding said plurality of surface features such that said plurality of surface features are disposed between said path and said main region.
These and other features will now be described with reference to the drawings summarized above. The drawings and the associated descriptions are provided to illustrate implementations and not to limit the scope of the disclosure or claims. Throughout the drawings, reference numbers may be reused to indicate correspondence between referenced elements. In addition, where applicable, the first one or two digits of a reference numeral for an element can frequently indicate the figure number in which the element first appears.
Like reference numbers and designations in the various drawings indicate like elements throughout.
The perception of an image as being “three-dimensional” or “3-D” may be achieved by providing slightly different presentations of the image to each eye of the viewer.
It will be appreciated, however, that the human visual system is more complicated and providing a realistic perception of depth is more challenging. For example, many viewers of conventional “3-D” display systems find such systems to be uncomfortable or may not perceive a sense of depth at all. Without being limited by theory, it is believed that viewers of an object may perceive the object as being “three-dimensional” due to a combination of vergence and accommodation. Vergence movements (i.e., rotation of the eyes so that the pupils move toward or away from each other to converge the lines of sight of the eyes to fixate upon an object) of the two eyes relative to each other are closely associated with focusing (or “accommodation”) of the lenses and pupils of the eyes. Under normal conditions, changing the focus of the lenses of the eyes, or accommodating the eyes, to change focus from one object to another object at a different distance will automatically cause a matching change in vergence to the same distance, under a relationship known as the “accommodation-vergence reflex,” as well as pupil dilation or constriction. Likewise, a change in vergence will trigger a matching change in accommodation of lens shape and pupil size, under normal conditions. As noted herein, many stereoscopic or “3-D” display systems display a scene using slightly different presentations (and, so, slightly different images) to each eye such that a three-dimensional perspective is perceived by the human visual system. Such systems are uncomfortable for many viewers, however, since they, among other things, simply provide a different presentation of a scene, but with the eyes viewing all the image information at a single accommodated state, and work against the “accommodation-vergence reflex.” Display systems that provide a better match between accommodation and vergence may form more realistic and comfortable simulations of three-dimensional imagery contributing to increased duration of wear and in turn compliance to diagnostic and therapy protocols.
The distance between an object and the eye 210 or 220 may also change the amount of divergence of light from that object, as viewed by that eye.
Without being limited by theory, it is believed that the human eye typically can interpret a finite number of depth planes to provide depth perception. Consequently, a highly believable simulation of perceived depth may be achieved by providing, to the eye, different presentations of an image corresponding to each of these limited number of depth planes. The different presentations may be separately focused by the viewer's eyes, thereby helping to provide the user with depth cues based on the accommodation of the eye required to bring into focus different image features for the scene located on different depth planes and/or based on observing different image features on different depth planes being out of focus.
In some implementations, the image injection devices 360, 370, 380, 390, 400 are discrete displays that each produce image information for injection into a corresponding waveguide 270, 280, 290, 300, 310, respectively. In some other implementations, the image injection devices 360, 370, 380, 390, 400 are the output ends of a single multiplexed display which may, e.g., pipe image information via one or more optical conduits (such as fiber optic cables) to each of the image injection devices 360, 370, 380, 390, 400. It will be appreciated that the image information provided by the image injection devices 360, 370, 380, 390, 400 may include light of different wavelengths, or colors (e.g., different component colors, as discussed herein).
In some implementations, the light injected into the waveguides 270, 280, 290, 300, 310 is provided by a light projector system 520, which includes a light module 530, which may include a light emitter, such as a light emitting diode (LED). The light from the light module 530 may be directed to and modified by a light modulator 540, e.g., a spatial light modulator, via a beam splitter 550. The light modulator 540 may be configured to change the perceived intensity of the light injected into the waveguides 270, 280, 290, 300, 310. Examples of spatial light modulators include liquid crystal displays (LCDs), including liquid crystal on silicon (LCOS) displays.
In some implementations, the display system 250 may be a scanning fiber display including one or more scanning fibers configured to project light in various patterns (e.g., raster scan, spiral scan, Lissajous patterns, etc.) into one or more waveguides 270, 280, 290, 300, 310 and ultimately to the eye 210 of the viewer. In some implementations, the illustrated image injection devices 360, 370, 380, 390, 400 may schematically represent a single scanning fiber or a bundle of scanning fibers configured to inject light into one or a plurality of the waveguides 270, 280, 290, 300, 310. In some other implementations, the illustrated image injection devices 360, 370, 380, 390, 400 may schematically represent a plurality of scanning fibers or a plurality of bundles of scanning fibers, each of which are configured to inject light into an associated one of the waveguides 270, 280, 290, 300, 310. It will be appreciated that one or more optical fibers may be configured to transmit light from the light module 530 to the one or more waveguides 270, 280, 290, 300, 310. It will be appreciated that one or more intervening optical structures may be provided between the scanning fiber, or fibers, and the one or more waveguides 270, 280, 290, 300, 310 to, e.g., redirect light exiting the scanning fiber into the one or more waveguides 270, 280, 290, 300, 310.
A controller 560 controls the operation of the stacked waveguide assembly 260, including operation of the image injection devices 360, 370, 380, 390, 400, the light source 530, and the light modulator 540. In some implementations, the controller 560 is part of the local data processing module 140. The controller 560 includes programming (e.g., instructions in a non-transitory medium) that regulates the timing and provision of image information to the waveguides 270, 280, 290, 300, 310 according to, e.g., any of the various schemes disclosed herein. In some implementations, the controller may be a single integral device, or a distributed system connected by wired or wireless communication channels. The controller 560 may be part of the processing modules 140 or 150.
The other waveguide layers 300, 310 and lenses 330, 320 are similarly configured, with the highest waveguide 310 in the stack sending its output through all of the lenses between it and the eye for an aggregate focal power representative of the closest focal plane to the person. To compensate for the stack of lenses 320, 330, 340, 350 when viewing/interpreting light coming from the world 510 on the other side of the stacked waveguide assembly 260, a compensating lens layer 620 may be disposed at the top of the stack to compensate for the aggregate power of the lens stack 320, 330, 340, 350 below. Such a configuration provides as many perceived focal planes as there are available waveguide/lens pairings. Both the out-coupling optical elements of the waveguides and the focusing aspects of the lenses may be static (i.e., not dynamic or electro-active). In some alternative implementations, either or both may be dynamic using electro-active features.
In some implementations, two or more of the waveguides 270, 280, 290, 300, 310 may have the same associated depth plane. For example, multiple waveguides 270, 280, 290, 300, 310 may be configured to output images set to the same depth plane, or multiple subsets of the waveguides 270, 280, 290, 300, 310 may be configured to output images set to the same plurality of depth planes, with one set for each depth plane. This can provide advantages for forming a tiled image to provide an expanded field of view at those depth planes.
In some implementations, the out-coupling optical elements 570, 580, 590, 600, 610 are diffractive features that form a diffraction pattern, or “diffractive optical element” (also referred to herein as a “DOE”). Preferably, the DOEs have a sufficiently low diffraction efficiency so that only a portion of the light of the beam is deflected away toward the eye 210 with each intersection of the DOE, while the rest continues to move through a waveguide via TIR. The light carrying the image information is thus divided into a number of related exit beams that exit the waveguide at a multiplicity of locations and the result is a fairly uniform pattern of exit emission toward the eye 210 for this particular collimated beam bouncing around within a waveguide.
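By way of a rough illustration that is not part of this disclosure, and assuming for simplicity a constant diffraction efficiency \( \eta \) at each interaction of the guided beam with the DOE, the power remaining in the waveguide after the k-th interaction and the power out-coupled at that interaction are approximately

\[ P_{\text{remaining}}(k) = P_0\,(1-\eta)^{k}, \qquad P_{\text{out}}(k) = P_0\,\eta\,(1-\eta)^{k-1}, \]

where \( P_0 \) is the power initially coupled into the waveguide. For a low efficiency (e.g., \( \eta \) of a few percent), successive exit beams carry nearly equal power, which is consistent with the fairly uniform pattern of exit emission described above.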
In some implementations, one or more DOEs may be switchable between “on” states in which they actively diffract, and “off” states in which they do not significantly diffract. For instance, a switchable DOE may include a layer of polymer dispersed liquid crystal, in which microdroplets include a diffraction pattern in a host medium, and the refractive index of the microdroplets may be switched to substantially match the refractive index of the host material (in which case the pattern does not appreciably diffract incident light) or the microdroplet may be switched to an index that does not match that of the host medium (in which case the pattern actively diffracts incident light).
In some implementations, a camera assembly 630 (e.g., a digital camera, including visible light and infrared light cameras) may be provided to capture images of the eye 210 and/or tissue around the eye 210 to, e.g., detect user inputs and/or to monitor the physiological state of the user. As used herein, a camera may be any image capture device. In some implementations, the camera assembly 630 may include an image capture device and a light source to project light (e.g., infrared light) to the eye, which may then be reflected by the eye and detected by the image capture device. In some implementations, the camera assembly 630 may be attached to the frame 80.
In some implementations, a full color image may be formed at each depth plane by overlaying images in each of the component colors, e.g., three or more component colors.
In some implementations, light of each component color may be outputted by a single dedicated waveguide and, consequently, each depth plane may have multiple waveguides associated with it. In such implementations, each box in the figures including the letters G, R, or B may be understood to represent an individual waveguide, and three waveguides may be provided per depth plane where three component color images are provided per depth plane. While the waveguides associated with each depth plane are shown adjacent to one another in this drawing for ease of description, it will be appreciated that, in a physical device, the waveguides may all be arranged in a stack with one waveguide per level. In some other implementations, multiple component colors may be outputted by the same waveguide, such that, e.g., only a single waveguide may be provided per depth plane.
It will be appreciated that references to a given color of light throughout this disclosure will be understood to encompass light of one or more wavelengths within a range of wavelengths of light that are perceived by a viewer as being of that given color. For example, red light may include light of one or more wavelengths in the range of about 620-780 nm, green light may include light of one or more wavelengths in the range of about 492-577 nm, and blue light may include light of one or more wavelengths in the range of about 435-493 nm.
The illustrated set 660 of stacked waveguides includes waveguides 670, 680, and 690. Each waveguide includes an associated in-coupling optical element (which may also be referred to as a light input area on the waveguide), with, e.g., in-coupling optical element 700 disposed on a major surface (e.g., an upper major surface) of waveguide 670, in-coupling optical element 710 disposed on a major surface (e.g., an upper major surface) of waveguide 680, and in-coupling optical element 720 disposed on a major surface (e.g., an upper major surface) of waveguide 690. In some implementations, one or more of the in-coupling optical elements 700, 710, 720 may be disposed on the bottom major surface of the respective waveguide 670, 680, 690 (particularly where the one or more in-coupling optical elements are reflective, deflecting optical elements). As illustrated, the in-coupling optical elements 700, 710, 720 may be disposed on the upper major surface of their respective waveguide 670, 680, 690 (or the top of the next lower waveguide), particularly where those in-coupling optical elements are transmissive, deflecting optical elements. In some implementations, the in-coupling optical elements 700, 710, 720 may be disposed in the body of the respective waveguide 670, 680, 690. In some implementations, as discussed herein, the in-coupling optical elements 700, 710, 720 are wavelength selective, such that they selectively redirect one or more wavelengths of light, while transmitting other wavelengths of light. While illustrated on one side or corner of their respective waveguide 670, 680, 690, it will be appreciated that the in-coupling optical elements 700, 710, 720 may be disposed in other areas of their respective waveguide 670, 680, 690 in some implementations.
As illustrated, the in-coupling optical elements 700, 710, 720 may be laterally offset from one another. In some implementations, each in-coupling optical element may be offset such that it receives light without that light passing through another in-coupling optical element. For example, each in-coupling optical element 700, 710, 720 may be configured to receive light from a different one of the image injection devices 360, 370, 380, 390, and 400.
Each waveguide also includes associated light distributing elements, with, e.g., light distributing elements 730 disposed on a major surface (e.g., a top major surface) of waveguide 670, light distributing elements 740 disposed on a major surface (e.g., a top major surface) of waveguide 680, and light distributing elements 750 disposed on a major surface (e.g., a top major surface) of waveguide 690. In some other implementations, the light distributing elements 730, 740, 750 may be disposed on a bottom major surface of associated waveguides 670, 680, 690, respectively. In some other implementations, the light distributing elements 730, 740, 750 may be disposed on both top and bottom major surfaces of associated waveguides 670, 680, 690, respectively; or the light distributing elements 730, 740, 750 may be disposed on different ones of the top and bottom major surfaces in different associated waveguides 670, 680, 690, respectively.
The waveguides 670, 680, 690 may be spaced apart and separated by, e.g., gas, liquid, and/or solid layers of material. For example, as illustrated, layer 760a may separate waveguides 670 and 680; and layer 760b may separate waveguides 680 and 690. In some implementations, the layers 760a and 760b are formed of low refractive index materials (that is, materials having a lower refractive index than the material forming the immediately adjacent one of waveguides 670, 680, 690). Preferably, the refractive index of the material forming the layers 760a, 760b is 0.05 or more, or 0.10 or more, less than the refractive index of the material forming the waveguides 670, 680, 690. Advantageously, the lower refractive index layers 760a, 760b may function as cladding layers that facilitate total internal reflection (TIR) of light through the waveguides 670, 680, 690 (e.g., TIR between the top and bottom major surfaces of each waveguide). In some implementations, the layers 760a, 760b are formed of air. While not illustrated, it will be appreciated that the top and bottom of the illustrated set 660 of waveguides may include immediately neighboring cladding layers.
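As a numerical illustration only, with index values assumed for the sake of example rather than taken from this disclosure: for a waveguide of refractive index \( n_1 \approx 1.7 \) clad by layers 760a, 760b of index \( n_2 \approx 1.6 \) (an index difference of 0.10), the critical angle for total internal reflection at the waveguide-cladding interface is

\[ \theta_c = \arcsin\!\left(\frac{n_2}{n_1}\right) \approx \arcsin\!\left(\frac{1.6}{1.7}\right) \approx 70^\circ, \]

so light striking a major surface at an angle of incidence greater than about 70 degrees from the surface normal remains guided. Air cladding (\( n_2 = 1.0 \)) would give \( \theta_c \approx 36^\circ \), guiding a wider range of angles, which illustrates why lower index cladding layers facilitate TIR as noted above.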
Preferably, for ease of manufacturing and other considerations, the materials forming the waveguides 670, 680, 690 are similar or the same, and the materials forming the layers 760a, 760b are similar or the same. In some implementations, the material forming the waveguides 670, 680, 690 may be different between one or more waveguides, and/or the material forming the layers 760a, 760b may be different, while still holding to the various refractive index relationships noted above.
In some implementations, the light rays 770, 780, 790 have different properties, e.g., different wavelengths or different ranges of wavelengths, which may correspond to different colors. The in-coupling optical elements 700, 710, 720 each deflect the incident light such that the light propagates through a respective one of the waveguides 670, 680, 690 by TIR. In some implementations, the in-coupling optical elements 700, 710, 720 each selectively deflect one or more particular wavelengths of light, while transmitting other wavelengths to an underlying waveguide and associated in-coupling optical element.
For example, in-coupling optical element 700 may be configured to deflect ray 770, which has a first wavelength or range of wavelengths, while transmitting rays 780 and 790, which have different second and third wavelengths or ranges of wavelengths, respectively. The transmitted ray 780 impinges on and is deflected by the in-coupling optical element 710, which is configured to deflect light of a second wavelength or range of wavelengths. The ray 790 is deflected by the in-coupling optical element 720, which is configured to selectively deflect light of a third wavelength or range of wavelengths.
In some implementations, the light distributing elements 730, 740, 750 are orthogonal pupil expanders (OPEs). In some implementations, the OPEs deflect or distribute light to the out-coupling optical elements 800, 810, 820 and, in some implementations, may also increase the beam or spot size of this light as it propagates to the out-coupling optical elements. In some implementations, the light distributing elements 730, 740, 750 may be omitted and the in-coupling optical elements 700, 710, 720 may be configured to deflect light directly to the out-coupling optical elements 800, 810, 820.
As described above, the AR or VR display may include a plurality of layers, for example, a stack of waveguides with different waveguides for different color light and/or depth planes. These waveguides may be cut from a sheet of material using a laser beam such as a UV laser beam. However, during the fabrication process some of the laser light used for cutting may be inadvertently coupled into the waveguide being cut from the sheet. Unfortunately, this UV light coupled into the waveguide may degrade the optical properties of the waveguide. Accordingly, decreasing the amount of UV light that is coupled into the waveguide during fabrication, or more specifically during singulation, may reduce the optical degradation to the waveguide caused by the UV light.
In some implementations, the plurality of surface features 1010 extend out from said edge 1012 a distance of 1.0 mm to 3.0 mm. In various implementations, the plurality of surface features 1010 extend out from said edge 1012 no more than 1.0 mm, 1.1 mm, 1.2 mm, 1.3 mm, 1.4 mm, 1.5 mm, 1.6 mm, 1.7 mm, 1.8 mm, 1.9 mm, 2.0 mm, 2.1 mm, 2.2 mm, 2.3 mm, 2.4 mm, 2.5 mm, 2.6 mm, 2.7 mm, 2.8 mm, 2.9 mm, or 3.0 mm from said edge, or any range between any of these values, or possibly a longer or shorter distance. Likewise, in some implementations, the width of said plurality of surface features may be between 1.0 mm and 3.0 mm. In various implementations, the width of said plurality of surface features 1010 may be, for example, 1.0 mm, 1.1 mm, 1.2 mm, 1.3 mm, 1.4 mm, 1.5 mm, 1.6 mm, 1.7 mm, 1.8 mm, 1.9 mm, 2.0 mm, 2.1 mm, 2.2 mm, 2.3 mm, 2.4 mm, 2.5 mm, 2.6 mm, 2.7 mm, 2.8 mm, 2.9 mm, or 3.0 mm, or any range between any of these values, or possibly wider or narrower. Although 18-20 surface features are shown in the illustrated examples, the number of surface features may vary.
In some implementations, the surface features 1010 have a peak and a base. In various implementations, the plurality of surface features 1010 have a height or an average height in a range from 5 to 600 nanometers (nm). For example, the height or average height can be 5, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 150, 200, 250, 300, 350, 400, 450, 500, 600 nm or any range between any of these values or greater or less. In some implementations, for example, the height or average height may be from 5 to 600 nm, from 5 to 700 nm, from 5 to 500 nm, from 5 to 300 nm, from 10 to 700 nm, from 10 to 500 nm, from 10 to 400 nm, from 10 to 300 nm, or any range between any of these values or larger or smaller. In some implementations, the plurality of surface features 1010 have a peak-to-peak spacing or an average peak-to-peak spacing (P) in a range from 100 to 500 nm. For example, the peak-to-peak spacing or average peak-to-peak spacing can be 50, 100, 110, 120, 130, 140, 150, 160, 170, 180, 190, 200, 250, 300, 350, 400, 450, 500, 550, 600, 650, 700 nm or any range between any of these values or greater or less. In some implementations, for example, the plurality of surface features 1010 can have a peak-to-peak spacing or an average peak-to-peak spacing in a range from 50 to 700 nm, from 50 to 500 nm, from 50 to 300 nm, from 100 to 700 nm, from 100 to 600 nm, from 100 to 400 nm, from 100 to 300 nm, or any range between any of these values. Similarly, in some implementations, the plurality of surface features 1010 have a full width at half maximum (FWHM) or an average full width at half maximum (FWHM) in a range from 75 to 250 nm. In some implementations, the full width at half maximum (FWHM) or average full width at half maximum (FWHM) can be 50, 60, 75, 85, 95, 100, 110, 120, 130, 140, 150, 160, 170, 180, 190, 200, 210, 220, 230, 240, 250, 300, 350, 400 nm or any range between any of these values or greater or less. In some implementations, for example, the plurality of surface features 1010 have a full width at half maximum (FWHM) or an average full width at half maximum (FWHM) in a range from 50 to 400 nm, from 50 to 300 nm, from 50 to 200 nm, from 50 to 100 nm, from 75 to 400 nm, from 75 to 300 nm, from 75 to 200 nm, from 75 to 100 nm, or any range between any of these values.
The plurality of surface features may include a diffraction grating or diffractive optical element or holographic optical element. For example, the plurality of surface features may have a dimension (e.g., width, spacing, etc.) configured to diffract the desired wavelength of light such as UV light or possibly visible light. In some implementations, the diffraction grating includes a blazed grating.
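As an illustrative calculation only, with the wavelength, pitch, refractive index, and incidence angle assumed for the sake of example rather than specified in this disclosure: for light guided in a medium of index n and diffracted by a surface grating of period \( \Lambda \) into the m-th order remaining in the medium, the grating equation is

\[ n\sin\theta_m = n\sin\theta_i + m\,\frac{\lambda_0}{\Lambda}. \]

For instance, for UV laser light of free-space wavelength \( \lambda_0 \approx 355\ \text{nm} \) (a wavelength commonly used for UV laser processing), a pitch \( \Lambda = 300\ \text{nm} \), \( n \approx 1.7 \), and an internal angle of incidence \( \theta_i = 60^\circ \), the m = -1 order satisfies \( n\sin\theta_m \approx 1.7\sin 60^\circ - 355/300 \approx 0.29 \), giving \( \theta_m \approx 10^\circ \). This is well below the critical angle of a typical waveguide-air interface, so the diffracted light is no longer trapped by total internal reflection and can exit through the peripheral region.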
As discussed above, in some designs, the surface features 1010 may have surfaces 1020 oriented at an angle so as not to reflect the light by total internal reflection. For example, the surface 1020 may be oriented such that light guided in the waveguide is incident on the surface 1020 at an angle, measured from the normal to that surface, that is less than the critical angle, so that the light is transmitted out of the waveguide rather than totally internally reflected.
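As a simple geometric illustration, with all values assumed for the sake of example rather than taken from this disclosure: for a waveguide index of roughly 1.7 in air, the critical angle is

\[ \theta_c = \arcsin\!\left(\frac{1}{1.7}\right) \approx 36^\circ. \]

A ray guided at 60 degrees from the normal of the flat major surface would be totally internally reflected there, but if it instead strikes a facet 1020 tilted by about 30 degrees relative to that surface (with the tilt in the plane of incidence), it meets the facet at roughly 60 - 30 = 30 degrees, below \( \theta_c \), and is therefore at least partially transmitted out of the waveguide.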
In some implementations, the plurality of surface features 1010 include surface features having different shapes. For example, the plurality of surface features 1010 may include surface features having sloping surfaces 1020 oriented at different angles. In certain implementations, surface features farther from the edge 1012 have steeper sloping surfaces than surface features closer to the edge.
Accordingly, in various implementations, the surface features 1010 have one or more parameters that change, possibly progressively, with distance from the edge 1012. For example, the surface features 1010 may have one or more parameters that increase, e.g., progressively increase, with distance from the edge 1012. Alternatively, the surface features 1010 may have one or more parameters that decrease, e.g., progressively decrease, with distance from the edge 1012. In some implementations, the surface features 1010 have one or more parameters that increase and decrease, e.g., progressively increase and decrease, with distance from the edge 1012. In some implementations, the parameter may be the steepness of one of the surfaces 1020 on the same side of the surface feature, for example, the side farthest from the portion of the edge 1012 closest thereto or, alternatively, the side closest to that portion of the edge. Other parameters are possible. For example, the pitch or peak-to-peak spacing of the surface features can change (e.g., increase or decrease) with distance from the edge. Additionally, the parameter that changes (e.g., increases or decreases) with distance from the edge may be the height of the surface features. For example, for prismatic features and gratings including prismatic features, increasing height could potentially be more efficient at producing egress of light from the waveguide. In some implementations, the variation need not be progressive. In some implementations, the variation may change directions one or more times. In certain implementations, the variation may be progressive over short distances but change directions one or more times.
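The following short sketch is purely illustrative and is not part of this disclosure; the band width, pitch, heights, and rise fraction are assumed example values and the function name is hypothetical. It simply generates a sawtooth-like height profile for the peripheral band in which the facet slope increases progressively with distance from the cut edge, e.g., as a starting point for laying out a nano-imprint master.

import math
import numpy as np

def blazed_edge_profile(band_width_mm=2.0, pitch_nm=300.0,
                        height_near_nm=100.0, height_far_nm=300.0,
                        rise_fraction=0.9, samples_per_tooth=20):
    """Sawtooth height profile whose facet slope grows with distance from the edge.

    Returns x (nm, measured from the cut edge) and z (nm, height above the base).
    All parameter values are illustrative assumptions only.
    """
    band_width_nm = band_width_mm * 1e6
    n_teeth = int(band_width_nm // pitch_nm)
    x, z = [], []
    for i in range(n_teeth):
        x0 = i * pitch_nm                       # start of this tooth
        frac = i / max(n_teeth - 1, 1)          # 0 at the edge, 1 at the inner side
        h = height_near_nm + frac * (height_far_nm - height_near_nm)
        # long rising ramp over most of the pitch ...
        xs = x0 + np.linspace(0.0, rise_fraction * pitch_nm, samples_per_tooth)
        zs = np.linspace(0.0, h, samples_per_tooth)
        x.extend(xs)
        z.extend(zs)
        # ... followed by a steep drop back to the base
        x.append(x0 + pitch_nm)
        z.append(0.0)
    return np.asarray(x), np.asarray(z)

if __name__ == "__main__":
    x, z = blazed_edge_profile()
    print(f"teeth generated: {len(z) // 21}")
    for label, h in (("near edge", 100.0), ("far from edge", 300.0)):
        blaze = math.degrees(math.atan(h / (0.9 * 300.0)))
        print(f"{label}: facet height {h:.0f} nm, blaze angle ~ {blaze:.1f} deg")

The progressive increase in blaze angle toward the main region in this sketch mirrors Examples 38, 42, and 43 above, in which surface features farther from the edge have steeper sloping surfaces than surface features closer to the edge.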
As discussed above, the light coupled into the edge 1012 of the waveguide 1000, as represented by the rays 1018a, 1018b, 1018c, 1018d, may include laser light that is used to cut the waveguide from a sheet of material. This laser light may include UV light. In some situations, a portion of this laser light may couple into the waveguide 1000 and degrade one or more optical properties of the waveguide. Accordingly, the plurality of surface features 1010 that cause a portion of this light to exit the waveguide 1000 in the peripheral region 1008 of the waveguide may in some cases advantageously reduce the amount of degradation to the main region 1006 of the waveguide. Such optical degradation may possibly include, for example, increased haze and discoloration. Intense light may result in added heat and damage to the waveguide and the material forming the waveguide. Increased haze may be partially caused by vaporized material depositing back on the waveguide. Such haze may cause reduced optical transmission. Discoloration such as yellowing may be caused by heating the waveguide. Similarly, UV light, e.g., excessive UV light, may cause a polymer waveguide to turn yellow.
Accordingly, a method is disclosed herein directed to providing a plurality of surface features to redirect light, such as UV light coupled into the waveguide during the singulation process, out of the waveguide. A flow diagram for an example method is provided in the drawings.
In some cases, the plurality of surface features 1010 may be configured to address light propagating from a projector that is intended to be coupled out of waveguide 1000 by the out-coupling optical element 1004 to the eye but that fails to be directed out of the waveguide by the out-coupling optical element.
In various implementations, the light absorbing material 1030 includes material that substantially absorbs visible light. In some implementations, carbon black and/or a dye such as a black dye may be used and may be included in the light absorbing material 1030. For example, carbon black or dye (e.g., black dye) may be mixed with liquid polymer resin or industrial adhesive, which is UV cured to become solid. In other implementations, the carbon black or dye (e.g., black dye) is mixed with an evaporative solvent. Both are applied together, but in various implementations, primarily or only the carbon black remains on the edge of the waveguide. This light absorbing material 1030 may be coated or deposited on the plurality of surface features. In some implementations, the light absorbing material 1030 may be conformally disposed over the plurality of surface features. The light absorbing material 1030 may be directly on the plurality of surface features, or one or more layers of material may be between the light absorbing material and the surface features. The light absorbing material may be dark or black or grey or opaque or any combination of these. In some implementations, a seal is employed. Other types of opaque and/or absorbing structures, such as a sleeve, sheath, shield, or baffles, or any combination of these, may be used to reduce reflection of light back into the waveguide 1000 and/or output of light through the peripheral region that might be visible to the user or another person looking at the user or the eyewear.
In some designs, the plurality of surface features may be configured to couple out light containing image content intended for delivery to the user's eye that is not ejected by the out-coupling optical element 1004 as well as couple light out of the waveguide that is injected into the waveguide when the waveguide is cut from a sheet of material. Accordingly, in various designs, the plurality of surface features 1010 may be configured to couple light propagating in the waveguide 1000 from said main region 1006 toward the peripheral region 1008 out of the waveguide out from said peripheral region and to couple light propagating in the waveguide from said edge 1012 toward said main region out of said waveguide out from said peripheral region.
It is contemplated that various implementations may be implemented in or associated with a variety of applications such as waveguides, wave guide plates, other optical elements as well as imaging systems and devices, display systems and devices, etc. The structures, devices and methods described herein may particularly find use in displays such as wearable displays (e.g., head mounted displays) that may be used for augmented and/or virtual reality. More generally, the described implementations may be implemented in any device, apparatus, or system that may be configured to display an image, whether in motion (such as video) or stationary (such as still images), and whether textual, graphical or pictorial. It is contemplated, however, that the described implementations may be included in or associated with a variety of electronic devices such as, but not limited to: mobile telephones, multimedia Internet enabled cellular telephones, mobile television receivers, wireless devices, smartphones, Bluetooth® devices, personal data assistants (PDAs), wireless electronic mail receivers, hand-held or portable computers, netbooks, notebooks, smartbooks, tablets, printers, copiers, scanners, facsimile devices, global positioning system (GPS) receivers/navigators, cameras, digital media players (such as MP3 players), camcorders, game consoles, wrist watches, clocks, calculators, television monitors, flat panel displays, electronic reading devices (e.g., e-readers), computer monitors, auto displays (including odometer and speedometer displays, etc.), cockpit controls and/or displays, camera view displays (such as the display of a rear view camera in a vehicle), electronic photographs, electronic billboards or signs, projectors, architectural structures, microwaves, refrigerators, stereo systems, cassette recorders or players, DVD players, CD players, VCRs, radios, portable memory chips, washers, dryers, washer/dryers, parking meters, head mounted displays and a variety of imaging systems. Thus, the teachings are not intended to be limited to the implementations depicted solely in the Figures, but instead have wide applicability as will be readily apparent to one having ordinary skill in the art.
Various modifications to the implementations described in this disclosure may be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of this disclosure. Various changes may be made to the invention described and equivalents may be substituted without departing from the true spirit and scope of the invention. In addition, many modifications may be made to adapt a particular situation, material, composition of matter, process, process act(s) or step(s) to the objective(s), spirit or scope of the present invention. All such modifications are intended to be within the scope of claims associated with this disclosure.
The word “exemplary” is used exclusively herein to mean “serving as an example, instance, or illustration.” Any implementation described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations. Additionally, a person having ordinary skill in the art will readily appreciate that the terms “upper” and “lower”, “above” and “below”, etc., are sometimes used for ease of describing the figures, and indicate relative positions corresponding to the orientation of the figure on a properly oriented page, and may not reflect the orientation of the structures described herein, as those structures are implemented.
Various terms are used interchangeably within this description. Each of the terms is intended to have its customary ordinarily understood plain meaning in addition to the meanings described throughout this application. For example, the terms “recording beam”, “recording light beam”, and “recording beam of light” can be used interchangeably. Similarly, the terms “head mounted display” and “wearable display” can be used interchangeably. The terms “visible spectrum” or “visible wavelength range” may refer to wavelengths visible to the human eye (generally between 450 nanometers and 750 nanometers). The terms “infrared or IR spectrum” or “infrared or IR wavelength range” may refer to wavelengths used for IR imaging, thermal imaging, eye tracking, range finding and the like. The IR wavelength range may include the near IR wavelength range (generally between 750 nanometers and 2000 nanometers) and the mid-IR wavelength range (generally between 2000 nanometers and 6000 nanometers).
Certain features that are described in this specification in the context of separate implementations also may be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation also may be implemented in multiple implementations separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Further, the drawings may schematically depict one or more example processes in the form of a flow diagram. However, other operations that are not depicted may be incorporated in the example processes that are schematically illustrated. For example, one or more additional operations may be performed before, after, simultaneously, or between any of the illustrated operations. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems may generally be integrated together in a single software product or packaged into multiple software products. Additionally, other implementations are within the scope of the following claims. In some cases, the actions recited in the claims may be performed in a different order and still achieve desirable results.
Implementations include methods that may be performed using the subject devices. The methods may include the act of providing such a suitable device. Such provision may be performed by the end user. In other words, the “providing” act merely requires the end user obtain, access, approach, position, set-up, activate, power-up or otherwise act to provide the requisite device in the subject method. Methods recited herein may be carried out in any order of the recited events which is logically possible, as well as in the recited order of events.
In addition, while the invention has been described in reference to several examples optionally incorporating various features, the invention is not to be limited to that which is described or indicated as contemplated with respect to each variation of the invention. Various changes may be made to the invention described and equivalents (whether recited herein or not included for the sake of some brevity) may be substituted without departing from the true spirit and scope of the invention. In addition, where a range of values is provided, it is understood that every intervening value, between the upper and lower limit of that range and any other stated or intervening value in that stated range, is encompassed within the invention.
Also, it is contemplated that any optional feature of the inventive variations described may be set forth and claimed independently, or in combination with any one or more of the features described herein. Reference to a singular item includes the possibility that there are plural of the same items present. More specifically, as used herein and in claims associated hereto, the singular forms “a,” “an,” “said,” and “the” include plural referents unless specifically stated otherwise. In other words, use of the articles allows for “at least one” of the subject item in the description above as well as claims associated with this disclosure. It is further noted that such claims may be drafted to exclude any optional element. As such, this statement is intended to serve as antecedent basis for use of such exclusive terminology as “solely,” “only” and the like in connection with the recitation of claim elements, or use of a “negative” limitation.
Without the use of such exclusive terminology, the term “comprising” in claims associated with this disclosure shall allow for the inclusion of any additional element, irrespective of whether a given number of elements are enumerated in such claims, or whether the addition of a feature could be regarded as transforming the nature of an element set forth in such claims. Except as specifically defined herein, all technical and scientific terms used herein are to be given as broad a commonly understood meaning as possible while maintaining claim validity.
The breadth of the present invention is not to be limited to the examples provided and/or the subject specification, but rather only by the scope of claim language associated with this disclosure.
This application claims the benefit of U.S. Provisional Application No. 63/245,169, filed Sep. 16, 2021, the contents of which are incorporated by reference herein in their entirety.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2022/076504 | 9/15/2022 | WO |

Number | Date | Country
---|---|---
63/245,169 | Sep 2021 | US