Limited pupil (or “eye-box”) size is a challenge for stereo displays, such as binocular refractive optic-based stereo displays used for robotic surgery and telesurgery or for remote piloting of vehicles, for example. Virtual Reality (VR) solutions, which offer constrained user-to-pupil alignment by virtue of a worn display, are not viable for applications in which the user must intermittently look away from the display and toward the physical world around them. Multi-view approaches, as implemented with lenticular screens and raster barriers, force a trade-off between resolution and image quality due to optical artifacts that are difficult to eliminate within such displays. A solution to overcome such problems with existing display systems would be desirable.
The following presents a simplified summary of one or more aspects in order to provide a basic understanding of such aspects. This summary is not an extensive overview of all contemplated aspects and is intended to neither identify key or critical elements of all aspects nor delineate the scope of any or all aspects. Its purpose is to present some concepts of one or more aspects in a simplified form as a prelude to the more detailed description that is presented later.
The present disclosure describes a stereo viewing system including a first projector having a first microdisplay configured to emit spatially modulated light associated with a first image, and a first eyepiece substrate having a world side surface and a user side surface. The first eyepiece substrate includes a first input coupling element and a first output coupling element. The first input coupling element is configured to incouple the spatially modulated light into the first eyepiece substrate, and the first output coupling element is also configured to project at least a portion of the incoupled spatially modulated light out of the first eyepiece substrate toward the user side.
The present disclosure further describes a stereo viewing system having a first projector with a first microdisplay configured to emit spatially modulated light associated with a first image, a second microdisplay configured to emit spatially modulated light associated with a second image, and a first eyepiece stack having a world side and a user side. The first eyepiece stack includes at least a first and a second eyepiece substrate layer aligned along a viewing axis. The first eyepiece substrate layer includes a first input coupling element and a first output coupling element. The second eyepiece substrate layer includes a second input coupling element and a second output coupling element. The first input coupling element is configured to incouple the spatially modulated light associated with the first image into the first eyepiece substrate layer, and the first output coupling element is also configured to project at least a portion of the incoupled spatially modulated light out of the first eyepiece substrate layer toward the user side. The second input coupling element is configured to incouple the spatially modulated light associated with the second image into the second eyepiece substrate layer and the second output coupling element is configured to project at least a portion of the incoupled spatially modulated light out of the second eyepiece substrate layer toward the user side.
The appended drawings illustrate only some implementations and are therefore not to be considered limiting of scope.
The present invention is described more fully hereinafter with reference to the accompanying drawings, in which embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. In the drawings, the size and relative sizes of layers and regions may be exaggerated for clarity.
It will be understood that, although the terms first, second, third etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another region, layer, or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the present invention.
Spatially relative terms, such as “beneath,” “below,” “lower,” “under,” “above,” “upper,” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below,” “beneath,” or “under” other elements or features would then be oriented “above” the other elements or features. Thus, the exemplary terms “below” and “under” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. In addition, it will also be understood that when a layer is referred to as “between” two layers, it can be the only layer between the two layers, or one or more intervening layers may also be present.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items, and may be abbreviated as “/”.
It will be understood that when an element or layer is referred to as being “on,” “connected to,” “coupled to,” or “adjacent to” another element or layer, it can be directly on, connected, coupled, or adjacent to the other element or layer, or intervening elements or layers may be present. In contrast, when an element is referred to as being “directly on,” “directly connected to,” “directly coupled to,” or “immediately adjacent to” another element or layer, there are no intervening elements or layers present. Likewise, when light is received or provided “from” one element, it can be received or provided directly from that element or from an intervening element. On the other hand, when light is received or provided “directly from” one element, there are no intervening elements present.
Embodiments of the invention are described herein with reference to cross-section illustrations that are schematic illustrations of idealized embodiments (and intermediate structures) of the invention. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, embodiments of the invention should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. Accordingly, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of the invention.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and/or the present specification and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The present disclosure describes embodiments of an alternative viewing optics assembly combining two high density emissive displays, such as micro light emitting diode (microLED) panels, with pupil-expanding waveguide windows. Such a configuration can provide high resolution stereo imagery with large eye-boxes, to enable comfortable viewing with minimal artifacts over any selected field of view.
The system 100 includes a right display 104 and a left display 106 that project right and left image data, respectively, via spatially modulated light toward a right optical sub-assembly 108 and a left optical sub-assembly 110, respectively. The left and right image data may be slightly different, or offset, to provide stereo input to the user's eyes. The optical sub-assemblies 108, 110 can include one or more lenses, prisms, or other optical elements to direct the light from right and left displays 104 and 106, respectively, into right and left eyebox regions 112 and 114, respectively, where it can be received by a user's right and left eyes 116, 118, respectively, when the pupils of user's eyes 116, 118 overlap the eyebox regions 112, 114. It is noted that the items shown in
Notably, the eyebox regions 112, 114 for stereoscopic and prismatic optics, such as in the optical sub-assemblies 108, 110 of system 100, are limited in lateral (i.e., x direction) size. Additionally, aberrations within the refractive optical sub-assemblies may produce significant optical artifacts for viewers whose eyes are not centered within the eyebox regions. As a result, image quality and/or viewing experience may be poor for many viewers. For example, viewers having an interpupillary distance (IPD) 122 that does not align with the distance between eyebox regions 112, 114 may be unable to simultaneously position the pupil of left and right eyes within the left and right eyebox regions 112, 114. Even fewer users will have an IPD that places both pupils in the center of the eyebox regions where image quality is highest. These deficiencies may be unacceptable in certain applications, such as robotic surgery, where full views of high quality, high resolution images are required for making high risk decisions.
Other drawbacks to the system 100 exist. Even for users who have an IPD that aligns perfectly with the eyebox regions 112, 114, very little movement of the user's head is tolerated while still maintaining the pupils within the eyebox regions. Still less movement is permissible in order to view the images with minimal artifacts. As a result, the user must remain almost completely stationary to keep both pupils centered within the eyebox regions. Such ergonomic constraints may be harmful or even impossible for a user to maintain over long periods of use.
Stereoscopic and prismatic optics are characterized by an inherent relationship between the lens f-number and pupil size. As the field of view increases, the pupil size decreases, leading to reduced image quality. Finally, the optical system 100 must be enclosed within a bulky housing to minimize contrast-reducing ambient light scatter and to minimize other artifacts that may compromise image quality. Large housing sizes may prohibit systems like optical system 100 from being used in areas without sufficient spare room in which to place the equipment.
Other types of systems have been developed to address various deficiencies discussed with respect to the optical system 100. In one example, head-mounted systems can be used. Head-mounted systems are generally stationary (i.e., affixed in relative position) with respect to the user's eyes at all times, thereby allowing a user to move their head without causing misalignment between the user's eye and the display. However, head-mounted systems, and in particular head-mounted virtual reality (VR) systems, are cumbersome in applications that require a user to look away from the display system intermittently in order to view or interact with other objects or people in the user's environment.
Another type of optical system 200 used in an attempt to overcome the aforedescribed problems with system 100 is shown in a top-down view in
While the system 200 can be more compact than the system 100 and can include mechanisms for accommodating a range of user IPDs, the images produced by system 200 can suffer from poor image quality. Artifacts such as crosstalk and ghosting may be present in the images delivered to a user's eyes due to lenslet imperfections, misalignments between the lenslets and the pixels, and the simplification of lenslet design required in order to make the lenslet arrays manufacturable. The lenslets may further produce a texture on the display surface that can contribute to loss of image quality and/or distract a user. Additionally, lenslet prescriptions may give rise to vignetting and reduced redirection coherence for higher off-axis pixels and exit angles. The system 200 also sacrifices spatial resolution in favor of using perspective-bearing pixels, which limits the fidelity of the image perceived by a user.
In addition to lacking sufficient image quality, the system 200 may suffer from the inability to adjust an image location. The image in a lenslet-based optical system 200 is located at the display 204. Approaches to offset the depth location of the image along the viewing axis, z, generally produce significant image blur as well as astigmatic and other optical aberrations.
Referring to
Referring to
The eyepiece substrate 530 further includes a first output coupling element 540 configured to interact with the first portion 538a of light traveling through the eyepiece substrate 530 in TIR. At each interaction with the output coupling element 540, represented by circles 548, the trajectory of a percentage of light is modified such that it is directed out of the eyepiece substrate 530. Generally, some of the modified light is directed toward a user 102 as third portion 538c of light and some of the modified light is directed away from a user 102 as fourth portion 538d of light, opposite to the direction of third portion 538c. In some configurations, fourth portion 538d will not be seen by a user 102 and is considered wasted light. Similar to the reflective coating discussed above with respect to the input coupling element, a reflective coating 544 may be disposed over the output coupling element 540 or over a portion of the world side surface 534 aligned with the output coupling element 540 in order to reverse the direction of fourth portion 538d of light traveling away from user side 532 such that the reflected light can be directed toward the user's eye. That is, the reflective coating 544 redirects the fourth portion 538d of light back toward the user 102 where it may be seen by the user 102, thereby improving efficiency of the optical system. Alternatively, a reflective coating can be provided on an input coupling element 536, which may be a diffraction grating, to increase a diffraction efficiency of coupling element 536. Thus, by using a reflective coating, more light can be coupled into TIR angles through eyepiece substrate 530. The reflective or mirrored coating further provides a high level of non-scattering opacity. The mirror coating may be a broadband or narrow band coating as a matter of design choice.
One of skill in the art will appreciate that the percentage of light whose trajectory is modified at each interaction event 548 is determined by particular designs of the output coupling element 540. Light whose trajectory is not modified at each interaction with output coupling element 540 continues propagating through the eyepiece substrate in TIR where it may be modified at subsequent interaction events. Further, input coupling element 536 and output coupling element 540 may be configured to operate in transmission mode, with the light rays shown in
The area over which light is projected from the eyepiece substrate is considered the first eyebox region 542. Each beam within the third and fourth portions 538c, 538d of light projected from the eyepiece substrate contains a replica of the full image represented by the light 538 initially received into the eyepiece substrate from the projector 550. As such, a user need only receive a portion of third portion 538c of light in order to perceive the full image. Increasing the amount of light projected from the eyepiece substrate and/or increasing the area of the eyepiece substrate from which the relevant portions of light are projected toward user 102 increases the likelihood that a user can see the full image from a wider range of eye positions. This large region in which the user's right eye 116 can see the full image is represented by the first eyebox region 542. In some embodiments, the footprint size of the output coupling element 540 on the eyepiece substrate may be selected to be in the range of approximately 2 inches by 2 inches, though other sizes and shapes are possible with specific dimensions being merely a design choice.
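The uniformity of the eyebox region depends on how much light each interaction event 548 extracts. One common waveguide design goal, sketched here with purely illustrative numbers (the actual extraction fractions are a design parameter of the output coupling element), is for every interaction to eject the same absolute amount of energy, which requires the extraction fraction to grow along the propagation direction as the guided light is depleted:

```python
def uniform_extraction_fractions(num_bounces: int) -> list[float]:
    """Fraction of the *remaining* guided light that must be out-coupled
    at each interaction so that every bounce ejects equal energy.

    With N interactions and all light extracted by the last one, the
    k-th interaction (0-indexed) must extract 1/(N - k) of what remains.
    """
    return [1.0 / (num_bounces - k) for k in range(num_bounces)]

fractions = uniform_extraction_fractions(5)  # [0.2, 0.25, 0.333..., 0.5, 1.0]

# Verify: equal energy leaves the substrate at every interaction.
remaining = 1.0
ejected = []
for f in fractions:
    ejected.append(remaining * f)
    remaining *= (1.0 - f)
# each element of `ejected` is 0.2 (one fifth of the guided light)
```

In practice the extraction profile is set by varying the grating depth or duty cycle across the output coupling element; the sketch only illustrates why the profile must be graded rather than constant.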
The eyepiece substrate 530 may further include a second input coupling element 552 and a second output coupling element 554 configured to direct light to a second eyebox region 556 for viewing by a user's left eye 118. The second input coupling element and second output coupling element may function in the same way as the first input coupling element and the first output coupling element discussed above. Additional reflective coatings may be disposed over one or more of the second input coupling element, the second output coupling element, or various portions of the user or world side surfaces.
Several variations of the system 500 are possible. For example, instead of or in addition to the reflective coating 544, an opaque coating or a dimmable backing panel may be used. Dimmable backing panels may be programmable so that the amount of dimming is variable and is selectable by a user and/or by a control module. Such variations may provide a dark background against which an image being viewed by a user will appear brighter with higher contrast, and thus may be easier to see.
Various types of input and output coupling elements may be used. In some embodiments, the input and output coupling elements may include leaky-mode grating-based pupil expander windows (e.g., diffraction gratings) with repeating grating features (e.g., protrusions and/or recesses). The repeating grating features may repeat in one dimension or in two dimensions. The gratings may be formed using a patterned resist material on top of the eyepiece substrate or may be etched into or otherwise integrated with the eyepiece substrate. In some embodiments, the grating structures may include volume-phase material. The input and/or output coupling elements may alternatively or additionally include prisms and/or beamsplitter cascades.
Light projected from the output coupling elements 540, 554 may be focused at optical infinity. One or more lenses 560, 562 may be placed between the output coupling elements 540, 554 and the user's eyes 116, 118, respectively. The lenses may be modestly-powered lenses configured to focus images received by the user 102 at a finite distance. In some embodiments, the lenses may have a power ranging from approximately 1 to 3 diopters to bring the image into focus around an arm's length distance. This allows virtual content to be focused within an arm's reach working range of the user 102. For example, the virtual content may be focused between approximately 30 and approximately 70 centimeters. In some embodiments, the virtual content may be focused at approximately 50 centimeters.
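The quoted relationship between lens power and focus distance follows the thin-lens reciprocal relation, d = 1/P (distance in meters, power in diopters). A quick check of the stated values (the specific powers below are illustrative, not taken from any particular embodiment):

```python
def focus_distance_cm(power_diopters: float) -> float:
    """Focus distance implied by a lens power via the thin-lens
    reciprocal relation d = 1 / P (d in meters, P in diopters)."""
    return 100.0 / power_diopters

# A 2-diopter lens lands at the ~50 cm arm's-length distance noted above;
# powers near 1.4 and 3.3 diopters correspond to the ~70 cm and ~30 cm endpoints.
print(focus_distance_cm(2.0))  # 50.0
```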
The system 500 further includes a projector 550 that may be placed such that light from the projector is incident on the world side surface of the eyepiece substrate. The projector 550 includes at least one high resolution microdisplay 558 configured to project collimated, spatially distributed light through one or more relay lenses 565 where the light is converted to the angularly multiplexed light 538 incident on at least one input coupling element 536, 552. In some embodiments, the microdisplay 558 includes a fast-switching, high-density spatial light modulator (SLM) such as a microLED display that can alternatingly project a first image associated with a view shown to the right eye and a second image associated with a view shown to the left eye. Thus, the single microdisplay 558 may project temporally multiplexed images to one or more input coupling elements for delivery to one or more eyes. For instance, a microLED display may be used to implement high frame rate switching (e.g., alternating between left and right images at switching rates on the order of 240 Hz) and/or high resolution displays (e.g., 4K or 8K resolution) in a compact projector form factor. In contrast, in a non-microLED display, such as a liquid crystal on silicon (LCOS) display, pixel sizes are limited to at least approximately 3 microns and frame rates are limited to approximately 120 Hz due to the use of field-sequential color generation schemes and the slow response times of the liquid crystal materials. The emitters within a microLED display, by comparison, may be spaced such that a full color red-green-blue (RGB) pixel unit can be located within a pitch smaller than 3 microns, where all three colors can be emitted simultaneously.
Also, the switching rates of microLED emitters greatly exceed 120 Hz, such that the switching between right and left projected images can be performed without being noticeable to the user and the form factor of the overall projector can be kept at a handheld size or smaller. Additionally, the pupil size of a microLED projector can be larger than the pupil size of an LCOS-based projector; thus, the thickness of the waveguide window used in a system such as optical system 500 can be increased, resulting in increased light guiding efficiency.
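The frame-rate arithmetic behind this temporal multiplexing is simple: alternating left- and right-eye frames on one panel halves the effective per-eye refresh rate, which is why the panel must switch at roughly twice an ordinary display rate. A minimal sketch (the function name is illustrative):

```python
def per_eye_rate_hz(panel_rate_hz: float, num_views: int = 2) -> float:
    """Effective refresh rate delivered to each eye when `num_views`
    views are time-multiplexed on a single microdisplay."""
    return panel_rate_hz / num_views

print(per_eye_rate_hz(240.0))  # 120.0 -- each eye sees 120 Hz at the 240 Hz rate cited above
print(per_eye_rate_hz(120.0))  # 60.0  -- an LCOS-class panel rate would leave only 60 Hz per eye
```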
The projector 550 may further include a second microdisplay 564 adjacent the first microdisplay 558 such that at least a portion of the microdisplays overlap along the z-axis. The second microdisplay 564 may be laterally offset (e.g., in the x- and/or y-directions) relative to the first microdisplay. In this configuration, the first microdisplay 558 may be dedicated to generating images for a first eye (e.g., the right eye 116) and the second microdisplay 564 may be dedicated to generating images for a second eye (e.g., the left eye 118). For example, spatially modulated light emitted from the first microdisplay passes through one or more relay lenses 565 where it is converted to angularly multiplexed light and is incident on a first input coupling element 536 while spatially modulated light emitted from the second microdisplay passes through one or more relay lenses 565 where it is converted to angularly multiplexed light and is incident on a second input coupling element 552.
From the first and second input coupling elements 536, 552, incoupled light is directed to first and second output coupling elements 540, 554, respectively. The light from the first and second microdisplays may be temporally multiplexed to reduce crosstalk or interference between light associated with the first and second (e.g., right and left) images. In some embodiments, a shutter (not shown) may be disposed between the projector and the first and second input coupling elements to further isolate the left and right image light. One or more portions of the shutter may be configured to switch between open and closed states in sync with the temporal multiplexing between the first and second microdisplay such that the shutter is open over the first input coupling element and closed over the second input coupling element when light from the first microdisplay passes through the shutter. The shutter is open over the second input coupling element and closed over the first input coupling element when light from the second microdisplay passes through the shutter. Thus, the shutter may prevent stray light from the microdisplays from entering the wrong input coupling element which could reduce overall image quality experienced by the user.
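The shutter synchronization described above can be sketched as a simple per-frame rule (the class and function names below are hypothetical; a real implementation would be driven by the display controller's frame-sync signal):

```python
from dataclasses import dataclass

@dataclass
class ShutterState:
    """Open/closed state of the shutter region over each input coupling element."""
    first_open: bool   # region over the first (e.g., right-eye) input coupler
    second_open: bool  # region over the second (e.g., left-eye) input coupler

def shutter_for_frame(frame_index: int) -> ShutterState:
    """Even frames carry the first-eye image, odd frames the second-eye image.

    The shutter opens only over the input coupling element matching the
    microdisplay active for the current frame, blocking stray light from
    reaching the other coupler.
    """
    first_active = (frame_index % 2 == 0)
    return ShutterState(first_open=first_active, second_open=not first_active)

state = shutter_for_frame(0)
# state.first_open is True and state.second_open is False on even frames
```

The essential property is exclusivity: at no frame are both shutter regions open, so light from one microdisplay can never enter the other eye's input coupling element.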
In order to feed light to light pathways associated with both the left and right eyes, the projector 550 is generally centered between the two input coupling elements 536, 552. The projector may be positioned in the y direction (i.e., into the page) above, below, or even with the output coupling elements 540, 554. For example, in the configuration shown in
The system 700 is shown having a first projector 750 which is laterally offset (e.g., in the x direction) from a second projector 768. The first and second projectors are shown on the world side of the eyepiece substrates 530 and project light toward the user 102. The first and second projectors are also positioned on the nasal side of each of eyepiece substrates 530, although other arrangements are possible.
The first projector includes a first microdisplay 758, which may be a high-density microLED display, and one or more relay lenses 765. Light from the first microdisplay 758 passes through relay lenses 765, which converts the light into a first plurality of angularly multiplexed beams that propagate through a first optical pupil 738. Each of the first plurality of angularly multiplexed beams is incident on the first input coupling element 536. In the second projector 768, light from the second microdisplay 764 passes through a second set of relay lenses 766, which converts the light into a second plurality of angularly multiplexed beams that propagate through a second optical pupil 770. Each of the second plurality of angularly multiplexed beams is incident on the second input coupling element 552. Using two independent projectors may allow greater separation (e.g., gap 769) between the first and second input coupling elements 536, 552, which can reduce image artifacts, thereby improving image quality. Furthermore, having two independent projectors allows each projector to have its own input coupling element and run at normal frame rates. If a single projector is used for projecting both stereo images, the frame rate of the projector should be double the normal frame rate, and a switching mechanism or a shutter should be incorporated into the system to time multiplex the images sent to the left and right fields. Images received by the user's eyes 116, 118 within the eyebox regions 542, 556 may be focused at infinity unless a powered lens or curved eyepiece substrate is included in the system 700.
Similar to the system 500, the system 700 advantageously includes expanded eyebox regions 542, 556 wherein a user may perceive a full field of view, high quality images that retain their native resolution (i.e., the resolution of the image produced by the projector). In contrast to the systems 100, 200, and 300, no spatial/angular resolution trade-off is needed in the waveguide-based systems 500, 700, and 800. Thus, images with 4K, 8K, or higher pixel count can be displayed to each eye. Implementing a waveguide-based system using a small pixel pitch, high pixel density SLM, such as a monolithic microLED display, keeps the microdisplay component small and allows other components within the optical train to be similarly small. The resulting systems can be lightweight, compact, and easily moved to or installed in small spaces. As an example, typical pixel pitch of a high-density microLED display may be, for example, a full RGB pixel pitch of 2 to 3 microns. For instance, a 2 micron RGB pitch at 4K resolution would result in a display panel of approximately 11 millimeters on the diagonal. Such a display would in turn require a relay lens approximately 15 to 25 millimeters in diameter. In comparison, an LCOS-based SLM has a practical minimum pitch of 3 microns providing sequential color. A 4K display using LCOS would result in a 17 millimeter diagonal panel with a required relay lens diameter of 23 to 33 millimeters, with an additional illumination module to backlight the LCOS pixels, such that the LCOS display would result in at least a 50 to 75% increase in the overall optical system volume, compared to a microLED display system, which does not require backlighting.
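The panel-size comparison above follows directly from resolution and pixel pitch. The sketch below assumes a 3840 × 2160 "4K" grid and a pitch within the quoted 2 to 3 micron range; the exact figures in the text may assume a slightly different resolution or additional panel border, so these results are indicative only:

```python
import math

def panel_diagonal_mm(h_pixels: int, v_pixels: int, pitch_um: float) -> float:
    """Diagonal size of a display panel given its resolution and pixel pitch."""
    diag_pixels = math.hypot(h_pixels, v_pixels)
    return diag_pixels * pitch_um / 1000.0  # microns -> millimeters

# microLED at a 2.5 um full-RGB pitch (within the 2-3 um range quoted):
print(round(panel_diagonal_mm(3840, 2160, 2.5), 1))  # 11.0 mm diagonal
# LCOS at its ~3 um practical minimum pitch:
print(round(panel_diagonal_mm(3840, 2160, 3.0), 1))  # 13.2 mm diagonal
```

The relay lens diameter then scales with the panel diagonal, which is why the microLED configuration yields the smaller overall optical train described above.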
As discussed above, the waveguide-based systems 500, 700 produce a virtual image that can be focused at a position substantially coincident with a user's physical hand location without sacrificing image quality. This feature advantageously allows for the superposition of visual and physical fields for activities benefiting from close proximity between the projected content (e.g., virtual images) and the physical environment (e.g., the user's hands and/or tools), such as in telesurgery applications. In some embodiments, optical magnification of the projected image can be performed using only a single lens per eye, as discussed above. The quality of the single lens can be much higher than that of the plurality of lenslets described with respect to
The expanded eyebox regions allow the systems to be used by people having widely varying IPDs without any loss in image quality. In addition, the expanded eyebox regions allow users the freedom of greater head and eye motion while still maintaining overlap with the eyebox regions, and thus, still viewing the full field of view, high resolution, high quality image. The systems 500, 700 therefore provide improved image quality and ergonomics for users.
A further advantage of the systems 500, 700 is the compact size. Referring to
Alternatively,
The embodiments illustrated in
The foregoing is illustrative of the present invention and is not to be construed as limiting thereof. Although a few exemplary embodiments of this invention have been described, those skilled in the art will readily appreciate that many modifications are possible in the exemplary embodiments without materially departing from the novel teachings and advantages of this invention.
In some embodiments, the images displayed by one or more of the optical systems described herein may be directly received by or otherwise derived from remote video. For example, the images projected to the user may be streamed in substantially real-time from a surgical camera on or inside of a patient undergoing telesurgery. The user may make decisions and movements with the user input means to control robotic surgery equipment that replicates the user's motions in substantially real-time, thereby performing remote surgery on the patient.
Various alternative or additional configurations or components may be implemented in one or more of the waveguide-based optical systems described above. In some embodiments, multiple stacked layers (i.e., aligned along a viewing axis in the z direction) of eyepiece substrates may be used instead of one. This may allow the optical system to more efficiently divide color components or portions of an image field of view between the various layers. Additional microdisplays or projectors may be added in order to feed separated optical pupils to one or more input optical elements to divide the color or field of view components among the different layers of eyepiece substrates.
In addition to the number of layers of eyepiece substrate used, the material of the eyepiece substrates may be selected to support a particular number of colors or a particular field of view. The refractive index of the substrate material corresponds to the size of the field of view that can be supported in TIR within the substrate material. For example, lithium niobate has a high index of refraction (n=2.3) and can support a field of view of approximately 90°. As described herein, the eyepiece substrates are generally planar with minimal thickness variation; however, curved or freeform substrates may also be used as a matter of design choice.
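The link between substrate refractive index and supportable field of view can be estimated to first order: guided angles must lie between the TIR critical angle (whose sine is 1/n) and some practical maximum, so the in-air sine bandwidth that can be carried is roughly n·sin(θmax) − 1. The sketch below is a simplified symmetric-field model with an assumed 80° maximum guided angle; real eyepiece designs may be asymmetric or quote diagonal fields, which is broadly consistent with the ~90° figure cited for lithium niobate:

```python
import math

def waveguide_fov_deg(n: float, max_guided_angle_deg: float = 80.0) -> float:
    """First-order estimate of the in-air field of view a planar waveguide
    can carry in TIR.

    Guided rays must fall between the critical angle (sin = 1/n) and a
    practical maximum angle, so the supported in-air sine bandwidth is
    n*sin(theta_max) - 1, split symmetrically about the surface normal.
    """
    sine_bandwidth = n * math.sin(math.radians(max_guided_angle_deg)) - 1.0
    return 2.0 * math.degrees(math.asin(sine_bandwidth / 2.0))

print(round(waveguide_fov_deg(2.3), 1))  # ~78.5 degrees for lithium niobate (n = 2.3)
print(round(waveguide_fov_deg(1.5), 1))  # ~27.6 degrees for ordinary glass (n = 1.5)
```

The estimate captures the trend in the text: the higher the substrate index, the wider the field of view that can be guided in TIR.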
Some differentiating factors of the embodiments described herein are: 1) flexibility in the size of the waveguides; 2) employment of large waveguides to enable significant head motion in regular use, in contrast to a head-mounted display (HMD), which is attached to the head such that there should be minimal displacement of the HMD from the eyes/head; 3) the ability to use thicker substrates with larger optics given the divergence of microLED pixel light; and 4) the use of large waveguides to produce a stereo window using a single set of microLED display and relay optics.
Features described above as well as those claimed below may be combined in various ways without departing from the scope hereof. The following enumerated examples illustrate some possible, non-limiting combinations.
(A1) A stereo viewing system includes a projector and an eyepiece. The projector includes a first microdisplay and is configured to emit spatially modulated light associated with a first image and spatially modulated light associated with a second image. The eyepiece includes an eyepiece substrate, a first input coupling element, a first output coupling element, a second input coupling element, and a second output coupling element. The eyepiece substrate has a user-side surface. The first input coupling element is configured to receive the spatially modulated light associated with the first image from the projector and incouple the spatially modulated light associated with the first image into the eyepiece substrate. The first output coupling element is configured to project at least a portion of the incoupled spatially modulated light out of the eyepiece substrate from the user-side surface. The second input coupling element is configured to receive the spatially modulated light associated with the second image from the projector and incouple the spatially modulated light associated with the second image into the eyepiece substrate. The second output coupling element is configured to project at least a portion of the incoupled spatially modulated light out of the eyepiece substrate from the user-side surface.
(A2) Embodiments of system (A1) further include a mirror coating on a world-side surface of the eyepiece substrate. The mirror coating may be aligned with the first output coupling element. In embodiments, the mirror coating is one of a broadband mirror coating and a narrowband mirror coating.
(A3) In embodiments of either one of systems (A1) and (A2), at least one of (i) the first input coupling element is a first input coupling grating, (ii) the first output coupling element is a first output coupling grating, (iii) the second input coupling element is a second input coupling grating, and (iv) the second output coupling element is a second output coupling grating. At least one of the first output coupling grating and the second output coupling grating may be formed from functional resist. In embodiments, at least one of the first output coupling grating and the second output coupling grating is integrally formed with the eyepiece substrate. At least one of the first output coupling grating and the second output coupling grating may be formed of a volume-phase material, such as a photopolymer.
(A4) In embodiments of any one of systems (A1)-(A3), the eyepiece substrate is formed of glass, polymer, lithium niobate, or a combination thereof.
(A5) In embodiments of any one of systems (A1)-(A4), the projector is disposed on the user side of the eyepiece substrate.
(A6) In embodiments of any one of systems (A1)-(A4), the projector is disposed on the world side of the eyepiece substrate.
(A7) In embodiments of any one of systems (A1)-(A6), one or more of the first input coupling element and the first output coupling element includes a beamsplitter.
(B1) In embodiments a stereo viewing system includes a first projector and a first eyepiece stack. The first projector includes a first microdisplay configured to emit spatially modulated light associated with a first image and a second microdisplay configured to emit spatially modulated light associated with a second image. The first eyepiece stack includes at least a first eyepiece and a second eyepiece aligned along a viewing axis. The first eyepiece includes a first eyepiece substrate having a first user-side surface, a first input coupling element, and a first output coupling element. The second eyepiece includes a second eyepiece substrate having a second user-side surface, a second input coupling element, and a second output coupling element.
The first input coupling element is configured to incouple the spatially modulated light associated with the first image into the first eyepiece substrate. The first output coupling element is configured to project at least a portion of the incoupled spatially modulated light out of the first eyepiece substrate from the first user-side surface. The second input coupling element is configured to incouple the spatially modulated light associated with the second image into the second eyepiece substrate. The second output coupling element is configured to project at least a portion of the incoupled spatially modulated light out of the second eyepiece substrate from the second user-side surface.
Accordingly, many different embodiments stem from the above description and the drawings. It will be understood that it would be unduly repetitious and obfuscating to literally describe and illustrate every combination and subcombination of these embodiments. As such, the present specification, including the drawings, shall be construed to constitute a complete written description of all combinations and subcombinations of the embodiments described herein, and of the manner and process of making and using them, and shall support claims to any such combination or subcombination.
The present application claims the benefit of U.S. Provisional Patent Application No. 63/113,375, filed Nov. 13, 2020, and entitled “Eyebox Expanding Viewing Optics Assembly For Stereo-Viewing.”
Number | Date | Country
---|---|---
63/113,375 | Nov. 2020 | US