This application incorporates by reference the entireties of each of the following: U.S. application Ser. No. 14/555,585, filed on Nov. 27, 2014, published on Jul. 23, 2015 as U.S. Publication No. 2015/0205126; U.S. application Ser. No. 14/690,401, filed on Apr. 18, 2015, published on Oct. 22, 2015 as U.S. Publication No. 2015/0302652; U.S. application Ser. No. 14/212,961, filed on Mar. 14, 2014, now U.S. Pat. No. 9,417,452, issued on Aug. 16, 2016; U.S. application Ser. No. 14/331,218, filed on Jul. 14, 2014, published on Oct. 29, 2015 as U.S. Publication No. 2015/0309263; U.S. Patent App. Pub. No. 2018/0061121, published Mar. 1, 2018; U.S. patent application Ser. No. 16/221,065, filed Dec. 14, 2018; U.S. Patent App. Pub. No. 2018/0275410, published Sep. 27, 2018; U.S. Provisional Application No. 62/786,199, filed Dec. 28, 2018; U.S. application Ser. No. 16/221,359, filed on Dec. 14, 2018; U.S. Provisional Application No. 62/702,707, filed on Jul. 24, 2018; U.S. application Ser. No. 15/481,255, filed Apr. 6, 2017; and U.S. application Ser. No. 15/927,808, filed Apr. 21, 2018, published on Sep. 27, 2018 as U.S. Patent App. Pub. No. 2018/0275410.
The present disclosure relates to display systems and, more particularly, to augmented and virtual reality display systems.
Modern computing and display technologies have facilitated the development of systems for so-called "virtual reality" or "augmented reality" experiences, in which digitally reproduced images or portions thereof are presented to a user in a manner wherein they seem to be, or may be perceived as, real. A virtual reality, or "VR", scenario typically involves the presentation of digital or virtual image information without transparency to other actual real-world visual input; an augmented reality, or "AR", scenario typically involves presentation of digital or virtual image information as an augmentation to visualization of the actual world around the user. A mixed reality, or "MR", scenario is a type of AR scenario and typically involves virtual objects that are integrated into, and responsive to, the natural world. For example, an MR scenario may include AR image content that appears to be blocked by or is otherwise perceived to interact with objects in the real world.
According to some embodiments, a head-mounted display system comprises a support structure configured to mount on a user's head, a light projection system supported by the support structure, an eyepiece, and one or more processors. The light projection system comprises a micro-display comprising an array of light emitters associated with a first resolution, wherein the array of light emitters is configured to output light forming frames of virtual content; projection optics; and one or more actuators. The eyepiece is supported by the support structure and configured to receive light from the light projection system and to direct the received light to the user. The one or more processors are configured to receive a rendered frame of virtual content, the rendered frame comprising at least a portion associated with a second resolution, wherein the second resolution is higher than the first resolution. The one or more processors are further configured to cause the light projection system to output light forming a first subframe of the rendered frame, wherein the first subframe and the rendered frame are substantially a same size. The one or more processors are further configured to shift, via the one or more actuators, one or more parts of the light projection system to adjust positions associated with light output by the light emitters of the light projection system and cause the light projection system to output light forming a second subframe of the rendered frame.
According to some other embodiments, a method implemented by a head-mounted display system comprising one or more processors comprises providing a rendered frame of virtual content, the rendered frame comprising at least a portion associated with a second resolution. An emissive micro-display projector is caused to output light forming a first subframe of the rendered frame, the first subframe having a first resolution less than the second resolution, wherein the emissive micro-display projector comprises an array of light emitters associated with the first resolution and having a pixel pitch. The emissive micro-display projector is shifted, via one or more actuators, to adjust geometric positions associated with light output by the emissive micro-display projector, wherein the geometric positions are adjusted a distance less than the pixel pitch. The emissive micro-display projector is caused to output light forming a second subframe of the rendered frame, the second subframe having the first resolution.
According to yet other embodiments, a system comprises one or more processors and one or more computer storage media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations. The operations comprise generating a rendered frame of virtual content to be displayed as augmented reality content via an emissive micro-display projector system of the system, the rendered frame being associated with a second resolution, and the emissive micro-display projector comprising one or more light emitter arrays configured to output light forming virtual content associated with a first, lower, resolution. The rendered frame of virtual content is divided into a plurality of subframes, wherein each subframe includes a subset of pixels included in the rendered frame. Light is successively output via the emissive micro-display projector system, the light forming the plurality of subframes, wherein the emissive micro-display projector system is shifted via one or more actuators for each of the subframes according to a movement pattern, wherein the emissive micro-display projector system is shifted along one or more axes on a plane parallel to a plane of an output pupil of the projector system.
According to some other embodiments, a method implemented by a head-mounted display system comprising one or more processors comprises generating a rendered frame of virtual content to be displayed as virtual content via an emissive micro-display projector system of the head-mounted display system, the rendered frame being associated with a second resolution, and the emissive micro-display projector comprising emitters configured to output light forming virtual content associated with a first, lower, resolution. The rendered frame of virtual content is divided into a plurality of subframes, wherein each subframe includes a subset of pixels included in the rendered frame. Light is successively output via the emissive micro-display projector system, the light forming the plurality of subframes, wherein the emissive micro-display projector system is shifted via one or more actuators for each of the subframes according to a movement pattern, wherein the emissive micro-display projector system is shifted along one or more axes on a plane parallel to a plane of an output pupil of the projector system.
Some additional examples are provided below.
Example 1. A head-mounted display system comprising: a support structure configured to mount on a user's head; a light projection system supported by the support structure and comprising: a micro-display comprising an array of light emitters associated with a first resolution, wherein the array of light emitters is configured to output light forming frames of virtual content; projection optics; and one or more actuators; an eyepiece supported by the support structure and configured to receive light from the light projection system and to direct the received light to the user; and one or more processors, the one or more processors configured to: receive a rendered frame of virtual content, the rendered frame comprising at least a portion associated with a second resolution, wherein the second resolution is higher than the first resolution; cause the light projection system to output light forming a first subframe of the rendered frame, wherein the first subframe and the rendered frame are substantially a same size; shift, via the one or more actuators, one or more parts of the light projection system to adjust positions associated with light output by the light emitters of the light projection system; and cause the light projection system to output light forming a second subframe of the rendered frame.
Example 2. The head-mounted display of example 1, wherein the portion associated with the second resolution is associated with a foveal region of a user's eye.
Example 3. The head-mounted display of example 2, wherein the one or more processors are configured to determine that light forming the portion falls within a threshold angular distance of a fovea of the user.
Example 4. The head-mounted display of example 2, wherein the one or more processors are configured to cause: for the second subframe, light emitters to update emitted light forming the portion; and for the first subframe, light emitters to not update emitted light forming parts of the rendered frame outside of the portion.
Example 5. The head-mounted display system of example 1, wherein the array of light emitters has an associated emitter size and an associated pixel pitch, wherein the emitter size is less than the pixel pitch.
Example 6. The head-mounted display of example 5, wherein a total number of subframes of the rendered frame is determined based on a size associated with the pixel pitch and the emitter size.
Example 7. The head-mounted display of example 6, wherein the one or more processors are configured to cause the light projection system to successively output light forming the total number of subframes.
Example 8. The head-mounted display of example 7, wherein the one or more processors are configured to time multiplex the rendered frame by causing the one or more actuators to shift parts of the light projection system for each subframe.
Example 9. The head-mounted display of example 8, wherein the one or more processors are configured to cause the one or more actuators to shift the parts of the light projection system such that geometric positions associated with the array of light emitters are tiled within respective inter-emitter regions.
Example 10. The head-mounted display of example 1, wherein the one or more processors are configured to cause the one or more actuators to shift the parts of the light projection system according to a movement pattern, and wherein the movement pattern is a continual movement pattern.
Example 11. The head-mounted display of example 1, wherein the first subframe and the second subframe each comprise pixels associated with respective portions of the rendered frame.
Example 12. The head-mounted display of example 1, wherein the light projection system comprises a plurality of arrays of light emitters.
Example 13. The head-mounted display of example 12, further comprising an X-cube prism, wherein each of the arrays of light emitters faces a different side of the X-cube prism.
Example 14. The head-mounted display of example 12, wherein each of the arrays of light emitters is configured to direct light into dedicated associated projection optics.
Example 15. The head-mounted display of example 12, wherein the arrays of light emitters are attached to a common back plane.
Example 16. The head-mounted display of example 1, wherein the one or more actuators are configured to shift the projection optics.
Example 17. The head-mounted display of example 1, wherein the one or more actuators are piezoelectric motors.
Example 18. The head-mounted display of example 1, wherein the one or more actuators shift the emissive micro-display projector along two axes.
Example 19. The head-mounted display of example 1, wherein the light emitters comprise light emitting diodes.
Example 20. The head-mounted display of example 1, wherein the array of light emitters is configured to emit light of a plurality of component colors.
Example 21. The head-mounted display of example 20, wherein each light emitter comprises a stack of constituent light generators, wherein each constituent light generator emits light of a different color.
Example 22. The head-mounted display of example 1, wherein the eyepiece comprises a waveguide assembly comprising one or more waveguides, each waveguide comprising: an in-coupling optical element configured to incouple light from the micro-display into the waveguide; and an out-coupling optical element configured to outcouple incoupled light out of the waveguide.
Example 23. A method implemented by a head-mounted display system comprising one or more processors, the method comprising: providing a rendered frame of virtual content, the rendered frame comprising at least a portion associated with a second resolution; causing an emissive micro-display projector to output light forming a first subframe of the rendered frame, the first subframe having a first resolution less than the second resolution, wherein the emissive micro-display projector comprises an array of light emitters associated with the first resolution and having a pixel pitch; shifting, via one or more actuators, the emissive micro-display projector to adjust geometric positions associated with light output by the emissive micro-display projector, wherein the geometric positions are adjusted a distance less than the pixel pitch; and causing the emissive micro-display projector to output light forming a second subframe of the rendered frame, the second subframe having the first resolution.
Example 24. A system comprising: one or more processors; and one or more computer storage media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: generating a rendered frame of virtual content to be displayed as augmented reality content via an emissive micro-display projector system of the system, the rendered frame being associated with a second resolution, and the emissive micro-display projector comprising one or more light emitter arrays configured to output light forming virtual content associated with a first, lower, resolution; dividing the rendered frame of virtual content into a plurality of subframes, wherein each subframe includes a subset of pixels included in the rendered frame; and successively outputting light via the emissive micro-display projector system, the light forming the plurality of subframes, wherein the emissive micro-display projector system is shifted via one or more actuators for each of the subframes according to a movement pattern, wherein the emissive micro-display projector system is shifted along one or more axes on a plane parallel to a plane of an output pupil of the projector system.
Example 25. The system of example 24, wherein the one or more processors are configured to cause the one or more actuators to shift the emissive micro-display projector such that geometric positions associated with the emissive micro-display arrays are tiled within respective inter-emitter regions.
Example 26. The system of example 25, wherein the one or more processors are configured to cause the one or more actuators to shift the light emitter arrays along the one or more axes.
Example 27. The system of example 25, wherein the micro-display projector system comprises projection optics, wherein the one or more processors are configured to cause the one or more actuators to shift the projection optics along the one or more axes, the projection optics being configured to output light to a user of the system.
Example 28. A method implemented by a head-mounted display system comprising one or more processors, the method comprising: generating a rendered frame of virtual content to be displayed as virtual content via an emissive micro-display projector system of the head-mounted display system, the rendered frame being associated with a second resolution, and the emissive micro-display projector comprising emitters configured to output light forming virtual content associated with a first, lower, resolution; dividing the rendered frame of virtual content into a plurality of subframes, wherein each subframe includes a subset of pixels included in the rendered frame; and successively outputting light via the emissive micro-display projector system, the light forming the plurality of subframes, wherein the emissive micro-display projector system is shifted via one or more actuators for each of the subframes according to a movement pattern, wherein the emissive micro-display projector system is shifted along one or more axes on a plane parallel to a plane of an output pupil of the projector system.
Example 29. The method of example 28, wherein the emissive micro-display projector system is shifted such that geometric positions associated with light emitter arrays are tiled within respective inter-emitter regions.
Example 30. The method of example 29, wherein the one or more actuators shift the light emitter arrays along the one or more axes.
Example 31. The method of example 29, wherein the one or more actuators shift projection optics of the emissive micro-display projector system along the one or more axes, the projection optics being configured to output light to a user of the head-mounted display system.
Augmented reality (AR) or virtual reality (VR) systems may display virtual content to a user, or viewer. This content may be displayed on a head-mounted display, for example, as part of eyewear, that projects image information to the user's eyes. In addition, where the system is an AR system, the display may also transmit light from a surrounding environment to the user's eyes, to allow a view of the surrounding environment. As used herein, it will be appreciated that a “head-mounted” or “head mountable” display is a display that may be mounted on the head of the user or viewer.
To improve the usability of AR or VR systems (also referred to simply as “display systems”), it may be beneficial to reduce the size, weight, and/or power consumption of the display systems. As an example, a user may be more likely to utilize a display system if the size, and general obtrusiveness, of the display system is decreased. As another example, a user may be more likely to utilize a display system if the weight placed on the user's head is reduced. Similarly, reduced power consumption can allow the use of smaller batteries, reduce heat generated by the display system, and so on. Various embodiments described herein facilitate such benefits, including reductions in the size of parts of display systems.
As described herein, light forming virtual content (also referred to herein as image light) may be generated by one or more display technologies. For example, the light may be generated by a light projection system included in a display system. This light may then be routed via optics for output as virtual content to a user of the display system. The virtual content may be represented as image pixels included in rendered frames successively presented to the user. To achieve high-quality (e.g., lifelike) virtual content, the display system may render, and then output, frames of virtual content at a sufficient resolution (e.g., greater than a threshold resolution). Accordingly, the image pixels may be sufficiently close together to achieve the sufficient resolution.
However, it will be appreciated that design constraints associated with a display system may limit the ability to achieve such closeness in image pixels, and thus resolution. For example, to miniaturize a display system, the display system may be required to have a reduced display size (e.g., a projector size). An example display may include a liquid crystal on silicon (LCoS) display. To output image light forming virtual content, the LCoS display may be required to utilize a separate illumination module including one or more light emitters. In this example, an LCoS panel may impose spatially varying modulation on the generated light to form virtual content. However, to decrease a size associated with an LCoS panel while preserving a high resolution, the pixel pitch associated with the LCoS panel may need to be reduced. Pixel pitch, as described herein, may represent a physical distance on a display between similar locations on similar elements of the display forming image pixels. Due to physical constraints regarding small pixel pitches, coupled with the necessity of a separate illumination module, an LCoS display may be larger than desired in some applications.
Some embodiments disclosed herein advantageously include an emissive micro-display, such as a micro-LED display. In some embodiments, the micro-displays are micro-OLED displays. Display systems utilizing an emissive micro-display may avoid the added bulk of an illumination module. Additionally, an emissive micro-display may facilitate the presentation of images with an advantageously small apparent pixel pitch. As described, an example display system may utilize one or more emissive micro-displays to achieve reductions in size, weight, and power consumption, among other benefits.
Emissive micro-displays have several advantages for use in wearable display systems. As an example, the power consumption of emissive micro-displays generally varies with image content, such that dim or sparse content requires less power to display. Because AR environments may often be sparse (it may generally be desirable for the user to be able to see their surrounding environment), emissive micro-displays may have an average power consumption below that of other display technologies that use a spatial light modulator to modulate light from a light source. In contrast, such other display technologies may utilize substantial power even for dim, sparse, or "all off" virtual content. As another example, emissive micro-displays may offer an exceptionally high frame rate (which may enable the use of a partial-resolution array) and may provide low levels of visually apparent motion artifacts (e.g., motion blur). As another example, emissive micro-displays may not require polarization optics of the type required by LCoS displays. Thus, emissive micro-displays may avoid the optical losses present in polarization optics.
While arrays of light emitters, such as micro-LEDs, may provide for substantial size, weight, and/or power savings, current light emitters may not provide for sufficiently small pixel pitch to enable high resolution virtual content in small display system form factors. As a non-limiting example, some micro-LED-based micro-displays may allow for a pixel pitch of about 2 to about 3 microns. Even at such pixel pitches, to provide a desired number of pixels, the micro-LED display may still be undesirably large for use in a wearable display system, particularly since a goal for such systems may be to have a form factor and size similar to that of eyeglasses.
As described in more detail, a light projection system including an emissive micro-display may achieve an effectively small pixel pitch via rapid physical adjustment, or displacement, of parts of the light projection system. For example, the emissive micro-display may be physically adjusted in position, or displaced, along one or more axes. As another example, optical elements (e.g., projection optics) may be physically adjusted in position, or displaced, along one or more axes.
As described herein, a size associated with a light emitter, such as a micro-LED, may be referred to as emitter size. The emitter size may refer to a dimension of the light emitter along a particular axis (e.g., lateral axis). Emitter size may also refer to dimensions of the light emitter along two axes (e.g., lateral and longitudinal axes). Similarly, pixel pitch may refer to a distance between similar points on directly adjacent light emitters along a particular axis (e.g., lateral axis), with different axes having their own pixel pitch. For example, in some embodiments, the light emitters may be placed more closely along a first axis than along a second axis (e.g., an orthogonal axis). An example of an array of light emitters is described in more detail herein and illustrated in
It will be appreciated that the size of a light emitter may be less than the gap that separates directly neighboring light emitters. For example, due to physical and electrical constraints, it may be challenging to form an emissive micro-display with light emitters at greater than a threshold density. Example constraints may include current crowding, substrate droop, and so on. Thus, there may be substantial gaps or spaces between adjacent light emitters. The gap between two light emitters is referred to herein as an inter-emitter region. Inter-emitter regions, an example of which is illustrated in
Advantageously, the ability to operate light emitters, such as micro-LEDs, at high speeds may allow time-multiplexed presentation of an image using the same one or more emissive micro-displays; for example, the geometric position of light emitters relative to projection optics may be shifted to allow the same light emitters to present different pixels of the image at different times. In some embodiments, a rendered frame of virtual content may be presented as a series of subframes in rapid succession via time-multiplexing schemes, with each subframe associated with a particular physical position of the light emitters relative to the projection optics. Thus, it will be appreciated that the geometric position may be varied by changing the locations of the light emitters and the projection optics relative to one another (e.g., by changing the physical position of light emitters while keeping the projection optics stationary, by changing the physical position of the projection optics while keeping the light emitters stationary, or by changing the physical positions of both the light emitters and the projection optics). As described in more detail below, the geometric positions may be adjusted (e.g., via one or more actuators) to cause the light emitters to tile respective inter-emitter regions. Thus, the emissive micro-display may achieve output of advantageously high resolution virtual content.
Thus, a light projection system may be configured to project individual full-resolution frames of virtual content by projecting one or more partial-resolution subframes. The subframes may be projected in rapid succession and may be offset from each other (e.g., by less than a full pixel pitch along one or more axes on which the subframes are translated). For example, an emissive micro-display included in the projector may be physically displaced along one or more axes. As described above, an emissive micro-display may include light emitters, such as micro-LEDs, having a pixel pitch. This pixel pitch may thus inform a resolution at which the emissive micro-display may output frames of virtual content. To effectively decrease the functional pixel pitch and gap between light emitters, and thus increase a resolution for a same size display, the light emitters may be adjusted in position, for example, by less than the pixel pitch. As an example, a display may include light emitters separated by a pixel pitch of 2.5 microns, and each light emitter may have an emitter size of 0.833 microns. In some embodiments, the light emitters may be adjusted in position a number of times based on the number of different (e.g., non-overlapping) positions that the light emitters may assume within an inter-emitter region. In this example, the inter-emitter region may be 6.25 square microns, and an example light emitter may be translated to three positions along a first axis and three positions along a second, orthogonal axis. Thus, the example light emitter may effectively assume 9 positions within the inter-emitter region. For one or more of the 9 positions, a particular subframe of a same rendered frame of virtual content may be presented. Thus, the successively presented subframes may be perceived as a high-resolution frame of virtual content. In effect, the light emitters may form images with a higher apparent pixel density than the physical density of the light emitters.
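By way of non-limiting illustration, the arithmetic above may be sketched in Python; this is a minimal sketch assuming the example values given (a 2.5 micron pixel pitch and a 0.833 micron emitter size), and all names are illustrative rather than part of any embodiment.

    # Illustrative sketch: enumerate the sub-pixel-pitch offsets at which one or
    # more actuators may place the micro-display (or the projection optics) so
    # that each emitter tiles its inter-emitter region across subframes.
    PIXEL_PITCH_UM = 2.5     # distance between like points on adjacent emitters
    EMITTER_SIZE_UM = 0.833  # lateral dimension of each light emitter

    # Number of non-overlapping emitter positions along one axis (3 here).
    positions_per_axis = round(PIXEL_PITCH_UM / EMITTER_SIZE_UM)

    # One (x, y) offset, in microns, per subframe; every offset is smaller than
    # the pixel pitch, and together the offsets give 3 x 3 = 9 positions.
    shift_offsets_um = [
        (col * EMITTER_SIZE_UM, row * EMITTER_SIZE_UM)
        for row in range(positions_per_axis)
        for col in range(positions_per_axis)
    ]
    assert len(shift_offsets_um) == 9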
In some embodiments, the visual system of users may merge together the subframes such that users perceive the full-resolution frames. For example, the pixels of the subframes may be interwoven to form a full-resolution frame. Preferably, the subframes may be sequentially displayed at a frame rate higher than the flicker fusion threshold of the human visual system. As an example, the flicker fusion threshold may be 60 Hz, which is considered to be sufficiently fast that most users do not perceive the subframes as being displayed at different times. In some embodiments, the different subframes are sequentially displayed at a rate equal to or higher than the flicker fusion threshold (e.g., equal to or higher than 60 Hz).
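The timing implication may be illustrated with simple arithmetic (a sketch assuming nine subframes per rendered frame, as in the example above, and a 60 Hz flicker fusion threshold):

    # Illustrative arithmetic: if each rendered frame is presented as nine
    # subframes and complete rendered frames are to arrive at 60 Hz or more,
    # subframes must be output at 9 x 60 = 540 Hz.
    FLICKER_FUSION_HZ = 60
    SUBFRAMES_PER_FRAME = 9
    required_subframe_rate_hz = SUBFRAMES_PER_FRAME * FLICKER_FUSION_HZ  # 540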
As a result, an emissive micro-display can be configured to have fewer light emitters than the number of image pixels contained in each full-resolution rendered frame of virtual content. For example, a full-resolution image could include 2000×2000 pixels, while the emissive micro-display may be an array of only 1000×1000 elements. The use of lower-resolution emissive micro-displays can be particularly beneficial for wearable systems such as the display systems described herein. As an example, a lower-resolution display may be smaller, weigh less, and/or consume less power than a higher-resolution display.
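One possible way to divide a rendered frame among subframes is simple strided subsampling, sketched below for the 2000×2000 frame and 1000×1000 array example above; this is an assumed, non-limiting implementation, and the function name is illustrative.

    import numpy as np

    # Illustrative sketch: split a full-resolution frame into offset
    # partial-resolution subframes, one per geometric shift position.
    def divide_into_subframes(frame: np.ndarray, shifts_per_axis: int = 2):
        """Return subframes, each holding the pixel subset for one position."""
        return [
            frame[row::shifts_per_axis, col::shifts_per_axis]
            for row in range(shifts_per_axis)
            for col in range(shifts_per_axis)
        ]

    rendered = np.zeros((2000, 2000, 3), dtype=np.uint8)  # full-resolution frame
    subframes = divide_into_subframes(rendered)  # four 1000x1000 subframes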
While the above has described moving or adjusting the position of an emissive micro-display (including, for example, micro-LED arrays), it will be appreciated that the position of projection optics may alternatively, or additionally, be adjusted. For example, and as will be described in more detail below with respect to
It will be appreciated that certain portions of a rendered frame of virtual content may be more visually apparent to a user than other portions. For example, the user may have heightened visual acuity for portions of virtual content falling on the user's fovea (herein referred to as “foveal portions”). To determine the locations of these foveal portions, a display system may determine a fixation point at which a user is fixating. Portions of virtual content falling within a threshold angular distance of this fixation point may be identified as falling on a fovea of the user. As will be described, the display system may be configured to increase a resolution associated with foveal portions. The resolution of remaining portions may be increased less or not increased.
As an example, an emissive micro-display may be configured to update pixels included in a foveal portion at a greater rate than pixels included in other portions. As described above, geometric positions of light emitters may be translated, or adjusted, to tile an inter-emitter region of the array. Optionally, light emitters utilized to output light forming pixels included in a foveal region may be updated for a relatively high proportion of the different geometric positions (e.g., for each different geometric position), while light emitters forming pixels away from the foveal region may be updated for a lower proportion of the different geometric positions (e.g., these light emitters may be "off" or may simply present the same information as in a previous position). Such light emitters may be updated less; for example, they may be updated twice, or only once, for a given full-resolution rendered frame of virtual content. As a particular example, the light emitters for a foveal region may be updated for each of four different geometric positions within inter-emitter regions, while light emitters corresponding to peripheral parts of an image may be updated only for every other geometric position.
As a result, a rendered frame formed from rapidly displayed or projected subframes may have an effective resolution which varies across the rendered frame. With foveated imaging, the effective resolution of an emissive micro-display can be made high in foveal regions (e.g., regions of interest, regions in which a user is focused, regions designated by a user, regions designated by a designer, etc.) and can be made lower in other regions (e.g., outside regions of interest). Configuring the emissive micro-displays to provide foveated images may further help to conserve resources, for example by eliminating and/or reducing processing and power loads associated with displaying or projecting less interesting regions (e.g., regions unlikely to be the focus of users' attention).
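The foveated updating described above may be sketched as follows; the 10 degree threshold, the every-other-position peripheral cadence, and all names are assumptions chosen for illustration, not values prescribed by this disclosure.

    # Illustrative sketch: foveal pixels refresh at every geometric position,
    # while peripheral pixels refresh only at every other position (or may
    # simply repeat previous content, or remain off).
    FOVEAL_THRESHOLD_DEG = 10.0  # assumed angular radius around the fixation point

    def is_foveal(pixel_angle_deg: float, fixation_angle_deg: float) -> bool:
        """True if the pixel falls within a threshold angular distance of fixation."""
        return abs(pixel_angle_deg - fixation_angle_deg) < FOVEAL_THRESHOLD_DEG

    def should_update(foveal: bool, position_index: int) -> bool:
        return True if foveal else position_index % 2 == 0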
Example Display Systems with Emissive Micro-Displays
Advantageously, as noted herein, display systems utilizing emissive micro-displays as described herein may allow for a low-weight and compact form factor, and may also provide a high frame rate and low motion blur. Preferably, the micro-displays are emissive micro-displays, which provide advantages for high brightness and high pixel density. In some embodiments, the emissive micro-displays are micro-LED displays. In some other embodiments, the emissive micro-displays are micro-OLED displays. In some embodiments, the emissive micro-displays comprise arrays of light emitters having a pitch of, e.g., less than 10 μm, less than 8 μm, less than 6 μm, less than 5 μm, or less than 2 μm, including 1-5 μm, and an emitter size of 2 μm or less, 1.7 μm or less, or 1.3 μm or less. In some embodiments, the emitter size is within a range having an upper limit of the above-noted sizes and a lower limit of 1 μm. In some embodiments, the ratio of emitter size to pitch is 1:1 to 1:5, 1:2 to 1:4, or 1:2 to 1:3, which may have advantages for individual control of emitters and efficient utilization of emitted light by eyepieces, as discussed further herein.
In some embodiments, a plurality of emissive micro-displays may be utilized to form images for a head-mounted display system. The light containing the image information for forming these images may be referred to as image light. It will be appreciated that image light may vary in, e.g., wavelength, intensity, polarization, etc. The emissive micro-displays output image light to an eyepiece, which then relays the light to an eye of the user.
In some embodiments, the plurality of emissive micro-displays may be utilized and positioned at different sides of an optical combiner, e.g., an X-cube prism or dichroic X-cube. The X-cube prism receives light rays from different micro-displays on different faces of the cube and outputs the light rays from the same face of the cube. The outputted light may be directed towards projection optics, which is configured to converge or focus the image light onto the eyepiece.
In some embodiments, the plurality of emissive micro-displays comprises monochrome micro-displays, which are configured to output light of a single component color. Combining various component colors forms a full color image. In some other embodiments, one or more of the emissive micro-displays may have sub-pixels configured to emit light of two or more, but not all, component colors utilized by the display system. For example, a single emissive micro-display may have sub-pixels which emit light of the colors blue and green, while a separate emissive micro-display on a different face of the X-cube may have pixels configured to emit red light. In some embodiments, the plurality of micro-displays are each full-color displays comprising, e.g., pixels formed of multiple sub-pixels configured to emit light of different component colors. Advantageously, combining the light of multiple full-color micro-displays may increase display brightness and dynamic range.
It will be appreciated that the emissive micro-displays may comprise arrays of light emitters. The light emitters may emit light with a Lambertian angular emission profile. Undesirably, such an angular emission profile may "waste" light, since only a small portion of the emitted light may ultimately be incident on the eyepiece. In some embodiments, light collimators may be utilized to narrow the angular emission profile of light emitted by the light emitters. As used herein, a light collimator is an optical structure which narrows the angular emission profile of incident light; that is, the light collimator receives light from an associated light emitter with a relatively wide initial angular emission profile and outputs that light with a narrower angular emission profile than the wide initial angular emission profile. In some embodiments, the rays of light exiting the light collimator are more parallel than the rays of light received by the light collimator, before being transmitted through and exiting the collimator. Examples of light collimators include micro-lenses, nano-lenses, reflective wells, metasurfaces, and liquid crystal gratings. In some embodiments, the light collimators may be configured to steer light to ultimately converge on different laterally-shifted light-coupling optical elements. In some embodiments, each light emitter has a dedicated light collimator. The light collimators are preferably positioned directly adjacent or contacting the light emitters, to capture a large proportion of the light emitted by the associated light emitters.
In some embodiments, a single emissive micro-display may be utilized to direct light to the eyepiece. For example, the single emissive micro-display may be a full-color display comprising light emitters that emit light of different component colors. In some embodiments, the light emitters may form groups, which are localized in a common area, with each group comprising light emitters which emit light of each component color. In such embodiments, each group of light emitters may share a common micro-lens. Advantageously, light of different colors from different light emitters take a different path through the micro-lens, which may be manifested in light of different component colors being incident on different in-coupling optical elements of an eyepiece, as discussed herein.
In some embodiments, the full-color micro-display may comprise repeating groups of light emitters of the same component color. For instance, the micro-display may include rows of light emitters, with the light emitters of each individual row configured to emit light of the same color. Thus, different rows may emit light of different component colors. In addition, the micro-display may have an associated array of light collimators configured to direct light to a desired location on an eyepiece, e.g., to an associated in-coupling optical element. Advantageously, while the individual light emitters of such a full-color micro-display may not be positioned to form a high-quality full-color image, as viewed directly on the micro-display, the lens array appropriately steers the light from the light emitters to the eyepiece, which combines monochrome images formed by light emitters of different colors, thereby forming a high-quality full-color image.
In some embodiments, the eyepiece receiving image light from the micro-displays may comprise a waveguide assembly. The area of a waveguide of the waveguide assembly on which the image light is incident may include in-coupling optical elements which in-couple incident image light, such that the light propagates through the waveguide by total internal reflection (TIR). In some embodiments, the waveguide assembly may include a stack of waveguides, each of which has an associated in-coupling optical element. Different in-coupling optical elements may be configured to in-couple light of different colors, such that different waveguides may be configured to propagate light of different colors therein. The waveguides may include out-coupling optical elements, which out-couple light propagating therein, such that the out-coupled light propagates towards the eye of the user. In some other embodiments, the waveguide assembly may include a single waveguide having an associated in-coupling optical element configured to in-couple light of different component colors.
In some embodiments, the in-coupling optical elements are laterally shifted, as seen from the projection optics. Different in-coupling optical elements may be configured to in-couple light of different colors. Preferably, image light of different colors take different paths to the eyepiece and, thus, impinge upon different corresponding in-coupling optical elements.
In some other embodiments, other types of eyepieces or optics for relaying image light to the eyes of the user may be utilized. For example, as discussed herein, the eyepiece may include one or more waveguides which propagate image light therein by TIR. As another example, the eyepiece may include a birdbath combiner comprising a semitransparent mirror that both directs image light to a viewer and allows a view of the ambient environment.
In some embodiments, the eyepiece may be configured to selectively output light with different amounts of wavefront divergence, to provide virtual content at a plurality of virtual depth planes (also referred to simply as "depth planes" herein) perceived to be at different distances away from the user. For example, the eyepiece may comprise a plurality of waveguides each having out-coupling optical elements with different optical power to output light with different amounts of wavefront divergence. In some other embodiments, a variable focus element may be provided between the eyepiece and the user's eye. The variable focus element may be configured to dynamically change optical power to provide the desired wavefront divergence for particular virtual content. In some embodiments, as an alternative to, or in addition to, waveguide optical structures for providing optical power, the display systems may also include a plurality of lenses that provide, or add to, optical power.
In addition to the compact form factor and high frame rates discussed above, emissive micro-displays according to some embodiments may provide one or more of the following advantages. For example, the micro-displays may provide exceptionally small pixel pitches and high pixel density. The micro-displays may also provide high luminance and efficiency. For example, the light emitters of the emissive micro-displays may only consume power to emit light when the light emitters are needed to provide content with luminance. This is in contrast to other display technologies in which the light source may illuminate an entire panel of pixels, whether or not some of those pixels are dark. Further, it will be appreciated that the human visual system integrates received light over time, and the light emitters of emissive micro-displays, such as micro-LEDs, have advantageously high duty cycles (e.g., a short activation period for a light emitter to rise from an "off" to a full "on" state, and a correspondingly short time to fall from the "on" state to the "off" state, allow the light emitters to emit light at the "on" level for a large percentage of each cycle). As a result, the power used to generate an image with a given perceived brightness may be less as compared to conventional display technologies with lower duty cycles. In some embodiments, the duty cycle may be 70% or more, 80% or more, or 90% or more. In some embodiments, the duty cycle may be about 99%. In addition, as noted herein, micro-displays may facilitate exceptionally high frame rates, which may provide advantages including reducing mismatches between the position of a user's head and the displayed content.
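Because the visual system integrates received light over time, the time-averaged luminance of an emitter scales with its duty cycle; the following minimal sketch illustrates this relationship with assumed example values.

    # Illustrative sketch: perceived (time-averaged) luminance scales with the
    # fraction of each cycle the emitter spends at its "on" level.
    def average_luminance_nits(on_level_nits: float, duty_cycle: float) -> float:
        return on_level_nits * duty_cycle

    # An emitter at an assumed 1000 nits with a 99% duty cycle averages 990
    # nits; a 10% duty cycle display would need a 9900 nit "on" level to match.
    assert average_luminance_nits(1000.0, 0.99) == 990.0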
Reference will now be made to the drawings, in which like reference numerals refer to like parts throughout. Unless indicated otherwise, the drawings are schematic and not necessarily drawn to scale.
Generating a realistic and comfortable perception of depth is challenging, however. It will be appreciated that light from objects at different distances from the eyes has wavefronts with different amounts of divergence.
Without being limited by theory, it is believed that viewers of an object may perceive the object as being “three-dimensional” due to a combination of vergence and accommodation. As noted above, vergence movements (e.g., rotation of the eyes so that the pupils move toward or away from each other to converge the lines of sight of the eyes to fixate upon an object) of the two eyes relative to each other are closely associated with accommodation of the lenses of the eyes. Under normal conditions, changing the shapes of the lenses of the eyes to change focus from one object to another object at a different distance will automatically cause a matching change in vergence to the same distance, under a relationship known as the “accommodation-vergence reflex.” Likewise, a change in vergence will trigger a matching change in lens shape under normal conditions.
Undesirably, many users of conventional “3-D” display systems find such conventional systems to be uncomfortable or may not perceive a sense of depth at all due to a mismatch between accommodative and vergence states in these displays. As noted above, many stereoscopic or “3-D” display systems display a scene by providing slightly different images to each eye. Such systems are uncomfortable for many viewers, since they, among other things, simply provide different presentations of a scene and cause changes in the vergence states of the eyes, but without a corresponding change in the accommodative states of those eyes. Rather, the images are shown by a display at a fixed distance from the eyes, such that the eyes view all the image information at a single accommodative state. Such an arrangement works against the “accommodation-vergence reflex” by causing changes in the vergence state without a matching change in the accommodative state. This mismatch is believed to cause viewer discomfort. Display systems that provide a better match between accommodation and vergence may form more realistic and comfortable simulations of three-dimensional imagery.
Without being limited by theory, it is believed that the human eye typically may interpret a finite number of depth planes to provide depth perception. Consequently, a highly believable simulation of perceived depth may be achieved by providing, to the eye, different presentations of an image corresponding to each of these limited numbers of depth planes. In some embodiments, the different presentations may provide both cues to vergence and matching cues to accommodation, thereby providing physiologically correct accommodation-vergence matching.
In the illustrated embodiment, the distance, along the z-axis, of the depth plane 240 containing the point 221 is 1 m. As used herein, distances or depths along the z-axis may be measured with a zero-point located at the exit pupils of the user's eyes. Thus, a depth plane 240 located at a depth of 1 m corresponds to a distance of 1 m away from the exit pupils of the user's eyes, on the optical axis of those eyes with the eyes directed towards optical infinity. As an approximation, the depth or distance along the z-axis may be measured from the display in front of the user's eyes (e.g., from the surface of a waveguide), plus a value for the distance between the device and the exit pupils of the user's eyes. That value may be called the eye relief and corresponds to the distance between the exit pupil of the user's eye and the display worn by the user in front of the eye. In practice, the value for the eye relief may be a normalized value used generally for all viewers. For example, the eye relief may be assumed to be 20 mm and a depth plane that is at a depth of 1 m may be at a distance of 980 mm in front of the display.
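This bookkeeping may be illustrated with a minimal sketch, using the assumed normalized eye relief of 20 mm from the passage above.

    # Illustrative sketch: depth measured from the display, plus the eye relief,
    # equals depth measured from the exit pupils of the user's eyes.
    EYE_RELIEF_MM = 20  # assumed normalized eye relief

    def depth_from_display_mm(depth_from_exit_pupil_mm: int) -> int:
        return depth_from_exit_pupil_mm - EYE_RELIEF_MM

    assert depth_from_display_mm(1000) == 980  # the 1 m depth plane example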
It will be appreciated that each of the accommodative and vergence states of the eyes 210, 220 is associated with a particular distance on the z-axis. For example, an object at a particular distance from the eyes 210, 220 causes those eyes to assume particular accommodative states based upon the distance of the object. The distance associated with a particular accommodative state may be referred to as the accommodation distance, Ad. Similarly, there are particular vergence distances, Vd, associated with the eyes in particular vergence states, or positions relative to one another. Where the accommodation distance and the vergence distance match, the relationship between accommodation and vergence may be said to be physiologically correct. This is considered to be the most comfortable scenario for a viewer.
In stereoscopic displays, however, the accommodation distance and the vergence distance may not always match. For example, as illustrated in
In some embodiments, it will be appreciated that a reference point other than exit pupils of the eyes 210, 220 may be utilized for determining distance for determining accommodation-vergence mismatch, so long as the same reference point is utilized for the accommodation distance and the vergence distance. For example, the distances could be measured from the cornea to the depth plane, from the retina to the depth plane, from the eyepiece (e.g., a waveguide of the display device) to the depth plane, and so on.
Without being limited by theory, it is believed that users may still perceive accommodation-vergence mismatches of up to about 0.25 diopter, up to about 0.33 diopter, and up to about 0.5 diopter as being physiologically correct, without the mismatch itself causing significant discomfort. In some embodiments, display systems disclosed herein (e.g., the display system 250,
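Expressed as a worked example, the mismatch is the difference between the reciprocals of the two distances in meters; this is a minimal sketch, and the 0.8 m vergence distance is an assumed example value.

    # Illustrative sketch: accommodation-vergence mismatch in diopters
    # (reciprocal meters), compared against the tolerances noted above.
    def mismatch_diopters(accommodation_m: float, vergence_m: float) -> float:
        return abs(1.0 / accommodation_m - 1.0 / vergence_m)

    # Accommodation held at a 1.0 m depth plane while content verges at 0.8 m:
    mismatch = mismatch_diopters(1.0, 0.8)  # 0.25 diopter
    comfortable = mismatch <= 0.5           # within the largest tolerance above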
In some embodiments, a single waveguide may be configured to output light with a set amount of wavefront divergence corresponding to a single or limited number of depth planes and/or the waveguide may be configured to output light of a limited range of wavelengths. Consequently, in some embodiments, a plurality or stack of waveguides may be utilized to provide different amounts of wavefront divergence for different depth planes and/or to output light of different ranges of wavelengths. As used herein, it will be appreciated that a depth plane may be planar or may follow the contours of a curved surface.
In some embodiments, the display system 250 may be configured to provide substantially continuous cues to vergence and multiple discrete cues to accommodation. The cues to vergence may be provided by displaying different images to each of the eyes of the user, and the cues to accommodation may be provided by outputting the light that forms the images with selectable discrete amounts of wavefront divergence. Stated another way, the display system 250 may be configured to output light with variable levels of wavefront divergence. In some embodiments, each discrete level of wavefront divergence corresponds to a particular depth plane and may be provided by a particular one of the waveguides 270, 280, 290, 300, 310.
The plurality of lenses 320, 330, 340, 350 may be configured to send image information to the eye with various levels of wavefront curvature or light ray divergence. Each waveguide level may be associated with a particular depth plane and may be configured to output image information corresponding to that depth plane. Image injection devices 360, 370, 380, 390, 400 may function as a source of light for the waveguides and may be utilized to inject image information into the waveguides 270, 280, 290, 300, 310, each of which may be configured, as described herein, to distribute incoming light across each respective waveguide, for output toward the eye 210. Light exits an output surface 410, 420, 430, 440, 450 of the image injection devices 360, 370, 380, 390, 400 and is injected into a corresponding input surface 460, 470, 480, 490, 500 of the waveguides 270, 280, 290, 300, 310. In some embodiments, each of the input surfaces 460, 470, 480, 490, 500 may be an edge of a corresponding waveguide, or may be part of a major surface of the corresponding waveguide (that is, one of the waveguide surfaces directly facing the world 510 or the viewer's eye 210). In some embodiments, a single beam of light (e.g., a collimated beam) may be injected into each waveguide to output an entire field of cloned collimated beams that are directed toward the eye 210 at particular angles (and amounts of divergence) corresponding to the depth plane associated with a particular waveguide. In some embodiments, a single one of the image injection devices 360, 370, 380, 390, 400 may be associated with and inject light into a plurality (e.g., three) of the waveguides 270, 280, 290, 300, 310.
In some embodiments, the image injection devices 360, 370, 380, 390, 400 are discrete displays that each produce image information for injection into a corresponding waveguide 270, 280, 290, 300, 310, respectively. In some other embodiments, the image injection devices 360, 370, 380, 390, 400 are the output ends of a single multiplexed display which may, e.g., pipe image information via one or more optical conduits (such as fiber optic cables) to each of the image injection devices 360, 370, 380, 390, 400. It will be appreciated that the image information provided by the image injection devices 360, 370, 380, 390, 400 may include light of different wavelengths, or colors (e.g., different component colors, as discussed herein).
In some embodiments, the light injected into the waveguides 270, 280, 290, 300, 310 is provided by a light projection system 520, which comprises a light module 530, which may include a light emitter, such as a light emitting diode (LED). The light from the light module 530 may be directed to and modified by a light modulator 540, e.g., a spatial light modulator, via a beam splitter 550. The light modulator 540 may be configured to change the perceived intensity of the light injected into the waveguides 270, 280, 290, 300, 310 to encode the light with image information. Examples of spatial light modulators include liquid crystal displays (LCDs), including liquid crystal on silicon (LCoS) displays. In some other embodiments, the spatial light modulator may be a MEMS device, such as a digital light processing (DLP) device. It will be appreciated that the image injection devices 360, 370, 380, 390, 400 are illustrated schematically and, in some embodiments, these image injection devices may represent different light paths and locations in a common projection system configured to output light into associated ones of the waveguides 270, 280, 290, 300, 310. In some embodiments, the waveguides of the waveguide assembly 260 may function as an ideal lens while relaying light injected into the waveguides out to the user's eyes. In this conception, the object may be the spatial light modulator 540 and the image may be the image on the depth plane.
In some embodiments, the display system 250 may be a scanning fiber display comprising one or more scanning fibers configured to project light in various patterns (e.g., raster scan, spiral scan, Lissajous patterns, etc.) into one or more waveguides 270, 280, 290, 300, 310 and ultimately to the eye 210 of the viewer. In some embodiments, the illustrated image injection devices 360, 370, 380, 390, 400 may schematically represent a single scanning fiber or a bundle of scanning fibers configured to inject light into one or a plurality of the waveguides 270, 280, 290, 300, 310. In some other embodiments, the illustrated image injection devices 360, 370, 380, 390, 400 may schematically represent a plurality of scanning fibers or a plurality of bundles of scanning fibers, each of which is configured to inject light into an associated one of the waveguides 270, 280, 290, 300, 310. It will be appreciated that one or more optical fibers may be configured to transmit light from the light module 530 to the one or more waveguides 270, 280, 290, 300, 310. It will be appreciated that one or more intervening optical structures may be provided between the scanning fiber, or fibers, and the one or more waveguides 270, 280, 290, 300, 310 to, e.g., redirect light exiting the scanning fiber into the one or more waveguides 270, 280, 290, 300, 310.
A controller 560 controls the operation of the stacked waveguide assembly 260, including operation of the image injection devices 360, 370, 380, 390, 400, the light source 530, and the light modulator 540. In some embodiments, the controller 560 is part of the local data processing module 140. The controller 560 includes programming (e.g., instructions in a non-transitory medium) that regulates the timing and provision of image information to the waveguides 270, 280, 290, 300, 310 according to, e.g., any of the various schemes disclosed herein. In some embodiments, the controller may be a single integral device, or a distributed system connected by wired or wireless communication channels. The controller 560 may be part of the processing modules 140 or 150.
With continued reference to
With continued reference to
The other waveguide layers 300, 310 and lenses 330, 320 are similarly configured, with the highest waveguide 310 in the stack sending its output through all of the lenses between it and the eye for an aggregate focal power representative of the closest focal plane to the person. To compensate for the stack of lenses 320, 330, 340, 350 when viewing/interpreting light coming from the world 510 on the other side of the stacked waveguide assembly 260, a compensating lens layer 620 may be disposed at the top of the stack to compensate for the aggregate power of the lens stack 320, 330, 340, 350 below. Such a configuration provides as many perceived focal planes as there are available waveguide/lens pairings. Both the out-coupling optical elements of the waveguides and the focusing aspects of the lenses may be static (i.e., not dynamic or electro-active). In some alternative embodiments, either or both may be dynamic using electro-active features.
In some embodiments, two or more of the waveguides 270, 280, 290, 300, 310 may have the same associated depth plane. For example, multiple waveguides 270, 280, 290, 300, 310 may be configured to output images set to the same depth plane, or multiple subsets of the waveguides 270, 280, 290, 300, 310 may be configured to output images set to the same plurality of depth planes, with one set for each depth plane. This may provide advantages for forming a tiled image to provide an expanded field of view at those depth planes.
With continued reference to
In some embodiments, the out-coupling optical elements 570, 580, 590, 600, 610 are diffractive features that form a diffraction pattern, or “diffractive optical element” (also referred to herein as a “DOE”). Preferably, the DOE's have a sufficiently low diffraction efficiency so that only a portion of the light of the beam is deflected away toward the eye 210 with each intersection of the DOE, while the rest continues to move through a waveguide via TIR. The light carrying the image information is thus divided into a number of related exit beams that exit the waveguide at a multiplicity of locations and the result is a fairly uniform pattern of exit emission toward the eye 210 for this particular collimated beam bouncing around within a waveguide.
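By way of a non-limiting numerical sketch, the effect of a low per-intersection diffraction efficiency on the uniformity of the exit beams can be illustrated as follows; the efficiency values and intersection counts are assumptions chosen for illustration, not parameters of the disclosed DOEs:

```python
# Illustrative sketch (assumed efficiencies, not parameters of the disclosed
# DOEs) of how a low per-intersection diffraction efficiency yields a fairly
# uniform pattern of exit beams.

def exit_beam_energies(efficiency: float, num_intersections: int) -> list[float]:
    """Fraction of the in-coupled energy out-coupled toward the eye 210 at each
    DOE intersection; the remainder continues through the waveguide via TIR."""
    remaining = 1.0
    energies = []
    for _ in range(num_intersections):
        energies.append(remaining * efficiency)
        remaining *= 1.0 - efficiency
    return energies

print(exit_beam_energies(0.05, 5))  # ~[0.050, 0.048, 0.045, 0.043, 0.041] -> nearly uniform
print(exit_beam_energies(0.50, 5))  # [0.5, 0.25, 0.125, ...] -> rapid decay, non-uniform
```

As the sketch suggests, a lower efficiency trades per-beam intensity for uniformity across the multiplicity of exit locations.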
In some embodiments, one or more DOEs may be switchable between “on” states in which they actively diffract, and “off” states in which they do not significantly diffract. For instance, a switchable DOE may comprise a layer of polymer dispersed liquid crystal, in which microdroplets comprise a diffraction pattern in a host medium, and the refractive index of the microdroplets may be switched to substantially match the refractive index of the host material (in which case the pattern does not appreciably diffract incident light) or the microdroplet may be switched to an index that does not match that of the host medium (in which case the pattern actively diffracts incident light).
In some embodiments, a camera assembly 630 (e.g., a digital camera, including visible light and infrared light cameras) may be provided to capture images of the eye 210 and/or tissue around the eye 210 to, e.g., detect user inputs and/or to monitor the physiological state of the user. As used herein, a camera may be any image capture device. In some embodiments, the camera assembly 630 may include an image capture device and a light source to project light (e.g., infrared light) to the eye, which may then be reflected by the eye and detected by the image capture device. In some embodiments, the camera assembly 630 may be attached to the frame or support structure 80 (
The camera assembly 630 may, in some embodiments, observe movements of the user, such as the user's eye movements. As an example, the camera assembly 630 may capture images of the eye 210 to determine the size, position, and/or orientation of the pupil of the eye 210 (or some other structure of the eye 210). The camera assembly 630 may, if desired, obtain images (processed by processing circuitry of the type described herein) used to determine the direction the user is looking (e.g., eye pose or gaze direction). In some embodiments, the camera assembly 630 may include multiple cameras, at least one of which may be utilized for each eye, to determine the eye pose or gaze direction of each eye independently. The camera assembly 630 may, in some embodiments and in combination with processing circuitry such as the controller 560 or the local data processing module 140, determine eye pose or gaze direction based on glints (e.g., reflections) of light (e.g., infrared light) from a light source included in the camera assembly 630.
With reference now to
In some embodiments, a full color image may be formed at each depth plane by overlaying images in each of the component colors, e.g., three or more component colors.
In some embodiments, light of each component color may be outputted by a single dedicated waveguide and, consequently, each depth plane may have multiple waveguides associated with it. In such embodiments, each box in the figures including the letters G, R, or B may be understood to represent an individual waveguide, and three waveguides may be provided per depth plane where three component color images are provided per depth plane. While the waveguides associated with each depth plane are shown adjacent to one another in this drawing for ease of description, it will be appreciated that, in a physical device, the waveguides may all be arranged in a stack with one waveguide per level. In some other embodiments, multiple component colors may be outputted by the same waveguide, such that, e.g., only a single waveguide may be provided per depth plane.
With continued reference to
It will be appreciated that references to a given color of light throughout this disclosure will be understood to encompass light of one or more wavelengths within a range of wavelengths of light that are perceived by a viewer as being of that given color. For example, red light may include light of one or more wavelengths in the range of about 620-780 nm, green light may include light of one or more wavelengths in the range of about 492-577 nm, and blue light may include light of one or more wavelengths in the range of about 435-493 nm.
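For illustration only, the approximate ranges quoted above may be encoded as follows; the ranges are those given in this passage (note that adjacent ranges overlap slightly, e.g., around 492-493 nm), and the helper function is a hypothetical convenience, not part of the disclosure:

```python
# Approximate per-color wavelength ranges quoted in the passage above.
COLOR_RANGES_NM = {
    "red":   (620.0, 780.0),
    "green": (492.0, 577.0),
    "blue":  (435.0, 493.0),
}

def perceived_color(wavelength_nm: float):
    """Return the first component color whose range contains the wavelength."""
    for color, (lo, hi) in COLOR_RANGES_NM.items():
        if lo <= wavelength_nm <= hi:
            return color
    return None

print(perceived_color(460))  # 'blue'
print(perceived_color(530))  # 'green'
print(perceived_color(635))  # 'red'
```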
In some embodiments, the light source 530 (
With reference now to
The illustrated set 660 of stacked waveguides includes waveguides 670, 680, and 690. Each waveguide includes an associated in-coupling optical element (which may also be referred to as a light input area on the waveguide), with, e.g., in-coupling optical element 700 disposed on a major surface (e.g., an upper major surface) of waveguide 670, in-coupling optical element 710 disposed on a major surface (e.g., an upper major surface) of waveguide 680, and in-coupling optical element 720 disposed on a major surface (e.g., an upper major surface) of waveguide 690. In some embodiments, one or more of the in-coupling optical elements 700, 710, 720 may be disposed on the bottom major surface of the respective waveguide 670, 680, 690 (particularly where the one or more in-coupling optical elements are reflective, deflecting optical elements). As illustrated, the in-coupling optical elements 700, 710, 720 may be disposed on the upper major surface of their respective waveguide 670, 680, 690 (or the top of the next lower waveguide), particularly where those in-coupling optical elements are transmissive, deflecting optical elements. In some embodiments, the in-coupling optical elements 700, 710, 720 may be disposed in the body of the respective waveguide 670, 680, 690. In some embodiments, as discussed herein, the in-coupling optical elements 700, 710, 720 are wavelength selective, such that they selectively redirect one or more wavelengths of light, while transmitting other wavelengths of light. While illustrated on one side or corner of their respective waveguide 670, 680, 690, it will be appreciated that the in-coupling optical elements 700, 710, 720 may be disposed in other areas of their respective waveguide 670, 680, 690 in some embodiments.
As illustrated, the in-coupling optical elements 700, 710, 720 may be laterally offset from one another, as seen in the illustrated head-on view in a direction of light propagating to these in-coupling optical elements. In some embodiments, each in-coupling optical element may be offset such that it receives light without that light passing through another in-coupling optical element. For example, each in-coupling optical element 700, 710, 720 may be configured to receive light from a different image injection device 360, 370, 380, 390, and 400 as shown in
Each waveguide also includes associated light distributing elements, with, e.g., light distributing elements 730 disposed on a major surface (e.g., a top major surface) of waveguide 670, light distributing elements 740 disposed on a major surface (e.g., a top major surface) of waveguide 680, and light distributing elements 750 disposed on a major surface (e.g., a top major surface) of waveguide 690. In some other embodiments, the light distributing elements 730, 740, 750 may be disposed on a bottom major surface of associated waveguides 670, 680, 690, respectively. In some other embodiments, the light distributing elements 730, 740, 750 may be disposed on both top and bottom major surfaces of associated waveguides 670, 680, 690, respectively; or the light distributing elements 730, 740, 750 may be disposed on different ones of the top and bottom major surfaces in different associated waveguides 670, 680, 690, respectively.
The waveguides 670, 680, 690 may be spaced apart and separated by, e.g., gas, liquid, and/or solid layers of material. For example, as illustrated, layer 760a may separate waveguides 670 and 680; and layer 760b may separate waveguides 680 and 690. In some embodiments, the layers 760a and 760b are formed of low refractive index materials (that is, materials having a lower refractive index than the material forming the immediately adjacent one of waveguides 670, 680, 690). Preferably, the refractive index of the material forming the layers 760a, 760b is 0.05 or more, or 0.10 or more, less than the refractive index of the material forming the waveguides 670, 680, 690. Advantageously, the lower refractive index layers 760a, 760b may function as cladding layers that facilitate total internal reflection (TIR) of light through the waveguides 670, 680, 690 (e.g., TIR between the top and bottom major surfaces of each waveguide). In some embodiments, the layers 760a, 760b are formed of air. While not illustrated, it will be appreciated that the top and bottom of the illustrated set 660 of waveguides may include immediately neighboring cladding layers.
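The benefit of the index contrast can be sketched with Snell's law: light within a waveguide is totally internally reflected when it strikes the waveguide/cladding interface beyond the critical angle arcsin(n_clad/n_core). A minimal sketch follows, with illustrative (assumed) index values rather than values from this disclosure:

```python
import math

# Minimal sketch of the TIR condition enabled by lower-index cladding layers.
# The index values are illustrative assumptions consistent with the index
# contrasts described above.

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Incidence angle (from the surface normal) beyond which light is
    totally internally reflected at the waveguide/cladding interface."""
    if n_clad >= n_core:
        raise ValueError("TIR requires n_clad < n_core")
    return math.degrees(math.asin(n_clad / n_core))

print(f"{critical_angle_deg(1.50, 1.40):.1f} deg")  # ~69.0 deg (index contrast of 0.10)
print(f"{critical_angle_deg(1.50, 1.00):.1f} deg")  # ~41.8 deg (air cladding: the
                                                    # widest range of guided angles)
```

A lower-index cladding (air being the limiting case) yields a smaller critical angle and thus a wider range of angles over which light remains guided.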
Preferably, for ease of manufacturing and other considerations, the material forming the waveguides 670, 680, 690 is similar or the same, and the material forming the layers 760a, 760b is similar or the same. In some embodiments, the material forming the waveguides 670, 680, 690 may be different between one or more waveguides, and/or the material forming the layers 760a, 760b may be different, while still holding to the various refractive index relationships noted above.
With continued reference to
In some embodiments, the light rays 770, 780, 790 have different properties, e.g., different wavelengths or different ranges of wavelengths, which may correspond to different colors. The in-coupling optical elements 700, 710, 720 each deflect the incident light such that the light propagates through a respective one of the waveguides 670, 680, 690 by TIR. In some embodiments, the in-coupling optical elements 700, 710, 720 each selectively deflect one or more particular wavelengths of light, while transmitting other wavelengths to an underlying waveguide and associated in-coupling optical element.
For example, in-coupling optical element 700 may be configured to deflect ray 770, which has a first wavelength or range of wavelengths, while transmitting rays 780 and 790, which have different second and third wavelengths or ranges of wavelengths, respectively. The transmitted ray 780 impinges on and is deflected by the in-coupling optical element 710, which is configured to deflect light of a second wavelength or range of wavelengths. The ray 790 is deflected by the in-coupling optical element 720, which is configured to selectively deflect light of a third wavelength or range of wavelengths.
With continued reference to
With reference now to
In some embodiments, the light distributing elements 730, 740, 750 are orthogonal pupil expanders (OPE's). In some embodiments, the OPE's deflect or distribute light to the out-coupling optical elements 800, 810, 820 and, in some embodiments, may also increase the beam or spot size of this light as it propagates to the out-coupling optical elements. In some embodiments, the light distributing elements 730, 740, 750 may be omitted and the in-coupling optical elements 700, 710, 720 may be configured to deflect light directly to the out-coupling optical elements 800, 810, 820. For example, with reference to
Accordingly, with reference to
It will be appreciated that the spatially overlapping areas may have lateral overlap of 70% or more, 80% or more, or 90% or more of their areas, as seen in the top-down view. On the other hand, the laterally shifted areas may have less than 30% overlap, less than 20% overlap, or less than 10% overlap of their areas, as seen in the top-down view. In some embodiments, laterally shifted areas have no overlap.
With continued reference to
With continued reference to
With continued reference to
As noted herein, the separate light source 940 and associated lens structure 960 may undesirably add weight and size to the wearable display system. This may decrease the comfort of the display system, particularly for a user wearing the display system for an extended duration.
In addition, the light source 940 in conjunction with the SLM 930 may consume energy inefficiently. For example, the light source 940 may illuminate the entirety of the SLM 930. The SLM 930 then selectively reflects light towards the eyepiece 920. Thus, not all the light produced by the light source 940 may be utilized to form an image; some of this light, e.g., light corresponding to dark regions of an image, is not reflected to the eyepiece 920. As a result, the light source 940 utilizes energy to generate light to illuminate the entirety of the SLM 930, but only a fraction of this light may be needed to form some images.
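A rough, illustrative estimate of this inefficiency (assumed numbers, not measurements of any particular device) may make the point concrete:

```python
# Illustrative comparison (assumed numbers, not measurements): a reflective
# SLM is illuminated in full regardless of image content, while an emissive
# micro-display powers only the pixels that are lit.

def slm_light_utilization(mean_pixel_level: float) -> float:
    """Fraction of the illuminator's output that reaches the eyepiece,
    assuming reflection proportional to pixel level (0.0 to 1.0)."""
    return mean_pixel_level

# For a mostly dark AR scene with a 10% average pixel level, ~90% of the
# light generated to illuminate the entire SLM 930 is discarded.
print(f"utilized: {slm_light_utilization(0.10):.0%}")  # utilized: 10%
```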
Moreover, as noted herein, in some cases, the SLM 930 may modulate light using micro-mirrors to selectively reflect incident light, or using liquid crystal molecules that modify the amount of light reflected from an underlying mirror. As a result, such devices require physical movement of optical elements (e.g., micro-mirrors or liquid crystal molecules, as in DLP or LCoS panels, respectively) in order to modulate light from the light source 940. The physical movement required to modulate light to encode the light with image information, e.g., corresponding to a pixel, may occur at relatively slow speeds in comparison to, e.g., the ability to turn an LED or OLED “on” or “off”. This relatively slow movement may limit the frame rate of the display system and may be visible as, e.g., motion blur, color-breakup, and/or presented images that are mismatched with the pose of the user's head or changes in said pose.
Advantageously, wearable displays utilizing emissive micro-displays, as disclosed herein, may facilitate wearable display systems that have a relatively low weight and bulkiness, high energy efficiency, and high frame rate, with low motion blur and low motion-to-photon latency. Low blur and low motion-to-photon latency are further discussed in U.S. Provisional Application No. 62/786,199, filed Dec. 28, 2018, the entire disclosure of which is incorporated by reference herein. In addition, in comparison to scanning fiber displays, the emissive micro-displays may avoid artifacts caused by the use of coherent light sources.
With reference now to
In some embodiments, the micro-displays 1030a, 1030b, 1030c may be monochrome micro-displays, with each monochrome micro-display outputting light of a different component color to provide a monochrome image. As discussed herein, the monochrome images combine to form a full-color image.
In some other embodiments, the micro-displays 1030a, 1030b, 1030c may each be full-color displays configured to output light of all component colors. For example, the micro-displays 1030a, 1030b, 1030c may each include red, green, and blue light emitters. The micro-displays 1030a, 1030b, 1030c may be identical and may display the same image. However, utilizing multiple micro-displays may provide advantages for increasing the brightness and the dynamic range of the image, by combining the light from the multiple micro-displays to form a single image. In some embodiments, two or more (e.g., three) micro-displays may be utilized, with the optical combiner 1050 configured to combine light from all of these micro-displays.
The micro-displays may comprise an array of light emitters. Examples of light emitters include organic light-emitting diodes (OLEDs) and micro-light-emitting diodes (micro-LEDs). It will be appreciated that OLEDs utilize organic material to emit light and micro-LEDs utilize inorganic material to emit light. Advantageously, some micro-LEDs provide higher luminance and higher efficiency (in terms of lux/W) than OLEDs. In some embodiments, the micro-displays are preferably emissive micro-LED displays.
With continued reference to
In some embodiments, the eyepiece 1020 may comprise a plurality of stacked waveguides 1020a, 1020b, 1020c, each of which has a respective in-coupling optical element 1022a, 1022b, 1022c. In some embodiments, the number of waveguides is proportional to the number of component colors provided by the micro-displays 1030a, 1030b, 1030c. For example, where there are three component colors, the eyepiece 1020 may include a single set of three waveguides or multiple sets of three waveguides each. In some embodiments, each set may output light with wavefront divergence corresponding to a particular depth plane, as discussed herein. It will be appreciated that the waveguides 1020a, 1020b, 1020c and the in-coupling optical elements 1022a, 1022b, 1022c may correspond to the waveguides 670, 680, 690 and the in-coupling optical elements 700, 710, 720, respectively, of
As illustrated, the various in-coupling optical elements disclosed herein (e.g., the in-coupling optical element 1022a, 1022b, 1022c) may be disposed on a major surface of an associated waveguide (e.g., waveguides 1020a, 1020b, 1020c, respectively). In addition, as also illustrated, the major surface on which a given in-coupling optical element is disposed may be the rear surface of the waveguide. In such a configuration, the in-coupling optical element may be a reflective light redirecting element, which in-couples light by reflecting the light at angles which support TIR through the associated waveguide. In some other configurations, the in-coupling optical element may be disposed on the forward surface of the waveguide (closer to the projection optics 1070 than the rearward surface). In such configurations, the in-coupling optical element may be a transmissive light redirecting element, which in-couples light by changing the direction of propagation of light as the light is transmitted through the in-coupling optical element. It will be appreciated that any of the in-coupling optical elements disclosed herein may be reflective or transmissive in-coupling optical elements.
With continued reference to
With continued reference to
With continued reference to
As discussed herein, the perception of a full color image by a user may be achieved with time division multiplexing in some embodiments. For example, different ones of the emissive micro-displays 1030a, 1030b, 1030c may be activated at different times to generate different component color images. In such embodiments, the different component color images that form a single full color image may be sequentially displayed sufficiently quickly that the human visual system does not perceive the component color images as being displayed at different times; that is, the different component color images that form a single full color image may all be displayed within a duration that is sufficiently short that the user perceives the component color images as being simultaneously presented, rather than being temporally separated. For example, it will be appreciated that the human visual system may have a flicker fusion threshold. The flicker fusion threshold may be understood to be a duration within which the human visual system is unable to differentiate images as being presented at different times. Images presented within that duration are fused or combined and, as a result, may be perceived by a user to be present simultaneously. Flickering images with temporal gaps between the images that are outside of that duration are not combined, and the flickering of the images is perceptible. In some embodiments, the duration is 1/60 second or less, which corresponds to a frame rate of 60 Hz or more. Preferably, image frames for any individual eye are provided to the user at a frame rate satisfying the flicker fusion threshold of the user, e.g., 60 Hz or more. For example, the frame rate for each of the left-eye and right-eye pieces may be 60 Hz or more, or 120 Hz or more; and, as a result, the frame rate provided by the light projection system 1010 may be 120 Hz or more, or 240 Hz or more, in some embodiments.
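The frame-rate arithmetic above can be summarized in a short sketch; the 60 Hz and 120 Hz per-eye figures are those quoted in this passage, while the function and the generic multiplexing factor (covering, e.g., sequentially presented subframes or color fields) are illustrative assumptions:

```python
# Arithmetic sketch of the frame-rate requirement for time division
# multiplexing. Per-eye rates are from the passage above; the multiplexing
# factor is a generic illustrative assumption.

def projector_frame_rate(per_eye_hz: float, multiplex_factor: int = 2) -> float:
    """Rate the light projection system 1010 must sustain when each displayed
    frame is built from sequentially presented parts (e.g., subframes or
    component color fields); a factor of 2 reproduces the quoted figures."""
    return per_eye_hz * multiplex_factor

print(projector_frame_rate(60))      # 120.0 Hz, matching the text
print(projector_frame_rate(120))     # 240.0 Hz, matching the text
print(projector_frame_rate(60, 3))   # 180.0 Hz if, e.g., three color fields are sequential
```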
It will be appreciated that time division multiplexing may advantageously reduce the computational load on processors (e.g., graphics processors) utilized to form displayed images. In some other embodiments, such as where sufficient computational resources are available, all component color images that form a full color image may be displayed simultaneously by the micro-displays 1030a, 1030b, 1030c.
As discussed herein, the micro-displays 1030a, 1030b, 1030c may each include arrays of light emitters.
Where the associated micro-display is a full-color micro-display, different ones of the light emitters 1044 may be configured to emit light of different colors. In such embodiments, the light emitters 1044 may be considered subpixels and may be arranged in groups, with each group having at least one light emitter configured to emit light of each component color. For example, where the component colors are red, green, and blue, each group may have at least one red subpixel, at least one green subpixel, and at least one blue subpixel.
It will be appreciated that, while the light emitters 1044 are shown arranged in a grid pattern for ease of illustration, the light emitters 1044 may have other regularly repeating spatial arrangements. For example, the number of light emitters of different component colors may vary, the sizes of the light emitters may vary, the shapes of the light emitters and/or the shapes made out by groups of light emitters may vary, etc.
With continued reference to
It will be appreciated that, given some light emitter device architectures and materials, current crowding may decrease the emitter's efficiency and pixel droop may cause unintentional activation of pixels (e.g., due to energy directed to one light emitter bleeding into a neighboring light emitter). As a result, a relatively large area 1045 may beneficially reduce current crowding and pixel droop. In some embodiments, the ratio of emitter size to pitch is preferably 1:2 to 1:4, or 1:2 to 1:3.
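As a rough illustration of what these ratios imply for emissive area, assuming square emitters on a square grid (an assumption made only for this sketch):

```python
# Rough illustration (assumed square emitters on a square grid) of the
# emissive fill factor implied by the emitter-size-to-pitch ratios above.

def fill_factor(emitter_width: float, pitch: float) -> float:
    """Fraction of the array area that actually emits light."""
    return (emitter_width / pitch) ** 2

for label, (w, p) in [("1:2", (1.0, 2.0)), ("1:3", (1.0, 3.0)), ("1:4", (1.0, 4.0))]:
    print(f"{label} -> fill factor {fill_factor(w, p):.1%}")
# 1:2 -> 25.0%, 1:3 -> 11.1%, 1:4 -> 6.2%: the large non-emissive area 1045
# eases current crowding and pixel droop, but motivates the gap-filling
# lens structures discussed next.
```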
It will also be appreciated, however, that large separations between light emitters (e.g., a small light emitter to pitch ratio) may undesirably cause visible gaps, or dark regions, between the light emitters. Even when laterally translated as discussed herein, some gaps may still be visible, depending on the size of the original gap, the distance of the translation, and the number of subframes (and resulting translation increments) utilized. In some embodiments, lens structures such as light collimators may be utilized to effectively fill or partially fill in these dark regions. For example, a light collimating lens may extend on and around a light emitter 1044, such that light from the emitter 1044 completely fills the lens. The light collimating lens may have a larger width than the light emitters 1044 and, in some embodiments, the width of the collimating lens may be approximately equal to the pitch. As a result, the size of the emitter 1044 is effectively increased to extend across the area of the lens, thereby filling in some or all of the area 1045. In some other embodiments, the width of the collimating lens may be approximately equal to the distance that the projection system is translated, as discussed herein, for each subframe. Lens structures such as light collimators are further discussed herein (e.g., in
As discussed herein, the light emitters 1044 may be OLEDs or micro-LEDs. It will be appreciated that OLEDs may utilize layers of organic material, e.g., disposed between electrodes, to emit light. Micro-LEDs may utilize inorganic materials, e.g., Group III-V materials such as GaAs, GaN, and/or GaIn for light emission. Examples of GaN materials include InGaN, which may be used to form blue or green light emitters in some embodiments. Examples of GaIn materials include AlGaInP, which may be used to form red light emitters in some embodiments. In some embodiments, the light emitters 1044 may emit light of an initial color, which may be converted to other desired colors using phosphor materials or quantum dots. For example, the light emitter may emit blue light which excites a phosphor material or quantum dot that converts the blue wavelength light to green or red wavelengths.
With reference now to
In some embodiments, the light redirecting structures 1080a, 1080c may be lens structures. It will be appreciated that the lens structures may be configured to receive incident light and to redirect the incident light at an angle such that the light reflects off a corresponding one of the reflective surfaces 1052, 1054 and propagates along a light path towards a corresponding one of the in-coupling optical elements 1022a, 1022c. As examples, the light redirecting structures 1080a, 1080c may comprise micro-lenses, nano-lenses, reflective wells, metasurfaces, and liquid crystal gratings. In some embodiments, the micro-lenses, nano-lenses, reflective wells, metasurfaces, and liquid crystal gratings may be organized in arrays. For example, each light emitter of the micro-displays 1030a, 1030c may be matched with one micro-lens. In some embodiments, in order to redirect light in a particular direction, the micro-lens or reflective wells may be asymmetrical and/or the light emitters may be disposed off-center relative to the micro-lens. In addition, in some embodiments, the light redirecting structures 1080a, 1080c may be collimators which narrow the angular emission profiles of associated light emitters, to increase the amount of light ultimately in-coupled into the eyepiece 1020. Further details regarding such light redirecting structures 1080a, 1080c are discussed below regarding
With reference now to
As illustrated, differences between the paths for the image light 1032b and image light 1032a, 1032c may be established using light redirecting structures 1080a, 1080c. In some embodiments, the image light 1032b from the emissive micro-display 1030b proceeds directly through the optical combiner 1050. The image light 1032a from the emissive micro-display 1030a is redirected by the light redirecting structure 1080a such that it reflects off of the reflective surface 1054 and propagates out of the optical combiner 1050 in the same direction as the image light 1032c. It will be appreciated that the image light 1032c from the emissive micro-display 1030c is redirected by the light redirecting structure 1080c such that it reflects off of the reflective surface 1052 at an angle such that the image light 1032c propagates out of the optical combiner 1050 in the same direction as the image light 1032a. Thus, the redirection of light by the light redirecting structures 1080a, 1080c and the angles of the reflective surfaces 1052, 1054 are configured to provide a common path for the image light 1032a, 1032c out of the optical combiner 1050, with this common path being different from the path of the image light 1032b. In some other embodiments, one or both of the light redirecting structures 1080a, 1080c may be omitted and the reflective surfaces 1052, 1054 in the optical combiner 1050 may be configured to reflect the image light 1032a, 1032c in the appropriate respective directions such that they exit the optical combiner 1050 propagating in the same direction, which is different from the direction of the image light 1032b. As such, after propagating through the projection optics 1070, the image light 1032a, 1032c exit from one exit pupil while the image light 1032b exits from another exit pupil. In this configuration, the light projection system 1010 may be referred to as a two-pupil projection system.
In some embodiments, the light projection system 1010 may have a single output pupil and may be referred to as a single-pupil projection system. In such embodiments, the light projection system 1010 may be configured to direct the image light 1032a, 1032b, 1032c onto a single common area of the eyepiece 1020. Such a configuration is shown in
As discussed herein, in some embodiments, the emissive micro-displays 1030a, 1030b, 1030c may be monochrome micro-displays configured to emit light of different colors. In some embodiments, one or more of the emissive micro-displays 1030a, 1030b, 1030c may have groups of light emitters configured to emit light of two or more, but not all, component colors. For example, a single emissive micro-display may have groups of light emitters—with at least one light emitter per group configured to emit blue light and at least one light emitter per group configured to emit green light—and a separate emissive micro-display on a different face of the X-cube 1050 may have light emitters configured to emit red light. In some other embodiments, the emissive micro-displays 1030a, 1030b, 1030c may each be full-color displays, each having light emitters of all component colors. As noted herein, utilizing multiple similar micro-displays may provide advantages for dynamic range and increased display brightness.
In some embodiments, a single full-color emissive micro-display may be utilized.
As discussed above, the in-coupling optical elements of the eyepiece 1020 may assume various configurations. Some examples of configurations for the eyepiece 1020 are discussed below in relation to
With continued reference to
As discussed herein, the in-coupling optical element 1022c is preferably configured to in-couple substantially all the incident light 1032c corresponding to the first color image into the associated waveguide 1020c while allowing substantially all the incident light 1032b, 1032a corresponding to the second color image and the third color image, respectively, to be transmitted without being in-coupled. Similarly, the in-coupling optical element 1022b is preferably configured to in-couple substantially all the incident image light 1032b corresponding to the second color image into the associated waveguide 1020b while allowing substantially all the incident light corresponding to the third color image to be transmitted without being in-coupled.
It will be appreciated that, in practice, the various in-coupling optical elements may not have perfect selectivity. For example, some of the image light 1032b, 1032a may undesirably be in-coupled into the waveguide 1020c by the in-coupling optical element 1022c; and some of the incident image light 1032a may undesirably be in-coupled into the waveguide 1020b by the in-coupling optical element 1022b. Furthermore, some of the image light 1032c may be transmitted through the in-coupling optical element 1022c and in-coupled into waveguides 1020b and/or 1020a by the in-coupling optical elements 1022b and/or 1022a, respectively. Similarly, some of the image light 1032b may be transmitted through the in-coupling optical element 1022b and in-coupled into waveguide 1020a by the in-coupling optical element 1022a.
In-coupling image light for a color image into an unintended waveguide may cause undesirable optical effects, such as, for example, cross-talk and/or ghosting. For example, in-coupling of the image light 1032c for the first color image into the unintended waveguides 1020b and/or 1020a may result in undesirable cross-talk between the first color image, the second color image, and/or the third color image; and/or may result in undesirable ghosting. As another example, in-coupling of the image light 1032b, 1032a for the second or third color image, respectively, into the unintended waveguide 1020c may result in undesirable cross-talk between the first color image, the second color image, and/or the third color image; and/or may cause undesirable ghosting. In some embodiments, these undesirable optical effects may be mitigated by providing color filters (e.g., absorptive color filters) that reduce the amount of incident light in-coupled into an unintended waveguide.
With continued reference to
In some embodiments, the color filters 1026 on each major surface of the waveguide 1020c are similar and are configured to absorb light of the wavelengths of both image light 1032a, 1032b. In some other embodiments, the color filter 1026 on one major surface of the waveguide 1020c may be configured to absorb light of the color of image light 1032a, and the color filter on the other major surface may be configured to absorb light of the color of image light 1032b. In either arrangement, the color filters 1026 may be configured to selectively absorb the image light 1032a, 1032b propagating through the waveguide 1020c by total internal reflection. For example, at TIR bounces of the image light 1032a, 1032b off the major surfaces of the waveguide 1020c, the image light 1032a, 1032b contacts a color filter 1026 on those major surfaces and a portion of that image light is absorbed. Preferably, due to the selective absorption of image light 1032a, 1032b by the color filters 1026, the propagation of the in-coupled image light 1032c via TIR through the waveguide 1020c is not appreciably affected.
Similarly, the plurality of color filters 1028 may be configured as absorption filters that absorb in-coupled image light 1032a that propagates through the waveguide 1020b by total internal reflection. At TIR bounces of the image light 1032a off the major surfaces of the waveguide 1020b, the image light 1032a contacts a color filter 1028 on those major surfaces and a portion of that image light is absorbed. Preferably, the absorption of the image light 1032a is selective and does not affect the propagation of the in-coupled image light 1032b that is also propagating via TIR through the waveguide 1020b.
With continued reference to
In some embodiments, the color filters 1026 and 1028 may have single-pass attenuation factors of less than about 10% (e.g., less than or equal to about 5%, less than or equal to about 2%, and greater than about 1%) to avoid significant undesired absorption of light propagating through the thickness of the waveguides 1020c, 1020b (e.g., light of the colors of the image light 1032a, 1032b propagating through the waveguides 1020c, 1020b from the ambient environment and/or other waveguides). Various embodiments of the color filters 1024c and 1024b may be configured to have low attenuation factors for the wavelengths that are to be transmitted and high attenuation factors for the wavelengths that are to be absorbed. For example, in some embodiments, the color filter 1024c may be configured to transmit greater than 80%, greater than 90%, or greater than 95% of incident light having the colors of the image light 1032a, 1032b and absorb greater than 80%, greater than 90%, or greater than 95% of incident light having the color of the image light 1032c. Similarly, the color filter 1024b may be configured to transmit greater than 80%, greater than 90%, or greater than 95% of incident light having the color of the image light 1032a and absorb greater than 80%, greater than 90%, or greater than 95% of incident light having the color of the image light 1032b.
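The asymmetry that makes such weak filters effective can be sketched numerically: stray in-coupled light contacts a filter at every TIR bounce, while ambient light crosses the waveguide thickness only once. The per-contact attenuation and bounce count below are illustrative assumptions within the ranges quoted above:

```python
# Sketch of why a low single-pass attenuation still suppresses stray light.
# The 5% per-contact value and the bounce count are illustrative assumptions.

def remaining_fraction(single_pass_attenuation: float, num_contacts: int) -> float:
    """Fraction of light remaining after repeated filter contacts."""
    return (1.0 - single_pass_attenuation) ** num_contacts

print(f"{remaining_fraction(0.05, 1):.1%}")   # 95.0% -- one pass barely dims ambient light
print(f"{remaining_fraction(0.05, 50):.1%}")  # ~7.7% -- stray TIR light is largely absorbed
```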
In some embodiments, the color filters 1026, 1028, 1024c, 1024b may comprise a layer of color selective absorbing material deposited on one or both surfaces of the waveguides 1020c, 1020b, and/or 1020a. The color selective absorbing material may comprise a dye, an ink, or other light absorbing materials such as metals, semiconductors, and dielectrics. In some embodiments, the absorption of materials such as metals, semiconductors, and dielectrics may be made color selective by utilizing these materials to form subwavelength gratings (e.g., gratings that do not diffract the light). The gratings may be made of plasmonic materials (e.g., gold, silver, and aluminum) or semiconductors (e.g., silicon, amorphous silicon, and germanium).
The color selective material may be deposited on the substrate using various deposition methods. For example, the color selective absorbing material may be deposited on the substrate using jet deposition technology (e.g., ink-jet deposition). Ink-jet deposition may facilitate depositing thin layers of the color selective absorbing material. Because ink-jet deposition allows the deposition to be localized on selected areas of the substrate, it provides a high degree of control over the thicknesses and compositions of the layers of the color selective absorbing material, including providing for nonuniform thicknesses and/or compositions across the substrate. In some embodiments, the color selective absorbing material deposited using ink-jet deposition may have a thickness between about 10 nm and about 1 micron (e.g., between about 10 nm and about 50 nm, between about 25 nm and about 75 nm, between about 40 nm and about 100 nm, between about 80 nm and about 300 nm, between about 200 nm and about 500 nm, between about 400 nm and about 800 nm, between about 500 nm and about 1 micron, or any value in a range/sub-range defined by any of these values). Controlling the thickness of the deposited layer of the color selective absorbing material may be advantageous in achieving a color filter having a desired attenuation factor. Furthermore, layers having different thicknesses may be deposited in different portions of the substrate. Additionally, different compositions of the color selective absorbing material may be deposited in different portions of the substrate using ink-jet deposition. Such variations in composition and/or thickness may advantageously allow for location-specific variations in absorption. For example, in areas of a waveguide in which transmission of light from the ambient environment (to allow the viewer to see the ambient environment) is not necessary, the composition and/or thickness may be selected to provide high absorption or attenuation of selected wavelengths of light. Other deposition methods such as coating, spin-coating, spraying, etc. may be employed to deposit the color selective absorbing material on the substrate.
While
With continued reference to
The in-coupling optical element 1022c is preferably configured to in-couple substantially all the incident light 1032c into the associated waveguide 1020c while being transmissive to substantially all the incident light 1032a. On the other hand, the image light 1032b may propagate to the in-coupling optical element 1022b without needing to propagate through any other in-coupling optical elements. This may be advantageous in some embodiments by allowing light, to which the eye is more sensitive, to be incident on a desired in-coupling optical element without any loss or distortion associated with propagation through other in-coupling optical elements. Without being limited by theory, in some embodiments, the image light 1032b is green light, to which the human eye is more sensitive. It will be appreciated that, while the waveguides 1020a, 1020b, 1020c are illustrated arranged in a particular order, in some embodiments, the order of the waveguides 1020a, 1020b, 1020c may differ.
It will be appreciated that, as discussed herein, the in-coupling optical element 1022c overlying the in-coupling optical element 1022a may not have perfect selectivity. Some of the image light 1032a may undesirably be in-coupled into the waveguide 1020c by the in-coupling optical element 1022c; and some of the image light 1032c may be transmitted through the in-coupling optical element 1022c, after which the image light 1032c may strike the in-coupling optical element 1022a and be in-coupled into the waveguide 1020a. As discussed herein, such undesired in-coupling may be visible as ghosting or crosstalk.
To mitigate unintentionally in-coupled image light 1032a propagating through the waveguide 1020c, absorptive color filters 1026 may be provided on one or both major surfaces of the waveguide 1020c. The absorptive color filters 1026 may be configured to absorb light of the color of the unintentionally in-coupled image light 1032a. As illustrated, the absorptive color filters 1026 are disposed in the general direction of propagation of the image light through the waveguide 1020c. Thus, the absorptive color filters 1026 are configured to absorb image light 1032a as that light propagates through the waveguide 1020c by TIR and contacts the absorptive color filters 1026 while reflecting off one or both of the major surfaces of the waveguide 1020c.
With continued reference to
It will also be appreciated that in the embodiments illustrated in
With reference now to
Without being limited by theory, it will be appreciated that the in-coupling optical element 1022a may behave symmetrically; that is, it may redirect incident light such that the incident light propagates through the waveguide at TIR angles. However, light that is incident on the diffractive optical elements at TIR angles (such as upon re-bounce) may also be out-coupled. In addition or alternatively, in embodiments where the in-coupling optical element 1022a is coated with a reflective material, it will be understood that the reflection of light off of a layer of material such as metal may also involve partial absorption of the incident light, since reflection may involve the absorption and emission of light from a material. As a result, light out-coupling and/or absorption may undesirably cause loss of in-coupled light. Accordingly, re-bounced light may incur significant losses, as compared with light that interacts only once with the in-coupling optical element 1022a.
In some embodiments, the in-coupling optical elements are configured to mitigate in-coupled image light loss due to re-bounce. Generally, re-bounce of in-coupled light occurs towards the end 1023 of the in-coupling optical element 1022a in the propagation direction 1033 of the in-coupled light. For example, light in-coupled at the end of the in-coupling optical element 1022a opposite the end 1023 may re-bounce if the spacing 1034 for that light is sufficiently short. To avoid such re-bounce, in some embodiments, the in-coupling optical element 1022a is truncated at the propagation direction end 1023, to reduce the width 1022w of the in-coupling optical element 1022a along which re-bounce is likely to occur. In some embodiments, the truncation may be a complete truncation of all structures of the in-coupling optical element 1022a (e.g., the metallization and diffractive gratings). In some other embodiments, for example, where the in-coupling optical element 1022a comprises a metalized diffraction grating, a portion of the in-coupling optical element 1022a at the propagation direction end 1023 may not be metalized, such that the propagation direction end 1023 of the in-coupling optical element 1022a absorbs less re-bouncing light and/or out-couples re-bouncing light with a lower efficiency. In some embodiments, a diffractive region of the in-coupling optical element 1022a may have a width along the propagation direction 1033 shorter than its length perpendicular to the propagation direction 1033, and/or may be sized and shaped such that a first portion of the image light 1032a is incident on the in-coupling optical element 1022a and a second portion of the beam of light impinges on the waveguide 1030a without being incident on the in-coupling optical element 1022a. While the waveguide 1030a and the in-coupling optical element 1022a are illustrated alone for clarity, it will be appreciated that re-bounce and the strategies discussed for reducing re-bounce may apply to any of the in-coupling optical elements disclosed herein. It will also be appreciated that the spacing 1034 is related to the thickness of the waveguide 1030a (a larger thickness results in a larger spacing 1034). In some embodiments, the thickness of individual waveguides may be selected to set the spacing 1034 such that re-bounce does not occur. Further details regarding re-bounce mitigation may be found in U.S. Provisional Application No. 62/702,707, filed on Jul. 24, 2018, the entire disclosure of which is incorporated by reference herein.
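The geometry underlying the spacing 1034 can be sketched as follows. The relation spacing ≈ 2·t·tan(θ), for a waveguide of thickness t and a TIR propagation angle θ measured from the surface normal, is standard ray geometry rather than a value from this disclosure, and the numbers below are illustrative assumptions:

```python
import math

# Sketch of the re-bounce geometry: in-coupled light traveling at angle theta
# (from the surface normal) returns to the in-coupling surface after a lateral
# spacing of about 2 * t * tan(theta). All numbers are illustrative.

def rebounce_spacing_um(thickness_um: float, tir_angle_deg: float) -> float:
    """Lateral distance between successive bounces on the in-coupling surface."""
    return 2.0 * thickness_um * math.tan(math.radians(tir_angle_deg))

spacing = rebounce_spacing_um(thickness_um=500.0, tir_angle_deg=60.0)
print(f"{spacing:.0f} um")
# ~1732 um: an in-coupling element wider than this along the propagation
# direction 1033 risks re-bounce loss, which truncation (or a thicker
# waveguide, increasing the spacing 1034) can mitigate.
```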
It will be appreciated that in the waveguide assemblies of
The waveguide assemblies of
With reference now to
In some embodiments, one strategy for capturing more of the light emitted by the light emitters 1044 is to increase the size of the projection optics 1070, thereby increasing the numerical aperture with which the projection optics 1070 capture light. In addition or alternatively, the projection optics 1070 may be formed with high refractive index materials (e.g., having refractive indices above 1.5), which may also facilitate light collection. In some embodiments, the projection optics 1070 may utilize a lens sized to capture a desired, high proportion of the light emitted by the light emitters 1044. In some embodiments, the projection optics 1070 may be configured to have an elongated exit pupil, e.g., to emit light beams having a cross-sectional profile similar to the shapes of the in-coupling optical elements 1022a, 1022b, 1022c of
In some embodiments, one or more light collimators may be utilized to reduce or narrow the angular emission profile of light from the light emitters 1044. As a result, more of the light emitted by the light emitters 1044 may be captured by the projection optics 1070 and relayed to the eyes of a user, advantageously increasing the brightness of images and the efficiency of the display system. In some embodiments, the light collimators may allow the light collection efficiency of the projection optics (the percentage of light emitted by the light emitters 1044 that is captured by the projection optics) to reach values of 80% or more, 85% or more, or 90% or more, including about 85-95% or 85-90%. In addition, the angular emission profile of the light from the light emitters 1044 may be reduced to 60° or less, 50° or less, or 40° or less (from, e.g., 180°). In some embodiments, the reduced angular emission profiles may be in the range of about 30-60°, 30-50°, or 30-40°. It will be appreciated that light from a light emitter 1044 may make out the shape of a cone, with the light emitter 1044 at the vertex of the cone. The angular emission profile refers to the angle made out by the sides of the cone, with the associated light emitter 1044 at the vertex of the angle (as seen in a cross-section taken along a plane extending through the middle of the cone and including the cone apex).
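For illustration, the numerical aperture the projection optics would need to accept a given emission cone may be estimated as NA = n·sin(θ/2), where θ is the full cone angle as defined above; the sketch below assumes operation in air (n = 1), an assumption made only for this example:

```python
import math

# Illustrative estimate of the numerical aperture the projection optics 1070
# would need to accept a given emission cone, assuming operation in air (n = 1).
# Cone angles are full angles, as defined in the passage above.

def required_na(full_cone_angle_deg: float, n: float = 1.0) -> float:
    """NA needed to accept a cone whose full angle is full_cone_angle_deg."""
    return n * math.sin(math.radians(full_cone_angle_deg / 2.0))

print(f"{required_na(180):.2f}")  # 1.00 -- an uncollimated hemispherical emission
print(f"{required_na(60):.2f}")   # 0.50 -- after collimation to a 60 deg cone
print(f"{required_na(40):.2f}")   # 0.34 -- after collimation to a 40 deg cone
```

As the sketch suggests, narrowing the emission cone sharply relaxes the aperture demanded of the projection optics, consistent with the collection efficiencies noted above.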
In some embodiments, the light collimators 1302 and array 1300 may be part of the light redirecting structures 1080a, 1080c of
Preferably, the light collimators 1302 are positioned in close proximity to the light emitters 1044 to capture a large proportion of the light outputted by the light emitters 1044. In some embodiments, there may be a gap between the light collimators 1302 and the light emitters 1044. In some other embodiments, the light collimators 1302 may be in contact with the light emitters 1044. It will be appreciated that the angular emission profile 1046 may make out a wide cone of light. Preferably, the entirety or majority of a cone of light from a light emitter 1044 is incident on a single associated light collimator 1302. Thus, in some embodiments, each light emitter 1044 is smaller (occupies a smaller area) than the light receiving face of an associated light collimator 1302. In some embodiments, each light emitter 1044 has a smaller width than the spacing between neighboring light emitters 1044.
Advantageously, the light collimators 1302 may increase the efficiency of the utilization of light and may also reduce the occurrence of crosstalk between neighboring light emitters 1044. It will be appreciated that crosstalk between light emitters 1044 may occur when light from a neighboring light emitter is captured by a light collimator 1302 not associated with that neighboring light emitter. That captured light may be propagated to the user's eye, thereby providing erroneous image information for a given pixel.
With reference to
It will be appreciated that the light collimators 1302 may take various forms. For example, the light collimators 1302 may be micro-lenses or lenslets, in some embodiments. As discussed herein, each micro-lens preferably has a width greater than the width of an associated light emitter 1044. The micro-lenses may be formed of curved transparent material, such as glass or polymers, including photoresist and resins such as epoxy. In some embodiments, the light collimators 1302 may be nano-lenses, e.g., diffractive optical gratings. In some embodiments, the light collimators 1302 may be metasurfaces and/or liquid crystal gratings. In some embodiments, the light collimators 1302 may take the form of reflective wells.
It will be appreciated that different light collimators 1302 may have different dimensions and/or shapes depending upon the wavelengths or colors of light emitted by the associated light emitter 1044. Thus, for full-color emissive micro-displays, the array 1300 may include a plurality of light collimators 1302 with different dimensions and/or shapes depending upon the color of light emitted by the associated light emitter 1044. In embodiments where the emissive micro-display is a monochrome micro-display, the array 1300 may be simplified, with each of the light collimators 1302 in the array being configured to redirect light of the same color. With such monochrome micro-displays, the light collimators 1302 may be similar across the array 1300 in some embodiments.
With continued reference to
As noted above, the light collimators 1302 may take the form of reflective wells.
With reference now to
With continued reference to
The reflective walls 1303 may be formed in the substrate 1301 by various methods. For example, the walls 1303 may be formed in a desired shape by machining the substrate 1301, or otherwise removing material to define the walls 1303. In some other embodiments, the walls 1303 may be formed as the substrate 1301 is formed. For example, the walls 1303 may be molded into the substrate 1301 as the substrate 1301 is molded into its desired shape. In some other embodiments, the walls 1303 may be defined by rearrangement of material after formation of the substrate 1301. For example, the walls 1303 may be defined by imprinting.
Once the contours of the walls 1303 are formed, they may undergo further processing to form surfaces having the desired degree of reflection. In some embodiments, the surface of the substrate 1301 may itself be reflective, e.g., where the substrate 1301 is formed of a reflective metal. In such cases, the further processing may include smoothing or polishing the interior surfaces of the walls 1303 to increase their reflectivity. In some other embodiments, the interior surfaces of the walls 1303 may be lined with a reflective coating, e.g., by a vapor deposition process. For example, the reflective layer may be formed by physical vapor deposition (PVD) or chemical vapor deposition (CVD).
It will be appreciated that the location of a light emitter relative to an associated light collimator may influence the direction of emitted light out of the light collimator. This is illustrated, for example, in
With continued reference to
With reference now to
With reference now to
With reference to
As noted herein, the light collimator 1302 may also take the form of a nano-lens.
With continued reference to
The illustrated grating structure may be formed by various methods. For example, the substrate 1308 may be etched or nano-imprinted to define trenches, and the trenches may be filled with material of a different refractive index from the substrate 1308 to form the grating features 1306.
Advantageously, nano-lens arrays may provide various benefits. For example, the light collection efficiencies of the nano-lenslets may be large, e.g., 80-95%, including 85-90%, with excellent reductions in angular emission profiles, e.g., reductions to 30-40° (from 180°). In addition, low levels of cross-talk may be achieved, since each of the nano-lens light collimators 1302 may have physical dimensions and properties (e.g., pitch, depth, the refractive indices of materials forming the features 1306 and substrate 1308) selected to act on light of particular colors and possibly particular angles of incidence, while preferably providing high extinction ratios (for wavelengths of light of other colors). In addition, the nano-lens arrays may have flat profiles (e.g., be formed on a flat substrate), which may facilitate integration with micro-displays that may be flat panels, and may also facilitate manufacturing and provide high reproducibility and precision in forming the nano-lens array. For example, highly reproducible trench formation and deposition processes may be used to form each nano-lens. Moreover, these processes allow for variations between the nano-lenses of an array with greater ease and reproducibility than is typically achievable when forming curved lenses with similar variations.
With reference now to
In some embodiments, some rows or columns may be repeated to increase the number of light emitters of a particular component color. For example, light emitters of some component colors may occupy multiple rows or columns. This may facilitate color balancing and/or may be utilized to address differential aging or reductions in light emission intensity over time.
With reference to
With continued reference to
In contrast, it will be appreciated that full-color micro-displays typically include sub-pixels of each component color, with the sub-pixels arranged in particular, relatively closely-packed spatial orientations in groups, with these groups reproduced across an array. Each group of sub-pixels may form a pixel in an image. In some cases, the sub-pixels are elongated along an axis, and rows or columns of sub-pixels of the same component color extend along that same axis. It will be appreciated that such an arrangement allows the sub-pixels of each group to be located close together, which may have benefits for image quality and pixel density. In the illustrated arrangement of
With reference to
It will be appreciated that the light collimators 1302 may be utilized to direct light along different light paths to form multi-pupil projections systems. For example, the light collimators 1302 may direct light of different component colors to two or three areas, respectively, for light in-coupling.
The emissive micro-display 1030 includes an array of light emitters 1044, which may be subdivided into monochrome light emitters 1044a, 1044b, 1044c, which emit the image light 1032a, 1032b, 1032c, respectively. It will be appreciated that the light emitters 1044 emit image light with a broad angular emission profile 1046. The image light propagates through the array 1300 of light collimators, which reduces the angular emission profile to the narrowed angular emission profile 1047.
In addition, the array 1300 of light collimators is configured to redirect the image light (image light 1032a, 1032b, 1032c) such that the image light is incident on the projection optics 1070 at angles which cause the projection optics 1070 to output the image light such that the image light propagates to the appropriate in-coupling optical element 1022a, 1022b, 1022c. For example, the array 1300 of light collimators is preferably configured to: direct the image light 1032a such that it propagates through the projection optics 1070 and is incident on the in-coupling optical element 1022a; direct the image light 1032b such that it propagates through the projection optics 1070 and is incident on the in-coupling optical element 1022b; and direct the image light 1032c such that it propagates through the projection optics 1070 and is incident on the in-coupling optical element 1022c.
Since different light emitters 1044 may emit light of different wavelengths and may need to be redirected into different directions to reach the appropriate in-coupling optical element, in some embodiments, the light collimators associated with different light emitters 1044 may have different physical parameters (e.g., different pitches, different widths, etc.). Advantageously, the use of flat nano-lenses as light collimators facilitates the formation of light collimators which vary in physical properties across the array 1300 of light collimators. As noted herein, the nano-lenses may be formed using patterning and deposition processes, which facilitates the formation of structures with different pitches, widths, etc. across a substrate.
With reference again to
With reference now to
With continued reference to
As illustrated, the micro-display 1030b may include an array 1042 of light emitters 1044, each surrounded by non-light-emitting areas 1045 having a total width 1045w. In addition, the light emitters 1044 have a width W and a pitch P. In arrays in which the light emitters 1044 are regularly spaced, each light emitter 1044 and surrounding area 1045 effectively forms a unit cell having the width 1045w, which may be equal to the pitch P.
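By way of a brief illustrative aside, the relationship between the emitter width W, the pitch P, and the fraction of the array area that emits light can be made concrete with a minimal sketch (assuming square emitters on a regular grid; the function below is hypothetical and not part of the disclosed system):

```python
def fill_factor(emitter_width_um: float, pitch_um: float) -> float:
    """Fraction of the array area that emits light, for square emitters
    of width W on a regular grid of pitch P (each unit cell is P x P)."""
    return (emitter_width_um / pitch_um) ** 2

# Illustrative values only: 0.8 um emitters on a 2.5 um pitch give a
# fill factor of ~10%, so ~90% of each unit cell is non-emitting area 1045.
print(fill_factor(0.8, 2.5))  # ~0.1024
```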
In some embodiments, the light collimators 1302 are micro-lenses disposed directly on and surrounding associated light emitters 1044. In some embodiments, the width of the micro-lenses 1302 is equal to 1045w, such that neighboring micro-lenses 1302 nearly contact or directly contact one another. It will be appreciated that light from the light emitters 1044 may fill the associated micro-lens 1302, effectively magnifying the area encompassed by the light emitter 1044. Advantageously, such a configuration reduces the perceptibility of the areas 1045, which do not emit light and may otherwise be visible as dark spaces to a user. Indeed, because each micro-lens 1302 effectively magnifies the associated light emitter 1044 such that it extends across the entire area of the micro-lens 1302, the areas 1045 may be effectively masked.
With continued reference to
With reference now to
Each micro-display 1030a, 1030b, 1030c may have an associated array 1300a, 1300b, 1300c, respectively, of light collimators. The light collimators narrow the angular emission profile of light 1032a, 1032b, 1032c from light emitters of the associated micro-display. In some embodiments, individual light emitters have a dedicated associated light collimator (as shown in
With continued reference to
With continued reference to
As discussed herein, the wearable display system incorporating micro-displays is preferably configured to output light with different amounts of wavefront divergence, to provide comfortable accommodation-vergence matching for the user. These different amounts of wavefront divergence may be achieved using out-coupling optical elements with different optical powers. As discussed herein, the out-coupling optical elements may be present on or in waveguides of an eyepiece such as the eyepiece 1020 (e.g.,
In some embodiments, the variable focus lens elements 1530, 1540 may be disposed on either side of the waveguide structure 1032. The variable focus lens elements 1530, 1540 may be in the path of image light from the waveguide structure 1032 to the eye 210, and also in the path of light from the ambient environment through the waveguide structure 1032 to the eye 210. The variable focus optical element 1530 may modulate the wavefront divergence of image light outputted by the waveguide structure 1032 to the eye 210. It will be appreciated that the variable focus optical element 1530 may have optical power which may distort the eye 210's view of the world. Consequently, in some embodiments, a second variable focus optical element 1540 may be provided on the world side of the waveguide structure 1032. The second variable focus optical element 1540 may provide optical power opposite to that of the variable focus optical element 1530 (or opposite to the net optical power of the optical element 1530 and the waveguide structure 1032, where the waveguide structure 1032 has optical power), so that the net optical power of the variable focus lens elements 1530, 1540 and the waveguide structure 1032 is substantially zero.
Preferably, the optical power of the variable focus lens elements 1530, 1540 may be dynamically altered, for example, by applying an electrical signal thereto. In some embodiments, the variable focus lens elements 1530, 1540 may comprise a transmissive optical element such as a dynamic lens (e.g., a liquid crystal lens, an electro-active lens, a conventional refractive lens with moving elements, a mechanical-deformation-based lens, an electrowetting lens, an elastomeric lens, or a plurality of fluids with different refractive indices). By altering the variable focus lens elements' shape, refractive index, or other characteristics, the wavefront of incident light may be changed. In some embodiments, the variable focus lens elements 1530, 1540 may comprise a layer of liquid crystal sandwiched between two substrates. The substrates may comprise an optically transmissive material such as glass, plastic, acrylic, etc.
In some embodiments, in addition or as alternative to providing variable amounts of wavefront divergence for placing virtual content on different depth planes, the variable focus lens elements 1530, 1540 and waveguide structure 1032 may advantageously provide a net optical power equal to the user's prescription optical power for corrective lenses. Thus, the eyepiece 1020 may serve as a substitute for lenses used to correct for refractive errors, including myopia, hyperopia, presbyopia, and astigmatism. Further details regarding the use of variable focus lens elements as substitutes for corrective lenses may be found in U.S. application Ser. No. 15/481,255, filed Apr. 6, 2017, the entire disclosure of which is incorporated by reference herein.
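To make the optical power bookkeeping of the preceding paragraphs concrete, the following is a minimal sketch (with hypothetical names, under the simplifying assumption that thin-lens powers add linearly) of choosing the world-side lens power:

```python
def world_side_power(eye_side_power_d: float,
                     waveguide_power_d: float = 0.0,
                     target_net_power_d: float = 0.0) -> float:
    """Return the power (diopters) for the world-side variable focus
    element 1540 so that element 1530, the waveguide structure, and
    element 1540 sum to the target net power: 0 for an undistorted view
    of the world, or a nonzero value matching a user's prescription."""
    return target_net_power_d - (eye_side_power_d + waveguide_power_d)

# Element 1530 set to -2.0 D (e.g., placing virtual content at ~0.5 m):
print(world_side_power(-2.0))             # +2.0 D -> net zero power
print(world_side_power(-2.0, 0.0, -1.5))  # net -1.5 D for a myopic user
```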
With reference now to
With continued reference to
The lens element 1534 modifies a wavefront divergence of light outputted by the waveguide structure 1034 to the eye 210. It will be appreciated that the light from the waveguide structure 1034 also passes through the lens element 1532. Thus, the wavefront divergence of light outputted by the waveguide structure 1034 is modified by both the lens element 1534 and the lens element 1532 (and the waveguide structure 1032 in cases where the waveguide structure 1032 has optical power). In some embodiments, the lens elements 1532, 1534 and the waveguide structure 1032 provide a particular net optical power for light outputted from the waveguide structure 1034.
The illustrated embodiment provides two different levels of wavefront divergence, one for light outputted from the waveguide structure 1032 and a second for light outputted by the waveguide structure 1034. As a result, virtual objects may be placed on two different depth planes, corresponding to the different levels of wavefront divergence. In some embodiments, an additional level of wavefront divergence and, thus, an additional depth plane may be provided by adding an additional waveguide structure between the lens element 1532 and the eye 210, with an additional lens element between the additional waveguide structure and the eye 210. Further levels of wavefront divergence may similarly be added, by adding further waveguide structures and lens elements.
With continued reference to
Example Light Projection Systems Having Emissive Micro-Displays Providing Enhanced Resolution
As described above, a display system (e.g., a wearable display system presenting AR or VR content) may utilize one or more emissive micro-displays to reduce the size, mass, and/or power consumption relative to systems utilizing various other display technologies. For example, the display system may optionally utilize a threshold number of emissive micro-displays (e.g., three displays each including an array of light emitters, such as micro-LEDs). In this example, each emissive micro-display may be configured to generate light of a particular component color. The generated light may be combined to provide the appearance of a full color image, as discussed herein. Various examples in which multiple emissive micro-displays are utilized are discussed above, and also discussed below with reference to
Utilizing the one or more emissive micro-displays, the display system described herein may be configured to output AR or VR content (“virtual content”) at a greater resolution than a resolution directly corresponding to the number of light emitters included in the emissive micro-displays. For example, the display system may utilize one or more actuators to cause movement of or adjustment to one or more parts of a light projection system configured to output light forming virtual content to a user. For example, the actuators may adjust geometric positions associated with the light emitters. As an example, and as illustrated in
The adjustment described above may be leveraged to cause geometric positions of light emitters to assume positions located in inter-emitter regions of the arrays. As described above, an inter-emitter region (e.g., region 1045 illustrated in
With continued reference to
In some embodiments, the positions of the light emitters of the array 1042, as seen by a user at a first point in time, may be shifted at a second point in time to locations originally in the inter-emitter region 1045, to thereby display pixels corresponding to those locations in an image. Thus, a high-resolution image frame may be broken up into lower resolution subframes, with a first subframe having pixels at locations corresponding to a first position of the light emitters, a second subframe having pixels at locations corresponding to a second position of the light emitters, a third subframe having pixels at locations corresponding to a third position of the light emitters, and so on. Thus, the positions of the light emitters, as seen by the user, may be adjusted in position to effectively tile (e.g., substantially tile) the subframes of the high-resolution image. It will be appreciated that the subframes and the high-resolution image frame occupy substantially the same area (e.g., are substantially the same physical size), as perceived by a user. For example, the subframes are preferably 90%, 95%, 99%, or 100% of the size of the high-resolution image frame, except that they have lower pixel density than the high-resolution image frame.
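This decomposition can be pictured with a minimal sketch (a NumPy-based illustration with hypothetical function names; the actual system operates on emitted light rather than stored arrays), in which an N×N set of lower-resolution subframes is obtained by offset subsampling of the high-resolution frame:

```python
import numpy as np

def split_into_subframes(frame: np.ndarray, n: int) -> list[np.ndarray]:
    """Break a high-resolution frame into n*n lower-resolution subframes.
    Subframe (i, j) holds the pixels covered by the emitter array after
    it is shifted i steps vertically and j steps horizontally; together
    the subframes tile the full frame."""
    return [frame[i::n, j::n] for i in range(n) for j in range(n)]

frame = np.arange(36).reshape(6, 6)   # toy 6x6 "high-resolution" frame
subframes = split_into_subframes(frame, 3)
assert len(subframes) == 9            # 3x3 shift positions -> 9 subframes
assert subframes[0].shape == (2, 2)   # each subframe has lower pixel density
```

Note that each subframe spans the same spatial extent as the full frame, consistent with the subframes being substantially the same physical size as the high-resolution image frame.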
With continued reference to
With continued reference to
In some embodiments, the perceived positions of the array 1042 of light emitters may be updated in a substantially continuous movement. For example, the position of emitter 1044 may be shifted continuously along an x direction until output of pixel 1044c. The display system (e.g., one or more processors or processing elements) described herein may determine an extent to which the position of emitter 1044 has been shifted during this continual movement. The display system may be configured to determine a time at which to output light corresponding to a new pixel. For example, the display system may identify that the position of emitter 1044 has reached a distance corresponding to pixel 1044b. The display system may then cause the emitter 1044 to output light based on an image value associated with pixel 1044b in the second subframe. Utilizing such continual adjustment of the geometric positions may reduce jerkiness associated with shifting the geometric positions. In some other embodiments, the geometric positions may be shifted in discrete steps. For example, emitter 1044 may output light corresponding to pixel 1044a. The geometric position of emitter 1044 may then be shifted in a discrete step and paused to output light corresponding to pixel 1044b. Other light emitters of the array 1042 may similarly be shifted along with the light emitter 1044. For example, the light emitter 1044′ may be shifted in discrete steps to provide the pixels 1044a′, 1044b′, and 1044c′ at different ones of the discrete steps.
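For the continuous-movement case, the timing logic may be pictured as follows (a minimal sketch under the assumption of a constant sweep speed; the names are hypothetical):

```python
def emission_times_ms(pixel_pitch_um: float, sweep_speed_um_per_ms: float,
                      num_pixels: int) -> list[float]:
    """Times (ms) at which a continuously swept emitter's displacement
    crosses each successive pixel position, i.e., when it should be
    driven with the image value of the next subframe's pixel."""
    return [k * pixel_pitch_um / sweep_speed_um_per_ms
            for k in range(num_pixels)]

# Emitter swept at 1 um/ms across pixels 0.8 um apart (pixels 1044a-1044c):
print(emission_times_ms(0.8, 1.0, 3))  # [0.0, 0.8, 1.6]
```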
In some embodiments, the number of subframes N may be determined or limited by the physical properties of the array 1042. An example property may include a maximum framerate (e.g., N is preferably not so large that the subframes fail to merge together in the user's visual system; all N subframes are preferably displayed over a time duration that is less than the flicker fusion threshold of the user, e.g., less than 1/60 of a second). Additional example properties may include the emitter pitch Λ, the emitter size p, and so on. As described above, the number of subframes N may be determined based on the number of positions in which an emitter of the array 1042 may fit within an inter-emitter region 1045. In the example of
In some embodiments, N may be determined based on computing a floor of the emitter pitch Λ divided by the emitter size p. The computed floor may represent the number of times an emitter may be adjusted or moved in each direction. For example, if Λ=2.5 micron and p=0.8 micron, then N would equal 3. Thus, there may be 9 subframes (e.g., 3×3). It will be appreciated that this determination may be adjusted depending on whether the emitter pitch Λ and/or the emitter size p varies along the x and y directions. For example, there may be an emitter pitch ΛX along the x direction and a different emitter pitch ΛY along the y direction. In this example, N may vary based on direction, and the number of subframes may be determined as NX×NY.
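The determination above may be summarized in a short sketch (hypothetical function names; the 1/60-second figure is the example flicker fusion threshold noted above):

```python
import math

def subframe_counts(pitch_x_um: float, pitch_y_um: float,
                    emitter_size_um: float) -> tuple[int, int, int]:
    """Shift positions per axis (the floor of pitch / emitter size) and
    the resulting total subframe count Nx * Ny."""
    nx = math.floor(pitch_x_um / emitter_size_um)
    ny = math.floor(pitch_y_um / emitter_size_um)
    return nx, ny, nx * ny

nx, ny, total = subframe_counts(2.5, 2.5, 0.8)   # example values from above
print(nx, ny, total)                             # 3 3 9

# All subframes preferably fit within the flicker fusion threshold, so each
# subframe gets at most (1/60 s) / total of display time:
print((1 / 60) / total * 1e3, "ms")              # ~1.85 ms per subframe
```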
Example Emissive Micro-Display for Forming Foveated Images
Another potentially desirable feature in VR, AR, and MR applications is foveated imaging (also referred to simply as foveation), in which the resolution of a displayed image varies across that image. In particular, VR, AR, and MR systems may include eye tracking systems that determine where users are looking. Given the limitations of the human visual system, which generally detects fewer details in portions of the field of view away from a user's fixation point, presenting full resolution content (e.g., content at the full-rendered resolution) at the periphery of users' vision may be undesirable. The peripheral full resolution content may consume excessive processing power in rendering and may have excessive power consumption when displayed by the display system. In other words, significant benefits may be achieved in terms of reduced processing loads and display power consumption by reducing the resolution of content away from a user's fixation point and by delivering the highest resolution image content only to the part of the viewing field that the user is looking at, for example, at and immediately adjacent the fixation point. It will be appreciated that the fixation point corresponds to the portion of the field of view that is focused onto the fovea of the user's eye; thus, the eye has relatively high sensitivity to detail in this portion of the field of view.
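One simple way to express such a policy is sketched below (purely illustrative; the 5-degree foveal radius, the inverse falloff, and the 0.25 floor are assumptions rather than disclosed values):

```python
def resolution_scale(eccentricity_deg: float,
                     foveal_radius_deg: float = 5.0,
                     min_scale: float = 0.25) -> float:
    """Toy foveation policy: full resolution within a few degrees of the
    fixation point, falling off inversely with angular distance outside
    the foveal region. Returns a factor in (0, 1] by which rendered
    resolution is scaled."""
    if eccentricity_deg <= foveal_radius_deg:
        return 1.0
    return max(min_scale, foveal_radius_deg / eccentricity_deg)

print(resolution_scale(2.0))    # 1.0  -> full resolution near fixation
print(resolution_scale(20.0))   # 0.25 -> reduced resolution at periphery
```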
Foveation may also be based on factors other than fixation point. As an example, content creators may specify that certain content, such as text, be displayed at full resolution even if the user is looking away from the content. As another example, content creators may specify that foveation should only be active under certain conditions. As yet other examples, foveation may be a user selectable setting or may be automatically enabled as a result of a low battery condition. In at least some embodiments, foveation may conserve display resources (data, pixels, bandwidth, computation) by delivering the highest resolution only to portions of an image in the part of the field of view that the user is fixating on (e.g., represented by foveal region A in
To provide a high resolution for the foveal region 1132, while maintaining a lower resolution for the second region 1134, the light emitters 1044 of the array 1042 (
The second region 1134 is illustrated in
With reference to
In each of
Example Movements of Emissive Micro-Display and/or Display Optics
As discussed herein, the positions of displayed pixels may be shifted by shifting the physical positions of parts of a light projection system, such as, for example, light emitters 1044 (
In some embodiments, these shifts may be made in discrete steps. For example, the light emitters and/or projection optics may be stationary or substantially stationary while they emit light to form pixels of an individual subframe. The positions of the light emitters and/or projection optics may then be shifted between the presentation of different subframes.
In some embodiments, the light emitters and/or projection optics may be continuously moved between subframes with or without a reduction in velocity while the light emitters are displaying or projecting an individual subframe. Such continuous movement may advantageously be simpler to implement than precisely starting and stopping the movement of the light emitters and/or projection optics in small steps. In either case, the result remains that the relatively low-resolution and low-fill-factor array 1042 (
In some embodiments, the movements may be made through use of two actuators. For example, a first actuator may adjust movement in a first direction (e.g., an x-direction) and a second actuator may adjust movement in a second direction (e.g., a y-direction). In some embodiments, the first actuator and second actuator may operate in quadrature (e.g., with a 90-degree phase shift relative to each other). As an example, the first actuator may perform a cosine motion while the second actuator performs a sine motion, with the two motions combining to define a circle. Consequently, in some embodiments, the various actuators (e.g., actuators 1504, 1504a-c, etc.) herein may be understood to be an aggregate structure encompassing two constituent actuators, each providing movement along a particular axis.
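The quadrature drive may be illustrated as follows (a minimal sketch with hypothetical names; the amplitude and period are arbitrary example values):

```python
import math

def actuator_positions(amplitude_um: float, period_ms: float,
                       t_ms: float) -> tuple[float, float]:
    """Two single-axis actuators driven in quadrature (90 degrees out of
    phase): x follows a cosine and y a sine, so the combined displacement
    traces a circle of radius amplitude_um."""
    phase = 2 * math.pi * t_ms / period_ms
    return amplitude_um * math.cos(phase), amplitude_um * math.sin(phase)

for t in (0.0, 4.0, 8.0, 12.0):        # quarter-period samples
    x, y = actuator_positions(1.0, 16.0, t)
    print(round(x, 3), round(y, 3))    # (1,0), (0,1), (-1,0), (0,-1)
```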
With continued reference to
In some embodiments, with continued reference to
With reference now to
As shown in
Thus, the techniques described herein for enhancing the resolution of an emissive micro-display may be accomplished via displacements of the projection optics or other optical component between the emissive micro-display and the user. Moreover, and as described above with respect to
Example Emissive Micro-Display Systems
As discussed herein, the various parts of a light projection system may be moved to provide the desired shifting of the positions of displayed pixels, and this movement may be achieved using actuators mechanically connected to the parts to be moved.
With continued reference to
The light projection system 1010 may utilize monochrome emissive micro-displays 1030a, 1030b, 1030c, each configured to output a different component color. An optical combiner 1050, such as a dichroic x-cube, may redirect the light emitted from the emissive micro-displays 1030a-1030c to the projection optics 1070 as described above.
In some embodiments, the projection optics 1070 is configured to receive image light from the emissive micro-displays 1030a-1030c, and the actuator 1504 is configured to move the projection optics 1070, which causes the image light outputted by the light projection system 1500 to shift. Thus, the pixels presented by an array may be perceived to be adjusted in location, for example, to tile subframes across an inter-emitter region as described herein; the emissive micro-displays 1030a-1030c may output light corresponding to a plurality of subframes. These subframes may be presented in rapid succession (within the flicker fusion threshold), such that a user may perceive them as being present simultaneously in a full resolution frame of virtual content.
In some embodiments, one or more of the emissive micro-displays 1030a-1030c are independently moveable relative to others of the emissive micro-displays, the optical combiner 1050, and the projection optics 1070. In such embodiments, each independently moveable micro-display may have an associated independently moveable actuator 1504.
With reference again to
In some embodiments, actuators 1504 or 1504a-1504c may continually move the mechanically coupled part of the light projection system 1500, for example according to the movement patterns illustrated in
In some embodiments, as discussed herein, time division multiplexing may be utilized for the micro-displays 1030a, 1030b, 1030c. For example, different ones of the emissive micro-displays 1030a, 1030b, 1030c, may be activated at different times to generate different component color images.
In some embodiments, the actuators 1504a-1504c may be moved to complete at least one movement loop (e.g., a loop of the movement paths 1300-1306,
While the eyepiece 1020 is illustrated in
Additionally, as discussed herein, a single micro-display may emit light of two or more (e.g., all) component colors (e.g., emit red, green, and blue light). For example,
Preferably, each component color for a given pixel is emitted from an overlapping area of the array 3742, which may advantageously facilitate the shifting described herein for providing different pixels of an image. For example, the light emitters 3744 may be understood to each include a stack of constituent light generators, with each constituent light generator being configured to emit light of a different associated component color. The micro-display 1030b may, in some embodiments, include coaxial red, green, and blue stacked constituent light generators.
Advantageously, with continued reference to
While
As described above, in some embodiments where different micro-displays are utilized to generate light of different component colors, the projection system may use an optical combiner to combine the separately generated light of different colors. For example, an x-cube may be employed to combine light from different micro-displays 1030a-1030c (
In some embodiments, and as illustrated in
It will be appreciated that in embodiments in which an optical combiner 1050 is not used, several example benefits may be achieved. As an example, there may be improved light collection, as the micro-displays 1030a-1030c can be placed closer to the projection optics 1070a-1070c when the intervening optical combiner 1050 is omitted. As a result, higher light utilization efficiency and image brightness may be achieved. As another example, the projection system 1500 may be simplified and tailored to light of a particular component color. For example, the optics design for each respective projection optics 1070a-1070c may be calibrated separately for light of each component color generated by the micro-displays 1030a-1030c. In this way, the projection system 1500 may avoid the need for achromatization of the projection optics.
As another example benefit, and as illustrated in
In contrast, the examples of
It will be appreciated that the actuators of
In the description above, with respect to at least
For example,
Similarly, with reference to
Example Flowchart
At block 3902, the display system obtains a rendered frame of virtual content. As described above, the display system may generate frames of virtual content for presentation to a user. For example, the local processor and data module 140 may include one or more graphics processing elements. The module 140 may then generate rendered frames of virtual content.
As described in
At block 3904, the display system outputs light forming a first subframe. The display system may select pixels included in the rendered frame as forming the first subframe. For example, and as described in
At block 3906, the display system shifts the positions of displayed pixels. As described in
As described above, the display system may continually move parts of the light projection system. For example, the actuators may follow a movement pattern such as the movement patterns illustrated in
At block 3908, the display system outputs light forming a second subframe after changing the position of moveable parts of the light projection system, so that the light forming the second subframe provides pixels at a desired location for the second subframe. The display system may select pixels of the rendered frame forming the second subframe. Optionally, the module 140 may render the second subframe. The display system may then cause the light projection system to output light forming the second subframe.
The display system may then continue to shift the positions of displayed pixels and output successive subframes until the full rendered frame is formed.
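The overall loop of blocks 3902-3908 may be summarized in the following sketch (the projector and actuator interfaces here are hypothetical placeholders, not the disclosed hardware API):

```python
def display_rendered_frame(rendered_frame, subframe_offsets,
                           projector, actuators) -> None:
    """Sketch of blocks 3902-3908: for each shift position, move part of
    the light projection system, select the corresponding subframe's
    pixels from the rendered frame, and output light forming that
    subframe. All subframes are emitted within the flicker fusion
    threshold so the user perceives a single full-resolution frame."""
    for dx, dy in subframe_offsets:                         # block 3906
        actuators.move_to(dx, dy)                           # shift positions
        subframe = rendered_frame.pixels_at_offset(dx, dy)  # select pixels
        projector.emit(subframe)                            # blocks 3904/3908
```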
Various example embodiments of the invention are described herein. Reference is made to these examples in a non-limiting sense. They are provided to illustrate more broadly applicable aspects of the invention. Various changes may be made to the invention described and equivalents may be substituted without departing from the spirit and scope of the invention.
For example, while advantageously utilized with AR displays that provide images across multiple depth planes, the virtual content disclosed herein may also be displayed by systems that provide images on a single depth plane.
In addition, many modifications may be made to adapt a particular situation, material, composition of matter, process, process act, or step(s) to the objective(s), spirit, or scope of the present invention. Further, as will be appreciated by those with skill in the art, each of the individual variations described and illustrated herein has discrete components and features which may be readily separated from or combined with the features of any of the other several embodiments without departing from the scope or spirit of the present inventions. All such modifications are intended to be within the scope of claims associated with this disclosure.
The invention includes methods that may be performed using the subject devices. The methods may comprise the act of providing such a suitable device. Such provision may be performed by the user. In other words, the “providing” act merely requires the user obtain, access, approach, position, set-up, activate, power-up or otherwise act to provide the requisite device in the subject method. Methods recited herein may be carried out in any order of the recited events that is logically possible, as well as in the recited order of events.
In addition, it will be appreciated that each of the processes, methods, and algorithms described herein and/or depicted in the figures may be embodied in, and fully or partially automated by, code modules executed by one or more physical computing systems, hardware computer processors, application-specific circuitry, and/or electronic hardware configured to execute specific and particular computer instructions. For example, computing systems may include general purpose computers (e.g., servers) programmed with specific computer instructions or special purpose computers, special purpose circuitry, and so forth. A code module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language. In some embodiments, particular operations and methods may be performed by circuitry that is specific to a given function.
Further, certain embodiments of the functionality of the present disclosure are sufficiently mathematically, computationally, or technically complex that application-specific hardware or one or more physical computing devices (utilizing appropriate specialized executable instructions) may be necessary to perform the functionality, for example, due to the volume or complexity of the calculations involved or to provide results substantially in real-time. For example, a video may include many frames, with each frame having millions of pixels, and specifically programmed computer hardware is necessary to process the video data to provide a desired image processing task or application in a commercially reasonable amount of time.
Code modules or any type of data may be stored on any type of non-transitory computer-readable medium, such as physical computer storage including hard drives, solid state memory, random access memory (RAM), read only memory (ROM), optical disc, volatile or non-volatile storage, combinations of the same and/or the like. In some embodiments, the non-transitory computer-readable medium may be part of one or more of the local processing and data module (140), the remote processing module (150), and remote data repository (160). The methods and modules (or data) may also be transmitted as generated data signals (e.g., as part of a carrier wave or other analog or digital propagated signal) on a variety of computer-readable transmission mediums, including wireless-based and wired/cable-based mediums, and may take a variety of forms (e.g., as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). The results of the disclosed processes or process steps may be stored, persistently or otherwise, in any type of non-transitory, tangible computer storage or may be communicated via a computer-readable transmission medium.
Any processes, blocks, states, steps, or functionalities described herein and/or depicted in the attached figures should be understood as potentially representing code modules, segments, or portions of code which include one or more executable instructions for implementing specific functions (e.g., logical or arithmetical) or steps in the process. The various processes, blocks, states, steps, or functionalities may be combined, rearranged, added to, deleted from, modified, or otherwise changed from the illustrative examples provided herein. In some embodiments, additional or different computing systems or code modules may perform some or all of the functionalities described herein. The methods and processes described herein are also not limited to any particular sequence, and the blocks, steps, or states relating thereto may be performed in other sequences that are appropriate, for example, in serial, in parallel, or in some other manner. Tasks or events may be added to or removed from the disclosed example embodiments. Moreover, the separation of various system components in the embodiments described herein is for illustrative purposes and should not be understood as requiring such separation in all embodiments. It should be understood that the described program components, methods, and systems may generally be integrated together in a single computer product or packaged into multiple computer products.
Example aspects of the invention, together with details regarding material selection and manufacture have been set forth above. As for other details of the present invention, these may be appreciated in connection with the above-referenced patents and publications as well as generally known or appreciated by those with skill in the art. The same may hold true with respect to method-based aspects of the invention in terms of additional acts as commonly or logically employed.
In addition, though the invention has been described in reference to several examples optionally incorporating various features, the invention is not to be limited to that which is described or indicated as contemplated with respect to each variation of the invention. Various changes may be made to the invention described and equivalents (whether recited herein or not included for the sake of some brevity) may be substituted without departing from the spirit and scope of the invention. In addition, where a range of values is provided, it is understood that every intervening value, between the upper and lower limit of that range and any other stated or intervening value in that stated range, is encompassed within the invention.
Also, it is contemplated that any optional feature of the inventive variations described may be set forth and claimed independently, or in combination with any one or more of the features described herein. Reference to a singular item includes the possibility that there are plural of the same items present. More specifically, as used herein and in claims associated hereto, the singular forms “a,” “an,” “said,” and “the” include plural referents unless specifically stated otherwise. In other words, use of the articles allows for “at least one” of the subject item in the description above as well as in claims associated with this disclosure. It is further noted that such claims may be drafted to exclude any optional element. As such, this statement is intended to serve as antecedent basis for use of such exclusive terminology as “solely,” “only” and the like in connection with the recitation of claim elements, or use of a “negative” limitation. Without the use of such exclusive terminology, the term “comprising” in claims associated with this disclosure shall allow for the inclusion of any additional element, irrespective of whether a given number of elements are enumerated in such claims, or the addition of a feature could be regarded as transforming the nature of an element set forth in such claims.
Accordingly, the claims are not intended to be limited to the embodiments shown herein, but are to be accorded the widest scope consistent with this disclosure, the principles and the novel features disclosed herein.
This application is a continuation of U.S. patent application Ser. No. 17/418,729, filed Jun. 25, 2021, which is a 371 of International Patent Application No. PCT/US2019/067816 filed on Dec. 20, 2019, which claims priority from: U.S. Provisional Application No. 62/911,018 filed on Oct. 4, 2019 and titled “AUGMENTED AND VIRTUAL REALITY DISPLAY SYSTEMS WITH SHARED DISPLAY FOR LEFT AND RIGHT EYES”; U.S. Provisional Application No. 62/800,363 filed on Feb. 1, 2019 and titled “VIRTUAL AND AUGMENTED REALITY DISPLAY SYSTEMS WITH EMISSIVE MICRO-DISPLAYS”; and U.S. Provisional Application No. 62/786,199 filed on Dec. 28, 2018 and titled “LOW MOTION-TO-PHOTON LATENCY ARCHITECTURE FOR AUGMENTED AND VIRTUAL REALITY DISPLAY SYSTEMS”. The above-noted applications are hereby incorporated by reference herein in their entireties.