This disclosure generally relates to electronic displays worn by a viewer for forming left-eye and right-eye virtual images and more particularly relates to binocular alignment in Head-Mounted Displays (HMDs).
HMDs are being developed for a range of diverse uses, including military, commercial, industrial, fire-fighting, and entertainment applications. For many of these applications, there is particular value in forming a virtual image that can be visually superimposed over the real-world image formed in the eye from within the field of view of the HMD user. Image light guides convey image-bearing light along a transmissive waveguide from a location outside the viewer's field of view to a position in alignment with the viewer's pupil while preserving the viewer's view of the environment through the waveguide.
In some image light guides, collimated, relatively angularly encoded light beams from an image source are coupled into a plate-shaped waveguide by an input coupling such as an in-coupling diffractive optic, which can be mounted or formed on a surface of the plate-shaped waveguide or buried within the waveguide. Such diffractive optics can be formed as diffraction gratings, holographic optical elements or in other known ways. After propagating along the waveguide, the diffracted light can be directed back out of the waveguide by a similar output grating, which can be arranged to provide pupil expansion along at least one dimension of the virtual image. In addition, a turning diffractive optic can be positioned along the waveguide between the input and output gratings to provide pupil expansion in a second orthogonal dimension of the virtual image. The two dimensions of pupil expansion define an expanded eyebox within which the viewer's pupil can be positioned for viewing the virtual image conveyed by the light guide.
Image light guides and diffractive optical elements may form a virtual image focused at optical infinity by conveying angularly encoded light beams of collimated light to the viewer's eyebox. However, a virtual image may be focused at some closer distance, such as in the range from 1 m to 1.5 m, for example. Near-focused solutions give the viewer the advantage of augmented reality imaging in applications where it is useful to have the real-world scene content at a close distance, such as manufacturing and warehousing applications, for example.
The present disclosure provides a system and method for consistently producing properly aligned stereoscopic presentation of virtual images in a near-eye display system. Reducing or eliminating virtual image misalignment and undesirable optical effects such as incorrect coloration, blurring, and optical noise may mitigate, for example, eye strain (i.e., asthenopia).
In a first exemplary embodiment, the present disclosure provides a method for alignment of images in a near-eye binocular display system, including providing a substantially rigid binocular frame operable to receive two or more planar waveguides, wherein each of the two or more planar waveguides includes an in-coupling diffractive optic operable to diffract image-bearing light beams from an image source into the waveguide, and an out-coupling diffractive optic operable to diffract the image-bearing light beams from the waveguide toward an eyebox. The in-coupling diffractive optic is operable to in-couple light incident from a first direction and the out-coupling diffractive optic is operable to out-couple light in the first direction. The method further includes securing the binocular frame to a stationary alignment mount, positioning a first projector on a right side of the frame to project a first image, and projecting the first image upon a screen without the two or more planar waveguides in place; positioning a second projector on a left side of the frame to project a second image, and projecting the second image upon the screen without the two or more planar waveguides in place; and comparing the first image and the second image with respective targets and adjusting the positioning of the first projector and/or the second projector to align the first image and the second image with the respective targets.
In a second exemplary embodiment, the present disclosure provides a system for alignment of virtual images in a near-eye binocular display system, including a substantially rigid frame, a stationary alignment mount operable to secure the frame, and a screen having photodetectors operable to measure properties of calibration images. The system further includes a first projector connected to a right side of the frame, wherein the first projector is operable to project a first image, and a second projector connected to a left side of the frame, wherein the second projector is operable to project a second image. The system further includes one or more objective lenses operable to receive the first image at infinity focus and refocus the infinity-focus image as a calibration image upon the screen, wherein the screen is operable to compare pixel orientation of one or more calibration images.
In a third exemplary embodiment, the present disclosure provides a method for aligning projectors with a rigid frame in a head-mounted display system independent of waveguide alignment, including securing a generally rigid binocular frame to a stationary alignment mount, wherein the binocular frame is operable to support a waveguide; positioning a first projector proximal a right side of the frame, wherein the first projector is operable to project a right image to a first location in space without a waveguide in place; positioning a second projector proximal a left side of the frame, wherein the second projector is operable to project a left image to a second location in space without a waveguide in place; positioning a screen relative to the frame at the first and second locations, the screen having photodetectors operable to receive and measure properties of the left image and the right image; comparing the measured image properties of the left image and the right image; and adjusting the positioning of at least the first projector or the second projector.
In certain embodiments of the invention, alignment may be accomplished in a factory setting prior to introduction of waveguides.
The accompanying drawings are incorporated herein as part of the specification. The drawings described herein illustrate embodiments of the presently disclosed subject matter and are illustrative of selected principles and teachings of the present disclosure. However, the drawings do not illustrate all possible implementations of the presently disclosed subject matter and are not intended to limit the scope of the present disclosure in any way.
It is to be understood that the invention may assume various alternative orientations and step sequences, except where expressly specified to the contrary. It is also to be understood that the specific assemblies and systems illustrated in the attached drawings and described in the following specification are simply exemplary embodiments of the inventive concepts defined herein. Hence, specific dimensions, directions, or other physical characteristics relating to the embodiments disclosed are not to be considered as limiting, unless expressly stated otherwise. Also, like elements in the various embodiments described herein may be referred to with like reference numerals within this section of the application, although they need not be.
Where used herein, the terms “first”, “second”, and so on, do not necessarily denote any ordinal, sequential, or priority relation, but are simply used to more clearly distinguish one element or set of elements from another, unless specified otherwise.
Where used herein, the terms “viewer”, “operator”, “observer”, and “user” are considered to be equivalent and refer to the person, or machine, who wears and/or views images using a near-eye display device.
Where used herein, the terms “coupled” or “coupler” (in the context of optics) refer to a connection by which light travels from one optical medium or device to another optical medium or device.
Where used herein, the term “about” when applied to a value is intended to mean within the tolerance range of the equipment used to produce the value, or, in some examples, is intended to mean plus or minus 10%, or plus or minus 5%, or plus or minus 1%, unless otherwise expressly specified.
Where used herein, the term “substantially” is intended to mean within the tolerance range of the equipment used to produce the value, or, in some examples, is intended to mean plus or minus 10%, or plus or minus 5%, or plus or minus 1%, unless otherwise expressly specified.
Where used herein, the terms “optical infinity” and “at infinity” correspond to conventional usage in the camera and imaging arts, indicating image formation using substantially collimated light, so that the focus distance is at least about four meters (4 m).
Where used herein, the term “beam expansion” is intended to mean replication of a beam via multiple encounters with an optical element to provide exit pupil expansion in one or more directions. Similarly, as used herein, to “expand” a beam, or a portion of a beam, is intended to mean replication of a beam via multiple encounters with an optical element to provide exit pupil expansion in one or more directions.
An optical system, such as an HMD, can produce a virtual image display. In contrast to methods for forming a real image, a virtual image is not formed on a display surface. That is, if a display surface were positioned at the perceived location of a virtual image, no image would be formed on that surface. Virtual image display has a number of inherent advantages for augmented reality presentation. For example, the apparent size of a virtual image is not limited by the size or location of a display surface. Additionally, the source object for a virtual image may be small; for example, a magnifying glass provides a virtual image of an object. In comparison with systems that project a real image, a more realistic viewing experience can be provided by forming a virtual image that appears to be some distance away. Providing a virtual image also obviates the need to compensate for screen artifacts, as may be necessary when projecting a real image.
An image light guide may utilize image-bearing light from a light source such as a projector to display a virtual image. For example, collimated, relatively angularly encoded, light beams from a projector are coupled into a planar waveguide by an input coupling such as an in-coupling diffractive optic, which can be mounted or formed on a surface of the planar waveguide or buried within the waveguide. Such diffractive optics can be formed as diffraction gratings, holographic optical elements (HOEs) or in other known ways. For example, the diffraction grating can be formed by surface relief. After propagating along the waveguide, the diffracted light can be directed back out of the waveguide by a similar output coupling such as an out-coupling diffractive optic, which can be arranged to provide pupil expansion along at least one direction of the virtual image. In addition, a turning grating can be positioned on/in the waveguide to provide pupil expansion in an orthogonal direction of the virtual image. The image-bearing light output from the waveguide provides an expanded eyebox for the viewer.
When used as a part of a near-eye or head-mounted display system, the in-coupling diffractive optic IDO of the conventional image light guide system 10 couples the image-bearing light beams WI from a real, virtual or hybrid image source 50 into the substrate S of the image light guide 12. Any real image or image dimension formed by the image source 50 is first converted into an array of overlapping, angularly related, collimated beams encoding the different positions within a virtual image for presentation to the in-coupling diffractive optic IDO. Typically, the rays within each bundle forming one of the angularly related beams extend in parallel, but the angularly related beams are relatively inclined to each other through angles that can be defined in two angular dimensions corresponding to linear dimensions of the image.
Once the angularly related beams engage with the in-coupling diffractive optic IDO, at least a portion of the image-bearing light beams WI are diffracted (generally through a first diffraction order) and thereby redirected by the in-coupling diffractive optic IDO into the planar image light guide 12 as angularly encoded image-bearing light beams WG for further propagation along a length dimension x of the image light guide 12 by total internal reflection (TIR) between the plane-parallel front and back surfaces 14 and 16. Although diffracted into a different combination of angularly related beams in keeping with the boundaries set by TIR, the image-bearing light beams WG preserve the image information in an angularly encoded form that is derivable from the parameters of the in-coupling diffractive optic IDO. The out-coupling diffractive optic ODO receives the encoded image-bearing light beams WG and diffracts (also generally through a first diffraction order) at least a portion of the image-bearing light beams WG out of the image light guide 12, as image-bearing light beams WO, toward a nearby region of space referred to as an eyebox E, within which the transmitted virtual image can be seen by a viewer's eye or other optical component. The out-coupling diffractive optic ODO can be designed symmetrically with respect to the in-coupling diffractive optic IDO to restore the original angular relationships of the image-bearing light beams WI among outputted angularly related beams of the image-bearing light beams WO. In addition, the out-coupling diffractive optic ODO can modify the original field points' angular relationships, producing an output virtual image at a finite focusing distance.
However, to increase one dimension of overlap among the angularly related beams populating the eyebox E (defining the size of the region within which the virtual image can be seen), the out-coupling diffractive optic ODO is arranged together with a limited thickness T of the image light guide 12 to encounter the image-bearing light beams WG multiple times and to diffract only a portion of the image-bearing light beams WG upon each encounter. The multiple encounters along the length (e.g., a first direction) of the out-coupling diffractive optic ODO have the effect of replicating the image-bearing light beams WG and enlarging or expanding at least one dimension of the eyebox E where the replicated beams overlap. The expanded eyebox E decreases sensitivity to the position of a viewer's eye 5 for viewing the virtual image.
The out-coupling diffractive optic ODO is shown as a transmissive-type diffraction grating arranged on or secured to the front surface 14 of the image light guide 12. However, like the in-coupling diffractive optic IDO, the out-coupling diffractive optic ODO can be located on, in, or otherwise engaged with the front or back surface 14, 16 of the image light guide 12 and can be of a transmissive or reflective type in a combination that depends upon the direction through which the image-bearing light beams WG are intended to exit the image light guide 12. In addition, the out-coupling diffractive optic ODO could be formed as another type of diffractive optic, such as a volume hologram or other holographic diffraction element, that diffracts propagating image-bearing light beams WG from the image light guide 12 as the image-bearing light beams WO propagating toward the eyebox E.
When the image source 50 is arranged to emit the image-bearing light beams WI towards the image light guide 12 from a position opposite to the front surface 14 through which the image-bearing light beams WO are conveyed, that is, directing the approach of the angularly related image-bearing light beams WI towards the back surface 16, the image-bearing light beams WO will be emitted through the out-coupling diffractive optic ODO on the front surface 14 at an angle equal to the angle of incidence, rendering the image-bearing light beams WI and WO parallel in angular space.
In an embodiment, as illustrated in
The perspective view of
In an exemplary embodiment, as shown in
In an embodiment, waveguides 12, 12a connect with the head mounted near eye display system 80 by means of an easily accessible input mechanism, via sliding into place through waveguide replacement ports 78, 78a in the side of the lens frame 70 as shown in
Waveguides 12, 12a do not require fine-tuning of their position once inserted. Rather, controlled focal vergence is achieved by factory alignment of projectors 50, 50a. Referring now to
In an embodiment, projector alignment is manually performed by a human operator. For example, an operator may project image-bearing light beams 90 from each projector 50, 50a onto the performance screen 96, and align images M on the performance screen 96 by eye to effect the projector alignment.
As illustrated in
Referring now to
In an embodiment, performance screen 96 includes photodetector devices in communication with the controller 97 operable to receive and measure various properties of calibration image M2. Constructed with optical detectors responsive to properties including, but not limited to, timing precision and pulse frequency, spectral region, lumens or light intensity, and pixel orientation, performance screen 96 enables a method for ensuring projector 50 light output meets desired or pre-set requirements. In an embodiment, performance screen 96 also includes a pixel grid 100 operable to compare pixel orientation of calibration image M2 to a reference pixel or pre-set factory alignment scheme. In an embodiment, the reference pixel appears in the center of calibration image M2. In another embodiment, the reference pixel is located in various regions of the image. For example, pixel alignment may be defined as the visual angle of one pixel on a device with a pixel density of ninety-six dots per inch (“96 dpi”) and a visual angle of 0.0213 degrees, with an acceptable margin of error of 0.005 degrees.
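The example tolerance above can be sketched numerically. In the sketch below, the viewing distance, function names, and tolerance check are illustrative assumptions chosen to reproduce the 0.0213-degree figure; they are not part of the disclosed system:

```python
import math

def pixel_visual_angle_deg(dpi: float, viewing_distance_m: float) -> float:
    """Visual angle subtended by one pixel, in degrees.

    Assumes an on-axis, flat pixel whose pitch is one inch divided by dpi.
    """
    pitch_m = 0.0254 / dpi
    return math.degrees(math.atan(pitch_m / viewing_distance_m))

def within_alignment_margin(measured_deg: float, target_deg: float = 0.0213,
                            margin_deg: float = 0.005) -> bool:
    """Compare a measured per-pixel visual angle against the example margin."""
    return abs(measured_deg - target_deg) <= margin_deg

# At 96 dpi, one pixel subtends about 0.0213 degrees from roughly 0.711 m away
# (a hypothetical viewing distance that reproduces the example figure).
angle = pixel_visual_angle_deg(96, 0.7112)
print(round(angle, 4))                 # 0.0213
print(within_alignment_margin(angle))  # True
```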
In an embodiment, performance screen 96 is operable to account for unwanted effects on the image-bearing light beams 90 caused by the objective lens 94, including chromatic aberrations and other wavelength-dependent optical distortions apparent in image M2. In certain aspects, such unwanted effects are aberrations and/or distortions known to be present in objective lens 94. A method of finer calibration of projector 50 to correct for peripheral pixel alignment, color adjustment, and the like includes the use of alignment system software implementing image processing algorithms. The software alignments may occur before and/or after coarse alignment, as described in the example in
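One elementary image-processing step such alignment software might perform is locating the calibration spot on the photodetector grid and reporting its offset from a reference pixel. The grid layout, reference location, and function names below are assumptions for illustration only:

```python
def centroid(grid):
    """Intensity-weighted centroid (row, col) of a 2-D list of detector readings."""
    total = sum(v for row in grid for v in row)
    if total == 0:
        raise ValueError("no signal on the detector grid")
    r = sum(i * v for i, row in enumerate(grid) for v in row) / total
    c = sum(j * v for row in grid for j, v in enumerate(row)) / total
    return r, c

def offset_from_reference(grid, ref=(1.0, 1.0)):
    """Offset of the measured calibration spot from a reference pixel location."""
    r, c = centroid(grid)
    return r - ref[0], c - ref[1]

# A calibration spot shifted toward the lower-right of a hypothetical 3x3 grid:
readings = [[0, 0, 0],
            [0, 1, 1],
            [0, 1, 1]]
print(offset_from_reference(readings))  # (0.5, 0.5)
```

An alignment routine could feed such an offset back into mechanical or electronic adjustment until it falls within the accepted margin.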
In an embodiment, the alignment mount and mounting bracket 56 (shown in
As illustrated in
As illustrated in
Referring now to
As illustrated in
In an embodiment, the test preparation procedure actions may be performed in sequence before proceeding with the remainder of the method for aligning and calibrating projectors in a rigid frame binocular system. In another embodiment, any of the test preparation procedure actions may be performed discretely, out of sequence, repeated any number of times, or omitted. General test preparation procedure actions in step 240 include, but are not limited to, a functional check to ensure basic component functionality is operational, such as LED lights, touchpad input sensor, and projector. Further, the operator may check for mechanical defects that would impact alignment testing, such as temple arm 74, 74a warping, tolerance issues, or the detection of physical inconsistencies within the near eye display system. The presence of such defects would necessitate a review of the near eye display system at hand and cessation of the calibration process. The operator may check the ambient temperature range for optimal conditions, and verify that airborne cleanliness specifications meet the Federal Standard 209c particulate contaminant rating designated as Class 100 or ISO class 5. In addition, the surface cleanliness of the objective lens 94 and projectors 50, 50a may be verified as also meeting the Federal Standard 209c particulate contaminant rating designated as Class 100 or ISO class 5. In another embodiment, particulate contaminant ratings may allow for ISO class 6 or greater. Further checks included in step 240 may determine the operability of the performance screen 96, including a screen sensor check and positioning of image reference pixels operable for projector alignment.
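The step-240 preparation gates can be sketched as a simple pre-flight check that must pass before mounting and alignment proceed. The status fields and pass criteria below are illustrative assumptions, not requirements of the disclosure:

```python
def preflight_ok(status: dict) -> bool:
    """Return True only if every step-240 preparation gate passes."""
    checks = [
        status.get("leds_functional", False),
        status.get("touchpad_functional", False),
        status.get("projector_functional", False),
        not status.get("mechanical_defects", True),  # e.g. temple-arm warping
        status.get("cleanliness_class", 9) <= 5,     # ISO class 5 (or 6 if relaxed)
    ]
    return all(checks)

print(preflight_ok({
    "leds_functional": True,
    "touchpad_functional": True,
    "projector_functional": True,
    "mechanical_defects": False,
    "cleanliness_class": 5,
}))  # True
```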
Moving to step 250, the operator secures the near eye display system 80 to the stationary mounting system located in the test environment. In step 260, the operator or a mechanical arm may perform fine adjustments of the first projector 50 position. In an embodiment, the first projector 50 refers only to the right temple projector, where for the purposes of the projector calibration scheme, only one projector is aligned at a time. In an embodiment, the left temple projector may be the first projector. In another embodiment, both projectors 50, 50a may be calibrated in tandem. Electronic adjustment software may be used to further calibrate projector positioning. In step 270, the mounted near eye display system projects an image upon the performance screen 96 using its first projector 50. In step 280, image values are captured by the performance screen 96 and measured by calibration software. In an embodiment, the performance screen 96 is photosensitive, having photodetector devices operable to measure various properties of a calibration image. In certain embodiments, this step may include calibration with respect to a reference pixel or pre-set factory alignment scheme, as described above.
In step 290, the system and/or test operator evaluates the image data against accuracy specifications such as the example accuracy specifications set forth above. In the case that specifications are not met, the near eye display system alignment method returns to step 260 for further mechanical and/or electronic adjustment based on performance data. In the case that specifications are met, the calibration process proceeds on to step 300 where the operator or a mechanical arm or similar mechanical element attached to the frame may perform fine adjustments of the second projector 50a position in advance of attunement with the first projector 50. Electronic adjustment software may be used to further calibrate projector 50a positioning. Data received through the performance screen test of the first projector 50 may additionally be used to inform calibration of the second projector 50a.
In step 310, the mounted near eye display system 80 projects an image upon the performance screen 96 using its second projector 50a. In step 320, image values are captured by the performance screen 96 and measured by calibration software as described above. In step 330, the system and/or test operator evaluates the image data against accuracy specifications to ensure the projectors 50, 50a are in proper alignment with each other. In the case that specifications are not met, the near eye display system alignment method returns to step 300 for further mechanical and/or electronic adjustment of second projector 50a based on performance data. In the case that specifications are met, a pairing sequence is initiated to lock and record the position of each projector 50, 50a as part of the system values in step 340. The system values may be used in the future for internal system alignment procedures. In step 350, the successful projector alignment of near eye display system 80 is recorded with a calibration certificate. Waveguides may be added to the frame after the foregoing embodiment of the method is completed, but may also be added or inserted earlier in the alignment process.
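The adjust, project, measure, evaluate, repeat sequence of steps 260 through 330 can be sketched as a control loop applied to each projector in turn. The callables below are placeholders for the mechanical/electronic adjustment, projection, and performance-screen measurement described in the text, and the toy error model is purely illustrative:

```python
def align_projector(project, measure, adjust, meets_spec, max_iterations=20):
    """Iteratively align one projector: project a calibration image, measure
    it on the performance screen, and adjust until the accuracy specification
    is met (mirroring steps 260-290 for projector 50, 300-330 for 50a)."""
    for _ in range(max_iterations):
        project()
        data = measure()
        if meets_spec(data):
            return data  # spec met; caller may lock and record the position
        adjust(data)
    raise RuntimeError("projector failed to meet accuracy specification")

# Toy stand-ins: a projector whose angular error halves with each adjustment.
state = {"error_deg": 0.02}
result = align_projector(
    project=lambda: None,
    measure=lambda: state["error_deg"],
    adjust=lambda err: state.update(error_deg=state["error_deg"] / 2),
    meets_spec=lambda err: abs(err) <= 0.005,  # example margin from the text
)
print(result)  # 0.005
```

Running the same loop for the second projector, seeded with the first projector's measurement data, mirrors how step 300 uses the first performance-screen test to inform calibration of projector 50a.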
As illustrated in
In an embodiment, there is a full perimeter seal between the outer cover 408 and the housing 402 to prevent any debris from entering the enclosed system. Additionally, the outer cover 408 may include an anti-reflective coating thereon to reduce unwanted reflections. The inner cover 410 may also include an anti-reflective coating and/or ‘smudge’ proof coating thereon. The outer cover 408, the inner cover 410, and the waveguide housing 402 seal the waveguide assembly 406 within the waveguide stack module 400.
The waveguide assembly 406 includes one or more waveguides 12, 12a (as described herein). As illustrated in
One or more features of the embodiments described herein may be combined to create additional embodiments which are not depicted. While various embodiments have been described in detail above, it should be understood that they have been presented by way of example, and not limitation. It will be apparent to persons skilled in the relevant arts that the disclosed subject matter may be embodied in other specific forms, variations, and modifications without departing from the scope, spirit, or essential characteristics thereof. The embodiments described above are therefore to be considered in all respects as illustrative, and not restrictive. The scope of the invention is indicated by the appended claims, and all changes that come within the meaning and range of equivalents thereof are intended to be embraced therein.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2022/032989 | 6/10/2022 | WO |
Number | Date | Country
---|---|---
63209295 | Jun 2021 | US