Exit pupil expander

Information

  • Patent Grant
  • Patent Number
    11,927,759
  • Date Filed
    Thursday, December 22, 2022
  • Date Issued
    Tuesday, March 12, 2024
Abstract
An exit pupil expander (EPE) has entrance and exit pupils, a back surface adjacent to the entrance pupil, and an opposed front surface. In one embodiment the EPE is geometrically configured such that light defining a center wavelength that enters at the entrance pupil perpendicular to the back surface experiences angularly varying total internal reflection between the front and back surfaces such that the light exiting the EPE perpendicular to the exit pupil is at a wavelength shifted from the center wavelength. In another embodiment a first distance at the entrance pupil between the front and back surfaces is different from a second distance at the exit pupil between the front and back surfaces. The EPE may be deployed in a head-wearable imaging device (e.g., virtual or augmented reality) where the entrance pupil in-couples light from a micro display and the exit pupil out-couples light from the EPE.
Description
TECHNOLOGICAL FIELD

The described invention relates to optical channels, and more particularly to controlling the color space across the output grating of an optical exit pupil expander such as may be disposed in a head-wearable imaging device/computer that projects an image directly in front of a user's eye.


BACKGROUND

Certain wearable computers such as those embodied as eyeglasses or virtual technology goggles project an image directly in front of a user's eye. In eyeglass-type devices these projections are see-through, so the user can see the projected data in the near field while the real world in the far field remains largely unobscured. In virtual reality devices the user is isolated from perceiving the real world, so the display needs to fill the user's entire field of vision. One challenge with such wearable displays is to produce an adequate eye-box in which the viewer can view the data projected by the micro-display. Such an eye-box for see-through displays measures about 10-12 mm both vertically and horizontally, and the eye relief is in the range of 20-30 mm. For virtual reality devices the eye-box is necessarily larger and the eye relief is often a bit longer. Retinal scanning display devices project the image directly on the user's retina, so the eye-box is smaller and the eye relief is closer to zero. Due to the nature of such wearable devices the space constraints limit the reach of the optics, so one challenge is to keep that eye-box from shrinking to only a few mm, given that the optical train (often located at the side of the user's head for see-through displays) is bounded by practical limits on the size of such wearable devices. These size limits on the optical train also adversely affect the color space seen by the user. Color space may be a peripheral matter for see-through displays where only data is being displayed, but it is critical for virtual reality devices, whose effectiveness relies on the display persuading some level of the user's consciousness that the scene represents more than only a virtual world.


The exit pupil expander (EPE) is the optical component that replaces the geometric optics traditionally used to expand the size of the eye-box in head-wearable visual devices. In optics the exit pupil is a virtual aperture, in that only rays which pass through this virtual aperture can exit the system. The exit pupil is the image of the aperture stop formed by the optics that follow it. The term exit pupil is sometimes also used to refer to the diameter of the virtual aperture. Unlike the optics of conventional cameras or telescopes, an exit pupil expander of a wearable virtual reality or see-through device is designed for near-distance viewing.


Numerical aperture expander is a less common term sometimes used with reference to retinal scanning displays which project an image through the pupil directly on the user's retina. The numerical aperture of the light emanating from display pixels determines the exit pupil size, and retinal scanning displays project a rastered image about the size of the user's eye pupil at an intermediate plane. Retinal scanning displays can be used for virtual reality applications.


Diffractive exit pupil expanders have diffraction gratings that pose an inherent problem in controlling the color space. Because of diffraction the input and output gratings diffract different color bands of light into different output angles. This results in the user's perception of the color space of the scene being displayed having a varying color balance across the user's field of view.
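
This dispersion can be made concrete with a minimal sketch of the first-order grating equation (the same relation shown at FIG. 1), assuming the d=1200 nm pitch used later in the FIG. 4 example and normal incidence at the input grating:

```python
import math

# First-order grating equation d * sin(theta_m) = m * lambda, as shown at FIG. 1.
# The 1200 nm pitch is taken from the FIG. 4 example described later; normal
# incidence at the input grating is assumed.
def diffraction_angle_deg(wavelength_nm, d_nm=1200.0, m=1):
    return math.degrees(math.asin(m * wavelength_nm / d_nm))

for name, lam in (("Blue", 430.0), ("Green", 525.0), ("Red", 630.0)):
    print(f"{name:5s} {lam:3.0f} nm -> {diffraction_angle_deg(lam):5.2f} deg")
# Blue 430 nm -> 21.00 deg, Green 525 nm -> 25.94 deg, Red 630 nm -> 31.67 deg:
# each color band couples into a different angle, which the user perceives as
# a color balance that varies across the field of view.
```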


Conventional exit pupil expanders typically have a very high degree of parallelism which FIG. 1 demonstrates with parallel front and back surfaces of the EPE. Incident light 102 enters the EPE 100 via the back surface 104 and encounters an input grating 106. Light propagates inside the EPE 100 by multiple total internal reflections (TIR) and the color space is controlled by having a stack of EPE plates, for example separate plates for red (R) and green (G) as well as blue (B) primary color bands. Light exiting the EPE 100 is expanded by these internal reflections and passes through an output grating 108 and exits normal to the front surface 110, which is parallel to the opposed back surface 104. This plate stacking necessarily complicates the design and raises its cost. The individual beams in FIG. 1 represent different colors (R, G, B) each defining a different wavelength λ.


SUMMARY

According to a first aspect of these teachings there is an optical channel comprising an entrance pupil enabling light to enter the optical channel, an exit pupil enabling the light to exit the optical channel, a back surface adjacent to the entrance pupil, and a front surface opposite the back surface. In this particular aspect the optical channel is geometrically configured such that the light defining a center wavelength that enters the optical channel at the entrance pupil perpendicular to the back surface experiences angularly varying total internal reflection between the front and back surfaces such that the light that exits the optical channel perpendicular to the exit pupil is at a wavelength shifted from the center wavelength.


According to a second aspect of these teachings there is an optical channel comprising an entrance pupil enabling light to enter the optical channel, an exit pupil enabling the light to exit the optical channel, a back surface adjacent to the entrance pupil, and a front surface opposite the back surface. In this particular aspect the optical channel is configured such that a first distance at the entrance pupil between the front surface and the back surface is different from a second distance at the exit pupil between the front surface and the back surface.


According to a third aspect of these teachings there is a head-wearable imaging device comprising a micro display and an exit pupil expander. The head-wearable imaging device may for example be a virtual reality device or an augmented reality device. In either case the exit pupil expander comprises: an entrance pupil configured to in-couple light from the micro-display; an exit pupil configured to out-couple light from the exit pupil expander; a back surface adjacent to the entrance pupil; and a front surface opposite the back surface. In this embodiment, as with the optical channel of the first aspect, the exit pupil expander is geometrically configured such that the light defining a center wavelength that enters the optical channel at the entrance pupil perpendicular to the back surface experiences angularly varying total internal reflection between the front and back surfaces such that the light that exits the optical channel perpendicular to the exit pupil is at a wavelength shifted from the center wavelength. In another embodiment the exit pupil expander may be as described above for the optical channel according to the second aspect of these teachings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic diagram illustrating an exit pupil expander with parallel front and back surfaces according to the prior art.



FIG. 2 is a schematic diagram illustrating a wedge-shaped exit pupil expander with non-parallel front and back surfaces according to an embodiment of these teachings.



FIG. 3 is a plot of light intensity showing for each of R, G and B wavelengths a central peak and a shifted wavelength in the out-coupled light from a wedge-shaped EPE such as that shown at FIG. 2.



FIG. 4 quantitatively tabulates the color shift of red, green and blue light passing through a wedge-shaped EPE such as that shown at FIG. 2, with α=0.25 degrees.



FIG. 5 is similar to FIG. 2 but further illustrating a compensating wedge for see-through (non-virtual reality) type applications according to an example embodiment.



FIG. 6 is a schematic diagram of a segmented exit pupil expander defining multiple discrete segments of which less than all segments impose angularly varying total internal reflection according to these teachings.



FIG. 7 is a schematic diagram of an exit pupil expander that includes a main channel or body with a thin film wedge overlay that imposes the angularly varying total internal reflection according to these teachings.



FIG. 8 is a schematic diagram of a wedge-type exit pupil expander similar to that of FIG. 2; together with the inset FIG. 8A, which is similar in kind to FIG. 3, it more fully illustrates exit pupil expansion of only a single wavelength/color.



FIG. 9 is a perspective view of a non-virtual reality headset which is one type of device 900 in which embodiments of these teachings may be disposed, with front and top views particularly illustrating one example for placement of an EPE.





DETAILED DESCRIPTION

Certain non-limiting embodiments of these teachings provide a wedge-shaped EPE (exit pupil expander) plate for controlling color space as generally shown at FIG. 2. In the wedge-shaped EPE 200, the front 210 and opposed back 204 surfaces of the EPE 200 are non-parallel. These surfaces 204, 210 refer to internal reflective surfaces upon which the light reflects within the optical channel/EPE 200 and are sometimes referred to as plates. Light 202 propagates inside the EPE 200 through total internal reflection from these surfaces 204, 210 and experiences a varying degree of angular variation due to non-parallelism of these surfaces/plates 204, 210. Hence, at the output grating 208, the angular spread of the out-coupled light is affected and a user sees the angular shift as a color change of the light source as compared to the incident light 202 that was input through the back surface 204 at the input grating 206. That is, the wedge-shaped EPE 200 shifts the diffracted light from the central emitted wavelength of the light source, which in FIG. 2 is represented as the incident light 202. Improved color balance is achieved by controlling the wedge-shape of the EPE 200.


More particularly, the light reflecting off these surfaces 204, 210 while propagating inside the wedge-shaped EPE 200 by TIR experiences a varying degree of angular variation as a result of the non-parallelism of the surfaces/plates 204, 210. This affects the angular spread of the out-coupled light that exits the EPE 200 through the front surface 210 at the output grating 208. In particular, if light-emitting diodes (LEDs) are used as light sources (the incident light 202) for the optical engine providing the image, the user will see the resulting angular shift as a color change of the light source because the diffracted light is shifted from the dominant or central emitted wavelength of the LED. The light that is coupled in with a slightly different wavelength is indicated by the dashed arrows in FIG. 2 that exit normal/perpendicular to the front surface 210, while light at the wavelength peak exits somewhat offset from the normal. An improved color balance in the overall system can be achieved by controlling the wedge shape of the EPE 200, more particularly by controlling the extent of the non-parallelism of the internal reflective surfaces/plates 204, 210.
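
The angular variation can be sketched with a simple bounce model: in the thin-wedge approximation, each reflection between surfaces tilted by α rotates the guided ray by 2α. The starting angle and bounce count below are illustrative assumptions, not values taken from the patent:

```python
def ray_angles_in_wedge(theta0_deg, alpha_deg, bounces):
    """Ray angle (measured from the surface normal) after each TIR bounce.

    Thin-wedge approximation: every reflection between surfaces tilted by
    alpha rotates the guided ray by 2*alpha. The sign convention assumes a
    narrowing wedge that shallows the angle toward the exit pupil.
    """
    return [theta0_deg - 2.0 * alpha_deg * k for k in range(bounces + 1)]

# Illustrative values only: green light diffracted to ~25.94 degrees in a
# wedge with alpha = 0.25 degrees, followed for four bounces.
for k, theta in enumerate(ray_angles_in_wedge(25.94, 0.25, 4)):
    print(f"after {k} bounce(s): {theta:.2f} deg")
```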


The optical channel/EPE 200 of FIG. 2 is configured such that a first distance 220 at the entrance pupil/input grating 206 between the front surface 210 and the back surface 204 is different from a second distance 222 at the exit pupil/output grating 208 between the front surface 210 and the back surface 204. The specific locations of these distances 220, 222 in FIG. 2 were chosen to avoid obscuring the ray traces through the channel; the appropriate locations would follow one particular ray of a given wavelength from input to output pupil (delineated in the drawings as input and output gratings) of the channel/EPE 200. In a particular but non-limiting embodiment the extent of the wedge is such that this wavelength ray tracing is not relevant to these distances, where for example the smallest first distance 220 anywhere along the entrance pupil/input grating 206 is larger than the largest second distance 222 anywhere along the exit pupil/output grating 208.
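
For a continuous wedge these two distances follow directly from the geometry, as the short sketch below shows; the entrance thickness and channel length are invented illustrative values:

```python
import math

def wedge_thickness_mm(t_entrance_mm, alpha_deg, x_mm):
    """Plate thickness a lateral distance x along a narrowing wedge."""
    return t_entrance_mm - x_mm * math.tan(math.radians(alpha_deg))

t_first = 2.0   # hypothetical first distance 220 at the entrance pupil, mm
length = 50.0   # hypothetical pupil-to-pupil channel length, mm
t_second = wedge_thickness_mm(t_first, 0.25, length)
print(f"first distance {t_first:.2f} mm, second distance {t_second:.2f} mm")
# ~1.78 mm at the exit side: for a sufficiently long channel the smallest
# entrance-side distance exceeds the largest exit-side distance.
```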


Consider this distinction between FIGS. 1 and 2 quantitatively. Assuming for simplicity that the characteristics of the input grating and the output grating are identical, the output coupled light experiences a color shift across the output grating. The grating equation d sin θm=mλ (also shown at FIG. 1) describes the coupling angle of the light for each central wavelength λ, for Red, Green, and Blue. In FIG. 2, the improved color balance achieved by the wedged plates 204, 210 is shown. If the wedge angle α shown at FIG. 2 is chosen appropriately, the color shift across the output grating 208 is offset by the shift in the central wavelength of the respective light sources for R, G, and B. In FIG. 2 the central wavelength is followed with solid lines, and the shifted wavelengths are shown in dashed lines.


This is also shown in FIG. 3, where the spectra of the light sources are shown, with solid line indicators for the central (peak) wavelengths for Blue 301C, Green 302C and Red 303C, as well as dashed line indicators for the shifted wavelengths for Blue 301S, Green 302S and Red 303S as seen by the user when the EPE is wedge-shaped as shown by the FIG. 2 example embodiment.



FIG. 4 reproduces the calculations for the color shifting plotted at FIG. 3. Green light input at the input grating has wavelength λ=525 nm; red light input has wavelength λ=630 nm; and blue light input has wavelength λ=430 nm. The wedge angle α shown at FIG. 2 is α=0.25 degrees, the distance between slits on the input grating is d=1200 nm (consistent with the assumption above that the input and output gratings are identical), θm is the diffraction angle at which phases add to produce a maximum, and air is the medium within the EPE. As can be seen from FIG. 4, the color shift Δλ of green light is Δλ≈(525−511) nm=14 nm≈2.5%; the color shift of red light is Δλ≈(630−617) nm=13 nm≈2.0%; and the color shift of blue light is Δλ≈(430−415) nm=15 nm≈3.5%. At least for the narrowing wedge shape, the color shift is more pronounced for shorter wavelengths. FIG. 4 also shows the angular differences between the diffracted input and output angles. If the value of the angle α were set to zero the resulting EPE would be as shown at FIG. 1, the calculations shown at FIG. 4 would yield the original (input) wavelengths, and the output angles would equal the input angles.
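
These tabulated shifts can be approximated from the grating equation alone if the net internal rotation imposed by the wedge is treated as a free parameter; taking Δθ ≈ 3α (an assumption inferred from the FIG. 4 values, not a relation stated here) reproduces the quoted shifts to within about a nanometer:

```python
import math

D_NM = 1200.0  # grating pitch from FIG. 4

def shifted_wavelength_nm(lambda_nm, delta_theta_deg, d_nm=D_NM, m=1):
    """Wavelength that out-couples normal to the exit pupil after the guided
    ray has been rotated by delta_theta (thin-wedge model): solve
    d * sin(theta_in - delta_theta) = m * lambda'."""
    theta_in = math.asin(m * lambda_nm / d_nm)
    return d_nm * math.sin(theta_in - math.radians(delta_theta_deg)) / m

ALPHA = 0.25
DTHETA = 3 * ALPHA  # assumed net rotation, chosen to be consistent with FIG. 4
for name, lam in (("Blue", 430.0), ("Green", 525.0), ("Red", 630.0)):
    shifted = shifted_wavelength_nm(lam, DTHETA)
    print(f"{name:5s} {lam:.0f} -> {shifted:.0f} nm (shift {lam - shifted:.0f} nm)")
# Prints roughly 415, 511 and 617 nm -- shifts of about 15, 14 and 13 nm --
# matching the values tabulated at FIG. 4.
```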


The basic wedge-shaped EPE 200 is only one of several EPE designs that will produce a color shift in the out-coupled light according to these teachings. While the FIG. 2 example shows the wedge narrowing between the input 202 (input grating 206) and the output (output grating 208), a similar color-shifting benefit can be achieved with an expanding or widening wedge shape.



FIG. 5 illustrates how a basic wedge shape such as that shown at FIG. 2 can be adapted for a see-through display such as an eyeglass-mounted micro-display that is transparent to visible light from the user's environment. Like reference numbers denote similar features as detailed above with respect to FIG. 2. For such a non-virtual reality device the user will want to perceive his/her surroundings without the color-shifting aspects imposed by the wedge-shaped EPE 200. The incident light 202 forming the image to be projected in front of or on the user's eye is designated 502A in FIG. 5 to distinguish it from the incident environmental light 502B that the user perceives directly and that is not subject to internal reflection off the non-parallel surfaces 204, 210 within the EPE 200. That incident environmental light 502B passes through a compensating wedge 500 that defines opposed surfaces 510, 504 such that the surface 510 adjacent to the wedge-shaped EPE 200 is parallel to the back surface 204 of the EPE 200 and the surface 504 opposite the EPE 200 is parallel to the front surface 210 of the EPE 200. So while the EPE 200 narrows between its input and output pupils, the compensating wedge 500 widens to exactly match across its expanse (or at least across its own entrance and exit pupils). The angle α shown for the EPE 200 is the same angle α used for the compensating wedge 500 but offset 180 degrees (shown as 180−α on the compensating wedge 500). Aberrations to the incident environmental light 502B due to the angular difference between the opposed surfaces 204, 210 of the EPE 200 are exactly offset by the angular difference between the opposed surfaces 510, 504 of the compensating wedge 500, which may even be separate from the EPE 200 waveguide.
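
In the thin-prism limit this cancellation can be checked numerically: a thin wedge deviates a transmitted ray by roughly (n−1)α, so the EPE wedge and an opposing compensating wedge of equal angle sum to zero net deviation. A minimal sketch, assuming a typical glass index of n=1.5:

```python
def thin_prism_deviation_deg(alpha_deg, n=1.5):
    """Small-angle deviation of a ray crossing a thin wedge: (n - 1) * alpha.
    The refractive index n = 1.5 is an assumed typical value."""
    return (n - 1.0) * alpha_deg

ALPHA = 0.25  # wedge angle of the EPE, degrees
epe_dev = thin_prism_deviation_deg(ALPHA)    # deviation from the EPE wedge
comp_dev = thin_prism_deviation_deg(-ALPHA)  # opposing compensating wedge 500
print(f"EPE {epe_dev:+.4f} deg + compensator {comp_dev:+.4f} deg "
      f"= net {epe_dev + comp_dev:+.4f} deg")
# Net deviation of the environmental light 502B is zero, so the real-world
# view is undistorted while the projected image still sees the wedged channel.
```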


Similar color-shifting advantages can be realized with one or more segmented wedge shapes intermediate between the input and output of the EPE, in which case the input and output surfaces at which the input and output gratings are disposed can themselves be parallel, as shown by example at FIG. 6. This segment-wise wedged EPE 600 defines three distinct segments 600A, 600B, 600C through which the incident light 602 propagates. Incident light 602 from the computer of the overall optical headset/eyeglass device forming the image to be projected is in-coupled through the back surface 604A of the first segment 600A, which defines a narrowing wedge between opposed internal reflective surfaces 604A and 610A. From this first segment 600A the light continues through the second segment 600B, which has parallel opposed internal reflective surfaces 604B, 610B. The light continues into the third segment 600C where it is out-coupled 612 after reflecting between back 604C and front 610C surfaces, which also define a narrowing wedge. In various embodiments there may be only one wedge segment (600A or 600C), and if there are multiple wedge segments they may define the same or different wedge angles α. At least input and output gratings similar to those shown at FIG. 2 are assumed though not shown at FIG. 6, and further the second segment 600B with parallel opposed surfaces 604B, 610B may also incorporate diffraction gratings along those surfaces.
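
The segmented geometry composes in the obvious way: only the wedged segments rotate the guided ray, while the parallel segment leaves its angle untouched. A sketch under the same 2α-per-bounce approximation; the per-segment bounce counts and starting angle are illustrative assumptions:

```python
# Propagate a guided ray's angle through the segments of FIG. 6.
# Each segment is (wedge angle alpha in degrees, number of TIR bounces);
# the 2*alpha-per-bounce model and the bounce counts are assumptions.
SEGMENTS = [
    (0.25, 2),  # 600A: narrowing wedge
    (0.0,  4),  # 600B: parallel surfaces -- no angular change
    (0.25, 2),  # 600C: narrowing wedge (may differ from 600A in general)
]

theta = 25.94  # illustrative in-coupled angle, degrees
for alpha, bounces in SEGMENTS:
    theta -= 2.0 * alpha * bounces
    print(f"after segment (alpha={alpha} deg, {bounces} bounces): {theta:.2f} deg")
```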


It is known to incorporate into the design of an EPE intermediate vertical expansion gratings, which in FIG. 1 would run vertically between the opposed surfaces 104, 110. Where such vertical expansion gratings are used in a multi-segment EPE 600 such as that shown at FIG. 6, they are preferably disposed in the second segment 600B, which exhibits a high degree of parallelism between the opposed surfaces 604B, 610B. In this case the wedge can be on either side of the non-wedge segment 600B in which the intermediate vertical expansion gratings are disposed, or in other embodiments there may be one or more wedge segments before or after the non-wedge parallel segment 600B.


While the embodiments illustrated herein show non-parallel planar surfaces, similar advantages can be gained where one or both of those surfaces are curved. The result is qualitatively similar in that the color shift arises from the non-parallelism of the opposed reflective surfaces, but the computations needed to realize a practical EPE are more extensive than for planar non-parallel surfaces.



FIG. 7 illustrates a further embodiment of an EPE 700 in which there is a thin film wedge 714 overlying the front surface with a thin air gap 718 between the thin film wedge 714 and the main EPE body 712. Optical mediums other than air may also be used for this thin gap. For simplicity the main EPE body 712 has parallel opposed surfaces 704A, 710A. As with FIGS. 1 and 2, incident light 702 enters through the back surface 704A at the input grating 706 and is refracted into the channel between those surfaces 704A, 710A. The wedge overlay 714 is disposed opposite that input pupil such that the first refraction at the input grating directs the light towards the wedge overlay 714, which extends along only a portion of the main EPE body 712. The extent of that portion depends on the characteristics of that first diffraction angle and the wavelength or wavelengths the designer selects for evanescent coupling back into the optical channel of the main EPE body 712. The example at FIG. 7 shows blue and red incident light 702, blue at the left and red at the right of the input grating 706. The blue light reflects into the thin film wedge 714, which extends far enough that a portion of this same blue light is also reflected from its front surface 710B through its back surface 704B to re-enter the main EPE body 712. The distal end 716 of the wedge overlay 714 prevents the red light that entered the wedge overlay 714 from being similarly reflected back into the main EPE body 712. In this manner the 'leaky' input light can be filtered in and other wavelengths of light can be filtered out by selection of the wedge angle of the wedge overlay 714 and the position of the distal end 716 (for a given input grating 706).
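
The wavelength selectivity of the overlay reduces to geometry: a longer wavelength diffracts to a steeper angle and walks farther along the channel per crossing, so it first reaches the front surface beyond the distal end 716 and is not returned. A rough sketch; the channel thickness and distal-end position are invented illustrative values:

```python
import math

D_NM = 1200.0   # grating pitch, as in the FIG. 4 example
T_MM = 2.0      # assumed main-body thickness
X_END_MM = 1.2  # assumed lateral position of the overlay's distal end 716

def first_hit_mm(wavelength_nm, t_mm=T_MM, d_nm=D_NM, m=1):
    """Lateral position where light first reaches the front surface:
    one crossing of the channel at the first-order diffraction angle."""
    theta = math.asin(m * wavelength_nm / d_nm)
    return t_mm * math.tan(theta)

for name, lam in (("Blue", 430.0), ("Red", 630.0)):
    x = first_hit_mm(lam)
    kept = x <= X_END_MM
    print(f"{name}: first hit at {x:.2f} mm -> "
          f"{'re-coupled by overlay' if kept else 'past distal end, filtered out'}")
```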


In the FIG. 7 embodiment the output grating 708 is disposed along the rear surface 704A to show that the advantages of these teachings do not depend on light being out-coupled from the surface opposite where it was in-coupled; this feature can be incorporated into any of the other examples herein (except that for the see-through embodiment of FIG. 5 it would be realized by in-coupling and out-coupling through the front surface 210 to retain the see-through feature). Some conventions prefer to name the surface adjacent to the output grating as the front surface, in which case surface 704A of FIG. 7 would be named the front surface and surface 710A would be named the back/rear surface; the terms front and back or rear surfaces as used herein merely designate opposing surfaces, and the input/output pupils are specifically illustrated so there is no ambiguity. Note that the micro-display which provides the image seen by the user is not particularly shown at FIGS. 1-8; the micro-display and optical engine of the host device is the source of the incident light 702 that is in-coupled to the EPE via the input grating and out-coupled from it via the output grating. Micro-displays and optical engines to drive them are well known in the head-wearable visual imaging arts; these known micro-displays and optical engines are suitable for providing the image that is in-coupled to the EPE embodiments described herein and need not be further detailed. In some embodiments of optical devices such as retinal scanning displays the image is projected directly on the user's retina, and such embodiments may or may not have any output grating at the exit pupil of the EPE.



FIG. 8 illustrates a basic wedge-type EPE 200 similar to that shown at FIG. 2 but illustrating field-of-view aspects of these teachings; the inset at FIG. 8A is similar in kind to the data plot of FIG. 3. In this illustration the user's field of view is shown for only one color, blue, which is incident 202 across the entire input grating 206 at zero degrees (normal to the plane of the grating). The solid arrows out-coupled from the output grating 208 represent the directions of the central peak of the color, and this central peak is also delineated at the inset with a solid arrow. The dashed arrows out-coupled from the output grating 208 show the sides of the color spectrum that are coupled towards the user's pupil 850, where the dashed arrows p1, p2 and p3 are also illustrated at the inset. Note that those dashed arrows are on either side of the color peak even for this basic wedge design. When the user looks at the image projected on the output grating 208 the field of view has the color error that is imposed by the non-parallel channel of the EPE 200, so long as this error is not otherwise compensated within the EPE 200.


One particular technical effect of embodiments of these teachings is an improved color space provided by augmented reality and virtual reality viewing devices, at a reduced cost. Such augmented reality or virtual reality devices would need to be designed such that the characteristics of the diffraction gratings take the wedge angle α into account, but this is an engineering matter whose cost would be more than compensated by volume sales of these retail end-user devices.


Certain of the above embodiments may be described in part by their functionality as an optical channel (the EPE) comprising an entrance pupil enabling light to enter the optical channel; an exit pupil enabling the light to exit the optical channel; a back surface 204 adjacent to the entrance pupil; and a front surface 210 opposite the back surface. In the drawings the entrance pupil is designated by the input grating 206 and the exit pupil is designated by the output grating 208; while typical embodiments will have such gratings at those entrance and exit pupils, the gratings themselves are not an essential part of the novel aspects of the optical channel/EPE presented herein. As detailed more particularly above, the optical channel/EPE is geometrically configured, that is its shape is designed, such that the light defining a center wavelength that enters the optical channel at the entrance pupil perpendicular to the back surface experiences angularly varying total internal reflection between the front and back surfaces such that the light that exits the optical channel perpendicular to the exit pupil is at a wavelength shifted from the center wavelength. The dashed lines exiting the output grating 208 are perpendicular, and FIG. 3 as well as the inset FIG. 8A illustrate the shift of the dashed-line wavelengths as compared to the solid-line peak, which is the wavelength that entered the channel at the input grating 206.


Further to the aspects of the invention demonstrated by FIG. 3, in those embodiments the center wavelength (solid line) of the light is expanded by the angularly varying total internal reflection such that a) a first portion of the expanded light that exits the optical channel perpendicular to the exit pupil (the dashed lines) is at a wavelength shifted from the center wavelength; and b) a second portion of the expanded light that exits the optical channel non-perpendicular to the exit pupil (the solid lines) is at the center wavelength. The solid versus dashed line peaks at both FIGS. 3 and 8A show that this second portion of the expanded light that exits the optical channel non-perpendicular to the exit pupil exhibits a greater intensity than the first portion of the expanded light that exits the optical channel perpendicular to the exit pupil.
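
This intensity relation follows from the source spectrum itself: the perpendicular out-coupled light sits several nanometers off the LED peak, so on any roughly Gaussian spectrum it is necessarily weaker than the on-peak, non-perpendicular portion. A minimal sketch, assuming a hypothetical 30 nm FWHM linewidth:

```python
import math

def gaussian_intensity(wavelength_nm, center_nm, fwhm_nm=30.0):
    """Relative intensity of an LED modeled with a Gaussian spectrum.
    The 30 nm FWHM is an assumed typical linewidth, not a patent value."""
    sigma = fwhm_nm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    return math.exp(-0.5 * ((wavelength_nm - center_nm) / sigma) ** 2)

peak = gaussian_intensity(525.0, 525.0)     # second portion, exits off-normal
shifted = gaussian_intensity(511.0, 525.0)  # first portion, exits perpendicular
print(f"on-peak: {peak:.2f}, shifted (511 nm): {shifted:.2f}")
# The perpendicular (shifted) portion is dimmer, as FIGS. 3 and 8A show.
```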


In the described embodiments the optical channel is geometrically configured such that the front surface and the back surface are non-parallel. While flat non-parallel surfaces are shown, curved surfaces can also be employed to take advantage of these teachings. In the specific embodiment of FIG. 5 for a non-virtual reality implementation, the front surface 210 is adjacent to the exit pupil and the optical channel 200 is a see-through exit pupil expander further comprising a compensating wedge 500 disposed adjacent to a portion of the back surface 204 opposite the exit pupil; this compensating wedge is transparent to incident environmental light 502B and is further geometrically configured to offset the angular variance that the optical channel 200 imposes on incident environmental light 502B that passes into the optical channel via the compensating wedge.



FIG. 6 demonstrated an example of a segmented EPE 600, and in this case the front and back surfaces define at least first (600B) and second (600A and/or 600C) discrete geometric segments of the optical channel, wherein the front and back surfaces are parallel in the first discrete geometric segment 600B and non-parallel in at least the second discrete geometric segment 600A/600C.


Another embodiment, shown particularly at FIG. 7, has the optical channel/EPE comprising a main channel or body 712 and a wedge overlay 714. In this case the front and back surfaces mentioned above would be considered the front 710A and rear 704A surfaces of the main channel 712; the wedge overlay would define non-parallel front 710B and back 704B overlay surfaces; the wedge overlay would be disposed adjacent to the front surface 710A of the main channel opposite the entrance pupil; and at least some of the angularly varying total internal reflection between the front and back surfaces occurs between the front surface 710B of the wedge overlay 714 and the back surface 704A of the main channel 712. For simplicity but not by way of limitation we can assume an embodiment with a wedge overlay in which the front 710A and rear 704A surfaces of the main channel 712 are parallel; a particularly useful aspect of the wedge overlay concept is that disposition of a distal end 716 of the wedge overlay 714 relative to the entrance pupil filters incident light 702 passing through the entrance pupil such that only wavelengths above or below a threshold experience the angularly varying total internal reflection while remaining wavelengths experience total internal reflection that is not angularly varying. These are respectively shown by the leftmost ray entering the input grating 706, which is reflected from the front overlay surface 710B back into the main channel 712, and by the rightmost ray entering the input grating 706, which is not reflected back into the main channel 712 from the wedge overlay 714 due to the location of the distal end 716.


Alternatively, certain embodiments of these teachings may be described by the channel's geometry and without functional terms. For example, such an optical channel 200 comprises an entrance pupil enabling light 202 to enter the optical channel; an exit pupil enabling the light to exit the optical channel; a back surface 204 adjacent to the entrance pupil; and a front surface 210 opposite the back surface. As above, the drawings depict the entrance pupil as the input grating 206 and the exit pupil as the output grating 208; such gratings may be common to most implementations but are not essential, and in particular the output grating is not needed if the host device is of the retinal scanning variety. In this way of describing the invention the optical channel is configured such that a first distance 220 at the entrance pupil between the front surface 210 and the back surface 204 is different from a second distance 222 at the exit pupil between the front surface 210 and the back surface 204.


In one such embodiment such as that shown at FIGS. 2 and 5, the front and back surfaces of the optical channel are configured to form a continuous wedge defining an angle α that quantifies an amount of non-parallelism between them. Those particularly illustrated embodiments have the first distance greater than the second distance, but an opposite arrangement is also possible within these teachings.


Though the FIG. 5 embodiment is shown as a continuous wedge this is a non-limiting feature of the see-through/non-virtual reality embodiment of the exit pupil expander where the front surface 210 is adjacent to the exit pupil. In this embodiment the see-through exit pupil expander further comprises a compensating wedge 500 disposed adjacent to a portion of the back surface 204 opposite the exit pupil, and as detailed above this compensating wedge is transparent to incident environmental light 502B and geometrically configured to offset exit pupil expansion that the optical channel/EPE 200 imposes on incident environmental light 502B that passes into the optical channel/EPE via the compensating wedge 500.


The FIG. 6 embodiment has the front and back surfaces defining at least first and second discrete geometric segments of the optical channel, wherein the front and back surfaces are parallel in the first discrete geometric segment and non-parallel in at least the second discrete geometric segment as detailed above in the functional description of the FIG. 6 embodiment.


The embodiment detailed with respect to FIG. 7 has the optical channel comprising a main channel 712 and a wedge overlay 714, and this also is fully described above in the functional description of this embodiment.


Embodiments of these teachings include the overall host device in which such an optical channel/EPE may be deployed. Such a host device is shown by example at FIG. 9 as a head-wearable imaging device which comprises a micro display that would be disposed at the image source 904 and an exit pupil expander disposed between the source 904 and the eyepiece as shown. In such embodiments the exit pupil expander may be as detailed more particularly above with entrance and exit pupils to respectively in-couple and out-couple light from the micro-display and front and back surfaces to angularly vary the total internal reflection of the light passing between those pupils. In some embodiments at least a portion of these front and back surfaces are non-parallel to one another and this region is where the light experiences the angularly varying total internal reflection; as particularly shown at FIGS. 2 and 5-8 these non-parallel portions are flat. In some deployments the head-wearable imaging device is a virtual reality imaging device that isolates the user's field of view from the surrounding environment. In other deployments the head-wearable imaging device is an augmented reality device such as the particular example shown at FIG. 9, in which case there may further be a compensating wedge along the lines of the example at FIG. 5 and disposed opposite the exit pupil such that incident environmental light 502B passes into the exit pupil expander 200 through the compensating wedge 500 and passes out of the exit pupil expander through the exit pupil without total internal reflection.



FIG. 9 is a perspective view of a non-virtual reality headset which is one type of device 900 in which embodiments of these teachings may be disposed. Typically in host devices that are implemented as eyeglasses, such as the host device 900 shown at FIG. 9, the EPE 200 would be disposed to lie along the user's temple when the device is worn, either within the earpiece 902 or separately but substantially alongside the earpiece as FIG. 9 specifically shows. For virtual reality type host devices the EPE 200 may be disposed along the user's temple as FIG. 9 illustrates, or it may be disposed to run laterally along the user's face from the temple to the eye center. In any case there is a computer-based image source 904 that provides the incident light to the entrance aperture/input grating for expansion and projection at or from the exit pupil/output grating. In some embodiments the image source 904 may generate the image itself; in others it may include a wireless receiver that receives the digitized image over a Bluetooth or other wireless connection and simply renders the received image for visual presentation.


The various embodiments presented herein provide a fuller appreciation for the scope of the teachings herein, but these are examples and do not themselves represent an inherent limit to the various types of embodiments that can exploit the teachings herein, whether such embodiments relate to the EPE itself or as to how it may be disposed on or within a host device.

Claims
  • 1. A method of operating an optical channel comprising: directing light through an entrance pupil of an optical channel body, the light being directed in a direction perpendicular to a back surface of the optical channel body, wherein the light experiences angularly varying total internal reflection between a front surface and the back surface, wherein the front surface and the back surface are non-parallel, such that the light that exits the optical channel body perpendicular to an exit pupil is at a wavelength shifted from the center wavelength, wherein a wedge overlay adjacent to the front surface of the optical channel body opposite the entrance pupil defines non-parallel front and back overlay surfaces and at least some of the angularly varying total internal reflection between the front surface and the back surface are between the front overlay surface of the wedge overlay and the back surface of the optical channel body.
  • 2. The method of claim 1, wherein the center wavelength of the light is expanded by the angularly varying total internal reflection such that: a first portion of the expanded light that exits the optical channel body perpendicular to the exit pupil is at a wavelength shifted from the center wavelength; and a second portion of the expanded light that exits the optical channel body non-perpendicular to the exit pupil is at the center wavelength.
  • 3. The method of claim 2, wherein: the second portion of the expanded light that exits the optical channel body non-perpendicular to the exit pupil exhibits a greater intensity than the first portion of the expanded light that exits the optical channel body perpendicular to the exit pupil.
  • 4. The method of claim 1, wherein the front surface is adjacent to the exit pupil and the optical channel body is a see-through exit pupil expander, further wherein a compensating wedge is disposed adjacent to a portion of the back surface opposite the exit pupil, said compensating wedge transparent to incident environmental light and geometrically configured to offset angular variance the optical channel body imposes on incident environmental light that passes into the optical channel body via the compensating wedge.
  • 5. The method of claim 4, wherein the front and back surfaces define at least first and second discrete geometric segments of the optical channel body, wherein the front and back surfaces are parallel in the first discrete geometric segment and non-parallel in at least the second discrete geometric segment.
  • 6. The method of claim 1, wherein: disposition of a distal end of the wedge overlay relative to the entrance pupil filters incident light passing through the entrance pupil such that only wavelengths above or below a threshold experience the angularly varying total internal reflection while remaining wavelengths experience total internal reflection that is not angularly varying.
  • 7. A method of operating an optical channel comprising: directing light through an entrance pupil of an optical channel body wherein the optical channel body is configured such that a first distance at the entrance pupil between a front surface and a back surface of the optical channel body is different from a second distance at the exit pupil between the front surface and the back surface, wherein the front and back surfaces of the optical channel are configured to form a continuous wedge defining an angle α that quantifies an amount of non-parallelism between them, wherein a wedge overlay is located adjacent to the front surface of the optical channel body opposite the entrance pupil, and wherein at least some of the light entering the optical channel body via the entrance pupil is reflected from a front overlay surface of the wedge overlay back into the optical channel body towards the back surface of the optical channel body.
  • 8. The method of claim 7, wherein the first distance is greater than the second distance.
  • 9. The method of claim 7, wherein the front surface is adjacent to the exit pupil and the optical channel body is a see-through exit pupil expander, further wherein a compensating wedge is disposed adjacent to a portion of the back surface opposite the exit pupil, said compensating wedge transparent to incident environmental light and geometrically configured to offset exit pupil expansion that the optical channel body imposes on incident environmental light that passes into the optical channel body via the compensating wedge.
  • 10. The method of claim 7, wherein the front and back surfaces define at least first and second discrete geometric segments of the optical channel body, wherein the front and back surfaces are parallel in the first discrete geometric segment and non-parallel in at least the second discrete geometric segment.
  • 11. The method of claim 7, wherein: disposition of a distal end of the wedge overlay relative to the entrance pupil filters incident light passing through the entrance pupil such that only wavelengths above or below a threshold are reflected from the front overlay surface into the optical channel body while remaining wavelengths reflected from the front overlay surface are not.
  • 12. A method of operating an optical channel comprising: directing light through an entrance pupil of an optical channel body wherein the optical channel body is configured such that a first distance at the entrance pupil between a front surface and a back surface of the optical channel body is different from a second distance at the exit pupil between the front surface and the back surface, wherein a wedge overlay is located adjacent to the front surface of the optical channel body opposite the entrance pupil, and wherein at least some of the light entering the optical channel body via the entrance pupil is reflected from a front overlay surface of the wedge overlay back into the optical channel body towards the back surface of the optical channel body, wherein the front surface is adjacent to the exit pupil and the optical channel body is a see-through exit pupil expander, wherein a compensating wedge is disposed adjacent to a portion of the back surface opposite the exit pupil, said compensating wedge transparent to incident environmental light and geometrically configured to offset exit pupil expansion that the optical channel body imposes on incident environmental light that passes into the optical channel body via the compensating wedge.
  • 13. A method of operating an optical channel comprising: directing light through an entrance pupil of an optical channel body wherein the optical channel body is configured such that a first distance at the entrance pupil between a front surface and a back surface of the optical channel body is different from a second distance at the exit pupil between the front surface and the back surface, wherein a wedge overlay is located adjacent to the front surface of the optical channel body opposite the entrance pupil, and wherein at least some of the light entering the optical channel body via the entrance pupil is reflected from a front overlay surface of the wedge overlay back into the optical channel body towards the back surface of the optical channel body, wherein disposition of a distal end of the wedge overlay relative to the entrance pupil filters incident light passing through the entrance pupil such that only wavelengths above or below a threshold are reflected from the front overlay surface into the optical channel body while remaining wavelengths reflected from the front overlay surface are not.
CROSS-REFERENCE TO RELATED APPLICATIONS

This is a continuation of U.S. patent application Ser. No. 16/748,193, filed on Jan. 21, 2020, which is a continuation of U.S. patent application Ser. No. 15/659,732, filed on Jul. 26, 2017, now U.S. Pat. No. 10,578,870, all of which are incorporated herein by reference in their entirety.

US Referenced Citations (449)
Number Name Date Kind
4344092 Miller Aug 1982 A
4652930 Crawford Mar 1987 A
4810080 Grendol et al. Mar 1989 A
4997268 Dauvergne Mar 1991 A
5007727 Kahaney et al. Apr 1991 A
5074295 Willis Dec 1991 A
5240220 Elberbaum Aug 1993 A
5251635 Dumoulin et al. Oct 1993 A
5410763 Bolle May 1995 A
5455625 Englander Oct 1995 A
5495286 Adair Feb 1996 A
5497463 Stein et al. Mar 1996 A
5659701 Amit et al. Aug 1997 A
5682255 Friesem et al. Oct 1997 A
5826092 Flannery Oct 1998 A
5854872 Tai Dec 1998 A
5864365 Sramek et al. Jan 1999 A
5937202 Crosetto Aug 1999 A
6012811 Chao et al. Jan 2000 A
6016160 Coombs et al. Jan 2000 A
6064749 Hirota et al. May 2000 A
6076927 Owens Jun 2000 A
6117923 Amagai et al. Sep 2000 A
6124977 Takahashi Sep 2000 A
6191809 Hori et al. Feb 2001 B1
6375369 Schneider et al. Apr 2002 B1
6385735 Wilson May 2002 B1
6538655 Kubota Mar 2003 B1
6541736 Huang et al. Apr 2003 B1
6757068 Foxlin Jun 2004 B2
7046515 Wyatt May 2006 B1
7051219 Hwang May 2006 B2
7076674 Cervantes Jul 2006 B2
7111290 Yates, Jr. Sep 2006 B1
7119819 Robertson et al. Oct 2006 B1
7219245 Raghuvanshi May 2007 B1
7431453 Hogan Oct 2008 B2
7542040 Templeman Jun 2009 B2
7573640 Nivon et al. Aug 2009 B2
7724980 Shenzhi May 2010 B1
7751662 Kleemann Jul 2010 B2
7758185 Lewis Jul 2010 B2
8060759 Arnan et al. Nov 2011 B1
8120851 Iwasa Feb 2012 B2
8214660 Capps, Jr. Jul 2012 B2
8246408 Elliot Aug 2012 B2
8353594 Lewis Jan 2013 B2
8360578 Nummela et al. Jan 2013 B2
8508676 Silverstein et al. Aug 2013 B2
8547638 Levola Oct 2013 B2
8605764 Rothaar et al. Oct 2013 B1
8619365 Harris et al. Dec 2013 B2
8696113 Lewis Apr 2014 B2
8698701 Margulis Apr 2014 B2
8733927 Lewis May 2014 B1
8736636 Kang May 2014 B2
8759929 Shiozawa et al. Jun 2014 B2
8793770 Lim Jul 2014 B2
8823855 Hwang Sep 2014 B2
8847988 Geisner et al. Sep 2014 B2
8874673 Kim Oct 2014 B2
9010929 Lewis Apr 2015 B2
9015501 Gee Apr 2015 B2
9086537 Iwasa et al. Jul 2015 B2
9095437 Boyden et al. Aug 2015 B2
9239473 Lewis Jan 2016 B2
9244293 Lewis Jan 2016 B2
9244533 Friend et al. Jan 2016 B2
9383823 Geisner et al. Jul 2016 B2
9489027 Ogletree Nov 2016 B1
9519305 Wolfe Dec 2016 B2
9581820 Robbins Feb 2017 B2
9582060 Balatsos Feb 2017 B2
9658473 Lewis May 2017 B2
9671566 Abovitz et al. Jun 2017 B2
9671615 Vallius et al. Jun 2017 B1
9696795 Marcolina et al. Jul 2017 B2
9798144 Sako et al. Oct 2017 B2
9874664 Stevens et al. Jan 2018 B2
9880441 Osterhout Jan 2018 B1
9918058 Takahasi et al. Mar 2018 B2
9955862 Freeman et al. May 2018 B2
9978118 Ozgumer et al. May 2018 B1
9996797 Holz et al. Jun 2018 B1
10018844 Levola et al. Jul 2018 B2
10082865 Raynal et al. Sep 2018 B1
10151937 Lewis Dec 2018 B2
10185147 Lewis Jan 2019 B2
10218679 Jawahar Feb 2019 B1
10241545 Richards et al. Mar 2019 B1
10317680 Richards et al. Jun 2019 B1
10436594 Belt et al. Oct 2019 B2
10516853 Gibson et al. Dec 2019 B1
10551879 Richards et al. Feb 2020 B1
10578870 Kimmel Mar 2020 B2
10698202 Kimmel et al. Jun 2020 B2
10856107 Mycek et al. Oct 2020 B2
10825424 Zhang Nov 2020 B2
10987176 Poltaretskyi et al. Apr 2021 B2
11190681 Brook et al. Nov 2021 B1
11209656 Choi et al. Dec 2021 B1
11236993 Hall et al. Feb 2022 B1
20010010598 Aritake et al. Aug 2001 A1
20020007463 Fung Jan 2002 A1
20020108064 Nunally Feb 2002 A1
20020063913 Nakamura et al. May 2002 A1
20020071050 Homberg Jun 2002 A1
20020122648 Mule′ et al. Sep 2002 A1
20020140848 Cooper et al. Oct 2002 A1
20030028816 Bacon Feb 2003 A1
20030048456 Hill Mar 2003 A1
20030067685 Niv Apr 2003 A1
20030077458 Korenaga et al. Apr 2003 A1
20030115494 Cervantes Jun 2003 A1
20030218614 Lavelle et al. Nov 2003 A1
20030219992 Schaper Nov 2003 A1
20030226047 Park Dec 2003 A1
20040001533 Tran et al. Jan 2004 A1
20040021600 Wittenberg Feb 2004 A1
20040025069 Gary et al. Feb 2004 A1
20040042377 Nikoloai et al. Mar 2004 A1
20040073822 Greco Apr 2004 A1
20040073825 Itoh Apr 2004 A1
20040111248 Granny et al. Jun 2004 A1
20040174496 Ji et al. Sep 2004 A1
20040186902 Stewart Sep 2004 A1
20040201857 Foxlin Oct 2004 A1
20040238732 State et al. Dec 2004 A1
20040240072 Schindler et al. Dec 2004 A1
20040246391 Travis Dec 2004 A1
20040268159 Aasheim et al. Dec 2004 A1
20050001977 Zelman Jan 2005 A1
20050034002 Flautner Feb 2005 A1
20050157159 Komiya et al. Jul 2005 A1
20050177385 Hull Aug 2005 A1
20050231599 Yamasaki Oct 2005 A1
20050273792 Inohara et al. Dec 2005 A1
20060013435 Rhoads Jan 2006 A1
20060015821 Jacques Parker et al. Jan 2006 A1
20060019723 Vorenkamp Jan 2006 A1
20060038880 Starkweather et al. Feb 2006 A1
20060050224 Smith Mar 2006 A1
20060090092 Verhulst Apr 2006 A1
20060126181 Levola Jun 2006 A1
20060129852 Bonola Jun 2006 A1
20060132914 Weiss et al. Jun 2006 A1
20060179329 Terechko Aug 2006 A1
20060221448 Nivon et al. Oct 2006 A1
20060228073 Mukawa et al. Oct 2006 A1
20060250322 Hall et al. Nov 2006 A1
20060259621 Ranganathan Nov 2006 A1
20060268220 Hogan Nov 2006 A1
20070058248 Nguyen et al. Mar 2007 A1
20070103836 Oh May 2007 A1
20070124730 Pytel May 2007 A1
20070159673 Freeman et al. Jul 2007 A1
20070188837 Shimizu et al. Aug 2007 A1
20070198886 Saito Aug 2007 A1
20070204672 Huang et al. Sep 2007 A1
20070213952 Cirelli Sep 2007 A1
20070283247 Brenneman et al. Dec 2007 A1
20080002259 Ishizawa et al. Jan 2008 A1
20080002260 Arrouy et al. Jan 2008 A1
20080043334 Itzkovitch et al. Feb 2008 A1
20080046773 Ham Feb 2008 A1
20080063802 Maula et al. Mar 2008 A1
20080068557 Menduni et al. Mar 2008 A1
20080146942 Dala-Krishna Jun 2008 A1
20080173036 Willaims Jul 2008 A1
20080177506 Kim Jul 2008 A1
20080183190 Adcox et al. Jul 2008 A1
20080205838 Crippa et al. Aug 2008 A1
20080215907 Wilson Sep 2008 A1
20080225393 Rinko Sep 2008 A1
20080316768 Travis Dec 2008 A1
20090153797 Allon et al. Jun 2009 A1
20090224416 Laakkonen et al. Sep 2009 A1
20090245730 Kleemann Oct 2009 A1
20090310633 Ikegami Dec 2009 A1
20100005326 Archer Jan 2010 A1
20100019962 Fujita Jan 2010 A1
20100056274 Uusitalo et al. Mar 2010 A1
20100063854 Purvis et al. Mar 2010 A1
20100079841 Levola Apr 2010 A1
20100153934 Lachner Jun 2010 A1
20100194632 Raento et al. Aug 2010 A1
20100232016 Landa et al. Sep 2010 A1
20100232031 Batchko et al. Sep 2010 A1
20100244168 Shiozawa et al. Sep 2010 A1
20100277803 Pockett et al. Nov 2010 A1
20100284085 Laakkonen Nov 2010 A1
20100296163 Sarikko Nov 2010 A1
20100309687 Sampsell et al. Dec 2010 A1
20110021263 Anderson et al. Jan 2011 A1
20110022870 Mcgrane Jan 2011 A1
20110050640 Lundback et al. Mar 2011 A1
20110050655 Mukawa Mar 2011 A1
20110122240 Becker May 2011 A1
20110145617 Thomson et al. Jun 2011 A1
20110170801 Lu et al. Jul 2011 A1
20110218733 Hamza et al. Sep 2011 A1
20110286735 Temblay Nov 2011 A1
20110291969 Rashid et al. Dec 2011 A1
20120011389 Driesen Jan 2012 A1
20120050535 Densham et al. Mar 2012 A1
20120075501 Oyagi et al. Mar 2012 A1
20120081392 Arthur Apr 2012 A1
20120089854 Breakstone Apr 2012 A1
20120113235 Shintani May 2012 A1
20120127062 Bar-Zeev et al. May 2012 A1
20120154557 Perez et al. Jun 2012 A1
20120215094 Rahimian et al. Aug 2012 A1
20120218301 Miller Aug 2012 A1
20120246506 Knight Sep 2012 A1
20120249416 Maciocci et al. Oct 2012 A1
20120249741 Maciocci et al. Oct 2012 A1
20120260083 Andrews Oct 2012 A1
20120307075 Margalitq Dec 2012 A1
20120307362 Silverstein et al. Dec 2012 A1
20120314959 White et al. Dec 2012 A1
20120320460 Levola Dec 2012 A1
20120326948 Crocco et al. Dec 2012 A1
20130021486 Richardon Jan 2013 A1
20130050258 Liu et al. Feb 2013 A1
20130050642 Lewis et al. Feb 2013 A1
20130050833 Lewis et al. Feb 2013 A1
20130051730 Travers et al. Feb 2013 A1
20130061240 Yan et al. Mar 2013 A1
20130077049 Bohn Mar 2013 A1
20130077170 Ukuda Mar 2013 A1
20130094148 Sloane Apr 2013 A1
20130129282 Li May 2013 A1
20130162940 Kurtin et al. Jun 2013 A1
20130169923 Schnoll et al. Jul 2013 A1
20130205126 Kruglick Aug 2013 A1
20130222386 Tannhauser et al. Aug 2013 A1
20130268257 Hu Oct 2013 A1
20130278633 Ahn et al. Oct 2013 A1
20130314789 Saarikko et al. Nov 2013 A1
20130318276 Dalal Nov 2013 A1
20130336138 Venkatraman et al. Dec 2013 A1
20130342564 Kinnebrew et al. Dec 2013 A1
20130342570 Kinnebrew et al. Dec 2013 A1
20130342571 Kinnebrew et al. Dec 2013 A1
20130343408 Cook Dec 2013 A1
20140013098 Yeung Jan 2014 A1
20140016821 Arth et al. Jan 2014 A1
20140022819 Oh et al. Jan 2014 A1
20140078023 Ikeda et al. Mar 2014 A1
20140082526 Park et al. Mar 2014 A1
20140119598 Ramachandran et al. May 2014 A1
20140126769 Reitmayr et al. May 2014 A1
20140140653 Brown et al. May 2014 A1
20140149573 Tofighbakhsh et al. May 2014 A1
20140168260 O'Brien et al. Jun 2014 A1
20140244983 McDonald et al. Aug 2014 A1
20140266987 Magyari Sep 2014 A1
20140267419 Ballard et al. Sep 2014 A1
20140274391 Stafford Sep 2014 A1
20140282105 Nordstrom Sep 2014 A1
20140313228 Kasahara Oct 2014 A1
20140340498 Plagemann et al. Nov 2014 A1
20140359589 Kodsky et al. Dec 2014 A1
20140375680 Ackerman et al. Dec 2014 A1
20150005785 Olson Jan 2015 A1
20150009099 Queen Jan 2015 A1
20150077312 Wang Mar 2015 A1
20150097719 Balachandreswaran et al. Apr 2015 A1
20150123966 Newman May 2015 A1
20150130790 Vazquez, II et al. May 2015 A1
20150134995 Park et al. May 2015 A1
20150138248 Schrader May 2015 A1
20150155939 Oshima et al. Jun 2015 A1
20150168221 Mao et al. Jun 2015 A1
20150205126 Schowengerdt Jul 2015 A1
20150235427 Nobori et al. Aug 2015 A1
20150235431 Schowengerdt Aug 2015 A1
20150253651 Russell et al. Sep 2015 A1
20150256484 Cameron Sep 2015 A1
20150269784 Miyawaki et al. Sep 2015 A1
20150294483 Wells et al. Oct 2015 A1
20150301955 Yakovenko et al. Oct 2015 A1
20150310657 Eden Oct 2015 A1
20150338915 Publicover et al. Nov 2015 A1
20150355481 Hilkes et al. Dec 2015 A1
20160004102 Nisper et al. Jan 2016 A1
20160015470 Border Jan 2016 A1
20160027215 Burns et al. Jan 2016 A1
20160033770 Fujimaki et al. Feb 2016 A1
20160051217 Douglas et al. Feb 2016 A1
20160077338 Robbins et al. Mar 2016 A1
20160085285 Mangione-Smith Mar 2016 A1
20160085300 Robbins et al. Mar 2016 A1
20160091720 Stafford et al. Mar 2016 A1
20160093099 Bridges Mar 2016 A1
20160093269 Buckley et al. Mar 2016 A1
20160103326 Kimura et al. Apr 2016 A1
20160123745 Cotier et al. May 2016 A1
20160155273 Lyren et al. Jun 2016 A1
20160180596 Gonzalez del Rosario Jun 2016 A1
20160187654 Border et al. Jun 2016 A1
20160191887 Casas Jun 2016 A1
20160202496 Billetz et al. Jul 2016 A1
20160217624 Finn et al. Jul 2016 A1
20160266412 Yoshida Sep 2016 A1
20160267708 Nistico et al. Sep 2016 A1
20160274733 Hasegawa et al. Sep 2016 A1
20160287337 Aram et al. Oct 2016 A1
20160300388 Stafford et al. Oct 2016 A1
20160321551 Priness et al. Nov 2016 A1
20160327798 Xiao et al. Nov 2016 A1
20160334279 Mittleman et al. Nov 2016 A1
20160357255 Lindh et al. Dec 2016 A1
20160370404 Quadrat et al. Dec 2016 A1
20160370510 Thomas Dec 2016 A1
20170038607 Camara Feb 2017 A1
20170060225 Zha et al. Mar 2017 A1
20170061696 Li et al. Mar 2017 A1
20170064066 Das et al. Mar 2017 A1
20170100664 Osterhout et al. Apr 2017 A1
20170102544 Vallius et al. Apr 2017 A1
20170115487 Travis Apr 2017 A1
20170122725 Yeoh et al. May 2017 A1
20170123526 Trail et al. May 2017 A1
20170127295 Black et al. May 2017 A1
20170131569 Aschwanden et al. May 2017 A1
20170147066 Katz et al. May 2017 A1
20170160518 Lanman et al. Jun 2017 A1
20170161951 Fix et al. Jun 2017 A1
20170185261 Perez et al. Jun 2017 A1
20170192239 Nakamura et al. Jul 2017 A1
20170201709 Igarashi et al. Jul 2017 A1
20170205903 Miller et al. Jul 2017 A1
20170206668 Poulos et al. Jul 2017 A1
20170213388 Margolis et al. Jul 2017 A1
20170219841 Popovich et al. Aug 2017 A1
20170232345 Rofougaran et al. Aug 2017 A1
20170235126 DiDomenico Aug 2017 A1
20170235129 Kamakura Aug 2017 A1
20170235142 Wall et al. Aug 2017 A1
20170235144 Piskunov et al. Aug 2017 A1
20170235147 Kamakura Aug 2017 A1
20170243403 Daniels et al. Aug 2017 A1
20170246070 Osterhout et al. Aug 2017 A1
20170254832 Ho et al. Sep 2017 A1
20170256096 Faaborg et al. Sep 2017 A1
20170258526 Lang Sep 2017 A1
20170266529 Reikmoto Sep 2017 A1
20170270712 Tyson et al. Sep 2017 A1
20170281054 Stever et al. Oct 2017 A1
20170287376 Bakar et al. Oct 2017 A1
20170293141 Schowengerdt et al. Oct 2017 A1
20170307886 Stenberg et al. Oct 2017 A1
20170307891 Bucknor et al. Oct 2017 A1
20170312032 Amanatullah et al. Nov 2017 A1
20170322418 Liu et al. Nov 2017 A1
20170322426 Tervo Nov 2017 A1
20170329137 Tervo Nov 2017 A1
20170332098 Rusanovskyy et al. Nov 2017 A1
20170336636 Amitai et al. Nov 2017 A1
20170357332 Balan et al. Dec 2017 A1
20170363871 Vallius Dec 2017 A1
20170371394 Chan Dec 2017 A1
20170371661 Sparling Dec 2017 A1
20180014266 Chen Jan 2018 A1
20180024289 Fattal Jan 2018 A1
20180044173 Netzer Feb 2018 A1
20180052007 Teskey et al. Feb 2018 A1
20180052501 Jones, Jr. et al. Feb 2018 A1
20180059305 Popovich et al. Mar 2018 A1
20180067779 Pillalamarri et al. Mar 2018 A1
20180070855 Eichler Mar 2018 A1
20180082480 White et al. Mar 2018 A1
20180088185 Woods et al. Mar 2018 A1
20180102981 Kurtzman et al. Apr 2018 A1
20180108179 Tomlin et al. Apr 2018 A1
20180114298 Malaika et al. Apr 2018 A1
20180129112 Osterhout May 2018 A1
20180131907 Schmirler et al. May 2018 A1
20180136466 Ko May 2018 A1
20180144691 Choi et al. May 2018 A1
20180150971 Adachi et al. May 2018 A1
20180151796 Akahane May 2018 A1
20180172995 Lee et al. Jun 2018 A1
20180188115 Hsu et al. Jul 2018 A1
20180189568 Powderly et al. Jul 2018 A1
20180190017 Mendez et al. Jul 2018 A1
20180191990 Motoyama et al. Jul 2018 A1
20180217395 Lin et al. Aug 2018 A1
20180218545 Garcia et al. Aug 2018 A1
20180250589 Cossairt et al. Sep 2018 A1
20180260218 Gopal Sep 2018 A1
20180284877 Klein Oct 2018 A1
20180292654 Wall et al. Oct 2018 A1
20180299678 Singer et al. Oct 2018 A1
20180357472 Dreessen Dec 2018 A1
20190005069 Filgueiras de Araujo et al. Jan 2019 A1
20190011691 Peyman Jan 2019 A1
20190056591 Tervo et al. Feb 2019 A1
20190087015 Lam et al. Mar 2019 A1
20190101758 Zhu et al. Apr 2019 A1
20190107723 Lee et al. Apr 2019 A1
20190137788 Suen May 2019 A1
20190155034 Singer et al. May 2019 A1
20190155439 Mukherjee et al. May 2019 A1
20190158926 Kang et al. May 2019 A1
20190167095 Krueger Jun 2019 A1
20190172216 Ninan et al. Jun 2019 A1
20190178654 Hare Jun 2019 A1
20190196690 Chong et al. Jun 2019 A1
20190206116 Xu et al. Jul 2019 A1
20190219815 Price et al. Jul 2019 A1
20190243123 Bohn Aug 2019 A1
20190287270 Nakamura et al. Sep 2019 A1
20190318502 He et al. Oct 2019 A1
20190318540 Piemonte et al. Oct 2019 A1
20190321728 Imai et al. Oct 2019 A1
20190347853 Chen et al. Nov 2019 A1
20190380792 Poltaretskyi et al. Dec 2019 A1
20190388182 Kumar et al. Dec 2019 A1
20200066045 Stahl et al. Feb 2020 A1
20200098188 Bar-Zeev et al. Mar 2020 A1
20200100057 Galon et al. Mar 2020 A1
20200110928 Al Jazaery et al. Apr 2020 A1
20200117267 Gibson et al. Apr 2020 A1
20200117270 Gibson et al. Apr 2020 A1
20200184217 Faulkner Jun 2020 A1
20200184653 Faulkner Jun 2020 A1
20200202759 Ukai et al. Jun 2020 A1
20200242848 Ambler et al. Jul 2020 A1
20200309944 Thoresen et al. Oct 2020 A1
20200356161 Wagner Nov 2020 A1
20200368616 Delamont Nov 2020 A1
20200391115 Leeper et al. Dec 2020 A1
20200409528 Lee Dec 2020 A1
20210008413 Asikainen et al. Jan 2021 A1
20210033871 Jacoby et al. Feb 2021 A1
20210041951 Gibson et al. Feb 2021 A1
20210053820 Gurin et al. Feb 2021 A1
20210093391 Poltaretskyi et al. Apr 2021 A1
20210093410 Gaborit et al. Apr 2021 A1
20210093414 Moore et al. Apr 2021 A1
20210097886 Kuester et al. Apr 2021 A1
20210132380 Wieczorek May 2021 A1
20210141225 Meynen et al. May 2021 A1
20210142582 Jones et al. May 2021 A1
20210158627 Cossairt et al. May 2021 A1
20210173480 Osterhout et al. Jun 2021 A1
20220366598 Azimi et al. Nov 2022 A1
Foreign Referenced Citations (100)
Number Date Country
100416340 Sep 2008 CN
101449270 Jun 2009 CN
104040410 Sep 2014 CN
104603675 May 2015 CN
106662754 May 2017 CN
107683497 Feb 2018 CN
109223121 Jan 2019 CN
105190427 Nov 2019 CN
0504930 Mar 1992 EP
0535402 Apr 1993 EP
0632360 Jan 1995 EP
1215522 Jun 2002 EP
1494110 Jan 2005 EP
1938141 Jul 2008 EP
1943556 Jul 2008 EP
2290428 Mar 2011 EP
2350774 Aug 2011 EP
1237067 Jan 2016 EP
3139245 Mar 2017 EP
3164776 May 2017 EP
3236211 Oct 2017 EP
2723240 Aug 2018 EP
2896986 Feb 2021 EP
2499635 Aug 2013 GB
2542853 Apr 2017 GB
938DEL2004 Jun 2006 IN
H03-036974 Apr 1991 JP
H10-333094 Dec 1998 JP
2002-529806 Sep 2002 JP
2003-029198 Jan 2003 JP
2003-141574 May 2003 JP
2003-228027 Aug 2003 JP
2003-329873 Nov 2003 JP
2005-303843 Oct 2005 JP
2007-012530 Jan 2007 JP
2007-86696 Apr 2007 JP
2007-273733 Oct 2007 JP
2008-257127 Oct 2008 JP
2009-090689 Apr 2009 JP
2009-244869 Oct 2009 JP
2011-033993 Feb 2011 JP
2011-257203 Dec 2011 JP
2012-015774 Jan 2012 JP
2012-235036 Nov 2012 JP
2013-525872 Jun 2013 JP
2014-500522 Jan 2014 JP
2014-192550 Oct 2014 JP
2015-191032 Nov 2015 JP
2016-502120 Jan 2016 JP
2016-85463 May 2016 JP
2016-516227 Jun 2016 JP
2016-126134 Jul 2016 JP
2017-015697 Jan 2017 JP
2017-153498 Sep 2017 JP
2017-531840 Oct 2017 JP
2017-535825 Nov 2017 JP
6232763 Nov 2017 JP
6333965 May 2018 JP
2005-0010775 Jan 2005 KR
10-1372623 Mar 2014 KR
201219829 May 2012 TW
201803289 Jan 2018 TW
1991000565 Jan 1991 WO
2000030368 Jun 2000 WO
2002071315 Sep 2002 WO
2004095248 Nov 2004 WO
2006132614 Dec 2006 WO
2007041678 Apr 2007 WO
2007037089 May 2007 WO
2007085682 Aug 2007 WO
2007102144 Sep 2007 WO
2008148927 Dec 2008 WO
2009101238 Aug 2009 WO
2014203440 Dec 2010 WO
2012030787 Mar 2012 WO
2013049012 Apr 2013 WO
2013062701 May 2013 WO
2014033306 Mar 2014 WO
2015143641 Oct 2015 WO
2016054092 Apr 2016 WO
2017004695 Jan 2017 WO
2017044761 Mar 2017 WO
2017049163 Mar 2017 WO
2017120475 Jul 2017 WO
2017176861 Oct 2017 WO
2017203201 Nov 2017 WO
2018008232 Jan 2018 WO
2018031261 Feb 2018 WO
2018022523 Feb 2018 WO
2018044537 Mar 2018 WO
2018039273 Mar 2018 WO
2018057564 Mar 2018 WO
2018085287 May 2018 WO
2018087408 May 2018 WO
2018097831 May 2018 WO
2018166921 Sep 2018 WO
2018236587 Dec 2018 WO
2019040493 Feb 2019 WO
2019148154 Aug 2019 WO
2020010226 Jan 2020 WO
Non-Patent Literature Citations (247)
Entry
“ARToolKit: Hardware”, https://web.archive.org/web/20051013062315/http://www.hitl.washington.edu:80/artoolkit/documentation/hardware.htm (downloaded Oct. 26, 2020), Oct. 13, 2005, (3 pages).
“Communication according to Rule 164(1) EPC dated Feb. 23, 2022”, European Patent Application No. 20753144.3, (11 pages).
“Communication Pursuant to Article 94(3) EPC dated Sep. 4, 2019”, European Patent Application No. 10793707.0, (4 pages).
“Communication Pursuant to Article 94(3) EPC dated Apr. 25, 2022”, European Patent Application No. 18885707.2, (5 pages).
“Communication Pursuant to Article 94(3) EPC dated Jan. 4, 2022”, European Patent Application No. 20154070.5, (8 pages).
“Communication Pursuant to Article 94(3) EPC dated May 30, 2022”, European Patent Application No. 19768418.6, (6 pages).
“Communication Pursuant to Article 94(3) EPC dated Oct. 21, 2021”, European Patent Application No. 16207441.3, (4 pages).
“Communication Pursuant to Rule 164(1) EPC dated Feb. 23, 2022”, European Patent Application No. 20753144.3, (11 pages).
“Communication Pursuant to Rule 164(1) EPC dated Jul. 27, 2021”, European Patent Application No. 19833664.6, (11 pages).
“European Search Report dated Oct. 15, 2020”, European Patent Application No. 20180623.9, (10 pages).
“Extended European Search Report dated Jul. 20, 2022”, European Patent Application No. 19885958.9, (9 pages).
“Extended European Search Report dated May 20, 2020”, European Patent Application No. 20154070.5, (7 pages).
“Extended European Search Report dated Jan. 22, 2021”, European Patent Application No. 18890390.0, (11 pages).
“Extended European Search Report dated Nov. 3, 2020”, European Patent Application No. 18885707.2, (7 pages).
“Extended European Search Report dated Jun. 30, 2021”, European Patent Application No. 19811971.1, (9 pages).
“Extended European Search Report dated Mar. 4, 2021”, European Patent Application No. 19768418.6, (9 pages).
“Extended European Search Report dated Nov. 4, 2020”, European Patent Application No. 20190980.1, (14 pages).
“Extended European Search Report dated Aug. 24, 2022”, European Patent Application No. 20846338.0, (13 pages).
“Extended European Search Report dated Aug. 8, 2022”, European Patent Application No. 19898874.3, (8 pages).
“Extended European Search Report dated Sep. 8, 2022”, European Patent Application No. 20798769.4, (13 pages).
“Extended European Search Report dated Nov. 3, 2022”, European Patent Application No. 20770244.0, (23 pages).
“Extended European Search Report dated Jun. 12, 2017”, European Patent Application No. 16207441.3, (8 pages).
“Extended European Search Report dated Jan. 28, 2022”, European Patent Application No. 19815876.8, (9 pages).
“Extended European Search Report dated Jan. 4, 2022”, European Patent Application No. 19815085.6, (9 pages).
“Extended European Search Report dated Jul. 16, 2021”, European Patent Application No. 19810142.0, (14 pages).
“Extended European Search Report dated Jul. 30, 2021”, European Patent Application No. 19839970.1, (7 pages).
“Extended European Search Report dated Jun. 19, 2020”, European Patent Application No. 20154750.2, (10 pages).
“Extended European Search Report dated Mar. 22, 2022”, European Patent Application No. 19843487.0, (14 pages).
“Extended European Search Report dated May 16, 2022”, European Patent Application No. 19871001.4, (9 pages).
“Extended European Search Report dated May 30, 2022”, European Patent Application No. 20753144.3, (10 pages).
“Extended European Search Report dated Oct. 27, 2021”, European Patent Application No. 19833664.6, (10 pages).
“Extended European Search Report dated Sep. 20, 2021”, European Patent Application No. 19851373.1, (8 pages).
“Extended European Search Report dated Sep. 28, 2021”, European Patent Application No. 19845418.3, (13 pages).
“Final Office Action dated Aug. 10, 2020”, U.S. Appl. No. 16/225,961, (13 pages).
“Final Office Action dated Dec. 4, 2019”, U.S. Appl. No. 15/564,517, (15 pages).
“Final Office Action dated Feb. 19, 2020”, U.S. Appl. No. 15/552,897, (17 pages).
“Final Office Action dated Feb. 23, 2022”, U.S. Appl. No. 16/748,193, (23 pages).
“Final Office Action dated Feb. 3, 2022”, U.S. Appl. No. 16/864,721, (36 pages).
“Final Office Action dated Jul. 13, 2022”, U.S. Appl. No. 17/262,991, (18 pages).
“Final Office Action dated Jun. 15, 2021”, U.S. Appl. No. 16/928,313, (42 pages).
“Final Office Action dated Mar. 1, 2021”, U.S. Appl. No. 16/214,575, (29 pages).
“Final Office Action dated Mar. 19, 2021”, U.S. Appl. No. 16/530,776, (25 pages).
“Final Office Action dated Nov. 24, 2020”, U.S. Appl. No. 16/435,933, (44 pages).
“Final Office Action dated Sep. 17, 2021”, U.S. Appl. No. 16/938,782, (44 pages).
“First Examination Report dated Jul. 27, 2022”, Chinese Patent Application No. 201980036675.2, (5 pages).
“First Examination Report dated Jul. 28, 2022”, Indian Patent Application No. 202047024232, (6 pages).
“First Examination Report dated May 13, 2022”, Indian Patent Application No. 202047026359, (8 pages).
“First Office Action dated Feb. 11, 2022 with English translation”, Chinese Patent Application No. 201880089255.6, (17 pages).
“First Office Action dated Mar. 14, 2022 with English translation”, Chinese Patent Application No. 201880079474.6, (11 pages).
“First Office Action dated Sep. 16, 2022 with English translation”, Chinese Patent Application No. 201980063642.7, (7 pages).
“FS_XR5G: Permanent document, v0.4.0”, Qualcomm Incorporated, 3GPP TSG-SA 4 Meeting 103 retrieved from the Internet: URL:http://www.3gpp.org/ftp/Meetings%5F3GPP%5FSYNC/SA4/Docs/S4%2DI90526%2Ezip [retrieved on Apr. 12, 2019], Apr. 12, 2019, (98 pages).
“International Search Report and Written Opinion dated Feb. 12, 2021”, International Application No. PCT/US20/60555, (25 pages).
“International Search Report and Written Opinion dated Mar. 12, 2020”, International PCT Patent Application No. PCT/US19/67919, (14 pages).
“International Search Report and Written Opinion dated Aug. 15, 2019”, International PCT Patent Application No. PCT/US19/33987, (20 pages).
“International Search Report and Written Opinion dated Jun. 15, 2020”, International PCT Patent Application No. PCT/US2020/017023, (13 pages).
“International Search Report and Written Opinion dated Oct. 16, 2019”, International PCT Patent Application No. PCT/US19/43097, (10 pages).
“International Search Report and Written Opinion dated Oct. 16, 2019”, International PCT Patent Application No. PCT/US19/36275, (10 pages).
“International Search Report and Written Opinion dated Oct. 16, 2019”, International PCT Patent Application No. PCT/US19/43099, (9 pages).
“International Search Report and Written Opinion dated Jun. 17, 2016”, International PCT Patent Application No. PCT/FI2016/050172, (9 pages).
“International Search Report and Written Opinion dated Feb. 2, 2021”, International PCT Patent Application No. PCT/US20/60550, (9 pages).
“International Search Report and Written Opinion dated Oct. 22, 2019”, International PCT Patent Application No. PCT/US19/43751, (9 pages).
“International Search Report and Written Opinion dated Dec. 23, 2019”, International PCT Patent Application No. PCT/US19/44953, (11 pages).
“International Search Report and Written Opinion dated May 23, 2019”, International PCT Patent Application No. PCT/US18/66514, (17 pages).
“International Search Report and Written Opinion dated Sep. 26, 2019”, International PCT Patent Application No. PCT/US19/40544, (12 pages).
“International Search Report and Written Opinion dated Aug. 27, 2019”, International PCT Application No. PCT/US2019/035245, (8 pages).
“International Search Report and Written Opinion dated Dec. 27, 2019”, International Application No. PCT/US19/47746, (16 pages).
“International Search Report and Written Opinion dated Dec. 3, 2020”, International Patent Application No. PCT/US20/43596, (25 pages).
“International Search Report and Written Opinion dated Sep. 30, 2019”, International Patent Application No. PCT/US19/40324, (7 pages).
“International Search Report and Written Opinion dated Sep. 4, 2020”, International Patent Application No. PCT/US20/31036, (13 pages).
“International Search Report and Written Opinion dated Jun. 5, 2020”, International Patent Application No. PCT/US20/19871, (9 pages).
“International Search Report and Written Opinion dated Aug. 8, 2019”, International PCT Patent Application No. PCT/US2019/034763, (8 pages).
“International Search Report and Written Opinion dated Oct. 8, 2019”, International PCT Patent Application No. PCT/US19/41151, (7 pages).
“International Search Report and Written Opinion dated Jan. 9, 2020”, International Application No. PCT/US19/55185, (10 pages).
“International Search Report and Written Opinion dated Feb. 28, 2019”, International Patent Application No. PCT/US18/64686, (8 pages).
“International Search Report and Written Opinion dated Feb. 7, 2020”, International PCT Patent Application No. PCT/US2019/061265, (11 pages).
“International Search Report and Written Opinion dated Jun. 11, 2019”, International PCT Application No. PCT/US19/22620, (7 pages).
“Invitation to Pay Additional Fees dated Aug. 15, 2019”, International PCT Patent Application No. PCT/US19/36275, (2 pages).
“Invitation to Pay Additional Fees dated Sep. 24, 2020”, International Patent Application No. PCT/US2020/043596, (3 pages).
“Invitation to Pay Additional Fees dated Oct. 22, 2019”, International PCT Patent Application No. PCT/US19/47746, (2 pages).
“Invitation to Pay Additional Fees dated Apr. 3, 2020”, International Patent Application No. PCT/US20/17023, (2 pages).
“Invitation to Pay Additional Fees dated Oct. 17, 2019”, International PCT Patent Application No. PCT/US19/44953, (2 pages).
“Multi-core processor”, TechTarget, 2013, (1 page).
“Non Final Office Action dated Nov. 19, 2019”, U.S. Appl. No. 16/355,611, (31 pages).
“Non Final Office Action dated Apr. 1, 2022”, U.S. Appl. No. 17/256,961, (65 pages).
“Non Final Office Action dated Apr. 11, 2022”, U.S. Appl. No. 16/938,782, (52 pages).
“Non Final Office Action dated Apr. 12, 2022”, U.S. Appl. No. 17/262,991, (60 pages).
“Non Final Office Action dated Aug. 21, 2019”, U.S. Appl. No. 15/564,517, (14 pages).
“Non Final Office Action dated Aug. 4, 2021”, U.S. Appl. No. 16/864,721, (21 pages).
“Non Final Office Action dated Dec. 7, 2022”, U.S. Appl. No. 17/357,795, (63 pages).
“Non Final Office Action dated Feb. 2, 2022”, U.S. Appl. No. 16/783,866, (8 pages).
“Non Final Office Action dated Jan. 26, 2021”, U.S. Appl. No. 16/928,313, (33 pages).
“Non Final Office Action dated Jan. 27, 2021”, U.S. Appl. No. 16/225,961, (15 pages).
“Non Final Office Action dated Jul. 26, 2022”, U.S. Appl. No. 17/098,059, (28 pages).
“Non Final Office Action dated Jul. 27, 2020”, U.S. Appl. No. 16/435,933, (16 pages).
“Non Final Office Action dated Jul. 9, 2021”, U.S. Appl. No. 17/002,663, (43 pages).
“Non Final Office Action dated Jul. 9, 2021”, U.S. Appl. No. 16/833,093, (47 pages).
“Non Final Office Action dated Jun. 10, 2021”, U.S. Appl. No. 16/938,782, (40 Pages).
“Non Final Office Action dated Jun. 17, 2020”, U.S. Appl. No. 16/682,911, (22 pages).
“Non Final Office Action dated Jun. 19, 2020”, U.S. Appl. No. 16/225,961, (35 pages).
“Non Final Office Action dated Jun. 29, 2021”, U.S. Appl. No. 16/698,588, (58 pages).
“Non Final Office Action dated Mar. 3, 2021”, U.S. Appl. No. 16/427,337, (41 pages).
“Non Final Office Action dated Mar. 31, 2022”, U.S. Appl. No. 17/257,814, (60 pages).
“Non Final Office Action dated Mar. 9, 2022”, U.S. Appl. No. 16/870,676, (57 pages).
“Non Final Office Action dated May 10, 2022”, U.S. Appl. No. 17/140,921, (25 pages).
“Non Final Office Action dated May 17, 2022”, U.S. Appl. No. 16/748,193, (11 pages).
“Non Final Office Action dated May 26, 2021”, U.S. Appl. No. 16/214,575, (19 pages).
“Non Final Office Action dated Nov. 5, 2020”, U.S. Appl. No. 16/530,776, (45 pages).
“Non Final Office Action dated Oct. 22, 2019”, U.S. Appl. No. 15/859,277, (15 pages).
“Non Final Office Action dated Sep. 1, 2020”, U.S. Appl. No. 16/214,575, (40 pages).
“Non Final Office Action dated Sep. 19, 2022”, U.S. Appl. No. 17/263,001, (14 pages).
“Non Final Office Action dated Sep. 20, 2021”, U.S. Appl. No. 17/105,848, (56 pages).
“Non Final Office Action dated Sep. 29, 2021”, U.S. Appl. No. 16/748,193, (62 pages).
“Notice of Allowance dated Mar. 25, 2020”, U.S. Appl. No. 15/564,517, (11 pages).
“Notice of Allowance dated Oct. 5, 2020”, U.S. Appl. No. 16/682,911, (27 pages).
“Notice of Reason for Rejection dated Oct. 28, 2022 with English translation”, Japanese Patent Application No. 2020-531452, (3 pages).
“Notice of Reason of Refusal dated Sep. 11, 2020 with English translation”, Japanese Patent Application No. 2019-140435, (6 pages).
“Office Action dated Nov. 24, 2022 with English Translation”, Japanese Patent Application No. 2020-533730, (11 pages).
“Phototourism Challenge”, CVPR 2019 Image Matching Workshop, https://image-matching-workshop.github.io/, (16 pages).
“Second Office Action dated Jul. 13, 2022 with English Translation”, Chinese Patent Application No. 201880079474.6, (10 pages).
“Second Office Action dated Jun. 20, 2022 with English Translation”, Chinese Patent Application No. 201880089255.6, (14 pages).
“Summons to attend oral proceedings pursuant to Rule 115(1) EPC mailed on Jul. 15, 2019”, European Patent Application No. 15162521.7, (7 pages).
Aarik, J., et al., “Effect of crystal structure on optical properties of TiO2 films grown by atomic layer deposition”, Thin Solid Films; Publication [online]. May 19, 1998 [retrieved Feb. 19, 2020]. Retrieved from the Internet: <URL: https://www.sciencedirect.com/science/article/pii/S0040609097001351?via%3Dihub>; DOI: 10.1016/S0040-6090(97)00135-1; see entire document, (2 pages).
Altwaijry, et al., “Learning to Detect and Match Keypoints with Deep Architectures”, Proceedings of the British Machine Vision Conference (BMVC), BMVA Press, Sep. 2016, [retrieved on Jan. 8, 2021] <URL: http://www.bmva.org/bmvc/2016/papers/paper049/index.html>, entire document, especially Abstract.
Anonymous, “Koi Pond: Top iPhone App Store Paid App”, https://web.archive.org/web/20080904061233/https://www.iphoneincanada.ca/reviews/koi-pond-top-iphone-app-store-paid-app/ [retrieved on Aug. 9, 2022].
Arandjelović, Relja, et al., “Three things everyone should know to improve object retrieval”, CVPR, 2012, (8 pages).
Azom, “Silica—Silicon Dioxide (SiO2)”, AZO Materials; Publication [Online]. Dec. 13, 2001 [retrieved Feb. 19, 2020]. Retrieved from the Internet: <URL: https://www.azom.com/article.aspx?ArticleID=1114>.
Azuma, Ronald T., “A Survey of Augmented Reality”, Presence: Teleoperators and Virtual Environments 6, 4 (Aug. 1997), 355-385; https://web.archive.org/web/20010604100006/http://www.cs.unc.edu/~azuma/ARpresence.pdf (downloaded Oct. 26, 2020).
Azuma, Ronald T., “Predictive Tracking for Augmented Reality”, Department of Computer Science, Chapel Hill NC; TR95-007, Feb. 1995, 262 pages.
Battaglia, Peter W, et al., “Relational inductive biases, deep learning, and graph networks”, arXiv:1806.01261, Oct. 17, 2018, pp. 1-40.
Berg, Alexander C, et al., “Shape matching and object recognition using low distortion correspondences”, In CVPR, 2005, (8 pages).
Bian, Jiawang, et al., “GMS: Grid-based motion statistics for fast, ultra-robust feature correspondence.”, In CVPR (Conference on Computer Vision and Pattern Recognition), 2017, (10 pages).
Bimber, Oliver, et al., “Spatial Augmented Reality: Merging Real and Virtual Worlds”, https://web.media.mit.edu/~raskar/book/BimberRaskarAugmentedRealityBook.pdf; published by A K Peters/CRC Press (Jul. 31, 2005); eBook (3rd Edition, 2007), (393 pages).
Brachmann, Eric, et al., “Neural-Guided RANSAC: Learning Where to Sample Model Hypotheses”, In ICCV (International Conference on Computer Vision), arXiv:1905.04132v2 [cs.CV] Jul. 31, 2019, (17 pages).
Butail, et al., “Putting the fish in the fish tank: Immersive VR for animal behavior experiments”, In: 2012 IEEE International Conference on Robotics and Automation, May 18, 2012, retrieved on Nov. 14, 2020 from <http://cdcl.umd.edu/papers/icra2012.pdf>, entire document.
Caetano, Tiberio S, et al., “Learning graph matching”, IEEE TPAMI, 31(6):1048-1058, 2009.
Cech, Jan, et al., “Efficient sequential correspondence selection by cosegmentation”, IEEE TPAMI, 32(9):1568-1581, Sep. 2010.
Chittineni, C., et al., “Single filters for combined image geometric manipulation and enhancement”, Proceedings of SPIE vol. 1903, Image and Video Processing, Apr. 8, 1993, San Jose, CA. (Year: 1993), pp. 111-121.
Cuturi, Marco, “Sinkhorn distances: Lightspeed computation of optimal transport”, NIPS, 2013, (9 pages).
Dai, Angela, et al., “ScanNet: Richly-annotated 3d reconstructions of indoor scenes”, In CVPR, arXiv:1702.04405v2 [cs.CV] Apr. 11, 2017, (22 pages).
Deng, Haowen, et al., “PPFnet: Global context aware local features for robust 3d point matching”, In CVPR, arXiv:1802.02669v2 [cs.CV] Mar. 1, 2018, (12 pages).
Detone, Daniel, et al., “Deep image homography estimation”, In RSS Work-shop: Limits and Potentials of Deep Learning in Robotics, arXiv:1606.03798v1 [cs.CV] Jun. 13, 2016, (6 pages).
Detone, Daniel, et al., “Self-improving visual odometry”, arXiv:1812.03245, Dec. 8, 2018, (9 pages).
Detone, Daniel, et al., “SuperPoint: Self-supervised interest point detection and description”, In CVPR Workshop on Deep Learning for Visual SLAM, arXiv:1712.07629v4 [cs.CV] Apr. 19, 2018, (13 pages).
Dusmanu, Mihai, et al., “D2-net: A trainable CNN for joint detection and description of local features”, CVPR, arXiv:1905.03561v1 [cs.CV] May 9, 2019, (16 pages).
Ebel, Patrick, et al., “Beyond cartesian representations for local descriptors”, ICCV, arXiv:1908.05547v1 [cs.CV] Aug. 15, 2019, (11 pages).
Fischler, Martin A, et al., “Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography”, Communications of the ACM, 24(6): 1981, pp. 381-395.
Gilmer, Justin, et al., “Neural message passing for quantum chemistry”, In ICML, arXiv:1704.01212v2 [cs.LG] Jun. 12, 2017, (14 pages).
Giuseppe, Donato, et al., “Stereoscopic helmet mounted system for real time 3D environment reconstruction and indoor ego-motion estimation”, Proc. SPIE 6955, Head- and Helmet-Mounted Displays XIII: Design and Applications, SPIE Defense and Security Symposium, 2008, Orlando, Florida, United States, 69550P.
Goodfellow, “Titanium Dioxide-Titania (TiO2)”, AZO Materials; Publication [online]. Jan. 11, 2002 [retrieved Feb. 19, 2020]. Retrieved from the Internet: <URL: https://www.azom.com/article.aspx?ArticleID=1179>.
Hartley, Richard, et al., “Multiple View Geometry in Computer Vision”, Cambridge University Press, 2003, pp. 1-673.
Jacob, Robert J.K., “Eye Tracking in Advanced Interface Design”, Human-Computer Interaction Lab, Naval Research Laboratory, Washington, D.C., date unknown, 2003, pp. 1-50.
Lee, et al., “Self-Attention Graph Pooling”, Cornell University Library/Computer Science/Machine Learning, Apr. 17, 2019 [retrieved on Jan. 8, 2021 from the Internet <URL: https://arxiv.org/abs/1904.08082>], entire document.
Lee, Juho, et al., “Set transformer: A framework for attention-based permutation-invariant neural networks”, ICML, arXiv:1810.00825v3 [cs.LG] May 26, 2019, (17 pages).
Leordeanu, Marius, et al., “A spectral technique for correspondence problems using pairwise constraints”, Proceedings of (ICCV) International Conference on Computer Vision, vol. 2, pp. 1482-1489, Oct. 2005, (8 pages).
Levola, T., “Diffractive Optics for Virtual Reality Displays”, Journal of the SID Eurodisplay 14/05, 2005, XP008093627, chapters 2-3, Figures 2 and 10, pp. 467-475.
Levola, Tapani, “Invited Paper: Novel Diffractive Optical Components for Near to Eye Displays—Nokia Research Center”, SID 2006 Digest, 2006 SID International Symposium, Society for Information Display, vol. XXXVII, May 24, 2005, chapters 1-3, figures 1 and 3, pp. 64-67.
Li, Yujia, et al., “Graph matching networks for learning the similarity of graph structured objects”, ICML, arXiv:1904.12787v2 [cs.LG] May 12, 2019, (18 pages).
Li, Zhengqi, et al., “Megadepth: Learning single-view depth prediction from internet photos”, In CVPR, arXiv:1804.00607v4 [cs.CV] Nov. 28, 2018, (10 pages).
Libovicky, et al., “Input Combination Strategies for Multi-Source Transformer Decoder”, Proceedings of the Third Conference on Machine Translation (WMT), vol. 1: Research Papers, Belgium, Brussels, Oct. 31-Nov. 1, 2018; retrieved on Jan. 8, 2021 from <URL: https://doi.org/10.18653/v1/W18-64026>, entire document.
Loiola, Eliane Maria, et al., “A survey for the quadratic assignment problem”, European journal of operational research, 176(2): 2007, pp. 657-690.
Lowe, David G, “Distinctive image features from scale-invariant keypoints”, International Journal of Computer Vision, 60(2): 91-110, 2004, (28 pages).
Luo, Zixin, et al., “ContextDesc: Local descriptor augmentation with cross-modality context”, CVPR, arXiv:1904.04084v1 [cs.CV] Apr. 8, 2019, (14 pages).
Memon, F., et al., “Synthesis, Characterization and Optical Constants of Silicon Oxycarbide”, EPJ Web of Conferences; Publication [online]. Mar. 23, 2017 [retrieved Feb. 19, 2020]. <URL: https://www.epj-conferences.org/articles/epjconf/pdf/2017/08/epjconf_nanop2017_00002.pdf>; DOI: 10.1051/epjconf/201713900002, (8 pages).
Molchanov, Pavlo, et al., “Short-range FMCW monopulse radar for hand-gesture sensing”, 2015 IEEE Radar Conference (RadarCon) (2015), pp. 1491-1496.
Mrad, et al., “A framework for System Level Low Power Design Space Exploration”, 1991.
Munkres, James, “Algorithms for the assignment and transportation problems”, Journal of the Society for Industrial and Applied Mathematics, 5(1): 1957, pp. 32-38.
Ono, Yuki, et al., “LF-Net: Learning local features from images”, 32nd Conference on Neural Information Processing Systems (NIPS 2018), arXiv:1805.09662v2 [cs.CV] Nov. 22, 2018, (13 pages).
Paszke, Adam, et al., “Automatic differentiation in Pytorch”, 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, (4 pages).
Peyré, Gabriel, et al., “Computational Optimal Transport”, Foundations and Trends in Machine Learning, 11(5-6):355-607, 2019; arXiv:1803.00567v4 [stat.ML] Mar. 18, 2020, (209 pages).
Qi, Charles Ruizhongtai, et al., “Pointnet++: Deep hierarchical feature learning on point sets in a metric space”, 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, Jun. 7, 2017, (10 pages).
Qi, Charles R, et al., “Pointnet: Deep Learning on Point Sets for 3D Classification and Segmentation”, CVPR, arXiv:1612.00593v2 [cs.CV] Apr. 10, 2017, (19 pages).
Radenović, Filip, et al., “Revisiting Oxford and Paris: Large-Scale Image Retrieval Benchmarking”, CVPR, arXiv:1803.11285v1 [cs.CV] Mar. 29, 2018, (10 pages).
Raguram, Rahul, et al., “A comparative analysis of ransac techniques leading to adaptive real-time random sample consensus”, Computer Vision—ECCV 2008, 10th European Conference on Computer Vision, Marseille, France, Oct. 12-18, 2008, Proceedings, Part I, (15 pages).
Ranftl, René, et al., “Deep fundamental matrix estimation”, European Conference on Computer Vision (ECCV), 2018, (17 pages).
Revaud, Jerome, et al., “R2D2: Repeatable and Reliable Detector and Descriptor”, In NeurIPS, arXiv:1906.06195v2 [cs.CV] Jun. 17, 2019, (12 pages).
Rocco, Ignacio, et al., “Neighbourhood Consensus Networks”, 32nd Conference on Neural Information Processing Systems (NeurIPS 2018), Montreal, Canada, arXiv:1810.10510v2 [cs.CV] Nov. 29, 2018, (20 pages).
Rublee, Ethan, et al., “ORB: An efficient alternative to SIFT or SURF”, Proceedings of the IEEE International Conference on Computer Vision. 2564-2571. 2011; 10.1109/ICCV.2011.612654, (9 pages).
Sarlin, et al., “SuperGlue: Learning Feature Matching with Graph Neural Networks”, Cornell University Library/Computer Science/Computer Vision and Pattern Recognition, Nov. 26, 2019 [retrieved on Jan. 8, 2021 from the Internet <URL: https://arxiv.org/abs/1911.11763>], entire document.
Sattler, Torsten, et al., “SCRAMSAC: Improving RANSAC's efficiency with a spatial consistency filter”, ICCV, 2009: 2090-2097., (8 pages).
Schonberger, Johannes Lutz, et al., “Pixelwise view selection for unstructured multi-view stereo”, Computer Vision—ECCV 2016: 14th European Conference, Amsterdam, The Netherlands, Oct. 11-14, 2016, Proceedings, Part III, pp. 501-518, 2016.
Schonberger, Johannes Lutz, et al., “Structure-from-motion revisited”, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2016, pp. 4104-4113, (11 pages).
Sheng, Liu, et al., “Time-multiplexed dual-focal plane head-mounted display with a liquid lens”, Optics Letters, Optical Society of America, US, vol. 34, No. 11, Jun. 1, 2009 (Jun. 1, 2009), XP001524475, ISSN: 0146-9592, pp. 1642-1644.
Sinkhorn, Richard, et al., “Concerning nonnegative matrices and doubly stochastic matrices.”, Pacific Journal of Mathematics, 1967, pp. 343-348.
Spencer, T., et al., “Decomposition of poly(propylene carbonate) with UV sensitive iodonium salts”, Polymer Degradation and Stability; [online]. Dec. 24, 2010 [retrieved Feb. 19, 2020], (17 pages).
Tanriverdi, Vildan, et al., “Interacting With Eye Movements in Virtual Environments”, Department of Electrical Engineering and Computer Science, Tufts University; Proceedings of the SIGCHI conference on Human Factors in Computing Systems, Apr. 2000, pp. 1-8.
Thomee, Bart, et al., “YFCC100m: The new data in multimedia research”, Communications of the ACM, 59(2):64-73, 2016; arXiv:1503.01817v2 [cs.MM] Apr. 25, 2016, (8 pages).
Torresani, Lorenzo, et al., “Feature correspondence via graph matching: Models and global optimization”, Computer Vision—ECCV 2008, 10th European Conference on Computer Vision, Marseille, France, Oct. 12-18, 2008, Proceedings, Part II, (15 pages).
Tuytelaars, Tinne, et al., “Wide baseline stereo matching based on local, affinely invariant regions”, BMVC, 2000, pp. 1-14.
Ulyanov, Dmitry, et al., “Instance normalization: The missing ingredient for fast stylization”, arXiv:1607.08022v3 [cs.CV] Nov. 6, 2017, (6 pages).
Vaswani, Ashish, et al., “Attention is all you need”, 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA; arXiv:1706.03762v5 [cs.CL] Dec. 6, 2017, (15 pages).
Veličković, Petar, et al., “Graph attention networks”, ICLR, arXiv:1710.10903v3 [stat.ML] Feb. 4, 2018, (12 pages).
Villani, Cédric, “Optimal transport: old and new”, vol. 338. Springer Science & Business Media, Jun. 2008, pp. 1-998.
Wang, Xiaolong, et al., “Non-local neural networks”, CVPR, arXiv:1711.07971v3 [cs.CV] Apr. 13, 2018, (10 pages).
Wang, Yue, et al., “Deep Closest Point: Learning representations for point cloud registration”, ICCV, arXiv:1905.03304v1 [cs.CV] May 8, 2019, (10 pages).
Wang, Yue, et al., “Dynamic Graph CNN for learning on point clouds”, ACM Transactions on Graphics, arXiv:1801.07829v2 [cs.CV] Jun. 11, 2019, (13 pages).
Weissel, et al., “Process cruise control: event-driven clock scaling for dynamic power management”, Proceedings of the 2002 international conference on Compilers, architecture, and synthesis for embedded systems, Oct. 11, 2002, retrieved on May 16, 2020 from <URL: https://dl.acm.org/doi/pdf/10.1145/581630.581668>.
Yi, Kwang Moo, et al., “Learning to find good correspondences”, CVPR, arXiv:1711.05971v2 [cs.CV] May 21, 2018, (13 pages).
Yi, Kwang Moo, et al., “Lift: Learned invariant feature transform”, ECCV, arXiv:1603.09114v2 [cs.CV] Jul. 29, 2016, (16 pages).
Zaheer, Manzil, et al., “Deep Sets”, 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA; arXiv:1703.06114v3 [cs.LG] Apr. 14, 2018, (29 pages).
Zhang, Jiahui, et al., “Learning two-view correspondences and geometry using order-aware network”, ICCV; arXiv:1908.04964v1 [cs.CV] Aug. 14, 2019, (11 pages).
Zhang, Li, et al., “Dual graph convolutional network for semantic segmentation”, BMVC, 2019; arXiv:1909.06121v3 [cs.CV] Aug. 26, 2020, (18 pages).
“Communication Pursuant to Article 94(3) EPC dated Feb. 28, 2023”, European Patent Application No. 19845418.3, (6 Pages).
“Communication Pursuant to Article 94(3) EPC dated May 23, 2023”, European Patent Application No. 18890390.0, (5 pages).
“Decision of Rejection dated Jan. 5, 2023 with English translation”, Chinese Patent Application No. 201880079474.6, (10 pages).
“Extended European Search Report dated Apr. 5, 2023”, European Patent Application No. 20888716.6, (11 pages).
“Extended European Search Report dated Dec. 14, 2022”, European Patent Application No. 20886547.7, (8 pages).
“Final Office Action dated Dec. 29, 2022”, U.S. Appl. No. 17/098,059, (32 pages).
“Final Office Action dated Mar. 10, 2023”, U.S. Appl. No. 17/357,795, (15 pages).
“First Office Action dated Apr. 21, 2023 with English translation”, Japanese Patent Application No. 2021-509779, (26 pages).
“First Office Action dated Jul. 4, 2023 with English translation”, Japanese Patent Application No. 2021-505669, (6 pages).
“First Office Action dated Apr. 13, 2023 with English Translation”, Japanese Patent Application No. 2020-567766, (7 pages).
“First Office Action dated Dec. 22, 2022 with English translation”, Chinese Patent Application No. 201980061450.2, (11 pages).
“First Office Action dated Jan. 24, 2023 with English translation”, Japanese Patent Application No. 2020-549034, (7 pages).
“First Office Action dated Jan. 30, 2023 with English translation”, Chinese Patent Application No. 201980082951.9, (5 pages).
“First Office Action dated Jun. 13, 2023 with English translation”, Japanese Patent Application No. 2020-567853, (7 pages).
“First Office Action dated Mar. 27, 2023 with English translation”, Japanese Patent Application No. 2020-566617, (6 pages).
“First Office Action dated Mar. 6, 2023 with English translation”, Korean Patent Application No. 10-2020-7019685, (7 pages).
“First Office Action dated May 26, 2023 with English translation”, Japanese Patent Application No. 2021-500607, (6 pages).
“First Office Action dated May 30, 2023 with English translation”, Japanese Patent Application No. 2021-519873, (8 pages).
“Non Final Office Action dated Apr. 13, 2023”, U.S. Appl. No. 17/098,043, (7 pages).
“Non Final Office Action dated Feb. 3, 2023”, U.S. Appl. No. 17/429,100, (16 pages).
“Non Final Office Action dated Feb. 3, 2023”, U.S. Appl. No. 17/497,965, (32 pages).
“Non Final Office Action dated Jan. 24, 2023”, U.S. Appl. No. 17/497,940, (10 pages).
“Non Final Office Action dated Jul. 20, 2023”, U.S. Appl. No. 17/650,188, (11 pages).
“Non Final Office Action dated Jun. 14, 2023”, U.S. Appl. No. 17/516,483, (10 pages).
“Non Final Office Action dated Mar. 1, 2023”, U.S. Appl. No. 18/046,739, (34 pages).
“Non Final Office Action dated May 11, 2023”, U.S. Appl. No. 17/822,279, (24 pages).
“Office Action dated Apr. 13, 2023 with English translation”, Japanese Patent Application No. 2020-533730, (13 pages).
“Office Action dated Jun. 8, 2023 with English translation”, Japanese Patent Application No. 2021-503762, (6 pages).
“Office Action dated Mar. 30, 2023 with English translation”, Japanese Patent Application No. 2020-566620, (10 pages).
“Second Office Action dated May 2, 2023 with English Translation”, Japanese Patent Application No. 2020-549034, (6 pages).
Li, Yujia, et al., “Graph Matching Networks for Learning the Similarity of Graph Structured Objects”, arxiv.org, Cornell University Library, 201 Olin Library Cornell University Ithaca, NY 14853, XP081268608, Apr. 29, 2019.
Luo, Zixin, et al., “ContextDesc: Local Descriptor Augmentation With Cross-Modality Context”, 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), IEEE, XP033686823, DOI: 10.1109/CVPR.2019.00263 [retrieved on Jan. 8, 2020], Jun. 15, 2019, pp. 2522-2531.
Zhang, Zhen, et al., “Deep Graphical Feature Learning for the Feature Matching Problem”, 2019 IEEE/CVF International Conference on Computer Vision (ICCV), IEEE, XP033723985, DOI: 10.1109/ICCV.2019.00519 [retrieved on Feb. 24, 2020], Oct. 27, 2019, pp. 5086-5095.
“Communication Pursuant to Article 94(3) EPC mailed on Oct. 6, 2023”, European Patent Application No. 19851373.1, (6 pages).
“Final Office Action dated Oct. 16, 2023”, U.S. Appl. No. 17/098,043, (7 pages).
“Final Office Action dated Dec. 1, 2023”, U.S. Appl. No. 17/357,795, (18 pages).
“First Office Action dated Nov. 2, 2023 with English translation”, Chinese Patent Application No. 201980090867.1, (16 pages).
“First Office Action dated Sep. 29, 2023 with English translation”, Japanese Patent Application No. 2023-10887, (5 pages).
“Non Final Office Action dated Nov. 22, 2023”, U.S. Appl. No. 17/268,376, (8 pages).
“Non Final Office Action dated Nov. 3, 2023”, U.S. Appl. No. 17/416,248, (17 pages).
“Non Final Office Action dated Oct. 11, 2023”, U.S. Appl. No. 17/357,795, (14 pages).
“Non Final Office Action dated Oct. 24, 2023”, U.S. Appl. No. 17/259,020, (21 pages).
“Office Action dated Nov. 21, 2023 with English Translation”, Japanese Patent Application No. 2021-535716, (15 pages).
“Office Action dated Nov. 7, 2023 with English translation”, Korean Patent Application No. 10-2023-7036734, (5 pages).
“Penultimate Office Action dated Oct. 19, 2023 with English translation”, Japanese Patent Application No. 2021-509779, (5 pages).
“Second Office Action dated Sep. 25, 2023 with English translation”, Japanese Patent Application No. 2020-567853, (8 pages).
Related Publications (1)
Number Date Country
20230131587 A1 Apr 2023 US
Continuations (2)
Number Date Country
Parent 16748193 Jan 2020 US
Child 18145416 US
Parent 15659732 Jul 2017 US
Child 16748193 US