The subject invention relates to an optical device for use in an augmented reality system of the type that includes a planar waveguide for transmitting a view of the world while also generating and delivering computer-generated images to the user. In the subject invention, one or more additional waveguides are provided to capture images from the world.
There is considerable interest in developing systems that create an augmented reality for a user. In one example of such a system, the user would be provided with a head mounted device that includes a window for viewing the outside world. The window would have the capability to generate image information and project that image information into the eyes of the user. In such a system, images of an imaginary object could be generated and added to the real world scene.
A description of a device for creating an augmented reality can be found in U.S. Patent Publication No. 2015/0016777, published Jan. 15, 2015, the disclosure of which is incorporated herein by reference.
The control subsystem 106 includes one or more light sources and drive electronics that generate image data encoded in the form of spatially and/or temporally varying light. A collimation element 6 may collimate the light, and the collimated light is optically coupled into one or more primary planar waveguides 1 (only one primary waveguide is illustrated).
In some implementations, a scanning light display is used to couple light into one or more primary planar waveguides. The scanning light display can comprise a single light source that forms a single beam which is scanned over time to form an image. This scanned beam may be intensity-modulated to form pixels of different brightness levels. Alternatively, multiple light sources may be used to generate multiple beams of light, which are scanned either with a shared scanning element or with separate scanning elements to form imagery. These light sources may have different wavelengths (visible and/or non-visible), different geometric points of origin (X, Y, or Z), and different angles of incidence at the scanner(s), and may create light that corresponds to different portions of one or more images (flat or volumetric, moving or static).
The light may, for example, be scanned to form an image with a vibrating optical fiber, for example as discussed in U.S. patent application Ser. No. 13/915,530, International Patent Application Serial No. PCT/US2013/045267, and U.S. provisional patent application Ser. No. 61/658,355. The optical fiber may be scanned biaxially by a piezoelectric actuator. Alternatively, the optical fiber may be scanned uniaxially or triaxially. As a further alternative, one or more optical components (e.g., rotating polygonal reflector or mirror, oscillating reflector or mirror) may be employed to scan an output of the optical fiber.
In other embodiments, the image can be generated using an array of LCOS (liquid crystal on silicon) mirrors.
Many of the structures of the optical system 300 are similar to those of the optical system described above.
The optical system 300 may employ a distribution waveguide apparatus to relay light along a first axis (vertical or Y-axis) and to expand the light's effective exit pupil along that axis.
The relayed and exit-pupil-expanded light is optically coupled from the distribution waveguide apparatus into one or more primary planar waveguides 1. The primary planar waveguide 1 relays the light along a second axis, preferably orthogonal to the first (e.g., horizontal or X-axis), and expands the light's effective exit pupil along that second axis.
The optical system 300 may include one or more sources 11 of red, green, and blue light, which may be optically coupled into a proximal end of a single-mode optical fiber 9. A distal end of the optical fiber 9 may be threaded or received through a hollow tube 8 of piezoelectric material and protrudes from the tube 8 as a fixed-free flexible cantilever 7. The piezoelectric tube 8 is associated with four quadrant electrodes (not illustrated), which may, for example, be plated on the outer surface or outer periphery of the tube 8. A core electrode (not illustrated) is also located at the core, center, or inner periphery of the tube 8.
Drive electronics 12, for example electrically coupled via wires 10, drive opposing pairs of electrodes to bend the piezoelectric tube 8 in two axes independently. The protruding distal tip of the optical fiber 7 has mechanical modes of resonance, whose frequencies depend upon the diameter, length, and material properties of the fiber. Vibrating the piezoelectric tube 8 near a first mode of mechanical resonance of the fiber cantilever 7 causes the cantilever to vibrate and sweep through large deflections. By stimulating resonant vibration in two axes, the tip of the fiber cantilever 7 is scanned biaxially in an area-filling 2D scan. By modulating the intensity of light source(s) 11 in synchrony with the scan of the fiber cantilever 7, light emerging from the fiber cantilever 7 forms an image.
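As an illustration of this scan-and-modulate principle, the following sketch simulates an area-filling spiral scan with the source intensity sampled in synchrony from a target image. The resonant frequency, frame time, sample count, and stand-in image are assumed values for demonstration; none are given in the source.

```python
import numpy as np

# Assumed scan parameters; the source specifies none of these values.
F_RES = 20_000.0        # mechanical resonance of the fiber cantilever, Hz
FRAME_TIME = 1 / 60.0   # duration of one scanned image frame, s
N_SAMPLES = 100_000     # intensity-modulation samples per frame

t = np.linspace(0.0, FRAME_TIME, N_SAMPLES)

# Driving the two electrode pairs in quadrature near resonance, with a
# slowly ramped amplitude, sweeps the fiber tip through an area-filling
# spiral (the large biaxial deflections described above).
amplitude = t / FRAME_TIME                      # ramps from 0 to 1
x = amplitude * np.cos(2 * np.pi * F_RES * t)   # tip deflection, axis 1
y = amplitude * np.sin(2 * np.pi * F_RES * t)   # tip deflection, axis 2

# Modulate the source intensity in synchrony with the scan: sample a
# target image at the instantaneous tip position to get each pixel value.
image = np.random.rand(256, 256)                # stand-in target image
rows = np.clip(((y + 1) / 2 * 255).astype(int), 0, 255)
cols = np.clip(((x + 1) / 2 * 255).astype(int), 0, 255)
intensity = image[rows, cols]                   # drive signal for source(s) 11
```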
Collimator 6 collimates the light emerging from the scanning fiber cantilever 7. The collimated light may be reflected by mirrored surface 5 into a narrow distribution planar waveguide 3, which contains at least one diffractive optical element (DOE) 4. The collimated light propagates vertically along the distribution planar waveguide 3 by total internal reflection.
Light entering primary waveguide 1 propagates horizontally along that waveguide via total internal reflection.
The optical system 600 includes a waveguide apparatus 102, which as described above may comprise one or more primary planar waveguides 1 and associated DOE(s) 2 (not illustrated).
The optical system 600 can enable the use of a single primary planar waveguide 1, rather than two or more primary planar waveguides 1 (e.g., arranged in a stacked configuration along the Z-axis).
In each of the embodiments discussed above, a light source is provided for injecting image information into the waveguide, and the DOE is used to distribute the light to the wearer. As discussed below, in the subject invention a similar combination of a planar waveguide and a diffractive optical element is used to capture light entering one of the planar faces of the waveguide and to measure the captured light with a sensor. The collection waveguide can be used alone or in combination with an output waveguide.
In a preferred embodiment, the apparatus includes a waveguide having opposed planar input and output faces. A diffractive optical element (DOE) is formed across the waveguide. The DOE is configured to couple a portion of the light passing through the waveguide into the waveguide. The light coupled into the waveguide is directed via total internal reflection to an exit location on the waveguide. The apparatus further includes a light sensor having an input positioned adjacent the exit location of the waveguide to capture light exiting therefrom and generate output signals corresponding thereto. A processor determines the angle and position of the coupled light with respect to the input face of the waveguide based on the output signals. The apparatus may include a narrow band wavelength filter aligned with the waveguide.
The apparatus may further include a second waveguide having opposed planar input and output faces, where the second waveguide may be aligned with and parallel to the first waveguide. A DOE may be formed across the second waveguide. The DOE may be configured to control the level of reflectivity at locations across faces of the second waveguide. The apparatus may further include a light generator having an output positioned adjacent the second waveguide to inject light into the second waveguide. A processor may control the light being injected into the second waveguide to guide light entering the second waveguide via total internal reflection to exit the waveguide at particular locations across the output face of the second waveguide.
The apparatus may further include a third waveguide extending along an edge of the first waveguide. The third waveguide may capture light exiting the exit location of the first waveguide and deliver it to the sensor.
In accordance with the subject invention, a portion of the rays entering waveguide 702 will be trapped by the waveguide and directed via total internal reflection along the length of the waveguide to an exit location shown at 710. The light exit location 710 can be on either the front or back of the waveguide or at a side edge thereof. Light exiting the waveguide can be captured by a sensor 712. Signals generated by the sensor are coupled to a processor 714 for analysis.
Various types of sensors could be used. For example, the sensor can include a movable optical fiber, as discussed above for the output device. Similarly, an array of sensors could be provided. In addition, the sensor could include an LCOS system that selectively directs light to additional fixed-position sensors such as CMOS or CCD imagers.
The processor 714 would analyze the input signals to determine the input position and angle of the rays that were captured by the waveguide and channeled to the exit location 710. In practice, this analysis could be complicated if only a single waveguide were used. For example, the single waveguide embodiment would produce signals that combine more than one wavelength of light reaching the waveguide.
Accordingly, in one preferred embodiment, three similar waveguides would be used, each waveguide arranged to capture either red, blue or green light.
It should be noted that the light rays which enter from any particular point on the waveguide will effectively be summed with light of the same color which enters at certain other points but on the same TIR path back to the sensor. In effect, one would get a superposition of light from many sources. The processor would need to be arranged to unravel this superposition of information via digital analysis. In practice, the algorithms could include various statistical analyses in conjunction with a learning system. The specific data analysis approach is not the subject of this application.
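The superposition described above is linear: each sensor sample is a weighted sum of the light entering at every point sharing a TIR path to the exit location. As a minimal sketch of one statistical approach of the kind alluded to, the code below recovers scene values by Tikhonov-regularized least squares. The mixing matrix `A`, its dimensions, and the regularization weight are stand-ins; in a real device the waveguide response would have to be calibrated, not randomly generated.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SCENE = 64   # unknown scene samples (assumed)
N_MEAS = 48    # sensor measurements, here fewer than unknowns (assumed)

# A[i, j] = contribution of scene point j to sensor measurement i.
# Stand-in values; a real system would use a calibrated waveguide response.
A = rng.random((N_MEAS, N_SCENE))

scene = rng.random(N_SCENE)   # ground truth, for the demonstration only
measured = A @ scene          # superposed readings at the exit location

# Tikhonov-regularized least squares: argmin ||A s - m||^2 + lam * ||s||^2
lam = 1e-2
s_hat = np.linalg.solve(A.T @ A + lam * np.eye(N_SCENE), A.T @ measured)

print("relative error:", np.linalg.norm(s_hat - scene) / np.linalg.norm(scene))
```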
It should be noted that if the device is capturing an image of a flat object (e.g., a photograph), the deconvolution problem is much simpler. For a flat object, the measurement is a superposition of N displaced copies of the image, where N is the number of entry pupils on the edge element.
One way of addressing the deconvolution problem when imaging a three-dimensional scene would be to utilize a plurality of DOEs across the waveguide that are actively switchable. Using electronic controls to vary the diffractive power of the DOEs, individual entry pupils could be selectively turned on or off. Under this scheme, the processor would know in advance the entry location of the light channeled by the waveguide and measured by the sensor.
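An acquisition loop for such an actively switched system might be organized as sketched below. The `set_pupil_state` and `read_sensor` functions are hypothetical placeholders, since the source defines no control interface; the point is that with one entry pupil enabled at a time, every sensor frame is attributed to a known entry location and no deconvolution is required.

```python
import numpy as np

N_PUPILS = 16   # number of individually switchable DOE entry pupils (assumed)

def set_pupil_state(active):
    """Hypothetical driver call: enable exactly the given entry pupils."""
    pass  # would program the switchable DOE electrodes here

def read_sensor():
    """Hypothetical sensor read: one frame of exit-location samples."""
    return np.random.rand(128)  # stand-in data

def capture_by_pupil():
    frames = {}
    for p in range(N_PUPILS):
        set_pupil_state({p})       # one entry pupil on, all others off
        frames[p] = read_sensor()  # entry location of this frame is known
    return frames

frames = capture_by_pupil()
```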
It should be noted that if an active DOE system were used to image a flat image, each pupil would capture the entire undistorted image of the object.
In the basic operation of the system 900, some of the light rays from the real world will pass through both waveguides 910 and 920 and into the eye of the user. Waveguide 910 can be used to deliver additional visual images to the eye to achieve an augmented reality. Waveguide 920 can be used to capture and measure light rays from the real world. This information can be used in a variety of ways.
For example, information captured from the real world can be used to modify the images generated for the output waveguide 910. In particular, the light output from waveguide 910 can be some specific function of the input measured at the same coordinate on the waveguide 920 surface. This function could include linear scaling, nonlinear scaling, and clipped scaling, as well as any other specific function of the intensity as computed on a per pixel basis or computed locally with respect to a pixel location.
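These per-pixel mappings are straightforward to express in code. A brief sketch follows; the gain, gamma, and clipping values are illustrative only and are not taken from the source.

```python
import numpy as np

def linear_scale(measured, gain=0.5):
    """Output intensity is a linear function of the measured input."""
    return gain * measured

def nonlinear_scale(measured, gamma=2.2):
    """Output follows a nonlinear (here, gamma) curve of the input."""
    return measured ** (1.0 / gamma)

def clipped_scale(measured, gain=1.5, lo=0.0, hi=1.0):
    """Linear scaling, clipped to the displayable range."""
    return np.clip(gain * measured, lo, hi)

captured = np.random.rand(480, 640)   # per-pixel input from waveguide 920
output = clipped_scale(captured)      # per-pixel drive for waveguide 910
```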
The image information captured by waveguide 920 can also be used to help register virtual images generated by waveguide 910. In many currently envisioned augmented reality implementations, virtual objects are located in three dimensions and complex projections are made to compute their location on the waveguide 910. Information collected by waveguide 920 permits objects to be specifically registered with respect to the two dimensional image, thus guaranteeing their correct location relative to other landmarks in the image (i.e. no “jitter”).
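One way such registration could be computed, sketched below, is phase correlation between a reference frame and the current frame captured by waveguide 920; the estimated shift of real-world landmarks is then used to anchor virtual content in the 2D image. This particular algorithm is illustrative only; the source does not specify a registration method.

```python
import numpy as np

def estimate_shift(reference, current):
    """Estimate the (row, col) translation taking `reference` to `current`
    by phase correlation (normalized cross-power spectrum)."""
    R = np.fft.fft2(reference)
    C = np.fft.fft2(current)
    cross = C * np.conj(R)
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices past the midpoint wrap around to negative shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

ref = np.random.rand(128, 128)             # earlier frame from waveguide 920
cur = np.roll(ref, (3, -5), axis=(0, 1))   # current frame: shifted scene
print(estimate_shift(ref, cur))            # -> (3, -5); anchors virtual content
```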
In another example, the image information collected by waveguide 920 can be used to recognize a specific image element in the real world and then compute a substitute element for presentation to the user via waveguide 910. For example, the system could recognize a person's face and then render a modified image (e.g., face with a beard) to the user. In another example, the color of a building could be changed. In still another example, signage in the real world written in one language could be rendered in a different language. This concept extends not only to “filters” which use image recognition or modeling, but also to simpler filters such as blur, or combinations of the above.
In an alternative embodiment, a controllable darkening layer (e.g., LCD, not shown) can be provided between the waveguides 910 and 920 for blocking the light from the real world from reaching the user's eye. Instead, this incoming light can be “substituted” with light generated by waveguide 910.
Algorithms for decoding the light captured by waveguide 920 and converted into a time sequential signal are context dependent.
For example, in a system wherein the sensor is a scanning fiber, the signals can be converted into a two dimensional rectilinear array of the kind expected by many image processing algorithms. In some cases, it may be easier to process the pixels in their natural polar form and output them in the same polar form.
In one implementation, incoming polar coordinate pixel (IPCP) values can be assigned directly to the rectangular coordinate pixels (RCPs) found by direct computation. In another implementation, each RCP can be assigned the value of the nearest RCP for which a corresponding IPCP can be found. In another approach, RCP values can be interpolated from the nearest RCPs for which corresponding IPCPs can be found; this includes linear, quadratic, or any other such interpolation. Finally, it is possible to precompute and store the relationship between the RCPs and the associated IPCP projections.
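Two of these strategies, precomputing and storing the polar-to-rectangular correspondence and then applying it as a nearest-neighbor assignment, are sketched below. The uniform r-theta sampling grid and the image dimensions are assumptions; the actual scan geometry is device dependent.

```python
import numpy as np

N_R, N_THETA = 64, 256   # polar sampling of the scanned sensor (assumed)
H, W = 128, 128          # desired rectilinear output size (assumed)

def precompute_polar_lut(h, w, n_r, n_theta):
    """For each rectangular coordinate pixel (RCP), the index of the
    nearest incoming polar coordinate pixel (IPCP)."""
    ys, xs = np.mgrid[0:h, 0:w]
    # Rectangular coordinates relative to the image center, in [-1, 1].
    x = (xs - w / 2) / (w / 2)
    y = (ys - h / 2) / (h / 2)
    r = np.sqrt(x**2 + y**2)                # radius; corners exceed 1, clipped
    theta = np.arctan2(y, x) % (2 * np.pi)  # angle in [0, 2*pi)
    r_idx = np.clip((r * (n_r - 1)).astype(int), 0, n_r - 1)
    t_idx = np.clip((theta / (2 * np.pi) * n_theta).astype(int), 0, n_theta - 1)
    return r_idx, t_idx

# Precompute once (the stored-relationship variant), then reuse per frame.
R_IDX, T_IDX = precompute_polar_lut(H, W, N_R, N_THETA)

polar_frame = np.random.rand(N_R, N_THETA)   # stand-in IPCP values
rect_frame = polar_frame[R_IDX, T_IDX]       # nearest-neighbor RCP image
```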
While the subject invention has been described with reference to some preferred embodiments, various changes and modifications could be made therein by one skilled in the art without departing from the scope and spirit of the subject invention as defined by the appended claims.
This application is a continuation of U.S. application Ser. No. 17/111,372 filed Dec. 3, 2020, which is a continuation of U.S. application Ser. No. 16/047,771 filed Jul. 27, 2018, now U.S. Pat. No. 10,890,465, which is a continuation of U.S. application Ser. No. 15/881,345 filed Jan. 26, 2018, now U.S. Pat. No. 10,060,766, which is a continuation of U.S. application Ser. No. 15/824,777 filed Nov. 28, 2017, now U.S. Pat. No. 10,378,930, which is a continuation of U.S. application Ser. No. 15/159,518 filed May 19, 2016, now U.S. Pat. No. 9,891,077, which claims priority to U.S. Provisional Patent Application No. 62/163,733, filed on May 19, 2015, entitled “DUAL COMPOSITE LIGHT FIELD DEVICE”, the disclosures of which are hereby incorporated by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
4852988 | Velez | Aug 1989 | A |
6433760 | Vaissie | Aug 2002 | B1 |
6491391 | Blum et al. | Dec 2002 | B1 |
6721053 | Maseeh | Apr 2004 | B1 |
6847336 | Lemelson | Jan 2005 | B1 |
6943754 | Aughey | Sep 2005 | B2 |
6977776 | Volkenandt et al. | Dec 2005 | B2 |
7347551 | Fergason et al. | Mar 2008 | B2 |
7454103 | Parriaux | Nov 2008 | B2 |
7488294 | Torch | Feb 2009 | B2 |
8235529 | Raffle | Aug 2012 | B1 |
8331751 | Delaney et al. | Dec 2012 | B2 |
8465699 | Fehr et al. | Jun 2013 | B2 |
8611015 | Wheeler | Dec 2013 | B2 |
8638498 | Bohn et al. | Jan 2014 | B2 |
8696113 | Lewis | Apr 2014 | B2 |
8929589 | Publicover et al. | Jan 2015 | B2 |
9010929 | Lewis | Apr 2015 | B2 |
9274338 | Robbins et al. | Mar 2016 | B2 |
9292973 | Bar-zeev et al. | Mar 2016 | B2 |
9423397 | Duer | Aug 2016 | B2 |
9720505 | Gribetz et al. | Aug 2017 | B2 |
9891077 | Kaehler | Feb 2018 | B2 |
10013053 | Cederlund et al. | Jul 2018 | B2 |
10025379 | Drake et al. | Jul 2018 | B2 |
10060766 | Kaehler | Aug 2018 | B2 |
10378930 | Kaehler | Aug 2019 | B2 |
10423222 | Popovich et al. | Sep 2019 | B2 |
10890465 | Kaehler | Jan 2021 | B2 |
10977815 | Chao | Apr 2021 | B1 |
11454523 | Kaehler | Sep 2022 | B2 |
20030030597 | Geist | Feb 2003 | A1 |
20060023158 | Howell et al. | Feb 2006 | A1 |
20100302196 | Han et al. | Dec 2010 | A1 |
20110090389 | Saito | Apr 2011 | A1 |
20110211056 | Publicover et al. | Sep 2011 | A1 |
20110213664 | Osterhout | Sep 2011 | A1 |
20110227813 | Haddick et al. | Sep 2011 | A1 |
20120021806 | Maltz | Jan 2012 | A1 |
20130181896 | Gruhlke et al. | Jul 2013 | A1 |
20140195918 | Friedlander | Jul 2014 | A1 |
20140306866 | Miller | Oct 2014 | A1 |
20150016777 | Abovitz | Jan 2015 | A1 |
20150168731 | Robbins | Jun 2015 | A1 |
20150309264 | Abovitz et al. | Oct 2015 | A1 |
20160011112 | Tappura et al. | Jan 2016 | A1 |
20170052384 | Santori et al. | Feb 2017 | A1 |
20170255335 | Zhang | Sep 2017 | A1 |
20180080803 | Kaehler | Mar 2018 | A1 |
20190041634 | Popovich et al. | Feb 2019 | A1 |
20200018968 | Edwin et al. | Jan 2020 | A1 |
Number | Date | Country |
---|---|---|
2316473 | Jan 2001 | CA |
2362895 | Dec 2002 | CA |
2388766 | Dec 2003 | CA |
2007304227 | Nov 2007 | JP |
201185769 | Apr 2011 | JP |
2009050501 | Apr 2009 | WO |
2009050504 | Apr 2009 | WO |
Entry |
---|
Examination Report dated Dec. 20, 2022, for New Zealand Application No. 737402, two pages. |
Examination Report dated Dec. 20, 2022, for New Zealand Application No. 776281, three pages. |
Japanese Office Action dated Oct. 17, 2022, for JP Application No. 2021-187622, with English translation, 5 pages. |
Canadian Office Action dated Aug. 2, 2022, for CA Application No. 2,986,146, four pages. |
Chinese Office Action dated May 12, 2020, for CN Application No. 201680041954.4, with English translation, ten pages. |
Chinese Office Action dated Jul. 27, 2022, for CN Application No. 202110069265.7, with English translation, 14 pages. |
European Notice of Allowance dated May 20, 2021, for EP Application No. 16797324.7 filed on Dec. 13, 2017, 6 pages. |
European Search Report dated Jan. 19, 2022, for EP Application No. 21199507.1, seven pages. |
European Search Report dated May 24, 2018 for EP Application No. 16797324.7 filed on Dec. 13, 2017, 6 pages. |
Examination Report dated Jan. 20, 2021, for Indian Application No. 201747040493, six pages. |
Examination Report dated Jun. 15, 2022, for New Zealand Application No. 737402, four pages. |
Examination Report dated Jun. 15, 2022, for New Zealand Application No. 776281, four pages. |
Examination Report dated Sep. 2, 2020, for AU Application No. 2016264599, four pages. |
Final Office Action dated Jan. 9, 2019, for U.S. Appl. No. 15/824,777, filed Nov. 28, 2017, seven pages. |
Final Office Action dated Oct. 3, 2019, for U.S. Appl. No. 16/047,771, filed Jul. 27, 2018, eight pages. |
Indian Office Action dated May 24, 2022, for IN Application No. 202148033071, with English translation, 5 pages. |
International Preliminary Report On Patentability dated Nov. 30, 2017, for PCT Patent Application No. PCT/US2016/033343, Internationally filed on May 19, 2016, 6 pages. |
International Search Report and Written Opinion of the International Searching Authority dated Aug. 25, 2016, for PCT Patent Application No. PCT/US2016/033343, Internationally filed on May 19, 2016, 7 pages. |
Israeli Notice of Allowance for IL Patent Application No. 255713, dated May 10, 2022, with English translation, seven pages. |
Israeli Office Action for IL Patent Application No. 255713, dated Mar. 14, 2021, with English translation, 6 pages. |
Non-Final Office Action dated Feb. 21, 2020, for U.S. Appl. No. 16/047,771, filed Jul. 27, 2018, seven pages. |
Non-Final Office Action dated Jan. 2, 2018 for U.S. Appl. No. 15/824,777, filed Nov. 28, 2017, six pages. |
Non-Final Office Action dated Jul. 12, 2018 for U.S. Appl. No. 15/824,777, filed Nov. 28, 2017, ten pages. |
Non-Final Office Action dated Mar. 3, 2022, for U.S. Appl. No. 17/111,372, filed Dec. 3, 2020, seven pages. |
Non-Final Office Action dated Sep. 21, 2018 for U.S. Appl. No. 16/047,771, filed Jul. 27, 2018, five pages. |
Notice of Acceptance dated May 4, 2021, for AU Application No. 2016264599, three pages. |
Notice of Allowance (corrected) dated Nov. 6, 2020, for U.S. Appl. No. 16/047,771, filed Jul. 27, 2018, seven pages. |
Notice of Allowance dated Apr. 27, 2018 for U.S. Appl. No. 15/881,345, filed Jan. 26, 2018, nine pages. |
Notice of Allowance dated Dec. 10, 2021, for JP Application No. 2019-094767, with English translation, six pages. |
Notice of Allowance dated Jun. 29, 2022, for U.S. Appl. No. 17/111,372, filed Dec. 3, 2020, nine pages. |
Notice of Allowance dated Mar. 29, 2019 for U.S. Appl. No. 15/824,777, filed Nov. 28, 2017, eight pages. |
Notice of Allowance dated Sep. 3, 2020, for U.S. Appl. No. 16/047,771, filed Jul. 27, 2018, eight pages. |
Notice of Final Reasons for Rejection (JP OA) dated Jul. 15, 2021, for JP Application No. 2020-045165, with English translation, four pages. |
Notice of Final Rejection (JP OA) dated Feb. 2, 2021, for JP Application No. 2019-094767, with English translation, 8 pages. |
Notice of Reasons for Rejection (JP OA) dated Sep. 1, 2021, for JP Application No. 2019-094767, with English translation, 5 pages. |
Notice of Reasons for Rejection (JP OA) dated Dec. 16, 2019, for JP Application No. 2017-560236, with English translation, ten pages. |
Notice of Reasons for Rejection (JP OA) dated Mar. 18, 2020, for JP Application No. 2019-094767, with English translation, 18 pages. |
Notice of Reasons for Rejection (JP OA) dated Nov. 6, 2020, for JP Application No. 2020-045165, with English translation, two pages. |
Examination Report dated Oct. 11, 2022, for AU Application No. 2021215223, three pages. |
Jacob, R. “Eye Tracking in Advanced Interface Design”, Virtual Environments and Advanced Interface Design, Oxford University Press, Inc. (Jun. 1995). |
Rolland, J. et al., “High-resolution inset head-mounted display”, Optical Society of America, vol. 37, No. 19, Applied Optics, (Jul. 1, 1998). |
Tanriverdi, V. et al. (Apr. 2000). “Interacting With Eye Movements In Virtual Environments,” Department of Electrical Engineering and Computer Science, Tufts University, Medford, MA 02155, USA, Proceedings of the SIGCHI conference on Human Factors in Computing Systems, eight pages. |
Yoshida, A. et al., “Design and Applications of a High Resolution Insert Head Mounted Display”. (Jun. 1994). |
Examination Report dated Mar. 21, 2023, for New Zealand Application No. 737402, one page. |
Israeli Office Action dated Feb. 23, 2023, for IL Patent Application No. 295437, four pages. |
Japanese Notice of Allowance dated Jan. 18, 2023, for JP Application No. 2021-187622, with English translation, 6 pages. |
Office Action dated Apr. 17, 2023, for CA Application No. 2,986,146, four pages. |
Chinese Notice of Allowance dated Jun. 1, 2023, for CN Application No. 202110069265.7, with English translation, 5 pages. |
European Notice of Allowance dated Jun. 15, 2023, for EP Application No. 21199507.1, eight pages. |
Notice of Acceptance dated Jun. 15, 2023, for New Zealand Application No. 737402, two pages. |
Korean Office Action dated Sep. 26, 2023, for KR Application No. 10-2017-7036453, with English translation, 8 pages. |
Number | Date | Country |
---|---|---|
20220404178 A1 | Dec 2022 | US
Number | Date | Country |
---|---|---|
62163733 | May 2015 | US
Relation | Number | Date | Country |
---|---|---|---|
Parent | 17111372 | Dec 2020 | US
Child | 17891951 | | US
Parent | 16047771 | Jul 2018 | US
Child | 17111372 | Dec 2020 | US
Parent | 15881345 | Jan 2018 | US
Child | 16047771 | Jul 2018 | US
Parent | 15824777 | Nov 2017 | US
Child | 15881345 | Jan 2018 | US
Parent | 15159518 | May 2016 | US
Child | 15824777 | Nov 2017 | US