The present invention relates to imaging devices, and more particularly to a lens-less imaging device.
Conventional optical imaging systems use a lens to form an image of a target object on a sensor array. An optical phased array receiver may be used as a coherent imaging device by using a beam-forming technique. Incoherent imaging techniques based on signal correlation have also been used in astronomy.
A lens-less imaging device, in accordance with one embodiment of the present invention, includes, in part, a multitude of pixels each having a light detector and an associated optical element adapted to cause the pixel to be responsive to a different direction of light received from a target. Each pixel has a field of view that overlaps with a field of view of at least a subset of the remaining pixels.
In one embodiment, each optical element is a transparent dielectric element having a different angle relative to a reference angle. In one embodiment, each optical element is a transparent MEMS component having a different angle relative to a reference angle. In one embodiment, each optical element is a transparent microlens having a different angle relative to a reference angle. In one embodiment, each optical element has one or more metallic walls having a different angle relative to a reference angle.
In one embodiment, each of a first subset of the pixels has a Gaussian distribution response. In one embodiment, each of a first subset of the pixels has a non-Gaussian distribution response. In one embodiment, the multitude of optical elements form a continuous mapping layer. In one embodiment, the pixels form a one-dimensional array. In one embodiment, the pixels form a two-dimensional array. In one embodiment, the pixels form a three-dimensional array. In one embodiment, the lens-less imaging device forms an image of a target in accordance with the optical transfer functions of the pixels as well as the responses of the pixels to the light received from the target.
A lens-less imaging device, in accordance with one embodiment of the present invention, includes, in part, a multitude of grating couplers each adapted to be responsive to a different direction of light received from a target.
A method of forming an image of a target, in accordance with one embodiment of the present invention, includes, in part, receiving a response from each of a multitude of pixels, and forming the image in accordance with the received responses and further in accordance with optical transfer functions of the pixels. Each pixel is responsive to the light received from a different direction from the target and each pixel has a field of view that overlaps with a field of view of one or more of the other pixels.
In one embodiment, each pixel includes a light detector and an associated optical element. In one embodiment, each optical element is a transparent dielectric element having a different angle relative to a reference angle. In one embodiment, each optical element is a transparent MEMS component having a different angle relative to a reference angle. In one embodiment, each optical element is a transparent microlens having a different angle relative to a reference angle. In one embodiment, each optical element has one or more metallic walls having a different angle relative to a reference angle.
In one embodiment, each of a first subset of the pixels has a Gaussian distribution response. In one embodiment, each of a first subset of the pixels has a non-Gaussian distribution response. In one embodiment, the multitude of optical elements form a continuous mapping layer. In one embodiment, the pixels form a one-dimensional array. In one embodiment, the pixels form a two-dimensional array. In one embodiment, the pixels form a three-dimensional array.
A method of forming an image of a target, in accordance with one embodiment of the present invention, includes, in part, receiving a response from each of a multitude of grating couplers, and forming the image in accordance with the received responses and further in accordance with optical transfer functions of the multitude of grating couplers. Each grating coupler is responsive to a different direction of light received from the target. Each grating coupler has a field of view that overlaps with a field of view of one or more of the other grating couplers.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
A lens-less imaging device, in accordance with one embodiment of the present invention, includes a directionally sensitive receiver array formed by a multitude of receiver elements that have different responses to different directions from which light from a target is received.
A lens-less imaging device, in accordance with embodiments of the present invention, provides a number of advantages. A lens-less imaging device, in accordance with embodiments of the present invention, may be as small as a few wavelengths in thickness; it is therefore thin and compact. Unlike conventional cameras that require a bulky lens to collect light efficiently, a lens-less imaging device, in accordance with embodiments of the present invention, does not require any external optical components, such as a lens.
In an incoherent imaging system, the phasor amplitudes received from different points on a target are uncorrelated since the illumination is spatially incoherent. Equation (1) below shows the relationship between the light intensity of an image of a target and the light intensity emanating from the target along the x-direction for a one-dimensional array of pixels:
$$I_{IM}(f_x) = |H(f_x)|^2\, I_{TAR}(f_x) = \mathcal{H}(f_x)\, I_{TAR}(f_x) \tag{1}$$
In equation (1), $I_{IM}(f_x)$ represents the Fourier transform of the image light intensity along the x-direction, $\mathcal{H}(f_x)$ is the optical transfer function providing a mapping between the target and its image, $|H(f_x)|^2$ represents the square of the absolute value of $H(f_x)$, and $I_{TAR}(f_x)$ represents the Fourier transform of the light intensity emanating from the target along the x-direction.
In a two-dimensional space, the optical transfer function may be represented by the following expression:

$$\mathcal{H}(f_x, f_y) = \frac{\mathrm{FT}\{\, |h(u,v)|^2 \,\}}{\mathrm{FT}\{\, |h(u,v)|^2 \,\}\big|_{f_x = f_y = 0}} \tag{2}$$
In equation (2), $h(u, v)$ represents the point-spread function associated with the imaging system, FT represents the Fourier transform operation, and $(u, v)$ are variables representing positions in the x-y plane.
Using Parseval's identity, it is seen that:

$$\mathcal{H}(f_x, f_y) = \frac{\displaystyle\iint H\!\left(p + \tfrac{f_x}{2},\, q + \tfrac{f_y}{2}\right) H^{*}\!\left(p - \tfrac{f_x}{2},\, q - \tfrac{f_y}{2}\right) dp\, dq}{\displaystyle\iint |H(p, q)|^2\, dp\, dq} \tag{3}$$
In equation (3), p and q are variables used in the integration. Accordingly, the optical transfer function $\mathcal{H}(f_x, f_y)$ is the normalized autocorrelation function of the amplitude transfer function (ATF). Geometrically, the optical transfer function is the overlap area of the ATF with a spatially shifted version of the ATF, divided by the total area of the ATF.
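By way of illustration only, the following Python sketch (not part of the original disclosure) computes a one-dimensional OTF as the normalized autocorrelation of a hypothetical amplitude transfer function; the ideal low-pass aperture used here is an assumption chosen for simplicity:

```python
import numpy as np

# Minimal sketch: a 1-D OTF as the normalized autocorrelation of an
# assumed amplitude transfer function (ATF), here an ideal aperture.
n = 512
f = np.linspace(-1.0, 1.0, n)            # normalized spatial frequency
atf = (np.abs(f) <= 0.5).astype(float)   # ideal aperture passing |f| <= 0.5

# Autocorrelation per equation (3), computed via the FFT; zero-padding
# to length 2n makes the circular correlation equal the linear one.
auto = np.fft.ifft(np.abs(np.fft.fft(atf, 2 * n)) ** 2).real
auto = np.fft.fftshift(auto)

otf = auto / auto.max()                  # normalize so that OTF(0) = 1
# For this ideal aperture the resulting OTF is the expected triangle.
```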
In accordance with one embodiment of the present invention, because each pixel 10 is sensitive to the light received from a different direction, each pixel captures the incident light differently. The intensity of the light arriving at the angle θ near the center of pixel $10_i$, namely $I_{im_i}(\theta)$, may be represented as:

$$I_{im_i}(\theta) = H_i(\theta) * I_{tar}(\theta) \tag{4}$$
In equation (4), $H_i(\theta)$ represents the optical transfer function of pixel $10_i$, $I_{tar}(\theta)$ represents the intensity of the light received from the target, and $*$ denotes the convolution operation.
Since the imaging device has N pixels, the intensity of the light received by the N pixels may be represented by vector $\vec{I}_{im}(\theta)$, defined as shown below:

$$\vec{I}_{im}(\theta) = \vec{H}(\theta) * \vec{I}_{tar}(\theta) \tag{5}$$
In equation (5), each entry in vector $\vec{I}_{im}(\theta)$ represents the average light intensity received along the direction θ, $\vec{H}(\theta)$ represents the vector of the pixels' optical transfer functions, and $\vec{I}_{tar}(\theta)$ represents the average light intensity emanating from the target along the direction θ.
Each vector in equation (5) may be represented by a matrix. Assuming M represents the number of discrete values used in equation (5), vector $\vec{I}_{im}$ may be represented by an N×1 matrix Y, vector $\vec{H}(\theta)$ may be represented by an N×M matrix H, and vector $\vec{I}_{tar}$ may be represented by an M×1 matrix I:

$$Y = \begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_N \end{bmatrix}, \qquad H = \begin{bmatrix} H_{11} & H_{12} & \cdots & H_{1M} \\ \vdots & & \ddots & \vdots \\ H_{N1} & H_{N2} & \cdots & H_{NM} \end{bmatrix}, \qquad I = \begin{bmatrix} I_1 \\ I_2 \\ \vdots \\ I_M \end{bmatrix}$$
Accordingly, equation (5) may be shown in matrix form as:

$$Y = H \cdot I \tag{6}$$
Referring to equation (6), because the matrix Y is obtained by the imaging device, in accordance with the present invention, and the transfer functions of the pixels represented by matrix H are also known, as described further below, the image of the target represented by matrix I can be computed.
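By way of illustration only, the following Python sketch (not part of the original disclosure) simulates this computation: it forms a hypothetical transfer-function matrix H, generates readings Y, and recovers I by least squares; all sizes and values are illustrative assumptions:

```python
import numpy as np

# Minimal sketch: recover the target image I from pixel readings Y
# given known per-pixel transfer functions H, per equation (6).
rng = np.random.default_rng(0)

N, M = 64, 32                     # N pixels, M discretized target angles
H = rng.random((N, M))            # hypothetical N x M transfer-function matrix
I_true = rng.random(M)            # hypothetical target intensities (M x 1)
Y = H @ I_true                    # simulated pixel readings (N x 1)

# Least-squares inversion; the pseudoinverse handles N != M and noise.
I_est, *_ = np.linalg.lstsq(H, Y, rcond=None)

print(np.allclose(I_est, I_true, atol=1e-8))  # True for noiseless data
```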
In one embodiment, by changing the spacing(s) between the openings of a grating coupler, the sensitivity and responsiveness of the grating coupler to the angle of the incident light may be varied. Such angular/directional sensitivity/responsiveness is also referred to herein as angular view.
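By way of illustration only, the following Python sketch (not part of the original disclosure) evaluates the standard first-order grating equation, sin θ = n_eff − λ/Λ, to show how the peak coupling angle shifts as the opening spacing Λ is varied; the effective index and wavelength below are assumed values, not taken from the disclosure:

```python
import numpy as np

# Minimal sketch: peak coupling angle of a grating coupler versus the
# spacing (pitch) of its openings, using the first-order grating
# equation sin(theta) = n_eff - wavelength / pitch (coupling into air).
wavelength = 1.55e-6                      # meters, assumed
n_eff = 2.8                               # assumed effective index

for pitch_nm in (560, 580, 600, 620):
    pitch = pitch_nm * 1e-9
    s = n_eff - wavelength / pitch        # sin(theta)
    if abs(s) <= 1:                       # angle exists only if |s| <= 1
        theta = np.degrees(np.arcsin(s))
        print(f"pitch {pitch_nm} nm -> peak angle {theta:.1f} deg")
```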
In accordance with another embodiment of the present invention, in addition to having different angular views, different pixels or different subsets of the pixels of a lens-less imaging device may have different responses to different wavelengths of light. Such additional sensitivity to the light's wavelength increases the information captured by the imaging device, thus enabling it to form a higher-resolution image of the target, as reflected in the higher rank of the matrix H in equation (6). In yet other embodiments, by forming pixels that have different responses to different wavelengths, an imaging device, in accordance with embodiments of the present invention, is adapted to form a different image of the target for each different wavelength.
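By way of illustration only, the following Python sketch (not part of the original disclosure) shows how stacking measurements taken at several wavelengths enlarges the measurement matrix of equation (6) and can raise its rank; the array size and number of wavelengths are assumptions:

```python
import numpy as np

# Minimal sketch: one transfer-function matrix per wavelength, stacked
# vertically; more independent rows permit a higher-rank system.
rng = np.random.default_rng(1)
M = 48                                    # discretized target directions
H_per_wavelength = [rng.random((16, M))   # hypothetical 16-pixel array,
                    for _ in range(3)]    # one H per wavelength

H_stacked = np.vstack(H_per_wavelength)   # shape (48, M)
print(np.linalg.matrix_rank(H_per_wavelength[0]))  # at most 16
print(np.linalg.matrix_rank(H_stacked))            # up to 48
```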
In some embodiments, the light captured by each pixel may have a Gaussian distribution. In some embodiments, the difference between the angular views of each pair of adjacent pixels may be the same. For example, in such embodiments, the intensity of the light captured by a pixel whose angular view is centered at $\theta_c$ may be represented as:

$$I(\theta) \propto \exp\!\left(-\frac{(\theta - \theta_c)^2}{2\sigma^2}\right)$$

where $\sigma$ represents the beam width and $(\theta - \theta_c)$ represents the degree of deviation from the angular view of that pixel.
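By way of illustration only, the following Python sketch (not part of the original disclosure) constructs the rows of the matrix H for pixels having Gaussian angular responses with uniformly spaced angular views; the angles and beam width are illustrative assumptions:

```python
import numpy as np

# Minimal sketch: each row of H is one pixel's Gaussian response versus
# target direction, centered on that pixel's angular view theta_c.
thetas = np.linspace(-30, 30, 121)        # target directions, degrees
centers = np.linspace(-20, 20, 9)         # pixel angular views, degrees
sigma = 8.0                               # assumed beam width, degrees

H = np.exp(-((thetas[None, :] - centers[:, None]) ** 2) / (2 * sigma**2))
# Adjacent rows overlap, matching the overlapping fields of view of
# adjacent pixels described above.
```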
In yet other embodiments, the difference between the angular views of different pairs of adjacent pixels may be randomly selected and may thus vary. Therefore, in such embodiments, the difference $(\theta_i - \theta_{i+1})$ may be different for different values of i.
In some embodiments of the present invention, the light captured by the pixels does not have a Gaussian distribution pattern.
In accordance with one embodiment of the present invention, a lens-free imaging device includes a mapping layer formed over an array of detectors.
The above embodiments of the present invention are illustrative and not limitative. The embodiments of the present invention are not limited by the number of receiving elements or pixels in the array or the number of array dimensions. The above embodiments of the present invention are not limited by the wavelength or frequency of the light. Other modifications and variations will be apparent to those skilled in the art and are intended to fall within the scope of the appended claims.
The present application claims benefit under 35 USC 119(e) of Application Ser. No. 62/535,375 filed Jul. 21, 2017, the content of which is incorporated herein by reference in its entirety.
Prior Publication Data

Number | Date | Country
---|---|---
20190028623 A1 | Jan 2019 | US
Related U.S. Application Data

Number | Date | Country
---|---|---
62535375 | Jul 2017 | US