1. Technical Field
The present disclosure relates to imaging systems and, particularly, to a computational imaging system.
2. Description of Related Art
Generally, an image of an object captured by a conventional imaging system is in focus only over a limited range of object distances, known as the depth of field (DOF). Therefore, it is difficult to sharply capture scenes that span large distances. To obtain an extended DOF, one approach deliberately blurs an intermediate image captured by an imaging system, by placing a coded aperture in the aperture of the imaging system, and then digitally removes the blur using reconstruction algorithms. The coded aperture is patterned according to a modulation transfer function (e.g., a delta function). As such, the reconstruction algorithms can effectively deconvolute the modulation transfer function and restore the image to a more recognizable likeness of the object, with a greater DOF than would otherwise be obtainable. This is known as coded aperture imaging and is one kind of computational imaging. See Zand, J., "Coded Aperture Imaging in High Energy Astronomy", NASA Laboratory for High Energy Astrophysics (LHEA) at NASA's GSFC (1996); Levin, A., Fergus, R., Durand, F., Freeman, B., "Image and Depth from a Conventional Camera with a Coded Aperture", ACM Transactions on Graphics (Proc. SIGGRAPH) (2007); Veeraraghavan, A., Raskar, R., Agrawal, A., Mohan, A., Tumblin, J., "Dappled Photography: Mask Enhanced Cameras for Heterodyned Light Fields and Coded Aperture Refocusing", ACM Transactions on Graphics (Proc. SIGGRAPH) (2007); and Liang, C. K., Lin, T. H., Wong, B. Y., Liu, C., Chen, H. H., "Programmable Aperture Photography: Multiplexed Light Field Acquisition", ACM Transactions on Graphics (Proc. SIGGRAPH), Vol. 27, No. 3, Article No. 55 (2008). However, to blur the intermediate image, the coded aperture (e.g., the pattern formed on it) also blocks a large portion of the light rays incident on the aperture, resulting in a large amount of light loss.
Therefore, it is desirable to provide a computational imaging system, which can overcome the abovementioned shortcomings.
Many aspects of the present computational imaging system can be better understood with reference to the following drawings. The components in the drawings are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of the present computational imaging system. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
Embodiments of the present computational imaging system will now be described in detail with reference to the drawings.
Referring to
The lens 10 and the image sensor 20 constitute an imaging sub-system. The LC element 30, placed in the light path of the imaging sub-system, functions as the aperture of the imaging sub-system.
The LC element 30 is a transmissive LC panel that has a periodically patterned electrode 32. The electrode 32 is patterned according to a periodic modulation transfer function (i.e., a spatial function):
H(x, y) = cos 2π(s_x·x + s_y·y), (1)
where the origin of the oxy coordinate system is the center of the LC element 30, the x axis extends along the widthwise direction of the LC element 30, the y axis extends along the lengthwise direction of the LC element 30, s_x is the spatial frequency of the electrode 32 along the x axis, and s_y is the spatial frequency of the electrode 32 along the y axis. Assuming that: (i) the refractive index of the LC element 30 outside the electrode 32 is n_0; and (ii) the refractive index of the LC element 30 at the electrode 32 is n = n_0 + Δn, where Δn is the refractive index variation caused by applying a voltage to the electrode 32, the refractive index of the entire LC element 30 can be expressed as a refractive index function:
n(x, y) = n_0 + Δn·cos 2π(s_x·x + s_y·y). (2)
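Equations (1) and (2) can be illustrated numerically. The following Python sketch evaluates the periodic modulation H(x, y) and the resulting refractive-index profile over a grid centered on the LC element 30; the grid extent, the spatial frequencies s_x and s_y, the base index n_0, and the variation Δn are hypothetical example values, not parameters of the claimed system.

```python
import numpy as np

# Minimal numerical sketch of equations (1) and (2). All numeric values
# below (grid extent, s_x, s_y, n0, dn) are illustrative assumptions.

def modulation(x, y, s_x, s_y):
    """Equation (1): H(x, y) = cos 2*pi*(s_x*x + s_y*y)."""
    return np.cos(2 * np.pi * (s_x * x + s_y * y))

def refractive_index(x, y, n0, dn, s_x, s_y):
    """Equation (2): n(x, y) = n0 + dn * cos 2*pi*(s_x*x + s_y*y)."""
    return n0 + dn * modulation(x, y, s_x, s_y)

# oxy coordinate grid centered on the LC element 30 (arbitrary length units).
x = np.linspace(-1.0, 1.0, 257)    # widthwise (x) axis
y = np.linspace(-1.0, 1.0, 257)    # lengthwise (y) axis
X, Y = np.meshgrid(x, y)

n = refractive_index(X, Y, n0=1.5, dn=0.05, s_x=4.0, s_y=2.0)
# The index oscillates across the panel between n0 - dn and n0 + dn.
```

The periodic index profile is what imposes the cosine modulation H(x, y) on light passing through the element.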
Also referring to
The digital focus processor 40 includes a Fourier transforming device 42, a deconvolution device 44, an inverse Fourier transforming device 46, and a refocusing device 48.
The Fourier transforming device 42 is configured for transforming a spatial-domain amplitude function U_I(x,y) of an intermediate image captured by the image sensor 20 into a frequency-domain function U_f(f_x, f_y), where f_x and f_y are the x- and y-axis variables in the frequency domain, respectively. According to Fourier optics, it can be determined that:
where j is the imaginary unit, λ is the wavelength of the light rays captured by the image sensor 20, and f(x,y) is a focal length function giving, for each point (e.g., pixel) (x,y) of the image sensor 20, the focal length that brings the corresponding point (x,y) into focus.
In addition, the Fourier transforming device 42 is also used for transforming the spatial function H(x,y) of the electrode 32 into the corresponding frequency-domain function H_f(f_x, f_y).
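The transform H_f(f_x, f_y) of the cosine pattern of equation (1) has a particularly simple form, which the following sketch demonstrates; the sampling grid size and the integer cycle counts standing in for s_x and s_y are illustrative assumptions. A pure cosine transforms into a single pair of conjugate peaks at (f_x, f_y) = ±(s_x, s_y), each carrying half the energy, which is what makes this mask straightforward to deconvolute.

```python
import numpy as np

# Sketch of what the Fourier transforming device 42 produces for the
# electrode's spatial function H(x, y). N and the cycle counts are
# hypothetical example values.

N = 128                              # samples per axis (hypothetical)
u = np.arange(N) / N                 # unit-length sampling positions
X, Y = np.meshgrid(u, u)
s_x, s_y = 8, 4                      # whole cycles across the panel

H = np.cos(2 * np.pi * (s_x * X + s_y * Y))   # equation (1), sampled
H_f = np.fft.fft2(H) / (N * N)                # normalized 2-D DFT

# Energy concentrates at (f_x, f_y) = +/-(s_x, s_y); all other bins are
# numerically negligible. This bin holds one of the two conjugate peaks.
peak = np.abs(H_f[s_y, s_x])
```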
According to Fourier optics, the intermediate image function U_I(x,y) is the convolution of a function U_S(x,y) with the function H(x,y), that is,

U_I(x,y) = U_S(x,y) ∗ H(x,y), (4)

where ∗ denotes convolution and the function U_S(x,y) is the spatial-domain amplitude function of a real (final) image of the objects. As such, to obtain the real image of the objects, the function U_f(f_x, f_y) must be deconvoluted using the function H_f(f_x, f_y). This is accomplished by the deconvolution device 44. By the convolution theorem, it can be determined that:
U_f(f_x, f_y) = F(U_S(x,y))·H_f(f_x, f_y), (5)
where F(U_S(x,y)) is the Fourier transform of the function U_S(x,y). As such, deconvolution of the function U_f(f_x, f_y) can be expressed as:

F(U_S(x,y)) = U_f(f_x, f_y)/H_f(f_x, f_y). (6)
As such, the blur caused by the electrode 32 is digitally removed.
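The blur-and-restore chain of equations (4) through (6) can be sketched end to end. In the snippet below, a random array stands in for the real image U_S(x,y), a small box kernel stands in for the electrode's blur response, and a regularized spectral inverse is used in place of the plain division of equation (6) to guard against near-zero bins of H_f; all of these choices, including the regularization term eps, are illustrative assumptions rather than the patent's actual hardware or algorithm values.

```python
import numpy as np

# Synthetic "real image" amplitude U_S(x, y); values are arbitrary.
rng = np.random.default_rng(0)
N = 64
U_s = rng.random((N, N))

# A normalized 3x3 box kernel plays the role of the electrode's blur
# response H(x, y) (an illustrative stand-in, not the cosine mask itself).
H = np.zeros((N, N))
H[:3, :3] = 1.0 / 9.0

H_f = np.fft.fft2(H)                      # frequency response H_f(f_x, f_y)
U_f = np.fft.fft2(U_s) * H_f              # equation (5): blurred spectrum
U_i = np.real(np.fft.ifft2(U_f))          # equation (4): intermediate image

# Equation (6), regularized: divide the spectra to undo the blur. The small
# eps keeps the division stable where |H_f| is tiny (as the deconvolution
# device 44 would need to in the presence of noise).
eps = 1e-12
F_Us = np.fft.fft2(U_i) * np.conj(H_f) / (np.abs(H_f) ** 2 + eps)

# Inverse transform, as performed by the inverse Fourier transforming
# device 46, recovers the real image.
U_restored = np.real(np.fft.ifft2(F_Us))
```

With a noise-free intermediate image, U_restored matches U_s to numerical precision; in practice the regularization trades a small bias for noise robustness.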
The inverse Fourier transforming device 46 is configured for inversely transforming the frequency-domain function F(U_S(x,y)) into the spatial-domain amplitude function U_S(x,y) to restore the real image of the objects.
According to the above, it can be determined that the resulting function U_S(x,y) is a function of three variables: x, y, and f(x,y). Therefore, for each point (x,y) of the real image, the unique in-focus focal length f(x,y) can be determined. The refocusing device 48 is configured to determine the unique in-focus focal length for each point (x,y) of the real image, to bring all points of the real image into focus. As such, an all-in-focus real image of the objects can be obtained.
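The patent does not spell out how the refocusing device selects f(x,y) per point. One common approach, sketched below with synthetic data, is to take a stack of candidate reconstructions (one per trial focal length) and keep, for each pixel, the candidate whose local contrast is highest; the discrete Laplacian sharpness measure and the toy two-slice stack are illustrative assumptions, not the patent's method.

```python
import numpy as np

def laplacian_sharpness(img):
    """Absolute discrete Laplacian: large where the image is locally sharp."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return np.abs(lap)

def all_in_focus(stack):
    """stack: (n_focal_lengths, H, W). Returns (fused image, per-pixel index
    of the sharpest focal setting)."""
    sharpness = np.stack([laplacian_sharpness(s) for s in stack])
    best = np.argmax(sharpness, axis=0)            # per-pixel focal index
    fused = np.take_along_axis(stack, best[None], axis=0)[0]
    return fused, best

# Toy stack: slice 0 is sharp on the left half, slice 1 on the right half.
Hh, Ww = 32, 32
sharp = (np.indices((Hh, Ww)).sum(0) % 2).astype(float)  # fine detail
flat = np.full((Hh, Ww), 0.5)                            # detail averaged away
stack = np.stack([np.where(np.arange(Ww) < 16, sharp, flat),
                  np.where(np.arange(Ww) < 16, flat, sharp)])
fused, best = all_in_focus(stack)
# Away from the half boundaries, the fused result picks the sharp slice on
# each side, yielding an all-in-focus composite.
```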
By employing the LC element 30, the transmittance of the electrode 32 can be controlled by adjusting the voltage applied thereto. As such, the amount of light loss can be controlled and minimized. Typically, to reduce light loss, the transmittance of the electrode 32 is kept greater than about 50%.
It will be understood that the above particular embodiments and methods are shown and described by way of illustration only. The principles and features of the present disclosure may be employed in various and numerous embodiments without departing from the scope of the disclosure as claimed. The above-described embodiments illustrate but do not restrict the scope of the disclosure.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---|
2009 1 0304765 | Jul 2009 | CN | national |
U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---|
6737652 | Lanza et al. | May 2004 | B2 |
7593161 | George et al. | Sep 2009 | B2 |
7646549 | Zalevsky et al. | Jan 2010 | B2 |
7792423 | Raskar et al. | Sep 2010 | B2 |
7830561 | Zomet et al. | Nov 2010 | B2 |
7888626 | Slinger et al. | Feb 2011 | B2 |
7923677 | Slinger | Apr 2011 | B2 |
7965936 | Raskar et al. | Jun 2011 | B2 |
20020075990 | Lanza et al. | Jun 2002 | A1 |
20050030625 | Cattin-Liebl | Feb 2005 | A1 |
20060157640 | Perlman et al. | Jul 2006 | A1 |
20080088841 | Brady | Apr 2008 | A1 |
20080124070 | Liang et al. | May 2008 | A1 |
20090022410 | Haskell | Jan 2009 | A1 |
20090128682 | He et al. | May 2009 | A1 |
20090244300 | Levin et al. | Oct 2009 | A1 |
20100003024 | Agrawal et al. | Jan 2010 | A1 |
20100110179 | Zalevsky et al. | May 2010 | A1 |
20100201865 | Han et al. | Aug 2010 | A1 |
20110007306 | Jak et al. | Jan 2011 | A1 |
20110085051 | Chi et al. | Apr 2011 | A1 |
20110085074 | Sonoda et al. | Apr 2011 | A1 |
20110157393 | Zomet et al. | Jun 2011 | A1 |
20110249028 | Chang et al. | Oct 2011 | A1 |
20110267507 | Kane et al. | Nov 2011 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---|
2011166255 | Aug 2011 | JP |
WO 02056055 | Jul 2002 | WO |
Prior Publication Data

Number | Date | Country
---|---|---
20110019068 A1 | Jan 2011 | US