1. Field of the Invention
This invention relates generally to multimode imaging systems that use filter modules, such as plenoptic imaging systems.
2. Description of the Related Art
A multimode imaging system is an imaging system that can capture information beyond the usual spatial information acquired by conventional imaging systems (e.g., those based on RGB Bayer patterns). For example, a multimode imaging system might acquire additional spectral information (multispectral and hyperspectral systems) to be used for spectral analysis or identification of substances. Multimode imaging systems might also capture information about the polarization of a scene, or provide a dynamic range higher than that inherently offered by the detector array.
Acquiring this additional information can be difficult since most commercially available detector arrays spatially segment the incoming image into a two-dimensional image signal. Traditionally, the additional information of spectra, polarization or other modes was acquired by time multiplexing. For example, in spectral imaging applications, it might be desirable to acquire radiation from an object at different wavelengths of interest. The number of wavelength bands of interest may be between 5 and 20 for multispectral imagers and more than 20 for hyperspectral imagers. Traditional multispectral or hyperspectral imagers are based either on a filter wheel that contains wavelength filters corresponding to the wavelength bands of interest, or on dispersive elements such as prisms or gratings. If a filter wheel is used, only one of the wavelength filters is positioned in the imaging path at any one time. The filter wheel rotates in order to switch from one wavelength filter to the next. Thus, the multispectral or hyperspectral imaging is implemented in a time multiplexed manner. However, the resulting systems can be large and complicated. If dispersive elements are used to spatially separate different wavelengths, the light is typically dispersed along one dimension of the detector array. The other dimension is used to capture one spatial dimension of the object. However, it is difficult to also capture the second spatial dimension of the object. Sometimes, time multiplexing is introduced to capture the second spatial dimension, for example by scanning.
Recently, increased attention has been paid to acquiring multimode information about a scene simultaneously, or in a “single snapshot.” These single-snapshot systems multiplex the different mode signals onto different detector pixels in the detector array. That is, the multimode information is spatially multiplexed rather than time multiplexed.
Single-snapshot multispectral imaging architectures can generally be categorized into two classes. One class uses dispersive elements, such as prisms or gratings, to spatially separate different wavelengths, in combination with some beam splitting element, e.g., a prism or a mask. This architecture has the disadvantage that the dispersive element typically must be placed either in a collimated beam or at an intermediate image plane. As a result, many of these systems can be quite large and/or have a limited field of view.
In the other class of single-snapshot imagers, separate filters are attached to each detector in a manner similar to the RGB Bayer pattern found in conventional color imaging systems. That is, a color filter array is laid out on top of the detector array, so that individual filters (which will be referred to as micro-filters) each filter the light directed to individual detectors. The individual micro-filters are designed to implement the multimode imaging. For example, the filter array might include micro-filters with 20 different wavelength responses in order to implement multispectral imaging. One disadvantage of this class of systems is the increased cost and complexity of manufacturing. Because there is a one-to-one correspondence between micro-filters and detectors, and because the micro-filters are attached to the detectors, the micro-filters are the same small size as the detectors. The many different small micro-filters must then be arranged into an array and aligned with the underlying detectors. Another disadvantage is the lack of flexibility. Once the micro-filter array is attached to the detector array, it is difficult to change the micro-filter array.
Thus, there is a need for improved multimode imaging systems.
The present invention overcomes the limitations of the prior art by providing an adjustable multimode lightfield imaging system. A non-homogeneous filter module, for example a filter array, is positioned at or near the pupil plane of the main (or primary) image-forming optical module of the lightfield imaging system and provides the multimode capability. The filter module can be moved relative to the imaging system with the exposure conditions adjusted accordingly, thus allowing adjustment of the multimode capability.
In one embodiment, an adjustable multimode lightfield imaging system includes an optical module that forms an optical image of an object. The optical module is characterized by imaging optics, a pupil plane and an image plane. A filter array is positioned approximately at the pupil plane. The filter array is moveable relative to the optical module. An array of micro-imaging elements is positioned approximately at the image plane. A detector array is positioned such that the micro-imaging elements image the filter array onto the detector array. The detector array captures a multimode image of the object according to an exposure condition. Changing the position of the filter array changes the multimode capability of the imaging system. A controller adjusts the exposure condition in response to the position of the filter array.
The exposure condition can be adjusted in different ways. For example, the controller could adjust the strength of a light source illuminating the object, as might be the case in a microscope setting. Alternately, the controller could adjust a light throughput of the imaging system, for example by adjusting the attenuation of a neutral density filter in the system. The controller could also adjust the exposure/integration time of the detector array.
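To make these adjustment mechanisms concrete, the following is a minimal sketch, not taken from the original disclosure, of a controller that applies an exposure condition through whichever of the three mechanisms is available. The hardware hooks (set_source_strength, set_nd_attenuation, set_integration_time) and the dictionary layout of the exposure condition are assumptions for illustration only.

```python
class ExposureController:
    """Sketch of a controller that applies an exposure condition.

    The three hardware hooks are hypothetical callables supplied by the
    host system (light source driver, neutral density filter, detector).
    """

    def __init__(self, set_source_strength=None, set_nd_attenuation=None,
                 set_integration_time=None):
        self.set_source_strength = set_source_strength    # light source power
        self.set_nd_attenuation = set_nd_attenuation      # ND filter attenuation
        self.set_integration_time = set_integration_time  # detector exposure time

    def apply(self, exposure_condition):
        """Apply whichever settings the exposure condition specifies."""
        if "source_strength" in exposure_condition and self.set_source_strength:
            self.set_source_strength(exposure_condition["source_strength"])
        if "nd_attenuation" in exposure_condition and self.set_nd_attenuation:
            self.set_nd_attenuation(exposure_condition["nd_attenuation"])
        if "integration_time" in exposure_condition and self.set_integration_time:
            self.set_integration_time(exposure_condition["integration_time"])
```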
In another aspect, the exposure condition is adjusted in response to an image quality metric. Examples of such metrics include signal to noise ratio (SNR), signal energy, contrast, brightness and exposure value.
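As an illustration only (not part of the original disclosure), such metrics could be computed from a captured sub-image roughly as follows. The specific formulas, such as Michelson contrast and a mean-over-noise SNR estimate, and the simple proportional rule for scaling the integration time toward a target brightness, are assumptions.

```python
import numpy as np

def image_quality_metrics(sub_image, noise_std=1.0):
    """Compute simple image quality metrics for one filtered sub-image."""
    img = sub_image.astype(np.float64)
    brightness = float(img.mean())
    signal_energy = float(np.sum(img ** 2))
    # Michelson contrast; the small constant avoids division by zero.
    contrast = float((img.max() - img.min()) / (img.max() + img.min() + 1e-12))
    snr = brightness / noise_std  # crude SNR estimate against an assumed noise floor
    return {"brightness": brightness, "signal_energy": signal_energy,
            "contrast": contrast, "snr": snr}

def scale_integration_time(current_time, metrics, target_brightness):
    """Scale the exposure/integration time so brightness approaches a target."""
    if metrics["brightness"] <= 0:
        return current_time
    return current_time * target_brightness / metrics["brightness"]
```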
Other aspects of the invention include methods corresponding to the devices and systems described above, and applications for the foregoing.
The invention has other advantages and features which will be more readily apparent from the following detailed description of the invention and the appended claims, when taken in conjunction with the accompanying drawings, in which:
The figures depict embodiments of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
In a conventional imaging system, a detector array would be located at image plane 125 to capture the optical image 160. In a conventional color imaging system, an array of micro-filters (e.g., arranged in an R-G-B Bayer pattern) would be attached to the detector array at the image plane 125, so that each individual detector would capture red, green or blue channels of data.
However, this is not the case in
Second, an array 120 of micro-imaging elements 121 is located at the image plane 125. In
In the case of microlenses, each microlens 121 forms an image 170 of the pupil (and filter module) at the conjugate plane 135. The corresponding section of the detector array 130 may contain a single detector 131 or multiple detectors 131 (preferably, multiple detectors), and will be referred to as a “subarray” 133. Thus, in the example of
Conversely, referring to
Referring to
Each detector 131 collects the light from one filter 111 that travels through one microlens 121. The microlens array 120 is located in a conjugate plane to the object 150, so there is also an imaging relationship between the object 150 and the microlens array 120. Therefore, the light incident on a microlens is light originating from a portion of the object, not from the entire object. Thus, each detector 131 collects the light from a corresponding portion of the object (as determined by the extent of the microlens), as filtered by a corresponding filter 111.
Analogously, microlens 121B will image the entire filter array 110 (at least the illuminated portion) onto subarray 133B. Thus, filter 111w will be imaged by microlens 121B onto subpixel 132B(w), filter 111x onto subpixel 132B(x), and so on for the other filter/subpixel pairs, and for the other microlenses 121. Detectors 131 that fall entirely within subpixel 132B(w) will detect the light coming from subobject 151B that passes through filter 111w. The same is true for the other subpixels 132A(w)-132C(z).
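The filter-to-subpixel mapping described above can also be expressed computationally. The following sketch is illustrative rather than part of the original disclosure; it reorganizes a raw plenoptic capture into per-filter sub-images, and assumes square subarrays of known pitch and a calibration-derived filter map giving the filter index imaged onto each detector position within a subarray.

```python
import numpy as np

def demultiplex_lightfield(raw, pitch, filter_map):
    """Group detector values into sub-images, one per (filter, pupil position).

    raw:        2D array of detector values from the whole detector array.
    pitch:      number of detectors behind each microlens (square subarrays assumed).
    filter_map: pitch x pitch array giving the filter index imaged onto each
                detector position (u, v) within a subarray, from calibration.
    Returns {filter index: list of 2D sub-images}.
    """
    ny, nx = raw.shape[0] // pitch, raw.shape[1] // pitch
    # Reshape to (lenslet row, v, lenslet column, u), then move pupil coords last.
    lf = raw[:ny * pitch, :nx * pitch].reshape(ny, pitch, nx, pitch)
    lf = lf.transpose(0, 2, 1, 3)          # (lenslet row, lenslet column, v, u)
    sub_images = {}
    for v in range(pitch):
        for u in range(pitch):
            k = int(filter_map[v, u])
            sub_images.setdefault(k, []).append(lf[:, :, v, u])
    return sub_images
```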
The imaging system is a “lightfield” imaging system because the rays collected by a detector are a function of not only the position in the object plane (as is the case in conventional imaging systems), but also a function of the position in the pupil plane. The imaging system is a multimode imaging system because the filter module 110 captures multimode information. These systems are sometimes referred to as plenoptic imaging systems.
Referring again to
The system also includes an actuator 140 and a source controller 142. The actuator 140 can change the position of the filter module 110 relative to the optical module 105. In this way, the mapping of filters 111 onto subpixels 132 can be changed, thereby changing the effective filtering. For example, in
Each of
In
In many cases, the filter array may be positioned so that each filter covers an equal number of detectors. However, this is not required. In the position of
This architecture has several advantages over a conventional approach. A conventional approach might use two filter wheels, one containing filters F1, F2, F4, F5 when that spectral analysis is desired and another containing filters F2, F3, F5, F6 for the alternate spectral analysis. When the F1, F2, F4, F5 filter wheel is used, each of the filters F1, F2, F4, F5 rotates through the illumination path in sequence. This results in time multiplexing of the spectral information (rather than single snapshot capture of the spectral information). Furthermore, if the F2, F3, F5, F6 analysis is desired, the filter wheel must be changed, rather than simply shifting the filter module as in
As another example, the filter module in
This approach might be used in an application where spatial and spectral resolution are traded off. Detailed images of the object (i.e., high spatial resolution) may be formed when zoomed in, but with lower spectral resolution or less spectral information. When zoomed out, increased spectral resolution may be achieved at the expense of spatial resolution.
Other types of movement can also be used to change which filters are illuminated.
As a final example, consider a multispectral imaging task that involves detecting different types of spectral signals. For example, classifying plant matter as either dry or fresh requires different spectral signatures for different crop types such as rice, corn, and wheat. Preferably, a single multispectral imaging system could be adjusted to address the different crop types, rather than requiring a separate filter wheel for each crop type.
Filters f1-f4 and c5,c6 are used to identify corn, and to distinguish between fresh and dry corn leaf. Filters f1-f4 and r5,r6 are used to identify rice, and to distinguish between fresh and dry rice leaf. Note that filters f1-f4 are shared filters that are common to both corn and rice. The filter specifications in Tables 1-3 were chosen from a dictionary of commercially available filters.
When detecting corn, the filter array is positioned as shown in
The use of a non-homogeneous filter module allows the capture of a multimode image in a single snapshot. That is, images filtered by filters F1, F2, etc. can be acquired simultaneously. However, different modes may benefit from different illuminations. For example, if F1 is a red filter and F2 is a blue filter and the detector array or scene is more sensitive to red than to blue, then the blue image may benefit from a stronger illumination. However, since both the red and blue images are acquired simultaneously, increasing the illumination strengthens both the red and blue images. This can be addressed by both repositioning 140 the filter module and adjusting 142 the source strength (or other exposure condition).
The multimode image can be processed 1130 in many ways, which may affect the image quality metric 1140. For example, the processed image could be produced by summing or averaging all the detectors corresponding to each filter. Alternately, several sub-images could be produced for each filter by demultiplexing the detectors behind the lenslets, so that each detector position yields its own sub-image. If the second method is chosen, then further processing might be applied to better define the image quality metric. This additional processing could be averaging all the sub-images for a given filter, or choosing features that are best in each sub-image to define a “sharpest feature” image, a “highest SNR” image, etc.
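As a rough sketch of the two processing options just described, and again assuming the hypothetical demultiplexed output of the earlier example, the per-filter image could be formed by averaging all sub-images for a filter, or a single best sub-image could be selected per filter using a simple SNR estimate. The selection criterion below is an assumption for illustration.

```python
import numpy as np

def average_per_filter(sub_images):
    """Average all sub-images captured through each filter."""
    return {k: np.mean(np.stack(imgs), axis=0) for k, imgs in sub_images.items()}

def best_snr_per_filter(sub_images, noise_std=1.0):
    """Pick, for each filter, the sub-image with the highest crude SNR estimate."""
    best = {}
    for k, imgs in sub_images.items():
        snrs = [float(np.mean(img)) / noise_std for img in imgs]
        best[k] = imgs[int(np.argmax(snrs))]
    return best
```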
The following is another example method for determining the exposure condition corresponding to the position for the filter module. Let Q be the image quality metric. Assume there are K filters fk. In this example, there is an initial calibration (steps 1 and 2) before taking a sequence of images.
Step 1. For different positions of the filter module, determine which detectors fall within which subpixel corresponding to each filter fk, k=1, . . . , K. That is, for filter module position 1, determine which detectors will collect light filtered by filter f1, which will collect light filtered by f2, etc. This can be done experimentally or through modeling or simulation. For example, geometric optics calculations can be used to propagate light passing through a filter onto the detector array.
Step 2. Position the filter module such that only one of the filters fk is exposed (e.g., as shown in
Step 3. Position the filter module such that each filter fk covers an equal area of the pupil (e.g., as shown in
Step 4. Process other positions of the filter module as follows:
In a special case, the image quality metric is the exposure value
EV = log2(A/T)   (1)
where A is the area of the pupil covered by the relevant filter and T is the integration/exposure time. In this case, some of the calculations are simplified due to the log nature of the exposure value. In another variation, the approach described above can be used but without the initial calibration phase.
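Using Equation (1), the exposure value and the integration time that achieves a desired exposure value can be computed directly. The sketch below is illustrative only; in particular, the policy of holding EV at a target value as the pupil area covered by a filter changes with filter module position is an assumption, not something specified above.

```python
import math

def exposure_value(pupil_area, integration_time):
    """EV = log2(A / T), per Equation (1)."""
    return math.log2(pupil_area / integration_time)

def integration_time_for_ev(pupil_area, target_ev):
    """Solve Equation (1) for T:  T = A / 2**EV."""
    return pupil_area / (2.0 ** target_ev)

# Illustration (assumed numbers): keep EV fixed as filter f1's share of the
# pupil changes from area_old to area_new when the filter module is shifted.
area_old, t_old = 4.0, 0.02                             # arbitrary area units, seconds
ev_target = exposure_value(area_old, t_old)             # log2(200)
area_new = 2.0
t_new = integration_time_for_ev(area_new, ev_target)    # = 0.01 here
```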
As different exposure conditions are determined for different positions of the filter module, this information can be stored for future use. In one approach, the data is stored in a lookup table. Then, as different filter module positions are used in the field, the corresponding exposure conditions can be determined from the lookup table.
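A minimal sketch of such a lookup table follows; it is illustrative only. The quantization of the filter module position into grid cells, and the dictionary form of the stored exposure condition, are assumptions.

```python
class ExposureLookupTable:
    """Store and retrieve exposure conditions keyed by filter module position."""

    def __init__(self, position_step=0.1):
        # position_step: grid spacing used to quantize positions (assumed units, e.g. mm)
        self.position_step = position_step
        self._table = {}

    def _key(self, position):
        # position: (x, y) offset of the filter module relative to the optical module
        return tuple(round(p / self.position_step) for p in position)

    def store(self, position, exposure_condition):
        # e.g. exposure_condition = {"integration_time": 0.02, "source_strength": 0.8}
        self._table[self._key(position)] = exposure_condition

    def lookup(self, position):
        """Return the stored exposure condition for the nearest grid position, if any."""
        return self._table.get(self._key(position))
```

In the field, the stored exposure condition for the commanded filter module position would then be retrieved and applied before capture.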
Although the detailed description contains many specifics, these should not be construed as limiting the scope of the invention but merely as illustrating different examples and aspects of the invention. It should be appreciated that the scope of the invention includes other embodiments not discussed in detail above.
For example, many types of filter arrays can be implemented. Filters can have different wavelength responses. All of the filters can be different, or some can be different and some the same. Multiple filters or filter arrays can also be used. For example, multiple filter arrays or filter modules can be used, each of which is moved independently of the others (or not). The number and type of wavelength responses can also vary, depending on the application. Some applications will use many different wavelength filters: 10, 20, 25 or more. In some cases, specific wavelength lines are being detected, so narrow band interference filters may be used. Filters other than wavelength filters can also be used, for example polarization, luminance, and neutral density filters.
Filtering can also be implemented at other points in the system. For example, the invention does not prevent the use of traditional micro-filters with the detector array. Various types of optical modules can also be used, including reflective and catadioptric systems. In some applications, the optical module is preferably telecentric. Finally, terms such as “light” and “optical” are not meant to be limited to the visible or purely optical regions of the electromagnetic spectrum, but are meant to also include regions such as the ultraviolet and infrared (but not be limited to these).
Various other modifications, changes and variations which will be apparent to those skilled in the art may be made in the arrangement, operation and details of the method and apparatus of the present invention disclosed herein without departing from the spirit and scope of the invention as defined in the appended claims. Therefore, the scope of the invention should be determined by the appended claims and their legal equivalents.