Multispectral imaging systems collect and record electromagnetic energy in multiple distinct spectral bands, including the visible and near-infrared (VNIR), ultraviolet (UV), and infrared (IR) regions of the spectrum. The resulting imagery is displayed by combining the spectral band information into one or more channels to form a grayscale or color representation of the image. Multispectral imaging devices are a class of spectrometers that record energy in many discrete spectral bands simultaneously on an image sensor at a multitude of spatial picture elements, called pixels. Standard broadband imagers record one value at each pixel for all the detected incident energy across a wide spectrum, and create an image in two spatial dimensions from a two-dimensional array of detectors. Multispectral imaging devices differ from standard broadband imagers by creating an image with an additional spectral dimension: each multispectral pixel may have tens or hundreds of wavelength values recorded, where each value is considered a subpixel. A staring array is one type of imaging device in which a two-dimensional array of detector elements at a focal plane captures energy in selected spectral bands so that an image can be constructed directly from the pixels and subpixels.
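The distinction between a broadband image and a multispectral image cube can be sketched with array shapes. This is an illustrative example only; the dimensions, band count, and NumPy representation are assumptions for exposition and not part of the disclosed device:

```python
import numpy as np

# A broadband imager records one value per pixel: a 2-D array.
broadband = np.zeros((480, 640))             # rows x cols

# A multispectral staring array adds a spectral dimension: each
# pixel holds one value (a "subpixel") per recorded spectral band.
n_bands = 16                                 # hypothetical band count
cube = np.zeros((480, 640, n_bands))         # rows x cols x bands

# The spectrum recorded at one spatial pixel is a 1-D slice of the cube.
spectrum = cube[100, 200, :]
assert spectrum.shape == (n_bands,)

# A grayscale display can be formed by collapsing the spectral axis.
gray = cube.mean(axis=2)
assert gray.shape == broadband.shape
```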
One aspect of the invention relates to a multispectral staring array. The multispectral staring array comprises at least two sensors, each sensor adapted to detect an image in a different predetermined spectral sensitivity, and positioned on an image plane wherein the image planes are neither coplanar with nor parallel to each other; a first lens to focus incident spectral bands; a spectral filter between the first lens and the at least two sensors wherein the spectral filter is configured to subdivide the incident spectral bands; and a second lens configured to direct and focus the subdivided incident spectral bands on each of the at least two sensors.
In the drawings:
In the background and the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the technology described herein. It will be evident to one skilled in the art, however, that the exemplary embodiments may be practiced without these specific details. In other instances, structures and devices are shown in diagram form in order to facilitate description of the exemplary embodiments.
The exemplary embodiments are described with reference to the drawings. These drawings illustrate certain details of specific embodiments that implement the technology described herein. However, the drawings should not be construed as imposing on the disclosure any limitations that may be present in the drawings.
Technical effects of the multispectral staring array disclosed in the embodiments include greatly reducing the complexity of a sensing system that must gather both visible and infrared light. The consolidation of sensors reduces optical complexity and allows for the creation of a single multifunction semiconductor. Consequently, a multispectral imaging device implementing the staring array disclosed in the embodiments can be built into a smaller form factor with decreased weight and power requirements compared to its conventional counterpart. Additionally, the integration of different sensor types into a single semiconductor as disclosed in the embodiments of the current invention conserves total light power available to the system that is typically lost to complex light-splitting techniques. By integrating sensors with different spectral sensitivities on a single semiconductor chip, the multispectral imaging device embodied in the present invention can form substantially real-time high dynamic range imagery. Finally, the embodiments of the multispectral staring array maintain sampling coherency.
The array 10 includes an input aperture such as a lens 14. The lens 14 is configured to collect incident electromagnetic energy. As shown in
The subdivided spectral bands of electromagnetic energy then arrive at a microlens array 26. Each microlens of the microlens array 26 directs and focuses the subdivided spectral bands of electromagnetic energy onto the sensor 28, as shown by the set of rays such as 32 and 34 from the microlens array 26 to the sensor 28. The sensor 28 is positioned on an image plane 40 relative to the spectral filter 22. The sensor 28 is an array of detectors where each detector may be configured to detect electromagnetic energy of a spectral band with a predetermined spectral sensitivity.
The configuration of the spectral filter 22, microlens array 26 and the sensor 28 is such that each detector in the array will detect electromagnetic energy of a particular spectral band that corresponds to a segment of the imaged object 12. In this way, the sensor 28 includes detector elements arranged to form a multispectral pixel of the imaged object 12. The set of the detector elements on the sensor 28 that detects incident electromagnetic energy reflecting or radiating from a particular area of an imaged object 12 form a multispectral pixel.
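The grouping of detector elements into multispectral pixels can be modeled as a regrouping of the raw detector readout. The sketch below is purely hypothetical: the 2x2 tile layout, band count, and array manipulation are illustrative assumptions, not the disclosed sensor geometry:

```python
import numpy as np

# Hypothetical layout: each multispectral pixel is a 2x2 tile of
# detector elements, one element per spectral band (4 bands total).
tile = 2
rows_px, cols_px = 3, 4                      # image is 3x4 multispectral pixels
raw = np.arange(rows_px * tile * cols_px * tile).reshape(rows_px * tile,
                                                         cols_px * tile)

# Regroup the raw detector readout into (pixel_row, pixel_col, band).
cube = (raw.reshape(rows_px, tile, cols_px, tile)
           .transpose(0, 2, 1, 3)
           .reshape(rows_px, cols_px, tile * tile))

# The four subpixels of pixel (0, 0) came from the 2x2 detector tile
# at the top-left corner of the sensor.
assert (cube[0, 0] == [raw[0, 0], raw[0, 1], raw[1, 0], raw[1, 1]]).all()
```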
As shown in
The redundant elements of the second spectral filter 20, the second microlens array 24, and the second sensor 30 allow for a highly configurable imager. The second spectral filter 20 may be configured to have a different wavelength response than the first spectral filter 22. Similarly, the second sensor 30 may consist of an array of detectors where each detector is configured to detect a set of spectral bands with a different spectral sensitivity than the analogous detector of the first sensor 28. In this way, the multispectral staring array 10 may achieve a higher spectral resolution, that is, detect a higher number of spectral bands per pixel, than is possible using a single optical path from the mirror 16 to the first sensor 28 by way of the first spectral filter 22 and the first microlens array 26.
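The gain in spectral resolution from the two complementary filters can be sketched as follows. The band centers and counts are hypothetical; this illustrates the interleaving idea, not the disclosed filter responses:

```python
import numpy as np

# Hypothetical: filter 22 passes 4 bands to sensor 28, and filter 20
# passes 4 different bands to sensor 30. Interleaving the two stacks
# doubles the number of spectral bands sampled per pixel.
bands_28 = np.array([400., 500., 600., 700.])   # nm, sensor 28 band centers
bands_30 = np.array([450., 550., 650., 750.])   # nm, sensor 30 band centers

combined = np.empty(bands_28.size + bands_30.size)
combined[0::2] = bands_28                       # even slots from sensor 28
combined[1::2] = bands_30                       # odd slots from sensor 30

# The combined optical paths sample 8 bands instead of 4.
assert combined.size == 2 * bands_28.size
assert (np.diff(combined) > 0).all()            # bands remain ordered
```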
Alternatively, the multispectral staring array 10 may achieve a higher dynamic range, that is, an expanded limit of detectable luminance, than is possible with a single sensor by using two sensors 28 and 30 with different spectral sensitivities for the same spectral bandwidth. Additionally, according to an embodiment of the present invention, the multispectral staring array 10 may further augment high dynamic range (HDR) imaging by altering the timing of the mirror toggling. For example, the mirror would spend a relatively short time at a first position 16 to enable the sensor 28 to capture the variance of luminance of the scene in bright regions of the image, and a relatively longer time at a second position 18 to enable the sensor 30 to capture the variance of luminance of the scene in dark regions of the image.
The subdivided spectral bands of electromagnetic energy then arrive at a microlens array 112. Each microlens of the microlens array 112 directs and focuses the subdivided spectral bands of electromagnetic energy onto one of the three sensors 114, 116 and 118. In this way, the microlens array 112 is acting in the same capacity as the actuating mirror 16, 18 in
As in the embodiment of
To enable the embodiments of the invention, the sensor as in 28 and 30 in the embodiment of
In another embodiment of the mixed-material sensor as shown in
Many semiconductor materials are known to be used in the manufacture of imaging detectors. In any embodiment of the present invention, these materials may be combined on a single chip. For example, the UV and VNIR wavelength detectors 310, 314 may be based upon variants of charge-coupled devices (CCD) or complementary metal-oxide-semiconductor (CMOS) detectors. The IR detectors 312 may be photodetectors built from telluride, indium, or other alloys and incorporated with the VNIR and UV detectors 310, 314. When stacking detector types on the same semiconductor with a common substrate, such as silicon, the detectors are created on the same base wafer using multiple doping and etching steps to produce a layered mixed-wavelength photodetector. For example, the UV detector 314 may be built upon the gate drains of the CCD-based VNIR detector 310.
Number | Name | Date | Kind |
---|---|---|---|
4678332 | Rock et al. | Jul 1987 | A |
5300778 | Norkus et al. | Apr 1994 | A |
5373182 | Norton | Dec 1994 | A |
5587784 | Pines et al. | Dec 1996 | A |
5729011 | Sekiguchi | Mar 1998 | A |
5926282 | Knobloch et al. | Jul 1999 | A |
6031619 | Wilkens et al. | Feb 2000 | A |
7274454 | Kowarz et al. | Sep 2007 | B2 |
7623165 | Gruhlke et al. | Nov 2009 | B2 |
7949019 | Bouma et al. | May 2011 | B2 |
8081244 | Golub et al. | Dec 2011 | B2 |
8143565 | Berkner et al. | Mar 2012 | B2 |
20040218187 | Cole | Nov 2004 | A1 |
20060279647 | Wada et al. | Dec 2006 | A1 |
20070170359 | Syllaios et al. | Jul 2007 | A1 |
20080137074 | Furman et al. | Jun 2008 | A1 |
20090021598 | McLean et al. | Jan 2009 | A1 |
20110194100 | Thiel et al. | Aug 2011 | A1 |
20110254953 | Genio et al. | Oct 2011 | A1 |
20110279721 | McMahon | Nov 2011 | A1 |
20110316982 | Steurer | Dec 2011 | A1 |
20120038819 | McMackin et al. | Feb 2012 | A1 |
20120127351 | Vlutters et al. | May 2012 | A1 |
20130235256 | Kodama | Sep 2013 | A1 |
Number | Date | Country |
---|---|---|
0405051 | Jan 1991 | EP |
2012066741 | May 2012 | WO |
Entry |
---|
EP Search Report and Written Opinion dated Mar. 5, 2014 issued in connection with corresponding EP Application No. 13191947.4-1562. |
European Search Report and Written Opinion issued in connection with corresponding EP Application No. 13191947.4-1562 on Jun. 5, 2014. |
Haneishi et al., “Image acquisition technique for high dynamic range scenes using a multiband camera”, Color Research & Application, Volume No. 31, Issue No. 4, pp. 294-302, Aug. 1, 2006. |
Number | Date | Country |
---|---|---|
20140132946 A1 | May 2014 | US |