The invention relates to the field of image capture. More particularly, the invention relates to a sensor for capture of an image and depth information and uses thereof.
Digital cameras and other image capture devices operate by capturing electromagnetic radiation and measuring the intensity of the radiation. The spectral content of electromagnetic radiation focused onto a focal plane depends on, among other things, the image to be captured, the illumination of the subject, the transmission characteristics of the propagation path between the image subject and the optical system, the materials used in the optical system, and the geometric shape and size of the optical system.
For consumer imaging systems (e.g., digital cameras) the spectral range of interest is the visible region of the electromagnetic spectrum. A common method for preventing difficulties caused by radiation outside of the visible range is to use ionically colored glass or a thin-film optical coating on glass to create an optical element that passes visible light (typically having wavelengths in the range of 380 nm to 780 nm). This element can be placed in front of the taking lens, within the lens system, or it can be incorporated into the imager package. The principal disadvantage of this approach is increased system cost and complexity.
A color filter array (CFA) is an array of filters deposited over a pixel sensor array so that each pixel sensor is substantially sensitive to only the electromagnetic radiation passed by the filter. A filter in the CFA can be a composite filter manufactured from multiple filters so that the transfer function of the resulting filter is the product of the transfer functions of the constituent filters. Each filter in the CFA passes electromagnetic radiation within a particular spectral range (e.g., wavelengths that are interpreted as red). For example, a CFA may be composed of red, green and blue filters so that the pixel sensors provide signals indicative of the visible color spectrum.
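The composite-filter relationship described above can be sketched as follows; the transmission curves and sample wavelengths below are invented for illustration, not taken from the source:

```python
# A composite CFA filter's transfer function is the product of the
# transfer functions of its constituent filters, evaluated per wavelength.
# The transmission fractions here are hypothetical example values.
wavelengths = [450, 550, 650, 850]  # sample wavelengths in nm

ir_cut = {450: 0.95, 550: 0.95, 650: 0.90, 850: 0.05}    # assumed IR-cut curve
red_pass = {450: 0.05, 550: 0.10, 650: 0.90, 850: 0.80}  # assumed red-dye curve

# Composite transmission = product of constituent transmissions.
composite = {w: ir_cut[w] * red_pass[w] for w in wavelengths}
```

Under these assumed curves, the composite element passes red light (about 0.81 at 650 nm) while strongly attenuating infrared (0.04 at 850 nm), which is the behavior the composite-filter construction is meant to achieve.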
If there is no infrared blocking element somewhere in the optical chain, infrared (IR) radiation (typically considered to be light with a wavelength greater than 780 nm) may also be focused on the focal plane. Imaging sensors or devices based on silicon technology typically require the use of infrared blocking elements to prevent IR from entering the imaging array, because silicon-based devices will typically be sensitive to light with wavelengths up to 1200 nm. If the IR is permitted to enter the array, the device will respond to the IR and generate an image signal that includes the IR.
In current three-dimensional cameras, the depth information is captured separately from the color information. For example, a camera can capture red, green and blue (visible color) images at fixed time intervals. Pulses of IR light are transmitted between color image captures to obtain depth information. The photons from the infrared light pulse are collected between the capture of the visible colors.
The number of bits available to the analog-to-digital converter determines the depth increments that can be measured. By applying accurate timing to cut off imager integration, the infrared light can directly carry shape information. By controlling the integration operation after pulsing the IR light, the camera can determine what interval of distance will measure object depth and such a technique can provide the shape of the objects in the scene being captured. This depth generation process is expensive and heavily dependent on non-silicon, mainly optical and mechanical systems for accurate shutter and timing control.
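The relationship between the integration window, the ADC resolution and the measurable depth increment can be sketched as follows; the 100 ns gate window and 8-bit ADC are assumed example values, not figures from the source:

```python
# Sketch: a gated-integration depth camera maps a round-trip time window
# onto the ADC's code range, so the smallest resolvable depth step is the
# depth spanned by the window divided by the number of ADC codes.
C = 3.0e8  # speed of light, m/s

def depth_increment(window_ns: float, adc_bits: int) -> float:
    """Smallest depth step (m) for a given gate window and ADC resolution."""
    window_depth = C * (window_ns * 1e-9) / 2.0  # halve for the round trip
    return window_depth / (2 ** adc_bits)

# Example: a 100 ns gate spans ~15 m of depth; an 8-bit ADC divides it
# into 256 increments of roughly 5.9 cm each.
step = depth_increment(100.0, 8)
```

This illustrates why the ADC bit depth directly sets the depth granularity: each additional bit halves the smallest measurable depth increment.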
The invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements.
In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the invention. It will be apparent, however, to one skilled in the art that the invention can be practiced without these specific details. In other instances, structures and devices are shown in block diagram form in order to avoid obscuring the invention.
A sensor for color and depth information capture is disclosed. A filter passes selected wavelengths according to a predetermined pattern to the sensor. The sensor measures light intensities passed by the filter. In one embodiment, the wavelengths passed by the filter correspond to red, green, blue and infrared light. The intensity values can be used for interpolation operations to provide intensity values for areas not captured by the sensor. For example, in an area corresponding to a pixel for which an intensity of red light is captured, interpolation operations using neighboring intensity values can be used to provide an estimation of blue, green and infrared intensities. Red, green and blue intensity values, whether captured or interpolated, are used to provide visible color image information. Infrared intensity values, whether captured or interpolated, are used to provide depth and/or surface texture information.
A color image pixel consists of three basic color components—red, green and blue. High-end digital cameras capture these colors with three independent and parallel sensors each capturing a color plane for the image being captured. However, lower-cost image capture devices use sub-sampled color components so that each pixel has only one color component captured and the two other missing color components are interpolated based on the color information from the neighboring pixels. One pattern commonly used for sub-sampled color image capture is the Bayer pattern.
Each pixel in the Bayer pattern consists of only one color component—either red (R), green (G) or blue (B). The missing components are reconstructed based on the values of the neighboring pixels. For example, the pixel at location (3,2) contains only blue intensity information and the red and green components have been filtered out.
The missing red information can be obtained by interpolation. For example, the red intensity information can be obtained by determining the average intensity of the four adjacent red pixels at locations (2,1), (2,3), (4,1) and (4,3). Similarly, the missing green intensity information can be obtained by determining the average intensity of the four adjacent green pixels at locations (2,2), (3,1), (3,3) and (4,2). Other, more complex interpolation techniques can also be used. However, an image capture device using the standard Bayer pattern cannot capture depth information without additional components, which increases the cost and complexity of the device.
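The neighbor-averaging step described above can be sketched as follows; the intensity values are invented for illustration, and the 1-based (row, column) coordinates are those given in the text for the blue pixel at location (3,2):

```python
# Minimal sketch of four-neighbor averaging for a sub-sampled mosaic.
# `bayer` maps (row, col) -> captured intensity; values are illustrative.
def average(bayer, coords):
    """Average the captured intensities at the given pixel locations."""
    return sum(bayer[c] for c in coords) / len(coords)

bayer = {(2, 1): 100, (2, 3): 110, (4, 1): 90, (4, 3): 100,   # red neighbors
         (2, 2): 80,  (3, 1): 85,  (3, 3): 75, (4, 2): 80}    # green neighbors

# Interpolate the red and green components missing at blue pixel (3, 2).
red_at_3_2 = average(bayer, [(2, 1), (2, 3), (4, 1), (4, 3)])
green_at_3_2 = average(bayer, [(2, 2), (3, 1), (3, 3), (4, 2)])
```

With the values above, the interpolated red and green intensities at (3,2) are 100.0 and 80.0, respectively.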
For example, the pixel in location (7,3) corresponds to blue intensity information (row 7 and column 3). Thus, it is necessary to recover green and red intensity information in order to provide a full color pixel. Recovery of IR intensity information provides depth information. In one embodiment the average intensity of the values of the four neighboring green pixel locations (7,2), (7,4), (6,3) and (8,3) is used for the green intensity value of pixel (7,3). Similarly, the average of the intensity values of the nearest neighbor red pixel locations (7,1), (7,5), (5,3) and (9,3) is used for the red intensity value of pixel (7,3). The IR intensity information for pixel (7,3) can be determined as the average intensity of the nearest neighbor IR pixel locations (6,2), (6,4), (8,2) and (8,4).
One embodiment of a technique for interpolating color and/or depth information follows. In the equations that follow, “IR” indicates an interpolated intensity value for the pixel at location (m,n) unless the equation is IR=X(m,n), which indicates a captured infrared value. The equations for red, green and blue follow the same convention. Alternate techniques can also be used.
For the pixel X(m,n) in location (m,n):
if location (m,n) captures blue, then B=X(m,n) and the R, G and IR values are interpolated from the nearest neighboring red, green and infrared pixels, respectively;
else if location (m,n) captures green, then G=X(m,n) and the R, B and IR values are interpolated from the nearest neighboring pixels;
else if location (m,n) captures red, then R=X(m,n) and the G, B and IR values are interpolated from the nearest neighboring pixels;
else IR=X(m,n) and the R, G and B values are interpolated from the nearest neighboring pixels;
end
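The per-location case analysis can be sketched over a repeating mosaic. The specific 4×4 layout below is an assumption chosen to be consistent with the pixel coordinates used in this description (e.g., blue at (7,3), red at (5,3), infrared at (6,2)); it is not stated explicitly in the source:

```python
# Assumed 4x4 repeating R-G-B-IR mosaic ("I" = infrared), consistent with
# the 1-based coordinates used in the text:
#
#     B G R G
#     G I G I
#     R G B G
#     G I G I
PATTERN = [["B", "G", "R", "G"],
           ["G", "I", "G", "I"],
           ["R", "G", "B", "G"],
           ["G", "I", "G", "I"]]

def component_at(m: int, n: int) -> str:
    """Return which component the filter at 1-based location (m, n) captures."""
    return PATTERN[(m - 1) % 4][(n - 1) % 4]
```

Under this assumed layout, `component_at(7, 3)` is blue, `component_at(5, 3)` is red, and `component_at(6, 2)` is infrared, matching the neighbor coordinates used in the interpolation example above.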
Interpolation unit 330 is coupled with sensor unit 320 to interpolate the pixel color information from the sensor unit. In one embodiment, interpolation unit 330 operates using the equations set forth above. In alternate embodiments, other interpolation equations can also be used. Interpolation of the pixel data can be performed in series or in parallel. The collected and interpolated pixel data are stored in the appropriate buffers coupled with interpolation unit 330.
In one embodiment, interpolation unit 330 is implemented as hardwired circuitry to perform the interpolation operations described herein. In an alternate embodiment, interpolation unit 330 is a general purpose processor or microcontroller that executes instructions that cause interpolation unit 330 to perform the interpolation operations described herein. The interpolation instructions can be stored in a storage medium in, or coupled with, image capture device 300, for example, storage medium 360. As another alternative, interpolation unit 330 can perform the interpolation operations as a combination of hardware and software.
Infrared pixel data is stored in IR buffer 342, blue pixel data is stored in B buffer 344, red pixel data is stored in R buffer 346 and green pixel data is stored in G buffer 348. The buffers are coupled with signal processing unit 350, which performs signal processing functions on the pixel data from the buffers. Any type of signal processing known in the art can be performed on the pixel data.
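The routing of pixel components into the four buffers can be sketched as follows; the pixel values and dictionary layout are illustrative placeholders, not structures from the source:

```python
# Illustrative pixel data: each entry holds the four components
# (captured or interpolated) for one pixel location.
pixels = [{"R": 100, "G": 80, "B": 60, "IR": 40},
          {"R": 90,  "G": 85, "B": 70, "IR": 45}]

# One buffer per channel, mirroring IR buffer 342, B buffer 344,
# R buffer 346 and G buffer 348 described above.
buffers = {"IR": [], "B": [], "R": [], "G": []}
for pixel in pixels:
    for channel in buffers:
        buffers[channel].append(pixel[channel])
```

Keeping each channel in its own buffer lets downstream signal processing operate on a full color plane (or the IR depth plane) at a time.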
The red, green and blue color pixel data are used to generate color images of the scene captured. The infrared pixel data are used to generate depth and/or texture information. Thus, using the four types of pixel data (R-G-B-IR), an image capture device can capture a three-dimensional image.
In one embodiment, the processed pixel data are stored on storage medium 360. Alternatively, the processed pixel data can be displayed by a display device (not shown in the figures).
Color intensity values are received by the interpolation unit, 410. In one embodiment, light from an image to be captured is passed through a lens to a sensor. The sensor can be, for example, a complementary metal-oxide semiconductor (CMOS) sensor, a charge-coupled device (CCD), etc. The intensity of the light passed to the sensor is captured in multiple locations on the sensor. In one embodiment, light intensity is captured for each pixel of a digital image corresponding to the image captured.
In one embodiment, each pixel captures the intensity of light corresponding to a single wavelength range (e.g., red light, blue light, green light, infrared light). The colors corresponding to the pixel locations follow a predetermined pattern. One pattern that can be used is described above.
The captured color intensity values from the sensor unit are sent to an interpolation unit in any manner known in the art. The interpolation unit performs color intensity interpolation operations on the captured intensity values, 420. In one embodiment, the interpolation operations are performed for the pattern described above.
The sensor unit captures intensity values for visible colors as well as for infrared wavelengths. In one embodiment, the visible color intensities are interpolated such that each of the pixel locations has two interpolated color intensity values and one captured color intensity value. In alternate embodiments, color intensity values can be selectively interpolated such that one or more of the pixel locations does not have two interpolated color intensity values.
The infrared intensity values are also interpolated as described herein. The infrared intensity values provide depth (distance) information that allows the surface features of the captured scene to be determined. In one embodiment, an infrared value is either captured or interpolated for each pixel location. In alternate embodiments, the infrared values can be selectively interpolated.
The captured color intensity values and the interpolated color intensity values are stored in a memory, 430. The color intensity values can be stored in a memory that is part of the capture device or the memory can be external to, or remote from, the capture device. In one embodiment, four buffers are used to store red, green, blue and infrared intensity data. In alternate embodiments, other storage devices and/or techniques can be used.
An output image is generated using, for example, a signal processing unit, from the stored color intensity values, 440. In one embodiment, the output image is a reproduction of the image captured; however, one or more “special effects” changes can be made to the output image. The output image can be displayed, stored, printed, etc.
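The four numbered steps above (410 through 440) can be sketched as a simple pipeline; every function name here is an illustrative placeholder, not an identifier from the source:

```python
# Sketch of the capture flow: receive intensities (410), interpolate
# the missing components (420), store the planes (430), and generate
# the output image (440). The stages are passed in as callables so the
# sketch stays independent of any particular hardware.
def capture_pipeline(sensor_readout, interpolate, store, render):
    samples = sensor_readout()      # 410: captured intensity values
    planes = interpolate(samples)   # 420: fill in missing components
    store(planes)                   # 430: buffer per-channel data
    return render(planes)           # 440: produce the output image
```

A usage example with stub stages: `capture_pipeline(read_sensor, demosaic, buffers.update, to_image)`, where each stage would be realized by the sensor unit, interpolation unit, buffers and signal processing unit described above.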
Reference in the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
In the foregoing specification, the invention has been described with reference to specific embodiments thereof. It will, however, be evident that various modifications and changes can be made thereto without departing from the broader spirit and scope of the invention. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
This U.S. Patent application is a continuation-in-part of U.S. patent application Ser. No. 10/376,156, filed Feb. 28, 2003, entitled “FOUR-COLOR MOSAIC PATTERN FOR DEPTH AND IMAGE CAPTURE.” This U.S. Patent application is related to U.S. patent application Ser. No. 10/376,127, filed Feb. 28, 2003, entitled “SUB-SAMPLED INFRARED SENSOR FOR USE IN A DIGITAL IMAGE CAPTURE DEVICE.”
Number | Date | Country
---|---|---
20040169749 A1 | Sep 2004 | US

Number | Date | Country
---|---|---
Parent 10376156 | Feb 2003 | US
Child 10664023 | | US