This invention relates generally to digital imaging, and more particularly to reproducing alternative forms of light from an object using digital imaging.
A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever. The following notice applies to the software and data as described below and in the drawings hereto: Copyright © 2003, Sony Electronics, Inc., All Rights Reserved.
Traditional digital color imaging systems such as a digital camera or a digital camcorder use either one or three image sensors (e.g., a charge coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS) sensor) well known to those of ordinary skill in the art. For example, a typical consumer digital camera includes one CCD, while a typical digital camcorder includes three CCDs. A digital camera having one CCD may have a single trichromatic filter. The trichromatic filter consists of red, green, and blue filters to reproduce a color spectrum in a scene using a process well known to those of ordinary skill in the art. A digital camera having one CCD does not have as high a color resolution as a digital camcorder having three CCDs. A digital camcorder having three CCDs typically includes a filter over each CCD: the first filter over the first CCD filters only the red color spectrum, the second filter over the second CCD filters only the green color spectrum, and the third filter over the third CCD filters only the blue color spectrum.
While consumers enjoy the advantages of the trichromatic reproduction capability of these cameras, some serious drawbacks have always accompanied them, such as illuminant estimation and color correction due to the infinite choice of illuminants. Furthermore, conventional digital imaging systems lack the capability to capture alternative forms of light under multiple illumination conditions, such as the surface reflectance of objects. Reflectance is the fraction of the luminous flux incident upon a surface that is reradiated in the visible spectrum. There are a number of imaging devices that may capture and reproduce the surface reflectance of an object, but these imaging devices are not feasibly included in a commercial consumer digital camera or camcorder because of their cost and the time required to capture an image. For example, a conventional spectro-radiometer may reproduce the surface reflectance of an object, but capturing a complete spectral image of a scene typically takes several minutes or more. This is not feasible for most commercial digital imaging systems because objects in a scene tend to move, and any movement of objects will cause pixel mis-registration and blur the final image.
According to one embodiment of the invention, a digital imaging device is described having filters to capture colorimetric information of visual light at a first and a second set of wavelengths. The captured colorimetric information is processed to reproduce a surface reflectance of an object in a scene.
In the following detailed description of embodiments of the invention, reference is made to the accompanying drawings in which like references indicate similar elements, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical, functional and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
The reproduction of the surface reflectance of an object using a digital imaging device is described. According to one embodiment, the digital imaging device includes, but is not limited to, two imaging sensors and two trichromatic filters to provide six imaging channels. The two trichromatic filters are designed to capture colorimetric information of various wavelengths to reproduce alternative forms of light such as the surface reflectance of objects in a scene. Although the following will describe an embodiment of a digital imaging device that includes trichromatic filters, one of ordinary skill in the art will recognize that the filters may be used to provide a varying number of imaging channels to the imaging sensors, as will be further described below.
The trichromatic filter 110 and trichromatic filter 120 filter visible light to the CCD 130 and CCD 140, respectively, to capture colorimetric information used to reproduce a digital representation of a scene, as will be further described below. As shown, the light is separated into two components with the beam splitter 115.
The CCD 130 and the CCD 140 are imaging sensors that include a collection of light-sensitive diodes called photosites, which convert light (photons) into electrons (electrical charges). The primary function of each of the photosites is to absorb light, which results in an electrical charge that is directly proportional to the intensity of the light shining on it. In this way, each photosite tracks the total intensity of the light that strikes its surface after passing through the trichromatic filter 110 or the trichromatic filter 120.
The ADC 150 converts the electrical charges that build up in the CCD 130 and CCD 140 into digital signals.
The processor 160 processes the digital signals into the digital image representation of the scene. The processor 160 may be, for example, a well-known digital signal processor (DSP). In one embodiment, the processor 160 processes each digital signal for each photosite to determine a color of a pixel in the digital image. The processor 160 may also correct and enhance the digital image for white balance, contrast, color, and other well-known visual characteristics. The processor 160 may also direct the digital image onto a local display (e.g. LCD display) coupled to the digital imaging device 100 or direct the digital image to a remote display via a wired or wireless network connection (not shown). Furthermore, the processor 160 may also compress the digital image as is well known to those of ordinary skill in the art.
Having provided a brief overview of the digital imaging device 100, embodiments of the trichromatic filter 110 and the trichromatic filter 120 will now be described.
It is well understood that visible light is that portion of the color spectrum between the wavelengths of about 400 nanometers (nm) and 800 nm. The different wavelengths are interpreted by the human brain as colors, ranging from red at the longest wavelengths to violet at the shortest wavelengths.
In one embodiment, the trichromatic filter 110 is designed so that the red filters 210 are most responsive to wavelengths between approximately 570 and 620 nm, as illustrated in the accompanying drawings.
In one embodiment, the trichromatic filter 120 is designed to capture the visible light of a set of wavelengths other than those captured by the trichromatic filter 110. For example, wavelength 340 of
Prior art digital cameras and camcorders only provide three imaging channels (e.g., RGB) regardless of the number of CCDs used. The trichromatic filter 120 in the digital imaging device 100 provides three additional imaging channels to the CCD 140, each at wavelengths separate from those passed by the trichromatic filter 110 to the CCD 130. The additional three channels work with the first three channels to calculate a spectral reproduction of alternative forms of light, as will be described. In one embodiment, the digital imaging device 100 may be used to extract spectral radiance or reflectance information of objects. Once the reflectance spectra are acquired, the colorimetric information can be converted for rendering under any viewing illuminant.
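By way of illustration only, the following minimal sketch (in Python, using synthetic placeholder data and illustrative names not drawn from this disclosure) shows how a recovered reflectance spectrum may be re-rendered under an arbitrary viewing illuminant:

```python
import numpy as np

# Minimal sketch (synthetic placeholder data; names are illustrative only):
# once a surface reflectance spectrum R_hat has been recovered, colorimetric
# values under any new viewing illuminant follow from a matrix product
# analogous to t = S^T L R.

def render_under_illuminant(R_hat, illuminant_spd, cmf):
    """Return tristimulus values of reflectance R_hat under a new illuminant.

    R_hat          : (n_lambda,) recovered surface reflectance spectrum
    illuminant_spd : (n_lambda,) spectral power distribution of the illuminant
    cmf            : (n_lambda, 3) observer color-matching functions
    """
    L_new = np.diag(illuminant_spd)      # diagonal illuminant matrix
    return cmf.T @ L_new @ R_hat

# Example with synthetic data sampled every 10 nm from 400 nm to 700 nm.
wavelengths = np.arange(400, 701, 10, dtype=float)
R_hat = np.clip(0.5 + 0.4 * np.sin(wavelengths / 60.0), 0.0, 1.0)
illuminant = np.ones_like(wavelengths)                         # flat, equal-energy illuminant
cmf = np.random.default_rng(0).random((wavelengths.size, 3))   # placeholder color-matching functions
print(render_under_illuminant(R_hat, illuminant, cmf))
```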
Generally, the reflectance spectra of natural objects can be represented with a limited number of basis functions obtained through principal component analysis (PCA) or independent component analysis (ICA). Typical spectral imaging using a wide-band technique applies three to nine basis functions to reproduce the reflectance spectra of natural objects. A wide-band approach is based on the spectral analysis of the objects to be captured. It has been shown that three basis functions are usually not enough to represent an object's reflectance spectra, but that six basis functions can represent them accurately in most cases.
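As a minimal sketch of this basis-function representation, assuming synthetic training spectra in place of measured data, reflectance spectra may be approximated with six principal components as follows:

```python
import numpy as np

# Minimal sketch of representing reflectance spectra with a small number of
# basis functions via principal component analysis; the training spectra here
# are smooth synthetic placeholders rather than measured data.

rng = np.random.default_rng(0)
n_lambda, n_samples, n_basis = 31, 200, 6        # e.g., 400-700 nm at 10 nm steps

# Synthetic smooth reflectance training set (stand-in for measured spectra).
training = np.clip(rng.random((n_samples, 4)) @ rng.random((4, n_lambda)) / 4.0, 0.0, 1.0)

# Basis vectors from a singular value decomposition of the training set,
# so that each spectrum is approximated as R ~= B @ alpha.
_, _, Vt = np.linalg.svd(training, full_matrices=False)
B = Vt[:n_basis].T                               # (n_lambda, n_basis) basis vectors

# Approximate one spectrum from its six basis weights.
R = training[0]
alpha = B.T @ R                                  # least-squares weights (orthonormal columns)
R_approx = B @ alpha
print("RMS approximation error:", np.sqrt(np.mean((R - R_approx) ** 2)))
```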
For example,
At block 435, the processor 160 calculates the spectral information of the objects based on the six imaging channels. In this fashion, a digital imaging device using two CCDs and two filters provides both high speed and spectral reproduction capabilities. The spectral recovery method may be principal component analysis (PCA), independent component analysis (ICA), or Wiener estimation. For spectral reproduction, the straightforward metric is the mean-squared spectral difference between the measured and recovered surface reflectance spectra of objects. In one embodiment, metrics are first defined to determine the optimal design of spectral sensitivities for spectral reproduction. The candidate metrics to define the spectral difference are:
Candidate 1: Mean Square Error of Reflectance Spectra
$MSE = E\{\| R - \hat{R} \|^2\}$ (1)
where $R$ is the measured reference spectral reflectance, and $\hat{R}$ is the recovered spectral reflectance; and
Candidate 2: Weighted Mean Square Error of Reflectance Spectra
$MSE_w = E\{\| w_\lambda (R - \hat{R}) \|^2\}$ (2)
where $w_\lambda$ is a weighting function applied as a diagonal matrix, whose diagonal elements are samplings of a weighting curve related to the human visual system, such as the q-factor curve emphasizing the prime wavelengths of the human visual system.
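A minimal computational sketch of these two candidate metrics, assuming discretely sampled spectra, an expectation taken as the mean over an ensemble of samples, and a placeholder weighting curve, might take the following form:

```python
import numpy as np

# Minimal sketch of the two candidate metrics, assuming reflectance spectra
# sampled at discrete wavelengths; the expectation E{.} is taken as the mean
# over an ensemble of sample spectra, and w is a placeholder weighting curve.

def mse_reflectance(R, R_hat):
    """Candidate 1, Equation (1): mean square error of reflectance spectra."""
    return float(np.mean(np.sum((R - R_hat) ** 2, axis=-1)))

def weighted_mse_reflectance(R, R_hat, w):
    """Candidate 2, Equation (2): weighted mean square error of reflectance
    spectra, with w the sampled weighting function in diagonal-matrix form."""
    W = np.diag(w)
    weighted_diff = (R - R_hat) @ W        # scales each wavelength sample by w
    return float(np.mean(np.sum(weighted_diff ** 2, axis=-1)))

# Tiny usage example with synthetic spectra (10 samples, 31 wavelengths).
rng = np.random.default_rng(0)
R, R_hat = rng.random((10, 31)), rng.random((10, 31))
w = np.linspace(0.5, 1.0, 31)              # placeholder for a q-factor-like curve
print(mse_reflectance(R, R_hat), weighted_mse_reflectance(R, R_hat, w))
```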
As stated above, there are several approaches to recovering the surface reflectance spectra of objects and normalizing the metrics. For example, with principal component analysis, it is well known that most naturally occurring spectral reflectances can be represented by a limited number of principal eigenvectors obtained from the principal component analysis of a representative set of reflectance samples. The output of the digital imaging device 100 can be written as:
$t_c = S^T L_c R = S^T L_c B \alpha$ (3)
where $S$ denotes the camera spectral sensitivities, $L_c$ denotes the diagonal form of the taking illuminant, $B$ denotes the principal component vectors obtained via PCA or ICA from a training set of spectral reflectance samples, and $\alpha$ denotes the weights for the principal components ($R \cong B\alpha$). $\alpha$ can be obtained using a pseudo-inverse operation:
$\alpha = (S^T L_c B)^{-1} t_c$ (4)
Therefore the recovered spectral reflectance is represented as
$\hat{R} = B\alpha = B (S^T L_c B)^{-1} t_c$ (5)
The minimized mean square error of spectral difference is
$MSE = E\{\| R - B (S^T L_c B)^{-1} t_c \|^2\}$ (6)
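A minimal numerical sketch of Equations (3) through (5), assuming synthetic camera sensitivities, illuminant, and basis vectors in place of measured quantities, is shown below:

```python
import numpy as np

# Minimal sketch of Equations (3)-(5), assuming six imaging channels, a known
# taking illuminant, and six basis vectors; all quantities are synthetic
# placeholders rather than measured camera data.

rng = np.random.default_rng(1)
n_lambda, n_channels, n_basis = 31, 6, 6

S = rng.random((n_lambda, n_channels))                # camera spectral sensitivities
Lc = np.diag(0.5 + rng.random(n_lambda))              # diagonal taking illuminant
B, _ = np.linalg.qr(rng.random((n_lambda, n_basis)))  # orthonormal stand-in for PCA vectors

# A reflectance lying in the span of the basis, so recovery should be exact.
R = B @ rng.random(n_basis)

t_c = S.T @ Lc @ R                                    # Equation (3): camera output
alpha = np.linalg.pinv(S.T @ Lc @ B) @ t_c            # Equation (4): pseudo-inverse
R_hat = B @ alpha                                     # Equation (5): recovered spectrum

print("recovery error:", np.linalg.norm(R - R_hat))
```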
When using Wiener estimation to recover the surface reflectance spectra of objects and normalize the metrics, the output signal of the digital imaging device 100 is expressed as
$t_c = S^T L_c R = G^T R + \eta$ (7)
where $G = L_c S$ (so that $G^T = S^T L_c$) and $\eta$ denotes the imaging noise. The estimation of $R$ is given by
$\hat{R} = F \cdot t_c$ (8)
where $F$ is an unknown linear transformation matrix. Minimizing the MSE in Equation (1), the explicit form of $F$ is given as
$F = K_R G (G^T K_R G + K_\eta)^{-1}$ (9)
where $K_R$ and $K_\eta$ are the correlation matrices of the ensemble of surface reflectance spectra and of the noise, respectively.
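A minimal sketch of this Wiener estimator follows, assuming a synthetic reflectance ensemble, a simple sample-correlation estimate of $K_R$, and uncorrelated sensor noise; these modeling choices are illustrative assumptions rather than part of the disclosure:

```python
import numpy as np

# Minimal sketch of the Wiener estimator of Equations (7)-(9); the sensitivity,
# illuminant, reflectance ensemble, and noise statistics are all synthetic
# placeholders, and K_R is estimated here as a simple sample correlation.

rng = np.random.default_rng(2)
n_lambda, n_channels = 31, 6

S = rng.random((n_lambda, n_channels))          # camera spectral sensitivities
Lc = np.diag(0.5 + rng.random(n_lambda))        # diagonal taking illuminant
G = Lc @ S                                      # so that t_c = G^T R + eta

reflectances = rng.random((200, n_lambda))      # ensemble of reflectance spectra
K_R = reflectances.T @ reflectances / len(reflectances)   # sample correlation matrix
K_eta = 1e-4 * np.eye(n_channels)               # assumed uncorrelated sensor noise

F = K_R @ G @ np.linalg.inv(G.T @ K_R @ G + K_eta)        # Equation (9)

R_true = reflectances[0]
t_c = G.T @ R_true + rng.normal(scale=1e-2, size=n_channels)  # Equation (7)
R_hat = F @ t_c                                                # Equation (8)
print("spectral RMS error:", np.sqrt(np.mean((R_true - R_hat) ** 2)))
```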
The noise correlation matrix $K_\eta$ can be estimated from detailed measurements of the noise of the CCD cameras actually used, and
where $n_{sample}$ is the number of samples in the ensemble of spectral reflectance. Therefore, the minimal mean squared error in Equation (1) can be represented as
The quantities $\alpha(\cdot)$ and $\tau(\cdot)$ can be interpreted as the total spectral information of the objects and the recovered spectral information of the objects, respectively. The normalized metric corresponding to the minimal MSE will be referred to as the spectral quality factor, or the quality factor for spectral reproduction.
Besides the mean squared error of spectral reflectance as a primary metric, the mean color difference under a specific illuminant can be treated as a secondary metric for the optimal design of filters for spectral reproduction. A small collection of optimal candidates generated with the primary metric can then be refined with the secondary metrics.
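A minimal sketch of such a two-stage refinement, with placeholder metric functions standing in for the primary and secondary metrics defined in this section, might be:

```python
import numpy as np

# Minimal sketch of the two-stage selection described above: candidate filter
# designs are first shortlisted by a primary spectral metric and the shortlist
# is then refined by a secondary colorimetric metric. Both metric functions
# below are placeholders standing in for the metrics defined in this section.

rng = np.random.default_rng(3)

def primary_metric(sensitivities):
    """Placeholder for the mean squared spectral reflectance error."""
    return float(np.sum((sensitivities - 0.5) ** 2))

def secondary_metric(sensitivities):
    """Placeholder for the mean color difference under a specific illuminant."""
    return float(np.abs(sensitivities - sensitivities.mean()).max())

# Candidate spectral sensitivity sets: 50 designs, 31 wavelengths x 6 channels.
candidates = [rng.random((31, 6)) for _ in range(50)]

shortlist = sorted(candidates, key=primary_metric)[:5]   # stage 1: primary metric
best = min(shortlist, key=secondary_metric)              # stage 2: secondary metric
print("selected design, primary score:", primary_metric(best))
```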
In one embodiment, Wiener estimation with a weighting function may be used. By minimizing $MSE_w$ in Equation (2), the reflectance spectra are estimated as
$\hat{R} = K_R G (G^T K_R G + K_\eta)^{-1} t_c$ (17)
and the minimal mean squared error of spectra is
Thus a normalized metric can be defined as
It will be appreciated that more or fewer processes may be incorporated into the method illustrated in the accompanying figures.
Thus, a new digital color imaging system having two imaging sensors and two filters providing multiple imaging channels has been described. The image registration from two CCDs is faster and more efficient than from a three-CCD system. The new configuration of the digital imaging device 100 will not substantially increase the size of a single-chip digital camera because only the filter patterns differ between the two imaging sensors. However, it is also apparent that the invention is not limited to digital cameras and camcorders but may be used in any imaging device well known to those of ordinary skill in the art.
It should be noted that the trichromatic filter 110 and the trichromatic filter 120 are not limited to only trichromatic filters. Rather, in an alternative embodiment, the trichromatic filter 110 may include two wavelengths of the color green to provide four imaging channels. Further, the trichromatic filter 120 may provide any number of color imaging channels other than three (e.g., one, two, or four), each having a wavelength different from those of the trichromatic filter 110. In this way, the digital imaging system 100 may produce multiple color imaging channels for each wavelength.
While the invention has been described in terms of several embodiments, those skilled in the art will recognize that the invention is not limited to the embodiments described. The method and apparatus of the invention can be practiced with modification and alteration within the scope of the appended claims. The description is thus to be regarded as illustrative instead of limiting on the invention.