Imagers, including complementary metal oxide semiconductor (CMOS) imagers and charge-coupled devices (CCDs), may be used in digital imaging applications to capture scenes. An imager includes an array of pixels. Each pixel in the array includes at least a photosensitive element that outputs a signal having a magnitude proportional to the intensity of incident light striking the photosensitive element. When exposed to incident light to capture a scene, each pixel in the array outputs a signal having a magnitude corresponding to the intensity of light at one point in the scene. The signals output from the photosensitive elements may be processed to form an image representing the captured scene.
To capture color images, the photosensors should be able to separately detect photons at the wavelengths of light associated with different colors. For example, a photosensor may be designed to detect first, second, and third colors (e.g., red, green, and blue photons). In one imager design, each pixel cell may be sensitive to only one color or spectral band. To this end, a color filter array may be placed over the pixel cells so that each pixel cell ideally measures only wavelengths of the color of the pixel's associated filter. A group of four pixels (two green, one red, and one blue) is typically used to capture three different colors of incident light. The groups of four may be repeated throughout an imager array to form an array of many rows and columns.
In another imager design, one pixel may measure all three colors. This design takes advantage of the absorption properties of semiconductor materials. That is, in a typical semiconductor substrate, different wavelengths of light are absorbed at different depths in the substrate. For example, blue light is absorbed in a silicon substrate primarily at a depth of about 0.2 to 0.5 microns, green light is absorbed primarily at a depth of about 0.5 to 1.5 microns, and red light is absorbed primarily at a depth of about 1.5 to 3.0 microns.
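The depth figures above follow from Beer-Lambert attenuation: intensity decays exponentially with depth, at a rate set by the wavelength-dependent absorption coefficient of silicon. The sketch below illustrates this relationship; the absorption coefficients are rough literature values chosen for illustration, not figures taken from this description.

```python
# Illustrative Beer-Lambert model of wavelength-dependent absorption
# depth in silicon. The coefficients below are approximate literature
# values (assumptions for this sketch), not values from the text.
alpha_per_um = {
    "blue (450 nm)": 2.5,    # ~2.5e4 cm^-1
    "green (550 nm)": 0.7,   # ~7e3 cm^-1
    "red (650 nm)": 0.3,     # ~3e3 cm^-1
}

def penetration_depth_um(alpha):
    """Depth (um) at which intensity falls to 1/e, per I(z) = I0 * exp(-alpha * z)."""
    return 1.0 / alpha

for color, alpha in alpha_per_um.items():
    print(f"{color}: 1/e depth ~ {penetration_depth_um(alpha):.1f} um")
```

With these coefficients, the 1/e depths come out near 0.4 um (blue), 1.4 um (green), and 3.3 um (red), roughly consistent with the 0.2-0.5, 0.5-1.5, and 1.5-3.0 micron ranges given above.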
This pixel structure includes three stacked pixels formed from two levels of N diffusions and a P well that are diffused in a silicon substrate. This results in a structure having three p-n junctions forming three photodiodes at different depths in the substrate, each designed to primarily absorb a particular color of incident light. Typically, the blue photodiode will be closest to the incident light, the red photodiode will be farthest from the incident light and the green photodiode will be between the blue and red junctions, due to the absorption properties described above. In this way, only one vertically stacked pixel is needed to absorb three or more different colors of light.
In operation, however, the spectral characteristics of the three colors in the vertically stacked pixel are poorly separated. That is, some photons may be absorbed by the wrong layer. Thus, extensive post-processing of the signals is necessary to arrive at the actual values for each of the red, green and blue photodiodes.
In a related design, the vertically stacked photodiodes include vertically stacked color filter segments that incorporate non-silicon materials. These designs improve upon color separation properties of the vertically stacked layers, but also require more complex processing to fabricate.
Included in the drawings are the following figures:
The example embodiments described below utilize groups of two pixels to absorb three separate colors of incident light. For example, a single photodiode pixel absorbs green photons and a vertically stacked magenta pixel absorbs red and blue photons. A magenta filter may be disposed over the magenta pixel to block green wavelengths from entering the pixel.
This structure uses two pixels to absorb a full color spectrum. Thus, it improves pixel density 1.5 times relative to four-pixel group structures (three color samples are obtained from two pixels, versus four samples from four pixels). Further, the pixel provides better spectral separation between the two colors, with less overlap between the spectral responses, than a system that attempts to absorb all blue, green, and red photons in one vertically stacked structure. Also, the structure may be used with standard 4T readout techniques. Finally, fabrication requires no special processing relative to standard CMOS imager chips.
An embodiment of a magenta pixel 204 is shown in
A magenta filter 205 may be disposed over pixel 204 as shown in
An example structural layout of the green pixel is shown in
As shown in
As shown in
As described above, the two pixel group may be repeated to form an array of lines and columns of the example green and magenta pixels shown in, for example,
For pixel array 30, all pixels in the same row may be sampled, for example, by applying row select signal RS to row select transistors 318 and 418 of the selected row. Alternatively, green pixels in a row may be independently selected by applying RS only to row select transistor 318 and magenta pixels in a row may be independently selected by applying RS only to row select transistor 418. Specific pixels in each column may be selectively output by respective column select lines (e.g., lines 320 and 420 shown in
As shown in
An example sequence for operating the two pixel group described in the above embodiments is shown in the flow chart of
At step 504, transfer transistor 408 of the magenta pixel is closed by applying signal Tx to the gate of transistor 408. The level of the lower photodiode is thereby transferred to floating diffusion 410 and read out through source follower transistor 416 onto column line 420. The level read from the lower photodiode is placed on a second sample and hold capacitor. It should be noted that because the blue photodiode transistor is open during this processing, the above-described operation for the magenta pixel is carried out only for lower photodiode 404. The values read out and stored are therefore primarily from the red photodiode, with a small contribution from the blue photodiode (due to the possibility of some spectral overlap, as described above).
At step 506, floating diffusion 310 of the green pixel is reset. The level of floating diffusion 310 is read out through source follower transistor 316 onto column line 320. The level read from the floating diffusion is placed on a third sample and hold capacitor. At step 508, transfer transistor 308 of the green pixel is closed by applying signal Tx to the gate of transistor 308. The level of the photodiode is thereby transferred to floating diffusion 310 and read out through source follower transistor 316 onto column line 320. The level read from the photodiode is placed on a fourth sample and hold capacitor.
At step 510, blue photodiode transistor 405 is closed by applying signal Tpon to transistor 405. The magenta pixel is integrated again over an integration period. At step 512, floating diffusion 410 of the magenta pixel is reset. The level of floating diffusion 410 is read out through source follower transistor 416 onto column line 420. The level read from the floating diffusion is placed on a fifth sample and hold capacitor.
At step 514, transfer transistor 408 of the magenta pixel is closed by applying signal Tx to the gate of transistor 408. The level of the photodiode is thereby transferred to floating diffusion 410 and read out through source follower transistor 416 onto column line 420. The level read from the photodiode is placed on a sixth sample and hold capacitor. Here, the levels read out and stored represent the sum of the red and blue photodiodes together.
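The six sample and hold values collected in the sequence above pair off into three differential (correlated double sampling) signals: reset level minus signal level for the lower-photodiode read, the green read, and the combined read. The sketch below models this pairing; the voltage numbers are made up for illustration and only the pairing of reset and signal samples reflects the sequence described above.

```python
# Hypothetical model of the six sample-and-hold values from the readout
# sequence and the differential signals formed from them. The sampled
# voltages are invented for illustration; only the reset/signal pairing
# follows the described sequence.

def cds(reset_level, signal_level):
    # Correlated double sampling: subtracting the signal level from the
    # reset level removes reset (kTC) noise and fixed offsets.
    return reset_level - signal_level

# Made-up sampled voltages (V): reset levels are high, and signal
# levels drop in proportion to collected photocharge.
sh = {
    "magenta_reset_1":  2.0,   # cap 1: floating diffusion 410 after reset
    "magenta_lower":    1.4,   # cap 2: lower (red-dominant) photodiode
    "green_reset":      2.0,   # cap 3: floating diffusion 310 after reset
    "green_signal":     1.1,   # cap 4: green photodiode
    "magenta_reset_2":  2.0,   # cap 5: floating diffusion 410 after second reset
    "magenta_combined": 0.9,   # cap 6: red + blue photodiodes summed
}

L_lower = cds(sh["magenta_reset_1"], sh["magenta_lower"])      # mostly red
G       = cds(sh["green_reset"],     sh["green_signal"])       # green
L_total = cds(sh["magenta_reset_2"], sh["magenta_combined"])   # red + blue sum

print(L_lower, G, L_total)
```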
Referring back to
It should be noted that the above sequence is just one example. Depending on the circuitry used, the sequence may be performed differently. For example, all green pixels in a column may be connected to an output line that is used for reading out green pixels and all magenta pixels in a column may be connected to another output line that is used for reading out magenta pixels. In this example, simultaneous integrations and readouts may be performed for the green and magenta pixels.
By way of another example, all green and magenta pixels in a column may be connected to the same readout line. Here, the magenta pixels may be integrated and read. Then, the green pixels may be integrated and read. The possibility of different sequences may, therefore, depend on how the pixels are connected to the column lines going to the sample and hold capacitors.
The post-processing calculations performed at step 518 obtain a desired red value (R) and a desired blue value (B) from the two read-out and differentially amplified digital signal values for the magenta pixel described above (represented by U for the signal value from the upper photodiode and L for the combined signal value from the lower photodiode). The U and L signal values may be represented by the following equations:
U = fb*B + fr*R (1)
L = B + R (2)
In equations (1) and (2), fb represents the fraction of incident blue photons absorbed by the blue photodiode and fr represents the fraction of incident red photons absorbed by the blue photodiode. From equations (1) and (2), the desired values R and B may be determined according to the following equations:
R = (fb*L − U)/(fb − fr) (3)
B = (U − fr*L)/(fb − fr) (4)
When there is no spectral overlap, fb=1 and fr=0. In this scenario, R=L−U and B=U, as expected. That is, with no spectral overlap, the desired value for red is the combined signal minus the blue signal. Similarly, the desired value for blue is simply the blue signal.
In the example described above, fr=0.3 and fb=0.7. In this scenario, R and B may be determined according to the following equations:
R = 1.75L − 2.5U (5)
B = 2.5U − 0.75L (6)
This example may cause some increase in noise in both the red and blue signals. In practice, however, the fr and fb values are likely to be closer to the ideal (no spectral overlap) values for a more realistic spectral distribution of an image input to the imager array.
Accordingly, using these calculations, any spectral overlap may be compensated for by selecting appropriate values for fr and fb.
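The color separation above can be sketched directly from equations (3) and (4). The function name and the example signal values below are chosen for illustration; the arithmetic itself is exactly the source's equations, and the sketch also checks the no-overlap case and the fb = 0.7, fr = 0.3 coefficients of equations (5) and (6).

```python
def separate_red_blue(U, L, fb, fr):
    """Recover R and B from the two magenta-pixel readings using
    equations (3) and (4):
        R = (fb*L - U) / (fb - fr)
        B = (U - fr*L) / (fb - fr)
    fb and fr must differ; otherwise the two readings carry no
    independent color information."""
    if fb == fr:
        raise ValueError("fb and fr must differ to separate the colors")
    R = (fb * L - U) / (fb - fr)
    B = (U - fr * L) / (fb - fr)
    return R, B

# No spectral overlap (fb = 1, fr = 0): R = L - U and B = U, as stated.
R0, B0 = separate_red_blue(U=0.3, L=1.0, fb=1.0, fr=0.0)
assert abs(R0 - 0.7) < 1e-9 and abs(B0 - 0.3) < 1e-9

# The fb = 0.7, fr = 0.3 example: results match equations (5) and (6),
# R = 1.75*L - 2.5*U and B = 2.5*U - 0.75*L.
U_val, L_val = 0.5, 1.0
R, B = separate_red_blue(U_val, L_val, fb=0.7, fr=0.3)
assert abs(R - (1.75 * L_val - 2.5 * U_val)) < 1e-9
assert abs(B - (2.5 * U_val - 0.75 * L_val)) < 1e-9
```

Note that as fb approaches fr, the denominator (fb − fr) shrinks and the correction amplifies noise, which is consistent with the noise increase noted above for fb = 0.7 and fr = 0.3.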
While example embodiments of the invention have been shown and described herein, it will be understood that such embodiments are provided by way of example only. Numerous variations, changes and substitutions will occur to those skilled in the art without departing from the invention. Accordingly, it is intended that the appended claims cover all such variations as fall within the scope of the invention.