This relates generally to imaging systems, and more particularly, to imaging systems with image pixels having variable light collecting areas.
Image sensors are commonly used in electronic devices such as cellular telephones, cameras, and computers to capture images. In a typical arrangement, an electronic device is provided with an array of image pixels arranged in pixel rows and pixel columns. Circuitry is commonly coupled to each pixel column for reading out image signals from the image pixels.
Conventional imaging systems include an image sensor in which the visible light spectrum is sampled by red, green, blue, clear, or other colored image pixels having light collecting areas of a common size. In some situations, pixels of one color may saturate due to bright lights while nearby pixels of another color have not yet received enough image light to reach an acceptable signal-to-noise ratio. This can be particularly problematic in imaging systems with clear or nearly clear image pixels that receive light in a wide range of wavelengths in comparison with nearby single color pixels such as red and blue pixels.
It would therefore be desirable to be able to provide improved imaging systems with clear pixels.
Electronic devices such as digital cameras, computers, cellular telephones, and other electronic devices include image sensors that gather incoming light to capture an image. The image sensors may include arrays of image pixels. The pixels in the image sensors may include photosensitive elements such as photodiodes that convert the incoming light into electric charge. Image sensors may have any number of pixels (e.g., hundreds or thousands or more). A typical image sensor may, for example, have hundreds of thousands or millions of pixels (e.g., megapixels). Image sensors may include control circuitry such as circuitry for operating the image pixels and readout circuitry for reading out image signals corresponding to the electric charge generated by the photosensitive elements.
Processing circuitry 18 may include one or more integrated circuits (e.g., image processing circuits, microprocessors, storage devices such as random-access memory and non-volatile memory, etc.) and may be implemented using components that are separate from camera module 12 and/or that form part of camera module 12 (e.g., circuits that form part of an integrated circuit that includes image sensors 16 or an integrated circuit within module 12 that is associated with image sensors 16). Image data that has been captured by camera module 12 may be processed and stored using processing circuitry 18. Processed image data may, if desired, be provided to external equipment (e.g., a computer or other device) using wired and/or wireless communications paths coupled to processing circuitry 18.
As shown in
Column decoder circuitry 126 may include sample-and-hold circuitry, amplifier circuitry, analog-to-digital conversion circuitry, bias circuitry, column memory, latch circuitry for selectively enabling or disabling the column circuitry, or other circuitry that is coupled to one or more columns of pixels in array 200 for operating pixels 190 and for reading out image signals from pixels 190. Column decoder circuitry 126 may be used to selectively provide power to column circuitry on a selected subset of column lines 40. Readout circuitry such as signal processing circuitry associated with column decoder circuitry 126 (e.g., sample-and-hold circuitry and analog-to-digital conversion circuitry) may be used to supply digital image data to processor 18.
Image sensor pixels 190 may be color image pixels (e.g., image pixels that include color filter elements that filter the light that passes onto each associated image pixel) with varying light collecting areas. As shown in the top view of pixel array 200 of
Each image pixel may have a light collecting area with a characteristic size. For example, red pixels 32 may have a maximum lateral width WR, blue pixels 36 may have a maximum lateral width WB, green pixels 34 may have a maximum lateral width WG, and clear pixels 38 may have a maximum lateral width WC. Blue pixels 36 and red pixels 32 may be asymmetrically shaped so that width WB is larger than an orthogonal width of blue pixels 36 and width WR is larger than an orthogonal width of red pixels 32. Blue pixels 36 and red pixels 32 may each have a non-rectangular shape having multiple angled edges at each corner.
Clear pixels 38 may be substantially rectangular or square shaped pixels. Green pixels 34 may be substantially rectangular or square shaped pixels. Width WC (and the corresponding light collecting area) of clear pixels 38 may be smaller than the lateral width (and corresponding light collecting area) of color pixels of other colors since clear pixels 38 absorb light of more wavelengths than single color pixels such as pixels 32, 34, and 36. Green pixels 34 may have a width WG that is larger than width WC and smaller than widths WB and/or WR. This is because common light sources produce more green light than other colors of light and green color pixels 34 may absorb more light than red pixels 32 and blue pixels 36, but less light than clear pixels 38. In this way, the amount of light that is received by pixels of each color can be balanced by the size of the light collecting area of the pixel.
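The area balancing described above can be illustrated with a short sketch. The relative responsivity values below are hypothetical placeholders (not taken from this document); the sketch simply shows that scaling each pixel's light collecting area inversely with how much light its filter passes equalizes the collected signal across colors, producing the ordering WC < WG < WR, WB.

```python
# Hypothetical relative responsivities: clear passes the most light,
# green next, then red and blue (illustrative values only).
RESPONSIVITY = {"clear": 1.0, "green": 0.55, "red": 0.30, "blue": 0.30}

def balanced_areas(responsivity, reference="clear"):
    """Scale each pixel's light collecting area inversely with its
    responsivity, so that signal = area * responsivity is equal for
    every color (normalized so the reference pixel has area 1)."""
    ref = responsivity[reference]
    return {color: ref / r for color, r in responsivity.items()}

areas = balanced_areas(RESPONSIVITY)

# Clear pixels end up with the smallest collecting area, green pixels
# with a larger one, and red/blue pixels with the largest, matching
# the width ordering WC < WG < WR, WB described above.
assert areas["clear"] < areas["green"] < areas["red"] == areas["blue"]
```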
The color filter pattern of
In another suitable example, array 200 may include alternating unit pixel cells 30′ and 30″ as shown in
The variability in light collecting area from pixel to pixel may be between 1% and 30%, between 10% and 30%, between 5% and 15%, between 10% and 20%, less than 40%, less than 30%, more than 10%, more than 5%, or between 5% and 30% (as examples). Varying the light collecting area of image pixels in array 200 may also help reduce optical cross talk between clear pixels and adjacent color pixels.
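The pixel-to-pixel variability quoted above can be expressed as a simple percentage of the largest light collecting area. A minimal sketch, using hypothetical normalized areas (the 20% figure below is an illustrative example, not a value from this document):

```python
def area_variability(areas):
    """Return the spread in light collecting area, (max - min) / max,
    as a percentage across a set of pixels."""
    smallest, largest = min(areas), max(areas)
    return 100.0 * (largest - smallest) / largest

# Example: clear pixels 20% smaller than red/blue pixels
# (normalized areas for red, blue, green, clear).
areas = [1.00, 1.00, 0.90, 0.80]
print(round(area_variability(areas), 1))  # → 20.0
```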
As shown in
If desired, color filter barrier structures 52 may be formed between adjacent color filter elements 44. This type of color filter barrier may help prevent optical cross talk. However, this is merely illustrative. If desired, color filter elements 44 may be provided without any intervening barriers.
Color filter elements 44 may form an array of color filter elements over an array of photosensors 42 and under an array of microlenses 46 that form image pixel array 200. However, the arrangement of
As shown in
As shown in
The increase in variance in light collecting area may be proportional to the distance R, may have another dependence on distance R, or may be otherwise dependent on the location of a particular pixel cell in array 200. In this way, the varying size of the light collecting areas may be used to compensate for other effects such as lens shading near the edge of an array.
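One way to picture position-dependent area scaling is lens shading compensation. The sketch below assumes a simple cos⁴ relative-illumination falloff model and a hypothetical maximum field angle; the constants are illustrative, not values from this document. A pixel's collecting area grows toward the array edge in inverse proportion to the modeled illumination loss.

```python
import math

def relative_illumination(r, r_max, max_angle_deg=30.0):
    """Approximate relative illumination at normalized radius r / r_max
    using a cos^4 falloff model (1.0 at the array center)."""
    angle = math.radians(max_angle_deg) * (r / r_max)
    return math.cos(angle) ** 4

def scaled_area(base_area, r, r_max):
    """Grow a pixel's light collecting area toward the array edge so
    that area * relative_illumination stays roughly constant."""
    return base_area / relative_illumination(r, r_max)

# A pixel cell at the array edge (r = r_max) is assigned a larger
# collecting area than an identical cell at the center (r = 0).
assert scaled_area(1.0, 100.0, 100.0) > scaled_area(1.0, 0.0, 100.0)
```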
The processor system 300 generally includes a lens 396 for focusing an image on pixel array 200 of device 2000 when a shutter release button 397 is pressed, and a central processing unit (CPU) 395, such as a microprocessor that controls camera functions and one or more image flow functions, which communicates with one or more input/output (I/O) devices 391 over a bus 393. Imaging device 2000 also communicates with the CPU 395 over bus 393. The system 300 also includes random access memory (RAM) 392 and can include removable memory 394, such as flash memory, which also communicates with CPU 395 over the bus 393. Imaging device 2000 may be combined with the CPU, with or without memory storage, on a single integrated circuit or on a different chip. Although bus 393 is illustrated as a single bus, it may be one or more busses or bridges or other communication paths used to interconnect the system components.
Various embodiments have been described illustrating image sensors having arrays of image pixels that have varying light collecting areas. The light collecting area of each image pixel may be varied with respect to other image pixels using variable microlens sizes and/or variable color filter element sizes throughout the array. The light collecting area may vary with unit pixel cells. For example, a unit pixel cell may include a red pixel, a blue pixel, a green pixel, and a clear pixel in which the clear pixel has the smallest light collecting area, the green pixel has a relatively larger light collecting area, and the red and blue pixels have larger light collecting areas still. In another example, a unit pixel cell may include two clear pixels and other color pixels in which the clear pixels have a common light collecting area size that is smaller than the size of the light collecting area of the other color pixels.
The variability of the light collecting areas of pixels within each pixel cell may depend on the location of the pixel cell in the pixel array.
The foregoing is merely illustrative of the principles of this invention which can be practiced in other embodiments.
This application claims the benefit of provisional patent application No. 61/701,299, filed Sep. 14, 2012, which is hereby incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5727242 | Lo et al. | Mar 1998 | A |
6137100 | Fossum et al. | Oct 2000 | A |
6278847 | Gharib et al. | Aug 2001 | B1 |
6867549 | Cok et al. | Mar 2005 | B2 |
7742088 | Shizukuishi | Jun 2010 | B2 |
7768569 | Kozlowski | Aug 2010 | B2 |
7773137 | Inoue | Aug 2010 | B2 |
8405748 | Mao et al. | Mar 2013 | B2 |
8648948 | Rafferty et al. | Feb 2014 | B2 |
8742309 | Agranov et al. | Jun 2014 | B2 |
20040105021 | Hu | Jun 2004 | A1 |
20040218291 | Fiete | Nov 2004 | A1 |
20050128596 | Li et al. | Jun 2005 | A1 |
20080225420 | Barrows et al. | Sep 2008 | A1 |
Entry |
---|
A. Portnoy et al., “Design and characterization of thin multiple aperture infrared cameras”, Applied Optics, Apr. 10, 2009, vol. 48, No. 11. |
Number | Date | Country |
---|---|---|
20140078366 A1 | Mar 2014 | US |
Number | Date | Country |
---|---|---|
61701299 | Sep 2012 | US |