This relates generally to imaging devices, and more particularly, to imaging devices with multiple lenses and image sensors.
Image sensors are commonly used in electronic devices such as cellular telephones, cameras, and computers to capture images. In a typical arrangement, an electronic device is provided with a single image sensor having pixels for collecting image data and a single corresponding lens. Some electronic devices use arrays of image sensors and corresponding lenses to gather image data. This type of system, which is sometimes referred to as an array camera, may be used to extend depth of focus or capture depth information from a scene. Array cameras may also be used to improve image processing and information gathering processes such as gesture control, image segmentation or other image processing operations.
In a conventional array camera, image sensors are aligned with the centers of individual corresponding lenses. In array cameras in which each image sensor is associated with an individual lens, alignment of each image sensor with its corresponding lens is limited due to mechanical mounting tolerances. For this reason, each lens is typically aligned within a few tens of pixels of the center of a corresponding image sensor.
It would therefore be desirable to be able to provide improved imaging devices with array cameras having sub-pixel lens alignment precision.
Digital camera modules are widely used in electronic devices such as digital cameras, computers, cellular telephones, or other electronic devices. These electronic devices may include image sensors that gather incoming light to capture an image. The image sensors may include arrays of image pixels. The pixels in the image sensors may include photosensitive elements such as photodiodes that convert the incoming light into digital data. Image sensors may have any number of pixels (e.g., hundreds or thousands or more). A typical image sensor may, for example, have hundreds of thousands or millions of pixels (e.g., megapixels).
There may be any suitable number of lenses in lens array 13, any suitable number of color filters in color filter array 14, and any suitable number of image sensors in image sensor array 16. Lens array 13 may, as an example, include N*M lenses arranged in an N×M two-dimensional array. The values of N and M may be equal to or greater than one, may be equal to or greater than two, may exceed 10, or may have any other suitable values. The lenses in lens array 13 may include one or more lenses associated with each image sensor in image sensor array 16. Lenses in lens array 13 may be formed from one or more layers of lenses (i.e., lens array 13 may include one or more layers, each layer including an array of lenses). Each lens array layer in lens array 13 may be formed from individual lenses mounted in a mounting structure or may be formed from an array of lenses formed on a single lens structure such as a plastic, glass, silicon, or other structure on which multiple lenses may be formed. Lenses in lens array 13 may be formed using compression molding, transfer molding, injection molding, or other suitable methods for forming layers of molded lens structures. Lenses in lens array 13 may be formed from a single material (e.g., plastic) or may be formed from multiple materials (i.e., one layer of lenses may be formed from one type of polymer material and another layer of lenses may be formed from another, different type of polymer material).
Cover layer 20 may be formed from glass and may sometimes be referred to as cover glass. Cover layer 20 may also be formed from other transparent materials such as plastic. Color filter array 14 may be formed under cover layer 20 (i.e., between cover layer 20 and lens array 13). Color filter array 14 may include one or more color filters. Color filter array 14 may be formed separately from cover layer 20 or may be formed as an integral part of cover layer 20. Each color filter in color filter array 14 may be associated with a corresponding image sensor in image sensor array 16 and a corresponding lens in lens array 13.
Image sensor array 16 may contain a corresponding N×M two-dimensional array of individual image sensors. The image sensors may be formed on one or more separate semiconductor substrates and may contain numerous image sensor pixels. Complementary metal-oxide-semiconductor (CMOS) technology or other image sensor integrated circuit technologies may be used in forming image sensor pixels for image sensors in image sensor array 16. Some of the pixels in the image sensors of image sensor array 16 may be actively used for gathering light. Other pixels may be inactive, may be covered using array separating structures, or may be omitted from the array during fabrication. In arrays in which fabricated pixels are to remain inactive, the inactive pixels may be covered with metal or other opaque materials, may be depowered, or may otherwise be inactivated. There may be any suitable number of pixels fabricated in each image sensor of image sensor array 16 (e.g., tens, hundreds, thousands, millions, etc.). The number of active pixels in each image sensor of image sensor array 16 may be tens, hundreds, thousands, or more. With one suitable arrangement, which is sometimes described herein as an example, the image sensors are formed on a common semiconductor substrate (e.g., a common silicon image sensor integrated circuit die). Each image sensor may be identical or, if desired, some image sensors may be different (e.g., some image sensors may have different pixel sizes, shapes or sensitivity than other image sensors). For example, each image sensor may be formed from a portion of an 8 megapixel image sensor integrated circuit. Other types of image sensor may also be used for the image sensors if desired. For example, images sensors with VGA resolution, greater than VGA resolution or less than VGA resolution may be used, image sensor arrays in which the image sensors are not all identical may be used, etc.
Lenses in lens array 13 may be aligned with a position on an associated image sensor in image sensor array 16 that is not the center of the image sensor. Lenses in lens array 13 may be positioned to have an alignment shift in one or more directions away from the center of the associated image sensor in image sensor array 16. The shift in alignment between a lens in lens array 13 and the center of the associated image sensor in image sensor array 16 may be a fraction of the size of a pixel in the associated image sensor (e.g., one quarter, one half, three quarters, less than one quarter, more than one quarter but less than one half, more than one half but less than three quarters, more than three quarters but less than all of the size of the image pixel). Alternatively, the shift in alignment between a lens in lens array 13 and the center of the associated image sensor in image sensor array 16 may be more than the size of a pixel in the associated image sensor. The use of a camera module with an array of lenses and an array of corresponding image sensors (i.e., an array camera) in which lenses in a lens array such as lens array 13 are laterally offset from the centers of associated image sensors in an image sensor array such as image sensor array 16 may allow capture and production of super-resolution images (i.e., images having pixels that are smaller than the pixels used to capture the image).
Each color filter in color filter array 14 may pass a single color of light (e.g., green light, red light, blue light, infrared light, ultraviolet light, etc.), while blocking other colors of light. Some color filters in color filter array 14 may pass different colors of light than other color filters in color filter array 14. With one suitable arrangement, which is sometimes described herein as an example, color filter array 14 may include a two-by-two array of color filters in which one filter passes only blue light, two filters pass only green light, and one filter passes only red light. In comparison with conventional devices, an arrangement in which each image sensor has an associated color filter that passes only one color of light may also reduce color cross-talk (i.e., contamination of pixels configured to capture one color of light with other colors of light intended for nearby pixels). This is because a single-color filter can be used for each image sensor in image sensor array 16, so that adjacent image pixels all receive the same color of light, instead of using a conventional Bayer pattern or other multiple-color color filter array pattern over a single image sensor, in which light of one color is often received by an image pixel that is immediately adjacent to another image pixel receiving another color of light. With a single-color filter for each image sensor, there is no opportunity for color information to bleed from one color channel to another. As a result, signal-to-noise ratio and color fidelity may be improved. A single-color filter arrangement may also allow increased resolution, as the pixels of a single image sensor are not subdivided into multiple colors (as in the case of a Bayer color filter array).
The color filters that are used for the image sensor pixel arrays in the image sensors may, for example, be red filters (i.e., filters configured to pass only red light), blue filters (i.e., filters configured to pass only blue light), and green filters (i.e., filters configured to pass only green light). Other filters such as infrared-blocking filters, filters that block visible light while passing infrared light, ultraviolet-light-blocking filters, white color filters, dual-band IR cutoff filters (e.g., filters for dual-band NIR image sensors that pass visible light and a range of infrared light emitted by LED lights), etc. may also be used.
Processing circuitry 18 (e.g., processing circuitry integrated onto sensor array integrated circuit 16 and/or processing circuitry on one or more associated integrated circuits) can select which digital image data (i.e., image data from which image sensor) to use in constructing a final image for the user of device 10. For example, circuitry 18 may be used to blend image data from red, blue, and green sensors to produce full-color images. Full-color images may include pixels that are smaller than the pixels of the individual image sensors. By combining image data from pixels of multiple image sensors having lenses that have lateral alignment offsets from the centers of the image sensors in which the offsets have sub-pixel magnitudes, these super-resolution color images may be produced. This is because knowledge of the magnitude and direction of the lateral offsets to sub-pixel precision allows reliable production of super-resolution images without distortion of the scene being imaged in the combined images.
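The channel-blending step performed by processing circuitry 18 can be sketched in Python as follows. This is an illustrative sketch only, not the circuitry's actual method: it assumes the three single-color images have already been registered to a common grid, and the function name `blend_rgb` is an invention for the example.

```python
# Illustrative blend of registered red, green, and blue single-color
# images (one per image sensor) into a full-color image. Real processing
# circuitry would also correct for the sub-pixel lens offsets before
# combining; here the inputs are assumed to be pre-registered.

def blend_rgb(red, green, blue):
    """Stack three registered single-channel images into (R, G, B) tuples."""
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(red, green, blue)
    ]

# Tiny usage example with 1x1 single-channel "images"
full_color = blend_rgb([[10]], [[20]], [[30]])
# full_color is [[(10, 20, 30)]]
```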
Processing circuitry 18 may also be used to select data from an image sensor having an associated infrared-passing filter when it is desired to produce infrared images, may be used to produce 3-dimensional (sometimes called stereo) images using data from two or more different sensors that have different vantage points when capturing a scene, may be used to produce increased depth-of-field images using data from two or more image sensors, etc. In some modes of operation, all of the sensors on array 16 may be active (e.g., when capturing high-quality images). In other modes of operation (e.g., a low-power preview mode), only a subset of the image sensors may be used. Other sensors may be inactivated to conserve power (e.g., their positive power supply voltage terminals may be taken to a ground voltage or other suitable power-down voltage and their control circuits may be inactivated or bypassed).
Circuitry in an illustrative pixel of one of the image sensors in sensor array 16 is shown in
Before an image is acquired, reset control signal RST may be asserted. This turns on reset transistor 28 and resets charge storage node 26 (also referred to as floating diffusion FD) to Vaa. The reset control signal RST may then be deasserted to turn off reset transistor 28. After the reset process is complete, transfer gate control signal TX may be asserted to turn on transfer transistor (transfer gate) 24. When transfer transistor 24 is turned on, the charge that has been generated by photodiode 22 in response to incoming light is transferred to charge storage node 26.
Charge storage node 26 may be implemented using a region of doped semiconductor (e.g., a doped silicon region formed in a silicon substrate by ion implantation, impurity diffusion, or other doping techniques). The doped semiconductor region (i.e., the floating diffusion FD) exhibits a capacitance that can be used to store the charge that has been transferred from photodiode 22. The signal associated with the stored charge on node 26 is conveyed to row select transistor 36 by source-follower transistor 34.
When it is desired to read out the value of the stored charge (i.e., the value of the stored charge that is represented by the signal at the source S of transistor 34), row select control signal RS can be asserted. When signal RS is asserted, transistor 36 turns on and a corresponding signal Vout that is representative of the magnitude of the charge on charge storage node 26 is produced on output path 38. In a typical configuration, there are numerous rows and columns of pixels such as pixel 190 in the image sensor pixel array of a given image sensor. A vertical conductive path such as path 40 can be associated with each column of pixels.
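The reset-transfer-readout sequence described above can be summarized with a small simulation. The signal names RST, TX, and RS and the circuit element numbers follow the text; the supply voltage, conversion gain, and the unity-gain source-follower model are assumptions made only for this sketch.

```python
# Illustrative simulation of the four-transistor pixel sequence described
# above: reset, charge integration, charge transfer, and row readout.
# Voltage and gain values are assumptions for the sketch only.

class FourTransistorPixel:
    VAA = 2.8  # assumed positive supply voltage Vaa (volts)

    def __init__(self):
        self.photodiode_charge = 0.0   # charge generated by photodiode 22
        self.floating_diffusion = 0.0  # voltage on charge storage node 26 (FD)

    def integrate(self, charge):
        """Photodiode 22 accumulates charge in response to incoming light."""
        self.photodiode_charge += charge

    def assert_rst(self):
        """RST asserted: reset transistor 28 resets node 26 (FD) to Vaa."""
        self.floating_diffusion = self.VAA

    def assert_tx(self):
        """TX asserted: transfer transistor 24 moves photodiode charge to FD.
        The FD voltage drops in proportion to the transferred charge."""
        conversion_gain = 0.5  # assumed volts per unit of charge
        self.floating_diffusion -= conversion_gain * self.photodiode_charge
        self.photodiode_charge = 0.0

    def assert_rs(self):
        """RS asserted: row select transistor 36 produces Vout on path 38,
        via source-follower 34 (modeled here as a unity-gain buffer)."""
        return self.floating_diffusion

pixel = FourTransistorPixel()
pixel.assert_rst()        # reset FD to Vaa before the image is acquired
pixel.integrate(2.0)      # incoming light generates photodiode charge
pixel.assert_tx()         # transfer the charge to the floating diffusion
vout = pixel.assert_rs()  # read out Vout onto the column path
```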
When signal RS is asserted in a given row, path 40 can be used to route signal Vout from that row to readout circuitry. If desired, other types of image pixel circuitry may be used to implement the image pixels of sensors 16-1, . . . 16-N. For example, each image sensor pixel 190 may be a three-transistor pixel, a pin-photodiode pixel with four transistors, a global shutter pixel, a time-of-flight pixel, etc. The circuitry of
A diagram of a conventional array camera in which an array of identical lenses is aligned with the centers of corresponding image sensors is shown in
In the example of
In the example of
The center of lens 13(2,1) may be shifted with respect to the center of image sensor 16(2,1) by an amount Sx(2,1) in the x-direction and by an amount Sy(2,1) in the y-direction. Shifts Sx(2,1) and Sy(2,1) may be a fraction of the size of a pixel in the associated image sensor (e.g., one quarter, one half, three quarters, less than one quarter, more than one quarter but less than one half, more than one half but less than three quarters, more than three quarters but less than all of the size of the image pixel). Alternatively, shifts Sx(2,1) and Sy(2,1) may be more than the size of a pixel in the associated image sensor. In one suitable arrangement, both shift Sx(2,1) and shift Sy(2,1) may be equal to half of the length of a lateral dimension of a pixel in an image sensor such as image sensor 16(1,1). Shifts Sx(2,1) and Sy(2,1) may be equal to shifts Sx(1,2) and Sy(1,2) respectively, may have opposite signs (i.e., indicate offsets in an opposite direction) to shifts Sx(1,2) and Sy(1,2), or may have different sizes from shifts Sx(1,2) and Sy(1,2).
The center of lens 13(2,2) may be shifted with respect to the center of image sensor 16(2,2) by an amount Sx(2,2) in the x-direction and by an amount Sy(2,2) in the y-direction. Shifts Sx(2,2) and Sy(2,2) may be a fraction of the size of a pixel in the associated image sensor (e.g., one quarter, one half, three quarters, less than one quarter, more than one quarter but less than one half, more than one half but less than three quarters, more than three quarters but less than all of the size of the image pixel). Alternatively, shifts Sx(2,2) and Sy(2,2) may be more than the size of a pixel in the associated image sensor. In one suitable arrangement, both shift Sx(2,2) and shift Sy(2,2) may be equal to half of the length of a lateral dimension of a pixel in an image sensor such as image sensor 16(1,1). Shifts Sx(2,2) and Sy(2,2) may be equal to shifts Sx(1,2) and Sy(1,2) respectively, may be equal to shifts Sx(2,1) and Sy(2,1) respectively, may have opposite signs (i.e., indicate offsets in an opposite direction) to shifts Sx(1,2) and Sy(1,2), may have equal sizes but opposite signs to shifts Sx(2,1) and Sy(2,1), or may have different sizes from shifts Sx(1,2), Sy(1,2), Sx(2,1), and Sy(2,1).
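For concreteness, one half-pixel shift assignment of the kind described above can be written out as follows. This is a sketch only: the diagonal quarter-phase pattern and the pixel pitch value are assumptions consistent with, but not mandated by, the half-pixel example in the text.

```python
# Illustrative half-pixel shift assignment for a 2x2 lens/sensor array.
# Each entry gives (Sx, Sy) for lens 13(i, j) in units of the pixel
# pitch; the particular pattern (each sensor sampling a different
# quarter-phase of the scene) is an assumption for this sketch.

PIXEL_PITCH_UM = 1.4  # assumed pixel size in microns

shifts = {
    (1, 1): (0.0, 0.0),  # reference lens, aligned with sensor center
    (1, 2): (0.5, 0.0),  # half-pixel shift in x only
    (2, 1): (0.0, 0.5),  # half-pixel shift in y only
    (2, 2): (0.5, 0.5),  # half-pixel shift in both x and y
}

# Convert the fractional shifts to physical offsets in microns
physical_shifts_um = {
    idx: (sx * PIXEL_PITCH_UM, sy * PIXEL_PITCH_UM)
    for idx, (sx, sy) in shifts.items()
}
```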
There may be any suitable number of lenses in lens array 13, any suitable number of color filters in color filter array 14, and any suitable number of image sensors in image sensor array 16. Lens array 13 may, as an example, be formed from one or more layers of lenses, such as top lens layer 13T, middle lens layer 13M, and bottom lens layer 13B. Each lens array layer in lens array 13 may be formed from individual lenses mounted in a mounting structure or may be formed from an array of lenses formed on a single structure such as a plastic, glass, silicon, or other structure on which multiple lenses may be formed. Top lens layer 13T, middle lens layer 13M, and bottom lens layer 13B may each be formed using compression molding, transfer molding, injection molding, lithography, or other suitable methods. Top lens layer 13T, middle lens layer 13M, and bottom lens layer 13B may each be formed from the same material (e.g., plastic) or may each be formed from a different material (e.g., top lens layer 13T and middle lens layer 13M may be formed from one type of polymer material while bottom lens layer 13B is formed from a different type of polymer material, top lens layer 13T and bottom lens layer 13B may be formed from one type of polymer material while middle lens layer 13M is formed from a different type of polymer material, etc.). Forming top lens layer 13T, middle lens layer 13M, and bottom lens layer 13B from the same material may increase the precision with which lens array 13 may be aligned with image sensor array 16. In conventional devices such as device 190 of
Middle lens layer 13M may have a bottom surface such as bottom surface 70 that is substantially planar (i.e., a flat bottom surface). Providing middle lens layer 13M with a flat bottom surface may simplify alignment of top, middle and bottom lens layers 13T, 13M, and 13B respectively with image sensors of image sensor array 16.
Top lens layer 13T may be mounted to cover layer 20 using housing structures such as housing structures 62 and spacer structures such as spacer structures 60. Middle lens layer 13M may be mounted to top lens layer 13T and bottom lens layer 13B using housing structures such as housing structures 62 and spacer structures such as spacer structures 60. Bottom lens layer 13B may be mounted to middle lens layer 13M using housing structures such as housing structures 62 and spacer structures such as spacer structures 60. Bottom lens layer 13B may be mounted to image sensor array 16 using housing structures such as housing structures 62 and spacer-buffer structures such as spacer-buffer structure 61. Spacer-buffer structure 61 may also be used to separate one image sensor from another image sensor (e.g., to divide pixels 68 of image sensor 16(1,1) from the pixels 68 of image sensor 16(1,2)).
Color filter array 14 may be formed separately from cover layer 20 or may be formed as an integral part of cover layer 20. Each color filter in color filter array 14 may be associated with a corresponding image sensor in image sensor array 16 and a corresponding lens in lens array 13. For example, color filter 14(1,1) may filter light to be focused onto image sensor 16(1,1) by offset lens stack 13(1,1), color filter 14(1,2) may filter light to be focused onto image sensor 16(1,2) by offset lens stack 13(1,2), etc. Color filter array 14 may be formed under cover layer 20 (i.e., between cover layer 20 and lens array 13). Cover layer 20 may be formed from glass and may sometimes be referred to as cover glass. Cover layer 20 may also be formed from other transparent materials such as plastic.
Color filters such as color filters 14(1,1) and 14(1,2) in color filter array 14 may each pass a single color of light (e.g., green light, red light, blue light, infrared light, ultraviolet light, etc.), while blocking other colors of light. Some color filters in color filter array 14 may pass different colors of light than other color filters in color filter array 14. As an example, color filter 14(1,1) may be a red color filter (i.e., a filter that passes red light and blocks other colors of light) while color filter 14(1,2) may be a green color filter (i.e., a filter that passes green light and blocks other colors of light).
Color filter array 14 may include a two-by-two array of color filters having one blue color filter, two green color filters, and one red color filter. An arrangement in which each image sensor of image sensor array 16 receives light through a color filter that passes only one color of light may allow increased resolution, as the pixels of a single image sensor are not subdivided into multiple colors (as in the case of a Bayer color filter array). Color filters such as color filters 14(1,1) and 14(1,2) may be red filters, blue filters, green filters, infrared-blocking filters, filters that block visible light while passing infrared light, ultraviolet-light-blocking filters, white color filters, dual-band IR cutoff filters (e.g., filters for dual-band NIR image sensors that pass visible light and a range of infrared light emitted by LED lights), etc.
Top lens layer 13T, middle lens layer 13M and bottom lens layer 13B may each include N*M lenses arranged in an N×M two-dimensional array. The values of N and M may be equal to or greater than one, may be equal to or greater than two, may exceed 10, or may have any other suitable values. In the example of
Image sensor array 16 may contain a corresponding N×M two-dimensional array of individual image sensors. The image sensors may be formed on one or more separate semiconductor substrates and may contain numerous image sensor pixels. Complementary metal-oxide-semiconductor (CMOS) technology or other image sensor integrated circuit technologies may be used in forming image sensor pixels for image sensors in image sensor array 16. With one suitable arrangement, which is sometimes described herein as an example, the image sensors are formed on a common semiconductor substrate (e.g., a common silicon image sensor integrated circuit die). Image sensors such as image sensors 16(1,1) and 16(1,2) of image sensor array 16 may each include any number of image pixels 68. Image pixels 68 may include photosensitive elements such as photodiodes for converting light into electric charge. Image pixels 68 may include circuitry such as the circuitry of pixel 190 of
Image sensors such as image sensors 16(1,1) and 16(1,2) of image sensor array 16 may be formed from a portion of a larger image sensor integrated circuit (e.g., an 8 megapixel image sensor) and divided into multiple image sensors using spacer-buffer structures such as spacer-buffer structure 61.
As shown in
As shown in
The centers of other lens stacks in lens array 13 may also be shifted with respect to the centers of associated image sensors. The use of a camera module with an array of lenses and an array of corresponding image sensors (i.e., an array camera) in which lens stacks in a lens array such as lens array 13 are offset from the centers of associated image sensors in an image sensor array such as image sensor array 16 may allow capture and production of super-resolution images (i.e., images having pixels that are smaller than the pixels used to capture the image).
At step 202, processing circuitry such as processing circuitry 18 of
At step 204, processing circuitry 18 may be used to combine the single-color, spatially-offset images into a super-resolution color image such as super-resolution color image 90 by filling the grid of pixels with the values of the overlapping pixels in the single-color, spatially-offset images (i.e., assigning each pixel in the grid of pixels a value corresponding to an associated value of an overlapping pixel in a selected one of the single-color, spatially-offset images). Offsets Sx(1,1), Sy(1,1), Sx(1,2), Sy(1,2), Sx(2,1), Sy(2,1), Sx(2,2), and Sy(2,2) of lens stacks 13(1,1), 13(1,2), 13(2,1) and 13(2,2) of lens array 13 respectively (see
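The pixel-filling step of step 204 can be sketched as follows. This is illustrative only: it assumes the half-pixel diagonal offset pattern described above, and a simple nearest-position interleave stands in for whatever resampling processing circuitry 18 actually performs; the function name `interleave_2x2` is an invention for the example.

```python
# Illustrative interleave of four spatially offset single-color images
# into one super-resolution grid (2x denser in each dimension). Each
# input image is assumed to carry a half-pixel offset matching its name:
# imgXY is offset by (X/2, Y/2) pixels relative to img00.

def interleave_2x2(img00, img10, img01, img11):
    """Fill a grid of pixels twice as dense as the inputs, assigning each
    output pixel the value of the overlapping pixel from the input whose
    sub-pixel offset matches that output position."""
    rows, cols = len(img00), len(img00[0])
    out = [[0] * (2 * cols) for _ in range(2 * rows)]
    for r in range(rows):
        for c in range(cols):
            out[2 * r][2 * c] = img00[r][c]          # no offset
            out[2 * r][2 * c + 1] = img10[r][c]      # half-pixel x offset
            out[2 * r + 1][2 * c] = img01[r][c]      # half-pixel y offset
            out[2 * r + 1][2 * c + 1] = img11[r][c]  # diagonal offset
    return out

# Tiny usage example with 1x1 "images": the four offset samples become
# the four pixels of a 2x2 super-resolution grid.
grid = interleave_2x2([[1]], [[2]], [[3]], [[4]])
# grid is [[1, 2], [3, 4]]
```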
Various embodiments have been described illustrating electronic devices having array cameras that include arrays of image sensors, arrays of associated lenses, and arrays of associated color filters in which lenses are aligned with positions on associated image sensors other than the centers of the associated image sensors. An array of lenses may include one or more layers of lenses formed by compression molding of transparent materials such as plastic. Multiple layers of lenses in an array of lenses may be combined to form a lens stack associated with each image sensor in an array of image sensors. Image sensors may be formed on a single integrated circuit die. Arrays of lenses may be mounted directly onto the integrated circuit die on which the array of image sensors is formed. Each lens stack may have an associated color filter that filters incoming light before the light passes through the lens stack. Each image sensor may include a second color filter formed on the integrated circuit die that further filters the incoming light after it has passed through the lens stack and before it reaches photosensitive components of image pixels in the image sensor. Image sensors may further include microlenses formed on each image pixel for focusing incoming light onto the image pixel. Color filter arrays may include one or more red filters, one or more green filters, and one or more blue filters. Lens stacks that focus light onto associated image sensors of an image sensor array may have centers that are offset from the centers of the associated image sensors. Offsetting the centers of lens stacks with respect to the centers of associated image sensors may allow capture of spatially offset single-color images by the image sensors. Spatially offset single-color images may be combined into super-resolution images using the processing circuitry.
The foregoing is merely illustrative of the principles of this invention which can be practiced in other embodiments.
This application claims the benefit of provisional patent application No. 61/480,289, filed Apr. 28, 2011, which is hereby incorporated by reference herein in its entirety.