This relates generally to imaging devices, and more particularly, to imaging devices having image sensor pixels with high dynamic range functionalities.
Image sensors are commonly used in electronic devices such as cellular telephones, cameras, and computers to capture images. In a typical arrangement, an image sensor includes an array of image pixels arranged in pixel rows and pixel columns. Circuitry may be coupled to each pixel column for reading out image signals from the image pixels.
Typical image pixels contain a photodiode for generating charge in response to incident light. Image pixels may also include a charge storage region for storing charge that is generated in the photodiode. Image sensors can operate using a global shutter or a rolling shutter scheme. In a global shutter scheme, every pixel in the image sensor may simultaneously capture an image, whereas in a rolling shutter scheme each row of pixels may sequentially capture an image.
Image sensors may be equipped with multi-exposure high dynamic range (HDR) functionality, where multiple images are captured with an image sensor at different exposure times. The images are later combined into a high dynamic range image. An HDR image sensor can operate using a rolling shutter operation. In conventional HDR image sensors, a long-exposure image may be sampled during a first readout cycle. Line buffers are then typically used to store the long-exposure image. While the line buffers store the long-exposure image, a short-exposure image is generated. The short-exposure image is then sampled in a second readout cycle. After the short-exposure image is sampled, the short-exposure image and the long-exposure image are combined to form an HDR image. However, the line buffers may add additional cost to manufacturing the image sensor. Additionally, in standard HDR image sensor pixels, bright scenes can cause unwanted saturation of the photodiode, leading to oversaturated image signals.
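For illustration only, the conventional exposure-combination step described above can be sketched as follows. This is merely a minimal example of one common merging rule (use the long-exposure value unless it clips, otherwise substitute the exposure-ratio-scaled short-exposure value); the function name, saturation threshold, and example exposure ratio are assumptions and are not taken from the embodiments described herein.

```python
# Illustrative sketch only: merge a long-exposure and a short-exposure frame
# into an HDR frame, assuming simple exposure-ratio scaling and a fixed
# saturation threshold (both assumptions for this example).
import numpy as np

def combine_hdr(long_frame, short_frame, exposure_ratio, saturation_level=4095):
    """Use the long exposure where it is not saturated and the scaled
    short exposure where the long exposure clips."""
    long_frame = long_frame.astype(np.float64)
    short_frame = short_frame.astype(np.float64)
    # Scale the short exposure so both frames share a common brightness scale.
    scaled_short = short_frame * exposure_ratio
    use_short = long_frame >= saturation_level
    return np.where(use_short, scaled_short, long_frame)

# Example usage with synthetic 2x2 frames and a 16:1 exposure ratio.
long_px = np.array([[1000, 4095], [2000, 4095]])
short_px = np.array([[60, 300], [120, 4000]])
hdr = combine_hdr(long_px, short_px, exposure_ratio=16)
```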
It would therefore be desirable to be able to provide imaging devices with improved image sensor pixels.
Electronic devices such as digital cameras, computers, cellular telephones, and other electronic devices may include image sensors that gather incoming light to capture an image. The image sensors may include arrays of image pixels. The pixels in the image sensors may include photosensitive elements such as photodiodes that convert the incoming light into image signals. Image sensors may have any number of pixels (e.g., hundreds or thousands or more). A typical image sensor may, for example, have hundreds of thousands or millions of pixels (e.g., megapixels). Image sensors may include control circuitry such as circuitry for operating the image pixels and readout circuitry for reading out image signals corresponding to the electric charge generated by the photosensitive elements.
Storage and processing circuitry 18 may include one or more integrated circuits (e.g., image processing circuits, microprocessors, storage devices such as random-access memory and non-volatile memory, etc.) and may be implemented using components that are separate from the camera module and/or that form part of the camera module (e.g., circuits that form part of an integrated circuit that includes image sensors 16 or an integrated circuit within the module that is associated with image sensors 16). When storage and processing circuitry 18 is included on different integrated circuits (e.g., chips) than those of image sensors 16, the integrated circuits with circuitry 18 may be vertically stacked or packaged with respect to the integrated circuits with image sensors 16. Image data that has been captured by the camera module may be processed and stored using processing circuitry 18 (e.g., using an image processing engine on processing circuitry 18, using an imaging mode selection engine on processing circuitry 18, etc.). Processed image data may, if desired, be provided to external equipment (e.g., a computer, external display, or other device) using wired and/or wireless communications paths coupled to processing circuitry 18.
If desired, image sensor 16 may include an integrated circuit package or other structure in which multiple integrated circuit substrate layers or chips are vertically stacked with respect to each other. In this scenario, one or more of circuitry 26, 28, and 24 may be vertically stacked below array 20 within image sensor 16. If desired, lines 32 and 30 may be formed from vertical conductive via structures (e.g., through-silicon vias or TSVs) and/or horizontal interconnect lines in this scenario.
Image sensors 16 may include one or more arrays 20 of image pixels 22. Image pixels 22 may be formed in a semiconductor substrate using complementary metal-oxide-semiconductor (CMOS) technology, charge-coupled device (CCD) technology, or any other suitable photosensitive device technology. Image pixels 22 may be frontside illumination (FSI) image pixels or backside illumination (BSI) image pixels. Image pixels 22 may include one or more photosensitive regions. Each photosensitive region in an image pixel 22 may have a photodiode or photodiode region and readout circuitry for the photodiode or photodiode region. Readout circuitry associated with each photodiode or photodiode region in a given photosensitive region may include transfer gates, floating diffusion regions, and reset gates. Isolation regions between photosensitive regions may also be considered part of either or both of the photosensitive regions between which the isolation structure is formed.
As shown in
Image readout circuitry 28 may receive image signals (e.g., analog pixel values generated by pixels 22) over column lines 32. Image readout circuitry 28 may include sample-and-hold circuitry for sampling and temporarily storing image signals read out from array 20, amplifier circuitry, analog-to-digital conversion (ADC) circuitry, bias circuitry, column memory, latch circuitry for selectively enabling or disabling the column circuitry, or other circuitry that is coupled to one or more columns of pixels in array 20 for operating pixels 22 and for reading out image signals from pixels 22. ADC circuitry in readout circuitry 28 may convert analog pixel values received from array 20 into corresponding digital pixel values (sometimes referred to as digital image data or digital pixel data). Image readout circuitry 28 may supply digital pixel data to control and processing circuitry 24 and/or processor 18 (
If desired, image pixels 22 may include one or more photosensitive regions for generating charge in response to image light. Photosensitive regions within image pixels 22 may be arranged in rows and columns on array 20. Pixel array 20 may be provided with a color filter array having multiple color filter elements which allows a single image sensor to sample light of different colors. As an example, image sensor pixels such as the image pixels in array 20 may be provided with a color filter array which allows a single image sensor to sample red, green, and blue (RGB) light using corresponding red, green, and blue image sensor pixels arranged in a Bayer mosaic pattern. The Bayer mosaic pattern consists of a repeating unit cell of two-by-two image pixels, with two green image pixels diagonally opposite one another and adjacent to a red image pixel diagonally opposite to a blue image pixel. In another suitable example, the green pixels in a Bayer pattern are replaced by broadband image pixels having broadband color filter elements (e.g., clear color filter elements, yellow color filter elements, etc.). These examples are merely illustrative and, in general, color filter elements of any desired color and in any desired pattern may be formed over any desired number of image pixels 22.
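For illustration only, the Bayer unit cell described above can be sketched as a small routine that tiles the two-by-two pattern across a pixel array. The RGGB ordering shown, the function name, and the use of character labels are assumptions chosen for clarity rather than details of any particular color filter array.

```python
# Illustrative sketch only: build a Bayer color filter pattern (one common
# "RGGB" ordering of the unit cell) for a pixel array of a given size.
import numpy as np

def bayer_pattern(rows, cols):
    """Return an array of 'R', 'G', 'B' labels tiling the 2x2 Bayer unit cell:
    green pixels on one diagonal, red and blue on the other."""
    unit = np.array([["R", "G"],
                     ["G", "B"]])
    return np.tile(unit, (rows // 2 + 1, cols // 2 + 1))[:rows, :cols]

print(bayer_pattern(4, 4))
```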
Image sensor 16 may be configured to support a rolling shutter operation (e.g., pixels 22 may be operated in a rolling shutter mode). For example, the image pixels 22 in array 20 may each include a photodiode, floating diffusion region, and local charge storage region. With a rolling shutter scheme, the photodiodes in the pixels in the image sensor generate image signals sequentially. The image signals may then be transferred to the respective storage regions in each pixel. Data from each storage region of each pixel may then be read out one at a time, for example.
Subsequent to photodiode reset, a first integration period may begin and photodiode 50 may begin generating and storing an image signal. Pixel 22 may include first transfer transistor 54 and floating diffusion region 56. When the first integration period ends, first transfer transistor 54 may transfer the image signal stored at photodiode 50 to floating diffusion region 56. The time between the beginning and the end of the first integration period may be referred to as a first integration time period. Transfer transistor 54 may include a source terminal, a drain terminal, a gate terminal, and a channel region. Floating diffusion region 56 may be a doped-semiconductor region (e.g., a doped silicon region formed in a silicon substrate by ion implantation, impurity diffusion, or other doping techniques) that has charge storage capabilities shown as capacitor 58 with capacitance Cfd. Photodiode 50 may be connected to a first terminal (e.g., a source or drain terminal) of transistor 54. Floating diffusion region 56 may be connected to a second terminal that opposes the first terminal. As an example, if the first terminal is the source terminal, the second terminal may be the drain terminal, or vice versa. Control signal TX may control a flow of charge across the channel region of transistor 54. When control signal TX is asserted, the image signal stored in photodiode 50 may pass through the channel region of transistor 54 to floating diffusion region 56. Control signal TX may be subsequently deasserted and photodiode 50 may be reset to a supply voltage using control signal RST_PD.
A second integration period may follow the first integration period. Photodiode 50 may generate an image signal corresponding to the second integration period. The image signal from the second integration period may be transferred to floating diffusion region 56 using control signal TX. The image signal from the second integration period may be integrated (e.g., summed or added) with the image signal from the first integration period. The integrated image signal stored at floating diffusion region 56 may be said to have an effective integration time period. The effective integration time period is the summation of the first integration time period and the second integration time period. In general, any number of desired integration processes (e.g., transferring image signals from distinct integration periods to floating diffusion region 56 for summation) may occur. The effective integration period may be generally defined as the summation of all of the distinct integration time periods over which the respective individual image signals were generated. After a desired number of integration periods and summation of the corresponding image signals at floating diffusion region 56, control signal TX may be deasserted after the last image signal has been added. By breaking up the effective integration period during an image frame into shorter, non-continuous integration periods that span a longer exposure time, image artifacts caused by moving objects, flickering lighting, and objects with changing illumination may be minimized without compromising pixel integration time (i.e., while maintaining the desired total integration time).
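For illustration only, the summation of image signals from distinct integration periods can be sketched with an idealized model in which the photodiode collects charge linearly with time. The photon-rate model, function name, and example period lengths are assumptions; the sketch simply shows that the effective integration time equals the sum of the individual integration time periods.

```python
# Illustrative sketch only, assuming an idealized photodiode that collects
# charge linearly with time at a scene-dependent rate. Several shorter,
# non-continuous integration periods are summed at a storage node so that
# the effective integration time equals the sum of the periods.
def accumulate_exposure(photon_rate, integration_periods):
    """Sum the charge from each distinct integration period, as a floating
    diffusion region would when charge is transferred after each period."""
    floating_diffusion_charge = 0.0
    for period in integration_periods:
        charge_this_period = photon_rate * period  # charge generated in the photodiode
        floating_diffusion_charge += charge_this_period  # transfer and sum
    return floating_diffusion_charge

# Four 2.5 ms periods give the same total signal as one 10 ms period.
assert accumulate_exposure(100.0, [2.5, 2.5, 2.5, 2.5]) == accumulate_exposure(100.0, [10.0])
```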
Pixel 22 may include readout circuitry that includes source follower transistor 62 and row select transistor 64. Transistor 64 may have a gate that is controlled by row select control signal SEL. When control signal SEL is asserted, transistor 64 is turned on and a corresponding signal PIXOUT (e.g., an output signal having a magnitude that is proportional to the amount of charge at floating diffusion node 56) is passed onto column readout path 66 (sometimes referred to herein as bus line 66). Conversion of incident light into corresponding image signals at photodiode 50 may occur simultaneously with image signal readout, if desired. Pixel 22 may include floating diffusion reset transistor 68. Transistor 68 may have a gate that is controlled by floating diffusion reset control signal RST_FD. Transistor 68 couples floating diffusion region 56 to voltage supply 51 with a supply voltage (e.g., Vdd). When control signal RST_FD is asserted, transistor 68 is turned on and floating diffusion node 56 is reset to the supply voltage level.
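For illustration only, the behavior of this readout circuitry can be sketched as a toy behavioral model (not a circuit simulation). The class name, conversion gain value, and supply voltage are assumptions; the sketch shows a floating diffusion whose stored charge sets an output level, a TX-style charge transfer, and an RST_FD-style reset.

```python
# Illustrative behavioral sketch only (not a circuit model): a toy pixel with a
# floating diffusion node, a source-follower style output proportional to the
# stored charge, and a reset back to a supply voltage. Values are assumptions.
class ToyPixel:
    def __init__(self, vdd=2.8, conversion_gain_v_per_e=80e-6):
        self.vdd = vdd
        self.gain = conversion_gain_v_per_e   # volts of swing per electron
        self.fd_charge_e = 0.0                # electrons on the floating diffusion

    def transfer(self, charge_e):
        """TX asserted: add photodiode charge onto the floating diffusion."""
        self.fd_charge_e += charge_e

    def reset_fd(self):
        """RST_FD asserted: clear the floating diffusion back to the reset level."""
        self.fd_charge_e = 0.0

    def pixout(self):
        """SEL asserted: output level drops from Vdd as charge accumulates."""
        return self.vdd - self.gain * self.fd_charge_e

pixel = ToyPixel()
pixel.transfer(1500)          # transfer 1500 electrons
signal_level = pixel.pixout()
pixel.reset_fd()
reset_level = pixel.pixout()
```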
In conventional high dynamic range (HDR) operation, a rolling shutter image sensor will operate with a dual exposure scheme. For example, a first long exposure period will be followed by a second short exposure period. A correlated double sampling readout sequence will then be performed, which includes a reset level readout cycle and an image level readout cycle. Each pixel of the rolling shutter image sensor will start a first readout sequence after image signals from the first long exposure period have been generated. Each pixel of the rolling shutter image sensor will then start the second short exposure period, and a second readout sequence after image signals from the second short exposure period have been generated. In order to combine the image signals from both exposure periods, image signals from the first long exposure period need to be stored in memory circuitry (e.g., a line buffer) until image signals from the second short exposure period are ready to be read out and combined with the image signals from the first long exposure period.
After deasserting control signal RST_PD, at time t2, a first integration period (e.g., first integration period tshort) may begin. First integration period tshort may end at time t4, when control signal TX is deasserted. When control signal TX is asserted at time t3, image signals generated at photodiode 50 from first integration period tshort may be transferred to floating diffusion region 56 for storage. After the image signal from first integration period tshort (e.g., a short integration image signal) has been transferred, control signal TX may be deasserted at time t4. Control signal RST_PD may then be asserted again to reset photodiode 50 to the supply voltage level at time t4. Control signal RST_PD may be deasserted again to begin a second integration period (e.g., second integration period tlong). Second integration period tlong may end at time t12, when control signal TX is deasserted after transferring an image signal from second integration period tlong (e.g., a long integration image signal) to floating diffusion region 56. First integration period tshort may have a shorter integration time than does second integration period tlong. If desired, the first integration period may have a longer integration time than does the second integration period. After the image signals from the second integration period have been transferred beginning at time t11, control signal TX may be deasserted at time t12. Control signal RST_PD may then be asserted a last time in the dual exposure scheme to reset photodiode 50 to the supply voltage level at time t12. At time t14, control signal RST_PD may be deasserted to begin collecting an additional set of dual exposure image signals during subsequent short and long integration periods. The short and long integration periods may be from a single exposure cycle. The single exposure cycle may be read out in a single readout sequence (sometimes referred to herein as a single readout cycle). Any desired number of sets of dual exposure image signals may be generated and read out in this way.
After time t4 and prior to time t11, the short integration image signal may be read out from floating diffusion node 56 using the readout circuitry. At time t6, control signal SEL may be asserted to enable transistor 64. When control signal SEL is asserted, transistor 64 is turned on and a corresponding signal PIXOUT (e.g., an output signal having a magnitude that is proportional to the amount of charge at floating diffusion node 56) is passed onto column readout path 66. Output signal PIXOUT corresponding to the short integration image signal may be sent to corresponding sample and hold circuitry by asserting control signal SHS.
Subsequent to completion of readout of the short integration image signal at time t7, control signal RST_FD may be asserted. When control signal RST_FD is asserted at time t7, floating diffusion region 56 may be reset to the supply voltage level. Control signal RST_FD may be deasserted at time t8 after floating diffusion region 56 has been reset. The reset voltage level may be read out from time t9 to time t10. The supply voltage level may be used for an uncorrelated double sampling readout with the short integration image signal. To read out the supply voltage level, control signal SEL may be asserted at time t9 and deasserted at time t10 to generate a respective output signal PIXOUT. Output signal PIXOUT corresponding to the supply voltage level may be sent to corresponding sample and hold circuitry by asserting control signal SHR.
The generation of the long integration image signals may end after the readout of the supply voltage level (e.g., time t12 may be after time t10). After the readout of the supply voltage level, the long integration image signal may be transferred to floating diffusion region 56 from time t11 to time t12. The long integration image signal may be read out in a correlated double sampling readout with the supply voltage level readout. To read out the long integration image signal from time t13 to time t15, control signal SEL may be asserted at time t13 and deasserted at time t15 to generate a second respective output signal PIXOUT. Output signal PIXOUT corresponding to the long integration image signal may be sent to corresponding sample and hold circuitry by asserting control signal SHL.
A signal readout sequence occurs from time t6 to time t15. The signal readout sequence includes a short exposure signal readout, a reset signal readout, and a long exposure signal readout. Since the short exposure signal is temporarily stored at floating diffusion node 56, no memory circuitry (e.g., a line buffer) is needed to store the first exposure signal in this dual exposure scheme. Moreover, only a single signal readout sequence is needed for each set of dual exposure signals, and fewer reset level readout cycles are needed, thereby increasing the speed of operating pixel 22. Additionally, pixel 22 is configured to operate with double sampling readout for both the short and long exposure signal readouts. The long exposure image signal, which is used in low light conditions, may be more sensitive to reset level noise than the short exposure image signal. By sampling the long exposure image signal after sampling the reset level noise, the long exposure image signal will have a correlated double sampling readout.
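For illustration only, the arithmetic implied by this readout order (short exposure sample, then reset sample, then long exposure sample) can be sketched as follows. The sign convention, the example voltage values, and the helper name are assumptions; the sketch only distinguishes the uncorrelated double sample for the short exposure from the correlated double sample for the long exposure.

```python
# Illustrative sketch only of the double sampling arithmetic implied by the
# readout order in the text. The sign convention (signal = reset level minus
# sampled level) assumes a source-follower output that drops as charge accumulates.
def double_sample(reset_level, signal_level):
    """Difference of a reset sample and a signal sample. It is a correlated
    double sample only when the reset sample comes from the same reset event
    that preceded the signal's charge transfer."""
    return reset_level - signal_level

shs = 1.95  # sampled short-exposure level (volts); example value
shr = 2.80  # sampled reset level (volts); example value
shl = 1.20  # sampled long-exposure level (volts); example value

short_signal = double_sample(shr, shs)  # uncorrelated: reset sampled after the transfer
long_signal = double_sample(shr, shl)   # correlated: reset sampled before the transfer
```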
Instead of a single continuous short integration time period as shown in
When control signal RST_PD is deasserted at time t23, second integration period Tshort2 may begin. Second integration period Tshort2 may end at time t25. When control signal TX is asserted at time t24, the image signal generated at photodiode 50 may be transferred to floating diffusion region 56 and summed with the previous image signal from first integration period Tshort1. When control signal TX is deasserted at time t25, photodiode 50 may be reset to the supply voltage level again, after which a third integration period Tshort3 (not shown) may begin. In general, an assertion of control signal RST_PD at time t26 and subsequent deassertion of control signal RST_PD at time t27 may begin generation of image signals for nth integration period Tshortn. The nth integration period Tshortn may end at the subsequent deassertion of the control signal TX. At the end of nth integration period Tshortn (e.g., at time t4), the image signal from nth integration period Tshortn may be transferred to floating diffusion region 56 and summed with the image signals from previous integration periods. Any desired number of integration periods (n) may be used. Summed image signals at floating diffusion region 56 may then be read out as described in
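For illustration only, the alternating RST_PD and TX sequencing described above can be sketched as a simple event schedule for n non-continuous integration periods. The abstract time steps and the tuple format are assumptions and do not correspond to the t-labels of any timing diagram.

```python
# Illustrative sketch only: list the control-signal events that accumulate
# n short integration periods onto the floating diffusion. Each period begins
# when the photodiode reset is released and ends with a TX pulse that
# transfers and sums the charge.
def discontinuous_integration_schedule(n_periods):
    events, t = [], 0
    for period in range(1, n_periods + 1):
        events.append((t, "RST_PD assert", "reset photodiode"))
        events.append((t + 1, "RST_PD deassert", f"begin Tshort{period}"))
        events.append((t + 2, "TX pulse", f"transfer and sum Tshort{period} charge"))
        t += 3
    events.append((t, "readout", "read out the summed signal from the floating diffusion"))
    return events

for event in discontinuous_integration_schedule(3):
    print(event)
```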
By breaking up the effective integration period during an image frame into shorter, non-continuous integration periods that span a longer exposure time, image artifacts caused by moving objects, flickering lighting, and objects with changing illumination may be minimized without compromising pixel integration time (i.e., while maintaining the desired total integration time). In addition,
At step 102, the first exposure image signal may be stored at floating diffusion region 56. When the image signals from the plurality of continuous integration time periods are summed to generate the effective image signal for the first exposure period, the summation may occur at floating diffusion region 56. Alternatively, pixel 22 may include an additional charge storage region (e.g., capacitor, storage diode, storage gate). The separate image signals from the respective integration time periods may be summed at the additional charge storage region before being transferred to floating diffusion region 56 for subsequent readout.
At step 104, photodiode 50 may generate image signals (e.g., charge) corresponding to incident light for a second time period (e.g., a second exposure period). The second time period may be a longer exposure period compared to the first time period. Alternatively, the second time period may be a shorter exposure period or similar in length compared to the first time period. If pixel 22 includes the additional charge storage region described in connection with step 102, the second exposure period may also include a second set of continuous integration periods that make up a discontinuous integration period. The image signals from the respective continuous integration periods may be summed at the additional charge storage region. Alternatively, the second exposure period may include a single continuous integration period. The image signal for the second exposure period may sometimes be referred to herein as the second exposure image signal.
At step 106, the first exposure image signal stored at floating diffusion region 56 may be read out using readout circuitry (e.g., source follower transistor, row select transistor).
At step 108, floating diffusion region 56 may be reset to the reset voltage level (e.g., supply voltage level). The reset voltage level may be read out using readout circuitry. The reset voltage level may provide a reference for a double sampling readout.
At step 110, the second exposure image signal may be transferred to floating diffusion region 56 to be temporarily stored before readout.
At step 112, the second exposure image signal stored at floating diffusion region 56 may be read out using readout circuitry. Using the reset voltage level, the readout of the second exposure image signal may be a correlated double sampling readout.
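For illustration only, steps 102 through 112 can be collapsed into a short driver routine over an idealized pixel. The conversion gain, supply level, photon rate, and exposure lengths are assumptions; the sketch only mirrors the ordering of the steps above.

```python
# Illustrative sketch only: the step sequence above reduced to simple
# arithmetic on an idealized pixel. All numeric values are assumptions.
VDD = 2.8      # reset / supply level (volts)
GAIN = 80e-6   # volts per electron at the floating diffusion

def pixout(fd_charge_e):
    """Source-follower style output: drops from VDD as charge accumulates."""
    return VDD - GAIN * fd_charge_e

def dual_exposure_frame(photon_rate_e_per_ms, t_first_ms, t_second_ms):
    # First exposure: integrate, then store at the floating diffusion (step 102).
    fd_charge = photon_rate_e_per_ms * t_first_ms
    # Step 104: the second exposure integrates in the photodiode during readout.
    pd_charge = photon_rate_e_per_ms * t_second_ms
    # Step 106: read out the first exposure image signal.
    first_signal_level = pixout(fd_charge)
    # Step 108: reset the floating diffusion and read out the reset level.
    fd_charge = 0.0
    reset_level = pixout(fd_charge)
    # Steps 110-112: transfer the second exposure and read it out (correlated double sampling).
    fd_charge += pd_charge
    second_signal_level = pixout(fd_charge)
    return first_signal_level, reset_level, second_signal_level

levels = dual_exposure_frame(photon_rate_e_per_ms=200.0, t_first_ms=1.0, t_second_ms=10.0)
```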
In an alternative embodiment, dual gain conversion may be implemented in combination with the methods shown in
In an alternative embodiment, charge overflow capabilities may be implemented in combination with the methods shown in
Alternatively, following the generation of the low and high gain signals, the subsequent steps 104 and 106 may proceed as normal. In this alternative scenario, the low gain signal may be stored at a charge storage structure (sometimes referred to as a charge storage element) in addition to floating diffusion region 56, which may be used for storage of the second exposure image signal. An additional readout step may be needed to read out both the low and high gain signals of the first exposure image signal.
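For illustration only, the dual gain conversion mentioned above can be sketched as reading the same charge at two different effective capacitances. The capacitance values, swing limit, and selection rule are assumptions chosen to show the general idea of preferring a high gain sample in low light and a low gain sample in bright light; they are not details of any described embodiment.

```python
# Illustrative sketch only of a dual conversion gain readout: the same charge
# read with a small floating diffusion capacitance (high gain) and with extra
# capacitance switched in (low gain). All numeric values are assumptions.
Q_E = 1.602e-19          # electron charge (coulombs)
C_HIGH_GAIN = 1.5e-15    # floating diffusion alone (farads)
C_LOW_GAIN = 6.0e-15     # floating diffusion plus added storage capacitance

def conversion(charge_electrons, capacitance):
    """Voltage swing produced by the stored charge at the given capacitance."""
    return charge_electrons * Q_E / capacitance

def choose_sample(charge_electrons, swing_limit_v=0.5):
    """Prefer the high gain sample unless it would exceed the usable swing."""
    high = conversion(charge_electrons, C_HIGH_GAIN)
    low = conversion(charge_electrons, C_LOW_GAIN)
    return ("high_gain", high) if high < swing_limit_v else ("low_gain", low)

print(choose_sample(2000))    # dim pixel: high gain sample selected
print(choose_sample(20000))   # bright pixel: low gain sample selected
```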
In yet another embodiment, pixel 22 may include an additional charge storage structure (e.g., a storage gate, a capacitor) to support a third exposure time period implemented in combination with the methods shown in
The order of the short and long exposure periods in
In general, the methods described in
Although not shown in
Processor system 1000, for example a digital still or video camera system, generally includes a lens 1114 for focusing an image onto one or more pixel arrays in imaging device 1008 when a shutter release button 1116 is pressed and a central processing unit (CPU) 1002 such as a microprocessor which controls the camera and one or more image flow functions. CPU 1002 can communicate with one or more input-output (I/O) devices 1110 over a system bus 1006. Imaging device 1008 may also communicate with CPU 1002 over bus 1006. System 1000 may also include random access memory (RAM) 1004 and can optionally include removable memory 1112, such as flash memory, which can also communicate with CPU 1002 over the bus 1006. Imaging device 1008 may be combined with the CPU, with or without memory storage, on a single integrated circuit or on a different chip. Although bus 1006 is illustrated as a single bus, it may be one or more busses, bridges, or other communication paths used to interconnect system components of system 1000.
Various embodiments have been described illustrating systems and methods for generating images using image sensor pixels having high dynamic range functionalities.
The image sensor pixel may include a photosensitive region (e.g., a photodiode), a charge storage region (e.g., a floating diffusion region), and a transfer transistor that couples the photosensitive region to the charge storage region. The image sensor pixel may further include readout circuitry (e.g., a source follower transistor, a row select transistor, etc.). The photosensitive region may generate a first image signal in response to incident light during a first exposure period. The transfer transistor may transfer the first image signal to the charge storage region. The photosensitive region may generate a second image signal in response to incident light during a second exposure period. While generating the second image signal, the readout circuitry may perform readout operations on the first image signal stored at the charge storage region. The readout operations on the first image signal may be a double sampling readout.
Subsequent to the readout of the first image signal, the charge storage region may be reset to a reset voltage level supplied by a reset voltage supply. While generating the second image signal, the readout circuitry may perform readout operations on the reset voltage level stored at the charge storage region. The transfer transistor may then transfer the second image signal to the charge storage region. The readout circuitry may perform readout operations on the second image signal. The readout operations on the second image signal may be a correlated double sampling readout.
In an alternative embodiment, the photosensitive region may generate image signals in response to light during an effective discontinuous integration period that spans an exposure period. The discontinuous integration period may include a plurality of continuous integration periods during each of which a corresponding image signal is generated by the photosensitive region. The plurality of continuous integration periods may be implemented by alternately asserting the transfer transistor and a photodiode reset transistor. The charge storage region may sum the image signals from the plurality of continuous integration periods to generate the first image signal.
In an alternative embodiment, the image sensor pixel may include an additional charge storage element (e.g., capacitor, storage diode, storage gate, etc.). The additional charge storage element may store a third image signal generated during a third corresponding exposure period. The additional charge storage element may also store the first image signal, such that the readout operations on the first image signal may be a correlated double sampling readout. Additionally, the transfer transistor may impose an overflow barrier. A portion of the first image signal stored at the photosensitive region may be above the overflow barrier. The portion of the first image signal may be transferred to the additional charge storage element. The portion of the first image signal may also be transferred to the charge storage region, if desired.
The foregoing is merely illustrative of the principles of this invention and various modifications can be made by those skilled in the art without departing from the scope and spirit of the invention. The foregoing embodiments may be implemented individually or in any combination.