This disclosure relates generally to image sensors, and in particular but not exclusively, relates to hybrid image sensors with high sampling point distribution.
Image sensors have become ubiquitous and are now widely used in digital cameras, cellular phones, security cameras, as well as in medical, automotive, and other applications. As image sensors are integrated into a broader range of electronic devices, it is desirable to enhance their functionality and performance (e.g., resolution, power consumption, dynamic range) in as many ways as possible through both device architecture design and image acquisition processing. The technology used to manufacture image sensors has continued to advance at a great pace. For example, the demands of higher resolution and lower power consumption have encouraged the further miniaturization and integration of these devices.
A typical image sensor operates in response to image light from an external scene being incident upon the image sensor. The image sensor includes an array of pixels having photosensitive elements (e.g., photodiodes) that absorb a portion of the incident image light and generate image charge upon absorption of the image light. The image charge photogenerated by the pixels may be measured as analog output image signals on column bitlines that vary as a function of the incident image light. In other words, the amount of image charge generated is proportional to the intensity of the image light, and the image charge is read out as analog image signals from the column bitlines and converted to digital values to produce digital images (e.g., image data) representing the external scene. The analog image signals on the bitlines are coupled to readout circuits, which include input stages having analog-to-digital converter (ADC) circuits to convert those analog image signals from the pixel array into digital image signals.
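The readout path described above can be sketched as a simple numeric model: photogenerated charge proportional to light intensity, saturating at the photodiode's full-well capacity, then quantized by an ADC. The function name and parameter values below are illustrative assumptions, not taken from the disclosure:

```python
# Hypothetical model of the readout path: image charge is proportional to
# incident light intensity (up to saturation), and an ADC quantizes the
# analog signal into a digital code. All names/values are illustrative.

def pixel_adc_readout(intensity, full_well=10000.0, bit_depth=10):
    """Convert a normalized light intensity (0.0-1.0) to a digital ADC code."""
    charge = min(intensity, 1.0) * full_well      # charge saturates at full well
    max_code = (1 << bit_depth) - 1               # e.g., 1023 for a 10-bit ADC
    return round(charge / full_well * max_code)   # quantize to a digital code
```

A brighter pixel yields a proportionally larger code, and intensities beyond saturation clip at full scale.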
Non-limiting and non-exhaustive embodiments of the present disclosure are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
Corresponding reference characters indicate corresponding components throughout the several views of the drawings. Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present disclosure. In addition, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present disclosure.
Examples directed to an imaging system with a pixel circuit providing simultaneous hybrid functionality with high sampling point distribution are disclosed. In the following description, numerous specific details are set forth to provide a thorough understanding of the examples. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail in order to avoid obscuring certain aspects.
Reference throughout this specification to “one example” or “one embodiment” means that a particular feature, structure, or characteristic described in connection with the example is included in at least one example of the present disclosure. Thus, the appearances of the phrases “in one example” or “in one embodiment” in various places throughout this specification are not necessarily all referring to the same example. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more examples.
Spatially relative terms, such as “beneath,” “below,” “over,” “under,” “above,” “upper,” “top,” “bottom,” “left,” “right,” “center,” “middle,” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is rotated or turned over, elements described as “below” or “beneath” or “under” other elements or features would then be oriented “above” the other elements or features. Thus, the exemplary terms “below” and “under” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated ninety degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly. In addition, it will also be understood that when an element is referred to as being “between” two other elements, it can be the only element between the two other elements, or one or more intervening elements may also be present.
It will be further understood that, although the terms first, second, third, etc., may be used herein to describe various elements, these elements should not be limited by these terms and should not be used to determine the process sequence or formation order of associated elements. Unless indicated otherwise, these terms are merely used to distinguish one element from another element. Thus, a first element discussed below could be termed a second element without departing from the teachings of the disclosed embodiments.
Throughout this specification, several terms of art are used. These terms are to take on their ordinary meaning in the art from which they come, unless specifically defined herein or the context of their use would clearly suggest otherwise. It should be noted that element names and symbols may be used interchangeably throughout this document (e.g., Si vs. silicon); both have the same meaning.
As will be discussed, various examples of an imaging system with a pixel circuit providing simultaneous hybrid functionality (e.g., simultaneous image/video capturing and event driven sensing capabilities) with high sampling point distribution are disclosed. Although normal image/video sensors offer great image and/or video capturing capabilities, one of their limitations is that they do not provide the ultra-high frame rates and ultra-high speed capture capabilities that may be useful in a variety of applications, such as machine vision, gaming, and artificial intelligence sensing. Attempts to provide typical image/video sensors with such ultra-high frame rates and ultra-high speed capabilities have resulted in compromised solutions that provide poor quality image captures compared to their normal image sensor counterparts.
It is appreciated that circuit designs in accordance with the teachings of the present disclosure address at least some of the issues discussed above. For example, an image sensor disclosed herein can operate in a hybrid mode in which the image sensor simultaneously provides great image and video capture capabilities using a first subset of pixels, and senses events at ultra-high frame rates and ultra-high speeds using a second subset of pixels for a wide variety of event driven (or other) applications. Moreover, the first and second subsets of pixels can be arranged with high sampling point distribution to provide improved contrast and/or modulation transfer function (MTF) compared to other image sensors.
Thus, as will be shown and described in the various examples below, an example pixel circuit includes a pixel array and a color filter array. The pixel array includes a plurality of pixels, each comprising two photodiodes, a floating diffusion coupled between the two photodiodes, and two transfer transistors coupled between the two photodiodes and the floating diffusion. The color filter array includes a plurality of color filters disposed over at least one of the pixels. Each pixel is coupled to a first readout circuit. As will be discussed in various examples below, the pixels include a first subset of pixels and a second subset of pixels. In various examples, the second subset of pixels is also coupled to a second readout circuit, and the first subset of pixels is the remaining subset of the pixels not coupled to the second readout circuit. In other words, the second subset of pixels includes some, but not all, of the pixels that are coupled to the first readout circuit in accordance with the teachings of the present disclosure. Each pair of pixels arranged in two adjacent rows includes a first pixel included in the second subset of the pixels and a second pixel included in the first subset of pixels. The pixels in the first subset are configured to couple to the first readout circuit under a certain pixel operation (e.g., hybrid mode), but not to the second readout circuit, and are disposed underneath one of the color filters.
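The shared floating diffusion topology described above (two photodiodes coupled to one floating diffusion through two transfer transistors) can be sketched as a simple charge model. The class and method names below are illustrative, not from the disclosure:

```python
# Illustrative charge model of the pixel topology: two photodiodes share one
# floating diffusion; pulsing a transfer transistor moves that photodiode's
# accumulated charge onto the shared node. Names/units are hypothetical.
from dataclasses import dataclass, field

@dataclass
class PixelCell:
    pd_charge: list = field(default_factory=lambda: [0.0, 0.0])  # two photodiodes
    floating_diffusion: float = 0.0                              # shared node

    def expose(self, intensity_a, intensity_b):
        """Accumulate photogenerated charge on each photodiode."""
        self.pd_charge[0] += intensity_a
        self.pd_charge[1] += intensity_b

    def transfer(self, which):
        """Pulse one of the two transfer transistors (0 or 1), moving that
        photodiode's charge onto the shared floating diffusion."""
        self.floating_diffusion += self.pd_charge[which]
        self.pd_charge[which] = 0.0
```

Transferring each photodiode in turn sums both charges on the shared floating diffusion, which is the behavior the shared-node arrangement enables.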
To illustrate,
In various examples, the pixel array 108 is a two-dimensional (2D) array including a plurality of pixel cells (also referred to as “pixels”) that each includes a photodiode exposed to incident light. As illustrated in the depicted example, the pixels are arranged into rows and columns to acquire image data of a person, place, object, etc., which can then be used to render images and/or video. As discussed further herein, a first fraction of the pixels are configured as CMOS image sensor (CIS) pixels, and a second fraction of the pixels are configured as hybrid CIS/event-based vision sensor (EVS) pixels. In the example, each CIS pixel can be configured to photogenerate image charge in response to the incident light. After each CIS pixel has acquired its image charge, the corresponding analog image charge data is read out by the image readout circuit 116 in the bottom die 106 through the column bit lines. In various examples, the image charge from each row of pixel array 108 may be read out in parallel through the column bit lines by the image readout circuit 116.
In various examples, the image readout circuit 116 in the bottom die 106 includes amplifiers, analog-to-digital converter (ADC) circuitry, associated analog support circuitry, associated digital support circuitry, etc., for normal image readout and processing. In some examples, image readout circuit 116 may also include event driven readout circuitry, which will be described in greater detail below. In operation, the photogenerated analog image charge signals are read out from the pixel cells of pixel array 108, amplified, and converted to digital values in the image readout circuit 116. In some examples, image readout circuit 116 may read out a row of image data at a time. In other examples, the image readout circuit 116 may read out the image data using a variety of other techniques (not illustrated), such as a serial readout or a full parallel readout of all pixels simultaneously. The image data may be stored or even manipulated by applying post image effects (e.g., crop, rotate, remove red eye, adjust brightness, adjust contrast, and the like).
In the depicted example, the second die 104, which may also be referred to as the middle die 104 of the stacked CIS with EVS system 100, includes an event driven sensing array 112 that is coupled to the pixel array 108 in the top die 102. In various examples, the event driven sensing array 112 is coupled to the pixels of the pixel array 108 through corresponding pairs of hybrid bond pads between the top die 102 and the middle die 104. In one example, the event driven sensing array 112 includes an array of event driven circuits. As will be discussed, in one example, each one of the event driven circuits in the event driven sensing array 112 is coupled to a plurality of pixels of the pixel array 108 through pixel level connections between the top die 102 and the middle die 104 to asynchronously detect events that occur in the light that is incident upon the pixel array 108 in accordance with the teachings of the present disclosure.
In some embodiments, the second fraction of the pixels, namely the hybrid CIS/EVS pixels, can be selectively coupled to the event driven readout circuits of the event driven sensing array 112. When operated as EVS pixels, the photosensors of the hybrid CIS/EVS pixels can be used to track changes in the intensity of light incident on the photosensors from an external scene. In particular, the photosensors can photogenerate image charge (electrons or holes) or photocurrent in response to the incident light from the external scene. The photogenerated image charge can then be provided, via an EVS connection such as a hybrid bond, to a coupled event driven circuit of the event driven sensing array 112. In some embodiments, the event driven circuit includes (i) a photocurrent-to-voltage converter coupled to the photosensor to convert photocurrent generated by the photosensor to a voltage; and (ii) a filter amplifier coupled to the photocurrent-to-voltage converter to generate a filtered and amplified signal in response to the voltage received from the photocurrent-to-voltage converter. The event driven circuit can further include a threshold comparison circuit to determine and generate event detection signals in response to events asynchronously detected in incident light received from the external scene. For example, the threshold comparison circuit may generate an event detection signal when a detected change in the pixel signal at the output of the filter amplifier relative to a reference pixel signal is greater than a predetermined voltage threshold value. It is appreciated that the described event driven readout circuit is one example implementation to read out event signals. Various implementations for readout circuitry and readout schemes for event vision sensor pixels are well known. Thus, details on circuitry and readout techniques for event driven circuits are largely omitted here for the sake of brevity and to avoid obscuring aspects of the present technology.
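The event detection flow described above (photocurrent converted to a voltage, filtered/amplified, then compared against a latched reference with a threshold) can be sketched in a few lines. The class name, the logarithmic front end, and the ON/OFF event polarity convention are illustrative assumptions, not the disclosure's exact circuit:

```python
# Sketch of asynchronous event detection: photocurrent is converted to a
# voltage (logarithmic front ends are common in EVS designs), and an event
# fires when the change relative to a latched reference exceeds a threshold.
import math

class EventDetector:
    def __init__(self, threshold=0.1):
        self.threshold = threshold
        self.reference = None    # reference signal, re-latched at each event

    def _to_voltage(self, photocurrent):
        # hypothetical photocurrent-to-voltage converter (logarithmic)
        return math.log(photocurrent)

    def sense(self, photocurrent):
        """Return +1 (ON event), -1 (OFF event), or 0 (no event)."""
        v = self._to_voltage(photocurrent)
        if self.reference is None:
            self.reference = v            # first sample initializes reference
            return 0
        delta = v - self.reference
        if abs(delta) > self.threshold:   # threshold comparison circuit
            self.reference = v            # re-latch reference after the event
            return 1 if delta > 0 else -1
        return 0
```

Small intensity fluctuations below the threshold produce no output, which is what makes the readout event driven rather than frame based.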
In various examples, corresponding event detection signals are generated by the event driven circuits in the event driven sensing array 112. The event detection signals may be coupled to be received and processed by event driven peripheral circuitry 114, which, in one example, is arranged around the periphery of the event driven sensing array 112 in the middle die 104 as shown in
In the illustrated example, the pixel array 208 includes a plurality of pixel cells or pixels 219 arranged in Y rows and X columns. Each pixel 219 can include two subpixels 218, each including a photodiode. The two photodiodes in the two subpixels 218 are coupled together to share a floating diffusion 220 (represented as a horizontal line extending between the two adjacent subpixels 218) such that each pixel 219 includes one floating diffusion 220. The two subpixels 218 are electrically isolated from each other, for example by a trench isolation structure and/or junction isolation. Example circuitry of the pixel 219 is described in further detail below with reference to
The pixel circuit 207 can also include a color filter array 209 disposed over the pixel array 208. The color filter array 209 includes a plurality of color filters 230, each having one of a plurality of colors and disposed over at least one of the subpixels 218. In
The pixel circuit 207 can also include a plurality of microlenses 240 disposed over the pixel array 208. In particular, the microlenses 240 are 1×2 microlenses such that each microlens 240 is disposed over two subpixels 218 in the same row and two adjacent columns, or each pixel 219. The microlenses 240 can help concentrate incident light onto the photodiodes included in the pixels 219, improving the sensitivity and overall image quality of the pixel circuit 207. The microlenses 240 can also minimize crosstalk between adjacent pixels 219, thereby enhancing the ability of the pixel circuit 207 to accurately capture fine details and colors.
Each of the pixels 219 can be coupled to a first readout circuit, such as the image readout circuit 116 shown in
In various examples, the pixel circuit 207 can be operated in a first mode, a second mode, and a third mode. In the first mode, all of the pixels 219 are configured to couple to the first readout circuit to provide CIS information corresponding to an external scene such that the pixel circuit 207 provides an image without image quality loss compared to conventional CIS-only pixel circuits. In the second mode, the pixels in the remaining subset or first subset 219a are configured to couple to the first readout circuit to continue providing CIS information, and the pixels in the second subset 219b are configured to couple to the second readout circuit, to provide event detection signals (e.g., provide photocurrent for event detection functionality) and/or other non-CIS imaging information (e.g., change in intensity or object motion information). Because the pixels in the second subset 219b are disposed under the RGB filters 230, the non-CIS information can be provided as R+G+G+B or gray, which may result in reduced sensitivity for event detection (or other functionality) in the second mode, but may provide improved image quality in the first mode. Therefore, the second mode is a hybrid mode in which the pixel circuit 207 simultaneously provides CIS information and non-CIS information corresponding to the external scene. In the third mode, all of the pixels 219 are configured to couple to the second readout circuit, to provide non-CIS information.
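The three operating modes described above amount to a routing decision per pixel: image (CIS) readout, event driven readout, or a hybrid split between the two subsets. A minimal sketch, assuming illustrative mode names and a set-based representation of the second subset:

```python
# Minimal sketch of the three operating modes: route each pixel to the image
# (CIS) readout or the event driven readout. Mode names and the pixel-id
# representation are illustrative assumptions.

def route_pixels(pixels, second_subset, mode):
    """Map each pixel id to 'image' or 'event' readout for the given mode."""
    if mode == "first":    # all pixels provide CIS information
        return {p: "image" for p in pixels}
    if mode == "second":   # hybrid: second subset provides non-CIS information
        return {p: ("event" if p in second_subset else "image") for p in pixels}
    if mode == "third":    # all pixels provide non-CIS information
        return {p: "event" for p in pixels}
    raise ValueError(f"unknown mode: {mode}")
```

In the hybrid ("second") mode only the second subset is diverted to the event driven readout, so the first subset keeps producing conventional image data at the same time.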
It is appreciated that in the example Y×X (e.g., 8×4) pixel circuit 207 shown in
In the illustrated example, the pixel array 308 includes a plurality of pixel cells or pixels 319 arranged in Y rows and X columns. Each pixel 319 can include two subpixels 318, each including a photodiode. The two photodiodes in the two subpixels 318 are coupled together to share a floating diffusion 320 (represented as a horizontal line extending between the two adjacent subpixels 318) such that each pixel 319 includes one floating diffusion 320. The two subpixels 318 are electrically isolated from each other, for example by a trench isolation structure and/or junction isolation. Example circuitry of the pixel 319 is described in further detail below with reference to
The pixel circuit 307 can also include a color filter array 309 disposed over the pixel array 308. The color filter array 309 includes a plurality of color filters 330, each having one of a plurality of colors and disposed over at least one of the subpixels 318. In
The pixel circuit 307 can also include a plurality of microlenses 340 disposed over the pixel array 308. In particular, the microlenses 340 are 2×2 microlenses such that each microlens 340 is disposed over four subpixels 318 in adjacent rows and columns, or a pair of pixels 319 in adjacent rows. The microlenses 340 can help concentrate incident light onto the photodiodes included in the pair of pixels 319, improving the sensitivity and overall image quality of the pixel circuit 307. The microlenses 340 can also minimize crosstalk between adjacent pairs of pixels 319, thereby enhancing the ability of the pixel circuit 307 to accurately capture fine details and colors.
Each of the pixels 319 can be coupled to a first readout circuit, such as the image readout circuit 116 shown in
In various examples, the pixel circuit 307 can be operated in a first mode, a second mode, and a third mode. In the first mode, all of the pixels 319 are configured to couple to the first readout circuit, to provide CIS information corresponding to an external scene such that the pixel circuit 307 provides an image without image quality loss compared to conventional CIS-only pixel circuits. In the second mode, the pixels in the remaining subset or first subset 319a are configured to couple to the first readout circuit (e.g., image readout circuit), to continue providing CIS information, and the pixels in the second subset 319b are configured to couple to the second readout circuit (e.g., event driven circuit), to provide event detection signals (e.g., provide photocurrent for event detection functionality) and/or other non-CIS information. Because the pixels in the second subset 319b are disposed under the RGB filters 330, the non-CIS information can be provided in correspondence to a combination of red, green, green, and blue colors (R+G+G+B) or gray, which may result in reduced sensitivity for event detection (or other functionality) in the second mode, but may provide improved image quality in the first mode. Therefore, the second mode is a hybrid mode in which the pixel circuit 307 simultaneously provides CIS information and non-CIS information corresponding to the external scene. In the third mode, all of the pixels 319 are configured to couple to the second readout circuit to provide non-CIS information.
It is appreciated that in the example Y×X (e.g., 8×4) pixel circuit 307 shown in
In the illustrated example, the pixel array 408 includes a plurality of pixel cells or pixels 419 arranged in Y rows and X columns. Each pixel 419 can include two subpixels 418, each including a photodiode. The two photodiodes in the two subpixels 418 are coupled together to share a floating diffusion 420 (represented as a horizontal line extending between the two adjacent subpixels 418) such that each pixel 419 includes one floating diffusion 420. The two subpixels 418 are electrically isolated from each other, for example by a trench isolation structure and/or junction isolation. Example circuitry of the pixel 419 is described in further detail below with reference to
The pixel circuit 407 can also include a color filter array 409 disposed over the pixel array 408. The color filter array 409 includes a plurality of color filters, each corresponding to one of a plurality of color spectrums and disposed over at least one of the subpixels 418. In
The pixel circuit 407 can also include a plurality of microlenses 440 disposed over the pixel array 408. In particular, the microlenses 440 are 1×2 microlenses such that each microlens 440 is disposed over two subpixels 418 in the same row and two adjacent columns, or each pixel 419. The microlenses 440 can help concentrate incident light onto the photodiodes included in the pixels 419, improving the sensitivity and overall image quality of the pixel circuit 407. The microlenses 440 can also minimize crosstalk between adjacent pixels 419, thereby enhancing the ability of the pixel circuit 407 to accurately capture fine details and colors.
Each of the pixels 419 can be coupled to a first readout circuit, such as the image readout circuit 116 shown in
In various examples, the pixel circuit 407 can be operated in a first mode, a second mode, and a third mode. In the first mode, all of the pixels 419 are configured to couple to the first readout circuit, to provide CIS information corresponding to an external scene such that the pixel circuit 407 provides an image without image quality loss compared to conventional CIS-only pixel circuits. In the second mode, the pixels in the first subset 419a are configured to couple to the first readout circuit, to continue providing CIS information, and the pixels in the second subset 419b are configured to couple to the second readout circuit, to provide event detection signals (e.g., provide photocurrent for event detection functionality) and/or other non-CIS information. Disposing the second subset 419b underneath the non-RGB or clear filters 432 (or no filters) can help increase the photocurrent generated and/or reduce latency associated with the pixels included in the second subset 419b. Therefore, the second mode is a hybrid mode in which the pixel circuit 407 simultaneously provides CIS information and non-CIS information corresponding to the external scene. In the third mode, all of the pixels 419 are configured to couple to the second readout circuit to provide non-CIS information.
It is appreciated that in the example Y×X (e.g., 8×4) pixel circuit 407 shown in
In the illustrated example, the pixel array 508 includes a plurality of pixel cells or pixels 519 arranged in Y rows and X columns. Each pixel 519 can include two subpixels 518, each including a photodiode. The two photodiodes in the two subpixels 518 are coupled together to share a floating diffusion 520 (represented as a horizontal line extending between the two adjacent subpixels 518) such that each pixel 519 includes one floating diffusion 520. The two subpixels 518 are electrically isolated from each other, for example by a trench isolation structure and/or junction isolation. Example circuitry of the pixel 519 is described in further detail below with reference to
The pixel circuit 507 can also include a color filter array 509 disposed over the pixel array 508. The color filter array 509 includes a plurality of color filters, each corresponding to one of a plurality of color spectrums and disposed over at least one of the subpixels 518. In
The pixel circuit 507 can also include a plurality of microlenses 540 disposed over the pixel array 508. In particular, the microlenses 540 are 2×2 microlenses such that each microlens 540 is disposed over four subpixels 518 in adjacent rows and columns, or a pair of pixels 519 in adjacent rows. The microlenses 540 can help concentrate incident light onto the photodiodes included in the pair of pixels 519, improving the sensitivity and overall image quality of the pixel circuit 507. The microlenses 540 can also minimize crosstalk between adjacent pairs of pixels 519, thereby enhancing the ability of the pixel circuit 507 to accurately capture fine details and colors.
Each of the pixels 519 can be coupled to a first readout circuit, such as the image readout circuit 116 shown in
In various examples, the pixel circuit 507 can be operated in a first mode, a second mode, and a third mode. In the first mode, all of the pixels 519 are configured to couple to the first readout circuit (e.g., image readout circuit) to provide CIS information corresponding to an external scene such that the pixel circuit 507 provides an image without image quality loss compared to conventional CIS-only pixel circuits. In the second mode, the pixels in the first subset 519a are configured to couple to the first readout circuit to continue providing CIS information, and the pixels in the second subset 519b are configured to couple to the second readout circuit (e.g., event driven circuit) to provide event detection signals (e.g., provide photocurrent for event detection functionality) and/or other non-CIS information. Disposing the second subset 519b underneath the non-RGB or clear filters 532 (or no filters) can help increase the photocurrent generated and/or reduce latency associated with the pixels included in the second subset 519b. Therefore, the second mode is a hybrid mode in which the pixel circuit 507 simultaneously provides CIS information and non-CIS information corresponding to the external scene. In the third mode, all of the pixels 519 are configured to couple to the second readout circuit to provide non-CIS information.
It is appreciated that in the example Y×X (e.g., 8×4) pixel circuit 507 shown in
In the illustrated example, the pixel array 608 includes a plurality of pixel cells or pixels 619 arranged in Y rows and X columns. Each pixel 619 can include two subpixels 618, each including a photodiode. The two photodiodes in the two subpixels 618 are coupled together to share a floating diffusion 620 (represented as a horizontal line extending between the two adjacent subpixels 618) such that each pixel 619 includes one floating diffusion 620. The two subpixels 618 are electrically isolated from each other, for example by a trench isolation structure and/or junction isolation. Example circuitry of the pixel 619 is described in further detail below with reference to
The pixel circuit 607 can also include a color filter array 609 disposed over the pixel array 608. The color filter array 609 includes a plurality of color filters 630, each corresponding to one of a plurality of color spectrums and disposed over at least one of the subpixels 618. In
The pixel circuit 607 can also include a plurality of microlenses 640 disposed over the pixel array 608. In particular, the microlenses 640 are 1×2 microlenses such that each microlens 640 is disposed over two subpixels 618 in the same row and two adjacent columns, or each pixel 619. The microlenses 640 can help concentrate incident light onto the photodiodes included in the pixels 619, improving the sensitivity and overall image quality of the pixel circuit 607. The microlenses 640 can also minimize crosstalk between adjacent pixels 619, thereby enhancing the ability of the pixel circuit 607 to accurately capture fine details and colors.
Each of the pixels 619 can be coupled to a first readout circuit, such as the image readout circuit 116 shown in
In various examples, the pixel circuit 607 can be operated in a first mode, a second mode, and a third mode. In the first mode, all of the pixels 619 are configured to couple to the first readout circuit (e.g., image readout circuit) to provide CIS information corresponding to an external scene such that the pixel circuit 607 provides an image without image quality loss compared to conventional CIS-only pixel circuits. In the second mode, the pixels in the remaining subset or first subset 619a are configured to couple to the first readout circuit, to continue providing CIS information, and the pixels in the second subset 619b are configured to couple to the second readout circuit (e.g., event driven circuit) to provide event detection signals (e.g., provide photocurrent for event detection functionality) and/or other non-CIS information. Because the pixels in the second subset 619b are disposed under the RGB filters 630, the non-CIS information can be provided as combined red, green, green, and blue color information (e.g., R+G+G+B) or gray, which may result in reduced sensitivity for event detection (or other functionality) in the second mode, but may provide improved image quality in the first mode. Therefore, the second mode is a hybrid mode in which the pixel circuit 607 simultaneously provides CIS information and non-CIS information corresponding to the external scene. In the third mode, all of the pixels 619 are configured to couple to the second readout circuit, to provide non-CIS information.
It is appreciated that in the example Y×X (e.g., 8×4) pixel circuit 607 shown in
In the illustrated example, the pixel array 708 includes a plurality of pixel cells or pixels 719 arranged in Y rows and X columns. Each pixel 719 can include two subpixels 718, each including a photodiode. The two photodiodes in the two subpixels 718 are coupled together to share a floating diffusion 720 (represented as a horizontal line extending between the two adjacent subpixels 718) such that each pixel 719 includes one floating diffusion 720. The two subpixels 718 are electrically isolated from each other, for example by a trench isolation structure and/or junction isolation. Example circuitry of the pixel 719 is described in further detail below with reference to
The pixel circuit 707 can also include a color filter array 709 disposed over the pixel array 708. The color filter array 709 includes a plurality of color filters, each corresponding to one of a plurality of color spectrums and disposed over at least one of the subpixels 718. In
The pixel circuit 707 can also include a plurality of microlenses 740 disposed over the pixel array 708. In particular, the microlenses 740 are 1×2 microlenses such that each microlens 740 is disposed over two subpixels 718 in the same row and two adjacent columns, or each pixel 719. The microlenses 740 can help concentrate incident light onto the photodiodes included in the pixels 719, improving the sensitivity and overall image quality of the pixel circuit 707. The microlenses 740 can also minimize crosstalk between adjacent pixels 719, thereby enhancing the ability of the pixel circuit 707 to accurately capture fine details and colors.
Each of the pixels 719 can be coupled to a first readout circuit, such as the image readout circuit 116 shown in
In various examples, the pixel circuit 707 can be operated in a first mode, a second mode, and a third mode. In the first mode, all of the pixels 719 are configured to couple to the first readout circuit (e.g., image readout circuit) to provide CIS information such that the pixel circuit 707 provides an image without image quality loss compared to conventional CIS-only pixel circuits. In the second mode, the pixels in the first subset 719a are configured to couple to the first readout circuit, to continue providing CIS information, and the pixels in the second subset 719b are configured to couple to the second readout circuit (e.g., event driven circuit), to provide event detection signals (e.g., provide photocurrent for event detection functionality) and/or other non-CIS information. Disposing the second subset 719b underneath the non-RGB or clear filters 732 (or no filters) can help increase the photocurrent generated and/or reduce latency associated with the pixels included in the second subset 719b. Therefore, the second mode is a hybrid mode in which the pixel circuit 707 simultaneously provides CIS information and non-CIS information. In the third mode, all of the pixels 719 are configured to couple to the second readout circuit to provide non-CIS information.
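The three-mode routing described above can be summarized behaviorally. The following Python sketch is illustrative only; the mode names, the subset membership, and the circuit labels are modeling assumptions, not part of the disclosed circuitry:

```python
from enum import Enum

class Mode(Enum):
    CIS_ONLY = 1    # first mode: every pixel couples to the image readout circuit
    HYBRID = 2      # second mode: first subset -> image readout, second subset -> event readout
    EVENT_ONLY = 3  # third mode: every pixel couples to the event-driven readout circuit

def route_pixels(pixels, second_subset, mode):
    """Map each pixel id to the readout circuit it couples to in the given mode."""
    routing = {}
    for p in pixels:
        if mode is Mode.CIS_ONLY:
            routing[p] = "image_readout"
        elif mode is Mode.EVENT_ONLY:
            routing[p] = "event_readout"
        else:  # Mode.HYBRID: only pixels in the second subset switch to event readout
            routing[p] = "event_readout" if p in second_subset else "image_readout"
    return routing

pixels = list(range(8))
second_subset = {1, 3, 5, 7}  # hypothetical membership for illustration
hybrid = route_pixels(pixels, second_subset, Mode.HYBRID)
```

In the hybrid mode, the model routes pixels 1, 3, 5, and 7 to the event-driven readout while the remaining pixels continue to provide CIS information, mirroring the simultaneous operation described above.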
It is appreciated that in the example N×M (e.g., 8×4) pixel circuit 707 shown in
It is appreciated that the pixel circuits illustrated herein and described above are merely examples illustrative of certain features of the present disclosure, and that other pixel circuits are within the scope of the present disclosure. For example, a pixel circuit can include pixels included in a second subset arranged in a checkerboard pattern (e.g., as shown in
As discussed above, an imaging system configured in accordance with the teachings of the present disclosure can be switched among a first mode, providing only CIS information without image quality loss, a second mode, providing hybrid (e.g., simultaneous CIS and non-CIS) information, and a third mode, providing only non-CIS information. This enables the imaging system to selectively provide various types of information without sacrificing conventional imaging quality, unlike many conventional imaging systems.
In some cases, arranging the pixels included in the second subset in a checkerboard pattern can be preferred due to the manner in which most image sensor systems are used. For example, image sensors (e.g., smartphone cameras) are often held either horizontally or vertically such that horizons or edges (e.g., of walls, doors, windows, etc.) land on a single row or column. If (i) an imaging system is operating in the second (e.g., hybrid) mode, (ii) the pixels included in the second subset cover an entire row or column, and (iii) the incident light from a horizon or edge lands on the row or column occupied entirely by the pixels included in the second subset, the imaging system may be unable to sharply capture that horizon or edge due to the lack of CIS pixels in that row or column. On the other hand, a checkerboard pattern ensures that each row and column includes CIS pixels at all times, regardless of the mode in which the imaging system is operating, to capture those horizons or edges.
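The row/column coverage argument above can be checked mechanically. The following is a minimal sketch in which the mask convention (True marks a second-subset pixel) is an assumption made for illustration:

```python
def checkerboard_second_subset(rows, cols):
    """Mark second-subset (e.g., event) pixels in a checkerboard pattern.
    True = second subset, False = first subset (CIS pixel)."""
    return [[(r + c) % 2 == 0 for c in range(cols)] for r in range(rows)]

def every_line_has_cis(mask):
    """Check that every row and every column retains at least one CIS pixel."""
    rows_ok = all(any(not cell for cell in row) for row in mask)
    cols_ok = all(any(not row[c] for row in mask) for c in range(len(mask[0])))
    return rows_ok and cols_ok

mask = checkerboard_second_subset(8, 4)
```

By contrast, a mask that dedicates an entire row to the second subset fails the check, matching the horizon/edge concern described above.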
Moreover, in the various examples of the pixel circuits disclosed herein, including the pixel circuits 207, 307, 407, 507, 607, 707 and the pixel circuits not fully illustrated but described above, each pair of pixels arranged in two adjacent rows includes a first pixel included in the second subset of the pixels and a second pixel included in the first subset of the pixels and disposed underneath one of the color filters. Therefore, when the pixel circuit operates in the second (e.g., hybrid) mode described above, the arrangements described herein result in a high sampling point distribution for event detection, phase detection auto focus (PDAF), or other processing performed exclusively for signals from pixels included in the second subset. The high sampling point distribution disclosed herein can result in improved contrast and/or modulation transfer function (MTF) compared to other image sensors.
The pixel 819 can include a first photodiode 862a, a second photodiode 862b, a floating diffusion (FD) 820, a first transfer transistor 864a, a second transfer transistor 864b, a source follower transistor 866, a row select transistor 868, and a reset transistor 869. The first photodiode 862a, which can correspond to a first subpixel, can be configured to photogenerate a first image charge in response to incident light. The second photodiode 862b, which can correspond to a second subpixel, can be configured to photogenerate a second image charge in response to incident light. The FD 820 can be coupled to receive the first image charge from the first photodiode 862a and the second image charge from the second photodiode 862b. The first transfer transistor 864a can be coupled between the first photodiode 862a and the FD 820 to transfer the first image charge from the first photodiode 862a to the FD 820. The second transfer transistor 864b can be coupled between the second photodiode 862b and the FD 820 to transfer the second image charge from the second photodiode 862b to the FD 820. A gate terminal of the source follower transistor 866 can be coupled to the FD 820, and the row select transistor 868 can be coupled to the source follower transistor 866. The reset transistor 869 can be coupled to the FD 820 to selectively reset the FD 820 to a predetermined voltage level.
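As a behavioral illustration of the shared floating diffusion described above, the following Python sketch models reset and charge transfer in arbitrary units. The class name, unity conversion gain, and unity source-follower gain are simplifying assumptions for modeling, not the disclosed circuit:

```python
class SharedFDPixel:
    """Behavioral sketch of a pixel with two photodiodes sharing one
    floating diffusion (FD) through transfer transistors, plus a reset
    transistor that restores the FD to a predetermined level."""

    def __init__(self, reset_level=1.0):
        self.reset_level = reset_level
        self.pd = [0.0, 0.0]   # charge accumulated in photodiodes a and b
        self.fd = reset_level  # floating-diffusion voltage

    def integrate(self, light_a, light_b, t):
        """Photogenerate charge proportional to light intensity and time."""
        self.pd[0] += light_a * t
        self.pd[1] += light_b * t

    def reset(self):
        """Reset transistor on: FD returns to the predetermined level."""
        self.fd = self.reset_level

    def transfer(self, which):
        """Transfer transistor on: move one photodiode's charge onto the FD,
        pulling its voltage down in proportion (conversion gain = 1 here)."""
        self.fd -= self.pd[which]
        self.pd[which] = 0.0

    def read(self):
        """Source-follower readout of the FD voltage (unity gain assumed)."""
        return self.fd

px = SharedFDPixel(reset_level=1.0)
px.integrate(light_a=0.2, light_b=0.3, t=1.0)
px.reset()
px.transfer(0)  # FD drops from 1.0 to 0.8
```

Transferring the second photodiode's charge afterward drops the FD further, which is how the shared FD can combine the two subpixels' signals in a single readout.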
In the depicted example, the mode select switch circuit 882 can include a first transistor 882a that is coupled between a node N and a voltage source vpix. In the depicted example, the node N is coupled to the second readout circuit 880, which in the example is part of one of the event driven circuits included in the event driven sensing array 112 shown in
As shown, the pixel 819 can be positioned on the top die 102, and the mode select switch circuit 882 and second readout circuit 880 can be positioned on the middle die 104 and coupled to one or more pixels 819. More specifically, in the illustrated example, the reset transistor 869 of the pixel 819 is coupled to the mode select switch circuit 882 through a hybrid bond 870 (or other type of bond) between the top die 102 and the middle die 104.
In the illustrated example, the pixel array 908 positioned on the top die 102 includes a plurality of the pixels 819 arranged in rows and columns. Pixels included in a second subset 819b, one of which is boxed in a broken line, include reset transistors coupled to the mode select switch circuit 882. Pixels included in a remaining subset or a first subset 819a, one of which is boxed in a broken line, however, are not coupled to the mode select switch circuit 882. In
In the illustrated example, the pixel array 1008 positioned on the top die 102 includes a plurality of the pixels 819 arranged in rows and columns. Pixels included in the second subset 819b, one of which is boxed in a broken line, include reset transistors coupled to the mode select switch circuit 882. Pixels included in the first subset 819a, one of which is boxed in a broken line, however, are not coupled to the mode select switch circuit 882. In
Compared to the pixel array 908 of
In operation, the mode select switch circuit 882 can be controlled to configure the pixel arrays 908, 1008, and thus the imaging system including the pixel arrays 908, 1008, between the first mode in which all of the pixels 819 provide CIS information through the source follower transistors 866 and the row select transistors 868 (
As discussed above, an imaging system configured in accordance with the teachings of the present disclosure can be configured between the first mode, providing CIS information without image quality loss, and the second mode, providing hybrid information. This enables the imaging system to provide additional information without sacrificing conventional imaging, unlike many conventional imaging systems. Moreover, because the pixels included in the second subset are arranged with high sampling point distribution (e.g., for event detection, phase detection auto focus (PDAF), or other processing), the imaging system can provide improved contrast and/or modulation transfer function (MTF) compared to other image sensors.
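The disclosure does not specify the event-detection algorithm applied to signals from the second-subset pixels. As one common illustrative scheme (an assumption for context, not the disclosed circuit), an event can be signaled whenever the log photocurrent changes by more than a contrast threshold since the last event:

```python
import math

def detect_events(photocurrent, threshold):
    """Emit (index, +1/-1) events when the log photocurrent changes by more
    than `threshold` relative to the last event. This log-contrast scheme is
    an illustrative assumption, not taken from the disclosure."""
    events = []
    ref = math.log(photocurrent[0])  # reference level at the first sample
    for i, sample in enumerate(photocurrent[1:], start=1):
        delta = math.log(sample) - ref
        if abs(delta) >= threshold:
            events.append((i, 1 if delta > 0 else -1))
            ref = math.log(sample)  # update the reference after each event
    return events
```

For example, a photocurrent trace that doubles and then drops to a quarter of its peak produces one positive and one negative event, while constant illumination produces none.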
The above description of illustrated examples of the disclosure, including what is described in the Abstract, is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. While specific examples of the disclosure are described herein for illustrative purposes, various modifications are possible within the scope of the disclosure, as those skilled in the relevant art will recognize.
These modifications can be made to the disclosure in light of the above detailed description. The terms used in the following claims should not be construed to limit the disclosure to the specific examples disclosed in the specification. Rather, the scope of the disclosure is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.
The present application claims the benefit of U.S. Provisional Patent Application No. 63/608,150, filed Dec. 8, 2023, the disclosure of which is incorporated herein by reference in its entirety.
| Number | Date | Country |
|---|---|---|
| 63608150 | Dec 2023 | US |