The present disclosure relates to an image sensor and an electronic device, and more particularly to an image sensor and an electronic device capable of further improving performance.
Conventionally, there has been developed an image sensor that detects, as an event, in real time for each pixel, that an amount of light received by a photodiode exceeds a threshold value, and reads a pixel signal corresponding to the light intensity from the pixel to acquire an image.
For example, Patent Document 1 discloses a solid-state imaging device in which pixel transistors are arranged so as to improve the light receiving efficiency of sensors capable of detecting an event and detecting intensity.
Meanwhile, in the solid-state imaging device disclosed in Patent Document 1, a pixel transistor for event detection and a pixel transistor for intensity detection are required for each pixel, and the number of pixel transistors provided for each pixel is increased. Thus, it has been conventionally difficult to achieve miniaturization, expansion of a light receiving section, and the like. Furthermore, the conventional solid-state imaging device has a configuration in which a pixel transistor for event detection and a pixel transistor for intensity detection are arranged adjacent to each other. Thus, there is a concern that a detection error may occur due to coupling of both control lines when these pixel transistors are simultaneously driven. Therefore, there is a demand to improve performance by miniaturizing pixels, enlarging a light receiving section, suppressing detection errors, and the like.
The present disclosure has been made in view of such a situation, and an object thereof is to further improve performance.
An image sensor according to an aspect of the present disclosure includes: a photoelectric conversion unit provided for each of a plurality of pixels arranged in a matrix on a sensor surface; a first transfer transistor that transfers a charge generated by photoelectric conversion in the photoelectric conversion unit to a first node; and a second transfer transistor that transfers a charge generated by photoelectric conversion in the photoelectric conversion unit to a second node different from the first node, in which at least a part of a predetermined number of the pixels included in a first sharing unit that shares and uses the first node and a predetermined number of the pixels included in a second sharing unit that shares and uses the second node have different sharing destinations.
An electronic device according to an aspect of the present disclosure includes: a photoelectric conversion unit provided for each of a plurality of pixels arranged in a matrix on a sensor surface; a first transfer transistor that transfers a charge generated by photoelectric conversion in the photoelectric conversion unit to a first node; and a second transfer transistor that transfers a charge generated by photoelectric conversion in the photoelectric conversion unit to a second node different from the first node, and includes an image sensor in which at least a part of a predetermined number of the pixels included in a first sharing unit that shares and uses the first node and a predetermined number of the pixels included in a second sharing unit that shares and uses the second node have different sharing destinations.
In one aspect of the present disclosure, a photoelectric conversion unit is provided for each of a plurality of pixels arranged in a matrix on a sensor surface, a charge generated by photoelectric conversion in the photoelectric conversion unit is transferred to a first node by a first transfer transistor, and a charge generated by photoelectric conversion in the photoelectric conversion unit is transferred to a second node different from the first node by a second transfer transistor. In addition, at least a part of a predetermined number of the pixels included in a first sharing unit that shares and uses the first node and a predetermined number of the pixels included in a second sharing unit that shares and uses the second node have different sharing destinations.
Hereinafter, a specific embodiment to which the present technology is applied will be described in detail with reference to the drawings.
<Configuration Example of Image Sensor>
An image sensor 11 includes a plurality of pixels 12 arranged in a matrix on a sensor surface that receives light, and detects occurrence of an event for each pixel 12, so as to acquire an image.
Each pixel 12 includes a photodiode (PD) 13, a transfer transistor (hereinafter, referred to as a TG transistor) 14 for intensity detection, and a transfer transistor (hereinafter, referred to as a TGD transistor) 15 for event detection.
In the pixels 12-1 to 12-4, one end of each of the TG transistors 14-1 to 14-4 is connected to the corresponding one of the PDs 13-1 to 13-4, and the other ends of the TG transistors 14-1 to 14-4 are connected to the FD node 21. Similarly, in the pixels 12-3 to 12-6, one end of each of the TGD transistors 15-3 to 15-6 is connected to the corresponding one of the PDs 13-3 to 13-6, and the other ends of the TGD transistors 15-3 to 15-6 are connected to the SN node 22.
The TG transistors 14-1 to 14-4 transfer the charges generated by photoelectric conversion in the PDs 13-1 to 13-4 to the FD node 21 according to respective transfer signals TG. The FD node 21 temporarily accumulates these charges.
The TGD transistors 15-3 to 15-6 transfer the charges generated by photoelectric conversion in the PDs 13-3 to 13-6 to the SN node 22 according to respective transfer signals TGD. The SN node 22 temporarily accumulates these charges.
The intensity reading circuit 23 is configured by combining an amplification transistor 31, a selection transistor 32, and a reset transistor 33, and outputs an intensity signal corresponding to the amounts of light received by the PDs 13-1 to 13-4. The amplification transistor 31 generates an intensity signal according to the charge accumulated in the FD node 21, and when the intensity reading circuit 23 is selected by a selection signal SEL supplied to the selection transistor 32, the intensity signal is read via a vertical signal line VSL. Furthermore, the charge accumulated in the FD node 21 is discharged according to a reset signal RST supplied to the reset transistor 33, and the FD node 21 is reset.
The logarithmic conversion circuit 24 is configured by combining amplification transistors 41 and 42 and Log transistors 43 and 44, and connecting a constant current source 46 to the combination via a Cu—Cu contact section 45, and outputs a voltage signal of a voltage value obtained by logarithmically converting the amount of light received by the PDs 13-3 to 13-6 to a row selection circuit 51. Here, the voltage signal output from the logarithmic conversion circuit 24 is used in a logic circuit at a subsequent stage to detect that an event has occurred in a case where the voltage signal is equal to or more than a predetermined voltage value, and hereinafter, this voltage signal is also referred to as an event detection signal.
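The threshold comparison performed by the subsequent logic stage can be sketched as a simple numerical model. This is an illustrative model only, not the disclosed circuit; the function names, the volts-per-decade slope, and the dark-current reference value are all assumptions made for the example.

```python
import math

# Illustrative model (not the circuit itself): the logarithmic
# conversion compresses a wide range of photocurrents into a voltage,
# and a downstream logic stage flags an event when the change in that
# voltage exceeds a threshold.

def log_convert(photocurrent_a, v_per_decade=0.05, i_dark_a=1e-12):
    """Map a photocurrent (assumed units: amperes) to a voltage on a
    logarithmic scale. Slope and reference are assumed values."""
    return v_per_decade * math.log10(photocurrent_a / i_dark_a)

def detect_event(v_prev, v_now, threshold_v=0.02):
    """Return +1 (ON event), -1 (OFF event), or 0 (no event)."""
    delta = v_now - v_prev
    if delta >= threshold_v:
        return 1
    if delta <= -threshold_v:
        return -1
    return 0

v0 = log_convert(1e-9)   # baseline illumination
v1 = log_convert(4e-9)   # brighter: raises an ON event
assert detect_event(v0, v1) == 1
```

Because the conversion is logarithmic, the same relative change in light produces the same voltage step regardless of the absolute brightness, which is why the event threshold can be a fixed voltage value.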
The row selection circuit 51 is configured by connecting a capacitor 52, an amplifier 53, a capacitor 54, and a switch 55, and outputs an event detection signal output from the logarithmic conversion circuit 24 to a logic circuit (not illustrated) according to a row selection signal for selecting pixels 12 in each row.
The image sensor 11 is thus configured, and the pixels 12-1 to 12-4 surrounded by a one-dot chain line form an intensity sharing unit that shares the FD node 21 and the intensity reading circuit 23, and the pixels 12-3 to 12-6 surrounded by a two-dot chain line form an event sharing unit that shares the SN node 22 and the logarithmic conversion circuit 24.
As illustrated in
The intensity sharing unit has a wiring configuration in which the amplification transistor 31, the selection transistor 32, and the reset transistor 33 included in the intensity reading circuit 23 are connected to the FD node 21 provided at the center of the pixels 12-1 to 12-4. The event sharing unit has a wiring configuration in which the amplification transistors 41 and 42 and the Log transistors 43 and 44 included in the logarithmic conversion circuit 24 are connected to the SN node 22 provided at the center of the pixels 12-3 to 12-6.
The image sensor 11 is thus configured, and the adopted pixel sharing structure in which the intensity reading circuit 23 and the logarithmic conversion circuit 24 are each shared by every four pixels 12 enables miniaturization of the pixels 12 or expansion of the area of the PDs 13. That is, since the image sensor 11 can reduce the number of necessary pixel transistors as compared with the conventional configuration in which it is necessary to provide the intensity reading circuit 23 and the logarithmic conversion circuit 24 for each pixel 12, it is possible to miniaturize the pixels 12 or expand the area of the PDs 13. As a result, the image sensor 11 can achieve miniaturization and high sensitivity as compared with the conventional configuration, and can improve performance.
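The transistor-count saving from this sharing structure can be illustrated with rough bookkeeping. The per-circuit counts below follow the configuration described above (three transistors in the intensity reading circuit 23, four in the logarithmic conversion circuit 24); the comparison is a simplified sketch, not a claim about any concrete layout.

```python
# Rough bookkeeping sketch (illustrative): sharing the read-out
# circuits among four pixels reduces the per-pixel transistor count.
# Each pixel always has 1 TG and 1 TGD transfer transistor; the
# intensity reading circuit (AMP, SEL, RST) and the logarithmic
# conversion circuit (2 AMP + 2 Log) are either duplicated per pixel
# or shared by a unit of four pixels.

TRANSFER_PER_PIXEL = 2   # TG + TGD
INTENSITY_CIRCUIT = 3    # amplification, selection, reset transistors
LOG_CIRCUIT = 4          # two amplification + two Log transistors

per_pixel_no_sharing = TRANSFER_PER_PIXEL + INTENSITY_CIRCUIT + LOG_CIRCUIT
per_pixel_shared_by_4 = TRANSFER_PER_PIXEL + (INTENSITY_CIRCUIT + LOG_CIRCUIT) / 4

assert per_pixel_no_sharing == 9
assert per_pixel_shared_by_4 == 3.75   # transistors per pixel
```

The area freed by the difference can be used either to shrink the pixel pitch or to enlarge the PD 13 for higher sensitivity.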
Furthermore, the image sensor 11 is configured such that among the four pixels 12-1 to 12-4 serving as an intensity sharing unit and among the four pixels 12-3 to 12-6 serving as an event sharing unit, the pixels 12-3 and 12-4 share the same FD node 21 and SN node 22, that is, the sharing destinations thereof are the same. On the other hand, the image sensor 11 is configured such that the pixels 12-1 and 12-2 among the four pixels 12-1 to 12-4 serving as an intensity sharing unit and the pixels 12-5 and 12-6 among the four pixels 12-3 to 12-6 serving as an event sharing unit share different FD nodes 21 and SN nodes 22, that is, the sharing destinations thereof are different.
As described above, the image sensor 11 is configured such that sharing destinations of at least some of the pixels 12 are different between the intensity sharing unit and the event sharing unit, whereby the intervals between the TG transistors 14 and the TGD transistors 15 can be widened in plan view of the sensor surface. That is, the image sensor 11 has a planar layout in which the intervals between the TG transistors 14 and the intervals between the TGD transistors 15 are narrow, while the intervals between the TG transistors 14 and the TGD transistors 15 are wider than those intervals.
Therefore, the image sensor 11 can reduce interference between the TG transistors 14 and the TGD transistors 15 when they are simultaneously driven (for example, they are driven by a driving method of
<Method of Driving Image Sensor>
A method of driving the pixels 12 in the image sensor 11 will be described with reference to
As illustrated in
In the intensity detection period, the pixels 12 are driven so as to discharge the charge accumulated in the FD node 21 via the reset transistor 33 sequentially in the vertical direction according to the intensity shutter signal (Intensity Shutter) in the vertical blanking period. Subsequently, in the intensity reading period, the pixels 12 are driven so as to read the charges generated in the PDs 13 to the FD node 21 via the TG transistor 14 sequentially in the vertical direction according to the intensity reading signal (Intensity Read).
In the event detection period, driving of the pixels 12 for starting reading of the charges generated in the PDs 13 via the TGD transistors 15 and driving of the pixels 12 for ending the reading are alternately and repeatedly performed, sequentially in the vertical direction, in accordance with ON-event read signals (ON event Read) and OFF-event read signals (OFF event Read).
As described above, in a case where the pixels 12 are driven by switching between the intensity detection period and the event detection period, a basic driving method of the image sensor 11 is similar to that of a general complementary metal oxide semiconductor (CMOS) image sensor.
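The time-division driving described above can be modeled, at a very coarse level, as two sequential row scans per frame; the function and parameter names below are assumptions made for illustration, not part of the disclosed configuration.

```python
# Coarse behavioral sketch of time-division driving: each frame runs
# an intensity detection period and then an event detection period,
# each scanning the rows sequentially, much like a rolling-shutter
# CMOS image sensor read-out.

def drive_frame(num_rows, read_intensity_row, read_events_row):
    intensity, events = [], []
    # Intensity detection period: shutter + read, row by row.
    for row in range(num_rows):
        intensity.append(read_intensity_row(row))
    # Event detection period: ON/OFF event reads, row by row.
    for row in range(num_rows):
        events.append(read_events_row(row))
    return intensity, events

inten, ev = drive_frame(3, lambda r: f"I{r}", lambda r: f"E{r}")
assert inten == ["I0", "I1", "I2"]
assert ev == ["E0", "E1", "E2"]
```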
As illustrated in A of
In the intensity detection period, the pixels 12 are driven in accordance with the selection signal SEL, the reset signal RST, and the transfer signals TG1 to TG4 as illustrated in B of
For example, during driving in which binning is performed, the pixels 12 are driven such that the selection signal SEL becomes the H level, the reset signal RST, the transfer signal TG1, the transfer signal TG2, the transfer signal TG3, and the transfer signal TG4 sequentially become the H level in pulse, and thereafter, the selection signal SEL becomes the L level. At the time of driving in which binning is not performed, the pixels 12 are driven such that the selection signal SEL becomes the H level, then the reset signal RST and the transfer signal TG1 sequentially become the H level in pulse, and thereafter, the selection signal SEL becomes the L level. Thereafter, similar driving is repeated for the transfer signals TG2 to TG4.
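The difference between the binned and non-binned read-out can be sketched as follows. This is a behavioral model only (charges summed at the shared FD node versus read one at a time with resets in between), and the function names are assumptions.

```python
# Illustrative sketch: with binning, the charges of the four pixels
# sharing one FD node are transferred in turn (TG1..TG4 pulsed in
# sequence) and accumulate at the node before a single read-out.
# Without binning, each transfer is followed by its own read-out and
# a reset of the FD node.

def read_with_binning(charges):
    """Sum the charges of all shared pixels into one FD value."""
    fd = 0
    for q in charges:   # FD node accumulates each transferred charge
        fd += q
    return [fd]         # one intensity sample for the sharing unit

def read_without_binning(charges):
    """Read each pixel's charge as its own sample (reset between)."""
    return [q for q in charges]

assert read_with_binning([10, 20, 30, 40]) == [100]
assert read_without_binning([10, 20, 30, 40]) == [10, 20, 30, 40]
```

Binning trades spatial resolution for sensitivity, since the summed charge yields a larger signal per read-out.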
In the event detection period, the pixels 12 are driven in accordance with the row selection signal and the transfer signals TGD1 to TGD4 as illustrated in B of
For example, at the time of driving in which binning is performed, the pixels 12 are driven such that the row selection signal becomes the H level, the transfer signal TGD1, the transfer signal TGD2, the transfer signal TGD3, and the transfer signal TGD4 sequentially become the H level, the transfer signals TGD1 to TGD4 simultaneously become the L level, and thereafter, the row selection signal becomes the L level. At the time of driving in which binning is not performed, the row selection signal becomes the H level, the transfer signal TGD1 becomes the H level in pulse, and then the row selection signal becomes the L level, and thereafter, similar driving is repeated for the transfer signals TGD2 to TGD4.
In this manner, the image sensor 11 can drive the pixels 12 by switching between the intensity detection period and the event detection period.
Furthermore, the image sensor 11 can drive the pixels 12 with the intensity detection period and the event detection period in parallel by setting two pixels 12 out of the four pixels 12 as an intensity sharing unit and setting the other two pixels 12 as an event sharing unit, for example.
With reference to
That is, as illustrated in A of
When the intensity sharing unit and the event sharing unit are set as described above, the pixels 12 are driven according to the selection signal SEL, the reset signal RST, the row selection signal, the transfer signals TG1 and TG4, and the transfer signals TGD1 and TGD4 as illustrated in B of
As described above, the image sensor 11 can drive the pixels 12 with the intensity detection period and the event detection period in parallel by setting, as an intensity sharing unit, two pixels arranged side by side in an oblique direction among the four pixels 12, and setting, as an event sharing unit, two pixels arranged side by side in an oblique direction adjacent to the intensity sharing unit.
With reference to
That is, as illustrated in A of
When the intensity sharing unit and the event sharing unit are set as described above, the pixels 12 are driven according to the selection signal SEL, the reset signal RST, the row selection signal, the transfer signals TG1 and TG3, and the transfer signals TGD1 and TGD3 as illustrated in B of
As described above, the image sensor 11 can drive the pixels 12 with the intensity detection period and the event detection period in parallel by setting, as an intensity sharing unit, two pixels arranged side by side in the vertical direction among the four pixels 12, and setting, as an event sharing unit, two pixels arranged side by side in the vertical direction adjacent to the intensity sharing unit.
<Arrangement Example of Transistors>
An arrangement example of the transistors included in the image sensor 11 will be described with reference to
Note that, in the following description, among various transistors used to drive the pixels 12, transistors other than the TG transistors 14 and the TGD transistors 15 are referred to as pixel Trs. For example, the pixel Trs include the amplification transistor 31, the selection transistor 32, the reset transistor 33, the amplification transistors 41 and 42, and the Log transistors 43 and 44.
As illustrated in
That is, the TG transistors 14 and the TGD transistors 15 are alternately arranged in the row direction at positions below the PDs 13 of the pixels 12(2n, y) in the even-numbered rows and above the PDs 13 of the pixels 12(2n+1, y) in the odd-numbered rows.
Furthermore, the pixel Trs are arranged at the center of the four PDs 13 arranged in 2×2 at positions above the PDs 13 of the pixels 12(2n, y) in the even-numbered rows and below the PDs 13 of the pixels 12(2n+1, y) in the odd-numbered rows.
In the first arrangement example of the transistors as described above, the intensity sharing units and the event sharing units are arranged so as to be shifted from each other by one pixel in the row direction. That is, the intensity sharing units each including the pixel 12(2n, 2m), the pixel 12(2n, 2m+1), the pixel 12(2n+1, 2m), and the pixel 12(2n+1, 2m+1) and the event sharing units each including the pixel 12(2n+1, 2m), the pixel 12(2n+1, 2m+1), the pixel 12(2n+2, 2m), and the pixel 12(2n+2, 2m+1) are arranged to be shifted by one pixel in the row direction.
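The one-pixel shift between the two kinds of sharing units can be expressed as a small coordinate mapping. This is an illustrative model of the indexing described above, with assumed (x, y) pixel coordinates and assumed function names.

```python
# Illustrative mapping: each pixel (x, y) belongs to a 2x2 intensity
# sharing unit and a 2x2 event sharing unit, with the event units
# shifted by one pixel along the first coordinate. Adjacent pixel
# pairs therefore end up with different sharing destinations.

def intensity_unit(x, y):
    return (x // 2, y // 2)

def event_unit(x, y):
    return ((x + 1) // 2, y // 2)   # shifted by one pixel

# Pixels (1, 0) and (2, 0) share the same event unit...
assert event_unit(1, 0) == event_unit(2, 0)
# ...but belong to different intensity units.
assert intensity_unit(1, 0) != intensity_unit(2, 0)
```

This staggering is what separates the TG transistors 14 from the TGD transistors 15 in the planar layout and thereby suppresses coupling between their control lines.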
Specifically, the pixel 12(0, 0), the pixel 12(0, 1), the pixel 12(1, 0), and the pixel 12(1, 1) surrounded by a one-dot chain line illustrated in
As illustrated in
That is, the TG transistors 14 are arranged at even-numbered positions in the row direction at positions below the PDs 13 of the pixels 12(2n, y) in the even-numbered rows and above the PDs 13 of the pixels 12(2n+1, y) in the odd-numbered rows. The TGD transistors 15 are arranged at odd-numbered positions in the row direction at positions above the PDs 13 of the pixels 12(2n, y) in the even-numbered rows and below the PDs 13 of the pixels 12(2n+1, y) in the odd-numbered rows. That is, the TG transistors 14 and the TGD transistors 15 are alternately arranged in the row direction and the column direction (that is, in the oblique direction).
Furthermore, the pixel Trs are arranged at the center of the four PDs 13 arranged in 2×2, at positions above the PDs 13 of the pixels 12(2n, y) in the even-numbered rows and below the PDs 13 of the pixel 12(2n+1, y) in the odd-numbered rows, and at positions below the PDs 13 of the pixels 12(2n, y) in the even-numbered rows and above the PDs 13 of the pixels 12(2n+1, y) in the odd-numbered rows.
In the second arrangement example of the transistors as described above, the intensity sharing units and the event sharing units are arranged so as to be shifted from each other by one pixel in the row direction and the column direction. That is, the intensity sharing units each including the pixel 12(2n, 2m), the pixel 12(2n, 2m+1), the pixel 12(2n+1, 2m), and the pixel 12(2n+1, 2m+1) and the event sharing units each including the pixel 12(2n+1, 2m+1), the pixel 12(2n+1, 2m+2), the pixel 12(2n+2, 2m+1), and the pixel 12(2n+2, 2m+2) are arranged to be shifted by one pixel in the row direction and the column direction.
Specifically, the pixel 12(0, 0), the pixel 12(0, 1), the pixel 12(1, 0), and the pixel 12(1, 1) surrounded by a one-dot chain line illustrated in
In the third arrangement example of the transistors illustrated in
On the other hand, in the third arrangement example of the transistors, the arrangement of the pixel Trs is different from that of the first arrangement example of the transistors in
In the third arrangement example of the transistors as described above, the intensity sharing units and the event sharing units are arranged so as to be shifted from each other by one pixel in the row direction.
In the fourth arrangement example of the transistors illustrated in
On the other hand, in the fourth arrangement example of the transistors, the arrangement of the pixel Trs is different from that of the first arrangement example of the transistors in
In the fourth arrangement example of the transistors as described above, the intensity sharing units and the event sharing units are arranged so as to be shifted from each other by one pixel in the row direction.
In the fifth arrangement example of the transistors illustrated in
Then, in the fifth arrangement example of the transistors, the TG transistors 14 and the TGD transistors 15 are arranged similarly to the first arrangement example of the transistors described with reference to
In the fifth arrangement example of the transistors as described above, the intensity sharing units and the event sharing units are arranged so as to be shifted from each other by one pixel in the row direction.
<Image Sensor of Multilayer Structure>
The image sensor 11 has a two-layer structure in which a sensor substrate on which the PDs 13 and the like are provided and a logic substrate on which logic circuits such as the row selection circuit 51 are provided are stacked via the Cu—Cu contact section 45 illustrated in
A configuration example of the image sensor 11 having a three-layer structure will be described with reference to
For example, the image sensor 11 can have a three-layer structure in which a sensor substrate on which the PDs 13 and the like are provided, a transistor substrate on which the pixel transistors are provided, and a logic substrate on which logic circuits such as the row selection circuit 51 are provided are stacked. Note that the circuit configuration of the image sensor 11 having a three-layer structure is similar to the circuit diagram illustrated in
A of
B of
As described above, in the image sensor 11 having a three-layer structure, the area of the PDs 13 in the sensor substrate can be expanded by providing the pixel transistors on the transistor substrate. Therefore, the image sensor 11 can achieve higher sensitivity.
<Arrangement Example of Filters>
An arrangement example of filters stacked on the light receiving surface of the image sensor 11 will be described with reference to
In the first arrangement example of the filters illustrated in
In the second arrangement example of the filters illustrated in
In the image sensor 11 using the first and second arrangement examples of filters as described above, it is possible to acquire, at the FD node 21 and the SN node 22, all pieces of color information by reading charges via the TG transistor 14 and the TGD transistor 15 for each pixel.
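The point that every sharing unit sees all colors can be sketched with an assumed Bayer-style unit of one red, two green, and one blue pixel; the layout here is an assumption for the example, not a quotation of the disclosed filter arrangement.

```python
# Illustrative sketch: if each 2x2 sharing unit covers one red, two
# green, and one blue pixel, reading the four pixels of the unit in
# turn yields every color at the shared node.

BAYER_UNIT = ["R", "G", "G", "B"]   # colors of the four shared pixels

def colors_at_node(unit_filters):
    """Set of colors obtained by reading each pixel of a sharing unit."""
    return set(unit_filters)

assert colors_at_node(BAYER_UNIT) == {"R", "G", "B"}
```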
In the third arrangement example of the filters illustrated in
Moreover, in the third arrangement example of the filters, the red filters R, the green filters G, or the blue filters B are assigned to each unit of the four pixels 12 to be the intensity sharing unit so that the 2×2 filters of the same color coincide with the intensity sharing unit. Therefore, for the event sharing unit, the red filter R is assigned to one pixel 12, the green filters G are assigned to two pixels 12, and the blue filter B is assigned to one pixel 12.
In the image sensor 11 using the third arrangement example of the filters as described above, intensity signals can be synthesized for each intensity sharing unit in which filters of the same color are arranged, and sensitivity for each color can be improved.
In the fourth arrangement example of the filters illustrated in
Moreover, in the fourth arrangement example of the filters, the red filters R, the green filters G, or the blue filters B are assigned to each unit of the four pixels 12 to be the event sharing unit so that the 2×2 filters of the same color coincide with the event sharing unit. Therefore, for the intensity sharing unit, the red filter R is assigned to one pixel 12, the green filters G are assigned to two pixels 12, and the blue filter B is assigned to one pixel 12.
In the image sensor 11 using the fourth arrangement example of filters as described above, event detection signals can be synthesized for each event sharing unit in which filters of the same color are arranged, and an event can be detected by capturing a finer change in light. Furthermore, the resolution of the intensity signal can be improved as compared with the third arrangement example of the filters of
In the fifth arrangement example of the filters illustrated in
For example, the SN node 22 is shared and used by the 4×4 pixels 12 in which the filters IR are arranged. Furthermore, each of the FD nodes 21 arranged at the four corners of the 4×4 pixels 12 in which the filters IR are arranged is shared by a pixel 12 of a red filter R, a pixel 12 of a green filter G, and a pixel 12 of a blue filter B.
For example, in the example illustrated in
In the sixth arrangement example of the filters illustrated in
For example, the SN node 22 is shared and used by the 4×4 pixels 12 in which the filters IR are arranged. Furthermore, among the FD nodes 21 arranged at the four corners of the 4×4 pixels 12 in which the filters IR are arranged, one FD node 21 is shared by three pixels 12 in which red filters R are arranged, two FD nodes 21 are each shared by three pixels 12 in which green filters G are arranged, and one FD node 21 is shared by three pixels 12 in which blue filters B are arranged.
For example, in the example illustrated in
In the image sensor 11 using the fifth and sixth arrangement examples of filters as described above, an event can be detected with higher sensitivity by 4×4 pixels 12 in which the filters IR are arranged.
<Configuration Example of Electronic Device>
The above-described image sensor 11 may be applied to various electronic devices, for example, imaging systems such as digital still cameras and digital video cameras, mobile phones having an imaging function, and other devices having an imaging function.
As illustrated in
The optical system 102 includes one or a plurality of lenses, and guides image light from an object (incident light) to the imaging element 103 to form an image on a light-receiving surface (sensor unit) of the imaging element 103.
As the imaging element 103, the image sensor 11 described above is used. Electrons are accumulated in the imaging element 103 for a certain period in accordance with the image formed on the light-receiving surface via the optical system 102. Then, a signal corresponding to the electrons accumulated in the imaging element 103 is supplied to the signal processing circuit 104.
The signal processing circuit 104 performs various types of signal processing on the pixel signal output from the imaging element 103. An image (image data) obtained by the signal processing applied by the signal processing circuit 104 is supplied to the monitor 105 to be displayed or supplied to the memory 106 to be stored (recorded).
The imaging device 101 configured as described above can capture, for example, a higher quality image when occurrence of an event is detected by using the above-described image sensor 11.
<Usage Example of Image Sensor>
The above-described image sensor may be used in various cases of sensing light such as visible light, infrared light, ultraviolet light, and X-rays, as described below, for example.
A device which takes an image to be used for viewing such as a digital camera and portable equipment with a camera function
<Combination Examples of Configurations>
Note that the present technology may have the following configurations.
(1)
An image sensor including:
(2)
The image sensor according to (1) described above, in which
(3)
The image sensor according to (1) or (2) described above further including:
(4)
The image sensor according to any one of (1) to (3) described above, in which the pixels are driven while a first detection period
(5)
The image sensor according to any one of (1) to (3) described above, in which
(6)
The image sensor according to any one of (1) to (5) described above, in which
(7)
The image sensor according to any one of (1) to (5) described above, in which
(8)
The image sensor according to (3) described above, in which
(9)
The image sensor according to (3) described above, in which
(10)
The image sensor according to any one of (1) to (9) described above further including
(11)
The image sensor according to (3) described above, in which
(12)
The image sensor according to any one of (1) to (11) described above, in which
(13)
The image sensor according to any one of (1) to (11) described above, in which
(14)
The image sensor according to any one of (1) to (11) described above, in which
(15)
The image sensor according to any one of (1) to (11) described above, in which
(16)
The image sensor according to any one of (1) to (12) described above, in which
(17)
An electronic device including an image sensor, including:
Note that the present embodiment is not limited to the above-described embodiment, and various modifications can be made without departing from the gist of the present disclosure. Furthermore, the effects described in the present specification are merely examples and are not limited, and other effects may be provided.
Number | Date | Country | Kind
---|---|---|---
2021-023132 | Feb 2021 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2022/000161 | 1/6/2022 | WO |