The present disclosure relates to a solid-state imaging element and an electronic device.
In recent years, in order to achieve downsizing of a solid-state imaging element and densification of pixels, a solid-state imaging element having a three-dimensional structure has been developed. In the solid-state imaging element having the three-dimensional structure, for example, a semiconductor substrate having a plurality of sensor pixels and a semiconductor substrate having a signal processing circuit that processes a signal obtained by each sensor pixel are stacked on each other (see, for example, Patent Literature 1).
The first layer of the solid-state imaging element is provided with a photodiode (PD), a floating diffusion (FD), a transfer gate (TG) that is a gate electrode of a transfer transistor, and the like. Normally, signal lines such as control lines are drawn out from the first layer to the upper side of the second layer by a through contact, and are arranged in the second and subsequent layers. In order to maintain uniformity of capacitance between the transfer gate TG and the floating diffusion FD, that is, TG-FD capacitance, for each pixel, optimization of TG wiring and FD wiring is performed in the second and subsequent layers.
Patent Literature 1: JP 2010-245506 A
However, as the number of signal lines becomes larger and the pixel pitch becomes finer, the degree of freedom in wiring of the second and subsequent layers decreases. Further, since the signal lines are drawn out in the second and subsequent layers, the number of wiring layers in the second and subsequent layers increases. Thus, it is difficult to adjust the TG-FD capacitance while suppressing an increase in the number of wiring layers.
Accordingly, the present disclosure provides a solid-state imaging element and an electronic device capable of facilitating adjustment of capacitance between a transfer gate and a floating diffusion while suppressing an increase in the number of wiring layers.
A solid-state imaging element according to an aspect of the present disclosure includes a first semiconductor substrate; a second semiconductor substrate stacked on the first semiconductor substrate with an insulating layer interposed therebetween; a photoelectric conversion element that is provided on the first semiconductor substrate and generates charge by photoelectric conversion; a floating diffusion layer that is provided on the first semiconductor substrate and retains the charge generated by the photoelectric conversion element; a transfer gate that is a gate electrode of a transfer transistor that is provided on the first semiconductor substrate and transfers the charge generated by the photoelectric conversion element to the floating diffusion layer; a first through wire electrically connected to the floating diffusion layer and penetrating the insulating layer and the second semiconductor substrate; a second through wire electrically connected to the transfer gate and penetrating the insulating layer and the second semiconductor substrate; a wiring layer stacked on the second semiconductor substrate and having a wiring electrically connected to the first through wire or the second through wire; and an adjustment layer that is provided on the second semiconductor substrate so as to be in contact with both or one of the first through wire and the second through wire and adjusts a capacitance between the transfer gate and the floating diffusion layer.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings. Note that the solid-state imaging element and the electronic device according to the present disclosure are not limited by this embodiment. Further, in each of the following embodiments, basically the same parts are denoted by the same reference signs, and redundant description is omitted.
One or more embodiments (examples and modifications) described below can each be implemented independently. On the other hand, at least some of the plurality of embodiments described below may be appropriately combined with at least some of the other embodiments. The plurality of embodiments may include novel features different from each other. Therefore, the plurality of embodiments can contribute to achieving different objects or solving different problems, and can exhibit different effects. Note that the effects described in the embodiments are merely examples and are not limiting, and other effects may be provided.
The present disclosure will be described according to the following order of items.
1. First Embodiment
1-1. Example of Schematic Configuration of Solid-State Imaging Element
1-2. Example of Pixel Circuit
1-3. Example of Connection Mode of Pixel Circuit
1-4. Example of Cross-Sectional Configuration of Solid-State Imaging Element
1-5. Example of Layer Structure of Solid-State Imaging Element
1-6. Modification of Layer Structure of Solid-State Imaging Element
1-7. Example of Method of Manufacturing Solid-State Imaging Element
1-8. Effects
2. Second Embodiment
2-1. Example of Layer Structure of Solid-State Imaging Element
2-2. Modification of Layer Structure of Solid-State Imaging Element
2-3. Effects
3. Third Embodiment
3-1. Example of Layer Structure of Solid-State Imaging Element
3-2. Effects
4. Fourth Embodiment
4-1. Example of Layer Structure of Solid-State Imaging Element
4-2. Effects
5. Other Embodiments
6. Application Example
7. Application Example
7-1. Vehicle Control System
7-2. Operating Room System
8. Appendix
An example of a schematic configuration of a solid-state imaging element 1 according to a first embodiment will be described with reference to
As illustrated in
The first substrate 10 includes a first semiconductor substrate 11 and a plurality of sensor pixels 12 that performs photoelectric conversion. The first semiconductor substrate 11 has the sensor pixels 12. These sensor pixels 12 are provided in a matrix (two-dimensional array) in the pixel region 13 of the first substrate 10.
The second substrate 20 includes a second semiconductor substrate 21, readout circuits 22 that output pixel signals, a plurality of pixel drive lines 23 extending in a row direction, and a plurality of vertical signal lines 24 extending in a column direction. The second semiconductor substrate 21 has one readout circuit 22 for every four sensor pixels 12. The readout circuit 22 outputs a pixel signal based on charge output from the sensor pixel 12.
The third substrate 30 includes a third semiconductor substrate 31 and a logic circuit 32 that processes the pixel signal. The third semiconductor substrate 31 has the logic circuit 32. The logic circuit 32 includes, for example, a vertical drive circuit 33, a column signal processing circuit 34, a horizontal drive circuit 35, and a system control circuit 36.
The logic circuit 32 outputs an output voltage Vout for each sensor pixel 12 to the outside. Note that, in the logic circuit 32, for example, a low resistance region including a silicide such as CoSi2 or NiSi formed using a self-aligned silicide (salicide) process may be formed on a surface of an impurity diffusion region in contact with the source electrode and the drain electrode.
For example, the vertical drive circuit 33 sequentially selects the plurality of sensor pixels 12 row by row.
The column signal processing circuit 34 performs, for example, correlated double sampling (CDS) processing on the pixel signal output from each sensor pixel 12 of the row selected by the vertical drive circuit 33. For example, the column signal processing circuit 34 extracts the signal level of each pixel signal by executing the CDS processing, and holds pixel data corresponding to the amount of light received by each sensor pixel 12.
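As an illustrative aside (not part of the disclosure), the arithmetic behind the CDS processing can be sketched in a few lines: one sample is taken at the reset level and one at the signal level per pixel, and their subtraction cancels the offset common to both samples. All names and numeric values below are hypothetical.

```python
from typing import List

def cds(reset_levels: List[float], signal_levels: List[float]) -> List[float]:
    """Correlated double sampling: subtract each pixel's reset-level sample
    from its signal-level sample.

    The offset (e.g., reset noise) common to both samples cancels, leaving
    a value proportional to the charge transferred to the floating diffusion.
    """
    if len(reset_levels) != len(signal_levels):
        raise ValueError("one reset sample is required per signal sample")
    return [s - r for r, s in zip(reset_levels, signal_levels)]

# Example: three pixels of a row selected by the vertical drive circuit.
reset = [0.51, 0.49, 0.50]   # hypothetical reset-level samples (V)
signal = [0.91, 0.74, 0.50]  # hypothetical signal-level samples (V)
print(cds(reset, signal))    # per-pixel differences; third pixel saw no light
```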
For example, the horizontal drive circuit 35 sequentially outputs the pixel data held in the column signal processing circuit 34 to the outside.
The system control circuit 36 controls driving of each block (the vertical drive circuit 33, the column signal processing circuit 34, and the horizontal drive circuit 35) in the logic circuit 32, for example.
Next, an example of a pixel circuit according to the first embodiment will be described with reference to
As illustrated in
Each sensor pixel 12 has common components. In
Each of the sensor pixels 12 includes, for example, a photodiode PD and a transfer transistor TR electrically connected to the photodiode PD. These sensor pixels 12 share a floating diffusion FD electrically connected to each transfer transistor TR. That is, the individual photodiode PD of each sensor pixel 12 is electrically connected to the floating diffusion FD via the transfer transistor TR. For example, the photodiode PD, the transfer transistor TR, the floating diffusion FD, and so on are provided on the first substrate 10.
The photodiode PD performs photoelectric conversion to generate charge corresponding to the amount of received light. A cathode of the photodiode PD is electrically connected to a source of the transfer transistor TR. Further, an anode of the photodiode PD is electrically connected to a reference potential line (for example, ground). The photodiode PD is an example of a photoelectric conversion element.
The transfer transistor TR is electrically connected between the photodiode PD and the floating diffusion FD. In the transfer transistor TR, for example, a drain is electrically connected to the floating diffusion FD, and a transfer gate TG as a gate is electrically connected to the pixel drive line 23 (see
The floating diffusion FD is common to the sensor pixels 12 sharing one readout circuit 22, and is electrically connected to an input end of the readout circuit 22 common to the sensor pixels 12. The floating diffusion FD temporarily holds the charge output from the photodiode PD and input via the transfer transistor TR. The floating diffusion FD is an example of a floating diffusion layer.
Here, in the transfer transistor TR0, a capacitance C0 is added between the transfer gate TG0 and the floating diffusion FD. Further, in the transfer transistor TR3, a capacitance C3 is added between the transfer gate TG3 and the floating diffusion FD. By adjusting these capacitances C0 and C3, for example, individual capacitances (TG-FD capacitances) between the transfer gates TG (TG0 to TG3) and the floating diffusion FD are made uniform. This capacitance adjustment will be described in detail later.
The readout circuit 22 includes, for example, a reset transistor RST, a selection transistor SEL, and an amplification transistor AMP. The reset transistor RST, the selection transistor SEL, the amplification transistor AMP, and the like are provided on the second substrate 20, for example. The reset transistor RST, the amplification transistor AMP, and the selection transistor SEL are, for example, CMOS transistors.
The reset transistor RST is a transistor for resetting potential. In the reset transistor RST, for example, a drain is electrically connected to a power supply line VDD, and a source is electrically connected to the floating diffusion FD. Further, a gate is electrically connected to the pixel drive line 23 (see
The amplification transistor AMP is a transistor for voltage amplification. In the amplification transistor AMP, for example, a drain is electrically connected to the power supply line VDD, and a gate is electrically connected to the floating diffusion FD. The amplification transistor AMP amplifies the potential of the floating diffusion FD, and generates a voltage corresponding to amplified potential as a pixel signal.
The selection transistor SEL is a transistor for pixel selection. In the selection transistor SEL, for example, a drain is electrically connected to a source of the amplification transistor AMP, and a source is electrically connected to the vertical signal line 24. Further, a gate is electrically connected to the pixel drive line 23 (see
Note that the configuration of the readout circuit 22 is not particularly limited. For example, the selection transistor SEL may be provided between the power supply line VDD and the amplification transistor AMP. Further, one or more of the reset transistor RST, the amplification transistor AMP, the selection transistor SEL, and the like can be omitted depending on the method of reading the pixel signal, or another transistor can be added.
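As a rough illustration (not part of the disclosure), the readout chain described above can be modeled to first order: the transfer transistor moves photo-generated electrons onto the floating diffusion, lowering its potential by Q / C_FD, and the amplification transistor buffers that potential as a source follower. Every numeric value below (reset potential, FD capacitance, follower gain) is an invented assumption for the sketch.

```python
E = 1.602e-19  # elementary charge (C)

def read_pixel(n_electrons: int,
               v_reset: float = 2.8,    # hypothetical FD potential after reset (V)
               c_fd: float = 1.6e-15,   # hypothetical total FD capacitance (F)
               sf_gain: float = 0.85    # hypothetical source-follower gain
               ) -> float:
    """First-order model of the readout: reset, transfer, amplify.

    Electrons transferred onto the floating diffusion lower its potential
    by Q / C_FD; the amplification transistor buffers it with gain < 1.
    """
    v_fd = v_reset - n_electrons * E / c_fd
    return sf_gain * v_fd

# More collected charge means a larger swing away from the reset level.
dark = read_pixel(0)
bright = read_pixel(5000)
print(dark - bright)  # signal swing seen on the vertical signal line (V)
```

This also makes clear why the total FD capacitance matters: a larger C_FD gives a smaller voltage swing per electron, which is why uniformity of the TG-FD capacitance across pixels affects signal uniformity.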
An example of a connection mode of the pixel circuit according to the first embodiment will be described with reference to
As illustrated in
An example of a cross-sectional configuration of the solid-state imaging element 1 according to the first embodiment will be described with reference to
As illustrated in
The first substrate 10 is formed by stacking an insulating layer 46 on the first semiconductor substrate (semiconductor layer) 11. The first substrate 10 includes the insulating layer 46 as a part of an interlayer insulating film 51. The insulating layer 46 is provided in a gap between the first semiconductor substrate 11 and a second semiconductor substrate 21 described later.
The first semiconductor substrate 11 includes a silicon substrate. The first semiconductor substrate 11 has, for example, a p-well layer 42 in a part of a front surface and in the vicinity thereof, and has the photodiode PD of a conductivity type different from that of the p-well layer 42 in another region (region deeper than the p-well layer 42). The p-well layer 42 includes a p-type semiconductor region. The photodiode PD includes a semiconductor region of a conductivity type (specifically, n-type) different from that of the p-well layer 42. Further, the first semiconductor substrate 11 includes, in the p-well layer 42, the floating diffusion FD as a semiconductor region of a conductivity type (specifically, n-type) different from that of the p-well layer 42. The floating diffusion FD is formed in the p-well layer 42, and is one floating diffusion layer common to the four adjacent sensor pixels 12.
The first substrate 10 includes the photodiode PD and the transfer transistor TR in each sensor pixel 12, and further includes the floating diffusion FD in each of the four sensor pixels 12. The transfer transistor TR and the floating diffusion FD are provided in a portion on the front surface side (side opposite to the light incident surface side, the second substrate 20 side) of the first semiconductor substrate 11.
The first substrate 10 further includes an element isolation section 43 that separates each sensor pixel 12. The element isolation section 43 is formed so as to extend in a normal direction of the first semiconductor substrate 11 (direction perpendicular to the front surface of the first semiconductor substrate 11). The element isolation section 43 is provided between two sensor pixels 12 adjacent to each other. The element isolation section 43 electrically isolates the sensor pixels 12 adjacent to each other from each other. The element isolation section 43 is formed by, for example, silicon oxide. The element isolation section 43 is, for example, an isolation section of deep trench isolation (DTI) type in which a trench is formed from the back surface to the middle of the first semiconductor substrate 11.
The color filter 40 is provided on the back surface side of the first semiconductor substrate 11. The color filter 40 is provided, for example, at a position in contact with the back surface of the first semiconductor substrate 11 and facing the sensor pixel 12. The light receiving lens 50 is, for example, in contact with a back surface of the color filter 40 and is provided at a position facing the sensor pixel 12 via the color filter 40. One color filter 40 and one light receiving lens 50 are provided for each sensor pixel 12.
The second substrate 20 is formed by stacking an insulating layer 52 on a semiconductor substrate (semiconductor layer) 21. The second substrate 20 includes the insulating layer 52 as a part of the interlayer insulating film 51. The insulating layer 52 is provided in a gap between the second semiconductor substrate 21 and a third semiconductor substrate 31 described later. The second semiconductor substrate 21 includes a silicon substrate.
The second substrate 20 includes one readout circuit 22 for every four sensor pixels 12 (see
The second substrate 20 further includes a plurality of insulating layers 53 and 54 penetrating the second semiconductor substrate 21 in the same layer as the second semiconductor substrate 21. These insulating layers 53 and 54 are provided on the second substrate 20 as a part of the interlayer insulating film 51.
A stacked body including the first substrate 10 and the second substrate 20 includes an interlayer insulating film 51, a plurality of through wires 71 and 72, a diffusion layer (floating diffusion layer) 74, and a gate layer 75.
Each of the through wires 71 and 72 is provided in the interlayer insulating film 51, extends in a normal direction of the second semiconductor substrate 21, and penetrates the second semiconductor substrate 21. Each of the through wires 71 and 72 is referred to as a through contact. The first substrate 10 and the second substrate 20 are electrically connected to each other by the through wires 71 and 72. As the through wires 71 and 72, for example, a first through wire 71 for the floating diffusion FD (FD1 to FD4) and a plurality of second through wires 72 for the transfer gate TG (TG0 to TG3) exist. Each of the second through wires 72 penetrates the insulating layers 53 and 54.
The diffusion layer 74 is provided in the same layer as the second semiconductor substrate 21 so as to be in contact with the first through wire 71, and is electrically connected to the first through wire 71. The gate layer 75 is provided on the second semiconductor substrate 21 so as to be in contact with the second through wire 72 without being in contact with the diffusion layer 74, and is electrically connected to the second through wire 72. The diffusion layer 74 and the gate layer 75 function as an adjustment layer that adjusts the capacitance between the transfer gate TG and the floating diffusion FD, that is, the TG-FD capacitance.
The second substrate 20 further includes a plurality of connection sections 59 electrically connected to the readout circuit 22 and the second semiconductor substrate 21 in the insulating layer 52. Furthermore, the second substrate 20 includes, for example, a wiring layer 56 on the insulating layer 52.
The wiring layer 56 includes, for example, an insulating layer 57, and a plurality of pixel drive lines 23 and a plurality of vertical signal lines 24 provided in the insulating layer 57. Furthermore, the wiring layer 56 includes a connection wiring 55 in the insulating layer 57 for each floating diffusion FD. The connection wiring 55 is electrically connected to the first through wire 71 connected to the floating diffusion FD. Any of the pixel drive lines 23 is electrically connected to the second through wire 72 connected to the transfer gate TG. The pixel drive line 23, the vertical signal line 24, the connection wiring 55, and the like are examples of wirings.
The wiring layer 56 further includes, for example, a plurality of pad electrodes 58 in the insulating layer 57. Each pad electrode 58 is formed by metal such as copper (Cu) or aluminum (Al), for example. Each pad electrode 58 is exposed on the surface of the wiring layer 56. Each pad electrode 58 is used for electrical connection between the second substrate 20 and the third substrate 30 and bonding between the second substrate 20 and the third substrate 30. For example, one pad electrode 58 is provided for each of the pixel drive lines 23 and the vertical signal lines 24.
The third substrate 30 is formed by stacking an interlayer insulating film 61 on a third semiconductor substrate (semiconductor layer) 31, for example. The third semiconductor substrate 31 includes a silicon substrate. Note that the third substrate 30 is bonded to the second substrate 20 with their front surfaces facing each other; therefore, when the configuration of the third substrate 30 is described, the vertical direction is opposite to that in the drawings.
The third substrate 30 has a configuration in which the logic circuit 32 is provided in a portion on a front surface side of the third semiconductor substrate 31. The third substrate 30 includes, for example, a wiring layer 62 on the interlayer insulating film 61. The wiring layer 62 includes, for example, an insulating layer 63 and a plurality of pad electrodes 64 provided in the insulating layer 63. Each pad electrode 64 is electrically connected to the logic circuit 32. Each pad electrode 64 is formed by, for example, Cu (copper). Each pad electrode 64 is exposed on a front surface of the wiring layer 62. Each pad electrode 64 is used for electrical connection between the second substrate 20 and the third substrate 30 and bonding between the second substrate 20 and the third substrate 30. Note that the number of pad electrodes 64 is not necessarily plural.
The third substrate 30 and the second substrate 20 are electrically connected to each other by bonding the pad electrodes 58 and 64 to each other. The third substrate 30 is bonded to the second substrate 20 with the front surface of the third semiconductor substrate 31 facing the front surface side of the second semiconductor substrate 21. That is, the third substrate 30 is bonded to the second substrate 20 in a face-to-face manner.
An example of a layer structure of the solid-state imaging element 1 according to the first embodiment will be described with reference to
As illustrated in
As illustrated in
In the second layer, a plurality of signal lines including control lines and the like are further provided so as to extend in a left-right direction in
As illustrated in
The gate layer 75 is positioned above the transfer gate TG3 (TG in
Here, the gate layer 75 is provided on the second semiconductor substrate 21 via an insulating film, and this insulating film is a part of the insulating layer 52. Similarly, the transfer gate TG3 is also provided on the first semiconductor substrate 11 via an insulating film, and this insulating film is a part of the insulating layer 46.
On the other hand, as illustrated in
With such a layer structure, the diffusion layer 74 and the gate layer 75 function as share contacts and adjust the TG-FD capacitance (for example, the coupling amount of capacitive coupling). That is, the TG-FD capacitance that cannot be matched in the first layer or the like can be adjusted by the diffusion layer 74 as the second layer and the gate layer 75.
For example, in a case where the diffusion layer 74 and the gate layer 75 are not provided, the individual TG-FD capacitance of each of the transfer gates TG1 and TG2 is larger than that of the other transfer gates TG0 and TG3 (see
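The balancing just described reduces to simple arithmetic: each adjustment layer contributes the shortfall between a gate's intrinsic TG-FD capacitance and the largest capacitance in the group. A minimal sketch, with all capacitance values invented purely for illustration:

```python
# Hypothetical intrinsic TG-FD capacitances (fF): TG1 and TG2 are assumed to
# couple more strongly to the floating diffusion than TG0 and TG3 do.
intrinsic = {"TG0": 0.75, "TG1": 1.00, "TG2": 1.00, "TG3": 0.75}

# Target every gate at the largest intrinsic capacitance in the group.
target = max(intrinsic.values())

# Capacitance each adjustment layer must add; the C0 and C3 of the text
# correspond to the nonzero entries.
adjustment = {tg: target - c for tg, c in intrinsic.items()}

total = {tg: intrinsic[tg] + adjustment[tg] for tg in intrinsic}
assert all(v == target for v in total.values())  # TG-FD capacitances uniform
print(adjustment)
```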
Note that the floating diffusion FD (or the diffusion layer 74) and the gate layer 75 may be arranged so as to overlap each other, to be flush with each other, or to be separated from each other in plan view as viewed from the light incident surface.
A modification of the layer structure of the solid-state imaging element 1 according to the first embodiment will be described with reference to
As illustrated in
Even in such a layer structure, the diffusion layer 74 and the gate layer 75 function as share contacts and adjust the TG-FD capacitance. That is, the TG-FD capacitance that cannot be matched in the first layer or the like can be adjusted by the diffusion layer 74 as the second layer and the gate layer 75.
A method of manufacturing the solid-state imaging element 1 according to the first embodiment will be described with reference to
As illustrated in an upper left part of
Next, as illustrated in an upper center of
Next, as illustrated in an upper right part of
Furthermore, as illustrated in a lower left part of
Next, as illustrated in a lower center of
Next, a barrier metal, a metal film, and the like are formed in the through hole CH by a CVD method, a physical vapor deposition (PVD) method, an atomic layer deposition (ALD) method, a plating method, or the like so as to fill the through hole CH. Furthermore, excess metal film and the like protruding from the through hole CH are removed by chemical mechanical polishing (CMP), dry etching, or the like.
Thus, the first through wire 71 and the second through wire 72 are formed as illustrated in a lower right part of
As described above, according to the first embodiment, the diffusion layer 74 is provided in the second semiconductor substrate 21 so as to be in contact with the first through wire 71, and the gate layer 75 is provided on the second semiconductor substrate 21 so as to be in contact with the second through wire 72 without being in contact with the diffusion layer 74 (see
Note that the diffusion layer 74 may be formed by a material different from or the same as the material of the floating diffusion FD. In addition, the gate layer 75 may be formed by a material different from or the same as that of the transfer gate TG (for example, polysilicon). When the same material is used, preparation of the material is facilitated and the cost is reduced.
An example of a layer structure of a solid-state imaging element 1 according to a second embodiment will be described with reference to
Note that a plan view illustrating an example of a schematic configuration of the first layer of the solid-state imaging element 1 according to the second embodiment is the same as the plan view illustrated in
As illustrated in
As illustrated in
According to such a layer structure, the diffusion layer (first diffusion layer) 74 and the diffusion layer (second diffusion layer) 76 function as share contacts and adjust the TG-FD capacitance. That is, the TG-FD capacitance that cannot be matched in the first layer or the like can be adjusted by the diffusion layer 74 as the second layer and the diffusion layer 76.
For example, in a case where the diffusion layer 74 or the diffusion layer 76 is not provided, the individual TG-FD capacitance of each of the transfer gates TG1 and TG2 is larger than that of the other transfer gates TG0 and TG3. Accordingly, by providing the diffusion layer 74 and the diffusion layer 76 for the transfer gates TG0 and TG3, respectively, the individual TG-FD capacitance of each of the transfer gates TG0 and TG3 can be increased to be close to the individual TG-FD capacitance of each of the transfer gates TG1 and TG2. Thus, the individual TG-FD capacitances of the transfer gates TG0 to TG3 can be made uniform.
A modification of the layer structure of the solid-state imaging element 1 according to the second embodiment will be described with reference to
As illustrated in
Even in such a layer structure, the diffusion layer 76 functions as a share contact and adjusts the TG-FD capacitance. That is, the TG-FD capacitance that cannot be matched in the first layer or the like can be adjusted by the diffusion layer 76 as the second layer.
As described above, according to the second embodiment, the same effects as those of the first embodiment can be obtained. That is, the diffusion layer 74 is provided in the second semiconductor substrate 21 so as to be in contact with the first through wire 71, and the diffusion layer 76 is provided in the second semiconductor substrate 21 so as to be in contact with the second through wire 72 without being in contact with the diffusion layer 74 (see
In addition, the diffusion layer 74 is not provided, and the diffusion layer 76 is provided in the second semiconductor substrate 21 so as to be in contact with the second through wire 72 (see
Note that the diffusion layer 74 and the diffusion layer 76 may be formed by different materials or the same material. When the same material is used, preparation of the material is facilitated and the cost is reduced; moreover, the diffusion layer 74 and the diffusion layer 76 can be formed in the same process, so that the number of manufacturing processes can be reduced. As the material, for example, the same material as the floating diffusion FD can be used. That is, both or one of the diffusion layer 74 and the diffusion layer 76 may be formed by the same material as the floating diffusion FD.
An example of a layer structure of a solid-state imaging element 1 according to a third embodiment will be described with reference to
As illustrated in
Note that a cross-sectional view of the first and second layers of the solid-state imaging element 1 according to the third embodiment taken along line E-E in
With such a layer structure, similarly to the first embodiment, the diffusion layer 74 and the gate layer 75 function as share contacts and adjust the TG-FD capacitance. That is, the TG-FD capacitance that cannot be matched in the first layer or the like can be adjusted by the diffusion layer 74 as the second layer and the gate layer 75.
For example, in a case where the diffusion layer 74 and the gate layer 75 are not provided and the wiring for the floating diffusion FD is routed as in the example of
As described above, according to the third embodiment, the same effects as those of the first embodiment can be obtained. That is, instead of adjusting the TG-FD capacitance by optimizing the wiring or increasing the wiring layer, the TG-FD capacitance can be adjusted only by providing the diffusion layer 74 and the gate layer 75 on the second semiconductor substrate 21 as the second layer. Therefore, adjustment of the TG-FD capacitance can be facilitated while suppressing an increase in the number of wiring layers.
An example of a layer structure of a solid-state imaging element 1 according to a fourth embodiment will be described with reference to
As illustrated in
Note that a plan view illustrating an example of a schematic configuration of the first layer of the solid-state imaging element 1 according to the fourth embodiment is the same as the plan view illustrated in
With such a layer structure, similarly to the first embodiment, the diffusion layer 74 and the gate layer 75 function as share contacts and adjust the TG-FD capacitance. That is, the TG-FD capacitance that cannot be matched in the first layer or the like can be adjusted by the diffusion layer 74 as the second layer and the gate layer 75.
For example, in a case where the diffusion layer 74 and the gate layer 75 are not provided and the wiring for the floating diffusion FD is routed as in the example of
As described above, according to the fourth embodiment, the same effects as those of the third embodiment can be obtained. That is, instead of adjusting the TG-FD capacitance by optimizing the wiring or increasing the wiring layer, the TG-FD capacitance can be adjusted only by providing the diffusion layer 74 and the gate layer 75 on the second semiconductor substrate 21 as the second layer. Therefore, adjustment of the TG-FD capacitance can be facilitated while suppressing an increase in the number of wiring layers. Further, the total FD capacitance can be optimized by changing the ratio between the FD base capacitance and the FD wiring capacitance.
The processing according to the above embodiment may be performed in various different forms (modifications) other than the above embodiment. For example, the configuration is not limited to the above-described example, and may be various modes. Further, for example, the configuration, the processing procedure, the specific name, and the information including various data and parameters illustrated in the document or the drawings can be arbitrarily changed unless otherwise specified.
Further, each component of each device illustrated in the drawings is functionally conceptual, and is not necessarily physically configured as illustrated in the drawings. That is, a specific form of distribution and integration of each device is not limited to the illustrated form, and all or a part thereof can be functionally or physically distributed and integrated in any unit according to various loads, usage conditions, and the like.
In the above embodiments and modifications, the conductivity types may be reversed. For example, in the description of each embodiment and each modification, the p-type may be replaced with the n-type, and the n-type may be replaced with the p-type. Even in this case, similar effects to those of respective embodiments and respective modifications can be obtained.
In addition, in the above-described embodiments and respective modifications, a deep trench isolation (DTI) type isolation section, in which a trench is formed from the back surface to the middle of the first semiconductor substrate 11, has been exemplified as the element isolation section 43. However, the element isolation section is not limited thereto. For example, an isolation section penetrating the first semiconductor substrate 11 (full trench) and electrically completely isolating two or more adjacent sensor pixels 12 may be used.
Further, the solid-state imaging element 1 according to each of the above-described embodiments and modifications can be applied not only as a visible light receiving element but also as an element capable of detecting various types of radiation such as infrared rays, ultraviolet rays, X-rays, and electromagnetic waves. The present technology can also be applied to various applications such as distance measurement, detection of a change in light amount, and detection of physical properties, in addition to image output.
The solid-state imaging element 1 according to each of the above embodiments and modifications is applied to various electronic devices. Examples of the electronic device include electronic devices having an imaging function, such as a digital still camera, a video camera, a smartphone, a tablet terminal, a mobile phone, a personal digital assistant (PDA), a notebook personal computer (PC), and a desktop PC.
An example of an imaging device 300 will be described with reference to
As illustrated in
The optical system 301 includes one or more lenses. The optical system 301 guides light (incident light) from a subject to the imaging element 303 and forms an image on a light receiving surface of the imaging element 303.
The shutter device 302 is disposed between the optical system 301 and the imaging element 303. The shutter device 302 controls a light irradiation period and a light shielding period with respect to the imaging element 303 under the control of the control circuit 304.
The imaging element 303 accumulates signal charge for a certain period according to light formed on the light receiving surface via the optical system 301 and the shutter device 302. The signal charge accumulated in the imaging element 303 is transferred in accordance with a drive signal (timing signal) supplied from the control circuit 304.
The control circuit 304 outputs a drive signal for controlling a transfer operation of the imaging element 303 and a shutter operation of the shutter device 302 to drive the imaging element 303 and the shutter device 302.
The signal processing circuit 305 performs various types of signal processing on the signal charge output from the imaging element 303. An image (image data) obtained by performing the signal processing by the signal processing circuit 305 is supplied to the monitor 306 and is also supplied to the memory 307.
The monitor 306 displays a moving image or a still image captured by the imaging element 303 on the basis of the image data supplied from the signal processing circuit 305. As the monitor 306, for example, a panel type display device such as a liquid crystal panel or an organic electroluminescence (EL) panel is used.
The memory 307 stores the image data supplied from the signal processing circuit 305, that is, image data of a moving image or a still image captured by the imaging element 303. As the memory 307, for example, a recording medium such as a semiconductor memory or a hard disk is used.
Also in the imaging device 300 configured as described above, by using any one of the solid-state imaging elements 1 according to the above-described embodiments and modifications as the imaging element 303, it is possible to facilitate adjustment of the TG-FD capacitance while suppressing an increase in the number of wiring layers.
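The signal path of the imaging device 300 described above (optical system, shutter device, imaging element, signal processing circuit, monitor, and memory) can be sketched as a minimal data flow. All class and method names below are hypothetical stand-ins for illustration, not part of the disclosed configuration; the "processing" is reduced to normalization as a placeholder.

```python
# Minimal sketch of the imaging device 300 data flow; names are illustrative only.

class ImagingPipeline:
    def __init__(self):
        self.memory = []      # stands in for the memory 307
        self.monitor = None   # last frame shown, stands in for the monitor 306

    def expose(self, scene, exposure_frames):
        """Accumulate signal charge over the exposure period (imaging element 303)."""
        charge = [0.0] * len(scene)
        for _ in range(exposure_frames):  # shutter device 302 held open
            charge = [c + light for c, light in zip(charge, scene)]
        return charge

    def process(self, charge):
        """Signal processing (circuit 305): normalization as a stand-in step."""
        peak = max(charge) or 1.0
        return [c / peak for c in charge]

    def capture(self, scene, exposure_frames=4):
        """Transfer per the drive signal, then supply to monitor and memory."""
        image = self.process(self.expose(scene, exposure_frames))
        self.monitor = image
        self.memory.append(image)
        return image

pipeline = ImagingPipeline()
frame = pipeline.capture([0.1, 0.5, 0.9])
```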
A technology according to the present disclosure is applicable to various products. For example, the technology according to the present disclosure may be implemented as a device mounted on any type of mobile body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a boat, a robot, a construction machine, an agricultural machine (tractor), and the like. In addition, for example, the technology according to the present disclosure may be applied to an endoscopic surgery system, a microscopic surgery system, or the like.
Each of the control units includes: a microcomputer that performs arithmetic processing according to various kinds of programs; a storage section that stores the programs executed by the microcomputer, parameters used for various kinds of operations, or the like; and a driving circuit that drives various kinds of control target devices. Each of the control units further includes: a network interface (I/F) for performing communication with other control units via the communication network 7010; and a communication I/F for performing communication with a device, a sensor, or the like inside and outside the vehicle by wired or wireless communication. A functional configuration of the integrated control unit 7600 illustrated in
The driving system control unit 7100 controls the operation of devices related to the driving system of the vehicle in accordance with various kinds of programs. For example, the driving system control unit 7100 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine, a driving motor, or the like, a driving force transmitting mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like. The driving system control unit 7100 may have a function as a control device of an antilock brake system (ABS), electronic stability control (ESC), or the like.
The driving system control unit 7100 is connected with a vehicle state detecting section 7110. The vehicle state detecting section 7110, for example, includes at least one of a gyro sensor that detects the angular velocity of axial rotational movement of a vehicle body, an acceleration sensor that detects the acceleration of the vehicle, and sensors for detecting an amount of operation of an accelerator pedal, an amount of operation of a brake pedal, the steering angle of a steering wheel, an engine speed or the rotational speed of wheels, and the like. The driving system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detecting section 7110, and controls the internal combustion engine, the driving motor, an electric power steering device, the brake device, and the like.
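As a hedged illustration of the arithmetic processing described above, the sketch below combines two of the listed sensor signals (the rotational speed of a wheel and the amount of operation of the brake pedal) into an ABS-like brake command. The function name, threshold, and scaling factor are invented for illustration and do not come from the disclosure.

```python
# Hypothetical sketch of the arithmetic step in the driving system control
# unit 7100: all thresholds and names are illustrative assumptions.

def brake_command(wheel_speed_rpm, brake_pedal_ratio, wheel_lock_threshold=5.0):
    """Reduce brake force when a wheel is nearly locked (ABS-like behavior)."""
    if brake_pedal_ratio > 0.0 and wheel_speed_rpm < wheel_lock_threshold:
        return brake_pedal_ratio * 0.5  # release pressure so the wheel spins up
    return brake_pedal_ratio            # otherwise pass the pedal input through

abs_cmd = brake_command(wheel_speed_rpm=2.0, brake_pedal_ratio=0.8)    # → 0.4
normal  = brake_command(wheel_speed_rpm=100.0, brake_pedal_ratio=0.8)  # → 0.8
```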
The body system control unit 7200 controls the operation of various kinds of devices provided to the vehicle body in accordance with various kinds of programs. For example, the body system control unit 7200 functions as a control device for a keyless entry system, a smart key system, a power window device, or various kinds of lamps such as a headlamp, a backup lamp, a brake lamp, a turn signal, a fog lamp, or the like. In this case, radio waves transmitted from a mobile device as an alternative to a key or signals of various kinds of switches can be input to the body system control unit 7200. The body system control unit 7200 receives these input radio waves or signals, and controls a door lock device, the power window device, the lamps, or the like of the vehicle.
The battery control unit 7300 controls a secondary battery 7310, which is a power supply source for the driving motor, in accordance with various kinds of programs. For example, the battery control unit 7300 is supplied with information about a battery temperature, a battery output voltage, an amount of charge remaining in the battery, or the like from a battery device including the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals, and performs control for regulating the temperature of the secondary battery 7310 or controls a cooling device provided to the battery device or the like.
The outside-vehicle information detecting unit 7400 detects information about the outside of the vehicle including the vehicle control system 7000. For example, the outside-vehicle information detecting unit 7400 is connected with at least one of an imaging section 7410 and an outside-vehicle information detecting section 7420. The imaging section 7410 includes at least one of a time-of-flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras. The outside-vehicle information detecting section 7420, for example, includes at least one of an environmental sensor for detecting current atmospheric conditions or weather conditions and a peripheral information detecting sensor for detecting another vehicle, an obstacle, a pedestrian, or the like on the periphery of the vehicle including the vehicle control system 7000.
The environmental sensor, for example, may be at least one of a raindrop sensor detecting rain, a fog sensor detecting fog, a sunshine sensor detecting a degree of sunshine, and a snow sensor detecting snowfall. The peripheral information detecting sensor may be at least one of an ultrasonic sensor, a radar device, and a LIDAR (light detection and ranging, or laser imaging detection and ranging) device. Each of the imaging section 7410 and the outside-vehicle information detecting section 7420 may be provided as an independent sensor or device, or may be provided as a device in which a plurality of sensors or devices are integrated.
Incidentally,
Outside-vehicle information detecting sections 7920, 7922, 7924, 7926, 7928, and 7930 provided to the front, rear, sides, and corners of the vehicle 7900 and the upper portion of the windshield within the interior of the vehicle may be, for example, ultrasonic sensors or radar devices. The outside-vehicle information detecting sections 7920, 7926, and 7930 provided to the front nose, the rear bumper, the back door of the vehicle 7900, and the upper portion of the windshield within the interior of the vehicle may be LIDAR devices, for example. These outside-vehicle information detecting sections 7920 to 7930 are used mainly to detect a preceding vehicle, a pedestrian, an obstacle, or the like.
Returning to
In addition, on the basis of the received image data, the outside-vehicle information detecting unit 7400 may perform image recognition processing of recognizing a human, a vehicle, an obstacle, a sign, a character on a road surface, or the like, or processing of detecting a distance thereto. The outside-vehicle information detecting unit 7400 may subject the received image data to processing such as distortion correction or alignment, and combine the image data imaged by a plurality of different imaging sections 7410 to generate a bird's-eye image or a panoramic image. The outside-vehicle information detecting unit 7400 may perform viewpoint conversion processing using the image data captured by the plurality of different imaging sections 7410.
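The combining step described above can be sketched as a toy example. The frame representation (lists of pixel rows), the function name, and the fixed overlap width are illustrative assumptions; a real system would first apply the distortion correction and alignment mentioned above.

```python
# Toy sketch of combining frames from several imaging sections into one
# panoramic row; the overlap width between adjacent cameras is assumed known.

def panorama(frames, overlap=1):
    """Concatenate same-height frames left to right, dropping `overlap`
    columns from the left edge of every frame after the first."""
    rows = len(frames[0])
    out = [list(frames[0][r]) for r in range(rows)]
    for frame in frames[1:]:
        for r in range(rows):
            out[r].extend(frame[r][overlap:])  # skip the shared columns
    return out

left  = [[1, 2, 3], [1, 2, 3]]
right = [[3, 4, 5], [3, 4, 5]]
wide = panorama([left, right])  # → [[1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]
```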
The in-vehicle information detecting unit 7500 detects information about the inside of the vehicle. The in-vehicle information detecting unit 7500 is, for example, connected with a driver state detecting section 7510 that detects the state of a driver. The driver state detecting section 7510 may include a camera that images the driver, a biosensor that detects biological information of the driver, a microphone that collects sound within the interior of the vehicle, or the like. The biosensor is, for example, disposed in a seat surface, the steering wheel, or the like, and detects biological information of an occupant sitting in a seat or the driver holding the steering wheel. On the basis of detection information input from the driver state detecting section 7510, the in-vehicle information detecting unit 7500 may calculate a degree of fatigue of the driver or a degree of concentration of the driver, or may determine whether the driver is dozing. The in-vehicle information detecting unit 7500 may subject an audio signal obtained by the collection of the sound to processing such as noise canceling processing or the like.
The integrated control unit 7600 controls general operation within the vehicle control system 7000 in accordance with various kinds of programs. The integrated control unit 7600 is connected with an input section 7800. The input section 7800 is implemented by a device capable of input operation by an occupant, such as, for example, a touch panel, a button, a microphone, a switch, or a lever. The integrated control unit 7600 may be supplied with data obtained by voice recognition of voice input through the microphone. The input section 7800 may, for example, be a remote control device using infrared rays or other radio waves, or an external connecting device such as a mobile telephone or a personal digital assistant (PDA) that supports operation of the vehicle control system 7000. The input section 7800 may be, for example, a camera. In that case, an occupant can input information by gesture. Alternatively, data obtained by detecting the movement of a wearable device worn by an occupant may be input. Further, the input section 7800 may, for example, include an input control circuit or the like that generates an input signal on the basis of information input by an occupant or the like using the above-described input section 7800, and outputs the generated input signal to the integrated control unit 7600. An occupant or the like inputs various kinds of data or gives an instruction for processing operation to the vehicle control system 7000 by operating the input section 7800.
The storage section 7690 may include a read only memory (ROM) that stores various kinds of programs executed by the microcomputer and a random access memory (RAM) that stores various kinds of parameters, operation results, sensor values, or the like. In addition, the storage section 7690 may be implemented by a magnetic storage device such as a hard disc drive (HDD) or the like, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
The general-purpose communication I/F 7620 is a widely used communication I/F that mediates communication with various apparatuses present in an external environment 7750. The general-purpose communication I/F 7620 may implement a cellular communication protocol such as global system for mobile communications (GSM (registered trademark)), worldwide interoperability for microwave access (WiMAX (registered trademark)), long term evolution (LTE (registered trademark)), or LTE-advanced (LTE-A), or another wireless communication protocol such as wireless LAN (also referred to as wireless fidelity (Wi-Fi (registered trademark))), Bluetooth (registered trademark), or the like. The general-purpose communication I/F 7620 may, for example, connect to an apparatus (for example, an application server or a control server) present on an external network (for example, the Internet, a cloud network, or a company-specific network) via a base station or an access point. In addition, the general-purpose communication I/F 7620 may connect to a terminal present in the vicinity of the vehicle (for example, a terminal of the driver, a pedestrian, or a store, or a machine type communication (MTC) terminal) using a peer-to-peer (P2P) technology, for example.
The dedicated communication I/F 7630 is a communication I/F that supports a communication protocol developed for use in vehicles. The dedicated communication I/F 7630 may implement a standard protocol such as, for example, wireless access in vehicle environment (WAVE), which is a combination of Institute of Electrical and Electronics Engineers (IEEE) 802.11p as a lower layer and IEEE 1609 as a higher layer, dedicated short range communications (DSRC), or a cellular communication protocol. The dedicated communication I/F 7630 typically carries out V2X communication as a concept including one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.
The positioning section 7640, for example, performs positioning by receiving a global navigation satellite system (GNSS) signal from a GNSS satellite (for example, a GPS signal from a global positioning system (GPS) satellite), and generates positional information including the latitude, longitude, and altitude of the vehicle. Incidentally, the positioning section 7640 may identify a current position by exchanging signals with a wireless access point, or may obtain the positional information from a terminal such as a mobile telephone, a personal handyphone system (PHS), or a smartphone that has a positioning function.
The beacon receiving section 7650, for example, receives a radio wave or an electromagnetic wave transmitted from a radio station installed on a road or the like, and thereby obtains information about the current position, congestion, a closed road, a necessary time, or the like. Incidentally, the function of the beacon receiving section 7650 may be included in the dedicated communication I/F 7630 described above.
The in-vehicle device I/F 7660 is a communication interface that mediates connection between the microcomputer 7610 and various in-vehicle devices 7760 present within the vehicle. The in-vehicle device I/F 7660 may establish wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), near field communication (NFC), or wireless universal serial bus (WUSB). In addition, the in-vehicle device I/F 7660 may establish wired connection by universal serial bus (USB), high-definition multimedia interface (HDMI (registered trademark)), mobile high-definition link (MHL), or the like via a connection terminal (and a cable if necessary) not depicted in the figures. The in-vehicle devices 7760 may, for example, include at least one of a mobile device and a wearable device possessed by an occupant and an information device carried into or attached to the vehicle. The in-vehicle devices 7760 may also include a navigation device that searches for a path to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
The vehicle-mounted network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The vehicle-mounted network I/F 7680 transmits and receives signals or the like in conformity with a predetermined protocol supported by the communication network 7010.
The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 in accordance with various kinds of programs on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning section 7640, the beacon receiving section 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680. For example, the microcomputer 7610 may calculate a control target value for the driving force generating device, the steering mechanism, or the braking device on the basis of the obtained information about the inside and outside of the vehicle, and output a control command to the driving system control unit 7100. For example, the microcomputer 7610 may perform cooperative control intended to implement functions of an advanced driver assistance system (ADAS), which include collision avoidance or shock mitigation for the vehicle, following driving based on a following distance, vehicle speed maintaining driving, a warning of collision of the vehicle, a warning of deviation of the vehicle from a lane, and the like. In addition, the microcomputer 7610 may perform cooperative control intended for automated driving, which makes the vehicle travel autonomously without depending on the operation of the driver, by controlling the driving force generating device, the steering mechanism, the braking device, or the like on the basis of the obtained information about the surroundings of the vehicle.
The microcomputer 7610 may generate three-dimensional distance information between the vehicle and an object such as a surrounding structure, a person, or the like, and generate local map information including information about the surroundings of the current position of the vehicle, on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning section 7640, the beacon receiving section 7650, the in-vehicle device I/F 7660, and the vehicle-mounted network I/F 7680. In addition, the microcomputer 7610 may predict danger such as collision of the vehicle, approaching of a pedestrian or the like, an entry to a closed road, or the like on the basis of the obtained information, and generate a warning signal. The warning signal may, for example, be a signal for producing a warning sound or lighting a warning lamp.
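The danger-prediction step described above can be sketched as a time-to-collision check on the three-dimensional distance information, emitting a warning signal when the estimated time to impact falls below a threshold. The function name and the 2-second threshold are assumptions for illustration, not values from the disclosure.

```python
# Hedged sketch of collision danger prediction from distance information.
# The threshold and names are illustrative assumptions only.

def collision_warning(distance_m, closing_speed_mps, ttc_threshold_s=2.0):
    """Return True (sound the warning / light the warning lamp) when the
    estimated time to collision drops below the threshold."""
    if closing_speed_mps <= 0.0:  # object is not approaching
        return False
    time_to_collision = distance_m / closing_speed_mps
    return time_to_collision < ttc_threshold_s

collision_warning(30.0, 20.0)   # 1.5 s to impact → warn
collision_warning(100.0, 20.0)  # 5 s to impact → no warning
```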
The sound/image output section 7670 transmits an output signal of at least one of a sound and an image to an output device capable of visually or audibly presenting information to an occupant of the vehicle or to the outside of the vehicle. In the example of
Incidentally, at least two control units connected to each other via the communication network 7010 in the example depicted in
Note that a computer program for implementing each function of the imaging device 300 according to the application example can be mounted on any control unit or the like. Furthermore, it is also possible to provide a computer-readable recording medium storing such a computer program. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like. In addition, the computer program described above may be distributed via, for example, a network without using the recording medium.
In the vehicle control system 7000 described above, the imaging device 300 according to the application example can be applied to the integrated control unit 7600 of the application example. For example, the control circuit 304, the signal processing circuit 305, and the memory 307 of the imaging device 300 may be implemented by the microcomputer 7610 or the storage section 7690 of the integrated control unit 7600. Furthermore, the solid-state imaging element 1 according to each of the above-described embodiments and modifications and the imaging device 300 according to the application example can be applied to the imaging section 7410 and the outside-vehicle information detecting section 7420 according to the application example, for example, the imaging sections 7910, 7912, 7914, 7916, and 7918 and the outside-vehicle information detecting sections 7920 to 7930, and the like according to the application example. By using any one of the solid-state imaging element 1 according to each of the above-described embodiments and modifications and the imaging device 300 according to the application example, even in the vehicle control system 7000, adjustment of the TG-FD capacitance can be facilitated while suppressing an increase in the number of wiring layers.
In addition, at least some components of the imaging device 300 according to the application example may be implemented in a module (for example, an integrated circuit module including one die) for the integrated control unit 7600 of the application example. Alternatively, a part of the imaging device 300 according to the application example may be implemented by a plurality of control units of the vehicle control system 7000.
A technology according to the present disclosure is applicable to various products. For example, the technology according to the present disclosure may be applied to an operating room system.
Various devices can be installed in the operating room.
The group of devices 5101, the ceiling camera 5187, the operating field camera 5189, and the display devices 5103A to 5103C are connected to the IF controller 5109 via IP converters 5115A to 5115F (hereinafter, denoted by reference numeral 5115 when not individually distinguished). The IP converters 5115D, 5115E, and 5115F on video source sides (camera sides) perform IP conversion on videos from individual medical image capturing devices (such as an endoscope, an operation microscope, an X-ray imaging device, an operating field camera, and a pathological image capturing device), and transmit the results on the network. The IP converters 5115A to 5115C on video output sides (monitor sides) convert the videos transmitted through the network into monitor-unique formats, and output the results. The IP converters on the video source sides function as encoders, and the IP converters on the video output sides function as decoders. The IP converters 5115 may have various image processing functions, and may have functions of, for example, resolution conversion processing corresponding to output destinations, rotation correction and image stabilization of an endoscopic video, and object recognition processing. The image processing functions may also include partial processing such as feature information extraction for analysis on a server described later. These image processing functions may be specific to the connected medical image devices, or may be upgradable from outside. The IP converters on the display sides can perform processing such as synthesis of a plurality of videos (for example, picture-in-picture (PinP) processing) and superimposition of annotation information. The protocol conversion function of each of the IP converters is a function to convert a received signal into a converted signal conforming to a communication protocol allowing the signal to be transmitted on the network (such as the Internet).
Any communication protocol may be set as the communication protocol. The signal received by the IP converter and convertible in terms of protocol is a digital signal, and is, for example, a video signal or a pixel signal. The IP converter may be incorporated in a video source side device or in a video output side device.
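The encoder/decoder roles of the IP converters described above can be sketched as follows. The packet layout, the JSON serialization, and the function names are invented purely for illustration; the actual converters operate on video signals with formats and protocols not specified here.

```python
# Minimal sketch of the IP converter roles: a source-side converter wraps a
# raw frame into a network packet (encoder) and an output-side converter
# unwraps it into a monitor-ready row layout (decoder). The packet format
# below is an illustrative assumption, not the system's real format.
import json

def ip_encode(frame_pixels, source_id):
    """Source-side converter: serialize a frame for transmission on the IP network."""
    return json.dumps({"source": source_id, "pixels": frame_pixels})

def ip_decode(packet, width):
    """Output-side converter: restore rows in a monitor-unique layout."""
    data = json.loads(packet)
    px = data["pixels"]
    rows = [px[i:i + width] for i in range(0, len(px), width)]
    return data["source"], rows

packet = ip_encode([1, 2, 3, 4], source_id="endoscope")
source, rows = ip_decode(packet, width=2)  # → ("endoscope", [[1, 2], [3, 4]])
```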
The group of devices 5101 belong to, for example, an endoscopic surgery system, and include, for example, the endoscope and a display device for displaying an image captured by the endoscope. The display devices 5103A to 5103D, the patient bed 5183, and the light 5191 are, for example, devices installed in the operating room separately from the endoscopic surgery system. Each of these devices used for surgery or diagnosis is also called a medical device. The OR controller 5107 and/or the IF controller 5109 controls operations of the medical devices in cooperation. When an endoscopic surgery robot (surgery master-slave) system and medical image acquisition devices such as an X-ray imaging device are included in the operating room, those devices can also be connected as the group of devices 5101 in the same manner.
The OR controller 5107 controls processing related to image display in the medical devices in an integrated manner. Specifically, the group of devices 5101, the ceiling camera 5187, and the operating field camera 5189 among the devices included in the operating room system 5100 can each be a device having a function to transmit (hereinafter, also called a transmission source device) information to be displayed (hereinafter, also called display information) during the operation. The display devices 5103A to 5103D can each be a device to output the display information (hereinafter, also called an output destination device). The OR controller 5107 has a function to control operations of the transmission source devices and the output destination devices so as to acquire the display information from the transmission source devices and transmit the display information to the output destination devices to cause the output destination devices to display or record the display information. The display information refers to, for example, various images captured during the operation and various types of information on the operation (for example, body information and past examination results of a patient and information about a surgical procedure).
Specifically, information about an image of a surgical site in a body cavity of the patient captured by the endoscope can be transmitted as the display information from the group of devices 5101 to the OR controller 5107. Information about an image of the area near the hands of the operator captured by the ceiling camera 5187 can be transmitted as the display information from the ceiling camera 5187. Information about an image representing the overall situation in the operating room captured by the operating field camera 5189 can be transmitted as the display information from the operating field camera 5189. When another device having an imaging function is present in the operating room system 5100, the OR controller 5107 may also acquire information about an image captured by the other device as the display information from the other device.
The OR controller 5107 displays the acquired display information (that is, the images captured during the operation and the various types of information on the operation) on at least one of the display devices 5103A to 5103D serving as the output destination devices. In the illustrated example, the display device 5103A is a display device installed on the ceiling of the operating room, being hung therefrom; the display device 5103B is a display device installed on a wall surface of the operating room; the display device 5103C is a display device installed on a desk in the operating room; and the display device 5103D is a mobile device (such as a tablet personal computer (PC)) having a display function.
The IF controller 5109 controls input and output of the video signal from and to connected devices. For example, the IF controller 5109 controls input and output of the video signal under the control of the OR controller 5107. The IF controller 5109 includes, for example, an IP switcher, and controls high-speed transfer of the image (video) signal between devices disposed on the IP network.
The operating room system 5100 may include a device outside the operating room. The device outside the operating room can be a server connected to a network built in and outside a hospital, a PC used by a medical staff, or a projector installed in a meeting room of the hospital. When such an external device is present outside the hospital, the OR controller 5107 can also display the display information on a display device of another hospital via, for example, a teleconference system for telemedicine.
An external server 5113 is, for example, an in-hospital server or a cloud server outside the operating room, and may be used for, for example, image analysis and/or data analysis. In this case, the video information in the operating room may be transmitted to the external server 5113, and the server may generate additional information through big data analysis or recognition/analysis processing using artificial intelligence (AI) (machine learning), and feed the additional information back to the display devices in the operating room. At this time, an IP converter 5115H connected to the video devices in the operating room transmits data to the external server 5113, so that the video is analyzed. The transmitted data may be, for example, a video itself of the operation using the endoscope or other tools, metadata extracted from the video, and/or data indicating an operating status of the connected devices.
The operating room system 5100 is further provided with a central operation panel 5111. Through the central operation panel 5111, a user can give the OR controller 5107 an instruction about input/output control of the IF controller 5109 and an instruction about an operation of the connected devices. For example, the user can switch the displayed image through the central operation panel 5111. The central operation panel 5111 is configured by providing a touchscreen on the display surface of a display device. The central operation panel 5111 may be connected to the IF controller 5109 via an IP converter 5115J.
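The control hierarchy described above (panel instruction, then OR controller, then IF controller input/output control) can be sketched, purely as an illustrative assumption about the layering and not as the disclosed implementation, as follows; every class and identifier here is hypothetical.

```python
# Hypothetical sketch of the control flow described above: the central
# operation panel passes a user instruction to the OR controller, which
# performs input/output control of the IF controller to switch which
# source a given display device shows.

class IFController:
    """Holds the current source assignment for each display device."""
    def __init__(self):
        self.assignments = {}  # display device id -> video source id

    def set_input(self, display, source):
        self.assignments[display] = source


class ORController:
    """Receives instructions (e.g. from the central operation panel)
    and directs the IF controller accordingly."""
    def __init__(self, if_controller):
        self.if_controller = if_controller

    def handle_instruction(self, display, source):
        # Forward the panel's instruction as input/output control.
        self.if_controller.set_input(display, source)


if_ctrl = IFController()
or_ctrl = ORController(if_ctrl)
# The user switches the wall display to the endoscope video via the panel.
or_ctrl.handle_instruction("display_5103B", "endoscope")
print(if_ctrl.assignments["display_5103B"])  # endoscope
```

The point of the layering is that the panel never talks to the IF controller directly; the OR controller mediates every switching instruction, matching the description above.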
The IP network may be established using a wired network, or a part or the whole of the network may be established using a wireless network. For example, each of the IP converters on the video source sides may have a wireless communication function, and may transmit the received image to an output side IP converter via a wireless communication network, such as the fifth-generation mobile communication system (5G) or the sixth-generation mobile communication system (6G).
The technology according to the present disclosure can be suitably applied to the ceiling camera 5187 and the operating field camera 5189 among the configurations described above. Specifically, the solid-state imaging element 1 according to each of the above embodiments and modifications, or the imaging device 300 according to the application example, can be applied to the ceiling camera 5187, the operating field camera 5189, and the like. By using either of these, even in the operating room system 5100, it is possible to facilitate adjustment of the TG-FD capacitance while suppressing an increase in the number of wiring layers.
Note that the present technology can also have the following configurations.
(1) A solid-state imaging element, comprising:
(2) The solid-state imaging element according to (1), wherein
(3) The solid-state imaging element according to (1), wherein
(4) The solid-state imaging element according to (1), wherein
(5) The solid-state imaging element according to (2), wherein
(6) The solid-state imaging element according to (2), wherein
(7) The solid-state imaging element according to (3), wherein
(8) The solid-state imaging element according to (3), wherein
(9) The solid-state imaging element according to (4), wherein
(10) An electronic device comprising the solid-state imaging element according to any one of (1) to (9).
Number | Date | Country | Kind
---|---|---|---
2020-203494 | Dec 2020 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2021/043102 | 11/25/2021 | WO |