The present disclosure relates to a semiconductor device, a method of manufacturing the semiconductor device, and an imaging element.
The Applicant of the present application has proposed a semiconductor device having a three-dimensional structure in which, for example, a sensor board (first board) and a circuit board (second board) are bonded together and stacked to achieve higher integration of an imaging unit (see, for example, PTL 1). The sensor board (first board) includes a photoelectric conversion element. The circuit board (second board) includes a peripheral circuit. In PTL 1, a first electrode of the first board and a second electrode of the second board are disposed to be opposed to each other with an insulating thin film interposed in between and are then heated to be joined together.
PTL 1: Japanese Unexamined Patent Application Publication No. 2013-73988
Incidentally, still higher integration is demanded of such a semiconductor device. It is thus desirable to provide a semiconductor device having a structure suitable for higher integration, a method of manufacturing the semiconductor device, and an imaging element.
A semiconductor device according to an embodiment of the present disclosure includes: a first semiconductor substrate; and a second semiconductor substrate. The first semiconductor substrate is provided with a first electrode including a first protruding portion and a first base portion. The first protruding portion includes a first abutting surface. The first base portion is linked to the first protruding portion and has a volume greater than the volume of the first protruding portion. The second semiconductor substrate is provided with a second electrode including a second protruding portion and a second base portion. The second protruding portion includes a second abutting surface that abuts the first abutting surface. The second base portion is linked to the second protruding portion and has a volume greater than the volume of the second protruding portion. The second semiconductor substrate is stacked on the first semiconductor substrate.
In the first electrode of the semiconductor device according to the embodiment of the present disclosure, the volume of the first base portion linked to the first protruding portion including the first abutting surface is greater than the volume of the first protruding portion. Likewise, in the second electrode, the volume of the second base portion linked to the second protruding portion including the second abutting surface is greater than the volume of the second protruding portion. Owing to the first base portion and the second base portion, the first abutting surface and the second abutting surface are therefore favorably joined together even in a case where the respective planarization processes partially recess the first abutting surface and the second abutting surface.
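As a purely illustrative numerical check (the dimensions below are hypothetical and are not taken from the embodiments), approximating each protruding portion as a narrow pillar and each base portion as a wider slab of the same thickness gives, for example:

$$
\begin{aligned}
V_{\text{protrusion}} &= 1\,\mu\text{m} \times 1\,\mu\text{m} \times 0.5\,\mu\text{m} = 0.5\,\mu\text{m}^3,\\
V_{\text{base}} &= 3\,\mu\text{m} \times 3\,\mu\text{m} \times 0.5\,\mu\text{m} = 4.5\,\mu\text{m}^3 = 9\,V_{\text{protrusion}}.
\end{aligned}
$$

Even with equal layer thicknesses, the larger area occupied by the base portion in the plane orthogonal to the stack direction is enough to make its volume dominate, which is the relationship the first electrode and the second electrode are described as satisfying.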
The following describes embodiments of the present disclosure in detail with reference to the drawings. It is to be noted that description is given in the following order.
The sensor board 10 is provided with a pixel region 3 in which the plurality of pixels 2 is regularly arrayed two-dimensionally. Each of the plurality of pixels 2 includes an imaging element including a photoelectric conversion section 51 (see
In addition, the circuit board 20 is provided with peripheral circuits such as a vertical driving circuit 6 for driving the respective pixels 2 provided to the sensor board 10, a column signal processing circuit 7, a horizontal driving circuit 8, and a system control circuit 9. A portion or all of these circuits may be optionally provided to the sensor board 10.
As illustrated in
The electrode layer 10A includes a first electrode 12 embedded in a first insulating film 11. There is provided a diffusion prevention film 13 on the border between the first insulating film 11 and the first electrode 12. The electrode layer 10A includes the junction surface CS1 opposed to the junction surface CS2 (described below) of the circuit board 20. The junction surface CS1 is planarized, for example, by chemical mechanical polishing (CMP).
The first electrode 12 is a coupling terminal that is coupled to a second electrode 22 described below to form paired coupling terminals. The first electrode 12 includes a first protruding portion 12A and a first base portion 12B. The first protruding portion 12A includes a first abutting surface 12AS exposed from the junction surface CS1. The first protruding portion 12A stands on the first base portion 12B. The first base portion 12B is linked to the first protruding portion 12A on the opposite side to the first abutting surface 12AS. The first base portion 12B has a volume greater than the volume of the first protruding portion 12A. More specifically, as illustrated in
The first insulating film 11 includes an insulating layer 11A and an insulating layer 11B. The insulating layer 11A is provided in the same layer as the layer of the first protruding portion 12A to surround the first protruding portion 12A. The insulating layer 11B is provided in the same layer as the layer of the first base portion 12B to surround the first base portion 12B. The insulating layer 11A and the insulating layer 11B each include, for example, a TEOS film. The TEOS film refers to a silicon oxide film formed by chemical vapor deposition (referred to as CVD below) using TEOS gas (tetraethoxysilane gas: Si(OC2H5)4) as a raw material gas.
There is provided a diffusion prevention layer 14 in a layer between the insulating layer 11A and the insulating layer 11B to separate the insulating layer 11A and the insulating layer 11B and separate the insulating layer 11A and the first base portion 12B. The diffusion prevention layer 14 is formed by using a material that suppresses the diffusion of a material included in the first electrode 12 to the first insulating film 11. Examples of the material that suppresses the diffusion include an insulating material such as SiCN. Specifically, the diffusion prevention layer 14 is formed by using, for example, a vapor-phase method such as CVD or sputtering. It is to be noted that the first insulating film 11 may be provided in place of the diffusion prevention layer 14.
The diffusion prevention film 13 includes a diffusion prevention layer 13A, a diffusion prevention layer 13B, and a diffusion prevention layer 13C. The diffusion prevention layer 13A is provided to separate the first protruding portion 12A and the insulating layer 11A. The diffusion prevention layer 13B is provided to separate the first base portion 12B and the insulating layer 11B. The diffusion prevention layer 13C covers the first base portion 12B to connect the diffusion prevention layer 13A and the diffusion prevention layer 13B. Each of the diffusion prevention layers 13A to 13C is formed by using a material that suppresses the diffusion of a material included in the first electrode 12 to the first insulating film 11. A material including, for example, at least one of Ti (titanium), TiN (titanium nitride), Ta (tantalum), or TaN (tantalum nitride) is preferred as a material included in each of the diffusion prevention layers 13A to 13C. In addition, each of the diffusion prevention layers 13A to 13C may be formed by using a plurality of layers.
The wiring layer 10B includes an insulating layer 15 and an insulating layer 16. The insulating layer 15 and the insulating layer 16 are stacked in order on the electrode layer 10A. For example, a wiring line 31 is embedded in the insulating layer 16. The wiring line 31 includes a wiring layer 31A and a barrier metal layer 31B. The wiring layer 31A includes a highly electrically conductive material such as Cu (copper). The barrier metal layer 31B surrounds the wiring layer 31A. The barrier metal layer 31B suppresses the diffusion of a material included in the wiring layer 31A. Further, for example, a transfer gate TG of a transfer transistor and a gate electrode 32 of a pixel transistor Tr1 are embedded near the surface of the insulating layer 16 opposite to the electrode layer 10A. The pixel transistor Tr1 is, for example, an amplification transistor, a reset transistor, or a selection transistor.
The semiconductor layer 10C includes an insulating layer 17 and a semiconductor layer 18. The insulating layer 17 covers the wiring layer 10B. The semiconductor layer 18 covers the insulating layer 17. The semiconductor layer 18 includes single-crystal silicon and the like. There are provided a source 23S and a drain 23D of the pixel transistor Tr1 and a floating diffusion FD near the surface of the semiconductor layer 18 opposed to the wiring layer 10B. The source 23S and the drain 23D of the pixel transistor Tr1 and the floating diffusion FD are positioned on the opposite side to the gate electrode 32 with the insulating layer 17 interposed in between. The semiconductor layer 18 is further provided with the photoelectric conversion section 51 for each of the pixels 2.
As illustrated in
The electrode layer 20A includes the second electrode 22 embedded in a second insulating film 21. There is provided a diffusion prevention film 23 on the border between the second insulating film 21 and the second electrode 22. The electrode layer 20A includes the junction surface CS2 opposed to the junction surface CS1 of the sensor board 10. The junction surface CS2 is also planarized, for example, by chemical mechanical polishing (CMP) as with the junction surface CS1.
The second electrode 22 is a coupling terminal that is coupled to the first electrode 12 to form paired coupling terminals. The formation of the paired coupling terminals of the first electrode 12 and the second electrode 22 in this way allows signals to be exchanged between the sensor board 10 and the circuit board 20. The second electrode 22 includes a second protruding portion 22A and a second base portion 22B. The second protruding portion 22A includes a second abutting surface 22AS exposed from the junction surface CS2. The second protruding portion 22A stands on the second base portion 22B. The second base portion 22B is linked to the second protruding portion 22A on the opposite side to the second abutting surface 22AS. The second base portion 22B has a volume greater than the volume of the second protruding portion 22A. More specifically, as illustrated in
As described above, the sensor board 10 includes the pixel region 3 in which the plurality of pixels 2 is formed and the peripheral region that surrounds the pixel region 3. Abutting sections AS of the first abutting surface 12AS and the second abutting surface 22AS are formed in a region overlapping with the pixel region 3 on the sensor board 10 in the stack direction (Z axis direction). The abutting sections AS are joined by using, for example, plasma junction.
The second insulating film 21 includes an insulating layer 21A and an insulating layer 21B. The insulating layer 21A is provided in the same layer as the layer of the second protruding portion 22A to surround the second protruding portion 22A. The insulating layer 21B is provided in the same layer as the layer of the second base portion 22B to surround the second base portion 22B. The insulating layer 21A and the insulating layer 21B each include, for example, a TEOS film.
There is provided a diffusion prevention layer 24 in a layer between the insulating layer 21A and the insulating layer 21B to separate the insulating layer 21A and the insulating layer 21B and separate the insulating layer 21A and the second base portion 22B. The diffusion prevention layer 24 is formed by using a material that suppresses the diffusion of a material included in the second electrode 22 to the second insulating film 21. Examples of the material that suppresses the diffusion include an insulating material such as SiCN. Specifically, the diffusion prevention layer 24 is formed by using, for example, a vapor-phase method such as CVD or sputtering. It is to be noted that the second insulating film 21 may be provided in place of the diffusion prevention layer 24.
The diffusion prevention film 23 includes a diffusion prevention layer 23A, a diffusion prevention layer 23B, and a diffusion prevention layer 23C. The diffusion prevention layer 23A is provided to separate the second protruding portion 22A and the insulating layer 21A. The diffusion prevention layer 23B is provided to separate the second base portion 22B and the insulating layer 21B. The diffusion prevention layer 23C covers the second base portion 22B to connect the diffusion prevention layer 23A and the diffusion prevention layer 23B. Each of the diffusion prevention layers 23A to 23C is formed by using a material that suppresses the diffusion of a material included in the second electrode 22 to the second insulating film 21. A material including, for example, at least one of Ti (titanium), TiN (titanium nitride), Ta (tantalum), or TaN (tantalum nitride) is preferred as a material included in each of the diffusion prevention layers 23A to 23C. In addition, each of the diffusion prevention layers 23A to 23C may be formed by using a plurality of layers.
The wiring layer 20B includes an insulating layer 25 and an insulating layer 26. The insulating layer 25 covers the surface of the electrode layer 20A opposite to the junction surface CS2. The insulating layer 26 is provided below the insulating layer 25. For example, a wiring line 41 is embedded in the insulating layer 26. The wiring line 41 includes a wiring layer 41A and a barrier metal layer 41B. The wiring layer 41A includes a highly electrically conductive material such as Cu (copper). The barrier metal layer 41B surrounds the wiring layer 41A. The barrier metal layer 41B suppresses the diffusion of a material included in the wiring layer 41A. It is to be noted that the wiring layer 41A is coupled to the second base portion 22B of the second electrode 22 by a wiring line 29 that penetrates the insulating layer 21B and the insulating layer 25. The wiring line 29 includes a wiring layer 29A and a barrier metal layer 29B. The wiring layer 29A includes a highly electrically conductive material such as Cu (copper). The barrier metal layer 29B surrounds the wiring layer 29A. The barrier metal layer 29B suppresses the diffusion of a material included in the wiring layer 29A. Further, for example, a gate electrode 42 of a transistor Tr2 in a logic circuit such as the column signal processing circuit 7 is embedded near the surface of the insulating layer 26 opposite to the electrode layer 20A.
The semiconductor layer 20C includes an insulating layer 27 and a semiconductor layer 28. The insulating layer 27 covers the wiring layer 20B. The semiconductor layer 28 covers the insulating layer 27. The semiconductor layer 28 includes single-crystal silicon and the like. There are provided a source 43S and a drain 43D of the transistor Tr2 near the surface of the semiconductor layer 28 opposed to the wiring layer 20B. The source 43S and the drain 43D of the transistor Tr2 are positioned on the opposite side to the gate electrode 42 with the insulating layer 27 interposed in between.
The protective film 19 is provided to cover the photoelectric conversion section 51 of the sensor board 10. The protective film 19 is a material film having a passivation property and includes, for example, silicon oxide, silicon nitride, or silicon oxynitride. The color filter layer CF includes a color filter for each color, provided in association with each photoelectric conversion section 51 on a one-to-one basis. There is no limitation on the array of the color filters for the respective colors. The on-chip lens LNS is provided in association with each photoelectric conversion section 51 and each color filter layer CF on a one-to-one basis. The on-chip lens LNS is configured to collect incident light in each photoelectric conversion section 51.
Next, with reference to
First, as illustrated in
Next, as illustrated in
Next, as illustrated in
Next, as illustrated in
After that, as illustrated in
Next, with reference to
First, as illustrated in
After that, annealing treatment is performed, for example, at a temperature of about 400° C. This annealing treatment fills the gaps 12V1, 12V2, 22V1, and 22V2 as illustrated in
In the first electrode 12 on the sensor board 10 of the solid-state imaging unit 1 according to the present embodiment, the volume of the first base portion 12B linked to the first protruding portion 12A including the first abutting surface 12AS is greater than the volume of the first protruding portion 12A. Similarly, in the second electrode 22, the volume of the second base portion 22B linked to the second protruding portion 22A including the second abutting surface 22AS is greater than the volume of the second protruding portion 22A. Owing to the first base portion 12B and the second base portion 22B, the first abutting surface 12AS and the second abutting surface 22AS are therefore favorably joined together even in a case where the respective planarization processes partially recess these abutting surfaces and generate the recess portions R. This securely couples the first electrode 12 and the second electrode 22 electrically, even in a case where the sensor board 10 and the circuit board 20 are made smaller and the first electrode 12 and the second electrode 22 are made finer. In addition, this allows for secure electrical coupling even in a case where the bonding positions are misaligned. In other words, the solid-state imaging unit 1 is able to accommodate still higher integration.
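A rough estimate, offered only as an illustration and resting on assumed values (electrodes of Cu, a volumetric thermal expansion coefficient of copper of roughly 5 × 10⁻⁵ K⁻¹, and annealing from about 25°C to about 400°C; none of these figures are specified by the embodiments), indicates how much additional metal the annealing described above can supply toward the recess portions R:

$$
\frac{\Delta V}{V} \approx \beta\,\Delta T \approx \left(5 \times 10^{-5}\ \text{K}^{-1}\right) \times 375\ \text{K} \approx 1.9\,\%.
$$

Because the expanded volume ΔV scales with the total volume of the electrode, an electrode whose volume is dominated by its base portion can push far more material toward the small recess at the abutting surface than the protruding portion alone could, which is consistent with the gap-filling role described above for the first base portion 12B and the second base portion 22B.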
In addition, it is possible in the solid-state imaging unit 1 to decrease the area of each of the junction portions of the first abutting surface 12AS of the first electrode 12 and the second abutting surface 22AS of the second electrode 22. This makes it possible to provide even a finer pixel region with paired coupling terminals for each pixel or for a desired number of pixels, and as a result contributes to the miniaturization and higher integration of the solid-state imaging unit 1.
The camera 2000 includes an optical unit 2001 including a lens group and the like, an imaging unit (imaging device) 2002 to which the solid-state imaging unit 1, 1A, or the like (that is referred to as solid-state imaging unit 1 or the like below) described above is applied, and a DSP (Digital Signal Processor) circuit 2003 that is a camera signal processing circuit. In addition, the camera 2000 also includes a frame memory 2004, a display unit 2005, a recording unit 2006, an operation unit 2007, and a power supply unit 2008. The DSP circuit 2003, the frame memory 2004, the display unit 2005, the recording unit 2006, the operation unit 2007, and the power supply unit 2008 are coupled to each other via a bus line 2009.
The optical unit 2001 takes in incident light (image light) from a subject to form an image on an imaging surface of the imaging unit 2002. The imaging unit 2002 converts the amount of incident light formed, as an image, on the imaging surface by the optical unit 2001 into an electric signal on a pixel unit basis and outputs the converted electric signal as a pixel signal.
The display unit 2005 includes, for example, a panel-type display device such as a liquid crystal panel or an organic EL panel and displays a moving image or a still image captured by the imaging unit 2002. The recording unit 2006 records the moving image or the still image captured by the imaging unit 2002 in a recording medium such as a hard disk or a semiconductor memory.
The operation unit 2007 issues an operation instruction about a variety of functions of the camera 2000 under an operation of a user. The power supply unit 2008 appropriately supplies the DSP circuit 2003, the frame memory 2004, the display unit 2005, the recording unit 2006, and the operation unit 2007 with various types of power for operations of these supply targets.
As described above, the use of the solid-state imaging unit 1 or the like as the imaging unit 2002 makes it possible to expect favorable images.
The technology (the present technology) according to the present disclosure is applicable to a variety of products. For example, the technology according to the present disclosure may be achieved as a device mounted on any type of mobile body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility, an airplane, a drone, a vessel, or a robot.
The vehicle control system 12000 includes a plurality of electronic control units connected to each other via a communication network 12001. In the example depicted in
The driving system control unit 12010 controls the operation of devices related to the driving system of the vehicle in accordance with various kinds of programs. For example, the driving system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine, a driving motor, or the like, a driving force transmitting mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
The body system control unit 12020 controls the operation of various kinds of devices provided to a vehicle body in accordance with various kinds of programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various kinds of lamps such as a headlamp, a backup lamp, a brake lamp, a turn signal, a fog lamp, or the like. In this case, radio waves transmitted from a mobile device as an alternative to a key or signals of various kinds of switches can be input to the body system control unit 12020. The body system control unit 12020 receives these input radio waves or signals, and controls a door lock device, the power window device, the lamps, or the like of the vehicle.
The outside-vehicle information detecting unit 12030 detects information about the outside of the vehicle including the vehicle control system 12000. For example, the outside-vehicle information detecting unit 12030 is connected with an imaging section 12031. The outside-vehicle information detecting unit 12030 causes the imaging section 12031 to capture an image of the outside of the vehicle, and receives the captured image. On the basis of the received image, the outside-vehicle information detecting unit 12030 may perform processing of detecting an object such as a human, a vehicle, an obstacle, a sign, or a character on a road surface, or processing of detecting a distance thereto.
The imaging section 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of the received light. The imaging section 12031 can output the electric signal as an image, or can output it as information about a measured distance. In addition, the light received by the imaging section 12031 may be visible light, or may be invisible light such as infrared rays.
The in-vehicle information detecting unit 12040 detects information about the inside of the vehicle. The in-vehicle information detecting unit 12040 is, for example, connected with a driver state detecting section 12041 that detects the state of a driver. The driver state detecting section 12041, for example, includes a camera that images the driver. On the basis of detection information input from the driver state detecting section 12041, the in-vehicle information detecting unit 12040 may calculate a degree of fatigue of the driver or a degree of concentration of the driver, or may determine whether the driver is dozing.
The microcomputer 12051 can calculate a control target value for the driving force generating device, the steering mechanism, or the braking device on the basis of the information about the inside or outside of the vehicle which information is obtained by the outside-vehicle information detecting unit 12030 or the in-vehicle information detecting unit 12040, and output a control command to the driving system control unit 12010. For example, the microcomputer 12051 can perform cooperative control intended to implement functions of an advanced driver assistance system (ADAS) which functions include collision avoidance or shock mitigation for the vehicle, following driving based on a following distance, vehicle speed maintaining driving, a warning of collision of the vehicle, a warning of deviation of the vehicle from a lane, or the like.
In addition, the microcomputer 12051 can perform cooperative control intended for automatic driving, which makes the vehicle travel autonomously without depending on the operation of the driver, or the like, by controlling the driving force generating device, the steering mechanism, the braking device, or the like on the basis of the information about the outside or inside of the vehicle which information is obtained by the outside-vehicle information detecting unit 12030 or the in-vehicle information detecting unit 12040.
In addition, the microcomputer 12051 can output a control command to the body system control unit 12020 on the basis of the information about the outside of the vehicle which information is obtained by the outside-vehicle information detecting unit 12030. For example, the microcomputer 12051 can perform cooperative control intended to prevent glare by controlling the headlamp so as to change from a high beam to a low beam in accordance with the position of a preceding vehicle or an oncoming vehicle detected by the outside-vehicle information detecting unit 12030.
The sound/image output section 12052 transmits an output signal of at least one of a sound and an image to an output device capable of visually or auditorily notifying information to an occupant of the vehicle or the outside of the vehicle. In the example of
In
The imaging sections 12101, 12102, 12103, 12104, and 12105 are, for example, disposed at positions on a front nose, sideview mirrors, a rear bumper, and a back door of the vehicle 12100 as well as a position on an upper portion of a windshield within the interior of the vehicle. The imaging section 12101 provided to the front nose and the imaging section 12105 provided to the upper portion of the windshield within the interior of the vehicle obtain mainly an image of the front of the vehicle 12100. The imaging sections 12102 and 12103 provided to the sideview mirrors obtain mainly an image of the sides of the vehicle 12100. The imaging section 12104 provided to the rear bumper or the back door obtains mainly an image of the rear of the vehicle 12100. The imaging section 12105 provided to the upper portion of the windshield within the interior of the vehicle is used mainly to detect a preceding vehicle, a pedestrian, an obstacle, a signal, a traffic sign, a lane, or the like.
Incidentally,
At least one of the imaging sections 12101 to 12104 may have a function of obtaining distance information. For example, at least one of the imaging sections 12101 to 12104 may be a stereo camera constituted of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
For example, the microcomputer 12051 can determine a distance to each three-dimensional object within the imaging ranges 12111 to 12114 and a temporal change in the distance (relative speed with respect to the vehicle 12100) on the basis of the distance information obtained from the imaging sections 12101 to 12104, and thereby extract, as a preceding vehicle, a nearest three-dimensional object in particular that is present on a traveling path of the vehicle 12100 and which travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, equal to or more than 0 km/hour). Further, the microcomputer 12051 can set a following distance to be maintained in front of a preceding vehicle in advance, and perform automatic brake control (including following stop control), automatic acceleration control (including following start control), or the like. It is thus possible to perform cooperative control intended for automatic driving that makes the vehicle travel autonomously without depending on the operation of the driver or the like.
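The control logic actually executed by the microcomputer 12051 is not disclosed; the Python sketch below only illustrates, under simplified assumptions, how per-object distance samples could be turned into a relative speed, how a preceding vehicle could be selected, and how a coarse following-distance command could be derived. The class, the sampling period, the heading tolerance, and the target gap are all hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

SAMPLE_PERIOD_S = 0.1  # hypothetical update period of the distance information


@dataclass
class TrackedObject:
    object_id: int
    distance_m: float          # current distance from the vehicle 12100
    prev_distance_m: float     # distance one sample earlier
    on_travel_path: bool       # whether the object lies on the traveling path
    heading_offset_deg: float  # angle between the object's heading and our own


def relative_speed(obj: TrackedObject) -> float:
    """Temporal change in distance; negative means the object is getting closer."""
    return (obj.distance_m - obj.prev_distance_m) / SAMPLE_PERIOD_S


def select_preceding_vehicle(objects) -> Optional[TrackedObject]:
    """Pick the nearest on-path object traveling in substantially the same direction."""
    candidates = [o for o in objects
                  if o.on_travel_path and abs(o.heading_offset_deg) < 10.0]
    return min(candidates, key=lambda o: o.distance_m, default=None)


def following_command(preceding: Optional[TrackedObject],
                      target_gap_m: float = 30.0) -> str:
    """Very coarse accelerate/brake decision that keeps a preset following distance."""
    if preceding is None:
        return "cruise"
    gap_error = preceding.distance_m - target_gap_m
    if gap_error < 0.0 or relative_speed(preceding) < 0.0:
        return "brake"        # following stop / deceleration control
    if gap_error > 5.0:
        return "accelerate"   # following start / acceleration control
    return "hold"


if __name__ == "__main__":
    objects = [TrackedObject(1, 28.0, 28.5, True, 2.0),
               TrackedObject(2, 12.0, 12.0, False, 85.0)]
    lead = select_preceding_vehicle(objects)
    print(lead.object_id if lead else None, following_command(lead))
```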
For example, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into three-dimensional object data of a two-wheeled vehicle, a standard-sized vehicle, a large-sized vehicle, a pedestrian, a utility pole, and other three-dimensional objects on the basis of the distance information obtained from the imaging sections 12101 to 12104, extract the classified three-dimensional object data, and use the extracted three-dimensional object data for automatic avoidance of an obstacle. For example, the microcomputer 12051 identifies obstacles around the vehicle 12100 as obstacles that the driver of the vehicle 12100 can recognize visually and obstacles that are difficult for the driver of the vehicle 12100 to recognize visually. Then, the microcomputer 12051 determines a collision risk indicating a risk of collision with each obstacle. In a situation in which the collision risk is equal to or higher than a set value and there is thus a possibility of collision, the microcomputer 12051 outputs a warning to the driver via the audio speaker 12061 or the display section 12062, and performs forced deceleration or avoidance steering via the driving system control unit 12010. The microcomputer 12051 can thereby assist in driving to avoid collision.
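Similarly, the classification and collision-risk handling can be sketched as follows; the risk metric (an inverse time-to-collision), the set value, and the decision rules are assumptions chosen for illustration rather than the disclosed behavior of the microcomputer 12051.

```python
from dataclasses import dataclass


@dataclass
class Obstacle:
    kind: str                  # e.g. "two_wheeled", "standard_vehicle", "large_vehicle",
                               # "pedestrian", "utility_pole", "other"
    distance_m: float
    closing_speed_mps: float   # positive when the gap to the obstacle is shrinking
    visible_to_driver: bool    # easy vs. difficult for the driver to recognize visually


COLLISION_RISK_THRESHOLD = 0.5  # hypothetical set value


def collision_risk(obs: Obstacle) -> float:
    """Simple risk score based on time-to-collision, clipped to the range [0, 1]."""
    if obs.closing_speed_mps <= 0.0:
        return 0.0
    time_to_collision_s = obs.distance_m / obs.closing_speed_mps
    return min(1.0, 2.0 / time_to_collision_s)  # risk rises as TTC falls below ~2 s


def react(obstacles):
    """Warn the driver and, if needed, request forced deceleration or avoidance."""
    actions = []
    for obs in obstacles:
        risk = collision_risk(obs)
        if risk >= COLLISION_RISK_THRESHOLD:
            actions.append(("warn_driver", obs.kind))              # speaker 12061 / display 12062
            if not obs.visible_to_driver or risk >= 0.9:
                actions.append(("forced_deceleration", obs.kind))  # via driving system control unit 12010
    return actions


if __name__ == "__main__":
    print(react([Obstacle("pedestrian", 8.0, 6.0, False),
                 Obstacle("utility_pole", 40.0, 0.0, True)]))
```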
At least one of the imaging sections 12101 to 12104 may be an infrared camera that detects infrared rays. The microcomputer 12051 can, for example, recognize a pedestrian by determining whether or not there is a pedestrian in imaged images of the imaging sections 12101 to 12104. Such recognition of a pedestrian is, for example, performed by a procedure of extracting characteristic points in the imaged images of the imaging sections 12101 to 12104 as infrared cameras and a procedure of determining whether or not an object is a pedestrian by performing pattern matching processing on a series of characteristic points representing the contour of the object. When the microcomputer 12051 determines that there is a pedestrian in the imaged images of the imaging sections 12101 to 12104 and thus recognizes the pedestrian, the sound/image output section 12052 controls the display section 12062 so that a square contour line for emphasis is displayed superimposed on the recognized pedestrian. The sound/image output section 12052 may also control the display section 12062 so that an icon or the like representing the pedestrian is displayed at a desired position.
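The pattern matching procedure itself is not detailed above; as one hypothetical illustration, the series of characteristic points representing a contour could be compared against a pedestrian template after normalizing for position and scale, as in the sketch below. The resampling scheme, the distance score, and the acceptance threshold are all assumptions.

```python
import numpy as np


def resample(points: np.ndarray, n: int = 32) -> np.ndarray:
    """Resample a contour (N x 2 array of characteristic points) to a fixed
    number of points so that contours of different lengths can be compared."""
    idx = np.linspace(0, len(points) - 1, n).round().astype(int)
    return points[idx]


def normalize_contour(points: np.ndarray) -> np.ndarray:
    """Remove position and scale so that only the shape of the contour matters."""
    centered = points - points.mean(axis=0)
    scale = np.linalg.norm(centered, axis=1).max()
    return centered / scale if scale > 0 else centered


def matches_pedestrian(contour: np.ndarray, template: np.ndarray,
                       threshold: float = 0.15) -> bool:
    """Crude pattern matching: mean point-to-point distance between the
    normalized contour and a normalized pedestrian template."""
    a = normalize_contour(resample(contour))
    b = normalize_contour(resample(template))
    score = np.linalg.norm(a - b, axis=1).mean()
    return score < threshold  # hypothetical acceptance threshold


def bounding_box(contour: np.ndarray):
    """Rectangle to be superimposed on the display section 12062 for emphasis."""
    x_min, y_min = contour.min(axis=0)
    x_max, y_max = contour.max(axis=0)
    return (x_min, y_min, x_max, y_max)


if __name__ == "__main__":
    template = np.array([[0, 0], [1, 0], [1, 3], [0, 3]], dtype=float)  # stand-in silhouette
    detected = template * 2.0 + np.array([5.0, 7.0])  # same shape, shifted and scaled
    if matches_pedestrian(detected, template):
        print("pedestrian recognized", bounding_box(detected))
```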
The above has described the example of the vehicle control system to which the technology according to the present disclosure may be applied. The technology according to the present disclosure may be applied to the imaging section 12031 among the components described above. Specifically, the solid-state imaging unit 1 or the like illustrated in
Although the present disclosure has been described above with reference to several embodiments and modification examples, the present disclosure is not limited to the embodiments and the like described above. It is possible to make a variety of modifications. For example, the solid-state imaging unit 1 has been exemplified that has a two-layer structure of the sensor board 10 and the circuit board 20 in the embodiments described above. The present disclosure is not, however, limited to this. The present disclosure is applicable, for example, to a stacked structure of three or more layers.
In addition, the components of the sensor board 10 and the circuit board 20 described in the embodiments and the like above may be given any positions, dimensions, and shapes. For example,
In addition, the case has been exemplified and described in the embodiments and the like described above where the abutting sections AS are formed in a region overlapping with the pixel region 3 in the stack direction, but the present disclosure is not limited to this. For example, as with a solid-state imaging unit 1B according to a second modification example of the present disclosure illustrated in
It is also possible in the solid-state imaging unit 1B like this to decrease the area of each of the junction portions of the first abutting surface 12AS of the first electrode 12 and the second abutting surface 22AS of the second electrode 22. This makes it possible to provide even a finer pixel region with paired coupling terminals for each pixel or for a desired number of pixels, and as a result contributes to the miniaturization and higher integration of the solid-state imaging unit 1B.
In addition, the case has been described in the embodiments and the like described above where an end of the first abutting surface 12AS and an end of the second abutting surface 22AS each have the recess portion R as illustrated in
In addition, the first electrode has a two-layer structure of the first base portion and the first protruding portion in the embodiments and the like described above, but the present disclosure is not limited to this. For example, as with a solid-state imaging unit according to a fourth modification example of the present disclosure illustrated in each of
In a case where the first electrode and the second electrode each have a three-layer structure, the plane area of the first protruding portion 12A may be the smallest, the plane area of the first base portion 12B may be the largest, and the plane area of the first middle portion 12C may be intermediate between them, for example, as in a fifth modification example of the present disclosure illustrated in each of
In a case where the first electrode and the second electrode each have a three-layer structure, adjacent first electrodes 12 may also share the first base portion 12B, for example, as in a sixth modification example of the present disclosure illustrated in each of
In addition, in the embodiments and the like described above, the first electrode and the second electrode each have a rectangular cross-sectional shape, but the present disclosure is not limited to this. A portion or the whole of the first electrode may have a tapered cross section that decreases in width in the in-plane direction of the XY plane as it approaches the first abutting surface 12AS, for example, as in seventh to ninth modification examples of the present disclosure illustrated in
In addition, it is possible in the present disclosure to set any positional relationship in the in-plane direction between the first protruding portion, the first middle portion, and the first base portion of the first electrode as long as they are connected to each other. The same applies to the second electrode. For example, in a case where the first electrode 12 and the second electrode 22 each have a two-layer structure, the positional relationship between the first protruding portion 12A and the first base portion 12B and the positional relationship between the second protruding portion 22A and the second base portion 22B in the in-plane direction of the XY plane may be the positional relationships as in
In addition, the first protruding portion and the first base portion or the first protruding portion, the first middle portion, and the first base portion of the first electrode may be separately formed or integrally formed. The same applies to the second electrode.
In addition, the case has been exemplified in the embodiments and the like described above where the junction surface of the first electrode and the junction surface of the second electrode are joined together by a technique such as plasma junction to form paired coupling terminals that exchange signals, but the present disclosure is not limited to this. The technology according to the present disclosure is applicable even in a case where not coupling terminals but wiring lines are joined together to form a power supply path, for example, as with a solid-state imaging unit 1C illustrated in
Further, the technology according to the present disclosure is not limited to a case where coupling terminals are joined together as a path for electrical communication or wiring lines are joined together as a path for power supply; it is also applicable, for example, in a case where coupling terminals or wiring lines are joined together as a structure for increasing the strength of the junction between the sensor board and the circuit board. Further, the technology according to the present disclosure is also applicable, for example, in a case where light-shielding metal layers formed in the pixel region on the sensor board and the pixel region on the circuit board are joined together.
As described above, the semiconductor device according to the embodiment of the present disclosure is suitable for higher integration. It is to be noted that the effects described in the present specification are merely illustrative and nonlimiting. There may be other effects. In addition, the present technology may have the following configurations.
A semiconductor device including:
The semiconductor device according to (1), in which area occupied by the first base portion is greater than area occupied by the first protruding portion in a plane orthogonal to a stack direction of the first semiconductor substrate and the second semiconductor substrate.
The semiconductor device according to (2), in which area occupied by the second base portion is greater than area occupied by the second protruding portion in the plane orthogonal to the stack direction.
The semiconductor device according to any one of (1) to (3), in which the first semiconductor substrate includes a pixel region in which a plurality of imaging elements is formed and a peripheral region that surrounds the pixel region, and abutting sections of the first abutting surface and the second abutting surface are formed in a region that overlaps with the pixel region in a stack direction of the first semiconductor substrate and the second semiconductor substrate.
The semiconductor device according to any one of (1) to (4), in which the first abutting surface and the second abutting surface form abutting sections that are joined together by using plasma junction.
The semiconductor device according to any one of (1) to (5), in which the second electrode includes a plurality of the second protruding portions linked to the one second base portion.
The semiconductor device according to any one of (1) to (6), further including:
A method of manufacturing a semiconductor device, the method including:
An imaging element including:
The imaging element according to (9), in which
The present application claims priority based on Japanese Patent Application No. 2018-189792 filed on Oct. 5, 2018 with the Japan Patent Office, the entire contents of which are incorporated in the present application by reference.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations, and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
This application is a continuation application of U.S. Patent Application Serial No. 17/279,962, filed on Mar. 25, 2021, which is a U.S. National Phase of International Patent Application No. PCT/JP2019/036321, filed on Sep. 17, 2019, which claims priority benefit of Japanese Patent Application No. JP 2018-189792 filed in the Japan Patent Office on Oct. 05, 2018. Each of the above-referenced applications is hereby incorporated herein by reference in its entirety.