The present technology relates to a semiconductor chip, a method for manufacturing the same, and an electronic device, and more particularly to a semiconductor chip, a method for manufacturing the same, and an electronic device capable of improving reliability and durability.
In recent years, there is known an imaging device in which a lowermost layer lens among a plurality of lenses that focus incident light on a light-receiving surface of a solid-state imaging element is formed on the foremost stage of an integrated configuration unit in which the solid-state imaging element, a glass substrate, and an infrared (IR)-cut filter are laminated.
In such an imaging device, it has been devised to form a hemming portion at the boundary with a glass substrate on an outer peripheral side surface of the lowermost layer lens so that the lowermost layer lens is more strongly bonded to the glass substrate (see, for example, PTL 1).
In addition, it has been devised to apply an anti-reflection (AR) coating to the lowermost layer lens to prevent reflection (see, for example, PTL 2). This AR coating is, for example, a resin such as acrylic resin, epoxy resin, or styrene resin; an insulation film (for example, SiCH, SiCOH, or SiCNH) mainly containing Si (silicon), C (carbon), and H (hydrogen); an insulation film (for example, SiON or SiN) mainly containing Si (silicon) and N (nitrogen); or an SiO2 film, a P-SiO film, or an HDP-SiO film formed using an oxidant and a material gas which is at least any one of silicon hydroxide, alkylsilane, alkoxysilane, polysiloxane, and the like.
Incidentally, as a method for forming the lowermost layer lens on the glass substrate laminated on the solid-state imaging element, for example, there is known a method in which a lens resin, which is a photocurable resin, is applied to a lens region on the glass substrate where the lowermost layer lens is to be formed, and the lens resin is exposed using a mold with a light shielding mask having a shape corresponding to a desired shape of the lowermost layer lens. In this method, the lowermost layer lens having the desired shape is formed by performing a cleaning treatment after the exposure to remove the unexposed (uncured) lens resin and the like.
However, it is difficult to control the shape of the end portion of the lowermost layer lens with high accuracy due to the influence of refraction of exposure light. As a result, when a hemming portion (lens hem) formed at the end portion comes into contact with the uncured lens resin and the like that are to be removed by the cleaning treatment, a portion of the hemming portion may peel off together with the uncured lens resin and the like during the cleaning treatment, and cracks may occur. If such a crack reduces the adhesion force between the glass substrate and the lowermost layer lens, lens peel-off may occur. Therefore, it is desired to suppress the occurrence of such cracks and improve reliability and durability.
The present technology has been made in view of such circumstances, and is intended to improve reliability and durability.
A semiconductor chip according to a first aspect of the present technology includes: an imaging element; a glass substrate provided on the imaging element; and a lens formed on the glass substrate, wherein the glass substrate has a groove around a region where the lens is formed.
A manufacturing method according to a second aspect of the present technology is a method for manufacturing a semiconductor chip, including: forming a groove around a region where a lens is formed, on a glass substrate provided on an imaging element; and forming the lens in the region of the glass substrate where the lens is formed.
An electronic device according to a third aspect of the present technology includes: a semiconductor chip including an imaging element, a glass substrate provided on the imaging element, and a lens formed on the glass substrate, the glass substrate having a groove around a region where the lens is formed; and a signal processing circuit that processes signals from the semiconductor chip.
In the first aspect of the present technology, an imaging element, a glass substrate provided on the imaging element, and a lens formed on the glass substrate are provided, and the glass substrate has a groove around a region where the lens is formed.
In the second aspect of the present technology, a groove is formed around a region where a lens is formed, on a glass substrate provided on an imaging element, and the lens is formed in the region of the glass substrate where the lens is formed.
In the third aspect of the present technology, a semiconductor chip including an imaging element, a glass substrate provided on the imaging element, and a lens formed on the glass substrate, the glass substrate having a groove around a region where the lens is formed, and a signal processing circuit for processing signals from the semiconductor chip are provided.
Modes for embodying the present technology (hereinafter referred to as embodiments) will be described below. Note that the description will be made in the following order.
Note that, in drawings to be referred to in the following description, same or similar portions are denoted by same or similar reference signs. However, the drawings are schematic, and the relationships between thicknesses and plan-view dimensions, the ratios of thicknesses of the respective layers, and the like differ from the actual ones. In addition, the drawings may include portions where dimensional relationships and ratios differ between the drawings.
In addition, it is to be understood that definitions of directions such as upward and downward in the following description are merely definitions provided for the sake of brevity and are not intended to limit technical ideas of the present disclosure. For example, when an object is observed after being rotated by 90 degrees, up-down is read as left-right, and when an object is observed after being rotated by 180 degrees, up-down is read as inverted.
As shown in
A groove 15 is formed in a region of the glass substrate 14 where the lens 16 is not formed, around the region where the lens 16 is formed. Here, the thickness of the glass substrate 14 in the region other than the groove 15 is L1, and the thickness at the groove 15 is L2 (L1>L2). Further, the side surface 15a of the groove 15 is inclined, but this inclination occurs unintentionally when the glass substrate 14 is etched to form the groove 15; ideally, the side surface 15a of the groove 15 is a vertical surface.
The lens 16, which is a convex acrylic lens, is formed in a region on the glass substrate 14 corresponding to the on-chip lens 11b. The lens 16 is composed of a main portion 16a having a desired shape and a hemming portion 16b which is unintentionally formed at the end portion of the lens 16 during manufacturing. The lens 16 is a lowermost layer lens among a plurality of lenses that focus incident light on a light-receiving surface (not shown) formed in a region on the surface of the laminated substrate 11a corresponding to the on-chip lens 11b.
In the semiconductor chip 10 configured as described above, light incident through lenses (not shown) and the lens 16 enters the light-receiving surface of the laminated substrate 11a through the glass substrate 14, the adhesive 12, and the on-chip lens 11b, and charges corresponding to the light are accumulated whereby imaging is performed.
As shown in
In step S2, etching such as wet etching or dry etching is performed on the glass substrate 14 to which the resist 32 has been applied in step S1, and the region of the glass substrate 14 not covered with the resist 32 is etched in the vertical direction. As a result, the groove 15 is formed around the region of the glass substrate 14 where the lens 16 is to be formed. At this time, the side surfaces 15a of the groove 15 are ideally perpendicular to the glass substrate 14, but actually, the bottom surface of the groove 15 is smaller than the upper surface, and the side surfaces 15a are inclined. In step S3, the resist 32 is removed.
In step S4 of
In step S6, a mold 35 with a light-shielding mask having a shape corresponding to the shape of the main portion 16a of the lens 16 is pressed (imprinted) against the lens resin 34, and the lens resin 34 is irradiated with light through the mold 35, whereby the lens resin 34 is exposed. As a result, the exposed lens resin 34 having a shape corresponding to the shape of the main portion 16a is cured, and the lens 16 is molded. That is, the lens 16 having a substantially desired shape is formed in the region on the glass substrate 14 where the lens 16 is formed. On the other hand, the lens resin 34 that light from the mold 35 does not reach remains unexposed and is not cured.
In step S7 of
The integrated portion 13 manufactured as described above is bonded to the solid-state imaging element 11 via the adhesive 12, whereby the semiconductor chip 10 is manufactured.
As shown on the left side of
Specifically, since there is a step of the distance L1−L2 between the upper surface of the glass substrate 14 on which the lens 16 is formed and the bottom surface of the groove 15, the lens resin 34 extruded from above the glass substrate 14 by the pressing of the mold 35 in step S6 of
However, since the hemming portion 16b is formed only around the main portion 16a, that is, near the upper surface of the glass substrate 14, the hemming portion 16b does not come into contact with the water repellent film 33 applied to the bottom surface of the groove 15 at a position lower than the upper surface of the glass substrate 14 by L1−L2. Therefore, as shown on the right side of
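The geometric condition described above can be sketched as follows. This is purely an illustrative Python sketch: the dimension values and the function name are assumed for explanation, since the specification states only that L1 > L2 and that the hemming portion 16b stays near the upper surface of the glass substrate 14.

```python
# Hypothetical sketch of the step-height condition. All values are assumed
# for illustration; the specification gives only the relation L1 > L2.
def hem_contacts_water_repellent_film(l1_um: float, l2_um: float,
                                      hem_depth_um: float) -> bool:
    """Return True if a hemming portion extending hem_depth_um below the
    upper surface of the glass substrate would reach the groove bottom,
    where the water-repellent film lies."""
    step = l1_um - l2_um  # depth of the groove bottom below the upper surface
    return hem_depth_um >= step

# With a groove (step L1 - L2 = 50 um), a 10 um hem stays clear of the film:
print(hem_contacts_water_repellent_film(500.0, 450.0, 10.0))  # False
# With no groove (L1 == L2, step = 0), any hem touches the film:
print(hem_contacts_water_repellent_film(500.0, 500.0, 10.0))  # True
```

In other words, as long as the groove depth L1−L2 exceeds the extent of the hemming portion 16b below the upper surface, the hemming portion 16b and the water-repellent film 33 remain separated.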
Moreover, since the hemming portion 16b is ideally formed in a vertical direction along the side surface 15a from the upper surface of the glass substrate 14, the adhesion force between the lens 16 and the glass substrate 14 is improved, and peeling of the lens 16 from the glass substrate 14, that is, so-called lens peel-off, can be suppressed.
As shown in
<Description of Integrated Portion when there is No Groove>
That is,
As shown in
As shown in
Specifically, the water repellent film 33 suppresses the leakage of the lens resin 34 to the region where the lens 52 is not formed, but it is difficult to control the shape of the end portion of the lens 52 with high accuracy due to refraction of light from the mold 35. Therefore, a hemming portion 52b is unintentionally formed in a region other than the main portion when the lens 52 is molded, and since no groove 15 is formed in the glass substrate 51, the hemming portion 52b and the water repellent film 33 come into contact with each other on the same surface of the glass substrate 51. Thereafter, when a cleaning treatment is performed and the water-repellent film 33 and the uncured lens resin 34 thereon are removed, a portion of the hemming portion 52b peels off together with the water-repellent film 33 and the uncured lens resin 34.
As shown in
As shown in
The integrated portion 90 of
The groove 92 is different from the groove 15 in that the side surface 92a of the groove 92 is an inclined surface having a predetermined inclination, and the other configuration is the same as that of the groove 15. In
Like the groove 15, the groove 92 has a bottom surface at a position lower than the upper surface of the glass substrate 14 by L1−L2. Therefore, as shown in
In addition, since the side surface 92a of the groove 92 is an inclined surface, the adhesion area of the hemming portion 16b with respect to the side surface 92a is increased compared to the case where the side surface 92a is a vertical surface. Therefore, the adhesion force between the side surface 92a and the hemming portion 16b is improved, and as a result, the occurrence of lens peel-off can be further suppressed.
The integrated portion 110 of
The groove 112 is different from the groove 15 in that the side surfaces 112a are satin-finished to prevent regular reflection of light, and the other configuration is the same as that of the groove 15. That is, the groove 112 has an uneven surface 113 on the surface of the side surface 112a.
Like the groove 15, the groove 112 has a bottom surface at a position lower than the upper surface of the glass substrate 14 by L1−L2. Therefore, as shown in
Further, since the uneven surface 113 is provided on the surface of the side surface 112a of the groove 112, the light incident from the mold 35 and leaking out from the mold 35 during molding of the lens 16 is irregularly reflected by the uneven surface 113. As a result, the formation of the hemming portion 16b due to the leaking light can be suppressed.
The processing applied to the side surface 112a may be processing other than the satin finish, as long as the processing prevents regular reflection of light.
The integrated portion 130 of
The groove 132 is different from the groove 15 in that the bottom surface is curved and the thickness at the lowest position of the bottom surface is L3 (L1>L3), and the other configuration is the same as that of the groove 15.
In the groove 132, the lowest position of the bottom surface is at the position lower than the upper surface of the glass substrate 14 by L1−L3. Therefore, like the groove 15, as shown in
Further, since the bottom surface of the groove 132 is curved, the water repellent film 33 and the uncured lens resin 34 can be easily removed together with the cleaning liquid 36 after the cleaning treatment.
As described above, the semiconductor chip 10 includes the solid-state imaging element 11, the glass substrate 14 (91, 111, 131) provided on the solid-state imaging element 11, and the lens 16 formed on the glass substrate 14 (91, 111, 131). The glass substrate 14 has a groove 15 (92, 112, 132) around the region where the lens 16 is formed.
Therefore, the water-repellent film 33 and the hemming portion 16b do not come into contact when the lens 16 is formed. As a result, when the water repellent film 33 and the uncured lens resin 34 are removed together with the cleaning liquid 36, it is possible to prevent a portion of the hemming portion 16b from peeling off together with them, thereby preventing cracks from occurring in the hemming portion 16b. Therefore, it is possible to suppress the peeling of the lens 16 from the glass substrate 14 (91, 111, 131) due to the cracks. As a result, reliability and durability of the semiconductor chip 10 can be improved.
Although illustration is omitted, the method for manufacturing the integrated portion 90 (110, 130) is the same as the method for manufacturing the integrated portion 13 described with reference to
The semiconductor chip 140 of
The groove 152 is different from the groove 15 in that the bottom surface of the groove 152 is processed into a sawtooth cross-sectional shape so as to increase the surface area of the bottom surface compared to the case where the bottom surface is flat, and in that the thickness at the lowest position of the bottom surface is L4 (L4>L2); the other configuration is the same as that of the groove 15. It should be noted that L1−L4 is so small that the hemming portion 16b comes into contact with the bottom surface of the groove 152.
The bottom surface of the groove 152 makes contact with the hemming portion 16b, but the cross-sectional shape of the bottom surface of the groove 152 has a sawtooth shape, and the surface area of the bottom surface of the groove 152 is larger than when the bottom surface is flat. Therefore, the adhesion force between the bottom surface of the groove 152 and the hemming portion 16b is greater than when the bottom surface of the groove 152 is flat.
The processing performed on the bottom surface of the groove 152 may be any processing that increases the surface area compared to the case where the bottom surface is flat, that is, processing that makes the bottom surface uneven.
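As a rough illustration of how an uneven bottom surface increases the adhesion area, the following Python sketch compares the profile length of a sawtooth cross section (extruded along the groove) with that of a flat bottom of the same footprint. The tooth dimensions and the function name are assumptions for illustration; the specification states only that the sawtooth shape increases the surface area.

```python
import math

# Hypothetical geometry: a sawtooth profile with tooth half-width w_um and
# tooth height h_um. Values are illustrative, not from the specification.
def sawtooth_area_factor(w_um: float, h_um: float) -> float:
    """Ratio of the sawtooth surface area to that of a flat bottom surface
    with the same footprint (per half-tooth: slanted run over flat run)."""
    flat = w_um                     # flat footprint per half-tooth
    slope = math.hypot(w_um, h_um)  # slanted surface run per half-tooth
    return slope / flat

# 45-degree teeth (h == w) increase the surface area by a factor of sqrt(2):
print(round(sawtooth_area_factor(10.0, 10.0), 3))  # 1.414
```

The adhesion force between the bottom surface and the hemming portion 16b scales with this contact area, which is why the sawtooth shape is advantageous over a flat bottom.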
In
First, steps similar to the processing of steps S1 to S3 in
In step S33, the mold 35 is pressed against the lens resin 34 and the lens resin 34 is irradiated with light through the mold 35, whereby the lens resin 34 is exposed. As a result, the exposed lens resin 34 having a shape corresponding to the shape of the main portion 16a is cured, and the lens 16 is molded. That is, the lens 16 having a substantially desired shape is formed in the region on the glass substrate 151 where the lens 16 is formed. On the other hand, the lens resin 34 that light from the mold 35 does not reach remains unexposed and is not cured.
In step S34 of
Here, since the surface area of the bottom surface of the groove 152 is larger than when the bottom surface of the groove 152 is flat, the adhesion force between the bottom surface of the groove 152 and the hemming portion 16b is high. Therefore, it is possible to prevent the hemming portion 16b from peeling off from the bottom surface of the groove 152 together with the uncured lens resin 34, thereby preventing cracks from occurring in the hemming portion 16b.
The integrated portion 150 manufactured as described above is bonded onto the solid-state imaging element 11 via the adhesive 12, whereby the semiconductor chip 140 is manufactured. In addition, in the example of the manufacturing method in
As described above, the semiconductor chip 140 includes the solid-state imaging element 11, the glass substrate 151 provided on the solid-state imaging element 11, and the lens 16 formed on the glass substrate 151. The glass substrate 151 has the groove 152 around the region where the lens 16 is formed. Further, the bottom surface of the groove 152 is processed to increase the surface area compared to the case where the bottom surface of the groove 152 is flat. Therefore, the adhesion force between the bottom surface of the groove 152 and the hemming portion 16b is improved. As a result, when the uncured lens resin 34 is removed together with the cleaning liquid 36, it is possible to prevent the hemming portion 16b from peeling off together with the uncured lens resin 34, thereby preventing cracks from occurring in the hemming portion 16b. Therefore, it is possible to suppress the peeling of the lens 16 from the glass substrate 151 due to the cracks. As a result, reliability and durability of the semiconductor chip 140 can be improved.
In the first embodiment and the second embodiment, the solid-state imaging element 11 and the integrated portion 13 (90, 110, 130, 150) are manufactured separately, and the integrated portion 13 (90, 110, 130, 150) is bonded to the solid-state imaging element 11 after manufacturing the integrated portion 13 (90, 110, 130, 150) to manufacture the semiconductor chip 10 (140). However, after the solid-state imaging element 11 is formed, the integrated portion 13 (90, 110, 130, 150) may be formed on the solid-state imaging element 11. In this case, when the integrated portion 13 (90, 110, 130, 150) is manufactured, the solid-state imaging element 11 to which the glass substrate 14 (91, 111, 131, 151) is bonded via the adhesive 12 is installed on the chuck 31.
In addition, the features of the grooves 15 (92, 112, 132, 152) of the first embodiment and the second embodiment may be combined. For example, the grooves 112, 132, and 152 may have inclined side surfaces similarly to the groove 92. In addition, the grooves 132 and 152 may have the uneven surfaces 113 on the side surfaces, similarly to the groove 112.
In a semiconductor chip 200 of
The semiconductor chip 200 of
A buffer layer 211 is formed between the AR coating 212 and the lens 16. The buffer layer 211 covers the side and upper surfaces of the lens 16 and is also formed on regions of the glass substrate 14 where the lens 16 is not formed, including the groove 15. The buffer layer 211 has a refractive index that is low enough not to affect the light incident through the AR coating 212 from the outside when the light is transmitted through the lens 16. For example, the buffer layer 211 has a refractive index that is the same as or slightly higher than that of the lens 16. In addition, the buffer layer 211 is hard enough to suppress deformation of the AR coating 212. As the buffer layer 211, a film such as SiO2 or SiON can be used.
The AR coating 212 is an anti-reflection film and covers the entire surface of the buffer layer 211. The AR coating 212 has a four-layer structure in which a high-refractive-index film made of at least one of TiOx, TaOx, NbOx, and the like, and a low-refractive-index film made of at least one of SiOx, SiON, and the like, are alternately laminated.
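For reference, alternating high-/low-index anti-reflection stacks of this kind are commonly designed around quarter-wave optical thicknesses. The following Python sketch computes the physical thickness giving an optical thickness of λ/4; the refractive indices and the design wavelength are assumed textbook values (TiOx around 2.4, SiOx around 1.46, at 550 nm), not figures from the specification.

```python
# Illustrative sketch of a quarter-wave layer-thickness calculation.
# Indices and wavelength are assumed values, not from the specification.
def quarter_wave_thickness_nm(wavelength_nm: float, n: float) -> float:
    """Physical thickness giving an optical thickness of lambda/4."""
    return wavelength_nm / (4.0 * n)

for name, n in (("high-index (TiOx)", 2.4), ("low-index (SiOx)", 1.46)):
    t = quarter_wave_thickness_nm(550.0, n)
    print(f"{name}: {t:.1f} nm")
```

Because the high-index layer has the larger refractive index, it is the physically thinner layer of each high/low pair in such a stack.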
As described above, the semiconductor chip 200 has the buffer layer 211 between the lens 16 and the AR coating 212. As a result, the AR coating 212 can be suppressed from being damaged when a temperature cycle (TC) test, a temperature humidity storage (THS) test, or the like is performed to evaluate the reliability of the semiconductor chip 200.
Specifically, if the buffer layer 211 is not provided between the lens 16 and the AR coating 212, when a temperature cycle test or the like is performed on the semiconductor chip 200, the AR coating 212 may be deformed and damaged due to a thermal stress difference between the lens 16 and the AR coating 212. In addition, if the buffer layer 211 is not provided, when a temperature humidity storage test or the like is performed on the semiconductor chip 200, the lens 16 may absorb moisture and swell, and the AR coating 212 may be damaged.
Therefore, in the semiconductor chip 200, the buffer layer 211 is inserted between the lens 16 and the AR coating 212 so as to function as a buffer material, whereby thermal deformation due to the temperature cycle test or the like and intrusion of moisture into the lens 16 due to the temperature humidity storage test or the like are suppressed. As a result, damage to the AR coating 212 is suppressed, and the reliability of the semiconductor chip 200 can be ensured. For example, it is possible to ensure reliability in a temperature cycle test and a temperature humidity storage test for 500 hours or more.
Although the description is omitted, the method for manufacturing the integrated portion 210 is the same as the method for manufacturing the integrated portion of the fourth embodiment, which will be described later.
The semiconductor chip 300 in
In step S51 of
In step S52, the buffer layer 211 is formed by sputtering or the like on the upper and side surfaces of the lenses 311 and on the regions of the glass substrate 14 where the lenses 311 are not formed. The buffer layer 211 is thick enough not to break under its own weight; the thickness can take any value, for example, 400 nm to 1100 nm, and in this example it is 800 nm. Moreover, the thickness of the buffer layer 211 on the side surfaces is a thickness necessary for suppressing the intrusion of water, for example, 60% or more of the thickness on the upper surface. The buffer layer 211 may be formed by a method other than sputtering, but sputtering allows the film to be formed easily at a low temperature.
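The thickness conditions just described can be expressed as a simple check. This Python sketch is illustrative only: the function and parameter names are hypothetical, and the 400–1100 nm range and 60% ratio are taken directly from the example values above.

```python
# Hypothetical validation of the buffer-layer thickness conditions stated in
# the text: top-surface thickness within an example range of 400-1100 nm,
# side-surface thickness at least 60% of the top-surface thickness.
def buffer_layer_ok(top_nm: float, side_nm: float,
                    t_min: float = 400.0, t_max: float = 1100.0) -> bool:
    in_range = t_min <= top_nm <= t_max
    side_ok = side_nm >= 0.6 * top_nm
    return in_range and side_ok

print(buffer_layer_ok(800.0, 480.0))  # True: the 800 nm example in the text
print(buffer_layer_ok(800.0, 400.0))  # False: side is below 60% of the top
```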
In step S53, the AR coating 212 is formed on the entire surface of the buffer layer 211 formed in step S52, and the semiconductor chip 300 is manufactured.
Although the chuck 31 is omitted in
As described above, the semiconductor chip 300 has the buffer layer 211 between the lens 311 and the AR coating 212. Therefore, like the semiconductor chip 200, damage to the AR coating 212 can be suppressed, and the reliability of the semiconductor chip 300 can be ensured.
The semiconductor chip 330 of
The AR coating 341 covers only the buffer layer 211 formed on the upper surface of the lens 311, instead of the entire surface of the buffer layer 211. That is, only the buffer layer 211 is formed on the side surfaces of the lenses 311 and the regions of the glass substrate 14 where the lenses 311 are not formed.
Steps S71 and S72 in
As described above, the semiconductor chip 330 has the buffer layer 211 between the lens 311 and the AR coating 341. Therefore, like the semiconductor chip 200, damage to the AR coating 341 can be suppressed, and the reliability of the semiconductor chip 330 can be ensured.
Moreover, since the AR coating 341 is not formed on the buffer layer 211 on the side surfaces of the lens 311, the semiconductor chip 330 can be manufactured more easily than the semiconductor chip 300. On the other hand, since the side surfaces (side walls) of the lens 311 do not affect the optical characteristics, the semiconductor chip 330 can have the same optical characteristics as the semiconductor chip 300 even without an AR coating on the buffer layer 211 on the side surfaces of the lens 311. In addition, since the buffer layer 211 covers the side surfaces of the lens 311, it is possible to prevent moisture from entering the lens 311 during a temperature humidity storage test or the like.
As a result, for example, even if it is difficult to form a high-refractive-index film or the like in the horizontal direction (the direction perpendicular to the side surface of the lens 311), the semiconductor chip 330 having optical characteristics and reliability similar to those of the semiconductor chip 300 can be manufactured.
In the fourth embodiment, the integrated portion 310 (340) is formed after the solid-state imaging element 11 is formed. However, like the first embodiment and the second embodiment, the solid-state imaging element 11 and the integrated portion 310 (340) may be manufactured separately, and the integrated portion 310 (340) may be bonded to the solid-state imaging element 11 after manufacturing the integrated portion 310 (340) to form the semiconductor chip 300 (330).
In addition, the number of layers of the AR coatings 212 and 341 is not limited to four, and may be any number. The lenses 16 and 311 may be lenses other than acrylic lenses.
Furthermore, in the third and fourth embodiments, the semiconductor chip 200 (300, 330) has the groove 15, but may have the other grooves 92, 112, 132, or 152 described above.
In
With reference to
A laminated substrate 11a of the solid-state imaging element 11 of
The multilayer wiring layer 422 includes a plurality of wiring layers 423, including an uppermost wiring layer 423a closest to the upper substrate 402, an intermediate wiring layer 423b, and a lowermost wiring layer 423c closest to the semiconductor substrate 421, and an interlayer insulating film 424 formed between the wiring layers 423.
The plurality of wiring layers 423 are made, for example, of copper (Cu), aluminum (Al), or tungsten (W), and the interlayer insulating film 424 is made, for example, of a silicon oxide film or a silicon nitride film. All the layers of the plurality of wiring layers 423 and the interlayer insulating film 424 may be made of the same material, or two or more materials may be used for different layers.
A through-silicon hole 425 penetrating through the semiconductor substrate 421 is formed at a predetermined position of the semiconductor substrate 421, and a connecting conductor 427 is embedded in the inner wall of the through-silicon hole 425 via an insulating film 426, whereby a through-silicon electrode (TSV: Through Silicon Via) 428 is formed. The insulating film 426 can be made, for example, of a SiO2 film or a SiN film.
In the through-silicon electrode 428 shown in
The connecting conductor 427 of the through-silicon electrode 428 is connected to a rewiring 429 formed on the lower surface side of the semiconductor substrate 421, and the rewiring 429 is connected to a solder ball 430. The connecting conductor 427 and the rewiring 429 can be made, for example, of copper (Cu), tungsten (W), titanium (Ti), tantalum (Ta), titanium-tungsten alloy (TiW), polysilicon, or the like.
A solder mask (solder resist) 431 is formed on the lower surface side of the semiconductor substrate 421 so as to cover the rewiring 429 and the insulating film 426 except for the regions where the solder balls 430 are formed.
On the other hand, in the upper substrate 402, a multilayer wiring layer 452 is formed on the lower side (lower substrate 401 side) of a semiconductor substrate 451 made of silicon (Si). The multilayer wiring layer 452 constitutes the circuit of the pixel unit.
The multilayer wiring layer 452 includes a plurality of wiring layers 453 including an uppermost wiring layer 453a closest to the semiconductor substrate 451, an intermediate wiring layer 453b, and a lowermost wiring layer 453c closest to the lower substrate 401, and an interlayer insulating film 454 formed between the wiring layers 453.
The materials used for the plurality of wiring layers 453 and the interlayer insulating film 454 may be the same kinds of materials as those used for the wiring layers 423 and the interlayer insulating film 424. In addition, one or two or more materials may be used separately for the plurality of wiring layers 453 and the interlayer insulating film 454 similarly to the wiring layers 423 and the interlayer insulating film 424 described above.
In the example of
The upper surface of the semiconductor substrate 451 is provided with a light-receiving surface in which photodiodes 455 as photoelectric conversion elements formed by PN junctions are arranged two-dimensionally for each pixel. The photodiode 455 generates and accumulates charges (signal charges) corresponding to the amount of light received.
Although not shown, a plurality of pixel transistors, a memory unit, and the like other than the photodiodes 455, which constitute the pixel unit, are also formed in the semiconductor substrate 451 and the multilayer wiring layer 452.
A through-silicon electrode 457 connected to the wiring layer 453a of the upper substrate 402 and a through-chip electrode 458 connected to the wiring layer 423a of the lower substrate 401 are formed at predetermined positions of the semiconductor substrate 451 where the color filters 456 of R (red), G (green), or B (blue) and the on-chip lenses 11b are not formed.
The through-silicon electrode 457 and the through-chip electrode 458 are connected by a connection wiring 459 formed on the upper surface of the semiconductor substrate 451. An insulating film 460 is formed between each of the through-silicon electrode 457 and the through-chip electrode 458 and the semiconductor substrate 451. Furthermore, the color filter 456 and the on-chip lens 11b are formed on the upper surface of the semiconductor substrate 451 with an insulating film (planarization film) 461 interposed therebetween.
As described above, the laminated substrate 11a of the solid-state imaging element 11 has a laminated structure in which the multilayer wiring layer 422 side of the lower substrate 401 and the multilayer wiring layer 452 side of the upper substrate 402 are bonded together. In
In the laminated substrate 11a, the wiring layer 453 of the upper substrate 402 and the wiring layer 423 of the lower substrate 401 are connected by two through-electrodes, that is, the through-silicon electrode 457 and the through-chip electrode 458, and the wiring layer 423 of the lower substrate 401 and the solder balls (rear electrodes) 430 are connected by the through-silicon electrodes 428 and the rewirings 429. As a result, the surface area of the solid-state imaging element 11 can be minimized. Therefore, the semiconductor chip 10 can be miniaturized.
In the solid-state imaging element 11 of
The solid-state imaging element 11 of
That is, in the solid-state imaging element 11 of
The method for connection with the solder balls 430 on the lower side of the solid-state imaging element 11 of
On the other hand, the solid-state imaging element 11 of
The dummy wiring 511 is provided to reduce the influence of unevenness during metal bonding (Cu—Cu bonding) between the uppermost wiring layer 423a on the lower substrate 401 side and the lowermost wiring layer 453c on the upper substrate 402 side. That is, if the rewiring 429 is formed only in a partial region of the lower surface of the semiconductor substrate 421 when performing Cu—Cu bonding, unevenness occurs due to the difference in thickness between the regions with and without the rewiring 429. Therefore, by providing the dummy wiring 511, the influence of the unevenness can be reduced.
Although illustration is omitted, the structure of the solid-state imaging element 11 in the semiconductor chips 140, 200, 300, and 330 is also the same as that of the solid-state imaging element 11 in
The semiconductor chip 10 (140, 200, 300, 330) described above can be applied to various electronic devices, for example, imaging devices such as digital still cameras and digital video cameras, mobile phones with an imaging function, and other devices with an imaging function.
The imaging device 1001 shown in
The optical system 1002 includes one or more lenses and directs light from an object (incident light) to the solid-state imaging device 1004 and forms an image on the light-receiving surface of the solid-state imaging device 1004.
The shutter device 1003 is arranged between the optical system 1002 and the solid-state imaging device 1004, and controls a light irradiation period and a light shielding period for the solid-state imaging device 1004 according to control by the drive circuit 1005.
The solid-state imaging device 1004 is configured by the semiconductor chip 10 (140, 200, 300, 330) described above. The solid-state imaging device 1004 accumulates signal charges for a certain period of time according to the light imaged on the light-receiving surface via the optical system 1002 and the shutter device 1003. The signal charges accumulated in the solid-state imaging device 1004 are transferred according to the drive signal (timing signal) supplied from the drive circuit 1005.
The drive circuit 1005 outputs a drive signal that controls the transfer operation of the solid-state imaging device 1004 and the shutter operation of the shutter device 1003, and drives the solid-state imaging device 1004 and the shutter device 1003.
The signal processing circuit 1006 performs various signal processes on the signal charges output from the solid-state imaging device 1004. An image (image data) obtained by the signal processing performed by the signal processing circuit 1006 is supplied to the monitor 1007 for display or supplied to the memory 1008 for storage (recording).
In the imaging device 1001 configured in this way, by applying the semiconductor chip 10 (140, 200, 300, 330) as the solid-state imaging device 1004, the reliability and durability of the imaging device 1001 can be improved.
The semiconductor chip 10 (140, 200, 300, 330) described above can be used in various cases of sensing light such as visible light, infrared light, ultraviolet light, and X-rays, for example, as follows.
The technology of the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
The endoscope 11100 includes a lens barrel 11101, a region of which having a predetermined length from the distal end is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the base end of the lens barrel 11101. In the illustrated example, the endoscope 11100 is configured as a so-called rigid endoscope having the rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible endoscope having a flexible lens barrel.
The distal end of the lens barrel 11101 is provided with an opening into which an objective lens is fitted. A light source device 11203 is connected to the endoscope 11100, light generated by the light source device 11203 is guided to the distal end of the lens barrel 11101 by a light guide extended to the inside of the lens barrel 11101, and the light is radiated toward an observation target in the body cavity of the patient 11132 through the objective lens. The endoscope 11100 may be a direct viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
An optical system and an imaging element are provided inside the camera head 11102, and the reflected light (observation light) from the observation target converges on the imaging element by the optical system. The observation light is photoelectrically converted by the imaging element, and an electrical signal corresponding to the observation light, that is, an image signal corresponding to an observation image is generated. The image signal is transmitted to a camera control unit (CCU) 11201 as RAW data.
The CCU 11201 is constituted by a central processing unit (CPU), a graphics processing unit (GPU), and the like and comprehensively controls the operation of the endoscope 11100 and a display device 11202. In addition, the CCU 11201 receives an image signal from the camera head 11102 and performs various types of image processing for displaying an image based on the image signal, for example, development processing (demosaic processing) on the image signal.
The display device 11202 displays the image based on the image signal subjected to the image processing by the CCU 11201 under the control of the CCU 11201.
The light source device 11203 is constituted of, for example, a light source such as an LED (Light Emitting Diode) and supplies the endoscope 11100 with irradiation light when photographing a surgical site or the like.
An input device 11204 is an input interface for the endoscopic surgery system 11000. The user can input various types of information or instructions to the endoscopic surgery system 11000 via the input device 11204. For example, the user inputs an instruction to change imaging conditions (a type of radiation light, a magnification, a focal length, or the like) of the endoscope 11100.
A treatment tool control device 11205 controls driving of the energized treatment tool 11112 for cauterization or incision of a tissue, sealing of a blood vessel, or the like. A pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 via the pneumoperitoneum tube 11111 in order to inflate the body cavity for the purpose of securing a field of view for the endoscope 11100 and a working space for the operator. A recorder 11207 is a device capable of recording various types of information on surgery. A printer 11208 is a device capable of printing various types of information on surgery in various formats such as text, images, and graphs.
The light source device 11203 that supplies the endoscope 11100 with the radiation light for imaging the surgical site can be constituted by, for example, an LED, a laser light source, or a white light source configured as a combination thereof. When a white light source is formed by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, and thus the light source device 11203 can adjust the white balance of the captured image. Further, in this case, laser light from each of the RGB laser light sources is radiated to the observation target in a time division manner, and driving of the imaging element of the camera head 11102 is controlled in synchronization with the radiation timing, such that images corresponding to the respective RGB colors can be captured in a time division manner. According to this method, a color image can be obtained without providing a color filter in the imaging element.
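As a rough illustration of this time-division capture, the sketch below stacks three monochrome frames, each captured while only one of the R, G, and B laser sources is emitting, into a single color image. The helper name and synthetic frames are illustrative; the actual control is performed by the light source device and camera head hardware.

```python
import numpy as np

def combine_time_division_frames(frame_r, frame_g, frame_b):
    """Stack three monochrome frames, each captured while only the R, G,
    or B laser source was emitting, into one RGB image. No color filter
    is required on the imaging element itself."""
    assert frame_r.shape == frame_g.shape == frame_b.shape
    return np.stack([frame_r, frame_g, frame_b], axis=-1)

# Synthetic 4x4 frames standing in for three time-division captures
r = np.full((4, 4), 200, dtype=np.uint8)
g = np.full((4, 4), 120, dtype=np.uint8)
b = np.full((4, 4), 40, dtype=np.uint8)
color = combine_time_division_frames(r, g, b)
print(color.shape)  # (4, 4, 3)
```

Because each frame already corresponds to a single wavelength, no per-pixel color filter interpolation (demosaicing) is needed.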
Further, driving of the light source device 11203 may be controlled so that an intensity of output light is changed at predetermined time intervals. The driving of the imaging element of the camera head 11102 is controlled in synchronization with a timing of changing the intensity of the light, and images are acquired in a time division manner and combined, such that an image having a high dynamic range without so-called blackout and whiteout can be generated.
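The exposure-combining step can be sketched as follows, assuming a simple two-level scheme in which frames are captured at alternating full and reduced light intensity (the function name, intensity ratio, and saturation threshold are assumptions, not the device's actual algorithm). Saturated pixels in the bright frame are replaced by rescaled values from the dim frame, extending the usable dynamic range.

```python
import numpy as np

def merge_exposures(bright, dim, dim_ratio=0.25, sat=250):
    """Combine frames captured at alternating light intensities:
    'bright' at full intensity, 'dim' at dim_ratio of it. Pixels that
    saturate in the bright frame are replaced by dim-frame values
    rescaled to the same radiometric scale, suppressing whiteout while
    the bright frame suppresses blackout."""
    bright = bright.astype(np.float32)
    dim_rescaled = dim.astype(np.float32) / dim_ratio
    return np.where(bright >= sat, dim_rescaled, bright)

bright = np.array([[100.0, 255.0]])  # second pixel is saturated
dim = np.array([[25.0, 80.0]])
hdr = merge_exposures(bright, dim)
print(hdr)  # [[100. 320.]]
```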
In addition, the light source device 11203 may have a configuration in which light in a predetermined wavelength band corresponding to special light observation can be supplied. In the special light observation, for example, by emitting light in a band narrower than that of radiation light (that is, white light) during normal observation using wavelength dependence of light absorption in a body tissue, so-called narrow band light observation (narrow band imaging) in which a predetermined tissue such as a blood vessel in a mucous membrane surface layer is imaged with a high contrast is performed. Alternatively, in the special light observation, fluorescence observation in which an image is obtained by fluorescence generated by emitting excitation light may be performed. The fluorescence observation can be performed by emitting excitation light to a body tissue and observing fluorescence from the body tissue (autofluorescence observation), or locally injecting a reagent such as indocyanine green (ICG) to a body tissue and emitting excitation light corresponding to a fluorescence wavelength of the reagent to the body tissue to obtain a fluorescence image. The light source device 11203 may have a configuration in which narrow band light and/or excitation light corresponding to such special light observation can be supplied.
The camera head 11102 includes a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405. The CCU 11201 has a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera head 11102 and the CCU 11201 are communicatively connected to each other by a transmission cable 11400.
The lens unit 11401 is an optical system provided in a connection portion for connection to the lens barrel 11101. Observation light taken from a tip of the lens barrel 11101 is guided to the camera head 11102 and is incident on the lens unit 11401. The lens unit 11401 is configured in combination of a plurality of lenses including a zoom lens and a focus lens.
The imaging unit 11402 is constituted by an imaging element. The imaging element constituting the imaging unit 11402 may be one element (a so-called single plate type) or a plurality of elements (a so-called multi-plate type). When the imaging unit 11402 is configured as a multi-plate type, for example, image signals corresponding to RGB are generated by the imaging elements, and a color image may be obtained by synthesizing the image signals. Alternatively, the imaging unit 11402 may be configured to include a pair of imaging elements for acquiring image signals for the right eye and the left eye corresponding to three-dimensional (3D) display. When 3D display is performed, the operator 11131 can ascertain the depth of biological tissues in the surgical site more accurately. When the imaging unit 11402 is configured in a multi-plate type, a plurality of systems of lens units 11401 may be provided in correspondence to the imaging elements.
The imaging unit 11402 need not necessarily be provided in the camera head 11102. For example, the imaging unit 11402 may be provided immediately after the objective lens inside of the lens barrel 11101.
The drive unit 11403 is constituted by an actuator and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. Accordingly, the magnification and focus of the image captured by the imaging unit 11402 can be adjusted appropriately.
The communication unit 11404 is configured using a communication device for transmitting and receiving various types of information to and from the CCU 11201. The communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
The communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201 and supplies the camera head control unit 11405 with the control signal. The control signal includes, for example, information regarding imaging conditions such as information indicating designation of a frame rate of a captured image, information indicating designation of an exposure value at the time of imaging, and/or information indicating designation of a magnification and a focus of the captured image.
The imaging conditions such as the frame rate, the exposure value, the magnification, and the focus may be appropriately designated by the user, or may be automatically set by the control unit 11413 of the CCU 11201 on the basis of the acquired image signal. In the latter case, a so-called auto exposure (AE) function, a so-called auto focus (AF) function, and a so-called auto white balance (AWB) function are provided in the endoscope 11100.
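As a minimal sketch of how an imaging condition might be set automatically from the acquired image signal, the following illustrates one auto-exposure (AE) iteration. The target mean brightness and helper name are assumptions for illustration, not the actual CCU implementation.

```python
import numpy as np

def auto_exposure_step(frame, current_exposure, target_mean=118.0,
                       min_exp=1.0, max_exp=1000.0):
    """One iteration of a simple AE loop: scale the exposure value so
    that the mean brightness of the next frame approaches target_mean,
    clamped to the supported exposure range."""
    mean = float(frame.mean())
    if mean <= 0:
        return max_exp  # completely dark frame: open up fully
    new_exp = current_exposure * (target_mean / mean)
    return float(np.clip(new_exp, min_exp, max_exp))

frame = np.full((8, 8), 59.0)  # underexposed frame, mean = 59
print(auto_exposure_step(frame, current_exposure=10.0))  # 20.0
```

Auto focus and auto white balance follow the same pattern of measuring the acquired image and feeding a correction back into the control signal.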
The camera head control unit 11405 controls driving of the camera head 11102 on the basis of a control signal from the CCU 11201 received via the communication unit 11404.
The communication unit 11411 is constituted of a communication device that transmits and receives various kinds of information to and from the camera head 11102. The communication unit 11411 receives an image signal transmitted via the transmission cable 11400 from the camera head 11102.
Further, the communication unit 11411 transmits the control signal for controlling the driving of the camera head 11102 to the camera head 11102. The image signal or the control signal can be transmitted through electric communication, optical communication, or the like.
The image processing unit 11412 performs various types of image processing on the image signal that is the RAW data transmitted from the camera head 11102.
The control unit 11413 performs various kinds of control on imaging of a surgical site by the endoscope 11100, display of a captured image obtained through imaging of a surgical site, or the like. For example, the control unit 11413 generates a control signal for controlling driving of the camera head 11102.
In addition, the control unit 11413 causes the display device 11202 to display a captured image showing a surgical site or the like on the basis of an image signal subjected to the image processing by the image processing unit 11412. In doing so, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, the control unit 11413 can recognize a surgical instrument such as forceps, a specific biological site, bleeding, mist or the like at the time of use of the energized treatment tool 11112, or the like by detecting a shape, a color, or the like of an edge of an object included in the captured image. When the display device 11202 is caused to display a captured image, the control unit 11413 may superimpose various kinds of surgery support information on an image of the surgical site for display using a recognition result of the captured image. By displaying the surgery support information in a superimposed manner and presenting it to the operator 11131, a burden on the operator 11131 can be reduced, and the operator 11131 can reliably proceed with the surgery.
The transmission cable 11400 that connects the camera head 11102 and the CCU 11201 is an electrical signal cable compatible with communication of electrical signals, an optical fiber compatible with optical communication, or a composite cable of these.
Here, although wired communication is performed using the transmission cable 11400 in the illustrated example, communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
An example of an endoscopic surgery system to which the technology according to the present disclosure can be applied has been described above. The technology according to the present disclosure can be applied to the imaging unit 11402 and the like among the configurations described above. Specifically, the semiconductor chip 10 (140, 200, 300, 330) can be applied to the imaging unit 11402. By applying the technology according to the present disclosure to the imaging unit 11402, the reliability and durability of the imaging unit 11402 can be improved.
Here, although the endoscopic surgery system has been described as an example, the technology according to the present disclosure may be applied to other systems, for example, a microscopic surgery system.
The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be realized as a device equipped in any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, and a robot.
The vehicle control system 12000 includes a plurality of electronic control units connected to each other via a communication network 12001. In the example illustrated in
The drive system control unit 12010 controls operations of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device for a driving force generation device that generates a driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism that transmits the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, and a braking device that generates a braking force of the vehicle.
The body system control unit 12020 controls operations of various devices mounted in the vehicle body according to various programs. For example, the body system control unit 12020 functions as a control device of a keyless entry system, a smart key system, a power window device, or various lamps such as a headlamp, a back lamp, a brake lamp, a turn signal, and a fog lamp. In this case, radio waves transmitted from a portable device that substitutes for a key or signals of various switches may be input to the body system control unit 12020. The body system control unit 12020 receives inputs of the radio waves or signals and controls a door lock device, a power window device, and a lamp of the vehicle.
The outside-vehicle information detection unit 12030 detects information on the outside of the vehicle on which the vehicle control system 12000 is mounted. For example, an imaging unit 12031 is connected to the outside-vehicle information detection unit 12030. The outside-vehicle information detection unit 12030 causes the imaging unit 12031 to capture an image of the outside of the vehicle and receives the captured image. The outside-vehicle information detection unit 12030 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on a road surface, and the like on the basis of the received image.
The imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of the received light. The imaging unit 12031 can also output the electrical signal as an image or distance measurement information. In addition, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
The inside-vehicle information detection unit 12040 detects information on the inside of the vehicle. For example, a driver state detection unit 12041 that detects a driver's state is connected to the inside-vehicle information detection unit 12040. The driver state detection unit 12041 includes, for example, a camera that captures an image of a driver, and the inside-vehicle information detection unit 12040 may calculate a degree of fatigue or concentration of the driver or may determine whether or not the driver is dozing on the basis of detection information input from the driver state detection unit 12041.
The microcomputer 12051 can calculate a control target value of the driving force generation device, the steering mechanism, or the braking device on the basis of the information on the outside or the inside of the vehicle acquired by the outside-vehicle information detection unit 12030 or the inside-vehicle information detection unit 12040, and output a control command to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing functions of an advanced driver assistance system (ADAS), including collision avoidance or impact mitigation of the vehicle, following travel based on the inter-vehicle distance, constant vehicle speed travel, vehicle collision warning, vehicle lane departure warning, and the like.
Further, the microcomputer 12051 can perform cooperative control for the purpose of automated driving or the like in which autonomous travel is performed without depending on operations of the driver, by controlling the driving force generator, the steering mechanism, or the braking device and the like on the basis of information about the surroundings of the vehicle, the information being acquired by the outside-vehicle information detection unit 12030 or the inside-vehicle information detection unit 12040.
In addition, the microcomputer 12051 can output a control command to the body system control unit 12020 on the basis of the information on the outside of the vehicle acquired by the outside-vehicle information detection unit 12030. For example, the microcomputer 12051 can perform cooperative control for the purpose of preventing glare, such as switching from a high beam to a low beam, by controlling the headlamp according to the position of a preceding vehicle or an oncoming vehicle detected by the outside-vehicle information detection unit 12030.
The sound/image output unit 12052 transmits an output signal of at least one of sound and an image to an output device capable of visually or audibly notifying a passenger or the outside of the vehicle of information. In the example shown in
In
The imaging units 12101, 12102, 12103, 12104, and 12105 are provided at positions such as a front nose, side-view mirrors, a rear bumper, a back door, and an upper portion of a windshield in a vehicle interior of the vehicle 12100, for example. The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided in the upper portion of the windshield in the vehicle interior mainly acquire images of the front of the vehicle 12100. The imaging units 12102 and 12103 provided on the side-view mirrors mainly acquire images of a lateral side of the vehicle 12100. The imaging unit 12104 provided on the rear bumper or the back door mainly acquires images of the rear of the vehicle 12100. Front-view images acquired by the imaging units 12101 and 12105 are mainly used for detection of preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
Here,
At least one of the imaging units 12101 to 12104 may have a function for obtaining distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera constituted by a plurality of imaging elements or may be an imaging element that has pixels for phase difference detection.
For example, on the basis of the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can obtain the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change in that distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the closest three-dimensional object that is on the path along which the vehicle 12100 is traveling and that is traveling at a predetermined speed (for example, 0 km/h or higher) in substantially the same direction as the vehicle 12100. Further, the microcomputer 12051 can set an inter-vehicle distance to be secured in advance in front of the preceding vehicle and can perform automated brake control (including following stop control), automated acceleration control (including following start control), and the like. In this way, cooperative control can be performed for the purpose of automated driving or the like in which the vehicle travels autonomously without depending on operations of the driver.
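The preceding-vehicle extraction described above can be sketched as follows. The object representation (a dictionary with distance, speed along the host vehicle's direction, and an on-path flag) and the helper name are hypothetical; the actual processing operates on distance information and relative speeds derived from the imaging units.

```python
def extract_preceding_vehicle(objects, min_speed=0.0):
    """Pick the closest three-dimensional object that is on the host
    vehicle's path and is moving in substantially the same direction at
    or above a predetermined speed (e.g. 0 km/h or higher).
    Each object: {'distance': m, 'speed': km/h, 'on_path': bool}."""
    candidates = [o for o in objects
                  if o["on_path"] and o["speed"] >= min_speed]
    if not candidates:
        return None  # no preceding vehicle detected
    return min(candidates, key=lambda o: o["distance"])

objects = [
    {"id": "truck", "distance": 45.0, "speed": 60.0, "on_path": True},
    {"id": "car", "distance": 20.0, "speed": 55.0, "on_path": True},
    {"id": "sign", "distance": 10.0, "speed": 0.0, "on_path": False},
]
print(extract_preceding_vehicle(objects)["id"])  # car
```

The extracted object's distance would then feed the inter-vehicle distance control for automated brake and acceleration control.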
For example, on the basis of the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data regarding three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, extract the classified data, and use the data for automated avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that the driver of the vehicle 12100 can see and obstacles that are difficult for the driver to see. Then, the microcomputer 12051 determines a collision risk indicating the degree of risk of collision with each obstacle, and in a situation where the collision risk is equal to or higher than a set value and there is a possibility of collision, the microcomputer 12051 can output an alarm to the driver via the audio speaker 12061 or the display unit 12062, or perform forced deceleration or avoidance steering via the drive system control unit 12010, thereby providing driving support for collision avoidance.
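A minimal sketch of the collision-risk determination follows, assuming the risk is taken as the inverse of time-to-collision and compared against a set value. The metric, threshold, and function name are assumptions for illustration; the actual risk computation of the microcomputer 12051 is not specified here.

```python
def assess_collision_risk(distance_m, closing_speed_mps, risk_threshold=0.5):
    """Crude collision-risk indicator: the inverse of time-to-collision
    (TTC). Returns (risk, warn); warn is True when the risk is equal to
    or higher than the set value, i.e. when an alarm, forced
    deceleration, or avoidance steering would be triggered."""
    if closing_speed_mps <= 0:  # object not approaching
        return 0.0, False
    ttc = distance_m / closing_speed_mps  # seconds until collision
    risk = 1.0 / ttc
    return risk, risk >= risk_threshold

risk, warn = assess_collision_risk(distance_m=10.0, closing_speed_mps=8.0)
print(round(risk, 2), warn)  # 0.8 True
```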
At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays. For example, the microcomputer 12051 can recognize a pedestrian by determining whether there is a pedestrian in the captured images of the imaging units 12101 to 12104. Such pedestrian recognition is performed by, for example, a procedure of extracting feature points in the captured images of the imaging units 12101 to 12104 as infrared cameras and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether or not the object is a pedestrian. When the microcomputer 12051 determines that there is a pedestrian in the captured images of the imaging units 12101 to 12104 and recognizes the pedestrian, the sound/image output unit 12052 controls the display unit 12062 so that a square contour line for emphasis is displayed superimposed on the recognized pedestrian. In addition, the sound/image output unit 12052 may control the display unit 12062 so that an icon or the like indicating a pedestrian is displayed at a desired position.
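The pattern matching step of this pedestrian recognition procedure can be illustrated with a toy sketch: an extracted series of outline feature points is compared against a pedestrian template after both are normalized to their centroids. The point-series representation and tolerance are assumptions, not the actual matching algorithm.

```python
import numpy as np

def match_outline(candidate_points, template_points, tol=2.0):
    """Toy pattern matching: normalize both point series to their
    centroids (making the comparison translation-invariant) and check
    that every candidate point lies within 'tol' of the template."""
    c = np.asarray(candidate_points, dtype=float)
    t = np.asarray(template_points, dtype=float)
    if c.shape != t.shape:
        return False
    c -= c.mean(axis=0)
    t -= t.mean(axis=0)
    return bool(np.abs(c - t).max() <= tol)

template = [(0, 0), (0, 10), (4, 10), (4, 0)]
shifted = [(100, 50), (100, 60), (104, 60), (104, 50)]  # same outline, translated
print(match_outline(shifted, template))  # True
```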
An example of a vehicle control system to which the technology according to the present disclosure can be applied has been described above. The technology according to the present disclosure can be applied to the imaging unit 12031 and the like among the configurations described above. Specifically, the semiconductor chip 10 (140, 200, 300, 330) can be applied to the imaging unit 12031. By applying the technology according to the present disclosure to the imaging unit 12031, the reliability and durability of the imaging unit 12031 can be improved.
The embodiments of the present technology are not limited to the aforementioned embodiments, and various changes can be made without departing from the gist of the present technology.
For example, a combination of all or part of the above-mentioned plurality of embodiments may be employed.
The advantageous effects described in the present specification are merely examples and are not limiting, and advantageous effects other than those described in the present specification may be achieved.
The present technology can be configured as follows.
(1)
A semiconductor chip including:
The semiconductor chip according to (1), wherein
The semiconductor chip according to (1) or (2), wherein
The semiconductor chip according to any one of (1) to (3), wherein
The semiconductor chip according to any one of (1) to (3), wherein
The semiconductor chip according to (5), wherein
The semiconductor chip according to any one of (1) to (6), further including:
The semiconductor chip according to (7), wherein
The semiconductor chip according to (7), wherein
The semiconductor chip according to (9), wherein
The semiconductor chip according to (9) or (10), wherein
The semiconductor chip according to (9) or (10), wherein
A method for manufacturing a semiconductor chip, including:
The method for manufacturing the semiconductor chip according to (13), wherein
An electronic device including:
Number | Date | Country | Kind
---|---|---|---
2021-083153 | May 2021 | JP | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2022/004382 | 2/4/2022 | WO |