This application claims priority to Japanese Priority Patent Application JP 2019-219713 filed on Dec. 4, 2019, the entire contents of which are incorporated herein by reference.
The present disclosure relates to a semiconductor element and an electronic apparatus including the semiconductor element. The semiconductor element is used in an infrared sensor, for example.
Image sensors or infrared sensors having sensitivity within an infrared region have been commercially available. For example, PTL 1 discloses a semiconductor element. In the semiconductor element, an element substrate and a circuit substrate are bonded to each other through Cu—Cu bonding. In the element substrate, a compound semiconductor layer and a wiring layer are stacked on each other.
In such a semiconductor element as described above, improved reliability has been demanded.
It is desirable to provide a semiconductor element and an electronic apparatus making it possible to improve reliability.
According to an embodiment of the present disclosure, there is provided a light detecting device. The light detecting device includes an element substrate and a circuit substrate. The element substrate includes an element region and a peripheral region. The element region includes a first wiring layer and a semiconductor layer. The semiconductor layer includes a compound semiconductor material. The peripheral region is outside the element region in a plan view. The circuit substrate faces the element substrate and is electrically connected to the semiconductor layer through the first wiring layer. An outer boundary of the element substrate is different from an outer boundary of the circuit substrate.
According to another embodiment of the present disclosure, there is provided an electronic apparatus. The electronic apparatus includes a light detecting device. The light detecting device includes an element substrate and a circuit substrate. The element substrate includes an element region and a peripheral region. The element region includes a first wiring layer and a semiconductor layer. The semiconductor layer includes a compound semiconductor material. The peripheral region is outside the element region in a plan view. The circuit substrate faces the element substrate and is electrically connected to the semiconductor layer through the first wiring layer. An outer boundary of the element substrate is different from an outer boundary of the circuit substrate.
The accompanying drawings are included to provide a further understanding of the technology, and are incorporated in and constitute a part of this specification. The drawings show illustrative embodiments and, together with the specification, serve to explain various principles of the technology.
In the following, some embodiments of the present disclosure are described in detail with reference to the accompanying drawings. The embodiments described below are specific but non-limiting examples of the present disclosure, and the present disclosure is not limited to the aspects described below. Nor is the present disclosure limited to the arrangements, sizes, dimensional ratios, and other factors of the components illustrated in the drawings. It is to be noted that the description is given in the following order.
1. Embodiment: example of light receiving element having groove on peripheral part in peripheral region
1-1. Configuration of Light Receiving Element
1-2. Manufacturing Method of Light Receiving Element
1-3. Operation of Light Receiving Element
1-4. Workings and Effects
2. Modification Examples
2-1. Modification Example 1: examples of light receiving elements having groove or plurality of grooves adjacent to peripheral part in peripheral region
2-2. Modification Example 2: example of light receiving element having plurality of grooves on and adjacent to peripheral part in peripheral region
2-3. Modification Example 3: examples of light receiving elements having groove or plurality of grooves provided, from read-out circuit substrate side, on, adjacent to, or both on and adjacent to respective sides of peripheral part in peripheral region
2-4. Modification Example 4: example of light receiving element having color filter layer and on-chip lens adjacent to light incident face
3. Application Examples
4. Applied Examples
The light receiving element 1 has a laminated structure of an element substrate 10 and a read-out circuit substrate 20, as illustrated in
The element substrate 10 includes, from a position adjacent to the read-out circuit substrate 20, a wiring layer 10W, first electrodes 11, a semiconductor layer 10S, a second electrode 15, and passivation films 16A and 16B in this order. The semiconductor layer 10S has a counter surface facing the wiring layer 10W and end faces (side surfaces). The counter surface and the end faces are covered with insulating films 17A and 17B. The read-out circuit substrate 20 is a so-called read-out integrated circuit (ROIC). The read-out circuit substrate 20 includes a wiring layer 20W in contact with the bonding face S2 of the element substrate 10, a multi-layered wiring layer 22C, and a semiconductor substrate 21. The semiconductor substrate 21 faces the element substrate 10 across the wiring layer 20W and the multi-layered wiring layer 22C.
The element substrate 10 has an element region R1 in its central part. The element region R1 serves as a light receiving region. The semiconductor layer 10S is disposed in the element region R1. In other words, the element region R1 represents a region where the semiconductor layer 10S is provided. The element region R1 includes an optical black (OPB) region R1B lying adjacent to a peripheral region R2 and covered with an electrically-conductive film 15B. The OPB region R1B is provided to surround the light receiving region. The OPB region R1B is used to acquire a pixel signal at a black level.
The peripheral region R2 is provided outside the element region R1 and surrounds the element region R1. In the peripheral region R2 of the element substrate 10, a buried layer 18 is provided, together with the insulating films 17A and 17B. In the peripheral region R2, through holes H1 and H2 are further provided. The through holes H1 and H2 pass through the element substrate 10 and reach the read-out circuit substrate 20. In the light receiving element 1, light is incident on the light incident face S1 of the element substrate 10 and enters the semiconductor layer 10S via the passivation films 16A and 16B, the second electrode 15, and a second contact layer 14. Signal electric charges having undergone photoelectric conversion in the semiconductor layer 10S move, via the wiring layer 10W, to the read-out circuit substrate 20. The read-out circuit substrate 20 then reads the signal electric charges. In the present embodiment, a groove H3 is further provided on a peripheral part (e.g., chip ends E) in the peripheral region R2. The groove H3 extends from the light incident face S1 of the element substrate 10 to inside the multi-layered wiring layer 22C, for example, in the read-out circuit substrate 20. In other words, a recess is formed on side surfaces (e.g., the chip ends E) of the light receiving element 1. The recess extends from the light incident face S1 of the element substrate 10 to the multi-layered wiring layer 22C in the read-out circuit substrate 20. Configurations of components will now be described herein.
The wiring layer 10W is provided across the element region R1 and the peripheral region R2. The wiring layer 10W includes the bonding face S2 with the read-out circuit substrate 20. In the light receiving element 1 in which the bonding face S2 of the element substrate 10 is provided across the element region R1 and the peripheral region R2, the bonding face S2 is flush across the element region R1 and the peripheral region R2, for example.
The wiring layer 10W includes, in interlayer insulating films 19A and 19B, for example, contact electrodes 19E and dummy electrodes 19ED. In the wiring layer 10W, for example, the interlayer insulating film 19B is disposed adjacent to the read-out circuit substrate 20, and the interlayer insulating film 19A is disposed adjacent to a first contact layer 12. The interlayer insulating films 19A and 19B are laminated to each other. The interlayer insulating films 19A and 19B each include, for example, an inorganic insulating material. Non-limiting examples of the inorganic insulating material include, for example, silicon nitride (SiN), aluminum oxide (Al2O3), silicon oxide (SiO2), and hafnium oxide (HfO2). The interlayer insulating films 19A and 19B may include an identical inorganic insulating material.
The contact electrodes 19E are provided in the element region R1, for example. The contact electrodes 19E electrically couple the first electrodes 11 and the read-out circuit substrate 20 to each other. The contact electrodes 19E are provided in the respective pixels P in the element region R1. The contact electrodes 19E adjacent to each other are electrically separated by the buried layer 18 and the interlayer insulating films 19A and 19B. The contact electrodes 19E include copper (Cu) pads, for example. The contact electrodes 19E are exposed to the bonding face S2. The dummy electrodes 19ED are provided in the peripheral region R2, for example. The dummy electrodes 19ED are coupled to respective dummy electrodes 22ED in the wiring layer 20W described later. The dummy electrodes 19ED and the dummy electrodes 22ED provided as described above make it possible to improve the strength of the peripheral region R2. The dummy electrodes 19ED and the contact electrodes 19E are formed in a single step, for example. The dummy electrodes 19ED include copper (Cu) pads, for example. The dummy electrodes 19ED are exposed to the bonding face S2.
The first electrodes 11 provided between the contact electrodes 19E and the semiconductor layer 10S serve as electrodes or anodes that receive a voltage for use in reading signal electric charges generated in a photoelectric conversion layer 13. The signal electric charges may be holes or electrons. Hereinafter, the signal electric charges are described as being holes for convenience. The first electrodes 11 are provided in the respective pixels P in the element region R1. The first electrodes 11 are provided so as to fill respective openings 17H of the insulating film 17A. The first electrodes 11 are in contact with the semiconductor layer 10S, more specifically, with respective diffusion regions 12A described later.
The first electrode 11 is, for example, a metal of titanium (Ti), tungsten (W), titanium nitride (TiN), platinum (Pt), gold (Au), germanium (Ge), palladium (Pd), zinc (Zn), nickel (Ni) or aluminum (Al) or a metal alloy including at least one of them. The first electrode 11 may be a single-layer film including such a constituent material as described above. The first electrode 11 may otherwise be a multi-layered film including two or more of such constituent materials as described above. For example, the first electrode 11 is a multi-layered film including titanium and tungsten. The first electrode 11 ranges in thickness from several tens of nanometers to several hundreds of nanometers, for example.
The semiconductor layer 10S includes, from a position adjacent to the wiring layer 10W, the first contact layer 12, the photoelectric conversion layer 13, and the second contact layer 14, for example. The first contact layer 12, the photoelectric conversion layer 13, and the second contact layer 14 have planar shapes identical to each other and end faces disposed at identical positions in a plan view.
The first contact layer 12 is commonly provided across all the pixels P, for example. The first contact layer 12 is disposed between the insulating film 17A and the photoelectric conversion layer 13. The first contact layer 12 electrically separates the pixels P adjacent to each other. In the first contact layer 12, the plurality of diffusion regions 12A is provided, for example. Using, in the first contact layer 12, a compound semiconductor material greater in band gap than a compound semiconductor material included in the photoelectric conversion layer 13 makes it possible to suppress a dark current. The first contact layer 12 may include, for example, n-type indium phosphide (InP).
The diffusion regions 12A are disposed in the first contact layer 12 and are separated from each other. The diffusion regions 12A are provided in the respective pixels P. The first electrodes 11 are coupled to the respective diffusion regions 12A. The OPB region R1B also includes the diffusion regions 12A. The diffusion regions 12A are used to read signal electric charges generated in the photoelectric conversion layer 13 from the respective pixels P. The diffusion regions 12A include a p-type impurity, for example. Non-limiting examples of the p-type impurity include zinc (Zn). As described above, p-n junctions are respectively formed between the diffusion regions 12A and the first contact layer 12 excluding the diffusion regions 12A to electrically separate the pixels P adjacent to each other. The diffusion regions 12A extend in a thickness direction of the first contact layer 12, for example. The diffusion regions 12A also extend partway in the thickness direction of the photoelectric conversion layer 13.
The photoelectric conversion layer 13 between the first electrodes 11 and the second electrode 15, more specifically, between the first contact layer 12 and the second contact layer 14 is commonly provided across all the pixels P, for example. The photoelectric conversion layer 13 absorbs light at a predetermined wavelength to generate signal electric charges. The photoelectric conversion layer 13 includes, for example, a compound semiconductor material such as an i-type group III-V semiconductor. Non-limiting examples of the compound semiconductor material included in the photoelectric conversion layer 13 include, for example, indium gallium arsenide (InGaAs), indium arsenic antimony (InAsSb), indium arsenide (InAs), indium antimony (InSb), and mercury cadmium telluride (HgCdTe). The photoelectric conversion layer 13 may include germanium (Ge). In the photoelectric conversion layer 13, for example, light at a wavelength within a region from the visible region to the short infrared region undergoes photoelectric conversion.
The second contact layer 14 is commonly provided across all the pixels P, for example. The second contact layer 14 is provided between and in contact with the photoelectric conversion layer 13 and the second electrode 15. Electric charges discharged from the second electrode 15 move to the second contact layer 14. The second contact layer 14 includes, for example, a compound semiconductor including an n-type impurity. The second contact layer 14 may include, for example, n-type indium phosphide (InP).
It is to be noted that a light absorption rate of the compound semiconductor included in the second contact layer 14 changes in accordance with a wavelength. Adjusting a film thickness of the second contact layer 14 therefore makes it possible to allow light at a wavelength within a desired wavelength band to reach the photoelectric conversion layer 13.
Similar to the second contact layer 14, the light absorption rate of the photoelectric conversion layer 13 including a compound semiconductor also changes in accordance with a wavelength. It is therefore desirable that, to allow blue light at a wavelength of 400 nm, serving as light within the visible region, to undergo photoelectric conversion in the photoelectric conversion layer 13, the photoelectric conversion layer 13 have a thickness of 100 nm or greater, for example. It is also desirable that, to allow light at a wavelength within the short infrared region to undergo photoelectric conversion, the photoelectric conversion layer 13 have a thickness of 3 μm or greater, for example. Furthermore, it is desirable that, to allow light at a wavelength within a region from the visible region to the short infrared region to undergo photoelectric conversion, the photoelectric conversion layer 13 have a thickness in a range from 500 nm to 6 μm, for example.
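This thickness dependence can be sketched, as an illustrative aside that is not recited in the disclosure, with the Beer–Lambert relation, in which the fraction A of incident light absorbed in a layer of thickness d grows with the wavelength-dependent absorption coefficient α(λ):

A(\lambda) = 1 - e^{-\alpha(\lambda)\,d}

For an assumed, purely illustrative value of α ≈ 10^{4} cm^{-1} (roughly the order of magnitude for InGaAs near 1.5 μm), a thickness d = 3 μm gives A = 1 − e^{−3} ≈ 0.95, which is consistent with the 3 μm guideline above.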
The second electrode 15 serves as a common electrode among the pixels P, for example. The second electrode 15 is provided on and in contact with a light-incident side of the second contact layer 14. The second electrode 15 serves as a cathode to discharge electric charges that do not serve as signal electric charges, among electric charges generated in the photoelectric conversion layer 13. For example, in a case where holes are read from the first electrodes 11 as signal electric charges, the second electrode 15 discharges electrons. The second electrode 15 includes an electrically-conductive film allowing incident light such as infrared light to pass through, for example. The second electrode 15 may include, for example, indium tin oxide (ITO) or ITiO (In2O3—TiO2).
The passivation films 16A and 16B cover the second electrode 15 from a side of the light incident face S1. It is desirable that the passivation films 16A and 16B each include a material that does not absorb light at a wavelength within a region from the visible region (from 380 nm or greater to less than 780 nm, for example) to the short infrared region (from 780 nm or greater to less than 2400 nm, for example). The passivation films 16A and 16B may include an identical material. Alternatively, the passivation films 16A and 16B may respectively include materials different from each other. Furthermore, the passivation films 16A and 16B may have anti-reflection properties. The passivation films 16A and 16B may be formed through an atomic layer deposition (ALD) method, a chemical vapor deposition (CVD) method, a physical vapor deposition (PVD) method, or an application method, for example.
The passivation film 16A is provided on the second electrode 15, as described above. The passivation film 16A extends to chip ends E in the peripheral region R2, for example. The passivation film 16A has an opening 16H in the OPB region R1B. The opening 16H has a frame shape so as to surround the light receiving region, as illustrated in
It is desirable that the passivation film 16A include a material having non-reducing properties. Non-limiting examples of the material having non-reducing properties include oxides (MxOy), nitrides (MxNy), and oxynitrides (MxOyNz). M represents, for example, silicon (Si), titanium (Ti), hafnium (Hf), zirconium (Zr), or yttrium (Y). The letters x, y, and z each represent an integer of 1 or greater. In a case where silicon nitride (SiN) is used, it is desirable to apply a film forming method that does not use a reducing gas. Non-limiting examples of such a film forming method include a sputtering method and an application method. The passivation film 16A may be formed into a single-layer film including such a material as described above, for example. It is desirable that, in a case where the passivation film 16A is provided as a single-layer film, the single-layer film have a film density of 2.0 g/cm3 or higher. No particular upper limit is specified for the film density. The film density may be 8.0 g/cm3 or lower, for example. It is to be noted that the film density is defined as the mass of the thin film divided by its volume (g/cm3). The film density is acquired through an X-ray reflectivity (XRR) measurement method, for example. The passivation film 16A having such a film density has sealing properties. Alternatively, the passivation film 16A may be formed into a multi-layered film. Furthermore, the passivation film 16A may be a multi-layered film including three or more layers, i.e., films 16A1, 16A2, 16A3, 16A4, to 16AX, laminated on the second electrode 15, as illustrated in
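As a worked illustration of the film density expression (the thickness and mass figures below are assumed purely for illustration and are not taken from the disclosure), a single-layer film of thickness 100 nm over an area of 1 cm² has a volume of

V = (100\,\mathrm{nm}) \times (1\,\mathrm{cm}^2) = 10^{-5}\,\mathrm{cm}^3,

so a measured film mass of m = 2.6 × 10^{-5} g yields

\rho = \frac{m}{V} = \frac{2.6 \times 10^{-5}\,\mathrm{g}}{10^{-5}\,\mathrm{cm}^3} = 2.6\,\mathrm{g/cm^3} \ge 2.0\,\mathrm{g/cm^3},

satisfying the desirable lower limit stated above.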
The passivation film 16B is provided to cover the passivation film 16A and the electrically-conductive film 15B. The passivation film 16B extends to the chip ends E in the peripheral region R2, similar to the passivation film 16A, for example. The passivation film 16B may include, for example, silicon nitride (SiN), aluminum oxide (Al2O3), silicon oxide (SiO2), or tantalum oxide (Ta2O5). It is to be noted that no particular method of forming a silicon nitride (SiN) film is specified in forming the passivation film 16B. The passivation film 16B may be a silicon nitride (SiN) film formed through a plasma CVD method using a reducing gas, instead of the sputtering method or the application method.
The insulating films 17A and 17B are provided between the first contact layer 12 and the buried layer 18. Specifically, the insulating film 17A covers a counter surface, facing the wiring layer 10W, of the first contact layer 12. The insulating film 17A further covers end faces of the first contact layer 12, end faces of the photoelectric conversion layer 13, end faces of the second contact layer 14, and end faces of the second electrode 15. In the peripheral region R2, the insulating film 17A is in contact with the passivation films 16A and 16B. The insulating film 17B is provided along and in contact with the first electrodes 11 and the insulating film 17A. In other words, the insulating film 17B covers a bonding face of the first contact layer 12 with the first electrodes 11 and the insulating film 17A interposed. The insulating film 17B also covers the end faces of the first contact layer 12, the end faces of the photoelectric conversion layer 13, the end faces of the second contact layer 14, and the end faces of the second electrode 15 with the insulating film 17A interposed. In the peripheral region R2, the insulating film 17B extends, together with the insulating film 17A, to the chip ends E, for example.
The insulating film 17A includes, for example, an oxide such as silicon oxide (SiOX) or aluminum oxide (Al2O3). The insulating film 17A may have a layered structure including a plurality of films. The insulating film 17A may include, for example, a silicon (Si)-based insulating material such as silicon oxynitride (SiON), carbon-containing silicon oxide (SiOC), silicon nitride (SiN), or silicon carbide (SiC). The insulating film 17A has a thickness in a range from several tens of nanometers to several hundreds of nanometers, for example. It is desirable that the insulating film 17B include a material, among the above-described insulating materials usable for the insulating film 17A, having high passivation properties. It is desirable to use silicon nitride (SiN), for example. This makes it possible to improve protection of the semiconductor layer 10S. The insulating film 17B has a thickness in a range from 100 nm to 200 nm, for example.
The electrically-conductive film 15B extends from the OPB region R1B to the through holes H1 in the peripheral region R2. The electrically-conductive film 15B is in contact with the second electrode 15 via the opening 16H, provided in the OPB region R1B, of the passivation film 16A. The electrically-conductive film 15B is also in contact with wiring lines of the read-out circuit substrate 20 (wiring lines 22CB described later) via the through holes H1. A voltage is therefore supplied from the read-out circuit substrate 20, via the electrically-conductive film 15B, to the second electrode 15. The electrically-conductive film 15B serves as a path along which the voltage is supplied to the second electrode 15, as described above. The electrically-conductive film 15B also serves as a light shielding film. The electrically-conductive film 15B forms the OPB region R1B. The electrically-conductive film 15B includes, for example, a metal material such as tungsten (W), aluminum (Al), titanium (Ti), molybdenum (Mo), tantalum (Ta), or copper (Cu). The passivation film 16B may be provided on the electrically-conductive film 15B.
An adhesion layer B may be provided between end parts of the second contact layer 14 and the second electrode 15. The adhesion layer B is used to form the light receiving element 1, as will be described later. The adhesion layer B bonds the semiconductor layer 10S to a temporary substrate 33 described later and illustrated in
The adhesion layer B may extend across a large part of the peripheral region R2. For example, the adhesion layer B may extend from positions adjacent to edges of the semiconductor layer 10S, i.e., the element region R1, to respective positions between the through holes H1 and the through holes H2. The adhesion layer B may otherwise extend from positions adjacent to the edges of the semiconductor layer 10S, i.e., the element region R1, to the chip ends, i.e., the chip ends E.
In manufacturing steps of the light receiving element 1, level differences between the semiconductor layer 10S and the temporary substrate 33, which is described later and illustrated in
In the peripheral region R2, the buried layer 18 is provided between the wiring layer 10W and the insulating film 17B, and between the wiring layer 10W and the passivation film 16A. For example, the buried layer 18 is greater in thickness than the semiconductor layer 10S. In the present embodiment, the buried layer 18 is provided to surround the semiconductor layer 10S, forming the peripheral region R2 around the semiconductor layer 10S. It is therefore possible to provide the bonding face S2 with the read-out circuit substrate 20 in the peripheral region R2. In a case where the bonding face S2 is formed in the peripheral region R2, the thickness of the buried layer 18 may be reduced. It is however desirable that the buried layer 18 cover the semiconductor layer 10S in the thickness direction. It is further desirable that the buried layer 18 wholly cover the end faces of the semiconductor layer 10S. The buried layer 18 wholly covering the end faces of the semiconductor layer 10S via the insulating films 17A and 17B makes it possible to effectively suppress moisture from entering the semiconductor layer 10S. In the element region R1, the buried layer 18 is provided between the semiconductor layer 10S and the wiring layer 10W to cover the first electrodes 11.
The buried layer 18 has a planarized surface facing the bonding face S2. In the peripheral region R2, the wiring layer 10W is provided along the planarized surface of the buried layer 18. The buried layer 18 may include an inorganic insulating material, such as silicon oxide (SiOX), silicon nitride (SiN), silicon oxynitride (SiON), carbon-containing silicon oxide (SiOC), or silicon carbide (SiC).
In the manufacturing steps of the light receiving element 1, the buried layer 18 is first formed, and the wiring layer 10W including the interlayer insulating films 19A and 19B and the contact electrodes 19E is then formed above the buried layer 18, as illustrated in
The buried layer 18 includes the through holes H1 and H2 and the groove H3. The through holes H1 and H2 and the groove H3 pass through the buried layer 18. The through holes H1 and H2 extend through the wiring layer 10W and the buried layer 18 to the read-out circuit substrate 20. The through holes H1 and H2 have a square planar shape, for example. The through holes H1 and H2 are provided to surround the element region R1, as illustrated in
The through holes H2 are provided closer in position to the chip ends E than the through holes H1, for example. The through holes H2 extend through the passivation films 16A and 16B, the buried layer 18, and the wiring layer 10W to pad electrodes 22P, described later, of the read-out circuit substrate 20. Via the through holes H2, the light receiving element 1 achieves external electrical coupling. Alternatively, the through holes H1 and H2 may not extend to the read-out circuit substrate 20. For example, the through holes H1 and H2 may extend to the wiring lines in the wiring layer 10W. The wiring lines may be coupled to the wiring lines 22CB and the pad electrodes 22P in the read-out circuit substrate 20. In a case where the adhesion layer B extends from positions adjacent to the edges of the semiconductor layer 10S, i.e., the element region R1, to positions between the through holes H1 and the through holes H2, or otherwise, to the chip ends, i.e., the chip ends E, as described above, the through holes H1 and H2 may pass through the adhesion layer B.
The groove H3 lies on the peripheral part, i.e., the chip ends E, in the peripheral region R2. The peripheral part lies farther outside than the through holes H2. The groove H3 extends from the light incident face S1 of the element substrate 10 to a position deeper than the bonding face S2 between the element substrate 10 and the read-out circuit substrate 20. Specifically, the groove H3 extends through the passivation films 16A and 16B, the buried layer 18, and the wiring layer 10W, to inside the multi-layered wiring layer 22C, for example, in the read-out circuit substrate 20. As will be described later in detail, the groove H3 is provided at a position through which a blade B passes in a manufacturing step of the light receiving element 1. The manufacturing step corresponds to a dicing step illustrated in
It is desirable that the groove H3 be a continuous groove provided on a periphery of the light receiving element 1, as illustrated in
Holes and electrons generated in the photoelectric conversion layer 13 are to be read from the first electrodes 11 and the second electrode 15. To promptly perform this reading, it is desirable that the first electrodes 11 and the second electrode 15 be provided at a distance sufficient for photoelectric conversion and so as not to be excessively separated from each other. That is, it is desirable to reduce the thickness of the element substrate 10. For example, the distance between the first electrodes 11 and the second electrode 15 or the thickness of the element substrate 10 is preferably 10 μm or less, more preferably 7 μm or less, or still more preferably 5 μm or less.
The semiconductor substrate 21 of the read-out circuit substrate 20 faces the element substrate 10 across the wiring layer 20W and the multi-layered wiring layer 22C. The semiconductor substrate 21 includes silicon (Si), for example. A plurality of transistors is provided adjacent to a surface, facing the wiring layer 20W, of the semiconductor substrate 21. For example, the plurality of transistors is used to configure read-out circuits in the respective pixels P. The wiring layer 20W includes, for example, an interlayer insulating film 22A and an interlayer insulating film 22B laminated in this order from a side adjacent to the element substrate 10. For example, the contact electrodes 22E and the dummy electrodes 22ED are provided in the interlayer insulating film 22A. The multi-layered wiring layer 22C is provided to face the element substrate 10 across the wiring layer 20W. For example, the pad electrodes 22P and the plurality of wiring lines 22CB are provided in the multi-layered wiring layer 22C. The interlayer insulating films 22A and 22B each include, for example, an inorganic insulating material. Non-limiting examples of the inorganic insulating material include, for example, silicon nitride (SiN), aluminum oxide (Al2O3), silicon oxide (SiO2), and hafnium oxide (HfO2).
The contact electrodes 22E electrically couple the first electrodes 11 and the wiring lines 22CB with each other. The contact electrodes 22E are provided in the respective pixels P in the element region R1. The contact electrodes 22E are in contact with the respective contact electrodes 19E at the bonding face S2 of the element substrate 10. The interlayer insulating film 22A electrically separates the contact electrodes 22E adjacent to each other.
The dummy electrodes 22ED provided in the peripheral region R2 are in contact with the respective dummy electrodes 19ED at the bonding face S2 of the element substrate 10. The dummy electrodes 22ED and the contact electrodes 22E are formed in a single step, for example. The contact electrodes 22E and the dummy electrodes 22ED include respective copper (Cu) pads, for example. The contact electrodes 22E and the dummy electrodes 22ED are exposed to a counter surface of the read-out circuit substrate 20. The counter surface faces the element substrate 10. That is, the contact electrodes 19E and the contact electrodes 22E, as well as the dummy electrodes 19ED and the dummy electrodes 22ED are respectively bonded to each other through Cu—Cu bonding, for example. As will be described later in detail, such bonding therefore makes it possible to make the pixels P finer.
The wiring lines 22CB coupled to the respective contact electrodes 19E are coupled to the respective transistors provided adjacent to the surface of the semiconductor substrate 21. The first electrodes 11 and the read-out circuits are coupled to each other in the respective pixels P. The wiring lines 22CB coupled to the electrically-conductive film 15B via the through holes H1 each have a predetermined potential, for example. As described above, the read-out circuits read holes, for example, out of electric charges generated in the photoelectric conversion layer 13, from the first electrodes 11 via the contact electrodes 19E and 22E. The remaining electric charges, i.e., electrons, generated in the photoelectric conversion layer 13 are discharged from the second electrode 15 via the electrically-conductive film 15B at the predetermined potential.
The pad electrodes 22P provided in the peripheral region R2 allow external electrical coupling. The through holes H2 extending through the element substrate 10 to the pad electrodes 22P are provided adjacent to the chip ends E of the light receiving element 1. External electrical coupling is thus to be achieved via the through holes H2. For example, such coupling is achieved through a wire bonding method or a bumping method. For example, the predetermined potential may be supplied from external terminals disposed in the respective through holes H2 to the second electrode 15 via the through holes H2, the wiring lines 22CB of the read-out circuit substrate 20, and the electrically-conductive film 15B. The read-out circuits in the semiconductor substrate 21 may read, via the contact electrodes 19E and 22E, signal voltages read from the respective first electrodes 11 as a result of photoelectric conversion in the photoelectric conversion layer 13. The signal voltages may be outputted, via the read-out circuits, to the external terminals disposed in the respective through holes H2. Signal voltages may be outputted, via the read-out circuits and other circuits included in the read-out circuit substrate 20, for example, to the external terminals. Non-limiting examples of the other circuits include a signal processing circuit and an output circuit. In an embodiment, the pad electrode 22P is provided in the element substrate 10 as illustrated in
It is preferred that the read-out circuit substrate 20 be greater in thickness than the element substrate 10. For example, the read-out circuit substrate 20 is preferably two or more times, more preferably five or more times, or still more preferably ten or more times as thick as the element substrate 10. Otherwise, the thickness of the read-out circuit substrate 20 is, for example, 100 μm or greater, 150 μm or greater, or 200 μm or greater. The read-out circuit substrate 20 having a greater thickness, as described above, secures mechanical strength of the light receiving element 1. It is to be noted that the read-out circuit substrate 20 may include only the semiconductor substrate 21 in which the circuits are formed. Alternatively, the read-out circuit substrate 20 may further include another substrate such as a support substrate, in addition to the semiconductor substrate 21 in which the circuits are formed.
It is to be noted that a metal pattern, for example, may be formed on the multi-layered wiring layer 22C, in addition to the pad electrodes 22P and the plurality of wiring lines 22CB. The metal pattern serves as, for example, a mark in a bonding step of the temporary substrate 33 and the read-out circuit substrate 20, as illustrated in
The light receiving element 1 may be manufactured through steps described below.
The growth substrate 31 on which the semiconductor layer 10S is formed is bonded to the temporary substrate 33 with the adhesion layer B interposed, as illustrated in
After the growth substrate 31 on which the semiconductor layer 10S is formed is bonded to the temporary substrate 33, the growth substrate 31 is removed, as illustrated in
While the semiconductor layer 10S is etched, the adhesion layer B is also etched together with the semiconductor layer 10S, for example. The adhesion layer B may be etched into an area greater than that of the semiconductor layer 10S. The adhesion layer B may extend around the semiconductor layer 10S, as illustrated in
After the semiconductor layer 10S is shaped, the diffusion regions 12A are formed in the respective pixels P in the semiconductor layer 10S, as illustrated in
After the diffusion regions 12A are provided in the semiconductor layer 10S, the first electrodes 11 are formed on the semiconductor layer 10S, as illustrated in
After the insulating film 17B is formed, the buried layer 18 is formed over the entire surface of the temporary substrate 33, as illustrated in
After the buried layer 18 is formed, the wiring layer 10W is formed to face the semiconductor layer 10S across the buried layer 18, as illustrated in
After the wiring layer 10W is formed, the read-out circuit substrate 20 is bonded to the temporary substrate 33 with the wiring layer 10W interposed, as illustrated in
After the read-out circuit substrate 20 is bonded to the temporary substrate 33, the temporary substrate 33 is removed, as illustrated in
After the temporary substrate 33 is removed, the insulating layer 33IA and the adhesion layer B, for example, are also removed to allow the surface of the semiconductor layer 10S to be exposed, as illustrated in
Next, the through holes H2 extending through the element substrate 10 to the pad electrodes 22P in the read-out circuit substrate 20 are formed, as illustrated in
The groove H3 has the width W1 greater than the width W2 of the blade B used during the dicing step. The groove H3 further has a bottom surface at a deeper position than the bonding face S2 between the element substrate 10 and the read-out circuit substrate 20, e.g., a deeper position than the pad electrodes 22P exposed to the bottom surfaces in the through holes H2. It is desirable that no metal pattern such as the wiring lines 22CB be formed, and that only the insulating film be formed, at the position where the groove H3 is to be formed, as described above. It is therefore possible to form the groove H3 and the through holes H2 in a single step. It is to be noted that the through holes H2 and the groove H3 may be formed through a single etching step. Alternatively, the through holes H2 and the groove H3 may be formed through a plurality of etching steps. Such a plurality of etching steps is referred to as multi-stage etching. Level differences are formed on and around side surfaces of the through holes H2 and the groove H3 formed through multi-stage etching.
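As an illustrative aside, the geometric constraints just described (the groove width W1 exceeding the blade width W2, and the groove bottom lying deeper than the bonding face S2) can be summarized in a short sketch; the function name and all numeric values below are hypothetical and are not taken from the disclosure.

def check_dicing_clearance(w1_um, w2_um, groove_depth_um, bond_depth_um):
    """Return the recess width left on each chip end E after the blade passes.

    w1_um is the groove width W1; w2_um is the blade width W2; the two depths
    are measured from the light incident face S1.
    """
    assert w1_um > w2_um, "the blade must pass inside the groove (W1 > W2)"
    assert groove_depth_um > bond_depth_um, (
        "the groove bottom must lie deeper than the bonding face S2")
    # The blade runs along the groove center, so half of the leftover width
    # remains as a recess on each of the two facing chip ends E.
    return (w1_um - w2_um) / 2.0

recess = check_dicing_clearance(80.0, 40.0, groove_depth_um=12.0,
                                bond_depth_um=10.0)   # 20.0 um per chip end

Under these assumed numbers, an 80 μm groove diced with a 40 μm blade leaves a 20 μm recess on each chip end, which is the recess described above as remaining on the side surfaces after dicing.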
Finally, the blade B is inserted into the groove H3 to separate the element substrate 10 and the read-out circuit substrate 20 away from each other on a chip basis, as illustrated in
In the light receiving element 1, when light at a wavelength within a region from the visible region to the infrared region, for example, is incident on the photoelectric conversion layer 13 via the passivation films 16A and 16B, the second electrode 15, and the second contact layer 14, the photoelectric conversion layer 13 absorbs the light. This causes pairs of holes and electrons to be generated in the photoelectric conversion layer 13. That is, the light undergoes photoelectric conversion. As a predetermined voltage is applied to the first electrodes 11 at this time, for example, a potential gradient occurs in the photoelectric conversion layer 13. Ones (e.g., holes) of the generated electric charges move to the diffusion regions 12A to serve as signal electric charges, and are collected from the diffusion regions 12A to the first electrodes 11. The signal electric charges move, via the contact electrodes 19E and 22E, to the semiconductor substrate 21. The signal electric charges are then read from the respective pixels P.
In the light receiving element 1 according to the present embodiment, the groove H3 is provided on the peripheral part in the peripheral region R2 lying outside of the element region R1. The groove H3 extends from the light incident face S1 of the element substrate 10 to the multi-layered wiring layer 22C, for example, in the read-out circuit substrate 20. The configuration therefore prevents the element substrate 10 and the read-out circuit substrate 20 from peeling off each other at the bonding face S2, which will now be described herein.
As described above, image sensors or infrared sensors having sensitivity within an infrared region have been commercially available. For example, a semiconductor element has been known that includes an element substrate and a circuit substrate bonded to each other through Cu—Cu bonding. The element substrate is a laminate of a compound semiconductor layer and a wiring layer.
In the semiconductor element as described above, the element substrate does not include a semiconductor substrate, but includes only an insulating film. In such a configuration, a force pressing and bonding the element substrate and the circuit substrate to each other tends to be weak. The element substrate and the circuit substrate might therefore peel off each other at a bonding interface during the dicing step.
To deal with this concern, the groove H3 is provided outside of the through holes H2 in the peripheral region R2 in the present embodiment. The groove H3 extends from the light incident face S1 of the element substrate 10. The groove H3 is provided deeper than the bonding face S2 between the element substrate 10 and the read-out circuit substrate 20. The element substrate 10 and the read-out circuit substrate 20 are separated from each other on a chip basis within the groove H3 during the dicing step. The groove H3 therefore forms, after the dicing step, a recess on the chip ends E, i.e., the peripheral part of the light receiving element 1, making it possible to prevent the element substrate 10 and the read-out circuit substrate 20 from peeling off each other at the bonding face S2.
In the light receiving element 1 according to the present embodiment, as described above, the groove H3 is provided on the peripheral part in the peripheral region R2. The groove H3 extends from the light incident face S1 of the element substrate 10. The groove H3 is provided deeper than the bonding face S2 between the element substrate 10 and the read-out circuit substrate 20. Dicing is then performed within the groove H3. It is therefore possible to prevent the element substrate 10 and the read-out circuit substrate 20 from peeling off each other at the bonding face S2. It is thus possible to improve reliability and yields during manufacturing.
Some modification examples, i.e., modification examples 1 to 4, of the present embodiment described above will now be described. Note that like elements are denoted with the same reference numerals, and redundant descriptions thereof are omitted.
In a case where the plurality of grooves H3 is provided, at predetermined intervals, on the semiconductor layer 10S in the form of the plurality of chips, and dicing is then performed at the plurality of grooves H3, as described above, even if peeling off occurs at one of the chip ends E after dicing, for example, the peeling off of the element substrate 10 and the read-out circuit substrate 20 at the bonding face stops on the outer side of one of the grooves H3. The light receiving element 1A according to the modification example thus makes it possible to achieve effects similar to the effects of the embodiment described above.
To form the light receiving element 1C according to the modification example, the through holes H1, the electrically-conductive film 15B, and the passivation film 16B are formed in this order, as illustrated in
In the light receiving element 1C according to the modification example, as described above, the three grooves H3, i.e., the grooves H3-1, H3-2, and H3-3, are provided on the semiconductor layer 10S in the form of the plurality of chips, for example. The groove H3-2, i.e., the central groove among the three grooves H3, is greater in width than the blade B. Within the groove H3-2, the element substrate 10 and the read-out circuit substrate 20 are separated from each other. The configuration makes it possible to prevent the element substrate 10 and the read-out circuit substrate 20 from peeling off each other at the bonding face S2. Even if peeling off occurs at one of the chip ends E, the corresponding groove H3 (e.g., the groove H3-1) formed adjacent to the peripheral part makes it possible to stop the peeling off. It is thus possible to further improve reliability and yields during manufacturing.
Even in the light receiving elements 1D to 1F described above, it is possible to achieve effects similar to the effects of the embodiment described above.
In the light receiving element 1G, for example, the color filter layer 41 and the on-chip lenses 42 corresponding to RGB are provided in this order on the passivation films 16A and 16B of the element substrate 10 with a planarizing film 16C interposed. The color filter layer 41 may include an infrared (IR) filter. Providing the color filter layer 41 makes it possible to acquire light-receiving data of light received at corresponding wavelengths in the respective pixels P.
The on-chip lenses 42 cause light incident onto the light receiving element 1G to converge onto the photoelectric conversion layer 13. The on-chip lenses 42 include, for example, an organic material or silicon oxide (SiO2). In the light receiving element 1G, the buried layer 18 is provided in the peripheral region R2. Level differences between the element region R1 and the peripheral region R2 on the element substrate 10 are therefore reduced or eliminated, forming the planar light incident face S1. A photolithography step, for example, can therefore be used, making it possible to form the on-chip lenses 42 highly accurately. For example, the color filter layer 41 and the on-chip lenses 42 are terminated in the element region R1. The planarizing film 16C disposed between the passivation films 16A and 16B and the color filter layer 41 is provided, for example, from the element region R1 to the peripheral region R2. The planarizing film 16C is terminated in the peripheral region R2. The color filter layer 41, the on-chip lenses 42, and the planarizing film 16C may be terminated at any positions in the element region R1 or the peripheral region R2.
In the modification example, the color filter layer 41 and the on-chip lenses 42 may be provided on the light incident face S1 of the element substrate 10. Even in the modification example, it is possible to achieve effects equivalent to the effects of the embodiments described above. It is also possible to read pixel signals on a color basis by setting the second contact layer 14 to have a film thickness in the range from 5 nm to 300 nm, for example, in the configuration according to the modification example. Furthermore, it is possible to easily form, at high accuracy, the on-chip lenses 42 on the light incident face S1 planarized with the buried layer 18.
In the element region R1, the plurality of pixels P constituting the light receiving element 1 is disposed in a two-dimensional matrix, for example. A pixel driving line Lread is provided for each pixel row of the pixels P, for example. The pixel driving lines Lread are row selection lines or reset control lines, for example. Additionally, a vertical signal line Lsig is provided for each pixel column of the pixels P. The pixel driving lines Lread transmit drive signals for reading signals from the pixels P. One end of each pixel driving line Lread is coupled to a corresponding output end of the row scanner 131.
The row scanner 131 includes a shift register and an address decoder, for example. The row scanner 131 serves as a pixel driver that drives the pixels P in the element region R1 on a row basis, for example. Signals outputted from the pixels P on the pixel row selectively scanned by the row scanner 131 are supplied, via the corresponding vertical signal line Lsig, to the horizontal selector 133. The horizontal selector 133 includes, for example, an amplifier and a horizontal selection switch provided on each vertical signal line Lsig.
The column scanner 134 includes a shift register and an address decoder, for example. The column scanner 134 scans and sequentially drives the horizontal selection switches of the horizontal selector 133. As the column scanner 134 selects and scans the switches, signals transmitted from the pixels via the vertical signal lines Lsig are sequentially outputted to horizontal signal lines 135. The signals then enter, via the horizontal signal lines 135, a non-illustrated signal processor, for example.
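As an illustrative aside that is not part of the disclosure, the row-select/column-scan readout order described above can be modeled with the following sketch; the function name and the array shape are assumptions introduced only for illustration.

import numpy as np

def read_out(pixel_array):
    """Model the readout order: the row scanner selects one pixel row at a
    time via the pixel driving line Lread; the selected pixels drive their
    vertical signal lines Lsig; the column scanner then transfers each
    column in sequence to the horizontal signal line."""
    rows, cols = pixel_array.shape
    horizontal_out = []
    for r in range(rows):                    # row scanner 131: select row r
        vertical_lines = pixel_array[r, :]   # selected pixels drive Lsig
        for c in range(cols):                # column scanner 134: scan switches
            horizontal_out.append(float(vertical_lines[c]))
    return horizontal_out

signals = read_out(np.arange(16.0).reshape(4, 4))  # hypothetical 4x4 array

The nested loop mirrors the roles of the row scanner 131 and the column scanner 134: rows are driven one at a time, and within each selected row the columns are transferred sequentially to the horizontal signal lines 135.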
The system controller 132 receives external data, such as clock data or data instructing an operation mode, for example. The system controller 132 also outputs data including, for example, internal information on the imaging device 2. The system controller 132 further includes a timing generator that generates various kinds of timing signals. On the basis of the various kinds of timing signals generated in the timing generator, the system controller 132 drives and controls the row scanner 131, the horizontal selector 133, and the column scanner 134, for example.
It is possible to apply the imaging device 2 described above to various types of electronic apparatuses, including, for example, a camera that captures images within the infrared region.
The optical system 210 guides image light or incident light from an object to the imaging device 2. The optical system 210 may include a plurality of optical lenses. The shutter 211 controls a period during which light is irradiated onto the imaging device 2. The shutter 211 also controls a light shielding period. The driver 213 controls a transfer operation of the imaging device 2 and a shutter operation of the shutter 211. The signal processor 212 performs various kinds of signal processing on signals outputted from the imaging device 2. An image signal Dout having undergone signal processing is stored in a storage medium such as a memory, or is outputted to a monitor, for example.
Furthermore, it is also possible to apply the light receiving element illustrated in the present embodiment, e.g., the light receiving element 1, and other examples to electronic apparatuses described below.
The technology according to the present disclosure (the present technology) is applicable to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
The endoscope 11100 includes a lens barrel 11101 having a region of a predetermined length from a distal end thereof to be inserted into a body cavity of the patient 11132, and a camera head 11102 connected to a proximal end of the lens barrel 11101. In the example depicted, the endoscope 11100 is a rigid endoscope having the lens barrel 11101 of the hard type. However, the endoscope 11100 may otherwise be a flexible endoscope having the lens barrel 11101 of the flexible type.
The lens barrel 11101 has, at a distal end thereof, an opening in which an objective lens is fitted. A light source apparatus 11203 is connected to the endoscope 11100 such that light generated by the light source apparatus 11203 is introduced to a distal end of the lens barrel 11101 by a light guide extending in the inside of the lens barrel 11101 and is irradiated toward an observation target in a body cavity of the patient 11132 through the objective lens. It is to be noted that the endoscope 11100 may be a forward-viewing endoscope or may be an oblique-viewing endoscope or a side-viewing endoscope.
An optical system and an image pickup element are provided in the inside of the camera head 11102 such that reflected light (observation light) from the observation target is condensed on the image pickup element by the optical system. The observation light is photo-electrically converted by the image pickup element to generate an electric signal corresponding to the observation light, namely, an image signal corresponding to an observation image. The image signal is transmitted as RAW data to a CCU 11201.
The CCU 11201 includes a central processing unit (CPU), a graphics processing unit (GPU) or the like and integrally controls operation of the endoscope 11100 and a display apparatus 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs, for the image signal, various image processes for displaying an image based on the image signal such as, for example, a development process (demosaic process).
The display apparatus 11202 displays thereon an image based on an image signal, for which the image processes have been performed by the CCU 11201, under the control of the CCU 11201.
The light source apparatus 11203 includes a light source such as, for example, a light emitting diode (LED) and supplies irradiation light upon imaging of a surgical region to the endoscope 11100.
An inputting apparatus 11204 is an input interface for the endoscopic surgery system 11000. A user can perform inputting of various kinds of information or instruction inputting to the endoscopic surgery system 11000 through the inputting apparatus 11204. For example, the user would input an instruction or the like to change an image pickup condition (type of irradiation light, magnification, focal distance or the like) of the endoscope 11100.
A treatment tool controlling apparatus 11205 controls driving of the energy device 11112 for cautery or incision of a tissue, sealing of a blood vessel or the like. A pneumoperitoneum apparatus 11206 feeds gas into a body cavity of the patient 11132 through the pneumoperitoneum tube 11111 to inflate the body cavity in order to secure the field of view of the endoscope 11100 and secure the working space for the surgeon. A recorder 11207 is an apparatus capable of recording various kinds of information relating to surgery. A printer 11208 is an apparatus capable of printing various kinds of information relating to surgery in various forms such as a text, an image or a graph.
It is to be noted that the light source apparatus 11203, which supplies irradiation light to the endoscope 11100 when a surgical region is to be imaged, may include a white light source including, for example, an LED, a laser light source, or a combination of them. Where a white light source includes a combination of red, green, and blue (RGB) laser light sources, since the output intensity and the output timing can be controlled with a high degree of accuracy for each color (each wavelength), adjustment of the white balance of a picked up image can be performed by the light source apparatus 11203. Further, in this case, if laser beams from the respective RGB laser light sources are irradiated time-divisionally on an observation target and driving of the image pickup elements of the camera head 11102 is controlled in synchronism with the irradiation timings, then images individually corresponding to the R, G, and B colors can be picked up time-divisionally. According to this method, a color image can be obtained even if color filters are not provided for the image pickup element.
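A minimal sketch of this time-division color synthesis, assuming hypothetical function names and frame values not recited in the disclosure, is as follows.

import numpy as np

def merge_time_division_rgb(frame_r, frame_g, frame_b):
    """Stack three monochrome frames captured time-divisionally under R, G,
    and B laser illumination into a single color image; no color filters are
    required on the image pickup element."""
    return np.stack([frame_r, frame_g, frame_b], axis=-1)

frame_r = np.full((2, 2), 0.8)   # hypothetical frames from three
frame_g = np.full((2, 2), 0.5)   # successive illumination slots
frame_b = np.full((2, 2), 0.2)
color = merge_time_division_rgb(frame_r, frame_g, frame_b)  # shape (2, 2, 3)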
Further, the light source apparatus 11203 may be controlled such that the intensity of light to be outputted is changed for each predetermined time. By controlling driving of the image pickup element of the camera head 11102 in synchronism with the timing of the change of the intensity of light to acquire images time-divisionally and synthesizing the images, an image of a high dynamic range free from underexposed blocked up shadows and overexposed highlights can be created.
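One simple way to realize such a synthesis, shown here only as an illustrative sketch with assumed normalization, thresholds, and exposure values, is to normalize each time-divisional frame by its relative exposure, discard clipped samples, and average the rest.

import numpy as np

def synthesize_hdr(frames, exposures):
    """Fuse frames acquired while the light intensity is changed for each
    predetermined time: each frame is normalized by its relative exposure,
    clipped samples are discarded, and the remaining samples are averaged."""
    acc = np.zeros_like(frames[0], dtype=float)
    weight = np.zeros_like(frames[0], dtype=float)
    for frame, exposure in zip(frames, exposures):
        valid = (frame > 0.05) & (frame < 0.95)   # drop blocked-up shadows
        acc += np.where(valid, frame / exposure, 0.0)   # and blown highlights
        weight += valid
    return acc / np.maximum(weight, 1.0)

low = np.array([[0.02, 0.40]])    # hypothetical frame at relative exposure 1x
high = np.array([[0.08, 0.99]])   # hypothetical frame at relative exposure 4x
hdr = synthesize_hdr([low, high], exposures=[1.0, 4.0])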
Further, the light source apparatus 11203 may be configured to supply light of a predetermined wavelength band ready for special light observation. In special light observation, for example, by utilizing the wavelength dependency of absorption of light in a body tissue to irradiate light of a narrow band in comparison with irradiation light upon ordinary observation (namely, white light), narrow band observation (narrow band imaging) of imaging a predetermined tissue such as a blood vessel of a superficial portion of the mucous membrane or the like in a high contrast is performed. Alternatively, in special light observation, fluorescent observation for obtaining an image from fluorescent light generated by irradiation of excitation light may be performed. In fluorescent observation, it is possible to perform observation of fluorescent light from a body tissue by irradiating excitation light on the body tissue (autofluorescence observation) or to obtain a fluorescent light image by locally injecting a reagent such as indocyanine green (ICG) into a body tissue and irradiating excitation light corresponding to a fluorescent light wavelength of the reagent upon the body tissue. The light source apparatus 11203 can be configured to supply such narrow-band light and/or excitation light suitable for special light observation as described above.
The camera head 11102 includes a lens unit 11401, an image pickup unit 11402, a driving unit 11403, a communication unit 11404 and a camera head controlling unit 11405. The CCU 11201 includes a communication unit 11411, an image processing unit 11412 and a control unit 11413. The camera head 11102 and the CCU 11201 are connected for communication to each other by a transmission cable 11400.
The lens unit 11401 is an optical system provided at a connecting location to the lens barrel 11101. Observation light taken in from a distal end of the lens barrel 11101 is guided to the camera head 11102 and introduced into the lens unit 11401. The lens unit 11401 includes a combination of a plurality of lenses including a zoom lens and a focusing lens.
The image pickup unit 11402 may include one image pickup element (single-plate type) or a plurality of image pickup elements (multi-plate type). Where the image pickup unit 11402 is of the multi-plate type, for example, image signals corresponding to R, G, and B are generated by the respective image pickup elements and synthesized to obtain a color image. The image pickup unit 11402 may also be configured to have a pair of image pickup elements for acquiring respective image signals for the right eye and the left eye for three-dimensional (3D) display. If 3D display is performed, the surgeon 11131 can comprehend the depth of a living body tissue in a surgical region more accurately. It is to be noted that, where the image pickup unit 11402 is of the stereoscopic type, a plurality of systems of lens units 11401 are provided corresponding to the individual image pickup elements.
Further, the image pickup unit 11402 may not necessarily be provided on the camera head 11102. For example, the image pickup unit 11402 may be provided immediately behind the objective lens in the inside of the lens barrel 11101.
The driving unit 11403 includes an actuator and moves the zoom lens and the focusing lens of the lens unit 11401 by a predetermined distance along an optical axis under the control of the camera head controlling unit 11405. Consequently, the magnification and the focal point of a picked up image by the image pickup unit 11402 can be adjusted suitably.
The communication unit 11404 includes a communication apparatus for transmitting and receiving various kinds of information to and from the CCU 11201. The communication unit 11404 transmits an image signal acquired from the image pickup unit 11402 as RAW data to the CCU 11201 through the transmission cable 11400.
In addition, the communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head controlling unit 11405. The control signal includes information relating to image pickup conditions, such as information designating a frame rate of a picked up image, information designating an exposure value upon image pickup, and/or information designating a magnification and a focal point of a picked up image.
It is to be noted that the image pickup conditions such as the frame rate, exposure value, magnification or focal point may be designated by the user or may be set automatically by the control unit 11413 of the CCU 11201 on the basis of an acquired image signal. In the latter case, an auto exposure (AE) function, an auto focus (AF) function and an auto white balance (AWB) function are incorporated in the endoscope 11100.
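As a toy illustration of how such automatic setting might work, the sketch below implements one proportional-control step of an auto exposure loop: the control unit compares the mean luminance of the acquired image signal against a target and derives the next exposure value to designate in the control signal. The target and gain values are arbitrary assumptions, not values from the present disclosure.

```python
def auto_exposure_step(mean_luma: float, exposure: float,
                       target: float = 0.18, gain: float = 0.5) -> float:
    """Return the next exposure value, nudged proportionally toward
    the target mean luminance (all quantities normalized to [0, 1])."""
    error = (target - mean_luma) / target
    return max(1e-6, exposure * (1.0 + gain * error))
```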
The camera head controlling unit 11405 controls driving of the camera head 11102 on the basis of a control signal from the CCU 11201 received through the communication unit 11404.
The communication unit 11411 includes a communication apparatus for transmitting and receiving various kinds of information to and from the camera head 11102. The communication unit 11411 receives an image signal transmitted thereto from the camera head 11102 through the transmission cable 11400.
Further, the communication unit 11411 transmits a control signal for controlling driving of the camera head 11102 to the camera head 11102. The image signal and the control signal can be transmitted by electrical communication, optical communication or the like.
The image processing unit 11412 performs various image processes on an image signal in the form of RAW data transmitted from the camera head 11102.
The control unit 11413 performs various kinds of control relating to image pickup of a surgical region or the like by the endoscope 11100 and to display of the picked up image obtained thereby. For example, the control unit 11413 creates a control signal for controlling driving of the camera head 11102.
Further, the control unit 11413 controls, on the basis of an image signal for which image processes have been performed by the image processing unit 11412, the display apparatus 11202 to display a picked up image in which the surgical region or the like is imaged. Thereupon, the control unit 11413 may recognize various objects in the picked up image using various image recognition technologies. For example, the control unit 11413 can recognize a surgical tool such as forceps, a particular living body region, bleeding, mist generated when the energy device 11112 is used, and so forth by detecting the shapes of edges, colors, and so forth of objects included in a picked up image. When the control unit 11413 controls the display apparatus 11202 to display a picked up image, it may use a result of the recognition to display various kinds of surgery supporting information overlapping the image of the surgical region. Where surgery supporting information is displayed in an overlapping manner and presented to the surgeon 11131, the burden on the surgeon 11131 can be reduced, and the surgeon 11131 can proceed with the surgery with certainty.
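The display step of such surgery supporting information can be pictured with the following sketch, which simply draws a rectangular outline around each recognized object on a copy of the picked up image; the recognition itself (detecting edges, colors, and so forth) is assumed to have already produced the bounding boxes.

```python
import numpy as np

def overlay_support_info(image: np.ndarray,
                         boxes: list[tuple[int, int, int, int]],
                         color: tuple[int, int, int] = (0, 255, 0)) -> np.ndarray:
    """Draw an outline (top, left, bottom, right in pixels) around each
    recognized object, e.g. forceps or a bleeding region, on a copy of
    an H x W x 3 picked up image."""
    out = image.copy()
    for top, left, bottom, right in boxes:
        out[top, left:right] = color
        out[bottom - 1, left:right] = color
        out[top:bottom, left] = color
        out[top:bottom, right - 1] = color
    return out
```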
The transmission cable 11400 which connects the camera head 11102 and the CCU 11201 to each other is an electric signal cable ready for communication of an electric signal, an optical fiber ready for optical communication or a composite cable ready for both of electrical and optical communications.
Here, while, in the example depicted, communication is performed by wired communication using the transmission cable 11400, the communication between the camera head 11102 and the CCU 11201 may be performed by wireless communication.
An example of the endoscopic surgery system to which the technology according to the present disclosure is applicable has been described above. The technology according to the present disclosure may be applied to, for example, the image pickup unit 11402 in the configuration described above. Applying the technology according to the present disclosure to the image pickup unit 11402 improves detection accuracy.
It is to be noted that, although the endoscopic surgery system has been described above as an example, the technology according to the present disclosure may also be applied to, for example, a microscopic surgery system.
The technology according to the present disclosure (present technology) is applicable to various products. For example, the technology according to the present disclosure may be achieved in the form of an apparatus to be mounted on a mobile body of any kind. Non-limiting examples of the mobile body may include an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, any personal mobility device, an airplane, an unmanned aerial vehicle (drone), a vessel, and a robot.
The vehicle control system 12000 includes a plurality of electronic control units connected to each other via a communication network 12001. In the depicted example, the vehicle control system 12000 includes a driving system control unit 12010, a body system control unit 12020, an outside-vehicle information detecting unit 12030, an in-vehicle information detecting unit 12040, and an integrated control unit that includes a microcomputer 12051 and a sound/image output section 12052.
The driving system control unit 12010 controls the operation of devices related to the driving system of the vehicle in accordance with various kinds of programs. For example, the driving system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine, a driving motor, or the like, a driving force transmitting mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
The body system control unit 12020 controls the operation of various kinds of devices provided to a vehicle body in accordance with various kinds of programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various kinds of lamps such as a headlamp, a backup lamp, a brake lamp, a turn signal, a fog lamp, or the like. In this case, radio waves transmitted from a mobile device as an alternative to a key or signals of various kinds of switches can be input to the body system control unit 12020. The body system control unit 12020 receives these input radio waves or signals, and controls a door lock device, the power window device, the lamps, or the like of the vehicle.
The outside-vehicle information detecting unit 12030 detects information about the outside of the vehicle including the vehicle control system 12000. For example, the outside-vehicle information detecting unit 12030 is connected with an imaging section 12031. The outside-vehicle information detecting unit 12030 causes the imaging section 12031 to capture an image of the outside of the vehicle and receives the captured image. On the basis of the received image, the outside-vehicle information detecting unit 12030 may perform processing of detecting an object such as a human, a vehicle, an obstacle, a sign, or a character on a road surface, or processing of detecting a distance thereto.
The imaging section 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of received light. The imaging section 12031 can output the electric signal as an image or as information about a measured distance. In addition, the light received by the imaging section 12031 may be visible light or invisible light such as infrared rays.
The in-vehicle information detecting unit 12040 detects information about the inside of the vehicle. The in-vehicle information detecting unit 12040 is, for example, connected with a driver state detecting section 12041 that detects the state of a driver. The driver state detecting section 12041, for example, includes a camera that images the driver. On the basis of detection information input from the driver state detecting section 12041, the in-vehicle information detecting unit 12040 may calculate a degree of fatigue of the driver or a degree of concentration of the driver, or may determine whether the driver is dozing.
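One plausible (purely illustrative) criterion for the dozing determination is a PERCLOS-style measure, i.e. the fraction of recent frames in which the driver's eyes are detected as closed; the threshold below is an assumption, not a value from the present disclosure.

```python
def is_dozing(eye_closed_flags: list[bool], threshold: float = 0.7) -> bool:
    """Flag the driver as dozing when the fraction of recent frames
    with closed eyes exceeds the threshold (PERCLOS-style check)."""
    if not eye_closed_flags:
        return False
    return sum(eye_closed_flags) / len(eye_closed_flags) > threshold
```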
The microcomputer 12051 can calculate a control target value for the driving force generating device, the steering mechanism, or the braking device on the basis of the information about the inside or outside of the vehicle obtained by the outside-vehicle information detecting unit 12030 or the in-vehicle information detecting unit 12040, and output a control command to the driving system control unit 12010. For example, the microcomputer 12051 can perform cooperative control intended to implement functions of an advanced driver assistance system (ADAS), including collision avoidance or shock mitigation for the vehicle, following driving based on a following distance, vehicle speed maintaining driving, a warning of collision of the vehicle, a warning of deviation of the vehicle from a lane, and the like.
In addition, the microcomputer 12051 can perform cooperative control intended for automatic driving, which makes the vehicle travel autonomously without depending on the operation of the driver, by controlling the driving force generating device, the steering mechanism, the braking device, or the like on the basis of the information about the outside or inside of the vehicle obtained by the outside-vehicle information detecting unit 12030 or the in-vehicle information detecting unit 12040.
In addition, the microcomputer 12051 can output a control command to the body system control unit 12020 on the basis of the information about the outside of the vehicle obtained by the outside-vehicle information detecting unit 12030. For example, the microcomputer 12051 can perform cooperative control intended to prevent glare by controlling the headlamp so as to change from a high beam to a low beam in accordance with the position of a preceding vehicle or an oncoming vehicle detected by the outside-vehicle information detecting unit 12030.
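The glare-prevention logic reduces to a few lines; in this sketch the cutoff distance is an invented placeholder, and a distance of None means that no vehicle of that kind was detected.

```python
def select_beam(preceding_distance_m: float | None,
                oncoming_distance_m: float | None,
                cutoff_m: float = 400.0) -> str:
    """Switch to low beam when a preceding or oncoming vehicle is
    within the cutoff distance; otherwise keep the high beam."""
    for distance in (preceding_distance_m, oncoming_distance_m):
        if distance is not None and distance < cutoff_m:
            return "low"
    return "high"
```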
The sound/image output section 12052 transmits an output signal of at least one of a sound and an image to an output device capable of visually or auditorily notifying an occupant of the vehicle or the outside of the vehicle of information. In the depicted example, an audio speaker 12061 and a display section 12062 serve as such output devices.
In the depicted example, the vehicle 12100 includes imaging sections 12101, 12102, 12103, 12104, and 12105 as the imaging section 12031.
The imaging sections 12101, 12102, 12103, 12104, and 12105 are, for example, disposed at positions on a front nose, sideview mirrors, a rear bumper, and a back door of the vehicle 12100 as well as a position on an upper portion of a windshield within the interior of the vehicle. The imaging section 12101 provided to the front nose and the imaging section 12105 provided to the upper portion of the windshield within the interior of the vehicle obtain mainly an image of the front of the vehicle 12100. The imaging sections 12102 and 12103 provided to the sideview mirrors obtain mainly an image of the sides of the vehicle 12100. The imaging section 12104 provided to the rear bumper or the back door obtains mainly an image of the rear of the vehicle 12100. The imaging section 12105 provided to the upper portion of the windshield within the interior of the vehicle is used mainly to detect a preceding vehicle, a pedestrian, an obstacle, a signal, a traffic sign, a lane, or the like.
Incidentally, imaging ranges 12111 to 12114 are the respective imaging ranges of the imaging sections 12101 to 12104: the imaging range 12111 corresponds to the imaging section 12101 provided to the front nose, the imaging ranges 12112 and 12113 correspond to the imaging sections 12102 and 12103 provided to the sideview mirrors, and the imaging range 12114 corresponds to the imaging section 12104 provided to the rear bumper or the back door.
At least one of the imaging sections 12101 to 12104 may have a function of obtaining distance information. For example, at least one of the imaging sections 12101 to 12104 may be a stereo camera constituted of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
For example, the microcomputer 12051 can determine a distance to each three-dimensional object within the imaging ranges 12111 to 12114 and a temporal change in the distance (relative speed with respect to the vehicle 12100) on the basis of the distance information obtained from the imaging sections 12101 to 12104, and thereby extract, as a preceding vehicle, the nearest three-dimensional object present on the traveling path of the vehicle 12100 that travels in substantially the same direction as the vehicle 12100 at a predetermined speed or higher (for example, equal to or more than 0 km/hour). Further, the microcomputer 12051 can set in advance a following distance to be maintained in front of a preceding vehicle, and perform automatic brake control (including following stop control), automatic acceleration control (including following start control), or the like. It is thus possible to perform cooperative control intended for automatic driving that makes the vehicle travel autonomously without depending on the operation of the driver or the like.
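A schematic rendering of the extraction and following-distance check just described, under the simplifying assumption that each detected three-dimensional object has already been reduced to a distance, a relative speed, and an on-path flag (the data layout is invented for illustration):

```python
def extract_preceding_vehicle(objects: list[dict],
                              own_speed_kmh: float,
                              min_speed_kmh: float = 0.0) -> dict | None:
    """Each object: {"distance_m": float, "rel_speed_kmh": float,
    "on_path": bool}. Return the nearest on-path object whose absolute
    speed (own speed + relative speed) is at least the minimum."""
    candidates = [o for o in objects
                  if o["on_path"]
                  and own_speed_kmh + o["rel_speed_kmh"] >= min_speed_kmh]
    return min(candidates, key=lambda o: o["distance_m"], default=None)

def needs_brake(preceding: dict | None, following_distance_m: float) -> bool:
    """Request automatic brake control when the gap to the preceding
    vehicle falls below the following distance set in advance."""
    return preceding is not None and preceding["distance_m"] < following_distance_m
```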
For example, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into data of two-wheeled vehicles, standard-sized vehicles, large-sized vehicles, pedestrians, utility poles, and other three-dimensional objects on the basis of the distance information obtained from the imaging sections 12101 to 12104, extract the classified three-dimensional object data, and use the extracted data for automatic avoidance of an obstacle. For example, the microcomputer 12051 classifies obstacles around the vehicle 12100 into obstacles that the driver of the vehicle 12100 can recognize visually and obstacles that are difficult for the driver to recognize visually. The microcomputer 12051 then determines a collision risk indicating a risk of collision with each obstacle. In a situation in which the collision risk is equal to or higher than a set value and there is thus a possibility of collision, the microcomputer 12051 outputs a warning to the driver via the audio speaker 12061 or the display section 12062, and performs forced deceleration or avoidance steering via the driving system control unit 12010. The microcomputer 12051 can thereby assist in driving to avoid collision.
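The collision-risk logic can be sketched as an inverse time-to-collision computed from distance and closing speed; the thresholds and the two-stage escalation (warning, then forced deceleration) are illustrative assumptions only, not the specific risk measure of any real system.

```python
def collision_risk(distance_m: float, closing_speed_ms: float) -> float:
    """Risk as inverse time-to-collision (1/s); zero when not closing."""
    if closing_speed_ms <= 0.0 or distance_m <= 0.0:
        return 0.0
    return closing_speed_ms / distance_m

def driving_assistance(distance_m: float, closing_speed_ms: float,
                       set_value: float = 0.5) -> str:
    """Escalate from a warning to forced deceleration as the collision
    risk crosses the set value and then twice the set value."""
    risk = collision_risk(distance_m, closing_speed_ms)
    if risk >= 2.0 * set_value:
        return "forced_deceleration_or_avoidance_steering"
    if risk >= set_value:
        return "warn_driver"
    return "no_action"
```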
At least one of the imaging sections 12101 to 12104 may be an infrared camera that detects infrared rays. The microcomputer 12051 can, for example, recognize a pedestrian by determining whether or not there is a pedestrian in the imaged images of the imaging sections 12101 to 12104. Such recognition of a pedestrian is performed, for example, by a procedure of extracting characteristic points in the imaged images of the imaging sections 12101 to 12104 as infrared cameras and a procedure of determining whether or not a series of characteristic points representing the contour of an object is a pedestrian by performing pattern matching processing. When the microcomputer 12051 determines that there is a pedestrian in the imaged images of the imaging sections 12101 to 12104 and thus recognizes the pedestrian, the sound/image output section 12052 controls the display section 12062 so that a square contour line for emphasis is displayed superimposed on the recognized pedestrian. The sound/image output section 12052 may also control the display section 12062 so that an icon or the like representing the pedestrian is displayed at a desired position.
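The pattern matching step on the series of characteristic points can be caricatured as a normalized point-set comparison; real pedestrian recognizers are far more elaborate, so this is only a crude stand-in assuming both point series have been resampled to the same length.

```python
import numpy as np

def contour_match_score(points: np.ndarray, template: np.ndarray) -> float:
    """Compare a series of characteristic points (N x 2) against a
    pedestrian contour template of the same length, after removing
    position and scale; 1.0 means a perfect match, 0.0 no match."""
    def normalize(p: np.ndarray) -> np.ndarray:
        p = p.astype(np.float64) - p.mean(axis=0)
        return p / (np.linalg.norm(p) + 1e-9)
    a, b = normalize(points), normalize(template)
    return float(np.clip(1.0 - np.linalg.norm(a - b), 0.0, 1.0))
```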
An example of the vehicle control system to which the technology according to the present disclosure is applicable has been described above. The technology according to the present disclosure may be applied to, for example, the imaging section 12031 in the configuration described above. Applying the technology according to the present disclosure to the imaging section 12031 allows a picked up image to be more easily viewable, helping reduce fatigue of the driver.
Although the present disclosure has been described with reference to the present embodiment, the modification examples 1 to 4, the application examples, the applied examples, and other examples, the contents of the present disclosure are not limited thereto and may be modified in a wide variety of ways. For example, the layer configuration of the light receiving element described in the embodiment above is merely an example, and other layers may further be included. Additionally, the materials and thicknesses of the layers are merely examples and are not limited to the values described above. In the embodiment described above, the semiconductor layer 10S includes the first contact layer 12, the photoelectric conversion layer 13, and the second contact layer 14. However, the semiconductor layer 10S may include at least the photoelectric conversion layer 13; for example, the first contact layer 12 and the second contact layer 14 may be omitted, or other layers may be included.
Furthermore, although holes serve as signal electric charges in the embodiment described above for purposes of convenience, electrons may serve as signal electric charges instead. In that case, for example, an n-type impurity may be included in the diffusion regions.
In addition, in the embodiment described above, the light receiving element represents a specific but non-limiting example of a semiconductor element according to an embodiment of the present technology. However, an element other than the light receiving element, for example a luminescent element, may represent a semiconductor element according to an embodiment of the present technology.
It should be appreciated that the effects described herein are mere examples. Effects of the present embodiment and other examples of the present disclosure are not limited to those described herein. The present disclosure may further include any effect other than those described herein.
Moreover, the present disclosure may have the following configuration, for example. According to the present technology having the configuration described below, one or more grooves are provided at least either on the peripheral part of the peripheral region lying outside the element region or adjacent to the peripheral part. The one or more grooves extend from the surface of the element substrate to a portion of the read-out circuit substrate, or from the surface of the read-out circuit substrate to a portion of the element substrate. This configuration makes it possible to prevent the element substrate and the read-out circuit substrate from peeling off from each other at the bonding face, and thus to improve reliability.
The present disclosure may have the following configurations.
(1) A light detecting device, comprising:
an element substrate including an element region and a peripheral region, wherein the element region includes a first wiring layer and a semiconductor layer including a compound semiconductor material, and the peripheral region is outside the element region in a plan view, and
a circuit substrate that faces the element substrate and is electrically connected to the semiconductor layer through the first wiring layer,
wherein an outer boundary of the element substrate is different from an outer boundary of the circuit substrate.
(2) The light detecting device according to the above (1), wherein a side surface of the circuit substrate has a different shape from a side surface of the element substrate.
(3) The light detecting device according to any one of the above (1) to (2), wherein the circuit substrate includes a second wiring layer and a semiconductor substrate, and an area of the semiconductor substrate is larger than an area of the element substrate.
(4) The light detecting device according to any one of the above (1) to (3), wherein the semiconductor layer in the element substrate includes a photoelectric conversion layer, and the photoelectric conversion layer includes the compound semiconductor material.
(5) The light detecting device according to any one of the above (1) to (4), wherein the semiconductor layer includes a diffusion region, and the diffusion region is configured to read electric charges generated from the photoelectric conversion layer.
(6) The light detecting device according to any one of the above (1) to (5), further comprising a first electrode that is electrically connected to the diffusion region.
(7) The light detecting device according to any one of the above (1) to (6), further comprising a second electrode facing the first electrode, wherein the semiconductor layer is provided between the first and second electrodes.
(8) The light detecting device according to any one of the above (1) to (7), wherein the compound semiconductor material includes at least one of indium gallium arsenide, indium arsenide antimonide, indium arsenide, indium antimonide, and mercury cadmium telluride.
(9) The light detecting device according to any one of the above (1) to (8), further comprising a color filter layer on a light incident surface of the element substrate.
(10) The light detecting device according to any one of the above (1) to (9), further comprising on-chip lenses on the color filter layer.
(11) The light detecting device according to any one of the above (1) to (10), wherein each of the element substrate and the circuit substrate includes a contact electrode, and the element substrate and the circuit substrate are electrically connected through the contact electrodes.
(12) The light detecting device according to any one of the above (1) to (11), wherein the contact electrode includes a copper pad.
(13) The light detecting device according to any one of the above (1) to (12), wherein the element substrate and the circuit substrate are stacked with each other through Cu—Cu bonding.
(14) The light detecting device according to any one of the above (1) to (13), wherein a side surface of the circuit substrate has a step shape.
(15) The light detecting device according to any one of the above (1) to (14), wherein a side surface of the element substrate is flat.
(16) The light detecting device according to any one of the above (1) to (15), wherein the step shape of the side surface of the circuit substrate has a depth in a thickness direction and a width in a horizontal direction, and wherein the depth in the thickness direction is different from the width in the horizontal direction.
(17) The light detecting device according to any one of the above (1) to (16), wherein the depth is larger than the width.
(18) The light detecting device according to any one of the above (1) to (17), wherein a side surface of the element substrate has a step shape.
(19) The light detecting device according to any one of the above (1) to (18), wherein the step shape of the side surface of the element substrate has a depth in a thickness direction and a width in a horizontal direction, and wherein the depth in the thickness direction is larger than the width in the horizontal direction.
(20) An electronic apparatus that includes a light detecting device, the light detecting device comprising:
an element substrate including an element region and a peripheral region, wherein the element region includes a first wiring layer and a semiconductor layer including a compound semiconductor material, and the peripheral region is outside the element region in a plan view, and
a circuit substrate that faces the element substrate and is electrically connected to the semiconductor layer through the first wiring layer,
wherein an outer boundary of the element substrate is different from an outer boundary of the circuit substrate.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Priority claim: Number 2019-219713; Date: Dec. 2019; Country: JP; Kind: national.
International filing: Filing Document PCT/JP2020/043278; Filing Date: Nov. 19, 2020; Country: WO.