The present disclosure relates to an imaging device and electronic equipment.
In recent years, imaging devices have adopted, as an autofocus function, a method of detecting a phase difference using a pair of phase difference detection pixels. An example of such a device is the imaging element disclosed in Patent Literature 1 below. In the technology disclosed in Patent Literature 1, an effective pixel that captures an image of a subject and a phase difference detection pixel that detects the phase difference described above are separately provided on a light receiving surface.
Patent Literature 1: JP 2000-292685 A
However, in the technology disclosed in Patent Literature 1, when a captured image of a subject is acquired, it is difficult to use information obtained by the phase difference detection pixel in the same way as information from the effective pixels. Therefore, in the above technology, a captured image is generated by interpolating the image at the pixel position corresponding to the phase difference detection pixel using information from the effective pixels around it. That is, in the technology disclosed in Patent Literature 1, since the phase difference detection pixel is dedicated to phase difference detection, it is difficult to avoid deterioration of the captured image due to the loss of image information at the position of the phase difference detection pixel.
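To make the interpolation described above concrete, the following sketch shows one simple possibility. It is a hypothetical same-color neighbor average; Patent Literature 1 does not specify this particular kernel, and real devices use more elaborate ones:

```python
import numpy as np

def interpolate_pd_pixel(image: np.ndarray, row: int, col: int) -> float:
    """Replace the value at a phase difference detection pixel with the
    average of surrounding effective pixels. A 2-pixel offset is used so
    that, in a Bayer mosaic, only same-color pixels are averaged
    (hypothetical kernel for illustration only)."""
    h, w = image.shape
    neighbors = []
    for dr, dc in ((-2, 0), (2, 0), (0, -2), (0, 2)):
        r, c = row + dr, col + dc
        if 0 <= r < h and 0 <= c < w:
            neighbors.append(image[r, c])
    return float(np.mean(neighbors))

# Example: on a linear ramp the interpolation recovers the exact value
img = np.arange(36, dtype=float).reshape(6, 6)
print(interpolate_pd_pixel(img, 2, 3))  # -> 15.0, the true ramp value
```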
Therefore, the present disclosure proposes an imaging device and electronic equipment capable of avoiding deterioration of a captured image while improving accuracy of phase difference detection.
According to the present disclosure, there is provided an imaging device including: a semiconductor substrate; and a plurality of imaging elements that are arrayed in a matrix in a first direction and a second direction on the semiconductor substrate and perform photoelectric conversion on incident light. In the imaging device, each of the plurality of imaging elements includes: a plurality of pixels provided to be adjacent to one another in a predetermined unit region of the semiconductor substrate and including a photoelectric conversion section containing impurities of a first conductivity type; a separation section that separates the plurality of pixels; two first element separation walls provided to pierce through at least a part of the semiconductor substrate along two first side surfaces of the predetermined unit region extending in the second direction; an on-chip lens provided above a light receiving surface of the semiconductor substrate to be shared by the plurality of pixels; and a first diffusion region provided in the semiconductor substrate around the first element separation wall and the separation section and containing impurities of a second conductivity type having a conductivity type opposite to the first conductivity type.
Furthermore, according to the present disclosure, there is provided electronic equipment including an imaging device. The imaging device includes a semiconductor substrate and a plurality of imaging elements that are arrayed in a matrix in a first direction and a second direction on the semiconductor substrate and perform photoelectric conversion on incident light. In the imaging device, each of the plurality of imaging elements includes: a plurality of pixels provided to be adjacent to one another in a predetermined unit region of the semiconductor substrate and including a photoelectric conversion section containing impurities of a first conductivity type; a separation section that separates the plurality of pixels; two first element separation walls provided to pierce through at least a part of the semiconductor substrate along two first side surfaces of the predetermined unit region extending in the second direction; an on-chip lens provided above a light receiving surface of the semiconductor substrate to be shared by the plurality of pixels; and a first diffusion region provided in the semiconductor substrate around the first element separation wall and the separation section and containing impurities of a second conductivity type having a conductivity type opposite to the first conductivity type.
Preferred embodiments of the present disclosure are explained in detail below with reference to the accompanying drawings. Note that, in the present specification and the drawings, components having substantially the same functional configurations are denoted by the same reference numerals and signs, whereby redundant explanation of the components is omitted.
The drawings referred to in the following explanation are drawings for facilitating the explanation and understanding of an embodiment of the present disclosure. In order to clearly show the drawings, shapes, dimensions, ratios, and the like illustrated in the drawings are sometimes different from actual ones. Further, an imaging device illustrated in the drawings can be changed in design as appropriate in consideration of the following explanation and publicly known technologies.
Shapes and dimensions expressed in the following explanation mean not only shapes and dimensions defined mathematically or geometrically but also similar shapes and dimensions including differences (errors and distortions) to an allowable extent in the operation of the imaging device and the manufacturing process for the imaging device. Further, “same” used for specific shapes and dimensions in the following description means not only a case of complete mathematical or geometric matching but also a case of having a difference (error/distortion) to an allowable extent in the operation of the imaging device and the manufacturing process for the imaging device.
Further, in the following explanation, “electrically connect” means connecting a plurality of elements directly or indirectly via other elements.
Further, in the following explanation, “sharing” means that one other element (for example, an on-chip lens) is used together between elements (for example, pixels) different from each other.
Note that the explanation is made in the following order.
First, a schematic configuration of an imaging device 1 according to an embodiment of the present disclosure is explained with reference to
The pixel array unit 20 includes the plurality of imaging elements 100 two-dimensionally arranged in a matrix in a row direction (a first direction) and a column direction (a second direction) on the semiconductor substrate 10. The imaging elements 100 are elements that perform photoelectric conversion on incident light and include a photoelectric conversion section (not illustrated) and a plurality of pixel transistors (for example, MOS (Metal-Oxide-Semiconductor) transistors) (not illustrated). The pixel transistors include, for example, four MOS transistors: a transfer transistor, a selection transistor, a reset transistor, and an amplification transistor. Further, in the pixel array unit 20, the plurality of imaging elements 100 are two-dimensionally arranged according to, for example, the Bayer array. Here, the Bayer array is an array pattern in which the imaging elements 100 that absorb light having a green wavelength (for example, a wavelength of 495 nm to 570 nm) and generate electric charges are arranged in a checkered pattern, and the imaging elements 100 that absorb light having a red wavelength (for example, a wavelength of 620 nm to 750 nm) and generate electric charges and the imaging elements 100 that absorb light having a blue wavelength (for example, a wavelength of 450 nm to 495 nm) and generate electric charges are alternately arranged in the remaining positions for each column. Note that a detailed structure of the imaging element 100 is explained below.
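The color assignment of such a Bayer array can be expressed compactly. The following sketch is an illustration of the array pattern only, not part of the disclosed device:

```python
def bayer_color(row: int, col: int) -> str:
    """Return the color ('R', 'G', or 'B') of the imaging element at
    (row, col) in a Bayer array: green on the checkered positions,
    red and blue alternating in the remaining positions."""
    if (row + col) % 2 == 0:  # checkered pattern: green
        return "G"
    return "R" if row % 2 == 0 else "B"

# Example: the upper-left 4x4 corner of the mosaic
for r in range(4):
    print(" ".join(bayer_color(r, c) for c in range(4)))
# G R G R
# B G B G
# G R G R
# B G B G
```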
The vertical drive circuit unit 21 is formed by, for example, a shift register, selects a pixel drive wire 26, supplies a pulse for driving the imaging elements 100 to the selected pixel drive wire 26, and drives the imaging elements 100 in units of rows. That is, the vertical drive circuit unit 21 selectively scans the imaging elements 100 of the pixel array unit 20 sequentially in the vertical direction (the up-down direction in
The column signal processing circuit unit 22 is arranged for each column of the imaging elements 100 and performs, for each pixel column, signal processing such as noise removal on pixel signals output from the imaging elements 100 for one row. For example, the column signal processing circuit unit 22 performs signal processing such as CDS (Correlated Double Sampling) and AD (Analog-Digital) conversion in order to remove fixed pattern noise unique to pixels.
The horizontal drive circuit unit 23 is formed by, for example, a shift register, sequentially selects each of the column signal processing circuit units 22 explained above by sequentially outputting horizontal scanning pulses, and causes each of the column signal processing circuit units 22 to output a pixel signal to a horizontal signal line 28.
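The cooperation of the vertical drive circuit unit 21, the column signal processing circuit units 22, and the horizontal drive circuit unit 23 can be summarized as a row-by-row readout loop. The following sketch is a behavioral illustration only; the reset/signal sample arrays are hypothetical stand-ins for the analog levels actually handled by the circuits:

```python
import numpy as np

def read_out(reset_levels: np.ndarray, signal_levels: np.ndarray) -> list:
    """Behavioral model of row-sequential readout.

    reset_levels, signal_levels: (rows, cols) arrays of the two samples
    taken from each imaging element. CDS subtracts the reset sample from
    the signal sample, cancelling per-pixel fixed pattern noise.
    """
    rows, cols = signal_levels.shape
    output = []
    for row in range(rows):                          # vertical drive: select one row
        cds = signal_levels[row] - reset_levels[row]  # column-parallel CDS
        for col in range(cols):                      # horizontal drive: scan columns
            output.append(float(cds[col]))           # onto the horizontal signal line
    return output

# Example with hypothetical 2x3 samples
rst = np.full((2, 3), 100.0)
sig = rst + np.array([[10., 20., 30.], [40., 50., 60.]])
print(read_out(rst, sig))  # [10.0, 20.0, 30.0, 40.0, 50.0, 60.0]
```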
The output circuit unit 24 performs signal processing on the pixel signals sequentially supplied from each of the column signal processing circuit units 22 explained above through the horizontal signal line 28 and outputs the pixel signals. The output circuit unit 24 may function as, for example, a functional unit that performs buffering or may perform processing such as black level adjustment, column variation correction, and various kinds of digital signal processing. Note that buffering means temporarily storing pixel signals in order to compensate for differences in processing speed and transfer speed when the pixel signals are exchanged. Further, an input/output terminal 29 is a terminal for exchanging signals with an external device.
The control circuit unit 25 receives an input clock and data for instructing an operation mode and the like and outputs data such as internal information of the imaging device 1. That is, the control circuit unit 25 generates, based on a vertical synchronization signal, a horizontal synchronization signal, and a master clock, a clock signal or a control signal serving as a reference for operations of the vertical drive circuit unit 21, the column signal processing circuit unit 22, the horizontal drive circuit unit 23, and the like. Then, the control circuit unit 25 outputs the generated clock signal and the generated control signal to the vertical drive circuit unit 21, the column signal processing circuit unit 22, the horizontal drive circuit unit 23, and the like.
Next, before details of the embodiment according to the present disclosure are explained, a comparative example studied by the present inventors prior to creating the embodiment according to the present disclosure is explained. First, the background that led to the creation of the comparative example is explained.
The comparative example compared with the embodiment of the present disclosure was created during intensive studies on providing phase difference detection pixels over the entire surface of the pixel array unit 20 of the imaging device 1 (all-pixel phase difference detection) in order to further improve the autofocus function, that is, to improve the accuracy of phase difference detection while avoiding deterioration of a captured image. In the comparative example, the imaging element 100, which functions as one imaging element at the time of imaging and as a pair of phase difference detection pixels at the time of phase difference detection, is provided over the entire surface of the pixel array unit 20 (a dual photodiode structure). According to the comparative example that enables such all-pixel phase difference detection, since the phase difference detection pixels are provided over the entire surface, the accuracy of phase difference detection can be improved and, further, imaging can be performed by all the imaging elements. Therefore, deterioration of a captured image can be avoided.
Further, in the comparative example, in order to improve the accuracy of phase difference detection, an element that physically and electrically separates the phase difference detection pixels is provided so that the outputs of the pair of phase difference detection pixels are not mixed at the time of phase difference detection. In addition, in the comparative example, an overflow path is provided between the pair of phase difference detection pixels in order to avoid deterioration of a captured image. Specifically, at the time of normal imaging, when the electric charge of one of the phase difference detection pixels is about to saturate, saturation of that pixel can be avoided by moving the electric charge to the other pixel via the overflow path. By providing such an overflow path, the linearity of a pixel signal output from the imaging element can be secured and deterioration of a captured image can be prevented.
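The role of the overflow path can be illustrated numerically. The following sketch is a purely behavioral illustration, not the disclosed circuit; the full-well level and charge amounts are hypothetical values chosen for readability:

```python
def apply_overflow(q_a: float, q_b: float, full_well: float) -> tuple:
    """Move charge above the full-well level of one pixel into its
    paired pixel through the overflow path (values in electrons)."""
    excess_a = max(q_a - full_well, 0.0)
    excess_b = max(q_b - full_well, 0.0)
    q_a = min(q_a, full_well) + excess_b
    q_b = min(q_b, full_well) + excess_a
    return q_a, q_b

# Example: pixel a is about to saturate; the sum 9000 e- is preserved
print(apply_overflow(6500.0, 2500.0, 6000.0))  # (6000.0, 3000.0)
```

Because charge above the full-well level is transferred rather than discarded, the sum of the two pixel outputs remains proportional to the incident light amount, which is the linearity property mentioned above.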
Details of such a comparative example are sequentially explained below.
First, a sectional configuration of the imaging element 100 according to the comparative example is explained with reference to
As illustrated in
In the following explanation, the stacked structure of the imaging element 100 according to the comparative example is explained in order from the upper side (the light receiving surface 10a side) to the lower side in
As illustrated in
Then, the incident light condensed by the on-chip lens 200 is emitted to each of the photoelectric conversion sections 302 of the pair of pixels 300a and 300b via the color filter 202 provided below the on-chip lens 200. The color filter 202 is any of a color filter that transmits a red wavelength component, a color filter that transmits a green wavelength component, and a color filter that transmits a blue wavelength component. The color filter 202 can be formed of, for example, a material in which a pigment or a dye is dispersed in a transparent binder such as silicone.
Further, the light blocking section 204 is provided on the light receiving surface 10a of the semiconductor substrate 10 to surround the color filter 202. Since the light blocking section 204 is provided between the imaging elements 100 adjacent to each other, it is possible to perform light blocking between the imaging elements 100 in order to suppress crosstalk between the adjacent imaging elements 100 and further improve accuracy in phase difference detection. The light blocking section 204 can be formed of, for example, a metal material or the like containing tungsten (W), aluminum (Al), copper (Cu), titanium (Ti), molybdenum (Mo), nickel (Ni), or the like.
Moreover, for example, in a predetermined unit region in the semiconductor substrate 10 of a second conductivity type (for example, a p-type), the photoelectric conversion sections (photodiodes) 302 containing impurities of a first conductivity type (for example, an n-type) are provided for each of the pixels 300a and 300b adjacent to each other. As explained above, the photoelectric conversion sections 302 absorb light having a red wavelength component, a green wavelength component, or a blue wavelength component made incident through the color filter 202 and generate electric charges. Then, in the comparative example, the photoelectric conversion section 302 of the pixel 300a and the photoelectric conversion section 302 of the pixel 300b can function as a pair of phase difference detection pixels at the time of phase difference detection. That is, in the comparative example, a phase difference can be detected by detecting a difference between pixel signals based on the electric charges generated by the photoelectric conversion section 302 of the pixel 300a and the photoelectric conversion section 302 of the pixel 300b.
Specifically, the amount of electric charges to be generated, that is, the sensitivity of the photoelectric conversion sections 302, changes depending on the incident angle of light with respect to the optical axes (axes perpendicular to the light receiving surfaces) of the photoelectric conversion sections 302. For example, the photoelectric conversion sections 302 have the highest sensitivity when the incident angle is 0 degrees, and the sensitivity is symmetric with respect to the incident angle, with the optical axis as the axis of symmetry. Therefore, in the photoelectric conversion section 302 of the pixel 300a and the photoelectric conversion section 302 of the pixel 300b, light from the same point is made incident at different incident angles and electric charges of amounts corresponding to those incident angles are generated, so that a shift (a phase difference) occurs between the detected images. That is, the phase difference can be detected by detecting a difference between pixel signals based on the electric charge amounts generated by the photoelectric conversion section 302 of the pixel 300a and the photoelectric conversion section 302 of the pixel 300b. Such a difference (phase difference) between the pixel signals is detected as, for example, a difference signal in a detecting unit (not illustrated) of the output circuit unit 24, a defocus amount is calculated based on the detected phase difference, and an image forming lens (not illustrated) is adjusted (moved), whereby autofocus can be realized. Note that, in the above explanation, the phase difference is detected as the difference between the pixel signals of the photoelectric conversion section 302 of the pixel 300a and the photoelectric conversion section 302 of the pixel 300b. However, the detection is not limited to this; for example, the phase difference may be detected as a ratio of the pixel signals of the photoelectric conversion section 302 of the pixel 300a and the photoelectric conversion section 302 of the pixel 300b.
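The flow from the pair of pixel signals to a defocus amount can be sketched in code. The following is an illustrative model only: the one-dimensional signal arrays, the search range, and the conversion coefficient K from shift to defocus are all hypothetical, and a real detecting unit operates on the pixel signals described above:

```python
import numpy as np

def detect_phase_difference(left: np.ndarray, right: np.ndarray,
                            max_shift: int = 8) -> int:
    """Return the shift (in samples) that best aligns the image formed
    by the 300a pixels with the image formed by the 300b pixels,
    found by minimizing the mean absolute difference."""
    best_shift, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = left[s:], right[:len(right) - s]
        else:
            a, b = left[:len(left) + s], right[-s:]
        err = np.mean(np.abs(a - b))
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

# Hypothetical signals: 'right' is 'left' displaced by 3 samples
x = np.arange(64) / 4.0
left, right = np.sin(x), np.sin(x + 0.75)
shift = detect_phase_difference(left, right)   # -> 3
K = 12.5  # hypothetical defocus per sample of shift (illustrative)
print(shift, "samples ->", K * shift, "defocus units")
```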
Further, in the comparative example, the two photoelectric conversion sections 302 are physically separated by the projecting section 304 (the pixel dividing region) (an example of a separation section). The projecting section 304 includes a groove section (a trench) (not illustrated) provided as a through-DTI (Deep Trench Isolation) piercing through the semiconductor substrate 10 from a front surface 10b side opposite to the light receiving surface 10a in the thickness direction of the semiconductor substrate 10, and a material embedded in the trench and made of an oxide film or a metal film such as a silicon oxide film (SiO), a silicon nitride film, amorphous silicon, polycrystalline silicon, a titanium oxide film (TiO), aluminum, or tungsten. In the imaging element 100, at the time of phase difference detection, when the pixel signals output by the pair of pixels 300a and 300b are mixed with each other and color mixing occurs, the accuracy of phase difference detection is deteriorated. In the comparative example, since the projecting section 304 pierces through the semiconductor substrate 10, the pair of pixels 300a and 300b can be physically separated effectively. As a result, the occurrence of color mixing can be suppressed and the accuracy of the phase difference detection can be further improved.
Further, when the imaging element 100 is viewed from the light receiving surface 10a side or the front surface 10b side, the slit 312 (see
Further, in the comparative example, since the projecting section 304 pierces through the semiconductor substrate 10, the diffusion regions 306 can be formed deep (here, depth is a distance from the light receiving surface 10a of the semiconductor substrate 10 in the thickness direction of the semiconductor substrate 10) in the semiconductor substrate 10 by conformal doping via the projecting section 304. Therefore, in the comparative example, since desired diffusion regions 306 can be accurately formed, the pair of pixels 300a and 300b can be effectively electrically separated. As a result, the occurrence of color mixing can be suppressed and the accuracy of the phase difference detection can be further improved. Note that details of the region of the slit 312 are explained below.
Further, in the comparative example, as illustrated in
In the comparative example, the element separation wall 310 surrounding the pixels 300a and 300b and physically separating the imaging elements 100 adjacent to each other is provided in the semiconductor substrate 10. The element separation wall 310 includes a groove section (a trench) (not illustrated) provided to pierce through the semiconductor substrate 10 in the thickness direction of the semiconductor substrate 10 and the material embedded in the trench and made of an oxide film or a metal film such as a silicon oxide film, a silicon nitride film, amorphous silicon, polycrystalline silicon, a titanium oxide film, aluminum, or tungsten. That is, the projecting section 304 and the element separation wall 310 may be formed of the same material. Note that, in the comparative example, since the element separation wall 310 and the projecting section 304 have the same configuration, the element separation wall 310 and the projecting section 304 can have an integrated form and, therefore, can be formed simultaneously. As a result, according to the comparative example, since the projecting section 304 can be formed simultaneously with the element separation wall 310, an increase in process steps for the imaging element 100 can be suppressed.
Further, in the comparative example, the diffusion regions 306 can be formed deep (here, depth is a distance from the light receiving surface 10a of the semiconductor substrate 10 in the thickness direction of the semiconductor substrate 10) in the semiconductor substrate 10 around the element separation wall 310 by conformal doping of impurities of the second conductivity type (for example, the p-type) via the element separation wall 310.
Further, in the comparative example, the electric charges generated in the photoelectric conversion section 302 of the pixel 300a and the photoelectric conversion section 302 of the pixel 300b are transferred via the transfer gates 400a and 400b of the transfer transistors (one type of the pixel transistors explained above) provided on the surface 10b located on the opposite side of the light receiving surface 10a of the semiconductor substrate 10. The transfer gates 400a and 400b can be formed of, for example, a metal film. Then, the electric charges may be stored in, for example, a floating diffusion section (a charge storage section) (not illustrated) provided in a semiconductor region having the first conductivity type (for example, the n-type) provided in the semiconductor substrate 10. Note that, in the comparative example, the floating diffusion section is not limited to be provided in the semiconductor substrate 10 and may be provided, for example, on another substrate (not illustrated) stacked on the semiconductor substrate 10.
Further, on the front surface 10b of the semiconductor substrate 10, a plurality of various pixel transistors (not illustrated) other than the transfer transistors explained above, which are used for, for example, reading electric charges as pixel signals, may be provided. Further, in the comparative example, the pixel transistors may be provided on the semiconductor substrate 10 or may be provided on another substrate (not illustrated) stacked on the semiconductor substrate 10.
Next, a planar configuration of the imaging element 100 according to the comparative example is explained with reference to
As illustrated in
Further, the two projecting sections 304 are provided in the center of the imaging element 100 in the row direction when the imaging element 100 is viewed from above the light receiving surface 10a or the surface 10b. Projecting lengths (lengths in the column direction) of the projecting sections 304 are substantially the same. As explained above, the two projecting sections 304 are provided to pierce through the semiconductor substrate 10. Note that, in the comparative example, the width of the projecting section 304 is not particularly limited as long as the pair of pixels 300a and 300b can be separated.
Further, the projecting section 304 and the element separation wall 310 according to the comparative example explained above have a form as illustrated in
As explained above, in the comparative example, since the slit 312 is provided near the center O of the imaging element 100, scattering of light by the projecting section 304 is suppressed. Therefore, according to the comparative example, light made incident on the center O of the imaging element 100 can be made incident on the photoelectric conversion sections 302 without being scattered. As a result, according to the comparative example, since the imaging element 100 can more reliably capture light made incident on the center O of the imaging element 100, deterioration of imaging pixels can be avoided.
Further, in the comparative example, as explained above, for example, the impurities of the first conductivity type are introduced into the region on the surface 10b side of the slit 312 by ion implantation and the channel serving as the overflow path can be formed. Therefore, according to the comparative example, it is possible to form the overflow path at the time of normal imaging while separating the pair of pixels 300a and 300b at the time of phase difference detection. Therefore, it is possible to avoid deterioration of a captured image while improving the accuracy of phase difference detection.
Further, in the comparative example, it is possible to introduce impurities into the region of the slit 312 through the trench of the projecting section 304 with conformal doping and form the diffusion regions 306. Therefore, use of ion implantation can be avoided. Therefore, according to the comparative example, since the ion implantation is not used, it is possible to avoid introduction of impurities into the photoelectric conversion sections 302 and it is possible to avoid a reduction of and damage to the photoelectric conversion sections 302. Further, by using the conformal doping, it is possible to repair crystal defects while uniformly diffusing impurities by applying a high temperature. As a result, according to the comparative example, it is possible to suppress deterioration in sensitivity and a reduction of a dynamic range of the imaging element 100.
Note that the conformal doping is a method of uniformly introducing impurities into the semiconductor substrate 10. Specifically, the uniform distribution of impurities is realized using plasma doping, vapor phase decomposition (VPD), solid phase diffusion, thermal diffusion, or the like. Compared with such conformal doping, the ion implantation method used for impurity introduction produces an impurity distribution having a peak at a depth that depends on the implantation energy. Therefore, it is difficult to introduce impurities uniformly with ion implantation.
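This contrast can be made concrete with textbook profile models. The sketch below is illustrative only: the Gaussian implant profile with projected range Rp and straggle dRp is a standard approximation, the conformal profile is idealized as uniform along the trench depth, and all numerical values are hypothetical:

```python
import math

def implant_profile(x_um: float, dose_cm2: float,
                    rp_um: float, drp_um: float) -> float:
    """Gaussian depth profile of an ion implant (atoms/cm^3):
    N(x) = dose / (sqrt(2*pi)*dRp) * exp(-(x - Rp)^2 / (2*dRp^2))."""
    drp_cm = drp_um * 1e-4
    return (dose_cm2 / (math.sqrt(2 * math.pi) * drp_cm)
            * math.exp(-((x_um - rp_um) ** 2) / (2 * drp_um ** 2)))

def conformal_profile(x_um: float, n_cm3: float, depth_um: float) -> float:
    """Idealized conformal doping: uniform concentration along the
    full trench depth (illustrative model)."""
    return n_cm3 if 0.0 <= x_um <= depth_um else 0.0

# Illustrative values: the implant peaks near Rp = 1 um and falls off
# quickly, while the conformal profile stays constant along the trench.
for x in (0.5, 1.0, 2.0, 3.0):
    print(x, f"{implant_profile(x, 1e12, 1.0, 0.3):.2e}",
          f"{conformal_profile(x, 1e17, 3.0):.2e}")
```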
Note that, in the comparative example, when the imaging element 100 is viewed from above the light receiving surface 10a or the surface 10b, the element separation wall 310 may include two projecting sections 304 projecting in the row direction toward the center O of the imaging element 100 and facing each other. Further, in this case, the two projecting sections 304 may be provided in the center of the imaging element 100 in the column direction when the imaging element 100 is viewed from above the light receiving surface 10a or the surface 10b.
As explained above, according to the comparative example, since the projecting section 304 that physically separates the pair of pixels 300a and 300b at the time of phase difference detection and the diffusion regions 306 and the diffusion region 320 that electrically separate the pair of pixels 300a and 300b are provided, it is possible to avoid deterioration of a captured image while improving the accuracy of phase difference detection. Specifically, in the comparative example, the pair of pixels 300a and 300b can be effectively separated by the projecting section 304 and the diffusion regions 306. As a result, it is possible to suppress the occurrence of color mixing and further improve the accuracy of phase difference detection. Further, in the comparative example, since the overflow path is provided, when the electric charge of one of the pixels 300a and 300b is about to saturate at the time of normal imaging, saturation of that pixel can be avoided by transferring the electric charge to the other pixel via the overflow path. Therefore, according to the comparative example, by providing such an overflow path, it is possible to secure the linearity of a pixel signal output from the imaging element 100 and prevent deterioration of a captured image.
Further, in the comparative example, since the diffusion regions 306 can be formed by diffusing impurities to the region of the slit 312 through the trench of the projecting section 304 with conformal doping, use of ion implantation can be avoided. Therefore, according to the comparative example, since the ion implantation is not used, it is possible to avoid introduction of impurities into the photoelectric conversion sections 302 and it is possible to avoid a reduction of and damage to the photoelectric conversion sections 302. Further, by using the conformal doping, it is possible to repair crystal defects while uniformly diffusing impurities by applying a high temperature. As a result, according to the comparative example, it is possible to suppress deterioration in sensitivity and a reduction of a dynamic range of the imaging element 100.
In the comparative example, since the projecting section 304 pierces through the semiconductor substrate 10, the diffusion regions 306 can be formed in a deep region in the semiconductor substrate 10 by conformal doping via the projecting section 304. Therefore, in the comparative example, since desired diffusion regions 306 can be accurately formed, the pair of pixels 300a and 300b can be effectively electrically separated. As a result, the occurrence of color mixing can be suppressed and the accuracy of the phase difference detection can be further improved. In addition, according to the comparative example, since the element separation wall 310 and the projecting section 304 have the same form, the projecting section 304 can be formed simultaneously with the element separation wall 310, and an increase in process steps for the imaging element 100 can be suppressed.
In addition, in the comparative example, since the slit 312 is provided in the center O of the imaging element 100, scattering of light by the projecting section 304 is suppressed and light made incident on the center O of the imaging element 100 can be made incident on the photoelectric conversion sections 302 without being scattered. As a result, according to the comparative example, since the imaging element 100 can more reliably capture light made incident on the center O of the imaging element 100, deterioration of imaging pixels can be avoided.
In the following explanation, details of embodiments of the present disclosure created by the present inventors are sequentially explained based on the comparative example explained above.
Next, a first embodiment of the present disclosure created by the present inventors is explained. First, a background leading to the creation of the first embodiment is explained.
In the imaging element 100 according to the comparative example, it is inevitable that the photoelectric conversion sections 302 (the photodiodes) are reduced in size by the element separation wall 310 surrounding the imaging element 100 and the diffusion regions 306 provided around the element separation wall 310 and the projecting section 304. In particular, when the imaging element 100 is further miniaturized, since the photoelectric conversion sections 302 are small, there is a limit to the amount of electric charges that can be generated even if a large amount of light is made incident on the imaging element 100. In other words, in the comparative example, there is a limit to increasing the saturation signal amount (Qs) of the imaging element 100. In addition, in the comparative example, since the element separation wall 310 is provided in both the row direction and the column direction, there is a limit to the range in which the transfer gates 400a and 400b, various pixel transistors (not illustrated), the floating diffusion section (the charge storage section) (not illustrated), and the like can be arranged, and the flexibility of the layout is low.
Therefore, the present inventors have created the first embodiment of the present disclosure in order to further improve flexibility of a layout while further increasing the saturation signal amount (Qs) in the imaging element 100 according to such a comparative example.
First, a planar configuration of the present embodiment is explained with reference to
Whereas the element separation wall 310 is provided in the row direction (the first direction) and the column direction (the second direction) in the comparative example, in the present embodiment, as illustrated in
Further, in the present embodiment, the imaging elements 100 adjacent in the row direction (the first direction) are physically and electrically separated by the element separation walls 310b. However, unlike the comparative example, the element separation wall 310 in the row direction is not provided. Therefore, there is no element that separates the imaging elements 100 adjacent to each other in the column direction (the second direction). Therefore, there is a high possibility that color mixing occurs between the imaging elements 100 adjacent to each other in the column direction. Therefore, in the present embodiment, in order to electrically separate the imaging elements 100 adjacent to each other in the column direction, a diffusion region (a second diffusion region) 306d is provided between the imaging elements 100 adjacent to each other in the column direction. Specifically, as illustrated in
As explained above, in the present embodiment, by providing the element separation walls 310b only in the column direction (the second direction) and providing the diffusion region (the second diffusion region) 306d between the imaging elements 100 adjacent to each other in the column direction, the imaging elements 100 adjacent in the column direction can be electrically separated. Therefore, in the present embodiment, since the element separation wall 310 in the row direction is not provided, the photoelectric conversion sections 302 (the photodiodes) can be increased in size compared with the comparative example. As a result, according to the present embodiment, the saturation signal amount (Qs) of the imaging element 100 can be further increased.
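To first order, the effect on the saturation signal amount can be estimated by treating Qs as proportional to the photodiode area. The following sketch uses entirely hypothetical numbers (cell pitch, wall width including its surrounding diffusion region, and Qs per unit area) purely to illustrate the scaling:

```python
def qs_gain(pitch_um: float, wall_um: float, qs_per_um2: float) -> tuple:
    """First-order estimate: Qs scales with photodiode area. Removing a
    row-direction wall of width wall_um (wall plus surrounding diffusion
    region) returns that strip of the pitch x pitch cell to the
    photodiode. All parameter values are hypothetical."""
    area_with_wall = pitch_um * (pitch_um - wall_um)
    area_without = pitch_um * pitch_um
    return qs_per_um2 * area_with_wall, qs_per_um2 * area_without

qs_old, qs_new = qs_gain(pitch_um=1.0, wall_um=0.2, qs_per_um2=8000.0)
print(qs_old, "->", qs_new,
      f"(+{100 * (qs_new / qs_old - 1):.0f}%)")  # 6400.0 -> 8000.0 (+25%)
```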
In addition, in the present embodiment, since the element separation wall 310 in the row direction is not provided, a range in which the transfer gates 400a and 400b, the various pixel transistors (not illustrated), the floating diffusion section (the charge storage section) 601, the ground section 602, and the like can be arranged is widened. As a result, according to the present embodiment, flexibility of a layout is improved.
Further, in the present embodiment, since the element separation walls 310b are provided only in the column direction (the second direction) and the element separation wall 310 is not provided in the row direction (the first direction), the element separation wall 310 is not formed in a lattice shape (in plan view). Therefore, according to the present embodiment, since the element separation walls 310b can be formed in a simple shape, the element separation walls 310b can be formed more accurately and the rectangularity of the element separation walls 310b can be improved.
Note that, in the above explanation, the element separation walls 310b are provided only in the column direction (the second direction). However, in the present embodiment, conversely, the element separation wall 310 may be provided only in the row direction (the first direction). In this case, in order to electrically separate the imaging elements 100 adjacent to each other in the row direction, the diffusion region (the second diffusion region) 306d is provided between the imaging elements 100 adjacent to each other in the row direction.
Next, a part of a manufacturing process (a manufacturing method) for the imaging element 100 according to the present embodiment is explained with reference to
First, in order to form the element separation wall 310b and the projecting section 304 in the semiconductor substrate 10, a trench is formed at a predetermined position of the semiconductor substrate 10, and a material (for example, polysilicon) containing impurities of the second conductivity type (for example, the p-type) is formed in the trench. Further, the material containing the impurities is partially removed by dry etching so that it remains only on the inner wall surface of the trench. Subsequently, by applying heat to the semiconductor substrate 10, the impurities are diffused from the material into the semiconductor substrate 10. That is, the diffusion regions 306 are formed by conformal doping. Subsequently, by forming an insulating material in the trench, a form illustrated in
Further, in the present embodiment, impurities of the first conductivity type (for example, the n-type) are ion-implanted (patterned) into a region 500 illustrated in
In this way, as illustrated in
Next, the imaging element 100 according to the present embodiment can also be formed by another method (anisotropic conformal doping). A part of this manufacturing process (manufacturing method 2) is explained with reference to
First, as illustrated in
Next, the material (for example, polysilicon) containing the impurities of the second conductivity type (for example, the p-type) is etched using a mask or the like so that the material is left only at a desired part. Further, polysilicon not containing impurities is formed in the etched part, and a form illustrated in
Further, as illustrated in
Subsequently, a trench is formed along the element separation walls 310b and the projecting section 304, a material (for example, polysilicon) containing impurities of the second conductivity type (for example, the p-type) is formed in the trench, and a form illustrated in
Further, as illustrated in
In the embodiment explained above, the element separation wall 310 in the row direction (the first direction) is explained as not being provided. However, the present embodiment is not limited to this and can be modified as appropriate. Therefore, a modification 1 of the present embodiment is explained with reference to
As illustrated in
Note that, as illustrated in
Further, in the present modification, as illustrated in
In the present modification, the shape of the element separation wall (the second element separation wall) 340 is not limited. Various shapes such as a rectangular shape, a circular shape, an elliptical shape, a polygonal shape, and a shape obtained by connecting vertexes of two triangles illustrated in
Further, in the present embodiment, an element that separates the two pixels 300a and 300b (the photoelectric conversion sections 302) is not limited to the pair of projecting sections 304 (an example of the separation section) and the diffusion region 306e around the projecting sections. Therefore, a modification of the separation section that separates the two pixels 300a and 300b is explained with reference to
For example, the separation section illustrated at the left end of
As illustrated second from the left side in
As illustrated third from the left side of
In the present modification, rather than being physically separated, for example, as illustrated on the right side of
In the present embodiment, since flexibility of a layout is high, arrangement of the pixel transistors and the like is not limited. Therefore, the arrangement of the pixel transistors is explained with reference to
For example, in the present modification, as illustrated on the left side of
For example, in the present modification, as illustrated on the right side of
In the present embodiment explained above, since the element separation wall 310 in the row direction (the first direction) is not provided, there is no element that separates the imaging elements 100 adjacent to each other in the column direction (the second direction). Therefore, there is a high possibility that color mixing occurs between the imaging elements 100 adjacent to each other in the column direction. Therefore, in order to prevent such color mixing, it is conceivable to modify the shape of the light blocking section 204 provided on the light receiving surface 10a of the semiconductor substrate 10. In the following explanation, modifications of such a light blocking section 204 are explained with reference to
As illustrated in
As illustrated in
Next, a second embodiment of the present disclosure created by the present inventors is explained. First, a background leading to the creation of the second embodiment is explained.
In the imaging element 100 according to the comparative example, since the element separation wall 310 surrounding the imaging element 100 and the diffusion regions 306 provided around the element separation wall 310 and the projecting section 304 are provided, it is inevitable that the photoelectric conversion sections 302 (the photodiodes) are reduced in size. In other words, in the comparative example, there is a limit in increasing the saturation signal amount (Qs) of the imaging element 100.
Therefore, as in the first embodiment explained above, the present inventors have created the second embodiment of the present disclosure in order to further increase the saturation signal amount (Qs) of the imaging element 100 according to the comparative example.
First, a planar configuration of the imaging element 100 of the present embodiment is explained with reference to
As illustrated in
In the present embodiment, the diffusion regions 306 containing impurities of the second conductivity type (for example, the p-type) (including a diffusion region (a third diffusion region) around the element separation walls 310a) are provided around the element separation walls 310a and 310b. Specifically, in the present embodiment, the diffusion regions 306 around the element separation walls 310a may be narrower than the diffusion regions 306 around the element separation walls 310b, and the concentration of impurities in the diffusion regions 306 around the element separation walls 310a may be lower than the concentration of impurities in the diffusion regions 306 around the element separation walls 310b.
Next, a sectional configuration of the imaging element 100 of the present embodiment is explained with reference to
Specifically, as illustrated in
Further, as explained above, in the present embodiment, the width B of the element separation walls 310a is smaller than the width A of the element separation walls 310b. In the present embodiment, the diffusion regions 306 around the element separation wall 310a may be narrower than the diffusion regions 306 around the element separation wall 310b. Further, the concentration of impurities in the diffusion regions 306 around the element separation wall 310a may be lower than the concentration of impurities in the diffusion regions 306 around the element separation wall 310b.
In the present embodiment, by setting the width B of the element separation walls 310a smaller than the width A of the element separation walls 310b, the region of the photoelectric conversion sections 302 (the photodiodes) can be increased in size compared with the comparative example. Therefore, the saturation signal amount (Qs) can be further increased.
Note that, in
Next, a part of a manufacturing process (a manufacturing method) for the imaging element 100 according to the present embodiment is explained with reference to
First, as illustrated in an upper part and a middle part of
Next, as illustrated in a lower part of
Further, in the present embodiment, the imaging element 100 can be formed by other manufacturing methods. Next, a part of a manufacturing process (a manufacturing method) for the imaging element 100 according to the present embodiment is explained with reference to
First, as illustrated in an upper part of
Further, as illustrated in a middle part of
Next, as illustrated in a lower part of
Further, although not illustrated, in the present manufacturing method, impurities are diffused into the semiconductor substrate 10 by applying heat to the semiconductor substrate 10 (conformal doping). Furthermore, the element separation walls 310a and 310b are formed by forming insulating materials in the trenches 750.
Next, a part of a manufacturing process (a manufacturing method) for the imaging element 100 according to the present embodiment is explained with reference to
First, as illustrated in an upper part of
Further, as illustrated in a middle part of
Further, as illustrated in a lower part of
Further, in the present embodiment, the element that separates the two pixels 300a and 300b (the photoelectric conversion sections 302) is not limited to the pair of projecting sections 304 (an example of the separation section) and the diffusion regions 306 around the projecting sections. Therefore, a modification of the separation section that separates the two pixels 300a and 300b is explained with reference to
For example, the separation section illustrated in an upper part of
For example, the separation section illustrated in a third part from the top of
For example, as illustrated in a lower part of
Next, a third embodiment of the present disclosure created by the present inventors is explained with reference to
As illustrated in
In the comparative example, when the transfer gate 400 is increased in size, the transfer gate 400 comes closer to the position of the overflow path between the pixels 300a and 300b. Therefore, when a line passing through the center of the overflow path and extending in the column direction is set as an axis of symmetry, the potential gradient of the overflow path sometimes becomes asymmetric owing to modulation from the transfer gate 400. In addition, since it is inevitable that the photoelectric conversion sections 302 (photodiodes: PDs) decrease in size when the transfer gate 400 is increased in size, there is a limit to an increase in the saturation signal amount (Qs) of the imaging element 100.
Therefore, the present inventors have created the third embodiment of the present disclosure so that the influence from the transfer gate 400 can be suppressed, the potential gradient of the overflow path can be made more symmetric, and the degree of modulation by the transfer gate 400 and the saturation signal amount (Qs) can be further increased.
First, a configuration of the imaging element 100 of the present embodiment is explained with reference to
In the present embodiment, as illustrated in
When viewed from above the front surface 10b, the floating diffusion section (FD section) (the charge storage section) 601 is provided in the vicinity of a first intersection where one element separation wall (the third element separation wall) 310a and the projecting section 304 (an example of the separation section) intersect. Further, the transfer gate 400 is provided in the vicinity of a second intersection where the element separation wall (the third element separation wall) 310a forming the first intersection and the element separation wall (the first element separation wall) 310b extending in the column direction (second direction) intersect.
In the present embodiment, by being disposed as explained above, the transfer gate (the transfer gate electrode) 400 can be located in a position away from the overflow path. Note that, in the present embodiment, the transfer gate 400 is preferably disposed as far away from the overflow path as possible as long as the transfer gate 400 does not hinder the arrangement and functions of the other elements. According to the present embodiment, since the influence of the potential modulation by the transfer gate 400 can be suppressed, the potential gradient of the overflow path can be brought close to symmetry. In addition, in the present embodiment, since the photoelectric conversion sections 302 (the photodiodes: PDs) can be formed widely by disposing the transfer gate 400 in a position away from the overflow path, the saturation signal amount (Qs) of the imaging element 100 can be further increased.
In the present embodiment, as illustrated in
In the present embodiment, as illustrated in
Further, the present embodiment can be modified. A modification of the transfer gate (the transfer gate electrode) 400 is explained with reference to
As illustrated in
In the embodiment of the present disclosure, the two transfer gates 400a and 400b, the FD section (the floating diffusion section) 601, and the ground section 602 may be disposed as illustrated in
As illustrated in
The FD section 601 is a floating diffusion shared by two cell regions adjacent to each other (see a dotted line region in
The ground section 602 is a ground section shared by the two cell regions adjacent to each other (see a dotted line region in
Here, as illustrated in
Therefore, in the present embodiment, as illustrated in
In the present embodiment, the ground section 602 can be modified as explained below. Therefore, a detailed configuration of the ground section 602 is explained with reference to
As illustrated in
As illustrated in
As illustrated in
As illustrated in
Further, in the present embodiment, the FD section 601 and the ground sections 602 can be modified as explained below. Therefore, detailed configurations of the FD section 601 and the ground sections 602 are explained with reference to
As illustrated in
As illustrated in
Note that the shapes of the FD section 601 and the ground sections 602 may be the same (see
The FD section 601 and the ground sections 602 are arranged in an array (for example, in a matrix in the row direction and the column direction), and may be arranged at the same pitch as the cell pitch of the cell regions or may be arranged shifted from each other by a half pitch.
The shapes of the FD section 601 and the ground sections 602 may be, for example, polygonal shapes other than the octagonal shape having the long sides and the short sides, or elliptical shapes.
As explained above, according to the embodiment of the present disclosure, it is possible to avoid deterioration of a captured image while improving the accuracy of phase difference detection.
Note that, in the embodiment of the present disclosure explained above, a case where the present disclosure is applied to a back-illuminated CMOS image sensor structure is explained. However, the embodiment of the present disclosure is not limited to this and may be applied to other structures.
Note that, in the embodiment of the present disclosure explained above, the imaging element 100 in which the first conductivity type is the n-type, the second conductivity type is the p-type, and electrons are used as signal charges is explained. However, embodiments of the present disclosure are not limited to such an example. For example, the present embodiment can be applied to the imaging element 100 in which the first conductivity type is the p-type, the second conductivity type is the n-type, and holes are used as signal charges.
In the embodiment of the present disclosure explained above, the semiconductor substrate 10 need not necessarily be a silicon substrate and may be another substrate (for example, an SOI (Silicon On Insulator) substrate, an SiGe substrate, or the like). The semiconductor substrate 10 may also be a semiconductor substrate in which a semiconductor structure and the like are formed on such various substrates.
Further, the imaging device 1 according to the embodiment of the present disclosure is not limited to an imaging device that detects a distribution of an incident light amount of visible light and captures the distribution as an image. For example, the present embodiment can be applied to an imaging device that captures a distribution of an incident amount of infrared rays, X-rays, particles, or the like as an image, and to an imaging device (a physical quantity distribution detection device) such as a fingerprint detection sensor that detects a distribution of another physical quantity such as pressure or capacitance and captures the distribution as an image.
The imaging device 1 according to the embodiment of the present disclosure can be manufactured using a method, an apparatus, and conditions used for manufacturing a general semiconductor device. That is, the imaging device 1 according to the present embodiment can be manufactured using an existing manufacturing process for a semiconductor device.
Note that examples of the method explained above include a PVD (Physical Vapor Deposition) method, a CVD (Chemical Vapor Deposition) method, and an ALD (Atomic Layer Deposition) method. Examples of the PVD method include a vacuum vapor deposition method, an EB (electron beam) vapor deposition method, various sputtering methods (a magnetron sputtering method, an RF (Radio Frequency)-DC (Direct Current) coupled bias sputtering method, an ECR (Electron Cyclotron Resonance) sputtering method, a counter target sputtering method, a high frequency sputtering method, and the like), an ion plating method, a laser ablation method, a molecular beam epitaxy (MBE) method, and a laser transfer method. Examples of the CVD method include a plasma CVD method, a thermal CVD method, a metal organic (MO) CVD method, and a photo CVD method. Other methods include an electrolytic plating method, an electroless plating method, a spin coating method, an immersion method, a cast method, a micro-contact printing method, a drop cast method, various printing methods such as a screen printing method, an inkjet printing method, an offset printing method, a gravure printing method, and a flexographic printing method, a stamping method, a spray method, and various coating methods such as an air doctor coater method, a blade coater method, a rod coater method, a knife coater method, a squeeze coater method, a reverse roll coater method, a transfer roll coater method, a gravure coater method, a kiss coater method, a cast coater method, a spray coater method, a slit orifice coater method, and a calender coater method. Further, examples of the patterning method include chemical etching such as shadow masking, laser transfer, and photolithography, and physical etching by ultraviolet rays, a laser, or the like. In addition, examples of a planarization technology include a CMP (Chemical Mechanical Polishing) method, a laser planarization method, and a reflow method.
Note that, in the embodiment of the present disclosure explained above, the structures of the projecting section 304 and the pixel separation wall 334 are explained. However, the structure according to the embodiment of the present disclosure is not limited thereto. Here, various forms of the structures of the sections are explained in detail with reference to
As illustrated in
The RDTI is a structure in which the trench T3 is formed from the light receiving surface 10a of the semiconductor substrate 10 to halfway through the semiconductor substrate 10. The FDTI is a structure in which a trench is formed from the front surface 10b of the semiconductor substrate 10 to halfway through the semiconductor substrate 10. The FFTI is a structure formed by causing the trench T3 to pierce through the semiconductor substrate 10 from the front surface 10b to the light receiving surface 10a of the semiconductor substrate 10. The RFTI is a structure formed by causing the trench T3 to pierce through the semiconductor substrate 10 from the light receiving surface 10a to the front surface 10b of the semiconductor substrate 10. The RDTI+FDTI is a structure in which the RDTI and the FDTI explained above are combined. In the RDTI+FDTI, the trench T3 extending from the light receiving surface 10a and the trench T3 extending from the front surface 10b are connected near the center in the thickness direction of the semiconductor substrate 10.
As illustrated in
Here, as the pixel separation wall 334, another structure may be used besides one pixel separation wall 334 that is not in contact with the element separation wall 310 as illustrated in
Note that, in the embodiment of the present disclosure explained above, a case in which the present disclosure is applied to a one-layer CMOS image sensor structure is explained. However, the embodiment of the present disclosure is not limited thereto and may be applied to other structures such as a stacked CMOS image sensor (CIS) structure. For example, as illustrated in
In the structure illustrated in
In the structure illustrated in
The photodiode (PD) has an n-type semiconductor region 34 and a p-type semiconductor region 35 on the substrate surface side. Gate electrodes 36 are formed, via gate insulating films, on the surface of the substrate in which the pixels are configured. The pixel transistors Tr1 and Tr2 are formed by the gate electrodes 36 and the source/drain regions 33 paired with them. For example, the pixel transistor Tr1 adjacent to the photodiode (PD) corresponds to a transfer transistor, and a source/drain region of the pixel transistor Tr1 corresponds to a floating diffusion (FD). The unit pixels are separated from one another by element separation regions 38.
On the first semiconductor substrate 31, MOS transistors Tr3 and Tr4 configuring a control circuit are formed. The MOS transistors Tr3 and Tr4 are formed by the n-type source/drain regions 33 and the gate electrodes 36 formed via gate insulating films. Further, an interlayer insulating film 39 in a first layer is formed on the surface of the first semiconductor substrate 31, and connection conductors 44 connected to required transistors are formed in the interlayer insulating film 39. In addition, a multilayer wiring layer 41 is formed by providing wires 40 in a plurality of layers in the interlayer insulating film 39 so as to be connected to the connection conductors 44.
As illustrated in
A multilayer wiring layer 55 is formed by providing wires 53 in a plurality of layers in the interlayer insulating film 49 to be connected to the connection conductors 54 and the connection conductor 51 for electrode extraction.
Further, as illustrated in
As illustrated in
On the other hand, on the second semiconductor substrate 45 side, an opening 77 corresponding to the connection conductor 51 is provided. A spherical electrode bump 78 electrically connected to the connection conductor 51 through the opening 77 is provided.
In the structure illustrated in
As illustrated in
Further, a contact 265 used for electrical connection to the second semiconductor substrate 212 is provided on the first semiconductor substrate 211. The contact 265 is connected to the contact 311 of the second semiconductor substrate 212 explained below and is also connected to a pad 280a of the first semiconductor substrate 211.
On the other hand, a logic circuit is formed on the second semiconductor substrate 212. Specifically, the MOS transistor Tr6, the MOS transistor Tr7, and the MOS transistor Tr8, which are a plurality of transistors configuring a logic circuit, are formed in a p-type semiconductor well region (not illustrated) of the second semiconductor substrate 212. In the second semiconductor substrate 212, connection conductors 254 connected to the MOS transistor Tr6, the MOS transistor Tr7, and the MOS transistor Tr8 are formed.
Further, the contact 311 used for electrical connection to the first semiconductor substrate 211 and the third semiconductor substrate 213 is formed on the second semiconductor substrate 212. The contact 311 is connected to the contact 265 of the first semiconductor substrate 211 and is also connected to a pad 330a of the third semiconductor substrate 213.
Further, a memory circuit is formed on the third semiconductor substrate 213. Specifically, an MOS transistor Tr11, an MOS transistor Tr12, and an MOS transistor Tr13, which are a plurality of transistors configuring a memory circuit, are formed in a p-type semiconductor well region (not illustrated) of the third semiconductor substrate 213.
Further, in the third semiconductor substrate 213, connection conductors 344 connected to the MOS transistor Tr11, the MOS transistor Tr12, and the MOS transistor Tr13 are formed.
In the structure illustrated in
In the structure illustrated in
The second substrate 20A is formed by stacking an insulating layer 88 on a semiconductor substrate 21A and includes the insulating layer 88 as a part of an interlayer insulating film 87. The insulating layer 88 is provided in a gap between the semiconductor substrate 21A and a semiconductor substrate 81. The second substrate 20A includes a read circuit 22A. Specifically, the second substrate 20A has a configuration in which the read circuit 22A is provided in a portion on the front surface side (the third substrate 30 side) of the semiconductor substrate 21A. The second substrate 20A is bonded to the first substrate 80 with the rear surface of the semiconductor substrate 21A directed to the front surface side of the semiconductor substrate 11. That is, the second substrate 20A is bonded to the first substrate 80 in a face-to-back manner. The second substrate 20A further includes, in the same layer as the semiconductor substrate 21A, an insulating layer 89 piercing through the semiconductor substrate 21A, and includes the insulating layer 89 as a part of the interlayer insulating film 87.
A stacked body including the first substrate 80 and the second substrate 20A includes the interlayer insulating film 87 and a through-wire 90 provided in the interlayer insulating film 87. Specifically, the through-wire 90 is electrically connected to the floating diffusion FD and a connection wire 91 explained below. The second substrate 20A further includes, for example, a wiring layer 56 on the insulating layer 88.
The wiring layer 56 further includes, for example, a plurality of pad electrodes 58 in an insulating layer 57. The pad electrodes 58 are made of metal such as copper (Cu) or aluminum (Al). The pad electrodes 58 are exposed on the surface of the wiring layer 56. The pad electrodes 58 are used for electrical connection of the second substrate 20A and the third substrate 30 and bonding of the second substrate 20A and the third substrate 30.
The third substrate 30 is formed, for example, by stacking an interlayer insulating film 61 on the semiconductor substrate 81. Note that, as explained below, the third substrate 30 is bonded to the second substrate 20A with the surfaces on their front surface sides facing each other. The third substrate 30 has a configuration in which a logic circuit 82 is provided in a portion on the front surface side of the semiconductor substrate 81. The third substrate 30 further includes, for example, a wiring layer 62 on the interlayer insulating film 61. The wiring layer 62 includes, for example, an insulating layer 92 and a plurality of pad electrodes 64 provided in the insulating layer 92. The plurality of pad electrodes 64 are electrically connected to the logic circuit 82. The pad electrodes 64 are made of, for example, Cu (copper). The pad electrodes 64 are exposed on the surface of the wiring layer 62. The pad electrodes 64 are used for electrical connection of the second substrate 20A and the third substrate 30 and bonding of the second substrate 20A and the third substrate 30.
Note that, when the technology of the present disclosure is applied to a one-stage pixel (a normal CIS), as an example, as illustrated in
The imaging element 100 illustrated in
The technology according to the present disclosure (the present technology) can be further applied to various products. For example, the technology according to the present disclosure may be applied to a camera or the like. Therefore, a configuration example of a camera 700 serving as electronic equipment to which the present technology is applied is explained with reference to
As illustrated in
The technology according to the present disclosure (the present technology) can be further applied to various products. For example, the technology according to the present disclosure may be applied to a smartphone or the like. Therefore, a configuration example of a smartphone 900 serving as electronic equipment to which the present technology is applied is explained with reference to
As illustrated in
The CPU 901 functions as an arithmetic processing device and a control device and controls all or a part of the operation in the smartphone 900 according to various programs recorded in the ROM 902, the RAM 903, the storage device 904, or the like. The ROM 902 stores programs, arithmetic operation parameters, and the like used by the CPU 901. The RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate in the execution, and the like. The CPU 901, the ROM 902, and the RAM 903 are connected to one another by a bus 914. The storage device 904 is a device for data storage configured as an example of a storage unit of the smartphone 900. The storage device 904 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, or an optical storage device. The storage device 904 stores programs to be executed by the CPU 901, various data, data acquired from the outside, and the like.
The communication module 905 is a communication interface including, for example, a communication device for connecting to a communication network 906. The communication module 905 can be, for example, a communication card for a wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB). The communication module 905 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various kinds of communication, or the like. The communication module 905 transmits and receives signals and the like to and from, for example, the Internet and other communication equipment using a predetermined protocol such as TCP (Transmission Control Protocol)/IP (Internet Protocol). The communication network 906 connected to the communication module 905 is a network connected by wire or radio and is, for example, the Internet, a home LAN, infrared communication, or satellite communication.
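Note that, as a minimal sketch of the kind of TCP/IP exchange referred to above, the following Python snippet opens a TCP connection, sends a request, and reads a response. The host, port, and payload are hypothetical placeholders and do not describe the actual protocol stack of the communication module 905.

    import socket

    # Minimal TCP client sketch; the host and payload are placeholders.
    with socket.create_connection(("example.com", 80), timeout=5) as sock:
        sock.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
        response = sock.recv(4096)
    print(response.decode(errors="replace"))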
The sensor module 907 includes various sensors such as a motion sensor (for example, an acceleration sensor, a gyro sensor, or a geomagnetic sensor), a biological information sensor (for example, a pulse sensor, a blood pressure sensor, or a fingerprint sensor), or a position sensor (for example, a GNSS (Global Navigation Satellite System) receiver).
The imaging device 909 is provided on the surface of the smartphone 900 and can image a target object or the like located on the rear side or the front side of the smartphone 900. Specifically, the imaging device 909 can include an imaging element (not illustrated) such as a CMOS (Complementary MOS) image sensor to which the technology according to the present disclosure (the present technology) can be applied and a signal processing circuit (not illustrated) that applies imaging signal processing to a signal photo-electrically converted by the imaging element. Further, the imaging device 909 can include an optical system mechanism (not illustrated) including an imaging lens, a zoom lens, a focus lens, and the like and a drive system mechanism (not illustrated) that controls an operation of the optical system mechanism. The optical system mechanism condenses incident light from a target object on the imaging element as an optical image. The signal processing circuit can acquire a captured image by photo-electrically converting the formed optical image in units of pixels, reading the signal of each pixel as an imaging signal, and performing image processing.
The display device 910 is provided on the surface of the smartphone 900 and can be a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) display. The display device 910 can display an operation screen, a captured image acquired by the imaging device 909 explained above, and the like.
The speaker 911 can output, for example, call voice, voice incidental to video content displayed by the display device 910 explained above, and the like to a user.
The microphone 912 can collect, for example, call voice of the user, voice including a command to start a function of the smartphone 900, and voice in a surrounding environment of the smartphone 900.
The input device 913 is a device operated by the user, such as a button, a keyboard, a touch panel, or a mouse. The input device 913 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. By operating the input device 913, the user can input various data to the smartphone 900 and instruct the smartphone 900 to perform a processing operation.
The configuration example of the smartphone 900 is explained above. The components explained above may be configured using general-purpose members or may include hardware specialized for the functions of the components. Such a configuration can be changed as appropriate according to the technical level at the time of implementation.
The technology according to the present disclosure (the present technology) can be further applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
In
The endoscope 11100 includes a lens barrel 11101 having a region of a predetermined length from a distal end thereof to be inserted into a body lumen of the patient 11132, and a camera head 11102 connected to a proximal end of the lens barrel 11101. In the example depicted, the endoscope 11100 is depicted as a rigid endoscope having the lens barrel 11101 of the hard type. However, the endoscope 11100 may otherwise be configured as a flexible endoscope having the lens barrel 11101 of the soft type.
The lens barrel 11101 has, at a distal end thereof, an opening in which an objective lens is fitted. A light source apparatus 11203 is connected to the endoscope 11100 such that light generated by the light source apparatus 11203 is introduced to the distal end of the lens barrel 11101 by a light guide extending in the inside of the lens barrel 11101 and is irradiated toward an observation target in a body lumen of the patient 11132 through the objective lens. It is to be noted that the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
An optical system and an image pickup element are provided in the inside of the camera head 11102 such that reflected light (observation light) from the observation target is condensed on the image pickup element by the optical system. The observation light is photo-electrically converted by the image pickup element to generate an electric signal corresponding to the observation light, namely, an image signal corresponding to an observation image. The image signal is transmitted as RAW data to a CCU 11201.
The CCU 11201 includes a central processing unit (CPU), a graphics processing unit (GPU) or the like and integrally controls operation of the endoscope 11100 and a display apparatus 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs, for the image signal, various image processes for displaying an image based on the image signal such as, for example, a development process (demosaic process).
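Note that the development (demosaic) process is named above but not specified; the following Python sketch shows one common form of it, bilinear interpolation under the assumption of an RGGB Bayer pattern. It is an illustration only and does not describe the actual processing in the CCU 11201.

    import numpy as np

    def _conv3x3(img, k):
        """Same-size 3x3 convolution via zero padding (illustrative helper)."""
        p = np.pad(img, 1)
        h, w = img.shape
        out = np.zeros_like(img)
        for dy in range(3):
            for dx in range(3):
                out += k[dy, dx] * p[dy:dy + h, dx:dx + w]
        return out

    def bilinear_demosaic(raw):
        """Assumed RGGB Bayer layout; returns an H x W x 3 RGB image."""
        h, w = raw.shape
        rgb = np.zeros((h, w, 3), dtype=np.float32)
        mask = np.zeros((h, w, 3), dtype=np.float32)
        rgb[0::2, 0::2, 0] = raw[0::2, 0::2]; mask[0::2, 0::2, 0] = 1  # R
        rgb[0::2, 1::2, 1] = raw[0::2, 1::2]; mask[0::2, 1::2, 1] = 1  # G
        rgb[1::2, 0::2, 1] = raw[1::2, 0::2]; mask[1::2, 0::2, 1] = 1  # G
        rgb[1::2, 1::2, 2] = raw[1::2, 1::2]; mask[1::2, 1::2, 2] = 1  # B
        k = np.array([[0.25, 0.5, 0.25], [0.5, 1.0, 0.5], [0.25, 0.5, 0.25]])
        for c in range(3):  # fill missing samples from weighted same-color neighbors
            rgb[:, :, c] = _conv3x3(rgb[:, :, c], k) / np.maximum(_conv3x3(mask[:, :, c], k), 1e-6)
        return rgb

    raw = np.random.rand(4, 4).astype(np.float32)
    print(bilinear_demosaic(raw).shape)  # (4, 4, 3)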
The display apparatus 11202 displays thereon an image based on an image signal, for which the image processes have been performed by the CCU 11201, under the control of the CCU 11201.
The light source apparatus 11203 includes a light source such as, for example, a light emitting diode (LED) and supplies irradiation light upon imaging of a surgical region to the endoscope 11100.
An inputting apparatus 11204 is an input interface for the endoscopic surgery system 11000. A user can perform inputting of various kinds of information or instruction inputting to the endoscopic surgery system 11000 through the inputting apparatus 11204. For example, the user inputs an instruction or the like to change an image pickup condition (type of irradiation light, magnification, focal distance, or the like) of the endoscope 11100.
A treatment tool controlling apparatus 11205 controls driving of the energy treatment tool 11112 for cautery or incision of a tissue, sealing of a blood vessel or the like. A pneumoperitoneum apparatus 11206 feeds gas into a body lumen of the patient 11132 through the pneumoperitoneum tube 11111 to inflate the body lumen in order to secure the field of view of the endoscope 11100 and secure the working space for the surgeon. A recorder 11207 is an apparatus capable of recording various kinds of information relating to surgery. A printer 11208 is an apparatus capable of printing various kinds of information relating to surgery in various forms such as a text, an image or a graph.
It is to be noted that the light source apparatus 11203, which supplies irradiation light to the endoscope 11100 when a surgical region is to be imaged, may include a white light source including, for example, an LED, a laser light source, or a combination of them. Where a white light source includes a combination of red, green, and blue (RGB) laser light sources, since the output intensity and the output timing can be controlled with a high degree of accuracy for each color (each wavelength), adjustment of the white balance of a picked up image can be performed by the light source apparatus 11203. Further, in this case, if laser beams from the respective RGB laser light sources are irradiated time-divisionally on an observation target and driving of the image pickup elements of the camera head 11102 is controlled in synchronism with the irradiation timings, images individually corresponding to the R, G, and B colors can also be picked up time-divisionally. According to this method, a color image can be obtained even if color filters are not provided for the image pickup element.
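Note that the following Python sketch merely restates the time-divisional idea above: three monochrome frames, each picked up while only one of the R, G, and B laser light sources illuminates the observation target, are stacked into one color image. The frame data here are hypothetical.

    import numpy as np

    def synthesize_color(frame_r, frame_g, frame_b):
        """Stack three time-divisionally picked up monochrome frames
        (each H x W) into one H x W x 3 color image; no color filter
        array on the image pickup element is needed."""
        return np.stack([frame_r, frame_g, frame_b], axis=-1)

    # Hypothetical 2 x 2 frames picked up under R, G, and B irradiation.
    r = np.array([[0.9, 0.1], [0.2, 0.8]])
    g = np.array([[0.3, 0.7], [0.6, 0.4]])
    b = np.array([[0.1, 0.9], [0.5, 0.5]])
    print(synthesize_color(r, g, b).shape)  # (2, 2, 3)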
Further, the light source apparatus 11203 may be controlled such that the intensity of light to be outputted is changed for each predetermined time. By controlling driving of the image pickup element of the camera head 11102 in synchronism with the timing of the change of the intensity of light to acquire images time-divisionally and synthesizing the images, an image of a high dynamic range free from underexposed blocked up shadows and overexposed highlights can be created.
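Note that the synthesis step above is likewise unspecified; the following is a minimal sketch assuming two frames acquired at a known short/long exposure ratio and fused with a simple saturation-aware weight. The threshold values are assumptions.

    import numpy as np

    def fuse_hdr(short, long_, gain):
        """Fuse a short-exposure frame and a long-exposure frame
        (values in [0, 1]); `gain` is the assumed exposure ratio.
        Pixels saturated in the long exposure fall back to the
        brightness-matched short exposure, recovering highlights."""
        weight = np.clip((0.95 - long_) / 0.1, 0.0, 1.0)  # 0 where saturated
        return weight * long_ + (1.0 - weight) * short * gain

    short = np.array([[0.05, 0.20]])
    long_ = np.array([[0.40, 0.99]])  # second pixel is blown out
    print(fuse_hdr(short, long_, gain=8.0))  # highlight recovered as 1.6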
Further, the light source apparatus 11203 may be configured to supply light of a predetermined wavelength band ready for special light observation. In special light observation, for example, by utilizing the wavelength dependency of absorption of light in a body tissue to irradiate light of a narrow band in comparison with irradiation light upon ordinary observation (namely, white light), narrow band observation (narrow band imaging) of imaging a predetermined tissue such as a blood vessel of a superficial portion of the mucous membrane or the like in a high contrast is performed. Alternatively, in special light observation, fluorescent observation for obtaining an image from fluorescent light generated by irradiation of excitation light may be performed. In fluorescent observation, it is possible to perform observation of fluorescent light from a body tissue by irradiating excitation light on the body tissue (autofluorescence observation) or to obtain a fluorescent light image by locally injecting a reagent such as indocyanine green (ICG) into a body tissue and irradiating excitation light corresponding to a fluorescent light wavelength of the reagent upon the body tissue. The light source apparatus 11203 can be configured to supply such narrow-band light and/or excitation light suitable for special light observation as described above.
The camera head 11102 includes a lens unit 11401, an image pickup unit 11402, a driving unit 11403, a communication unit 11404 and a camera head controlling unit 11405. The CCU 11201 includes a communication unit 11411, an image processing unit 11412 and a control unit 11413. The camera head 11102 and the CCU 11201 are connected for communication to each other by a transmission cable 11400.
The lens unit 11401 is an optical system provided at a connecting location to the lens barrel 11101. Observation light taken in from the distal end of the lens barrel 11101 is guided to the camera head 11102 and introduced into the lens unit 11401. The lens unit 11401 includes a combination of a plurality of lenses including a zoom lens and a focusing lens.
The image pickup unit 11402 includes an image pickup element. The number of image pickup elements included in the image pickup unit 11402 may be one (single-plate type) or a plural number (multi-plate type). Where the image pickup unit 11402 is configured as that of the multi-plate type, for example, image signals corresponding to R, G, and B are generated by the respective image pickup elements, and the image signals may be synthesized to obtain a color image. The image pickup unit 11402 may also be configured so as to have a pair of image pickup elements for acquiring image signals for the right eye and the left eye ready for three dimensional (3D) display. If 3D display is performed, the depth of a living body tissue in a surgical region can be comprehended more accurately by the surgeon 11131. It is to be noted that, where the image pickup unit 11402 is configured as that of the stereoscopic type, a plurality of systems of lens units 11401 are provided corresponding to the individual image pickup elements.
Further, the image pickup unit 11402 may not necessarily be provided on the camera head 11102. For example, the image pickup unit 11402 may be provided immediately behind the objective lens in the inside of the lens barrel 11101.
The driving unit 11403 includes an actuator and moves the zoom lens and the focusing lens of the lens unit 11401 by a predetermined distance along an optical axis under the control of the camera head controlling unit 11405. Consequently, the magnification and the focal point of a picked up image by the image pickup unit 11402 can be adjusted suitably.
The communication unit 11404 includes a communication apparatus for transmitting and receiving various kinds of information to and from the CCU 11201. The communication unit 11404 transmits an image signal acquired from the image pickup unit 11402 as RAW data to the CCU 11201 through the transmission cable 11400.
In addition, the communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head controlling unit 11405. The control signal includes information relating to image pickup conditions such as, for example, information designating a frame rate of a picked up image, information designating an exposure value at the time of image pickup, and/or information designating a magnification and a focal point of a picked up image.
It is to be noted that the image pickup conditions such as the frame rate, exposure value, magnification or focal point may be designated by the user or may be set automatically by the control unit 11413 of the CCU 11201 on the basis of an acquired image signal. In the latter case, an auto exposure (AE) function, an auto focus (AF) function and an auto white balance (AWB) function are incorporated in the endoscope 11100.
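Note that the AE, AF, and AWB functions are named above without their algorithms; the following Python sketch shows three common metrics such functions could be driven by (a mean-luminance exposure error, gray-world white balance gains, and a gradient-based focus measure). All of it is an illustrative assumption, not the processing of the CCU 11201.

    import numpy as np

    def ae_error(rgb, target=0.45):
        """Signed exposure error; positive means the image is too dark."""
        luma = rgb @ np.array([0.299, 0.587, 0.114])
        return target - float(luma.mean())

    def awb_gains(rgb):
        """Gray-world white balance: gains that equalize channel means."""
        means = rgb.reshape(-1, 3).mean(axis=0)
        return means.mean() / means

    def af_contrast(gray):
        """Focus metric: variance of a simple horizontal gradient."""
        return float(np.var(np.diff(gray, axis=1)))

    img = np.random.rand(8, 8, 3)
    print(ae_error(img), awb_gains(img), af_contrast(img.mean(axis=2)))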
The camera head controlling unit 11405 controls driving of the camera head 11102 on the basis of a control signal from the CCU 11201 received through the communication unit 11404.
The communication unit 11411 includes a communication apparatus for transmitting and receiving various kinds of information to and from the camera head 11102. The communication unit 11411 receives an image signal transmitted thereto from the camera head 11102 through the transmission cable 11400.
Further, the communication unit 11411 transmits a control signal for controlling driving of the camera head 11102 to the camera head 11102. The image signal and the control signal can be transmitted by electrical communication, optical communication or the like.
The image processing unit 11412 performs various image processes for an image signal in the form of RAW data transmitted thereto from the camera head 11102.
The control unit 11413 performs various kinds of control relating to image picking up of a surgical region or the like by the endoscope 11100 and display of a picked up image obtained by image picking up of the surgical region or the like. For example, the control unit 11413 creates a control signal for controlling driving of the camera head 11102.
Further, the control unit 11413 controls, on the basis of an image signal for which image processes have been performed by the image processing unit 11412, the display apparatus 11202 to display a picked up image in which the surgical region or the like is imaged. Thereupon, the control unit 11413 may recognize various objects in the picked up image using various image recognition technologies. For example, the control unit 11413 can recognize a surgical tool such as forceps, a particular living body region, bleeding, mist when the energy treatment tool 11112 is used and so forth by detecting the shape, color and so forth of edges of objects included in a picked up image. The control unit 11413 may cause, when it controls the display apparatus 11202 to display a picked up image, various kinds of surgery supporting information to be displayed in an overlapping manner with an image of the surgical region using a result of the recognition. Where surgery supporting information is displayed in an overlapping manner and presented to the surgeon 11131, the burden on the surgeon 11131 can be reduced and the surgeon 11131 can proceed with the surgery with certainty.
The transmission cable 11400 which connects the camera head 11102 and the CCU 11201 to each other is an electric signal cable ready for communication of an electric signal, an optical fiber ready for optical communication or a composite cable ready for both of electrical and optical communications.
Here, while, in the example depicted, communication is performed by wired communication using the transmission cable 11400, the communication between the camera head 11102 and the CCU 11201 may be performed by wireless communication.
An example of the endoscopic surgery system to which the technology according to the present disclosure can be applied is explained above. The technology according to the present disclosure can be applied to, for example, the endoscope 11100, (the image pickup unit 11402 of) the camera head 11102, (the image processing unit 11412 of) the CCU 11201, and the like among the components described above.
Note that, here, the endoscopic surgery system is explained as an example. However, the technology according to the present disclosure may also be applied to, for example, a microscopic surgery system or the like.
The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be implemented as a device mounted on a mobile body of any type such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility, an airplane, a drone, a ship, and a robot.
The vehicle control system 12000 includes a plurality of electronic control units connected to each other via a communication network 12001. In the example depicted in
The driving system control unit 12010 controls the operation of devices related to the driving system of the vehicle in accordance with various kinds of programs. For example, the driving system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine, a driving motor, or the like, a driving force transmitting mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
The body system control unit 12020 controls the operation of various kinds of devices provided to a vehicle body in accordance with various kinds of programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various kinds of lamps such as a headlamp, a backup lamp, a brake lamp, a turn signal, a fog lamp, or the like. In this case, radio waves transmitted from a mobile device as an alternative to a key or signals of various kinds of switches can be input to the body system control unit 12020. The body system control unit 12020 receives these input radio waves or signals, and controls a door lock device, the power window device, the lamps, or the like of the vehicle.
The outside-vehicle information detecting unit 12030 detects information about the outside of the vehicle including the vehicle control system 12000. For example, the outside-vehicle information detecting unit 12030 is connected with an imaging section 12031. The outside-vehicle information detecting unit 12030 causes the imaging section 12031 to capture an image of the outside of the vehicle and receives the captured image. On the basis of the received image, the outside-vehicle information detecting unit 12030 may perform processing of detecting an object such as a human, a vehicle, an obstacle, a sign, or a character on a road surface, or processing of detecting a distance thereto.
The imaging section 12031 is an optical sensor that receives light, and which outputs an electric signal corresponding to a received light amount of the light. The imaging section 12031 can output the electric signal as an image, or can output the electric signal as information about a measured distance. In addition, the light received by the imaging section 12031 may be visible light, or may be invisible light such as infrared rays or the like.
The in-vehicle information detecting unit 12040 detects information about the inside of the vehicle. The in-vehicle information detecting unit 12040 is, for example, connected with a driver state detecting section 12041 that detects the state of a driver. The driver state detecting section 12041, for example, includes a camera that images the driver. On the basis of detection information input from the driver state detecting section 12041, the in-vehicle information detecting unit 12040 may calculate a degree of fatigue of the driver or a degree of concentration of the driver, or may determine whether the driver is dozing.
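Note that the manner of calculating the degree of fatigue or determining dozing is not specified above; one widely used proxy is the fraction of recent frames in which the driver's eyes are closed (a PERCLOS-style measure). The following Python sketch is an illustration under that assumption; the window length and threshold are hypothetical.

    from collections import deque

    class DozeDetector:
        """Illustrative PERCLOS-style check over a sliding window of frames."""
        def __init__(self, window=90, threshold=0.7):
            self.frames = deque(maxlen=window)
            self.threshold = threshold

        def update(self, eyes_closed):
            """Feed one per-frame eye state; True means possible dozing."""
            self.frames.append(bool(eyes_closed))
            if len(self.frames) < self.frames.maxlen:
                return False  # wait until the window is full
            return sum(self.frames) / len(self.frames) > self.threshold

    det = DozeDetector()
    print(any(det.update(True) for _ in range(90)))  # sustained closure -> True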
The microcomputer 12051 can calculate a control target value for the driving force generating device, the steering mechanism, or the braking device on the basis of the information about the inside or outside of the vehicle which information is obtained by the outside-vehicle information detecting unit 12030 or the in-vehicle information detecting unit 12040, and output a control command to the driving system control unit 12010. For example, the microcomputer 12051 can perform cooperative control intended to implement functions of an advanced driver assistance system (ADAS) which functions include collision avoidance or shock mitigation for the vehicle, following driving based on a following distance, vehicle speed maintaining driving, a warning of collision of the vehicle, a warning of deviation of the vehicle from a lane, or the like.
In addition, the microcomputer 12051 can perform cooperative control intended for automated driving, which makes the vehicle travel automatedly without depending on the operation of the driver, or the like, by controlling the driving force generating device, the steering mechanism, the braking device, or the like on the basis of the information about the outside or inside of the vehicle which information is obtained by the outside-vehicle information detecting unit 12030 or the in-vehicle information detecting unit 12040.
In addition, the microcomputer 12051 can output a control command to the body system control unit 12020 on the basis of the information about the outside of the vehicle which information is obtained by the outside-vehicle information detecting unit 12030. For example, the microcomputer 12051 can perform cooperative control intended to prevent a glare by controlling the headlamp so as to change from a high beam to a low beam, for example, in accordance with the position of a preceding vehicle or an oncoming vehicle detected by the outside-vehicle information detecting unit 12030.
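Note that the following is a deliberately small Python sketch of the anti-glare decision described above, assuming that detections arrive as distances to preceding or oncoming vehicles; the 150 m threshold and all names are illustrative.

    # Illustrative only: switch to low beam when another vehicle is
    # detected within an assumed glare-relevant distance (here 150 m).
    def select_beam(vehicle_distances_m):
        if any(d < 150.0 for d in vehicle_distances_m):
            return "low_beam"
        return "high_beam"

    print(select_beam([320.0]))        # high_beam
    print(select_beam([320.0, 80.0]))  # low_beam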
The sound/image output section 12052 transmits an output signal of at least one of a sound and an image to an output device capable of visually or auditorily notifying information to an occupant of the vehicle or the outside of the vehicle. In the example of
In
The imaging sections 12101, 12102, 12103, 12104, and 12105 are, for example, disposed at positions on a front nose, sideview mirrors, a rear bumper, and a back door of the vehicle 12100 as well as a position on an upper portion of a windshield within the interior of the vehicle. The imaging section 12101 provided to the front nose and the imaging section 12105 provided to the upper portion of the windshield within the interior of the vehicle obtain mainly an image of the front of the vehicle 12100. The imaging sections 12102 and 12103 provided to the sideview mirrors obtain mainly an image of the sides of the vehicle 12100. The imaging section 12104 provided to the rear bumper or the back door obtains mainly an image of the rear of the vehicle 12100. Front images acquired by the imaging sections 12101 and 12105 are mainly used for detecting a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
Incidentally,
At least one of the imaging sections 12101 to 12104 may have a function of obtaining distance information. For example, at least one of the imaging sections 12101 to 12104 may be a stereo camera constituted of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
For example, the microcomputer 12051 can determine a distance to each three-dimensional object within the imaging ranges 12111 to 12114 and a temporal change in the distance (relative speed with respect to the vehicle 12100) on the basis of the distance information obtained from the imaging sections 12101 to 12104, and thereby extract, as a preceding vehicle, a nearest three-dimensional object in particular that is present on a traveling path of the vehicle 12100 and which travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, equal to or more than 0 km/hour). Further, the microcomputer 12051 can set a following distance to be maintained from a preceding vehicle in advance, and perform automatic brake control (including following stop control), automatic acceleration control (including following start control), or the like. It is thus possible to perform cooperative control intended for automated driving that makes the vehicle travel automatedly without depending on the operation of the driver or the like.
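Note that the extraction logic above can be restated with the following illustrative Python sketch: given each object's distance in two successive measurements, the relative speed is the temporal change in distance, and the preceding vehicle is taken as the nearest on-path object whose relative speed indicates travel in substantially the same direction. The threshold is an assumption.

    from dataclasses import dataclass

    @dataclass
    class TrackedObject:
        on_path: bool        # lies on the traveling path of the own vehicle
        dist_m: float        # current distance
        prev_dist_m: float   # distance dt_s seconds earlier

    def preceding_vehicle(objects, dt_s):
        """Nearest on-path object moving with the own vehicle; closing
        objects have a negative relative speed. -15 m/s is an assumed
        plausibility bound for 'substantially the same direction'."""
        candidates = [
            (o.dist_m, o) for o in objects
            if o.on_path and (o.dist_m - o.prev_dist_m) / dt_s > -15.0
        ]
        return min(candidates, key=lambda c: c[0], default=None)

    objs = [TrackedObject(True, 42.0, 42.5), TrackedObject(False, 20.0, 19.0)]
    print(preceding_vehicle(objs, dt_s=0.1))  # the on-path object at 42 m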
For example, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into three-dimensional object data of a two-wheeled vehicle, a standard-sized vehicle, a large-sized vehicle, a pedestrian, a utility pole, and other three-dimensional objects on the basis of the distance information obtained from the imaging sections 12101 to 12104, extract the classified three-dimensional object data, and use the extracted three-dimensional object data for automatic avoidance of an obstacle. For example, the microcomputer 12051 identifies obstacles around the vehicle 12100 as obstacles that the driver of the vehicle 12100 can recognize visually and obstacles that are difficult for the driver of the vehicle 12100 to recognize visually. Then, the microcomputer 12051 determines a collision risk indicating a risk of collision with each obstacle. In a situation in which the collision risk is equal to or higher than a set value and there is thus a possibility of collision, the microcomputer 12051 outputs a warning to the driver via the audio speaker 12061 or the display section 12062, and performs forced deceleration or avoidance steering via the driving system control unit 12010. The microcomputer 12051 can thereby assist in driving to avoid collision.
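Note that a common proxy for the collision risk described above is time-to-collision (distance divided by closing speed); the sketch below flags an obstacle when an assumed TTC threshold is crossed, the point at which a warning or forced deceleration could be triggered. The threshold is illustrative.

    def collision_risk(dist_m, closing_speed_mps, ttc_threshold_s=2.0):
        """True when time-to-collision falls below the assumed threshold."""
        if closing_speed_mps <= 0.0:  # not closing in on the obstacle
            return False
        return dist_m / closing_speed_mps < ttc_threshold_s

    # 30 m ahead, closing at 20 m/s -> TTC of 1.5 s -> assist the driver.
    print(collision_risk(30.0, 20.0))  # True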
At least one of the imaging sections 12101 to 12104 may be an infrared camera that detects infrared rays. The microcomputer 12051 can, for example, recognize a pedestrian by determining whether or not there is a pedestrian in imaged images of the imaging sections 12101 to 12104. Such recognition of a pedestrian is, for example, performed by a procedure of extracting characteristic points in the imaged images of the imaging sections 12101 to 12104 as infrared cameras and a procedure of determining whether or not it is a pedestrian by performing pattern matching processing on a series of characteristic points representing the contour of the object. When the microcomputer 12051 determines that there is a pedestrian in the imaged images of the imaging sections 12101 to 12104, and thus recognizes the pedestrian, the sound/image output section 12052 controls the display section 12062 so that a square contour line for emphasis is displayed so as to be superimposed on the recognized pedestrian. The sound/image output section 12052 may also control the display section 12062 so that an icon or the like representing the pedestrian is displayed at a desired position.
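Note that the two-step procedure above (characteristic point extraction, then pattern matching on the contour) could look like the following OpenCV-based Python sketch; the Canny parameters, the similarity threshold, and the template contour are all assumptions, and the disclosure does not specify the actual matching.

    import cv2
    import numpy as np

    def find_pedestrians(gray, template_contour):
        """Return bounding rectangles of contours matching the template."""
        edges = cv2.Canny(gray, 80, 160)          # extract characteristic points
        contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 signature
        boxes = []
        for c in contours:
            score = cv2.matchShapes(c, template_contour,
                                    cv2.CONTOURS_MATCH_I1, 0)
            if score < 0.2:                       # assumed similarity threshold
                boxes.append(cv2.boundingRect(c))
        return boxes

    def draw_emphasis(frame, boxes):
        """Superimpose the square contour line for emphasis on each match."""
        for (x, y, w, h) in boxes:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
        return frame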
The example of the vehicle control system to which the technology according to the present disclosure can be applied is explained above. The technology according to the present disclosure can be applied to, for example, the imaging section 12031 and the like among the components explained above.
The preferred embodiment of the present disclosure is explained in detail above with reference to the accompanying drawings. However, the technical scope of the present disclosure is not limited to such an example. It is evident that a person having ordinary knowledge in the technical field of the present disclosure can arrive at various alterations or corrections within the scope of the technical idea described in the claims. It is understood that these alterations and corrections naturally belong to the technical scope of the present disclosure. The embodiments and the modifications explained above can be implemented in combination with each other.
The effects described in the present specification are only explanatory or illustrative and are not limiting. That is, the technology according to the present disclosure can achieve other effects that are obvious to those skilled in the art from the description of the present specification, together with or instead of the effects described above.
Note that the present technology can also take the following configurations.
Number | Date | Country | Kind
---|---|---|---
2021-155511 | Sep 2021 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2022/009286 | 3/4/2022 | WO |