The present disclosure relates to a solid-state imaging device.
The solid-state imaging device includes, for example, a plurality of pixels arranged in a two-dimensional array, and an element isolation insulation film surrounding each of the pixels. Each of the pixels includes, for example, a pixel transistor such as a transfer transistor, a reset transistor, a selection transistor, or an amplification transistor, or a dummy transistor that is a dummy of the pixel transistor.
However, depending on the arrangement of the pixels and the shape of the element isolation insulation film, a sensitivity difference may be generated between the pixels of the solid-state imaging device.
Therefore, the present disclosure provides a solid-state imaging device capable of preventing the sensitivity difference from being generated between the pixels.
According to a first aspect of the present disclosure, there is provided a solid-state imaging device including: a first pixel; and a second pixel located in a first direction of the first pixel, in which each of the first and second pixels includes a first transistor and a second transistor, and the first and second transistors in the second pixel are disposed periodically in the first direction with respect to the first and second transistors in the first pixel. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated between the first pixel and the second pixel.
Furthermore, the solid-state imaging device according to the first aspect may further include: a third pixel located in a second direction of the first pixel; and a fourth pixel located in the second direction of the second pixel, in which each of the third and fourth pixels may include the first transistor and the second transistor, and the first and second transistors in the fourth pixel may be disposed periodically in the first direction with respect to the first and second transistors in the third pixel. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated between the first pixel and the second pixel, and between the third pixel and the fourth pixel.
Furthermore, according to the first aspect, the first and second transistors in the third pixel may be disposed symmetrically in the second direction with respect to the first and second transistors in the first pixel, and/or the first and second transistors in the fourth pixel may be disposed symmetrically in the second direction with respect to the first and second transistors in the second pixel. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated between the first pixel and the second pixel, and between the third pixel and the fourth pixel.
Furthermore, according to the first aspect, the first and second transistors in the third pixel may be disposed periodically in the second direction with respect to the first and second transistors in the first pixel, and/or the first and second transistors in the fourth pixel may be disposed periodically in the second direction with respect to the first and second transistors in the second pixel. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated between the first pixel and the third pixel, and/or between the second pixel and the fourth pixel.
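The distinction between a "periodic" disposition and a "symmetric" disposition of the transistors can be sketched with a simplified one-dimensional model (the pixel pitch and coordinates below are hypothetical illustration values, not taken from the disclosure):

```python
# Simplified 1-D model of transistor placement between adjacent pixels.
# "Periodic" disposition: the layout of one pixel is a pure translation
# (by the pixel pitch) of its neighbor's layout. "Symmetric" disposition:
# the layout is mirrored about the shared pixel boundary.

PITCH = 10.0  # hypothetical pixel pitch (arbitrary units)

def periodic(x, pitch=PITCH):
    """Position of the corresponding element in the next pixel (translation)."""
    return x + pitch

def symmetric(x, boundary=PITCH):
    """Position of the corresponding element mirrored about the boundary."""
    return 2 * boundary - x

# A first transistor at x = 3 and a second transistor at x = 7 in the first pixel:
first_pixel = {"T1": 3.0, "T2": 7.0}

# Periodic neighbor: identical internal layout, shifted by one pitch.
periodic_pixel = {k: periodic(v) for k, v in first_pixel.items()}

# Symmetric neighbor: internal layout mirrored, so T1 and T2 swap sides.
symmetric_pixel = {k: symmetric(v) for k, v in first_pixel.items()}

print(periodic_pixel)   # {'T1': 13.0, 'T2': 17.0}
print(symmetric_pixel)  # {'T1': 17.0, 'T2': 13.0}
```

In the periodic neighbor every element keeps the same position relative to its own pixel boundary, which is why the optical response of the two pixels can match; in the symmetric neighbor the relative positions differ, which is the situation that can give rise to a sensitivity difference.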
Furthermore, according to the first aspect, each of the first and second pixels may include a photoelectric conversion unit provided in a substrate, and include the first and second transistors under the substrate. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated between the pixels including the photoelectric conversion unit.
Furthermore, according to the first aspect, the photoelectric conversion unit may include a first semiconductor region and a second semiconductor region surrounding the first semiconductor region, and the first and second semiconductor regions in the second pixel may be disposed periodically in the first direction with respect to the first and second semiconductor regions in the first pixel. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated between the pixels, which is caused by the photoelectric conversion unit.
Furthermore, according to the first aspect, each of the first and second pixels may include a floating diffusion portion in the substrate, and the floating diffusion portion in the second pixel may be disposed periodically in the first direction with respect to the floating diffusion portion in the first pixel. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated between the pixels, which is caused by the floating diffusion portion.
Furthermore, the solid-state imaging device according to the first aspect may further include a first wiring layer provided under the substrate and including a plurality of first wirings, in which the first wirings in the second pixel may be disposed periodically in the first direction with respect to the first wirings in the first pixel. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated between the pixels, which is caused by the first wiring layer.
Furthermore, according to the first aspect, each of the first and second pixels may include the plurality of first wirings extending to one side in the first direction or second direction. In this configuration, for example, the first wirings can be suitably disposed.
Furthermore, the solid-state imaging device according to the first aspect may further include a second wiring layer provided under the first wiring layer and including a plurality of second wirings, in which the second wirings in the second pixel may be disposed periodically in the first direction with respect to the second wirings in the first pixel. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated between the pixels, which is caused by the second wiring layer.
Furthermore, according to the first aspect, each of the first and second pixels may include the plurality of first wirings extending to one side in the first direction or second direction and the plurality of second wirings extending to the other side in the first direction or second direction. In this configuration, for example, the first and second wirings can be suitably disposed.
Furthermore, according to the first aspect, the first transistor may be a transfer transistor. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated between the pixels, which is caused by the transfer transistor.
Furthermore, according to the first aspect, the second transistor may be a pixel transistor other than the transfer transistor or a dummy transistor that is a dummy of the pixel transistor. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated between the pixels, which is caused by the pixel transistor other than the transfer transistor or by the dummy transistor.
Furthermore, according to the first aspect, at least one of the first pixel or the second pixel may not include an element isolation insulation film between the first transistor and the second transistor. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated between the pixels, which is caused by the element isolation insulation film.
Furthermore, the solid-state imaging device according to the first aspect may further include an element isolation insulation film surrounding each of the first and second pixels. In this configuration, for example, it is possible to prevent colors from being mixed between the pixels.
According to a second aspect of the present disclosure, there is provided a solid-state imaging device including: a first pixel; and a second pixel located in a first direction of the first pixel, in which each of the first and second pixels includes a first transistor and a second transistor, and at least one of the first pixel or the second pixel does not include an element isolation insulation film between the first transistor and the second transistor. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated between the pixels, which is caused by the element isolation insulation film.
Furthermore, the solid-state imaging device according to the second aspect may further include an element isolation insulation film surrounding each of the first and second pixels. In this configuration, for example, it is possible to prevent colors from being mixed between the pixels.
According to a third aspect of the present disclosure, there is provided a solid-state imaging device including: a first pixel; a second pixel located adjacent to the first pixel in a first direction; a third pixel located adjacent to the first pixel in a second direction; a fourth pixel located adjacent to the second pixel in the second direction; a first element isolation insulation film provided in each of the first to fourth pixels; and a second element isolation insulation film surrounding each of the first to fourth pixels, in which at least one of the first element isolation insulation film or the second element isolation insulation film includes a portion having a first width and a portion having a second width larger than the first width in plan view. In this configuration, for example, it is possible to prevent a sensitivity difference from being generated among the first to fourth pixels, which is caused by the first or second element isolation insulation film.
Furthermore, according to the third aspect, each of the first to fourth pixels may include a first transistor and a second transistor, the first element isolation insulation film may be disposed between the first transistor and the second transistor, the first transistors in the first to fourth pixels may be disposed periodically in the first and second directions, and the second transistors in the first to fourth pixels may include gate electrodes having two or more types of areas in plan view. In this configuration, for example, it is possible to prevent a sensitivity difference caused by the second transistor.
Furthermore, according to the third aspect, each of the first to fourth pixels may include a first transistor and a second transistor, the first element isolation insulation film may be disposed between the first transistor and the second transistor, the first transistors in the first to fourth pixels may be disposed periodically in the first and second directions, and the second transistors in the first to fourth pixels may be disposed periodically in the first and second directions. In this configuration, for example, it is possible to prevent a sensitivity difference caused by the transistor other than the second transistor.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings.
The solid-state imaging device in
Each of the pixels 1 includes a photodiode functioning as a photoelectric conversion unit and a MOS transistor functioning as a pixel transistor. Examples of the pixel transistor include a transfer transistor, a reset transistor, a selection transistor, and an amplification transistor. Some pixels 1 include a dummy transistor that is a dummy of the pixel transistor.
The pixel array region 2 includes a plurality of the pixels 1 arranged in a two-dimensional array. The pixel array region 2 includes an effective pixel region that receives light to perform photoelectric conversion, and amplifies and outputs a signal charge generated by the photoelectric conversion, and a black reference pixel region that outputs optical black serving as a reference of a black level. Generally, the black reference pixel region is disposed on an outer peripheral portion of the effective pixel region.
The control circuit 3 generates various signals serving as references of operations of the vertical drive circuit 4, each of the column signal processing circuits 5, the horizontal drive circuit 6, and the like on the basis of a vertical synchronization signal, a horizontal synchronization signal, a master clock, and the like. The signal generated by the control circuit 3 is, for example, a clock signal or a control signal, and is input to the vertical drive circuit 4, the column signal processing circuit 5, the horizontal drive circuit 6, and the like.
The vertical drive circuit 4 includes, for example, a shift register, and scans each of the pixels 1 in the pixel array region 2 in a vertical direction row by row. The vertical drive circuit 4 further supplies a pixel signal based on the signal charge generated by each of the pixels 1 to the column signal processing circuit 5 through each of the vertical signal lines 8.
The column signal processing circuit 5 is disposed, for example, for each column formed by the pixels 1 in the pixel array region 2, and performs signal processing on the signals output from the pixels 1 forming one row for each column on the basis of the signals from the black reference pixel region. Examples of this signal processing include noise removal and signal amplification.
The horizontal drive circuit 6 includes, for example, a shift register, and supplies the pixel signal from each column signal processing circuit 5 to the horizontal signal line 9.
The output circuit 7 performs signal processing on the signal supplied from each column signal processing circuit 5 through the horizontal signal line 9, and outputs the signal subjected to the signal processing.
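The row-by-row readout flow performed by the vertical drive circuit 4, the column signal processing circuits 5, and the horizontal drive circuit 6 can be sketched schematically as follows (a simplified behavioral model; the numeric values and the subtract-black-level processing are hypothetical placeholders for the actual noise removal and amplification):

```python
# Schematic model of the readout flow: the vertical drive circuit scans
# rows one by one, each column signal processing circuit processes its
# column value against a black-level reference, and the horizontal drive
# circuit forwards the processed values to the output.

def read_pixel_array(pixels, black_level):
    """pixels: 2-D list of raw pixel values, row-major.
    black_level: per-column reference from the black reference pixel region."""
    output = []
    for row in pixels:                               # vertical scan, row by row
        processed = [max(v - black_level[c], 0)      # column signal processing
                     for c, v in enumerate(row)]     # (noise removal sketch)
        output.extend(processed)                     # horizontal transfer to output
    return output

raw = [[12, 14], [11, 15]]   # hypothetical raw values for a 2x2 array
black = [10, 10]             # hypothetical black-level references
print(read_pixel_array(raw, black))  # [2, 4, 1, 5]
```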
As illustrated in
The substrate 11 is, for example, a semiconductor substrate such as a silicon (Si) substrate.
The n-type semiconductor region 12 of each pixel 1 and the p-type semiconductor region 13 of each pixel 1 are provided in the substrate 11 and form a pn junction. The photodiode PD of each pixel 1 is mainly realized by the pn junction. The photodiode PD functions as a photoelectric conversion unit that converts light into an electric charge. Specifically, the photodiode PD receives light through the back surface S2 of the substrate 11, generates a signal charge according to the amount of received light, and accumulates the generated signal charge in the n-type semiconductor region 12. In the present embodiment, the n-type semiconductor region 12 and the p-type semiconductor region 13 generally have a columnar shape and a tubular shape, respectively, extending in the Z direction, and the p-type semiconductor region 13 surrounds the n-type semiconductor region 12 in a tubular shape. The n-type semiconductor region 12 is an example of a first semiconductor region of the present disclosure, and the p-type semiconductor region 13 is an example of a second semiconductor region of the present disclosure.
The n+-type semiconductor region 14 of each pixel 1 is provided under the p-type semiconductor region 13 in the substrate 11, and functions as, for example, a floating diffusion portion. The n+-type semiconductor region 14 is formed, for example, by implanting an n-type impurity at a high concentration into a part of the p-type semiconductor region 13. In the present embodiment, the signal charge accumulated in the n-type semiconductor region 12 is transferred to the n+-type semiconductor region 14.
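The accumulation of signal charge in the n-type semiconductor region 12 and its transfer to the floating diffusion portion 14 can be sketched as a minimal behavioral model (the class, the quantum-efficiency value, and the charge units are hypothetical illustration choices, not specified by the disclosure):

```python
# Minimal behavioral model of one pixel: light is converted to charge and
# accumulated in the n-type semiconductor region 12; asserting the transfer
# gate moves the accumulated charge to the floating diffusion (n+ region 14).

class Pixel:
    def __init__(self):
        self.n_region = 0.0            # charge in n-type semiconductor region 12
        self.floating_diffusion = 0.0  # charge in n+-type region 14 (FD)

    def expose(self, photons, qe=0.8):
        # Photoelectric conversion: charge proportional to received light
        # (qe is a hypothetical quantum-efficiency factor).
        self.n_region += photons * qe

    def transfer(self):
        # Transfer transistor TG moves the accumulated charge to the FD.
        self.floating_diffusion += self.n_region
        self.n_region = 0.0

p = Pixel()
p.expose(100)
p.transfer()
print(p.floating_diffusion)  # 80.0
print(p.n_region)            # 0.0
```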
The light-shielding film 15 is a film having a function of shielding light, and is formed on the back surface S2 of the substrate 11. The light-shielding film 15 of the present embodiment is formed on the element isolation insulation film 21 provided in the substrate 11, and has a mesh-like planar shape. Light incident on the light-shielding film 15 is shielded by the light-shielding film 15 or passes through an opening (mesh) of the light-shielding film 15. The light-shielding film 15 is, for example, a film containing a metal element such as tungsten (W), aluminum (Al), or copper (Cu).
The color filter 16 has a function of transmitting light having a predetermined wavelength, and is formed on the back surface S2 of the substrate 11 for each pixel 1. For example, the color filters 16 for red (R), green (G), and blue (B) are disposed on the photodiodes PD of red, green, and blue pixels 1, respectively. Moreover, the color filter 16 for infrared light may be disposed on the photodiode PD of the pixel 1 of infrared light.
The on-chip lens 17 has a function of condensing incident light, and is formed on the color filter 16 for each pixel 1. The light condensed by the on-chip lens 17 passes through the color filter 16 and enters the photodiode PD. The photodiode PD converts the light into an electric charge.
The element isolation insulation film 21 is provided in the substrate 11 and isolates the pixels 1 of the solid-state imaging device from each other. The element isolation insulation film 21 is provided to prevent colors from being mixed between the pixels 1. The element isolation insulation film 21 of the present embodiment penetrates the substrate 11 from the front surface S1 to the back surface S2. Furthermore, the element isolation insulation film 21 of the present embodiment has a shape surrounding each pixel 1. Therefore, the color mixing between the pixels 1 can be effectively suppressed. The element isolation insulation film 21 is, for example, a silicon oxide (SiO2) film. The element isolation insulation film 21 may include a film having a negative fixed charge (fixed charge film). Note that the element isolation insulation film 21 of the present embodiment includes a portion penetrating the substrate 11 alone and a portion penetrating the substrate 11 together with an element isolation insulation film 29 to be described later.
The interlayer insulation film 22 is formed on the front surface S1 of the substrate 11. The interlayer insulation film 22 is, for example, a silicon oxide film or a laminated film including the silicon oxide film and other insulation films.
The gate insulation film 23 and gate electrode 24 of each pixel 1 are sequentially provided on the front surface S1 of the substrate 11 and covered by the interlayer insulation film 22. The gate insulation film 23 and gate electrode 24 of the present embodiment are provided under the p-type semiconductor region 13 between the n-type semiconductor region 12 and the n+-type semiconductor region 14, and form the transfer transistor TG. The transfer transistor TG can transfer the signal charge accumulated in the n-type semiconductor region 12 to the n+-type semiconductor region 14. The transfer transistor TG is an example of a first transistor of the present disclosure.
Note that the transfer transistor TG may be a vertical transistor. That is, the gate insulation film 23 and gate electrode 24 of the transfer transistor TG may include a portion embedded in a groove formed in the substrate 11.
The wiring layers 25 to 27 are sequentially provided in the interlayer insulation film 22 on the front surface S1 of the substrate 11, and form a multilayer wiring structure. The multilayer wiring structure of the present embodiment includes the three wiring layers 25 to 27, but may include four or more wiring layers. Each of the wiring layers 25 to 27 includes a plurality of wirings, and the pixel transistor such as the transfer transistor TG is driven using these wirings. The wiring layers 25 to 27 are, for example, layers containing a metal element such as tungsten, aluminum, or copper. The wiring layers 25 to 27 are examples of first and second wiring layers of the present disclosure.
The support substrate 28 is provided on the front surface S1 of the substrate 11 via the interlayer insulation film 22, and is provided to secure the strength of the substrate 11. The support substrate 28 is, for example, a semiconductor substrate such as a silicon substrate.
In the present embodiment, light incident on the on-chip lens 17 is condensed by the on-chip lens 17, passes through the color filter 16, passes through the opening of the light-shielding film 15, and is incident on the photodiode PD. The photodiode PD converts the light into an electric charge by photoelectric conversion to generate a signal charge. The signal charge is output as a pixel signal via the vertical signal lines 8 in the wiring layers 25 to 27.
Note that the n-type semiconductor region and p-type semiconductor region in the substrate 11 of the present embodiment may be interchanged with each other. Specifically, the n-type semiconductor region 12, the p-type semiconductor region 13, and the n+-type semiconductor region 14 may be changed to the p-type semiconductor region, the n-type semiconductor region, and the p+-type semiconductor region, respectively.
Next, a relationship between two pixels 1 illustrated in
Two pixels 1 illustrated in
Each component in the right pixel 1 illustrated in
Furthermore, wirings of each of the wiring layers 25 to 27 in the right pixel 1 are disposed symmetrically in the X direction with respect to the corresponding wirings of each of the wiring layers 25 to 27 in the left pixel 1. In
Note that any of the components corresponding to each other in these pixels 1 may not be disposed symmetrically in the X direction. For example, any wiring of the wiring layers 25 to 27 in the right pixel 1 may not be disposed symmetrically in the X direction with respect to the corresponding wiring of the wiring layers 25 to 27 in the left pixel 1. Furthermore, any wiring of the wiring layers 25 to 27 in the right pixel 1 may not correspond to any wiring of the wiring layers 25 to 27 in the left pixel 1.
Each pixel 1 illustrated in
Two pixels 1 illustrated in
Each component in the left pixel 1 illustrated in
Furthermore, wirings of each of the wiring layers 25 to 27 in the left pixel 1 are disposed periodically in the Y direction with respect to the corresponding wirings of each of the wiring layers 25 to 27 in the right pixel 1. In
Note that any of the components corresponding to each other in these pixels 1 may not be disposed periodically in the Y direction. For example, any wiring of the wiring layers 25 to 27 in the left pixel 1 may not be disposed periodically in the Y direction with respect to the corresponding wiring of the wiring layers 25 to 27 in the right pixel 1. Furthermore, any wiring of the wiring layers 25 to 27 in the left pixel 1 may not correspond to any wiring of the wiring layers 25 to 27 in the right pixel 1.
A of
The lower left pixel 1 illustrated in A of
The upper left pixel 1 illustrated in A of
The upper right pixel 1 illustrated in A of
The lower right pixel 1 illustrated in A of
The lower left pixel 1 illustrated in A of
The same applies to the other pixels 1 illustrated in A of
The lower left pixel 1 illustrated in A of
The same applies to the other pixels 1 illustrated in A of
Four pixels 1 illustrated in A of
A relationship among
B of
C of
Hereinafter, a relationship among four pixels 1 illustrated in A of
In A of
Furthermore, the lower right pixel 1 including the dummy transistor and the upper right pixel 1 including the amplification transistor AMP are adjacent to each other in the Y direction. In the present embodiment, the structures of these pixels 1 are periodic in the Y direction. Specifically, components corresponding to each other in these pixels 1 have a shape periodic in the Y direction and are disposed periodically in the Y direction. For example, the gate electrode 24 of the upper right transfer transistor TG is disposed periodically in the Y direction with respect to the gate electrode 24 of the lower right transfer transistor TG. Moreover, the gate electrode 24 of the amplification transistor AMP is disposed periodically in the Y direction with respect to the gate electrode 24 of the dummy transistor. Moreover, the n-type semiconductor region 12, p-type semiconductor region 13, three n+-type semiconductor regions 14, and element isolation insulation film 29 in the upper right pixel 1 are respectively disposed periodically in the Y direction with respect to the n-type semiconductor region 12, p-type semiconductor region 13, three n+-type semiconductor regions 14, and element isolation insulation film 29 in the lower right pixel 1.
Furthermore, the lower left pixel 1 including the reset transistor RST and the lower right pixel 1 including the dummy transistor are adjacent to each other in the X direction. In the present embodiment, the structures of these pixels 1 are symmetrical in the X direction. Specifically, components corresponding to each other in these pixels 1 have a shape symmetrical in the X direction and are disposed symmetrically in the X direction. For example, the gate electrode 24 of the lower right transfer transistor TG is disposed symmetrically in the X direction with respect to the gate electrode 24 of the lower left transfer transistor TG. Moreover, the gate electrode 24 of the dummy transistor is disposed symmetrically in the X direction with respect to the gate electrode 24 of the reset transistor RST. Moreover, the n-type semiconductor region 12, p-type semiconductor region 13, three n+-type semiconductor regions 14, and element isolation insulation film 29 in the lower right pixel 1 are respectively disposed symmetrically in the X direction with respect to the n-type semiconductor region 12, p-type semiconductor region 13, three n+-type semiconductor regions 14, and element isolation insulation film 29 in the lower left pixel 1.
Furthermore, the upper left pixel 1 including the selection transistor SEL and the upper right pixel 1 including the amplification transistor AMP are adjacent to each other in the X direction. In the present embodiment, the structures of these pixels 1 are symmetrical in the X direction. Specifically, components corresponding to each other in these pixels 1 have a shape symmetrical in the X direction and are disposed symmetrically in the X direction. For example, the gate electrode 24 of the upper right transfer transistor TG is disposed symmetrically in the X direction with respect to the gate electrode 24 of the upper left transfer transistor TG. Moreover, the gate electrode 24 of the amplification transistor AMP is disposed symmetrically in the X direction with respect to the gate electrode 24 of the selection transistor SEL. Moreover, the n-type semiconductor region 12, p-type semiconductor region 13, three n+-type semiconductor regions 14, and element isolation insulation film 29 in the upper right pixel 1 are respectively disposed symmetrically in the X direction with respect to the n-type semiconductor region 12, p-type semiconductor region 13, three n+-type semiconductor regions 14, and element isolation insulation film 29 in the upper left pixel 1.
In the present embodiment, these relationships are also established in the wiring layers 25 to 27. For example, wirings of each of the wiring layers 25 to 27 in the upper left pixel 1 are disposed periodically in the Y direction with respect to the corresponding wirings of each of the wiring layers 25 to 27 in the lower left pixel 1 (
Note that any of the components corresponding to each other in these pixels 1 may not be disposed periodically in the Y direction or may not be disposed symmetrically in the X direction. For example, any wiring of the wiring layers 25 to 27 in the upper left pixel 1 may not be disposed periodically in the Y direction with respect to the corresponding wiring of the wiring layers 25 to 27 in the lower left pixel 1. Furthermore, any wiring of the wiring layers 25 to 27 in the lower right pixel 1 may not be disposed symmetrically in the X direction with respect to the corresponding wiring of the wiring layers 25 to 27 in the lower left pixel 1.
As described above, two pixels 1 adjacent to each other in the X direction have a symmetrical structure in the X direction. As an example of this, C of
On the other hand, two pixels 1 adjacent to each other in the Y direction have a periodic structure in the Y direction. As an example of this, B of
According to the present embodiment, since two pixels 1 adjacent to each other in the Y direction have the periodic structure, it is possible to prevent the sensitivity difference from being generated between these pixels 1. On the other hand, when two pixels 1 adjacent to each other in the X direction have the symmetrical structure, there is an advantage that, for example, components in one pixel 1 and components in the other pixel 1 can be electrically connected by a short wiring. According to the present embodiment, it is possible to achieve both suppression of the sensitivity difference and shortening of the wiring.
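The layout rule of this embodiment, in which the three neighbors of a base pixel are generated by mirroring in the X direction and translating in the Y direction, can be sketched in two dimensions (the pitch and the transistor coordinates are hypothetical illustration values):

```python
# Sketch of the 2x2 pixel block layout rule of this embodiment:
# pixels adjacent in the X direction are mirror images (symmetric in X),
# while pixels adjacent in the Y direction are translated copies
# (periodic in Y).

PITCH = 10.0  # hypothetical pixel pitch (arbitrary units)

def mirror_x(layout, boundary=PITCH):
    """Mirror a pixel layout about the shared X-direction boundary."""
    return {name: (2 * boundary - x, y) for name, (x, y) in layout.items()}

def shift_y(layout, pitch=PITCH):
    """Translate a pixel layout by one pitch in the Y direction."""
    return {name: (x, y + pitch) for name, (x, y) in layout.items()}

# Hypothetical base layout: transfer transistor TG and a second transistor.
lower_left = {"TG": (3.0, 2.0), "second": (7.0, 8.0)}

lower_right = mirror_x(lower_left)   # symmetric in X with the lower left pixel
upper_left = shift_y(lower_left)     # periodic in Y with the lower left pixel
upper_right = shift_y(lower_right)   # periodic in Y with the lower right pixel

print(lower_right["TG"])  # (17.0, 2.0)
print(upper_left["TG"])   # (3.0, 12.0)
print(upper_right["TG"])  # (17.0, 12.0)
```

Because each upper pixel is a pure Y-translation of the pixel below it, the Y-adjacent pairs share identical internal layouts (suppressing the sensitivity difference), while the X mirroring keeps corresponding terminals of X-adjacent pixels close to the shared boundary (allowing short connecting wirings).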
In the present embodiment, four pixels 1 illustrated in A of
A of
In the present comparative example, two pixels 1 adjacent to each other in the X direction have a symmetrical structure in the X direction. Therefore, as illustrated in C of
According to the present comparative example, the components in four pixels 1 can be electrically connected by a short wiring. However, according to the present comparative example, there is a high possibility that a sensitivity difference is generated between these pixels 1. On the other hand, according to the present embodiment, it is possible to prevent the sensitivity difference from being generated between the different pixels 1 while the components in the different pixels 1 are electrically connected by a short wiring.
A and B of
According to the present embodiment, since these wirings 25a and these wirings 26a are disposed so as to intersect each other, most of the light escaping from the front surface S1 of the substrate 11 can be reflected back into the substrate 11 by the wirings 25a and 26a. Therefore, the light is prevented from escaping from the substrate 11 to the support substrate 28.
Note that although each of the wirings 25a illustrated in A of
C and D of
The distance D1 between the wirings 25a and the distance D2 between the wirings 26a may be short as in the first example, or may be long as in the second example. However, in order to effectively prevent light from escaping from the substrate 11 to the support substrate 28, it is desirable that the distances D1 and D2 are short. For example, in a case where a wavelength of target light is λ, the distances D1 and D2 are desirably set to lengths that do not allow transmission of light having a wavelength of λ.
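The guideline above can be expressed as a simple check (a sketch of the sub-wavelength criterion only; the exact transmission threshold depends on the wiring material and geometry, and the 940 nm target wavelength below is a hypothetical example):

```python
# Sketch of the design guideline for the wiring distances D1 and D2:
# to keep target light of wavelength lam from escaping between the
# wirings, each gap is chosen smaller than lam (a simplified
# sub-wavelength criterion; units are nanometers).

def blocks_wavelength(gap, lam):
    """True if the wiring gap is small enough to block wavelength lam."""
    return gap < lam

# Hypothetical near-infrared target wavelength of 940 nm:
print(blocks_wavelength(gap=500, lam=940))   # True  (500 nm gap blocks it)
print(blocks_wavelength(gap=1200, lam=940))  # False (gap too wide)
```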
In the present embodiment, two pixels 1 adjacent to each other in the X direction have a symmetrical structure in the X direction, and two pixels 1 adjacent to each other in the Y direction have a periodic structure in the Y direction. In the present embodiment, this relationship may be applied to a contact plug or a via plug, which is electrically connected to the wiring layers 25 to 27. For example, in two pixels 1 adjacent to each other in the X direction, the contact plugs corresponding to each other may be disposed symmetrically in the X direction. Furthermore, in two pixels 1 adjacent to each other in the Y direction, the via plugs corresponding to each other may be disposed periodically in the Y direction.
First, an element isolation trench H is formed in the substrate 11 from the front surface S1 of the substrate 11 by photolithography and reactive ion etching (RIE) (
Next, a material of the element isolation insulation film 21 is formed on the front surface S1 of the substrate 11, and an upper surface of the material is planarized by chemical mechanical polishing (CMP) (
Next, in the substrate 11 or on the substrate 11, the n-type semiconductor region 12, the p-type semiconductor region 13, the n+-type semiconductor region 14, the interlayer insulation film 22, the gate insulation film 23, the gate electrode 24, the wiring layer 25, the wiring layer 26, the wiring layer 27, the support substrate 28, and the like are formed (
Next, the substrate 11 is turned upside down (
Next, the substrate 11 is thinned from the back surface S2 of the substrate 11 (
Next, on the back surface S2 of the substrate 11, the light-shielding film 15, the color filter 16, and the on-chip lens 17 are formed (
Next, the solid-state imaging device according to a modification example of the present embodiment will be described with reference to
A of
In the present modification example, two pixels 1 adjacent to each other in the Y direction have a periodic structure in the Y direction. Therefore, it is possible to prevent a sensitivity difference from being generated between the pixels 1 adjacent to each other in the Y direction. Moreover, in the present modification example, two pixels 1 adjacent to each other in the X direction have a periodic structure in the X direction. Therefore, it is also possible to prevent the sensitivity difference from being generated between the pixels 1 adjacent to each other in the X direction. Thus, according to the present modification example, it is possible to more effectively prevent the sensitivity difference from being generated between the different pixels 1.
Similarly to C of
Similarly to B of
As described above, in the present embodiment, two pixels 1 adjacent to each other in the Y direction have a periodic structure in the Y direction. For example, the transfer transistor TG of one pixel 1 is disposed periodically in the Y direction with respect to the transfer transistor TG of the other pixel 1. Furthermore, the n-type semiconductor region 12, p-type semiconductor region 13, and n+-type semiconductor region 14 in one pixel 1 are respectively disposed periodically in the Y direction with respect to the n-type semiconductor region 12, p-type semiconductor region 13, and n+-type semiconductor region 14 in the other pixel 1. Thus, according to the present embodiment, it is possible to prevent the sensitivity difference from being generated between these pixels 1.
Note that in the present embodiment, two pixels 1 adjacent to each other in the Y direction may have a symmetrical structure in the Y direction, and two pixels 1 adjacent to each other in the X direction may have a periodic structure in the X direction.
A of
The solid-state imaging device of the present embodiment generally has a structure similar to that of the solid-state imaging device of the comparative example of the first embodiment illustrated in A to C of
However, the lower left pixel 1 illustrated in A of
The same applies to the other pixels 1 illustrated in A of
As illustrated in B of
In a case where the element isolation insulation film 29 is provided between the transfer transistor TG and the reset transistor RST, there is a possibility that light incident into the substrate 11 is reflected by the element isolation insulation film 29. Such reflected light may cause color mixing between the pixels 1.
The solid-state imaging device according to the present embodiment does not include the element isolation insulation film 29 between the transfer transistor TG and the reset transistor RST. Therefore, according to the present embodiment, it is possible to suppress the color mixing between the pixels 1 due to the element isolation insulation film 29.
The solid-state imaging device of the present embodiment can be realized by, for example, omitting the step of forming the element isolation insulation film 29 between the transfer transistor TG and the reset transistor RST when the solid-state imaging device is manufactured by the method illustrated in
A of
In the present modification example, two pixels 1 adjacent to each other in the Y direction have a periodic structure in the Y direction similarly to that of the solid-state imaging device illustrated in A of
A of
In the present modification example, similarly to that of the solid-state imaging device illustrated in A of
As described above, the solid-state imaging device according to the present embodiment does not include the element isolation insulation film 29 between the transfer transistor TG and the reset transistor RST. Therefore, according to the present embodiment, it is possible to suppress the color mixing between the pixels 1 due to the element isolation insulation film 29.
The solid-state imaging device of the present embodiment generally has a structure similar to that of the solid-state imaging device of the modification example of the first embodiment illustrated in
However, in four pixels 1 illustrated in
Furthermore, the solid-state imaging device of the present embodiment includes an element isolation insulation film 21 reaching the back surface S2 of the substrate 11 and an element isolation insulation film 29 not reaching the back surface S2 of the substrate 11 (see also
Each internal element isolation insulation film 29a is provided inside each pixel 1 and is interposed between the transfer transistor TG of each pixel 1 and another pixel transistor (the reset transistor RST, the selection transistor SEL, the amplification transistor AMP, or the dummy transistor).
The external element isolation insulation film 29b is provided outside each pixel 1, and extends in the X direction and the Y direction between the pixels 1 adjacent to each other. The external element isolation insulation film 29b has a planar shape similar to that of the element isolation insulation film 21, and has a shape surrounding each of four pixels 1 illustrated in
Note that
The solid-state imaging device of the present embodiment can be realized, for example, by forming the internal element isolation insulation films 29a and the external element isolation insulation film 29b as the element isolation insulation film 29 when the solid-state imaging device is manufactured by the method illustrated in
Hereinafter, details of the solid-state imaging device according to the third embodiment will be further described with reference to
The solid-state imaging device of the present embodiment is, for example, a near-infrared light (NIR) sensor. In this case, each pixel 1 of the present embodiment is used as an NIR pixel for detecting near-infrared light, and a color filter 16 (
Four pixels 1 illustrated in
The sharing of the pixel transistor between the pixels 1 is performed, for example, to reduce the chip size of the solid-state imaging device. However, when such sharing is performed, the symmetry and periodicity of pixel transistors and wirings may deteriorate between these pixels 1 (sharing pixels). For example, in the present embodiment, the size of the amplification transistor AMP is different from the size of the reset transistor RST and the size of the dummy transistor. This is because the size of the amplification transistor AMP is increased in order to reduce noise of the amplification transistor AMP.
The deterioration in symmetry and periodicity also affects the imaging characteristics of the NIR sensor. The near-infrared light is less likely to be absorbed by a silicon substrate (substrate 11) than visible light, and easily reaches each pixel transistor without much decrease in intensity. Therefore, in a case where the near-infrared light is detected, the symmetry and periodicity affect the imaging characteristics more strongly than in a case where the visible light is detected. In the NIR sensor, for example, a large sensitivity difference is likely to be generated between the sharing pixels.
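The statement that near-infrared light reaches the pixel transistors with little attenuation can be illustrated with the Beer-Lambert law. The absorption coefficients and the depth below are order-of-magnitude illustrative values for silicon, assumed for this sketch rather than taken from the present disclosure.

```python
import math

def transmitted_fraction(alpha_per_cm, depth_um):
    """Beer-Lambert law: fraction of light remaining after traveling
    depth_um micrometers through silicon with absorption coefficient
    alpha_per_cm (in 1/cm). Note 1 um = 1e-4 cm."""
    return math.exp(-alpha_per_cm * depth_um * 1e-4)

# Order-of-magnitude absorption coefficients in silicon (assumed values):
ALPHA_VISIBLE = 7000.0  # ~550 nm green light, 1/cm (illustrative)
ALPHA_NIR = 100.0       # ~940 nm near-infrared light, 1/cm (illustrative)

DEPTH_UM = 3.0  # assumed depth from the light-incident surface to the transistors

print(transmitted_fraction(ALPHA_VISIBLE, DEPTH_UM))  # ~0.12: mostly absorbed
print(transmitted_fraction(ALPHA_NIR, DEPTH_UM))      # ~0.97: mostly transmitted
```

Under these assumed coefficients, most visible light is absorbed before reaching the transistor depth, while most near-infrared light is not, which is why asymmetry of the transistor layout shows up more strongly in NIR imaging characteristics.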
As a technique of correcting the sensitivity difference between the sharing pixels, for example, there is a technique of correcting an opening of the light-shielding film 15 (
In the present embodiment, in order to correct the sensitivity difference between the sharing pixels, the width of the internal element isolation insulation film 29a is adjusted for each pixel 1. Therefore, it is possible to correct the sensitivity difference between the sharing pixels without decreasing the Qe of the NIR sensor. The element isolation insulation films 21 and 29 of the present embodiment are silicon oxide films and have a property of reflecting light. The light reflected by the element isolation insulation films 21 and 29 can contribute to the sensitivity of the pixel 1. Thus, according to the present embodiment, by adjusting the width of the internal element isolation insulation film 29a for each pixel 1, the influence of the internal element isolation insulation film 29a on the sensitivity can be adjusted for each pixel 1, and thus the sensitivity difference between the sharing pixels can be reduced.
In the present embodiment, when the internal element isolation insulation film 29a of a certain pixel 1 is thickened, the light component reflected by the internal element isolation insulation film 29a increases, and the sensitivity of the pixel 1 increases. Thus, in a case where the sensitivity difference between the sharing pixels is corrected using this technology, the internal element isolation insulation film 29a of the pixel 1 having low sensitivity is generally thickened. Therefore, the output of the pixel 1 having low sensitivity can be matched with the output of the pixel 1 having high sensitivity, and the decrease in Qe of the NIR sensor can be suppressed.
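The correction described above, in which the internal element isolation insulation film 29a of the low-sensitivity pixel is thickened, can be sketched with a first-order model. The linear sensitivity-versus-width relationship, the coefficient k, and all numeric values below are hypothetical assumptions for illustration, not values from the present disclosure.

```python
def width_for_target_sensitivity(base_width_nm, base_sens, target_sens, k):
    """Hypothetical first-order model: the sensitivity of a pixel is
    assumed to increase linearly with the width of its internal element
    isolation insulation film, with slope k (sensitivity units per nm,
    an assumed fitting parameter). Returns the film width needed to
    raise base_sens to target_sens; only thickening is considered."""
    if target_sens <= base_sens:
        return base_width_nm
    return base_width_nm + (target_sens - base_sens) / k

# Example with arbitrary numbers: match a low-sensitivity sharing pixel
# (relative sensitivity 0.95) to the high-sensitivity one (1.00),
# assuming k = 0.002 sensitivity units per nm of added width.
alpha = 100.0  # assumed base width (corresponding to width alpha), in nm
print(width_for_target_sensitivity(alpha, 0.95, 1.00, k=0.002))  # 125.0 nm
```

In this sketch, the width α′ of the low-sensitivity pixel would be set about 25 nm thicker than α; in practice such a coefficient would have to be determined from measured or simulated pixel characteristics.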
Note that the structure of the internal element isolation insulation film 29a of the present embodiment may be applied to a solid-state imaging device other than the NIR sensor. Furthermore, in the present embodiment, the width of the internal element isolation insulation film 29a of the pixel 1 other than the pixel 1 including the amplification transistor AMP may be adjusted. Furthermore, in the present embodiment, the sensitivity difference between the sharing pixels may be corrected by adjusting the width of the external element isolation insulation film 29b instead of the width of the internal element isolation insulation film 29a.
Similarly to
The internal element isolation insulation film 29a of the present modification example has the width α in any portion. On the other hand, the external element isolation insulation film 29b of the present modification example has the width β almost entirely, but has a width β′ in the +Y direction of the pixel 1 including the amplification transistor AMP. The width β′ is set to be thicker than the width β. The width β is an example of a first width of the present disclosure, and the width β′ is an example of a second width of the present disclosure. According to the present modification example, the sensitivity difference between the sharing pixels can be corrected by adjusting the width of the external element isolation insulation film 29b.
Similarly to
The internal element isolation insulation film 29a of the present modification example has the width α in any portion. On the other hand, the external element isolation insulation film 29b of the present modification example has the width β almost entirely, but has a width β′ in the ±X direction of the pixel 1 including the amplification transistor AMP. According to the present modification example, the sensitivity difference between the sharing pixels can be corrected by adjusting the width of the external element isolation insulation film 29b at a plurality of portions.
The external element isolation insulation film 29b of the present modification example has a width β in any portion. On the other hand, the internal element isolation insulation film 29a of the present modification example has the width α in the pixel 1 including the reset transistor RST or the selection transistor SEL, but has the width α′ in the pixel 1 including the amplification transistor AMP or the dummy transistor. According to the present modification example, the sensitivity difference between the sharing pixels can be corrected by adjusting the widths of the internal element isolation insulation films 29a in a plurality of the pixels 1.
In the present modification example, the areas of the gate electrodes 24 of the reset transistor RST, selection transistor SEL, amplification transistor AMP, and dummy transistor in plan view are set to be the same. Moreover, the reset transistor RST, selection transistor SEL, amplification transistor AMP, and dummy transistor of the present modification example are disposed periodically in the X direction and the Y direction similarly to the transfer transistor TG. Specifically, each of the gate electrodes 24 of the reset transistor RST, selection transistor SEL, amplification transistor AMP, and dummy transistor of the present modification example is disposed in the −Y direction of the internal element isolation insulation film 29a near the center of the internal element isolation insulation film 29a in the corresponding pixel 1.
On the other hand, the element isolation insulation film 29 of the present modification example has the same shape as that of the element isolation insulation film 29 illustrated in
In the present modification example, the areas of the gate electrodes 24 of the reset transistor RST, selection transistor SEL, amplification transistor AMP, and dummy transistor in plan view are set to be the same. Therefore, the sensitivity difference between the sharing pixels, which is caused by these pixel transistors, is generally not generated. However, in a case where the shapes of the wirings (for example, the wirings in the wiring layers 25 to 27) of the solid-state imaging device of the present modification example are different between the sharing pixels, the sensitivity difference between the sharing pixels may be generated. According to the present modification example, the sensitivity difference can be reduced. Note that in the present modification example, instead of adopting the shape of the element isolation insulation film 29 illustrated in
As described above, the element isolation insulation film 29a (or 29b) of the present embodiment includes a portion having the width α (or β) and a portion having the width α′ (or β′). Therefore, according to the present embodiment, the sensitivity difference can be prevented from being generated between the pixels 1 by adjusting the width of the element isolation insulation film 29a (or 29b).
Note that the internal element isolation insulation film 29a or external element isolation insulation film 29b of the present embodiment may have three or more types of widths. Furthermore, in the solid-state imaging device of the present embodiment, the internal element isolation insulation film 29a may have two or more types of widths, and the external element isolation insulation film 29b may have two or more types of widths.
Hereinafter, the solid-state imaging devices according to fourth to ninth embodiments will be described. The solid-state imaging devices of the fourth to ninth embodiments will be described focusing on a difference from the solid-state imaging devices of the first to third embodiments, and the description of common points with the solid-state imaging devices of the first to third embodiments will be omitted.
A of
The solid-state imaging device of the present embodiment generally has a structure similar to that of the solid-state imaging device of the comparative example of the first embodiment illustrated in A to C of
However, as illustrated in A of
A to C of
In A of
On the other hand, in four pixels 1 illustrated in A of
Note that in the solid-state imaging device of the present embodiment, the upper left and upper right pixels 1 illustrated in A of
The solid-state imaging device illustrated in A to C of
As illustrated in
According to the present embodiment, since the side surface of the element isolation insulation film 21 has the tapered shape, a gradient of potential (transfer gradient) can be easily applied to the transfer transistor TG side, for example. Therefore, the quantum efficiency (Qe) and the transfer gradient can be optimized.
Note that each portion as the element isolation insulation film 21 of the present embodiment may be provided on the element isolation insulation film 29 in a similar manner to the element isolation insulation film 21 illustrated in
The solid-state imaging device illustrated in
Each pixel 1 of the present embodiment has a hexagonal shape in plan view. Therefore, each pixel 1 of the present embodiment has a honeycomb structure having a hexagonal column shape. Each pixel 1 illustrated in
Three pixels 1 on the straight line A2 have a periodic structure in the Y direction. For example, the transfer transistor TG, transistor Tr, floating diffusion portion FD, and element isolation insulation film 29 in the upper pixel 1 on the straight line A2 are periodically disposed in the Y direction, respectively, with respect to the transfer transistor TG, transistor Tr, floating diffusion portion FD, and element isolation insulation film 29 in the central pixel 1 on the straight line A2. Therefore, the effects similar to those of the pixels 1 illustrated in
On the other hand, two pixels 1 on the straight line A1 have a structure that is rotationally symmetric to each other. For example, the transfer transistor TG, transistor Tr, floating diffusion portion FD, and element isolation insulation film 29 in the upper pixel 1 on the straight line A1 are disposed, respectively, at positions at which the transfer transistor TG, transistor Tr, floating diffusion portion FD, and element isolation insulation film 29 in the lower pixel 1 on the straight line A1 are rotated by 180 degrees. The same applies to two pixels 1 on the straight line A3.
Furthermore, the pixels 1 separated from each other in the X direction have a periodic structure in the X direction. For example, the transfer transistor TG, transistor Tr, floating diffusion portion FD, and element isolation insulation film 29 in the upper pixel 1 on the straight line A3 are periodically disposed in the X direction, respectively, with respect to the transfer transistor TG, transistor Tr, floating diffusion portion FD, and element isolation insulation film 29 in the upper pixel 1 on the straight line A1. Therefore, the effects similar to those of the pixels 1 illustrated in
According to the present embodiment, the pixel 1 having the honeycomb structure is adopted, and thus it is possible to improve the degree of freedom in designing the layout of the components in each pixel 1. For example, a distance between the transistors Tr of the different pixels 1 can be increased. This is because there are only four corners at which the transistor Tr can be disposed in a case where the shape of each pixel 1 is a quadrangle, whereas there are six such corners in a case where the shape of each pixel 1 is a hexagon. In
In the modification example of A of
The same applies to the modification example of B of
The solid-state imaging device illustrated in A of
In the solid-state imaging device illustrated in B of
The same applies to the solid-state imaging device illustrated in C of
A of
B of
A of
The solid-state imaging device of the present embodiment generally has a structure similar to that of the solid-state imaging device of the first embodiment illustrated in A to C of
However, as illustrated in A of
As described above, each pixel 1 of the present embodiment includes the element isolation insulation film 29 on the symmetry plane of each pixel 1 perpendicular to the Y direction. Thus, the shape of the element isolation insulation film 29 in each pixel 1 is line-symmetric with respect to the symmetry plane described above. Therefore, it is possible to prevent the element isolation insulation film 29 from deteriorating the optical symmetry of each pixel 1.
As in
B of
The well contact region 32 is a semiconductor region provided in the substrate 11. The well contact region 32 is, for example, a p-type semiconductor region. Furthermore, each of the contact plugs 31 illustrated in B of
The contact plug 31 illustrated in B of
A and B in
When the well contact region 32 is disposed in the pixel 1 as in the present comparative example, there is a possibility that the size of the photodiode PD is reduced due to the well contact region 32. As a result, there is a possibility that the photoelectric conversion efficiency of each pixel 1 decreases.
On the other hand, the well contact region 32 and the corresponding contact plug 31 of the present embodiment are provided under the element isolation insulation film 21. Therefore, it is possible to prevent the size of the photodiode PD from being reduced due to the well contact region 32. Thus, according to the present embodiment, the photoelectric conversion efficiency of each pixel 1 can be improved.
Note that the well contact region 32 illustrated in A of
However, in the upper left pixel 1 and upper right pixel 1 illustrated in
Four pixels 1 illustrated in
Note that the structures of the n-type semiconductor region 12 and transfer transistor TG illustrated in
In the modification example of A of
A of
B of
The camera 100 includes an optical unit 101 including a lens group, an imaging device 102 which is the solid-state imaging device according to any one of the first to ninth embodiments, a digital signal processor (DSP) circuit 103 which is a camera signal processing circuit, a frame memory 104, a display unit 105, a recording unit 106, an operation unit 107, and a power supply unit 108. Furthermore, the DSP circuit 103, the frame memory 104, the display unit 105, the recording unit 106, the operation unit 107, and the power supply unit 108 are connected to each other via a bus line 109.
The optical unit 101 receives incident light (image light) from a subject and forms an image on an imaging surface of the imaging device 102. The imaging device 102 converts the light amount of the incident light imaged on the imaging surface by the optical unit 101 into an electrical signal in units of pixels, and outputs the electrical signal as a pixel signal.
The DSP circuit 103 performs signal processing on the pixel signal output by the imaging device 102. The frame memory 104 is a memory for storing one screen of a moving image or a still image, which is captured by the imaging device 102.
The display unit 105 includes, for example, a panel-type display device such as a liquid crystal panel or an organic EL panel, and displays the moving image or the still image captured by the imaging device 102. The recording unit 106 records the moving image or the still image captured by the imaging device 102 on a recording medium such as a hard disk or a semiconductor memory.
The operation unit 107 issues operation commands for various functions of the camera 100 under operation by a user. The power supply unit 108 appropriately supplies various power sources as operation power sources of the DSP circuit 103, the frame memory 104, the display unit 105, the recording unit 106, and the operation unit 107 to these power supply targets.
In a case where the solid-state imaging device according to any one of the first to ninth embodiments is used as the imaging device 102, a good image can be expected to be acquired.
The solid-state imaging device can be applied to various other products. For example, the solid-state imaging device may be mounted on any type of mobile body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility, an airplane, a drone, a ship, or a robot.
The vehicle control system 200 includes a plurality of electronic control units connected to each other via a communication network 201. In the example illustrated in
The driving system control unit 210 controls the operation of devices related to the driving system of the vehicle in accordance with various kinds of programs. For example, the driving system control unit 210 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine, a driving motor, or the like, a driving force transmitting mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
The body system control unit 220 controls the operation of various kinds of devices provided to a vehicle body in accordance with various kinds of programs. For example, the body system control unit 220 functions as a control device for a smart key system, a keyless entry system, a power window device, or various kinds of lamps (for example, a headlamp, a backup lamp, a brake lamp, a turn signal lamp, a fog lamp). In this case, radio waves transmitted from a mobile device as an alternative to a key or signals of various kinds of switches can be input to the body system control unit 220. The body system control unit 220 receives these input radio waves or signals, and controls a door lock device, the power window device, the lamps, or the like of the vehicle.
The outside-vehicle information detection unit 230 detects information regarding the outside of the vehicle including the vehicle control system 200. For example, the imaging unit 231 is connected to the outside-vehicle information detection unit 230. The outside-vehicle information detection unit 230 makes the imaging unit 231 form an image of the outside of the vehicle, and receives the captured image from the imaging unit 231. On the basis of the received image, the outside-vehicle information detection unit 230 may perform processing of detecting an object such as a human, a vehicle, an obstacle, a sign, a character on a road surface, or the like, or processing of detecting a distance thereto.
The imaging unit 231 is an optical sensor that receives light and outputs an electric signal corresponding to a received light amount of the light. The imaging unit 231 can output the electric signal as an image, or can output the electric signal as information regarding a measured distance. The light received by the imaging unit 231 may be visible light, or may be invisible light such as infrared rays or the like. The imaging unit 231 includes the solid-state imaging device according to any one of the first to ninth embodiments.
The in-vehicle information detection unit 240 detects information regarding the inside of the vehicle including the vehicle control system 200. The in-vehicle information detection unit 240 is, for example, connected to a driver state detection unit 241 that detects the state of a driver. For example, the driver state detection unit 241 includes a camera that images the driver. On the basis of detection information input from the driver state detection unit 241, the in-vehicle information detection unit 240 may calculate a degree of fatigue of the driver or a degree of concentration of the driver, or may determine whether the driver is dozing. The camera may include the solid-state imaging device according to any one of the first to ninth embodiments, and may be, for example, the camera 100 illustrated in
The microcomputer 251 can calculate a control target value for the driving force generation device, the steering mechanism, or the braking device on the basis of information regarding the inside or outside of the vehicle, the information obtained by the outside-vehicle information detection unit 230 or the in-vehicle information detection unit 240, and output a control command to the driving system control unit 210. For example, the microcomputer 251 can perform cooperative control intended to implement functions of an advanced driver assistance system (ADAS) which performs collision avoidance or shock mitigation for the vehicle, following driving based on a following distance, vehicle speed maintaining driving, a warning of collision, a warning of deviation from a lane, or the like.
Furthermore, the microcomputer 251 can perform cooperative control intended for automated driving, which makes the vehicle travel in an automated manner without depending on the operation of the driver, or the like, by controlling the driving force generation device, the steering mechanism, or the braking device on the basis of information regarding the periphery of the vehicle, the information obtained by the outside-vehicle information detection unit 230 or the in-vehicle information detection unit 240.
Furthermore, the microcomputer 251 can output a control command to the body system control unit 220 on the basis of information regarding the outside of the vehicle, the information obtained by the outside-vehicle information detection unit 230. For example, the microcomputer 251 can perform cooperative control intended to prevent glare by controlling the headlamp so as to change from a high beam to a low beam in accordance with the position of a preceding vehicle or an oncoming vehicle, which is detected by the outside-vehicle information detection unit 230.
The sound/image output unit 252 transmits an output signal of at least one of a sound or an image to an output device capable of visually or auditorily notifying an occupant of the vehicle or the outside of the vehicle of information. In the example of
A vehicle 300 illustrated in
The imaging unit 301 provided on the front nose mainly acquires an image of a forward side of the vehicle 300. The imaging unit 302 provided on a left sideview mirror and the imaging unit 303 provided on a right sideview mirror mainly acquire an image of the sideward side of the vehicle 300. The imaging unit 304 provided on the rear bumper or the back door mainly acquires an image of the rearward side of the vehicle 300. The imaging unit 305 provided on an upper portion of the windshield inside the vehicle interior mainly acquires an image of the forward side of the vehicle 300. The imaging unit 305 is used to detect, for example, a preceding vehicle, a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, and the like.
At least one of the imaging units 301 to 304 may have a function of obtaining distance information. For example, at least one of the imaging units 301 to 304 may be a stereo camera including a plurality of imaging devices, or may be an imaging device having pixels for phase difference detection.
For example, the microcomputer 251 (
For example, the microcomputer 251 can classify three-dimensional object data on three-dimensional objects into three-dimensional object data of a two-wheeled vehicle, a standard-sized vehicle, a large-sized vehicle, a pedestrian, a utility pole, and other three-dimensional objects on the basis of the distance information obtained from the imaging units 301 to 304, extract the classified three-dimensional object data, and use the extracted three-dimensional object data for automatic avoidance of an obstacle. For example, the microcomputer 251 identifies obstacles around the vehicle 300 as obstacles that the driver of the vehicle 300 can recognize visually and obstacles that are difficult for the driver of the vehicle 300 to recognize visually. Then, the microcomputer 251 determines a collision risk indicating the degree of a risk of collision with each obstacle. In a situation in which the collision risk is equal to or higher than a set value and thus there is a possibility of collision, the microcomputer 251 outputs a warning to the driver via the audio speaker 261 or the display unit 262, and performs forced deceleration or avoidance steering via the driving system control unit 210. In this manner, the microcomputer 251 can assist in driving to avoid the collision.
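The decision logic described above, in which a warning and forced deceleration are triggered when the collision risk is equal to or higher than a set value, can be sketched as follows. The risk metric (inverse time-to-collision), the threshold, and all numeric values are hypothetical assumptions for illustration; the actual processing of the microcomputer 251 is not specified at this level of detail in the present disclosure.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float          # distance to the obstacle, in meters
    closing_speed_mps: float   # relative speed toward the vehicle, in m/s

def collision_risk(obs):
    """Hypothetical risk metric: inverse time-to-collision (1/s).
    A higher value means less time remains before impact."""
    if obs.closing_speed_mps <= 0:
        return 0.0  # obstacle is not approaching
    return obs.closing_speed_mps / obs.distance_m

RISK_THRESHOLD = 0.5  # assumed set value (1/s) for the collision risk

def assistance_action(obs):
    """Decide the driver-assistance response for one obstacle:
    warn the driver and force deceleration when the risk is at or
    above the set value, otherwise keep monitoring."""
    if collision_risk(obs) >= RISK_THRESHOLD:
        return "warn_and_brake"  # warning via speaker/display + deceleration
    return "monitor"

print(assistance_action(Obstacle(distance_m=10.0, closing_speed_mps=8.0)))  # warn_and_brake
print(assistance_action(Obstacle(distance_m=50.0, closing_speed_mps=5.0)))  # monitor
```

In the first example the assumed risk is 0.8 (above the set value), so the sketch triggers the warning and deceleration path; in the second it is 0.1, so only monitoring continues.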
At least one of the imaging units 301 to 304 may be an infrared camera that detects infrared rays. For example, the microcomputer 251 can recognize a pedestrian by determining whether or not there is a pedestrian in images captured by the imaging units 301 to 304. Such recognition of the pedestrian is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 301 to 304 as infrared cameras and a procedure of determining whether or not the object is a pedestrian by performing pattern matching processing on a series of feature points representing the contour of the object. When the microcomputer 251 determines that there is a pedestrian in the images captured by the imaging units 301 to 304 and thus recognizes the pedestrian, the sound/image output unit 252 controls the display unit 262 such that a square contour line for emphasis is displayed on the recognized pedestrian in a superimposing manner. Furthermore, the sound/image output unit 252 may control the display unit 262 such that an icon or the like indicating a pedestrian is displayed at a desired position.
The endoscope 500 includes a lens barrel 501 of which a region having a predetermined length from the distal end is inserted into the body cavity of the patient 532, and a camera head 502 connected to the base end of the lens barrel 501. In the illustrated example, the endoscope 500 configured as a so-called rigid scope having the rigid lens barrel 501 is illustrated, but the endoscope 500 may be configured as a so-called flexible scope having a flexible lens barrel.
An opening portion into which an objective lens is fitted is provided at the distal end of the lens barrel 501. A light source device 603 is connected to the endoscope 500, and light generated by the light source device 603 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 501, and is emitted toward an observation target in the body cavity of the patient 532 via the objective lens. Note that the endoscope 500 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
An optical system and an imaging element are provided inside the camera head 502, and reflected light (observation light) from the observation target is condensed on the imaging element by the optical system. The observation light is photoelectrically converted by the imaging element, and an electric signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated. The image signal as RAW data is transmitted to a camera control unit (CCU) 601.
The CCU 601 includes a central processing unit (CPU) and a graphics processing unit (GPU), and integrally controls operations of the endoscope 500 and the display device 602. Moreover, the CCU 601 receives an image signal from the camera head 502, and performs various image processing for displaying an image based on the image signal, such as development processing (demosaic processing), on the image signal.
The display device 602 displays an image based on the image signal subjected to the image processing by the CCU 601 under the control of the CCU 601.
For example, the light source device 603 includes a light source such as a light emitting diode (LED) and supplies irradiation light for imaging a surgical site or the like to the endoscope 500.
An input device 604 is an input interface for the endoscopic surgery system 400. A user can input various types of information and instructions to the endoscopic surgery system 400 via the input device 604. For example, the user inputs an instruction or the like to change imaging conditions (type of irradiation light, magnification, focal length, and the like) of the endoscope 500.
A treatment tool control device 605 controls driving of the energy treatment tool 512 for cauterization and incision of tissue, sealing of a blood vessel, or the like. A pneumoperitoneum device 606 feeds gas into the body cavity of the patient 532 via the pneumoperitoneum tube 511 in order to inflate the body cavity of the patient 532 for the purpose of securing a visual field of the endoscope 500 and securing a working space of the operator. A recorder 607 is a device capable of recording various types of information regarding surgery. A printer 608 is a device capable of printing various types of information regarding surgery in various formats such as text, image, or graph.
Note that the light source device 603 that supplies the endoscope 500 with the irradiation light at the time of imaging the surgical site may include, for example, an LED, a laser light source, or a white light source including a combination thereof. In a case where the white light source includes a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, and thus adjustment of the white balance of the captured image can be performed in the light source device 603. Furthermore, in this case, the observation target is irradiated with laser light from each of the RGB laser light sources in a time division manner, and driving of the imaging element of the camera head 502 is controlled in synchronization with an irradiation timing. Therefore, it is also possible to capture an image corresponding to each of RGB in a time division manner. In this method, a color image can be obtained without providing a color filter on the imaging element.
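The time-division RGB scheme above can be illustrated with a minimal sketch: three monochrome frames, each captured while only one of the R, G, and B laser sources is lit, are combined into a single color image. Frame representation as nested lists of intensities is an assumption made for this example; an actual implementation would operate on the imaging element's raw output.

```python
def synthesize_color(frame_r, frame_g, frame_b):
    """Combine three monochrome frames, captured in a time-division
    manner under R, G, and B illumination respectively, into one color
    image without a color filter on the imaging element.

    Each frame is a 2-D list of intensities of identical size; the
    output is a 2-D list of (r, g, b) tuples.
    """
    assert len(frame_r) == len(frame_g) == len(frame_b)
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(frame_r, frame_g, frame_b)
    ]
```

Because each frame is exposed under a single known wavelength, every pixel directly yields one color component, which is why no on-chip color filter is needed in this method.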
Furthermore, the driving of the light source device 603 may be controlled so as to change the intensity of the output light every predetermined time. By controlling the driving of the imaging element of the camera head 502 in synchronization with the timing of the change of the light intensity to acquire images in a time-division manner, and synthesizing the acquired images, it is possible to generate a high-dynamic-range image without so-called black defects and halation.
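The synthesis step can be sketched as a simple per-pixel merge of a frame taken under low light intensity and a frame taken under high light intensity. The selection thresholds and the averaging rule here are illustrative assumptions; real HDR synthesis typically uses calibrated radiance estimation and weighting.

```python
def merge_hdr(low_frame, high_frame, low_thresh=30, high_thresh=225):
    """Merge frames captured at alternating light intensities.

    Pixels blown out (halation) in the high-intensity frame are taken
    from the low-intensity frame; pixels crushed to black (black
    defects) in the low-intensity frame are taken from the
    high-intensity frame; otherwise the two readings are averaged.
    Frames are 2-D lists of 8-bit intensities.
    """
    merged = []
    for row_lo, row_hi in zip(low_frame, high_frame):
        out_row = []
        for lo, hi in zip(row_lo, row_hi):
            if hi >= high_thresh:        # halation in the bright frame
                out_row.append(lo)
            elif lo <= low_thresh:       # black defect in the dark frame
                out_row.append(hi)
            else:
                out_row.append((lo + hi) // 2)
        merged.append(out_row)
    return merged
```

Each pixel thus keeps whichever exposure preserved detail, which is how the synthesized image avoids both black defects and halation.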
Furthermore, the light source device 603 may be configured to be capable of supplying light in a predetermined wavelength band corresponding to special light imaging. In the special light imaging, by using wavelength dependency of light absorption in a body tissue, for example, so-called narrow band imaging is performed in which a predetermined tissue such as a blood vessel in a mucous membrane surface layer is imaged with high contrast by radiating light in a narrower band as compared with irradiation light (that is, white light) at the time of normal imaging. Alternatively, in the special light imaging, fluorescence imaging for obtaining an image with fluorescence generated by irradiation with excitation light may be performed. In the fluorescence imaging, it is possible to irradiate a body tissue with excitation light to observe fluorescence from the body tissue (Auto-Fluorescence Imaging), or to locally inject a reagent such as indocyanine green (ICG) into the body tissue and irradiate the body tissue with excitation light corresponding to a fluorescence wavelength of the reagent to obtain a fluorescent image. The light source device 603 can be configured to be capable of supplying narrow band light and/or excitation light corresponding to such special light imaging.
The camera head 502 includes a lens unit 701, an imaging unit 702, a drive unit 703, a communication unit 704, and a camera head control unit 705. The CCU 601 includes a communication unit 711, an image processing unit 712, and a control unit 713. The camera head 502 and the CCU 601 are communicably connected to each other by a transmission cable 700.
The lens unit 701 is an optical system provided at a connection portion with the lens barrel 501. Observation light taken in from the distal end of the lens barrel 501 is guided to the camera head 502 and enters the lens unit 701. The lens unit 701 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
The imaging unit 702 includes an imaging element. The number of imaging elements constituting the imaging unit 702 may be one (so-called single-plate type) or plural (so-called multi-plate type). In a case where the imaging unit 702 is configured as a multi-plate type imaging unit, for example, an image signal corresponding to each RGB may be generated by each imaging element, and a color image may be obtained by combining the image signals. Alternatively, the imaging unit 702 may include a pair of imaging elements for acquiring right-eye and left-eye image signals corresponding to three-dimensional (3D) display. By performing the 3D display, the operator 531 can more accurately recognize the depth of the living tissue in the surgical site. Note that, in a case where the imaging unit 702 is configured as a multi-plate type imaging unit, a plurality of lens units 701 can be provided corresponding to the imaging elements. The imaging unit 702 is the solid-state imaging device according to any one of the first to ninth embodiments.
Furthermore, the imaging unit 702 need not necessarily be provided in the camera head 502. For example, the imaging unit 702 may be provided immediately behind the objective lens inside the lens barrel 501.
The drive unit 703 includes an actuator, and moves the zoom lens and focus lens of the lens unit 701 by a predetermined distance along an optical axis under the control of the camera head control unit 705. Therefore, the magnification and focus of the image captured by the imaging unit 702 can be appropriately adjusted.
The communication unit 704 includes a communication device for transmitting and receiving various types of information to and from the CCU 601. The communication unit 704 transmits, as RAW data, the image signal obtained from the imaging unit 702 to the CCU 601 via the transmission cable 700.
Furthermore, the communication unit 704 receives a control signal for controlling driving of the camera head 502 from the CCU 601, and supplies the control signal to the camera head control unit 705. The control signal includes information regarding imaging conditions, for example, information for specifying a frame rate of a captured image, information for specifying an exposure value at the time of imaging, and/or information for specifying magnification and focus of a captured image.
Note that the imaging conditions such as the frame rate, the exposure value, the magnification, and the focus may be appropriately specified by the user, or may be automatically set by the control unit 713 of the CCU 601 on the basis of the acquired image signal. In the latter case, a so-called auto exposure (AE) function, an auto focus (AF) function, and an auto white balance (AWB) function are installed in the endoscope 500.
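One iteration of the auto exposure (AE) loop mentioned above, in which the control unit 713 sets the exposure from the acquired image signal, can be sketched as follows. The target mean luminance, the damping exponent, and the dark-frame cap are illustrative assumptions for this example, not values from the system described here.

```python
def auto_exposure_step(frame, target_mean=128, gain=0.5):
    """One AE iteration on an 8-bit luminance frame (2-D list).

    Compares the frame's mean luminance with a target and returns a
    multiplicative correction to apply to the next exposure value.
    The exponent 'gain' damps the correction so the loop converges
    smoothly instead of oscillating.
    """
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    if mean == 0:
        return 2.0  # completely dark frame: assumed cap on correction
    return (target_mean / mean) ** gain
```

In the endoscope, such a correction would be fed back into the control signal for the camera head 502 (exposure value) on each frame, which is what makes the exposure setting "automatic" from the user's point of view.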
The camera head control unit 705 controls driving of the camera head 502 on the basis of the control signal received from the CCU 601 via the communication unit 704.
The communication unit 711 includes a communication device for transmitting and receiving various types of information to and from the camera head 502. The communication unit 711 receives an image signal transmitted from the camera head 502 via the transmission cable 700.
Furthermore, the communication unit 711 transmits the control signal for controlling driving of the camera head 502 to the camera head 502. The image signal and the control signal can be transmitted by electric communication, optical communication, or the like.
The image processing unit 712 performs various image processing on the image signal that is RAW data transmitted from the camera head 502.
The control unit 713 performs various control related to imaging of a surgical site or the like by the endoscope 500 and display of a captured image obtained by imaging of the surgical site. For example, the control unit 713 generates a control signal for controlling driving of the camera head 502.
Furthermore, the control unit 713 causes the display device 602 to display the captured image showing the surgical site or the like on the basis of the image signal subjected to the image processing by the image processing unit 712. At this time, the control unit 713 may recognize various objects in the captured image by using various image recognition technologies. For example, the control unit 713 can recognize a surgical tool such as forceps, a specific living body site, bleeding, mist at the time of using the energy treatment tool 512, and the like by detecting the shape, color, and the like of an edge of an object included in the captured image. When displaying the captured image on the display device 602, the control unit 713 may use the recognition result to superimpose various types of surgery support information on the image of the surgical site. Since the surgery support information is superimposed and presented to the operator 531, the burden on the operator 531 can be reduced and the operator 531 can perform the surgery reliably.
The transmission cable 700 connecting the camera head 502 with the CCU 601 is an electric signal cable compatible with electric signal communication, an optical fiber compatible with optical communication, or a composite cable formed by combining the electric signal cable and the optical fiber.
Here, in the illustrated example, communication is performed by wire using the transmission cable 700, but communication between the camera head 502 and the CCU 601 may be performed wirelessly.
The embodiments of the present disclosure have been described above, but these embodiments may be implemented with various modifications without departing from the gist of the present disclosure. For example, two or more embodiments may be implemented in combination.
Note that the present disclosure can also have the following configurations.
(1)
A solid-state imaging device including:
(2)
The solid-state imaging device according to (1), further including:
(3)
The solid-state imaging device according to (2),
(4)
The solid-state imaging device according to (2),
(5)
The solid-state imaging device according to (1),
(6)
The solid-state imaging device according to (5),
(7)
The solid-state imaging device according to (5),
(8)
The solid-state imaging device according to (5), further including a first wiring layer provided under the substrate and including a plurality of first wirings,
(9)
The solid-state imaging device according to (8),
(10)
The solid-state imaging device according to (8), further including a second wiring layer provided under the first wiring layer and including a plurality of second wirings,
(11)
The solid-state imaging device according to (10),
(12)
The solid-state imaging device according to (1),
(13)
The solid-state imaging device according to (12),
(14)
The solid-state imaging device according to (1),
(15)
The solid-state imaging device according to (1), further including an element isolation insulation film surrounding each of the first and second pixels.
(16)
A solid-state imaging device including:
(17)
The solid-state imaging device according to (16), further including an element isolation insulation film surrounding each of the first and second pixels.
(18)
A solid-state imaging device including:
(19)
The solid-state imaging device according to (18),
(20)
The solid-state imaging device according to (18),
(21)
The solid-state imaging device according to (18),
(22)
A manufacturing method for a solid-state imaging device including a first pixel and a second pixel located in a first direction of the first pixel, the method including:
(23)
The manufacturing method for a solid-state imaging device according to (22), further including forming a third pixel located in a second direction of the first pixel and a fourth pixel located in the second direction of the second pixel,
(24)
The manufacturing method for a solid-state imaging device according to (22),
(25)
A solid-state imaging device including:
(26)
The solid-state imaging device according to (15), in which a side surface of the element isolation insulation film includes a portion having a tapered shape.
(27)
The solid-state imaging device according to (1), in which the first and second pixels each have a hexagonal shape in plan view.
(28)
The solid-state imaging device according to (1), in which the first or second pixel includes an element isolation insulation film provided between the first transistor and the second transistor, and located on a symmetry plane of the first or second pixel, which is perpendicular to the first direction.
(29)
The solid-state imaging device according to (5), further including:
(30)
The solid-state imaging device according to (3),
(31)
A solid-state imaging device including:
(32)
The solid-state imaging device according to (31), further including:
Number | Date | Country | Kind
---|---|---|---
2020-176278 | Oct 2020 | JP | national
2021-016898 | Feb 2021 | JP | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2021/038657 | 10/19/2021 | WO |