The present disclosure relates to a solid-state imaging device and an electronic device, and particularly relates to a solid-state imaging device and an electronic device, which are capable of achieving a high-speed communication interface.
Backside-illuminated solid-state imaging devices having a stacked structure, in which a pixel substrate on which pixels are formed and a control substrate on which a control circuit that controls the pixels or performs other processing is formed are stacked, have been widely used. In such backside-illuminated solid-state imaging devices, there is a structure in which a pad serving as an output section that outputs a pixel signal or other signals is formed on the control substrate side, and an opening process is performed on a portion of the pixel substrate over the pad. In this structure, since the pad is positioned at a deep position as viewed from the pixel substrate side, wire bonding onto the pad needs to be carried out accurately, which causes deterioration in assembly quality.
Therefore, Patent Document 1 has proposed a structure in which a pad is formed in a wiring layer that is an uppermost layer of a pixel substrate. In addition, Patent Document 2 has proposed a technology in which, in a case where a pad is formed in a wiring layer that is an uppermost layer of a pixel substrate, the pad of the pixel substrate is connected to a wiring on a control substrate side to improve quality related to the formation of color filters, microlenses, and other components, and to suppress a variation in device characteristics or other changes during heat treatment.
However, in a case where the pad of the pixel substrate is connected to the wiring on the control substrate side, there is a concern that pad capacitance increases. In response to the increasing number of pixels of solid-state imaging devices and the growing demand for moving image performance, the baud rate of a communication interface mounted in a solid-state imaging device is increasing year by year, and pad capacitance is one of the factors that restrict the increase in the rate.
The present disclosure has been made in view of such a situation, and an object thereof is to achieve a high-speed communication interface.
A solid-state imaging device according to a first aspect of the present disclosure includes a pixel substrate on which a pixel is formed and a control substrate, in which the pixel substrate and the control substrate are stacked, the pixel substrate includes a pad serving as a contact point with an external device, and the control substrate includes a signal pad wiring connected to the pad, a shield wiring disposed around the signal pad wiring, and a high-resistance element connected to the shield wiring.
An electronic device according to a second aspect of the present disclosure is provided with a solid-state imaging device that includes a pixel substrate on which a pixel is formed, and a control substrate, in which the pixel substrate and the control substrate are stacked, the pixel substrate includes a pad serving as a contact point with an external device, and the control substrate includes a signal pad wiring connected to the pad, a shield wiring disposed around the signal pad wiring, and a high-resistance element connected to the shield wiring.
According to the first and second aspects of the present disclosure, the pixel substrate on which the pixel is formed and the control substrate are stacked, the pixel substrate is provided with the pad serving as the contact point with the external device, and the control substrate is provided with the signal pad wiring connected to the pad, the shield wiring disposed around the signal pad wiring, and the high-resistance element connected to the shield wiring.
The solid-state imaging device or the electronic device may be an independent device or a module incorporated in another device.
Hereinafter, modes for carrying out the technology of the present disclosure (which will be hereinafter referred to as embodiments) will be described with reference to the accompanying drawings. The description will be given in the following order.
Note that, in the drawings referred to in the following description, the same or similar parts are denoted by the same or similar reference signs, and the description thereof will not be repeated as appropriate. The drawings are schematic, and the relationship between the thickness and the plane dimension, the ratio of the thickness of each layer, and other points are different from the actual ones. In addition, the drawings may include parts having different dimensional relationships and ratios.
In addition, the definition of directions such as upward and downward in the following description is merely a definition for convenience of description, and does not limit the technical idea of the present disclosure. For example, when an object is rotated by 90° and observed, the upper and lower sides become the left and right sides, and when the object is rotated by 180°, the upper and lower sides are reversed.
A solid-state imaging device 1 in
The solid-state imaging device 1 includes a pixel array section 11 and peripheral circuit sections. The peripheral circuit sections include, for example, a vertical drive section 12, a column processing section 13, a horizontal drive section 14, and a system control section 15.
The solid-state imaging device 1 further includes a signal processing section 16, a data storage section 17, and an I/F circuit 18. At least one of the signal processing section 16, the data storage section 17, or the I/F circuit 18 is disposed over a substrate different from the substrate on which the pixel array section 11 is formed.
The pixel array section 11 has a configuration in which pixels 21, each of which has a photoelectric conversion section that generates and accumulates photocharges according to the amount of received light, are two-dimensionally arranged in a matrix in the row direction and the column direction. Here, the row direction refers to the array direction of the pixels in a pixel row of the pixel array section 11, that is, the horizontal direction, and the column direction refers to the array direction of the pixels in a pixel column, that is, the vertical direction.
Each of the pixels 21 includes a photodiode serving as a photoelectric conversion section and a plurality of pixel transistors (what is called MOS transistors). The plurality of pixel transistors includes four transistors that are a transfer transistor, a selection transistor, a reset transistor, and an amplification transistor, for example.
In the pixel array section 11, pixel drive wirings 22 serving as row signal lines are disposed, one for each pixel row, along the row direction and vertical signal lines 23 serving as column signal lines are disposed, one for each pixel column, along the column direction. Each of the pixel drive wirings 22 transmits a drive signal for driving when a signal is read from each of the pixels 21. In
The vertical drive section 12 includes a shift register, an address decoder, and other components, and drives the individual pixels 21 in the pixel array section 11 simultaneously for all pixels, row by row, or in other ways. The vertical drive section 12 constitutes, together with the system control section 15, a drive section that controls the operation of each of the pixels 21 in the pixel array section 11. Although a specific configuration of the vertical drive section 12 is not illustrated, the vertical drive section generally includes two scanning systems: a reading scanning system and a sweeping scanning system.
In order to read a signal from each of the pixels 21, the reading scanning system sequentially selects and scans the pixels 21 of the pixel array section 11 row by row. The signal read from each of the pixels 21 is an analog signal. The sweeping scanning system performs sweep scanning on a read row on which the read scanning is to be performed by the reading scanning system earlier than the read scanning by an exposure time.
The sweep scanning by the sweeping scanning system allows unnecessary photocharges to be swept out from the photoelectric conversion section of each of the pixels 21 in the read row, thereby resetting the photoelectric conversion section of each of the pixels 21. Then, by sweeping out (resetting) unnecessary photocharges by the sweeping scanning system, a so-called electronic shutter operation is performed. Here, the electronic shutter operation refers to operation of discharging the photocharges of the photoelectric conversion section and newly starting exposure (starting accumulation of photocharges).
The signal read by the read operation of the reading scanning system corresponds to the amount of the received light after the immediately preceding read operation or electronic shutter operation. Then, the period from the read timing by the immediately preceding read operation or the sweep timing by the electronic shutter operation to the read timing by the current read operation is the exposure period in the pixels 21.
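Stated schematically (this relationship merely restates the above and is not a formula from the disclosure), the exposure period of a pixel can be written as

```latex
T_{\mathrm{exp}} = t_{\mathrm{read},\,n} - \max\!\left(t_{\mathrm{read},\,n-1},\ t_{\mathrm{sweep}}\right)
```

where t_read,n is the current read timing, t_read,n-1 is the immediately preceding read timing, and t_sweep is the sweep timing of the electronic shutter operation.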
The signal output from each of the pixels 21 in the pixel row selectively scanned by the vertical drive section 12 is input to the column processing section 13 through each of the vertical signal lines 23 for each pixel column. The column processing section 13 performs predetermined signal processing on the signal output from each pixel 21 in the selected row through the vertical signal line 23 for each pixel column of the pixel array section 11, and temporarily holds the pixel signal after the signal processing.
Specifically, the column processing section 13 performs, as signal processing, at least noise removal processing, for example, correlated double sampling (CDS) processing or double data sampling (DDS) processing. For example, in the CDS processing, fixed pattern noise unique to the pixel, such as reset noise or threshold variation of an amplification transistor in the pixel, is removed. The column processing section 13 may have, for example, an analog-digital (AD) conversion function in addition to the noise removal processing, and convert an analog pixel signal into a digital signal and output the digital signal.
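As a simple illustration of the CDS processing described above, the following sketch shows how taking the difference between the reset-level sample and the signal-level sample of each pixel cancels the pixel-specific offset. The values and the sign convention are purely illustrative; the actual column processing section 13 performs this in analog or digital hardware.

```python
# Minimal sketch of correlated double sampling (CDS); values are illustrative.
def cds(reset_levels, signal_levels):
    """Per-pixel difference between the signal-level and reset-level samples."""
    return [sig - rst for rst, sig in zip(reset_levels, signal_levels)]

# Two pixels with different reset (offset) levels but the same light response.
reset_levels = [0.125, 0.0625]     # first sample: reset level of each pixel
signal_levels = [0.625, 0.5625]    # second sample: reset level + signal
print(cds(reset_levels, signal_levels))   # -> [0.5, 0.5]; the offsets cancel
```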
The horizontal drive section 14 includes a shift register, an address decoder, and other components, and sequentially selects a unit circuit corresponding to the pixel column in the column processing section 13. When the selective scanning is performed by the horizontal drive section 14, the pixel signal subjected to the signal processing for every unit circuit in the column processing section 13 is sequentially output.
The system control section 15 includes a timing generator that generates various timing signals and other components, and performs drive control of the vertical drive section 12, the column processing section 13, the horizontal drive section 14, and other sections on the basis of various timings generated by the timing generator.
The signal processing section 16 has at least an arithmetic processing function, and performs various signal processing such as arithmetic processing on the pixel signals output from the column processing section 13. The data storage section 17 temporarily stores data necessary for signal processing in the signal processing section 16. The pixel signal on which the signal processing has been completed in the signal processing section 16 is supplied to the I/F circuit 18.
The I/F circuit 18 converts the pixel signal supplied from the signal processing section 16 into a predetermined format such as the mobile industry processor interface (MIPI) standard, for example, and outputs the converted signal to an external device via the output section 19.
The solid-state imaging device 1 configured as described above has a stacked structure in which a pixel substrate on which the pixels 21 are formed and a control substrate on which a control circuit that controls the pixels 21 or performs signal processing and other processing is formed are stacked.
In the pixel substrate 41, the pixel array section 11 is formed at the central portion of the chip-sized substrate. In addition, a plurality of pads 51, each of which serves as a contact point with an external device, is formed on the outer peripheral portion of the substrate so as to surround the outside of the pixel array section 11. Each of the pads 51 on the pixel substrate 41 includes the output section 19 in
On the other hand, the control substrate 42 is provided with the I/F circuit 18 and a plurality of pad wirings 52. In addition, although not illustrated, the control substrate 42 may be provided with, for example, the signal processing section 16, the system control section 15, and other sections.
The pad wirings 52 of the control substrate 42 are formed at positions corresponding to the pads 51 of the pixel substrate 41, and are wirings on the control substrate 42 side, each of which is electrically connected to the pad 51 at the same planar position. The pad wirings 52 include a pad wiring 52A (hereinafter, referred to as a signal pad wiring 52A) that is electrically connected to the signal pad 51 of the pixel substrate 41 and a pad wiring 52B (hereinafter, referred to as a power supply or ground pad wiring 52B) that is electrically connected to the power supply or ground pad 51 of the pixel substrate 41. For example, the signal pad wiring 52A and the power supply or ground pad wiring 52B have different wiring pattern areas from each other.
In
The pixel substrate 41 includes a light-condensing layer 61, a semiconductor substrate 62, and a wiring layer 63, which are stacked in this order from the top. Therefore, the light-condensing layer 61 and the wiring layer 63 are disposed on opposite sides of the semiconductor substrate 62, with the semiconductor substrate 62 interposed therebetween. The wiring layer 63 is formed on the front surface side of the semiconductor substrate 62, and the light-condensing layer 61 is formed on the back surface side.
The light-condensing layer 61 includes an on-chip microlens 67, a color filter 68, and other components. Light incident from a subject onto an upper surface (incident surface) of the light-condensing layer 61 is condensed by the on-chip microlens 67 onto a photodiode (not illustrated) formed in the semiconductor substrate 62.
The semiconductor substrate 62 includes, for example, a silicon substrate containing silicon (Si) as a semiconductor. Note that, a material of the semiconductor substrate 62 may be, for example, a Group IV semiconductor such as Ge or may be a Group III-V semiconductor such as GaAs, InP, or InGaAs, in addition to Si described above. Although not illustrated, semiconductor devices, such as a photodiode of each pixel 21, a transfer gate, a charge-voltage converter (FD), a reset gate, an amplification transistor, and a selection transistor, are formed in the semiconductor substrate 62.
In the wiring layer 63, multi-layer wirings 64 and an interlayer insulating film 65 are formed. Each wiring 64 is connected to an upper-layer wiring 64 or a lower-layer wiring 64 via a via 66 in a predetermined region. The wirings 64 and the via 66 are formed using a metal material such as Cu, Al, or other metal materials, for example.
In addition, the pad 51 for external connection is formed in the wiring layer 63. In this example, the pad 51 is formed to have a thickness substantially equivalent to the height from a second-layer wiring 64 to a fourth-layer wiring 64 constituting the wiring layer 63. A through-hole 71 penetrating the light-condensing layer 61 and the semiconductor substrate 62 is formed above the pad 51. A part of a surface (hereinafter, referred to as a connection surface) of the pad 51 where a wire bond ball is to be formed is exposed by the formation of the through-hole 71.
A via 73 is connected to a surface opposite to the connection surface of the pad 51, and the pad 51 is connected to a pad 72 for bonding to the control substrate 42 via the via 73. The pad 72 contains Cu, for example. The pad 72 is connected to a pad 91 of a wiring layer 81 of the control substrate 42 by Cu—Cu bonding.
The control substrate 42 includes the wiring layer 81 and a semiconductor substrate 82, which are stacked in this order from the top. The wiring layer 81 of the control substrate 42 is formed at the wiring layer 63 side of the pixel substrate 41, and the wiring layer 63 of the pixel substrate 41 and the wiring layer 81 of the control substrate 42 are in contact with each other.
In the wiring layer 81, multi-layer wirings 83 and an interlayer insulating film 84 are formed. Each wiring 83 is connected to an upper-layer wiring 83 or a lower-layer wiring 83 via a via 85 in a predetermined region. The wirings 83 and the via 85 are formed using a metal material such as Cu, Al, or other metal materials, for example.
A pad 91 for bonding to the pixel substrate 41 is formed on the upper end of the wiring layer 81. The pad 91 contains Cu, for example. In
The pad 91 on the control substrate 42 side, which is Cu—Cu bonded to the pad 72 on the pixel substrate 41 side, is connected to the signal pad wiring 52A at an uppermost layer level in the wiring layer 81 via the via 92. A wiring 83A is provided at a predetermined position of the same layer level as the signal pad wiring 52A, and the wiring 83A is connected to a wiring 83B below the signal pad wiring 52A via a via 85A.
The semiconductor substrate 82 is formed using, for example, a silicon substrate, but, as with the semiconductor substrate 62 described above, a material other than silicon may be used.
The solid-state imaging device 1 in which the pixel substrate 41 and the control substrate 42 are stacked as described above constitutes a backside-illuminated MOS solid-state imaging device in which light is incident from the back surface side of the semiconductor substrate 62.
In the solid-state imaging device 1, the pad 51 on the pixel substrate 41 side, which outputs a pixel signal as an output signal, is electrically connected to the signal pad wiring 52A on the control substrate 42 side via the via 73, the pad 72, the pad 91, and the via 92. In the wiring layer 81 on the control substrate 42 side, the signal pad wiring 52A, the wiring 83A, the wiring 83B, and other wirings employ a structure in which the influence of capacitance is reduced in terms of electrical characteristics, thereby achieving a high-speed, broadband communication interface. Hereinafter, the wiring structure in the wiring layer 81 on the control substrate 42 side that achieves the high-speed, broadband communication interface will be described in detail.
The pad 51 of the pixel substrate 41 is connected to the signal pad wiring 52A of the control substrate 42 via the via 73, the pad 72, the pad 91, and the via 92. An output signal SIG is supplied to the signal pad wiring 52A, and the output signal SIG is output from the pad 51 of the pixel substrate 41 to the external device via the pad 91, the pad 72, and other components.
The wiring 83A formed at the same layer level as the signal pad wiring 52A is connected to the wiring 83B below the signal pad wiring 52A via the via 85A, and the wiring 83B below the signal pad wiring 52A is connected to a high-resistance element 101 and a wiring 83C in the wiring layer 81 via a via 85B and other parts. A VSS potential (for example, GND) is supplied to the wiring 83C. The VSS potential is also supplied to the wirings 83A and 83B via the high-resistance element 101, and the wirings 83A and 83B constitute a shield wiring.
The signal pad wiring 52A to which the output signal SIG is input is formed to have a smaller plane area than the pad 51 of the pixel substrate 41, within a range that allows the current value of the output signal SIG. In the example of
The wiring 83A is disposed at the same layer level as the signal pad wiring 52A to surround the signal pad wiring 52A, and the wiring 83B is formed at a position to overlap with the signal pad wiring 52A and the wiring 83A in plan view. The wiring 83B is connected to the high-resistance element 101 via one or more vias (including the via 85B).
The pad 51 of the pixel substrate 41 in
Although capacitances 102A and 102B, which function as a pad coupling capacitance, are generated between the signal pad wiring 52A and the adjacent wirings 83A and 83B, which are respectively disposed at the same layer level as the signal pad wiring 52A and below it, the pad coupling capacitance can be reduced by forming the signal pad wiring 52A with the minimum area. In addition, the shield capacitances (capacitances 102A and 102B) can be invalidated by supplying the VSS potential to the wirings 83A and 83B via the high-resistance element 101. Although a parasitic capacitance 103, of the kind generated between upper and lower wirings 83, is also generated between the wiring 83B and a well region 111 (or a P-sub substrate region) of the semiconductor substrate 82, the parasitic capacitance 103 can likewise be invalidated by supplying the VSS potential to the wiring 83B via the high-resistance element 101.
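As a rough illustration of why shrinking the plane area of the signal pad wiring 52A reduces the pad coupling capacitance, the following sketch applies a simple parallel-plate estimate. The dimensions, spacing, and permittivity are assumptions made for illustration only and are not values from the present disclosure.

```python
# Parallel-plate estimate of the vertical coupling capacitance between the
# signal pad wiring and the wiring below it.  All numbers are assumptions.
EPS_0 = 8.854e-12   # vacuum permittivity [F/m]
EPS_R = 4.0         # relative permittivity of an SiO2-based interlayer film (assumed)
D = 0.5e-6          # vertical spacing between the wiring levels [m] (assumed)

def plate_cap_fF(area_um2):
    """Parallel-plate capacitance in femtofarads for a plate area in um^2."""
    return EPS_0 * EPS_R * (area_um2 * 1e-12) / D * 1e15

pad_sized_area = 60.0 * 60.0    # wiring as large as a 60 um x 60 um pad (assumed)
minimum_area = 10.0 * 10.0      # minimum-area signal pad wiring (assumed)
print(f"pad-sized wiring   : {plate_cap_fF(pad_sized_area):.0f} fF")
print(f"minimum-area wiring: {plate_cap_fF(minimum_area):.1f} fF")
```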
The effect of the wiring structure on the control substrate 42 side, which is connected to the signal pad 51, will be described with reference to
A of
A step response will be considered in a case where a pulse signal changing from 0 V to 1 V is output from a signal source 150 as the output signal SIG.
In the present RC circuit in A of
In the comparative RC circuit in B of
In the present RC circuit and the comparative RC circuit, conditions other than the resistance 153 corresponding to the high-resistance element 101 are the same. For example, as illustrated in
As is apparent with reference to
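Because the referenced figures are not reproduced here, the following numerical sketch illustrates the same comparison with a lumped RC model. The circuit topology and all element values are assumptions made for illustration and do not come from the present disclosure: a coupling capacitance connects the signal node to a shield node, and the shield is returned to GND either through a high resistance corresponding to the high-resistance element 101 (resistance 153, the present circuit) or through a negligible resistance (the comparative circuit).

```python
# Lumped RC sketch comparing a shield returned to GND through a high
# resistance ("present") with a shield tied almost directly to GND
# ("comparative").  All element values are assumptions for illustration.

def rise_time_10_90(r_src, c_couple, c_shield, r_shield,
                    v_step=1.0, dt=1e-13, t_end=5e-9):
    """10%-90% rise time of the signal node for a 0 V -> v_step step input."""
    v_sig = 0.0   # signal node (driven through r_src, loaded by c_couple)
    v_sh = 0.0    # shield node (c_couple to signal, c_shield and r_shield to GND)
    t, t10, t90 = 0.0, None, None
    while t < t_end:
        i_src = (v_step - v_sig) / r_src                 # driver current
        dv_sh = (i_src - v_sh / r_shield) / c_shield     # shield node equation
        dv_sig = dv_sh + i_src / c_couple                # signal node equation
        v_sig += dv_sig * dt
        v_sh += dv_sh * dt
        t += dt
        if t10 is None and v_sig >= 0.1 * v_step:
            t10 = t
        if v_sig >= 0.9 * v_step:
            t90 = t
            break
    return None if t10 is None or t90 is None else t90 - t10

R_SRC = 50.0         # driver output resistance [ohm] (assumed)
C_COUPLE = 1e-12     # pad coupling capacitance (capacitance 102) [F] (assumed)
C_SHIELD = 0.2e-12   # shield-to-substrate parasitic (capacitance 103) [F] (assumed)

t_present = rise_time_10_90(R_SRC, C_COUPLE, C_SHIELD, r_shield=100e3)
t_compare = rise_time_10_90(R_SRC, C_COUPLE, C_SHIELD, r_shield=1.0)
print(f"present (high-resistance element): {t_present * 1e12:.0f} ps")
print(f"comparative (shield tied to GND):  {t_compare * 1e12:.0f} ps")
```

Under these assumed values, the step response of the signal node settles considerably faster in the present circuit, which is consistent with the pad coupling capacitance being effectively invalidated by the high-resistance element.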
As described above, according to the wiring structure on the control substrate 42 side as illustrated in
As described above, as long as the current value of the output signal SIG is allowed for the signal pad wiring 52A on the control substrate 42 side, the pattern shapes are not required to be uniform, and it is sufficient to form the signal pad wiring 52A in a planar shape suitable for each application. It is sufficient to determine the wiring pattern (planar shape) of the signal pad wiring 52A according to the impedance allowed for connection to the signal pad 51 of the pixel substrate 41 or other components.
A to E of
A of
B of
C of
D of
E of
The wiring pattern of the signal pad wiring 52A may also have a pattern shape other than the examples illustrated in A to E of
A to D of
The wiring pattern of the wiring 83B disposed below the signal pad wiring 52A can include solid film patterns as illustrated in A and B of
A and B of
The wiring pattern of the wiring 83A can have a pattern shape surrounding the signal pad wiring 52A to which the output signal SIG is input from three sides in the horizontal direction and the vertical direction as a U-shaped pattern as illustrated in A of
Alternatively, in a case where a distance from another pad adjacent in the vertical direction in
In the above-described example, both the wiring 83A and the wiring 83B serving as the shield wiring are disposed around the signal pad wiring 52A to which the output signal SIG is input; however, only one of the wiring 83A at the same layer level as the signal pad wiring 52A and the wiring 83B below the signal pad wiring 52A may be used.
In the description so far, the wiring pattern examples of the signal pad wiring 52A for signal application such as a pixel signal and a control signal, and the wiring 83A and the wiring 83B formed around the signal pad wiring 52A have been described. Meanwhile, as described with reference to
A to C of
The power supply or ground pad wiring 52B is formed to have a high pattern density so as to have a low impedance, because it is required to allow a sufficient current to flow. For example, as illustrated in A of
Next, configuration examples of the high-resistance element 101 connected to the wirings 83A and 83B serving as the shield wiring will be described with reference to
For example, as illustrated in A of
In addition, for example, as illustrated in B of
In addition, for example, as illustrated in C of
In addition, for example, as illustrated in D of
In addition, for example, as illustrated in A of
The high-resistance element 101 can be realized with a small area by using the parasitic resistance of the wiring 83 or the via 85.
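As a back-of-the-envelope illustration of realizing the high-resistance element from wiring and via parasitics, the following sketch estimates the series resistance of a narrow meandered wiring with stacked vias. The material constants, dimensions, and per-via resistance are assumptions made for illustration only, and the actual target resistance depends on the signal rate and the capacitances involved.

```python
# Series resistance of a narrow meandered wiring plus its vias, as one way of
# realizing the high-resistance element 101 with a small area.
# All dimensions and resistances below are assumptions.
RHO_CU = 1.7e-8     # resistivity of Cu [ohm*m]
T_WIRE = 0.2e-6     # wiring thickness [m] (assumed)
W_WIRE = 0.1e-6     # wiring width [m] (assumed)
R_VIA = 2.0         # resistance per via [ohm] (assumed)

def parasitic_resistance(length_um, n_vias):
    """Resistance of a meandered wiring of the given total length plus n_vias vias."""
    r_wire = RHO_CU * (length_um * 1e-6) / (W_WIRE * T_WIRE)
    return r_wire + n_vias * R_VIA

# e.g. a 2 mm meander in a minimum-width wiring with 20 vias in series
print(f"{parasitic_resistance(2000.0, 20):.0f} ohm")   # -> about 1.7 kohm
```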
In the example described above, the VSS potential (for example, GND) is supplied to the wirings 83A and 83B serving as the shield wiring via the high-resistance element 101.
However, as the potential supplied to the wirings 83A and 83B via the high-resistance element 101, a potential different from the VSS potential may be supplied. More specifically, a Vcom potential (Vcom>VSS) higher than the VSS potential can be supplied to the wirings 83A and 83B via the high-resistance element 101.
In the I/F circuit 18, a Vouth potential and a VSS potential generated by a regulator 251 are supplied to a driver 252. The driver 252 generates an output signal SIG which includes differential signals with the VSS potential that is an OFF potential and the Vouth potential that is an ON potential, and outputs the output signal SIG to the signal pad wiring 52A. At this time, in a case where noise is included in a well with the VSS potential in the semiconductor substrate 82, the noise of the semiconductor substrate 82 is superimposed on the output signal SIG via the capacitance 102 serving as a pad coupling capacitance to cause jitter.
In a case where the Vcom potential is supplied to the wirings 83A and 83B serving as the shield wiring, the capacitance 102 serving as the pad coupling capacitance is coupled with respect to the Vcom potential even in a case where the noise is included in the well with the VSS potential of the semiconductor substrate 82. Therefore, the noise of the semiconductor substrate 82 is separated and is not superimposed on the output signal SIG to cause jitter.
In the I/F circuit 18, a regulator 251H generates a Vouth potential that is an ON potential and supplies the Vouth potential to the driver 252. A regulator 251L generates a Voutl potential that is an OFF potential and supplies the Voutl potential to the driver 252. The driver 252 generates an output signal SIG which includes differential signals with a Voutl potential that is an OFF potential and a Vouth potential that is an ON potential, and outputs the output signal SIG to the signal pad wiring 52A. In a case where noise is included in the well with the VSS potential in the semiconductor substrate 82, the noise is superimposed on the output signal SIG as jitter because the noise is not in the same phase as the variation of the output signal SIG.
In a case where the output signal SIG is generated on the basis of the Vcom potential, and the Vcom potential is supplied to the wirings 83A and 83B serving as the shield wiring, the capacitance 102 serving as the pad coupling capacitance is coupled with respect to the Vcom potential even in a case where the noise is included in the well with the VSS potential of the semiconductor substrate 82. Therefore, the noise of the semiconductor substrate 82 is separated and is not superimposed on the output signal SIG to cause jitter.
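To make the coupling mechanism concrete, the following sketch evaluates the capacitive divider through which noise on the shield side would reach the signal node; the capacitance and noise values are assumptions and are not from the present disclosure. When the shield wiring is referenced to a quiet Vcom instead of the noisy VSS, this path ideally carries no noise.

```python
# Capacitive-divider estimate of substrate noise coupled onto the output
# signal SIG through the pad coupling capacitance 102.  Values are assumed.
C_COUPLE = 1e-12    # pad coupling capacitance 102 [F] (assumed)
C_NODE = 5e-12      # remaining capacitance at the signal node [F] (assumed)
V_NOISE = 0.05      # noise amplitude on the VSS well [V] (assumed)

def coupled_noise(v_shield_noise):
    """Noise transferred to the signal node by capacitive division."""
    return v_shield_noise * C_COUPLE / (C_COUPLE + C_NODE)

print(f"shield referenced to noisy VSS : {coupled_noise(V_NOISE) * 1e3:.1f} mV")
print(f"shield referenced to quiet Vcom: {coupled_noise(0.0) * 1e3:.1f} mV")
```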
As described above, the VSS potential (for example, GND) may be supplied to the wirings 83A and 83B serving as the shield wiring via the high-resistance element 101, but the Vcom potential different from the VSS potential (Vcom > VSS) can be supplied instead to suppress the superimposition of the VSS noise on the output signal SIG. Since the signal pad wiring 52A is formed with a plane area small enough to still allow the current value of the output signal SIG, the space in the wiring layer 81 is widened, thereby improving the degree of freedom of the wiring 83. By using this, the Vcom potential different from the VSS potential can be supplied to the wirings 83A and 83B serving as the shield wiring to reduce the charge and discharge of the pad coupling capacitance.
The above-described solid-state imaging device 1 can be used as an image sensor in various cases of sensing light such as visible light, infrared light, ultraviolet light, and X-rays as described below, for example.
Application of the technology of the present disclosure is not limited to solid-state imaging devices. That is, the technology of the present disclosure can be applied to general electronic devices using a solid-state imaging device as an image capturing section (photoelectric conversion section), such as imaging devices including digital still cameras and video cameras, mobile terminal devices having an imaging function, and copying machines using a solid-state imaging device as an image reading unit. The solid-state imaging device may be formed as one chip, or may be in a module form having an imaging function in which an imaging section and a signal processing section or an optical system are packaged together.
An imaging device 300 in
The optical section 301 captures incident light (image light) from a subject and forms an image on an imaging surface of the solid-state imaging device 302. The solid-state imaging device 302 converts the light amount of the incident light imaged on the imaging surface by the optical section 301 into an electrical signal in units of pixels and outputs the electrical signal as a pixel signal. As the solid-state imaging device 302, it is possible to use the solid-state imaging device 1 in
The display section 305 includes, for example, a thin display such as a liquid crystal display (LCD), an organic electroluminescence (EL) display, or other displays, and displays a moving image or a still image captured by the solid-state imaging device 302. The recording section 306 records the moving image or the still image captured by the solid-state imaging device 302 on a recording medium such as a hard disk, a semiconductor memory, or other recording media.
The operation section 307 issues operation commands for various functions of the imaging device 300 under operation by the user. The power supply section 308 appropriately supplies various power sources serving as operation power sources of the DSP circuit 303, the frame memory 304, the display section 305, the recording section 306, and the operation section 307 to these supply targets.
As described above, the configuration of the above-described solid-state imaging device 1 can be employed as the solid-state imaging device 302 to achieve the high-speed communication interface. Therefore, even in the imaging device 300 such as a video camera, a digital still camera, or a camera module for a mobile device such as a mobile phone or other devices, images can be captured at an increased rate.
The technology according to the present disclosure (present technology) can be applied to various products. For example, the technology of the present disclosure may be achieved in the form of a device to be mounted on a mobile object of any kind, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility, an airplane, a drone, a vessel, or a robot.
The vehicle control system 12000 includes a plurality of electronic control units connected to each other via a communication network 12001. In the example illustrated in
The driving system control unit 12010 controls the operation of devices related to the driving system of the vehicle in accordance with various kinds of programs. For example, the driving system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine, a driving motor, or the like, a driving force transmitting mechanism for transmitting the driving force to wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
The body system control unit 12020 controls the operation of various kinds of devices provided to a vehicle body in accordance with various kinds of programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various kinds of lamps such as a headlamp, a backup lamp, a brake lamp, a turn signal, a fog lamp, or the like. In this case, radio waves transmitted from a mobile device as an alternative to a key or signals of various kinds of switches can be input to the body system control unit 12020. The body system control unit 12020 receives these input radio waves or signals, and controls a door lock device, the power window device, the lamps, or the like of the vehicle.
The outside-vehicle information detecting unit 12030 detects information about the outside of the vehicle including the vehicle control system 12000. For example, the outside-vehicle information detecting unit 12030 is connected with an imaging section 12031. The outside-vehicle information detecting unit 12030 makes the imaging section 12031 image an image of the outside of the vehicle, and receives the imaged image. On the basis of the received image, the outside-vehicle information detecting unit 12030 may perform processing of detecting an object such as a human, a vehicle, an obstacle, a sign, a character on a road surface, or the like, or processing of detecting a distance thereto.
The imaging section 12031 is an optical sensor that receives light and outputs an electric signal corresponding to the amount of the received light. The imaging section 12031 can output the electric signal as an image, or can output the electric signal as information about a measured distance. In addition, the light received by the imaging section 12031 may be visible light, or may be invisible light such as infrared rays or the like.
The in-vehicle information detecting unit 12040 detects information about the inside of the vehicle. The in-vehicle information detecting unit 12040 is, for example, connected with a driver state detecting section 12041 that detects the state of a driver. The driver state detecting section 12041, for example, includes a camera that images the driver. On the basis of detection information input from the driver state detecting section 12041, the in-vehicle information detecting unit 12040 may calculate a degree of fatigue of the driver or a degree of concentration of the driver, or may determine whether the driver is dozing.
The microcomputer 12051 can calculate a control target value for the driving force generating device, the steering mechanism, or the braking device on the basis of the information about the inside or outside of the vehicle which information is obtained by the outside-vehicle information detecting unit 12030 or the in-vehicle information detecting unit 12040, and output a control command to the driving system control unit 12010. For example, the microcomputer 12051 can perform cooperative control intended to implement functions of an advanced driver assistance system (ADAS) which functions include collision avoidance or shock mitigation for the vehicle, following driving based on a following distance, vehicle speed maintaining driving, a warning of collision of the vehicle, a warning of deviation of the vehicle from a lane, or the like.
In addition, the microcomputer 12051 can perform cooperative control intended for automated driving, which makes the vehicle travel automatedly without depending on the operation of the driver, or the like, by controlling the driving force generating device, the steering mechanism, the braking device, or the like on the basis of the information about the outside or inside of the vehicle which information is obtained by the outside-vehicle information detecting unit 12030 or the in-vehicle information detecting unit 12040.
In addition, the microcomputer 12051 can output a control command to the body system control unit 12020 on the basis of the information about the outside of the vehicle which is obtained by the outside-vehicle information detecting unit 12030. For example, the microcomputer 12051 can perform cooperative control intended to prevent a glare by controlling the headlamp so as to change from a high beam to a low beam, for example, in accordance with the position of a preceding vehicle or an oncoming vehicle detected by the outside-vehicle information detecting unit 12030.
The sound/image output section 12052 transmits an output signal of at least one of a sound and an image to an output device capable of visually or auditorily notifying information to an occupant of the vehicle or the outside of the vehicle. In the example of
In
The imaging sections 12101, 12102, 12103, 12104, and 12105 are, for example, disposed at positions on a front nose, sideview mirrors, a rear bumper, and a back door of the vehicle 12100 as well as a position on an upper portion of a windshield within the interior of the vehicle, and the like. The imaging section 12101 provided to the front nose and the imaging section 12105 provided to the upper portion of the windshield within the interior of the vehicle obtain mainly an image of the front of the vehicle 12100. The imaging sections 12102 and 12103 provided to the sideview mirrors obtain mainly an image of the sides of the vehicle 12100.
The imaging section 12104 provided to the rear bumper or the back door obtains mainly an image of the rear of the vehicle 12100. The forward images obtained by the imaging sections 12101 and 12105 are used mainly to detect a preceding vehicle, a pedestrian, an obstacle, a traffic signal, a traffic sign, a lane, or the like.
Note that
At least one of the imaging sections 12101 to 12104 may have a function of obtaining distance information. For example, at least one of the imaging sections 12101 to 12104 may be a stereo camera constituted of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
For example, the microcomputer 12051 can determine a distance to each three-dimensional object within the imaging ranges 12111 to 12114 and a temporal change in the distance (relative speed with respect to the vehicle 12100) on the basis of the distance information obtained from the imaging sections 12101 to 12104, and thereby extract, as a preceding vehicle, a nearest three-dimensional object in particular that is present on a traveling path of the vehicle 12100 and which travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, equal to or more than 0 km/hour). Further, the microcomputer 12051 can set a following distance to be maintained in front of a preceding vehicle in advance, and perform automatic brake control (including following stop control), automatic acceleration control (including following start control), or the like. It is thus possible to perform cooperative control intended for automated driving that makes the vehicle travel automatedly without depending on the operation of the driver or the like.
For example, the microcomputer 12051 can classify three-dimensional object data on three-dimensional objects into three-dimensional object data of a two-wheeled vehicle, a standard-sized vehicle, a large-sized vehicle, a pedestrian, a utility pole, and other three-dimensional objects on the basis of the distance information obtained from the imaging sections 12101 to 12104, extract the classified three-dimensional object data, and use the extracted three-dimensional object data for automatic avoidance of an obstacle. For example, the microcomputer 12051 identifies obstacles around the vehicle 12100 as obstacles that the driver of the vehicle 12100 can recognize visually and obstacles that are difficult for the driver of the vehicle 12100 to recognize visually. Then, the microcomputer 12051 determines a collision risk indicating a risk of collision with each obstacle. In a situation in which the collision risk is equal to or higher than a set value and there is thus a possibility of collision, the microcomputer 12051 outputs a warning to the driver via the audio speaker 12061 or the display section 12062, and performs forced deceleration or avoidance steering via the driving system control unit 12010. The microcomputer 12051 can thereby assist in driving to avoid collision.
At least one of the imaging sections 12101 to 12104 may be an infrared camera that detects infrared rays. The microcomputer 12051 can, for example, recognize a pedestrian by determining whether or not there is a pedestrian in imaged images of the imaging sections 12101 to 12104. Such recognition of a pedestrian is, for example, performed by a procedure of extracting characteristic points in the imaged images of the imaging sections 12101 to 12104 as infrared cameras and a procedure of determining whether or not it is the pedestrian by performing pattern matching processing on a series of characteristic points representing the contour of the object. When the microcomputer 12051 determines that there is a pedestrian in the imaged images of the imaging sections 12101 to 12104, and thus recognizes the pedestrian, the sound/image output section 12052 controls the display section 12062 so that a square contour line for emphasis is displayed so as to be superimposed on the recognized pedestrian. The sound/image output section 12052 may also control the display section 12062 so that an icon or the like representing the pedestrian is displayed at a desired position.
An example of the vehicle control system to which the technology according to the present disclosure can be applied has been described above. The technology according to the present disclosure may be applied to the imaging section 12031 in the configuration described above. Specifically, the solid-state imaging device 1 described above can be applied as the imaging section 12031. By applying the technology according to the present disclosure to the imaging section 12031, it is possible to obtain a more easily-viewed imaged image and obtain distance information while achieving the high-speed communication interface. In addition, the obtained imaged image and distance information can be used to reduce driver's fatigue and improve the safety level of the driver and the vehicle.
In addition, the present disclosure is not limited to application to a solid-state imaging device that detects distribution of the amount of incident light of visible light and captures the distribution as an image, and can be applied to all solid-state imaging devices (physical quantity distribution detection devices) such as a solid-state imaging device that captures distribution of the amount of incident infrared rays, X-rays, particles, or the like as an image, and a fingerprint detection sensor that detects distribution of other physical quantities such as pressure and capacitance and captures the distribution as an image in a broad sense.
In addition, the technology of the present disclosure is applicable not only to a solid-state imaging device but also to all semiconductor devices having other semiconductor integrated circuits.
The embodiment of the present disclosure is not limited to the above-described embodiments, and various modifications may be made without departing from the gist of the technology of the present disclosure.
Note that, the effects described in the present specification are merely examples and are not limited, and there may be effects other than those described in the present specification.
Note that, the technology of the present disclosure can have the following configurations.
(1)
A solid-state imaging device including: a pixel substrate on which a pixel is formed; and a control substrate, the pixel substrate and the control substrate being stacked,
(2)
The solid-state imaging device according to (1),
(3)
The solid-state imaging device according to (2),
(4)
The solid-state imaging device according to (2),
(5)
The solid-state imaging device according to any one of (1) to (4),
(6)
The solid-state imaging device according to any one of (1) to (5),
(7)
The solid-state imaging device according to any one of (1) to (5),
(8)
The solid-state imaging device according to any one of (1) to (5),
(9)
The solid-state imaging device according to any one of (1) to (8),
(10)
The solid-state imaging device according to any one of (1) to (8),
(11)
The solid-state imaging device according to any one of (1) to (10),
(12)
The solid-state imaging device according to any one of (1) to (11),
(13)
The solid-state imaging device according to (12),
(14)
An electronic device including a solid-state imaging device,
| Number | Date | Country | Kind |
|---|---|---|---|
| 2022-072882 | Apr 2022 | JP | national |
| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/JP2023/014547 | 4/10/2023 | WO | |