The present disclosure relates to a photoelectric conversion apparatus and device.
Japanese Patent Laid-Open No. 2018/113606 discloses a photoelectric conversion apparatus constituted by semiconductor substrates stacked in three layers.
In Japanese Patent Laid-Open No. 2018/113606, a through-electrode is provided in an insulating region that penetrates through a second substrate, so that the area of the second substrate where elements can be arranged is limited.
An aspect of the present disclosure is a photoelectric conversion apparatus including a first component and a second component. The first component includes a first semiconductor substrate, a first photoelectric conversion circuit, a second photoelectric conversion circuit, a floating diffusion, a first transfer transistor, and a second transfer transistor. The first semiconductor substrate has a first plane and a second plane facing the first plane. The first photoelectric conversion circuit is configured to receive light from the second plane. The second photoelectric conversion circuit is configured to receive light from the second plane. The first transfer transistor is provided on a side where the first plane is provided and is configured to transfer signal charge generated in the first photoelectric conversion circuit to the floating diffusion. The second transfer transistor is provided on the side where the first plane is provided and is configured to transfer signal charge generated in the second photoelectric conversion circuit to the floating diffusion. The second component includes a second semiconductor substrate, an insulator, a first amplification transistor, and a second amplification transistor. The second semiconductor substrate has a third plane and a fourth plane facing the third plane. The insulator is configured to penetrate through the second semiconductor substrate from the third plane to the fourth plane or from the fourth plane to the third plane. The first amplification transistor is configured to receive a signal via the first transfer transistor. The second amplification transistor is configured to receive a signal via the second transfer transistor. The second component is stacked on the first component. A polysilicon member that is a gate of the first transfer transistor is also a gate of the second transfer transistor, and the polysilicon member and a through-electrode configured to penetrate through the insulator are electrically connected to each other.
Further features of the disclosure will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
In the following, each embodiment is described with reference to the drawings.
In each of the following embodiments, an image pickup apparatus is focused on as an example of a photoelectric conversion apparatus. Note that the individual embodiments are not limited to image pickup apparatuses and can be applied to other examples of photoelectric conversion apparatuses, for example, distance measuring devices (devices for, for example, distance measurement using focus detection or time of flight (TOF)) and light metering devices (devices for, for example, measuring the amount of incident light).
The conductivity types of the semiconductor regions and wells and the dopants implanted in the embodiments described below are examples, and are not limited to only the conductivity types and dopants described in the embodiments. The conductivity types and dopants described in the embodiments can be changed as needed, and the potentials of the semiconductor regions and wells are changed as needed in accordance with this change.
Note that the conductivity types of the transistors described in the following embodiments are examples and are not limited to only the conductivity types described in the exemplary embodiments. The conductivity type of each transistor described in the embodiments can be changed as needed, and the gate, source, and drain potentials of the transistor are changed in accordance with this change.
For example, for a transistor to operate as a switch, it is sufficient that the low and high levels of the potential supplied to the gate be switched as the conductivity type is changed, in contrast to the description in the exemplary embodiments. The conductivity types of semiconductor regions described in the following exemplary embodiments are also examples and are not limited to only the conductivity types described in the exemplary embodiments. The conductivity types described in the exemplary embodiments can be changed as needed, and the potentials of the semiconductor regions can be changed in accordance with this change.
In the following embodiments, connections between circuit elements may also be mentioned. In this case, even in a case where there is another element between elements of interest, the elements of interest are treated as connected, unless otherwise noted. For example, suppose that an element A is connected to one node of a capacitive element C having a plurality of nodes, and an element B is connected to another node of the capacitive element C. Even in such a case, the elements A and B are treated as connected, unless otherwise noted.
Metal members such as wiring lines and pads described herein may be composed of a single metal of one element or of a mixture (an alloy). For example, wiring lines described as copper wiring lines may be composed of copper alone or may be composed primarily of copper together with other components. For example, pads that are connected to external terminals may be composed of aluminum alone or may be composed primarily of aluminum together with other components. The copper wiring lines and aluminum pads described here are examples and can be changed to various other metals.
The wiring lines and pads described here are examples of metal members used in photoelectric conversion apparatuses and may be applicable to other metal members.
The configuration common to each embodiment of a photoelectric conversion apparatus, which is an example of a semiconductor device according to the present disclosure, will be described using
The photoelectric conversion apparatus 100 includes two substrates which are stacked one on top of the other and are electrically connected to each other. The two substrates are a sensor substrate 11 and a circuit substrate 21. The sensor substrate 11 has a first semiconductor layer (a first semiconductor substrate) and a first wiring layer. The first semiconductor layer has photoelectric conversion elements 102, which will be described later. The circuit substrate 21 has a second semiconductor layer (a second semiconductor substrate) and a second wiring layer. The second semiconductor layer has, for example, signal processing units 103, which will be described later. In the photoelectric conversion apparatus described in each embodiment, the surface of a first semiconductor layer 300 that is in contact with a first wiring layer 301 is called the front surface (a first plane) of the first semiconductor layer 300, and the surface of the first semiconductor layer 300 on the opposite side from the front surface is called the rear surface (a second plane) of the first semiconductor layer 300. Similarly, the surface of a second semiconductor layer 400 that is in contact with a second wiring layer 401 is called the front surface (a third plane) of the second semiconductor layer 400, and the surface of the second semiconductor layer 400 on the opposite side from the third plane is called the rear surface (a fourth plane) of the second semiconductor layer 400. In the following, the sensor substrate 11 and the circuit substrate 21 will be described as chips obtained by dicing; however, the sensor substrate 11 and the circuit substrate 21 are not limited to such chips. For example, each substrate may be a wafer. The individual substrates may be stacked one on top of the other in a wafer state and then be subjected to dicing. Alternatively, the individual substrates may be divided into chips, and the chips may be stacked one on top of the other and joined to each other.
A pixel region 12 is arranged on the sensor substrate 11, and a circuit region 22, which processes signals detected by the pixel region 12, is arranged on the circuit substrate 21.
Typically, the pixels 101 are pixels for forming an image; however, the pixels 101 do not have to form an image when used for time of flight (TOF). That is, the pixels 101 may also be used to measure the time of arrival of light and the amount of light.
The photoelectric conversion elements 102 in
The vertical scanning circuit 110 receives a control pulse supplied from the control pulse generation unit 115 and supplies the control pulse to each pixel. In the vertical scanning circuit 110, a logic circuit such as a shift register or an address decoder is used.
In each pixel 101, a signal output from the photoelectric conversion element 102 is processed by the signal processing unit 103. The signal processing unit 103 is provided with a reset transistor, an amplification transistor, a memory, and so forth described below.
To read out signals from the individual pixels, the horizontal scanning circuit 111 inputs, into the signal processing units 103, a control pulse for sequentially selecting a column.
Regarding a selected column, a signal is output from the signal processing unit 103 of the pixel selected by the vertical scanning circuit 110 to a corresponding one of the signal lines 113.
The signal output to the signal line 113 is output through an output circuit 114 to a recording unit or a signal processing unit outside the photoelectric conversion apparatus 100.
In
As illustrated in
Note that the arrangement of the signal lines 113, the column circuit 112, and the output circuit 114 is not limited to the arrangement illustrated in
In the following, photoelectric conversion apparatuses according to the individual embodiments will be described. Note that the sensor substrate 11 described above may also be called a first substrate or a first component, and the circuit substrate 21 described above may also be called a second substrate or a second component. Moreover, the rear surface side of the first semiconductor layer defined above may also be called a light-incident side.
Each pixel 101 has, for example, a photodiode PD, a transfer transistor TX electrically connected to the photodiode PD, and a floating diffusion FD. The signal processing unit 103 temporarily holds, in the floating diffusion FD, electric charge output from the photodiode PD via the transfer transistor TX. The floating diffusion FD is connected to an input node of an amplification transistor AMP. The photodiode PD performs photoelectric conversion to generate electric charge corresponding to the amount of light received. The cathode of the photodiode PD is electrically connected to the source of the transfer transistor TX, and an electric potential applied to the well region is applied to the anode of the photodiode PD.
That is, the anode of the photodiode PD is electrically connected to a reference potential line (for example, ground potential). Moreover, the photodiode PD is provided in the well region connected to this reference potential line. The drain of the transfer transistor TX is electrically connected to the floating diffusion FD, and the gate of the transfer transistor TX is electrically connected to a pixel drive line. The transfer transistor TX is, for example, a complementary metal-oxide-semiconductor (CMOS) transistor.
The signal processing unit 103 includes, for example, a reset transistor RES, a selection transistor SEL, and an amplification transistor SF. Note that the selection transistor SEL may be omitted as needed. Moreover, the electrical path between the reset transistor RES and the floating diffusion FD may be further equipped with a transistor FDINC to change the capacitance value of the floating diffusion FD.
The source of the reset transistor RES (the input terminal of the signal processing unit 103) is electrically connected to the floating diffusion FD. The drain of the reset transistor RES is electrically connected to a power supply line (SVDD) and the drain of the amplification transistor SF. The gate of the reset transistor RES is electrically connected to the pixel drive line. The source of the amplification transistor SF is electrically connected to the drain of the selection transistor SEL, and the gate of the amplification transistor SF is electrically connected to the source of the reset transistor RES. The source of the selection transistor SEL (the output terminal of the signal processing unit 103) is electrically connected to a pixel output line, and the gate of the selection transistor SEL is electrically connected to the pixel drive line.
When the transfer transistor TX enters its ON state, the electric charge of the photodiode PD is transferred to the floating diffusion FD. The reset transistor RES resets the electric potential of the floating diffusion FD to a predetermined potential. When the reset transistor RES enters its ON state, the electric potential of the floating diffusion FD is reset to the electric potential of the power supply line (SVDD). The selection transistor SEL controls the output timing of a pixel signal from the signal processing unit 103. The amplification transistor SF generates, as a pixel signal, a voltage signal corresponding to the level of the electric charge held in the floating diffusion FD. The amplification transistor SF constitutes a source follower type amplifier, which outputs a pixel signal that is a voltage signal corresponding to the level of electric charge generated by the photodiode PD. When the selection transistor SEL enters its ON state, the amplification transistor SF amplifies the electric potential of the floating diffusion FD and outputs a voltage corresponding to the resulting electric potential to a column signal processing circuit via the pixel output line. The reset transistor RES, the amplification transistor SF, and the selection transistor SEL are CMOS transistors, for example.
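As a purely illustrative aid, and not part of the disclosed configuration, the following sketch models the reset, transfer, and readout sequence described above; the capacitance, supply voltage, and gain values are assumptions chosen only to make the arithmetic concrete.

```python
# Illustrative model of the reset/transfer/readout sequence (values are assumed).
Q_E = 1.602e-19  # elementary charge [C]

def pixel_readout(n_electrons, c_fd=2e-15, v_reset=2.8, sf_gain=0.85):
    """Model the floating diffusion (FD) and source-follower (SF) readout."""
    # Reset: RES turns on and the FD is set to the power-supply-line potential.
    v_fd = v_reset
    v_n = sf_gain * v_fd                 # reset (N) level at the SF output
    # Transfer: TX turns on and the signal electrons lower the FD potential.
    v_fd -= n_electrons * Q_E / c_fd
    v_s_plus_n = sf_gain * v_fd          # signal (S+N) level at the SF output
    # Correlated double sampling removes the reset level.
    return v_n - v_s_plus_n              # signal amplitude on the output line

print(f"signal amplitude: {pixel_readout(5000) * 1e3:.1f} mV")
```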
The transfer gate of the transfer transistor TX controls conduction between the photodiode PD and the floating diffusion FD of each pixel. A pixel isolation portion 201 is provided between a plurality of pixels 101 and electrically separates a plurality of semiconductor regions from each other. The pixel isolation portion 201 may include an insulating section such as silicon oxide or may be a semiconductor region that forms a potential barrier. Typically, the pixel isolation portion 201 is a semiconductor region whose primary carrier is a charge having polarity opposite to that of signal charge accumulated by the photodiode PD. A pixel separation layer is provided between the pixel isolation portion 201 and the photodiode PD. The pixel separation layer has the role of reducing dark current, especially when the pixel isolation portion 201 including an insulating section is provided. The floating diffusion FD and the gate of the amplification transistor SF are connected with a through-hole electrode interposed therebetween. The through-hole electrode is composed mainly of metals such as tungsten and copper. The through-hole electrode is formed so as to penetrate through an insulator 251 that separates the second semiconductor layer 400. The insulator 251 electrically separates the plurality of signal processing units 103 from each other. The insulator 251 is provided so as to penetrate through the second semiconductor layer 400 from the third plane to the fourth plane thereof.
The transfer transistors TX are scanned in row order for the pixels 101 arranged in an array in the pixel region 12. In the photoelectric conversion apparatus having a configuration illustrated in
A photoelectric conversion apparatus according to the present embodiment will be described using
As in the first embodiment, photodiodes PD, transfer transistors TX, and floating diffusions FD are provided in or on the first substrate. Elements such as reset transistors RES and selection transistors SEL, control lines, and signal lines are arranged in or on the second substrate.
A feature of the photoelectric conversion apparatus according to the present embodiment is that, among the four pixels 101 arranged in two rows and two columns, the pixels 101 that do not share the gate of a transfer transistor TX share a floating diffusion FD. By combining sharing of the gate of the transfer transistor TX and sharing of the floating diffusion FD, it is possible to reduce the number of through-electrodes 421 compared to that in the first embodiment. In addition, the number of circuits arranged in or on the second substrate can be reduced by sharing a signal processing unit 103 among a plurality of pixels 101. Thus, noise performance can be improved by increasing the ratio (W/L) of channel width (W) to channel length (L) of the amplification transistor SF, for example.
In the photoelectric conversion apparatus according to the present embodiment, the transfer transistors are controlled every two rows in the order of an even-numbered column (the m-th column) and an odd-numbered column (the (m+1)-th column). First, electric charge is transferred to each FD in the n-th and (n+1)-th rows simultaneously. Specifically, first, the transfer transistors spanning the n-th and (n+1)-th rows and arranged in the even-numbered column are turned on. Then, a signal at (n, m) is transferred to the floating diffusion FD arranged in the n-th row. At the same time, a signal at (n+1, m) is transferred to the floating diffusion FD arranged in the (n+1)-th row.
Since a pixel circuit is independently provided for each floating diffusion FD, signals from all the floating diffusions FD are read out via the signal lines, column circuit, and output circuit during one scanning period. Next, the transfer transistors spanning the n-th and (n+1)-th rows and arranged in the odd-numbered column are turned on. The subsequent procedure is the same as the operation described above.
In other words, a first group of transfer transistors including the transfer transistors corresponding to the pixels 101 at (n, m) and (n+1, m) is controlled by a first control signal. A second group of transfer transistors including the transfer transistors corresponding to the pixels 101 at (n, m+1) and (n+1, m+1) is also controlled by the first control signal. Note that the way in which driving is performed is not limited to this, and scanning may be performed sequentially in the column direction, for example. In this case, the direction in which the transfer transistors included in the first group of transfer transistors are arranged (vertical direction) intersects with the direction in which the floating diffusions FD connected to the transfer transistors included in the first group of transfer transistors are arranged (horizontal direction).
Electric charge is transferred at the same timing as in the case of the structure illustrated in
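The column-by-column control of the shared transfer gates described above can be summarized, purely for illustration, by the following sketch; the function name and the print-based representation of the drive pulses are assumptions and do not correspond to actual drive circuitry.

```python
# Illustrative summary of the transfer-gate scanning (names and values assumed).
def scan_shared_gate_pixels(num_rows=4, num_cols=4):
    """Drive the shared transfer gates two rows at a time, even columns first."""
    for n in range(0, num_rows, 2):                    # rows n and n+1 share gates
        for parity, label in ((0, "even"), (1, "odd")):
            cols = list(range(parity, num_cols, 2))
            # One polysilicon gate spans rows n and n+1 in each selected column,
            # so a single pulse transfers both pixels' charge to their own FDs.
            print(f"TX pulse: rows ({n}, {n + 1}), {label}-numbered columns {cols}")
            # Each FD has its own pixel circuit, so both rows are then read out
            # through the signal lines, the column circuit, and the output circuit.
            print(f"read out FDs of rows ({n}, {n + 1}) for columns {cols}")

scan_shared_gate_pixels()
```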
A photoelectric conversion apparatus according to the present embodiment will be described using
A current source transistor BIAS, which functions as a constant current source, is connected between the output terminal of the amplification transistor SF and a reference potential Vss. When the current source transistor BIAS is to function as a constant current source, a predetermined level of voltage is applied. When the current source transistor BIAS is not to function as a constant current source (not used), a Low level (0 V) is applied. Although the constant current source is not an essential component, the configuration including the constant current source increases the speed of writing into the memories.
In the circuit configuration according to the present embodiment, for example, the following driving is assumed.
First, the reset transistor RES and the transfer transistor TX are turned on to reset the photodiode PD and start electric charge accumulation. During the electric charge accumulation period, the reset transistor RES and the transistor GS2 are turned on to reset the potentials of the floating diffusion FD, a node X, and a node Y.
Next, after a reset settling period, the transfer transistor TX is turned on, and a signal is transferred from the photodiode PD to the floating diffusion FD. In this case, the transistor GS1 is also turned on. The voltage level of the output from the amplification transistor SF is written to the node X through the transistor GS1, and voltage is also written to the node Y through capacitive coupling.
Signals of the nodes Y are sequentially read out in row order by operating selection transistors SEL1. The read-out signals are AD-converted by the column ADC in the subsequent stage.
Thereafter, the transistor GS2 is turned on to reset the node Y. After the transistors GS1 and GS2 are turned off, the reset level of the node Y is read out.
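Purely as an illustration of the sampling sequence described above, the following sketch tracks the potentials of node X and node Y; the coupling coefficient, reset potential, and source-follower gain are assumed values, not parameters of the disclosed circuit.

```python
# Illustrative model of the node-X / node-Y sampling sequence (values assumed).
def global_shutter_sample(v_fd, v_reset=2.0, coupling=0.9, sf_gain=0.85):
    """Return the (signal, reset) levels read from node Y for one pixel."""
    # During accumulation, RES and GS2 are on: the FD, node X, and node Y are reset.
    node_x = node_y = v_reset
    # After transfer, GS1 turns on and the SF output is written to node X;
    # node Y follows node X through the coupling capacitance.
    new_x = sf_gain * v_fd
    node_y += coupling * (new_x - node_x)
    node_x = new_x
    y_signal_level = node_y              # read out row by row via SEL1
    # GS2 then resets node Y; after GS1 and GS2 turn off, the reset level is read.
    y_reset_level = v_reset
    return y_signal_level, y_reset_level

sig, rst = global_shutter_sample(v_fd=1.5)
print(f"node Y signal level: {sig:.3f} V, reset level: {rst:.3f} V")
```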
The photoelectric conversion apparatus illustrated in
In the circuit with this configuration, a method can be used in which the N and S signals of the floating diffusion FD are stored, in a respective manner, in the memory units connected in parallel and read out in a sequential manner.
In the circuit with the configuration illustrated in
In contrast, the circuit configuration illustrated in
In this configuration, as in the configuration illustrated in
In the circuit configuration according to the present embodiment, for example, the following driving is assumed.
Description of the operation will be started from the point where the floating diffusion FD, the memory C1, and the memory C2 are reset. The transistors GS1 and GS2 are turned on, and the N signal at a reset level is read out and written into the memory C2.
The transfer transistor TX is turned on, and electric charge is transferred from the photodiode PD.
The transistor GS2 is turned off, the transistor GS1 is turned on, and the (S+N) signal is read out and written into the memory C1. After reading out the N signal to a correlated double sampling (CDS) circuit positioned in the subsequent stage, the transistor GS2 is turned on to distribute electric charge. By reading (S/2+N) to the CDS circuit and subtracting the N signal from (S/2+N), an S/2 signal whose pixel reset noise has been removed can be obtained.
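The charge redistribution described above can be checked with a short numerical sketch, given here only for illustration and assuming the memories C1 and C2 have equal capacitance.

```python
# Illustrative check of the charge-sharing readout (equal capacitances assumed).
def charge_sharing_cds(s=0.40, n=0.05):
    """Return the reset-noise-free half signal S/2 recovered by the CDS circuit."""
    c2 = n               # GS1 and GS2 on: reset level N is written into C2
    c1 = s + n           # GS2 off, GS1 on: (S+N) is written into C1
    n_read = c2          # the N signal is read out to the CDS circuit
    # GS2 on: C1 and C2 share charge and both settle to the average level,
    # which equals S/2 + N when the two capacitances are equal.
    shared = (c1 + c2) / 2.0
    return shared - n_read               # CDS subtraction leaves S/2

print(charge_sharing_cds())              # 0.2 == S/2 for S = 0.40
```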
The gates of the transfer transistors TX, which are shared in units of two rows and two columns among the pixels 101, can be connected more extensively by a polysilicon (POL) wiring line. For example, all pixels in the pixel array may be connected and driven simultaneously. In this way, the number of through-electrodes can be further reduced.
A photoelectric conversion apparatus according to the present embodiment will be described using
In the following, the sensor substrate 11, the circuit substrate 21, and the second circuit substrate 31 will be described as chips obtained by dicing; however, the sensor substrate 11, the circuit substrate 21, and the second circuit substrate 31 are not limited to such chips. For example, each substrate may be a wafer. The individual substrates may be stacked one on top of the other in a wafer state and then be subjected to dicing. Alternatively, the individual substrates may be divided into chips, and the chips may be stacked one on top of the other and joined to each other. The sensor substrate 11 may also be called the first substrate 11, the circuit substrate 21 may also be called the second substrate 21, and the second circuit substrate 31 may also be called a third substrate 31 or a third component.
The fourth embodiment differs from the third embodiment in that the second circuit substrate 31 and a second circuit region 32 are added. The signal processing units 103 are arranged across two substrates, which are the circuit substrate 21 and the second circuit substrate 31. The signal processing units arranged in or on the circuit substrate 21 are treated as signal processing units 103A, and the signal processing units arranged in or on the second circuit substrate 31 are treated as signal processing units 103B.
The photoelectric conversion elements 102 illustrated in
Normally, when hybrid bonding is used to bond wiring layers facing each other, it is difficult to electrically connect yet another semiconductor layer on a pixel-by-pixel basis. However, by using the configuration illustrated in the present embodiment, it is possible to achieve three-layer stacking that is obtained by electrically connecting the three layers on a pixel-by-pixel basis. With a structure in which the three semiconductor layers are stacked, the area of arrangement for the signal processing circuit can be increased, so that higher functionality can be achieved.
In the photoelectric conversion apparatus illustrated in
A further modification example of the configuration of the photoelectric conversion apparatus according to the present embodiment is illustrated using
Note that, to ensure the global shutter function of transferring signals at the same time, it is sufficient that the transfer transistors TX operate collectively and the signals be transferred to the floating diffusions FD. In other words, the timings at which the SF outputs after transfer are transferred to the memory units via the gates of the transistors GS1 do not have to match. It is sufficient that the selection transistors SEL and the gates of the transistors GS1 corresponding to the four pixels 101 be operated in the order of a, b, c, and d illustrated in
A photoelectric conversion system according to the present embodiment will be described using
The photoelectric conversion apparatuses described in the first to fourth embodiments described above can be applied to various types of photoelectric conversion systems. Examples of the photoelectric conversion systems to which the photoelectric conversion apparatuses described in the first to fourth embodiments described above can be applied include digital still cameras, digital camcorders, surveillance cameras, copiers, fax machines, mobile phones, vehicle-mounted cameras, and observation satellites. The examples of the photoelectric conversion systems also include a camera module having an optical system such as a lens and an image pickup apparatus.
The photoelectric conversion system illustrated in
The photoelectric conversion system includes a signal processing unit 1007, which is an image generation unit configured to generate an image by performing processing on an output signal output from the image pickup apparatus 1004. The signal processing unit 1007 performs various types of correction and compression as needed and outputs image data. The signal processing unit 1007 may be formed in or on a semiconductor substrate provided with the image pickup apparatus 1004 or may be formed in or on a semiconductor substrate different from the semiconductor substrate provided with the image pickup apparatus 1004.
The photoelectric conversion system further includes a memory unit 1010 for temporarily storing image data and an external interface (I/F) unit 1013 for communicating with an external computer or the like. Furthermore, the photoelectric conversion system includes a recording medium 1012 such as a semiconductor memory for recording or reading out captured image data and a recording medium control I/F unit 1011 for recording data in or reading out data from the recording medium 1012. Note that the recording medium 1012 may be built in or detachable from the photoelectric conversion system.
Furthermore, the photoelectric conversion system includes a central control-operation unit 1009, which performs various types of arithmetic operations and controls the entire digital still camera, and a timing generation unit 1008, which outputs various types of timing signals to the image pickup apparatus 1004 and the signal processing unit 1007. In this case, the timing signals and the like may be input from the outside. It is sufficient that the photoelectric conversion system include at least the image pickup apparatus 1004 and the signal processing unit 1007, which processes an output signal output from the image pickup apparatus 1004.
The image pickup apparatus 1004 outputs an image pickup signal to the signal processing unit 1007. The signal processing unit 1007 performs certain signal processing on the image pickup signal output from the image pickup apparatus 1004 to output image data. The signal processing unit 1007 generates an image using the image pickup signal output from the image pickup apparatus 1004.
In this manner, according to the present embodiment, a photoelectric conversion system to which any one of the photoelectric conversion apparatuses (image pickup apparatuses) according to the embodiments described above is applied can be realized.
A photoelectric conversion system and a moving object according to the present embodiment will be described using
The distance information acquisition unit may also be realized by, for example, a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC) or may also be realized by a combination of an FPGA and an ASIC.
The photoelectric conversion system 2300 is connected to a vehicle information acquisition device 2320 and can acquire vehicle information such as a vehicle speed, a yaw rate, and a steering angle. Moreover, a control electronic control unit (ECU) 2330 is connected to the photoelectric conversion system 2300. The control ECU 2330 is a controller that outputs, on the basis of a determination result from the collision determination unit 2318, a control signal for causing the vehicle to generate a braking force. Moreover, the photoelectric conversion system 2300 is also connected to an alarm device 2340, which alerts the driver on the basis of a determination result from the collision determination unit 2318. For example, in a case where the determination result from the collision determination unit 2318 indicates that the chances of a collision are high, the control ECU 2330 performs vehicle control to avoid the collision or reduce damage by braking, releasing the accelerator, controlling the engine output, or the like. The alarm device 2340 alerts the user by sounding an alarm, displaying alarm information on a screen of, for example, a car navigation system, or vibrating the seat belt or the steering wheel.
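By way of illustration only, the decision flow described above might be summarized as follows; the threshold, the probability interface, and the action names are assumptions and do not represent the actual control performed by the control ECU 2330 or the alarm device 2340.

```python
# Hypothetical sketch of the collision-avoidance decision flow (names assumed).
def on_collision_result(collision_probability, threshold=0.8):
    """Map a collision determination result to control and alarm actions."""
    actions = []
    if collision_probability >= threshold:
        # Control ECU side: generate a braking force and limit drive power.
        actions += ["apply_brake", "release_accelerator", "limit_engine_output"]
        # Alarm device side: warn the driver.
        actions += ["sound_alarm", "show_warning_on_navigation_screen",
                    "vibrate_seat_belt_or_steering_wheel"]
    return actions

print(on_collision_result(0.9))
```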
In the present embodiment, images around the vehicle, for example, images of views in front of or behind the vehicle are captured by the photoelectric conversion system 2300.
In the above, an example has been described in which control is performed to prevent the vehicle from colliding with other vehicles. However, the photoelectric conversion system 2300 can also be applied to, for example, control under which the vehicle drives autonomously so as to follow another vehicle or control under which the vehicle drives autonomously so as not to go out of the lane. Furthermore, the photoelectric conversion system 2300 can be applied not only to vehicles such as cars but also to, for example, moving objects (moving apparatuses) such as vessels, airplanes, or industrial robots. In addition, the photoelectric conversion system 2300 can be applied not only to moving objects but also to a wide range of apparatuses using object recognition, such as an intelligent transportation system (ITS).
A photoelectric conversion system according to the present embodiment will be described using
As illustrated in
The optical system 1402 includes one or more lenses. The optical system 1402 guides image light (incident light) from the subject to the photoelectric conversion apparatus 1403, and causes an image to be formed on a light receiving surface (a sensor unit) of the photoelectric conversion apparatus 1403.
As the photoelectric conversion apparatus 1403, any one of the photoelectric conversion apparatuses described in the individual embodiments described above is used. A distance signal representing a distance obtained from a light reception signal and output from the photoelectric conversion apparatus 1403 is supplied to the image processing circuit 1404.
The image processing circuit 1404 performs image processing in which a distance image is constructed on the basis of the distance signal supplied from the photoelectric conversion apparatus 1403. The distance image (image data) obtained as a result of the image processing is supplied to and displayed on the monitor 1405 or is supplied to and stored (recorded) in the memory 1406.
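As a simple illustration of the relationship between a per-pixel light-reception delay and the resulting distance value, consider the following sketch; it is not the actual processing of the image processing circuit 1404, and the delay values are arbitrary examples.

```python
# Illustrative conversion of per-pixel round-trip delays into a distance image.
C_LIGHT = 299_792_458.0                  # speed of light [m/s]

def distance_image(round_trip_times_s):
    """Convert per-pixel round-trip times [s] into distances [m]."""
    return [[C_LIGHT * t / 2.0 for t in row] for row in round_trip_times_s]

# Example: a 2 x 2 block of pixels with round-trip delays of a few nanoseconds.
print(distance_image([[10e-9, 12e-9], [20e-9, 6.7e-9]]))
```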
In the distance image sensor 1401 configured in this manner, the characteristics of pixels are improved by using one of the photoelectric conversion apparatuses described above and consequently, for example, a more accurate distance image can be acquired.
A photoelectric conversion system according to the present embodiment will be described using
The endoscope 1100 includes a lens tube 1101 and a camera head 1102. A portion of the lens tube 1101 starting from its leading edge and having a predetermined length is inserted into a body cavity of the patient 1132. The camera head 1102 is connected to a base end of the lens tube 1101. In the illustrated example, the endoscope 1100 is formed as a rigid scope including the lens tube 1101, which is rigid; however, the endoscope 1100 may be formed as a so-called flexible scope having a flexible lens tube.
The leading edge of the lens tube 1101 is provided with an opening in which an objective lens is embedded. The endoscope 1100 is connected to a light source device 1203. Light generated by the light source device 1203 is guided to the leading edge of the lens tube 1101 along a light guide extending in the lens tube 1101. The light guided to the leading edge of the lens tube 1101 is emitted toward an observation target in the body cavity of the patient 1132 through the objective lens. Note that the endoscope 1100 may be a direct-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
The camera head 1102 includes an optical system and a photoelectric conversion apparatus. Reflected light (observation light) from the observation target is concentrated by the optical system onto the photoelectric conversion apparatus. The observation light is photoelectrically converted by the photoelectric conversion apparatus, and an electric signal corresponding to the observation light, that is, an image signal corresponding to an observation image is generated. As the photoelectric conversion apparatus, any one of the photoelectric conversion apparatuses described in the individual embodiments described above can be used. The image signal is transmitted as RAW data to a camera control unit (CCU) 1135.
The CCU 1135 includes, for example, a central processing unit (CPU) and a graphics processing unit (GPU), and performs central control on operations of the endoscope 1100 and a display device 1136. Furthermore, the CCU 1135 receives an image signal from the camera head 1102 and performs, on the image signal, various types of image processing for displaying an image based on the image signal, such as development processing (demosaicing).
The display device 1136 displays, under control performed by the CCU 1135, the image based on the image signal on which image processing is performed by the CCU 1135.
The light source device 1203 includes, for example, a light source such as a light-emitting diode (LED) and supplies, to the endoscope 1100, illumination light to be used when an image of a surgical target or the like is captured.
An input device 1137 is an input interface for the endoscopic operation system 1150. The user can input various types of information or commands to the endoscopic operation system 1150 through the input device 1137.
A treatment tool control device 1138 controls driving of an energy treatment tool 1112 for ablating or dissecting tissue, closing a blood vessel, or the like.
The light source device 1203 supplies, to the endoscope 1100, illumination light to be used when an image of a surgical target is captured. The light source device 1203 includes a white light source formed by, for example, LEDs, laser light sources, or a combination of LEDs and laser light sources. In a case where the white light source is formed by a combination of RGB laser light sources, the output intensity and the output timing of each color (each wavelength) can be controlled with high accuracy, and thus the white balance of a captured image can be adjusted by the light source device 1203. Moreover, in this case, by irradiating an observation target with laser light from each of the RGB laser light sources in a time division manner and controlling driving of an image sensor of the camera head 1102 in synchronization with the irradiation timing, images corresponding to R, G, and B can be captured in a time division manner. With this method, the image sensor can capture color images without being provided with color filters.
Driving of the light source device 1203 may be controlled such that the intensity of output light is changed every certain time period. Images are acquired in a time division manner by controlling driving of the image sensor of the camera head 1102 in synchronization with the timing at which the intensity of the light is changed, and the images are combined. As a result, high dynamic range images without so-called crushed shadows and blown highlights can be generated.
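One possible way to combine frames captured at different illumination intensities, given here only as an assumed illustration rather than the processing actually performed, is sketched below.

```python
# Illustrative merge of two frames captured at different light intensities
# (the merge rule, intensity ratio, and saturation level are assumptions).
def merge_hdr(low_intensity_frame, high_intensity_frame, intensity_ratio, saturation=1.0):
    """Use the high-intensity frame where it is not saturated; otherwise scale
    up the low-intensity frame by the known intensity ratio."""
    merged = []
    for lo_row, hi_row in zip(low_intensity_frame, high_intensity_frame):
        merged.append([hi if hi < saturation else lo * intensity_ratio
                       for lo, hi in zip(lo_row, hi_row)])
    return merged

print(merge_hdr([[0.10, 0.30]], [[0.45, 1.00]], intensity_ratio=4.0))
```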
The light source device 1203 may also be configured to be able to supply light having a predetermined wavelength band corresponding to special light observation. In special light observation, for example, the wavelength dependence of light absorption in body tissue is used. Specifically, by performing irradiation with light of a narrower band than the illumination light used at the time of a normal observation (that is, white light), images of certain tissue such as a blood vessel in a mucosal surface layer can be captured with high contrast.
Alternatively, in special light observation, fluorescence observation may be performed in which an image is obtained using fluorescence generated by excitation light irradiation. In fluorescence observation, for example, body tissue is irradiated with excitation light, and fluorescence from the body tissue can be observed. Alternatively, in fluorescence observation, a reagent such as indocyanine green (ICG) is locally injected into body tissue, and the body tissue is irradiated with excitation light corresponding to the fluorescence wavelength of the reagent, so that a fluorescence image can be obtained. The light source device 1203 may be configured to be able to supply at least one of light of a narrow band and excitation light that correspond to such special light observation.
A photoelectric conversion system according to the present embodiment will be described using
The glasses 1600 further have a control device 1603. The control device 1603 functions as a power source that supplies power to the photoelectric conversion apparatus 1602 and the display device described above. The control device 1603 controls the operation of the photoelectric conversion apparatus 1602 and the display device. In the lens 1601, an optical system is formed that concentrates light onto the photoelectric conversion apparatus 1602.
The line of sight of the user to the displayed image is detected from an image of the user's eyeball captured using infrared light. Any known method can be applied to line-of-sight detection using the captured image of the eyeball. As an example, a line-of-sight detection method based on Purkinje images generated by illumination light reflected from the cornea can be used.
More specifically, line-of-sight detection processing based on a pupil-corneal reflection method is performed. The line of sight of the user is detected by calculating, using a pupil-corneal reflection method, a line-of-sight vector representing the orientation of their eyeball (a rotation angle) on the basis of an image of their pupil and Purkinje images included in a captured image of their eyeball.
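A minimal sketch of the pupil-corneal reflection idea is given below for illustration: the gaze angle is estimated from the offset between the pupil center and the Purkinje image in the captured eyeball image. The calibration factor and coordinate values are assumptions, not parameters of the disclosed apparatus.

```python
# Minimal pupil-corneal reflection sketch (calibration factor is assumed).
def estimate_gaze_angles(pupil_center, purkinje_center, deg_per_pixel=0.08):
    """Return (horizontal, vertical) gaze angles in degrees from image positions."""
    dx = pupil_center[0] - purkinje_center[0]   # offset of pupil from the reflection
    dy = pupil_center[1] - purkinje_center[1]
    return dx * deg_per_pixel, dy * deg_per_pixel

h, v = estimate_gaze_angles(pupil_center=(330.0, 242.0), purkinje_center=(320.0, 240.0))
print(f"gaze: {h:.1f} deg horizontal, {v:.1f} deg vertical")
```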
The display device according to the present embodiment has a photoelectric conversion apparatus having a light reception element, and may control an image displayed on the display device on the basis of information regarding the user's line of sight from the photoelectric conversion apparatus.
Specifically, for the display device, a first line-of-sight region, at which the user gazes, and a second line-of-sight region other than the first line-of-sight region are determined on the basis of the line-of-sight information. The first line-of-sight region and the second line-of-sight region may be determined by the control device of the display device. Alternatively, the first line-of-sight region and the second line-of-sight region determined by an external control device may be received. In the display region of the display device, the display resolution of the first line-of-sight region may be controlled to be higher than that of the second line-of-sight region. That is, the resolution of the second line-of-sight region may be made lower than that of the first line-of-sight region.
The display region has a first display region and a second display region, which is different from the first display region. A prioritized region may be determined from among the first display region and the second display region on the basis of the line-of-sight information. The first display region and the second display region may be determined by the control device of the display device. Alternatively, the first display region and the second display region determined by an external control device may be received. The resolution of the prioritized region may be controlled to be higher than that of the region other than the prioritized region. That is, the resolution of the region having a relatively low priority may be reduced.
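For illustration only, the region-dependent resolution control described above might take the following form; the region layout and the resolution values are assumptions.

```python
# Illustrative gaze-dependent resolution assignment (regions and values assumed).
def assign_resolutions(gaze_point, regions, full_res=(1920, 1080), low_res=(960, 540)):
    """regions: dict of name -> (x0, y0, x1, y1). Returns name -> resolution."""
    def contains(box, point):
        x0, y0, x1, y1 = box
        return x0 <= point[0] < x1 and y0 <= point[1] < y1
    # The region containing the gaze point is prioritized and kept at full
    # resolution; the other region is displayed at the lower resolution.
    return {name: full_res if contains(box, gaze_point) else low_res
            for name, box in regions.items()}

regions = {"first_display_region": (0, 0, 960, 1080),
           "second_display_region": (960, 0, 1920, 1080)}
print(assign_resolutions(gaze_point=(400, 500), regions=regions))
```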
Note that artificial intelligence (AI) may be used to determine the first line-of-sight region or the prioritized region. The AI may be a model configured to estimate, from an image of the user's eyeball, the angle of the line of sight and the distance to a target ahead of the line of sight, using images of an eyeball and the directions in which the eyeball in those images is actually looking as supervised data. The display device, the photoelectric conversion apparatus, or an external device may have an AI program. In a case where an external device has the AI program, the angle of the line of sight of the user and the distance to the target are transferred to the display device through communication.
In a case where display control is performed on the basis of visual recognition and detection, the present embodiment can be applied to smart glasses further having a photoelectric conversion apparatus that captures an outside image. The smart glasses can display, in real time, outside information regarding a captured outside image.
The present disclosure is not limited to the embodiments described above, and various modifications are possible.
For example, an example obtained by adding part of any one of the embodiments to another one of the embodiments and an example obtained by replacing part of one of the embodiments with part of another one of the embodiments are also included in embodiments of the present disclosure.
Furthermore, the photoelectric conversion systems described in the fifth and sixth embodiments are examples of photoelectric conversion systems to which the photoelectric conversion apparatuses can be applied. The photoelectric conversion systems to which the photoelectric conversion apparatuses according to the present disclosure are applicable are not limited to the configurations illustrated in
Note that the embodiments described above are merely specific examples of embodiments for implementing the present disclosure, and the technical scope of the present disclosure should not be interpreted as limited by these embodiments. In other words, the present disclosure can be implemented in various forms without departing from its technical concept or its main features.
According to the present disclosure, the degree of freedom in arranging elements of the second substrate can be improved.
While the disclosure has been described with reference to exemplary embodiments, it is to be understood that the disclosure is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2022-190917, filed Nov. 30, 2022, which is hereby incorporated by reference herein in its entirety.