The present invention relates to an imaging element and an imaging device.
Priority is claimed on Japanese Patent Application No. 2021-061148, filed Mar. 31, 2021, the content of which is incorporated herein by reference.
Data of an image captured by an image sensor is subjected to image processing in an external circuit called an image processing engine. As the data of the image sent from the image sensor to an external circuit or the like increases, the processing time for outputting data from the image sensor becomes longer.
An imaging element according to a first aspect of the present invention includes a first substrate including a plurality of pixels configured to output a signal based on charge obtained by photoelectric conversion, a second substrate including a conversion unit configured to convert at least a first signal output from a first pixel among the plurality of pixels and a second signal output from the first pixel after the first signal into digital signals, and a third substrate including a computing unit including a calculating unit configured to calculate an evaluation value of a first image generated on the basis of the first signal converted into a digital signal by the conversion unit and an evaluation value of a second image generated on the basis of the second signal converted into a digital signal by the conversion unit and a selecting unit configured to select data of at least one of the first image and the second image to be output to an outside on the basis of the evaluation value of the first image and the evaluation value of the second image.
An imaging device according to a second aspect of the present invention includes the imaging element according to the first aspect.
Hereinafter, embodiments for carrying out the invention will be described with reference to the drawings.
<Configuration of Imaging Device>
The image-capturing optical system 2 may be configured to be attachable to and detachable from the imaging device 1.
<Cross-Sectional Structure of Imaging Element>
Incident light L shown by a white arrow is incident in a positive direction of a Z-axis. Further, as shown in coordinate axes, a right direction on the drawing perpendicular to the Z-axis is a positive direction of an X-axis, and a forward direction on the drawing orthogonal to the Z-axis and the X-axis is a positive direction of a Y-axis. In the imaging element 3, the first substrate 111, the second substrate 112, the third substrate 113, and the fourth substrate 114 are stacked in a direction in which the incident light L is incident.
The imaging element 3 further includes a microlens layer 101, a color filter layer 102, and a passivation layer 103. The passivation layer 103, the color filter layer 102, and the microlens layer 101 are stacked in order on the first substrate 111.
The microlens layer 101 has a plurality of microlenses ML. The microlenses ML condense the incident light onto a photoelectric conversion unit which will be described below. The color filter layer 102 has a plurality of color filters F. The passivation layer 103 is configured of a nitride film or an oxide film.
The first substrate 111, the second substrate 112, the third substrate 113, and the fourth substrate 114 respectively have first surfaces 105a, 106a, 107a, and 108a on which gate electrodes and gate insulating films are provided, and second surfaces 105b, 106b, 107b, and 108b that are different from the first surfaces. Furthermore, various elements such as transistors are provided on the first surfaces 105a, 106a, 107a, and 108a, respectively. The wiring layers 140, 141, 144, and 145 are stacked and provided on the first surface 105a of the first substrate 111, the first surface 106a of the second substrate 112, the first surface 107a of the third substrate 113, and the first surface 108a of the fourth substrate 114, respectively. Furthermore, the wiring layers (inter-substrate connection layers) 142 and 143 are stacked and provided on the second surface 106b of the second substrate 112 and the second surface 107b of the third substrate 113, respectively. The wiring layers 140 to 145 are layers including a conductive film (a metal film) and an insulating film, and each has a plurality of wirings, vias, and the like arranged therein.
The elements on the first surface 105a of the first substrate 111 and the elements on the first surface 106a of the second substrate 112 are electrically connected by connection parts 109 such as bumps and electrodes via the wiring layers 140 and 141. Similarly, the elements on the first surface 107a of the third substrate 113 and the elements on the first surface 108a of the fourth substrate 114 are electrically connected by connection parts 109 such as bumps and electrodes via the wiring layers 144 and 145. Further, the second substrate 112 and the third substrate 113 have a plurality of through electrodes 110. The through electrodes 110 of the second substrate 112 connect circuits provided on the first surface 106a and the second surface 106b of the second substrate 112 to each other, and the through electrodes 110 of the third substrate 113 connect circuits provided on the first surface 107a and the second surface 107b of the third substrate 113 to each other. The circuits provided on the second surface 106b of the second substrate 112 and the circuits provided on the second surface 107b of the third substrate 113 are electrically connected by the connection parts 109 such as bumps and electrodes via the inter-substrate connection layers 142 and 143.
In the embodiment, although the case in which the first substrate 111, the second substrate 112, the third substrate 113, and the fourth substrate 114 are stacked is illustrated, the number of stacked substrates may be greater or smaller than in the embodiment.
Further, the first substrate 111, the second substrate 112, the third substrate 113, and the fourth substrate 114 may be referred to as a first layer, a second layer, a third layer, and a fourth layer, respectively.
<Example of Configuration of Imaging Element>
The second substrate 112 includes, for example, an A/D conversion unit 230 and the scanning unit 260. The A/D conversion unit 230 converts the signal output from the corresponding pixel 10 into a digital signal. The signal converted by the A/D conversion unit 230 is sent to the third substrate 113.
The scanning unit 260 generates a reading control signal for the reading unit 20 based on an instruction signal input via an input unit 290 of the fourth substrate 114. The instruction signal is sent from an image processing engine 30 which will be described below with reference to
The third substrate 113 includes, for example, a memory 250, a computing unit 240, and an evaluation value storage unit 280. The memory 250 stores the digital signal converted by the A/D conversion unit 230. The memory 250 has a storage capacity capable of, for example, storing at least two frames of images generated on the basis of the digital signal converted by the A/D conversion unit 230. In the embodiment, one image is referred to as one frame image.
The computing unit 240 performs a predetermined computation using at least one of the digital signal stored in the memory 250 and the digital signal converted by the A/D conversion unit 230. An evaluation value for an image is calculated through the computation. The evaluation value indicates any one of, for example, a focus adjustment state of the image-capturing optical system 2, a brightness of an image generated on the basis of the digital signal, and a movement state of a main subject. The computing unit 240 can determine superiority or inferiority of a plurality of images on the basis of the calculated evaluation value.
The evaluation value storage unit 280 stores a reference value for objectively evaluating the evaluation value calculated by the computing unit 240. The reference value is recorded in advance in the evaluation value storage unit 280.
Since the evaluation value storage unit 280 is used in second and third embodiments, it may be omitted in the first embodiment.
The fourth substrate 114 includes, for example, an output unit 270 and an input unit 290. The output unit 270 sends the digital signal stored in the memory 250 or the digital signal converted by the A/D conversion unit 230 to the image processing engine 30 (
For example, when the data of the image for recording which will be described below is sent to the image processing engine 30, the digital signal stored in the memory 250 is sent to the image processing engine 30 as the data of the image for recording. Furthermore, when the data of the image for monitor display which will be described below is sent to the image processing engine 30, the digital signal converted by the A/D conversion unit 230 is sent to the image processing engine 30 as the data of the image for monitor display.
An instruction signal from the image processing engine 30 is input to the input unit 290 of the fourth substrate 114. The instruction signal is sent to the second substrate 112.
<Imaging Element and Image Processing Engine>
The image processing engine 30 constitutes a part of the control unit 4 and includes an imaging element control unit 310, an input unit 320, an image processing unit 330, and a memory 340.
The operation member 6 including a release button, operation switch, or the like is provided on an exterior surface of the imaging device 1, for example. The operation member 6 sends an operation signal according to a user's operation to the imaging element control unit 310. The user instructs the imaging device 1 to capture an image, to set image-capturing conditions, and the like by operating the operation member 6.
When the imaging element control unit 310 receives an instruction to set the image-capturing conditions, and the like, it sends information indicating the set image-capturing conditions to the imaging element 3. Further, when a half-press operation signal indicating that the release button has been half-pressed with a shorter stroke than in a full-press operation is input from the operation member 6, the imaging element control unit 310 sends an instruction signal instructing to start image-capturing for monitor display to the imaging element 3 in order to continuously display the image for monitor display on a display unit or a viewfinder (not shown).
Furthermore, when a full-press operation signal indicating that the release button has been fully pressed with a longer stroke than in the half-press operation is input from the operation member 6, the imaging element control unit 310 sends an instruction signal instructing to start image-capturing of a still image for recording to the imaging element 3.
The digital signal output from the imaging element 3 is input to the input unit 320. The input digital signal is sent to the image processing unit 330. The image processing unit 330 performs predetermined image processing on the digital signal acquired from the imaging element 3 to generate the data of an image. The generated data of the image for recording is recorded in the memory 340 or used for displaying a confirmation image after image-capturing. Further, the generated data of the image for monitor display is used for display on a viewfinder or the like. The data of the image recorded in the memory 340 can be recorded in the storage medium 5.
<Detailed Description of the Computing Unit>
In the first embodiment, the computing unit 240 of the imaging element 3 calculates an evaluation value indicating the focus adjustment state of the image-capturing optical system 2.
The pixels 10 forming the pixel array 210 of the imaging element 3 have a photoelectric conversion unit for image generation. However, in a part or the whole of a region corresponding to a focus point, a pixel 10 having a photoelectric conversion unit for focus detection instead of the photoelectric conversion unit for image generation is disposed.
The focus point will be described.
The focus point P indicates a position in the image-capturing range 50 in which a focus of the image-capturing optical system 2 can be adjusted, and is also referred to as a focus detection area, a focus detection position, or a distance measuring point.
The illustrated number of focus points P and positions in the image-capturing range 50 are merely examples, and are not limited to the mode shown in
A calculating unit 244 of the computing unit 240 calculates, as an evaluation value, an amount of image shift (a phase difference) between a pair of images caused by a pair of light fluxes passing through different regions of the image-capturing optical system 2, for example, based on a photoelectric conversion signal from a pixel 10 having a photoelectric conversion unit for focus detection. The amount of image shift can be calculated for each of the focus points P.
The amount of image shift between the pair of images is a value that is the basis for calculating an amount of defocus which is an amount of shift between a position of a subject image formed by the light flux passing through the image-capturing optical system 2 and a position of an imaging surface of the imaging element 3, and the amount of defocus can be calculated by multiplying the amount of image shift between the pair of images by a predetermined conversion coefficient.
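The conversion described above can be sketched as follows. This is an illustrative example only; the function name, the shift value, and the coefficient are assumptions and not taken from the specification.

```python
# Illustrative only: names and values below are assumptions, not from the
# specification. The defocus amount is obtained by multiplying the amount of
# image shift (the phase difference) by a predetermined conversion coefficient.
def defocus_amount(image_shift: float, conversion_coefficient: float) -> float:
    """Convert an image-shift amount into a defocus amount."""
    return image_shift * conversion_coefficient

# Example: a shift of 2.5 units with a coefficient of 4.0 gives a defocus of 10.0.
print(defocus_amount(2.5, 4.0))  # 10.0
```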
For example, the calculating unit 244 sets the amount of image shift between the pair of images calculated at a focus point P located approximately at the center of the image-capturing range 50 among the plurality of focus points P as an evaluation value for the image.
The computing unit 240 may automatically select a focus point P to be used for calculating the amount of image shift between the pair of images (in other words, calculating the evaluation value) among all the focus points P, or may select the focus point P identified by the user's operation.
The calculating unit 244 calculates the evaluation value for each of a plurality of images captured while the image-capturing instruction for recording continues. A selecting unit 245 of the computing unit 240 selects an excellent image from among the plurality of images on the basis of the evaluation value. In the first embodiment, an image with the smallest amount of image shift between the pair of images, in other words, the image that is most in focus is selected. The data of the image selected by the selecting unit 245 is sent to the image processing engine 30 via the output unit 270.
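The selection rule described above can be sketched as follows. This is a hypothetical illustration of the selecting unit 245's choice, assuming the evaluation value of each frame is its amount of image shift; the frame indices and shift values are invented for the example.

```python
# Hypothetical sketch of the first embodiment's selection rule: among the
# evaluation values (amounts of image shift), pick the image whose shift is
# smallest in magnitude, i.e. the image that is most in focus.
def select_best_image(evaluation_values: dict) -> int:
    """Return the index of the image with the smallest absolute image shift."""
    return min(evaluation_values, key=lambda i: abs(evaluation_values[i]))

# Assumed per-frame evaluation values for four images captured during the
# full-press operation.
shifts = {1: 3.2, 2: 0.4, 3: -1.1, 4: 2.0}
print(select_best_image(shifts))  # 2
```

Only the data of the frame chosen this way would then be output, which is what keeps the amount of output data small.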
The processing performed by the computing unit 240 will be described in detail with reference to
The pixel 10 having the photoelectric conversion unit for focus detection acquires a pair of images formed by a pair of light fluxes passing through different regions of the image-capturing optical system 2. When the image-capturing optical system 2 forms a sharp image at a predetermined focal plane, so-called focusing, the above-described pair of images substantially coincide with each other (a state in which the amount of shift is almost 0), as illustrated in
The computing unit 240 calculates the amount of image shift by performing known image shift detection calculation processing (correlation calculation processing, phase difference detection processing) on a signal train indicating the pair of images acquired by a plurality of pixels 10 having photoelectric conversion units for focus detection disposed side by side in the X-axis direction.
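One common form of such image shift detection calculation is a sum-of-absolute-differences search over candidate shifts. The sketch below is a minimal, assumed implementation of that idea, not the specific processing of the embodiment; the signal values and search range are invented for illustration.

```python
# A minimal sketch of image-shift detection by correlation calculation: slide
# one signal train against the other and take the shift that minimizes the
# mean sum of absolute differences (SAD). Signal values are assumptions.
def detect_image_shift(signal_a, signal_b, max_shift: int) -> int:
    best_shift, best_sad = 0, float("inf")
    n = len(signal_a)
    for shift in range(-max_shift, max_shift + 1):
        sad, count = 0.0, 0
        for i in range(n):
            j = i + shift
            if 0 <= j < n:  # compare only overlapping samples
                sad += abs(signal_a[i] - signal_b[j])
                count += 1
        if count and sad / count < best_sad:
            best_sad, best_shift = sad / count, shift
    return best_shift

a = [0.0, 0.0, 1.0, 3.0, 1.0, 0.0, 0.0]
b = [0.0, 0.0, 0.0, 0.0, 1.0, 3.0, 1.0]  # the same peak shifted by two samples
print(detect_image_shift(a, b, 3))  # 2
```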
In
In the embodiment, it is assumed that a region 60 in which the evaluation value is calculated by the computing unit 240 is set in advance approximately in the center of the image-capturing range. It is possible to reduce a burden of processing performed by the computing unit 240 to calculate the evaluation value by setting the region 60 as a part of the image-capturing range, compared to a case in which the region 60 is set as the whole of the image-capturing range. The computing unit 240 can automatically set the region 60 or set the region 60 to a range identified by a user's operation.
Data of a first image captured by the imaging element 3 during the full-press operation is stored in a first region 251 of the memory 250. Data of a second image captured by the imaging element 3 during the full-press operation is stored in a second region 252 of the memory 250. The data of the image stored in the memory 250 includes an image captured by the pixel 10 having the photoelectric conversion unit for image generation and the pair of images acquired by the pixel 10 having the photoelectric conversion unit for focus detection. The selecting unit 245 determines the superiority or inferiority of the first image and the second image based on the evaluation values calculated by the calculating unit 244 for the images stored in the first region 251 and the second region 252 of the memory 250. In the first image illustrated in
Since the amount of image shift between the pair of images shown in
The selecting unit 245 determines the superiority or inferiority of the third image and the second image based on the evaluation values for the images stored in the first region 251 and the second region 252 of the memory 250. In the third image illustrated in
Since the amount of image shift between the pair of images shown in
The selecting unit 245 determines the superiority or inferiority of the fourth image and the second image based on the evaluation values for the images stored in the first region 251 and the second region 252 of the memory 250. In the fourth image illustrated in
Since the amount of image shift between the pair of images shown in
As described above, when the data of the captured image is stored in the first region 251 or the second region 252 of the memory 250, the computing unit 240 of the imaging element 3 calculates an evaluation value for the image (the amount of image shift between the pair of images) using the calculating unit 244. Then, based on the calculated evaluation value, the selecting unit 245 selects one image in which the object 70 is most in focus among the four images captured by the imaging element 3 during the full-press operation. Only the data of the image selected by the selecting unit 245 is sent to the image processing engine 30 via the output unit 270.
The image processing engine 30 that has received the data of the image output from the imaging element 3 performs predetermined image processing in the image processing unit 330 to generate the data of the image for recording, and records it in the memory 340.
According to the first embodiment described above, the following effects can be obtained.
(1) The imaging element 3 includes a plurality of pixels 10 that output a signal based on charge obtained by photoelectric conversion, a calculating unit 244 that calculates an evaluation value of an image generated on the basis of the signals, a selecting unit 245 that selects one image from the plurality of images on the basis of the evaluation value of each of the plurality of images calculated by the calculating unit 244, and an output unit 270 that outputs data of the image selected by the selecting unit 245.
With such a configuration, an amount of the data of the image output from the imaging element 3 can be reduced compared to a case in which data for all of the plurality of images captured by the imaging element 3 is output from the imaging element 3 to the image processing engine 30. Thus, the processing time for outputting data and power consumption in the imaging element 3 can be reduced.
(2) The selecting unit 245 of the imaging element 3 selects one image with a highest evaluation value among the plurality of images. With such a configuration, the data of the image output from the imaging element 3 can be reduced to one image. Further, data of an image considered to be the best among the plurality of images on the basis of the evaluation value can be output to the image processing engine 30.
(4) The plurality of pixels 10 of the imaging element 3 photoelectrically convert light that has passed through the image-capturing optical system 2, and the calculating unit 244 calculates an evaluation value indicating a focus adjustment state of the image-capturing optical system 2.
With such a configuration, it is possible to output the data of the image that is most in focus among the plurality of images to the image processing engine 30.
(5) The calculating unit 244 of the imaging element 3 calculates an evaluation value on the basis of a signal corresponding to a predetermined region 60 of the image among the signals.
With such a configuration, compared to the case in which the evaluation value is calculated on the basis of the signals corresponding to the entire image-capturing range 50, a processing load on the calculating unit 244 for calculating the evaluation value can be reduced.
In the first embodiment, one image in which the object 70 is most in focus is selected among the plurality of images captured by the imaging element 3 during the full-press operation, and the data of the selected image is sent to the image processing engine 30. Instead, in the second embodiment, data of an image in which the object 70 is in focus is sent to the image processing engine 30 among a plurality of images captured by the imaging element 3 during the full-press operation. The number of images sent to the image processing engine 30 is not limited to one. Such a configuration will be described below.
<Imaging Element and Image Processing Engine>
For example, when an image-capturing instruction for recording is input from the image processing engine 30, the imaging element 3A captures a plurality of images while the image-capturing instruction continues, and sends data of an image determined to be in focus on the object 70 among the plurality of images to the image processing engine 30 as the data of the image for recording.
As in the case of the imaging element 3, when an image-capturing instruction for monitor display is input from the image processing engine 30, images for monitor display are repeatedly captured, and data of the captured image is sent to the image processing engine 30 as the data of the image for monitor display.
<Detailed Description of the Computing Unit>
In the second embodiment, the computing unit 240A of the imaging element 3A calculates an evaluation value indicating a focus adjustment state of the image-capturing optical system 2. This evaluation value is the same as the evaluation value calculated by the computing unit 240 of the imaging element 3 according to the first embodiment.
The calculating unit 244 of the computing unit 240A calculates the evaluation value for each of a plurality of images captured while the image-capturing instruction for recording continues. The selecting unit 245 of the computing unit 240A compares a reference value recorded in advance in the evaluation value storage unit 280 and the evaluation value calculated by the calculating unit 244. The reference value is, for example, an allowable value for the amount of image shift between the pair of images. The selecting unit 245 selects an image of which a calculated evaluation value satisfies the reference value, in other words, an image in which the amount of image shift between the pair of images serving as the evaluation value is smaller than the amount of image shift corresponding to the reference value.
In the second embodiment, when the evaluation value for an image satisfies the reference value, in other words, when the amount of image shift serving as the evaluation value is smaller than the amount of image shift corresponding to the reference value, the image is selected. The data of the image selected by the selecting unit 245 is sent to the image processing engine 30 via the output unit 270.
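The reference-value comparison described above can be sketched as follows. This is a hedged illustration under assumed names and values: the reference value is modeled as an allowable amount of image shift, and every frame whose shift is smaller is kept.

```python
# Hedged sketch of the second embodiment's selection: an image is kept when
# its amount of image shift is smaller than the allowable amount recorded as
# the reference value. All names and values below are assumptions.
def satisfies_reference(image_shift: float, reference_shift: float) -> bool:
    """True when the image is in focus to within the allowable shift."""
    return abs(image_shift) < reference_shift

reference = 1.0  # assumed allowable amount of image shift
frames = {1: 2.3, 2: 0.6, 3: 1.8, 4: 0.2}  # assumed per-frame shifts
selected = [i for i, s in frames.items() if satisfies_reference(s, reference)]
print(selected)  # [2, 4]
```

Unlike the first embodiment, more than one image may satisfy the reference value, so the number of output images is not limited to one.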
With reference to
In
Data of a first image captured by the imaging element 3A during the full-press operation is stored in a predetermined region of the memory 250. The data of the image stored in the memory 250 includes an image captured by the pixel 10 having the photoelectric conversion unit for image generation and a pair of images acquired by the pixel 10 having the photoelectric conversion unit for focus detection. The selecting unit 245 compares the evaluation value calculated by the calculating unit 244 for the first image with the reference value recorded in the evaluation value storage unit 280.
In the first image illustrated in
Since the amount of image shift between the pair of images illustrated in
The data of the second image captured by the imaging element 3A during the full-press operation is stored in the same region of the memory 250 in which the data of the image made erasable was stored. The selecting unit 245 compares the evaluation value for the second image with the reference value recorded in the evaluation value storage unit 280. As illustrated in
When the amount of image shift between the pair of images illustrated in
Data of a third image captured by the imaging element 3A during the full-press operation is stored in the same region of the memory 250 in which the data of the image made erasable was stored. The selecting unit 245 compares an evaluation value for the third image with the reference value recorded in the evaluation value storage unit 280.
In the third image illustrated in
Since the amount of image shift between the pair of images illustrated in
Data of a fourth image captured by the imaging element 3A during the full-press operation is stored in the same region of the memory 250 in which the data of the image made erasable was stored. The selecting unit 245 compares an evaluation value for the fourth image with the reference value recorded in the evaluation value storage unit 280. As illustrated in
When the amount of image shift between the pair of images illustrated in
As described above, when data of a captured image is stored in a predetermined region of the memory 250, the computing unit 240A of the imaging element 3A calculates an evaluation value for the image (the amount of image shift between the pair of images) using the calculating unit 244. Then, when the calculated evaluation value satisfies the reference value recorded in the evaluation value storage unit 280, the image is selected by the selecting unit 245. Only the data of the image selected by the selecting unit 245 is sent to the image processing engine 30 via the output unit 270.
The image processing engine 30 that has received the data of the image output from the imaging element 3A performs predetermined image processing in the image processing unit 330 to generate the data of the image for recording, and records it in the memory 340.
According to the second embodiment described above, the following effects can be obtained.
(1) The imaging element 3A includes a plurality of pixels 10 that output a signal based on charge obtained by photoelectric conversion, a calculating unit 244 that calculates an evaluation value of an image generated on the basis of the signals, a selecting unit 245 that selects at least one image from the plurality of images on the basis of the evaluation value of each of the plurality of images calculated by the calculating unit 244, and an output unit 270 that outputs data of the image selected by the selecting unit 245.
With such a configuration, an amount of data of the image output from the imaging element 3A can be reduced compared to a case in which data for all of the plurality of images captured by the imaging element 3A is output from the imaging element 3A to the image processing engine 30. Thus, the processing time for outputting data and power consumption in the imaging element 3A can be reduced.
(2) The selecting unit 245 of the imaging element 3A selects an image of which an evaluation value exceeds a predetermined value among the plurality of images. With such a configuration, the amount of data of the image output from the imaging element 3A can be reduced compared to a case in which data of all the images is output. Further, data of an image of which an evaluation value is higher than a predetermined value (in other words, it satisfies the reference value) among the plurality of images can be output to the image processing engine 30.
(3) The plurality of pixels 10 of the imaging element 3A photoelectrically convert light that has passed through the image-capturing optical system 2, and the calculating unit 244 calculates an evaluation value indicating a focus adjustment state of the image-capturing optical system 2.
With such a configuration, it is possible to output data of an image that is in focus to a certain extent among the plurality of images to the image processing engine 30.
(4) The calculating unit 244 of the imaging element 3A calculates an evaluation value on the basis of a signal corresponding to a predetermined region 60 of the image among the above signals.
With such a configuration, compared to a case in which the evaluation value is calculated on the basis of signals corresponding to the entire image-capturing range 50, the processing load on the calculating unit 244 for calculating the evaluation value can be reduced.
In a third embodiment, data of an image in which movement of a subject in the image is small among a plurality of images captured by the imaging element 3 during a full-press operation is sent to the image processing engine 30. The number of images sent to the image processing engine 30 is not limited to one. Such a configuration will be described below.
<Imaging Element and Image Processing Engine>
For example, when an image-capturing instruction for recording is input from the image processing engine 30, the imaging element 3B captures a plurality of images while the image-capturing instruction continues, and sends, to the image processing engine 30, data of an image in which the main subject has little movement among the plurality of images, in other words, an image in which the image blur of the main subject can be estimated to be small, as the data of the image for recording.
As in the imaging elements 3 and 3A, when an image-capturing instruction for monitor display is input from the image processing engine 30, an image for monitor display is repeatedly captured, and data of the captured image is sent to the image processing engine 30 as the data of the image for monitor display.
<Detailed Description of the Computing Unit>
In the third embodiment, the computing unit 240B of the imaging element 3B calculates an evaluation value indicating a magnitude of the movement of the main subject. This evaluation value is based on a difference between the data of successively captured images among the plurality of images.
The computing unit 240B calculates an evaluation value for each of the plurality of images captured while the image-capturing instruction for recording continues. The calculation of the evaluation value is performed by a calculating unit 244 which will be described below with reference to
With reference to
In
The computing unit 240B stores data of a first image (
The computing unit 240B calculates a difference between the data of the second image stored in the first region 251 of the memory 250 and the data of the first image stored in the second region 252 of the memory 250 for each of the corresponding pixels using the difference calculating unit 242. When such a calculation is performed, as illustrated in
In the embodiment, it is assumed that a region in the subject image in which there is movement is set as a region 60 for which the computing unit 240B calculates the evaluation value. It is possible to reduce a burden of processing performed by the computing unit 240B to calculate the evaluation value by setting the region 60 as a part of an image-capturing range, compared to a case in which the region 60 is set in the entire image-capturing range. The computing unit 240B may automatically set the region 60 or set the region 60 to a range identified by a user's operation.
In the computing unit 240B, the plurality of difference values calculated for each of the pixels included in the region 60 are added by the cumulative addition unit 243. The calculating unit 244 sets a cumulative addition value calculated by the cumulative addition unit 243 as an evaluation value for the second image. The selecting unit 245 compares the evaluation value for the second image with the reference value recorded in the evaluation value storage unit 280.
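The per-pixel difference, cumulative addition, and comparison with the reference value described above can be sketched as follows. This is a minimal illustration, not the embodiment's implementation: the function names, the representation of images as NumPy arrays, and the use of absolute differences are assumptions.

```python
import numpy as np

def evaluation_value(prev_img, curr_img, region):
    """Cumulative sum of per-pixel absolute differences inside a region.

    A sketch of the processing attributed to the difference calculating
    unit 242 and the cumulative addition unit 243; names and the region
    encoding are illustrative, not from the source.
    """
    y0, y1, x0, x1 = region  # region 60 as (top, bottom, left, right)
    diff = np.abs(curr_img[y0:y1, x0:x1].astype(np.int64)
                  - prev_img[y0:y1, x0:x1].astype(np.int64))
    return int(diff.sum())

def is_selected(eval_value, reference_value):
    """Sketch of the selecting unit 245: keep an image whose evaluation
    value is at or below the reference value (little subject movement)."""
    return eval_value <= reference_value
```

A small evaluation value thus means the two frames are nearly identical inside the region 60, which is how little movement of the main subject is detected.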
For example, when the reference value is set to be equal to the difference illustrated in
Since the difference illustrated in
The computing unit 240B stores data of a third image (
The computing unit 240B calculates a difference between the data of the third image stored in the first region 251 of the memory 250 and the data of the second image stored in the second region 252 of the memory 250 for each of the corresponding pixels using the difference calculating unit 242.
In the computing unit 240B, the plurality of difference values calculated for each of the pixels included in the region 60 are added by the cumulative addition unit 243. The calculating unit 244 sets the cumulative addition value calculated by the cumulative addition unit 243 as the evaluation value for the third image. The selecting unit 245 compares the evaluation value for the third image with the reference value recorded in the evaluation value storage unit 280.
As described above, when the reference value is set to be equal to the difference illustrated in
Since the difference illustrated in
In this way, for the second and subsequent images captured by the imaging element 3B during the full-press operation, while the data of the newly captured image is stored in the first region 251 of the memory 250, a difference with the data of the image stored in the second region 252 of the memory 250 is calculated for each of the corresponding pixels by the difference calculating unit 242.
The computing unit 240B stores data of a fourth image (
The computing unit 240B calculates a difference between the data of the fourth image stored in the first region 251 of the memory 250 and the data of the third image stored in the second region 252 of the memory 250 for each of the corresponding pixels using the difference calculating unit 242.
In the computing unit 240B, the plurality of difference values calculated for each of the pixels included in the region 60 are added by the cumulative addition unit 243. The calculating unit 244 sets the cumulative addition value calculated by the cumulative addition unit 243 as the evaluation value for the fourth image. The selecting unit 245 compares the evaluation value for the fourth image with the reference value recorded in the evaluation value storage unit 280.
As illustrated in
When the difference illustrated in
As described above, when a newly captured image is stored in the first region 251 of the memory 250, the computing unit 240B of the imaging element 3B calculates an evaluation value for the image (a cumulative addition value of the difference with the previously captured image) using the difference calculating unit 242, the cumulative addition unit 243, and the calculating unit 244. Then, when the calculated evaluation value satisfies the reference value recorded in the evaluation value storage unit 280, the image is selected by the selecting unit 245. Only the data of the image selected by the selecting unit 245 is sent to the image processing engine 30 via the output unit 270.
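The selection flow just summarized, in which the first region 251 holds the newest image and the second region 252 holds the previously captured one, can be sketched as a loop over a burst of frames. The names, the list-based interface, and the use of absolute differences are illustrative assumptions, not the embodiment's implementation.

```python
import numpy as np

def select_frames(frames, region, reference_value):
    """Double-buffered selection over a burst of captured frames.

    Each new frame plays the role of the first region 251; the previous
    frame plays that of the second region 252. Only frames whose
    evaluation value satisfies the reference value are selected.
    """
    selected = []
    previous = None  # contents of the second region 252
    for index, frame in enumerate(frames):  # frame -> first region 251
        if previous is not None:
            y0, y1, x0, x1 = region
            diff = np.abs(frame[y0:y1, x0:x1].astype(np.int64)
                          - previous[y0:y1, x0:x1].astype(np.int64))
            if int(diff.sum()) <= reference_value:
                selected.append(index)
        previous = frame  # the newest frame becomes the comparison image
    return selected
```

Only the indices returned here would have their image data sent through the output unit 270, which is what reduces the amount of data output from the imaging element.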
The image processing engine 30 that has received the data of the image output from the imaging element 3B performs predetermined image processing in the image processing unit 330 to generate the data of the image for recording, and records it in the memory 340.
According to the third embodiment described above, the following effects can be obtained.
(1) The imaging element 3B includes a plurality of pixels 10 that output a signal based on charge obtained by photoelectric conversion, a calculating unit 244 that calculates an evaluation value of an image generated on the basis of the signals, a selecting unit 245 that selects at least one image from the plurality of images on the basis of the evaluation value of each of the plurality of images calculated by the calculating unit 244, and an output unit 270 that outputs an image signal of the image selected by the selecting unit 245.
With such a configuration, an amount of the data of the image output from the imaging element 3B can be reduced compared to a case in which data for all of the plurality of images captured by the imaging element 3B is output from the imaging element 3B to the image processing engine 30. Thus, the processing time for outputting data and power consumption in the imaging element 3B can be reduced.
(2) The calculating unit 244 of the imaging element 3B calculates an evaluation value on the basis of a difference between images of a current frame and a previous frame, and the selecting unit 245 selects an image of which an evaluation value is less than a predetermined value among the plurality of images.
With such a configuration, an amount of the data of the image output from the imaging element 3B can be reduced compared to a case in which data for all of the plurality of images captured by the imaging element 3B is output from the imaging element 3B to the image processing engine 30. Thus, the processing time for outputting data and power consumption in the imaging element 3B can be reduced.
(3) The calculating unit 244 of the imaging element 3B calculates an evaluation value indicating the movement of the subject (for example, the person 80) appearing in the image. With such a configuration, for example, data of an image of the person 80 with less image blur among the plurality of images can be output to the image processing engine 30.
(4) The calculating unit 244 of the imaging element 3B calculates an evaluation value on the basis of a signal corresponding to the predetermined region 60 in the image among the above signals.
With such a configuration, compared to a case in which the evaluation value is calculated on the basis of signals corresponding to the entire image-capturing range 50, the processing load on the calculating unit 244 for calculating the evaluation value can be reduced.
(Modified Example 1)
In the first to third embodiments described above, the examples in which the imaging elements 3, 3A, and 3B have a back-side illuminated configuration have been described. Instead, each of the imaging elements 3, 3A, and 3B may have a front-side illuminated configuration in which the wiring layer 140 is provided on the incident surface side on which light is incident.
(Modified Example 2)
In the first to third embodiments described above, the examples in which a photodiode is used as the photoelectric conversion unit have been described. However, a photoelectric conversion film may be used as the photoelectric conversion unit.
(Modified Example 3)
The imaging elements 3, 3A, and 3B may be applied to a camera, a smartphone, a tablet, a camera built into a PC, a vehicle-mounted camera, and the like.
(Modified Example 4)
In the first to third embodiments described above, the examples in which an evaluation value indicating the focus adjustment state of the image-capturing optical system 2 or an evaluation value indicating the magnitude of movement of the main subject is calculated have been described. For example, a configuration in which an evaluation value indicating brightness of a photographed image, an evaluation value indicating a color temperature (also called white balance) of a photographed image, or an evaluation value indicating a contrast of a photographed image is calculated as an evaluation value that indicates information other than the focus adjustment state and the magnitude of movement of the main subject may be adopted. The selecting unit selects an image to be sent to the image processing engine 30 on the basis of the calculated evaluation value.
(Modified Example 5)
In the first to third embodiments described above, for example, the examples in which a plurality of images are captured by the imaging element 3, 3A, or 3B while the full-press operation is performed on the release button, and an image to be sent from the imaging element 3, 3A, or 3B to the image processing engine 30 is selected from among the plurality of images on the basis of the evaluation value calculated for each of the images, have been described. Selecting an image to be sent from the imaging element 3, 3A, or 3B to the image processing engine 30 from among the plurality of images is not limited to the case in which the full-press operation is continuously performed on the release button. It may also be applied to cases in which the imaging device 1 automatically starts image-capturing, such as a case in which an image-capturing setting of capturing a plurality of images at a predetermined time is set in the imaging device 1, a case in which an image-capturing setting of capturing a plurality of images when a subject moves to a predetermined region within the image-capturing range 50 is set in the imaging device 1, and a case in which an instruction to start image-capturing is sent to the imaging device 1 from an external device.
(Modified Example 6)
In the first to third embodiments described above, the examples in which the same image-capturing conditions are applied to the plurality of images captured by the imaging element 3, 3A, or 3B, without the image-capturing conditions being changed, have been described. The image-capturing conditions include, for example, a shutter speed, an aperture value, ISO sensitivity, and the like. Instead, when a plurality of images are captured by the imaging element 3, 3A, or 3B, bracket image-capturing, in which the images are captured while at least some of the image-capturing conditions are changed stepwise, may be performed. In Modified example 6, the image to be sent from the imaging element 3, 3A, or 3B to the image processing engine 30 is selected from among a plurality of bracket-captured images.
(Modified Example 7)
As a modified example of the third embodiment described above, a third region may be secured in the memory 250 described in the third embodiment, and data of an image newly captured by the imaging element 3B (referred to as data A) may be stored in the third region. In Modified example 7, the first region 251 of the memory 250 stores data (referred to as data A-) of a thinned image in which the amount of data is reduced by thinning out the data A at a predetermined thinning rate, and the second region 252 stores data (referred to as data B-) of a thinned image in which the data (referred to as data B) of an image captured before the data A by the imaging element 3B is thinned out at the same thinning rate as the data A.
The computing unit 240B calculates an evaluation value (a value obtained by cumulatively adding differences in data) for an image newly captured by the imaging element 3B (an image of which data is stored in the third region of the memory 250) on the basis of the data A- and the data B- stored in the first region 251 and the second region 252 of the memory 250, and sends the data of the image stored in the third region of the memory 250 to the image processing engine 30 when the calculated evaluation value satisfies the reference value recorded in the evaluation value storage unit 280.
In Modified example 7 described above, the thinning rate of the data of the images to be respectively stored in the first region 251 and the second region 252 is set so that the sum of the storage capacities of the first region 251 and the second region 252 of the memory 250 is smaller than the storage capacity of the third region of the memory 250. With such a configuration, the storage capacity of the memory 250 required in Modified example 7 (the sum of the first region 251, the second region 252, and the third region) can be kept smaller than the storage capacity of the memory 250 required in the third embodiment (the sum of the first region 251 and the second region 252).
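The memory arrangement of Modified example 7 can be sketched as follows. The choice of strided slicing as the thinning scheme and all names are assumptions; the source specifies only that data is thinned out at a predetermined rate.

```python
import numpy as np

def thin(image, rate):
    """Keep every `rate`-th pixel in both directions (an assumed
    realization of thinning at a predetermined thinning rate)."""
    return image[::rate, ::rate]

def evaluate_thinned(new_full, prev_full, rate, reference_value):
    """Sketch of Modified example 7: the evaluation value is computed on
    thinned copies (first region 251 / second region 252) while the
    full-resolution new image (third region) is the output candidate.
    Returns the full image when it is selected, otherwise None.
    """
    data_a = thin(new_full, rate)   # data A- in the first region 251
    data_b = thin(prev_full, rate)  # data B- in the second region 252
    diff = np.abs(data_a.astype(np.int64) - data_b.astype(np.int64))
    if int(diff.sum()) <= reference_value:
        return new_full  # full data A taken from the third region
    return None
```

With a thinning rate of 2, for example, each thinned buffer holds roughly a quarter of the full image data, so the first and second regions together stay smaller than the third region, consistent with the capacity condition described above.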
Although various embodiments and modified examples have been described above, the present invention is not limited thereto. A mode in which the configurations shown in the embodiments and modified examples are used in combination is also included within the scope of the present invention. Other modes considered within the technical spirit of the present invention are also included within the scope of the present invention.
Number | Date | Country | Kind |
---|---|---|---|
2021-061148 | Mar 2021 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2022/015318 | 3/29/2022 | WO |