The present invention relates to an image capture apparatus configured to combine images.
When a plurality of objects at different distances from a camera are to be captured, or when an object that is long in the depth direction is to be captured, only a part of the object or objects may be brought into focus because the depth of field of the imaging optical system is insufficient.
Accordingly, a technology called focus stacking (refer to Japanese Patent Laid-Open No. 2002-84444) is known, which captures a plurality of images while changing the focus position, extracts in-focus regions from the images, and combines them into one image to generate a combined image in which the entire captured region is in focus. The focus stacking technology may be used to acquire an image in which an intended object is entirely in focus.
However, when camera shake occurs during an image capturing operation using the focus stacking technology, the actual focus positions may deviate from the set focus positions, and the quality of the resulting combined image may deteriorate.
The present disclosure is provided in view of this problem.
Embodiments of the invention provide an image capture apparatus which can generate a combined image with reduced blurring due to camera shake when a plurality of images captured at different focus positions are combined. An image capture apparatus according to embodiments of the present invention includes an image sensor, a processor, and a memory including instructions that, when executed by the processor, cause the processor to set a plurality of target focus positions, cause the image sensor to capture a plurality of images based on the target focus positions, and calculate a focus position with respect to an object when each of the plurality of images is captured. The instructions further cause the processor to reset at least a part of the target focus positions based on a result of a comparison between the target focus positions and the calculated focus positions.
An image capture apparatus according to embodiments of the present invention includes an image sensor having a sensor array having a plurality of photoelectric conversion units for one microlens, a processor, and a memory including instructions that, when executed by the processor, cause the processor to set a plurality of target focus positions, cause the image sensor to capture a plurality of images based on the target focus positions, and calculate a focus position with respect to an object when each of the plurality of images is captured. In this case, the instructions further cause the processor to reconstruct an image focused with respect to at least one target focus position by using at least a part of the plurality of images based on a difference between the plurality of target focus positions and the focus positions calculated with respect to the plurality of images.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
With reference to the attached drawings, embodiments of the present invention will be described in detail below. A digital camera will be described below as an example, but the present disclosure is applicable to any image capture apparatus capable of adjusting a focus position.
A control circuit 201 may be a signal processor such as a CPU or an MPU and controls the components of the digital camera 200 by reading programs stored in advance in a ROM 205, which will be described below. For example, the control circuit 201 may transmit a command associated with the start and end of image capturing to the image sensor 204, which will be described below. Alternatively, the control circuit 201 may transmit a command associated with image processing to the image processing circuit 207, which will be described below, based on settings prepared in advance. A user's command is input to the digital camera 200 through an operating member 210, which will be described below, and reaches the corresponding components of the digital camera 200 through the control circuit 201.
A driving device 202 may include a motor, for example, and causes an optical system 203, which will be described below, to mechanically operate based on a command from the control circuit 201. For example, based on a command from the control circuit 201, the driving device 202 moves the position of a focus lens included in the optical system 203 to adjust the focal length of the optical system 203.
The optical system 203 may include a zoom lens, a focus lens, and a diaphragm, for example. The diaphragm is a mechanism configured to adjust the quantity of transmitted light. Changing the position of the lens changes the focus position. Note that the term “focus position” is defined with reference to an object unless otherwise specified.
The image sensor 204 is a photoelectric converter that photoelectrically converts an input optical signal into an electrical signal. For example, a CCD or a CMOS sensor is applicable as the image sensor 204. The structure of the image sensor will be described in detail below. The image sensor 204 has a moving image capturing mode and can capture a plurality of images that are serial in time as frames of a moving image.
The ROM 205 is a read-only non-volatile memory serving as a recording medium and stores operating programs for the blocks included in the digital camera 200 and parameters for operations of the blocks. A RAM 206 is a rewritable volatile memory and is usable as a temporary storage area for data output by operations performed by the blocks included in the digital camera 200.
An image processing circuit 207 is configured to perform various image processes such as white balance adjustment, color interpolation, and filtering on data of an image output from the image sensor 204 or an image signal recorded in an internal memory 209, which will be described below. The image processing circuit 207 is further configured to perform a compression process based on a standard such as JPEG on data of an image signal captured by the image sensor 204.
The image processing circuit 207 includes an application specific integrated circuit (ASIC) including circuits configured to perform specific processes. Alternatively, the control circuit 201 may perform processing based on a program read out from the ROM 205 so that the control circuit 201 can also provide a part or all of the functions of the image processing circuit 207. When the control circuit 201 provides all of the functions of the image processing circuit 207, the image processing circuit 207 does not need to be provided as separate hardware.
A display device 208 may be a liquid crystal display or an organic electroluminescence display configured to display an image temporarily saved in the RAM 206, an image saved in an internal memory 209, which will be described below, or a setting screen for the digital camera 200. The monitor 101 illustrated in the accompanying drawings may be used as the display device 208.
The internal memory 209 is configured to store an image captured by the image sensor 204, an image having undergone a process in the image processing circuit 207, and information regarding a focus position for image capturing. Instead of such an internal memory, the memory card 108 illustrated in the accompanying drawings may be used.
The operating member 210 may be a button, a switch, a key, a mode dial or the like attached to the digital camera 200, or a touch panel which also serves as the display device 208. A command from a user reaches the control circuit 201 through the operating member 210. The shutter button 102, the mode selection switch 103, the controller wheel 106, and the switch 107 illustrated in the accompanying drawings are examples of the operating member 210.
An apparatus movement detecting device 211 may be a gyro sensor and is configured to detect a movement of the digital camera 200, such as a movement in the yaw direction and the pitch direction, based on the change in angle per unit time, that is, the angular velocity of the digital camera 200.
Referring to the corresponding flowchart, the focus stacking processing according to the first embodiment will be described below.
The focus stacking processing starts when a command from a user reaches the control circuit 201 through the operating member 210. In step S501, the control circuit 201 sets a focus position for capturing the first image, an amount of change in focus position between images for capturing the second and subsequent images, and the number of images to be captured, based on the depth of field of the digital camera 200. From those values, a plurality of focus positions is calculated and set. Next, in step S502, the control circuit 201 moves a lens included in the optical system 203 to change the focus position of the digital camera 200. For the first image, the lens is moved to the preset focus position. For the second and subsequent images, the lens is moved to the focus position closest to the minimum-object-distance side or to the infinity side among the preset focus positions at which a capturing operation has not yet been performed. In step S503, the image sensor 204 is controlled to perform a capturing operation. In step S504, the control circuit 201 rates the images captured in step S503. The rating of captured images will be described in detail below.
In step S505, the control circuit 201 determines whether the captured images satisfy the standards or not. If they satisfy the standards, the processing moves to step S506, where it is determined whether the set number of images have been captured.
If the control circuit 201 determines in step S505 that the captured images do not satisfy the standards, the focus position set for the next capturing operation is corrected in step S507, and the processing returns to step S502. The correction of the set focus position in step S507 will be described in detail below.
If it is determined in step S506 that the set number of images have been captured, the control circuit 201 in step S508 performs a combining process to generate a combined image. If it is determined in step S506 that the set number of images have not been captured, the processing returns to step S502.
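The flow of steps S501 through S508 can be summarized as follows. This is a minimal Python sketch under the assumption that the camera exposes hypothetical helpers (move_focus, capture_image, measure_focus_position); it is illustrative only and omits the details of the rating and correction described below.

```python
def focus_stack_capture(camera, targets, tolerance, gap_limit):
    """Minimal sketch of the capture loop in steps S501-S507 (helper names hypothetical)."""
    captured = []
    reference = None                  # previous actual focus position (see step S704)
    i = 0
    while i < len(targets):           # step S506: repeat until every target is used
        camera.move_focus(targets[i])                     # step S502: move the lens
        image = camera.capture_image()                    # step S503: capture
        actual = camera.measure_focus_position(image)     # step S601: actual position
        captured.append(image)
        if abs(actual - targets[i]) > tolerance:          # steps S504/S505: rating fails
            # step S507: if the gap between the previous and current actual focus
            # positions is too large, insert an intermediate target so that no
            # region ends up out of focus in both neighbouring images.
            if reference is not None and abs(actual - reference) > gap_limit:
                targets.insert(i + 1, (actual + reference) / 2.0)
        reference = actual            # step S704: update the reference focus position
        i += 1
    return captured                   # the images are then combined in step S508
```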
The combining of images in step S508 may be based on a publicly known method, an example of which is described below. First, for alignment, a sum of absolute differences (SAD) between the pixel outputs of two images is acquired while the relative position between the images is changed. The relative moving amount and moving direction of the two images are taken as those that give the lowest sum of absolute differences. Then, after a transform coefficient for an affine transformation or a projective transformation is calculated from the acquired moving amount and moving direction, the transform coefficient is optimized by the least squares method so that the error between the moving amount given by the transform coefficient and the moving amount calculated from the sum of absolute differences is minimized. Based on the optimized transform coefficient, a deformation process is performed on the images to be aligned. The image processing circuit 207 performs the alignment and the deformation process on all of the images captured by the image sensor 204 in step S503, and a combination ratio is then given to each region of each image. As an example, among a plurality of images corresponding to one identical region, the image processing circuit 207 gives a combination ratio of 100% to the pixels of the image in which that region is in focus and a combination ratio of 0% to the pixels of the other images. Alternatively, a combination ratio may be assigned to each image according to the in-focus degree of the region in that image. In order to prevent unnaturalness at the combination boundaries, the image processing circuit 207 changes the combination ratio between adjacent pixels in a stepwise manner. Finally, a combined image is generated based on the combination ratio of each pixel.
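As a concrete illustration of the alignment and blending described above, the following Python sketch estimates a translation by minimizing the sum of absolute differences and then gives a 100% combination ratio to the sharpest image at each pixel. The affine or projective optimization and the stepwise smoothing of the combination ratio are omitted, and the function names are illustrative rather than the apparatus's actual implementation.

```python
import numpy as np

def sad_shift(ref, img, search=8):
    """Estimate an integer (dy, dx) translation by minimizing the SAD."""
    best, best_shift = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            sad = np.abs(ref - shifted).sum()
            if sad < best:
                best, best_shift = sad, (dy, dx)
    return best_shift

def focus_measure(img):
    """Simple per-pixel focus measure: squared gradient magnitude."""
    gy, gx = np.gradient(img.astype(float))
    return gy ** 2 + gx ** 2

def combine(images):
    """Align each image to the first one and keep the sharpest pixel."""
    ref = images[0].astype(float)
    aligned = [ref]
    for img in images[1:]:
        dy, dx = sad_shift(ref, img.astype(float))
        aligned.append(np.roll(np.roll(img.astype(float), dy, axis=0), dx, axis=1))
    stack = np.stack(aligned)                            # shape (N, H, W)
    sharpness = np.stack([focus_measure(a) for a in aligned])
    best = sharpness.argmax(axis=0)                      # sharpest image per pixel
    return np.take_along_axis(stack, best[None], axis=0)[0]
```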
Next, the rating of captured images in step S504 will be described.
In step S602, the control circuit 201 calculates a displacement amount of the focus position. The displacement amount is calculated from the focus position for capturing acquired in step S601 and the focus positions set in step S501. For example, in a case where the focus position acquired in step S601 relates to the Nth captured image, it may be compared with the Nth focus position among the focus positions set in step S501 to calculate the displacement amount of the focus position.
In step S603, the control circuit 201 determines whether the displacement amount of the focus position is lower than or equal to a predetermined value or not.
If it is determined in step S603 that the displacement amount of the focus position is higher than the predetermined value, the control circuit 201 moves the processing to step S604, where it is determined that the standards are not satisfied, and exits the processing.
If it is determined in step S603 that the displacement amount of the focus position is lower than or equal to the predetermined value, the control circuit 201 moves the processing to step S605, where it is determined that the standards are satisfied, and exits the processing. Based on the determination result of step S604 or S605, the determination in step S505 described above is performed.
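Assuming that the focus position actually used for the Nth image can be read back from the apparatus, the rating of steps S601 through S605 reduces to a threshold comparison, as in the following illustrative sketch.

```python
def rate_image(actual_position, target_positions, index, threshold):
    """Steps S602-S605: compare the Nth actual focus position with the Nth target."""
    displacement = abs(actual_position - target_positions[index])    # step S602
    return displacement <= threshold                                 # steps S603-S605
```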
The processing for rating captured images performed in step S504 has been described up to this point.
Next, processing for correcting a focus position performed in step S507 will be described.
In step S701, the control circuit 201 calculates the distance between the focus position for the previous capturing operation and the focus position for the current capturing operation. In step S702, the control circuit 201 determines whether the distance between the focus positions calculated in step S701 is higher than a predetermined value or not. The predetermined value is determined based on the focal length and an acceptable circle-of-confusion diameter of the image capture apparatus. A distance higher than the predetermined value may cause a region that is out of focus in both of the two images.
If the control circuit 201 determines in step S702 that the distance between the focus positions is higher than the predetermined value, the control circuit 201 moves the processing to step S703, where a new focus position is added between the previous image capturing position and the current image capturing position. At the same time, the control circuit 201 adds one to the planned number of images checked in step S506. Then, the control circuit 201 moves to step S704. In a case where the control circuit 201 adds a new focus position in step S703, image capturing may be performed with the added focus position in the immediately following step S503. Although the image capturing with the added focus position could instead be performed after capturing with another focus position is completed, camera shake may occur in the meantime, and the added focus position would then have to be corrected again based on that camera shake.
On the other hand, if the control circuit 201 determines in step S702 that the distance between the focus positions is lower than or equal to the predetermined value, a new focus position is not added, and the processing moves to step S704.
In step S704, the focus position set in step S501, or the focus position updated the last time step S704 was performed, is updated with the new focus position. In other words, the reference focus position used for calculating the displacement amount of the focus position in step S602 described above is updated.
The control circuit 201 corrects the focus position in step S704 in this manner and exits the processing of this flowchart.
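The correction of steps S701 through S704 can be sketched as follows; gap_limit stands for the predetermined value derived from the focal length and the acceptable circle-of-confusion diameter, and all names are illustrative assumptions.

```python
def correct_focus_plan(prev_actual, curr_actual, targets, planned_count, gap_limit):
    """Sketch of steps S701-S704."""
    distance = abs(curr_actual - prev_actual)            # step S701
    if distance > gap_limit:                             # step S702
        # step S703: add a focus position between the previous and current ones
        # and increase the planned number of images checked in step S506.
        targets.append((prev_actual + curr_actual) / 2.0)
        targets.sort()
        planned_count += 1
    # step S704: the current actual focus position becomes the new reference used
    # when the displacement amount is calculated in step S602.
    reference = curr_actual
    return targets, planned_count, reference
```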
Accordingly, as illustrated in the accompanying drawings, image capturing can be performed with the corrected focus positions so that no region is left out of focus between adjacent images.
According to the first embodiment, in a case where a plurality of images captured at different focus positions are combined, the displacement amount between each focus position at which an image was actually captured and the corresponding planned focus position is calculated. The focus position for the next capturing operation can thus be adjusted to reduce blurring in the resulting combined image.
In a case where larger camera shake occurs toward the forward side, capturing may end before the last planned focus position is reached even though the planned number of images has been captured. Accordingly, even when capturing of the planned number of images is determined to be complete in step S506, a focus position may further be added so that capturing continues until the last planned focus position is reached.
According to a second embodiment, the present disclosure is applied to an image capture apparatus having a focus detection sensor separate from the image sensor. Details thereof will be described with reference to the drawings. Descriptions overlapping those of the first embodiment are omitted.
In step S1001, the control circuit 201 sets an initial focus position, an amount of change in focus position, and the number of capturing operations, and in step S1011 inserts the mirror 903 into the imaging optical path. In step S1013, before the capturing in step S1003, the control circuit 201 causes the mirror 903 to be withdrawn from the imaging optical path.
In step S1012, the AF sensor 906 acquires ranging information, and the acquired distance information is used by the control circuit 201 for the rating of captured images in step S1004. There is actually a time difference between when the ranging information is acquired in step S1012 and when image capturing is performed in step S1003, but the difference is small and does not cause a problem. Thus, the ranging information acquired in step S1012 may be used for the rating of captured images in step S1004.
According to the second embodiment, even in an image capture apparatus having a focus detection sensor separate from the image sensor, a plurality of images captured at different focus positions can be combined to generate a combined image with reduced blurring.
According to a third embodiment, the present disclosure is applied to an image capture apparatus which estimates the focus position used for capturing from an analysis of contrast values of the captured images, instead of directly calculating the displacement amount and direction of the focus position. The third embodiment will be described in detail with reference to the drawings. Descriptions overlapping those of the first embodiment are omitted.
In step S1101, the control circuit 201 sets an initial focus position, an amount of change in focus position, and the number of images to be captured based on user settings. In step S1102, the control circuit 201 moves the focus position of the image capture apparatus to the focus position closest to the infinity side among the focus positions set in step S1101. Next, in step S1103, the image capture apparatus performs a capturing operation at the set focus position.
In step S1104, the image processing circuit 207 sets a point of interest and analyzes its contrast value on the images. The image processing circuit 207 may extract an edge from an image and set many points of interest along the edge.
Next, the analysis of the contrast value at a point of interest in step S1104 will be described with examples. For simplicity, only the analysis of points of interest 1201 and 1202 will be described.
The image processing circuit 207 may set as many points of interest as possible in step S1104 for higher accuracy in determining the direction of camera shake. For the determination of whether to add a focus position in step S1105, the image processing circuit 207 does not use a point of interest whose contrast values do not allow an approximate curve to be calculated.
When camera shake occurs, an image usable for generation of a combined image cannot be captured at the focus position 1220 or 1230. Therefore, the control circuit 201 determines in step S1105 that addition of a focus position is necessary. Next, in step S1106, a focus position is added, and in step S1107 the image capture apparatus captures an image at the added focus position. The added focus position may be the focus position 1220 or 1230, but it may be any position close to the focus position 1220 or 1230 as long as no out-of-focus gap occurs.
On the other hand, if the control circuit 201 in step S1105 determines that addition of a focus position is not necessary, the processing directly moves to step S1108. In step S1108, the image processing circuit 207 performs combining processing.
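The estimation used in the third embodiment can be illustrated as follows. The sketch assumes that a contrast value has been computed for each point of interest at every captured focus position, and it uses a parabola fit as a stand-in for the approximate curved line mentioned above; when no curve can be fitted, the point of interest is excluded as described.

```python
import numpy as np

def estimate_peak_position(focus_positions, contrast_values):
    """Fit a parabola to contrast-vs-focus samples and return the estimated peak.

    Returns None when no meaningful curve can be fitted, in which case the
    point of interest is not used for the decision in step S1105.
    """
    x = np.asarray(focus_positions, dtype=float)
    c = np.asarray(contrast_values, dtype=float)
    if len(x) < 3 or np.ptp(c) < 1e-6:
        return None
    a, b, _ = np.polyfit(x, c, 2)      # approximate curve c ~ a*x^2 + b*x + d
    if a >= 0:                         # no maximum, so the curve is not usable
        return None
    return -b / (2.0 * a)              # focus position giving the contrast peak
```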
According to the third embodiment, when a plurality of images captured at different focus positions are to be combined, the focus position for the next capturing operation can be adjusted to generate a combined image with reduced blurring, without directly calculating the amount and direction of displacement due to camera shake.
According to a fourth embodiment, unlike the first to third embodiments, image combining is performed by using a refocusable image capture apparatus.
An image capture apparatus according to the fourth embodiment may have the same configuration as the one described above, except for the structure of the image sensor 204, which will be described below.
By using the image sensor as illustrated in the accompanying drawings, each pixel array 130 outputs signals from a plurality of photoelectric conversion units that share one microlens, and those signals can be used for the focus detection and refocus processing described below.
Here, a method will be described which calculates a focus position (refocus plane) corresponding to an object within a certain range.
The correspondences between the pupil divided regions a11 to a55 of the imaging lens 141 and the photoelectric conversion units in the pixel array 130 are illustrated in the accompanying drawings.
The photoelectric conversion units in the pixel array 130 receive light beams passing through pupil regions of the imaging lens 141 that are different from each other. A plurality of the divided pixel signals is combined to generate a pair of signals that are pupil-divided in the horizontal direction.
Expression (1) integrates the light beams passing through the left side regions (pupil regions a11 to a52) of the exit pupil of the imaging lens 141 for each of the photoelectric conversion units of the pixel array 130. Applying this to a plurality of pixel arrays 130 arranged in the horizontal direction, an object image, called an image A, can be constructed from their output signals. Expression (2) integrates the light beams passing through the right side regions (pupil regions a14 to a55) of the exit pupil of the imaging lens 141 for each of the photoelectric conversion units in one pixel array 130. Applying this to a plurality of pixel arrays 130 arranged in the horizontal direction, an object image, called an image B, can be constructed from their output signals. The control circuit 201 performs a correlation calculation on the image A and the image B to detect an image displacement amount (pupil division phase difference). The image displacement amount is multiplied by a transform coefficient determined by the focal position of the imaging lens 141 and the optical system, so that the focus position corresponding to an object within the frame can be calculated.
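A simplified illustration of this phase-difference detection is given below, assuming the raw data is arranged as an (H, W, N, N) array of photoelectric conversion unit outputs per microlens and that the last axis corresponds to the horizontal pupil direction; the exact region selection a11 to a52 and a14 to a55 is simplified here to a left/right split.

```python
import numpy as np

def pupil_divided_pair(lightfield):
    """Build the pupil-divided pair (image A and image B) from per-microlens data."""
    n = lightfield.shape[-1]
    image_a = lightfield[..., :, : n // 2].sum(axis=(-2, -1))   # left pupil half
    image_b = lightfield[..., :, n // 2 :].sum(axis=(-2, -1))   # right pupil half
    return image_a, image_b

def image_displacement(image_a, image_b, max_shift=10):
    """Find the horizontal shift (pupil division phase difference) between A and B."""
    row_a = image_a.mean(axis=0) - image_a.mean()
    row_b = image_b.mean(axis=0) - image_b.mean()
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        score = float(np.dot(row_a, np.roll(row_b, s)))
        if score > best_score:
            best_score, best_shift = score, s
    # the shift is multiplied by a transform coefficient to obtain the defocus amount
    return best_shift
```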
Next, processing for reconstructing an image on a refocus plane, that is, a set focus position, from the captured data acquired by the image sensor 204 will be described.
When the pixel receiving the light outputs L(x′, y′, u, v), the output E(x, y) obtained at the coordinates (x, y) on the refocus plane is equal to the result of integrating L(x′, y′, u, v) over the pupil regions of the imaging lens, which can be expressed by the following Expression (4).
Because the refocus coefficient α in Expression (4) is determined by the user, the position (x′, y′) of the microlens that the light enters can be acquired if (x, y) and (u, v) are given. Then, from the pixel array 130 corresponding to that microlens, the pixel corresponding to the position (u, v) can be identified, and this pixel outputs L(x′, y′, u, v). This processing is performed for all of the pupil divided regions, and the acquired pixel outputs are integrated to acquire E(x, y). When (u, v) are defined as representative coordinates of a pupil divided region of the imaging lens, the integration in Expression (4) can be calculated by a simple addition.
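The reconstruction can be sketched as follows. The mapping x′ = u + (x − u)/α used here is the commonly known refocus relation and is an assumption of this sketch; the pupil coordinates (u, v) are taken as representative coordinates of each pupil divided region so that the integration becomes a simple addition, as stated above.

```python
import numpy as np

def refocus(lightfield, pupil_coords, alpha):
    """Reconstruct E(x, y) on the refocus plane set by the refocus coefficient alpha.

    lightfield[y, x, j, i] is the output of the photoelectric conversion unit at
    pupil coordinate (u, v) = pupil_coords[j, i] under the microlens at (x, y).
    """
    height, width, n, _ = lightfield.shape
    out = np.zeros((height, width))
    ys, xs = np.mgrid[0:height, 0:width].astype(float)
    for j in range(n):
        for i in range(n):
            u, v = pupil_coords[j, i]      # representative pupil coordinates
            # microlens position (x', y') through which the beam from (u, v) passes
            xp = np.clip(np.rint(u + (xs - u) / alpha).astype(int), 0, width - 1)
            yp = np.clip(np.rint(v + (ys - v) / alpha).astype(int), 0, height - 1)
            out += lightfield[yp, xp, j, i]  # simple addition over the pupil regions
    return out
```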
The refocus method has been described up to this point. However, it is difficult to generate a correct refocused image if the refocus plane is not set within the refocus range around the focus position at which the original image was captured. This is because the angle distribution of the light beams entering the image capture apparatus, that is, the parallax amount of the parallax images, is limited by the aperture of the imaging lens and the diaphragm and by the pixel pitch of the image sensor. Next, a method for calculating the refocus range will be described.
Hereinafter, a two-dimensional intensity distribution of light will be called a light field space component. In this case, the refocus range depends on a sampling pitch Δy of a space component and a sampling pitch Δu of an angle component, and its coefficient α± is given by the following Expression (5).
In the configuration example illustrated in the accompanying drawings, Expression (5) can be rewritten as the following Expression (6).
α±·s2 = s2 ∓ N·F·Δy = s2 ∓ N·F·ΔLA (6)
Here, the pupil distance P of the imaging optical system 171 corresponds to a distance between the exit pupil plane of the imaging optical system 171 and an image side conjugate plane of the imaging optical system 171 with respect to the object plane 172. N is a one-dimensional number of divisions of a pupil of the imaging optical system 171, F is an F value of the imaging optical system 171, and ΔLA is a pitch between the pixel arrays 130.
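Under the reconstructed form of Expression (6) above, the admissible range of the refocus coefficient can be evaluated numerically. In the sketch below, s2 stands for the image-side distance term appearing in Expression (6); its precise definition follows the original drawings and is assumed here, so the helper is only an illustration of that relation.

```python
def refocus_alpha_range(s2, n_div, f_number, pitch):
    """Evaluate alpha± from Expression (6): alpha± * s2 = s2 -/+ N * F * pitch."""
    alpha_plus = (s2 - n_div * f_number * pitch) / s2    # minus sign gives alpha+
    alpha_minus = (s2 + n_div * f_number * pitch) / s2   # plus sign gives alpha-
    return alpha_plus, alpha_minus
```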
The method for calculating a refocus range has been described up to this point. Hereinafter, the term “refocus range” refers to an object side refocus range unless otherwise specified.
In step S1801, the control circuit 201 acquires distance information regarding an object. In step S1802, the control circuit 201 sets a focus position based on the distance information regarding the object and user's settings. As an example, a user may first use a touch panel to designate a region to bring into focus, and the control circuit 201 acquires distance information regarding the corresponding region and sets focus positions for images based on the region designated by the user.
In step S1803, the control circuit 201 captures images while the optical system 203 changes the focus position to each of the set focus positions, and acquires distance information for each of the captured images.
In step S1804, the control circuit 201 determines whether camera shake including a movement in the optical axis direction has occurred during the capturing operation in step S1803. More specifically, the control circuit 201 calculates the focus position for each capturing operation based on the distance information acquired in step S1803. The calculated focus position and the set focus position are compared to determine whether camera shake has occurred. As an example, a threshold value for the difference in focus position may be defined, and if the difference between the calculated focus position and the set focus position is equal to or higher than the threshold value, it may be determined that camera shake has occurred. Alternatively, the apparatus movement detecting device 211 may detect a movement of the image capture apparatus during the capturing operation and, if the amplitude of the movement is equal to or higher than a predetermined threshold value, it may be determined that camera shake has occurred. Determining the presence of camera shake from a movement detected by the apparatus movement detecting device 211 eliminates the need to calculate the focus position in step S1804; in that case, the focus position for the capturing operation is instead calculated during the generation of the refocused image in step S1805.
Here, the control circuit 201 may detect whether the camera is fixed and, if it detects that the digital camera is fixed to a fixture such as a tripod, it may determine that camera shake does not occur.
If it is determined that camera shake has occurred, the processing moves to step S1805, where a refocused image is generated. If it is determined that camera shake has not occurred, the refocus processing is omitted to reduce the processing time, and the processing directly moves to step S1806, where image combining is performed. The image combining in step S1806 generates a combined image in the same manner as in the first embodiment.
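The determination of step S1804 can be summarized as follows; the threshold values and helper arguments are illustrative assumptions rather than fixed parameters of the apparatus.

```python
def camera_shake_occurred(set_position, measured_position, threshold,
                          gyro_amplitude=None, gyro_threshold=None, on_tripod=False):
    """Sketch of step S1804; returns True when refocusing (step S1805) is needed."""
    if on_tripod:                        # a camera fixed to a tripod is assumed stable
        return False
    if gyro_amplitude is not None and gyro_threshold is not None:
        # determination based on the apparatus movement detecting device 211
        return gyro_amplitude >= gyro_threshold
    # determination based on the difference between set and calculated focus positions
    return abs(measured_position - set_position) >= threshold
```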
Next, the generating of refocused image in step S1805 will be described in detail.
If it is determined in step S1903 that refocusing to the initially set focus position is not possible, the control circuit 201 advances the processing to step S1904, where a new focus position is added. Then, a new image is captured with the added focus position. Next, in step S1905, the refocus correction amount is changed based on the difference between the focus position added in step S1904 and the actual focus position of the image captured with the added focus position. Finally, the processing moves to step S1906, where refocus processing is performed.
In a case where the control circuit 201 determines that refocusing to the initially set focus position is not possible, as in the example illustrated in the accompanying drawings, the additional image capturing and the correction of the refocus correction amount described above make it possible to generate a refocused image corresponding to the initially set focus position.
According to this embodiment, in a case where a plurality of images captured at different focus positions are to be combined, refocus processing is applied to reduce the influence of camera shake and the like, so that a combined image with higher quality can be acquired.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment (s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2016-183798, filed Sep. 21, 2016, and Japanese Patent Application No. 2016-191331, filed Sep. 29, 2016, which are hereby incorporated by reference herein in their entirety.
Foreign Application Priority Data: Japanese Patent Application No. 2016-183798, filed Sep. 21, 2016 (JP); Japanese Patent Application No. 2016-191331, filed Sep. 29, 2016 (JP).
References Cited, U.S. Patent Documents: U.S. Pat. No. 9,473,698 B2 (Muto), Oct. 2016; US 2018/0075617 A1 (Abe), Mar. 2018; US 2019/0028653 A1 (Minami), Jan. 2019.
References Cited, Foreign Patent Documents: CN 102801929, Nov. 2012; CN 103152520, Jun. 2013; CN 104270560, Jan. 2015; JP 2002-084444, Mar. 2002.