The present invention relates to an imaging apparatus configured to provide imaging used for a refocus process, and to the refocus process.
The known refocus technology combines a plurality of parallax images (or viewpoint images) each having a parallax obtained by imaging or image capturing from a plurality of imaging positions (viewpoints) and generates a refocused image as an image in which an in-focus state is adjusted after imaging. Japanese Patent Laid-Open No. 2011-022796 discloses a refocus process that generates a refocused image by shifting and combining a plurality of parallax images in accordance with the viewpoints of the plurality of parallax images and the object distance to be focused so that the same main object is superimposed on itself.
The conventional imaging determines an exposure value and a dynamic range for a main object to be focused, which is determined before the imaging. The luminance of an image acquired through imaging can be corrected through image processing. However, it is difficult to correct the luminance of an image that contains overexposed or underexposed areas, and color distortion of a high-chroma part, increased noise, etc. can occur. For example, when the exposure value is adjusted to the main object in an imaging scene having a large brightness (luminance) difference, another object may suffer from overexposure or underexposure. In that case, a proper exposure may not be obtained even when the luminance of the other object, in particular an overexposed one, is corrected through image processing.
Assume that a user attempts to generate, through a post-imaging refocus process, an image focused on an object different from the main object. The object can then be brought into focus, but a proper luminance may not be obtained, depending on the exposure condition in the imaging.
The present invention provides an imaging apparatus and the like that can provide a refocused image in which each object has a proper luminance, even when the main object is changed in a refocus process after capturing an imaging scene having a large luminance difference.
An image processing apparatus according to one aspect of the present invention is configured to generate a refocused image through a refocus process with a plurality of parallax images each having a parallax acquired through imaging. The image processing apparatus includes one or more processors, and a memory storing instructions which, when executed by the one or more processors, cause the one or more processors to perform operations of units of the image processing apparatus. The units include: a range acquiring unit configured to acquire a refocusable range in the refocus process, which is a distance range in which a refocus is available; an exposure acquiring unit configured to acquire a plurality of first exposure values in accordance with luminance values at a plurality of distances in the refocusable range; an exposure setting unit configured to set a second exposure value as an exposure value in the imaging; a correction value acquiring unit configured to acquire a luminance correction value based on a refocus distance, which is a distance to be refocused in the refocus process, at least one first exposure value, and the second exposure value; and a processing unit configured to perform the refocus process using the luminance correction value.
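For orientation only, the claimed units can be rendered as Python method stubs. The class and method names below are illustrative paraphrases of the unit names, not an API defined by the invention.

```python
# A structural sketch of the claimed units, rendered as method stubs.
# All names are illustrative; the claim defines units, not an API.
class ImageProcessingApparatus:
    def acquire_refocusable_range(self):
        """Range acquiring unit: distance range in which refocus is available."""

    def acquire_first_exposure_values(self, refocusable_range):
        """Exposure acquiring unit: first exposure values in accordance with
        luminance values at a plurality of distances in the refocusable range."""

    def set_second_exposure_value(self):
        """Exposure setting unit: exposure value used in the imaging."""

    def acquire_luminance_correction_value(self, refocus_distance,
                                           first_exposure_values,
                                           second_exposure_value):
        """Correction value acquiring unit: luminance correction value from
        the refocus distance and the first and second exposure values."""

    def perform_refocus(self, parallax_images, luminance_correction_value):
        """Processing unit: refocus process using the luminance correction value."""
```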
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Referring now to the accompanying drawings, a description will be given of embodiments of the present invention.
The imaging unit 100 in this embodiment captures images of the same imaging scene from a plurality of viewpoints (imaging positions) in accordance with one imaging command, and obtains a plurality of pieces of image data each having a parallax (which will be referred to as “a plurality of parallax images” hereinafter).
A central processing unit (referred to as a “CPU” hereinafter) 101 is a processor configured to generally control each component in the imaging apparatus. A RAM 102 is a memory that serves as a main memory, a work area, etc. for the CPU 101. A ROM 103 is a memory that stores a control program etc. executed by the CPU 101. A bus 104 is a transmission channel for various types of data; for example, the image data obtained by the imaging unit 100 is transmitted to a predetermined processing unit via the bus 104. An operating unit 105 is an input device configured to input a command from the user into the CPU 101, and includes operating members such as a button, a mode dial, and a touch screen having a touch input function.
A display unit 106 includes a liquid crystal display or the like, and displays images, letters, etc. The display unit 106 may include the touch screen included in the operating unit 105. A display control unit 107 controls the display of images, letters, etc. on the display unit 106.
An imaging control unit 108 controls focusing, opening and closing of a shutter, adjustment of an aperture diameter of an aperture stop in the imaging unit 100, and the like, based on commands from the CPU 101. A digital signal processing unit 109 performs various types of image processing, such as a white balance process, a gamma process, and a noise reduction process, for image data received via the bus 104 (including a refocused image, which will be described later), and generates digitally processed image data.
An encoder unit 110 converts the digitally processed image data received via the bus 104 into a file format such as JPEG or MPEG. An external memory control unit 111 is an interface that connects the imaging apparatus to a personal computer or another medium, such as a hard disk drive, an optical disc drive, or a semiconductor memory. The image data obtained or generated by the imaging apparatus is output to an external storage unit via the external memory control unit 111 and stored there.
An image processing unit 112 performs a refocus process, which will be described later, using a plurality of parallax images obtained by the imaging unit 100, generates a refocused image, and performs image processing that generates an output image using digitally processed image data output from the digital signal processing unit 109. The CPU 101 and the image processing unit 112 constitute an image processing apparatus.
A description will now be given of the refocus process using a plurality of parallax images, taking as an example two objects A and B located at different object distances and captured in parallax images 410 and 411.
The object images 401 and 402 have parallaxes depending on the object distances of the objects A and B. The refocused images 420 and 421 are obtained by combining the parallax images 410 and 411 with different shift amounts. The refocused image 420 is obtained by shifting and combining the parallax images 410 and 411 so that the object image 401 is superimposed on itself, and the object image 401 (the object A as a main object) is in focus. On the other hand, the object image 402 has, in the parallax images 410 and 411, a parallax different in magnitude from that of the object image 401, and is thus combined at shifted positions in the refocused image 420. Hence, the object image 402 is blurred in the refocused image 420.
The refocused image 421 is obtained by shifting and combining the parallax images 410 and 411 so that the object image 402 is superimposed on itself, and the object image 402 (the object B as the main object) is in focus. On the other hand, the object image 401 has, in the parallax images 410 and 411, a parallax different in magnitude from that of the object image 402, and is thus combined at shifted positions in the refocused image 421. Hence, the object image 401 is blurred in the refocused image 421.
By shifting and combining the plurality of parallax images by a shift amount determined based on the object to be focused, a refocused image can be generated in which a predetermined object distance (in-focus distance) is in focus and in which each other object has a blur depending on its distance difference from the in-focus distance.
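For illustration only, the shift-and-combine operation described above can be sketched in Python as follows. The array representation, the averaging, and the assumption that integer pixel shifts for the desired in-focus distance are already known are simplifications, not part of the embodiment.

```python
# A minimal sketch of refocus by shifting and combining parallax images,
# assuming equally sized 2-D NumPy arrays and known per-image shifts.
import numpy as np

def refocus(parallax_images: list, shifts: list) -> np.ndarray:
    """Shift each parallax image by (dy, dx) pixels and average.

    The shifts are chosen so that the object at the desired in-focus
    distance is superimposed on itself; objects at other distances are
    combined at offset positions and therefore blur."""
    acc = np.zeros_like(parallax_images[0], dtype=np.float64)
    for img, (dy, dx) in zip(parallax_images, shifts):
        acc += np.roll(img, shift=(dy, dx), axis=(0, 1))
    return acc / len(parallax_images)
```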
A description will now be given of an illustrative imaging scene having a large luminance difference, which is the situation this embodiment addresses.
Accordingly, in imaging an imaging scene having a large luminance difference, this embodiment determines the exposure value in the imaging and corrects the luminance through image processing after the imaging, so that each main object can have a proper exposure even when the main object is changed in the refocus process.
A description will now be given, step by step along the flowchart of this process, of the exposure determination process that the CPU 101 executes for imaging.
In S700, the CPU 101 calculates (obtains) a refocusable range through the above-mentioned method. Next, in S701, the CPU 101 detects candidates of a main object (main object candidates) in the refocusable range based on image data acquired in an imaging preparation before the main imaging for acquiring the plurality of parallax images (image data for live-view images), and confirms the number of main object candidates. In the example illustrated in FIG. 5, there are two main object candidates (objects A and B) in the refocusable range. The main object candidates are detected by a known process, such as a face recognition process or an object detection process. When a plurality of main object candidates are detected, the CPU 101 determines one main object based on the detection reliability, the distance, the size, or another attribute of each candidate.
Next, in S702, the CPU 101 calculates a luminance value for the main object determined in S701, and obtains from it an exposure value that provides a proper exposure for the main object (a first exposure value corresponding to the object distance at which the main object is located).
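The embodiment does not specify how a measured luminance value is converted into an exposure value. As a hedged illustration, the following sketch assumes the standard reflected-light metering relation EV = log2(L * S / K), with a calibration constant K of about 12.5; the function name and parameters are hypothetical.

```python
# A minimal sketch of deriving a first exposure value from a measured
# object luminance, assuming the standard reflected-light relation
# EV = log2(L * S / K); not a formula given by the embodiment.
import math

def first_exposure_value(luminance_cd_per_m2: float, iso: float = 100.0,
                         k: float = 12.5) -> float:
    """Exposure value that renders an object of the given average
    luminance at a proper (mid-gray) level."""
    return math.log2(luminance_cd_per_m2 * iso / k)
```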
In S703, the CPU 101 determines whether or not only one main object candidate has been confirmed in S701. When there is only one main object candidate, the main object does not change in the refocus process, and thus the CPU 101 moves to S709, where it determines the exposure value calculated in S702 as the exposure value in imaging (referred to as an “imaging exposure value” hereinafter). On the other hand, when there are two or more main object candidates, the CPU 101 moves to S704.
In S704, the CPU 101 calculates a luminance value for a main object candidate different from the main object determined in S701. The luminance value is calculated similarly to the luminance value of the main object described in S702. Using the calculated luminance value, the CPU 101 calculates, as a candidate exposure value, the exposure value that provides the other main object candidate with a proper luminance level (a first exposure value corresponding to the object distance at which the other main object candidate is located).
In S705, the CPU 101 determines whether or not the calculation of a candidate exposure value has been completed for all main object candidates in the refocusable range. When it has not been completed, the flow returns to S704 to calculate the luminance values and candidate exposure values for the remaining main object candidates. When the calculation has been completed, the flow moves to S706.
In S706, the CPU 101 calculates a maximum exposure difference, which is the difference between the maximum exposure value (on the overexposure side) and the minimum exposure value (on the underexposure side) among the main object exposure value and the candidate exposure values calculated in the previous steps. When the maximum exposure difference is larger than a predetermined value, such as 1 EV, the CPU 101 determines that the dynamic range is to be expanded (set). Expanding the dynamic range is a process of capturing an image with an exposure value underexposed by 1 EV and applying, in the post-imaging image processing, a gamma curve that raises the intermediate luminance by 1 EV.
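The gamma curve used for the dynamic range expansion is not specified. The sketch below uses an illustrative curve, 2x / (1 + x), which lifts shadows and midtones by up to a factor of two (about 1 EV) while keeping values within range; the specific curve is an assumption.

```python
# An illustrative sketch of the dynamic range expansion in S706:
# capture 1 EV underexposed, then raise low and intermediate luminance
# while rolling highlights off smoothly. The curve is assumed.
import numpy as np

def dr_expansion_curve(x: np.ndarray) -> np.ndarray:
    """x: linear pixel values in [0, 1] from a capture underexposed by 1 EV."""
    x = np.asarray(x, dtype=np.float64)
    return 2.0 * x / (1.0 + x)  # gain ~2 near black, exactly 1.0 at x = 1
```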
Next, in S707, the CPU 101 sets an imaging exposure value (second exposure value) as the exposure value in the imaging for obtaining the plurality of parallax images. More specifically, the CPU 101 selects, as the imaging exposure value, the maximum exposure value (the one on the most overexposed side) among the main object exposure value and the candidate exposure values calculated so far.
Next, in S708, the CPU 101 calculates an exposure difference between the imaging exposure value set in S707 and the main object exposure value, and an exposure difference between the imaging exposure value and each candidate exposure value. The CPU 101 then associates each calculated exposure difference with the corresponding main object or main object candidate and stores (or records) it in the internal memory. For example, the calculated exposure differences may be recorded as accessory information of the image file. Then, the CPU 101 ends this process.
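The exposure determination in S703 to S708 can be summarized in an illustrative sketch. The dictionary representation and function name are hypothetical, while the 1 EV threshold, the choice of the maximum exposure value, and the sign of the stored differences follow the wording above.

```python
# A condensed sketch of S703-S708: exposure values are in EV, and
# candidate_evs maps each main object (candidate) to its first exposure
# value; all names are illustrative.
def determine_imaging_exposure(candidate_evs: dict) -> tuple:
    evs = candidate_evs.values()
    max_exposure_difference = max(evs) - min(evs)
    expand_dynamic_range = max_exposure_difference > 1.0  # S706: 1 EV threshold
    imaging_ev = max(evs)                                 # S707: maximum exposure value
    # S708: one exposure difference per object, e.g., stored as accessory
    # information of the image file.
    exposure_differences = {obj: imaging_ev - ev
                            for obj, ev in candidate_evs.items()}
    return imaging_ev, expand_dynamic_range, exposure_differences
```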
Referring now to the flowchart of the refocus process, a description will be given of the process that the CPU 101 executes after imaging.
In S801, the CPU 101 confirms the number of main object candidates in the refocusable range, similarly to S701 described above.
In S802, the CPU 101 determines whether or not only one main object candidate has been confirmed in S801, similarly to S703 described above.
Next, in S803, the CPU 101 determines, among the plurality of main object candidates, a refocus object as the main object to be focused (refocused) in the refocus process. In other words, the CPU 101 determines the refocus distance as the object distance to be refocused.
Next, in S804, the CPU 101 determines whether the final refocus distance is the CPU refocus distance, i.e., the refocus distance determined in S803, or a user refocus distance adjusted from that distance by the user. The CPU 101 moves to S805 when the final refocus distance is the CPU refocus distance. The flow also moves to S805 when the final refocus distance is the user refocus distance and is closer than the closest main object candidate or farther than the farthest main object candidate. On the other hand, when the final refocus distance lies between one main object candidate and another (referred to as an “object intermediate distance” hereinafter), the CPU 101 moves to S806.
In S805, the CPU 101 reads out the exposure difference corresponding to the refocus object (or the CPU refocus distance) among the exposure differences stored in S708, and sets it as the refocus exposure difference.
On the other hand, in S806, the CPU 101 reads out the two exposure differences corresponding to the main object candidates located at the object distances before and after the refocus distance (object intermediate distance) among the exposure differences stored in S708.
In S807, the CPU 101 determines the refocus exposure difference in accordance with the refocus distance, based on the two exposure differences read out in S806. More specifically, the CPU 101 selects, of the two main object candidates, the one whose object distance is closer to the refocus distance, and determines the exposure difference corresponding to that candidate as the refocus exposure difference. Alternatively, the CPU 101 may calculate the refocus exposure difference by interpolating between the two exposure differences in accordance with the distance. Where the refocus distance is closer to the object A, the CPU 101 sets the exposure difference corresponding to the object A and read out in S806 as the refocus exposure difference. Where the refocus distance is closer to the object B, the CPU 101 sets the exposure difference corresponding to the object B and read out in S806 as the refocus exposure difference. Thus, the CPU 101 sets the refocus exposure difference in accordance with the refocus distance in S805 to S807. Then, the flow moves to S808.
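As an illustration of S807, the following sketch implements the nearest-neighbor selection between the two read-out exposure differences, with the interpolation alternative noted in a comment. The tuple representation and function name are assumptions.

```python
# A sketch of S807, assuming (distance, exposure_difference) pairs for
# the two candidates on either side of the refocus distance (S806).
def refocus_exposure_difference(refocus_distance: float,
                                near: tuple, far: tuple) -> float:
    d_near, diff_near = near   # e.g., object A: (distance, difference)
    d_far, diff_far = far      # e.g., object B: (distance, difference)
    if abs(refocus_distance - d_near) <= abs(refocus_distance - d_far):
        return diff_near       # refocus distance closer to object A
    return diff_far            # refocus distance closer to object B
    # Interpolation variant mentioned in the description:
    #   t = (refocus_distance - d_near) / (d_far - d_near)
    #   return (1.0 - t) * diff_near + t * diff_far
```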
In S808, the CPU 101 converts the refocus exposure difference, set in accordance with the refocus distance in S805 or S807, into a luminance correction value. A refocus exposure difference of n steps corresponds to a luminance ratio expressed as a power of two, 2^n, and this value serves as the luminance correction value. The luminance correction value is a gain value for the luminance correction.
In S809, the CPU 101 causes the image processing unit 112 to perform the refocus process using the luminance correction value determined in S808. In this refocus process, the image processing unit 112 applies the luminance correction process to the plurality of pre-combination parallax images obtained by imaging, or to the refocused image generated by combining the parallax images. Thereby, a good refocused image can be generated in which the main object (image) has a proper luminance.
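A sketch of the luminance correction in S808 and S809 follows. Whether the gain is applied to the pre-combination parallax images or to the combined refocused image is a choice the embodiment leaves open, and the usage lines referring to the earlier refocus() sketch are hypothetical.

```python
# A sketch of S808-S809 under the same assumptions: the refocus exposure
# difference n (in EV) becomes a gain of 2**n, applied either to each
# pre-combination parallax image or to the combined refocused image.
import numpy as np

def apply_luminance_correction(image: np.ndarray,
                               refocus_exposure_difference: float) -> np.ndarray:
    gain = 2.0 ** refocus_exposure_difference  # S808: n EV -> gain 2^n
    return image * gain

# Usage with the earlier refocus() sketch (hypothetical names):
#   corrected = [apply_luminance_correction(img, n) for img in parallax_images]
#   refocused = refocus(corrected, shifts)
```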
In this embodiment, the CPU 101 stores the exposure differences between the imaging exposure value and the main object and candidate exposure values, obtains the refocus exposure difference corresponding to the refocus distance using those exposure differences, and acquires the luminance correction value based on the refocus exposure difference. Alternatively, the CPU 101 may store the imaging exposure value, the main object exposure value, and the candidate exposure values, obtain the refocus exposure value corresponding to the refocus distance based on these exposure values, acquire the refocus exposure difference as the difference between the imaging exposure value and the refocus exposure value, and finally acquire the luminance correction value. In other words, storing and using the exposure differences is equivalent to storing and using the imaging exposure value, the main object exposure value, and the candidate exposure values.
While this embodiment describes an imaging apparatus that includes the imaging unit and the image processing apparatus, the image processing apparatus may be configured separately from the imaging apparatus having the imaging unit. In this case, a plurality of parallax images acquired by the imaging apparatus (imaging unit) may be input into the image processing apparatus via communication or a recording medium.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2016-185617, filed on Sep. 23, 2016, which is hereby incorporated by reference herein in its entirety.