Field of the Invention
The present invention relates to an image processing apparatus, an image processing method, a recording medium, and an image pickup apparatus.
Description of the Related Art
There has been disclosed a technology of performing still image photography during moving image photography in a digital camera that uses an image pickup element such as a complementary metal oxide semiconductor (CMOS) sensor to obtain a still image or a moving image (Japanese Patent Application Laid-Open No. 2006-345485). According to the technology disclosed in Japanese Patent Application Laid-Open No. 2006-345485, a switch is temporarily made to a still image photography mode during the moving image photography to enable the still image photography during the moving image photography.
On the other hand, there has been disclosed a technology in which light beams that have passed through different regions of an exit pupil of an imaging optical system are acquired by an image pickup element, and signals obtained from the light beams that have passed through the different pupil regions are reconstructed to generate image data (refocused image) on an arbitrary focal plane in one photography (Japanese Patent Application Laid-Open No. 2007-004471).
According to one aspect of an embodiment, there is provided an image processing apparatus including: a setting unit configured to set a focal plane of a still image, which is generated from moving image data including a plurality of frames, each of which is refocusable; a reconstruction unit configured to reconstruct a predetermined number of the frames of the moving image data in accordance with the focal plane of the still image, which is set by the setting unit, to thereby generate the predetermined number of refocused images; and a combining unit configured to combine the predetermined number of the refocused images, which are generated by the reconstruction unit, to thereby generate the still image.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
In Japanese Patent Application Laid-Open No. 2006-345485 described above, the switch is made to the still image photography mode during the moving image photography to record the still image, and hence there is a problem in that moving image recording is interrupted. Moreover, in a case where a moving image frame is extracted to generate the still image without interrupting the moving image recording, there is a problem in that a still image having exposure time that depends on a frame rate of the moving image recording is obtained.
In order to solve this problem, it may be contemplated to combine by addition a plurality of frames in a moving image to generate a still image having exposure time that is artificially changed. However, in a case where a focal plane has changed during the moving image photography, there is a problem in that, when the still image is generated by the combining by simple addition, the still image does not have a uniquely determined focal plane.
Moreover, in the technology disclosed in Japanese Patent Application Laid-Open No. 2007-004471 described above, it is difficult to obtain an image having a uniquely determined focal plane from a combined image.
Hereinafter, exemplary embodiments of the present invention are described in detail with reference to the accompanying drawings. Note that, in the following description, as the embodiments of the present invention, an example in which an image processing apparatus according to the present invention is applied to an image pickup apparatus is described. The image pickup apparatus, to which the image processing apparatus according to the present invention is applicable, includes a light field camera, a multiple-lens camera, and the like, which may be used to obtain imaging data from which a refocused image may be generated by reconstruction, and is an image pickup apparatus capable of moving image photography and still image photography during the moving image photography.
Referring to
On the image pickup surface of the image pickup element 102, a plurality of microlenses are arranged in a lattice pattern. Below each of the microlenses, a plurality of photoelectric conversion portions (pixel array) are arranged. Each of the microlenses pupil-divides a light flux passing through an exit pupil of the photographing optical system 101, and the plurality of photoelectric conversion portions are configured to respectively receive light beams that are pupil-divided by the microlenses. Therefore, the image pickup element 102 may be used to obtain pixel data that can be refocused from outputs of the plurality of photoelectric conversion portions (divided pixels). A detailed configuration of the image pickup element 102 is described later.
An A/D converter 103 is a converter for converting an analog signal from the image pickup element 102 into a digital signal. The A/D converter 103 outputs the digital signal obtained by the conversion to an image processing unit 104.
The image processing unit 104 includes circuits and programs that provide an image processing function, including a focus detection unit 105, a refocused image generation unit 106, and an image combining unit 107 to be described later. The image processing unit 104 performs predetermined image processing on imaging data output from the A/D converter 103 based on a control instruction from a system control unit 108, to thereby generate image data for recording and image data for display.
The focus detection unit 105 includes circuits and programs for performing a predetermined calculation based on the imaging data obtained through the A/D converter 103 so as to calculate an evaluation value to be used for lens position adjustment. The evaluation value calculated by the focus detection unit 105 is converted into a lens driving instruction through the system control unit 108. The obtained lens driving instruction is used to drive the photographing optical system 101. A detailed operation of the focus detection unit 105 is described later.
The refocused image generation unit 106 can generate images on the actually photographed physical focal plane and on a focal plane (virtual focal plane) different from the physical focal plane (the images are hereinafter referred to as “refocused images”) by performing combining processing on the imaging data obtained through the A/D converter 103. A detailed operation of the refocused image generation unit 106 is described later.
The image combining unit 107 combines a plurality of refocused images, which are generated by the refocused image generation unit 106, to generate one still image. A detailed operation of the image combining unit 107 is also described later.
The system control unit 108 not only controls an operation of the entire system but also acquires data obtained by image processing and lens position information of the photographing optical system 101, thereby functioning to mediate data between blocks. A control program to be executed by the system control unit 108 is stored in advance in a memory (not shown).
An operation unit 109 is an operation member such as various switches or dials mounted to the image pickup apparatus 100. A photographer can set photographing parameters and control a photographing operation by using the operation unit 109. The operation input by the operation unit 109 is output to the system control unit 108.
A display unit 110 is a display such as a liquid crystal display (LCD). The display unit 110 displays an image obtained by photographing, the image data stored in a memory unit 111, and various setting screens that are transferred from the system control unit 108. The memory unit 111 records image data, which is transferred via the system control unit 108, on a recording medium such as a Secure Digital (SD) card or a CompactFlash (CF) card, and reads out the recorded data from the recording medium.
Next referring to
Next, a configuration of the unit pixel is described.
The microlens 201 is arranged near an imaging plane of the photographing optical system 101, and collects light beams output from the photographing optical system 101. The light beams collected by the microlens 201 are transmitted through a color filter 204, and then converted into electric signals in the photoelectric conversion portions 202 and 203. Note that, the color filter 204 is a color filter of red (R), green (G), and blue (B) that are arranged in accordance with the Bayer array, for example.
Hereinabove, the configuration of the image pickup apparatus 100 to which the image processing apparatus according to this embodiment is applied has been described. Next, a still image photographing operation during the moving image photography in the image pickup apparatus according to this embodiment is described with reference to
When an operation to start the moving image photography is executed by the operation unit 109, the image pickup apparatus 100 executes a moving image photography processing operation in accordance with the flow charts of
In Step S401, the system control unit 108 performs preliminary operations such as autofocus and automatic exposure. When the operations preliminary to the photography are complete in Step S401, the system control unit 108 advances the processing operation to Step S402.
In Step S402, under the control of the system control unit 108, exposure is performed at a shutter speed that depends on a moving image photography frame rate, and pixel signals of a region to be photographed (effective pixel region) are output from the image pickup element 102. When the exposure operation is complete in Step S402, the system control unit 108 advances the processing operation to Step S403.
In Step S403, the image processing unit 104 subjects imaging data of the region to be photographed, which is acquired by the image pickup element 102 and the A/D converter 103, to predetermined image processing to generate image data for recording and for display. When the image processing is complete in Step S403, the system control unit 108 advances the processing operation to Step S404.
In Step S404, the system control unit 108 determines whether or not a still image photographing operation is performed by the operation unit 109 during the moving image photography. In a case where it is determined in Step S404 that the still image photographing operation is performed, the system control unit 108 advances the processing operation to Step S405, and in a case where it is determined that the still image photographing operation is not performed, the system control unit 108 advances the processing operation to Step S406.
The processing operation in Step S405 is described with reference to
In Step S405, the image combining unit 107 calculates and determines the number of frames to be combined in generating a still image based on a shutter speed and an exposure correction value (still image photographing conditions), which are set in advance, and the moving image recording frame rate. As illustrated in
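Although the disclosure gives no explicit formula, the frame-count determination in Step S405 can be sketched as follows. The function name, the handling of the exposure correction value as an EV offset, and the rounding are assumptions for illustration only.

```python
def frames_to_combine(frame_rate_hz, still_shutter_s, ev_correction=0.0):
    """Number of moving-image frames to combine for a desired still exposure.

    Each moving-image frame is exposed for roughly 1/frame_rate_hz seconds,
    so the desired still exposure time divided by the per-frame exposure
    gives the number of frames to combine.
    """
    # A positive EV correction of +1 doubles the effective exposure time.
    effective_exposure = still_shutter_s * (2.0 ** ev_correction)
    n = round(effective_exposure * frame_rate_hz)
    return max(1, n)  # always combine at least one frame
```

For example, at a 60 fps recording rate, a still shutter speed of 1/15 s corresponds to combining four consecutive frames.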
In Step S406, it is determined whether or not a moving image frame that is currently being processed by the image processing unit 104 is a frame to be combined that is selected by the image combining unit 107 in Step S405. In a case where it is determined in Step S406 that the currently processed frame is a frame to be combined, the system control unit 108 advances the processing operation to Step S407, and in a case where it is determined that the currently processed frame is not a frame to be combined, the system control unit 108 advances the processing operation to Step S408.
In Step S407, the image combining unit 107 marks the frame to be combined. In the case of
In Step S408, the data for recording, which is generated by the image processing unit 104, is recorded on a memory medium, which is inserted in a memory media slot (not shown) of the image pickup apparatus 100, via the system control unit 108 by the memory unit 111. Moreover, the data for display is displayed on the display unit 110 via the system control unit 108. When the moving image recording processing is complete in Step S408, the system control unit 108 advances the processing operation to Step S409.
In Step S409, the system control unit 108 determines whether or not an operation to stop the moving image photography is performed by the operation unit 109. In a case where it is determined in Step S409 that the operation to stop the moving image photography is performed, the system control unit 108 advances the processing operation to Step S410. On the other hand, in a case where the stopping operation is not performed in Step S409, the system control unit 108 returns the processing operation to Step S401 to start processing the next frame.
In Step S410, the system control unit 108 determines whether or not the still image photography is performed during the moving image photography. In a case where the still image photography is performed, the system control unit 108 advances the processing operation to Step S411, and in a case where the still image photography is not performed, the system control unit 108 ends the moving image photographing operation.
In Step S411, the image processing unit 104 generates the still image obtained by the still image photography that is performed during the moving image photography.
The still image generation processing in Step S411 is described with reference to the flow chart of
When the still image generation processing is started, first in Step S412, the refocused image generation unit 106 determines a virtual focal plane of a plurality of frames of a moving image, which are to be combined for generating the still image. As illustrated in
Note that, focal plane information of each of the moving image frames is acquired by the focus detection unit 105. In the moving image photography, the focus detection unit 105 first generates divided pupil images from the image signals of the divided pixels of the unit pixels in an autofocus region. As already described in the description of the image pickup element 102, the divided pixels 202 and 203 receive the light beams that have passed through the different regions on the exit pupil, respectively. Therefore, outputs of the divided pixels, which are in a positional relationship that is symmetric with respect to a center of the microlens, may be collected to obtain a pair of divided pupil signals (divided pupil images). Next, the focus detection unit 105 performs a correlation calculation on the obtained pair of divided pupil images to calculate a difference between the signals, that is, an amount of image shift. In the correlation calculation, absolute values of difference values of overlapping pixels are integrated while the pair of divided pupil images is shifted in a horizontal direction to acquire evaluation values. Next, the amount of shift at which the evaluation value becomes the lowest, that is, at which the degree of matching of the divided pupil images becomes the highest, is calculated. The focus detection unit 105 determines a drive amount of the photographing optical system 101 based on the amount of shift, and the system control unit 108 drives the photographing optical system in accordance with the determined drive amount to perform a focal plane adjustment. At this time, the focus detection unit 105 outputs the focal plane information on each frame from a focus lens position after the focus adjustment. Note that, those technologies are publicly known, and hence details thereof are omitted.
Moreover, the example in which the absolute values of the differences of the pixels between the divided pupil images are used as the evaluation values has been described herein, but the evaluation values may be acquired by a sum of squares of the difference values or other such methods.
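The correlation calculation described above can be sketched in outline as follows; the function and variable names are hypothetical, and the normalization of each evaluation value by the overlap length is an added assumption (it avoids biasing the result toward shifts with smaller overlap).

```python
def image_shift(left, right, max_shift):
    """Return the horizontal shift (in pixels) of `right` relative to `left`
    that minimizes the mean absolute difference between the pair of
    divided-pupil signals, i.e. maximizes their degree of matching."""
    best_shift, best_score = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        # Indices of the region where the two shifted signals overlap.
        lo, hi = max(0, s), min(len(left), len(right) + s)
        if hi <= lo:
            continue  # no overlap at this shift
        # Integrate absolute differences over the overlap, normalized.
        score = sum(abs(left[i] - right[i - s]) for i in range(lo, hi)) / (hi - lo)
        if score < best_score:
            best_shift, best_score = s, score
    return best_shift
```

The resulting shift would then be converted into a lens drive amount, as the passage above describes.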
In Step S413, the refocused image generation unit 106 performs refocused image generation processing on the plurality of frames to be combined, which are identified by the markers, based on the virtual focal plane determined in Step S412 to unify the focal planes of the plurality of frames to be combined. Alternatively, the virtual focal plane of each frame is set to be closer to the reference focal plane than the physical focal plane of each frame to perform the refocused image generation processing.
Next referring to
The light beams that have passed through the exit pupil regions 302 and 303, which have been described with reference to
With the use of the shift-and-add operation (reconstruction processing) as described above, refocused images that are in focus on an arbitrary subject region may be generated. By the reconstruction processing described above, the refocused images corresponding to the reference virtual focal plane, which is determined in Step S412, are generated.
Note that, the method of the refocus processing has heretofore been described taking as an example the case where there are two divided pixels for one microlens 201, but the range of the present invention is not limited thereto. Any method may be employed as long as the reconstruction processing corresponding to the pixel structure such as the number of divided pixels and a configuration of the optical system is performed to obtain the refocused images corresponding to the virtual focal plane. Those technologies are publicly known, and hence details thereof are omitted.
Further, for the sake of simplicity of description, the example of the processing of generating the refocused images using the simple shift-and-add operation has been described. However, a method of obtaining the refocused image is not limited thereto. The refocused image on the virtual focal plane may be obtained by using another method. For example, processing using weighting addition as described in Japanese Patent Application Laid-Open No. 2007-004471 may be used.
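As a minimal illustration of the simple shift-and-add reconstruction, the sketch below assumes two divided pixels per microlens, represented as one-dimensional signals; the function name and the zero padding at the borders are assumptions, not taken from the disclosure.

```python
def refocus(left, right, shift):
    """Shift the right divided-pupil image by `shift` pixels relative to
    the left one and add them, synthesizing an image on a virtual focal
    plane; shift = 0 reproduces the physical (as-photographed) plane."""
    n = len(left)
    out = []
    for i in range(n):
        j = i - shift
        # Samples shifted outside the field of view are padded with zeros.
        r = right[j] if 0 <= j < n else 0
        out.append(left[i] + r)
    return out
```

Varying `shift` moves the virtual focal plane, which is how the focal planes of the frames to be combined can be unified in Step S413.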
Next in Step S414, the image combining unit 107 combines the plurality of refocused images, which are generated based on the virtual focal plane, to generate the still image corresponding to the shutter speed and the exposure correction value, which are set in advance.
As described above, in this embodiment, processing of generating the refocused images by the reconstruction is performed on the plurality of moving image frames to be combined to unify the virtual focal planes of the plurality of frames, and hence the still image having the uniquely determined focal plane may be generated even when the plurality of frames are combined.
In a case where an effect of a change in shutter speed accompanying the exposure correction is to be obtained, the frames are combined by addition to generate a brightened image. Moreover, in a case where the shutter speed is set, the photographer does not intend to change the exposure, and hence the frames are combined by an arithmetic mean to generate a still image having an apparently lower sensitivity. Repetition of the combining by addition may cause overexposure; hence, even when the number of frames to be combined is increased, the combining by the arithmetic mean is used so that an adjustment is always made to ensure normal exposure. How much the shutter speed is changed (setting of the shutter speed of the still image) may be set by an operation input by the user, or may be set automatically by exposure control. The setting is reflected in the setting of the number of images to be combined.
Note that, the examples of the combining processing by means of the combining by addition and the combining by the arithmetic mean have been described herein. However, the present invention is not limited thereto, and another method may be used to perform the combining processing depending on the purpose. For example, combining processing using normalization of exposure may be performed.
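The two combining modes described above may be sketched as follows; the function names, the flat-list frame representation, and the 8-bit clipping ceiling are assumptions for illustration only.

```python
def combine_addition(frames, max_value=255):
    """Combine frames by addition: exposures accumulate, so the result
    brightens; values are clipped to the valid range to model saturation."""
    return [min(max_value, sum(px)) for px in zip(*frames)]

def combine_mean(frames):
    """Combine frames by arithmetic mean: exposure stays constant while
    the apparent sensitivity (and noise) is reduced."""
    return [sum(px) / len(frames) for px in zip(*frames)]
```

Combining by addition thus models a slower shutter with increased exposure, while the arithmetic mean models a slower shutter at constant exposure.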
Finally, in Step S415, the still image data, which is generated in Step S414, is recorded on the memory medium inserted in the memory media slot (not shown) of the image pickup apparatus 100 via the system control unit 108 by the memory unit 111. Moreover, depending on the setting, data for confirmation of the still image may also be temporarily displayed on the display unit 110 via the system control unit 108. When the still image recording processing is complete, the still image generation processing is ended, and the moving image photography processing operation is ended.
As described above, according to this embodiment, in the case where the still image recording during the moving image photography is performed, the plurality of moving image frames based on the still image photographing condition are combined after the focal planes are brought closer to the virtual focal plane as the reference by the refocusing. This allows a good still image having exposure time that does not depend on the moving image recording frame rate and having a small shift of the focal plane to be generated.
The combining of the frames after the refocusing may be performed by simple addition, with the result that a still image having a slower shutter speed and increased exposure may be obtained. On the other hand, the combining by the arithmetic mean may be performed to obtain a still image having a slower shutter speed and constant exposure, and hence the apparent sensitivity may be reduced.
In this embodiment, the processing operation in which the still image is generated in accordance with the still image photographing operation during the moving image photography has been described. However, the present invention is not limited thereto, and the still image may also be generated at an arbitrary timing and with arbitrary settings while reproducing moving image recording data. In this case, the image processing apparatus according to this embodiment may also be realized with an information processing apparatus such as a personal computer (PC) having a function of reproducing the moving image data by giving the function of setting processing of the still image photography to the image processing apparatus.
Moreover, in selecting the plurality of frames to be combined, the example in which the frame at the time when the still image photographing operation is performed is set as the reference and a necessary number of subsequent frames are selected has been described. The present invention is not limited thereto, and a frame at the time of the still image photographing operation or at a specified arbitrary timing may be set as an intermediate frame or a final frame. In this manner, it is possible to obtain effects similar to those obtained when timings of a front curtain and a rear curtain of the shutter are changed.
Moreover, as the method of determining the virtual focal plane in the refocusing, an example of the focal plane corresponding to the median value of all the frames to be combined has been described. However, the present invention is not limited thereto, and the focal plane at the time of the still image photographing operation or at the specified arbitrary timing may be used. Alternatively, the focal plane of the first frame or the last frame of the plurality of frames to be combined may be used. The focal plane may be changed to obtain an image intended by the photographer or an unexpected image.
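The alternative ways of choosing the reference virtual focal plane enumerated above can be sketched as a single selector; the mode names, and the representation of each frame's focal plane as a scalar lens position, are assumptions for illustration.

```python
from statistics import median

def reference_focal_plane(per_frame_planes, mode="median", pick_index=None):
    """Choose the reference virtual focal plane from the per-frame focal
    plane values of the frames to be combined (hypothetical selector)."""
    if mode == "median":
        # Focal plane corresponding to the median value of all frames.
        return median(per_frame_planes)
    if mode == "first":
        return per_frame_planes[0]
    if mode == "last":
        return per_frame_planes[-1]
    if mode == "at":
        # Focal plane at a specified arbitrary timing within the sequence.
        return per_frame_planes[pick_index]
    raise ValueError(f"unknown mode: {mode}")
```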
Moreover, in a case where the focal plane has not changed or has changed only in a predetermined range in the plurality of frames to be combined, processing of determining the virtual focal plane as the reference (S412) and generating the refocused images (S413) may not be performed to reduce a processing load and power consumption.
Next referring to
When an operation to start the still image generation processing operation is executed by the operation unit 109 of the image pickup apparatus 100, the system control unit 108 executes the processing operation in the flow chart of
In Step S701, the system control unit 108 displays a list of convertible frame rates on the display unit 110, and sets the converted frame rate in accordance with the input by the operation unit 109. When the converted frame rate is set, the system control unit 108 advances the processing operation to Step S405.
In Step S405, as in the first embodiment, the image combining unit 107 calculates and determines the number of frames used for the combining based on the set frame rate. As illustrated in
In Step S702, the system control unit 108 determines whether or not there is any unconverted data in the recorded moving image data. In a case where there is unconverted data, the processing operation proceeds to Step S412, and in a case where there is no unconverted data, the still image generation processing operation is ended.
In Steps S412 to S414, as in the first embodiment, the virtual focal plane is determined, the refocused images having the unified focal plane are generated and combined for the plurality of frames to be combined, and converted frames 813 to 815 are generated. Note that, in the image combining in Step S414, the combining by the arithmetic mean or the normalization is performed so as not to cause overexposure due to the combining. When a new frame is generated by the image combining in Step S414, the system control unit 108 advances the processing operation to Step S703.
In Step S703, the still image data generated by converting the frame rate in Step S414 is recorded on the memory medium, which is inserted in the memory media slot (not shown) of the image pickup apparatus 100 via the system control unit 108 by the memory unit 111. When the processing of recording the still image data ends, the system control unit 108 returns the processing operation to Step S702, and repeats the processing operation of Steps S702, S412 to S414, and S703 until the conversion of the frame rate is complete for all the moving image data.
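The frame-rate conversion of the second embodiment, in which each group of consecutive frames is averaged into one converted frame, can be sketched as follows, assuming frames are flat lists of pixel values. The function name is hypothetical, and leftover frames that do not fill a complete group are simply dropped in this sketch.

```python
def convert_frame_rate(frames, group):
    """Average each run of `group` consecutive frames into one output
    frame, reducing the frame rate by that factor (e.g. 60 fps to 30 fps
    for group = 2) without causing overexposure."""
    out = []
    for start in range(0, len(frames) - group + 1, group):
        chunk = frames[start:start + group]
        # Arithmetic mean per pixel keeps the exposure of the result normal.
        out.append([sum(px) / group for px in zip(*chunk)])
    return out
```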
As described above, according to the second embodiment also, the images are combined after the focal planes are unified to generate the still image, and hence a good still image having the uniquely determined focal plane may be generated.
Note that, in Step S701, the range of the moving image data to be subjected to the frame rate conversion may also be specified. In that case, the determination in Step S702 is performed for the specified range of the moving image data.
Note that, in this embodiment, the light field data is acquired by an optical system as illustrated in
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
This application claims the benefit of Japanese Patent Application No. 2014-072323, filed Mar. 31, 2014, which is hereby incorporated by reference herein in its entirety.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
2014-072323 | Mar 2014 | JP | national

U.S. Patent Documents Cited

Number | Name | Date | Kind
---|---|---|---
20070252074 | Ng | Nov 2007 | A1

Foreign Patent Documents Cited

Number | Date | Country
---|---|---
2006-345485 | Dec 2006 | JP
2007-004471 | Jan 2007 | JP
2011-022796 | Feb 2011 | JP

Publication Data

Number | Date | Country
---|---|---
20150281578 A1 | Oct 2015 | US