The present invention relates to an image processing technology for performing dynamic range extension processing using a plurality of image data each having a different viewpoint.
There is a technology of extending a dynamic range (hereinafter referred to as DR) by combining a plurality of images. In an imaging device and the like, there is a method of generating a high dynamic range (HDR) image without overexposure or underexposure and saving the image as a file. Japanese Patent Laid-Open No. 2013-251724 discloses a technology of recording, as additional information in a file, image data after HDR combination together with the values of pixels that have changed between before and after the combination, so that an area with an unnatural combined result can be corrected later. In addition, in Japanese Patent Laid-Open No. 2014-160912, an HDR image is acquired by combining a plurality of light field images formed of a plurality of viewpoint images. A pixel value which is not used in the combination is replaced with a specific value at the time of saving a file, so that the file can be compressed and saved with high efficiency.
In addition, Japanese Patent Laid-Open No. 2016-58993 discloses a technology of combining an HDR image using pupil-divided images. Specifically, an imaging element is used in which a first pixel and a second pixel share a single microlens and receive light passing through different pupil areas of an imaging optical system. It is possible to generate an HDR image by combining, according to the brightness of a subject, image data obtained from one of the pixels and image data obtained by adding the values of the first pixel and the second pixel.
However, a technology capable of saving data of an HDR image combined using pupil-divided images as a file in a format in accordance with a purpose of use of a user has not been proposed.
The present invention provides an image processing device and an image processing method which can save an image combined using a multi-viewpoint image in a storage medium in a predetermined file format.
An image processing device according to the first embodiment of the present invention includes a memory storing instructions and a processor capable of executing the instructions, which cause the image processing device to: acquire, as input image data, a plurality of pieces of image data each having a different viewpoint; perform combining processing related to dynamic range extension on the input image data to generate output image data; and save the output image data, or the viewpoint image data together with the output image data, as a file in a predetermined file format in a storage medium.
Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
Hereinafter, each embodiment of the present invention will be described in detail with reference to drawings.
In the present embodiment, an example of an image processing device which can save, in a file, the image data from before the combining processing for dynamic range extension together with the combination parameters is shown.
A control unit 110 includes, for example, a central processing unit (CPU) which is a central unit for controlling the entire PC 100. An image processing unit 120 performs HDR combination processing using input image data. The HDR combination processing will be described below. A memory 130 is a random access memory (RAM) which temporarily stores a program or data supplied from the outside. The memory 130 is used as a temporary storage area for data output in accordance with execution of a program. A read only memory (ROM) 140 is a storage device which stores a program or parameters. The ROM 140 in the present embodiment stores a program code for software executed by the control unit 110, such as an application 200 (refer to
A storage medium 150 can be read and written by a computer. For example, a built-in memory included in a computer, a memory card detachably connected to the computer, or a medium capable of recording electronic data such as an HDD, a CD-ROM, an MO disk, an optical disc, a magneto-optical disc, and the like can be used. Digital data such as image data is stored as a file in the storage medium 150.
The operation unit 160 is constituted by a keyboard, a pointing device, and the like. A user performs an operation instruction for the PC 100 using the operation unit 160 to enable designation of input and output data, change of a program, execution of image processing, and the like.
A display unit 170 includes a display device such as a liquid crystal display and the like. A graphical user interface (GUI) screen of the application 200, a result of image processing, and the like are displayed on a screen of the display unit 170 for example. An internal bus 180 is a transmission path of control signals or data signals between respective elements in the PC 100. If the PC 100 has an imaging function, the PC 100 includes an imaging unit 190. The imaging unit 190 includes an imaging optical system having optical members such as a lens or an aperture, and an imaging element which photoelectrically converts an optical image formed through the imaging optical system. The control unit 110 and the image processing unit 120 perform image processing such as development on image data acquired from the imaging unit 190.
A pixel group 1210 of two rows and two columns in an upper left of
An editing operation area 240 is an area constituted by a GUI group for a user to perform an image editing operation. The GUI group includes, for example, objects such as buttons, sliders, check boxes, and numerical value input boxes. A user can issue an editing operation instruction assigned to a GUI object by operating that object in the editing operation area 240. Editing operations are, for example, image rotation, trimming, brightness adjustment, contrast adjustment, white balance adjustment, and noise removal. An HDR combination processing button 241 is a button for a user to click to instruct execution of HDR combination processing. A save processing button 250 is a button for a user to instruct saving of an editing result for an input image file. A setting button 251 is a button for a user to set the operation of the application 200. In addition, the application 200 has a general menu operation unit as an image processing application, but this is not shown.
A TIFF header section 301 of the RAW image file 300 is an area in which data for identifying a structure of a TIFF format file, an offset to a first IFD section, and the like are stored.
The following data is stored in each of the IFD sections 302 to 305.
Metadata A to D such as photographing information or parameters related to each piece of image data stored in image data sections 306 to 309.
Offset values E to H of the image data sections 306 to 309.
An offset value of a next IFD section.
In the IFD section positioned last, a specific offset value indicating that there is no next IFD section is stored. In addition, the IFD sections 302 to 305 include, in the metadata A to D, the size (the number of pixels in the vertical and horizontal directions) of the image data stored in the corresponding image data section, information indicating whether an image is a reduced image, and information on a pupil-divided image to be described below. Accordingly, an image processing device performing processing using the RAW image file 300 can read appropriate image data according to a purpose from the plurality of pieces of image data by referring to the IFD sections 302 to 305.
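For illustration only, the following sketch shows how a reader of such a file might follow the chain of IFD sections to locate the image data sections. The concrete byte layout, the field sizes, and the use of the value 0 as the end-of-chain marker are assumptions made for this sketch and are not the actual format of the RAW image file 300.

```python
import struct

END_OF_CHAIN = 0  # assumed sentinel value meaning "no next IFD section"

def read_ifd_chain(raw_file_path):
    """Walk the IFD sections of a RAW image file and collect the offset and
    size of the image data section that each IFD points to.  The byte layout
    used here (little-endian, a 4-byte offset to the first IFD at byte 4, and
    a fixed 12-byte IFD body) is a hypothetical simplification of the
    TIFF-style structure described above."""
    sections = []
    with open(raw_file_path, "rb") as f:
        header = f.read(8)                                   # TIFF header section
        first_ifd_offset, = struct.unpack_from("<I", header, 4)
        next_ifd = first_ifd_offset
        while next_ifd != END_OF_CHAIN:
            f.seek(next_ifd)
            # Hypothetical fixed layout: image data offset, image data size,
            # and the offset of the next IFD section.
            image_offset, image_size, next_ifd = struct.unpack("<III", f.read(12))
            sections.append({"offset": image_offset, "size": image_size})
    return sections
```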
The image data sections 306 to 309 are configured as a display image data section 306, a thumbnail image data section 307, an (A+B) image data section 308, and an A image data section 309. The display image data section 306 is an area for storing a display image to be displayed on the display unit 170 and the like. In the present embodiment, the data format of the display image is set to the Joint Photographic Experts Group (JPEG) format. The thumbnail image data section 307 is an area for storing a thumbnail image to be used for display in the thumbnail display area 220 of the application 200 and the like. The thumbnail image is an image obtained by reducing a display image through data thinning-out processing and the like. The (A+B) image data section 308 and the A image data section 309 are areas for storing RAW image data recorded by an imaging device capable of acquiring a pupil-divided image. Specifically, data is recorded by the following method.
The imaging device has an imaging element such as a charge-coupled device (CCD) type image sensor or a complementary metal oxide semiconductor (CMOS) type image sensor. Each of a plurality of main pixels constituting the imaging element is disposed under a single microlens, and has a first pixel and a second pixel which share that microlens and receive light passing through different pupil areas of an imaging optical system. Optical images received by the imaging element are subjected to photoelectric conversion and A (analog)/D (digital) conversion, and a pupil-divided image (A image) of the first pixel and a pupil-divided image (B image) of the second pixel are generated. The A image and the B image are viewpoint images having different viewpoints. Since the A image and the B image have different pupil intensity distributions and shading characteristics, shading correction is performed using the inverse characteristics thereof. By this correction, it is possible to correct uneven brightness caused by uneven vignetting amounts of the divided pupils. The imaging device performs processing of recording the A image data after shading correction in the A image data section 309 of the RAW image file 300. In addition, the imaging device performs processing of recording (A+B) image data obtained by adding the A image to the B image in the (A+B) image data section 308. At this time, since the pixel aperture of the A image corresponds to an exposure lower by one stage than that of the (A+B) image, the (A+B) image is an appropriately exposed image and the A image is an underexposed image. Instead of the A image, the B image may also be recorded as the underexposed image in the RAW image file 300. In addition, the bit depths of the (A+B) image and the A image are set to 14 bpp (bits per pixel) in the present embodiment, but they may also be recorded at other bit depths. In the present embodiment, the data stored in the (A+B) image data section 308 and the A image data section 309 is uncompressed RAW image data. The present embodiment is not limited to this form, and may also be configured to store reversibly compressed RAW image data in the (A+B) image data section 308 and the A image data section 309 and to perform decompression processing of the compressed RAW image data at the time of reading.
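A minimal sketch of the shading correction described above follows, assuming a per-pixel inverse gain map prepared in advance from the pupil intensity distribution; the name `inverse_shading_gain` and the clipping to 14 bpp are assumptions for illustration, not the actual correction of the embodiment.

```python
import numpy as np

def shading_correct(pupil_divided_image, inverse_shading_gain, bit_depth=14):
    """Correct the uneven brightness of a pupil-divided image (A image or B image)
    caused by uneven vignetting amounts of the divided pupils.  The image is
    multiplied by the inverse of its shading characteristic; `inverse_shading_gain`
    is a hypothetical per-pixel gain map with the same shape as the image."""
    corrected = pupil_divided_image.astype(np.float32) * inverse_shading_gain
    return np.clip(corrected, 0, 2 ** bit_depth - 1).astype(np.uint16)
```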
An editing parameter section 310 is an area for recording parameters of editing processing. The parameters of editing processing are parameters recorded when editing processing was previously performed on the RAW image file 300 by the application 200 or the like, and include an image editing parameter and an HDR combination parameter. The image editing parameter is, for example, a parameter of image editing processing executed by a user operating the editing operation area 240. In the present embodiment, the HDR combination parameter is a flag indicating whether HDR combination processing has been performed on the input RAW image data (hereinafter referred to as an HDR combination flag). If the HDR combination flag is ON, this means that HDR combination processing has been executed. As the HDR combination parameter, a determination result for each pixel may also be used.
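As a concrete picture of what the editing parameter section 310 might hold, a minimal sketch follows; the key names and value types are hypothetical and only illustrate the separation into an image editing parameter and an HDR combination flag.

```python
# Hypothetical contents of the editing parameter section 310.
editing_parameters = {
    "image_editing": {              # parameters of past image editing operations
        "rotation_degrees": 0,
        "brightness": 0.0,
        "white_balance": "auto",
    },
    "hdr_combination_flag": True,   # ON: HDR combination processing has been executed
}
```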
An operation of image processing and save processing according to the present embodiment will be described using
An operation of the application 200 including an HDR combination function and a file save function will be described with reference to the flowchart of
In S401, the control unit 110 determines whether an operation of selecting an input RAW image file has been performed by a user. The operation of selecting a file is performed, for example, by a user clicking one of the folders displayed in the folder tree display area 210 and then clicking one of the thumbnail images displayed in the thumbnail display area 220. If it is determined that the operation of selecting an input RAW image file has been performed, the procedure proceeds to processing of S402. On the other hand, if it is determined that an operation of selecting an input RAW image file has not been performed, the procedure proceeds to processing of S403.
In S402, the control unit 110 performs image reading processing. Details thereof will be described below. If the image reading processing ends, the control unit 110 returns to the processing of S400. In S403, the control unit 110 determines whether a GUI object of the editing operation area 240 has been operated by a user, that is, whether the image editing operation has been performed. If it is determined that the image editing operation has been performed, the procedure proceeds to processing of S404, and, if it is determined that an image editing operation has not been performed, the procedure proceeds to processing of S405.
In S404, the control unit 110 performs image editing processing in accordance with parameters or contents assigned to a GUI object operated in S403 on image data stored in the (A+B) image data section 308 of an input RAW image file. A result of image editing processing is presented to a user by being displayed in the preview area 230. If the image editing processing ends, the control unit 110 returns to the processing of S400. The image editing processing is not an essential matter of the present invention, and thus description will be omitted.
In S405, the control unit 110 determines whether the HDR combination processing button 241 has been pressed by a user. If it is determined that the HDR combination processing button 241 of
In S407, the control unit 110 determines whether the save processing button 250 has been pressed by a user. If it is determined that the save processing button 250 of
In S408, the control unit 110 performs file save processing. Details thereof will be described below. If the file save processing ends, the control unit 110 returns the processing to S400. In S409, the control unit 110 determines whether an end operation of the application 200 has been performed by a user. If it is determined that the end operation of the application 200 has been performed, the processing ends, and, if it is determined that the end operation has not been performed, the procedure returns to S400.
In the operation described above, after the processing starts, S403 to S408 may also be skipped until an input RAW image file is selected in S401. Alternatively, a specific input RAW image file among RAW image files displayed in the thumbnail display area 220 may also be set as an initial input RAW image file. The specific input RAW image file may be, for example, a RAW image file positioned first or last in an order based on display position, clip name, photographing date and time, or the like, or a RAW image file which has been subjected to previous image processing.
Next, with reference to a flowchart of
In S502, the control unit 110 executes the image editing processing. The image editing processing in accordance with an image editing parameter recorded in the editing parameter section 310 is performed on image data stored in the (A+B) image data section 308 of the input RAW image file. If the image editing processing ends, the procedure proceeds to the processing of S503.
In S503, the control unit 110 performs the determination processing of an HDR combination flag. If an HDR combination parameter exists in the editing parameter section 310 of the input RAW image file, and the determination condition in which the HDR combination flag is ON is satisfied, the procedure proceeds to the processing of S504. If the determination condition is not satisfied, the procedure proceeds to processing of S505. In S504, the control unit 110 performs HDR combination processing on the input RAW image file. Details thereof will be described below. If the HDR combination processing ends, the procedure proceeds to the processing of S505.
In S505, the control unit 110 performs preview display processing. After the image data of the display image data section 306 is acquired from the input RAW image file and processing of displaying the image data in the preview area 230 of
Next, with reference to
With reference to a flowchart of
In S703, the control unit 110 selects the A image data 601 from the acquired A image data 601 and (A+B) image data 603. In S704, the control unit 110 selects the (A+B) image data 603 from the acquired A image data 601 and (A+B) image data 603. In S705, the control unit 110 performs processing of generating an HDR combined image on the basis of the image data selected in S703 and S704. At this time, the control unit 110 gains up the A image data 601 at pixel positions at which the brightness of the subject image is equal to or greater than a threshold value, to generate A* image data 602. In S706, the control unit 110 performs processing of displaying the HDR combined image generated in S705 in the preview area 230, and ends the HDR combination processing.
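A minimal sketch of the per-pixel combination in S705 is shown below, assuming 14 bpp input data, a one-stage (x2) gain for the A image, and a hypothetical saturation threshold for the (A+B) image; the actual determination used in the embodiment is not reproduced here.

```python
import numpy as np

def hdr_combine(a_plus_b_image, a_image, threshold=2 ** 14 - 1, gain=2.0):
    """Generate an HDR combined image from the appropriately exposed (A+B) image
    and the underexposed A image.  At pixel positions where the subject brightness
    is equal to or greater than `threshold`, the A image gained up by one stage
    (A* image data) is used; elsewhere the (A+B) image is used."""
    a_star = a_image.astype(np.float32) * gain                  # A* image data 602
    bright = a_plus_b_image.astype(np.float32) >= threshold     # saturated region
    hdr = np.where(bright, a_star, a_plus_b_image.astype(np.float32))
    return np.clip(hdr, 0, 2 ** 15 - 1).astype(np.uint16)       # 15 bpp result
```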
Next, file save processing shown in S408 of
In S800, the control unit 110 acquires (A+B) image data from the (A+B) image data section 308 of the input RAW image file. In S801, the control unit 110 acquires A image data from the A image data section 309 of the input RAW image file. In S802, the control unit 110 acquires editing parameters. The editing parameters are as follows.
Parameters acquired from the editing parameter section 310 of an input RAW image file,
Parameters of image editing processing executed in S404 of
HDR combination parameters of HDR combination processing executed in S406 of
In S803, the control unit 110 performs processing of generating a display image and a thumbnail image using a result of the image editing processing in S404 of
In the present embodiment, a result of performing HDR combination processing on the RAW image file configured from the (A+B) image and the A image can be saved as a RAW image file having the same configuration as an original file. In addition, the result of HDR combination processing is saved in the same file format as the input RAW image file, and thereby image editing using the application 200 is possible again. According to the present embodiment, it is possible to provide an image processing device which is advantageous in saving HDR image data generated by performing HDR combination processing on a plurality of image data each having a different viewpoint acquired from one instance of photographing in a storage medium in a predetermined file format.
Next, a second embodiment of the present invention will be described. In the present embodiment, an example in which a user can select a format of a RAW image file to be saved from a plurality of different file formats will be described. A block diagram showing a configuration of a PC, a UI configuration diagram of the application 200, and a conceptual diagram of a RAW image file stored in the storage medium 150 in the present embodiment are the same as in
With reference to
A RAW image data section 908 is an area for storing a single piece of RAW image data. In the present embodiment, a bit depth of the RAW image data is set to 14 bpp or 15 bpp. A difference image data section 909 is an area for storing difference image data generated by file save processing. Details of data contents will be described below. In addition, the difference image data section 909 does not necessarily exist.
IFD sections 904 and 905 are IFD sections corresponding to the RAW image data section 908 and the difference image data section 909, respectively. In the second IFD section 904, information on the bit depth of RAW image data stored in the RAW image data section 908 is also recorded.
Next, the image processing and the file save processing according to the present embodiment will be described with reference to
In S1005, the control unit 110 searches for information on an IFD section of an input RAW image file, and determines whether both the (A+B) image data section 308 and the A image data section 309 exist in the input RAW image file. If the (A+B) image data section 308 and the A image data section 309 both exist, the procedure proceeds to processing of S1006, and, if they do not both exist, the procedure proceeds to processing of S1008. Processing of S1006 to S1010 is the same as processing of S405 to S409 of
Next, specific image reading processing of the present embodiment in S1002 of
In S1103, the control unit 110 searches for information on an IFD section of an input RAW image file, and determines whether the A image data section 309 exists in the input RAW image file. If it is determined that the A image data section 309 exists in the input RAW image file, the procedure proceeds to HDR combination flag determination processing of S1104. If it is determined that the A image data section 309 does not exist in the input RAW image file, the procedure proceeds to processing of S1106. The processing of S1104 and S1105 is the same as processing of S503 to S504 of
In S1106, the control unit 110 searches for information on an IFD section of an input RAW image file, and determines whether the difference image data section 909 exists in the input RAW image file. If it is determined that the difference image data section 909 exists in the input RAW image file, the procedure proceeds to processing of S1107, and, if it is determined that the difference image data section 909 does not exist in the input RAW image file, the procedure proceeds to processing of S1108.
In S1107, the control unit 110 adds difference image data of the difference image data section 909 to image data of the RAW image data section 908. In S1108, the control unit 110 acquires the image data of the display image data section 306 from the input RAW image file to display the data in the preview area 230, and ends the image reading processing. The image data displayed in the preview area 230 may also be image data generated using a result of performing processing in S1102, S1105, and S1107.
The HDR combination processing in S1007 of
Next, file save processing in S1009 of
In S1301, the control unit 110 selects the same file format as the input RAW image file, and proceeds to processing of S1306 of
In S1304, the control unit 110 determines whether the OK button 1204 of
In S1306 of
In S1309, the control unit 110 acquires HDR combined images obtained by the HDR combination processing in S406 of
In S1311, the control unit 110 acquires difference image data. The difference image data is obtained by subtracting an image obtained by clipping an HDR combined image at 14 bpp from the HDR combined image acquired in S1309. In the present embodiment, the difference image data is the same as 1 bpp image data represented by a highest-order bit of A image data.
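The relation between the 15 bpp HDR combined image, the clipped 14 bpp image, and the difference image can be pictured with the following sketch. Interpreting 'clipping at 14 bpp' as keeping the low-order 14 bits (so that the difference reduces to a 1 bpp image formed by the highest-order bit) is an assumption made for this sketch, as is the restore step corresponding to S1107.

```python
import numpy as np

FOURTEEN_BPP_MASK = 2 ** 14 - 1

def split_hdr_for_saving(hdr_image_15bpp):
    """Split a 15 bpp HDR combined image into a 14 bpp image and difference image
    data.  Keeping the low-order 14 bits is one possible reading of 'clipping at
    14 bpp'; under it the difference is a multiple of 2**14 and can be stored as
    1 bpp data."""
    clipped_14bpp = (hdr_image_15bpp & FOURTEEN_BPP_MASK).astype(np.uint16)
    difference_1bpp = (hdr_image_15bpp >> 14).astype(np.uint8)  # highest-order bit
    return clipped_14bpp, difference_1bpp

def restore_hdr(clipped_14bpp, difference_1bpp):
    """Add the difference image data back to the RAW image data (as in S1107)
    to recover the 15 bpp HDR combined image."""
    return clipped_14bpp.astype(np.uint16) + (difference_1bpp.astype(np.uint16) << 14)
```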
In S1312, the control unit 110 determines whether a 14 bpp RAW file format has been selected in the drop-down list 1202 of the file format selection dialog 1200. If the “14 bpp RAW” file format has been selected, the procedure proceeds to processing of S1313, and, if the “14 bpp RAW” file format has not been selected, that is, if a “15 bpp RAW” file format has been selected, the procedure proceeds to processing of S1314.
In S1313, the control unit 110 performs HDR RAW image compression processing. The HDR RAW image compression processing is processing of converting a 15 bpp HDR combined image into 14 bpp data. Details thereof will be described below. If the HDR RAW image compression processing ends, the procedure proceeds to the processing of S1314. Processing of S1314 to S1316 is the same as the processing of S802 to S804 of
A graph line 1402 of
In addition, a method of compressing a file to 14 bpp by gaining down the pixel values indicated by the graph line 1401 by one stage and using them as output pixel values, like the pixel values represented by a graph line 1403, may also be used. In this case, information indicating that the data is to be gained up by one stage is added to the editing parameters acquired in S1314 of
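The following sketch illustrates the two compression approaches mentioned above: a tone-curve (knee) mapping that leaves darker values untouched and compresses only the highlight range, and the alternative of gaining the whole image down by one stage while noting the required gain-up in the editing parameters. The knee position, the linear highlight compression, and the parameter name are assumptions; the actual curves of the graph lines 1401 to 1403 are not reproduced here.

```python
import numpy as np

def compress_15bpp_to_14bpp_knee(hdr_image_15bpp, knee=2 ** 13):
    """Convert a 15 bpp HDR combined image into 14 bpp data with a simple knee
    curve: values up to `knee` are kept as-is, and the remaining input range is
    compressed linearly into the rest of the 14 bpp range."""
    x = hdr_image_15bpp.astype(np.float32)
    input_range_above_knee = (2 ** 15 - 1) - knee
    output_range_above_knee = (2 ** 14 - 1) - knee
    scale = output_range_above_knee / input_range_above_knee
    out = np.where(x <= knee, x, knee + (x - knee) * scale)
    return np.round(out).astype(np.uint16)

def compress_15bpp_to_14bpp_gain_down(hdr_image_15bpp):
    """Alternative: gain the whole image down by one stage (divide by two) and
    record, as an editing parameter, that a one-stage gain-up is needed when the
    file is read back.  The parameter name is hypothetical."""
    compressed = (hdr_image_15bpp.astype(np.uint32) // 2).astype(np.uint16)
    editing_parameter = {"gain_up_stages": 1}
    return compressed, editing_parameter
```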
According to the present embodiment, a result of performing the HDR combination processing on the RAW image file constituted by the (A+B) image and the A image can be saved as a RAW image file of a format selected by a user.
Next, a third embodiment of the present invention will be described. In the present embodiment, an example in which the HDR combination processing is performed on a RAW image file constituted by an A image and a B image is shown. In the second embodiment, an example in which a file format selection dialog is necessarily displayed at the time of executing the file save processing is described. On the other hand, in the present embodiment, an example in which, if a file format is selected once, an operation of selecting a file format is not necessary whenever the file save processing is executed is shown.
Each of the A image data section 1508 and the B image data section 1509 is an area for storing RAW image data recorded by an imaging device capable of acquiring pupil-divided images. Specifically, the imaging device described in the first embodiment generates each piece of data of a pupil-divided image (A image) of a first pixel and a pupil-divided image (B image) of a second pixel. The imaging device records the A image subjected to shading correction in the A image data section 1508 of the RAW image file 1500, and records the B image subjected to shading correction in the B image data section 1509. At this time, with respect to an (A+B) image obtained by adding the A image to the B image, the pixel apertures of the A image and the B image each correspond to an exposure lower by one stage; thus, the (A+B) image is an appropriately exposed image, and the A image and the B image are underexposed images. The IFD sections 1504 and 1505 are IFD sections corresponding to the A image data section 1508 and the B image data section 1509, respectively.
Next, image processing and save processing according to the present embodiment will be described with reference to
In S1605, the control unit 110 searches for information on an IFD section of an input RAW image file, and determines whether both the A image data section 1508 and the B image data section 1509 exist in the input RAW image file. If it is determined that both the A image data section 1508 and the B image data section 1509 exist in the input RAW image file, the procedure proceeds to processing of S1606. If it is determined that the A image data section 1508 and the B image data section 1509 do not both exist in the input RAW image file, the procedure proceeds to processing of S1608. The processing of S1606 is the same as the processing of S1006 of
In S1607, the control unit 110 performs HDR combination processing. In the present embodiment, the HDR combination processing is processing of generating (A+B) image data by adding data acquired from the A image data section 1508 and the B image data section 1509 of the input image file, and displaying the (A+B) image data in the preview area 230. Each of A image data and B image data is a 14 bpp image underexposed by one stage, and thus the (A+B) image data obtained by adding these pieces of image data is an appropriately exposed image of 15 bpp, and has a pixel value corresponding to a brightness of a subject the same as in the graph line 602 of
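A minimal sketch of this addition, assuming 14 bpp A image and B image arrays:

```python
import numpy as np

def generate_a_plus_b(a_image_14bpp, b_image_14bpp):
    """Generate (A+B) image data by adding the two underexposed 14 bpp
    pupil-divided images.  Each addend is at most 2**14 - 1, so the sum fits
    in 15 bpp and corresponds to an appropriately exposed image."""
    return a_image_14bpp.astype(np.uint16) + b_image_14bpp.astype(np.uint16)
```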
In S1608, the control unit 110 determines whether a setting button 251 of a file format has been pressed by a user. If it is determined that the setting button 251 has been pressed, the procedure proceeds to processing of S1609, and, if it is determined that the setting button 251 has not been pressed, the procedure proceeds to processing of S1610. In S1609, the control unit 110 performs file format setting processing. Details thereof will be described below. If the file format setting processing ends, the procedure returns to the processing of S1600. Processing in S1610 and S1612 is the same as the processing in S1008 and S1010 of
Next, the file format setting processing in S1609 of
In S1703, the control unit 110 sets a file format selected in the drop-down list 1202 of the file format selection dialog 1200 as setting information of the application 200 and records the file format in the memory 130. The setting information of a file format is recorded in the ROM 140 at the time of ending the application 200 and is read at the time of starting the application 200 again, and thus the same setting can be used again. The setting information of a file format may be stored in the ROM 140 at the time of S1703, may be cancelled at the time of ending the application 200, and an initial value may be used at the time of starting the application 200 again. The initial value of the setting information of a file format is set to an “original” format. If the setting information of a file format is not set, S1610 of
In S1704, the control unit 110 determines whether the cancel button 1205 has been pressed by a user. If it is determined that the cancel button 1205 has been pressed, the procedure ends the file format setting processing, and, if the cancel button 1205 has not been pressed, the procedure returns to processing of S1701.
Next, the file save processing shown in S1611 of
In S1802, the control unit 110 selects a file format set in S1609 of
In S1804, the control unit 110 acquires A image data from the A image data section 1508 of an input RAW image file. In S1805, the control unit 110 acquires B image data from the B image data section 1509 of an input RAW image file. Processing of S1806 to S1813 of
According to the present embodiment, a result of performing the HDR combination processing on a RAW image file constituted by an A image and a B image can be saved as a RAW image file in a format selected by a user. In addition, the application holds the result of a file format selection once it has been made by a user; thereby, it is not necessary to perform an operation of selecting a file format every time the file save processing is executed, and a file is saved in the desired file format.
Although the present invention has been described in detail on the basis of preferred embodiments thereof, the present invention is not limited to these specific embodiments, and various modes in a range not departing from the gist of the present invention are included in the present invention. Some of the embodiments described above may be appropriately combined.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium such that they perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2016-253249, filed Dec. 27, 2016, which is hereby incorporated by reference herein in its entirety.