The present invention relates to an image conversion apparatus, a control method for an image conversion apparatus, and a storage medium.
The dynamic range of an imaging apparatus increases in response to an improvement in the sensitivity of a photo acceptance unit. On the other hand, the dynamic range of an image signal output from the imaging apparatus is limited by a transmission method. High dynamic range (HDR) transmission methods for the image signal include proprietary Log methods of camera manufacturers, and Recommendation ITU-R BT.2100 (BT.2100) developed by the International Telecommunication Union-Radiocommunication Sector (ITU-R). Further, BT.2100 is divided into a perceptual quantization (PQ) method and a hybrid log gamma (HLG) method. These HDR transmission methods have different dynamic ranges.
Incidentally, the dynamic range of the imaging apparatus changes due to exposure adjustment of a diaphragm or sensitivity, and hence there are cases where the dynamic range thereof exceeds the dynamic range of the image signal based on the above transmission method. In the case where an input dynamic range (the dynamic range of the imaging apparatus) exceeds an output dynamic range (the dynamic range of the image signal which is to be output), a gradation which is not less than the output dynamic range is saturated and becomes invisible. In this case, it is possible to identify an area in which the gradation is saturated (saturated area) by displaying the area in a specific color or by replacing it with a pattern image.
As a technique for identifying the saturated area, Japanese Patent Application Publication No. 2006-165716 discloses a technique for allowing a relationship between an exposure adjustment value and the saturated area to be determined by color-coding the saturated areas for a plurality of exposure adjustment values and displaying the saturated areas at the same time.
In addition, Japanese Patent Application Publication No. 2014-167609 discloses a technique for allowing the gradation of the saturated area to be determined by color-coding the saturated areas according to a gradation value and displaying the saturated areas.
By color-coding and displaying the saturated area, it becomes possible to determine the saturated area. However, in the case where correction is performed such that the input dynamic range falls within the output dynamic range, it is difficult to determine a corrected area and a correction strength in the saturated area.
The present invention provides a technique for allowing a corrected area and a correction strength of knee correction to be easily determined.
An image conversion apparatus according to an embodiment of the present invention includes an image input unit configured to acquire image data, a correction unit configured to correct the image data by compressing a gradation exceeding a first threshold value with a predetermined correction strength, and a combining unit configured to combine a warning image which differs according to the predetermined correction strength with an area having a gradation exceeding a second threshold value in the image data.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Hereinbelow, embodiments of the present invention will be described by using the drawings. In the case where the dynamic range (input dynamic range) of an imaging apparatus exceeds an output dynamic range, it is possible to cause the input dynamic range to fall within the output dynamic range by knee correction which compresses a signal which is not less than a predetermined gradation (knee point).
However, by performing the knee correction, the gradation resolution at or above the knee point is reduced in proportion to the correction strength. Accordingly, in a natural image, it is difficult to determine the corrected area corrected by the knee correction and the correction strength. In addition, there is a possibility of loss of gradation of a subject which is not intended by a user.
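The knee correction described above can be sketched as follows. This is only an illustrative sketch, not the embodiment's implementation: the function name and the use of normalized signal levels are assumptions.

```python
def knee_correct(level, knee_point, knee_slope):
    """Apply knee correction to one signal level.

    level and knee_point are normalized signal levels, and
    knee_slope (0.0-1.0) is the gain applied above the knee point;
    a smaller slope means a stronger compression.
    """
    if level <= knee_point:
        return level  # below the knee point the signal is unchanged
    # above the knee point the excess over the knee point is compressed
    return knee_point + (level - knee_point) * knee_slope
```

With a knee point of 0.8 and a slope of 0.5, for example, an input of 1.0 maps to 0.9, so an input dynamic range exceeding the output dynamic range can be made to fit within it.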
To cope with this, an image conversion apparatus according to the present embodiment allows the corrected area and the correction strength in a saturated area to be easily determined by changing a warning image which is displayed in the corrected area of the knee correction according to the correction strength.
Apparatus Configuration
The image input unit 101 is an acquisition unit configured to acquire image data from the outside. In the case where the image conversion apparatus 100 is the imaging apparatus, the image input unit 101 has an image sensor, and acquires image data obtained by performing color or gradation correction processing on an electrical signal output by the image sensor. In the case where the image conversion apparatus 100 is the display apparatus, the image input unit 101 has an input interface such as a serial digital interface (SDI), and acquires image data from the input interface. In the case where the image conversion apparatus 100 is the computer which operates the image editing software, the image input unit 101 reads an image file retained in a storage apparatus to acquire image data.
The image correction unit 102 performs correction processing on the image data acquired in the image input unit 101. Specifically, the image correction unit 102 executes conversion processing which uses a one-dimensional lookup table (1D-LUT) defined for each set of RGB values of the image data. Note that the correction processing by the image correction unit 102 is not limited to the conversion processing using the 1D-LUT, and it is possible to use other image correction processing such as, e.g., conversion using a three-dimensional lookup table (3D-LUT), gain adjustment, offset adjustment, or matrix conversion.
The image combining unit 103 combines a warning image generated in the warning image generation unit 108 described later with the image data corrected in the image correction unit 102 according to a determination result of the signal level determination unit 107 described later. Warning image combining processing of the image combining unit 103 will be described later with reference to a flowchart in
The image output unit 104 outputs the image data processed in the image combining unit 103 to the outside. Specifically, the image output unit 104 has an output interface such as, e.g., an SDI, and outputs the image data from the output interface. In addition, in the case where the image output unit 104 has a display panel such as a liquid crystal panel or a driver of the display panel, the image output unit 104 displays image data subjected to correction processing based on the color gamut or gradation characteristics of the display panel in the display panel.
The control unit 105 sets parameters corresponding to the knee correction in individual blocks (functional units) based on a user operation. Specifically, the control unit 105 sets the 1D-LUT in the image correction unit 102, sets a threshold value of a signal level in the signal level determination unit 107, and sets a conversion table in the warning image generation unit 108. The detail of parameter setting processing based on the knee correction in the control unit 105 will be described later with reference to a flowchart in
The signal level detection unit 106 converts RGB values of the image data and detects the signal level. The signal level detected in the signal level detection unit 106 is specifically the maximum value of the RGB values of the image data. Note that the signal level detected in the signal level detection unit 106 is not limited to the maximum value of the RGB values of the image data. For example, the signal level detection unit 106 may convert the RGB values of the image data to YCbCr values, and may detect the Y value as the signal level.
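The two detection options described above can be sketched as follows. The function names are hypothetical, and the luma weights shown are the BT.709 coefficients, which the embodiment does not specify.

```python
def detect_signal_level(r, g, b):
    """Default detection: the maximum of the RGB values."""
    return max(r, g, b)

def detect_signal_level_luma(r, g, b):
    """Alternative detection: a Y (luma) value computed from RGB
    (BT.709 coefficients assumed for illustration)."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b
```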
The signal level determination unit 107 determines the signal level by comparing the signal level detected in the signal level detection unit 106 with the threshold value set by the control unit 105. The determination result is output to the image combining unit 103.
The warning image generation unit 108 generates the warning image, and outputs the warning image to the image combining unit 103. Herein, the warning image will be described with reference to
The display color of the zebra pattern is determined by using a conversion table (hereinafter referred to as a zebra color conversion table) which converts the signal level to the RGB values. The zebra color conversion table is set in the warning image generation unit 108 by the control unit 105. Herein, the zebra color conversion table will be described with reference to
Parameter Setting Processing
Herein, the parameter setting processing based on the knee correction by the control unit 105 will be described with reference to
In S11, the control unit 105 determines whether a knee correction function is turned ON or OFF by the user operation. In the case where the knee correction function is ON, the processing proceeds to S12. In the case where the knee correction function is OFF, the processing proceeds to initialization processing in S16 and S17.
In S12, the control unit 105 determines whether or not setting values of the knee correction are changed by the user operation. In the case where the setting values of the knee correction are changed, the processing proceeds to S13 to S15, and each parameter is set or updated. In the case where the setting values of the knee correction are not changed, the parameter setting processing shown in
Herein, the setting values of the knee correction and generation of the 1D-LUT will be described with reference to
In an example in
In S13, the control unit 105 generates the 1D-LUT in which the knee correction is reflected based on the setting values of the knee correction changed in S12, and sets the generated 1D-LUT in the image correction unit 102.
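The 1D-LUT generation in S13 can be sketched as follows, assuming a 10-bit LUT with the knee point given as a code value; the function name and the table size are assumptions for illustration.

```python
def build_knee_lut(knee_point, knee_slope, size=1024):
    """Generate a 1D-LUT (one entry per 10-bit code value) in which
    the knee correction is reflected: identity up to the knee point,
    compression by the knee slope above it."""
    lut = []
    for code in range(size):
        if code <= knee_point:
            lut.append(code)  # pass through unchanged
        else:
            # compress the excess over the knee point by the slope
            lut.append(round(knee_point + (code - knee_point) * knee_slope))
    return lut
```

The generated LUT would then be set in the image correction unit 102 and applied per RGB channel.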
In S14, the control unit 105 generates the zebra color conversion table for converting the signal level to the display color of the zebra pattern based on the knee point and the knee slope changed in S12, and sets the generated zebra color conversion table in the warning image generation unit 108. The detail of zebra color conversion table generation processing will be described later with reference to a flowchart in
In S15, the control unit 105 sets the knee point changed in S12 as a signal level threshold value in the signal level determination unit 107. In the example in
In S16, the control unit 105 initializes the 1D-LUT which is to be set in the image correction unit 102. Herein, the initialization of the 1D-LUT will be described with reference to
In S17, the control unit 105 initializes the zebra color conversion table which is to be set in the warning image generation unit 108. Herein, the initialization of the zebra color conversion table will be described with reference to
Zebra Color Conversion Table Generation Processing
Herein, with reference to
In S21, the control unit 105 sets 0 as a signal level Lv to thereby perform initialization. In S22, the control unit 105 determines whether or not the signal level Lv is not more than the knee point. In the case where the signal level Lv is not more than the knee point, the processing proceeds to S23. In the case where the signal level Lv is more than the knee point, the processing proceeds to S24.
In S23, the control unit 105 sets the display color of the zebra pattern to [R=0, G=0, B=0]. That is, in the case where the signal level Lv is not more than the knee point and the knee correction is not performed, the control unit 105 sets black as the output value of the zebra color conversion table.
In S24, the control unit 105 determines whether or not the signal level Lv is saturated. In the case where the signal level Lv is saturated, the processing proceeds to S25. In the case where the signal level Lv is not saturated, the processing proceeds to S26.
In S25, the control unit 105 sets the display color of the zebra pattern to [R=1023, G=0, B=0]. That is, in the case of the saturated signal level, the control unit 105 sets red as the output value of the zebra color conversion table.
In S26, the control unit 105 sets the display color of the zebra pattern to [R=0, G=0, B=1023], [R=1023, G=0, B=1023], or [R=1023, G=0, B=0]. That is, in the case of a signal level at which the knee correction is performed, the control unit 105 sets blue, magenta, or red as the output value of the zebra color conversion table.
The control unit 105 changes the output value set in the zebra color conversion table according to the knee slope, i.e., a compression ratio of the knee correction. Specifically, in the case where the knee slope is Ns (0.00 to 1.00), the RGB values of the output value of the zebra color conversion table are determined by the following Formula 1:
R=MIN(1,(1−Ns)×2)×1023
G=0
B=MIN(1,Ns×2)×1023 (Formula 1)
MIN() in Formula 1 is a function that selects the minimum value. By calculating the RGB values using Formula 1, the display color of the zebra pattern changes from blue to magenta and from magenta to red as the compression ratio of the knee correction increases. Note that the RGB values of the output value of the zebra color conversion table are not limited to Formula 1, and may be calculated by any method according to the correction strength, such as the compression ratio of the knee correction.
In addition, as shown in the following Formula 2, the degree of change may be adjusted by performing gamma correction on the compression ratio of the knee correction. adjGamma in Formula 2 is an adjustment gamma.
R=MIN(1,(1−Ns^adjGamma)×2)×1023
G=0
B=MIN(1,Ns^adjGamma×2)×1023 (Formula 2)
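Formulas 1 and 2 can be sketched together as follows; the function name is hypothetical, and adj_gamma = 1.0 reduces Formula 2 to Formula 1.

```python
def zebra_color(ns, adj_gamma=1.0):
    """Display color of the zebra pattern from the knee slope Ns.

    ns is the knee slope (0.00-1.00); adj_gamma is the adjustment
    gamma of Formula 2. Returns 10-bit (R, G, B) values."""
    ns_g = ns ** adj_gamma  # gamma correction of the compression ratio
    r = round(min(1, (1 - ns_g) * 2) * 1023)
    b = round(min(1, ns_g * 2) * 1023)
    return (r, 0, b)  # blue -> magenta -> red as compression increases
```

With ns = 0.13 this reproduces the values of Formula 3, (1023, 0, 266), and with ns = 0.6 those of Formula 4, (818, 0, 1023).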
In S27, the control unit 105 increments the value of the signal level Lv by 1. In S28, the control unit 105 determines whether or not the signal level Lv exceeds the maximum value in the image data. In the case where the signal level Lv exceeds the maximum value, the zebra color conversion table generation processing is ended. In the case where the signal level Lv does not exceed the maximum value, the processing returns to S22. The control unit 105 repeats the processing in S22 to S27, and sets the RGB values of the display color of the zebra pattern for each signal level in the zebra color conversion table.
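Assuming 10-bit code values, the loop of S21 to S28 together with the color choices of S23, S25, and S26 can be sketched as follows; the function name and the explicit saturation-threshold parameter are assumptions for illustration.

```python
def build_zebra_table(knee_point, knee_slope, saturation_level, max_level=1023):
    """Build the zebra color conversion table mapping each signal
    level Lv to an (R, G, B) display color."""
    table = {}
    for lv in range(max_level + 1):       # S21 init, S27 increment, S28 check
        if lv <= knee_point:              # S22 -> S23: no correction, black
            table[lv] = (0, 0, 0)
        elif lv >= saturation_level:      # S24 -> S25: saturated, red
            table[lv] = (1023, 0, 0)
        else:                             # S24 -> S26: corrected, Formula 1
            r = round(min(1, (1 - knee_slope) * 2) * 1023)
            b = round(min(1, knee_slope * 2) * 1023)
            table[lv] = (r, 0, b)
    return table
```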
Warning Image Combining Processing
Herein, the warning image combining processing by the image combining unit 103 will be described with reference to
In S31, the image combining unit 103 determines whether or not the warning display is valid (a warning display function is ON) based on an instruction from the control unit 105. In the case where the warning display is invalid, the combining of the warning image is not performed, and the processing is ended. In the case where the warning display is valid, the processing proceeds to S32.
In S32, the image combining unit 103 determines whether or not the signal level detected in the signal level detection unit 106 is more than the threshold value. The determination result of the signal level can be acquired from the signal level determination unit 107. In the case where the signal level is not more than the threshold value, the combining of the warning image is not performed, and the processing is ended. In the case where the signal level is more than the threshold value, the processing proceeds to S33.
In S33, the image combining unit 103 combines the warning image generated in the warning image generation unit 108 with the image data output from the image correction unit 102. In the case where an alpha blending factor (α value) is specified for the warning image, the warning image is combined according to the α value. In the case where the α value is not specified, the image data is replaced with the warning image.
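The per-pixel combining of S33 can be sketched as follows; the function name is hypothetical, and pixels are assumed to be (R, G, B) tuples of 10-bit values.

```python
def combine_warning(pixel, warning_pixel, alpha=None):
    """Combine a warning pixel with the corrected image pixel.

    If an alpha blending factor (0.0-1.0) is given, the two are
    blended; otherwise the image pixel is replaced outright."""
    if alpha is None:
        return warning_pixel  # no alpha value: replace with warning image
    # alpha blend each channel of the two pixels
    return tuple(round(alpha * w + (1 - alpha) * p)
                 for p, w in zip(pixel, warning_pixel))
```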
Herein, with reference to
R=MIN(1,(1−0.13)×2)×1023=1×1023=1023
G=0
B=MIN(1,0.13×2)×1023=0.26×1023=266 (Formula 3)
In this case, the control unit 105 generates the zebra color conversion table shown in
R=MIN(1,(1−0.6)×2)×1023=0.8×1023=818
G=0
B=MIN(1,0.6×2)×1023=1×1023=1023 (Formula 4)
The signal level of 111 to 200% (563 to 1023 in the 10-bit gradation) corresponds to a saturated area, and hence the display color of the zebra pattern 3 is [R=1023, G=0, B=0] according to S25 in the zebra color conversion table generation processing in
In this case, the control unit 105 generates the zebra color conversion table shown in
Note that, in S15 in the parameter setting processing in
In the example in
Consequently, according to S23 in the zebra color conversion table generation processing in
Operation and Effect of Embodiment 1
As described above, the image conversion apparatus 100 of Embodiment 1 can change the warning image displayed in the corrected area of the knee correction to a different warning image according to the correction strength of the knee correction. With this, it becomes possible for the user to intuitively determine the corrected area and the correction strength of the knee correction. For example, in the case where the knee correction is automatically applied, the user can determine whether or not an unintended area is corrected with an unintended strength in advance. In addition, also in the case where the knee correction is manually adjusted, the user can save the effort of determining the correction strength of the knee correction every time the knee correction is adjusted. Further, the saturated area is changed according to the correction strength of the knee correction, and hence it becomes possible for the user to adjust the knee correction while checking a balance between the correction strength and the saturated area.
Note that the example of the warning display for the knee correction has been described in Embodiment 1, but the present invention is not limited thereto. For example, it is also possible to apply the present invention to the warning display for gradation correction or color gamut correction.
Hereinbelow, Embodiment 2 of the present invention will be described by using
Apparatus Configuration
The image input unit 201 of the image correction apparatus 200 is an acquisition unit configured to acquire image data from the outside. Specifically, the image input unit 201 has an image sensor, and acquires image data obtained by performing color or gradation correction processing on an electrical signal output by the image sensor.
Similarly to the image correction unit 102 of the image conversion apparatus 100 in
The image output unit 203 of the image correction apparatus 200 outputs the image data corrected in the image correction unit 202 to the outside. Specifically, the image output unit 203 has an output interface such as an SDI, and outputs the image data from the output interface.
The control unit 204 of the image correction apparatus 200 sets the image processing parameters including those related to the knee correction in the image correction unit 202 based on the user operation. Herein, the image processing parameter is the 1D-LUT but, as long as the image processing parameter is a parameter related to image processing, the image processing parameter is not limited to the 1D-LUT. The image processing parameter may also be a parameter of, e.g., gain adjustment, offset adjustment, or matrix conversion.
In addition, the control unit 204 outputs parameters related to the knee correction (hereinafter referred to as knee correction parameters) to the parameter output unit 205. The knee correction parameters include the knee point and the knee slope, but the knee correction parameters are not limited thereto, and the knee correction parameters may include the threshold value of the signal level and RGB values or the like set in the zebra color conversion table. The detail of parameter setting processing of the control unit 204 will be described later with reference to a flowchart in
The parameter output unit 205 of the image correction apparatus 200 acquires the knee correction parameters from the control unit 204, and outputs the knee correction parameters to the parameter input unit 304 of the image conversion apparatus 300.
The image input unit 301 of the image conversion apparatus 300 is an acquisition unit configured to acquire image data from the outside. Specifically, the image input unit 301 has an input interface such as an SDI, and acquires image data from the input interface.
Similarly to the image combining unit 103 of the image conversion apparatus 100 in
The image output unit 303 of the image conversion apparatus 300 outputs the image data processed in the image combining unit 302 to the outside. Specifically, the image output unit 303 has a display panel such as a liquid crystal panel or a driver of the display panel, and displays the image data subjected to correction processing based on the color gamut or gradation characteristics of the display panel in the display panel.
The parameter input unit 304 of the image conversion apparatus 300 acquires the knee correction parameters from the image correction apparatus 200. The control unit 305 of the image conversion apparatus 300 sets corresponding parameters in the signal level determination unit 307 and the warning image generation unit 308 based on the knee correction parameters acquired by the parameter input unit 304. The detail of parameter setting processing in the control unit 305 will be described with reference to a flowchart in
Similarly to the signal level detection unit 106 of the image conversion apparatus 100 in
Similarly to the signal level determination unit 107 of the image conversion apparatus 100 in
Similarly to the warning image generation unit 108 of the image conversion apparatus 100 in
Parameter Setting Processing of Image Correction Apparatus
The parameter setting processing by the control unit 204 of the image correction apparatus 200 will be described with reference to
In S41, similarly to S11 in the parameter setting processing in
In S42, similarly to S12 in
In S43, similarly to S13 in
In S44, the control unit 204 outputs the knee correction parameters used in the generation of the 1D-LUT in S43 to the parameter output unit 205. Herein, the knee correction parameters are parameters related to the knee correction such as, e.g., the knee point, the knee slope, and the parameter indicating that the knee correction function is ON/OFF.
In S45, similarly to S16 in
Parameter Setting Processing of Image Conversion Apparatus
The parameter setting processing by the control unit 305 of the image conversion apparatus 300 will be described with reference to
In S51, the control unit 305 determines whether the knee correction function of the image correction apparatus 200 which is an external apparatus is ON or OFF. Specifically, the control unit 305 acquires the knee correction parameters from the image correction apparatus 200 via the parameter input unit 304. The control unit 305 can determine whether the knee correction function is ON or OFF based on the parameter which is included in the knee correction parameters and indicates that the knee correction function is ON/OFF. In the case where the knee correction function is ON, the processing proceeds to S52. In the case where the knee correction function is OFF, the processing proceeds to initialization processing in S55.
In S52, the control unit 305 determines whether or not the knee correction parameters are changed. Specifically, the control unit 305 determines whether or not the knee point and the knee slope included in the knee correction parameters acquired from the image correction apparatus 200 are changed. For example, the control unit 305 can determine whether or not the knee correction parameters are changed by recording the knee correction parameters acquired from the image correction apparatus 200 in a storage unit such as an auxiliary storage apparatus in the image conversion apparatus 300 and comparing the recorded knee correction parameters with previously recorded knee correction parameters.
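The change detection described above can be sketched as follows; the names are assumptions, and the in-memory record stands in for the storage unit such as the auxiliary storage apparatus.

```python
def knee_parameters_changed(new_params, record):
    """Compare the newly acquired knee correction parameters with the
    previously recorded ones, and update the record when they differ."""
    changed = new_params != record.get("knee")
    if changed:
        record["knee"] = dict(new_params)  # record for the next comparison
    return changed
```

Only when this returns True would the processing proceed to S53 and S54.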
In the case where the knee correction parameters are changed, the processing proceeds to S53 to S54, and each parameter is set or updated. In the case where the knee correction parameters are not changed, the parameter setting processing of the image conversion apparatus shown in
In S53, similarly to S14 in
In S54, similarly to S15 in
In S55, similarly to S17 in
Operation and Effect of Embodiment 2
As described above, even in the case where the knee correction is used in the external image correction apparatus 200, the image conversion apparatus 300 of Embodiment 2 can change the warning image displayed in the corrected area of the knee correction in the image data to a different warning image according to the correction strength of the knee correction. For example, in the case where the image correction apparatus is a camera and the image conversion apparatus is a display, the user can determine the state of the knee correction used in the camera with the external display with high accuracy.
Although the present invention has been described in detail based on its preferred embodiments, the present invention is not limited to the specific embodiments, and various forms within the scope that does not depart from the gist of the invention are also included in the present invention. Further, each embodiment described above is only illustrative of an exemplary embodiment of the present invention, and the embodiments may be appropriately combined with each other.
Note that individual functional units in Embodiments 1 and 2 may or may not be individual pieces of hardware. Functions of two or more functional units may be implemented by common hardware. Each of a plurality of functions of one functional unit may be implemented by each of individual pieces of hardware. Two or more functions of one functional unit may be implemented by common hardware. In addition, each functional unit may or may not be implemented by hardware such as an ASIC, an FPGA, or a DSP. For example, an apparatus may have a processor and a memory in which a control program is stored. Further, the processor reads the control program from the memory and executes the control program, and functions of at least part of functional units of the apparatus may be thereby implemented.
According to the present invention, it becomes possible to easily determine the corrected area and the correction strength of the knee correction.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2020-040803, filed on Mar. 10, 2020, which is hereby incorporated by reference herein in its entirety.