The present disclosure relates to an image processing device, an image processing method, and a program. More specifically, the present disclosure relates to an image processing device, an image processing method, and a program for executing white balance gain calculation processing and white balance adjustment processing.
One problem with an image captured by an imaging device (camera) is that a color different from the original subject color may be output.
For example, a white subject may be output as a reddish color or a bluish color, which is not white. This is mainly caused by a color of irradiation light on the subject, that is, a light source color.
Specifically, for example, when an image of a scene is captured in the setting sun, the entire image becomes reddish. Furthermore, this similarly applies to an image captured in a room with orange illumination light, and the entire image becomes reddish.
Furthermore, an image captured under strong blue illumination light becomes an image that is entirely bluish.
When a subject is irradiated with light having a color as described above, an image is captured in which an original color of the subject is not reproduced due to the color of the irradiation light.
In order to solve this problem, many imaging devices have a function of executing white balance (WB) adjustment processing on a captured image.
The white balance adjustment processing is executed as pixel value correction processing for setting a pixel value of a captured image to an original color of a subject.
For example, the pixel value correction processing is performed so as to output a pixel value of white in an area where the original color of the subject is white, a pixel value of red in an area where the original color of the subject is red, and a pixel value of blue in an area where the original color of the subject is blue. By performing the white balance adjustment processing, an image in which the original color of the subject is reproduced can be generated and recorded.
Furthermore, recent cameras have an automatic white balance (AWB) function for automatically executing the white balance adjustment processing.
The automatic white balance (AWB) function is a function that automatically calculates a white balance gain, which is an output adjustment gain for each color component (RGB) in consideration of a light source color of an imaging scene or the like, and executes signal level adjustment for each color component by performing correction processing of each RGB pixel value by using the calculated gain.
As a technique for realizing the automatic white balance (AWB) function, a technique called gray world is widely known.
This technique executes white balance adjustment on the assumption that the average value of the pixel values of the entire captured image is substantially achromatic.
However, since this technique performs processing on the assumption that the average value of the entire captured image is substantially achromatic, the accuracy of the white balance adjustment processing decreases in a case where the average subject color of the captured image is not achromatic.
Moreover, since this technique uniformly adjusts the entire image by using an average color of the entire image, there is a problem that it cannot be used, for example, in a case where different image areas are irradiated with light of different colors.
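The gray world technique described above can be sketched as follows. This is an illustrative sketch of the background technique, not processing of the present disclosure; the pixel-list representation and function names are assumptions made here for illustration.

```python
def gray_world_gains(pixels):
    """Gray world white balance: assume the scene average is achromatic,
    and compute per-channel gains that equalize the RGB channel averages
    (G is kept as the reference channel)."""
    n = len(pixels)
    avg_r = sum(p[0] for p in pixels) / n
    avg_g = sum(p[1] for p in pixels) / n
    avg_b = sum(p[2] for p in pixels) / n
    # Gains that map the R and B averages onto the G average.
    return (avg_g / avg_r, 1.0, avg_g / avg_b)

def apply_gains(pixels, gains):
    """Multiply each pixel by the per-channel gains."""
    kr, kg, kb = gains
    return [(p[0] * kr, p[1] * kg, p[2] * kb) for p in pixels]
```

As the text notes, when the true average subject color is not achromatic, the gains computed this way shift the image away from the original subject colors.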
Furthermore, Patent Document 1 (Japanese Patent No. 4447520) discloses a light source color estimation technique using a reflection model of an object.
The technique disclosed in Patent Document 1 is a technique including: assuming that a luminance value of a pixel in a captured image is configured by two types of luminance values of “only diffused reflection light” or “diffused reflection light+specular reflection light”; calculating a specular reflection light component by performing subtraction of these two types of luminance values; estimating a color of the calculated specular reflection light component as a light source color; and executing white balance adjustment processing based on the estimated light source color.
However, in practice, the intensity of diffused reflection light varies depending on the texture of a subject or on the object normal defined by the uneven shape of the subject. Therefore, there is a high possibility that the value calculated by the subtraction described above changes greatly depending on the texture or the uneven shape at the pixel used to calculate the diffused reflection light component, and the estimated value of the light source color fluctuates greatly. Furthermore, the assumption that the luminance value of a pixel is “only diffused reflection light” is a very strict assumption, and it is also difficult to find a pixel for which this assumption holds.
Moreover, Patent Document 2 (Japanese Patent Application Laid-Open No. H06-319150) discloses a technique of performing white balance adjustment processing by using a chromatic color area in an image.
However, this technique is a technique based on the assumption that a light source color with which a subject is irradiated is along a black-body radiation curve, and accuracy decreases in a case where this assumption is not established.
Furthermore, Non-Patent Document 1 (Afifi, Mahmoud and Brown, Michael S. Deep White-Balance Editing, CVPR 2020) discloses a technique of executing white balance adjustment processing by using deep learning.
This technique is a technique of estimating an optimal white balance adjustment parameter for a captured image by using a learning model generated in advance using a context such as an object in an image, an environment, or a time zone, and performing white balance adjustment processing using the estimated parameter.
However, in this technique, it is necessary to generate the learning model in advance by executing learning processing using a large number of captured images, and the accuracy of the white balance adjustment processing depends on the generated learning model. Moreover, there is a problem that the arithmetic processing using the learning model for the optimal parameter calculation is very heavy and complicated, and usage is difficult unless a camera or an image processing device is provided with a processor having high arithmetic capability.
The present disclosure has been made in view of the problems described above, for example, and an object thereof is to provide an image processing device, an image processing method, and a program capable of calculating a high-precision white balance gain using a color polarized image.
Furthermore, the present disclosure provides an image processing device, an image processing method, and a program capable of executing white balance gain calculation processing using a pixel value of a chromatic area of a polarization color image captured by a polarization camera, and of calculating an optimal white balance gain in units of pixels or in units of image areas of the captured image.
A first aspect of the present disclosure is an image processing device including:
Moreover, a second aspect of the present disclosure is
Moreover, a third aspect of the present disclosure is
Note that the program of the present disclosure is, for example, a program that can be provided by a storage medium or a communication medium that provides a variety of program codes in a computer-readable format, to an information processing device or a computer system that can execute the program codes. By providing such a program in a computer-readable format, processing corresponding to the program is implemented on the information processing device or the computer system.
Other objects, features, and advantages of the present disclosure will become apparent from a more detailed description based on embodiments of the present disclosure described later and the accompanying drawings. Note that a system described herein is a logical set configuration of a plurality of devices, and is not limited to a system in which devices with respective configurations are in the same housing.
According to a configuration of one embodiment of the present disclosure, a configuration is realized in which white balance gain calculation processing and white balance adjustment are executed by using polarization information acquired from a color polarized image.
Specifically, for example, there are provided: the polarization information acquisition unit configured to acquire polarization information from a color polarized image; the white balance gain calculation unit configured to calculate a white balance gain by using the acquired polarization information; and the white balance adjustment unit configured to execute white balance adjustment processing to which the calculated white balance gain is applied. The polarization information acquisition unit calculates a color-corresponding polarization degree from the color polarized image, and the white balance gain calculation unit detects a pixel position where polarization degrees of two colors coincide with each other on the basis of a pixel position where subject reflectances of the two colors coincide with each other, and calculates a white balance gain by using color-corresponding polarization information of the detected pixel position.
With this configuration, a configuration is realized in which white balance gain calculation processing and white balance adjustment are executed by using polarization information acquired from a color polarized image.
Note that the effects described herein are merely examples and are not limited, and additional effects may also be provided.
Hereinafter, details of an image processing device, an image processing method, and a program of the present disclosure will be described with reference to the drawings. Note that the description will be made in accordance with the following items.
Before describing processing executed by an image processing device of the present disclosure, first, an outline of white balance adjustment processing will be described with reference to
Irradiation light 11 of the light source 10 is reflected by the subject 20, and the imaging unit (camera) 30 captures the reflection light as observation light 22, generates a captured image of the subject 20, and records the captured image in a memory.
Here, by using an intensity ratio (relative intensity) of R (red), G (green), and B (blue), which are the three primary colors of light, a light source color (color characteristic), which is a color of the irradiation light 11 of the light source 10, is indicated as
When the subject 20 is irradiated with the irradiation light 11 of the light source 10 having such a light source color (color characteristic), reflection light is output from the subject 20.
Note that the reflection light is different for each constituent part of the subject 20.
The reflection light output from each part of the subject 20 is input to the imaging unit (camera) 30 as the observation light 22, and an image is captured in which a pixel value based on the observation light 22 is set.
The pixel value of the captured image is set on the basis of the reflection light (=observation light 22) output from each part of the subject 20.
As a result, each pixel value of the image captured by the imaging unit (camera) 30 is a pixel value reflecting the color characteristic of the observation light 22.
Here, a reflectance of each color of R (red), G (green), and B (blue) at a certain point P of the subject 20 is defined as
Note that the subject reflectance (rR, rG, rB) is a value different for each constituent part of the subject 20.
Here, as an example, a subject reflectance of one point P of the subject 20 will be described as (rR, rG, rB).
Note that, in a case where a reflection characteristic of the subject 20 satisfies a dichroic reflection model (reflectance=diffuse reflectance+specular reflectance), each element of the subject reflectance (rR, rG, rB) is indicated as a sum of a specular reflectance rs common to each color of R (red), G (green), and B (blue) and diffuse reflectances rdR, rdG, and rdB different for each color of R (red), G (green), and B (blue).
That is, the following is satisfied.
A color characteristic (=an intensity ratio of the RGB colors) of R (red), G (green), and B (blue) of the observation light 22 that corresponds to one point P of the subject 20, that is, the point P having the subject reflectance (rR, rG, rB), and that is incident on the imaging unit (camera) from the point P, is defined as
The observation light (iR, iG, iB) corresponding to the point P is a multiplication value of the light source color (LR, LG, LB) of the light source 10 and the subject reflectance (rR, rG, rB) of the point P of the subject 20 for each of R (red), G (green), and B (blue).
That is, the observation light (iR, iG, iB) corresponding to the point P of the subject 20 can be calculated according to the following (Expression 1).
As the pixel value corresponding to the point P of the subject 20 in an image captured by the imaging unit (camera) 30, a pixel value according to the color characteristic calculated by (Expression 1) described above is set.
This similarly applies to constituent parts other than the point P of the subject 20, and the observation light (iR, iG, iB) corresponding to each constituent point of the subject 20 can be calculated according to a multiplication value of the light source color (LR, LG, LB) of the light source 10 and the subject reflectance (rR, rG, rB) of each constituent point of the subject 20, that is, according to (Expression 1) described above.
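The relationship of (Expression 1) reduces to a per-channel product; a minimal sketch follows (the function name is an illustrative assumption):

```python
def observation_light(light_source, reflectance):
    """(Expression 1): the observation light (iR, iG, iB) is the
    per-channel product of the light source color (LR, LG, LB) and the
    subject reflectance (rR, rG, rB) at a point P of the subject."""
    return tuple(L * r for L, r in zip(light_source, reflectance))
```

For example, a gray point (equal reflectance in all channels) under a reddish light source yields a reddish observed pixel, which is exactly the cast that white balance adjustment must remove.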
As understood from (Expression 1) described above, a pixel value of the observation light 22, that is, an image captured by the imaging unit (camera) 30 is a pixel value that changes in proportion to the light source color (LR, LG, LB).
Therefore, for example, if the light source color (LR, LG, LB) of the light source 10 is a reddish color, the pixel value of the image captured by the imaging unit (camera) 30 is set to a reddish pixel value.
Furthermore, if the light source color (LR, LG, LB) of the light source 10 is a bluish color, the pixel value of the image captured by the imaging unit (camera) 30 is also a bluish pixel value.
That is, the color of the light source color (LR, LG, LB) is reflected in the color of the pixel value of the image captured by the imaging unit (camera) 30, and an image having a color different from the original color of the subject 20 is captured.
The white balance adjustment processing is executed as processing of correcting the pixel value of the image captured by the imaging unit (camera) 30 to the original color of the subject 20 independent of the light source color (LR, LG, LB).
An example of the white balance adjustment processing will be described with reference to
Hereinafter, processing of each step is sequentially described.
Processing step S11 is captured image acquisition processing performed by the imaging unit (camera) 30.
As described above with reference to
As described above, the pixel value of the image captured by the imaging unit (camera) 30 is a value depending on the color characteristic of the light source color (LR, LG, LB) of the light source 10, and may be a pixel value with which the original color of the subject 20 is not reproduced.
Step S12 is the white balance gain calculation processing.
A white balance gain is a pixel value adjustment parameter for correcting the pixel value of the captured image acquired in step S11 to the original color of the subject 20.
There are various techniques for a processing mode of this white balance gain calculation processing.
For example, in the gray world technique, assuming that the average value of the pixel values of the entire captured image is substantially achromatic, a pixel value adjustment parameter for equalizing the average values of the individual RGB colors of the captured image is calculated as a white balance gain.
In many cases, a color (light source color) of the light source 10 is estimated, and the white balance gain calculation processing is performed using the estimated light source color.
The white balance gain calculated by this technique is defined as
Each element of the white balance gain (kR, 1, kB) corresponds to a multiplication parameter for the pixel value (R, G, B) of each color of the image captured by the imaging unit (camera) 30.
The white balance gain (multiplication parameter) corresponding to G (green) is 1, which means that the pixel value of G (green) of the captured image is used as a reference and the G pixel value is not changed.
The white balance gain corresponding to R (red) is kR, and the R pixel value is corrected by multiplying the pixel value of R (red) of the captured image by the white balance gain kR.
The white balance gain corresponding to B (blue) is kB, and the B pixel value is corrected by multiplying the pixel value of B (blue) of the captured image by the white balance gain kB.
Through these processes, the white balance adjustment processing of the captured image is executed.
The white balance gain (kR, 1, kB) calculated on the basis of the light source color (LR, LG, LB) of the light source 10 is expressed as the following (Expression 3).
In step S13, which is the final step, the white balance adjustment processing is executed.
That is, the pixel value (R, G, B) of each color of the image captured by the imaging unit (camera) 30 is corrected using the white balance gain (kR, 1, kB) calculated in step S12 described above.
The pixel value (R, G, B) of the image captured by the imaging unit (camera) 30 is corrected as follows.
A captured image including RGB pixel values of the captured image before white balance adjustment is defined as a captured image (iR, iG, iB), and an image including RGB pixel values after white balance adjustment is defined as a white balance adjusted image (wbiR, wbiG, wbiB).
The white balance adjusted image (wbiR, wbiG, wbiB) is generated according to the following (Expression 4).
That is, a pixel value of each pixel constituting the white balance adjusted image (wbiR, wbiG, wbiB) is calculated by multiplying the pixel value (iR, iG, iB) of each pixel constituting the captured image (iR, iG, iB) by the white balance gain (kR, 1, kB) (=((LG/LR), 1, (LG/LB))).
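The gain calculation of (Expression 3) and the adjustment of (Expression 4) amount to simple per-channel arithmetic; a minimal sketch follows (the tuple representation and function names are illustrative assumptions):

```python
def wb_gain_from_light_source(LR, LG, LB):
    """(Expression 3): white balance gain (kR, 1, kB)
    = (LG/LR, 1, LG/LB), with G used as the reference channel."""
    return (LG / LR, 1.0, LG / LB)

def wb_adjust(pixel, gain):
    """(Expression 4): per-channel multiplication of a captured pixel
    (iR, iG, iB) by the white balance gain (kR, 1, kB)."""
    return tuple(i * k for i, k in zip(pixel, gain))
```

For a white subject point, the observed pixel is proportional to (LR, LG, LB) itself, so this adjustment maps it back to equal RGB values, as intended.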
The white balance adjusted image (wbiR, wbiG, wbiB) calculated according to (Expression 4) described above is an image including pixel values reflecting the original color of the subject, independent of the color characteristic of the light source 10.
However, the processing described above can be executed only in a case where color (light source color) estimation processing of the light source 10 can be performed with high accuracy.
For example, the processing described above can be performed in a case where the camera includes a sensor for color analysis of ambient light and has a configuration for performing color analysis of the ambient light on the basis of a detection value of the sensor, but the light source color cannot be estimated with high accuracy in a case where the camera does not include such a sensor.
Furthermore, in the white balance gain calculation processing using the gray world technique described above, on the assumption that the average value of the pixel values of the entire captured image is substantially achromatic, a pixel value adjustment parameter for equalizing the average values of the individual RGB colors of the captured image is calculated as a white balance gain.
In this processing, in a case where the average color of the subject is not achromatic, this assumption is not established, and the accuracy of the white balance adjustment processing decreases.
The processing of the present disclosure solves such a problem, and is capable of highly accurate white balance gain calculation and white balance adjustment processing, by applying polarization information obtained from a polarization color image to perform highly accurate light source color estimation.
Next, an outline of processing executed by an image processing device of the present disclosure will be described.
The image processing device of the present disclosure performs the white balance gain calculation processing by using a polarized image, and executes white balance adjustment processing of a captured image by using the calculated white balance gain.
An outline of processing executed by the image processing device of the present disclosure will be described with reference to
The irradiation light 11 of the light source 10 is reflected by the subject 20, and the imaging unit (color polarized image capturing camera) 50 captures an image of only a specific polarization component from the observation light 22 including the reflection light, and inputs the captured color polarized image to the image processing unit 100.
The image processing unit 100 calculates a white balance gain by using the color polarized image captured by the imaging unit (color polarized image capturing camera) 50, and executes the white balance adjustment processing by using the calculated white balance gain.
By using an intensity ratio (relative intensity) of R (red), G (green), and B (blue), which are the three primary colors of light, a light source color (color characteristic) of the irradiation light 11 of the light source 10 is indicated as
Moreover, a light source Stokes vector of each of the RGB colors, indicating a polarization state of each of the RGB colors of the irradiation light 11 of the light source 10, can be expressed by the following (Expression 11).
The Stokes vector is a vector indicating a polarization state of light, and includes four types of parameters (Stokes parameters) s0 to s3.
The Stokes parameter s0 is an unpolarized light intensity signal,
Note that, in the processing of the present disclosure, since the color polarized image is acquired using a linear polarizer as a polarizer, the white balance gain is calculated using three types of the Stokes parameters s0 to s2 among the four types of Stokes parameters s0 to s3.
The imaging unit (color polarized image capturing camera) 50 captures three different types of polarized images in order to acquire the three types of Stokes parameters.
Hereinafter, a processing example in which the three Stokes parameters s0 to s2 are used will be described.
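The text does not specify at this point which three polarized images are captured. One common arrangement, assumed here purely for illustration, captures intensities behind a linear polarizer at 0°, 45°, and 90°, for which I(θ) = (s0 + s1·cos 2θ + s2·sin 2θ)/2. Under that assumption, the three linear Stokes parameters can be recovered as follows:

```python
def linear_stokes(i0, i45, i90):
    """Recover (s0, s1, s2) from intensities observed behind a linear
    polarizer at 0, 45, and 90 degrees, using the relation
    I(theta) = (s0 + s1*cos(2*theta) + s2*sin(2*theta)) / 2."""
    s0 = i0 + i90          # I(0) + I(90) = s0
    s1 = i0 - i90          # I(0) - I(90) = s1
    s2 = 2.0 * i45 - i0 - i90  # 2*I(45) - s0 = s2
    return (s0, s1, s2)
```

This sketch is per pixel and per color channel; applying it to the R, G, and B planes of the three captured images yields the color-corresponding Stokes parameters used below.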
Each of RGB colors of the irradiation light 11 of the light source 10 has a polarization state defined by the light source Stokes vector shown in (Expression 11) described above, and reflection light is output from the subject 20 when the subject 20 is irradiated with the irradiation light 11.
Note that the reflection light is different for each constituent part of the subject 20.
The reflection light output from each part of the subject 20 is input to the imaging unit (color polarized image capturing camera) 50 as the observation light 22, and an image is captured in which a pixel value based on the observation light 22 is set.
The pixel value of the captured image is set on the basis of the reflection light (=observation light 22) output from each part of the subject 20.
As a result, each pixel value of the image captured by the imaging unit (color polarized image capturing camera) 50 is a pixel value reflecting the color characteristic of the observation light 22.
Here, a reflectance of each color of R (red), G (green), and B (blue) at a certain point P of the subject 20 is defined as
Note that the subject reflectance (rR, rG, rB) is a value different for each constituent part of the subject 20. Here, as an example, a subject reflectance of one point P of the subject 20 will be described as (rR, rG, rB).
Note that, as described above, in a case where the reflection characteristic of the subject 20 satisfies a dichroic reflection model (reflectance=diffuse reflectance+specular reflectance), each element of the subject reflectance (rR, rG, rB) is indicated as a sum of the specular reflectance rs common to each color of R (red), G (green), and B (blue) and the diffuse reflectances rdR, rdG, and rdB different for each color of R (red), G (green), and B (blue).
That is, the following is satisfied.
Moreover, a polarization state of the irradiation light 11 changes when the irradiation light is emitted on the subject 20 and becomes reflection light. That is, a polarization state of the observation light (reflection light) 22 illustrated in the figure is different from the polarization state of the irradiation light 11.
The change in the polarization state varies depending on the reflection characteristic of the subject 20.
For example, when a Stokes vector indicating the polarization state of the irradiation light 11 is S=(s0, s1, s2) and a Stokes vector indicating the polarization state of the observation light (reflection light) 22 is S′=(s′0, s′1, s′2), a relational expression between the two Stokes vectors S and S′ can be described by using one transformation matrix M, that is,
S′=MS.
M is called a Mueller matrix.
The Mueller matrix M is a transformation matrix reflecting the reflection characteristic of the subject 20.
The Mueller matrix M can be simply expressed by a linear sum of a matrix Ms indicating specular reflection and a matrix Md indicating diffuse reflection of the subject 20.
The reflection characteristic of the subject 20 varies for each of the RGB colors. For example, the Mueller matrix corresponding to each of the RGB colors of one point P of the subject 20 is expressed by a linear sum of the matrix Ms indicating specular reflection and the matrix Md indicating diffuse reflection of the point P of the subject 20, that is, by the following expressions (Expression 12a) to (Expression 12c).
Mueller matrix of R (red light)=rsMs+rdRMdR (Expression 12a)
Mueller matrix of G (green light)=rsMs+rdGMdG (Expression 12b)
Mueller matrix of B (blue light)=rsMs+rdBMdB (Expression 12c)
Note that, in (Expression 12a) to (Expression 12c) described above,
Furthermore, the Stokes vector S′ of each color indicating polarization states of R (red), G (green), and B (blue) of the observation light (reflection light) 22 corresponding to one point P of the subject 20 can be expressed as
(Expression 13a) to (Expression 13c) shown below, in accordance with the following relational expression.
S′=MS.
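The per-color transform S′=MS, with the Mueller matrix formed as a linear sum of a specular term and a diffuse term as in (Expression 12a) to (Expression 12c), can be sketched numerically as follows. The function names and any matrix values used with this sketch are illustrative assumptions, not values from the present disclosure.

```python
def mat_vec(m, v):
    """Multiply a 3x3 matrix by a 3-element Stokes vector (s0, s1, s2)."""
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

def mat_add_scaled(a, ka, b, kb):
    """Linear sum ka*A + kb*B of two 3x3 matrices."""
    return [[ka * a[i][j] + kb * b[i][j] for j in range(3)] for i in range(3)]

def reflect_stokes(s_in, ms, md, rs, rd):
    """S' = (rs*Ms + rd*Md) S: transform an incident Stokes vector by a
    per-color Mueller matrix built as a linear sum of a specular term
    and a diffuse term, as in (Expression 12a) to (Expression 12c)."""
    m = mat_add_scaled(ms, rs, md, rd)
    return mat_vec(m, s_in)
```

Evaluating this once per color channel, with the channel's own rd and Md, yields the three observed Stokes vectors S′R, S′G, and S′B described below.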
The imaging unit (color polarized image capturing camera) 50 captures multiple color polarized images for acquiring Stokes parameters constituting Stokes vectors of the individual colors indicating polarization states of R (red), G (green), and B (blue) of the observation light (reflection light) 22, that is,
S′R=(s′0R, s′1R, s′2R)T,
S′G=(s′0G, s′1G, s′2G)T, and
S′B=(s′0B, s′1B, s′2B)T.
Note that the imaging unit (color polarized image capturing camera) 50 has a configuration for capturing these multiple color polarized images.
For example, a configuration using multiple cameras including different polarizing plates can be used. Alternatively, a configuration may be adopted in which a single camera having an imaging element including a polarizing element in unit of pixel is used.
Note that a specific configuration example of the imaging unit 50 that captures multiple different polarized images will be described in detail later.
The image processing device 100 receives, as input, the multiple color polarized images captured by the imaging unit (color polarized image capturing camera) 50, acquires the Stokes parameters constituting the above-described Stokes vectors S′R, S′G, and S′B, calculates white balance gains corresponding to each of the RGB colors, and executes the white balance adjustment processing to which the calculated white balance gains are applied.
An example of processing executed by the image processing device 100 will be described with reference to
Hereinafter, processing of each step is sequentially described.
Processing step S101 is input processing for multiple color polarized images captured by the imaging unit (color polarized image capturing camera) 50.
The imaging unit (color polarized image capturing camera) 50 captures three types of images in order to acquire the three types of Stokes parameters s′0 to s′2 of the observation light 22.
The three types of images are the following three types of images shown in step S101 of
Step S102 is the white balance gain calculation processing.
In step S102, from the three types of images acquired in step S101, the following Stokes parameters are acquired, that is,
Details of the white balance gain calculation processing using these three types of Stokes parameters will be described in detail in the next item [3. Details of white balance gain calculation processing using polarization information].
The white balance gain calculated in step S102 is defined as
Each element of the white balance gain (kR, 1, kB) corresponds to a multiplication parameter for the pixel value (R, G, B) of each color of the image captured by the imaging unit (camera) 30.
The pixel value of G (green) of the captured image is used as a reference and is not changed; the pixel value of R (red) of the captured image is multiplied by the gain kR, and the pixel value of B (blue) of the captured image is multiplied by the gain kB, whereby corrected pixel values after white balance adjustment can be calculated.
The white balance gain (kR, 1, kB) can be expressed by the following (Expression 14) by using the light source color (LR, LG, LB) of the light source 10.
In step S103, which is the final step, the white balance adjustment processing is executed.
In step S103, the pixel value (R, G, B) of each color of the image captured by the imaging unit (camera) 30 is corrected using the white balance gain (kR, 1, kB) calculated in step S102 described above.
Note that the Stokes parameters (s′0R, s′0G, s′0B) acquired from the multiple color polarized images obtained in step S101 are light intensity signals (luminance signals) corresponding to the individual RGB colors, and therefore correspond to an RGB color image. That is, the RGB color image is a color image including the light intensity signals (luminance signals) (s′0R, s′0G, s′0B) corresponding to the individual RGB colors.
The Stokes parameter (s′0R, s′0G, s′0B) corresponds to an RGB pixel value constituting an RGB color image.
That is, the following is satisfied.
R pixel value=s′0R
G pixel value=s′0G
B pixel value=s′0B
An RGB image as a white balance adjusted image can be generated by multiplying an image (s′0R, s′0G, s′0B) including luminance signals corresponding to these RGB pixel values by the white balance gain (kR, 1, kB) calculated in step S102 described above.
Note that the image processing device 100 of the present disclosure can also execute white balance adjustment on a polarized image acquired by the imaging unit 50 to generate a white balance-adjusted polarized image.
A description will now be given of the change in pixel values before and after adjustment in a case where white balance adjustment is performed on the image (s′0R, s′0G, s′0B) including luminance signals corresponding to the RGB pixel values.
A captured image including RGB pixel values before white balance adjustment is defined as a captured image (iR, iG, iB) (=(s′0R, s′0G, s′0B)), and an image including RGB pixel values after white balance adjustment is defined as a white balance adjusted image (wbiR, wbiG, wbiB).
The white balance adjusted image (wbiR, wbiG, wbiB) is generated according to the following (Expression 15).
That is, a pixel value of each pixel constituting the white balance adjusted image (wbiR, wbiG, wbiB) is calculated by multiplying the pixel value (s′0R, s′0G, s′0B) by the white balance gain (kR, 1, kB) (=((LG/LR), 1, (LG/LB))).
The white balance adjusted image (wbiR, wbiG, wbiB) calculated according to (Expression 15) described above is an image including pixel values reflecting the original color of the subject, independent of the color characteristic of the light source 10.
Next, white balance gain calculation processing using polarization information will be described in detail.
The image processing device 100 of the present disclosure calculates a white balance gain by using polarization information.
The polarization information to be applied to the calculation processing of the white balance gain is information that can be acquired from a color polarized image that is an image captured by the imaging unit (color polarized image capturing camera) 50.
Specifically, the polarization information is information such as a Stokes parameter and a degree of linear polarization (DoLP) that can be calculated with the Stokes parameter.
Note that the linear polarization degree (DoLP) is a ratio (%) of linearly polarized light included in the observation light (reflection light) 22, and details will be described later.
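Given the Stokes parameters s0, s1, and s2 of one color channel, the degree of linear polarization is sqrt(s1^2 + s2^2)/s0. A minimal sketch follows, expressed as a fraction from 0 to 1 rather than a percentage; the zero-guard for s0 is an added safeguard, not part of the disclosure.

```python
import numpy as np

def degree_of_linear_polarization(s0, s1, s2):
    """DoLP = sqrt(s1^2 + s2^2) / s0: the fraction of linearly
    polarized light in the observation light (0 = unpolarized,
    1 = fully linearly polarized)."""
    s0 = np.asarray(s0, dtype=float)
    safe_s0 = np.where(s0 == 0, 1.0, s0)  # guard against division by zero
    return np.sqrt(np.square(s1) + np.square(s2)) / safe_s0
```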
Hereinafter, a description will be made on details of the processing of step S102 described with reference to
As described above, in step S102, the following Stokes parameters are acquired from the three color polarized images acquired in step S101. That is,
Moreover, by using the acquired three types of Stokes parameters, calculation is performed to obtain a white balance gain, which is a pixel value adjustment parameter for correcting the pixel value of the captured image to the original color of the subject 20.
Hereinafter, two specific processing examples of the calculation processing of the white balance gain using the Stokes parameter will be sequentially described.
First, with reference to
The image processing device 100 of the present disclosure calculates a white balance gain by using polarization information acquired from a polarized image input from the imaging unit 50. The image processing device 100 of the present disclosure calculates an optimum white balance gain by using the polarization information of a chromatic area even in a case where an achromatic area cannot be detected from the captured image.
White balance gain calculation processing example 1 described below is a processing example in which the image processing device 100 executes the following processing steps A and B to calculate the white balance gain, as illustrated as processing of the image processing device 100 in
Hereinafter, with reference to
Similarly to the description with reference to
The irradiation light 11 of the light source 10 is reflected by the subject 20, and the imaging unit (color polarized image capturing camera) 50 captures an image of only a specific polarization component from the observation light 22 including the reflection light, and inputs the captured color polarized image to the image processing device 100.
The image processing device 100 calculates a white balance gain by using the color polarized image captured by the imaging unit (color polarized image capturing camera) 50, and executes the white balance adjustment processing by using the calculated white balance gain.
A light source Stokes vector of each of RGB colors indicating a polarization state of each of RGB colors of the irradiation light 11 of the light source 10 is expressed by the following expression (Expression 11) as described above.
Note that, as described above, the Stokes vector is a vector indicating a polarization state of light, and includes four types of parameters (Stokes parameters) s0 to s3. However, the image processing device 100 of the present disclosure calculates a white balance gain by using the following three types of Stokes parameters s0 to s2.
Stokes parameter s0=unpolarized light intensity signal
Stokes parameter s1=difference signal of horizontal/vertical linear polarization component
Stokes parameter s2=difference signal of 45 degree linearly polarized component
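These three parameters can be estimated per color channel from intensity images taken through a linear polarizer at different angles. The sketch below uses the standard relations for polarizer angles of 0, 45, 90 (and optionally 135) degrees; the disclosure does not specify which polarization directions the imaging unit 50 captures, so this angle set is an assumption.

```python
import numpy as np

def stokes_from_polarizer_images(i0, i45, i90, i135=None):
    """Estimate the linear Stokes parameters (s0, s1, s2) from
    intensity images captured through a linear polarizer at
    0, 45, 90 (and optionally 135) degrees, using the standard
    relations for linear polarization."""
    i0, i45, i90 = (np.asarray(a, dtype=float) for a in (i0, i45, i90))
    s0 = i0 + i90                  # unpolarized light intensity
    s1 = i0 - i90                  # horizontal/vertical difference
    if i135 is None:
        s2 = 2.0 * i45 - s0        # 45-degree difference (3-angle form)
    else:
        s2 = i45 - np.asarray(i135, dtype=float)
    return s0, s1, s2
```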
When the subject 20 is irradiated with the irradiation light 11 of the light source 10 having the polarization state indicated by the light source Stokes vectors of the individual colors of RGB shown in (Expression 11) described above, the observation light 22 which is reflection light from the subject 20 is input to the imaging unit (color polarized image capturing camera) 50.
As described above with reference to
When a Stokes vector of the irradiation light 11 is S = (s0, s1, s2) and a Stokes vector of the observation light 22 is S′, the relationship S′ = MS is satisfied, where M is the Mueller matrix representing the reflection characteristic of the subject 20.
The Mueller matrix M can be simply expressed as a linear sum of the matrix Ms indicating specular reflection and the matrix Md indicating diffuse reflection of the subject 20, and the Mueller matrix corresponding to each of RGB colors is expressed by the following expressions (Expression 12a) to (Expression 12c).
Note that,
Furthermore, the Stokes vector S′ of each color indicating polarization states of R (red), G (green), and B (blue) of the observation light (reflection light) 22 corresponding to one point P of the subject 20 can be expressed as
S′=MS.
The imaging unit (color polarized image capturing camera) 50 captures multiple color polarized images for acquiring Stokes parameters constituting Stokes vectors of the individual colors indicating polarization states of R (red), G (green), and B (blue) of the observation light (reflection light) 22, that is,
S′R = (s′0R, s′1R, s′2R)T, S′G = (s′0G, s′1G, s′2G)T, and S′B = (s′0B, s′1B, s′2B)T.
That is, the color polarized images are the following three types of images illustrated in
Note that the Stokes parameter s0 is an unpolarized light intensity signal, and the image (s′0R, s′0G, s′0B) is an image including a light intensity signal (luminance signal) corresponding to each of RGB colors. Therefore, if the white balance adjustment processing is executed using the image (s′0R, s′0G, s′0B) including the Stokes parameter s0 as the white balance adjustment target image, an RGB image accurately reflecting the color of the subject 20 can be acquired.
The captured image (s′0R, s′0G, s′0B), that is, the captured image including the RGB pixel values before the white balance adjustment, is an image including the same RGB pixel values as the observation light (iR, iG, iB) corresponding to each constituent point of the subject 20.
However, the observation light (iR, iG, iB) corresponding to each constituent point of the subject 20 is a multiplication value of the light source color (LR, LG, LB) of the light source 10 and the subject reflectance (rR, rG, rB) of each constituent point of the subject 20, that is, light calculated according to the following (Expression 1) described above.
That is, the captured image (s′0R, s′0G, s′0B) includes the Stokes parameter s0, and each value thereof is calculated according to the following (Expression 16).
As understood from (Expression 16) described above, the pixel value of the captured image (s′0R, s′0G, s′0B) is a pixel value that changes in proportion to the light source color (LR, LG, LB) and the subject reflectance (rR, rG, rB).
That is, the captured image (s′0R, s′0G, s′0B) is an image including pixel values of colors different from the original color of the subject 20 due to a color change according to the light source color (LR, LG, LB) of the light source 10 and the subject reflectance (rR, rG, rB).
Note that, as described above, in a case where the reflection characteristic of the subject 20 satisfies a dichroic reflection model (reflectance = diffuse reflectance + specular reflectance), each element of the subject reflectance (rR, rG, rB) is indicated as a sum of the specular reflectance rs common to each color of R (red), G (green), and B (blue) and the diffuse reflectances rdR, rdG, and rdB different for each color of R (red), G (green), and B (blue).
That is, the following is satisfied.
Therefore, each value of the captured image (s′0R, s′0G, s′0B) can be expressed as the following (Expression 17).
As can be understood from (Expression 17) described above, an influence of a color change on the RGB pixel values (s′0R, s′0G, s′0B) of the captured image is caused by two factors: the light source color (LR, LG, LB) and the subject reflectance (rR, rG, rB).
In order to calculate a white balance gain for eliminating a color change caused by the light source color (LR, LG, LB), the image processing device 100 of the present disclosure detects a pixel having no color change due to the reflectance (rR, rG, rB)=((rs+rdR), (rs+rdG), (rs+rdB)) of the subject 20, from the image captured by the imaging unit 50.
That is, a pixel estimated to have undergone a color change based only on the light source color (LR, LG, LB) is detected from the image captured by the imaging unit 50.
The processing executed as this processing is the processing (step A) of the image processing device 100 illustrated in
(Step A) A pixel in which linear polarization degrees (DoLP) of two different colors (R and G, B and G) coincide with each other is detected from an input image.
The pixel detected in the processing of step A is a pixel in which the reflectances of the two colors (R and G, or B and G) of the subject coincide with each other, so that a color change of the captured image is caused only by the influence of the light source color (LR, LG, LB).
Note that, as described above, the white balance gains (kR, 1, kB) include two types of white balance gains: the gain kR to be applied to the R (red) pixel value and the gain kB to be applied to the B (blue) pixel value.
As processing for calculating the white balance gain kR corresponding to R (red), the image processing device 100 detects a pixel in which linear polarization degrees (DoLPR and DoLPG) of two different colors (R and G) coincide with each other, from the image captured by the imaging unit 50.
The pixel (DoLPR=DoLPG) detected by this processing is a pixel in which the reflectances of the two colors (R and G) of the subject coincide with each other, and a color change of the captured image is caused only by the influence of the light source color (LR, LG, LB).
Moreover, as processing for calculating the white balance gain kB corresponding to B (blue), the image processing device 100 detects a pixel in which linear polarization degrees (DoLPB and DoLPG) of two different colors (B and G) coincide with each other, from the image captured by the imaging unit 50.
The pixel (DoLPB=DoLPG) detected by this processing is a pixel in which reflectances of the two colors (B and G) of the subject coincide with each other, and a color change of the captured image is caused only by the influence of the light source color (LR, LG, LB).
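The detection of step A can be sketched as a per-pixel comparison of two DoLP maps. On real data an exact equality rarely holds, so a tolerance threshold is used here; the threshold value is an illustrative assumption, not specified in the disclosure.

```python
import numpy as np

def find_dolp_matched_pixels(dolp_c, dolp_g, tol=1e-3):
    """Boolean mask of pixels where the linear polarization degrees of
    a color channel (R or B) and the reference channel G coincide.
    At such pixels the two reflectances coincide, so any remaining
    color difference is attributed to the light source color alone.
    `tol` is an illustrative tolerance."""
    return np.abs(np.asarray(dolp_c) - np.asarray(dolp_g)) < tol

# Example: the first pixel matches, the second does not.
mask_rg = find_dolp_matched_pixels(np.array([0.30, 0.10]),
                                   np.array([0.30, 0.25]))
```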
As described above, the reflectance (rR, rG, rB) of the subject 20 is indicated as a sum of the specular reflectance rs common to each color of R (red), G (green), and B (blue) and the diffuse reflectances rdR, rdG, and rdB different for each color of R (red), G (green), and B (blue).
That is, the following is satisfied.
A pixel position where the reflectance rR of R (red) and the reflectance rG of G (green) coincide with each other is a pixel position where the following expression is established: rR = rG.
Furthermore, a pixel position where the reflectance rB of B (blue) and the reflectance rG of G (green) coincide with each other is a pixel position where the following expression is established: rB = rG.
Assuming that an RGB color image including the Stokes parameter s0 that can be acquired from a polarized image input from the imaging unit 50 is the captured image (s′0R, s′0G, s′0B), consideration is given to a pixel value at the pixel position where the reflectance rR of R (red) and the reflectance rG of G (green) coincide with each other, that is, the pixel position where rR = rG is established.
At the pixel position where the reflectance rR of R (red) and the reflectance rG of G (green) coincide with each other, that is, the pixel position where rR = rG is established, the pixel values of the captured image can be expressed as the following (Expression 18).
That is, since rR = rG is established, LRrR can be replaced with LRrG.
From (Expression 18) described above, the following (Expression 19) is obtained.
The final term (LG/LR) in (Expression 19) described above is a value determined only by the light source color (LR, LG, LB).
Therefore, a relational expression between the white balance gain (kR, 1, kB) for eliminating the influence of the light source color (LR, LG, LB) and the light source color (LR, LG, LB) is to be a relational expression expressed by the following (Expression 20).
As shown in (Expression 20) described above, (LG/LR) calculated according to (Expression 19) described above corresponds to the white balance gain kR.
That is, (LG/LR) calculated according to (Expression 19) described above is the white balance gain kR to be applied to correction of the pixel value (intensity) of R (red) when the pixel value (intensity) of G (green) is used as a reference.
In this way, if it is possible to detect the pixel position where the reflectance rR of R (red) and the reflectance rG of G (green) coincide with each other in the subject 20, that is, the pixel position where rR = rG is established, the white balance gain kR can be calculated according to (Expression 19) described above.
Similarly, at the pixel position where the reflectance rB of B (blue) and the reflectance rG of G (green) coincide with each other, that is, the pixel position where rB = rG is established, the pixel values of the captured image can be expressed as the following (Expression 21).
That is, since rB = rG is established, LBrB can be replaced with LBrG.
From (Expression 21) described above, the following (Expression 22) is obtained.
The final term (LG/LB) in (Expression 22) described above is a value determined only by the light source color (LR, LG, LB).
Therefore, a relational expression between the white balance gain (kR, 1, kB) for eliminating the influence of the light source color (LR, LG, LB) and the light source color (LR, LG, LB) is to be a relational expression expressed by the following (Expression 23).
As shown in (Expression 23) described above, (LG/LB) calculated according to (Expression 22) described above corresponds to the white balance gain kB.
That is, (LG/LB) calculated according to (Expression 22) described above is the white balance gain kB to be applied to correction of the pixel value (intensity) of B (blue) when the pixel value (intensity) of G (green) is used as a reference.
In this way, from the image captured by the imaging unit 50, if it is possible to detect a pixel position where the B (blue) reflectance rB and the G (green) reflectance rG coincide with each other, that is, a pixel position where rB = rG is established, the white balance gain kB can be calculated according to (Expression 22) described above.
As described above, as the processing for calculating the white balance gain kR corresponding to R (red), the image processing device 100 detects a pixel in which the reflectances of the two colors (R and G) of the subject coincide with each other, and a color change of the captured image is caused only by the influence of the light source color (LR, LG, LB).
Furthermore, as the processing for calculating the white balance gain kB corresponding to B (blue), a pixel is detected in which the reflectances of the two colors (B and G) of the subject coincide with each other, and a color change of the captured image is caused only by the influence of the light source color (LR, LG, LB).
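At the detected pixels, s′0R = LRrG and s′0G = LGrG, so the ratio s′0G/s′0R equals LG/LR = kR; likewise s′0G/s′0B = LG/LB = kB at pixels where rB = rG. The sketch below averages the ratio over all detected pixels to suppress noise; that averaging is an implementation choice for this example, not prescribed by the disclosure.

```python
import numpy as np

def estimate_gains(s0_r, s0_g, s0_b, mask_rg, mask_bg):
    """At pixels where rR = rG, s'0G / s'0R = LG/LR = kR.
    At pixels where rB = rG, s'0G / s'0B = LG/LB = kB.
    The masks select the pixels detected in step A."""
    k_r = float(np.mean(s0_g[mask_rg] / s0_r[mask_rg]))
    k_b = float(np.mean(s0_g[mask_bg] / s0_b[mask_bg]))
    return k_r, k_b
```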
The image processing device 100 of the present disclosure executes the processing of (Step A) illustrated in
That is, the processing of “(Step A) a pixel in which linear polarization degrees (DoLP) of two different colors (R and G, B and G) coincide with each other is detected from an input image” is executed.
For example, in order to detect the pixel position where rR = rG is established, which is required to calculate the white balance gain kR corresponding to R (red), the image processing device 100 detects a pixel in which the linear polarization degrees (DoLPR and DoLPG) of R (red) and G (green) coincide with each other, from the image captured by the imaging unit 50.
Furthermore, in order to detect the pixel position where rB = rG is established, which is required to calculate the white balance gain kB corresponding to B (blue), the image processing device 100 detects a pixel in which the linear polarization degrees (DoLPB and DoLPG) of B (blue) and G (green) coincide with each other, from the image captured by the imaging unit 50.
The degree of linear polarization (DoLP) is a ratio (%) of linear polarization included in the observation light (reflection light) 22.
Hereinafter, a description will be given to processing executed by the image processing device 100 in order to detect the pixel position where rR = rG is established.
The linear polarization degrees (DoLP) of R (red) and G (green) in the observation light (reflection light) 22 are calculated according to the following (Expression 24a) and (Expression 24b).
In (Expression 24a) and (Expression 24b) described above, s′0R, s′1R, s′2R, s′0G, s′1G, and s′2G are Stokes parameters of R (red) and G (green) in the observation light (reflection light) 22.
All of these Stokes parameters can be acquired from the three images captured by the imaging unit 50 illustrated in
At the pixel position where the reflectance rR of R (red) and the reflectance rG of G (green) coincide with each other, that is, the pixel position where rR = rG
is established, the linear polarization degrees (DoLP) of R (red) and G (green) in the observation light (reflection light) 22 coincide with each other. Therefore, in order to detect the pixel position where rR=rG is established, it is only required to detect a pixel in which the linear polarization degree (DoLPR) of R (red) and the linear polarization degree (DoLPG) of G (green) in the observation light (reflection light) 22 coincide with each other.
Hereinafter, a description is given to the reason why the linear polarization degrees (DoLPR and DoLPG) of R (red) and G (green) in the observation light (reflection light) 22 coincide with each other at the pixel position where rR=rG is established.
As described above with reference to
S′=MS
Therefore, the following relational expressions (Expression 13a) to (Expression 13c) described above are also established for individual colors of RGB.
Note that, in (Expression 13a) to (Expression 13c) described above,
Here, in a case where rR = rG is satisfied, rR in (Expression 13a) described above can be replaced with rG.
By this replacement, (Expression 13a) and (Expression 13b) can be rewritten as the following (Expression 25a) and (Expression 25b).
From (Expression 25a) and (Expression 25b) described above,
According to these relational expressions, (Expression 24a) and (Expression 24b) described above, that is, the linear polarization degree (DoLPR) of R (red) and the linear polarization degree (DoLPG) of G (green) in the observation light (reflection light) 22 can be rewritten as the following (Expression 26a) and (Expression 26b).
The final terms of (Expression 26a) and (Expression 26b) described above coincide with each other.
This indicates that the linear polarization degree (DoLPR) of R (red) and the linear polarization degree (DoLPG) of G (green) in the observation light (reflection light) 22 coincide with each other.
In this way, at the pixel position where the reflectance rR of R (red) and the reflectance rG of G (green) coincide with each other, that is, the pixel position where rR = rG is established, (DoLPR) = (DoLPG) is satisfied.
As described above, in order to detect the pixel position where the reflectance rR of R (red) and the reflectance rG of G (green) coincide with each other, that is, the pixel position where rR = rG is established, it is only required to detect a pixel in which the linear polarization degree (DoLPR) of R (red) and the linear polarization degree (DoLPG) of G (green) in the observation light (reflection light) 22 coincide with each other.
The image processing device 100 of the present disclosure receives the following three images from the imaging unit 50, that is, these three types of images illustrated in
From these images, the image processing device 100 can acquire the Stokes parameters required for calculating the linear polarization degree (DoLPR) of R (red) and the linear polarization degree (DoLPG) of G (green) in the observation light (reflection light) 22.
That is, the image processing device 100 acquires the Stokes parameters (s′0R, s′1R, s′2R, s′0G, s′1G, s′2G) required for calculating the linear polarization degree (DoLPR) of R (red) and the linear polarization degree (DoLPG) of G (green) illustrated in (Expression 24a) and (Expression 24b) described above, from the color polarized image input from the imaging unit 50.
By using the three images input from the imaging unit 50, that is, the images (a) to (c) illustrated in
The pixel position where the linear polarization degree (DoLPR) of R (red) and the linear polarization degree (DoLPG) of G (green) coincide with each other is the pixel position where the reflectance rR of R (red) and the reflectance rG of G (green) coincide with each other, that is, the pixel position where rR = rG is established.
In this way, the image processing device 100 of the present disclosure detects the pixel position where the reflectance rR of R (red) and the reflectance rG of G (green) coincide with each other, that is, the pixel position where rR=rG is established, from the image captured by the imaging unit 50.
At the pixel position where rR=rG is established, as described above, the following expression,
Note that, as described above, the Stokes parameter s0 that can be acquired from the three types of color polarized images input from the imaging unit 50 illustrated in
Accordingly,
Note that, as described above, in the present processing example, the white balance gain is set such that, by using the pixel value of G (green) of the captured image as a reference without changing the G pixel value, the white balance gain to be multiplied by the pixel value of R (red) of the captured image is kR, and the white balance gain to be multiplied by the pixel value of B (blue) of the captured image is kB.
The processing described above is an example of calculation processing of one white balance gain among the white balance gains (kR, 1, kB), that is, the white balance gain kR to be applied to correction of the pixel value (intensity) of R (red).
As described above, the processing example described above is a calculation processing example of the white balance gain kR to be applied to correction of the pixel value (intensity) of R (red), but the calculation processing of the white balance gain kB to be applied to correction of the pixel value (intensity) of B (blue) can be similarly executed.
In a case of calculating the white balance gain kB to be applied to correction of the pixel value (intensity) of B (blue), a pixel position where the reflectance rB of B (blue) and the reflectance rG of G (green) coincide with each other, that is, a pixel position where rB = rG is established, is detected. For this purpose, a pixel position is detected where the linear polarization degree (DoLPB) of B (blue) and the linear polarization degree (DoLPG) of G (green) in the observation light (reflection light) 22 coincide with each other.
At the pixel position where rB=rG is established, as described above, the following expression,
As described above, the RGB pixel values (s′0R, s′0G, s′0B) of the captured image (s′0R, s′0G, s′0B) acquired from the multiple color polarized images input from the imaging unit 50 illustrated in
Accordingly,
As described above, the white balance gain (kR, 1, kB) can be defined as follows using the light source color (LR, LG, LB) of the light source 10.
The captured image (s′0R, s′0G, s′0B) acquired from the multiple color polarized images input from the imaging unit 50 illustrated in
In the embodiment described above,
That is, by detecting a coincident pixel of linear polarization degrees (DoLP) of different colors, the detection processing has been performed for the pixel position where the reflectances of the two colors in RGB of the subject 20 coincide with each other and only the influence of the light source color (LR, LG, LB) occurs.
Specifically, in the captured image (s′0R, s′0G, s′0B), the detection processing of the pixel position where only the influence of the light source color (LR, LG, LB) occurs has been performed.
For such pixel position detection processing, it is also possible to apply other methods instead of the processing of detecting a coincident pixel of the linear polarization degree (DoLP).
For example, the image processing device 100 can also perform the detection processing of the pixel position where a color change of the captured image is caused only by the influence of the light source color (LR, LG, LB), by calculating “n-th component (parameter)/m-th component (parameter)” (n ≠ m) by using two different components (parameters) among the Stokes parameters obtained from the images input from the imaging unit 50, that is, the following three types of images illustrated in
Specifically, first, each component ratio (parameter ratio) of the Stokes parameter is calculated according to (Expression 27a), (Expression 27b), and (Expression 27c) shown below.
Moreover, a pixel position where (Expression 28a) or (Expression 28b) below is established is detected.
The pixel position where (Expression 28a) described above is established is a pixel position where the reflectances of R (red) and G (green) coincide with each other, and a color change is caused only by the influence of the light source color (LR, LG, LB) for R (red) and G (green) of the image (s′0R, s′0G, s′0B).
Furthermore, the pixel position where (Expression 28b) described above is established is a pixel position where the reflectances of B (blue) and G (green) coincide with each other, and a color change is caused only by the influence of the light source color (LR, LG, LB) for B (blue) and G (green) of the image (s′0R, s′0G, s′0B).
By using such a technique, processing may be performed in which the pixel position where a color change of the captured image is caused only by the influence of the light source color (LR, LG, LB) is detected, and the white balance gains kR and kB are calculated from the pixel values set at these pixel positions.
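A sketch of this alternative detection: compare the Stokes component ratios s1/s0 and s2/s0 of two channels and keep pixels where both coincide. The choice of these two particular ratios and the tolerance are assumptions for illustration; (Expression 27a) to (Expression 27c) may define a different ratio set.

```python
import numpy as np

def stokes_ratio_match(stokes_c, stokes_g, tol=1e-3):
    """Detect pixels where the Stokes parameter ratios of a color
    channel (R or B) and the reference channel G coincide; at such
    pixels the reflectances coincide and only the light source color
    differs. Each argument is a tuple (s0, s1, s2) of arrays."""
    s0c, s1c, s2c = stokes_c
    s0g, s1g, s2g = stokes_g
    match_1 = np.abs(s1c / s0c - s1g / s0g) < tol
    match_2 = np.abs(s2c / s0c - s2g / s0g) < tol
    return match_1 & match_2
```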
Next, with reference to
As illustrated in
(Step P) Using a relationship that “an opposite color of the white balance adjusted image (kRiR, iG, kBiB)” is equal to “a color (DoLPR, DoLPG, DoLPB) of the linear polarization degree (DoLP)”, two or more relational expressions are generated including a Stokes parameter that can be acquired from the captured image and the two unknown white balance gains kR and kB. (Two or more relational expressions corresponding to different pixel positions of the captured image are generated.)
(Step Q) The white balance gains kR and kB are calculated by solving the two or more relational expressions generated in step P as simultaneous equations.
Hereinafter, with reference to
Similarly to the description with reference to
The irradiation light 11 of the light source 10 is reflected by the subject 20, and the imaging unit (color polarized image capturing camera) 50 captures an image of only a specific polarization component from the observation light 22 including the reflection light, and inputs the captured color polarized image to the image processing device 100.
The image processing device 100 calculates a white balance gain by using the color polarized image captured by the imaging unit (color polarized image capturing camera) 50, and executes the white balance adjustment processing by using the calculated white balance gain.
Configurations of the light source 10, the subject 20, and the imaging unit (color polarized image capturing camera) 50 are similar to those in
As illustrated in
As illustrated in
A Stokes vector S′ indicating a polarization state of the observation light (reflection light) 22 corresponding to one point P of the subject 20 is to be a Stokes vector according to the following relational expression.
S′=MS
As illustrated in
The imaging unit (color polarized image capturing camera) 50 captures multiple color polarized images for acquiring Stokes parameters constituting Stokes vectors of the individual colors indicating polarization states of R (red), G (green), and B (blue) of the observation light (reflection light) 22, that is,
S′R = (s′0R, s′1R, s′2R)T, S′G = (s′0G, s′1G, s′2G)T, and S′B = (s′0B, s′1B, s′2B)T.
That is, the color polarized images are the following three types of images illustrated in
The image processing device 100 is input with these three types of images and executes white balance gain calculation processing. Furthermore, the white balance adjustment processing of the captured image is executed using the calculated white balance gain.
Note that the white balance gain corresponds to a multiplication parameter for the pixel value (R, G, B) of each color of the captured image.
By using the pixel value of G (green) of the captured image as a reference without changing the G pixel value, the pixel value of R (red) of the captured image is multiplied by the gain kR, and the pixel value of B (blue) of the captured image is multiplied by the gain kB, thereby a corrected pixel value after white balance adjustment can be calculated.
As described above, a relational expression between the white balance gain (kR, 1, kB) for eliminating the influence of the light source color (LR, LG, LB) and the light source color (LR, LG, LB) is to be the following relational expression.
Details of the white balance gain calculation processing performed by the image processing device 100 will be described.
As described above, the image processing device 100 executes the white balance gain calculation processing by executing the following steps P and Q.
(Step P) Using a relationship that “an opposite color of the white balance adjusted image (kRiR, iG, kBiB)” is equal to “a color (DoLPR, DoLPG, DoLPB) of the linear polarization degree (DoLP)”, two or more relational expressions are generated including a Stokes parameter that can be acquired from the captured image and the two unknown white balance gains kR and kB. Two or more relational expressions corresponding to different pixel positions of the captured image are generated.
(Step Q) The white balance gains kR and kB are calculated by solving the two or more relational expressions generated in step P as simultaneous equations.
Before the description of the processing steps P and Q described above, first, a description will be given to the reason why the relationship that “an opposite color of the white balance adjusted image (kRiR, iG, kBiB)” is equal to “a color (DoLPR, DoLPG, DoLPB) of the linear polarization degree (DoLP)” is established.
Note that “an opposite color of the white balance adjusted image (kRiR, iG, kBiB)” is an opposite color of the white balance adjusted image (kRiR, iG, kBiB) generated by applying the white balance gain (kR, 1, kB) to the captured image (iR, iG, iB) of the observation light (=reflection light) 22 illustrated in
“Opposite color (R, G, B) of white balance adjusted image (kRiR, iG, kBiB)”=“color (DoLPR, DoLPG, DoLPB) of linear polarization degree (DoLP)”
The reason why this relationship is established is that the phases of the specular polarized light and the diffuse polarized light generated by reflection on the subject 20 are shifted from each other, and further, when the intensity of the specular polarized light and the intensity of the diffuse polarized light are compared, the intensity of the specular polarized light is larger, that is, the specular polarization degree > the diffuse polarization degree is satisfied.
The reason why this relationship is established will be described later with reference to
Hereinafter, on the premise that this relationship is established, the processing step executed by the image processing device 100 of the present disclosure, that is, the following is executed.
(Step P) Using a relationship that “an opposite color of the white balance adjusted image (kRiR, iG, kBiB)” is equal to “a color (DoLPR, DoLPG, DoLPB) of the linear polarization degree (DoLP)”, two or more relational expressions are generated including a Stokes parameter that can be acquired from the captured image and the two unknown white balance gains kR and kB. (Two or more relational expressions corresponding to different pixel positions of the captured image are generated.)
(Step Q) The white balance gains kR and kB are calculated by solving the two or more relational expressions generated in step P as simultaneous equations.
Details of the processing steps P and Q described above will be described.
First, by using the relationship that “an opposite color of the white balance adjusted image (kRiR, iG, kBiB)” is equal to “a color (DoLPR, DoLPG, DoLPB) of the linear polarization degree (DoLP)”, the image processing device 100 generates two or more relational expressions including a Stokes parameter that can be acquired from the captured image and two white balance gains kR and kB that are unknown. Two or more relational expressions corresponding to different pixel positions of the captured image are generated.
As described above, “an opposite color of the white balance adjusted image (kRiR, iG, kBiB)” is an opposite color of the white balance adjusted image (kRiR, iG, kBiB) generated by applying the white balance gain (kR, 1, kB) to the captured image (iR, iG, iB) of the observation light (=reflection light) 22 (iR, iG, iB) illustrated in
Each value (luminance value) of RGB constituting “an opposite color (RGB) of the white balance adjusted image (kRiR, iG, kBiB)” is defined by the following (Expression 31).
Furthermore, “color (DoLPR, DoLPG, DoLPB) of linear polarization degree (DoLP)” corresponds to a color of a linear polarization component included in the observation light (=reflection light) 22 (iR, iG, iB) illustrated in
Each value (luminance value) of RGB constituting “color (DoLPR, DoLPG, DoLPB) of linear polarization degree (DoLP)” is defined by the following (Expression 32).
Note that (Expression 31) and (Expression 32) described above are both calculation expressions for RGB values after normalization (mean=0, norm=1) processing. Expressions indicated as denominators of (Expression 31) and (Expression 32) correspond to coefficients for the normalization processing.
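The normalization described above (mean = 0, norm = 1) can be sketched as follows. This is a generic illustration of that kind of normalization, not a reproduction of (Expression 31) or (Expression 32) themselves; the function name is illustrative.

```python
import math

def normalize_rgb(r, g, b):
    """Normalize an RGB triplet to mean = 0 and Euclidean norm = 1.

    A generic sketch of the normalization described for (Expression 31)
    and (Expression 32); the exact expressions are not reproduced here.
    """
    mean = (r + g + b) / 3.0
    centered = [r - mean, g - mean, b - mean]
    norm = math.sqrt(sum(c * c for c in centered))
    if norm == 0.0:
        # A perfectly gray triplet has no chromatic component to normalize.
        return [0.0, 0.0, 0.0]
    return [c / norm for c in centered]
```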
The relational expression that "an opposite color (RGB) of the white balance adjusted image (kRiR, iG, kBiB)" expressed by the above-described (Expression 31) is equal to "a color (DoLPR, DoLPG, DoLPB) of the linear polarization degree (DoLP)" expressed by the above-described (Expression 32) is expressed by the following (Expression 33).
Note that the left side of the above-described (Expression 33) is an expression collectively indicating “an opposite color (RGB) of the white balance adjusted image (kRiR, iG, kBiB)” as one.
The right side of the above-described (Expression 33) is an expression collectively indicating “a color of the linear polarization degree (DoLP) (DoLPR, DoLPG, DoLPB)”.
When the above-described (Expression 33) is rearranged, the following (Expression 34) can be obtained.
DoLPR, DoLPG, and DoLPB in (Expression 34) described above are values that can be calculated using a Stokes parameter that can be acquired from a color polarized image captured by the imaging unit 50, as shown in (Expression 35a) to (Expression 35c) below.
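Although (Expression 35a) to (Expression 35c) are not reproduced here, the linear polarization degree is conventionally obtained from the Stokes parameters s0, s1, and s2 of each color channel, and that conventional definition can be sketched as follows (the function name is illustrative):

```python
import math

def dolp(s0, s1, s2):
    """Linear polarization degree from the Stokes parameters of one color
    channel, using the conventional definition sqrt(s1^2 + s2^2) / s0
    (assumed to correspond to (Expression 35a) to (Expression 35c), which
    are not reproduced here)."""
    if s0 == 0.0:
        return 0.0
    return math.sqrt(s1 * s1 + s2 * s2) / s0

# DoLPR, DoLPG, and DoLPB use the Stokes parameters of the corresponding
# color channel, e.g. dolp_r = dolp(s0_r, s1_r, s2_r).
```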
Furthermore, (iR, iG, iB) in the above-described (Expression 34) can be replaced with the pixel value (s′0R, s′0G, s′0B) of an image including the Stokes parameter s′0 that can be acquired from multiple color polarized images captured by the imaging unit 50.
As described above, the Stokes parameter s′0 corresponds to a light intensity signal (luminance signal) of unpolarized light in the observation light 22, and the image (s′0R, s′0G, s′0B) is a signal having an intensity ratio similar to that of the observation light (iR, iG, iB).
As a result, unknowns included in the relational expression shown in (Expression 34) described above are only the two white balance gains kR and kB.
Therefore, by generating and solving two or more relational expressions shown in (Expression 34) described above as simultaneous equations, the two white balance gains kR and kB can be calculated.
That is, the two white balance gains kR and kB can be calculated by generating the relational expression shown in (Expression 34) described above for two or more pixel positions.
The image processing device 100 applies the white balance gains kR and kB calculated by these processes to the image captured by the imaging unit 50, to execute the white balance adjustment processing.
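The application of the calculated gains can be sketched as follows; per the description above, the R pixel value is multiplied by kR, the B pixel value by kB, and the G pixel value is left unchanged (the function name and the list-of-tuples image representation are illustrative):

```python
def apply_white_balance(pixels, k_r, k_b):
    """Apply the white balance gain (kR, 1, kB) to an image given as a
    list of (R, G, B) tuples: R is multiplied by kR, G is unchanged,
    and B is multiplied by kB (a minimal sketch)."""
    return [(k_r * r, g, k_b * b) for (r, g, b) in pixels]
```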
Note that, as described above, the white balance gain (kR, 1, kB) can be defined as follows using the light source color (LR, LG, LB) of the light source 10.
As described above, it is possible to generate the captured image (s′0R, s′0G, s′0B) including luminance signals corresponding to RGB pixel values on the basis of a color polarized image which is an image captured by the imaging unit 50 illustrated in
Next, with reference to
“opposite color (R, G, B) of white balance adjusted image (kRiR, iG, kBiB)” = “color (DoLPR, DoLPG, DoLPB) of linear polarization degree (DoLP)”
As described above, the reason why this relationship is established is that the phases of the specular polarized light and the diffuse polarized light generated by reflection on the subject 20 are shifted from each other, and further, when the intensity of the specular polarized light is compared with that of the diffuse polarized light, the specular polarized light is stronger; that is, the relationship of the specular polarization degree > the diffuse polarization degree is satisfied.
A specific example will be described with reference to
In the setting of (Condition 1), since the irradiation light 11 of the light source 10 is white (R=G=B), an image captured by the imaging unit (camera) 50 is to be an image in which a color change due to the light source color does not occur. That is, an image that does not require white balance adjustment reflecting the color of the subject 20 (=white balance adjusted image) is captured.
Two graphs illustrated in the lower part of
In each graph, the horizontal axis represents a polarizer angle (deg), and the vertical axis represents intensity.
Note that the specular reflection component of the observation light reflects a color component of the light source, and the diffuse reflection component reflects a color component of the subject.
The condition setting illustrated in
Therefore, in “(1a) Polarizer-angle-corresponding intensity data of a specular reflection component and a diffuse reflection component of observation light”, the specular reflection component (is) of the observation light is illustrated as one graph common to RGB, and the diffuse reflection component (iRd, iGd, iBd) is illustrated as three individual graphs of RGB. Among the diffuse reflection components (iRd, iGd, iBd), the diffuse reflection component (iRd) of R (red) is the largest, which reflects the color (red) of the subject.
As described above, the graph illustrated in “(1a) polarizer-angle-corresponding intensity data of a specular reflection component and a diffuse reflection component of observation light” is a graph individually illustrating each intensity of the specular reflection component is (=iRs=iGs=iBs) and the diffuse reflection component (iRd, iGd, iBd) according to an angle of the polarizer of each of RGB colors of the observation light (iR, iG, iB).
In this graph, it is understood that formation positions of peaks and valleys of the specular reflection component is (=iRs=iGs=iBs) and the diffuse reflection component (iRd, iGd, iBd) are shifted, and phases of the specular polarized light and the diffuse polarized light are shifted. Furthermore, it is understood that, in a case where the intensity of the specular polarized light and the intensity of the diffuse polarized light are compared, the intensity of the specular polarized light is high, that is, the specular polarization degree > the diffuse polarization degree is satisfied.
Moreover, the graph illustrated in “(1b) polarizer-angle-corresponding intensity data of observation light (specular reflection component+diffuse reflection component)” is a graph illustrating total intensity obtained by adding the specular reflection component and the diffuse reflection component according to an angle of the polarizer of each of RGB colors of the observation light (iR, iG, iB).
Note that the specular reflection component according to the angle of the polarizer of each of RGB colors is solid line data (is) in the graph of (1a), and the diffuse reflection component is three pieces of data (iRd, iGd, iBd) corresponding to RGB such as a dotted line illustrated in the graph of (1a).
That is, an intensity signal iR of R (red) illustrated in the graph of (1b) corresponds to a sum of is(=iRs) and iRd illustrated in the graph of (1a).
Similarly, an intensity signal iG of G (green) corresponds to a sum of is(=iGs) and iGd illustrated in the graph of (1a).
Similarly, an intensity signal iB of B (blue) corresponds to a sum of is(=iBs) and iBd illustrated in the graph of (1a).
A color of an image captured by the imaging unit (camera) 50 is set according to the observation light (iR, iG, iB) intensity of the graph illustrated in this “(1b) observation light (specular reflection component+diffuse reflection component)”.
As understood from the graph illustrated in “(1a) a specular reflection component and a diffuse reflection component of observation light”, when the intensity is compared among the diffuse reflection components (iRd, iGd, iBd) of the individual colors of RGB, the diffuse reflection component (iRd) of R (red) is larger than the diffuse reflection components (iGd, iBd) of G (green) and B (blue).
As a result, regarding the intensity of the observation light (iR, iG, iB) of the graph indicated by “(1b) observation light (specular reflection component+diffuse reflection component)” as well, the observation light (iR) of R (red) is larger than the observation light (iG, iB) of G (green) and B (blue).
This is because the irradiation light 11 of the light source 10 is white (R=G=B), and the subject 20 is red. In this case, the image captured by the imaging unit (camera) 50 is an image in which a color according to the observation light (iR, iG, iB) intensity of the graph illustrated in this “(1b) observation light (specular reflection component+diffuse reflection component)” is set, that is, an image accurately reflecting the color (red) of the subject 20.
In the setting of (Condition 1) illustrated in
Note that, in the graph illustrated in “(1b) observation light (specular reflection component+diffuse reflection component)” in
Using the white balance gains kR and kB and the observation light (iR, iG, iB) observed in (Condition 1) illustrated in
That is, individual RGB values of the following are calculated.
Note that (Condition 1) illustrated in
As calculation results of this, the following has been obtained.
From the above-described results, it is proved that (Expression 31)≈(Expression 32), that is,
In the setting of (Condition 2), since the irradiation light 11 of the light source 10 is green (G), an image captured by the imaging unit (camera) 50 is an image in which a color change due to the light source color occurs. That is, an image that requires white balance adjustment reflecting the color of the subject 20 is captured.
Similarly to
In each graph, the horizontal axis represents a polarizer angle (deg), and the vertical axis represents intensity.
The graph illustrated in “(2a) a specular reflection component and a diffuse reflection component of observation light” is a graph individually illustrating each intensity of the specular reflection component (iRs, iGs, iBs) and the diffuse reflection component (iRd, iGd, iBd) according to an angle of the polarizer of each of RGB colors of the observation light (iR, iG, iB). In the present example, unlike the example illustrated in
This is because, as described above, the specular reflection component of the observation light reflects a color component of the light source, and the diffuse reflection component reflects a color component of the subject. Further, in the condition setting illustrated in
The graph illustrated in “(2b) observation light (specular reflection component+diffuse reflection component)” is a graph illustrating total intensity obtained by adding the specular reflection component and the diffuse reflection component according to an angle of the polarizer of each of RGB colors of the observation light (iR, iG, iB).
The color of the image captured by the imaging unit (camera) 50 is set according to the observation light (iR, iG, iB) intensity of the graph illustrated in this “(2b) observation light (specular reflection component+diffuse reflection component)”.
The intensity signal iR of R (red) illustrated in the graph of (2b) corresponds to a sum of is(=iRs) and iRd illustrated in the graph of (2a).
Similarly, the intensity signal iG of G (green) corresponds to a sum of is (=iGs) and iGd illustrated in the graph of (2a).
Similarly, the intensity signal iB of B (blue) corresponds to a sum of is(=iBs) and iBd illustrated in the graph of (2a).
In the graph illustrated in “(2a) a specular reflection component and a diffuse reflection component of observation light”, when the intensity is compared among the specular reflection components (iRs, iGs, iBs) of each of RGB colors, the specular reflection component (iBs) of B (blue) is larger than the specular reflection components (iGs, iRs) of G (green) and R (red).
As a result, regarding the intensity of the observation light (iR, iG, iB) in the graph indicated by “(2b) observation light (specular reflection component+diffuse reflection component)” as well, the observation light (iB) of B (blue) is larger than the observation light (iG, iR) of G (green) and R (red).
This is a result of the irradiation light 11 of the light source 10 being green (G) and the subject 20 being red. In this case, the image captured by the imaging unit (camera) 50 is an image in which a color according to the observation light (iR, iG, iB) intensity of the graph illustrated in this “(2b) observation light (specular reflection component+diffuse reflection component)” is set, that is, an image that does not accurately reflect the color (red) of the subject 20.
In the setting of (Condition 2) illustrated in
Using the white balance gains kR and kB and the observation light (iR, iG, iB) observed in (Condition 2) illustrated in
That is, individual RGB values of the following are calculated.
Note that (Condition 2) illustrated in
The calculation result is as follows.
From the above-described results, (Expression 31) ≠ (Expression 32), that is,
As described with reference to
That is, the white balance gains kR and kB can be calculated by using the relational expressions illustrated in (Expression 33) and (Expression 34) described above.
In this way, by generating and solving two or more relational expressions shown in (Expression 34) described above as simultaneous equations, the two white balance gains kR and kB can be calculated.
Next, a configuration example of the image processing device of the present disclosure will be described.
Irradiation light is emitted from the light source 10, and the observation light 22 which is reflection light reflected by the subject 20 is input to the imaging unit (color polarized image capturing camera) 50.
The imaging unit (color polarized image capturing camera) 50 captures multiple different color polarized images, and the captured multiple color polarized images are input to the image processing device 100.
The image processing device 100 calculates a white balance gain by using the multiple color polarized images input from the imaging unit (color polarized image capturing camera) 50, and executes the white balance adjustment processing using the calculated white balance gain.
The imaging unit (color polarized image capturing camera) 50 captures a polarized image by using a polarizing filter (polarizing element).
As described above with reference to
The three types of images are the following three types of images illustrated in
A plurality of specific configuration examples of the imaging unit (color polarized image capturing camera) 50 will be described with reference to
The imaging units 50a to 50c include polarizing filters 51a to 51c having different polarization directions, respectively.
Different polarized images through the polarizing filters 51a to 51c are captured by image sensors 52a to 52c.
Three color polarized images a to c captured by the three imaging units 50a to 50c are input to the image processing device 100.
By rotating the rotatable polarizing filter 51r, multiple different polarized images can be captured.
The image sensor inside the imaging unit 50 is configured as a polarizer stacked sensor 52p associated with a polarizer (polarizing filter) corresponding to each pixel. Light (polarized light) via the polarizer (polarizing filter) is input to each pixel of the polarizer stacked sensor 52p.
Specifically, for example, polarizers (polarizing filters) in multiple polarization directions are arranged in association with an image sensor such as a complementary metal oxide semiconductor (CMOS) or a charge coupled device (CCD).
For example, as illustrated in (a) a polarizer stacked sensor polarization direction example at the lower left of
Polarized images in individually different polarization directions are captured for these four pixels.
In this configuration, multiple polarized images can be acquired by one time of capturing processing, and high-speed processing can be performed.
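The readout of such a polarizer stacked sensor can be sketched as follows, assuming a 2×2 repeating pattern of four polarization directions. The particular angle assignment (0, 45, 90, and 135 degrees) and the cell layout are assumptions for illustration and may differ from an actual sensor.

```python
def split_polarizer_mosaic(raw, width, height):
    """Split a polarizer-stacked-sensor frame into four sub-images, one
    per polarization direction. A 2x2 cell of polarizer directions is
    assumed to repeat over the frame; `raw` is a row-major list of
    length width * height."""
    assert width % 2 == 0 and height % 2 == 0
    subimages = {0: [], 45: [], 90: [], 135: []}
    # Assumed placement within each 2x2 cell (illustrative only):
    #   (row 0, col 0) -> 0 deg,   (row 0, col 1) -> 45 deg
    #   (row 1, col 0) -> 135 deg, (row 1, col 1) -> 90 deg
    layout = {(0, 0): 0, (0, 1): 45, (1, 0): 135, (1, 1): 90}
    for y in range(0, height, 2):
        for x in range(0, width, 2):
            for (dy, dx), angle in layout.items():
                subimages[angle].append(raw[(y + dy) * width + (x + dx)])
    return subimages
```

Each returned sub-image is a quarter-resolution polarized image; in practice these would also be demosaiced against the color filter array, which is omitted here.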
Note that the polarizer (polarizing filter) only needs to take out linearly polarized light from subject light, and for example, a wire grid, photonic liquid crystal, or the like can be used. Note that, in a case of acquiring a color polarized image, a color filter is provided on an incident surface side of the sensor.
Note that, for example, in a configuration using the multiple imaging units 50a to 50c as illustrated in
If a position interval of the imaging units 50a to 50c is negligibly short with respect to a distance to the subject, parallax can be ignored in multiple polarized images having different polarization directions. In this case, it is possible to acquire an image equivalent to an unpolarized normal luminance image by averaging luminance of the polarized images having different polarization directions.
Whereas, in a case where the parallax cannot be ignored, an image equivalent to an unpolarized normal luminance image can be acquired by aligning the polarized images having different polarization directions in accordance with a parallax amount and averaging luminance of the aligned polarized images.
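The averaging described above can be sketched as follows. For equally spaced polarization directions (e.g., 0, 45, 90, and 135 degrees, an assumption for illustration), the cos/sin terms of the polarized component cancel in the average, leaving a signal proportional to the unpolarized intensity; parallax between the images is assumed to be negligible or already corrected.

```python
def unpolarized_luminance(polarized_images):
    """Pixel-wise average over polarized images captured at equally
    spaced polarization directions. Each image is a list of luminance
    values of equal length; the result is equivalent to an unpolarized
    normal luminance image (a minimal sketch)."""
    n = len(polarized_images)
    return [sum(vals) / n for vals in zip(*polarized_images)]
```

For example, with the standard model I(θ) = (s0 + s1·cos2θ + s2·sin2θ)/2, averaging the four intensities at 0, 45, 90, and 135 degrees yields exactly s0/2.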
Furthermore, in a case of a configuration illustrated in
The image equivalent to an unpolarized normal luminance image acquired by these processes corresponds to, for example, an image including a Stokes parameter (s′0R, s′0G, s′0B) corresponding to an unpolarized light intensity signal (luminance signal) corresponding to each of RGB colors in the observation light 22 illustrated in
Note that the white balance adjustment target image is not limited to such an unpolarized normal luminance image. The image processing device 100 may execute the white balance adjustment processing by using a polarized image acquired by the imaging unit 50 as an adjustment target image, to generate the white balance-adjusted polarized image.
In the configuration using the polarizer stacked sensor 52p associated with the polarizer (polarizing filter) corresponding to each pixel described above with reference to
The configuration illustrated in each figure is repeated in the horizontal direction and the vertical direction. (a) and (b) of
Furthermore, (b) of
Furthermore, in a case where a wire grid is used as the polarizing filter, polarized light in which an electric field component is perpendicular to a direction of grating (wire direction) is transmitted, and transmittance increases as the wire is longer. Therefore, in a case where the polarization component unit is 2×2 pixels, the transmittance is higher than that of 1×1 pixels, and an extinction ratio can be improved.
By providing the white pixel in this way, as disclosed in Patent Document “WO 2016/136085 A”, a dynamic range in generating normal line information can be expanded as compared with a case where the white pixel is not provided. Furthermore, since the white pixel has a favorable S/N ratio, the white pixel is less likely to be affected by noise in calculation of a color difference or the like.
Note that the configurations illustrated in
Note that the imaging unit (color polarized image capturing camera) 50 that acquires a color polarized image is not limited to the above-described configuration, and may have another configuration as long as the imaging unit (color polarized image capturing camera) 50 can acquire a color polarized image from which the polarization information such as the Stokes parameter to be used for the white balance gain calculation processing can be obtained.
Furthermore, the color polarized image to be used in the image processing device 100 is not limited to the case of being output from the imaging unit (color polarized image capturing camera) 50 to the image processing device 100. For example, in a case where a color polarized image generated by the imaging unit (color polarized image capturing camera) 50 or the like is recorded on a recording medium, the color polarized image recorded on the recording medium may be read and output to the image processing device 100.
Returning to
The image processing device 100 includes a polarization information acquisition unit 101, a white balance gain calculation unit 102, and a white balance adjustment unit 103.
The polarization information acquisition unit 101 of the image processing device 100 acquires polarization information to be applied to white balance gain calculation, by using a color polarized image acquired by the imaging unit (color polarized image capturing camera) 50.
The polarization information acquisition unit 101 is input with, for example, multiple different polarized images from the imaging unit (color polarized image capturing camera) 50.
Specifically, for example, the following three types of images described above with reference to
From these images, the polarization information acquisition unit 101 acquires polarization information to be used for the white balance gain calculation processing. Specifically, processing is performed for calculating a Stokes parameter corresponding to each of RGB colors, or a linear polarization degree (DoLP) corresponding to each of RGB colors that is calculated using the Stokes parameter.
The polarization information acquired by the polarization information acquisition unit 101 is output to the white balance gain calculation unit 102.
The white balance gain calculation unit 102 uses a color polarized image acquired by the imaging unit (color polarized image capturing camera) 50 and the polarization information acquired by the polarization information acquisition unit 101, to calculate a white balance gain to be applied to white balance adjustment.
For example, the white balance gain kR to be multiplied by the R (red) pixel value of the captured image and the white balance gain kB to be multiplied by the B (blue) pixel value of the captured image are calculated.
Each element of the white balance gain (kR, 1, kB) corresponds to a multiplication parameter for the pixel value (R, G, B) of each color of the captured image.
By using the pixel value of G (green) of the captured image as a reference without changing the G pixel value, the pixel value of R (red) of the captured image is multiplied by the white balance gain kR, and the pixel value of B (blue) of the captured image is multiplied by the white balance gain kB, whereby a corrected image after white balance adjustment can be generated.
The white balance gain calculated by the white balance gain calculation unit 102 is output to the white balance adjustment unit 103.
The white balance adjustment unit 103 executes the white balance adjustment processing on a color image acquired by the imaging unit (color polarized image capturing camera) 50.
For example, by multiplying the R (red) pixel value of the color image by the white balance gain kR, and multiplying the B (blue) pixel value by the white balance gain kB, a corrected image after the white balance adjustment is generated.
The white balance adjusted image generated by the white balance adjustment unit 103 is output to an external device, for example, a display device, a recording device, or the like.
Next, a sequence of processing executed by the image processing device of the present disclosure will be described.
With reference to
Note that the image processing device 100 of the present disclosure has a program execution function such as a CPU, for example, and processing according to flowcharts illustrated in
Hereinafter, processing of each step of the flowchart illustrated in
First, in step S201, the image processing device 100 is input with a color polarized image.
A color polarized image captured by the imaging unit (color polarized image capturing camera) 50 illustrated in
Specifically, for example, the following three types of color polarized images described above with reference to
Next, in step S202, the image processing device 100 acquires polarization information to be applied to white balance gain calculation, by using the color polarized image input in step S201.
This processing is processing executed by the polarization information acquisition unit 101 of the image processing device 100 illustrated in
The polarization information acquisition unit 101 acquires polarization information to be used for the white balance gain calculation processing, from the color polarized image input from the imaging unit (color polarized image capturing camera) 50.
Specifically, processing is performed for calculating a Stokes parameter corresponding to each of RGB colors, or a linear polarization degree (DoLP) corresponding to each of RGB colors that is calculated using the Stokes parameter.
From the color polarized image input from the imaging unit (color polarized image capturing camera) 50, the polarization information acquisition unit 101 acquires, for example, the following Stokes parameters, that is, three types of Stokes parameters of,
Moreover, the linear polarization degree (DoLP) corresponding to each of RGB colors is calculated using the acquired Stokes parameters.
Note that, as described above, the linear polarization degree (DoLP) is a ratio (%) of linearly polarized light included in observation light (subject reflection light).
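Assuming, for illustration, that the color polarized images are captured through polarizers at 0, 45, and 90 degrees (the actual angles of the imaging unit 50 may differ), the Stokes parameters and the linear polarization degree (DoLP) of one color channel can be sketched as follows:

```python
import math

def stokes_from_three(i0, i45, i90):
    """Stokes parameters from polarizer intensities at 0, 45, and 90
    degrees, using the standard model
        I(theta) = (s0 + s1*cos(2*theta) + s2*sin(2*theta)) / 2.
    The angle set is an assumed configuration for this sketch."""
    s0 = i0 + i90
    s1 = i0 - i90
    s2 = 2.0 * i45 - s0
    return s0, s1, s2

def dolp_from_three(i0, i45, i90):
    """Linear polarization degree (DoLP) for one color channel, i.e. the
    ratio of linearly polarized light in the observation light."""
    s0, s1, s2 = stokes_from_three(i0, i45, i90)
    return math.sqrt(s1 * s1 + s2 * s2) / s0 if s0 else 0.0
```

Applying these per channel to the R, G, and B polarized intensities yields the per-color Stokes parameters and (DoLPR, DoLPG, DoLPB) used in the subsequent gain calculation.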
Next, in step S203, the image processing device 100 calculates a white balance gain by using polarization information acquired in step S202.
This processing is processing executed by the white balance gain calculation unit 102 of the image processing device 100 illustrated in
The white balance gain calculation unit 102 calculates a white balance gain that is a pixel value adjustment parameter for correcting the pixel value of the captured image to the original color of the subject.
The white balance gain calculation unit 102 uses a color polarized image acquired by the imaging unit (color polarized image capturing camera) 50 and the polarization information acquired by the polarization information acquisition unit 101, to calculate a white balance gain to be applied to white balance adjustment.
For example, the white balance gain kR to be multiplied by the R (red) pixel value of the captured image and the white balance gain kB to be multiplied by the B (blue) pixel value of the captured image are calculated.
Each element of the white balance gain (kR, 1, kB) corresponds to a multiplication parameter for the pixel value (R, G, B) of each color of the captured image.
By using the pixel value of G (green) of the captured image as a reference without changing the G pixel value, the pixel value of R (red) of the captured image is multiplied by the white balance gain kR, and the pixel value of B (blue) of the captured image is multiplied by the white balance gain kB, whereby a corrected image after white balance adjustment can be generated.
The white balance gain calculation processing in step S203 is executed by applying any one of the following two processing examples described above with reference to
A detailed sequence of the above-described two white balance gain calculation processing examples will be described later with reference to flowcharts illustrated in
Finally, in step S204, the image processing device 100 executes the white balance adjustment processing to which the white balance gain calculated in step S203 is applied.
This processing is processing executed by the white balance adjustment unit 103 of the image processing device 100 illustrated in
The white balance adjustment unit 103 executes the white balance adjustment processing on a captured image acquired by the imaging unit (color polarized image capturing camera) 50.
For example, by multiplying a R (red) pixel value of the captured image by the white balance gain kR calculated in step S203, and multiplying a B (blue) pixel value by the white balance gain kB, the white balance adjustment unit 103 generates a corrected image after white balance adjustment.
The white balance adjusted image generated by the white balance adjustment unit 103 is output to an external device, for example, a display device, a recording device, or the like.
Next, a detailed sequence of the white balance gain calculation processing in step S203 will be described. As described above, the white balance gain calculation processing in step S203 is executed by applying any one of the following two processing examples described above with reference to
First, with reference to the flowchart illustrated in
Note that the processing in steps S221 to S222 of the flow illustrated in
In step S221, the white balance gain calculation unit 102 detects a pixel in which linear polarization degrees (DoLP) of two different colors (R and G, B and G) coincide with each other, from a captured image acquired by the imaging unit (color polarized image capturing camera) 50.
Note that the detection pixel is a pixel in which reflectances of two colors (R and G, B and G) of the subject coincide with each other, and a color change of the captured image is caused only by an influence of the light source color (LR, LG, LB).
As described above with reference to
rR = rG
(DoLPR) = (DoLPG)
As described above, in order to detect the pixel position where the reflectance rR of R (red) and the reflectance rG of G (green) coincide with each other, that is, the pixel position where rR = rG is satisfied, a pixel in which the linear polarization degrees coincide, that is, (DoLPR) = (DoLPG), is detected.
Similarly, in order to detect a pixel position where the reflectance rB of B (blue) and the reflectance rG of G (green) coincide with each other, that is, a pixel position where rB = rG is satisfied, a pixel in which (DoLPB) = (DoLPG) holds is detected.
Next, in step S222, the white balance gain calculation unit 102 calculates the white balance gains kR and kB on the basis of the pixel values of the two colors (R and G, B and G) of the detection pixel.
As described above with reference to
Similarly, at the pixel position where the reflectance rB of B (blue) and the reflectance rG of G (green) coincide with each other, that is,
As described above, the white balance gain (kR, 1, kB) can be defined as follows using the light source color (LR, LG, LB) of the light source 10.
The image (s′0R, s′0G, s′0B) acquired from the multiple color polarized images input from the imaging unit 50 has the pixel value (s′0R, s′0G, s′0B) including a luminance signal corresponding to an RGB pixel value. By multiplying this pixel value by the calculated white balance gain (kR, 1, kB), the white balance adjustment processing is executed, and an image after white balance adjustment can be generated.
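The processing of steps S221 to S222 can be sketched as follows. This is an illustrative reconstruction, not the disclosure's exact expressions: at a pixel where the linear polarization degrees of two colors coincide, the reflectances of those colors coincide, so the remaining intensity difference is attributed to the light source color, and the intensity ratio with respect to the G reference is taken as the gain. Function and parameter names are illustrative.

```python
def wb_gains_from_dolp(pixels, dolp, tol=1e-3):
    """Sketch of steps S221 to S222: find pixels where the linear
    polarization degrees of two colors coincide (within `tol`), and take
    the white balance gain as the G-to-other-color intensity ratio there,
    on the assumption that equal reflectance leaves only the light source
    color as the cause of the color difference. `pixels` and `dolp` are
    lists of (R, G, B) tuples; returns (kR, kB), either of which may be
    None if no suitable pixel is found."""
    k_r = k_b = None
    for (i_r, i_g, i_b), (d_r, d_g, d_b) in zip(pixels, dolp):
        if k_r is None and abs(d_r - d_g) < tol and i_r > 0:
            k_r = i_g / i_r  # kR maps the R value onto the G reference
        if k_b is None and abs(d_b - d_g) < tol and i_b > 0:
            k_b = i_g / i_b  # kB maps the B value onto the G reference
        if k_r is not None and k_b is not None:
            break
    return k_r, k_b
```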
Next, with reference to a flowchart illustrated in
This “B. White balance gain calculation processing example 2” is the white balance gain calculation processing described above with reference to
Note that the processing in steps S241 to S242 of the flow illustrated in
In step S241, the white balance gain calculation unit 102 executes the following processing.
Using a relationship that “an opposite color of the white balance adjusted image (kRiR, iG, kBiB)” is equal to “a color (DoLPR, DoLPG, DoLPB) of the linear polarization degree (DoLP)”, two or more relational expressions are generated including a Stokes parameter that can be acquired from the captured image and two white balance gains kR and kB that are unknown. Two or more relational expressions corresponding to different pixel positions of the captured image are generated.
The relational expression serving as a base of this relational expression is the relational expression (Expression 33) described above with reference to
The parameters DoLPR, DoLPG, and DoLPB included in (Expression 34) can be calculated, as shown in (Expression 35), by using Stokes parameters that can be acquired from a color polarized image captured by the imaging unit 50.
All parameters other than the white balance gains kR and kB included in (Expression 34) are known; the only unknowns in (Expression 34) are the white balance gains kR and kB.
Therefore, by generating and solving two or more relational expressions shown in (Expression 34) as simultaneous equations, the two white balance gains kR and kB can be calculated.
That is, in step S241, the relational expression shown in (Expression 34) is generated for two or more pixel positions.
Next, in step S242, the white balance gain calculation unit 102 solves the two or more relational expressions generated in step S241, as simultaneous equations.
As a result of this processing, the two white balance gains kR and kB, which are two unknowns included in the relational expression shown in (Expression 34), are calculated.
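The form of (Expression 34) is not reproduced in this excerpt, so the following is only a schematic sketch of step S242: it assumes, hypothetically, that each pixel position yields a relation linear in kR and kB, so that two pixel positions give a 2x2 system. The coefficient values are placeholders, not values from the document:

```python
import numpy as np

def solve_wb_gains(coeffs, rhs):
    """Solve two relational expressions (one per pixel position)
    for the two unknown gains (kR, kB).

    coeffs : 2x2 array, the kR/kB coefficients of each relation
    rhs    : length-2 array, the known terms derived from Stokes parameters
    """
    k_r, k_b = np.linalg.solve(np.asarray(coeffs, float),
                               np.asarray(rhs, float))
    return float(k_r), float(k_b)

# Hypothetical coefficients standing in for the terms of (Expression 34).
k_r, k_b = solve_wb_gains([[2.0, 0.0], [0.0, 4.0]], [1.2, 6.0])
```

If the actual relations are nonlinear in kR and kB, a numerical root finder would replace the linear solve; the two-equations-two-unknowns structure is the same.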
The image processing device 100 applies the white balance gains kR and kB calculated by these processes to the image captured by the imaging unit 50, to execute the white balance adjustment processing.
Next, white balance gain calculation and white balance adjustment processing in unit of pixel or in unit of image area will be described.
By applying the white balance gain calculated according to the processing described in the above-described embodiment to, for example, an unpolarized luminance image acquired from an image captured by the imaging unit 50, it is possible to acquire a white balance adjusted image, that is, an RGB image reflecting the color of the subject with high accuracy.
For example, an image including the Stokes parameter (s′0R, s′0G, s′0B) acquired from the image captured by the imaging unit 50 is an RGB image including an unpolarized light intensity signal (luminance signal). By performing the white balance adjustment processing on this image, an RGB image reflecting the color of the subject with high accuracy can be acquired.
At this time, it is also possible to perform the white balance adjustment processing in which a uniform white balance gain is applied to the entire image. However, different white balance gains may be calculated in unit of pixel constituting the image or in unit of image area including multiple pixels, and individual white balance adjustment processing may be performed in unit of pixel or in unit of image area.
With reference to
As described above, in the image processing device 100 of the present disclosure illustrated in
“A. White balance gain calculation processing example 1” described with reference to
Furthermore, “B. White balance gain calculation processing example 2” described with reference to
In both of these two processing examples, the white balance gain is calculated using the degree of linear polarization (DoLP) obtained from one pixel or two pixels in the image.
That is, it can be interpreted that the calculation processing of the white balance gain corresponding to the specific pixel is performed.
Such a white balance gain corresponding to a specific pixel may be applied to the entire image. However, it is also possible to calculate white balance gains corresponding to multiple pixels in accordance with “A. White balance gain calculation processing example 1” or “B. White balance gain calculation processing example 2” described above, and then, by using those white balance gains, calculate a white balance gain for each pixel or in unit of image area in the entire image and perform white balance adjustment for each pixel or in unit of image area to which the calculated gain is applied.
With reference to
It is assumed that four pixels Pw1 to Pw4 illustrated in
The white balance gain calculation unit 102 of the image processing device 100 of the present disclosure illustrated in
The white balance gain corresponding to R (red) of the pixel Pw1 is kR1.
The white balance gain corresponding to R (red) of the pixel Pw2 is kR2.
The white balance gain corresponding to R (red) of the pixel Pw3 is kR3.
The white balance gain corresponding to R (red) of the pixel Pw4 is kR4.
The white balance gain calculation unit 102 calculates a white balance gain of a pixel or an image area for which a white balance gain has not yet been calculated, by performing interpolation processing using weights corresponding to the distances from the pixels Pw1 to Pw4.
The white balance gain calculation unit 102 calculates a white balance gain kRP corresponding to R (red) of the pixel Pt according to (Expression 41) shown below.
Note that a coefficient wa in (Expression 41) described above is a coefficient for normalizing the weight.
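(Expression 41) is not reproduced in this excerpt; the following is a plausible sketch under the assumption that it is an inverse-distance-weighted interpolation, with wa playing the role of the normalizing coefficient. The anchor positions and gain values are hypothetical:

```python
import numpy as np

def interpolate_gain(target_xy, anchor_xy, anchor_gains, eps=1e-6):
    """Distance-weighted interpolation of a white balance gain at target_xy
    from anchor pixels (e.g. Pw1..Pw4) with known gains (e.g. kR1..kR4).
    The coefficient wa normalizes the weights so they sum to 1."""
    target = np.asarray(target_xy, float)
    anchors = np.asarray(anchor_xy, float)
    gains = np.asarray(anchor_gains, float)
    dist = np.linalg.norm(anchors - target, axis=1)
    w = 1.0 / (dist + eps)   # closer anchors carry more weight
    wa = 1.0 / w.sum()       # normalizing coefficient
    return float((wa * w * gains).sum())

# A target equidistant from two anchors receives the average of their gains.
k_rp = interpolate_gain((1.0, 0.0), [(0.0, 0.0), (2.0, 0.0)], [0.8, 1.2])
```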
The white balance gain calculation unit 102 executes processing similar to the processing described above for each pixel, and calculates the white balance gains kR corresponding to R (red) and the white balance gains kB corresponding to B (blue) of all pixels constituting the image.
Note that the white balance gain may be calculated not in unit of pixel but in unit of image area including multiple pixels. In this case, for example, a barycentric position of the image area is set as a representative pixel, a white balance gain of the representative pixel is calculated by the above-described processing, and the calculated white balance gain is applied to all pixels of the image area containing the barycenter.
In this case, the white balance gain calculation unit 102 performs clustering to segment one image into multiple image areas, and sets a barycentric position and a representative value of the white balance gain for each image area (each class).
Thereafter, interpolation processing is performed using the barycentric position and the white balance gain representative value for each image area (class), and a white balance gain of each area is calculated. Note that the representative value of the white balance gain is a representative value in unit of image area (class), and for example, an average value, a median value, a mode value, or the like can be applied.
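Given an area segmentation (here represented by a hypothetical label map rather than an actual clustering algorithm), computing the barycentric position and a representative gain per class can be sketched as follows; the median is used here as one of the representative values the text permits:

```python
import numpy as np

def class_representatives(labels, gains):
    """For each image area (class) in a label map, compute the
    barycentric pixel position and a representative gain (median)."""
    reps = {}
    for c in np.unique(labels):
        mask = labels == c
        ys, xs = np.nonzero(mask)
        barycenter = (float(ys.mean()), float(xs.mean()))
        reps[int(c)] = (barycenter, float(np.median(gains[mask])))
    return reps

# Hypothetical 2x2 image segmented into two classes (rows 0 and 1).
labels = np.array([[0, 0], [1, 1]])
gains = np.array([[0.9, 1.1], [1.4, 1.6]])
reps = class_representatives(labels, gains)
```

The per-class barycenters and representative gains would then feed the interpolation step described above.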
As illustrated in
In such a case, white balance gains of other image areas can be set by performing interpolation processing similar to the case described above with reference to
As illustrated in
Furthermore, in a case where the white balance gain calculation unit 102 performs area segmentation of the color polarized image and performs the white balance gain setting in unit of segmented areas, area segmentation may be performed using graph cutting, deep learning (such as a convolutional neural network (CNN) or a recurrent neural network (RNN)), or the like, and a single white balance gain may be set for each color component in each segmented area.
In a case where the white balance gain calculation unit 102 performs the white balance gain calculation processing in unit of image area, for example, the following processing can be performed.
That is, it is possible to perform processing of identifying a type of an object that is a subject of an image, setting an image area in unit of identified object type, and performing the white balance gain calculation processing in unit of object type.
Note that, as the processing of identifying the type of the object that is the subject of the image, for example, a technique such as pattern matching or semantic segmentation can be applied.
The pattern matching is processing of, for example, storing pattern data including a shape and feature information of a person, a car, or the like in a storage unit, and identifying each subject by comparing the pattern data stored in the storage unit with a subject in an image area on the captured image.
The semantic segmentation is a technique of storing dictionary data (learned data) for object identification based on various kinds of actual object shape and other feature information in the storage unit, and performing object identification as to what the object in the image is, on the basis of a matching degree between the dictionary data and the object in the captured image. In the semantic segmentation, object identification is performed in unit of pixel of the captured image.
Meanwhile, the white balance gain calculation unit 102 may selectively use the above-described types of processing. For example, the white balance gain calculation unit 102 analyzes a dispersion degree of white balance gains of the same color component calculated in unit of pixel or in unit of image area, and switches the processing in accordance with the analyzed dispersion degree.
For example, in a case where the variation in the white balance gain in unit of pixel or image area does not exceed a preset threshold, a single gain is set for the entire color polarized image; in a case where the variation exceeds the threshold, a white balance gain is calculated and applied in unit of pixel or in unit of image area.
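This switching rule can be sketched as below. The use of the standard deviation as the dispersion measure, the mean as the global gain, and the threshold value are all assumptions for illustration; the document does not fix these choices:

```python
import numpy as np

def choose_gain_strategy(per_pixel_gains, threshold):
    """If the per-pixel gains vary little (std at or below a preset
    threshold), return one global gain (here: the mean); otherwise
    signal that per-pixel / per-area gains should be applied."""
    g = np.asarray(per_pixel_gains, float)
    if g.std() <= threshold:
        return "global", float(g.mean())
    return "per_area", None

# Gains that barely vary -> one gain for the whole image.
strategy, gain = choose_gain_strategy([1.00, 1.02, 0.98, 1.01], threshold=0.05)
```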
With reference to a flowchart illustrated in
Hereinafter, processing of each step of the flow of
First, in step S301, the white balance gain calculation unit 102 calculates a white balance gain corresponding to a pixel.
The white balance gain calculation unit 102 executes one of the following two types of processing described above with reference to
Both of these two types of processing calculate a white balance gain by using a degree of linear polarization (DoLP) obtained from one or two pixels of an image. That is, the two types of processing are processing for calculating a white balance gain corresponding to a specific pixel.
Next, in step S302, the white balance gain calculation unit 102 analyzes a variation in the white balance gain in unit of pixel calculated in step S301.
Note that, it is assumed that a white balance gain corresponding to the pixel has been calculated for multiple pixels of the image in step S301.
In step S302, the white balance gain calculation unit 102 analyzes a variation in multiple white balance gains in unit of pixel calculated in step S301.
Next, in step S303, the white balance gain calculation unit 102 determines whether the variation in the white balance gain in unit of pixel analyzed in step S302 falls within an allowable range.
When it is determined that the variation in the white balance gain in unit of pixel falls within the preset allowable range, the processing proceeds to step S304.
Whereas, when it is determined that the variation in the white balance gain is not within the preset allowable range, the processing proceeds to step S305.
Note that the case where it is determined that the variation in the white balance gain in unit of pixel falls within the preset allowable range includes, for example, a case where it can be considered that illumination light is emitted from one light source or multiple light sources having a small difference in color temperature to a subject included in an image.
Whereas, the case where the variation in the white balance gain in unit of pixel exceeds the preset allowable range includes a case where it can be considered that illumination light is emitted from multiple light sources having different color temperatures to a subject included in an image.
When it is determined in step S303 that the variation in the white balance gains in unit of pixel falls within the preset allowable range, the processing proceeds to step S304.
In this case, in step S304, the white balance gain calculation unit 102 sets one white balance gain to be used in the entire area of the image.
The white balance gain calculation unit 102 calculates the white balance gain to be applied to the entire area of the image by performing statistical processing or the like on the white balance gains in unit of pixel calculated in step S301.
For example, the white balance gain calculation unit 102 sets any one of an average value, a mode value, a median value, and the like of the white balance gains in unit of pixel calculated in step S301 as the white balance gain to be used in the entire image area.
Whereas, when it is determined in step S303 that the variation in the white balance gains in unit of pixel exceeds the preset allowable range, the processing proceeds to step S305.
In this case, in step S305, the white balance gain calculation unit 102 performs area segmentation processing on the image, that is, clustering processing. The white balance gain calculation unit 102 performs clustering as area segmentation processing based on a position of the image and an object type.
Next, in step S306, the white balance gain calculation unit 102 determines whether the variation in the white balance gain corresponding to the image area (class) generated by the area segmentation processing in step S305 exceeds an allowable range.
When it is determined that the variation in the white balance gain corresponding to the image area (class) exceeds the predetermined allowable range, the processing proceeds to step S307.
Whereas, when it is determined that the variation in the white balance gain corresponding to the image area (class) falls within the predetermined allowable range, the processing proceeds to step S308.
When it is determined in step S306 that the variation in the white balance gain corresponding to the image area (class) exceeds the predetermined allowable range, the processing proceeds to step S307.
In this case, in step S307, the white balance gain calculation unit 102 calculates a new white balance gain corresponding to the pixel by performing interpolation processing based on the white balance gain corresponding to the pixel calculated in step S301.
That is, the calculation processing of the new white balance gain corresponding to the pixel based on the interpolation processing described above with reference to
Whereas, when it is determined in step S306 that the variation in the white balance gain corresponding to the image area (class) falls within the predetermined allowable range, the processing proceeds to step S308.
In this case, in step S308, the white balance gain calculation unit 102 calculates a new white balance gain corresponding to the pixel and the image area (class) by performing interpolation processing based on the white balance gain in unit of class determined using the barycenter and an average value in unit of each image area (class) generated by the area segmentation (clustering) processing executed in step S305.
That is, the calculation processing of the new white balance gain corresponding to the pixel and the image area (class) based on the interpolation processing described above with reference to
Note that, in
In this case, a color temperature of the illumination light is different between the image area AR4 (class AR4) and the image area AR5 (class AR5), and for example, a variation exceeding the allowable range occurs between the white balance gain for the image area AR4 (class AR4) and the white balance gain for the image area AR5 (class AR5).
Therefore, for example, the gain of the image area AR6 (class AR6) is calculated by interpolation processing based on the white balance gain corresponding to the image area AR4 (class AR4), the white balance gain corresponding to the image area AR5 (class AR5), and a distance to the image area AR4 and a distance to the image area AR5.
Furthermore, gains are also set for the areas AR1 to AR3, similarly to the area AR6. Therefore, in the areas AR1 to AR3 and AR6, more natural white balance adjustment can be performed in consideration of the two types of illumination light. Note that, in a case where the illumination light is only one of the light from the light source LT and the external light, the gains of the area AR4 and the area AR5 are substantially equal, and thus the gain for the entire color polarized image is set.
As described above, the image processing device 100 of the present disclosure calculates an optimum white balance gain for the captured image by using the polarization information.
Even in a case where an achromatic area cannot be detected from the captured image, the image processing device 100 of the present disclosure is capable of processing of calculating the optimum white balance gain for the captured image by using polarization information of a chromatic area.
Moreover, the image processing device 100 of the present disclosure can calculate the optimum white balance gain for each area in the imaging scene by using the polarization information. For example, in a case where multiple light sources having different color temperatures are provided, it is possible to perform processing of calculating the white balance gain in accordance with the color temperature of the illumination light with which the object is irradiated.
Next, a hardware configuration example of the image processing device 100 of the present disclosure will be described.
Each constituent part of the hardware configuration illustrated in
A central processing unit (CPU) 301 functions as a data processing unit that executes various types of processing in accordance with a program stored in a read only memory (ROM) 302 or a storage unit 308. For example, the CPU 301 executes the processing according to the sequence described in the above embodiment.
A random access memory (RAM) 303 stores programs, data, or the like to be performed by the CPU 301. The CPU 301, the ROM 302, and the RAM 303 are connected to each other by a bus 304.
The CPU 301 is connected to an input/output interface 305 via the bus 304, and an input unit 306 including various operation units, switches, and the like, and an output unit 307 including a display as a display unit, a speaker, and the like are connected to the input/output interface 305, in addition to the camera.
The CPU 301 receives a camera-captured image, operation information, and the like from the input unit 306, executes various types of processing, and outputs a processing result to, for example, the output unit 307.
The storage unit 308 connected to the input/output interface 305 includes, for example, a hard disk, or the like and stores programs executed by the CPU 301 and various types of data. A communication unit 309 functions as a transmitter and receiver for data communication via a network such as the Internet or a local area network, and communicates with an external device.
A drive 310 connected to the input/output interface 305 drives a removable medium 311 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory such as a memory card, and records or reads data.
Hereinabove, the embodiments according to the present disclosure have been described in detail with reference to the specific embodiments. However, it is obvious that those skilled in the art can modify or substitute the embodiments without departing from the gist of the present disclosure. That is, the present invention has been disclosed in the form of exemplification, and should not be interpreted in a limited manner. In order to determine the gist of the present disclosure, the claims should be considered.
Note that the technology disclosed herein can have the following configurations.
Note that a series of processing herein described can be executed by hardware, software, or a combined configuration of the both. In a case where processing by software is executed, a program in which a processing sequence is recorded can be installed and executed in a memory in a computer incorporated in dedicated hardware, or the program can be installed and executed in a general-purpose computer capable of executing various types of processing. For example, the program can be recorded in advance in a recording medium. In addition to being installed in a computer from the recording medium, a program can be received via a network such as a local area network (LAN) or the Internet and installed in a recording medium such as an internal hard disk or the like.
Furthermore, the various types of processing herein described may be performed not only in time series as described, but also in parallel or individually in accordance with the processing capability of the device that performs the processing or as necessary. Furthermore, a system herein described is a logical set configuration of a plurality of devices, and is not limited to a system in which devices of respective configurations are in the same housing.
As described above, according to the configuration of one embodiment of the present disclosure, a configuration is realized in which white balance gain calculation processing and white balance adjustment are executed by using polarization information acquired from a color polarized image.
Specifically, for example, there are provided: the polarization information acquisition unit configured to acquire polarization information from a color polarized image; the white balance gain calculation unit configured to calculate a white balance gain by using the acquired polarization information; and the white balance adjustment unit configured to execute white balance adjustment processing to which the calculated white balance gain is applied. The polarization information acquisition unit calculates a color-corresponding polarization degree from the color polarized image, and the white balance gain calculation unit detects a pixel position where polarization degrees of two colors coincide with each other on the basis of a pixel position where subject reflectances of the two colors coincide with each other, and calculates a white balance gain by using color-corresponding polarization information of the detected pixel position.
With this configuration, a configuration is realized in which white balance gain calculation processing and white balance adjustment are executed by using polarization information acquired from a color polarized image.
Number | Date | Country | Kind
---|---|---|---
2021-199040 | Dec 2021 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2022/039660 | 10/25/2022 | WO |