The present disclosure relates to an image processing device.
Conventionally, technology for removing an image reflected on a subject included in a captured image is known. For example, there is technology that removes a reflected image by using captured images obtained through polarizing plates having different polarization directions. In such technology, the luminance value of a given pixel is approximated by a trigonometric function having the polarization angle as a variable, and an imaging device removes the reflected image by replacing the luminance value of the reflected image with the minimum value of the trigonometric function curve.
However, if regular reflection light components (specular components) of a subject included in a captured image are uniformly removed, texture information of the subject may be impaired.
Therefore, the present disclosure provides a mechanism capable of correcting a reflected image in a subject while maintaining the texture of the subject included in a captured image.
Note that the above problem or object is merely one of a plurality of problems or objects that can be solved or achieved by a plurality of embodiments disclosed herein.
An image processing device of the present disclosure includes a control unit. The control unit acquires a plurality of polarized images having different polarization directions. The control unit outputs a corrected image in which a specific region included in a specular reflection region is corrected on the basis of the plurality of polarized images.
Hereinafter, embodiments of the disclosure will be described in detail by referring to the accompanying drawings. Note that, in the present specification and the drawings, components having substantially the same functional configuration are denoted by the same symbols, and redundant description is omitted.
Furthermore, in the present specification and the drawings, specific values may be indicated to give description; however, the values are merely examples, and other values may be applied.
In addition, in the present specification and the drawings, similar components of embodiments may be distinguished by attaching different letters or numbers after the same symbol. Note that, in a case where it is not necessary to particularly distinguish each of similar components, only the same symbol is assigned.
One or more embodiments (including examples and modifications) described below can be each implemented independently. Meanwhile, at least a part of the plurality of embodiments described below may be combined with and implemented together with at least a part of another embodiment as appropriate. The plurality of embodiments may include novel features different from each other. Therefore, the plurality of embodiments can contribute to solving different objects or disadvantages and achieve different effects.
An example of the reflected image correction processing will be described with reference to
The captured images used for the reflected image correction processing are a plurality of captured images (hereinafter also referred to as polarized images) having different polarization directions. The reflected image correction processing is executed by, for example, an image processing device (not illustrated).
For example, in the reflected image correction processing illustrated in
In
Alternatively, assume that the image processing device performs reflected image correction processing that removes some of the specular components in the reflection region SR. In this case, as illustrated in the lower-right diagram in
As described above, when the image processing device performs correction to uniformly remove all or some of the specular components in the reflection region SR, the texture in the reflection region SR is also corrected.
Therefore, the reflected image correction processing according to the embodiment of the disclosure corrects a reflected image region (an example of the specific region) included in the reflection region SR (an example of the specular reflection region).
The image processing device specifies the reflection region SR of the subject OB1 included in the polarized images and a reflected image region CR in the reflection region SR on the basis of the polarized images. The image processing device corrects the reflected image region CR. As a result, as illustrated in the right diagram of
The sensor 200 is an imaging device (polarization sensor) that captures a plurality of captured images (polarized images) having different polarization directions. The sensor 200 includes, for example, a light receiving unit (not illustrated) including a plurality of polarizing filters having different polarization directions. The sensor 200 generates polarized images depending on the amount of light received by the light receiving unit via the polarizing filters.
The sensor 200 has, for example, the light receiving unit (not illustrated) in which four polarizing filters having different polarization angles are arrayed in a lattice pattern. For example, as illustrated in the left diagram of
As illustrated in the right diagram of
The sensor 200 outputs the generated polarized images M2_1 to M2_4 to the image processing device 100.
Note that this example assumes that the sensor 200 generates the plurality of polarized images M2_1 to M2_4 for the respective polarization directions from the polarized image M1 including pixels having different polarization directions (polarization angles); however, it is not limited thereto. For example, as illustrated in
Alternatively, the sensor 200 may include a plurality of light receiving units (not illustrated) for respective polarization directions and generate the polarized images M2_1 to M2_4 corresponding to the respective light receiving units.
Furthermore, in this example, the sensor 200 generates the polarized images M2_1 to M2_4 in the four polarization directions; however, it is not limited thereto. The sensor 200 only needs to generate a plurality of polarized images M2_1 to M2_4 having different polarization directions, and the number of polarization directions may be three or less or five or more.
Furthermore, in this example, the sensor 200 outputs the polarized images M2_1 to M2_4 for the respective polarization directions to the image processing device 100; however, it is not limited thereto. For example, the sensor 200 may output the polarized image M1 to the image processing device 100. In this case, the image processing device 100 converts the polarized image M1 into the polarized images M2_1 to M2_4 for the respective polarization directions.
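For illustration, the following is a minimal numpy sketch of this conversion, assuming a 2×2 repeating lattice of polarization angles (0°, 45°, 135°, 90°); the angle-to-position assignment and the function name are assumptions, as the actual layout depends on the sensor.

```python
import numpy as np

def split_polarization_mosaic(m1):
    """Split a mosaic polarized image M1 into one image per polarization angle.

    Assumes a 2x2 repeating lattice of polarizing filters laid out as
        0   45
        135 90
    (the actual assignment depends on the sensor). Each returned image
    has half the resolution of M1 in each dimension.
    """
    return {
        0:   m1[0::2, 0::2],
        45:  m1[0::2, 1::2],
        135: m1[1::2, 0::2],
        90:  m1[1::2, 1::2],
    }

# Example: a 480x640 mosaic yields four 240x320 polarized images M2_1 to M2_4.
m1 = np.random.rand(480, 640)
m2 = split_polarization_mosaic(m1)
```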
Returning to
As illustrated in
The communication unit 110 is implemented by, for example, a communication device and communicates with other devices via various wired or wireless networks. For example, the communication unit 110 receives data of a captured image (for example, the polarized images M2) from another device such as the sensor 200 and stores the data in the storage unit 120. Furthermore, for example, the communication unit 110 transmits data of an image (for example, the output image M3) edited in the image processing device 100 and stored in the storage unit 120 to another device. Furthermore, although not illustrated, in a case where the image processing device 100 is a server, the communication unit 110 receives a command such as a request for processing transmitted from a terminal device (not illustrated) that receives provision of the service and provides the command to each unit of the image processing device 100.
The storage unit 120 is a storage device capable of reading and writing data, such as a dynamic random access memory (DRAM), a static random access memory (SRAM), a flash memory, or a hard disk. The storage unit 120 temporarily or permanently stores various types of data used in the image processing device 100.
For example, the storage unit 120 at least temporarily stores data of a captured image received by the communication unit 110 from another device such as the sensor 200 and provides the data to the control unit 130 as necessary. Furthermore, for example, the storage unit 120 at least temporarily stores data of an image edited by the control unit 130 and provides the data to the communication unit 110 for transmission to another device as necessary. Alternatively, the storage unit 120 may provide the data of the edited image to a display device (not illustrated) for display.
The control unit 130 is a controller that controls each unit of the image processing device 100. The control unit 130 is implemented by, for example, a processor such as a central processing unit (CPU), a micro processing unit (MPU), or a graphics processing unit (GPU). For example, the control unit 130 is implemented by a processor executing various programs stored in a storage device inside the image processing device 100 using a random access memory (RAM) or the like as a work area. Note that the control unit 130 may be implemented by an integrated circuit such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA). Any of the CPU, the MPU, the GPU, the ASIC, and the FPGA can be regarded as a controller.
The control unit 130 includes an image acquisition unit 131, a specular image generating unit 132, a diffused image generating unit 133, a correction region determining unit 134, a correction processing unit 135, and a corrected image generating unit 136. Each of the blocks (image acquisition unit 131 to corrected image generating unit 136) included in the control unit 130 is a functional block indicating a function of the control unit 130. These functional blocks may be software blocks or hardware blocks. For example, each of the functional blocks described above may be one software module implemented by software (including microprograms) or may be one circuit block on a semiconductor chip (die). It is naturally understood that each of the functional blocks may be one processor or one integrated circuit. The functional blocks may be configured in any manner.
Note that the control unit 130 may include functional units different from the above-described functional blocks. In addition, some or all of the actions of the blocks (image acquisition unit 131 to corrected image generating unit 136) included in the control unit 130 may be performed by another device. For example, the sensor 200 may perform some or all of the actions of the blocks included in the control unit 130.
The image acquisition unit 131 acquires the polarized images M2 from the sensor 200. Alternatively, the image acquisition unit 131 may acquire the polarized images M2 from the storage unit 120 or may acquire the polarized images M2 from another device via the communication unit 110. The image acquisition unit 131 outputs the acquired polarized images M2 to the specular image generating unit 132 and the diffused image generating unit 133.
The specular image generating unit 132 generates a specular image on the basis of the polarized images M2. The specular image is an image having a luminance value of a specular component (reflection component) for each pixel.
The specular image generating unit 132 generates a specular image using a difference D between the maximum value and the minimum value of the luminance values approximated as a trigonometric function as the luminance value of a predetermined pixel. Alternatively, the specular image generating unit 132 may generate a specular image using the maximum value of the luminance values approximated as the trigonometric function as the luminance value of a predetermined pixel.
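For illustration, the trigonometric approximation can be fitted per pixel by linear least squares over the model I(θ) = a0 + a1·cos 2θ + a2·sin 2θ, whose extrema are a0 ± √(a1² + a2²). The following Python sketch is illustrative only (the function name and the four angles are assumptions); it also yields the minimum value used later for the diffused image.

```python
import numpy as np

def fit_polarization_sinusoid(images, angles_deg):
    """Fit I(theta) = a0 + a1*cos(2*theta) + a2*sin(2*theta) per pixel.

    images: stack of polarized images, shape (n_angles, H, W).
    angles_deg: polarization angle of each image, shape (n_angles,).
    Returns (i_max, i_min), the per-pixel extrema of the fitted curve.
    """
    theta = np.deg2rad(np.asarray(angles_deg, dtype=float))
    # Design matrix of the linear least-squares problem, shape (n_angles, 3).
    design = np.stack([np.ones_like(theta),
                       np.cos(2 * theta),
                       np.sin(2 * theta)], axis=1)
    n, h, w = images.shape
    coeffs, *_ = np.linalg.lstsq(design, images.reshape(n, -1), rcond=None)
    a0, a1, a2 = (c.reshape(h, w) for c in coeffs)
    amplitude = np.hypot(a1, a2)  # half of the max-minus-min difference
    return a0 + amplitude, a0 - amplitude

# Specular image as the difference D = I_max - I_min; diffused image as I_min.
stack = np.random.rand(4, 240, 320)
i_max, i_min = fit_polarization_sinusoid(stack, [0, 45, 90, 135])
specular = i_max - i_min
diffuse = i_min
```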
Note that the specular image generating unit 132 may calculate the luminance value of the specular image for each pixel of the polarized images M2 or may calculate the luminance value of the specular image for each pixel block including a plurality of pixels.
Furthermore, in this example, the specular image generating unit 132 calculates the luminance value of the specular image by approximating the luminance value of the polarized images M2 as a trigonometric function; however, it is not limited thereto. For example, the specular image generating unit 132 may set the maximum luminance value, or the difference D between the maximum value and the minimum value, among the luminance values of pixels located at the same position in the plurality of polarized images M2 as the luminance value of the specular image at that position.
For example, in a case where the specular image is generated on the basis of two polarized images M2, the specular image generating unit 132 generates the specular image in which larger ones of luminance values of pixels of the two polarized images M2 are set as luminance values of the pixels. Alternatively, the specular image generating unit 132 generates a specular image in which the difference D between the luminance values of the pixels of the two polarized images M2 is set as the luminance value of the pixels.
Returning to
The diffused image generating unit 133 illustrated in
As described above, the luminance values of pixels at the same position in the plurality of polarized images M2_1 to M2_4 can be approximated as the trigonometric function having the polarization angle as the variable (see
The diffused image generating unit 133 generates the diffused image using the minimum value of the luminance values approximated as the trigonometric function as the luminance value of a predetermined pixel. Note that the diffused image generating unit 133 may calculate the luminance value of the diffused image for each pixel of the polarized images M2 or may calculate the luminance value of the diffused image for each pixel block including a plurality of pixels.
Furthermore, in this example, the diffused image generating unit 133 calculates the luminance value of the diffused image by approximating the luminance value of the polarized images M2 as the trigonometric function; however, it is not limited thereto. For example, the diffused image generating unit 133 may set the minimum luminance value among the luminance values of pixels located at the same position in the plurality of polarized images M2 as the luminance value of the diffused image at that position.
For example, in a case where the diffused image is generated on the basis of the two polarized images M2, the diffused image generating unit 133 generates the diffused image in which smaller ones of luminance values of pixels of the two polarized images M2 are set as luminance values of the pixels.
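For illustration, these non-fitting variants, for both the specular image (per-pixel maximum, or the difference D) and the diffused image (per-pixel minimum), reduce to element-wise reductions over the stack of polarized images; a minimal numpy sketch (variable names are illustrative):

```python
import numpy as np

# stack: polarized images M2 of equal size, shape (n, H, W)
stack = np.random.rand(4, 240, 320)

specular_diff = stack.max(axis=0) - stack.min(axis=0)  # difference-D variant
specular_max = stack.max(axis=0)                       # maximum-value variant
diffuse = stack.min(axis=0)                            # per-pixel minimum
```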
The diffused image generating unit 133 illustrated in
The correction region determining unit 134 determines a correction region to be subjected to reflected image correction on the basis of the polarized images M2.
The reflection region specifying unit 1341 specifies a specular reflection region SR in the specular image using the specular image and the diffused image.
For example, the reflection region specifying unit 1341 selects pixels having a luminance value larger than a first threshold (luminance threshold) from a difference image between the specular image and the diffused image. The reflection region specifying unit 1341 divides the selected pixels into one or more pixel groups, each pixel group consisting of pixels in contact with each other.
The reflection region specifying unit 1341 determines a pixel group having an area larger than a second threshold among the pixel groups as the reflection region. In other words, the reflection region specifying unit 1341 determines a pixel group in which the number of pixels included in a pixel group is larger than the second threshold as the reflection region. The reflection region specifying unit 1341 determines, in the specular image, the same region as the determined reflection region as the specular reflection region SR.
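For illustration, a minimal sketch of this two-threshold procedure using scipy's connected-component labeling; the threshold values and the function name are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def specify_reflection_region(specular, diffuse, luminance_thr, area_thr):
    """Specify the specular reflection region SR.

    Pixels whose specular-minus-diffuse difference exceeds the first
    (luminance) threshold are grouped into connected pixel groups, and
    groups whose pixel count exceeds the second (area) threshold form
    the region. Returns a boolean mask of the specular reflection region.
    """
    mask = (specular - diffuse) > luminance_thr
    labels, n_groups = ndimage.label(mask)  # labels 1..n_groups, 0 = background
    region = np.zeros_like(mask)
    for i in range(1, n_groups + 1):
        group = labels == i
        if group.sum() > area_thr:  # keep groups with enough pixels
            region |= group
    return region
```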
The reflection region specifying unit 1341 outputs information regarding the specular reflection region SR to the reflected image region specifying unit 1342.
The reflected image region specifying unit 1342 specifies a reflected image region included in the specular reflection region SR on the basis of the specular image.
The reflected image region specifying unit 1342 determines an image included in the specular reflection region SR on the basis of a change in the luminance value in the specular reflection region SR of the specular image or a detection result of edges therein. The reflected image region specifying unit 1342 determines the image as the reflected image region R.
The reflected image region specifying unit 1342 outputs information regarding the determined reflected image region R to the correction region specifying unit 1343.
The correction region specifying unit 1343 specifies a correction region CR to be subjected to the reflected image correction processing on the basis of the specular image.
The correction region specifying unit 1343 determines, for example, a region including the reflected image region R as the correction region CR. The region may have a predefined shape or may have a shape designated by a user. Illustrated in
In addition, illustrated
The correction region specifying unit 1343 outputs information regarding the specified correction region CR to the correction processing unit 135.
The correction region determining unit 134 is not limited to the configuration illustrated in
The image position specifying unit 1344 specifies an image (object) included in the specular image. For example, the image position specifying unit 1344 detects an image included in the specular image and the position of the image on the basis of a change in the luminance value of the specular image or a detection result of edges. For example, the image position specifying unit 1344 can specify the subject OB1 or the specular reflection region SR as images included in the specular image in addition to the image (reflected image region R) reflected on the subject OB1.
The image position specifying unit 1344 outputs information regarding the specified image to the correction region specifying unit 1343A.
Note that, in this example, the image position specifying unit 1344 specifies the image on the basis of the specular image; however, it is not limited thereto. For example, the image position specifying unit 1344 may specify the image on the basis of the diffused image or may specify the image on the basis of the diffused image and the specular image.
The correction region specifying unit 1343A specifies the correction region CR on the basis of the specular reflection region SR specified by the reflection region specifying unit 1341 and the image specified by the image position specifying unit 1344.
The correction region specifying unit 1343A first specifies an image located in the specular reflection region SR among the images specified by the image position specifying unit 1344. For example, the correction region specifying unit 1343A specifies an image located in the specular reflection region SR as the reflected image region R.
The correction region specifying unit 1343A specifies the correction region CR to be subjected to the reflected image correction processing on the basis of the reflected image region R. The processing of specifying the correction region CR is the same as the processing in the correction region specifying unit 1343 in
The correction region specifying unit 1343A outputs information regarding the specified correction region CR to the correction processing unit 135.
Returning to
The method determining unit 1351 determines a reflected image correction method to be performed on the correction region CR. For example, the method determining unit 1351 determines a method to be actually performed from among a plurality of methods defined in advance. The method determining unit 1351 may determine the method in accordance with, for example, an instruction from the user or may determine a predefined method as a method to be actually performed. Alternatively, the method determining unit 1351 may determine the method depending on the type of the image (such as characters or a person) reflected in the specular reflection region SR.
The method determining unit 1351 outputs information regarding the determined correction method to the intensity determining unit 1352.
Here, three methods of low-pass filter processing, pixelation processing, and saturation suppression processing will be described as examples of the reflected image correction method determined by the method determining unit 1351. Note that the correction methods described here are examples, and the correction processing unit 135 may perform the reflected image correction by a method other than the three types of processing.
The method determining unit 1351 can determine, for example, low-pass filter processing as the correction method.
In a case where the method determining unit 1351 determines the low-pass filter processing as the correction method, the correction processing unit 135 executes the low-pass filter processing on the correction region CR and outputs the corrected specular image (hereinafter, also referred to as a corrected specular image).
As a result, for example, as illustrated in the left diagram of
At this point, the correction processing unit 135 generates a corrected specular image in which a high-frequency component in the correction region CR is smaller than a high-frequency component in a region other than the correction region CR (region NCR in
A solid line in
With the correction processing unit 135 executing the low-pass filter processing, the high-frequency components in the correction region CR are cut off. For example, in
Note that, in this example, the method determining unit 1351 determines the low-pass filter processing as the correction processing executed by the correction processing unit 135; however, it is not limited thereto. For example, the method determining unit 1351 can determine filtering processing of reducing specific frequency components as the correction processing.
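For illustration, a minimal sketch of low-pass filter processing restricted to the correction region CR, using a Gaussian filter as one possible low-pass filter (the filter type and its parameter are assumptions):

```python
import numpy as np
from scipy import ndimage

def lowpass_correct(specular, cr_mask, sigma=5.0):
    """Apply low-pass filtering only inside the correction region CR.

    Blends a Gaussian-blurred copy into the masked region so that the
    high-frequency components in CR become smaller than those in the
    non-correction region NCR.
    """
    blurred = ndimage.gaussian_filter(specular, sigma=sigma)
    return np.where(cr_mask, blurred, specular)
```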
In a case where the method determining unit 1351 determines pixelation processing as the correction method, the correction processing unit 135 executes the pixelation processing on the correction region CR and outputs the corrected specular image.
As a result, for example, as illustrated in the left diagram of
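For illustration, a minimal sketch of pixelation processing restricted to the correction region CR, implemented here as block averaging (the block size and the block-average formulation are assumptions):

```python
import numpy as np

def pixelate_correct(specular, cr_mask, block=8):
    """Pixelate the correction region CR by block averaging.

    Each block x block tile is replaced by its mean value, and the
    coarse result is blended into the image only where cr_mask is set.
    """
    h, w = specular.shape
    hc, wc = h - h % block, w - w % block  # crop to whole blocks
    tiles = specular[:hc, :wc].reshape(hc // block, block, wc // block, block)
    coarse = tiles.mean(axis=(1, 3)).repeat(block, 0).repeat(block, 1)
    out = specular.copy()
    out[:hc, :wc] = np.where(cr_mask[:hc, :wc], coarse, specular[:hc, :wc])
    return out
```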
In a case where the method determining unit 1351 determines the saturation suppression processing as the correction method, the correction processing unit 135 executes the saturation suppression processing on the correction region CR and outputs the corrected specular image in which the saturation of the correction region CR is reduced.
As a result, for example, as illustrated in the left diagram of
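For illustration, a minimal sketch of saturation suppression restricted to the correction region CR, implemented here by blending each pixel toward its luminance (the blending formulation and weights are assumptions):

```python
import numpy as np

def suppress_saturation(rgb, cr_mask, keep=0.3):
    """Reduce saturation inside the correction region CR.

    Blends each pixel toward its luminance (gray) value; keep=1.0
    leaves the color unchanged, keep=0.0 fully desaturates.
    """
    gray = rgb @ np.array([0.299, 0.587, 0.114])  # per-pixel luminance
    desat = gray[..., None] + keep * (rgb - gray[..., None])
    return np.where(cr_mask[..., None], desat, rgb)
```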
Returning to
For example, as illustrated in the right diagram of
For example, assume that the definition of the reflected image in a correction region CR1 illustrated in the upper left diagram of
In this case, the intensity determining unit 1352 determines the intensity of correction such that the intensity of correction of the correction region CR1 is the highest and that the intensity of correction of the correction region CR3 is the lowest. In addition, the intensity determining unit 1352 determines the intensity of correction such that the intensity of correction of the correction region CR2 is weaker than that of the correction region CR1 and is stronger than that of the correction region CR3.
Specifically, the intensity determining unit 1352 changes the filter size of the low-pass filter processing depending on the definition of the correction region CR. For example, the intensity determining unit 1352 increases the filter size as the reflected image region R included in the correction region CR is clearer.
As described above, the intensity determining unit 1352 determines the correction intensity of the correction region CR such that the definition of the correction region CR after the correction is substantially constant regardless of the definition of the correction region CR before the correction. As a result, the image processing device 100 can output the corrected image in which the definition of the corrected correction region CR4 is substantially constant regardless of the definition of the correction region CR.
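For illustration, one way to realize such a definition-dependent intensity is to map a simple sharpness measure of the correction region CR to the Gaussian filter size; the measure and the linear mapping below are assumptions, not taken from the disclosure:

```python
import numpy as np

def intensity_from_definition(specular, cr_mask, base_sigma=2.0, gain=8.0):
    """Choose the low-pass filter size from the definition of CR.

    Uses the mean gradient magnitude inside CR as a simple sharpness
    (definition) measure: the clearer the reflected image, the larger
    the returned Gaussian sigma, so the post-correction definition
    stays roughly constant.
    """
    gy, gx = np.gradient(specular)
    sharpness = np.hypot(gx, gy)[cr_mask].mean()
    return base_sigma + gain * sharpness
```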
Note that, in
For example, in a case where the correction processing unit 135 performs pixelation processing, the intensity determining unit 1352 determines the intensity of correction by changing the range over which pixels are rearranged for the pixelation depending on the definition. For example, the clearer the correction region CR before the correction is, the wider the intensity determining unit 1352 makes the range of the random rearrangement.
Furthermore, for example, in a case where the correction processing unit 135 performs the saturation suppression processing, the intensity determining unit 1352 changes how to lower the saturation depending on the variance of the saturation in the correction region CR. For example, the intensity determining unit 1352 determines the intensity such that the larger the variance of the saturation of the correction region CR is, the lower the saturation is.
Returning to
The image correction unit 1353 illustrated in
As illustrated in the right diagram of
As described above, the image correction unit 1353 of the image processing device 100 can perform the reflected image correction following the motion of the subject OB1, for example. In this manner, the image processing device 100 can perform the reflected image correction of a moving image, for example.
As illustrated in the left diagram of
Meanwhile, as illustrated in the left diagram of
As described above, the image correction unit 1353 of the image processing device 100 performs correction of the correction region CR including the image in a case where an image is reflected in the subject OB1 and does not perform correction in a case where no image is reflected in the subject OB1. This makes it possible to prevent the image processing device 100 from performing unnecessary correction on the subject OB1 on which no image is reflected, thereby making it possible to maintain the texture of the subject OB1 on which no image is reflected.
As illustrated in the left diagram of
In this case, as illustrated in the right diagram of
As described above, the image correction unit 1353 corrects the correction region CR including the image R reflected in the subject OB1 but does not perform correction on parts other than the correction region CR. Therefore, the image processing device 100 can perform the reflected image correction of the reflected image R while maintaining the texture of the subject OB1.
Note that, in this example, the image correction unit 1353 does not perform the reflected image correction on the image PR originally appearing on the subject OB1; however, it is not limited thereto. For example, the image correction unit 1353 may perform reflected image correction on the image PR. In this case, the image correction unit 1353 sets the correction intensity for the image PR to be lower than the correction intensity for the correction region CR and performs the reflected image correction.
In this case, for example, the intensity determining unit 1352 in
Alternatively, the image correction unit 1353 may perform correction on the image PR in accordance with a user's instruction. For example, in a case where the user instructs to perform correction on the image PR, the image correction unit 1353 performs correction on the image PR, and in a case where the user instructs not to perform correction on the image PR, the image correction unit 1353 does not perform correction on the image PR. In a case where the user instructs to perform correction on the image PR, the image correction unit 1353 may perform correction on the image PR with an intensity instructed by the user.
In any of the examples described above, correction that makes it difficult to detect a reflected image is performed; however, the correction processing by the correction processing unit 135 is not limited thereto. For example, in order to notify the user that there is an image reflected in the specular image, the image correction unit 1353 may perform correction processing of emphasizing the correction region CR. Examples of the correction processing include contrast enhancement processing of enhancing the contrast of the correction region CR, saturation enhancement processing of enhancing saturation, and the like.
Returning to
For example, the corrected image generating unit 136 can generate the corrected image M3 by weighting and adding the luminance value of at least one of the corrected specular image or the diffused image. The corrected image generating unit 136 can output the generated corrected image M3 to a device as an output stage such as a display device (not illustrated).
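For illustration, a minimal sketch of this weighted addition (the weight w and the clipping to a unit range are assumptions):

```python
import numpy as np

def compose_corrected_image(corrected_specular, diffuse, w=0.8):
    """Weighted addition of the corrected specular image and the diffused image.

    w controls how much of the (corrected) specular component is restored;
    w = 0 keeps only the diffuse component, w = 1 restores it fully.
    """
    return np.clip(diffuse + w * corrected_specular, 0.0, 1.0)
```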
Next,
As illustrated in
The image processing device 100 specifies a specular reflection region SR on the basis of the specular image and the diffused image (step S104). The image processing device 100 specifies a reflected image region R included in the specular reflection region SR (step S105). The image processing device 100 determines a correction region CR on the basis of the reflected image region R (step S106).
The image processing device 100 determines a correction method to be performed on the correction region CR (step S107). The image processing device 100 determines the correction intensity of the correction region CR (step S108).
The image processing device 100 performs the reflected image correction on the correction region CR with the correction intensity determined in step S108 using the correction method determined in step S107 (step S109). The image processing device 100 generates a corrected image M3, which is an output image, using the corrected specular image and the diffused image (step S110).
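For illustration, the flow of steps S101 to S110 can be sketched end to end by chaining the illustrative helpers above; using the specular reflection region SR directly as the correction region CR is a simplification made here for brevity:

```python
def reflected_image_correction(polarized_stack, angles_deg,
                               luminance_thr=0.1, area_thr=50):
    """End-to-end sketch of steps S101 to S110 using the helpers above."""
    # S101-S103: acquire polarized images, generate specular/diffused images.
    i_max, i_min = fit_polarization_sinusoid(polarized_stack, angles_deg)
    specular, diffuse = i_max - i_min, i_min
    # S104: specify the specular reflection region SR.
    sr_mask = specify_reflection_region(specular, diffuse,
                                        luminance_thr, area_thr)
    # S105-S106 (simplified): use SR as-is as the correction region CR.
    cr_mask = sr_mask
    # S107-S108: choose the correction method (low-pass here) and intensity.
    sigma = intensity_from_definition(specular, cr_mask)
    # S109: perform the reflected image correction on CR.
    corrected_specular = lowpass_correct(specular, cr_mask, sigma)
    # S110: generate the corrected image M3 from the two components.
    return compose_corrected_image(corrected_specular, diffuse)
```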
As described above, the image processing device 100 according to the embodiment of the disclosure outputs the corrected image obtained by correcting the correction region CR included in the specular reflection region SR on the basis of the polarized image M2. As a result, the image processing device 100 can correct the reflected image included in the specular reflection region SR while maintaining the texture of the subject OB1.
For example, the reflected image correction processing according to the embodiment of the disclosure can be used for automatically removing (or reducing) a reflected image such as a face or characters included in a photograph or a moving image captured by the sensor 200. As a result, the image processing device 100 can protect the privacy of a person whose face, characters, or the like is reflected. Furthermore, with the image processing device 100 automatically executing the reflected image correction processing, it is possible to remove (or mitigate) a reflected image that the user is not aware of, thereby making it possible to protect privacy more reliably.
Furthermore, for example, in the reflected image correction processing according to the embodiment of the disclosure, the user can designate whether or not to execute the reflected image correction and can designate the correction region CR. In this manner, the image processing device 100 can perform at least a part of the reflected image correction processing in accordance with a user's instruction. As a result, the image processing device 100 can remove (or mitigate) reflection of equipment such as the sensor 200, reflection of the photographer, or the like in accordance with the user's intention. Furthermore, the image processing device 100 can selectively remove (or mitigate) a reflected image that the user desires to remove (or mitigate) among all the reflections.
Furthermore, the reflected image correction processing according to the embodiment of the disclosure can be executed for a purpose other than privacy protection. For example, the image processing device 100 can execute the reflected image correction processing in order to execute subsequent processing with higher accuracy. For example, in object detection, the image processing device 100 removes (or mitigates) an object reflected in glass or the like in advance so as not to erroneously detect the reflected image, thereby reliably excluding the object from the targets of the recognition processing for the object detection. As a result, the image processing device 100 can further improve the object detection accuracy. Note that, in this case, the object detection may be performed by the image processing device 100 or may be performed by a device (not illustrated) as an output stage of the image processing device 100.
A part or the entirety of the reflected image correction processing according to the above-described embodiment can be implemented using, for example, artificial intelligence (AI).
As illustrated in
In the learning stage, the image processing device 100 acquires raw data (step S201) and generates a learning dataset (step S202). The raw data is, for example, input data input to a learned model at the use stage. The image processing device 100 generates, for example, a learning dataset obtained by combining the raw data and correct answer data.
Next, the image processing device 100 executes learning using the learning dataset as input (step S203). For example, the image processing device 100 performs supervised machine learning using the learning dataset. As a result, the image processing device 100 acquires the learned model (step S204). The learned model can be generated by using machine learning such as deep neural network (DNN), convolutional neural network (CNN), recurrent neural network (RNN), generative adversarial network (GAN), or an autoencoder, for example.
At the use stage, the image processing device 100 acquires the input data (step S205). The image processing device 100 executes processing by the learned model by inputting the input data to the learned model (step S206) and acquires output data (step S207).
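For illustration, a minimal supervised-learning sketch of the learning stage (steps S201 to S204) and the use stage (steps S205 to S207), using PyTorch with random tensors standing in for the learning dataset; the network architecture, loss, and data shapes are all assumptions:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Learning stage (S201-S204): pair raw inputs (e.g., specular/diffused
# images) with correct-answer images and fit a small CNN by supervised
# machine learning.
inputs = torch.rand(64, 2, 64, 64)    # channel 0: specular, 1: diffused
targets = torch.rand(64, 1, 64, 64)   # correct answer (corrected image)
loader = DataLoader(TensorDataset(inputs, targets), batch_size=8, shuffle=True)

model = nn.Sequential(
    nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(10):
    for x, y in loader:
        loss = nn.functional.mse_loss(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

# Use stage (S205-S207): input data in, output data out of the learned model.
with torch.no_grad():
    output = model(torch.rand(1, 2, 64, 64))
```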
Incidentally, what type of input data the image processing device 100 uses and what type of output data it acquires differ depending on which part of the reflected image correction processing the image processing device 100 executes by AI. Hereinafter, examples of input data and output data will be described as first to third modifications with reference to
As a first modification, assume that the image processing device 100 implements processing by the correction region determining unit 134 and processing by the correction processing unit 135 illustrated in
In this case, the image processing device 100 learns, for example, a region determination model (an example of the learned model) used by the correction region determining unit 134 to determine the correction region CR. For example, the correction region determining unit 134 determines the correction region CR using the region determination model.
Furthermore, the image processing device 100 learns a correction model (an example of the learned model) used by the correction processing unit 135 for generating a corrected specular image. For example, the correction processing unit 135 generates a corrected specular image using the correction model.
As illustrated in
The specular image as the input data may include data having features that the image R is reflected in the specular reflection region SR and that there is no designation of the image R. In this case, the corresponding output data is, for example, information (hereinafter, also referred to as position information) regarding the position around the image R in the specular reflection region SR.
The specular image as the input data may include data having features that the image R is reflected in the specular reflection region SR and that there is designation of the image R. Incidentally, the designation of the image R means, for example, designation of whether the image R is “characters” or a “person”. Note that illustrated in
The specular image as the input data may include data having a feature that the image R is not reflected in the specular reflection region SR. In this case, since there is no reflection of the image R, the image R is not designated, and the corresponding output data is “none”.
The image processing device 100 learns the region determination model using, for example, the region determination dataset illustrated in
As illustrated in
The image processing device 100 acquires position information of the correction region CR as the output data by inputting the input data to the region determination model. The position information of the correction region CR is, for example, a correction region image having pixel values of “0” or “1”. In this case, the correction region image has the same size as those of the specular image and the diffused image, for example, and is an image in which the correction region CR is indicated by pixel values of “1” (or “0”).
Alternatively, the position information of the correction region CR may be information indicated by any position coordinates (for example, upper left point coordinates) and lengths of the sides of a rectangular region. Furthermore, the position information of the correction region CR may be information indicated by any position coordinates (for example, the center coordinates) and lengths of the axes of an elliptical region.
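For illustration, a minimal sketch converting between the two representations, from a 0/1 correction region image to rectangle position information (the function name is illustrative):

```python
import numpy as np

def mask_to_rectangle(cr_mask):
    """Convert a 0/1 correction-region image into rectangle position info.

    Returns the upper-left coordinates and the side lengths of the
    axis-aligned bounding rectangle of the correction region CR.
    """
    rows, cols = np.nonzero(cr_mask)
    top, left = rows.min(), cols.min()
    return (int(top), int(left)), (int(rows.max() - top + 1),
                                   int(cols.max() - left + 1))
```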
Note that the region determination model may output, as the output data, the type of the detected image R (for example, “characters”, “person”, and the like) in addition to the position information of the correction region CR.
As illustrated in
At this point, the input data is classified into data having the position information of the correction region CR and data having no position information of the correction region CR. Furthermore, the input data having the position information is classified into data having designation of the image R and data having no designation of the image R.
The input data may include data having a feature that there is position information around the image R in the specular reflection region SR but with no designation of the image R. In this case, the corresponding output data is, for example, an image obtained by correcting the image R in the specular reflection region SR.
The input data may include data having a feature that there is position information around the image R in the specular reflection region SR and that there is designation of the image R. Incidentally, the designation of the image R means, for example, designation of whether the image R is “characters” or a “person”. Note that illustrated in
In this case, the corresponding output data is, for example, an image obtained by correcting characters in the specular reflection region SR. At this point, the correction performed on the characters may be, for example, correction effective for characters (for example, correction to such an extent that it is difficult to recognize that the characters are reflected) or correction to such an extent that the characters cannot be recognized by AI or the like. Note that, in this case, even in a case where there is a reflected image other than the characters in the specular reflection region SR, the reflected image is not corrected.
The input data may include data having a feature that there is no position information. In this case, the image R is not designated, and the corresponding output data is a specular image (“not corrected”) that is not corrected.
The image processing device 100 learns the correction model using, for example, the correction dataset illustrated in
As illustrated in
The image processing device 100 inputs the input data to the correction model to acquire the specular image (corrected specular image) after the correction processing as output data.
In this manner, the image processing device 100 can implement the processing by the correction region determining unit 134 and the processing by the correction processing unit 135 illustrated in
As a second modification, the image processing device 100 implements the processing by the reflection region specifying unit 1341B, the processing by the image position specifying unit 1344B, and the processing by the correction processing unit 135B illustrated in
In this case, for example, the image processing device 100 learns a reflection determination model (an example of the learned model) used by the reflection region specifying unit 1341B for determining the specular reflection region SR. For example, the reflection region specifying unit 1341B determines the specular reflection region SR using the reflection determination model.
In addition, the image processing device 100 learns an image specifying model (an example of the learned model) used by the image position specifying unit 1344B to specify an image (object) included in the specular image. For example, the image position specifying unit 1344B specifies an image included in the specular image using the image specifying model.
Furthermore, the image processing device 100 learns a second correction model (an example of the learned model) used by the correction processing unit 135B for generating a corrected specular image. For example, the correction processing unit 135B generates a corrected specular image using the second correction model.
As illustrated in
In a case where the specular reflection region SR is included in the specular image of the input data, the corresponding output data is, for example, the position information of the specular reflection region SR in the specular image. In a case where there is no specular reflection region SR in the specular image of the input data, the corresponding output data is “none”.
The image processing device 100 learns the reflection determination model using, for example, the reflection determination dataset illustrated in
As illustrated in
As illustrated in
The input data may include data having a feature that there is an image in the specular image but with no designation of the image. In this case, the corresponding output data is, for example, position information of all images in the specular image.
The input data may include data having a feature that there is an image in the specular image with designation of the image. Incidentally, the designation of the image means, for example, designation of whether the image is “characters” or a “person”. Note that illustrated in
The input data may include data having a feature that there is no image in the specular image. In this case, designation of an image is not performed, and the corresponding output data is “none”.
The image processing device 100 learns the image specifying model using, for example, the image specifying dataset illustrated in
As illustrated in
The image processing device 100 acquires position information of the image in the specular image as output data by inputting the input data to the image specifying model. The position information of the image can be expressed similarly to the position information of the correction region CR described above.
As illustrated in
At this point, the input data is classified into data having the position information of the image and the specular reflection region SR and data having no such position information. In addition, the input data having the position information of the image and the specular reflection region SR is classified into data with designation of the image R and data with no designation of the image R.
The input data can include data having a feature that the output data of the image position specifying unit 1344B includes position information of an image in the specular image and that the output data of the reflection region specifying unit 1341B includes information regarding the position of the specular reflection region SR. In this case, for example, in a case where there is no designation of the image R to be corrected as the input data, the corresponding output data is, for example, an image obtained by correcting the image R in the specular reflection region SR.
Meanwhile, in this case, for example, in a case where there is designation of the image R to be corrected as the input data (for example, designated as “characters”), the corresponding output data is, for example, an image obtained by correcting the characters in the specular reflection region SR. At this point, the correction performed on the characters may be, for example, correction effective for characters (for example, correction to such an extent that it is difficult to recognize that the characters are reflected) or correction to such an extent that the characters cannot be recognized by AI or the like. Note that, in this case, even in a case where there is a reflected image other than the characters in the specular reflection region SR, the reflected image is not corrected.
The input data may include data having a feature that the output data of the image position specifying unit 1344B has no position information of the image in the specular image. In this case, regardless of the output data of the reflection region specifying unit 1341B and designation of the image R to be corrected, the corresponding output data is the specular image that is uncorrected. That is, in a case where no image is included in the specular image, the reflected image correction is not executed on the specular image regardless of the presence or absence of the specular reflection region SR.
In addition, the input data may include data having a feature that the output data of the reflection region specifying unit 1341B has no position information of the specular reflection region SR. In this case, regardless of the output data of the image position specifying unit 1344B and designation of the image R to be corrected, the corresponding output data is the specular image that is uncorrected. That is, in a case where no specular reflection region SR is included in the specular image, the reflected image correction is not executed on the specular image regardless of the presence or absence of the image in the specular image.
The image processing device 100 learns the second correction model using, for example, the second correction dataset illustrated in
As illustrated in
The image processing device 100 inputs the input data to the second correction model to acquire the specular image (corrected specular image) after the correction processing as output data.
In this manner, the image processing device 100 can implement the processing by the correction region determining unit 134B and the processing by the correction processing unit 135B illustrated in
In the first and second modifications described above, the image processing device 100 executes the processing of the correction region determining units 134 and 134B and the processing of the correction processing units 135 and 135B each using AI; however, it is not limited thereto. For example, the image processing device 100 may implement the processing of the correction region determining unit 134 and the correction processing unit 135 as one piece of processing.
The image correction processing unit 137 generates a corrected specular image using a specular image generated by the specular image generating unit 132 and a diffused image generated by the diffused image generating unit 133 and outputs the corrected specular image to the corrected image generating unit 136.
As a third modification, assume that the image processing device 100 implements the processing by the image correction processing unit 137 illustrated in
In this case, for example, the image processing device 100 learns a reflected image correction model (an example of a learned model) used by the image correction processing unit 137 for generating the corrected specular image. The image correction processing unit 137 generates the corrected specular image using, for example, a reflected image correction model.
As illustrated in
The specular image as the input data may include data having features that the image R is reflected in the specular reflection region SR and that there is no designation of the image R. In this case, the corresponding output data is, for example, an image obtained by correcting the image R in the specular reflection region SR (corrected specular image).
The specular image as the input data may include data having features that the image R is reflected in the specular reflection region SR and that there is designation of the image R. Incidentally, the designation of the image R means, for example, designation of whether the image R is “characters” or a “person”. Note that illustrated in
The specular image as the input data may include data having a feature that the image R is not reflected in the specular reflection region SR. In this case, since there is no reflection of the image R, the image R is not designated, and the corresponding output data is an uncorrected image (specular image).
The image processing device 100 learns the reflected image correction model using, for example, a reflected image correction dataset illustrated in
As illustrated in
The image processing device 100 inputs the input data to the reflected image correction model to acquire the specular image (corrected specular image) after the correction processing as output data.
In this example, the image processing device 100 uses the specular image and the diffused image as input and generates the corrected specular image using AI; however, it is not limited thereto. For example, the image processing device 100 may use the specular image and the diffused image as input and generate a corrected image that is an output image using AI. In this case, the image processing device 100 can omit the corrected image generating unit 136 (see
Alternatively, for example, the image processing device 100 may use the polarized images as input and generate the corrected specular image (or corrected image) using AI. In this case, the image processing device 100 can omit the specular image generating unit 132 and the diffused image generating unit 133 (see
Next, an example of the corrected image output by the image processing system 10 will be described.
Illustrated in
As illustrated in
As illustrated in
The image processing system 10 acquires an image including the subject OB1 in which the light of the spotlight SL is reflected as illustrated in the upper diagram of
For example, the image processing system 10 generates a specular image by using the difference D between the maximum value and the minimum value of the luminance values approximated as the trigonometric function as the luminance value of each pixel. In addition, the image processing system 10 generates a diffused image using, for example, the minimum value of the luminance values approximated as the trigonometric function as the luminance value of each pixel.
Incidentally, as illustrated in the lower diagram of
The image processing system 10 performs reflected image correction using the first region in which the spotlight SL is reflected as the correction region CR. For example, the image processing system 10 applies low-pass filter processing to the correction region CR.
On the other hand, the image processing system 10 sets the second region where the spotlight SL is not reflected as a non-correction region NCR that is not to be corrected. For example, the image processing system 10 does not apply the low-pass filter processing to the non-correction region NCR.
As described above, light of the spotlight SL reflected by the background BG is reflected on the subject OB1 in addition to the background BG. As illustrated in
The subject OB3 illustrated in
In this case, in the correction region CR, the amount of high-frequency components of the stripe pattern is smaller than that in the non-correction region NCR.
As described above, the image processing system 10 outputs the corrected image in which the amount of the specific frequency components in the correction region CR is smaller than that of the specific frequency components in the non-correction region NCR.
In
As illustrated in (a) of
Meanwhile, as illustrated in (b) to (e) of
The correction region CR of the subject OB3 is a region in which the spotlight SL is reflected. In the correction region CR, the amount of high-frequency components of the stripe pattern is smaller than that in the non-correction region NCR, in which the spotlight SL is not reflected. That is, the proportion of high-frequency components among the frequency components in the correction region CR is smaller than the proportion of high-frequency components among the frequency components in the non-correction region NCR.
For example, the proportion of high-frequency components among the frequency components in the non-correction region NCR is substantially the same as that in the subject OB3 of a case where the spotlight SL is not included.
In addition, for example, let us assume that the position of the spotlight SL is moved from
As described above, the image processing system 10 outputs the corrected image in which the amount of the high-frequency components of the stripe pattern is smaller in the correction region CR, in which the spotlight SL is reflected, than in the non-correction region NCR. Note that the high-frequency components can be determined, for example, depending on the cutoff frequency of the low-pass filter processing.
Note that, although the case where the image processing system 10 selects the low-pass filter processing as the reflected image correction processing has been described here, it is not limited thereto. As described above, the image processing system 10 can select pixelation processing or saturation suppression processing as the reflected image correction processing other than the low-pass filter processing.
As illustrated in (a) of
Meanwhile, as illustrated in (b) to (e) of
The correction region CR of the subject OB3 is a region in which the spotlight SL is reflected. The correction region CR is pixelated. On the other hand, the non-correction region NCR is not pixelated and has an image with the same stripe pattern as the case where the reflected image correction is not performed.
In addition, for example, let us assume that the position of the spotlight SL is moved from
As described above, the image processing system 10 outputs a corrected image in which the correction region CR, in which the spotlight SL is reflected, is pixelated and the non-correction region NCR is not pixelated.
Returning to
Meanwhile, as illustrated in (b) to (e) of FIG. 40(C), in the cases where the spotlight SL is disposed in the imaging environment, the image processing system 10 outputs a corrected image including the subject OB3 on which saturation suppression processing has been performed as the reflected image correction.
The correction region CR of the subject OB3 is a region in which the spotlight SL is reflected. In the correction region CR, the saturation of the colored stripe pattern is low. On the other hand, the non-correction region NCR has an image having the same saturation as that of the case where the saturation suppression processing is not applied and the reflected image correction is not performed.
In addition, for example, let us assume that the position of the spotlight SL is moved from
As described above, the image processing system 10 outputs the corrected image in which the saturation of the correction region CR in which the spotlight SL is reflected is lower than the saturation of the non-correction region NCR.
The above embodiments and modifications are examples, and various modifications and applications can be made.
For example, an image processing program for executing the above actions is stored and distributed in a computer-readable recording medium such as an optical disc, a semiconductor memory, a magnetic tape, or a flexible disk. Then, for example, a control device is configured by installing the program in a computer and executing the above-described processing. At this point, the control device may be a device (for example, a personal computer) external to the image processing device 100. Furthermore, the control device may be a device (for example, the control unit 130) inside the image processing device 100.
In addition, the image processing program may be stored in a disk device included in a server device on a network such as the Internet such that the image processing program can be downloaded to a computer. In addition, the above functions may be implemented by collaborative operation between an operating system (OS) and application software. In this case, a portion other than the OS may be stored in a medium and distributed, or a portion other than the OS may be stored in a server device to allow a computer to download it, for example.
Among the pieces of processing described in the above embodiments, all or a part of the processing described as being performed automatically can also be performed manually, and all or a part of the processing described as being performed manually can also be performed automatically by a known method. In addition, the processing procedures, specific names, and information including various types of data and parameters illustrated above or in the drawings can be modified as desired unless otherwise specified. For example, the various types of information illustrated in the drawings are not limited to the illustrated information.
In addition, each component of each device illustrated in the drawings is conceptual in terms of function and is not necessarily physically configured as illustrated in the drawings. That is, the specific form of distribution and integration of devices is not limited to those illustrated in the drawings, and all or a part thereof can be functionally or physically distributed or integrated in any unit depending on various loads, use status, and the like. Note that this configuration by distribution or integration may be performed dynamically.
In addition, the above embodiments can be combined as appropriate as long as the processing content does not contradict each other. In addition, the order of the steps illustrated in the flowcharts or the like of the above embodiments can be modified as appropriate.
Furthermore, for example, the present embodiment can be implemented as any configuration constituting a device or a system, for example, a processor as a system large scale integration (LSI) or the like, a module using a plurality of processors or the like, a unit using a plurality of modules or the like, or a set obtained by further adding other functions to a unit (namely, a configuration of a part of a device).
Note that, in the present embodiment, a system refers to a set of a plurality of components (devices, modules (parts), etc.), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and coupled via a network, and one device in which a plurality of modules is housed in one housing are both systems.
Furthermore, for example, the present embodiment can adopt a configuration of cloud computing in which one function is shared and processed by a plurality of devices in cooperation via a network.
The information devices such as the image processing device 100 according to the embodiments or the modifications described above are implemented by, for example, a computer 1000 having a configuration as illustrated in
The CPU 1100 operates in accordance with a program stored in the ROM 1300 or the HDD 1400 and controls each of the components. For example, the CPU 1100 loads a program stored in the ROM 1300 or the HDD 1400 into the RAM 1200 and executes processing corresponding to the various programs.
The ROM 1300 stores a boot program such as a basic input output system (BIOS) executed by the CPU 1100 when the computer 1000 is activated, a program dependent on the hardware of the computer 1000, and the like.
The HDD 1400 is a computer-readable recording medium that non-transiently records a program to be executed by the CPU 1100, data used by such a program, and the like. Specifically, the HDD 1400 is a recording medium that records an image processing program according to the present disclosure, which is an example of program data 1450.
The communication interface 1500 is an interface for the computer 1000 to be connected with an external network 1550 (for example, the Internet). For example, the CPU 1100 receives data from another device or transmits data generated by the CPU 1100 to another device via the communication interface 1500.
The input and output interface 1600 is an interface for connecting an input and output device 1650 and the computer 1000. For example, the CPU 1100 receives data from an input device such as a keyboard or a mouse via the input and output interface 1600. The CPU 1100 also transmits data to an output device such as a display, a speaker, or a printer via the input and output interface 1600. Furthermore, the input and output interface 1600 may function as a media interface that reads a program or the like recorded in a predetermined recording medium. A medium refers to, for example, an optical recording medium such as a digital versatile disc (DVD) or a phase change rewritable disk (PD), a magneto-optical recording medium such as a magneto-optical disk (MO), a tape medium, a magnetic recording medium, or a semiconductor memory.
For example, in a case where the computer 1000 functions as the image processing device 100 according to the embodiment, the CPU 1100 of the computer 1000 implements the functions of the control unit 130 and other units by executing the image processing program loaded on the RAM 1200. The HDD 1400 also stores the image processing program according to the present disclosure and the data in the storage unit. Note that although the CPU 1100 reads the program data 1450 from the HDD 1400 and executes it, as another example, these programs may be acquired from another device via the external network 1550.
Furthermore, the image processing device 100 according to the present embodiment may be applied to a system including a plurality of devices based on a premise of connection to a network (or communication between devices), such as cloud computing. That is, the image processing device 100 according to the present embodiment described above can be implemented, for example, as the image processing system 10 according to the present embodiment by a plurality of devices.
An example of the hardware configuration of the image processing device 100 has been described above. Each of the above components may be configured using a general-purpose member or may be configured by hardware specialized in the function of the component. Such a configuration can be modified as appropriate depending on the technical level at the time of implementation.
Although the embodiments of the disclosure have been described above, the technical scope of the disclosure is not limited to the above embodiments as they are, and various modifications can be made without departing from the gist of the disclosure. In addition, components of different embodiments and modifications may be combined as appropriate.
Furthermore, the effects of the embodiments described herein are merely examples and are not limiting, and other effects may be achieved.
Note that the present technology can also have the following configurations.
(1)
An image processing device comprising:
(2)
The image processing device according to (1), wherein the specific region includes a region in which an image is reflected in the specular reflection region.
(3)
The image processing device according to (1) or (2), wherein the specific region includes a region designated by a user.
(4)
The image processing device according to any one of (1) to (3), wherein the control unit generates at least one of a specular image having a specular component as a pixel value or a diffused image having a diffusion component as a pixel value on a basis of the plurality of polarized images.
(5)
The image processing device according to (4), wherein the specific region is specified depending on the specular image.
(6)
The image processing device according to any one of (1) to (5), wherein the control unit outputs the corrected image in which a ratio of specific frequency components in frequency components of the specific region is smaller than the ratio of the specific frequency components in the frequency components of a region other than the specific region.
(7)
The image processing device according to any one of (1) to (6), wherein the control unit performs low-pass filter processing on the specific region.
(8)
The image processing device according to any one of (1) to (7), wherein the control unit outputs the corrected image obtained by applying pixelation processing to the specific region.
(9)
The image processing device according to any one of (1) to (8), wherein the control unit outputs the corrected image in which saturation of the specific region is lower than the saturation of a region other than the specific region.
(10)
The image processing device according to any one of (1) to (9), wherein the control unit decreases saturation of the specific region.
(11)
The image processing device according to any one of (1) to (10), wherein the control unit outputs the corrected image obtained by correcting the specific region with an intensity corresponding to a reflected image included in the specific region.
(12)
The image processing device according to (11), wherein the control unit outputs the corrected image obtained by correcting the specific region with the intensity at which the reflected image included in the specific region enters a predetermined state.
(13)
The image processing device according to any one of (1) to (12), wherein the control unit outputs the corrected image in which low-pass filter processing of a filter size corresponding to definition of the specific region is applied to the specific region.
(14)
The image processing device according to any one of (1) to (13), wherein the control unit outputs the corrected image in which pixelation is disposed in an area corresponding to definition of the specific region.
(15)
The image processing device according to any one of (1) to (14), wherein the control unit outputs the corrected image in which saturation of the specific region is reduced depending on dispersion of the specific region.
(16)
An image processing device comprising:
(17)
An image processing system including:
(18)
The image processing system according to (17), in which the specific region includes a region in which an image is reflected in the specular reflection region.
(19)
The image processing system according to (17) or (18), in which the specific region includes a region designated by a user.
(20)
The image processing system according to any one of (17) to (19), in which the control unit generates at least one of a specular image having a specular component as a pixel value or a diffused image having a diffusion component as a pixel value on the basis of the plurality of polarized images.
(21)
The image processing system according to (20), in which the specific region is specified depending on the specular image.
(22)
The image processing system according to any one of (17) to (21), in which the control unit outputs the corrected image in which a ratio of specific frequency components in frequency components of the specific region is smaller than the ratio of the specific frequency components in the frequency components of a region other than the specific region.
(23)
The image processing system according to any one of (17) to (22), in which the control unit performs low-pass filter processing on the specific region.
(24)
The image processing system according to any one of (17) to (23), in which the control unit outputs the corrected image obtained by applying pixelation processing to the specific region.
(25)
The image processing system according to any one of (17) to (24), in which the control unit outputs the corrected image in which saturation of the specific region is lower than the saturation of a region other than the specific region.
(26)
The image processing system according to any one of (17) to (25), in which the control unit decreases saturation of the specific region.
(27)
The image processing system according to any one of (17) to (26), in which the control unit outputs the corrected image obtained by correcting the specific region with an intensity corresponding to a reflected image included in the specific region.
(28)
The image processing system according to (27), in which the control unit outputs the corrected image obtained by correcting the specific region with the intensity at which the reflected image included in the specific region enters a predetermined state.
(29)
The image processing system according to any one of (17) to (28), in which the control unit outputs the corrected image in which low-pass filter processing of a filter size corresponding to definition of the specific region is applied to the specific region.
(30)
The image processing system according to any one of (17) to (29), in which the control unit outputs the corrected image in which pixelation is disposed in an area corresponding to definition of the specific region.
(31)
The image processing system according to any one of (17) to (30), in which the control unit outputs the corrected image in which saturation of the specific region is reduced depending on dispersion of the specific region.
(32)
An image processing system including:
(33)
An image processing method including the steps of:
(34)
The image processing method according to (33), in which the specific region includes a region in which an image is reflected in the specular reflection region.
(35)
The image processing method according to (33) or (34), in which the specific region includes a region designated by a user.
(36)
The image processing method according to any one of (33) to (35), the method further including generating at least one of a specular image having a specular component as a pixel value or a diffused image having a diffusion component as a pixel value on the basis of the plurality of polarized images.
(37)
The image processing method according to (36), in which the specific region is specified depending on the specular image.
(38)
The image processing method according to any one of (33) to (37), the method further including outputting the corrected image in which a ratio of specific frequency components in frequency components of the specific region is smaller than the ratio of the specific frequency components in the frequency components of a region other than the specific region.
(39)
The image processing method according to any one of (33) to (38), the method further including performing low-pass filter processing on the specific region.
(40)
The image processing method according to any one of (33) to (39), the method further including outputting the corrected image obtained by applying pixelation processing to the specific region.
(41)
The image processing method according to any one of (33) to (40), the method further including outputting the corrected image in which saturation of the specific region is lower than the saturation of a region other than the specific region.
(42)
The image processing method according to any one of (33) to (41), the method further including reducing saturation of the specific region.
(43)
The image processing method according to any one of (33) to (42), the method further including outputting the corrected image in which the specific region is corrected with an intensity corresponding to a reflected image included in the specific region.
(44)
The image processing method according to (43), the method further including outputting the corrected image obtained by correcting the specific region with the intensity at which the reflected image included in the specific region enters a predetermined state.
(45)
The image processing method according to any one of (33) to (44), the method further including outputting the corrected image in which low-pass filter processing of a filter size corresponding to definition of the specific region is applied to the specific region.
(46)
The image processing method according to any one of (33) to (45), the method further including outputting the corrected image in which pixelation is disposed in an area corresponding to definition of the specific region.
(47)
The image processing method according to any one of (33) to (46), the method further including outputting the corrected image in which saturation of the specific region is reduced depending on dispersion of the specific region.
(48)
An image processing method including the steps of:
(49)
An image processing program for causing a computer to function as a control unit that:
(50)
The image processing program according to (49), in which the specific region includes a region in which an image is reflected in the specular reflection region.
(51)
The image processing program according to (49) or (50), in which the specific region includes a region designated by a user.
(52)
The image processing program according to any one of (49) to (51), in which the control unit generates at least one of a specular image having a specular component as a pixel value or a diffused image having a diffusion component as a pixel value on the basis of the plurality of polarized images.
(53)
The image processing program according to (52), in which the specific region is specified depending on the specular image.
(54)
The image processing program according to any one of (49) to (53), in which the control unit outputs the corrected image in which a ratio of specific frequency components in frequency components of the specific region is smaller than the ratio of the specific frequency components in the frequency components of a region other than the specific region.
(55)
The image processing program according to any one of (49) to (54), in which the control unit performs low-pass filter processing on the specific region.
(56)
The image processing program according to any one of (49) to (55), in which the control unit outputs the corrected image obtained by applying pixelation processing to the specific region.
(57)
The image processing program according to any one of (49) to (56), in which the control unit outputs the corrected image in which saturation of the specific region is lower than the saturation of a region other than the specific region.
(58)
The image processing program according to any one of (49) to (57), in which the control unit decreases saturation of the specific region.
(59)
The image processing program according to any one of (49) to (58), in which the control unit outputs the corrected image obtained by correcting the specific region with an intensity corresponding to a reflected image included in the specific region.
(60)
The image processing program according to (59), in which the control unit outputs the corrected image obtained by correcting the specific region with the intensity at which the reflected image included in the specific region enters a predetermined state.
(61)
The image processing program according to any one of (49) to (60), in which the control unit outputs the corrected image in which low-pass filter processing of a filter size corresponding to definition of the specific region is applied to the specific region.
(62)
The image processing program according to any one of (49) to (61), in which the control unit outputs the corrected image in which pixelation is disposed in an area corresponding to definition of the specific region.
(63)
The image processing program according to any one of (49) to (62), in which the control unit outputs the corrected image in which saturation of the specific region is reduced depending on dispersion of the specific region.
(64)
An image processing program for causing a computer to function as a control unit that:
Number | Date | Country | Kind
---|---|---|---
2022-015680 | Feb 2022 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2023/001447 | 1/19/2023 | WO |