The present application is based on PCT filing PCT/JP2019/019497, filed May 16, 2019, the entire contents of which are incorporated herein by reference.
The present invention relates to an image processing device, a method, an image reading device, a program, and a recording medium.
In particular, the present invention relates to image processing for improving the S/N ratio of a multispectral image in an image reading device that reads an original in different wavelength bands and generates the multispectral image from multiple band images.
In an image reading device that obtains multiple band images by reading an original in different wavelength bands, depending on the combination of the spectrum of the light source and the spectral sensitivity of the image sensor, some band images correspond to wavelength bands in which a sufficient signal amount is not obtained. For example, band images obtained by illumination using ultraviolet light sources or near-infrared light sources often have lower S/N ratios than band images of visible light bands.
As another example of generating a multispectral image, in imaging the ground or the like from an optical satellite or aircraft, it is widely practiced to simultaneously obtain both a high-resolution panchromatic image and a low-resolution multispectral image for the same object.
In a system that simultaneously obtains both a low-resolution multispectral image and a high-resolution panchromatic image, a pan-sharpening process that combines the two images to generate a high-resolution color image is performed.
In the pan-sharpening process, removal of noise included in the multispectral image is a problem.
To remove noise of band images with low resolution and a low S/N ratio by using a panchromatic image with high resolution and a high S/N ratio, the noise removal device described in Patent Literature 1 performs multiresolution decomposition on the panchromatic image and on the band images, corrects the decomposed components of the panchromatic image by using the decomposed components of the band images, and reconstructs a panchromatic image by using the corrected decomposed components and another decomposed component.
As described above, in Patent Literature 1, noise correction is performed by a combination of a high-resolution panchromatic image and a multispectral image. In some cases, only a low-resolution multispectral image is obtained, and no high-resolution image is obtained. In a case where smoothing for noise reduction is performed with only a multispectral image, when the multispectral image includes a large amount of noise, the smoothing may destroy edges.
The present invention solves the above problem, and is intended to provide an image processing device capable of improving the S/N ratio of a multispectral image without destroying edges, even when the obtained multispectral image includes a band image of a wavelength band in which a sufficient signal amount is not obtained.
An image processing device of the present invention includes: a parameter calculator to analyze a reference image of the same field of view as a plurality of band images constituting a multispectral image to calculate a local variance or an edge amount as an image feature amount, and calculate a filter parameter from the image feature amount; and a filter processor to perform an edge preserving smoothing process on the band images by using the filter parameter.
The present invention makes it possible to improve the S/N ratio of a multispectral image without destroying edges, even when the obtained multispectral image includes a band image of a wavelength band in which a sufficient signal amount is not obtained.
The image reading device illustrated in
The light source 1 is constituted by multiple light sources each having a relatively narrow band. Hereinafter, the light sources having the relatively narrow bands will be referred to as band light sources. The multiple band light sources include, for example, a light source having a visible light wavelength band, a light source having an ultraviolet wavelength band, and a light source having a near-infrared wavelength band. The multiple band light sources are controlled to sequentially illuminate an original one by one.
The image sensor 2 sequentially obtains multiple band images DIN by imaging the original when the original as an object is sequentially illuminated by the above multiple band light sources. In general, the multiple band images DIN have different S/N ratios.
The image sensor 2 may be a one-dimensional image sensor (line sensor) or a two-dimensional image sensor (area sensor). The line sensor may be a contact image sensor.
The image processing device 3 combines the multiple band images DIN to generate a reference image having a higher S/N ratio than each band image, analyzes the reference image to calculate a local variance or an edge amount as an image feature amount, calculates a filter parameter from the image feature amount, and uses the filter parameter to smooth the multiple band images DIN while preserving edges.
The image processing device 3 includes an image combiner 31, a parameter calculator 32, and a filter processor 33.
The image combiner 31 combines the multiple band images DIN to generate a reference image MSr having a higher S/N ratio than each band image.
Use of a band image having a low S/N ratio in the combination may increase the noise amount of the reference image MSr and reduce the S/N ratio. To avoid reduction in the S/N ratio of the reference image MSr, in this embodiment, the combination is performed such that a combination weight of a band image whose noise amount is greater is smaller.
The image combiner 31 includes a noise amount calculator 311 and a weighting combiner 312, for example, as illustrated in
The noise amount calculator 311 calculates a noise amount Ni of each of the multiple band images DIN. The calculated noise amounts Ni are each represented by a scalar, for example.
When the image processing device forms part of the image reading device illustrated in
When information on the quantum efficiency of the image sensor 2 or the amount of illumination by the light source cannot be used, the noise amount Ni of each band image can be calculated by analyzing each band image DIN.
Examples of the method of calculating the noise amount Ni by analyzing the image include a method of calculating a local variance of each pixel in a flat region of the image and determining an average thereof.
In this case, it is possible to calculate a local variance of each pixel and take, as a flat region, a region formed by pixels whose local variances are not greater than a threshold or a region in which the proportion of the pixels whose local variances are not greater than a threshold is not less than a predetermined value.
The local variances are determined by, for example, the same calculation as Equation (3) to be described later.
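The noise-amount estimation described above can be sketched as follows. The window radius and the flatness threshold are illustrative assumptions, not values given in the text, and the local-variance computation is one plausible form of the calculation referenced above:

```python
import numpy as np

def local_variance(img, radius=2):
    """Local variance of each pixel over a (2*radius+1)^2 window."""
    k = 2 * radius + 1
    pad = np.pad(img.astype(np.float64), radius, mode="reflect")
    h, w = img.shape
    var = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            # Variance of the local region centered at (y, x)
            var[y, x] = pad[y:y + k, x:x + k].var()
    return var

def noise_amount(band, flat_threshold=25.0, radius=2):
    """Estimate the noise amount N_i of a band image as the average local
    variance over the flat region: the pixels whose local variance does
    not exceed the (assumed) threshold, as described above."""
    var = local_variance(band, radius)
    flat = var <= flat_threshold
    if not flat.any():  # no flat region found; fall back to all pixels
        flat = np.ones_like(flat)
    return float(var[flat].mean())
```

For a uniform original with additive noise, the estimate approaches the noise variance; textured regions are excluded by the flatness test so that image detail is not counted as noise.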
The weighting combiner 312 generates the reference image MSr by combining the multiple band images DIN such that the combination weight of a band image whose noise amount Ni is smaller is greater. The combination is performed by weighting and adding pixel values at the same positions of the multiple band images. Specifically, the weighting combiner 312 determines weighted averages of pixel values at the same positions of the multiple band images DIN as pixel values at the same positions of the reference image MSr.
The weighted average for a certain pixel (pixel of interest) is determined by, for example, the calculation represented by the following Equation (1):
In Equation (1),
As the combination weight Wi for the pixel value MSi(x) of each band image MSi, a normalized value of a reciprocal of the noise amount Ni of the band image MSi is used. The normalized value of the reciprocal of the noise amount of each band image is a value obtained by dividing the reciprocal of the noise amount of the band image by an average of the reciprocals of the noise amounts of all the band images. Thus, the combination weight Wi is represented by the following Equation (2):
Instead of the above method, the weighting combiner 312 may set each weight Wi to 0 or 1 by using a preset noise amount threshold and determine the weighted averages. For example, it may set the weight Wi to 0 when the noise amount is not less than the noise amount threshold and to 1 otherwise, and determine the weighted averages by using the number of band images for which the weight Wi has been set to 1, instead of M.
The reference image MSr obtained by the above combination has a higher S/N ratio than each band image.
The parameter calculator 32 analyzes the reference image MSr to calculate, as an image feature amount, a local variance or an edge amount for each pixel of the reference image MSr, and calculates filter parameters D32 from the image feature amounts.
The local variance var(x) for a position x in the reference image MSr can be determined by, for example, performing the calculation represented by the following Equation (3):
In Equation (3),
The edge amount can be determined by, for example, a method using a bilateral weight. For example, the edge amount bw(x) for a position x in the reference image MSr can be determined by performing the calculation represented by the following Equation (4):
In Equation (4),
σ1 and σ2 denote constants. The constants σ1 and σ2 are arbitrarily determined.
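The bilateral-weight edge amount of Equation (4) can be sketched as follows. Since Equation (4) itself is not reproduced in the text, a common bilateral form is assumed: a spatial Gaussian on the distance (constant σ1) multiplied by a range Gaussian on the pixel-value difference (constant σ2), summed over the local region Ω(x). Under this form the sum is large in flat regions and small near edges:

```python
import numpy as np

def edge_amount_bilateral(img, radius=2, sigma1=2.0, sigma2=10.0):
    """Edge amount bw(x): sum of bilateral weights over the local region
    centered at each pixel (assumed form of Equation (4))."""
    img = img.astype(np.float64)
    h, w = img.shape
    k = 2 * radius + 1
    pad = np.pad(img, radius, mode="reflect")
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    # Spatial weight: depends only on the distance from the center pixel
    ws = np.exp(-(ys**2 + xs**2) / (2.0 * sigma1**2))
    bw = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            win = pad[y:y + k, x:x + k]
            # Range weight: depends on the pixel-value difference
            wc = np.exp(-((win - img[y, x]) ** 2) / (2.0 * sigma2**2))
            bw[y, x] = (ws * wc).sum()
    return bw
```

The local variance of Equation (3) can be computed with the same windowing, taking the variance of the window instead of the weight sum.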
The parameter calculator 32 calculates and outputs the filter parameters D32 from the calculated image feature amounts.
For example, it may simply output the image feature amounts as the filter parameters, or may output, as the filter parameters, reciprocals of the image feature amounts or normalized values of them.
The filter processor 33 uses the filter parameters D32 to smooth the multiple band images DIN while preserving edges, and generates multiple output images DOUT.
The multiple output images DOUT correspond to the respective multiple band images DIN. A set of the multiple output images DOUT forms a multispectral image.
The filter process can be performed by using a reference smoothing filter to which the filter parameters can be input. For example, a joint bilateral filter, a guided filter, or the like can be used.
The filtering process using a joint bilateral filter is represented by, for example, the following Equation (5):
In Equation (5),
The range of the above local region Ω(x) need not be the same as the range of the local region Ω(x) in the above Equation (3) or (4).
In Equation (5), Ws is a distance weight, and is determined by, for example, the following Equation (6):
In Equation (6),
In Equation (5), Wc is a pixel value weight, and is determined by, for example, the following Equation (7):
In Equation (7), σc is a parameter that determines the pixel value weight Wc, and denotes a variance.
In this embodiment, var(x) given by the above Equation (3), bw(x) given by Equation (4), or a value obtained by normalizing one of them is used as the parameter σc.
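The joint bilateral filtering of Equations (5) to (7) can be sketched as follows. The weights are computed from the reference image rather than from the band image itself, and the parameter σc of Equation (7) is supplied per pixel (the local variance or edge amount of the reference image, or a normalized value thereof). Since the equations are not reproduced in the text, Gaussian forms are assumed, with σc treated as a variance as stated above:

```python
import numpy as np

def joint_bilateral(band, ref, sigma_c, radius=2, sigma_s=2.0):
    """Edge-preserving smoothing of a band image guided by the reference
    image (assumed form of Equation (5)).  Ws is the distance weight of
    Equation (6); Wc is the pixel value weight of Equation (7) with a
    per-pixel variance parameter sigma_c."""
    band = band.astype(np.float64)
    ref = ref.astype(np.float64)
    sigma_c = np.maximum(np.asarray(sigma_c, dtype=np.float64), 1e-6)
    h, w = band.shape
    k = 2 * radius + 1
    pb = np.pad(band, radius, mode="reflect")
    pr = np.pad(ref, radius, mode="reflect")
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    ws = np.exp(-(ys**2 + xs**2) / (2.0 * sigma_s**2))  # Equation (6)
    out = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            rwin = pr[y:y + k, x:x + k]
            # Equation (7): pixel value weight from the reference image
            wc = np.exp(-((rwin - ref[y, x]) ** 2) / (2.0 * sigma_c[y, x]))
            wgt = ws * wc
            out[y, x] = (wgt * pb[y:y + k, x:x + k]).sum() / wgt.sum()
    return out
```

Because the weights come from the higher-S/N reference image, edges present in the reference are preserved in the smoothed band image even when the band image itself is too noisy to detect them reliably.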
In the image processing device 3 of the first embodiment, even when the obtained multispectral image includes a band image of a wavelength band in which a sufficient signal amount is not obtained, it is possible to improve the S/N ratio of the multispectral image without destroying edges.
Also, since the image combiner 31 includes the noise amount calculator 311 as illustrated in
In the above example, the multiple band light sources include the light source having the visible light wavelength band, the light source having the ultraviolet wavelength band, and the light source having the near-infrared wavelength band. The set of the multiple band light sources is not limited to the above example. For example, the visible light wavelength band may be divided. For example, a light source having a red wavelength band, a light source having a green wavelength band, and a light source having a blue wavelength band may be provided instead of the light source having the visible light wavelength band.
In the above example, the variances or edge amounts are determined as the image feature amounts by the parameter calculator 32, and the values or normalized values thereof are used as the parameters σc for determining the combination weights. The image feature amounts determined by the parameter calculator 32 may be values other than the variances or edge amounts. In any case, the image feature amounts, reciprocals thereof, or normalized values thereof may be used as parameters for determining the combination weights.
The illustrated image processing device 3b is generally the same as the image processing device 3 of
The image combiner 31 of the first embodiment performs the image combination such that the combination weight of a band image whose noise amount is greater is smaller. The image combiner 31b of this embodiment performs the image combination by using combination weights determined on the basis of image correlations between the band images.
The image characteristics of each band image depend on its wavelength band, and the farther the wavelength bands of two band images are from each other, the greater the difference in image characteristics between the two band images.
In this embodiment, when it is intended to reduce the noise of a band image of a certain wavelength band, the reference image is generated by performing the image combination such that a band image having a greater image correlation with the band image of the target wavelength band is given a greater combination weight.
The image combiner 31b combines the multiple band images DIN to generate a reference image MSr having a higher S/N ratio than each band image.
The image combiner 31b includes a correlation calculator 313 and a weighting combiner 312b, for example, as illustrated in
The correlation calculator 313 calculates, for each of the multiple band images DIN, an image correlation Cori with a band image of a target wavelength band.
The image correlation Cori between each band image and the band image of the target wavelength band can be determined on the basis of, for example, the difference in wavelength between the two band images. In this case, it can be considered that the smaller the difference, the higher the image correlation.
In this case, the above image correlation can be represented by a function of wavelength.
λt on the horizontal axis denotes a center wavelength or peak wavelength (target wavelength) of the target wavelength band.
In the example illustrated in
In the example illustrated in
In the example illustrated in
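A wavelength-based correlation of this kind can be sketched as a function that decreases with the distance from the target wavelength λt. The concrete shapes are shown in the referenced figures and are not reproduced here; a Gaussian fall-off with an assumed width σ (in nm) is used purely for illustration:

```python
import numpy as np

def wavelength_correlation(center_wavelengths, target_wavelength, sigma=100.0):
    """Hypothetical correlation function of wavelength: the smaller the
    difference between a band's center wavelength and the target
    wavelength, the higher the correlation Cor_i.  The Gaussian shape
    and the width sigma are illustrative assumptions."""
    lam = np.asarray(center_wavelengths, dtype=np.float64)
    return np.exp(-((lam - target_wavelength) ** 2) / (2.0 * sigma**2))
```

Any monotonically decreasing function of the wavelength difference would serve the same purpose.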
In the above example, the correlation calculator 313 determines, for each band image, the image correlation on the basis of the difference in wavelength from the band image of the target wavelength band.
Alternatively, it is possible to determine, for each band image, the image correlation on the basis of the similarity with the band image of the target wavelength band.
As an indicator of the similarity, a sum of absolute differences (SAD), a sum of squared differences (SSD), a normalized cross correlation (NCC), a zero-means normalized cross correlation (ZNCC), or the like can be used, for example.
SAD for each band image MSi (denoted by the symbol SAi) is determined by the following Equation (8):
SSD for each band image MSi (denoted by the symbol SSi) is determined by the following Equation (9):
NCC for each band image MSi (denoted by the symbol NCi) is determined by the following Equation (10):
ZNCC for each band image MSi (denoted by the symbol ZNCi) is determined by the following Equation (11):
In Equations (8) to (11),
The higher the similarity, the smaller the values of SAD and SSD. The higher the similarity, the greater the values of NCC and ZNCC.
SAD and SSD are appropriate when it is intended to make a comparison in terms of image characteristics including intensity (brightness, luminance), and NCC and ZNCC, each of which provides a value independent of the image intensities (the magnitudes of the pixel values) due to normalization, are appropriate when it is intended to extract spectral reflection characteristics of the object in a specific wavelength band.
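The four similarity indicators can be sketched as follows. The forms of Equations (8) to (11), which are not reproduced in the text, are assumed to be the standard definitions; `a` is a band image and `b` is the band image of the target wavelength band:

```python
import numpy as np

def sad(a, b):
    """Equation (8), assumed form: sum of absolute differences."""
    return float(np.abs(a - b).sum())

def ssd(a, b):
    """Equation (9), assumed form: sum of squared differences."""
    return float(((a - b) ** 2).sum())

def ncc(a, b):
    """Equation (10), assumed form: normalized cross correlation."""
    return float((a * b).sum() / np.sqrt((a**2).sum() * (b**2).sum()))

def zncc(a, b):
    """Equation (11), assumed form: zero-means normalized cross
    correlation; invariant to gain and offset in the pixel values."""
    a0, b0 = a - a.mean(), b - b.mean()
    return float((a0 * b0).sum() / np.sqrt((a0**2).sum() * (b0**2).sum()))
```

As stated above, higher similarity gives smaller SAD and SSD values but greater NCC and ZNCC values, so SAD and SSD must be inverted (or negated) before being used as correlations.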
The weighting combiner 312b generates the reference image MSr by combining the multiple band images DIN such that the combination weight of a band image whose image correlation Cori is greater is greater. The generation is performed by weighting and adding pixel values at the same positions of the multiple band images. Specifically, the weighting combiner 312b determines weighted averages of pixel values at the same positions of the multiple band images DIN as pixel values at the same positions of the reference image MSr.
The weighted average for a certain pixel (pixel of interest) is determined by, for example, the calculation represented by the above Equation (1).
However, as the combination weight Wi for the pixel value MSi(x) of each band image MSi, a normalized value of the image correlation Cori of the band image MSi is used. The normalized value of the image correlation of each band image is a value obtained by dividing the image correlation of the band image by an average of the image correlations of all the band images. Thus, the combination weight Wi is represented by the following Equation (12):
Although in the above example, the similarity of an image is determined for the entire image, it is possible to determine the similarity for each pixel and determine the image correlation for each pixel.
In this case, in Equations (8) to (11), the similarity is determined by using pixels in a local region centered at the pixel of interest, and the image correlation for the pixel of interest is determined from the determined similarity.
Also in the image processing device 3b of the second embodiment, even when the obtained multispectral image includes a band image of a wavelength band in which a sufficient signal amount is not obtained, it is possible to improve the S/N ratio of the multispectral image without destroying edges.
Also, since the image combiner 31b includes the correlation calculator 313 as illustrated in
For the first embodiment, as another example of the weighting combination, there has been described a method of setting each weight Wi to 0 or 1 by using a preset noise amount threshold and determining the weighted averages. The same modification can be made to the second embodiment.
Specifically, it is possible to set each weight Wi to 0 or 1 by using a preset correlation threshold and determine the weighted averages. For example, it is possible to set the weight Wi to 1 when the image correlation is not less than the image correlation threshold and to 0 otherwise, and determine the weighted averages by using the number of band images for which the weight Wi has been set to 1, instead of M.
The illustrated image processing device 3c is generally the same as the image processing device 3 of
Also, in addition to the band images DIN of
The band images DIN are the same as the band images of
The white image DIW is an image having a band including all the bands of the multiple band images DIN, and is used as a reference image. The band of the white image DIW is preferably wider than a band obtained by summing all the bands of the multiple band images DIN.
The parameter calculator 32c analyzes the white image DIW, and for each pixel of the white image DIW, calculates a local variance or an edge amount as an image feature amount and calculates a filter parameter D32c from the image feature amount.
The content of the process in the parameter calculator 32c is the same as that of the process in the parameter calculator 32 of
The filter processor 33 uses the filter parameters D32c to smooth the multiple band images DIN while preserving edges, and generates multiple output images DOUT.
The content of the process in the filter processor 33 is the same as that of the process in the filter processor 33 of
In the image processing device 3c of the third embodiment, it is possible to obtain the white image having the band including all the bands of the multiple band images and use it as the reference image. Thus, even in a case in which the number of wavelength bands of the used light source is small, and even when the obtained multispectral image includes a band image of a wavelength band in which a sufficient signal amount is not obtained, it is possible to improve the S/N ratio of the multispectral image without destroying edges.
The illustrated image processing device 3d is generally the same as the image processing device 3c of
The bandpass filter group 34 receives the white image as an input and includes multiple bandpass filters that pass different wavelength bands; the bandpass filter group 34 outputs multiple band images D34 of the different wavelength bands.
In the above first to third embodiments, the multiple band images DIN are sequentially obtained by performing imaging when the original is sequentially illuminated by the multiple band light sources having different wavelength bands. On the other hand, in the fourth embodiment, the multiple band images DIN are generated by applying, to the wide-band white image, the multiple bandpass filters that pass the different wavelength bands.
The filter processor 33 uses the filter parameters D32c to smooth the multiple band images D34 while preserving edges, and generates multiple output images DOUT.
The fourth embodiment provides the same advantages as the first to third embodiments.
In the above first to fourth embodiments, the image processing device forms part of the image reading device. However, image processing devices of the present invention can be used for purposes other than image reading. For example, they can also be used in imaging the ground or the like from an optical satellite or aircraft.
Each of the image processing devices 3, 3b, 3c, and 3d described in the first to fourth embodiments may be partially or wholly formed by processing circuitry.
For example, the functions of the respective portions of the image processing device may be implemented by respective separate processing circuits, or the functions of the portions may be implemented by a single processing circuit.
The processing circuitry may be implemented by hardware, or by software or a programmed computer.
It is possible that a part of the functions of the respective portions of the image processing device is implemented by hardware and another part is implemented by software.
In the illustrated example, the computer 9 includes a processor 91 and a memory 92.
A program for implementing the functions of the respective portions of the image processing device is stored in the memory 92.
The processor 91 is, for example, a central processing unit (CPU), a graphics processing unit (GPU), a microprocessor, a microcontroller, a digital signal processor (DSP), or the like.
The memory 92 is, for example, a semiconductor memory, such as a random access memory (RAM), a read only memory (ROM), a flash memory, an erasable programmable read only memory (EPROM), or an electrically erasable programmable read only memory (EEPROM); a magnetic disk; an optical disk; a magneto-optical disk; or the like.
The processor 91 implements the function of the image processing device by executing the program stored in the memory 92.
A procedure when a process in the image processing device of the first or second embodiment is performed by the computer illustrated in
In step ST11, the multiple band images DIN are received.
In step ST12, the reference image MSr is generated by combining the multiple band images DIN. This process is performed as described for the image combiner 31 or 31b.
In step ST13, the local variances or edge amounts are calculated as the image feature amounts by analyzing the reference image MSr, and the filter parameters D32 are calculated from the image feature amounts. This process is performed as described for the parameter calculator 32.
In step ST14, the multiple band images DIN are smoothed while preserving edges by using the filter parameters D32. This process is performed as described for the filter processor 33.
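Steps ST11 to ST14 for the first embodiment can be sketched end to end as follows, under the same assumptions as the sketches above: noise amounts from local variances, the weighted combination of Equations (1) and (2), the local variance of the reference image as the filter parameter σc, and Gaussian forms for the joint bilateral weights. None of these forms are quoted from the text:

```python
import numpy as np

def pipeline_first_embodiment(bands, radius=1, sigma_s=1.0):
    """Minimal end-to-end sketch of steps ST11-ST14 (first embodiment)."""
    bands = np.asarray(bands, dtype=np.float64)  # (M, H, W) -- step ST11
    m, h, w = bands.shape
    k = 2 * radius + 1

    def box_var(img):
        # Local variance over a (2*radius+1)^2 window at each pixel
        pad = np.pad(img, radius, mode="reflect")
        v = np.empty((h, w))
        for y in range(h):
            for x in range(w):
                v[y, x] = pad[y:y + k, x:x + k].var()
        return v

    # ST12: noise amount = mean local variance; weights per Equation (2)
    noise = np.array([box_var(b).mean() for b in bands]) + 1e-9
    inv = 1.0 / noise
    ref = np.tensordot(inv / inv.mean(), bands, axes=1) / m

    # ST13: filter parameter sigma_c = local variance of the reference image
    sigma_c = np.maximum(box_var(ref), 1e-6)

    # ST14: joint bilateral smoothing of each band guided by the reference
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    ws = np.exp(-(ys**2 + xs**2) / (2.0 * sigma_s**2))
    pr = np.pad(ref, radius, mode="reflect")
    out = np.empty_like(bands)
    for i in range(m):
        pb = np.pad(bands[i], radius, mode="reflect")
        for y in range(h):
            for x in range(w):
                rwin = pr[y:y + k, x:x + k]
                wc = np.exp(-((rwin - ref[y, x]) ** 2)
                            / (2.0 * sigma_c[y, x]))
                wgt = ws * wc
                out[i, y, x] = (wgt * pb[y:y + k, x:x + k]).sum() / wgt.sum()
    return out
```

Each step corresponds to one block of the procedure: ST11 receives the bands, ST12 builds the reference image, ST13 derives the per-pixel filter parameter, and ST14 applies the edge-preserving smoothing.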
A procedure when a process in the image processing device of the third embodiment is performed by the computer illustrated in
In
In step ST11, the multiple band images DIN are received.
In step ST22, the white image DIW is received.
In step ST13c, the local variances or edge amounts are calculated as the image feature amounts by analyzing the white image DIW, and the filter parameters D32c are calculated from the image feature amounts. This process is performed as described for the parameter calculator 32c.
In step ST14, the multiple band images DIN are smoothed while preserving edges by using the filter parameters D32c. This process is performed as described for the filter processor 33.
A procedure when a process in the image processing device of the fourth embodiment is performed by the computer illustrated in
In
In step ST22, the white image DIW is received.
In step ST13c, the local variances or edge amounts are calculated as the image feature amounts by analyzing the white image DIW, and the filter parameters D32c are calculated from the image feature amounts. This process is performed as described for the parameter calculator 32c.
In step ST33, the bandpass filtering that passes the different wavelength bands is performed on the white image DIW, so that the multiple band images D34 are generated. This process is performed as described for the bandpass filter group 34.
The process of step ST33 can be performed in parallel with the process of step ST13c.
In step ST14, the multiple band images D34 are smoothed while preserving edges by using the filter parameters D32c. This process is performed as described for the filter processor 33.
Although image processing devices of the present invention have been described above, the image processing methods implemented by the above image processing devices also form part of the present invention. Also, programs for causing computers to execute processes of the above image processing devices or image processing methods, and computer-readable recording media, e.g., non-transitory recording media, storing the programs also form part of the present invention.
Although embodiments of the present invention have been described, the present invention is not limited to these embodiments.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2019/019497 | 5/16/2019 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2020/230319 | 11/19/2020 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
20160210729 | Monden | Jul 2016 | A1 |
20180328855 | Kido | Nov 2018 | A1 |
20190223820 | Nemoto | Jul 2019 | A1 |
20200074604 | Shibata et al. | Mar 2020 | A1 |
Number | Date | Country |
---|---|---|
109447909 | Mar 2019 | CN |
2016-32289 | Mar 2016 | JP |
WO-2014007869 | Jan 2014 | WO |
2015037189 | Mar 2015 | WO |
2018084069 | May 2018 | WO |
Entry |
---|
International Search Report and Written Opinion mailed on Jul. 2, 2019, received for PCT Application PCT/JP2019/019497, Filed on May 16, 2019, 8 pages including English Translation. |
Number | Date | Country
---|---|---
20220222782 A1 | Jul 2022 | US