This patent application is based on and claims priority pursuant to 35 U.S.C. § 119 to Japanese Patent Application No. 2019-131121, filed on Jul. 16, 2019, in the Japan Patent Office, the entire disclosure of which is hereby incorporated by reference herein.
Exemplary aspects of the present disclosure relate to an image processing apparatus, an image processing method, and a recording medium.
Conventionally, an image capturing technique is known that authenticates a document for document security. Such a technique embeds, in the document, invisible information that cannot be seen by human eyes, and reads the invisible information by using invisible light to authenticate the document.
There is a technique for extracting only a visible wavelength component by removing an infrared component from a visible image including the infrared component. According to the technique, a document image is illuminated by a lamp having an infrared wavelength component, and an infrared image including an infrared component is acquired by an infrared image sensor. At the same time, a visible image including a visible component and an infrared component is acquired by a visible image sensor. The infrared image and the visible image are used, so that the infrared component is removed from the visible image including the infrared component to extract only a visible wavelength component.
In at least one embodiment of this disclosure, there is described an image processing apparatus that includes a light source, an image sensor, and an invisible component remover. The light source emits visible component light and invisible component light to a target object. The image sensor receives reflected light from the target object to detect a visible invisible mixture image including an invisible component and a visible component and an invisible image including an invisible component. The invisible component remover, based on the visible invisible mixture image and the invisible image which have been detected, removes the invisible component from the visible invisible mixture image to generate a visible image. The invisible component remover includes a removal calculator and a noise reducer. The removal calculator performs a removal calculation process of an invisible component with respect to the visible invisible mixture image. The noise reducer performs a noise reduction process on at least one of an image to be input to the removal calculator and an image to be output from the removal calculator.
Further described is an improved image processing method for an image processing apparatus including a light source and an image sensor. The light source emits visible component light and invisible component light to a target object, and the image sensor receives reflected light from the target object to detect a visible invisible mixture image including an invisible component and a visible component and an invisible image including an invisible component. The image processing method includes removing, based on the visible invisible mixture image and the invisible image which have been detected, the invisible component from the visible invisible mixture image to generate a visible image. The removing includes performing a removal calculation process of an invisible component with respect to the visible invisible mixture image, and performing a noise reduction process on at least one of an image to be input to the removal calculation process and an image to be output from the removal calculation process.
Still further provided is a non-transitory computer-readable recording medium storing program code that causes a computer controlling the image processing apparatus described above to function as an invisible component remover, based on the visible invisible mixture image and the invisible image which have been detected, to remove the invisible component from the visible invisible mixture image to generate a visible image. The program code causes the invisible component remover to function as a removal calculator that performs a removal calculation process of an invisible component with respect to the visible invisible mixture image, and a noise reducer that performs a noise reduction process on at least one of an image to be input to the removal calculator and an image to be output from the removal calculator.
The aforementioned and other aspects, features, and advantages of the present disclosure are better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
The accompanying drawings are intended to depict embodiments of the present disclosure and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted.
In describing embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this patent specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that have the same function, operate in a similar manner and achieve similar results.
Referring now to the drawings, embodiments of the present disclosure are described below. In the drawings for explaining the following embodiments, the same reference codes are allocated to elements (members or components) having the same function or shape and redundant descriptions thereof are omitted below.
An image processing apparatus, an image processing method, and a recording medium are hereinafter described in detail with reference to the drawings.
The image processing apparatus 100 includes an image reading device 101 as a reading apparatus or an image reading apparatus, an automatic document feeder (ADF) 102, and an image forming device 103. The image forming device 103 is disposed in a lower portion of the image processing apparatus 100.
The ADF 102 is a document supporting device that feeds a document, the image of which is to be read, to a reading position. The ADF 102 automatically conveys the document placed on a tray to the reading position. The image reading device 101 reads the document conveyed by the ADF 102 in a predetermined reading position. On an upper surface of the image reading device 101, an exposure glass 1 as a document supporter on which a document is to be placed is disposed. The image reading device 101 reads the document on the exposure glass 1 in the reading position. Particularly, the image reading device 101 is a scanner including a light source, an optical system, and an image sensor such as a charge-coupled device (CCD) to read reflected light from a document illuminated by the light source by using the image sensor via the optical system.
The image forming device 103 prints the document image read by the image reading device 101. The image forming device 103 includes a bypass roller 104 that is used if a recording sheet is manually fed, and a recording sheet supplier 107 that supplies a recording sheet. The recording sheet supplier 107 has a mechanism for supplying recording sheets from a plurality of sheet cassettes 107a. The supplied recording sheet is fed to a secondary transfer belt 112 via a registration roller 108.
A transfer device 114 transfers a toner image on an intermediate transfer belt 113 to the recording sheet to be conveyed on the secondary transfer belt 112.
The image forming device 103 includes an optical writing device 109, a tandem image formation device 105 (for yellow (Y), magenta (M), cyan (C), and black (K)), the intermediate transfer belt 113, and the secondary transfer belt 112. An image written by the optical writing device 109 is formed as a toner image on the intermediate transfer belt 113 by an image forming process performed by the image formation device 105.
Particularly, the image formation device 105 (for Y, M, C, and K) includes four rotatable photoconductor drums (Y, M, C, and K). An image formation element 106 including a charging roller, a developing device, a primary transfer roller, a cleaner, and a discharger is disposed around each of the photoconductor drums. The image formation elements 106 function at the respective photoconductor drums, so that images on the photoconductor drums are transferred to the intermediate transfer belt 113 by the respective primary transfer rollers.
The intermediate transfer belt 113 is disposed in nips between the photoconductor drums and the respective primary transfer rollers and extends across a drive roller and a driven roller. With movement of the intermediate transfer belt 113, a toner image primarily transferred to the intermediate transfer belt 113 is secondarily transferred to a recording sheet on the secondary transfer belt 112 by a secondary transfer device. Such a recording sheet is conveyed to a fixing device 110 by movement of the secondary transfer belt 112, and the toner image is fixed as a color image on the recording sheet. Subsequently, the recording sheet is ejected to an ejection tray disposed outside the image forming device 103. If duplex printing is performed, a front surface and a back surface of the recording sheet are reversed by a reverse device 111 and the reversed recording sheet is fed to the secondary transfer belt 112.
The present embodiment has been described using a case in which the image forming device 103 employing the electrophotographic method forms an image as described above. However, the present embodiment is not limited to the electrophotographic image forming device. An image forming device employing an inkjet method can form an image.
Next, the image reading device 101 is described.
In a reading operation, the image reading device 101 emits light upward from the light source 2 while moving the first carriage 6 and the second carriage 7 in a sub-scanning direction (a direction indicated by an arrow A), so that reflected light from the document forms an image on the image sensor 9.
Moreover, the image reading device 101 reads reflected light from the reference white board 13 to set a reference, for example, when the power is turned on. That is, the image reading device 101 moves the first carriage 6 to a location immediately below the reference white board 13, and turns on the light source 2 to cause reflected light from the reference white board 13 to form an image on the image sensor 9, thereby performing gain adjustment.
The reader 21 includes the image sensor 9 and the light source 2, which emits visible component light and near-infrared (NIR) component light as invisible component light. The reader 21 irradiates a document with the visible component light and the NIR component light.
The light source driver 24 drives the light source 2.
The image sensor 9 acquires a visible invisible mixture image (a visible NIR mixture image) including a visible component and a NIR component (an invisible component), and a NIR image (an invisible image) including a NIR component based on reflected light from the document, and outputs the acquired images to the image processor 22 disposed in the following stage. In visible image reading, the image sensor 9 outputs red, green, and blue (RGB) signals. In invisible image reading, the image sensor 9 outputs a NIR signal. A general image sensor includes a color filter that transmits NIR light. Hence, in invisible image reading, the NIR signal appears in each of the RGB outputs (NIRr, NIRg, and NIRb).
In the present embodiment, an example in which a NIR image is used as an invisible image is described. However, an ultraviolet image can be used as an invisible image, and a wavelength region to be used for the invisible image is not particularly limited to any one region.
The controller 23 controls each of the light source driver 24, the image sensor 9, and the image processor 22.
The image processor 22 includes a black subtracter 221, a line-to-line corrector 222, a NIR component remover 223, and a shading corrector 224.
The black subtracter 221 performs black level correction on a visible NIR mixture image and a NIR image output from the image sensor 9.
The line-to-line corrector 222 performs a line-to-line correction process by which physical displacement of a line of the image sensor 9 is corrected.
The NIR component remover 223 functioning as an invisible component remover removes a NIR component (an invisible component) from a visible NIR mixture image (a visible invisible mixture image) to generate a visible image that does not include the NIR component. A detailed description of the NIR component remover 223 will be given below.
The shading corrector 224 functioning as an image corrector performs shading correction on each of the NIR image and the visible image with the NIR component removed. In the shading correction, a reading level of a white background plate is maintained for each main scanning pixel, and document read data is standardized at the reading level of the white background plate, so that fluctuations in reading level in a main scanning direction are removed.
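For illustration, a minimal sketch of such a shading correction follows, assuming the white background plate reading has already been captured for each main-scanning pixel; the function and parameter names are illustrative, not the apparatus's actual implementation.

```python
import numpy as np

def shading_correct(document_line, white_plate_line, target_level=255.0):
    """Normalize one main-scanning line of document read data by the reading
    level of the white background plate held for each main-scanning pixel,
    removing reading-level fluctuations in the main scanning direction."""
    white = np.maximum(white_plate_line.astype(np.float64), 1.0)  # guard against divide-by-zero
    corrected = document_line.astype(np.float64) / white * target_level
    return np.clip(corrected, 0.0, target_level)
```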
The NIR component remover 223 is described in detail.
The noise reducer 225 performs a noise reduction process on each of the visible NIR mixture image and the NIR image which have been input.
The removal calculator 226 uses the visible NIR mixture image and the NIR image which have undergone the noise reduction to remove a NIR component from the visible NIR mixture image.
Examples of removal calculation equations to be used by the removal calculator 226 are as follows.
Rout = Rin − NIR × Kr
Gout = Gin − NIR × Kg
Bout = Bin − NIR × Kb
In the equations, Rin, Gin, and Bin are image signals that have been input to the removal calculator 226, NIR is the NIR signal, and Kr, Kg, and Kb are coefficients set for the respective channels.
The image signals (Rin, Gin, and Bin) represent images in which visible components and NIR components are mixed. The removal calculator 226 subtracts NIR signals from the input image signals (Rin, Gin, and Bin) to output images having only visible components. Moreover, the removal calculator 226 multiplies a NIR signal by different coefficients Kr, Kg, and Kb on a channel basis. Such multiplication corrects a difference of the NIR components for each channel due to characteristics of the color filter disposed in the image sensor 9.
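A minimal sketch of this removal calculation in NumPy is shown below; the function and the default coefficient values are assumptions for the example, and the actual coefficients Kr, Kg, and Kb depend on the color filter characteristics of the image sensor 9.

```python
import numpy as np

def remove_nir(r_in, g_in, b_in, nir, kr=1.0, kg=1.0, kb=1.0):
    """Subtract the channel-weighted NIR signal from each mixed channel,
    following Rout = Rin - NIR x Kr (and likewise for G and B).
    Negative results are clipped at zero here, matching a representation
    without a bit for expressing negative values."""
    r_out = np.clip(r_in - nir * kr, 0, None)
    g_out = np.clip(g_in - nir * kg, 0, None)
    b_out = np.clip(b_in - nir * kb, 0, None)
    return r_out, g_out, b_out
```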
Generally, adding or subtracting images that each contain noise causes the noise to accumulate. That is, the noise amount σ3 of the visible image to be output is the root sum square of the noise amount σ1 of the visible NIR mixture image to be input and the noise amount σ2 of the NIR image to be input.
σ3 = √(σ1² + σ2²)
The noise amount of the visible image to be output by such calculation is greater than a noise amount of a visible image including only a visible component optically acquired by an element such as an infrared cut filter.
In the present embodiment, the noise reducer 225 disposed in a stage preceding the removal calculator 226 reduces noise prior to removal of the NIR component. Such noise reduction lowers noise amounts σ1 and σ2 beforehand, and thus a noise amount σ3 of an output visible image can be lowered. Accordingly, the noise reduction process prior to the removal calculation enables the removal calculator 226 to perform NIR component removal calculation on the noise-reduced image. Therefore, noise influence can be reduced, and output image quality can be maintained.
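As a numerical illustration of this relationship, the short sketch below shows that halving σ1 and σ2 by pre-filtering also halves σ3; the values are arbitrary examples, not measured noise amounts of the apparatus.

```python
import math

def output_noise(sigma1, sigma2):
    """Root sum square of the two input noise amounts."""
    return math.sqrt(sigma1 ** 2 + sigma2 ** 2)

print(output_noise(2.0, 2.0))  # about 2.83: output noise exceeds either input
print(output_noise(1.0, 1.0))  # about 1.41: halving sigma1 and sigma2 halves sigma3
```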
Next, a noise reduction method performed by the noise reducer 225 is described.
First, an example in which a linear filter is used as a noise reduction method is described.
The strength of the smoothing filter can be varied by the coefficient setting. However, the filter coefficients do not vary depending on a pixel position or a pixel value, and the same weighting calculation is performed on the entire area. Since a complicated process is not necessary, the use of the smoothing filter enables a noise reduction process to be performed with a small circuit scale.
Accordingly, the noise reducer 225 executes a filter process using a linear filter. Thus, noise can be reduced by a simple process and roughness degradation can be prevented.
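A minimal sketch of such a linear filter process follows, assuming a simple 3 × 3 averaging kernel; the kernel size and border handling are illustrative choices, not a specification of the noise reducer 225.

```python
import numpy as np
from scipy.ndimage import convolve

def smooth_linear(image, size=3):
    """Apply the same averaging weights to every pixel, independent of pixel
    position and pixel value, as with the smoothing filter described above."""
    kernel = np.full((size, size), 1.0 / (size * size))
    return convolve(image.astype(np.float64), kernel, mode='nearest')
```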
Next, an example in which a non-linear filter is used as a noise reduction method is described.
An example of such a non-linear filter is an epsilon filter. The epsilon filter applies a non-linear function F that limits the contribution of neighboring pixels whose values differ from the pixel of interest by more than a threshold, so that noise in flat areas is smoothed while edges are preserved.
The non-linear filter is not limited to the epsilon filter. A non-linear filter such as a bilateral filter can be used for noise reduction.
Accordingly, the noise reducer 225 performs a filter process using a non-linear filter to perform a noise reduction process in a state in which an edge is preserved, thereby preventing roughness degradation while reducing degradation in resolving power.
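A minimal sketch of one common formulation of an epsilon filter is shown below, assuming a square window and a fixed threshold; the window size and threshold are illustrative, and a bilateral filter would instead weight neighbors by their value difference rather than excluding them.

```python
import numpy as np

def epsilon_filter(image, size=3, eps=10.0):
    """Average only the neighboring pixels whose difference from the center
    pixel is within +/- eps; pixels across a strong edge are excluded, so
    edges are preserved while flat-area noise is smoothed."""
    img = image.astype(np.float64)
    pad = size // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.empty_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            window = padded[y:y + size, x:x + size]
            center = img[y, x]
            mask = np.abs(window - center) <= eps  # center pixel always included
            out[y, x] = window[mask].mean()
    return out
```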
Next, an effect of noise reduction prior to removal calculation performed by the removal calculator 226 is described.
Non-linear filters change the processing to be applied depending on pixel values. Consequently, when the visible NIR mixture image and the NIR image are filtered independently, the NIR component included in the visible NIR mixture image and the NIR component included in the NIR image may become inconsistent with each other. On the other hand, smoothing filters (linear filters) apply the same weighting to every pixel regardless of pixel position or pixel value, so that the relation between the NIR components of the two images is preserved.
Accordingly, in the present embodiment, the noise reducer 225 preferably uses linear filters having the same strength for the visible NIR mixture image and the NIR image in the noise reduction. The use of such filters enables a NIR component to be appropriately removed from the visible NIR mixture image even if a process to be performed by the removal calculator 226 is simplified.
Hence, the use of linear filters having the same strength for a visible NIR mixture image and a NIR image can prevent inconsistency of invisible components included in the visible NIR mixture image and the NIR image at removal calculation.
Next, determination of noise reduction strength in the noise reducer 225 is described.
An S/N ratio subsequent to visible image generation can be estimated from a mixture ratio of visible components to NIR components. Thus, the S/N ratio can be adjusted to a target S/N ratio simply by changing the noise reduction strength based on the mixture ratio.
According to the present embodiment, when an invisible component included in a visible image is to be removed, an increase in noise to be generated at the removal of the invisible component can be prevented, and image quality degradation can be prevented or reduced.
According to the present embodiment, moreover, execution of a noise reduction process prior to a removal calculation process can prevent or reduce a computing error at the time of removal calculation.
A second embodiment is described.
The second embodiment differs from the first embodiment in that a noise reducer 225 is disposed in a stage following a removal calculator 226. The first embodiment has been described using an example in which a noise reduction process is performed prior to the process performed by the removal calculator 226. However, a noise reduction process can be performed on an output visible image from the removal calculator 226 to reduce a noise amount σ3 of the output visible image. Hereinafter, components and configurations that differ from components and configurations of the first embodiment will be described, and description of like components will be omitted.
The image processor 22 includes a black subtracter 221, a line-to-line corrector 222, a NIR component remover 223, and a shading corrector 224.
Unlike the NIR component remover 223 described in the first embodiment, the NIR component remover 223 of the present embodiment includes the noise reducer 225 in a stage following the removal calculator 226. That is, the NIR component remover 223 first performs removal calculation in the removal calculator 226 by using a NIR image and a visible NIR mixture image that are output from the line-to-line corrector 222 as inputs. Then, the NIR component remover 223 performs a noise reduction process in the noise reducer 225 on the visible image and the NIR image output from the removal calculator 226. Such a configuration can enhance image quality of the NIR image originally having a small signal level while enhancing image quality of the visible image on which noise has been superimposed by the removal calculation.
In addition, to remove the NIR component from the visible NIR mixture image with good accuracy, the relation between the NIR component included in the visible NIR mixture image and the NIR component included in the NIR image needs to be clear (preferably, the two components match each other).
In a case where noise reduction is performed prior to the removal calculation as described in the first embodiment, a relation between the noise reduction and the removal calculation may be affected depending on a noise reduction method. However, if noise reduction is performed subsequent to removal calculation, any noise reduction method can be applied as the noise reduction is performed on an image that has undergone the removal calculation.
Moreover, since a non-linear filter such as the epsilon filter described above can be used aggressively without affecting the accuracy of the NIR component removal calculation, noise can be reduced while degradation in the sharpness of characters and lines is prevented or reduced. Hence, image quality can be further enhanced.
According to the present embodiment, a noise reduction process is performed subsequent to the process performed by the removal calculator 226, and thus various noise reduction methods can be applied without consideration of influence on the removal calculation process.
According to the present embodiment, moreover, the use of a non-linear filter in a process subsequent to the process performed by the removal calculator 226 can not only maintain resolving power but also prevent roughness degradation due to noise without consideration of influence on the removal calculation.
A third embodiment is described.
The third embodiment differs from the first and second embodiments in that an offsetter 227 that performs an offset process for providing an offset amount to a visible NIR mixture image is disposed. Hereinafter, components and configurations that differ from components and configurations of the first and second embodiments will be described, and description of like components will be omitted.
As described above, the removal calculation subtracts a NIR component from each pixel of the visible NIR mixture image. In a pixel having a reading value close to zero, noise can drive the result of the subtraction negative; without a bit for expressing negative values, such a pixel is clipped at zero, which degrades color reproduction.
In the present embodiment, the NIR component remover 223 includes the offsetter 227 and an offset remover 228 in addition to a noise reducer 225 and a removal calculator 226.
The offsetter 227 provides an offset amount to a visible NIR mixture image.
The offset remover 228 removes the offset amount provided by the offsetter 227 subsequent to removal calculation and noise reduction.
Next, a description is given of the noise reduction effect obtained if an offset amount is provided prior to a process performed by the removal calculator 226.
Accordingly, even if the noise reducer 225 performs the noise reduction using a non-linear filter subsequent to the removal calculation, the number of pixels to be clipped at zero is reduced, and the NIR component can be removed with higher color reproduction accuracy without adding a bit for negative value expression.
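A hedged single-channel sketch of this offset flow follows, reusing the coefficient convention of the earlier removal equations; the offset value 16 is a placeholder, and the actual amount is chosen according to the noise amount of the image reading device 101, as described later in this embodiment.

```python
import numpy as np

def remove_nir_with_offset(mixed, nir, k=1.0, offset=16.0):
    """Add an offset to the visible NIR mixture image before the subtraction so
    that noise around zero is not clipped, then remove the same offset after
    the removal calculation (and, per this embodiment, after noise reduction)."""
    shifted = mixed.astype(np.float64) + offset        # offsetter 227
    removed = shifted - nir.astype(np.float64) * k     # removal calculator 226
    # ... noise reduction by the noise reducer 225 would run here ...
    visible = removed - offset                         # offset remover 228
    return np.clip(visible, 0, None)
```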
Next, determination of an offset amount by the offsetter 227 is described.
The amount of noise superimposed on a reading value by the image reading device 101 can be known in advance. Accordingly, an offset amount to be provided in the visible NIR mixture image reading can be determined based on this noise amount.
The greater the offset amount, the lower the possibility of zero clipping in a pixel having a reading value close to zero. However, too large an offset amount may cause an overflow in a larger reading value. In the present embodiment, therefore, the offsetter 227 sets the offset amount to an appropriate value according to the amount of noise of the image reading device 101.
According to the present embodiment, execution of an offset process can reduce a calculation error that occurs when the removal calculator 226 performs removal calculation. Moreover, the use of a suitable offset amount can prevent an overflow in a larger reading value.
A fourth embodiment is described.
The fourth embodiment differs from the first through third embodiments in that noise reduction is performed subsequent to image correction (e.g., subsequent to shading correction). Hereinafter, components and configurations that differ from components and configurations of the first through third embodiments will be described, and description of like components will be omitted.
The image processor 22 includes a black subtracter 221, a line-to-line corrector 222, a NIR component remover 223, and a shading corrector 224. In the present embodiment, shading correction is described as image correction. However, the image correction is not limited to the shading correction.
Unlike the NIR component remover 223 described in each of the first through third embodiments, the NIR component remover 223 of the present embodiment includes a removal calculator 226 and a noise reducer 225 that are respectively arranged in stages preceding and following the shading corrector 224.
As described above in the first embodiment, in the shading correction, a reading level of a white background plate is maintained for each main scanning pixel, and document read data is standardized at the reading level of the white background plate, so that fluctuations in the reading levels in a main scanning direction are removed. Consequently, if the relation between the reading level of the white background plate and the reading level of the document surface collapses at any main scanning position, the correction cannot be appropriately performed.
On the other hand, if noise reduction using a non-linear filter by the noise reducer 225 is used, the process to be performed changes (the pixels to be referenced change) depending on an image characteristic. Consequently, the relation between a reading level of the white background plate and a reading level of the document surface may change.
To prevent influence of such a change, in the present embodiment, shading correction is performed on data that has undergone removal calculation in the removal calculator 226, and noise reduction is performed by the noise reducer 225 subsequent to the shading correction. With such a configuration, fluctuations in reading levels in a main scanning direction due to the shading correction can be appropriately corrected, and a high quality image without reading density unevenness can be provided.
According to the present embodiment, noise reduction is performed subsequent to image correction (e.g., shading correction). Therefore, an image change in a main scanning direction due to the noise reduction can be prevented, and the image correction (e.g., shading correction) can be appropriately performed.
A fifth embodiment is described.
The fifth embodiment differs from the first through fourth embodiments in that a noise reducer 225 is disposed in each of stages preceding and following a removal calculator 226. Hereinafter, components and configurations that differ from components and configurations of the first through fourth embodiments will be described, and description of like components will be omitted.
The NIR component remover 223 of the present embodiment includes a first noise reducer 225a, a removal calculator 226, and a second noise reducer 225b. That is, the NIR component remover 223 includes two noise reducers respectively disposed in stages preceding and following the removal calculator 226.
The first noise reducer 225a uses a linear filter to perform a noise reduction process on the visible NIR mixture image and the NIR image, which have been received.
The removal calculator 226 uses the output data from the first noise reducer 225a to perform a removal calculation process by which a NIR component is removed from the visible NIR mixture image.
The second noise reducer 225b uses a non-linear filter to perform a noise reduction process on the data subsequent to the removal calculation process. The second noise reducer 225b may or may not perform a noise reduction process on a NIR image.
The NIR component remover 223 of the present embodiment removes a certain amount of noise by using a linear filter in the first noise reducer 225a prior to removal calculation. Accordingly, even if an offset process is not performed on a visible NIR mixture image, the NIR component remover 223 of the present embodiment can reduce degradation in color reproduction due to zero clipping at the removal calculation, as described above in the third embodiment.
Moreover, the NIR component remover 223 of the present embodiment performs a noise reduction process using a non-linear filter in the second noise reducer 225b subsequent to the removal calculation. Such noise reduction can further reduce noise while preventing resolution degradation, and thus quality of an image to be output in a following stage can be enhanced.
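Using the illustrative helper functions sketched in the earlier embodiments (smooth_linear, remove_nir, epsilon_filter), the processing order of the present embodiment can be summarized as follows; this is a sketch of the ordering only, not the apparatus's actual implementation.

```python
def nir_remover_fifth_embodiment(r_in, g_in, b_in, nir):
    """First noise reducer 225a (linear), removal calculator 226, then second
    noise reducer 225b (non-linear) on the generated visible channels."""
    # Linear pre-filtering with the same strength for every input image
    r_f = smooth_linear(r_in)
    g_f = smooth_linear(g_in)
    b_f = smooth_linear(b_in)
    nir_f = smooth_linear(nir)
    # NIR component removal on the pre-filtered data
    r_v, g_v, b_v = remove_nir(r_f, g_f, b_f, nir_f)
    # Edge-preserving post-filtering on the visible image
    # (per the description below, shading correction may run before this step)
    return epsilon_filter(r_v), epsilon_filter(g_v), epsilon_filter(b_v)
```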
According to the present embodiment, therefore, even if an offset process is not performed, a noise reduction process in the first noise reducer 225a can reduce an error that occurs at removal calculation, and a noise reduction process in the second noise reducer 225b by using a filter that prevents resolving power degradation can reduce roughness degradation.
Accordingly, the noise reduction by the second noise reducer 225b is performed subsequent to the shading correction. Such a configuration enables the shading correction to be appropriately performed. More particularly, fluctuations in reading levels in a main scanning direction due to the shading correction can be appropriately corrected, and a high quality image without reading density unevenness can be provided.
A sixth embodiment is described.
The sixth embodiment differs from the first through fifth embodiments in that noise reduction is switched. Hereinafter, components and configurations that differ from components and configurations of the first through fifth embodiments will be described, and description of like components will be omitted.
The image processor 22 includes a black subtracter 221, a line-to-line corrector 222, a NIR component remover 223, and a shading corrector 224.
The setting receiving unit 231 outputs, to the operation control unit 232, mode information that is set by a user via an operation device such as an operation panel.
The operation control unit 232 controls operations of the reader 21 and the image processor 22 according to the mode information which has been set in the setting receiving unit 231. More particularly, the operation control unit 232 controls lighting of the light source 2 and/or the noise reducer 225 and the removal calculator 226 of the NIR component remover 223 to switch operations according to the mode information set in the setting receiving unit 231.
If the user sets a scan operation for scanning a character document and an invisible latent image included in the character document, the visible light source and a NIR light source are simultaneously operated to acquire both visible and invisible images. In such a case, since removal of a NIR component is necessary, not only is removal calculation set to ON, but noise reduction is also set to ON to reduce image quality degradation due to noise degradation at the removal calculation.
Herein, an increase in noise reduction strength may lower resolution. In the character document scanning, resolving power is of greater concern than image quality degradation due to roughness caused by noise. Thus, although the noise reduction is performed, its strength is desirably suppressed.
Moreover, if the user sets a scanning operation for scanning a photograph document and an invisible latent image included in the photograph document, the visible light source and the NIR light source are simultaneously operated to acquire both of visible and invisible images. In such a case, since removal of a NIR component is necessary, removal calculation is set to ON. In addition, noise reduction is set to ON to reduce image quality degradation caused by noise degradation in the removal calculation.
In the photograph document scanning, image quality degradation due to roughness is of greater concern than resolving power and needs to be prevented. Thus, the strength of the noise reduction is desirably increased to a certain extent.
Moreover, depending on a specific target object such as a phosphor, image quality can be affected by an infrared component excited by the visible light source or by a very small amount of infrared component contained in the visible light source. In such a case, since the infrared component needs to be removed, only the visible light source is turned on, and removal calculation and noise reduction are set to ON.
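For illustration, the combinations described above can be organized in code as follows; the mode names, the ScanMode structure, and the strength labels are assumptions for the example and do not reproduce the actual control table of the operation control unit 232.

```python
from dataclasses import dataclass

@dataclass
class ScanMode:
    """Hypothetical container for the settings switched by the operation
    control unit 232 according to the mode information."""
    visible_light_on: bool
    nir_light_on: bool
    removal_calculation_on: bool
    noise_reduction_on: bool
    noise_reduction_strength: str  # "low" or "high"

# Illustrative mode table reflecting the scan operations described above
SCAN_MODES = {
    "character_document_with_latent_image": ScanMode(True, True, True, True, "low"),
    "photograph_document_with_latent_image": ScanMode(True, True, True, True, "high"),
    "phosphor_target": ScanMode(True, False, True, True, "low"),  # strength not specified above
}
```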
According to the present embodiment, the control is switched according to a setting made by a user, so that a suitable process can be performed on a document to output the document with image quality desired by the user.
The present embodiment can employ combinations of control methods other than the combinations described above.
Each of the above embodiments has been described using an example in which an image forming apparatus as an image processing apparatus is a multifunctional peripheral having at least two of a copy function, a printer function, a scanner function and a facsimile function. However, each of the above embodiments can be applied to any image forming apparatus such as a copier, a printer, a scanner, and a facsimile device.
The present disclosure has been described above with reference to specific embodiments but is not limited thereto. Various modifications and enhancements are possible without departing from the scope of the disclosure. It is therefore to be understood that the present disclosure may be practiced otherwise than as specifically described herein. For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of the present disclosure.
Any one of the above-described operations may be performed in various other ways, for example, in an order different from the one described above.
Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), digital signal processor (DSP), field programmable gate array (FPGA), and conventional circuit components arranged to perform the recited functions.
Program codes executed by the image processing apparatus to achieve the functions of the described embodiments may be provided in files in an installable format or an executable format that are recorded on computer-readable recording media such as a CD-ROM, a flexible disk (FD), a CD-R, and a digital versatile disk (DVD).
The program codes executed by the image processing apparatus may be stored on a computer connected to a network such as the Internet and provided by being downloaded via the network, or be provided or distributed via a network such as the Internet.