The present invention is based on and claims priority to Japanese patent application No. 2004-165559, filed on Jun. 3, 2004, in the Japanese Patent Office, the entire contents of which are hereby incorporated by reference.
The following disclosure relates generally to correcting background color of a scanned image.
When a book document, such as a book or a booklet having a bound boundary or spine, is placed on an exposure glass of a scanner, the bound boundary or spine often rises above the surface of the exposure glass. As a result, a scanned image, particularly a portion corresponding to the bound boundary or spine, suffers from lower image quality. For example, the boundary portion may have a darker background color, or it may appear distorted or blurred.
In light of the above, various methods have been applied to improve the degraded quality of the scanned image. For example, the background color of the scanned image may be corrected using a reference background color, which can be calculated from information obtained from a selected portion of the scanned image. However, if the selected portion includes noise information, the resultant reference background color may not be accurate. As a result, the background color of the scanned image may not be corrected in a suitable manner, and the quality of the scanned image may be further degraded by the improper background color correction.
Exemplary embodiments of the present invention provide an apparatus, system, method, computer program and product, each capable of correcting background color of a scanned image.
For example, a scanned image having a distorted portion and an undistorted portion is obtained. A reference background color is calculated using information obtained from the entire scanned image. Using the reference background color, the background color of the distorted portion of the scanned image is corrected.
A more complete appreciation of the disclosure and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
In describing the preferred embodiments illustrated in the drawings, specific terminology is employed for clarity. However, the disclosure of this patent specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that operate in a similar manner. Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views,
The scanner 1 of
In addition to the background color correction, the scanner 1 of
Alternatively, the scanner 1 of
As shown in
To scan an original placed on the exposure glass 2, the first scanning body 5 and the second scanning body 8 move under the exposure glass 2 and direct light emitted from the exposing lamp 3 onto the original. The light reflected off the original is further reflected by the first reflection mirror 4, the second reflection mirror 6, and the third reflection mirror 7 toward the lens 10. The lens 10 forms an image on the CCD 9 according to the reflected light, and the CCD 9 converts the formed image to image data.
The scanner 1 may be provided with a printer (not shown) so that the two function together as an image forming apparatus, such as the digital copier 16 illustrated in
Referring to
The CPU 31 controls the operation of the main controller 19. The ROM 32 stores BIOS (basic input output system), for example. The RAM 33 stores various data in an erasable manner to function as a work area of the CPU 31. The HDD 35 stores various programs to be operated by the CPU 31. The optical disc drive 36 reads data from an optical disc 37, for example. The optical disc 37 includes any kind of storage medium, such as CDs, DVDs, or magnetic disks, capable of storing various kinds of data. The communication I/F 38 allows the main controller 19 to communicate with other devices or apparatus.
In this exemplary embodiment, the CPU 31, the ROM 32, and the RAM 33 may together function as a microprocessor or any other kind of processor, capable of performing at least one of the operations disclosed below.
Further, in this exemplary embodiment, the HDD 35, the optical disc drive 36, and the communication I/F 38 may together function as a storage device storing a computer program, which allows the processor to perform at least one of the operations disclosed below. In one example, the CPU 31 may read the computer program stored in the optical disc 37 using the optical disc drive 36, and install it on the HDD 35. In another example, the CPU 31 may download the computer program from a network, such as the Internet, through the communication I/F 38, and install it on the HDD 35. Furthermore, the computer program may run on a predetermined operating system (OS), or may be included as part of a group of files implementing an application software program, such as a word processing program, or the OS.
Referring now to
In the exemplary embodiments described below, a book document is placed on the exposure glass 2 such that its bound boundary 41 is parallel to the main scanning direction X of the scanner 1, as illustrated in
Referring back to
Step S2 obtains color information of the scanned image 40, such as RGB (red, green, blue) data indicating R, G, and B values of each pixel included in the scanned image 40.
Step S3 converts the RGB data to HSV (hue, saturation, intensity value) or HSB (hue, saturation, brightness) data, using any one of the known color space conversion models. For simplicity, the intensity value and the brightness value are collectively referred to as the brightness value in the following disclosure.
For example, if a target pixel located in the coordinate (x, y) has the red value R(x, y), the green value G(x, y), and the blue value B(x, y), the brightness value V(x, y), the saturation value S(x, y), and the hue value H(x, y) for the target pixel may be calculated using the following equations:
V(x, y)=0.3*R(x, y)+0.59*G(x, y)+0.11*B(x, y);
H(x, y)=tan⁻¹((R(x, y)−V(x, y))/(B(x, y)−V(x, y))); and
S(x, y)=√((R(x, y)−V(x, y))²+(B(x, y)−V(x, y))²).
Using the saturation value S(x, y) obtained in the previous step, Step S4 classifies the pixels in the scanned image 40 into a first group of pixels having high saturation values and a second group of pixels having low saturation values. In this exemplary embodiment, if a target pixel has a saturation value equal to or smaller than a reference saturation value, the target pixel is assumed to have an achromatic color. If a target pixel has a saturation value larger than the reference saturation value, the target pixel is assumed to have a chromatic color. The reference saturation value may be determined empirically; in this exemplary embodiment, it is set to 15%.
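By way of illustration only, the following minimal Python sketch implements the conversion of Step S3 and the classification of Step S4. All function names and parameters here and in the later sketches are hypothetical; in particular, interpreting the 15% reference saturation value as a fraction of the largest saturation present in the image is an assumption of the sketch, not a detail taken from this specification.

```python
import numpy as np

def rgb_to_vhs(rgb):
    """Step S3 equations: brightness V, hue H, and saturation S for an
    H x W x 3 RGB image with channel values in the range 0 to 255."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    v = 0.3 * r + 0.59 * g + 0.11 * b
    # arctan2 evaluates tan^-1((R - V)/(B - V)) while avoiding a division
    # by zero when B(x, y) equals V(x, y).
    h = np.arctan2(r - v, b - v)
    s = np.sqrt((r - v) ** 2 + (b - v) ** 2)
    return v, h, s

def is_chromatic(s, ratio=0.15):
    """Step S4: a pixel is chromatic when its saturation exceeds the
    reference saturation value; the 15% threshold is taken here as a
    fraction of the maximum saturation in the image (an assumption)."""
    return s > ratio * s.max()
```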
Step S5 calculates a brightness profile V(y) of the scanned image 40, which indicates the distribution of brightness values.
In one example, the scanned image 40 is sliced into a plurality of sections or lines (collectively referred to as the "section"), each extending longitudinally in parallel to the boundary portion 41, that is, along the main scanning direction X. For each of the sections, a histogram indicating the distribution of brightness values is generated, using the brightness values of the corresponding section. For example,
Using the obtained histogram, the brightness values having a number of pixels larger than a predetermined number are extracted, and the average of the extracted brightness values is calculated as the brightness value V(y) for the corresponding section.
Step S6, which is optionally provided, applies filtering to the brightness profile V(y) to remove noise data from the brightness profile V(y), using any one of the known filtering methods. For example, the brightness value of a target pixel may be replaced with the mean or median of brightness values of its neighboring pixels. This filtering process may be repeated several times, if necessary.
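Continuing the sketch under the same assumptions, Steps S5 and S6 might be implemented as follows; the histogram bin count, the pixel-count threshold, the filter radius, and the number of filtering passes are illustrative values, and the extraction-and-averaging rule mirrors the R-profile computation described below for Step S305.

```python
def brightness_profile(v, min_count=20, bins=64):
    """Step S5 sketch: one section per scan line parallel to the main
    scanning direction X.  Brightness values whose histogram count
    exceeds a predetermined number are extracted and averaged."""
    profile = np.empty(v.shape[0])
    for y in range(v.shape[0]):
        counts, edges = np.histogram(v[y], bins=bins, range=(0.0, 255.0))
        centers = (edges[:-1] + edges[1:]) / 2.0
        mask = counts > min_count
        if not mask.any():
            mask = counts > 0  # fallback for very narrow images
        profile[y] = np.average(centers[mask], weights=counts[mask])
    return profile

def smooth_profile(profile, radius=2, passes=3):
    """Optional Step S6 sketch: replace each profile value with the median
    of its neighborhood, repeating the pass a few times if necessary."""
    p = np.asarray(profile, dtype=float)
    for _ in range(passes):
        padded = np.pad(p, radius, mode='edge')
        p = np.array([np.median(padded[i:i + 2 * radius + 1])
                      for i in range(p.size)])
    return p
```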
Step S7 calculates a reference brightness value Vflat of the scanned image 40, using the brightness profile V(y) obtained in the previous step. For example, the brightness value having the largest number of pixels may be obtained from a histogram generated based on the brightness profile V(y) and used as the reference brightness value Vflat, which corresponds to the background color of the undistorted portion of the scanned image 40.
Step S8 normalizes the brightness profile V(y) based on the reference brightness value Vflat. The normalized brightness profile Vn(y) may be obtained by dividing the brightness profile V(y) by the reference brightness value Vflat, and has a value ranging from 0 to 1. If a section of the scanned image 40 has a normalized brightness profile Vn(y) smaller than 1, particularly a value closer to 0, that section is assumed to belong to a distorted portion of the scanned image 40. If a section of the scanned image 40 has a normalized brightness profile Vn(y) substantially equal to 1, that section is assumed to belong to an undistorted portion of the scanned image 40.
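Steps S7 and S8 may then be sketched as follows. Taking the most frequent profile value as Vflat mirrors how the reference R value is obtained in Step S307; clipping the normalized profile to the 0-to-1 range described above, with a small positive lower bound that guards the later divisions, is an assumption of the sketch.

```python
def normalize_profile(profile, bins=64):
    """Steps S7 and S8 sketch: the profile value having the largest pixel
    count serves as the reference brightness Vflat, and dividing the
    profile by Vflat yields the normalized profile Vn(y)."""
    counts, edges = np.histogram(profile, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    v_flat = centers[np.argmax(counts)]
    # Clip to the 0-to-1 range described above; the lower bound guards the
    # divisions performed during background color correction.
    vn = np.clip(profile / v_flat, 1e-3, 1.0)
    return vn, v_flat
```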
Step S9 corrects background color of the scanned image 40, using the normalized brightness profile Vn(y).
In this exemplary embodiment, this step first determines whether a target pixel has a chromatic color by referring to the group (defined in Step S4) to which the target pixel belongs. If the target pixel has a chromatic color, the saturation value S(x, y) and the brightness value V(x, y) of the target pixel are used for background color correction. For example, if the target pixel has the hue value H(x, y), the saturation value S(x, y), and the brightness value V(x, y), a corrected saturation value S′(x, y) and a corrected brightness value V′(x, y) are obtained, respectively, using the normalized brightness profile Vn(y) as follows: S′(x, y)=S(x, y)/Vn(y); and V′(x, y)=V(x, y)/Vn(y). The obtained HSV data, including the hue value H(x, y), the saturation value S′(x, y), and the brightness value V′(x, y), is converted to RGB data, using any one of the known color space conversion models.
If the target pixel has an achromatic color, only the brightness value V(x, y) of the target pixel is used for background color correction. For example, if the target pixel has the hue value H(x, y), the saturation value S(x, y), and the brightness value V(x, y), a corrected brightness value V′(x, y) is obtained using the normalized brightness profile Vn(y) as follows: V′(x, y)=V(x, y)/Vn(y). The obtained HSV data, including the hue value H(x, y), the saturation value S(x, y), and the brightness value V′(x, y), is converted to RGB data, using any one of the known color space conversion models.
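A sketch of the correction of Step S9 follows. The specification does not state an inverse conversion back to RGB; the mapping below is obtained by solving the Step S3 equations for R, G, and B (with H computed by arctan2, so that R − V = S·sin H and B − V = S·cos H), and is therefore one reading of the specification rather than a formula it provides.

```python
def correct_background_hsv(rgb, vn, ratio=0.15):
    """Step S9 sketch: chromatic pixels have both S and V divided by the
    normalized profile Vn(y); achromatic pixels have only V divided."""
    v, h, s = rgb_to_vhs(rgb)
    chromatic = is_chromatic(s, ratio)
    vn_col = vn[:, np.newaxis]          # broadcast Vn(y) across each line
    v2 = v / vn_col
    s2 = np.where(chromatic, s / vn_col, s)
    # Inverse of the Step S3 equations: R = V + S sin H, B = V + S cos H,
    # and G recovered from V = 0.3 R + 0.59 G + 0.11 B.
    r2 = v2 + s2 * np.sin(h)
    b2 = v2 + s2 * np.cos(h)
    g2 = (v2 - 0.3 * r2 - 0.11 * b2) / 0.59
    return np.clip(np.stack([r2, g2, b2], axis=-1), 0.0, 255.0)
```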
Step S10 outputs the corrected image to any other device, such as the printer provided in the digital copier 16, the memory 23 provided in the digital copier 16, or the outside device via the communication I/F 38, for example. At this time, the main controller 19 may further perform distortion correction or blur correction on the corrected image.
In the operation illustrated in
Referring now to
The operation illustrated in
Step S203 obtains a brightness value V(x, y) of each pixel in the scanned image 40 from the RGB data obtained in the previous step, using any one of the known color space conversion models. For example, the brightness value V(x, y) of a target pixel may be obtained through the equation: V(x, y)=0.3*R(x, y)+0.59*G(x, y)+0.11*B(x, y).
Step S209 corrects the background color of the scanned image 40, using the normalized brightness profile Vn(y). In this exemplary embodiment, the R value R(x, y), the G value G(x, y), and the B value B(x, y) of a target pixel are respectively corrected using the following equations: R′(x, y)=R(x, y)/Vn(y); G′(x, y)=G(x, y)/Vn(y); and B′(x, y)=B(x, y)/Vn(y).
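Under the same assumptions as the earlier sketches, this per-channel division might be expressed as:

```python
def correct_background_rgb(rgb, vn):
    """Step S209 sketch: each of R, G, and B is divided by the normalized
    brightness profile Vn(y) of the pixel's scan line."""
    return np.clip(rgb / vn[:, np.newaxis, np.newaxis], 0.0, 255.0)
```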
Referring now to
Step S1 inputs the scanned image 40.
Step S2 obtains color information of the scanned image 40, such as RGB data including R, G, and B values for each pixel included in the scanned image 40.
Step S305 calculates a profile of the scanned image 40, including an R profile, a G profile, and a B profile, using the RGB data obtained in the previous step.
In one example, a histogram showing the distribution of R values R(x, y) is generated for each section of the scanned image 40 in a substantially similar manner as described above with reference to Step S5. Similarly, a histogram showing the distribution of G values G(x, y), and a histogram showing the distribution of B values B(x, y) are generated, respectively.
Using the histogram for R values, the R values having a number of pixels larger than a predetermined number are extracted, and the average of the extracted R values is calculated as the R profile R(y). Similarly, the G profile G(y) of the scanned image 40 and the B profile B(y) of the scanned image 40 are obtained, respectively.
Step S306, which is optionally provided, applies filtering to the R profile R(y), the G profile G(y), and the B profile B(y), respectively, using any one of the known filtering methods.
Step S307 calculates a reference RGB value of the scanned image 40. In this exemplary embodiment, the R value having the largest number of pixels can be obtained from the histogram generated based on the R profile R(y). Using this R value, a reference R value Rflat is obtained, in a substantially similar manner as described above with reference to Step S7. Similarly, a reference G value Gflat and a reference B value Bflat can be obtained, respectively.
Step S308 normalizes the R profile R(y), the G profile G(y), and the B profile B(y), respectively, based on the corresponding R, G, and B reference values, in a substantially similar manner as described above with reference to Step S8. For example, the normalized R profile Rn(y) may be obtained by dividing the R profile R(y) by the reference R value Rflat. The normalized G profile Gn(y) may be obtained by dividing the G profile G(y) by the reference G value Gflat. The normalized B profile Bn(y) may be obtained by dividing the B profile B(y) by the reference B value Bflat. Each of the normalized profiles ranges from 0 to 1, with the value 1 corresponding to an undistorted portion of the scanned image 40.
Step S309 corrects background color of the scanned image 40, using the normalized R, G, and B profiles. For example, a corrected R value of a target pixel is obtained using the following equation: R′(x, y)=R(x, y)/Rn(y). Similarly, a corrected G value of the target pixel is obtained using the following equation: G′(x, y)=G(x, y)/Gn(y). Similarly, a corrected B value of the target pixel is obtained using the following equation: B′(x, y)=B(x, y)/Bn(y).
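Reusing the hypothetical helpers sketched above, the per-channel operation of Steps S305 through S309 might look like the following; applying the same illustrative parameters to each channel is an assumption.

```python
def correct_background_per_channel(rgb):
    """Steps S305-S309 sketch: an independent profile is built, optionally
    smoothed, and normalized for each of R, G, and B, and each channel is
    divided by its own normalized profile."""
    corrected = np.empty_like(rgb, dtype=float)
    for c in range(3):                              # 0: R, 1: G, 2: B
        prof = brightness_profile(rgb[..., c])      # Step S305
        prof = smooth_profile(prof)                 # optional Step S306
        n_prof, _ = normalize_profile(prof)         # Steps S307 and S308
        corrected[..., c] = rgb[..., c] / n_prof[:, np.newaxis]  # Step S309
    return np.clip(corrected, 0.0, 255.0)
```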
Step S10 outputs the corrected image to any other device. At this time, the main controller 19 may further perform distortion correction or blur correction on the corrected image.
Any one of the above-described operations shown in
In one example, as illustrated in
In this exemplary embodiment, a color profile, such as a brightness profile or RGB profile, is calculated for each of the sections L1 based on pixel information included in the corresponding section L1, using any one of the above-described methods. At the same time, a reference background color, such as a reference brightness value or RGB value, of the scanned image 40 is calculated based on information obtained from the entire scanned image 40, using any one of the above-described methods. The background color in each of the sections L1 is corrected, using the color profile of the corresponding section L1 and the reference background color.
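Because the drawings defining the sections L1 are not reproduced here, the following sketch simply assumes that each section is a band of several consecutive scan lines parallel to the bound boundary 41, with one profile value computed per band and shared by every line in that band; the band height is an illustrative assumption.

```python
def banded_profile(v, band_height=8, min_count=20, bins=64):
    """Section-based sketch: one representative background value per band
    of band_height consecutive lines parallel to the bound boundary 41."""
    profile = np.empty(v.shape[0])
    for top in range(0, v.shape[0], band_height):
        band = v[top:top + band_height].ravel()
        counts, edges = np.histogram(band, bins=bins, range=(0.0, 255.0))
        centers = (edges[:-1] + edges[1:]) / 2.0
        mask = counts > min_count
        if not mask.any():
            mask = counts > 0
        profile[top:top + band_height] = np.average(centers[mask],
                                                    weights=counts[mask])
    return profile
```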
In another example, as illustrated in
In this exemplary embodiment, a color profile, such as a brightness profile or RGB profile, is calculated for each of the sections L2 based on pixel information included in the corresponding section L2, using any one of the above-described methods. At the same time, a reference background color of the scanned image 40, such as a reference brightness value or RGB value, is calculated based on information obtained from the entire scanned image 40, using any one of the above-described methods. The background color in each of the sections L2 is corrected, using the color profile of the corresponding section L2 and the reference background color.
In another example, as illustrated in
In this exemplary embodiment, a color profile, such as a brightness profile or RGB profile, is calculated for each of the sections L3 based on pixel information included in the corresponding section L3, using any one of the above-described methods. At the same time, a reference background color of the scanned image 40, such as a reference brightness value or RGB value, is calculated based on information obtained from the entire scanned image 40, using any one of the above-described methods. The background color in each of the sections L3 is corrected, using the color profile of the corresponding section L3 and the reference background color.
Referring now to
The operation shown in
Step S103 detects the location corresponding to the bound boundary 41 in the scanned image 40, using the RGB data obtained in the previous step. Exemplary operations of detecting the location corresponding to the bound boundary 41 are described, for example, in U.S. patent application Ser. No. 10/227,743, filed on Aug. 26, 2003, U.S. patent application Ser. No. 11/054,396, filed on Feb. 10, 2005, and U.S. Patent Application Publication No. 2003/0198398, published on Oct. 23, 2003.
Step S104 corrects skew of the scanned image 40, if the detected boundary portion 41 is not parallel to the main scanning direction X.
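For illustration only, given a boundary angle detected by one of the methods cited in Step S103, the skew correction of Step S104 could reduce to a simple rotation; the use of scipy here, and the assumption that the detected angle is available in degrees, are choices of this sketch.

```python
from scipy import ndimage

def deskew(image, boundary_angle_deg):
    """Step S104 sketch: rotate the scanned image so that the detected
    bound boundary 41 becomes parallel to the main scanning direction X.
    Detecting boundary_angle_deg is outside the scope of this sketch."""
    return ndimage.rotate(image, -boundary_angle_deg,
                          reshape=False, mode='nearest')
```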
In the operation illustrated in
Further, the RGB data need not be converted to the HSV data, as long as a reference brightness value can be calculated. If the RGB data is not converted to the HSV data, and the saturation value therefore cannot be obtained, the background color of the scanned image 40 is corrected without using the saturation value.
Numerous additional modifications and variations are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the disclosure of this patent specification may be practiced otherwise than as specifically described herein.
For example, the scanned image 40 need not be a color image as described in any one of the above operations. If the scanned image 40 is a grayscale image, an intensity value of each pixel is obtained as color information of the scanned image 40, which can be used to calculate a color profile or a reference background color.
Further, the placement of the book document is not limited to the above-described exemplary case shown in
Furthermore, any one of the above-described and other operations performed by the main controller 19 may be performed by one or more conventional general-purpose microprocessors and/or signal processors. Appropriate software coding can readily be prepared by skilled programmers based on the teachings of this disclosure or the appended claims.
Alternatively, any one of the above-described and other operations performed by the main controller 19 may be performed by an ASIC (application-specific integrated circuit), prepared by interconnecting an appropriate network of conventional component circuits, or by a combination thereof with one or more conventional general-purpose microprocessors and/or signal processors programmed accordingly. For example, the image processor 20 of
Furthermore, the scanner 1 may have a structure different from the structure described with reference to
Furthermore, the background color correction function of the present invention may be performed by a device other than the scanner 1. In one example, the scanner 1 may be connected to any kind of general-purpose computer. The scanner 1 sends image data read from an original to the computer. The computer loads the program and operates at least one of the above-described and other methods according to the present invention. In another example, the computer may perform background color correction on image data, which has been stored in its storage device or received from the outside.