The present invention relates generally to image processing and, more particularly, to a system and method for generating black and white reproductions of color documents.
In a conventional black and white (B/W) digital copier, the scanner portion has a monochrome charge coupled device (CCD) sensor. When a color document is copied, some parts of the document are illegible because the color document is scanned by the monochrome CCD sensor. Similarly, because text of one color cannot always be distinguished from text or a background of a different color, color text printed on a color background may be unreadable.
A color is characterized by three components: lightness, hue and saturation, and of these the monochrome CCD sensor detects only lightness. Because it detects only the lightness component, the monochrome CCD sensor either cannot read, or has difficulty reading, color portions of a document, such as color text or images. In particular, a color document cannot be faithfully reproduced from a lightness signal alone.
With the overall cost of color printers steadily dropping, there has been a corresponding increase in the generation and use of color documents. As a result, a greater percentage of the documents being copied are color documents. Nonetheless, generating color copies of color documents remains relatively costly. To reduce cost, B/W copies are typically made of the color documents. The B/W copies, however, suffer from the reproduction problems described above.
Accordingly, it would be desirable to be able to reproduce more accurate B/W copies of color documents.
Briefly, in one aspect of the invention, a system and method for forming a monochromatic reproduction of a color document using a color CCD sensor includes scanning the color document using the color CCD sensor, generating image data from the scanned color document, and detecting color text data in the image data, the color text data representing a color text portion. A density of the color text data in the image data is adjusted, and the monochromatic reproduction of the color document is formed based on the image data, including the adjusted density of the color text data.
Further features, aspects and advantages of the present invention will become apparent from the detailed description of preferred embodiments that follows, when considered together with the accompanying figures of drawing.
The color CCD 10 may be a 3-line sensor having red, green and blue sensors, or a 4-line CCD sensor, which further includes a monochromatic sensor. The color CCD 10 is configured to detect an original image scanned by a scanner (not shown) in the image reproduction system. The scanner scans light across a surface of the original image, and the light reflected by the original image is directed to impinge on the color CCD 10. In response to the detection of the scanned original image, the color CCD 10 generates image data representing the scanned original image. The image data can be represented as RGB data, which correspond to red data, green data and blue data. If the color CCD 10 is a 4-line CCD sensor, then the image data can also include K data, which corresponds to black data.
The segmentation unit 12 receives the image data generated by the color CCD 10. The segmentation unit 12 analyzes the image data and segments the data into identifiable regions. An identifiable region can be a text region (or portion), a background region, a graphical region or a picture region. The process for segmenting image data into identifiable regions is well known. For example, U.S. Pat. No. 5,687,252 to Hiroki Kanno, the entire contents of which are incorporated by reference, describes a segmenting process that may be carried out by the segmentation unit 12. The segmentation unit 12 provides control signals to the density converter 20 and the switch 22. The control signals provide information regarding how the density converter 20 and the switch 22 should operate. The segmentation unit 12 can be implemented as hardware, such as an ASIC or other processing circuit, as software, or as some combination thereof.
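By way of illustration only, the following Python sketch shows the kind of per-pixel output the segmentation unit 12 can hand to the downstream components. The label values, the function name and the lightness thresholds are assumptions made for this sketch; an actual segmenter, such as the one described in U.S. Pat. No. 5,687,252, is considerably more elaborate.

import numpy as np

# Assumed per-pixel region labels standing in for the control signals
# that the segmentation unit sends to the density converter and switch.
TEXT, BACKGROUND, GRAPHIC, PICTURE = 0, 1, 2, 3

def segment(rgb):
    """Toy segmentation of an H x W x 3 uint8 RGB image.

    Dark pixels are labeled TEXT, near-white pixels BACKGROUND, and
    everything else GRAPHIC; PICTURE regions are not detected here.
    The point is only the output format: one region label per pixel.
    """
    lightness = rgb.astype(np.float32).mean(axis=2)
    labels = np.full(lightness.shape, GRAPHIC, dtype=np.uint8)
    labels[lightness > 240] = BACKGROUND
    labels[lightness < 80] = TEXT
    return labels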
The color converter 14 converts the RGB data from the color CCD 10 into cyan, magenta and yellow (CMY) data. If the color CCD 10 does not provide K data, the color converter 14 also generates K data from the received RGB data. Like the segmentation unit 12, the color converter 14 can be implemented as hardware, such as an ASIC or other processing circuit, as software, or as some combination thereof.
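One common way to derive CMY(K) data from RGB data, offered here only as a sketch and not necessarily the method used by the color converter 14, is to complement the RGB channels and then extract the shared gray component as K:

import numpy as np

def rgb_to_cmyk(rgb):
    """Convert an H x W x 3 uint8 RGB image to H x W x 4 uint8 CMYK data."""
    rgb = rgb.astype(np.float32) / 255.0
    cmy = 1.0 - rgb                           # complement each channel
    k = cmy.min(axis=2, keepdims=True)        # shared gray component becomes K
    denom = np.where(k == 1.0, 1.0, 1.0 - k)  # avoid dividing by zero on pure black
    cmy = (cmy - k) / denom                   # remove the K component from C, M, Y
    cmyk = np.concatenate([cmy, k], axis=2)
    return (cmyk * 255.0).round().astype(np.uint8)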
The color to B/W converter 16 converts the image data from the color CCD 10 into monochromatic data. The converter 16 can use any of the conventional algorithms, well known to one skilled in the art, that transform image data, such as RGB data, into monochromatic data. For example, “PostScript Language Reference, third edition,” published by Adobe Systems Inc., describes an example of such a transformation on pages 474-475. Instead of having three values for each pixel in the color data (four values if the color CCD also generates K data), the monochromatic data has only a single value for each pixel. The value of each pixel in the monochromatic data corresponds to a density value of that pixel. Typically, the density value is an eight-bit value between 0 and 255, where 0 is the lowest density value and 255 is the highest density value. This scale of density values for the monochromatic data is referred to as a gray scale. The converter 16 can be implemented as hardware, such as an ASIC or other processing circuit, as software, or as some combination thereof.
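As a concrete illustration of such a transformation, the sketch below applies the luminance weighting given in the PostScript Language Reference (gray = 0.3R + 0.59G + 0.11B) and then inverts the result so that, on the gray scale described above, 0 is the lowest density and 255 the highest. The converter 16 may, of course, use a different transformation.

import numpy as np

def rgb_to_density(rgb):
    """Convert an H x W x 3 uint8 RGB image to an H x W uint8 density map,
    where 0 is the lowest density (white) and 255 the highest (black)."""
    rgb = rgb.astype(np.float32)
    gray = 0.3 * rgb[..., 0] + 0.59 * rgb[..., 1] + 0.11 * rgb[..., 2]
    density = 255.0 - gray                    # invert: darker pixel -> higher density
    return np.clip(density, 0, 255).round().astype(np.uint8)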
The edge detector 18 is configured to examine the image data and identify edge portions of graphical regions in the image data. The edge portions correspond to the perimeter of the graphical region. For example, if the graphical region is square shaped, the edge portions of the graphical region would correspond to the four sides of the graphical region that define the perimeter of the graphical region. The edge detector 18 can make the identification of edge portions using information from the segmentation unit 12, which identifies data corresponding to a graphical region, or independently of the identification by the segmentation unit 12. It is also possible for the edge detector 18 and the segmentation unit 12 to be implemented as part of the same single unit. In addition to identifying the edge portions of the graphical region, the edge detector 18 can alter the density of the pixels in the edge portions to have a high density value. The edge detector 18 can be implemented as hardware, such as an ASIC or other processing circuit, as software, or as some combination thereof.
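A minimal way to identify such edge portions from a per-pixel region labeling, offered as a sketch rather than the edge detector 18's actual algorithm, is to mark every pixel of a graphical region that has at least one 4-connected neighbor outside the region:

import numpy as np

def region_edges(labels, region_value):
    """Return a boolean H x W mask of the perimeter pixels of one region.

    A pixel is an edge pixel if it carries the given label but at least one
    of its 4-connected neighbors does not (or lies outside the image)."""
    mask = labels == region_value
    padded = np.pad(mask, 1, mode="constant", constant_values=False)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    return mask & ~interior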
The density converter 20 adjusts the density values of the pixels of the image data output from the converter 16. As described above, each pixel output from the converter 16 corresponds to a monochromatic value. The density converter 20 can adjust the density values to increase or decrease the density. The adjustment of the density value is made in accordance with a control signal from the segmentation unit 12. The particular adjustment of the density value depends upon the region in which the pixel is present, as will be described in more detail below. Like the previously described components, the density converter 20 can be implemented as hardware, such as an ASIC or other processing circuit, as software, or as some combination thereof.
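Pulling these pieces together, a sketch of the kind of region-driven adjustment performed by the density converter 20 might look as follows. The adjustment rules (text raised to the highest density, backgrounds lowered to the lowest density, graphical regions reduced to a high-density outline) follow the behavior described later in this section; the label constants and the edge mask reuse the assumed names from the sketches above, and the real converter may apply different rules.

import numpy as np

# Assumed region labels, matching the segmentation sketch above.
TEXT, BACKGROUND, GRAPHIC, PICTURE = 0, 1, 2, 3

def adjust_density(density, labels, graphic_edges):
    """Adjust an H x W uint8 density map according to per-pixel region labels
    and a boolean mask of graphical-region edge pixels."""
    out = density.copy()
    out[labels == TEXT] = 255         # reproduce color text at the highest density
    out[labels == BACKGROUND] = 0     # drop colored backgrounds to the lowest density
    out[labels == GRAPHIC] = 0        # clear graphic interiors to the lowest density...
    out[graphic_edges] = 255          # ...while keeping their outlines at the highest
    return out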
The switch 22 is coupled to receive the image data output from each of the color converter 14, the density converter 20 and the edge detector 18. The switch 22 also receives a control signal from the segmentation unit 12, which controls which input to the switch 22 will pass to provide a particular pixel of the image data to the image processing unit 24. If the edge detector 18 is configured to control the density converter 20, or is implemented as part of the segmentation unit 12, then the switch 22 may be configured to switch only between the outputs of the color converter 14 and the density converter 20. The switch 22 can be implemented as a hardware or software switch that is capable of providing a selected pixel to the image processing unit 24.
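The routing performed by the switch 22 can likewise be sketched as a per-pixel selection driven by the region labels. Which upstream output the switch actually forwards for each region type is governed by the segmentation unit's control signal and is not spelled out in this passage, so the rule below (picture regions taken from the color converter's K plane, all other regions from the density converter) is an assumption made purely for illustration.

import numpy as np

# Assumed region labels, matching the segmentation sketch above.
TEXT, BACKGROUND, GRAPHIC, PICTURE = 0, 1, 2, 3

def switch_pixels(labels, density_path, color_path_k):
    """Select, pixel by pixel, between the density converter's output and the
    K plane of the color converter's output (hypothetical routing rule)."""
    out = density_path.copy()
    picture = labels == PICTURE
    out[picture] = color_path_k[picture]
    return out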
The image processing unit 24 is configured to perform one or more image processing functions on the image data received from the switch 22. The image processing functions include, for example, filtering, smoothing, dithering, halftone processing, error diffusion, gamma correction, or another function that alters the image data to improve the reproduction of the original image. The image processing unit 24 can be implemented as hardware, such as an ASIC or other processing circuit, as software, or as some combination thereof.
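As one small example of such a function, a gamma correction can be applied to the density data before printing; the function below and its exponent are illustrative only and are not parameters taken from the image processing unit 24.

import numpy as np

def gamma_correct(density, gamma=2.2):
    """Apply a simple gamma correction to an H x W uint8 density map."""
    normalized = density.astype(np.float32) / 255.0
    corrected = normalized ** (1.0 / gamma)
    return (corrected * 255.0).round().astype(np.uint8)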
The printer 26 receives the image data from the image processing unit 24 and converts the image data into a printer format for printing, such as raster image data. The printer 26 uses the raster image data to generate a reproduction of the original image.
The color CCD 10 generates image data from the scanned original image (step 204). The image data generated by the color CCD 10 is either RGB data or RGB and K data, depending upon the implementation of the color CCD 10.
The segmentation unit 12 receives the image data generated by the color CCD 10 and segments the received image data into various regions (step 206). As described above, the regions can be, for example, text regions, background regions, graphical regions and picture regions. For each region, the segmentation unit 12 can identify each pixel within a particular region and generate a corresponding control signal, which identifies the particular pixel as being in a particular type of region.
The image data generated by the color CCD 10 is also received by the color to B/W converter 16, which converts the received image data into monochromatic data (step 208). More particularly, each pixel of the received image data is converted from the multiple color pixel value, i.e., RGB or RGBK, into K data only. Each pixel of the K data output from the converter 16 can be an eight-bit value between 0 and 255, each value corresponding to a particular density.
Each pixel of the image data output from the converter 16 is received by the density converter 20, which adjusts the density value of the received pixel in accordance with the control signal from the segmentation unit 12 (step 210). The manner in which the density of the received pixel is adjusted will be explained in conjunction with
Based on the control signal received from the segmentation unit 12, the density converter 20 determines whether the received pixel is in a text region (step 304). If so, the density converter 20 adjusts the density of the received pixel of the image data (step 306). The density adjustment for the text region will be explained in conjunction with
In a conventional system, when reproducing in monochrome an original image having text of various colors, each text region is reproduced in black, but at various densities. The black text of the original image is reproduced with approximately the same density as the original image. The other text colors, however, such as pink and blue, are reproduced faintly, particularly in comparison to the black text. Since the density converter 20 of the image reproduction system of
Returning to
In a conventional system, when reproducing in monochrome an original image having text of various colors that is located in backgrounds of various colors, each text region and background region is reproduced in various densities of black. Because the background regions are also reproduced in black, the text present in the background region may be partially or completely obscured. Since the density converter 20 of the image reproduction system of
When reproducing the text with the highest density and the background with the lowest density, all information about the background of the original image is lost. As a result, the reproduced image of
Returning again to
In a conventional system, when reproducing in monochrome an original image having text of various colors that is located in graphical regions of various colors, each text region and graphical region is reproduced in various densities of black. Because the graphical regions are also reproduced in black, the text present in the graphical region can be partially or completely obscured. Since the density converter 20 of the image reproduction system of
Further, the density of the pixels in the graphical regions is also adjusted. The adjustment of the density of the pixels in the graphical regions will be explained in conjunction with
The density of the edge portion of the graphical region is then adjusted (step 404). In particular, the pixels in a detected edge portion are adjusted to have the highest density, 255, so that the edge portions 713, 716, 719 in
In addition, the density of the interior portion of the graphical region is adjusted (step 406). In contrast to the edge portion, the density of the pixels in the interior portion of the graphical region is adjusted to have the lowest density, 0, to reproduce the interior portions 712, 715, 718 in
Returning once more to
Finally, returning to
The pixels selected by the switch 22 are provided to the image processing unit 24, which applies image processing functions (step 212). As described above, one or more image processing functions can be applied to the image data received by the image processing unit 24 to improve the quality of the reproduced image. After applying the image processing functions, the image is reproduced by the printer 26 (step 214).
The foregoing description of a preferred embodiment of the invention has been presented for purposes of illustration and description. Of course, the various steps of detecting text, background, graphical and picture regions can be performed in any order. The description is not intended to be exhaustive or to limit the invention to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the invention. The embodiments were chosen and described in order to explain the principles of the invention and its practical application, so as to enable one skilled in the art to utilize the invention in various other embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims appended hereto and their equivalents.