Method and system for improved copy quality by generating contone value based on pixel pattern and image context type around pixel of interest

Information

  • Patent Grant
  • 8023150
  • Patent Number
    8,023,150
  • Date Filed
    Monday, March 8, 2010
  • Date Issued
    Tuesday, September 20, 2011
  • US Classifications
    Field of Search
    • US
    • 358/1.9
    • 358/2.1
    • 358/3.01
    • 358/3.06
    • 358/3.08
    • 358/3.21
    • 358/3.23
    • 358/462
    • 382/173
    • 382/176
    • 382/205
    • 382/224
    • 382/229
    • 382/237
    • 382/254
    • 382/261
  • International Classifications
    • H04N1/405
    • H04N1/407
    • Disclaimer
      This patent is subject to a terminal disclaimer.
Abstract
A method and system reconstructs a contone image from a binary image by first tagging pixels to identify one of a multiplicity of image content types. The tag information and the pattern of bits surrounding the pixel to be converted to a contone value are used to reconstruct a contone image from a binary image. The pattern of bits in the neighborhood is used to generate a unique identifier. The unique identifier is used as the address into a lookup table containing the contone value to be used, wherein each lookup table corresponds to an image context type.
Description
BACKGROUND AND SUMMARY

Digital multifunction reprographic systems are well known and have replaced optical reprographic systems as a way to reproduce images. In these conventional digital multifunction reprographic systems, a scanner accepts a document to be copied and converts the document into electronic image(s). These images, usually in the form of pages, are then passed to a central control unit which may re-order or reorganize these pages and then, depending on the request of the user of the device, send the pages or images to a destination. Often this destination is an attached printing unit which makes one or more copies of the original document.


However, these conventional devices perform many other functions besides simple copying. The central control unit is usually equipped with a combination of hardware and software elements that enable it to accept input from other sources. The other sources may include some sort of network interface and/or an interface to a telephone system to enable FAX input.


The network interface is usually configured so that it can accept jobs to be printed from any computer source that is connected to the network. This configuration normally includes elements that can convert input documents formatted in one or more page description languages (PDLs) to the native format of the printing device.


An important inner component of such a conventional multifunction digital device is the image path. This is the combination of software and hardware elements that accepts the electronic images from a multiplicity of sources and performs any operations needed to convert the images to the format desired for the various output paths. The image path is usually one of the more complex and costly components of such digital multifunction devices.


The image path for a conventional multifunction device usually has several constraints. On the one hand, there is a desire to make the image path utilize data in a multi-bit per pixel format so as to provide for maximum image quality and a minimum loss of critical information in the transformation of documents from paper to electronic form. On the other hand, there are cost constraints and performance limits on the devices or software that comprise the image path.


Conventional image path electronics may also utilize binary image paths. In this situation, if the input information is scanned in a binary manner at sufficiently high resolution, the scanned image can be reconstructed at the output with little or no perceptible loss of image quality.


Another component of many conventional multifunction devices, especially those having a printing engine capable of producing colored output, is the use of analog modulation schemes for the output. In these devices, analog data, in the form of multi-bit pixels, is presented to the modulator of the output printing device. The modulator compares the analog equivalent of the input byte of data to a periodic sawtooth wave. The output therefrom is a signal to the laser imaging component that is pulse-width modulated by the data stream.
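The sawtooth comparison above amounts to pulse-width modulation: the darker the pixel, the longer the modulator output stays high within each period. The fragment below is a minimal sketch of that comparison only, not the patent's modulator circuit; the function name, the 8-bit contone range, and the normalized sawtooth are assumptions for illustration.

```python
# Sketch of pulse-width modulation by sawtooth comparison (illustrative only).

def pwm_output(contone_level, t, period=1.0):
    """Return 1 while the normalized contone level exceeds the sawtooth."""
    sawtooth = (t % period) / period            # ramps 0 -> 1 each period
    return 1 if (contone_level / 255.0) > sawtooth else 0

# A darker pixel (higher level) keeps the output high for a larger fraction
# of each period, i.e. produces a wider pulse.
samples = [pwm_output(192, t / 100.0) for t in range(100)]
print(sum(samples), "of 100 samples high")      # about three quarters are high
```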


One recent development for conventional multifunction reprographic machines is the use of both binary and analog data in the image path. In such a hybrid image path, the data from the scanner is digitized and converted to binary. All of the intermediate elements of the image path are designed to work with the compact binary data format. Only at the output is the data converted to multi-bit analog form.


One way to implement the resolution conversion is to pass the binary data through the digital equivalent of a two-dimensional low pass filter. The low pass filter may replace each pixel in the binary image by the average of the values within some window centered on the pixel of interest. While such a system does an adequate job of converting the high resolution binary data to analog data, these solutions also have the deleterious effect of smearing sharp edges in the original document. Such an effect is particularly detrimental when reproducing text and line art.
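For concreteness, a minimal software sketch of such a box-average reconstruction is shown below. It assumes a square window and an 8-bit contone output; it is not the filter actually used in the image path, and it exhibits exactly the edge smearing described above.

```python
# Minimal sketch of low-pass binary-to-contone conversion by box averaging.
# Assumptions (not from the patent): square window, 8-bit output, edge pixels
# averaged over the part of the window that lies inside the image.

def box_filter_contone(binary_image, window=5):
    """binary_image: list of rows of 0/1 pixels; returns rows of 0-255 values."""
    height, width = len(binary_image), len(binary_image[0])
    half = window // 2
    contone = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            total = count = 0
            for dy in range(-half, half + 1):
                for dx in range(-half, half + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < height and 0 <= xx < width:
                        total += binary_image[yy][xx]
                        count += 1
            # Averaging smooths halftone dots into gray, but also blurs the
            # sharp edges of text and line art -- the drawback noted above.
            contone[y][x] = round(255 * total / count)
    return contone
```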


A desirable modification to hybrid image paths would be a system wherein the conversion from binary format to analog format could take into account the existence of sharp edges in the image. Ideally, such a system would be adaptive, that is, the system would change its behavior so that it would apply a resolution conversion process appropriate to sharp edges for those parts of the image that have such edges, but use a different process that was better adapted to more continuous tone parts of the image.


Moreover, the resolution conversion process could make further distinctions in the various aspects of the image beyond a simple division into pictorial vs. text and line art. Such distinctions might include distinguishing between low and high frequency halftone content, or between pictorial and graphic arts kinds of images.


Systems that implement resolution conversion processes, like those outlined above, show significant improvement in image quality compared to systems that do not. However, such systems are subject to problems.


One such problem is the need to somehow distinguish those parts of the image that have edges from those parts of the image that do not. Various processes have been proposed to identify such regions and to develop an image parallel to that being reproduced, a tag image that identifies the different characteristics of each part of the image.


Therefore, it is desirable to implement a method of reconstructing a contone image from its halftoned counterpart that is efficient in use of image path resources and at the same time is adaptive to the different characteristics of the underlying image so as to maximize the image quality of the output.





BRIEF DESCRIPTION OF THE DRAWINGS

The drawings are only for purposes of illustrating various embodiments and are not to be construed as limiting, wherein:



FIG. 1 shows the image path for a conventional multifunction reprographic system;



FIG. 2 shows a flowchart of the calibration process;



FIG. 3 shows a 4×4 pattern generation kernel matrix;



FIG. 4 shows how the 4×4 matrix from FIG. 3 is used to generate a unique pattern identifier; and



FIG. 5 shows a circuit that uses lookup tables generated during the calibration process to generate a reconstructed contone version of a tagged binary image.





DETAILED DESCRIPTION

For a general understanding, reference is made to the drawings. In the drawings, like references have been used throughout to designate identical or equivalent elements. It is also noted that the drawings may not have been drawn to scale and that certain regions may have been purposely drawn disproportionately so that the features and concepts could be properly illustrated.



FIG. 1 shows, in schematic form, the general image path of a multifunction reprographic system. The image path is a combination of hardware and software elements that generate, process, and store the digital page images. A control system (not shown) configures each element of the image path depending on the user job. The control system also schedules the various jobs and functions of the entire system.


As illustrated in FIG. 1, digital scanner 101 accepts a hardcopy version of the page or pages to be copied and converts each page to a digital image, in contone form, at some moderately high resolution. Within the scanner 101, there are usually electronic elements that do some initial processing of the image, correcting, if needed, for any optical or illumination defects in the scanner 101.


The digital page image is then passed to a preprocessor 102 that performs further manipulations of the page image, such as editing, or tone curve correction. The preprocessor 102 converts the contone image from the scanner 101 to a binary image 109. The preprocessor 102 also can form a tag image 110. This tag image 110 can identify, at a pixel level, various characteristics of the underlying page image. For example, the tag image can indicate whether a pixel in the page image is part of a sharp edge or not. Further details of the tag image will be described below.


After the preprocessing, the page and tag images are passed through a compression circuit 103 which losslessly compresses the page and tag images to a smaller size. The compressed images are then passed to a memory 104. The memory 104 stores each page image and its associated tag image, and keeps track of all the relevant sets of images that comprise the document being copied. The memory 104 can be used for many purposes, including for example, the ability to print multiple copies of an input document with a single scan. There are many other functions of the memory 104 that are well known to those skilled in the art.


At the time of marking, the compressed page and tag image in memory are first decompressed using the decompressor circuit 105 and the resultant image and tag are passed to the binary to contone converter 106. The binary to contone converter 106 is conventionally implemented as a two-dimensional digital filter. The filter may be altered by incorporating one or more adaptive elements that are used to enhance edges that are characteristic of text and line art copy. With suitable tagging apparatus upstream, the filter may be switched between one of several modes to adapt to the content of the portion of the image being converted.


More specifically, the binary to contone converter 106 can control the behavior (from a feature/function sense) of the image path. For example, the binary to contone converter 106 may control the behavior of the image path so as to make the image look darker. On the other hand, for example, the binary to contone converter 106 may control the behavior of the image path so as to make the image look sharper.


To realize binary to contone conversion, two parameters are taken into consideration. The first parameter is directed to the pattern of pixels in the neighborhood of the pixel being converted. More specifically, a pattern of pixels in the neighborhood of the pixel being converted is identified. An example of a pattern identifier process is disclosed in U.S. Pat. No. 6,343,159. The entire content of U.S. Pat. No. 6,343,159 is hereby incorporated by reference.


Once the pattern is identified, a pattern identifier dataword is forwarded to a plurality of look-up tables, each look-up table being capable of converting the pattern identifier dataword to a contone value. These look-up tables are correlated to both the image pattern and the image content type; thus, a second parameter must be considered to choose the correct look-up table.


The second parameter is directed to the characterization of the image context in the neighborhood of the pixel being converted. As the document is scanned and converted to binary, each pixel is tagged with two or more bits of information that identify which of a predetermined number of classes of image content characterizes the image context in the neighborhood.
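As a purely hypothetical illustration of such tagging, a two-bit tag could distinguish four content classes; the particular assignment below is an assumption for illustration, not one specified by the patent.

```python
# Hypothetical 2-bit tag assignment (illustrative only; the patent does not
# prescribe a particular encoding or set of classes).
TAG_CLASSES = {
    0b00: "text / line art",
    0b01: "low-frequency halftone",
    0b10: "high-frequency halftone / pictorial",
    0b11: "graphic arts",
}
```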


An example of a segmentation process that tags the image based on image content is disclosed in U.S. Pat. No. 6,549,658. The entire content of U.S. Pat. No. 6,549,658 is hereby incorporated by reference.


The tag information is carried through the image path in parallel with the image information proper and used to control the binary to contone conversion at the output. The tag information divides up the various types of image content so that the system can adequately cover the reconstruction of all images and still realize the appropriate image quality.



FIG. 2 shows a flowchart for creating the values for the look-up tables. As illustrated in FIG. 2, initially, classes of image content are defined at step S101.


At step S102, a pattern identification matrix is chosen. The pattern identification matrix is a small rectangular window around each pixel that is used to identify the pattern of pixels around the pixel of interest. In the following discussion, the height and width of the window are N and M, respectively.


For encoding, the size of the look-up table is 2^(N×M); i.e., a 4×4 pattern identification matrix window generates a look-up table that is 65536 elements long. For each image, a pattern kernel is centered on the pixel of interest and a pattern identifier is generated. The process of generating the pattern identifier will be described in more detail below.


For each image class, a large set of test documents is assembled at step S103. The test documents may be composed of portions of actual customer images and synthetically generated images that are representative of the kind of image content likely to be encountered. Each test document contains only image content from its class.


Thus, for example, a pictorial document class would include a variety of photographs, as well as images from high quality magazine reproduction where the halftone frequency is high (e.g., above 175 dpi), but not text or line art. A document class representing text and line art would include a large variety of text in terms of fonts and sizes, as well as non-Western character sets, e.g., Japanese and Chinese. Similarly, a document class representing low frequency halftone documents would include samples from newspapers and similar kinds of documents.


In the next phase of the calibration process, at step S104, each of these test documents is scanned using a scanner. The output of this scan is the contone version of the document. This contone image is then further processed, at step S105, with a halftoning process. The result of step S105 is a binary version of the same document image. Therefore, when completed, the calibration process generates two images per document: a contone image and a binary image.


Each halftone image is scanned, pixel by pixel, at step S106, and for each pixel a pattern identifier is generated. The same generation process is used during calibration and reconstruction. Each element of the pattern kernel is identified with a power of 2, starting with 2^0=1 and going to 2^(N×M−1). There is no unique way of matching each element of the kernel with a power of 2; the matching can be chosen at random, as long as the same matching is used for calibration and reconstruction. However, it is easier to have some simple ordering of the matching, say from upper left to lower right going across the rows. FIG. 3 shows an example of a matching scheme for a 4×4 kernel 301. In each element of the kernel, the upper left-hand number is a simple index for that element, while the lower right number is the power of 2 associated with the element. In this example, the weight of the element with index i is given by W_i = 2^i.


The pattern kernel is applied to each pixel in the binary image and a pattern identifier is generated by developing an N*M bit binary number where the 1s and 0s of the binary number are determined by the image pixel underlying the corresponding pattern kernel element.


For those elements of the kernel where the corresponding pixel in the binary image is a 1, a 1 is entered into the corresponding power of 2 in the binary representation of the pattern identifier, and where the pixel is 0, a 0 is entered into the identifier. That is, the pattern identifier is generated according to the equation:

Identifier = Σ_i (w_i × p_i)


where w_i is the weight for pattern kernel element i, and p_i is the corresponding image pixel value (0 or 1).



FIG. 4 shows an example for a part of an image with the 4×4 kernel applied. Using the pattern kernel 301 and the image portion 401, the pattern identifier for this pattern is given by the binary number: 0101101001011010 or decimal 23130.
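The following sketch applies the row-major matching of FIG. 3, in which element i carries weight W_i = 2^i. The window contents are hypothetical, chosen so that the bits reproduce the FIG. 4 identifier; the actual pixel layout of FIG. 4 is not reproduced here.

```python
# Sketch of pattern-identifier generation for a 4x4 kernel (N = M = 4),
# assuming the row-major index-to-weight matching of FIG. 3 (weight 2**i).

def pattern_identifier(window):
    """window: N*M binary pixels listed in kernel-index order (i = 0, 1, ...)."""
    return sum(pixel << i for i, pixel in enumerate(window))

# Hypothetical window whose bits, read from highest weight to lowest, spell
# 0101101001011010: elements 1, 3, 4, 6, 9, 11, 12, and 14 are set to 1.
window = [1 if i in (1, 3, 4, 6, 9, 11, 12, 14) else 0 for i in range(16)]
print(pattern_identifier(window))   # 23130, matching the FIG. 4 example
```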


Using the pattern identifier, the value of the pixel in the contone image that corresponds to the pixel in the center of the pattern kernel is selected. Using this value, the calibration program keeps a running average value of the contone value for all the pixels with the same pattern identifier, at step S107. As noted above, for a 4×4 window, there are potentially 65536 different pattern identifiers.


The process continues for all images of this particular image content type, at steps S108 and S111. After all the test images for a particular class of image content have been processed in this way, a table of the average value of the contone pixels for each unique pattern identifier has been created. This average value will be the entry in the look-up table for the reconstruction process. The look-up table is simply a 65536-byte table whose entries are the average contone pixel value for each pattern identifier, at step S109.
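A software sketch of this averaging step is given below. The data structures are assumptions for illustration: the binary and contone images are aligned lists of rows, and extract_window is assumed to return the N×M kernel pixels in index order for a given pixel position.

```python
# Sketch of the calibration averaging (steps S106-S109). Names and data
# structures are illustrative assumptions, not taken from the patent.

def build_lookup_table(binary_image, contone_image, extract_window, n=4, m=4):
    """Return a 2**(n*m)-entry table of average contone values per identifier."""
    size = 1 << (n * m)                       # 65536 entries for a 4x4 kernel
    totals = [0] * size
    counts = [0] * size
    for y in range(len(binary_image)):
        for x in range(len(binary_image[0])):
            window = extract_window(binary_image, x, y)
            identifier = sum(pixel << i for i, pixel in enumerate(window))
            totals[identifier] += contone_image[y][x]   # running sum per pattern
            counts[identifier] += 1
    # Average contone value for every pattern identifier actually observed;
    # unseen identifiers default to 0 here (the patent does not specify this).
    return [round(t / c) if c else 0 for t, c in zip(totals, counts)]
```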


The process is repeated for each class of image content, at steps S110 and S112. After the above calibration process has been carried out for each class of image content, a number of separate look-up tables are generated, one for each content type.


As discussed above, each image pixel is input to the binary to contone reconstruction element of the image path along with its corresponding tag bits. The binary to contone element keeps a memory of a few scanlines and uses the scanlines to reconstruct, for each pixel, the pattern identifier using the same process as was used in the above-described calibration process. This pattern identifier is the address for the look-up tables, and the tag bits are used to choose which look-up table is actually used for generating the output. FIG. 5 shows schematically a system that carries out the binary to contone reconstruction.


In FIG. 5, the binary image data stream is input to a pattern identifier circuit 501 that computes the pattern identifier, using the pattern kernel 502. The output of the pattern identifier circuit 501 is an N*M bit number that is input to the look-up tables, 503, 504, and 505. These look-up tables can be implemented as simple memory chips or equivalent circuit elements, or the look-up tables may be part of a larger ASIC or similar complex programmable device. The look-up tables may be hard coded, permanently programmed for example as ROM memory, or the look-up tables may be programmable to allow for changes.


The tag data stream, suitably delayed to account for any delays in computing the pattern identifier, is input to a multiplexer 506. The multiplexer 506 accepts the tag data input and activates the one of its outputs that corresponds to the binary number represented by the tag. These outputs are each connected to the output enable control of one of the look-up tables, 503, 504, and 505. Thus, the look-up table that has been loaded with the contone value that corresponds to the pattern for the image type is enabled and its value is output where it is used to modulate the output printing mechanism.
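In software terms, the selection performed by the multiplexer and the output enables can be sketched as follows; this is only an analogue of FIG. 5, not the circuit itself, with lookup_tables assumed to map each tag value to a table produced by the calibration process.

```python
# Software analogue of the FIG. 5 selection logic (a sketch, not the circuit).
# lookup_tables is assumed to map each tag value to a 2**(N*M)-entry table.

def reconstruct_pixel(window, tag, lookup_tables):
    """window: kernel pixels in index order; tag: image-content class bits."""
    identifier = sum(pixel << i for i, pixel in enumerate(window))
    return lookup_tables[tag][identifier]   # the "enabled" table drives the output
```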


In summary, the contone restoration processes, described above, control the behavior (from a feature/function sense) of the image path. For example, the contone restoration processes, described above, may control the behavior of the image path so as to make the image look darker. On the other hand, for example, the contone restoration processes, described above, may control the behavior of the image path so as to make the image look sharper.


It will be appreciated that various of the above-disclosed and other features and functions, or alternatives thereof, may be desirably combined into many other different systems or applications. Also, various presently unforeseen or unanticipated alternatives, modifications, variations or improvements therein may be subsequently made by those skilled in the art, which are also intended to be encompassed by the following claims.

Claims
  • 1. A system to generate a continuous tone image from tagged binary encoded image, comprising: a pattern identifier generator to generate a pattern identifier value based upon a pattern of pixels around a pixel of interest; an image type selector to generate an image type signal identifying an image context type in a pre-determined neighborhood of pixels around a pixel of interest; and a contone value generator, operatively connected to said pattern identifier generator and said image type selector to receive said pattern identifier value and said image type signal, to generate a contone value from the received pattern identifier value based upon the received image type signal.
  • 2. The system as claimed in claim 1, wherein said contone value generator includes a plurality of look-up tables, each look-up table having a plurality of contone values stored therein, each contone value corresponding to a pattern identifier value, each look-up table corresponding to an image context type.
  • 3. The system as claimed in claim 2, wherein said image type signal enables a look-up table associated with the image context type corresponding to the received image type signal.
  • 4. The system as claimed in claim 1, wherein said contone value generator includes a memory having a plurality of contone values stored therein, each combination of a pattern identifier value and an image context type being mapped to a stored contone value.
  • 5. The system as claimed in claim 1, wherein said contone value generator includes a logic circuit to generate a plurality of contone values, the generated contone value corresponding to a pattern identifier value and an image context type.
  • 6. A method to generate a continuous tone image from tagged binary encoded image, comprising: electronically generating, using an electronic device, a pattern identifier value based upon a pattern of pixels around a pixel of interest; electronically generating, using an electronic device, an image type signal identifying an image context type in a pre-determined neighborhood of pixels around a pixel of interest; and electronically generating, using an electronic device, in response to receiving said pattern identifier value and said image type signal, a contone value from the received pattern identifier value based upon the received image type signal.
PRIORITY INFORMATION

This application is a divisional application of co-pending U.S. patent application Ser. No. 11/272,182, filed on Nov. 10, 2005. This application claims priority, under 35 U.S.C. §120, from co-pending U.S. patent application Ser. No. 11/272,182, filed on Nov. 10, 2005. The entire content of co-pending U.S. patent application Ser. No. 11/272,182, filed on Nov. 10, 2005 is hereby incorporated by reference.

US Referenced Citations (74)
Number Name Date Kind
4958236 Nagashima et al. Sep 1990 A
5008950 Katayama et al. Apr 1991 A
5065255 Kimura et al. Nov 1991 A
5293430 Shiau et al. Mar 1994 A
5323232 Otaka et al. Jun 1994 A
5347599 Yamashita et al. Sep 1994 A
5572606 Tanioka Nov 1996 A
5617216 Wada Apr 1997 A
5617459 Makram-Ebeid et al. Apr 1997 A
5754710 Sekine et al. May 1998 A
5818964 Itoh Oct 1998 A
5850474 Fan et al. Dec 1998 A
5959290 Schweid et al. Sep 1999 A
6020979 Zeck et al. Feb 2000 A
6130966 Sekine et al. Oct 2000 A
6229578 Acharya et al. May 2001 B1
6240205 Fan et al. May 2001 B1
6259823 Lee et al. Jul 2001 B1
6275303 Fukaya Aug 2001 B1
6282325 Han Aug 2001 B1
6285464 Katayama et al. Sep 2001 B1
6343159 Cuciurean-Zapan Jan 2002 B1
6427030 Williams et al. Jul 2002 B1
6477282 Ohtsuki et al. Nov 2002 B1
6594401 Metcalfe et al. Jul 2003 B1
6606420 Loce et al. Aug 2003 B1
6608701 Loce et al. Aug 2003 B1
6683702 Loce et al. Jan 2004 B1
6771832 Naito et al. Aug 2004 B1
6873437 Kuwahara et al. Mar 2005 B1
6920252 Rouvellou Jul 2005 B2
6975434 Pilu et al. Dec 2005 B1
7039232 Nagarajan May 2006 B2
7043080 Dolan May 2006 B1
7079289 Loce et al. Jul 2006 B2
7352490 Tse et al. Apr 2008 B1
7372992 Ohshita May 2008 B2
7440139 Loce et al. Oct 2008 B2
7460276 Xu et al. Dec 2008 B2
7580569 Tse et al. Aug 2009 B2
7773254 Nagarajan et al. Aug 2010 B2
20020126912 Rouvellou Sep 2002 A1
20020140983 Shimizu Oct 2002 A1
20020159096 Sun et al. Oct 2002 A1
20020181797 Young Dec 2002 A1
20020191857 Macy Dec 2002 A1
20020196467 Delhoune et al. Dec 2002 A1
20030007687 Nesterov et al. Jan 2003 A1
20030043210 Hanks Mar 2003 A1
20030090729 Loce et al. May 2003 A1
20030091222 Young et al. May 2003 A1
20030133610 Nagarajan Jul 2003 A1
20030193680 Karidi Oct 2003 A1
20040066538 Rozzi Apr 2004 A1
20040114814 Boliek et al. Jun 2004 A1
20040175037 Guleryuz Sep 2004 A1
20050163374 Ferman et al. Jul 2005 A1
20050206948 Uejo Sep 2005 A1
20050259886 Shan Nov 2005 A1
20050270582 Hara Dec 2005 A1
20060077489 Zhang et al. Apr 2006 A1
20060115182 Deng et al. Jun 2006 A1
20060132847 Xu et al. Jun 2006 A1
20060132850 Banton et al. Jun 2006 A1
20060232798 Xu et al. Oct 2006 A1
20060257045 McCandlish Nov 2006 A1
20070053003 Loce et al. Mar 2007 A1
20070103731 Tse et al. May 2007 A1
20070109602 Tse May 2007 A1
20070172148 Hawley Jul 2007 A1
20070172149 Cuciurean-Zapan Jul 2007 A1
20070258101 Nagarajan et al. Nov 2007 A1
20080049238 Nagarajan et al. Feb 2008 A1
20100046856 Bai et al. Feb 2010 A1
Foreign Referenced Citations (5)
Number Date Country
1583064 Oct 2005 EP
1601184 Nov 2005 EP
2291308 Jan 1996 GB
09051431 Feb 1997 JP
WO9930547 Jun 1999 WO
Related Publications (1)
Number Date Country
20100157374 A1 Jun 2010 US
Divisions (1)
Number Date Country
Parent 11272182 Nov 2005 US
Child 12719233 US