Patent Application 20030025926

Publication Number: 20030025926
Date Filed: August 06, 2001
Date Published: February 06, 2003
Abstract
In an image processing apparatus, first image data and first discrimination data are generated on the basis of page information described in a page description language. Second discrimination data is generated on the basis of the generated first image data and first discrimination data. Second image data is generated on the basis of the generated second discrimination data and the first image data. Image processing is performed on the basis of the generated second image data and second discrimination data, and the processed image is transferred onto a printing medium.
Description
BACKGROUND OF THE INVENTION
[0001] The present invention relates to an image processing apparatus and an image processing method, and more particularly to an image processing apparatus and an image processing method for outputting, with high image quality, page description information formed by an apparatus such as a personal computer.
[0002] In general, when page description information such as DTP data formed on a personal computer is to be output, the data is sent via a printer controller to an image output apparatus such as a printer or an MFP. The printer controller receives the page description information and develops it into image data comprising pixel arrays of four colors, cyan, magenta, yellow and black, which represent ink amounts. The printer controller not only performs the development into image data but also produces discrimination data representative of attributes of the respective pixels of the image data.
[0003] For example, Jpn. Pat. Appln. KOKAI Publication No. 9-282472 discloses a technique wherein discrimination signals representing characters or other given attributes, as well as image data, are produced and transmitted, and the image data is subjected to an image process corresponding to the discrimination signals in an image output apparatus. Thereby, where image data includes character information, an image process for preventing degradation in character quality, for example, is performed before the data is output from the image output apparatus.
[0004] On the other hand, Jpn. Pat. Appln. KOKAI Publication No. 2000-270213 discloses a technique wherein generated discrimination data is converted to data representing correspondency with image data, thereby reducing the memory capacity needed for storing the discrimination data.
[0005] In the technique disclosed in the above-mentioned Jpn. Pat. Appln. KOKAI Publication No. 9-282472, however, image development means (i.e. printer controller) simultaneously produces image data and discrimination data on the basis of page description information, and the image data is output from an image forming apparatus capable of switching image processes according to the discrimination data. In this case, an ordinary printer controller is unable to generate desired discrimination data, and thus the printer controller is limited to a specific type.
[0006] Moreover, when an ordinary printer controller is used, image data matching the characteristics of the output apparatus is not necessarily produced. For example, in the case of a color image in which black characters are written on a colored background, the image data is ordinarily produced such that the black character portion is written in black alone, with no information on the color of the background. If this image data is output as such from a printer and an error occurs in the print position between black ink and color ink, a colorless portion forms around each character and the image quality deteriorates.
BRIEF SUMMARY OF THE INVENTION
[0007] The object of the present invention is to provide an image processing apparatus and an image processing method capable of performing high-quality image processing matched to the output characteristics of a printer, even in a case where an ordinary printer controller is used.
[0008] In order to achieve the object, the present invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data and the first discrimination data generated by the image development means; image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
[0009] The invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means; image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the first discrimination data generated by the image development means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
[0010] The invention provides an image processing apparatus comprising: image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means or using the first image data and the first discrimination data; image processing means for subjecting the first image data generated by the image development means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means; and image output means for outputting image data processed by the image processing means.
[0011] The invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; setting means for setting, as desired, the type of attributes represented by the first discrimination data input by the input means; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the type of attributes set by the setting means and the first image data and the first discrimination data input by the input means; image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
[0012] The invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means; setting means for setting, as desired, the type of attributes represented by the first discrimination data input by the input means; image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
[0013] The invention provides an image processing apparatus comprising: input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means; setting means for setting, as desired, the type of attributes represented by the first discrimination data input by the input means; image processing means for subjecting the first image data input by the input means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
[0014] The invention provides an image processing method for image-processing information described in a page description language, and outputting an image, comprising: generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of the information described in the page description language; generating second discrimination data different from the first discrimination data, using the generated first image data and first discrimination data; generating second image data by correcting the generated first image data on the basis of the generated second discrimination data; subjecting the generated second image data to a predetermined process on the basis of the generated second discrimination data; and outputting processed image data.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING
[0015]
FIG. 1 is a block diagram showing the structure of an image processing apparatus according to a first embodiment of the present invention;
[0016]
FIG. 2 shows an example of the structure of image development means;
[0017]
FIG. 3 shows an example of the structure of discrimination data generating means;
[0018]
FIG. 4 shows an example of the structure of an edge detection section in the discrimination data generating means;
[0019]
FIG. 5 shows an example of the structure of a color detection section in the discrimination data generating means;
[0020]
FIG. 6 shows an example of the structure of a synthetic determination section in the discrimination data generating means;
[0021]
FIG. 7 shows an example of conversion by a converter;
[0022]
FIG. 8 shows an example of the structure of image data generating means;
[0023]
FIG. 9A shows an example of first image data;
[0024]
FIG. 9B shows an example of second image data in a case where an output value of the first image data has been replaced;
[0025]
FIG. 10 is a view for describing a smoothing process;
[0026]
FIG. 11 shows an example of the structure of image processing means;
[0027]
FIG. 12 shows an example of a correction table;
[0028]
FIG. 13 shows an example of the correction table;
[0029]
FIG. 14 shows an example of the correction table;
[0030]
FIG. 15 shows an example of the correction table;
[0031]
FIG. 16 shows an example of the correction table;
[0032]
FIG. 17 shows an example of the correction table;
[0033]
FIG. 18 shows an example of the correction table;
[0034]
FIG. 19 shows an example of the correction table;
[0035]
FIG. 20 is a block diagram showing the structure of an image processing apparatus according to a second embodiment;
[0036]
FIG. 21 is a block diagram showing the structure of an image processing apparatus according to a third embodiment;
[0037]
FIG. 22 is a block diagram showing the structure of an image processing apparatus according to a fourth embodiment;
[0038]
FIG. 23 is a block diagram showing the structure of an image processing apparatus according to a fifth embodiment; and
[0039]
FIG. 24 is a block diagram showing the structure of an image processing apparatus according to a sixth embodiment.
DETAILED DESCRIPTION OF THE INVENTION
[0040] Embodiments of the present invention will now be described with reference to the accompanying drawings.
[0041]
FIG. 1 is a block diagram showing the structure of an image processing apparatus 1 according to a first embodiment of the present invention. This image processing apparatus 1 is usually called a printer. The apparatus receives document data, etc. produced by a personal computer via a network or the like, generates image data comprising toner amount information, and transfers toner onto paper, thus performing image formation.
[0042] The image processing apparatus 1 comprises image development means (controller unit) 11, discrimination data generating means 12, image data generating means 13, image processing means 14, and image output means (printer) 15.
[0043] The image development means 11 receives DTP (Desk Top Publishing) data formed on a personal computer or document data of a word processor, etc. as page information described in a page description language (PDL). The image development means 11 develops the received data to first image data as bit map data and to first discrimination data representative of attributes of each pixel.
[0044] The page information contains characters as font data, figures as line description data or painted-out region data, and others as ordinary raster image data. When the page information is output as a print image, it is necessary to develop all data as the same bit map data.
[0045] In addition, it is necessary to develop the attribute data to pixel-by-pixel discrimination data so that the image processing means 14 may perform an appropriate image quality enhancing process in accordance with attributes of image data.
[0046] Alternatively, the image processing apparatus may be constructed such that the image development means 11 is provided as an external element as a printer controller.
[0047] The discrimination data generating means 12 generates second discrimination data for each pixel, which is necessary for controlling the image processing means 14, on the basis of the first image data and the first discrimination data. The second discrimination data differs from the first discrimination data and corresponds to an image area discrimination signal that is commonly used in a copying machine, etc.
[0048] Accordingly, even where a scanner is connected to the image processing apparatus 1 for the purpose of use as a copying machine, the second discrimination data can be generated from the scanner input image.
[0049] It is necessary, however, to switch the method of generating the second discrimination data depending on whether the image treated as the first image data is a processed scanner input image or an image developed from the page description language information.
[0050] The image data generating means 13 corrects the first image data on the basis of the second discrimination data generated by the discrimination data generating means 12, and thus generates second image data. The correction of the image data in this context includes an over-print process and a trapping process, which compensate for the fact that a white blank portion would otherwise form between a black line and a C, M or Y color component background due to a print position error at the time of printing out, as well as a character smoothing process, etc.
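As a concrete illustration of the over-print idea, the per-pixel rule can be sketched as follows. This is a minimal sketch in Python; the function name, the 8-bit CMYK densities and the pre-computed background averages are assumptions for illustration, not details taken from the specification.

```python
def overprint_pixel(c, m, y, k, bg_c, bg_m, bg_y):
    """If the pixel is a pure-black character pixel (K only, C=M=Y=0),
    fill in the estimated background density for C, M and Y so that a
    small print-position error does not expose white paper around the
    stroke.  Densities are assumed to be 8-bit values (0..255)."""
    if k > 0 and c == 0 and m == 0 and y == 0:
        return bg_c, bg_m, bg_y, k
    return c, m, y, k
```

A pixel that already carries some color is left untouched, since filling it would visibly shift its hue.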
[0051] The image processing means 14 performs a process for emphasizing an image (in particular, a character) at the time of printing out. General methods of the process are filtering, gamma correction, etc. A filter coefficient or a gamma correction table is switched in accordance with the second discrimination data.
[0052] The image output means 15 uses output image data (corresponding to the ink amount of each color in the case of a printer) generated by the image processing means 14, and transfers ink onto a printing medium (paper, etc.).
[0053]
FIG. 2 shows an example of the structure of the image development means 11. The image development means 11 comprises a CPU 21, a RAM 22 and a page memory 23. The page information received by the image development means 11 is converted to first image data and first discrimination data by the CPU 21, which are then developed in the page memory 23 and transmitted pixel by pixel.
[0054]
FIG. 3 shows an example of the structure of the discrimination data generating means 12. The discrimination data generating means 12 comprises line buffers 31a and 31b, an edge detection section 32, a color detection section 33 and a synthetic determination section 34.
[0055] The first image data transmitted from the image development means 11 is input to the line buffer 31a of the discrimination data generating means 12. The first image data is accumulated in the line buffer 31a by several lines, thereby forming block data.
[0056] The first image data output from the line buffer 31a is sent to the edge detection section 32, and it is determined for each color component whether a center pixel (“pixel of interest”) of the block corresponds to an edge portion.
[0057] In addition, the first image data output from the line buffer 31a is sent to the color detection section 33, and it is determined based on the chroma whether the pixel of interest has an achromatic color or a chromatic color.
[0058] On the other hand, the first discrimination data transmitted from the image development means 11 is input to the line buffer 31b of the discrimination data generating means 12. The line buffer 31b is used for establishing synchronism with the first image data.
[0059] The synthetic determination section 34 outputs second discrimination data by performing synthetic determination on the basis of the edge detection result from the edge detection section 32, the determination result from the color detection section 33, and the first discrimination data synchronized by the line buffer 31b.
[0060]
FIG. 4 shows an example of the structure of the edge detection section 32 in the discrimination data generating means 12. The edge detection section 32 comprises multipliers 41a and 41b, adders 42a and 42b, positive number generators 43a and 43b, an adder 44 and a comparator 45. The edge detection section 32 is provided for each of the color component image signals C, M, Y and K of the first image data input from the line buffer 31a, and the edge detection is performed in parallel.
[0061] The multiplier 41a multiplies a 3×3 matrix of the first image data with coefficients (edge detection operators) shown in FIG. 4 by symbol A. The adder 42a adds calculated values of the multiplier 41a. The positive number generator 43a produces an absolute value of the value calculated by the adder 42a.
[0062] The multiplier 41b multiplies a 3×3 matrix of the first image data with coefficients (edge detection operators) shown in FIG. 4 by symbol B. The adder 42b adds calculated values of the multiplier 41b. The positive number generator 43b produces an absolute value of the value calculated by the adder 42b.
[0063] Subsequently, the adder 44 adds the two absolute values obtained by the positive number generators 43a and 43b. The comparator 45 compares the added value with a predetermined value, thereby determining the presence/absence of an edge.
[0064] The comparison result of the comparator 45 is output to the synthetic determination section 34 as an edge determination result EC, EM, EY, EK, in association with a color component image signal C, M, Y, K in the first image data input from the line buffer 31a.
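The FIG. 4 pipeline can be sketched per color component as below. The actual coefficients of operators A and B appear only in the figure and are not reproduced in the text, so Sobel-like horizontal and vertical gradient operators, and the threshold value, are stand-in assumptions.

```python
# Hypothetical edge-detection operators standing in for the FIG. 4
# coefficients A and B, which the text does not reproduce.
OP_A = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient
OP_B = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient

def is_edge(block, threshold=128):
    """block: 3x3 list of density values for one color component.
    Mirrors FIG. 4: multiply by each operator and accumulate (41a/41b,
    42a/42b), take absolute values (43a/43b), add the two results (44)
    and compare with a predetermined value (45)."""
    a = abs(sum(OP_A[i][j] * block[i][j] for i in range(3) for j in range(3)))
    b = abs(sum(OP_B[i][j] * block[i][j] for i in range(3) for j in range(3)))
    return a + b > threshold
```

Running this once per component yields the determination results EC, EM, EY and EK for the pixel of interest at the block center.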
[0065]
FIG. 5 shows an example of the structure of the color detection section 33 in the discrimination data generating means 12. The color detection section 33 comprises subtracters 51a, 51b and 51c, positive number generators 52a, 52b and 52c, a maximum value selector 53, a comparator 54, digitizers 55a, 55b, 55c and 55d, selectors 56a, 56b, 56c and 56d, AND gates 57a, 57b and 57c, and a NOT gate 58.
[0066] The subtracter 51a calculates a difference in density between color components (C, Y) of image signals of the first image data input from the line buffer 31a, and outputs the difference to the positive number generator 52a. The positive number generator 52a produces an absolute value of the input density difference between the color components (C, Y), and outputs the absolute value to the maximum value selector 53.
[0067] The subtracter 51b calculates a difference in density between color components (C, M) of image signals of the first image data input from the line buffer 31a, and outputs the difference to the positive number generator 52b. The positive number generator 52b produces an absolute value of the input density difference between the color components (C, M), and outputs the absolute value to the maximum value selector 53.
[0068] The subtracter 51c calculates a difference in density between color components (M, Y) of image signals of the first image data input from the line buffer 31a, and outputs the difference to the positive number generator 52c. The positive number generator 52c produces an absolute value of the input density difference between the color components (M, Y), and outputs the absolute value to the maximum value selector 53.
[0069] The maximum value selector 53 selects a maximum of the values input from the positive number generators 52a, 52b and 52c, and outputs the maximum value to the comparator 54.
[0070] The comparator 54 compares the input maximum value and a predetermined value, and determines whether the color is achromatic or chromatic.
[0071] On the other hand, the digitizer 55a digitizes the density of the color component image signal C of the first image data input from the line buffer 31a. The digitizer 55b digitizes the density of the color component image signal M of the first image data input from the line buffer 31a. The digitizer 55c digitizes the density of the color component image signal Y of the first image data input from the line buffer 31a. The digitizer 55d digitizes the density of the color component image signal K of the first image data input from the line buffer 31a.
[0072] The digitized results indicate to the synthetic determination section which color components are effective. When the digitized result of the image signal K, i.e. the output of the digitizer 55d, is "1", a black over-print process (incorporating the background density into the image data of the color components C, M and Y) may have been performed at the time of image development. Accordingly, an AND value is obtained between the digitized result of each image signal C, M, Y and the inverted digitized result of the image signal K.
[0073] Specifically, the digitized result of the digitizer 55a and an inverted value of the digitized result of the digitizer 55d are input to the AND gate 57a to produce an AND value. The digitized result of the digitizer 55b and an inverted value of the digitized result of the digitizer 55d are input to the AND gate 57b to produce an AND value. The digitized result of the digitizer 55c and an inverted value of the digitized result of the digitizer 55d are input to the AND gate 57c to produce an AND value.
[0074] The selector 56a receives the comparison result of the comparator 54 and the AND value of the AND gate 57a, selects one of them, and outputs a select result SC. The selector 56b receives the comparison result of the comparator 54 and the AND value of the AND gate 57b, selects one of them, and outputs a select result SM. The selector 56c receives the comparison result of the comparator 54 and the AND value of the AND gate 57c, selects one of them, and outputs a select result SY. The selector 56d receives the comparison result of the comparator 54, which has been inverted by the NOT gate 58, and the digitized result of the digitizer 55d, selects one of them, and outputs a select result SK.
[0075] This operation is performed since it is necessary to effect switching between the use as a copying machine and the use as a printer.
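The two paths the selectors 56a-56d switch between can be sketched for a single pixel as follows. The chroma threshold and the exact digitizing rule are illustrative assumptions; only the structure (largest pairwise C/M/Y difference as chroma, K-masked digitized outputs for printer use, chromatic/achromatic decision for copier use) follows FIG. 5.

```python
def color_detect(c, m, y, k, chroma_threshold=30, printer_mode=True):
    """Sketch of the FIG. 5 color detection section for one pixel.
    Chroma is estimated as the largest pairwise density difference
    among C, M and Y (subtracters 51a-51c, positive number generators
    52a-52c, maximum value selector 53, comparator 54)."""
    chroma = max(abs(c - y), abs(c - m), abs(m - y))
    chromatic = chroma > chroma_threshold
    if printer_mode:
        # Printer path: a color component is "effective" when its
        # density digitizes to 1 and the pixel is not a black (K)
        # over-print pixel (AND gates 57a-57c with inverted K).
        kd = 1 if k > 0 else 0
        sc = (1 if c > 0 else 0) & (1 - kd)
        sm = (1 if m > 0 else 0) & (1 - kd)
        sy = (1 if y > 0 else 0) & (1 - kd)
        sk = kd
    else:
        # Copier path: use the chromatic/achromatic decision directly;
        # SK is its inversion (NOT gate 58).
        sc = sm = sy = 1 if chromatic else 0
        sk = 0 if chromatic else 1
    return sc, sm, sy, sk
```

The `printer_mode` flag models the switching between use as a printer and use as a copying machine that the selectors provide.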
[0076]
FIG. 6 shows an example of the structure of the synthetic determination section 34 in the discrimination data generating means 12. The synthetic determination section 34 comprises converters 61a, 61b, 61c and 61d, and AND gates 62a, 62b, 62c and 62d.
[0077] Signals EC, EM, EY and EK input from the edge detection sections 32 associated with the image signals C, M, Y and K represent the edge detection results of C, M, Y and K. Signals SC, SM, SY and SK input from the color detector 33 represent the color detection results of C, M, Y and K.
[0078] The converter 61a receives the edge detection result EC from the edge detector 32 and the first discrimination data from the line buffer 31b, and outputs desired converted discrimination data based on them.
[0079] The converter 61b receives the edge detection result EM from the edge detector 32 and the first discrimination data from the line buffer 31b, and outputs desired converted discrimination data based on them.
[0080] The converter 61c receives the edge detection result EY from the edge detector 32 and the first discrimination data from the line buffer 31b, and outputs desired converted discrimination data based on them.
[0081] The converter 61d receives the edge detection result EK from the edge detector 32 and the first discrimination data from the line buffer 31b, and outputs desired converted discrimination data based on them.
[0082]
FIG. 7 shows an example of conversion by the converters 61a, 61b, 61c and 61d. In FIG. 7, the first discrimination data is classified such that a character described as font data with a predetermined size or less is “TEXT”, an object described as line description data or painted-out data and a character other than “TEXT” are “GRAPHIC”, and an object other than “TEXT” and “GRAPHIC” is “IMAGE”.
[0083] For example, when the first discrimination data is “TEXT” and the edge detection result is “EDGE”, the second discrimination data is output as “NEW-TEXT” (conversion result). When the first discrimination data is “IMAGE” and the edge detection result is “NON-EDGE”, the second discrimination data is output as “NEW-GRAPHIC” (conversion result).
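The converters 61a-61d amount to a lookup on the pair (first discrimination data, edge detection result). A sketch of such a table follows; only the two combinations named in the text above are certain, and the remaining entries are illustrative placeholders, not values from FIG. 7.

```python
# Hypothetical conversion table in the spirit of FIG. 7: the pair
# (first discrimination data, edge detection result) selects the
# second discrimination data.  Only the TEXT/EDGE and IMAGE/NON-EDGE
# rows are stated in the text; the rest are placeholders.
CONVERSION = {
    ("TEXT",    "EDGE"):     "NEW-TEXT",
    ("TEXT",    "NON-EDGE"): "NEW-GRAPHIC",
    ("GRAPHIC", "EDGE"):     "NEW-TEXT",
    ("GRAPHIC", "NON-EDGE"): "NEW-GRAPHIC",
    ("IMAGE",   "EDGE"):     "NEW-IMAGE",
    ("IMAGE",   "NON-EDGE"): "NEW-GRAPHIC",
}

def convert(first_discrimination, edge_result):
    """One converter (61a-61d): map the first discrimination data and
    the per-component edge result to converted discrimination data."""
    return CONVERSION[(first_discrimination, edge_result)]
```

One such lookup runs per color component, with the component's own edge result (EC, EM, EY or EK) as the second key.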
[0084] The desired discrimination data (second discrimination data) output from the converter 61a is input to the AND gate 62a. The AND gate 62a produces second discrimination data DC as an AND value between the desired discrimination data input from the converter 61a and the color detection result SC input from the color detection section 33.
[0085] The desired discrimination data (second discrimination data) output from the converter 61b is input to the AND gate 62b. The AND gate 62b produces second discrimination data DM as an AND value between the desired discrimination data input from the converter 61b and the color detection result SM input from the color detection section 33.
[0086] The desired discrimination data (second discrimination data) output from the converter 61c is input to the AND gate 62c. The AND gate 62c produces second discrimination data DY as an AND value between the desired discrimination data input from the converter 61c and the color detection result SY input from the color detection section 33.
[0087] The desired discrimination data (second discrimination data) output from the converter 61d is input to the AND gate 62d. The AND gate 62d produces second discrimination data DK as an AND value between the desired discrimination data input from the converter 61d and the color detection result SK input from the color detection section 33.
[0088]
FIG. 8 shows an example of the structure of the image data generating means 13. The image data generating means 13 comprises line buffers 71a and 71b, a background density averaging section 72, a character density averaging section 73, and a selector 74.
[0089] The line buffer 71a accumulates n-lines of the first image data output from the image development means 11.
[0090] The line buffer 71b accumulates n-lines of the second discrimination data output from the discrimination data generating means 12.
[0091] The background density averaging section 72 calculates, for each of the color components C, M and Y, the average density of those pixels within an n×n pixel area around the pixel of interest for which the second discrimination data DK on the color component K is not "NEW-TEXT".
[0092] On the other hand, the character density averaging section 73 calculates the average density of each of the color components C, M, Y and K within an area of m×m pixels (m≦n) around the pixel of interest.
[0093] The selector 74 outputs second image data by properly replacing the pixel values in accordance with the second discrimination data on the pixel of interest.
[0094] For example, when the second discrimination data DK of the pixel of interest is "NEW-TEXT" and all the pixel values of C, M and Y are zero, the data C, M, Y of the pixel of interest shown in FIG. 9A is changed to the output value of the background density averaging section 72 as shown in FIG. 9B (over-print process or trapping process). Specifically, the first image data C, M, Y, K shown in FIG. 9A is replaced with the output value of the second image data C, M, Y, K shown in FIG. 9B.
[0095] Similarly, when the second discrimination data DK of the pixel of interest is "NEW-TEXT", the pixel value of K of the pixel of interest is replaced with the output value of the character density averaging section 73, as shown in a, b and c of FIG. 10 (smoothing process, etc.).
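The averaging and replacement performed by the FIG. 8 blocks can be sketched as below. Flattened window lists, integer averaging and the function names are illustrative assumptions; the selection rule follows the two cases described above.

```python
def background_average(values, labels):
    """Background density averaging section 72: average one color
    plane over an n x n window, skipping pixels whose K discrimination
    data is NEW-TEXT.  `values` and `labels` are the flattened window
    densities and per-pixel DK labels."""
    picked = [v for v, lab in zip(values, labels) if lab != "NEW-TEXT"]
    return sum(picked) // len(picked) if picked else 0

def select_pixel(c, m, y, k, label, bg_c, bg_m, bg_y, char_k):
    """Selector 74: for a NEW-TEXT pixel whose C, M and Y are all
    zero, substitute the background averages (over-print/trapping);
    replace K with the character density average (smoothing)."""
    if label == "NEW-TEXT":
        if c == 0 and m == 0 and y == 0:
            c, m, y = bg_c, bg_m, bg_y
        k = char_k
    return c, m, y, k
```

Pixels whose discrimination data is not "NEW-TEXT" pass through unchanged as second image data.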
[0096] The processing of the image data generating means 13 has been described above merely by way of example, and the content of the processing is not limited to the above-described one.
[0097]
FIG. 11 shows an example of the structure of the image processing means 14. The image processing means 14 comprises line buffers 101a and 101b, a filter section 102, a gamma correction section 103, and a screen processing section 104.
[0098] The line buffer 101a accumulates several lines of the second image data generated by the image data generating means 13 for the purpose of filter processing.
[0099] The line buffer 101b outputs the second discrimination data on the pixel of interest (center pixel of an image matrix) in synchronism with the second image data.
[0100] The filter section 102 multiplies each pixel of the image matrix buffered in the line buffer 101a by a predetermined coefficient and sums the products. In this case, the filter section 102 changes the coefficients used for the multiplication in accordance with the second discrimination data output synchronously from the line buffer 101b.
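The attribute-switched filtering of the filter section 102 can be sketched as below. The kernel values are illustrative assumptions (a sharpening kernel for “NEW-TEXT” pixels, a mild smoothing kernel otherwise); the patent does not disclose specific coefficients.

```python
import numpy as np

# Illustrative coefficient sets: sharpen text/line pixels, smooth others.
KERNELS = {
    True:  np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], float),
    False: np.full((3, 3), 1.0 / 9.0),
}

def attribute_filter(image, disc):
    """Sketch of the filter section 102: each output pixel is the
    weighted sum of its 3x3 neighborhood, with the coefficient set
    selected by the second discrimination data of the center pixel."""
    h, w = image.shape
    out = image.copy()
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            k = KERNELS[bool(disc[y, x])]
            out[y, x] = (image[y - 1:y + 2, x - 1:x + 2] * k).sum()
    return out
```

Because both kernels sum to 1, a uniform region passes through unchanged regardless of the discrimination data; only edges are treated differently.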
[0101] The gamma correction section 103 corrects each pixel of the second image data for each color component, using correction tables as shown in FIGS. 12 to 19. In this case, the gamma correction section 103 switches the correction table in accordance with the second discrimination data output synchronously from the line buffer 101b.
[0102] A correction table shown in FIG. 12 relates to correction of color component C in a case where the second discrimination data is “NEW-TEXT”.
[0103] A correction table shown in FIG. 13 relates to correction of color component C in a case where the second discrimination data is not “NEW-TEXT”.
[0104] A correction table shown in FIG. 14 relates to correction of color component M in a case where the second discrimination data is “NEW-TEXT”.
[0105] A correction table shown in FIG. 15 relates to correction of color component M in a case where the second discrimination data is not “NEW-TEXT”.
[0106] A correction table shown in FIG. 16 relates to correction of color component Y in a case where the second discrimination data is “NEW-TEXT”.
[0107] A correction table shown in FIG. 17 relates to correction of color component Y in a case where the second discrimination data is not “NEW-TEXT”.
[0108] A correction table shown in FIG. 18 relates to correction of color component K in a case where the second discrimination data is “NEW-TEXT”.
[0109] A correction table shown in FIG. 19 relates to correction of color component K in a case where the second discrimination data is not “NEW-TEXT”.
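The table switching described in paragraphs [0102] to [0109] amounts to keeping one lookup table per (color component, discrimination) pair. The sketch below illustrates this; the gamma exponents are placeholders, since the actual curves are the correction tables of FIGS. 12 to 19, which are not reproduced here.

```python
import numpy as np

def make_table(gamma):
    """Build a 256-entry 8-bit lookup table for one gamma curve."""
    x = np.arange(256) / 255.0
    return np.round(255 * x ** gamma).astype(np.uint8)

# One table per (color, is_new_text) pair; exponents are illustrative.
TABLES = {
    (c, flag): make_table(0.8 if flag else 1.2)
    for c in "CMYK" for flag in (True, False)
}

def gamma_correct(value, color, is_new_text):
    """Sketch of the gamma correction section 103: the table is
    switched per color component and per second discrimination data."""
    return int(TABLES[(color, is_new_text)][value])
```

The switch itself is just a dictionary (or multiplexer) lookup keyed by the discrimination data arriving synchronously from the line buffer 101b.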
[0110] The screen processing section 104 processes each pixel of the corrected second image data input from the gamma correction section 103 in accordance with the second discrimination data input synchronously from the line buffer 101b, thereby outputting image data of each color component suited to the image output means 15 in the subsequent stage. The processing is, for example, an error diffusion process for converting image data of 8 bits per pixel (256 tone levels) to image data of 1 bit per pixel (2 tone levels).
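One common form of the error diffusion process named above is Floyd-Steinberg dithering; the sketch below shows the 8-bit-to-1-bit conversion using that method as an assumed example (the patent does not specify which diffusion kernel the screen processing section 104 uses).

```python
import numpy as np

def error_diffuse(plane, threshold=128):
    """Floyd-Steinberg error diffusion: converts an 8-bit plane
    (256 tone levels) to a 1-bit plane (2 tone levels), pushing the
    quantization error onto not-yet-processed neighbors."""
    buf = plane.astype(float).copy()
    h, w = buf.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            old = buf[y, x]
            new = 255.0 if old >= threshold else 0.0
            out[y, x] = 1 if new else 0
            err = old - new
            # Spread the error with the standard 7/16, 3/16, 5/16, 1/16 weights.
            if x + 1 < w:
                buf[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    buf[y + 1, x - 1] += err * 3 / 16
                buf[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    buf[y + 1, x + 1] += err * 1 / 16
    return out
```

On a mid-gray input the output toggles between 0 and 1 so that the local average approximates the input density, which is the property the halftoning stage relies on.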
[0111] The image output means 15 transfers the output image data from the screen processing section 104 onto a printing medium (paper or the like).
[0112] In the first embodiment, the first discrimination data is generated by the image development means and the second discrimination data is generated by the discrimination data generating means, for example, in the following manner.
[0113] a) The image development means generates first discrimination data that discriminates whether each pixel is associated with a character or a line figure, and the discrimination data generating means generates second discrimination data that does not discriminate whether each pixel is associated with a character or a line figure, using the first discrimination data generated by the image development means.
[0114] The character is an object disposed in the first image data as font data.
[0115] The line figure is an object described by a straight line and a curve.
[0116] b) The image development means generates first discrimination data that does not discriminate whether each pixel is associated with a line figure or a plane figure, and the discrimination data generating means generates second discrimination data that discriminates whether each pixel is associated with a line figure or a plane figure, using the first discrimination data generated by the image development means.
[0117] The plane figure is an object, the entirety or each component of which is painted out with uniform density.
[0118] c) The image development means generates first discrimination data that does not discriminate whether each pixel is associated with a contour portion or an inside portion of a plane figure, and the discrimination data generating means generates second discrimination data that discriminates whether each pixel is associated with a contour portion or an inside portion of a plane figure, using the first discrimination data generated by the image development means.
[0119] d) The image development means generates first discrimination data that discriminates whether each pixel is associated with a plane figure or a tone image, and the discrimination data generating means generates second discrimination data that does not discriminate whether each pixel is associated with a plane figure or a tone image, using the first discrimination data generated by the image development means.
[0120] e) The image development means generates first discrimination data that discriminates that each pixel is associated with a tone image, and the discrimination data generating means generates second discrimination data that discriminates the magnitude of density variation in each pixel, using the first discrimination data generated by the image development means.
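Cases a) to e) above are all label transformations from the first discrimination data to the second. Case a) can be illustrated as the mapping below; the label names “TEXT”, “GRAPHIC” and “IMAGE” follow the first embodiment, and the merged label “NEW-TEXT” is assumed to match the name used with the correction tables.

```python
# Case a): the first discrimination data distinguishes "TEXT"
# (character) from "GRAPHIC" (line figure); the second discrimination
# data merges the two into one class and keeps tone images separate.
FIRST_TO_SECOND = {
    "TEXT":    "NEW-TEXT",   # character -> merged text/line class
    "GRAPHIC": "NEW-TEXT",   # line figure -> merged text/line class
    "IMAGE":   "IMAGE",      # tone image kept as-is
}

def second_discrimination(first_label):
    """Case a): the second data no longer distinguishes a character
    from a line figure."""
    return FIRST_TO_SECOND[first_label]
```

Cases b) to e) differ only in which distinctions are added or dropped, and in that the added distinctions (e.g. contour vs. inside, density variation) are computed from the first image data rather than looked up.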
[0121] As has been described above, the first embodiment comprises the discrimination data generating means for generating the second discrimination data on the basis of the first image data and the first discrimination data generated from the page information described in the page description language, and the image data generating means for correcting the first image data on the basis of the second discrimination data and generating the second image data, thereby performing an image quality enhancing process matching with the output characteristics of the printer.
[0122] Second to sixth embodiments of the invention will now be described.
[0123] FIG. 20 shows the structure of an image processing apparatus 2 according to a second embodiment.
[0124] The main difference between the image processing apparatus 2 of the second embodiment and the image processing apparatus 1 shown in FIG. 1 is that a discrimination data generating means 122 generates second discrimination data without using the first discrimination data generated by image development means 121. Thereby, the independence of the first discrimination data and the second discrimination data is enhanced, and a greater degree of freedom is provided in the circuit configuration.
[0125] However, when the image data generating means 123 generates the second image data and when the image processing means 124 switches the processing, both the first discrimination data and the second discrimination data need to be referred to.
[0126] FIG. 21 shows the structure of an image processing apparatus 3 according to a third embodiment.
[0127] In the image processing apparatus 3 of the third embodiment, the image data generating means 123 of the image processing apparatus 2 shown in FIG. 20 is omitted. Since the image processing apparatus 3 of the third embodiment does not generate the second image data, the line memory, etc. are not needed and the image processing apparatus can be formed at low cost.
[0128] FIG. 22 shows the structure of an image processing apparatus 4 according to a fourth embodiment. In the image processing apparatus 4 of the fourth embodiment, the controller unit (image development means 11) of the image processing apparatus 1 shown in FIG. 1 is omitted and provided as an external element. In addition, interface means (data input means 141) serving as an interface with the external controller, and discrimination type setting means 146, are provided.
[0129] The data input means 141 of the image processing apparatus 4 is, for example, an interface unit of a LAN (Local Area Network).
[0130] The discrimination type setting means 146 is a means for setting the type of the first discrimination data input from the external controller. Specification information of the external controller is input to the discrimination type setting means 146, which is preset by the operation of a user, a manager, a designer, or the like.
[0131] The discrimination types of the first discrimination data described in connection with the first embodiment are “TEXT”, “GRAPHIC” and “IMAGE”, and the correspondence of these three discrimination types, as shown in FIG. 7, is registered (set) by the discrimination type setting means 146.
[0132] With this structure, an external controller that generates any kind of discrimination data can be connected to the image processing apparatus 4.
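The registration performed by the discrimination type setting means can be sketched as a small settable mapping from the codes emitted by the external controller to the discrimination types of the first embodiment. The class name, the code values and the default label below are all illustrative assumptions.

```python
class DiscriminationTypeSetting:
    """Sketch of the discrimination type setting means (146, 156, 165):
    a user, manager or designer registers which labels the connected
    external controller emits, so that a controller generating any
    kind of discrimination data can be accommodated."""

    def __init__(self):
        self.types = {}

    def register(self, code, label):
        # e.g. register(0, "TEXT") maps controller code 0 to "TEXT".
        self.types[code] = label

    def label_of(self, code):
        # Unregistered codes fall back to "IMAGE" (assumed default).
        return self.types.get(code, "IMAGE")
```

The downstream discrimination data generating means then consults this mapping instead of hard-coding the controller's code assignment.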
[0133] FIG. 23 shows the structure of an image processing apparatus 5 according to a fifth embodiment. In the image processing apparatus 5 of the fifth embodiment, the controller unit (image development means 121) of the image processing apparatus 2 shown in FIG. 20 is omitted and provided as an external element. In addition, interface means (data input means 151) serving as an interface with the external controller, and discrimination type setting means 156, are provided.
[0134] The data input means 151 of the image processing apparatus 5 is, for example, an interface unit of a LAN (Local Area Network).
[0135] The discrimination type setting means 156 is a means for setting the type of the first discrimination data input from the external controller. Specification information of the external controller is input to the discrimination type setting means 156, which is preset by the operation of a user, a manager, a designer, or the like.
[0136] The discrimination types of the first discrimination data described in connection with the first embodiment are “TEXT”, “GRAPHIC” and “IMAGE”, and the correspondence of these three discrimination types, as shown in FIG. 7, is registered (set) by the discrimination type setting means 156.
[0137] With this structure, an external controller that generates any kind of discrimination data can be connected to the image processing apparatus 5.
[0138] FIG. 24 shows the structure of an image processing apparatus 6 according to a sixth embodiment. In the image processing apparatus 6 of the sixth embodiment, the controller unit (image development means 131) of the image processing apparatus 3 shown in FIG. 21 is omitted and provided as an external element. In addition, interface means (data input means 161) serving as an interface with the external controller, and discrimination type setting means 165, are provided.
[0139] The data input means 161 of the image processing apparatus 6 is, for example, an interface unit of a LAN (Local Area Network).
[0140] The discrimination type setting means 165 is a means for setting the type of the first discrimination data input from the external controller. Specification information of the external controller is input to the discrimination type setting means 165, which is preset by the operation of a user, a manager, a designer, or the like.
[0141] The discrimination types of the first discrimination data described in connection with the first embodiment are “TEXT”, “GRAPHIC” and “IMAGE”, and the correspondence of these three discrimination types, as shown in FIG. 7, is registered (set) by the discrimination type setting means 165.
[0142] With this structure, an external controller that generates any kind of discrimination data can be connected to the image processing apparatus 6.
[0143] As has been described above, according to the embodiments of the present invention, a high-image-quality image process matching with output characteristics of a printer can be performed, even in a case where an ordinary printer controller is used.
Claims
- 1. An image processing apparatus comprising:
image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data and the first discrimination data generated by the image development means; image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- 2. An image processing apparatus according to claim 1, wherein the image development means generates first discrimination data that discriminates whether each pixel is associated with a character, or a line figure described by a straight line and a curve.
- 3. An image processing apparatus according to claim 1, wherein the discrimination data generating means generates second discrimination data that does not discriminate whether each pixel is associated with a character, or a line figure described by a straight line and a curve, using the first image data generated by the image development means.
- 4. An image processing apparatus according to claim 1, wherein the image development means generates first discrimination data that does not discriminate whether each pixel is associated with a line figure described by a straight line and a curve, or a plane figure, the entirety or each component of which is painted out with uniform density.
- 5. An image processing apparatus according to claim 1, wherein the discrimination data generating means generates second discrimination data that discriminates whether each pixel is associated with a line figure described by a straight line and a curve, or a plane figure, the entirety or each component of which is painted out with uniform density, using the first image data generated by the image development means.
- 6. An image processing apparatus according to claim 1, wherein the image development means generates first discrimination data that does not discriminate between a contour portion and an inside portion of a plane figure painted out with uniform density.
- 7. An image processing apparatus according to claim 1, wherein the discrimination data generating means generates second discrimination data that discriminates between a contour portion and an inside portion of a plane figure painted out with uniform density, using the first image data generated by the image development means.
- 8. An image processing apparatus according to claim 1, wherein the image development means generates first discrimination data that discriminates between a plane figure painted out with uniform density and a tone image.
- 9. An image processing apparatus according to claim 1, wherein the discrimination data generating means generates second discrimination data that does not discriminate between a plane figure painted out with uniform density and a tone image, using the first image data generated by the image development means.
- 10. An image processing apparatus according to claim 1, wherein the image development means generates first discrimination data that discriminates that each pixel is associated with a tone image.
- 11. An image processing apparatus according to claim 1, wherein the discrimination data generating means generates second discrimination data that discriminates the magnitude of density variation in each pixel, using the first image data generated by the image development means.
- 12. An image processing apparatus according to claim 1, wherein the discrimination data generating means generates, when the first image data generated by the image development means is color image data comprising plural color components, second discrimination data which represents attributes of each pixel for each color component and is different from the first discrimination data, using the color image data.
- 13. An image processing apparatus according to claim 1, wherein the image data generating means generates, where the first image data generated by the image development means is color image data comprising plural color components and where at least one color component of each pixel of the color image data is associated with a character or a line figure described by a straight line and a curve, second image data by replacing the data other than said color component with data of a peripheral pixel of said pixel and thus correcting the first image data.
- 14. An image processing apparatus according to claim 1, wherein the image data generating means generates second image data by subjecting a pixel of a line figure described by a straight line and a curve or a character of the first image data generated by the image development means to a smoothing process for providing a smooth density variation, on the basis of the second discrimination data generated by the discrimination data generating means.
- 15. An image processing apparatus comprising:
image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means; image data generating means for generating second image data by correcting the first image data generated by the image development means on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the first discrimination data generated by the image development means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- 16. An image processing apparatus comprising:
image development means for generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of information described in a page description language; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data generated by the image development means or using the first image data and the first discrimination data; image processing means for subjecting the first image data generated by the image development means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means and the first discrimination data generated by the image development means; and image output means for outputting image data processed by the image processing means.
- 17. An image processing apparatus comprising:
input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the type of attributes set by the setting means and the first image data and the first discrimination data input by the input means; image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- 18. An image processing apparatus comprising:
input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; image data generating means for generating second image data by correcting the first image data input by the input means on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; image processing means for subjecting the second image data generated by the image data generating means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- 19. An image processing apparatus comprising:
input means for inputting data from an external unit that generates first image data and first discrimination data representing attributes of each of pixels of the first image data; discrimination data generating means for generating second discrimination data different from the first discrimination data, using the first image data input by the input means; setting means for desiredly setting the type of attributes represented by the first discrimination data input by the input means; image processing means for subjecting the first image data input by the input means to a predetermined process on the basis of the type of attributes set by the setting means, the first discrimination data input by the input means and the second discrimination data generated by the discrimination data generating means; and image output means for outputting image data processed by the image processing means.
- 20. An image processing method for image-processing information described in a page description language, and outputting an image, comprising:
generating first image data and first discrimination data representing attributes of each of pixels of the first image data on the basis of the information described in the page description language; generating second discrimination data different from the first discrimination data, using the generated first image data and first discrimination data; generating second image data by correcting the generated first image data on the basis of the generated second discrimination data; subjecting the generated second image data to a predetermined process on the basis of the generated second discrimination data; and outputting processed image data.