1. Field of the Invention
The present invention relates to a technique for generating image data to visualize a mark, character(s) or the like on a print medium under a particular light source.
2. Description of the Related Art
As an anticounterfeit or authenticity check technology for printed matter, a printing method is known that uses special ink which cannot be visually observed (or can be observed only with difficulty) under ordinary light but can be easily visually observed under a particular light source. As a typical method, a method is known of printing an image on a print medium with a printing device using ink including a fluorescent dye, and irradiating the output printed matter with ultraviolet light, thereby causing the image to fluoresce (for example, see Japanese Patent Laid-Open No. 9-227817).
However, when special ink as described above is used, it is necessary to provide an ink tank for the special ink in the device. Further, after printed matter is generated by printing with normal ink on a print medium, another printing process must be performed, which is laborious. In addition, since printing is performed twice, the image formed with the special ink must be positioned with respect to the image formed with the normal ink on the printed matter.
The present invention has been made to address the above-described problems, and provides a technique for generating a print image which can be easily visually observed under a particular light source, utilizing a characteristic of a print medium, without particular printing material such as ink or toner.
To attain this object, the present invention provides an image processing apparatus having e.g. the following structure. That is, provided is an image processing apparatus for outputting print image data to a printing device which performs printing by attaching ink to a print medium, comprising: a color information holding unit configured to hold first color information and second color information indicating different ink use amounts in the printing device per unit area on the print medium, and a color difference under ordinary light equal to or less than a predetermined threshold value, an input unit configured to input discrimination subject information, a generation unit configured to generate binary latent image data in accordance with the input discrimination subject information, and an output unit configured to, in correspondence with a value of each pixel of the latent image data generated by the generation unit, output one of the first color information and the second color information held in the color information holding unit, as print data with respect to the pixel, to the printing device.
According to the present invention, by utilizing e.g. the fact that a fluorescent brightening agent which becomes fluorescent under ultraviolet light is included in a normal print sheet, print image data which cannot be discriminated under ordinary light but can be discriminated under particular light (under ultraviolet light in this case) can be generated.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Hereinbelow, preferred embodiments of the present invention will be described in detail in accordance with the accompanying drawings.
First, the basic conception of the present embodiment will be briefly described using
In many cases, a general print sheet (hereinbelow, “print medium”) includes, with some minor difference in content, a fluorescent brightening agent to maintain whiteness to some degree. It is known that the fluorescent brightening agent becomes fluorescent under ultraviolet light (so-called black light). More particularly, the fluorescent brightening agent has a characteristic in that it absorbs ultraviolet light (wavelength: 300 to 400 nm) and changes the absorbed ultraviolet light to blue visible light (wavelength: 400 to 450 nm) and radiates the visible light (this operation is called “fluorescence”). Accordingly, when a sheet including the fluorescent brightening agent (not limited to paper) is irradiated with ultraviolet light, it is fluorescent in pale blue.
Printing on a print medium ultimately means attaching printing color material to the print medium, i.e., covering the print medium with the attached material. Accordingly, when a region where the printing color material is attached at many points is compared with a region where it is attached at fewer points, the ratio of exposed surface of the print medium is higher in the latter region, and the degree of fluorescence under ultraviolet light is therefore higher there. In other words, when regions are intentionally generated in which the ratio of area covered with the printing color material per unit area (hereinbelow, “covering ratio”) differs, regions having different degrees of fluorescence under ultraviolet light can be generated.
Note that as the printing color material, ink (pigment ink or dye ink) and toner may be used, and different printing methods are used for these materials having different characteristics. When the printing color material is toner, toner particles as solid materials are attached to a drum, then transferred to paper and fixed to the paper. To reproduce the tonality of a color of a print image, the area dot percentage per unit area is changed. With this arrangement, the toner can be easily fixed to a desired position, and further, as the toner particles are solid materials, there is little possibility of blur. Accordingly, the intentional generation of regions with different covering ratios can be realized comparatively easily.
On the other hand, when the printing color material is ink, liquid ink droplets are discharged from nozzles and attached to paper. To reproduce the tonality of a color of a print image, the ink use amount per unit area is changed. With this arrangement, it is difficult to attach the ink to a desired position, and there is a possibility of overlapped ink discharge. Further, as the ink is in a liquid state, blur occurs on the paper, and there is a possibility that the shape of the attached ink is not constant. Accordingly, owing to these two factors, it is difficult to intentionally generate regions with different covering ratios.
As described above, in a case where the printing color material is ink, it is difficult to control the covering ratio directly. In the present embodiment, therefore, the covering ratio is changed indirectly by controlling the ink use amount, utilizing the relation between the ink use amount and the covering ratio.
As it is apparent from
Next, changing the degree of fluorescence can be considered by utilizing the second factor, the difference in suppression power in accordance with ink color type, by performing printing with ink that slightly suppresses fluorescence in a first region on the print medium and with ink that greatly suppresses fluorescence in a second region. However, as it is necessary to use a color material that greatly suppresses fluorescence as the second color, there is a possibility that the color reproduction range of the second color is limited.
The present inventors have focused attention on the above points, and have made it difficult to discriminate two regions, a first region (character background portion) and a second region (character portion), under ordinary light, as shown in
In the first region and second region, the ink use amount is controlled so as to approximate the colors to each other under ordinary light.
Between the first region and the second region, the ink use amount is controlled such that the ratios of covering with ink are different.
<Color Information Generation Method>
First, the color information generation method will be described. The color information is structural information indicating the amounts of color ink for image formation on a print medium with respect to RGB values of image data. Generally, one color information is determined with respect to one RGB value, and held in a color information table or the like in an output device, a printer driver or the like.
Note that the ratio of each color ink with respect to each RGB value is expressed as a numerical value, taking the amount that fully covers a unit area on the print medium as “100”. In the present embodiment, taking the color indicated in the color information table held at normal times as the first color for the first region, the second color used for the second region is determined so as to satisfy the following conditions with respect to the first color. Note that in the following description, the L*a*b* color space is used; however, the color space used for color discrimination is not limited to this color space.
ΔEab* = {(L1* − L2*)² + (a1* − a2*)² + (b1* − b2*)²}^(1/2) ≦ Th1
ΔC_sum = |C_sum2 − C_sum1| ≧ Th2, or
C_sum2≧Th4
In the above expressions, L* indicates luminance, and a* and b* indicate chromaticity representing hue and chroma. The luminance and chromaticity of the first color are L1* and a1*b1*, and those of the second color are L2* and a2*b2*. Further, the total ink use amount for the first color is C_sum1, and that for the second color is C_sum2. Note that Th1 is a threshold value of color difference indiscriminable to humans. Generally, it is said that a color difference equal to or less than 6.5 is within an allowable range, and a color difference equal to or less than 3.2 is almost unrecognizable. Further, assuming that the covering ratio difference discriminable to humans as a luminance difference under ultraviolet light is Th3, the threshold value Th2 is the difference in total ink use amount between the first color and the second color when the covering ratio difference is Th3. Further, the threshold value Th4 is the total ink use amount for the second color when the covering ratio difference is Th3. These values are not universal, but differ depending on the print medium. Accordingly, it may be arranged such that plural measurement patches with various ink use amounts are printed on a print medium and the threshold values are respectively determined based on the covering ratio difference that is discriminable as a luminance difference under ultraviolet light. Further, the ordinary light means light for color measurement with relative spectral distribution regulated by the CIE (Commission Internationale de l'Eclairage: International Commission on Illumination). For example, as the ordinary light, D65, D50, the A light source, the C light source or the like can be given.
The second color may be obtained based on the threshold value Th2 as the threshold value of the total difference in ink use amount, or may be obtained based on the threshold value Th4 as the threshold value of the total ink use amount.
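The two conditions above can be sketched as a small check, assuming measured L*a*b* values and total ink use amounts are available; the function names (`delta_e_ab`, `is_valid_second_color`) and the threshold constants are illustrative only, not part of the invention.

```python
import math

# Sketch of the second-color test described above. Thresholds follow the
# naming in the text; actual values depend on the print medium.
TH1 = 6.5   # maximum color difference under ordinary light
TH2 = 6.0   # minimum difference in total ink use amount (medium dependent)

def delta_e_ab(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) triples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

def is_valid_second_color(lab1, c_sum1, lab2, c_sum2, th1=TH1, th2=TH2):
    """True when the candidate is indistinguishable under ordinary light
    (deltaEab* <= Th1) yet differs enough in total ink use (>= Th2)."""
    return delta_e_ab(lab1, lab2) <= th1 and abs(c_sum2 - c_sum1) >= th2
```

The same check could equally be written against the Th4 condition (C_sum2 ≧ Th4) instead of Th2.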
Hereinbelow, printing of the first region and the second region will be described in a case where (1) printing is performed using the same ink and in a case where (2) printing is performed using combined different color inks.
(1) Printing with Same Ink
In this case, pixels within a region of image data uniformly have the same color, and with respect to a pixel value Np1 (RGB value), C1=(c,m,y,k)=(0,0,0,24) holds as print color information C1 on the first color.
A region having the first color is formed on paper based on the color information C1, and L*a*b* value measurement is performed under ordinary light (D50). Similarly, candidate colors C21=(0,0,0,18), C22=(0,0,0,30), . . . for the second color, different from the first color, are formed on the print medium, and respective L*a*b* values are measured.
It is assumed as a result of measurement that as the L*a*b* values of the first color, L*=67.77, a*=2.33, and b*=0.18 hold, and the L*a*b* values for the candidate colors C21, C22, . . . ,
At this time, as the total ink use amount C_sum, C_sum1=24, C_sum21=30, and C_sum22=18, . . . hold.
Note that as the color difference threshold value, Th1=6.5 holds, and as the threshold value of the total difference in ink use amount, Th2=6 holds. Note that the threshold value of the total difference in ink use amount is obtained by printing plural measurement patches with various ink use amounts on the paper, previously measuring covering ratios of the respective measurement patches, and calculating total differences in ink use amount based on the covering ratio difference as a luminance difference recognizable under ultraviolet light.
In the above case, a second color candidate is found as a value for which the color difference ΔEab* with respect to the first color is equal to or less than the threshold value (ΔEab* ≦ 6.5) and the total difference in ink use amount satisfies ΔC_sum ≧ 6. It is desirable that the color difference ΔEab* be as small as possible and the total difference in ink use amount ΔC_sum as large as possible.
As a result of calculation, as the color difference between C21 (0,0,0,30) and C1, ΔEab* = 5.66 holds, and ΔC_sum = |C_sum1 − C_sum21| = |24 − 30| = 6 holds. Further, as the color difference between C22 (0,0,0,18) and C1, ΔEab* = 6.55 holds, and ΔC_sum = |C_sum1 − C_sum22| = |24 − 18| = 6 holds.
Accordingly, as the color information C2 for the second color, C2=C21=(0,0,0,30) is determined as a color satisfying the above conditions. Accordingly, C1=(c,m,y,k)=(0,0,0,24) is determined as the color information with respect to the pixel value Np1 in the first region, for the first color, C2=(c,m,y,k)=(0,0,0,30) is determined as the color information on the second color in the second region, and the color information table as shown in
(2) Printing with Different Inks
When the ink type is other than cyan(c), magenta (m), yellow (y) and black (k), e.g., red (r), green (g) and blue (b) (used in certain types of printing devices for business use), the second color different from the first color can be defined as many types. For example, red can be reproduced with combination of magenta and yellow. Further, green can be reproduced with combination of cyan and yellow, and blue, with combination of cyan and magenta.
In the following description, as an example, the first color is represented with blue, and the second color, with combination of cyan and magenta.
Assuming that with respect to a pixel value Np2 (RGB value) of image data, as the color information C1 on the first color,
C1=(c,m,y,k,r,g,b)=(0,0,0,0,0,0,24) holds, the first color image is formed on paper based on the color information C1, and measurement of L*a*b* values is performed under the ordinary light (D50).
Next, with the two cyan and magenta inks, the second color candidates C21=(11,27,0,0,0,0,0) and C22=(9,25,0,0,0,0,0) . . . are formed on paper, and the L*a*b* values are measured. In this example, as a result, as the L*a*b* values of the first color, L*=72.15, a*=24.60, and b*=−35.96 hold. As the L*a*b* values of the candidate colors,
C21(11,27,0,0,0,0,0): L*=70.17, a*=24.92 and b*=−32.90 hold, and
C22(9,25,0,0,0,0,0): L*=72.48, a*=24.47 and b*=−30.54 hold.
At this time, as the total ink use amount C_sum, C_sum1=24, C_sum21=38 and C_sum22=34 hold.
Next, assuming that as the threshold values, Th1 = 6.5 and Th2 = 12 hold, the second color is found as a color for which, with respect to the first color, the color difference satisfies ΔEab* ≦ 6.5 and the total difference in ink use amount satisfies ΔC_sum ≧ 12.
As a result of calculation, as the color difference between C21(11,27,0,0,0,0,0) and C1, ΔEab*=3.66 holds and as the total difference in ink use amount, ΔC_sum=14 holds. Further, as the color difference between C22(9,25,0,0,0,0,0) and C1, ΔEab*=5.4 holds and as the total difference in ink use amount, ΔC_sum=10 holds.
Accordingly, C21=(11,27,0,0,0,0,0) satisfying the above conditions is determined as the color information C2 on the second color. Regarding the color information with respect to the pixel value Np2, as the first color,
C1=(c,m,y,k,r,g,b)=(0,0,0,0,0,0,24) holds, as the second color, C2=(c,m,y,k,r,g,b)=(11,27,0,0,0,0,0) holds, and the color information table as shown in
In this manner, with respect to the pixel value Np (RGB value) of image data, the data for the first color and the second color are generated.
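As a cross-check, the different-ink example above (Th1 = 6.5, Th2 = 12) can be recomputed from the measurements quoted in the text; the candidate-scan loop below is a sketch of the selection, not the patented procedure itself.

```python
import math

# Measured values quoted in the text for the blue first color example.
first_lab = (72.15, 24.60, -35.96)
first_sum = 24                      # C1 = (0,0,0,0,0,0,24)

candidates = {
    "C21": ((70.17, 24.92, -32.90), 11 + 27),   # (11,27,0,0,0,0,0)
    "C22": ((72.48, 24.47, -30.54),  9 + 25),   # (9,25,0,0,0,0,0)
}

def delta_e(p, q):
    """CIE76 color difference."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Pick the first candidate satisfying both conditions from the text.
second_color = None
for name, (lab, c_sum) in candidates.items():
    if delta_e(first_lab, lab) <= 6.5 and abs(c_sum - first_sum) >= 12:
        second_color = name
        break
```

Running the scan selects C21, matching the determination in the text (ΔEab* ≈ 3.66, ΔC_sum = 14), while C22 fails the ink-amount condition (ΔC_sum = 10).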
Note that in this example, for the sake of simplicity of explanation, the color information tables shown in
Further, in this example, the above-described color information generation is performed by utilizing the difference in ink use amount per unit area as one of the factors for changes of the degree of fluorescence. However, the present invention is not limited to this arrangement. It may be arranged such that the color information generation is performed by utilizing the power of color material to suppress fluorescence as another factor. In this case, as the two factors are utilized, in comparison with the case using only one factor, the range of color reproduction is further expanded, and the present method is applicable to various designs using various colors.
<Discrimination Image Generation Method>
Next, particular processing of generating a discrimination image in an image processing apparatus having the color information generated by the above-described “color information generation method” will be described.
The latent image generator 101 has a function of reading latent image data as electronic data and generating a latent image. The color information holding unit 102 has a function of holding color information on a first color and a second color requiring different ink use amounts per unit area with respect to some pixel value. The discrimination image data generator 103 has a function of generating data for formation of the discrimination image from the latent image and the color information. The image output unit 104 has a function of feeding paper and print-outputting the discrimination image based on the discrimination data on the paper.
Hereinbelow, a method for realizing the present embodiment will be described in detail using the block diagram of
First, the latent image generator 101 inputs discrimination subject information (hereinbelow, “latent image data C”) to be subjected to discrimination under ultraviolet light such as character(s), a mark or the like, generates latent image data Ic, and supplies the data to the discrimination image data generator 103 (step S201). Note that the latent image data Ic is binary image data which can be handled in pixel units, in which a pixel value of a character or mark portion is “1” and other pixel values are “0”. In other words, the latent image data Ic has information indicating whether each pixel belongs to the above-described first region or the second region.
In
When the latent image data Ic is supplied from the latent image generator 101, the discrimination image data generator 103 accesses the color information holding unit 102, and obtains color information Ci with respect to some pixel value Np (step S202). Note that in this embodiment, for the sake of simplicity of explanation, printing of a discrimination image is performed with respect to a predetermined one pixel value.
As shown in
In this embodiment, as shown in
Ci=(C1,C2)=((0,0,0,0,0,0,24),(11,27,0,0,0,0,0)) with respect to a pixel value Np2. Note that when there are plural pixel values Np, one pixel value to be used is designated. Upon that designation, candidates are displayed for a user and one of the candidates is designated.
The discrimination image data generator 103 obtains the color information Ci from the color information holding unit 102, then generates discrimination image data Itd, as data forming a discrimination image, from the latent image data Ic and the color information Ci, and supplies the generated data to the image output unit 104. The discrimination image data Itd is a set of color information with respect to each pixel. The discrimination image data Itd is generated by taking each pixel of the latent image data Ic in turn as a subject pixel and executing the following processing pixel by pixel, starting from the upper left corner pixel of the latent image data Ic.
First, regarding the subject pixel of the latent image data Ic, it is determined whether or not the value of the subject pixel is “1”, i.e., a character or mark exists (step S203). When the value of the subject pixel is “1”, it is determined that the discrimination image data Itd is the second color C2 (11,27,0,0,0,0,0) (step S204). On the other hand, when the value of the subject pixel is “0”, it is determined that the discrimination image data Itd is the first color C1 (0,0,0,0,0,0,24) of the color information Ci (step S205). Hereinbelow, this processing is repeated.
Note that in this embodiment, the color of the character(s) or mark region is the second color and the color of the peripheral region is the first color; however, these colors may be exchanged. In that case, since the character(s) or mark fluoresces in comparison with the peripheral region under ultraviolet light, a fluorescent character region can be recognized as outline character(s).
Then, it is determined whether or not the subject pixel processed as above is the final pixel (lower right corner pixel) (step S206). When it is determined that the subject pixel is not the final pixel, the subject pixel is changed (step S207) and the same processing is repeated from step S203 until the final pixel is processed. In this manner, the discrimination image data Itd is generated with respect to all the pixels of the latent image data Ic. When the discrimination image data Itd is supplied from the discrimination image data generator 103, the image output unit 104 feeds paper 1001, print-outputs the discrimination image It based on the discrimination image data Itd on the paper, and outputs printed matter 1003 on which the discrimination image It has been printed (step S208).
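Steps S203 to S207 above amount to a per-pixel table lookup over the binary latent image. A minimal sketch, assuming Ic is a two-dimensional array of 0/1 values and using the color information quoted earlier; the helper name `make_discrimination_data` is illustrative only.

```python
# First color (background, latent value 0) and second color (character or
# mark, latent value 1), as determined in the different-ink example above.
C1 = (0, 0, 0, 0, 0, 0, 24)
C2 = (11, 27, 0, 0, 0, 0, 0)

def make_discrimination_data(ic, c1=C1, c2=C2):
    """Scan Ic pixel by pixel from the upper-left corner, assigning the
    second color where the latent value is 1 and the first color otherwise."""
    return [[c2 if px == 1 else c1 for px in row] for row in ic]

# A tiny binary latent image standing in for a character or mark.
ic = [[0, 1, 1, 0],
      [0, 1, 0, 0],
      [0, 1, 1, 0]]
itd = make_discrimination_data(ic)
```

Under ordinary light the two colors are nearly indistinguishable, so the printed result looks uniform; under ultraviolet light the difference in ink use amount makes the pattern visible.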
The processing of generating printed matter available for authenticity check is as described above using the functional diagram and the flowchart. According to the first embodiment, by performing the above-described processing on a print medium including a fluorescent brightening agent, printed matter available for authenticity check can be easily generated with ordinary ink.
In the above-described first embodiment, a discrimination image is generated with respect to one pixel value; however, in some cases, a discrimination image is to be generated with respect to an image in plural colors such as a mark. Next, a modification of the first embodiment will be described as a second embodiment. In this embodiment, a method of printing a discrimination image based on an image formed with plural pixel values will be described.
Note that as the latent image generator 101, color information holding unit 102, the discrimination image data generator 103, and the image output unit 104 have the same functions as those of the latent image generator 101, the color information holding unit 102, the discrimination image data generator 103 and the image output unit 104 in the above-described first embodiment, the detailed explanations of these units will be omitted. Accordingly, in the image processing apparatus 12, the image input unit 105 and the determination unit 106 are added to the image processing apparatus 11 described in the first embodiment.
Hereinbelow, a method for realizing the present embodiment will be described in accordance with the block diagram of
First, when image data 1004 is inputted into the image processing apparatus 12, the image input unit 105 reads the image, generates image data I as electronic data (step S301), and supplies the generated data to the determination unit 106. Note that the image data I is an image which can be handled in pixel units. For example, in a case where the image is formed on a paper document, on the presumption that the image input unit 105 has a charge coupled device (CCD) or an optical sensor, the image input unit 105 captures the image in accordance with an image input instruction, then performs electric signal processing, digital signal processing and the like, to generate the image data I. Further, when the image is data described in a page description language or data generated with an application handling a particular data format, the data is converted to a general image format (bitmap format) or the like as the image data I. In this embodiment, for the sake of simplicity of explanation, the generated image data I has a region of a pixel value Np1 and a region of a pixel value Np2 as denoted by reference numeral 1004 in
Then, the determination unit 106 receives the image data I supplied from the image input unit 105, and determines whether or not color information on each pixel value of the image data I is held in the color information holding unit 102 (step S302). When it is determined that the color information is not held in the color information holding unit 102, the generation of discrimination image is not performed but the processing is terminated (error termination). When it is determined that the color information is held in the color information holding unit 102, the image data I is supplied to the discrimination image data generator 103.
Next, the latent image generator 101 inputs latent image data C representing character(s) or mark to be visualized under ultraviolet light, then generates latent image data Ic, and supplies the generated data to the discrimination image data generator 103 (step S303).
Note that in the first embodiment, the size of the discrimination image is the same as that of the latent image data Ic, however, in the second embodiment, the size of the discrimination image is the same as that of the input image data I. Accordingly, in some cases, the size of the image data I and that of the latent image data Ic are different. When the size of the latent image data Ic is smaller than that of the image data I, the latent image data Ic is generated by expanding the latent image data Ic as shown in
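One plausible reading of the expansion is tiling (repeating) the latent pattern up to the size of the image data I; the actual scheme is shown only in the figure, so the following sketch should be treated as an assumption, with the helper name `expand_latent` illustrative only.

```python
# Expand a smaller binary latent image Ic to the given height and width by
# tiling the pattern (an assumed scheme; the figure may show another one).
def expand_latent(ic, height, width):
    h, w = len(ic), len(ic[0])
    return [[ic[y % h][x % w] for x in range(width)] for y in range(height)]
```

For example, a 2x2 latent pattern expanded to 4x4 simply repeats twice in each direction, so the character or mark recurs across the whole discrimination image.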
Then, the discrimination image data generator 103 obtains color information Ci with respect to the pixel values Np1 and Np2 while accessing the color information holding unit 102 (step S304). The discrimination image data generator 103 generates discrimination image data Itd as data forming the discrimination image from the latent image data Ic and the image data I, and supplies the generated data to the image output unit 104. The discrimination image data Itd is set data of color information on each pixel.
The generation of the discrimination image data Itd is made by, handling one pixel of the latent image data Ic and the image data I as a subject pixel, execution of the following processing sequentially from an upper left corner pixel of the latent image data Ic and the image data I by 1 pixel. Note that in this embodiment, the color information holding unit 102 holds the color information Ci with respect to the pixel values Np1 and Np2 as shown in
First, regarding the subject pixel of the latent image data Ic, it is determined whether or not the value of the subject pixel is “1”, i.e., a character or mark exists (step S305). When the value of the subject pixel is “1” and the pixel value of the image data I is Np1, C2 of the color information Ci1 with respect to the pixel value Np1 is obtained and it is determined that the discrimination image data Itd is the second color C2 (0,0,0,30,0,0,0) of the color information Ci1. When the value of the subject pixel is “1” and the pixel value of the image data I is Np2, C2 of the color information Ci2 with respect to the pixel value Np2 is obtained, and it is determined that the discrimination image data Itd is the second color C2 (11,27,0,0,0,0,0) of the color information Ci2 (step S306).
Further, when the value of the subject pixel is “0” and the pixel value of the image data I is Np1, C1 of the color information Ci1 with respect to the pixel value Np1 is obtained and it is determined that the discrimination image data Itd is the first color C1 (0,0,0,24,0,0,0) of the color information Ci1. When the value of the subject pixel is “0” and the pixel value of the image data I is Np2, C1 of the color information Ci2 with respect to the pixel value Np2 is obtained and it is determined that the discrimination image data Itd is the first color C1 (0,0,0,0,0,0,24) of the color information Ci2 (step S307).
Then, it is determined whether or not the subject pixel processed as above is the final pixel (step S308). When it is determined that the subject pixel is not the final pixel, the subject pixel is changed (step S309) and the same processing is repeated from step S305 until the final pixel is processed.
In this manner, the discrimination image data Itd is generated with respect to all the pixels of the latent image data Ic.
Then, when the discrimination image data Itd is supplied from the discrimination image data generator 103, the image output unit 104 feeds paper 1001, print-outputs the discrimination image It from the discrimination image data Itd on the paper, and outputs printed matter 1003 on which the discrimination image It has been printed (step S310). Accordingly, when printing is performed on a print medium including a fluorescent brightening agent, it is possible to generate a discrimination image formed with plural colors on the print medium.
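Steps S305 to S307 above extend the per-pixel lookup so that the chosen color depends on both the latent bit and the pixel value of the image data I. A minimal sketch using the color information quoted above; the table keys and the helper name `discrimination_pixel` are illustrative only.

```python
# Color information table: pixel value -> (C1, C2), with the 7-component
# ink amounts quoted in the text for the pixel values Np1 and Np2.
COLOR_TABLE = {
    "Np1": ((0, 0, 0, 24, 0, 0, 0), (0, 0, 0, 30, 0, 0, 0)),
    "Np2": ((0, 0, 0, 0, 0, 0, 24), (11, 27, 0, 0, 0, 0, 0)),
}

def discrimination_pixel(latent_bit, pixel_value, table=COLOR_TABLE):
    """Return the second color when the latent bit is 1 (character/mark),
    otherwise the first color, for the given pixel value of image data I."""
    c1, c2 = table[pixel_value]
    return c2 if latent_bit == 1 else c1
```

A missing table entry raises a KeyError here, which corresponds to the error termination at step S302 (or, as noted below, could instead fall back to an approximate or interpolated entry).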
Note that in this embodiment, the processing is terminated when the color information with respect to each pixel value of the image data I is not held in the color information holding unit 102. However, it may be arranged such that in place of termination, the discrimination image is generated using color information on an approximate pixel value.
For example, assuming that color information as shown in
Np1=(R,G,B)=(128,0,255), using the color information Ci1 of the pixel value Np1.
Further, it may be arranged such that the first color and the second color with respect to a pixel value of the image data I are calculated from the held color information. For example, assuming that as a pixel value Np4 of the image data I,
Np4=(R,G,B)=(128,0,224) holds, the discrimination image data generator 103 calculates color information with respect to the pixel value Np4 from the color information on the pixel values Np1 and Np2 held in the color information holding unit 102, using interpolation as a known technique. As the interpolation, linear interpolation, Lagrange's interpolation, Newton interpolation, Gauss' interpolation, Bessel's interpolation and the like can be used. As the result of calculation,
Ci4=(C1,C2)=((0,0,0,0,0,0,18),(9,21,0,0,0,0,0)) holds, thus the color information with respect to the pixel value Np4 is obtained.
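As an illustration of the interpolation approach, a component-wise linear interpolation between two held entries can be sketched as follows. The blend weight t and the round-half-up rule are hypothetical choices for this sketch; the quoted result Ci4 would depend on the actual interpolation positions of Np1, Np2 and Np4 in RGB space.

```python
# Component-wise linear interpolation between two ink-amount tuples.
# Rounds half up for deterministic integer ink amounts (a sketch choice).
def lerp_color(ca, cb, t):
    return tuple(int(a + (b - a) * t + 0.5) for a, b in zip(ca, cb))

ci1 = ((0, 0, 0, 24, 0, 0, 0), (0, 0, 0, 30, 0, 0, 0))   # held for Np1
ci2 = ((0, 0, 0, 0, 0, 0, 24), (11, 27, 0, 0, 0, 0, 0))  # held for Np2
t = 0.5                                                  # hypothetical weight
ci4 = tuple(lerp_color(a, b, t) for a, b in zip(ci1, ci2))
```

Note that both the interpolated first color and the interpolated second color must still be re-checked against the ΔEab* and ΔC_sum conditions before use, since interpolated ink amounts do not automatically satisfy them.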
In the above-described second embodiment, the discrimination image It is generated based on the color information previously held in the color information holding unit 102. Hereinbelow, a modification of the above-described embodiment will be described as a third embodiment. In this embodiment, an image having plural pixel values is handled as an input image, and with respect to each pixel value of the input image, color information on the first color and the second color with different ink use amounts per unit area are generated, and a discrimination image is print-outputted based on the color information.
Note that as the latent image generator 101, the color information holding unit 102, the discrimination image data generator 103, the image output unit 104, the image input unit 105 and the determination unit 106 have the same functions as those of the latent image generator 101, the color information holding unit 102, the discrimination image data generator 103, the image output unit 104, the image input unit 105 and the determination unit 106 in the above-described second embodiment, the detailed explanations of these units will be omitted. Accordingly, in the image processing apparatus 13, color information generator 107 is added to the image processing apparatus 12 described in the second embodiment.
Hereinbelow, a method for realizing the present third embodiment will be described in accordance with the block diagram of
First, when the image data 1004 is inputted into the image processing apparatus 13, the image input unit 105 reads the image, generates image data I as electronic data (step S301), and supplies the generated data to the color information generator 107.
Then, the color information generator 107 receives the image data I supplied from the image input unit 105, generates color information with respect to each pixel of the image data I, supplies the generated information to the color information holding unit 102, and the color information holding unit 102 holds the information (step S1301).
The processing of generating the discrimination image data Itd based on the color information held in the color information holding unit 102 (steps S302 to S310) is the same as the processing described in the second embodiment, so the detailed explanation of the processing will be omitted. That is, the generation processing in the third embodiment corresponds to the generation processing described in the above second embodiment with the color information generation processing (S1301) added.
Hereinbelow, the above-described color information generation processing (S1301) will be described in more detail in accordance with the block diagram of
The color information generator 107, with the function of generating the color information on the first color and the second color with different ink use amounts per unit area with respect to a pixel of an input image, has a color conversion processor 108, a color information calculator 109, a measurement data holding unit 110 and a threshold value holding unit 111. The color conversion processor 108 has a function of converting a pixel value of the image data I to color space data representable on a device. The color information calculator 109 has a function of calculating the color information on the first color and the second color. The measurement data holding unit 110 has a function of holding L*a*b* values with respect to ink use amounts as color measurement data, and a covering ratio with respect to the total ink use amount, as covering ratio data. The threshold value holding unit 111 has a function of holding a threshold value of discrimination of color difference under ordinary light as Th1, and a threshold value of covering ratio difference to cause luminance difference under ultraviolet light, as Th3.
First, when the pixel value Np1 of the image data I is inputted (step S1401), the color conversion processor 108 converts the pixel value into L*a*b* values as device-independent color space values (step S1402), and converts the obtained L*a*b* values to L′*a′*b′* values as device-dependent color space values (step S1403). That is, the pixel value of the image data I is converted into common color space data and then from the common color space data into device-representable color space data, thereby converting the colors of the image data I into colors which can be outputted by the image output unit 104. Then the color conversion processor 108 supplies the converted L′*a′*b′* values to the color information calculator 109.
The color information calculator 109 calculates the first color C1(c,m,y) and the second color C2(r,g,b) as respective ink use amounts with respect to the supplied L′*a′*b′* values, using the color measurement data held in the measurement data holding unit 110.
Note that as shown in
More particularly, four data having L*a*b* values close to the L′*a′*b′* values are obtained from the first color measurement data (
Similarly, four data having L*a*b* values close to the L′*a′*b′* values are obtained from the second color measurement data (
Next, the total ink use amount C_sum is respectively calculated from the first color C1 and the second color C2 (step S1408). The total ink use amount is a value obtained by adding the respective ink use amounts. For example, when the first color C1 values are (20,0,10) and the second color C2 values are (5,30,5), the total of the respective ink use amounts for the first color C1, C_sum1, is 30, and the total of the respective ink use amounts for the second color C2, C_sum2, is 40.
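The calculation of step S1408 amounts to a simple sum over the per-ink use amounts, using the values quoted above:

```python
def total_ink_use(color):
    """Total ink use amount C_sum: the sum of the per-ink use amounts."""
    return sum(color)

c1 = (20, 0, 10)  # first color C1
c2 = (5, 30, 5)   # second color C2
c_sum1 = total_ink_use(c1)  # C_sum1 = 30
c_sum2 = total_ink_use(c2)  # C_sum2 = 40
```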
Next, the threshold value Th4 as a threshold value of the total ink use amount C_sum2 for the second color C2 is calculated (step S1409) from the total ink use amount C_sum1 for the first color C1, the threshold value Th3 held in the threshold value holding unit 111 and the covering ratio data held in the measurement data holding unit 110. This is calculation of the total ink use amount for the second color C2 when the difference in covering ratio with respect to the first color C1 is Th3. Note that as shown in
Next, processing of obtaining the covering ratio cov from the total ink use amount C_sum is expressed as follows.
cov=f1(C_sum)
Further, processing of obtaining the total ink use amount C_sum from the covering ratio cov is expressed as follows.
C_sum=f2(cov)
In the above-described example, as the total ink use amount for the first color C1, C_sum1=30 holds; from the covering ratio data in
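The conversions f1 and f2 and the derivation of Th4 (step S1409) can be sketched with a piecewise-linear lookup. The covering-ratio table below is a hypothetical one, constructed only so that it is consistent with the figures quoted in this description (f1(30)=45, f2(60)=40, f2(30)=20):

```python
# Hypothetical covering ratio data: (C_sum, covering ratio) pairs,
# assumed monotonic so that f1 and f2 are inverses of each other.
COVER_TABLE = [(0, 0), (20, 30), (30, 45), (40, 60)]

def _interp(x, pts):
    """Piecewise-linear interpolation over a sorted table of (x, y) points."""
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("value outside the table range")

def f1(c_sum):
    """Covering ratio cov from the total ink use amount C_sum."""
    return _interp(c_sum, COVER_TABLE)

def f2(cov):
    """Total ink use amount C_sum from the covering ratio cov."""
    return _interp(cov, [(c, s) for s, c in COVER_TABLE])

def threshold_th4(c_sum1, th3):
    """Step S1409: Th4 = f2(f1(C_sum1) + Th3)."""
    return f2(f1(c_sum1) + th3)
```

With C_sum1=30 and Th3=15 this reproduces cov1=45, cov2=60 and Th4=40 as in the example.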
Then, it is determined whether or not the total ink use amount C_sum2 calculated at step S1408 is equal to or greater than Th4 (step S1410).
When it is determined that the total ink use amount C_sum2 is equal to or greater than Th4, the first color C1 and the second color C2 are supplied, as the color information Ci, to the color information holding unit 102. The color information holding unit 102 holds the supplied color information as the color information Ci1 with respect to the pixel value Np1 (step S1417). In this case, as the total ink use amount C_sum2 is equal to or greater than Th4, the colors C1=(20,0,10) and C2=(5,30,5) are held as the color information Ci1 for the pixel value Np1 in the color information holding unit 102.
On the other hand, when the total ink use amount C_sum2 is less than Th4, the respective ink use amounts for the second color C2 are changed such that the total ink use amount C_sum2 is equal to or greater than Th4 (step S1411).
For example, assume that the first color C1 is (10,10,10), the second color C2 is (12,12,12), and Th3=15 holds. Then the total ink use amount C_sum1 is 30 and the total ink use amount C_sum2 is 36, i.e., the total ink use amount C_sum2 is less than Th4=40. Accordingly, the respective ink use amounts are changed such that the total ink use amount C_sum2 is equal to or greater than Th4=40, i.e., such that the total of the ink use amounts for the second color C2′ is equal to or greater than 40. As candidates for the second color C2′, (14,13,13), (13,14,13), (13,13,14) and the like can be given. Note that as long as the total ink use amount is equal to or greater than 40, any other candidate can be given. However, when the amount of change is increased, the color difference ΔE with respect to the first color C1, to be calculated later, becomes greater. It is therefore desirable that the amount of change is as small as possible. Accordingly, an upper limit of the amount of change is determined, and plural candidates with ink use amounts not beyond the upper limit are calculated in advance.
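The candidate enumeration of step S1411 can be sketched as follows. The per-ink upper limit of 2 and the restriction to increases only are illustrative assumptions; candidates with the smallest total change are listed first, in line with the preference stated above:

```python
from itertools import product

def candidates_for_second_color(c2, th4, max_delta=2):
    """Enumerate candidates C2' whose total ink use amount is >= Th4,
    increasing each per-ink amount by at most max_delta (an assumed
    upper limit of change), smallest total change first."""
    cands = [tuple(v + d for v, d in zip(c2, ds))
             for ds in product(range(max_delta + 1), repeat=len(c2))]
    cands = [c for c in cands if sum(c) >= th4]
    cands.sort(key=sum)  # prefer the smallest amount of change
    return cands

cands = candidates_for_second_color((12, 12, 12), 40)
```

For C2=(12,12,12) and Th4=40 this yields, among others, the candidates (14,13,13), (13,14,13) and (13,13,14) named in the text.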
Next, the L*a*b* values for the second color C2′ changed such that the total ink use amount is equal to or greater than the threshold value Th4 are calculated (step S1412). More particularly, four data with printing color materials close to the changed second color C2′ are obtained from the second color measurement data in the measurement data holding unit 110, and the L*a*b* values for the changed second color C2′ are calculated from the obtained data by interpolation as a known technique. Then a color difference ΔE between the calculated L*a*b* values for the second color C2′ and the L′*a′*b′* values calculated by the color conversion processor 108 is calculated (step S1413). Then, it is determined whether or not the obtained color difference ΔE is equal to or less than the threshold value Th1 (step S1414). Since the L*a*b* values under ordinary light are changed by the change of the respective ink use amounts for the second color C2, this determination checks whether or not the color difference between the first color and the second color remains at an undiscriminable level.
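The ΔE computation of step S1413 can be sketched as a Euclidean distance in L*a*b* space. The description does not name a specific color-difference formula; the CIE76 definition and the numeric values below are assumptions for illustration:

```python
import math

def delta_e(lab1, lab2):
    """Color difference dE as Euclidean distance in L*a*b* space (CIE76,
    assumed here; the embodiment does not name a specific formula)."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# Hypothetical L'*a'*b'* target vs. L*a*b* values of a changed candidate C2'.
th1 = 3.0
de = delta_e((50.0, 10.0, -5.0), (51.0, 12.0, -5.0))
acceptable = de <= th1  # the determination of step S1414
```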
When it is determined that the obtained color difference ΔE is equal to or less than the threshold value Th1, the first color C1 and the second color C2′ are supplied as the color information Ci to the color information holding unit 102. The color information holding unit 102 holds the information as the color information Ci1 with respect to the pixel value Np1 (step S1417).
Further, when it is determined that the obtained color difference ΔE is greater than the threshold value Th1, it is determined whether or not the determination of all the candidates for the second color C2′, for which the respective ink use amounts are changed, has been completed (step S1415). When the determination of all the candidates has not been completed, the candidate is changed (step S1416), and the processing at step S1412 and the subsequent steps is performed. Note that when it is determined at step S1415 that the determination for all the candidates has been completed, the processing is terminated (error termination).
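The selection flow of steps S1410 to S1417 can be sketched as one function. Here `lab_of` stands in for the hypothetical measurement-data lookup of step S1412 (interpolated L*a*b* values for an ink tuple), `lab_target` for the L′*a′*b′* values from the color conversion processor, and the ΔE formula is the assumed Euclidean distance:

```python
def select_color_info(c1, c2, candidates, lab_of, lab_target, th1, th4):
    """Sketch of steps S1410-S1417: return the color information (C1, C2)
    or (C1, C2') to be held for the current pixel value."""
    if sum(c2) >= th4:                       # S1410: C_sum2 >= Th4?
        return (c1, c2)                      # S1417: hold as-is
    for cand in candidates:                  # S1411 / S1415 / S1416
        lab = lab_of(cand)                   # S1412: L*a*b* of candidate
        de = sum((p - q) ** 2
                 for p, q in zip(lab, lab_target)) ** 0.5  # S1413
        if de <= th1:                        # S1414: undiscriminable?
            return (c1, cand)                # S1417: hold changed C2'
    raise RuntimeError("no candidate within Th1")  # error-termination
```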
In this manner, the color information Ci for the first color and the second color, indicating different ink use amounts per unit area with respect to a pixel value of an input image, is generated.
Note that when the threshold value Th3 in the above example is 15, as the covering ratio cov2 of the second color C2, cov2=cov1+Th3=45+15=60 holds. That is, the covering ratio of the second color C2 is higher than the covering ratio of the first color C1. However, as long as the difference between the covering ratio of the first color C1 and the covering ratio of the second color C2 is 15, as the covering ratio cov2 of the second color C2, cov2=cov1−Th3=45−15=30 may be used. That is, the present invention is also applicable to a case where the covering ratio of the second color C2 is lower than the covering ratio of the first color C1. In this case, as the total ink use amount C_sum2 for the second color C2, C_sum2=f2(cov2)=f2(30)=20 holds, so that as the threshold value Th4, Th4=20 holds. Then the determination as to whether or not the total ink use amount C_sum2 is less than the threshold value Th4 is performed, and when it is determined that the total ink use amount C_sum2 is equal to or greater than the threshold value Th4, the respective ink use amounts for the second color C2 are changed such that the total ink use amount is less than the threshold value Th4.
Hereinbelow, as a fourth embodiment, an example will be described where the processing according to the above-described respective embodiments is realized with computer programs (an application program and a printer driver program) executed by an information processing apparatus, such as a personal computer, connected to a printer. For the sake of simplicity of explanation, the fourth embodiment is applied to the first embodiment; however, those skilled in the art will easily understand that the fourth embodiment is also applicable to the other embodiments.
In the above configuration, when the power of the apparatus main body is turned ON, the CPU 901 reads the OS from the external storage device 908 to the RAM 902 and executes the OS in accordance with a boot program stored in the ROM 903. As a result, the apparatus functions as an information processing apparatus using the operation input device 906 and the display 905 as a user interface. When the user instructs execution of an application described below, the present apparatus functions as an image processing apparatus. The user can execute various applications installed in the external storage device 908.
For assistance of understanding, in this embodiment, an application program to design a concert ticket and print-output the designed ticket will be described.
Note that the application program in the fourth embodiment can communicate with the installed printer driver and access data managed by the printer driver via the OS (operating system). As a result, the application program can obtain a color table, available for generation of a discrimination image such as images shown in
First, in response to the user's operation, the CPU 901 generates a ticket design in accordance with various figure drawing and image editing functions in the application (step S1001). Next, in response to the user's instruction, the CPU 901 performs processing of setting the discrimination image region in the application (step S1002). The setting of the discrimination image region includes inputting the position and size of the region and a character string or mark as the original of the latent image data C included in the region, and selecting a color to be used in the discrimination image region. As described above, since the color selection depends on the connected printer 910, the user selects a color from the color information (first color) available for generation of a latent image, obtained from the printer driver. Thereafter, when the user has designated the number of print copies and instructed printing, the CPU 901 performs processing to deliver the information on the ticket design and the setting information of the discrimination image region to the printer driver (step S1003).
First, the CPU 901 generates ticket print image data in accordance with the ticket design information delivered from the application (step S1101). Next, the CPU 901 generates discrimination image data in accordance with the setting information of the discrimination image region delivered from the application (step S1102). The generation of the discrimination image data is realized by executing a program equivalent to the latent image generator 101, the color information holding unit 102 and the discrimination image data generator 103 in
Note that in the above-described first to fourth embodiments, a print medium including a fluorescent brightening agent which becomes fluorescent under ultraviolet light has been described; however, the present invention is not limited to this type of print medium as long as a part with a different ink covering ratio can be intentionally generated on a print medium which becomes fluorescent under a particular light source. Accordingly, the fluorescent agent is not limited to the fluorescent brightening agent, and the light source is not limited to an ultraviolet light source. Further, the print medium may be a print medium including the fluorescent agent or may be a print medium coated with the fluorescent agent.
Aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiments, and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiments. For this purpose, the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., computer-readable medium).
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2009-241889 filed Oct. 20, 2009 and No. 2010-168428 filed Jul. 27, 2010, which are hereby incorporated by reference herein in their entirety.
Number | Date | Country | Kind
---|---|---|---
2009-241889 | Oct 2009 | JP | national
2010-168428 | Jul 2010 | JP | national