The present invention relates to an image generating apparatus which superimposes plural additional images on a color image to generate a combined image.
The widespread use of copy machines enables easy duplication of original images. Thus, a method for confirming whether an image is an authentic original or a duplicate is required.
In JP-A-2004-48800, a single first image is embedded in a second image (original) to generate a combined image. If the combined image is observed with the naked eye, only the second image is visually recognized. Meanwhile, if a special sheet is superimposed on a recording object in which the combined image is recorded, the first image is seen as overlapping the second image. This enables confirmation as to whether the original image is counterfeited or not.
However, in detecting a counterfeit, it may be insufficient simply to embed a single first image in the second image.
To solve the foregoing problem, according to an aspect of the invention, an image generating apparatus includes: a modulating section which, by using different additional images corresponding to different pattern images, modulates signals of the pattern images to generate plural modulated pattern images; and a superimposing section which, by changing color information of a color image in accordance with each of the modulated pattern images, superimposes the plural modulated pattern images on the color image to generate a recordable combined image.
According to another aspect of the invention, an image generating method includes: by using different additional images corresponding to different pattern images, modulating signals of the pattern images to generate plural modulated pattern images; and by changing color information of a color image in accordance with each of the modulated pattern images, superimposing the plural modulated pattern images on the color image to generate a recordable combined image.
Hereinafter, embodiments of the invention will be described with reference to the drawings.
An image generating apparatus according to a first embodiment of the invention will be described. The image generating apparatus according to this embodiment embeds plural embedding images (additional images) in a color image and thus generates a combined image. The plural embedding images are different from each other. The generated combined image is recorded (formed) on a recording object such as a sheet.
If the combined image recorded on the recording object is directly observed by a person from outside, almost only the color image is visually recognized and the embedding image is not visually recognized. On the other hand, if a special sheet is used which will be described later, the embedding image in the combined image (color image) can be visually recognized.
Data of a color image S1 as an original is inputted to an input section 101. The data of the color image S1 is sent to a superimposing section 102.
Meanwhile, data of n embedding images 103-1 to 103-n to be embedded in the color image S1 are sent to an embedding pattern generating section 104. The number n is an integer equal to or greater than 2. The number n and the content of the embedding images embedded in the color image S1 can be properly selected by a user.
The embedding images 103-1 to 103-n show different contents from each other. The contents in this case refer to features that enable each image to be identified by external observation, for example, the size and shape of the image. The images also include pictures, letters, symbols, and numerals.
The data of the embedding images 103-1 to 103-n can be prepared and stored in advance in a memory (not shown). Embedding image data may also be newly prepared and added to the memory.
The embedding pattern generating section (modulating section) 104 acquires basic patterns (pattern images) from a memory 105. Also, newly prepared embedding image data can be supplied to the embedding pattern generating section 104.
The basic patterns 105-1 to 105-n are prepared in a number equal to the number of embedding images 103-1 to 103-n. The basic patterns 105-1 to 105-n correspond to the embedding image data 103-1 to 103-n, respectively.
If specific embedding image data 103-k (where k is an arbitrary value from 1 to n) is inputted, the embedding pattern generating section 104 reads out the basic pattern 105-k corresponding to the embedding image data 103-k from the memory 105. The embedding pattern generating section 104 processes the corresponding basic pattern 105-k on the basis of the embedding image data 103-k and thus generates an embedding pattern (modulated pattern image).
Specifically, the embedding pattern generating section 104 modulates the signal of the basic pattern 105-k with the signal of the embedding image data 103-k and thereby generates the signal of the embedding pattern. The embedding pattern generating section 104 supplies the generated embedding pattern to the superimposing section 102.
A method for generating an embedding pattern will be described specifically with reference to
An embedding pattern C1 shown in
The basic pattern A1 includes plural pixels A11 and A12 that are different from each other. The pixels A11 and A12 serve as indexes for changing the color difference of the color image S1, as will be described later. In the pixels A11 and A12, values used for changing the color difference are different from each other. In the basic pattern A1, the pixels A11 and A12 are arranged alternately in the x-direction and the y-direction.
The embedding image B1 is a monochrome binary image to be embedded in the color image S1. The embedding image B1 has an image area including plural pixels B10 and a background area where no pixels B10 are located. The basic pattern A1 and the embedding image B1 have the same size (the same number of pixels).
The embedding pattern C1 is generated by inverting pixels in the basic pattern A1 corresponding to the pixels B10 in the embedding image B1. For example, in the embedding image B1, the pixel B10 exists at a position P1 ((x,y)=(4,2)). Therefore, the pixel at the position P1 in the basic pattern A1 is changed from a pixel A11 to a pixel A12.
An embedding pattern C2 shown in
The basic pattern A2 has a different pattern from the basic pattern A1 shown in
The content of the embedding image B2 is different from the content of the embedding image B1 shown in
The embedding pattern C2 is generated by inverting pixels in the basic pattern A2 corresponding to the pixels B20 in the embedding image B2. For example, in the embedding image B2, the pixel B20 exists at a position P2 ((x,y)=(1,3)). Therefore, the pixel at the position P2 in the basic pattern A2 is changed from a pixel A21 to a pixel A22.
In the descriptions with reference to
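The inversion described above can be sketched as follows, representing the two pixel types (such as A11 and A12) as 0 and 1 so that inversion becomes an exclusive OR; the array size and the checkerboard form of the basic pattern are illustrative assumptions.

```python
import numpy as np

def make_embedding_pattern(basic_pattern, embedding_image):
    """Invert basic-pattern pixels wherever the binary embedding image is 1.

    basic_pattern: 2-D array of 0/1 (0 = pixel A11, 1 = pixel A12).
    embedding_image: 2-D binary array of the same shape (1 = pixel B10).
    """
    # XOR flips a basic-pattern pixel exactly where the embedding image has a 1.
    return np.bitwise_xor(basic_pattern, embedding_image)

# Checkerboard basic pattern A1 (pixel types alternating in x and y).
a1 = np.indices((4, 4)).sum(axis=0) % 2
# Embedding image B1 with a single pixel B10 at (x, y) = (4, 2),
# i.e. array index [1, 3] in 0-based [row, column] order.
b1 = np.zeros((4, 4), dtype=int)
b1[1, 3] = 1
c1 = make_embedding_pattern(a1, b1)
```

Only the one position covered by a pixel B10 differs between the basic pattern and the resulting embedding pattern.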
The superimposing section 102 shown in
Specifically, the superimposing section 102 changes the color difference of the color image S1 in accordance with each embedding pattern. By changing the color difference, it is possible to make a change in the color image S1 that cannot easily be observed with the naked eye. Saturation can be changed instead of color difference. Alternatively, both color difference and saturation can be changed.
A method for superimposing the embedding pattern C1 shown in
In the case of superimposing the embedding pattern C1 shown in
R2 = R1 + d/6 (1)

G2 = G1 + d/6 (2)

B2 = B1 − d/3 (3)
R1, G1 and B1 indicate the value of each color component in the color image S1 supplied from the input section 101. R2, G2 and B2 indicate the value of each color component after the color image S1 is modulated with the embedding pattern C1. The symbol d indicates the fluctuation range.
Meanwhile, for the pixels in the color image S1 corresponding to the pixels A12 in the embedding pattern C1, the pixel values can be changed as expressed by the following equations (4) to (6). In equations (4) to (6), the sign of d in equations (1) to (3) is inverted.
R2 = R1 − d/6 (4)

G2 = G1 − d/6 (5)

B2 = B1 + d/3 (6)
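As a sketch of equations (1) to (6), assuming 8-bit RGB values and an illustrative fluctuation range d = 24 (the invention does not fix a particular value):

```python
import numpy as np

def modulate(rgb, pattern, d=24.0):
    """Equations (1)-(6): pixels under A11 (pattern == 0) receive
    (+d/6, +d/6, -d/3) on (R, G, B); pixels under A12 (pattern == 1)
    receive the sign-inverted changes."""
    sign = np.where(pattern == 0, 1.0, -1.0)[..., None]
    delta = np.array([d / 6.0, d / 6.0, -d / 3.0])
    # Keep the result inside the 8-bit range.
    return np.clip(rgb.astype(float) + sign * delta, 0.0, 255.0)

s1 = np.full((4, 4, 3), 128.0)            # flat mid-gray color image
c1 = np.indices((4, 4)).sum(axis=0) % 2   # checkerboard embedding pattern
s2 = modulate(s1, c1)
```

Note that d/6 + d/6 − d/3 = 0, so the three per-pixel changes cancel in sum: the modulation shifts pixels mainly along the yellow-blue color-difference axis while leaving luminance roughly unchanged, which is why the change is hard to see with the naked eye.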
With the above modulation, the embedding pattern C1 shown in
Here, the size of the embedding pattern C1 may be coincident with the size of the color image S1 or may be smaller than the size of the color image S1. If the embedding pattern C1 and the color image S1 have the same size, the embedding pattern is superimposed on the entire color image S1. If the embedding pattern C1 is smaller than the color image S1, the embedding pattern C1 is superimposed on a predetermined area in the color image S1. In this case, the position where the embedding pattern C1 is superimposed can be suitably set.
Next, the superimposing section 102 superimposes the embedding pattern C2 shown in
For example, for the pixels in the color image S1 corresponding to the pixels A21 in the embedding pattern C2, the pixel values can be changed similarly to the equations (1) to (3). For the pixels in the color image S1 corresponding to the pixels A22 in the embedding pattern C2, the pixel values can be changed similarly to the equations (4) to (6).
Thus, an image (a combined image S2; see
Here, since the embedding patterns C1 and C2 are superimposed on the same area in the color image S1, interference between the embedding patterns C1 and C2 may make the embedding images B1 and B2 difficult to visually recognize, even if an image reproducing method which will be described later is used. Thus, in order to reduce the interference between the embedding patterns C1 and C2, modulation can be carried out in color difference directions different from each other.
For example, at the time of superimposing the embedding pattern C1 on the color image S1, modulation is carried out in the yellow-blue direction. At the time of superimposing the embedding pattern C2, modulation can be carried out in the magenta-green direction.
More specifically, at the time of superimposing the embedding pattern C1, the color image S1 can be modulated by using the equations (1) to (6). Meanwhile, at the time of superimposing the embedding pattern C2, the pixel values of the pixels corresponding to the pixels A21 can be changed as expressed by the following equations (7) to (9).
R2 = R1 − d/6 (7)

G2 = G1 + d/3 (8)

B2 = B1 − d/6 (9)
For the pixels corresponding to the pixels A22 in the embedding pattern C2, the pixel values can be changed by using the equations (7) to (9) with the sign of “d” inverted.
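A corresponding sketch of equations (7) to (9), with the signs inverted for the pixels A22; again d = 24 is only an illustrative value:

```python
import numpy as np

def modulate_green_magenta(rgb, pattern, d=24.0):
    """Equations (7)-(9): pixels under A21 (pattern == 0) receive
    (-d/6, +d/3, -d/6); pixels under A22 (pattern == 1) receive the
    sign-inverted changes. This magenta-green direction differs from the
    yellow-blue direction of equations (1)-(6), reducing interference
    when both patterns occupy the same area."""
    sign = np.where(pattern == 0, 1.0, -1.0)[..., None]
    delta = np.array([-d / 6.0, d / 3.0, -d / 6.0])
    return np.clip(rgb.astype(float) + sign * delta, 0.0, 255.0)

s = np.full((2, 2, 3), 128.0)
p = np.array([[0, 1], [1, 0]])
out = modulate_green_magenta(s, p)
```

As with equations (1) to (6), the changes sum to zero (−d/6 + d/3 − d/6 = 0), so luminance is again roughly preserved.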
In the above example, two embedding images are embedded in the color image S1. However, the number of embedding images is not limited to this. That is, three or more embedding images can be embedded in the color image S1. In this case, embedding patterns corresponding to the three or more embedding images can be generated and these embedding patterns can be superimposed on the color image.
In the above example, plural embedding patterns are superimposed on the same area in the color image. However, the superimposing area is not limited to this. That is, embedding patterns can be superimposed on different image areas in the color image. For example, the embedding pattern C1 shown in
The combined image S2 generated by the superimposing section 102 is outputted from an output section 106 (see
The combined image S2 generated by the superimposing section 102 is expressed by R, G and B color components. Therefore, when printing the combined image S2, it is preferable to convert the R, G and B color components to C (cyan), M (magenta) and Y (yellow) color components in advance.
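The basic complement relationship between the two color spaces can be sketched as follows, assuming 8-bit channels; an actual printing pipeline would typically apply fuller color management (for example, black generation for a CMYK device), which is outside this sketch:

```python
import numpy as np

def rgb_to_cmy(rgb):
    """Simple complement conversion for 8-bit channels:
    (C, M, Y) = 255 - (R, G, B)."""
    return 255 - np.asarray(rgb)

cmy = rgb_to_cmy([200, 128, 64])
```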
Next, general processing (program process) to generate a combined image S2 from a color image S1 will be described with reference to
The recording medium can be, for example, an internal storage device installed in a computer such as ROM or RAM, a portable storage medium such as CD-ROM, flexible disk, DVD disk, magneto-optical disk or IC card, a database that holds computer programs, or a transmission medium on a line.
A color image S1(x,y) is inputted to the input section 101 (ACT 201). An embedding image Bn(x,y) to be embedded into the color image S1(x,y) and the number of embedding images n are set (ACT 202). For example, the number n and embedding image(s) Bn(x,y) can be set by a user's manual input.
The embedding pattern generating section 104 sets n0 to 1 (ACT 203). The embedding pattern generating section 104 generates a basic pattern An(x,y) (ACT 204). Specifically, the embedding pattern generating section 104 acquires the basic pattern An(x,y) from the memory or receives input of a new basic pattern An(x,y).
The embedding pattern generating section 104 determines whether the embedding image Bn(x,y) has a value of 0 or not (ACT 205). Here, since embedding images are binary images as described above, the embedding image Bn(x,y) shows a value of 0 or 1.
If the embedding image Bn(x,y) has a value of 1, the embedding pattern generating section 104 modulates the basic pattern An(x,y) (ACT 206). In other words, the pixels of the basic pattern are inverted as described with reference to
On the other hand, if the embedding image Bn(x,y) has a value of 0, the embedding pattern generating section 104 does not modulate the basic pattern An(x,y). In other words, the pixels of the basic pattern are not inverted as described with reference to
The superimposing section 102 superimposes the modulated basic pattern An′(x,y) on the color image S1(x,y) and thus generates a combined image S2(x,y) (ACT 207). Then, it is determined whether n0 is n or not (ACT 208). If n0 is not n, 1 is added to n0 (ACT 209). Then, the processing of ACT 204 to ACT 207 is repeated. Meanwhile, if n0 is n, the combined image S2(x,y) is outputted (ACT 210).
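The loop of ACT 203 to ACT 210 can be sketched as follows; the delta vector and the use of XOR for pattern inversion follow the earlier description, and applying one fixed color-difference direction to every pattern is a simplification (in practice different patterns may use different directions, as noted above):

```python
import numpy as np

def generate_combined_image(s1, embedding_images, basic_patterns, d=24.0):
    """Sketch of ACTs 203-210: for each of the n embedding images, invert
    the corresponding basic pattern where the image has a 1 (ACTs 205-206),
    then superimpose the modulated pattern on the color image (ACT 207)."""
    s2 = s1.astype(float)
    delta = np.array([d / 6.0, d / 6.0, -d / 3.0])
    for b, a in zip(embedding_images, basic_patterns):
        a_mod = np.bitwise_xor(a, b)                    # modulated pattern An'
        sign = np.where(a_mod == 0, 1.0, -1.0)[..., None]
        s2 = s2 + sign * delta                          # ACT 207
    return np.clip(s2, 0.0, 255.0)                      # ACT 210: output S2

s1 = np.full((4, 4, 3), 128.0)
a1 = np.indices((4, 4)).sum(axis=0) % 2
b1 = np.zeros((4, 4), dtype=int)
b1[1, 3] = 1
s2 = generate_combined_image(s1, [b1], [a1])
```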
Next, a method for reproducing plural embedding images from the combined image S2 will be described. In the following description, a method for reproducing the embedding images B1 and B2 from the combined image S2 formed by superimposing the embedding patterns C1 and C2 (see
The embedding images B1 and B2 are reproduced as a mask sheet (sheet member), described hereinafter, is superimposed on the recording object on which the combined image S2 is recorded.
The mask sheet 201 can be formed, for example, by printing black color at the parts of a transparent sheet that correspond to the pixels M11. The parts that correspond to the pixels M12 remain transparent. Alternatively, the pixels M12 can be black areas and the pixels M11 can be transparent areas.
The size of the mask sheet is inputted (ACT 301). The number n allocated to the basic pattern is inputted (ACT 302). The size of the mask sheet and the number n can be inputted, for example, by a user.
A basic pattern An(x,y) corresponding to the inputted number n is generated (ACT 303). Specifically, the basic pattern An(x,y) stored in the memory is acquired, or input of a new basic pattern An(x,y) is received.
The basic pattern An(x,y) is outputted and the pattern is printed (ACT 304). The above processing is similar to general image forming processing.
As the mask sheet 201 is superimposed on the recording object on which the combined image is recorded, the embedding image B1 can be observed.
If the mask sheet 201 is superimposed on the combined image, a part of the combined image can be visually recognized only through the light-transmitting areas (pixels M12) of the mask sheet 201. As described above, in the embedding pattern C1 superimposed on the color image S1, a part of the pixels in the basic pattern A1 is inverted by the embedding image B1.
Therefore, if the mask sheet 201 having the same pattern as the basic pattern A1 is used, the inverted pixels are highlighted as shown in
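The masking principle can be simulated as follows; treating the printed (black) mask cells as fully opaque and setting hidden pixels to 0 is a simplifying assumption:

```python
import numpy as np

def apply_mask_sheet(combined, mask):
    """Simulate superimposing a printed mask sheet on the combined image:
    black cells (mask == 1) hide the image, transparent cells (mask == 0)
    pass it through."""
    out = combined.astype(float).copy()
    out[mask == 1] = 0.0
    return out

# Build a combined image carrying one inverted pixel, as in the earlier sketch.
s1 = np.full((4, 4, 3), 128.0)
a1 = np.indices((4, 4)).sum(axis=0) % 2   # basic pattern A1
b1 = np.zeros((4, 4), dtype=int)
b1[1, 3] = 1                              # embedding image B1
c1 = np.bitwise_xor(a1, b1)               # embedding pattern C1
sign = np.where(c1 == 0, 1.0, -1.0)[..., None]
combined = s1 + sign * np.array([4.0, 4.0, -8.0])

# Mask sheet 201 has the same pattern as A1: it hides one pixel of each pair.
visible = apply_mask_sheet(combined, a1)
```

Among the visible (transparent) cells, the non-inverted positions all carry the same sign of modulation, while the inverted position carries the opposite sign, so it stands out against its neighbors.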
Meanwhile, if the mask sheet 202 is superimposed on the combined image, the embedding image B2 can be observed according to a principle similar to that of the mask sheet 201. Specifically, the pixels inverted from the pixels in the basic pattern A2 are highlighted, as shown in
In the examples shown in
Alternatively, a lenticular lens (sheet member) as an optical device can be used instead of the mask sheet. In a lenticular lens, plural cylindrical lens parts are arrayed in parallel. If a lenticular lens is used, the striped basic pattern A2 shown in
If the lenticular lens is superimposed on the combined image S2 while matching the pitch of the lenticular lens with the pitch of the basic pattern A2, the embedding image can be confirmed.
In this embodiment, by embedding plural embedding images into a color image, it is possible to enhance the level of security against counterfeit.
Specifically, plural embedding images cannot be confirmed without using plural kinds of mask sheets. After the plural embedding images are confirmed, authenticity of the color image can be determined. Moreover, if plural embedding images are embedded in the same area in the color image S1, each embedding image becomes harder for a third party to discover.
Here, as the number of embedding patterns superimposed on the color image S1 is increased, the level of security against counterfeit can be raised. Meanwhile, repeated superimposition of embedding patterns may cause deterioration in image quality of the combined image S2. The number of embedding patterns superimposed on the color image, that is, the number of embedding images, can be decided in consideration of this point.
In a second embodiment of the invention, plural embedding images are reproduced from a combined image by using one mask sheet. The same parts as described in the first embodiment are denoted by the same reference numerals.
In this embodiment, a basic pattern A3 shown in
The basic pattern A3 is a pattern formed by rotating the basic pattern A2 described with reference to
The embedding pattern generating section 104 inverts the pixels A31 and A32 in the basic pattern A3 that correspond to the pixels B10 in the embedding image B1 and thereby generates the embedding pattern C3, as described in the first embodiment.
The superimposing section 102 superimposes the embedding pattern C3 shown in
If the mask sheet 202 shown in
The basic pattern A3 is a pattern formed by rotating the basic pattern A2 by 90 degrees counterclockwise. However, the rotation is not limited to this. That is, the basic pattern A3 may be the basic pattern A2 rotated in any arbitrary direction within the two-dimensional plane. Here, the two basic patterns have point symmetry.
For example, as the basic pattern A3, a pattern formed by rotating the basic pattern A2 by 90 degrees clockwise can be used. Moreover, a pattern formed by rotating the basic pattern A2 by 45 degrees clockwise or counterclockwise can be used as well. In this case, if the mask sheet 202 is rotated within the two-dimensional plane, plural embedding images can be visually recognized in accordance with the rotation angle.
It is also possible to visually recognize plural embedding images by reversing the mask sheet. In other words, a mask sheet with line symmetry about an axis in the x-direction or y-direction can be used. Depending on the pattern of the mask sheet, different patterns can be seen from a specific direction as the mask sheet is reversed.
Therefore, if the mask sheet is arranged with its one side facing the combined image, one embedding image can be visually recognized. Then, if the mask sheet is arranged with its one side facing the observer, the other embedding image can be visually recognized.
In this embodiment, the mask sheet 202 described with reference to
A third embodiment of the invention will be described. In this embodiment, two embedding patterns generated from basic patterns that are similar to each other are superimposed on a color image, and a combined image is thus generated. The same parts as described in the first embodiment are denoted by the same reference numerals.
If two basic patterns are similar to each other and two embedding patterns generated from these basic patterns are superimposed on the same area in a color image, it is difficult to visually recognize each embedding image by using a mask sheet. Whether basic patterns are similar to each other or not can be determined in accordance with whether embedding images are hard to visually recognize or not, as described above.
In this embodiment, two embedding patterns generated from two basic patterns that are similar to each other are superimposed on image areas located at different positions within a color image. In other words, plural embedding patterns generated from mutually similar basic patterns are prohibited from being superimposed on the same area in a color image. Thus, the two embedding images can easily be visually recognized with the use of a mask sheet. Hereinafter, this is described more specifically.
The embedding pattern generating section 104 acquires first and second basic patterns 105-1 and 105-2 that are similar to each other from the memory 105. Information about whether the basic patterns are similar to each other or not can be stored in the memory 105 in association with the basic patterns.
The embedding pattern generating section 104 processes (modulates) the first and second basic patterns 105-1 and 105-2 on the basis of embedding images 103-1 and 103-2 corresponding to each basic pattern and thereby generates first and second embedding patterns. The embedding pattern generating section 104 supplies information showing that the basic patterns are similar to each other, together with the generated first and second embedding patterns, to the superimposing section 102.
The superimposing section 102 superimposes the first embedding pattern on a first image area R1 in the color image S1 (see
Here, a third embedding pattern can also be superimposed on the image areas R1 and R2. The third embedding pattern is formed by processing (modulating) a third basic pattern that is not similar to the first and second basic patterns, with an embedding image.
If three or more basic patterns are similar to each other, embedding patterns generated from these basic patterns can be superimposed on different image areas from each other in the color image.
In this embodiment, similar basic patterns are specified in advance. However, the basic patterns are not limited to this. For example, the embedding pattern generating section 104 or the superimposing section 102 can determine whether the basic patterns are similar or not, according to a predetermined standard.
An image generating apparatus as a fourth embodiment of the invention will be described. In this embodiment, plural embedding images to be observed by using a mask sheet are embedded in a color image, and numeric data (additional information) acquired by image analysis is also embedded in the color image.
The configuration of the image generating apparatus according to the present embodiment will be described with reference to
A first embedding pattern generating section 104a processes a basic pattern in accordance with an embedding image and thereby generates an embedding pattern. A first memory 105a stores plural basic patterns corresponding to plural embedding images. A first superimposing section 102a superimposes the plural embedding patterns generated by the first embedding pattern generating section 104a on the color image S1. The operations of the first embedding pattern generating section 104a and the first superimposing section 102a are the same as described in the first embodiment.
Hereinafter, a method for embedding numeric data in a color image will be described. It is known that the human ability to identify gradations is high with respect to changes in the luminance direction and low with respect to changes in the color difference direction. Thus, as in the first embodiment, numeric data can be embedded by utilizing this characteristic. In color images, generally, color difference components do not contain high-frequency components.
The color image (combined image) generated by the first superimposing section 102a is inputted to a second superimposing section 102b. The operation of the first superimposing section 102a and the operation of the second superimposing section 102b can be carried out by one component (superimposing section).
Numeric data 107 is supplied to a second embedding pattern generating section (generating section) 104b. The numeric data 107 is supplied to the second embedding pattern generating section 104b as a code including plural bits.
The second embedding pattern generating section 104b generates a pattern (embedding pattern) having plural frequency components based on the inputted numeric data 107. In this embodiment, the second embedding pattern generating section 104b generates a pattern having plural frequency components by using basic patterns stored in a second memory 105b.
The plural basic patterns stored in the first memory 105a may be the same as or different from the plural basic patterns stored in the second memory 105b. Also, a pattern can be newly generated on the basis of plural frequency components that are set on the basis of the numeric data 107.
The processing by the second embedding pattern generating section 104b will be described with reference to
The example shown in
Solid black circles shown in
In the example shown in
Here, a point for direction detection is set on the Fourier transform plane. This point is used to align the direction of the image at the time of reading the numeric data (code) embedded in the color image with the direction of the image at the time of embedding the numeric data. The point for direction detection is constantly set to be ON when embedding the numeric data 107.
It is preferable that the point for direction detection has an angle that does not easily cause deterioration and a low frequency component, so that the direction of the image can easily be detected. It is also preferable that the frequency component of each bit forming the numeric data (code) is different from the frequency component of the point for direction detection. This prevents erroneous direction detection.
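A sketch of this frequency-domain embedding; the specific Fourier-plane slots chosen here for the direction-detection point and for each bit are illustrative assumptions, not the actual mapping used by the apparatus:

```python
import numpy as np

def make_code_pattern(bits, size=64, amplitude=2.0):
    """Set one Fourier-plane point per '1' bit, plus a fixed
    direction-detection point that is always ON, then inverse-transform
    to obtain a spatial embedding pattern."""
    F = np.zeros((size, size), dtype=complex)
    F[2, 5] = amplitude * size * size          # direction-detection point
    for i, bit in enumerate(bits):
        if bit:
            # Hypothetical slot (10 + 2*i, 7) for bit i.
            F[10 + 2 * i, 7] = amplitude * size * size
    # The real part of the inverse FFT gives a spatial modulation pattern
    # that can be added to a color-difference channel of the image.
    return np.real(np.fft.ifft2(F))

pattern = make_code_pattern([1, 0, 1, 1])
```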
The second embedding pattern generating section 104b supplies the embedding pattern having plural frequency components to the second superimposing section 102b. The second superimposing section 102b superimposes the embedding pattern from the second embedding pattern generating section 104b on the color image and thus generates the combined image S2. Then, the output section 106 outputs the combined image S2. The combined image S2 is recorded on a recording object as described in the first embodiment.
Next, a method for reproducing information embedded in the color image S1 will be described.
The embedding image embedded in the color image S1 can be reproduced as a mask sheet is superimposed on the combined image, as in the first embodiment.
Meanwhile, the numeric data embedded in the color image is reproduced as follows.
First, the color image (combined image) in which the numeric data is embedded is scanned by a scanner or the like and image data is thus generated. Specifically, the image area in which the numeric data is embedded, in the color image, is scanned. The scanned image data is then Fourier-transformed.
Next, a frequency component for angle detection is detected on the Fourier transform plane and the angle of the scanned image is adjusted on the basis of the result of the detection. Whether a frequency component exists at each bit or not is confirmed in order of bit number. “1” is set if there is a frequency component. “0” is set if there is no frequency component. Thus, the numeric data 107 can be reproduced.
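The readout above can be sketched as a round trip using the same assumed slot mapping as in the embedding sketch; angle adjustment from the direction-detection point is omitted here for brevity:

```python
import numpy as np

def read_code(pattern, n_bits, threshold=0.5):
    """Fourier-transform the scanned pattern and set each bit to 1 when a
    frequency component is present at its slot, 0 otherwise. The slot
    positions (10 + 2*i, 7) mirror the assumed embedding mapping."""
    Fm = np.abs(np.fft.fft2(pattern))
    peak = Fm.max()
    return [1 if Fm[10 + 2 * i, 7] > threshold * peak else 0
            for i in range(n_bits)]

# Round trip: embed the code [1, 0, 1, 1] the same way, then read it back.
size = 64
F = np.zeros((size, size), dtype=complex)
for i, bit in enumerate([1, 0, 1, 1]):
    if bit:
        F[10 + 2 * i, 7] = size * size
pattern = np.real(np.fft.ifft2(F))
bits = read_code(pattern, 4)
```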
Here, the numeric data 107 embedded in the color image S1 can be associated with the embedding image. The association in this case means that the numeric data 107 can specify the embedding image.
By embedding the numeric data 107 thus associated with the embedding image into the color image S1, it is possible to construct a system with a high security level. For example, if a counterfeited embedding image is embedded in the color image, the numeric data can be scanned and it can thus be confirmed whether the embedding image that is visually recognized by using a mask sheet is authentic or not.
In a fifth embodiment of the invention, plural embedding images are embedded in a color image and information (additional information) indicating truth or falsehood of the embedding image is also embedded in the color image. A true embedding image is an image that is truly used by a person who reproduces the embedding image. A false embedding image is an image that has no value of use to a person who reproduces the embedding image.
The information indicating truth or falsehood of the embedding image can be embedded in the color image by a similar method to the embedding method of the numeric data described in the fourth embodiment.
Specifically, as in the fourth embodiment, plural points are provided on the Fourier transform plane and the plural points and plural embedding images are associated with each other by using reference numbers.
For example, three points are provided on the Fourier transform plane, as shown in
The second embedding pattern generating section 104b generates an embedding pattern having plural frequency components on the basis of ON or OFF state of each point shown in
The second superimposing section 102b superimposes the embedding pattern from the second embedding pattern generating section 104b on the color image. Thus, the combined image S2 is generated.
Meanwhile, by conducting similar image analysis to the fourth embodiment, it is possible to acquire information embedded in the combined image S2. Specifically, the scanned image data is Fourier-transformed and the presence or absence of a frequency component at each point is detected.
Thus, if a frequency component is confirmed at a point on the Fourier transform plane, the embedding image associated with this point by reference number can be regarded as a true embedding image. If no frequency component is confirmed at a point on the Fourier transform plane, the embedding image associated with this point by reference number can be regarded as a false embedding image.
The invention is described in detail with reference to specific embodiments. However, it is obvious to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention.
As described above in detail, according to the invention, a technique of superimposing plural additional images on a color image and thus generating a combined image can be provided.
This application is based upon and claims the benefit of priority from: U.S. provisional application 61/076,280, filed on Jun. 27, 2008; and U.S. provisional application 61/076,281, filed on Jun. 27, 2008, the entire contents of each of which are incorporated herein by reference.