This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2007-263386 filed on Oct. 9, 2007.
The present invention relates to an image processing device, an image forming device, an image reading system, a comparison system, an image processing method, computer readable medium, and computer data signal.
An aspect of the present invention provides an image processing device including: a generating unit that generates image data on the basis of which an image forming unit forms a visible image on a recording medium containing detectable substances using only a coloring material having a spectral reflection factor that is different in a particular wavelength range from a spectral reflection factor that the detectable substances have by a predetermined threshold or more; and an output unit that outputs the image data generated by the generating unit to the image forming unit.
Exemplary embodiments of the present invention will now be described in detail below with reference to the following figures, wherein:
Hereinafter, exemplary embodiments of the invention will be described with reference to the drawings.
1. Structure
Now, a case where a user takes a printed material outside through the door 400 will be described. The user operates the comparison device 300 so as to read the printed material. The comparison device 300 reads the printed material and calculates feature amounts characterizing distribution of detectable substances watermarked in the printed material. The comparison device 300 and the registration device 200 are connected wirelessly or by a cable to enable communication with each other. The comparison device 300 compares the feature amounts it has calculated with feature amounts characterizing distribution of detectable substances, which are stored in the registration device 200, and outputs a comparison result. At this time, if the comparison result satisfies a predetermined condition and if the printed material is not an item for internal use only, the comparison device 300 opens the door 400. Otherwise, if the comparison result does not satisfy the predetermined condition or if the printed material is for internal use only, the comparison device 300 inhibits opening of the door 400. The aforementioned predetermined condition is determined depending on the correlation between the feature amounts to be compared with each other (such as the number of matching feature amounts or the values of matching feature amounts). For example, if calculated feature amounts characterizing distribution of detectable substances agree with stored feature amounts characterizing distribution of detectable substances at a rate of 80% or more, these detectable substances are regarded as identical to each other. As an alternative example, the predetermined condition is that a difference between compared values of feature amounts is 5% or less. The door 400 is not limited to an openable/closable door but may be a gate constituted of panels attached on two sides of a gateway which users can pass through at any time.
In this case, for example, an emergency bell or siren is installed in a security guard room (not shown) outside the gate or the restricted space, and take-out of a printed material may be notified by sound or light in place of closing the door.
More specifically, the controller 210 has a CPU (Central Processing Unit) 211, a memory 212, and an interface 213. The CPU 211 executes programs stored in the memory 212. For example, the memory 212 includes a ROM (Read Only Memory) which stores various programs, and a RAM (Random Access Memory) which functions as a work area for the CPU 211. The interface 213 is a physical interface which enables exchange of information with individual units connected to the controller 210. The interface 213 receives various information from the image read unit 220 and the manipulation unit 230, and supplies the image read unit 220 with various information.
Programs stored in the memory 212 are a basic program P1 for controlling operation of the registration device 200, and a feature amount calculation program P2 for calculating feature amounts which characterize distribution of a detectable substance. Processings performed by the feature amount calculation program P2 will be described in detail later.
Next, the image forming unit 250 will be described below. The image forming unit 250 includes image forming engines.
The image forming engines are respectively provided for individual developers containing toners (coloring materials) of different colors: cyan (C), magenta (M), yellow (Y), and black (K). Each of the image forming engines includes a photosensitive drum, an electric charge unit, an exposure unit, a development unit, and a transfer unit. The black toner (hereinafter "K toner") utilizes a pigment as a coloring material, and contains carbon black. Toners of the other colors also utilize pigments of the corresponding colors, respectively. Each of the photosensitive drums is a drum-type member which rotates at a predetermined speed about an axle as a rotation center. The photosensitive drums are charged to an electric potential by the electric charge units, respectively. The exposure units irradiate the electrically charged photosensitive drums with laser light, to form electrostatic latent images, respectively. The development units supply toners of the corresponding colors so that the toners stick to the electrostatic latent images formed on the photosensitive drums, and develop the latent images into toner images, respectively. The transfer units respectively transfer the toner images of the corresponding colors to a paper sheet which is fed from a sheet feed tray in synchronization with forming of an image. After the toner images are fixed on the paper sheet, the paper sheet is output from the device.
The image read unit 220 is provided on the upstream side relative to the transfer units of the image forming unit 250 along a sheet feed direction. Before the toner images are transferred by the transfer units, the image read unit 220 optically reads the paper sheet fed from the sheet feed tray.
Specifically, the image read unit 220 has a structure as shown in
The size of image information and the number of grey-scales can be arbitrarily determined. In this exemplary embodiment, an area of A4 size (210 mm×297 mm) is read at an input resolution of 600 dots (pixels) per inch, to obtain data in which each dot indicates an 8-bit grey-scale (of a total number of 256 grey-scales). At this time, grey-scale values (luminance information) are defined so that a grey-scale value "0" corresponds to white and a grey-scale value "255" corresponds to black. The lower the grey-scale value is, the higher the brightness is. The higher the grey-scale value is, the lower the brightness is. In the image information, an image area covers an entire surface of a paper sheet. That is, the image area of image information is an array of 4960 (≈210×600/25.4) pixels in the X direction×7016 (≈297×600/25.4) pixels in the Y direction.
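The millimetre-to-pixel arithmetic above can be sketched as follows (the `pixels` helper name is ours, and the rounding direction may differ by one pixel from the "≈" values in the text):

```python
def pixels(length_mm, dpi=600):
    """Convert a physical length in millimetres to a pixel count at the
    given input resolution (1 inch = 25.4 mm)."""
    return round(length_mm * dpi / 25.4)
```

For example, an A4 sheet read at 600 dpi yields roughly 4960 pixels across its 210 mm width.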
The ID information storage unit 240 stores an ID information management table 241 and a property information management table 242.
Next,
As shown in
Referring back to
As shown in
The controller 310 has a CPU 311, a memory 312, and an interface 313. The CPU 311 executes programs stored in the memory 312. For example, the memory 312 includes a ROM (Read Only Memory) which stores various programs, and a RAM (Random Access Memory) which functions as a work area for the CPU 311. The interface 313 is a physical interface which enables exchange of information with individual units connected to the controller 310. The interface 313 obtains various information from the image read unit 320 and the manipulation unit 330. The programs stored in the memory 312 are a basic program P3 for controlling operation of the comparison device 300, and a feature amount calculation/comparison program P4 for calculating feature amounts characterizing distribution of detectable substances and for making a comparison. Processings performed by the feature amount calculation/comparison program P4 will be described in detail later.
The image read unit 320 reads an area of A4 size (210 mm×297 mm) at an input resolution of 600 dots (pixels) per inch, and generates image information of "256" grey-scales. The greater a grey-scale value of a pixel in the image information is, the lower (i.e., darker) the brightness of the pixel is. The smaller a grey-scale value of a pixel in the image information is, the higher (i.e., brighter) the brightness of the pixel is.
Referring to
2. Operation
Descriptions will now be made of the content of processings executed by the comparison system 100, separately depending on whether the processings belong to operation of the registration device 200 or of the comparison device 300.
2-1. Operation of the Registration Device 200
In
The steps Sb, Sc, and Sd will now be described in detail below.
Object Extraction Processing
In
The expansion processing will now be described referring to a specific example. For example, image information having a pixel P(i, j) as shown in
In the expansion processing as described above, the number of neighboring pixels can be any arbitrary number. For example, neighboring pixels may be pixels in one line on each of upper, lower, left, and right sides of a target pixel, in place of pixels in two lines on each of the four sides of a target pixel as in the foregoing example. Hereinafter, the expansion processing performed on neighboring pixels existing in two lines on each of upper, lower, left, and right sides of a target pixel will be referred to as a “5×5 pixel expansion processing”, in the meaning of focusing on 5×5 pixels about a target pixel as a center. Similarly, an expansion processing performed on neighboring pixels existing in one line on each of upper, lower, left, and right sides of a target pixel will be referred to as a “3×3 pixel expansion processing”, in the meaning of focusing on 3×3 pixels about a target pixel as a center. That is, the expansion processing executed in the step Sb2 is the 5×5 pixel expansion processing.
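The expansion processing above can be sketched as a grey-scale dilation over an n×n window: each pixel takes the maximum grey-scale value found in the window around it (the `dilate` function name and the list-of-lists image representation are our own; the embodiment operates on full-page image information):

```python
def dilate(img, radius):
    """Grey-scale dilation: each output pixel is the maximum value in the
    (2*radius+1) x (2*radius+1) window centered on that pixel.
    radius=2 corresponds to the "5x5 pixel expansion processing",
    radius=1 to the "3x3 pixel expansion processing"."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            out[i][j] = max(
                img[y][x]
                for y in range(max(0, i - radius), min(h, i + radius + 1))
                for x in range(max(0, j - radius), min(w, j + radius + 1))
            )
    return out
```

With this sketch, a single dark pixel grows into a 5×5 patch under the 5×5 expansion, which is the thickening effect the step Sb2 relies on.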
Returning to the description of the flowchart of
Next, the controller 210 calculates an average of grey-scale values of all pixels constituting the image information (step Sb7). Based on the average calculated at this time, the controller 210 determines a threshold T for a binarization processing to be performed later (step Sb8). The relationship between the threshold T and the average can be arbitrarily determined. For example, the threshold T can be a value obtained by multiplying the average by a predetermined coefficient. In this operation example, the threshold T is a value obtained by adding "22" to the average.
Further, the controller 210 executes a binarization processing by using the threshold T as determined in a manner described above (step Sb9). That is, the controller 210 carries out a substitution so that all grey-scale values of pixels that are smaller than the threshold T are set to “0” and all grey-scale values of pixels that are not smaller than the threshold T are set to “1”.
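The threshold determination of the step Sb8 and the binarization of the step Sb9 can be sketched together as follows (function and parameter names are illustrative; the offset of "22" is the value from the operation example):

```python
def binarize(img, offset=22):
    """Determine the threshold T as the average grey-scale value plus a
    fixed offset, then substitute 0 for pixels below T and 1 for pixels
    at or above T."""
    flat = [v for row in img for v in row]
    t = sum(flat) / len(flat) + offset
    return [[0 if v < t else 1 for v in row] for row in img]
```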
After executing the binarization processing, the controller 210 executes a processing for extracting objects, based on the image information binarized through the binarization processing (step Sb10). In this processing, for example, labeling is carried out with regard to one object for each cluster of continuous pixels which have a grey-scale value "1". In addition, a length, a peripheral length, and an area size of each object are calculated. If the length, peripheral length, and area size of an object do not reach predetermined thresholds, the object is regarded as noise, e.g., as an object which has been extracted due to warp of a paper sheet or nonuniformity of light, and is excluded. In this example, the predetermined thresholds for the length, peripheral length, and area size of an object are respectively set to "236", "600", and "7000". These thresholds are expressed in units of "pixels". Specifically, the threshold for the length is approximately 10 (≈236/600×25.4) mm. Hereinafter, where the term "object" (except detectable substances) is used, the term refers to an object extracted in the step Sb10 but does not refer to noise appearing in image information.
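A minimal sketch of the labeling in the step Sb10, assuming 4-connectivity and using only an area-size noise threshold (the embodiment also checks length and peripheral length, with far larger thresholds than the toy default here):

```python
from collections import deque

def extract_objects(bin_img, min_area=3):
    """Label each cluster of continuous pixels with value 1 as one object
    (4-connectivity), and discard clusters whose area size is below the
    noise threshold. Returns a list of pixel-coordinate lists."""
    h, w = len(bin_img), len(bin_img[0])
    seen = [[False] * w for _ in range(h)]
    objects = []
    for i in range(h):
        for j in range(w):
            if bin_img[i][j] == 1 and not seen[i][j]:
                queue, cluster = deque([(i, j)]), []
                seen[i][j] = True
                while queue:  # breadth-first flood fill of one cluster
                    y, x = queue.popleft()
                    cluster.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and bin_img[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(cluster) >= min_area:  # below threshold = noise
                    objects.append(cluster)
    return objects
```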
Feature Amount Calculation Processing
Next, the feature amount calculation processing in the step Sc in
In the feature amount calculation processing, the controller 210 divides an image expressed by image information, into plural images (hereinafter "divisional image areas"), and calculates feature amounts characterizing distribution of detectable substances for each of the divisional image areas. Specifically, as shown in
At first, focusing on an object as a target, the controller 210 specifies which of the divisional image areas F1 to F9 the object belongs to (step Sc2). In this case, coordinate values of a centroid of each object are compared with coordinate values which define each divisional image area. A divisional image area to which the centroid of an object belongs is specified as the divisional image area to which the object belongs. In the example of
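The centroid-to-area assignment of the step Sc2 can be sketched as below, assuming the page is split into a 3×3 grid numbered F1 to F9 in row-major order (the exact split and numbering are defined in the figure, so this layout is an assumption):

```python
def divisional_area(cx, cy, width=4960, height=7016):
    """Map an object's centroid coordinates (cx, cy) to one of nine
    divisional image areas F1..F9 obtained by splitting the page into
    a 3 x 3 grid (row-major numbering assumed)."""
    col = min(int(cx * 3 / width), 2)
    row = min(int(cy * 3 / height), 2)
    return "F%d" % (row * 3 + col + 1)
```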
Next, the controller 210 specifies numbers of overlapping detectable substances among the objects (step Sc3).
More specifically, the controller 210 calculates, for each object, a number of overlapping detectable substances from area sizes or peripheral lengths of the extracted object. Each detectable substance has a length of approximately 25 mm, and hence has an area size of 10,000 to 33,000 (pixels) and a peripheral length of 850 to 1,500 (pixels). The controller 210 therefore determines "2" as a number of overlapping detectable substances if an object has an area size which is equal to or greater than 33,000 and smaller than 55,000 or if an object has a peripheral length which is equal to or greater than 1,500 and smaller than 3,000. Otherwise, the controller 210 determines "3 or more" as a number of overlapping detectable substances if an object has an area size which is equal to or greater than 55,000 or if an object has a peripheral length which is equal to or greater than 3,000. Yet otherwise, the controller 210 determines "1" as a number of overlapping detectable substances if an object has an area size smaller than 33,000 or if an object has a peripheral length smaller than 1,500. In this manner, as shown in
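Reading the chain of conditions above as checking the largest class first, the overlap classification can be sketched as follows (function name ours; thresholds are the pixel values from the text):

```python
def overlap_count(area, perimeter):
    """Classify the number of overlapping detectable substances from an
    object's area size or peripheral length, in pixel units."""
    if area >= 55000 or perimeter >= 3000:
        return "3 or more"
    if area >= 33000 or perimeter >= 1500:
        return "2"
    return "1"  # a single, non-overlapping detectable substance
```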
Subsequently, as shown in
In
The controller 210 calculates a total number of objects which belong to the entire image area expressed by the image information (step Sc6). In this example, the total number of objects is calculated to be “10” as a total number of the objects A to J. Subsequently, the controller 210 calculates a sub-total number of objects which belong to a divisional image area (a sub-total per divisional image area), for each of the divisional image areas F1 to F9 (step Sc7). In the example shown in
Next, the controller 210 calculates a sub-total number of objects which belong to each of the angular ranges R1 to R4 (step Sc9). In the example of
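As a sketch of sorting objects into the angular ranges R1 to R4, one might split an object's inclination over [0°, 180°) into four 45-degree bins; the embodiment's actual ranges are defined in the figure, so this split is an assumption:

```python
def angular_range(theta_deg):
    """Map an object's inclination (degrees) to one of four angular
    ranges R1..R4, assuming a uniform 45-degree split of [0, 180)."""
    return "R%d" % (int(theta_deg % 180) // 45 + 1)
```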
After calculating feature amounts characterizing distribution of detectable substances in a manner as described above, the controller 210 writes the feature amounts into the ID information management table 241 in the ID information storage unit 240 (step Sc10).
Image Forming Processing
Next, the image forming processing of the step Sd in
The controller 210 determines types of toners to be used for forming a visible image, depending on whether detectable substances have been detected from the paper sheet.
At first, if the controller 210 determines that no detectable substance is contained in the paper sheet (step Sd1: NO), the types of toners used for forming a visible image are set to the four color toners of cyan, magenta, yellow, and black (hereinafter referred to as "CMYK toners"). Further, the controller 210 converts image information, which has been obtained through the communication unit 260 or the like to form a visible image with use of the determined types of toners, into image information constituted of four color components of C, M, Y, and K (step Sd2). Specifically, the controller 210 firstly converts the image information into image information constituted of three color components of C, M, and Y, and then executes a UCR (Under Color Removal) processing. By the UCR processing, an area where the three color components of C, M, and Y overlap each other, thereby presenting gray and/or black colors, is applied with a K color component according to the density of the gray and/or black colors. That is, image information constituted of three color components of C, M, and Y is converted into image information constituted of four color components of C, M, Y, and K by the UCR processing. Subsequently, the controller 210 executes a half-tone processing on each pixel included in the converted image information, to determine toner amounts of the CMYK toners in accordance with the image information (step Sd3). Further, the controller 210 outputs, to the image forming unit 250, color information for controlling the image forming engines in accordance with the toner amounts (step Sd4). Further, the image forming unit 250 forms a visible image on a paper sheet by using the CMYK toners (step Sd5). In this case, the image forming unit 250 forms a black image by using the K toner (a second coloring material).
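A simple sketch of a UCR processing, under the common assumption of full grey-component replacement with K = min(C, M, Y); the embodiment applies K according to the density of the gray/black area, so the actual rule may be proportional rather than full removal:

```python
def ucr(c, m, y):
    """Under Color Removal sketch on 0-255 component values: the grey
    component shared by C, M, and Y is removed and carried by K."""
    k = min(c, m, y)
    return c - k, m - k, y - k, k
```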
Otherwise, if detectable substances are contained in the paper sheet, the determination result of the step Sd1 is "YES". In this case, the controller 210 sets, as the types of toners to be used for forming a visible image, the three color toners of cyan, magenta, and yellow (hereinafter "CMY toners" (a first coloring material)). The controller 210 converts image information, which has been obtained from the communication unit 260 or the like to form a visible image with use of the determined types of toners, into image information constituted of three color components of C, M, and Y (step Sd6). At this time, image areas to be colored in black and gray are expressed by overlapping the C, M, and Y toners on each other. Further, the controller 210 executes a half-tone processing on each pixel included in the converted image information, to determine toner amounts of the CMY toners in accordance with the converted image information (step Sd7). Further, the controller 210 outputs, to the image forming unit 250, the color information for controlling the image forming engines for these colors in accordance with the toner amounts, and causes the image forming unit 250 to form a visible image by using the CMY toners (steps Sd4 and Sd5). In this case, the image forming unit 250 forms a black image by using the CMY toners.
As described above, if a paper sheet is watermarked with detectable substances, the registration device 200 does not use the K toner to form a visible image. This is because with this configuration, detectable substances can be more easily extracted from an image which is read from a printed material by the comparison device 300.
While forming a visible image, the controller 210 writes an "image forming date/time", a "device ID", a "file ID", a "page number", a "user ID", and "take-out availability" into the property information management table 242. The controller 210 writes a present date/time as the "image forming date/time", and a device ID assigned to the registration device 200 as the "device ID". The "file ID", "page number", and "user ID" are information which can be specified by referring to image information expressing a visible image formed on a paper sheet or by referring to a header of the image information. The controller 210 therefore writes such specified information as the "file ID", "page number", and "user ID". The "take-out availability" is information which is described in a header of image information or is specified by a user when giving an instruction to execute an image forming processing. The controller 210 therefore refers to such information and writes the information into the property information management table 242.
2-2. Operation of the Comparison Device 300
Next, operation of the comparison device 300 will be described.
A user who wants to take out a printed material sets the printed material on a platen glass of the image read unit 320, and makes a manipulation for carrying out a comparison (e.g., presses down a button). The controller 310 of the comparison device 300 executes the feature amount calculation/comparison program P4. The following description of the operation of the comparison device 300 will be made with regard to a case where feature amounts calculated from the image shown in
At first, the controller 310 controls the image read unit 320 to read the printed material, and obtains image information generated by the image read unit 320 through the interface 313. At this time, the image read unit 320 generates the image information on the basis of intensity of reflection light from the printed material.
As shown in the upper part of
A wavelength range of approximately 700 nm to 1,000 nm is a high-wavelength range close to the visible light range within the infrared range. In this high-wavelength range, the base material S1 has a spectral reflection factor of approximately 80%, and the K toner has a spectral reflection factor of approximately 5%. These spectral reflection factors are almost the same as those in the visible light range. However, the spectral reflection factor of the CMY image abruptly rises at approximately 720 nm, and is substantially constant at slightly less than 80% in the wavelength range higher than 820 nm. On the other hand, the spectral reflection factor of the K image stays low even in the wavelength range of 700 to 1,000 nm. This is because the K toner contains carbon black as a pigment, which has a property of maintaining a substantially constant low spectral reflection factor from the ultraviolet light range to the infrared range. Detectable substances have a spectral reflection factor which is substantially as low as that of the K image, regardless of wavelength ranges. This is because the detectable substances used in this exemplary embodiment have a low spectral reflection factor in the range of 700 nm to 1,000 nm.
As can be seen from the above, a difference of approximately 70% exists between the spectral reflection factor of the CMY image and those of the K image and detectable substances, in the wavelength range of 700 nm to 1,000 nm.
The image read unit 320 generates image information, based on light in the wavelength range of 700 nm to 1,000 nm as described above. Therefore, of an image read from a printed material, image parts corresponding to a CMY image and a base material have a high brightness, and image parts corresponding to detectable substances and a K image have a low brightness. Therefore, in the images D1 and D2 shown in
Based on the above situation, if the registration device 200 forms only a CMY image without forming a K image in the case of a paper sheet watermarked with detectable substances, only detectable substance images are expressed by a high grey-scale value (corresponding to low brightness) in an image read by the image read unit 320 (as shown in the image D1 in
The image read unit 320 generates image information by reading a printed material in a manner as described above. The controller 310 then executes the object extraction processing and the feature amount calculation processing on the image information obtained from the image read unit 320. Processes of the object extraction processing and the feature amount calculation processing (steps Sb and Sc in
At this time, if visible images are formed only of CMY toners on a printed material, noise images may be included in addition to detectable substance images. This is because areas of lower spectral reflection factors are formed depending on positions and amounts of the applied CMY toners. Such areas appear as noise images in a read result of the image read unit 320. Even in this case, noise images are removed by the object extraction processing, so that detectable substance images can be easily extracted. After calculating feature amounts based on the printed material, the controller 310 executes a comparison processing for comparing the calculated feature amounts with feature amounts written in the ID information management table 241.
In the figure, the controller 310 firstly extracts, from the ID information management table 241, paper sheet IDs each associated with a total number of objects which is equal to or different by “1” from a total number of objects as a calculated feature amount (step Se1). Since the total number of objects which belong to the image shown in
The controller 310 determines whether feature amounts have been compared for all paper sheet IDs or not (step Se2). Since feature amounts have not yet been compared for any paper sheet ID (step Se2: NO), the controller 310 goes to the step Se3. In the step Se3, the controller 310 focuses on one of the extracted paper sheet IDs, and calculates a number of divisional image areas among divisional image areas F1 to F9, for each of which a sub-total number of objects per divisional image area, as a calculated feature amount, is equal to one of corresponding values written in the field of “number of detectable substances per area” associated with the focused paper sheet ID (step Se3). Subsequently, the controller 310 calculates a number of groups among groups “1”, “2”, and “3 or more”, for each of which a calculated feature amount is equal to one of corresponding values written in the field “sub-total number sorted by numbers of overlapping detectable substances” associated with the focused paper sheet ID (step Se4). Further, the controller 310 calculates a number of angular ranges among the angular ranges R1 to R4, for each of which a number of included objects is equal to one of corresponding values written in the field “number of detectable substances per angular range” associated with the focused paper sheet ID (step Se5). Further, the controller 310 calculates a total sum (hereinafter referred to as a “total number of agreements”) of all numbers of areas, groups, and ranges calculated in the foregoing steps Se3 to Se5 (step Se6). In this exemplary embodiment, the “total number of agreements” is “3” for the paper sheet ID “2” and is “16” for the paper sheet ID “9”.
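The agreement bookkeeping of the steps Se3 to Se6, together with the threshold test mentioned in the text (80 percent), can be sketched as follows, assuming the sub-totals (per divisional area F1 to F9, per overlap group, per angular range R1 to R4) are held in dicts keyed by illustrative names:

```python
def total_agreements(calculated, stored):
    """Count the sub-totals on which the calculated and the stored
    feature-amount records agree (the 'total number of agreements')."""
    return sum(1 for key, value in calculated.items()
               if stored.get(key) == value)

def is_match(calculated, stored, rate=0.8):
    """Apply an 80-percent style threshold to the total number of
    agreements, so a match need not be a complete agreement."""
    return total_agreements(calculated, stored) >= rate * len(calculated)
```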
The controller 310 determines whether the total number of agreements is equal to or greater than a predetermined threshold (step Se7). The predetermined threshold may be 80 percent. Namely, it may be determined that a printed material agrees with a paper sheet assigned with a subject paper sheet ID even if the feature amounts of the former do not completely agree with those of the latter. If the controller 310 determines the total number of agreements to be less than the threshold (step Se7: NO), the controller 310 determines that the printed material disagrees with the paper sheet assigned with the paper sheet ID focused on at present, and returns to the step Se2.
Otherwise, if the controller 310 determines the total number of agreements to be equal to or greater than the threshold (step Se7: YES), the controller 310 further determines whether the total number of agreements is the maximum at present or not (step Se8). In other words, if the controller 310 has already specified another paper sheet ID which has resulted in a greater total number of agreements, as a maximum value, than the total number of agreements resulting from the paper sheet ID focused on at present (step Se8: NO), the controller 310 determines that the printed material disagrees with the paper sheet assigned with the paper sheet ID focused on at present. The controller 310 then returns to the step Se2 described previously, and repeats the processings also described previously, focusing on another one of the extracted paper sheet IDs. Otherwise, if the controller 310 determines that the total number of agreements for the paper sheet ID focused on at present is greater than the maximum value (step Se8: YES), the controller 310 selects the paper sheet ID focused on at present (step Se9). The controller 310 then returns to the step Se2, and repeats the processings as described previously, focusing on another one of the extracted paper sheet IDs.
If the controller 310 determines that comparisons are complete for all of the extracted paper sheet IDs (step Se2: YES), the controller 310 determines whether a paper sheet ID has been selected in the step Se9 (step Se10). As described above, the controller 310 selects the paper sheet ID “9” in the step Se9 (step Se10: YES), and therefore specifies the paper sheet ID “9”. The controller 310 accordingly specifies the printed material as agreeing with the paper sheet assigned with the paper sheet ID “9” (step Se11). Further, the controller 310 determines whether take-out of the printed material as a target of the comparison processing is allowed or inhibited, based on the property information management table 242 (see
Meanwhile, if the controller 310 determines in the step Se10 that no paper sheet ID has been selected in the step Se9 (step Se10: NO), the controller 310 determines that the printed material as a target of the comparison processing is not registered in the registration device 200 and that there is no associated paper sheet (step Se12). Therefore, the controller 310 determines to allow take-out of the paper sheet outside, and outputs a control signal to open the door 400. At this time, the controller 310 outputs a control signal to cause the notification unit 340 to generate an audio signal or show a message, so that the user is invited to make registration in the registration device 200.
Next, a second exemplary embodiment of the invention will be described. In the second exemplary embodiment, the feature amount calculation processing and the comparison processing operate differently from those in the first exemplary embodiment. Operations other than the foregoing processings and device structures are the same as those in the first exemplary embodiment. In the following description, only the feature amount calculation processing and the comparison processing will therefore be described in detail.
In this exemplary embodiment, the feature amount calculation processing in the step Sc shown in
At first, the Hough transform processing will now be described. Where pixel positions are expressed by X and Y coordinates in image information in which grey-scale values are expressed by binary values, every line that passes through a pixel positioned at coordinates (x, y) can be expressed by the following expression 1 on the X-Y coordinates, wherein ρ is the distance from the origin to the line passing through the coordinates (x, y), measured along a perpendicular which forms an angle θ with the X axis.
ρ = x cos θ + y sin θ (0 ≤ θ < π)  (1)
For example, θ in the expression 1 is sequentially changed from 0 to π for each of the pixels positioned at coordinates P1 (x1, y1) and P2 (x2, y2) on a line shown in
As shown in
Next, the feature amount calculation processing executed by using the Hough transform described above will be described below.
At first, the controller 210 of the registration device 200 generates image information read from a paper sheet, and then executes a binarization processing with use of a predetermined threshold. Next, the controller 210 executes the Hough transform on the image information, to obtain Hough curves. As has been described previously, detectable substances are substantially linear, and therefore, detectable substance images are substantially linear. That is, plural Hough curves expressed based on one detectable substance image intersect each other at a certain pair of coordinates in the Hough plane. Accordingly, the controller 210 can obtain information corresponding to the position and inclination of the detectable substance by referring to the pair of coordinates at which a large number of Hough curves intersect each other (i.e., a pair of coordinates at which a large number of intersections, or "votes", between Hough curves exist). Even if image parts which are not detectable substance images are included in an image, the image parts are not erroneously extracted by being mistaken for detectable substance images. This is because an image part which is not a detectable substance image does not gather a large number of votes in the Hough plane unless the image part has a linear shape of a certain length. In addition, each paper sheet is watermarked with approximately several to fifty detectable substances. The controller 210 can therefore specify positions of detectable substance images by extracting pairs of coordinates in descending order of the number of votes gathered.
In this manner, the controller 210 extracts pairs of coordinates (ρ, θ) corresponding in number to the detectable substances, in descending order of the number of votes in the Hough plane. The controller 210 writes the extracted coordinates, as feature amounts characterizing distribution of detectable substances, into the ID information storage unit 240. If a detectable substance is slightly curved, the intersections between its plural Hough curves do not perfectly agree with each other in the Hough plane. Even in this case, a large number of intersections are concentrated in a small range. Accordingly, such a slightly curved detectable substance can still be extracted as a detectable substance image by focusing on the number of votes concentrated within a predetermined range.
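The voting scheme described above can be sketched as follows (Python with numpy; the accumulator resolution, function names, and test image are assumptions, not part of the exemplary embodiments).

```python
import numpy as np

def hough_votes(binary_img, n_theta=180, n_rho=200):
    """Accumulate votes in a (theta, rho) plane for each black pixel (value 1)."""
    h, w = binary_img.shape
    diag = np.hypot(h, w)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    ys, xs = np.nonzero(binary_img)
    for x, y in zip(xs, ys):
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        # map rho in [-diag, diag] onto a bin index
        bins = np.round((rhos + diag) / (2 * diag) * (n_rho - 1)).astype(int)
        acc[np.arange(n_theta), bins] += 1
    return acc, thetas, diag

def top_peaks(acc, count):
    """Extract `count` (theta_idx, rho_idx) pairs in descending vote order."""
    flat = np.argsort(acc, axis=None)[::-1][:count]
    return [np.unravel_index(i, acc.shape) for i in flat]

# A vertical line of pixels at x = 5 should dominate the vote count.
img = np.zeros((20, 20), dtype=int)
img[2:18, 5] = 1
acc, thetas, diag = hough_votes(img)
ti, ri = top_peaks(acc, 1)[0]
print(thetas[ti])  # ~0: the line x = 5 has theta = 0, rho = 5
```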
Described next will be a comparison processing executed by the comparison device 300.
In this comparison processing, the controller 310 of the comparison device 300 first generates image information by reading a printed material, and then executes a binarization processing and a Hough transform processing, as in the feature amount calculation processing of the registration device 200. Further, the controller 310 extracts coordinates in descending order of the number of votes in the Hough plane, and stores the extracted coordinates as feature amounts characterizing distribution of detectable substances into the ID information storage unit 240.
Further, in order to compare the feature amounts stored in the ID information storage unit 240 with the feature amounts calculated from the printed material, the controller 310 selects one after another the points expressed by the stored coordinates, and calculates the Euclidean distance in the Hough plane between each stored point and a calculated point. If the Euclidean distance is a predetermined value or less, the controller 310 determines that the position and inclination of a detectable substance in the printed material agree with those of a detectable substance according to the stored feature amounts. Further, if there is a paper sheet ID associated with feature amounts which result in a predetermined number of agreements or more in position and inclination with the detectable substances read from the printed material, the controller 310 determines that the printed material agrees with the paper sheet assigned that paper sheet ID. Subsequent processings are the same as those in the first exemplary embodiment described previously.
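The agreement count based on Euclidean distances in the Hough plane may be sketched as below (Python; names and sample coordinates are illustrative; in practice the θ and ρ axes may need rescaling before distances are taken, which the text does not specify).

```python
import numpy as np

def count_agreements(stored, calculated, dist_threshold):
    """Count stored (theta, rho) points lying within `dist_threshold`
    (Euclidean distance in the Hough plane) of some calculated point."""
    agreements = 0
    for s in stored:
        d = np.hypot(calculated[:, 0] - s[0], calculated[:, 1] - s[1])
        if d.min() <= dist_threshold:
            agreements += 1
    return agreements

# Stored feature amounts vs. feature amounts read back from a printed material.
stored = np.array([[0.10, 5.0], [1.20, -3.0], [2.00, 8.0]])
read_back = np.array([[0.11, 5.1], [1.19, -2.9], [0.70, 20.0]])
print(count_agreements(stored, read_back, dist_threshold=0.5))  # 2
```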
Next, a third exemplary embodiment of the invention will be described. The third exemplary embodiment has the same device structure as the first exemplary embodiment but operates differently. The following description will therefore focus on the content of the operations. In the third exemplary embodiment, the comparison device 300 executes a comparison processing by using a cross spectrum. That is, a comparison is made depending on how similar the image information generated from a registered paper sheet and the image information generated from a printed material are to each other, based on the correlation between the two sets of image information.
First, the controller 210 of the registration device 200 generates image information by reading a paper sheet, and then executes a binarization processing with use of a predetermined threshold. By this processing, each of the white pixels is expressed by a grey-scale value "0", and each of the black pixels is expressed by a grey-scale value "1". Next, the controller 210 divides the image expressed by the image information into plural divisional image areas, and generates superimposed image information by layering the divisional image areas onto each other. Superimposed image information is used because a comparison processing using a cross spectrum requires a large calculation amount and therefore a long processing time. By using superimposed image information in which the divisional image areas divided from an image are layered onto each other, the calculation amount and processing time required for the comparison processing are greatly reduced, while the feature amounts of the detectable substances are maintained in the superimposed image information.
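The generation of superimposed image information may be sketched as follows (Python; the tile grid and the use of a logical OR for "layering" are assumptions, since the text defers the exact rule to expression 2).

```python
import numpy as np

def superimpose(binary_img, rows, cols):
    """Divide a binary image into rows x cols tiles and layer (OR) them."""
    h, w = binary_img.shape
    th, tw = h // rows, w // cols
    out = np.zeros((th, tw), dtype=int)
    for r in range(rows):
        for c in range(cols):
            tile = binary_img[r * th:(r + 1) * th, c * tw:(c + 1) * tw]
            out |= tile  # a pixel is 1 if any layered tile has a 1 there
    return out

img = np.zeros((8, 8), dtype=int)
img[1, 1] = 1   # falls in the top-left tile
img[5, 6] = 1   # falls in the bottom-right tile
s = superimpose(img, 2, 2)
print(s.shape, s.sum())  # (4, 4) 2
```

The superimposed image is a quarter of the original size here, which is what reduces the calculation amount of the later Fourier transforms.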
The controller 210 stores superimposed image information having pixels the grey-scale values of which are expressed by the expression 2, as feature amounts characterizing distribution of detectable substances, into the ID information storage unit 240, with the superimposed image information associated with a paper sheet ID. Hereinafter, the superimposed image information stored in the ID information storage unit 240 will be referred to as “registered superimposed image information”.
Described next will be a comparison processing performed by the comparison device 300.
In the comparison processing, the controller 310 of the comparison device 300 generates superimposed image information (hereinafter referred to as "comparative superimposed image information") based on a printed material, in the same manner as in the generation processing for superimposed image information executed by the controller 210 of the registration device 200 as described above. Further, the controller 310 compares the comparative superimposed image information with registered superimposed image information stored in the ID information storage unit 240.
First, the controller 310 executes a two-dimensional Fourier transform on one set of registered superimposed image information stored in the ID information storage unit 240 and on the comparative superimposed image information (step Se102). Further, the controller 310 calculates a cross spectrum CS based on the registered superimposed image information Fir and the comparative superimposed image information Fi, both of which have been subjected to the two-dimensional Fourier transform (step Se103). The cross spectrum is defined by the following expression 3, wherein F−1 represents the inverse Fourier transform.
CS=F−1(Fir×Fi) (3)
Next, the controller 310 determines whether the comparative superimposed image information has been compared with all sets of registered superimposed image information stored in the ID information storage unit 240 or not (step Se101). If the controller 310 determines that the comparative superimposed image information has not yet been compared with all sets of registered superimposed image information (step Se101: NO), the controller 310 repeats the processing steps Se102 and Se103 described above.
Otherwise, if the comparative superimposed image information has been compared with all sets of registered superimposed image information (step Se101: YES), the controller 310 specifies the paper sheet ID which maximizes the value of the cross spectrum CS (step Se104). Subsequently, the controller 310 determines whether or not the cross spectrum CS calculated for the specified paper sheet ID exceeds a predetermined threshold (step Se105). If the cross spectrum CS is determined to exceed the threshold (step Se105: YES), correlation is considered to be high between the registered superimposed image information and the comparative superimposed image information. The controller 310 accordingly determines that the paper sheet associated with the specified paper sheet ID agrees with the paper sheet of the printed material (step Se106). The aforementioned threshold is provided in view of the case where the paper sheet of the printed material is not registered in the registration device 200. In such a case, the cross spectrum CS takes a relatively small value even when maximized. Providing the threshold prevents erroneous determinations from being made on such printed materials.
Otherwise, if the determination result of step Se105 is "NO", the controller 310 determines that the paper sheet of the printed material is not registered in the registration device 200 (step Se107), and notifies the user of this determination.
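Steps Se101 to Se107 can be sketched as below (Python with numpy; names and data are illustrative). Expression 3 writes CS = F−1(Fir × Fi); conventional cross-correlation implementations conjugate one spectrum, and this sketch follows that convention.

```python
import numpy as np

def cross_spectrum_peak(registered, comparative):
    """Peak of CS per expression 3. Conjugating one spectrum (an assumption
    here) makes the peak a measure of correlation under translation."""
    Fir = np.fft.fft2(registered)
    Fi = np.fft.fft2(comparative)
    cs = np.fft.ifft2(np.conj(Fir) * Fi)
    return np.max(cs.real)

def best_match(registry, comparative, threshold):
    """Steps Se101-Se107: compare against every registered set, pick the
    paper sheet ID with the largest peak, reject if below the threshold."""
    best_id, best_cs = None, -np.inf
    for sheet_id, registered in registry.items():
        cs = cross_spectrum_peak(registered, comparative)
        if cs > best_cs:
            best_id, best_cs = sheet_id, cs
    return (best_id, best_cs) if best_cs > threshold else (None, best_cs)

# Two synthetic "superimposed images" standing in for registered sheets.
rng = np.random.default_rng(0)
a = (rng.random((16, 16)) > 0.8).astype(float)
b = (rng.random((16, 16)) > 0.8).astype(float)
registry = {"sheet-1": a, "sheet-2": b}
print(best_match(registry, a, threshold=1.0)[0])  # "sheet-1"
```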
The inventors conducted experiments using paper sheets as described in the first to third exemplary embodiments. In the experiments, printed materials were prepared by forming images of only CMY toners or of only a K toner on paper sheets. The printed materials were read by the comparison device 300 to check the detection accuracy for detectable substances.
Toners used for forming visible images were those made of polyester resins, pigments, etc. For the CMY toners, pigments for respective colors of C, M, and Y were used as coloring materials, and toners having a weight-average grain size of 7 μm were used for all of these colors. For the K toner, carbon black was used as a pigment, and a toner having a weight-average grain size of 9 μm was used.
Each of the C, M, and Y inks used in this experiment contained water, a pigment (coloring material) which is self-dispersible in water, a water-soluble organic solvent, a surfactant, and a high molecular compound. The pigment which is self-dispersible in water can be manufactured by subjecting a commonly used pigment to a surface reforming treatment such as acid-basic treatment, coupling treatment, polymer graft treatment, plasma treatment, and/or oxidation/reduction treatment. A carbon black pigment was used for the black ink. Pigments respectively appropriate for cyan, magenta, and yellow were used for the cyan, magenta, and yellow inks. Polyvalent alcohols, derivatives of polyvalent alcohols, a nitrogen-containing solvent, alcohols, a sulfur-containing solvent, and/or propylene carbonate were used as the water-soluble organic solvent. A nonionic surfactant was used as the surfactant. The high molecular compound may be any of a nonionic compound, an anionic compound, a cationic compound, and an amphoteric compound.
As shown in
From the experimental results described above, the inventors confirmed that a CMY image does not substantially appear, while detectable substance images and a K image clearly appear, in an image which the image read unit 320 reads from a printed material. That is, detectable substance images can be more easily extracted from an entire image read from a printed material on which no K image is formed, compared with a printed material on which a K image is formed. Also in the case of using inks as coloring materials, an image formed of C, M, and Y inks does not substantially appear in a read image, while an image formed of a black ink appears clearly.
The exemplary embodiments described above may be modified as follows. For example, the following modifications are available. The following modifications can be appropriately combined with each other in practical use.
In each of the exemplary embodiments described above, the comparison device 300 reads a printed material by emitting light in the infrared range (approximately 750 nm to 950 nm). This is because, as shown in
A wavelength range used for reading a printed material may be different from the wavelength range described above, as long as a difference not smaller than a predetermined threshold exists between the spectral reflection factor of the detectable substances and that of the base material, and a difference not smaller than a predetermined threshold also exists between the grey-scale values of pixels of detectable substance images and those of the other images. Specifically, the lowest value capable of separating a CMY image and detectable substance images from each other is specified in advance as a threshold, based on experiments or calculations. A printed material is then read with light in a wavelength range in which the difference between the spectral reflection factor of the visible image (CMY image) and that of the detectable substances is not smaller than the threshold (e.g., Th1 shown in the figures). Meanwhile, the difference between the spectral reflection factor of the base material and that of the detectable substances also needs to be not smaller than a threshold (e.g., Th2 shown in the figures).
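The selection of a usable wavelength range against the thresholds Th1 and Th2 may be sketched as follows (Python; the sampled spectral reflection factors are hypothetical values, not experimental data).

```python
import numpy as np

# Hypothetical spectral reflection factors sampled at 100 nm steps.
wavelengths = np.array([500, 600, 700, 800, 900, 1000])
r_cmy       = np.array([0.30, 0.40, 0.80, 0.90, 0.92, 0.93])  # CMY image
r_base      = np.array([0.85, 0.86, 0.88, 0.90, 0.91, 0.91])  # base material
r_substance = np.array([0.60, 0.55, 0.20, 0.15, 0.14, 0.15])  # detectable substance

# Th1: CMY image vs. substances; Th2: base material vs. substances.
Th1, Th2 = 0.4, 0.4
usable = (np.abs(r_cmy - r_substance) >= Th1) & (np.abs(r_base - r_substance) >= Th2)
print(wavelengths[usable])  # [ 700  800  900 1000]
```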
Also in the above exemplary embodiments, if detectable substances are detected from a paper sheet, images are formed on the paper sheet by using only first coloring materials (CMY toners). When irradiated with light in a particular wavelength range, the first coloring materials reflect that light with an intensity which differs, by a threshold or more, from the intensity of the light reflected by the detectable substances. On the other hand, if no detectable substance is detected from a paper sheet, an image is formed on the paper sheet by using a second coloring material (K toner). The second coloring material reflects the light in the particular wavelength range with an intensity which differs from the intensity of the light reflected by the detectable substances by less than the threshold.
However, the exemplary embodiments are configured with the aim of facilitating extraction of detectable substance images in the case where detectable substances are detected from a paper sheet. Any coloring material can be used if no detectable substance is detected from a paper sheet. That is, only when detectable substances are detected from a paper sheet does an image on the paper sheet need to be formed by using only coloring materials which reflect light in a particular wavelength range at an intensity differing, by a threshold or more, from the intensity of the light reflected by the detectable substances. Toners used for a paper sheet watermarked with detectable substances are not limited to the CMY toners. For example, toners of orange, blue, and/or other colors may be used insofar as the difference from the intensity of light reflected by the detectable substances is not smaller than a threshold in the wavelength range (infrared range) which the image read unit 320 uses for reading images. In the case of using a K toner or a black ink, the carbon black contained in the K toner or black ink reduces the intensity of reflected light in the infrared range, so that detectable substances are difficult to extract. However, some coloring materials, such as dyes, can express black without containing carbon black. In such a case, any coloring material can be used insofar as the coloring material reflects light in the particular wavelength range at an intensity which differs, by not less than a threshold, from the intensity of the light reflected by the detectable substances.
Depending on the conditions, the image forming unit 250 may be configured so as to exclude the image forming engine for the K toner. For example, there can be a case where the registration device 200 is used to prepare important documents and only paper sheets containing detectable substances are set in advance. In this case, the registration device 200 does not form K images on the paper sheets, and an image forming engine for the K toner need not be provided. In this configuration, the registration device 200 omits the step Sd1 in
Also in the exemplary embodiments, the registration device 200 determines whether or not a paper sheet is watermarked with detectable substances on the basis of an extraction result of the object extraction processing. The method of detecting detectable substances is not limited to such a determination method. For example, the registration device 200 may be configured so that a magnetic sensor is provided on the upstream side of the image forming unit 250 along the direction in which paper sheets are fed, and the controller 210 may make the determination based on a detection result of the magnetic sensor. Alternatively, when registering a paper sheet, a user of the registration device 200 may be allowed to specify, through the manipulation unit 230, whether the paper sheet contains detectable substances or not.
Also in the exemplary embodiments, the infrared light source 321 of the comparison device 300 uses an LED light source having the spectral energy distribution shown in
Further, light emitted from the infrared light source 321 need only contain a wavelength component in the infrared range and may also contain other wavelength components. In this case, the sensor 323 has an image pickup element which is sensitive only to the range of approximately 700 nm to 1,000 nm, and the image read unit 320 may generate image information on the basis of the intensity of light in this wavelength range.
Also in the exemplary embodiments described above, the registration device 200 and the comparison device 300 calculate feature amounts characterizing distribution of detectable substances. Calculations of feature amounts are not always required. The comparison device 300 may determine whether take-out of a printed material targeted by a comparison processing is allowed or not, simply depending on whether detectable substances are contained in the printed sheet. In this case, the registration device 200 and the comparison device 300 need not execute the "feature amount calculation processing", and a structure equivalent to the ID information storage unit 240 is therefore not required. More specifically, in this case, the registration device 200 operates to form a visible image by using either CMY toners or CMYK toners, depending on whether detectable substances are detected from a paper sheet or not. When the comparison device 300 makes a comparison, control is performed so as not to allow take-out of a printed material if detectable substance images are extracted from the image information generated by the image read unit 320.
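This simplified, presence-based variant may be sketched as follows (Python; function names are illustrative).

```python
def choose_toners(substances_detected):
    """Simplified registration rule: CMY only when substances are present."""
    return ("C", "M", "Y") if substances_detected else ("C", "M", "Y", "K")

def allow_take_out(substance_images_extracted):
    """Simplified comparison rule: block take-out when substance images
    are extracted from the read image information."""
    return not substance_images_extracted

print(choose_toners(True), allow_take_out(True))  # ('C', 'M', 'Y') False
```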
In each of the exemplary embodiments, the image read unit 320 reads a paper sheet fed from a sheet feed tray, before toner images are transferred by the transfer units. However, the image read unit may be a stand-alone device such as a scanner. A user may set a paper sheet which the user wants to register, and operate the scanner to read the paper sheet. In this case, the user may stock the paper sheet in the sheet feed tray of the registration device 200 after registration of the paper sheet.
Regarding the image read unit 220 of the registration device 200 and the image read unit 320 of the comparison device 300, the surface of a paper sheet (or printed material) that is read and the direction in which it is read vary depending on how the paper sheet is actually set by a user. More specifically, image information can be read from a paper sheet in a total of four different ways, depending on whether the front or back surface of the paper sheet is read and whether the paper sheet is read in a direction from its top to its bottom or vice versa. That is, if the surface and the direction in which a paper sheet is to be read are unspecified, the comparison device 300 cannot satisfactorily achieve an intended comparison unless all four reading patterns are taken into consideration. Next, how image information differs depending on the surface and the direction in which a paper sheet is read will be described for each of the above exemplary embodiments, together with related correction methods.
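The four reading patterns can be modeled as image transforms, for example as below (Python with numpy; modeling back-surface reading as a horizontal mirror is an assumption).

```python
import numpy as np

def reading_patterns(img):
    """The four ways a sheet can be read: {front, back} surface x
    {top-to-bottom, bottom-to-top} direction. Back-surface reading is
    modeled as a horizontal mirror, direction reversal as a 180-degree
    rotation."""
    front_up = img
    front_down = np.rot90(img, 2)
    back_up = np.fliplr(img)
    back_down = np.rot90(np.fliplr(img), 2)
    return [front_up, front_down, back_up, back_down]

img = np.arange(6).reshape(2, 3)
pats = reading_patterns(img)
print(len(pats), pats[1][0, 0])  # 4 5
```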
At first, in the first exemplary embodiment, a front surface of the paper sheet shown in
In the second exemplary embodiment, the position of the origin stays unchanged regardless of which of the aforementioned four different ways is taken to read a paper sheet, provided that the center of the image information is regarded as the origin. However, the coordinate values (θ, ρ) in the Hough plane correspond to the position (π−θ, ρ) if opposite surfaces of a paper sheet are read with the longitudinal direction of the paper sheet oriented in the same direction. Otherwise, the coordinate values (θ, ρ) in the Hough plane correspond to the position (θ, −ρ) if one surface of a paper sheet is read twice with the longitudinal direction of the paper sheet oriented in opposite directions. Still otherwise, the coordinate values (θ, ρ) in the Hough plane correspond to the position (π−θ, −ρ) if opposite surfaces of a paper sheet are read with the longitudinal direction of the paper sheet oriented in opposite directions. That is, the comparison device 300 may carry out a comparison processing by comparing coordinates which are corrected on the basis of the foregoing relationships.
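The corrections above may be sketched as follows (Python; the function name is illustrative).

```python
import numpy as np

def corrected_coordinates(theta, rho):
    """A stored (theta, rho) and its three counterparts for the other
    reading ways, per the relationships in the text (origin at the
    center of the image information)."""
    return [
        (theta, rho),               # same surface, same direction
        (np.pi - theta, rho),       # opposite surface, same direction
        (theta, -rho),              # same surface, opposite direction
        (np.pi - theta, -rho),      # opposite surface, opposite direction
    ]

cands = corrected_coordinates(np.pi / 3, 4.0)
print(cands[1])  # (2*pi/3, 4.0)
```

A comparison can then take the best agreement over the four candidate positions instead of the stored position alone.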
In the third exemplary embodiment, superimposed image information can be generated in four different ways depending on the surface and the direction in which a printed material is read. Therefore, the comparison processing may be carried out by calculating cross spectra based on the comparative superimposed image information and image information obtained by rotating the registered superimposed image information by 90 degrees.
In the exemplary embodiments, the image read units 220 and 320 each generate image information by reading one surface of a paper sheet. Alternatively, the image read units each may generate image information by reading both surfaces of a paper sheet. In this case, the image read unit 220 has the same structure as shown in
Also in the exemplary embodiments, the comparison device 300 calculates feature amounts and performs a comparison processing based on image information which is read and generated by the image read unit 320. The comparison device 300 may alternatively be configured to perform the comparison processing based on image information obtained from a device provided in the outside space. For example, suppose that the comparison device 300 has a communication unit as an interface device for communicating via a network and can communicate with an external scanner provided in the outside space. If a printed material is read by the external scanner, the comparison device 300 obtains the image information and performs a comparison processing. Even if a printed material for internal use only is taken out, the foregoing comparison processing enables the controller 310 to specify the location of the printed material by identifying the external scanner used for reading it. The controller 310 can further specify a paper sheet ID from the feature amounts characterizing the distribution of detectable substances contained in the printed material, and property information as shown in
The external scanner is set near the door 400 in an outside space, and the comparison device 300 executes a comparison processing, based on an image read by the scanner. Further, the comparison device 300 refers to a field not shown, which is associated with property information and describes whether take-in is allowed or not. If take-in is allowed, the comparison device 300 outputs a control signal to the door open/close unit 401 so as to open the door 400. At this time, the comparison device 300 detects that a printed material which has been taken out is returned, and writes the return of the printed material into a file. Needless to say, if a printed material is taken out, the comparison device 300 writes the take-out of the printed material into the file.
In the exemplary embodiments, the controller 310 of the comparison device 300 specifies a paper sheet ID by a comparison processing, and then, outputs a control signal for controlling open/close of the door 400, depending on the content of the ID information management table 241. However, information concerning a comparison result which the controller 310 outputs is not limited to the control signal. For example, the comparison device 300 may refer to the property information table 242 shown in
Also in the exemplary embodiments, the registration device 200 performs processings relating to registration of a paper sheet, and the comparison device 300 performs processings relating to comparison of a printed material. However, all of these processings may be performed by one single device, or processes common to both devices may be shared between them. Otherwise, the processings of both devices may partially be performed by an external device.
In a case where the processings of the registration device 200 and the comparison device 300 are performed by one single device (hereinafter a "registration/comparison device"), a user performs a manipulation commanding registration of a paper sheet, and the registration/comparison device then generates image information by reading the paper sheet (first recording medium) which is set on an image read device equivalent to the image read unit 220. Further, the registration/comparison device performs control for forming a visible image on the paper sheet by using only CMY toners or CMYK toners, depending on whether detectable substances are extracted from the paper sheet or not. On the other hand, the registration/comparison device calculates feature amounts characterizing the distribution of detectable substances, and stores the calculated feature amounts into the ID information storage unit. When the user performs a manipulation commanding a comparison of a printed material, the registration/comparison device causes an image read device equivalent to the image read unit 320 to read the printed material (second recording medium) and generate image information. Further, the registration/comparison device calculates feature amounts characterizing the distribution of detectable substances based on the image information. The registration/comparison device then reads the feature amounts stored in the ID information storage unit, compares them with the calculated feature amounts, and outputs information concerning a comparison result. In this case, reading of a paper sheet which is carried out by the image read device equivalent to the image read unit 220 may instead be carried out by the image read device equivalent to the image read unit 320.
In the comparison system 100, the functions of the image forming unit 250 in the registration device 200 may be performed by an image forming device as an external device. In this case, the registration device outputs color information for forming a visible image of CMY toners or CMYK toners through a communication interface not shown, and causes the image forming device to form the visible image on a paper sheet contained in the image forming device. At this time, the registration device obtains a detection result, for example, from a detection unit for detecting detectable substances, which is provided in the image forming device. Depending on the detection result, the registration device may determine whether color information for CMY toners or for CMYK toners should be generated.
Further, the ID information storage unit 240 may be included in the comparison device 300 or may be an external storage device.
The feature amount calculation program P2 and the feature amount calculation/comparison program P4 in the above exemplary embodiments can be provided, recorded in a recording medium such as a magnetic tape, a magnetic disk, a flexible disk, an optical recording medium, a magneto-optical recording medium, a CD (Compact Disk), a DVD (Digital Versatile Disk), or a RAM.
The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiment was chosen and described in order to best explain the principle of the invention and its practical application, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---|
2007-263386 | Oct 2007 | JP | national |