This application is based on Japanese Patent Application No. 2015-051686 filed in Japan, the contents of which are hereby incorporated by reference.
1. Field of the Invention
The present invention relates to an image reading device having a duplex parallel reading function and to a non-transitory computer readable recording medium. The present invention more specifically relates to a technique of detecting a specific document.
2. Description of the Background Art
An image reading device that includes a function to read both sides of a document at the same time has image reading units for the front side and the back side of the document at positions separate from each other in the document path. An image reading process is therefore performed for each of the front side and the back side read by the image reading device. More specifically, in most cases, a processing unit that processes the image data generated by reading the document is required for each of the front side and the back side of the document, so that two processing units are required for the entire device.
Some image reading devices have a function to determine whether or not the read document is a specific document such as a bank bill or securities. Some of these devices are capable of preventing forgery of specific documents by terminating the reading process when determining that the read document is a specific document. An image reading device provided with a processing unit for each of the front side and the back side of the document generally also includes two processing units, one for each side, for the determination process of the specific document. In such a case, the number of processing units required in the image reading device becomes large, the circuit size of the device becomes twice as large, and the structure of the device becomes complicated. The device is also expensive to manufacture.
An image processing device is known that reduces the image data of the front side and the back side of the read document in the main scanning direction and creates a linked image by lining up and linking the two reduced pieces of image data. The image processing device then determines whether or not the linked image shows the specific document. This known technique is introduced, for example, in Japanese Patent Application Laid-Open No. JP 2005-26880 A. According to this image processing device, only one processing unit that determines whether or not the image data shows the specific document is required, and two such units are unnecessary. The image processing device according to the known technique, however, also has the problems described below.
According to the known technique, even when each of the two pieces of image data is compressed to half of its original size, the number of the blocks B100 that should be read for the detection of the characteristic patterns 101, 102 and 103 is no different from that of the two images of the front side and the back side. In other words, the number of the blocks that should be read in the linked image GC is still that of two images, which is not reduced. For the single processing unit to perform the process to detect the characteristic pattern, twice the time required for the process performed on a single image is therefore needed.
The present invention is intended to solve the above problems. Thus, the present invention is intended to provide an image reading device and a non-transitory computer readable recording medium capable of suppressing the time required for the determination while enabling a single determining processing unit to determine whether or not a read document is a specific document.
First, the present invention is directed to an image reading device.
According to one aspect of this invention, the image reading device comprises: a feeder that feeds a document along a predetermined path; a first reading part that reads a first side of said document at a first position in said path and creates a first image; a second reading part that reads a second side of said document at a second position which is located posterior to said first position in said path and creates a second image; a synthesizing part that lays said first image and said second image on top of one another and creates a composite image; and a detecting part that detects a characteristic pattern of a specific document from said composite image created by said synthesizing part.
Second, the present invention is directed to a non-transitory computer readable recording medium storing a program to be executed by an image reading device that includes a feeder that feeds a document along a predetermined path; a first reading part that reads a first side of said document at a first position in said path and creates a first image; and a second reading part that reads a second side of said document at a second position which is located posterior to said first position in said path and creates a second image.
According to one aspect of this invention, the program causes said image reading device to function as a system comprising: a synthesizing part that lays said first image and said second image on top of one another and creates a composite image; and a detecting part that detects a characteristic pattern of a specific document from said composite image created by said synthesizing part.
Third, the present invention is directed to an image reading method.
According to one aspect of this invention, the image reading method comprises: (a) feeding a document along a predetermined path; (b) reading a first side of said document at a first position in said path and creating a first image; (c) reading a second side of said document at a second position which is located posterior to said first position in said path and creating a second image; (d) laying said first image and said second image on top of one another and creating a composite image; and (e) detecting a characteristic pattern of a specific document from said composite image created in said step (d).
A preferred embodiment of the present invention is described in detail below with reference to the figures. In the description given below, those elements which are shared in common among the figures are represented by the same reference numerals, and redundant description of these elements is not repeated.
The document placed in the document tray 6a is taken out by a pickup roller, which is not shown in the figures.
As an example of the process at the image reading device 1, after the images of both sides of the document are created, a single composite image is created by laying the image of one side and the image of the other side on top of one another, and a characteristic pattern of a specific document is detected from the created composite image. When the image of one side and the image of the other side are laid on top of one another and synthesized, the number of the blocks that form the image does not change, because a process such as data compression is not performed. More specifically, an addition synthesizing process is performed for each pair of corresponding blocks by laying the two images on top of one another, so the number of the blocks does not change. The number of the blocks to be processed to determine whether or not the characteristic pattern of the specific document is included in the created composite image is therefore the same as the number of the blocks of a single image, and the determination of the characteristic pattern may be performed in the same process time as for a single image.
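The following is a minimal sketch, in Python with NumPy, of how such a block-wise addition synthesis could look. The grayscale format, the 8-bit pixel range and the clipping behavior are illustrative assumptions rather than details taken from the embodiment.

```python
import numpy as np

def synthesize_by_addition(first_image: np.ndarray,
                           second_image: np.ndarray) -> np.ndarray:
    """Lay two equally sized grayscale images on top of one another by
    adding corresponding pixels, clipping at the maximum pixel value."""
    if first_image.shape != second_image.shape:
        raise ValueError("both sides must have the same dimensions")
    # Widen the type before adding so the sum does not wrap around,
    # then clip back into the 8-bit range.
    composite = first_image.astype(np.uint16) + second_image.astype(np.uint16)
    return np.clip(composite, 0, 255).astype(np.uint8)

# The composite has exactly as many pixels (blocks) as a single side,
# so a detector that scans it once covers both sides in one pass.
front = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
back = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
assert synthesize_by_addition(front, back).shape == front.shape
```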
The first reading unit 21 includes the first imaging unit 10, a first reading part 23 and a first stated image creating part 25. The first reading part 23 reads the first side of the document at the first imaging position P1 in the path feeding the document and creates the first image. To do so, the first reading part 23 converts the analog signal obtained by the first imaging unit 10 reading the document into a digital signal. The first reading part 23 sends the created first image to the first stated image creating part 25. The first stated image creating part 25 extracts an image component that has an image characteristic of the characteristic pattern of the specific document from the first image and creates the first stated image. The first stated image creating part 25 reads characteristic pattern information relating to the characteristic pattern stored in a storage, which is not shown in the figures.
A color component, for instance, is one example of the image characteristic of the characteristic pattern R1 used for the creation of the first stated image by the first stated image creating part 25. In this case, the first stated image creating part 25 extracts the color component of the characteristic pattern from the first image and creates the first stated image. By extracting the color component of the characteristic pattern, a first stated image that does not contain color components different from that of the characteristic pattern is created. Thus, the data size of the first stated image can be reduced, and the storage capacity of the image buffer 30 required for storing the first stated image may be decreased.
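As an illustration of this color-based extraction, the sketch below keeps only the pixels whose color is close to a reference color of the characteristic pattern. The reference color, the tolerance and the binary single-channel output are assumptions made for the example, not values specified in the embodiment.

```python
import numpy as np

def extract_color_component(image_rgb: np.ndarray,
                            pattern_color: tuple,
                            tolerance: int = 30) -> np.ndarray:
    """Keep only the pixels whose color is close to the color of the
    characteristic pattern; everything else becomes zero (background).
    `pattern_color` and `tolerance` are illustrative parameters."""
    diff = np.abs(image_rgb.astype(np.int16)
                  - np.array(pattern_color, dtype=np.int16))
    mask = np.all(diff <= tolerance, axis=-1)
    stated = np.zeros(image_rgb.shape[:2], dtype=np.uint8)
    stated[mask] = 255  # a sparse single-channel image is cheap to buffer
    return stated

page = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
first_stated = extract_color_component(page, pattern_color=(200, 30, 30))
```

Because non-matching pixels are dropped, the stated image can be held as a single channel, which is one way the reduction of the buffer capacity mentioned above could be realized.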
Different from what is described above, the first stated image creating part 25 is also capable of using a frequency component as the image characteristic of the characteristic pattern R1. To be more specific, the characteristic pattern is in many cases contained in a predetermined band of the spatial frequency domain of the first image. By extracting the image component having the predetermined frequency component from the first image, the image containing the characteristic pattern R1 can be extracted. In this case, the first stated image creating part 25 extracts the frequency component of the characteristic pattern from the first image and creates the first stated image. The characteristic pattern R1 appears at a predetermined frequency on the specific document. The first stated image creating part 25 is capable of extracting, by using a filter, only the image component that matches the frequency of the characteristic pattern R1 from the image shown as the first image. As a result, the data size of the first stated image can be reduced. Also, the characteristic pattern may be detected in a shorter time compared to detecting the characteristic pattern from the first image itself.
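A band-pass filter in the spatial frequency domain is one way such a frequency-based extraction could be sketched. The normalized cutoff frequencies below are placeholders, since the embodiment does not state concrete values.

```python
import numpy as np

def extract_frequency_component(image: np.ndarray,
                                low: float, high: float) -> np.ndarray:
    """Band-pass filter a grayscale image in the spatial frequency domain,
    keeping only the band in which the characteristic pattern is expected.
    `low` and `high` are normalized cutoff frequencies (0..0.5) chosen for
    illustration."""
    spectrum = np.fft.fftshift(np.fft.fft2(image.astype(np.float64)))
    h, w = image.shape
    yy, xx = np.mgrid[:h, :w]
    # Radial frequency of each coefficient, normalized by the image size.
    radius = np.hypot((yy - h / 2) / h, (xx - w / 2) / w)
    band = (radius >= low) & (radius <= high)
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * band))
    return np.abs(filtered)

page = np.random.rand(64, 64)
first_stated = extract_frequency_component(page, low=0.1, high=0.25)
```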
The second reading unit 22 includes the second imaging unit 12, a second reading part 24 and a second stated image creating part 26. The second reading part 24 obtains the analog data generated by reading the other side (the second side) of the document from the second imaging unit 12, and converts the obtained analog data into digital data. The second reading part 24 sends the created second image to the second stated image creating part 26 in the same manner as the first reading part 23. The second stated image creating part 26 extracts the image component that has the image characteristic of the characteristic pattern from the second image and creates the second stated image. The second stated image creating part 26 uses the color component as the image characteristic of the characteristic pattern in the same manner as the first stated image creating part 25, extracting the image component from the second image to create the second stated image. Alternatively, the second stated image creating part 26 is capable of using the frequency component as the image characteristic of the characteristic pattern; in that case it extracts the image component of the same frequency as the frequency of the characteristic pattern from the second image, thereby creating the second stated image.
The image buffer 30 is constructed by a device such as a memory capable of storing multiple lines of the images, for example. The image buffer 30 is a storage region in which the first image created by the first reading part 23 and the second image created by the second reading part 24 are stored. The first reading part 23 and the second reading part 24 read the first side and the second side along a predetermined reading line, respectively, and create a first line image and a second line image that form the first image and the second image. The first reading part 23 and the second reading part 24 then store the created images in the image buffer 30. Alternatively, the first stated image creating part 25 and the second stated image creating part 26 may create the first stated image and the second stated image that correspond to the first line image and the second line image and store them in the image buffer 30. The line images are thus stored in the image buffer 30. As a result, every time the predetermined number of line images is stored in the image buffer 30, an image synthesizing part 45 reads them and creates a composite band image that forms the composite image. Consequently, even before the reading of the first side or the second side is complete, the characteristic pattern can be detected from the composite band image. The characteristic pattern can therefore be detected at an early stage when a specific document is read, and the subsequent process may be terminated.
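A toy model of this line-by-line buffering is sketched below. The band height of 16 lines and the queue-based structure are assumptions chosen only to illustrate how a composite band becomes available long before the whole page has been read.

```python
from collections import deque

class LineImageBuffer:
    """Toy model of the image buffer 30 for one side of the document:
    line images are appended as they are read, and a band of
    `lines_per_band` lines is handed over as soon as it is complete."""

    def __init__(self, lines_per_band: int = 16):
        self.lines_per_band = lines_per_band
        self._lines = deque()

    def push_line(self, line) -> None:
        self._lines.append(line)

    def pop_band(self):
        """Return the next complete band, or None if not enough lines yet."""
        if len(self._lines) < self.lines_per_band:
            return None
        return [self._lines.popleft() for _ in range(self.lines_per_band)]
```

Holding the first line images and the second line images in such queues, the image synthesizing part 45 can build a composite band image whenever both sides have a complete band, so the pattern detection and any termination can happen early, as described above.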
The CPU 35 executes a program stored in a storage, which is not shown in the figures.
The image synthesizing part 45 lays the first image and the second image on top of one another, thereby creating the composite image. The first image and the second image are created and stored in the image buffer 30 by the first reading part 23 and the second reading part 24, respectively. In this case, the image synthesizing part 45 reads the first and the second images from the image buffer 30 and creates the composite image.
The image component having the image characteristic may be extracted from each of the first and the second images, and the first stated image and the second stated image may be created by the first stated image creating part 25 and the second stated image creating part 26. In this case, the first stated image creating part 25 and the second stated image creating part 26 store the created first stated image and second stated image in the image buffer 30. The image synthesizing part 45 may then read the first stated image and the second stated image from the image buffer 30 and lay the first stated image and the second stated image on top of one another, thereby creating the composite image.
The first reading part 23 and the second reading part 24 read the first side and the second side along the predetermined reading line and create the first line image and the second line image that form the first image and the second image, respectively. In this case, the image synthesizing part 45 reads the first line images or the second line images from the image buffer 30 and creates the composite band image. When the first image and the second image are created along the predetermined reading line, respectively, the first stated image creating part 25 and the second stated image creating part 26 may create the first stated image and the second stated image corresponding to the first line image and the second line image.
After the predetermined number of lines of the first line images or the second line images is stored in the image buffer 30, the image synthesizing part 45 may read the stored first line images or second line images and create the composite band image.
When the document reaches the second imaging position P2 at timing t2, the second reading part 24 starts creating the second line images. From timing t2 to t3, the first line images and the second line images are created and stored in the image buffer 30, respectively. During this period, the image synthesizing part 45 reads the predetermined number of the first line images and the second line images from the image buffer and creates the composite band images. The reading of the first side of the document is complete and the creation of the first line images ends at timing t3. The reading of the second side of the document is then complete, and the creation of the second line images ends at timing t4. With the creation of the second line images, the creation of the composite band images is also complete. From timing t4 to t5, the document is fed and discharged to the catch position. The pattern detecting part 46 detects the characteristic pattern from the composite band image in the last line during this period (determining time). Multiple sheets of the document may be read continuously. In this case, the next document is fed to the first imaging position P1 while the previous document is discharged, from timing t4 to t5. After the detection of the characteristic pattern is performed for the composite band image in the last line, the reading operation of the next document is started and the creation of the first band image is started. More specifically, when the image synthesizing part 45 does not read the first band image and the second band image at the same time, the creation of the composite band image may be started at almost the same time as the creation of the first band image.
A correcting part 50 corrects the difference between the timing to begin the creation of the first image and the timing to begin the creation of the second image. The correcting part 50 synchronizes the timing to read the first one of the first line images with the timing to read the first one of the second line images, and reads the images from the image buffer. In this case, the image synthesizing part 45 synthesizes the first one of the first line images and the first one of the second line images read by the correcting part 50, thereby creating the first composite band image.
One of the first line image and the second line image may be ready to be read while the other is not. In this case, the correcting part 50 serves as a line correcting part that reads the first line image and the second line image from the image buffer 30 after waiting until both the first line image and the second line image are ready to be read.
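The waiting behavior of the line correcting part could be sketched as follows. The list-based buffers and the index-based access are stand-ins for the image buffer 30, introduced only for illustration.

```python
def read_synchronized_lines(first_lines: list, second_lines: list, index: int):
    """Return the pair of line images at `index` only once the line is
    available for BOTH sides; otherwise signal the caller to wait."""
    first_ready = index < len(first_lines)
    second_ready = index < len(second_lines)
    if not (first_ready and second_ready):
        return None  # wait: one side has not reached this line yet
    return first_lines[index], second_lines[index]

# The second side starts later, so its buffer lags behind the first.
first_lines = ["f0", "f1", "f2"]
second_lines = ["s0"]
assert read_synchronized_lines(first_lines, second_lines, 0) == ("f0", "s0")
assert read_synchronized_lines(first_lines, second_lines, 1) is None
```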
The pattern detecting part 46 includes a document detecting part 54 and a document determining part 55. When the characteristic pattern is detected from the composite image, the document detecting part 54 detects the characteristic pattern from each of the first image and the second image. The document determining part 55 determines the side of the document from which the characteristic pattern is detected based on the detection result of the document detecting part 54. That is, when the pattern detecting part 46 detects the characteristic pattern from the composite image, the document detecting part 54 performs a process to detect the characteristic pattern for each of the first and the second images in order to identify which of the first and the second images, laid on top of one another in the composite image, shows the characteristic pattern. The document determining part 55 determines whether the characteristic pattern is detected from the first side or the second side of the document based on the detection result of the document detecting part 54.
When the document detecting part 54 detects the characteristic pattern from the first image, the document determining part 55 determines that the characteristic pattern is detected from the first side. When the document detecting part 54 detects the characteristic pattern from the second image, the document determining part 55 determines that the characteristic pattern is detected from the second side. The document determining part 55 may determine that the characteristic pattern is detected from both the first side and the second side when the characteristic pattern detected from the composite image appears in a predetermined manner.
The output controller 47 includes an output restricting part 52 and a process restricting part 53. The output restricting part 52 restricts the output from the output unit 38 when the characteristic pattern is detected from the composite image by the pattern detecting part 46. The output of the specific document from the printer section or transmission of the specific document from the fax section may thereby be restricted. The process restricting part 53 restricts the creation of the first image or the second image by the first reading part 23 or the second reading part 24 when the characteristic pattern is detected from the composite image by the pattern detecting part 46. The output restricting part 52 and the process restricting part 53 send a terminating request to the controller 40, which is capable of terminating the process at each unit, for example, thereby terminating the output and the creation of the image when the characteristic pattern is detected by the pattern detecting part 46.
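A rough sketch of this restriction logic is given below; the `controller` object and its `terminate` method are hypothetical names introduced only for the example, not elements defined in the embodiment.

```python
class OutputController:
    """Sketch of the output controller 47: when the pattern detecting part
    reports a hit, output and further image creation are both stopped by
    sending terminating requests to a controller object."""

    def __init__(self, controller):
        self.controller = controller

    def on_pattern_detected(self) -> None:
        # Output restricting part 52: stop printing / fax transmission.
        self.controller.terminate("output")
        # Process restricting part 53: stop creating the first / second image.
        self.controller.terminate("reading")
```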
The sequential procedure of the process at the image reading device 1 of the present preferred embodiment is described next.
The image reading device 1 determines if it is the timing to start reading the second side of the document (step S7). The image reading device 1 may determine it is the timing to start reading the second side (when a result of step S7 is YES). In this case, the image reading device 1 performs the second reading process to read the second side (step S9). The image reading device 1 may determine it is not the timing to start reading the second side (when a result of step S7 is NO). In this case, the image reading device 1 determines whether or not it is during the reading operation of the second side (step S8). When determining that it is during the reading operation of the second side (when a result of step S8 is YES), the image reading device 1 performs the second reading process. When determining that it is not during the reading operation of the second side (when a result of step S8 is NO), the image reading device 1 skips the second reading process. The detail of the second reading process is described later.
The image reading device 1 determines if it is the timing to start creating the composite image (step S15). The image reading device 1 may determine it is the timing to start (when a result of step S15 is YES). In this case, the image reading device 1 performs the process to create the composite image (step S17). The image reading device 1 may determine it is not the timing to start creating the composite image (when a result of step S15 is NO). In this case, the image reading device 1 determines whether or not it is during the creation of the composite image (step S16). When determining that it is during the creation of the composite image (when a result of step S16 is YES), the image reading device 1 performs the synthesizing process. When determining that it is not during the creation of the composite image (when a result of step S16 is NO), the image reading device 1 skips the synthesizing process. The detail of the synthesizing process is described later. After the synthesizing process, the image reading device 1 performs the pattern detecting process to detect the characteristic pattern (step S20). The image reading device 1 determines whether or not the characteristic pattern is detected (step S21). The image reading device 1 may determine that the characteristic pattern is not detected (when a result of step S21 is NO). In this case, the image reading device 1 determines if the reading operation is complete (step S28). When determining that the reading operation is complete (when a result of step S28 is YES), it can be said that the document is not the specific document. The image reading device 1, therefore, executes the job (step S30) and completes the process. When determining that the reading operation is not complete (when a result of step S28 is NO), the image reading device 1 returns to the process in step S3 and repeats the process from the first reading process.
The image reading device 1 may determine that the characteristic pattern is detected (when a result of step S21 is YES). In this case, the image reading device 1 determines whether or not to identify from which side of the document the characteristic pattern is detected (step S23). When the image reading device 1 makes the identification (when a result of step S23 is YES), it performs the process to determine the side of the document (step S25) and completes the process. When the image reading device 1 does not make the identification (when a result of step S23 is NO), it skips the process to determine the side of the document and completes the process.
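Purely as a reading aid, the branching of steps S3 through S30 described above can be condensed into the sketch below. Every method on the `device` object is a hypothetical placeholder; only the control flow mirrors the description.

```python
def reading_job(device):
    """Condensed control flow of the reading job (steps S3-S30)."""
    while True:
        device.first_reading_process()            # S3-S5: read the first side
        if device.second_side_reading_needed():   # S7 / S8
            device.second_reading_process()       # S9: read the second side
        if device.synthesis_needed():             # S15 / S16
            device.synthesizing_process()         # S17: create composite (band) image
        device.pattern_detecting_process()        # S20
        if device.pattern_detected():             # S21
            if device.side_determination_needed():    # S23
                device.determine_side_of_document()   # S25
            return                                # reading is restricted
        if device.reading_complete():             # S28
            device.execute_job()                  # S30: not a specific document
            return
```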
When determining that no characteristic pattern is detected in the predetermined manner (when a result of step S101 is NO), the image reading device 1 reads the first stated image (step S103). The image reading device 1 reads the first stated image and determines whether or not the image contained in the first stated image matches with the characteristic pattern (step S105). In response to determining that the image contained in the first stated image matches with the characteristic pattern (when a result of step S105 is YES), the image reading device 1 determines that the characteristic pattern is detected from the first side (step S107), then completes the process. On the other hand, when determining that the image contained in the first stated image does not match with the characteristic pattern (when a result of step S105 is NO), the image reading device 1 reads the second stated image (step S108) and determines whether or not the image contained in the second stated image matches with the characteristic pattern (step S110). In response to determining that the image contained in the second stated image matches with the characteristic pattern (when a result of step S110 is YES), the image reading device 1 determines that the characteristic pattern is detected from the second side (step S112), then completes the process. When determining that the image contained in the second stated image does not match with the characteristic pattern (when a result of step S110 is NO), the image reading device 1 determines that it is a detection error caused by a false detection in the pattern detecting process (step S114).
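The side determination of steps S101 through S114 can likewise be condensed as follows. Again, the methods on `device` are hypothetical placeholders, and only the branching follows the text; the YES branch of step S101 corresponds to the case, described earlier, in which the pattern is determined to be detected from both sides.

```python
def determine_detected_side(device) -> str:
    """Condensed side determination (steps S101-S114)."""
    if device.detected_in_predetermined_manner():      # S101: YES
        return "first and second sides"
    first_stated = device.read_first_stated_image()    # S103
    if device.matches_pattern(first_stated):           # S105: YES
        return "first side"                            # S107
    second_stated = device.read_second_stated_image()  # S108
    if device.matches_pattern(second_stated):          # S110: YES
        return "second side"                           # S112
    return "detection error (false detection)"         # S114
```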
The image reading device 1 of the present preferred embodiment reads the first stated image in step S103 and the second stated image in step S108. Alternatively, the image reading device 1 may read the first image in the memory and determine if the image in the first image matches with the characteristic pattern in step S103, and may read the second image in the memory and determine if the image in the second image matches with the characteristic pattern in step S108. Still alternatively, when determining in step S105 that the image in the first stated image does not match with the characteristic pattern, the image reading device 1 may determine that the characteristic pattern is detected from the second side. Still alternatively, the image reading device 1 may restart the job terminated in the pattern detecting process when determining in step S114 that the detection is an error caused by a false detection.
As described above, the image reading device 1 of the present application is capable of determining, in a single process, whether the read document is a specific document from the two images of the first side and the second side. In addition, the image reading device 1 takes almost the same time to make the determination for the first side and the second side as that required for determining a single image.
While the preferred embodiment of the present invention has been described above, the present invention is not limited to the preferred embodiment. Various modifications may be applied to the present invention.
According to the above-described preferred embodiment, the job is executed when the reading operation of all the documents is complete and no characteristic pattern is detected. However, this is given not for limitation. Even before completion of the reading operation of the documents, the job may be executed for a document from which no characteristic pattern is detected through the pattern detecting process (step S20).
According to the above-described preferred embodiment, whether the image contained in the composite band image forming the composite image matches with the characteristic pattern is determined in the detecting process of the characteristic pattern, as an example. However, this is given not for limitation. Whether the image contained in the composite image matches with the characteristic pattern may instead be determined after the entire composite image is created.
Number | Date | Country | Kind
---|---|---|---
2015-051686 | Mar 2015 | JP | national