The present application claims priority to and incorporates by reference the entire contents of Japanese Patent Application No. 2012-286015 filed in Japan on Dec. 27, 2012.
1. Field of the Invention
The present invention relates to an image processing apparatus, an image processing method, and a computer program product.
2. Description of the Related Art
Conventionally, for example, as disclosed in Japanese Laid-open Patent Publication No. 2012-177961, there has been known a technique of designating a template image and detecting, within the same image, a similar image area in which an image similar to the template image is located.
However, when the parameter and/or the template used for detecting the similar image area is changed and the similar-image-area detection is performed multiple times in order to detect all of a number of similar image areas located within the same image, for example, the same area is likely to be extracted in a slightly different shape, or undesired areas are likely to be extracted. That is, conventionally, there has been a problem that the similar image areas within the same image cannot be detected with high accuracy.
In view of the above-mentioned conventional technique, there is a need to provide an image processing apparatus, an image processing method, and a computer program product that are able to detect the similar image areas within the same image with high accuracy.
It is an object of the present invention to at least partially solve the problems in the conventional technology.
According to the present invention, there is provided an image processing apparatus comprising: an acquisition unit configured to acquire image data; a first setting unit configured to set an area included in the image data as a template area; a second setting unit configured to set a search area indicating an area to be subjected to a detection process for detecting an area similar to the template area from the image data; and a detection unit configured to perform the detection process for detecting, out of the search area, whether or not there is an area similar to the template area in an area other than an area corresponding to each of the template area and an area that has been detected by the detection process.
The present invention also provides an image processing method comprising: an acquisition step for acquiring image data; a first setting step for setting an area included in the image data as a template area; a second setting step for setting a search area indicating an area to be subjected to a detection process for detecting an area similar to the template area from the image data; and a detection step for performing the detection process for detecting, out of the search area, whether or not there is an area similar to the template area in an area other than an area corresponding to each of the template area and an area that has been detected by the detection process.
The present invention also provides a computer program product comprising a non-transitory computer-readable recording medium having a computer program that causes a computer to function as: an acquisition unit configured to acquire image data; a first setting unit configured to set an area included in the image data as a template area; a second setting unit configured to set a search area indicating an area to be subjected to a detection process for detecting an area similar to the template area from the image data; and a detection unit configured to perform the detection process for detecting, out of the search area, whether or not there is an area similar to the template area in an area other than an area corresponding to each of the template area and an area that has been detected by the detection process.
The above and other objects, features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
With reference to the attached drawings, an embodiment of an image processing apparatus, an image processing method, and a computer program product according to the present invention will be described below in detail.
The image processing apparatus 100 includes a CPU 101, a memory 102, a user I/F unit 103, a display device 104, and a communication device 105, which are connected to each other via an internal bus 106. The CPU 101 controls the entire operation of the image processing apparatus 100 in an integral manner. The memory 102 stores various data therein such as a program and the like executed by the CPU 101. The user I/F unit 103 is a device for input operation and, for example, includes a keyboard, a mouse, and the like. The display device 104 is a device adapted to display various information and, for example, may be configured with a liquid crystal display device and the like. The communication device 105 is a device for communicating with external devices.
As illustrated in
The acquisition unit 10 has the function of acquiring, as a PDF, text data of a manuscript for which the characteristics designation is made, rendering the text data as an image, and displaying it on the screen of the display device 104. From another point of view, it can be understood that the acquisition unit 10 has the function of acquiring the image data that is to be subjected to the characteristics designation.
The second setting unit 20 sets a search area that indicates the area to be subjected to a detection process by which the area similar to the template area is detected from the image data (the image data to be subjected to the characteristics designation). In the present embodiment, viewing the displayed image data, the user designates an area intended to be subjected to the detection process by using the mouse serving as the user I/F unit 103. Then, the second setting unit 20 sets the area designated by the user, out of the image data, as a search area and registers the set search area in the storage unit 60. That is, the second setting unit 20 has the function of setting, according to the user input, the search area indicating the area of the image data to be subjected to the detection process.
Turning back to
In the present embodiment, viewing the displayed image data, the user designates the area intended to be the template area by using the mouse that is the user I/F unit 103. Then, the first setting unit 30 sets the area designated by the user out of the image data as the template area and registers the set template area in the storage unit 60.
Further, for each template area registered in the template area list (to be utilized in the detection process) according to the user input, the first setting unit 30 sets the similarity degree, which is a parameter to be utilized in detecting the area similar to the template area. The value of the parameter representing the similarity degree can be set such that a smaller value represents greater similarity, while a larger value represents less similarity. The threshold for the value of the parameter can be predefined depending on the similarity degree: for example, the threshold may be set to A when the similarity degree is “high”, to B (>A) when the similarity degree is “middle”, and to C (>B) when the similarity degree is “low”. When searching for an area that is likely to be falsely detected, the similarity degree may be set to “high” so as to avoid falsely detecting areas other than the area intended to be extracted.
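For illustration only, the threshold scheme above might be sketched as follows. The concrete values and the function name are assumptions introduced here for the sketch; the embodiment only requires the ordering A < B (>A) < C (>B), with smaller values demanding greater similarity.

```python
# Hypothetical threshold table (values are illustrative, not from the
# embodiment): smaller values demand greater similarity, so "high"
# corresponds to the smallest threshold A, per the A < B < C ordering.
SIMILARITY_THRESHOLDS = {
    "high": 0.05,    # A: strictest match, least risk of false detection
    "middle": 0.15,  # B (> A)
    "low": 0.30,     # C (> B): loosest match
}

def threshold_for(similarity_degree: str) -> float:
    """Return the matching threshold for a user-selected similarity degree."""
    return SIMILARITY_THRESHOLDS[similarity_degree]
```

In this reading, selecting “high” for an error-prone area tightens the match criterion, so only patches very close to the template pass.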
Further, for each template area registered in the template area list, the first setting unit 30 sets the number of re-search times according to the user input. For example, when the number of re-search times associated with a particular template area is zero, the area detected by the detection process in which that template area is used is not added to the template area list. When the number of re-search times associated with the template area is one, the area detected by the detection process in which the template area is used is registered to the template area list (the first time), while the area detected by the detection process in which the registered area is used is not registered. When the number of re-search times associated with the template area is two, the area detected by the detection process in which the template area is used is registered (the first time), and the area detected by the detection process in which the area registered at the first time is used is also registered (the second time), while the area detected by the detection process in which the area registered at the second time is used is not registered.
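The re-search bookkeeping described above can be illustrated with the following toy simulation. The function name, the detection stub, and the list format are assumptions made for this sketch; the point is only that a template with a re-search count of n yields exactly n further generations of registered areas.

```python
# Toy simulation of the re-search count semantics described above.
# Names and the stubbed "detection" are illustrative assumptions:
# each detected area inherits the used template's count minus one,
# and a template with count 0 yields no further registration.
def simulate_re_search(re_search_times):
    template_list = [("T", re_search_times)]  # (area id, remaining count)
    registered_generations = []
    generation = 0
    while template_list:
        area, count = template_list.pop(0)
        generation += 1
        # ... the detection process using `area` would run here ...
        if count > 0:
            # detected areas are registered with the count decremented by one
            registered_generations.append(generation)
            template_list.append((f"D{generation}", count - 1))
        # the used template entry is removed (popped) in every case
    return registered_generations
```

Running this with a count of two registers the first and second generations but not the third, matching the third example in the paragraph above.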
Further, for each template area registered in the template area list, the first setting unit 30 may also set, according to the user input, application area information that indicates whether the template area is a template area for the valid area (the area to be subjected to the detection process) or a template area for the invalid area (the area not to be subjected to the detection process).
The first setting unit 30 has the functions of, according to the designation by the user, registering the designated area as the template area for the valid area, registering it as the template area for the invalid area, changing the position on the layer, deleting the area, switching between the valid area and the invalid area, changing the similarity degree, changing the number of re-search times, and registering the area detected by the detection process as a new template area. Further, the registered area may be moved to the search area list and/or the valid detection area list described later.
Turning back to
For example, the developed image data and the above-described search area list, template area list, and valid detection area list may be displayed on the screen, and the processes by the second setting unit 20, the first setting unit 30, and the detection unit 40 may be performed in parallel. Since the structures of the respective lists are similar, they are managed together in one list (see
Further, when the area similar to the template area has already been detected once or more and excessive areas have been detected, deleting the unnecessary areas from the valid detection area list allows the areas to be set without excess or deficiency; thus, the process may be terminated at this point depending on the user's judgment.
Further, in the present embodiment, the detection unit 40 uses the image data developed from the acquired text data, the search area list, the template area list, and the valid detection area list to generate an image in which the search-unnecessary area of the image data is masked (the image to be subjected to the detection process, hereinafter referred to as the detection target image). First, the valid areas of the search area list are set to white pixels and the invalid areas to black pixels, and the white/black pixels are overwritten on the image in the registration order. When there are layers, the overwriting is similarly made from the lower layer in the registration order. Next, all the areas in the template area list are overwritten with black pixels as invalid areas. Next, all the areas in the valid detection area list are overwritten with black pixels as invalid areas (see
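The mask-generation order described above might be sketched as follows, using 1 for searchable (white) and 0 for masked (black) pixels. The tuple formats and the helper name are assumptions made for this illustration; only the overwriting order is taken from the embodiment.

```python
import numpy as np

# Illustrative sketch of building the detection target image's mask.
# Area format (top, left, bottom, right[, is_valid]) is an assumption.
def build_search_mask(shape, search_areas, template_areas, detected_areas):
    mask = np.zeros(shape, dtype=np.uint8)  # everything masked by default
    # 1) overwrite search areas in registration order: valid -> white (1),
    #    invalid -> black (0); later entries overwrite earlier ones
    for top, left, bottom, right, is_valid in search_areas:
        mask[top:bottom, left:right] = 1 if is_valid else 0
    # 2) all template areas are masked out (black), so a template never
    #    re-matches itself
    for top, left, bottom, right in template_areas:
        mask[top:bottom, left:right] = 0
    # 3) all already-detected (valid detection) areas are masked out too,
    #    preventing duplicate detection of the same area
    for top, left, bottom, right in detected_areas:
        mask[top:bottom, left:right] = 0
    return mask
```

With layered lists, the same overwrite-in-order rule would simply be applied from the lower layer upward.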
For each of the one or more template areas included in the template area list, the detection unit 40 uses the template area, the similarity degree associated with the template area, and the detection target image to perform the detection process for detecting whether or not there is an area similar to the template area in the detection target image, and performs a contour extraction process on all the detected areas.
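The matching step itself is not specified in detail by the embodiment; as one possible reading, a brute-force sliding-window comparison under the smaller-value-means-more-similar convention might look as follows. A practical implementation would use an optimized routine (for example, OpenCV's matchTemplate); this sketch, with its assumed function name and mean-squared-difference score, is for illustration only.

```python
import numpy as np

# Illustrative brute-force detection: slide the template over the detection
# target image and keep positions whose mean squared difference falls at or
# below the threshold. Smaller score = greater similarity, matching the
# parameter convention described above.
def find_similar_areas(image, template, threshold):
    th, tw = template.shape
    ih, iw = image.shape
    hits = []
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            patch = image[y:y + th, x:x + tw]
            score = np.mean((patch.astype(float) - template.astype(float)) ** 2)
            if score <= threshold:
                hits.append((y, x, score))
    return hits
```

In the embodiment, this search would run only where the detection target image is unmasked, and contour extraction would then be applied to each hit.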
Here, after the detection process by the detection unit 40 ends, the above-described first setting unit 30 determines whether or not to register the area detected by the detection process in the storage unit 60 as a new template area, according to the registration information associated with the template area used in the detection process.
More specifically, after the detection process by the detection unit 40 ends, when the number of re-search times associated with the template area used in the detection process is one or more, the first setting unit 30 registers the area detected by the detection process in the storage unit 60 as the new template area, subtracts one from the number of re-search times associated with the template area used in the detection process, and associates it with the new template area to register it in the storage unit 60. On the other hand, when the number of re-search times associated with the template area used in the detection process is zero, the area detected by the detection process is not registered in the storage unit 60.
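The registration determination just described might be sketched as follows. The entry format (area, similarity degree, number of re-search times) and the function name are assumptions introduced for this illustration.

```python
# Sketch of the registration determination described above. The tuple
# format (area, similarity, re_search_times) is an illustrative assumption.
def registration_determination(template_entry, detected_areas, template_list):
    area, similarity, re_search_times = template_entry
    if re_search_times >= 1:
        for detected in detected_areas:
            # each detected area becomes a new template, inheriting the
            # similarity degree and a re-search count decremented by one
            template_list.append((detected, similarity, re_search_times - 1))
    # when re_search_times == 0, detected areas are not registered as templates
    return template_list
```

The used template entry itself is then deleted from the list, as the embodiment describes next.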
Further, in the present embodiment, after the detection process ends, the first setting unit 30 deletes the template area used in the detection process and the number of re-search times associated with that template area from the storage unit 60. More detailed description will be provided later.
Turning back to
Next, an operation example of the image processing apparatus 100 according to the present embodiment will be described.
As illustrated in
Next, the detection unit 40 uses the image data developed from the text data acquired at step S1, the search area list, and the template area list to generate the detection target image in which the search-unnecessary area of the image data is masked (step S4). Next, the detection unit 40 selects any template area from one or more template areas included in the template area list and performs the detection process (step S5). Next, the detection unit 40 performs a registration determination process for determining whether or not to register the area detected by the detection process at step S5 in the storage unit 60 as a new template area (step S6). By referring to
Next, the detection unit 40 determines whether or not the number of re-search times associated with the template area used in the detection process of step S5 is zero (step S13). When the number of re-search times associated with that template area is determined not to be zero, that is, when it is one or more (NO at step S13), the detection unit 40 adds all the areas detected by the detection process of step S5 to the template area list as new template areas (step S14). In this example, the detection unit 40 subtracts one from the number of re-search times associated with the template area used in the detection process of step S5 and registers the result in the storage unit 60 in association with each of the areas detected by the detection process. Further, the detection unit 40 registers in the storage unit 60, in association with each of the detected areas, the same similarity degree and application area information as those associated with the template area used in the detection process of step S5. The process then proceeds to step S15.
On the other hand, when the number of re-search times associated with the template area used in the detection process of step S5 is determined to be zero in step S13 described above (YES at step S13), the process proceeds to step S15.
In step S15, the detection unit 40 deletes the template area used in the detection process of step S5 from the template area list (step S15). More specifically, the detection unit 40 deletes, from the template area list, the template area used in the detection process of step S5 and the similarity degree, the number of re-search times, and the application area information associated with that template area. The description above is the specific content of the registration determination process of step S6 in
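Taken together, steps S5 and S13 to S15 can be illustrated end to end with the detection itself stubbed out. All helper names, the list format, and the detect callback are assumptions made for this sketch; detected areas always go to the valid detection area list, while registration as new templates is gated by the re-search count.

```python
# End-to-end sketch of the flow of steps S5 and S13-S15 described above.
# template_list holds (area, re_search_times) pairs; detect(area) stands in
# for the detection process of step S5 and returns the areas it found.
def run_detection(template_list, detect):
    valid_detection_areas = []
    while template_list:                       # repeat until the list is empty
        area, count = template_list[0]
        detected = detect(area)                # step S5: detection process
        valid_detection_areas.extend(detected)
        if count > 0:                          # steps S13-S14: register new
            template_list.extend((d, count - 1) for d in detected)
        template_list.pop(0)                   # step S15: delete used template
    return valid_detection_areas
```

With a re-search count of one, the areas found by the initial template are searched once more, and the loop then terminates because every used template entry is deleted.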
Turning back to
As described above, in the present embodiment, the detection process detects whether or not there is an area similar to the template area in the area other than the areas corresponding to the pre-registered template areas and the valid detection areas (the areas that have already been detected by the detection process), that is, in the above-described detection target image, out of the search area in the image data extracted from the acquired text data. This prevents duplicate detection of the same area and detection of undesired areas. Therefore, the present embodiment provides the advantage of highly accurate detection of similar image areas within the same image.
In particular, in recent years, an image forming technique using clear toner has attracted attention as a way to bring the image quality of electrophotography closer to that of offset printing, provide new added value, and expand the market of digital printing systems. For example, in order to place the clear toner on similar image areas containing many water droplets, a technique of designating a template image to detect the similar image areas may be considered. However, in a mere combination of the technique of designating the clear toner area at the time of image forming and the technique of detecting the similar image area using the template, when the detection process is performed multiple times while changing the parameter and/or the template used for detecting the similar image area in order to select all of a number of the expected similar image areas, the same area is likely to be detected in a slightly different shape, or undesired areas are likely to be extracted. All the designation as to whether or not to use the detected areas, as well as the management of the detected areas, then has to be performed manually, which takes labor.
In contrast, the above-described present embodiment allows all the expected similar image areas to be detected from the same image without overlapping, which facilitates the series of operations for designating the areas to be surface-treated with the clear toner. That is, the present embodiment is particularly effective for characteristics designation such as that of the clear toner.
It is noted that the program executed by the CPU 101 of the above-described embodiment may be recorded and provided, as an installable or executable file, on a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, or a DVD (Digital Versatile Disk).
Furthermore, the program executed by the CPU 101 of the above-described embodiment may be stored on a computer connected to a network such as the Internet and downloaded via the network. Further, the control program executed by the CPU 101 of the above-described embodiment may be provided or distributed via a network such as the Internet.
The present invention allows for highly accurate detection of similar image areas within the same image.
Although the invention has been described with respect to specific embodiments for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teaching herein set forth.
Number | Date | Country | Kind |
---|---|---|---|
2012-286015 | Dec 2012 | JP | national |