This nonprovisional application is based on Japanese Patent Application No. 2004-250345 filed with the Japan Patent Office on Aug. 30, 2004, the entire contents of which are hereby incorporated by reference.
1. Field of the Invention
The present invention relates to an apparatus, method and program for collating images of fingerprints or the like with each other and thereby determining whether the fingerprints or the like match with each other or not, and also relates to a machine-readable recording medium storing the program. In particular, the invention relates to an apparatus, method and program for collating one item of input data with a plurality of items of reference data, and to a machine-readable recording medium storing the program.
2. Description of the Background Art
Japanese Patent Laying-Open No. 2003-323618 discloses an image collating apparatus which collates a plurality of items of image data to determine whether the images represent the same target or not. This image collating apparatus divides one image into a plurality of partial areas for template matching, detects the position in another image where each partial area matches to the highest extent, and determines whether the movement vectors of these partial areas are the same or not. In this manner, the determination of match/mismatch between the images is performed.
In this image collating apparatus, collation accuracy depends on the size of the partial area. If the partial area is excessively small (e.g., nearly equal to the width of a fingerprint ridge), it matches the other image at many positions. Therefore, the partial area must have a certain size (e.g., large enough to include three ridges of the fingerprint).
Further, in this method, matching or mismatching is determined by calculating the movement of each partial area and determining whether the movement vectors are the same or not. Accordingly, a certain number of partial areas is required for ensuring collation accuracy.
Accordingly, when the image obtained by a sensor has an excessively small size, it is impossible to ensure a sufficient number of partial areas while satisfying the foregoing condition on the size of the partial areas, if the apparatus is configured to prevent mutual overlapping of the partial areas, as is done in the foregoing image collating apparatus.
For example, it is assumed that twenty or more partial areas are required for obtaining a necessary accuracy.
In this case, a sensor having a large sensing area can provide twenty or more partial areas of the required size without overlap, whereas a sensor having a small sensing area cannot.
In the foregoing image collating apparatus, the continuity of the images at the boundary between partial areas neighboring each other is not reflected in the determination of matching or mismatching of the images based on the movement vectors. Accordingly, the collation property varies depending on the manner of defining the partial areas.
For example, when two images that differ from each other only in the vicinity of a boundary between partial areas are collated, the difference may be overlooked. This is because the continuity of the images across the boundary between the partial areas is not used, and the performance of detecting a difference therefore lowers when the difference between the images occurs exactly at the boundary between the partial areas.
In summary, the foregoing image collating apparatus cannot sufficiently satisfy the two conflicting requirements, i.e., the requirement to ensure a certain size of the partial areas for template matching and the requirement to increase the number of the partial areas. The foregoing image collating apparatus also suffers from errors in the determination of match/mismatch between the images, because the continuity of the image data at the boundary between neighboring partial areas is not reflected in that determination.
An object of the invention is to provide an image collating apparatus, method and program which can satisfy both of the two conflicting requirements, i.e., ensuring a certain size of the partial areas and increasing the number of the partial areas, and which can also reflect the continuity of the image data at the boundary between neighboring partial areas in the determination of match/mismatch between the images, as well as a recording medium storing the program.
For achieving the above object, an image collating apparatus according to an aspect of the invention for collating first and second images with each other includes a partial area setting unit for setting, in the first image, a plurality of partial areas for use in template matching. Corresponding to each of the images of the plurality of partial areas set by the partial area setting unit, a search is made for a maximum matching score position, i.e., the position in the second image where the partial area attains a maximum matching score. For calculating the image similarity score, the apparatus uses a positional relationship quantity representing a positional relationship between the position of the partial area in the first image and the maximum matching score position corresponding to the partial area in question. Thus, a similarity score between the first and second images is calculated from information related to the partial areas whose positional relationship quantities fall within a predetermined range among the positional relationship quantities of the plurality of partial areas, and is output as an image similarity score. Based on the output image similarity score, it is determined whether the first and second images match with each other or not. The partial area setting unit includes a neighboring partial area setting unit for setting a plurality of neighboring partial areas located in the first image and neighboring each other without an overlap, and a boundary-including partial area setting unit for setting a boundary-including partial area located on the neighboring partial areas set by the neighboring partial area setting unit, in a position including a boundary between the neighboring partial areas. The neighboring partial areas and the boundary-including partial area are set as the foregoing plurality of partial areas.
By setting the boundary-including partial area as described above, both of the two conflicting requirements, i.e., ensuring a certain size of each partial area and increasing the number of the partial areas, can be satisfied: the partial areas can be given the size required for ensuring the collation accuracy and set in the number required for ensuring the collation accuracy. Further, the boundary-including partial area is set in a position including the boundary between the neighboring partial areas, so that it covers that boundary. Consequently, the continuity of the image data at the boundary between the neighboring partial areas can be reflected in the determination of match/mismatch between the images as far as possible.
For achieving the foregoing object, an image collating apparatus according to another aspect of the invention for collating first and second images with each other includes a partial area image data storing unit and a boundary-including partial area image data producing unit. The partial area image data storing unit stores image data of a plurality of neighboring partial areas that are set in the first image for use in template matching and neighbor each other without an overlap. The boundary-including partial area image data producing unit produces image data of a boundary-including partial area including a boundary between the neighboring partial areas: the image data of the portions of the neighboring partial areas, stored in the partial area image data storing unit, that overlap with the boundary-including partial area is collected to produce the image data of the boundary-including partial area. In image collation, for each of the images of the plurality of partial areas, i.e., the images formed of the image data of the plurality of neighboring partial areas stored in the partial area image data storing unit and the image formed of the image data of the boundary-including partial area produced by the boundary-including partial area image data producing unit, a search is made for the maximum matching score position in the second image provided from an image input unit, i.e., the position where the partial area attains the maximum matching score in the second image. The image similarity score is calculated by using a positional relationship quantity representing a positional relationship between the position of the partial area in the first image and the maximum matching score position corresponding to the partial area in question. Thus, the similarity score between the first and second images is calculated from information related to the partial areas whose positional relationship quantities fall within a predetermined range among the positional relationship quantities of the plurality of partial areas, and is output as an image similarity score. Based on the output image similarity score, it is determined whether the first and second images match with each other or not.
As described above, by using the image of the boundary-including partial area for determining match/mismatch, both of the two conflicting requirements, i.e., ensuring a certain size of each partial area and increasing the number of the partial areas, can be satisfied: the partial areas can be given the size and the number required for ensuring the collation accuracy. Since the image data of the boundary-including partial area is the image data at the position including the boundary between the neighboring partial areas, the boundary-including partial area covers that boundary. Consequently, the continuity of the image data at the boundary between the neighboring partial areas can be reflected in the determination of match/mismatch between the images as far as possible. Further, the image data of the boundary-including partial area is formed using the image data of the neighboring partial areas. Therefore, the partial area image data storing unit for storing the image data of the neighboring partial areas is not required to additionally store the image data of the boundary-including partial area, which suppresses an increase in storage capacity.
Preferably, the plurality of partial areas all have the same rectangular shape. The neighboring partial areas are set to spread throughout the first image, and the boundary-including partial area includes a crossing point of the boundaries formed by four neighboring partial areas neighboring each other.
By arranging the partial areas as described above, the boundary-including partial areas can include all the boundaries of the neighboring partial areas, and the continuity of the image data at the boundary between the neighboring partial areas can be reflected in the determination of match/mismatch between the images as efficiently as possible. Further, the above configuration simplifies the calculation of the positions of the respective partial areas.
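As an illustration of this arrangement, the following sketch lays out non-overlapping neighboring partial areas as a grid and places one boundary-including partial area over each crossing point where four neighbors meet. It is only a minimal sketch: the 16-pixel area size, the function name and the use of top-left coordinates are assumptions, not details taken from the embodiments.

```python
# Illustrative sketch (not the patent's reference implementation): lay out
# neighboring partial areas as a non-overlapping grid, then add one
# boundary-including partial area centered on each crossing point where
# four neighboring areas meet.

def set_partial_areas(img_w, img_h, size=16):
    """Return (neighboring, boundary_including) lists of (x, y) top-left corners."""
    cols, rows = img_w // size, img_h // size

    # Neighboring partial areas: tile the first image without overlap.
    neighboring = [(c * size, r * size) for r in range(rows) for c in range(cols)]

    # Boundary-including partial areas: same size, shifted by half an area so
    # each one covers the crossing point of the boundaries of four neighbors.
    boundary_including = [(c * size + size // 2, r * size + size // 2)
                          for r in range(rows - 1) for c in range(cols - 1)]
    return neighboring, boundary_including

neighboring, boundary = set_partial_areas(128, 128)
print(len(neighboring), len(boundary))   # 64 and 49 for a 128x128 image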
Preferably, the boundary-including partial area is restricted to positions that include a boundary between the neighboring partial areas and that clearly exhibit a feature of the image.
By configuring the partial areas as described above, the positions clearly exhibiting the feature of the image can be used intensively for the determination of match/mismatch between the images, and the portions not clearly exhibiting the feature can be omitted from the collation, so that fast collation can be performed with a small calculation quantity.
Preferably, the similarity score calculation is executed while placing a larger weight on the restricted boundary-including partial areas clearly exhibiting the feature of the image, as compared with the other partial areas.
The above configuration performs the collation on the portions clearly exhibiting the feature of the image with a larger weight, and thereby can improve the accuracy of determination of match/mismatch between the images while simplifying the collation of the portions not clearly exhibiting the feature. Thereby, fast collation can be performed with a small calculation quantity.
Preferably, the first and second images are derived from a living body. The determination of match/mismatch between the images derived from the living body can be performed with high accuracy.
Preferably, the image derived from the living body is an image derived from a fingerprint. The determination of match/mismatch between the images derived from the fingerprints can be performed with high accuracy.
For achieving the above object, an image collating method according to still another aspect of the invention for collating first and second images with each other includes a partial area setting step of setting, in the first image, a plurality of partial areas for use in template matching. Corresponding to each of the images of the plurality of partial areas set in the partial area setting step, a search is made for a maximum matching score position, i.e., the position in the second image where the partial area attains a maximum matching score. For calculating the image similarity score, the method uses a positional relationship quantity representing a positional relationship between the position of the partial area in the first image and the maximum matching score position corresponding to the partial area in question. Thus, the similarity score between the first and second images is calculated from information related to the partial areas whose positional relationship quantities fall within a predetermined range among the positional relationship quantities of the plurality of partial areas, and is output as an image similarity score. Based on the output image similarity score, it is determined whether the first and second images match with each other or not. The partial area setting step includes a neighboring partial area setting step of setting a plurality of neighboring partial areas located in the first image and neighboring each other without an overlap, and a boundary-including partial area setting step of setting a boundary-including partial area located on the neighboring partial areas set in the neighboring partial area setting step, in a position including a boundary between the neighboring partial areas. The neighboring partial areas and the boundary-including partial area are set as the foregoing plurality of partial areas.
For achieving the foregoing object, an image collating method according to yet another aspect of the invention for collating first and second images with each other includes a partial area image data storing step and a boundary-including partial area image data producing step. The partial area image data storing step stores image data of a plurality of neighboring partial areas that are set in the first image for use in template matching and neighbor each other without an overlap. The boundary-including partial area image data producing step produces image data of a boundary-including partial area including a boundary between the neighboring partial areas: the image data of the portions of the neighboring partial areas, stored in the partial area image data storing step, that overlap with the boundary-including partial area is collected to produce the image data of the boundary-including partial area. In image collation, for each of the images of the plurality of partial areas, i.e., the images formed of the image data of the plurality of neighboring partial areas stored in the partial area image data storing step and the image formed of the image data of the boundary-including partial area produced in the boundary-including partial area image data producing step, a search is made for the maximum matching score position in the second image provided in an image input step, i.e., the position where the partial area attains the maximum matching score in the second image. The image similarity score is calculated by using a positional relationship quantity representing a positional relationship between the position of the partial area in the first image and the maximum matching score position corresponding to the partial area in question. Thus, the similarity score between the first and second images is calculated from the information related to the partial areas whose positional relationship quantities fall within a predetermined range among the positional relationship quantities of the plurality of partial areas, and the calculated similarity score is output as an image similarity score. Based on the output image similarity score, it is determined whether the first and second images match with each other or not.
An image collating program according to a further aspect of the invention causes a computer to execute the foregoing image collating method.
A computer readable recording medium according to a further aspect of the invention stores the foregoing image collating program.
According to the invention, since the partial area overlapping with the partial areas set in the first image and neighboring each other is set as the partial area including the boundary between those neighboring partial areas, it is possible to set partial areas of the size required for ensuring the collation accuracy and in the number required for ensuring the collation accuracy. Further, the partial area set in the overlapping fashion includes the boundary between the neighboring partial areas, and the determination of match/mismatch is performed on the images including the image data of this partial area. Therefore, the continuity of the image at the boundary between the neighboring partial areas is reflected in the determination of match/mismatch between the images, so that a determination error, which could otherwise be caused because the continuity at the boundary between the partial areas is not reflected in the movement of the image, can be suppressed.
Further, in the case where the image data of the boundary-including partial area is produced by using the image data of the neighboring partial areas, the partial area image data storing unit for storing the image data of the neighboring partial areas is not required to additionally store the image data of the boundary-including partial area, which suppresses an increase in storage capacity.
In the case where an overlapping partial area setting unit sets the partial areas intensively in a portion clearly exhibiting the feature of the image, accurate collation can be performed fast: the collation places a larger weight on that part of the image, while the collation of the portions not clearly exhibiting the feature is simplified, so that fast calculation can be performed with a small calculation quantity.
The foregoing and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
Embodiments of the invention will now be described with reference to the drawings.
Here, two items of image data are collated with each other. Though fingerprint data will be described as exemplary image data to be collated, the image is not limited thereto, and the present invention is also applicable to image data of other biometrics that are similar among samples (individuals) but not identical, or to other image data of linear patterns.
The computer may be provided with a magnetic tape apparatus that accesses a cassette type magnetic tape detachably mounted thereto.
The configuration of image collating apparatus 1 will now be described.
Image input unit 101 includes a fingerprint sensor 100, and outputs fingerprint image data corresponding to the fingerprint read by fingerprint sensor 100. Fingerprint sensor 100 may be an optical, pressure-type, static capacitance type or any other type of sensor. Memory 102 stores image data and various calculation results. Bus 103 is used for transferring control signals and data signals among these units. Image correcting unit 104 performs density correction of the fingerprint image input from image input unit 101. Maximum matching score position searching unit 105 uses a plurality of partial areas of one fingerprint image as templates, and searches the other fingerprint image for the positions attaining the highest matching scores with the templates; namely, this unit serves as a so-called template matching unit. Using the information of the result from maximum matching score position searching unit 105 stored in memory 102, similarity score calculating unit 106 calculates the movement-vector-based similarity score, which will be described later. Collation determining unit 107 determines a match/mismatch based on the similarity score calculated by similarity score calculating unit 106. Control unit 108 controls processes performed by the various units of collating unit 11.
The procedure by which image collating apparatus 1 collates images "A" and "B", corresponding to the data of two fingerprint images, will now be described.
First, control unit 108 transmits an image input start signal to image input unit 101, and thereafter waits until an image input end signal is received. Image input unit 101 receives image "A" for collation as an input, and the received image "A" is stored at a prescribed address of memory 102 through bus 103 (step T1). After the input of image "A" is completed, image input unit 101 transmits the image input end signal to control unit 108.
Receiving the image input end signal, control unit 108 again transmits the image input start signal to image input unit 101, and thereafter waits until the image input end signal is received. Image input unit 101 receives image "B" for collation as an input, and the received image "B" is stored at a prescribed address of memory 102 through bus 103 (step T1). After the input of image "B" is completed, image input unit 101 transmits the image input end signal to control unit 108.
Thereafter, control unit 108 transmits an image correction start signal to image correcting unit 104, and waits until an image correction end signal is received. In most cases, the input image has uneven image quality, as the tones of pixels and the overall density distribution vary because of variations in the characteristics of image input unit 101, the dryness of fingerprints and the pressure with which fingers are pressed. Therefore, it is not appropriate to use the input image data directly for collation. Image correcting unit 104 corrects the image quality of the input image to suppress such variations of input conditions (step T2). Specifically, histogram planarization, as described in Computer GAZOU SHORI NYUMON (Introduction to computer image processing), SOKEN SHUPPAN, p. 98, or image thresholding (binarization), as described in the same book, pp. 66-69, is performed on images "A" and "B" stored in memory 102, either on the overall image or on small areas obtained by dividing the image.
After the end of image correcting process on images “A” and “B”, image correcting unit 104 transmits the image correction end signal to control unit 108.
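As a rough illustration of the correction in step T2, the sketch below applies whole-image histogram planarization (equalization) and simple thresholding to an 8-bit grayscale image held as a NumPy array. The exact procedures of the cited textbook are not reproduced here; the equalization formula and the fixed threshold value are assumptions.

```python
import numpy as np

# Minimal sketch of step T2, assuming 8-bit grayscale NumPy arrays.

def equalize(img):
    """Histogram planarization (equalization) over the whole image;
    the same routine could be called per sub-block instead."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) * 255 / max(cdf.max() - cdf.min(), 1)
    return cdf[img].astype(np.uint8)      # map each pixel through the CDF

def binarize(img, threshold=128):
    """Simple fixed-threshold binarization; the threshold choice is assumed."""
    return np.where(img >= threshold, 255, 0).astype(np.uint8)
```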
Thereafter, the collation (step T3) is performed on the images "A" and "B" that have been subjected to the image correcting process by image correcting unit 104. This processing will now be described.
Control unit 108 transmits a template matching start signal to maximum matching score position searching unit 105, and waits until a template matching end signal is received. Maximum matching score position searching unit 105 starts a template matching process represented by steps S001 to S008. In step S001, a counter variable “i” is initialized to “1”. In step S002, an image of a partial area defined as partial area “Ri” of image “A” is set as a template to be used for the template matching.
When there is no area that allows setting of a neighboring partial area "Ri" in image "A" without an overlap, the result of determination in step S030 is "NO", and it is determined in step S032 whether it is possible to set partial area "Ri", in an overlapping fashion, at the center of the area formed of four templates of neighboring partial areas. This overlapping partial area "Ri" is the boundary-including partial area (indicated by reference number "32" in the drawings).
Though partial area "Ri" has a rectangular shape for simplicity of calculation, the shape is not limited thereto. The description now returns to the template matching process.
In step S005, a search is made through image "B" for the position where the highest matching score is attained with respect to the template set in step S002, i.e., the position where the image data matches the template to the highest extent. More specifically, let "Ri(x, y)" be the pixel density at coordinates (x, y) of partial area "Ri" used as the template, with its upper left corner as the origin; let "B(s, t)" be the pixel density at coordinates (s, t) of image "B", with its upper left corner as the origin; let "w" and "h" be the width and height of partial area "Ri"; let "V0" be the possible maximum density of each pixel in images "A" and "B"; and let "Ci(s, t)" be the matching score at coordinates (s, t) of image "B". This matching score "Ci(s, t)" is calculated in accordance with the following equation (1), based on the density differences between the pixels.
In image "B", the coordinates (s, t) are successively updated and the matching score "Ci(s, t)" is calculated. The position having the highest value is considered as the maximum matching score position, the image of the partial area at that position is represented as partial area "Mi", and the matching score at that position is represented as maximum matching score "Cimax". In step S006, the maximum matching score "Cimax" in image "B" for partial area "Ri" calculated in step S005 is stored at a prescribed address of memory 102. In step S007, a movement vector "Vi" is calculated in accordance with equation (2) and is stored at a prescribed address of memory 102.
Here, based on partial area "Ri" at position "P" in image "A", image "B" is scanned to identify partial area "Mi" at the position "M" having the highest matching score with partial area "Ri"; the directional vector from position "P" to position "M" is referred to as a movement vector. Such a vector arises because image "B" appears to have moved relative to reference image "A", as the finger may be placed in various manners on the fingerprint sensor.
Vi=(Vix, Viy)=(Mix−Rix, Miy−Riy) (2)
In equation (2), variables “Rix” and “Riy” are “x” and “y” coordinates at the reference position of partial area “Ri”, that correspond, by way of example, to the upper left corner of partial area “Ri” in image “A”. Variables “Mix” and “Miy” are “x” and “y” coordinates at the position of maximum matching score “Cimax” as the result of search of partial area “Mi”, which correspond, by way of example, to the upper left corner coordinates of partial area “Mi” at the matched position in image “B”.
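Equation (1) itself does not survive in this text, so the sketch below assumes a plausible normalized density-difference score consistent with the symbols defined above, Ci(s, t) = 1 − Σ|Ri(x, y) − B(s+x, t+y)| / (w·h·V0), which equals 1 for a perfect match and decreases as the total density difference grows. The exhaustive scan of steps S005-S007 and the movement vector of equation (2) are implemented directly; the function names are assumptions.

```python
import numpy as np

# Hedged sketch of steps S005-S007; equation (1) is assumed, not quoted.

def search_max_matching(template, image_b, v0=255):
    """Scan every position (s, t) of image B and return (Cimax, position of Mi)."""
    h, w = template.shape
    H, W = image_b.shape
    best_score, best_pos = -1.0, (0, 0)
    for t in range(H - h + 1):
        for s in range(W - w + 1):
            diff = np.abs(image_b[t:t + h, s:s + w].astype(int)
                          - template.astype(int)).sum()
            score = 1.0 - diff / (w * h * v0)   # assumed form of equation (1)
            if score > best_score:
                best_score, best_pos = score, (s, t)
    return best_score, best_pos

def movement_vector(ri_pos, mi_pos):
    """Equation (2): Vi = (Mix - Rix, Miy - Riy)."""
    return (mi_pos[0] - ri_pos[0], mi_pos[1] - ri_pos[1])
```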
Steps S002 to S008 are repeated, and template matching is performed on every partial area “Ri”. Also, the maximum matching score “Cimax” of each partial area “Ri” and the movement vector “Vi” are calculated. As described above, when there is no area that allows setting of boundary-including partial area “Ri” in image “A”, the process proceeds from step S004 to step S009.
Maximum matching score position searching unit 105 stores the maximum matching score “Cimax” and the movement vector “Vi” for every partial area “Ri” calculated successively as described above at prescribed addresses of memory 102, and thereafter transmits a template matching end signal to control unit 108 to end the processing.
Thereafter, control unit 108 transmits a similarity score calculation start signal to similarity score calculating unit 106, and waits until a similarity score calculation end signal is received. Similarity score calculating unit 106 calculates the similarity score through the process of steps S009 to S021, as follows.
In step S009, similarity score “P (A, B)” is initialized to 0. Here, the similarity score “P(A, B)” is a variable storing the degree of similarity between images “A” and “B”. In step S010, an index “i” of the movement vector “Vi” as a reference is initialized to “1”. In step S011, similarity score “Pi” related to the reference movement vector “Vi” is initialized to “0”. In step S012, an index “j” of movement vector “Vj” is initialized to “1”. In step S013, vector difference “dVij” between reference movement vector “Vi” and movement vector “Vj” is calculated in accordance with equation (3).
dVij=|Vi−Vj|=sqrt((Vix−Vjx)^2+(Viy−Vjy)^2) (3)
Here, variables "Vix" and "Viy" represent the "x" direction and "y" direction components, respectively, of movement vector "Vi", variables "Vjx" and "Vjy" represent the "x" direction and "y" direction components, respectively, of movement vector "Vj", "sqrt(X)" represents the square root of "X" and "X^2" represents the square of "X".
In step S014, vector difference “dVij” between movement vectors “Vi” and “Vj” is compared with a prescribed constant value “ε”, so as to determine whether the movement vectors “Vi” and “Vj” can be regarded as substantially the same vectors. If the vector difference “dVij” is smaller than the constant value “ε”, movement vectors “Vi” and “Vj” are regarded as substantially the same, and the flow proceeds to step S015. If the difference is larger than the constant value, the movement vectors cannot be regarded as substantially the same, and the flow proceeds to step S016. In step S015, the similarity score “Pi” is incremented in accordance with equations (4) to (6).
Pi=Pi+α (4)
α=1 (5)
α=Cimax (6)
In equation (4), variable “α” is a value for incrementing the similarity score “Pi”. If “α” is set to 1 as represented by equation (5), similarity score “Pi” represents the number of partial areas that have the same movement vector as reference movement vector “Vi”. If “α” is set to Cimax as represented by equation (6), the similarity score “Pi” would be the total sum of the maximum matching scores obtained through the template matching of partial areas that have the same movement vector as the reference movement vector “Vi”. The value of variable “α” may be made smaller, in accordance with the magnitude of vector difference “dVij”.
In step S016, whether the value of index "j" is smaller than the total number "n" of partial areas or not is determined. If the value of index "j" is smaller than the total number "n" of partial areas, the flow proceeds to step S017, and if not, the flow proceeds to step S018. In step S017, the value of index "j" is incremented by 1. By the process from step S011 to S017, the similarity score "Pi" is calculated, using the information of partial areas determined to have the same movement vector as the reference movement vector "Vi". In step S018, the similarity score "Pi" using movement vector "Vi" as a reference is compared with the variable "P(A, B)", and if the similarity score "Pi" is larger than the largest similarity score (value of variable "P(A, B)") obtained by that time, the flow proceeds to step S019, and otherwise the flow proceeds to step S020.
In step S019, a value of similarity score “Pi” using movement vector “Vi” as a reference is assigned to the variable “P(A, B)”. In steps S018 and S019, if the similarity score “Pi” using movement vector “Vi” as a reference is larger than the maximum value of the similarity score (value of variable “P(A, B)”) calculated by that time using other movement vector as a reference, the reference movement vector “Vi” is considered to be the best reference among the values of index “i” used to that time point.
In step S020, the value of index “i” of reference movement vector “Vi” is compared with the number (value of variable “n” set in step S034) of partial areas. If the value of index “i” is smaller than the number “n” of partial areas, the flow proceeds to step S021, in which the index value “i” is incremented by 1.
From step S009 to step S021, similarity between images “A” and “B” is calculated as the value of variable “P(A, B)”. Similarity score calculating unit 106 stores the value of variable “P(A, B)” calculated in the above described manner at a prescribed address of memory 102, and transmits a similarity score calculation end signal to control unit 108 to end the process.
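A compact sketch of steps S009-S021 follows, together with the final threshold comparison. The constant ε, the choice between the α of equations (5) and (6), and the attribution of "Cimax" to the partial area indexed by "j" are assumptions where the text leaves them open.

```python
import math

# Sketch of steps S009-S021: for each reference movement vector Vi, sum a
# weight for every Vj within distance EPSILON of Vi (equation (3)), and keep
# the best total as P(A, B).

def similarity_score(vectors, cimax, epsilon=3.0, use_cimax=False):
    """vectors: movement vectors Vi; cimax: per-area maximum matching scores."""
    best = 0.0
    for vi in vectors:                            # each Vi in turn as reference
        pi = 0.0
        for j, vj in enumerate(vectors):
            dvij = math.hypot(vi[0] - vj[0], vi[1] - vj[1])   # equation (3)
            if dvij < epsilon:                    # substantially the same vector
                pi += cimax[j] if use_cimax else 1.0          # equations (4)-(6)
        best = max(best, pi)                      # steps S018-S019 keep the max
    return best                                   # P(A, B)

def is_match(p_ab, threshold_t):
    """Final determination: the images are judged to match when P(A, B) >= T."""
    return p_ab >= threshold_t
```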
Thereafter, control unit 108 transmits a collation determination start signal to collation determining unit 107, and waits until a collation determination end signal is received. Collation determining unit 107 performs the collation determination (step T4). Specifically, the similarity score represented by the value of variable "P(A, B)" stored in memory 102 is compared with a predetermined collation threshold "T". If "P(A, B)" ≧ T, it is determined that images "A" and "B" are taken from the same fingerprint, and a value indicating a "match", for example "1", is written at a prescribed address of memory 102 as the collation result; otherwise, the images are determined to be taken from different fingerprints, and a value indicating a "mismatch", for example "0", is written at a prescribed address of memory 102. Thereafter, the collation determination end signal is transmitted to control unit 108, and the process ends.
Finally, control unit 108 outputs the collation result stored in memory 102 through display 610 or printer 690 (step T5), and the image collating process ends.
In the present embodiment, part or all of image correcting unit 104, maximum matching score position searching unit 105, similarity score calculating unit 106, collation determining unit 107 and control unit 108 may be implemented by a ROM, such as memory 624, storing the process procedure as a program, and a processor such as CPU 622.
In the first embodiment already described, after the partial areas neighboring each other are set in the image, the boundary-including partial area is set in the overlapping manner on those partial areas, and the setting of the partial areas ends when no area allowing the setting of a boundary-including partial area is left. In contrast to this control, a second embodiment performs the control such that the setting of the partial areas ends when partial areas equal in number to a predetermined value of variable "n" have been set. This second embodiment will now be described.
In the next step S045, the center of image "A" is calculated. In step S046, it is determined whether the relationship (i≦n) is satisfied or not. Variable "n" indicates the predetermined total number of partial areas to be set in image "A". Variable "n" is preset to be larger than the number of neighboring partial areas "Ri" spread throughout image "A" without an overlap, and smaller than the total number of partial areas in the state where the boundary-including partial areas are fully spread over the neighboring partial areas.
When the current value of "i" is smaller than that of "n", the flow proceeds to step S047, in which boundary-including partial area "Ri" is set as the template in a position near the center of image "A" calculated in step S045. One is added to "i" in step S008 in response to every execution of the processing in step S047, and the processing in step S047 is repeated until the value of "i" reaches the value of "n". When the result of determination in step S040 first becomes "NO", the overlap start flag is set to "ON". For the subsequent processing, therefore, the result of determination in step S042 becomes "YES", and the flow proceeds directly to step S046 without performing the processing in steps S043-S045.
When the value of variable “i” reaches that of variable “n”, the template setting completion flag is set to “ON” in step S048, and the overlap start flag is set to “OFF” in step S049.
As a result of the processing in step S047, boundary-including partial areas "Ri" are arranged with higher priority given to positions near the center of image "A", i.e., the positions where the feature of the image is clearest when the target image is a fingerprint.
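A minimal sketch of this center-priority selection might order the candidate crossing points by distance from the calculated center and take only as many as are needed to reach the predetermined total "n"; the function and parameter names below are assumptions.

```python
import math

# Sketch of the center-priority setting in step S047: choose boundary-including
# partial areas at the crossing points closest to the image center first,
# stopping when the total number of partial areas reaches n.

def pick_boundary_areas(crossings, center, k, n):
    """crossings: candidate top-left corners; k: count of neighboring areas."""
    ordered = sorted(crossings,
                     key=lambda p: math.hypot(p[0] - center[0], p[1] - center[1]))
    return ordered[:max(n - k, 0)]    # only as many as needed to reach n areas
```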
An example of the addition processing of similarity score "Pi" in the second embodiment will now be described.
When the current value of "i" is equal to or larger than start value "K", i.e., when the setting of the boundary-including partial areas has already started, processing is performed according to "α=1+(n−i)/n" in step S058. In step S057, the value of "α" set in step S058 is added to the current value of "Pi".
In "α=1+(n−i)/n" used in step S058, the value of "α" decreases with increase in the value of "i", and becomes "1" when the value of "i" reaches its maximum value, i.e., the value of variable "n". Since step S047 sets boundary-including partial areas "Ri" near the center on a priority basis, a boundary-including partial area "Ri" closer to the center is given a larger value of "α", and the value of "α" gradually decreases with the distance from the center. Consequently, the determination of match/mismatch between the images is performed while placing larger weights on the central portions exhibiting the feature of the image.
Another example of the addition processing of "Pi" will now be described.
When the result of determination in step S060 is "YES", processing is performed according to "α=Cimax(1+(n−i)/n)" in step S063. In this step S063, the value of "α" decreases with increase in the value of "i", and becomes "Cimax" when the value of "i" reaches its maximum value, i.e., the value of variable "n", similarly to the foregoing step S058. Therefore, in this example as well, the determination of match/mismatch between the images is performed while placing larger weights on the central portions clearly exhibiting the feature of the image.
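The two weighting rules can be condensed into a single helper, sketched below. Whether "Cimax" in step S063 is that of the reference partial area or of the compared one is not explicit in the text, so the caller is left to pass whichever value is intended.

```python
# Sketch of the weighted increments in steps S058 and S063: partial areas set
# earlier (smaller i, i.e., closer to the image center) receive larger alpha.

def alpha_weight(i, n, cimax=None):
    """Weight for the i-th partial area (1-indexed) out of n in total."""
    base = 1.0 + (n - i) / n      # step S058: about 2 near the center, 1 at i = n
    return base if cimax is None else cimax * base    # step S063 variant
```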
In the first and second embodiments, images “A” and “B” to be collated are provided from image input unit 101. In a third embodiment, however, one of two images to be collated is registered in advance, and only the other image is provided.
Registered data storing unit 202 stores in advance only the data, of one of the images to be collated, that is used for collation. Fingerprint registering unit 206 extracts the information required for collation from the fingerprint image provided from image input unit 101, and stores it in registered data storing unit 202. Registered data reading unit 207 reads the information required for collation from registered data storing unit 202, and stores the read information in memory 102.
First, it is assumed that images “A” and “B” form the image pair to be collated. In general, one of the images is stored in registered data storing unit 202, and is collated with the other images, which are successively input. Thereby, it is determined whether the registered image and the input image are derived from the same fingerprint or not. In this example, image “A” is the registered image, and image “B” is the input image.
The processing of registering image "A" as the registered image will now be described.
First, control unit 108 transmits an image input start signal to image input unit 101, and then waits until an image input end signal is received. Image input unit 101 performs the input of image “A”, and stores it through bus 103 at a prescribed address in memory 102 (step T8). When the image input is completed, image input unit 101 transmits an image input end signal to control unit 108.
Then, control unit 108 transmits the image correction start signal for image “A” to image correcting unit 104, and waits until the image correction end signal is received. Image correcting unit 104 effects image quality correction, which is already described in connection with the first embodiment, on image “A” stored in memory 102 (step T9). Thereafter, it transmits the image correction end signal to control unit 108.
Then, control unit 108 transmits a fingerprint registration start signal to fingerprint registering unit 206, and waits until a fingerprint registration end signal is received. Fingerprint registering unit 206 extracts the data of partial areas (neighboring partial areas) R1, R2, . . . , Rn in the predetermined positions from corrected image "A", and stores the extracted data in registered data storing unit 202.
In this embodiment, all or a part of image correcting unit 104, fingerprint registering unit 206, registered data reading unit 207, maximum matching score position searching unit 105, similarity score calculating unit 106, collation determining unit 107 and control unit 108 may be implemented by a ROM corresponding to memory 624 storing a processing procedure and a processor such as CPU 622 for executing the stored procedure.
Until the result of determination becomes "NO", the processing in steps S071-S073 is repeated. Consequently, the neighboring partial areas are set throughout image "A" without an overlap, and their image data are registered.
The flowchart of the collating processing of collating image "B" serving as the input image with the registered data is the same as that of the first embodiment. The differences are described below.
When the above current value of "i" is not smaller than the value of "K", the template matching is to be performed in this stage with the image data of a boundary-including partial area. In this case, the result of determination in step S081 becomes "NO", and the arithmetic (e=i−K+1) is performed in step S083. In the next step S084, it is determined whether four neighboring partial areas including "Re" at the upper left are present or not. The value of "e" increases one by one to attain successively, e.g., 1, 2, 3 . . . in response to every increase of the value of "i". Since the initial value of "e" is "1", it is determined for the first execution of step S084 whether the neighboring partial areas including "R1" at the upper left are present or not. When they are present, the image data of the boundary-including partial area is produced in step S085 by collecting the data of the divided areas, overlapping with the boundary-including partial area, of these four neighboring partial areas.
In the next step S086, "1" is added to the value of "d", which counts the items of image data of boundary-including partial areas produced in step S085.
Here, divided areas D1, D2, D3 and D4 are the portions of the upper left, upper right, lower left and lower right neighboring partial areas, respectively, that overlap with the boundary-including partial area.
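A sketch of this production step: assuming square partial areas of even size stored as NumPy arrays, the boundary-including partial area centered on a crossing point is assembled from the quadrant of each of the four surrounding neighboring partial areas that it overlaps. The array layout and function name are assumptions.

```python
import numpy as np

# Sketch of step S085 (and S117): the boundary-including partial area is not
# stored; its image data is assembled from the divided areas D1-D4 of the four
# stored neighboring partial areas around a crossing point.

def build_boundary_area(ul, ur, ll, lr):
    """ul, ur, ll, lr: upper-left, upper-right, lower-left, lower-right neighbors."""
    h, w = ul.shape
    top = np.hstack([ul[h // 2:, w // 2:],     # D1: lower-right quadrant of ul
                     ur[h // 2:, :w // 2]])    # D2: lower-left quadrant of ur
    bottom = np.hstack([ll[:h // 2, w // 2:],  # D3: upper-right quadrant of ll
                        lr[:h // 2, :w // 2]]) # D4: upper-left quadrant of lr
    return np.vstack([top, bottom])            # same w x h as a stored area
```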
In next step S087, it is determined whether the value of “e” is equal to the value of “K” or not. If it is not yet equal, this subroutine program ends.
Every time "1" is added to the value of "i" (see step S008), the processing in steps S083-S087 is repeated. When the value of "i" reaches "2K", and thus the value of "e" attains "K", the result of determination in step S087 becomes "YES", and the processing in step S088 is performed according to "n=K+d". Consequently, the value of "n" is equal to the sum of the number of the neighboring partial areas, which are set in the input image without an overlap, and the number of image data items corresponding to the boundary-including partial areas produced in step S085. In the next step S089, "0" is assigned to "d", and the template setting completion flag is set to "ON" in step S090.
In a fourth embodiment, the processing of producing the image data corresponding to the overlapping partial area already described in connection with the third embodiment is applied to the second embodiment.
In step S102, the image data of a neighboring partial area "Ri" is extracted from a position near the center of the input image and set as a template.
The loop in steps S102-S104 is repeated, and “1” is added to the value of “i” in response to every repetition. When neighboring partial areas “Ri” are set throughout the input image without an overlap, the processing in step S105 is performed to set the value of “i” to “K”, and this subroutine program ends.
Since neighboring partial areas "Ri" are extracted from positions near the center in step S102, the neighboring partial areas are set in order from the center of the input image toward its periphery.
When the value of "i" is equal to or larger than K, the image of the boundary-including partial area is to be used in this stage, so that it is determined in step S112 whether the values of "i" and "K" are equal to each other or not. When they are equal, i.e., when the process is in the first stage after it becomes necessary to use the boundary-including partial area, the center of the input image is calculated in step S113. In the next step S114, it is determined whether the value of "i" is equal to or smaller than the value of "n" or not. This "n" is the same as that in foregoing step S046, and indicates the predetermined total number of the partial areas used for the collation. The value of "n" is larger than the total number of the neighboring partial areas set without an overlap (25 in the illustrated example), and smaller than the total number of partial areas in the state where the boundary-including partial areas are fully spread over the neighboring partial areas.
When the current value of "i" is equal to or smaller than the value of "n", the calculation (e=i−K+1) is performed in step S115, and it is determined in step S116 whether the four neighboring partial areas including neighboring partial area "Re" at the upper left are present or not. When they are present, the processing in step S117 collects the data of divided area D1 of neighboring partial area "Re", the data of divided area D2 of the upper right partial area, the data of divided area D3 of the lower left partial area and the data of divided area D4 of the lower right partial area, to produce the image data of boundary-including partial area "Ri" from the collected data. The processing in these steps S116 and S117 is the same as that in the foregoing steps S084 and S085.
The processing in steps S114-S117 is repeated, and “1” is added to the value of “i” in response to every repetition (see step S008). When the value of “i” exceeds the value of “n”, the result of determination in step S114 becomes “NO”, and the template setting completion flag is set to “ON” in step S118 so that the subroutine program ends.
In the third and fourth embodiments, the image data of the neighboring partial areas extracted from the input image is registered in registered data storing unit 202. The registering processing may be configured to store, in registered data storing unit 202, the neighboring partial areas corresponding to a plurality of items of image data (fingerprint data) to be collated. In this configuration, registered data storing unit 202 must store the data of the neighboring partial areas of the plurality of items of image data. In the collating operation, however, the data of the boundary-including partial areas required for the collation is produced using the data of the neighboring partial areas. Therefore, it is not necessary to additionally store the data of the boundary-including partial areas in registered data storing unit 202, which provides the advantage that an increase in the storage capacity of registered data storing unit 202 can be suppressed as far as possible.
The processing function for the image collation already described is implemented by a program. In this embodiment, the program is stored in a computer readable recording medium.
In this embodiment, the recording medium may be a memory required for the processing by the computer, e.g., memory 624 itself.
Here, the recording medium mentioned above is detachable from the computer body. A medium fixedly carrying the program may be used as the recording medium. Specific examples may include tapes such as magnetic tapes and cassette tapes, disks including magnetic disks such as FD 623 and fixed disk 626 and optical disks such as CD-ROM 642/MO(Magneto-Optical disk)/MD(Mini Disc)/DVD(Digital Versatile Disc), cards such as an IC card (including memory card)/optical card, and semiconductor memories such as a mask ROM, EPROM (Erasable and Programmable ROM), EEPROM (Electrically EPROM) and a flash ROM.
The computer may also be configured to be connectable to a communication network, in which case the recording medium may carry the program in a flowing manner, as in the case where the program is downloaded through the network.
The contents stored in the recording medium are not limited to a program, and may include data.
According to the first to fifth embodiments described above, even when the sensor area is small, partial areas of the size required for ensuring the collation accuracy can be set in the number required for ensuring the collation accuracy, because the boundary-including partial areas are set to overlap the neighboring partial areas.
Further, in the process of collating the images, the continuity of the image data at the boundary between the neighboring partial areas is reflected in the determination of match/mismatch between the images, so that a determination error caused by a difference occurring exactly at the boundary between the partial areas can be suppressed.
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.