The present application claims priority from Japanese Patent Application No. 2014-130138 filed on Jun. 25, 2014, the content of which is hereby incorporated by reference into this application.
1. Technical Field
The present invention relates to a system that authenticates an individual by utilizing human biometric information.
2. Background Art
As a result of the progress in network technology that has been made in recent years, future demand is expected to increase for cloud-type biometric authentication services that centrally manage biometric data for individual authentication over a network. When a plurality of pieces of biometric data can be centrally managed on a server, a vast number of data items may be registered.
When the number of people who utilize a biometric authentication system is large, throughput is decreased in the case of 1:1 authentication whereby a living body is presented after the individual is uniquely identified by the input of a personal identification number or by the presentation of an ID card. Thus, it is desirable to perform so-called 1:N authentication involving solely biometric authentication without utilizing the personal identification number or ID card. As the number of data items registered on a server increases, N in the 1:N authentication increases. Accordingly, in order to correctly distinguish individuals from among a large number of registered data items, increased accuracy is required.
Patent Document 1 discloses a technique aimed at achieving increased accuracy of individual identification performance by utilizing collation of biometric features of an individual with those of others. In Patent Document 1, it is described that the object is to make authentication faster in so-called multimodal authentication involving a plurality of pieces of biometric information for authentication. In Patent Document 1, as a solution for achieving the increase in speed, a multimodal authentication method is described whereby candidates are selected from among registered persons by utilizing first biometric information from the authentication-requesting person, and then collation is performed only with the candidates using second biometric information.
In Patent Document 1, it is further described that a similarity value is detected in the form of an index indicating a similarity relationship between respective pieces of the second biometric information of the candidates on the basis of a predetermined function. In Patent Document 1, if the similarity value based on collation with the others exceeds a predetermined threshold value, candidate selection is performed again. Only when the similarity value is below the predetermined threshold value, it is determined that personal identification from the candidates can be readily performed utilizing the second biometric information, and authentication is performed.
Patent Document 1: JP 2005-275508 A
However, merely increasing the types of biometric information (biometric modalities) utilized for biometric authentication does not necessarily increase the amount of information beneficial for individual authentication. Namely, for increased accuracy, it is necessary to increase the information beneficial for individual identification that is obtained from each biometric modality. However, it is considered that the biometric features that have been utilized for biometric authentication so far do not fully extract and take advantage of all features inherently possessed by a living body that are beneficial for individual identification. Thus, instead of simply increasing the number of biometric modalities, there is the problem of how to extract feature information beneficial for authentication that has not been used in conventional or newly added biometric modalities, and how to fully take advantage of that feature information for authentication.
An object of the present invention is to provide a highly accurate authentication system that utilizes beneficial feature information in a biometric authentication system.
In order to achieve the above object, the configurations set forth in the claims are adopted, for example. The present application includes a plurality of means for solving the problem. For example, there is provided an authentication system including a measurement device that acquires biometric modality information from a living body of a first user; an input unit that generates at least one item of input information from the biometric modality information; a storage device that stores first feature information acquired from the biometric modality information of the first user, and second feature information acquired based on a correlation between the biometric modality information of the first user and biometric modality information of a second user; and an authentication unit that authenticates the first user by collating the input information with the first feature information and collating the input information with the second feature information.
In another example, there is provided an authentication system including a measurement device that acquires biometric modality information from a living body of a first user; an input unit that generates input information from the biometric modality information; a storage device that stores, with respect to a group of at least three persons including the first user, group feature information acquired based on a correlation between the biometric modality information of the at least three persons; and an authentication unit that authenticates the group to which the first user belongs by collating the input information with the group feature information.
In yet another example, there is provided an authentication system including a measurement device that acquires biometric modality information from a living body of a first user; an input unit that generates input information from the biometric modality information; a storage device that stores first feature information acquired from the biometric modality information of the first user and group information indicating a group to which the first user belongs; and an authentication unit that authenticates the first user by collating the input information with the first feature information. The authentication unit authenticates a second user belonging to the group by collating the input information with the first feature information, identifies the group to which the second user belongs, and relaxes an authentication condition for the first user for a predetermined time when the first user is spatially close to the second user and temporally close to the time at which the second user was authenticated.
According to the present invention, a highly accurate authentication system can be provided by utilizing beneficial feature information.
Additional features relating to the present invention will become apparent from the description of the present specification and the attached drawings. Problems, configurations, and effects other than those described above will become apparent from the following description of embodiments.
In the following, embodiments of the present invention will be described with reference to the attached drawings. While the attached drawings illustrate specific embodiments in accordance with the principle of the present invention, the embodiments are provided for facilitating an understanding of the present invention and are not to be used for interpreting the present invention in a limited sense.
The measurement device 12 is a device that acquires information about biometric modality of an authentication-requesting person 10, and may include a camera or a distance sensor. In the following, a case will be described in which a biometric modality image of the authentication-requesting person 10 is obtained by the measurement device 12, for example. The image input unit 18 acquires the image of the authentication-requesting person 10 that has been captured by the measurement device 12, generates input data from the acquired image, and sends the data to the authentication processing unit 13. The authentication processing unit 13 includes a CPU 19, a memory 20, and various interfaces (IF) 21. The CPU 19 performs various processes by executing a program recorded in the memory 20. The memory 20 stores the program executed by the CPU 19. The memory 20 also temporarily stores the image input from the input unit 18. The interfaces 21 are provided for connecting the authentication processing unit 13 to external devices. Specifically, the interfaces 21 are connected to the measurement device 12, the storage device 14, the display unit 15, the input unit 16, the speaker 17, and the image input unit 18, for example.
The storage device 14 stores registered data of the authentication-requesting person who utilizes the present system. The registered data include information for collation of the authentication-requesting person, such as an image obtained by measuring a living body of the person. The display unit 15 displays information received from the authentication processing unit 13, for example. The input unit 16, such as a keyboard or mouse, transmits information input by the authentication-requesting person to the authentication processing unit 13. The speaker 17 is a device that emits information received from the authentication processing unit 13 in the form of an acoustic signal.
The processing units of the authentication processing unit 13 may be realized by various programs. In the memory 20, various programs stored in the storage device 14, for example, are loaded. The CPU 19 executes the programs loaded into the memory 20. The processes and operations described below are executed by the CPU 19.
In the biometric authentication system of
The CPU 19 executes the program stored in the memory 20 to collate the biometric feature information of the authentication-requesting person 10 with biometric feature information 6 of registered persons 11 (p1, p2, . . . , pn; n is the number of people registered in the database) stored in the registration databases 8 connected via the network 7, whereby individual authentication can be performed.
As a feature of the present embodiment, the biometric feature information 6 includes first biometric feature information 6-1 extracted by referring only to the biometric modality information of one person, and second biometric feature information 6-2 acquired on the basis of correlation of biometric modality information between different persons. For example, the second biometric feature information 6-2 is biometric feature information extracted by searching for biometric information having a high correlation value (such as similarity) between the biometric modality information items of different persons. The first biometric feature information 6-1 and the second biometric feature information 6-2 may each be extracted from the same biometric modality or from different biometric modalities. The biometric modality for extracting the first biometric feature information 6-1 and the second biometric feature information 6-2 may include blood vessel, fingerprint, palm print, palm shape, nail shape, face, ear shape, iris, retina, gait, or any other biometric modality.
Generally, conventional biometric authentication involves authenticating an individual by utilizing biometric feature information (i.e., information such as the first biometric feature information 6-1) extracted from a living body of the individual in a uniform feature extraction process. However, in the present invention, in addition to the first biometric feature information 6-1 extracted in a uniform process, the second biometric feature information 6-2 having high correlation (such as similarity) between a plurality of persons is extracted and utilized for individual authentication.
The second biometric feature information 6-2 is biometric feature information exhibiting high correlation value indicating a correlation between a plurality of different persons. Herein, the correlation value means the degree of correspondence in biometric modality between a plurality of different persons. For example, when the biometric modality is obtained as an image, the correlation value may include similarity indicating the degree of correspondence between image patterns. The similarity may be calculated by applying a technology well known to those skilled in the art.
“Having high correlation value” means that the correlation value is higher than a certain reference value by a predetermined value. Herein, as the reference value, a standard value (such as an average value) may be obtained from the distribution of the correlation values of biometric modality information between a plurality of different persons. For example, when a biometric modality image is utilized, an image pattern of biometric modality of a certain person is matched with image patterns of biometric modality of various persons, and a similarity histogram is created. In the histogram, a pattern at a position spaced apart from a standard position, such as an average value, by a predetermined value may be extracted as the second biometric feature information 6-2. The method of extracting the second biometric feature information 6-2 is not limited to the above, and other methods may be used for extraction.
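For illustration only, the histogram-based selection described above may be sketched as follows. The data layout (a list of per-person similarity values for one candidate pattern) and the choice of the mean plus a multiple of the standard deviation as the "reference value plus a predetermined value" are assumptions for this sketch, not limitations of the method.

```python
from statistics import mean, stdev

def select_high_correlation_patterns(similarities, k=1.0):
    """Pick the person IDs whose similarity to a candidate pattern lies
    k standard deviations above the mean of the inter-person similarity
    distribution; such patterns may serve as second feature information.

    similarities: list of (person_id, similarity) pairs obtained by
    matching one pattern of person p1 against patterns of other persons.
    """
    values = [s for _, s in similarities]
    reference = mean(values)            # standard value of the distribution
    threshold = reference + k * stdev(values)
    return [pid for pid, s in similarities if s >= threshold]
```

For example, if collation with one specific person yields similarity 0.95 while collation with everyone else stays near 0.3, only that person is selected, which is exactly the "high similarity only between specific persons" property the second feature information is meant to capture.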
The first biometric feature information 6-1 is such that high similarity is obtained by collation with the subject person while low similarity is obtained by collation with others. Thus, the first biometric feature information 6-1 enables individual authentication by distinguishing the subject person from others. In other words, when the first biometric feature information 6-1 is collated with persons other than the subject person, high similarity is rarely obtained.
On the other hand, the second biometric feature information 6-2 is such that high similarity is obtained when collated with specific others, and can provide a feature unique to the collated pair of persons. Specifically, a biometric feature for which high similarity is obtained only between specific persons is intentionally acquired as the second biometric feature information 6-2 and registered in advance. When the second biometric feature information 6-2 is collated with those specific others and high similarity is obtained, the authenticity of the authentication-requesting person as the subject person increases, whereby the person can be distinguished from others and the individual can be authenticated. Consider a case in which all similarities obtained by collating an arbitrary feature, such as the first biometric feature information 6-1, with others are comprehensively utilized for individual authentication. In this case, as described above, mostly low similarities are obtained by collation with others, and utilizing a large number of such low similarities is not very effective in improving individual identification performance. Accordingly, by intentionally utilizing for individual authentication only the second biometric feature information 6-2, for which high similarity is obtained when collated with others, individual identification performance can be improved more effectively than by simply utilizing all similarities obtained by collation with others.
In the present embodiment, the authenticity of the subject person is determined by utilizing the similarity calculated by collation of the registered first biometric feature information 6-1 with the subject person, and the authenticity of the subject person is further determined by utilizing an increase in similarity calculated by collation with the registered second biometric feature information 6-2. In this configuration, individual authentication with increased accuracy can be realized.
In the foregoing, as the second biometric feature information 6-2, the biometric feature information exhibiting high correlation value indicating the correlation between a plurality of different persons is extracted. However, this example is not a limitation, and as the second biometric feature information 6-2, biometric feature information exhibiting low correlation value indicating the correlation between a plurality of different persons may be extracted. “Exhibiting low correlation value” means that the correlation value is lower than a certain reference value by a predetermined value. By the same method as described above, the second biometric feature information 6-2 having low correlation value between a plurality of different persons can be extracted. In this case, it becomes possible to confirm the authenticity of the authentication-requesting person as the subject person by utilizing an extremely low similarity obtained by collation with the second biometric feature information 6-2.
In the following, a more specific example will be described. Referring to
Herein, the second biometric feature information 6-2 (f1-fi) having high similarity calculated by collation of person p1 with each person pi (2 ≤ i ≤ n) other than p1 in the registration database 8 is extracted and registered in advance. When the second biometric feature information 6-2 (f1-fi) is extracted from the input px1, and collated with the second biometric feature information 6-2 (f1-fi) of the registered p1, most of a plurality of similarities obtained exhibit high values. On the other hand, when the second biometric feature information 6-2 (f2-fi) of the input px2 is collated with the second biometric feature information 6-2 (f1-fi) of the registered p1, most of a plurality of similarities that are obtained have low values. Thus, the persons px1 and px2 can be distinguished, and px1 can be authenticated as p1.
When authentication is performed for person p1, after person p1 presents a living body of the person to the measurement device 12, such as a camera, the measurement device 12 senses the living body of person p1 (S201). When the first feature information 6-1 and the second feature information 6-2 are of the same biometric modality, measurement may be made once. When the first feature information 6-1 and the second feature information 6-2 are of different biometric modalities, a plurality of measurements may be required.
Then, the image input unit 18 generates, on the basis of the information measured by the measurement device 12, the first feature information 6-1 and the second feature information 6-2 as input data (S202). As will be described later, the second feature information 6-2 may be partial information of the first feature information. In this case, where the first feature information 6-1 and the second feature information 6-2 are obtained from one biometric modality information item, the image input unit 18 may input one piece of feature information (such as the first feature information) as the input data.
Then, the authentication unit 101 initializes a variable i identifying the registered data to 1 for collation process initialization (S203). The variable i corresponds to the order of arrangement of the registered data. When i is 1, the initial registered data item is indicated; when i is N, where N is the number of registered data items, the last registered data item is indicated. The authentication unit 101 collates the first feature information 6-1, which is the generated input data, with the first feature information 6-1, which is the i-th registered data on the registration databases 8, to calculate a collation score 1(i). The authentication unit 101 further collates the second feature information 6-2 as the input data with the second feature information 6-2 as the i-th registered data on the registration databases 8, to calculate a collation score 2(i) (S204).
The authentication unit 101 then calculates a final collation score (i) for making a final authentication determination by integrating the collation score 1(i) and the collation score 2(i) (S205). The authentication unit 101 determines whether the final collation score (i) is equal to or greater than a previously set authentication threshold value Th1 (S206). If the determination condition is satisfied, the authentication unit 101 determines that authentication is successful (S207).
If the final collation score (i) is below the authentication threshold value Th1, the authentication unit 101 increments the value of the variable i, and performs collation with the next registered data in the registration databases 8. As a result of collation with the last registered data item N, if the final collation score (N) is below the authentication threshold value, the authentication unit 101 determines that authentication is unsuccessful because there is no further registered data to be collated (S208).
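The 1:N loop of steps S203 through S208 may be sketched as follows, purely for illustration; the collation and fusion functions are passed in as parameters because the disclosure does not fix their implementation, and the names used here are hypothetical.

```python
def authenticate_1_to_n(input_f1, input_f2, registered, collate1, collate2, fuse, th1):
    """Walk the registered data in order (S203) and accept the first entry
    whose fused collation score reaches the threshold Th1 (S206, S207);
    if no entry qualifies, authentication fails (S208)."""
    for i, (reg_f1, reg_f2) in enumerate(registered, start=1):  # S203: i = 1
        score1 = collate1(input_f1, reg_f1)   # S204: first-feature collation
        score2 = collate2(input_f2, reg_f2)   # S204: second-feature collation
        final = fuse(score1, score2)          # S205: integrate the two scores
        if final >= th1:                      # S206: compare with Th1
            return i                          # S207: authentication succeeds
    return None                               # S208: no registered data matched
```

With toy scalar features and a simple distance-based collation, the loop skips a non-matching first entry and accepts a matching second one, mirroring the flowchart behavior.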
In the present embodiment, the collation score 1(i), which is the result of collation between two items of the first feature information 6-1, has only a single value. However, there is a plurality of items of the second feature information 6-2 in the i-th registered data. Thus, a plurality of collation scores 2(i) is calculated as the result of collation between the second feature information 6-2 items. Accordingly, the collation score 2(i) provides vector data including a plurality of values. The final collation score (i) may be calculated by a method of linear combination of a plurality of scores including the collation score 1(i) and the collation score 2(i), or by an integrating method based on the probability density function of each of the collation scores utilizing Bayesian statistics.
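As one possible realization of the linear-combination method, the sketch below averages the vector of collation scores 2(i) before combining it with the scalar collation score 1(i). The equal default weights and the averaging step are assumptions of this sketch; the Bayesian alternative mentioned above is not shown.

```python
def fuse_scores(score1, score2_vec, w1=0.5, w2=0.5):
    """Linear-combination fusion of the scalar collation score 1(i)
    with the vector-valued collation score 2(i): the vector is first
    reduced to its mean, then the two values are combined with weights
    w1 and w2 (hypothetical values; the disclosure does not fix them)."""
    s2 = sum(score2_vec) / len(score2_vec)   # reduce the vector to one value
    return w1 * score1 + w2 * s2             # weighted linear combination
```

The weights would in practice be tuned so that the fused score separates genuine and impostor attempts well.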
A method of registering the first feature information 6-1 and the second feature information 6-2 in the registration databases 8 will be described.
Herein, on the assumption that the measurement device 12 has produced one or more items of biometric modality information with respect to each of persons p1 to pn, a process of extraction and registration of the first feature information 6-1 and the second feature information 6-2 of person p1 will be described. As described above, the first feature information 6-1 and the second feature information 6-2 may be extracted from the same biometric modality or from different biometric modalities.
The first feature information 6-1(f1) extracted from the biometric modality information of person p1 is extracted independently without consideration of its relationships with the living bodies of persons other than p1 (p2, . . . , pn). The registration unit 102 extracts the first feature information 6-1(f1) from the biometric modality information of person p1. The registration unit 102 registers the extracted first feature information 6-1(f1) in the registration database 8.
Meanwhile, the second feature information 6-2 is a feature having high correlation value between person p1 and persons other than person p1 (p2, . . . , pn). The registration unit 102 compares the biometric modality information of person p1 with the biometric modality information of certain others (p2, . . . , pn), and extracts, from the biometric modality information of person p1, a feature having high correlation value (similarity) with respect to each of the others as the second feature information 6-2. The registration unit 102 registers the extracted second feature information 6-2 (f1-f2, . . . , f1-fn) in the registration database 8.
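The pairwise extraction performed by the registration unit 102 may be sketched as follows, for illustration only. Representing biometric modality information as a list of candidate partial patterns, and selecting the p1-side pattern with the single highest cross-similarity per other person, are assumptions of this sketch.

```python
def register_second_features(p1_patterns, others, similarity):
    """For each other person pi, find the partial pattern of person p1
    whose similarity to some pattern of pi is highest, and keep that
    p1-side pattern as the pairwise second feature f1-fi."""
    second_features = {}
    for pid, patterns in others.items():
        best = max(
            ((pat, similarity(pat, other))
             for pat in p1_patterns for other in patterns),
            key=lambda t: t[1],
        )
        second_features[pid] = best[0]   # the p1-side pattern of the best pair
    return second_features
```

Note that, as the text explains, the selected pattern generally differs per combination (f1-f2 need not equal f1-f3), which this per-person maximum naturally produces.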
As illustrated in
Thus, when items of the second feature information 6-2 are extracted, the second feature information 6-2 (f1-fi) having high correlation value varies for each combination of person p1 and person pi (2 ≤ i ≤ n). Namely, depending on the combination of person p1 and person pi, the biometric location, position, size, and the like from which the second feature information 6-2 (f1-fi) is extracted may vary. The second feature information 6-2 (f1-fi) has high correlation value (similarity) only between person p1 and the specific person pi. Thus, the similarity obtained by collation of the second feature information 6-2 (f1-fi) of person p1 with the second feature information 6-2 (f3-fi) of a person other than person p1 (such as person p3) is low. In the example of
Meanwhile, in the example of
The registration database 8 is provided with a first table including an identifier (ID) 401 for identifying each person, the first feature information 6-1, the second feature information 6-2, and biometric modality information 402. As in the illustrated example, the biometric modality information of each person may be registered in the registration database 8 together with the first feature information 6-1 and the second feature information 6-2. For example, when a person pz is newly registered in the registration database 8, the registration unit 102 may extract the first feature information 6-1 and the second feature information 6-2 by comparing the biometric modality of person pz with the biometric modality information of each person in the registration database 8, and then register the extracted information in the registration database 8.
In the present example, when the arbitrary authentication-requesting person px that is input is authenticated using the second feature information 6-2 registered in the registration database 8, the image input unit 18 does not know which information is to be extracted from the biometric modality information of the authentication-requesting person px as the second feature information 6-2 (fx-f2, fx-f3, . . . , fx-fn). Thus, the authentication unit 101 needs to search the range of the biometric modality information in which the second feature information may be present for a position similar to the registered second feature information 6-2 (f1-f2, f1-f3, . . . , f1-fn).
Herein, a case will be considered in which collation is performed with the second feature information 6-2 (f1-f2) of person p1 registered in the registration database 8. Specifically, when determining whether the authentication-requesting person px is person p1, it is necessary to collate the biometric modality information of the authentication-requesting person px with the second feature information 6-2 (f1-f2) to calculate similarity. However, because it is not known whether the authentication-requesting person px is person p1, it is also not known which part of the biometric modality information of the authentication-requesting person px is the second feature information 6-2 (fx-f2) that should be the object of collation with the second feature information 6-2 (f1-f2). Thus, in the present embodiment, the biometric modality information of the authentication-requesting person px is searched for feature information exhibiting high similarity to the registered second feature information 6-2 (f1-f2), and the feature information obtained as a result of the search is handled as the second feature information 6-2 (fx-f2). For example, the authentication unit 101 handles the feature information, among the biometric modality information of the authentication-requesting person px, exhibiting the highest similarity to the registered second feature information 6-2 (f1-f2) as the second feature information 6-2 (fx-f2). The authentication unit 101 takes this highest similarity as the similarity resulting from collation of the second feature information 6-2 (fx-f2) of the authentication-requesting person px with the registered second feature information 6-2 (f1-f2).
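This best-match search may be sketched as a simple sliding-window scan, shown here over a one-dimensional profile purely for illustration; real biometric modality information would be two-dimensional, and the similarity function is a placeholder for whichever collation measure the system uses.

```python
def find_best_matching_region(biometric, template, similarity):
    """Slide the registered second feature (template) over the
    authentication-requesting person's biometric information and return
    the best-matching region and its similarity. That region is then
    handled as the input-side second feature fx-f2, and the returned
    similarity as the result of collating fx-f2 with f1-f2."""
    best_region, best_score = None, float("-inf")
    for start in range(len(biometric) - len(template) + 1):
        region = biometric[start:start + len(template)]
        score = similarity(region, template)
        if score > best_score:
            best_region, best_score = region, score
    return best_region, best_score
```

If the requesting person truly is p1, some window should reproduce f1-f2 closely and yield a high score; for anyone else, even the best window tends to score low.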
A more specific embodiment will be described. In the following, human biometric modality information is provided by finger blood vessel images, and the first feature information 6-1 and the second feature information 6-2 are provided by finger blood vessel patterns extracted from the finger blood vessel images.
As illustrated in
Then, the registration unit 102 extracts, as the second feature information 6-2, partial patterns having high similarity between the finger blood vessel image of person p1 and the finger blood vessel images of the others (p2, . . . , pn). For example, the registration unit 102 searches for a certain partial pattern of the finger blood vessel image of person p1 by matching in the entire region of the finger blood vessel image of person p2, and detects the partial pattern having high similarity to the finger blood vessel image of person p2. The registration unit 102 determines the detected partial pattern as being the second feature information 6-2 (f1-f2). Similarly, the registration unit 102 detects a partial pattern having high similarity between the finger blood vessel image of person p1 and the finger blood vessel image of each of the others (p3, . . . , pn). The registration unit 102 determines that the detected partial patterns are the second feature information 6-2 (f1-f3), . . . , (f1-fn), respectively. The first feature information 6-1(f1) thus extracted and a plurality of items of the second feature information 6-2 (f1-f2, f1-f3, . . . , f1-fn) provide the feature of person p1.
In the example of
In another example, with respect to the blood vessel partial patterns p1a and p2a having high similarity, a pattern during a deformation process, such as morphing in which one partial pattern is brought closer to another partial pattern, may be extracted as the second feature information 6-2 (f1-f2).
In the example of
As a method of detecting the blood vessel partial pattern as the second feature information 6-2, the following examples may also be applied. For example, initially, each of the finger blood vessel images of two persons is divided by a preset number into a plurality of partial patterns. Then, a combination of the partial patterns with the highest similarity is selected from a plurality of combinations of the partial patterns, and the selected partial patterns may provide the second feature information 6-2. In another example, the partial pattern having high similarity may be detected by varying the region size or position from which the partial pattern is cut out in each of the finger blood vessel images of two persons.
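The block-division variant described first may be sketched as follows, for illustration; the images are flattened to one-dimensional sequences and the similarity function is hypothetical, since the disclosure leaves both unspecified.

```python
def best_block_pair(image_a, image_b, n_blocks, similarity):
    """Divide each of two finger blood vessel representations into
    n_blocks equal partial patterns, then return the cross pair of
    partial patterns with the highest similarity; that pair is a
    candidate for the second feature information 6-2."""
    def split(seq):
        size = len(seq) // n_blocks
        return [seq[i * size:(i + 1) * size] for i in range(n_blocks)]
    blocks_a, blocks_b = split(image_a), split(image_b)
    return max(
        ((a, b, similarity(a, b)) for a in blocks_a for b in blocks_b),
        key=lambda t: t[2],
    )
```

The variant that varies region size and position would replace the fixed split with a scan over window sizes and offsets, at higher computational cost.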
It is also possible to obtain the second feature information 6-2 by extracting a partial pattern from a partial region of high similarity calculated by collation that utilizes local features, such as collation of feature points in the finger blood vessel image. In this case, for example, a threshold value concerning the similarity calculated by collation of two blood vessel partial patterns is set in advance. When the similarity of the two blood vessel partial patterns exceeds the threshold value, the partial patterns may provide the second feature information 6-2. When a plurality of partial patterns having high similarity between the two finger blood vessel images is detected, each partial pattern may provide the second feature information 6-2.
While in the present embodiment the second feature information 6-2 is provided by a blood vessel partial pattern, other information may be used as the second feature information 6-2. For example, as the second feature information 6-2, there may be adopted information such as the number of blood vessels included in a blood vessel partial pattern, the ratio of blood vessels in a partial pattern region, or the direction of flow of blood vessels in the partial pattern.
In another example, the second feature information 6-2 may be provided by a histogram, such as information about the brightness gradient of a blood vessel image in a partial pattern. In this case, information which is robust with respect to an error in the position for cutting out the blood vessel partial pattern can be used as the second feature information 6-2, whereby authentication accuracy can be improved. It goes without saying that the second feature information 6-2 may be provided by other features that can be extracted from the blood vessel image.
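A minimal sketch of such a histogram feature follows, computed here over a one-dimensional brightness profile with hypothetical bin parameters; an actual implementation would histogram two-dimensional gradient orientations or magnitudes within the partial pattern.

```python
def gradient_histogram(profile, n_bins=4, max_grad=8):
    """Histogram of absolute brightness gradients inside a partial
    pattern. Because the histogram discards position, a small shift of
    the cut-out window changes individual pixels but leaves the
    histogram largely intact, giving robustness to cut-out error."""
    grads = [abs(b - a) for a, b in zip(profile, profile[1:])]
    hist = [0] * n_bins
    for g in grads:
        # clamp gradients at or above max_grad into the last bin
        hist[min(g * n_bins // max_grad, n_bins - 1)] += 1
    return hist
```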
A method of registering the first feature information 6-1 and the second feature information 6-2 that have been extracted will be described. As illustrated in
With regard to the order in which the plurality of items of the second feature information 6-2 (f1-f2, f1-f3, . . . , f1-fn) is stored for registration, the second feature information 6-2 with a greater region size may be stored earlier, for example. In this way, collation with the blood vessel image of the authentication-requesting person can proceed from the second feature information 6-2 of greater size and higher identifiability. In another example, the second feature information 6-2 may be stored in order of decreasing identifiability on the basis of an index representing the level of identifiability of the second feature information 6-2. When registered data is newly added to the registration database 8, not only are the first feature information 6-1 and the second feature information 6-2 of the newly registered person pn+1 registered, but the second feature information 6-2 of the already registered persons p1 to pn is also updated. For example, with respect to the registered person p1, the second feature information 6-2 (f1-fn+1) is extracted between person p1 and the newly registered person pn+1 and added to the registered data for person p1.
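The registration-time update described above, in which every existing enrollee gains a new pairwise feature when a person is added, can be sketched as follows. The data layout, function names, and the use of feature size as the ordering key are assumptions of this illustrative sketch; `pair_feature(a, b)` stands in for the extraction of the mutually similar feature between two modality items.

```python
def register_person(db, new_id, modality, pair_feature):
    """Add a newly registered person and refresh the pairwise second
    feature information of everyone already enrolled, as described
    above. `pair_feature` is a hypothetical helper returning the
    shared feature between two modality items."""
    new_pairs = {}
    for pid, rec in db.items():
        # each existing person gains a feature paired with the newcomer
        rec["pair_features"][new_id] = pair_feature(rec["modality"], modality)
        new_pairs[pid] = pair_feature(modality, rec["modality"])
    db[new_id] = {"modality": modality, "pair_features": new_pairs}

def ordered_pairs(rec):
    """Collation order described above: larger features (a proxy for
    region size, hence identifiability) come first."""
    return sorted(rec["pair_features"].items(),
                  key=lambda kv: len(kv[1]), reverse=True)
```

With a toy `pair_feature` such as set intersection, adding a third person updates the records of the first two, and `ordered_pairs` yields the larger shared feature first.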
While the flow of the authentication process is the same as the flowchart of
First, the authentication-requesting person px presents a living body of the person, and a finger blood vessel image is acquired by the measurement device 12. The image input unit 18 extracts from the acquired finger blood vessel image a blood vessel pattern providing the first feature information 6-1(fx), and inputs the pattern to the authentication processing unit 13. The authentication unit 101 collates the first feature information 6-1(fx) of the authentication-requesting person px with the first feature information 6-1(f1) of the registered person p1 to calculate similarity.
With regard to the collation of the second feature information 6-2, the authentication unit 101 calculates similarity by searching the finger blood vessel image of the authentication-requesting person px for the second feature information 6-2 of the registered person p1. For example, as illustrated in
In the present example, it is necessary to collate the registered second feature information 6-2 (f1-f2) of person p1 with the second feature information 6-2 (f1-f2) of the authentication-requesting person px to calculate similarity. However, it is not known which partial pattern in the finger blood vessel image of the authentication-requesting person px should be the second feature information 6-2 (fx-f2) as the object of collation with the second feature information 6-2 (f1-f2). Thus, as illustrated in
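The search of the authentication-requesting person's image for the registered partial pattern can be sketched as a template search over every offset, since the corresponding position in the input is unknown. The stride-1 scan and normalized cross-correlation are assumptions of this illustrative sketch.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized patches."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    d = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / d) if d > 0 else 0.0

def search_template(image, template):
    """Slide the registered partial pattern over the whole input blood
    vessel image and keep the highest similarity found at any offset."""
    th, tw = template.shape
    best = -1.0
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            best = max(best, ncc(image[y:y + th, x:x + tw], template))
    return best
```

The maximum similarity returned here plays the role of the collation result between the registered second feature information 6-2 and the input image.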
In the above configuration, feature information that is beneficial for authentication but has previously gone unused is drawn out of the biometric modality information, whereby authentication can be performed while taking full advantage of that information. In particular, the biometric feature information 6 includes the first feature information 6-1, extracted by referring only to the biometric modality information of one person, and the second feature information 6-2, acquired based on the correlation between the biometric modality information items of different persons. By utilizing the second feature information 6-2 in addition to the first feature information 6-1, highly accurate authentication can be performed.
In the present embodiment, a configuration in which the second feature information 6-2 is extracted from the biometric modality information of the authentication-requesting person will be described. In the present embodiment, an extraction property is registered in the registration database 8 along with the second feature information 6-2. The extraction property herein refers to attribute information for extracting, from the input information, the second feature information 6-2 as the object of collation with the second feature information 6-2 in the registration database 8. For example, the extraction property includes information about biometric location, extraction position, or region size and the like.
On the other hand, the second feature information 6-2 is a feature having high correlation value between person p1 and persons (p2, . . . , pn) other than person p1. The registration unit 102 compares the biometric modality information of person p1 with the biometric modality information of certain others (p2, . . . , pn), and extracts, from the biometric modality information of person p1 and as the second feature information 6-2, a feature having high correlation value (similarity) with each of the others. At this time, the registration unit 102 also acquires, for each combination of person p1 and the others, information about extraction property 9 representing attribute information of the second feature information 6-2. The registration unit 102 registers the extraction property 9 of the second feature information 6-2 in the registration database 8 along with the second feature information 6-2.
Depending on the combination of person p1 and each of the others pi, the extraction property 9 (p1-pi) representing the attribute information, such as the biometric location, extraction position, or region size, for extracting the second feature information 6-2 (f1-fi) may vary. Thus, the registration unit 102 registers the extraction property (p1-pi) of the second feature information 6-2 (f1-fi) in the registration database 8 for each combination of person p1 and each of the others pi.
The extraction property 9 may include, in addition to the above-described examples, a correlation value (similarity) between the second feature information 6-2 (f1-fi) of person p1 at the time of registration and the second feature information 6-2 (fi-f1) of person pi. Thus, as the extraction property 9, there may be registered a correlation value such as an average or dispersion of similarities in the collation of the second feature information 6-2 (f1-fi) of the registered person p1 with the second feature information 6-2 (fi-f1) of person pi. In this way, the authenticity of the subject person can be determined with increased accuracy on the basis of a difference between the registered correlation value and the correlation value calculated using the second feature information 6-2 (f1-fi) at the time of actual authentication.
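The extraction property record and the authenticity check it enables can be sketched as follows. The field names, the tolerance value, and the comparison rule are assumptions of this illustrative sketch; the text specifies only that the registered correlation value is compared with the value measured at authentication time.

```python
def make_extraction_property(location, position, size, enrolled_similarity):
    """Attribute record stored alongside a second-feature item.
    Field names are illustrative, not from the source."""
    return {"location": location, "position": position,
            "size": size, "enrolled_similarity": enrolled_similarity}

def similarity_consistent(prop, observed_similarity, tolerance=0.15):
    """Authenticity check sketched in the text: the similarity measured
    at authentication time should be close to the correlation value
    recorded at registration; a large gap is suspicious."""
    return abs(observed_similarity - prop["enrolled_similarity"]) <= tolerance
```

A genuine presentation reproduces roughly the enrolled similarity, while a large deviation would lower confidence in the subject's authenticity.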
To initialize the collation process, the authentication unit 101 then sets the variable i identifying the registered data to 1 (S303). The variable i corresponds to the order of arrangement of registered data: i equal to 1 indicates the first registered data item, and, when the number of registered data items is N, i equal to N indicates the last. The image input unit 18 generates, from the biometric modality information of the authentication-requesting person and by utilizing the extraction property 9 of the i-th registered second feature information 6-2, the second feature information 6-2 as the input data (S304).
The authentication unit 101 then collates the first feature information 6-1, i.e., the generated input data, with the first feature information 6-1 that is the i-th registered data on the registration database 8, and calculates a collation score 1 (i). Further, the authentication unit 101 collates the second feature information 6-2 as input data with the second feature information 6-2 as the i-th registered data on the registration database 8, and calculates a collation score 2 (i) (S305).
Then, the authentication unit 101 integrates the collation score 1 (i) and the collation score 2 (i) to calculate a final collation score (i) for final authentication determination (S306). The authentication unit 101 determines whether the final collation score (i) is equal to or greater than an authentication threshold value Th2 that is set in advance (S307). If this determination condition is satisfied, the authentication unit 101 determines that the authentication is successful (S308).
If the final collation score (i) is below the authentication threshold value Th2, the authentication unit 101 increments the value of the variable i, and performs collation with the next registered data on the registration database 8. As a result of the collation with the last registered data N, if the final score (N) is below the authentication threshold value, the authentication unit 101 determines that the authentication is unsuccessful because of the absence of registered data to be collated (S309).
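The 1:N loop of steps S303 to S309 can be sketched as follows. The weighted-sum fusion rule is an assumption of this illustrative sketch; the text says only that collation score 1 and collation score 2 are integrated into the final collation score.

```python
def one_to_n_authenticate(records, score1_fn, score2_fn, th2, w=0.5):
    """1:N loop of S303-S309: for each registered record compute
    collation scores 1 and 2 (S305), fuse them (S306; weighted sum is
    an assumed rule), and accept at the first record whose final score
    reaches the threshold Th2 (S307-S308). Returns (success, index,
    final score); failure after the last record corresponds to S309."""
    for i, rec in enumerate(records):
        s1 = score1_fn(rec)            # first-feature collation
        s2 = score2_fn(rec)            # second-feature collation
        final = w * s1 + (1 - w) * s2  # S306: score integration
        if final >= th2:               # S307: threshold determination
            return True, i, final      # S308: authentication succeeds
    return False, None, None           # S309: no registered data matched
```

Raising Th2 trades acceptance of impostors against rejection of genuine users, which is why the threshold is set in advance.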
When person px is authenticated with the registered data of persons on the registration database 8, the first feature information 6-1 and the second feature information 6-2 of person px are extracted. The authentication unit 101 authenticates person px on the basis of the level of similarity calculated by collation with the first feature information 6-1 and the second feature information 6-2 of the persons on the registration database 8. The operation of the authentication is similar to
When the authentication-requesting person px and person p1 on the registration database 8 are collated, the first feature information 6-1(fx) is extracted from the biometric modality information of person px. The authentication unit 101 calculates similarity by collating the first feature information 6-1(fx) with the first feature information 6-1(f1) of the registered person p1. When the second feature information 6-2 (fx-fi) is extracted from the authentication-requesting person px for collation with person p1, the extraction property 9 (p1-p2, . . . p1-pn) on the registration database 8 is utilized. By utilizing the extraction property 9 (p1-p2, . . . p1-pn), a plurality of items of the second feature information 6-2 (fx-f2, fx-f3, . . . fx-fn) is extracted from the biometric modality information of person px. The authentication unit 101 collates the second feature information 6-2 (fx-f2, fx-f3, . . . fx-fn) of the authentication-requesting person px respectively with the second feature information 6-2 (f1-f2, f1-f3, . . . f1-fn) of person p1 so as to calculate similarity. Then, the authentication unit 101 calculates the final collation score from the obtained plurality of similarities. If the final collation score is greater than the preset threshold value, the authentication unit 101 determines that person px is person p1. In the example of
In the above example, the second feature information 6-2 is extracted as a feature having high correlation between the living bodies of two persons. However, a feature having high correlation between three or more persons may be extracted as third feature information. Generally, the greater the number of persons, the less likely it becomes for a feature having high correlation between a plurality of persons to appear, making the identifiability of the feature higher.
A more specific embodiment will be described. In the following, the human biometric modality information is provided by finger blood vessel images, and the first feature information 6-1 and the second feature information 6-2 that are extracted are provided by finger blood vessel patterns extracted from the finger blood vessel images.
As illustrated in
In this configuration, when the blood vessel partial pattern of the second feature information 6-2 is registered, the extraction property (such as position or region size) for extracting the second feature information 6-2 from the entire finger blood vessel image is also registered. In this way, when the authentication-requesting person is authenticated, it becomes possible to uniquely extract, from the finger blood vessel image of the arbitrary authentication-requesting person px and by utilizing the extraction property, the blood vessel partial pattern providing the second feature information 6-2, and to collate the partial pattern with the second feature information of each of the persons in the registration database 8.
As illustrated in
First, the authentication-requesting person px presents a living body, and a finger blood vessel image is acquired by the measurement device 12. The image input unit 18 extracts from the acquired finger blood vessel image a blood vessel pattern that provides the first feature information 6-1(fx). With regard to the second feature information 6-2, the image input unit 18 extracts, from the finger blood vessel image of the authentication-requesting person px and by utilizing the extraction property 9 registered in the registration database 8, the second feature information 6-2 (fx-f2). Similarly, the image input unit 18 extracts, from the finger blood vessel image of the authentication-requesting person px and by utilizing the extraction property 9 registered in the registration database 8, the second feature information 6-2 (fx-f3, . . . , fx-fn).
Then, the authentication unit 101 calculates similarity by collating the first feature information 6-1(fx) of the authentication-requesting person px with the first feature information 6-1(f1) of person p1. Further, the authentication unit 101 collates each item of the second feature information 6-2 (fx-f2, . . . , fx-fn) of the authentication-requesting person px with the corresponding second feature information 6-2 (f1-f2, . . . , f1-fn) of person p1 to calculate similarity. The authentication unit 101 integrates a plurality of similarities thus obtained, and calculates a final collation score. If the magnitude of the final collation score is greater than the preset authentication threshold value, px is authenticated as p1; if below the threshold value, px is collated with the next registered data on the registration database 8.
In the present embodiment, the extraction property, such as the position of extraction or size of the second feature information 6-2 as a partial pattern in the blood vessel image in each combination of various persons, is registered in the registration database 8. Thus, by utilizing the extraction property, the second feature information 6-2 can be uniquely extracted from the blood vessel image of the subject px that has been input.
In the present embodiment, the second feature information 6-2 is extracted as a similar partial pattern between any two finger blood vessel images (blood vessel patterns). However, in reality, a similar partial pattern may not necessarily exist between two finger blood vessel images. Thus, when a similar partial pattern does not exist, one blood vessel pattern may be subjected to at least one of the pattern transformation processes of rotation, inversion, size change (scaling), or deformation. In this way, a similar partial pattern between two finger blood vessel images can be extracted.
For example, it is assumed that a similar blood vessel partial pattern could not be found between person p1 and person p2 when the second feature information 6-2 of person p1 is registered. In this case, the registration unit 102 subjects the blood vessel partial pattern of person p2 to the above pattern transformation process so as to generate a partial pattern similar to the blood vessel partial pattern of person p1. The registration unit 102 may register the pattern obtained through transformation of the blood vessel partial pattern of person p2 as the second feature information 6-2 (f1-f2). If person p1 is the authentication-requesting person, a partial pattern (input data) as the second feature information 6-2 extracted from person p1 may be collated with the second feature information 6-2 (registered data) generated through transformation of the partial pattern of person p2, whereby high similarity can be obtained.
If there are not many blood vessel patterns of person p1 as the authentication-requesting person, and if the blood vessel patterns do not include many geometric structures, such as curves, the authentication unit 101 may subject the blood vessel partial pattern of person p1 to the transformation process. In this way, it can be expected that authentication accuracy will be increased. As the extraction property 9 of the second feature information 6-2, in addition to the position of extraction or size of the second feature information 6-2, parameter information of the partial pattern transformation process may also be registered in the registration database 8. In this way, by utilizing the pattern transformation process parameter at the time of authentication, the authentication unit 101 can subject the blood vessel partial pattern of person p1 as the authentication-requesting person to pattern transformation process.
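The transformation search described above can be sketched as follows. This illustrative sketch covers only rotations and inversions (flips); scale change and free deformation are omitted, and the parameter naming is an assumption. The chosen transformation parameter is the kind of information the text says would be registered with the extraction property 9.

```python
import numpy as np

def transformed_variants(patch):
    """Rotations and inversions tried when no directly similar partial
    pattern exists; each variant is tagged with its parameter."""
    out = []
    for k in range(4):
        r = np.rot90(patch, k)
        out.append(("rot%d" % (90 * k), r))
        out.append(("rot%d+flip" % (90 * k), np.fliplr(r)))
    return out

def best_transformation(target, patch):
    """Pick the transformation of `patch` most similar to `target`
    (normalized correlation); the winning parameter would be stored
    in the registration database as part of the extraction property."""
    def ncc(a, b):
        a = a.astype(float) - a.mean()
        b = b.astype(float) - b.mean()
        d = np.linalg.norm(a) * np.linalg.norm(b)
        return float((a * b).sum() / d) if d > 0 else 0.0
    return max(((name, ncc(target, v)) for name, v in transformed_variants(patch)),
               key=lambda t: t[1])
```

At authentication time, replaying the stored transformation parameter reproduces the same variant without searching all transformations again.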
With regard to the handling of a plurality of similarities, in the present embodiment, similarity obtained by collation with the first feature information 6-1 and similarity obtained by collation with the second feature information 6-2 are calculated. In the foregoing examples, the plurality of similarities are integrated to determine a single similarity (final collation score) for authentication. In another example, collation may be performed using the first feature information 6-1 first. If the similarity is higher than a preset authentication threshold value, it may be determined that authentication has been successful, and only if the similarity is lower than the authentication threshold value, a plurality of similarities based on the collation of the second feature information 6-2 may be utilized. Conversely, collation may be performed first with the second feature information 6-2. If the similarity is higher than the preset authentication threshold value, it may be determined that the authentication has been successful, and only if the similarity is lower than the authentication threshold value, the similarity based on the collation of the first feature information 6-1 may be utilized. Alternatively, an authentication result may be determined on the basis of the similarity based on the collation of only the second feature information 6-2.
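The cascade variant described above, in which the first-feature similarity alone can decide the result and the second-feature similarities are consulted only as a fallback, can be sketched as follows. Aggregating the second-feature similarities by their mean is an assumption of this sketch; the text does not fix the aggregation rule.

```python
def cascade_authenticate(sim1, sim2_list, th):
    """Cascade fusion sketched in the text: accept on the first-feature
    similarity alone when it clears the threshold; otherwise fall back
    to the second-feature similarities (mean aggregation is assumed)."""
    if sim1 >= th:
        return True  # first-feature collation already sufficient
    if sim2_list and sum(sim2_list) / len(sim2_list) >= th:
        return True  # rescued by second-feature collation
    return False
```

The converse cascade (second feature first) follows by swapping the roles of the two checks.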
With regard to the order of collation with the second feature information 6-2, when the number of registered data items of the second feature information 6-2 in the registration database 8 is small, collation may be performed with all of the registered second feature information 6-2 items for authentication. However, when the number of registered data items of the second feature information 6-2 is very large, collation with all of the registered second feature information 6-2 items may take a long time. In this case, collation may be performed only with those of the plurality of items of the registered second feature information 6-2 that have a large degree of contribution to the authentication result. In this way, the difference between the result of the authentication determination when collation is terminated before all of the second feature information 6-2 items have been collated, and the result when collation is performed with all of the items, may be virtually eliminated. In addition, the speed of the authentication process can be increased.
As to the method of calculating the degree of contribution to the authentication result, the level of similarity of the biometric features of the two persons at the time of registration of the second feature information 6-2 may be considered the degree of contribution. Alternatively, the level of so-called identifiability of the second feature information 6-2 may be considered the degree of contribution: based on collation performed within the registration database 8, for example, identifiability is high when the similarity with respect to the second feature information 6-2 is high at the time of collation of the subject person, whereas the similarity based on collation between two persons whose items of second feature information 6-2 do not correspond is low. With regard to the order of the second feature information 6-2 when performing collation using the second feature information 6-2, a unique order may be set for each registered person, or a fixed order of the second feature information 6-2 may be set in the registration database 8.
In the present embodiment, the second feature information 6-2 having high correlation between two different finger blood vessel images is extracted. However, the third feature information having high correlation between three or more different finger blood vessel images may be extracted. The first feature information 6-1 and the second feature information 6-2 may include information about certain specific feature points in a blood vessel image or brightness changes in a blood vessel image with grey scale representation. The first feature information 6-1 and the second feature information 6-2 may be respectively extracted from different biometric modalities (such as blood vessel, fingerprint, palm print, palm shape, nail shape, face, ear shape, iris, retina, and gait).
The first and the second embodiments have been described with reference to examples where the second feature information 6-2 having a high correlation value (similarity) between two persons is extracted and utilized for collation, so as to authenticate an individual. By utilizing the second feature information 6-2 in addition to the first feature information 6-1, highly accurate authentication can be performed. Meanwhile, as the number of data items registered in the registration database 8 on the server and the like increases, the speed of authentication may decrease. Thus, in the present embodiment, a method for performing authentication with high accuracy and at high speed by utilizing a feature having high similarity between a plurality of persons will be described.
According to the first and the second embodiments, the second feature information 6-2 is provided by a biometric modality feature having high similarity between two persons. In the present example, third feature information (group feature information) 6-3 acquired on the basis of correlation between the biometric modality information of three or more different persons is utilized. The third feature information 6-3 is feature information exhibiting high correlation value (similarity) between three or more persons. The third feature information 6-3 may be provided by feature information having low correlation value (similarity) between the three or more persons. The meaning of “high (or low) correlation value” is the same as described above. By utilizing the co-occurrence where a plurality of similarities obtained by collation of the third feature information 6-3 having high similarity commonly among three or more persons with a plurality of persons is simultaneously increased, not only can an individual be authenticated but also a group to which the individual belongs can be identified.
For example, in a scene where a plurality of authentication-requesting persons forms a line and the persons are authenticated one after another, it can be expected that authentication-requesting persons belonging to the same group are waiting in the same line together. Thus, a plurality of temporally and spatially close authentication-requesting persons is collated with the third feature information 6-3. If a plurality of high similarities is obtained, the likelihood is very high that a plurality of authentication-requesting persons belonging to a certain specific group is present. When a certain authentication-requesting person can be authenticated and the group to which that person belongs can be identified, the likelihood is high that the authentication-requesting persons still waiting to be authenticated include other persons belonging to that group. Accordingly, immediately after the group is identified, a spatially and temporally close authentication-requesting person is preferentially collated with the registered data of the persons belonging to that group. In this way, the probability increases that collation with the correct registered data can be performed quickly.
First, living bodies of a plurality of authentication-requesting persons j are photographed by the measurement device 12 simultaneously or at short predetermined time intervals (S401). Then, the image input unit 18 generates the third feature information 6-3 as input data from the biometric modality information of each of the authentication-requesting persons (S402). To initialize the collation process, the authentication unit 101 then sets the variable i identifying the registered data to 1 (S403). The variable i corresponds to the order of arrangement of registered data: i equal to 1 indicates the first registered data item, and, when the number of registered data items is N, i equal to N indicates the last.
Then, the authentication unit 101 collates the third feature information 6-3, which is the generated input data, with the third feature information 6-3, which is the i-th registered data on the registration database 8, and calculates a collation score 3 j(i) (S404). The authentication unit 101 then counts the number k of the authentication-requesting persons of which the collation score 3 j(i) is greater than a preset authentication threshold value Th3 (S405). The authentication unit 101 determines whether the number k of the authentication-requesting persons is equal to or greater than a preset threshold value Th4 (S406). Herein, by performing the determination using the threshold value Th4, it can be determined whether a certain number of persons in the group are being authenticated simultaneously or at short predetermined time intervals. For example, suppose four persons belong to a certain group. By setting the threshold value Th4 to 3, the group can be estimated even if not all of its members satisfy the determination of step S406, because the likelihood is high that the remaining members of the group are also being authenticated.
When the number k of the authentication-requesting persons is equal to or greater than the threshold value Th4, the authentication unit 101, assuming that the authentication-requesting persons of which the collation score 3 is greater than the authentication threshold value Th3 belong to a group i, identifies the group (S407). When the number k of the authentication-requesting persons is below the threshold value Th4, the authentication unit 101 performs collation with next registered data. As a result of collation with the last registered data N, if the number k of the authentication-requesting persons is below the threshold value Th4, it is determined that the group identification is unsuccessful because of the absence of registered data to be collated (S408).
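The group identification loop of steps S403 to S408 can be sketched as follows. The score-matrix layout is an assumption of this illustrative sketch: `person_scores[j][i]` stands for collation score 3 of authentication-requesting person j against group record i.

```python
def identify_group(group_records, person_scores, th3, th4):
    """Group identification of S403-S408: for each registered group,
    count the simultaneously presented persons whose third-feature
    collation score exceeds Th3 (S405); if at least Th4 of them pass
    (S406), that group is identified (S407). Returns (group, member
    indices); (None, []) corresponds to the failure of S408."""
    for i in range(len(group_records)):
        members = [j for j, scores in enumerate(person_scores)
                   if scores[i] > th3]          # S405: count k
        if len(members) >= th4:                 # S406: k >= Th4?
            return group_records[i], members    # S407: group identified
    return None, []                             # S408: identification fails
```

Setting Th4 below the group size, as in the four-person example above, lets the group be identified even when one member's score falls short.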
In the example of
When the third feature information 6-3 (gf1) of group 1 is registered, an extraction property, such as the position of extraction of the third feature information 6-3 or the region size of the third feature information 6-3, may also be registered. As in the above-described case, the extraction property refers to attribute information for extracting, from the input information, the third feature information 6-3 as the object of collation with the third feature information 6-3 in the registration database 8. For example, the extraction property includes information about biometric location, extraction position, or region size and the like.
Depending on the person in the group, the extraction property representing the attribute information such as the biometric location, extraction position, or region size for extraction of the third feature information 6-3 may vary. Thus, the registration unit 102 registers the extraction property of the third feature information 6-3 in the registration database 8 for each person in the group. In this way, it becomes possible to uniquely extract the third feature information 6-3 from an arbitrary authentication-requesting person px using the extraction property, and to collate it with the third feature information 6-3 registered in the registration database 8.
First, the living bodies of a plurality of authentication-requesting persons j are photographed by the measurement device 12 simultaneously or at short time intervals (S501). The image input unit 18 generates the third feature information 6-3 from the biometric modality information of each of the authentication-requesting persons as input data (S502). To initialize the collation process, the authentication unit 101 sets the variable i identifying the registered data to 1 (S503). The variable i corresponds to the order of arrangement of registered data: i equal to 1 indicates the first registered data item, and, when the number of registered data items is N, i equal to N indicates the last.
Then, the image input unit 18, utilizing the extraction property of the third feature information 6-3 of the i-th registered group i in the registration database 8, generates the third feature information 6-3 from the biometric modality information of each of the authentication-requesting persons j as input data (S504). The authentication unit 101 then collates the third feature information 6-3, which is the generated input data, with the third feature information 6-3, which is the i-th registered data in the registration database 8, and calculates a collation score 3 j(i) (S505). Then, the authentication unit 101 counts the number k of the authentication-requesting persons of which the collation score 3 j(i) is greater than the preset authentication threshold value Th3 (S506). The authentication unit 101 then determines whether the number k of the authentication-requesting persons is equal to or greater than the preset threshold value Th4 (S507).
If the number k of the authentication-requesting persons is equal to or greater than the threshold value Th4, the authentication unit 101, determining that the authentication-requesting persons of which the collation score 3 is greater than the authentication threshold value Th3 belong to group i, identifies the group. Simultaneously, the authentication unit 101, with respect to the authentication-requesting persons of which the collation score 3 is greater than the authentication threshold value Th3, performs individual authentication (S508). If the number k of the authentication-requesting persons is below the threshold value Th4, the authentication unit 101 performs collation with the next registered data. As a result of collation with the last registered data N, if the number k of the authentication-requesting persons is below the threshold value Th4, the authentication unit 101 determines that the group identification is unsuccessful because of the absence of registered data for collation (S509).
First, the image input unit 18, using the extraction properties (p1-1, p2-1, p3-1, p4-1, p5-1) for uniquely extracting the third feature information 6-3 from each person belonging to group 1, extracts the respective third feature information 6-3 items (gx1-1, gx1-2, gx1-3, gx1-4, gx1-5) from person px1. In this case, because the position or size of the extracted feature varies depending on the extraction properties (p1-1, p2-1, p3-1, p4-1, p5-1), the third feature information 6-3 (gx1-1, . . . , gx1-5) also varies. Thus, the items of the third feature information 6-3 (gx1-1, . . . , gx1-5) extracted using the respective extraction properties (p1-1, p2-1, p3-1, p4-1, p5-1) are handled as distinct items.
The authentication unit 101 collates a plurality of items of the third feature information (gx1-1, . . . , gx1-5) extracted from person px1 respectively with the third feature information 6-3 (gf1) of group 1 registered in the registration database 8, and calculates similarity.
In the example of
The examples of
It is also possible to perform highly accurate authentication by combining similarities based on collation using the first feature information 6-1, the second feature information 6-2 exhibiting high correlation between two persons, and the third feature information 6-3 exhibiting high correlation between three or more persons. For example, the similarity calculated by collation of the third feature information 6-3, and the similarity calculated by collation of the first feature information 6-1 and the second feature information 6-2 may be integrated, whereby highly accurate authentication can be performed.
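As a concrete (and purely illustrative) instance of such integration, the similarities obtained from the three collations might be combined by a weighted sum; the weights below are arbitrary assumptions, not values from the embodiment:

```python
def integrated_score(sim1, sim2, sim3, weights=(0.4, 0.3, 0.3)):
    """Combine the similarities obtained by collation of the first
    feature information 6-1, the second feature information 6-2, and
    the third feature information 6-3 into one authentication score.
    A weighted sum is only one of many possible integration rules."""
    w1, w2, w3 = weights
    return w1 * sim1 + w2 * sim2 + w3 * sim3
```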
In the following, an example in which the third feature information 6-3 and the first feature information 6-1 (or the second feature information 6-2) are used in combination will be described. In this configuration, authentication speed and convenience can be improved while authentication accuracy is ensured.
In the examples of
First, the authentication unit 101 authenticates person p1 with the first feature information 6-1 (S601). Then, the authentication unit 101 identifies the group to which the authenticated person p1 belongs (S602). For example, as shown in the first table of
The measurement device 12 photographs the living body of at least one authentication-requesting person px, and acquires the biometric modality information of each authentication-requesting person px (S603). Then, the authentication unit 101 determines whether the spatial distance between the authentication-requesting person px and the authenticated person p1 is smaller than Th5, and whether the authentication time interval of the authentication-requesting person px and the authenticated person p1 is shorter than Th6 (S604). The spatial distance between the authentication-requesting person px and the authenticated person p1 may be determined using the distance between the authentication gates used for authentication of each person. For example, when there is a plurality of authentication gates, the storage device 14 may store information about the distance between the authentication gates. For example, when the authentication-requesting person px is authenticated at the same gate as or an adjacent gate to the gate used for authentication of the authenticated person p1, the authentication unit 101 may determine that the spatial distance condition in step S604 is satisfied.
The authentication-requesting person px who does not satisfy the condition of step S604 is determined to belong to a group different from that of the authenticated person p1, and the process proceeds to step S605. In this case, the authentication unit 101 performs an authentication process for the authentication-requesting person px by utilizing only the first feature information 6-1 (S605).
When the condition of step S604 is satisfied, the process proceeds to step S606. The authentication unit 101 collates the third feature information 6-3 of group i to which person p1 belongs with the third feature information extracted from person px to calculate a collation score 3 px(i) (S606). Then, the authentication unit 101 acquires the first feature information 6-1 from the registration database 8 with respect only to each person j who belongs to group i. The authentication unit 101 collates the first feature information 6-1 of each person j who belongs to group i with the first feature information extracted from person px to calculate a collation score 1 (j) (S607).
The authentication unit 101 determines whether the calculated collation score 3 px(i) and collation score 1 (j) are respectively greater than an authentication threshold value Th7 and an authentication threshold value Th8 (S608). If the condition of step S608 is satisfied, the authentication unit 101 determines that authentication of the authentication-requesting person is successful (S609). If the condition of step S608 is not satisfied, the authentication unit 101 determines that the authentication is unsuccessful (S610). In this case, the authentication unit 101 acquires the first feature information 6-1 of a person of a group other than group i from the registration databases 8, and collates the person's first feature information 6-1 with the first feature information extracted from person px (S611).
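Steps S604 to S611 can be summarized in the following sketch. The function signature, the representation of features as plain values, and the string return values are illustrative assumptions only:

```python
def authenticate_requester(px, group_feature3, member_features1, collate,
                           gate_distance, time_interval,
                           Th5, Th6, Th7, Th8):
    """Sketch of S604-S611 for one authentication-requesting person px.

    group_feature3   -- third feature information 6-3 of group i
    member_features1 -- first feature information 6-1 of each member j
    """
    # S604: spatial and temporal proximity to the authenticated person p1
    if gate_distance >= Th5 or time_interval >= Th6:
        return "different_group"      # S605: collate 6-1 only
    # S606: collation score 3 px(i) against the group's 6-3
    score3 = collate(group_feature3, px)
    # S607: collation score 1 (j) against each member j of group i
    for j, feature1 in member_features1.items():
        score1 = collate(feature1, px)
        if score3 > Th7 and score1 > Th8:   # S608
            return j                        # S609: authentication successful
    return "unsuccessful"                   # S610, then S611: other groups
```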
According to the above configuration, group i is identified from the initially authenticated person p1, and then the authentication-requesting person px is collated using the third feature information 6-3 of group i and the first feature information 6-1 of a person belonging to group i, whereby the speed of authentication is increased. Further, because the third feature information 6-3 and the first feature information 6-1 are used in combination, compared with the case where authentication is performed using solely the first feature information 6-1, the accuracy of the authentication system as a whole can be maintained even when the authentication threshold value Th8 in step S608 is lowered. Conventionally, because authentication is performed using solely the first feature information 6-1, the authentication threshold value needs to be set high so as to maintain authentication system accuracy. In contrast, according to the present embodiment, authentication using the third feature information 6-3 can be additionally performed by identifying the group of the authentication-requesting person in advance. Thus, the accuracy of the authentication system as a whole can be maintained even when the authentication threshold value Th8 for the first feature information 6-1 is lowered.
First, at the authentication gates, authentication is performed by extracting the first feature information 6-1 from the finger blood vessel image acquired by the measurement device 12. While waiting in the authentication-waiting lines, authentication is performed by extracting the third feature information 6-3 (face feature) from the facial image acquired by the measurement device 12.
It is assumed that one person has been initially authenticated with the first feature information 6-1, and that a group 2 to which the authenticated person p1 belongs and the third feature information 6-3 (gf2) of group 2 have been identified. If it is assumed that person p1 came to the authentication gates with a plurality of persons of group 2 to which person p1 belongs, the persons px1 to px9 lined up at the three authentication gates will include persons belonging to the same group 2 as person p1.
Thus, immediately after person p1 is authenticated, authentication is performed by collation with the third feature information 6-3 (gf2) of group 2 in the registration database 8, solely with respect to the persons px1 to px9, who are authenticated at the same authentication gate or at nearby authentication gates. At the authentication gates, collation using the first feature information 6-1 of the persons belonging to group 2 is preferentially performed. Collation using the third feature information 6-3 (gf2) is performed solely with respect to group 2, to which the authentication-requesting persons px1 to px9 are highly likely to belong, and collation using the first feature information 6-1 is performed solely with respect to the persons belonging to group 2. In this way, the probability is increased that collation with the registered data of the correct authentication-requesting persons can be performed with increased speed.
Further, by limiting the authentication-requesting persons to the authentication-requesting persons px1 to px9 immediately after person p1 is authenticated, the authentication threshold values for collation with the first feature information 6-1 and collation with the third feature information 6-3 in the registration database 8 can be lowered. Because the first feature information 6-1 and the third feature information 6-3 are used in combination, compared with the case where authentication is performed using solely the first feature information 6-1, the accuracy of the authentication system as a whole can be maintained even when the authentication threshold values for the first feature information 6-1 and the third feature information 6-3 are lowered. Thus, the frequency of rejection of the subject person at the authentication gate can be decreased. Further, because the authentication-requesting persons for whom the authentication threshold values are lowered are limited to temporally and spatially close persons, the risk of acceptance of the others in the authentication system as a whole can be reduced.
An example of combined use of the third feature information 6-3 and the first feature information 6-1 after collation by the third feature information 6-3 is performed and a certain group is identified will be described.
In this configuration, when high similarities are simultaneously obtained (co-occurrence) by collation of the third feature information 6-3 of a certain specific group with the third feature information extracted from a plurality of persons, collation by combined use of the third feature information 6-3 and the first feature information 6-1 is performed with respect to the persons associated with the co-occurrence of high similarities. In this way, highly accurate authentication can be performed.
Referring to
Then, the authentication unit 101 determines whether the collation score 3 px(i) and the collation score 1 (j) calculated in the flow of
As illustrated in
In the example of
Even when the similarity of a certain person is below the threshold value, if the similarities of the other persons who have reached the authentication gates simultaneously or at short time intervals are high, it may be estimated that the person whose similarity is below the threshold value also belongs to the same group 2.
If the group of the authentication-requesting persons who arrived at the authentication gates is identified, and if the individuals have also been authenticated, they can pass the authentication gates. By integrating the result of collation of the third feature information 6-3 and the result of collation of the first feature information 6-1 at the authentication gates, highly accurate authentication can be performed.
In an example of
While in the present embodiment the third feature information 6-3 is extracted from the face, the information may also be extracted from other biometric modalities that can be photographed contactlessly, such as the iris, palm print, or blood vessel. The first feature information 6-1, the second feature information 6-2, and the third feature information 6-3 may be respectively extracted from different modalities, such as blood vessel, fingerprint, palm print, palm shape, nail shape, face, ear shape, iris, retina, or gait.
In the present embodiment, an example of combined use of the first feature information 6-1 and the third feature information 6-3 has been described. However, the second feature information 6-2 and the third feature information 6-3 may be used in combination. Further, authentication may be performed by using the three items of information of the first feature information 6-1, the second feature information 6-2, and the third feature information 6-3.
The plurality of persons from which the third feature information 6-3 is extracted may be selected by various methods. For example, the third feature information 6-3 may be extracted from a plurality of persons who are often together. When a plurality of persons is authenticated together, the group to which the plurality of persons belong may be distinguished from another unspecified group by collation of the plurality of persons with the third feature information 6-3. The information about the group of the plurality of identified persons may be utilized for increasing the accuracy of individual authentication.
In another exemplary method of selecting the plurality of persons from which the third feature information 6-3 is extracted, a plurality of persons may be selected from a database in which an unspecified number of items of biometric modality information are stored. In this case, the selected persons and the number of the persons may be determined so that identifiability is increased in the database. Alternatively, the persons selected and the number of the persons may be determined so as to increase the speed of collation in the database by collation of the third feature information 6-3. The persons selected and the number of persons may be determined for other purposes.
In the present embodiment, a group to which a plurality of persons belongs is registered in advance, and information about co-occurrence of a plurality of high similarities by collation with the first feature information 6-1 is utilized. In this configuration, authentication accuracy can be increased.
In the sixth embodiment, the example has been described in which the group to which persons belong is identified (or estimated) by collation of the third feature information 6-3 common to a plurality of persons, and the information about the group is utilized for individual authentication. In the present embodiment, the information about which persons belong to a certain group, and the co-occurrence relationship of similarities by the collation of the first feature information 6-1 extracted only from the biometric modality information of the subject person are utilized. In this way, it becomes possible to increase the accuracy of group identification and individual authentication.
First, as illustrated in
As illustrated in
At this point in time, person p4, who belongs to group 1, is not yet authenticated. In this situation, because three of the four persons of group 1 have been authenticated, the probability is high that person p4, who belongs to group 1 and is not yet authenticated, is included among the persons px4 to px9 who are about to be authenticated. In this case, it is assumed that, as a result of collation of person px5 with the first feature information 6-1(f4) of person p4, the similarity is slightly below the authentication threshold value (namely, the similarity is smaller than the authentication threshold value by a predetermined value). Here, person px5 is presumed to be person p4 by utilizing the result of the previous authentication of the persons p1, p2, and p3 of the same group 1, and person px5 is authenticated as being person p4. Namely, because person p4 is temporally and spatially close to the persons p1, p2, and p3 of the same group 1, the authentication condition is relaxed for a predetermined time.
The authentication unit 101 then counts the number k of the authenticated persons of the same group (group 1) (S803). Herein, the number k of the authenticated persons is "3". When the number k of the authenticated persons is equal to or greater than the threshold value Th9 (S804), the authentication unit 101 proceeds to step S805. In this case, the authentication unit 101 lowers the authentication threshold value for the first feature information 6-1 of the person of the same group (herein, p4) by a predetermined value for a predetermined time (S805).
When the condition of S804 is not satisfied, the process from step S801 is repeated. With regard to the process of S801 to S804, when the predetermined time has elapsed, the value of the number k of the authenticated persons is reset. This is so that the authentication threshold value for the first feature information 6-1 is lowered only when the group is identified by a plurality of temporally and spatially close authentication-requesting persons.
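The counting and timed relaxation of S803 to S805 might be sketched as follows; representing the reset of k as a sliding time window is an assumption made for illustration:

```python
def group_threshold(auth_times, now, base_threshold, Th9, window, delta):
    """Sketch of S803-S805: count the members of the group authenticated
    within the last `window` time units (so that k is implicitly reset
    once the predetermined time elapses); if k >= Th9, return the
    authentication threshold for 6-1 lowered by a predetermined `delta`."""
    k = sum(1 for t in auth_times if now - t <= window)   # S803, with reset
    if k >= Th9:                                          # S804
        return base_threshold - delta                     # S805
    return base_threshold
```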
In the above example, the result of the previous authentication of the persons p1, p2, and p3 of the same group 1 is utilized to presumptively authenticate person px5 as being person p4. If a person were authenticated as being person p4 while the authentication threshold value were simply lowered at all times, the probability of erroneously authenticating a person who is not actually person p4 would be increased. However, authentication of the person who belongs to group 1 and who is yet to be authenticated is made easier only for a temporally and spatially close person who is authenticated immediately after the previous authentication of a plurality of persons of group 1. In this way, the number of collations performed while the authentication threshold value is lowered can be minimized, whereby the probability of erroneous authentication of others can be decreased.
It is also possible to utilize a plurality of different items of the first feature information 6-1, and to perform multimodal authentication utilizing the co-occurrence relationship of similarities by the collation of the respective items of the first feature information 6-1. For example, two different items of the first feature information 6-1 are respectively the first feature information 6-1-1 and the first feature information 6-1-2. Herein, the first feature information 6-1-1 is a feature that has low identification capacity but that can be robust with respect to posture variations and the like, and be extracted at a distance. On the other hand, the first feature information 6-1-2 is a feature that provides high identification capacity as long as it can be extracted in a correct posture and in a stationary state.
By utilizing the co-occurrence relationship such that a plurality of similarities obtained by the collation of the first feature information 6-1-1 of the plurality of persons belonging to the same group with the plurality of authentication-requesting persons is simultaneously increased, the group to which the authentication-requesting persons belong can be identified or estimated. If the similarity calculated by collation of the first feature information 6-1-1 registered in the registration database 8 with the authentication-requesting person is higher than a preset threshold value, the authentication-requesting person can be authenticated and the group of the authenticated person can be identified. When the individual is authenticated and the group can be identified, the individual can pass the authentication gate.
On the other hand, with respect to an authentication-requesting person who is temporally and spatially close to a person who has been individually authenticated and whose group has been identified, the authentication-requesting person is not authenticated as an individual if the similarity calculated by collation with the first feature information 6-1-1 is slightly lower than the threshold value. However, the group to which the authentication-requesting person belongs can be estimated. With respect to the person who cannot be individually authenticated even by utilizing the co-occurrence relationship of high similarities by the collation of the first feature information 6-1-1, the result of estimation of the group and the first feature information 6-1-2 having higher identification performance than the first feature information 6-1-1 are used in combination. In this way, authentication accuracy can be increased.
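The two-stage combined use of the robust feature 6-1-1 and the higher-performance feature 6-1-2 described above can be sketched as follows; the near-miss margin and all names are illustrative assumptions:

```python
def two_stage_authentication(px, group_members, features_1_1, features_1_2,
                             collate, Th_coarse, Th_fine, margin):
    """First collate the robust but weakly identifying feature 6-1-1;
    on a clear pass, authenticate.  On a near miss (within `margin` of
    the threshold, i.e. the group is merely estimated), fall back to
    the strongly identifying feature 6-1-2, restricted to the group."""
    for j in group_members:
        s1 = collate(features_1_1[j], px)
        if s1 > Th_coarse:
            return j                        # authenticated by 6-1-1 alone
        if s1 > Th_coarse - margin:         # group estimated, not yet authenticated
            if collate(features_1_2[j], px) > Th_fine:
                return j                    # authenticated by 6-1-2
    return None
```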
In another example, the co-occurrence relationship of similarities as a result of collation by different features may be utilized for authentication, the relationship being such that high similarity is obtained for a certain person of a plurality of authentication-requesting persons belonging to the same group by collation of the first feature information 6-1-1, while high similarity is obtained for the other persons by collation of the first feature information 6-1-2.
When cloud type biometric authentication via the network 7 as illustrated in
The authentication processing unit 13 is further provided with an ID generation unit that generates an ID from biometric modality information. For generating the ID, the authentication processing unit 13 is provided with a database 30 illustrated in
In the present example, it is assumed that, with respect to the finger blood vessel image that is captured, the influence of finger posture variations or lighting variations on a blood vessel pattern is normalized, and that the same blood vessel pattern region is cut out at all times. Namely, an ID is produced from the blood vessel pattern in a state such that the influence of finger posture variations and positional or lighting variations can be disregarded.
First, the finger blood vessel image of the authentication-requesting person is acquired by the measurement device 12. Thereafter, the ID generation unit divides the finger blood vessel image for producing an ID into a plurality (n) of blocks, as illustrated in
The ID generation unit, as illustrated in
The ID generation unit generates an IDi by linking the generated ID(ij). The generated IDi of the block i is as follows.
IDi1|IDi2| . . . |IDim
where the symbol “|” means linking of the codes. For example, the IDij shown in
The ID generation unit generates a final unique ID by linking the IDi. The unique ID for one finger is as follows.
ID1|ID2| . . . |IDn
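The block-wise encoding and linking can be sketched as follows. The encoding function that turns one block and one reference pattern into a code ID(ij) is left as a placeholder, since it is not fixed here:

```python
def generate_unique_id(blocks, reference_patterns, encode):
    """For each block i of the finger blood vessel image, generate the
    codes ID(ij) against the m reference patterns, link them with '|'
    into IDi = IDi1|...|IDim, and link all IDi into the final unique
    ID = ID1|ID2|...|IDn."""
    block_ids = []
    for block in blocks:                                        # block i
        codes = [encode(block, ref) for ref in reference_patterns]  # ID(ij)
        block_ids.append("|".join(codes))                       # IDi
    return "|".join(block_ids)                                  # unique ID
```

For instance, with two blocks and two reference patterns, the result links the codes of block 1 against both patterns, followed by the codes of block 2.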
The registration database 8 on the cloud in the present embodiment is managed with the above unique ID. Thus, the authentication processing unit 13 exchanges information with the registration database 8 via the network 7 using the generated unique ID. The finger blood vessel image as personal information is not transmitted over the network 7. Even if the information about the unique ID were to be leaked, the finger blood vessel pattern of the individual would not be leaked. Further, if the unique ID were to be leaked, operation of the system could be continued by simply changing the reference patterns in the database 30 and reissuing the ID, without re-registration of the finger blood vessel pattern.
By utilizing the above-described unique ID, a privacy-protecting type of authentication can be performed on the network server. Although biometric modality information may temporarily remain in the client terminal (i.e., the authentication processing unit 13) connected to the network when the biometric feature is scanned, safety can be ensured by completely erasing the information immediately after the unique ID is generated. Further, the ID generation unit of the authentication processing unit 13 may transmit the unique ID to the network 7 in encrypted form. Encryption of the unique ID ensures that the biometric modality information will not be leaked. Should the unique ID be stolen, it can be changed and thereby prevented from being abused by simply changing the rule for generating the unique ID from the biometric feature.
In the present embodiment, the unique ID is generated by encoding the blood vessel pattern in the finger blood vessel image. The ID may also be generated by encoding a geometric feature in a partial region of the finger blood vessel image, such as brightness gradient, blood vessel direction, the number of blood vessels or the shape thereof.
In the registration database 8 in the network 7, the unique ID is registered in advance, and the unique ID is collated with an input unique ID at the time of authentication to perform individual authentication. The unique ID has no risk of information leakage because the original biometric modality information cannot be extracted from the unique ID even if stolen on the network.
According to the first to the eighth embodiments, a highly accurate authentication system can be provided in a large-scale biometric authentication system.
The present invention is not limited to the foregoing embodiments, and may include various modifications. The embodiments have been described for the purpose of facilitating an understanding of the present invention, and the invention is not limited to embodiments having all of the described configurations. A part of the configuration of one embodiment may be substituted by the configuration of another embodiment, or the configuration of the other embodiment may be incorporated into the configuration of the one embodiment. With respect to a part of the configuration of each embodiment, addition of another configuration, deletion, or substitution may be made.
The various computing units, such as the authentication processing unit 13 and the image input unit 18, may be implemented by software by having a processor interpret and execute a program for realizing the respective functions. The information for realizing the functions, such as programs, tables, and files, may be placed in a storage device such as a memory, a hard disk, or a solid state drive (SSD), or a recording medium such as an IC card, an SD card, or a DVD. The various computing units described above, such as the authentication processing unit 13 and the image input unit 18, may be implemented by hardware by designing a part or all of the units in an integrated circuit, for example.
The control lines and information lines shown in the drawings are those deemed necessary for description purposes, and do not necessarily represent all of the control lines or information lines required in a product. In practice, almost all of the configurations may be considered to be mutually connected.