Individual recognizing apparatus and individual recognizing method

Information

  • Patent Application
  • Publication Number
    20040125991
  • Date Filed
    November 20, 2003
  • Date Published
    July 01, 2004
Abstract
An individual recognizing apparatus includes a data acquisition unit to acquire certifying data from a recognized person, a detection unit to detect feature points of the certifying data acquired by the data acquisition unit, a change calculation unit to calculate changes in the detected positions of the feature points detected by the detection unit, an aptitude judging unit to judge whether the certifying data acquired by the data acquisition unit is appropriate for the preparation of a certifying dictionary based on the changes of the feature points calculated by the change calculation unit, a dictionary preparing unit to prepare a certifying dictionary based on the certifying data acquired by the data acquisition unit when the acquired certifying data is judged appropriate by the aptitude judging unit, a dictionary storing unit to store the certifying dictionary prepared by the dictionary preparing unit, and a certifying unit to certify whether a recognized person is a proper person using the certifying data acquired by the data acquisition unit and the dictionary stored in the dictionary storing unit.
Description


CROSS-REFERENCE

[0001] This application is based upon and claims the benefit of priority from the prior Japanese Patent Application No. 2002-378452, filed on Dec. 26, 2002, the entire contents of which are incorporated herein by reference.



BACKGROUND OF THE INVENTION

[0002] 1. Field of the Invention


[0003] This invention relates to an individual recognizing apparatus and an individual recognizing method that recognize whether a recognized person is the proper person by using biological information of that person, such as a face image, fingerprint, retina, iris or palm pattern, as certifying data.


[0004] 2. Description of the Related Art


[0005] Recently, individual recognizing apparatuses have been developed and are used in managing entry to and exit from buildings and rooms requiring security, certifying whether a recognized person is the proper person by using the face image, fingerprint, retina, iris, palm pattern, etc. of that person.


[0006] Such an individual recognizing apparatus is normally provided with a dictionary storing unit that stores dictionaries for use in individual certification. Individual certification is executed by certifying whether a recognized person is the proper person using the dictionaries stored in this dictionary storing unit and the input certifying data (biological information of the recognized person).


[0007] Certifying dictionaries are stored (hereinafter also called "registered") in the dictionary storing unit, and certifying data containing various kinds of variation should be acquired when a dictionary is stored. For this reason, the recognized person is urged through the user interface to present various kinds of variation.


[0008] Further, for instance, when recognition (collation) of a person fails, the certifying dictionary is updated using data acquired by recognizing the person again.


[0009] As described above, attempts have so far been made to acquire varied certifying data (face images) of a recognized person through guidance given to that person. However, the recognized person may not act as instructed by the guidance.


[0010] Further, it has so far been supposed that if an improper dictionary is registered for recognition (collation), it will be replaced by a proper dictionary through re-registration or updating. However, the recognized person may not always re-register or update the dictionary.



SUMMARY OF THE INVENTION

[0011] It is an object of this invention to provide an individual recognizing apparatus and an individual recognizing method capable of using certifying data suited for recognition when learning and storing a dictionary, thereby preventing the storage of an improper dictionary.


[0012] According to this invention, there is provided an individual recognizing apparatus, comprising a data acquisition unit to acquire certifying data from a recognized person; a detection unit to detect feature points of the certifying data acquired by the data acquisition unit; a change calculation unit to calculate the change of the detecting positions of the feature points detected by the detection unit; an aptitude judging unit to judge whether the certifying data acquired by the data acquisition unit is appropriate for the preparation of a certifying dictionary based on the change in the feature points calculated by the change calculation unit; a dictionary preparing unit to prepare a certifying dictionary based on the certifying data acquired by the data acquisition unit when the certifying data is judged appropriate; a dictionary storing unit to store the certifying dictionary prepared by the dictionary preparing unit; and a certifying unit to certify whether a recognized person is a proper person using the certifying data acquired by the data acquisition unit and the dictionary stored in the dictionary storing unit.


[0013] Further, according to this invention, there is provided an individual recognizing apparatus, comprising a data acquisition unit to acquire certifying data from a recognized person; a dictionary preparing unit to prepare a certifying dictionary by analyzing principal components based on the certifying data acquired by the data acquisition unit; a calculation unit to calculate an eigenvalue contribution rate of the dictionary prepared by the dictionary preparing unit; an aptitude judging unit to judge whether the dictionary prepared by the dictionary preparing unit is appropriate as a certifying dictionary based on the eigenvalue contribution rate calculated by the calculation unit; a dictionary storing unit to store the dictionary prepared by the dictionary preparing unit when the dictionary is judged appropriate by the aptitude judging unit; and a certifying unit to certify whether a recognized person is a proper person using the certifying data acquired by the data acquisition unit and the dictionary stored in the dictionary storing unit.


[0014] Further, according to this invention, there is provided an individual recognizing method, comprising acquiring certifying data from a recognized person; detecting feature points from the acquired certifying data; calculating the change of the detected positions of the feature points; judging whether the acquired certifying data is appropriate for the preparation of a certifying dictionary based on the calculated change of the feature points; preparing a certifying dictionary based on the acquired certifying data when the certifying data is judged appropriate in the judging step; storing the prepared certifying dictionary; and certifying whether a recognized person is a proper person using the acquired certifying data and the stored dictionary.


[0015] Further, according to this invention, there is provided an individual recognizing method, comprising acquiring certifying data from a recognized person; preparing a certifying dictionary by analyzing principal components based on the acquired certifying data; calculating an eigenvalue contribution rate of the prepared dictionary; judging whether the prepared dictionary is appropriate as a certifying dictionary based on the calculated eigenvalue contribution rate; storing the prepared dictionary when the prepared dictionary is judged appropriate in the judging step; and certifying whether a recognized person is a proper person using the acquired certifying data and the stored dictionary.







BRIEF DESCRIPTION OF THE DRAWINGS

[0016] FIG. 1 is a block diagram schematically showing the construction of an individual recognizing apparatus involved in a first embodiment of this invention;

[0017] FIG. 2 is a flowchart for explaining the flow of the dictionary registration process of an individual recognizing apparatus involved in the first embodiment;

[0018] FIG. 3 is a block diagram schematically showing the construction of an individual recognizing apparatus involved in a second embodiment;

[0019] FIG. 4 is a flowchart for explaining the flow of the dictionary registration process of an individual recognizing apparatus in the second embodiment;

[0020] FIG. 5A and FIG. 5B are diagrams for explaining the state of detecting a change in the UP_DOWN angle in the individual recognizing apparatus in the second embodiment;

[0021] FIG. 6A and FIG. 6B are diagrams for explaining failed examples of detecting changes in the UP_DOWN angle in the individual recognizing apparatus in the second embodiment;

[0022] FIG. 7 is a top view of the head of a person for explaining the positional relation of feature points in the individual recognizing apparatus in the second embodiment;

[0023] FIG. 8A and FIG. 8B are diagrams for explaining changes in feature points resulting from rotation in the individual recognizing apparatus in the second embodiment;

[0024] FIG. 9A and FIG. 9B are diagrams for explaining the state of detecting changes in the LEFT_RIGHT angle in the individual recognizing apparatus in the second embodiment;

[0025] FIG. 10 is a block diagram schematically showing the construction of the individual recognizing apparatus involved in a third embodiment;

[0026] FIG. 11 is a flowchart for explaining the flow of the dictionary registration process of the individual recognizing apparatus in the third embodiment; and

[0027] FIG. 12A to FIG. 12C are graphs for explaining the eigenvalue contribution rate in the individual recognizing apparatus in the third embodiment.







DETAILED DESCRIPTION OF THE INVENTION

[0028] Preferred embodiments of this invention will be described below referring to the attached drawings.


[0029] The first embodiment of this invention will be first explained referring to the drawings.


[0030] FIG. 1 schematically shows the construction of the individual recognizing apparatus involved in the first embodiment.


[0031] This individual recognizing apparatus comprises a data acquisition unit 101, a detection unit 102, a change calculation unit 103, an aptitude judging unit 104, a dictionary preparing unit 105, a dictionary storing unit 106, and a certifying unit 107.


[0032] The data acquisition unit 101 acquires biological information and other certifying data from a recognized person 100.


[0033] The detection unit 102 detects feature points from the certifying data acquired by the data acquisition unit 101.


[0034] The change calculation unit 103 calculates changes in the detected positions of the feature points detected by the detection unit 102.


[0035] The aptitude judging unit 104 judges whether the certifying data acquired by the data acquisition unit 101 is appropriate for preparing a certifying dictionary based on the change of feature point calculated by the change calculation unit 103.


[0036] The dictionary preparing unit 105 prepares a certifying dictionary based on the certifying data acquired by the data acquisition unit 101 when that certifying data is judged appropriate by the aptitude judging unit 104.


[0037] The dictionary storing unit 106 stores a certifying dictionary prepared by the dictionary preparing unit 105.


[0038] The certifying unit 107 certifies whether a recognized person is the proper person using the certifying data acquired by the data acquisition unit 101 and the dictionary stored in the dictionary storing unit 106.


[0039] Hereinafter, the flow of the dictionary registration process in the first embodiment will be explained referring to the flowchart shown in FIG. 2.


[0040] First, certifying data D is acquired from a recognized person 100 by the data acquisition unit 101 (Step S101). This certifying data D is, for instance, face image data in face certification, fingerprint data in fingerprint certification, voiceprint data in voiceprint certification, and signature data in signature certification.


[0041] Then, feature points are detected from the certifying data D by the detection unit 102 (Step S102). The feature points referred to here are, for instance, such regions as the eyes, brows, nose, lips, edges and wrinkles in face image data. In fingerprint data, the feature points are minutiae (ridge endings, bifurcations, etc.). In signature data, stroke endings and pen jumps serve as feature points. To detect these feature points from a face image, for instance, the detecting technique described in the publicly known literature [1] ("Extraction of Face Characteristics by Combination of Shape Extraction and Pattern Collation", Kazuhiro FUKUI and Osamu YAMAGUCHI, The Institute of Electronics, Information and Communication Engineers Transactions D-II, Vol. J80-D-II, No. 8, pp. 2170-2177, August 1997) is applicable.
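For illustration only, the sketch below (Python with OpenCV, neither of which the patent specifies) finds rough eye-center feature points in a face image. It is a publicly available stand-in for the cited technique of literature [1], not that method itself; the cascade file names are OpenCV's bundled ones.

```python
import cv2

# Stand-in for Step S102: detect a face, then eye centers inside it.
# These Haar cascades ship with OpenCV; they are not the method of [1].
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_feature_points(gray_image):
    """Return approximate (x, y) eye centers found in the largest face."""
    faces = face_cascade.detectMultiScale(gray_image, 1.1, 5)
    if len(faces) == 0:
        return []
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
    eyes = eye_cascade.detectMultiScale(gray_image[y:y + h, x:x + w], 1.1, 5)
    return [(x + ex + ew // 2, y + ey + eh // 2) for (ex, ey, ew, eh) in eyes]
```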


[0042] Further, in this example, the certifying data D is face image data, and the eyes and the nose are detected as the feature points.


[0043] The processes in Steps S101 and S102 are repeated until a sufficient amount of data is obtained (Step S103). A sufficient amount of data here is about 50 to 100 frames of face image data, assuming that 5 frames of face images are obtained per second.


[0044] When a sufficient volume of data has been obtained, the change calculation unit 103 calculates the changes in the positions of the feature points (eyes and nose) detected in Step S102 (Step S104). For instance, suppose that $a$ feature points $f_{j,i}$ (j = 1, 2, ..., a) were detected from each of $n$ pieces of data $D_i$ (i = 1, 2, ..., n). If the average position of the j-th feature point is $f_{j,\mathrm{center}}$, the positional change "Change" can be calculated as shown by, for instance, the following Mathematical Expression 1.
$$\mathrm{Change} = \sum_{j=1}^{a} \mathrm{Change}(j), \qquad \mathrm{Change}(j) = \sum_{i=1}^{n} \bigl| f_{j,i} - f_{j,\mathrm{center}} \bigr|^2 \qquad \text{[Mathematical Expression 1]}$$


[0045] Further, the change may be calculated according to the following Mathematical Expression 2, regarding it not as the deviation from the mean position but as the total amount of movement from the preceding feature point position:
$$\mathrm{Change}(j) = \sum_{i=1}^{n-1} \bigl| f_{j,i} - f_{j,i+1} \bigr| \qquad \text{[Mathematical Expression 2]}$$


[0046] Alternatively, the change may be calculated as shown by the following Mathematical Expression 3, based on the distances between the detected positions of each feature point rather than on each position itself.
$$\mathrm{Change} = \sum_{j=1}^{a} \sum_{i=1}^{n} \mathrm{Change}(j,i), \qquad \mathrm{Change}(j,i) = \sum_{\substack{k=1 \\ k \neq i}}^{n} \bigl| f_{j,i} - f_{j,k} \bigr|^2 \qquad \text{[Mathematical Expression 3]}$$


[0047] Then, the aptitude judging unit 104 judges whether the certifying data acquired by the data acquisition unit 101 is appropriate for the preparation of a certifying dictionary (Step S105). For instance, when the positional change is above a prescribed threshold value TH101 or below a prescribed threshold value TH102, the change is judged too large (above TH101) or too small (below TH102), the data is judged inappropriate for dictionary registration, and the registration is made over or the recognized person 100 is warned accordingly and asked to select whether to make the registration anyway (Step S106).
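As a concrete illustration of Steps S104 and S105, the following is a minimal sketch in Python with NumPy (the patent names no language or library). The functions follow Mathematical Expressions 1 and 2, and the numeric thresholds are placeholders standing in for TH101 and TH102.

```python
import numpy as np

def positional_change(points):
    """Mathematical Expression 1: points has shape (n, a, 2) -- n frames,
    a feature points, (x, y). Sums squared deviations of each feature
    point from its mean position f_{j,center}."""
    center = points.mean(axis=0)                       # shape (a, 2)
    return float(np.sum(np.linalg.norm(points - center, axis=2) ** 2))

def movement_change(points):
    """Mathematical Expression 2: total movement between consecutive frames."""
    return float(np.sum(np.linalg.norm(points[1:] - points[:-1], axis=2)))

def is_appropriate(points, th101=400.0, th102=10.0):
    """Step S105: reject data whose change is too large or too small.
    The numeric threshold values are illustrative placeholders only."""
    change = positional_change(points)
    return th102 <= change <= th101
```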


[0048] When the acquired certifying data is judged appropriate for the preparation of a certifying dictionary in Step S105, the dictionary preparing unit 105 prepares a certifying dictionary based on the certifying data acquired by the data acquisition unit 101 (Step S107). Then, the dictionary storing unit 106 stores the prepared dictionary (Step S108).


[0049] When learning a dictionary, it is necessary to learn from data having a certain diversity. Therefore, when the motion of the face is detected from the eye and nose detection positions as described above, data in which no sufficient motion was observed or, conversely, in which excessive motion was observed is excluded from the learning data. This makes it possible to prevent inappropriate dictionary learning from data having insufficient change or excessively large change.


[0050] Inappropriate dictionary learning here means learning from data that has too little variation or whose variation lies excessively far beyond the threshold values.


[0051] Next, the second embodiment will be explained.


[0052] FIG. 3 schematically shows the construction of the individual recognizing apparatus in the second embodiment. This individual recognizing apparatus comprises a data acquisition unit 201, a detection unit 202, an angle change calculation unit 203, an aptitude judging unit 204, a dictionary preparing unit 205, a dictionary storing unit 206, and a certifying unit 207.


[0053] The data acquisition unit 201 acquires biological information and other certifying data from a recognized person.


[0054] The detection unit 202 detects feature points from the certifying data acquired by the data acquisition unit 201.


[0055] The angle change calculation unit 203 calculates at least one of the up_down and left_right angle changes of the feature points detected by the detection unit 202; in this example, both the up_down and the left_right angle changes are calculated.


[0056] The aptitude judging unit 204 judges whether the certifying data acquired by the data acquisition unit 201 is appropriate for the preparation of a certifying dictionary based on the angle changes calculated by the angle change calculation unit 203.


[0057] The dictionary preparing unit 205 prepares a certifying dictionary based on the certifying data acquired by the data acquisition unit 201 when the data is judged appropriate by the aptitude judging unit 204.


[0058] The dictionary storing unit 206 stores a certifying dictionary prepared by the dictionary preparing unit 205.


[0059] The certifying unit 207 certifies whether a recognized person 100 is the proper person using the certifying data acquired by the data acquisition unit 201 and dictionaries stored in the dictionary storing unit 206.


[0060] Hereinafter, the flow of the dictionary registration process involved in the second embodiment will be explained referring to the flowchart shown in FIG. 4.


[0061] First, the process of acquiring certifying data D from a recognized person 100 by the data acquisition unit 201 (Step S201) and the process of detecting feature points by the detection unit 202 (Step S202) are repeated until a sufficient volume of data is obtained (Step S203), in the same manner as Steps S101 to S103 in the first embodiment described above.


[0062] Suppose that, as in the first embodiment, $a$ feature points $f_{j,i}$ (j = 1, 2, ..., a) are detected from $n$ pieces of data $D_i$ (i = 1, 2, ..., n). The angle change calculation unit 203 first calculates the up_down angle change as described below (Step S204). Taking face image data as the certifying data D, the left eye coordinates (X_left_eye, Y_left_eye), the right eye coordinates (X_right_eye, Y_right_eye), the left naris coordinates (X_left_nose, Y_left_nose) and the right naris coordinates (X_right_nose, Y_right_nose) are used as the positional information of the feature points. Even when the feature points are, for instance, the eye areas, mouth or brows, the process is the same.


[0063] When the central coordinates of both eyes are (X_center_eye, Y_center_eye) and the central coordinates of both nares are (X_center_nose, Y_center_nose), the index UP_DOWN indicating the up_down angle change is obtained by Mathematical Expression 4 shown below (see FIGS. 5A and 5B).

$$\mathrm{UP\_DOWN} = \frac{\text{eye-to-nose distance}}{\text{distance between both eyes}} = \frac{\sqrt{(X_{center\_eye} - X_{center\_nose})^2 + (Y_{center\_eye} - Y_{center\_nose})^2}}{\sqrt{(X_{left\_eye} - X_{right\_eye})^2 + (Y_{left\_eye} - Y_{right\_eye})^2}} \qquad \text{[Mathematical Expression 4]}$$


[0064] As shown in FIG. 5A, when the face is turned to the front, the eye-to-nose distance becomes large and the index becomes large. As shown in FIG. 5B, when the face is turned upward, the eye-to-nose distance becomes small and the index becomes small. In this example, the eye-to-nose distance is normalized by dividing it by the distance between both eyes, so that a change in the apparent eye-to-nose distance caused by the person simply approaching or receding from the camera is not judged erroneously as an up_down angle change.
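A minimal sketch of the UP_DOWN index of Mathematical Expression 4 (Python; the eye and naris coordinates are assumed to come from the feature point detection of Step S202):

```python
import math

def up_down_index(left_eye, right_eye, left_naris, right_naris):
    """Mathematical Expression 4: eye-to-nose distance divided by the
    distance between both eyes. Each argument is an (x, y) tuple."""
    cx_eye = (left_eye[0] + right_eye[0]) / 2.0
    cy_eye = (left_eye[1] + right_eye[1]) / 2.0
    cx_nose = (left_naris[0] + right_naris[0]) / 2.0
    cy_nose = (left_naris[1] + right_naris[1]) / 2.0
    eye_to_nose = math.hypot(cx_eye - cx_nose, cy_eye - cy_nose)
    eye_distance = math.hypot(left_eye[0] - right_eye[0],
                              left_eye[1] - right_eye[1])
    return eye_to_nose / eye_distance
```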


[0065] However, with this technique, when only the left_right angle changes, as shown in FIG. 6B, relative to the face facing the front as shown in FIG. 6A, the resulting change in the index UP_DOWN may be erroneously judged as an up_down angle change. In this case, therefore, a correction may be made as described below.


[0066] If the eyes 701 and 702 and the nose 703 (the nares 704 and 705) are positioned as shown by the model in FIG. 7, the positions of the feature points in the direction x are expressed as shown in FIG. 8A. Further, when the face is turned from the front by an angle R, they are expressed as shown in FIG. 8B. Accordingly, the distance between both eyes $L_{eye,0}$ in the direction x at the front is expressed by the following expression:




$$L_{eye,0} = 2 r_1 \sin \theta_1 \qquad \text{[Mathematical Expression 5]}$$



[0067] and the distance between both eyes $L_{eye,R}$ in the direction x when the face is turned by R is expressed by the following expression:




$$L_{eye,R} = r_1 \sin(\theta_1 + R) + r_1 \sin(\theta_1 - R) \qquad \text{[Mathematical Expression 6]}$$



[0068] and the shift $L_{shift,R}$ between the central point of the eyes and the central point of the nares in the direction x when the face is turned by R is expressed by the following expression:
$$L_{shift,R} = (\text{naris center in direction } x) - (\text{eye center in direction } x) = \frac{r_2 \sin(\theta_2 + R) - r_2 \sin(\theta_2 - R)}{2} - \frac{r_1 \sin(\theta_1 + R) - r_1 \sin(\theta_1 - R)}{2} \qquad \text{[Mathematical Expression 7]}$$


[0069] Further, r1, r2, R, θ1 and θ2 are defined as shown below (see FIGS. 8A and 8B).


[0070] r1: A distance from the central point of the head to the left eye 702;


[0071] r2: A distance from the central point of the head to the right naris 704;


[0072] R: The angle by which the face is turned from the front to the left or right;


[0073] θ1: The angle between the line from the central point of the head toward the front and the line from the central point of the head to the left eye 702; and


[0074] θ2: The angle between the line from the central point of the head toward the front and the line from the central point of the head to the left naris 705.


[0075] On the other hand, an apparent distance on an image changes according to the distance between the face of the recognized person 100 and the camera used as the data acquisition unit 201, and the apparent distance is generally proportional to the actual distance. In other words, the apparent distances on the image corresponding to Mathematical Expressions 5, 6 and 7 become as shown below, respectively:




$$L'_{eye,0} = a \cdot L_{eye,0} \qquad \text{[Mathematical Expression 8]}$$

$$L'_{eye,R} = a \cdot L_{eye,R} \qquad \text{[Mathematical Expression 9]}$$

$$L'_{shift,R} = a \cdot L_{shift,R} \quad (a \text{ is a proportionality constant}) \qquad \text{[Mathematical Expression 10]}$$



[0076] Accordingly, from Mathematical Expressions 9 and 10,
$$a = \frac{L'_{eye,R}}{L_{eye,R}} = \frac{L'_{shift,R}}{L_{shift,R}} \qquad \text{[Mathematical Expression 11]}$$

$$\frac{L'_{eye,R}}{L'_{shift,R}} = \frac{L_{eye,R}}{L_{shift,R}} \qquad \text{[Mathematical Expression 12]}$$


[0077] are obtained. The left side of Mathematical Expression 12 is obtained from the observed values on the image, and the parameter R is determined by substituting Mathematical Expressions 6 and 7 into the right side. Then, $L_{eye,R}$ is obtained from Mathematical Expression 6, and $a$ is obtained by substituting it into Mathematical Expression 11. Finally, substituting $a$ into Mathematical Expression 8 yields $L'_{eye,0}$, the estimated on-image distance between both eyes in the direction x at the front, which serves as a corrected value of the distance between both eyes in Mathematical Expression 4.
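As an illustration of this correction, the sketch below solves Mathematical Expression 12 for R numerically by bisection (a solver choice not given in the patent) and then applies Mathematical Expressions 11, 5 and 8. The head-model parameters r1, r2, θ1, θ2 are assumed known, and the face is assumed to be actually turned so that the observed shift is nonzero.

```python
import math

def corrected_eye_distance(L_eye_obs, L_shift_obs, r1, r2, th1, th2):
    """Estimate L'_{eye,0}, the corrected on-image eye distance at the front.
    L_eye_obs, L_shift_obs: observed on-image eye distance and eye/naris
    center shift in the x direction (L_shift_obs must be nonzero)."""
    def L_eye(R):    # Mathematical Expression 6
        return r1 * (math.sin(th1 + R) + math.sin(th1 - R))
    def L_shift(R):  # Mathematical Expression 7
        return (r2 * (math.sin(th2 + R) - math.sin(th2 - R))
                - r1 * (math.sin(th1 + R) - math.sin(th1 - R))) / 2.0

    target = L_eye_obs / L_shift_obs         # left side of Expression 12
    lo, hi = 1e-6, math.pi / 2 - 1e-6        # search R in (0, pi/2)
    for _ in range(60):                      # bisection; the ratio decreases in R
        R = (lo + hi) / 2.0
        if L_eye(R) / L_shift(R) > target:
            lo = R
        else:
            hi = R
    a = L_eye_obs / L_eye(R)                 # Mathematical Expression 11
    return a * 2.0 * r1 * math.sin(th1)      # Expressions 5 and 8
```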


[0078] Thus, the up_down angle change index UP_DOWN of Mathematical Expression 4 can be calculated without being disturbed by a change in the distance between both eyes such as shown in FIG. 6B. Further, as the model shown in FIG. 7, either a model common to all persons or a model tailored to each type of face structure of the recognized person 100 may be used.


[0079] Then, the angle change calculation unit 203 calculates the index LEFT_RIGHT that shows the left_right angle change according to the following Mathematical Expression 13 (Step S205; see FIGS. 9A and 9B).

$$\mathrm{LEFT\_RIGHT} = \frac{\text{shift between the central point of both eyes and the central point of both nares}}{\text{distance between both eyes}} = \frac{-(X_{center\_eye} - X_{center\_nose})}{\sqrt{(X_{center\_eye} - X_{center\_nose})^2 + (Y_{center\_eye} - Y_{center\_nose})^2}} \qquad \text{[Mathematical Expression 13]}$$


[0080] Here, the corrected both-eyes distance obtained in Step S204 may be used as the distance between both eyes.
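A sketch of the LEFT_RIGHT index following the coordinate form of Mathematical Expression 13 as reconstructed above; since the printed formula is ambiguous about the normalizing distance, the corrected both-eyes distance from Step S204 could be substituted for the denominator instead.

```python
import math

def left_right_index(left_eye, right_eye, left_naris, right_naris):
    """Mathematical Expression 13 (as reconstructed): signed horizontal
    shift between the eye center and the naris center, normalized."""
    cx_eye = (left_eye[0] + right_eye[0]) / 2.0
    cy_eye = (left_eye[1] + right_eye[1]) / 2.0
    cx_nose = (left_naris[0] + right_naris[0]) / 2.0
    cy_nose = (left_naris[1] + right_naris[1]) / 2.0
    return -(cx_eye - cx_nose) / math.hypot(cx_eye - cx_nose,
                                            cy_eye - cy_nose)
```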


[0081] Based on the results obtained above, the aptitude judging unit 204 judges whether the certifying data acquired by the data acquisition unit 201 has sufficient angle changes (Steps S206 and S208). When, for instance, the up_down angle change index UP_DOWN is above the prescribed threshold value TH201, the change is judged too large, and the dictionary registration is made over or the recognized person 100 is warned accordingly and asked to select whether to make the registration anyway (Step S207).


[0082] Likewise, when the up_down angle change index UP_DOWN is below the prescribed threshold value TH202, the up_down angle change is judged too small; when the left_right angle change index LEFT_RIGHT is above the prescribed threshold value TH203, the left_right angle change is judged too large; and when the index LEFT_RIGHT is below the prescribed threshold value TH204, the left_right angle change is judged too small. In each of these cases, the process shown in Step S207 is executed.
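One plausible reading of the judgments in Steps S206 and S208 is sketched below: the spread of each index over the acquired frames is compared with the four thresholds. The numeric values are placeholders, not values from the patent.

```python
def angle_changes_appropriate(up_down_values, left_right_values,
                              th201=0.9, th202=0.2, th203=0.5, th204=0.05):
    """Steps S206 and S208: accept only when both angle changes are neither
    too large nor too small. The TH201-TH204 stand-ins are illustrative."""
    ud_change = max(up_down_values) - min(up_down_values)
    lr_change = max(left_right_values) - min(left_right_values)
    if ud_change > th201 or ud_change < th202:
        return False  # up_down angle change too large / too small
    if lr_change > th203 or lr_change < th204:
        return False  # left_right angle change too large / too small
    return True
```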


[0083] When the certifying data is not judged inappropriate in Steps S206 and S208 (that is, when the acquired certifying data is judged to have sufficient angle changes), the dictionary preparing unit 205 prepares a certifying dictionary based on the certifying data acquired by the data acquisition unit 201 (Step S209). Then, the prepared dictionary is stored in the dictionary storing unit 206 (Step S210).


[0084] When learning a dictionary, it is necessary to learn from data having a certain variety. Therefore, the up_down and left_right directions of the face are judged from the positions of the eyes and the nose as described above, and data whose directional change is too small or, conversely, too large is excluded from the learning data. This makes it possible to prevent inappropriate dictionary learning from data without sufficient angle changes or with excessively large angle changes.


[0085] Next, the third embodiment will be explained.


[0086] FIG. 10 is a diagram schematically showing the construction of the individual recognizing apparatus involved in the third embodiment. This individual recognizing apparatus comprises a data acquisition unit 301, a dictionary preparing unit 302, an eigenvalue calculation unit 303, an eigenvalue contribution rate calculation unit 304, an aptitude judging unit 305, a dictionary storing unit 306, and a certifying unit 307.


[0087] The data acquisition unit 301 acquires certifying data including biological information from a recognized person 100.


[0088] The dictionary preparing unit 302 prepares a certifying dictionary by executing a principal component analysis based on the certifying data acquired by the data acquisition unit 301.


[0089] The eigenvalue calculation unit 303 calculates the eigenvalues in the principal component analysis performed by the dictionary preparing unit 302.


[0090] The eigenvalue contribution rate calculation unit 304 calculates an eigenvalue contribution rate of the dictionary prepared by the dictionary preparing unit 302.


[0091] The aptitude judging unit 305 judges whether the dictionary prepared by the dictionary preparing unit 302 is appropriate as a certifying dictionary based on the eigenvalue contribution rate calculated by the eigenvalue contribution rate calculation unit 304.


[0092] The dictionary storing unit 306 stores the dictionary prepared by the dictionary preparing unit 302 when it is judged appropriate by the aptitude judging unit 305.


[0093] The certifying unit 307 certifies whether a recognized person 100 is the person himself (or herself) using the certifying data acquired by the data acquisition unit 301 and the dictionary stored in the dictionary storing unit 306.


[0094] The flow of the dictionary registration process in the third embodiment will be explained below referring to the flowchart shown in FIG. 11.


[0095] First, the process of acquiring certifying data D from a recognized person 100 by the data acquisition unit 301 (Step S301) is repeated until a sufficient volume of data is acquired (Step S302). When a sufficient volume of data has been acquired, the dictionary preparing unit 302 and the eigenvalue calculation unit 303 prepare a certifying dictionary by analyzing principal components based on the acquired certifying data.


[0096] Specifically, the processes shown below are executed. For details, refer to the publicly known literature [2] ("Recognition of Face by Computer-Survey", Shigeru AKAMATSU, The Institute of Electronics, Information and Communication Engineers Transactions D-II, Vol. J80-D-II, No. 8, pp. 2031-2046, 1997).


[0097] Let M patterns expressed as N-dimensional vectors be $x_i$ (i = 1, 2, ..., M) and let their mean vector be $\mu$. The variance-covariance matrix S is then expressed by
$$S = \frac{1}{M} \sum_{i=1}^{M} (x_i - \mu)(x_i - \mu)^{T} \qquad \text{[Mathematical Expression 14]}$$


[0098] So, by solving the eigenvalue equation shown below,







$$S \Phi_i = \lambda_i \Phi_i \quad (\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_N), \qquad \Phi_i^{T} \Phi_i = 1 \qquad \text{[Mathematical Expression 15]}$$


[0099] N N-dimensional eigenvectors $\Phi_i$ (i = 1, 2, ..., N) and the corresponding N eigenvalues $\lambda_i$ (i = 1, 2, ..., N; $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_N$) are obtained. For solving the eigenvalue equation, a numerical analysis library such as the one mentioned in the above literature [1] is usable. Further, instead of the variance-covariance matrix, the cross-correlation matrix R shown below is usable:
$$R = \frac{1}{M} \sum_{i=1}^{M} x_i x_i^{T} \qquad \text{[Mathematical Expression 16]}$$
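For illustration, a minimal sketch of this principal component analysis in Python with NumPy (an assumed library choice; the patent only refers to a numerical library):

```python
import numpy as np

def principal_components(X):
    """X: shape (M, N) -- M patterns as N-dimensional row vectors.
    Returns eigenvalues (sorted descending) and the matching eigenvectors
    of the variance-covariance matrix S (Mathematical Expressions 14, 15)."""
    D = X - X.mean(axis=0)                 # subtract the mean vector mu
    S = (D.T @ D) / X.shape[0]             # Mathematical Expression 14
    eigvals, eigvecs = np.linalg.eigh(S)   # S is symmetric
    order = np.argsort(eigvals)[::-1]      # so that lambda_1 >= lambda_2 >= ...
    return eigvals[order], eigvecs[:, order]
```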


[0100] Then, the eigenvalue contribution rate calculation unit 304 calculates the eigenvalue contribution rate according to the following Mathematical Expression 17 (for details, refer to "Image Analysis Handbook", Takagi and Shimoda, p. 43, University of Tokyo Press, 1991). The symbol T in the above expressions represents the transpose.
$$\text{Eigenvalue contribution rate of the m-th eigenvalue: } c_m = \frac{\lambda_m}{\sum_{i=1}^{N} \lambda_i} \qquad \text{[Mathematical Expression 17]}$$


[0101] The eigenvalue contribution rate normally shows a distribution in which the higher-order eigenvalues occupy a large part of the total (for details, refer to the publicly known literature [4], "Application of the Karhunen-Loeve Procedure for the Characterization of Human Faces", M. Kirby and L. Sirovich, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 12, No. 1, pp. 103-108, January 1990).

[0102] This means that a large part of the distribution of the learning data can be expressed by the higher-order eigenvectors.


[0103] On the other hand, when the variety in the learning data is small (for instance, when the face is kept almost stationary), almost all of the eigenvalue contribution rate is occupied by only a very small number of higher-order eigenvalues. On the contrary, when the variety of the learning data is too large (for instance, when the face is moved excessively or the detected face position is shifted), a large eigenvalue contribution rate remains in the lower-order eigenvalues, as shown in FIG. 12C.


[0104] Accordingly, when the eigenvalue contribution rate of the m-th eigenvalue, or the accumulated eigenvalue contribution rate up to the m-th eigenvalue shown in

$$C_m = \sum_{j=1}^{m} c_j = \frac{\sum_{j=1}^{m} \lambda_j}{\sum_{i=1}^{N} \lambda_i} \qquad \text{[Mathematical Expression 18]}$$


[0105] is above the prescribed threshold value TH301, the aptitude judging unit 305 judges that the variety of the learning data is inappropriate and that the data is unsuitable for a certifying (collating) dictionary (Step S304), and the dictionary registration is made over or the recognized person 100 is warned accordingly and asked to select whether to make the registration anyway (Step S305). When the dictionary is judged appropriate in Step S304, the dictionary storing unit 306 stores that dictionary (Step S306).
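A sketch of this judgment under one reading: the accumulated contribution rate of Mathematical Expression 18 is computed from the eigenvalues, and the data is rejected when it exceeds TH301 (contribution concentrated in the top eigenvalues, i.e. too little variety). The choice of m and the threshold value are illustrative assumptions; a complementary lower bound could catch excessive variety.

```python
import numpy as np

def contribution_rates(eigvals):
    """Mathematical Expressions 17 and 18: per-eigenvalue rates c_m and
    cumulative rates C_m for eigenvalues sorted in descending order."""
    c = eigvals / eigvals.sum()
    return c, np.cumsum(c)

def dictionary_appropriate(eigvals, m=5, th301=0.98):
    """Step S304 sketch: reject when the accumulated contribution of the
    top m eigenvalues exceeds the stand-in threshold TH301."""
    _, C = contribution_rates(np.sort(eigvals)[::-1])
    return C[m - 1] <= th301
```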


[0106] As described above, when the change in the face images is too small, the higher-order eigenvalue contribution rates of a dictionary become large, and on the contrary, when the change is too large, they become small. Therefore, by judging the change of the face in this way, it is possible to prevent inappropriate dictionary learning from data having insufficient change or excessively large change.


[0107] Further, although the processes at the time of dictionary registration are explained in the above-mentioned embodiments, similar processes can be applied to the certification (collation) performed after the dictionary registration.


[0108] According to this invention as described in detail above, an individual recognizing apparatus and an individual recognizing method can be provided that use certifying data suited for certification when learning a dictionary at registration and that prevent inappropriate dictionary registration.


Claims
  • 1. An individual recognizing apparatus comprising: a data acquisition unit to acquire certifying data from a recognized person; a detection unit to detect feature points of the certifying data acquired by the data acquisition unit; a change calculation unit to calculate the change of the detecting positions of the feature points detected by the detection unit; an aptitude judging unit to judge whether the certifying data acquired by the data acquisition unit is appropriate for the preparation of a certifying dictionary based on the change in the feature points calculated by the change calculation unit; a dictionary preparing unit to prepare a certifying dictionary based on the certifying data acquired by the data acquisition unit when the certifying data is judged appropriate; a dictionary storing unit to store the certifying dictionary prepared by the dictionary preparing unit; and a certifying unit to certify whether a recognized person is a proper person using the certifying data acquired by the data acquisition unit and the dictionary stored in the dictionary storing unit.
  • 2. The individual recognizing apparatus according to claim 1, wherein the change calculation unit includes a unit to calculate at least either one of the up_down and the left_right angle change of the feature points detected by the detection unit.
  • 3. The individual recognizing apparatus according to claim 1, wherein the certifying data acquired by the data acquisition unit is a face image of the recognized person.
  • 4. The individual recognizing apparatus according to claim 3, wherein the detection unit uses such face regions as eyes, brows, nose or lip of the face image as the feature points.
  • 5. The individual recognizing apparatus according to claim 1, wherein the processes are executed again starting from the acquisition of certifying data by the data acquisition unit when the certifying data is judged as inappropriate by the aptitude judging unit.
  • 6. An individual recognizing apparatus comprising: a data acquisition unit to acquire certifying data from a recognized person; a dictionary preparing unit to prepare a certifying dictionary by analyzing principal components based on the certifying data acquired by the data acquisition unit; a calculation unit to calculate an eigenvalue contribution rate of the dictionary prepared by the dictionary preparing unit; an aptitude judging unit to judge whether the dictionary prepared by the dictionary preparing unit is appropriate as a certifying dictionary based on the eigenvalue contribution rate calculated by the calculation unit; a dictionary storing unit to store the dictionary prepared by the dictionary preparing unit when the dictionary is judged appropriate by the aptitude judging unit; and a certifying unit to certify whether a recognized person is a proper person using the certifying data acquired by the data acquisition unit and the dictionary stored in the dictionary storing unit.
  • 7. The individual recognizing apparatus according to claim 6, wherein the certifying data acquired by the data acquisition unit is a face image of the recognized person.
  • 8. The individual recognizing apparatus according to claim 7, wherein the detection unit uses such facial regions as eyes, brows, nose or lip of the face image as the feature points.
  • 9. The individual recognizing apparatus according to claim 6, wherein the processes are executed again starting from the acquisition of certifying data by the data acquisition unit when the certifying data is judged as inappropriate by the aptitude judging unit.
  • 10. An individual recognizing method comprising: acquiring certifying data from a recognized person; detecting feature points from the acquired certifying data; calculating the change of the detecting positions of the detected feature points; judging whether the acquired certifying data is appropriate for the preparation of a certifying dictionary based on the change of the calculated feature points; preparing a certifying dictionary based on the acquired certifying data when the certifying data is judged appropriate in the judging step; storing the prepared certifying dictionary; and certifying whether a recognized person is a proper person using the acquired certifying data and the stored dictionary.
  • 11. The individual recognizing method according to claim 10, wherein the step for calculating the change includes the step for calculating at least either one of the up_down and the left_right angle changes of the feature points detected by the detecting step.
  • 12. The individual recognizing method according to claim 10, wherein the certifying data acquired by the data acquiring step are a face image of the recognized person.
  • 13. The individual recognizing method according to claim 12, wherein the detecting step uses such facial regions as eyes, brows, nose or lip of a face image as the feature points.
  • 14. The individual recognizing method according to claim 10, wherein the processes are executed again starting from the acquisition of the certifying data by the data acquiring step when the acquired data is judged as inappropriate in the aptitude judging step.
  • 15. An individual recognizing method comprising: acquiring certifying data from a recognized person; preparing a certifying dictionary by analyzing principal components based on the acquired certifying data; calculating an eigenvalue contribution rate of the prepared dictionary; judging whether the prepared dictionary is appropriate as a certifying dictionary based on the calculated eigenvalue contribution rate; storing the prepared dictionary when the prepared dictionary is judged appropriate in the judging step; and certifying whether a recognized person is a proper person using the acquired certifying data and the stored dictionary.
  • 16. The individual recognizing method according to claim 15, wherein the acquired certifying data is a facial image of a recognized person.
  • 17. The individual recognizing method according to claim 16, wherein the detecting step uses such facial regions as eyes, brows, nose or lip of the facial image as feature points.
  • 18. The individual certifying method according to claim 15, wherein the processes are executed again starting from the acquisition of certifying data by the data acquiring step when the acquired data is judged as inappropriate in the aptitude judging step.
Priority Claims (1)
  • Number: 2002-378452
  • Date: Dec 2002
  • Country: JP