The present application claims priority from Japanese Patent Application No. JP 2010-061168 filed in the Japanese Patent Office on Mar. 17, 2010, the entire content of which is incorporated herein by reference.
1. Field of the Invention
The present invention relates to an information processing apparatus and an information processing method. More particularly, the invention relates to an information processing apparatus and an information processing method for performing authentication accurately in a biometric authentication process even where the position for living body part placement is flat.
2. Description of the Related Art
In recent years, there have been known biometric authentication apparatuses for authenticating individuals using the veins of their fingers or their palms.
In order to perform authentication accurately with such a biometric authentication apparatus, it is necessary for the user to place his or her palm or finger precisely onto the position in which to read (i.e., image) the venous pattern of the body part (the position may be called the placement position hereunder).
Some apparatuses are arranged to illuminate the surroundings of the position for finger placement so that the user may know the exact position to put his or her finger on (see Japanese Patent Laid-Open No. 2005-323892).
The arrangement above allows the user to determine clearly where the placement position is and thus to put his or her finger precisely onto the placement position.
However, this technique does not envisage sensing the position where the user's finger is actually placed. The user thus fails to notice any possible misalignment of the fingertips with the accurate placement position. This can make it difficult to perform authentication with precision.
There exists a technique which, if the user has failed to put his or her finger precisely onto the placement position, allows the image taken of the venous pattern to be corrected to permit accurate authentication. More specifically, the technique involves emitting light beams of different wavelengths at the user's finger to acquire a fingerprint image and a venous pattern image. If the user's finger is not aligned with the placement position, a misalignment is detected from the fingerprint image and the detected misalignment is used as the basis for correcting the venous pattern image (see Japanese Patent Laid-Open No. 2006-72764, called the Patent Document 1 hereunder).
According to the technique of the Patent Document 1, the placement position for the finger is formed as guide grooves so that the user's finger will not be misaligned substantially from the correct placement position. However, if the placement position for the finger is formed as a flat shape based on this technique, the user finds it difficult to determine the correct placement position. This poses the possibility that the user's finger will be so misaligned with the placement position as to make it impossible to correct the venous pattern image in accordance with the misalignment. As a result, authentication may not be carried out precisely.
The present invention has been made in view of the above circumstances and provides arrangements for performing authentication accurately in a biometric authentication process even where the position for living body part placement is flat.
In carrying out the present invention and according to one embodiment thereof, there is provided an information processing apparatus for performing authentication using veins of a living body part, the information processing apparatus including: a visible light source configured to present through light emission the position on which to place the living body part; a light-receiving section configured to receive reflected light of the visible light from the visible light source; a computation section configured to compute the amount of misalignment of the living body part with the placement position on the basis of the intensity of the reflected light received by the light-receiving section; and a control section configured to prompt the correction of the placement of the living body part for alignment with the placement position in accordance with the misalignment amount computed by the computation section.
Preferably, the control section may prompt the correction of the placement of the living body part for alignment with the placement position by controlling the emission of the visible light in accordance with the misalignment amount computed by the computation section.
Preferably, the information processing apparatus may further include: a near-infrared light source configured to emit near-infrared light to the living body part; and an imaging section configured to take an image of the living body part to which the near-infrared light is emitted; wherein the computation section may compute the misalignment amount based on the intensity of the reflected light received by the light-receiving section and on the image of the living body part taken by the imaging section.
Preferably, if the misalignment amount is larger than a predetermined threshold value, then the control section may prompt the correction of the placement of the living body part for alignment with the placement position.
Preferably, the information processing apparatus may further include an imaging control portion configured to adjust imaging parameters of the imaging section if the misalignment amount is smaller than the predetermined threshold value.
Preferably, the information processing apparatus may further include a determination section configured to determine whether an object imaged by the imaging section is the living body part; wherein, if the determination section determines that the object is the living body part, then the computation section may compute the misalignment amount.
Preferably, the information processing apparatus may further include a recording section configured to record the image taken by the imaging section upon user registration; wherein the computation section may compute the misalignment amount based on the intensity of the reflected light received by the light-receiving section and on a difference between the image taken by the imaging section and the image recorded in the recording section.
Preferably, the information processing apparatus may further include a display section configured to display a predetermined image or text; wherein the control section may cause the display section to display an image or a text prompting the correction of the placement of the living body part for alignment with the placement position in accordance with the misalignment amount computed by the computation section.
Preferably, the information processing apparatus may further include a sound output section configured to output a sound; wherein the control section may cause the sound output section to output a sound prompting the correction of the placement of the living body part for alignment with the placement position in accordance with the misalignment amount computed by the computation section.
Preferably, the information processing apparatus may further include a temperature difference generation section configured to generate a temperature difference near the placement position; wherein the control section may cause the temperature difference generation section to generate a temperature difference prompting the correction of the placement of the living body part for alignment with the placement position in accordance with the misalignment amount computed by the computation section.
Preferably, the information processing apparatus may further include a vibration generation section configured to generate vibrations near the placement position; wherein the control section may cause the vibration generation section to generate vibrations prompting the correction of the placement of the living body part for alignment with the placement position in accordance with the misalignment amount computed by the computation section.
Preferably, the information processing apparatus may further include: a display section configured to display a predetermined image or text; a sound output section configured to output a sound; a temperature difference generation section configured to generate a temperature difference near the placement position; and a vibration generation section configured to generate vibrations near the placement position. The control section may cause the visible light source to emit the light prompting the correction of the placement of the living body part for alignment with the placement position in accordance with the misalignment amount computed by the computation section. The control section may cause the display section to display an image or a text prompting the correction of the placement of the living body part for alignment with the placement position in accordance with the misalignment amount. The control section may cause the sound output section to output a sound prompting the correction of the placement of the living body part for alignment with the placement position in accordance with the misalignment amount. The control section may cause the temperature difference generation section to generate a temperature difference prompting the correction of the placement of the living body part for alignment with the placement position in accordance with the misalignment amount. The control section may cause the vibration generation section to generate vibrations prompting the correction of the placement of the living body part for alignment with the placement position in accordance with the misalignment amount.
Preferably, the living body part mentioned above may be a human finger.
According to another embodiment of the present invention, there is provided an information processing method for use with an information processing apparatus which performs authentication using veins of a living body part and which includes a visible light source configured to present through light emission the position on which to place the living body part and a light-receiving section configured to receive reflected light of the visible light from the visible light source, the information processing method including the steps of: computing the amount of misalignment of the living body part with the placement position on the basis of the intensity of the reflected light received by the light-receiving section; and performing control to prompt correction of the placement of the living body part for alignment with the placement position in accordance with the misalignment amount computed in the computing step.
According to the present invention embodied as outlined above, the amount of misalignment of the living body part with the placement position is computed on the basis of the intensity of the reflected light received by the light-receiving section. Then a prompt is made to correct the placement of the living body part for alignment with the placement position in accordance with the computed misalignment amount.
Thus according to the present invention outlined above, it is possible to perform authentication accurately in a biometric authentication process even where the position for living body part placement is flat.
Some preferred embodiments of the present invention will now be explained in reference to the accompanying drawings.
The authentication unit 11 shown in
More specifically, in the authentication unit 11, a near-infrared light source 31 emits near-infrared light to the finger 12 while an imaging section 32 images the light scattered by the finger 12. That is, the imaging section 32 takes an image of a venous pattern formed by the hemoglobin inside those veins of the finger 12 which absorb the near-infrared light from the near-infrared light source 31.
The near-infrared light source 31 is composed of LED's (light emitting diodes) emitting light in an infrared spectrum part ranging from about 0.7 to 2.5 μm. The near-infrared light source 31 is formed by a plurality of LED's arrayed linearly in the lengthwise direction of the finger 12. The LED's constituting the near-infrared light source 31 are arrayed at suitable lighting angles so that excess near-infrared light will not spill into the imaging section 32 and that only the finger 12 will be lit in the position for taking an image of the venous pattern of the finger 12.
The imaging section 32 is made up of an optical block and a photoelectric conversion element such as a CCD (charge coupled device) or CMOS (complementary metal-oxide semiconductor). The optical block forms an optical image of an object. The photoelectric conversion element acquires the image of the object by converting the optical image (i.e., image of the object) into image data that is an electrical signal.
In the authentication unit 11 of
In the authentication unit 11 of
In that structure, the authentication unit 11 on the side of the finger 12 (i.e., topside of the transmission filter 33) may be shaped flat. Also, the authentication unit 11 as a whole can be formed into a thin shape.
In the above-described structure, the authentication unit 11 performs venous pattern authentication by collating an image taken of the venous pattern with the venous pattern imaged and recorded upon registration.
A typical functional structure of the authentication unit 11 will now be explained in reference to
The authentication unit 11 shown in
Also, the finger 12 in
A typical layout of the near-infrared light source 31, imaging section 32, visible light source 34, and light-receiving sensor 35 is now explained below in reference to
As shown in
The layout of the near-infrared light source 31 and visible light source 34 is not limited to the one shown in
Referring back to the explanation in reference to
The visible light source 34 is composed of a plurality of LED's emitting light in a visible spectrum part ranging from about 380 to 750 nm. The visible light source 34 is structured with a plurality of LED's arrayed in the region shown in
The light-receiving sensor 35 receives reflected light of the visible light that was emitted to and reflected from the finger 12 before passing through the transmission filter 33. The light-receiving sensor 35 supplies the control section 37 with information representing the level of the received light.
The registration database 36 is typically composed of a hard disk or a nonvolatile memory. As such, the registration database 36 records user information supplied from the control section 37 for user authentication. The user information recorded in the registration database 36 is rewritable and is retrieved as needed by the control section 37.
The control section 37 is made up of a CPU (central processing unit), a ROM (read only memory), and a RAM (random access memory). The control section 37 controls the components of the authentication unit 11.
The control section 37 includes a received-light intensity calculation portion 51, an imaging control portion 52, a registration/authentication processing portion 53, an object determination portion 54, a misalignment amount computation portion 55, and a light emission control portion 56.
The received-light intensity calculation portion 51 calculates the intensity of the reflected light received from the finger 12 on the basis of received-light level information coming from the light-receiving sensor 35. The received-light intensity thus calculated is fed to the registration/authentication processing portion 53 or misalignment amount computation portion 55. Also, upon detection of a change in received-light intensity typically as a result of the finger 12 getting placed on the transmission filter 33, the received-light intensity calculation portion 51 supplies the imaging control portion 52 with information giving an instruction to start imaging.
The imaging control portion 52 controls the near-infrared light source 31 and imaging section 32 in operation. For example, when supplied from the received-light intensity calculation portion 51 with information giving the instruction to start imaging, the imaging control portion 52 causes the near-infrared light source 31 to emit near-infrared light and the imaging section 32 to start imaging. The image data (simply called the image hereunder) acquired by the imaging section 32 is sent to the registration/authentication processing portion 53 or object determination portion 54 under control of the imaging control portion 52.
The registration/authentication processing portion 53 performs registration and authentication of user information in the authentication unit 11.
More specifically, when the authentication unit 11 is in registration mode, the registration/authentication processing portion 53 establishes, on the basis of the identification information for user identification coming from an input section, not shown, correspondences among that identification information, the image of the finger 12 from the imaging section 32, and the received-light intensity from the received-light intensity calculation portion 51. The items of information thus made in correspondence with one another are supplied and written to the registration database 36 as user information.
Also, when the authentication unit 11 is in authentication mode, the registration/authentication processing portion 53 reads corresponding user information from the registration database 36 based on the identification information supplied from the input section, not shown. The registration/authentication processing portion 53 proceeds to collate the finger image and the received-light intensity of the user information with the finger image coming from the imaging section 32 and with the received-light intensity from the received-light intensity calculation portion 51, respectively.
When the authentication unit 11 is in authentication mode, the object determination portion 54 determines whether the object of the image coming from the imaging section 32 is a human finger. If the object of the image from the imaging section 32 turns out to be a human finger, the object determination portion 54 acquires the user information from the registration/authentication processing portion 53, and compares in size the finger image of the user information with the finger image from the imaging section 32. The object determination portion 54 proceeds to supply the misalignment amount computation portion 55 with information representing the difference in size obtained from the comparison between the two finger images.
Based on the received-light intensity fed from the received-light intensity calculation portion 51 and on the difference in size between finger images from the object determination portion 54, the misalignment amount computation portion 55 computes the amount of misalignment (called the misalignment amount hereunder) of the finger 12 with the placement position on the transmission filter 33, and sends the computed misalignment amount to the light emission control portion 56.
The light emission control portion 56 controls the light emission of the visible light source 34 in accordance with the misalignment amount coming from the misalignment amount computation portion 55. Depending on the misalignment amount from the misalignment amount computation portion 55, the light emission control portion 56 prompts the visible light source 34 to emit visible light in order to let the placement of the finger 12 be corrected for alignment with the placement position.
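As an illustrative sketch only (the concrete lighting patterns depend on the figures and are not described here), the behavior of the light emission control portion 56 can be pictured as mapping a signed misalignment amount to a lighting instruction; the direction labels and the zero dead-band below are assumptions, not taken from the specification.

```python
# Hypothetical sketch of the light emission control portion 56: a signed
# misalignment amount (assumed convention: positive = toward the fingertip
# side) is translated into a simple prompt for the visible light source 34.

def lighting_instruction(misalignment_mm: float) -> str:
    """Map the computed misalignment into a lighting prompt for the user."""
    if misalignment_mm > 0:
        return "light LEDs prompting a shift toward the fingertip side"
    if misalignment_mm < 0:
        return "light LEDs prompting a shift toward the base side"
    return "steady lighting: finger aligned with the placement position"

print(lighting_instruction(0))
```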
The user registration process performed by the authentication unit 11 will now be explained in reference to the flowchart of
The registration process shown in the flowchart of
In step S11, the registration/authentication processing portion 53 determines whether the identification information for identifying the user is input through the input section, not shown. If the input section is structured as, say, a numerical keypad, the identification information may be an ID number entered by the user through the numerical keypad. If the input section is structured as a card reader, then the identification information may be a user ID recorded on an ID card owned by the user.
If it is determined in step S11 that identification information is not input yet, step S11 is repeated until identification information is input. If it is determined in step S11 that identification information is input, then control is transferred to step S12.
At this point, as shown in
In step S12, the received-light intensity calculation portion 51 determines whether the finger 12 is placed on the placement position on the transmission filter 33 based on the information from the light-receiving sensor 35 indicating the received-light level.
If it is determined in step S12 that the finger 12 is not placed on the placement position on the transmission filter 33, i.e., if it is determined that there is no change in the received-light level from the light-receiving sensor 35 according to the information therefrom indicating the received-light level, then step S12 is repeated until there occurs a change in the received-light level from the light-receiving sensor 35.
If it is determined in step S12 that the finger 12 is placed on the placement position on the transmission filter 33, i.e., if it is determined that the received-light level from the light-receiving sensor 35 is raised abruptly by the finger getting placed onto the placement position on the transmission filter 33 according to the received-light level information from the light-receiving sensor 35, then the received-light intensity calculation portion 51 supplies the imaging control portion 52 with information giving an instruction to start imaging. Also, based on the received-light level information from the light-receiving sensor 35, the received-light intensity calculation portion 51 calculates the intensity of the reflected light from the finger 12 and feeds the calculated received-light intensity to the registration/authentication processing portion 53. Thereafter, control is transferred to step S13.
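The step S12 check above can be sketched as a simple comparison of the received-light level against a baseline; the numeric threshold below is an assumed value for illustration, not one given in the specification.

```python
# Hypothetical sketch of the step S12 check: the finger 12 is judged to be
# placed on the transmission filter 33 when the received-light level from
# the light-receiving sensor 35 rises abruptly above the ambient baseline.
# RISE_THRESHOLD is an assumed value.

RISE_THRESHOLD = 50  # assumed level units

def finger_placed(baseline_level: float, current_level: float,
                  rise_threshold: float = RISE_THRESHOLD) -> bool:
    """Return True when the received-light level rises abruptly,
    indicating an object covers the placement position."""
    return (current_level - baseline_level) > rise_threshold

# Example: ambient baseline 10; reflection from a placed finger raises
# the level to 180, so placement is detected.
print(finger_placed(10, 180))  # True
```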
In step S13, the imaging control portion 52 causes the near-infrared light source 31 to emit near-infrared light based on the information from the received-light intensity calculation portion 51 giving the instruction to start imaging. The near-infrared light is emitted to the finger 12 placed on the placement position on the transmission filter 33.
In step S14, based on the information from the received-light intensity calculation portion 51 giving the instruction to start imaging, the imaging control portion 52 causes the imaging section 32 to take an image of the finger 12 which is placed on the placement position and to which the near-infrared light is emitted. More specifically, when supplied from the received-light intensity calculation portion 51 with the information giving the instruction to start imaging, the imaging control portion 52 causes the imaging section 32 to start imaging the finger 12. The imaging control portion 52 causes the imaging section 32 to supply the registration/authentication processing portion 53 with the finger image acquired upon elapse of a predetermined time period.
In step S15, the registration/authentication processing portion 53 establishes correspondences among the identification information input through the input section, not shown, the finger image from the imaging section 32, and the received-light intensity from the received-light intensity calculation portion 51. These items of information made in correspondence with one another are supplied and written to the registration database 36 as user information.
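The step S15 registration can be sketched as storing the three items in correspondence with one another under the identification information; the dict-based "database" below is an illustration only, standing in for the registration database 36.

```python
# Minimal sketch of step S15: the identification information, the finger
# image, and the received-light intensity are made to correspond with one
# another and written as one user record. A plain dict stands in for the
# registration database 36 (hard disk or nonvolatile memory in the text).

registration_database = {}

def register_user(user_id, finger_image, received_light_intensity):
    """Associate the three items and write them as user information."""
    registration_database[user_id] = {
        "finger_image": finger_image,
        "received_light_intensity": received_light_intensity,
    }

register_user("user001", [[0, 1], [1, 0]], [10, 80, 200, 80, 10])
print("user001" in registration_database)  # True
```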
When the above steps have been carried out, the user placing his or her finger 12 onto the placement position of the transmission filter 33 can have his or her user information registered.
Explained next in reference to the flowchart of
The authentication process shown in the flowchart of
Steps S31 through S34 in the flowchart of
In step S35, based on the identification information input through the input section, not shown, the registration/authentication processing portion 53 searches the registration database 36 for the user information corresponding to the input identification information and retrieves the corresponding user information.
In step S36, the authentication unit 11 performs a misalignment notification process notifying the user of any misalignment of the finger 12 that may occur with the placement position on the transmission filter 33.
The misalignment notification process performed by the authentication unit 11 is explained below in reference to the flowchart in
In step S51, the misalignment amount computation portion 55 acquires the received-light intensity of the user information retrieved by the registration/authentication processing portion 53, and compares the received-light intensity of the user information with the received-light intensity coming from the received-light intensity calculation portion 51 to find a difference therebetween.
In
Received-light intensities L1 and L2 indicated by a broken line and a dashed line respectively are typical of the received-light levels fed from the received-light intensity calculation portion 51 in authentication mode. Positions P1 and P2 represent the placement positions for the finger 12 in effect when the received-light intensities indicated by the received-light intensity curves L1 and L2 respectively (called the received-light intensities L1 and L2 hereunder) are obtained. At the positions P1 and P2, the received-light intensities L1 and L2 each take the peak value LMAX.
For example, when the received-light intensity L1 is supplied from the received-light intensity calculation portion 51, the misalignment amount computation portion 55 compares the position P0 at which the received-light intensity L0 of the user information takes the peak value LMAX, with the position P1 at which the peak value LMAX of the received-light intensity L1 is obtained by the received-light intensity calculation portion 51, to find a difference therebetween.
Referring back to the flowchart of
If it is determined in step S52 that the imaged object is a human finger, then control is transferred to step S53.
In step S53, the object determination portion 54 acquires the finger image of the user information retrieved by the registration/authentication processing portion 53, and compares the finger image of the user information with the finger image determined to be representative of a human finger to find a difference therebetween.
For example, if the finger 12 is placed on the position P1 on the transmission filter 33 shown in
If the finger 12 is placed on the position P2 on the transmission filter 33 shown in
Thus the object determination portion 54 may typically compare the finger width in the finger image of the user information with the finger width in the finger image taken by the imaging section 32 to find a difference therebetween. The acquired difference is sent to the misalignment amount computation portion 55.
In step S54, the misalignment amount computation portion 55 computes the amount of misalignment of the finger 12 with the placement position on the transmission filter 33 based on the difference between the received-light intensities obtained in step S51 and on the difference between the finger images supplied from the object determination portion 54.
More specifically, the misalignment amount computation portion 55 computes the amount of misalignment of the finger 12 with the placement position on the transmission filter 33 (the amount may also be called the actual misalignment amount hereunder) on the basis of, say, the difference between the positions P0 and P1 shown in
For example, if the light-receiving sensor 35 receives the reflected light from the finger 12 in units of pixels, then the received-light intensity calculation portion 51 calculates the intensity of the received light also in units of pixels. Thus the difference between the positions P0 and P1 shown in
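The actual-misalignment computation described above can be sketched as comparing the pixel positions of the intensity peaks and converting the pixel difference to millimetres; the 0.2 mm pixel pitch below is an assumed value for illustration.

```python
# Sketch of the actual-misalignment computation: the peak position of the
# registered received-light intensity (position P0) is compared with the
# peak position obtained in authentication mode (position P1), and the
# pixel difference is converted to millimetres. PIXEL_PITCH_MM is assumed.

PIXEL_PITCH_MM = 0.2  # assumed size of one sensor pixel in mm

def actual_misalignment_mm(registered_intensity, current_intensity,
                           pixel_pitch_mm=PIXEL_PITCH_MM):
    """Difference between the peak positions of two received-light
    intensity profiles, converted from pixels to millimetres."""
    p0 = registered_intensity.index(max(registered_intensity))
    p1 = current_intensity.index(max(current_intensity))
    return abs(p1 - p0) * pixel_pitch_mm

l0 = [10, 40, 90, 40, 10, 5, 5, 5]  # registered profile, peak at pixel 2 (P0)
l1 = [5, 5, 10, 40, 90, 40, 10, 5]  # current profile, peak at pixel 4 (P1)
print(actual_misalignment_mm(l0, l1))  # 0.4
```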
Also, the misalignment amount computation portion 55 computes the amount of relative misalignment (also called the relative misalignment amount hereunder) corresponding to the amount of misalignment of the finger 12 with the placement position on the transmission filter 33 based on the difference between the finger width in the finger image of the user information and the finger width in the finger image taken by the imaging section 32.
For example, if the finger width in the finger image taken by the imaging section 32 is 750 pixels and if the finger width in the finger image of the user information is 500 pixels, the difference of 250 pixels constitutes the relative misalignment amount. In this case, the finger width in the finger image of the user information is less than the finger width in the finger image taken by the imaging section 32. That means the finger 12 is shifted from the placement position towards the near-infrared light source 31.
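The relative-misalignment computation in the example above reduces to a pixel difference between the two finger widths, which can be sketched as follows.

```python
# Sketch of the relative-misalignment computation: the finger width (in
# pixels) in the registered finger image is compared with the width in the
# image newly taken by the imaging section 32; the difference is the
# relative misalignment amount.

def relative_misalignment_px(registered_width_px: int,
                             current_width_px: int) -> int:
    """Pixel difference between registered and current finger widths."""
    return abs(current_width_px - registered_width_px)

# Example from the text: a 750-pixel current width against a 500-pixel
# registered width gives a relative misalignment of 250 pixels.
print(relative_misalignment_px(500, 750))  # 250
```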
In step S55, the misalignment amount computation portion 55 determines whether the misalignment amount computed in step S54 is larger than a predetermined threshold value. More specifically, the misalignment amount computation portion 55 determines whether the actual misalignment amount and the relative misalignment amount are each larger than a predetermined corresponding threshold value.
If it is assumed here that the predetermined threshold value for the actual misalignment amount is 8 mm and that the predetermined threshold value for the relative misalignment amount is 200 pixels, then it is determined in step S55 of the above example that the misalignment amount is larger than the threshold value. In this case, the misalignment amount computation portion 55 supplies the light emission control portion 56 with information representative of the calculated amount of misalignment (actual misalignment amount). Control is then transferred to step S56.
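The step S55 decision can be sketched with the example threshold values given above (8 mm for the actual misalignment amount, 200 pixels for the relative misalignment amount). Matching the complement stated for step S57 (both amounts not larger than their thresholds), correction is prompted here when either amount exceeds its threshold; this interpretation is an assumption.

```python
# Sketch of the step S55 branch: each misalignment amount is compared with
# its own threshold. Thresholds are the example values from the text.
# Assumption: either amount exceeding its threshold triggers the prompt
# (step S56); otherwise imaging parameters are adjusted (step S57).

ACTUAL_THRESHOLD_MM = 8
RELATIVE_THRESHOLD_PX = 200

def needs_correction(actual_mm: float, relative_px: int) -> bool:
    """True when the finger placement must be corrected before imaging."""
    return (actual_mm > ACTUAL_THRESHOLD_MM
            or relative_px > RELATIVE_THRESHOLD_PX)

print(needs_correction(12.0, 250))  # True  -> prompt correction (step S56)
print(needs_correction(2.0, 50))    # False -> adjust parameters (step S57)
```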
The threshold value for the amount of misalignment may be established variably depending on the security level demanded. For example, the threshold value may be set low for the authentication permitting entry into buildings, and set high for the authentication permitting access to ATM's (automated teller machines) at banks or like institutions.
In step S56, the light emission control portion 56 controls the light emission of the visible light source 34 in accordance with the information representative of the misalignment amount coming from the misalignment amount computation portion 55. Through such light emission control, the light emission control portion 56 prompts the correction of the placement of the finger 12 on the placement position.
As shown in
As shown in
The shapes and patterns of the light emission by the visible light source 34 as well as the lighting colors are not limited to those shown in
Returning to the flowchart of
Meanwhile, if it is determined in step S55 that the misalignment amount computed in step S54 is not larger than the predetermined threshold value, e.g., if it is determined that the actual misalignment amount and the relative misalignment amount are each not larger than the predetermined corresponding threshold value, then the misalignment amount computation portion 55 supplies the imaging control portion 52 with information representative of the misalignment amount (relative misalignment amount). Control is then transferred to step S57.
In step S57, the imaging control portion 52 adjusts the imaging parameters of the imaging section 32 in accordance with the information representative of the misalignment amount coming from the misalignment amount computation portion 55, and causes the imaging section 32 to take a finger image using the adjusted imaging parameters before feeding the acquired finger image to the registration/authentication processing portion 53. The adjustments allow the imaging section 32 to take the finger image that can be authenticated by the registration/authentication processing portion 53. If the actual misalignment amount and the relative misalignment amount are each zero (or approximately zero), then the imaging control portion 52 causes the acquired finger image to be sent to the registration/authentication processing portion 53 without adjusting the imaging parameters of the imaging section 32.
Subsequent to step S57, or if it is determined in step S52 that the imaged object is not a human finger, then control is transferred back to step S36 in the flowchart of
Returning to the flowchart of
With the above steps carried out, if the user's finger is not aligned with the correct placement position during the authentication process based on the venous pattern of the finger and performed by the authentication unit with its finger placement position shaped flat, then an emission of the visible light source reflecting the finger's amount of misalignment with the placement position can be fed back to the user. The feedback allows the user to recognize that the finger is not aligned with the placement position. As a result, authentication can be performed accurately even where the placement position is shaped flat.
Whereas the foregoing description indicated that the amount of misalignment is calculated based on the received-light intensity and on the finger image, the misalignment amount may alternatively be computed from the received-light intensity alone.
In the foregoing paragraphs, the received-light intensity was shown to be computed based on the received-light level of the reflection of the visible light. Alternatively, the received-light intensity may be computed on the basis of the received-light level of the reflection of the near-infrared light emitted by the near-infrared light source 31.
Also in the foregoing paragraphs, the transmission filter 33 was shown to be structured simply to let near-infrared light and visible light pass through. Alternatively, a diffuser panel may be overlaid on the transmission filter 33.
The diffuser panel overlaid on the transmission filter 33 diffuses the visible light emitted by the visible light source 34. This provides gradations at the placement position for the finger 12 as shown in
Whereas
In
The shape of the light-guiding material 131 is not limited to the heart shape shown in
As described above, when the diffuser panel is overlaid on the transmission filter 33 or the light-guiding materials are embedded therein, it is possible visibly to vary the manner in which the finger placement position is presented or the kind of feedback with which the user is notified of finger misalignment from the placement position.
The authentication unit 11 in
In the foregoing paragraphs, the emission of the visible light source 34 was shown to be given as the feedback to the user where the finger is not aligned with the placement position. Alternatively, the authentication unit 11 may be structured to incorporate or connect with a display section for displaying predetermined images or text and a sound output section for outputting predetermined sounds. The display section and the sound output section may then be arranged to give a suitable display and output suitable sounds as the feedback to the user.
Explained below in reference to
The authentication unit 11 in
In the authentication unit 11 of
That is, the difference between the authentication unit 11 in
The display section 211 is composed of a display device such as a liquid crystal display (LCD) or an organic electroluminescence (EL) display. Under control of the control section 37, the display section 211 displays predetermined images or text.
The sound output section 212 is composed of so-called speakers that output predetermined sounds under control of the control section 37.
The control section 37 in
In the control section 37 of
That is, the difference between the control section in
The display control portion 231 controls the display of the display section 211 in a manner prompting the correction of the placement of the finger 12 for alignment with the placement position in accordance with the misalignment amount supplied from the misalignment amount computation portion 55.
The sound output control portion 232 controls the sound output of the sound output section 212 in a manner prompting the correction of the placement of the finger 12 for alignment with the placement position in accordance with the misalignment amount supplied from the misalignment amount computation portion 55.
The user registration process and the user authentication process performed by the authentication unit 11 in
Also, the misalignment notification process performed by the authentication unit 11 in
It should be noted, however, that in the misalignment notification process performed by the authentication unit 11 of
More specifically, depending on the actual misalignment amount, the display section 211 is caused typically to display an arrow image or a text such as “Move your finger by _ mm to the right (left)” thereby prompting the correction of the placement of the finger 12 for alignment with the placement position.
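A minimal sketch of how the display control portion 231 might compose such a text prompt from the actual misalignment amount is given below. The sign convention (positive meaning the finger should move to the right) is an assumption for the sketch.

```python
# Illustrative sketch of composing the corrective text prompt from the
# actual misalignment amount. The sign convention is an assumption.

def correction_message(misalignment_mm):
    """Build a prompt of the form "Move your finger by _ mm to the ..."."""
    if misalignment_mm == 0:
        return "Your finger is correctly placed."
    direction = "right" if misalignment_mm > 0 else "left"
    return f"Move your finger by {abs(misalignment_mm)} mm to the {direction}"
```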
After the above-described misalignment notification process, the display control portion 231 may control the display of the display section 211 in accordance with the result of the collation in a manner presenting the user with the outcome of the collation.
Also in the misalignment notification process performed by the authentication unit 11 of
More specifically, depending on the actual misalignment amount, the sound output section 212 is caused typically to output sounds such as “Move your finger by _ mm to the right (left)” in order to prompt the correction of the placement of the finger 12 for alignment with the placement position.

After the above-described misalignment notification process, the sound output control portion 232 may control the sound output of the sound output section 212 in accordance with the result of the collation in a manner presenting the user with the outcome of the collation.
In the above-described authentication process based on the venous pattern of the finger and performed by the authentication unit with its finger placement position shaped flat, the display or the sound reflecting any misalignment of the user's finger with the placement position is presented to the user as the feedback indicating the misalignment. This allows the user to recognize that his or her finger is not aligned with the placement position. As a result, authentication can be performed accurately even where the placement position is shaped flat.
Also, the authentication unit 11 in
Furthermore, because the authentication unit 11 of
In the personal computer 301 shown in
Also in
The above structure allows the personal computer 301 as a whole including its authentication facility to be shaped thinner than ever before.
In the foregoing paragraphs, the display of the display section or the sound output of the sound output section was shown to be presented to the user as the feedback indicating any misalignment of the finger with the placement position. Alternatively, a temperature difference or vibrations near the placement position may be given to the user as the feedback indicative of the misalignment.
Explained below in reference to
The authentication unit 11 in
In the authentication unit 11 of
That is, the difference between the authentication unit 11 in
The heating element 213 is structured to be a thin metal sheet enveloped in plastic resin film. As such, the heating element 213 is attached to positions away from the finger placement position by a predetermined distance (e.g., positions corresponding to the visible light source element 34-3 in
The control section 37 in
In the control section 37 of
That is, the difference between the control section 37 in
In accordance with the misalignment amount coming from the misalignment amount computation portion 55, the heat control portion 233 controls the heating of the heating element 213 in a manner prompting the correction of the placement of the finger 12 for alignment with the placement position.
The user registration process and the user authentication process performed by the authentication unit 11 in
Also, the misalignment notification process performed by the authentication unit 11 in
It should be noted, however, that in the misalignment notification process performed by the authentication unit 11 of
More specifically, an electrical current reflecting the actual misalignment amount is allowed to flow through the metal sheet of the heating element 213. This causes the misaligned position away from the placement position on the transmission filter 33 to generate heat reflecting the misalignment amount in a manner prompting the correction of the placement of the finger 12 for alignment with the placement position. At this point, the temperature at the placement position on the transmission filter 33 is different from (i.e., lower than) the temperature at the misaligned position away from the placement position, so that the user can recognize the low-temperature position to be the correct placement position.
In the above-described structure, the misaligned position away from the finger placement position was shown to have a raised temperature (i.e., heated in proportion to the amount of misalignment with the placement position). The point, however, is that the temperature difference between the placement position and the misaligned position need only be recognized by the user. In that sense, the misaligned position away from the placement position may be arranged alternatively to have a lower temperature than the placement position (i.e., heat is absorbed in accordance with the amount of misalignment with the placement position).
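The heat control described above may be sketched as a simple proportional mapping from the misalignment amount to the current through the metal sheet. The gain and the current limit below are illustrative values, not taken from the specification, which also permits the inverse arrangement where the misaligned position is cooled instead of heated.

```python
# Sketch of the heat control portion 233, assuming heat output grows in
# proportion to the misalignment amount. MAX_CURRENT_MA and
# GAIN_MA_PER_MM are illustrative values, not from the original.

MAX_CURRENT_MA = 100.0   # assumed safe maximum current through the sheet
GAIN_MA_PER_MM = 20.0    # assumed current gain per millimeter of offset

def heating_current(misalignment_mm):
    """Current through the metal sheet, clamped to the assumed maximum."""
    return min(MAX_CURRENT_MA, GAIN_MA_PER_MM * abs(misalignment_mm))
```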
Explained below in reference to
The authentication unit 11 in
In the authentication unit 11 of
That is, the difference between the authentication unit 11 in
The vibration section 214 may be structured to include a small-sized motor equipped with an eccentric weight. The vibration section 214 is attached onto that position of the bottom side of the transmission filter 33 which is displaced by a predetermined distance from the finger placement position (e.g., onto the position corresponding to the visible light source element 34-3 in
The control section 37 in
In the control section 37 of
That is, the difference between the control section in
In accordance with the misalignment amount coming from the misalignment amount computation portion 55, the vibration control portion 234 controls the vibration of the vibration section 214 in a manner prompting the correction of the placement of the finger 12 for alignment with the placement position.
The user registration process and the user authentication process performed by the authentication unit 11 in
Also, the misalignment notification process performed by the authentication unit 11 in
It should be noted, however, that in the misalignment notification process performed by the authentication unit 11 of
More specifically, the vibration section 214 causes the transmission filter 33 to vibrate the misaligned position away from the finger placement position at a magnitude (or with a pattern) reflecting the actual misalignment amount so as to prompt the correction of the placement of the finger 12 for alignment with the placement position. The vibrations near the placement position on the transmission filter 33 feel smaller than those at a position significantly misaligned from the placement position. This allows the user to recognize that the smaller the magnitude of the vibrations, the closer the finger is to the placement position.
In the above-described structure, the misaligned position was shown to vibrate more the farther away from the finger placement position (i.e., the placement position does not vibrate). The point, however, is that the difference in (or the absence of) the magnitude of vibrations need only be felt by the user between the placement position and the misaligned position. In that sense, there may be provided an alternative structure whereby the farther away from the placement position, the smaller the vibrations felt at the misaligned position of the finger (i.e., the placement position vibrates the most).
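Both vibration mappings described above may be sketched as follows: the default, in which the vibration grows with the misalignment so that the placement position itself does not vibrate, and the alternative, in which the placement position vibrates the most. The full-scale amplitude and saturation range are assumed values for the sketch.

```python
# Sketch of the vibration control portion 234, covering both the default
# mapping (vibration grows with misalignment) and the alternative
# (vibration shrinks with misalignment). MAX_AMPLITUDE and RANGE_MM are
# illustrative assumptions.

MAX_AMPLITUDE = 1.0   # assumed full-scale motor amplitude
RANGE_MM = 10.0       # assumed misalignment at which the mapping saturates

def vibration_amplitude(misalignment_mm, inverted=False):
    """Amplitude at the finger position for the given misalignment."""
    ratio = min(1.0, abs(misalignment_mm) / RANGE_MM)
    # Default: amplitude proportional to misalignment; inverted: the
    # placement position (zero misalignment) vibrates the most.
    return MAX_AMPLITUDE * (1.0 - ratio) if inverted else MAX_AMPLITUDE * ratio
```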
In the above-described authentication process based on the venous pattern of the finger and performed by the authentication unit with its finger placement position shaped flat, the temperature difference or the vibration reflecting any misalignment of the user's finger with the placement position is presented to the user as the feedback indicating the misalignment. This allows the user to recognize that his or her finger is not aligned with the placement position. As a result, authentication can be performed accurately even where the placement position is shaped flat.
In the foregoing paragraphs, the temperature difference or the vibration was shown given to the user as the feedback indicating any misalignment of his or her finger with the placement position. Alternatively, the above-described emission of visible light, display, sound, temperature difference, and vibration may all be given to the user as the feedback indicative of any misalignment.
Explained below in reference to
In the authentication unit 11 of
Also, the user registration process, user authentication process, and misalignment notification process performed by the authentication unit 11 in
In the above-described authentication process based on the venous pattern of the finger and performed by the authentication unit with its finger placement position shaped flat, the emission of visible light, display, sound, temperature difference, and vibration reflecting any misalignment of the user's finger with the placement position may all be presented to the user as the feedback indicating the misalignment. This allows the user to recognize that his or her finger is not aligned with the placement position. As a result, authentication can be performed accurately even where the placement position is shaped flat.
In the foregoing paragraphs, all of the emission of visible light, display, sound, temperature difference, and vibration were shown given to the user as the feedback indicating any misalignment of his or her finger with the placement position. Alternatively, at least two items out of the above-described emission of visible light, display, sound, temperature difference, and vibration may be given in combination to the user as the feedback indicative of any misalignment. This also allows the user to recognize any misalignment of his or her finger with the placement position more unambiguously than ever.
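Dispatching the misalignment feedback to any chosen combination of channels may be sketched as below. The channel names and the callback registry are illustrative assumptions, not structures named in the specification.

```python
# Sketch of dispatching misalignment feedback to a combination of the
# feedback channels described above. The registry and channel names are
# illustrative assumptions.

def notify_misalignment(misalignment_mm, channels):
    """Invoke each registered feedback channel with the misalignment amount."""
    events = []
    for name, emit in channels.items():
        events.append((name, emit(misalignment_mm)))
    return events

# Example: combine light emission and vibration feedback.
channels = {
    "light": lambda mm: f"blink at offset {mm} mm",
    "vibration": lambda mm: f"vibrate magnitude {abs(mm)}",
}
```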
In the foregoing description, the present invention was explained as applicable to the authentication unit performing the authentication process by utilizing the veins of the human finger. Alternatively, the invention may be applied to diverse authentication units including those that carry out authentication processes by use of part of the veins of the human body, such as the veins of the palm.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors in so far as they are within the scope of the appended claims or the equivalents thereof.
Number | Date | Country | Kind
---|---|---|---
P2010-061168 | Mar 2010 | JP | national