This application is based upon and claims the benefit of priority from the prior Japanese Patent Application No. 2013-154839, filed on Jul. 25, 2013, the entire contents of which are incorporated herein by reference.
The technique discussed in the embodiment is related to an image capture device and an image capture method.
Japanese Laid-open Patent Publication No. 2007-233981 and Japanese Laid-open Patent Publication No. 2012-208687 discuss techniques using biometric authentication. There are scenes where biometric authentication is used to ensure security in mobile equipment, such as a notebook personal computer or a tablet terminal.
According to an aspect of the invention, an image capture device includes a casing, an image sensor provided to a surface of the casing, and a processor configured to: detect a location with which a subject is in contact on the surface, and cause the image sensor to perform image capture processing when a distance between a first portion of the subject and the image sensor meets a certain criterion, the first portion being different from a second portion of the subject, the second portion being in contact with the surface in the location, wherein the certain criterion is set based on first distance information indicating a first distance between the location and the image sensor and second distance information indicating a second distance between a certain location on the surface and the image sensor.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
Biometric information input in biometric authentication includes fluctuations resulting from a state at the time of input (e.g., a way in which a hand is held). The fluctuations are a cause of a reduction in authentication accuracy.
It is an object of the technique discussed in the embodiment to suppress a reduction in authentication accuracy in biometric authentication.
An embodiment is described below with reference to the drawings.
The CPU 101 is a central processing unit. The CPU 101 includes one or more cores. The RAM 102 is volatile memory for temporarily storing a program to be executed by the CPU 101, data to be processed by the CPU 101, and the like. The storage device 103 is nonvolatile memory. Examples usable as the storage device 103 can include a read-only memory (ROM), a solid-state drive, such as flash memory, and a hard disk driven by a hard disk drive. The storage device 103 stores an image capture program and a biometric authentication program.
Examples of the display device 104 can include a liquid crystal display and an electroluminescent panel. The display device 104 can display instructions to a user, results of image capture processing and biometric authentication processing, which are described below, and the like. The image sensor 105 may be any device capable of acquiring a biometric image by capturing an image of a subject in a noncontact manner and is not particularly limited. One example of the image sensor 105 may be a complementary metal-oxide semiconductor (CMOS) camera. In the present embodiment, the image sensor 105 acquires an image of a palm of a user as one example. The input equipment 106 may be equipment, such as a keyboard or a touch panel.
The image capture program and the biometric authentication program stored in the storage device 103 are loaded into the RAM 102 so that they are executable. The CPU 101 executes the image capture program and the biometric authentication program loaded into the RAM 102, thereby performing the image capture processing and the biometric authentication processing in the biometric authentication apparatus 100. The image capture processing is processing of acquiring a biometric image by capturing an image of a subject. The biometric authentication processing is processing of identifying an authorized user by checking feature data obtained at the time of authentication against registered feature data, which was registered in advance.
Next, a preferred value of the mounting angle (optical axis) of the image sensor 105 is described. In the present embodiment, an image is captured in an attitude in which a fingertip is in contact with the surface 201 and the palm is raised. That is, the palm is held obliquely to the surface 201. Thus the image sensor 105 may preferably be mounted such that its optical axis is orthogonal to the palm. At that time, the palm may preferably be positioned within the sensor field of view of the image sensor 105. Examples of those conditions will be described below.
Referring to
To position the palm within the sensor field of view, the angle β may preferably satisfy the following Expression (1).
β = tan⁻¹{(a + b)/h}   (1)
When the height h in Expression (1) is the height at which an image can be taken by the image sensor 105, having an angle of view of 2α, such that the palm length 2a falls within the sensor field of view, the following Expression (2) holds.
h = a/tan α   (2)
When the above Expression (2) is substituted into the above Expression (1), the angle β can be determined from the following Expression (3).
β = tan⁻¹{(a + b) × tan α/a}   (3)
The image sensor 105 may preferably be arranged such that the angle β satisfies the above Expression (3).
The palm length 2a may preferably be designed so as to suit expected users. For example, according to the Anthropometric Database 1991-92 of the National Institute of Advanced Industrial Science and Technology (AIST) (http://riodb.ibase.aist.go.jp/dhbodydb/91-92/), the palm length (Item L2) has an average value of 11.3 cm for adult males and 11 cm for elderly people. For example, when the sensor angle of view 2α is 80°, the palm size 2a is 11 cm, and the finger length b is 8 cm, the angle β is 66°. In actuality, if α is strictly set, the apparatus is susceptible to positional displacement. Thus, the height may preferably be adjusted so that an image of a range slightly larger than the palm size 2a can be taken. For example, when a margin of 1 cm is set, the angle β may be determined by applying a′ = a + 1 and b′ = b − 1 to the above Expression (3).
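As a rough illustration of how Expressions (2) and (3) can be evaluated, the following sketch computes the capture height h and the mounting angle β for given values of a, b, and α. The function names, the programming language, and the way the 1 cm margin is applied are assumptions made only for this illustration and are not part of the embodiment.

```python
import math

def capture_height(a, alpha_deg):
    """Expression (2): h = a / tan(alpha), the height at which a palm of
    half-length a fills a field of view with half-angle alpha."""
    return a / math.tan(math.radians(alpha_deg))

def mounting_angle_beta(a, b, alpha_deg):
    """Expression (3): beta = arctan((a + b) * tan(alpha) / a)."""
    return math.degrees(math.atan((a + b) * math.tan(math.radians(alpha_deg)) / a))

# Illustrative values from the text, with the suggested 1 cm margin applied
# (a' = a + 1, b' = b - 1); the exact figures depend on design choices.
alpha = 40.0                # half the 80-degree angle of view
a_prime = 11.0 / 2 + 1.0    # half the palm length, plus the margin
b_prime = 8.0 - 1.0         # finger length, minus the margin

print(f"h    = {capture_height(a_prime, alpha):.1f} cm")
print(f"beta = {mounting_angle_beta(a_prime, b_prime, alpha):.1f} degrees")
```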
Next, an index point set on the surface 201 is described. In the present embodiment, the location of the fingertip is detected when the palm is placed over the image sensor 105 with reference to an index point in the vicinity of the image sensor 105, and an image is captured when the palm is raised upward to a proper distance while the fingertip remains on the surface 201. Because users hold their palms relative to the index point, the index point may preferably be easy for them to identify. One example of the index point is the center of the palm. In actual use, the center of the palm does not have to be used as the index point; for example, the base of the thumb or the like may also be used. In that case, the index point is determined as a location relative to the center of the palm, and it may be indicated on the surface 201 by a marking or the like. If a location significantly remote from the center is selected, however, the detection error becomes large. Thus the center of the palm may be preferable as the index point.
As in
Next, actions of the biometric authentication apparatus 100 are described.
Then, the fingertip location detector 12 detects the location of the fingertip (step S2). Then, the fingertip location detector 12 determines whether the detection of the fingertip has succeeded (step S3). When “No” is determined in step S3, the processing from step S1 is performed again. Referring to
When the terminal 200 is a tablet terminal, the location of the fingertip can be detected more easily. Because a touch panel is used as the input equipment 106 in that case, the location with which the fingertip is in contact can be detected accurately, and the image sensor 105 can be disposed below the screen. As described above, a dedicated sensor can also be used to detect the location of the fingertip, but such a dedicated sensor is optional.
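As one possible illustration of steps S1 through S3 on a tablet terminal, the following sketch treats the coordinates reported by the touch panel as the fingertip contact location and accepts the detection only when the contact lies within a plausible distance band from the index point A. The callback name, the coordinates of A, and the distance band are hypothetical values chosen for illustration; real touch-panel interfaces differ by platform.

```python
import math

INDEX_POINT_A = (0.0, 0.0)            # location of the index point A on the surface 201 (cm)
MIN_DIST_CM, MAX_DIST_CM = 9.0, 17.0  # plausible fingertip-to-A distances (assumed band)

def on_touch(x_cm, y_cm):
    """Hypothetical touch-panel callback.

    Returns the fingertip location point B when detection succeeds
    (step S3: Yes), or None so that the guidance of step S1 is repeated.
    """
    dist = math.hypot(x_cm - INDEX_POINT_A[0], y_cm - INDEX_POINT_A[1])
    if MIN_DIST_CM <= dist <= MAX_DIST_CM:
        return (x_cm, y_cm)           # fingertip location point B
    return None
```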
Referring to
h = √(d² − c²)   (4)
where c is the value obtained by subtracting the distance e between the index point A and the sensor center from the distance d. Because the index point A is a fixed point on the surface 201, the distance e is also a fixed value. Accordingly, the optimal height h can be represented by the following Expression (5).
h = √(d² − (d − e)²)   (5)
Referring to
Then, the height determining unit 15 determines whether the height h of the palm detected in step S6 is within an appropriate range (step S7). The appropriate range can be set at a certain range including the optimal height h represented by the above Expression (5). For example, the appropriate range can be set at a range whose center is the optimal height h. When “No” is determined in step S7, the processing from step S5 is performed again. When “Yes” is determined in step S7, the image capture unit 16 instructs the image sensor 105 to capture an image and acquires the image (step S8).
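A minimal sketch of this control flow (steps S5 through S8) is given below: the optimal height is computed from Expression (5), and image capture is triggered only when the measured palm height falls within a range centered on that value. The sensor-reading and capture callables and the tolerance of the appropriate range are placeholders, not values taken from the embodiment.

```python
import math
import time

def optimal_height(d, e):
    """Expression (5): h = sqrt(d^2 - (d - e)^2), with e the fixed distance
    between the index point A and the sensor center."""
    return math.sqrt(d * d - (d - e) * (d - e))

def wait_and_capture(read_palm_height, capture_image, d, e, tolerance_cm=0.5):
    """Repeat steps S5-S7 until the palm height is appropriate, then capture (S8)."""
    h_opt = optimal_height(d, e)
    while True:
        h = read_palm_height()               # steps S5-S6: measure the palm height
        if abs(h - h_opt) <= tolerance_cm:   # step S7: within the appropriate range?
            return capture_image()           # step S8: acquire the image
        time.sleep(0.05)                     # otherwise measure again
```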
Then, the authentication device 20 performs the biometric authentication processing (step S9) using the image acquired in step S8. Specifically, the biometric feature extractor 21 extracts a biometric feature from the image. When the palm is used as the subject, veins in the palm, a palm print, an outline, and the like can be used as the biometric feature. Then, the biometric feature comparing unit 22 calculates the degree of similarity between the registered feature data stored in the registered feature data storage unit 23 and the biometric feature extracted by the biometric feature extractor 21. When the degree of similarity is at or above a threshold value, the biometric feature comparing unit 22 determines that the checking is successful. The result of the checking by the biometric feature comparing unit 22 is output to the display device 104 by the authentication result outputting unit 24. When the above processing is completed, the execution of the process of the flowchart ends.
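The embodiment does not fix a particular feature representation or similarity measure, so the following is only a schematic stand-in for the comparison in step S9, assuming fixed-length feature vectors, a cosine similarity, and an assumed threshold.

```python
import math

def cosine_similarity(f1, f2):
    """Similarity between two equally long feature vectors (illustrative choice)."""
    dot = sum(x * y for x, y in zip(f1, f2))
    n1 = math.sqrt(sum(x * x for x in f1))
    n2 = math.sqrt(sum(y * y for y in f2))
    return dot / (n1 * n2) if n1 and n2 else 0.0

def checking_is_successful(extracted, registered, threshold=0.9):
    """Step S9: the checking succeeds when the similarity is at or above the threshold."""
    return cosine_similarity(extracted, registered) >= threshold
```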
According to the present embodiment, because the fingertip is in contact with the surface 201, the location of the palm in the front, rear, right, and left directions is stable. In addition, because the palm is raised away from the surface 201 while the fingertip remains in contact with it, the height of the palm is stable.
Because an image is captured when the distance between the subject and the image sensor 105 meets a criterion established based on the distance between the location of the fingertip and the image sensor 105 and the distance between the index point A and the image sensor 105, the inclination of the palm is stable. Consequently, the reproducibility in inputting biometric information can be improved. This can suppress a reduction in authentication accuracy.
The inclination in the horizontal directions is easily visible and thus is relatively easy to stabilize to begin with; it can be expected that placing the fingertip stabilizes this inclination further. As for accuracy, the stable attitude can be expected to lead to high reproducibility in the acquired images and to a reduced false rejection rate. Advantages in speed (processing time) can also be expected: the stable attitude allows processing for correcting the attitude to be omitted, which reduces the calculation time required for the authentication processing. Because the location is easy to find and the operation method is simple, one of the most promising advantages is a reduction in the time required for operation by the user. If an unaccustomed user operates the apparatus in accordance with instructions, the operation may take several tens of seconds because he or she arrives at a valid attitude gradually by trial and error. In contrast, when the technique of the present embodiment is used, the user has only to place his or her hand and then raise the palm, so that image capture is completed in several seconds.
The height determining unit 15 may detect the attitude of a subject by measuring the distances between the surface of the palm and the sensor surface of the image sensor 105 at a plurality of sites. For example, as illustrated in
As described above, in the present embodiment, the fingertip location point B can be determined by placement of the user's hand with reference to the index point A. The index point A in
In the discussion up to here, it is intended that the center of the palm when the hand is held coincide with the center of the optical axis. However, as illustrated in
The distance e between the sensor center O of the image sensor 105 and the index point A satisfies the following Expression (6), where c is the distance between the index point A and the fingertip location point B. In other words, the distance between the index point A and the sensor center O of the image sensor 105 decreases as the height h decreases, that is, as the angle of view α increases.
e = √(c² − h²) − c   (6)
In the case where hands of different sizes are used, when the distance c′ between B′ and C′ is k×c, the distance e′ between O and A′ is k×e, and thus the difference Δ between A and A′ is (1−k)×e. As calculated above, when the average hand size for adult males is used as the standard, e is 1.3 cm. The minimum value of the palm length for adults (females) is 8.7 cm and that of the finger length is 6.2 cm, so c = 8.7/2 + 6.2 = 10.55 cm. From k = 0.78, Δ = 0.29 cm. Hence, the finger is placed in a location displaced toward the fingertip location point B by 0.29 cm. This is no more than 3 mm and falls within the range of variation that occurs when the hand is placed, so it causes no problem.
The effect on image capture is discussed below for a better understanding. The field of view shifts by Δ × sin β, and an image is taken in a location higher by Δ × sin α. Thus, if there is no margin, the region on the finger side of the palm may be lacking by Δ(sin β − sin α). In the case of the design example given above for Expression (3), because α is 40° and β is 66°, the lacking range is 0.078 cm, which is no more than 1 mm, so the effect is very small. Therefore, if there is a margin in the field of view, no problem arises. In contrast, in the case of a large hand, the maximum value of the palm size (for males) is 13.1 cm and that of the finger length is 9 cm, so c = 13.1/2 + 9 = 15.55 cm. From k = 1.14, Δ = 0.18 cm. Thus, the hand is placed in a location remote from the point B by 0.18 cm. This difference is on the order of 2 mm, and its effect is smaller than in the previous case, so there is no problem.
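For reference, the arithmetic of the two preceding paragraphs can be reproduced as follows, under the stated assumptions that the standard hand has a palm length of 11.3 cm and a finger length of 8 cm, that e is about 1.3 cm for that hand, and that α = 40° and β = 66° as in the design example; small differences from the quoted figures are due only to rounding.

```python
import math

E_STANDARD = 1.3                       # cm, A-to-sensor-center distance for the standard hand
C_STANDARD = 11.3 / 2 + 8.0            # cm, fingertip-to-palm-center distance, standard hand
ALPHA, BETA = math.radians(40), math.radians(66)

def displacement(palm_length_cm, finger_length_cm):
    """Delta = (1 - k) * e for a hand whose distance c is k times the standard."""
    c = palm_length_cm / 2 + finger_length_cm
    k = c / C_STANDARD
    return abs(1 - k) * E_STANDARD

delta_small = displacement(8.7, 6.2)   # smallest adult (female) hand
delta_large = displacement(13.1, 9.0)  # largest (male) hand
lack = delta_small * (math.sin(BETA) - math.sin(ALPHA))

print(f"delta (small hand): {delta_small:.2f} cm")   # about 0.29-0.30 cm
print(f"possible lack:      {lack:.3f} cm")          # about 0.08 cm
print(f"delta (large hand): {delta_large:.2f} cm")   # about 0.18 cm
```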
In the above-described examples, the fingertip is in contact with the surface 201, and the palm is raised such that the point of contact acts as a pivot. The embodiment is not limited to the above examples. For example, the base of the palm may be in contact with the surface 201, and the fingers and the palm may be raised such that the point of contact acts as a pivot. Bringing any location of the palm into contact with the surface 201 and raising the other portions such that the point of contact acts as a pivot can stabilize the attitude of the palm. In that case, the mounting angle of the image sensor 105 may preferably be set such that the optical axis is substantially orthogonal to the inclined palm, whichever location is used as the point of contact.
In the above-described examples, the palm is used as a subject whose image is captured in a noncontact manner. The embodiment is not limited to the above examples; for example, another portion, such as a face, may be targeted. In the above-described examples, a request to the user regarding the location of the subject is displayed on the display device 104. Other informing media, such as sound, may also be used. In the example illustrated in
In the above-described examples, the fingertip location detector 12 functions as a detector configured to detect a location where a surface of a casing and a subject are in contact with each other. The height determining unit 15 functions as an image capture controller configured to perform control such that an image is captured when a distance between the subject and a sensor meets a certain criterion and functions as an attitude detector configured to detect an attitude of the subject by detecting the distances between the surface of the casing and the subject at a plurality of sites. The instructing unit 11 functions as an outputting unit configured to output an alert when the attitude detected by the attitude detector does not meet the certain condition.
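As a concrete, hedged sketch of the attitude-detection and alert roles described above, the following assumes that the palm-to-sensor distance has been measured at several sites and treats the palm as properly held when the spread between those measurements is small; the tolerance and function names are assumptions made only for this illustration.

```python
def palm_is_level(site_distances_cm, tolerance_cm=1.0):
    """Attitude check: the distances measured at a plurality of sites on the
    palm should be nearly equal when the palm squarely faces the sensor."""
    return max(site_distances_cm) - min(site_distances_cm) <= tolerance_cm

def check_attitude(site_distances_cm, show_alert):
    """Output an alert (for example, on the display device 104) when the
    detected attitude does not meet the condition."""
    if palm_is_level(site_distances_cm):
        return True
    show_alert("Please hold your palm parallel to the sensor surface.")
    return False
```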
The embodiment of the present disclosure is described above. The present disclosure is not limited to a particular embodiment, and various modifications and changes may be made within the scope of the disclosure as defined in the claims.
All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiment(s) of the present invention(s) has(have) been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
Number | Date | Country | Kind |
---|---|---|---|
2013-154839 | Jul 2013 | JP | national |
Number | Name | Date | Kind |
---|---|---|---|
6879710 | Hinoue | Apr 2005 | B1 |
20050104968 | Aoki et al. | May 2005 | A1 |
20050148876 | Endoh | Jul 2005 | A1 |
20050286744 | Yoshizu et al. | Dec 2005 | A1 |
20060023919 | Okamura et al. | Feb 2006 | A1 |
20060228004 | Sato | Oct 2006 | A1 |
20060290781 | Hama | Dec 2006 | A1 |
20070196096 | Naruse | Aug 2007 | A1 |
20080107309 | Cerni | May 2008 | A1 |
20090093727 | Sato | Apr 2009 | A1 |
20130027184 | Endoh | Jan 2013 | A1 |
20130127709 | Spielberg | May 2013 | A1 |
20130243264 | Aoki | Sep 2013 | A1 |
20150324566 | Miura | Nov 2015 | A1 |
Number | Date | Country |
---|---|---|
2006-011988 | Jan 2006 | JP |
2006-252034 | Sep 2006 | JP |
2007-010346 | Jan 2007 | JP |
2007-215952 | Aug 2007 | JP |
2007-233981 | Sep 2007 | JP |
2012-208687 | Oct 2012 | JP |
WO 2004084140 | Sep 2004 | WO |
WO 2012014304 | Feb 2012 | WO |
WO 2013069372 | May 2013 | WO
Extended European Search Report dated May 21, 2015 in related European Application No. 14175200.6.
Number | Date | Country
---|---|---
20150029319 A1 | Jan 2015 | US