This invention relates to biometric acquisition, identification, fraud detection, and security systems and methods, more particularly to biometric systems and methods which employ iris recognition, inter alia. Biometric acquisition systems generally employ cameras, lenses, illumination, processors, and reporting functions. When such systems detect the identity of a subject, they can issue a signal, open a gate, sound an alarm, alert operators, or merely record the detection. Some biometric acquisition systems require a card swipe or other means of informing the system of the purported identity of a subject.
Previous systems were primarily kiosk-based, where the cameras and illumination were directly in front of the user, looking directly towards them.
More recently, “walk through” biometric identification systems have been disclosed. Walk-through systems are designed to verify the identity of subjects who pass through an opening such as an airport gate, a door, or the like, by illuminating the subject as he or she passes through, acquiring an image of one or two irises and/or facial features of the subject, applying an algorithm to the image to generate a set of data, and comparing the resultant set or sets of data to stored sets of data using pre-designated criteria, thereby determining whether there is a match between the subject's iris and/or facial features and registered identities. In the prior art systems, the cameras were mounted in a device situated directly facing the user, such that the user had to walk around the cameras.
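By way of illustration only, the following sketch shows the encode-and-compare step described above in schematic form. It is not the patent's algorithm; the packed-bit template model, the 2048-bit length, and the 0.32 threshold are illustrative assumptions.

```python
# Illustrative only: templates modeled as packed bit strings; the length and
# threshold below are assumptions, not values specified by the invention.

TEMPLATE_BITS = 2048
MATCH_THRESHOLD = 0.32   # maximum fraction of disagreeing bits

def hamming_fraction(a: int, b: int) -> float:
    """Fraction of template bits on which two iris codes disagree."""
    return bin(a ^ b).count("1") / TEMPLATE_BITS

def match_identity(probe: int, enrolled: dict[str, int]) -> str | None:
    """Compare one acquired set of data against the stored sets of data,
    using the pre-designated criterion (the threshold) to declare a match."""
    for name, stored in enrolled.items():
        if hamming_fraction(probe, stored) <= MATCH_THRESHOLD:
            return name
    return None
```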
There are problems with existing systems which have prevented them from being widely adopted. A major disadvantage of this arrangement of illuminators and cameras is that the person has to stop or change direction of motion to avoid striking the cameras and illuminators; this approach has nevertheless been the state of the art in iris recognition for decades. Also, when the illumination in prior systems is continually turned on at normal levels, the illuminator must be replaced at frequent intervals. Also, the systems may be fooled into illuminating when non-subjects walk past them rather than when subjects walk through them.
Some systems use multiple sensors to acquire data through spectral filters at the same time, which is inefficient.
Further, some systems are not capable of integrating with non-biometric devices such as classic identification-card systems in which a card is swiped or touched to a detector.
There has been a long-felt need in the field of biometric detection systems and methods for more efficient and effective detection, identification, and security.
These needs and others, as will become apparent from the following description and drawings, are achieved by the present invention, which comprises in one aspect a system for determining the identity of a subject while the subject is walking or being transported along in an essentially straight direction, the two-dimensional profile of the subject forming a three-dimensional swept volume, without requiring the subject to change direction to avoid any part of the system, the system being adapted to acquire one or more biometrics of the subject and determine if the acquired biometrics match corresponding biometric data stored in the system, the system comprising one or more cameras and one or more infrared illuminators which are strobed or scanned, wherein the cameras are positioned above, next to, or below the swept volume, and the illuminators are positioned above, next to, or below the swept volume.
In another aspect, the invention comprises a method of determining the identity of a subject while the subject is walking or being transported in an essentially straight direction, the two-dimensional profile of the subject forming a three-dimensional swept volume, without requiring the subject to change direction to avoid any part of the system, the method comprising positioning one or more cameras and one or more strobed or scanned infrared illuminators above, next to, or below the swept volume, acquiring data related to one or more biometrics of the subject with the camera(s), processing the acquired biometric data, and determining if the acquired biometric data match corresponding biometric data stored in the system.
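The acquisition step can be pictured as a camera exposure synchronized to a short infrared strobe while the subject traverses the swept volume. The sketch below is a schematic under that assumption; the strobe_pulse and grab_frame driver calls are hypothetical, not an actual hardware API.

```python
import time

FRAME_PERIOD_S = 1 / 30      # 30 frames per second (illustrative)
STROBE_PULSE_S = 0.002       # short IR pulse fired only during exposure

def acquire_while_walking(camera, illuminator, duration_s: float) -> list:
    """Collect candidate frames while the subject walks through the swept
    volume, strobing the illuminator in step with each exposure."""
    frames = []
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        illuminator.strobe_pulse(STROBE_PULSE_S)   # hypothetical driver call
        frames.append(camera.grab_frame())         # hypothetical driver call
        time.sleep(FRAME_PERIOD_S)
    return frames
```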
The invention results in an order-of-magnitude increase in the throughput of people that can be processed by such a system. The gain in throughput can be measured by time-and-motion analysis: we have shown that traditional systems, in which users faced the cameras or lighting, imposed delays from finding the kiosk or location to stop, waiting for the prior person, putting down bags, reading instructions, waiting for the device to operate, picking up bags, finding the new direction in which to walk, and then walking to the new location. Even if each of these steps takes only 2 seconds, the cumulative time to perform a biometric reading can reach 10 seconds or more (eight 2-second steps already total 16 seconds).
In one preferred configuration, in which the user looks at the camera while walking, the lighting and cameras are not co-located, since the user may be wearing glasses. In addition, the lighting and cameras should not be too far from the user, or else the signal-to-noise ratio of the image will be too low to acquire biometric imagery of sufficient quality for matching. Further, in embodiments of the invention where the user looks straight ahead or at arbitrary points that are not necessarily the camera location, the angle of the eye to the camera should not be too large, or else image foreshortening will result in a smaller-than-required segment of the iris being captured. It is preferred that the lighting and cameras match the physical characteristics of doorways and other locations through which individuals walk.
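The foreshortening constraint can be quantified: the projected width of the iris falls off roughly as the cosine of the eye-to-camera angle. The figures below (iris diameter, sensor resolution, quality floor) are illustrative assumptions, not values from this disclosure.

```python
import math

IRIS_DIAMETER_MM = 11.0        # typical human iris diameter (approximate)
PIXELS_PER_MM = 18.0           # resolution at the subject, on axis (assumed)
MIN_PIXELS_ACROSS_IRIS = 140   # quality floor for matching (assumed)

def pixels_across_iris(angle_deg: float) -> float:
    """Projected iris width in pixels at a given eye-to-camera angle."""
    return IRIS_DIAMETER_MM * PIXELS_PER_MM * math.cos(math.radians(angle_deg))

for angle in (0, 15, 30, 45):
    px = pixels_across_iris(angle)
    print(f"{angle:2d} deg -> {px:5.1f} px across iris, usable: {px >= MIN_PIXELS_ACROSS_IRIS}")
```

Under these assumed numbers the usable margin vanishes near 45 degrees, which is consistent with the approximately 45-degree mounting boundary discussed later.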
Another aspect of the invention is a system for determining the identity of a subject walking or being transported in an essentially straight direction, comprising a motion sensor and an illuminator adapted to be turned on when motion is detected.
A still further aspect is placing the camera above, next to, or below a doorway or portal and acquiring the biometric data when the horizontal distance between the subject and at least one of the cameras is between 97.28 and 201.93 cm.
In applications wherein at least one camera is positioned on a counter and used in a point-of-sale identity verification application, the horizontal distance between the camera and the area of biometric acquisition on the subject is preferably about 0.15 to 1.2 meters.
Another aspect of the invention is determining the identity of a subject, comprising employing at least one camera and at least one illuminator adapted to illuminate at a level sufficient to detect the potential presence of biometric data and, upon detection, to illuminate at a higher level sufficient to acquire biometric data with improved signal-to-noise ratio.
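Schematically, this two-level scheme idles the illuminator at a low level and raises it only when a candidate is detected. The sketch below assumes a hypothetical set_level driver call and a caller-supplied detector function; the drive levels are illustrative.

```python
LOW_LEVEL = 0.15    # enough to detect the potential presence of biometric data
HIGH_LEVEL = 1.00   # enough signal-to-noise to acquire matchable imagery

def acquire_with_two_levels(illuminator, camera, eye_candidate_present):
    """Idle at the low level; strobe to the high level only on detection."""
    illuminator.set_level(LOW_LEVEL)           # hypothetical driver call
    while True:
        frame = camera.grab_frame()
        if eye_candidate_present(frame):       # potential biometric detected
            illuminator.set_level(HIGH_LEVEL)  # improved signal-to-noise
            high_snr_frame = camera.grab_frame()
            illuminator.set_level(LOW_LEVEL)   # drop back to extend lamp life
            return high_snr_frame
```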
While the invention is capable of many embodiments, for purposes of illustration a few embodiments are described below with reference to the drawings wherein
Presence Detector modules have been used previously, but such detectors do not localize the user's position well and do not localize the user's orientation at all. For example, in an airport application, the biometric system may be at a boarding gate in order to verify the identity of boarding passengers, but numerous passengers may be walking orthogonal to the device while in transit to other gates, thereby triggering the infrared or microwave detectors continuously. In another example, a stream of users, one behind another, may be continuously using the device. This will result in the Presence Detector being continually triggered. Since the Presence Detector triggers the illumination, the illumination will be continually turned on. This greatly increases the degree to which subjects, particularly staff standing continuously by the device, are irradiated. It also reduces the lifetime of the illuminators.
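One way to avoid such continual triggering, sketched below purely for illustration, is to gate the illumination on both the range and the heading of a tracked person, so that traffic passing orthogonally never fires the strobe. The Track fields and thresholds are assumptions of this sketch, not structures defined by the invention.

```python
from dataclasses import dataclass

@dataclass
class Track:
    distance_m: float    # range from the portal along its axis
    heading_deg: float   # 0 = walking straight through the portal

MAX_RANGE_M = 3.0        # must be inside the swept volume (assumed)
MAX_HEADING_DEG = 30.0   # reject passers-by crossing orthogonally (assumed)

def should_illuminate(track: Track) -> bool:
    """Fire the illuminator only for subjects actually entering the portal."""
    return (track.distance_m <= MAX_RANGE_M
            and abs(track.heading_deg) <= MAX_HEADING_DEG)
```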
We also describe fraud resistance in order to prevent a user from purporting to be another user. For example, photographs of the face or iris can sometimes be used in place of the user's face or iris in order to purport another identity. Several methods have been proposed for fraud detection. For example, U.S. Pat. No. 6,760,467 describes a method of turning two illuminators on and off to detect whether a live eye is in the scene, by determining if the specularity from the illuminator is present in the acquired image at the expected time. In another method, it is known that skin and other living tissue has a particular spectral signature that can be differentiated from paper or other synthetic material. This approach either requires multiple sensors to acquire data through spectral filters at the same time, or requires a single sensor to acquire data under different spectral illumination at different times. However, while we wish to acquire single-spectrum data to analyze the spectral signature, we still wish to acquire wide-spectrum data that can be reliably used for biometric matching.
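The specularity-timing test attributed above to U.S. Pat. No. 6,760,467 can be pictured as follows. This is a schematic reading of that method, not its claimed implementation; the find_specularity helper and the illuminator objects are hypothetical.

```python
def live_eye_check(camera, illum_a, illum_b, find_specularity, cycles: int = 4) -> bool:
    """Alternate two illuminators; pass only if the corneal highlight appears
    at the expected time, i.e. tracks whichever illuminator is active."""
    for _ in range(cycles):
        for active, idle in ((illum_a, illum_b), (illum_b, illum_a)):
            idle.off()                       # hypothetical driver calls
            active.on()
            spot = find_specularity(camera.grab_frame())
            # A flat photograph will not produce a highlight that follows
            # the alternation pattern at the expected image position.
            if spot is None or not active.expected_region.contains(spot):
                return False
    return True
```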
We also describe a back-end biometric architecture that allows a biometric device to be integrated into a non-biometric architecture without losing the additional functionality that the biometric device enables, and where either type of device can be managed and deployed in the same way. More specifically, most biometric architectures have been developed to optimize biometric performance with minimal or no consideration of the large base of installed non-biometric devices and architectures, and have assumed that integrators will be specially trained to integrate biometric devices. These are amongst the factors that have limited the widespread deployment of biometric devices. As discussed later, this is especially relevant when a biometric device is used to compare a biometric template acquired from a user with more than one template in the database (recognition) as opposed to comparing a template with just one candidate template in the database (verification).
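One possible shape for such an architecture, offered only as an assumption for illustration, is to reduce both card readers and biometric readers to a common device interface that emits candidate-identity events, so the back end manages either kind identically.

```python
from typing import Protocol

class IdentityDevice(Protocol):
    """Common interface assumed for this sketch: card readers and biometric
    readers both reduce to 'poll for a candidate identity token'."""
    device_id: str
    def poll(self) -> str | None: ...

def access_loop(devices: list[IdentityDevice], is_authorized) -> None:
    """The back end treats every device the same way, biometric or not."""
    for dev in devices:
        token = dev.poll()
        if token is not None and is_authorized(token):
            print(f"{dev.device_id}: access granted to {token}")
```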
In the case of verification, a card swipe or other means of identification sends a small set of data to a processor for one-to-one comparison. In the case of recognition, however, the processor needs to be capable of performing one-to-many comparisons rapidly, and the comparisons are typically not simple digit matching; they are biometric template matching, which typically takes substantially more processing. As a result, custom biometric match processors have been developed to perform the biometric matching. However, these custom match processors have biometric databases that are typically managed separately from the standard database that may already exist at a deployment. We propose an architecture in which non-biometric devices and biometric devices can co-exist without any loss of functionality, and in which either type of device can be managed and deployed in the same way.
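The difference in processing load between the two modes can be seen directly in a sketch (same illustrative packed-bit template model as earlier; not the proposed architecture itself): verification performs one template comparison, while recognition must sweep the entire enrolled database.

```python
TEMPLATE_BITS = 2048
MATCH_THRESHOLD = 0.32   # illustrative values, as in the earlier sketch

def hamming_fraction(a: int, b: int) -> float:
    return bin(a ^ b).count("1") / TEMPLATE_BITS

def verify(probe: int, db: dict[str, int], claimed_id: str) -> bool:
    """One-to-one: the card swipe supplies the claimed identity."""
    return hamming_fraction(probe, db[claimed_id]) <= MATCH_THRESHOLD

def recognize(probe: int, db: dict[str, int]) -> str | None:
    """One-to-many: every enrolled template must be compared."""
    hits = (name for name, stored in db.items()
            if hamming_fraction(probe, stored) <= MATCH_THRESHOLD)
    return next(hits, None)
```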
The preferred distance between the camera or cameras and the subject at the time of acquiring biometric data depends on the particular application. For example, when there is no equipment in the way, the preferred horizontal distance is 0.45 m to 3.15 m where the camera or cameras are mounted above the swept volume, for example for access control at doorways and identification at airports, border crossings, and hotels. For certain embodiments where one or more cameras are mounted above or to one side of the swept volume, the preferred distance is 0.3 m to 5.72 m, such that the angle between the optical axis of the one or more cameras and a vector defining the path of the swept volume is less than approximately 45 degrees, for example for identification through windshields of cars, as well as access control at doorways and identity validation at airports, border crossings, or hotels where mounting of cameras from above is not preferred. For certain embodiments where one or more cameras are mounted to one side of the swept volume, such that the angle between the optical axis of the one or more cameras and a vector defining the path of the swept volume is greater than approximately 45 degrees, a preferred distance is 0.1 m to 2.8 m, for example in border control lanes and at point-of-sale terminals. A distance of 0.1 m to 5.72 m is preferred for certain embodiments for access control at doorways, identification at airports, border crossings, and hotels, identification through windshields of cars, border control lanes, point-of-sale identification, and desk-based or kiosk-based identification, or identification using a substantially mobile device, especially where the illumination is scanned.
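For clarity, the preferred ranges recited above can be tabulated as data (the ranges are verbatim from the text; the dictionary form is merely a convenience of this sketch):

```python
PREFERRED_RANGES_M = {
    "cameras above the swept volume":                    (0.45, 3.15),
    "above or to one side, axis < ~45 deg to the path":  (0.30, 5.72),
    "to one side, axis > ~45 deg to the path":           (0.10, 2.80),
    "general case, especially scanned illumination":     (0.10, 5.72),
}

def in_preferred_range(configuration: str, distance_m: float) -> bool:
    lo, hi = PREFERRED_RANGES_M[configuration]
    return lo <= distance_m <= hi
```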
In embodiments wherein the housing 12 is placed above a doorway or portal, the biometric data is preferably acquired when the horizontal distance between the subject and at least one of the cameras is between 97.28 and 201.93 cm.
The present invention, therefore, is well adapted to carry out the objects and attain the ends and advantages mentioned, as well as others inherent therein. While the invention has been depicted and described and is defined by reference to particular preferred embodiments of the invention, such references do not imply a limitation on the invention, and no such limitation is to be inferred. The invention is capable of considerable modification, alteration and equivalents in form and function, as will occur to those ordinarily skilled in the pertinent arts. The depicted and described preferred embodiments of the invention are exemplary only and are not exhaustive of the scope of the invention. Consequently, the invention is intended to be limited only by the spirit and scope of the appended claims, giving full cognizance to equivalents in all respects.
This application is a continuation of and claims priority to U.S. application Ser. No. 13/797,258, filed Mar. 12, 2013, entitled “COMPACT BIOMETRIC ACQUISITION SYSTEM AND METHOD”, which is a continuation of and claims priority to U.S. application Ser. No. 12/441,881, filed Mar. 18, 2009 entitled “COMPACT BIOMETRIC ACQUISITION SYSTEM AND METHOD” issued as U.S. Pat. No. 8,965,063 on Feb. 24, 2015, which is a national stage entry of International application PCT/US2007/79160, filed Sep. 21, 2007, entitled “COMPACT BIOMETRIC ACQUISITION SYSTEM AND METHOD”, which claims priority to U.S. provisional application 60/826,560 filed Sep. 22, 2006, entitled “COMPACT BIOMETRIC ACQUISITION SYSTEM AND ARCHITECTURE”, all of which are hereby incorporated by reference for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
4641349 | Flom et al. | Feb 1987 | A |
5259040 | Hanna | Nov 1993 | A |
5291560 | Daugman | Mar 1994 | A |
5359669 | Shanley et al. | Oct 1994 | A |
5488675 | Hanna | Jan 1996 | A |
5572596 | Wildes et al. | Nov 1996 | A |
5581629 | Hanna et al. | Dec 1996 | A |
5613012 | Hoffman et al. | Mar 1997 | A |
5615277 | Hoffman | Mar 1997 | A |
5737439 | Lapsley et al. | Apr 1998 | A |
5764789 | Pare et al. | Jun 1998 | A |
5802199 | Pare et al. | Sep 1998 | A |
5805719 | Pare et al. | Sep 1998 | A |
5838812 | Pare et al. | Nov 1998 | A |
5901238 | Matsushita | May 1999 | A |
5953440 | Zhang et al. | Sep 1999 | A |
5978494 | Zhang | Nov 1999 | A |
6021210 | Camus et al. | Feb 2000 | A |
6028949 | McKendall | Feb 2000 | A |
6064752 | Rozmus et al. | May 2000 | A |
6069967 | Rozmus et al. | May 2000 | A |
6079862 | Kawashima et al. | Jun 2000 | A |
6082858 | Grace | Jul 2000 | A |
6119096 | Mann | Sep 2000 | A |
6144754 | Okano et al. | Nov 2000 | A |
6192142 | Pare et al. | Feb 2001 | B1 |
6247813 | Kim et al. | Jun 2001 | B1 |
6252977 | Salganicoff et al. | Jun 2001 | B1 |
6289113 | McHugh et al. | Sep 2001 | B1 |
6320610 | Van Sant | Nov 2001 | B1 |
6366682 | Hoffman et al. | Apr 2002 | B1 |
6373968 | Okano et al. | Apr 2002 | B2 |
6377699 | Musgrave et al. | Apr 2002 | B1 |
6424727 | Musgrave et al. | Jul 2002 | B1 |
6483930 | Musgrave et al. | Nov 2002 | B1 |
6532298 | Cambier et al. | Mar 2003 | B1 |
6540392 | Braithwaite | Apr 2003 | B1 |
6542624 | Oda | Apr 2003 | B1 |
6546121 | Oda | Apr 2003 | B1 |
6594376 | Hoffman et al. | Jul 2003 | B2 |
6594377 | Kim et al. | Jul 2003 | B1 |
6652099 | Chae et al. | Nov 2003 | B2 |
6700998 | Murata | Mar 2004 | B1 |
6714665 | Hanna et al. | Mar 2004 | B1 |
6760467 | Min et al. | Jul 2004 | B1 |
6813010 | Kono et al. | Nov 2004 | B2 |
6820979 | Stark | Nov 2004 | B1 |
6850631 | Oda et al. | Feb 2005 | B1 |
6917695 | Teng et al. | Jul 2005 | B2 |
6920236 | Prokoski | Jul 2005 | B2 |
6980670 | Hoffman et al. | Dec 2005 | B1 |
6985608 | Hoffman et al. | Jan 2006 | B2 |
7091471 | Wenstrand | Aug 2006 | B2 |
7095901 | Lee et al. | Aug 2006 | B2 |
7146027 | Kim et al. | Dec 2006 | B2 |
7167201 | Stavely | Jan 2007 | B2 |
7248719 | Hoffman et al. | Jul 2007 | B2 |
7271939 | Kono | Sep 2007 | B2 |
7385626 | Aggarwal et al. | Jun 2008 | B2 |
7414737 | Cottard et al. | Aug 2008 | B2 |
7417727 | Polonskiy | Aug 2008 | B2 |
7418115 | Northcott et al. | Aug 2008 | B2 |
7428320 | Northcott et al. | Sep 2008 | B2 |
7430365 | Ng et al. | Sep 2008 | B2 |
7542590 | Robinson et al. | Jun 2009 | B1 |
7558406 | Robinson et al. | Jul 2009 | B1 |
7558407 | Hoffman et al. | Jul 2009 | B2 |
7574021 | Matey | Aug 2009 | B2 |
7583822 | Guillemot et al. | Sep 2009 | B2 |
7599526 | Ono et al. | Oct 2009 | B2 |
7606401 | Hoffman et al. | Oct 2009 | B2 |
7616788 | Hsieh et al. | Nov 2009 | B2 |
7627147 | Lolacono | Dec 2009 | B2 |
7639840 | Hanna et al. | Dec 2009 | B2 |
7693307 | Rieul et al. | Apr 2010 | B2 |
7697786 | Camus et al. | Apr 2010 | B2 |
7715595 | Kim et al. | May 2010 | B2 |
7719566 | Guichard | May 2010 | B2 |
7797606 | Chabanne | Sep 2010 | B2 |
7801335 | Hanna et al. | Sep 2010 | B2 |
7869627 | Northcott et al. | Jan 2011 | B2 |
7929732 | Bringer et al. | Apr 2011 | B2 |
7978883 | Rouh et al. | Jul 2011 | B2 |
8009876 | Kim et al. | Aug 2011 | B2 |
8025399 | Northcott et al. | Sep 2011 | B2 |
8064647 | Bazakos | Nov 2011 | B2 |
8092021 | Northcott et al. | Jan 2012 | B1 |
8132912 | Northcott et al. | Mar 2012 | B1 |
8170295 | Fujii et al. | May 2012 | B2 |
8195044 | Hanna et al. | Jun 2012 | B2 |
8233680 | Bringer et al. | Jul 2012 | B2 |
8243133 | Northcott et al. | Aug 2012 | B1 |
8279042 | Beenau et al. | Oct 2012 | B2 |
8317325 | Raguin et al. | Nov 2012 | B2 |
8606097 | Hanna et al. | Dec 2013 | B2 |
20010031072 | Dobashi et al. | Oct 2001 | A1 |
20030020828 | Ooi et al. | Jan 2003 | A1 |
20030142853 | Waehner | Jul 2003 | A1 |
20030152251 | Ike | Aug 2003 | A1 |
20030169334 | Braithwaite et al. | Sep 2003 | A1 |
20040240711 | Hamza | Dec 2004 | A1 |
20050084137 | Kim et al. | Apr 2005 | A1 |
20050084179 | Hanna | Apr 2005 | A1 |
20050089197 | Iwasaki et al. | Apr 2005 | A1 |
20050129285 | Mino et al. | Jun 2005 | A1 |
20050265585 | Rowe | Dec 2005 | A1 |
20060074986 | Mallalieu et al. | Apr 2006 | A1 |
20060165265 | Fujimatsu | Jul 2006 | A1 |
20060202028 | Rowe et al. | Sep 2006 | A1 |
20060204050 | Takizawa | Sep 2006 | A1 |
20070211922 | Crowley et al. | Sep 2007 | A1 |
20080002863 | Northcott | Jan 2008 | A1 |
20080181467 | Zappia | Jul 2008 | A1 |
20090074256 | Haddad | Mar 2009 | A1 |
20090097715 | Cottard et al. | Apr 2009 | A1 |
20090161925 | Cottard et al. | Jun 2009 | A1 |
20090231096 | Bringer et al. | Sep 2009 | A1 |
20100021016 | Cottard et al. | Jan 2010 | A1 |
20100074477 | Fujii et al. | Mar 2010 | A1 |
20100127826 | Saliba et al. | May 2010 | A1 |
20100246903 | Cottard | Sep 2010 | A1 |
20100278394 | Raguin et al. | Nov 2010 | A1 |
20100310070 | Bringer et al. | Dec 2010 | A1 |
20110158486 | Bringer et al. | Jun 2011 | A1 |
20110194738 | Choi et al. | Aug 2011 | A1 |
20110277518 | Lais et al. | Nov 2011 | A1 |
20120240223 | Tu | Sep 2012 | A1 |
20120257797 | Leyvand et al. | Oct 2012 | A1 |
Number | Date | Country |
---|---|---|
2006031185 | Feb 2006 | JP |
WO-2005008567 | Jan 2005 | WO |
WO-2010062371 | Jun 2010 | WO |
WO-2011093538 | Aug 2011 | WO |
Entry |
---|
B. Galvin, et al., Recovering Motion Fields: An Evaluation of Eight Optical Flow Algorithms, Proc. of the British Machine Vision Conf. (1998). |
Christopher Boyce, Arun Ross, Matthew Monaco, Lawrence Hornak and Xin Li, “Multispectral Iris Analysis: A Preliminary Study”, Proceedings of Computer Vision and Pattern Recognition Workshop on Biometrics, Jun. 2006, New York NY, pp. 1-9. |
European Partial Search Report on 07842962.8 dated Apr. 20, 2015. |
Examination Report for EP 07842962.8 dated Apr. 7, 2017. |
Extended European Search Report on 07842962.8 dated Aug. 13, 2015. |
International Report on Patentability on PCT/US2007/079160 dated Mar. 24, 2009. |
International Search Report on PCT/US2007/079160 dated Jan. 30, 2008. |
J. R. Bergen, et al., Hierarchical Model-Based Motion Estimation, European Conf. on Computer Vision (1993). |
K. Nishino, et al., The World in an Eye, IEEE Conf. on Pattern Recognition, vol. 1, at pp. 444-451 (Jun. 2004). |
R. Kumar, et al., Direct recovery of shape from multiple views: a parallax based approach, 12th IAPR Int'l Conf. on Pattern Recognition (1994). |
R. P. Wildes, Iris Recognition: An Emerging Biometric Technology, Proc. IEEE 85(9) at pp. 1348-1363 (Sep. 1997). |
U.S. Office Action on U.S. Appl. No. 13/797,258 dated Jun. 23, 2014. |
Written Opinion on PCT/US2007/079160 dated Jan. 30, 2008. |
Summons to attend oral proceedings pursuant to Rule 115(1) EPC on EP appl. No. 07842962.8-1207 dated Feb. 23, 2018 (pp. 1-6). |
Number | Date | Country | |
---|---|---|---|
20170220862 A1 | Aug 2017 | US |
Number | Date | Country | |
---|---|---|---|
60826560 | Sep 2006 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13797258 | Mar 2013 | US |
Child | 15488011 | US | |
Parent | 12441881 | US | |
Child | 13797258 | US |