1. Field of the Invention
The present invention relates generally to systems and methods for biometric identification, and more particularly, to imaging systems and methods capable of biometric identification according to more than one modality, especially for subjects positioned at a distance from the image capture system.
2. Description of the Related Art
Due to the unique character of each individual's face or iris, various systems attempt to use either the face or the iris for biometric identification. As such, commercially available imaging systems used for biometric identification generally use a single biometric modality. In other words, these systems employ imaging systems that process images of the face or the iris, but not both. As a result, these single modal systems suffer from the limitations inherent in face-only imaging systems or iris-only imaging systems. As a further disadvantage, commercially available iris-only systems usually image one iris at a time, rather than both eyes simultaneously or near simultaneously. In addition, conventional face-only or iris-only imaging systems suffer from constraints that prevent these systems from acquiring and tracking a person among multiple persons within a specified field of view from a distance. For example, the greater the distance between the imaging system and the target, the more difficult it is to acquire images that may be used for biometric identification.
In view of the limitations of the single modal systems described previously, embodiments of the present invention provide a biometric system for capturing and combining biometric information from more than one modality. In particular, embodiments of the present invention may provide multimodal biometric systems that generate and process images from the face and the two irises of subjects. Biometrics based on a combination of data from both irises and the face, as provided by such embodiments, are more accurate and robust than using biometrics based on data from only a single iris or only the face. Furthermore, such embodiments exhibit lower fail-to-acquire (FTA) metrics than iris or face only systems and are less susceptible to spoofing.
In addition, embodiments of the present invention may provide multimodal systems that capture biometric data from subjects who are positioned at a distance from the system. For example, a multimodal biometric system may capture and process images of the face and both irises of subjects who are 50 meters away from the system. As such, the system solves the problem of capturing an image of both irises at a long distance. In particular, aspects of this system provide sufficient illumination of the iris, achieve adequate resolution with the captured iris image, and minimize the iris's exposure to any damaging illumination.
In one embodiment, a system for multimodal biometric identification includes a first imaging system that detects one or more subjects in a first field of view, where the one or more subjects includes a targeted subject having a first biometric characteristic and a second biometric characteristic. In addition, the system includes a second imaging system that captures a first image of the first biometric characteristic according to first photons reflecting from the first biometric characteristic, where the first biometric characteristic is positioned in a second field of view which is smaller than the first field of view, and the first image includes first data for biometric identification. Furthermore, the system includes a third imaging system that captures a second image of the second biometric characteristic according to second photons reflecting from the second biometric characteristic, where the second biometric characteristic is positioned in a third field of view which is smaller than the first and second fields of view, and the second image includes second data for biometric identification. At least one active illumination source emits the second photons to be reflected from the second biometric characteristic. A controller operates the first imaging system, the second imaging system, the third imaging system, and the at least one illumination source according to programmed instructions. The controller may include one or more independent sub-controllers, one or more interdependent sub-controllers, or a combination of both. In particular embodiments, the first biometric characteristic may be a face and the second biometric characteristic may be at least one iris corresponding to an eye of the targeted subject.
In yet another embodiment, a method for multimodal biometric identification includes the steps of: identifying one or more subjects in a first field of view; selecting a targeted subject from the one or more subjects, where the targeted subject has a first biometric characteristic and a second biometric characteristic; aligning a second field of view to the first biometric characteristic, where the second field of view is smaller than the first field of view; aligning a third field of view to the second biometric characteristic, where the third field of view is smaller than the first field of view and the second field of view; actively illuminating with second photons the second biometric characteristic; capturing a first image of the first biometric characteristic according to first photons, where the first image includes first data for biometric identification; and capturing a second image of the second biometric characteristic according to the second photons, where the second image includes second data for biometric identification.
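For illustration only (no code appears in the specification), the claimed method steps can be sketched as an ordered sequence. The `Subject` fields and the nearest-subject targeting policy below are assumptions, not requirements of the claims:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Subject:
    subject_id: int
    distance_m: float  # estimated range from the scene imaging system

def select_target(subjects: List[Subject]) -> Subject:
    # One plausible policy (assumed, not claimed): target the nearest subject.
    return min(subjects, key=lambda s: s.distance_m)

def capture_sequence(subjects: List[Subject]) -> List[str]:
    """Ordered steps of the method for the chosen target: progressively
    narrower fields of view, then active illumination and capture."""
    target = select_target(subjects)
    sid = target.subject_id
    return [
        f"align face FOV to subject {sid}",     # second, smaller field of view
        f"align iris FOV to subject {sid}",     # third, smallest field of view
        f"illuminate irises of subject {sid}",  # active second photons
        f"capture face image of subject {sid}",
        f"capture iris images of subject {sid}",
    ]
```

The ordering matters: the wide-view detection narrows to the face, then to the irises, and active illumination precedes iris capture.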
Embodiments of the present invention may employ subject tracking, face tracking and recognition, iris tracking from facial tracking and recognition, iris image capture, high speed iris image processing, optimal optics and illumination design, as well as compliance with applicable safety and technology standards.
These and other aspects of the present invention will become more apparent from the following detailed description of the preferred embodiments of the present invention when viewed in conjunction with the accompanying drawings.
Referring to
The scene imaging system 120 may include one or more cameras that capture images based on photons with visible, near-infrared (NIR), or infrared (IR) wavelengths. The visible wavelengths detected may be in a range of approximately 400 nm to 700 nm; the NIR wavelengths detected may be in a range of approximately 700 nm to 2 μm; and the IR wavelengths detected may be in a range of approximately 2 μm to 13 μm. In some embodiments, the scene imaging system 120 captures images through passive imaging. Passive imaging refers to the detection of photons that are initially emitted from a source external to the biometric system 100, also referred to as ambient photon generation. In certain indoor or outdoor scenarios, passive imaging by the scene imaging system 120 may detect photons with visible, NIR, and/or IR wavelengths. For example, the biometric system 100 may be used to check subjects attending a large sporting event or similar public gathering, where the ambient lighting at the venue generates a sufficient level of photons with visible wavelengths for detection by the scene imaging system 120. In other embodiments, however, the scene imaging system 120 may detect photons that are provided by an illumination source (not shown) controlled by the biometric system 100, i.e., active illumination.
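The band boundaries quoted above can be captured in a small helper. This is an illustrative sketch; the function name and return labels are not from the specification:

```python
def wavelength_band(wavelength_nm: float) -> str:
    """Classify a wavelength into the bands quoted above:
    visible ~400-700 nm, NIR ~700 nm-2 um, IR ~2-13 um."""
    if 400 <= wavelength_nm < 700:
        return "visible"
    if 700 <= wavelength_nm < 2000:
        return "NIR"
    if 2000 <= wavelength_nm <= 13000:
        return "IR"
    return "outside detected bands"
```

For example, 850 nm illumination, used elsewhere in this document for the iris, falls in the NIR band.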
The face imaging system 140 may include a camera that captures images of the face based on photons with visible, NIR, or IR wavelengths. The visible wavelengths detected may be in a range of approximately 400 nm to 700 nm; the NIR wavelengths detected may be in a range of approximately 700 nm to 2 μm; and the IR wavelengths detected may be in a range of approximately 2 μm to 13 μm. In some embodiments, the face imaging system 140 may employ passive imaging to detect photons with visible, NIR, or IR wavelengths. In other embodiments, the face imaging system 140 may detect photons that are provided by an illumination source controlled by the biometric system 100, i.e., active illumination.
The iris imaging system 160 may include a camera that captures iris images based on photons with visible or NIR wavelengths. Photons with visible or NIR wavelengths may be used for iris recognition if the iris sensor is sufficiently large and an adequately high resolution is employed. The visible wavelengths detected may have a range of approximately 400 nm to 700 nm. The NIR wavelengths detected may be in a range of approximately 700 nm to 2 μm, or preferably, a range of 700 nm to 900 nm corresponding to the wavelength requirements for the ANSI specification for Iris Image Interchange Format (ANSI INCITS 379-2004). The preferable range may generally be determined according to the existing Iris Image Interchange Format standard.
The iris sensor of the iris imaging system 160 may have a significantly higher magnification than the face sensor of the face imaging system 140. In some embodiments, commercially available sensors may be employed, where the sensors, for example, employ 752×480 pixels for each eye image, have a resolution in the range of approximately 16 to 21 pixels/mm, and have a quantum efficiency of approximately 25 to 30 percent at 850 nm illumination.
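The quoted sensor figures can be sanity-checked. Assuming a typical adult iris diameter of about 12 mm (a figure not stated in the specification), the 16 to 21 pixels/mm range yields 192 to 252 pixels across the iris, which fits comfortably within a 752×480 eye image:

```python
def iris_pixels_across(resolution_px_per_mm: float,
                       iris_diameter_mm: float = 12.0) -> float:
    """Number of sensor pixels spanning the iris at a given object-plane
    resolution. The 12 mm default diameter is an assumed typical adult value."""
    return resolution_px_per_mm * iris_diameter_mm
```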
In some embodiments, the optical design of the iris imaging system 160 may employ a zooming telescope lens having an aperture of 100 mm for standoff distances of 3 m to 6 m. For other embodiments in which very long distances are involved, telescopes having apertures of approximately 50 cm to 100 cm for a 50 m standoff may be employed. In particular, the telescope may have a Ritchey-Chrétien design, i.e., a hyperbolic Cassegrain telescope with a very flat field. In addition, the resolution may be 2 lp/mm to 4 lp/mm, thereby complying with ANSI specifications (ANSI INCITS 379-2004). Meanwhile, the opto-mechanical requirements may be met with commercially available ultra-high precision axis encoders (resolutions <0.002 arc-sec).
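The aperture figures above are consistent with a simple diffraction-limit estimate. The sketch below uses the Rayleigh criterion and assumes one line pair per two resolution elements; it is a back-of-envelope check, not the specification's optical design:

```python
def diffraction_limited_lp_per_mm(wavelength_m: float,
                                  aperture_m: float,
                                  range_m: float) -> float:
    """Best-case resolving power at the subject from the Rayleigh criterion:
    resolvable spot = 1.22 * wavelength * range / aperture, with one line
    pair spanning roughly two resolution elements."""
    spot_m = 1.22 * wavelength_m * range_m / aperture_m
    spot_mm = spot_m * 1000.0
    return 1.0 / (2.0 * spot_mm)
```

At 850 nm, a 100 mm aperture at 6 m resolves about 8 lp/mm, and a 50 cm aperture at 50 m about 4.8 lp/mm, both clearing the 2 to 4 lp/mm requirement.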
To illustrate the ability of embodiments to resolve features at a distance,
One or more illumination systems, such as the illumination system 180 in
Alternatively, rather than providing continuous wave illumination as described previously, the laser may be pulsed, for example with 50 nsec pulses at a 10 kHz repetition rate, a duty cycle of only 0.05 percent. Advantageously, employing such a quasi-CW laser reduces laser speckle.
As
As further illustrated in
In further embodiments, the PTU 195 may be used to target and track subjects. As shown in
In some embodiments, one or more beam steering systems (not shown), as are known, may additionally or alternatively be employed to direct the photons which are detected by the imaging systems 120, 140, and 160 for image capture. The beam steering systems may include galvanometric mirrors and/or imaging optics positioned on a gimbal mount. The beam steering systems may direct photons from the illumination source 180 to a biometric feature of the targeted subject 10. Additionally or alternatively, the beam steering systems may direct photons reflected from the biometric feature to the appropriate imaging system.
Embodiments of the present invention meet the safety criteria of Class I ANSI Z136. In general, the maximum permissible exposure (MPE) for continuous wave exposure at 850 nm is approximately 2 mW/cm2. As such, the illumination source 180 in some embodiments may provide illumination with a wavelength of 850 nm for up to 30,000 seconds. On the other hand, the maximum permissible exposure (MPE) for repetitive pulse exposure at 850 nm is approximately 0.56 mW/cm2. Thus, the illumination source 180 in other embodiments may provide illumination with a wavelength of 850 nm in a 10 second pulse train with 50 nsec pulses at 10 kHz. Other considerations for laser safety include the operational environment, the use of additional optical devices, such as glasses and binoculars, by targeted subjects, as well as the presence of specular surfaces.
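The pulsed figures above imply a very small duty cycle, which is what keeps the average irradiance well under the repetitive-pulse MPE. A rough average-power check follows; real ANSI Z136 compliance also involves single-pulse and thermal limits, which this sketch ignores:

```python
def average_irradiance_mw_cm2(peak_mw_cm2: float,
                              pulse_width_s: float,
                              rep_rate_hz: float) -> float:
    """Average irradiance of a pulse train: peak irradiance times duty cycle."""
    duty_cycle = pulse_width_s * rep_rate_hz
    return peak_mw_cm2 * duty_cycle
```

With 50 nsec pulses at 10 kHz, the duty cycle is 0.05 percent, so even a 100 mW/cm2 peak irradiance (an assumed value for illustration) averages to 0.05 mW/cm2.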
As illustrated in
To obtain a full 360-degree field of view for the scene imaging system 120, the scene imaging system 120 may employ a plurality of scene cameras. The cameras may be arranged so that the field of view 102 for each camera overlaps, abuts, or nearly abuts other fields of view 102, whereby the series of fields of view 102 forms a continuous or nearly continuous 360-degree field of view.
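The number of scene cameras required follows directly from each camera's field of view and the overlap shared with its neighbors. A small illustrative sketch; the overlap parameter is an assumption for illustration:

```python
import math

def cameras_for_panorama(per_camera_fov_deg: float,
                         overlap_deg: float = 0.0) -> int:
    """Minimum number of cameras whose fields of view tile a full
    360-degree panorama, each contributing its FOV minus the overlap
    shared with one neighbor."""
    effective_deg = per_camera_fov_deg - overlap_deg
    if effective_deg <= 0:
        raise ValueError("overlap must be smaller than the per-camera FOV")
    return math.ceil(360.0 / effective_deg)
```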
Alternatively, some embodiments may employ imaging systems which are all co-aligned using beam steering mirrors. As is known from other security monitoring systems, beam steering mirrors may be employed to enable the imaging systems to rotate through 360 degrees for observation.
Accordingly, some embodiments can identify multiple people within a 360-degree panoramic view. Employing such a system may require capturing images in rapid succession from a plurality of subjects who are moving within the panoramic view. Known techniques exist for stitching several detectors together to allow rapid readout of the image and thus increased frame rates. Moreover, aspects of these embodiments minimize occlusion of the subject's face and/or irises, minimize the time required to process the captured images, and overcome the constraints associated with the mechanical operation of the system.
Referring to
Thus, with the face imaging system 140 and the iris imaging system 160, the multimodal biometric system 100 generates images of the face and two irises for biometric identification. The controller 190 may operate the face imaging system 140 to capture an image of the subject's face 12 and the iris imaging system 160 to capture images of each iris 14 from the subject's right and left eyes all simultaneously, or near simultaneously.
Biometrics based on a combination of facial and iris data, as provided by the system of
Referring to
Information captured by the face imaging system 140 and the iris imaging system 160 is used to establish facial pattern recognition, iris pattern recognition, as well as biometric fusion. To achieve biometric identification, the information from the imaging systems may be used to determine a host of attributes including, but not limited to, positioning of the face or the irises, tracking of the face or irises, measurements of focus provided in the images, and interpupillary distance.
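Interpupillary distance, one of the attributes listed above, reduces to a scaled pixel distance between the detected pupil centers. A minimal sketch, assuming a known object-plane scale in mm per pixel (the function name and scale parameter are illustrative, not from the specification):

```python
def interpupillary_distance_mm(left_pupil_px, right_pupil_px,
                               mm_per_px: float) -> float:
    """Estimate interpupillary distance from pupil-center pixel
    coordinates, given the imaging scale at the subject."""
    dx = right_pupil_px[0] - left_pupil_px[0]
    dy = right_pupil_px[1] - left_pupil_px[1]
    return ((dx * dx + dy * dy) ** 0.5) * mm_per_px
```

Pupils 620 pixels apart at 0.1 mm per pixel give about 62 mm, a plausible adult value.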
For example, the software executed by the controller 190 for capturing and processing images of the face 12 and irises 14 may determine characteristics such as linear (X,Y,Z) position of the head, head pose angle, and eye-gaze angle. Head pose angle indicates pitch, yaw, and roll, where pitch refers to up-and-down rotation of the head, yaw refers to side-to-side rotation of the head, and roll refers to rotation of the head along a direction from ear to shoulder. Meanwhile, eye-gaze angle refers to the up-and-down or side-to-side viewing angle of the eyes.
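Head pose as described above is a three-angle rotation. The sketch below composes pitch, yaw, and roll into a rotation matrix; the composition order is a common convention, not one specified by this document:

```python
import math

def head_pose_matrix(pitch_rad: float, yaw_rad: float, roll_rad: float):
    """3x3 rotation for head pose: pitch (up-down, about x), yaw
    (side-to-side, about y), roll (ear-to-shoulder, about z),
    composed as R = Rz(roll) @ Ry(yaw) @ Rx(pitch)."""
    cp, sp = math.cos(pitch_rad), math.sin(pitch_rad)
    cy, sy = math.cos(yaw_rad), math.sin(yaw_rad)
    cr, sr = math.cos(roll_rad), math.sin(roll_rad)
    rx = [[1, 0, 0], [0, cp, -sp], [0, sp, cp]]
    ry = [[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]
    rz = [[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3))
                 for j in range(3)] for i in range(3)]

    return matmul(rz, matmul(ry, rx))
```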
To minimize the effect of environmental factors, such as heat from hot surfaces, which can distort captured images, some embodiments may employ a Hartmann-Shack sensor to correct for these distortions.
Once the positioning of the biometric features is determined and images are captured by the facial/iris imaging system, the software executed by the controller 190 also detects and processes images of the face 12 and irises 14 in the captured data. For instance, as shown in step 218 of
Once the iris image data is segmented and tested according to step 218, the iris image data may be employed for biometric matching with databases of existing iris data or may be recorded for biometric enrollment, as shown in step 222. When iris data is collected from multiple subjects in rapid succession, the enrollment may be anonymous, i.e. recorded without further identification data, such as a name.
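The segmentation in step 218 isolates the iris texture between the pupil and limbus boundaries. The sketch below models both boundaries as concentric circles, in the spirit of the classic Hough and Daugman approaches cited among the references; real segmenters also handle off-center pupils, eyelid occlusion, and specular highlights:

```python
def iris_ring_mask(width: int, height: int, center,
                   pupil_r: float, iris_r: float):
    """Return the set of (x, y) pixels in the annulus between the pupil
    boundary and the limbus, modeled as concentric circles."""
    cx, cy = center
    ring = set()
    for y in range(height):
        for x in range(width):
            d2 = (x - cx) ** 2 + (y - cy) ** 2
            if pupil_r ** 2 < d2 <= iris_r ** 2:
                ring.add((x, y))
    return ring
```

Only the pixels in the returned annulus would be passed on to pattern encoding and matching.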
In general, embodiments of the present invention may employ various configurations of imaging systems that capture iris images and face images. Although many of the features of embodiments of the present invention may be described with respect to the configuration shown in
While the embodiments described previously may employ NIR laser illumination for the facial/iris imaging system, other embodiments of the present invention may employ LEDs or flash lamps rather than laser diodes. As such, in these alternative embodiments, the system can perform facial and iris liveness testing. Facial liveness testing detects whether the biometric information comes from a living source. (U.S. patent application Ser. No. 11/258,749, filed on Oct. 26, 2005, describes a METHOD AND SYSTEM FOR DETECTING BIOMETRIC LIVENESS, and is entirely incorporated herein by reference.)
Moreover, while the embodiments described previously may direct NIR laser illumination over a long distance to the face 12 or the irises 14, other embodiments may employ LEDs positioned more closely to the targeted subject. For example, such LEDs may be employed to illuminate a subject as the subject is guided down a specific corridor of known length and width. In other words, if the subject is guided near a known position, an illumination source may be set up near the known position so that photons for image capture do not have to be transmitted over longer distances.
Embodiments of the present invention may be fully automatic or may require some operator input, especially with regard to initial targeting of subjects. In other words, an operator selectively targets subjects for biometric analysis. Advantageously, the operator can ensure that the illumination sources are not directed at subjects who may be susceptible to eye damage from photons emitted by the illumination sources. For example, embodiments of the present invention may be employed to identify and screen subjects at an event, such as a highly attended sporting event. At such events, spectators often use optical aids, such as binoculars, to view the game or match. Eye damage might result if laser illumination is directed into the eyes of a targeted individual through such an optical aid. As a result, an operator-assisted mode can prevent the laser illumination from being directed at subjects using an optical aid.
As described above, the controller 190 may be a programmable processing device, such as an external conventional computer or an on-board field programmable gate array (FPGA) or digital signal processor (DSP), that executes software, or stored instructions. In general, physical processors and/or machines employed by embodiments of the present invention for any processing or evaluation may include one or more networked or non-networked general purpose computer systems, microprocessors, field programmable gate arrays (FPGA's), digital signal processors (DSP's), micro-controllers, and the like, programmed according to the teachings of the exemplary embodiments of the present invention, as is appreciated by those skilled in the computer and software arts. The physical processors and/or machines may be externally networked with the image capture device, or may be integrated to reside within the image capture device. Appropriate software can be readily prepared by programmers of ordinary skill based on the teachings of the exemplary embodiments, as is appreciated by those skilled in the software art. In addition, the devices and subsystems of the exemplary embodiments can be implemented by the preparation of application-specific integrated circuits or by interconnecting an appropriate network of conventional component circuits, as is appreciated by those skilled in the electrical art(s). Thus, the exemplary embodiments are not limited to any specific combination of hardware circuitry and/or software.
Stored on any one or on a combination of computer readable media, the exemplary embodiments of the present invention may include software for controlling the devices and subsystems of the exemplary embodiments, for driving the devices and subsystems of the exemplary embodiments, for enabling the devices and subsystems of the exemplary embodiments to interact with a human user, and the like. Such software can include, but is not limited to, device drivers, firmware, operating systems, development tools, applications software, and the like. Such computer readable media further can include the computer program product of an embodiment of the present inventions for performing all or a portion (if processing is distributed) of the processing performed in implementing the inventions. Computer code devices of the exemplary embodiments of the present inventions can include any suitable interpretable or executable code mechanism, including but not limited to scripts, interpretable programs, dynamic link libraries (DLLs), Java classes and applets, complete executable programs, and the like. Moreover, parts of the processing of the exemplary embodiment of the present inventions can be distributed for better performance, reliability, cost, and the like.
Common forms of computer-readable media may include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other suitable magnetic medium, a CD-ROM, CDRW, DVD, any other suitable optical medium, punch cards, paper tape, optical mark sheets, any other suitable physical medium with patterns of holes or other optically recognizable indicia, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other suitable memory chip or cartridge, a carrier wave or any other suitable medium from which a computer can read.
While the present invention has been described in connection with a number of exemplary embodiments and implementations, the present invention is not so limited, but rather covers various modifications and equivalent arrangements.
This application claims priority to U.S. Provisional Application No. 60/844,644 filed Sep. 15, 2006, the contents of which are incorporated entirely herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3069654 | Hough | Dec 1962 | A |
4641349 | Flom et al. | Feb 1987 | A |
5291560 | Daugman | Mar 1994 | A |
5572596 | Wildes et al. | Nov 1996 | A |
5751836 | Wildes et al. | May 1998 | A |
5836872 | Kenet et al. | Nov 1998 | A |
5850470 | Kung et al. | Dec 1998 | A |
5859686 | Aboutalib et al. | Jan 1999 | A |
5953440 | Zhang et al. | Sep 1999 | A |
6011624 | de Groot | Jan 2000 | A |
6144754 | Okano et al. | Nov 2000 | A |
6152563 | Hutchinson et al. | Nov 2000 | A |
6215891 | Suzaki et al. | Apr 2001 | B1 |
6229907 | Okano et al. | May 2001 | B1 |
6247813 | Kim et al. | Jun 2001 | B1 |
6285780 | Yamakita et al. | Sep 2001 | B1 |
6373968 | Okano et al. | Apr 2002 | B2 |
6442465 | Breed et al. | Aug 2002 | B2 |
6526160 | Ito | Feb 2003 | B1 |
6529630 | Kinjo | Mar 2003 | B1 |
6532298 | Cambier et al. | Mar 2003 | B1 |
6542624 | Oda | Apr 2003 | B1 |
6546121 | Oda | Apr 2003 | B1 |
6571002 | Ogawa | May 2003 | B1 |
6591064 | Higashiyama et al. | Jul 2003 | B2 |
6597377 | MacPhail | Jul 2003 | B1 |
6614919 | Suzaki et al. | Sep 2003 | B1 |
6700998 | Murata | Mar 2004 | B1 |
6714665 | Hanna et al. | Mar 2004 | B1 |
6753919 | Daugman | Jun 2004 | B1 |
6760467 | Min et al. | Jul 2004 | B1 |
6778698 | Prakash et al. | Aug 2004 | B1 |
6785406 | Kamada | Aug 2004 | B1 |
6850631 | Oda et al. | Feb 2005 | B1 |
6944318 | Takata et al. | Sep 2005 | B1 |
6992717 | Hatano | Jan 2006 | B2 |
7099495 | Kodno et al. | Aug 2006 | B2 |
7130453 | Kondo et al. | Oct 2006 | B2 |
7155035 | Kondo et al. | Dec 2006 | B2 |
7197166 | Jeng | Mar 2007 | B2 |
7277561 | Shin | Oct 2007 | B2 |
7362884 | Willis et al. | Apr 2008 | B2 |
7583823 | Jones et al. | Sep 2009 | B2 |
7593550 | Hamza | Sep 2009 | B2 |
7599524 | Camus et al. | Oct 2009 | B2 |
7697734 | Jung et al. | Apr 2010 | B2 |
8064647 | Bazakos et al. | Nov 2011 | B2 |
20020136435 | Prokoski | Sep 2002 | A1 |
20030012413 | Kusakari et al. | Jan 2003 | A1 |
20030108224 | Ike | Jun 2003 | A1 |
20030118217 | Kondo et al. | Jun 2003 | A1 |
20040197011 | Camus et al. | Oct 2004 | A1 |
20050078868 | Chen et al. | Apr 2005 | A1 |
20050084179 | Hanna et al. | Apr 2005 | A1 |
20050251347 | Perona et al. | Nov 2005 | A1 |
20060008124 | Ewe et al. | Jan 2006 | A1 |
20060140453 | Geng | Jun 2006 | A1 |
20060140454 | Northcott et al. | Jun 2006 | A1 |
20060147094 | Yoo | Jul 2006 | A1 |
20060165266 | Hamza | Jul 2006 | A1 |
20060187305 | Trivedi et al. | Aug 2006 | A1 |
20060228005 | Matsugu et al. | Oct 2006 | A1 |
20070036397 | Hamza | Feb 2007 | A1 |
20070047772 | Matey et al. | Mar 2007 | A1 |
20070047773 | Martin et al. | Mar 2007 | A1 |
20070110284 | Rieul et al. | May 2007 | A1 |
20070160266 | Jones et al. | Jul 2007 | A1 |
20070160267 | Jones et al. | Jul 2007 | A1 |
Number | Date | Country |
---|---|---|
2005 008567 | Jan 2005 | WO |
Entry |
---|
Guo et al., “A System for Automatic Iris Capturing”, Mitsubishi Electric Research Laboratories, TR2005-044, 2005. |
Wang et al., “Combining Face and Iris Biometrics for Identity Verification”, Proceedings of Fourth International Conference on AVBPA, Guildford, UK, 2003, pp. 805-813. |
Y. Park, et al.; “A Fast Circular Edge Detector for the Iris Region Segmentation”; S.-W. Lee, H.H. Buelthoff, T. Poggio (Eds.) BMCV 2000, LNCS 1811, pp. 417-423, 2000. |
Christel-Loic Tisse, et al.; “Person identification technique using human iris recognition”; Advanced System Technology; Universite de Montpellier. |
Libor Masek; “Recognition of Human Iris Patterns for Biometric Identification”; School of Computer Science and Software Engineering, The University of Western Australia, 2003, pp. 1-56. |
Xiaomei Liu, et al.; “Experiments with an Improved Iris Segmentation Algorithm”; Department of Computer Science and Engineering University of Notre Dame; Fourth IEEE Workshop on Automatic Identification Advanced Technologies (AutolD), Oct. 2005, New York, 6 pages. |
Ping-Sung Liao, et al.; “A Fast Algorithm for Multilevel Thresholding”; Journal of Information Science and Engineering 17, pp. 713-727 (2001). |
Nobuyuki Otsu; “A Threshold Selection Method from Gray-Level Histograms”; IEEE Transactions on Systems, Man, and Cybernetics, vol. SMC-9, No. 1, Jan. 1979. |
International Search Report for PCT/US08/75910, dated Nov. 28, 2008, 3 pages. |
Written Opinion for PCT/US08/75910, dated Nov. 28, 2008, 9 pages. |
European Search Report corresponding to European patent Application Serial No. 07 84 2181, European Patent Office, dated Aug. 27, 2010, 7 pages. |
Ross et al.; “Handbook of Multibiometrics”; Springer Science, New York, US XP002597965 ISBN: 978-0-387-22296-7; p. 51; Jun. 24, 2006. |
Fancourt et al.; “Iris Recognition at a Distance”; Audio- and Video-based Biometric Person Authentication; (Lecture Notes in Computer Science;; LNCS), Springer-Verlag, Berlin/Heidelberg, pp. 1-13; XP019013243; ISBN: 978-3-540-27887-0; Jun. 28, 2005. |
Basit, A. et al. “A Fast and Robust Iris Localization Method.” IADIS International Conference Applied Computing, Feb. 25-28, 2006 (pp. 557-560). |
Camus, T. et al. “Reliable and Fast Eye Finding in Close-up Images.” Proceedings of the 16th International Conference on Pattern Recognition. vol. 1, 2002 (pp. 389-394). |
Liu, X. et al. “Experiments with an Improved Iris Segmentation Algorithm.” Fourth IEEE Workshop on Automatic Identification Advanced Technologies, Oct. 2005 (6 pages). |
Vezhnevets, V. et al. “Robust and Accurate Eye Contour Extraction.” International Conference Graphicon, 2003 (4 pages). |
Number | Date | Country | |
---|---|---|---|
20080069411 A1 | Mar 2008 | US |
Number | Date | Country | |
---|---|---|---|
60844644 | Sep 2006 | US |