This invention relates to systems and methods for acquiring biometric and other imagery, and to biometric acquisition, identification, fraud detection, and security systems and methods, particularly biometric systems and methods that employ iris recognition with a camera having a field of view. More particularly, the invention relates to systems and methods for very quickly acquiring iris imagery within a wide capture volume.
Iris recognition systems have been in use for some time. The acquisition of images suitable for iris recognition is an inherently challenging problem, for many reasons. For example, the iris itself is relatively small (approximately 1 cm in diameter), and for many identification systems it is desirable to obtain a subject's iris data from a great distance in order to avoid constraining the position of the subject. This results in a small field of view and a small depth of field. Even systems that obtain iris data from a close-in subject must be adapted to subjects who do not stay absolutely still. Systems must also deal with subjects who blink involuntarily or drop or swivel their head momentarily, for example to check on the whereabouts of luggage.
There is therefore a need to scan very quickly, or else the person will have moved out of the capture volume or the subject's motion will cause blur. In the current state of the art, attempts to resolve this problem include using a flat mirror to scan, but such attempts have not so far resolved the motion blur problem, especially when the camera is zoomed in. The image motion in terms of pixels per second is very high, which makes it very difficult to obtain high quality imagery with prior art systems in these situations.
In biometric applications, one or more image sensors are often used to collect data for subsequent analysis and biometric matching. For example, with the face or iris biometric, a single camera and lens is often used to collect the biometric data. There is an obvious trade-off between the resolution required for biometric analysis and matching and the field of view of the lens. For example, as the field of view of the lens increases, the capture volume or coverage in which the biometric data can be observed increases, but the resolution of the data decreases proportionally. Using multiple cameras and lenses to cover a larger volume is an obvious solution, but it incurs the expense of additional cameras, optics, and processing.
Another approach for increasing the capture volume has been to use controllable mirrors that point the camera coverage at different locations. Specifically, U.S. Pat. No. 6,714,665 proposes using a wide field of view camera to determine where to point a mirror mounted on a pan/tilt/zoom assembly. However, approaches that point mirrors in such a fashion have to handle one or more key problems, namely: (i) the time latency involved in moving the camera coverage to a location, (ii) vibration of the mirror and the resulting settling time of the mirror as it stops and starts motion, (iii) the complexity of the mechanical arrangement, and (iv) the reliability, longevity, and expense of the opto-mechanical components of such a moving assembly.
In U.S. Pat. No. 6,320,610, Van Sant et al. disclose acquisition of biometric data with a mirror on a pan/tilt platform, or a camera on a pan/tilt platform. The problem with that approach is that it is very expensive or physically impossible to use such a mechanism to point at 2 or 3 places in a scene at a very high rate, for example 5 to 50 times a second. If there is a mechanical mirror or pointing mechanism, then there is substantial inertia preventing the assembly from stopping and starting rapidly, and such a system needs a very powerful actuator/motor to rotate a camera assembly. In addition, there is substantial settling time for the mirror or camera to stop vibrating when the mirror or pan/tilt assembly stops before imagery is acquired, making it essentially impossible to scan at such high rates.
It is an object of the present invention to acquire biometric data within large capture volumes with high resolution using fewer cameras, or one camera, and without the problems of prior art systems.
The present invention overcomes the problems of the prior art systems and improves on them by using a continuously moving mechanical mechanism, which solves the inertia problem, and by translating that continuous motion into imagery that stops and stares at one location and then effectively instantaneously jumps to stare at another location.
In one aspect, the invention comprises using a rotating, tilted, curved mirror, which allows the image to appear frozen for a fraction of a second before the scan moves on to the next tile, which also appears frozen.
In another aspect the invention comprises a system for acquiring biometric imagery in a large capture volume from an unconstrained subject comprising a rotationally symmetric mirror, motor means to rotate the mirror at a constant rotational velocity about an axis, and a sensor configured to acquire biometric imagery reflected off of the mirror as it is rotated about the axis.
In some embodiments the rotationally symmetric mirror comprises one or more conical sections.
The system can be configured to obtain a set of still images. In some embodiments the system is configured for iris recognition and comprises one or more conical sections arranged to rotate at a substantially constant rotational velocity around their common axis.
In another aspect the invention comprises a reflection device comprising a first surface that reflects light off that surface as if off a substantially rotationally symmetric surface; a second surface different from the first surface that reflects light off that surface as if off a substantially rotationally symmetric surface; wherein said first and said second surfaces are mounted on the same axis such that rotational symmetry of each surface is maintained.
The method aspect of the invention comprises acquiring imagery in a large capture volume by configuring a sensor to view a scene reflected off a non-flat surface; mounting said surface on a rotating axis; and acquiring imagery of the scene reflected off said surface.
In certain embodiments a set of still images of portions of the scene are obtained.
These and other objects, features, and advantages of embodiments are presented in greater detail in the following description when read in relation to the accompanying drawings, although the embodiments are not limited to the illustrated figures.
While the invention is capable of many embodiments, only a few embodiments are illustrated in detail herein.
The following is a general description of a system and method according to the invention. An image is acquired using a camera system 10, 11, or any other image recording device. A camera system is used that can either capture images synchronously at a constant rate, or asynchronously on request by a computer-controlled trigger signal. The camera may be operated at a variable acquisition rate depending on the results of previous processing.
The method is highly effective in many respects. First, if the disposition of the subject is immediately amenable to successful data acquisition (e.g., the subject's eyes are open and the face is oriented toward the system), then the system will acquire iris imagery very rapidly.
However, if the subject is fidgeting or unable to remain stationary, or is distracted by baggage or children for example, then the acquisition system will still acquire imagery, although it might take a slightly longer period of time. The acquisition time for an amenable user is not penalized by the system's capability to acquire data in the case of a less amenable user. This is crucial when subject throughput is considered.
The invention performs temporal multiplexing of the camera and optics such that at one time instant the camera sensor acquires data from a first part of the scene, and at another time instant the camera sensor acquires data from a second part of the scene, which may or may not substantially overlap the first part of the scene. This process is repeated for additional parts of the scene until data is once again acquired from the first part of the scene. This process results in tiles which do not substantially overlap as illustrated in
In this configuration, a non-flat mirror is continually rotated at a constant rotational velocity by a small electric motor. The mirror is designed to be reflective in the wavelengths required by the biometric acquisition camera. The non-flat mirror can, for example, be spherical, conical, or another shape. In the case of conical shapes, a series of conical sections can be joined together. For example,
The camera, lens or other imager, and motor are fixed. The motor is designed to rotate at a constant angular velocity. Constant angular motion eliminates mechanical vibration due to stop/start motion, and the motor is very reliable. As the mirror rotates, the part of the scene viewed by the lens changes as each different conical mirrored section comes into view of the lens. However, the part of the scene viewed by the lens while a particular conical mirrored section is in view does not change even though the mirror is rotating, due to the rotationally symmetric nature of each mirror segment. During this period of the mirror's rotation, high quality imagery of the scene at a particular location is acquired.
The specific location of the scene that is imaged as the mirror rotates depends on the position on the mirror to which the sensor is pointed.
To illustrate further, if the camera is mounted such that it is pointed at a first substantially rotationally symmetric mirror (
Additional scan patterns can be implemented by combining two or more mirror/motor assemblies in optical series such that the resultant scan pattern is the combination of the individual scan patterns. More specifically, one rotating mirror assembly can be mounted with a vertical orientation of the axis of rotation, which provides a scan pattern in the vertical direction. A second rotating mirror assembly can be mounted with a horizontal orientation of the axis of rotation such that the optical path reflects off the first mirror assembly and onto the second mirror assembly. The second mirror assembly provides a scan pattern in the horizontal direction. The speed of rotation of each mirror assembly is carefully controlled such that the combination of the vertical and horizontal scan patterns results in a scan pattern that covers a complete two-dimensional area. For example, if there are 3 separate mirror surfaces within each of the vertical and horizontal mirror assemblies, covering 3 areas in each of the vertical and horizontal directions, then the speed of rotation of one of the assemblies is controlled to be one third the speed of rotation of the other assembly to ensure that the combined scan pattern covers the complete two-dimensional area. Position sensors, such as optical encoders that are well known in the art, can be used both to measure rotational velocity and to measure the angular position of each rotating mirror assembly at any time instant, in order to optimize the scan pattern such that the scan in one mirror assembly is transitioning from one region to the next at the same time that the scan is transitioning in the second mirror assembly. This timing relationship is sketched in the example below.
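The following is a minimal sketch, not taken from the patent itself, showing how a 1:3 speed ratio between two three-section mirror assemblies enumerates all nine tiles of a 3x3 grid. The constant names, the choice of which assembly is "fast", and the sampling at section midpoints are assumptions made only for illustration.

```python
# Illustrative sketch (assumed parameters): which 2-D tile is imaged over time
# when a fast horizontal assembly with 3 mirror sections is geared 3x faster
# than a slow vertical assembly with 3 sections.

FAST_SECTIONS = 3            # mirror surfaces on the fast (horizontal) assembly
SLOW_SECTIONS = 3            # mirror surfaces on the slow (vertical) assembly
SPEED_RATIO = FAST_SECTIONS  # slow assembly rotates at 1/3 the fast assembly's rate

def tile_at(fast_revolutions: float) -> tuple[int, int]:
    """Return (row, col) of the tile being imaged, given the fast assembly's
    rotation in revolutions. One fast revolution sweeps all columns once; the
    slow assembly advances one row per fast revolution."""
    col = int(fast_revolutions * FAST_SECTIONS) % FAST_SECTIONS
    slow_revolutions = fast_revolutions / SPEED_RATIO
    row = int(slow_revolutions * SLOW_SECTIONS) % SLOW_SECTIONS
    return row, col

if __name__ == "__main__":
    # Sample the scan at the midpoint of each fast-assembly section: the
    # 9 dwell periods cover every (row, col) combination exactly once.
    for k in range(FAST_SECTIONS * SLOW_SECTIONS):
        fast_rev = (k + 0.5) / FAST_SECTIONS
        print(f"dwell {k}: tile {tile_at(fast_rev)}")
```

Run as written, the nine dwell periods print tiles (0,0) through (2,2) in raster order, illustrating why the slower assembly must rotate at one third the rate of the faster one for the combined scan to cover the full two-dimensional area.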
This approach allows large capture volumes to be scanned over time. However, one significant remaining problem is that during biometric data acquisition the optical path is such that the subject can appear to move in the field of view of the camera; in effect, the camera is virtually scanning across the scene. Depending on the integration time of the sensor, this can introduce motion blur in the image data. This can be mitigated by illuminating the subject with stroboscopic lighting, a commonly used technique to stop apparent motion in images acquired while the camera and/or the subject is moving. The stroboscopic illumination can illuminate the subject externally, or can be directed through the moving mirror assembly using a half-silvered mirror in order to direct the illumination directly at the location of interest. The sketch below illustrates the relationship between integration time and apparent motion blur.
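As a back-of-the-envelope illustration only, the following sketch estimates apparent motion blur in pixels for a given integration time and the strobe pulse length needed to keep blur below a target. The sensor width, field of view, and apparent scan rate are assumed values, not figures from the patent.

```python
# Illustrative blur estimate (all numbers are assumptions for the sketch).

SENSOR_WIDTH_PX = 2048          # assumed sensor width in pixels
FIELD_OF_VIEW_DEG = 10.0        # assumed horizontal field of view of the lens
APPARENT_SCAN_DEG_PER_S = 90.0  # assumed apparent angular scan rate

def blur_pixels(integration_time_s: float) -> float:
    """Apparent motion blur (in pixels) accumulated during one exposure."""
    pixels_per_degree = SENSOR_WIDTH_PX / FIELD_OF_VIEW_DEG
    return APPARENT_SCAN_DEG_PER_S * integration_time_s * pixels_per_degree

def max_exposure_for_blur(max_blur_px: float) -> float:
    """Longest exposure (or strobe pulse) that keeps blur below max_blur_px."""
    pixels_per_degree = SENSOR_WIDTH_PX / FIELD_OF_VIEW_DEG
    return max_blur_px / (APPARENT_SCAN_DEG_PER_S * pixels_per_degree)

if __name__ == "__main__":
    print(f"blur at 1 ms exposure: {blur_pixels(1e-3):.1f} px")          # ~18.4 px
    print(f"exposure for <1 px blur: {max_exposure_for_blur(1.0)*1e6:.0f} us")  # ~54 us
```

Under these assumed numbers, even a 1 ms exposure smears the image by many pixels, while a strobe pulse of a few tens of microseconds holds blur under one pixel, which is why stroboscopic illumination is an effective mitigation.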
Since the imagery is reflected off a non-flat surface, the imagery is stretched or deformed. The deformation is highly predictable and is determined by the shape of the rotationally symmetric surface. After the imagery has been digitized, the stretching or distortion can be removed by applying an inverse geometric image warping function. As an example, K. Nishino and S. K. Nayar, "Corneal Imaging System: Environment from Eyes," International Journal of Computer Vision, October 2006, describe methods of removing distortion introduced by reflection off a spherical surface. A minimal sketch of such an inverse warp is given below.
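The sketch below shows only the general structure of an inverse geometric warp: for every pixel of the corrected output image, compute where it originated in the distorted input and resample there. The simple radial model and its strength parameter k are stand-ins chosen for illustration; the actual mapping would be derived from the specific mirror geometry. OpenCV's remap function is used here as one common way to apply such a mapping.

```python
# Minimal inverse-warp sketch (assumed radial model, not the mirror's true geometry).
import numpy as np
import cv2  # OpenCV

def undistort_radial(img: np.ndarray, k: float = -0.25) -> np.ndarray:
    """Remove a simple radially symmetric distortion of strength k.
    The mapping r_src = r_dst * (1 + k * r_dst^2) stands in for the inverse
    warp that a specific rotationally symmetric mirror would require."""
    h, w = img.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    # Normalized coordinates of every output pixel.
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    xn, yn = (xs - cx) / cx, (ys - cy) / cy
    scale = 1.0 + k * (xn * xn + yn * yn)
    # Source coordinates: where each output pixel comes from in the distorted image.
    map_x = (xn * scale * cx + cx).astype(np.float32)
    map_y = (yn * scale * cy + cy).astype(np.float32)
    return cv2.remap(img, map_x, map_y, interpolation=cv2.INTER_LINEAR)

# Hypothetical usage: corrected = undistort_radial(cv2.imread("tile.png"), k=-0.25)
```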
In some embodiments two or more conical sections of different pitch (angle) are combined on a single component that spins around an optical axis. The more conical sections that are added, the more parts of the scene that can be scanned. As the conical sections rotate, when the scene is viewed reflected off one conical section, a certain part of the field of view is observed and appears stationary. When the scene is viewed reflected off a second conical section, a different part of the field of view is observed and also appears stationary. The advantage is that a wide area of a scene can be scanned extremely rapidly, in contrast with a moving pan/tilt mirror system, which introduces motion blur or has a slow scan time. In some embodiments, moderate stroboscopic illumination may be used to stop the motion of the individual in the scene.
The angle of the non-flat mirror, such as a cone, is chosen based on the field of view of the lens and the optical configuration. For example, consider a single cone with a 45 degree pitch. Imagery is reflected through a full 90 degree angle off the conical surface. If the field of view of the imaging system is 10 degrees, then the second conical surface may have a pitch that is 10/2=5 degrees different from the first cone, that is, either 40 or 50 degrees, depending on whether the desired second part of the scene to be imaged lies above or below the first part of the scene. In practice, the pitch of the second conical surface will be slightly closer to the pitch of the first surface in order to ensure that there is overlap between the regions being imaged. The small sketch below works through this pitch calculation.
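The following sketch reproduces the pitch arithmetic just described. The overlap fraction is an assumed illustrative parameter; the text only states that the pitches are brought slightly closer together so that adjacent imaged regions overlap.

```python
# Pitch calculation for the second conical surface (overlap fraction is assumed).

def second_cone_pitch(first_pitch_deg: float,
                      fov_deg: float,
                      scan_upward: bool = True,
                      overlap_fraction: float = 0.1) -> float:
    """Pitch of the second conical surface.

    A pitch change of delta tilts the reflected ray by 2*delta, so stepping the
    viewed region by one field of view requires delta = fov/2. Reducing delta
    by a small overlap fraction keeps adjacent imaged regions overlapping."""
    delta = (fov_deg / 2.0) * (1.0 - overlap_fraction)
    return first_pitch_deg + delta if scan_upward else first_pitch_deg - delta

if __name__ == "__main__":
    # 45 degree first cone, 10 degree field of view: nominally 50 or 40 degrees,
    # nudged slightly toward 45 degrees to provide overlap.
    print(second_cone_pitch(45.0, 10.0, scan_upward=True))   # 49.5
    print(second_cone_pitch(45.0, 10.0, scan_upward=False))  # 40.5
```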
While the invention has been described and illustrated in detail herein, various other embodiments, alternatives, and modifications should become apparent to those skilled in the art without departing from the spirit and scope of the invention. Therefore the claims should not be considered limited to the illustrated embodiments.
This application is a continuation of and claims priority to U.S. application Ser. No. 12/658,706, filed Feb. 16, 2010, entitled “Mirror System and Method for Acquiring Biometric Data,” which is: a continuation of and claims priority to PCT Application No. PCT/US2008/074751, filed Aug. 29, 2008, entitled “Mirror System and Method for Acquiring Biometric Data,” which claims priority to U.S. provisional application 60/969,607, filed Sep. 1, 2007, entitled “Methodology for Acquiring Biometric Data Large Volumes,” which are both hereby incorporated by reference in their entireties; and a continuation of and claims priority to PCT Application No. PCT/US2008/074737, filed Aug. 29, 2008, entitled “System And Method for Iris Data Acquisition For Biometric Identification,” which claims priority to U.S. provisional application 60/969,607, filed Sep. 1, 2007, entitled “Methodology for Acquiring Biometric Data Large Volumes,” which are both hereby incorporated by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
4231661 | Walsh et al. | Nov 1980 | A |
4641349 | Flom et al. | Feb 1987 | A |
4910725 | Drexler et al. | Mar 1990 | A |
4923263 | Johnson | May 1990 | A |
5140469 | Lamarre et al. | Aug 1992 | A |
5259040 | Hanna | Nov 1993 | A |
5291560 | Daugman | Mar 1994 | A |
5488675 | Hanna | Jan 1996 | A |
5572596 | Wildes et al. | Nov 1996 | A |
5581629 | Hanna et al. | Dec 1996 | A |
5613012 | Hoffman et al. | Mar 1997 | A |
5615277 | Hoffman | Mar 1997 | A |
5737439 | Lapsley et al. | Apr 1998 | A |
5751836 | Wildes | May 1998 | A |
5764789 | Pare et al. | Jun 1998 | A |
5802199 | Pare et al. | Sep 1998 | A |
5805719 | Pare et al. | Sep 1998 | A |
5838812 | Pare et al. | Nov 1998 | A |
5901238 | Matsushita | May 1999 | A |
5953440 | Zhang et al. | Sep 1999 | A |
5978494 | Zhang | Nov 1999 | A |
6021210 | Camus et al. | Feb 2000 | A |
6028949 | McKendall | Feb 2000 | A |
6055322 | Salganicoff | Apr 2000 | A |
6064752 | Rozmus et al. | May 2000 | A |
6069967 | Rozmus et al. | May 2000 | A |
6088470 | Camus | Jul 2000 | A |
6144754 | Okano et al. | Nov 2000 | A |
6149061 | Massieu et al. | Nov 2000 | A |
6192142 | Pare et al. | Feb 2001 | B1 |
6222903 | Kim et al. | Apr 2001 | B1 |
6246751 | Bergl et al. | Jun 2001 | B1 |
6247813 | Kim et al. | Jun 2001 | B1 |
6252977 | Salganicoff et al. | Jun 2001 | B1 |
6289113 | McHugh et al. | Sep 2001 | B1 |
6301375 | Choi | Oct 2001 | B1 |
6320610 | Van Sant et al. | Nov 2001 | B1 |
6349171 | Koike | Feb 2002 | B1 |
6366682 | Hoffman et al. | Apr 2002 | B1 |
6373968 | Okano et al. | Apr 2002 | B2 |
6377699 | Musgrave et al. | Apr 2002 | B1 |
6424727 | Musgrave et al. | Jul 2002 | B1 |
6483930 | Musgrave et al. | Nov 2002 | B1 |
6532298 | Cambier et al. | Mar 2003 | B1 |
6542624 | Oda | Apr 2003 | B1 |
6545810 | Takada et al. | Apr 2003 | B1 |
6546121 | Oda | Apr 2003 | B1 |
6554705 | Cumbers | Apr 2003 | B1 |
6587597 | Nakao et al. | Jul 2003 | B1 |
6594376 | Hoffman et al. | Jul 2003 | B2 |
6594377 | Kim et al. | Jul 2003 | B1 |
6652099 | Chae et al. | Nov 2003 | B2 |
6700998 | Murata | Mar 2004 | B1 |
6701029 | Berfanger et al. | Mar 2004 | B1 |
6714665 | Hanna et al. | Mar 2004 | B1 |
6760467 | Min et al. | Jul 2004 | B1 |
6763148 | Sternberg et al. | Jul 2004 | B1 |
6819219 | Bolle et al. | Nov 2004 | B1 |
6832044 | Doi et al. | Dec 2004 | B2 |
6850631 | Oda et al. | Feb 2005 | B1 |
6917695 | Teng et al. | Jul 2005 | B2 |
6920236 | Prokoski | Jul 2005 | B2 |
6930707 | Bates et al. | Aug 2005 | B2 |
6944318 | Takata et al. | Sep 2005 | B1 |
6950536 | Hovvener | Sep 2005 | B2 |
6980670 | Hoffman et al. | Dec 2005 | B1 |
6985608 | Hoffman et al. | Jan 2006 | B2 |
7007298 | Shinzaki et al. | Feb 2006 | B1 |
7020351 | Kumar | Mar 2006 | B1 |
7047418 | Ferren et al. | May 2006 | B1 |
7095901 | Lee et al. | Aug 2006 | B2 |
7106366 | Parker et al. | Sep 2006 | B2 |
7146027 | Kim et al. | Dec 2006 | B2 |
7152782 | Shenker et al. | Dec 2006 | B2 |
7209271 | Lewis et al. | Apr 2007 | B2 |
7212330 | Seo et al. | May 2007 | B2 |
7221486 | Makihira et al. | May 2007 | B2 |
7236534 | Morejon et al. | Jun 2007 | B1 |
7248719 | Hoffman et al. | Jul 2007 | B2 |
7271939 | Kono | Sep 2007 | B2 |
7272265 | Kouri et al. | Sep 2007 | B2 |
7346472 | Moskowitz et al. | Mar 2008 | B1 |
7385626 | Aggarwal et al. | Jun 2008 | B2 |
7398925 | Tidwell et al. | Jul 2008 | B2 |
7414737 | Cottard et al. | Aug 2008 | B2 |
7418115 | Northcott et al. | Aug 2008 | B2 |
7428320 | Northcott et al. | Sep 2008 | B2 |
7542590 | Robinson et al. | Jun 2009 | B1 |
7545962 | Peirce et al. | Jun 2009 | B2 |
7558406 | Robinson et al. | Jul 2009 | B1 |
7558407 | Hoffman et al. | Jul 2009 | B2 |
7574021 | Matey | Aug 2009 | B2 |
7583822 | Guillemot et al. | Sep 2009 | B2 |
7606401 | Hoffman et al. | Oct 2009 | B2 |
7616788 | Hsieh et al. | Nov 2009 | B2 |
7639840 | Hanna et al. | Dec 2009 | B2 |
7660700 | Moskowitz et al. | Feb 2010 | B2 |
7693307 | Rieul et al. | Apr 2010 | B2 |
7697786 | Camus et al. | Apr 2010 | B2 |
7715595 | Kim et al. | May 2010 | B2 |
7719566 | Guichard | May 2010 | B2 |
7760919 | Namgoong | Jul 2010 | B2 |
7770019 | Ferren et al. | Aug 2010 | B2 |
7797606 | Chabanne | Sep 2010 | B2 |
7801335 | Hanna | Sep 2010 | B2 |
7847688 | Bernard et al. | Dec 2010 | B2 |
7869627 | Northcott et al. | Jan 2011 | B2 |
7912252 | Ren et al. | Mar 2011 | B2 |
7916908 | Thomas | Mar 2011 | B1 |
7925059 | Hoyos et al. | Apr 2011 | B2 |
7929017 | Aggarwal | Apr 2011 | B2 |
7929732 | Bringer et al. | Apr 2011 | B2 |
7949295 | Kumar | May 2011 | B2 |
7949494 | Moskowitz et al. | May 2011 | B2 |
7978883 | Rouh et al. | Jul 2011 | B2 |
8009876 | Kim et al. | Aug 2011 | B2 |
8025399 | Northcott et al. | Sep 2011 | B2 |
8028896 | Carter et al. | Oct 2011 | B2 |
8090246 | Jelinek | Jan 2012 | B2 |
8092021 | Northcott et al. | Jan 2012 | B1 |
8132912 | Northcott et al. | Mar 2012 | B1 |
8159328 | Luckhardt | Apr 2012 | B2 |
8170295 | Fujii et al. | May 2012 | B2 |
8181858 | Carter et al. | May 2012 | B2 |
8195044 | Hanna | Jun 2012 | B2 |
8212870 | Hanna | Jul 2012 | B2 |
8214175 | Moskowitz et al. | Jul 2012 | B2 |
8233680 | Bringer et al. | Jul 2012 | B2 |
8243133 | Northcott et al. | Aug 2012 | B1 |
8260008 | Hanna | Sep 2012 | B2 |
8279042 | Beenau et al. | Oct 2012 | B2 |
8280120 | Hoyos | Oct 2012 | B2 |
8289390 | Aggarwal | Oct 2012 | B2 |
8306279 | Hanna | Nov 2012 | B2 |
8317325 | Raguin et al. | Nov 2012 | B2 |
8364646 | Hanna | Jan 2013 | B2 |
8411909 | Zhao et al. | Apr 2013 | B1 |
8442339 | Martin et al. | May 2013 | B2 |
8443202 | White et al. | May 2013 | B2 |
8553948 | Hanna et al. | Oct 2013 | B2 |
8604901 | Hoyos | Dec 2013 | B2 |
8606097 | Hanna | Dec 2013 | B2 |
8719584 | Mullin | May 2014 | B2 |
20010028730 | Nahata | Oct 2001 | A1 |
20020110286 | Cheatle et al. | Aug 2002 | A1 |
20020131623 | Musgrave et al. | Sep 2002 | A1 |
20020136435 | Prokoski | Sep 2002 | A1 |
20030103212 | Westphal et al. | Jun 2003 | A1 |
20030151674 | Lin | Aug 2003 | A1 |
20040013288 | Svensson et al. | Jan 2004 | A1 |
20040042643 | Yeh | Mar 2004 | A1 |
20040071363 | Kouri et al. | Apr 2004 | A1 |
20050084137 | Kim et al. | Apr 2005 | A1 |
20050084179 | Hanna | Apr 2005 | A1 |
20050105778 | Sung et al. | May 2005 | A1 |
20050226471 | Singh et al. | Oct 2005 | A1 |
20050264758 | Wakamori | Dec 2005 | A1 |
20050270386 | Saitoh et al. | Dec 2005 | A1 |
20050285943 | Cutler | Dec 2005 | A1 |
20060028552 | Aggarwal | Feb 2006 | A1 |
20060029262 | Fujimatsu et al. | Feb 2006 | A1 |
20060073449 | Kumar | Apr 2006 | A1 |
20060074986 | Mallalieu et al. | Apr 2006 | A1 |
20060097172 | Park | May 2006 | A1 |
20060120707 | Kusakari et al. | Jun 2006 | A1 |
20060170813 | Morofuji | Aug 2006 | A1 |
20060188169 | Tener et al. | Aug 2006 | A1 |
20060204121 | Bryll | Sep 2006 | A1 |
20060279630 | Aggarwal | Dec 2006 | A1 |
20070098229 | Wu et al. | May 2007 | A1 |
20070110285 | Hanna | May 2007 | A1 |
20070188613 | Nobori et al. | Aug 2007 | A1 |
20070206839 | Hanna | Sep 2007 | A1 |
20070211922 | Crowley et al. | Sep 2007 | A1 |
20070286462 | Usher et al. | Dec 2007 | A1 |
20070286524 | Song | Dec 2007 | A1 |
20080031610 | Border et al. | Feb 2008 | A1 |
20080044063 | Friedman et al. | Feb 2008 | A1 |
20080075334 | Determan et al. | Mar 2008 | A1 |
20080089554 | Tabankin | Apr 2008 | A1 |
20080122578 | Hoyos | May 2008 | A1 |
20080291279 | Samarasekera | Nov 2008 | A1 |
20090074256 | Haddad | Mar 2009 | A1 |
20090097715 | Cottard et al. | Apr 2009 | A1 |
20090161925 | Cottard et al. | Jun 2009 | A1 |
20090207251 | Kobayashi et al. | Aug 2009 | A1 |
20090219405 | Kaneda et al. | Sep 2009 | A1 |
20090231096 | Bringer et al. | Sep 2009 | A1 |
20090232418 | Lolacono et al. | Sep 2009 | A1 |
20090268045 | Sur et al. | Oct 2009 | A1 |
20090274345 | Hanna | Nov 2009 | A1 |
20090278922 | Tinker et al. | Nov 2009 | A1 |
20100014720 | Hoyos | Jan 2010 | A1 |
20100021016 | Cottard et al. | Jan 2010 | A1 |
20100033677 | Jelinek | Feb 2010 | A1 |
20100074477 | Fujii et al. | Mar 2010 | A1 |
20100127826 | Saliba et al. | May 2010 | A1 |
20100201853 | Ishiga | Aug 2010 | A1 |
20100232655 | Hanna | Sep 2010 | A1 |
20100238407 | Dai | Sep 2010 | A1 |
20100246903 | Cottard | Sep 2010 | A1 |
20100253816 | Hanna | Oct 2010 | A1 |
20100278394 | Raguin et al. | Nov 2010 | A1 |
20100310070 | Bringer et al. | Dec 2010 | A1 |
20110002510 | Hanna | Jan 2011 | A1 |
20110007949 | Hanna | Jan 2011 | A1 |
20110119111 | Hanna | May 2011 | A1 |
20110119141 | Hoyos | May 2011 | A1 |
20110158486 | Bringer et al. | Jun 2011 | A1 |
20110194738 | Choi et al. | Aug 2011 | A1 |
20110211054 | Hanna | Sep 2011 | A1 |
20110277518 | Lais et al. | Nov 2011 | A1 |
20120127295 | Hanna | May 2012 | A9 |
20120187838 | Hanna | Jul 2012 | A1 |
20120212597 | Hanna | Aug 2012 | A1 |
20120219279 | Hanna | Aug 2012 | A1 |
20120239458 | Hanna | Sep 2012 | A9 |
20120240223 | Tu | Sep 2012 | A1 |
20120242820 | Hanna et al. | Sep 2012 | A1 |
20120242821 | Hanna | Sep 2012 | A1 |
20120257797 | Leyvand et al. | Oct 2012 | A1 |
20120268241 | Hanna | Oct 2012 | A1 |
20120293643 | Hanna | Nov 2012 | A1 |
20120300052 | Hanna | Nov 2012 | A1 |
20120300990 | Hanna | Nov 2012 | A1 |
20120321141 | Hoyos | Dec 2012 | A1 |
20120328164 | Hoyos | Dec 2012 | A1 |
20130051631 | Hanna | Feb 2013 | A1 |
20130093838 | Tan et al. | Apr 2013 | A1 |
20130108125 | Storm et al. | May 2013 | A1 |
20130110859 | Hanna | May 2013 | A1 |
20130162798 | Hanna et al. | Jun 2013 | A1 |
20130162799 | Hanna | Jun 2013 | A1 |
20130182093 | Hanna et al. | Jul 2013 | A1 |
20130182094 | Hanna et al. | Jul 2013 | A1 |
20130182095 | Hanna et al. | Jul 2013 | A1 |
20130182913 | Hoyos | Jul 2013 | A1 |
20130182915 | Hanna | Jul 2013 | A1 |
20130194408 | Hanna | Aug 2013 | A1 |
20130212655 | Hoyos | Aug 2013 | A1 |
20130223840 | Hanna et al. | Aug 2013 | A1 |
20130251215 | Coons | Sep 2013 | A1 |
20130294659 | Hanna | Nov 2013 | A1 |
20140064574 | Hanna | Mar 2014 | A1 |
20140072183 | Hanna | Mar 2014 | A1 |
Number | Date | Country |
---|---|---|
2007-249556 | Sep 2007 | JP |
1020020078225 | Oct 2002 | KR |
1020030005113 | Jan 2003 | KR |
1003738500000 | Feb 2003 | KR |
1020030034258 | May 2003 | KR |
1020030051970 | Jun 2003 | KR |
2003216700000 | Jul 2003 | KR |
1004160650000 | Jan 2004 | KR |
2003402730000 | Jan 2004 | KR |
2003411370000 | Jan 2004 | KR |
2003526690000 | May 2004 | KR |
2003552790000 | Jun 2004 | KR |
2003620320000 | Sep 2004 | KR |
2003679170000 | Nov 2004 | KR |
1020050005336 | Jan 2005 | KR |
2003838080000 | May 2005 | KR |
1020050051861 | Jun 2005 | KR |
2004046500000 | Dec 2005 | KR |
1005726260000 | Apr 2006 | KR |
10-2009-0086891 | Oct 2009 | KR |
10-2009-0106791 | Oct 2009 | KR |
10-2010-0049407 | May 2010 | KR |
1011976780000 | Oct 2012 | KR |
1013667480000 | Feb 2014 | KR |
1013740490000 | Mar 2014 | KR |
1020140028950 | Mar 2014 | KR |
1020140039803 | Apr 2014 | KR |
1020140050501 | Apr 2014 | KR |
WO 2008054396 | May 2008 | WO |
WO 2009029757 | Mar 2009 | WO |
WO 2009029765 | Mar 2009 | WO |
WO 2010062371 | Jun 2010 | WO |
WO 2011093538 | Aug 2011 | WO |
WO 2012112788 | Aug 2012 | WO |
WO 2013109295 | Jul 2013 | WO |
Entry |
---|
Belcher et al, “A Selective Feature Information Approach for Iris Image-Quality Measure”, IEEE, 3(3):572-577 (2008). |
Bergen, J.R., et al., Hierarchical Model-Based Motion Estimation, European Conf. on Computer Vision (1993). |
Daugman, John, “How Iris Recognition Works,” IEEE Transaction on Circuits and Systems for Video Technology, 14(1):21-30 (2004). |
Galvin, B., et al., Recovering Motion Fields: An Evaluation of Eight Optical Flow Algorithms, Proc. of the British Machine Vision Conf. (1998). |
He, Y. et al, “A fast iris image quality evaluation method based on weighted entropy”, SPIE, 6623:1-8 (2007). |
He, Xiaofu et al., “Contactless Autofeedback Iris Capture Design”, IEEE Transactions on Instrumentation and Measurement, IEEE Service Center, Piscataway, NJ, U.S. 57(7):1369-1375 (2008). |
Kumar, R., et al., “Direct recovery of shape from multiple views: a parallax based approach”, 12th IAPR Int'l Conf. on Pattern Recognition (1994). |
Lu, Huiqi et al., “Iris Recognition on Low Computational Power Mobile Devices”, 23 pages, (2011). Retrieved from the Internet: URL:http:jjcdn.intechopen.comjpdfs-wm/14646.pdf [retrieved on Jul. 23, 2014]. |
Ma, L. et al, “Personal Identification Based on Iris Texture Analysis”, IEEE: Pattern Analysis and Machine Intelligence, 25(12):1519-1533 (2003). |
Nishino, K., et al., “The World in an Eye”, IEEE Conf. on Pattern Recognition, 1:444-451 (2004). |
Peters, Tanya H. et al., “Effects of segmentation routine and acquisition environment on iris recognition”, 97 pages, (2009). Retrieved from the Internet: URL:http://etd.nd.edu/ETD-db/thesesjavailablejetd-12112009-103348/ [retrieved on Jul. 21, 2014]. |
Wildes, R.P., “Iris Recognition: An Emerging Biometric Technology”, Proc. IEEE 85(9):1348-1363 (1997). |
Written Opinion of the International Searching Authority in PCT/US2008/074737, mailed Jan. 23, 2009, 6 pages. |
International Search Report in PCT/US2008/074737, mailed Jan. 23, 2009, 4 pages. |
International Preliminary Report on Patentability in PCT/US2008/074737 dated Mar. 2, 2010, 7 pages. |
Notice of Allowance in U.S. Appl. No. 12/658,706, mailed Feb. 24, 2012, 8 pages. |
Written Opinion of the International Searching Authority in PCT/US2008/074751 mailed Jan. 28, 2009, 4 pages. |
International Search Report in PCT/US2008/074751, mailed Jan. 28, 2009, 2 pages. |
International Preliminary Report on Patentability in PCT/US2008/074751 dated Mar. 2, 2010, 5 pages. |
Written Opinion of the International Searching Authority in PCT/US2012/025468, mailed Sep. 14, 2012, 3 pages. |
International Search Report in PCT/US2012/025468, mailed Sep. 14, 2012, 3 pages. |
International Preliminary Report on Patentability in PCT/US2012/025468 dated Aug. 21, 2013, 4 pages. |
Office Action in U.S. Appl. No. 12/675,189 dated Dec. 7, 2012. |
International Preliminary Report on Patentability in PCT/US2012/032391, dated Oct. 8, 2013, 8 pages. |
Written Opinion of the International Searching Authority in PCT/US2012/032391, mailed Jul. 25, 2013, 7 pages. |
International Search Report in PCT/US2012/032391, mailed Jul. 25, 2013, 3 pages. |
Office Action in U.S. Appl. No. 13/493,455 mailed Sep. 19, 2013. |
Office Action in U.S. Appl. No. 13/773,168, mailed Oct. 8, 2013, 16 pages. |
Office Action in U.S. Appl. No. 13/773,159, mailed Oct. 31, 2013, 16 pages. |
Office Action in U.S. Appl. No. 13/440,707, mailed Jan. 14, 2014, 16 pages. |
Office Action in U.S. Appl. No. 13/807,256, mailed Jan. 29, 2014, 16 pages. |
Office Action in U.S. Appl. No. 13/493,455 mailed Apr. 9, 2014. |
Office Action in U.S. Appl. No. 13/398,562, mailed May 21, 2014, 11 pages. |
Office Action in U.S. Appl. No. 13/773,159, mailed Jun. 18, 2014, 26 pages. |
Office Action in U.S. Appl. No. 13/773,168, mailed Jul. 16, 2014, 19 pages. |
Notice of Allowance in U.S. Appl. No. 13/493,455, mailed Jul. 18, 2014, 5 pages. |
Extended European Search Report in EP Application No. EP 12866256.6, dated Aug. 1, 2014, 7 pages. |
Office Action in U.S. Appl. No. 13/786,079, mailed Sep. 26, 2014, 8 pages. |
Office Action in U.S. Appl. No. 13/440,707, mailed Sep. 30, 2014, 22 pages. |
Number | Date | Country | |
---|---|---|---|
20120243749 A1 | Sep 2012 | US |
Number | Date | Country | |
---|---|---|---|
60969607 | Sep 2007 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 12658706 | Feb 2010 | US |
Child | 13493462 | US | |
Parent | PCT/US2008/074751 | Aug 2008 | US |
Child | 12658706 | US | |
Parent | PCT/US2008/074737 | Aug 2008 | US |
Child | 12658706 | US |