System and method for iris data acquisition for biometric identification

Information

  • Patent Grant
  • Patent Number
    9,946,928
  • Date Filed
    Monday, April 24, 2017
  • Date Issued
    Tuesday, April 17, 2018
  • Examiners
    • Park; Edward
  • Agents
    • Foley & Lardner LLP
    • Pua; Paul M. H.
Abstract
A system and related method are disclosed for acquiring high quality images of the iris of an unconstrained subject, comprising a camera; a controllable focusing component; a focus controller component that controls the lens to focus at successively different points within a focus range, such focus control being performed without any feedback from a measurement of whether the image is in or out of focus, whether derived from the image itself or from a distance measurement to the subject; and a sharpness detection component that rejects the most out-of-focus images based on a focus measurement of each image.
Description
BACKGROUND

This disclosure relates to systems and methods for acquiring biometric and other imagery, and to biometric acquisition, identification, fraud detection, and security systems and methods, particularly those that employ iris recognition. More particularly, the disclosure relates to systems and methods for acquiring iris data for iris recognition.


Iris recognition systems have been in use for some time. The acquisition of images suitable for iris recognition is inherently challenging, and the performance of recognition algorithms depends on the quality, i.e., the sharpness and contrast, of the image of the iris of the subject who is to be identified. The difficulty has many causes. First, the iris itself is relatively small (approximately 1 cm in diameter), and it must often be observed from a great distance to avoid constraining the position of the subject, or while the subject is walking or riding; this results in a small field of view and also a small depth of field. Second, it is generally difficult for an adult or child subject to stay absolutely still. Third, the subject may blink involuntarily, or momentarily drop or swivel their head, for example to check on the whereabouts of luggage.


In biometric identification applications, the unconstrained motion of cooperative or non-compliant subjects has made it very difficult to acquire iris images of sufficient quality for recognition and identification processing. Iris acquisition systems typically check whether the quality of an acquired image exceeds a threshold, and many methods of assessing quality have been developed, such as the focus-based measures disclosed in U.S. Pat. No. 6,753,919. The problem with this approach is that if the acquired image quality does not exceed the threshold, then no data is acquired, despite the fact that there may never be another opportunity to acquire data from that subject. More specifically, in the case of unconstrained users or non-cooperative subjects, it may be impossible to have the subject position themselves, or to wait, until the acquired image data exceeds the quality threshold. For example, the subject may be distracted, turning their head in various directions, or may be in the process of performing another task, such as boarding a bus, so that the opportunity to acquire data has already come and gone. Prior iris data acquisition systems have typically been designed to explicitly avoid capturing lower quality data, with an emphasis on waiting, or on constraining the user, so that only the highest quality data is acquired. We have determined that even a lower quality iris image (blurred, for example) can still contain substantial evidence for matching, albeit not with the precision of a high quality iris image; we nonetheless still wish to acquire high quality data when it is possible to do so. In other prior systems, such as those disclosed in U.S. Pat. No. 5,151,583, autofocus routines are used in an attempt to obtain high quality iris images. However, autofocus routines introduce lag and inaccuracy, resulting in poor quality or even non-existent imaging. Still other systems, such as those disclosed by Daugman in U.S. Pat. No. 6,753,919, use sensors to assist a subject in aligning and focusing a handheld video camera.


Most if not all automatic focus systems work by acquiring an image of the scene, processing the image to recover a measure of focus, using that measure to move a lens-focus actuator, and then repeating these steps of acquisition, processing, and actuation many times until the processing step determines that focus has been reached. In most iris recognition systems, autofocus is never able to catch up with the actual position of the subject unless the subject is relatively stationary, due to the unusually small depth of field in iris recognition, as well as the requirement that the focus be on the iris (as opposed to the nose, for example).


Because of the time delays involved in acquiring an image, processing it, and mechanically actuating the lens, auto-focus algorithms cannot respond instantaneously. Moreover, as the depth of field shrinks, as is typically the case in iris recognition where the object is small and observed at high magnification, it becomes more difficult for auto-focus algorithms to succeed, because any error in the auto-focus position is far more apparent in the imagery.


It is much more difficult for auto-focus to acquire in-focus imagery of a subject who is moving, even slightly (fractions of an inch).


Because there is a finite control-loop time for standard auto-focus to actuate, it can be shown that if a component of a person's motion is at a frequency above the control loop's response time, then the auto-focus will never be able to converge and acquire an in-focus image of the person. The auto-focus will be continually “hunting” for a focused image and will always lag the motion of the subject. The result is that the subject has to be rock-steady and still when standard auto-focus is used, and this was the state of the art in iris recognition before the present invention.
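
To make the hunting argument concrete, here is a minimal illustrative model of our own (not from the disclosure, which states the conclusion qualitatively): assume the subject's depth oscillates sinusoidally with amplitude A and frequency f, and the auto-focus loop has a total delay τ.

```latex
% Assumed model: subject depth z(t) = z_0 + A sin(2 pi f t); a closed loop
% with total delay tau can at best track the delayed position z(t - tau).
\[
  \max_t \bigl| z(t) - z(t-\tau) \bigr| = 2A \,\bigl| \sin(\pi f \tau) \bigr|,
\]
% which approaches the full excursion 2A as f*tau -> 1/2. Whenever this
% worst-case lag error exceeds the depth of field, the loop can never hold
% the iris in focus, no matter how accurate each individual step is.
```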


Prior attempts to solve these autofocus problems use the same closed-loop approach but assume the subject is moving in a straight line, then use image measurements to try to predict where the person will be in the next frame. This approach is not very robust and fails for the random movements that subjects often exhibit. Other auto-focus systems compute focus measures over one or more regions of the scene in different ways to obtain the most accurate focus score. But when a subject is moving at frequencies beyond the control loop of an auto-focus algorithm, the algorithm is unable to catch up with the person's motion and acquire a good image.


Martin, et al., US Pat. Pub. 2008/0075335, disclose a biometric image selection method which reduces the rate of non-exploitable images supplied to an analysis and identification processing module, using sharpness and contrast criteria. In some embodiments Martin et al. locate a pattern in each image of a sequence of images, estimate the speed of displacement of the pattern between two successive images in the sequence, and select images for which the estimated speed of displacement of the pattern is lower than a speed threshold. Martin et al. disclose embodiments wherein two selection modules are provided, the first being a quick selection module and the second a pupil tracking module, each rejecting an image if it falls below a contrast or sharpness threshold. The selection module in some embodiments selects the images having the highest sharpness and/or contrast out of the images stored. Martin et al. do not, however, disclose a system or method for acquiring the series of images, nor do they disclose storing only images having higher quality than previously stored images and removing the lesser quality images from memory storage.


SUMMARY

The foregoing disadvantages and problems are overcome by the present invention, which automatically acquires a series of images, analyzes the images for quality, and stores only the best quality image, not necessarily dependent on whether the quality exceeds a predetermined threshold, thereby saving memory and assuring that at least one image is stored even if none exceeds a quality threshold. In a second embodiment, the system does not require an auto-focusing system but instead automatically acquires a series of images at different focus settings, regardless of the quality of images previously acquired, again analyzing the images for quality and storing only the best quality image. The invention is an iris image acquisition system that, over the smallest possible time period for a particular subject, stores successively better quality images of the iris from among the images acquired, to ensure that at least some biometric data of the subject is acquired, while at the same time accommodating arbitrary and rapid subject motion, and voluntary or involuntary subject actions such as, for example, eye blinks or head twists, all with a minimal memory requirement.


The invention is directed to acquiring iris images of optimum quality for further processing, which comprises matching iris images of unknown subjects to iris image templates of known subjects. In another aspect, the invention comprises a system and method of acquiring iris images having the best focus without use of autofocus systems or methods. In another aspect, the invention comprises a method of acquiring iris images comprising deploying a lens with a controllable, adjustable focus, and adjusting focus without feedback from a focus measurement value. In some embodiments the lens is scanned over a range of focus values. The system of the invention controls the lens to perform an opportunistic capture, scanning through different slices of the depth volume while acquiring data. The quality of each captured image is calculated using algorithms which, for example, analyze sharpness and/or contrast, or other parameters indicative of quality and suitability for further biometric processing. The system of the invention can use algorithms that look for an absolute measure of eye focus, since an eye has some generic features in common across large populations, or for a peak in the focus measure as images are acquired over the scanned focus range.





BRIEF DESCRIPTION OF THE DRAWINGS

These and other objects, features, and advantages of embodiments are presented in greater detail in the following description when read in relation to the accompanying drawings, to which the invention is not limited, in which:



FIG. 1 is a flow chart illustrating a system of the invention.



FIGS. 2-5 are graphical representations of the relationship between the focus point of a sensor, the distance between a moving subject and the sensor, and a fairly constant depth of field, as images T1, T2, . . . Tn are acquired over time, with examples of face/eye, iris, and status at different image acquisition times Tx; each figure illustrates a different focus pattern.



FIG. 6 is a graphical representation of the improving quality of iris images stored in the list over time.





DETAILED DESCRIPTION

While the invention is capable of many embodiments, only a few illustrative embodiments are described below.


Referring first to FIG. 1, which illustrates a process flow according to the invention, the process begins with a module 100 that determines whether Acquisition for a particular subject should be started. This module 100 may comprise several components, depending on the specific application; for example, it may consist of a motion detector module, or a trigger indicating that a previous subject has successfully performed a transaction with the system.


Upon initiating the acquisition, a local list of successively better images from the prior subject is cleared 101 in preparation for the next subject.


An image is then acquired 102 using a camera system. A camera system is used that can capture images either synchronously at a constant rate, or asynchronously on request by a computer-controlled trigger signal. As discussed later, the camera may be operated at a variable acquisition rate depending on the results of previous processing.


A Quality Metric module 103, comprising, for example, one or more of the following sub-modules: a face detector, an eye detector, a focus measurement, and an iris area detector, is used to measure the quality of each acquired image in sequence, when sufficient computing capacity is available but not necessarily simultaneously with image acquisition. As discussed later, one or all of these sub-modules may be invoked at a particular time instant depending on the results of previous processing. The quality analysis and selection system of Martin et al. in US 2008/0075335, supra, which is hereby incorporated by reference in its entirety, is one suitable for use in the present invention, wherein only the best, or a small, limited number, of the highest quality acquired images is stored in memory.
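
As a minimal sketch of such a Quality Metric module, assuming OpenCV's stock Haar cascades for the face and eye detector sub-modules and a variance-of-Laplacian focus measure (illustrative choices of ours, not the detectors specified in the disclosure):

```python
import cv2

# Stock OpenCV detectors, used here as stand-ins for the patent's
# face-detector and eye-detector sub-modules (an assumption).
_FACE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
_EYE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def focus_measure(gray):
    """Sharpness proxy: variance of the Laplacian (higher = sharper).
    A common choice; the disclosure does not mandate a specific measure."""
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def quality_metric(image_bgr):
    """Score one frame using the sub-modules named above."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = _FACE.detectMultiScale(gray)
    if len(faces) == 0:
        return 0.0                       # no face visible (e.g. head turned)
    x, y, w, h = faces[0]
    eyes = _EYE.detectMultiScale(gray[y:y + h, x:x + w])
    if len(eyes) == 0:
        return 0.0                       # blink or closed eye
    ex, ey, ew, eh = eyes[0]
    eye_roi = gray[y + ey:y + ey + eh, x + ex:x + ex + ew]
    return focus_measure(eye_roi)        # focus scored on the eye region only
```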


An Acquisition Stopped module 104 performs an Acquisition Stopped routine. This module 104 ensures that the overall process is not performed unnecessarily if, for example, the subject has walked away without any data being acquired. The Acquisition Stopped module may consist of a time-out counter that compares against a threshold the difference between the current time and the time at which the Acquisition process was started. The process for a particular subject can be terminated 109, or the latest image can be stored 107 if its Quality Metric 103 is better than that of the best quality image stored at 110.


A Comparator module 105 then compares the results of the Quality Metric module with the results stored in a Local List in storage module 110. In the first iteration of the process there will be no data in the Local List in storage module 110; after several iterations, some data may be present within the Local List 110. If the results of the Quality Metric module 103 are greater than any of those on the Local List 110, then the imagery data is stored on the Local List. Storage may comprise appending the imagery data to the Local List 110, or may comprise replacing 107 imagery data on the Local List that has a lower Quality Metric 103 value.
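
A minimal sketch of the Comparator 105 and Local List 110 logic might look as follows; the class name and the capacity parameter are our own illustrative choices (capacity=1 gives the single-slot, overwrite-in-place variant described later):

```python
class LocalList:
    """Keeps only images whose quality exceeds everything stored so far."""

    def __init__(self, capacity=1):
        self.capacity = capacity
        self.entries = []               # (quality, image) pairs, best last

    def offer(self, image, quality):
        """Store the image only if it beats the current best (module 105)."""
        if self.entries and quality <= self.entries[-1][0]:
            return False                # not better; discard immediately
        self.entries.append((quality, image))
        if len(self.entries) > self.capacity:
            self.entries.pop(0)         # replace 107: drop the lesser image
        return True

    @property
    def best(self):
        return self.entries[-1][1] if self.entries else None
```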


Step 108 is optional, as indicated by the box drawn with broken lines. In certain embodiments where step 108 is absent, additional imagery is acquired automatically at a fixed focus, without changing focus values; the quality of the imagery then depends on the exact location of a moving subject within the capture volume at the times successive images are acquired. In certain other embodiments where module 108 is present, the focus setting of the camera acquisition system is independently modified prior to acquiring the next image. Several methods for modifying the focus setting, discussed later, can be employed, such as the sawtooth schedule sketched below.
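
For instance, the sawtooth focus schedule of FIG. 2 can be generated open-loop in a few lines; the focus range and step count here are placeholder values of ours, not from the disclosure:

```python
def sawtooth_focus(step_index, near=0.3, far=0.6, steps=8):
    """Open-loop focus schedule: sweep near -> far in fixed increments,
    then wrap around (the sawtooth of FIG. 2). Distances in meters are
    illustrative assumptions."""
    frac = (step_index % steps) / (steps - 1)   # position along the ramp
    return near + frac * (far - near)
```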


After the focus has been modified, imagery is once again acquired 102 in the next iteration of the process.


The process continues 109 until either the timeout condition described above occurs, or the Quality Metric 103 exceeds a threshold value.
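
Putting the modules of FIG. 1 together, a sketch of the full loop might read as below. The three injected callables stand in for modules 102, 103, and 108 (`grab` is a hypothetical camera API of ours), and the single-image store corresponds to the replace-in-place variant of the Local List:

```python
import time

def acquire_best_iris_image(grab, quality, next_focus,
                            timeout_s=10.0, quality_stop=None):
    """One pass of the FIG. 1 loop (a sketch; all callables are injected):
      grab(focus)      -> image        (module 102, hypothetical camera API)
      quality(image)   -> float        (module 103)
      next_focus(step) -> focus value  (module 108, e.g. the sawtooth)
    """
    best_img, best_q = None, float("-inf")
    started = time.monotonic()
    step = 0
    while time.monotonic() - started < timeout_s:      # timeout, module 104
        image = grab(next_focus(step))                 # modules 108 + 102
        q = quality(image)                             # module 103
        if q > best_q:                                 # comparator 105
            best_img, best_q = image, q                # store/replace 107, 110
        if quality_stop is not None and best_q >= quality_stop:
            break                                      # early exit 109
        step += 1
    return best_img    # best image so far, even if below any threshold
```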


Referring now to FIG. 2, the top illustration shows the disposition of an unconstrained subject over a period of time, at times T0 through T6, showing that the subject may turn his head or blink, for example. The solid, dark line at the bottom of FIG. 2 shows the subject's distance from the camera acquisition system over time. Note that the subject moves closer to and then further from the camera sensor in a random fashion, due to their relaxed disposition or inability to remain exactly stationary. The dotted line shows the Focus Setting position at different time instants; in this case, the Focus Setting follows a sawtooth waveform over time. The small vertical bars on the dotted line indicate the depth of field of the sensor: if the subject's depth intersects any point within a small vertical bar, then the subject is in focus. The “Status” row at the top describes the status of the subject with respect to the image acquisition system. For example, at T=T0 the subject's head is turned and no face is visible. At T=T2 the subject's depth intersects the depth of field of the focus setting at that time, but the subject's eyelid happens to be closed at that point in time. At T=T3, on the other hand, the subject's eye is present and at least partially open, so the resultant Quality Metric has a finite value, albeit lower than optimal since the image is slightly out of focus; the imagery at T=T3 is therefore placed on the Local List. At T=T5 the subject's eye is present and at least partially open, and the subject's depth intersects the depth of field of the focus setting at that time, so the Quality Metric has a higher value than that of the image already on the Local List; the image at T=T5 therefore either is added to the Local List or replaces the existing image on it, depending on the particular embodiment of the invention.



FIG. 3 shows another embodiment of the invention with a different focus setting routine. The subject's disposition is as in the previous example, but the camera acquisition module is capable of performing rapid data acquisition over short time periods when certain conditions are met. Rapid data acquisition is not performed all the time, since it is prevented by limitations in bandwidth and processing speed. In the embodiment shown in FIG. 3, the selected condition for performing short-duration rapid data collection for a fixed time period (in this case from T=T3 to T=T6) is the detection of a face, an eye, and an iris that is open but blurred. If most of the criteria for successful acquisition have been met, then only a few additional criteria need to change before valid iris data can be acquired, and it is therefore more probable than at other time instants that valid iris data may soon appear. The rate of data acquisition is therefore increased in order to be ready to capture more iris data than would otherwise have been captured; a minimal sketch of such a policy follows.
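
A sketch of such a rate-switching policy (the interval values are assumptions of ours, not from the disclosure):

```python
def acquisition_interval(face, eye, iris_open, in_focus,
                         normal_s=0.5, rapid_s=0.1):
    """FIG. 3 policy sketch: when almost every criterion is already met
    (face + open eye + iris found, only focus still failing), switch to
    the rapid capture interval for the burst period."""
    nearly_there = face and eye and iris_open and not in_focus
    return rapid_s if nearly_there else normal_s
```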


Referring again to FIG. 3, the thick vertical lines around T=T5 show that four images were acquired around this time period during the rapid acquisition mode, rather than just one image as in the prior embodiment.


Referring to FIG. 4, the subject is moving generally towards the camera, in addition to moving randomly. In this case the focus setting is a combination of an auto-focus value computed from the average focus of prior settings and a sawtooth waveform as described in the first embodiment. In this case, valid iris images are stored on the Local List at T=T3, T=T5 and T=T6.
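
A sketch of this combined schedule, assuming a simple running average of recent in-focus settings as the sweep centre (the averaging, dither amplitude, and fallback value are illustrative assumptions):

```python
def combined_focus(prior_in_focus_settings, step_index, amplitude=0.05):
    """FIG. 4 sketch: centre the sweep on the average of recent in-focus
    settings (tracking a subject walking toward the camera) and add a
    small sawtooth dither around that centre."""
    if prior_in_focus_settings:
        centre = sum(prior_in_focus_settings) / len(prior_in_focus_settings)
    else:
        centre = 0.45                   # mid-range fallback, placeholder value
    frac = (step_index % 5) / 4         # 5-step sawtooth in [0, 1]
    return centre + amplitude * (2 * frac - 1)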



FIG. 6 is a graph showing on the Y-axis the Quality Metric value of images as they are placed on the Local List over a short time period. Imagery is typically placed on the list rapidly at first, but as more data accumulates on the list it becomes more difficult, and therefore takes longer, for new imagery to exceed the Quality Metrics already on the list. An example Quality Metric is Q=F(A+delta), where F is a focus measure for which high values indicate more focused imagery, and A is the estimated area of the iris. Various known, alternative methods for segmenting the iris, extracting the area, and quantifying focus can be used.
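
Reading Q=F(A+delta) as the product of the focus measure and the offset iris area (one plausible interpretation; the disclosure does not spell out the operator or the value of delta), the metric is a one-liner:

```python
def quality_score(focus_F, iris_area_A, delta=1e-3):
    """Q = F * (A + delta), reading the disclosure's Q=F(A+delta) as a
    product; delta keeps Q ordered by focus even when the segmented iris
    area is zero. Both the reading and the delta value are assumptions."""
    return focus_F * (iris_area_A + delta)
```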


The method is highly effective in many respects. A first advantage of the invention is that if the disposition of the subject is immediately amenable to successful data acquisition (e.g., the eyes are open and the face is facing the system), then the system will acquire iris imagery very rapidly. There are many methods for detecting the presence of an eye. For example, the Hough transform disclosed in U.S. Pat. No. 3,069,654 can be configured to locate the circular segments of the eye due to the iris/sclera boundary and the pupil/iris boundary.
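
For illustration, a circular Hough transform is available off the shelf in OpenCV; the parameter values below are illustrative assumptions, not tuned to any particular sensor or to the patented configuration:

```python
import cv2
import numpy as np

def find_iris_circles(gray):
    """Sketch: detect circular iris/sclera and pupil/iris boundaries
    with OpenCV's Hough circle transform."""
    gray = cv2.medianBlur(gray, 5)      # suppress eyelash/specular noise
    circles = cv2.HoughCircles(
        gray, cv2.HOUGH_GRADIENT, dp=1.5, minDist=40,
        param1=120,                     # Canny high threshold
        param2=30,                      # accumulator threshold
        minRadius=20, maxRadius=120)    # pupil/iris radii in pixels, assumed
    return np.uint16(np.around(circles[0])) if circles is not None else []
```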


However, if the subject is fidgeting, unable to remain stationary, or distracted by baggage or children, for example, then the acquisition system will still acquire imagery, although it might take a slightly longer period of time. The acquisition time for an amenable subject, however, is not penalized by the system's delays in acquiring data from a less amenable subject. This is crucial when subject throughput is considered, and is to be contrasted with systems that acquire and store a large number of images and then perform processing on the images to select imagery.


A second advantage of the invention is the ability to acquire successively better iris imagery. In the current art, iris image acquisition systems have typically output a single image of the iris deemed to have a quality suitable for matching, usually one exceeding a threshold; if no such image is found, then no iris data is captured. The problem with the current art is that in some applications there will be no second chance to acquire better data, because the subject has gone elsewhere or is fed up with using the system. Ironically, the iris imagery they presented may have had plenty of information for the particular application at hand. For example, if the image acquisition system is used to gain entry into a house with only 100 enrolled subjects, then some of the iris imagery acquired earlier in the acquisition process may be sufficient.


A third advantage of the invention is its efficient use of memory, which is significant especially when an embedded device is used. The Local List contains only iris imagery that is successively of better quality than the prior imagery, and does not contain all of the imagery originally acquired. In addition, depending on the application, the Local List can comprise a single image, which is replaced each time imagery of better quality is detected. After processing is complete, the resultant image remaining in the Local List is the best quality imagery acquired.


In one embodiment, the invention obtains in-focus images by using a focus controller component that controls the lens to focus at successively different points within a focus range, such scan control being performed without any input from measurements of whether the image is in or out of focus, whether derived from the image itself or from other distance metrics to the subject. The relationships among focus scan speed, frame rate, and exposure time, and the related algorithms, are known to those skilled in this art.


Even when a subject is trying to stand still, there will be residual motion. In some embodiments the system can increase or decrease the rate of image capture at different focuses according to the degree of motion of the subject, for example as sketched below.
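
A sketch of such motion-adaptive pacing (the linear scaling and the rate constants are assumptions of ours):

```python
def capture_rate_hz(motion_px_per_frame, base_hz=5.0, max_hz=30.0):
    """Scale the per-focus capture rate with the estimated residual
    motion (e.g. inter-frame eye displacement in pixels), capped at the
    camera's maximum rate. Constants are illustrative assumptions."""
    return min(max_hz, base_hz * (1.0 + motion_px_per_frame))
```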


The system acquires a varying number of images to account for the fact that in some cases a good image may be acquired on the first acquisition, while in other cases 10 or 20 acquisitions or more may be needed. If the system simply fixed the number of image acquisitions at 10 or 20, then the average time taken to use the device would be set by the worst case rather than adapting to the quality of the iris imagery, dramatically slowing use of the device and reducing the throughput of people using it.


It is not sufficient for the focus merely to be set opportunistically at the correct focal distance, since, for example, the subject may blink or turn away even while the image is in focus.


If 10 or 20 or more images are being acquired, storing them all can take up a lot of memory, which can be expensive in an embedded device. The system of the invention instead successively checks whether the latest iris image quality is better than the best iris image stored previously, and only in that case stores it. Alternatively, the system can overwrite the best iris image acquired so far, replacing it with the better image. In this way, the system always has the best possible iris image stored without having to use extensive memory. If the subject turns away and the system loses its opportunity to ever again acquire iris data from that subject, the best possible image, even if not of high quality, will have been stored, and such an image may have sufficient quality for biometric identification under the circumstances.


In addition to controlling the area to which the camera is pointed, we can also control a focus control system such that a capture volume is swept through. Unlike autofocus, which requires settling time and many discontinuous stop/start steps that can eventually wear down components and take time to respond, we simply sweep rapidly through a focus volume in order to opportunistically acquire biometric imagery.


While the invention has been described and illustrated in detail herein, various other embodiments, alternatives, and modifications should become apparent to those skilled in the art without departing from the spirit and scope of the invention. The claims, therefore, should not be considered limited to the illustrated embodiments.

Claims
  • 1. A system for acquiring a series of images of an unconstrained subject, the system comprising: a sensor configured to acquire one or more images of an unconstrained subject; a controller configured to: detect an eye that is open or partially open in the acquired image; monitor quality criteria comprising an estimated extent of an iris of the unconstrained subject that is exposed according to the openness of the eye and acquired in the one or more images; determine, according to the monitored quality criteria comprising the estimated extent of the iris that is exposed and acquired, a time period during which there is a high probability that iris biometric data valid for biometric matching can be acquired from the unconstrained subject; increase, responsive to the determination, a rate of acquisition of images of the unconstrained subject within the determined time period, without input from measurement of whether a corresponding image to be acquired is in focus; and replace a first image stored in a storage device with a second image acquired within the determined time period, when a quality score of the second image is higher than that of the first image.
  • 2. The system of claim 1, wherein the controller is further configured to determine the quality score of the second image according to an extent of the iris that is acquired in the second image and available for biometric matching.
  • 3. The system of claim 2, wherein the controller is further configured to determine the quality score of the second image according to sharpness or contrast of features determined in the second image.
  • 4. The system of claim 1, wherein the controller is further configured to measure an extent of a physical area of the iris that is exposed for biometric acquisition and available for biometric matching.
  • 5. The system of claim 1, wherein the controller is further configured to detect a feature of the subject's face or eye on the acquired one or more images.
  • 6. The system of claim 1, wherein the controller is configured to further determine the time period according to at least one of: a result of biometric matching using the acquired image, and detection of a physical feature or motion of the unconstrained subject from the one or more images.
  • 7. The system of claim 1, wherein the controller is configured to determine at least one of: a duration for the time period, an image acquisition rate, or a number of images, for the acquisition of the images within the determined time period.
  • 8. The system of claim 1, wherein the controller is configured to determine, based on an estimated degree of motion of the subject, at least one of: a duration for the time period, a rate for acquisition and a number of images, for the acquisition of the images within the determined time period.
  • 9. The system of claim 1, wherein the sensor is configured to acquire the images within the determined time period, at focus values according to one of a monotonic pattern, a sawtooth pattern, and a random scan pattern.
  • 10. The system of claim 1, wherein the controller rejects at least one of the images acquired within the determined time period, based on one or more of: a result of biometric matching, measurement of image sharpness, measurement of image contrast, and detection of a feature of the subject's eye.
  • 11. A method for acquiring images of an unconstrained subject, the method comprising: acquiring, by a sensor, one or more images of an unconstrained subject; detecting an eye that is open or partially open in the acquired image; monitoring, by a controller executing on one or more processors, quality criteria comprising an estimated extent of an iris of the unconstrained subject that is exposed according to the openness of the eye and acquired in the one or more images; determining, by the controller according to the monitored quality criteria comprising the estimated extent of the iris that is exposed and acquired, a time period during which there is a high probability that iris biometric data valid for biometric matching can be acquired from the unconstrained subject; increasing, by the controller responsive to the determination, a rate of acquisition of images of the unconstrained subject within the determined time period, without input from measurement of whether a corresponding image to be acquired is in focus; and replacing, by the controller, a first image stored in a storage device with a second image acquired within the determined time period, when a quality score of the second image is higher than that of the first image.
  • 12. The method of claim 11, further comprising determining, by the controller, the quality score of the second image according to an extent of the iris that is acquired in the second image and available for biometric matching.
  • 13. The method of claim 12, further comprising determining, by the controller, the quality score of the second image according to sharpness or contrast of features determined in the second image.
  • 14. The method of claim 11, further comprising measuring, by the controller, an extent of a physical area of the iris that is exposed for biometric acquisition and available for biometric matching.
  • 15. The method of claim 11, further comprising detecting, by the controller, a feature of the subject's face or eye on the acquired one or more images.
  • 16. The method of claim 11, wherein determining the time period further comprises determining the time period according to at least one of: a result of biometric matching using the acquired image, and detection of a physical feature or motion of the unconstrained subject from the one or more images.
  • 17. The method of claim 11, further comprising determining, by the controller, at least one of: a duration for the time period, an image acquisition rate, or a number of images, for the acquisition of the images within the determined time period.
  • 18. The method of claim 11, further comprising determining, by the controller, based on an estimated degree of motion of the subject, at least one of: a duration for the time period, a rate for acquisition and a number of images, for the acquisition of the images within the determined time period.
  • 19. The method of claim 11, further comprising acquiring, by the sensor, the images within the determined time period, at focus values according to one of a monotonic pattern, a sawtooth pattern, and a random scan pattern.
  • 20. The method of claim 11, further comprising rejecting, by the controller, at least one of the images acquired within the determined time period, based on one or more of: a result of biometric matching, measurement of image sharpness, measurement of image contrast, and detection of a feature of the subject's eye.
RELATED APPLICATIONS

This application is a continuation of and claims priority to U.S. application Ser. No. 14/946,956, entitled “SYSTEM AND METHOD FOR IRIS DATA ACQUISITION FOR BIOMETRIC IDENTIFICATION”, filed on Nov. 20, 2015, which is a continuation of and claims priority to U.S. application Ser. No. 13/786,093, entitled “SYSTEM AND METHOD FOR IRIS DATA ACQUISITION FOR BIOMETRIC IDENTIFICATION”, filed on Mar. 5, 2013, issued as Pat. No. 9,192,297 on Nov. 24, 2015, which is a continuation of and claims priority to U.S. application Ser. No. 12/675,189, entitled “SYSTEM AND METHOD FOR IRIS DATA ACQUISITION FOR BIOMETRIC IDENTIFICATION”, filed on Feb. 25, 2010, issued as Pat. No. 8,553,948 on Oct. 8, 2013, which is a national stage entry of International application PCT/US2008/074737, filed Aug. 29, 2008, entitled “SYSTEM AND METHOD FOR IRIS DATA ACQUISITION FOR BIOMETRIC IDENTIFICATION”, which claims priority to U.S. provisional application 60/969,607, filed Sep. 1, 2007, entitled “METHODOLOGY FOR ACQUIRING BIOMETRIC DATA LARGE VOLUMES”, all of which are hereby incorporated by reference for all purposes.

US Referenced Citations (267)
Number Name Date Kind
4231661 Walsh et al. Nov 1980 A
4641349 Flom et al. Feb 1987 A
4910725 Drexler et al. Mar 1990 A
4923263 Johnson May 1990 A
5140469 Lamarre et al. Aug 1992 A
5259040 Hanna Nov 1993 A
5291560 Daugman Mar 1994 A
5488675 Hanna Jan 1996 A
5572596 Wildes et al. Nov 1996 A
5581629 Hanna et al. Dec 1996 A
5613012 Hoffman et al. Mar 1997 A
5615277 Hoffman Mar 1997 A
5737439 Lapsley et al. Apr 1998 A
5751836 Wildes et al. May 1998 A
5764789 Pare et al. Jun 1998 A
5802199 Pare et al. Sep 1998 A
5805719 Pare et al. Sep 1998 A
5838812 Pare et al. Nov 1998 A
5878156 Okumura Mar 1999 A
5901238 Matsushita May 1999 A
5953440 Zhang et al. Sep 1999 A
5978494 Zhang Nov 1999 A
6021210 Camus et al. Feb 2000 A
6028949 McKendall Feb 2000 A
6055322 Salganicoff et al. Apr 2000 A
6064752 Rozmus et al. May 2000 A
6069967 Rozmus et al. May 2000 A
6088470 Camus et al. Jul 2000 A
6144754 Okano et al. Nov 2000 A
6149061 Massieu et al. Nov 2000 A
6192142 Pare et al. Feb 2001 B1
6222903 Kim et al. Apr 2001 B1
6246751 Bergl et al. Jun 2001 B1
6247813 Kim et al. Jun 2001 B1
6252977 Salganicoff et al. Jun 2001 B1
6289113 McHugh et al. Sep 2001 B1
6301375 Choi Oct 2001 B1
6320610 Van Sant et al. Nov 2001 B1
6366682 Hoffman et al. Apr 2002 B1
6373968 Okano et al. Apr 2002 B2
6377699 Musgrave et al. Apr 2002 B1
6424727 Musgrave et al. Jul 2002 B1
6483930 Musgrave et al. Nov 2002 B1
6532298 Cambier et al. Mar 2003 B1
6542624 Oda Apr 2003 B1
6545810 Takada et al. Apr 2003 B1
6546121 Oda Apr 2003 B1
6554705 Cumbers Apr 2003 B1
6587597 Nakao et al. Jul 2003 B1
6594376 Hoffman et al. Jul 2003 B2
6594377 Kim et al. Jul 2003 B1
6652099 Chae et al. Nov 2003 B2
6700998 Murata Mar 2004 B1
6701029 Berfanger et al. Mar 2004 B1
6714665 Hanna et al. Mar 2004 B1
6760467 Min et al. Jul 2004 B1
6763148 Sternberg et al. Jul 2004 B1
6819219 Bolle et al. Nov 2004 B1
6832044 Doi et al. Dec 2004 B2
6850631 Oda et al. Feb 2005 B1
6917695 Teng et al. Jul 2005 B2
6920236 Prokoski Jul 2005 B2
6930707 Bates et al. Aug 2005 B2
6944318 Takata et al. Sep 2005 B1
6950536 Houvener Sep 2005 B2
6980670 Hoffman et al. Dec 2005 B1
6985608 Hoffman et al. Jan 2006 B2
7007298 Shinzaki et al. Feb 2006 B1
7020351 Kumar et al. Mar 2006 B1
7047418 Ferren et al. May 2006 B1
7095901 Lee et al. Aug 2006 B2
7106366 Parker et al. Sep 2006 B2
7146027 Kim et al. Dec 2006 B2
7152782 Shenker et al. Dec 2006 B2
7209271 Lewis et al. Apr 2007 B2
7212330 Seo et al. May 2007 B2
7221486 Makihira et al. May 2007 B2
7236534 Morejon et al. Jun 2007 B1
7248719 Hoffman et al. Jul 2007 B2
7271939 Kono Sep 2007 B2
7272265 Kouri et al. Sep 2007 B2
7346472 Moskowitz et al. Mar 2008 B1
7385626 Aggarwal et al. Jun 2008 B2
7398925 Tidwell et al. Jul 2008 B2
7414737 Cottard et al. Aug 2008 B2
7418115 Northcott et al. Aug 2008 B2
7428320 Northcott et al. Sep 2008 B2
7542590 Robinson et al. Jun 2009 B1
7545962 Peirce et al. Jun 2009 B2
7558406 Robinson et al. Jul 2009 B1
7558407 Hoffman et al. Jul 2009 B2
7574021 Matey Aug 2009 B2
7583822 Guillemot et al. Sep 2009 B2
7606401 Hoffman et al. Oct 2009 B2
7616788 Hsieh et al. Nov 2009 B2
7639840 Hanna et al. Dec 2009 B2
7652695 Halpern Jan 2010 B2
7660700 Moskowitz et al. Feb 2010 B2
7693307 Rieul et al. Apr 2010 B2
7697786 Camus et al. Apr 2010 B2
7715595 Kim et al. May 2010 B2
7719566 Guichard May 2010 B2
7760919 Namgoong Jul 2010 B2
7770019 Ferren et al. Aug 2010 B2
7797606 Chabanne Sep 2010 B2
7801335 Hanna et al. Sep 2010 B2
7847688 Bernard et al. Dec 2010 B2
7869627 Northcott et al. Jan 2011 B2
7912252 Ren et al. Mar 2011 B2
7916908 Thomas Mar 2011 B1
7925059 Hoyos et al. Apr 2011 B2
7929017 Aggarwal et al. Apr 2011 B2
7929732 Bringer et al. Apr 2011 B2
7949295 Kumar et al. May 2011 B2
7949494 Moskowitz et al. May 2011 B2
7978883 Rouh et al. Jul 2011 B2
8009876 Kim et al. Aug 2011 B2
8025399 Northcott et al. Sep 2011 B2
8028896 Carter et al. Oct 2011 B2
8090246 Jelinek Jan 2012 B2
8092021 Northcott et al. Jan 2012 B1
8132912 Northcott et al. Mar 2012 B1
8159328 Luckhardt Apr 2012 B2
8170295 Fujii et al. May 2012 B2
8181858 Carter et al. May 2012 B2
8195044 Hanna et al. Jun 2012 B2
8212870 Hanna et al. Jul 2012 B2
8214175 Moskowitz et al. Jul 2012 B2
8233680 Bringer et al. Jul 2012 B2
8243133 Northcott et al. Aug 2012 B1
8260008 Hanna et al. Sep 2012 B2
8279042 Beenau et al. Oct 2012 B2
8280120 Hoyos et al. Oct 2012 B2
8289390 Aggarwal et al. Oct 2012 B2
8306279 Hanna Nov 2012 B2
8317325 Raguin et al. Nov 2012 B2
8364646 Hanna et al. Jan 2013 B2
8411909 Zhao et al. Apr 2013 B1
8442339 Martin et al. May 2013 B2
8443202 White et al. May 2013 B2
8553948 Hanna Oct 2013 B2
8604901 Hoyos et al. Dec 2013 B2
8606097 Hanna et al. Dec 2013 B2
8719584 Mullin May 2014 B2
9002073 Hanna et al. Apr 2015 B2
9633260 Hanna Apr 2017 B2
20010028730 Nahata Oct 2001 A1
20020110286 Cheatle et al. Aug 2002 A1
20020131623 Musgrave et al. Sep 2002 A1
20020136435 Prokoski Sep 2002 A1
20030103212 Westphal et al. Jun 2003 A1
20030151674 Lin Aug 2003 A1
20030169334 Braithwaite et al. Sep 2003 A1
20030208125 Watkins Nov 2003 A1
20040013288 Svensson et al. Jan 2004 A1
20040042643 Yeh Mar 2004 A1
20040071363 Kouri et al. Apr 2004 A1
20050084137 Kim et al. Apr 2005 A1
20050084179 Hanna et al. Apr 2005 A1
20050105778 Sung et al. May 2005 A1
20050168321 Fitzgibbon Aug 2005 A1
20050226471 Singh et al. Oct 2005 A1
20050264758 Wakamori Dec 2005 A1
20050270386 Saitoh et al. Dec 2005 A1
20050285943 Cutler Dec 2005 A1
20060028552 Aggarwal et al. Feb 2006 A1
20060029262 Fujimatsu et al. Feb 2006 A1
20060073449 Kumar et al. Apr 2006 A1
20060074986 Mallalieu et al. Apr 2006 A1
20060097172 Park May 2006 A1
20060120707 Kusakari et al. Jun 2006 A1
20060140454 Northcott et al. Jun 2006 A1
20060170813 Morofuji Aug 2006 A1
20060188169 Tener et al. Aug 2006 A1
20060204121 Bryll Sep 2006 A1
20060279630 Aggarwal et al. Dec 2006 A1
20070040903 Kawaguchi Feb 2007 A1
20070098229 Wu et al. May 2007 A1
20070110284 Rieul et al. May 2007 A1
20070110285 Hanna et al. May 2007 A1
20070145273 Chang Jun 2007 A1
20070160265 Wakiyama Jul 2007 A1
20070188613 Nobori et al. Aug 2007 A1
20070206839 Hanna et al. Sep 2007 A1
20070211922 Crowley et al. Sep 2007 A1
20070253596 Murata et al. Nov 2007 A1
20070286462 Usher et al. Dec 2007 A1
20070286524 Song Dec 2007 A1
20080031610 Border et al. Feb 2008 A1
20080044063 Friedman et al. Feb 2008 A1
20080075334 Determan et al. Mar 2008 A1
20080075335 Martin et al. Mar 2008 A1
20080089554 Tabankin et al. Apr 2008 A1
20080122578 Hoyos et al. May 2008 A1
20080259161 Hellman et al. Oct 2008 A1
20080291279 Samarasekera et al. Nov 2008 A1
20090046899 Northcott et al. Feb 2009 A1
20090047010 Yoshida et al. Feb 2009 A1
20090074256 Haddad Mar 2009 A1
20090097715 Cottard et al. Apr 2009 A1
20090161925 Cottard et al. Jun 2009 A1
20090207251 Kobayashi et al. Aug 2009 A1
20090219405 Kaneda et al. Sep 2009 A1
20090231096 Bringer et al. Sep 2009 A1
20090232418 Lolacono et al. Sep 2009 A1
20090268045 Sur et al. Oct 2009 A1
20090273562 Baliga et al. Nov 2009 A1
20090274345 Hanna et al. Nov 2009 A1
20090278922 Tinker et al. Nov 2009 A1
20100014720 Hoyos et al. Jan 2010 A1
20100021016 Cottard et al. Jan 2010 A1
20100033677 Jelinek Feb 2010 A1
20100074477 Fujii et al. Mar 2010 A1
20100127826 Saliba et al. May 2010 A1
20100201853 Ishiga Aug 2010 A1
20100232655 Hanna Sep 2010 A1
20100238407 Dai Sep 2010 A1
20100246903 Cottard Sep 2010 A1
20100253816 Hanna Oct 2010 A1
20100278394 Raguin et al. Nov 2010 A1
20100310070 Bringer et al. Dec 2010 A1
20110002510 Hanna Jan 2011 A1
20110007949 Hanna et al. Jan 2011 A1
20110119111 Hanna May 2011 A1
20110119141 Hoyos et al. May 2011 A1
20110150293 Bower et al. Jun 2011 A1
20110158486 Bringer et al. Jun 2011 A1
20110160576 Bower et al. Jun 2011 A1
20110194738 Choi et al. Aug 2011 A1
20110211054 Hanna et al. Sep 2011 A1
20110277518 Lais et al. Nov 2011 A1
20120127295 Hanna et al. May 2012 A9
20120187838 Hanna Jul 2012 A1
20120212597 Hanna Aug 2012 A1
20120219279 Hanna et al. Aug 2012 A1
20120229617 Yates et al. Sep 2012 A1
20120239458 Hanna Sep 2012 A9
20120240223 Tu Sep 2012 A1
20120242820 Hanna et al. Sep 2012 A1
20120242821 Hanna et al. Sep 2012 A1
20120243749 Hanna et al. Sep 2012 A1
20120257797 Leyvand et al. Oct 2012 A1
20120268241 Hanna et al. Oct 2012 A1
20120293643 Hanna Nov 2012 A1
20120300052 Hanna et al. Nov 2012 A1
20120300990 Hanna et al. Nov 2012 A1
20120321141 Hoyos et al. Dec 2012 A1
20120328164 Hoyos et al. Dec 2012 A1
20130051631 Hanna Feb 2013 A1
20130093838 Tan et al. Apr 2013 A1
20130108125 Storm et al. May 2013 A1
20130110859 Hanna et al. May 2013 A1
20130162798 Hanna et al. Jun 2013 A1
20130162799 Hanna et al. Jun 2013 A1
20130182093 Hanna Jul 2013 A1
20130182094 Hanna Jul 2013 A1
20130182095 Hanna Jul 2013 A1
20130182913 Hoyos et al. Jul 2013 A1
20130182915 Hanna Jul 2013 A1
20130194408 Hanna et al. Aug 2013 A1
20130212655 Hoyos et al. Aug 2013 A1
20130223840 Chang Aug 2013 A1
20130251215 Coons Sep 2013 A1
20130294659 Hanna et al. Nov 2013 A1
20130329079 Florea et al. Dec 2013 A1
20140064574 Hanna et al. Mar 2014 A1
20140072183 Hanna et al. Mar 2014 A1
Foreign Referenced Citations (31)
Number Date Country
101027678 Aug 2007 CN
2007-249556 Sep 2007 JP
1003738500000 Feb 2003 KR
10-2003-0051970 Jun 2003 KR
2003216700000 Jul 2003 KR
1004160650000 Jan 2004 KR
2003402730000 Jan 2004 KR
2003411370000 Jan 2004 KR
2003526690000 May 2004 KR
2003552790000 Jun 2004 KR
2003620320000 Sep 2004 KR
2003679170000 Nov 2004 KR
2003838080000 May 2005 KR
2004046500000 Dec 2005 KR
1005726260000 Apr 2006 KR
10-2009-0086891 Aug 2009 KR
10-2009-0106791 Oct 2009 KR
10-2010-0049407 May 2010 KR
10-2014-0028950 Mar 2014 KR
10-13740490000 Mar 2014 KR
10-2014-0039803 Apr 2014 KR
10-2014-0050501 Apr 2014 KR
2318438 Mar 2008 RU
97839 Sep 2010 RU
WO-2008054396 May 2008 WO
WO-2009029757 Mar 2009 WO
WO-2009029765 Mar 2009 WO
WO-2010062371 Jun 2010 WO
WO-2011093538 Aug 2011 WO
WO-2012112788 Aug 2012 WO
WO-2013109295 Jul 2013 WO
Non-Patent Literature Citations (69)
Entry
U.S. Appl. No. 06/149,061.
Office Action on U.S. Appl. No. 14/830,366 dated Jul. 18, 2016.
Al-Zubi R T et al: Automated personal identification system based on human iris analysis, Pattern Analysis and Applications, Springer-Verlag, LO, vol. 10, No. 2, Nov. 29, 2006 (Nov. 29, 2006), pp. 147-164, XP019493841, ISSN: 1433-755X, sections 1-5, abstract; figures 1-17.
B. Galvin, et al., Recovering Motion Fields: An Evaluation of Eight Optical Flow Algorithms, Proc. of the British Machine Vision Conf. (1998).
Belcher et al, “A Selective Feature Information Approach for Iris Image-Quality Measure”, IEEE, 3(3):572-577 (2008).
Chen Y et al: A highly accurate and computationally efficient approach for unconstrained iris segmentation, Image and Vision Computing, Elsevier, Guildford, GB, vol. 28, No. 2, Feb. 1, 2010 (Feb. 1, 2010), pp. 261-269, XP026777056, ISSN: 0262-8856, DOI: 10.1016/J.IMAVIS.2009.04.017 [retrieved on May 13, 2009], sections 1-7, abstract; figures 1-16.
Daugman, John, “How Iris Recognition Works,” IEEE Transaction on Circuits and Systems for Video Technology, 14(1):21-30 (2004).
Extended European Search Report on 12866256.6 dated Aug. 1, 2014.
Extended European Search Report on 12747311.4 dated Jul. 4, 2016.
First Chinese Office Action on 201280017539.7 dated Mar. 14, 2016.
He, Xiaofu et al., “Contactless Autofeedback Iris Capture Design”, IEEE Transactions on Instrumentation and Measurement, IEEE Service Center, Piscataway, NJ, U.S. 57(7):1369-1375 (2008).
He, Y. et al, “A fast iris image quality evaluation method based on weighted entropy”, SPIE, 6623:1-8 (2007).
International Preliminary Report on Patentability in PCT/US2008/074737 dated Mar. 2, 2010, 7 pages.
International Preliminary Report on Patentability in PCT/US2008/074751 dated Mar. 2, 2010, 5 pages.
International Preliminary Report on Patentability in PCT/US2012/025468 dated Aug. 21, 2013, 4 pages.
International Preliminary Report on Patentability in PCT/US2012/032391, dated Oct. 8, 2013, 8 pages.
International Search Report in PCT/US2008/074737, dated Jan. 23, 2009, 4 pages.
International Search Report in PCT/US2008/074751, dated Jan. 28, 2009, 2 pages.
International Search Report in PCT/US2012/032391, dated Jul. 25, 2013, 3 pages.
International Search Report on PCT/US2012/025468 dated Sep. 14, 2012.
J. R. Bergen, et al., Hierarchical Model-Based Motion Estimation, European Conf. on Computer Vision (1993).
K. Nishino, et al., The World in an Eye, IEEE Conf. on Pattern Recognition, vol. 1, at pp. 444-451 (Jun. 2004).
Lu, Huiqi et al., “Iris Recognition on Low Computational Power Mobile Devices”, 23 pages, (2011). Retrieved from the Internet: URL: http://cdn.intechopen.com/pdfs-wm/14646.pdf [retrieved on Jul. 23, 2014].
Ma, L. et al, “Personal Identification Based on Iris Texture Analysis”, IEEE: Pattern Analysis and Machine Intelligence, 25(12):1519-1533 (2003).
Notice of Allowance dated May 28, 2013 in U.S. Appl. No. 12/675,189.
Notice of Allowance dated Oct. 27, 2014 in U.S. Appl. No. 13/493,462.
Notice of Allowance dated Oct. 9, 2014 in U.S. Appl. No. 13/773,159.
Notice of Allowance on U.S. Appl. No. 12/658,706 dated Feb. 24, 2012.
Notice of Allowance on U.S. Appl. No. 13/398,562 dated Nov. 2, 2015.
Notice of Allowance on U.S. Appl. No. 13/440,707 dated Apr. 20, 2015.
Notice of Allowance on U.S. Appl. No. 13/493,455 dated Feb. 10, 2015.
Notice of Allowance on U.S. Appl. No. 13/493,455 dated Jul. 18, 2014.
Notice of Allowance on U.S. Appl. No. 13/773,168 dated Jan. 23, 2015.
Notice of Allowance on U.S. Appl. No. 13/786,079 dated Apr. 2, 2015.
Notice of Allowance on U.S. Appl. No. 13/786,093 dated Jul. 21, 2015.
Office Action in U.S. Appl. No. 13/398,562, dated May 21, 2014.
Office Action in U.S. Appl. No. 13/440,707, dated Jan. 14, 2014.
Office Action in U.S. Appl. No. 13/773,159, dated Jun. 18, 2014, 26 pages.
Office Action in U.S. Appl. No. 13/773,159, dated Oct. 31, 2013, 16 pages.
Office Action in U.S. Appl. No. 13/773,168, dated Jul. 16, 2014.
Office Action in U.S. Appl. No. 13/773,168, dated Oct. 8, 2013, 16 pages.
Office Action in U.S. Appl. No. 13/807,256, dated Jan. 29, 2014, 16 pages.
Office Action on U.S. Appl. No. 12/675,189 dated Dec. 7, 2012.
Office Action on U.S. Appl. No. 13/398,562 dated Nov. 17, 2014.
Office Action on U.S. Appl. No. 13/440,707 dated Sep. 30, 2014.
Office Action on U.S. Appl. No. 13/493,455 dated Apr. 9, 2014.
Office Action on U.S. Appl. No. 13/493,455 dated Sep. 19, 2013.
Office Action on U.S. Appl. No. 13/493,462 dated Jul. 1, 2014.
Office Action on U.S. Appl. No. 13/786,079 dated Sep. 26, 2014.
Office Action on U.S. Appl. No. 13/786,093 dated Nov. 28, 2014.
Office Action on U.S. Appl. No. 13/786,102 dated Nov. 25, 2014.
Office Action on U.S. Appl. No. 14/946,956 dated Jul. 11, 2016.
Peters, Tanya H. et al., “Effects of segmentation routine and acquisition environment on iris recognition”, 97 pages, (2009). Retrieved from the Internet: URL: http://etd.nd.edu/ETD-db/theses/available/etd-12112009-103348/ [retrieved on Jul. 21, 2014].
R. Kumar, et al., Direct recovery of shape from multiple views: a parallax based approach, 12th IAPR Int'l Conf. on Pattern Recognition (1994).
R. P. Wildes, Iris Recognition: An Emerging Biometric Technology, Proc. IEEE 85(9) at pp. 1348-1363 (Sep. 1997).
Russian Decision on Grant on 2013142254 dated Jan. 12, 2016.
U.S. Notice of Allowance on U.S. Appl. No. 14/830,366 dated Feb. 27, 2017.
U.S. Notice of Allowance on U.S. Appl. No. 14/946,956 dated Mar. 1, 2017.
U.S. Notice of Allowance on U.S. Appl. No. 14/946,956 dated Mar. 23, 2017.
U.S. Office Action on U.S. Appl. No. 14/830,366 dated Dec. 16, 2016.
U.S. Office Action on U.S. Appl. No. 14/946,956 dated Nov. 23, 2016.
Written Opinion of the International Searching Authority in PCT/US2008/074737, dated Jan. 23, 2009, 6 pages.
Written Opinion of the International Searching Authority in PCT/US2008/074751 dated Jan. 28, 2009, 4 pages.
Written Opinion of the International Searching Authority in PCT/US2012/032391, dated Jul. 25, 2013, 7 pages.
Written Opinion on PCT/US2012/025468 dated Sep. 14, 2012.
Yingzi Du et al: “Video-Based Noncooperative Iris Image Segmentation”, IEEE Transactions on Systems, Man and Cybernetics, Part B: Cybernetics, IEEE Service Center, Piscataway, NJ, US, vol. 41, No. 1, Feb. 1, 2011 (Feb. 1, 2011), pp. 64-74, XP011373393, ISSN: 1083-4419, DOI: 10.1109/TSMCB.2010.2045371, sections I-IV, abstract; figures 1-17.
Second Chinese Office Action on 201280017539.7 dated Oct. 11, 2016.
U.S. Notice of Allowance on U.S. Appl. No. 15/487,923 dated Aug. 7, 2017.
U.S. Office Action on U.S. Appl. No. 15/487,923 dated Jun. 6, 2017.
Related Publications (1)
Number Date Country
20170228592 A1 Aug 2017 US
Provisional Applications (1)
Number Date Country
60969607 Sep 2007 US
Continuations (3)
Number Date Country
Parent 14946956 Nov 2015 US
Child 15495782 US
Parent 13786093 Mar 2013 US
Child 14946956 US
Parent 12675189 US
Child 13786093 US