Field of the Invention
The invention relates to biometry, the measurement and analysis of characteristics of people for identification purposes. More specifically, the invention relates to active illumination used during biometry, especially iris and facial recognition, for enhancing the user experience while maintaining image quality.
Description of the Related Art
Active illumination has been widely used in the field of iris recognition, the field of recognizing individuals based on the patterns of the iris in the human eye. For example, Daugman describes a range of iris recognition deployments, all using active infrared illumination [J. Daugman, "The Importance of Being Random: Statistical Principles of Iris Recognition," Pattern Recognition 36 (2003) 279-291]. A problem, however, is that the illumination is often noticed by the subject, which may cause some temporary discomfort while using the system.
Moreover, FIG. 1 shows methods for iris recognition using pulsed lighting 11 synchronized to frame acquisition 10 that have been described in US 2003/0169334 A1 and U.S. Pat. No. 7,542,628, for example, as a means of freezing the motion of an individual while performing iris recognition. In the top graph, the horizontal axis shows time and the vertical axis indicates whether a frame is being acquired. In this example, three frames of a continuous sequence are shown being acquired, with each frame acquired in a finite time period T.
The illumination in these systems is more noticeable to the user due to the repetitive on/off cycle of the illumination. Pulsed Light Emitting Diode (LED) lighting is nonetheless preferable to constant LED lighting in these applications because, for a given average heat dissipation capability of an LED, more power can be concentrated in the pulse during which the frame is being acquired, resulting in higher quality imagery with a higher signal to noise ratio, rather than wasting power during the time period when no image is being acquired.
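As a numerical illustration of this tradeoff (the function name and figures below are hypothetical, not taken from the cited systems), the peak power available for a pulse scales inversely with the duty cycle for a fixed average dissipation budget:

```python
def peak_power_allowed(avg_power_w: float, duty_cycle: float) -> float:
    """Peak pulse power that keeps average dissipation at avg_power_w.

    For a pulse train, average power = peak power * duty cycle (the
    fraction of time the LED is on), so the LED may be driven at
    avg_power_w / duty_cycle during each pulse.
    """
    if not 0.0 < duty_cycle <= 1.0:
        raise ValueError("duty cycle must be in (0, 1]")
    return avg_power_w / duty_cycle

# A 1 W average budget allows 10 W pulses at a 10% duty cycle,
# concentrating light in the interval when the frame is acquired.
print(peak_power_allowed(1.0, 0.10))
```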
In addition, as iris recognition devices have become smaller, a side effect is that the user is more likely to look directly at or near the light sources mounted within a compact case. As such, the impact of the illumination is perceptually much greater than if the illumination were placed elsewhere. Put simply, the illuminators are more noticeable to the user, even though the incident power at the eye is the same as when the illuminators are placed elsewhere.
Of equal importance, the power of LED illumination has increased substantially in recent years, so that the LEDs are that much more noticeable to the user, even though the incident power is the same compared to less powerful LEDs spread over a larger area. High power LEDs can now be pulsed at >=250 mA. We have found that the use of pulsed illumination, combined with the two factors above, vastly increases the user's perception of the illumination. This very high perception of the illumination, even at safe illumination intensity levels, can not only be annoying to a user, it can also trigger photosensitive epilepsy in certain subjects.
Wilkins, in "Visual Stress" (Oxford Univ. Press, 1995), describes how the peak response for photosensitive epilepsy occurs at approximately 15 Hz, and that the wavelength of light to which patients are most sensitive is in the red region, which is near the infra-red region used for iris recognition.
For all the aforementioned reasons therefore, it is important to reduce the visibility of illumination to the subject, while not impacting the quality of imagery acquired. This is a difficult problem since changing the characteristics of the illumination can potentially adversely impact the characteristics of the images being acquired.
In light of these and other problems, we have devised a method of illuminating the subject and acquiring imagery for use in applications such as iris or facial recognition that exploits differences between the characteristics of the imaging system and the characteristics of the human eye in order to maximize the quality of the images being acquired while minimizing the illumination perceived by the user. We use four differences between the characteristics of the imaging system and those of the human eye: temporal persistence of the human visual system, background light level, spectral response, and asymmetric perceived pulse brightness. These methods can be used individually or collectively, depending on the constraints and specifications of the particular device.
The invention is a method of providing active illumination during biometry that utilizes pulsed lighting synchronized to the frame acquisition of an imaging system. The inventive method includes the steps of a) providing a first illumination modality that maximizes the quality of images captured by the imaging system; and b) providing a second illumination modality, substantially simultaneously as the first illumination modality, that, in combination with the first illumination modality, minimizes the overall illumination perceived by the user.
In one aspect, the first illumination modality is provided as a first set of periodic illumination pulses synchronized with the frame acquisition of an imaging system, and the second illumination modality is provided as a second set of periodic illumination pulses not synchronized with imaging system frame acquisition. Preferably, the combined pulse frequency of the first and second sets of illumination pulses is greater than the peak response frequency for photosensitive epilepsy and is 2-10 times the pulse rate of the first set alone. More preferably, the intensity of the second set of pulses is equal to or greater than the intensity of the first set. The frame acquisition rate of the imaging system may be set to a maximum value.
In another aspect, the first illumination modality is provided as a first set of periodic illumination pulses, and the second illumination modality is provided as constant background illumination. The background illumination of the second modality is preferably at least 0.02 times, but less than, the average illumination of the pulses of the first illumination modality. Optionally, the first and second modalities may both be provided by the same single illumination source, or they may each be provided by a different illumination source. Optionally, the wavelength of the light from the first illumination source is different from the wavelength of the light from the second illumination source. In this case, the first wavelength is substantially in the range of 700-900 nm and the second wavelength is substantially in the range of 400-700 nm. In addition or in the alternative, the intensity of the light from the second illumination source is substantially 0.1-10 times the intensity of the light from the first illumination source.
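A minimal sketch of selecting a background level within the preferred range stated above (the function name and default fraction are assumptions for illustration, not part of the disclosure):

```python
def background_level(avg_pulse_illumination: float, fraction: float = 0.1) -> float:
    """Constant background illumination as a fraction of the average
    pulsed illumination; the preferred range is [0.02, 1.0) of that
    average, per the second illumination modality described above."""
    if not 0.02 <= fraction < 1.0:
        raise ValueError("fraction outside preferred range [0.02, 1.0)")
    return fraction * avg_pulse_illumination
```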
In another aspect of the invention, the first illumination modality is provided by a first illumination source that generates a first set of periodic illumination pulses, and the second illumination modality is provided by a second illumination source to generate a second set of periodic illumination pulses having a substantially inverse waveform of the first set of pulses. As before, the wavelength of the light from the first illumination source may be different from the wavelength of the light of the second illumination source. Again, the first wavelength is preferably substantially in the range of 700-900 nm and the second wavelength is preferably substantially in the range of 400-700 nm.
In yet another aspect of the invention, the first modality includes pulses of a first duration and a first intensity synchronized with imaging system frame acquisition, while the second modality includes pulses that are not synchronized with imaging system frame acquisition and that have a shorter duration but equal or greater intensity than the pulses of the first modality. Preferably, the second pulses are 0.001 to 1 times the duration and 1 to 100 times the intensity of the first pulses.
In all cases, it is preferred to include in the method the steps of sensing the actual output of at least one of the first and second illumination modalities, and adjusting the output of at least one of the first and second illumination modalities in response to the output sensed in the sensing step. One example is to provide at least one photodiode for detecting the output of one or more modalities and to connect the photodiode to the controller(s) of the one or more illumination sources to provide feedback to the controller(s).
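The photodiode feedback described above amounts to closed-loop control of the illuminator drive. A minimal proportional-control sketch follows; the gain, limits, and names are illustrative assumptions, not details from the disclosure:

```python
def adjust_drive(target: float, measured: float, drive: float,
                 gain: float = 0.5, max_drive: float = 1.0) -> float:
    """One feedback step: nudge the LED drive level toward the target
    based on the sensed output, clamped to the safe drive range."""
    drive = drive + gain * (target - measured)
    return min(max(drive, 0.0), max_drive)
```

In a device, a step like this would run once per frame, with `measured` supplied by a photodiode and the returned `drive` applied by the lighting controller.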
More generally, the invention is a method of providing active illumination during the acquisition of high quality images of a person that utilizes pulsed lighting synchronized to imaging system frame acquisition. A first illumination modality is provided that maximizes the quality of images captured by the imaging system. Substantially at the same time, a second illumination modality is provided that, in combination with the first illumination modality, minimizes the overall illumination perceived by the user.
The invention also includes an image capturing apparatus that performs the abovementioned methods.
Description of the invention will now be given with reference to the accompanying drawing figures.
In the first aspect of the invention, we take advantage of the temporal persistence of the human visual system, such that individual pulses at higher frequencies are less discernible than individual pulses at lower frequencies.
An illuminator 22 is controlled by a lighting controller 20, which is synchronized by a camera controller 21 to a camera 25 that acquires frames. An optional photodiode 23 can also be connected to the lighting controller 20. The illuminator 22 projects light onto the optional photodiode 23 as well as on the subject 24, shown on the bottom of the figure. The illumination is reflected off the eye of the subject 24, and an image of the eye is captured using the camera 25 shown to the right of the figure.
Most camera sensors are capable of acquiring data at 5-30 frames per second or higher, depending on the resolution of the imager. As the resolution of the imager increases, the number of pixels that needs to be acquired per image also increases, and therefore the rate at which frames can be acquired through a given data bandwidth channel decreases. Iris recognition typically uses high resolution cameras (for example, 1.3 Mpixel or greater), and such cameras often have frame rates limited to 5-15 frames per second as a result. US 2003/0169334 A1 and U.S. Pat. No. 7,542,628 describe methods whereby the frame acquisition is synchronized to the illumination pulse. If the acquired frame rate and illumination pulse rate are set too low, then the performance of the iris recognition device can be impacted, since not enough frames are acquired within a sufficient time period for reliable acquisition of eye imagery of the subject. On the other hand, if the acquired frame rate and illumination pulse rate are set at the highest possible rate for the sensor, which may be close to 15 frames and illumination pulses per second, then the illumination pulse rate is close to the peak response for photosensitive epilepsy.
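This bandwidth limit can be made concrete with a small calculation; the 125 Mbit/s channel rate and 8-bit pixel depth below are assumed example figures, not values from the disclosure:

```python
def max_frame_rate(pixels: int, bits_per_pixel: int, bandwidth_bps: float) -> float:
    """Frames per second that a fixed-bandwidth data channel can sustain
    for uncompressed frames of the given size."""
    return bandwidth_bps / (pixels * bits_per_pixel)

# A 1.3 Mpixel, 8-bit sensor on a 125 Mbit/s channel sustains roughly
# 12 frames per second, consistent with the 5-15 fps range above.
rate = max_frame_rate(1_300_000, 8, 125e6)
```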
The first aspect of the invention overcomes this problem by using a different pulse rate for the illumination compared to the frame acquisition rate of the sensor, such that a portion of the illumination pulses are still synchronized with frame acquisition while the remaining illumination pulses are not. Put another way, a first set of pulses coincides with frame/image capture (the synchronized pulses), while a second set of pulses is triggered at other times (the asynchronous pulses). The pulse rate of the illumination is set sufficiently high to take advantage of the persistence of the human visual system, so that the individual illumination pulses go almost unnoticed by the subject, but a subset of the pulses is still synchronized to the lower frame acquisition rate so that illumination is provided at the lower frequency in order to provide high-quality, well-illuminated imagery. In this way, photosensitive epilepsy or discomfort to the user is not a concern, even though images are being illuminated and acquired at a rate to which the human eye is much more sensitive.
As shown in
The asynchronous pulse sets 31 and 41 are shown to be evenly periodic, and that is preferred. However, the asynchronous pulses need not be evenly periodic; they can be spaced unevenly in time.
The intensity of the illumination between frame acquisitions does not necessarily need to be smaller than the intensity of the illumination that is synchronized with frame acquisition in order to achieve optimal imagery. In fact, we have found it advantageous to use the same or higher intensity illumination between frame acquisitions compared to during frame acquisition, as described further in the fourth aspect of the invention.
In the second aspect of the invention, we take advantage of another property of the human visual system: the sensitivity of the eye is substantially consistent with Weber's law, whereby for a given wavelength of light the minimum brightness difference that can be perceived is approximately proportional to the average brightness being perceived. In other words, the brighter the scene, the less sensitive the human visual system is to a fixed difference in illumination, either temporally or spatially. We exploit this property in our iris recognition system using two methods.
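A sketch of this perceptibility criterion follows; the 2% Weber fraction used as a default is a commonly quoted approximation for luminance, assumed here rather than taken from the disclosure:

```python
def flicker_visible(pulse_delta: float, background: float,
                    weber_fraction: float = 0.02) -> bool:
    """Per Weber's law, a brightness step is perceptible only if it
    exceeds a roughly fixed fraction of the ambient brightness."""
    return pulse_delta > weber_fraction * background

# The same pulse that is obvious against a dim background becomes
# imperceptible against a background fifty times brighter.
```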
The first method of the second aspect of the invention is shown in
The second method of the second aspect of the invention is shown in
A preferred intensity of the second illuminator source is in the range of 0.1 to 10 times the peak intensity of the infra-red pulsed illumination.
In the third aspect of the invention, we again take advantage of the differences in the spectral response of the visual system compared to the spectral response of the camera system. Similarly to the method described above, in this embodiment of the invention we also introduce a second illumination module with wavelength characteristics that are substantially different from the wavelength characteristics of the first illumination module. In this case however, as shown in
The wavelength spectrums of the first and second illuminators are also chosen such that the spectrum defined by the intersection 82 of the human-visible spectrum and the spectrum of the first illuminator, and the spectrum defined by the intersection of the human visible spectrum and the spectrum of the second illuminator are substantially the same, as described earlier and shown in
While reducing or eliminating the magnitude of visible pulsed illumination observed by the subject substantially reduces discomfort, if the two or more illuminators are positioned substantially apart from each other, then spatial flickering may still be observed solely from the difference in position, even if the wavelength spectrum of each illuminator were identical.
In the fourth aspect of the invention, we take advantage of another property of the human visual system: the perceived temporal response of the eye is non-symmetric for illumination that transitions from off to on compared to illumination that transitions from on to off. More specifically, the perceived response of the eye has a decay time that is longer than its attack time. For example, a description of this property of the eye is given by Jinno et al., "Effective Illuminance Improvement of a Light Source by Using Pulse Modulation and Its Psychophysical Effect on the Human Eye," J. Light & Vis. Env., Vol. 32, No. 2, 2008.
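A toy model of this asymmetry can be written as follows; the exponential form and the time constants are illustrative assumptions, not values from Jinno et al.:

```python
import math

def perceived_brightness(t_on: float, t_off: float, t: float,
                         tau_attack: float = 0.02,
                         tau_decay: float = 0.1) -> float:
    """Modeled perceived response to a single pulse lit on [t_on, t_off]:
    the response rises with a short attack time constant while the light
    is on and falls with a longer decay time constant after it turns off."""
    if t < t_on:
        return 0.0
    if t <= t_off:
        return 1.0 - math.exp(-(t - t_on) / tau_attack)
    peak = 1.0 - math.exp(-(t_off - t_on) / tau_attack)
    return peak * math.exp(-(t - t_off) / tau_decay)

# 20 ms after turn-off the modeled response is still higher than it was
# 20 ms after turn-on: the decay is slower than the attack, so short
# bright pulses are perceived as dimmer than their peak would suggest.
```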
As shown in
As described in the four aspects of the invention above, the characteristics of the illumination control signals (for example pulse width) are adjusted substantially within the preferred ranges to reduce or eliminate perceived flicker while maintaining high quality image acquisition. As shown in
Having described certain embodiments of the invention, it should be understood that the invention is not limited to the above description or the attached exemplary drawings. Rather, the scope of the invention is defined by the claims appearing herein below and any equivalents thereof as would be appreciated by one of ordinary skill in the art.
This application is a continuation of International Application No. PCT/US2009/048935, which claims priority to i) U.S. Provisional Patent Application No. 61/075,817, filed Jun. 26, 2008; and ii) U.S. Provisional Patent Application No. 61/185,417, filed Jun. 9, 2009; the entire teachings of the aforementioned U.S. provisional patent applications i)-ii) are hereby incorporated by reference herein.