System for iris detection tracking and recognition at a distance

Information

  • Patent Grant
  • Patent Number
    7,945,074
  • Date Filed
    Tuesday, May 9, 2006
  • Date Issued
    Tuesday, May 17, 2011
  • US Classifications (Field of Search)
    • 382/117
  • International Classifications
    • G06K 9/00
  • Term Extension
    878 days
Abstract
A stand-off range, or at-a-distance, iris detection and tracking system for iris recognition having a head/face/eye locator, a zoom-in iris capture mechanism and an iris recognition module. The system may obtain iris information of a subject with or without his or her knowledge or cooperation. This information may be sufficient for identification of the subject, verification of identity and/or storage in a database.
Description
BACKGROUND

The present invention pertains to recognition systems and particularly to biometric recognition systems. More particularly, the invention pertains to iris recognition systems.


U.S. Provisional Application No. 60/778,770, filed Mar. 3, 2006, is hereby incorporated by reference. U.S. application Ser. No. 11/043,366, filed Jan. 26, 2005, is hereby incorporated by reference. U.S. application Ser. No. 11/275,703, filed Jan. 25, 2006, is hereby incorporated by reference. U.S. application Ser. No. 10/446,521, filed May 27, 2003, is hereby incorporated by reference. U.S. Pat. No. 6,718,049, issued Apr. 6, 2004, is hereby incorporated by reference.


SUMMARY

The invention is a system that incorporates iris biometrics technology for recognizing persons, not necessarily cooperating ones, from afar.





BRIEF DESCRIPTION OF THE DRAWING


FIG. 1 is an overall diagram of the distant iris detection, tracking and recognition system;



FIG. 2 is a diagram of a head, face and eye region locator of the system in FIG. 1; and



FIG. 3 is a diagram of a zoom-in and iris capture stage of the system in FIG. 1.





DESCRIPTION

The present system may involve remote iris detection and tracking, remote iris recognition, remote biometrics, non-cooperative iris ID, non-invasive iris recognition and face detection from a stand-off range. The invention may have application to identity management, access control, identification, verification, security, surveillance, medical imaging, and so forth.


Current iris recognition (biometrics) technology and devices are limited in their application use because they require actual cooperation by the subject. They also require that the subject place his or her eye or eyes in line with the device's scanning window for a few seconds and look inside the device at the imaging source, or at best do so from a short distance away. This may be sufficient for some access control applications. However, there are applications (e.g., non-cooperative identification, surveillance, and fast access control) which require that iris identification be accomplished from a relatively long distance away.


Various properties and characteristics make iris recognition technology a potentially reliable personal identification tool. This technology may provide uniqueness and genetic independence in identification. The iris of the eye has an extraordinary structure that is unique to each human being. Unlike facial appearance, iris patterns are genetically independent: they are unique to each person and differ even between genetically identical individuals (i.e., twins). Although the striking visual similarity of identical twins reveals the genetic penetrance of facial appearance, a comparison of genetically identical irises reveals just the opposite for iris patterns. Biomedical literature suggests that iris features are as distinct for each human as fingerprints or patterns of retinal blood vessels. An iris has a data-rich physical structure with sufficient texture to provide adequate discrimination between human subjects. There is no aging effect; that is, the iris features are stable over a lifetime. Iris recognition technology also provides non-invasiveness. The iris is regarded as an internal and unique organ, yet it is externally visible and can be measured from a distance using this technique.


From a technical point of view, biometric accuracy may rely significantly on how well the iris is resolved, focused, segmented and extracted. When acquiring iris images, the number of “on-iris” pixels, iris exposure, dynamic range and focus must all be sufficiently precise to produce a high-quality image that captures the intricacy of the iris tissue structure. When analyzing iris images of cooperative subjects, segmentation may be a relatively straightforward process of edge detection and circular fitting. However, this is often not the case for stand-off range eye detection and tracking or iris-at-a-distance systems, which often do not receive the cooperation of the subject. In many cases of stand-off range and at-a-distance systems, merely a portion of the iris may be captured due to, for example, eye closure and/or eyelash and eyelid occlusions. Furthermore, since the subject is not typically asked to cooperate, a tilted head or a rotated iris must also be considered. The present system may extract accurate segments of the iris borders, among other things, in a stand-off range and at-a-distance environment. Computing iris features requires a good-quality segmentation process that focuses on the subject's iris and properly extracts its borders.
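
As a baseline for the cooperative case just described, edge detection plus circular fitting may be sketched as follows; the OpenCV Hough-transform routine and all parameter values are illustrative assumptions, not the stand-off segmentation method of the present system.

```python
# Baseline cooperative-subject segmentation: edge detection plus circular
# fitting via OpenCV's Hough circle transform. Parameter values are
# illustrative assumptions; this is not the stand-off method described herein.
import cv2
import numpy as np

def segment_iris_circular(eye_gray):
    """Return (pupil, iris) circles as (x, y, r), or None if none found."""
    blurred = cv2.medianBlur(eye_gray, 5)        # suppress eyelash noise
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                               param1=100, param2=30, minRadius=20,
                               maxRadius=120)
    if circles is None:
        return None
    found = sorted(np.round(circles[0]).astype(int), key=lambda c: c[2])
    return found[0], found[-1]    # smallest ~ pupil, largest ~ limbus
```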


The system may detect the head and/or the face from a distance, track the head/face, locate the eyes in the face when they are presented in the direction of the camera intentionally or unintentionally, and track them. Then a high-quality zoom camera may obtain close-ups of the eye, and a smart algorithm may determine when the iris has the best orientation towards the zoom camera, at which point several high-quality sequential pictures of the eye/iris may be taken to perform the iris recognition task.


The system may be based on the following approach, operating from a distance. One component may be a Tri-Band Imaging™ (TBI) camera skin detector (Honeywell International Inc.). Then specific algorithms may be used to determine whether the detected skin is part of a face. This may be accomplished by locating several facial features (eyes, eyebrows, nose, mouth, and so forth) and their positions relative to each other. For the skin and feature determinations, one may use a commercial off-the-shelf (COTS) face detection and tracking system.
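
The TBI skin detection may exploit the sharp drop in human skin reflectance above roughly 1.4 µm in the near infrared (cf. the incorporated U.S. Pat. No. 6,718,049). A minimal sketch of such a dual-band ratio test, assuming two coregistered band images and an illustrative threshold, follows.

```python
# Hedged sketch of dual-band near-IR skin detection in the spirit of the TBI
# approach (cf. incorporated U.S. Pat. No. 6,718,049): skin reflects strongly
# below about 1.4 um and absorbs strongly above it, so a band ratio separates
# skin from background. The band roles and threshold are assumptions.
import numpy as np

def skin_mask(lower_band, upper_band, ratio_thresh=1.5):
    """Coregistered float images in [0, 1]; returns a boolean skin map."""
    ratio = lower_band / np.clip(upper_band, 1e-3, None)
    return ratio > ratio_thresh
```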


A high-quality zoom camera may be used to obtain close-up, high-resolution images of a rectangular region that contains both eyes. Eye tracking algorithms may be used to locate the iris within the eye and to determine the “best iris position” with respect to the camera. The “best iris position” may be determined algorithmically by maximizing a function that depends on the key features of the iris and/or the face. At this point of maximization, the zoom camera may take several close-up images of each eye/iris region and pass them on to be processed by commercially available iris recognition algorithms or devices.
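
The description leaves the maximized function open beyond its dependence on key iris and/or face features; a plausible stand-in, offered purely as an assumption, scores each close-up by focus sharpness weighted by iris visibility.

```python
# Stand-in for the "best iris position" function: focus sharpness (variance
# of the Laplacian) weighted by the fraction of the iris left unoccluded.
# Both ingredients are assumptions for illustration only.
import cv2
import numpy as np

def iris_score(eye_gray, visible_fraction):
    sharpness = cv2.Laplacian(eye_gray, cv2.CV_64F).var()
    return sharpness * visible_fraction

def best_frame(frames, visibilities):
    """frames: grayscale close-ups; visibilities: matching iris fractions."""
    scores = [iris_score(f, v) for f, v in zip(frames, visibilities)]
    return int(np.argmax(scores))      # index of the "best iris position"
```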



FIG. 1 is a block diagram of a stand-off range or at-a-distance iris detection system 10. A head/face/eye region locator or acquisition module 11 may seek out the eye locations, face orientation and coast time of an individual that is a subject of inquiry. The eye locations, face orientation and coast time information, as shown by block 12, may go to a zoom-in iris capture module 13. The head/face/eye locator 11 may perform its operations with one or two sensors or cameras. There may be a cueing sensor 14 and a zoom sensor 15, collocated or located at different places. The sensors may physically be one integrated multi-function sensor. If iris capture is not successful, then a reacquisition request signal 16 may go back to the head/face/eye locator module 11 so that the module may again seek out the eye locations, face orientation and coast time of the subject individual, to be forwarded to the zoom-in iris capture module 13 for another capture of the individual's iris. If the capture is successful, then the resultant capture of the iris may be an iris segment 17 that goes on to an iris recognition module 18. The iris recognition module 18 may match the iris segment 17 with an iris segment in an iris database 19. The matching may be a one-to-one verification of an identity of the subject individual, or it may be a one-to-many search to possibly identify the individual with a match in the database 19. Or the iris recognition module 18 may enter the iris segment 17 as information about the subject individual into the iris database 19, cataloging it for later use in identification or verification of the individual.
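
The matching flow of system 10 may be sketched as follows; the fractional Hamming-distance matcher and its 0.32 threshold reflect common practice in the iris literature and are assumptions, not requirements of this description.

```python
# Sketch of module 18 with database 19: one-to-one verification and
# one-to-many identification over stored binary iris codes. The fractional
# Hamming distance and the 0.32 threshold are literature conventions assumed
# here for illustration.
import numpy as np

HAMMING_THRESHOLD = 0.32    # typical literature value; an assumption

def hamming(code_a, code_b):
    return np.count_nonzero(code_a != code_b) / code_a.size

def verify(iris_code, claimed_id, database):
    """One-to-one check of a claimed identity."""
    return hamming(iris_code, database[claimed_id]) < HAMMING_THRESHOLD

def identify(iris_code, database):
    """One-to-many search; returns the matching identity or None."""
    if not database:
        return None
    best = min(database, key=lambda k: hamming(iris_code, database[k]))
    return best if hamming(iris_code, database[best]) < HAMMING_THRESHOLD else None
```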



FIG. 2 reveals more detail of the head/face/eye region locator or acquisition module 11. The module may start with head/face detection 21. The head/face detection may be performed with a present or future acquisition system, such as the Honeywell Tri-Band Imaging™ (TBI) camera. Other off-the-shelf (COTS) camera or sensor systems using a video and/or an infrared (IR) camera or other imaging techniques may be implemented. With the detected head/face information, face feature extraction 22 may be performed. From this feature or features, information 23, containing face orientation with respect to the camera line-of-sight (LOS) and eye location, may be sent to a diamond-shaped juncture 24 that asks whether the information 23 provides a sufficiently good view. The metrics for determining a good view may include face symmetry and face orientation based on facial features. If the answer to the good-view question is no, then a signal may go to a head/face tracking module 25, which re-initiates the face feature extraction 22. The head/face tracking module 25 also may provide the time to coast in the “zoom-in iris capture” continuous mode before the tracking is reinitiated. If the answer to the good-view question is yes, then one may get the face orientation, eye location, and estimated coast time information 12, which is provided to the zoom-in iris capture module 13.
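
One plausible realization of the face-symmetry metric at juncture 24, an assumption rather than a formula given herein, correlates the face crop with its horizontal mirror image.

```python
# Assumed good-view test: a frontal, symmetric face correlates strongly with
# its horizontal mirror image, so score the crop by normalized
# cross-correlation with its mirror. The 0.8 threshold is illustrative.
import numpy as np

def symmetry_score(face_gray):
    face = face_gray.astype(np.float64)
    mirrored = face[:, ::-1]
    a = face - face.mean()
    b = mirrored - mirrored.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0   # 1.0 = frontal

def good_view(face_gray, thresh=0.8):
    return symmetry_score(face_gray) >= thresh    # juncture 24 decision
```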



FIG. 3 shows some detail of the zoom-in and iris capture section or module 13. After receipt of the face orientation, eye location, and estimated coast time information 12, there may be a mechanism 26 for zooming in and localizing (i.e., framing) the eye region. From this information, an iris segmentation 27 may be performed. The activities of mechanisms or modules 26 and 27 may be accomplished with COTS technologies. The iris segmentation 27 may be reviewed at a diamond-shaped juncture 28 to consider whether there is good iris fitness. If the answer is no, then the question at a diamond-shaped juncture 29 is whether the coast time has expired. If the answer is yes, then a reacquisition request 16 may be initiated back to the head/face/eye locator module 11 in FIG. 1. There, the approach may be repeated in accordance with FIG. 1, as described herein. If the answer at juncture 29 is no, then the zoom-in and localize the eye region module 26 may be reinitiated and its results forwarded to the iris segmentation module 27 and to juncture 28 for determining whether a segment has good iris fitness. If the answer at juncture 28 is yes, then the iris segment 17 may be provided to the iris recognition module 18 for the one-to-one verification of a person or the one-to-many identification of a person in conjunction with the database 19 of information. Or the iris segment 17 may be part of an acquisition of a non-cooperative (or cooperative) subject individual into the database 19.
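
The loop of FIG. 3 may be sketched as follows, with hypothetical callables standing in for modules 26 and 27 and the fitness test of juncture 28, and an illustrative fitness threshold.

```python
# Control-flow sketch of the FIG. 3 loop: zoom in and segment until either a
# segment of good fitness is produced or the coast time expires, at which
# point a reacquisition request (16) is raised. The callables stand in for
# modules 26 and 27 and the juncture 28 test; the 0.9 threshold is assumed.
import time

def capture_iris(eye_info, coast_seconds, zoom_and_localize, segment_iris,
                 fitness, fitness_thresh=0.9):
    deadline = time.monotonic() + coast_seconds
    while time.monotonic() < deadline:            # juncture 29: coast expired?
        region = zoom_and_localize(eye_info)      # module 26
        segment = segment_iris(region)            # module 27
        if fitness(segment) >= fitness_thresh:    # juncture 28: good fitness?
            return segment                        # iris segment 17
    raise TimeoutError("coast time expired; reacquisition request 16")
```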


The iris segmentation algorithms can be of any type which faithfully outlines the imaged iris presented to them. One such algorithm, developed by Honeywell, operates in the polar domain and is described herein.


Conducting the segmentation in the polar domain may lead to a more efficient and faster process, executing not only the segmentation but also calibration and noise removal in one step, to generate a feature map for the encoding step.
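
A minimal 1-D polar-domain boundary search in this spirit (not the full POSE™ algorithm) casts rays from an estimated pupil center and takes the radius of maximum intensity gradient along each ray; the center estimate and radii range are assumptions.

```python
# Minimal 1-D boundary search in the polar domain, an illustration in the
# spirit of the description rather than the full POSE(TM) algorithm: for each
# angle, the border is the radius of maximum gradient along the ray.
import numpy as np

def polar_boundary(eye_gray, cx, cy, r_min=20, r_max=120, n_angles=360):
    img = eye_gray.astype(np.float64)
    h, w = img.shape
    radii = np.arange(r_min, r_max)
    borders = np.empty(n_angles)
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    for i, theta in enumerate(angles):
        xs = np.clip((cx + radii * np.cos(theta)).astype(int), 0, w - 1)
        ys = np.clip((cy + radii * np.sin(theta)).astype(int), 0, h - 1)
        profile = img[ys, xs]                        # 1-D radial slice
        borders[i] = radii[np.argmax(np.abs(np.diff(profile)))]
    return borders     # estimated border radius for each angle
```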


The system may provide reliable calibration and an efficient segmentation (i.e., localization) of the iris detected at stand-off range or at a distance, resulting in better extraction of the iris features that may eventually be converted into a numeric code. Conversion of an iris annular image into a numeric code that can be easily manipulated may be essential to iris recognition. The iris codes may be compared with previously generated iris codes for verification and identification purposes.
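
One common realization of such a code, assumed here for illustration rather than taken from this description, unwraps the annulus into a fixed-size sheet and binarizes a simple texture response; production encoders typically quantize Gabor phase instead.

```python
# Assumed encoder sketch: unwrap the iris annulus into a fixed-size sheet
# (the widely used "rubber sheet" normalization, named here as the common
# technique) and binarize a crude texture response into a bit string.
import numpy as np

def unwrap_annulus(eye_gray, cx, cy, r_inner, r_outer, n_r=16, n_theta=256):
    thetas = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    rs = np.linspace(r_inner, r_outer, n_r)
    h, w = eye_gray.shape
    xs = np.clip((cx + np.outer(rs, np.cos(thetas))).astype(int), 0, w - 1)
    ys = np.clip((cy + np.outer(rs, np.sin(thetas))).astype(int), 0, h - 1)
    return eye_gray[ys, xs].astype(np.float64)    # n_r x n_theta sheet

def iris_code(sheet):
    detail = np.diff(sheet, axis=1)               # simple texture response
    return (detail > 0).ravel()                   # binary iris code
```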


The orientation of the head and eyes may result in different perspective views of the iris's circular shape. The captured shapes of the iris usually depart from circles or ellipses due to the orientation, tilt and slant angles.


In an illustrative example, the iris biometric approach may include using a POSE™ (i.e., Honeywell International Inc. polar segmentation) technique to move the analysis virtually immediately to a polar domain and execute a 1-D segmentation of the iris borders, using one or more symmetry properties to detect one or more non-occluded areas of the iris; non-symmetric regions can correspond to areas partially covered by eyelashes, eyelids, and so forth (thus asymmetric). In some cases, one may limit the analysis to those segments where the iris and the sclera are detected relative to their symmetry. The sclera may be regarded as the tough white fibrous outer envelope of tissue covering the entire eyeball except the cornea. Once an orientation is detected, nominal angles with the least likelihood of distortion (i.e., occlusion or deformation due to orientation) may be identified by, for example, estimating the ellipse parameters from nominal angles and computing a calibration factor. A rotated-ellipse detection technique that uses overlapping variable circles to detect the iris borders, modeled as elliptic or irregular shapes rather than circles, and/or least squares fitting may be used to estimate the elliptic parameters and orientation. Mixture modeling may be used to handle variation in the iris textures.
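
The least squares fitting mentioned above may be illustrated by a minimal algebraic conic fit to boundary points; constrained fits such as Fitzgibbon's direct ellipse fit are the usual robust choice.

```python
# Minimal algebraic least-squares ellipse fit: solve
# a*x^2 + b*x*y + c*y^2 + d*x + e*y = 1 over the boundary points by linear
# least squares. Offered as an illustration of the technique named above.
import numpy as np

def fit_ellipse(xs, ys):
    """xs, ys: 1-D arrays of boundary points; returns (a, b, c, d, e)."""
    xs = np.asarray(xs, dtype=np.float64)
    ys = np.asarray(ys, dtype=np.float64)
    A = np.column_stack([xs * xs, xs * ys, ys * ys, xs, ys])
    coeffs, *_ = np.linalg.lstsq(A, np.ones(len(xs)), rcond=None)
    return coeffs
```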


The inner and outer boundaries of the iris may be approximated by ellipses, rather than circles, or by irregular shapes using snake delineation. However, the two ellipses are usually not concentric. One may characterize the shape and texture of the structure of the iris as having a large number of interlacing blocks, such as freckles, coronas, furrows, crypts, and stripes. The outer boundaries of the iris may be captured with irregular edges due to the presence of eyelids and eyelashes. Taken in tandem, these observations suggest that iris localization may be sensitive to a wide range of edge contrasts.
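
The snake delineation may be illustrated with an off-the-shelf active contour; the circular initialization, smoothing, and elasticity parameters below are assumptions rather than values from this description.

```python
# Hedged sketch of snake delineation using scikit-image's active contour.
# The circular initialization, Gaussian smoothing and the alpha/beta/gamma
# elasticity settings are illustrative assumptions.
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

def snake_boundary(eye_gray, cy, cx, r_init, n_pts=200):
    s = np.linspace(0.0, 2.0 * np.pi, n_pts)
    init = np.column_stack([cy + r_init * np.sin(s),   # (row, col) points
                            cx + r_init * np.cos(s)])
    smoothed = gaussian(eye_gray, sigma=3, preserve_range=True)
    return active_contour(smoothed, init, alpha=0.015, beta=10, gamma=0.001)
```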


The present system is well suited for high-security access control involving stand-off range and at-a-distance biometrics applications where less control is exercised over subject positioning and/or orientation. Such operations may include, for example, subjects captured at various ranges from the acquisition device, and/or subjects whose eye(s) are not directly aligned with the imaging equipment. Usually, for such applications, it is difficult to implement the level of control required by most of the existing art to enable reliable iris recognition. The system may help cope with the asymmetry in iris images acquired without the collaboration of the subjects, and may operate under uncontrolled environments as long as some of the iris annular region is visible.


In the present specification, some of the matter may be of a hypothetical or prophetic nature although stated in another manner or tense.


Although the invention has been described with respect to at least one illustrative example, many variations and modifications will become apparent to those skilled in the art upon reading the present specification. It is therefore the intention that the appended claims be interpreted as broadly as possible in view of the prior art to include all such variations and modifications.

Claims
  • 1. A system for iris detection at a distance of non-cooperative or cooperative subjects, comprising: a multi-band imaging camera for detecting skin of a subject, the multi-band imaging camera including a visible band and at least one infrared band; a first mechanism for determining with specific algorithms whether the skin is of a region containing at least a part of a face of the subject containing at least one eye, by locating several facial features such as eyes, eyebrows, nose or mouth and their positions relative to each other; an adjustable zoom camera, connected to a processor, for obtaining initial close-up high resolution images of the region that contains at least one eye, the eye not necessarily looking directly at the camera; and a second mechanism, connected to the camera, for determining, with eye tracking algorithms, whether the initial images contain a best iris position; and wherein: the best iris position is when the iris has the best orientation towards the zoom camera, and is determined with the algorithms by maximizing a function that depends on key features of the iris to provide a point of maximization; wherein the processor is configured to send a reacquisition request to the zoom camera to obtain additional close-up high resolution images of the region that contains at least one eye if the initial images do not contain a best iris position; wherein if the initial images do contain a best iris position, the zoom camera takes several secondary close-up images of each iris region at the point of maximization; and wherein the secondary images are passed on to be processed by an iris recognition device.
  • 2. The system of claim 1, wherein the first mechanism is a face detection and tracking system.
  • 3. The system of claim 1, wherein the iris recognition device is for identifying the subject upon recognition of an iris in the images of iris regions of the subject.
  • 4. The system of claim 1, wherein the eye is not necessarily looking directly at the camera since the eye may be of a non-cooperative subject.
  • 5. A system for iris detection, tracking and recognition of a non-cooperative or cooperative subject at a distance, comprising: an acquisition module comprising: a multi-band imaging camera for skin detection of a subject, the multi-band imaging camera including a visible band and at least one infrared band; a face detection and tracking system for determining if detected skin is part of a face of the subject by locating several facial features, such as eyes, and extracting the facial features; and wherein: from one or more facial features, information containing face orientation with respect to line of sight and eye location is sent to a juncture for determining whether the information provides a sufficiently good view; and metrics for a good view comprise face symmetry and orientation toward the camera based on the facial features; wherein the acquisition module includes metrics that signal the face detection and tracking system to re-initiate the facial feature extraction when there is not a good view, the acquisition module repeating the facial feature extraction until a good view is achieved; when there is a good view, the information containing face orientation and eye location is sent to a zoom-in iris capture mechanism; the zoom-in iris capture mechanism comprises: a zoom-in and framing mechanism for zooming in and framing an eye region containing an iris; an iris segmentation mechanism connected to the zoom-in and framing mechanism; and wherein the iris segmentation mechanism is for extracting features of the iris by approximating inner and outer borders of the iris by ellipses and performing one-dimensional segmentation of the iris in a polar domain.
  • 6. The system of claim 5, further comprising a mechanism for converting features of the iris into an iris numeric code.
  • 7. The system of claim 6, further comprising a mechanism for comparing the iris numeric code with previously generated iris numeric codes for verification and identification of the iris numeric code.
  • 8. The system of claim 5, wherein the segmentation mechanism is further for characterizing shape and texture of a structure of the iris having interlacing blocks of freckles, coronas, furrows, crypts and stripes.
  • 9. The system of claim 8, wherein mixture modeling is used to handle variation in the texture of the structure of the iris.
  • 10. The system of claim 5, wherein the inner and outer borders of the iris are approximated by ellipses using snake delineation.
  • 11. The system of claim 5, wherein the outer border of the iris is instead approximated with irregular edges due to eyelids and eyelashes.
  • 12. A method for detecting an iris at a distance of a non-cooperative or cooperative subject, comprising: providing a system for iris detection, the system including an acquisition module, a processor, and at least one multi-band camera having a visible band and at least one infrared band, the method including using the system to perform the following steps: scanning for a subject; detecting skin of the subject using the multi-band camera; determining whether the skin is of a region containing a face of the subject by locating one or more facial features such as eyes; obtaining a plurality of detailed images of the region containing the face and having at least one eye; determining if one of the plurality of images includes an image of an iris of the at least one eye that shows a best position or view of the iris based on maximizing a function that depends on features of the iris, wherein the best position or view of the iris is when the iris has a best orientation towards the camera; if no image shows a best position or view of the iris, the system sends a reacquisition signal to the acquisition module and the method steps are repeated until a successful image of the iris showing a best position or view of the iris is achieved; and when a successful image of the iris is achieved, extracting features of the iris by determining the inner and outer borders of the iris and performing a one-dimensional segmentation of the iris in a polar domain.
  • 13. The method of claim 12, further comprising converting the features of the iris into an iris numeric code.
  • 14. The method of claim 13, further comprising comparing the iris numeric code with previously generated iris numeric codes for verification and identification of the iris numeric code or for entry of the iris numeric code into a database.
  • 15. The method of claim 12, wherein the inner and outer borders of the iris are approximated with ellipses.
  • 16. The method of claim 15, wherein least squares modeling is used to estimate elliptic parameters and orientation.
  • 17. The method of claim 15, wherein the ellipses are not necessarily concentric.
  • 18. The method of claim 12, wherein the outer border of the iris is approximated with irregular edges due to eyelids and eyelashes.
Parent Case Info

This application claims the benefit of U.S. Provisional Application No. 60/778,770, filed Mar. 3, 2006.

US Referenced Citations (371)
Number Name Date Kind
4641349 Flom et al. Feb 1987 A
4836670 Hutchinson Jun 1989 A
5231674 Cleveland et al. Jul 1993 A
5291560 Daugman Mar 1994 A
5293427 Ueno et al. Mar 1994 A
5359382 Uenaka Oct 1994 A
5404013 Tajima Apr 1995 A
5551027 Choy et al. Aug 1996 A
5572596 Wildes et al. Nov 1996 A
5608472 Szirth et al. Mar 1997 A
5717512 Chmielewski, Jr. et al. Feb 1998 A
5751836 Wildes et al. May 1998 A
5859686 Aboutalib et al. Jan 1999 A
5896174 Nakata Apr 1999 A
5901238 Matsushita May 1999 A
5909269 Isogai et al. Jun 1999 A
5953440 Zhang et al. Sep 1999 A
5956122 Doster Sep 1999 A
5978494 Zhang Nov 1999 A
6005704 Chmielewski et al. Dec 1999 A
6007202 Apple et al. Dec 1999 A
6021210 Camus et al. Feb 2000 A
6028949 McKendall Feb 2000 A
6055322 Salganicoff et al. Apr 2000 A
6064752 Rozmus et al. May 2000 A
6069967 Rozmus et al. May 2000 A
6081607 Mori et al. Jun 2000 A
6088470 Camus et al. Jul 2000 A
6091899 Konishi et al. Jul 2000 A
6101477 Hohle et al. Aug 2000 A
6104431 Inoue et al. Aug 2000 A
6108636 Yap et al. Aug 2000 A
6119096 Mann et al. Sep 2000 A
6120461 Smyth Sep 2000 A
6134339 Luo Oct 2000 A
6144754 Okano et al. Nov 2000 A
6246751 Bergl et al. Jun 2001 B1
6247813 Kim et al. Jun 2001 B1
6252977 Salganicoff et al. Jun 2001 B1
6282475 Washington Aug 2001 B1
6285505 Melville et al. Sep 2001 B1
6285780 Yamakita et al. Sep 2001 B1
6289113 McHugh et al. Sep 2001 B1
6299306 Braithwaite et al. Oct 2001 B1
6308015 Matsumoto Oct 2001 B1
6309069 Seal et al. Oct 2001 B1
6320610 Van Sant et al. Nov 2001 B1
6320973 Suzaki et al. Nov 2001 B2
6323761 Son Nov 2001 B1
6325765 Hay et al. Dec 2001 B1
6330674 Angelo et al. Dec 2001 B1
6332193 Glass et al. Dec 2001 B1
6344683 Kim Feb 2002 B1
6370260 Pavlidis et al. Apr 2002 B1
6377699 Musgrave et al. Apr 2002 B1
6393136 Amir et al. May 2002 B1
6400835 Lemelson et al. Jun 2002 B1
6424727 Musgrave et al. Jul 2002 B1
6424845 Emmoft et al. Jul 2002 B1
6433818 Steinberg et al. Aug 2002 B1
6438752 McClard Aug 2002 B1
6441482 Foster Aug 2002 B1
6446045 Stone et al. Sep 2002 B1
6483930 Musgrave et al. Nov 2002 B1
6484936 Nicoll et al. Nov 2002 B1
6490443 Freeny, Jr. Dec 2002 B1
6493669 Curry et al. Dec 2002 B1
6494363 Roger et al. Dec 2002 B1
6503163 Van Sant et al. Jan 2003 B1
6505193 Musgrave et al. Jan 2003 B1
6508397 Do Jan 2003 B1
6516078 Yang et al. Feb 2003 B1
6516087 Camus Feb 2003 B1
6516416 Gregg et al. Feb 2003 B2
6522772 Morrison et al. Feb 2003 B1
6526160 Ito Feb 2003 B1
6532298 Cambier et al. Mar 2003 B1
6540392 Braithwaite Apr 2003 B1
6542624 Oda Apr 2003 B1
6546121 Oda Apr 2003 B1
6553494 Glass Apr 2003 B1
6580356 Alt et al. Jun 2003 B1
6591001 Oda et al. Jul 2003 B1
6591064 Higashiyama et al. Jul 2003 B2
6594377 Kim et al. Jul 2003 B1
6594399 Camus et al. Jul 2003 B1
6598971 Cleveland Jul 2003 B2
6600878 Pregara Jul 2003 B2
6614919 Suzaki et al. Sep 2003 B1
6652099 Chae et al. Nov 2003 B2
6674367 Sweatte Jan 2004 B2
6690997 Rivalto Feb 2004 B2
6708176 Strunk et al. Mar 2004 B2
6711562 Ross et al. Mar 2004 B1
6714665 Hanna et al. Mar 2004 B1
6718049 Pavlidis et al. Apr 2004 B2
6718665 Hess et al. Apr 2004 B2
6732278 Baird, III et al. May 2004 B2
6734783 Anbai May 2004 B1
6745520 Puskaric et al. Jun 2004 B2
6751733 Nakamura et al. Jun 2004 B1
6753919 Daugman Jun 2004 B1
6754640 Bozeman Jun 2004 B2
6760467 Min et al. Jul 2004 B1
6765470 Shinzaki Jul 2004 B2
6766041 Golden et al. Jul 2004 B2
6775774 Harper Aug 2004 B1
6785406 Kamada Aug 2004 B1
6793134 Clark Sep 2004 B2
6819219 Bolle et al. Nov 2004 B1
6829370 Pavlidis et al. Dec 2004 B1
6832044 Doi et al. Dec 2004 B2
6836554 Bolle et al. Dec 2004 B1
6837436 Swartz et al. Jan 2005 B2
6845879 Park Jan 2005 B2
6853444 Haddad Feb 2005 B2
6867683 Calvesio et al. Mar 2005 B2
6873960 Wood et al. Mar 2005 B1
6896187 Stockhammer May 2005 B2
6905411 Nguyen et al. Jun 2005 B2
6920237 Chen et al. Jul 2005 B2
6930707 Bates et al. Aug 2005 B2
6934849 Kramer et al. Aug 2005 B2
6950139 Fujinawa Sep 2005 B2
6954738 Wang et al. Oct 2005 B2
6957341 Rice et al. Oct 2005 B2
6972797 Izumi Dec 2005 B2
7053948 Konishi May 2006 B2
7071971 Elberbaum Jul 2006 B2
7136581 Fujii Nov 2006 B2
7183895 Bazakos et al. Feb 2007 B2
7184577 Chen et al. Feb 2007 B2
7197173 Jones et al. Mar 2007 B2
7277891 Howard et al. Oct 2007 B2
7298873 Miller, Jr. et al. Nov 2007 B2
7315233 Yuhara Jan 2008 B2
7362210 Bazakos et al. Apr 2008 B2
7362370 Sakamoto et al. Apr 2008 B2
7362884 Willis et al. Apr 2008 B2
7365771 Kahn et al. Apr 2008 B2
7406184 Wolff et al. Jul 2008 B2
7414648 Imada Aug 2008 B2
7417682 Kuwakino et al. Aug 2008 B2
7443441 Hiraoka Oct 2008 B2
7518651 Butterworth Apr 2009 B2
7542945 Thompson et al. Jun 2009 B2
7580620 Raskar et al. Aug 2009 B2
7593550 Hamza Sep 2009 B2
7639846 Yoda Dec 2009 B2
7751598 Matey et al. Jul 2010 B2
7756301 Hamza Jul 2010 B2
7756407 Raskar Jul 2010 B2
7761453 Hamza Jul 2010 B2
7777802 Shinohara et al. Aug 2010 B2
7804982 Howard et al. Sep 2010 B2
20010026632 Tamai Oct 2001 A1
20010027116 Baird Oct 2001 A1
20010047479 Bromba et al. Nov 2001 A1
20010051924 Uberti Dec 2001 A1
20010054154 Tam Dec 2001 A1
20020010857 Karthik Jan 2002 A1
20020033896 Hatano Mar 2002 A1
20020039433 Shin Apr 2002 A1
20020040434 Elliston et al. Apr 2002 A1
20020062280 Zachariassen et al. May 2002 A1
20020077841 Thompson Jun 2002 A1
20020089157 Breed et al. Jul 2002 A1
20020106113 Park Aug 2002 A1
20020112177 Voltmer et al. Aug 2002 A1
20020114495 Chen et al. Aug 2002 A1
20020130961 Lee et al. Sep 2002 A1
20020131622 Lee et al. Sep 2002 A1
20020139842 Swaine Oct 2002 A1
20020140715 Smet Oct 2002 A1
20020142844 Kerr Oct 2002 A1
20020144128 Rahman et al. Oct 2002 A1
20020150281 Cho Oct 2002 A1
20020154794 Cho Oct 2002 A1
20020158750 Almalik Oct 2002 A1
20020164054 McCartney et al. Nov 2002 A1
20020175182 Matthews Nov 2002 A1
20020186131 Fettis Dec 2002 A1
20020191075 Doi et al. Dec 2002 A1
20020191076 Wada et al. Dec 2002 A1
20020194128 Maritzen et al. Dec 2002 A1
20020194131 Dick Dec 2002 A1
20020198731 Barnes et al. Dec 2002 A1
20030002714 Wakiyama Jan 2003 A1
20030012413 Kusakari et al. Jan 2003 A1
20030014372 Wheeler et al. Jan 2003 A1
20030020828 Ooi et al. Jan 2003 A1
20030038173 Blackson et al. Feb 2003 A1
20030046228 Berney Mar 2003 A1
20030053663 Chen et al. Mar 2003 A1
20030055689 Block et al. Mar 2003 A1
20030055787 Fujii Mar 2003 A1
20030058492 Wakiyama Mar 2003 A1
20030061172 Robinson Mar 2003 A1
20030061233 Manasse et al. Mar 2003 A1
20030065626 Allen Apr 2003 A1
20030071743 Seah et al. Apr 2003 A1
20030072475 Tamori Apr 2003 A1
20030073499 Reece Apr 2003 A1
20030074317 Hofi Apr 2003 A1
20030074326 Byers Apr 2003 A1
20030076161 Tisse Apr 2003 A1
20030076300 Lauper et al. Apr 2003 A1
20030076984 Tisse et al. Apr 2003 A1
20030080194 O'Hara et al. May 2003 A1
20030086057 Cleveland May 2003 A1
20030091215 Lauper et al. May 2003 A1
20030092489 Veradej May 2003 A1
20030095689 Vollkommer et al. May 2003 A1
20030098776 Friedli May 2003 A1
20030099379 Monk et al. May 2003 A1
20030099381 Ohba May 2003 A1
20030103652 Lee et al. Jun 2003 A1
20030107097 McArthur et al. Jun 2003 A1
20030107645 Yoon Jun 2003 A1
20030108224 Ike Jun 2003 A1
20030108225 Li Jun 2003 A1
20030115148 Takhar Jun 2003 A1
20030115459 Monk Jun 2003 A1
20030116630 Carey et al. Jun 2003 A1
20030118212 Min et al. Jun 2003 A1
20030118217 Kondo et al. Jun 2003 A1
20030123711 Kim et al. Jul 2003 A1
20030125054 Garcia Jul 2003 A1
20030125057 Pesola Jul 2003 A1
20030126560 Kurapati et al. Jul 2003 A1
20030131245 Linderman Jul 2003 A1
20030131265 Bhakta Jul 2003 A1
20030133597 Moore et al. Jul 2003 A1
20030140235 Immega et al. Jul 2003 A1
20030140928 Bui et al. Jul 2003 A1
20030141411 Pandya et al. Jul 2003 A1
20030149881 Patel et al. Aug 2003 A1
20030152251 Ike Aug 2003 A1
20030152252 Kondo et al. Aug 2003 A1
20030156741 Lee et al. Aug 2003 A1
20030158762 Wu Aug 2003 A1
20030158821 Maia Aug 2003 A1
20030159051 Hollnagel Aug 2003 A1
20030163739 Armington et al. Aug 2003 A1
20030169334 Braithwaite et al. Sep 2003 A1
20030169901 Pavlidis et al. Sep 2003 A1
20030169907 Edwards et al. Sep 2003 A1
20030173408 Mosher, Jr. et al. Sep 2003 A1
20030174049 Beigel et al. Sep 2003 A1
20030177051 Driscoll et al. Sep 2003 A1
20030182151 Taslitz Sep 2003 A1
20030182182 Kocher Sep 2003 A1
20030191949 Odagawa Oct 2003 A1
20030194112 Lee Oct 2003 A1
20030195935 Leeper Oct 2003 A1
20030198368 Kee Oct 2003 A1
20030200180 Phelan, III et al. Oct 2003 A1
20030210139 Brooks et al. Nov 2003 A1
20030210802 Schuessier Nov 2003 A1
20030218719 Abourizk et al. Nov 2003 A1
20030225711 Paping Dec 2003 A1
20030228898 Rowe Dec 2003 A1
20030233556 Angelo et al. Dec 2003 A1
20030235326 Morikawa et al. Dec 2003 A1
20030235411 Morikawa et al. Dec 2003 A1
20030236120 Reece et al. Dec 2003 A1
20040001614 Russon et al. Jan 2004 A1
20040002894 Kocher Jan 2004 A1
20040005078 Tillotson Jan 2004 A1
20040006553 de Vries et al. Jan 2004 A1
20040010462 Moon et al. Jan 2004 A1
20040012760 Mihashi et al. Jan 2004 A1
20040019570 Bolle et al. Jan 2004 A1
20040023664 Mirouze et al. Feb 2004 A1
20040023709 Beaulieu et al. Feb 2004 A1
20040025030 Corbett-Clark et al. Feb 2004 A1
20040025031 Ooi et al. Feb 2004 A1
20040025053 Hayward Feb 2004 A1
20040029564 Hodge Feb 2004 A1
20040030930 Nomura Feb 2004 A1
20040035123 Kim et al. Feb 2004 A1
20040037450 Bradski Feb 2004 A1
20040039914 Barr et al. Feb 2004 A1
20040042641 Jakubowski Mar 2004 A1
20040044627 Russell et al. Mar 2004 A1
20040046640 Jourdain et al. Mar 2004 A1
20040049687 Orsini et al. Mar 2004 A1
20040050924 Mletzko et al. Mar 2004 A1
20040050930 Rowe Mar 2004 A1
20040052405 Walfridsson Mar 2004 A1
20040052418 DeLean Mar 2004 A1
20040059590 Mercredi et al. Mar 2004 A1
20040059953 Purnell Mar 2004 A1
20040104266 Bolle et al. Jun 2004 A1
20040117636 Cheng Jun 2004 A1
20040133804 Smith et al. Jul 2004 A1
20040146187 Jeng Jul 2004 A1
20040148526 Sands et al. Jul 2004 A1
20040160518 Park Aug 2004 A1
20040162870 Matsuzaki et al. Aug 2004 A1
20040162984 Freeman et al. Aug 2004 A1
20040169817 Grotehusmann et al. Sep 2004 A1
20040172541 Ando et al. Sep 2004 A1
20040174070 Voda et al. Sep 2004 A1
20040190759 Caldwell Sep 2004 A1
20040193893 Braithwaite et al. Sep 2004 A1
20040219902 Lee et al. Nov 2004 A1
20040233038 Beenau et al. Nov 2004 A1
20040240711 Hamza et al. Dec 2004 A1
20040252866 Tisse et al. Dec 2004 A1
20040255168 Murashita et al. Dec 2004 A1
20050008200 Azuma et al. Jan 2005 A1
20050008201 Lee et al. Jan 2005 A1
20050012817 Hampapur et al. Jan 2005 A1
20050029353 Isemura et al. Feb 2005 A1
20050052566 Kato Mar 2005 A1
20050055582 Bazakos et al. Mar 2005 A1
20050063567 Saitoh et al. Mar 2005 A1
20050084137 Kim et al. Apr 2005 A1
20050084179 Hanna et al. Apr 2005 A1
20050099288 Spitz et al. May 2005 A1
20050102502 Sagen May 2005 A1
20050110610 Bazakos et al. May 2005 A1
20050125258 Yellin et al. Jun 2005 A1
20050127161 Smith et al. Jun 2005 A1
20050129286 Hekimian Jun 2005 A1
20050134796 Zelvin et al. Jun 2005 A1
20050138385 Friedli et al. Jun 2005 A1
20050138387 Lam et al. Jun 2005 A1
20050146640 Shibata Jul 2005 A1
20050151620 Neumann Jul 2005 A1
20050152583 Kondo et al. Jul 2005 A1
20050193212 Yuhara Sep 2005 A1
20050199708 Friedman Sep 2005 A1
20050206501 Farhat Sep 2005 A1
20050206502 Bernitz Sep 2005 A1
20050207614 Schonberg et al. Sep 2005 A1
20050210267 Sugano et al. Sep 2005 A1
20050210270 Rohatgi et al. Sep 2005 A1
20050210271 Chou et al. Sep 2005 A1
20050238214 Matsuda et al. Oct 2005 A1
20050240778 Saito Oct 2005 A1
20050248725 Ikoma et al. Nov 2005 A1
20050249385 Kondo et al. Nov 2005 A1
20050255840 Markham Nov 2005 A1
20060093190 Cheng et al. May 2006 A1
20060147094 Yoo Jul 2006 A1
20060165266 Hamza Jul 2006 A1
20060274919 LoIacono et al. Dec 2006 A1
20070036397 Hamza Feb 2007 A1
20070140531 Hamza Jun 2007 A1
20070160266 Jones et al. Jul 2007 A1
20070189582 Hamza et al. Aug 2007 A1
20070206840 Jacobson Sep 2007 A1
20070211924 Hamza Sep 2007 A1
20070274570 Hamza Nov 2007 A1
20070274571 Hamza Nov 2007 A1
20070286590 Terashima Dec 2007 A1
20080005578 Shafir Jan 2008 A1
20080075334 Determan et al. Mar 2008 A1
20080075441 Jelinek et al. Mar 2008 A1
20080211347 Wright et al. Sep 2008 A1
20080252412 Larsson et al. Oct 2008 A1
20080267456 Anderson Oct 2008 A1
20090092283 Whillock et al. Apr 2009 A1
20090316993 Brasnett et al. Dec 2009 A1
20100002913 Hamza Jan 2010 A1
20100033677 Jelinek Feb 2010 A1
20100034529 Jelinek Feb 2010 A1
20100142765 Hamza Jun 2010 A1
20100182440 McCloskey Jul 2010 A1
Foreign Referenced Citations (187)
Number Date Country
0484076 May 1992 EP
0593386 Apr 1994 EP
0878780 Nov 1998 EP
0899680 Mar 1999 EP
0910986 Apr 1999 EP
0962894 Dec 1999 EP
1018297 Jul 2000 EP
1024463 Aug 2000 EP
1028398 Aug 2000 EP
1041506 Oct 2000 EP
1041523 Oct 2000 EP
1126403 Aug 2001 EP
1139270 Oct 2001 EP
1237117 Sep 2002 EP
1477925 Nov 2004 EP
1635307 Mar 2006 EP
2369205 May 2002 GB
2371396 Jul 2002 GB
2375913 Nov 2002 GB
2402840 Dec 2004 GB
2411980 Sep 2005 GB
9161135 Jun 1997 JP
9198545 Jul 1997 JP
9201348 Aug 1997 JP
9147233 Sep 1997 JP
9234264 Sep 1997 JP
9305765 Nov 1997 JP
9319927 Dec 1997 JP
10021392 Jan 1998 JP
10040386 Feb 1998 JP
10049728 Feb 1998 JP
10137219 May 1998 JP
10137221 May 1998 JP
10137222 May 1998 JP
10137223 May 1998 JP
10248827 Sep 1998 JP
10269183 Oct 1998 JP
11047117 Feb 1999 JP
11089820 Apr 1999 JP
11200684 Jul 1999 JP
11203478 Jul 1999 JP
11213047 Aug 1999 JP
11339037 Dec 1999 JP
2000005149 Jan 2000 JP
2000005150 Jan 2000 JP
2000011163 Jan 2000 JP
2000023946 Jan 2000 JP
2000083930 Mar 2000 JP
2000102510 Apr 2000 JP
2000102524 Apr 2000 JP
2000105830 Apr 2000 JP
2000107156 Apr 2000 JP
2000139878 May 2000 JP
2000155863 Jun 2000 JP
2000182050 Jun 2000 JP
2000185031 Jul 2000 JP
2000194972 Jul 2000 JP
2000237167 Sep 2000 JP
2000242788 Sep 2000 JP
2000259817 Sep 2000 JP
2000356059 Dec 2000 JP
2000357232 Dec 2000 JP
2001005948 Jan 2001 JP
2001067399 Mar 2001 JP
2001101429 Apr 2001 JP
2001167275 Jun 2001 JP
2001222661 Aug 2001 JP
2001292981 Oct 2001 JP
2001297177 Oct 2001 JP
2001358987 Dec 2001 JP
2002119477 Apr 2002 JP
2002133415 May 2002 JP
2002153444 May 2002 JP
2002153445 May 2002 JP
2002260071 Sep 2002 JP
2002271689 Sep 2002 JP
2002286650 Oct 2002 JP
2002312772 Oct 2002 JP
2002329204 Nov 2002 JP
2003006628 Jan 2003 JP
2003036434 Feb 2003 JP
2003108720 Apr 2003 JP
2003108983 Apr 2003 JP
2003132355 May 2003 JP
2003150942 May 2003 JP
2003153880 May 2003 JP
2003242125 Aug 2003 JP
2003271565 Sep 2003 JP
2003271940 Sep 2003 JP
2003308522 Oct 2003 JP
2003308523 Oct 2003 JP
2003317102 Nov 2003 JP
2003331265 Nov 2003 JP
2004005167 Jan 2004 JP
2004021406 Jan 2004 JP
2004030334 Jan 2004 JP
2004038305 Feb 2004 JP
2004094575 Mar 2004 JP
2004152046 May 2004 JP
2004163356 Jun 2004 JP
2004164483 Jun 2004 JP
2004171350 Jun 2004 JP
2004171602 Jun 2004 JP
2004206444 Jul 2004 JP
2004220376 Aug 2004 JP
2004261515 Sep 2004 JP
2004280221 Oct 2004 JP
2004280547 Oct 2004 JP
2004287621 Oct 2004 JP
2004315127 Nov 2004 JP
2004318248 Nov 2004 JP
2005004524 Jan 2005 JP
2005011207 Jan 2005 JP
2005025577 Jan 2005 JP
2005038257 Feb 2005 JP
2005062990 Mar 2005 JP
2005115961 Apr 2005 JP
2005148883 Jun 2005 JP
WO 9717674 May 1997 WO
9721188 Jun 1997 WO
WO 9802083 Jan 1998 WO
WO 9808439 Mar 1998 WO
WO 9932317 Jul 1999 WO
WO 9952422 Oct 1999 WO
WO 9965175 Dec 1999 WO
WO 0028484 May 2000 WO
WO 0029986 May 2000 WO
WO 0031677 Jun 2000 WO
WO 0036605 Jun 2000 WO
WO 0062239 Oct 2000 WO
WO 0101329 Jan 2001 WO
WO 0103100 Jan 2001 WO
WO 0128476 Apr 2001 WO
WO 0135348 May 2001 WO
WO 0135349 May 2001 WO
WO 0140982 Jun 2001 WO
WO 0163994 Aug 2001 WO
WO 0169490 Sep 2001 WO
WO 0186599 Nov 2001 WO
WO 0201451 Jan 2002 WO
WO 0219030 Mar 2002 WO
WO 0235452 May 2002 WO
WO 0235480 May 2002 WO
WO 02091735 Nov 2002 WO
WO 02095657 Nov 2002 WO
WO 03002387 Jan 2003 WO
WO 03003910 Jan 2003 WO
WO 03054777 Jul 2003 WO
WO 03077077 Sep 2003 WO
WO 2004029863 Apr 2004 WO
WO 2004042646 May 2004 WO
WO 2004055737 Jul 2004 WO
WO 2004089214 Oct 2004 WO
WO 2004097743 Nov 2004 WO
WO 2005008567 Jan 2005 WO
WO 2005013181 Feb 2005 WO
WO 2005024698 Mar 2005 WO
WO 2005024708 Mar 2005 WO
WO 2005024709 Mar 2005 WO
WO 2005029388 Mar 2005 WO
WO 2005062235 Jul 2005 WO
WO 2005069252 Jul 2005 WO
WO 2005093510 Oct 2005 WO
WO 2005093681 Oct 2005 WO
WO 2005096962 Oct 2005 WO
WO 2005098531 Oct 2005 WO
WO 2005104704 Nov 2005 WO
WO 2005109344 Nov 2005 WO
WO 2006012645 Feb 2006 WO
WO 2006023046 Mar 2006 WO
WO 2006051462 May 2006 WO
WO 2006063076 Jun 2006 WO
WO 2006081209 Aug 2006 WO
WO 2006081505 Aug 2006 WO
WO 2007101269 Sep 2007 WO
WO 2007101275 Sep 2007 WO
WO 2007101276 Sep 2007 WO
WO 2007103698 Sep 2007 WO
WO 2007103701 Sep 2007 WO
WO 2007103833 Sep 2007 WO
WO 2007103834 Sep 2007 WO
WO 2008016724 Feb 2008 WO
WO 2008019168 Feb 2008 WO
WO 2008019169 Feb 2008 WO
WO 2008021584 Feb 2008 WO
WO 2008031089 Mar 2008 WO
WO 2008040026 Apr 2008 WO
Related Publications (1)
Number Date Country
20100239119 A1 Sep 2010 US
Provisional Applications (1)
Number Date Country
60778770 Mar 2006 US