Combined face and iris recognition system

Information

  • Patent Grant
  • Patent Number
    8,705,808
  • Date Filed
    Friday, March 2, 2007
  • Date Issued
    Tuesday, April 22, 2014
Abstract
A system using face and iris image capture for recognition of people. The system may have wide field-of-view, medium field-of-view and narrow field-of-view cameras to capture images of a scene of people, faces and irises for processing and recognition. Matching of the face and iris images with images of a database may be a basis for recognition and identification of a subject person.
Description
BACKGROUND

The present invention pertains to recognition systems and particularly to biometric recognition systems. More particularly, the invention pertains to combination face and iris recognition systems.


U.S. Provisional Application No. 60/778,770, filed Mar. 3, 2006, is hereby incorporated by reference.


U.S. Provisional Application No. 60/807,046, filed Jul. 11, 2006, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/275,703, filed Jan. 25, 2006, is hereby incorporated by reference.


U.S. Provisional Application No. 60/647,270, filed Jan. 26, 2005, is hereby incorporated by reference.


U.S. patent application Ser. No. 10/979,129, filed Nov. 3, 2004, is hereby incorporated by reference.


U.S. patent application Ser. No. 10/655,124, filed Sep. 5, 2003, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/382,373, filed May 9, 2006, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/043,366, filed Jan. 26, 2005, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/372,854, filed Mar. 10, 2006, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/672,108, filed Feb. 7, 2007, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/675,424, filed Feb. 15, 2007, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/681,614, filed Mar. 2, 2007, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/681,662, filed Mar. 2, 2007, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/681,751, filed Mar. 2, 2007, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/681,470, filed Mar. 3, 2007, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/681,505, filed Mar. 3, 2007, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/681,251, filed Mar. 3, 2007, is hereby incorporated by reference.


SUMMARY

The present invention is a combined face and iris recognition system.





BRIEF DESCRIPTION OF THE DRAWING


FIG. 1 is a diagram of a combined face and iris recognition system;



FIG. 2 is a diagram of an approach for the face and iris recognition system;



FIG. 3 is a block diagram of the recognition system;



FIG. 4 is an illustration of some of the hardware that might be used for the recognition system; and



FIGS. 5a, 5b and 5c illustrate the enclosure and display for the recognition system.





DESCRIPTION

The present system may relate to biometrics, face and iris recognition systems, image metrics, authentication, access control, monitoring, identification, and security and surveillance systems.


Some potential applications for the present system may include airport access control to secure areas, airport trusted traveler systems, border control—watch list surveillance, industrial access control, outdoor facility access control, military checkpoint surveillance, critical infrastructure . . . .


Due to increased worldwide security concerns, there is a need to accurately identify individuals at a distance. In some applications, it is essential to collect multimodal biometrics of multiple subjects standing or walking in an area. Among the various biometrics available today for recognizing people, iris-based biometrics is the most accurate method for identifying people. The present multimodal biometrics system may rely on face and iris to guarantee a high recognition rate in the most challenging cases. The use of both face and iris may improve the overall accuracy of the system. The human iris may provide a robust biometric signature; however, collecting iris information from a distance is challenging for several reasons. People may be in arbitrary places in the environment; people might not be looking straight into the device; people's head poses might be arbitrary; people's eyelids might occlude part of the iris signature and therefore provide only a partial biometric; and finally, people might be freely moving in the environment monitored by the present apparatus. Stand-off iris acquisition must also address the challenge of imaging the irises with the resolution required for inferring a robust signature. Furthermore, a robust iris signature may be present only at near infrared wavelengths, requiring a sensor capable of measuring such wavelengths and a near infrared illuminator to guarantee good imaging conditions.


Iris recognition may have high accuracy in identifying humans. The iris is also well suited as a biometric, since an exceptionally accurate signature can be derived from its extremely data-rich physical structure, its stability over time, and its non-contact acquisition. Related art iris recognition systems may require users to be within a few inches of the sensor and look directly into the sensor, or towards a specific area. The present system does not necessarily have such a requirement.


The present invention and system may use commercial off the shelf (COTS) face recognition technology combined with custom iris processing algorithms to accurately recognize subjects based on the face and iris at distances significantly greater than a few inches. The present combined face and iris recognition system (CFAIRS) may perform automatic illumination, detection, acquisition and recognition of faces and irises at ranges out to five meters (over 16 feet). The system may also automatically recognize multiple subjects standing in a scene, and optionally enroll any subjects not previously seen in a database.


Relative to the operation of the system, there may be a stationary wide field of view COTS stereo camera 11 for initial subject detection (FIG. 1). It may then use a pan-tilt-zoom (PTZ) COTS medium field of view camera 17 to zoom in and acquire the face of each subject within the field of view of camera 11. The acquired faces may be sent to a COTS face processing software package for recognition, such as with processors 18 and 29. In parallel, a modified PTZ COTS narrow field of view camera 22 may acquire images of the irises for each of the subjects. The system may use modified COTS flash flood illuminators 26 with a controller 27 to control the amount of illumination on a subject.
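The hand-off from wide field of view detection to the two PTZ acquisition branches can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the `Detection` fields and the sequential loop are invented for clarity (the described system runs the face and iris branches in parallel).

```python
from dataclasses import dataclass


@dataclass
class Detection:
    """One subject reported by the WFOV stereo camera 11: pointing
    angles for the PTZ units plus a stereo range estimate."""
    pan_deg: float
    tilt_deg: float
    range_m: float


def acquisition_cycle(detections):
    """For each detection, steer the MFOV camera 17 to the face and the
    NFOV camera 22 to the irises (sequential here; parallel in the
    described system). Returns the pointing commands issued."""
    commands = []
    for d in detections:
        commands.append(("MFOV", d.pan_deg, d.tilt_deg))
        commands.append(("NFOV", d.pan_deg, d.tilt_deg, d.range_m))
    return commands
```

For example, `acquisition_cycle([Detection(10.0, -2.0, 3.5)])` yields one MFOV pointing command for the face and one range-annotated NFOV command for the irises.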


The acquired iris images may be processed to infer a signature allowing recognition of a person. Various processing might be required for inferring such a signature from an image, such as pupil detection, iris segmentation, feature extraction and signature matching software. These processing steps might be optimized for irises acquired at a distance, where off-axis gaze and eyelid occlusions are common. The iris processing software may be specifically designed to deal with untrained or unaware users, handling issues such as off-axis gaze and partial occlusion of the iris due to half-closed eyes.
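The ordering of those processing stages can be sketched with stubs; every function body below is a placeholder standing in for the real algorithm, and the names and return shapes are assumptions rather than the patent's API.

```python
def detect_pupil(image):
    # Stub: a real detector would locate the dark pupil disk.
    return {"center": (0, 0), "radius": 30}


def segment_iris(image, pupil):
    # Stub: a real segmenter would find the iris/sclera boundary and
    # mask eyelid occlusions, which are common in stand-off capture.
    return {"inner_r": pupil["radius"], "outer_r": 2 * pupil["radius"]}


def extract_features(region):
    # Stub: a real extractor would encode the iris texture as a bit code.
    return [0, 1] * 16


def iris_signature(image):
    """Pupil detection, then iris segmentation, then feature extraction;
    signature matching against the database happens afterwards."""
    pupil = detect_pupil(image)
    region = segment_iris(image, pupil)
    return extract_features(region)
```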



FIG. 1 is a diagram of the present combined face and iris recognition system 10. The system may have a wide field of view (WFOV) camera 11 for surveillance of a scene having one or more subjects of interest, such as people. Camera 11 might be a wide angle stereo camera for providing distance to the targets or subjects of interest. Camera 11 may be connected to an overall system processing unit 12. A camera module 13 may be connected to unit 12. It may be used for obtaining images of faces of people. A high resolution narrow field of view camera module 14 may be connected to unit 12. Module 14 may be used for obtaining images of a feature of a face, such as an iris. An illumination module 15 may be connected to module 14. Module 15 may be used to illuminate the subject so that the high resolution module 14 can obtain good images of both irises. Module 15 may be able to illuminate with infrared light. A system input/output interface electronics (I/O electronics) module 16 may be connected to unit 12, module 14 and module 15.


Module 13 may have a medium field-of-view (MFOV) camera (e.g., security camera) 17 that is connected to a face “process” 18 in unit 12. The term “processor” may be used in lieu of “process” in that a process would include processing. Such processes or processors may be a part of a larger processor, such as a system processor. A pan-tilt-zoom (PTZ) control unit 19 may be connected to the MFOV camera 17. The PTZ unit 19 may be connected to a surveillance process or processor 21 in the overall system processing unit 12. Module 14 may have a high resolution narrow field-of-view (NFOV) camera 22, and a pan-tilt and zoom (PTZ) control unit 23 connected to camera 22 and the surveillance processor 21. A NFOV camera controller 24 may be connected to the high resolution camera 22, the system I/O electronics module 16 and the surveillance processor 21. The camera might be connected to the WFOV stereo camera 11. The camera 22 may also be connected to an iris process or processor 25.


Module 15 may have an illuminator module consisting of a number of near infrared illuminators 26 and an illumination controller 27 connected to the illuminators 26. Controller 27 may also be connected to I/O electronics module 16 and the NFOV camera controller 24.


The WFOV camera 11 may be connected to a WFOV process or processor 28 of unit 12. WFOV processor 28 may be connected to surveillance processor 21. The face processor 18 may be connected to the surveillance processor 21 and to a face/iris recognition processor 29. Iris processor 25 may be connected to a surveillance processor 21 and the face/iris recognition processor 29. The face/iris recognition processor 29 may be connected to the surveillance processor 21. The face/iris recognition processor 29 may be connected to a face/iris database 31.


The system I/O electronics module 16 may be connected to a system processor 32. The surveillance processor 21 may be connected to the system processor 32. A user interface 33 may be connected to the surveillance processor 21 and the system processor 32.



FIG. 2 shows a processing flow in the present system 10 (FIG. 1). In a first step 35, subjects may be detected in the wide field of view image. This step 35 may use an open source or other suitable algorithm for face detection. The wide FOV stereo camera 11 may also compute the range to each of the subjects in a scene. A next step 36 may include prioritization of the subjects in the scene monitored by the present system. Detected subjects might be prioritized for iris and face acquisition. The prioritization scheme might use the quality of the imaged biometrics (i.e., face and iris) or the motion of people in the scene. Quality of the imaged biometrics might be measured by the quantity or quality of the face and iris images captured, or by confidence measures obtained after matching the subject with the database. After prioritization, the face processing (branch 37) and iris processing (branch 38) may occur in parallel.
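One way to realize the prioritization of step 36 is a score that favors subjects whose biometrics have not yet been captured well and who are easiest to image. The scoring formula and field names below are invented for illustration; the patent does not specify a particular formula.

```python
def priority(subject):
    """Higher score = acquire sooner. Subjects with poor captures so far
    rank high; fast-moving subjects rank lower (harder to image)."""
    quality = subject["best_biometric_quality"]  # 0.0 (none yet) .. 1.0 (good)
    speed = subject["speed_m_s"]
    return (1.0 - quality) / (1.0 + speed)


subjects = [
    {"id": 1, "best_biometric_quality": 0.9, "speed_m_s": 0.0},
    {"id": 2, "best_biometric_quality": 0.1, "speed_m_s": 0.5},
]
# Subject 2, with only a poor capture so far, is acquired first even
# though it is moving; subject 1 already has a good capture.
queue = sorted(subjects, key=priority, reverse=True)
```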


The face processing branch 37 may use the subject location and range information detected and computed by the wide FOV stereo camera 11 to point the medium field of view camera 17 at a subject. A face image may be acquired at step 44 and then sent to the COTS face recognition software or processor 18. The face recognition software or processor 18 may compute features on the face image and compare them against features in a stored database 31, at step 45, with a recognition processor 29 for identification of the subject. If the face or features do not appear in the database, then they may be enrolled in the database.


The iris branch 38 may use the subject location and range information detected and computed by the wide FOV stereo camera 11 to point the modified near infrared (NIR) illuminators 26 to illuminate the face at step 39. The system 10 may do the iris image acquisition in near IR to highlight lower contrast iris features. A next step 41 may use the subject location and range information detected and computed by the wide FOV stereo camera 11 to point the NFOV high resolution camera 22 at the subject and zoom to the right scale, after which the camera may acquire the iris image. The high resolution camera 22 may be modified to remove an IR blocking filter, or camera 22 may use a NIR sensor. The system may control the focus and zoom of camera 22 using the camera controller 24. As an example, a two-part algorithm may be used to focus the iris camera 22 on the subject's iris. This algorithm may first use the subject range to compute the zoom factor required for imaging irises with sufficient resolution, auto-focus the camera 22 using visible light, and then use the subject range to compute a focus offset to bring the image into focus at the NIR wavelength. Other suitable algorithms may be used.
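The two range-driven computations in that algorithm (a zoom factor giving sufficient iris resolution, then a visible-to-NIR focus offset) can be sketched with a thin-lens approximation. All constants here (iris diameter, pixel pitch, required pixel count, the offset coefficient) are illustrative assumptions, not values from the patent.

```python
def required_focal_length_mm(range_m, min_iris_px=150,
                             iris_diameter_mm=12.0, pixel_pitch_um=4.0):
    """Focal length at which the iris spans at least min_iris_px pixels.
    Thin-lens relation: image_size = f * object_size / range, so
    f = min_iris_px * pixel_pitch * range / iris_diameter."""
    pixel_pitch_mm = pixel_pitch_um / 1000.0
    return min_iris_px * pixel_pitch_mm * (range_m * 1000.0) / iris_diameter_mm


def nir_focus_offset(range_m, coeff=0.8):
    """Hypothetical focus correction applied after the visible-light
    autofocus completes, compensating for the focal shift between
    visible and NIR; modeled here as decreasing with range."""
    return coeff / range_m
```

Under these assumed constants, imaging an iris at the system's 5 m limit calls for roughly a 250 mm focal length, which is why a long motorized zoom lens is needed on the NFOV camera.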


After iris acquisition, the iris images may be processed with custom iris processing algorithms to extract unique features, at step 42, with an iris processor 25. The extracted features may be matched to features stored in a database 31, at step 43, with the recognition processor 29, to identify the subject. If the extracted features are not matched to features stored in the database, the subject may be enrolled in the system by adding the features and subject information to the database.
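The match-or-enroll decision at steps 42 and 43 can be sketched as a nearest-template search over bit codes. The fractional Hamming distance and the 0.32 threshold follow common iris-code practice and are assumptions here, not the patent's specific matcher.

```python
def hamming_fraction(code_a, code_b):
    """Fraction of disagreeing bits between two equal-length iris codes."""
    return sum(a != b for a, b in zip(code_a, code_b)) / len(code_a)


def match_or_enroll(code, database, threshold=0.32):
    """Return (subject_id, enrolled): the best database match under the
    threshold, or a newly enrolled entry if nothing matches."""
    best_id, best_dist = None, 1.0
    for subject_id, stored in database.items():
        d = hamming_fraction(code, stored)
        if d < best_dist:
            best_id, best_dist = subject_id, d
    if best_id is not None and best_dist < threshold:
        return best_id, False  # recognized existing subject
    new_id = max(database, default=0) + 1
    database[new_id] = list(code)  # enroll unseen subject
    return new_id, True
```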


Another step 46 in the system 10 process may be a display of the matched results for both the face and iris. Matched results may be a corroboration of the identity of the subject. For specific applications, these results may be passed to a security system (e.g., access control). The entire process may then be repeated with a return to step 35 for other subjects. A custom architecture may maintain the inter-component communications and control for the system.



FIG. 3 shows a general layout of the present system. There may be a camera or sensor module 61 for obtaining images of one or more subjects in a scene. Images of objects and features of each subject may be obtained by the camera module. These images may be processed by an object/feature processor 62. The processed images may go to a recognition processor 63, which may interact with an object/feature database 64 to match object images and feature images with like images in the database 64 for purposes of seeking recognition and identity of the subject. For example, a subject may be a person, an object may be a face, and a feature may be an iris. The subject, object and feature could represent other entities.


The camera module 61 may take WFOV images of a scene with one or more subjects. The module 61 may take MFOV images of an object of the one or more subjects. Also, module 61 may take NFOV images of a feature of the object. The module 61 may take pictures of the various fields-of-view with just one adjustable camera with adjustable resolution, or with more than one camera. A general processor 65 may coordinate various operations of the module 61 and processors 62 and 63. The processor 65 may provide user interfacing and also a connection with the world outside of the present system. It may even provide the recognition processor 63 with an external database located at the outside 66 via internet, cable, wireless and other media.


The present recognition system 10 may have an enclosure including the following items. The items may include a commercial off the shelf (COTS) wide field of view (WFOV) stereo camera 11, a COTS medium field of view pan-tilt-zoom (PTZ) security or other kind of camera system 13 and a narrow field of view, such as a high resolution, iris camera system 14. The narrow field of view camera system 14 may include a modified COTS high resolution camera 22, a motorized COTS zoom lens, a narrow field of view iris camera controller 24, and a customized pan-tilt-zoom (PTZ) controller unit 23. One of the cameras 17 and/or 22 may be a standard PTZ camera. The present recognition system may also include an illumination system 15, system I/O electronics 16 and a processing system 12. The illumination system 15 may include a number of modified flash illuminators 26 and an illuminator controller 27. The processing system 12 may include several computers, a system processor, a digital signal processor or a customized processing board. The system may provide a user interface 33 having a monitor, a keyboard with a built-in mouse, and a keyboard, video, and mouse (KVM) switch.


It may be noted that the COTS WFOV stereo camera 11 may be used for locating the position of people in the scene. The COTS medium field of view PTZ security camera 17 may be used to capture images of the people's faces. The high resolution iris camera system 14 may be used to capture near infrared iris images of the people in the scene. The iris camera 22 may be connected to the customized PTZ controller unit 23, which allows the camera 22 to be pointed at people in the scene. The iris camera controller 24 may control a motorized COTS zoom lens which allows the system to zoom in on the irises in the scene. It may also be connected to a modified lens extension, along with extension lens communication hardware, that enables the system to override the auto-focus control.


The illumination system 15 may have a number of modified flash illuminators 26 and associated illuminator electronics. The illuminator electronics 27 may allow the system to select which flash units are used for each picture taken. The system I/O electronics 16 may be used for initializing components by switching power to the devices on and off. It may also be used to monitor the camera to detect when the auto-focus is ready, and to select illuminators before they are fired.


The processing system 12, which includes computers, may be used for the processing and storage of the images as well as for the user interface 33. There also may be a monitor and a keyboard with a built-in mouse which are used for the user interface. The KVM switch may be used to switch between computers for debugging purposes.


Some of these components may be modified. The COTS high resolution camera 22 may have its infrared blocking mechanism removed. A PTZ unit may be customized to fit the camera 22, with mechanical stops installed to limit the travel.


The COTS zoom lens may be engineered with gears and a motor to allow electronic control over the zoom and focus settings. The lens extension may be disassembled and its electronics modified. The flash illuminators 26 may be modified so that they can be remotely triggered and have the “on” time of the flash reduced. A visible light filter may be added to limit the illumination to mostly infrared.


Some of the operation of the present recognition system 10 may be noted. As a person enters a scene, the WFOV camera 11 may detect the person and send images to the WFOV processor 28. The WFOV processor may locate the person's head position and range and send coordinates of the head to the surveillance processor 21. The surveillance processor 21 may prioritize which person in the scene needs to be captured with either the MFOV camera 17 or the iris camera 22. Camera 17 may be a security camera. The surveillance processor may send PTZ coordinates to the camera PTZ 19. The camera PTZ 19 may move to the correct location and send a command to the camera 17, which may then take a picture of the person and send the image to the face/iris recognition processor 29. The face/iris recognition processor may check the database 31 for a face match and may enroll the face if it does not have a match.


The surveillance processor 21 may send the PTZ coordinates to the corresponding PTZ 23 of the narrow field of view (NFOV) camera 22. The PTZ 23 may move the camera to the position and zoom coordinates and then send a command to the NFOV or iris camera 22. The iris camera 22 may use its built-in auto focus electronics to focus the image in the visible wavelength. The system may wait for the auto-focus to complete, and then apply a focusing offset, based on the range of the subject, to get the image in focus in the near infrared wavelength. The infrared illuminators may then fire and the iris camera 22 may then take a picture of the person and send the image to the face/iris recognition processor 29 via processor 25. The face/iris recognition processor 29 may check a database 31 for a match of the irises and may enroll the irises if there is no match. The face/iris recognition processor 29 may also keep track of pairing faces and irises from the same subject.


A feature of the present recognition system 10 is that it may use either the face, the iris, or both the face and iris in the recognition process. The system may work at large distances (≧5 m, 16+ ft). Also, the system may handle a large number of subjects.



FIG. 4 shows an example structure 53 containing system 10. A high resolution camera 22, a range estimation camera 11 and a face capture camera 17 are shown. Custom electronics 51 may encompass items of unit 12 and other items of a control and electronic nature. Also shown are flashes 52 which may be a part of illuminator 26.



FIGS. 5a and 5c show cabinetry for encasing and/or enclosing the system 10. Structure 53 of FIG. 4 may be situated on top of cabinet 54 when open for use as shown in FIG. 5a. FIG. 5c shows the cabinet 54 closed up with structure 53 inside. The cabinet 54 may be rolled on wheels 55 and pulled by a handle 56. Associated with structure 53 may be a display/interface 57. It may show a scene 58 of several subjects one or more of which have a face image captured by a camera. An inset 59 may alternatively display a face, or a close-up of an eye or iris of the subject being imaged. Items 57, 58 and 59 are shown in FIG. 5b.


An alternative instantiation of this invention may combine the face and iris acquisition functions into a single camera. This may combine the narrow field of view camera and the medium field of view camera with a single camera that could image the full face and the irises at a high resolution.


An alternative instantiation of this invention may simplify the WFOV camera by discarding the stereo sensor and instead using anthropometric information computed from the wide field of view imagery to provide an estimate of the range of each subject in the scene.
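The stereo-free range estimate can be sketched with the pinhole-camera relation range = f_px · W / w_px, where W is an assumed average head breadth. The head-breadth value and the example focal length below are illustrative assumptions, not figures from the patent.

```python
def range_from_head_width(head_width_px, focal_length_px,
                          head_width_m=0.155):
    """Pinhole-camera range estimate from apparent head size:
    range = focal_length_px * true_width_m / apparent_width_px.
    0.155 m is an assumed average adult head breadth (anthropometry)."""
    return focal_length_px * head_width_m / head_width_px
```

For instance, with an assumed focal length of 2000 pixels, a head imaged 62 pixels wide is estimated to stand about 5 m away, right at the system's stated operating limit.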


Another instantiation of this invention may combine the functions of the wide field of view camera, the medium field of view camera, and the narrow field of view camera into a single narrow field of view PTZ camera. This camera may use a wide zoom setting to do the wide field of view subject detection, a medium zoom setting to do the face acquisition, and a narrow zoom setting to do the iris acquisition.


In the present specification, some of the matter may be of a hypothetical or prophetic nature although stated in another manner or tense.


Although the invention has been described with respect to at least one illustrative example, many variations and modifications will become apparent to those skilled in the art upon reading the present specification. It is therefore the intention that the appended claims be interpreted as broadly as possible in view of the prior art to include all such variations and modifications.

Claims
  • 1. A subject recognition system comprising: a wide field-of-view camera configured to locate positions and ranges of people in an area; a medium field of view camera configured to capture images of faces of the people; a narrow field of view camera configured to capture images of irises of the people; and a processing system configured to receive image information from the wide field-of-view, medium field of view, and narrow field of view cameras, the processing system being configured to perform biometric identity recognition processing on image information from the wide field-of-view, medium field of view, and narrow field of view cameras, wherein the processing system is configured to identify the individual identity of persons based upon the images of faces captured by the medium field of view camera; and wherein the processing system is configured to identify the individual identity of persons based upon the images of irises captured by the narrow field of view camera.
  • 2. The system of claim 1, further comprising a controller for driving a zoom of the narrow field of view camera.
  • 3. A biometric human recognition system comprising: a wide field of view camera configured for initial subject detection; a medium field of view camera configured for acquiring images of faces of subjects; a narrow field of view camera configured for acquiring images of irises of subjects; and a processing system configured to receive image information from the wide field of view, medium field of view, and narrow field of view cameras, the processing system being configured to perform facial identity recognition processing on image information from the medium field of view camera and iris identity recognition processing on image information from the narrow field of view camera; wherein the processing system is configured to corroborate matched results of both the face and iris of one of the subjects.
  • 4. The system of claim 3, further comprising a user interface configured to display a scene of one or more subjects, a face of one of the one or more subjects, and an iris of the one of the one or more subjects.
  • 5. The system of claim 3, further comprising a user interface, and wherein the processing system may display matched results for both a face and an iris of one of the subjects.
  • 6. The system of claim 1, wherein the wide field of view camera is for estimating a distance between the cameras and the subjects.
  • 7. The system of claim 1, wherein the distance to the subject may be inferred by stereo or from a calibrated camera and anthropometric knowledge.
  • 8. The system of claim 1, wherein a correct zoom factor of the narrow field of view is estimated for capturing iris images.
  • 9. The system of claim 1, wherein the processing system comprises a mechanism for storage and retrieval of images.
  • 10. The system of claim 9 further comprising: a mechanism for face and iris matching and/or enrolling connected to the processing system; and an interface module connected to the mechanism for face and iris matching and/or enrolling and to the wide field of view camera, the medium field of view camera and the narrow field of view camera.
  • 11. The system of claim 10, wherein the interface module comprises an iris processor connected to the narrow field of view camera and to the mechanism for face and iris matching and/or enrolling.
  • 12. The system of claim 10, further comprising an illumination system connected to the interface module.
  • 13. The system of claim 12, wherein the illuminator is an infrared iris illuminator.
  • 14. The system of claim 3, wherein the system is capable of performing detection, acquisition and recognition of faces and irises at ranges out to at least five meters.
  • 15. The system of claim 1, wherein the processing system may keep track of pairing face and irises belonging to a same subject of the subjects.
  • 16. The system of claim 1, further comprising a user interface, and wherein the processing system may display matched results for both a face and an iris of a subject of the subjects.
  • 17. The system of claim 1, wherein the system is capable of performing detection, acquisition and recognition of faces and irises at ranges out to at least five meters.
Parent Case Info

This application claims the benefit of U.S. Provisional Application No. 60/778,770, filed Mar. 3, 2006. This application claims the benefit of U.S. Provisional Application No. 60/807,046, filed Jul. 11, 2006. This application is a continuation-in-part of U.S. patent application Ser. No. 11/275,703, filed Jan. 25, 2006, which claims the benefit of U.S. Provisional Application No. 60/647,270, filed Jan. 26, 2005. This application is a continuation-in-part of U.S. patent application Ser. No. 11/043,366, filed Jan. 26, 2005. This application is a continuation-in-part of U.S. patent application Ser. No. 11/372,854, filed Mar. 10, 2006. This application is a continuation-in-part of U.S. patent application Ser. No. 10/979,129, filed Nov. 3, 2004, which is a continuation-in-part of U.S. patent application Ser. No. 10/655,124, filed Sep. 5, 2003. This application is a continuation-in-part of U.S. patent application Ser. No. 11/382,373, filed May 9, 2006. This application is a continuation-in-part of U.S. patent application Ser. No. 11/672,108, filed Feb. 7, 2007. This application is a continuation-in-part of U.S. patent application Ser. No. 11/675,424, filed Feb. 15, 2007. This application is a continuation-in-part of U.S. patent application Ser. No. 11/681,614, filed Mar. 2, 2007. This application is a continuation-in-part of U.S. patent application Ser. No. 11/681,662, filed Mar. 2, 2007. This application is a continuation-in-part of U.S. patent application Ser. No. 11/681,751, filed Mar. 2, 2007. This application is a continuation-in-part of U.S. patent application Ser. No. 11/681,470, filed Mar. 2, 2007. This application is a continuation-in-part of U.S. patent application Ser. No. 11/681,505, filed Mar. 2, 2007. This application is a continuation-in-part of U.S. patent application Ser. No. 11/681,251, filed Mar. 2, 2007, which claims the benefit of U.S. Provisional Application 60/807,046, filed Jul. 11, 2006.

Government Interests

The government may have rights in the present invention.

6832044 Doi et al. Dec 2004 B2
6836554 Bolle et al. Dec 2004 B1
6837436 Swartz et al. Jan 2005 B2
6845879 Park Jan 2005 B2
6853444 Haddad Feb 2005 B2
6867683 Calvesio et al. Mar 2005 B2
6873960 Wood et al. Mar 2005 B1
6896187 Stockhammer May 2005 B2
6905411 Nguyen et al. Jun 2005 B2
6920237 Chen et al. Jul 2005 B2
6930707 Bates et al. Aug 2005 B2
6934849 Kramer et al. Aug 2005 B2
6950139 Fujinawa Sep 2005 B2
6954738 Wang et al. Oct 2005 B2
6957341 Rice et al. Oct 2005 B2
6972797 Izumi Dec 2005 B2
6992562 Fuks et al. Jan 2006 B2
7053948 Konishi May 2006 B2
7071971 Elberbaum Jul 2006 B2
7136581 Fujii Nov 2006 B2
7183895 Bazakos et al. Feb 2007 B2
7184577 Chen et al. Feb 2007 B2
7197173 Jones et al. Mar 2007 B2
7204425 Mosher, Jr. et al. Apr 2007 B2
7277561 Shin Oct 2007 B2
7277891 Howard et al. Oct 2007 B2
7298873 Miller, Jr. et al. Nov 2007 B2
7315233 Yuhara Jan 2008 B2
7362210 Bazakos et al. Apr 2008 B2
7362370 Sakamoto et al. Apr 2008 B2
7362884 Willis et al. Apr 2008 B2
7365771 Kahn et al. Apr 2008 B2
7406184 Wolff et al. Jul 2008 B2
7414648 Imada Aug 2008 B2
7417682 Kuwakino et al. Aug 2008 B2
7418115 Northcott et al. Aug 2008 B2
7421097 Hamza et al. Sep 2008 B2
7443441 Hiraoka Oct 2008 B2
7460693 Loy et al. Dec 2008 B2
7471451 Dent et al. Dec 2008 B2
7486806 Azuma et al. Feb 2009 B2
7518651 Butterworth Apr 2009 B2
7537568 Moehring May 2009 B2
7538326 Johnson et al. May 2009 B2
7542945 Thompson et al. Jun 2009 B2
7580620 Raskar et al. Aug 2009 B2
7593550 Hamza Sep 2009 B2
7639846 Yoda Dec 2009 B2
7722461 Gatto et al. May 2010 B2
7751598 Matey et al. Jul 2010 B2
7756301 Hamza Jul 2010 B2
7756407 Raskar Jul 2010 B2
7761453 Hamza Jul 2010 B2
7762665 Vertegaal et al. Jul 2010 B2
7777802 Shinohara et al. Aug 2010 B2
7804982 Howard et al. Sep 2010 B2
20010026632 Tamai Oct 2001 A1
20010027116 Baird Oct 2001 A1
20010047479 Bromba et al. Nov 2001 A1
20010051924 Uberti Dec 2001 A1
20010054154 Tam Dec 2001 A1
20020010857 Karthik Jan 2002 A1
20020033896 Hatano Mar 2002 A1
20020039433 Shin Apr 2002 A1
20020040434 Elliston et al. Apr 2002 A1
20020062280 Zachariassen et al. May 2002 A1
20020077841 Thompson Jun 2002 A1
20020089157 Breed et al. Jul 2002 A1
20020106113 Park Aug 2002 A1
20020112177 Voltmer et al. Aug 2002 A1
20020114495 Chen et al. Aug 2002 A1
20020130961 Lee et al. Sep 2002 A1
20020131622 Lee et al. Sep 2002 A1
20020139842 Swaine Oct 2002 A1
20020140715 Smet Oct 2002 A1
20020142844 Kerr Oct 2002 A1
20020144128 Rahman et al. Oct 2002 A1
20020150281 Cho Oct 2002 A1
20020154794 Cho Oct 2002 A1
20020158750 Almalik Oct 2002 A1
20020164054 McCartney et al. Nov 2002 A1
20020175182 Matthews Nov 2002 A1
20020186131 Fettis Dec 2002 A1
20020191075 Doi et al. Dec 2002 A1
20020191076 Wada et al. Dec 2002 A1
20020194128 Maritzen et al. Dec 2002 A1
20020194131 Dick Dec 2002 A1
20020198731 Barnes et al. Dec 2002 A1
20030002714 Wakiyama Jan 2003 A1
20030012413 Kusakari et al. Jan 2003 A1
20030014372 Wheeler et al. Jan 2003 A1
20030020828 Ooi et al. Jan 2003 A1
20030038173 Blackson et al. Feb 2003 A1
20030046228 Berney Mar 2003 A1
20030053663 Chen et al. Mar 2003 A1
20030055689 Block et al. Mar 2003 A1
20030055787 Fujii Mar 2003 A1
20030058492 Wakiyama Mar 2003 A1
20030061172 Robinson Mar 2003 A1
20030061233 Manasse et al. Mar 2003 A1
20030065626 Allen Apr 2003 A1
20030071743 Seah et al. Apr 2003 A1
20030072475 Tamori Apr 2003 A1
20030073499 Reece Apr 2003 A1
20030074317 Hofi Apr 2003 A1
20030074326 Byers Apr 2003 A1
20030076161 Tisse Apr 2003 A1
20030076300 Lauper et al. Apr 2003 A1
20030076984 Tisse et al. Apr 2003 A1
20030080194 O'Hara et al. May 2003 A1
20030091215 Lauper et al. May 2003 A1
20030092489 Veradej May 2003 A1
20030095689 Volkommer et al. May 2003 A1
20030098776 Friedli May 2003 A1
20030099379 Monk et al. May 2003 A1
20030099381 Ohba May 2003 A1
20030103652 Lee et al. Jun 2003 A1
20030107097 McArthur et al. Jun 2003 A1
20030107645 Yoon Jun 2003 A1
20030108224 Ike Jun 2003 A1
20030108225 Li Jun 2003 A1
20030115148 Takhar Jun 2003 A1
20030115459 Monk Jun 2003 A1
20030116630 Carey et al. Jun 2003 A1
20030118212 Min et al. Jun 2003 A1
20030118217 Kondo et al. Jun 2003 A1
20030123711 Kim et al. Jul 2003 A1
20030125054 Garcia Jul 2003 A1
20030125057 Pesola Jul 2003 A1
20030126560 Kurapati et al. Jul 2003 A1
20030131245 Linderman Jul 2003 A1
20030131265 Bhakta Jul 2003 A1
20030133597 Moore et al. Jul 2003 A1
20030140235 Immega et al. Jul 2003 A1
20030140928 Bui et al. Jul 2003 A1
20030141411 Pandya et al. Jul 2003 A1
20030149881 Patel et al. Aug 2003 A1
20030152251 Ike Aug 2003 A1
20030152252 Kondo et al. Aug 2003 A1
20030156741 Lee et al. Aug 2003 A1
20030158762 Wu Aug 2003 A1
20030158821 Maia Aug 2003 A1
20030159051 Hollnagel Aug 2003 A1
20030163739 Armington et al. Aug 2003 A1
20030169334 Braithwaite et al. Sep 2003 A1
20030169901 Pavlidis et al. Sep 2003 A1
20030169907 Edwards et al. Sep 2003 A1
20030173408 Mosher, Jr. et al. Sep 2003 A1
20030174049 Beigel et al. Sep 2003 A1
20030177051 Driscoll et al. Sep 2003 A1
20030182151 Taslitz Sep 2003 A1
20030182182 Kocher Sep 2003 A1
20030189480 Hamid Oct 2003 A1
20030189481 Hamid Oct 2003 A1
20030191949 Odagawa Oct 2003 A1
20030194112 Lee Oct 2003 A1
20030195935 Leeper Oct 2003 A1
20030198368 Kee Oct 2003 A1
20030200180 Phelan, III et al. Oct 2003 A1
20030210139 Brooks et al. Nov 2003 A1
20030210802 Schuessier Nov 2003 A1
20030218719 Abourizk et al. Nov 2003 A1
20030225711 Paping Dec 2003 A1
20030228898 Rowe Dec 2003 A1
20030233556 Angelo et al. Dec 2003 A1
20030235326 Morikawa et al. Dec 2003 A1
20030235411 Morikawa et al. Dec 2003 A1
20030236120 Reece et al. Dec 2003 A1
20040001614 Russon et al. Jan 2004 A1
20040002894 Kocher Jan 2004 A1
20040005078 Tillotson Jan 2004 A1
20040006553 de Vries et al. Jan 2004 A1
20040010462 Moon et al. Jan 2004 A1
20040012760 Mihashi et al. Jan 2004 A1
20040019570 Bolle et al. Jan 2004 A1
20040023664 Mirouze et al. Feb 2004 A1
20040023709 Beaulieu et al. Feb 2004 A1
20040025030 Corbett-Clark et al. Feb 2004 A1
20040025031 Ooi et al. Feb 2004 A1
20040025053 Hayward Feb 2004 A1
20040029564 Hodge Feb 2004 A1
20040030930 Nomura Feb 2004 A1
20040035123 Kim et al. Feb 2004 A1
20040037450 Bradski Feb 2004 A1
20040039914 Barr et al. Feb 2004 A1
20040042641 Jakubowski Mar 2004 A1
20040044627 Russell et al. Mar 2004 A1
20040046640 Jourdain et al. Mar 2004 A1
20040049687 Orsini et al. Mar 2004 A1
20040050924 Mletzko et al. Mar 2004 A1
20040050930 Rowe Mar 2004 A1
20040052405 Walfridsson Mar 2004 A1
20040052418 DeLean Mar 2004 A1
20040059590 Mercredi et al. Mar 2004 A1
20040059953 Purnell Mar 2004 A1
20040061787 Liu et al. Apr 2004 A1
20040104266 Bolle et al. Jun 2004 A1
20040117636 Cheng Jun 2004 A1
20040133804 Smith et al. Jul 2004 A1
20040146187 Jeng Jul 2004 A1
20040148526 Sands et al. Jul 2004 A1
20040160518 Park Aug 2004 A1
20040162870 Matsuzaki et al. Aug 2004 A1
20040162984 Freeman et al. Aug 2004 A1
20040169817 Grotehusmann et al. Sep 2004 A1
20040172541 Ando et al. Sep 2004 A1
20040174070 Voda et al. Sep 2004 A1
20040190759 Caldwell Sep 2004 A1
20040193893 Braithwaite et al. Sep 2004 A1
20040219902 Lee et al. Nov 2004 A1
20040233038 Beenau et al. Nov 2004 A1
20040240711 Hamza et al. Dec 2004 A1
20040252866 Tisse et al. Dec 2004 A1
20040255168 Murashita et al. Dec 2004 A1
20050007450 Hill et al. Jan 2005 A1
20050008200 Azuma et al. Jan 2005 A1
20050008201 Lee et al. Jan 2005 A1
20050012817 Hampapur et al. Jan 2005 A1
20050029353 Isemura et al. Feb 2005 A1
20050052566 Kato Mar 2005 A1
20050055582 Bazakos et al. Mar 2005 A1
20050063567 Saitoh et al. Mar 2005 A1
20050084137 Kim et al. Apr 2005 A1
20050084179 Hanna et al. Apr 2005 A1
20050099288 Spitz et al. May 2005 A1
20050102502 Sagen May 2005 A1
20050110610 Bazakos et al. May 2005 A1
20050125258 Yellin et al. Jun 2005 A1
20050127161 Smith et al. Jun 2005 A1
20050129286 Hekimian Jun 2005 A1
20050134796 Zelvin et al. Jun 2005 A1
20050138385 Friedli et al. Jun 2005 A1
20050138387 Lam et al. Jun 2005 A1
20050146640 Shibata Jul 2005 A1
20050151620 Neumann Jul 2005 A1
20050152583 Kondo et al. Jul 2005 A1
20050193212 Yuhara Sep 2005 A1
20050199708 Friedman Sep 2005 A1
20050206501 Farhat Sep 2005 A1
20050206502 Bernitz Sep 2005 A1
20050207614 Schonberg et al. Sep 2005 A1
20050210267 Sugano et al. Sep 2005 A1
20050210270 Rohatgi et al. Sep 2005 A1
20050210271 Chou et al. Sep 2005 A1
20050238214 Matsuda et al. Oct 2005 A1
20050240778 Saito Oct 2005 A1
20050248725 Ikoma et al. Nov 2005 A1
20050249385 Kondo et al. Nov 2005 A1
20050255840 Markham Nov 2005 A1
20060093190 Cheng et al. May 2006 A1
20060147094 Yoo Jul 2006 A1
20060165266 Hamza Jul 2006 A1
20060274919 LoIacono et al. Dec 2006 A1
20070036397 Hamza Feb 2007 A1
20070086087 Dent et al. Apr 2007 A1
20070140531 Hamza Jun 2007 A1
20070160266 Jones et al. Jul 2007 A1
20070189582 Hamza et al. Aug 2007 A1
20070206840 Jacobson Sep 2007 A1
20070211924 Hamza Sep 2007 A1
20070274570 Hamza Nov 2007 A1
20070274571 Hamza Nov 2007 A1
20070286590 Terashima Dec 2007 A1
20080005578 Shafir Jan 2008 A1
20080075441 Jelinek et al. Mar 2008 A1
20080104415 Palti-Wasserman et al. May 2008 A1
20080148030 Goffin Jun 2008 A1
20080211347 Wright et al. Sep 2008 A1
20080252412 Larsson et al. Oct 2008 A1
20090046899 Northcott et al. Feb 2009 A1
20090092283 Whillock et al. Apr 2009 A1
20090316993 Brasnett et al. Dec 2009 A1
20100033677 Jelinek Feb 2010 A1
20100034529 Jelinek Feb 2010 A1
20100142765 Hamza Jun 2010 A1
20100182440 McCloskey Jul 2010 A1
20100239119 Bazakos et al. Sep 2010 A1
Foreign Referenced Citations (188)
Number Date Country
0484076 May 1992 EP
0593386 Apr 1994 EP
0878780 Nov 1998 EP
0899680 Mar 1999 EP
0910986 Apr 1999 EP
0962894 Dec 1999 EP
1018297 Jul 2000 EP
1024463 Aug 2000 EP
1028398 Aug 2000 EP
1041506 Oct 2000 EP
1041523 Oct 2000 EP
1126403 Aug 2001 EP
1139270 Oct 2001 EP
1237117 Sep 2002 EP
1477925 Nov 2004 EP
1635307 Mar 2006 EP
2369205 May 2002 GB
2371396 Jul 2002 GB
2375913 Nov 2002 GB
2402840 Dec 2004 GB
2411980 Sep 2005 GB
9161135 Jun 1997 JP
9198545 Jul 1997 JP
9201348 Aug 1997 JP
9147233 Sep 1997 JP
9234264 Sep 1997 JP
9305765 Nov 1997 JP
9319927 Dec 1997 JP
10021392 Jan 1998 JP
10040386 Feb 1998 JP
10049728 Feb 1998 JP
10137219 May 1998 JP
10137221 May 1998 JP
10137222 May 1998 JP
10137223 May 1998 JP
10248827 Sep 1998 JP
10269183 Oct 1998 JP
11047117 Feb 1999 JP
11089820 Apr 1999 JP
11200684 Jul 1999 JP
11203478 Jul 1999 JP
11213047 Aug 1999 JP
11339037 Dec 1999 JP
2000005149 Jan 2000 JP
2000005150 Jan 2000 JP
2000011163 Jan 2000 JP
2000023946 Jan 2000 JP
2000083930 Mar 2000 JP
2000102510 Apr 2000 JP
2000102524 Apr 2000 JP
2000105830 Apr 2000 JP
2000107156 Apr 2000 JP
2000139878 May 2000 JP
2000155863 Jun 2000 JP
2000182050 Jun 2000 JP
2000185031 Jul 2000 JP
2000194972 Jul 2000 JP
2000237167 Sep 2000 JP
2000242788 Sep 2000 JP
2000259817 Sep 2000 JP
2000356059 Dec 2000 JP
2000357232 Dec 2000 JP
2001005948 Jan 2001 JP
2001067399 Mar 2001 JP
2001101429 Apr 2001 JP
2001167275 Jun 2001 JP
2001222661 Aug 2001 JP
2001292981 Oct 2001 JP
2001297177 Oct 2001 JP
2001358987 Dec 2001 JP
2002119477 Apr 2002 JP
2002133415 May 2002 JP
2002153444 May 2002 JP
2002153445 May 2002 JP
2002260071 Sep 2002 JP
2002271689 Sep 2002 JP
2002286650 Oct 2002 JP
2002312772 Oct 2002 JP
2002329204 Nov 2002 JP
2003006628 Jan 2003 JP
2003036434 Feb 2003 JP
2003108720 Apr 2003 JP
2003108983 Apr 2003 JP
2003132355 May 2003 JP
2003150942 May 2003 JP
2003153880 May 2003 JP
2003242125 Aug 2003 JP
2003271565 Sep 2003 JP
2003271940 Sep 2003 JP
2003308522 Oct 2003 JP
2003308523 Oct 2003 JP
2003317102 Nov 2003 JP
2003331265 Nov 2003 JP
2004005167 Jan 2004 JP
2004021406 Jan 2004 JP
2004030334 Jan 2004 JP
2004038305 Feb 2004 JP
2004094575 Mar 2004 JP
2004152046 May 2004 JP
2004163356 Jun 2004 JP
2004164483 Jun 2004 JP
2004171350 Jun 2004 JP
2004171602 Jun 2004 JP
2004206444 Jul 2004 JP
2004220376 Aug 2004 JP
2004261515 Sep 2004 JP
2004280221 Oct 2004 JP
2004280547 Oct 2004 JP
2004287621 Oct 2004 JP
2004315127 Nov 2004 JP
2004318248 Nov 2004 JP
2005004524 Jan 2005 JP
2005011207 Jan 2005 JP
2005025577 Jan 2005 JP
2005038257 Feb 2005 JP
2005062990 Mar 2005 JP
2005115961 Apr 2005 JP
2005148883 Jun 2005 JP
2005242677 Sep 2005 JP
WO 9717674 May 1997 WO
WO 9721188 Jun 1997 WO
WO 9802083 Jan 1998 WO
WO 9808439 Mar 1998 WO
WO 9932317 Jul 1999 WO
WO 9952422 Oct 1999 WO
WO 9965175 Dec 1999 WO
WO 0028484 May 2000 WO
WO 0029986 May 2000 WO
WO 0031677 Jun 2000 WO
WO 0036605 Jun 2000 WO
WO 0062239 Oct 2000 WO
WO 0101329 Jan 2001 WO
WO 0103100 Jan 2001 WO
WO 0128476 Apr 2001 WO
WO 0135348 May 2001 WO
WO 0135349 May 2001 WO
WO 0140982 Jun 2001 WO
WO 0163994 Aug 2001 WO
WO 0169490 Sep 2001 WO
WO 0186599 Nov 2001 WO
WO 0201451 Jan 2002 WO
WO 0219030 Mar 2002 WO
WO 0235452 May 2002 WO
WO 0235480 May 2002 WO
WO 02091735 Nov 2002 WO
WO 02095657 Nov 2002 WO
WO 03002387 Jan 2003 WO
WO 03003910 Jan 2003 WO
WO 03054777 Jul 2003 WO
WO 03077077 Sep 2003 WO
WO 2004029863 Apr 2004 WO
WO 2004042646 May 2004 WO
WO 2004055737 Jul 2004 WO
WO 2004089214 Oct 2004 WO
WO 2004097743 Nov 2004 WO
2005008567 Jan 2005 WO
WO 2005013181 Feb 2005 WO
2005024698 Mar 2005 WO
WO 2005024708 Mar 2005 WO
WO 2005024709 Mar 2005 WO
WO 2005029388 Mar 2005 WO
WO 2005062235 Jul 2005 WO
WO 2005069252 Jul 2005 WO
WO 2005093510 Oct 2005 WO
WO 2005093681 Oct 2005 WO
WO 2005096962 Oct 2005 WO
WO 2005098531 Oct 2005 WO
WO 2005104704 Nov 2005 WO
WO 2005109344 Nov 2005 WO
WO 2006012645 Feb 2006 WO
WO 2006023046 Mar 2006 WO
WO 2006051462 May 2006 WO
WO 2006063076 Jun 2006 WO
WO 2006081209 Aug 2006 WO
WO 2006081505 Aug 2006 WO
WO 2007101269 Sep 2007 WO
WO 2007101275 Sep 2007 WO
WO 2007101276 Sep 2007 WO
WO 2007103698 Sep 2007 WO
WO 2007103701 Sep 2007 WO
WO 2007103833 Sep 2007 WO
WO 2007103834 Sep 2007 WO
WO 2008016724 Feb 2008 WO
WO 2008019168 Feb 2008 WO
WO 2008019169 Feb 2008 WO
WO 2008021584 Feb 2008 WO
WO 2008031089 Mar 2008 WO
WO 2008040026 Apr 2008 WO
Non-Patent Literature Citations (92)
Entry
AOptix Technologies, “Introducing the AOptix Insight 2 Meter Iris Recognition System,” 6 pages, 2010.
Bonney et al., “Iris Pattern Extraction Using Bit Planes and Standard Deviations,” IEEE, pp. 582-586, 2004.
Camus et al., “Reliable and Fast Eye Finding in Close-up Images,” IEEE, pp. 389-394, 2002.
Cui et al., “A Fast and Robust Iris Localization Method Based on Texture Segmentation,” 8 pages, 2004.
Cui et al., “An Appearance-Based Method for Iris Detection,” 6 pages, 2004.
Cui et al., “An Iris Detection Method Based on Structure Information,” Advances in Biometric Person Authentication, International Workshop on Biometric Recognition Systems, IWBRS 2005, Beijing China, 10 pages, Oct. 22-23, 2005.
Cui et al., “An Iris Image Synthesis Method Based on PCA and Super-Resolution,” IEEE Computer Society, Proceedings of the 17th International Conference on Pattern Recognition, 6 pages, Aug. 23-26, 2004.
Cui et al., “An Iris Recognition Algorithm Using Local Extreme Points,” Biometric Authentication, First International Conference, ICBA 2004, Hong Kong, China, 10 pages, Jul. 15-17, 2004.
Daugman, “Results From 200 Billion Iris Cross-Comparisons,” University of Cambridge Computer Laboratory, Technical Report, No. 635, 8 pages, Jun. 2005.
Du et al., “A One-Dimensional Approach for Iris Identification,” 11 pages, prior to Jan. 25, 2006.
http://www.newscientisttech.com/article/dn11110-invention-covert-iris-sc, “Invention: Covert Iris Scanner,” 3 pages, printed Feb. 8, 2007.
Huang et al., “Iris Model Based on Local Orientation Description,” 5 pages, prior to Jan. 25, 2006.
Huang et al., “An Efficient Iris Recognition System,” IEEE Proceedings of the First International Conference on Machine Learning and Cybernetics, Beijing, pp. 450-454, Nov. 4-5, 2002.
Ma et al., “Personal Identification Based on Iris Texture Analysis,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 25, No. 12, pp. 1519-1533, Dec. 2003.
Masek, “Recognition of Human Iris Patterns for Biometric Identification,” 61 pages, 2003.
Sun et al., “Robust Encoding of Local Ordinal Measures: A General Framework of Iris Recognition,” 13 pages, prior to Jan. 25, 2006.
Avcibas et al., “Steganalysis Using Image Quality Metrics,” IEEE Transactions on Image Processing, vol. 12, No. 2, pp. 221-229, Feb. 2003.
Boles, “A Security System Based on Human Iris Identification Using Wavelet Transform,” IEEE First International Conference on Knowledge-Based Intelligent Electronic Systems, May 21-23, Adelaide, Australia, pp. 533-541, 1997.
Carson et al., “Blobworld: Image Segmentation Using Expectation-Maximization and Its Application to Image Querying,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, No. 8, pp. 1026-1038, Aug. 2002.
Daugman, “How Iris Recognition Works,” IEEE 2002 International Conference on Image Processing, vol. I of III, 6 pages, Sep. 22-25, 2002.
Guo et al., “A System for Automatic Iris Capturing,” Mitsubishi Electric Research Laboratories, Inc., 10 pages, 2005.
Guo, “Face, Expression, and Iris Recognition Using Learning-Based Approaches,” 132 pages, 2006.
Jalaja et al., “Texture Element Feature Characterizations for CBIR,” IEEE, pp. 733-736, 2005.
Kalka et al., “Image Quality Assessment for Iris Biometric,” Proc. of SPIE, vol. 6202, 62020D, 11 pages, 2006.
Ko et al., “Monitoring and Reporting of Fingerprint Image Quality and Match Accuracy for a Large User Application,” IEEE Computer Society, Proceedings of the 33rd Applied Imagery Pattern Recognition Workshop, 6 pages, 2004.
Lau et al., “Finding a Small Number of Regions in an Image Using Low-Level Features,” Pattern Recognition 35, pp. 2323-2339, 2002.
Maurer et al., “Tracking and Learning Graphs and Pose on Image Sequences of Faces,” IEEE Computer Society Press, International Conference on Automatic Face and Gesture Recognition, pp. 176-181, Oct. 14-16, 1996.
Oppenheim et al., “The Importance of Phase in Signals,” Proceedings of the IEEE, vol. 69, No. 5, pp. 529-541, 1981.
Ratha et al., “A Real-Time Matching System for Large Fingerprint Databases,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, No. 8, pp. 799-812, Aug. 1996.
Sony, “Network Color Camera, SNC-RZ30N (NTSC),” 6 pages, Aug. 2002.
Wang et al., “Image Quality Assessment: From Error Visibility to Structural Similarity,” IEEE Transactions on Image Processing, vol. 13, No. 4, pp. 600-612, Apr. 2004.
Wang et al., “A Universal Image Quality Index,” IEEE Signal Processing Letters, vol. 9, No. 3, pp. 81-84, Mar. 2002.
Wang et al., “Local Phase Coherence and the Perception of Blur,” Advances in Neural Information Processing Systems 16, pp. 1435-1442, 2004.
Belhumeur et al., “Eigenfaces Vs. Fisherfaces: Recognition Using Class Specific Linear Projection,” 14 pages, prior to Jun. 11, 2010.
Bentley et al., “Multidimensional Binary Search Trees Used for Associative Searching,” Communications of the ACM, vol. 18, No. 9, pp. 509-517, Sep. 1975.
Blackman et al., “Chapter 9, Multiple Sensor Tracking: Issues and Methods,” Design and Analysis of Modern Tracking Systems, Artech House, pp. 595-659, 1999.
Brasnett et al., “A Robust Visual Identifier Using the Trace Transform,” 6 pages, prior to Jun. 11, 2010.
Buades et al., “A Review of Image Denoising Algorithms, with a New One,” Multiscale Modeling & Simulation, vol. 4, No. 2, pp. 490-530, 2005.
Chen et al., “Localized Iris Image Quality Using 2-D Wavelets,” LNCS vol. 3832, pp. 373-381, 2005.
Chow et al., “Towards a System for Automatic Facial Feature Detection,” Pattern Recognition vol. 26, No. 12, pp. 1739-1755, 1993.
U.S. Appl. No. 12/792,498, filed Jun. 2, 2010.
U.S. Appl. No. 12/814,232, filed Jun. 11, 2010.
U.S. Appl. No. 12/814,272, filed Jun. 11, 2010.
Cula et al., “Bidirectional Imaging and Modeling of Skin Texture,” Proceedings of Texture 2003, 6 pages, Oct. 17, 2003.
Cula et al., “Bidirectional Imaging and Modeling of Skin Texture,” IEEE Transactions on Biomedical Engineering, vol. 51, No. 12, pp. 2148-2159, 2004.
Cula et al., “Compact Representation of Bidirectional Texture Functions,” Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition 2001, 8 pages, 2001.
Cula et al., “Skin Texture Modeling,” International Journal of Computer Vision 2004, 34 pages, 2004.
Dabov et al., “Image Denoising by Sparse 3-D Transform-Domain Collaborative Filtering,” IEEE Transactions on Image Processing, vol. 16, No. 8, pp. 2080-2095, Aug. 2007.
Dabov et al., “Image Restoration by Sparse 3D Transform Collaborative Filtering,” SPIE vol. 6812 681207-1, 12 pages, 2008.
Daugman, “High Confidence Visual Recognition of Persons by a Test of Statistical Independence,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 15, No. 11, pp. 1148-1161, 1993.
Daugman, “Probing the Uniqueness and Randomness of Iris Codes: Results from 200 Billion Iris Pair Comparisons,” Proceedings of the IEEE vol. 94, No. 11, pp. 1928-1935, Nov. 2006.
Fooprateepsiri et al., “A Highly Robust Method for Face Authentication,” IEEE 2009 First Asian Conference on Intelligent Information and Database Systems, pp. 380-385, 2009.
Fooprateepsiri et al., “Face Verification Base-On Hausdorff-Shape Context,” IEEE 2009 Asia Conference on Informatics in Control, Automation and Robotics, pp. 240-244, 2009.
Forstner et al., “A Metric for Covariance Matrices,” 16 pages, prior to Jun. 11, 2010.
Gan et al., “Applications of Wavelet Packets Decomposition in Iris Recognition,” LNCS vol. 3832, pp. 443-449, 2005.
Hampapur et al., “Smart Surveillance: Applications, Technologies and Implications,” IEEE, 6 pages, Dec. 15-18, 2003.
Hamza et al., “Standoff Iris Recognition Using Non-Iterative Polar Based Segmentation,” Proceedings of SPIE vol. 6944, 8 pages, 2008.
Hanna et al., “A System for Non-Intrusive Human Iris Acquisition and Identification,” IAPR Workshop on Machine Vision Applications, pp. 200-203, Nov. 12-14, 1996.
http://en.wikipedia.org/wiki/Radon_transform, “Radon Transform,” 5 pages, printed May 14, 2010.
Ivins et al., “A Deformable Model of the Human Iris for Measuring Small Three-Dimensional Eye Movements,” Machine Vision and Applications, vol. 11, pp. 42-51, 1998.
Kadyrov et al., “The Trace Transform and Its Applications,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, No. 8, pp. 811-828, Aug. 2001.
Kadyrov et al., “The Trace Transform as a Tool to Invariant Feature Construction,” 3 pages, prior to Jun. 11, 2010.
Kang et al., “Improved Dual Action Contour for Iris Recognition,” 10 pages, prior to Jun. 11, 2010.
Kawaguchi et al., “Detection of Eyes from Human Faces by Hough Transform and Separability Filter,” IEEE, 4 pages, 2000.
Kong et al., “Detecting Eyelash and Reflection for Accurate Iris Segmentation,” International Journal of Pattern Recognition and Artificial Intelligence, vol. 17, No. 6, pp. 1025-1034, 2003.
Li et al., “Appearance Modeling Using a Geometric Transform,” IEEE Transactions on Image Processing, 17 pages, 2008.
Li et al., “Appearance Modeling Using a Geometric Transform,” Journal Preparation for IEEE Transactions on Image Processing, 30 pages, Nov. 5, 2006.
Ma et al., “Local Intensity Variation Analysis for Iris Recognition,” Pattern Recognition, vol. 37, pp. 1287-1298, 2004.
Ma et al., “Video Sequence Querying Using Clustering of Objects' Appearance Models,” Advances in Visual Computing Third Annual Symposium, ISVC 2007, 14 pages, 2007.
Monro et al., “DCT-Based Iris Recognition,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, No. 4, Apr. 2007.
Noh et al., “A Novel Method to Extract Features for Iris Recognition System,” AVBPA 2003, LNCS 2688, pp. 862-868, 2003.
Ojala et al., “Multiresolution Gray-Scale and Rotation Invariant Texture Classification with Local Binary Patterns,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, No. 7, 18 pages, Jul. 2002.
Pamudurthy et al., “Dynamic Approach for Face Recognition Using Digital Image Skin Correlation,” Audio and Video Based Person Authentication 5th International Conference, AVBPA 2005, Hilton Rye Town, NY, USA, 11 pages, Jul. 20-22, 2005.
Petrou et al., “The Trace Transform in a Nutshell,” 9 pages, prior to Jun. 11, 2010.
Phillips et al., “FRVT 2006 and ICE 2006 Large-Scale Results,” 56 pages, Mar. 2007.
Porikli et al., “Covariance Tracking Using Model Update Based on Means on Riemannian Manifolds,” 8 pages, prior to Jun. 11, 2010.
Proenca et al., “Toward Noncooperative Iris Recognition: A Classification Approach Using Multiple Signatures,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, No. 4, pp. 607-612, Apr. 2007.
Ross et al., “Segmenting Non-Ideal Irises Using Geodesic Active Contours,” IEEE 2006 Biometrics Symposium, 3 pages, 2006.
Shapiro et al., “Computer Vision,” Prentice Hall, pp. 556-559, prior to Jun. 11, 2010.
Stillman et al., “A System for Tracking and Recognizing Multiple People with Multiple Cameras,” 6 pages, Aug. 1998.
Sun et al., “Iris Recognition Based on Non-local Comparisons,” Sinobiometrics 2004, LNCS 3338, pp. 67-77, 2004.
Suzaki et al., “A Horse Identification System Using Biometrics,” Systems and Computer in Japan, vol. 32, No. 14, pp. 12-23, 2001.
Trucco et al., “Robust Iris Location in Close-up Images of the Eye,” Pattern Anal. Applic. vol. 8, pp. 247-255, 2005.
Turan et al., “Trace Transform Based Invariant Object Recognition System,” 4 pages, prior to Jun. 11, 2010.
Turk et al., “Eigenfaces for Recognition,” Journal of Cognitive Neuroscience, vol. 3, No. 1, 16 pages, 1991.
Wang et al., “Recent Developments in Human Motion Analysis,” Pattern Recognition, vol. 36, pp. 585-601, 2003.
Wei et al., “Robust and Fast Assessment of Iris Image Quality,” LNCS vol. 3832, pp. 464-471, 2005.
Zhao et al., “Dynamic Texture Recognition Using Local Binary Patterns with an Application to Facial Expressions,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, No. 6, pp. 915-928, Jun. 2007.
Zhi-Hui et al., “Research Iris Serial Images Quality Assessment Method Based on HVS,” Proceedings of SPIE, vol. 6034, 6 pages, 2006.
U.S. Appl. No. 13/077,821, filed Mar. 30, 2011.
Freeboy, “Adaptive Optics Speeds Up Airport Immigration,” Optics.org/ole, 2 pages, Jan. 2009.
http://www.imagine-eyes.com/content/view/100/115/, “INOVEO—Ultra-High Resolution Retinal Imaging with Adaptive Optics,” 2 pages, printed Feb. 22, 2010.
Related Publications (1)
Number Date Country
20080075334 A1 Mar 2008 US
Provisional Applications (3)
Number Date Country
60778770 Mar 2006 US
60807046 Jul 2006 US
60647270 Jan 2005 US
Continuation in Parts (15)
Number Date Country
Parent 11275703 Jan 2006 US
Child 11681752 US
Parent 11043366 Jan 2005 US
Child 11275703 US
Parent 11372854 Mar 2006 US
Child 11043366 US
Parent 10979129 Nov 2004 US
Child 11372854 US
Parent 10655124 Sep 2003 US
Child 10979129 US
Parent 11681752 US
Child 10979129 US
Parent 11382373 May 2006 US
Child 11681752 US
Parent 11672108 Feb 2007 US
Child 11382373 US
Parent 11675424 Feb 2007 US
Child 11672108 US
Parent 11681614 Mar 2007 US
Child 11675424 US
Parent 11681662 Mar 2007 US
Child 11681614 US
Parent 11681470 Mar 2007 US
Child 11681662 US
Parent 11681505 Mar 2007 US
Child 11681470 US
Parent 11681251 Mar 2007 US
Child 11681505 US
Parent 11681751 Mar 2007 US
Child 11681251 US