Iris recognition system having image quality metrics

Information

  • Patent Grant
  • Patent Number
    8,050,463
  • Date Filed
    Friday, March 2, 2007
  • Date Issued
    Tuesday, November 1, 2011
Abstract
An iris recognition system implementing image quality metrics to assess the quality of an acquired eye image for reliable operation. Images with low image quality may be rejected or flagged based upon the application. The image quality may be determined with a preprocessing module in the recognition system. The processing may be configured based on a quality assessment.
Description
BACKGROUND

The present invention pertains to recognition systems and particularly to biometric recognition systems. More particularly, the invention pertains to iris recognition systems.


Related applications may include U.S. patent application Ser. No. 10/979,129, filed Nov. 3, 2004, which is a continuation-in-part of U.S. patent application Ser. No. 10/655,124, filed Sep. 5, 2003; and U.S. patent application Ser. No. 11/382,373, filed May 9, 2006, which are hereby incorporated by reference.


U.S. Provisional Application No. 60/778,770, filed Mar. 3, 2006, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/275,703, filed Jan. 25, 2006, is hereby incorporated by reference.


U.S. Provisional Application No. 60/647,270, filed Jan. 26, 2005, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/043,366, filed Jan. 26, 2005, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/372,854, filed Mar. 10, 2006, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/672,108, filed Feb. 7, 2007, is hereby incorporated by reference.


U.S. patent application Ser. No. 11/675,424, filed Feb. 15, 2007, is hereby incorporated by reference.


SUMMARY

The present invention is an iris recognition system implementing image quality metrics to assess the quality of the acquired eye image for reliable operation. Images with low image quality may be rejected or flagged based upon the application.





BRIEF DESCRIPTION OF THE DRAWING


FIG. 1 is a diagram of an overall iris recognition system incorporating a preprocessing module for image quality metrics;



FIG. 2 is a diagram of the preprocessing module for image quality metrics;



FIGS. 3a, 3b and 3c show an ordinary eye image, a blurred eye image and a restored blurred eye image, respectively; and



FIG. 4 is a diagram of an arrangement for measuring an iris image discrepancy.





DESCRIPTION

The present system may relate to biometrics, iris recognition systems, image quality metrics, authentication, access control, monitoring, identification, and security and surveillance systems. The present system specifically addresses a preprocessing procedure that may be included prior to executing the iris recognition techniques.


An overall eye detection system is shown in FIG. 1. It shows a camera 61 that may provide an image with a face in it to the eyefinder 62 as noted herein. The eyefinder 62 may provide an image of one or two eyes that goes to a preprocessing module 60 for iris image evaluation and possible rehabilitation if needed. If the iris image does not meet a set of quality metrics and cannot be rehabilitated, then the eye image is rejected and a new eye image capture may be sought by eyefinder 62. In another embodiment, if the iris image does not meet a set of quality metrics, then the eye image may be flagged and the processing thereafter configured based upon the image quality assessment. If the iris image satisfies the set of quality metrics, either as captured or as rehabilitated, then the image may be forwarded to the iris segmentation block 63. A one dimensional polar segmentation (1D POSE) system in block 63 may be used to perform the segmentation. POSE may be based on the assumption that the image (e.g., 320×240 pixels) has a visible pupil and that the iris may be only partially visible. POSE can still operate on a pupil that is not fully visible, where small portions of the pupil are obscured by the eyelids. There may be pupil segmentation at the inner border between the iris and pupil, and segmentation at the outer border between the iris and the sclera and between the iris and eyelids. An output having a segmented image may go to a block 64 for mapping/normalization and feature extraction. An output from block 64 may go to an encoding block 65 which may provide an output, such as a barcode of the images, put in terms of ones and zeros. The coding of the images may provide a basis for storage in block 66 of the eye information, which may be used for enrolling, indexing, matching, and so on, at block 67, of the eye information, such as that of the iris and pupil, related to the eye.
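By way of illustration only, the flow of FIG. 1 might be sketched as follows; every function name here is a hypothetical placeholder for the corresponding block, not an interface defined by the present system.

```python
from typing import Callable, Optional
import numpy as np

def iris_pipeline(capture: Callable[[], np.ndarray],
                  find_eye: Callable[[np.ndarray], np.ndarray],
                  assess: Callable[[np.ndarray], tuple],
                  rehabilitate: Callable[[np.ndarray], np.ndarray],
                  recognize: Callable[[np.ndarray], Optional[str]],
                  max_attempts: int = 3) -> Optional[str]:
    """Camera 61 -> eyefinder 62 -> quality module 60 -> blocks 63-67."""
    for _ in range(max_attempts):
        eye = find_eye(capture())              # eyefinder 62
        ok, repairable = assess(eye)           # preprocessing module 60
        if not ok and repairable:
            eye, ok = rehabilitate(eye), True  # conditioning path
        if ok:
            return recognize(eye)              # segmentation through matching
    return None                                # reject; no acceptable capture

# Toy usage with stand-in callables (all hypothetical):
print(iris_pipeline(
    capture=lambda: np.zeros((240, 320)),
    find_eye=lambda img: img,
    assess=lambda eye: (True, False),
    rehabilitate=lambda eye: eye,
    recognize=lambda eye: "subject-001",
))
```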


The present system may assess the quality of an eye image in real-time as a quality control procedure. This approach may allow poor image acquisition to be corrected through recapture, and facilitate the acquisition of the best possible image within the capture time window configured in the system. This acquisition may result in a process that provides more good-quality iris images, which can improve the iris identification accuracy and the integrity of iris recognition systems.


An objective of the present invention is to define rules to assess iris image quality and to use these rules as discriminators for screening out poor-quality iris images or reconfiguring the processing steps based upon the image quality assessment. With a person in the loop, it may be somewhat straightforward to assess the quality of the eye image using subjective evaluation. In practice, however, subjective evaluation may lead to errors and thus tends to be impractical in view of presently developed automated iris recognition systems. In addition, what is perceived as good quality by the human eye does not necessarily ensure reliable recognition by the present processes. Thus, the image quality may be assessed based upon specific criteria critical to successful iris recognition processing. As with fingerprint biometrics, iris recognition systems may have widely varying matching performance factors which depend heavily on eye image quality. The iris pattern and eye pose may have a direct effect on matcher accuracy. Therefore, operational recognition systems may require effective iris image quality metrics for image assessment even as the iris pattern is analyzed.


An automated iris recognition system may have major components which include iris localization, iris map feature extraction, encoding, and enroll/matching. In image acquisition, a digital image capturing the eye may be obtained at multiple resolutions and eye orientations and transitions, under varying illumination, and in a noise-laden environment. The feature extraction process may capture the unique texture of the iris pattern, and the encoder may encode the information into an iris barcode to expedite a matching process. The matching may involve computing the number of bits matched in the iris barcode against multiple templates of barcodes in a database. The performance of such a system may depend heavily on the various stages of the iris recognition processes, and in turn each of these processes may depend on the quality of the captured iris image. An objective image quality metric can play a variety of roles in each of the iris processing stages. Many artifacts may affect one or more of these processes.


A perfectly captured iris pattern under ideal conditions may clearly illustrate the texture of an iris that can be captured in a unique iris barcode. However, many factors, such as eye closure, obscuration, an off-angle eye, occlusions, imperfect acquisition embedded in electronic noise, non-uniform illumination, different sensor wavelength sensitivity, pupil dilation, and specular light reflection, may cause the captured iris map to be far from ideal quality. Smearing, blurring, defocus (where the corresponding iris textures are at different depths in the acquisition scene) and poor resolution may result in the capture of very poor quality images, as well as have a negative impact on iris segmentation and/or feature extraction and encoding.


Here, one may define a common framework to assess the quality of an image, develop quantitative measures that can objectively and automatically assess the quality or condition of the iris image before being processed for iris recognition, and preprocess the image for quality improvement.


Digital eye images may be subject to a wide variety of distortions during acquisition, transmission and reproduction, any of which may result in degradation of iris recognition performance. To counter such vulnerability, the present system may have quantitative measures that can automatically assess the quality of iris images before they are processed for iris recognition. The present system may include apparatus and approaches for implementation of an appropriate set of quantitative iris image quality metrics (IIQMs). The IIQMs may be defined relative to image features based on acquisition performance, and the quality of the image should correlate well with subjective iris processes. The IIQMs may be integrated into the preprocessing procedure to assess the quality of the iris image before the iris recognition process is initiated. Based upon an evaluation with these metrics, one may accept the input image, reconfigure the processing to deal with degradations, or request a new capture of the iris.


One may note various iris image quality metrics. Metrics to support automatic iris quality measurement may include eyelash/eyelid occlusion, pupil dilation, illumination, SNR, motion blur, optical defocusing, sensor noise, specular reflection, pixel count, iris texture sharpness, and so on.


There may be an interest in the modeling of image sharpness for the purpose of improving the performance of image analysis. Image quality metrics appear to be a reliable general purpose tool for iris image assessment before running an iris recognition process. To that end, a set of criteria may be defined for use with iris image quality metrics. A first criterion involves blur, which may be measured using high frequency distortions from coarse to fine wavelet coefficients, or by XOR-ing the resulting codes of two patches of the same iris to measure the discrepancy among the bits. Blur may be related to defocus. A second criterion involves defocus, which may be assessed by measuring high frequency content within the iris map. A third criterion involves eye closure, which may be assessed using the iris inner border profile. A fourth criterion involves iris obscuration, which may be assessed by computing the integral of the area between the eyelid curve and the iris inner boundary. A fifth criterion involves off-angle eye (i.e., gazed eye) detection, which may be assessed in the iris outer boundary shape fitting. A sixth criterion involves reflection, which may be assessed using iris curve fitting and high contrast thresholding. A seventh criterion involves excessive pupil dilation, which may be determined by evaluating the limits of the pupil edge detection.



FIG. 2 is a diagram of the preprocessing module 60 of FIG. 1. An iris image may enter module 60 and go to measurement modules such as blur module 31, defocus module 32, closure module 33, obscuration module 34, off-angle detection module 35, reflection module 36 and excessive dilation module 37. The measurement outputs of these modules may go to an evaluator 38 which may determine, according to the received measurements, whether the iris image 12 is acceptable as an output 41 for further processing such as segmentation. If the image 12 is not acceptable, it may be rejected, or it may be deemed to have discrepancies that are reparable. If the latter is the case, then the image 12 may go to a rehabilitator 39 for conditioning to make the image 12 acceptable as an output 41 for further processing, or some of the processing may be reconfigured to deal with the inherited degradations.
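A minimal sketch of this gate, assuming hypothetical metric callables standing in for modules 31-37 and illustrative thresholds (none of which are specified by the patent), might look like:

```python
import numpy as np

def evaluate(eye: np.ndarray, metrics: dict, limits: dict):
    """Return (decision, scores) in the spirit of evaluator 38."""
    scores = {name: fn(eye) for name, fn in metrics.items()}
    failures = [n for n, s in scores.items() if s > limits[n]]
    if not failures:
        return "accept", scores
    # Assume blur/defocus are the reparable discrepancies (rehabilitator 39);
    # this split is an illustrative assumption.
    if all(f in ("blur", "defocus") for f in failures):
        return "rehabilitate", scores
    return "reject", scores

eye = np.random.rand(240, 320)
metrics = {
    "blur": lambda im: 1.0 - np.var(np.diff(im, axis=1)),  # crude stand-in
    "obscuration": lambda im: 0.1,                         # stub for module 34
}
limits = {"blur": 0.99, "obscuration": 0.3}
print(evaluate(eye, metrics, limits))
```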


Objective image quality metrics may be classified according to the availability of a non-affected image with which the distorted image is to be compared. One may note that iris image 12 quality enhancement may include pixel processing, contrast balancing, histogram equalization, image restoration, image blind deblurring, adaptive filtering for iris texture restoration, and pose normalization.



FIGS. 3a, 3b and 3c show an example of conditioning of an iris image 12 by rehabilitator 39. FIG. 3a is a good image 12 of the eye or iris. FIG. 3b is a blurred image 12. FIG. 3c is a conditioned or rehabilitated image 12 of the eye or iris. Various kinds of techniques may be used for rehabilitative processing. Knowing a cause of the blurring may provide a basis for taking the blur out of the image 12, perhaps by reversing the cause of the blurring via processing, to result in a good image for further processing such as segmentation.


Blurring may be one of the most common forms of image distortion, and it can dramatically affect the performance of iris recognition. Experience may show that the effect of blurring is most apparent on the iris map feature extraction and encoding of the iris. The iris segmentation procedure may often be unaffected, due to the existence of sufficient contrast between the iris and the sclera or pupil, which still permits a segmentation of the iris. The blurring phenomenon may be explained as a reduction in energy at high frequencies of the spectral domain of the image. Blurring of the iris image may occur in many different forms. Optical defocus, subject motion, and camera displacement for zooming may introduce blur distortions, which are a direct result of some technical limitation during eye acquisition.


Relative to motion blur and smearing effects, one may base a solution on high frequency distortions among the coarse to fine wavelet coefficients, detecting blur by comparing the linear frequency distortion filter outputs at multiple stages of a dyadic decomposition to measure the discrepancy among the stages. An objective may be to automate these detection procedures, as blurring has been proven to affect the iris matching performance. Detection of blur requires some modeling of what constitutes a blurred image and an unaffected image. A hypothesis may be that image smearing leaves statistical evidence which can be exploited for detection with the aid of image quality high frequency features and multivariate regression analysis.


In another approach, instead of assessing the iris texture high frequency components, one might assess the resulting iris code directly by taking two different localized patches and XOR-ing them to measure discrepancies between the corresponding bits of the two patches. Cross-matching with few discrepancies should indicate blurring effects, and vice versa. Other standard quality measures may be used to measure the similarity between the two patches; the more blurred the iris map is, the more similar the localized patches are. One might consider measuring the similarity of the two patches by measuring the MSE between the patches' intensities, the correlation of the two patches, statistical similarity, contrast difference, or the peak signal to noise ratio between the two patches. Letting L(x,y) and R(x,y) represent the first and second patches, one may formulate these metrics as follows.







MSE measure:

q0 = (1/N(R)) Σx,y ( R(x,y) − L(x,y) )²








where N(R) is the number of pixels within each patch.


Correlation measure:








q1 = σLR / (σL σR);

where

σLR = (1/(N(R) − 1)) Σx,y ( R(x,y) − R̄ ) ( L(x,y) − L̄ )









Statistical similarity measure:

q2 = ( L̄ R̄ ) / ( (L̄)² + (R̄)² );





where L̄ and R̄ are the average values of the image intensities within the patches. Finally, one may measure the contrast similarity using the following metric:







q4 = 2 σL σR / ( σL² + σR² )
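The four indices above might be computed as in the following sketch, a straightforward NumPy transcription of the equations; the equal patch size and the variable names are the only assumptions.

```python
import numpy as np

def patch_similarity(L: np.ndarray, R: np.ndarray) -> dict:
    """q0, q1, q2 and q4 for two equal-sized iris patches."""
    L, R = L.astype(float), R.astype(float)
    n = R.size                                    # N(R), pixels per patch
    q0 = np.sum((R - L) ** 2) / n                 # MSE measure
    Lm, Rm = L.mean(), R.mean()
    s_lr = np.sum((R - Rm) * (L - Lm)) / (n - 1)  # covariance sigma_LR
    sL, sR = L.std(ddof=1), R.std(ddof=1)
    q1 = s_lr / (sL * sR)                         # correlation measure
    q2 = (Lm * Rm) / (Lm ** 2 + Rm ** 2)          # statistical similarity
    q4 = 2 * sL * sR / (sL ** 2 + sR ** 2)        # contrast similarity
    return {"q0": q0, "q1": q1, "q2": q2, "q4": q4}

left = np.random.rand(32, 32)
right = left + 0.05 * np.random.rand(32, 32)      # nearly identical patches
print(patch_similarity(left, right))              # high q1, q2, q4; low q0
```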







High frequency and blurring metrics may be noted. It may be shown that even a small amount of motion blur significantly degrades performance, independent of whether images were captured from an off-angle or frontal pose.


The present system may provide an approach for quantifying the blurring effect on the iris map, based on an observation that blur induces a distortion of local frequency components in the iris patterns, in terms of amplitude and phase, which leads to a high-frequency energy loss. The present solution may be based on high frequency distortions among the coarse and fine wavelet coefficients. This approach may be used to detect blur by comparing the linear frequency distortion filter outputs at multiple stages of a dyadic decomposition, measuring the discrepancy among the scales and its impact on the phase. One may note that any effect on the amplitude should not have any impact if only phasor information is used to encode the iris. If no high frequency distortion measure is reported, then the iris image has already gone through a blurring degradation effect. On the other hand, if a discrepancy measure is significant, then this implies a distortion has occurred and that the original signal contained all its iris high frequency components with no blurring effects.


Multi-resolution analysis may provide a convenient way to represent localized signal features such as iris texture patterns, because it is widely recognized as an effective way to present the localized information in a signal in both the spatial and frequency domains. It is for these reasons that one may deploy wavelet decomposition as the framework for a solution presented herein. Wavelet decomposition may be better suited than the Fourier transform because of the varying nature of frequency components in an iris image. In the present approach, the behavior of high frequency components at different scales in the vicinity of iris pattern features may be explored to measure the blurring amount in an image. The present approach may be based on the fact that when an image is blurred through convolution with a symmetric linear filter, the low frequency information in the Fourier domain does not necessarily change. However, the local high frequency phasor and amplitude information may be affected by the filtering mechanism. Since the present encoder may be based upon the phase information, any blurring will directly impact the encoded iris-code.



FIG. 4 shows a diagram of the module 31 for providing measurements of blur. An iris image 12 signal may enter a low pass filter 13 and go on to a level 1 structure measure λ1 module 16. An output of filter 13 may go to a low pass filter 14 and go on to a level 2 structure measure λ2 module 17. An output of filter 14 may go to a low pass filter 15 and go to a level n structure measure λn module 18. There may be a number of low pass filters situated between filter 14 and filter 15, and a like number of structure measure λ modules situated between the structure measure λ2 module 17 and the structure measure λn module 18, with similar connection arrangements. The cutoff frequency of each low pass filter may be set to investigate specific bands of frequencies. The outputs w1 21, w2 22, . . . and wn 23 from the respective structure measure λ modules may go to a regression combination module 24. The output of module 24 may be a measure of blur or discrepancy in image 12. The particular measure of discrepancy among the coarse and fine wavelet coefficients of the iris map may be indicated by the following equation.

λn = Dscr( IWn(x,y), IW(n−1)(x,y) )  (1)

The distortion (discrepancy) measure of the quality of the iris image may be measured on the basis of the structural difference among these coarse-fine wavelet coefficients of the iris image or, in other words, the structural difference between the observed image at each scale and its filtered version.


The inner product of the unbiased variances (i.e., the covariance) with respect to the product of the variances may be used to quantify the structural similarity. In order to avoid instability, in case either variance is null, the quantity may be modified to











λn = min{ 2 σn(n−1) / (σn σ(n−1)) − 1,  2 (1 − σn(n−1)) (1 − σn σ(n−1)) − 1 },  (2)








where σn is the variance at scale n, σ(n−1) is the variance at scale (n−1), and the covariance term may be defined as










σn(n−1) = (1/MN) Σi,j ( IWn(xi, yj) − μn ) ( IW(n−1)(xi, yj) − μ(n−1) ).  (3)








Incidentally, one may note, regarding the blur image quality assessment, that a structure comparison may be conducted on the statistical normalization of the specified iris image ROIs, and thus equation (1) may imply equation (2).


The elements in the finer level may be compared prior to a decimation operation for dimensional consistency. These local statistics may be computed within a kernel L×L square window, which is convolved across the predefined regions of interest (ROIs) that represent an iris map. The width of the kernel L may be chosen to represent a typical iris texture size. At each scale, the local statistics and the distortion measured within the local window may be computed. At each level, one may require a single overall quality measure of the discrepancy. One may utilize an expected average value of the measure, using a mean or median, to evaluate the distortion measure. In other approaches here, one may include additional classical metrics to compare the statistical difference between a coarse scale image and its filtered image at the finer scale. This is possible since the regression analysis may depict the most contributing indices resulting in the final decision. In addition, it is recommended that the choice of ROIs be limited to only areas that exhibit iris textures. Multiple ROIs may be treated separately, to be weighted appropriately within the regression analysis. One may identify each of the iris areas to be at the inner borders of the iris.
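As an illustrative sketch of this per-scale computation, assuming a Gaussian filter as the low-pass stand-in and computing the statistics over a whole ROI rather than a sliding L×L window:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def lambda_measures(roi: np.ndarray, n_scales: int = 3) -> list:
    """Structure measures across a dyadic low-pass cascade, mirroring
    equations (2) and (3) as reconstructed above."""
    levels = [roi.astype(float)]
    for _ in range(n_scales):
        levels.append(gaussian_filter(levels[-1], sigma=1.0))
    lams = []
    for n in range(1, len(levels)):
        a, b = levels[n], levels[n - 1]
        cov = np.mean((a - a.mean()) * (b - b.mean()))    # eq. (3)
        sa, sb = a.std(), b.std()
        lam = min(2 * cov / (sa * sb) - 1,                # eq. (2)
                  2 * (1 - cov) * (1 - sa * sb) - 1)
        lams.append(lam)
    return lams

print(lambda_measures(np.random.rand(64, 64)))
```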


In addition, one may combine the outcome of different scales using a multivariate regression approach on the selected quality metrics of multiple scales trained based on some predefined samples of captured irises.


One may then adopt a regression approach to combine the quality indices into a final decision measure. The present metric indices may already be scaled in equation (2) to vary between −1 and 1; thus, one may define the weighting vector based upon the LS solution being ω⃗ = D⁺ν⃗, where D⁺ = (DᵀD)⁻¹Dᵀ is the pseudo inverse of the matrix of quality index elements, per each iris sample and per each quality index. The vector ν⃗ may be the resulting indices for the trained set.
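A minimal numerical sketch of this least-squares weighting, with made-up training data standing in for the quality-index matrix D and the trained targets ν:

```python
import numpy as np

D = np.random.uniform(-1, 1, size=(50, 3))  # 50 iris samples x 3 indices
v = np.random.uniform(-1, 1, size=50)       # trained decision per sample
w = np.linalg.pinv(D) @ v                   # w = D+ v, D+ = (D^T D)^-1 D^T
combined = D @ w                            # final decision measure per sample
print(w)
```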


Testing appears to indicate that the present approach is able with reasonable accuracy to distinguish between blurred images and non-affected images.


In a different embodiment, one might decompose the two localized patches (i.e., iris regions at the left and right iris-sclera borders) using the same wavelet concept and compare the coefficients of the two decompositions at all levels. Regression combination may then be applied to the output of these structure measures, similar to the above example, to measure the discrepancy among the two patches and not among the levels. No low pass filters are needed in this decomposition.


In a different example, instead of assessing the iris texture, one might assess the iris code directly using the localized patches and XOR them to measure a discrepancy among the bits, which may be aided with the following equation.

mb = Σ [ M(R) ] XOR [ M(R + Δφ) ] ≦ ηg
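A sketch of this bit-level comparison, with the patches given as 0/1 arrays; the threshold ηg is application-dependent and the example values are made up:

```python
import numpy as np

def code_discrepancy(bits_a: np.ndarray, bits_b: np.ndarray) -> float:
    """Fraction of disagreeing bits between two localized iris-code
    patches; a low value suggests blur, per the discussion above."""
    return float(np.mean(bits_a ^ bits_b))

a = np.random.randint(0, 2, 256)
b = a.copy()
b[:16] ^= 1                        # flip a few bits
print(code_discrepancy(a, b))      # 16/256 = 0.0625; compare with eta_g
```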


Motion blur and smearing effects may be related to defocusing. Defocus is isotropic in nature, as pixels of an image may be smeared in all directions, and may be measured as a blurring effect. Defocus may be assessed by measuring high frequency content within the iris map after using a median filter to eliminate the salt/pepper type of noise. A local phase technique may be noted. The present approach may include the XOR equation provided herein.
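One possible sketch of such a defocus measure, assuming a median filter for the salt/pepper noise and a Laplacian as a stand-in high frequency measure:

```python
import numpy as np
from scipy.ndimage import median_filter, laplace

def defocus_score(iris_map: np.ndarray) -> float:
    """High frequency content after denoising; lower values would
    suggest stronger defocus under this stand-in measure."""
    denoised = median_filter(iris_map.astype(float), size=3)
    return float(np.var(laplace(denoised)))

print(defocus_score(np.random.rand(128, 128)))
```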


Eye closure and exposure of the iris map may primarily affect the segmentation modeling, as the segmentation is expected to extract the pupil in its entirety to enable an analysis of the iris textures surrounding it. If the eye is not open enough, or some of the pupil region is not visible, such a condition may affect the localization of the iris edges or change some of the algorithm's modeling assumptions.


Eye closure may be assessed using the iris inner border profile. Several parameters may be formulated to evaluate the estimated border profile, including a fitness parameter to measure how far the detected curve is from an elliptic-like shape, and a parameter defined to measure how much eye closure there is.


Eye closure may be assessed using the pupil profile. Several parameters may be formulated to evaluate the estimated pupil profile. One may incorporate the following formula.







η1 = (1/N) ΣBlob u( ‖F(x,y) − f(x,y)‖ / ‖F(x,y) − Fc(x,y)‖ − ε ),  ∀x, y









In the above equation, the curve f(x,y) represents the boundary of the blob, F(x,y) is the border curve of the estimated fitting shape, and Fc(x,y) is the moment center of the model shape. N in the above equation represents the length of the curve f(x,y). The operator u( ) is the step function, and ε<<1 is a tolerance factor.


Another parameter measures the proportion of the blob within the estimated model curve. A fitting metric may basically be the ratio of the estimated shape surface coverage, i.e., the intersection of the surface of the model and the blob, over the blob surface.








η2 = Surface( blob ∩ F(x,y) ) / Sblob,





where Sblob is the surface of the blob. Iris quality metrics may include iris criteria. Eye closure may be assessed using the pupil profile; the parameters may be formulated to evaluate the estimated pupil profile with the boundary elliptic profile and the coverage-of-pupil parameter as noted herein.
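A sketch of the η2 coverage ratio, with the blob and the fitted model shape given as boolean masks; the circle geometry below is made up for illustration:

```python
import numpy as np

def fit_coverage(blob_mask: np.ndarray, model_mask: np.ndarray) -> float:
    """eta_2: portion of the detected blob covered by the model shape."""
    blob_area = blob_mask.sum()
    if blob_area == 0:
        return 0.0
    return float(np.logical_and(blob_mask, model_mask).sum() / blob_area)

yy, xx = np.mgrid[:100, :100]
blob = (xx - 50) ** 2 + (yy - 50) ** 2 < 20 ** 2    # detected pupil blob
model = (xx - 52) ** 2 + (yy - 50) ** 2 < 20 ** 2   # fitted circle, offset
print(fit_coverage(blob, model))                    # close to 1.0
```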


Obscuration and occlusions, due to the presence of long dense eyelashes or normal closure of eyelids, may dramatically affect the segmentation and the encoding scheme of the recognition system. Iris obscuration may be assessed by computing the integral of the area between the eyelid curve and the iris inner boundary. An eyelid-eyelash obscuration assessment may assume that the eye is open enough, with most of the pupil being visible, but still with the eyelids or eyelashes obscuring the iris texture. The assessment of this criterion may be inherent in the present POSE segmentation technique, which provides a way to detect simultaneously the edges of the iris and the eyelids and/or eyelashes. One may assess or measure iris obscuration by computing the integral of the area under the eyelash/lid detected curve and the inner iris or pupil boundary with the following equation.







mo = ΣΘ1→Θ2 ( r(θ) − rp(θ) ) ∂θ ≧ ηo
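Discretely, this area integral might be sketched as follows; the angular span and radii below are made-up stand-ins for the detected curves:

```python
import numpy as np

def obscuration_measure(r_lid: np.ndarray, r_pupil: np.ndarray,
                        theta: np.ndarray) -> float:
    """Area between the eyelid/eyelash curve r(theta) and the inner
    (pupil) boundary r_p(theta), as a discrete sum over theta."""
    return float(np.sum((r_lid - r_pupil) * np.gradient(theta)))

theta = np.linspace(np.pi / 4, 3 * np.pi / 4, 90)  # assumed span Theta1..Theta2
r_p = np.full_like(theta, 30.0)                    # pupil boundary radius
r_l = 30.0 + 25.0 * np.sin(theta)                  # eyelid curve stand-in
print(obscuration_measure(r_l, r_p, theta))        # compare against eta_o
```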






Off-angle viewing and eye gazing may be significant concerns in an iris recognition system. Eye gazing is not necessarily considered a separate issue in the present system, since off-angle eye acquisition can be treated like other eye acquisition here. An off-angle eye, not looking forward or directly at an acquisition system, may be problematic for some related iris detection mechanisms. Off-angle (gazed) eyes may be assessed in the iris outer boundary shape fitting.


Although one may design the present iris recognition processes to also handle off-angle eyes, one may want to make an assessment of this IIQM so that special treatment is devoted to the image analysis. The present approach used to assess off-angle (gazed) eyes may be to measure the shape fitness of the outer boundary of the iris to a circular shape. Here, the following equation may be noted.







mg = (1/(π R²)) ΣΘR u( (IR(θ) − R) − ϑ ) ≧ ηg
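A sketch of this circularity fitness, counting boundary samples whose radial deviation from the fitted radius R exceeds a tolerance; the use of the absolute deviation and the numbers below are illustrative assumptions:

```python
import numpy as np

def gaze_measure(r_outer: np.ndarray, R: float, tol: float = 2.0) -> float:
    """Normalized count of outer-boundary samples deviating from a
    circle of radius R; larger values would suggest an off-angle eye."""
    return float(np.sum(np.abs(r_outer - R) > tol) / (np.pi * R ** 2))

theta = np.linspace(0, 2 * np.pi, 360)
r = 100 + 8 * np.cos(theta)          # elliptic-looking boundary (gazed eye)
print(gaze_measure(r, R=100.0))      # compare against eta_g
```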






A strong specular reflection may be a concern, since it affects the contrast of the region being shined upon and thus affects the segmentation approach as well as the features in the iris texture. An amount of reflection may be assessed using the iris curve fitting and the high contrast thresholding.
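A crude sketch of such a reflection check via high contrast thresholding; the 0.95 cutoff on normalized intensities is an assumption:

```python
import numpy as np

def reflection_fraction(iris_region: np.ndarray, thresh: float = 0.95) -> float:
    """Share of iris pixels above a high-intensity threshold, as a
    simple specular-reflection indicator."""
    return float(np.mean(iris_region > thresh))

region = np.random.rand(64, 64)
region[:8, :8] = 1.0                 # simulated specular highlight
print(reflection_fraction(region))   # highlight share plus random tail
```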


Pupil dilation may affect iris recognition performance; however, a good segmentation technique may handle such dilation to a certain extent. Extreme pupil dilation may be detected by evaluating the limits of the pupil edge detection. It is expected that the edges of the pupil may be detected within the limits of a predefined range set for normal operation of pupil dilation. In case the limit is reached at all angles, this may indicate that the detected edges do not reflect the actual edges of the pupil and that redefinition of the limits is necessary.


Some segmentation approaches may be designed to overcome pupil dilation. However, it has been noted that in some cases the pupil dilation is significant enough that it may impact the segmentation. The present approach for assessing pupil dilation may be as follows. The iris map may be a region at the inner border of the iris, extended enough to cover a major segment of the iris without reaching the outer border. During inner boundary estimation, one may intentionally limit the map to a region less than the outer boundaries, to avoid any obscuration or noise interference that may affect the map structure. It is expected that the edges of the pupil may be detected within the limits of a predefined range defined for normal operation of pupil dilation. In case the limit is again reached at all angles, this may indicate that the detected edges do not reflect the actual edges of the pupil and that redefinition of the limits appears necessary.
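A sketch of this limit check, given the detected pupil-edge radius at each angle; the range limits are illustrative:

```python
import numpy as np

def dilation_at_limit(r_pupil: np.ndarray, r_min: float, r_max: float) -> bool:
    """True when the detected pupil edge sits at the predefined range
    limit at essentially all angles, suggesting the detected edges do
    not reflect the actual pupil and the limits need redefinition."""
    at_limit = (r_pupil <= r_min) | (r_pupil >= r_max)
    return bool(np.mean(at_limit) > 0.99)

r = np.full(360, 60.0)                               # edge radius per angle
print(dilation_at_limit(r, r_min=15.0, r_max=60.0))  # True: limit reached
```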


In the present specification, some of the matter may be of a hypothetical or prophetic nature although stated in another manner or tense.


Although the invention has been described with respect to at least one illustrative example, many variations and modifications will become apparent to those skilled in the art upon reading the present specification. It is therefore the intention that the appended claims be interpreted as broadly as possible in view of the prior art to include all such variations and modifications.

Claims
  • 1. A computer implemented iris image quality metric system comprising: a device for acquiring an iris image; and a data-processing apparatus; a quality processing module connected to the device; wherein the quality processing module is executed by the data-processing apparatus, the module and data-processing apparatus are operated in combination to provide a quality assessment of an iris image; wherein the quality assessment of the iris image includes an assessment of image blur; and image blur is measured according to high frequency distortions from coarse to fine wavelet coefficients.
  • 2. The system of claim 1, wherein the quality assessment of an iris image further includes an analysis of one or more of the following image conditions which comprise: defocus; closure; obscuration; off-angle detection; reflection; and/or excessive dilation.
  • 3. The system of claim 1, wherein a quality assessment of the iris image includes an analysis of the iris image relative to iris obscuration including a computation of an integral of virtually total area between an eyelid curve and an iris inner boundary of an eye from which the iris image is taken.
  • 4. The system of claim 1, wherein a quality assessment of the iris image includes an analysis based on an XOR-ing of codes of two patches of the iris image to measure a discrepancy among bits of the patches.
  • 5. The system of claim 1, wherein a quality assessment of the iris image includes an analysis based on measuring similarity of two patches of the iris image using contrast similarity, intensity distribution similarity, peak signal to noise ratio, and/or correlation between intensities of the two patches.
  • 6. The system of claim 1, wherein a quality assessment of the iris image includes an analysis of the iris image relative to an amount of reflection by the iris in the image including an iris curve fitting and a contrast thresholding.
  • 7. A computer implemented iris image quality metric system comprising: a device for acquiring an iris image; a data-processing apparatus; and a quality processing module connected to the device; wherein the quality processing module is executed by the data-processing apparatus, the module and data-processing apparatus are operated in combination to provide a quality assessment of an iris image; wherein a quality measurement relative to blur and/or defocus of the iris image is based on a measurement of energy at high frequencies of the spectral domain of the image; wherein the measurement of energy at high frequencies of the spectral domain of the image is compared to a measurement of energy at high frequencies of the spectral domain of the image without blur and/or defocus; and an amount that the energy at high frequencies of the spectral domain of the image is less than the energy at high frequencies of the spectral domain of the image without blur and/or defocus is proportional to the blur and/or defocus of the image.
  • 8. A computer implemented iris image quality metric system comprising: a device for acquiring an iris image; a data-processing apparatus; and a quality processing module connected to the device; wherein the quality processing module is executed by the data-processing apparatus, the module and data-processing apparatus are operated in combination to provide a quality assessment of an iris image; wherein the quality assessment of the iris image is according to blur; blur is measured according to high frequency distortions from coarse to fine wavelet coefficients; and wherein the high frequency distortions among the coarse to fine wavelet coefficients are indicated by a comparison of linear frequency distortion filter outputs at multiple stages of a dyadic decomposition to measure a discrepancy among the stages.
  • 9. A computer implemented iris image quality metric system comprising: a device for acquiring an iris image; a data-processing apparatus; and a quality processing module connected to the device; wherein the quality processing module is executed by the data-processing apparatus, the module and data-processing apparatus are operated in combination to provide a quality assessment of an iris image; wherein the quality assessment of the iris image is according to blur; blur is measured according to high frequency distortions from coarse to fine wavelet coefficients; and wherein high frequency content is compared at wavelet decompositions between two patches of the iris image for similarity purposes.
  • 10. The system of claim 1, wherein a quality assessment of the iris image includes an analysis of the iris image relative to eye closure including an analysis of an inner border profile of the iris in the image.
  • 11. The system of claim 10, wherein the inner border profile is estimated according to a measurement of an amount that a detected curve of the profile is similar to an elliptic-like shape, and a measurement of an amount of exposure of a map of the iris.
  • 12. A computer implemented iris image quality metric system comprising: a device for acquiring an iris image; a data-processing apparatus; and a quality processing module connected to the device; wherein the quality processing module is executed by the data-processing apparatus, the module and data-processing apparatus are operated in combination to provide a quality assessment of an iris image; and wherein a quality assessment of the iris image is according to a comparison of a location of edges of the pupil with a set of defined limits for normal operation of a pupil dilation.
  • 13. A computer implemented iris image preprocessing system comprising: a device for acquiring an iris image; a data-processing apparatus; an iris image module executed by the data-processing apparatus; an iris image quality measurement module executed by the data-processing apparatus and connected to the iris image module; an evaluator connected to the iris image module; a rehabilitator connected to the evaluator; wherein: the evaluator receives a quality measurement of an iris image from the quality measurement module, indicates whether the iris image is acceptable or unacceptable for further processing, and indicates whether an iris image that is unacceptable should be rejected or be rehabilitated for further processing; and the rehabilitator conditions the image by one or more of pixel processing, contrast balancing, histogram equalization, image blind deblurring, adaptive filtering for iris texture restoration, and pose normalization.
  • 14. The system of claim 13, further comprising an image segmentation module connected to the iris image module, the image segmentation module performing one dimensional polar segmentation on the iris image.
  • 15. A computer implemented method for assessing quality of an image comprising: receiving an iris image from an iris image acquiring device; measuring at least one quality of the iris image using a data-processing apparatus including a quality processing module; evaluating the at least one quality to determine whether the iris image is adequate relative to the at least one quality for further processing; determining whether an iris image, which is not adequate relative to the at least one quality for further processing, is adequate for rehabilitation; if the iris image is adequate for rehabilitation, conditioning the image by one or more of pixel processing, contrast balancing, histogram equalization, image blind deblurring, adaptive filtering for iris texture restoration, and pose normalization.
  • 16. A computer implemented method for assessing quality of an image comprising: receiving an iris image from an iris image acquiring device; measuring at least one quality of the iris image using a data-processing apparatus including a quality processing module; evaluating the at least one quality to determine whether the iris image is adequate relative to the at least one quality for further processing; measuring a first energy, at high frequencies, of the spectral domain of the iris image; determining a second energy, at high frequencies, of the spectral domain of a model iris image which is adequate for further processing; and wherein if the first energy is within a set percentage of the second energy, then the iris image is adequate, relative to the at least one quality, for further processing.
  • 17. A computer implemented method for assessing quality of an image comprising: receiving an iris image from an iris image acquiring device; measuring at least one quality of the iris image using a data-processing apparatus including a quality processing module; evaluating the at least one quality to determine whether the iris image is adequate relative to the at least one quality for further processing; obtaining two different localized patches from an iris code of the iris image; XOR-ing the two patches; measuring an amount of discrepancy among cross-matched bits of the two patches; determining whether the amount of discrepancy is greater than a set amount; and wherein if the amount of discrepancy is greater than a set amount, then the iris image is adequate, relative to the at least one quality, for further processing.
  • 18. A computer implemented method for assessing quality of an image comprising: receiving an iris image from an iris image acquiring device; measuring at least one quality of the iris image using a data-processing apparatus including a quality processing module; evaluating the at least one quality to determine whether the iris image is adequate relative to the at least one quality for further processing; measuring an amount of difference between the inner border profile in the iris image and an elliptic-like shape; and measuring a percentage of the iris map of the iris image that is exposed; and wherein: if the amount of difference is less than a set difference for an iris image adequate for further processing relative to the at least one quality, then the image is adequate for further processing relative to the at least one quality; and if the percentage is greater than a set percentage for an iris image adequate for further processing relative to the at least one quality, then the image is adequate for further processing relative to the at least one quality.
Parent Case Info

This application claims the benefit of U.S. Provisional Application No. 60/778,770, filed Mar. 3, 2006. This application is a continuation-in-part of U.S. patent application Ser. No. 11/275,703, filed Jan. 25, 2006, which claims the benefit of U.S. Provisional Application No. 60/647,270, filed Jan. 26, 2005. This application is a continuation-in-part of U.S. patent application Ser. No. 11/043,366, filed Jan. 26, 2005. This application is a continuation-in-part of U.S. patent application Ser. No. 11/372,854, filed Mar. 10, 2006. This application is a continuation-in-part of U.S. patent application Ser. No. 11/672,108, filed Feb. 7, 2007. This application is a continuation-in-part of U.S. patent application Ser. No. 11/675,424, filed Feb. 15, 2007.

Government Interests

The government may have rights in the present invention.

US Referenced Citations (398)
Number Name Date Kind
4641349 Flom et al. Feb 1987 A
4836670 Hutchinson Jun 1989 A
5231674 Cleveland et al. Jul 1993 A
5291560 Daugman Mar 1994 A
5293427 Ueno et al. Mar 1994 A
5359382 Uenaka Oct 1994 A
5404013 Tajima Apr 1995 A
5551027 Choy et al. Aug 1996 A
5572596 Wildes et al. Nov 1996 A
5608472 Szirth et al. Mar 1997 A
5664239 Nakata Sep 1997 A
5717512 Chmielewski, Jr. et al. Feb 1998 A
5751836 Wildes et al. May 1998 A
5859686 Aboutalib et al. Jan 1999 A
5860032 Iwane Jan 1999 A
5896174 Nakata Apr 1999 A
5901238 Matsushita May 1999 A
5909269 Isogai et al. Jun 1999 A
5953440 Zhang et al. Sep 1999 A
5956122 Doster Sep 1999 A
5978494 Zhang Nov 1999 A
6005704 Chmielewski, Jr. et al. Dec 1999 A
6007202 Apple et al. Dec 1999 A
6012376 Hanke et al. Jan 2000 A
6021210 Camus et al. Feb 2000 A
6028949 McKendall Feb 2000 A
6055322 Salganicoff et al. Apr 2000 A
6064752 Rozmus et al. May 2000 A
6069967 Rozmus et al. May 2000 A
6081607 Mori et al. Jun 2000 A
6088470 Camus et al. Jul 2000 A
6091899 Konishi et al. Jul 2000 A
6101477 Hohle et al. Aug 2000 A
6104431 Inoue et al. Aug 2000 A
6108636 Yap et al. Aug 2000 A
6119096 Mann et al. Sep 2000 A
6120461 Smyth Sep 2000 A
6134339 Luo Oct 2000 A
6144754 Okano et al. Nov 2000 A
6246751 Bergl et al. Jun 2001 B1
6247813 Kim et al. Jun 2001 B1
6252977 Salganicoff et al. Jun 2001 B1
6282475 Washington Aug 2001 B1
6285505 Melville et al. Sep 2001 B1
6285780 Yamakita et al. Sep 2001 B1
6289113 McHugh et al. Sep 2001 B1
6299306 Braithwaite et al. Oct 2001 B1
6308015 Matsumoto Oct 2001 B1
6309069 Seal et al. Oct 2001 B1
6320610 Van Sant et al. Nov 2001 B1
6320612 Young Nov 2001 B1
6320973 Suzaki et al. Nov 2001 B2
6323761 Son Nov 2001 B1
6325765 Hay et al. Dec 2001 B1
6330674 Angelo et al. Dec 2001 B1
6332193 Glass et al. Dec 2001 B1
6344683 Kim Feb 2002 B1
6370260 Pavlidis et al. Apr 2002 B1
6377699 Musgrave et al. Apr 2002 B1
6393136 Amir et al. May 2002 B1
6400835 Lemelson et al. Jun 2002 B1
6424727 Musgrave et al. Jul 2002 B1
6424845 Emmoft et al. Jul 2002 B1
6433818 Steinberg et al. Aug 2002 B1
6438752 McClard Aug 2002 B1
6441482 Foster Aug 2002 B1
6446045 Stone et al. Sep 2002 B1
6483930 Musgrave et al. Nov 2002 B1
6484936 Nicoll et al. Nov 2002 B1
6490443 Freeny, Jr. Dec 2002 B1
6493363 Weaver et al. Dec 2002 B1
6493669 Curry et al. Dec 2002 B1
6494363 Roger et al. Dec 2002 B1
6503163 Van Sant et al. Jan 2003 B1
6505193 Musgrave et al. Jan 2003 B1
6506078 Mori et al. Jan 2003 B1
6508397 Do Jan 2003 B1
6516078 Yang et al. Feb 2003 B1
6516087 Camus Feb 2003 B1
6516416 Gregg et al. Feb 2003 B2
6522772 Morrison et al. Feb 2003 B1
6523165 Liu et al. Feb 2003 B2
6526160 Ito Feb 2003 B1
6532298 Cambier et al. Mar 2003 B1
6540392 Braithwaite Apr 2003 B1
6542624 Oda Apr 2003 B1
6546121 Oda Apr 2003 B1
6553494 Glass Apr 2003 B1
6580356 Alt et al. Jun 2003 B1
6591001 Oda et al. Jul 2003 B1
6591064 Higashiyama et al. Jul 2003 B2
6594377 Kim et al. Jul 2003 B1
6594399 Camus et al. Jul 2003 B1
6598971 Cleveland Jul 2003 B2
6600878 Pregara Jul 2003 B2
6614919 Suzaki et al. Sep 2003 B1
6652099 Chae et al. Nov 2003 B2
6674367 Sweatte Jan 2004 B2
6690997 Rivalto Feb 2004 B2
6708176 Strunk et al. Mar 2004 B2
6711562 Ross et al. Mar 2004 B1
6714665 Hanna et al. Mar 2004 B1
6718049 Pavlidis et al. Apr 2004 B2
6718665 Hess et al. Apr 2004 B2
6732278 Baird, III et al. May 2004 B2
6734783 Anbai May 2004 B1
6745520 Puskaric et al. Jun 2004 B2
6750435 Ford Jun 2004 B2
6751733 Nakamura et al. Jun 2004 B1
6753919 Daugman Jun 2004 B1
6754640 Bozeman Jun 2004 B2
6760467 Min et al. Jul 2004 B1
6765470 Shinzaki Jul 2004 B2
6766041 Golden et al. Jul 2004 B2
6775774 Harper Aug 2004 B1
6785406 Kamada Aug 2004 B1
6793134 Clark Sep 2004 B2
6819219 Bolle et al. Nov 2004 B1
6829370 Pavlidis et al. Dec 2004 B1
6832044 Doi et al. Dec 2004 B2
6836554 Bolle et al. Dec 2004 B1
6837436 Swartz et al. Jan 2005 B2
6845879 Park Jan 2005 B2
6853444 Haddad Feb 2005 B2
6867683 Calvesio et al. Mar 2005 B2
6873960 Wood et al. Mar 2005 B1
6896187 Stockhammer May 2005 B2
6905411 Nguyen et al. Jun 2005 B2
6920237 Chen et al. Jul 2005 B2
6930707 Bates et al. Aug 2005 B2
6934849 Kramer et al. Aug 2005 B2
6950139 Fujinawa Sep 2005 B2
6954738 Wang et al. Oct 2005 B2
6957341 Rice et al. Oct 2005 B2
6972797 Izumi Dec 2005 B2
6992562 Fuks et al. Jan 2006 B2
7053948 Konishi May 2006 B2
7058209 Chen et al. Jun 2006 B2
7071971 Elberbaum Jul 2006 B2
7084904 Liu et al. Aug 2006 B2
7136581 Fujii Nov 2006 B2
7183895 Bazakos et al. Feb 2007 B2
7184577 Chen et al. Feb 2007 B2
7197173 Jones et al. Mar 2007 B2
7204425 Mosher, Jr. et al. Apr 2007 B2
7239726 Li Jul 2007 B2
7277561 Shin Oct 2007 B2
7277891 Howard et al. Oct 2007 B2
7298873 Miller, Jr. et al. Nov 2007 B2
7315233 Yuhara Jan 2008 B2
7331667 Grotehusmann et al. Feb 2008 B2
7362210 Bazakos et al. Apr 2008 B2
7362370 Sakamoto et al. Apr 2008 B2
7362884 Willis et al. Apr 2008 B2
7365771 Kahn et al. Apr 2008 B2
7406184 Wolff et al. Jul 2008 B2
7414648 Imada Aug 2008 B2
7417682 Kuwakino et al. Aug 2008 B2
7418115 Northcott et al. Aug 2008 B2
7421097 Hamza et al. Sep 2008 B2
7443441 Hiraoka Oct 2008 B2
7460693 Loy et al. Dec 2008 B2
7471451 Dent et al. Dec 2008 B2
7486806 Azuma et al. Feb 2009 B2
7518651 Butterworth Apr 2009 B2
7537568 Moehring May 2009 B2
7538326 Johnson et al. May 2009 B2
7542945 Thompson et al. Jun 2009 B2
7580620 Raskar et al. Aug 2009 B2
7593550 Hamza Sep 2009 B2
7639846 Yoda Dec 2009 B2
7722461 Gatto et al. May 2010 B2
7751598 Matey et al. Jul 2010 B2
7756301 Hamza Jul 2010 B2
7756407 Raskar Jul 2010 B2
7761453 Hamza Jul 2010 B2
7777802 Shinohara et al. Aug 2010 B2
7804982 Howard et al. Sep 2010 B2
20010026632 Tamai Oct 2001 A1
20010027116 Baird Oct 2001 A1
20010047479 Bromba et al. Nov 2001 A1
20010051924 Uberti Dec 2001 A1
20010054154 Tam Dec 2001 A1
20020010857 Karthik Jan 2002 A1
20020033896 Hatano Mar 2002 A1
20020039433 Shin Apr 2002 A1
20020040434 Elliston et al. Apr 2002 A1
20020062280 Zachariassen et al. May 2002 A1
20020077841 Thompson Jun 2002 A1
20020089157 Breed et al. Jul 2002 A1
20020106113 Park Aug 2002 A1
20020112177 Voltmer et al. Aug 2002 A1
20020114495 Chen et al. Aug 2002 A1
20020130961 Lee et al. Sep 2002 A1
20020131622 Lee et al. Sep 2002 A1
20020139842 Swaine Oct 2002 A1
20020140715 Smet Oct 2002 A1
20020142844 Kerr Oct 2002 A1
20020144128 Rahman et al. Oct 2002 A1
20020150281 Cho Oct 2002 A1
20020154794 Cho Oct 2002 A1
20020158750 Almalik Oct 2002 A1
20020164054 McCartney et al. Nov 2002 A1
20020175182 Matthews Nov 2002 A1
20020186131 Fettis Dec 2002 A1
20020191075 Doi et al. Dec 2002 A1
20020191076 Wada et al. Dec 2002 A1
20020194128 Maritzen et al. Dec 2002 A1
20020194131 Dick Dec 2002 A1
20020198731 Barnes et al. Dec 2002 A1
20030002714 Wakiyama Jan 2003 A1
20030012413 Kusakari et al. Jan 2003 A1
20030014372 Wheeler et al. Jan 2003 A1
20030020828 Ooi et al. Jan 2003 A1
20030038173 Blackson et al. Feb 2003 A1
20030046228 Berney Mar 2003 A1
20030053663 Chen et al. Mar 2003 A1
20030055689 Block et al. Mar 2003 A1
20030055787 Fujii Mar 2003 A1
20030058492 Wakiyama Mar 2003 A1
20030061172 Robinson Mar 2003 A1
20030061233 Manasse et al. Mar 2003 A1
20030065626 Allen Apr 2003 A1
20030071743 Seah et al. Apr 2003 A1
20030072475 Tamori Apr 2003 A1
20030073499 Reece Apr 2003 A1
20030074317 Hofi Apr 2003 A1
20030074326 Byers Apr 2003 A1
20030076161 Tisse Apr 2003 A1
20030076300 Lauper et al. Apr 2003 A1
20030076984 Tisse et al. Apr 2003 A1
20030080194 O'Hara et al. May 2003 A1
20030091215 Lauper et al. May 2003 A1
20030092489 Veradej May 2003 A1
20030095689 Volkommer et al. May 2003 A1
20030098776 Friedli May 2003 A1
20030099379 Monk et al. May 2003 A1
20030099381 Ohba May 2003 A1
20030103652 Lee et al. Jun 2003 A1
20030107097 McArthur et al. Jun 2003 A1
20030107645 Yoon Jun 2003 A1
20030108224 Ike Jun 2003 A1
20030108225 Li Jun 2003 A1
20030115148 Takhar Jun 2003 A1
20030115459 Monk Jun 2003 A1
20030116630 Carey et al. Jun 2003 A1
20030118212 Min et al. Jun 2003 A1
20030118217 Kondo et al. Jun 2003 A1
20030123711 Kim et al. Jul 2003 A1
20030125054 Garcia Jul 2003 A1
20030125057 Pesola Jul 2003 A1
20030126560 Kurapati et al. Jul 2003 A1
20030131245 Linderman Jul 2003 A1
20030131265 Bhakta Jul 2003 A1
20030133597 Moore et al. Jul 2003 A1
20030140235 Immega et al. Jul 2003 A1
20030140928 Bui et al. Jul 2003 A1
20030141411 Pandya et al. Jul 2003 A1
20030149881 Patel et al. Aug 2003 A1
20030152251 Ike Aug 2003 A1
20030152252 Kondo et al. Aug 2003 A1
20030156741 Lee et al. Aug 2003 A1
20030158762 Wu Aug 2003 A1
20030158821 Maia Aug 2003 A1
20030159051 Hollnagel Aug 2003 A1
20030163739 Armington et al. Aug 2003 A1
20030169334 Braithwaite et al. Sep 2003 A1
20030169901 Pavlidis et al. Sep 2003 A1
20030169907 Edwards et al. Sep 2003 A1
20030173408 Mosher, Jr. et al. Sep 2003 A1
20030174049 Beigel et al. Sep 2003 A1
20030177051 Driscoll et al. Sep 2003 A1
20030182151 Taslitz Sep 2003 A1
20030182182 Kocher Sep 2003 A1
20030189480 Hamid Oct 2003 A1
20030189481 Hamid Oct 2003 A1
20030191949 Odagawa Oct 2003 A1
20030194112 Lee Oct 2003 A1
20030195935 Leeper Oct 2003 A1
20030198368 Kee Oct 2003 A1
20030200180 Phelan, III et al. Oct 2003 A1
20030210139 Brooks et al. Nov 2003 A1
20030210802 Schuessler Nov 2003 A1
20030218719 Abourizk et al. Nov 2003 A1
20030225711 Paping Dec 2003 A1
20030228898 Rowe Dec 2003 A1
20030233556 Angelo et al. Dec 2003 A1
20030235326 Morikawa et al. Dec 2003 A1
20030235411 Morikawa et al. Dec 2003 A1
20030236120 Reece et al. Dec 2003 A1
20040001614 Russon et al. Jan 2004 A1
20040002894 Kocher Jan 2004 A1
20040005078 Tillotson Jan 2004 A1
20040006553 de Vries et al. Jan 2004 A1
20040010462 Moon et al. Jan 2004 A1
20040012760 Mihashi et al. Jan 2004 A1
20040019570 Bolle et al. Jan 2004 A1
20040023664 Mirouze et al. Feb 2004 A1
20040023709 Beaulieu et al. Feb 2004 A1
20040025030 Corbett-Clark et al. Feb 2004 A1
20040025031 Ooi et al. Feb 2004 A1
20040025053 Hayward Feb 2004 A1
20040029564 Hodge Feb 2004 A1
20040030930 Nomura Feb 2004 A1
20040035123 Kim et al. Feb 2004 A1
20040037450 Bradski Feb 2004 A1
20040039914 Barr et al. Feb 2004 A1
20040042641 Jakubowski Mar 2004 A1
20040044627 Russell et al. Mar 2004 A1
20040046640 Jourdain et al. Mar 2004 A1
20040049687 Orsini et al. Mar 2004 A1
20040050924 Mletzko et al. Mar 2004 A1
20040050930 Rowe Mar 2004 A1
20040052405 Walfridsson Mar 2004 A1
20040052418 DeLean Mar 2004 A1
20040059590 Mercredi et al. Mar 2004 A1
20040059953 Purnell Mar 2004 A1
20040104266 Bolle et al. Jun 2004 A1
20040117636 Cheng Jun 2004 A1
20040133804 Smith et al. Jul 2004 A1
20040146187 Jeng Jul 2004 A1
20040148526 Sands et al. Jul 2004 A1
20040160518 Park Aug 2004 A1
20040162870 Matsuzaki et al. Aug 2004 A1
20040162984 Freeman et al. Aug 2004 A1
20040169817 Grotehusmann et al. Sep 2004 A1
20040172541 Ando et al. Sep 2004 A1
20040174070 Voda et al. Sep 2004 A1
20040190759 Caldwell Sep 2004 A1
20040193893 Braithwaite et al. Sep 2004 A1
20040204711 Jackson Oct 2004 A1
20040219902 Lee et al. Nov 2004 A1
20040233038 Beenau et al. Nov 2004 A1
20040252866 Tisse et al. Dec 2004 A1
20040255168 Murashita et al. Dec 2004 A1
20050008200 Azuma et al. Jan 2005 A1
20050008201 Lee et al. Jan 2005 A1
20050012817 Hampapur et al. Jan 2005 A1
20050029353 Isemura et al. Feb 2005 A1
20050052566 Kato Mar 2005 A1
20050055582 Bazakos et al. Mar 2005 A1
20050063567 Saitoh et al. Mar 2005 A1
20050084137 Kim et al. Apr 2005 A1
20050084179 Hanna et al. Apr 2005 A1
20050099288 Spitz et al. May 2005 A1
20050102502 Sagen May 2005 A1
20050110610 Bazakos et al. May 2005 A1
20050125258 Yellin et al. Jun 2005 A1
20050127161 Smith et al. Jun 2005 A1
20050129286 Hekimian Jun 2005 A1
20050134796 Zelvin et al. Jun 2005 A1
20050138385 Friedli et al. Jun 2005 A1
20050138387 Lam et al. Jun 2005 A1
20050146640 Shibata Jul 2005 A1
20050151620 Neumann Jul 2005 A1
20050152583 Kondo et al. Jul 2005 A1
20050193212 Yuhara Sep 2005 A1
20050199708 Friedman Sep 2005 A1
20050206501 Farhat Sep 2005 A1
20050206502 Bernitz Sep 2005 A1
20050207614 Schonberg et al. Sep 2005 A1
20050210267 Sugano et al. Sep 2005 A1
20050210270 Rohatgi et al. Sep 2005 A1
20050210271 Chou et al. Sep 2005 A1
20050238214 Matsuda et al. Oct 2005 A1
20050240778 Saito Oct 2005 A1
20050248725 Ikoma et al. Nov 2005 A1
20050249385 Kondo et al. Nov 2005 A1
20050255840 Markham Nov 2005 A1
20060093190 Cheng et al. May 2006 A1
20060147094 Yoo Jul 2006 A1
20060165266 Hamza Jul 2006 A1
20060274919 LoIacono et al. Dec 2006 A1
20070036397 Hamza Feb 2007 A1
20070140531 Hamza Jun 2007 A1
20070160266 Jones et al. Jul 2007 A1
20070189582 Hamza et al. Aug 2007 A1
20070206840 Jacobson Sep 2007 A1
20070211924 Hamza Sep 2007 A1
20070274571 Hamza Nov 2007 A1
20070286590 Terashima Dec 2007 A1
20080005578 Shafir Jan 2008 A1
20080075334 Determan et al. Mar 2008 A1
20080075441 Jelinek et al. Mar 2008 A1
20080104415 Palti-Wasserman et al. May 2008 A1
20080148030 Goffin Jun 2008 A1
20080211347 Wright et al. Sep 2008 A1
20080252412 Larsson et al. Oct 2008 A1
20080267456 Anderson Oct 2008 A1
20090046899 Northcott et al. Feb 2009 A1
20090092283 Whillock et al. Apr 2009 A1
20090316993 Brasnett et al. Dec 2009 A1
20100002913 Hamza Jan 2010 A1
20100033677 Jelinek Feb 2010 A1
20100034529 Jelinek Feb 2010 A1
20100142765 Hamza Jun 2010 A1
20100182440 McCloskey Jul 2010 A1
20100239119 Bazakos et al. Sep 2010 A1
Foreign Referenced Citations (188)
Number Date Country
0484076 May 1992 EP
0593386 Apr 1994 EP
0878780 Nov 1998 EP
0899680 Mar 1999 EP
0910986 Apr 1999 EP
0962894 Dec 1999 EP
1018297 Jul 2000 EP
1024463 Aug 2000 EP
1028398 Aug 2000 EP
1041506 Oct 2000 EP
1041523 Oct 2000 EP
1126403 Aug 2001 EP
1139270 Oct 2001 EP
1237117 Sep 2002 EP
1477925 Nov 2004 EP
1635307 Mar 2006 EP
2369205 May 2002 GB
2371396 Jul 2002 GB
2375913 Nov 2002 GB
2402840 Dec 2004 GB
2411980 Sep 2005 GB
9161135 Jun 1997 JP
9198545 Jul 1997 JP
9201348 Aug 1997 JP
9147233 Sep 1997 JP
9234264 Sep 1997 JP
9305765 Nov 1997 JP
9319927 Dec 1997 JP
10021392 Jan 1998 JP
10040386 Feb 1998 JP
10049728 Feb 1998 JP
10137219 May 1998 JP
10137221 May 1998 JP
10137222 May 1998 JP
10137223 May 1998 JP
10248827 Sep 1998 JP
10269183 Oct 1998 JP
11047117 Feb 1999 JP
11089820 Apr 1999 JP
11200684 Jul 1999 JP
11203478 Jul 1999 JP
11213047 Aug 1999 JP
11339037 Dec 1999 JP
2000005149 Jan 2000 JP
2000005150 Jan 2000 JP
2000011163 Jan 2000 JP
2000023946 Jan 2000 JP
2000083930 Mar 2000 JP
2000102510 Apr 2000 JP
2000102524 Apr 2000 JP
2000105830 Apr 2000 JP
2000107156 Apr 2000 JP
2000139878 May 2000 JP
2000155863 Jun 2000 JP
2000182050 Jun 2000 JP
2000185031 Jul 2000 JP
2000194972 Jul 2000 JP
2000237167 Sep 2000 JP
2000242788 Sep 2000 JP
2000259817 Sep 2000 JP
2000356059 Dec 2000 JP
2000357232 Dec 2000 JP
2001005948 Jan 2001 JP
2001067399 Mar 2001 JP
2001101429 Apr 2001 JP
2001167275 Jun 2001 JP
2001222661 Aug 2001 JP
2001292981 Oct 2001 JP
2001297177 Oct 2001 JP
2001358987 Dec 2001 JP
2002119477 Apr 2002 JP
2002133415 May 2002 JP
2002153444 May 2002 JP
2002153445 May 2002 JP
2002260071 Sep 2002 JP
2002271689 Sep 2002 JP
2002286650 Oct 2002 JP
2002312772 Oct 2002 JP
2002329204 Nov 2002 JP
2003006628 Jan 2003 JP
2003036434 Feb 2003 JP
2003108720 Apr 2003 JP
2003108983 Apr 2003 JP
2003132355 May 2003 JP
2003150942 May 2003 JP
2003153880 May 2003 JP
2003242125 Aug 2003 JP
2003271565 Sep 2003 JP
2003271940 Sep 2003 JP
2003308522 Oct 2003 JP
2003308523 Oct 2003 JP
2003317102 Nov 2003 JP
2003331265 Nov 2003 JP
2004005167 Jan 2004 JP
2004021406 Jan 2004 JP
2004030334 Jan 2004 JP
2004038305 Feb 2004 JP
2004094575 Mar 2004 JP
2004152046 May 2004 JP
2004163356 Jun 2004 JP
2004164483 Jun 2004 JP
2004171350 Jun 2004 JP
2004171602 Jun 2004 JP
2004206444 Jul 2004 JP
2004220376 Aug 2004 JP
2004261515 Sep 2004 JP
2004280221 Oct 2004 JP
2004280547 Oct 2004 JP
2004287621 Oct 2004 JP
2004315127 Nov 2004 JP
2004318248 Nov 2004 JP
2005004524 Jan 2005 JP
2005011207 Jan 2005 JP
2005025577 Jan 2005 JP
2005038257 Feb 2005 JP
2005062990 Mar 2005 JP
2005115961 Apr 2005 JP
2005148883 Jun 2005 JP
2005242677 Sep 2005 JP
WO 9717674 May 1997 WO
WO 9721188 Jun 1997 WO
WO 9802083 Jan 1998 WO
WO 9808439 Mar 1998 WO
WO 9932317 Jul 1999 WO
WO 9952422 Oct 1999 WO
WO 9965175 Dec 1999 WO
WO 0028484 May 2000 WO
WO 0029986 May 2000 WO
WO 0031677 Jun 2000 WO
WO 0036605 Jun 2000 WO
WO 0062239 Oct 2000 WO
WO 0101329 Jan 2001 WO
WO 0103100 Jan 2001 WO
WO 0128476 Apr 2001 WO
WO 0135348 May 2001 WO
WO 0135349 May 2001 WO
WO 0140982 Jun 2001 WO
WO 0163994 Aug 2001 WO
WO 0169490 Sep 2001 WO
WO 0186599 Nov 2001 WO
WO 0201451 Jan 2002 WO
WO 0219030 Mar 2002 WO
WO 0235452 May 2002 WO
WO 0235480 May 2002 WO
WO 02091735 Nov 2002 WO
WO 02095657 Nov 2002 WO
WO 03002387 Jan 2003 WO
WO 03003910 Jan 2003 WO
WO 03054777 Jul 2003 WO
WO 03077077 Sep 2003 WO
WO 2004029863 Apr 2004 WO
WO 2004042646 May 2004 WO
WO 2004055737 Jul 2004 WO
WO 2004089214 Oct 2004 WO
WO 2004097743 Nov 2004 WO
WO 2005008567 Jan 2005 WO
WO 2005013181 Feb 2005 WO
WO 2005024698 Mar 2005 WO
WO 2005024708 Mar 2005 WO
WO 2005024709 Mar 2005 WO
WO 2005029388 Mar 2005 WO
WO 2005062235 Jul 2005 WO
WO 2005069252 Jul 2005 WO
WO 2005093510 Oct 2005 WO
WO 2005093681 Oct 2005 WO
WO 2005096962 Oct 2005 WO
WO 2005098531 Oct 2005 WO
WO 2005104704 Nov 2005 WO
WO 2005109344 Nov 2005 WO
WO 2006012645 Feb 2006 WO
WO 2006023046 Mar 2006 WO
WO 2006051462 May 2006 WO
WO 2006063076 Jun 2006 WO
2006081505 Aug 2006 WO
WO 2006081209 Aug 2006 WO
WO 2007101269 Sep 2007 WO
WO 2007101275 Sep 2007 WO
WO 2007101276 Sep 2007 WO
WO 2007103698 Sep 2007 WO
WO 2007103701 Sep 2007 WO
WO 2007103833 Sep 2007 WO
WO 2007103834 Sep 2007 WO
WO 2008016724 Feb 2008 WO
WO 2008019168 Feb 2008 WO
WO 2008019169 Feb 2008 WO
WO 2008021584 Feb 2008 WO
WO 2008031089 Mar 2008 WO
WO 2008040026 Apr 2008 WO
Related Publications (1)
Number Date Country
20070274570 A1 Nov 2007 US
Provisional Applications (2)
Number Date Country
60778770 Mar 2006 US
60647270 Jan 2005 US
Continuation in Parts (6)
Number Date Country
Parent 11275703 Jan 2006 US
Child 11681614 US
Parent 11681614 US
Child 11681614 US
Parent 11043366 Jan 2005 US
Child 11681614 US
Parent 11372854 Mar 2006 US
Child 11043366 US
Parent 11672108 Feb 2007 US
Child 11372854 US
Parent 11675424 Feb 2007 US
Child 11672108 US