Eye defect detection in international standards organization images

Information

  • Patent Grant
  • Patent Number
    8,503,818
  • Date Filed
    Tuesday, September 25, 2007
  • Date Issued
    Tuesday, August 6, 2013
Abstract
A method and apparatus for providing image processing. For one embodiment of the invention, a digital image is acquired. One or more relatively large candidate red eye defect regions are detected in at least a portion of the image. Face detection is applied to at least a portion of the image to eliminate non-face regions, and one or more relatively small candidate red eye defect regions are identified in at least a portion of the image not including the eliminated non-face regions.
Description
FIELD

Embodiments of the invention relate generally to the field of image processing and more specifically to methods and apparatuses for improved eye defect detection in digital images.


BACKGROUND

Light sensitivity ratings conforming to the international standard set by the International Standards Organization (ISO) are known as ISO ratings and denote the sensitivity of an acquisition device's imaging sensor to the amount of light present. In digital acquisition devices, altering the ISO rating is a means of exposure control, which affects shutter speed and/or lens aperture. The higher the ISO rating, the more sensitive the imaging sensor, thereby leading to increased exposure of an acquired image. However, as the light sensitivity increases, the imaging sensor records ever fainter light signals and thus becomes more susceptible to recording noise.


Noise produced by an imaging sensor is undesirable and can appear in an image as numerous small red-pixel cluster artifacts, also known as noise speckles. The presence of noise speckles in an image degrades the operation of conventional red eye detection methods, such as that disclosed in U.S. Pat. No. 7,599,577. Conventional red eye detection methods involve segmenting and labeling pixels or groups of pixels of an image into candidate red-eye regions. When such red eye detection methods are applied to images having a high ISO rating, many of the noise speckles are initially mistaken for red eye defects, and as a result, the segmenting and labeling operations of the method become computationally burdensome.
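
To make the computational burden concrete, below is a minimal sketch of the segmenting and labeling step; the redness test and thresholds are illustrative assumptions rather than the method of U.S. Pat. No. 7,599,577. In a noisy high-ISO image, every red-pixel speckle becomes one more labeled candidate that the later, more expensive stages must examine.

```python
import numpy as np
from scipy import ndimage

def label_red_candidates(rgb, redness_ratio=1.6, min_red=60):
    """Segment 'red' pixels and label connected clusters; each noise
    speckle in a high-ISO image yields an extra candidate region."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    # Illustrative redness test: red clearly dominates green and blue.
    red_mask = (r > redness_ratio * np.maximum(g, b)) & (r > min_red)
    labels, num_candidates = ndimage.label(red_mask)
    return labels, num_candidates
```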


It is known to apply a face tracker/detector, such as disclosed in International Patent Application No. PCT/EP2007/005330 (WO 2008/017343) and International Patent Application No. PCT/EP2007/006540 (WO 2008/107002), to limit the application of the red eye detection method to confirmed face regions. However, the presence of noise speckles can also affect the accuracy of face detection/tracking. Thus, such an approach could introduce a further degree of error, resulting in less accurate red-eye detection.


Furthermore, the computational requirements involved in running typical face detection/tracking prior to running red-eye detection would degrade or limit the performance of the face detector/tracker, the red-eye detector or both, particularly when implemented on real time image acquisition devices.


SUMMARY

In accordance with one embodiment of the invention, a digital image is acquired. One or more relatively large candidate red eye defect regions are detected in at least a portion of the image. Face detection is applied to at least a portion of the image to eliminate non-face regions, and one or more relatively small candidate red eye defect regions are identified in at least a portion of the image not including the eliminated non-face regions.


Other features and advantages of embodiments of the invention will be apparent from the accompanying drawings, and from the detailed description that follows below.





BRIEF DESCRIPTION OF THE DRAWINGS

The invention may be best understood by referring to the following description and accompanying drawings that are used to illustrate embodiments of the invention. In the drawings:



FIG. 1 illustrates a block diagram of an image acquisition device 20 operating in accordance with various alternative embodiments of the invention;



FIG. 2 illustrates a method for effecting red-eye detection in accordance with one embodiment of the invention; and



FIG. 3 illustrates a typical acquired image for which red-eye defect detection is effected in accordance with one embodiment of the invention.





DETAILED DESCRIPTION

An image acquired with a flash may include red-eye defects. In general, these red-eye defects are detected by applying a conventional eye defect detector to the image. However, images acquired with a high ISO rating, for example greater than ISO 800, may include numerous small red-pixel clusters indicative of noise, and in such cases the eye defect detector can identify the noise speckles as relatively small red eye defects.


Embodiments of the invention provide methods and apparatuses for detecting red eyes in high ISO flash images. For one embodiment of the invention, a digital image is acquired. One or more relatively large candidate red eye defect regions are detected in at least a portion of the image. Face detection is applied to at least a portion of the image to eliminate non-face regions and one or more relatively small candidate red eye defect regions are identified in at least a portion of the image not including the eliminated non-face regions.


In the following description, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known circuits, structures and techniques have not been shown in detail in order not to obscure the understanding of this description.


Reference throughout the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout the specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.


Moreover, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the Detailed Description are hereby expressly incorporated into this Detailed Description, with each claim standing on its own as a separate embodiment of this invention.


Embodiments of the invention are applicable to a wide range of systems in which image processing is effected. As noted above, an image acquired with a flash may include red-eye defects. In general, these red-eye defects are detected by applying a conventional eye defect detector to the image. However, images acquired with a high ISO rating, for example greater than ISO 800, may include numerous small red-pixel clusters indicative of noise, and in such cases the eye defect detector can identify the noise speckles as relatively small red eye defects.


In accordance with one embodiment of the invention, the effect of the noise speckles on the red eye defect detector can be mitigated by firstly applying anti-face detection to the image to eliminate regions of the image not comprising faces.


Face detection is well known in the art, for example as disclosed in US Patent Application No. 2002/0102024, hereinafter Viola-Jones. In Viola-Jones, a chain (cascade) typically comprising 32 classifiers based on rectangular (and increasingly refined) Haar features is used with an integral image, derived from an acquired image, by applying the classifiers to a sub-window within the integral image. For a complete analysis of an acquired image, this sub-window is shifted incrementally across the integral image until the entire image has been covered.
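
As a concrete illustration of the integral-image machinery this relies on, a minimal sketch follows; the two-rectangle feature, the window stride and all parameter values are simplified assumptions for illustration, not Viola-Jones's trained features.

```python
import numpy as np

def integral_image(gray):
    """Summed-area table with a zero row/column prepended, so any
    rectangle sum costs four lookups."""
    ii = np.zeros((gray.shape[0] + 1, gray.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(gray.astype(np.int64), axis=0), axis=1)
    return ii

def rect_sum(ii, top, left, height, width):
    """Sum of gray[top:top+height, left:left+width] in O(1)."""
    return (ii[top + height, left + width] - ii[top, left + width]
            - ii[top + height, left] + ii[top, left])

def haar_two_rect(ii, top, left, height, width):
    """Simple two-rectangle Haar feature: left half minus right half."""
    half = width // 2
    return (rect_sum(ii, top, left, height, half)
            - rect_sum(ii, top, left + half, height, width - half))

def scan_windows(ii, size, stride=4):
    """Shift a size x size sub-window incrementally across the image."""
    rows, cols = ii.shape[0] - 1, ii.shape[1] - 1
    for top in range(0, rows - size + 1, stride):
        for left in range(0, cols - size + 1, stride):
            yield top, left
```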


As the classifiers are increasingly more refined, the majority of non-face regions in an image are quickly eliminated after the first few classifiers in the cascade have been applied. Thus regions of an image that do not contain a face can be quickly and accurately determined. For example, in “Robust Real-Time Object Detection” Viola-Jones, Second International Workshop on Statistical and Computational Theories of Vision, Vancouver, July 2001, it is shown that it is possible to train a single two-feature classifier that will successfully detect 100% of faces with a 40% false positive rate. Thus, although 40% of the candidate face sub-windows it passes are not, in fact, face regions, practically 100% of the sub-windows it rejects are non-face regions.
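
The cascade's early-exit behaviour can be sketched as below, where each stage is assumed to be a callable returning True for "possibly a face"; the great majority of sub-windows exit at the first, cheapest stages.

```python
def cascade_passes(stages, ii, window):
    """Apply increasingly refined stages to one sub-window. The window
    is rejected at the first failing stage (confidently non-face);
    only windows surviving every stage are passed on as candidates."""
    for stage in stages:
        if not stage(ii, window):
            return False  # rejected early: almost certainly non-face
    return True  # passed: candidate face (may be a false positive)
```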


In one embodiment of the invention, red eye detection of small red-eye defects is only applied to regions of an image not rejected by a relatively relaxed face detector, referred to herein as an anti-face detector. In this way, the computational efficiency and quality of the red-eye detection application can be improved.



FIG. 1 illustrates a block diagram of an image acquisition device 20 operating in accordance with various alternative embodiments of the invention. The digital acquisition device 20, which in the present embodiment is a portable digital camera, includes a processor 120. It can be appreciated that many of the processes implemented in the digital camera may be implemented in or controlled by software operating in a microprocessor, central processing unit, controller, digital signal processor and/or an application specific integrated circuit, collectively depicted as block 120 labelled “processor”. Generically, all user interface and control of peripheral components such as buttons and display is controlled by a microcontroller 122. The processor 120, in response to a user input at 122, such as half pressing a shutter button (pre-capture mode 32), initiates and controls the digital photographic process.


Ambient light exposure is monitored using light sensor 40 in order to automatically determine if a flash is to be used. A distance to the subject is determined using a focus component 50, which also focuses the image on image capture component 60. If a flash is to be used, processor 120 causes the flash 70 to generate a photographic flash in substantial coincidence with the recording of the image by image capture component 60 upon full depression of the shutter button. The image capture component 60 digitally records the image in colour. The image capture component preferably includes a CCD (charge coupled device) or CMOS sensor to facilitate digital recording. The flash may be selectively generated either in response to the light sensor 40 or a manual input 72 from the user of the camera. The high resolution image recorded by image capture component 60 is stored in an image store 80, which may comprise computer memory such as dynamic random access memory or a non-volatile memory. The camera is typically equipped with a display 100, such as an LCD, for preview and post-view of images.


In the case of preview images which are generated in the pre-capture mode 32 with the shutter button half-pressed, the display 100 can assist the user in composing the image, as well as being used to determine focusing and exposure. Temporary storage 82 is used to store one or more of the preview images and can be part of the image store 80 or a separate component.


For one embodiment, the camera 20 has a user-selectable red-eye mode 30, particularly for detecting and optionally correcting images which have been acquired with a flash. A red eye module 90 analyzes and processes such images acquired from the image store 80 according to a workflow described below.


For one embodiment, the module 90 comprises a red-eye detector 92, a face detector 94, an anti-face detector 96 and a red-eye defect corrector 98, the operations of which will be described in more detail below. The module 90 can be integral to the camera 20; for one embodiment, module 90 could be the processor 120 with suitable programming, or part of an external processing device 10 such as a desktop computer.


Where the red eye module 90 is integral to the camera 20, the final processed image may be displayed on image display 100, saved on a persistent storage 112 which can be internal or a removable storage such as CF card, SD card or the like, or downloaded to another device, such as a personal computer, server or printer via image output means 110 which can be tethered or wireless. For various embodiments where the module 90 is implemented in an external device 10, such as a desktop computer, the final processed image may be returned to the camera 20 for storage and display as described above, or stored and displayed externally of the camera.



FIG. 2 illustrates a method for effecting red-eye defect detection in accordance with one embodiment of the invention. At operation 100, a high ISO digital flash image is acquired in an otherwise conventional manner. FIG. 3 illustrates a typical acquired image for which red-eye defect detection is effected in accordance with one embodiment of the invention. Image A, shown in FIG. 3, includes face regions B and C comprising relatively large red-eye defects, face regions D and E comprising relatively small red-eye defects, and a region F comprising a plurality of noise speckles. The image A less the regions B-F comprises regions identifiable by a relaxed face detector as non-face regions, referred to herein as G.


For one embodiment of the invention, the image A is analyzed by the red-eye detector 92 to locate any relatively large red eye regions. Since the noise speckles appear in the image as relatively small red-pixel cluster artifacts, their presence does not significantly affect the detection of large red eye defects. Thus, a standard red eye filter configured to locate relatively large red eye regions is applied to a sub-sampled version of the image, operation 110. Preferably the image is sub-sampled to 1024×768. However, it will be appreciated that the image may be sub-sampled to a greater degree, for example to 256×192.
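
A simple way to obtain such a sub-sampled version is sketched below; nearest-neighbour decimation is an assumption, since the text does not specify the resampling method.

```python
def subsample(image, target_w=1024, target_h=768):
    """Decimate an HxW(xC) numpy image to roughly the target size by
    striding; large red-eye regions survive, most noise speckles vanish."""
    step_y = max(1, image.shape[0] // target_h)
    step_x = max(1, image.shape[1] // target_w)
    return image[::step_y, ::step_x]
```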


As illustrated in FIG. 3, any candidate large red eye defect region 300 is bounded by a border (not necessarily rectangular), and for one embodiment, a boundary region 302 (not necessarily rectangular) is defined to include the border and a portion of the face surrounding the candidate region.


Referring again to FIG. 2, a conventional face detector 94 is preferably applied to a region including any detected boundary regions 302 in order to confirm the red eyes and identify surrounding face regions, operation 120.


The anti-face detector 96, which eliminates non-face regions of an image, is then applied to the full size image A.


For one embodiment, the anti-face detector 96 is applied to the full size image across a restricted range of scales. For example, the restricted range of scales may be based on anthropometric data and/or a size range of the noise speckles vis-à-vis the expected size of smaller red-eye defects which might appear in an image. So, for example, the smallest window size employed by the detector 96 could be determined by the smallest size of face in an image for which there is a requirement to detect and correct red-eye defects, whereas the largest window size could be set as not exceeding the smallest sized face detected by the detector 94 in operation 120.
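
One plausible derivation of the restricted set of window sizes is sketched below; the 1.25 scale step, the pixel bounds and the cap of five sizes are illustrative assumptions.

```python
def antiface_window_sizes(min_correctable_face_px, smallest_confirmed_face_px,
                          scale_step=1.25, max_sizes=5):
    """Window sizes for anti-face detector 96: bounded below by the
    smallest face worth correcting for red-eye and above by the
    smallest face confirmed by detector 94 in operation 120."""
    sizes = []
    size = float(min_correctable_face_px)
    while size <= smallest_confirmed_face_px and len(sizes) < max_sizes:
        sizes.append(int(size))
        size *= scale_step
    return sizes

# e.g. antiface_window_sizes(16, 40) -> [16, 20, 25, 31, 39]
```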


More specifically, the smallest size of face is a function of the camera subsystem, the flash subsystem and the level of ambient lighting during image capture. Thus, a camera with a stronger flash will generate the red-eye effect in faces which are more distant from the camera, and so a smaller face size criterion should be used for such a camera. Similarly, a higher level of ambient lighting will reduce the range at which red-eyes occur. Camera optics and sensor sensitivity will also determine the size threshold for the smallest face detector.


Typically, no more than 4-5 sizes of face detector window would be employed (further granularity would reduce the speed of the detector), and typically decisions determined from the ambient lighting, lens configuration, exposure settings and flash strength would only affect the use of the 1-2 smallest sizes of face detector window.


As the combination of optics, sensor, lens and flash subsystems is quite unique to most models of digital camera, an empirical calibration of this smallest size window threshold is typically required, although it is possible to share calibration data between cameras with well-defined subsystem characteristics. Due to its non-linear nature, this data is typically stored within the camera firmware as a set of look-up tables.
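
Such firmware tables might look like the hypothetical sketch below; the keys, the pixel values and the granularity are invented for illustration, since the real values come from per-model empirical calibration.

```python
# Hypothetical calibration: (flash power, ambient light level) -> smallest
# face-window size (pixels) at which red-eye defects are still corrected.
SMALLEST_FACE_WINDOW_LUT = {
    ("strong_flash", "dark"): 16,    # red-eye occurs at long range
    ("strong_flash", "bright"): 24,
    ("weak_flash", "dark"): 24,
    ("weak_flash", "bright"): 32,    # red-eye only at short range
}

def smallest_face_window(flash_level, ambient_level):
    """Non-linear, empirically calibrated mapping stored as a table."""
    return SMALLEST_FACE_WINDOW_LUT[(flash_level, ambient_level)]
```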


For one embodiment, the anti-face detector is an inverted face detector comprising a single two-feature classifier with a 40% false positive rate, as disclosed in the Viola-Jones paper referred to above.


An alternative method of face detection is described in US 2006/0126938, which discloses employing a measure of variance in an image sub-window to determine if the sub-window could possibly contain a face. If the variance of the sub-window lies below a particular threshold, a face cannot be detected, and the sub-window is rejected as not comprising a face.


As such, in an alternative implementation of the invention, a classifier based on variance of any sub-window within an image can be employed to quickly eliminate regions of an image not containing a face.
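
With integral images of the pixels and of their squares, this variance test costs O(1) per sub-window. A sketch follows, reusing rect_sum from the integral-image sketch above; the threshold is an assumed tuning parameter.

```python
def variance_rejects(ii, ii_sq, top, left, size, threshold=100.0):
    """Reject a sub-window whose pixel variance falls below threshold:
    near-uniform regions (sky, walls) cannot contain a face.
    ii / ii_sq are integral images of the pixels and squared pixels."""
    n = float(size * size)
    s = rect_sum(ii, top, left, size, size)
    sq = rect_sum(ii_sq, top, left, size, size)
    variance = sq / n - (s / n) ** 2
    return variance < threshold
```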


It will be appreciated that this variance-based technique may be combined with the relatively relaxed Viola-Jones face detector to improve the accuracy and/or speed of the anti-face detector.


For one embodiment, the anti-face detector 96 is applied to portions of the image A excluding large red eye face regions (B+C) confirmed by the face detector of operation 120, i.e. region (A−(B+C)), in order to eliminate non-face regions (A−(B+C+D+E+F)) of the image. However, it will also be appreciated that the anti-face detector 96 may be applied to the whole acquired image A.


Although almost 100% of the sub-windows eliminated by the anti-face detector are non-face regions, it is likely that some non-face regions F will not be eliminated by the anti-face detector 96.


For one embodiment, the red eye detector 92 is then applied to any regions of the image which were not eliminated as non-face regions by the anti-face detector of operation 130, and preferably excluding those regions confirmed as large red eye face regions in operation 120, for example, regions D+E+F in FIG. 3, in order to locate relatively small red eye defects, operation 140. It is understood that if no regions remain after the application of the anti-face detector 96, the red-eye defect detector 92 configured to detect relatively small red eye defects is not applied to the image.


In this way, any noise speckles in non-face regions eliminated by the anti-face detector 96, i.e. region (A−(B+C+D+E+F)), are not subjected to the red eye detection of operation 140, but face regions, as well as non-face regions not eliminated by the anti-face detector, i.e. region F, are subjected to the red eye detection of operation 140.


Thus, the regions of the image to which red eye detection is applied are significantly reduced, thereby increasing the computational efficiency of running the red eye detection application on a high ISO image. Furthermore, the probability of the red eye detector mistakenly identifying noise speckles as small red eye defects is reduced.


For one embodiment, the red eye defect corrector 98, such as that disclosed in U.S. Pat. No. 7,599,577, is applied to those relatively large and relatively small red eye defect regions to correct the image, operation 150.


In an alternative embodiment of the invention, the operations of FIG. 2 are carried out only if the image was acquired with an ISO sensitivity greater than or equal to 800.
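
Pulling the FIG. 2 workflow together, a high-level sketch follows; every helper name here is a hypothetical placeholder standing in for the components described above (red-eye detector 92 in its two configurations, face detector 94, anti-face detector 96, corrector 98), not an actual API.

```python
def process_flash_image(image, iso, d):
    """Sketch of operations 100-150, gated on high-ISO capture; `d`
    bundles the detector/corrector components as callables."""
    if iso < 800:
        return image  # assumption: conventional handling at low ISO
    large = d.large_red_eye(subsample(image))        # op. 110, detector 92
    faces = d.confirm_faces(image, large)            # op. 120, detector 94
    survivors = d.anti_face(image, exclude=faces)    # op. 130, detector 96
    small = []
    if survivors:  # skip the small-defect pass if nothing remains
        small = d.small_red_eye(image, survivors)    # op. 140, detector 92
    return d.correct(image, large + small)           # op. 150, corrector 98
```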


General Matters


Embodiments of the invention include apparatuses and methods for effecting red-eye defect detection. Embodiments of the invention have been described above with various specific details. It will be appreciated that such details are examples and may be modified.


Embodiments of the invention have been described as including various operations. Many of the processes are described in their most basic form, but operations can be added to or deleted from any of the processes without departing from the scope of the invention.


The operations of the invention may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor or logic circuits programmed with the instructions to perform the operations. Alternatively, the operations may be performed by a combination of hardware and software. The invention may be provided as a computer program product that may include a machine-readable medium having stored thereon instructions, which may be used to program a computer (or other electronic devices) to perform a process according to the invention. The machine-readable medium may include, but is not limited to, floppy diskettes, optical disks, CD-ROMs, and magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, flash memory, or other types of media/machine-readable medium suitable for storing electronic instructions. Moreover, the invention may also be downloaded as a computer program product, wherein the program may be transferred from a remote computer to a requesting computer by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection).


While the invention has been described in terms of several embodiments, those skilled in the art will recognize that the invention is not limited to the embodiments described, but can be practiced with modification and alteration within the spirit and scope of the appended claims. The description is thus to be regarded as illustrative instead of limiting.

Claims
  • 1. A method for digital image red eye defect detection comprising: using a processor; acquiring a digital image; identifying one or more relatively large candidate red eye defect regions in at least a portion of said image, including using a first red eye detector not configured to detect one or more relatively small candidate red eye defect regions; applying anti-face detection to at least a portion of said image not including said one or more relatively large candidate red eye defect regions to eliminate non-face regions; and identifying said one or more relatively small candidate red eye defect regions both (a) with a second red eye detector configured to detect said one or more relatively small candidate red eye defect regions, and (b) in a portion of said image including neither (i) said eliminated non-face regions nor (ii) said one or more relatively large candidate red eye defect regions.
  • 2. The method of claim 1 wherein the identifying one or more relatively large candidate red eye defect regions is carried out on a sub-sampled version of said image.
  • 3. The method of claim 2 comprising applying relatively more rigorous face detection compared to said applying face detection to confirm said one or more large candidate red-eye defect regions.
  • 4. The method of claim 3 wherein said applying face detection comprises applying relatively relaxed face detection to said at least a portion of said image compared with said applying relatively rigorous face detection.
  • 5. The method of claim 4 wherein said applying relatively relaxed face detection comprises applying a chain of about two classifiers to said at least a portion of said image.
  • 6. The method of claim 4 wherein said relatively relaxed face detection has a false positive rate of approximately 40%.
  • 7. The method of claim 4, wherein the applying relatively relaxed face detection comprises comparing a variance of at least a portion of the image with a threshold and responsive to the variance being less than the threshold, eliminating the portion.
  • 8. The method of claim 1 further comprising correcting said relatively large candidate red eye defect regions and said relatively small candidate red eye defect regions.
  • 9. The method of claim 1 wherein the applying anti-face detection comprises applying an anti-face detector to said image to identify those regions of the image which do not contain faces and labelling the remaining regions of the image as candidate face regions.
  • 10. The method of claim 1 further comprising determining to apply the identifying one or more relatively large candidate red eye defect regions, applying anti-face detection and identifying one or more relatively small candidate red eye defect regions are applied in response to acquiring said image with an ISO rating greater than 800.
  • 11. One or more non-transitory processor readable media having code embedded therein for programming one or more processors to perform a method for digital image red eye defect detection, wherein the method comprises: identifying one or more relatively large candidate red eye defect regions in at least a portion of an acquired digital image, including using a first red eye detector not configured to detect one or more relatively small candidate red eye defect regions; applying anti-face detection to at least a portion of said image not including said one or more relatively large candidate red eye defect regions to eliminate non-face regions; and identifying said one or more relatively small candidate red eye defect regions both (a) with a second red eye detector configured to detect said one or more relatively small candidate red eye defect regions, and (b) in a portion of said image including neither (i) said eliminated non-face regions nor (ii) said one or more relatively large candidate red eye defect regions.
  • 12. The one or more non-transitory processor readable media of claim 11 wherein the identifying is carried out on a sub-sampled version of said image.
  • 13. The one or more non-transitory processor readable media of claim 12 wherein the identifying further comprises applying relatively more rigorous face detection, compared to said applying face detection, to confirm said one or more large candidate red-eye defect regions.
  • 14. The one or more non-transitory processor readable media of claim 13 wherein said applying face detection comprises applying relatively relaxed face detection to said at least a portion of said image compared with said applying relatively rigorous face detection.
  • 15. The one or more non-transitory processor readable media of claim 14 wherein said applying relatively relaxed face detection comprises applying a chain of about two classifiers to said at least a portion of said image.
  • 16. The one or more non-transitory processor readable media of claim 14 wherein said relatively relaxed face detection has a false positive rate of approximately 40%.
  • 17. The one or more non-transitory processor readable media of claim 14, wherein the applying relatively relaxed face detection comprises comparing a variance of at least a portion of the image with a threshold and responsive to the variance being less than the threshold, eliminating the portion.
  • 18. The one or more non-transitory processor readable media of claim 11 further comprising correcting said relatively large candidate red eye defect regions and said relatively small candidate red eye defect regions.
  • 19. The one or more non-transitory processor readable media of claim 11 wherein the applying anti-face detection comprises applying an anti-face detector to said image to identify those regions of the image which do not contain faces and labelling the remaining regions of the image as candidate face regions.
  • 20. The one or more non-transitory processor readable media of claim 11, further comprising determining to apply the identifying one or more relatively large candidate red eye defect regions, applying anti-face detection and identifying one or more relatively small candidate red eye defect regions are applied in response to acquiring said image with an ISO rating greater than 800.
  • 21. A digital image acquisition device, comprising: a lens and image sensor for acquiring digital images; a processor; a memory having processor readable code embedded therein for programming the processor to perform a method for digital image red eye defect detection, wherein the method comprises: acquiring a digital image; identifying one or more relatively large candidate red eye defect regions in at least a portion of said image, including using a first red eye detector not configured to detect one or more relatively small candidate red eye defect regions; applying anti-face detection to at least a portion of said image not including said one or more relatively large candidate red eye defect regions to eliminate non-face regions; and identifying said one or more relatively small candidate red eye defect regions both (a) with a second red eye detector configured to detect said one or more relatively small candidate red eye defect regions, and (b) in a portion of said image including neither (i) said eliminated non-face regions nor (ii) said one or more relatively large candidate red eye defect regions.
  • 22. The device of claim 21 wherein the identifying is carried out on a sub-sampled version of said image.
  • 23. The device of claim 22 wherein the identifying further comprises applying relatively more rigorous face detection, compared to said applying face detection, to confirm said one or more large candidate red-eye defect regions.
  • 24. The device of claim 23 wherein said applying face detection comprises applying relatively relaxed face detection to said at least a portion of said image compared with said applying relatively rigorous face detection.
  • 25. The device of claim 24, wherein said applying relatively relaxed face detection comprises applying a chain of about two classifiers to said at least a portion of said image.
  • 26. The device of claim 24, wherein said relatively relaxed face detection has a false positive rate of approximately 40%.
  • 27. The device of claim 24, wherein the applying relatively relaxed face detection comprises comparing a variance of at least a portion of the image with a threshold and responsive to the variance being less than the threshold, eliminating the portion.
  • 28. The device of claim 21, wherein the method further comprises correcting said relatively large candidate red eye defect regions and said relatively small candidate red eye defect regions.
  • 29. The device of claim 21, wherein the applying anti-face detection comprises applying an anti-face detector to said image to identify those regions of the image which do not contain faces and labelling the remaining regions of the image as candidate face regions.
  • 30. The device of claim 21, wherein the method further comprises determining to apply the identifying one or more relatively large candidate red eye defect regions, applying anti-face detection and identifying one or more relatively small candidate red eye defect regions are applied in response to acquiring said image with an ISO rating greater than 800.
  • 31. A digital image acquisition device, comprising: a lens and image sensor for acquiring digital images; a processor; a first red eye detector configured to program the processor to identify one or more relatively large candidate red eye defect regions in at least a portion of said image, and not configured to detect one or more relatively small candidate red eye defect regions; an anti-face detector configured to program the processor to apply face detection to at least a portion of said image not including said one or more relatively large candidate red eye defect regions to eliminate non-face regions; and a second red eye detector configured to program the processor to identify said one or more relatively small candidate red eye defect regions in a portion of said image including neither (a) said eliminated non-face regions nor (b) said one or more relatively large candidate red eye defect regions.
  • 32. The device of claim 31, wherein the first red eye detector is configured to operate on a sub-sampled version of said image.
  • 33. The device of claim 32, further comprising a second face detector configured to apply relatively more rigorous face detection than said anti-face detector to confirm said one or more large candidate red-eye defect regions.
  • 34. The device of claim 33, wherein said anti-face detector is configured to apply relatively relaxed face detection to said portion of said image compared with said second face detector.
  • 35. The device of claim 34, wherein said anti-face detector is configured to apply a chain of approximately two classifiers to said portion of said image.
  • 36. The device of claim 34, wherein said anti-face detector is configured to have a false positive rate of approximately 40%.
  • 37. The device of claim 34, wherein the anti-face detector is configured to compare a variance of a same or different portion of the image with a threshold and responsive to the variance being less than the threshold, eliminating the portion.
  • 38. The device of claim 31, wherein the device is configured to correct the relatively large candidate red eye defect regions and the relatively small candidate red eye defect regions.
  • 39. The device of claim 31, wherein the anti-face detector is configured to label regions that the anti-face detector does not eliminate as candidate face regions.
  • 40. The device of claim 31, wherein the device is configured to determine to apply the first red eye detector, the anti-face detector and the second red eye detector in response to acquiring said image with an ISO rating greater than 800.
US Referenced Citations (347)
Number Name Date Kind
4285588 Mir Aug 1981 A
4577219 Klie et al. Mar 1986 A
4646134 Komatsu et al. Feb 1987 A
4777620 Shimoni et al. Oct 1988 A
4881067 Watanabe et al. Nov 1989 A
4978989 Nakano et al. Dec 1990 A
5016107 Sasson et al. May 1991 A
5070355 Inoue et al. Dec 1991 A
5130789 Dobbs et al. Jul 1992 A
5164831 Kuchta et al. Nov 1992 A
5164833 Aoki Nov 1992 A
5202720 Fujino et al. Apr 1993 A
5231674 Cleveland et al. Jul 1993 A
5249053 Jain Sep 1993 A
5274457 Kobayashi et al. Dec 1993 A
5301026 Lee Apr 1994 A
5303049 Ejima et al. Apr 1994 A
5335072 Tanaka et al. Aug 1994 A
5384601 Yamashita et al. Jan 1995 A
5400113 Sosa et al. Mar 1995 A
5432863 Benati et al. Jul 1995 A
5432866 Sakamoto Jul 1995 A
5452048 Edgar Sep 1995 A
5455606 Keeling et al. Oct 1995 A
5537516 Sherman et al. Jul 1996 A
5568187 Okino Oct 1996 A
5568194 Abe Oct 1996 A
5649238 Wakabayashi et al. Jul 1997 A
5671013 Nakao Sep 1997 A
5678073 Stephenson, III et al. Oct 1997 A
5694926 DeVries et al. Dec 1997 A
5708866 Leonard Jan 1998 A
5719639 Imamura Feb 1998 A
5719951 Shackleton et al. Feb 1998 A
5724456 Boyack et al. Mar 1998 A
5734425 Takizawa et al. Mar 1998 A
5748764 Benati et al. May 1998 A
5748784 Sugiyama May 1998 A
5751836 Wildes et al. May 1998 A
5761550 Kancigor Jun 1998 A
5781650 Lobo et al. Jul 1998 A
5805720 Suenaga et al. Sep 1998 A
5805727 Nakano Sep 1998 A
5805745 Graf Sep 1998 A
5815749 Tsukahara et al. Sep 1998 A
5818975 Goodwin et al. Oct 1998 A
5847714 Naqvi et al. Dec 1998 A
5850470 Kung et al. Dec 1998 A
5862217 Steinberg et al. Jan 1999 A
5862218 Steinberg Jan 1999 A
5892837 Luo et al. Apr 1999 A
5949904 Delp Sep 1999 A
5974189 Nicponski Oct 1999 A
5990973 Sakamoto Nov 1999 A
5991456 Rahman et al. Nov 1999 A
5991549 Tsuchida Nov 1999 A
5991594 Froeber et al. Nov 1999 A
5999160 Kitamura et al. Dec 1999 A
6006039 Steinberg et al. Dec 1999 A
6009209 Acker et al. Dec 1999 A
6011547 Shiota et al. Jan 2000 A
6016354 Lin et al. Jan 2000 A
6028611 Anderson et al. Feb 2000 A
6035072 Read Mar 2000 A
6035074 Fujimoto et al. Mar 2000 A
6036072 Lee Mar 2000 A
6101271 Yamashita et al. Aug 2000 A
6104839 Cok et al. Aug 2000 A
6118485 Hinoue et al. Sep 2000 A
6125213 Morimoto Sep 2000 A
6134339 Luo Oct 2000 A
6151403 Luo Nov 2000 A
6172706 Tatsumi Jan 2001 B1
6192149 Eschbach et al. Feb 2001 B1
6195127 Sugimoto Feb 2001 B1
6201571 Ota Mar 2001 B1
6204858 Gupta Mar 2001 B1
6233364 Krainiouk et al. May 2001 B1
6249315 Holm Jun 2001 B1
6252976 Schildkraut et al. Jun 2001 B1
6266054 Lawton et al. Jul 2001 B1
6268939 Klassen et al. Jul 2001 B1
6275614 Krishnamurthy et al. Aug 2001 B1
6278491 Wang et al. Aug 2001 B1
6285410 Marni Sep 2001 B1
6292574 Schildkraut et al. Sep 2001 B1
6295378 Kitakado et al. Sep 2001 B1
6298166 Ratnakar et al. Oct 2001 B1
6300935 Sobel et al. Oct 2001 B1
6381345 Swain Apr 2002 B1
6393148 Bhaskar May 2002 B1
6396963 Shaffer et al. May 2002 B2
6407777 DeLuca Jun 2002 B1
6421468 Ratnakar et al. Jul 2002 B1
6426775 Kurokawa Jul 2002 B1
6429924 Milch Aug 2002 B1
6433818 Steinberg et al. Aug 2002 B1
6438264 Gallagher et al. Aug 2002 B1
6441854 Fellegara et al. Aug 2002 B2
6459436 Kumada et al. Oct 2002 B1
6473199 Gilman et al. Oct 2002 B1
6496655 Malloy Desormeaux Dec 2002 B1
6501911 Malloy Desormeaux Dec 2002 B1
6505003 Malloy Desormeaux Jan 2003 B1
6510520 Steinberg Jan 2003 B1
6516154 Parulski et al. Feb 2003 B1
6614471 Ott Sep 2003 B1
6614995 Tseng Sep 2003 B2
6621867 Sazzad et al. Sep 2003 B1
6628833 Horie Sep 2003 B1
6631208 Kinjo et al. Oct 2003 B1
6700614 Hata Mar 2004 B1
6707950 Burns et al. Mar 2004 B1
6714665 Hanna et al. Mar 2004 B1
6718051 Eschbach Apr 2004 B1
6724941 Aoyama Apr 2004 B1
6728401 Hardeberg Apr 2004 B1
6765686 Maruoka Jul 2004 B2
6786655 Cook et al. Sep 2004 B2
6792161 Imaizumi et al. Sep 2004 B1
6798913 Toriyama Sep 2004 B2
6859565 Baron Feb 2005 B2
6873743 Steinberg Mar 2005 B2
6885766 Held et al. Apr 2005 B2
6895112 Chen et al. May 2005 B2
6900882 Iida May 2005 B2
6912298 Wilensky Jun 2005 B1
6937997 Parulski Aug 2005 B1
6967680 Kagle et al. Nov 2005 B1
6980691 Nesterov et al. Dec 2005 B2
6984039 Agostinelli Jan 2006 B2
7024051 Miller et al. Apr 2006 B2
7027643 Comaniciu et al. Apr 2006 B2
7027662 Baron Apr 2006 B2
7030927 Sasaki Apr 2006 B2
7035461 Luo et al. Apr 2006 B2
7035462 White et al. Apr 2006 B2
7042501 Matama May 2006 B1
7042505 DeLuca May 2006 B1
7062086 Chen et al. Jun 2006 B2
7116820 Luo et al. Oct 2006 B2
7130453 Kondo et al. Oct 2006 B2
7133070 Wheeler et al. Nov 2006 B2
7155058 Gaubatz et al. Dec 2006 B2
7171044 Chen et al. Jan 2007 B2
7216289 Kagle et al. May 2007 B2
7224850 Zhang et al. May 2007 B2
7269292 Steinberg Sep 2007 B2
7289664 Enomoto Oct 2007 B2
7295233 Steinberg et al. Nov 2007 B2
7310443 Kris et al. Dec 2007 B1
7315631 Corcoran et al. Jan 2008 B1
7336821 Ciuc et al. Feb 2008 B2
7352394 DeLuca et al. Apr 2008 B1
7362368 Steinberg et al. Apr 2008 B2
7369712 Steinberg et al. May 2008 B2
7403643 Ianculescu et al. Jul 2008 B2
7436998 Steinberg et al. Oct 2008 B2
7454040 Luo et al. Nov 2008 B2
7515740 Corcoran et al. Apr 2009 B2
7567707 Willamowski et al. Jul 2009 B2
7574069 Setlur et al. Aug 2009 B2
7593603 Wilensky Sep 2009 B1
7613332 Enomoto et al. Nov 2009 B2
7630006 DeLuca et al. Dec 2009 B2
7657060 Cohen et al. Feb 2010 B2
7702149 Ohkubo et al. Apr 2010 B2
7747071 Yen et al. Jun 2010 B2
20010015760 Fellegara et al. Aug 2001 A1
20010031142 Whiteside Oct 2001 A1
20010052937 Suzuki Dec 2001 A1
20020019859 Watanabe Feb 2002 A1
20020041329 Steinberg Apr 2002 A1
20020051571 Jackway et al. May 2002 A1
20020054224 Wasula et al. May 2002 A1
20020085088 Eubanks Jul 2002 A1
20020089514 Kitahara et al. Jul 2002 A1
20020090133 Kim et al. Jul 2002 A1
20020093577 Kitawaki et al. Jul 2002 A1
20020093633 Milch Jul 2002 A1
20020102024 Jones et al. Aug 2002 A1
20020105662 Patton et al. Aug 2002 A1
20020114513 Hirao Aug 2002 A1
20020126893 Held et al. Sep 2002 A1
20020131770 Meier et al. Sep 2002 A1
20020136450 Chen et al. Sep 2002 A1
20020141661 Steinberg Oct 2002 A1
20020150292 O'callaghan Oct 2002 A1
20020150306 Baron Oct 2002 A1
20020159630 Buzuloiu et al. Oct 2002 A1
20020172419 Lin et al. Nov 2002 A1
20020176623 Steinberg Nov 2002 A1
20030007687 Nesterov et al. Jan 2003 A1
20030021478 Yoshida Jan 2003 A1
20030025808 Parulski et al. Feb 2003 A1
20030025811 Keelan et al. Feb 2003 A1
20030039402 Robins et al. Feb 2003 A1
20030044063 Meckes et al. Mar 2003 A1
20030044070 Fuersich et al. Mar 2003 A1
20030044176 Saitoh Mar 2003 A1
20030044177 Oberhardt et al. Mar 2003 A1
20030044178 Oberhardt et al. Mar 2003 A1
20030052991 Stavely et al. Mar 2003 A1
20030058343 Katayama Mar 2003 A1
20030058349 Takemoto Mar 2003 A1
20030086134 Enomoto May 2003 A1
20030086164 Abe May 2003 A1
20030095197 Wheeler et al. May 2003 A1
20030107649 Flickner et al. Jun 2003 A1
20030113035 Cahill et al. Jun 2003 A1
20030118216 Goldberg Jun 2003 A1
20030137597 Sakamoto et al. Jul 2003 A1
20030142285 Enomoto Jul 2003 A1
20030161506 Velazquez et al. Aug 2003 A1
20030190072 Adkins et al. Oct 2003 A1
20030194143 Iida Oct 2003 A1
20030202715 Kinjo Oct 2003 A1
20040017481 Takasumi et al. Jan 2004 A1
20040027593 Wilkins Feb 2004 A1
20040032512 Silverbrook Feb 2004 A1
20040032526 Silverbrook Feb 2004 A1
20040033071 Kubo Feb 2004 A1
20040037460 Luo et al. Feb 2004 A1
20040041924 White et al. Mar 2004 A1
20040046878 Jarman Mar 2004 A1
20040047491 Rydbeck Mar 2004 A1
20040056975 Hata Mar 2004 A1
20040057623 Schuhrke et al. Mar 2004 A1
20040057705 Kohno Mar 2004 A1
20040057715 Tsuchida et al. Mar 2004 A1
20040090461 Adams May 2004 A1
20040093432 Luo et al. May 2004 A1
20040109614 Enomoto et al. Jun 2004 A1
20040114796 Kaku Jun 2004 A1
20040114797 Meckes Jun 2004 A1
20040114829 LeFeuvre et al. Jun 2004 A1
20040114904 Sun et al. Jun 2004 A1
20040119851 Kaku Jun 2004 A1
20040120598 Feng Jun 2004 A1
20040125387 Nagao et al. Jul 2004 A1
20040126086 Nakamura et al. Jul 2004 A1
20040141657 Jarman Jul 2004 A1
20040150743 Schinner Aug 2004 A1
20040160517 Iida Aug 2004 A1
20040165215 Raguet et al. Aug 2004 A1
20040184044 Kolb et al. Sep 2004 A1
20040184670 Jarman et al. Sep 2004 A1
20040196292 Okamura Oct 2004 A1
20040196503 Kurtenbach et al. Oct 2004 A1
20040213476 Luo et al. Oct 2004 A1
20040223063 DeLuca et al. Nov 2004 A1
20040227978 Enomoto Nov 2004 A1
20040228542 Zhang et al. Nov 2004 A1
20040233299 Ioffe et al. Nov 2004 A1
20040233301 Nakata et al. Nov 2004 A1
20040234156 Watanabe et al. Nov 2004 A1
20040239779 Washisu Dec 2004 A1
20040240747 Jarman et al. Dec 2004 A1
20040258308 Sadovsky et al. Dec 2004 A1
20050001024 Kusaka et al. Jan 2005 A1
20050013602 Ogawa Jan 2005 A1
20050013603 Ichimasa Jan 2005 A1
20050024498 Iida et al. Feb 2005 A1
20050031224 Prilutsky et al. Feb 2005 A1
20050041121 Steinberg et al. Feb 2005 A1
20050046730 Li Mar 2005 A1
20050047655 Luo et al. Mar 2005 A1
20050047656 Luo et al. Mar 2005 A1
20050053279 Chen et al. Mar 2005 A1
20050058340 Chen et al. Mar 2005 A1
20050058342 Chen et al. Mar 2005 A1
20050062856 Matsushita Mar 2005 A1
20050063083 Dart et al. Mar 2005 A1
20050068452 Steinberg et al. Mar 2005 A1
20050074164 Yonaha Apr 2005 A1
20050074179 Wilensky Apr 2005 A1
20050078191 Battles Apr 2005 A1
20050117132 Agostinelli Jun 2005 A1
20050129331 Kakiuchi et al. Jun 2005 A1
20050134719 Beck Jun 2005 A1
20050140801 Prilutsky et al. Jun 2005 A1
20050147278 Rui et al. Jul 2005 A1
20050151943 Iida Jul 2005 A1
20050163498 Battles et al. Jul 2005 A1
20050168965 Yoshida Aug 2005 A1
20050196067 Gallagher et al. Sep 2005 A1
20050200736 Ito Sep 2005 A1
20050207649 Enomoto et al. Sep 2005 A1
20050212955 Craig et al. Sep 2005 A1
20050219385 Terakawa Oct 2005 A1
20050219608 Wada Oct 2005 A1
20050220346 Akahori Oct 2005 A1
20050220347 Enomoto et al. Oct 2005 A1
20050226499 Terakawa Oct 2005 A1
20050232490 Itagaki et al. Oct 2005 A1
20050238217 Enomoto et al. Oct 2005 A1
20050238230 Yoshida Oct 2005 A1
20050243348 Yonaha Nov 2005 A1
20050275734 Ikeda Dec 2005 A1
20050276481 Enomoto Dec 2005 A1
20050280717 Sugimoto Dec 2005 A1
20050286766 Ferman Dec 2005 A1
20060008171 Petschnigg et al. Jan 2006 A1
20060017825 Thakur Jan 2006 A1
20060038916 Knoedgen et al. Feb 2006 A1
20060039690 Steinberg et al. Feb 2006 A1
20060045352 Gallagher Mar 2006 A1
20060050300 Mitani et al. Mar 2006 A1
20060066628 Brodie et al. Mar 2006 A1
20060082847 Sugimoto Apr 2006 A1
20060093212 Steinberg et al. May 2006 A1
20060093213 Steinberg et al. May 2006 A1
20060093238 Steinberg et al. May 2006 A1
20060098867 Gallagher May 2006 A1
20060098875 Sugimoto May 2006 A1
20060119832 Iida Jun 2006 A1
20060120599 Steinberg et al. Jun 2006 A1
20060126938 Lee et al. Jun 2006 A1
20060140455 Costache et al. Jun 2006 A1
20060150089 Jensen et al. Jul 2006 A1
20060203108 Steinberg et al. Sep 2006 A1
20060204052 Yokouchi Sep 2006 A1
20060204110 Steinberg et al. Sep 2006 A1
20060221408 Fukuda Oct 2006 A1
20060280361 Umeda Dec 2006 A1
20060280375 Dalton et al. Dec 2006 A1
20060285754 Steinberg et al. Dec 2006 A1
20070098260 Yen et al. May 2007 A1
20070110305 Corcoran et al. May 2007 A1
20070116379 Corcoran et al. May 2007 A1
20070116380 Ciuc et al. May 2007 A1
20070133863 Sakai et al. Jun 2007 A1
20070154189 Harradine et al. Jul 2007 A1
20070201724 Steinberg et al. Aug 2007 A1
20070263104 DeLuca et al. Nov 2007 A1
20070263928 Akahori Nov 2007 A1
20080002060 DeLuca et al. Jan 2008 A1
20080013798 Ionita et al. Jan 2008 A1
20080043121 Prilutsky et al. Feb 2008 A1
20080112599 Capata et al. May 2008 A1
20080144965 Steinberg et al. Jun 2008 A1
20080186389 DeLuca et al. Aug 2008 A1
20080211937 Steinberg et al. Sep 2008 A1
20080219518 Steinberg et al. Sep 2008 A1
20080232711 Prilutsky et al. Sep 2008 A1
20080240555 Nanu et al. Oct 2008 A1
20110222730 Steinberg et al. Sep 2011 A1
Foreign Referenced Citations (59)
Number Date Country
884694 Dec 1998 EP
911759 Apr 1999 EP
911759 Jun 2000 EP
1199672 Apr 2002 EP
1229486 Aug 2002 EP
1288858 Mar 2003 EP
1288859 Mar 2003 EP
1288860 Mar 2003 EP
1293933 Mar 2003 EP
1296510 Mar 2003 EP
1429290 Jun 2004 EP
1478169 Nov 2004 EP
1528509 May 2005 EP
979487 Mar 2006 EP
1429290 Jul 2008 EP
2227002 Sep 2008 EP
2165523 Apr 2011 EP
841609 Jul 1960 GB
3-205989 Sep 1991 JP
4192681 Jul 1992 JP
5224271 Sep 1993 JP
7-281285 Oct 1995 JP
7281285 Oct 1995 JP
9214839 Aug 1997 JP
20134486 May 2000 JP
22247596 Aug 2002 JP
22271808 Sep 2002 JP
2003-030647 Jan 2003 JP
WO-9802844 Jan 1998 WO
WO-9917254 Apr 1999 WO
WO-9933684 Jul 1999 WO
WO-0171421 Sep 2001 WO
WO-0192614 Dec 2001 WO
WO-0245003 Jun 2002 WO
WO03019473 Mar 2003 WO
WO-03026278 Mar 2003 WO
WO-03071484 Aug 2003 WO
WO-2004034696 Apr 2004 WO
WO-2005015896 Feb 2005 WO
WO-2005041558 Feb 2005 WO
WO-2005072617 Aug 2005 WO
WO-2005076217 Aug 2005 WO
WO-2005087994 Sep 2005 WO
WO2005076217 Oct 2005 WO
WO-2005109853 Nov 2005 WO
WO-2006011635 Feb 2006 WO
WO-2006018056 Feb 2006 WO
WO2005076217 Apr 2006 WO
WO-2006045441 May 2006 WO
WO-2007057063 May 2007 WO
WO-2007057064 May 2007 WO
WO-2007093199 Aug 2007 WO
WO-2007095553 Aug 2007 WO
WO-2007142621 Dec 2007 WO
WO-2008023280 Feb 2008 WO
WO2008109708 Sep 2008 WO
WO-2008109644 Sep 2008 WO
WO-2010017953 Feb 2010 WO
WO-2010025908 Mar 2010 WO
Non-Patent Literature Citations (105)
Entry
Viola et al., “Robust Real-time Object Detection”, Second International Workshop on Statistical and Computational Theories of Vision, Vancouver, Jul. 2001.
Combier, Nathalie et al., “Removal of Defects on Flash Radiographic Images by Fuzzy Combination, Conference: Machine Vision Applications in Industrial Inspection III, http://rlinks2.dialog.com/NASApp/ChannelWEB/DialogProServlet?ChName=engineering”, Proceedings of SPIE—The International Society for Optical Engineering. Society of Photo-Optical Instrumentation. 1995, pp. 301-312.
Corcoran, P. et al., “Automated In-Camera Detection of Flash-Eye Defects”, IEEE Transactions on Consumer Electronics, 2005, pp. 11-17, vol. 51—Issue 1.
Cucchiara, R. et al., “Detection of Luminosity Profiles of Elongated Shapes”, International Conference on Image Processing, 1996, pp. 635-638, vol. 3.
EPO Communication pursuant to Article 94(3) EPC, for European Patent Application No. 05 792 584.4, paper dated May 13, 2008, 8 pages.
European Patent Office, Communication pursuant to Article 94(3) EPC for Application No. 04763763.2, dated Mar. 7, 2008, 7 pages.
European Patent Office, Communication pursuant to Article 96(2) EPC for Application No. 04763763.2, dated Aug. 29, 2006, 4 pages.
Examination Report for European patent application No. 05792584.4, dated May 13, 2008, 8 pgs.
Gaubatz, Matthew et al., “Automatic Red-Eye Detection and Correction”, IEEE ICIP, Proceedings 2002 Intl Conference on Image Proc, 2002, pp. 1-804-1-807, vol. 2—Issue 3.
Han, T. et al., “Detection and Correction of abnormal Pixels in Hyperion Images”, IEEE International Symposium on Geoscience and Remote Sensing, 2002. pp. 1327-1330. vol. 3.
Iivarinen, J. et al., “Content-Based Retrieval of Defect Images”, http://www.cs.tut.fi/~avisa/digger/Publications/acivs02.pdf, Proceedings of Advanced Concepts for Intelligent Vision Systems, Laboratory of Computer Information Science, 2002.
Ioffe, S., “Red eye detection with machine learning”, Proceedings 2003 International Conference on Image Processing, 2003, pp. 871-874. vol. 2—Issue 3.
Ito, M., “An Automated System for LSI Fine Pattern Inspection Based on Comparison of Sem Images and Cad Data”, IEEE International Conference on Robotics and Automation, 1995, pp. 544-549, vol. 1.
Jin, B. et al., “Modeling and Analysis of Soft-Test/Repair for CCD-Based Digital X-Ray Systems”, Instrumentation and Measurement, IEEE Trans, 2003, pp. 1713-1721, vol. 52—Issue 6.
Nguyen, Karlene et al., “Differences in the Infrared Bright Pupil Response of Human Eyes”, Proceedings of the 2002 symposium on Eye tracking research and applications, 2002, pp. 133-138.
Patent Abstracts of Japan, publication No. 2000050062, Image Input Device, application No. 10-217124, published Feb. 18, 2000, 1 page.
PCT International Preliminary Report on Patentability (IPRP) for PCT Application PCT/EP2005/011010, dated Jan. 23, 2007, 18 pages.
PCT International Preliminary Report on Patentability for PCT Application No. PCT/EP2005/005907, dated Nov. 15, 2006, 8 pages.
PCT International Preliminary Report on Patentability for PCT Application PCT/EP2004/008706, dated Feb. 6, 2006, 7 pages.
PCT International Preliminary Report on Patentability for PCT Application PCT/EP2004/010199, dated Apr. 3, 2006, 7 pages.
PCT International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, for PCT Application No. PCT/US2008/055864, dated Jul. 30, 2008, 8 pages.
PCT International Search Report and Written Opinion of the International Searching Authority for PCT Application No. PCT/EP2004/008706, dated Nov. 19, 2004, 13 pages.
PCT International Search Report and Written Opinion of the International Searching Authority for PCT Application No. PCT/EP2005/005033.
PCT Notification Concerning Transmittal of International Preliminary Report on Patentability, for PCT Application No. PCT/US2007/062090, dated Aug. 28, 2008, 6 pages.
PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration (PCT/EP2006/008342), dated Dec. 28, 2006.
PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for PCT Application No. PCT/US07/62090 issued Mar. 10, 2008, 10 pages.
PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for PCT/EP2005/011010, dated Jan. 23, 2006, 14 pages.
PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration for PCT/EP2005/005907, dated Aug. 1, 2005, 12 pages.
PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, for PCT Application No. PCT/EP2006/008358, Dec. 5, 2006, 14 pages.
PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, for PCT Application No. PCT/US2008/055964, paper dated Jul. 30, 2008, 8 Pages.
PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, for PCT/EP2004/010199, paper dated Dec. 13, 2004, 13 pages.
PCT Notification of Transmittal of the International Search Report and Written Opinion of the International Searching Authority for PCT Application No. PCT/EP2005/001171, (11 pages).
Plotnikov, Yuri et al., “Advanced Image Processing for Defect Visualization in Infrared Thermography, http://citeseer.ist.psu.edu/ plotnikov98advanced.html”, NASA Langley Research Center, M.S. Posted: ACM Portal, 1998.
Plotnikov, Yuri et al., Winfree, “Visualization of Subsurface Defects in Composites Using a Focal Plane Array Infrared Camera . http://citeseer.ist.psu.edu/357066.html”, NASA Langley Research Center, 1999.
Sahba, F. et al., “Filter Fusion for Image Enhancement Using Reinforcement Learning”, XP010654204, ISBN: 0-7803-7781-8, Canadian Conference on Electrical and Computer Engineering, 2003, pp. 847-850, vol. 3.
Shen, Jianhong, “Inpainting and the Fundamental Problem of image Processing”, 2002, 6 pages.
Smolka, B. et al., “Towards Automatic Redeye Effect Removal”, XP004416063, Pattern Recognition Letters, 2003, pp. 1767-1785, vol. 24—Issue 11, North-Holland Publ.
Soriano, M. et al., “Making Saturated Facial Images Useful Again, XP002325961, ISSN: 0277-786X”, Proceedings of the Spie, 1999, pp. 113-121, vol. 3826.
Tan, Yap-peng et al., “Robust Sequential Approach for the Detection of Defective Pixels in an Image Sensor”, IEEE International Conference on Acoustics, Speech, and Signal Processing, 1999, pp. 2239-2242, vol. 4.
Toet, A., “Multiscale Color Image Enhancement”, International Conference on Image Processing and its Applications, 1992, pp. 583-585.
U.S. Appl. No. 10/772,767, filed Feb. 4, 2004, by invs Michael J. DeLuca, et al.
U.S. Appl. No. 10/170,511, filed Jun. 12, 2002, inventor Michael J. DeLuca.
U.S. Appl. No. 11/217,788, filed Aug. 30, 2005, inventors Eran Steinberg, et al.
United Kingdom Search Report dated May 22, 2007, issued in Application No. GB 0701957.3.
Willamowski. J. et al., “Probabilistic Automatic Red Eye Detection and Correction”, The 18th International Conference on Pattern Recognition (ICPR'06), 2006, pp. 762-765, vol. 3, IEEE Computer Society.
PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, for PCT Application No. PCT/EP2009/006361, dated Nov. 24, 2009, 10 pages.
PCT Invitation to Pay Additional Fees for Application No. PCT/EP2009/051081, dated Apr. 29, 2009, 7 pages.
Agrawal A. et al., “Removing photography artifacts using gradient projection and flash-exposure sampling” ACM Transactions on Graphics , 2005, pp. 828-835.
Final Office Action mailed Apr. 26, 2010, for U.S. Appl. No. 10/773,092, filed Feb. 4, 2004.
Final Office Action mailed Apr. 26, 2010, for U.S. Appl. No. 11/690,834, filed Mar. 25, 2007.
Final Office Action mailed Apr. 26, 2010, for U.S. Appl. No. 11/772,427, filed Feb. 2, 2007.
Final Office Action mailed Apr. 26, 2010, for U.S. Appl. No. 12/035,416, filed Feb. 21, 2008.
Final Office Action mailed Mar. 24, 2010, for U.S. Appl. No. 11/462,035, filed Aug. 2, 2006.
Final Office Action mailed Nov. 9, 2010, for U.S. Appl. No. 11/462,035, filed Aug. 2, 2006.
Final Office Action mailed Nov. 20, 2009, for U.S. Appl. No. 12/192,897, filed Aug. 15, 2008.
Final Office Action mailed Sep. 1, 2009, for U.S. Appl. No. 11/841,855, filed Aug. 20, 2007.
Non-Final Office Action mailed Aug. 30, 2010, for U.S. Appl. No. 11/841,855, filed Aug. 20, 2007.
Non-Final Office Action mailed Aug. 31, 2009, for U.S. Appl. No. 11/462,035, filed Aug. 2, 2006.
Non-Final Office Action mailed Aug. 5, 2010, for U.S. Appl. No. 11/462,035, filed Aug. 2, 2006.
Non-Final Office Action mailed Jul. 14, 2009, for U.S. Appl. No. 12/192,897, filed Aug. 15, 2008.
Non-Final Office Action mailed May 3, 2010, for U.S. Appl. No. 12/187,763, filed Aug. 7, 2008.
Non-Final Office Action mailed May 4, 2010, for U.S. Appl. No. 12/192,335, filed Aug. 15, 2008.
Non-Final Office Action mailed Oct. 5, 2009, for U.S. Appl. No. 10/919,226, filed Aug. 16, 2004.
Non-Final Office Action mailed Oct. 7, 2009, for U.S. Appl. No. 12/119,614, filed May 13, 2008.
Non-Final Office Action mailed Oct. 29, 2009, for U.S. Appl. No. 12/194,148, filed Aug. 19, 2008.
Non-Final Office Action mailed Sep. 17, 2010, for U.S. Appl. No. 11/690,834, filed Mar. 25, 2007.
Non-Final Office Action mailed Sep. 21, 2010, for U.S. Appl. No. 10/773,092, filed Feb. 4, 2004.
Notice of Allowance mailed Dec. 10, 2010, for U.S. Appl. No. 11/462,035, filed Aug. 2, 2006.
Notice of Allowance mailed Feb. 1, 2010, for U.S. Appl. No. 10/919,226, filed Aug. 16, 2004.
Notice of Allowance mailed Feb. 19, 2010, for U.S. Appl. No. 12/119,614, filed May 13, 2008.
Notice of Allowance mailed Feb. 19, 2010, for U.S. Appl. No. 12/194,148, filed Aug. 19, 2008.
Notice of Allowance mailed Jun. 27, 2010, for U.S. Appl. No. 12/192,897, filed Aug. 15, 2008.
Notice of Allowance mailed Nov. 18, 2009, for U.S. Appl. No. 11/282,954, filed Nov. 18, 2005.
Notice of Allowance mailed Oct. 15, 2010, for U.S. Appl. No. 11/554,539, filed Oct. 30, 2006.
Notice of Allowance mailed Oct. 22, 2010, for U.S. Appl. No. 12/187,763, filed Aug. 7, 2008.
Notice of Allowance mailed Oct. 28, 2010, for U.S. Appl. No. 12/192,335, filed Aug. 15, 2008.
Notice of Allowance mailed Oct. 28, 2010, for U.S. Appl. No. 11/690,834, filed Mar. 25, 2007.
PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, for PCT Application No. PCT/EP2009/005809, dated Nov. 24, 2009, 12 pages.
PCT Notification of Transmittal of the International Search Report and the Written Opinion of the International Searching Authority, or the Declaration, for PCT Application No. PCT/EP2009/006361, dated Nov. 11, 2009, 10 pages.
PCT Partial International Search Report for Application No. PCT/EP2009/051081, dated Apr. 29, 2009, 7 pages.
Tatsutoshi Kitajima (JP04-192681 English Translation; Electronic Camera, Jul. 10, 1992).
Translation of Hiroshi et al. JP05-224271, Mar. 1993, Japan Publication.
Final Office Action mailed Feb. 1, 2011, for U.S. Appl. No. 10/773,092, filed Feb. 4, 2004.
Corinne Vachier, Luc Vincent, Valuation of Image Extrema Using Alternating Filters by Reconstruction, Proceedings of the SPIE—The International Society for Optical Engineering, 1995, vol. 2568, pp. 94-103.
EPO Communication pursuant to Article 94(3) EPC, for European patent application No. 05707215.9, report dated Sep. 14, 2010, 11 Pages.
EPO Communication under Rule 71(3) EPC, for European patent application No. 09706058.6, report dated Oct. 4, 2010, 6 Pages.
EPO Extended European Search Report, for European application No. 10164430.0, dated Sep. 6, 2010, including the extended European search report, pursuant to Rule 62 EPC, the European search report (R. 61 EPC) or the partial European search report/declaration of no search (R. 63 EPC) and the European search opinion, 8 pages.
PCT Written Opinion of the International Searching Authority, for PCT Application No. PCT/US2008/055964, dated Jul. 24, 2008, 5 pages.
PCT International Preliminary Report on Patentability for PCT Application No. PCT/US2008/055964, dated Sep. 8, 2009, 6 pages.
Cuiping Zhang and Fernand S. Cohen, Component-Based Active Appearance Models for Face Modelling, D. Zhang and A.K. Jain (Eds.): ICB 2006, LNCS 3832, pp. 206-212, 2005, Springer-Verlag Berlin Heidelberg 2005.
Fundus Photograph Reading Center—Modified 3-Standard Field Color Fundus Photography and Fluorescein Angiography Procedure, Retrieved from the Internet on Oct. 19, 2011, URL: http://eyephoto.ophth.wisc.edu/Photography/Protocols/mod3-ver1.4.html, 3 Pages.
Anatomy of the Eye, Retrieved from the Internet on Oct. 19, 2011, URL: http://www.stlukeseye.com/anatomy, 3 pages.
Fovea centralis, Retrieved from the Internet on Oct. 19, 2011, URL: http://en.wikipedia.org/wiki/Fovea, 4 pages.
Non-Final Office Action mailed Apr. 28, 2011, for U.S. Appl. No. 11/936,085, filed Nov. 7, 2007.
Non-Final Office Action mailed Apr. 28, 2011, for U.S. Appl. No. 11/937,377, filed Nov. 8, 2007.
Non-Final Office Action mailed Mar. 31, 2011, for U.S. Appl. No. 12/551,312, filed Aug. 31, 2009.
Non-Final Office Action mailed May 2, 2011, for U.S. Appl. No. 12/824,214, filed Jun. 27, 2010.
Notice of Allowance mailed Feb. 4, 2011, for U.S. Appl. No. 12/611,387, filed Nov. 3, 2009.
Notice of Allowance mailed Mar. 3, 2011, for U.S. Appl. No. 12/543,405, filed Aug. 18, 2009.
Final Office Action mailed Feb. 16, 2011, for U.S. Appl. No. 12/543,405, filed Aug. 18, 2009.
Final Office Action mailed Jan. 5, 2011, for U.S. Appl. No. 12/611,387, filed Nov. 3, 2009.
Notice of Allowance mailed May 12, 2011, for U.S. Appl. No. 12/043,025, filed Mar. 5, 2008.
Final Office Action mailed Feb. 2, 2011, for U.S. Appl. No. 12/613,457, filed Nov. 5, 2009.
Notice of Allowance mailed Mar. 17, 2011, for U.S. Appl. No. 12/042,335, filed Mar. 5, 2008.
Patent Abstracts of Japan, for Publication No. JP2002-247596, published Aug. 30, 2002, (Appl. No. 2001-044807), Program For Specifying Red Eye Area in Image, Image Processor and Recording Medium. 1 Page.
Related Publications (1)
Number Date Country
20090080797 A1 Mar 2009 US