Red-eye filter method and apparatus

Information

  • Patent Grant
  • Patent Number
    7,916,190
  • Date Filed
    Tuesday, November 3, 2009
  • Date Issued
    Tuesday, March 29, 2011
Abstract
A digital camera has an integral flash and stores and displays a digital image. Under certain conditions, a flash photograph taken with the camera may result in a red-eye phenomenon due to a reflection within an eye of a subject of the photograph. The digital camera has a red-eye filter which analyzes the stored image for the red-eye phenomenon and modifies the stored image to eliminate the red-eye phenomenon by changing the red area to black. The modification of the image is enabled when a photograph is taken under conditions indicative of the red-eye phenomenon. The modification is subject to anti-falsing analysis which further examines the area around the red-eye area for indicia of the eye of the subject.
Description
FIELD OF THE INVENTION

The invention relates generally to the area of flash photography, and more specifically to filtering “red-eye” from a digital camera image.


BACKGROUND OF THE INVENTION

“Red-eye” is a phenomenon in flash photography where a flash is reflected within a subject's eye and appears in a photograph as a red dot where the black pupil of the subject's eye would normally appear. The unnatural glowing red of an eye is due to internal reflections from the vascular membrane behind the retina, which is rich in blood vessels. This objectionable phenomenon is well understood to be caused in part by a small angle between the flash of the camera and the lens of the camera. This angle has decreased with the miniaturization of cameras with integral flash capabilities. Additional contributors include the relative closeness of the subject to the camera and ambient light levels.


The red-eye phenomenon can be minimized by causing the iris to reduce the opening of the pupil. This is typically done with a “pre-flash”, a flash or illumination of light shortly before a flash photograph is taken, which causes the iris to close. Unfortunately, the pre-flash occurs an objectionable 0.2 to 0.6 seconds prior to the flash photograph. This delay is readily discernible and easily within the reaction time of a human subject. Consequently, the subject may believe the pre-flash is the actual photograph and be in a less than desirable position at the time of the actual photograph. Alternately, the subject must be informed of the pre-flash, typically losing any spontaneity of the subject captured in the photograph.


Those familiar with the art have developed complex analysis processes operating within a camera prior to invoking a pre-flash. Various conditions are monitored before the pre-flash is generated; the conditions include the ambient light level and the distance of the subject from the camera. Such a system is described in U.S. Pat. No. 5,070,355 to Inoue et al. Although that invention minimizes the occurrences where a pre-flash is used, it does not eliminate the need for a pre-flash. What is needed is a method of eliminating the red-eye phenomenon with a miniature camera having an integral flash without the distraction of a pre-flash.


Digital cameras are becoming more popular and smaller in size. Digital cameras have several advantages over film cameras. Digital cameras eliminate the need for film as the image is digitally captured and stored in a memory array for display on a display screen on the camera itself. This allows photographs to be viewed and enjoyed virtually instantaneously as opposed to waiting for film processing. Furthermore, the digitally captured image may be downloaded to another display device such as a personal computer or color printer for further enhanced viewing. Digital cameras include microprocessors for image processing and compression and for camera system control. Nevertheless, without a pre-flash, both digital and film cameras can capture the red-eye phenomenon as the flash reflects within a subject's eye. Thus, what is needed is a method of eliminating red-eye phenomenon within a miniature digital camera having a flash without the distraction of a pre-flash.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows a block diagram of a camera apparatus operating in accordance with the present invention.



FIG. 2 shows a pixel grid upon which an image of an eye is focused.



FIG. 3 shows pixel coordinates of the pupil of FIG. 2.



FIG. 4 shows pixel coordinates of the iris of FIG. 2.



FIG. 5 shows pixel coordinates which contain a combination of iris and pupil colors of FIG. 2.



FIG. 6 shows pixel coordinates of the white eye area of FIG. 2.



FIG. 7 shows pixel coordinates of the eyebrow area of FIG. 2.



FIG. 8 shows a flow chart of a method operating in accordance with the present invention.



FIG. 9 shows a flow chart for testing if conditions indicate the possibility of a red-eye phenomenon photograph.



FIG. 10 shows a flow chart for testing if conditions indicate a false red-eye grouping.





DESCRIPTION OF A PREFERRED EMBODIMENT


FIG. 1 shows a block diagram of a camera apparatus operating in accordance with the present invention. The camera 20 includes an exposure control 30 that, in response to a user input, initiates and controls the digital photographic process. Ambient light is determined using light sensor 40 in order to automatically determine if a flash is to be used. The distance to the subject is determined using focusing means 50, which also focuses the image on image capture means 60. The image capture means digitally records the image in color. The image capture means is known to those familiar with the art and may include a CCD (charge coupled device) to facilitate digital recording. If a flash is to be used, exposure control means 30 causes the flash means 70 to generate a photographic flash in substantial coincidence with the recording of the image by image capture means 60. The flash may be selectively generated either in response to the light sensor 40 or a manual input from the user of the camera. The image recorded by image capture means 60 is stored in image store means 80, which may comprise computer memory such as a dynamic random access memory or a nonvolatile memory. The red-eye filter 90 then analyzes the stored image for characteristics of red-eye, and if found, modifies the image and removes the red-eye phenomenon from the photograph, as will be described in more detail below. The red-eye filter includes a pixel locator 92 for locating pixels having a color indicative of red-eye; a shape analyzer 94 for determining if a grouping of at least a portion of the pixels located by the pixel locator comprises a shape indicative of red-eye; a pixel modifier 96 for modifying the color of pixels within the grouping; and a falsing analyzer 98 for further processing the image around the grouping for details indicative of an image of an eye. The modified image may be either displayed on image display 100 or downloaded to another display device, such as a personal computer or printer, via image output means 110. It can be appreciated that many of the processes implemented in the digital camera may be implemented in or controlled by software operating in a microcomputer (µC) or digital signal processor (DSP) and/or an application specific integrated circuit (ASIC).
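
The division of labor among elements 92-98 can be illustrated with a minimal Python sketch. The class and method names below are assumptions introduced for illustration; only the four roles (pixel locator 92, shape analyzer 94, pixel modifier 96 and falsing analyzer 98) come from the description of FIG. 1, and concrete sketches of each role follow later in the text.

```python
# Illustrative skeleton of red-eye filter 90 of FIG. 1. Names are assumptions;
# the four methods mirror pixel locator 92, shape analyzer 94, pixel modifier 96
# and falsing analyzer 98. Concrete sketches of each appear later in the text.
from typing import List, Tuple

Pixel = Tuple[int, int]                      # (x, y) coordinate in the stored image
Image = List[List[Tuple[int, int, int]]]     # image[y][x] = (r, g, b)


class RedEyeFilter:
    def locate_red_pixels(self, image: Image) -> List[Pixel]:
        """Pixel locator 92: pixels whose color is indicative of red-eye."""
        raise NotImplementedError

    def is_red_eye_shape(self, grouping: List[Pixel]) -> bool:
        """Shape analyzer 94: is the grouping substantially round or oval?"""
        raise NotImplementedError

    def is_false_grouping(self, image: Image, grouping: List[Pixel]) -> bool:
        """Falsing analyzer 98: examine the vicinity for indicia of an eye."""
        raise NotImplementedError

    def modify_grouping(self, image: Image, grouping: List[Pixel]) -> None:
        """Pixel modifier 96: change the grouping's color, preferably to black."""
        raise NotImplementedError
```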



FIG. 2 shows a pixel grid upon which an image of an eye is focused. Preferably the digital camera records an image comprising a grid of pixels at least 640 by 480. FIG. 2 shows a 24 by 12 pixel portion of the larger grid labeled columns A-X and rows 1-12 respectively.
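
For readers following the pixel labels used in FIGS. 2 through 7, the sketch below converts a column-letter/row-number label such as K7 into zero-based array indices. The helper is hypothetical and exists only to make the coordinate examples concrete.

```python
# Hypothetical helper for the coordinate labels of FIGS. 2-7,
# where columns are lettered A-X and rows are numbered 1-12.
def coord_to_index(label: str) -> tuple[int, int]:
    """Convert a label such as 'K7' into zero-based (column, row) indices."""
    column = ord(label[0].upper()) - ord('A')   # 'A' -> 0 ... 'X' -> 23
    row = int(label[1:]) - 1                    # '1' -> 0 ... '12' -> 11
    return column, row

assert coord_to_index("K7") == (10, 6)    # a pupil pixel of FIG. 3
assert coord_to_index("X12") == (23, 11)  # bottom-right corner of the example grid
```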



FIG. 3 shows pixel coordinates of the pupil of FIG. 2. The pupil is the darkened circular portion and substantially includes seventeen pixels: K7, K8, L6, L7, L8, L9, M5, M6, M7, M8, M9, N6, N7, N8, N9, O7 and O8, as indicated by shaded squares at the aforementioned coordinates. In a non-flash photograph, these pupil pixels would be substantially black in color. In a red-eye photograph, these pixels would be substantially red in color. It should be noted that the aforementioned pupil pixels have a shape indicative of the pupil of the subject, the shape preferably being a substantially circular, semi-circular or oval grouping of pixels. Locating a group of substantially red pixels forming a substantially circular or oval area is therefore useful to the red-eye filter.



FIG. 4 shows pixel coordinates of the iris of FIG. 2. The iris pixels are substantially adjacent to the pupil pixels of FIG. 2. Iris pixels J5, J6, J7, J8, J9, K5, K10, L10, M10, N10, O5, O10, P5, P6, P7, P8 and P9 are indicated by shaded squares at the aforementioned coordinates. The iris pixels substantially surround the pupil pixels and may be used as further indicia of a pupil. In a typical subject, the iris pixels will have a substantially constant color. However, the color will vary as the natural color of the eyes of each individual subject varies. The existence of iris pixels depends upon the size of the iris at the time of the photograph; if the pupil is very large, iris pixels may not be present.



FIG. 5 shows pixel coordinates which include a combination of iris and pupil colors of FIG. 2. The pupil/iris pixels are located at K6, K9, L5, N5, O6, and O9, as indicated by shaded squares at the aforementioned coordinates. The pupil/iris pixels are adjacent to the pupil pixels, and also adjacent to any iris pixels which may be present. Pupil/iris pixels may also contain colors of other areas of the subject's eyes including skin tones and white areas of the eye.



FIG. 6 shows pixel coordinates of the white eye area of FIG. 2. The seventy-one pixels are indicated by the shaded squares of FIG. 6, are substantially white in color, and are in the vicinity of and substantially surround the pupil pixels of FIG. 2.



FIG. 7 shows pixel coordinates of the eyebrow area of FIG. 2. The eyebrow pixels are indicated by the shaded squares of FIG. 7 and substantially form a continuous line in the vicinity of the pupil pixels. The color of the line will vary as the natural color of the eyebrow of each individual subject varies. Furthermore, some subjects may have no visible eyebrow at all.


It should be appreciated that the representations of FIG. 2 through FIG. 7 are particular to the example shown. The coordinates of pixels and actual number of pixels comprising the image of an eye will vary depending upon a number of variables. These variables include the location of the subject within the photograph, the distance between the subject and the camera, and the pixel density of the camera.


The red-eye filter 90 of FIG. 1 searches the digitally stored image for pixels having a substantially red color, then determines if the grouping has round or oval characteristics, similar to the pixels of FIG. 3. If found, the color of the grouping is modified. In the preferred embodiment, the color is modified to black.
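
A minimal sketch of this search-and-shape test follows, assuming an 8-bit RGB image stored as nested lists. The red threshold, the flood fill, and the bounding-box shape heuristic are illustrative assumptions; the description requires only that substantially red pixels be located and that the grouping have round or oval characteristics.

```python
# Sketch of the red-pixel search and shape test. Thresholds and the
# flood-fill/bounding-box heuristics are assumptions, not the patent's method.
from collections import deque

def is_red(pixel, red_min=150, other_max=100):
    """Heuristic for a 'substantially red' 8-bit RGB pixel (assumed thresholds)."""
    r, g, b = pixel
    return r >= red_min and g <= other_max and b <= other_max

def red_groupings(image):
    """Collect groupings of adjacent red pixels; image[y][x] = (r, g, b)."""
    rows, cols = len(image), len(image[0])
    seen, groups = set(), []
    for y in range(rows):
        for x in range(cols):
            if (x, y) in seen or not is_red(image[y][x]):
                continue
            group, queue = [], deque([(x, y)])
            seen.add((x, y))
            while queue:                          # flood fill one grouping
                cx, cy = queue.popleft()
                group.append((cx, cy))
                for nx, ny in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                    if 0 <= nx < cols and 0 <= ny < rows \
                            and (nx, ny) not in seen and is_red(image[ny][nx]):
                        seen.add((nx, ny))
                        queue.append((nx, ny))
            groups.append(group)
    return groups

def is_round_or_oval(group, min_fill=0.6, max_aspect=2.0):
    """Crude shape test: a circle or modest oval nearly fills its bounding box
    and has a modest aspect ratio; a long line of red pixels fails one or both."""
    xs = [x for x, _ in group]
    ys = [y for _, y in group]
    width, height = max(xs) - min(xs) + 1, max(ys) - min(ys) + 1
    aspect = max(width, height) / min(width, height)
    fill = len(group) / (width * height)
    return aspect <= max_aspect and fill >= min_fill
```

Applied to the example of FIG. 3, the seventeen pupil pixels fill roughly 0.68 of their 5×5 bounding box and pass this test, while a long streak of red pixels fails on aspect ratio or fill.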


Searching for a circular or oval grouping helps eliminate falsely modifying red pixels which are not due to the red-eye phenomenon. In the example of FIG. 2, the red-eye phenomenon is found in the 5×5 grouping of pixels of FIG. 3. In other examples, the grouping may contain substantially more or fewer pixels depending upon the actual number of pixels comprising the image of an eye, but the color and shape of the grouping will be similar. Thus, for example, a long line of red pixels will not be falsely modified because the shape is not substantially round or oval.


Additional tests may be used to avoid falsely modifying a round group of pixels having a color indicative of the red-eye phenomenon by further analysis of the pixels in the vicinity of the grouping. For example, in a red-eye phenomenon photograph, there will typically be no other similarly red pixels within a radius originating at the grouping, because the pupil is surrounded by components of the subject's face and the red-eye color is not normally found as a natural color on the face of the subject. Preferably the radius is large enough to analyze enough pixels to avoid falsing, yet small enough to exclude the other eye of the subject, which may also have the red-eye phenomenon. Preferably, the radius is within a range of two to five times the radius of the grouping. Other indicia of the recording may be used to validate the existence of red-eye, including identification of the iris pixels of FIG. 4 which surround the pupil pixels. The iris pixels will have a substantially common color, but the size and color of the iris will vary from subject to subject. Furthermore, the white area of the eye may be identified as a grouping of substantially white pixels in the vicinity of and substantially surrounding the pupil pixels as shown in FIG. 6. However, the location of the pupil within the opening of the eyelids is variable depending upon the orientation of the head of the subject at the time of the photograph. Consequently, identification of a number of substantially white pixels in the vicinity of the iris, without a requirement that they surround the grouping, will further validate the identification of the red-eye phenomenon and prevent false modification of other red pixel groupings. The number of substantially white pixels is preferably between two and twenty times the number of pixels in the pupil grouping. As a further validation, the eyebrow pixels of FIG. 7 can be identified.
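
The vicinity checks described above can be sketched as follows, reusing is_red() from the earlier search sketch. The white threshold and the particular radius factor of three are assumptions; the two-to-five-times radius range and the two-to-twenty-times white-pixel count come from the description.

```python
# Sketch of the anti-falsing vicinity checks. Reuses is_red() from the earlier
# sketch; the white threshold and exact radius factor are assumptions, while the
# 2x-5x radius range and 2x-20x white-pixel count come from the description.
import math

def is_white(pixel, white_min=200):
    """Heuristic for a 'substantially white' 8-bit RGB pixel (assumed threshold)."""
    return all(channel >= white_min for channel in pixel)

def grouping_center_and_radius(group):
    """Center of the grouping and its approximate radius in pixels."""
    cx = sum(x for x, _ in group) / len(group)
    cy = sum(y for _, y in group) / len(group)
    radius = max(math.hypot(x - cx, y - cy) for x, y in group) or 1.0
    return cx, cy, radius

def passes_falsing_checks(image, group, radius_factor=3.0, white_range=(2, 20)):
    """No stray red pixels within radius_factor times the grouping radius, and a
    plausible number of white pixels (the white of the eye) in the same vicinity."""
    rows, cols = len(image), len(image[0])
    cx, cy, radius = grouping_center_and_radius(group)
    group_set = set(group)
    stray_red, white_count = 0, 0
    for y in range(rows):
        for x in range(cols):
            if (x, y) in group_set or math.hypot(x - cx, y - cy) > radius_factor * radius:
                continue
            if is_red(image[y][x]):
                stray_red += 1
            elif is_white(image[y][x]):
                white_count += 1
    low, high = white_range
    return stray_red == 0 and low * len(group) <= white_count <= high * len(group)
```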


Further, additional criteria can be used to avoid falsely modifying a grouping of red pixels. The criteria include determining if the photographic conditions were indicative of the red-eye phenomenon. These include conditions known in the art, including use of a flash, ambient light levels and the distance of the subject. If the conditions indicate the red-eye phenomenon is not present, then red-eye filter 90 is not engaged.



FIG. 5 shows combination pupil/iris pixels which have color components of the red-eye phenomenon combined with color components of the iris or even the white area of the eye. The invention modifies these pixels by separating out the color components associated with red-eye, modifying the color of the separated components and then adding the modified color back to the pixel. Preferably the modified color is black. Replacing the red component with a black component makes for a more natural looking result. For example, if the iris is substantially green, a pupil/iris pixel will have components of red and green. The red-eye filter removes the red component and substitutes a black component, effectively resulting in a dark green pixel.
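
A sketch of this component substitution follows; the pixel representation is the same assumed 8-bit RGB triple used in the earlier sketches.

```python
def remove_red_component(pixel, black_level=0):
    """Separate out the red component of a pupil/iris boundary pixel and
    substitute black for it, leaving the other components unchanged."""
    r, g, b = pixel
    return (black_level, g, b)

# A boundary pixel mixing red-eye with a green iris becomes dark green.
assert remove_red_component((180, 120, 30)) == (0, 120, 30)
```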



FIG. 8 shows a flow chart of a method operating in accordance with the present invention. The red-eye filter process is in addition to other processes known to those skilled in the art which operate within the camera. These other processes include flash control, focus, and image recording, storage and display. The red-eye filter process preferably operates within software within a µC or DSP and processes an image stored in image store 80. The red-eye filter process is entered at step 200. At step 210, conditions are checked for the possibility of the red-eye phenomenon. These conditions are included in signals from exposure control means 30 which are communicated directly to the red-eye filter. Alternatively, the exposure control means may store the signals along with the digital image in image store 80. If conditions do not indicate the possibility of red-eye at step 210, then the process exits at step 215. Step 210 is further detailed in FIG. 9, and is an optional step which may be bypassed in an alternate embodiment. Then in step 220 the digital image is searched for pixels having a color indicative of red-eye. The groupings of red-eye pixels are then analyzed at step 230. Red-eye is determined if the shape of a grouping is indicative of the red-eye phenomenon. This step also accounts for multiple red-eye groupings in response to a subject having two red-eyes, or multiple subjects having red-eyes. If no groupings indicative of red-eye are found, then the process exits at step 215. Otherwise, false red-eye groupings are checked at optional step 240. Step 240 is further detailed in FIG. 10 and prevents the red-eye filter from falsely modifying red pixel groupings which do not have further indicia of the eye of a subject. After eliminating false groupings, if no groupings remain, the process exits at step 215. Otherwise step 250 modifies the color of the groupings which pass step 240, preferably substituting black for the red color within the grouping. Then in optional step 260, the pixels surrounding a red-eye grouping are analyzed for a red component. These are equivalent to the pixels of FIG. 5. The red-eye filter replaces the red component of these pixels with black. The process then exits at step 215.
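
The flow of FIG. 8 can be drawn together as a sketch that reuses the helpers introduced earlier (red_groupings, is_round_or_oval, passes_falsing_checks, remove_red_component) and the condition test sketched after FIG. 9 below. The step numbers in the comments follow the figure; the helper names remain illustrative assumptions.

```python
def boundary_pixels(group, rows, cols):
    """Pixels adjacent to the grouping: the mixed pupil/iris pixels of FIG. 5."""
    group_set = set(group)
    adjacent = set()
    for x, y in group:
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < cols and 0 <= ny < rows and (nx, ny) not in group_set:
                adjacent.add((nx, ny))
    return adjacent

def apply_red_eye_filter(image, exposure_signals):
    """Steps 210-260 of FIG. 8, assuming the helpers sketched in this text."""
    rows, cols = len(image), len(image[0])

    # Step 210 (optional): were the photographic conditions conducive to red-eye?
    if not conditions_indicate_red_eye(exposure_signals):
        return image                                          # step 215: exit

    # Steps 220-230: locate red pixels and keep groupings shaped like red-eye.
    groupings = [g for g in red_groupings(image) if is_round_or_oval(g)]

    # Step 240 (optional): discard groupings lacking further indicia of an eye.
    groupings = [g for g in groupings if passes_falsing_checks(image, g)]
    if not groupings:
        return image                                          # step 215: exit

    # Step 250: blacken each remaining grouping.
    # Step 260 (optional): strip the red component from surrounding pixels.
    for group in groupings:
        for x, y in group:
            image[y][x] = (0, 0, 0)
        for x, y in boundary_pixels(group, rows, cols):
            r, g, b = image[y][x]
            if r > max(g, b):          # assumed test for a noticeable red component
                image[y][x] = remove_red_component(image[y][x])
    return image                                              # step 215: exit
```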


It should be appreciated that the pixel color modification can be stored directly in the image store by replacing red-eye pixels with pixels modified by the red-eye filter. Alternately, the modified pixels can be stored as an overlay in the image store, thereby preserving the recorded image and only modifying the image when displayed in image display 100. Preferably the filtered image is communicated through image output means 110. Alternately, the unfiltered image with the overlay may be communicated through image output means 110 to an external device such as a personal computer capable of processing such information.
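
The two storage options described above, overwriting the stored pixels versus keeping the changes as an overlay applied at display time, can be sketched as follows; the class and field names are illustrative assumptions.

```python
# Sketch of the two storage options: replace the red-eye pixels directly in the
# image store, or keep them as an overlay applied only when the image is
# displayed or exported. Names are illustrative assumptions.
class StoredImage:
    def __init__(self, pixels):
        self.pixels = pixels          # recorded image, pixels[y][x] = (r, g, b)
        self.overlay = {}             # (x, y) -> modified color, empty if unused

    def modify_in_place(self, x, y, color):
        """Option 1: overwrite the recorded pixel in the image store."""
        self.pixels[y][x] = color

    def modify_as_overlay(self, x, y, color):
        """Option 2: preserve the recorded image; remember the change separately."""
        self.overlay[(x, y)] = color

    def for_display(self):
        """Apply any overlay when the image is shown on image display 100."""
        shown = [row[:] for row in self.pixels]
        for (x, y), color in self.overlay.items():
            shown[y][x] = color
        return shown
```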



FIG. 9 shows a flow chart for testing if conditions indicate the possibility of a red-eye phenomenon photograph, corresponding to step 210 of FIG. 8. Entered at step 300, step 310 checks if a flash was used in the photograph. If not, step 315 indicates that red-eye is not possible. Otherwise optional step 320 checks if a low level of ambient light was present at the time of the photograph. If not, step 315 indicates that red-eye is not possible. Otherwise optional step 330 checks if the subject was relatively close to the camera at the time of the photograph. If not, step 315 indicates that red-eye is not possible. Otherwise step 340 indicates that red-eye is possible.
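
A sketch of this condition test follows. The signal field names and the ambient-light and distance thresholds are assumptions about what exposure control means 30 reports; the flash, light, and distance checks, and the optional nature of the latter two, come from FIG. 9.

```python
def conditions_indicate_red_eye(signals, low_light=0.25, close_distance_m=3.0):
    """Condition test of FIG. 9. Field names and thresholds are assumptions."""
    if not signals.get("flash_used", False):                         # step 310
        return False                                                 # step 315
    if signals.get("ambient_light", 0.0) > low_light:                # optional step 320
        return False                                                 # step 315
    if signals.get("subject_distance_m", 0.0) > close_distance_m:    # optional step 330
        return False                                                 # step 315
    return True                                                      # step 340

# Example: flash fired, dim scene, subject close by -> red-eye is possible.
assert conditions_indicate_red_eye(
    {"flash_used": True, "ambient_light": 0.1, "subject_distance_m": 1.5})
```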



FIG. 10 shows a flow chart for testing if conditions indicate a false red-eye grouping, corresponding to step 240 of FIG. 8. Entered at step 400, step 410 checks if other red-eye pixels are found within a radius of a grouping. Preferably the radius is between two and five times the radius of the grouping. If found, step 415 indicates a false red-eye grouping. Otherwise step 420 checks if a substantially white area of pixels is found in the vicinity of the grouping. This area is indicative of the white area of a subject's eye and preferably has between two and twenty times the number of pixels in the grouping. If not found, step 415 indicates a false red-eye grouping. Otherwise step 430 searches the vicinity of the grouping for an iris ring or an eyebrow line. If not found, step 415 indicates a false red-eye grouping. Otherwise step 440 indicates the red-eye grouping is not false. It should be appreciated that each of the tests 410, 420 and 430 checks for a false red-eye grouping. In alternate embodiments, other tests may be used to prevent false modification of the image, or the tests of FIG. 10 may be used either alone or in combination.


It should be further appreciated that either the red-eye condition test 210 or the red-eye falsing test 240 of FIG. 8 may be used alone to achieve satisfactory results. In an alternate embodiment, test 240 may be sufficient on its own to eliminate test 210, or vice versa. Alternately, the selectivity of either the color and/or grouping analysis of the red-eye phenomenon may be sufficient to eliminate both tests 210 and 240 of FIG. 8. Furthermore, the color red as used herein means the range of colors, hues and brightnesses indicative of the red-eye phenomenon, and the color white as used herein means the range of colors, hues and brightnesses indicative of the white area of the human eye.


Thus, what has been provided is a method and apparatus for eliminating red-eye phenomenon within a miniature digital camera having a flash without the distraction of a pre-flash.

Claims
  • 1. A portable digital camera having no photographic film comprising: an integral flash for providing illumination during recording of an image without photographic film; a digital image capturing apparatus for recording said image; and a red-eye filter for modifying an area within said image indicative of a red-eye phenomenon and thereby generating a modified image that is stored on the camera as a modified image.
  • 2. The camera according to claim 1, further comprising an integral image display for displaying the modified image.
  • 3. The camera according to claim 1, wherein the area has a color and shape indicative of the red-eye phenomenon and the image is modified to change the color to a black color and further wherein: said integral flash selectively provides illumination during image recording; and said red-eye filter is enabled to modify the image in response to said integral flash providing illumination during image recording.
  • 4. The camera according to claim 3 further comprising an exposure control means for determining if the image was recorded in a condition conducive to the red-eye phenomenon and for generating a red-eye signal in response thereto, wherein said red-eye filter is further enabled in response to the red-eye signal.
  • 5. The camera according to claim 1, wherein said red-eye filter further includes a falsing avoidance apparatus which enables modification of the area in response to an absence of color indicative of the red-eye phenomenon within a vicinity of and exclusive to the area.
  • 6. The camera according to claim 1, wherein said red-eye filter further includes a falsing avoidance apparatus which enables modification of the area in response to a substantially white colored region within a vicinity of the area.
  • 7. The camera according to claim 1, wherein said red-eye filter comprises: a pixel locator for locating pixels having a color indicative of the red-eye phenomenon; a shape analyzer for determining if a grouping of at least a portion of the pixels located by said pixel locator comprise a shape indicative of the red-eye phenomenon; and a pixel modifier for modifying the color of the pixels within the grouping.
  • 8. The camera according to claim 7, further comprising a falsing analyzer for further processing the image in a vicinity of the grouping for details indicative of an eye, and for enabling said pixel modifier in response thereto.
  • 9. The camera according to claim 7, further comprising an exposure analyzer for determining if the image was recorded in a condition indicative of the red-eye phenomenon.
  • 10. A portable digital camera having no photographic film comprising: an integral flash for providing illumination during recording of an image without photographic film; a digital image capturing apparatus for recording said image; a red-eye filter for modifying an area within said image indicative of a red-eye phenomenon and thereby generating a modified image; and an integral image display for displaying the modified image.
  • 11. The camera according to claim 10, wherein the area has a color and shape indicative of the red-eye phenomenon and the image is modified to change the color to a black color and further wherein: said integral flash selectively provides illumination during image recording; and said red-eye filter is enabled to modify the image in response to said integral flash providing illumination during image recording.
  • 12. The camera according to claim 11, further comprising an exposure control means for determining if the image was recorded in a condition conducive to the red-eye phenomenon and for generating a red-eye signal in response thereto, wherein said red-eye filter is further enabled in response to the red-eye signal.
  • 13. The camera according to claim 10, wherein said red-eye filter further includes a falsing avoidance apparatus which enables modification of the area in response to an absence of color indicative of the red-eye phenomenon within a vicinity of and exclusive to the area.
  • 14. The camera according to claim 10, wherein said red-eye filter further includes a falsing avoidance apparatus which enables modification of the area in response to a substantially white colored region within a vicinity of the area.
  • 15. The camera according to claim 10, wherein said red-eye filter comprises: a pixel locator for locating pixels having a color indicative of the red-eye phenomenon; a shape analyzer for determining if a grouping of at least a portion of the pixels located by said pixel locator comprise a shape indicative of the red-eye phenomenon; and a pixel modifier for modifying the color of the pixels within the grouping.
  • 16. The camera according to claim 15, further comprising a falsing analyzer for further processing the image in a vicinity of the grouping for details indicative of an eye, and for enabling said pixel modifier in response thereto.
  • 17. The camera according to claim 15, further comprising an exposure analyzer for determining if the image was recorded in a condition indicative of the red-eye phenomenon.
  • 18. A portable digital camera having no photographic film comprising: an integral flash for providing illumination during acquisition of an image without photographic film; a digital image capturing apparatus for acquiring said image; and a red-eye filter for modifying an area within said image indicative of a red-eye phenomenon.
  • 19. The camera of claim 18, further comprising a processor and memory to generate and store on the camera a modified image including said image with said area modified.
  • 20. The camera according to claim 18, further comprising an integral image display for displaying a modified image including said image with said area modified.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 11/379,346, filed Apr. 19, 2006, now U.S. Pat. No. 7,619,665, which is a continuation of U.S. application Ser. No. 10/170,511, filed Jun. 12, 2002, now U.S. Pat. No. 7,042,505, which is a continuation application of U.S. application Ser. No. 08/947,603, filed Oct. 9, 1997, now U.S. Pat. No. 6,407,777. Each of these applications is hereby incorporated by reference.

US Referenced Citations (310)
Number Name Date Kind
4285588 Mir Aug 1981 A
4577219 Klie et al. Mar 1986 A
4646134 Komatsu et al. Feb 1987 A
4777620 Shimoni et al. Oct 1988 A
4881067 Watanabe et al. Nov 1989 A
4978989 Nakano et al. Dec 1990 A
5016107 Sasson et al. May 1991 A
5070355 Inoue et al. Dec 1991 A
5130789 Dobbs et al. Jul 1992 A
5164831 Kuchta et al. Nov 1992 A
5164833 Aoki Nov 1992 A
5202720 Fujino et al. Apr 1993 A
5227837 Terashita Jul 1993 A
5231674 Cleveland et al. Jul 1993 A
5249053 Jain Sep 1993 A
5274457 Kobayashi et al. Dec 1993 A
5301026 Lee Apr 1994 A
5303049 Ejima et al. Apr 1994 A
5335072 Tanaka et al. Aug 1994 A
5384601 Yamashita et al. Jan 1995 A
5400113 Sosa et al. Mar 1995 A
5424794 McKay Jun 1995 A
5432863 Benati et al. Jul 1995 A
5432866 Sakamoto Jul 1995 A
5438367 Yamamoto et al. Aug 1995 A
5452048 Edgar Sep 1995 A
5455606 Keeling et al. Oct 1995 A
5537516 Sherman et al. Jul 1996 A
5568187 Okino Oct 1996 A
5568194 Abe Oct 1996 A
5649238 Wakabayashi et al. Jul 1997 A
5671013 Nakao Sep 1997 A
5678073 Stephenson, III et al. Oct 1997 A
5694926 DeVries et al. Dec 1997 A
5708866 Leonard Jan 1998 A
5719639 Imamura Feb 1998 A
5719951 Shackleton et al. Feb 1998 A
5721983 Furutsu Feb 1998 A
5724456 Boyack et al. Mar 1998 A
5734425 Takizawa et al. Mar 1998 A
5748764 Benati et al. May 1998 A
5748784 Sugiyama May 1998 A
5751836 Wildes et al. May 1998 A
5761550 Kancigor Jun 1998 A
5781650 Lobo et al. Jul 1998 A
5805720 Suenaga et al. Sep 1998 A
5805727 Nakano Sep 1998 A
5805745 Graf Sep 1998 A
5815749 Tsukahara et al. Sep 1998 A
5818975 Goodwin et al. Oct 1998 A
5847714 Naqvi et al. Dec 1998 A
5850470 Kung et al. Dec 1998 A
5862217 Steinberg et al. Jan 1999 A
5862218 Steinberg Jan 1999 A
5892837 Luo et al. Apr 1999 A
5949904 Delp Sep 1999 A
5974189 Nicponski Oct 1999 A
5990973 Sakamoto Nov 1999 A
5991456 Rahman et al. Nov 1999 A
5991549 Tsuchida Nov 1999 A
5991594 Froeber et al. Nov 1999 A
5999160 Kitamura et al. Dec 1999 A
6006039 Steinberg et al. Dec 1999 A
6009209 Acker et al. Dec 1999 A
6011547 Shiota et al. Jan 2000 A
6016354 Lin et al. Jan 2000 A
6028611 Anderson et al. Feb 2000 A
6035072 Read Mar 2000 A
6035074 Fujimoto et al. Mar 2000 A
6036072 Lee Mar 2000 A
6101271 Yamashita et al. Aug 2000 A
6104839 Cok et al. Aug 2000 A
6118485 Hinoue et al. Sep 2000 A
6134339 Luo Oct 2000 A
6151403 Luo Nov 2000 A
6172706 Tatsumi Jan 2001 B1
6192149 Eschbach et al. Feb 2001 B1
6195127 Sugimoto Feb 2001 B1
6201571 Ota Mar 2001 B1
6204858 Gupta Mar 2001 B1
6204868 Yamauchi et al. Mar 2001 B1
6233364 Krainiouk et al. May 2001 B1
6249315 Holm Jun 2001 B1
6252976 Schildkraut et al. Jun 2001 B1
6266054 Lawton et al. Jul 2001 B1
6268939 Klassen et al. Jul 2001 B1
6275614 Krishnamurthy et al. Aug 2001 B1
6278491 Wang et al. Aug 2001 B1
6285410 Marni Sep 2001 B1
6292574 Schildkraut et al. Sep 2001 B1
6295378 Kitakado et al. Sep 2001 B1
6298166 Ratnakar et al. Oct 2001 B1
6300935 Sobel et al. Oct 2001 B1
6381345 Swain Apr 2002 B1
6393148 Bhaskar May 2002 B1
6396963 Shaffer et al. May 2002 B2
6407777 DeLuca Jun 2002 B1
6421468 Ratnakar et al. Jul 2002 B1
6426775 Kurokawa Jul 2002 B1
6429924 Milch Aug 2002 B1
6433818 Steinberg et al. Aug 2002 B1
6438264 Gallagher et al. Aug 2002 B1
6441854 Fellegara et al. Aug 2002 B2
6459436 Kumada et al. Oct 2002 B1
6473199 Gilman et al. Oct 2002 B1
6496655 Malloy Desormeaux Dec 2002 B1
6501911 Malloy Desormeaux Dec 2002 B1
6505003 Malloy Desormeaux Jan 2003 B1
6510520 Steinberg Jan 2003 B1
6516154 Parulski et al. Feb 2003 B1
6614471 Ott Sep 2003 B1
6614995 Tseng Sep 2003 B2
6621867 Sazzad et al. Sep 2003 B1
6628833 Horie Sep 2003 B1
6700614 Hata Mar 2004 B1
6707950 Burns et al. Mar 2004 B1
6714665 Hanna et al. Mar 2004 B1
6718051 Eschbach Apr 2004 B1
6724941 Aoyama Apr 2004 B1
6728401 Hardeberg Apr 2004 B1
6765686 Maruoka Jul 2004 B2
6786655 Cook et al. Sep 2004 B2
6792161 Imaizumi et al. Sep 2004 B1
6798913 Toriyama Sep 2004 B2
6859565 Baron Feb 2005 B2
6873743 Steinberg Mar 2005 B2
6885766 Held et al. Apr 2005 B2
6895112 Chen et al. May 2005 B2
6900882 Iida May 2005 B2
6912298 Wilensky Jun 2005 B1
6937997 Parulski Aug 2005 B1
6967680 Kagle et al. Nov 2005 B1
6980691 Nesterov et al. Dec 2005 B2
6984039 Agostinelli Jan 2006 B2
7024051 Miller et al. Apr 2006 B2
7027662 Baron Apr 2006 B2
7030927 Sasaki Apr 2006 B2
7035461 Luo et al. Apr 2006 B2
7035462 White et al. Apr 2006 B2
7042501 Matama May 2006 B1
7042505 DeLuca May 2006 B1
7062086 Chen et al. Jun 2006 B2
7116820 Luo et al. Oct 2006 B2
7133070 Wheeler et al. Nov 2006 B2
7155058 Gaubatz et al. Dec 2006 B2
7171044 Chen et al. Jan 2007 B2
7216289 Kagle et al. May 2007 B2
7224850 Zhang et al. May 2007 B2
7289664 Enomoto Oct 2007 B2
7295233 Steinberg et al. Nov 2007 B2
7310443 Kris et al. Dec 2007 B1
7315631 Corcoran et al. Jan 2008 B1
7336821 Ciuc et al. Feb 2008 B2
7352394 DeLuca et al. Apr 2008 B1
7362368 Steinberg et al. Apr 2008 B2
7369712 Steinberg et al. May 2008 B2
7403643 Ianculescu et al. Jul 2008 B2
7436998 Steinberg et al. Oct 2008 B2
7454040 Luo et al. Nov 2008 B2
7515740 Corcoran et al. Apr 2009 B2
7619665 DeLuca Nov 2009 B1
7738015 Steinberg et al. Jun 2010 B2
20010031142 Whiteside Oct 2001 A1
20010052937 Suzuki Dec 2001 A1
20020019859 Watanabe Feb 2002 A1
20020041329 Steinberg Apr 2002 A1
20020051571 Jackway et al. May 2002 A1
20020054224 Wasula et al. May 2002 A1
20020085088 Eubanks Jul 2002 A1
20020090133 Kim et al. Jul 2002 A1
20020093577 Kitawaki et al. Jul 2002 A1
20020105662 Patton et al. Aug 2002 A1
20020114513 Hirao Aug 2002 A1
20020131770 Meier et al. Sep 2002 A1
20020141661 Steinberg Oct 2002 A1
20020150292 O'callaghan Oct 2002 A1
20020159630 Buzuloiu et al. Oct 2002 A1
20020172419 Lin et al. Nov 2002 A1
20030021478 Yoshida Jan 2003 A1
20030025808 Parulski et al. Feb 2003 A1
20030025811 Keelan et al. Feb 2003 A1
20030044063 Meckes et al. Mar 2003 A1
20030044070 Fuersich et al. Mar 2003 A1
20030044176 Saitoh Mar 2003 A1
20030044177 Oberhardt et al. Mar 2003 A1
20030044178 Oberhardt et al. Mar 2003 A1
20030052991 Stavely et al. Mar 2003 A1
20030058343 Katayama Mar 2003 A1
20030058349 Takemoto Mar 2003 A1
20030107649 Flickner et al. Jun 2003 A1
20030113035 Cahill et al. Jun 2003 A1
20030118216 Goldberg Jun 2003 A1
20030137597 Sakamoto et al. Jul 2003 A1
20030161506 Velazquez et al. Aug 2003 A1
20030190072 Adkins et al. Oct 2003 A1
20030194143 Iida Oct 2003 A1
20030202715 Kinjo Oct 2003 A1
20040017481 Takasumi et al. Jan 2004 A1
20040027593 Wilkins Feb 2004 A1
20040032512 Silverbrook Feb 2004 A1
20040032526 Silverbrook Feb 2004 A1
20040033071 Kubo Feb 2004 A1
20040041924 White et al. Mar 2004 A1
20040046878 Jarman Mar 2004 A1
20040047491 Rydbeck Mar 2004 A1
20040056975 Hata Mar 2004 A1
20040057623 Schuhrke et al. Mar 2004 A1
20040057705 Kohno Mar 2004 A1
20040057715 Tsuchida et al. Mar 2004 A1
20040090461 Adams May 2004 A1
20040093432 Luo et al. May 2004 A1
20040114796 Kaku Jun 2004 A1
20040114797 Meckes Jun 2004 A1
20040114829 LeFeuvre et al. Jun 2004 A1
20040114904 Sun et al. Jun 2004 A1
20040119851 Kaku Jun 2004 A1
20040120598 Feng Jun 2004 A1
20040125387 Nagao et al. Jul 2004 A1
20040126086 Nakamura et al. Jul 2004 A1
20040141657 Jarman Jul 2004 A1
20040150743 Schinner Aug 2004 A1
20040160517 Iida Aug 2004 A1
20040165215 Raguet et al. Aug 2004 A1
20040184044 Kolb et al. Sep 2004 A1
20040184670 Jarman et al. Sep 2004 A1
20040196292 Okamura Oct 2004 A1
20040196503 Kurtenbach et al. Oct 2004 A1
20040223063 DeLuca et al. Nov 2004 A1
20040227978 Enomoto Nov 2004 A1
20040233299 Ioffe et al. Nov 2004 A1
20040233301 Nakata et al. Nov 2004 A1
20040234156 Watanabe et al. Nov 2004 A1
20040239779 Washisu Dec 2004 A1
20040240747 Jarman et al. Dec 2004 A1
20040258308 Sadovsky et al. Dec 2004 A1
20050001024 Kusaka et al. Jan 2005 A1
20050013602 Ogawa Jan 2005 A1
20050013603 Ichimasa Jan 2005 A1
20050024498 Iida et al. Feb 2005 A1
20050031224 Prilutsky et al. Feb 2005 A1
20050041121 Steinberg et al. Feb 2005 A1
20050046730 Li Mar 2005 A1
20050047655 Luo et al. Mar 2005 A1
20050058340 Chen et al. Mar 2005 A1
20050062856 Matsushita Mar 2005 A1
20050063083 Dart et al. Mar 2005 A1
20050068452 Steinberg et al. Mar 2005 A1
20050074164 Yonaha Apr 2005 A1
20050074179 Wilensky Apr 2005 A1
20050078191 Battles Apr 2005 A1
20050129331 Kakiuchi et al. Jun 2005 A1
20050134719 Beck Jun 2005 A1
20050140801 Prilutsky et al. Jun 2005 A1
20050147278 Rui et al. Jul 2005 A1
20050151943 Iida Jul 2005 A1
20050163498 Battles et al. Jul 2005 A1
20050168965 Yoshida Aug 2005 A1
20050196067 Gallagher et al. Sep 2005 A1
20050200736 Ito Sep 2005 A1
20050207649 Enomoto et al. Sep 2005 A1
20050212955 Craig et al. Sep 2005 A1
20050219385 Terakawa Oct 2005 A1
20050219608 Wada Oct 2005 A1
20050220346 Akahori Oct 2005 A1
20050220347 Enomoto et al. Oct 2005 A1
20050226499 Terakawa Oct 2005 A1
20050232490 Itagaki et al. Oct 2005 A1
20050238230 Yoshida Oct 2005 A1
20050243348 Yonaha Nov 2005 A1
20050275734 Ikeda Dec 2005 A1
20050276481 Enomoto Dec 2005 A1
20050280717 Sugimoto Dec 2005 A1
20050286766 Ferman Dec 2005 A1
20060008171 Petschnigg et al. Jan 2006 A1
20060017825 Thakur Jan 2006 A1
20060038916 Knoedgen et al. Feb 2006 A1
20060039690 Steinberg et al. Feb 2006 A1
20060045352 Gallagher Mar 2006 A1
20060050300 Mitani et al. Mar 2006 A1
20060066628 Brodie et al. Mar 2006 A1
20060082847 Sugimoto Apr 2006 A1
20060093212 Steinberg et al. May 2006 A1
20060093238 Steinberg et al. May 2006 A1
20060098867 Gallagher May 2006 A1
20060098875 Sugimoto May 2006 A1
20060119832 Iida Jun 2006 A1
20060120599 Steinberg et al. Jun 2006 A1
20060140455 Costache et al. Jun 2006 A1
20060150089 Jensen et al. Jul 2006 A1
20060204052 Yokouchi Sep 2006 A1
20060204110 Steinberg et al. Sep 2006 A1
20060221408 Fukuda Oct 2006 A1
20060285754 Steinberg et al. Dec 2006 A1
20070110305 Corcoran et al. May 2007 A1
20070116379 Corcoran et al. May 2007 A1
20070116380 Ciuc et al. May 2007 A1
20070133863 Sakai et al. Jun 2007 A1
20070154189 Harradine et al. Jul 2007 A1
20070201724 Steinberg et al. Aug 2007 A1
20070263104 DeLuca et al. Nov 2007 A1
20070263928 Akahori Nov 2007 A1
20080002060 DeLuca et al. Jan 2008 A1
20080013798 Ionita et al. Jan 2008 A1
20080043121 Prilutsky et al. Feb 2008 A1
20080112599 Nanu et al. May 2008 A1
20080144965 Steinberg et al. Jun 2008 A1
20080186389 DeLuca et al. Aug 2008 A1
20080211937 Steinberg et al. Sep 2008 A1
20080232711 Prilutsky et al. Sep 2008 A1
20080240555 Nanu et al. Oct 2008 A1
Foreign Referenced Citations (23)
Number Date Country
884694 Dec 1998 EP
911759 Apr 1999 EP
911759 Jun 2000 EP
1199672 Apr 2002 EP
1288858 Mar 2003 EP
1288859 Mar 2003 EP
1288860 Mar 2003 EP
1293933 Mar 2003 EP
1296510 Mar 2003 EP
1429290 Jun 2004 EP
1478169 Nov 2004 EP
1429290 Jul 2008 EP
4192681 Jul 1992 JP
2000-134486 May 2000 JP
2002-271808 Sep 2002 JP
WO0171421 Sep 2001 WO
WO0245003 Jun 2002 WO
WO03026278 Mar 2003 WO
WO03071484 Aug 2003 WO
WO2005015896 Feb 2005 WO
WO2005041558 May 2005 WO
WO2005109853 Nov 2005 WO
WO2006018056 Feb 2006 WO
Continuations (3)
Number Date Country
Parent 11379346 Apr 2006 US
Child 12611387 US
Parent 10170511 Jun 2002 US
Child 11379346 US
Parent 08947603 Oct 1997 US
Child 10170511 US