This patent application is related to the U.S. patent application Ser. No. ______, filed MONTH DAY, 2004, entitled “Method and Apparatus for Effecting Automatic Red Eye Reduction” and assigned to the assignee of the present application.
None.
None.
1. Field of the Invention
The present invention relates to processing of an image, and more particularly to processing of an image having red eye effect.
2. Description of the Related Art
Red eye effect is a common phenomenon in flash photography. In some environments (e.g., in dim or dark places), the pupil of an eye is opened wide for better viewing. When a flash is used for taking a picture in such environments, a burst of light is reflected from blood cells of the pupil, thereby producing a red eye effect in the resulting image. Images with red eye effect can look unrealistic and often unsightly. Correcting or reducing the red eye effect therefore enhances image perception. However, identifying eye regions having red eye effect is often difficult, due to nearby red pixels that are not part of the red eye effect. Moreover, since the eye is considered an important feature of a face, any mistake in red eye effect correction or reduction is often readily detected and unacceptable.
Accordingly, there is a need for an improved technique for reducing red eye effect in images. To this end, some embodiments of the present invention use an apparatus and method for building a boundary table for red eye colors to identify red eye pixels. Also, some embodiments of the present invention use an apparatus and method for locating red eye regions, such as by using a boundary table. Some embodiments of the present invention use an apparatus and method for reducing red eye using data from the red eye regions and changing the color of the red eye in such regions.
In one form, the invention provides a method of identifying a red eye from image data that has image attributes. The method includes determining a plurality of image attributes from the image data, and selecting from the determined image attributes a select image attribute. The method also includes grouping the determined image attributes with respect to the select image attribute, and setting an image attribute boundary based on the determined image attributes for the select image attribute.
In another form, the invention provides a method of centering a red eye region of an image. The method includes determining a region of the image that includes a portion of the red eye, selecting a pixel from the region where the pixel represents an initial red eye center, and dividing the region into a plurality of circular regions centered around the initial red eye center, each circular region having a radius measured from the initial red eye center. The method also includes counting a red eye pixel number for each circular region, and locating a centroid of the red eye pixels when the red eye pixel number is less than a red eye pixel threshold for the radius.
In yet another form, the invention provides a method of reducing red eye effect of a red eye centered at a pixel. The method includes defining a first red eye region and a second red eye region around the pixel. In some embodiments, the second red eye region envelops the first red eye region, and has a plurality of second region pixels. The method also includes filling pixels in the first region with a first color, measuring a distance for each of the second region pixels from the pixel, and filling each of the second region pixels with a second color based on the distance.
In yet another form, the invention provides a method of reducing red eye effect from image data having image attributes. The method includes identifying image data with a first image attribute that has characteristics of red eye pixels, and determining a centroid of the identified image data having characteristics of red eye pixels. The method also includes defining a red eye region based on the centroid, and filling each of the pixels in the red eye region with a color determined from an equation relating to a distance between the centroid and each of the pixels.
In yet another form, the invention provides a method of identifying a pixel having image attributes that are characteristics of red eye effect. The method includes retrieving a plurality of boundary points with respect to at least one of the image attributes, drawing a line from the at least one of the image attributes of the pixel to each of the boundary points, and determining if an angle between adjacent lines exceeds an angle threshold.
In yet another form, the invention provides an apparatus for reducing red eye effect from image data having image attributes. The apparatus includes a first image attribute identifying software code that identifies the image data with a first image attribute having characteristics of red eye pixels. The apparatus also includes centroid identifying software code to determine a centroid of the identified image data having characteristics of red eye pixels, and red eye region defining software code to define a red eye region based on the centroid. The apparatus also includes filler software code to fill each of the pixels in the red eye region with a color determined from an equation relating a distance between the centroid and each of the pixels.
Other features and advantages of the invention will become apparent to those skilled in the art upon review of the following detailed description, claims, and drawings.
The patent or application file contains at least one drawing executed in color. Copies of the patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless limited otherwise, the terms “connected,” “coupled,” and “mounted” and variations thereof herein are used broadly and encompass direct and indirect connections, couplings, and mountings. In addition, the terms “connected” and “coupled” and variations thereof are not restricted to physical or mechanical connections or couplings.
Before the reduction of red eye effects is made in an image, the red eye effects are identified. In many cases, the success of red eye effect reduction is at least partially dependent upon accurate identification of such red eye effects. However, identifying red eye effect can be particularly difficult because what constitutes a red eye color in an image can be a skin color in another part of the image or in another image. As a result, a first step of the red eye reduction process according to some embodiments of the present invention is to accurately identify red eye effects, or red eyes. To identify red eyes, a red eye boundary table can be constructed.
The red eye boundary construction method 100 illustrated in
As used herein and in the appended claims, the term “pixel” includes elements of any image comprising graphics and/or text. Also, the term “pixel” includes all such elements found on or in any medium, including without limitation image elements on a display screen, on a printed medium, and the like. Examples of pixels include LCD, CRT, and other display screen elements, and elements printed on any surface (e.g., pels, cells, dots, and the like).
For example, suppose the following (R, G, B) triplets are extracted from the red eye images in order to build a red eye database: (80, 00, 56), (78, 13, 33), (77, 33, 30), (78, 29, 33), (80, 26, 24), (78, 43, 37), (79, 10, 41), (77, 34, 27), (79, 39, 37), (79, 39, 37), (78, 26, 49), (79, 18, 34), (79, 37, 49), (79, 38, 32), (79, 41, 32), (80, 16, 35), (78, 38, 36), (80, 39, 73), (80, 27, 19), (77, 32, 27), (80, 39, 35), and (78, 13, 33). The (79, 39, 37) and (78, 13, 33) triplets each appear twice, and one triplet in each duplicate pair can be removed at block 112 as being duplicate or redundant data.
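The duplicate-removal step at block 112 can be sketched in a few lines. This is a minimal illustration rather than the patent's implementation; the sample triplets are drawn from the list above.

```python
# Remove duplicate (R, G, B) triplets while preserving order,
# mirroring the duplicate-removal step at block 112.
def dedup_triplets(triplets):
    seen = set()
    unique = []
    for t in triplets:
        if t not in seen:
            seen.add(t)
            unique.append(t)
    return unique

samples = [(80, 0, 56), (78, 13, 33), (79, 39, 37), (79, 39, 37), (78, 13, 33)]
print(dedup_triplets(samples))  # second copies of duplicate triplets are dropped
```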
The extracted red eye image attributes (e.g., the (R, G, B) triplets) can be stored in a red eye database. This red eye database is a database of colors identified as red eye colors, and can be used to determine whether parts (e.g., pixels) of an image are part of a red eye, as will be described in greater detail below. Although such a red eye database can be constructed by sampling any number of red eyes from images, the red eye database can be constructed in any other manner, such as by specifying a number of red eye colors to be included in the database.
Since a red eye database built from a limited number of samples is rarely (if ever) exhaustive, the red eye boundary construction method 100 according to some embodiments is configured to interpolate for missing points. To simplify the interpolation process, one of the extracted image attributes (e.g., the R, G, or B value in an RGB color space, or any other attribute in other spaces) can be selected to be a select image attribute, leaving behind one or more other extracted image attributes (e.g., a set of remaining extracted attributes). When image attributes such as (R, G, B) triplets are used, one of the R, G, and B values is selected to be the select image attribute. For example, in some embodiments, the select image attribute is R, and therefore the remaining attributes are G and B.
Once a select image attribute has been determined and selected, the extracted image attributes from the red eye image can be sorted with respect to the select image attribute at block 116. In the example discussed above, data sorted with respect to R includes (77, 32, 27), (77, 33, 30), (77, 34, 27), (78, 13, 33), (78, 26, 49), (78, 29, 33), (78, 38, 36), (78, 43, 37), (79, 10, 41), (79, 18, 34), (79, 37, 49), (79, 38, 32), (79, 39, 37), (79, 41, 32), (80, 00, 56), (80, 16, 35), (80, 26, 24), (80, 27, 19), (80, 39, 35), and (80, 39, 73).
Thereafter, the red eye boundary construction method 100 according to some embodiments of the present invention groups the sorted image attributes according to each of the select image attribute values at block 120. Depending on the number of the select image attribute values, there can be a large number of groups for the select image attribute. Continuing with the above example, the following remaining extracted attribute groups of (G, B) pairs are indexed by R. For the selected group attribute R=77, the (G, B) pairs are (32, 27), (33, 30), and (34, 27). For the selected group attribute R=78, the (G, B) pairs are (13, 33), (26, 49), (29, 33), (38, 36), and (43, 37). For the selected group attribute R=79, the (G, B) pairs are (10, 41), (18, 34), (37, 49), (38, 32), (39, 37), and (41, 32). For the selected group attribute R=80, the (G, B) pairs are (0, 56), (16, 35), (26, 24), (27, 19), (39, 35), and (39, 73). In this example, there are four R groups. These sorted groups of R-indexed (G, B) pairs can be stored in the red eye database at block 124 for further processing.
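The sorting (block 116) and grouping (block 120) steps above can be sketched as follows. The function name and sample triplets are illustrative only; the patent does not prescribe a particular data structure.

```python
from collections import defaultdict

# Sort (R, G, B) triplets by the select attribute R (block 116) and group
# the remaining (G, B) pairs under each R index (block 120).
def group_by_select_attribute(triplets):
    groups = defaultdict(list)
    for r, g, b in sorted(triplets):
        groups[r].append((g, b))
    return dict(groups)

samples = [(78, 13, 33), (77, 33, 30), (77, 32, 27), (78, 26, 49)]
print(group_by_select_attribute(samples))
# {77: [(32, 27), (33, 30)], 78: [(13, 33), (26, 49)]}
```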
At block 128, the red eye boundary construction method 100 in the illustrated embodiment sets an image attribute boundary on the remaining extracted image attributes for each of the select image attribute values. In general, the image attribute boundary is established to enclose the remaining extracted image attributes for each value of the select image attributes. Since the image attribute boundary can be represented by a set of boundary points, in some embodiments only the boundary points for each indexed attribute are stored at block 132. Thereafter, the process of setting an image attribute boundary at block 128 is repeated if there are more R groups to be analyzed. Otherwise, the red eye boundary construction method 100 stops at block 140. Although RGB triplets are used in the red eye boundary construction method 100 of the illustrated embodiment, the red eye boundary construction method 100 can also be used for establishing boundaries for other image attributes, or for image attributes defined in other manners. For example, the red eye boundary construction method 100 can be used with other types of color spaces, such as luminance-bandwidth-chrominance (“YUV”), luminance chroma-blue chroma-red (“YCbCr”), L*ab, and L*CH color spaces.
To find boundary points in the red eye boundary construction method 100, some embodiments of the present invention use a grouping or “rubber banding” technique. For example,
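The “rubber banding” technique is not spelled out in this excerpt. A convex hull produces the same stretched-band boundary around a set of points, and Andrew's monotone chain is one standard way to compute it; the sketch below uses hypothetical (G, B) pairs for one R group and is only one possible realization.

```python
# One way to realize a "rubber band" boundary: the convex hull of the
# (G, B) pairs for a given R index, via Andrew's monotone chain.
# The patent does not prescribe this algorithm; it is shown as one option.
def convex_hull(points):
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # Cross product of vectors o->a and o->b; positive for a left turn.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]  # boundary points, counter-clockwise

pairs = [(32, 27), (33, 30), (34, 27), (33, 28)]  # hypothetical (G, B) pairs
print(convex_hull(pairs))  # (33, 28) lies inside and is dropped
```

Interior samples such as (33, 28) are discarded, so only the boundary points need to be stored at block 132.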
Furthermore, the image attribute boundary 202 can include any number of points that are not part of the original samples (i.e., those falling within the image attribute boundary 202 but not specifically found in the samples used to construct the image attribute boundary 202). Also, the boundary table can be easily expandable in some embodiments. While the image attribute boundary table can be constructed earlier (e.g., as a default boundary), new data can be optionally added to the table. For example, when a red eye is found, the red eye boundary construction method 100 can insert one or more image attributes of the new pixels in the sample, and can be re-run to generate a new boundary table. Such a process can take place automatically, in some embodiments.
As mentioned above, the red eye boundary construction method 100 can be used for establishing boundaries for other image attributes, or for image attributes defined in other manners. For example, in some embodiments, red eye colors from sampling of the eye can be converted from RGB color space to YCbCr color space. The converted YCbCr triplets can be stored in a database as (Y, Cb, Cr) triplets sorted with respect to Y. That is, the database or the table can be Y indexed. For each Y indexed group, the corresponding Cb and Cr values can be sorted with respect to their values. As a result, each Y-indexed group can be stored as 2-dimensional points. In other embodiments, image attribute boundaries having one or more additional dimensions (e.g., a three-dimensional image attribute boundary) can be constructed, such as by using image attributes having four values and in which three of the four values are used to construct points for the image attribute boundary.
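The RGB-to-YCbCr conversion mentioned above can be sketched with one common full-range variant (the JPEG/JFIF form); the patent does not specify which conversion is used, so the coefficients below are an assumption.

```python
# One common full-range RGB -> YCbCr conversion (the JPEG/JFIF form).
# The patent does not specify which conversion variant is used.
def rgb_to_ycbcr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

print(rgb_to_ycbcr(255, 0, 0))  # pure red: Cr well above 128, Cb below 128
```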
In other embodiments, the red eye boundary table can be a 2-dimensional space of red eye colors of sampled images. For example, such a boundary table can have red eye colors defined by Cb and Cr values. In this way, the boundary table can be significantly smaller when compared with tables generated with 3-D RGB or YCbCr spaces. The smaller boundary table can therefore speed up lookup processes. Similarly, red eye boundary tables can also be generated in any other space, such as in L*ab space and in L*CH space. In the L*ab and L*CH spaces, data can be sorted in the order of L*, a, and b, and L*, C, and H, respectively. The sorted data can then be grouped by the index of L* value.
In some embodiments, each pixel of an image is compared with boundary points to see if the pixel falls within the boundary. In this way, pixels whose colors are within the boundary points of the image attribute boundary retrieved at block 308 are considered red eye pixels. To determine if a pixel is inside the image attribute boundary indexed by R, in some embodiments rays can be drawn from the image attribute pair of the pixel (e.g., the R-indexed (G, B) pair in the illustrated embodiment) to the boundary points at block 312. Angles between all adjacent rays can thereafter be determined at block 316. If any of the determined angles exceeds an angle threshold, such as 180°, as determined at block 320, the pixel is considered to be outside the image attribute boundary. As a result, a “FALSE” is then returned at block 324, which means the pixel is likely not to have red eye characteristics. Otherwise, if all the determined angles are equal to or within the angle threshold (e.g., 180°), the pixel is considered to be inside the image attribute boundary. In such a case, a “TRUE” is then returned at block 328, which means the pixel is considered to have red eye characteristics.
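The angle test at blocks 312 through 328 can be sketched as follows. The text says only that angles between “adjacent” rays are examined; sorting the rays by angle before comparing neighbors is an implementation assumption, as is the convexity of the boundary.

```python
import math

# Angle test sketch for blocks 312-328: draw a ray from the pixel's
# (G, B) pair to each stored boundary point, sort the rays by angle, and
# check the gap between adjacent rays (including the wrap-around gap).
# If any gap exceeds 180 degrees, the point lies outside the boundary.
def inside_boundary(point, boundary_points):
    angles = sorted(math.atan2(by - point[1], bx - point[0])
                    for bx, by in boundary_points)
    gaps = [angles[i + 1] - angles[i] for i in range(len(angles) - 1)]
    gaps.append(2 * math.pi - (angles[-1] - angles[0]))  # wrap-around gap
    return max(gaps) <= math.pi  # TRUE: pixel has red eye characteristics

boundary = [(0, 0), (10, 0), (10, 10), (0, 10)]  # hypothetical boundary points
print(inside_boundary((5, 5), boundary))   # True  (inside)
print(inside_boundary((20, 5), boundary))  # False (outside)
```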
For example,
To identify the existence of a red eye in an image, any manual or automatic red eye detection method can be used.
Referring back to
As red eye pixels continue to be counted at greater radial distances, the area of the rings 728A, 728B, 728C, 728D, and 728E can increase in proportion to the distance between the ring 728 and the suggested red eye center 724 (such as for rings having the same width). As a result, assuming a uniform distribution of red eye pixels in a region of an image, additional rings can contain proportional increases in the number of red eye pixels. Therefore, the number of red eye pixels will increase as more rings are counted. At block 620, a red eye pixel threshold T is determined for a current concentric circular ring 728. In some embodiments, the red eye pixel threshold T for the current concentric circular ring 728 is determined as follows. If the distance between the outer perimeter (ring 728E) of the ring 728 being examined and the suggested red eye center 724 is R, and C is a constant determined empirically or with an algorithm, T=R×C. The value of C in some embodiments is about 1. Of course, the inner perimeter (ring 728A) of the ring 728 can also be used to calculate the red eye pixel threshold T, if desired. Also, the red eye pixel threshold for any of the rings 728A, 728B, 728C, 728D, and 728E can be calculated or set in any other manner desired.
In some embodiments, the number of red eye pixels counted in each ring 728A, 728B, 728C, 728D, and 728E is compared with the variable threshold, T. For example, as long as the red eye pixel count is greater than T as determined at block 624, the counting continues onto a next circular ring at block 628. However, the ring in which the number of red eye pixels drops below the threshold T can be considered a boundary of the red eye region. When this ring is detected, the process of counting red eye pixels can stop and a derived radius of the red eye can then be set as the distance between the outer perimeter of this ring and the suggested red eye center 724 at block 632. In other embodiments, the derived radius can instead be set as the distance between the inner perimeter (728A) of the ring 728 and the suggested red eye center 724. Thereafter, at block 636 red eye pixels inside a circle formed by the derived radius can be located. In some embodiments at block 640, based on the locations of the red eye pixels, as shown in
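The ring-counting search at blocks 616 through 632 can be sketched as follows. The predicate `is_red_eye` is a hypothetical stand-in for a boundary-table lookup, the unit ring width and the scan window are assumptions, and `C` is the empirical constant described above (about 1).

```python
import math

# Sketch of the ring-counting search (blocks 616-632): count red eye
# pixels in concentric unit-width rings around a suggested center and
# stop when a ring's count falls below the variable threshold T = R * C.
def derive_red_eye_radius(center, is_red_eye, max_radius, C=1.0):
    cx, cy = center
    for outer in range(1, max_radius + 1):
        count = 0
        # Scan the bounding box of the current ring only.
        for x in range(cx - outer, cx + outer + 1):
            for y in range(cy - outer, cy + outer + 1):
                d = math.hypot(x - cx, y - cy)
                if outer - 1 <= d < outer and is_red_eye(x, y):
                    count += 1
        if count < outer * C:  # threshold T = R * C for this ring
            return outer       # derived radius (outer perimeter of this ring)
    return max_radius

# Synthetic red eye: a disc of red pixels of radius 5 around the origin.
red_disc = lambda x, y: math.hypot(x, y) < 5
print(derive_red_eye_radius((0, 0), red_disc, 20))
```

On the synthetic disc, the count first drops below T in the ring just outside the disc, so the derived radius lands at or just past the true radius.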
To reduce the red eye effect, in some embodiments all or any number of the pixels inside the estimated red eye region 700′ are replaced or filled with another color (e.g., a neutral color). In some embodiments, the estimated red eye region 700′ can be divided into a core or a first red eye region 740 and a periphery or a second red eye region 744 both centered at the red eye center 732, as shown in
The first red eye region 740 in the illustrated embodiment is likely to have red eye colors. Therefore, all the pixels inside the first red eye region 740 can be replaced or filled with another filling color (e.g., a neutral color). In some embodiments, the filling color can be a color defined or selected by a user. In yet other embodiments, the filling color can be chosen by a user from pixels near or adjacent to the red eye region. For example, if two eyes have been detected in the image, and one of the two detected eyes does not have any red eye effect, the user can select the color of the unaffected eye to be the filling color for the other eye. This process can be performed while preserving the lightness of the red eye region 740. On the other hand, the second red eye region 744 may have a combination of red eye color, some color of the pupil, and skin color in any proportion. Thus, in some embodiments a distance measure is established to allow a gradual change in color. For example, for pixels adjacent to the first red eye region 740, the color of the pixels can be changed to another color (e.g., a neutral color). Colorfulness can be increased at any desired rate as the distance from the first red eye region 740 increases. The rate of change can be linear or non-linear as desired. In some embodiments, the rate of change is such that the colorfulness of a pixel farthest away from the center 732 is 100 percent. In other words, the color of the pixels farthest away from the center 732 will remain unchanged.
To illustrate a gradual change of color with changing radial distance in a red eye,
L=a1 RIN+a2 GIN+a3 BIN
where a1, a2, and a3 are 0.299, 0.5870, and 0.1140, respectively, in some embodiments. Of course, other values of a1, a2, and a3 can also be used in other applications. Also, other color space parameters (other than RIN, GIN, and BIN), such as chrominance can instead be used. Thereafter, the pixel distance RPIX is measured against the core radius, RCORE 748 at block 816. If the pixel distance RPIX is less than the core radius, RCORE 748, the pixel being examined is considered inside the first region 740. In such a case, the pixel being examined can be filled with another color (e.g., a neutral color) at block 820. In some embodiments, the following equation, EQN. 1, can be used to convert the color:
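The lightness computation above amounts to a weighted sum of the RGB input values; a minimal sketch using the coefficients given above:

```python
# Lightness L of a pixel from its RGB input values, using the
# coefficients given above (a1 = 0.299, a2 = 0.587, a3 = 0.114).
def luminance(r_in, g_in, b_in):
    return 0.299 * r_in + 0.587 * g_in + 0.114 * b_in

print(round(luminance(255, 255, 255)))  # white maps to full lightness, 255
```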
where ROUT, GOUT and BOUT are output colors. Once the values ROUT, GOUT and BOUT have been determined, the red eye reduction method 800 stops at block 822. Otherwise, when the pixel distance RPIX is at least equal to the core radius RCORE 748, the red eye reduction method 800 can determine if the pixel distance RPIX is less than the red eye radius, REYE 736 at block 824. If the pixel distance RPIX is less than the red eye radius, REYE 736, the pixel being examined can be considered outside the first region 740 but inside the second red eye region 744. In such a case, the pixel being examined can be filled with a transitional color at block 828 and the red eye reduction method 800 stops at block 830. In some embodiments, the following equation, EQN. 2, can be used to fill in the transition color.
Otherwise, the red eye reduction method 800 can simply keep the original color of the pixel and stop at block 834. If, at block 824, RPIX is not less than REYE, the method stops at block 836.
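Because EQN. 1 and EQN. 2 are not reproduced in this excerpt, the three-way fill logic of blocks 816 through 836 can only be sketched with stand-in equations: filling the core with the pixel's lightness L (a neutral gray, preserving lightness as described above), and blending linearly back toward the original color across the second region so that colorfulness reaches 100 percent at the red eye radius REYE. Both stand-ins are assumptions, not the patent's equations.

```python
# Hedged sketch of the fill logic at blocks 816-836.  EQN. 1 and EQN. 2
# are not given in this excerpt, so two plausible stand-ins are used:
# the core is filled with its lightness L (a neutral gray), and the
# second region blends linearly back to the original color, reaching
# full colorfulness at the red eye radius r_eye.
def reduce_red_eye_pixel(r_in, g_in, b_in, r_pix, r_core, r_eye):
    lum = 0.299 * r_in + 0.587 * g_in + 0.114 * b_in
    if r_pix < r_core:                    # first (core) region: neutral fill
        return (lum, lum, lum)
    if r_pix < r_eye:                     # second region: transitional color
        t = (r_pix - r_core) / (r_eye - r_core)  # 0 at core edge, 1 at r_eye
        return tuple(lum + t * (c - lum) for c in (r_in, g_in, b_in))
    return (r_in, g_in, b_in)             # outside: keep the original color

# A reddish pixel at three distances from the center (r_core=5, r_eye=10):
print(reduce_red_eye_pixel(200, 50, 50, 0, 5, 10))    # neutral gray
print(reduce_red_eye_pixel(200, 50, 50, 7.5, 5, 10))  # partway back to red
print(reduce_red_eye_pixel(200, 50, 50, 12, 5, 10))   # unchanged
```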
It should be noted that the various aspects of the invention described herein need not necessarily be used together in a single system or method. In this regard, any of the various aspects of the present invention (e.g., red eye boundary construction, red eye identification, red eye reduction, and the like) can be used alone or in any combination with other aspects while still falling within the spirit and scope of the present invention.
Various features and advantages of the invention are set forth in the following claims.