The subject application is directed generally to detection of characteristics in electronically encoded images. The application is particularly applicable to detection of one or more portions of an electronic image wherein an object is backlit.
Early image capturing systems involved shutters, lenses, and photosensitive material that was chemically changed when exposed to light. More recently, image capture is done with digital imaging devices, such as digital cameras or scanners. Acquired images, particularly those of real-life scenes, such as may be acquired by digital cameras or scans of photographs, are often captured in non-optimal situations. Problems associated with earlier image capturing operations are also present with digitally acquired images. One such situation is presented with backlighting. Relatively bright backlighting tends to wash out or obscure objects in the foreground of such lighting. The relative intensity of the backlit area to the lighting of the object can wash out or otherwise obscure the features of the object. Backlighting is particularly problematic with human subjects insofar as it can result in obscured facial characteristics.
Early attempts at mitigating the effects of backlighting included repositioning a subject relative to the light source and the image capturing device, such as a camera. By way of example, a photographer may reorient a person so that the sun is to the photographer's back and the subject is positioned to his front. In other situations, a photographer may use a flash to provide better illumination of a subject relative to the backlighting.
Electronic images resultant from today's imaging equipment exist in many formats. By way of example, images may be acquired or stored in various schemes, including RAW, JPEG, GIF, TIFF or PCX, as well as many other image data types. Many image data encoding schemes define images in connection with a multidimensional color space, such as a space defined by either additive or subtractive primary colors. Such color spaces include red-green-blue (RGB) and cyan-magenta-yellow (CMY), which is sometimes encoded with a black (K) component as CMYK. Given the encoded nature of such captured images, it is possible to perform analysis and manipulation of the underlying image data.
In accordance with one embodiment of the subject application, there is provided a system and method for backlit face image correction. Image data is first received that includes at least one facial region that defines an imaged human face. Via a processor that operates in accordance with software and an associated data storage, the at least one facial region is detected and skin tone characteristics of the at least one facial region are then detected. Pixel count data and histogram data are then calculated from the received image data. A plateau is then detected in a function of the pixel count data relative to the histogram data. Based upon a property of the detected plateau, a correction factor is then calculated. Pixel values of the at least one facial region are adjusted based upon the calculated correction factor. A corrected image is then output that includes adjusted pixel values corresponding to the at least one facial region.
Still other advantages, aspects and features of the subject application will become readily apparent to those skilled in the art from the following description, wherein there is shown and described a preferred embodiment of the subject application, simply by way of illustration of one of the best modes suited to carry out the subject application. As will be realized, the subject application is capable of other different embodiments, and its several details are capable of modifications in various obvious aspects, all without departing from the scope of the subject application. Accordingly, the drawings and descriptions will be regarded as illustrative in nature and not as restrictive.
The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawings will be provided by the Office upon request and payment of the necessary fee. The subject application is described with reference to certain figures, including:
The subject application is directed to a system and method for backlit face image correction. In particular, the subject application is directed to a system and method for the enhancement of the faces of human subjects depicted in electronic images. More particularly, the subject application is directed to a system and method for automatically correcting backlit faces of human subjects in an input image. It will become apparent to those skilled in the art that the system and method described herein are suitably adapted to a plurality of varying electronic fields employing data detection and correction, including, for example and without limitation, communications, general computing, data processing, document processing, financial transactions, vending of products or services, or the like. The preferred embodiment, as depicted in
Referring now to
The system 100 also includes a document processing device 104, which is depicted in
According to one embodiment of the subject application, the document processing device 104 is suitably equipped to receive a plurality of portable storage media, including, without limitation, Firewire drive, USB drive, SD, MMC, XD, Compact Flash, Memory Stick, and the like. In the preferred embodiment of the subject application, the document processing device 104 further includes an associated user interface 106, such as a touchscreen, LCD display, touch-panel, alpha-numeric keypad, or the like, via which an associated user is able to interact directly with the document processing device 104. In accordance with the preferred embodiment of the subject application, the user interface 106 is advantageously used to communicate information to the associated user and receive selections from the associated user. The skilled artisan will appreciate that the user interface 106 comprises various components, suitably adapted to present data to the associated user, as are known in the art. In accordance with one embodiment of the subject application, the user interface 106 comprises a display, suitably adapted to display one or more graphical elements, text data, images, or the like, to an associated user, receive input from the associated user, and communicate the same to a backend component, such as the controller 108, as explained in greater detail below. Preferably, the document processing device 104 is communicatively coupled to the computer network 102 via a communications link 112. As will be understood by those skilled in the art, suitable communications links include, for example and without limitation, WiMax, 802.11a, 802.11b, 802.11g, 802.11(x), Bluetooth, the public switched telephone network, a proprietary communications network, infrared, optical, or any other suitable wired or wireless data transmission communications known in the art. 
The functioning of the document processing device 104 will be better understood in conjunction with the block diagrams illustrated in
In accordance with one embodiment of the subject application, the document processing device 104 incorporates a backend component, designated as the controller 108, suitably adapted to facilitate the operations of the document processing device 104, as will be understood by those skilled in the art. Preferably, the controller 108 is embodied as hardware, software, or any suitable combination thereof, configured to control the operations of the associated document processing device 104, facilitate the display of images via the user interface 106, direct the manipulation of electronic image data, and the like. For purposes of explanation, the controller 108 is used to refer to any of a myriad of components associated with the document processing device 104, including hardware, software, or combinations thereof, functioning to perform, cause to be performed, control, or otherwise direct the methodologies described hereinafter. It will be understood by those skilled in the art that the methodologies described with respect to the controller 108 are capable of being performed by any general purpose computing system known in the art, and thus the controller 108 is representative of such general computing devices and is intended as such when used hereinafter. Furthermore, the use of the controller 108 hereinafter is for the example embodiment only, and other embodiments, which will be apparent to one skilled in the art, are capable of employing the system and method for backlit face image correction. The functioning of the controller 108 will better be understood in conjunction with the block diagrams illustrated in
Communicatively coupled to the document processing device 104 is a data storage device 110. In accordance with one embodiment of the subject application, the data storage device 110 is any mass storage device known in the art including, for example and without limitation, magnetic storage drives, a hard disk drive, optical storage devices, flash memory devices, or any suitable combination thereof. In one embodiment, the data storage device 110 is suitably adapted to store scanned image data, modified image data, redacted data, user information, document data, image data, electronic database data, or the like. It will be appreciated by those skilled in the art that while illustrated in
Also depicted in
The communications link 122 is any suitable channel of data communications known in the art including, but not limited to wireless communications, for example and without limitation, Bluetooth, WiMax, 802.11a, 802.11b, 802.11g, 802.11(x), a proprietary communications network, infrared, optical, the public switched telephone network, or any suitable wireless data transmission system, or wired communications known in the art. Preferably, the computer workstation 116 is suitably adapted to provide document data, job data, user interface data, image data, monitor document processing jobs, employ thin-client interfaces, generate display data, generate output data, or the like, with respect to the document rendering device 104, or any other similar device coupled to the computer network 102. The functioning of the computer workstation 116 will better be understood in conjunction with the block diagram illustrated in
Communicatively coupled to the computer workstation 116 is a suitable memory, illustrated in
Additionally, the system 100 of
Turning now to
Also included in the device 200 is random access memory 206, suitably formed of dynamic random access memory, static random access memory, or any other suitable, addressable memory system. Random access memory provides a storage area for data and instructions associated with applications and data handling accomplished by the processor 202.
A storage interface 208 suitably provides a mechanism for non-volatile, bulk or long term storage of data associated with the device 200. The storage interface 208 suitably uses bulk storage, such as any suitable addressable or serial storage, such as a disk, optical, tape drive and the like as shown as 216, as well as any suitable storage medium as will be appreciated by one of ordinary skill in the art.
A network interface subsystem 210 suitably routes input and output from an associated network allowing the device 200 to communicate to other devices. The network interface subsystem 210 suitably interfaces one or more connections between external devices and the device 200. By way of example, illustrated is at least one network interface card 214 for data communication with fixed or wired networks, such as Ethernet, token ring, and the like, and a wireless interface 218, suitably adapted for wireless communication via means such as WiFi, WiMax, wireless modem, cellular network, or any suitable wireless communication system. It is to be appreciated, however, that the network interface subsystem suitably utilizes any physical or non-physical data transfer layer or protocol layer as will be appreciated by one of ordinary skill in the art. In the illustration, the network interface card 214 is interconnected for data interchange via a physical network 220, suitably comprised of a local area network, wide area network, or a combination thereof.
Data communication between the processor 202, read only memory 204, random access memory 206, storage interface 208 and the network subsystem 210 is suitably accomplished via a bus data transfer mechanism, such as illustrated by the bus 212.
Suitable executable instructions on the device 200 facilitate communication with a plurality of external devices, such as workstations, document processing devices, other servers, or the like. While, in operation, a typical device operates autonomously, it is to be appreciated that direct control by a local user is sometimes desirable, and is suitably accomplished via an optional input/output interface 222 to a user input/output panel 224 as will be appreciated by one of ordinary skill in the art.
Also in data communication with the bus 212 are interfaces to one or more document processing engines. In the illustrated embodiment, printer interface 226, copier interface 228, scanner interface 230, and facsimile interface 232 facilitate communication with printer engine 234, copier engine 236, scanner engine 238, and facsimile engine 240, respectively. It is to be appreciated that the device 200 suitably accomplishes one or more document processing functions. Systems accomplishing more than one document processing operation are commonly referred to as multifunction peripherals or multifunction devices.
Turning now to
The document processing engine 302 suitably includes a print engine 304, facsimile engine 306, scanner engine 308 and console panel 310. The print engine 304 allows for output of physical documents representative of an electronic document communicated to the processing device 300. The facsimile engine 306 suitably communicates to or from external facsimile devices via a device, such as a fax modem.
The scanner engine 308 suitably functions to receive hard copy documents and in turn generate image data corresponding thereto. A suitable user interface, such as the console panel 310, suitably allows for input of instructions and display of information to an associated user. It will be appreciated that the scanner engine 308 is suitably used in connection with input of tangible documents into electronic form in bitmapped, vector, or page description language format, and is also suitably configured for optical character recognition. Tangible document scanning also suitably functions to facilitate facsimile output thereof.
In the illustration of
The document processing engine 302 is suitably in data communication with one or more device drivers 314, which device drivers allow for data interchange from the document processing engine 302 to one or more physical devices to accomplish the actual document processing operations. Such document processing operations include one or more of printing via driver 318, facsimile communication via driver 320, scanning via driver 322 and user interface functions via driver 324. It will be appreciated that these various devices are integrated with one or more corresponding engines associated with the document processing engine 302. It is to be appreciated that any set or subset of document processing operations are contemplated herein. Document processors which include a plurality of available document processing options are referred to as multi-function peripherals.
Turning now to
Also included in the controller 400 is random access memory 406, suitably formed of dynamic random access memory, static random access memory, or any other suitable, addressable and writable memory system. Random access memory provides a storage area for data and instructions associated with applications and data handling accomplished by processor 402.
A storage interface 408 suitably provides a mechanism for non-volatile, bulk or long term storage of data associated with the controller 400. The storage interface 408 suitably uses bulk storage, such as any suitable addressable or serial storage, such as a disk, optical, tape drive and the like as shown as 416, as well as any suitable storage medium as will be appreciated by one of ordinary skill in the art.
A network interface subsystem 410 suitably routes input and output from an associated network allowing the controller 400 to communicate to other devices. The network interface subsystem 410 suitably interfaces one or more connections between external devices and the controller 400. By way of example, illustrated is at least one network interface card 414 for data communication with fixed or wired networks, such as Ethernet, token ring, and the like, and a wireless interface 418, suitably adapted for wireless communication via means such as WiFi, WiMax, wireless modem, cellular network, or any suitable wireless communication system. It is to be appreciated, however, that the network interface subsystem suitably utilizes any physical or non-physical data transfer layer or protocol layer as will be appreciated by one of ordinary skill in the art. In the illustration, the network interface 414 is interconnected for data interchange via a physical network 420, suitably comprised of a local area network, wide area network, or a combination thereof.
Data communication between the processor 402, read only memory 404, random access memory 406, storage interface 408 and the network interface subsystem 410 is suitably accomplished via a bus data transfer mechanism, such as illustrated by bus 412.
Also in data communication with the bus 412 is a document processor interface 422. The document processor interface 422 suitably provides connection with hardware 432 to perform one or more document processing operations. Such operations include copying accomplished via copy hardware 424, scanning accomplished via scan hardware 426, printing accomplished via print hardware 428, and facsimile communication accomplished via facsimile hardware 430. It is to be appreciated that the controller 400 suitably operates any or all of the aforementioned document processing operations. Systems accomplishing more than one document processing operation are commonly referred to as multifunction peripherals or multifunction devices.
Functionality of the subject system 100 is accomplished on a suitable document processing device, such as the document processing device 104, which includes the controller 400 of
In the preferred embodiment, the engine 502 allows for printing operations, copy operations, facsimile operations and scanning operations. This functionality is frequently associated with multi-function peripherals, which have become a document processing peripheral of choice in the industry. It will be appreciated, however, that the subject controller does not have to have all such capabilities. Controllers are also advantageously employed in dedicated or more limited-purpose document processing devices that perform one or more of the document processing operations listed above.
The engine 502 is suitably interfaced to a user interface panel 510, which panel allows for a user or administrator to access functionality controlled by the engine 502. Access is suitably enabled via an interface local to the controller, or remotely via a remote thin or thick client.
The engine 502 is in data communication with the print function 504, facsimile function 506, and scan function 508. These functions facilitate the actual operation of printing, facsimile transmission and reception, and document scanning for use in securing document images for copying or generating electronic versions.
A job queue 512 is suitably in data communication with the print function 504, facsimile function 506, and scan function 508. It will be appreciated that various image forms, such as bit map, page description language or vector format, and the like, are suitably relayed from the scan function 508 for subsequent handling via the job queue 512.
The job queue 512 is also in data communication with network services 514. In a preferred embodiment, job control, status data, or electronic document data is exchanged between the job queue 512 and the network services 514. Thus, a suitable interface is provided for network based access to the controller function 500 via client side network services 520, which is any suitable thin or thick client. In the preferred embodiment, the web services access is suitably accomplished via a hypertext transfer protocol, file transfer protocol, user datagram protocol, or any other suitable exchange mechanism. The network services 514 also advantageously supplies data interchange with client side services 520 for communication via FTP, electronic mail, TELNET, or the like. Thus, the controller function 500 facilitates output or receipt of electronic document and user information via various network access mechanisms.
The job queue 512 is also advantageously placed in data communication with an image processor 516. The image processor 516 is suitably a raster image processor, page description language interpreter or any suitable mechanism for conversion of an electronic document to a format better suited for interchange with device functions such as print 504, facsimile 506 or scan 508.
Finally, the job queue 512 is in data communication with a parser 518, which parser suitably functions to receive print job language files from an external device, such as client device services 522. The client device services 522 suitably include printing, facsimile transmission, or other suitable input of an electronic document for which handling by the controller function 500 is advantageous. The parser 518 functions to interpret a received electronic document file and relay it to the job queue 512 for handling in connection with the afore-described functionality and components.
Turning now to
The read only memory 604 suitably includes firmware, such as static data or fixed instructions, such as BIOS, system functions, configuration data, and other routines used for operation of the workstation 600 via CPU 602.
The random access memory 606 provides a storage area for data and instructions associated with applications and data handling accomplished by the processor 602.
The display interface 608 receives data or instructions from other components on the bus 614, which data is specific to generating a display to facilitate a user interface. The display interface 608 suitably provides output to a display terminal 628, suitably a video display device such as a monitor, LCD, plasma, or any other suitable visual output device as will be appreciated by one of ordinary skill in the art.
The storage interface 610 suitably provides a mechanism for non-volatile, bulk or long term storage of data or instructions in the workstation 600. The storage interface 610 suitably uses a storage mechanism, such as storage 618, suitably comprised of a disk, tape, CD, DVD, or other relatively higher capacity addressable or serial storage medium.
The network interface 612 suitably communicates to at least one other network interface, shown as network interface 620, such as a network interface card, and wireless network interface 630, such as a WiFi wireless network card. It will be appreciated by one of ordinary skill in the art that a suitable network interface is comprised of both physical and protocol layers and is suitably any wired system, such as Ethernet, token ring, or any other wide area or local area network communication system, or wireless system, such as WiFi, WiMax, or any other suitable wireless network system, as will be appreciated by one of ordinary skill in the art. In the illustration, the network interface 620 is interconnected for data interchange via a physical network 632, suitably comprised of a local area network, wide area network, or a combination thereof.
An input/output interface 616 in data communication with the bus 614 is suitably connected with an input device 622, such as a keyboard or the like. The input/output interface 616 also suitably provides data output to a peripheral interface 624, such as a universal serial bus (USB) output, SCSI, Firewire (IEEE 1394) output, or any other interface as may be appropriate for a selected application. Finally, the input/output interface 616 is suitably in data communication with a pointing device interface 626 for connection with devices, such as a mouse, light pen, touch screen, or the like.
Turning now to
The face region detector 712 of the processor 706 is configured to detect the facial regions, while the skin tone detector 714 is configured to detect the skin tone characteristics of the one or more facial regions. The pixel counter 716 calculates the pixel count data from the received image data 704, and the histogram calculator is configured to calculate histogram data from the received image data 704. The plateau detector 720 included in the processor 706 facilitates the detection of a plateau in a function of pixel count data relative to histogram data. The correction factor calculator 722 is configured to calculate a correction factor based upon a property of a detected plateau, and the image corrector 724 functions to adjust pixel values of the one or more facial regions based upon the calculated correction factor. The system 700 further includes an output 726 that is configured to output a corrected image 728 that includes adjusted pixel values corresponding to one or more facial regions.
Referring now to
Pixel count calculation 808 and histogram data calculation 810 are then performed in accordance with the image data. Plateau detection 812 is then performed to detect a plateau in a function of the calculated pixel count relative to the histogram data. Correction factor calculation 814 is performed to calculate a correction factor based upon a property of a plateau detected via the plateau detection 812. Pixel value adjustment 816 is then performed on pixel values of the at least one facial region using the correction factor calculated via the calculation 814. Corrected image output 818 then occurs, outputting an image corrected via adjusted pixel values corresponding to the facial region.
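The histogram calculation 810 and plateau detection 812 described above might be sketched as follows in Python/NumPy. The smoothing kernel width, the tolerance, and the minimum plateau width are illustrative assumptions, since the precise plateau test is not spelled out at this point in the description; the sketch treats a plateau as the longest run of consecutive histogram bins whose counts stay within a small tolerance of each other.

```python
import numpy as np

def normalized_histogram(image, smooth=5):
    """Histogram normalized to unit area; a small moving-average smooth
    (an added assumption) suppresses bin-to-bin noise."""
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    hist = hist.astype(np.float64) / max(hist.sum(), 1)
    kernel = np.ones(smooth) / smooth
    return np.convolve(hist, kernel, mode="same")

def find_plateau(hist, tol=1e-4, min_width=8):
    """Illustrative plateau test: return (start, width) of the longest run
    of consecutive bins whose counts stay within `tol` of the run's first
    bin, or None when no run reaches `min_width`."""
    best = None
    start = 0
    for i in range(1, len(hist) + 1):
        # Close the current run at the end of the array or when the
        # count drifts outside the tolerance band.
        if i == len(hist) or abs(hist[i] - hist[start]) > tol:
            width = i - start
            if width >= min_width and (best is None or width > best[1]):
                best = (start, width)
            start = i
    return best
```

A histogram that is flat over a range of intensity values thus yields that range as the detected plateau.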
The skilled artisan will appreciate that the subject system 100 and components described above with respect to
At step 904, a processor operating in accordance with software and an associated data storage detects the at least one facial region. Skin tone characteristics of the at least one facial region are then detected at step 906. At step 908, pixel count data is calculated from the received image data. Histogram data is then calculated, at step 910, from the received image data. A plateau is then detected in a function of the pixel count data relative to the histogram data at step 912. A correction factor is then calculated based upon a property of the detected plateau at step 914. At step 916, pixel values of the at least one facial region are adjusted based upon the calculated correction factor. At step 918, a corrected image is output that includes adjusted pixel values corresponding to at least one facial region.
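By way of illustration only, the overall flow of steps 908 through 916 can be sketched as follows. The function name, the gamma-based form of the correction, and the use of the median face intensity to derive the correction factor are assumptions for illustration; the application itself derives the correction factor from a property of the detected plateau.

```python
import numpy as np

def correct_backlit_face(image, face_mask, gamma=None):
    """Illustrative sketch of steps 908-916.

    `image` is an HxW uint8 luminance array; `face_mask` is a boolean
    array marking the detected facial region."""
    face_pixels = image[face_mask].astype(np.float64)
    # Steps 908/910: pixel counts per intensity level (the histogram).
    hist, _ = np.histogram(face_pixels, bins=256, range=(0, 256))
    # Step 914 (assumed form): derive a gamma correction factor that
    # maps the median face intensity toward mid-gray.
    if gamma is None:
        median = min(float(np.median(face_pixels)), 254.0)
        gamma = np.log(0.5) / np.log(max(median, 1.0) / 255.0)
    # Step 916: adjust only the facial-region pixels.
    out = image.astype(np.float64) / 255.0
    out[face_mask] = out[face_mask] ** gamma
    return (out * 255.0).clip(0, 255).astype(np.uint8), hist
```

For a uniformly dark face (e.g., intensity 64), the derived gamma is roughly 0.5, lifting the region toward mid-gray while leaving the rest of the image untouched.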
Referring now to
At step 1004, at least one facial region is detected from the input image data that was received at step 1002. Those skilled in the art will appreciate that any suitable method of facial detection is capable of being employed in accordance with the methodology of the example in
The ethnicity of the image in the facial region is then detected at step 1012. It will be appreciated by those skilled in the art that the ethnicity of a given image is capable of being grouped into a lighter (European) ethnicity or a darker (African/Indian) ethnicity. A determination is then made at step 1014 whether the detected ethnicity is a darker ethnicity. Upon a positive determination at step 1014, flow proceeds to step 1016, whereupon adjusted mid-point data is calculated. In accordance with one example embodiment of the subject application, the value of the mid-point is increased by 50%. The pixel values of the input image are then adjusted in accordance with the detected ethnicity and adjusted mid-point at step 1018.
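The mid-point adjustment of step 1016 amounts to the following; the cap at the 8-bit maximum is an added assumption, as is the function name.

```python
def adjusted_midpoint(midpoint, darker_ethnicity):
    """Step 1016 sketch: for a detected darker facial tone, raise the
    mid-point value by 50% (capped at the 8-bit maximum, an added
    assumption); otherwise the mid-point is used as-is."""
    if darker_ethnicity:
        return min(int(midpoint * 1.5), 255)
    return midpoint
```

Increasing the mid-point in this way prevents a naturally darker face from being over-brightened by the subsequent pixel-value adjustment of step 1018.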
After adjustment of the pixel values based upon ethnicity at step 1018, or following a determination at step 1014 that the facial region is not of a darker ethnicity, flow proceeds to step 1020. The normalized histogram is then calculated at step 1020 from the received image data. It will be appreciated by those skilled in the art that the normalization of the histogram associated with the received image data reduces noise inherent in the image. Plateau detection is then performed at step 1022 to detect a plateau in a function of the pixel count data relative to the histogram. A more detailed explanation of the plateau is illustrated below with respect to
At step 1026, a preselected value is assigned to one or more Gamma values in response to the failure to detect a plateau in the histogram. A suitable value for substitution is described in greater detail below with respect to
Turning now to
At step 1104, face detection with a region of interest mask and size limit is applied to a given input image. Suitable methods for face detection include, for example and without limitation, the masking of input images to mask those pixels in an input image that do not have a skin tone color associated with them. Such a method further employs a region of interest scheme, which specifies the regions of the input image in which such facial detection is to be performed, as detailed in commonly assigned, co-pending U.S. patent application Ser. No. 12/583,625, filed on Aug. 24, 2009, the entirety of which is incorporated herein. This approach effectively blocks facial detection of faces near the edges of an input image. For example, given an input image, the minor dimension is determined, i.e., the smaller of the width and the height, and a scaling factor is calculated such that the minor dimension is scaled down to no larger than, e.g., 480, if necessary. For example, if the input image is a typical 3 MB consumer photo, i.e., 1200 pixels (height) by 1600 pixels (width), then its minor dimension is 1200, and the scaling factor is 2.5. The scaled-down dimensions will be 480 pixels by 640 pixels. Next, a binary image is created of the same dimensions as the original image, or of the scaled-down dimensions if necessary. Each pixel in the binary image is set to 0 if it is not in the region of interest (for example, in the 10% peripheral regions); otherwise it is set to 1. The binary image is then scaled up to the original input image dimensions, if necessary, to serve as the ROI mask, and the mask is sent to the face detector.
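The ROI-mask construction described above can be sketched as follows. Because the mask is axis-aligned, zeroing the peripheral fraction directly at the full image resolution is equivalent to masking at the scaled-down size and scaling back up, so this simplified sketch skips the intermediate resize; the function name and the exact rounding of the border width are assumptions.

```python
import numpy as np

def roi_mask(height, width, target=480, border_frac=0.10):
    """Build a binary region-of-interest mask: 1 inside the region of
    interest, 0 in the peripheral `border_frac` bands.  Also returns the
    scaling factor that would bring the minor dimension down to at most
    `target` (e.g., 1200/480 = 2.5 for a 1200x1600 photo)."""
    minor = min(height, width)
    scale = max(minor / target, 1.0)
    mask = np.ones((height, width), dtype=np.uint8)
    bh = int(round(height * border_frac))   # peripheral band height
    bw = int(round(width * border_frac))    # peripheral band width
    mask[:bh, :] = 0
    mask[height - bh:, :] = 0
    mask[:, :bw] = 0
    mask[:, width - bw:] = 0
    return mask, scale
```

Faces whose pixels fall in the zeroed bands are thereby excluded from detection, matching the edge-blocking behavior described above.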
As shown in
Returning to the flowchart 1100 of
Suitable methods for detection of backlit faces, i.e. the facial region darkness, include a mid-point approach, as detailed in commonly assigned, co-pending U.S. patent application Ser. No. 12/387,540, filed May 4, 2009, which is incorporated herein. Such an approach includes the determination of whether the intensity value at which the accumulated histogram of luminance reaches 50% is below a predetermined threshold value. Any pixels with extreme intensity values are first discarded from the histogram to remove noise. A size test is then performed on the ratio of the width of the detected face to the minor dimension of the input image, i.e., to determine if the ratio is above some predetermined threshold value. The resulting comparisons are then used to determine whether a face in an input image is backlit. The skilled artisan will appreciate that other methods of detecting a backlit face are also capable of being implemented in accordance with the methodology set forth in
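A minimal sketch of the mid-point approach described above follows. The threshold values and the extreme-intensity clip range are illustrative assumptions, since the actual predetermined thresholds belong to the referenced application.

```python
import numpy as np

def is_backlit_face(face_lum, face_width, image_minor_dim,
                    mid_thresh=80, size_thresh=0.10, clip=(5, 250)):
    """A face is flagged as backlit when the intensity at which the
    accumulated luminance histogram reaches 50% falls below `mid_thresh`
    and the face is large enough relative to the image.  `face_lum` is a
    1-D uint8 array of facial-region luminance values."""
    # Discard extreme intensity values to remove noise.
    lum = face_lum[(face_lum >= clip[0]) & (face_lum <= clip[1])]
    hist, _ = np.histogram(lum, bins=256, range=(0, 256))
    cumulative = np.cumsum(hist)
    # Intensity value at which the accumulated histogram reaches 50%.
    midpoint = int(np.searchsorted(cumulative, cumulative[-1] / 2.0))
    size_ratio = face_width / image_minor_dim
    return midpoint < mid_thresh and size_ratio > size_thresh, midpoint
```

A uniformly dark face region thus yields a low mid-point and is flagged, while a well-lit face of the same size is not.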
At step 1112, a severity category is calculated for the input image corresponding to a categorization of the relative backlighting of the input image.
Operations then proceed to step 1114, whereupon facial tone cluster calculations are performed. It will be appreciated by those skilled in the art that the subject application uses the facial tone cluster calculations for a given input image so as to estimate an ethnicity of the detected face of the input image. In accordance with one embodiment of the subject application, adjustments of naturally darker faces of a person of African or Indian descent must be made differently from naturally lighter faces. Facial tone cluster models (light and dark models) are used so as to determine the appropriate category of a detected face in an input image. A facial tone cluster is generated for the detected face and its overlap with the two models is measured. The percentage overlap is used to determine the ethnicity of the detected face in the input image.
Suitable examples of such cluster calculations are detailed in the commonly assigned, co-pending application Ser. No. 12/592,110, filed Nov. 19, 2009, the entirety of which is incorporated herein. According to such an example of facial tone cluster calculations, a facial tone cluster model is first built by a) collecting typical faces of some specific ethnicities; b) cropping off the facial region; c) removing non-flesh tone regions like eyes, nose and lips; d) converting to CIE L*a*b* color space; and e) rounding off to integers to form a point set. This model is capable of being enhanced by systematically generating or collecting typical faces under various controlled lighting conditions and merging these faces into the point set. Preferably, two facial tone cluster models are used, one for darker facial tone (African/Indian) and one for lighter facial tone (European). According to one embodiment, the computation cost of such models is reduced by construction of a bounding box for each facial tone cluster model. A binary, three-dimensional matrix M(i,j,k) is then constructed for each model such that when an entry in M equals 1, the pixel of code value (i,j,k) in L*a*b* color space is identified as a facial tone color in the model, and 0 means otherwise. The boundary data of the models are calculated and stored off-line, and each time the likelihood of ethnicity is to be determined for an input facial region, these boundary data are retrieved from off-line for overlap comparison. Given an input facial region, a) the image is first converted to L*a*b* color space; b) the degrees of overlap with the models representing darker facial tone (African) and lighter facial tone (European) are then ranked to determine the likely ethnicity to which this input facial region belongs; and c) the degree of overlap is calculated by counting the total number of pixels in the input facial region that are within the facial tone cluster model boundaries.
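The binary matrix M(i,j,k) and the overlap count described above can be sketched as follows. The array shape and the offsets that shift the signed a*/b* channels into array range are assumptions of this sketch; the color conversion to L*a*b* is assumed to have already been performed.

```python
import numpy as np

def build_cluster_matrix(lab_points, shape=(101, 256, 256), offset=(0, 128, 128)):
    """Binary 3-D matrix M(i, j, k): True if the quantized L*a*b* code
    value (i, j, k) is a facial tone color in the model."""
    M = np.zeros(shape, dtype=bool)
    for L, a, b in lab_points:
        M[L + offset[0], a + offset[1], b + offset[2]] = True
    return M

def overlap_count(face_lab, M, offset=(0, 128, 128)):
    """Count pixels of an input facial region (an N x 3 array of
    integer L*a*b* values) that fall inside the model."""
    idx = face_lab + np.array(offset)
    return int(M[idx[:, 0], idx[:, 1], idx[:, 2]].sum())

def likely_ethnicity(face_lab, dark_model, light_model):
    """Rank overlap with the darker and lighter facial tone models."""
    dark = overlap_count(face_lab, dark_model)
    light = overlap_count(face_lab, light_model)
    return "dark" if dark > light else "light"
```

In practice the two model matrices would be computed off-line from collected face data and loaded at run time, as the text describes.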
The darkness measure of the cropped facial region is adjusted (by increasing the mid-point value by 50%) if the facial tone cluster calculation determines that the face most likely belongs to an ethnicity with a naturally darker skin tone, e.g., African/Indian.
A determination is then made at step 1116 whether the input image, based upon the calculated facial tone cluster, contains a face of African/Indian ethnicity. Upon a positive determination at step 1116, operations proceed to step 1118, whereupon the mid-point is modified by a predetermined amount, e.g., increased by 50%. The skilled artisan will appreciate that such a modification is capable of being above or below 50%, and the subject application is not limited to this amount. The severity category is then adjusted at step 1120 in accordance with the newly modified mid-point. A determination is then made at step 1122 whether the category of the non-African/Indian face (a negative determination at step 1116) or of the adjusted African/Indian face is equal to a severity category of zero (0). If positive at step 1122, operations end with no backlight adjustment made to the input image.
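Steps 1116 through 1122 can be sketched as the following branch. The `severity_fn` helper, which maps a mid-point value to a severity category, is an assumed placeholder; the 50% boost follows the text.

```python
def adjust_for_skin_tone(midpoint, severity_fn, is_dark_skin_tone, boost=1.5):
    """Steps 1116-1122: raise the mid-point for naturally darker faces
    before re-deriving the severity category.

    Returns (adjusted mid-point, severity category, skip-correction flag).
    """
    if is_dark_skin_tone:             # step 1116 positive -> step 1118
        midpoint = midpoint * boost   # e.g., increase mid-point by 50%
    category = severity_fn(midpoint)  # step 1120: re-derive the category
    return midpoint, category, category == 0  # step 1122: zero -> no adjustment
```

A category of zero ends processing with no backlight adjustment, so a naturally dark face whose boosted mid-point clears the darkness threshold is left untouched.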
One method for brightness adjustment, e.g., the “Sectional Bulging” approach, as set forth in commonly assigned U.S. patent application Ser. No. 12/194,025, filed Aug. 19, 2008, incorporated herein, applies bulging with a first curvature over a section of the code value interval between two anchor points; after the brightness adjustment, saturation enhancement is applied by bulging with another curvature.
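One way to realize such a sectional bulge is a gamma-style bend applied only between the two anchor points, with the identity mapping outside the section. The anchor points and curvature value below are illustrative assumptions, not those of the referenced application.

```python
def sectional_bulge(code_value, low_anchor, high_anchor, gamma):
    """Bulge the tone curve between two anchor points.

    Inside [low_anchor, high_anchor] the curve is bent by `gamma`
    (gamma < 1 brightens); outside the section it is the identity.
    """
    if code_value <= low_anchor or code_value >= high_anchor:
        return code_value
    span = high_anchor - low_anchor
    t = (code_value - low_anchor) / span      # normalize to [0, 1]
    return low_anchor + span * (t ** gamma)   # gamma bulge, then rescale

# A lookup table for all 8-bit code values, e.g., brightening shadows
# over the section [0, 128] with an assumed curvature of 0.6:
lut = [round(sectional_bulge(v, 0, 128, 0.6)) for v in range(256)]
```

Because the anchor points themselves map to themselves, the adjusted curve joins the identity portion continuously, avoiding visible banding at the section boundaries.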
It will be appreciated by those skilled in the art that in a typical backlit scene, there exists a “plateau” phenomenon 2204 in the cumulative histogram (as shown in
HighY=HighX+DeltaY, where DeltaY=(255−HighX)*P%
and the percentage P is a function of the Severity Category. The two curvature factors, GammaB and GammaS, are also functions of the Severity Category. The following is the pseudo-code for these parameters:
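The original pseudo-code is not reproduced here; the following Python sketch shows only the functional form the text describes, i.e., P, GammaB, and GammaS each driven by the Severity Category. Every numeric value below is a hypothetical placeholder.

```python
def bulge_parameters(severity_category):
    """Illustrative mapping from Severity Category to the bulging
    parameters. The numbers are hypothetical placeholders; only the
    form (each parameter a function of the category) follows the text."""
    # Stronger backlighting -> larger lift P and stronger curvatures.
    p_percent = {1: 20, 2: 35, 3: 50}.get(severity_category, 0)
    gamma_b = {1: 0.85, 2: 0.75, 3: 0.65}.get(severity_category, 1.0)
    gamma_s = {1: 0.95, 2: 0.90, 3: 0.85}.get(severity_category, 1.0)
    return p_percent, gamma_b, gamma_s

def high_y(high_x, severity_category):
    """HighY = HighX + DeltaY, where DeltaY = (255 - HighX) * P%."""
    p_percent, _, _ = bulge_parameters(severity_category)
    return high_x + (255 - high_x) * p_percent / 100.0
```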
If a plateau is detected at step 1126, flow proceeds to step 1128, whereupon the X-coordinate of the Inflection Point is set to the center of the plateau. If no plateau is detected at step 1126, the X-coordinate of the Inflection Point (HighX) is set to the default center of 127 at step 1130. At step 1132, the Y-coordinate of the Inflection Point and the Bulging Curvatures of the brightness and saturation enhancement are calculated from the Severity Category, i.e., HighY, GammaB, and GammaS are calculated. Correction is then performed at step 1134 and the automatically corrected backlit face image is output at step 1136.
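One simple way to locate such a plateau, assuming a normalized 8-bit cumulative histogram, is to find the longest run of code values over which the cumulative count barely increases. The flatness tolerance and minimum run length below are assumptions of this sketch.

```python
import numpy as np

def plateau_center(cum_hist, flat_tol=0.001, min_len=20):
    """Find the center of a 'plateau' (a long flat run) in a normalized
    cumulative histogram; return None if no plateau is detected."""
    best_start, best_len = None, 0
    run_start = 0
    for i, d in enumerate(np.diff(cum_hist)):
        if d > flat_tol:                      # slope too steep: run ends
            run_start = i + 1
        elif i + 1 - run_start > best_len:    # extend the best flat run
            best_len = i + 1 - run_start
            best_start = run_start
    if best_start is None or best_len < min_len:
        return None
    return best_start + best_len // 2

def inflection_x(cum_hist):
    """Steps 1126-1130: plateau center if found, else the default 127."""
    center = plateau_center(cum_hist)
    return 127 if center is None else center
```

With HighX chosen this way, HighY, GammaB, and GammaS follow from the Severity Category as described above, and the correction is applied.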
The foregoing description of a preferred embodiment of the subject application has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the subject application to the precise form disclosed. Obvious modifications or variations are possible in light of the above teachings. The embodiment was chosen and described to provide the best illustration of the principles of the subject application and its practical application to thereby enable one of ordinary skill in the art to use the subject application in various embodiments and with various modifications as are suited to the particular use contemplated. All such modifications and variations are within the scope of the subject application as determined by the appended claims when interpreted in accordance with the breadth to which they are fairly, legally and equitably entitled.
This application is related and claims priority to U.S. Provisional Patent Application Ser. No. 61/229,878, filed on Jul. 30, 2009 titled SYSTEM AND METHOD FOR AUTOMATIC BACKLIT FACE CORRECTION, the entirety of which is incorporated herein.
Number | Date | Country
---|---|---
61229878 | Jul 2009 | US