The present invention relates generally to imaging systems, and more specifically, to image color correction systems.
Retail providers typically offer many types of items for sale and carry a significant variety of inventory. Such retailers often sell items using online commerce platforms. These online commerce platforms provide customers with the functionality to purchase items from retailers through websites or through mobile applications. The customer accesses online commerce platforms via their desktop or mobile internet browsers or via a mobile application downloaded and installed on their mobile device.
After selecting an item of interest to view, the customer is presented with a Product Description Page (PDP). The PDP provides the customer with information in connection with the selected item of interest, including one or more images, price information, and a description of the item. The customer is able to purchase the selected item via the PDP. Color correction is a process in which an image is altered to match a consistent standard of appearance. Color correction is essential to establish true colors and maintain color constancy across different photo studios in warehouses.
In various embodiments, automated color correction of studio images is disclosed. In one embodiment, color correction is performed in a two-step process. In the first step, color correction information is determined from a color checker image associated with a photo set of images. In a second step, the color correction information is automatically applied to the images of the photo set to generate color-corrected images.
In one embodiment, the color correction information is generated as a color correction matrix that is used to correct the raw images of the photo set. In another embodiment, the color correction information is generated as a set of white-balance parameters that are used to correct the raw images of the photo set.
Both embodiments automatically color-correct the images of a photo set, providing efficient color correction that is much faster than manual color correction techniques. Thus, thousands of images can be easily, consistently, and automatically color-corrected based on processing of the color checker image.
In one embodiment, a method is provided that includes receiving a plurality of images and a color correction image associated with the plurality of images, generating color correction information from the color correction image, and automatically processing the plurality of images based on the color correction information to generate a plurality of color-corrected images.
In one embodiment, a color correction system comprises a correction information generator and an image processor. The correction information generator generates color correction information from a color correction image associated with a plurality of images. The image processor automatically processes the plurality of images based on the color correction information to generate a plurality of color-corrected images.
Further details and embodiments and methods are described in the detailed description below. This summary does not purport to define the invention. The invention is defined by the claims.
The patent or application file contains at least one drawing executed in color. Copies of this patent publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
The accompanying drawings, where like numerals indicate like components, illustrate embodiments of the invention.
Reference will now be made in detail to some embodiments of the invention, examples of which are illustrated in the accompanying drawings.
In one embodiment, the color checker thumbnail generator 404 receives the raw color correction image 108 and generates a color correction thumbnail image 414. The color checker card detector 406 receives the color correction thumbnail image 414 and detects the presence of a color checker card 204 within the thumbnail image 414. The patch segmenter 408 identifies color patches within the detected color checker card 204 and determines an observed color for each patch to form the matrix [O] 416.
The correction matrix processor 410 performs matrix computations to determine a color correction matrix [A] 412. In one embodiment, the correction matrix processor 410 solves the following matrix equation.

[P] = [O] * [A]

In the above equation, [O] are the observed colors read from the color checker patch segments in the color correction image, and [P] are the reference color values from the table 302 of color values. In one embodiment, the correction matrix processor 410 solves the above equation using a Nelder-Mead algorithm. However, other suitable algorithms can be used by the correction matrix processor 410 to solve the above equation to obtain the color correction matrix [A] 412. In one embodiment, the matrix [A] 412 forms the correction information 112 shown in
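By way of a non-limiting illustrative sketch, the above equation can be solved numerically for the 3×3 matrix [A] using a Nelder-Mead minimizer, for example as provided by SciPy; the function and variable names below are illustrative assumptions rather than requirements of the embodiments.

```python
import numpy as np
from scipy.optimize import minimize

def solve_correction_matrix(O: np.ndarray, P: np.ndarray) -> np.ndarray:
    """Solve [P] = [O] * [A] for the 3x3 color correction matrix [A].

    O: (n_patches, 3) observed patch colors read from the color checker image.
    P: (n_patches, 3) reference patch colors from the table of color values.
    """
    def residual(a_flat):
        A = a_flat.reshape(3, 3)
        return np.sum((O @ A - P) ** 2)  # sum of squared color errors over all patches

    A0 = np.eye(3).ravel()  # start from the identity matrix (no correction)
    result = minimize(residual, A0, method="Nelder-Mead",
                      options={"xatol": 1e-8, "fatol": 1e-8, "maxiter": 20000})
    return result.x.reshape(3, 3)
```

A direct linear least-squares solve of [P] = [O] * [A] would also be a suitable alternative algorithm.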
At block 502, a color correction image is received. For example, the CC thumbnail generator 404 receives the color correction image 108.
At block 504, a thumbnail color checker image is generated from the received color correction image. For example, the CC thumbnail generator 404 generates the color correction thumbnail image 414.
At block 506, the thumbnail color checker image is processed to detect a color checker card. For example, the CC card detector 406 performs this operation. In one embodiment, an edge-detection algorithm is performed by the CC card detector 406 to detect the presence of the color checker card within the color correction thumbnail image 414. As illustrated in
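By way of a non-limiting illustrative sketch, the edge-detection operation could be implemented using functions of the OpenCV library as shown below; the specific function calls and threshold values are illustrative assumptions and not requirements of the embodiments.

```python
import cv2
import numpy as np

def detect_color_checker_card(thumbnail_bgr: np.ndarray):
    """Return the bounding rectangle (x, y, w, h) of the largest roughly
    rectangular contour, assumed here to be the color checker card."""
    gray = cv2.cvtColor(thumbnail_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)  # edge detection; thresholds are illustrative
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best = None
    for contour in contours:
        approx = cv2.approxPolyDP(contour, 0.02 * cv2.arcLength(contour, True), True)
        if len(approx) == 4:  # quadrilateral candidate for the card outline
            if best is None or cv2.contourArea(approx) > cv2.contourArea(best):
                best = approx
    return None if best is None else cv2.boundingRect(best)
```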
At block 508, the detected color checker card is segmented into patch regions. For example, the patch segmenter 408 performs this operation. In one embodiment, the patch segmenter 408 uses the size and location of the detected color checker card to identify the patch regions. For example, as illustrated in
Next, a detection rectangle is generated by the patch segmenter 408 to pass through a plurality of patch segments. For example, as illustrated in
Next, the patch segmenter 408 reads a plurality of pixels within a block of predefined size in each patch along the detection rectangle 602. For example, in one embodiment, a block size of 30×30 pixels is used to read pixel color values within each block along the detection rectangle 602. The median pixel color value is then selected as the observed color value for the patch. The observed color values for the patches form the matrix [O] 416 described above.
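A minimal sketch of this sampling step is shown below, assuming the sampling locations along the detection rectangle are given as a list of block centers; the function and parameter names are illustrative assumptions.

```python
import numpy as np

def observed_patch_colors(image: np.ndarray, block_centers, block_size: int = 30):
    """Read a block_size x block_size block of pixels at each patch location
    along the detection rectangle and take the per-channel median as the
    observed color of that patch. Returns the (n_patches, 3) matrix [O]."""
    half = block_size // 2
    colors = []
    for (cx, cy) in block_centers:  # (x, y) pixel coordinates of each block center
        block = image[cy - half:cy + half, cx - half:cx + half].reshape(-1, 3)
        colors.append(np.median(block, axis=0))  # median is robust to noise within the patch
    return np.array(colors)
```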
At block 510, the patch regions are processed to generate the color correction matrix. In one embodiment, the correction matrix processor 410 solves the equation ([P]=[O]*[A]) to determine the color correction matrix [A] 412, which represents the color correction information 112.
Thus, method 500 operates to generate correction information in accordance with one novel aspect. It should be noted that the operations of method 500 are exemplary and that the operations can be changed, added to, deleted, rearranged, or otherwise modified within the scope of the embodiments.
During operation, the images of a photo set 110 are input to the thumbnail processor 802, which generates thumbnail images 808 of the images in the photo set. The output of the thumbnail processor 802 is input to the image-cropping processor 804, which crops the thumbnail images 808 to remove artifacts that may be in the thumbnail images 808 to generate cropped images 810. The cropped images 810 output from the image-cropping processor 804 are input to the color correction processor 806. The color correction processor 806 also receives the color correction matrix [A] 412 and uses the matrix [A] 412 to correct the colors of the images of the photo set 110 to generate the color-corrected images 114.
At block 902, raw photo set images are received. For example, the photo set images 110 are received by the thumbnail processor 802 that generates thumbnail images 808.
At block 904, the thumbnail images are cropped. For example, the image cropping processor 804 crops the thumbnail images 808 to generate cropped images 810.
At block 906, the cropped images are processed using the color correction matrix [A] to generate color-corrected images. For example, the color correction processor 806 receives the cropped images 810 and the color correction matrix [A] 412 and processes the images 810 to generate the color-corrected images 114. In one embodiment, each image is linearized by converting it from sRGB color space to linear RGB color space, the linearized image is multiplied by the correction matrix, exposure in the image is corrected, and the image is then converted back to sRGB color space.
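By way of a non-limiting illustrative sketch, this per-image correction could be implemented as follows; the standard sRGB transfer functions are used for linearization, and the exposure correction is assumed here to be a simple scalar gain, which is an illustrative assumption.

```python
import numpy as np

def srgb_to_linear(img):
    """Convert sRGB values in [0, 1] to linear RGB (standard sRGB formula)."""
    return np.where(img <= 0.04045, img / 12.92, ((img + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(img):
    """Convert linear RGB values in [0, 1] back to sRGB."""
    return np.where(img <= 0.0031308, img * 12.92, 1.055 * img ** (1 / 2.4) - 0.055)

def apply_color_correction(srgb_img: np.ndarray, A: np.ndarray, exposure_gain: float = 1.0):
    """Linearize, multiply by the 3x3 correction matrix [A], correct exposure,
    and convert back to sRGB. Pixel values are assumed to be in [0, 1]."""
    linear = srgb_to_linear(srgb_img)
    h, w, _ = linear.shape
    corrected = (linear.reshape(-1, 3) @ A).reshape(h, w, 3)
    corrected = np.clip(corrected * exposure_gain, 0.0, 1.0)  # exposure correction, clipped to valid range
    return linear_to_srgb(corrected)
```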
Thus, method 900 operates to generate color-corrected images in accordance with one novel aspect. It should be noted that the operations of method 900 are exemplary and that the operations can be changed, added to, deleted, rearranged, or otherwise modified within the scope of the embodiments.
The correction information generator 1000 comprises the color checker processor 402, which includes the color checker thumbnail generator 404, the color checker card detector 406, and the patch segmenter 408. The correction information generator 1000 also comprises a white balance and parameter detector 1002 that includes a white balance processor 1004 and an optimization minimizer 1006.
During operation, the white balance processor 1004 receives the raw color checker image 108 and a neutral gray patch 22 value from the patch segmenter 408. The white balance processor 1004 processes the patch 22 value and the raw image 108 to determine desired white balance values.
In one embodiment, the white balance processor 1004 takes the red/green/blue (RGB) values of the patch 22 from the patch segmenter 408 and compares them to the reference values for the patch 22 of the color checker card. In one embodiment, the observed patch 22 values are adjusted to be the same as the reference values. In one embodiment, to adjust the values, the value for the green channel is fixed and the red and blue channels are adjusted to be the same as the green channel. To apply the correction, each channel is multiplied by its determined white balance factor.
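A minimal sketch of this adjustment is shown below, assuming the observed neutral (patch 22) color is provided as an (R, G, B) triple with nonzero values; the function names are illustrative assumptions.

```python
import numpy as np

def white_balance_gains(observed_gray_rgb):
    """Compute per-channel white balance factors from the observed neutral
    patch color: the green channel is fixed, and the red and blue channels
    are scaled so that all three channels match green on the gray patch."""
    r, g, b = observed_gray_rgb  # observed values assumed nonzero
    return np.array([g / r, 1.0, g / b])

def apply_white_balance(image_rgb: np.ndarray, gains: np.ndarray):
    """Multiply every pixel's channels by the white balance factors."""
    return image_rgb * gains  # broadcasts over the last (channel) axis
```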
In another embodiment, the white balance adjustment alone does not correct the brightness of the raw image 108. To correct this, a library is accessed to obtain other optimized parameters for optimizing the raw image 108. The result is that the correction parameters 1008 comprise two sets of parameters, white balance parameters 1010 and other optimized parameters 1012, which are stored and applied to the raw images 108.
The optimization minimizer 1006 generates the other optimized parameters 1012, such as exposure, gamma, and brightness, which are optimized separately using all patches provided in the [O] matrix 416.
In one embodiment, the application of parameters to the images is done through library functions. The library functions allow a variety of parameters to be applied to the pixels of an image for image correction. In one embodiment, library functions implementing the Nelder-Mead method are used to solve the optimization, such as those of the library for optimization (CFI). Some examples of an initial set of values to be set for optimization include the following.
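The specific initial values are implementation-dependent and are not reproduced here. Purely by way of an illustrative sketch with placeholder starting values, the exposure, gamma, and brightness parameters could be fitted over all observed patches using SciPy's Nelder-Mead minimizer; the parameter model and function names below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def fit_exposure_gamma_brightness(O: np.ndarray, P: np.ndarray):
    """Fit exposure, gamma, and brightness so that the observed patch
    colors [O] best match the reference colors [P] (values in [0, 1])."""
    def apply_params(colors, exposure, gamma, brightness):
        adjusted = np.clip(colors * exposure + brightness, 1e-6, 1.0)
        return adjusted ** gamma  # simple illustrative tone adjustment model

    def cost(params):
        exposure, gamma, brightness = params
        return np.sum((apply_params(O, exposure, gamma, brightness) - P) ** 2)

    x0 = np.array([1.0, 1.0, 0.0])  # placeholder initial values: no adjustment
    result = minimize(cost, x0, method="Nelder-Mead")
    return dict(zip(("exposure", "gamma", "brightness"), result.x))
```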
At block 1102, a color checker image is received by the color checker processor 402, which generates the patch 22 value as described above.
At block 1104, a white balance is performed using the patch 22 value. For example, the white balance processor 1004 receives the patch 22 value and the raw image 108 and determines white balance parameters 1010 for patch region 22 of the color checker card. For example, the white balance operation is performed by fixing one of the three channels (e.g. green) and adjusting the other two channels (e.g. red and blue) to have the same value on the gray patch of the color checker card.
At block 1106, one or more additional parameters are determined. For example, the optimization minimizer 1006 determines the one or more additional parameters 1012 from the [O] matrix 416. In one embodiment, the additional parameters 1012 include one or more of exposure, gamma, and brightness parameters.
At block 1108, the correction parameters are output. For example, the determined white balance parameters 1010 and the one or more additional parameters 1012 are output as the correction parameters 1008. The correction parameters represent the color correction information.
Thus, method 1100 operates to provide an alternative method for generating color correction information in accordance with one novel aspect. It should be noted that the operations of method 1100 are exemplary and that the operations can be changed, added to, deleted, rearranged, or otherwise modified within the scope of the embodiments.
At block 1302, raw photo set images 110 are received by the color correction processor 1202.
At block 1304, correction parameters 1008 are received by the color correction processor 1202.
At block 1306, the correction parameters 1008 are used to process the raw photo set images 110 to generate color-corrected images 1206.
At block 1308, the color-corrected images 1206 are cropped to generate the final color-corrected images 114.
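By way of a non-limiting illustrative sketch, blocks 1302 through 1308 could be carried out as follows; the order in which the white balance and the other optimized parameters are applied, and all function and parameter names, are illustrative assumptions.

```python
import numpy as np

def correct_photo_set(raw_images, wb_gains, params, crop_box=None):
    """Apply the white balance factors and the other optimized parameters
    (exposure, gamma, brightness) to each raw image, then crop the result."""
    corrected = []
    for img in raw_images:  # pixel values assumed to be in [0, 1]
        out = img * wb_gains                                    # white balance parameters
        out = out * params["exposure"] + params["brightness"]   # exposure and brightness
        out = np.clip(out, 0.0, 1.0) ** params["gamma"]         # gamma adjustment
        if crop_box is not None:
            x, y, w, h = crop_box
            out = out[y:y + h, x:x + w]                         # crop to final framing
        corrected.append(out)
    return corrected
```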
Thus, method 1300 operates to provide an alternative method for generating color-corrected images in accordance with one novel aspect. It should be noted that the operations of method 1300 are exemplary and that the operations can be changed, added to, deleted, rearranged, or otherwise modified within the scope of the embodiments.
Although certain specific embodiments are described above for instructional purposes, the teachings of this patent document have general applicability and are not limited to the specific embodiments described above. Accordingly, various modifications, adaptations, and combinations of various features of the described embodiments can be practiced without departing from the scope of the invention as set forth in the claims.