This application claims the benefit under 35 U.S.C. § 365 of International Application PCT/EP2017/080412, filed Nov. 24, 2017, which was published in accordance with PCT Article 21(2) on Jun. 21, 2018, in English, and which claims the benefit of European Patent Application No. 16306696.2, filed Dec. 15, 2016.
The present disclosure relates to the domain of color grading of videos, for example when a transformation is applied to an original video for different viewing conditions.
This section is intended to introduce the reader to various aspects of art, which may be related to various aspects of the present invention that are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present invention. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
In creation and distribution of pictures and videos, it is known to produce a first version of a video, called original video, for a specific original display having original viewing conditions. In the following, viewing conditions include both the display on which the content is rendered and the environment of the rendering. Thus viewing conditions can include the color gamut of the display, the maximum luminance of the display, but also ambient light falling on the display, the luminance and chromaticity of the background, the adopted white of the human eye, the viewing distance, or the temporal evolution in any of those conditions.
It is further known that the creator of the video produces a second version of the video, called reference video, that is well adapted to be displayed on a reference display having reference viewing conditions that are different from the original viewing conditions. This reference video may be generated either by manual color grading, by color processing such as gamut mapping and tone mapping, or by a combination of manual color grading followed by color processing.
However, such a reference video is of a fixed nature in the sense that it contains colors intended to be shown under the reference viewing conditions. If the actual viewing conditions differ from the reference viewing conditions, the reference video is not rendered correctly. For example, images prepared for a television set in a living room should be enhanced in contrast and saturation when shown on a tablet outside in the sun.
When, additionally to the original video and the reference video, more versions are generated, such videos may require too much storage space or transmission rate in order to show the content later or at a distant place. A known solution to this problem consists in storing or transmitting a color transform, being any combination of color grading and color processing, that can derive the additional video, for example, from the reference video. Color grading and color processing can include modification of intensity, saturation, hue, or contrast, in images, parts of images, multiple images or temporal image sequences. The advantage is that instead of storing/transmitting an additional video, only a color transform has to be stored/transmitted. However, these known methods still require the storage and transmission of each color transform. The transmission channel may not allow transmitting additional color transforms corresponding to additional viewing conditions. The distant receiver can then neither receive an additional transform nor generate an additional target version of the reference video. A method for generating a new color video using color transforms adapted to target viewing conditions, such that these color transforms require a minimum of storage space and a low transmission rate, is therefore desirable, in particular when the content receiver is distant.
The purpose of the invention is to overcome at least one of the disadvantages of the prior art by proposing a method for generating a color video adapted to target viewing conditions from existing color transforms for other viewing conditions. Indeed, a salient idea of the invention is to determine and store interpolation parameters for generating an estimation of an existing color transform from a sub-set of the other existing color transforms. Such an existing color transform is advantageously removed from storage, while only the sub-set of existing color transforms and the determined interpolation parameters are stored. When the estimation of the color transform using interpolation is done at a receiver distant from the production, based on the sub-set of other existing color transforms and the determined interpolation parameters, the interpolation at the receiver is advantageously not blind, but guided by the existing color transform used at the production in the determination of the interpolation parameters.
According to a first aspect, a method for generating a target color graded version of an original picture from a first color transform and a second color transform is disclosed. The first color transform is applied to an original picture to generate a first color graded version of the original picture. The second color transform is applied to the original picture to generate a second color graded version of the original picture. Advantageously, the method further comprises, for instance at a receiver implementing the method, receiving at least one interpolating parameter such that a fourth color transform is obtained by applying a parametric function determined by the at least one interpolating parameter to the first color transform and the second color transform and such that the fourth color transform is close to a third color transform, the third color transform transforming the original picture into the target color graded version of the picture. Then the method comprises generating the fourth color transform by an interpolation that applies the parametric function determined by the at least one interpolating parameter to the first color transform and the second color transform; and generating the target color graded version of the original picture by applying the fourth color transform to the original picture.
According to a specific characteristic, the at least one interpolating parameter is obtained by minimizing an error between the third color transform and the fourth color transform. The minimization is for instance performed on a set of sample color values. Advantageously, the method is compliant with any method assessing that the third color transform and the fourth color transform are close, whether algorithmic (error minimization), manual (an operator adjusting the parameter until the color graded images are satisfying), or any combination of the two. In other words, the interpolated fourth color transform is representative of the third color transform.
According to a second aspect, a method for generating the fourth color transform is also disclosed. Advantageously, the method comprises, for instance at a production device implementing the method, obtaining the first color transform, the second color transform (references) and the third color transform (exemplary) and obtaining at least one interpolating parameter wherein the fourth color transform is close to or representative of the third color transform (exemplary) and the fourth color transform is obtained by applying a parametric function determined by the at least one interpolating parameter to the first color transform and second color transform.
According to a specific characteristic, the at least one interpolating parameter is obtained by minimizing an error between the third color transform and the fourth color transform.
According to another specific characteristic, the method further comprises calculating the fourth color transform by an interpolation that applies the parametric function determined by the at least one interpolating parameter to the first color transform and the second color transform. Advantageously, the fourth color transform is thus available either for assessment or for iterative refinement of the parameters.
According to another specific characteristic, the method further comprises transmitting the first color transform, the second color transform and the at least one interpolating parameter, for instance to a distant receiver.
According to another specific characteristic, the method further comprises obtaining a difference between the third color transform and the fourth color transform for a set of sample color values; and, in case the difference is above a value, repeating the obtaining of the at least one interpolating parameter. The fourth color transform is thus iteratively refined, which is particularly well adapted in case the interpolation function is complex, such as a non-linear function.
According to a third aspect, a device for generating a target color graded version of an original picture is disclosed that comprises a processor configured to obtain a first color transform and a second color transform, wherein the first color transform transforms the original picture into a first color graded version of the original picture, the second color transform transforms the original picture to a second color graded version of the original picture; to receive at least one interpolating parameter such that a fourth color transform is obtained by applying a parametric function determined by the at least one interpolating parameter to the first color transform and the second color transform, and such that the fourth color transform is close to a third color transform and the third color transform transforms the original picture into the target color graded version of the picture; to generate the fourth color transform by an interpolator that applies the parametric function determined by the at least one interpolation parameter to the first color transform and the second color transform; and to generate the target color graded version of the original picture by applying the fourth color transform to the original picture.
In a variant, a device is disclosed that comprises means for obtaining a first color transform and a second color transform, means for receiving at least one interpolating parameter and interpolation means for generating the fourth color transform by applying the parametric function determined by the at least one interpolation parameter to the first color transform and the second color transform; and means for generating the target color graded version of the original picture by applying the fourth color transform to the original picture.
According to a specific embodiment, the device belongs to a set comprising:
According to a fourth aspect, a device for generating a fourth color transform wherein the fourth color transform transforms an original picture into a target color graded version of the original picture is disclosed. The device comprises a processor configured to obtain a first color transform and a second color transform; obtain a third color transform (exemplary) wherein the third color transform transforms the original picture to the target color graded version of the original picture; and obtain at least one interpolating parameter wherein the fourth color transform is close to the exemplary third color transform and the fourth color transform is obtained by applying a parametric function determined by the at least one interpolating parameter to the first color transform and the second color transform.
In a variant, a device is disclosed that comprises means for obtaining the first color transform, the second color transform, and the third color transform and means for obtaining at least one interpolating parameter for the fourth color transform for instance by minimizing of an error between the third color transform and the fourth color transform.
According to a fifth aspect, a computer program product comprising program code instructions to execute the steps of any of the disclosed methods when this program is executed on a computer is disclosed.
According to a sixth aspect, a processor readable medium is disclosed that has stored therein instructions for causing a processor to perform at least the steps of any of the disclosed methods.
According to a seventh aspect, a non-transitory program storage device is disclosed that is readable by a computer and tangibly embodies a program of instructions executable by the computer to perform any of the disclosed methods.
While not explicitly described, the present embodiments may be employed in any combination or sub-combination. For example, the invention is not limited to the described color transforms, and any adjustable parametric function can be used for interpolation purposes.
Besides, any characteristic or embodiment described for the methods is compatible with a device intended to process the disclosed method and with a computer-readable storage medium storing program instructions.
Other characteristics and advantages of the invention will appear through the description of a non-limiting embodiment of the present principles, which will be illustrated, with the help of the enclosed drawings:
A color gamut is a certain complete set of colors. The most common usage refers to a set of colors which can be accurately represented in a given circumstance, such as within a given color space or by a certain output device.
A color gamut is often defined by a color space and the dynamic range (i.e. min/max luminance) of the values, or coordinates, represented in the color space. A color space may further be specified by color primaries and by the reference white. An example of such a color space is RGB BT.2020 with D65 reference white and with minimum values equal to 0 and maximum values equal to 1. In this case, the values are relative values. RGB BT.709 with D65 reference white and with minimum values equal to 0 and maximum values equal to 1 is another example of such a color space. When working with a relative color space, for example BT.709, and a display having an absolute peak luminance, for example 100 cd/m2, a relative luminance of BT.709, calculated from a weighted sum of R, G, and B color values according to BT.709, is multiplied by 100, resulting in an absolute luminance in the range from 0 cd/m2 to 100 cd/m2. Viewing conditions include additional characteristics such as absolute maximum display luminance (in cd/m2), CIE 1931 x,y chromaticities of the background and/or surround of the display, the viewing distance and the viewing angle of the observer. According to the present principles, a color transform adapted to an absolute display luminance of 1000 cd/m2 is advantageously interpolated from a color transform adapted to an absolute display luminance of 2000 cd/m2 and a color transform adapted to an absolute display luminance of 100 cd/m2, where the parameters of the interpolation function are guided by another reference color transform adapted to an absolute display luminance of 1000 cd/m2. The color transforms for 100 cd/m2 and for 2000 cd/m2 are for instance advantageously transmitted as metadata to a distant receiver implementing the method and connected to a display of 1000 cd/m2.
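The relative-to-absolute luminance conversion described above can be sketched as follows; the function name and the sample values are illustrative assumptions, while the luma weights are the BT.709 coefficients.

```python
# Sketch of the relative-to-absolute luminance conversion described above,
# using the BT.709 luma coefficients and an absolute display peak of
# 100 cd/m^2. The function name and sample values are illustrative.

BT709_WEIGHTS = (0.2126, 0.7152, 0.0722)  # BT.709 luma coefficients

def absolute_luminance(r, g, b, peak_cd_m2=100.0):
    """Relative BT.709 luminance scaled by the display's absolute peak."""
    wr, wg, wb = BT709_WEIGHTS
    relative = wr * r + wg * g + wb * b  # in [0, 1] for in-gamut colors
    return relative * peak_cd_m2

# A mid-grey linear RGB value maps to half the peak luminance:
print(round(absolute_luminance(0.5, 0.5, 0.5), 6))  # 50.0
```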
Advantageously, once the parameters are obtained, the other reference color transform adapted to an absolute display luminance of 1000 cd/m2 is removed from production, thus reducing the storage of dedicated transforms.
For example, an original luminance L0 is obtained according to

L0=0.2627R0+0.6780G0+0.0593B0
from an original color having the color coordinates R0, G0, B0, a reference luminance LR of the color graded version for 2000 cd/m2 is obtained according to
LR=0.2627R2000+0.6780G2000+0.0593B2000
from an existing color having the color coordinates R2000, G2000, B2000, and a target luminance LT of the color graded version for 1000 cd/m2 is obtained according to
LT=0.2627RT+0.6780GT+0.0593BT
from a target color having the color coordinates RT, GT, BT.
In this variant of tone mapping, a color transform is based on a single coefficient. For example, the reference color transform is
RR=LR/L0×R0
GR=LR/L0×G0
BR=LR/L0×B0
and its coefficient is LR/L0. The target color transform is
RT=LT/L0×R0
GT=LT/L0×G0
BT=LT/L0×B0
and its coefficient is LT/L0. An interpolated color transform is used at the receiver distant from the production according to:
RI=LI/L0×R0
GI=LI/L0×G0
BI=LI/L0×B0
and its coefficient is LI/L0. This coefficient of the interpolated color transform is calculated from a luminance LI that is interpolated using the following interpolation function:
LI=ƒ(LR,L0)
The interpolation function can be linear, for example:
LI=aLR+bL0+c
where the interpolation parameters a, b, c are determined at production using linear regression such that the remaining error

Σi[LT,i−aLR,i−bL0,i−c]2

over a set of sample luminances (indexed by i) is minimal. Thus, according to the present principles, the interpolation parameters a, b, c are transmitted to a receiver instead of the transform represented by its coefficient LI/L0. The skilled in the art will appreciate that the reference transform might be much more complex than in this exemplary embodiment, and the gain is then all the more important. In any case, the way the color graded versions and reference transforms are obtained, and the form of the reference transform, are out of the scope of the present disclosure.
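The production-side regression described above can be sketched as follows; the sample luminances are synthetic values invented purely for illustration, and NumPy's least-squares solver stands in for whatever regression tool is actually used.

```python
import numpy as np

# Hedged sketch of the parameter fit described above: given per-sample
# original luminances L0, reference luminances LR (2000 cd/m^2 grade) and
# target luminances LT (1000 cd/m^2 grade), fit LI = a*LR + b*L0 + c by
# linear least squares. All sample values are synthetic.

L0 = np.array([10.0, 25.0, 40.0, 60.0, 80.0])    # original luminances
LR = np.array([22.0, 48.0, 85.0, 120.0, 170.0])  # reference luminances
LT = 0.5 * LR + 0.1 * L0 + 2.0                   # synthetic target luminances

A = np.column_stack([LR, L0, np.ones_like(L0)])  # design matrix [LR, L0, 1]
(a, b, c), *_ = np.linalg.lstsq(A, LT, rcond=None)

# Only a, b, c need to be transmitted, not the target transform itself.
print(round(a, 3), round(b, 3), round(c, 3))  # 0.5 0.1 2.0
```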
Let us assume that three different color graded versions, associated with three color transforms, are available at the production or distribution side. The three reference color graded versions are obtained by any of the previously described methods.
The first color transform T1, the second color transform T2 and the third color transform T3 to apply to an original picture as well as associated viewing conditions may be obtained from a source. According to different embodiments of the invention, the source belongs to a set comprising:
The device 2 may store color transforms T1, T2 and T3 for each viewing condition, and may further store color graded versions G1, G2 and G3 for each viewing condition as shown on
The modules storing T1, T2, T3 are linked to an interpolation transform processor 31. The processing of the interpolation transform processor 31 is described hereafter with respect to the method of
Indeed, the interpolation of T4 using T1 and T2 makes it possible not to use and/or not to save and/or not to transmit the given third color transform T3, while yielding a fourth transformed video G4 close to the third transformed video G3 that could have been obtained by applying the third color transform T3 to the original video. The fourth transformed video G4 is close to the third transformed video G3 because the interpolated transform T4 is generally close to the transform T3.
The modules storing T1, T2, k are linked to an interpolation processor 41. The processing of the interpolation transform processor 41 generating the color transform T4 is described hereafter with respect to the method of
In a step S11, at least an interpolation parameter k is determined such that a parametric interpolation function applied to the first color transform T1 and the second color transform T2 results in a fourth color transform T4 that is an estimation (or approximation, in other words close to) of the third color transform T3. Advantageously, the interpolation of T4 is not blind but is driven by the exemplary third color transform T3. Thus the interpolation of T4 makes it possible not to use and/or not to save and/or not to transmit the given exemplary third color transform T3, while still providing the information for generating a fourth transformed video G4 close to the third transformed video G3 that could have been obtained by applying the third color transform T3 to the original video G0. Indeed, only the interpolation function, for instance represented by the parameters of a parametric model, and the sub-set of reference transforms T1, T2 are stored. The third color transform T3 used to obtain the interpolated fourth color transform T4 is no longer needed and can be removed from storage.
According to a first variant, the function used in the interpolation is a linear function parametrized by a coefficient k with 0<k<1 where
T4=kT1+(1−k)T2 eq(1)
In this first variant, a single color transform interpolation parameter k is determined. If Ci with 0≤i<I are sample color values equally distributed over the color space, the following interpolation error
Σi[T3(Ci)−kT1(Ci)−(1−k)T2(Ci)]2 eq(2)
is minimized during step S11. The minimum error is achieved when the following equation is satisfied:
Σi[T3(Ci)−T2(Ci)]+kΣi[T2(Ci)−T1(Ci)]=0 eq(3)
leading to the interpolation parameter

k=Σi[T3(Ci)−T2(Ci)]/Σi[T1(Ci)−T2(Ci)] eq(4)
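The first variant above can be sketched as follows; the transforms are represented here by their sampled outputs on a few sample colors, an assumption made purely for illustration (all values are invented).

```python
import numpy as np

# Sketch of the first (linear) variant: the transforms are represented
# here by their sampled outputs T1(Ci), T2(Ci), T3(Ci) on sample colors
# Ci (all values invented for illustration). The closed-form k solves
# eq(3), and the interpolated transform follows eq(1).

def fit_k(t1, t2, t3):
    """Interpolation parameter k = sum(T3 - T2) / sum(T1 - T2)."""
    return np.sum(t3 - t2) / np.sum(t1 - t2)

t1 = np.array([0.10, 0.40, 0.70, 0.95])  # samples of T1 (e.g. 2000 cd/m^2)
t2 = np.array([0.05, 0.20, 0.35, 0.50])  # samples of T2 (e.g. 100 cd/m^2)
t3 = 0.25 * t1 + 0.75 * t2               # synthetic exemplary T3

k = fit_k(t1, t2, t3)
t4 = k * t1 + (1 - k) * t2               # interpolated transform T4, eq(1)
print(round(k, 2))  # 0.25
```

Only k needs to be stored or transmitted; T3 can then be discarded, since T4 reproduces it from T1 and T2.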
According to a second variant, the function used in the interpolation is a polynomial function parametrized by 4 coefficients ki with 0<ki<1 and i=1, 2, 3, 4 where
T4=k1T1+k2T12+k3T2+k4T22 eq(5)
According to another variant, the function used in the interpolation is a non-linear function parametrized by 2 coefficients ki with 0<ki<1 and i=1, 2, where
T4=k1 ln T1+k2 ln T2 eq(6)
However, any function is compatible with the present principles, such as non-linear parametric functions, splines or look-up tables. As for the linear interpolation function, such a function is calculated so as to minimize D(ƒ(T1,T2),T3), with D( ) being a distance, norm, or interpolation error function. The interpolation function, for instance represented by its color transform interpolation parameters k, is determined by minimizing the interpolation error on sample color values Ci with 0≤i<I:
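The parametric fit of eq(7) can be sketched for the polynomial variant of eq(5); the sample values below are invented, and the synthetic exemplary transform T3 is constructed so that an exact fit exists.

```python
import numpy as np

# Sketch of fitting the polynomial variant of eq(5),
# T4 = k1*T1 + k2*T1^2 + k3*T2 + k4*T2^2, by minimizing the interpolation
# error of eq(7) over sample colors Ci. All sample values are invented,
# and the synthetic T3 is built so an exact fit exists.

t1 = np.array([0.10, 0.30, 0.55, 0.75, 0.90])  # samples of T1(Ci)
t2 = np.array([0.04, 0.12, 0.25, 0.33, 0.47])  # samples of T2(Ci)
t3 = 0.6 * t1 + 0.1 * t1**2 + 0.3 * t2         # synthetic exemplary T3(Ci)

A = np.column_stack([t1, t1**2, t2, t2**2])    # basis functions of eq(5)
k, *_ = np.linalg.lstsq(A, t3, rcond=None)     # least-squares fit of eq(7)
t4 = A @ k                                     # interpolated transform T4

print(bool(np.allclose(t4, t3)))  # True
```

Because eq(5) is linear in its coefficients, a linear least-squares solver suffices even though the basis itself is polynomial; a genuinely non-linear variant such as eq(6) would call for an iterative optimizer instead.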
Σi[T3(Ci)−ƒ(T1,T2)(Ci)]2 eq(7)
In an optional step S12, once the interpolation function is determined, the color transform T4 itself is calculated by applying the interpolation function to T1 and T2, using the parametric model and the at least one interpolation parameter k. In the first variant, T4 is obtained according to:
T4=kT1+(1−k)T2.
In another optional step S13, T3 and T4 are compared. Advantageously, the comparison allows verifying the error in the minimization process of step S11. In case the error in the interpolated color transform T4 is too large, the determining step S11 is re-iterated. According to a non-limiting example, the error ε, i.e. the function D( ), is the sum of squared differences (also known as the L2 norm) computed between T3 and T4 for a set of sample color values Ci with 0≤i<I.
ε=∥T3−T4∥=Σi[T3(Ci)−T4(Ci)]2 eq(8)
If the error ε is larger than a predefined threshold T, the determining step S11 is repeated to determine an updated parameter k. Else, if the error ε is not larger than the predefined threshold T, the method goes to step S14.
According to different variants, the set of sample colors used to assess the error is the same as in the determining step S11 or a different one, for instance larger than the one used in determining step S11. In a variant of the optional step S13, the error ε is computed considering color preferences indicating, for at least one color, the importance of colors for difference calculations. To that end, a weighted difference is computed between T3 and T4 for a set of sample color values Ci with 0≤i<I, where wi with 0≤wi≤1 is the relative weight representative of the importance of the sample color Ci
ε=Σiwi[T3(Ci)−T4(Ci)]2 eq(9)
Again, if the error ε is larger than a predefined threshold T, the determining step S11 is re-iterated.
According to another non-limiting example, an assessment of the error ε between T3 and T4 is obtained from a difference of their respective color graded images G3 and G4. To that end, the fourth color transform is applied to an original image G0, resulting in the fourth color graded image G4. The third color transform is also applied to the original image G0, resulting in the third color graded image G3. Then, the image difference is computed as a sum of squared differences over at least a subset of pixel color values Xi of the original image, with i indexing the pixels of the subset:
ε=∥T3−T4∥=Σi[T3(Xi)−T4(Xi)]2 eq(10)
In yet another variation, the image difference is computed considering image preferences indicating—at least for one spatial part of the image—the importance of spatial image parts for image difference calculation.
ε=∥T3−T4∥=Σiwi[T3(Xi)−T4(Xi)]2 eq(11)
Of course, the present principles are compatible with other difference computations, such as the sum of absolute differences (also known as the L1 norm).
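The assessment step S13 and its variants can be sketched as follows; the sample values, weights and threshold are all invented for illustration.

```python
import numpy as np

# Sketch of the assessment step S13: compare T3 and T4 on sample colors
# using the L2 error of eq(8), the weighted variant of eq(9), and the L1
# alternative mentioned above. Sample values, weights and the threshold
# are all invented for illustration.

t3 = np.array([0.20, 0.45, 0.70, 0.90])   # T3(Ci) on sample colors
t4 = np.array([0.21, 0.44, 0.72, 0.88])   # interpolated T4(Ci)
w  = np.array([0.4, 0.3, 0.2, 0.1])       # color importance weights

err_l2       = np.sum((t3 - t4) ** 2)      # eq(8)
err_weighted = np.sum(w * (t3 - t4) ** 2)  # eq(9)
err_l1       = np.sum(np.abs(t3 - t4))     # L1 alternative

threshold = 1e-2
print(bool(err_l2 < threshold))  # True: no re-iteration of step S11 needed
```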
Finally, in a step S14 compliant with the variant where color transform is performed in a distant receiver, the parameter k of the parametric model of the interpolation function is transmitted to the distant receiver already storing the two color transforms T1 and T2.
It is preferable that the interpolation function is of low complexity, such as a linear function, to reduce the computational effort for calculating the interpolation parameter. However, the present principles are well adapted to complex color transforms, since any color transform is adaptively generated based on a subset of color transforms, in a preferred embodiment 2 color transforms, and the parametric interpolation function. In this case, the gain of transmitting k rather than a complex color transform, such as a non-linear function or one implemented as a LUT, is particularly advantageous.
The method implemented at the distant receiver is now described.
In a step S21, at least an interpolation parameter k is received from a device 3 for generating a fourth color transform based on a sub-set of reference transforms and corresponding to an exemplary third color transform T3 (only known to the device 3 which generates the interpolation parameter k). The interpolation parameter, in any of its variants described with
In a step S22, a fourth color transform T4 is interpolated by applying the parametric function determined by the interpolation parameter(s) k to the first color transform T1 and the second color transform T2. The interpolated fourth color transform T4 is an estimation of a third color transform T3.
Then, in a step S23, the fourth color transform T4 is applied to an original image G0, resulting in the fourth color graded image G4. G4 and G3 are color graded versions of the original image G0 corresponding to the same viewing conditions. Advantageously, G4 is obtained without transmitting or storing the reference third color transform T3 at the device 4. The skilled in the art will appreciate that by receiving new parameters k, the device 4 is adapted to changing viewing conditions without transmitting dedicated color transforms.
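The receiver-side flow of steps S21 to S23 can be sketched as follows; modeling T1 and T2 as simple multiplicative gains is an assumption made for illustration (a real deployment could use LUTs or parametric curves), and all values are invented.

```python
import numpy as np

# Receiver-side sketch of steps S21-S23. T1 and T2 are modeled here as
# simple multiplicative gains (an assumption for illustration). The
# receiver already stores T1 and T2, receives only k (step S21),
# interpolates T4 via eq(1) (step S22), and applies it to G0 (step S23).

def apply_transform(gain, image):
    """Toy color transform: one multiplicative gain, clipped to [0, 1]."""
    return np.clip(gain * image, 0.0, 1.0)

g0 = np.array([[0.2, 0.5], [0.8, 1.0]])  # toy original image G0
gain_t1, gain_t2 = 1.5, 0.6              # stored transforms T1, T2
k = 0.25                                 # received interpolation parameter

gain_t4 = k * gain_t1 + (1 - k) * gain_t2  # step S22, eq(1)
g4 = apply_transform(gain_t4, g0)          # step S23

print(round(gain_t4, 3))  # 0.825
```

Adapting to new viewing conditions only requires receiving a new k, never a new transform.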
According to different embodiments, the fourth color graded version of the image may be sent to a destination, e.g. a display device. As an example, the fourth color graded version of the image is stored in a remote or in a local memory 420, e.g. a video memory or a RAM, a hard disk. In a variant, the fourth color graded version of the picture is sent to a storage interface, e.g. an interface with a mass storage, a ROM, a flash memory, an optical disc or a magnetic support and/or transmitted over a communication interface, e.g. an interface to a point to point link, a communication bus, a point to multipoint link or a broadcast network.
According to an exemplary and non-limiting embodiment, the color grading interpolation device 4 further comprises a computer program stored in the memory 420. The computer program comprises instructions which, when executed by the color grading interpolation device 4, in particular by the processor 410, enable the color grading interpolation device 4 to execute the method described with reference to
The color grading interpolation device 2 is advantageously part of a player 300 or of a TV set.
The implementations described herein may be implemented in, for example, a method or a process, an apparatus, a software program, a data stream, or a signal. Even if only discussed in the context of a single form of implementation (for example, discussed only as a method or a device), the implementation of features discussed may also be implemented in other forms (for example a program). An apparatus may be implemented in, for example, appropriate hardware, software, and firmware. The methods may be implemented in, for example, an apparatus such as, for example, a processor, which refers to processing devices in general, including, for example, a computer, a microprocessor, an integrated circuit, or a programmable logic device. Processors also include communication devices, such as, for example, computers, cell phones, portable/personal digital assistants (“PDAs”), and other devices that facilitate communication of information between end-users.
Implementations of the various processes and features described herein may be embodied in a variety of different equipment or applications. Examples of such equipment include an encoder, a decoder, a post-processor processing output from a decoder, a pre-processor providing input to an encoder, a video coder, a video decoder, a video codec, a web server, a set-top box, a laptop, a personal computer, a cell phone, a PDA, and other communication devices. As should be clear, the equipment may be mobile and even installed in a mobile vehicle.
Additionally, the methods may be implemented by instructions being performed by a processor, and such instructions (and/or data values produced by an implementation) may be stored on a processor-readable medium such as, for example, an integrated circuit, a software carrier or other storage device such as, for example, a hard disk, a compact diskette (“CD”), an optical disc (such as, for example, a DVD, often referred to as a digital versatile disc or a digital video disc), a random access memory (“RAM”), or a read-only memory (“ROM”). The instructions may form an application program tangibly embodied on a processor-readable medium. Instructions may be, for example, in hardware, firmware, software, or a combination. Instructions may be found in, for example, an operating system, a separate application, or a combination of the two. A processor may be characterized, therefore, as, for example, both a device configured to carry out a process and a device that includes a processor-readable medium (such as a storage device) having instructions for carrying out a process. Further, a processor-readable medium may store, in addition to or in lieu of instructions, data values produced by an implementation.
As will be evident to one of skill in the art, implementations may produce a variety of signals formatted to carry information that may be, for example, stored or transmitted. The information may include, for example, instructions for performing a method, or data produced by one of the described implementations. For example, a signal may be formatted to carry as data the rules for writing or reading the syntax of a described embodiment, or to carry as data the actual syntax-values written by a described embodiment. Such a signal may be formatted, for example, as an electromagnetic wave (for example, using a radio frequency portion of spectrum) or as a baseband signal. The formatting may include, for example, encoding a data stream and modulating a carrier with the encoded data stream. The information that the signal carries may be, for example, analog or digital information. The signal may be transmitted over a variety of different wired or wireless links, as is known. The signal may be stored on a processor-readable medium.
Number | Date | Country | Kind |
---|---|---|---|
16306696 | Dec 2016 | EP | regional |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/EP2017/080412 | 11/24/2017 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2018/108494 | 6/21/2018 | WO | A |
Number | Date | Country | |
---|---|---|---|
20190311695 A1 | Oct 2019 | US |