1. Field of the Invention
The present invention relates to multimedia. More specifically, the present invention discloses a method of enhancing movies, video, or other multimedia in order to provide users with a more rewarding viewing experience.
2. Description of the Prior Art
Film is a popular form of art and entertainment. The effect on movie-goers is dramatic when movies are viewed in a cinema or movie theater. When movies are made or reproduced, care is taken to ensure that the lighting, brightness, and contrast of the film are suitable for the dim or dark theater setting. However, problems result when these movies are transferred from film to digital formats such as video compact disc (VCD) or digital video disc (DVD).
It is a common problem that movies look too dark when watched on televisions (TV), cathode ray tube (CRT) monitors, or liquid crystal display (LCD) panels. Because many movies are intended for playback in a relatively dark environment, such as a movie theater, video usually appears to be lacking in brightness and contrast when played back in a relatively well-lit environment, such as regular room lighting. A conventional way to compensate for this problem is to change the display device's brightness level and/or gamma correction strength. However, an increase in brightness will typically result in a lack of blackness. Additionally, an increase in gamma will typically result in too little contrast in bright regions and too much contrast in dark regions of the movie.
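By way of illustration only, the following Python/NumPy sketch shows the conventional gamma adjustment described above; the particular exponent is arbitrary and is not part of the present disclosure.

    import numpy as np

    def gamma_correct(y, gamma=0.7, y_max=255.0):
        # Conventional gamma adjustment: Y' = Ymax * (Y / Ymax) ** gamma.
        # With gamma < 1 the image brightens overall, but the slope (local
        # contrast) grows in dark regions and shrinks in bright regions,
        # which is the drawback noted above.
        y = np.asarray(y, dtype=np.float64)
        return np.clip(y_max * (y / y_max) ** gamma, 0.0, y_max)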
Refer to
Refer to
Therefore, there is a need for an improved method of enhancing movies or video that produces a high-quality video output and provides viewers with a rewarding viewing experience.
To achieve these and other advantages and in order to overcome the disadvantages of the conventional method in accordance with the purpose of the invention as embodied and broadly described herein, the present invention provides a movie enhancement method which corrects the video image so that the brightness and contrast levels are proper when the video is viewed in a well-lit environment.
In order to achieve a closer-to-movie theater viewing experience, the present invention provides a method of enhancing video or movies. The enhancement method can be applied to YUV or other video formats which are used for TV or digital media. Additionally, the method can be applied to any video format, such as HSV, RGB, etc. that can be converted or transformed into YUV.
In embodiments of the present invention, the method provides a curve with at least one inflection point such that at least one region has a concave upward arc and another region has a concave downward arc.
In an embodiment, the improved curve provides relatively less contrast in relatively dark regions and relatively more contrast in relatively bright regions. By taking into account the visual sensitivity to various luminance levels, a neutral point is selected to be located at a relatively dark point. To the darker side of the neutral point, luminance is suppressed. To the brighter side of the neutral point, luminance is enhanced. This improved curve is termed a “luminance-mapping” curve and is applied to the luminance (Y) signal so as to enhance both brightness and contrast.
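Purely as an illustration, and not as the claimed curve, the following Python/NumPy sketch builds one possible luminance-mapping curve of this kind as a 256-entry lookup table; the neutral point, the exponent, and the piecewise power form are assumptions.

    import numpy as np

    def luminance_mapping_lut(neutral=72, p=1.6, y_max=255):
        # Below the neutral point luminance is suppressed (concave-upward arc);
        # above it luminance is boosted (concave-downward arc), so the curve
        # has a single inflection at the neutral point.
        y = np.arange(y_max + 1, dtype=np.float64)
        f = np.empty_like(y)
        dark = y <= neutral
        f[dark] = neutral * (y[dark] / neutral) ** p
        f[~dark] = y_max - (y_max - neutral) * ((y_max - y[~dark]) / (y_max - neutral)) ** p
        return np.clip(np.round(f), 0, y_max).astype(np.uint8)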
In another embodiment, the luminance-mapping curve is adaptively adjusted over time according to the average luminance level of the input signal. Because the luminance mapping affects the average luminance level, and because the luminance level itself varies over time, the luminance-mapping curve can be shifted so that the video retains good contrast and the average luminance level is preserved over time after luminance mapping. This is achieved by measuring the mean and variance of the luminance signal and generating an adjustment to the luminance-mapping curve according to the measurements.
In order to preserve the color saturation level, the chrominance signals (U/V) can also be adjusted according to the change in Y.
To improve the color saturation level, a chrominance-mapping curve is designed to improve the contrast of color U/V components. In an embodiment, this curve is configured to have a neutral point at the mid-point.
In other embodiments, movie enhancement can be applied to regions. Regions can overlap, and each region can have its own variation of the enhancement. Movie enhancement can therefore be used with object detection and image segmentation algorithms. It can also be used with MPEG-4; in this case, movie enhancement can be applied to the background or to foreground objects. In other cases, it can be applied to the original movie content, with subtitles blended in afterward.
In other embodiments, movie enhancement can also be applied as a temporal filter. For interlaced content, movie enhancement can be applied to each field.
Some embodiments may introduce coarse edges due to high contrast in certain luminance levels. In these cases, dithering can be used to reduce the effect.
Other areas in which the movie enhancement method can be utilized include flicker compensation for fluorescent lighting, scene change detection, fade detection, use with other special effects in which subjects change over time, and use in another color space, such as HSV, to change the tone of color.
These and other objectives of the present invention will become obvious to those of ordinary skill in the art after reading the following detailed description of preferred embodiments.
It is to be understood that both the foregoing general description and the following detailed description are exemplary, and are intended to provide further explanation of the invention as claimed.
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
a is a flow diagram illustrating a flow of processes for improving color saturation and contrast in accordance with an exemplary embodiment of the present invention;
b is a flow diagram illustrating a flow of processes for restoring color saturation from luminance adjustment in accordance with an exemplary embodiment of the present invention; and,
c is a flow diagram illustrating in greater detail a flow of processes for restoring color saturation from luminance adjustment in accordance with an exemplary embodiment of the present invention.
Reference will now be made in detail to the preferred embodiments of the present invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
In an embodiment of the present invention, the movie enhancement method is adjustable through two parameters: intensity and variation. These parameters can be set by the user according to the user's personal preference. Y, U, and V are scaled to have a normal range of 0˜255 with a center at 128. It will be understood by one of ordinary skill in the art that other ranges can also be applied in similar formats. This embodiment has been implemented in InterVideo Inc.'s WinDVD, which is the most popular PC-based software DVD player in the world; a dedicated UI has also been designed in WinDVD to control these parameters.
Refer to
Intensity is used to control the amount of adjustment applied to Y. In an embodiment, the change is one-directional: a value of 0 corresponds to no effect, and a value of 100 corresponds to maximum effect. It will be understood by one of ordinary skill in the art that other scales and ranges can also be used.
The luminance-mapping transfer function is F( ), which is a 1-to-1 mapping of the Y signal. One embodiment of the luminance-mapping curve is configured to increase luminance on the bright side of a neutral point, and to decrease luminance on the dark side of the neutral point. Advantageously, the neutral point can be located at a relatively dark point of the luminance range to address the issue of visual sensitivity to various luminance levels:
Y′=F(Y)
Intensity is i, which is one of the control elements and has a value between 0 and 100. The luminance mapping based on intensity is adjusted as follows:
F(Y,i)=Y+(F(Y)−Y)*i/100, where
F(Y,100)=F(Y), F(Y,0)=Y
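By way of example, assuming F( ) has already been realized as a 256-entry table (as described later), the intensity blend can be sketched in Python/NumPy as follows:

    import numpy as np

    def apply_intensity(f_lut, i):
        # F(Y, i) = Y + (F(Y) - Y) * i / 100, so i = 0 leaves Y unchanged
        # and i = 100 applies the full mapping F(Y).
        y = np.arange(len(f_lut), dtype=np.float64)
        out = y + (np.asarray(f_lut, dtype=np.float64) - y) * i / 100.0
        return np.clip(np.round(out), 0, len(f_lut) - 1).astype(np.uint8)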
Refer to
G( ) is a “luminance-contrast adjustment” transfer function, which is closely related to F( ) and, in one embodiment, generates an adjustment to the luminance level based on the mean luminance signal of the video. G( ) moves the input signal toward the center of F( ) which has relatively high contrast. Also, G( ) helps keep the average luminance at a constant level at the output. In one embodiment, G( ) and F( ) have the same neutral point location, where the luminance level is not changed.
The luminance-contrast adjustment can be further weighted by the variance of the luminance signal. Signals with relatively large variance typically use less luminance-contrast adjustment, and signals with relatively small variance typically use more luminance-contrast adjustment. Let sqrt( ) be the square-root operator, Yvar be the variance of Y signal, and Ymax be the maximum range of Y. In one embodiment, the following formula is used to provide a weighting in the range of about 0.5˜2.0.
(2*Ymax−sqrt(Yvar))/(sqrt(Yvar)+Ymax)
To provide consistent contrast and average luminance over time, one embodiment measures the average Yavg and variance Yvar of the luminance signal for each video frame. It is understood that in other embodiments, the average Yavg and variance Yvar can be computed from selected frames, such as, for example, every other frame or every second.
g=G(Yavg)*(2*Ymax−sqrt(Yvar))/(sqrt(Yvar)+Ymax)
Refer to
The luminance-contrast adjustment based on control factor intensity is g(i),
g(i)=g*i/100
Thus, for each video frame, based on the input luminance and the control intensity, the final enhancement mapping transfer function FG( ) for the luminance signal (Y) is obtained.
FG(Y,i)=F(Y+g(i),i)
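As a hedged illustration of how the pieces fit together, the following sketch builds the per-frame table FG(Y,i)=F(Y+g(i),i); G( ) is supplied as an assumed callable, and the variable names are illustrative only.

    import numpy as np

    def build_fg_lut(f_lut, y_avg, y_var, g_of, i, y_max=255):
        # Variance-weighted contrast shift:
        # g = G(Yavg) * (2*Ymax - sqrt(Yvar)) / (sqrt(Yvar) + Ymax)
        g = g_of(y_avg) * (2.0 * y_max - np.sqrt(y_var)) / (np.sqrt(y_var) + y_max)
        g_i = g * i / 100.0                                   # g(i) = g * i / 100
        y = np.arange(y_max + 1, dtype=np.float64)
        f = np.asarray(f_lut, dtype=np.float64)
        shifted = np.clip(np.round(y + g_i), 0, y_max).astype(np.intp)
        # F(Y + g(i), i) = (Y + g(i)) + (F(Y + g(i)) - (Y + g(i))) * i / 100
        fg = shifted + (f[shifted] - shifted) * i / 100.0
        return np.clip(np.round(fg), 0, y_max).astype(np.uint8)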
There can be some hard-clippings at both ends of the FG( ) curve after the luminance-contrast adjustment. These ends of the FG( ) curve can be re-interpolated to provide a soft-clipping effect.
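One hypothetical re-interpolation of the clipped ends is sketched below; the linear ramp is an assumption, since the exact interpolation is not specified in this text.

    import numpy as np

    def soften_ends(lut, y_max=255):
        # Replace a flat run of 0s at the bottom, or of y_max at the top,
        # with a linear ramp to the first (or last) unclipped entry.
        out = np.asarray(lut, dtype=np.float64).copy()
        n = len(out)
        k = int(np.argmax(out > 0))                    # first index above 0
        if k > 0:
            out[:k + 1] = np.linspace(0.0, out[k], k + 1)
        m = n - 1 - int(np.argmax(out[::-1] < y_max))  # last index below y_max
        if m < n - 1:
            out[m:] = np.linspace(out[m], float(y_max), n - m)
        return np.clip(np.round(out), 0, y_max).astype(np.uint8)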
To preserve the color saturation level, the U/V signals can be adjusted as follows:
Uout=(U−128)*Yout/Y+128
Vout=(V−128)*Yout/Y+128
Additionally, the actual calculation can be simplified through fixed-point computation to avoid division in the implementation.
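A straightforward floating-point sketch of this chroma rescaling follows (assuming full-resolution, co-sited Y, U, and V, and adding a guard against Y=0, neither of which is addressed above); a fixed-point variant would replace the division as just noted.

    import numpy as np

    def preserve_saturation(u, v, y, y_out):
        # Uout = (U - 128) * Yout / Y + 128, and likewise for V, so the
        # saturation follows the change applied to the luminance.
        y = np.asarray(y, dtype=np.float64)
        ratio = np.where(y > 0, np.asarray(y_out, dtype=np.float64) / np.maximum(y, 1.0), 1.0)
        u_out = (np.asarray(u, dtype=np.float64) - 128.0) * ratio + 128.0
        v_out = (np.asarray(v, dtype=np.float64) - 128.0) * ratio + 128.0
        return (np.clip(np.round(u_out), 0, 255).astype(np.uint8),
                np.clip(np.round(v_out), 0, 255).astype(np.uint8))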
Variation is used to control the color saturation level of the video signal. The following process enhances the contrast of the chrominance signal (U/V) with the neutral point located at the mid-point, which indicates “no color” in the YUV format. In other embodiments that intend to offset color, the neutral point for the chrominance-mapping transfer function can be offset from the mid-point.
Refer to
FU( ) and FV( ) are the chrominance-mapping transfer functions, which enhance the contrast of chrominance signals.
U0ori, U1ori, V0ori, and V1ori are pre-defined constants that can be tailored to different viewing conditions, such as indoor and/or outdoor. In one embodiment, the mapping curve is weighted by the variance of the chrominance signals Uvar and Vvar. Signals with relatively large variance typically use less chrominance adjustment, and vice versa. Umax and Vmax are the maximum range of U and V:
U0ν=U0ori*(2*Umax−sqrt(Uvar))/(sqrt(Uvar)+Umax)
U1ν=U1ori*(2*Umax−sqrt(Uvar))/(sqrt(Uvar)+Umax)
V0ν=V0ori*(2*Vmax−sqrt(Vvar))/(sqrt(Vvar)+Vmax)
V1ν=V1ori*(2*Vmax−sqrt(Vvar))/(sqrt(Vvar)+Vmax)
Let ν be the variation, ranging from 0 to 100; U0, U1, V0, and V1 change linearly with ν:
U1=U1ν*ν/100
V1=V1ν*ν/100
U0=U0ν*ν/100
V0=V0ν*ν/100
Variation in color saturation can be achieved by changing the trajectory of U0, U1, V0, and V1.
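Because the exact roles of U0, U1, V0, and V1 are defined by the drawings rather than by this text, the following sketch is only a generic stand-in for FU( )/FV( ): a contrast stretch about the 128 mid-point whose gain follows the user variation and the variance weight; the gain law itself is an assumption.

    import numpy as np

    def chroma_mapping_lut(c_var, variation, c_max=255.0):
        # Stretch contrast about the mid-point (128 = "no color"), damped for
        # high-variance signals via the same 0.5..2.0-style variance weight
        # used for luminance.
        weight = (2.0 * c_max - np.sqrt(c_var)) / (np.sqrt(c_var) + c_max)
        gain = 1.0 + 0.25 * weight * variation / 100.0        # illustrative gain law
        c = np.arange(int(c_max) + 1, dtype=np.float64)
        out = 128.0 + (c - 128.0) * gain
        return np.clip(np.round(out), 0, c_max).astype(np.uint8)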
Following is a description of details regarding implementation of the movie enhancement method of the present invention.
Let Y, U, and V be the input signal. One embodiment performs the following operations for every frame.
In other embodiments, operations 1-5 described above can be performed on a less frequent basis than every frame to reduce the computation complexity. For example, Yvar, Uvar, Vvar can be calculated when Yavg has a significant change. Operations 3 and 4 can be performed when Yavg has a significant change and upon a change in user-input intensity i. Operation 5 can be performed when Uvar and Vvar have significant changes and upon a change in user-input variation ν.
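The skip logic just described can be sketched as follows; rebuild_fg and rebuild_fuv are assumed callables standing in for operations 3-4 and operation 5, and the change thresholds are illustrative values only.

    import numpy as np

    def per_frame_control(y, u, v, state, i, nu, rebuild_fg, rebuild_fuv, thresh=4.0):
        # Measure Yavg every frame; recompute the variances only when Yavg
        # changes significantly, and rebuild the lookup tables only when the
        # relevant measurements or user controls (i, nu) change.
        y_avg = float(np.mean(y))
        yavg_changed = abs(y_avg - state.get("y_avg", float("inf"))) > thresh
        if yavg_changed:
            state["y_var"] = float(np.var(y))
            u_var, v_var = float(np.var(u)), float(np.var(v))
            uv_changed = (abs(u_var - state.get("u_var", float("inf"))) > thresh or
                          abs(v_var - state.get("v_var", float("inf"))) > thresh)
            state["u_var"], state["v_var"] = u_var, v_var
        else:
            uv_changed = False
        if yavg_changed or i != state.get("i"):
            state["fg_lut"] = rebuild_fg(y_avg, state["y_var"], i)
        if uv_changed or nu != state.get("nu"):
            state["fu_lut"], state["fv_lut"] = rebuild_fuv(state["u_var"], state["v_var"], nu)
        state.update(y_avg=y_avg, i=i, nu=nu)
        return state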
To reduce the computation complexity, all curves F( ), G( ), FG( ,i), FU( ), and FV( ) can be realized as fixed-point lookup tables, and the mapping calculations can be realized by table lookup. In other embodiments, a 3rd-order or higher polynomial is used to emulate those curves.
In another embodiment of the present invention, the following operations are performed for every pixel:
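Because the enumerated per-pixel operations are not reproduced in this text, the following sketch shows only an inferred sequence: map Y through FG( ), rescale the chroma by the luminance change, and then map the chroma through FU( )/FV( ).

    def enhance_pixel(y, u, v, fg_lut, fu_lut, fv_lut):
        # Inferred per-pixel sequence (an assumption, not the claimed order):
        # 1) map luminance through FG;
        # 2) rescale chroma by Yout/Y to keep the saturation level;
        # 3) map chroma through the chrominance tables FU/FV.
        y_out = int(fg_lut[y])
        ratio = y_out / y if y > 0 else 1.0
        u_sat = max(0, min(255, int(round((u - 128) * ratio + 128))))
        v_sat = max(0, min(255, int(round((v - 128) * ratio + 128))))
        return y_out, int(fu_lut[u_sat]), int(fv_lut[v_sat])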
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the scope or spirit of the invention. In view of the foregoing, it is intended that the present invention cover modifications and variations of this invention provided they fall within the scope of the invention and its equivalents.