The present invention relates in general to image processing and more particularly to content adaptive sharpness enhancement of image data.
The perceived sharpness of a display device is an important factor in picture quality. Sharpness of the displayed image is especially important for large screen display devices, as well as when source signals are upscaled from lower resolutions to higher panel resolutions. Conventional methods of picture quality enhancement typically employ either unsharp masking or luma transient improvement (LTI). The conventional techniques, however, do not combine the benefits of each method or correct the defects of each method.
Unsharp masking is one of the oldest methods of picture enhancement, originally developed for darkroom processing. In the conventional methods of unsharp masking, an image was first low-pass filtered (e.g., blurred or unsharpened) such that the image was defocused. The resulting defocused negative was then used as a mask for a normally processed image. Effectively, the conventional unsharp masking methods increase gain for high frequency image components. Unsharp masking is a linear sharpening technique.
The conventional methods and techniques for LTI attempt to adaptively modify edges of a received image signal. Similar to unsharp masking techniques, conventional methods and techniques for LTI can provide image enhancement. However, the conventional methods of LTI can produce image artifacts, such as contouring and line flattening.
Although the conventional methods may be suitable for some applications, the conventional systems and methods do not provide an acceptable level of performance for sharpness enhancement of image data in all applications. Thus, there is a need in the art for systems and methods of image enhancement that utilize the benefits of enhancement techniques while minimizing their side effects.
Disclosed and claimed herein are methods and apparatus for adaptive sharpness enhancement of image data. In one embodiment, a method includes receiving image data for a first frame, performing linear sharpening enhancement of the image data for the first frame, and performing non-linear sharpening enhancement of the image data for the first frame. The method further includes generating blending parameters, by a controller, based upon the image data of the first frame, linear sharpened image data for the first frame, and non-linear sharpened image data for the first frame. Additionally, the method includes blending the image data of the first frame, the linear sharpened image data, and the non-linear sharpened image data based upon the blending parameters.
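By way of non-limiting illustration, the flow of this embodiment may be sketched in Python as follows. The sketch is illustrative only: the 3×3 high pass kernel, the three-sample clipping window and the fixed blending parameters alpha_us and alpha_ss are placeholder assumptions, whereas the disclosed controller derives blending parameters adaptively from the image data as described below.

import numpy as np
from scipy.ndimage import convolve, maximum_filter, minimum_filter

def enhance_frame(frame, alpha_us=0.5, alpha_ss=0.0):
    # Linear sharpening: boost high-frequency components (unsharp-mask style).
    hpf = np.array([[0, -1, 0], [-1, 4, -1], [0, -1, 0]], dtype=float)
    linear = frame + convolve(frame, hpf, mode='nearest')
    # Non-linear sharpening: clip the boosted data to the local min/max of
    # the original frame so that edges steepen without overshoot.
    nonlinear = np.clip(linear,
                        minimum_filter(frame, size=3),
                        maximum_filter(frame, size=3))
    # Blend original, linear sharpened and non-linear sharpened image data.
    alpha_mm = 1.0 - alpha_us    # non-linear share complements the linear one
    out = alpha_us * linear + alpha_mm * nonlinear
    return out * (1.0 - alpha_ss) + frame * alpha_ss

For example, enhance_frame(np.random.rand(480, 640)) returns an enhanced frame of the same shape.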
Other aspects, features, and techniques of the invention will be apparent to one skilled in the relevant art in view of the following detailed description of the invention.
The features, objects, and advantages of the present invention will become more apparent from the detailed description set forth below when taken in conjunction with the drawings in which like reference characters identify correspondingly throughout and wherein:
One aspect of the present invention relates to adaptive sharpness enhancement of image data. As disclosed herein, methods and an apparatus are provided for sharpness enhancement of image data. In one embodiment, a method for sharpness enhancement includes performing linear sharpening and non-linear sharpening of the image data. The method may further include generating one or more blending parameters based on the image data and modified image data associated with one or more of linear sharpened image data and non-linear sharpened image data, and blending the image data based on the blending parameters. One advantage of the invention may be that artifacts generated by conventional methods of linear and non-linear sharpening are overcome. Further, methods for sharpness enhancement of image data according to the invention may provide one or more of border control, strength control and sharpening level output.
In another embodiment, an apparatus is provided for sharpness enhancement of image data. The apparatus may include one or more modules associated with hardware, software and/or firmware, which may be configured to provide linear and non-linear sharpening of image data. In that fashion, adaptive sharpness enhancement may be provided for image data.
Referring now to the figures,
Non-linear sharpening module 215 may be configured to adaptively modify edges of the signal with the goal of sharpening edges without introducing overshoot. Non-linear sharpening module 215 may be configured to perform min/max pixel adjustment and/or min/max pixel value clipping to perform non-linear sharpening of the image data by application of non-linear edge enhancement. In one embodiment, non-linear sharpening module 215 may further include vertical module 220 and horizontal module 225 to provide vertical and horizontal non-linear sharpening, respectively. Image enhancement by non-linear sharpening module 215 may additionally be based on high pass filtered output for horizontal and vertical components of image data by the linear sharpening module, as will be discussed in more detail below. In addition to the methods and devices described herein, it should also be appreciated that other methods of linear and/or non-linear sharpening may be employed for sharpness enhancement.
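As a rough sketch of such separate horizontal and vertical min/max clipping, consider the following, wherein hpf_h and hpf_v denote horizontal and vertical high pass filtered outputs from the linear sharpening module; the five-pixel window and the averaging merge are illustrative assumptions rather than the disclosed implementation.

import numpy as np
from scipy.ndimage import maximum_filter1d, minimum_filter1d

def nonlinear_sharpen(frame, hpf_h, hpf_v, window=5):
    # Horizontal component: clip high-pass boosted samples to the row-wise
    # min/max of the original frame so edges sharpen without overshoot.
    horiz = np.clip(frame + hpf_h,
                    minimum_filter1d(frame, window, axis=1),
                    maximum_filter1d(frame, window, axis=1))
    # Vertical component: the same clipping against column-wise extrema.
    vert = np.clip(frame + hpf_v,
                   minimum_filter1d(frame, window, axis=0),
                   maximum_filter1d(frame, window, axis=0))
    # Merge the two components (simple averaging as a placeholder merge).
    return 0.5 * (horiz + vert)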
Controller 230 may be configured to generate blending parameters based on the image data received by input 205, and modified input data associated with one or more of linear sharpening module 210 and non-linear sharpening module 215. According to one embodiment, controller 230 may be configured to analyze various properties of an input signal to define parameters for blending module 235 based on the output of linear and non-linear sharpening modules. Blending parameters may be determined to provide the most visually pleasing combination of received image data and linear and non-linear sharpened image data via output 240. In certain embodiments, blending parameters may additionally be generated by controller 230 based on non-linear sharpened image data generated by non-linear sharpening module 215. Blending module 235 may be configured to blend unmodified image data with the linear sharpened image data and non-linear sharpened image data based on the blending parameters generated by controller 230. In one embodiment, blending may be based on three parameters to control the blending, wherein parameters provide ratios for image data, linear sharpened image data and non-linear sharpened image data. As will be further discussed below with reference to
Controller 230 may be configured to provide output signals for one or more signal categories based on received image data. As described in Table 1, exemplary actions which may be performed by controller 230 are listed based on signal categories and selection criteria. With respect to noise, edge detection may be applied to distinguish the noise from texture according to one embodiment. Similarly, edge detection may be employed to distinguish texture. In certain embodiments, edge detection may be employed for low amplitude input signals to distinguish thin lines from noise. With respect to edges of image data, controller 230 may be configured to allow for limited overshoot, which may be visually pleasing and acceptable for viewing of image data, in certain instances.
According to another embodiment, controller 230 may be configured to operate based on one or more instructions stored in memory 232, wherein memory 232 relates to one of ROM and RAM memory. Executable instructions and/or data received by device 200 may be stored by memory 232. In certain embodiments, device 200 may interoperate with external memory (not shown in
As shown in
Referring now to
In an exemplary embodiment, an 11×11-tap incomplete (e.g., cross-shaped) two-dimensional kernel high pass filter may be provided, wherein output of the filter may be characterized as the gain-weighted sum of the horizontal and vertical filter outputs (e.g., HPF_2D=GAIN_2D*(FIRh+FIRv), as discussed further below).
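A sketch of such a cross-shaped filter follows; the 11-tap coefficients are zero-sum placeholders rather than the exemplary filter programming discussed below.

import numpy as np
from scipy.ndimage import convolve1d

# Placeholder symmetric high pass taps (summing to zero); illustrative only.
TAPS = np.array([-1, -2, -3, -4, -5, 30, -5, -4, -3, -2, -1], dtype=float)

def cross_hpf(frame, taps=TAPS):
    # Applying an 11-tap HPF along each axis and summing the outputs is
    # equivalent to one incomplete (cross-shaped) 11x11 two-dimensional kernel.
    fir_h = convolve1d(frame, taps, axis=1, mode='nearest')  # horizontal HPF
    fir_v = convolve1d(frame, taps, axis=0, mode='nearest')  # vertical HPF
    return fir_h + fir_v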
In one embodiment, an 11-tap filter in each direction (e.g., horizontal HPF 310 and vertical HPF 315) may be required to guarantee sharpening algorithm performance for upscaling of image data from standard definition (SD) (e.g., 480 lines of resolution) to high definition (HD) (e.g., 1080 lines of resolution, 2.25× upscaling). Using a two-dimensional kernel filter for HPF 310 and HPF 315 of
According to one embodiment, HPF gain of linear sharpening module 300 may be programmable to control the strength of perceived sharpening of image data. Accordingly, amplifiers 330 and 335 may be programmable to adjust the strength of perceived sharpening based on respective signals received for horizontal and vertical image components based on one or more control signals received from the controller. Gain adjusted horizontal and vertical HPF outputs may be combined by summer circuit 340.
To minimize noise amplification of high pass filtered image data by device 300, the gain of high pass filtered data output by summer circuit 340 may be controlled by vertical edge detector 320 and horizontal edge detector 325. In one embodiment, max module 350 determines the maximum of the horizontal and vertical edge detector outputs, e.g., max(abs(H_ED), abs(V_ED)).
In areas without defined edges, the peaking effect will be reduced or disabled. Gain of amplifier 345 may be characterized as follows:
2d_hpf_gain=2D_HPF_GAIN*max(abs(H_ED), abs(V_ED))
One property of a two-dimensional HPF as discussed above is that its gain is highest for small objects, such as dots. This can result in unpleasant over-sharpening of small objects. To minimize this effect, and to reduce noise amplification, output 355 may be passed to a look-up table (LUT). As will be discussed further below in more detail with respect to
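One way such a LUT stage might be realized is sketched below, assuming piecewise-linear interpolation between programmed entries; the entries shown are illustrative values that attenuate small excursions (noise) and compress large ones (dots).

import numpy as np

# Illustrative LUT entries mapping input magnitude to output magnitude.
LUT_IN = np.array([0.0, 8.0, 64.0, 255.0])
LUT_OUT = np.array([0.0, 4.0, 60.0, 96.0])

def apply_lut(hpf_out):
    # Map magnitudes through the LUT (np.interp interpolates between entries
    # and holds the end values outside the table) while preserving sign.
    return np.sign(hpf_out) * np.interp(np.abs(hpf_out), LUT_IN, LUT_OUT)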
Referring now to
Referring now to
Referring now to
As shown in
Non-linear sharpened image data for horizontal and vertical components may then be combined at merge block 550 and output as non-linear sharpened image data at output 555. Although non-linear sharpening module 500 is described above with respect to min/max pixel value clipping, it should also be appreciated that other non-linear pixel value adjustments may be employed, including but not limited to the min/max pixel adjustment described above with reference to
As discussed above, the controller (e.g., controller 230) may be configured to analyze various properties of an input signal and define parameters for blending the output of linear and non-linear sharpened pixel data. The controller may further be configured to calculate parameters for each pixel and for switching between original and processed pixels. For example, two main parameters may be calculated for each pixel of an incoming image signal utilizing an eleven pixel window (horizontal and vertical). One parameter employed by the controller may be in-window contrast, which may be used to separate noise from the image signal using a soft coring concept. Another parameter may be in-window activity, which may be employed to separate details/thin lines from edges. Based on the parameters and contrast determined by the controller, blending parameters may be generated to switch between original image data and processed signal data, wherein processed signal data relates to a combination of linear and/or non-linear sharpened image data.
Referring now to
contrast=max(window)−min(window)
According to another embodiment, activity may control the ratio between linear and non-linear sharpening applied on pixel-by-pixel basis. Activity may be calculated as follows:
activity=(max(window_start, window_end)−min(window_start, window_end))/contrast
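In code form, the two measures may be computed per pixel over an eleven-sample window as follows; the small constant guarding against division by zero in flat areas is an added assumption.

def contrast_and_activity(window):
    # window: eleven samples centered on the current pixel (one axis).
    contrast = max(window) - min(window)
    # window_start / window_end: the samples at the two ends of the window.
    start, end = window[0], window[-1]
    activity = (max(start, end) - min(start, end)) / max(contrast, 1e-6)
    return contrast, activity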
Referring now to
According to one embodiment of the invention, the control module defines a set of three blending parameters: alpha_ss, alpha_us and alpha_mm.
In one embodiment, alpha_ss and alpha_mm cannot exceed 100% of the signal, while alpha_us does not have such a limitation, in order to achieve a required level of sharpening on textures or low amplitude lines/edges. Thus, various blending combinations may be provided depending on activity.
Referring now to
Referring now to
At block 915, non-linear sharpening enhancement of the image data may be performed for the first frame. Non-linear image sharpening may be based on clipping of linear sharpened values above and below a maximum value and minimum value of a window surrounding each sample. According to another embodiment, non-linear image sharpening may include selecting horizontal and vertical windows of image data, and determining minimum and maximum values for the selected horizontal and vertical windows. The minimum and maximum values may be employed for clipping of linear sharpened (e.g., high-pass filtered) image data. Blending parameters may be generated at block 920, by a controller (e.g., controller 230), based on the image data of the first frame, linear sharpened image data for the first frame, and non-linear sharpened image data for the first frame. Additionally, the blending parameters may include a first blending parameter to control an amount of the image data of the first frame, a second blending parameter to control an amount of linear sharpened image data, and a third blending parameter to control an amount of non-linear sharpened image data. Blending parameters may be generated based on one or more of horizontal and vertical contrast, horizontal and vertical activity, a sharpness level, and a linear to non-linear sharpening ratio. The image data of the first frame, the linear sharpened image data, and non-linear sharpened image data may then be blended at block 925 based on the blending parameters.
Process 900 may further include performing border control to selectively apply sharpness to a border region of the image data of the first frame. Additionally, process 900 may include outputting sharpness enhanced image data and providing sharpening level output based on the image data of the first frame and the sharpness enhanced image data. As a result of the linear and non-linear sharpening provided by process 900, adaptive sharpness may be provided which overcomes one or more drawbacks of conventional techniques for sharpness enhancement. By way of example, generation of artifacts by conventional techniques may be eliminated.
Referring now to
One advantage of the proposed invention is to overcome artifacts which may be generated by combination of linear and/or non-linear data with image data for image enhancement. For example, referring now to
According to one embodiment, adaptive sharpness enhancement employing linear and/or non-linear sharpening may be improved by providing one or more of border control, strength control input and sharpening level output. As used herein, border control may relate to gradual on/off switching at the border of a programmable region of interest. Strength control may relate to allowing external modules to control a level of sharpening. Sharpening level control may relate to controlling the behavior of other modules based on an applied sharpness.
Referring now to
FIR=tap[0]*coef[0]+(tap[−1]+tap[+1])*coef[1]+ . . . +(tap[−n]+tap[+n])*coef[n]
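Expressed in code, the symmetric-tap evaluation above may be sketched as follows; replicating edge samples for taps that fall outside the line is an assumption.

def symmetric_fir(line, center, coefs):
    # coefs[0] weights the center tap; coefs[k] weights the symmetric pair
    # at offsets -k and +k, matching the FIR expression above.
    n = len(line)
    acc = coefs[0] * line[center]
    for k in range(1, len(coefs)):
        left = line[max(center - k, 0)]        # edge-replicate off the line
        right = line[min(center + k, n - 1)]
        acc += (left + right) * coefs[k]
    return acc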
Below is an exemplary horizontal filter programming for 2.25× upscaling of SD image data:
Continuing to refer to
As further shown in
Alpha generation module 1345 of device 1330 may relate to parameter generation functions and/or components of a controller (e.g., controller 230) for generation of blending parameters. Although a controller is not shown in
In one embodiment, three (3) parameters may be generated by alpha calculation module 1345 to be applied to blender 1350. According to one embodiment, generation of the parameters may be based on: horizontal and vertical “contrast” (e.g., the difference between in-window maximum and in-window minimum); horizontal and vertical “activity” (e.g., the maximum of absolute differences between window start and window end); sharpness level (e.g., strength) as a combination of input strength provided by border generation module 1335 and user control; and/or a user programmable ratio between linear and non-linear sharpening.
Alpha generation module 1345 can define a set of 3 blending parameters (alpha_ss, alpha_us and alpha_mm) based on the following window statistics:
max_h_full=max(all_pixels_in_horizontal_window)
max_v_full=max(all_pixels_in_vertical_window)
max_h_startend=max(horizontal_window_start, horizontal_window_end)
max_v_startend=max(vertical_window_start, vertical_window_end)
min_h_full=min(all_pixels_in_horizontal_window)
min_v_full=min(all_pixels_in_vertical_window)
min_h_startend=min(horizontal_window_start, horizontal_window_end)
min_v_startend=min(vertical_window_start, vertical_window_end)
wherein a window is defined as N pixels before and after a current pixel.
In one embodiment, the linear sharpening parameter may be calculated as follows:
alpha_us_h=max(max_h_full−max_h_startend, min_h_startend−min_h_full)*(US_MM/4)
alpha_us_v=max(max_v_full−max_v_startend, min_v_startend−min_v_full)*(US_MM/4)
alpha_us=max(max(0.00, min(1.00,alpha_us_h)), max(0.00,min(1.00,alpha_us_v))).
In one embodiment, the non-linear sharpening parameter, alpha_mm, may be calculated as shown below to complement the total result:
alpha_mm=1.00−alpha_us
A parameter, alpha_ss, may additionally be calculated to account for border generation, external strength input, user control and soft coring function (e.g., reduced sharpness for low contrast signals). In one embodiment, alpha_ss may be calculated as follows:
alpha_ss1=max(0.00, min(1.00, border_strength−input_strength));
w_contrast=max(max_h_full−min_h_full, max_v_full−min_v_full);
coring=(w_contrast−COR_START)*(COR_SLOPE); and
alpha_ss(v,h)=1.00−max(0.00, min(alpha_ss1, coring)).
Blender 1350 of
out1=alpha_us*in+(alpha_us+GAIN_HPF)*out_linear+alpha_mm*out_nonlinear
out=out1*(1.00−alpha_ss)+in*alpha_ss
Output of blender 1350 is clipped at block 1355 to provide a 10-bit dynamic range wherein the final enhanced image data may be output via output 1365.
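Gathering the parameter and blend equations above into a single per-pixel sketch: the grouping (alpha_us+GAIN_HPF) is taken as written above, and the tuning constants (US_MM, GAIN_HPF, COR_START, COR_SLOPE) are passed in as assumptions rather than disclosed values.

def blend_pixel(pix, out_linear, out_nonlinear, wh, wv,
                US_MM, GAIN_HPF, border_strength, input_strength,
                COR_START, COR_SLOPE):
    # wh, wv: horizontal and vertical windows of original samples around pix.
    max_h_full, min_h_full = max(wh), min(wh)
    max_v_full, min_v_full = max(wv), min(wv)
    max_h_se, min_h_se = max(wh[0], wh[-1]), min(wh[0], wh[-1])
    max_v_se, min_v_se = max(wv[0], wv[-1]), min(wv[0], wv[-1])
    # Linear sharpening parameter, clamped to [0, 1] per direction.
    a_us_h = max(max_h_full - max_h_se, min_h_se - min_h_full) * (US_MM / 4)
    a_us_v = max(max_v_full - max_v_se, min_v_se - min_v_full) * (US_MM / 4)
    alpha_us = max(min(max(a_us_h, 0.0), 1.0), min(max(a_us_v, 0.0), 1.0))
    alpha_mm = 1.0 - alpha_us      # non-linear parameter complements total
    # Strength and soft-coring parameter.
    alpha_ss1 = max(0.0, min(1.0, border_strength - input_strength))
    w_contrast = max(max_h_full - min_h_full, max_v_full - min_v_full)
    coring = (w_contrast - COR_START) * COR_SLOPE
    alpha_ss = 1.0 - max(0.0, min(alpha_ss1, coring))
    # Blend original, linear and non-linear data, then clip to 10 bits.
    out1 = (alpha_us * pix + (alpha_us + GAIN_HPF) * out_linear
            + alpha_mm * out_nonlinear)
    out = out1 * (1.0 - alpha_ss) + pix * alpha_ss
    return min(max(out, 0.0), 1023.0)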
According to another embodiment of the invention, it may be appreciated that the sharpness enhancement device of
In another embodiment, the sharpness enhancement device of
Referring now to
HPF_2D=GAIN_2D*(FIRh+FIRv);
HPF_EDH=GAIN_EDH*FIRedh; and
HPF_EDV=GAIN_EDV*FIRedv.
HPF gain reduction may be controlled by the edge detectors. At max block 1420, a maximum gain may be determined based on output of the horizontal and vertical edge detectors. Reduction of the gain may be characterized as follows:
HPF_ED=max(HPF_EDV,HPF_EDH)
HPF_2D_FINAL=HPF_2D*min(1.00,HPF_ED)
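A compact per-pixel sketch of this gating follows, assuming the edge detector responses are scaled so that a value of 1.00 or more indicates a full edge.

def gated_hpf(fir_h, fir_v, fir_edh, fir_edv, GAIN_2D, GAIN_EDH, GAIN_EDV):
    # 2D high pass output and the stronger of the two edge responses,
    # per the HPF_2D and HPF_ED equations above.
    hpf_2d = GAIN_2D * (fir_h + fir_v)
    hpf_ed = max(GAIN_EDV * fir_edv, GAIN_EDH * fir_edh)
    # Flat areas (small edge response) scale the peaking down; strong edges
    # leave it untouched via min(1.00, HPF_ED).
    return hpf_2d * min(1.0, hpf_ed)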
At clipping block 1425, gain reduction may be controlled by outputs of horizontal and vertical edge detectors. Look-up table (LUT) 1430 may be configured with interpolating/extrapolating logic to provide linear enhanced image data at output 1435, as will be discussed in more detail below with respect to
Referring now to
Referring now to
Referring now to
While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art. Trademarks and copyrights referred to herein are the property of their respective owners.
This application claims the benefit of U.S. Provisional Application No. 61/145,692, filed on Jan. 19, 2009, which is hereby fully incorporated by reference. This application is related to concurrently filed and commonly assigned application Ser. No. 12/637,526 entitled, “Method and Apparatus for Spectrum Estimation” which is hereby fully incorporated by reference.