The disclosed implementations are generally related to digital image processing.
Color correction tools are used in the film industry and other disciplines to alter the perceived color of an image. Conventional color correction tools typically allow users to perform primary and secondary color corrections. Primary color correction involves correcting the color of an entire image, such as adjusting the blacks, whites or gray tones of the image. Secondary color correction involves correcting a particular color range in an image. For example, a user may want to change the color of an object in an image from red to blue. The user would identify the range of red in the object and then push the hue to blue. This process could also be applied to other objects in the image.
Color corrections are usually performed in a color space, such as the ubiquitous RGB (Red, Green, Blue) color space. These color spaces can be represented by a three-dimensional (3D) coordinate system, where the three axes of the coordinate system represent the components associated with the color space. For example, in the RGB color space the three axes represent contributions of Red, Green and Blue. A color can be located in the RGB color space based on the Red, Green and Blue contributions to the color. Since color corrections are performed in 3D color space, many colorists could benefit from a 3D color visualization tool for making precise primary and secondary color adjustments to digital images.
The disclosed implementations relate generally to 3D histograms and other user interface elements for color correcting digital images.
In some implementations, a color correction method includes: generating a user interface for display on a display device, the user interface including a display area; generating a three-dimensional cube representing a color space for display in the display area; and generating a plurality of spheres for display within the cube, where the spheres are sized to represent pixel densities in a digital image.
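By way of a hedged illustration only (the bin count, the normalized image layout, and the cube-root radius scaling below are assumptions, not details taken from this disclosure), such a density-driven histogram might be computed as follows:

```python
import numpy as np

def histogram_spheres(image_rgb, bins=8):
    """Bin an RGB image into a bins x bins x bins grid and derive a sphere
    radius for each occupied bin, proportional to its pixel density.

    image_rgb: float array of shape (H, W, 3) with values in [0, 1].
    Returns a list of (center_rgb, radius) tuples in normalized cube units.
    """
    pixels = image_rgb.reshape(-1, 3)
    # Histogram the pixels over the unit RGB cube.
    counts, _ = np.histogramdd(pixels, bins=bins, range=[(0, 1)] * 3)

    spheres = []
    max_count = counts.max() or 1
    bin_width = 1.0 / bins
    for idx in np.argwhere(counts > 0):
        center = (idx + 0.5) * bin_width  # bin center inside the unit cube
        # Cube-root scaling: sphere volume, not radius, tracks pixel count.
        radius = 0.5 * bin_width * (counts[tuple(idx)] / max_count) ** (1 / 3)
        spheres.append((center, radius))
    return spheres
```

The cube-root scaling in this sketch makes sphere volume, rather than radius, track the pixel count of a bin, so dense regions stand out without overwhelming the display; other mappings could equally be used.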
In some implementations, a color correction method includes: generating a user interface for display on a display device, the user interface including a display area; and generating a color correction interface for display in the display area, the interface including a control for adjusting a selected hue range in a digital image, where the control allows for hue overstep.
Other implementations are disclosed that are directed to methods, systems, apparatuses, devices and user interfaces.
a is a screenshot of an exemplary 2D color correction interface for correcting a hue range.
b illustrates the concept of hue overstep.
c illustrates a user interaction with a hue overstep control.
a is a screenshot of an exemplary 3D histogram for HLS color space, showing a different viewer perspective.
b is a screenshot of an exemplary 3D histogram for HLS color space, showing a different viewer perspective (clockwise rotation about the Saturation axis).
c is a screenshot of an exemplary 3D histogram for HLS color space, showing a different viewer perspective (looking down along the Saturation axis).
When the Hue mode is selected, the color correction interface 200 displays several curves and controls for adjusting hue characteristics. In some implementations, curves are displayed for saturation 208, level high 210 (e.g., white level), level low 212 (e.g., black level) and hue range 216. In the example shown, the curves represent color corrections that will be applied to the digital image 102 based on the hue range contained in the selected region 104. For example, the area 218 under the curve 216 represents the range of hue in the region 104. Rather than displaying numeric values, the curves are displayed over a hue gradient that represents the colors contained in the digital image 102. The hue gradient provides an intuitive interface that is more closely aligned with how a colorist thinks about color correction. Note that the hue range curve 216 continues on the left side of the hue gradient surface, so that a portion of the area 218 under the curve 216 is on the left side of the hue gradient.
Using controls 204 and 206, the user can adjust the hue range and sigma of the digital image 102 based on the hue range contained in region 104. As used herein, “sigma” refers to the spread of the hue range curve 216 about its central position, which corresponds to a specific hue value. Various user interface elements can be used as controls (e.g., buttons, sliders, knobs, editable curves, etc.). In the example shown, a vertical bar 203 is displayed to plot a specific pixel of the digital image 102 sampled through a syringe tool; the vertical bar 203 falls in the middle of the hue range of region 104. The user can use the controls in the interface 200 to color correct the digital image 102. Other 2D interfaces are described in U.S. Pat. No. 7,693,341, entitled “Improved Workflows for Color Correcting Images.”
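For illustration only, the disclosure does not specify the functional form of the hue range curve, but a curve with a given center and sigma could be evaluated with circular (wraparound) hue handling roughly as follows; the Gaussian shape and degree units are assumptions:

```python
import numpy as np

def hue_range_weight(hue_deg, center_deg, sigma_deg):
    """Weight in [0, 1] describing how strongly a hue falls inside the
    selected hue range. Hue is treated as circular (0..360 degrees), so a
    range centered near 0 wraps around, which is why the hue range curve
    can continue past the left edge of the hue gradient.
    """
    # Signed circular distance between the pixel hue and the range center.
    delta = (hue_deg - center_deg + 180.0) % 360.0 - 180.0
    return np.exp(-0.5 * (delta / sigma_deg) ** 2)

# Example: a range centered on red (0 degrees) with a sigma of 20 degrees
# gives the same weight to hues at 350 and 10 degrees.
print(hue_range_weight(350.0, 0.0, 20.0), hue_range_weight(10.0, 0.0, 20.0))
```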
b illustrates the concept of hue overstep. A hue wheel 220 is a circle composed of colors that gradually transition between red, yellow, green, cyan, blue, magenta and back to red as one traverses the circle. In the hue wheel 220, the center is gray, and the colors become more saturated (i.e., richer) toward the outside ring. A hue range curve 224 represents the hue range in region 104 (
Referring to
The user can adjust the amount of hue overstep by manipulating a hue overstep control 232 until the desired color correction is achieved. By manipulating the hue overstep control 232, the color that is opposite blue on the hue wheel 228 (i.e., yellow) is added, in varying saturation amounts, to the selection range 236 of the current correction of the image, as shown in
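One way to read this behavior, sketched under stated assumptions (the blend rule, parameter ranges and function names below are illustrative, not the disclosed algorithm), is that the complementary hue contributes to the selection weight in proportion to the overstep amount and the pixel's saturation:

```python
import math

def circular_weight(hue_deg, center_deg, sigma_deg):
    # Same circular Gaussian weighting idea as the hue range sketch above.
    delta = (hue_deg - center_deg + 180.0) % 360.0 - 180.0
    return math.exp(-0.5 * (delta / sigma_deg) ** 2)

def selection_weight_with_overstep(hue_deg, sat, center_deg, sigma_deg, overstep):
    """Extend a hue-range selection so the complementary hue (180 degrees
    away on the hue wheel) also contributes, weighted by the overstep
    amount and by saturation. Hue is in degrees; sat and overstep in [0, 1].
    """
    base = circular_weight(hue_deg, center_deg, sigma_deg)
    # The complementary hue sits directly opposite on the hue wheel.
    opposite = circular_weight(hue_deg, (center_deg + 180.0) % 360.0, sigma_deg)
    # More saturated pixels of the opposite hue are pulled in more strongly
    # as the overstep control is increased.
    return min(1.0, base + overstep * sat * opposite)
```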
In the example shown, the 3D histogram includes a cube 300 representing a bounded color space (e.g., RGB color space) with three coordinate axes. A first axis 302 represents Red, a second axis 306 represents Blue and a third axis represents Green. A 3D color distribution 308 is displayed within the cube 300. In this example, the distribution 308 is a one-to-one representation of pixel values. That is, each pixel is represented by a single point inside the cube 300. The position of the point is determined by the contributions of the pixel's Red, Green and Blue components. For example, a pixel that contains only blue would be represented by a point located along the Blue axis 306 in the cube 300. Similarly, a pixel having equal amounts of red, green and blue lies on the gray diagonal of the cube 300; a mid-gray pixel, for example, would be represented by a point at the center of the cube 300.
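The one-to-one mapping described above is straightforward to express in code; the sketch below (the 8-bit input format and normalization to a unit cube are assumptions) converts each pixel to a point coordinate whose components are its Red, Green and Blue contributions:

```python
import numpy as np

def rgb_cube_points(image_rgb8):
    """Map every pixel of an 8-bit RGB image to a point in the unit cube.

    image_rgb8: uint8 array of shape (H, W, 3).
    Returns an (N, 3) float array; column 0 is the Red axis contribution,
    column 1 Green, column 2 Blue.
    """
    return image_rgb8.reshape(-1, 3).astype(np.float32) / 255.0

# A pure blue pixel maps to (0, 0, 1) on the Blue axis; a mid-gray pixel
# (128, 128, 128) maps to roughly the center of the cube.
print(rgb_cube_points(np.array([[[0, 0, 255], [128, 128, 128]]], dtype=np.uint8)))
```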
To assist the user in color correction of hue ranges, the hue/saturation gradient surface 602 includes projections 606 and 608 corresponding to references 610 and 612, respectively. In the example shown, the projections 606 and 608 are rectangles. The centers of the rectangles 606 and 608 represent the intersection of the Hue value and the Saturation value of the plotted pixel on those two axes. Other representations of projections are possible. Projections can be painted with the same color as their corresponding references or otherwise altered or embellished to form a visual association with a corresponding reference.
To assist the user in color correction of luminance ranges, the luminance gradient surface 604 includes projections 614 and 616 corresponding to references 610 and 612. In the example shown, the projections 614 and 616 are vertical bars and are painted with the same color as their corresponding references 610 and 612. Note that vertical bars are used instead of points because only the lightness axis in the HLS color space is represented. Thus, the gradient surface 604 provides a visual cue for the lightness axis only, while the gradient surface 602 for hue/saturation provides a visual cue for both the hue and the saturation axes in HLS color space. That is, the lightness gradient surface 604 is a 1D gradient and the hue/saturation gradient surface 602 is a 2D gradient because it includes two axes.
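As a minimal sketch of how a plotted pixel's components might be split between the two gradient surfaces, assuming an HLS conversion such as Python's colorsys (not something specified by this disclosure):

```python
import colorsys

def gradient_surface_projections(r, g, b):
    """Split a pixel's HLS components between the two gradient surfaces.

    r, g, b are in [0, 1]. Returns:
      - (hue, saturation): the 2D position of the rectangle on the
        hue/saturation gradient surface, and
      - lightness: the 1D position of the vertical bar on the lightness
        gradient surface.
    """
    h, l, s = colorsys.rgb_to_hls(r, g, b)  # note the H, L, S ordering
    return (h, s), l

# Example: a saturated red projects near hue 0 with full saturation,
# and onto the middle of the lightness gradient.
print(gradient_surface_projections(1.0, 0.0, 0.0))
```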
a is a screenshot of an exemplary 3D histogram for HLS color space. The 3D histogram is shown displayed in a display area 801 of a user interface 800. Also displayed is a hue gradient surface 802 for displaying projections 804 and 810 corresponding to references 806 and 808, respectively. A user interface element 803 can be used to select the 3D histogram for display in the display area 801. A user interface element 805 can be used to select a color space for the 3D correction histogram. In the example shown, HLS color space was selected. Other color spaces can also be represented by a 3D histogram (e.g., RGB, Y'CbCr, CIELAB, etc.). A user interface element 807 can be used to select between a proxy element (e.g., a sphere representing density) and a “cloud” of points representing pixel values without any density information, as shown in
b is a screenshot of the 3D histogram shown in
c is a screenshot of the 3D histogram of
When the correction interface is displayed, the user can make adjustments using one or more controls in the correction interface (e.g., a slider, button, editable curve, etc.). User interactions with the controls are received by the system/UI manager 902 and sent to the correction engine 906. The correction engine 906 includes various algorithms for generating color corrections, such as matrix transformations, color space warping and the like. The correction engine 906 also determines new color values for the 3D LUT 910. The 3D LUT 910 can be initialized by the system/UI manager 902 with color values upon the loading of the digital image. The digital image can be rapidly processed by the display engine 908, which replaces pixel values in the digital image that are in the sample range with corrected values provided by the 3D LUT 910. Techniques for color correcting digital images using a 3D LUT are described in co-pending U.S. patent application Ser. No. 11/408,783, entitled “3D LUT Techniques For Color Correction of Images”.
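For context, a bare-bones sketch of a 3D LUT lookup with trilinear interpolation is given below; the LUT size, lattice layout and interpolation scheme are assumptions for illustration, and the cited application describes the actual techniques:

```python
import numpy as np

def apply_3d_lut(pixels, lut):
    """Look up corrected colors for RGB pixels in a 3D LUT using trilinear
    interpolation.

    pixels: (N, 3) float array in [0, 1].
    lut:    (S, S, S, 3) float array; lut[r, g, b] holds the corrected color
            for the lattice point (r, g, b) / (S - 1).
    """
    size = lut.shape[0]
    scaled = np.clip(pixels, 0.0, 1.0) * (size - 1)
    lo = np.floor(scaled).astype(int)
    hi = np.minimum(lo + 1, size - 1)
    frac = scaled - lo

    out = np.zeros_like(pixels, dtype=np.float64)
    # Accumulate the eight surrounding lattice points, weighted trilinearly.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                idx_r = hi[:, 0] if dr else lo[:, 0]
                idx_g = hi[:, 1] if dg else lo[:, 1]
                idx_b = hi[:, 2] if db else lo[:, 2]
                w = ((frac[:, 0] if dr else 1 - frac[:, 0])
                     * (frac[:, 1] if dg else 1 - frac[:, 1])
                     * (frac[:, 2] if db else 1 - frac[:, 2]))
                out += w[:, None] * lut[idx_r, idx_g, idx_b]
    return out

# An identity LUT leaves colors unchanged; corrections warp the lattice values.
S = 17
grid = np.linspace(0.0, 1.0, S)
identity_lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_3d_lut(np.array([[0.25, 0.5, 0.75]]), identity_lut))
```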
The System/UI Manager 902 is responsible for generating and displaying the 3D histograms, shown in
The term “computer-readable medium” refers to any medium that participates in providing instructions to a processor 1002 for execution, including without limitation, non-volatile media (e.g., optical or magnetic disks), volatile media (e.g., memory) and transmission media. Transmission media includes, without limitation, coaxial cables, copper wire and fiber optics. Transmission media can also take the form of acoustic, light or radio frequency waves.
The computer-readable medium 1012 further includes an operating system 1016 (e.g., Mac OS®, Windows®, Linux, etc.), a network communication module 1018, one or more digital images or video clips 1020 and a color correction application 1022. The color correction application 1022 further includes a system/UI manager 1024, a correction engine 1026, a heuristic engine 1028, a display engine 1030 and one or more 3D LUTs 1032. Other applications 1034 can include any other applications residing on the user system, such as a browser, compositing software (e.g., Apple Inc.'s Shake® digital compositing software), a color management system, etc. In some implementations, the color correction application 1022 can be integrated with other applications 1034 or be configured as a plug-in to other applications 1034.
The operating system 1016 can be multi-user, multiprocessing, multitasking, multithreading, real-time and the like. The operating system 1016 performs basic tasks, including but not limited to: recognizing input from input devices 1010; sending output to display devices 1004; keeping track of files and directories on computer-readable mediums 1012 (e.g., memory or a storage device); controlling peripheral devices (e.g., disk drives, printers, GPUs 1006, etc.); and managing traffic on the one or more buses 1014. The network communications module 1018 includes various components for establishing and maintaining network connections (e.g., software for implementing communication protocols, such as TCP/IP, HTTP, Ethernet, etc.). The digital images 1020 can be a video clip of multiple digital images or a single image. The color correction application 1022, together with its components, implements the various tasks and functions, as described with respect to
The user system architecture 1000 can be implemented in any electronic or computing device capable of hosting a color correction application, including but not limited to: portable or desktop computers, workstations, mainframe computers, network servers, etc.
Various modifications may be made to the disclosed implementations and still be within the scope of the following claims.
This application is a continuation of U.S. patent application Ser. No. 11/408,741, filed Apr. 21, 2006, entitled “3D Histogram and Other User Interface Elements for Color Correcting Images”, which is related to co-pending U.S. patent application Ser. No. 11/409,553, filed Apr. 21, 2006, granted as U.S. Pat. No. 7,693,341, entitled “Improved Workflows For Color Correcting Images,” and U.S. patent application Ser. No. 11/408,783, filed Apr. 21, 2006, entitled “3D LUT Techniques For Color Correcting Images”. The subject matter of each of these patent applications is incorporated by reference herein in its entirety.
Number | Date | Country
---|---|---
20110316851 A1 | Dec 2011 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 11408741 | Apr 2006 | US
Child | 13227282 | | US