1. Field of the Invention
The present invention relates to apparatus for processing image data, a method of processing image data and a computer-readable medium.
2. Description of the Invention
The digitisation of image processing has enabled many new image manipulation techniques to be developed. Available digital processing effects include a process of color warping, in which color attributes of an image, or area of an image, can be modified in some way. Common uses for such a technique are compensation for camera or film color distortions and special effects.
Many image processing systems provide control over color through the use of gamma correction curves. A gamma correction curve defines a transfer function that is applied to red, green and blue image data values, in such a way that a color transformation may occur. However, manipulation of such curves to produce satisfactory results is extremely difficult. In the case of creating special effects, the lack of intuitive feel of such an approach makes achieving useful results particularly hard.
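The conventional per-channel gamma approach described above can be sketched as follows. This is an illustrative reconstruction only; the function name and the example gamma values are assumptions, and channel values are assumed normalised to the range zero to one.

```python
# Hypothetical sketch of conventional per-channel gamma correction.
# Channel values are assumed normalised to [0, 1]; gammas are illustrative.

def apply_gamma(rgb, gammas=(1.0, 1.2, 0.8)):
    """Apply an independent power-law transfer function to each channel."""
    return tuple(max(0.0, min(1.0, c)) ** g for c, g in zip(rgb, gammas))

# A mid-grey pixel is shifted towards magenta by these example gammas:
# the green channel is darkened while the blue channel is lightened.
shifted = apply_gamma((0.5, 0.5, 0.5))
```

Because each channel is warped independently, predicting the combined color shift from three separate curves is difficult, which is the lack of intuitive feel the text refers to.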
From a mathematical perspective, many systems provide color transformations defined in terms of matrices. Matrices may be used to define arbitrary transformations in color space, just as they are used in the more familiar world of computer modelling and computer-aided design. However, although such techniques theoretically provide an enhanced level of control over color space, and have the potential to facilitate useful color warping tools, the lack of an intuitive relation between the mathematics and the effect upon the colors of an image makes these techniques difficult to use.
According to an aspect of the present invention, there is provided apparatus for processing image data, comprising storing means for storing instructions, memory means for storing said instructions during execution and for storing image data, processing means for performing image processing in which said image data is processed to modify color values, and display means for facilitating user interaction with said image processing, wherein said processing means is configured such that, in response to said instructions, said image data is processed by the steps of: identifying a color vector and a luminance range for said color vector; defining a color vector function in response to said identification, in which said color vector is a function of luminance; processing source image data to identify luminance values; and modifying colors in response to said luminance values with reference to said color vector function.
The invention will now be described by way of example only with reference to the accompanying drawings.
A system for the processing of image data is illustrated in FIG. 1.
A computer 103 facilitates the transfer of image data between the tape player 101 and the frame store 102. The computer 103 also facilitates the modification, processing and adjustment of image data to form an output clip that will eventually be stored onto digital tape. The computer is a Silicon Graphics Octane (™). Images are previewed on a monitor 104 on which is also displayed a graphical user interface (GUI) to provide the user with several controls and interfaces for controlling the manipulation of image data. When processing image data, the user interacts with images and the graphical user interface displayed on the monitor 104 via a graphics tablet 105. For alphanumeric input, there is provided a keyboard 106, although facilities may be provided via the graphical user interface to facilitate occasional text input using the graphics tablet 105.
In addition to receiving image data from the tape player 101 and the frame store 102, the computer 103 may receive image and/or other data over a network. The image processing system shown in FIG. 1 may therefore receive image data from a variety of sources.
In a typical application, film clips are digitised and stored on digital tape for transfer to the system shown in FIG. 1. The film clips include several camera shots that are to be combined into the same scene. It is the task of the user or digital artist to combine and process this source image data into a single output clip that will be stored back onto tape for later transfer to film or video. Typical examples of this type of scene are where real images shot by a film camera are to be combined with artificially generated images and backgrounds, including scenes where actors are to be placed in computer-generated environments.
The computer 103 shown in FIG. 1 is detailed in FIG. 2.
The switch 206 enables up to seven different non-blocking connections to be made between connected circuits. A graphics card 208 receives instructions from CPU 201 or 202 in order to render image data and graphical user interface components on the monitor 104. A high bandwidth SCSI bridge 209 enables high bandwidth communications with the digital tape player 101 and the frame store 102. An I/O bridge 210 provides input/output interface circuitry for peripherals, including the graphics tablet 105, the keyboard 106 and a network. A second SCSI bridge 211 provides interface connections with an internal hard disk drive 212. This has a capacity of thirteen gigabytes. The second SCSI bridge 211 also provides connections to a CDROM drive 213, from which instructions for the central processing units 201 and 202 may be installed onto the hard disk 212.
Steps performed by the user when operating the image processing system shown in FIG. 1 are detailed in FIG. 3.
If starting on a new job, it will be necessary to obtain image data from film or video clips stored on digital tapes. This is done at step 305, where input clips are transferred from the tape player 101 to the digital frame store 102. Once a finished clip has been generated from the input clips, this is exported to tape at step 306. Alternative forms of import and export of image data may be performed as necessary, including transfer of image data over a network, transfer of image data from CDROM or transfer of data directly from a camera that may be connected to the input of a suitably equipped graphics card 208. Once finished using the image processing system, at step 307 the user logs off from their account and the computer and other equipment are switched off if necessary.
The contents of the main memory 207 shown in FIG. 2 are detailed in FIG. 4.
In the present embodiment, the main memory includes Flame instructions 402 for image processing. The present applicant has image processing applications that include Flame (™), and the word Flame will henceforward refer to an improved version of Flame, operating in accordance with the present invention. Flame instructions 402 include color warper instructions 403. The instructions 402 and 403 may originate from a CDROM 303 or over a network connection, such as an Internet connection.
Main memory 207 further comprises a workspace 404, used for temporary storage of variables and other data during execution of instructions 401, 402 and 403 by the processors 201 and 202. The main memory also includes areas for source image data 405, a color vector function 406, a color vector look-up table (LUT) 407 and output image data 408.
Image processing 304 shown in FIG. 3 is detailed in FIG. 5.
At step 502, image processing other than color warping is performed. Many operations may be performed at step 502, including effects such as color keying, image distortion, motion blur, and so on.
Color warping is a process in which a general shift in color is applied to an image. Known systems provide color warping using gamma curves for red, green and blue color components. While these curves provide comprehensive control of color, the relation between the user's interaction with such curves and the resulting change in color in an output image is non-intuitive.
At step 503 an image is identified for color warping, and the color vector function 406 is initialised so as to have no effect. At step 504 color warping is performed in accordance with the present invention, and in accordance with operations performed by the processors 201 and 202 in response to the color warping instructions 403. At step 505 a question is asked as to whether the color warping result is satisfactory. If not, control is directed to step 504, and the color warp effect is modified. Eventually, after several iterations, the resulting output image will have a satisfactory appearance. Thereafter, control is directed to step 506, where a question is asked as to whether another image requires color warping. If so, control is directed to step 503. Alternatively, definitions of color warping for an image or plurality of images are complete, and control is directed to step 507.
At step 507 a question is asked as to whether the color warping defined at step 504 should be animated. Color warping at different frames may be used to control an interpolated color warp for intermediate frames. This enables a gradually changing color warp to be applied over the duration of a clip. If an animated color warp is required, control is directed to step 508, where intermediate frames in the clip have their images modified automatically, without the need to repeat step 504 for each intermediate frame on an individual basis.
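The interpolation of step 508 can be sketched as follows. The source states only that intermediate frames are interpolated between keyframe warps; the representation of a warp as twenty-seven control-point values follows the curve description given later in this document, and the linear blend is an assumption.

```python
# Hedged sketch of step 508: interpolating a color vector function between
# two keyframes. Each function is assumed to be its 27 control-point values
# (nine each for the red, green and blue curves); the linear blend is an
# assumption, since the source does not specify the interpolation law.

def interpolate_warp(points_a, points_b, t):
    """Blend two sets of control points for a frame fraction t in [0, 1]."""
    return [a + t * (b - a) for a, b in zip(points_a, points_b)]

start = [1.0 / 3.0] * 27          # identity warp at the first keyframe
end = [1.0 / 3.0 + 0.05] * 27     # a slightly shifted warp at the last
midway = interpolate_warp(start, end, 0.5)
```

Each intermediate frame is then processed with its own interpolated warp, so the color shift changes gradually over the duration of the clip.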
At step 509 a question is asked as to whether more image processing is required, for example, for other clips. If so, control is directed to step 502. Alternatively, image processing is complete, and the resulting output clips may be exported to tape or other medium, at step 510.
Color warping 504, as performed in accordance with the present invention, is summarised in FIG. 6.
The color vector graph 611 has three components, one each for red 612, green 613 and blue 614. These components can be made to vary in their proportions as a function of luminance 615. For any given luminance Y′, the red, green and blue values add up to give a total of one. At either end of the graph 611, the color vector is zero, and the three curves converge to a common value of one third. The vertical axis of the graph is scaled in such a way that one third appears as half the maximum color displacement.
A minimum luminance 616 and a maximum luminance 617 define a range of luminance over which a color vector will be added to the color vector function 406 that is already displayed in the graph 611. The color vector is defined by user manipulation of a graphical user interface widget in the form of a trackball 618. The trackball's axes correspond to the color dimensions Pb and Pr of the Y′PbPr color space. The user can drag the center 619 of the trackball in any direction 620. The magnitude of this movement defines the amplitude of the color vector that is being added to the graph. The direction of this movement defines the color. As soon as the drag operation is finished, the trackball 618 reverts to its central state, thereby enabling the user to accumulate many such vector inputs. By also modifying the luminance range using the markers 616 and 617, the user is quickly able to build up a complex color vector function 406.
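The trackball reading described above can be sketched as follows. This is an illustrative reconstruction, not the system's actual code; the function name is hypothetical.

```python
import math

# Sketch of the trackball 618 reading: a drag offset in the Pb/Pr plane
# yields a color vector whose magnitude gives the amplitude and whose
# direction selects the color. Names are illustrative.

def trackball_vector(dpb, dpr):
    """Convert a drag offset (dpb, dpr) into (amplitude, direction_radians)."""
    amplitude = math.hypot(dpb, dpr)
    direction = math.atan2(dpr, dpb)
    return amplitude, direction

amp, ang = trackball_vector(0.3, 0.0)   # a drag along the +Pb axis
```

Because the trackball snaps back to center after each drag, successive vectors accumulate onto the function rather than replacing it.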
The color vector function 406 defined at step 601 is represented as a set of nine points for each of the red, green and blue curves shown in the graph. At step 602 the color vector function 406 is used to create a color vector look-up table 407 (LUT). The use of a look-up table 407 enables subsequent image processing to take place with minimal computation requirements. At step 603 the source image 405 is processed with reference to the LUT 407 created at step 602, resulting in the generation of an output image 408. Finally, at step 604, the output image is displayed on the monitor 104, so that the user can determine whether or not the result is satisfactory, and what modifications might be required in the next iteration of the color warping steps 601 to 604.
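The tabulation at step 602 can be sketched as follows. The table size of 256 and the dictionary layout are assumptions for illustration; the stand-in curve functions take the place of the B-spline curves the system actually stores.

```python
# Illustrative sketch of step 602: sampling the continuous color vector
# function into three look-up tables, one per channel, so that per-pixel
# processing needs only an indexed read. Table size is an assumption.

LUT_SIZE = 256

def build_lut(red_curve, green_curve, blue_curve, size=LUT_SIZE):
    """Tabulate each curve at `size` evenly spaced luminance values."""
    lut = {"r": [], "g": [], "b": []}
    for n in range(size):
        y = n / (size - 1)          # luminance in [0, 1]
        lut["r"].append(red_curve(y))
        lut["g"].append(green_curve(y))
        lut["b"].append(blue_curve(y))
    return lut

identity = lambda y: 1.0 / 3.0      # an unmodified curve: constant one third
lut = build_lut(identity, identity, identity)
```

Sampling once per edit keeps the per-pixel cost of step 603 down to three table reads, regardless of how complex the curves become.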
The interface presented to the user on the monitor 104 when performing color warping 504 is shown in FIG. 7. The source and output images 405 and 408 are displayed in the top half of the screen. Transport controls 701 and a timeline 702 enable a user to select individual frames from a clip, or to preview or render a sequence of frames or an entire clip. Other controls are provided for the control of color warp animation and the saving and loading of settings. The color vector graph 611 and the trackball 618 are at the bottom of the screen. The luminance markers 616 and 617, in combination with the trackball 618, facilitate quick definition of a range of luminance values and a color vector to be added to the existing color vector function over the identified range 616, 617 of luminance values.
Examples of the types of color vector functions that can be achieved are shown in their graph form 611 in FIG. 8. With the range markers 616, 617 set to luminance values of zero and one respectively, color vectors defined by user manipulation of the trackball 618 cause a general change to the red, green and blue color curves, as shown at 801. With the maximum marker 617 moved to a luminance of one quarter, changes can then be made to the curves over a selected small range of luminance, with no changes to the curves outside this range, as shown at 802. After multiple iterations of range selection and color vector addition, complex curves can be created, as shown at 803. The level of complexity shown at 803, however, can be built up extremely quickly due to the nature of the interface provided.
The step of defining a color vector function, shown at 601 in FIG. 6, is detailed in FIG. 9.
At step 902 the color vector, expressed as Pb and Pr co-ordinates, is translated into barycentric co-ordinates for red, green and blue. These barycentric co-ordinates represent the difference to be added to the red, green and blue curves of the existing color vector function. At step 903 these red, green and blue increments are applied proportionately to existing red, green and blue curves over the selected range of luminance values. Function characteristics outside the selected range are not affected by changes made inside the selected range. Furthermore, the color vector defined by the trackball movement has maximum effect in the center of the identified range, and practically no effect at its minimum 616 and maximum 617 points.
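The weighting described above, maximum effect at the center of the range and practically none at its end points, can be sketched as follows. The raised-cosine window is an assumption; the source states the shape of the effect but not the exact window function used.

```python
import math

# Sketch of the step 903 weighting under stated assumptions: the increment
# is scaled by a raised-cosine window over the selected luminance range,
# one way to obtain maximum effect at the range centre and none at the
# end points 616 and 617. The exact window shape is not given in the text.

def weighted_increment(y, y_min, y_max, increment):
    """Scale `increment` by a window that peaks at the range centre."""
    if not (y_min < y < y_max):
        return 0.0                            # outside the range: no change
    t = (y - y_min) / (y_max - y_min)         # position within range, 0..1
    window = 0.5 - 0.5 * math.cos(2.0 * math.pi * t)
    return increment * window

# At the centre of the range the full increment is applied.
delta = weighted_increment(0.5, 0.25, 0.75, 0.1)
```

Applying this weight to each of the red, green and blue increments before adding them to the curves leaves the function untouched outside the selected range.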
The curve data that is modified comprises nine data points for each color. Each point has a value, and the collection of twenty-seven data values defines the color vector function. For subsequent processing, these curves require continuous representation. At step 904, B-Splines are created to represent the newly updated red, green and blue curves. Finally, at step 905, the color vector graph 611 is updated so that the user has an immediate view of the effect of his or her actions on the graph, as well as on the output image. Steps 901 to 905 all take place as soon as the user makes a modification using the trackball 618.
The translation of a color vector into barycentric co-ordinates, shown at step 902 in FIG. 9, is illustrated in FIG. 10.
Locations of red, green and blue are shown in relation to the PbPr color plane at 1003. The red, green and blue points are joined by lines to form a triangle. This triangle is divided into three by lines drawn from the red, green and blue points to the center at PbPr=(0,0).
Calculations for obtaining barycentric co-ordinates in accordance with the processes illustrated in FIG. 10 are detailed in FIG. 11.
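The barycentric translation can be sketched as follows. The positions of the red, green and blue points in the Pb/Pr plane below follow the standard BT.601 Y′PbPr matrix and are an assumption about the co-ordinates the system uses; the formula itself is the standard one for barycentric co-ordinates in a triangle.

```python
# Hedged illustration of step 902: converting a color vector (pb, pr) into
# barycentric weights relative to the red, green and blue points in the
# Pb/Pr plane. Primary positions assume the BT.601 Y'PbPr matrix.

RED   = (-0.169,  0.500)
GREEN = (-0.331, -0.419)
BLUE  = ( 0.500, -0.081)

def barycentric(pb, pr, a=RED, b=GREEN, c=BLUE):
    """Return (u, v, w) such that u*a + v*b + w*c = (pb, pr) and u+v+w = 1."""
    det = (b[1] - c[1]) * (a[0] - c[0]) + (c[0] - b[0]) * (a[1] - c[1])
    u = ((b[1] - c[1]) * (pb - c[0]) + (c[0] - b[0]) * (pr - c[1])) / det
    v = ((c[1] - a[1]) * (pb - c[0]) + (a[0] - c[0]) * (pr - c[1])) / det
    return u, v, 1.0 - u - v

# The centre of the plane, PbPr = (0, 0), gives equal weights of one third,
# matching the converged curves at the ends of the graph 611.
u, v, w = barycentric(0.0, 0.0)
```

The three weights sum to one by construction, which is why the red, green and blue curves in the graph 611 always add up to one at any given luminance.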
The proportionate modification of red, green and blue curves, shown at step 903 in FIG. 9, is detailed in FIG. 12.
At step 1205 the current value REDCTRL for the red control point is modified by multiplying it by the REDFACTOR calculated at step 1102 in FIG. 11.
Updating the color vector LUT 407, performed at step 602 in FIG. 6, is detailed in FIG. 13.
At step 1304, Pb and Pr co-ordinates are obtained from the barycentric co-ordinates U, V and W. This may be considered as the inverse of the process described in FIG. 11. At step 1305, the resulting Y′PbPr values are converted to RGB color space.
At step 1306 the LUT 407 is updated. The LUT comprises three parts, one table each for red, green and blue values. Each of these tables is addressed by the value N, selected at step 1301, and is written with the value calculated at step 1305 for the respective color. At step 1307 a question is asked as to whether another address needs to be considered. If so, control is directed to step 1301. Alternatively, this completes the LUT update process 602.
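The LUT update loop of steps 1301 to 1307 can be sketched end to end as follows. The BT.601 primary positions and matrix coefficients are assumptions, as is the table size; the source defines the structure of the loop but not these specific numbers.

```python
# Sketch of the LUT update loop (steps 1301 to 1307). The primary positions
# in the Pb/Pr plane and the inverse-matrix coefficients assume BT.601; the
# stand-in curve function replaces the system's B-spline curves.

RED_PBPR   = (-0.169,  0.500)
GREEN_PBPR = (-0.331, -0.419)
BLUE_PBPR  = ( 0.500, -0.081)

def update_lut(curves, size=256):
    """curves(y) returns barycentric weights (u, v, w) at luminance y."""
    lut_r, lut_g, lut_b = [], [], []
    for n in range(size):                          # step 1301: next address N
        y = n / (size - 1)                         # luminance for this address
        u, v, w = curves(y)                        # curve values at y
        # step 1304: barycentric co-ordinates back to Pb and Pr
        pb = u * RED_PBPR[0] + v * GREEN_PBPR[0] + w * BLUE_PBPR[0]
        pr = u * RED_PBPR[1] + v * GREEN_PBPR[1] + w * BLUE_PBPR[1]
        # step 1305: Y'PbPr to RGB via the inverse matrix (BT.601 assumed)
        lut_r.append(y + 1.402 * pr)               # step 1306: write tables
        lut_g.append(y - 0.344136 * pb - 0.714136 * pr)
        lut_b.append(y + 1.772 * pb)
    return lut_r, lut_g, lut_b

identity = lambda y: (1 / 3, 1 / 3, 1 / 3)   # an unmodified color vector function
lut_r, lut_g, lut_b = update_lut(identity)
```

With the unmodified function, the barycentric centre maps to PbPr = (0, 0), so each table simply reproduces its input luminance and the warp has no effect, as required of the initialisation at step 503.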
Y′PbPr color space may be considered as having a cylindrical shape with a central axis Y′, that is a vector extending out from the origin of RGB color space, as shown at 1401. Conversion between these color spaces may be achieved by a matrix, and the parameters required for a transformation from RGB to Y′PbPr are detailed at 1402. Transformation from RGB to Y′PbPr may be assigned to a matrix A. The inverse of A, A−1, provides transformation from Y′PbPr to RGB. There is an intuitive relationship between these color spaces for colors of pure black and pure white, as shown at the bottom of FIG. 14. Matrix A−1 is used in step 1305 to convert from Y′PbPr color space to RGB color space.
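The forward transformation can be illustrated as follows. The coefficients below are the standard BT.601 values and are an assumption about the parameters detailed at 1402; the black and white check mirrors the intuitive relationship noted at the bottom of FIG. 14.

```python
# Illustrative RGB -> Y'PbPr transformation matrix A, assuming the standard
# BT.601 coefficients for the parameters detailed at 1402.

A = [
    [0.299,     0.587,      0.114],      # Y'  (luminance)
    [-0.168736, -0.331264,  0.5],        # Pb
    [0.5,       -0.418688, -0.081312],   # Pr
]

def rgb_to_ypbpr(r, g, b):
    """Apply matrix A to an RGB triple, giving (Y', Pb, Pr)."""
    return tuple(m[0] * r + m[1] * g + m[2] * b for m in A)

# Pure black maps to (0, 0, 0) and pure white to (1, 0, 0): both lie on the
# central Y' axis, with no chroma component.
black = rgb_to_ypbpr(0.0, 0.0, 0.0)
white = rgb_to_ypbpr(1.0, 1.0, 1.0)
```

Inverting A gives the matrix used at step 1305 to return from Y′PbPr to RGB.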
Processing the source image, performed at step 603 in FIG. 6, is detailed in FIG. 15.
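The per-pixel processing of step 603 can be sketched as follows. Each source pixel's luminance selects an entry in the three tables; the additive combination rule shown here is an assumption made for illustration, since the exact rule is not given in the text above.

```python
# A per-pixel sketch of step 603 under stated assumptions: the pixel's
# luminance (BT.601 weights assumed) addresses the three LUTs, and the
# looked-up shift is applied additively. The combination rule is
# illustrative only.

def warp_pixel(r, g, b, lut_r, lut_g, lut_b):
    """Return the color-warped (r, g, b) for one source pixel."""
    size = len(lut_r)
    y = 0.299 * r + 0.587 * g + 0.114 * b          # pixel luminance
    n = min(size - 1, int(y * (size - 1) + 0.5))   # nearest LUT address
    # Apply the looked-up values as a shift relative to the pixel's luminance.
    return (r + lut_r[n] - y, g + lut_g[n] - y, b + lut_b[n] - y)

# With an identity LUT (entry n holds n/(size-1)), pixels are left almost
# unchanged, up to the quantisation of the LUT address.
identity = [n / 255 for n in range(256)]
out = warp_pixel(0.25, 0.5, 0.75, identity, identity, identity)
```

Because the per-pixel work reduces to one luminance calculation and three table reads, the warp remains cheap even for complex curve shapes.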
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
0008469 | Apr 2000 | GB | national

U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
4839718 | Hemsky et al. | Jun 1989 | A
5305994 | Matsui et al. | Apr 1994 | A
5416890 | Beretta | May 1995 | A
5544284 | Allebach et al. | Aug 1996 | A
6072902 | Myers | Jun 2000 | A
6317128 | Harrison et al. | Nov 2001 | B1
6429875 | Pettigrew et al. | Aug 2002 | B1
6504551 | Takashima et al. | Jan 2003 | B1

Foreign Patent Documents

Number | Date | Country
---|---|---
0392565 | Oct 1990 | EP
2045026 | Oct 1980 | GB
080036640 | Feb 1996 | JP
WO 9810586 | Mar 1998 | WO

Publication Data

Number | Date | Country
---|---|---
20010036310 A1 | Nov 2001 | US