This application relates to computer-implemented vector illustration programs and methods and techniques for modifying vector objects using such programs and methods.
In conventional computer graphics or illustration programs, such as Adobe® Illustrator®, available from Adobe Systems Incorporated of San Jose, Calif., an image can include multiple graphical elements representing, e.g., shapes, objects or other components of the image. In general, vector illustration programs create and manipulate graphics in which such graphical elements are represented as vector objects that have attributes that define the visual characteristics of the corresponding graphical element in the image, such as location, color, size, shape, orientation, transparency, and the like, based on a set of attribute values. Vector objects describe graphics mathematically, using, e.g., lines and curves that define their geometric characteristics. For example, a bicycle tire in a vector graphic is made up of a mathematical definition of a circle drawn with a certain radius, set at a specific location, and filled with a specific color. Vector objects can be edited—e.g., the location, size, or color attributes can be changed—by changing attribute values associated with the object.
By contrast, many painting and image-editing programs, such as Adobe Photoshop, generate and manipulate bitmap or raster images. Raster images use a grid of pixels to represent graphics, with each pixel having a specific location and color value. A bicycle tire in a bitmap image is made up of a collection of pixels, with each pixel forming part of a mosaic that gives the appearance of a tire. A raster image is edited by editing pixels (i.e., by changing color or opacity values associated with a pixel), rather than objects or shapes.
The invention features vector illustration programs and methods in which a user specifies a modification operation to be applied at one or more locations in an image that includes graphical elements represented by vector objects. Based on user input, one or more graphical elements are identified that will be affected by the specified modification operation. The modification operation is operable to define a spatially-dependent change in visual characteristics of affected graphical elements by changing attribute values of the corresponding vector objects relative to the vector objects' location in a region of influence. For each identified graphical element, the relevant attribute value or values of the corresponding vector object are changed according to the specified modification operation.
In general, in one aspect, the invention features computer-implemented methods and apparatus, including computer program apparatus, implementing techniques for processing one or more vector objects representing graphical elements in an image. The vector objects are defined by one or more object-level attributes which define one or more visual characteristics of the vector objects independent of location within the vector object. User input applying a modification operation in the image is received. One or more vector objects to be affected by the modification operation are identified relative to a region of influence of the modification operation. One or more attribute values for one or more of the object-level attributes of each identified vector object are changed according to the modification operation, whereby the degree of change is defined based at least in part on an influence function associated with the modification operation.
Particular implementations can include one or more of the following features. Based on the user input, an intensity representing a strength of the modification operation can be defined at one or more locations in the region of influence. Attribute values for object-level attributes of each identified vector object can be changed based at least in part on the intensity. The region of influence can be an aggregate region of influence defined based on a plurality of applications of the modification tool in the image. The influence function can be an aggregate influence function defined based on a plurality of applications of the modification tool in the image. The intensity can be an aggregate intensity defined based on a plurality of applications of the modification tool in the image. Attribute values for object-level attributes of each identified vector object can be changed at a location in the image based at least in part on a value for the aggregate influence function and a value of the aggregate intensity defined for the location in the image.
User input selecting one of a plurality of image processing modes is received, whereby each of the image processing modes defines a method of changing attribute values for one or more of the object-level attributes. Attribute values for object-level attributes of each identified vector object can be changed based at least in part on the selected image processing mode. The plurality of image processing modes can include a randomization mode, a smoothing mode, and a sharpen mode. A random attribute value can be generated when attribute values for the object-level attributes of each identified vector object are changed according to the selected image processing mode. According to the selected image processing mode, attribute values for object-level attributes of each identified vector object can be changed relative to an average attribute value of an object-level attribute (e.g., towards or away from the average attribute value). The average attribute value for an object-level attribute of a given vector object can be calculated by determining a distance in the image between the location of the vector object and a location of each of a plurality of other vector objects in the image, and calculating the average attribute value for the object-level attribute at the location of the vector object based on an attribute value for the object-level attribute for each of the plurality of other vector objects and the distance between the location of the vector object and the location of each of the plurality of other vector objects in the image.
Attribute values for a plurality of object-level attributes of each identified vector object can be changed. The modification operation can be operable to modify object-level attributes, including, for example, color, tinting, size, orientation, oriented scale, style, transparency, and paint order. The influence function can be defined by a mask to be applied to the image. The strength of the modification operation at a given location in the image can be defined by a mask value for the given location. Attribute values for an oriented-scale attribute of each identified vector object can be changed, whereby the oriented-scale attribute includes one or more scale values representing a size of the corresponding vector object and one or more orientation values representing angles at which the scale attribute values are to be applied to the corresponding vector object. Attribute values for a tinting attribute can be changed. The tinting attribute includes, for a given vector object, a tint color value defining a tint color, a tinting-amount value defining a degree of tinting to be applied to the vector object, and a tinting function defining a result color for the vector object as a function of the tint color value of the vector object and an original color value of the color attribute of the vector object. The tinting attribute for a given vector object can be applied to generate one or more tinted colors for the vector object based on the original color values of the vector object, the tinting function, and the tinting-amount.
Attribute values for a tinting attribute can be interpolated between the original color value of the vector object's color attribute and the result color defined by the tinting function, using the tinting amount, according to the formula: finalColor=[(1−tintAmount)·origColor+tintAmount·resultColor], where finalColor is a value of the generated tint color, tintAmount is the tinting-amount, origColor is the original color value, and resultColor is the result color value produced by the tinting function. Attribute values for a paint-order attribute of each identified vector object can be changed, whereby the paint-order attribute includes one or more attribute values defining the order in which the vector object is painted in the group of vector objects when rendering the image. The user input can include input representing manipulation by a user of a modification tool defining the modification operation. Real-time (or near real-time) feedback of the changing attribute values can be provided to the user. A temporary visual representation of an attribute value for a vector object can be displayed within a footprint of the modification tool. The visual representation of the attribute value can be changed in response to the user's manipulation of the modification tool. The influence function can be defined at least in part by attribute values of a user-selected object in the image. The user input applying the modification operation can define one or more values for an input parameter derived from the user input. New attribute values for object-level attributes of each identified vector object can be calculated based at least in part on a value of the input parameter defined at the location of the identified vector object in the image. The input parameter can include one of direction, speed, pressure, and tilt. The object-level attributes can be vectors.
In general, in another aspect, the invention features computer-implemented methods and apparatus, including computer program apparatus, implementing techniques for processing vector objects having visual characteristics defined by one or more attribute values for one or more attributes in an image. User input applying a modification operation in the image is received. User input selecting one of a plurality of image processing modes is received, whereby each of the image processing modes defines a method of changing attribute values for one or more of the attributes. One or more vector objects to be affected by the modification operation are identified relative to a region of influence of the modification operation. One or more attribute values for one or more of the attributes of each identified vector object are changed according to the modification operation and the selected image processing mode, whereby the degree of change for each of the identified vector objects is defined based at least in part on the influence function and the method of changing attribute values.
Particular implementations can include one or more of the following features. The plurality of image processing modes can include a randomization mode, a smoothing mode, and a sharpen mode. A random attribute value can be generated when attribute values for the attributes of each identified vector object are changed according to the selected image processing mode. According to the selected image processing mode, attribute values for the attributes of each identified vector object can be changed relative to an average attribute value of an attribute (e.g., towards or away from the average attribute value). The average attribute value for an attribute of a given vector object can be calculated by determining a distance in the image between the location of the vector object and a location of each of a plurality of other vector objects in the image, and calculating the average attribute value for the attribute at the location of the vector object based on an attribute value for the attribute for each of the plurality of other vector objects and the distance between the location of the vector object and the location of each of the plurality of other vector objects in the image. The attributes can be vectors.
Advantages that can be seen in implementations of the invention include one or more of the following. Vector objects in a digital image can be painted and edited using intuitive brush-like modification tools while maintaining a vector model of the vector objects and their attributes. Modification tools can be configured to operate on a variety of attributes, including object-level appearance attributes such as orientation, color, opacity, transparency, and the like. The user can change the footprint, or region of influence, of the modification tool. The selected modification tool can affect all vector objects that lie within a region of influence created by the movement of the tool. A single application of the modification tool can affect different vector objects differently. The user can also control the degree of change for the attributes, that is, how much the attributes of vector objects are affected by a given movement of the tool. Defining dependencies between operations and attribute values of different vector objects allows a user to quickly change the appearance of a vector object, or a group of one or more vector objects. These advantages, by themselves or in combination, give the user great flexibility and add new ways of editing vector objects in digital images. The editing tools also provide a more intuitive way of modifying vector objects than conventional vector-based tools do.
The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other features of the invention will be apparent from the description and drawings, and from the claims.
Like reference symbols in the various drawings indicate like elements.
Thus, for example, the vector objects 200 can have object-level attributes, including color, size, orientation, style, transparency, and the like, defined independent of location within each vector object. Each attribute can have one or more attribute values that quantify the amount of the visual characteristic to be displayed in the electronic document. Attributes can be scalar attributes or vector attributes; each attribute can be represented in scalar form, as a vector, or as a combination of both.
Thus, for example, a vector object's location in an image may be defined by a pair of attribute values for a location attribute, representing x and y coordinates in the image; similarly, a vector object's color can be defined by one or more color values representing a color—for example, a single color value representing a shade of gray, a set of three color values representing RGB color values or the like.
A vector object's initial attribute values can be assigned based on user input, or can be derived from pre-determined or default values, and can be fixed (i.e., not subject to alteration based on subsequent user input or operations in illustration program 130) or variable (changeable based on user input or drawing operations). A vector object's visual appearance is defined by a set of current attribute values (e.g., at the time the vector object is to be displayed). The visual appearance of a vector object can be made to depend on the visual appearance of one or more other vector objects by defining relationships between the vector objects as described in contemporaneously filed U.S. patent application—“Creating and Manipulating Related Vector Objects in an Image” by Lubomir Bourdev and Martin Newell, which is incorporated by reference in its entirety.
In other implementations, the tool footprint (and region of influence) can be defined by a mask defining a masked region or regions and an unmasked region or regions, such that when the mask is applied to an image the relevant attribute value or values of any vector objects in the unmasked region of the image are changed, while the attribute values of vector objects located in the masked region remain unchanged (or vice versa).
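A mask-defined region of influence of this kind can be sketched as follows. This is a minimal illustration under assumed names and data layout (a binary mask over a pixel grid, objects as `(obj, (x, y))` pairs); the text also contemplates richer, multi-bit masks.

```python
def masked_objects(objects, mask, invert=False):
    """Select vector objects whose location falls in the unmasked
    region of a binary mask (or in the masked region when invert is
    True). objects is a list of (obj, (x, y)) pairs; mask is a 2D
    list where 1 marks the masked region."""
    def is_masked(x, y):
        return bool(mask[y][x])
    # Unmasked objects are the ones changed by default; invert flips
    # the rule so the masked region is changed instead.
    return [obj for obj, (x, y) in objects if is_masked(x, y) == invert]
```
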
Each modification tool can also be characterized by an influence function that defines the spatial distribution of the effect of the modification tool—that is, how attribute values of affected vector objects at a given location in the footprint (e.g., those vector objects within the tool's region of influence when the tool is applied to an image) will change as a function of the location. The influence function can be any predetermined or user-defined function, including constant functions, uniform variable functions, Gaussian functions, or arbitrary functions defined by user input. This can include input directly specifying a function or input indirectly specifying a function—e.g., selecting an image, object, or group of objects and defining the influence function as a function of one or more attribute values of the selected image, object, or group of objects.
Thus, for example, a modification tool can have a constant influence function specifying that a particular attribute value or values of any affected vector objects will be increased (or decreased) by a constant amount. Alternatively, an influence function can define a variable attribute value change of affected vector objects that will vary based on the location of each affected vector object relative to the region of influence. In implementations in which a mask defines the region of influence, the influence function can define multi-bit mask values at locations in the mask, which illustration program 130 can use to set attribute values for vector objects located at corresponding locations in the image (either absolutely or relatively, by altering existing attribute values based on mask values or a weighted average of mask values around the corresponding location). For example, orientation attribute values of vector objects can be set according to a luminosity gradient of a mask (or its perpendiculars). As with size and shape, the influence function of a modification tool can be variable in response to user input.
Each modification tool can also have an associated intensity that defines the strength of a given application of the modification tool—in other words, how much attribute values of affected vector objects (e.g., those vector objects within the tool's region of influence when the tool is applied to an image) will change as a result of the application of a modification tool to an image. The intensity of the modification tool can be constant, or variable. The intensity can also be user defined—e.g., based on pressure applied to a pressure-sensitive stylus, or the speed or direction in which a stylus or mouse pointer is moved, or the number of times that a mouse button is clicked, or the like. Where the operation involves multiple applications of the modification tool the intensity can be an aggregate intensity, based on the combined intensities for each discrete application of the modification tool.
In some implementations, modification tools can be manipulated using conventional graphical input techniques, such as user manipulation of a mouse, stylus or arrow keys, to apply the modification tool at one or more locations in an image. In response to user input manipulating a modification tool in this way, illustration program 130 changes one or more attribute values of vector objects based on their locations in the image relative to the location or locations of the modification tool. Thus, for example, in response to the application of a modification tool at a location in an image, illustration program 130 may identify a region of influence corresponding to the application of the tool footprint at the location, and increase attribute values for a specified attribute for all vector objects that fall within the region of influence by an amount determined by the tool's influence function and intensity. The effect of the modification tool as applied can also be altered—for example, by changing the size or shape of the tool footprint or by changing the influence function or intensity—in response to the user's manipulation of the tool.
The tool footprint (and resulting region of influence), influence function, and intensity of a given modification tool can be visually represented in illustration program 130 by, for example, a shape displayed in an image representing the tool footprint at a location in the image at which the modification tool is being or will be applied. Thus, for example,
Illustration program 130 identifies an attribute or attributes (step 320). If illustration program 130 is configured to provide multiple specialized (e.g., attribute-specific) modification tools, the attribute(s) can be identified based on the tool selected by the user; alternatively, the user can select an attribute separately—for example, when a generic tool is invoked, illustration program 130 can be configured to prompt the user to select an attribute or attributes on which the selected tool will operate.
Illustration program 130 receives a user input defining one or more applications of the selected modification tool (step 330). The user input can take the form of a gesture or motion of a graphical input tool, such as a mouse click or click and drag operation, or a stylus stroke on a digitizing tablet. The modification tool can, for example, be moved by the user by moving the mouse while holding down a mouse button, thereby defining a series of applications of the modification tool (e.g., at predetermined or variable intervals) along the path defined by the mouse motion. For each application of the modification tool, illustration program 130 determines an intensity (step 340). The intensity can be determined based on the selected modification tool (e.g., if the tool has a predetermined intensity), and, optionally, based on the user input received in step 330, such as based on pressure applied to a stylus at each application of the tool.
Illustration program 130 defines a region of influence (step 350). The region of influence can be defined based on the selected modification tool and the user input defining the application or applications of the tool, as in an accumulation of a number of discrete instances of a brush footprint as a stylus or mouse cursor is moved across an image, the unmasked (or masked) region of a mask, or the like. As discussed above, the footprint for each of the individual instances can itself vary based on the input—for example, changing size or shape based on the speed with which the mouse or stylus is moved, or the pressure exerted on a pressure-sensitive stylus. Where the operation involves multiple applications of the modification tool the region of influence can be an aggregate region of influence, based on the combined regions of influence for each discrete application of the modification tool.
Illustration program 130 also obtains an influence function that defines the strength of the modification operation as a function of location in the region of influence (step 360). If the modification operation involves only a single application of the modification tool, this influence function can simply be the influence function for the modification tool, as described above. Where the operation involves multiple applications of the modification tool the influence function can be an aggregate function, based on the combined influence functions for each discrete application of the modification tool.
Based on the region of influence, illustration program 130 identifies the vector objects that are affected by the operation (step 370). Affected vector objects are identified based on their location relative to the region of influence. For example, a given operation may affect any vector object that is located within the region of influence. A given operation can be configured to affect only vector objects that are located completely within the region of influence. Alternatively, an operation may affect any vector object any part of which falls within the region of influence.
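The two containment policies described in this step might be sketched as follows. The function name and the representation of an object as a list of sample points are assumptions made for illustration only.

```python
def affected_objects(objects, region_contains, mode="any"):
    """Identify vector objects affected by a region of influence.
    objects: list of (obj, points) pairs; region_contains tests a
    single point. mode 'any' affects an object if any part of it
    falls within the region; mode 'all' affects it only if it lies
    completely within the region."""
    test = any if mode == "any" else all
    return [obj for obj, points in objects
            if test(region_contains(p) for p in points)]
```
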
For each of the affected vector objects, illustration program 130 determines a new value or values for the specified attribute(s) (step 380). Illustration program 130 determines the new attribute value(s) for a given vector object based on the influence function and intensity at the location of the vector object in the region of influence.
This modification step will now be described by way of an example involving a single application of a modification tool having a circular region of influence that has a radius, an intensity and an influence function, which in this example defines the change in the value of an attribute as a distance percentage from the center of the modification tool. Let AmountOfEffect be described by:
AmountOfEffect = InfluenceFn(DistanceFromToolCenter/BrushRadius) * BrushIntensity
where AmountOfEffect is the amount an attribute of a vector object is changed in response to the region of influence;
InfluenceFn defines the change in the value of an attribute as a distance percentage from the center of the modification tool;
DistanceFromToolCenter is the distance of the corresponding vector object from the center of the modification tool;
BrushRadius is the radius of the modification tool; and
BrushIntensity indicates the intensity (or strength) of the modification tool.
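Taken together, these quantities can be sketched in code. This is a hypothetical implementation assuming a circular footprint; the linear falloff shown is just one possible influence function.

```python
import math

def amount_of_effect(obj_x, obj_y, tool_x, tool_y,
                     brush_radius, brush_intensity, influence_fn):
    """AmountOfEffect = InfluenceFn(distance/radius) * intensity for a
    vector object at (obj_x, obj_y); zero outside the circular region
    of influence centered at (tool_x, tool_y)."""
    distance = math.hypot(obj_x - tool_x, obj_y - tool_y)
    if distance > brush_radius:
        return 0.0  # the object lies outside the region of influence
    return influence_fn(distance / brush_radius) * brush_intensity

# One possible influence function: full effect at the tool center,
# falling off linearly to zero at the edge of the footprint.
def linear_falloff(t):
    return 1.0 - t
```
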
Illustration program 130 then moves to the next affected vector object (the YES branch of step 390). When new attribute values have been determined for all affected vector objects (the NO branch of step 390), the operation is complete.
Illustration program 130 can be configured to provide feedback to the user of how the attributes of the affected vector objects are being modified or edited. The feedback provided to the user can be any sensory feedback, such as visual, auditory, or haptic feedback. In some implementations, the feedback provided to the user can take the form of temporary visual cues representing current attribute values of affected vector objects in the region of influence (or tool footprint of the modification tool). The temporary visual cues can be provided in any of a number of forms that provide a user with visual feedback of how one or more identified attributes of affected vector objects are being modified. The temporary visual cues can be provided in real-time (or near real-time) during a user-controlled gesture of the modification tool, giving the user a preview of the modification to the identified attributes of the affected vector objects. In some implementations a slider (or other graphic) can be used to indicate the amount of change of an attribute from its previous value (or from any other attribute value which can be used as a reference value). For example, if the orientation of vector objects 400 as shown in
As mentioned above, illustration program 130 can be configured to provide modification tools capable of operating on multiple attributes in a single operation. Such modification tools can be configured to change attribute values for each of a plurality of attributes based on a single influence function; alternatively, two or more different influence functions can be provided, such that changes in attribute values of a first attribute will be determined based on one influence function, while changes in attribute values of a second attribute will be determined based on a different influence function. In addition, illustration program 130 can provide modification tools capable of combining multiple operations into a single modification operation, and to define dependencies between such operations. For example, referring to
Illustration program 130 can be configured to provide for multiple image processing modes. Image processing modes define how the attribute value or values of affected vector objects will be changed by a given application of a modification tool. A variety of image processing modes can be provided, and a single modification operation can incorporate attribute value changes based on two or more different image processing modes. For example, an attribute value for an attribute can be sequentially changed according to two different image processing modes; similarly, an attribute value for a first attribute can be changed according to a first image processing mode, and an attribute value for a second attribute can be changed according to a second image processing mode.
In an increase mode, the relevant attribute value or values of any affected vector objects are increased by a proportion (or percentage) as determined by the influence function. Similarly, a decrease mode can provide for the decrease of the relevant attribute value or values for affected vector objects. In a blending mode, the attribute value or values of affected objects can be blended with a target value or values, with the contributions of the target and current attribute values to the blend being determined by the influence function. Additional image processing modes can include a randomization mode, a sharpen mode, and a smoothing mode.
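The increase, decrease, and blending modes amount to simple per-value arithmetic; a sketch follows, where the function names are illustrative and `f` stands for the proportion supplied by the influence function.

```python
def increase_mode(current, f):
    """Increase mode: raise the attribute value by the proportion f."""
    return current * (1.0 + f)

def decrease_mode(current, f):
    """Decrease mode: lower the attribute value by the proportion f."""
    return current * (1.0 - f)

def blend_mode(current, target, f):
    """Blending mode: mix the current attribute value toward a target
    value, with the target's contribution f determined by the
    influence function."""
    return (1.0 - f) * current + f * target
```
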
In a randomization mode, illustration program 130 is configured to alter the relevant attribute value or values of each affected vector object by a random proportion from the current value. Thus, for example, referring to
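A randomization mode along these lines might be sketched as follows. The text does not fix a particular random distribution, so the uniform perturbation used here is an assumption.

```python
import random

def randomize_attribute(value, amount_of_effect, rng=random):
    """Randomization mode: perturb an attribute value by a random
    proportion of its current value, scaled by the tool's amount of
    effect at the object's location."""
    proportion = rng.uniform(-1.0, 1.0) * amount_of_effect
    return value * (1.0 + proportion)
```
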
In a smoothing mode, illustration program 130 applies a modification tool to change the attribute value or values of each affected vector object (by a proportion determined by the influence function) towards an average value of the attribute, which can include, for example, a global average value or the average value of the attribute calculated at the location of the vector object. For example, the local average F(x, y) of an attribute F at location (x, y) can be calculated as the average of the values of the attribute F_i for each graphical element i, weighted by an amount based on the graphical element's distance from (x, y). Preferably, a local average of an attribute can be defined using the values of the attribute of graphical elements within a predetermined distance, R, from (x, y). The weight w_i of a graphical element can be the difference between R² and the squared distance d_i² from the graphical element to (x, y), as follows:

w_i = R² − d_i², and F(x, y) = Σ(w_i·F_i)/Σw_i
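The distance-weighted local average can be sketched as follows; the names are illustrative, and graphical elements farther than R from the query point contribute nothing.

```python
def local_average(x, y, objects, R):
    """Local average of an attribute F at (x, y). objects is a list of
    (xi, yi, Fi) triples; each object within distance R of (x, y) is
    weighted by R**2 minus its squared distance to (x, y)."""
    num = den = 0.0
    for xi, yi, Fi in objects:
        d2 = (xi - x) ** 2 + (yi - y) ** 2
        if d2 < R ** 2:
            w = R ** 2 - d2
            num += w * Fi
            den += w
    return num / den if den else None  # None if nothing is in range
```
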
Thus, for example, referring to
In some implementations, illustration program 130 can be configured to modify attribute values by linear interpolation. In some implementations, linear interpolation can be used to compute a weighted average when an attribute is a scalar or a vector, as described in more detail below. Linear interpolation of the attributes of vector objects generally involves interpolating the corresponding attribute values of the particular attribute. Thus, for example, in linear interpolation of color attribute values, the respective colors are converted into the same color space, and the corresponding color components are interpolated. In one implementation, the interpolation of angles can be calculated as follows. To calculate the weighted average β of n angles α_1, …, α_n with weights w_1, …, w_n, the weighted sum of their corresponding unit vectors can be computed as [Σ cos α_i·w_i, Σ sin α_i·w_i], and the angle β of the resulting vector can be calculated by:

β = atan2(Σ sin α_i·w_i, Σ cos α_i·w_i)
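This vector-based averaging of angles can be sketched as follows, using atan2 to recover the angle of the summed vector; the names are illustrative.

```python
import math

def weighted_average_angle(angles, weights):
    """Weighted average of angles (in radians), computed by summing
    the weighted vectors [cos(a)*w, sin(a)*w] and taking the angle of
    the result. Unlike a naive arithmetic mean of the angle values,
    this handles angles that wrap around."""
    sx = sum(math.cos(a) * w for a, w in zip(angles, weights))
    sy = sum(math.sin(a) * w for a, w in zip(angles, weights))
    return math.atan2(sy, sx)
```
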
The techniques described herein can be applied to a wide variety of attributes. Referring to
To compose multiple oriented-scales, the oriented scale matrices can be composed to obtain a result matrix which is of the form:

| A B |
| B D |
Thus, the resulting α, Sx, and Sy are given by the following:
where A = cos²α·Sx + sin²α·Sy, B = cos α·sin α·Sy − cos α·sin α·Sx, and D = cos²α·Sy + sin²α·Sx. In the example of
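A sketch of building and composing oriented-scale matrices follows, assuming the convention R(−α)·diag(Sx, Sy)·R(α), which reproduces the A, B, D entries above; the function and variable names are illustrative.

```python
import math

def oriented_scale_matrix(alpha, sx, sy):
    """Symmetric 2x2 matrix scaling by (sx, sy) along axes rotated by
    alpha. Entries follow A = cos^2(a)*Sx + sin^2(a)*Sy,
    B = cos(a)*sin(a)*Sy - cos(a)*sin(a)*Sx, and
    D = cos^2(a)*Sy + sin^2(a)*Sx."""
    c, s = math.cos(alpha), math.sin(alpha)
    a = c * c * sx + s * s * sy
    b = c * s * sy - c * s * sx
    d = c * c * sy + s * s * sx
    return [[a, b], [b, d]]

def compose(m1, m2):
    """Compose two oriented scales by 2x2 matrix multiplication."""
    return [[sum(m1[i][k] * m2[k][j] for k in range(2))
             for j in range(2)] for i in range(2)]
```
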
The techniques can also be applied to a tinting attribute, which can be defined for each vector object by an associated tinting function, tint color, and tint amount. A tinting function takes an intrinsic color (e.g., the original color of the vector object) and a tint color and produces a result color. The final color can be calculated by interpolating the result color and the original color with the tint amount as follows:
finalColor = (1−tintAmount)·origColor + tintAmount·tintfn(origColor, tintColor)
where finalColor is the final color computation of the tinting attribute, tintAmount is the tint amount, origColor is the original color, tintfn is the tinting function, and tintColor is the tint color; tintfn(origColor, tintColor) produces the result color (which can be expressed as resultColor).
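The tinting computation can be sketched directly from the formula. The channel-wise multiply shown here is only one example tinting function; the text leaves tintfn abstract.

```python
def tinted_color(orig_color, tint_color, tint_amount, tint_fn):
    """finalColor = (1 - tintAmount)*origColor + tintAmount*resultColor,
    applied per color channel, where resultColor is produced by
    tint_fn(orig_color, tint_color)."""
    result = tint_fn(orig_color, tint_color)
    return tuple((1.0 - tint_amount) * o + tint_amount * r
                 for o, r in zip(orig_color, result))

def multiply_tint(orig, tint):
    """An example tinting function: channel-wise multiply."""
    return tuple(o * t for o, t in zip(orig, tint))
```
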
Referring to
The techniques can also be applied to vector objects having a paint-order attribute, as illustrated in
The invention can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations of them. Apparatus of the invention may be implemented in a computer program product tangibly embodied in a machine-readable storage device for execution by a programmable processor; and method steps of the invention can be performed by a programmable processor executing a program of instructions to perform functions of the invention by operating on input data and generating output. The invention can be implemented advantageously in one or more computer programs that are executable on a programmable system including at least one programmable processor coupled to receive data and instructions from, and to transmit data and instructions to, a data storage system, at least one input device, and at least one output device. Each computer program can be implemented in a high-level procedural or object-oriented programming language, or in assembly or machine language if desired; and in any case, the language can be a compiled or interpreted language. Suitable processors include, by way of example, both general and special purpose microprocessors. Generally, a processor will receive instructions and data from a read-only memory and/or a random access memory. Generally, a computer will include one or more mass storage devices for storing data files; such devices include magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and optical disks. Storage devices suitable for tangibly embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM disks. Any of the foregoing may be supplemented by, or incorporated in specially-designed ASICs (application-specific integrated circuits).
A number of embodiments of the invention have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the invention. For example, the order of performing steps of the methods described can be changed. Accordingly, other embodiments are within the scope of the following claims.