1. Field of the Invention
The present invention relates to animation editing apparatus for editing animation data, and a method of editing animation data in a data processing system.
2. Description of the Related Art
Computerised systems for the editing of animation data have been used for some time. In order to provide a human editor access to the required editable parameters of an animation, it is known for such systems to display a hierarchical representation of items defining a whole scene. The problem with this approach is that, for a complex, high-resolution scene, the editor may be confronted with a hierarchy containing thousands of items representing hundreds of simulated objects and their associated attributes.
According to a first aspect of the present invention there is provided animation editing apparatus for editing animation data, said apparatus comprising data storage means, processing means, visual display means and a manually responsive input device configured to allow a user to indicate a selected point on the visual display means, wherein: said visual display means is configured to display an image representing a simulated three-dimensional world-space including a plurality of simulated objects; said manually responsive input device is configured to provide an input signal indicating a location within said image corresponding to one of said simulated objects; said processing means is configured to identify the selected simulated object in response to receiving said input signal, and to retrieve from said data storage means data of one or more items related to said selected simulated object within a defined degree of relationship; and said visual display means is configured to display labels identifying only the selected simulated object and said related items.
An animation artist 101 equipped with a computer system 102 for editing animated graphics is shown in
As an alternative to using a mouse 103, the artist 101 could be provided with a stylus/touch-tablet combination, or a trackball or similar manually responsive input device.
An example of an animation to be edited by the user 101 is shown in
Computer system 102 is detailed in
The present invention is embodied by an animation editing program installed from a CD ROM 310 via the CD-ROM drive 305.
A flow chart outlining the operation of the system 102 is shown in
Data contained within main memory 302 following step 404 is shown in
The project data 503 is illustrated in greater detail in
The hierarchy data 510 defines relationships existing between items defined by data 511, 512, 513 and 514. Thus, the hierarchy data 510 defines relationships between simulated objects defined by data 511, material items defined by data 512, animations defined by data 513, and lights, cameras, etc. defined by data 514.
The hierarchy data may be stored within the memory of the system 102 as a database. A table representing a database containing hierarchy data 510 is shown in
The example provided in
Two other items labelled “target scene” and “scene renderer” are also included in the database. The “target scene” defines the overall composition of the animation. The “scene renderer” is a process for rendering the three dimensional animation, defined by the target scene, into a two dimensional animated image that is suitable for display.
It should be understood that an item may be a data-set, or a process which defines a part of an animation, such as a simulated object, an attribute of an object, the overall composition of the animation or the rendering of the animation.
The relationships existing between the items of the database are illustrated by the third column, “PARENT OF”, and the fourth column, “CHILD OF”, of the table. The two relationships are the opposite of each other, and thus, if item “A” is a parent of item “B”, then item “B” is a child of item “A”. For example, the sixth line shows that “SPHERE1” is the parent of “Texture-Whitehouse” (node identity 11) and “Animation-Orbit H” (node identity 16), while line 11 shows that “Texture-Whitehouse” is a child of “SPHERE1” (node identity 6) and line 16 shows that “Animation-Orbit H” is a child of “SPHERE1” (node identity 6). Thus the attributes of an object are children of that object.
The two spheres, “SPHERE1” and “SPHERE2”, have been constrained to the object “CUBE1” such that they follow the movement of “CUBE1” during the animation. An offset constraint is used so that the spheres are held apart from “CUBE1”. Because the spheres are constrained to “CUBE1”, they are its children, as indicated in
The database therefore contains data indicating which items are directly related and the nature of the relationship, that is, “parent of” or “child of”. The database is therefore designed such that given the node identity of an item, the node identities of its children may be looked up in the third column, or its parent may be looked up in the fourth column.
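By way of a hedged illustration only, the look-up behaviour described above can be sketched as a small in-memory structure. The class and method names below are illustrative assumptions, not part of the invention; the node identities 6, 11 and 16 and the item labels follow the “SPHERE1” example given in the text.

```python
# Minimal sketch of the hierarchy database: each row stores an item's
# label, its "CHILD OF" entries (parents) and its "PARENT OF" entries
# (children), keyed by node identity. Names are illustrative assumptions.

class HierarchyDatabase:
    def __init__(self):
        # node_id -> (label, parent_ids, child_ids)
        self.rows = {}

    def add(self, node_id, label, parents=(), children=()):
        self.rows[node_id] = (label, list(parents), list(children))

    def label(self, node_id):
        return self.rows[node_id][0]

    def parents_of(self, node_id):
        # fourth column of the table: "CHILD OF"
        return self.rows[node_id][1]

    def children_of(self, node_id):
        # third column of the table: "PARENT OF"
        return self.rows[node_id][2]

db = HierarchyDatabase()
db.add(6, "SPHERE1", children=[11, 16])
db.add(11, "Texture-Whitehouse", parents=[6])
db.add(16, "Animation-Orbit H", parents=[6])
```

Given a node identity, the children are looked up via one column and the parent via the other, reflecting that the two relationships are inverses of each other.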
A scene tree according to the prior art is shown in
The scene tree of
Lead-lines drawn up from the set 714 and cube 701 to another node 717, representing the target scene, indicate that the target scene is the parent of said set and said cube. Lead-lines from the node 717 and a node 718 representing a defined camera, “CAMERA1”, up to scene renderer node 719 show that the scene renderer is the parent of said camera and the target scene.
The graphical animation illustrated in
In contrast, as described below, the system 102 provides a user interface which allows its user to navigate around the objects of a scene and related attributes in such a way that only items closely related to a user selected item are displayed. The user is therefore presented with only information which is relevant to their present interest, and only of a limited volume, thus making it relatively easy to comprehend when compared to the scene tree of
Furthermore, the animation editing system 102 preferably includes character registration mapping as described in the applicant's co-pending Canadian patent application published as CA 2 314 712. It has been found that the character registration mapping in combination with the features of the graphical user interface described herein allows a user to perform animation editing without the need to refer to a scene tree, such as that shown in
Another point to note from
A graphical user interface (GUI) 801 is produced by the application program and displayed on the visual display unit 104 shown in
The icon window 802 contains a number of icons which facilitate the creation of new simulated objects, the addition of materials to objects within the scene, the animation of objects within the scene, etc.
The navigation window 804 displays a number of labels representing selected items defining the animation. The particular items displayed by the navigation window are selected by the application program in response to the user's input. Specifically, when the system receives an input indicating that a simulated object in viewer window 803 has been selected by the user, the system displays a label representing said selected object at the top of the navigation window 804, and then displays labels representing other items which are directly related to the selected simulated object.
“Directly related” is herein defined as meaning “being a parent of, or being a child of”. Thus if two items are directly related, then one is a child of the other, and when a simulated object is selected by the user, labels representing said object, the child or children of said object and the parent of said object are displayed in the navigation window.
For example, in
The user 101 is therefore presented with only the portion of the hierarchical structure that they are interested in, rather than being confronted with the whole scene tree.
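The selection behaviour described above can be sketched as follows, assuming simple parent/child mappings; the data, ordering of related labels and function name are illustrative assumptions rather than details of the actual system.

```python
# Sketch of the navigation-window rule: show the selected object's
# label at the top, then labels of directly related items only
# (its parent and its children). Data and names are assumptions.

PARENT_OF = {"CUBE1": ["SPHERE1", "SPHERE2"],
             "SPHERE1": ["Texture-Whitehouse"]}
CHILD_OF = {"SPHERE1": "CUBE1",
            "SPHERE2": "CUBE1",
            "Texture-Whitehouse": "SPHERE1"}

def navigation_labels(selected):
    """Labels displayed in the navigation window for a selected object."""
    labels = [selected]                          # selected label at the top
    parent = CHILD_OF.get(selected)
    if parent is not None:
        labels.append(parent)                    # its parent, if any
    labels.extend(PARENT_OF.get(selected, []))   # its children
    return labels
```

Selecting a different object simply re-runs the same rule, so the window never shows more than one selected item and its immediate relatives.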
The application selects suitable tools for editing the selected item and displays said tools within window 805.
After selecting a particular simulated object, the user may then select another such object by clicking on the relevant object in viewer window 803. Alternatively, the user may navigate around the hierarchy structure by clicking on labels displayed in the navigation window 804. For example, the user could view the items directly related to the cube 201 by clicking on the label “CUBE1” 813, or if they wished to edit the texture applied to the sphere 203 they could click on the label 812.
The application program is therefore structured such that if the system receives an input indicating that a label within the navigation window has been selected, it displays the selected label at the top of the navigation window and displays labels of directly related items below it. An example of this functionality is provided by
The windows 802 and 803 remain unchanged in appearance, and so continue to display creation tool icons and a view of the animation, while window 805 is updated to display appropriate editing tools.
If the user now wishes to divert their attention to the cube 201, they may update the navigation window by selecting the relevant label 813 using the mouse.
The processing of data in response to user-generated input commands at step 405 of
Alternatively, if the user input corresponds to the operation of an editing tool or creation tool, project data will be updated at step 1103. At step 1104, the graphical user interface is updated in response to the user input. Then at step 1105, a question is asked to determine whether the end of the editing session has been indicated by the user input, and if so step 405 is completed. Otherwise steps 1101 to 1105 are repeated.
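The loop of steps 1101 to 1105 can be sketched as below; the function parameters stand in for the boxes of the flow chart and are assumptions introduced for illustration only.

```python
# Hedged sketch of the input-processing loop (steps 1101-1105):
# receive an input, update project data for editing/creation commands,
# refresh the interface, and repeat until the session ends.

def process_user_inputs(get_input, is_edit_command, update_project,
                        update_interface, is_session_end):
    while True:
        command = get_input()            # receive the next user input
        if is_edit_command(command):
            update_project(command)      # step 1103: update project data
        update_interface(command)        # step 1104: update the GUI
        if is_session_end(command):      # step 1105: end of session?
            break
```

Note that the interface is refreshed on every pass, whether or not the input changed the project data, matching the flow described above.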
The step 1104 of updating the user interface is shown in more detail in
At step 1203 a question is asked to determine whether the user input indicated the selection of a simulated three dimensional object, and if so, step 1204 is performed before step 1205. Otherwise step 1205 is performed directly after step 1203. At step 1204 node labels are displayed corresponding to the selected simulated object and items directly related to said selected object only. Thus, unlike the prior art illustrated in
At step 1205 a question is asked to determine whether the received user input indicated the selection of a label in the navigation window 804. If this is so, then the selected label is displayed in the navigation window 804 along with labels of directly related items only. Completion of step 1206, or a negative answer to the question at step 1205 completes step 1104.
The step 1204 of displaying labels for a selected simulated object and directly related objects is shown in further detail in
In an alternative embodiment, as well as displaying labels of the parent and children of the selected simulated object, the parent of the parent is also displayed. In a further alternative embodiment, as well as displaying labels of the parent and children of the selected simulated object, the children of the children are also displayed. However, in the preferred embodiment and in these two alternative embodiments, the system only displays labels for the selected simulated object and related items within a defined degree of relationship.
The step 1206 of displaying a selected label with directly related items is shown in further detail in
Having obtained node identities at step 1401, node labels corresponding to the retrieved parent and child node identities, and for the selected label are retrieved from the database at step 1402. At step 1403 the selected node label is displayed at the top of the navigation window 804, and the node labels for the parents and children are also displayed in the navigation window during step 1404.
A second example of an animation project to be edited by the user 101 is illustrated in
A conventional scene tree representing the animation of
The character comprises a group of simulated objects in the form of an internal skeleton which allows the character to be positioned and animated, and external objects constrained to the skeleton to provide him with a dressed, human-like appearance. Thus the scene tree has a family of nodes, shown within dashed line 1601, which comprise the skeleton of the character, and other nodes, shown within dashed line 1602, which comprise its outer body.
As can be seen in
In this example, the body of the character is formed as a single object and represented by node 1603. The body is the parent of other objects including the shirt 1504 and trousers 1503, represented by nodes 1604 and 1605. The shirt and trousers are constrained to the body, so that their animation is determined by the animation of the body. The shirt and trousers are thus children of the body as illustrated by the scene tree. The shirt, trousers and body each have applied textures as represented by nodes 1606, 1607 and 1608 respectively.
The visual display unit 104, as it appears during editing of the animated character 1501, is shown in
The navigation window 804 is shown in
The windows 802 and 803 (not shown in
The body of the character 1501 is one of several simulated objects in a group labelled “CHARACTER#8” which defines the character 1501. Consequently, “CHARACTER#8” is the parent of said body and so label 1701 is displayed below label 1702.
The simulated objects which provide the appearance of the trousers, shirt and hair of the character 1501 are constrained to the body and so they are children of the body. Thus labels 1801, 1802 and 1803 representing the trousers, shirt and hair are displayed below label 1702 representing the body.
As shown in the scene tree of
Foreign Application Priority Data
Number | Date | Country | Kind
---|---|---|---
0216814.4 | Jul 2002 | GB | national
U.S. Patent Documents
Number | Name | Date | Kind
---|---|---|---
5261041 | Susman | Nov 1993 | A |
5267154 | Takeuchi et al. | Nov 1993 | A |
5347306 | Nitta | Sep 1994 | A |
5483630 | Unuma et al. | Jan 1996 | A |
5511158 | Sims | Apr 1996 | A |
5546518 | Blossom et al. | Aug 1996 | A |
5577185 | Tunnell et al. | Nov 1996 | A |
5619632 | Lamping et al. | Apr 1997 | A |
5717848 | Watanabe et al. | Feb 1998 | A |
5786814 | Moran et al. | Jul 1998 | A |
5896139 | Strauss | Apr 1999 | A |
5907704 | Gudmundson et al. | May 1999 | A |
6049805 | Drucker et al. | Apr 2000 | A |
6189012 | Mital et al. | Feb 2001 | B1 |
6237006 | Weinberg et al. | May 2001 | B1 |
6329994 | Gever et al. | Dec 2001 | B1 |
6337700 | Kinoe et al. | Jan 2002 | B1 |
6373484 | Orell et al. | Apr 2002 | B1 |
6437784 | Bentley et al. | Aug 2002 | B1 |
6567070 | Light et al. | May 2003 | B1 |
6701313 | Smith | Mar 2004 | B1 |
6714201 | Grinstein et al. | Mar 2004 | B1 |
6741242 | Itoh et al. | May 2004 | B1 |
6801916 | Roberge et al. | Oct 2004 | B2 |
7061486 | Sowizral et al. | Jun 2006 | B2 |
7106334 | Imagawa et al. | Sep 2006 | B2 |
7292243 | Burke | Nov 2007 | B1 |
20010035873 | Easter | Nov 2001 | A1 |
20020111932 | Roberge et al. | Aug 2002 | A1 |
20030011601 | Itoh et al. | Jan 2003 | A1 |
20030076336 | Fukao et al. | Apr 2003 | A1 |
20030191765 | Bargh et al. | Oct 2003 | A1 |
20030197702 | Turner et al. | Oct 2003 | A1 |
20040210386 | Wood et al. | Oct 2004 | A1 |
20050171746 | Thalhammer-Reyero | Aug 2005 | A1 |
20060262121 | Grassia et al. | Nov 2006 | A1 |
Prior Publication Data
Number | Date | Country
---|---|---
20040012640 A1 | Jan 2004 | US