This invention relates to generating a shadow for a three-dimensional (3D) model based on bones in the infrastructure of the 3D model.
A 3D model includes a virtual skeleton/infrastructure comprised of bones that are arranged in a hierarchical tree structure. Surrounding the bones is a polygon mesh, comprised of polygons such as triangles, which represents the skin of the 3D model. Movement of the polygon mesh is tied to the movement of the bones so that the 3D model approximates real-life movement when the bones are re-positioned.
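The hierarchical bone structure described above can be sketched as a simple tree of named bones. This is an illustrative sketch only, not the patent's data layout; the `Bone` fields and names are assumptions.

```python
# A minimal sketch of a bones-based infrastructure: a tree of bones, each
# defined by a head and tail point. Repositioning a bone implicitly moves
# its children, so the polygon mesh tied to each bone follows along.
from dataclasses import dataclass, field

@dataclass
class Bone:
    name: str
    head: tuple                     # (x, y, z) start of the bone vector
    tail: tuple                     # (x, y, z) end of the bone vector
    children: list = field(default_factory=list)

def count_bones(root):
    """Walk the hierarchical tree structure and count its bones."""
    return 1 + sum(count_bones(c) for c in root.children)

# Two-bone example: an "arm" attached to a "spine" (made-up values).
spine = Bone("spine", (0, 0, 0), (0, 1, 0))
arm = Bone("arm", (0, 1, 0), (1, 1, 0))
spine.children.append(arm)
```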
The 3D model inhabits a virtual world, in which the distance to a virtual camera dictates perspective. A virtual light source, positioned in the environment of the virtual world, is used as a reference point for projecting shadows of the 3D model onto surfaces in the environment.
The 3D data for model 10 also includes bone data. The bone data defines a rigid infrastructure 12 of model 10 (FIG. 2). The infrastructure corresponds to the skeletal structure of a living being. In this embodiment, the “bones” that make up the infrastructure are Cartesian XYZ-space vectors.
In pre-processing phase 16, process 14 receives (301) data that corresponds to the size and/or shape of a shadow to be generated. The data may be input by an animator (user) via a graphical user interface (GUI) (not shown) or the like.
Process 14 reads (302) 3D data that defines the geometry of a frame of 3D animation. The 3D data may be read from a local memory, from a remote database via a network such as the Internet, or obtained from any other source. The data includes 3D data that defines a 3D model, including its polygon mesh (see, e.g.,
Process 14 locates (303) a virtual camera and virtual light source in the environment of the read frame of 3D animation. The location of a virtual camera defines a viewer's perspective for the frame of 3D animation. For example, in
In
Process 14 generates (305) a shadow on surface 36 based on projection 48 of bone 34. Generating a shadow for each bone in a 3D model results in a shadow for the entire 3D model itself. It is noted that process 14 may not generate a shadow for every bone in a 3D model. For example, process 14 may generate shadows for only “major” bones in a 3D model, where major bones may be defined, e.g., by their length (bones that are greater than a predefined length) or proximity to the trunk/body of a 3D model (bones that are within a predefined number of bones from a predefined reference bone).
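The projection step can be illustrated by casting a ray from the virtual light source through each bone endpoint and intersecting it with a surface. This sketch assumes a ground plane at y = 0 and a point light; the coordinates are made-up values, not from the patent.

```python
# Hedged sketch: project a bone (a line segment) onto the plane y = 0
# from a point light source.
def project_point(light, p):
    """Cast a ray from the light through p and intersect it with y = 0."""
    lx, ly, lz = light
    px, py, pz = p
    t = ly / (ly - py)              # ray parameter where y reaches 0
    return (lx + t * (px - lx), 0.0, lz + t * (pz - lz))

def project_bone(light, head, tail):
    """The bone's projection is the segment between its projected endpoints."""
    return project_point(light, head), project_point(light, tail)

light = (0.0, 10.0, 0.0)            # light directly above the origin
shadow = project_bone(light, (1.0, 2.0, 0.0), (1.0, 5.0, 0.0))
```

Repeating this for each (major) bone yields the projected segments from which the full shadow of the model is built.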
Process 14 generates the shadow based on the data received (301) in pre-processing phase 16. That is, the data defines the size and shape of the shadow. Process 14 therefore generates the shadow accordingly. This is done by creating (305a) a shape over at least part of projection 48 of the bone. The shape may be created, e.g., by growing a polygon from projection 48 (for the purposes of this application, the definition of “polygon” includes smooth-edged shapes, such as a circle, ellipse, etc.).
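Growing a polygon from the projection can be sketched, in the simplest case, as offsetting the projected 2D segment perpendicular to itself by half a shadow width. The width parameter stands in for the user-supplied size data; the quadrilateral construction is an illustrative assumption.

```python
# Sketch of "growing" a quadrilateral from a projected bone segment
# lying in the ground plane (treated here as 2D points).
import math

def grow_quad(a, b, width):
    """Return the 4 corners of a quadrilateral around segment a-b."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    length = math.hypot(dx, dy)
    # Unit normal to the segment, scaled to half the shadow width.
    nx, ny = -dy / length * width / 2, dx / length * width / 2
    return [(a[0] + nx, a[1] + ny), (b[0] + nx, b[1] + ny),
            (b[0] - nx, b[1] - ny), (a[0] - nx, a[1] - ny)]

quad = grow_quad((0.0, 0.0), (4.0, 0.0), 1.0)
```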
By way of example, referring to
Process 14 maps (305b) one or more textures onto the shape (e.g., quadrilateral 50) that was created over projection 48. The texture(s) may define a color of the shape as well as how transparent or translucent the shadow is. That is, it may be desirable to see objects covered by the shadow. Therefore, a light color that is relatively transparent may be mapped. For example, a texture with an alpha transparency value of 50% may be used for the mapping.

A “fuzzy” texture may also be mapped onto edges or other portions of the shape. In this context, a fuzzy texture is a texture that does not have sharp edges, meaning that the edges fade out from darker to lighter (hence the use of the term “fuzzy”). Fuzzy textures provide softer-looking shadows, which can be difficult to construct using other techniques.
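The fading behavior of such a texture can be illustrated by generating a row of alpha values that hold a 50%-transparent core and fade linearly to fully transparent at the edges. The texture size, fade distance, and fade curve here are all illustrative assumptions.

```python
# Sketch of a "fuzzy" shadow texture row: a flat 0.5-alpha core with a
# linear fade to 0.0 at each edge, giving the soft-edged look.
def fuzzy_alpha_row(width, fade):
    """Return one row of alpha values for a fuzzy shadow texture."""
    row = []
    for x in range(width):
        edge = min(x, width - 1 - x)        # distance to the nearest edge
        a = 0.5 * min(1.0, edge / fade) if fade else 0.5
        row.append(round(a, 3))
    return row

row = fuzzy_alpha_row(8, 2)
```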
It is noted that process 14 may be used with other animation that does not necessarily have a bones-based infrastructure. In this case, bones may be defined for a 3D model and then process 14 may be applied. For example, bones may be defined for the veins of leaves on a tree 56 (FIG. 9). Process 14 may project shadows 58 (
As another example, process 14 may be used to generate a shadow of a ball (not shown). In this example, a spherical “bone” or a linear bone that points to the virtual light source (i.e., that looks like a point relative to the virtual light source) may be used to represent the ball. The bone may be projected onto a surface and a shape, such as a circle or an ellipse, may be grown from the projection. The type of shape that is grown may be defined by the user-input data or it may be determined by analyzing the shape of the bone. For example, a spherical bone may dictate a circular shape and a linear bone may dictate a rectangular shape.
Referring to
During run-time phase 64, process 61 identifies the bone infrastructure of a 3D model (e.g., 3D model 10 or 20) and, for each bone in the 3D model, proceeds as follows.
Process 61 generates (1101) a bounding volume for the bone. The bounding volume of the bone is an expansion of a two-dimensional (2D) bone into 3D space. Referring to
Process 61 generates (1102) a shadow of bone 66 by projecting (1102a) a shape of bounding volume 65 onto surface 72. In more detail, process 61 draws lines (e.g., vectors) 73 from virtual light source 74, through locations on the surface (e.g., the perimeter) of bounding volume 65, onto surface 72. The number of lines drawn depends on the shape of bounding volume 65. For example, if bounding volume 65 is a cylinder (as shown), only a relatively small number of lines (e.g., four) may be required to project the shadow. On the other hand, if bounding volume 65 is a sphere, more bounding lines may be required to achieve a shadow that is a relatively close approximation of the shape of the bounding volume.
To project the shape, process 61 connects points, in this example, points 77, 78, 79 and 80 at which lines 73 intersect surface 72. Connecting the points results in a shape 83 that roughly corresponds to the outline of bounding volume 65 relative to virtual light source 74.
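The two steps above can be sketched by intersecting each light ray with the ground plane and collecting the intersection points into a polygon. This assumes a ground plane at y = 0 and a point light; the four sample points stand in for perimeter locations on a cylindrical bounding volume, and all coordinates are made-up values.

```python
# Hedged sketch of projecting a bounding volume's outline: cast a line
# from the light through each sampled perimeter point, intersect it with
# the ground plane y = 0, and connect the intersections into a shape.
def intersect_ground(light, p):
    """Intersect the ray light->p with the plane y = 0; return (x, z)."""
    lx, ly, lz = light
    px, py, pz = p
    t = ly / (ly - py)                      # ray parameter at y = 0
    return (lx + t * (px - lx), lz + t * (pz - lz))

def project_outline(light, perimeter_points):
    """The shadow shape is the polygon connecting the projected points."""
    return [intersect_ground(light, p) for p in perimeter_points]

light = (0.0, 10.0, 0.0)
# Four perimeter points of a cylindrical bounding volume (illustrative).
corners = [(1.0, 2.0, -0.5), (1.0, 2.0, 0.5),
           (1.0, 5.0, 0.5), (1.0, 5.0, -0.5)]
shape = project_outline(light, corners)
```

Connecting the four projected points in order yields a quadrilateral that roughly traces the bounding volume's outline as seen from the light, onto which textures are then mapped.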
Process 61 maps (1102b) one or more textures onto shape 83 created by connecting the points. The texture mapping, and contingencies associated therewith, are identical to the texture mapping described above with respect to process 14.
As was the case above, process 61 may not generate a bounding volume and shadow for every bone in a 3D model. For example, process 61 may generate shadows for only “major” bones in a 3D model, where major bones may be defined, e.g., by their length or proximity to the trunk/body of a 3D model.
Processes 14 and 61 both have the advantage of using existing data, e.g., bones, to generate a shadow with relatively little computational effort. Moreover, both processes give the user control over the look and feel of the resulting shadow.
Processes 14 and 61, however, are not limited to use with the hardware and software of
Processes 14 and 61 may be implemented in hardware, software, or a combination of the two. Processes 14 and 61 may be implemented in computer programs executing on programmable machines, each of which includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and one or more output devices. Program code may be applied to data entered using an input device, such as a mouse or a keyboard, to perform processes 14 and 61 and to generate output information.
Each such program may be implemented in a high-level procedural or object-oriented programming language to communicate with a computer system. However, the programs can be implemented in assembly or machine language. The language may be a compiled or an interpreted language.
Each computer program may be stored on a storage medium or device (e.g., CD-ROM, hard disk, or magnetic diskette) that is readable by a general or special purpose programmable computer for configuring and operating the computer when the storage medium or device is read by the computer to perform processes 14 and 61. Processes 14 and 61 may be implemented as articles of manufacture, such as a machine-readable storage medium, configured with a computer program, where, upon execution, instructions in the computer program cause the machine to operate in accordance with processes 14 and 61.
The invention is not limited to the embodiments described above. For example, elements of processes 14 and 61 may be combined to form a new process not specifically described herein. The blocks shown in
Other embodiments not described herein are also within the scope of the following claims.
Number | Date | Country
---|---|---
20030071822 A1 | Apr 2003 | US