Non-homogenous second order perspective texture mapping using linear interpolation parameters

Information

  • Patent Grant
  • 5835097
  • Patent Number
    5,835,097
  • Date Filed
    Monday, December 30, 1996
  • Date Issued
    Tuesday, November 10, 1998
Abstract
In a computer system having a host processor, a peripheral graphics device, a display screen, and a memory, a non-homogenous second order perspective texture mapping process. Polygon coordinates, a.sub.m,n, defining a texture polygon, are received. Initial values are received for a set of parameters including u.sub.main, v.sub.main, dv.sub.ortho, dv.sub.main, du.sub.ortho, du.sub.main, d.sup.2 u.sub.ortho, d.sup.2 u.sub.main, d.sup.2 v.sub.ortho, d.sup.2 v.sub.main, du.sub.ortho-ADD, and dv.sub.ortho-ADD. The texture polygon includes span regions. A span value, j, is set to an initial value to designate an initial span region of the texture polygon in (m, n) polygon coordinate space. The total number of rows, n.sub.max, is determined for the current span region of the polygon. The total number of polygon coordinates, m.sub.max, in the current row, n, of the texture polygon is determined. An (x, y) display coordinate, corresponding to the current polygon coordinate, a.sub.m,n, is set by translating from (m, n) polygon space to (x, y) display coordinate space. Texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), are determined, according to the present invention, for each polygon coordinate, a.sub.m,n, using linear interpolation based on a set of relationships that utilize the above parameters without using a repetitive divide operation. A display pixel of a color determined according to texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), is rendered on a display screen at the determined (x, y) position.
Description

FIELD OF THE INVENTION
The present invention relates to the field of computer implemented graphics display. Specifically, the present invention relates to a system and method for second order perspective texture mapping.
BACKGROUND OF THE INVENTION
Computer controlled graphics systems are used for displaying graphics objects on a display. These graphics objects are composed of graphics primitive elements ("graphics primitives") that include points, lines, polygons, etc. The graphics primitives can be used to render a two dimensional (2-D) image of a three dimensional (3-D) object on a display screen. Texture mapping refers to techniques for adding surface detail to areas or surfaces of these 3-D graphics objects displayed on a 2-D display. Since the original graphics object is 3-D, texture mapping often involves maintaining certain perspective attributes with respect to the surface detail added to the object. Generally, texture mapping occurs by accessing encoded surface detail points or "texels" from a memory storing the surface detail and transferring the surface detail texels to predetermined points of the graphics primitive to be texture mapped. The manner in which the texels are accessed is used to provide the perspective discussed above.
With reference to prior art FIGS. 1A and 1B, a texture map 102 and a display screen 104 are shown. The texture map 102 contains a texture image 103 to be mapped onto an area or surface of a graphics object 105 on the display screen 104. The texture map 102 includes point elements (texels) which reside in a (u, v) texture coordinate space. The texture image 103 is represented in computer memory as a bitmap or other raster-based encoded format. The display screen 104 includes point elements (pixels) which reside in an (x, y) display coordinate space. More specifically, texture mapping operates by applying color or visual attributes of texels of the (u, v) texture map 102 to corresponding pixels of the graphics object 105 on the display screen 104. In texture mapping, color values for pixels in (x, y) display coordinate space are determined based on sampled texture map values. After texture mapping, a version of the texture image 103 is visible on surfaces of the object 105.
Three types of texture mapping are described below: linear, second order homogeneous perspective, and second order non-homogeneous perspective. In linear texture mapping, texels of a texture map are generally mapped onto pixels of a 2-D or 3-D graphics object linearly, whereby the rate of sampling in texel space with respect to the screen coordinate update rate is constant, e.g., du/dx and du/dy are constant values. In perspective texture mapping, texels of a texture map are generally mapped onto pixels of a 3-D graphics object that is displayed in 2-D space (x, y), wherein the rate of sampling in texel space with respect to the screen coordinate update rate is not constant. Perspective texture mapping features an illusion of depth which is created by varying the sampling rate of the texture map 102 during the normal linearly performed polygon rendering process on the display screen 104. With reference to prior art FIG. 1A and FIG. 1B, the texture image 103 is mapped onto surfaces of a 2-D rendition of the 3-D graphics object 105 on the display screen 104.
With reference to prior art FIG. 2A, a linear texture sampling path 106 is shown in the (u, v) texture coordinate space that is traversed (e.g., "sampled") during texture map sampling. During linear texture map sampling, the texture image 103 is sampled according to path 106 simultaneously with a well known linear polygon rendering process. Path 106 can be represented by a linear equation of u and v. Each texel of the texture map 102 is defined according to (u, v) coordinates. The rates of change of u and v with respect to x and y (e.g., du/dx, du/dy, dv/dx, and dv/dy) of the linear sampling path 106 of FIG. 2A, are constant values for linear texture map sampling.
With reference to prior art FIG. 2B, a second order homogeneous perspective texture sampling path 108 is shown in (u, v) texture coordinate space. The rates of change of u and v with respect to x and y (e.g., du/dx, du/dy, dv/dx, and dv/dy) of the second order homogeneous perspective sampling path 108 are varying values. However, the rates of change of the rates of change of u and v with respect to x and y (e.g., d.sup.2 u/dx.sup.2, d.sup.2 u/dy.sup.2, d.sup.2 v/dx.sup.2, and d.sup.2 v/dy.sup.2) of the second order homogenous perspective sampling path 108 are constant and thus homogenous values. During homogenous second order texture map sampling, the texture map 102 is sampled according to path 108 during the polygon rendering process. Path 108 can be represented by a homogenous second order polynomial equation of u and v.
With reference to Prior Art FIG. 2C, a non-homogenous second order perspective sampling path 110 is shown in (u, v) texture coordinate space. The rates of change of u and v with respect to x and y (e.g., du/dx, du/dy, dv/dx, and dv/dy) along sampling path 110 are varying values. The rates of change of the rates of change of u and v with respect to x and y (e.g., d.sup.2 u/dx.sup.2, d.sup.2 u/dy.sup.2, d.sup.2 v/dx.sup.2, and d.sup.2 v/dy.sup.2) of the second order perspective sampling path 110 are also varying values and non-homogenous (e.g., the second order rate of change of u is defined by multiple functions of v). During non-homogenous second order texture map sampling, the texture map 102 is sampled according to path 110 during the polygon rendering process. Path 110 can be represented by a non-homogenous second order polynomial equation of u and v.
In typical prior art second order perspective texture mapping techniques, linear terms are generated and divided by perspective terms to obtain perspective texture map sample coordinates, T(u, v), for a given display coordinate in (x, y) display coordinate space. The coordinates (u, v) can then be used to obtain an attribute value from a texture map, T, according to T(u, v). The relationship below illustrates an exemplary second order perspective texture mapping relationship in which linear terms, Du and Dv, are divided by perspective terms, W(x, y, z), which represent depth, to obtain perspective texture map sample position rates of change, du and dv,
(du, dv)=(Du/W(x, y, z), Dv/W(x, y, z)).
From du and dv, the texture coordinates (u, v) are computed in the prior art.
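For concreteness, the following C sketch illustrates the per-pixel divide just described; the function name prior_art_sample and the example values are illustrative assumptions that follow the relationship above rather than any particular prior art implementation.

```c
#include <stdio.h>

/* Conventional perspective sampling: linear terms Du and Dv, interpolated in
 * screen space, are divided by a depth term W for every pixel rendered.     */
static void prior_art_sample(double Du, double Dv, double W,
                             double *du, double *dv)
{
    *du = Du / W;   /* one divide per texture coordinate, per pixel */
    *dv = Dv / W;
}

int main(void)
{
    double du, dv;
    prior_art_sample(12.0, 7.0, 2.5, &du, &dv);   /* example values only */
    printf("du=%f dv=%f\n", du, dv);
    return 0;
}
```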
A problem associated with the above described prior art second order perspective texture mapping technique is that it is costly to implement in terms of processor time and integrated circuit real estate due to the repetitive divide operation. Divide operations are computationally expensive. Thus a need exists for a second order perspective texture mapping apparatus which is not costly to implement in terms of processor time and integrated circuit real estate. What is needed further is an apparatus for second order perspective texture mapping that eliminates the repetitive division operation required by prior art texture mapping techniques.
Accordingly, the present invention provides such advantages. These and other advantages of the present invention not described above will become clear in view of the following detailed description of the present invention.





BRIEF DESCRIPTION OF THE DRAWINGS
Prior Art FIG. 1A represents a texture map.
Prior Art FIG. 1B represents a display screen.
Prior Art FIG. 2A is a prior art linear sampling path for sampling a texture map.
Prior Art FIG. 2B is a prior art homogenous 2nd order perspective sampling path for sampling a texture map.
Prior Art FIG. 2C is a prior art non-homogenous 2nd order perspective sampling path for sampling a texture map.
FIG. 3A is an exemplary host computer system for employing the computer implemented method of the present invention for second order perspective texture mapping using linear interpolation parameters.
FIG. 3B represents a computer readable volatile memory unit containing a bit mapped texture map stored therein in accordance with the present invention.
FIG. 4 is a texture polygon, comprised of polygon coordinates, a.sub.m,n, in (m, n) texture polygon coordinate space in accordance with the present invention.
FIG. 5 is a flow diagram for implementing steps of the method of the present invention for second order perspective texture mapping using linear interpolation parameters.





SUMMARY OF THE INVENTION
The process of the present invention approximates non-homogenous 2nd order perspective texture mapping and provides texture for a polygon without requiring a division operation during polygon rendering. Color values for pixels in (x, y) display coordinate space are determined based on texture map values generated by sampling and processing according to a non-homogenous 2nd order perspective texture mapping process of the present invention.
Polygon coordinates, a.sub.m,n, defining a texture polygon residing in (m, n) polygon coordinate space, are received. Each polygon coordinate, a.sub.m,n, is associated with a position in (m, n) polygon coordinate space. The subscript, m, of each polygon coordinate, a.sub.m,n, refers to the orthogonal position and the subscript, n, of each polygon coordinate, a.sub.m,n, refers to the vertical or main position of the polygon coordinate in (m, n) polygon coordinate space.
The non-homogenous 2nd order perspective texture mapping process of the present invention determines texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), for each of the polygon coordinates, a.sub.m,n. Texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), are used as indexes into a texture map to determine colors or visual attributes of display pixels rendered in (x, y) display coordinate space on a display screen. Within the present invention, determining values for texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), involves non-homogenous second order perspective texture map sampling of a texture map. The rate of change of u with respect to x, du/dx, is referred to in the present invention as du.sub.ortho. Similarly, the rate of change of u with respect to y, du/dy, is referred to in the present invention as du.sub.main. The rate of change of v with respect to x, dv/dx, is referred to as dv.sub.ortho. The rate of change of v with respect to y, dv/dy, is referred to as dv.sub.main. The rate of change of the rate of change of u with respect to x, d.sup.2 u/dx.sup.2, is referred to as d.sup.2 u.sub.ortho. The rate of change of the rate of change of u with respect to y, d.sup.2 u/dy.sup.2, is referred to as d.sup.2 u.sub.main. The rate of change of the rate of change of v with respect to x, d.sup.2 v/dx.sup.2, is referred to as d.sup.2 v.sub.ortho. The rate of change of the rate of change of v with respect to y, d.sup.2 v/dy.sup.2, is referred to as d.sup.2 v.sub.main. In order to compensate for the non-homogenous nature of the mapping (i.e., representing v as a non-uniform second order differential relationship of u), an ortho-add term is included for u and v which approximates the non-homogenous second order mapping.
In the process of the present invention, values are received for a set of parameters including u.sub.main, v.sub.main, dv.sub.ortho, dv.sub.main, du.sub.ortho, du.sub.main, d.sup.2 u.sub.ortho, d.sup.2 u.sub.main, d.sup.2 v.sub.ortho, d.sup.2 v.sub.main, du.sub.ortho-ADD, and dv.sub.ortho-ADD. The initial texture coordinates, u(a.sub.0,0) and v(a.sub.0,0), of the polygon to be displayed are set equal to u.sub.main and v.sub.main respectively. Also, registers for the values of m and n are set to initial values (e.g., m=0 and n=0).
The texture polygon includes an upper half region and a lower half region. A span value, j, is set to an initial value to designate, at the start, the upper half region of the texture polygon in (m, n) polygon coordinate space. The total number of orthogonal lines (or rows), n.sub.max, is determined for the current half region of the polygon. The total number of polygon coordinates, a.sub.m,n, in the current row, n, of the texture polygon is determined. An (x, y) display coordinate, corresponding to the current polygon coordinate, a.sub.m,n, is set by translating from (m, n) polygon space to (x, y) display coordinate space by well known techniques, since the polygon's position on the display screen is given.
Texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), are determined for the current polygon coordinate, a.sub.m,n, according to Relationships (2A) and (2B), below:
u(a.sub.m,n)=u(a.sub.m-1, n)+du.sub.ortho(n) +(m-1)d.sup.2 u.sub.ortho for m>0 (2A)
v(a.sub.m,n)=v(a.sub.m-1, n)+dv.sub.ortho(n) +(m-1)d.sup.2 v.sub.ortho for m>0; (2B)
wherein d.sup.2 u.sub.ortho and d.sup.2 v.sub.ortho are constant values and du.sub.ortho(n) and dv.sub.ortho(n) are determined according to Relationships (3A) and (3B), below:
du.sub.ortho(n) =n(du.sub.ortho-ADD)+du.sub.ortho for all n; (3A)
dv.sub.ortho(n) =n(dv.sub.ortho-ADD)+dv.sub.ortho for all n; (3B)
wherein du.sub.ortho-ADD and dv.sub.ortho-ADD are constant values used to compensate for the non-homogenous nature of the second order relationship.
Relationships (2A) and (2B), together with (3A) and (3B), are used to determine values for texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), for all polygon coordinate positions other than the first polygon coordinates, a.sub.0,n, of each row, n, of the texture polygon. Texture coordinates, u(a.sub.0,n) and v(a.sub.0,n), for the first polygon coordinate positions, a.sub.0,n, of each row, n, of the texture polygon are determined according to Relationships (5A) and (5B), described further below.
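To illustrate Relationships (2A) through (3B), the following C sketch (an illustration only, not part of the patented method) steps u and v across one row, n, of the polygon using only additions and multiplications; the function name step_row, the argument layout, and the use of double precision are assumptions of the sketch.

```c
/* Step texture coordinates across row n of the polygon per Relationships
 * (2A), (2B), (3A), and (3B).  u0 and v0 are u(a.sub.0,n) and v(a.sub.0,n);
 * the remaining arguments are the constant interpolation parameters.        */
void step_row(int n, int m_max,
              double u0, double v0,
              double du_ortho, double dv_ortho,
              double d2u_ortho, double d2v_ortho,
              double du_ortho_add, double dv_ortho_add,
              double u_out[], double v_out[])
{
    double du_ortho_n = n * du_ortho_add + du_ortho;    /* Relationship (3A) */
    double dv_ortho_n = n * dv_ortho_add + dv_ortho;    /* Relationship (3B) */
    double u = u0, v = v0;

    u_out[0] = u;
    v_out[0] = v;
    for (int m = 1; m < m_max; m++) {
        u += du_ortho_n + (m - 1) * d2u_ortho;          /* Relationship (2A) */
        v += dv_ortho_n + (m - 1) * d2v_ortho;          /* Relationship (2B) */
        u_out[m] = u;
        v_out[m] = v;
    }
}
```

As the loop body shows, each coordinate is advanced by additions and a single multiplication per step; no divide is required.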
Texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), are used to define a color for a display pixel in (x, y) display coordinate space. The color or visual attribute defined by texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), is accessed from a texture map and can be expressed by Relationship (4), below:
Color=T(u(a.sub.m,n), v(a.sub.m,n)) (4)
where T is the texture map in memory.
Having determined texture coordinates, u(a.sub.m,n) and v(a.sub.m,n) for the current polygon coordinate, a.sub.m,n, a display pixel is rendered on a display screen at the corresponding (x, y) display coordinate position set above. The color of the current display pixel, rendered on the display screen, is determined according to Relationship (4) above.
Subsequently, it is determined whether there are more polygon coordinates, a.sub.m,n, to be processed for the current row, n, of the texture polygon. If there are no more polygon coordinates, a.sub.m,n, to be processed for the current row, n, of the texture polygon, then the process of the present invention proceeds to process the next row, n, of polygon coordinates, a.sub.m,n, of the texture polygon by incrementing the vertical position, n.
Values are then determined for texture coordinates, u(a.sub.0,n) and v(a.sub.0,n), for each polygon coordinate, a.sub.0,n, which is the first polygon coordinate on each row, n, of the texture polygon. Texture coordinates, u(a.sub.0,n) and v(a.sub.0,n), are determined for each polygon coordinate, a.sub.0,n, according to Relationships (5A) and (5B), below:
u(a.sub.0,n)=u(a.sub.0, n-1)+du.sub.main +(n-1)d.sup.2 u.sub.main for m=0, n>0 (5A)
v(a.sub.0,n)=v(a.sub.0, n-1)+dv.sub.main +(n-1)d.sup.2 v.sub.main for m=0, n>0; (5B)
wherein du.sub.main, d.sup.2 u.sub.main, dv.sub.main, and d.sup.2 v.sub.main are constants.
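Relationships (5A) and (5B) admit the same divide-free stepping down the first coordinate of each row. The C sketch below is again only an illustration; the function name step_main_edge and its argument layout are assumptions, with names chosen to mirror the parameters defined above.

```c
/* Step the row-start texture coordinates down the major edge of the polygon
 * per Relationships (5A) and (5B).  Row 0 starts at u.sub.main, v.sub.main. */
void step_main_edge(int n_total,
                    double u_main, double v_main,
                    double du_main, double dv_main,
                    double d2u_main, double d2v_main,
                    double u0_out[], double v0_out[])
{
    double u = u_main, v = v_main;           /* u(a.sub.0,0), v(a.sub.0,0)   */

    u0_out[0] = u;
    v0_out[0] = v;
    for (int n = 1; n < n_total; n++) {
        u += du_main + (n - 1) * d2u_main;              /* Relationship (5A) */
        v += dv_main + (n - 1) * d2v_main;              /* Relationship (5B) */
        u0_out[n] = u;
        v0_out[n] = v;
    }
}
```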
Texture coordinates, u(a.sub.0,n) and v(a.sub.0,n), are used to define a color for a display pixel in (x, y) display coordinate space. The color or visual attribute defined by texture coordinates, u(a.sub.0,n) and v(a.sub.0,n), is accessed from a texture map, T, and can be expressed by Relationship (4), reprinted below:
Color=T[u(a.sub.0,n), v(a.sub.0,n)] (4).
Having determined texture coordinates, u(a.sub.0,n) and v(a.sub.0,n), for the current polygon coordinate, a.sub.0,n, a display pixel is rendered on a display screen at the corresponding (x, y) display coordinate position set above. The color of the current display pixel is determined according to Relationship (4) above.
The process of the present invention is implemented as instructions stored in a computer readable memory unit of a host computer system and can be executed over a host processor of the host computer system or over a display processor in a peripheral graphics device of a host computer system.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
In the following detailed description of the present invention, a second order perspective texture mapping process using linear interpolation parameters, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be obvious to one skilled in the art that the present invention may be practiced without these specific details or by using alternate elements or processes. In other instances, well known processes, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the present invention.
NOTATION AND NOMENCLATURE
Some portions of the detailed descriptions which follow are presented in terms of procedures, logic blocks, processing, and other symbolic representations of operations on data bits within a computer memory. These descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. A procedure, logic block, process, etc., is herein, and generally, conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these physical manipulations take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system. Herein, these signals are referred to as bits, values, elements, symbols, characters, terms, numbers, or the like with reference to the present invention.
It should be borne in mind, however, that all of these terms are to be interpreted as referencing physical manipulations and quantities and are merely convenient labels and are to be interpreted further in view of terms commonly used in the art. Unless specifically stated otherwise as apparent from the following discussions, it is understood that throughout discussions of the present invention, discussions utilizing terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data. The data is represented as physical (electronic) quantities within the computer system's registers and memories and is transformed into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
COMPUTER SYSTEM PLATFORM
With reference to FIG. 3A, a block diagram is shown of a host computer system 312 used by the preferred embodiment of the present invention. In general, host computer system 312 used by the preferred embodiment of the present invention comprises a bus 300 for communicating information, a host processor 301 coupled with the bus 300 for processing information and instructions, a computer readable volatile memory unit 302 (e.g. random access memory unit) coupled with the bus 300 for storing information and instructions for the host processor 301, a computer readable non-volatile memory unit 303 (e.g., read only memory unit) coupled with the bus 300 for storing static information and instructions for the host processor 301, a computer readable data storage device 304 such as a magnetic or optical disk and disk drive (e.g., hard drive or floppy diskette) coupled with the bus 300 for storing information and instructions, and a display device 305 coupled to the bus 300 for displaying information to the computer user. The display device 305 utilized with the computer system 312 of the present invention can be a liquid crystal device, cathode ray tube, or other display device suitable for creating graphic images and alphanumeric characters recognizable to the user.
The host system 312 provides data and control signals via bus 300 to a graphics hardware unit ("card") 309. The graphics hardware card 309 contains a display processor 310 which executes a series of display instructions found within a display list. The display processor 310 supplies data and control signals to a frame buffer which refreshes the display device 305 for rendering images (including graphics images) on display device 305.
With reference to FIG. 3B, a block diagram is shown of the computer readable volatile memory unit 302 containing a texture map 314 stored therein.
The present invention provides an approximation for non-homogenous 2nd order perspective texture mapping which applies color or visual attributes of texels of a (u, v) configured texture map to surfaces of 3-D graphical objects represented in 2-D (x, y) display coordinate space. Color values for display pixels in (x, y) display coordinate space are determined based on data sampled from the texture map 314 and processed according to a 2nd order perspective texture mapping process of the present invention. Texture map 314 can include bit mapped data representative of an image 316.
With reference to FIG. 4, an exemplary texture polygon 400 is shown in an (m, n) polygon coordinate space 402 which has an orthogonal position coordinate, m, and a main (or vertical) position coordinate, n. The texture polygon 400 is comprised of polygon coordinates, a.sub.m,n. The texture polygon 400 can be a triangle, as shown in FIG. 4, or any other polygon. The texture polygon 400 includes an upper half region, designated by j=0, and a lower half region, designated by j=1 (FIG. 4). The upper half region of the exemplary texture polygon 400 includes P1 orthogonal lines (or rows) and the lower half region of texture polygon 400 includes (P2-P1) rows. The major slope 410 of the triangle 400 spans from its upper vertex to its lower vertex; while shown vertical in FIG. 4, the major slope 410 can also be diagonal.
The present invention provides a 2nd order perspective texture mapping process which determines texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), for each of the polygon coordinates, a.sub.m,n. Texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), are used as indexes to access data in the texture map 314 of FIG. 3B. Texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), determine the color of display pixels rendered in corresponding (x, y) display coordinate space of the polygon 400 on the display device 305.
Determining values for texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), involves second order perspective texture map sampling of the texture map 314 of FIG. 3B. In the present invention, the rate of change of u with respect to x, du/dx, is referred to as du.sub.ortho. Similarly, the rate of change of u with respect to y, du/dy, is referred to as du.sub.main. The rate of change of v with respect to x, dv/dx, is referred to as dv.sub.ortho. The rate of change of v with respect to y, dv/dy, is referred to as dv.sub.main. The rate of change of the rate of change of u with respect to x, d.sup.2 u/dx.sup.2, is referred to as d.sup.2 u.sub.ortho. The rate of change of the rate of change of u with respect to y, d.sup.2 u/dy.sup.2, is referred to as d.sup.2 u.sub.main. The rate of change of the rate of change of v with respect to x, d.sup.2 v/dx.sup.2, is referred to as d.sup.2 v.sub.ortho. The rate of change of the rate of change of v with respect to y, d.sup.2 v/dy.sup.2, is referred to as d.sup.2 v.sub.main. Parameters, du.sub.ortho-ADD and dv.sub.ortho-ADD, are defined as offset values and are constants.
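For the sketches that follow, it is convenient to group these parameters into one structure. The C structure below is purely an illustrative assumption; its field names mirror the patent's notation, but the structure itself is not part of the described process.

```c
/* Illustrative grouping of the interpolation parameters received at the
 * start of the process.  Field names mirror the patent's notation.          */
typedef struct {
    double u_main, v_main;              /* initial u(a.sub.0,0), v(a.sub.0,0) */
    double du_ortho, dv_ortho;          /* du/dx, dv/dx                       */
    double du_main, dv_main;            /* du/dy, dv/dy                       */
    double d2u_ortho, d2v_ortho;        /* second derivatives along x         */
    double d2u_main, d2v_main;          /* second derivatives along y         */
    double du_ortho_add, dv_ortho_add;  /* non-homogenous correction terms    */
} TexParams;
```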
The pixels, a.sub.m,n, of polygon 400 are rendered on the display device 305 from the top orthogonal line (or row) downward and, for each row, from the left most pixel (e.g., a.sub.0,n) rightward to the far right pixel. As shown below, each pixel is processed separately for rendering on the display device 305. In view of the exemplary polygon 400 of FIG. 4, the pixels are processed in the following order: a.sub.0,0; a.sub.0,1; a.sub.1,1; a.sub.0,2; a.sub.1,2; a.sub.2,2; a.sub.0,3; a.sub.1,3; a.sub.2,3; a.sub.3,3; . . . a.sub.0,7; a.sub.1,7; and a.sub.0,8.
PROCESS OF THE PRESENT INVENTION
FIG. 5 shows a flow diagram for implementing a process 500 according to the method of the present invention for non-homogenous second order perspective texture mapping using linear interpolation parameters. Process 500 includes polygon rendering and maps images or patterns from the (u, v) texture map 314 to a polygon in (x, y) display coordinate space on the display device 305. Texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), are determined for each of the polygon coordinates, a.sub.m,n, of FIG. 4. Texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), are used to determine colors or visual attributes of display pixels rendered in (x, y) display coordinate space on the display device 305.
In operation, process 500 determines color values for pixels in (x, y) display coordinate space and then renders display pixels sequentially on the display device 305, horizontally from left to right and sequentially downward as orthogonal lines of pixels are fully rendered. It is appreciated, however, that the method of the present invention is also well suited to rendering pixels horizontally from right to left. The process 500 of the present invention is implemented as instructions stored in a computer readable memory unit of host computer system 312 and can be executed over the host processor 301 of FIG. 3A or over the display processor 310 of the peripheral graphics device 309 of FIG. 3A.
In step 510 of FIG. 5, values are received by process 500 for a set of parameters including u.sub.main, v.sub.main, dv.sub.ortho, dv.sub.main, du.sub.ortho, du.sub.main, d.sup.2 u.sub.ortho, d.sup.2 u.sub.main, d.sup.2 v.sub.ortho, d.sup.2 v.sub.main, du.sub.ortho-ADD, and dv.sub.ortho-ADD. The initial texture coordinates, u(a.sub.0,0) and v(a.sub.0,0), are set equal to u.sub.main and v.sub.main respectively. The above parameters are computed based on a process described in patent application Ser. No. 08/777,558, filed on Dec. 30, 1996, entitled Method for Computing Parameters Used in a Non-Homogeneous Second Order Perspective Texture Mapping Processing Using Interpolation, assigned to the assignee of the present invention, and having an attorney docket number of CRUS-096-0030. Also at step 510, the values of m and n are set to initial values (e.g., m=0 and n=0).
In step 515, a span value, j, is set to an initial value. In the preferred embodiment of the present invention, the upper half region of the texture polygon 400 of FIG. 4 is designated when span, j, is equal to 0 and the lower half region of the texture polygon 400 is designated when span, j, is equal to 1.
In step 520 of FIG. 5, it is determined whether j>1. The upper half region of the texture polygon 400 of FIG. 4 is processed while span, j, is equal to 0 and the lower half region of the texture polygon 400 is processed while span, j, is equal to 1. If j<=1, process 500 proceeds to step 525. If j>1, process 500 terminates.
In step 525 of FIG. 5, the total number of rows, n.sub.max, is determined for the half region of texture polygon 400 designated by the current span, j (FIG. 4). Provided that j=0, the number of orthogonal lines to be processed, in the upper half region of the texture polygon 400, is equal to P1. The value, n.sub.max, is equal to P1 at step 525. Provided that j=1, the number of orthogonal lines to be processed, in the lower half region of exemplary texture polygon 400, is equal to (P2-P1). This value, n.sub.max, is set to (P2-P1) at step 525 for this case.
Step 530 of FIG. 5 determines the number of polygon coordinates, a.sub.m,n, in the current row, n, of the texture polygon 400 (FIG. 4). This is the row width. For example, if n=0, then it is determined in step 530 that there is one polygon coordinate, a.sub.0,0, in the orthogonal line defined by n=0 (FIG. 4). If n=1, then it is determined in step 530 that there are two polygon coordinates, a.sub.0,1 and a.sub.1,1, in the orthogonal line defined by n=1 (FIG. 4).
At step 535, the current polygon position in (m, n) polygon coordinate space is translated into a corresponding display position in (x, y) display coordinate space. Once the position of polygon coordinate space (m, n) is known with respect to the display coordinate space position (x, y), step 535 can be implemented using well known translation techniques.
In step 540 of FIG. 5, texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), are determined for the current polygon coordinate, a.sub.m,n, according to Relationships 2A and 2B, below:
u(a.sub.m,n)=u(a.sub.m-1, n)+du.sub.ortho(n) +(m-1)d.sup.2 u.sub.ortho for m>0 (2A)
v(a.sub.m,n)=v(a.sub.m-1, n)+dv.sub.ortho(n) +(m-1)d.sup.2 v.sub.ortho for m>0; (2B)
wherein du.sub.ortho(n) and dv.sub.ortho(n) are determined according to relationships 3A and 3B, below:
du.sub.ortho(n) =n(du.sub.ortho-ADD)+du.sub.ortho for all n; (3A)
dv.sub.ortho(n) =n(dv.sub.ortho-ADD)+dv.sub.ortho for all n; (3B)
wherein du.sub.ortho-ADD, dv.sub.ortho-ADD, du.sub.ortho, and dv.sub.ortho are constant values. Step 540 of FIG. 5 is used to determine values for texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), for all polygon coordinate positions along the current line, n, other than the first polygon coordinates, a.sub.0,n, of each row, n, of the texture polygon 400 of FIG. 4. On the first pass through step 540 for a new row, n, m=0 and the texture coordinates already determined for a.sub.0,n (at step 510 or at step 560 below) are used directly. Texture coordinates, u(a.sub.0,n) and v(a.sub.0,n), for the first polygon coordinate positions, a.sub.0,n, of each row, n, of the texture polygon 400 of FIG. 4, are determined according to steps described below with respect to step 560.
At step 545, texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), are used to define a color for a display pixel in (x, y) display coordinate space. The color or texture defined by texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), can be expressed by Relationship (4), below:
Color=T[u(a.sub.m,n), v(a.sub.m,n)] (4)
wherein T[u, v] refers to the color or texture at the (u, v) texture coordinates of the texture map 314.
Step 545 of FIG. 5 obtains the color value, Color, and renders a display pixel on the display device 305 wherein the pixel color or texture is defined by texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), for the current polygon coordinate, a.sub.m,n. The display pixel rendered on the display device 305 by step 545 is positioned at the (x, y) display coordinate position, corresponding to the current polygon coordinate, a.sub.m,n, defined in step 535 above. The color of the display pixel rendered on display device 305 in step 545 is determined according to Relationship (4) above.
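The texture access of Relationship (4) can be pictured with the following C sketch, which assumes the texture map is stored as a row-major array of 32-bit color values of size tex_w by tex_h and uses a simple truncate-and-clamp addressing policy; none of these storage details are dictated by the patent.

```c
/* Illustrative texel fetch for Relationship (4): Color = T[u, v].  The
 * texture map T is assumed to be a row-major array of 32-bit colors.        */
unsigned int texel_fetch(const unsigned int *T, int tex_w, int tex_h,
                         double u, double v)
{
    int ui = (int)u;                                /* truncate to texel indexes */
    int vi = (int)v;
    if (ui < 0) ui = 0; else if (ui >= tex_w) ui = tex_w - 1;   /* clamp to map  */
    if (vi < 0) vi = 0; else if (vi >= tex_h) vi = tex_h - 1;
    return T[vi * tex_w + ui];
}
```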
Step 550 of FIG. 5 determines whether there are more polygon coordinates, a.sub.m,n, to be processed for the current row, n, of texture polygon 400 by comparing m to the current row width obtained at step 530. If there are more polygon coordinates, a.sub.m,n, for the current row, n, of the texture polygon 400, then process 500 proceeds to step 555, which increments the value of m by one. From step 555, process 500 proceeds back to step 535 to set an (x, y) display coordinate corresponding to the new current polygon coordinate, a.sub.m,n. At step 535, the polygon position in (m, n) polygon coordinate space is translated into a corresponding position in (x, y) display coordinate space. In this manner all pixels for m>0 of the given row, n, are rendered on the display device 305. At step 550, if the current value of m equals the width in m defined for the current row, then no more polygon coordinates, a.sub.m,n, remain in the current row, n, and process 500 proceeds to step 558.
In step 558, the value of n is incremented to access a new polygon row and process 500 proceeds to process the next row, n, of texture polygon coordinates, a.sub.m,n, of the texture polygon 400 of FIG. 4. From step 558, process 500 proceeds to step 560.
Step 560 of FIG. 5 determines values for texture coordinates, u(a.sub.0,n) and v(a.sub.0,n), for the current polygon coordinate, a.sub.0,n (except for a.sub.0,0 for which an initial value is assumed in step 510 above). Each of the polygon coordinates, a.sub.0,n, is the first polygon coordinate in the current row, n, of the texture polygon 400 (FIG. 4). Values for u(a.sub.0,n) and v(a.sub.0,n) are determined for the current polygon coordinate, a.sub.0,n, by implementing Relationships (5A) and (5B), below:
u(a.sub.0,n)=u(a.sub.0, n-1)+du.sub.main +(n-1)d.sup.2 u.sub.main for m=0, n>0 (5A)
v(a.sub.0,n)=v(a.sub.0, n-1)+dv.sub.main +(n-1)d.sup.2 v.sub.main for m=0, n>0; (5B)
wherein du.sub.main, d.sup.2 u.sub.main, dv.sub.main, and d.sup.2 v.sub.main are constants.
Step 565 determines whether there are more rows, n, of polygon coordinates, a.sub.m,n, to be processed for the current span, j, of the texture polygon 400 of FIG. 4. This is performed by comparing the current count, n, to the current value of n.sub.max. If n>n.sub.max, then there are no more rows in the current span to process. If rows remain, then process 500 proceeds back to step 530 and again through steps 530-565 to process the next line, n, of polygon coordinates, a.sub.m,n, in the texture polygon 400. If there are no more rows, n, of polygon coordinates, a.sub.m,n, to be processed for the current span, j, of the texture polygon 400, then process 500 proceeds to step 575 which increments the value of span, j, by 1. From step 575, process 500 proceeds back to step 520 which determines whether j>1. If j<=1, process 500 proceeds to step 525 to process polygon coordinates, a.sub.m,n, for the next span of texture polygon 400. If j>1, process 500 terminates.
By receiving the initial values and constants in step 510 and using the above relationships of step 540 and step 560, the present invention provides an effective polygon texturing process 500 that eliminates the processor intensive divide operation used by prior art systems. In lieu of the expensive divide, the present invention utilizes a linear interpolation technique based on the set of interpolation parameters received at step 510.
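Pulling the steps together, the sketch below walks both spans of a polygon in the order described for process 500, reusing the TexParams structure and the texel_fetch() helper sketched above. The Span bookkeeping, the per-row widths and starting positions, and the put_pixel() frame-buffer write are assumptions introduced so the sketch is complete; in particular, the (m, n) to (x, y) translation of step 535 is reduced here to a per-row starting x and y.

```c
/* Illustrative end-to-end rendering loop for process 500 (steps 515-575),
 * reusing the TexParams structure and texel_fetch() sketched above.  The
 * Span bookkeeping and put_pixel() are assumed helpers, not patent steps.   */
typedef struct {
    int        n_max;        /* rows in this span: P1, then (P2 - P1)        */
    const int *row_width;    /* number of polygon coordinates in each row    */
    const int *row_x0;       /* screen x of the first pixel of each row      */
    int        y0;           /* screen y of the first row of the span        */
} Span;

extern void put_pixel(int x, int y, unsigned int color);        /* step 545 write    */
extern unsigned int texel_fetch(const unsigned int *T, int tex_w, int tex_h,
                                double u, double v);            /* Relationship (4)  */

void render_textured_polygon(const TexParams *p, const Span span[2],
                             const unsigned int *T, int tex_w, int tex_h)
{
    double u0 = p->u_main, v0 = p->v_main;    /* step 510: u(a.sub.0,0), v(a.sub.0,0) */
    int n = 0;                                /* global row counter                   */

    for (int j = 0; j <= 1; j++) {                               /* steps 515, 520, 575 */
        for (int r = 0; r < span[j].n_max; r++) {                /* steps 525, 565      */
            int    m_max = span[j].row_width[r];                 /* step 530            */
            double du_o  = n * p->du_ortho_add + p->du_ortho;    /* Relationship (3A)   */
            double dv_o  = n * p->dv_ortho_add + p->dv_ortho;    /* Relationship (3B)   */
            double u = u0, v = v0;

            for (int m = 0; m < m_max; m++) {                    /* steps 535-555       */
                int x = span[j].row_x0[r] + m;                   /* step 535 (assumed)  */
                int y = span[j].y0 + r;
                put_pixel(x, y, texel_fetch(T, tex_w, tex_h, u, v)); /* steps 540-545   */
                u += du_o + m * p->d2u_ortho;                    /* Relationship (2A)   */
                v += dv_o + m * p->d2v_ortho;                    /* Relationship (2B)   */
            }
            u0 += p->du_main + n * p->d2u_main;                  /* step 560, (5A)      */
            v0 += p->dv_main + n * p->d2v_main;                  /* step 560, (5B)      */
            n++;                                                 /* step 558            */
        }
    }
}
```

As in the process described above, the inner loop contains no division: each texture coordinate is advanced by additions and a single multiplication per pixel.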
Claims
  • 1. In a computer system having a host processor, a bus coupled to said processor, a display screen coupled to said bus, and a memory coupled to said bus, a method of performing non-homogenous 2nd order perspective texture mapping onto a polygon having pixel coordinates arranged in (m,n) coordinate space, said method comprising the computer implemented steps of:
  • (a) receiving constant parameters u.sub.main, v.sub.main, dv.sub.ortho, dv.sub.main, du.sub.ortho, du.sub.main, d.sup.2 u.sub.ortho, d.sup.2 u.sub.main, d.sup.2 v.sub.ortho, d.sup.2 v.sub.main, du.sub.ortho-ADD, and dv.sub.ortho-ADD, wherein initial values for texture coordinates, u(a.sub.0,0) and v(a.sub.0,0), are equal to u.sub.main and v.sub.main respectively;
  • (b) determining values for texture coordinates, u(a.sub.0,n) and v(a.sub.0,n), which correspond to the first polygon coordinate on each row, n, of said texture polygon, by implementing,
  • u(a.sub.0,n)=u(a.sub.0, n-1)+du.sub.main +(n-1)d.sup.2 u.sub.main for m=0, n>0, and
  • v(a.sub.0,n)=v(a.sub.0, n-1)+dv.sub.main +(n-1)d.sup.2 v.sub.main for m=0, n>0; and
  • (c) determining values for texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), which correspond to each polygon coordinate, a.sub.m,n, for each polygon coordinate position, m, other than the first polygon coordinate position on each row, n, of said texture polygon, by implementing,
  • u(a.sub.m,n)=u(a.sub.m-1, n)+du.sub.ortho(n) +(m-1)d.sup.2 u.sub.ortho for m>0, and
  • v(a.sub.m,n)=v(a.sub.m-1, n)+dv.sub.ortho(n) +(m-1)d.sup.2 v.sub.ortho for m>0;
  • wherein du.sub.ortho(n) and dv.sub.ortho(n) are determined according to,
  • du.sub.ortho(n) =n(du.sub.ortho-ADD)+du.sub.ortho for all n, and
  • dv.sub.ortho(n) =n(dv.sub.ortho-ADD)+dv.sub.ortho for all n.
  • 2. A method as described in claim 1 further comprising the computer implemented steps of:
  • (d) accessing a texture value from a texture map stored in said memory for each set of texture coordinates, u(a.sub.m,n) and v(a.sub.m,n); and
  • (e) rendering each texture value obtained in step (d) on said display screen at a corresponding (x, y) position.
  • 3. A method as described in claim 1 further comprising the computer implemented steps of:
  • setting a span value, j, to designate a span region of said polygon;
  • determining the total number of rows, n, for said span region, j, of said polygon;
  • determining the number of polygon coordinates, a.sub.m,n, in each row, n, of said polygon;
  • translating each of said polygon coordinates, a.sub.m,n, into corresponding (x, y) display coordinates;
  • accessing a stored texture map by each set of texture coordinates, u(a.sub.m,n), v(a.sub.m,n), to obtain texture values; and
  • rendering each texture value on said display screen at corresponding (x, y) display coordinates.
  • 4. In a computer system having a host processor, a bus coupled to said processor, a display screen coupled to said bus, and a memory coupled to said bus, a method of performing 2nd order perspective texture mapping onto a polygon having pixel coordinates arranged in (m,n) coordinate space, said method comprising the computer implemented steps of:
  • (a) receiving constant parameters u.sub.main, v.sub.main, dv.sub.ortho, dv.sub.main, du.sub.ortho, du.sub.main, d.sup.2 u.sub.ortho, d.sup.2 u.sub.main, d.sup.2 v.sub.ortho, d.sup.2 v.sub.main, du.sub.ortho-ADD, and dv.sub.ortho-ADD, wherein initial values for texture coordinates, u(a.sub.0,0) and v(a.sub.0,0), are equal to u.sub.main and v.sub.main respectively;
  • (b) determining values for texture coordinates, u(a.sub.0,n) and v(a.sub.0,n), which correspond to the first polygon coordinate on each row, n, of said texture polygon, by implementing,
  • u(a.sub.0,n)=u(a.sub.0, n-1)+du.sub.main +(n-1)d.sup.2 u.sub.main for m=0, n>0, and
  • v(a.sub.0,n)=v(a.sub.0, n-1)+dv.sub.main +(n-1)d.sup.2 v.sub.main for m=0, n>0; and
  • (c) determining values for texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), which correspond to each polygon coordinate, a.sub.m,n, for each polygon coordinate position, m, other than the first polygon coordinate position on each row, n, of said texture polygon, by implementing,
  • u(a.sub.m,n)=u(a.sub.m-1, n)+du.sub.ortho(n) +(m-1)d.sup.2 u.sub.ortho for m>0, and
  • v(a.sub.m,n)=v(a.sub.m-1, n)+dv.sub.ortho(n) +(m-1)d.sup.2 v.sub.ortho for m>0.
  • 5. A method as described in claim 4 further comprising the steps of:
  • (d) accessing a texture value from a texture map stored in said memory for each set of texture coordinates, u(a.sub.m, n) and v(a.sub.m, n); and
  • (e) rendering each texture value obtained in step (d) on said display screen at a corresponding (x, y) position.
  • 6. A method as described in claim 4 further comprising the computer implemented steps of:
  • setting a span value, j, to designate a span region of said polygon;
  • determining the total number of rows, n, for said span region, j, of said polygon;
  • determining the number of polygon coordinates, a.sub.m,n, in each row, n, of said polygon;
  • translating each of said polygon coordinates, a.sub.m,n, into corresponding (x, y) display coordinates;
  • accessing a stored texture map by each set of texture coordinates, u(a.sub.m,n), v(a.sub.m,n), to obtain texture values; and
  • rendering each texture value on said display screen at corresponding (x, y) display coordinates.
  • 7. A computer system comprising a host processor coupled to a bus, physical memory coupled to said bus, and a display screen, said physical memory containing program code that when executed over said host processor causes said computer system to implement a method of performing non-homogenous 2nd order perspective texture mapping onto a polygon having pixel coordinates arranged in (m,n) coordinate space, said method comprising the computer implemented steps of:
  • (a) receiving constant parameters u.sub.main, v.sub.main, dv.sub.ortho, dv.sub.main, du.sub.ortho, du.sub.main, d.sup.2 u.sub.ortho, d.sup.2 u.sub.main, d.sup.2 v.sub.ortho, d.sup.2 v.sub.main, du.sub.ortho-ADD, and dv.sub.ortho-ADD, wherein initial values for texture coordinates, u(a.sub.0,0) and v(a.sub.0,0), are equal to u.sub.main and v.sub.main respectively;
  • (b) determining values for texture coordinates, u(a.sub.0,n) and v(a.sub.0,n), which correspond to the first polygon coordinate on each row, n, of said texture polygon, by implementing,
  • u(a.sub.0,n)=u(a.sub.0, n-1)+du.sub.main +(n-1)d.sup.2 u.sub.main for m=0, n>0, and
  • v(a.sub.0,n)=v(a.sub.0, n-1)+dv.sub.main +(n-1)d.sup.2 v.sub.main for m=0, n>0; and
  • (c) determining values for texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), which correspond to each polygon coordinate, a.sub.m,n, for each polygon coordinate position, m, other than the first polygon coordinate position on each row, n, of said texture polygon, by implementing,
  • u(a.sub.m,n)=u(a.sub.m-1, n)+du.sub.ortho(n) +(m-1)d.sup.2 u.sub.ortho for m>0, and
  • v(a.sub.m,n)=v(a.sub.m-1, n)+dv.sub.ortho(n) +(m-1)d.sup.2 v.sub.ortho for m>0;
  • wherein du.sub.ortho(n) and dv.sub.ortho(n) are determined according to,
  • du.sub.ortho(n) =n(du.sub.ortho-ADD)+du.sub.ortho for all n, and
  • dv.sub.ortho(n) =n(dv.sub.ortho-ADD)+dv.sub.ortho for all n.
  • 8. A computer system as described in claim 7 wherein said method further comprises the computer implemented steps of:
  • (d) accessing a texture value from a texture map stored in said memory for each set of texture coordinates, u(a.sub.m,n) and v(a.sub.m,n); and
  • (e) rendering each texture value obtained in step (d) on said display screen at a corresponding (x, y) position.
  • 9. A computer system as described in claim 7 wherein said method further comprises the computer implemented steps of:
  • setting a span value, j, to designate a span region of said polygon;
  • determining the total number of rows, n, for said span region, j, of said polygon;
  • determining the number of polygon coordinates, a.sub.m,n, in each row, n, of said polygon;
  • translating each of said polygon coordinates, a.sub.m,n, into corresponding (x, y) display coordinates;
  • accessing a stored texture map by each set of texture coordinates, u(a.sub.m,n), v(a.sub.m,n), to obtain texture values; and
  • rendering each texture value on said display screen at corresponding (x, y) display coordinates.
  • 10. In a computer system having a host processor, a bus coupled to said processor, a display screen coupled to said bus, and a memory coupled to said bus, a method for performing second order perspective texture mapping onto a polygon having pixel coordinates arranged in (m,n) coordinate space, said method comprising the computer implemented steps of:
  • (a) receiving a set of parameters including u.sub.main, v.sub.main, dv.sub.ortho, dv.sub.main, du.sub.ortho, du.sub.main, d.sup.2 u.sub.ortho, d.sup.2 u.sub.main, d.sup.2 v.sub.ortho, d.sup.2 v.sub.main, du.sub.ortho-ADD, and dv.sub.ortho-ADD, wherein initial values for texture coordinates, u(a.sub.0,0) and v(a.sub.0,0), are equal to u.sub.main and v.sub.main respectively;
  • (b) for each row, n>0, of said polygon, determining values for texture coordinates, u(a.sub.0,n) and v(a.sub.0,n), which correspond to the first polygon coordinate on each row, n, of said polygon, said step (b) performed according to the below procedures:
  • u(a.sub.0,n)=u(a.sub.0, n-1)+du.sub.main +(n-1)d.sup.2 u.sub.main,
  • v(a.sub.0,n)=v(a.sub.0, n-1)+dv.sub.main +(n-1)d.sup.2 v.sub.main ;
  • (c) for each row, n>0, determining values for texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), for each polygon coordinate, a.sub.m,n, m>0, of said texture polygon, said step (c) performed according to the below procedures:
  • u(a.sub.m,n)=u(a.sub.m-1, n)+du.sub.ortho(n) +(m-1)d.sup.2 u.sub.ortho,
  • v(a.sub.m,n)=v(a.sub.m-1, n)+dv.sub.ortho(n) +(m-1)d.sup.2 v.sub.ortho ; and
  • (d) determining values of du.sub.ortho(n) and dv.sub.ortho(n) for each orthogonal line.
  • 11. A method as described in claim 10 wherein said step (d) is performed according to the below procedures:
  • du.sub.ortho(n) =n(du.sub.ortho-ADD)+du.sub.ortho, and
  • dv.sub.ortho(n) =n(dv.sub.ortho-ADD)+dv.sub.ortho.
  • 12. A method as described in claim 11 and further comprising the steps of:
  • (e) accessing a texture value from a texture map stored in said memory for each set of texture coordinates, u(a.sub.m,n) and v(a.sub.m,n), said texture map containing an image for display on said display screen with second order perspective; and
  • (f) rendering each texture value obtained in step (e) on said display screen at an (x, y) screen display coordinate corresponding to an (m,n) polygon coordinate of said each texture value.
  • 13. A method as described in claim 11 further comprising the steps of:
  • setting a span value, j, to designate a span region of said texture polygon;
  • determining the total number of rows, n, for said span region of said polygon;
  • determining the number of polygon coordinates, a.sub.m,n, in each row, n, of said polygon;
  • translating each of said polygon coordinates, a.sub.m,n, into corresponding (x, y) display coordinates;
  • accessing a stored texture map by each set of texture coordinates u(a.sub.m,n) and v(a.sub.m,n) to obtain texture values;
  • rendering on said display screen each texture value obtained from said step of accessing at a corresponding (x, y) display coordinate.
US Referenced Citations (24)
Number Name Date Kind
4583185 Heartz et al. Apr 1986
4586038 Sims Apr 1986
4692880 Merz et al. Sep 1987
4714428 Bunker et al. Dec 1987
4715005 Heartz Dec 1987
4727365 Bunker et al. Feb 1988
4811245 Bunker et al. Mar 1989
4821212 Heartz Apr 1989
4825391 Merz Apr 1989
4855937 Heartz Aug 1989
4862388 Bunker Aug 1989
4905164 Chandler et al. Feb 1990
4958305 Piazza Sep 1990
4965745 Economy et al. Oct 1990
5126726 Howard et al. Jun 1992
5187754 Currin et al. Feb 1993
5191642 Quick et al. Mar 1993
5268996 Steiner et al. Dec 1993
5293467 Buchner et al. Mar 1994
5357579 Buchner et al. Oct 1994
5367615 Economy et al. Nov 1994
5420970 Steiner et al. May 1995
5745667 Kawase et al. Apr 1998
5751293 Hashimoto et al. May 1998
Foreign Referenced Citations (1)
Number Date Country
WO 9636011 Nov 1996 WO