The invention relates to the domain of image synthesis and more specifically to the domain of lighting simulation in a virtual environment comprising one or more participating media. The invention is also understood in the context of special effects for a live composition.
According to the prior art, different methods exist for estimating the quantity of light received and scattered in participating media such as, for example, fog, smoke, dust or clouds. Participating media are media composed of particles in suspension that interact with light, notably modifying its trajectory and intensity.
Participating media can be broken down into two groups, namely homogeneous media, such as water, and heterogeneous media, such as smoke or clouds. In the case of homogeneous participating media, it is possible to calculate analytically the attenuation of the light transmitted by a light source. In fact, due to their homogeneous nature, these media have parameters such as the light absorption coefficient or the light scattering coefficient that are constant at any point of the medium. Conversely, the light absorption and scattering properties vary from one point to another in a heterogeneous participating medium. The calculations required to simulate the scattering of light in such heterogeneous media are very costly, and it is thus not possible to calculate analytically and interactively the quantity of light scattered by a heterogeneous participating medium. In addition, such media not being isotropic (that is to say, their scattering being anisotropic), the quantity of light scattered by the medium also varies according to the scattering direction of the light, that is to say the direction in which a person views the medium. The calculations estimating the quantity of light scattered must then be reiterated for each observation direction in order to obtain a realistic rendering of the medium.
To produce the live display of heterogeneous participating media, some methods pre-calculate some parameters representative of the heterogeneous participating media. Though these methods are perfectly adapted for studio use, in post-production for example, and provide a good quality display, they are not adapted to the context of live interactive design and composition of a heterogeneous participating medium. Such a method is described for example in the patent application WO2009/003143 filed by Microsoft Corporation and published on 31 Dec. 2008. The application WO2009/003143 is directed to the live display of a heterogeneous medium and describes a solution using radial basis functions. This solution cannot however be considered as a live display solution, as some pre-processing must be applied offline to the participating medium in order to calculate the projection coefficients representing the medium that will be used in the live image synthesis calculations.
With the emergence of interactive simulation games and applications, notably in three dimensions (3D), the need is being felt for live simulation methods offering a realistic display of heterogeneous participating media.
The purpose of the invention is to overcome at least one of these disadvantages of the prior art.
More specifically, the purpose of the invention is to optimize the required calculation time to compose a realistic live display of the light passing through one or more participating media.
The invention relates to a method for estimating the quantity of light received by at least an element of at least a participating medium, the at least an element belonging to a light ray crossing the at least a participating medium and having as origin a light source. The method comprises the steps of:
Advantageously, the pseudo-metric function is increasing inside the at least a participating medium and stationary outside the at least a participating medium.
According to a specific characteristic, the pseudo-metric function is determined from a first function that is equal to zero for each element of the light ray not belonging to the at least a participating medium and different from zero for elements of the light ray belonging to the at least a participating medium.
According to a specific characteristic, the first function is equal to 1 for the elements of the light ray belonging to the at least a participating medium.
Advantageously, the first function is a derivative of the pseudo-metric function.
According to a specific characteristic, the method comprises a step of estimating second projection coefficients in a function basis, the second projection coefficients being representative of the first function.
Advantageously, the pseudo-metric function is determined according to the second projection coefficients.
According to a particular characteristic, the first projection coefficients are estimated by projecting the extinction function into the function basis with respect to the pseudo-metric function.
According to another characteristic, the at least a participating medium is homogeneous or heterogeneous.
Advantageously, the first projection coefficients are stored in a projective texture.
According to a specific characteristic, the method comprises a step of estimating values representative of the reduction of light intensity for elements of the light ray from the first projection coefficients.
The invention also relates to a device configured for estimating the quantity of light received by at least an element of at least a participating medium, the at least an element belonging to a light ray crossing the at least a participating medium and having as origin a light source, the device comprising at least a processor configured for:
Advantageously, the at least a processor is a Graphics Processing Unit.
The invention also relates to a computer program product, which comprises instructions of program code for executing the steps of the above method, when the program is executed on a computer.
The invention will be better understood, and other specific features and advantages will emerge upon reading the following description, the description making reference to the annexed drawings wherein:
The virtual environment may be represented with a set of elements, an element corresponding for example to a point or a particle, a density value being associated with each element. A particle is advantageously assimilated to a sphere characterized by its centre and an influence radius. A particle groups together a set of points having the same or similar characteristics (for example a same density). When the environment is represented with points, a density value is associated with each point of the virtual environment; when it is represented with particles, a density value is associated with each particle.
Advantageously, the virtual environment is represented with a set of extinction coefficients associated with the elements forming the virtual environment.
$f(x) = \sum_{i=0}^{N} c_i\,b_i(x)$  equation 1

$c_i \approx \int f(x)\,b_i(x)\,dx$  equation 2

$c_i \approx \int \sigma_t(x)\,b_i(x)\,dx$  equation 3

$\sigma_t(x) = \sum_{i=0}^{N} c_i\,b_i(x)$  equation 4
The extinction function σt(x) along the light ray is thus represented with a set of first projection coefficients ci, which enable the value of the extinction coefficient associated with any element of the virtual environment along the light ray 10 to be computed. Representing the extinction function σt(x) in the function basis has the advantage of simplifying and speeding up the computations needed for estimating the light intensity attenuation inside the participating media, and also of reducing the memory footprint (as explained in more detail with regard to
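The projection of equations 1 to 4 can be illustrated numerically. The sketch below is an illustration only, not the claimed implementation: it projects a toy extinction function onto an orthonormal cosine basis on [0, 1] (one possible choice for the basis functions b_i; the toy function and all numeric values are assumptions) and reconstructs it from the first projection coefficients.

```python
import numpy as np

def basis(i, x):
    # Orthonormal cosine basis on [0, 1]: b_0 = 1, b_i = sqrt(2)*cos(pi*i*x) for i > 0.
    return np.ones_like(x) if i == 0 else np.sqrt(2.0) * np.cos(np.pi * i * x)

def project(f_values, x, n_coeffs):
    # c_i ~ integral of f(x) * b_i(x) dx, by a simple Riemann sum (equations 2 and 3).
    dx = x[1] - x[0]
    return np.array([np.sum(f_values * basis(i, x)) * dx for i in range(n_coeffs)])

def reconstruct(coeffs, x):
    # f(x) ~ sum_i c_i * b_i(x) (equations 1 and 4).
    return sum(c * basis(i, x) for i, c in enumerate(coeffs))

x = np.linspace(0.0, 1.0, 1024)
sigma_t = 0.5 + 0.4 * np.sin(2.0 * np.pi * x) ** 2   # toy heterogeneous extinction function
coeffs = project(sigma_t, x, 16)
approx = reconstruct(coeffs, x)
print(np.max(np.abs(approx - sigma_t)))  # small reconstruction error
```

With a smooth extinction function, a few tens of coefficients suffice for an accurate reconstruction; the oscillation problem discussed next appears when the projected function is discontinuous at the boundaries of the participating media.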
Naturally, the number of participating media is not limited to 3 but may be any number higher than or equal to 1. When equal to 1, the oscillation problem appears mainly before entering the participating medium and after going out of the participating medium.
To overcome the oscillation problem, the invention proposes to remap the distance x with a distance function d(x) so as to avoid the hollow spaces or gaps which appear along the light ray between the light source and the participating medium 11, between two participating media (respectively between 11 and 12 and between 12 and 13), and also after exiting the last participating medium 13 crossed by the light ray 10, for example on the segments [SK1], [L1K2], [L2K3] and [L3 . . . ]. Such a distance remapping is illustrated according to an advantageous embodiment with regard to
According to a variant, the pseudo-metric function is determined by integrating a square function v(x) which takes different values according to the distance x, the intersection points K1 111, L1 112, K2 121, L2 122, K3 131 and L3 132 between the light ray 10 and the participating media 11, 12, 13 being known. The square function v(x) 42 is equal to zero when x belongs to a segment of the light ray 10 outside the participating media 11, 12 and 13 (i.e. [SK1], [L1K2], [L2K3] and [L3 . . . ]) and is different from zero when x belongs to a segment of the light ray inside a participating medium 11, 12 or 13 (i.e. [K1L1], [K2L2] and [K3L3]). The square function v(x) 41 is represented with a dotted line on
$v(x) = \sum_{i=0}^{N} d_i\,b_i(x)$  equation 5
wherein $d_i$ is the $i$-th coefficient in the basis of functions $b_i$, defined with:

$d_i = \int v(x)\,b_i(x)\,dx$  equation 6
where v(x)=1 or any value different from 0 if x is within a participating medium and 0 otherwise. To obtain the pseudo-metric function d(x), v(x) is integrated, which gives:
$d(x) = \int_0^x \sum_i d_i\,b_i(s)\,ds$  equation 7

$d(x) = \sum_i d_i \int_0^x b_i(s)\,ds$  equation 8

$d(x) = \sum_i d_i\,\bigl(B_i(x) - B_i(0)\bigr)$  equation 9
where $B_i$ is a primitive of $b_i$ and $B_i(0)$ is its value at the level of the light source, i.e. at point S 100.
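As a hedged illustration of the pseudo-metric, the sketch below builds d(x) by direct numerical integration of the square function v(x), for three hypothetical media intervals along the ray (the entry/exit abscissae and the integration step are assumptions, not values from the invention); the basis-projection route of equations 5 to 9 is an equivalent, more compact representation.

```python
import numpy as np

# Hypothetical entry/exit abscissae (K, L) of three media along the ray.
media = [(1.0, 2.0), (3.0, 4.5), (6.0, 7.0)]

def v(x):
    # Square function: 1 inside a participating medium, 0 outside.
    return 1.0 if any(k <= x <= l for k, l in media) else 0.0

def pseudo_metric(x, step=1e-3):
    # d(x) = integral from 0 to x of v(s) ds: grows inside the media, stays flat outside.
    s = np.arange(0.0, x, step)
    return float(sum(v(si) for si in s) * step)

print(pseudo_metric(2.5))  # ~1.0 : one full medium crossed
print(pseudo_metric(5.0))  # ~2.5 : 1.0 + 1.5, gaps between media ignored
```

The stationarity of d(x) outside the media is exactly what removes the hollow spaces responsible for the oscillations.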
The function d(x) 420 obtained with the equation 9 and illustrated on
$c_i = \int \sigma_t(d(x))\,b_i(d(x))\,\mathrm{d}d(x)$  equation 10
From the first projection coefficients ci, the attenuation of the light intensity at a point M (noted AttL(M)) of a participating medium at a distance x from the light source, representing the quantity of incident light arriving at the point M after attenuation, is easily computed with:
$Att_L(M) = \exp\left(-\int_0^x \sigma_t(s)\,\mathrm{d}d(s)\right)$  equation 11
which gives:
$Att_L(M) = \exp\left[-\sum_i c_i\,\bigl(B_i(d(x)) - B_i(d(0))\bigr)\right]$  equation 12
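Equation 12 can be evaluated in closed form once the primitives B_i are known. The sketch below assumes the same cosine basis as in the earlier illustration (an assumed choice, not mandated by the invention) and checks the homogeneous case, where the attenuation must reduce to the classical exp(−σ_t·d).

```python
import numpy as np

def B(i, t):
    # Primitive of the cosine basis: B_0(t) = t, B_i(t) = sqrt(2)*sin(pi*i*t)/(pi*i) for i > 0.
    return t if i == 0 else np.sqrt(2.0) * np.sin(np.pi * i * t) / (np.pi * i)

def attenuation(coeffs, d_x, d_0=0.0):
    # Att_L(M) = exp(-sum_i c_i * (B_i(d(x)) - B_i(d(0)))), i.e. equation 12.
    return np.exp(-sum(c * (B(i, d_x) - B(i, d_0)) for i, c in enumerate(coeffs)))

# Sanity check in the homogeneous case: sigma_t = 0.5 everywhere gives c_0 = 0.5
# and zero for the other coefficients; over a remapped distance of 2 the
# attenuation must equal exp(-0.5 * 2) = exp(-1).
coeffs = [0.5, 0.0, 0.0]
print(attenuation(coeffs, 2.0))  # exp(-1) ≈ 0.3679
```

No numerical integration is needed at rendering time: only the coefficients and the analytic primitives are evaluated, which is what makes the estimation fast on a GPU.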
Naturally, the number of light rays is not limited to 1 but extends to any number higher than 1, for example 100, 1000 or 10000. The operation described with regard to
The participating media 11, 12 and 13 may be seen as a single participating medium when the extinction function is expressed with regard to the pseudo-metric function. If the participating media are all heterogeneous, the resulting single participating medium is also heterogeneous. If the participating media are all homogeneous, the resulting single participating medium may be homogeneous or heterogeneous: it is homogeneous if the density is the same in each and every participating medium 11, 12 and 13, and heterogeneous if the density varies from one participating medium to another.
$Q(M,\omega_{out}) = D(M)\cdot\sigma_s\cdot p(M,\omega_{out},\omega_{in})\cdot L_{ri}(M,\omega_{in})$  equation 13
The quantity of light scattered by a point M 22 of the medium attaining the eye of the spectator 21 situated at a point C of space in the direction ωout 20, that is to say the quantity of light scattered by the point M attenuated by the medium 11 on the trajectory M-P, the point P being situated at the intersection of the medium 11 and the direction ωout in the direction of the spectator 21, is then:

$L_P(M,\omega_{out}) = Q(M,\omega_{out})\cdot \exp\left(-\int_M^P \sigma_t(s)\,ds\right)$  equation 14

wherein:

$\exp\left(-\int_M^P \sigma_t(s)\,ds\right)$  equation 15

corresponds to the attenuation of the light scattered by the point M along the trajectory M-P.
Equation 14 enables the quantity of light scattered by a point M and attaining the eye of a spectator 21 situated on the direction ωout to be calculated. To calculate the quantity of light received by a spectator looking in the direction ωout, the sum of all the contributions of the set of points of the medium situated on the axis ωout must be calculated, that is to say the points situated on the segment P-Mmax, P and Mmax being the two intersection points between the medium 11 and the direction ωout 20. This total scattered light arriving at P 23 from the direction ωout 20 due to simple scattering is then:
$L(P,\omega_{out}) = \int_P^{M_{max}} L_P(M,\omega_{out})\,dM$  equation 16
In this case, it is considered that the light following the trajectory C-P is not attenuated.
This total scattered light is obtained by integration of the contributions from all the points situated between P and Mmax on a ray having ωout as direction. Such an integral equation cannot be resolved analytically in general, and even less so for a live estimation of the quantity of light scattered. The integral is evaluated numerically using the method known as ray-marching: the integration domain is discretized into a multitude of intervals of size δM and the following equation is obtained:
$L(P,\omega_{out}) \approx \sum_{P}^{M_{max}} L_P(M,\omega_{out})\,\delta M$  equation 17
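A minimal ray-marching sketch of equation 17 follows, with hypothetical constant source and extinction terms (all values are assumptions chosen so the result can be checked against the analytic integral):

```python
import math

def ray_march(quantity, sigma_t, p, m_max, n_samples=256):
    # Discretization of equation 16 into equation 17:
    # L(P) ~ sum_k Q(M_k) * transmittance(M_k -> P) * delta_M.
    delta = (m_max - p) / n_samples
    total, optical_depth = 0.0, 0.0
    for k in range(n_samples):
        m = p + (k + 0.5) * delta            # midpoint of the k-th interval
        optical_depth += sigma_t(m) * delta  # extinction accumulated along P-M
        total += quantity(m) * math.exp(-optical_depth) * delta
    return total

# Toy constant medium (hypothetical values): uniform source term Q = 1 and
# extinction sigma_t = 0.5, marched from P = 0 to Mmax = 4.
L = ray_march(quantity=lambda m: 1.0, sigma_t=lambda m: 0.5, p=0.0, m_max=4.0)
print(L)  # close to the analytic value (1 - exp(-2)) / 0.5 ≈ 1.729
```

Accumulating the optical depth while marching avoids recomputing the transmittance integral from scratch at every sample, which is the usual way this sum is evaluated in a shader.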
Advantageously, the heterogeneous participating medium 11 is a three-dimensional element, shown in two dimensions on
The device 6 comprises the following elements, connected to each other by a bus 65 of addresses and data that also transports a clock signal:
The device 6 also comprises a display device 63 of display screen type directly connected to the graphics card 62 to display notably synthesized images calculated and composed in the graphics card, for example live. The use of a dedicated bus to connect the display device 63 to the graphics card 62 offers the advantage of much greater data transmission bitrates and thus reduces the latency time for displaying images composed by the graphics card. According to a variant, the display device is external to the device 6. The device 6, for example the graphics card, comprises a connector adapted to transmit a display signal to an external display means such as, for example, an LCD or plasma screen or a video-projector.
It is noted that the word “register” used in the description of memories 62, 66 and 67 designates, in each of the memories mentioned, both a memory zone of low capacity (a few binary data) and a memory zone of large capacity (enabling a whole program to be stored, or all or part of the data representative of data calculated or to be displayed).
When switched on, the microprocessor 61 loads and executes the instructions of the program contained in the RAM 67.
The random access memory 67 notably comprises:
The algorithms implementing the steps of the method specific to the invention and described hereafter are stored in the memory GRAM 621 of the graphics card 62 associated with the device 6 implementing these steps. When switched on and once the parameters 670 representative of the environment are loaded into the RAM 67, the graphics processors 620 of the graphics card 62 load these parameters into the GRAM 621 and execute the instructions of these algorithms in the form of microprograms of “shader” type using, for example, the HLSL (High Level Shader Language) or GLSL (OpenGL Shading Language) language.
The random access memory GRAM 621 notably comprises:
According to a variant, a part of the RAM 67 is assigned by the CPU 61 for storage of the coefficients 6211 and 6212 and the values 6213 and 6214 if the memory storage space available in GRAM 621 is insufficient. This variant however causes greater latency time in the composition of an image comprising a representation of the virtual environment composed from the microprograms contained in the GPUs, as the data must be transmitted from the graphics card to the random access memory 67 via the bus 65, whose transmission capacities are generally inferior to those available in the graphics card for transmission of data from the GPUs to the GRAM and vice-versa.
According to another variant, the power supply 68 is external to the device 6.
During an initialization step 70, the different parameters of the device 6 are updated. In particular, the parameters representative of the participating media 11, 12 and/or 13 are initialized in any way.
Then, during a step 71, a pseudo-metric function is determined, the pseudo-metric function being representative of distance along the light ray, but only for the parts of the light ray crossing the participating media. The pseudo-metric function is advantageously determined based on the intersection points between the participating media and the light ray. The pseudo-metric function d(x) is representative of the distance travelled by the light inside the participating media 11, 12, 13. A value representative of distance is assigned to the elements of each of the participating media crossed by the light ray 10, an element of a participating medium corresponding to a point, a particle or a discretization sample according to the representation of the participating medium.
For an element, the value representative of distance is for example computed by estimating the distance travelled by the light along the light ray from the intersection point at which the light ray enters the participating medium. This value corresponds to the sum of the distances travelled by the light inside each and every participating medium crossed before reaching the considered element, without taking into account the distance travelled by the light along the light ray outside the participating media.
According to another example, the pseudo-metric function is determined by integrating a first function that is equal to zero for each element (i.e. each point or each particle or each sample according to the representation of the virtual environment comprising the participating medium/media) of the light ray 10 not belonging to the participating media and different from zero for elements of the light ray belonging to the participating medium. Said differently, the first function is a derivative of the pseudo-metric function. The first function is for example a square function (when the positive value different from zero associated with the elements of the light ray belonging to the participating medium is constant) or a positive function (when the positive value different from zero associated with the elements of the light ray belonging to the participating medium varies according to the elements). Advantageously, the value taken by the first function for elements of the light ray inside the participating medium/media is equal to 1.
According to a variant, and so as to simplify the computation involved in integrating the first function for determining the pseudo-metric function, the first function is expressed in a function basis (composed of a plurality of basis functions). To that aim, a plurality of second projection coefficients are estimated, the second projection coefficients being representative of the first function. The second projection coefficients are estimated by projecting the first function into the function basis along (or with respect to) the light ray. According to this variant, the pseudo-metric function is determined by using the second projection coefficients.
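This variant can be sketched as follows: the first function v is projected into a cosine basis (an assumed choice of basis) to obtain the second projection coefficients d_i, and the pseudo-metric is then evaluated from the primitives B_i as in equation 9. The ray parametrization on [0, 1] and the single medium on [0.25, 0.5] are hypothetical values for illustration.

```python
import numpy as np

N = 64                                         # number of second projection coefficients
x = np.linspace(0.0, 1.0, 2048)
step = x[1] - x[0]
v = ((x >= 0.25) & (x <= 0.5)).astype(float)   # first function: 1 inside the medium

def b(i, t):
    # Orthonormal cosine basis on [0, 1].
    return np.ones_like(t) if i == 0 else np.sqrt(2.0) * np.cos(np.pi * i * t)

def B(i, t):
    # Primitive of b_i, with B_i(0) = 0.
    return t if i == 0 else np.sqrt(2.0) * np.sin(np.pi * i * t) / (np.pi * i)

# Second projection coefficients d_i (equation 6), by a simple Riemann sum.
d_coeffs = [np.sum(v * b(i, x)) * step for i in range(N)]

def pseudo_metric(t):
    # d(x) = sum_i d_i * (B_i(x) - B_i(0)) (equation 9); here B_i(0) = 0.
    return sum(di * B(i, t) for i, di in enumerate(d_coeffs))

print(pseudo_metric(0.75))  # close to 0.25, the length of ray inside the medium
```

Once the d_i are known, evaluating d(x) anywhere costs only a weighted sum of analytic primitives, with no stored per-sample distance values.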
Then, during a step 72, first projection coefficients in a function basis are estimated, these first projection coefficients being representative of the extinction coefficients, the values of which vary in the participating medium/media (as the density values associated with the elements forming the participating media may vary). To reduce the memory footprint, a density value is used to weight a unique extinction coefficient so as to simulate the variations of extinction inside the medium, instead of varying the extinction coefficients themselves; in an RGB (Red, Green and Blue) representation of the scene, the extinction coefficient may normally vary according to each of the color components R, G and B. To that aim, the function σt(x) representative of the variations in extinction in the participating medium/media is projected with respect to the pseudo-metric function representative of the distance travelled inside the participating medium/media along the light ray, and represented in a functional space of basis functions, for example by using a Fourier transform or a discrete cosine transform.
Advantageously, the first projection coefficients are stored in a projective texture. A storage space of the projective texture is assigned for the storage of the first projection coefficients associated with the light ray. A plurality of sets of first projection coefficients are advantageously estimated for a plurality of light rays, for example so as to cover the entire virtual environment, one set of first projection coefficients being associated with one light ray (a pseudo-metric function being determined for each light ray). In this case, a storage space of the projective texture is assigned for the storage of each set of first projection coefficients of each light ray.
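For illustration only, a projective texture can be modelled on the CPU as a 2D array with one row per light ray, each row holding that ray's set of first projection coefficients (the number of rays, the number of coefficients and the stored values below are all hypothetical):

```python
import numpy as np

# A "projective texture" modelled as a 2D array: one texel row per light ray,
# each row storing the N first projection coefficients of that ray.
n_rays, n_coeffs = 8, 4
texture = np.zeros((n_rays, n_coeffs), dtype=np.float32)

def store(ray_index, coeffs):
    # Assign the storage space of the projective texture for this light ray.
    texture[ray_index] = coeffs

def fetch(ray_index):
    # Read back the set of first projection coefficients at shading time.
    return texture[ray_index]

store(3, [0.7, 0.0, -0.1, 0.02])
print(fetch(3))
```

On a GPU, the same layout would typically live in a floating-point texture so that a fragment shader can fetch a ray's coefficients in a single lookup.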
Then, during a step 73, the quantity of light received by an element belonging to the participating medium (or to one of the participating media when more than one participating medium is crossed by the light ray 10) is estimated according to the first projection coefficients associated with the part of the light ray crossing the at least one participating medium. This is advantageously achieved by estimating a value representative of the reduction of light intensity (along the light ray) from the first projection coefficients, as explained with regard to equations 11 and 12.
Steps 71 to 73 are advantageously repeated for a plurality of light rays so as to determine the quantity of light received by each and every element of the participating medium/media. According to a variant, the quantity of light received is estimated for only a part of the elements of the participating medium/media. According to this variant, the quality of the rendering of the participating medium/media will be lower but may be acceptable if the participating medium/media are far from the point of view from which a spectator looks at the rendered virtual environment.
Naturally, the invention is not limited to the embodiments previously described.
In particular, the invention is not limited to a method for estimating the quantity of light received by an element of a participating medium but also extends to any device implementing this method, and notably any device comprising at least one GPU. The implementation of the equations described with respect to
Advantageously, the basis functions used for the estimation of the projection coefficients are standard Fourier functions. According to a variant, the basis functions used are Legendre polynomials or Chebyshev polynomials.
For example, the method implemented in a device comprising a 3.6 GHz Xeon® microprocessor and an nVidia GeForce GTX580 graphics card enables a display of 40 images per second to be composed live for a heterogeneous participating medium of cloud type composed of 512³ elements. The use of the invention is not limited to live utilization but also extends to any other utilization, for example postproduction processing in a recording studio for the display of computer-generated images. Implementing the invention in postproduction offers the advantage of providing an excellent visual display in terms of realism, notably while reducing the required calculation time.
The invention also relates to a method for composition/rendering of a video image, in two dimensions or in three dimensions, for which the quantity of light received by a participating medium is calculated and the resulting information representative of the light is used for displaying the pixels of the image, each pixel corresponding to an observation direction ωout. The light value calculated for display by each of the pixels of the image is re-calculated to adapt to the different viewpoints of the spectator.
The implementations described herein may be implemented in, for example, a method or a process, an apparatus, a software program, a data stream, or a signal. Even if only discussed in the context of a single form of implementation (for example, discussed only as a method or a device), the implementation of features discussed may also be implemented in other forms (for example a program). An apparatus may be implemented in, for example, appropriate hardware, software, and firmware. The methods may be implemented in, for example, an apparatus such as, for example, a processor, which refers to processing devices in general, including, for example, a computer, a microprocessor, an integrated circuit, or a programmable logic device. Processors also include communication devices, such as, for example, computers, cell phones, portable/personal digital assistants (“PDAs”), and other devices that facilitate communication of information between end-users.
Implementations of the various processes and features described herein may be embodied in a variety of different equipment or applications, particularly, for example, equipment or applications associated with data encoding, data decoding, view generation, texture processing, and other processing of images and related texture information and/or depth information. Examples of such equipment include an encoder, a decoder, a post-processor processing output from a decoder, a pre-processor providing input to an encoder, a video coder, a video decoder, a video codec, a web server, a set-top box, a laptop, a personal computer, a cell phone, a PDA, and other communication devices. As should be clear, the equipment may be mobile and even installed in a mobile vehicle.
Additionally, the methods may be implemented by instructions being performed by a processor, and such instructions (and/or data values produced by an implementation) may be stored on a processor-readable medium such as, for example, an integrated circuit, a software carrier or other storage device such as, for example, a hard disk, a compact diskette (“CD”), an optical disc (such as, for example, a DVD, often referred to as a digital versatile disc or a digital video disc), a random access memory (“RAM”), or a read-only memory (“ROM”). The instructions may form an application program tangibly embodied on a processor-readable medium. Instructions may be, for example, in hardware, firmware, software, or a combination. Instructions may be found in, for example, an operating system, a separate application, or a combination of the two. A processor may be characterized, therefore, as, for example, both a device configured to carry out a process and a device that includes a processor-readable medium (such as a storage device) having instructions for carrying out a process. Further, a processor-readable medium may store, in addition to or in lieu of instructions, data values produced by an implementation.
As will be evident to one of skill in the art, implementations may produce a variety of signals formatted to carry information that may be, for example, stored or transmitted. The information may include, for example, instructions for performing a method, or data produced by one of the described implementations. For example, a signal may be formatted to carry as data the rules for writing or reading the syntax of a described embodiment, or to carry as data the actual syntax-values written by a described embodiment. Such a signal may be formatted, for example, as an electromagnetic wave (for example, using a radio frequency portion of spectrum) or as a baseband signal. The formatting may include, for example, encoding a data stream and modulating a carrier with the encoded data stream. The information that the signal carries may be, for example, analog or digital information. The signal may be transmitted over a variety of different wired or wireless links, as is known. The signal may be stored on a processor-readable medium.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made. For example, elements of different implementations may be combined, supplemented, modified, or removed to produce other implementations. Additionally, one of ordinary skill will understand that other structures and processes may be substituted for those disclosed and the resulting implementations will perform at least substantially the same function(s), in at least substantially the same way(s), to achieve at least substantially the same result(s) as the implementations disclosed. Accordingly, these and other implementations are contemplated by this application.
The present invention can be used in video game applications for example, whether via programs executable on a PC or portable-type computer or on specialized game consoles producing and displaying images live. The device 6 described with respect to
Number | Date | Country | Kind |
---|---|---|---|
1230502808 | Jan 2012 | EP | regional |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/EP2012/075804 | 12/17/2012 | WO | 00 |