This application is the U.S. National Stage of PCT/EP2021/069507, filed Jul. 13, 2021, which in turn claims priority to French patent application number 2007515, filed Jul. 17, 2020. The contents of these applications are incorporated herein by reference in their entireties.
The object of the invention is a method for calculating the visibility of objects within a three-dimensional scene, also called a 3D scene.
The technical field of the invention is that of 3D (or 2D, for two dimensions) rendering and object visibility detection in applications using graphics renderers such as video games, virtual worlds, virtual reality, augmented reality, etc.
In this field of the art, graphics engines, such as Unity, Unreal and others, are used to produce images for display on a screen. These images are produced from a scene composed of objects. From a macroscopic point of view, that of an end user, these objects are buildings, vehicles, planets, everyday objects, and so on. From the point of view of the graphics renderer, they are a list of geometric shapes with coordinates in space, surface properties, rules of movement over time, etc.
More technically, the object of a 3D renderer is to draw in real time, on a rendering plane called the “viewport”, a plurality of objects composing a scene. These objects are defined in the (x, y, z) coordinate space.
For this, with reference to
A renderer uses several techniques in order to correctly project and draw the scene onto the rendering plane. These techniques are combined in a “render pipeline” also called a graphics pipeline. The graphics pipeline includes a notion of visibility in order to calculate for each pixel which object has to be drawn and render its colour.
It is therefore easy to understand that, from a given observation point, certain objects are more or less visible, and some may mask others.
In the rendering of a three-dimensional scene, there is sometimes a need to know the visibility of an element within this scene. Indeed, it may be hidden by another object, located in a shadow zone, partially visible, or behind a semi-transparent object.
This need is currently met inaccurately or partially.
One solution is to use the so-called z-buffer technique. This technique makes it possible to determine the visibility of an object at the time it is drawn, but other elements may cover it later. This technique therefore only allows the position and orientation of the object relative to an observer, also known as a camera, to be taken into account at the moment the object is rendered.
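The limitation described above can be illustrated with a minimal sketch of a z-buffer depth test (this code is illustrative only and is not part of the patented method): a pixel passes the test and appears "visible" at draw time, yet a closer object drawn afterwards can still cover it.

```python
# Minimal z-buffer depth test (illustrative sketch, not an engine API).
# A pixel is drawn only if it is closer than the depth already stored,
# so "visibility" is only known relative to objects drawn *before* it.

def draw_pixel(zbuffer, framebuffer, x, y, depth, colour):
    """Draw a pixel if it passes the depth test; return True if drawn."""
    if depth < zbuffer[y][x]:        # closer than the current occupant
        zbuffer[y][x] = depth
        framebuffer[y][x] = colour
        return True                  # visible *at this moment* only
    return False

W, H = 4, 4
zbuf = [[float("inf")] * W for _ in range(H)]
fbuf = [[None] * W for _ in range(H)]

drawn = draw_pixel(zbuf, fbuf, 1, 1, depth=5.0, colour="red")
# A later, closer object can still cover the pixel, invalidating the
# earlier positive depth test:
covered = draw_pixel(zbuf, fbuf, 1, 1, depth=2.0, colour="blue")
```

Here the "red" object passed its depth test, yet ends up hidden behind the "blue" one, which is exactly why the z-buffer alone cannot give a final visibility answer.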
Another solution is the use of “raycasts”. However, this solution does not allow the visibility of an object behind a semi-transparent object or an object in a shadow zone to be known.
Therefore, there is no solution allowing a calculation of the visibility behind transparent objects taking lighting of the scene and shadows into account.
Our invention meets this need by describing a very accurate technique for calculating the visibility of an object within a scene.
The invention relies on a second rendering in addition to the main rendering. The main rendering is that expected by the user of the application implementing the renderer.
For this, a second camera is added to the scene. This second camera follows the position and characteristics of the main camera at all times (position, scale, filter options, cut planes, etc.).
Unlike the main camera which is used to render the scene and display it to the player, the second camera performs a rendering only to verify the visibility of certain objects in the scene. The frequency of these renderings can be different from the main camera so as not to disrupt the performance of the application.
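The mirroring and reduced-frequency behaviour of the dedicated camera can be sketched as follows. All class and attribute names here are hypothetical, for illustration only; they do not correspond to any particular engine's API.

```python
# Hypothetical sketch of the "dedicated camera": a clone that copies the
# main camera's parameters every frame but only renders every N frames,
# so it does not disrupt the performance of the application.

class Camera:
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.scale = 1.0
        self.near_plane, self.far_plane = 0.1, 1000.0

class VisibilityCamera:
    def __init__(self, main, render_every_n_frames=4):
        self.main = main
        self.n = render_every_n_frames
        self.cam = Camera()

    def update(self, frame_index):
        # Follow the main camera's position and characteristics at all times.
        self.cam.position = self.main.position
        self.cam.scale = self.main.scale
        self.cam.near_plane = self.main.near_plane
        self.cam.far_plane = self.main.far_plane
        # Render less often than the main camera to preserve performance.
        if frame_index % self.n == 0:
            return self.render()
        return None

    def render(self):
        # Placeholder for the dedicated graphics pipeline described below.
        return "visibility-render"
```

The parameter copy happens every frame so the clone never drifts from the observer, while the modulo test throttles the actual rendering work.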
The graphics pipeline associated with this second camera is modified in order to be able to perform the visibility calculations according to the invention.
The invention provides a solution to the previously mentioned problems, by allowing all aspects of the scene to be taken into account.
One aspect of the invention relates to a method for calculating a visibility score within a three-dimensional scene, the scene being comprised of a plurality of objects, characterised in that the method includes the following steps implemented by a dedicated camera being a clone of a main camera, the dedicated camera being associated with a dedicated graphics pipeline:
In addition to the characteristics just discussed in the previous paragraph, the method according to one aspect of the invention may have one or more of the following additional characteristics, considered individually or in any technically possible combination:
The invention and its various applications will be better understood upon reading the following description and examining the accompanying figures.
The figures are presented for indicative purposes and in no way limit the invention.
In this description, the processing actions are performed by microprocessors that use memory. These microprocessors are either general purpose microprocessors also called CPUs or graphics microprocessors also called GPUs.
In practice, the invention is implemented by a computer, a game console, a smartphone or, more generally, any device capable of implementing a graphics renderer.
Unless otherwise specified, a same element appearing in different figures has a unique reference.
At the end of the transparent object rendering step 250, the dedicated pipeline has finished its rendering work and a rendered image is available.
In contrast to a standard raycast, with the invention each tested pixel has a value between 0 and 1. This is illustrated in [
The visibility of the object can therefore be accurately determined by aggregating the alpha measurements. A simple sum is sufficient: if the sum is 0, the object is fully visible; if the sum equals the number of tested zones, the object is completely invisible.
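The aggregation rule above can be sketched as follows (the helper name is illustrative, not from the patent): each tested zone yields an alpha in [0, 1], where 0 means nothing was drawn over the object at that zone and 1 means it is fully covered there.

```python
# Sketch of the alpha-aggregation rule: a simple sum over the tested
# zones classifies visibility, and intermediate sums give a graded score.

def visibility_from_alphas(alphas):
    total = sum(alphas)
    if total == 0:
        return "fully visible"
    if total == len(alphas):
        return "completely invisible"
    # Intermediate sums yield a graded visibility score in [0, 1],
    # where 1.0 would mean fully visible and 0.0 fully covered.
    return 1.0 - total / len(alphas)

print(visibility_from_alphas([0, 0, 0]))      # fully visible
print(visibility_from_alphas([1, 1, 1, 1]))   # completely invisible
print(visibility_from_alphas([0.5, 0.0]))     # graded score: 0.75
```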
It is also possible to use the RGB layers of the tested pixel to determine the colour deviation from the original colour. This allows the visibility score to be refined. To achieve this result, it is sufficient to keep the lighting steps in the dedicated rendering pipeline, to determine the colour deviation from the ambient lighting of the scene.
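One possible way to measure such a colour deviation is sketched below. The patent does not specify a distance metric; a normalised Euclidean distance in RGB space is assumed here purely for illustration, as is the blending formula.

```python
# Hypothetical refinement: compare the rendered RGB of a tested pixel
# with the object's original colour to measure how much lighting or
# occlusion altered it. Metric and weights are illustrative assumptions.
import math

def colour_deviation(rendered_rgb, original_rgb):
    """Return a deviation in [0, 1]; 0 means the colours match exactly."""
    d = math.dist(rendered_rgb, original_rgb)            # Euclidean distance
    return d / math.dist((0, 0, 0), (255, 255, 255))     # normalise by max

def refined_score(alpha, rendered_rgb, original_rgb, weight=0.5):
    # Blend the alpha-based coverage with the colour deviation.
    return (1 - weight) * alpha + weight * colour_deviation(
        rendered_rgb, original_rgb
    )
```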
In one preferred alternative, the visibility mark is integrated over a predetermined period of time. This period depends on the nature of the object. The object may, for example, include text or symbols; the period is then the time sufficient to allow the text or symbols to be read. In another alternative, this period is between 0.5 and 3 seconds.
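The time-integration alternative above can be sketched as a sliding window over timestamped scores. The class and the averaging choice are illustrative assumptions; the patent only states that the mark is integrated over a predetermined period.

```python
# Sketch of integrating the visibility mark over a predetermined period
# (e.g. long enough for on-screen text to be read). Names and the use of
# a simple mean over the window are illustrative, not from the patent.

class VisibilityIntegrator:
    def __init__(self, period_seconds=1.5):
        self.period = period_seconds
        self.samples = []   # (timestamp, score) pairs

    def add_sample(self, t, score):
        self.samples.append((t, score))
        # Drop samples older than the integration window.
        self.samples = [(ts, s) for ts, s in self.samples
                        if t - ts <= self.period]

    def integrated_score(self):
        if not self.samples:
            return 0.0
        return sum(s for _, s in self.samples) / len(self.samples)
```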
In the invention, the dedicated pipeline processes opaque objects first, then transparent objects. In a preferred alternative, the dedicated pipeline is the simplest one provided by the renderer that meets the previous condition. This limits processing resources consumed by the invention.
Number | Date | Country | Kind |
---|---|---|---|
2007515 | Jul 2020 | FR | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/EP2021/069507 | 7/13/2021 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2022/013242 | 1/20/2022 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
6646639 | Greene | Nov 2003 | B1 |
7483042 | Glen | Jan 2009 | B1 |
11132831 | Schmalstieg | Sep 2021 | B1 |
20040169671 | Aronson | Sep 2004 | A1 |
20110141112 | Hux | Jun 2011 | A1 |
20150062142 | Goel | Mar 2015 | A1 |
20170161863 | Baral | Jun 2017 | A1 |
20230196627 | Seiler | Jun 2023 | A1 |
Entry |
---|
International Search Report as issued in International Patent Application No. PCT/EP2021/069507, dated Oct. 27, 2021. |
Brüll, F., “Order-Independent Transparency Acceleration,” Bachelor thesis, Sep. 2018, Retrieved from the Internet: URL:https://www2.in.tu-clausthal.de/-cgstore/theses/ha_felix_brüll_2018.pdf, XP055794949. |
Anonymous: “Transparent Depth Shader (good for ghosts!)—Unity Forum,” Aug. 2012, Retrieved from the Internet: URL:https://forum.unity.com/threads/transparent-depth-shader-good-for-ghosts.149511/, [Retrieved on Apr. 13, 2021], XP055794950. |
Number | Date | Country | |
---|---|---|---|
20230281917 A1 | Sep 2023 | US |