The present invention relates to new and improved ways for carrying out the path tracing method of parallel graphics rendering.
Path tracing is a computer graphics method for realistic rendering of three-dimensional scenes, based on global illumination. Global illumination takes into account not only the light that comes directly from a light source, but also subsequent cases in which light rays from the same source are reflected by other surfaces in the scene, whether reflective or not (indirect illumination).
Fundamentally, global illumination integrates over all the luminance arriving at a single point on the surface of a rendered object. This luminance is then reduced by a surface reflectance function (BRDF) to determine how much of it travels toward the viewpoint camera. This integration procedure is repeated for every pixel in the output image. When combined with physically accurate models of surfaces, accurate models of real light sources, and optically correct cameras, path tracing can produce still images that are indistinguishable from photographs.
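Purely for illustration, the following Python sketch shows the standard Monte Carlo form of this per-point integration, under the assumption of a Lambertian BRDF and with the scene lookup stubbed out as a hypothetical incoming_radiance function; it is a generic estimator of the rendering equation, not the particular method disclosed herein.

```python
import math
import random

def sample_hemisphere():
    # Uniform sample on the upper hemisphere, in a local frame whose +z axis
    # is the surface normal at the shaded point.
    u1, u2 = random.random(), random.random()
    z = u1
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), z)

def incoming_radiance(point, direction):
    # Stub: a real path tracer would trace 'direction' into the scene and
    # return the radiance arriving from whatever it hits.
    return 1.0

def reflected_radiance(point, albedo=0.7, samples=64):
    # Monte Carlo estimate of the hemispherical integral described above:
    #   L_o = integral over the hemisphere of f_r * L_i * cos(theta) dw,
    # with a Lambertian BRDF f_r = albedo / pi and a uniform-hemisphere
    # pdf of 1 / (2 * pi).
    brdf = albedo / math.pi
    pdf = 1.0 / (2.0 * math.pi)
    total = 0.0
    for _ in range(samples):
        w = sample_hemisphere()
        cos_theta = w[2]  # dot(w, normal) in the local frame
        total += brdf * incoming_radiance(point, w) * cos_theta / pdf
    return total / samples

print(reflected_radiance(point=(0.0, 0.0, 0.0)))
```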
Path tracing naturally simulates many effects that have to be specifically added to other methods (conventional ray tracing or scanline rendering), such as soft shadows, depth of field, motion blur, caustics, ambient occlusion, and indirect lighting.
Path tracing is a computationally intensive algorithm. The basic and most time-consuming task in path tracing is locating intersection points between millions of rays and millions of polygons. In the prior art this is done by massive traversals of acceleration structures and by resolving intersection tests. Traversals typically take 60%-70% of the rendering time. In addition, the need to modify or reconstruct acceleration structures before each dynamic frame limits performance.
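To make concrete why these traversals dominate, the sketch below walks a single ray through a toy bounding-volume hierarchy; the Node layout and slab-test box intersection are generic illustrations (not the prior-art structures themselves), but this node-by-node descent, plus the per-leaf intersection tests, is the work that every one of the millions of rays repeats.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Node:
    bbox_min: Vec3
    bbox_max: Vec3
    left: Optional["Node"] = None           # children of an internal node
    right: Optional["Node"] = None
    triangles: Optional[List[int]] = None   # leaf payload: indices into a triangle list

def ray_hits_box(origin: Vec3, inv_dir: Vec3, bmin: Vec3, bmax: Vec3) -> bool:
    # Standard slab test of a ray against an axis-aligned bounding box.
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):
        t1 = (bmin[axis] - origin[axis]) * inv_dir[axis]
        t2 = (bmax[axis] - origin[axis]) * inv_dir[axis]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(root: Node, origin: Vec3, direction: Vec3) -> List[int]:
    # Depth-first descent through the tree; the candidate triangles returned
    # here would still need explicit ray/triangle intersection tests.
    inv_dir = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
    candidates, stack = [], [root]
    while stack:
        node = stack.pop()
        if not ray_hits_box(origin, inv_dir, node.bbox_min, node.bbox_max):
            continue
        if node.triangles is not None:
            candidates.extend(node.triangles)
        else:
            stack.extend(c for c in (node.left, node.right) if c is not None)
    return candidates

# Tiny example: a root with two leaf children, pierced by one horizontal ray.
leaf_a = Node((0, 0, 0), (1, 1, 1), triangles=[0, 1])
leaf_b = Node((2, 0, 0), (3, 1, 1), triangles=[2])
root = Node((0, 0, 0), (3, 1, 1), left=leaf_a, right=leaf_b)
print(traverse(root, (-1.0, 0.5, 0.5), (1.0, 0.0, 0.0)))  # candidates from both leaves
```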
Fortunately, path tracing is quite easy to parallelize. The contribution of each ray to the final image can be computed independently of the other rays. There are two main parallelization approaches in the prior art: (i) ray-parallel, in which rays are distributed among parallel processors, with each processor tracing a ray all the way, and (ii) data-parallel, in which the scene is distributed among multiple processors, with a ray handled by multiple processors in succession.
The ray-parallel implementation, subdividing the image space into a number of disjoint regions, replicates all of the scene data at each processor. Each processor renders a number of screen regions using the unaltered sequential version of the path tracing algorithm, until the whole image is completed. Load balancing is achieved dynamically by sending new tasks to processors that have just become idle. However, if a large model needs to be rendered, the local memory of each processor is not large enough to hold the entire scene. This is evident from
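A minimal sketch of such ray-parallel, dynamically load-balanced rendering follows; the tile size, worker count, and the stubbed render_tile body are arbitrary assumptions used only to show how idle workers pull the next screen region from a shared pool.

```python
import multiprocessing as mp

def render_tile(tile):
    # Stub: a real worker would run the unaltered sequential path tracer over
    # this screen region against its full local copy of the scene.
    x0, y0, x1, y1 = tile
    return (tile, (x1 - x0) * (y1 - y0))  # (region, number of pixels rendered)

def ray_parallel_render(width, height, tile=64, workers=4):
    # Dynamic load balancing: idle workers keep pulling the next tile from a
    # shared pool, so uneven regions of the image even out automatically.
    tiles = [(x, y, min(x + tile, width), min(y + tile, height))
             for y in range(0, height, tile)
             for x in range(0, width, tile)]
    with mp.Pool(workers) as pool:
        return pool.map(render_tile, tiles, chunksize=1)  # one tile per idle worker

if __name__ == "__main__":
    print(len(ray_parallel_render(256, 256)), "tiles rendered")
```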
Data-parallel rendering is a different approach, best suited to large scenes that do not fit into a single processor's memory. Each processor owns a subset of the database and traces rays only when they pass through its own subspace (cell). As shown in
Accordingly, a primary object of the present invention is to provide a new and improved method of and apparatus for path tracing, while reducing the high complexity associated with the prior art.
Another object of the present invention is to provide a new and improved method of and apparatus for path tracing, while enabling an efficient rendering of big data.
Another object of the present invention is to provide a new and improved mechanism for locating intersection points between rays and objects for global illumination rendering.
Another object of the present invention is to provide a new and improved acceleration structure mechanism for data parallel path tracing, consisting of global and local components.
Another object of the present invention is to decrease the complexity of path tracing by reducing the traversals of acceleration structures.
Another object of the present invention is to provide a new and improved local acceleration structure.
Yet another object of the present invention is to replace the complex traversals of acceleration structures by a new and low complexity mechanism.
Yet another object of the present invention is to replace the complex traversals of acceleration structures by a new and low complexity mechanism implementable by the graphics pipeline.
These and other objects of the present invention will become apparent hereinafter and in the claims to invention.
The embodiments of the present invention follow the data-parallel approach; therefore the scene data are fragmented into numerous non-uniform sub-volumes, termed cells. A cell is the basic unit of processing and data locality.
According to one embodiment, the task of traversals is divided between the global acceleration structure and multiple small local acceleration structures. The local acceleration structures, along with the local geometry data and textures, reside in cells. Each cell is assigned a processor on a demand-driven basis. These rendering processors may reside on different platforms: CPUs, GPUs, or both. Each cell builds its own acceleration structure for the local portion of the data. This means that the global acceleration structure remains the only central element, while its size and load are greatly reduced. Each cell handles ray traversal for its local domain only, meaning that there is no need to retrieve data from external devices (central memory or hard disks), saving the heavy penalty of slow access times. The secondary rays (the term ‘secondary’ generally stands for secondary, ternary, and higher generations of HIPs and bouncing rays) are generated locally at each cell.
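Very schematically, this division of labor can be sketched as follows; the class names, method signatures, and stubbed traversal bodies below are illustrative stand-ins, not the disclosed implementation.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class Cell:
    cell_id: int
    local_geometry: List[object] = field(default_factory=list)
    local_as: object = None  # local acceleration structure built over local data only

    def trace_locally(self, ray) -> Optional[dict]:
        # Stub: traverse the small local acceleration structure; no external
        # memory is touched, since geometry and textures reside in the cell.
        return None  # None means the ray leaves this cell without a hit

class GlobalAccelerationStructure:
    # Coarse structure over cells only; it stays small because all
    # per-triangle work is delegated to the cells themselves.
    def __init__(self, cells: Dict[int, Cell]):
        self.cells = cells

    def cells_along(self, ray) -> List[int]:
        # Stub: return the ordered list of cell ids the ray passes through.
        return list(self.cells.keys())

def trace(ray, gas: GlobalAccelerationStructure):
    for cell_id in gas.cells_along(ray):
        hit = gas.cells[cell_id].trace_locally(ray)
        if hit is not None:
            return hit   # the ray terminated inside this cell
    return None          # the ray escaped the scene

cells = {i: Cell(i) for i in range(4)}
print(trace(ray={"origin": (0, 0, 0), "dir": (1, 0, 0)}, gas=GlobalAccelerationStructure(cells)))
```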
Another embodiment of the present invention replaces the local acceleration structures with a new and improved method and apparatus for locating ray/object intersections. It comprises a low-complexity collective shooting method in a cell, facilitated by the graphics pipeline. According to this method, the encounter between a ray and an object is found by projection, by ‘visualizing’ in a sense similar to human seeing, eliminating the need for expensive line/object mathematical intersections. The communication of rays among cells is still carried by the global acceleration structure. However, this communication is reduced, both because many cell-internal rays do not use the global acceleration structure at all, and because the traversal complexity is lowered by knowing the intersection coordinates in advance. This greatly reduces the number of traversals for secondary rays and offloads the global acceleration structure, which otherwise, due to its centrality, would become a bottleneck.
For a more complete understanding of how to practice the Objects of the Present Invention, the following Detailed Description of the Illustrative Embodiments can be read in conjunction with the accompanying Drawings, briefly described below:
According to an embodiment of the present invention, the task of traversals is divided between the global acceleration structure, and multiple small local acceleration structures, as depicted in
The basic data elements of the global acceleration structure,
As shown in
Each cell along the ray's transit handles its corresponding ray segment against its local data, meaning that there is no need to retrieve data from external devices (central memory or hard disks). This saves the penalty of slow access times. The secondary (and higher) rays are generated locally at each cell. Some of them terminate at the cell, hitting local objects; the rest are communicated to other cells utilizing the global acceleration structure.
When a cell receives rays, the rays are rearranged into coherent packets in order to gain increased performance, utilizing the architectural advantage of the platform. Today's CPU architecture provides extensive SIMD abilities (up to 512 bits: 16 floats in current architecture), which can be utilized to perform parallel traversal of acceleration trees. This method, known as packet traversal, provides superior traversal and intersection-test performance, as long as the rays in the packet are correlated. This applies even more strongly to GPU platforms, in which memory coherency is crucial. Natural coherence is present only in the primary rays, and only as long as their paths are short enough. Secondary rays must be brought to coherence in order to apply the packet method, which drastically improves the advantage.
According to an embodiment of the current invention, the rays that move between local and global structures are rearranged into packets during the transition phase, so that the packets entering the local acceleration structure comprise coherent rays.
Reordering is performed at a few levels. First, a packet is assembled with rays that are all targeted to the same cell. Further sorting can be done so that all the packet's rays enter the cell at the same face. A packet can be rearranged several times along its course, to keep the coherency. There is a cost in latency and in processing, but the advantage of using packets outweighs the cost.
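A hedged sketch of this multi-level reordering follows; the entry-face computation is stubbed, and the packet size of 16 rays is an assumption echoing the 16-float SIMD width mentioned above, not a prescribed value.

```python
from collections import defaultdict

def entry_face(ray, cell_bounds):
    # Stub: determine which of the six cell faces the ray enters through,
    # e.g. by intersecting the ray with the cell's bounding box.
    return "+x"

def pack_rays(rays, max_packet=16):
    # Level 1: group rays by the cell they are targeted to.
    # Level 2: within a cell, group by entry face, so the rays in a packet
    # start out coherently and SIMD/GPU packet traversal stays effective.
    buckets = defaultdict(list)
    for ray in rays:
        key = (ray["target_cell"], entry_face(ray, cell_bounds=None))
        buckets[key].append(ray)
    packets = []
    for (cell, face), group in buckets.items():
        for i in range(0, len(group), max_packet):
            packets.append({"cell": cell, "face": face, "rays": group[i:i + max_packet]})
    return packets

rays = [{"target_cell": c % 3, "origin": (0, 0, 0), "dir": (1, 0, 0)} for c in range(40)]
for packet in pack_rays(rays)[:3]:
    print(packet["cell"], packet["face"], len(packet["rays"]))
```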
As mentioned, GPU hardware is employed for the primary rays. In order to randomize the primary rays, such that the HIP samples (as described hereinafter in
As mentioned above, and as is shown in
The way the distributed acceleration structures, global (GAS) and local (LAS), work for path tracing is outlined in the flow charts of
The computed global illumination simulates the real world, where objects and surfaces are visible due to the fact that they reflect diffused light. This reflected light illuminates other objects in turn, by diffuse inter-reflection. Diffuse inter-reflection is a process whereby light reflected from an object strikes other objects in the surrounding area, illuminating them. Diffuse inter-reflection specifically describes light reflected from objects which are not shiny or specular. It is an important component of global illumination.
Generally, two types of rays are used in path tracing. The primary rays are shot from the viewer's eye via screen pixels into the scene. The primary ray hits an object at the primary hit point (HIP). Secondary rays are then generated, bouncing further into the scene and generating the next generations of HIPs.
The calculation of a diffuse radiance at a point of a primary hit on an object is depicted in
HIP 606 absorbs reflected radiance from its surroundings. Upon a hit of such a ray at some external hit point 605, the amount of reflected light from that hit point is reported to the pixel of origin. The actual ray shooting provides reflectance results, and generates additional rays in the scene space. Each of the hemi-rays 604 has a different probability, according to a probability distribution function.
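As one concrete example of such a probability distribution function (the disclosure does not mandate this particular one), the sketch below draws hemi-rays with a cosine-weighted distribution, each carrying its own pdf, in a local frame whose +z axis is the surface normal at the HIP.

```python
import math
import random

def cosine_weighted_direction():
    # Cosine-weighted sample on the local hemisphere (normal = +z):
    # directions near the normal are more probable, with pdf = cos(theta) / pi.
    u1, u2 = random.random(), random.random()
    r = math.sqrt(u1)
    phi = 2.0 * math.pi * u2
    x, y = r * math.cos(phi), r * math.sin(phi)
    z = math.sqrt(max(0.0, 1.0 - u1))
    return (x, y, z), z / math.pi  # (direction, pdf)

def spawn_hemi_rays(hip_position, count=8):
    # Generate the bundle of hemi-rays leaving a HIP; each carries its own pdf
    # so that the radiance it reports back can be weighted correctly.
    rays = []
    for _ in range(count):
        direction, pdf = cosine_weighted_direction()
        rays.append({"origin": hip_position, "dir": direction, "pdf": pdf})
    return rays

for ray in spawn_hemi_rays((1.0, 2.0, 3.0), count=4):
    print(ray["dir"], round(ray["pdf"], 3))
```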
Collective shooting of secondary rays. According to another embodiment of the present invention, the heavy traversals of local acceleration structures are replaced by a new and improved method of collective shooting in a cell, greatly reducing the processing complexity.
According to the embodiment, the encounter between a ray and an object is found by projection, using a visualization mechanism, in a sense similar to human seeing, eliminating the need for a line/object mathematical intersection. This method replaces the expensive traversals of the acceleration structure. As explained hereinafter, the visualization is done by means of the GPU graphics pipeline, resulting in high performance and low complexity.
In the following description it is assumed that the scene is subdivided into cells, with each cell processed autonomously. However, the collective shooting mechanism can also be used as a stand-alone version, when no cells are involved.
The collective shooting is mathematically articulated as follows:
Let T be a tree-graph of d levels and let V be its vertices on top of geometries G in space.
Define Vd as the vertices within V at level d.
Let Cd be a division of Vd into clusters.
We shall extend T to d+1 levels by finding Vd+1:
Choose cluster c∈Cd, with Vd
Instead of projecting every vertex v∈Vd
We optimize Cd/Lc in throughput/overfitting to have:
Lc is chosen to have a pseudo-random output, representing a possible segment of distribution for each v∈Vd
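One possible reading of this articulation is sketched below in Python; the clustering criterion, the choice of Lc, and the rasterization step are all stubbed or simplified here, so the sketch only conveys the idea of resolving a whole cluster of HIPs with one collective projection rather than one traversal per hemi-ray.

```python
import math
import random

def cluster_hips(hips, cluster_size=4):
    # Cd: divide the level-d vertices (the HIPs) into clusters.
    return [hips[i:i + cluster_size] for i in range(0, len(hips), cluster_size)]

def projection_direction(seed):
    # Lc: one pseudo-random parallel-projection direction per cluster, standing
    # in for a segment of each HIP's sampling distribution.
    rng = random.Random(seed)
    z = rng.random()
    phi = 2.0 * math.pi * rng.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def render_parallel_projection(geometry, direction):
    # Stub for the graphics-pipeline step: rasterize the cell geometry G once
    # under this parallel projection and return a depth/render target.
    return {"direction": direction, "texels": {}}

def extend_tree_one_level(hips, geometry):
    # Instead of traversing an acceleration structure for every hemi-ray,
    # each cluster is resolved by a single collective projection of G.
    next_level = []
    for seed, cluster in enumerate(cluster_hips(hips)):
        target = render_parallel_projection(geometry, projection_direction(seed))
        for hip in cluster:
            # Stub lookup: reading the render target at the HIP's (u, v) would
            # yield the next-level (Vd+1) vertex hit along the projection.
            next_level.append({"parent": hip, "hit": target["texels"].get(hip["uv"])})
    return next_level

hips = [{"id": i, "uv": (i, i)} for i in range(10)]
print(len(extend_tree_one_level(hips, geometry=[])))
```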
The input vertices V of the above articulation are illustrated in
As explained hereinafter, the projecting mechanism must treat the HIPs separately from the geometry data. This is depicted in
The local collective shooting of the present invention utilizes the Z-buffering mechanism of raster graphics. Each active HIP ‘looks’ forward along the direction of the parallel projection. The Z-buffering mechanism must therefore discard objects all the way before the HIP, and start seeking objects only at the HIP. This is described in
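A toy sketch of that modified depth test, reduced to scalar fragment depths along the projection direction, is given below; in an actual GPU pass this would be realized with the Z-buffer itself (for example via a per-HIP depth offset), and the epsilon is an assumed tolerance.

```python
def first_hit_behind_hip(hip_depth, fragment_depths, epsilon=1e-4):
    # Emulate the modified depth test described above: fragments closer to the
    # projection plane than the HIP itself are discarded, and the nearest
    # fragment at or beyond the HIP (plus a small tolerance) is kept.
    candidates = [d for d in fragment_depths if d > hip_depth + epsilon]
    return min(candidates) if candidates else None

# Fragments at depths 0.2 and 0.5 lie in front of the HIP (depth 0.7) and are
# ignored; the reported hit is the fragment at depth 0.9.
print(first_hit_behind_hip(0.7, [0.2, 0.5, 0.9, 1.3]))
```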
In
Accuracy: to what extent must the hemi-ray and the parallel-projection ray overlap? The lack of accuracy is demonstrated in
In
Multiple parallel projections at a cell, and their effect on a HIP 134, are shown in
The next generations of rays and HIPs may be generated and used in different ways. According to one embodiment, use is made of all of a HIP's hemi-rays. This is shown in
In the collective shooting within a cell, the communication of rays reaching out of the cell is still carried by the global acceleration structure. However, this communication is reduced due to two factors: many rays conclude their tracing internally in the cell, and the traversal is simplified by knowing in advance the coordinates of the external intersection, found in the course of the projection. These effects reduce the load on the global acceleration structure, as shown in the next two drawings.
The reduced traversal complexity involved with the use of the global acceleration structure is described by
Flowchart. The preferred embodiment of the present invention is flowcharted in
The product of rendering, the render target, is used to find the hit point for each HIP, by inspecting the render target texture at the correct u,v coordinates 1706. If a hit is found, then the accuracy is checked 1707, as explained hereinbefore. The projection cycle is completed when all HIPs are checked for hit and for accuracy.
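A minimal sketch of this lookup-and-accuracy step follows; the render target is modeled as a plain dictionary of texels, the angular tolerance is an assumed parameter, and the numerals 1706 and 1707 simply follow the step numbers used in the text.

```python
import math

def angle_between(a, b):
    # Angle, in radians, between two unit vectors.
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.acos(dot)

def resolve_hips(hips, render_target, projection_dir, max_angle=0.05):
    # For each HIP, inspect the render target at its (u, v) to get the candidate
    # hit point (1706), then accept it only if the shared projection direction
    # deviates from that HIP's own hemi-ray by less than the tolerance (1707).
    resolved = []
    for hip in hips:
        candidate = render_target.get(hip["uv"])       # texel lookup
        if candidate is None:
            resolved.append((hip["id"], None))         # no hit in this projection
            continue
        accurate = angle_between(hip["hemi_ray"], projection_dir) <= max_angle
        resolved.append((hip["id"], candidate if accurate else None))
    return resolved

render_target = {(3, 5): {"point": (1.0, 2.0, 0.5)}}
hips = [{"id": 0, "uv": (3, 5), "hemi_ray": (0.0, 0.0, 1.0)},
        {"id": 1, "uv": (7, 9), "hemi_ray": (0.0, 0.0, 1.0)}]
print(resolve_hips(hips, render_target, projection_dir=(0.0, 0.0, 1.0)))
```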
The preferred embodiment of a collective shooting in a cell is detailed in the flow chart of
The application is a continuation of U.S. patent application Ser. No. 17/327,690, filed on May 22, 2021; which is a continuation of the U.S. patent application Ser. No. 17/019,274, filed on Sep. 13, 2020, now issued as U.S. Pat. No. 11,017,583; which is a continuation of U.S. patent application Ser. No. 16/788,845, filed on Feb. 12, 2020, now issued as U.S. Pat. No. 10,818,072; which is a continuation of U.S. patent application Ser. No. 16/444,431, filed on Jun. 18, 2019, now issued as U.S. Pat. No. 10,614,614; which is a continuation of U.S. patent application Ser. No. 15/984,359, filed on May 20, 2018, now issued as U.S. Pat. No. 10,380,785; which is a continuation of U.S. patent application Ser. No. 15/376,580, filed Dec. 12, 2016; which claims the benefit of U.S. Provisional Application Ser. No. 62/266,584, filed on Dec. 12, 2015, U.S. Provisional Application Ser. No. 62/289,927, filed on Feb. 2, 2016, U.S. Provisional Application Ser. No. 62/354,755, filed on Jun. 26, 2016, and U.S. Provisional Application Ser. No. 62/408,730, filed on Oct. 15, 2016; and is also a continuation-in-part of the U.S. patent application Ser. No. 15/009,442, filed on Jan. 28, 2016, now issued as U.S. Pat. No. 9,741,160; which is a continuation-in-part of U.S. patent application Ser. No. 14/868,461, filed on Sep. 29, 2015, now issued as U.S. Pat. No. 9,558,530. All of the above applications are incorporated herein by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
5442733 | Kaufman et al. | Aug 1995 | A |
5907691 | Faget et al. | May 1999 | A |
6359618 | Heirich | Mar 2002 | B1 |
6512517 | Knittel et al. | Jan 2003 | B1 |
6621925 | Ohmori et al. | Sep 2003 | B1 |
6911984 | Sabella et al. | Jun 2005 | B2 |
6967663 | Bastos et al. | Nov 2005 | B1 |
7212207 | Green et al. | May 2007 | B2 |
7561163 | Johnson | Jul 2009 | B1 |
8169439 | Luick et al. | May 2012 | B2 |
9424685 | Howson et al. | Aug 2016 | B2 |
9483865 | Bakalash | Nov 2016 | B2 |
9558530 | Bakalash | Jan 2017 | B2 |
9576389 | Lee et al. | Feb 2017 | B2 |
9607426 | Peterson | Mar 2017 | B1 |
9672654 | Shin et al. | Jun 2017 | B2 |
9697640 | Obert et al. | Jul 2017 | B2 |
9741160 | Bakalash | Aug 2017 | B2 |
10204442 | Bakalash et al. | Feb 2019 | B2 |
10217268 | Bakalash et al. | Feb 2019 | B2 |
10229527 | Bakalash et al. | Mar 2019 | B2 |
10297068 | Bakalash et al. | May 2019 | B2 |
10332304 | Bakalash | Jun 2019 | B1 |
10380785 | Bakalash et al. | Aug 2019 | B2 |
10395415 | Bakalash | Aug 2019 | B2 |
10395416 | Bakalash | Aug 2019 | B2 |
10403027 | Bakalash et al. | Sep 2019 | B2 |
10410401 | Bakalash et al. | Sep 2019 | B1 |
10565776 | Bakalash | Feb 2020 | B2 |
10614612 | Bakalash et al. | Apr 2020 | B2 |
10614614 | Bakalash | Apr 2020 | B2 |
10699468 | Bakalash et al. | Jun 2020 | B2 |
10789759 | Bakalash et al. | Sep 2020 | B2 |
10818072 | Bakalash et al. | Oct 2020 | B2 |
10930053 | Bakalash et al. | Feb 2021 | B2 |
10950030 | Bakalash et al. | Mar 2021 | B2 |
11017582 | Bakalash et al. | May 2021 | B2 |
11017583 | Bakalash et al. | May 2021 | B2 |
11302058 | Bakalash et al. | Apr 2022 | B2 |
11508114 | Bakalash et al. | Nov 2022 | B2 |
20070257911 | Bavoil et al. | Nov 2007 | A1 |
20080088622 | Shearer | Apr 2008 | A1 |
20080100617 | Keller et al. | May 2008 | A1 |
20080122846 | Brown et al. | May 2008 | A1 |
20080231627 | Shearer | Sep 2008 | A1 |
20090073167 | Brown et al. | Mar 2009 | A1 |
20090073187 | Rampson et al. | Mar 2009 | A1 |
20090102844 | Deparis | Apr 2009 | A1 |
20090128552 | Fujiki et al. | May 2009 | A1 |
20090128562 | Mccombe et al. | May 2009 | A1 |
20090167763 | Waechter et al. | Jul 2009 | A1 |
20090213115 | Keller et al. | Aug 2009 | A1 |
20100053162 | Dammertz et al. | Mar 2010 | A1 |
20100060637 | Shearer | Mar 2010 | A1 |
20100079457 | Tavenrath | Apr 2010 | A1 |
20100164948 | Kho et al. | Jul 2010 | A1 |
20100239185 | Fowler et al. | Sep 2010 | A1 |
20110066682 | Aldunate et al. | Mar 2011 | A1 |
20120069023 | Hur et al. | Mar 2012 | A1 |
20120169728 | Mora | Jul 2012 | A1 |
20120213430 | Nutter et al. | Aug 2012 | A1 |
20130120385 | Krishnaswamy et al. | May 2013 | A1 |
20130190602 | Liao et al. | Jul 2013 | A1 |
20140049539 | Lee et al. | Feb 2014 | A1 |
20140078143 | Lee et al. | Mar 2014 | A1 |
20140375641 | Bakalash | Dec 2014 | A1 |
20140375645 | Bakalash | Dec 2014 | A1 |
20150042655 | Gautron et al. | Feb 2015 | A1 |
20150109302 | Lee et al. | Apr 2015 | A1 |
20150172641 | Nakamura et al. | Jun 2015 | A1 |
20160005209 | Dachsbacher | Jan 2016 | A1 |
20160019672 | Bakalash | Jan 2016 | A1 |
20160027204 | Lee et al. | Jan 2016 | A1 |
20160055608 | Frascati et al. | Feb 2016 | A1 |
20160155258 | Bakalash | Jun 2016 | A1 |
20160239994 | Chung et al. | Aug 2016 | A1 |
20160284118 | Howson et al. | Sep 2016 | A1 |
20160292908 | Obert | Oct 2016 | A1 |
20170061674 | Lee et al. | Mar 2017 | A1 |
20170103567 | Peterson | Apr 2017 | A1 |
20170109931 | Knorr et al. | Apr 2017 | A1 |
20170178398 | Afra et al. | Jun 2017 | A1 |
20170236247 | Akenine-Moller | Aug 2017 | A1 |
20170236325 | Lecocq et al. | Aug 2017 | A1 |
20170372508 | Schoeneman | Dec 2017 | A1 |
20180350128 | Bakalash et al. | Dec 2018 | A1 |
20180365880 | Bakalash et al. | Dec 2018 | A1 |
20180374255 | Bakalash et al. | Dec 2018 | A1 |
20180374256 | Bakalash et al. | Dec 2018 | A1 |
20180374257 | Bakalash et al. | Dec 2018 | A1 |
20190057544 | Lecocq et al. | Feb 2019 | A1 |
20190147637 | Bakalash et al. | May 2019 | A1 |
20190180496 | Bakalash | Jun 2019 | A1 |
20190180497 | Bakalash | Jun 2019 | A1 |
20190188897 | Bakalash | Jun 2019 | A1 |
20190304162 | Bakalash | Oct 2019 | A1 |
20190304163 | Bakalash | Oct 2019 | A1 |
20190378323 | Bakalash et al. | Dec 2019 | A1 |
20200058155 | Bakalash et al. | Feb 2020 | A1 |
20200143581 | Bakalash et al. | May 2020 | A1 |
20200175743 | Bakalash et al. | Jun 2020 | A1 |
20200219305 | Bakalash et al. | Jul 2020 | A1 |
20200320773 | Bakalash et al. | Oct 2020 | A1 |
20200380764 | Bakalash et al. | Dec 2020 | A1 |
20200410744 | Bakalash et al. | Dec 2020 | A1 |
20210166463 | Bakalash et al. | Jun 2021 | A1 |
20210201560 | Bakalash et al. | Jul 2021 | A1 |
20210279941 | Bakalash et al. | Sep 2021 | A1 |
20220198740 | Bakalash et al. | Jun 2022 | A1 |
Number | Date | Country |
---|---|---|
WO-2015192117 | Dec 2015 | WO |
Entry |
---|
“U.S. Appl. No. 14/479,322, Final Office Action mailed Jun. 4, 2015”, 19 pgs. |
“U.S. Appl. No. 14/479,322, Non Final Office Action mailed Jan. 22, 2015”, 26 pgs. |
“U.S. Appl. No. 14/479,322, Notice of Allowance mailed Aug. 29, 2016”, 8 pgs. |
“U.S. Appl. No. 14/479,322, Notice of Non-Compliant Amendment mailed Apr. 29, 2015”, 2 pgs. |
“U.S. Appl. No. 14/479,322, Notice of Non-Compliant Amendment mailed Jul. 29, 2016”, 3 pgs. |
“U.S. Appl. No. 14/479,322, Response filed Feb. 22, 2016 to Final Office Action mailed Jun. 4, 2015”, 9 pgs. |
“U.S. Appl. No. 14/479,322, Response filed Apr. 22, 2015 to Non Final Office Action mailed Jan. 22, 2015”, 10 pgs. |
“U.S. Appl. No. 14/479,322, Response filed May 21, 2015 to Notice of Non-Compliant Amendment mailed Apr. 29, 2015”, 10 pgs. |
“U.S. Appl. No. 14/479,322, Response filed Aug. 15, 2016 to Notice of Non-Compliant Amendment mailed Jul. 29, 2016”, 10 pgs. |
“U.S. Appl. No. 14/868,461, Final Office Action mailed Jun. 3, 2016”, 7 pgs. |
“U.S. Appl. No. 14/868,461, Non Final Office Action mailed Jan. 5, 2016”, 8 pgs. |
“U.S. Appl. No. 14/868,461, Notice of Allowance mailed Sep. 13, 2016”, 5 pgs. |
“U.S. Appl. No. 14/868,461, Response filed May 3, 2016 to Non Final Office Action mailed Jan. 5, 2016”, 1 pg. |
“U.S. Appl. No. 14/868,461, Response filed Aug. 29, 2016 to Final Office Action mailed Jun. 3, 2016”, 1 pg. |
“U.S. Appl. No. 15/009,442, Non Final Office Action mailed Mar. 21, 2017”, 9 pgs. |
“U.S. Appl. No. 15/009,442, Notice of Allowance mailed Jun. 29, 2017”, 5 pgs. |
“U.S. Appl. No. 15/009,442, Response filed Jun. 12, 2017 to Non Final Office Action mailed Mar. 21, 2017”, 4 pgs. |
“U.S. Appl. No. 15/615,037, Final Office Action mailed Jul. 11, 2018”, 13 pgs. |
“U.S. Appl. No. 15/615,037, Non Final Office Action mailed Sep. 27, 2017”, 15 pgs. |
“U.S. Appl. No. 15/615,037, Notice of Allowance mailed Jan. 15, 2019”, 5 pgs. |
“U.S. Appl. No. 15/615,037, Notice of Non-Compliant Amendment mailed Dec. 26, 2017”, 3 pgs. |
“U.S. Appl. No. 15/615,037, Response filed Jan. 7, 2019 to Final Office Action mailed Jul. 11, 2018”, 4 pgs. |
“U.S. Appl. No. 15/615,037, Response filed Feb. 21, 2018 to Notice of Non-Compliant Amendment mailed Dec. 26, 2017”, 5 pgs. |
“U.S. Appl. No. 15/615,037, Response filed Nov. 13, 2017 to Non Final Office Action mailed Sep. 27, 2017”, 5 pgs. |
“U.S. Appl. No. 15/615,037, Response filed Dec. 31, 2018 to Final Office Action mailed Jul. 11, 2018”, 4 pgs. |
“U.S. Appl. No. 15/640,464, 312 Amendment filed Dec. 10, 2018”, 3 pgs. |
“U.S. Appl. No. 15/640,464, Final Office Action mailed Dec. 27, 2017”, 14 pgs. |
“U.S. Appl. No. 15/640,464, Non Final Office Action mailed Mar. 30, 2018”, 11 pgs. |
“U.S. Appl. No. 15/640,464, Non Final Office Action mailed Sep. 27, 2017”, 17 pgs. |
“U.S. Appl. No. 15/640,464, Notice of Allowance mailed Oct. 26, 2018”, 5 pgs. |
“U.S. Appl. No. 15/640,464, PTO Response to Rule 312 Communication mailed Dec. 13, 2018”, 2 pgs. |
“U.S. Appl. No. 15/640,464, Response filed Mar. 12, 2018 to Final Office Action mailed Dec. 27, 2017”, 1 pg. |
“U.S. Appl. No. 15/640,464, Response filed Nov. 13, 2017 to Non Final Office Action mailed Sep. 27, 2017”, 6 pgs. |
“U.S. Appl. No. 15/640,466, Ex Parte Quayle Action mailed Oct. 15, 2018”, 4 pgs. |
“U.S. Appl. No. 15/640,466, Final Office Action mailed Apr. 3, 2018”, 9 pgs. |
“U.S. Appl. No. 15/640,466, Non Final Office Action mailed Oct. 5, 2017”, 11 pgs. |
“U.S. Appl. No. 15/640,466, Notice of Allowance mailed Jan. 28, 2019”, 5 pgs. |
“U.S. Appl. No. 15/640,466, Notice of Allowance mailed Dec. 26, 2018”, 7 pgs. |
“U.S. Appl. No. 15/640,466, Response filed Mar. 5, 2018 to Non Final Office Action mailed Oct. 5, 2017”, 4 pgs. |
“U.S. Appl. No. 15/640,466, Response filed Sep. 19, 2018 to Final Office Action mailed Apr. 3, 2018”, 5 pgs. |
“U.S. Appl. No. 15/640,466, Response filed Dec. 10, 2018 to Ex Parte Quayle Action mailed Oct. 15, 2018”, 6 pgs. |
“U.S. Appl. No. 15/640,467, 312 Amendment filed Dec. 10, 2018”, 3 pgs. |
“U.S. Appl. No. 15/640,467, Corrected Notice of Allowability mailed Oct. 22, 2018”, 4 pgs. |
“U.S. Appl. No. 15/640,467, Final Office Action mailed Apr. 3, 2018”, 9 pgs. |
“U.S. Appl. No. 15/640,467, Non Final Office Action mailed Oct. 4, 2017”, 11 pgs. |
“U.S. Appl. No. 15/640,467, Notice of Allowance mailed Oct. 3, 2018”, 7 pgs. |
“U.S. Appl. No. 15/640,467, PTO Response to Rule 312 Communication mailed Dec. 17, 2018”, 2 pgs. |
“U.S. Appl. No. 15/640,467, Response filed Mar. 5, 2018 to Non Final Office Action mailed Oct. 4, 2017”, 4 pgs. |
“U.S. Appl. No. 15/640,467, Response filed Sep. 23, 2018 to Final Office Action mailed Apr. 3, 2018”, 6 pgs. |
“U.S. Appl. No. 15/659,618, Non Final Office Action mailed Oct. 30, 2017”, 11 pgs. |
“U.S. Appl. No. 15/659,618, Notice of Allowance mailed Apr. 10, 2018”, 7 pgs. |
“U.S. Appl. No. 15/659,618, PTO Response to Rule 312 Communication mailed Jun. 19, 2018”, 2 pgs. |
“U.S. Appl. No. 15/659,618, Response filed Mar. 5, 2018 to Non Final Office Action mailed Oct. 30, 2017”, 8 pgs. |
“U.S. Appl. No. 15/984,359, Final Office Action mailed Jan. 2, 2019”, 5 pgs. |
“U.S. Appl. No. 15/984,359, Non Final Office Action mailed Sep. 24, 2018”, 10 pgs. |
“U.S. Appl. No. 15/984,359, Notice of Allowance mailed Apr. 1, 2019”, 5 pgs. |
“U.S. Appl. No. 15/984,359, Response filed Mar. 18, 2019 to Final Office Action mailed Jan. 2, 2019”, 6 pgs. |
“U.S. Appl. No. 15/984,359, Response filed Dec. 16, 2018 to Non Final Office Action mailed Sep. 24, 2018”, 10 pgs. |
“U.S. Appl. No. 16/231,520, Final Office Action mailed Apr. 10, 2019”, 6 pgs. |
“U.S. Appl. No. 16/231,520, Non Final Office Action mailed Jan. 30, 2019”, 7 pgs. |
“U.S. Appl. No. 16/231,520, Notice of Allowance mailed May 17, 2019”, 5 pgs. |
“U.S. Appl. No. 16/231,520, Notice of Allowance mailed Jul. 26, 2019”, 5 pgs. |
“U.S. Appl. No. 16/231,520, Response filed Mar. 27, 2019 to Non Final Office Action mailed Jan. 30, 2019”, 6 pgs. |
“U.S. Appl. No. 16/231,520, Response filed May 5, 2019 to Final Office Action mailed Apr. 10, 2019”, 6 pgs. |
“U.S. Appl. No. 16/275,366, Non Final Office Action mailed Apr. 1, 2019”, 8 pgs. |
“U.S. Appl. No. 16/275,366, Notice of Allowance mailed May 13, 2019”, 7 pgs. |
“U.S. Appl. No. 16/275,366, Response filed Apr. 21, 2019 to Non Final Office Action mailed Apr. 1, 2019”, 3 pgs. |
“U.S. Appl. No. 16/275,371, Non Final Office Action mailed Apr. 1, 2019”, 10 pgs. |
“U.S. Appl. No. 16/275,371, Notice of Allowance mailed May 10, 2019”, 7 pgs. |
“U.S. Appl. No. 16/275,371, Response filed Apr. 21, 2019 to Non Final Office Action mailed Apr. 1, 2019”, 3 pgs. |
“U.S. Appl. No. 16/275,907, Non Final Office Action mailed Apr. 10, 2019”, 6 pgs. |
“U.S. Appl. No. 16/275,907, Notice of Allowance mailed May 13, 2019”, 5 pgs. |
“U.S. Appl. No. 16/275,907, Response filed Apr. 21, 2019 to Non Final Office Action mailed Apr. 10, 2019”, 3 pgs. |
“U.S. Appl. No. 16/444,431, 312 Amendment filed Mar. 10, 2020”, 3 pgs. |
“U.S. Appl. No. 16/444,431, Non Final Office Action mailed Jul. 5, 2019”, 11 pgs. |
“U.S. Appl. No. 16/444,431, Notice of Allowance mailed Nov. 7, 2019”, 5 pgs. |
“U.S. Appl. No. 16/444,431, PTO Response to Rule 312 Communication mailed Mar. 15, 2020”, 2 pgs. |
“U.S. Appl. No. 16/444,431, Response filed Nov. 3, 2019 to Non Final Office Action mailed Jul. 5, 2019”, 9 pgs. |
“U.S. Appl. No. 16/444,464, Non Final Office Action mailed Jul. 9, 2019”, 14 pgs. |
“U.S. Appl. No. 16/444,464, Notice of Allowance mailed Oct. 11, 2019”, 7 pgs. |
“U.S. Appl. No. 16/444,464, Response filed Oct. 3, 2019 to Non Final Office Action mailed Jul. 9, 2019”, 5 pgs. |
“U.S. Appl. No. 16/662,657, Non Final Office Action mailed Dec. 19, 2019”, 6 pgs. |
“U.S. Appl. No. 16/662,657, Notice of Allowance mailed Mar. 23, 2020”, 5 pgs. |
“U.S. Appl. No. 16/662,657, Preliminary Amendment filed Nov. 15, 2019”, 3 pgs. |
“U.S. Appl. No. 16/662,657, Response filed Mar. 10, 2020 to Non Final Office Action mailed Dec. 19, 2019”, 8 pgs. |
“U.S. Appl. No. 16/736,848, 312 Amendment filed Aug. 4, 2020”, 3 pgs. |
“U.S. Appl. No. 16/736,848, Non Final Office Action mailed Feb. 11, 2020”, 16 pgs. |
“U.S. Appl. No. 16/736,848, Notice of Allowance mailed May 14, 2020”, 7 pgs. |
“U.S. Appl. No. 16/736,848, Response filed May 5, 2020 to Non Final Office Action mailed Feb. 11, 2020”, 3 pgs. |
“U.S. Appl. No. 16/788,845, 312 Amendment filed Aug. 16, 2020”, 3 pgs. |
“U.S. Appl. No. 16/788,845, Non Final Office Action mailed Mar. 6, 2020”, 4 pgs. |
“U.S. Appl. No. 16/788,845, Notice of Allowance mailed Jun. 15, 2020”, 5 pgs. |
“U.S. Appl. No. 16/788,845, PTO Response to Rule 312 Communication mailed Aug. 21, 2020”, 2 pgs. |
“U.S. Appl. No. 16/788,845, Response filed Jun. 2, 2020 to Non Final Office Action mailed Mar. 6, 2020”, 5 pgs. |
“U.S. Appl. No. 16/909,063, Non Final Office Action mailed Aug. 25, 2020”, 8 pgs. |
“U.S. Appl. No. 16/909,063, Notice of Allowance mailed Nov. 23, 2020”, 5 pgs. |
“U.S. Appl. No. 16/909,063, Response filed Nov. 8, 2020 to Non Final Office Action mailed Aug. 25, 2020”, 5 pgs. |
“U.S. Appl. No. 16/994,615, Non Final Office Action mailed Aug. 25, 2020”, 17 pgs. |
“U.S. Appl. No. 16/994,615, Notice of Allowance mailed Feb. 1, 2021”, 7 pgs. |
“U.S. Appl. No. 16/994,615, Response filed Nov. 8, 2020 to Non Final Office Action mailed Aug. 25, 2020”, 3 pgs. |
“U.S. Appl. No. 17/019,274, Non Final Office Action mailed Nov. 9, 2020”, 8 pgs. |
“U.S. Appl. No. 17/019,274, Notice of Allowance mailed Feb. 8, 2021”, 5 pgs. |
“U.S. Appl. No. 17/019,274, Response filed Jan. 28, 2021 to Non Final Office Action mailed Nov. 9, 2020”, 5 pgs. |
“U.S. Appl. No. 17/183,395, Non Final Office Action mailed Jul. 28, 2021”, 8 pgs. |
“U.S. Appl. No. 17/183,395, Notice of Allowance mailed Dec. 20, 2021”, 5 pgs. |
“U.S. Appl. No. 17/183,395, Response filed Dec. 2, 2021 to Non Final Office Action mailed Jul. 28, 2021”, 7 pgs. |
“U.S. Appl. No. 17/327,690, Non Final Office Action mailed Apr. 5, 2022”, 4 pgs. |
“U.S. Appl. No. 17/327,690, Notice of Allowance mailed Jul. 18, 2022”, 7 pgs. |
“U.S. Appl. No. 17/327,690, Response filed Jul. 5, 2022 to Non Final Office Action mailed Apr. 5, 2022”, 7 pgs. |
Alwani, Rishi, “Microsoft and Nvidia Tech to Bring Photorealistic Games With Ray Tracing”, Gadgets 360, [Online] Retrieved from the Internet: <URL: https://gadgets.ndtv.com/laptops/news/microsoft-dxr-nvidia-rtx-ray-tracing-volta-gpu-metro-exodus-1826988>, (Mar. 21, 2018), 8 pgs. |
Beck, et al., “CPU-GPU hybrid real time ray tracing framework”, vol. 0 (1981), (2005), 1-8. |
Bikker, J, “Real-time ray tracing through the eyes of a game developer”, In: Proceedings of the 2007 IEEE Symposium on Interactive Ray Tracing, Washington, DC, USA, IEEE Computer Society, (2007), 10 pgs. |
Broecker, Markus, et al., “Adapting ray tracing to spatial augmented reality”, Mixed and Augmented Reality (ISMAR), 2013 IEEE International Symposium on. IEEE, (2013). |
Celarek, Adam, “Merging ray tracing and rasterization in mixed reality”, Vienna University of Technology, (2012). |
Chen, C. C, et al., “Use of hardware z-buffered rasterization to accelerate ray tracing”, In: Proceedings of the 2007 ACM symposium on Applied computing. SAC '07, New York, NY, USA, ACM, (2007), 1046-1050. |
Fredriksson, Johan, et al., “Generating Real Time Reflections by Ray Tracing Approximate Geometry”, Chalmers University of Technology, Gothenburg, Sweden, (2016), 46 pgs. |
Hanrahan, et al., “Ray tracing on programmable graphics hardware”, ACM Siggraph 2001 Proceedings, (2001), 703-712. |
Jensen, Henrik Wann, et al., “Photon maps in bidirectional Monte Carlo ray tracing of complex objects”, In Computers & Graphics, vol. 19, Issue 2, ISSN 0097-8493, (1995), 215-224. |
Kajiya, J. T, “The rendering equation”, In Proc. Siggraph, vol. 20, No. 4, (1986), 143-150. |
Kan, Peter, “Differential irradiance caching for fast high-quality light transport between virtual and real worlds”, 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). IEEE, (2013). |
Kan, Peter, et al., “High-quality reflections, refractions, and caustics in augmented reality and their contribution to visual coherence”, Mixed and Augmented Reality (ISMAR), 2012 IEEE International Symposium on. IEEE, (2012). |
Mcguire, et al., “Efficient GPU screen-space ray tracing”, Journal of Computer Graphics Techniques, vol. 3, No. 4, (2014), 13 pgs. |
Parker, et al., “OptiX: a general purpose ray tracing engine”, Journal ACM Transactions on Graphics—Proceedings of ACM Siggraph 2010, vol. 29 Issue 4, Article No., (Jul. 4, 2010). |
Peddie, Jon, “Peddie predicts we could have real time ray tracing on our PCs in less than 6 years”, TechWatch, [Online] Retrieved from the Internet: <URL: https://www.jonpeddie.com/press-releases/peddie-predicts-we-could-have-real-time-ray-tracing-on-our-pcs-in-less-than/>, (Mar. 27, 2018), 2 pgs. |
Sabino, Thales, et al., “A Hybrid GPU Rasterized and Ray Traced Rendering Pipeline for Real Time Rendering of Per Pixel Effects”, ICEC 2012, Univ. Federal Fluminense, Rio de Janeiro, Brazil, (2013), 14 pgs. |
Sadeghi, Iman, et al., “Coherent Path Tracing”, Jensen University of California, San Diego, (2009), 12 pgs. |
Santos, D, et al., “Real time ray tracing for augmented reality”, 14th Symposium on Virtual and Augmented Reality IEEE, (May 28-31, 2012). |
Wald, et al., “Interactive Rendering with Coherent Ray Tracing”, Computer Graphics Forum, vol. 20, (Sep. 2001), 153-165. |
Wald, et al., “Ray Tracing Animated Scenes using Coherent Grid Traversal”, ACM, (2006). |
Wald, et al., “Realtime Ray Tracing and its use for Interactive Global Illumination”, Proceedings of Eurographics, (2003). |
Wyman, Chris, “Voxelized shadow volumes”, Proceedings of the ACM Siggraph Symposium on High Performance Graphics, ACM, (2011), 34 pgs. |
“U.S. Appl. No. 17/654,714, Non Final Office Action mailed Dec. 15, 2023”, 15 pgs. |
“U.S. Appl. No. 17/654,714, Response filed Feb. 16, 2024 to Non Final Office Action mailed Dec. 15, 2023”, 8 pgs. |
“U.S. Appl. No. 17/654,714, Notice of Allowance mailed Mar. 4, 2024”, 7 pgs. |
“U.S. Appl. No. 16/004,348, Non Final Office Action mailed Oct. 30, 2018”, 17 pgs. |
“U.S. Appl. No. 16/004,348, Response filed Feb. 11, 2019 to Non Final Office Action mailed Oct. 30, 2018”, 6 pgs. |
“U.S. Appl. No. 16/004,348, Final Office Action mailed Apr. 1, 2019”, 17 pgs. |
“U.S. Appl. No. 16/004,348, Response filed May 27, 2019 to Final Office Action mailed Apr. 1, 2019”, 10 pgs. |
“U.S. Appl. No. 16/004,348, Advisory Action mailed Jun. 5, 2019”, 3 pgs. |
“U.S. Appl. No. 16/004,348, Response filed Jun. 23, 2019 to Advisory Action mailed Jun. 5, 2019”, 10 pgs. |
“U.S. Appl. No. 16/004,348, Non Final Office Action mailed Aug. 8, 2019”, 9 pgs. |
“U.S. Appl. No. 16/004,348, Response filed Oct. 31, 2019 to Non Final Office Action mailed Aug. 8, 2019”, 5 pgs. |
“U.S. Appl. No. 16/004,348, Notice of Allowance mailed Nov. 14, 2019”, 8 pgs. |
“U.S. Appl. No. 16/004,348, 312 Amendment filed Feb. 12, 2020”, 3 pgs. |
“U.S. Appl. No. 16/004,348, Supplemental Notice of Allowability mailed Mar. 9, 2020”, 4 pgs. |
“U.S. Appl. No. 16/788,792, Non Final Office Action mailed Apr. 3, 2020”, 12 pgs. |
“U.S. Appl. No. 16/788,792, Response filed Jun. 10, 2020 to Non Final Office Action mailed Apr. 3, 2020”, 6 pgs. |
“U.S. Appl. No. 16/788,792, Notice of Allowance mailed Oct. 22, 2020”, 8 pgs. |
“U.S. Appl. No. 17/175,644, Notice of Allowance mailed Jun. 20, 2022”, 9 pgs. |
“U.S. Appl. No. 17/175,644, Supplemental Notice of Allowability mailed Jul. 7, 2022”, 2 pgs. |
“U.S. Appl. No. 17/654,714, PTO Response to Rule 312 Communication mailed Jun. 17, 2024”, 2 pgs. |
Number | Date | Country | |
---|---|---|---|
20230062294 A1 | Mar 2023 | US |
Number | Date | Country | |
---|---|---|---|
62266584 | Dec 2015 | US | |
62289927 | Feb 2016 | US | |
62354755 | Jun 2016 | US | |
62408730 | Oct 2016 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17327690 | May 2021 | US |
Child | 18047546 | US | |
Parent | 17019274 | Sep 2020 | US |
Child | 17327690 | US | |
Parent | 16788845 | Feb 2020 | US |
Child | 17019274 | US | |
Parent | 16444431 | Jun 2019 | US |
Child | 16788845 | US | |
Parent | 15984359 | May 2018 | US |
Child | 16444431 | US | |
Parent | 15376580 | Dec 2016 | US |
Child | 15984359 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15009442 | Jan 2016 | US |
Child | 15376580 | US | |
Parent | 14868461 | Sep 2015 | US |
Child | 15009442 | US |