For example, the developer may click on or otherwise select an (e.g., erroneously-rendered) pixel of a visual representation and/or select “pixel history” to thereby obtain a browsable window showing a temporal sequence of every event that affected the selected pixel, as well as every primitive that affected the event(s). This allows the developer to scroll through the browsable window and see the history of the selected pixel; for example, the selected pixel may start as the color red, then change to the color white, then change texture, and so on. Accordingly, the developer may determine information about each event that caused an effect on the selected pixel, and may thus determine errors that may have led to the observed rendering error associated with the selected pixel.
Thus, the graphics application 102, as referenced above, may generally be used to generate such visual representations, such as, for example, a video game or a simulation (e.g., a flight simulation). As also just referenced, the graphics application 102 may generate a call 104. The call 104 may, for example, include or specify a number of elements that instruct a graphics interface 108 as to how to issue commands for rendering the visual representation. In other words, the graphics interface 108 represents a set of tools or capabilities that are generic to a large number of graphics applications, and that allow such graphics applications to achieve a desired effect.
Thus, the graphics application 102 may interact with the graphics interface 108 using the call 104, which may contain, or reference, known (types of) elements or data. For example, the graphics application 102 may, in the call 104, reference or use elements from related asset data 109. That is, the asset data 109 may represent, for example, information that is used by the graphics application 102 for a particular visual simulation. For example, the type and amount of asset data 109 required by the graphics application 102 for a high-speed, 3-dimensional video game may be quite different from that required for a 2-dimensional rendering of a largely static image. Thus, the graphics interface 108 represents a set of tools that is generic to a large number of graphics applications, where each (group of) graphics application(s) (e.g., the graphics application 102) may access its own asset data 109.
In the following discussion, an indication that the call 104 uses or includes a particular element should be understood to include instances in which the call 104 references that element, e.g., from the asset data 109.
Thus, the call 104 is illustrated as using or including a primitive 110, in the sense just described. As referenced above, the primitive 110 may refer to a most-basic element (e.g., a line or triangle) used to construct a visual representation. As such, the call 104 may typically specify a large number of such primitives. Moreover, since the primitive 110 is at such a base level of representation, the primitive 110 often is combined with other primitives to form a standard object, which may itself be the element that is typically specified by the graphics application 102 in the call 104. As a result, it may be particularly difficult for the developer to ascertain information regarding the primitive 110 within the call 104, since the primitive 110 must be parsed from a large number of primitives, and from within the call 104, which is itself typically one of a large number of calls.
The call 104 also may include or reference, in the sense described above and for example, a depth value 112, a stencil value 114, and/or a color value 116. The depth value 112 may, for example, be used to identify depth coordinates in three-dimensional graphics. For example, an object (e.g., a car) in a visual representation may drive either in front of or behind another object (e.g., a building), depending on a depth value(s) associated with the objects and their respective pixels. The stencil value 114 may, for example, be used to limit a rendered area within the resulting visual representation. The color value 116 may, for example, indicate what color a given pixel should render in terms of a red, green, blue value (RGB value). It should be appreciated that the above examples of elements or features of the call 104, and other examples provided herein, are merely provided for the sake of illustration, and are not intended to be exhaustive or limiting, as many other examples exist.
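As one conceptual illustration (not drawn from any particular graphics interface), the depth, stencil, and color values just described might be represented per pixel roughly as in the following C++ sketch, in which the type and member names are hypothetical.

    #include <cstdint>

    // Hypothetical, simplified per-pixel values of the kind a call such as
    // the call 104 may specify or reference.
    struct RgbaColor {
        float r, g, b, a;     // red, green, blue, and alpha (opacity), each in [0.0, 1.0]
    };

    struct PixelValues {
        float        depth;   // depth coordinate used in three-dimensional graphics (e.g., 0.0 near, 1.0 far)
        std::uint8_t stencil; // stencil value used to limit a rendered area
        RgbaColor    color;   // RGB(A) color value indicating what color the pixel should render
    };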
Thus, the graphics application 102 may make the call 104, for example, to the graphics interface 108. The graphics interface 108, as referenced above, may describe a set of functions that may be implemented by the graphics application 102. The graphics interface 108 may, for example, include a graphics application programming interface such as Direct3D (D3D), or, as another example, may include a graphics interface such as OpenGL, or any graphics interface configured to receive calls from the graphics application 102.
The graphics interface 108 may thus use the call 104 to communicate with, and operate, a graphics driver 118. The graphics driver 118 may represent or include, for example, virtually any graphics card, expansion card, video card, video adaptor, or other hardware and/or software that is configured to convert the logical output of the graphics interface 108 into a signal that may be used by graphics hardware 120 so as to cause a computer 122, having a display 124, to render an image frame 126.
The computer 122 may represent, but need not be limited to, a personal computer. The computer 122 additionally or alternatively may include, for example, a desktop computer, a laptop, a network device, a personal digital assistant, or any other device capable of rendering a signal from the graphics hardware 120 into the desired visual representation. As described, the computer 122 may then, for example, send a signal to the display 124, which may include a screen connected to the computer 122, part of the laptop computer 122, or any other suitable or desired display device.
The display 124 may then render a frame 126 of a visual representation, the frame 126 including a frame portion 128 that includes at least one pixel 130. Of course, the frame portion 128 may include many pixels 130, where it will be appreciated that the pixel 130 may represent one of the smallest, or the smallest, element of the display 124 (and the graphics application 102) that may be displayed to the developer or other user.
In many of the following examples, situations are described in which a rendering error occurs and is observed within the frame 126. However, it should be understood that in many examples, as referenced above, a rendering error need not occur, and that other motivations exist for observing a pixel history (e.g., optimization or understanding of code of the graphics application 102). Thus, the developer may then, for example, view the frame 126 on the display 124, and find that the frame 126 does not appear in an intended manner, e.g., contains a rendering error. The developer may, for example, find that the frame 126 renders a blue building, when the frame 126 was intended to render the building as being white, or vice-versa. As another example, a depth of an object may be incorrect. For example, a rendered car may be intended to be shown as driving behind the building, but may instead appear in front of the building. Such types of undesired outcomes are well-known to developers, and include many examples other than the few mentioned herein.
In some example implementations, then, such as in the example of the system 100, the developer may then, for example, select a pixel 130 used to render the building (i.e. a pixel appearing to be white rather than blue) from within the frame 126, which is the problematic frame in which the undesired outcome occurred. The developer may then, for example, request a pixel history on the pixel 130 to help determine what is wrong (e.g. why the building rendered as white instead of blue). For example, the developer may right-click on the pixel 130 and be provided with a pop-up window that includes the option “view pixel history.” Of course, other techniques may be used to access/initiate the pixel history.
Specifically, in the example of the system 100, a pixel history system 132 may be configured to provide such a pixel history to the developer, e.g., by way of a pixel history window 142, as described in detail below.
In operation, then, the developer may simply observe a visual representation on the display 124. When the developer observes a rendering error within the visual representation, e.g., within the frame 126, the developer may simply select the pixel 130 that includes the rendering error, e.g., by clicking on or otherwise designating the pixel 130. The developer is then provided with the pixel history window 142, which identifies the pixel 130 and provides the sequence or listing of events 106 (e.g., events 106a, 106b) that led to the rendering of the pixel 130 within the frame 126. In the pixel history window 142, the events 106a and 106b are browsable, so that, for example, the developer may scroll through the events 106a and 106b (and other events 106, not explicitly shown) to observe how the pixel 130 changed over time.
Rendering errors such as those just referenced may be very small or very large, in either spatial or temporal terms. For example, the rendering error may be limited to a single pixel, or may include a large object in the frame 126 (such as the building or car just mentioned). The rendering error may exist for several seconds, or may appear/disappear quite rapidly. Moreover, the rendering error may not appear exactly the same within different executions of the graphics application 102. For example, in a video game, a particular scene (and associated rendering error) may depend on an action of a player of the game. If the player takes a different action, the resulting scene may render slightly differently, or completely differently. In such cases, it may be difficult even to view the rendering error again, or to be certain that the rendering error occurred.
Accordingly, the pixel history system 132 is configured, in example implementations, to implement techniques for capturing, storing, and re-playing the calls 104 to the graphics interface 108. Specifically, for example, a capturing tool 134 may be used that is configured to intercept the calls 104 as they are made from the graphics application 102 to the graphics interface 108.
In example implementations, the capturing tool 134 generally operates to monitor calls 104 made by the graphics application 102 to the graphics interface 108, to package the calls and associated asset data within an event 106, and to put the event(s) 106 into a run file 136. More specifically, the capturing tool 134 captures calls 104 made in connection with the frame 126, along with a sequence of calls made in connection with previous frames that help define a state of the frame 126. The capturing tool 134 may then store the call(s) within the event 106, and put the event 106 into the run file 136. Accordingly, the captured calls may be re-executed using an executable 137 so as to re-render the frame 126.
Thus, the event 106 may serve as a data bundle for the call 104 and the asset data 109. The event 106 may, for example, include zero or more calls 104, with or without a particular type of associated asset data 109 for each call 104, and the pixel history system 132 may generate more than one event 106. For example, the capturing tool 134 may capture a call 104 and associated asset data 109 and store them within the event 106.
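Purely as a sketch of the data bundling just described (and not as a definitive format), an event and a run file might be represented along the following lines, where the CapturedCall, Event, and RunFile names are assumptions for illustration.

    #include <cstdint>
    #include <string>
    #include <utility>
    #include <vector>

    // Hypothetical bundle of one intercepted call and its associated asset data.
    struct CapturedCall {
        std::string name;                      // e.g., "DrawPrimitive"
        std::vector<std::uint8_t> parameters;  // serialized call arguments
        std::vector<std::uint8_t> assetData;   // copy of, or reference to, asset data used by the call
    };

    // Hypothetical event: zero or more calls captured together.
    struct Event {
        int id = 0;                            // sequence number within the captured run
        std::vector<CapturedCall> calls;
    };

    // Hypothetical run file: the ordered events captured for the frame of interest,
    // which may later be re-executed to re-render the frame.
    struct RunFile {
        std::vector<Event> events;

        // Called by the capturing tool each time a call to the graphics interface is intercepted.
        void record(Event event) { events.push_back(std::move(event)); }
    };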
In some example implementations, only those calls contributing to the frame 126 may be stored as event(s) 106, perhaps using a memory 135, within the run file 136. Also, since the calls 104 are captured at the level of standard calls made to the graphics interface 108, it should be understood that capturing and re-rendering may be performed in connection with any graphics application that uses the graphics interface 108, without necessarily needing to access source code of the graphics application(s), and without dependency on the graphics driver 118, the graphics hardware 120, or the computer 122. Furthermore, virtually any graphics application 102 and/or graphics interface 108 (or graphics application programming interface) may be compatible with the pixel history system 132, so long as there is a way, for example, to capture, modify, and replay the calls 104 to the graphics interface 108 from the graphics application 102.
In use, then, a developer may observe a rendering error during execution of the graphics application 102, and may then re-execute the graphics application 102, but with the pixel history system 132 inserted so as to capture the calls 104. In this way, the run file 136 may be captured that represents, in a minimal way, the rendered frame 126 in a manner that allows the developer to observe the rendering error in an easy, repeatable way.
Of course, the just-described operations and features of the pixel history system 132 are intended merely as examples, and other techniques may be used to provide the frame 126 to the developer, e.g., by re-executing the captured calls of the run file 136 using the executable 137 to re-render the frame 126.
However the developer is provided with the frame 126 for selection of the pixel 130, the pixel history system 132 may receive the selection thereof using a pixel parser 138, which may extract all of the events 106 that are relevant to the selected pixel 130 (e.g., the events 106a, 106b), which may then be provided to the developer by display logic 140 in the form of the pixel history window 142. As shown, the pixel history window 142 may include a pixel identifier 143 that identifies the selected pixel 130 (e.g., by providing a row, column of the pixel 130 within the display 124, or by other suitable identification techniques).
Upon receiving the selection of the pixel 130, the pixel parser 138 may select only those events 106 relevant to the pixel 130 (e.g., used to render the pixel 130), along with the associated data or other information associated with each event (e.g., associated instances of the primitive 110, depth 112, stencil 114, and color 116). For example, in the pixel history window 142, the events 106a, 106b may be provided together with associated primitives 110a, 110b, pixel values 142a, 142b, and test results 146a, 146b, as described in more detail below.
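A minimal sketch of the kind of filtering just described follows, under the assumption that the caller supplies a predicate (e.g., a replay-and-check routine) that decides whether a given event affects a given pixel; the names are illustrative only.

    #include <vector>

    // Hypothetical pixel designation (e.g., row and column within the frame).
    struct PixelId {
        int column = 0;
        int row = 0;
    };

    // Keep only those events that affect the selected pixel, as a pixel parser
    // such as the pixel parser 138 might do. EventT stands in for whatever event
    // representation the capturing tool produced; Predicate decides coverage.
    template <typename EventT, typename Predicate>
    std::vector<EventT> selectRelevantEvents(const std::vector<EventT>& allEvents,
                                             const PixelId& pixel,
                                             Predicate affectsPixel) {
        std::vector<EventT> relevant;
        for (const EventT& event : allEvents) {
            if (affectsPixel(event, pixel)) {
                relevant.push_back(event);   // event is used to render (or attempts to affect) the pixel
            }
        }
        return relevant;
    }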
The provided events 106a, 106b also may include test results 146a, 146b, respectively. Such test results 146a, 146b refer generally to the fact that the graphics application 102 may specify that the pixel should only appear, or should only appear in a specified manner, if a certain precondition (i.e., test) is met. Various types of such pixel tests are known. For example, a depth test may specify that the pixel 130 should only be visible if its depth is less than that of another specified pixel, and should not otherwise be visible (i.e., should be “behind” the other specified pixel). Other known types of tests include, for example, the stencil test or the alpha test, which operate to keep or discard frame portions based on comparisons of stencil/alpha values to reference values. For example, as is known, the stencil test may help determine an area of an image, while the alpha test refers to a level of opaqueness of an image, e.g., ranging from completely clear to completely opaque. These tests may be interrelated, e.g., the stencil value 114 may be automatically increased or decreased, depending on whether the pixel 130 passes or fails an associated depth test.
Thus, for example, the call 104 may attempt to render the pixel 130; however, if the pixel 130 fails an associated depth test that the developer intended the pixel 130 to pass (or passes a depth test it was supposed to fail), then the pixel 130 may not appear (or may appear), thus resulting in a visible rendering error. In other words, for example, it may occur that the call 104 attempts to affect the pixel 130 and fails to do so. In this case, the pixel history window 142 may provide the results of the (failed) test, so that the developer may judge whether the test was the source of the perceived error. Specifically, in the example of the pixel history window 142, the test results 146a, 146b associated with the events 106a, 106b may indicate whether, and which of, such tests passed or failed.
The display logic 140 may be configured to interact with the run file 136 (and/or the memory 135), the executable 137, and the pixel parser 138, to provide the pixel history window 142 and/or other associated information. For example, the display logic 140 may use the executable 137 to re-render the frame 126, the frame portion 128, and/or the pixel 130 itself. The pixel history window 142 may appear on the same display 124 as the rendered frame 126, or may appear on a different display. Numerous configurations may be used to display the information; for example, all of the information may be displayed in a pop-up window. From the information displayed in the pixel history window 142, the developer or other user may then determine a source of a rendering error associated with the pixel 130, such as, for example, why a depicted building that included the pixel 130 was white when it was intended to be blue.
It should be understood that, in the example of the blue building that renders white, or in other rendering errors, there may be multiple sources of the rendering error. For example, there may be an error with the call 104, and/or with the primitive 110. Further, there may be an error with the graphics driver 118 implementing the graphics interface 108, or there may be an error in the graphics hardware 120. The pixel history system 132, by providing the pixel history window 142, may thus assist the developer in determining one or more sources of the observed rendering error.
Calls associated with the pixel (e.g., that “touch” the pixel), from the graphics application to the graphics interface, may then be determined. The calls may be stored in one or more events with associated data (220). For example, the capturing tool 134 may capture the calls 104 and store the calls in the events 106. The pixel parser 138 may then, for example, extract the events that are associated with the pixel 130.
An identification of a pixel within the frame may then be received (230). For example, the pixel history system 132, e.g., the pixel parser 138, may receive a selection of the pixel 130 from within the frame 126. For example, the developer who requested the re-rendering of the frame 126 may “click on” the pixel 130 as the pixel 130 displays the rendering error. The pixel parser 138 may then determine which pixel has been selected (e.g., the pixel 130 may be designated as pixel (500, 240)). In some implementations, as described in more detail below, the frame 126 may first be re-rendered (e.g., from the run file 136 using the executable 137) so that the developer may make the selection from the re-rendered frame.
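As a trivial, assumption-laden sketch of that determination (assuming the frame 126 is displayed unscaled at a known offset within a display window, and reusing the hypothetical PixelId structure from the earlier sketch), the selection might be mapped to a pixel designation as follows.

    // Map a mouse click in window coordinates to the designation of the
    // selected pixel, e.g., pixel (500, 240). Real tools would also account
    // for window scaling and display DPI.
    PixelId pixelFromClick(int mouseX, int mouseY, int frameOriginX, int frameOriginY) {
        PixelId id;
        id.column = mouseX - frameOriginX;
        id.row    = mouseY - frameOriginY;
        return id;
    }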
Each call may include multiple primitives; those primitives that are configured to affect the pixel may be determined (240). For example, the pixel parser 138 may parse each of the events 106 to determine the primitive(s) 110. Since, as referenced, there may be thousands of primitives within even a single event 106, it may be difficult to parse and extract each primitive. Consequently, different techniques may be used, depending on a given situation/circumstance.
For example, it may be determined whether the primitives are arranged in a particular format, such as one of several known techniques for arranging/combining primitives. For example, it may be determined whether the primitives are arranged in a fan or a strip (242), and, consequently, it may be determined which primitive parsing algorithm should be used (244). Further discussion of how the primitives 110 are obtained is provided in more detail below.
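To illustrate why the arrangement matters (and hence why the parsing algorithm is chosen at operation 244), the following sketch enumerates triangles from an index buffer for three common arrangements; the Topology and Triangle names are assumptions, and triangle-strip winding order is ignored for simplicity.

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    enum class Topology { TriangleList, TriangleStrip, TriangleFan };

    struct Triangle {
        std::uint32_t i0, i1, i2;   // indices of the triangle's three vertices
    };

    // Enumerate the triangles encoded by an index buffer, according to how
    // the primitives are arranged (list, strip, or fan).
    std::vector<Triangle> enumerateTriangles(const std::vector<std::uint32_t>& indices,
                                             Topology topology) {
        std::vector<Triangle> triangles;
        switch (topology) {
            case Topology::TriangleList:    // every three consecutive indices form one triangle
                for (std::size_t i = 0; i + 2 < indices.size(); i += 3)
                    triangles.push_back({indices[i], indices[i + 1], indices[i + 2]});
                break;
            case Topology::TriangleStrip:   // each new index forms a triangle with the previous two
                for (std::size_t i = 2; i < indices.size(); ++i)
                    triangles.push_back({indices[i - 2], indices[i - 1], indices[i]});
                break;
            case Topology::TriangleFan:     // each new index forms a triangle with the first index
                for (std::size_t i = 2; i < indices.size(); ++i)
                    triangles.push_back({indices[0], indices[i - 1], indices[i]});
                break;
        }
        return triangles;
    }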
Then, it may be determined how the events associated with the pixel 130 are configured to affect the pixel 130, including the associated asset data (250). Various examples of such effects, e.g., changes to the depth value 112, the stencil value 114, and/or the color value 116, are provided above.
Test results of tests associated with one or more of the events may be determined (260) and stored (262). For example, the pixel parser 138 may determine that a given event was associated with a depth test, such that, for example, the pixel 130 was supposed to become visible as being in front of some other rendered object. The test results, e.g., the test results 146b of the event 106b, may then be determined and stored for display within the pixel history window 142.
The events, primitives, pixel values/asset data, and test results may then be displayed in association with an identification of the pixel (270). For example, as discussed above, the display logic 140 may provide the pixel history window 142, in which each displayed event is associated with its primitive(s), pixel values, and any test results, together with the pixel identifier 143.
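A conceptual sketch of the record that might back each such displayed row follows; the PixelHistoryEntry and TestResult names, and the particular fields, are assumptions chosen to mirror the items described in this section.

    #include <cstdint>
    #include <string>
    #include <vector>

    // Result of one pixel test (e.g., scissor, alpha, stencil, or depth).
    struct TestResult {
        std::string testName;
        bool passed = false;
    };

    // One row of a pixel history: the event, the primitives within it that
    // touched the pixel, the resulting pixel values, and any test results.
    struct PixelHistoryEntry {
        int eventId = 0;
        std::vector<int> primitiveIds;
        float color[4] = {0.0f, 0.0f, 0.0f, 0.0f};  // resulting RGBA value after the event
        float depth = 0.0f;
        std::uint8_t stencil = 0;
        std::vector<TestResult> testResults;        // e.g., why the event did or did not affect the pixel
    };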
By providing the pixel history window 142, the pixel history system 132 may, for example, provide the developer with direct and straightforward access to the primitives and asset data used within each event (272). For example, the primitives 110a, 110b may provide a link that the developer may select/click in order to learn more information about the selected primitive. Similarly, the pixel values 142a, 142b may provide information about a color of the pixel 130 after the associated event 106a, 106b, and the developer may, for example, compare this information to an associated alpha value (i.e., degree of opaqueness) to determine why the pixel 130 was not rendered in a desired manner.
The pixel history window 142 also includes a first event 302 that represents a value of the pixel 130 at an end of a previous frame. The event 302 includes a color swatch 302a that illustrates a color value associated with the event 302 for easy visualization. Pixel values 302b may specify, for example and as shown, float representations of the color values, as well as alpha, depth, and stencil values.
The event 304 represents a clear event, which sets the above-referenced pixel values to “0.” The event 304 also includes a link 304a that provides the developer with state information about a state of associated hardware (e.g., the graphics driver 118, the graphics hardware 120, or the computer 122) at the point in time associated with the event 304. For example, the state information may be rendered in a separate window.
The event 306 identifies a first event (i.e., event 101) associated with rendering the pixel 130. In this example, the event 306 is associated with drawing a primitive, so that link(s) 306a provide the developer with direct access to the specified primitive. Consequently, for example, the developer may view characteristics of the identified primitive, in order to ensure that the characteristics match the desired characteristics. The event 306 also includes, similarly to the clear event 304, pixel values, such as a pixel shader output 306b and a framebuffer output 306c, along with associated color, alpha, depth, and/or stencil values at that point in time. Also in the event 306, links 306d provide the developer with access to hardware state information (as just referenced) and access to mesh values associated with a meshing of the identified primitive(s) into a desired, higher-level object.
The event 308 is associated with an event (i.e., the event 105) commanding the graphics interface 108 to update a resource. As before, the event 308 may include pixel values of an associated framebuffer output 308b, as well as a link 308c to state information.
Finally, the event 310 identifies a further event associated with drawing a primitive that affects the pixel 130, and includes, for example, an associated pixel shader output 310c and framebuffer output 310d, along with related color, alpha, depth, and/or stencil values, similarly to the event 306.
An event may next be examined (406). For example, for the event 306, the DrawPrimitive(a,b,c) call of the corresponding event (i.e., event 101) may be examined. It may then be determined, for the event, whether the call is a draw to the render target (408). The render target, for example, may be the frame including an erroneously rendered pixel as selected by the graphics developer, as described above, or may be a frame for which the developer wishes to understand or optimize related calls (events). If the call is not a draw to the render target, then the next event (if any) may be examined (422).
If, however, the call is a draw call to the render target, it may be determined whether the draw call covers the pixel (410). In this context, an example for determining whether the draw call covers the pixel is provided below in Code Section 1, which is intended to conceptually/generically represent code or pseudo-code that may be used.
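One hypothetical sketch of the kind of code or pseudo-code such a Code Section 1 may represent is the following, which clears the render target to a sentinel value, re-executes only the draw call in question, and reads back the selected pixel; the ReplayDevice interface and its member functions are assumptions for illustration, not part of any particular graphics interface, and pixel tests would typically be disabled during this check so that coverage is purely geometric:

    #include <cstdint>

    // Hypothetical replay interface; a real tool would implement these
    // operations in terms of the underlying graphics interface, using the
    // captured calls of the run file.
    struct ReplayDevice {
        virtual ~ReplayDevice() = default;
        virtual void clearRenderTarget(std::uint32_t clearColor) = 0;
        virtual void replayDrawCall(int eventId) = 0;               // re-execute only this draw call
        virtual std::uint32_t readPixel(int column, int row) = 0;   // read back the render target
    };

    // Conceptual stand-in for Code Section 1: returns true if the draw call
    // of the given event writes to the selected pixel.
    bool drawCallCoversPixel(ReplayDevice& device, int eventId, int column, int row) {
        const std::uint32_t kSentinel = 0xDEADBEEFu;  // a value the draw call is unlikely to produce
        device.clearRenderTarget(kSentinel);          // clear the (offscreen) render target
        device.replayDrawCall(eventId);               // replay just this draw call
        return device.readPixel(column, row) != kSentinel;
    }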
If the draw call covers the pixel (i.e., Code Section 1 returns true), then it may be determined whether a primitive of the draw call covers the pixel (414). In this context, an example for determining the first primitive that covers the pixel is provided below in Code Section 2, which is intended to conceptually/generically represent code or pseudo-code that may be used.
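Similarly, one hypothetical sketch of the kind of code or pseudo-code such a Code Section 2 may represent extends the ReplayDevice assumption from the sketch above with per-primitive replay, and finds the first primitive of the draw call that covers the pixel:

    // Additional hypothetical operations for replaying a single primitive of a draw call.
    struct PrimitiveReplayDevice : ReplayDevice {
        virtual void replayPrimitive(int eventId, int primitiveIndex) = 0;
        virtual int primitiveCount(int eventId) = 0;
    };

    // Conceptual stand-in for Code Section 2: returns the index of the first
    // primitive of the draw call that covers the pixel, or -1 if none does.
    int firstPrimitiveCoveringPixel(PrimitiveReplayDevice& device,
                                    int eventId, int column, int row) {
        const std::uint32_t kSentinel = 0xDEADBEEFu;
        for (int p = 0; p < device.primitiveCount(eventId); ++p) {
            device.clearRenderTarget(kSentinel);
            device.replayPrimitive(eventId, p);               // replay only primitive p
            if (device.readPixel(column, row) != kSentinel) {
                return p;                                     // primitive p covers the pixel
            }
        }
        return -1;
    }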
After finding a primitive that covers the pixel, the history details for the pixel may be determined (416). Details of example operations for determining the pixel history details are provided below (426-438).
After the history details for the primitive are determined as described above, the primitive value may be added to the history (418). Then if there are more primitives in the draw call, the next primitive may be examined (420), using the techniques just described. Once there are no more primitives in the draw call and no more events to examine, the final framebuffer value may be added to the history (424).
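Tying the above operations together, a conceptual sketch of the overall loop (operations 406 through 424) might look as follows; it reuses the hypothetical types from the sketches above, and the determineHistoryDetails callback stands in for the per-primitive testing described next (operations 426 through 438).

    #include <cstdint>
    #include <functional>
    #include <vector>

    std::vector<PixelHistoryEntry> buildPixelHistory(
            PrimitiveReplayDevice& device,
            const std::vector<int>& renderTargetDrawEventIds,   // events already known to draw to the render target (408)
            int column, int row,
            const std::function<PixelHistoryEntry(int eventId, int primitiveIndex)>& determineHistoryDetails) {
        const std::uint32_t kSentinel = 0xDEADBEEFu;
        std::vector<PixelHistoryEntry> history;

        for (int eventId : renderTargetDrawEventIds) {              // examine each event (406)
            if (!drawCallCoversPixel(device, eventId, column, row)) {
                continue;                                           // draw call misses the pixel; next event (410)/(422)
            }
            for (int p = 0; p < device.primitiveCount(eventId); ++p) {
                device.clearRenderTarget(kSentinel);
                device.replayPrimitive(eventId, p);
                if (device.readPixel(column, row) == kSentinel) {
                    continue;                                       // this primitive does not cover the pixel (414)/(420)
                }
                // determine and add history details for the covering primitive (416)/(418)
                history.push_back(determineHistoryDetails(eventId, p));
            }
        }

        PixelHistoryEntry finalValue;                               // add the final framebuffer value (424)
        finalValue.eventId = -1;                                    // hypothetical marker for "final framebuffer value"
        history.push_back(finalValue);
        return history;
    }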
In determining the history details (416), first the pixel shader output may be determined (426). For example, the values associated with item 310c of event 310 may be determined. Then the pixel may be tested to determine whether it fails any one of several tests, including for example a scissor test, an alpha test, a stencil test, and a depth test. Alternative embodiments may include a subset and/or different tests and/or different sequences of tests other than those specified in this example.
For example, first it may be determined whether the pixel fails the scissor test (428). The scissor test may include a test used to determine whether to discard pixels contained in triangle portions falling outside a field of view of a scene (e.g., by testing whether pixels are within a “scissor rectangle”). If the pixel does not fail the scissor test, then it may be determined whether the pixel fails the alpha test (430). The alpha test may include a test used to determine whether to discard a triangle portion (e.g., pixels of the triangle portion) by comparing an alpha value (i.e., transparency value) of the triangle portion with a reference value. Then, if the pixel does not fail the alpha test, the pixel may be tested to determine whether it fails the stencil test (432). The stencil test may be a test used to determine whether to discard triangle portions based on a comparison between the portion(s) and a reference stencil value. If the pixel does not fail the stencil test, finally the pixel may be tested to determine whether it fails the depth test (434). The depth test may be a test used to determine whether the pixel, as affected by the primitive, will be visible, or whether the pixel as affected by the primitive may be behind (i.e., have a greater depth than) an overlapping primitive.
If the pixel fails any of the above-mentioned tests, then this information may be written to the event history (438) and no further test may be performed on the pixel. In alternative embodiments, however, a minimum set of tests may be specified to be performed on the pixel regardless of outcome.
If, however, the pixel passes all of the above-mentioned tests, then the final framebuffer color may be determined (436). For example, the color values associated with item 310d of event 310 may be determined. Then the color value and the test information may be written to the event history for this primitive (438).
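A conceptual sketch of this per-primitive testing sequence (operations 426 through 438) follows, reusing the hypothetical PixelHistoryEntry and TestResult types from above; the TestQuery interface is an assumption standing in for however a real tool evaluates each test (e.g., by setting a corresponding device state and replaying the primitive, as discussed below).

    // Hypothetical per-primitive query interface for one pixel.
    struct TestQuery {
        virtual ~TestQuery() = default;
        virtual void readPixelShaderOutput(float rgba[4]) = 0;   // (426)
        virtual bool passesScissorTest(int column, int row) = 0; // (428)
        virtual bool passesAlphaTest(int column, int row) = 0;   // (430)
        virtual bool passesStencilTest(int column, int row) = 0; // (432)
        virtual bool passesDepthTest(int column, int row) = 0;   // (434)
        virtual void readFramebufferColor(float rgba[4]) = 0;    // (436)
    };

    PixelHistoryEntry determineHistoryDetails(TestQuery& query, int eventId,
                                              int primitiveIndex, int column, int row) {
        PixelHistoryEntry entry;
        entry.eventId = eventId;
        entry.primitiveIds = {primitiveIndex};
        query.readPixelShaderOutput(entry.color);                     // pixel shader output (426)

        const char* testNames[] = {"scissor", "alpha", "stencil", "depth"};
        bool (TestQuery::*tests[])(int, int) = {
            &TestQuery::passesScissorTest, &TestQuery::passesAlphaTest,
            &TestQuery::passesStencilTest, &TestQuery::passesDepthTest};

        for (int i = 0; i < 4; ++i) {                                 // apply the tests in order (428)-(434)
            const bool passed = (query.*tests[i])(column, row);
            entry.testResults.push_back({testNames[i], passed});
            if (!passed) {
                return entry;   // first failure: record it and perform no further tests (438)
            }
        }
        query.readFramebufferColor(entry.color);   // all tests passed: record final framebuffer color (436)/(438)
        return entry;
    }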
Using the above information and techniques, the developer may determine why a particular pixel was rejected during rendering of the visual representation (e.g., the frame 126). For example, in some cases, a target pixel may simply be checked without needing to render, such as when the target pixel fails the scissor test. In other cases/tests, a corresponding device state may be set, so that the render target (pixel) may be cleared and the primitive may be rendered. Then, a value of the render target may be checked, so that, with enough tests, a reason why the pixel was rejected may be determined.
Based on the above, a developer or other user may determine a history of a pixel in a visual representation. Accordingly, the developer or other user may be assisted, for example, in debugging associated graphics code, optimizing the graphics code, and/or understanding an operation of the graphics code. Thus, a resulting graphics program (e.g., a game or simulation) may be improved, and a productivity and skill of a developer may also be improved.
While certain features of the described implementations have been illustrated and described herein, many modifications, substitutions, changes, and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the various embodiments.