The invention relates to integrated circuit verification, e.g. by means of simulation, and more particularly to systems, methods and computer program products for prioritizing electronic design verification review issues.
Electronic chip designers continue to develop electronic chips of ever-increasing complexity using more and more transistors. Verifying the behavior of an electronic chip has become increasingly difficult and time-consuming. A considerable amount of engineering time is spent running and analyzing simulation results.
Design verification teams spend many months developing, running and analyzing simulation tests and their results. The verification teams typically first develop functional tests, also known as directed tests. These functional tests are designed to test expected behavior as described in a functional specification, and they check that the design works in all modes and configurations. When the verification teams first run the functional tests, the tests typically reveal errors in the design and errors in the tests themselves. After some period of time, once the design and test errors have been corrected, the design and verification teams will agree on one or more regression test suites. The verification teams will then run daily and weekly regression tests.
After functional test verification, the verification team will typically start random testing. Random tests typically check scenarios with different sets of pseudo-random test inputs. Random testing usually shows fewer errors than functional testing.
After random testing, the verification team typically starts code coverage analysis to ensure that all lines of RTL code are exercised. Verification teams typically use simulators to generate code coverage reports, using the previously developed tests as input, and analyze these reports to find unexercised code. The verification teams often find “dead code”, code that isn't needed, and they find situations that the functional tests should have tested but didn't. The code coverage phase generally finds few design errors but is considered a necessary step. The code coverage reports provide a large volume of detailed information, and verification teams and design engineers find it time-consuming and tedious to review the code coverage issues.
Electronic design automation (EDA) tools are making increased use of verification properties to augment simulation and reduce verification cost. A verification property declares a condition in the design. If a property always holds true, we call it an assertion. For example, the property “overflow==1′b0” should always hold for any correct FIFO design. On the other hand, a property can capture possible behavior allowed by the design; we call such a property a cover property. For example, the property “full==1′b1” is a typical cover property on the same FIFO design. The two examples given above are typically written as:
assert property (overflow==1′b0);
cover property (full==1′b1);
Users can specify verification properties in an RTL file or in a separate constraint file. The properties are typically written in a specific language such as SystemVerilog Assertions (SVA) or Property Specification Language (PSL). Many EDA tools can parse and generate verification properties. Some EDA tools can generate verification properties by analyzing RTL statements; other EDA tools can generate verification properties by analyzing simulation test results.
Atrenta Inc.'s Bugscope® EDA tool has proven valuable in finding test coverage holes. Bugscope® generates verification properties by analyzing simulation test results. For example, it may note that in one test the condition “full==1′b0” is always true. If Bugscope® discovers that the same condition is false in a second test, it treats the condition as a coverage property and generates the property “cover full==1′b1”. The coverage property condition is inverted with respect to the discovered condition to direct a simulator to check for the inverted condition. We say that the second test covers the property. Alternatively, Bugscope® may note that the condition “overflow==1′b0” is always true in all tests. In this case it cannot tell whether the property is a coverage property or an assertion.
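The inference described above can be sketched as a simple classifier. This is a hypothetical illustration of the idea, not Bugscope®'s actual implementation; the function name and data layout are invented for this sketch. A condition that is invariant in some tests but falsified in others yields a cover property for the inverted condition, while a condition that holds in every test remains ambiguous between an assertion and a coverage hole.

```python
def classify_condition(observations):
    """Classify a candidate condition from per-test observations.

    observations: dict mapping test name -> True if the condition
    (e.g. "full==1'b0") held throughout that test, False otherwise.
    Returns the inferred kind of the condition.
    """
    held_in = [t for t, held in observations.items() if held]
    violated_in = [t for t, held in observations.items() if not held]

    if held_in and violated_in:
        # True in some tests, false in at least one other: the inverted
        # condition is reachable, so emit a cover property for it.
        return "cover"
    if held_in and not violated_in:
        # True in all tests: cannot distinguish an assertion from an
        # uncovered behavior (a test coverage hole).
        return "ambiguous"
    # False everywhere: not a candidate invariant at all.
    return "uninteresting"
```

For instance, "full==1'b0" holding in test A but not in test B classifies as "cover", matching the "cover full==1′b1" example above.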
Using verification properties places an additional review burden on the verification engineer. In addition to reviewing code coverage data the verification team must also review generated properties and simulation property results.
Verification teams would prefer EDA tools that simplify the task of reviewing code coverage and generated property results. Design and verification teams would like to speed up the overall development schedule by reducing time spent reviewing coverage items and get coverage information earlier in the development schedule.
A system and method are provided that use pass/fail test results to prioritize electronic design verification review issues. The system may prioritize generated properties, code coverage items, or both. Thus, issues, whether generated properties or code coverage items, that have never been violated in any passing or failing test may be given the highest priority for review, while those that have been violated in a failing test but are always valid in passing tests may be given lower priority. Still further, where end users have marked one or more properties or code coverage items as already reviewed, the method gives these already-reviewed issues the lowest priority.
As a result, both properties and code coverage items may be generated together in a progressive manner starting earlier in development. Properties for unchanged modules that have already been verified in a previous version of a chip can be removed or given lowest priority to avoid duplication of effort. Likewise, properties and code coverage items that are only violated in failing tests may be removed or given lower priority, so that repetitive testing of such issues at every design regression can be minimized or avoided altogether. The number of issues to review is therefore significantly smaller than with prior approaches.
A Verification Issue Rating System (VIRS) in accord with the present invention uses pass/fail test results to prioritize electronic design verification review issues. Properties that have never been violated in any passing or failing test are given highest priority. Properties that have been violated in a failing test are given lower priority. Similarly, code coverage items that have never been exercised in any passing or failing test are given highest priority. Code coverage items that have been exercised in a failing test are given lower priority.
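The prioritization rule just described can be sketched as follows. This is a minimal illustration under stated assumptions, not the actual VIRS implementation; the function name and the representation of test results are invented for this sketch. For a coverage item, "violated" should be read as "exercised".

```python
def prioritize(issue_results):
    """Assign a review priority to each verification issue.

    issue_results: dict mapping issue name -> list of
    (test_passed, issue_violated) pairs, one entry per test run.
    An issue is a generated property or a code coverage item.
    Returns a dict mapping issue name -> "high" or "low".
    """
    priorities = {}
    for issue, runs in issue_results.items():
        violated_anywhere = any(violated for _, violated in runs)
        if not violated_anywhere:
            # Never violated (or, for coverage items, never exercised)
            # in any passing or failing test: highest review priority.
            priorities[issue] = "high"
        else:
            # Violated in some (failing) test: lower review priority.
            priorities[issue] = "low"
    return priorities
```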
Verification teams typically use simulators to generate code coverage reports to discover which lines of RTL design code have not been exercised. For example, an RTL design may include a case statement specifying four conditions corresponding to the four combinations of values for a pair of binary signals. The code coverage report may report that cases 00, 01 and 10 are exercised but the case 11 is not exercised. The RTL code for case 11 is flagged for engineering review.
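The case-statement example above amounts to a set difference between the declared case values and the exercised ones. A minimal sketch (the function name and inputs are illustrative, not part of any particular simulator's coverage API):

```python
def uncovered_cases(all_cases, exercised):
    """Return the case values from an RTL case statement that no test
    exercised, preserving declaration order for the coverage report."""
    exercised_set = set(exercised)
    return [c for c in all_cases if c not in exercised_set]

# The two-signal example from the text: four case values, three hit.
flagged = uncovered_cases(["00", "01", "10", "11"], ["00", "01", "10"])
```

Here `flagged` contains only "11", the case that would be flagged for engineering review.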
EDA tools like Bugscope® generate properties by analyzing a design and its test simulation results. Verification engineers review the generated properties looking for test coverage holes and design errors. A generated property may indicate a relationship that should always be true, called an assertion; it may indicate a situation that hasn't been tested, called a coverage property; or it may indicate a design error. Coverage properties that are true in all tests indicate a test coverage hole. Verification engineers are more interested in these coverage properties than in assertions, because a coverage property may indicate that the verification team needs to create a new test.
During the early stages of development it is common to see test failures and assertion property violations. These assertion property violations are often the result of design errors. For example, a verification engineer may know that a FIFO should not overflow. The verification engineer creates functional tests trying to cause a FIFO overflow and may manage to show conditions under which FIFO overflow can occur. After a design engineer corrects the design, the test passes and the assertion properties pass. Tests that failed previously and now pass give a strong indication that the properties violated in them were assertion properties and not coverage properties. To take advantage of this information the EDA tools need to maintain a database of test results over time.
In the case of a result checker error, the design behavior observed in the failing test is legal, but the result checker judges it illegal. Assume that a property P is violated in this failing test. Because the failure is a checker failure, the verification engineer will fix the result checker while keeping the stimulus the same. When the result checker is fixed, property P will still be violated. The VIRS ignores this property P because it cannot be an assertion and it is not a coverage property that holds true in all tests.
In the case of a design error, assume that a property P is violated. When the design is fixed, retesting P can have only two outcomes: a) P holds true; b) P is still violated. For case a), P is very likely an assertion, because it was violated in a failing test and holds once the design is fixed. For case b), P must be a coverage property: the failing test exercised a previously uncovered corner case and thereby found a bug, and when the design is fixed the test still exercises this corner case, which is why P is still violated. In this way, a property can be identified as an assertion with high probability. The VIRS uses this information to prioritize both generated properties and code coverage items.
In the case of a test stimulus error, the analysis is very similar to that for a design error. The only difference is that the verification engineer fixes the test stimuli instead of the design. Retesting the violated property P still has two possible outcomes: a) P holds true; b) P is still violated. For case a), P is very likely an assertion. For case b), P must be a coverage property.
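The three failure-cause analyses above (result checker error, design error, and test stimulus error) reduce to one re-test rule, sketched below. This is an illustrative reading of the preceding paragraphs, not the actual VIRS code; the function name and arguments are invented for this sketch, and the "unknown" branch covers a situation the text does not address.

```python
def reclassify_after_fix(cause, violated_after_fix):
    """Reclassify a property P that was violated in a failing test,
    after the underlying problem has been fixed and the test rerun.

    cause: what was fixed -- "checker", "design", or "stimulus".
    violated_after_fix: whether P is still violated on the rerun.
    """
    if cause == "checker":
        if violated_after_fix:
            # Stimulus unchanged, checker fixed, P still violated:
            # P cannot be an assertion and is not a coverage property
            # that holds in all tests, so ignore it.
            return "ignore"
        # Not discussed in the text; left open here.
        return "unknown"
    # Design and stimulus fixes share the same analysis.
    if violated_after_fix:
        # The rerun still exercises the corner case: P is a
        # coverage property.
        return "coverage"
    # P was violated in a failing test and holds once fixed:
    # very likely an assertion.
    return "likely_assertion"
```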
The VIRS handles code coverage review items in a similar manner to the way it handles properties. In S310 the VIRS would list code coverage items instead of properties. In S320 the VIRS would give high priority to code coverage items that aren't covered in any passing or failing test. Subsequent steps would apply to code coverage items.
In one embodiment the VIRS allows users to mark verification items as “already reviewed”, and takes account of the user's “already reviewed” designation when prioritizing review items. In one embodiment the VIRS creates four categories of review items: a) high review priority and not “already reviewed”; b) high review priority and “already reviewed”; c) low review priority and not “already reviewed”; and d) low review priority and “already reviewed”.
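The four categories above can be produced by a simple two-flag bucketing, sketched here as an illustration (the function name, labels, and input format are invented for this sketch, not drawn from the VIRS itself):

```python
def categorize(items):
    """Split review items into the four categories described above.

    items: iterable of (name, high_priority, already_reviewed) tuples.
    Returns a dict mapping category label -> list of item names,
    ordered from first-to-review to last-to-review.
    """
    order = [
        ("high, not reviewed", True, False),   # category a)
        ("high, reviewed", True, True),        # category b)
        ("low, not reviewed", False, False),   # category c)
        ("low, reviewed", False, True),        # category d)
    ]
    buckets = {label: [] for label, _, _ in order}
    for name, high, reviewed in items:
        for label, h, r in order:
            if high == h and reviewed == r:
                buckets[label].append(name)
    return buckets
```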
The embodiments disclosed herein can be implemented as hardware, firmware, software, or any combination thereof. Moreover, the software is preferably implemented as an application program tangibly embodied on a program storage unit or computer readable medium. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture. Preferably, the machine is implemented on a computer platform having hardware such as one or more central processing units (“CPUs”), a memory, and input/output interfaces. The computer platform may also include an operating system and microinstruction code. The various processes and functions described herein may be either part of the microinstruction code or part of the application program, or any combination thereof, which may be executed by a CPU, whether or not such computer or processor is explicitly shown. In addition, various other peripheral units may be connected to the computer platform such as an additional data storage unit and a printing unit. Furthermore, a non-transitory computer readable medium is any computer readable medium except for a transitory propagating signal.
This application claims priority under 35 U.S.C. 119(e) from prior U.S. provisional application 62/041,661, filed Aug. 26, 2014.