Tag: coverage

Related Posts
  • Popularity: 20
    2015-3-16 06:30
    4326 reads | 0 comments
    We implement every possible check to make sure the design is verified, but what have we done to check our test bench? How do we make sure that our test bench has covered everything that needs to be covered with respect to the specification and test plans? This is where functional coverage and SVA come into the picture!

    Before we get to a few guidelines for working with functional coverage, I would encourage you to read the following posts on functional coverage and assertions to get a high-level idea of the architecture and usage:

    1. http://asicwithankit.blogspot.in/2011/01/coverage-model-in-system-verilog-test.html
    2. http://www.asicwithankit.blogspot.com/2012/12/system-verilog-functional-coverage.html
    3. http://www.asicwithankit.blogspot.com/2013/01/the-two-door-keepers-assertion-to-make.html

    Now, a basic question that comes to mind is: "What is the difference between code coverage and functional coverage?" Let's understand it at a high level and then move on to the guidelines for functional coverage.

    Sr No | Code Coverage | Functional Coverage and SVA
    1 | Derived from the design code with the help of simulation tools | User-specified, controlled coverage driven by the test bench
    2 | Evaluates the design code to check whether its structure is covered | Measures functionality with the help of covergroups, cover points, and bins (a convenient SystemVerilog feature); with SVA, functional coverage can be captured using cover properties
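    To make the comparison concrete, here is a minimal sketch of what the right-hand column looks like in practice: a covergroup with cover points and bins, plus an SVA cover property. The signals (pkt_len, pkt_type, valid, ready) and the bin ranges are hypothetical, invented purely for illustration.

```systemverilog
// Hypothetical DUT-facing signals, for illustration only.
module coverage_sketch(input logic clk,
                       input logic [7:0] pkt_len,
                       input logic [1:0] pkt_type,
                       input logic valid, ready);

  // Functional coverage: user-specified cover points and bins,
  // sampled automatically on every rising clock edge.
  covergroup pkt_cg @(posedge clk);
    cp_len : coverpoint pkt_len {
      bins small  = {[0:15]};     // short packets
      bins medium = {[16:127]};   // mid-size packets
      bins large  = {[128:255]};  // long packets
    }
    cp_type    : coverpoint pkt_type;    // automatic bins, one per type value
    len_x_type : cross cp_len, cp_type;  // cross coverage of length vs. type
  endgroup

  pkt_cg cg = new();

  // SVA cover property: records that a back-pressure scenario
  // (valid held while ready is low, then accepted) was exercised.
  cover property (@(posedge clk) valid && !ready ##1 valid && ready);

endmodule
```

    The covergroup measures that the interesting packet lengths and types (and their cross) were actually exercised, while the cover property records that a specific temporal scenario occurred at least once.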
    To conclude with a few guidelines drawn from various posts on functional coverage and assertions: functional coverage and code coverage both contribute heavily to the verification sign-off criteria. Verification engineers have to make sure that their test plan and test environment are intelligent enough to achieve code and functional coverage closure. Code coverage is generated by the tool from the simulations that the test environment runs, so the test environment should be random and intelligent enough to cover the design, and the designer should be in agreement during the code coverage review. There should be valid comments, with reasons, for all code coverage exclusions w.r.t. the design specification. Functional coverage should be written in such a way that it captures all of the functionality identified while defining the test plan. Coverage and assertions are very important entities in the verification process, and the following guidelines can help.

    A few guidelines while working with functional coverage:

    • Your test plan should be based on the functionality you want to verify w.r.t. the specification.
    • You should have a coverage matrix listing the cover point details for each test plan scenario, with traceability between test scenarios and cover points.
    • The environment should have a control mechanism for enabling or disabling coverage and assertions, for better controllability.
    • Don't enable functional coverage at the beginning of verification, to avoid simulation overhead in the early phase. During the initial phase of verification the bug rate is typically high; as verification progresses, the bug rate starts to drop. That is the time to enable coverage and analyze it.
    • The functional coverage plan needs to be updated as verification progresses: as your knowledge of the design and its corner cases grows, keep updating the plan.
    • Make effective use of the covergroup "trigger" and sampling mechanism; a short sketch follows at the end of this post. (Stay tuned for an upcoming blog post on the sampling mechanism!)
    • Use meaningful names for covergroups and cover points; this will help you during debug.
    • Coverage should not be captured from failing simulations; make sure to gather coverage only from passing ones. If some tests are failing in regression, fix those issues first before drawing conclusions about coverage achievement.
    • If your tests keep exercising the same logic in the design, start developing new tests for the uncovered parts of the coverage model (coverage holes).
    • For guidelines on SVA, please refer to this article: http://www.asicwithankit.blogspot.com/2014/08/system-verilog-assertions-sva-types.html

    Stay tuned to understand the functional coverage sampling mechanism!

    Thanks,
    ASIC With Ankit
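    To illustrate the "trigger and sampling" guideline above, here is a minimal sketch contrasting a covergroup sampled by an event trigger with one sampled by an explicit sample() call. All names (txn_coverage, txn_done, addr) are hypothetical, invented for this sketch.

```systemverilog
// Two hypothetical ways to trigger covergroup sampling (names invented).
class txn_coverage;
  bit [7:0] addr;
  event txn_done;

  // 1) Event-triggered: sampled automatically whenever txn_done fires.
  covergroup auto_cg @(txn_done);
    coverpoint addr { bins lo = {[0:127]}; bins hi = {[128:255]}; }
  endgroup

  // 2) Procedurally triggered: no sampling event; the test bench decides
  //    exactly when to call sample().
  covergroup manual_cg;
    coverpoint addr { bins lo = {[0:127]}; bins hi = {[128:255]}; }
  endgroup

  function new();
    auto_cg   = new();  // embedded covergroups must be built in new()
    manual_cg = new();
  endfunction

  // Called by the scoreboard only when the transaction checks out.
  function void record(bit [7:0] a);
    addr = a;
    manual_cg.sample();
  endfunction
endclass
```

    Explicit sample() calls are also one way to honor the earlier guideline about gathering coverage only from passing behavior: the checker can choose to sample only after a transaction passes its checks.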
  • Popularity: 22
    2013-8-27 20:13
    1844 reads | 0 comments
    The blog series by Harry Foster of Mentor Graphics contains a lot of really valuable information about trends in functional verification. The studies he discusses are very useful in helping me track how the industry is evolving. Being able to answer questions such as how much time engineers spend performing verification and which languages are being adopted the most can ensure that engineers get the right tools for the future.

    However, Foster's blog a few weeks ago immediately raised my eyebrows—not because of what it said, but because of what it didn't say. This chart from the 2012 Wilson Research Group study shows adoption trends from 2007 and 2012. One would think that technologies such as code coverage, functional coverage, and assertions were being adopted rapidly. Oops. That's not quite the case.

    In the other blogs in this series, Foster had been comparing results from the 2012 study with results from 2010. To me, the switch to a comparison with 2007 results seemed highly suspicious. Unluckily for Foster, the Internet is persistent. This graph shows results from a 2010 study. Let me turn those two charts into one.

    Code coverage dropped 2 per cent in two years, the use of assertions dropped 6 per cent, and the use of functional coverage dropped 6 per cent. Mentor claimed the overall confidence level of the 2010 study was 95 per cent with a margin of error of 4.1 per cent; for the 2012 study, the overall confidence level was 95 per cent with a margin of error of 4.05 per cent. So the differences between 2010 and 2012 are basically in the noise.

    Rather than treating the small but declining percentages as signs of maturation and potential saturation of the market for constrained random test-pattern generation, the blogs attempted to paint a rosy picture of adoption. This flattening of adoption is an important trend, and it actually supports the kinds of things I hear real engineers telling me: the increased difficulty of creating functional coverage points, the inability to use constrained random for SoC-level verification, and frustration with assertions. These numbers indicate not mass migration, but that all is not well, and that EDA vendors need to be looking in other directions for their next generation of tools.

    Is your use of any of these technologies changing? Perhaps you know of other reasons why these numbers have become flat.

    Brian Bailey
    EE Times
Related Resources