Tag: system

Related blog posts
  • Popularity 20 | 2015-3-16 06:30 | 4286 views | 0 comments
We have been implementing every possible check to make sure the design is verified, but what have we done to check our test bench? How do we make sure that our test bench has covered everything that needs to be covered with respect to the specification and the test plan? This is where functional coverage and SVA come into the picture.

Before we get to a few guidelines for working with functional coverage, I would encourage you to read the earlier posts on functional coverage and assertions to get a high-level idea of the architecture and usage:
1. http://asicwithankit.blogspot.in/2011/01/coverage-model-in-system-verilog-test.html
2. http://www.asicwithankit.blogspot.com/2012/12/system-verilog-functional-coverage.html
3. http://www.asicwithankit.blogspot.com/2013/01/the-two-door-keepers-assertion-to-make.html

A basic question that comes to mind is, "What is the difference between code coverage and functional coverage?" Let's understand it at a high level before moving on to the guidelines.
1. Code coverage is derived from the design code by the simulation tool; functional coverage is user-specified and controlled from the test bench.
2. Code coverage evaluates the design code to check whether its structure has been exercised; functional coverage measures the functionality itself with covergroups, cover points, and bins (a convenient SystemVerilog feature), and with SVA you can also capture functional coverage using cover property (a short sketch appears at the end of this post).

To conclude with a few points from earlier posts on functional coverage and assertions: functional coverage and code coverage both contribute heavily to the verification sign-off criteria. Verification engineers have to make sure that their test plan and test environment are intelligent enough to drive code and functional coverage closure. Code coverage is generated by the tool from the simulations run by the test environment, so the environment should be random and intelligent enough to cover the design, and the designer should be in agreement during the code coverage review. There should be valid comments, with reasons tied to the design specification, for all code coverage exclusions. Functional coverage should be written in such a way that it captures all of the functionality identified while defining the test plan.

A few guidelines while working with functional coverage:
- Your test plan should be based on the functionality you want to verify with respect to the specification.
- You should have a coverage matrix listing the cover point details for each test plan scenario, with traceability between test scenarios and cover points.
- The environment should have a control mechanism for enabling or disabling coverage and assertions, for better controllability.
- Don't enable functional coverage at the very beginning of verification; this avoids simulation overhead in the early phase.
- During the initial phase of verification the bug rate is typically high; as verification progresses, the bug rate starts to drop. That is the time to enable coverage and analyze it.
- The functional coverage plan needs to be updated as verification progresses: as your knowledge of the design and its corner cases increases, keep updating the plan.
- Make effective use of the covergroup trigger and sampling mechanism. (Stay tuned for a post on the sampling mechanism!)
- Use meaningful names for covergroups and cover points; this will help during debug.
- Coverage should not be captured from failing simulations. Gather coverage only from passing simulations; if some tests fail in regression, fix those issues before drawing conclusions about coverage achievement.
- If your tests keep exercising the same logic in the design, start developing new tests for the uncovered parts (the coverage holes).

For guidelines on SVA, please refer to this article: http://www.asicwithankit.blogspot.com/2014/08/system-verilog-assertions-sva-types.html

Stay tuned to understand the functional coverage sampling mechanism!

Thanks,
ASIC With Ankit
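P.S. As a quick illustration of the covergroup, cover point, bins, and cover property terms used above, here is a minimal sketch; the packet fields and the req/gnt handshake are hypothetical names chosen only to show the constructs, not code from any of the linked posts.

#############################################

// Functional coverage model in a class: a covergroup with labelled
// cover points, user-defined bins, and a cross.
class pkt_coverage;
  typedef enum {READ, WRITE, IDLE} kind_e;
  kind_e    pkt_kind;
  bit [7:0] pkt_len;

  covergroup pkt_cg;
    cp_kind : coverpoint pkt_kind;        // automatic bins: READ, WRITE, IDLE
    cp_len  : coverpoint pkt_len {
      bins short_len = {[0:15]};          // user-defined bins
      bins mid_len   = {[16:127]};
      bins long_len  = {[128:255]};
    }
    kind_x_len : cross cp_kind, cp_len;   // cross coverage of kind vs. length
  endgroup

  function new();
    pkt_cg = new();                       // an embedded covergroup must be constructed
  endfunction

  // Explicit sampling "trigger": call this from the monitor on every
  // completed transaction.
  function void sample_pkt(kind_e k, bit [7:0] l);
    pkt_kind = k;
    pkt_len  = l;
    pkt_cg.sample();
  endfunction
endclass

// SVA cover property: counts how often a request is granted within 1 to 4 cycles.
module req_gnt_cov (input logic clk, req, gnt);
  cover property (@(posedge clk) req |-> ##[1:4] gnt);
endmodule

#############################################

In a real environment the monitor would call sample_pkt() on each completed transaction, and the covergroup's built-in start()/stop() methods (or simulator switches) give the enable/disable control mentioned in the guidelines above.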
  • Popularity 21 | 2014-9-18 18:10 | 2027 views | 1 comment
Don't get too concerned just yet; our vital standards are not going away. However, they are changing radically. A fascinating article in the July 2014 issue of Physics Today reviews the history of the definitions of our basic units -- metre, kilogram, second -- and how those definitions have evolved as part of the present International System of Units (SI, from the French Système International d'Unités). It then clearly explains the very significant, dramatic, and non-intuitive changes that have been officially approved and are being made to the basic units of the metrology world.

For many years, for example, the "master" metre was a platinum bar with scratch marks, kept in a vault in Paris at the Bureau International des Poids et Mesures (BIPM). Just think about the logistics of comparing a secondary standard to that primary one. The metre was subsequently redefined in terms of wavelengths of a specific spectral emission. Similarly, the standard for time was changed from a fraction of the Earth's year (which actually does vary) to the vibrations of an atomic clock. These new standards are not only more accurate, they are reproducible and don't rely on a single physical artifact.

But the kilogram has been a problem and remained an artifact-based standard, defying many attempts to design and build a reproducible standard with sufficient precision and consistency (remember, this is the world of ppm performance). Until very recently, the primary kilogram was a master cylinder known as the International Prototype of the Kilogram (IPK), Figure 1, protected like the old metre standard. In addition to the logistics issues, it seems the master was losing weight when compared to multiple secondary kilogram standards (or maybe they were gaining?). Seriously, when you are looking for this level of perfection, a few molecules rubbing off the surface despite careful handling can make a difference. The interesting news is that the new kilogram will be defined by a reproducible system called a "watt balance," an incredibly sophisticated embodiment of a conceptually "simple" idea.

Figure 1. The world standard for the kilogram is the last SI unit to be an artifact, but that's changing.

The changes go far beyond the basic units. The ampere, so vital to our electronics work, will no longer be defined in terms of the current through parallel wires and the attractive force between them -- always a tricky one to explain and assess. Instead, it will be defined by fundamental atomic constants and quantum physics phenomena, while the volt is defined by an array of superconducting Josephson junctions.

You might think that this is all nice and good, and that we'll still have our basic building blocks of time, distance, mass, and four other so-called "base units," but that's not the case. The really dramatic, fundamental change of the new SI system is that the existing seven base units themselves are changed to seven new ones. These new base units are very different from the ones we're used to, which seemed, for better and worse, somewhat more intuitive to most of us. But progress is progress, so say goodbye to the kilogram and electric current, and hello to electric charge, Avogadro's number, and luminous intensity (to cite a few of the old and new ones). (An enjoyable book on the history of metrology, its many problems and solutions, as well as its evolution and transformations, is World in the Balance: The Historic Quest for an Absolute System of Measurement by Robert P. Crease, Figure 2.)

Figure 2. World in the Balance: The Historic Quest for an Absolute System of Measurement by Robert P. Crease.

The Physics Today article is worth your time, both for the explanatory background it provides and to bring you up to date on these very significant changes to fundamental standards upon which we rely so heavily (often without needing to think about any deeper meaning). While accuracy at the level that the new definitions and standards provide is not critical to most of us and our work, the reality is that in any test and measurement situation there is always the question, "How do you know that measurement is accurate?"

A general guideline is that any equipment measuring a device under test (DUT) must be at least four times better than the desired accuracy of the result. That's called the test uncertainty ratio (TUR). But how do you verify the measurement uncertainty of the test equipment? Going a step further, how do you know the uncertainty of the system used to check the test equipment itself? Pretty soon, you are in deep philosophical territory about the meaning of reality and perfection.

What's your view on the major changes we are undergoing in the basic units of the SI system, and how they are defined? Will it affect you, or is it a "don't care" change?
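As a quick worked example of the 4:1 TUR guideline (the numbers are illustrative, not from the article): to certify a DUT parameter to ±1%, the measuring instrument should be good to about ±1% / 4 = ±0.25%; to verify that instrument, the calibration system should in turn be good to roughly ±0.25% / 4 = ±0.0625%; and so on up the traceability chain, which is exactly why the uncertainty of the primary standards at the top of the chain matters.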
  • Popularity 15 | 2013-5-27 13:09 | 6182 views | 0 comments
Dear Readers,

Here I would like to share some understanding of the keyword "this". What is "this" in SystemVerilog? How is it used? The usage of "this" is simple but important in test bench development.

First of all, let's understand what "this" is in SystemVerilog. "this" is a keyword used to unambiguously refer to class properties or methods of the current object. The "this" keyword shall only be used within non-static class methods; otherwise an error shall occur.

As an example is the best way to understand most things, let me take an example and try to explain.

Example to understand the usage of "this" in SystemVerilog:

#############################################

class ASICwithAnkit;
  int a;

  function new (int a);
    this.a = a;   // "this.a" is the class property; "a" alone is the constructor argument
  endfunction : new
endclass : ASICwithAnkit

// Class instantiation and usage
module tb;
  initial begin
    ASICwithAnkit AwA = new(123);
    $display("AwA.a = %0d", AwA.a);
  end
endmodule

##########################################

In the above example we can see that 'a' is a member of the class "ASICwithAnkit". When we construct the class object, we pass the integer value 123 to its constructor (function new). The constructor argument 'a' shadows the class property, so "this.a" is used to assign the property of the instance "AwA", which now holds 123. A second small sketch of the non-static rule follows below.

Hope this is useful to understand the meaning and usage of "this" in SystemVerilog.

Happy Reading!
ASIC With Ankit
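P.S. As a small complementary sketch of the non-static rule mentioned above (the counter class and its methods are hypothetical, not from the original post): "this" resolves the clash between an argument and a class property inside a non-static method, but it cannot appear in a static method, because a static method has no current object.

#############################################

class counter;
  int count;

  function void bump(int count);        // the argument deliberately shadows the property
    this.count = this.count + count;    // "this.count" selects the class property
  endfunction

  static function void reset_all();
    // this.count = 0;                  // illegal: "this" is not allowed in a static
                                        // method; uncommenting this line is a compile error
  endfunction
endclass

module tb_this;
  initial begin
    counter c = new();
    c.bump(5);
    c.bump(3);
    $display("count = %0d", c.count);   // prints: count = 8
  end
endmodule

#############################################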
  • Popularity 22 | 2013-5-19 00:52 | 3460 views | 1 comment
Dear Readers,

SystemVerilog has a data type called 'queue' which can grow and shrink. With an SV queue we can easily add and remove elements anywhere, which is why we say it can shrink and grow as needed. Queues can be used as LIFO (Last In, First Out) or FIFO (First In, First Out) buffers. Each element in the queue is identified by an ordinal number that represents its position within the queue, with 0 representing the first element and $ representing the last element. The size of the queue is variable, similar to a dynamic array, but a queue may be empty (zero elements) and still be a valid data structure. Let's take a few examples to understand queue operations with the different methods we have in SystemVerilog.

###############################################

int a;
int Q[$] = {0,1,2,3};  // initial queue

initial begin
  // Insert and delete
  Q.insert(1, 1);    // insert value 1 at index 1: Q = {0,1,1,2,3}
  Q.delete(2);       // delete the element at index 2: Q = {0,1,2,3}

  // push_front
  Q.push_front(6);   // insert 6 at the front of the queue: Q = {6,0,1,2,3}

  // pop_back
  a = Q.pop_back();  // pop the last element into 'a'; a = 3, Q = {6,0,1,2}

  // push_back
  Q.push_back(7);    // push 7 onto the back: Q = {6,0,1,2,7}

  // pop_front
  a = Q.pop_front(); // pop the first element into 'a'; a = 6, Q = {0,1,2,7}
end

#####################################################

When you create a queue, SystemVerilog actually allocates extra space, and because of this we can add and remove elements as needed in our test bench. This is a very useful feature in test bench implementation. SystemVerilog automatically allocates the additional space, so we don't need to worry about limits, and the queue will not run out of space.

The queue is a very useful data type for developing test benches in SystemVerilog. It can be used in the development of various test bench components such as the scoreboard, monitor, transaction class, drivers, etc.; a small scoreboard sketch follows below.

Hope this helps in a basic understanding of the queue and its methods.

Happy Reading!
ASIC With Ankit
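P.S. As a small sketch of the scoreboard usage mentioned above (the transaction and scoreboard classes are hypothetical, a minimal illustration rather than code from the original post), a queue makes a natural FIFO of expected transactions: push_back() stores an expected item when it is generated, and pop_front() retrieves items in order when the actual results arrive.

#############################################

class transaction;
  bit [7:0] data;
endclass

class scoreboard;
  transaction expected_q[$];               // queue of expected transactions (FIFO)

  function void write_expected(transaction t);
    expected_q.push_back(t);               // enqueue at the back
  endfunction

  function void check_actual(transaction t);
    transaction exp;
    if (expected_q.size() == 0) begin
      $display("ERROR: unexpected transaction 0x%0h", t.data);
      return;
    end
    exp = expected_q.pop_front();          // dequeue in FIFO order
    if (exp.data !== t.data)
      $display("MISMATCH: expected 0x%0h, got 0x%0h", exp.data, t.data);
    else
      $display("MATCH: 0x%0h", t.data);
  endfunction
endclass

#############################################

Using pop_back() on the same queue instead of pop_front() would give LIFO (stack) behaviour, which is occasionally useful as well.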
  • Popularity 20 | 2013-5-1 01:03 | 2666 views | 1 comment
Dear Readers,

We have been using standard languages and methodologies for ASIC/FPGA design and verification for years, and as engineers we should know some of the history behind them. Today we mostly work with SystemVerilog, and the whole industry has moved to accept this language, together with a few methodologies (RVM, VMM, AVM, OVM, UVM, etc.), as its standard for new and existing product development and verification. Since we use industry-standard languages like VHDL, Verilog, and SystemVerilog, it is worth understanding how we got here. How did we reach SystemVerilog? What other verification languages were engineers using over the past few decades? How did verification move from Verilog to SystemVerilog? Let's go back in history and answer these questions.

When Verilog was first developed in the mid-1980s, its main purpose was to describe synthesizable RTL of modest complexity. By the late 1980s, synthesis and simulation had triggered a revolution in the EDA industry. As time passed, in the 1990s the industry realized a tremendous need to solve the verification problems created by increasingly complex designs. This was when EDA companies played a key role in filling that gap. The verification languages that became popular in those days were proprietary to particular companies; the best examples are OpenVera and the 'e' language. Because these languages were proprietary, some people instead used object-oriented languages like C++, and some users even wrote their test benches in Verilog itself. Interesting, isn't it?

Verilog itself became an industry standard during the 1990s. In the 1980s a company called Gateway Design Automation had developed the Verilog-XL logic simulator, and Cadence acquired the rights in 1989. With a new strategy, Cadence put the language into the public domain with the intention that Verilog should become a standard, and Verilog HDL has since been maintained by Accellera, a non-profit organization. Verilog HDL became an IEEE standard in 1995, with revised versions in 2001 and 2005 that the industry adopted as it moved ahead.

Accellera also developed a new standard called SystemVerilog, which extends Verilog with many new features, including the concepts of object-oriented programming. SystemVerilog became an IEEE standard (1800-2005) in 2005. SystemVerilog is a superset of Verilog plus all the features known to be necessary for verification. It is used mostly for verification because of its higher level of abstraction and user-friendly features. Today SystemVerilog has already become the standard for verification, and most companies have started to embrace it. In addition to SystemVerilog itself, people have come up with methodologies such as RVM, VMM, AVM, OVM, and UVM. With these methodologies, verification environments are becoming easier to handle, more user friendly, and, most importantly, reusable!

Happy Reading!
Ankit Gopani