
Verification Methodology Basics & UVM

ASIC verification has largely shifted its focus from the legacy procedural approach of building test benches to a more advanced object-oriented, layered test bench architecture. It was imperative to find a solution to meet the verification requirements of increasingly complex designs. Thus SystemVerilog, which borrowed concepts and features from Verilog, VHDL and OOP (Object Oriented Programming), was born to combat the challenges faced by the ASIC verification industry. SystemVerilog based test benches meet modern verification needs: they are scalable, reusable, and they support constrained randomization as well as coverage. Besides this, SystemVerilog offers features that raise the test bench abstraction level several notches higher, simplifying test bench construction and making the code more readable.
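To make these features concrete, here is a minimal, self-contained sketch of a constrained random data object with embedded functional coverage. The class name eth_frame and its fields are purely illustrative assumptions, not taken from any particular design.

```systemverilog
// Illustrative constrained-random transaction with embedded coverage.
class eth_frame;
  rand bit [47:0] dest_addr;
  rand bit [47:0] src_addr;
  rand bit [15:0] length;

  // Constrained randomization: keep the payload length within legal limits.
  constraint legal_length_c { length inside {[46:1500]}; }

  // Functional coverage on the randomized length field.
  covergroup length_cg;
    coverpoint length {
      bins small_frames  = {[46:255]};
      bins medium_frames = {[256:1023]};
      bins large_frames  = {[1024:1500]};
    }
  endgroup

  function new();
    length_cg = new();   // embedded covergroups are constructed here
  endfunction

  function void sample_coverage();
    length_cg.sample();
  endfunction
endclass

module tb;
  initial begin
    eth_frame frame = new();
    repeat (10) begin
      if (!frame.randomize()) $error("randomization failed");
      frame.sample_coverage();   // collect coverage on every generated frame
    end
  end
endmodule
```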

The current technological scenario demands 'reuse' of IPs. It is more of a necessity than a trend to deploy IPs from other vendors/companies into your SOC, and this applies to verification code as well. There is a growing need to plug and play VIPs (Verification Intellectual Property) from third-party sources in order to accelerate verification. But this is possible only if all the VIP makers follow the same 'rule book'. This rule book, in the verification context, is called a verification methodology. Many verification methodologies have surfaced over the past decade, for example eRM (e Reuse Methodology), built around e (Specman), RVM (Reference Verification Methodology), built around Vera, and the more advanced VMM, AVM, OVM and UVM, which were built to standardize SystemVerilog based test benches.

Why is there a need to ‘define’ a Verification Methodology?

While following a methodology has many advantages, the primary reason is to ensure 'interoperability' of code. There is an almost insatiable quest for faster, leaner, smarter and more power-efficient gadgets. Designs are getting increasingly complex, yet time-to-market and cost expectations are continuously being slashed. Often, SOC manufacturers deploy third-party design IPs to cut time to market and cost (there is no need to invest time and money in coding something that already exists). Along the same lines, third-party verification IPs (VIPs) are often deployed in test benches to avoid reinventing the wheel and hence achieve verification goals faster.

SystemVerilog is packed with features that help in generating constrained random stimulus as well as in implementing thorough verification through assertions and coverage. SystemVerilog test benches are built at a much higher abstraction layer than legacy test benches, and thus the code is readable. The time to bring up and maintain test benches is also lower compared to the legacy approach. Additionally, the object-oriented approach helps build test bench components that are reusable. Although SystemVerilog is this powerful, it still needs a 'methodology' defined on top of it, and here is why.

A verification methodology is a mix of 'best practices' and additional features such as a Base Class Library (BCL) that together aim to build reusable, fuss-free and interoperable test bench code. The 'best practices' define 'how' each and every component of a layered test bench architecture should be designed. They provide a set of 'guidelines' on how the test bench components must be interconnected and placed in the layered test bench hierarchy. One of the remarkable features a methodology offers is to offload the job of 'phasing' the test case simulation steps so as to avoid test bench races. By adhering to these best practices, verification engineers can build code that is interoperable, reusable, and easier to build or even modify.

The Base Class Library (BCL), which works hand in glove with these best practices, is a set of predefined classes that serve as a 'reference' for building complex test bench components. Many of these classes are virtual (abstract) in nature, but they contain the fundamental infrastructure needed to build the corresponding test bench component. As an example, all the data objects, in other words 'transactions', can be derived from the virtual class 'uvm_transaction'. This class contains properties and methods that are necessary to track, view, create, clone or even print data transactions such as an Ethernet packet. Test bench components such as the driver are key to the layered architecture. The 'driver' is derived from the 'uvm_driver' class, which in turn is derived from the 'uvm_component' class. The 'uvm_component' class is derived from 'uvm_report_object', which is in turn derived from the 'uvm_object' class. Thus, 'uvm_driver' not only inherits features from its predecessors, such as the basic framework for searching and traversing the component hierarchy, test case phasing (building test bench components, interconnecting them, configuring them, timing their roles in a simulation cycle, etc.), configuration, reporting, transaction recording (adding a 'produced' or 'consumed' transaction into a database, useful in scoreboards or monitors) and the factory (to create new components), but it also contains 'driver' specific methods and attributes that provide the 'base' on which to build the 'driver' component, as sketched below. Thus, by adopting a 'methodology' we can build 'interoperable', 'readable', 'manageable' and 'reusable' test benches without having to walk to the moon and back.
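As a rough illustration of this inheritance, the sketch below derives a hypothetical transaction (my_txn) from uvm_sequence_item, itself a child of uvm_transaction, and a hypothetical driver (my_driver) from uvm_driver. It assumes the standard Accellera UVM base class library; the class names and fields are placeholders, not part of any real VIP.

```systemverilog
`include "uvm_macros.svh"
import uvm_pkg::*;

// Data object: derived from uvm_sequence_item (a child of uvm_transaction),
// so it inherits create/clone/print/record infrastructure for free.
class my_txn extends uvm_sequence_item;
  rand bit [31:0] addr;
  rand bit [31:0] data;

  `uvm_object_utils_begin(my_txn)
    `uvm_field_int(addr, UVM_ALL_ON)
    `uvm_field_int(data, UVM_ALL_ON)
  `uvm_object_utils_end

  function new(string name = "my_txn");
    super.new(name);
  endfunction
endclass

// Driver: derived from uvm_driver, which extends uvm_component ->
// uvm_report_object -> uvm_object, so it inherits phasing, hierarchy
// traversal, reporting and factory support.
class my_driver extends uvm_driver #(my_txn);
  `uvm_component_utils(my_driver)

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  virtual task run_phase(uvm_phase phase);
    forever begin
      seq_item_port.get_next_item(req);   // pull a transaction from the sequencer
      `uvm_info(get_type_name(), {"Driving:\n", req.sprint()}, UVM_MEDIUM)
      // ... drive the pins of the bus interface here ...
      seq_item_port.item_done();
    end
  endtask
endclass
```

Because my_driver sits at the bottom of the uvm_object, uvm_report_object, uvm_component, uvm_driver chain, the factory registration, reporting macros and phase hooks used above come for free; the only code written here is the driver-specific behaviour.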

Trends in the Verification Methodology Arena

SystemVerilog was developed to integrate concepts from a number of Hardware Verification Languages (HVLs) such as Vera and e (Specman). These languages were used alongside Hardware Description Languages such as Verilog and VHDL, and each had its own corresponding methodology: RVM (Reference Verification Methodology) for Vera and eRM (e Reuse Methodology) for e served as rule books for building reusable and interoperable test benches.

Mentor created a methodology for SystemVerilog once the language began to gain acceptance; this methodology is called AVM (Advanced Verification Methodology). It was developed in 2006 and borrowed concepts from SystemC. Around the same time, Synopsys had begun porting its Vera based RVM (Reference Verification Methodology) library to the SystemVerilog based VMM (Verification Methodology Manual) library. Cadence had acquired Verisity and reworked the existing eRM into URM (Universal Reuse Methodology). The EDA industry showed signs of convergence in 2008 when Mentor and Cadence collaborated to create OVM (Open Verification Methodology), a mix of the existing AVM (Advanced Verification Methodology) and eRM concepts. Accellera then decided to standardize test benches, and hence verification, by transforming OVM (Open Verification Methodology) into UVM (Universal Verification Methodology). All the EDA giants such as Synopsys, Mentor, Aldec and Cadence backed Accellera in this process.

Why does UVM rule the roost?

Although all these methodologies performed the common task of 'standardizing' verification, UVM has taken the cake. It is a well-known fact that UVM evolved from OVM (a methodology formulated by Cadence and Mentor), which in turn is based on eRM (a methodology developed by Verisity Design in 2001 for the e verification language). Thus, one of the primary reasons for the rapid adoption of UVM in building test benches is that it finds its roots in well-known and industry proven methodologies. A look at each of the earlier methodologies reveals the second reason why UVM has gained precedence: each was tied to a single vendor. eRM was defined by Verisity and was built around the verification language e. Along the same lines, RVM (Reference Verification Methodology for the Vera HVL) and VMM (Verification Methodology Manual for SystemVerilog) were defined by Synopsys. Mentor developed AVM (Advanced Verification Methodology for SystemVerilog), and OVM (Open Verification Methodology for SystemVerilog) came from Mentor and Cadence joining hands to make their methodologies open source. UVM, however, is an Accellera standard backed by all the big names in the EDA industry such as Synopsys, Cadence, Mentor and Aldec. Hence it is 'simulator independent': if the test bench code adheres to the UVM guidelines, it can be simulated on a wide range of simulators.

Besides being simulator independent and open source, UVM has a bunch of unique features that make it popular. A better 'phasing' mechanism is a major crowd puller: the 'run' phase is refined into 12 run-time sub-phases, and there is provision for 'user-defined' phases, giving finer run-time control. Another advantage UVM offers is the config_db/resource_db, a 'centralized' database that can be used to store as well as retrieve different 'types' of information. The data stored in the database can range from queues and class handles to virtual interfaces. The config_db works like a repository wherein 'certain' portions of the test bench can be stored and pulled out later to build a different framework. Thus, modifying the test bench or even parameterizing it is simplified by the use of config_db, as sketched below.
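The following sketch shows one common use of the centralized database: a top-level module stores a virtual interface with uvm_config_db::set, and a component retrieves it in its build_phase. The interface dut_if, the component cfg_driver, the test name my_test and the instance-path strings are illustrative assumptions, not fixed UVM names.

```systemverilog
`include "uvm_macros.svh"
import uvm_pkg::*;

// A placeholder DUT interface.
interface dut_if(input logic clk);
  logic       valid;
  logic [7:0] data;
endinterface

// A component that fetches the virtual interface from config_db.
class cfg_driver extends uvm_driver #(uvm_sequence_item);
  `uvm_component_utils(cfg_driver)

  virtual dut_if vif;

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  virtual function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    // Retrieve the handle stored by the top module; fail loudly if missing.
    if (!uvm_config_db #(virtual dut_if)::get(this, "", "vif", vif))
      `uvm_fatal(get_type_name(), "No virtual interface found in config_db")
  endfunction
endclass

module tb_top;
  logic  clk;
  dut_if dif(clk);

  initial begin
    // Store the virtual interface so any component under the test
    // (matched by the wildcard path) can fetch it by name.
    uvm_config_db #(virtual dut_if)::set(null, "uvm_test_top.*", "vif", dif);
    run_test("my_test");   // my_test is assumed to be defined elsewhere
  end
endmodule
```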

Summary

A verification methodology can be defined as a 'rule book' combined with a BCL (Base Class Library) that catalyses the task of developing reusable and configurable test benches. Methodologies provide additional features that augment the capabilities of HVLs (Hardware Verification Languages) and hence help build standard layered test benches equipped with coverage, assertion, phasing and randomization support. To cope with the constant demand for 'better' electronics, it became almost mandatory to 'reuse' IPs, be it for the designer or for the verification engineer. Thus, by defining a standard set of rules and by using a predefined BCL (Base Class Library), the 'reuse' of third-party VIPs (Verification IPs) in one's environment became a reality.

All the verification methodologies developed so far stood tall in their time, for they served the purpose of providing a structured and efficient framework for building standard test benches. However, as all good things must come to an end to let better things begin, each one of them was gradually overshadowed by the quest for a methodology better than the rest. The years of hard work, research and industry proven concepts behind these verification methodologies were distilled into UVM, making it a far better version of its predecessors.

Unlike the other methodologies, UVM has received support from major EDA companies such as Synopsys, Cadence, Aldec and Mentor. This makes it simulator independent and open source too. With the improved phasing mechanism in UVM, simulation can be controlled more finely. With features such as config_db/resource_db, it becomes easy to modify, change or parameterize test benches. Above all, UVM derives its features and concepts from the erstwhile methodologies that have consistently delivered promising results in the industry. Thus, UVM is now a widely accepted methodology across the length and breadth of the ASIC verification industry.
