Formal Verification Goes Mainstream  

August 5, 2008 -- Formal verification is a term that has been kicking around the EDA industry for years, but only recently has it seen real success in helping designers verify their complex SoCs, processors, and ASICs. When the term was first introduced, formal verification usually referred to equivalence checking: a methodology that compared versions of a design as it went through synthesis and optimization, to ensure that the evolving implementation still met the original specification. While many designers still think of formal verification in those terms, the more recent and widely adopted technology is formal functional verification, which uses assertions to find corner-case bugs that traditional verification techniques might miss.

This type of formal verification was slow to catch on, mainly because assertion-language standards were going through the usual infighting and debate that surround the establishment of any standard. The standards have since stabilized, with the two most common assertion languages being SystemVerilog Assertions (SVA) and the Property Specification Language (PSL), the latter favored by VHDL users. That stabilization has hastened the adoption of formal verification, and today more and more engineers are using formal techniques to complete their standard verification flows.

"Though formal verification today is still seen as a niche application, published success stories from large semiconductor companies such as Analog Devices, Intel, NVIDIA and Texas Instruments have hastened the adoption of formal techniques in smaller companies," says Dan Benua, Senior Manager, Corporate Applications Engineering at Synopsys, Inc. "Though it's not a complete replacement for simulation, it fills in some of the areas where traditional verification methods are weak."

According to Benua, there are two things that formal verification does well: bug hunting and exhaustive proof. In the bug-hunting arena, the obscure corner-case bugs it uncovers could in principle be found by regular simulation, but only if the engineer happened to write some extremely specific, low-probability stimulus. The power of formal verification is that it searches the design's state space and finds scenarios that a simulation environment would have difficulty reaching. On the proof side, a simulator can never exhaustively prove that something is always correct. It can only show specific scenarios in which the circuit operates correctly; it can't prove that there are no bugs. A formal tool can, at least in theory, prove that no bugs exist.
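
To make Benua's two use models concrete, here is a minimal SVA sketch; the module, clock, and grant signals are hypothetical, not from any Synopsys example. A formal tool given this assertion either searches the state space for a violating trace, which is the bug-hunting use, or reports a full proof that no reachable state violates it, which simulation can never do.

```systemverilog
module arbiter_props (
  input logic clk,
  input logic rst_n,
  input logic grant0,   // grant outputs of a hypothetical arbiter under test
  input logic grant1
);
  // Mutual exclusion: the two grants must never be high in the same cycle.
  // Simulation can only show this holds for the stimulus it happens to run;
  // a formal tool can return a counterexample trace or an exhaustive proof.
  a_mutex: assert property (@(posedge clk) disable iff (!rst_n)
                            !(grant0 && grant1));
endmodule
```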

Additional Reading

What Ever Happened to Formal Verification?

Combining Metrics from Simulation and Formal

What Hardware Designers Need to Know about SystemVerilog Testbench Debug and Analysis

"Formal verification is limited, however, in capacity and coverage," Benua continues. "The memory and compute resources in the verification hardware are limited and there's the time it takes for designers to learn how to write formal assertions. The designer needs to describe the environment around the design, such as defining a legal stimulus and the reset procedure for the simulator. These need to be described differently for formal than conventional simulation."

Designers also use formal techniques to verify the interfaces between blocks, checking that block-to-block interactions exactly follow the protocols for standards such as AMBA or USB. Because different teams might work on different blocks, verifying the interfaces is very important. A block might work fine in and of itself, but it can run into problems when it communicates with another block across an interface if all of the design rules surrounding the standard haven't been followed precisely.
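
As an illustration of this kind of interface checking, the sketch below captures two typical handshake rules in SVA. The signal names and the 16-cycle bound are assumptions for illustration, not rules from the AMBA or USB specifications.

```systemverilog
module handshake_props (
  input logic clk,
  input logic rst_n,
  input logic req,   // request from the transmitting block
  input logic ack    // acknowledge from the receiving block
);
  // Rule 1: once raised, req must hold steady until ack arrives.
  a_req_stable: assert property (@(posedge clk) disable iff (!rst_n)
                                 req && !ack |=> req);

  // Rule 2: every new request is acknowledged within 16 cycles (assumed bound).
  a_ack_timely: assert property (@(posedge clk) disable iff (!rst_n)
                                 $rose(req) |-> ##[1:16] ack);
endmodule
```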

Standing out in the crowd

Some designers also point out that no one formal tool is best for all designs. Any tool can win a benchmark on one particular problem but lose on others. This is where a designer's expertise comes in: designers have to understand the intricacies of their design's behavior to know which formal tool really gives the coverage and accuracy they need. Because no single vendor's technology wins every benchmark, each EDA company touts the differentiators in its solution.

Synopsys, for instance, points to its Magellan hybrid RTL formal verification product that it says combines the strengths of advanced formal engines with the strengths of a built-in VCS simulation engine to verify properties on large and complex Verilog and VHDL designs.

Other large EDA vendors, such as Cadence Design Systems, Inc. and Mentor Graphics Corp. each point to features that the companies hope will win the day in a benchmark situation.

"There are technical metrics that can be compared in benchmarks," says Tom Anderson, Product Marketing Director at Cadence. "Engineers look at how large a block a formal verification tool can handle, and assess its accuracy and speed. At Cadence we have two differentiators. We put a lot of effort on improving the formal engines and algorithms that drive those engines. This isn't very visible to the customer, but if it gives better results, then that's a selling point, because the better the engines, the better the results in the overall verification effort. Cadence has spent a lot of time persuading design engineers to get involved in the verification effort. After all, they understand the design's intent and the intricacies of its behavior. As far as results go, I can safely say that formal verification results in an average of 25% reduction in the verification effort. Some designs post results that exceed that metric and some don't quite make that number, but an average is 25%. That's assuming that a company is making a broad effort to adopt formal technology and not just assign one or two engineers to occasionally use the tool."

Figures 1 and 2 below compare the amount of manual intervention required in a traditional approach vs. the Automated Verification Management approach with the Open Verification Methodology (OVM).

Figure 1. Directed testing. Source: Cadence Design Systems, Inc.


Figure 2. OVM in a metric-driven verification flow. Source: Cadence Design Systems, Inc.


Another EDA leader, Mentor Graphics, is also trying to make formal verification attractive to the design engineer. Mentor emphasizes the role that assertions play in the formal verification effort, which means that design engineers might not have to spend time developing testbenches at the very outset of product development. Testbenches are certainly an important element in more detailed, traditional verification flows, but, according to Mentor, assertions can be a vital part of assessing the coverage that a verification effort provides for the design.

"Assertions generally appeal to designers because they are general descriptions of design intent for different blocks in the circuit," says Rindert Schutten, Product Marketing Manager for Mentor Graphics 0-1 verification product. "Assertions are simple. If you have a FIFO in your design, for example, one part of the design fills up the FIFO and another portion empties it out. One thing you might want to define is, for example, is that if the FIFO is full, then you should not be able to write additional data into it. This isn't defined in the RTL code, which just executes the FIFO's behavior. Through an assertion, you can very succinctly capture that property and its rules for correct behavior, i.e., FIFO overflow. Or underflow, so that you cannot read from an empty FIFO. These are examples of simple assertions that you can use to ensure that the RTL cannot carry out illegal instructions."

According to Schutten, Mentor's differentiator is assertion density analysis, which helps ensure that you've covered all of the design's functionality. Schutten points out Mentor's coverage targeting, which analyzes existing assertions and automatically creates new tests to reach coverage goals. (Learn more about assertion density analysis and coverage targeting by reading the Mentor Graphics article What Ever Happened to Formal Verification?.)

Ensuring that any verification technology adequately covers a design is no trivial task. If designers get too caught up in pursuing comprehensive coverage, verification can take too much time and design schedules can slip. Inadequate coverage, however, can leave a design with fatal flaws that might only be discovered after placement and routing, or that could even lurk in the product until release. EDA companies are working to strike the right balance between these extremes.

"In dynamic verification, you have to drive the right stimuli into the system in order to find the erroneous behavior," says Michael Siegel, Director of Product Marketing at OneSpin Solutions GmbH. "This, of course, depends on the quality of the stimuli. If you write bad properties, you cannot find errors in the RTL. We feature 'Gap Free Verification,' which inspects how good the property set is that's used for verification. In other words, the verification is only as good as the property set that's used. For example, if I have a processor and I've defined 10 properties, those capture only 10 instructions; I won't find errors in the other instructions. Other approaches only partially cover this gap and they also depend on expertise of the designer in writing assertions. What kind of gaps can be detected with Gap Free Verification? Assume that we have DRAM controller that performs read and write operations. If some of the output of the controller has not been properly specified by the properties, then the gap detection would say you didn't write the expected values of that output signal for a certain input. It is a non-trivial task to write comprehensive properties in SVA or PSL because they are pretty complex languages."

OneSpin advocates working from the timing diagrams typically found in an informal specification; these could describe, for example, the read operations of an SDRAM controller. To that end, OneSpin has devised a timing diagram assertion library that translates a timing diagram into a property, which can then be verified against the RTL code.
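
OneSpin's timing diagram assertion library is proprietary, but the idea it automates can be sketched by hand: a waveform from the informal specification becomes an SVA property. The snippet below assumes a read command that returns valid data exactly two cycles later; the latency and signal names are illustrative, not from an actual SDRAM specification.

```systemverilog
module sdram_read_props (
  input logic clk,
  input logic rst_n,
  input logic rd_cmd,      // read command accepted this cycle (assumed name)
  input logic data_valid   // controller marks returned data as valid
);
  // From the timing diagram: data is valid exactly two cycles after a read.
  a_read_latency: assert property (@(posedge clk) disable iff (!rst_n)
                                   rd_cmd |-> ##2 data_valid);
endmodule
```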

"This speeds the verification process considerably," says Siegel. "We have one customer with a processor that had been dynamically verified and then redesigned. For the redesign, only formal methods were used with the Gap Free technology. It only took 40% of the effort and higher quality was achieved. Another project was the verification of a complex arbiter and that had been verified using constrained random techniques and it took fifteen weeks. With formal verification, the customer found additional errors with less than half the effort of the constrained random testbench."

Figure 3. According to OneSpin, its 360 MV technology automatically detects gaps in the property set. The identification of incomplete properties drives improvement of individual properties, while the identification of missing properties drives completion of the verification plan and the property set.


Even with EDA vendors touting the benefits of formal solutions, most companies rely on SystemVerilog testbenches as the basis of their verification flows. While this isn't formal verification per se, it does use constructs in the SystemVerilog language for measuring coverage, and that dovetails nicely with a formal verification approach. Familiarity with SystemVerilog lets both design and verification engineers use a common language to write assertions and more complex testbenches. (Learn more about testbench creation with SystemVerilog by reading the SpringSoft article What Hardware Designers Need to Know about SystemVerilog Testbench Debug and Analysis.)
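
As a rough sketch of those coverage constructs (all names here are illustrative), a SystemVerilog covergroup sampled in simulation can sit alongside an SVA cover property, so coverage evidence from simulation and formal runs can be viewed together:

```systemverilog
module fifo_coverage (
  input logic clk,
  input logic rst_n,
  input logic wr_en,
  input logic rd_en,
  input logic full
);
  // A formal tool can try to prove this cover property reachable;
  // a simulator records whether any test actually hit it.
  c_fill: cover property (@(posedge clk) disable iff (!rst_n) full);

  // Functional coverage sampled every clock cycle in simulation.
  covergroup cg @(posedge clk);
    cp_ops: coverpoint {wr_en, rd_en} {
      bins write_only = {2'b10};
      bins read_only  = {2'b01};
      bins both       = {2'b11};  // simultaneous read and write
    }
  endgroup
  cg cov_inst = new();
endmodule
```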

A different approach

Some formal verification tool vendors try to break from the pack with technology that speeds the verification process and provides the capacity to handle more gates. Calypto Design Systems, Inc., for example, claims that its SLEC System finds design errors that other tools miss by formally comparing the functionality of an electronic system level (ESL) model written in C/C++ or SystemC with its corresponding RTL design or system-level model for all possible input sequences. According to Calypto, this technology doesn't require one-to-one mapping of registers, and the verification the SLEC System performs in minutes equals months of traditional simulation runs.

"The SLEC System either confirms that no functional errors exist or generates short, concise waveforms that pinpoint design bugs," says Mitch Dale Director of Product Marketing at Calypto. "These waveforms are written in standard VCD and FSDB formats that can be analyzed in the user's native debugging environment. We can comprehensively verify an RTL design in a matter of hours because we're using sequential equivalency checking, which lets people verify at the system level and not have to go back and re-run all those regressions at the RTL. System models run from 1,000 to 10,000 times faster and they can do more in a short amount of time. With formal verification, you check all possible input conditions over all times so you can quickly find these corner case bugs."

Dale also says that formal algorithms have an exponential nature that can grind to a halt when designs exceed a certain size. "We can run over a million gates," he says. "We can verify comprehensively in a matter of hours. Typically we only have to run about 10 or fewer cycles, so a designer can see the bug and fix it. Regular simulation runs about a million cycles."

Figure 4. Calypto's SLEC simplifies and reduces the time to debug designs. Design differences (bugs) are displayed as short, concise counterexamples, which pinpoint the point of divergence in short (typically less than 10 cycles) waveforms or simulation testbenches that designers debug in their preferred debugging environment.


Another formal verification tool provider, Real Intent, Inc., espouses timing closure verification (TCV) to complete the formal verification process. This approach, found in its Meridian and PureTime products, examines clock-to-clock sequences in designs with multiple asynchronous clock domains.

"When a signal is going from one clock domain to another, we call it clock domain crossing signals or CDC signals," says Rich Faris, Vice President of Marketing at Real Intent. "Since these clocks are asynchronous to each other, it is possible that the signal might transition during the setup and hold window of the receiving clock, thereby resulting in metastability in the design. Metastability can cause various design errors. For example, signals losing correlation, hazards being captured, losing valid data, or metastability propagation. Let's say you have a complex math function that will take three ticks of the clock to complete. Synthesis will try to cram that into one clock tick and that just won't work. In this case, the path that doesn't contribute to the overall timing is called a false path. When it takes multiple clock cycles to settle it's called a multi-cycle path. These are both exceptions to the overall timing rule. Our tools look at this circuit and ensure enough setup and hold time. It's a cycle by cycle analysis. You can perform this analysis at the RTL or at the output of the synthesis to make sure there are no timing violations. Our tools would flag that error and let you correct it."

Figure 5. CDC design principles. Source: Real Intent.


In Figure 5 above, the data path should be a multi-cycle path: the data transmitted should not be sampled at the next edge of the receiving clock. The control signals need to be synchronized, and multi-bit control signals should be gray-coded to ensure only one signal transitions at a time. They should be free of hazards and glitches and remain stable for more than one receiving clock cycle.
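
As a minimal sketch of these principles (module and signal names assumed), the classic remedy for a single-bit control crossing is a two-flop synchronizer:

```systemverilog
module sync_2ff (
  input  logic rx_clk,     // receiving-domain clock
  input  logic rst_n,
  input  logic async_in,   // single-bit control from the transmitting domain
  output logic sync_out    // safe to use in the receiving domain
);
  logic meta;  // first stage: may go metastable, must never be used directly
  always_ff @(posedge rx_clk or negedge rst_n) begin
    if (!rst_n) begin
      meta     <= 1'b0;
      sync_out <= 1'b0;
    end else begin
      meta     <= async_in;  // may sample during the setup/hold window
      sync_out <= meta;      // one full cycle for metastability to resolve
    end
  end
endmodule
```

A multi-bit bus cannot safely pass through independent copies of this circuit, because each bit could resolve on a different receiving-clock edge. That is why the figure calls for gray coding, which guarantees only one bit changes per transition, or for a qualified handshake.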

Automating the process

Some EDA companies are emphasizing ease of use and automation to entice design engineers to use formal techniques. Jasper Design Automation, for example, says that its JasperGold verification tool lets non-expert users solve complex problems.

"We take a targeted deployment approach where we work with the customer to identify which blocks should be verified with formal techniques," says Rajeev Ranjan, Chief Technical Officer at Jasper Design Automation. "We help them create a verification plan and incorporate formal methods into that plan. The training to enable the user to write a requirement for the block takes place during the creation of the verification plan. The debugging environment uses timing diagrams to pinpoint the place in the design which is failing and highlights the RTL code that needs to be fixed."

According to Ranjan, Jasper's design tunneling formal infrastructure verification system provides a high level of verification productivity. Properties can be specified as end-to-end block-level microarchitecture statements known as high-level requirements. These properties express input-to-output block-level design intent.

"Proving properties at these higher levels of abstraction is more powerful and efficient than writing and testing hundreds of lower-level, individual assertions scattered throughout the logic," Ranjan continues. "We consistently perform full proofs on properties where other formal tools fail to converge, with an average of 10x capacity improvement compared to other solutions." (To learn more about comparisons between formal methods and traditional testbench technologies read the Jasper Design Automation article Combining Metrics from Simulation and Formal.)

Jasper's "push-button" mode formal verification completely tests all possible logic values that can be applied to each assertion and identifies any possible conditions that violate the property. If any failures are found, its tool automatically generates a multi-cycle waveform displaying the complete path from reset to the failure. Designers and verification engineers can then quickly clean up RTL code in the earliest stages of the design flow where it is easiest to fix, without the need for costly testbench development.

Formal verification moving into traditional flows

As this article shows, the simple term formal verification comes in many guises. Each vendor of a formal verification tool highlights a different feature set in its solution that sets it apart from the competition. This leaves designers and verification engineers with a dizzying array of choices when they decide to embrace formal verification techniques.

Careful understanding of which features will actually serve them best is key to engineers choosing the right tools for themselves and their designs. Even so, one thing is clear: formal verification is no longer an esoteric technology reserved for large semiconductor companies. As designs become more complex and schedules tighten, everyone will be looking for ways to streamline the verification process. To that end, tool vendors are promising fast, reliable, easy-to-use tools to help engineers move into the realm of formal verification. Those who add this technology to their established design flows will probably get their products to market faster than those who rely solely on traditional, time-consuming verification techniques. Change is tough, but it might make the difference between a successful product launch and a design that goes through numerous iterations before it can be released.


By Mike Donlin, Senior Editor, SOCcentral.com
