In a recent EE Times survey of FPGA designers on EDA tools, I noticed a considerable drop (almost 50%) in the use of Formal Verification (FV) between 2005 and 2006. At the same time, the respondents predict their usage will increase in the future. I attribute some of this drop to the design-methodology change required to get the potential benefit from FV.
A good presentation by Harry Foster (formerly of Jasper, now at Mentor) describes why methodology is important. FV use in the FPGA design flow will continue to creep up as former ASIC designers move to FPGA design (bringing their FV experience with them) and design complexity continues to expand while schedules don’t.
As design complexity increases, the percentage of total design time spent on verification increases with it. Common estimates put verification at more than 50% of total design time.
Traditional ASIC verification has relied heavily on simulation, in which test stimuli are fed to the design and the output is checked for correct results. Simulation has also become much more common in the FPGA verification flow, although the reprogrammable fabric gives the FPGA designer more chip-level options – for example, an internal FPGA logic analyzer such as Xilinx’s ChipScope™ or Lattice’s ispTRACY™.
All verification techniques have their pros and cons. As more time is spent in simulation, coverage improves at a diminishing rate. Moreover, the best that simulation can tell a designer is “no bugs have been found yet”, rather than the much more useful “this design is correct”.
Formal verification is a technique that takes a very different approach to verifying a design. It also has its pros and cons, but the consensus is that simulation and formal verification are highly complementary. Together, they have the potential to deliver the highest verification coverage in the shortest time (and thus at the lowest cost).
Formal verification relies on user-defined “rules” that characterize how the design should behave – the so-called design intent. For example, in a bus protocol, if bus signal A should never be high two cycles after bus signal B goes low, that requirement can be captured as a rule. These rules can be checked in two fundamental ways:
- Dynamic or Simulation Based: while the design is running in simulation, the rules are checked for any violations.
- Static: through formal analysis of the netlist, counter-examples are generated that show how a rule can be violated. If such a counter-example were run through a simulator, the violation could be observed.
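As a concrete sketch, the bus-protocol rule described above (“signal A should never be high two cycles after signal B goes low”) could be captured as a SystemVerilog Assertion. The module, signal, and property names here are hypothetical, not from any particular design:

```systemverilog
// Hypothetical checker for the rule from the text:
// "bus signal A must never be high two cycles after bus signal B goes low."
module bus_rule_checker (
  input logic clk,
  input logic rst_n,
  input logic a,
  input logic b
);
  // $fell(b) is true in the cycle where b transitions 1 -> 0;
  // ##2 advances two clock cycles, where a is required to be low.
  property a_low_after_b_fall;
    @(posedge clk) disable iff (!rst_n)
      $fell(b) |-> ##2 !a;
  endproperty

  assert property (a_low_after_b_fall)
    else $error("Rule violated: a was high two cycles after b fell");
endmodule
```

The same property serves both checking styles: a simulator flags any violation while the testbench runs (dynamic), while a formal tool searches exhaustively for a counter-example that makes the assertion fail (static).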
Both dynamic and static FV approaches are used in Assertion Based Verification (ABV). Like a simulation testbench, the FV rules have to be created by the design or verification engineer. Two rule languages have emerged:
- SystemVerilog Assertions (SVA) are part of the SystemVerilog language, and can be a powerful method for designers to capture and formally document their design intent during design entry.
- PSL (Property Specification Language) is more HDL-neutral and has additional support for more advanced formal verification techniques.
A paper comparing the two languages can be found at: http://www.pslsugar.org/papers/pslandsva.pdf
Formal verification is not new, yet its general adoption has been slow. Certain designs with high concurrency and low sequential depth (e.g. processors) are ideal for formal verification, and the technique is commonly used on them; there, a few rules can accomplish much more than many simulation test vectors.
However, the fact that simulation and formal verification are complementary means the user must decide up front how the two techniques will be applied. This requires much more planning, expertise in writing rules, and, in essence, adopting a new design methodology. That kind of thing doesn’t happen overnight.