Directions in Formal and Semi-Formal Verification


This was initially intended as a short executive briefing note on current commercial developments in formal and semi-formal verification. It was written as part of "due diligence" investigations prior to a research grant application. What follows is very brief, incomplete and obsolete in parts, and may contain errors (it is hard to tell what some proprietary tools actually do from their marketing material). Please send me updates, additions and corrections.

The traditional formal verification methods are Boolean equivalence checking, model checking and theorem proving. A few years ago these were `hot' technologies being promoted by startups. However, there was a shakeout: although equivalence checking joined the mainstream, model checking and theorem proving have not really caught on. Model checking almost made it (maybe it still will), as it is an impressive technology with successful niche applications, but it seems too hard to apply and does not scale to current design sizes. Theorem proving suffered from needing highly trained experts and from being tractable for only a few applications.

Although general model checking with full temporal logic seems a way off from widespread industrial use, there are some special cases that are quite successful. Intel have been using symbolic trajectory evaluation (STE) for some years, apparently with great success. STE supports a very limited temporal logic, but can achieve high efficiency and check large designs. Averant (previously HDAC) have a static functional verification tool called Solidify that appears to be a limited form of model checking. The only example I could find (in a paper at DesignCon 2000) suggests that Solidify's checking might be similar to STE. Cadence's FormalCheck model checker is, apparently, based on a general model checking engine. However, it is possible that its property entry GUI restricts the properties that can be written, so that only efficiently checkable properties can be specified.
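To give a rough flavour of what sits underneath such tools (this is a generic sketch of safety checking by reachability analysis, not the actual algorithm of FormalCheck, Solidify or STE, which work symbolically on far larger state spaces), a safety property can be checked by exhaustively exploring the reachable states of a finite transition system:

```python
from collections import deque

# Generic sketch: breadth-first reachability over a finite transition
# system, checking a safety property in every reachable state.  Returns
# a counterexample state if the property can be violated.

def check_safety(initial, next_states, safe):
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        if not safe(s):
            return False, s          # counterexample state found
        for t in next_states(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True, None                # property holds in all reachable states

# Toy example: a counter that resets at 2 can never reach 3.
def step(s):
    return [(s + 1) % 4 if s != 2 else 0]

print(check_safety(0, step, lambda s: s != 3))   # (True, None)
```

Commercial engines replace the explicit `seen` set with a symbolic representation (e.g. BDDs), which is what makes larger designs tractable.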

There are also a few very successful industrial applications of theorem proving to hardware, notably Intel's work on floating point verification using HOL Light and Forte, David Russinoff's work on floating point verification of AMD processor algorithms, and Srivas and Miller's verification of part of the AAMP5 processor.

Whilst research on model checking and theorem proving continues to make excellent progress in academia, and also to have a few high-profile niche successes in industry, like those mentioned above, it seems unlikely that they will become mainstream EDA technologies soon.

What seems to be happening now is that hybrid combinations of traditional and formal approaches to functional verification are emerging. These combine good ease of use with simple but scalable formal methods. Some examples are very briefly described in what follows.

Symbolic simulation

A conventional HDL event simulator is augmented with symbolic values, and expressions over them are represented symbolically (e.g. using BDDs). A startup in this area is Innologic, which sells a symbolic simulator for Verilog. Symbolic simulation is marketed as being like conventional simulation, but giving greater coverage. It also makes equivalence checking look familiar, i.e. just the symbolic interpretation of Verilog's !== operator in, for example,
({output_symbols} !== (a_input_symbols + b_input_symbols))
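A minimal sketch of the idea in Python (illustrative only: real symbolic simulators such as Innologic's represent expressions as BDDs, whereas here equivalence is decided by exhaustive evaluation, which is only feasible for toy circuits). Each wire carries a symbolic function of the inputs rather than a concrete 0/1, so one symbolic run covers all input values:

```python
from itertools import product

# Each signal is a function from an input assignment (env) to a Boolean.
def var(name):
    return lambda env: env[name]

def NOT(a):    return lambda env: not a(env)
def AND(a, b): return lambda env: a(env) and b(env)
def OR(a, b):  return lambda env: a(env) or b(env)
def XOR(a, b): return lambda env: a(env) != b(env)

def half_adder(a, b):
    return XOR(a, b), AND(a, b)          # (sum, carry)

def equivalent(f, g, names):
    # The symbolic analogue of the !== comparison in a test bench:
    # compare the two output expressions over all input assignments.
    for bits in product([False, True], repeat=len(names)):
        env = dict(zip(names, bits))
        if f(env) != g(env):
            return False
    return True

a, b = var("a"), var("b")
s, c = half_adder(a, b)
spec_sum = OR(AND(a, NOT(b)), AND(NOT(a), b))   # reference specification
print(equivalent(s, spec_sum, ["a", "b"]))       # True
```

One symbolic comparison here replaces the four concrete simulation runs a conventional simulator would need, which is the coverage argument made for symbolic simulation.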

Generalised `linting'

The idea is to check for common problems like multiple bus drivers causing conflicting values, vectors indexed out of range, and inconsistencies in case statements. A recent commercial offering is Verix. John Cooley's DAC report contains an interesting quote on this topic.

Commercial linters are proprietary. A very simple academic point tool which illustrates how theorem proving might be used for linting is Myra Van Inwegen's checker.
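To illustrate the flavour of one such check (a toy sketch only: real linters like Verix analyse an elaborated design, not source text with regular expressions), here is a checker for constant indexes that fall outside a vector's declared range:

```python
import re

# Toy lint check on Verilog-like source: flag a constant index that is
# outside the declared range of a reg/wire vector.
decl = re.compile(r"(?:reg|wire)\s*\[(\d+):(\d+)\]\s*(\w+)")
use  = re.compile(r"(\w+)\s*\[(\d+)\]")

def lint(src):
    ranges, warnings = {}, []
    for hi, lo, name in decl.findall(src):
        ranges[name] = (int(lo), int(hi))
    for name, idx in use.findall(src):
        if name in ranges:
            lo, hi = ranges[name]
            if not lo <= int(idx) <= hi:
                warnings.append(f"{name}[{idx}] out of range [{hi}:{lo}]")
    return warnings

src = "reg [7:0] data; assign y = data[9];"
print(lint(src))   # ['data[9] out of range [7:0]']
```

The point of generalised linting is that checks of this kind need no specification from the user: the properties (no out-of-range indexes, no conflicting drivers, consistent case statements) are built in.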

Property checking

Although full model checking is not yet ready for widespread deployment, the idea of writing and checking properties seems to be catching on, where the checking is done by simulation, possibly `amplified' by some kind of formal method.

An example is 0-In, which offers a tool, 0-In Check, that provides a collection, called CheckerWare, of automatically generatable Verilog test bench modules that check properties like conformance to input/output protocols, correct management of FIFOs, etc. The user specifies these checkers with high level directives expressed as comments in the source (preceded by "//0in"), and 0-In Check generates Verilog code, linked to the design under test, for inclusion in the test bench. If a property is violated during simulation then the checker `fires' and reports a bug. The verification engineer needs no knowledge of a separate property language, and the checking is done using a standard simulator. 0-In also provide a formally based facility called 0-In Search. This starts with a `seed' test vector and searches for perturbations of the vector that will fire the checkers. Presumably this works by some kind of formal state enumeration technique that tries to find perturbations that fire `corner cases' of checked properties. 0-In Search `amplifies' each test vector, so the verification engineer can get good coverage from a small test set.
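The checker idea can be sketched as follows (the names and structure are illustrative, not 0-In's actual CheckerWare interface): a monitor module watches the design's signals on each simulation cycle and `fires' when a protocol property is violated.

```python
# Sketch of a FIFO protocol checker in the spirit of CheckerWare
# (hypothetical names): it tracks occupancy from push/pop events and
# fires on overflow or underflow during an ordinary simulation run.

class FifoChecker:
    def __init__(self, depth):
        self.depth, self.count, self.fired = depth, 0, []

    def step(self, push, pop, cycle):
        if push and self.count == self.depth:
            self.fired.append((cycle, "overflow"))
        if pop and self.count == 0:
            self.fired.append((cycle, "underflow"))
        # track occupancy, clamped to the legal range
        self.count = min(self.depth, max(0, self.count + push - pop))

checker = FifoChecker(depth=2)
stimulus = [(1, 0), (1, 0), (1, 0), (0, 1)]   # three pushes into a 2-deep FIFO
for cycle, (push, pop) in enumerate(stimulus):
    checker.step(push, pop, cycle)
print(checker.fired)   # [(2, 'overflow')]
```

In the real tool this monitor would be generated Verilog driven alongside the design in a standard simulator, which is why no separate property language is needed.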

Another example of simulation-checkable properties is found in SpecMan Elite from Verisity. Properties are written in a proprietary test bench language called e, which has a subset based on linear temporal logic, but designed to be engineer-friendly. SpecMan has a simulator that can check compliance with e properties; it also has sophisticated tools for generating test vectors that satisfy constraints (expressed in another part of e). Thus whereas SpecMan gets enhanced coverage by random testing using vectors satisfying constraints, 0-In enhances its coverage by a formal search for perturbations of test vectors that fire checkers.
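The 0-In style of test `amplification' can be caricatured in a few lines (the property and the single-bit-flip perturbation below are invented for illustration; 0-In Search presumably uses formal state enumeration rather than blind flipping): starting from a seed vector, try small perturbations and keep any that make a checker fire.

```python
# Hypothetical sketch of seed-vector amplification: perturb a seed test
# vector one bit at a time and collect the perturbations that fire a
# property checker.

def fires(vector):
    # toy property: the checker fires when both enable bits are set at once
    return vector[0] == 1 and vector[3] == 1

def amplify(seed):
    hits = []
    for i in range(len(seed)):
        candidate = seed[:i] + [1 - seed[i]] + seed[i + 1:]
        if fires(candidate):
            hits.append(candidate)
    return hits

print(amplify([1, 0, 0, 0]))   # [[1, 0, 0, 1]]
```

The contrast with SpecMan is visible even in this caricature: SpecMan generates fresh vectors satisfying declared constraints, while the 0-In approach searches near existing vectors for ones that expose corner cases.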

Some other formal verification products and companies

Interesting links


Mike Gordon Wed Aug 2 09:30:23 BST 2000