IBM Systems and Technology Group
Formal Methods in Computer-Aided Design, 2006 11/16/2006 © 2006 IBM Corporation
Sequential Equivalence Checking Across Arbitrary Design Transformation: Technologies and Applications
Viresh Paruthi, IBM Corporation
J. Baumgartner, H. Mony, R. L. Kanzelman
Outline
Equivalence Checking Overview
– Combinational Equivalence Checking (CEC)
– Sequential Equivalence Checking (SEC)
Use of SEC within IBM
IBM’s SEC Solution
SEC Applications
SEC Challenges
Conclusion
Equivalence Checking
A technique to check equivalent behavior of two designs
Validates that certain design transforms preserve behavior
– Logic synthesis and manual redesign must not introduce bugs
Often done formally to save resources, eliminate risk
[Diagram: two designs, Logic1 with registers R1 and Logic2 with registers R2, driven by the same input sequence {x0, x1, ...}; the XOR of their outputs is checked to be {0, 0, ...}.]
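The check sketched above can be illustrated with a toy miter in Python. The two logic functions and the helper below are hypothetical stand-ins, with exhaustive enumeration standing in for a SAT/BDD engine:

```python
from itertools import product

# Hypothetical example circuits: two implementations of the same function.
def logic1(a, b, c):
    return (a and b) or c                       # original implementation

def logic2(a, b, c):
    return not (not (a and b) and not c)        # resynthesized (De Morgan)

def combinationally_equivalent(f, g, n_inputs):
    """Exhaustively compare f and g; the XOR of their outputs is the 'miter'."""
    for assignment in product([False, True], repeat=n_inputs):
        if bool(f(*assignment)) != bool(g(*assignment)):
            return False, assignment            # counterexample input vector
    return True, None

ok, cex = combinationally_equivalent(logic1, logic2, 3)
print(ok)  # True: the two nets implement the same function
```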
Combinational Equivalence Checking (CEC)
No sequential analysis: latches treated as cutpoints
Equivalence check over outputs + next-state functions
– Though NP-complete, CEC is a scalable, mature technology
CEC is the most prevalent formal verification application
– Often mandated to validate synthesis
[Diagram: registers R1 and R2 are cut into free pseudo-inputs S; the output and next-state functions of Logic1 and Logic2 are compared combinationally, each miter checked to be 0.]
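As a minimal sketch (not IBM's Verity implementation), CEC over a 1:1 latch mapping reduces to combinational checks of the next-state and output functions over all (input, state) pairs. All function names here are hypothetical:

```python
from itertools import product

# Toy next-state/output functions for two designs sharing one latch s (1:1 mapped).
def next_state_old(inp, s): return inp ^ s
def next_state_new(inp, s): return (inp and not s) or (not inp and s)  # XOR, resynthesized

def output_old(inp, s): return inp and s
def output_new(inp, s): return s and inp

def cec(ns1, ns2, out1, out2):
    """CEC: cut at the latch, then check next-state and output functions over
    ALL (input, state) pairs, including states that may be unreachable."""
    for inp, s in product([False, True], repeat=2):
        if bool(ns1(inp, s)) != bool(ns2(inp, s)):
            return False
        if bool(out1(inp, s)) != bool(out2(inp, s)):
            return False
    return True

print(cec(next_state_old, next_state_new, output_old, output_new))  # True
```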
Sequential Equivalence Checking (SEC)
Latch cutpointing requirement severely limits CEC applicability
– Cannot handle retimed designs, state-machine re-encoding, ...
– Cutpointing may cause mismatches in unreachable states
• Often requires manual introduction of constraints over cutpoints
SEC overcomes these CEC limitations
– Supports arbitrary design changes that do not impact I/O behavior
• Does not require 1:1 latch or hierarchy correspondence
– Known mappings can be leveraged to reduce problem complexity
• Check restricted to reachable states
– Explores sequential behavior of design to assess I/O equivalence
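A minimal sketch of the idea, assuming tiny explicitly-enumerated FSMs (real SEC engines work symbolically): explore the product machine and compare outputs only in reachable states. fsm_a and fsm_b are hypothetical designs with different state encodings:

```python
from collections import deque

# Two hypothetical 2-state FSMs with different encodings but identical I/O behavior.
# Each FSM: (initial_state, next(state, inp), out(state, inp)).
fsm_a = (0, lambda s, i: (s + i) % 2, lambda s, i: s)
fsm_b = (3, lambda s, i: 5 - s if i else s, lambda s, i: 0 if s == 3 else 1)
# fsm_b encodes state 0 as 3 and state 1 as 2.

def sequentially_equivalent(a, b, inputs=(0, 1)):
    """Explore the product machine from the initial state pair; designs are
    equivalent iff outputs agree in every REACHABLE product state."""
    (s0a, na, oa), (s0b, nb, ob) = a, b
    seen, frontier = {(s0a, s0b)}, deque([(s0a, s0b)])
    while frontier:
        sa, sb = frontier.popleft()
        for i in inputs:
            if oa(sa, i) != ob(sb, i):
                return False                 # output mismatch in a reachable state
            nxt = (na(sa, i), nb(sb, i))
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True

print(sequentially_equivalent(fsm_a, fsm_b))  # True
```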
SEC Is Computationally Expensive
Sequential verification: more complex than combinational
– Higher complexity class: PSPACE vs. NP
– Model checking is thus less scalable than CEC
SEC deals with 2x the model size of model checking!
– Composite model built over both designs being equivalence checked
– However, tuned algorithms exist to scale SEC better in practice
SEC Paradigms
Various SEC paradigms exist
Initialized approaches
– Check equivalent behavior from user-specified initial states
– Assumes that designs can be brought into known reset states
Uninitialized approaches, e.g. alignability analysis
– Require designs to share a common reset mechanism
– Compute reset mechanism concurrently with checking equivalence from a reset state
IBM’s Approach: Initialized SEC
More flexible:
– Enables checking specific modes of operation
– Applicable even if initialization logic is altered (or not yet implemented)
– Applicable even to designs that are not exactly equivalent
• Pipeline stage added? Check equivalence modulo a 1-clock delay
• data_out differs when data_valid=0? Check equivalence only when data_valid=1
More scalable: 1,000s to even 100,000+ state elements
– Reset mechanism computation adds (needless) complexity
Validation of reset mechanism can be done independently
– Functional verification performed w.r.t. power-on reset states
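The "pipeline stage added" case above can be sketched as a check modulo a 1-clock delay. The spec function, the pipelining wrapper, and the stimulus are hypothetical:

```python
# Sketch: compare an unpipelined design against a version with one added
# pipeline stage, checking equivalence modulo a 1-clock delay.
def spec(x):
    return x * 2 + 1                  # hypothetical combinational reference

def make_pipelined(f):
    """Return a stateful step function that outputs f(x) one cycle late."""
    state = {'reg': None}
    def step(x):
        out, state['reg'] = state['reg'], f(x)
        return out
    return step

def equivalent_modulo_delay(f, step, stimulus):
    """Check spec(x_t) == implementation output at cycle t+1 over the trace."""
    outs = [step(x) for x in stimulus] + [step(0)]   # extra step flushes the pipe
    return all(f(x) == outs[t + 1] for t, x in enumerate(stimulus))

print(equivalent_modulo_delay(spec, make_pipelined(spec), [3, 0, 7, 5]))  # True
```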
SEC Usage at IBM
IBM’s SEC toolset: SixthSense
– Developed primarily for custom microprocessor designs
– Also used for ASICs, and for (semi-)formal functional verification
Use CEC to validate combinational synthesis
– Verity is IBM’s CEC toolset
– Also used for other specific purposes, e.g. ECO verification
Use SEC for pre-synthesis HDL comparisons
– Sequential optimizations manually reflected in HDL
– SEC efficiently eliminates the risk of such optimizations
SixthSense Horsepower
SixthSense is a system of cooperating algorithms
– Transformation engines (simplification/reduction algorithms)
– Falsification engines
– Proof engines
Unique Transformation-Based Verification (TBV) framework
– Exploits maximal synergy between various algorithms
– Retiming, redundancy removal, localization, induction...
– Incrementally chop the problem into simpler sub-problems until solvable
Transformations yield exponential speedups in bug-finding (semi-formal) as well as proof (formal) applications
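A toy sketch of the TBV flow under stated assumptions (the engine names and reduction factors below are invented for illustration, not SixthSense internals): transformation engines shrink the model until a proof engine can discharge it, and the trail of applied transformations lets results be mapped back to the original design:

```python
# Hypothetical TBV engine loop: transformations shrink the problem until a
# proof engine can solve it; the trail records how to map results back.
def tbv_solve(problem, transforms, prove, limit):
    trail = []                          # transformations applied, in order
    for engine in transforms:
        if problem['registers'] <= limit:
            break
        problem = engine(problem)       # e.g. retiming, localization, rewriting
        trail.append(engine.__name__)
    result = prove(problem)
    return result, trail                # trail lets traces be mapped back

# Toy engines: each just records a reduction in model size.
def combinational_opt(p): return {'registers': int(p['registers'] * 0.85)}
def retiming(p):          return {'registers': int(p['registers'] * 0.85)}
def localization(p):      return {'registers': 200}

result, trail = tbv_solve(
    {'registers': 140_627},
    [combinational_opt, retiming, localization],
    prove=lambda p: 'proved' if p['registers'] < 1_000 else 'inconclusive',
    limit=1_000,
)
print(result, trail)  # proved ['combinational_opt', 'retiming', 'localization']
```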
Transformation-Based Verification (TBV)
[Diagram: TBV engine chain. A Redundancy Removal Engine, a Retiming Engine, and a Target Enlargement Engine successively transform Design N into Designs N', N'', and N'''; each engine maps Result N''' back through Result N'' and Result N' to Result N on the original design.]
Transformation-Based Verification Framework
SixthSense
[Diagram: a Design + Driver + Checker model of 140,627 registers is reduced by a Combinational Optimization Engine (119,147 registers), a Retiming Engine (100,902 registers), and a Localization Engine (132 registers); a Reachability Engine then solves it. Problem decomposition via synergistic transformations. The counterexample is mapped back through each stage (optimized, retimed, localized trace → optimized, retimed trace → optimized trace), yielding a trace consistent with the original design.]
These transformations are completely transparent to the user
All results are in terms of the original design
Example SixthSense Engines
Combinational rewriting
Sequential redundancy removal
Min-area retiming
Sequential rewriting
Input reparameterization
Localization
Target enlargement
State-transition folding
Isomorphic property decomposition
Unfolding
Semi-formal search
Symbolic sim: SAT+BDDs
Symbolic reachability
Induction
Interpolation
…
Expert System Engine automates optimal engine sequence experimentation
Key to Scalability: Assume-then-prove framework
1. Guess redundancy candidates
– Equivalence classes of gates
2. Create speculatively-reduced model
– Add a miter (XOR) over each candidate and its equivalence-class representative
– Replace fanout references by representatives
3. Attempt to prove each miter unassertable
4. If all miters proven unassertable, corresponding gates can be merged
5. Else, refine to separate unproven candidates; go to Step 2
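The five steps above can be sketched as follows, assuming hypothetical gates and using exhaustive checking as a stand-in proof engine; speculative fanout replacement is omitted for brevity. Sampling only a few vectors makes the initial guess too coarse, so Step 5's refinement kicks in:

```python
from itertools import product

# Hypothetical gates over 3 inputs. g0/g1 are truly equivalent; g2 agrees with
# them only when a is True, so a partial simulation may lump it in with them.
gates = {
    'g0': lambda a, b, c: (a and b) or c,
    'g1': lambda a, b, c: not (not (a and b) and not c),
    'g2': lambda a, b, c: (a and b) or (c and a),
}

ALL = list(product([False, True], repeat=3))

def signature(f, vectors):
    return tuple(bool(f(*v)) for v in vectors)

def guess_classes(vectors):
    """Step 1: group gates whose outputs match on the sampled vectors."""
    classes = {}
    for name, f in gates.items():
        classes.setdefault(signature(f, vectors), []).append(name)
    return list(classes.values())

def prove_miter(f, g):
    """Stand-in proof engine: exhaustive check that the XOR miter never asserts."""
    return all(bool(f(*v)) == bool(g(*v)) for v in ALL)

def assume_then_prove(vectors):
    classes = guess_classes(vectors)          # Steps 1-2: candidates + miters
    while True:
        failed, refined = False, []
        for cls in classes:
            rep, rest = cls[0], cls[1:]
            proven = [rep] + [g for g in rest if prove_miter(gates[rep], gates[g])]
            bad = [g for g in rest if g not in proven]
            refined.append(proven)            # Step 4: provably redundant gates
            if bad:
                refined.append(bad)           # Step 5: separate failed candidates
                failed = True
        classes = refined
        if not failed:
            return classes

print(assume_then_prove(ALL[4:]))  # [['g0', 'g1'], ['g2']]
```

The sampled vectors (those with a=True) make all three gates look equivalent, so the first pass guesses one class; the proof step then splits g2 off.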
Assume-then-prove Framework
Speculative reduction greatly enhances scalability
– Generalizes CEC
• Sequential analysis only needed over sequentially redesigned logic
Proof step is the most costly facet
– Most equivalences solved by lower-cost algos (e.g. induction)
• However, some equivalences can be very difficult to prove
– Failure to prove a cutpoint often degrades into an inconclusive SEC run
Novel SixthSense technology: leverage synergistic algorithms to solve these harder proofs
Causes of refinement
Asserted miter – incorrect candidate guessing
Resource limitations preclude proof
– Induction becomes expensive with depth
– Approximation weakens power of reachability
Refinement weakens induction hypothesis
– Immediate separation of candidate gates
– Avalanche of future resource-gated refinements
– End result? Suboptimal redundancy removal
• Inconclusive equivalence check
SixthSense: Enhanced Redundancy Proofs
Use of a robust variety of synergistic transformation and verification algorithms
– Enables best proof strategy per miter
• Exponential run-time improvements
– Greater speed and scalability
– Greater degree of redundancy identified
Powerful use of Transformation-based Verification
– Synergistically leverage transformations to simplify large problems
– Reduction in model size, number of distinct miters
• Transformation alone sufficient for many proofs
Benefits of Transformation-Based Verification
Reduction in model size, number of distinct miters
– Useful regardless of proof technique
Transformations alone sufficient for many proofs
– Sub-circuits differing by retiming and resynthesis solved using polynomial-resource transformations
– Scales to aggressive design modifications
Leverage independent proof strategy on each miter
– Different algorithms suited for different problems
– Entails exponential difference in run-times
TBV on Reduced Model
Methodology restrictions
– Retiming may render name- and structure-based candidate guessing ineffective
Synergistic increase in reduction potential
– TBV flows more effective after merging
– Applying TBV before + after induction-based redundancy removal is insufficient
Need to avoid resource-gated refinement
“Exploiting Suspected Redundancy without Proving it”, DAC 2005
Redundancy Removal Results
[Bar charts: Number of Registers for IFU, SMM, and S6669, comparing Original Design, After Merging via Induction, and After Merging via TBV.]
• Induction alone unable to solve all properties
• TBV solves all properties, faster than induction
TBV on speculatively-reduced model
IFU        Initial   COM      LOC    CUT    COM
Registers  33231     30362    19     19     19
ANDs       304990    276795   86     76     71
Inputs     1371      1329     23     10     10

S6669      Initial   COM      CUT    RET    CUT    COM
Registers  325       186      138    0      0      0
ANDs       3992      3067     1747   2186   1833   1788
Inputs     83        61       40     40     24     24

RET: retiming; LOC: localization; COM: combinational reduction; CUT: reparameterization
Enhanced search without Proofs
Use miters as filters
– No miter asserted => search remains within states for which speculative merging is correct
• i.e., search results valid on original model also
– Miters need not be proven unassertable
– Enables exploitation of redundancy that holds only for an initial bounded time-frame
Faster and deeper bounded falsification
Improved candidate guessing using spec-reduced model
Bounded Falsification Results (% improvement)
[Bar chart: bounded steps completed in 1 hour, Original Design vs. Speculatively Reduced Model, for FPU, IFU, SMM, S3384, and S6669; per-design improvements of 43.75%, 25%, 50%, 393%, and 189%.]
Miter Validation Results (% improvement)
[Bar chart: bounded steps completed in 1 hour, Original with Miters vs. Speculatively Reduced with Miters, for FPU, IFU, SMM, S3384, and S6669; per-design improvements of 92%, 100%, 58%, 785%, and 872%.]
SixthSense Sequential Equivalence Checking
[Diagram: SixthSense takes the OLD design, NEW design, initialization data, mapping file, drivers (stimulus), black-box list, and checkers; it builds initialized OLD and NEW design models, drives them with common inputs, and checks whether the outputs are equal, producing either a mismatch trace or a proof of equality.]
Running the Sequential Equivalence Check
Little manual effort to use
Produces a counterexample showing output mismatch
– With respect to specified initial state(s)
– Trace is short, with minimal activity, to simply illustrate the mismatch
Or, proves that no such trace exists
– Proof of equivalence
Mandatory inputs:
– Requires OLD and NEW versions of a design
Running the Seq Equiv Check: Optional Inputs
Initialization data; equiv checked w.r.t. given initial values
Mapping file
– Indicates I/O signal renaming/polarities, add cutpoints, omit checks…
Drivers: filter input stimuli to prevent spurious mismatches
Black Box file, to easily delete components from design
– Outputs correlated, driven randomly; Inputs correlated, made targets
Checkers (check equivalence of internal events)
– Ensure that coverage obtained before the change is valid after
– "Audit" known mismatches to enable meaningful proofs
Sequential Equivalence Checking Applications
Used at block/unit-level on multiple projects…
– To verify remaps, retiming, synthesis optimizations…
– CEC inadequate to deal with these changes
Exposed 100s of unintended mismatches/design errors
– No need to run lengthy regression buckets that yield less coverage
– SixthSense often provides proofs/bugs in less time
– No need to debug lengthy, cluttered traces
• SixthSense traces are short, with minimal activity to illustrate bug
– Quickly finds bugs before faulty logic is released
Example SEC Applications
Timing optimizations: retiming, adding redundant logic,…
Power optimizations: clock gating, logic minimization, …
Check specific modes of design behavior
– Backward-compatibility modes of a redesign preserve functionality
– BIST change must not alter functionality
Verifying RTL vs. higher-level models
Quantifying late design fixes
– E.g., constrain SEC to disallow the operations affected by a fix
Example Applications: Clock-Gating Verification
Clock-gating:
– Disables clocks to certain state elements when they are not required to update
Approach: Equiv-check identical unit
– One with clock-gating enabled, one disabled
– Check design behavior does not change during care time-frames
Leveraged to converge upon an optimal clock-gating solution
– Iteratively apply SEC to ascertain if clock-gating a latch alters function
[Diagram: the same unit instantiated twice, once with clock gating disabled and once with clock gating enabled, driven by a common input; outputs are compared.]
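A minimal sketch of this check, under the (hypothetical) rule that downstream logic samples the register only while valid is high, so behavior must match only during those care time-frames:

```python
# Sketch: SEC-style check of a clock-gated register against the ungated one.
# Hypothetical rule: downstream logic only samples the register when 'valid'
# is high, so gating the clock off when valid==0 must not change behavior.
def simulate(gated, trace):
    """trace: list of (data, valid). Returns outputs at the care times."""
    reg, outs = 0, []
    for data, valid in trace:
        if not gated or valid:         # gated version holds its value when idle
            reg = data
        if valid:                      # downstream only looks while valid
            outs.append(reg)
    return outs

trace = [(5, 1), (9, 0), (2, 1), (7, 0), (7, 1)]
print(simulate(gated=False, trace=trace) == simulate(gated=True, trace=trace))  # True
```

Iterating this comparison per latch mirrors the slide's use of SEC to converge on an optimal clock-gating solution.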
Example Applications: Quantifying a late design fix
Late bug involving specific commands on target memory node
– Fix made with a backwards-compatible "disable" chicken-switch
Wanted to validate:
– "disable" mode truly disables the fix
– Fix had no impact upon other commands, non-target nodes
Several quick SixthSense equivalence-check runs performed:
– With a straightforward comparison, 192/217 outputs mismatched
– "Disabled" NEW design is equivalent to OLD
– If configured as non-target node, NEW equivalent to OLD
– If specific commands excluded (via a driver), NEW equiv to OLD
Example Applications: Hierarchical Design Flow
FPU designed hierarchically
– Conventional latch-equivalent VHDL (yellow)
– Simple, abstract cycle-accurate VHDL (green)
– FPU spec (blue) is a behavioral model
Verification approach:
– First, formally verify the green box equivalent to its spec using SixthSense (SEC)
– Next, the yellow box is verified equivalent to green, macro by macro (takes minutes)
– Finally, schematics verified using Verity (CEC)
– FPU verification is done completely by formal methods
[Diagram: design hierarchy: FPU spec, High-Level Design, VHDL (Latch-Equivalent), Schematics. SixthSense (SEC) checks the levels down to the latch-equivalent VHDL; Verity (CEC) checks VHDL vs. schematics.]
SEC Future Directions: Hierarchical Design Flow
Enables raising the level of abstraction (ESL)
– IBM methodology requires an RTL model that is CEC-equivalent to the circuit
• Allows for verifying self-test logic, asynchronous crossings, scan, …
– Specification of each macro precisely captured by high-level model• Allows creativity in designing optimal circuit for the macro
Verification can begin without having the entire design ready
– Verify the high-level macros, unit/core/chip compositions
– Verification done in parallel with circuit design; reduces design+verification cycle
Formal correctness eliminates risk of late design changes
– Efficient automated equivalence proof of high-level vs. circuit-accurate macros
SEC Future Directions: Sequential Optimizations
SEC is an enabler for “safe” sequential synthesis
– E.g. retiming, addition/deletion of sequential redundancy
– Opens the door for automated (behavioral) synthesis
• Results in higher quality, more optimized designs
• Enabler for system-level design and verification
SEC enables sequential optimizations
– Identify sequential redundancy, unreachable states…
– Validate user specified don’t-care conditions
– Verify “global” optimizations, e.g. FSM re-encoding, clock-gating,…
– Leveraged in diverse areas such as power-gating, fencing, etc.
SEC Challenges: Scalability
SEC has to scale to real-world problems
– Large design slices, arbitrary transforms, low-level HDL specs, ...
– Tighten induction to resolve miters in the speculatively-reduced model
• TBV attempts to do just that, but further improvements welcome!
– Improved proof techniques critical to improving scalability
– Improved falsification methods to help with candidate guessing
• Helps distinguish false equivalences to converge faster
Abstractions to reduce computational complexity
– Leverage techniques such as uninterpreted functions, blackboxing, ...
– Hierarchical proof decomposition
• Bottom-up approach: blackbox verified portions of the logic, and capture constraints at the interfaces
SEC Challenges: Combined CEC and SEC
Leverage mappings of state elements obtained from CEC
– Take advantage of the wealth of techniques to correspond latches
• Name-based, structural, functional, scan-based…
– Used as cutpoints to define a boundary between CEC and SEC
• Significantly simplifies the SEC problem via correlation hints
• Refining a cut if a false negative is obtained is a hard problem
– Automatically propagate constraints across mapped state elements
Benefits to CEC
– Improved latch pair matching via functional analysis
• Latch-phase determination, functional correspondence,…
– Apply constraints derived from SEC to simplify problems
Conclusion: Sequential Equivalence Checking
Eliminates Risk:
– SEC is exhaustive, unlike sim regressions
Improves design quality:
– Enables aggressive optimizations, even late in design flow
Saves Resources:
– Obviates lengthy verification regressions
Generalizes CEC, and improves productivity
Opens door to automated sequential synthesis
Conclusion: SEC at IBM
SEC becoming part of standard methodology at IBM
– Pre-synthesis HDL-to-HDL applications
– CEC closes gap with combinational synthesis flow
IBM’s SEC solution driven by scalability across arbitrary design transforms
– Hooks for: initial values, interface constraints, “partial equivalence”…
SixthSense: TBV-Powered SEC
– Leverage a rich set of synergistic algos for highly-scalable SEC
Conclusion: References/Links
Website (lists SixthSense publications): www.research.ibm.com/sixthsense
Relevant Papers:
“Exploiting Suspected Redundancy without Proving it”, DAC 2005
“Scalable Sequential Equivalence Checking across Arbitrary Design Transformations”, ICCD 2006