Automated Generation of Test Suites from Formal Specifications
Alexander K. Petrenko
Institute for System Programming of the Russian Academy of Sciences (ISP RAS), Moscow
February, 2000
2 Cambridge, February, 2000
Ideal Testing Process
Why formal specification? What kind of specifications?
[Diagram]
- Forward engineering: design specifications -> sources -> tests; the specifications supply oracles and the partition, while coverage criteria remain an open question
- Reverse engineering: sources -> tests; criteria and oracles are open questions
- Pre- and post-conditions and invariants serve the oracles and the partition
- Algebraic specifications serve test sequence generation
Ideal Testing Process
Why formal specification? What kind of specifications?
[Diagram, with the open questions resolved]
- Forward engineering: design specifications -> sources -> tests; the specifications supply criteria, oracles, and the partition
- Reverse engineering: post-specifications are derived from the sources, and then supply criteria, oracles, and the partition for the tests
- Pre- and post-conditions and invariants serve the oracles and the partition
- Algebraic specifications serve test sequence generation
KVEST project history
• Started in 1994 under contract with Nortel Networks, to develop a system that automatically generates test suites for regression testing from formal specifications reverse-engineered from the existing code
• A joint effort of Nortel Networks and ISP RAS. ISP RAS background:
— Soviet Space Mission Control Center OS and networks
— Soviet space shuttle “Buran” OS and real-time programming language
— formal specification of the real-time programming language
What is KVEST?
• KVEST: Kernel Verification and Specification Technology
• Area of application: specification, test generation, and test execution for APIs such as the OS kernel interface
• Specification language: RAISE/RSL (VDM family)
• Specification style: state-oriented, implicit (pre- and post-conditions, subtype restrictions)
• Target language: a programming language such as C/C++
• Size of application: over 600 Klines
• Size of specifications: over 100 Klines
• Size of test suites: over 2 Mlines
• Results: over a hundred errors have been detected in several projects
Position
• Constraint specification
• Semi-automated test production
• Fully automated test execution and test result analysis
• Orientation toward use in industrial software development processes
Research and design problems
• Test system architecture
• Mapping between specification and programming languages
• Integration of generated and manual components - re-use of manual components
• Test sequence and test case generation
Verification processes
• Reverse engineering: (post-) specification, testing based on the specification
• Forward engineering: specification design, development, test production
• Co-verification: specification design, simultaneous development and test production
Reverse engineering: Technology stream
[Diagram: the software contract contents (Interface A1, Interface A2, …), documentation, and source code feed four phases]
Phase 1: Interface definition
Phase 2: Specification
Phase 3: Test suite production (test drivers and test cases, test plans)
Phase 4: Test execution and analysis (detected error and test coverage reports, actual documentation)
Key features of KVEST test suites
• Phase 1: A minimal and orthogonal API (Application Programming Interface) is determined
• Phase 2: A formal specification in the RAISE Specification Language is developed for the API
• Phase 3: Sets of test suites (test cases and test sequences) are generated automatically in the target language
• Phase 4: Generated test suites are executed automatically. A pass/fail verdict is assigned for every test case execution, and an error summary is provided at the end of the run. The user can specify the completeness of the test coverage and the form of tracing
An example of specification in RAISE:

DAY_OF_WEEK : INT >< INT -~-> RC >< WEEKDAY
DAY_OF_WEEK( tday, tyear ) as ( post_rc, post_Answer )
post
  if tyear <= 0 \/ tday <= 0 \/
     tday > 366 \/ tday = 366 /\ ~a_IS_LEAP( tyear )
  then
    BRANCH( bad_param, "Bad parameters" );
    post_Answer = 0 /\ post_rc = NOK
  else
    BRANCH( ok, "OK" );
    post_Answer = ( a_DAYS_AFTER_INITIAL_YEAR( tyear, tday ) +
                    a_INITIAL_DAY_OF_WEEK ) \ a_DAYS_IN_WEEK /\ post_rc = OK
  end
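The post-condition acts as a test oracle: given the implementation's output, it decides pass or fail. A minimal sketch of that oracle in Python follows; the constants and helpers (DAYS_IN_WEEK, INITIAL_DAY_OF_WEEK, is_leap, days_after_initial_year) are illustrative stand-ins for the a_-prefixed auxiliaries of the RAISE model, not their actual definitions.

```python
# Oracle induced by the DAY_OF_WEEK post-condition, transliterated to Python.
# All helper definitions below are assumptions chosen for illustration.

NOK, OK = "NOK", "OK"
DAYS_IN_WEEK = 7
INITIAL_DAY_OF_WEEK = 0          # hypothetical: day-of-week of day 1, year 1

def is_leap(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def days_after_initial_year(tyear, tday):
    # hypothetical helper: days elapsed from the initial year to (tyear, tday)
    days = sum(366 if is_leap(y) else 365 for y in range(1, tyear))
    return days + tday - 1

def day_of_week_oracle(tday, tyear, post_rc, post_answer):
    """Return True iff the implementation's result satisfies the post-condition."""
    # note: /\ binds tighter than \/, so "tday = 366 /\ ~a_IS_LEAP(tyear)" is one disjunct
    if tyear <= 0 or tday <= 0 or tday > 366 or (tday == 366 and not is_leap(tyear)):
        return post_rc == NOK and post_answer == 0          # "Bad parameters" branch
    expected = (days_after_initial_year(tyear, tday)
                + INITIAL_DAY_OF_WEEK) % DAYS_IN_WEEK
    return post_rc == OK and post_answer == expected        # "OK" branch
```

Note that the oracle never computes the day of week the way the implementation does; it only checks the constraint, which is the point of an implicit specification.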
Partition based on the specification
Specification:

post
  if a \/ b \/ c \/ d /\ e
  then BRANCH( bad_param, "Bad parameters" )
  else BRANCH( ok, "OK" )
  end

Partition (branches and Full Disjunctive Normal Forms, FDNF):

BRANCH "Bad parameters"
• a /\ b /\ c /\ d /\ e
• ~a /\ b /\ c /\ d /\ e
• ...
BRANCH "OK"
• ~a /\ ~b /\ ~c /\ ~d /\ e
• ...
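The FDNF partition can be computed mechanically: enumerate every total truth assignment of the elementary predicates and sort each assignment (disjunct) into the branch it drives the post-condition through. A sketch in Python, using the predicates a..e and the guard from the example (the function names are illustrative):

```python
# FDNF-based partition extraction for the example post-condition.
from itertools import product

PREDICATES = ["a", "b", "c", "d", "e"]

def branch(v):
    # guard from the example: a \/ b \/ c \/ d /\ e
    # (/\ binds tighter than \/ in RAISE)
    return "Bad parameters" if v["a"] or v["b"] or v["c"] or (v["d"] and v["e"]) else "OK"

def fdnf_partition():
    partition = {"Bad parameters": [], "OK": []}
    for bits in product([False, True], repeat=len(PREDICATES)):
        v = dict(zip(PREDICATES, bits))
        disjunct = " /\\ ".join(p if v[p] else "~" + p for p in PREDICATES)
        partition[branch(v)].append(disjunct)
    return partition
```

Each of the 2**5 = 32 disjuncts lands in exactly one branch, yielding the coverage goals: all branches first, then all (accessible) disjuncts.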
Test execution scheme
[Diagram]
- On UNIX, test suite generators derive test drivers, test case parameters, and a program behavior model from the specifications
- On the target platform, the test harness runs the test drivers against the SUT with the generated test case parameters
- The SUT's behavior is compared with the program behavior model, yielding a verdict and a trace
Test execution management
[Diagram]
- UNIX workstation: repository; navigator (test suite generation, repository browsing, test plan runs)
- Target platform: test bed (process control, communication, basic data conversion); test suite: script driver, MDC, basic drivers
MDC - Manually Developed Components
KVEST Test Drivers
• Hierarchy of Test Drivers
— Basic test drivers: test a single procedure by receiving input, calling the procedure, recording the output, and assigning a verdict
— Script drivers: generate sets of input parameters, call basic drivers, evaluate the results of test sequences, and monitor test coverage
— Test plans: define the order of script driver calls with given test options and check their execution
• KVEST uses a set of script driver skeletons to generate script drivers
• Test drivers are compiled from RAISE into the target language
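The two lower levels of the hierarchy can be sketched as follows; this is an illustrative reconstruction under assumed names, not KVEST's generated code. A basic driver wraps one procedure with its oracle and returns a verdict; a script driver iterates parameter tuples, calls the basic driver, and accumulates a pass/fail summary.

```python
# Sketch of the basic-driver / script-driver hierarchy (illustrative names).

def basic_driver(procedure, oracle, args):
    """Call the procedure once, record the outcome, assign a verdict."""
    outcome = procedure(*args)
    verdict = "pass" if oracle(args, outcome) else "fail"
    return verdict, outcome

def script_driver(procedure, oracle, parameter_iterator):
    """Drive a test sequence: generate inputs, call the basic driver, summarize."""
    summary = {"pass": 0, "fail": 0}
    trace = []
    for args in parameter_iterator:
        verdict, outcome = basic_driver(procedure, oracle, args)
        summary[verdict] += 1
        trace.append((args, outcome, verdict))
    return summary, trace

# Illustrative use: test an absolute-value routine against its post-condition.
summary, _ = script_driver(
    procedure=lambda x: abs(x),
    oracle=lambda args, out: out >= 0 and out in (args[0], -args[0]),
    parameter_iterator=[(-2,), (0,), (3,)],
)
```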
Test generation scheme
[Diagram]
- Tools (UNIX): the basic driver generator, script driver generator, and test case generator take RAISE specifications and script driver skeletons as input
- A RAISE -> target language compiler translates the generated components into basic drivers, script drivers, and test case parameters
- Together these form the test suites run on the target platform
Test generation scheme, details
[Diagram: as on the previous slide, with the Manually Developed Components shown explicitly]
- Manually Developed Components feeding the generators: iterators, data converters, state observers, filters
- Tools (UNIX): basic driver generator, script driver generator, test case generator
- The RAISE -> target language compiler produces basic drivers, script drivers, and test case parameters for the test suites on the target platform
Test sequence generation based on implicit Finite State Machine (FSM)
– Partition based on pre- and post-conditions
– Implicit FSM definition
[FSM diagram: states S1, S2, S3, S4 with transitions labeled op1, op2, op3]
Test sequence generation based on implicit FSM
[FSM diagram: states S1, S2, S3, S4 with transitions labeled op1, op2, op3]

Partition (branches and Full Disjunctive Normal Forms, FDNF):

BRANCH "Bad parameters"
• a /\ b /\ c /\ d /\ e -- op1
• ~a /\ b /\ c /\ d /\ e -- op2
• ...
BRANCH "OK"
• ~a /\ ~b /\ ~c /\ ~d /\ e -- opi
• ...
Conclusion on KVEST experience
• Code inspection during formal specification can detect up to 1/3 of the errors
• Code inspection cannot replace testing: up to 2/3 of the errors are detected during and after testing
• Testing is necessary to develop correct specifications
• Up to 1/3 of the errors were caused by lack of knowledge of pre-conditions and of some details of the called procedures’ behavior
What part of testware is generated automatically?
Kind of source for test generation | Percentage in the sources | Ratio between source size and generated test size | Kind of generation result
Specification | 50 | 1:5 | Basic drivers
Data converters, iterators and state observers (MDC) | 50 | 1:10 | Script drivers
Solved and unsolved problems in test automation
Automated or simple:
- Phase 1, interface definition: for well designed software
- Phase 2, specification: for single operations
- Phase 3, test suite production: test oracles, partition, filters
- Phase 4, test execution and analysis: test plans, execution and analysis, browsing, reporting

Not automated and not simple:
- Phases 1-2: for legacy software
- Phase 3: test sequence design for operation groups
- Phase 4: test result understanding
Specification based testing: problems and prospects
Problems
• Lack of correspondence between specification languages and programming languages
• Users resist learning a specification language and an additional SDE (software development environment)
• Methodology of test sequence generation
• Testing methodologies for specific software areas

Prospects
• Use a specification extension of an OO programming language and a standard SDE instead of a specific SDE
• FSM extraction from implicit specifications, FSM factorization
• Research on distributed software specification and testing
Part II. KVEST revision
Specification notation revision
UniTesK: Universal TEsting and Specification toolKit
• Formal methods deployment problems
— lack of users with a theoretical background
— lack of tools
— non-conventional languages and paradigms
• UniTesK solutions
— the first step is possible without “any theory”
— extension of C++ and Java
— integration with a standard software development environment
• Related work
— ADL/ADL2
— Eiffel, Larch, iContract
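The "extension of C++ and Java" idea is essentially design-by-contract in the host language, as in Eiffel or iContract: pre- and post-conditions written next to ordinary code and checked at run time. A sketch of the idea in Python; the decorator and its name are illustrative, not any tool's actual notation.

```python
# Design-by-contract sketch: attach a pre- and post-condition to a function.

def contract(pre, post):
    def wrap(func):
        def checked(*args):
            assert pre(*args), "pre-condition violated"
            result = func(*args)
            assert post(result, *args), "post-condition violated"
            return result
        return checked
    return wrap

@contract(pre=lambda x: x >= 0,
          post=lambda r, x: r * r <= x < (r + 1) * (r + 1))
def isqrt(x):
    # integer square root; the post-condition is its implicit specification
    r = 0
    while (r + 1) * (r + 1) <= x:
        r += 1
    return r
```

Here the post-condition doubles as a test oracle: any call made by a generated test sequence is checked automatically, which is the first step that "needs no theory" from the user.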
UniTesK: Test generation scheme
[Diagram]
- Inputs: specifications in a Java or C++ extension; use cases
- Tools: test oracle generator and OO test suite generator, supported by iterators, FSMs, path builder engines, and a test sequence fabric
- Outputs: test oracles and test suites in the target language, run on the target platform
Integration of Constraint Verification tools into a software development environment
[UML class diagram from a UML-based design environment; generated classes: OracleClass, AbstractInputTypeIterator, Iterator_seq_of_int_binary (<<instance variable>> length : nat1, current : nat; <<operation>> init(), next()), ClassUnderTestImplementation]
- A standard Software Development Environment
- Specification and verification tools for the standard notation
Part III. Test generation inside
Requirements. Test coverage criteria
– All branches
– All disjuncts (all accessible disjuncts)
Specification:

post
  if a \/ b \/ c \/ d /\ e
  then BRANCH( bad_param, "Bad parameters" )
  else BRANCH( ok, "OK" )
  end

Partition (branches and Full Disjunctive Normal Forms, FDNF):

BRANCH "Bad parameters"
• a /\ b /\ c /\ d /\ e
• ~a /\ b /\ c /\ d /\ e
• ...
BRANCH "OK"
• ~a /\ ~b /\ ~c /\ ~d /\ e
• ...
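Measuring these two criteria during a run can be sketched as follows: each executed test case reports the branch and the FDNF disjunct it exercised, and a tracker compares what was hit against the coverage goals. The class and the abbreviated goal sets are illustrative, not KVEST's actual bookkeeping.

```python
# Sketch of tracking "all branches" and "all disjuncts" coverage (illustrative).

class CoverageTracker:
    def __init__(self, branch_goals, disjunct_goals):
        self.branch_goals = set(branch_goals)
        self.disjunct_goals = set(disjunct_goals)
        self.branches_hit = set()
        self.disjuncts_hit = set()

    def record(self, branch, disjunct):
        """Called once per executed test case."""
        self.branches_hit.add(branch)
        self.disjuncts_hit.add(disjunct)

    def report(self):
        return {
            "all branches": self.branch_goals <= self.branches_hit,
            "all disjuncts": self.disjunct_goals <= self.disjuncts_hit,
            "missed disjuncts": self.disjunct_goals - self.disjuncts_hit,
        }

tracker = CoverageTracker(
    branch_goals={"Bad parameters", "OK"},
    disjunct_goals={"a/\\b/\\c/\\d/\\e", "~a/\\~b/\\~c/\\~d/\\e"},  # abbreviated
)
tracker.record("Bad parameters", "a/\\b/\\c/\\d/\\e")
report = tracker.report()   # the "OK" branch and one disjunct are still uncovered
```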
Test sequence kinds. Kinds 1, 2, and 3
Procedures of these kinds can be tested separately, because no other target procedure is needed to generate input parameters or analyze the outcome.
— Kind 1. The input is data that can be represented in literal (text) form and produced without accounting for any interdependencies between the values of different parameters.
— Kind 2. No interdependencies exist between the input items (values of input parameters). Input does not have to be in literal form.
— Kind 3. Some interdependencies exist; however, separate testing is possible.
Kinds 1, 2, and 3. What is automated?

Kind | Automatically | Manually
1 | Everything | Nothing
2 | Test sequences and parameter tuple iterators | Data type iterators
3 | Test sequences | Parameter tuple iterators
Test sequence kinds. Kinds 4 and 5
Operations of kinds 4 and 5 cannot be tested separately, because some input can be produced only by calling another operation from the group, and/or some outcome can be analyzed only by calling other procedures.
Requirements for kinds 4 and 5
The same requirements: all branches / all disjuncts
Additional problem: how to traverse all states?
FSM use for API testing
Traditional FSM approach (explicit FSM definition):
— define all states
— for each state, define all transitions (operation, input parameters, outcome, next state)

ISPRAS approach (implicit FSM definition):
— the state is defined by a type definition
— for each state:
  - operations and input are defined by pre-conditions
  - outcome and next state are defined by post-conditions
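The implicit approach can be sketched as a traversal that discovers the FSM on the fly: states are ordinary values of the state type, pre-conditions say which operations are applicable in a state, and applying an operation yields the next state. The toy model below (a bounded counter with two operations) is illustrative, not an ISPRAS example.

```python
# Implicit-FSM exploration sketch: states and transitions are discovered,
# never enumerated up front. The operation table is a toy model.
from collections import deque

OPERATIONS = {
    "inc": {"pre": lambda s: s < 3, "apply": lambda s: s + 1},
    "dec": {"pre": lambda s: s > 0, "apply": lambda s: s - 1},
}

def explore(initial_state):
    """Breadth-first traversal of the implicit FSM: visit every reachable
    state and record every (state, operation, next_state) transition."""
    seen, transitions = {initial_state}, []
    queue = deque([initial_state])
    while queue:
        state = queue.popleft()
        for name, op in OPERATIONS.items():
            if op["pre"](state):              # pre-condition: is op applicable here?
                nxt = op["apply"](state)      # next state, fixed by the post-condition
                transitions.append((state, name, nxt))
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return seen, transitions

states, transitions = explore(0)
```

Each recorded transition is one step of a generated test sequence; covering all of them exercises every operation in every reachable state.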
Advanced FSM use
— FSM factorization
— Optimization of exhaustive FSM traversal
— Use-case-based test sequence generation
— Test scenario modularization
— Friendly interface for test sequence generation and debugging
References
– Igor Bourdonov, Alexander Kossatchev, Alexander Petrenko, and Dmitri Galter. KVEST: Automated Generation of Test Suites from Formal Specifications. Proceedings of the World Congress on Formal Methods, Toulouse, France, LNCS 1708, 1999, pp. 608-621.
– Igor Burdonov, Alexander Kosachev, Victor Kuliamin. FSM using for Software Testing. Programming and Computer Software, Moscow - New York, No. 2, 2000.
Contacts
Alexander Petrenko
Institute for System Programming of the Russian Academy of Sciences (ISP RAS), Moscow, Russia
[email protected]
phone: +7 (095) 912-5317 ext 4404
fax: +7 (095) 912-1524
http://www.ispras.ru/~RedVerst/index.html