
Page 1:

Automated Comparison

Jerzy.Nawrocki@put.poznan.pl
www.cs.put.poznan.pl/jnawrocki/models/

Models and Analysis of Software, Lecture 12

Copyright 2003 Jerzy R. Nawrocki

Page 2:

Presentation plan

Summary of scripting techniques
Reference testing
Complex comparison

Page 3:

Summary of scripting techniques

Linear scripts
Structured scripting
Shared scripts
Data-driven scripts
Keyword-driven scripts

Page 4:

Summary of scripting techniques

Data-driven scripting = test inputs stored in a separate (data) file.

OpenFile 'ScribbleData'
For each record in ScribbleData
    Read INPUTFILE
    Read NAME1
    Read NAME2
    Read OUTPUTFILE
    Call ScribbleOpen (INPUTFILE)
    FocusOn 'Scribble'
    SelectOption 'List/Add Item'
    FocusOn 'Add Item'
    Type NAME1
    LeftMouseClick 'OK'
    ...
EndFor

countries, Sweden, USA, countries2
countries, France, Germany, test2
countries, Austria, Italy, test3

ScribbleData

Control script
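The same control loop can be sketched in a modern scripting language. Below is a minimal Python version; the stub functions standing in for the tool's GUI commands (ScribbleOpen, FocusOn, and so on) are illustrative only, not part of any real test tool.

import csv

# Stand-ins for the test tool's GUI commands; a real harness would drive
# the application under test here.
def scribble_open(name):      print(f"ScribbleOpen {name}")
def focus_on(window):         print(f"FocusOn {window}")
def select_option(option):    print(f"SelectOption {option}")
def type_text(text):          print(f"Type {text}")
def left_mouse_click(button): print(f"LeftMouseClick {button}")

def run_data_driven_tests(data_file):
    # Control script: one test per record (INPUTFILE, NAME1, NAME2, OUTPUTFILE).
    with open(data_file, newline="") as f:
        for input_file, name1, name2, output_file in csv.reader(f, skipinitialspace=True):
            scribble_open(input_file)
            focus_on("Scribble")
            select_option("List/Add Item")
            focus_on("Add Item")
            type_text(name1)
            left_mouse_click("OK")
            # ... add name2 the same way, then save as output_file

run_data_driven_tests("ScribbleData")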

Page 5:

A more sophisticated data-driven script

OpenFile 'ScribbleData'
Read INPUTFILE
Call ScribbleOpen (INPUTFILE)
Go to next record (i.e. row)
For each record in ScribbleData
    Read ADDNAME
    If ADDNAME <> Blank Then {
        FocusOn 'Scribble'
        SelectOption 'List/Add Item'
        FocusOn 'Add Item'
        Type ADDNAME
        LeftMouseClick 'OK'
    }
    Read MOVEFROM
    Read MOVETO
    ...
EndFor

ScribbleData

countries
Sweden
USA
4 1
Norway
27
countries2

Control script
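A Python sketch of this variant, assuming the stub GUI commands from the previous sketch are in scope: the first record names the file to open, and a blank ADDNAME field makes the script skip the Add step for that row.

import csv

def run_sophisticated_script(data_file):
    # Control script: the first record names the input file; each later record
    # holds optional ADDNAME / MOVEFROM / MOVETO fields. Reuses the stub GUI
    # commands (scribble_open, focus_on, ...) defined in the previous sketch.
    with open(data_file, newline="") as f:
        rows = csv.reader(f, skipinitialspace=True)
        scribble_open(next(rows)[0])        # first record (i.e. row): file to open
        for row in rows:
            add_name = row[0] if row else ""
            if add_name:                    # blank field => skip the Add step
                focus_on("Scribble")
                select_option("List/Add Item")
                focus_on("Add Item")
                type_text(add_name)
                left_mouse_click("OK")
            # ... read MOVEFROM / MOVETO from the remaining fields and
            # drive the Move dialog in the same way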

Page 6:

Keyword-driven scripts

Keyword-driven script = a data-driven script augmented with keywords representing user actions.

Test file:
ScribbleOpen countries
AddToList Sweden USA
SaveAs countries2

Control script:
For each TEST_ID
    OpenFile TEST_ID
    For each record in test file
        Read KEYWORD
        Call KEYWORD
    EndFor
    CloseFile TEST_ID
EndFor

Supporting scripts (SQABasic?): ScribbleOpen..., AddToList..., SaveAs...
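At its core, a keyword-driven control script is a dispatch table. A minimal Python sketch, with print stubs in place of the real supporting scripts; the test-file name is hypothetical.

# Supporting scripts, one per keyword (print stubs here; real ones would
# drive the application).
def scribble_open(filename):
    print(f"open {filename}")

def add_to_list(*names):
    print(f"add {', '.join(names)}")

def save_as(filename):
    print(f"save as {filename}")

KEYWORDS = {              # keyword -> supporting script
    "ScribbleOpen": scribble_open,
    "AddToList": add_to_list,
    "SaveAs": save_as,
}

def run_test_file(path):
    # Control script: read one keyword (plus its arguments) per line and
    # dispatch it to the matching supporting script.
    with open(path) as f:
        for line in f:
            if line.strip():
                keyword, *args = line.split()
                KEYWORDS[keyword](*args)

run_test_file("scribble_test.txt")   # a file holding the three lines shown above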

Page 7:

Presentation plan

Summary of scripting techniques

Reference testing
Complex comparison

Page 8:

Reference testing

1. Run an application and capture the outcomes
2. Review the outcomes
3. Use them in subsequent tests as a reference point

It is a kind of regression testing.
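A minimal Python sketch of this loop: the first run captures the outcome as the reference, which must then be reviewed by hand; later runs compare against it.

import shutil
from pathlib import Path

def check_against_reference(actual, reference):
    # Reference testing: capture on the first run, compare on later runs.
    actual, reference = Path(actual), Path(reference)
    if not reference.exists():
        shutil.copy(actual, reference)    # steps 1-2: capture, then review by hand
        print(f"Captured {reference} as reference - review it before trusting it")
        return True
    # step 3: the reviewed outcome serves as the expected outcome
    return actual.read_bytes() == reference.read_bytes()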

Page 9:

Reference testing

1. Run an application and capture the outcomes
2. Review the outcomes
3. Use them in subsequent tests as a reference point

Reference testing (RT) or expected outcomes prepared in advance (ExOPA)?

The amount of expected data: a little → ExOPA; a lot → RT.

Is the software under test available? No → ExOPA.

Verification quality is higher with ExOPA.

Page 10:

Presentation plan

Summary of scripting techniques
Reference testing

Complex comparison

Page 11:

Kinds of comparison

Dynamic comparison

Post-execution comparison

• passive

• active

Simple (dumb) comparison: requires identical match

Complex (intelligent) comparison: takes into account known differences

Page 12:

Complex comparison

When we need complex comparison:

• date and time

• unique identity code or number

• ?

Page 13:

Complex comparison

Sales Invoice No. 03/11803
Date: 26-May-2003

Code  Description  Price
--------------------------
CL/3  Chain link    2.00
HK/1  Hook          3.50
--------------------------
             Total: 5.50

Payment due by 09-Jun-2003

Page 14:

Complex comparison

Sales Invoice No. 03/11803
Date: 26-May-2003

Code  Description  Price
--------------------------
CL/3  Chain link    2.00
HK/1  Hook          3.50
--------------------------
             Total: 5.50

Payment due by 09-Jun-2003

IGNORE(1, 19:26)
IGNORE(2, 7:17)
IGNORE(11, 16:26)
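A sketch of how such positional masks might be implemented in Python before a plain comparison; line and column numbers are 1-based, as in the IGNORE annotations above, and the '#' filler is an arbitrary choice.

IGNORES = [(1, 19, 26), (2, 7, 17), (11, 16, 26)]   # (line, first col, last col)

def mask(lines, ignores=IGNORES):
    # Overwrite each masked column range with '#' so the volatile fields
    # compare equal; everything else must still match character for character.
    out = list(lines)
    for line_no, first, last in ignores:
        line = out[line_no - 1]
        out[line_no - 1] = line[:first - 1] + "#" * (last - first + 1) + line[last:]
    return out

def outputs_match(expected, actual):
    expected, actual = expected.splitlines(), actual.splitlines()
    return len(expected) == len(actual) and mask(expected) == mask(actual)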

Page 15:

Complex comparison

Sales Invoice No. 03/11803
Date: 26-May-2003

Code  Description  Price
--------------------------
CL/3  Chain link    2.00
HK/1  Hook          3.50
--------------------------
             Total: 5.50

Payment of $5.50 due by 09-Jun-2003

IGNORE(1, 19:26)
IGNORE(2, 7:17)
IGNORE(11, 16:26) ???

(The extra "of $5.50" shifts the date sideways, so the fixed column mask no longer covers it.)

Page 16:

Complex comparison

Sales Invoice No. 03/11803
Date: 26-May-2003

Code  Description  Price
--------------------------
CL/3  Chain link    2.00
HK/1  Hook          3.50
--------------------------
             Total: 5.50

Payment due by 09-Jun-2003

IGNORE_AFTER("Invoice No. ", 8)
IGNORE_AFTER("Date: ", 11)
IGNORE_AFTER("due by ", 11)
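Prefix-based masks are straightforward to mimic: find the prefix, then blank out a fixed number of characters after it. A Python sketch:

RULES = [("Invoice No. ", 8), ("Date: ", 11), ("due by ", 11)]

def ignore_after(line, prefix, width):
    # Mask `width` characters following `prefix`, wherever the prefix occurs.
    pos = line.find(prefix)
    if pos < 0:
        return line
    start = pos + len(prefix)
    return line[:start] + "#" * width + line[start + width:]

def normalise(text):
    lines = text.splitlines()
    for prefix, width in RULES:
        lines = [ignore_after(ln, prefix, width) for ln in lines]
    return "\n".join(lines)

# Unlike the positional masks, normalise(expected) == normalise(actual) still
# holds when extra text ("Payment of $5.50 ...") shifts the date sideways.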

Page 17:

Complex comparison

Sales Invoice No. 03/11803
Date: 26-May-2003

Code  Description  Price
--------------------------
CL/3  Chain link    2.00
HK/1  Hook          3.50
--------------------------
             Total: 5.50

Payment due by 09-Jun-2003

[0-9]{2}\/[0-9]{5}
DD"-"LLL"-"DDDD

Page 18:

Complex comparison

Sales Invoice No. @InvId
Date: @Date

Code  Description  Price
--------------------------
CL/3  Chain link    2.00
HK/1  Hook          3.50
--------------------------
             Total: 5.50

Payment due by @Date

@InvId  [0-9]{2}\/[0-9]{5}
@Date   DD"-"LLL"-"DDDD
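One way to implement template-based comparison in Python: escape the literal text of the template and substitute a regular expression for each placeholder. The DD/LLL field notation is rendered here as ordinary regexes; the five-digit second field matches the invoice number 03/11803.

import re

FIELDS = {                                      # placeholder -> field pattern
    "@InvId": r"[0-9]{2}/[0-9]{5}",             # e.g. 03/11803
    "@Date": r"[0-9]{2}-[A-Za-z]{3}-[0-9]{4}",  # DD-LLL-DDDD, e.g. 26-May-2003
}

def template_to_regex(template):
    # Literal template text must match exactly; each placeholder matches
    # its field pattern instead.
    pattern = re.escape(template)
    for name, field in FIELDS.items():
        pattern = pattern.replace(re.escape(name), field)
    return re.compile(pattern)

def line_matches(template_line, actual_line):
    return template_to_regex(template_line).fullmatch(actual_line) is not None

line_matches("Date: @Date", "Date: 26-May-2003")   # True
line_matches("Date: @Date", "Date: 26-05-2003")    # False: month must be letters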

Page 19:

Complex comparison

What to do with the ignored data?

• Remove them from a copy of the actual output

• Replace them with a constant
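For example, the "replace with a constant" option can be a one-line canonicalisation applied to both files before a plain diff; the <DATE> token is an arbitrary choice.

import re

DATE = re.compile(r"[0-9]{2}-[A-Z][a-z]{2}-[0-9]{4}")

def canonicalise(text):
    # Replace each volatile date with a fixed token so plain equality works.
    return DATE.sub("<DATE>", text)

assert canonicalise("due by 09-Jun-2003") == canonicalise("due by 26-May-2003")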

Page 20:

Summary

Simple masking comparison
Prefix-based comparison
Regular expressions
Template-based comparison

Page 21:

Questions?

Page 22:

Quality assessment

1. What is your general impression? (1-6)
2. Was it too slow or too fast?
3. What important things did you learn during the lecture?
4. What should be improved, and how?