8/6/2019 Test Automation Patterns 200207
(c) 2002 Bret Pettichord 1
Test Automation Architectures

These slides are distributed under the Creative Commons License.
In brief summary, you may make and distribute copies of these slides
so long as you give the original author credit and, if you alter,
transform or build upon this work, you distribute the resulting work
only under a license identical to this one.
For the rest of the details of the license, see
http://creativecommons.org/licenses/by-sa/2.0/legalcode.
August 2002
Test Automation Architectures
A Context-Based Approach
Bret Pettichord
[email protected]
www.pettichord.com
August 2002
Welcome
Getting the most out of this seminar:
Let us know of special interests.
Ask questions
- During class; or
- Write them down and share during a break
Share your experience and perspective.
Seminar Objectives
Understand the different options for test automation that are available to you.
Learn test automation concepts.
Identify important requirements for success.
Contrast the benefits of GUI, API, and other approaches.
Learn which contexts are most suitable for the different approaches.
Select a test automation architecture suitable for your project.
Agenda
Introduction
- Test Automation
- Patterns
- Context
- Architecture
- Mission
Quality Attributes
- Maintainability
- Reviewability
- Dependability
- Reusability
Architectural Patterns
- Scripting Frameworks
- Data-Driven Scripts
Architectural Patterns (Cont.)
- Screen-based Tables
- Action Keywords
- Test-First Programming
- API Tests
- Thin GUI
- Consult an Oracle
- Automated Monkey
- Assertions and Diagnostics
- Quick and Dirty
Are You Ready to Automate?
Concluding Themes
Background
Domains
- Technical publishing
- Expense reporting
- Sales tracking
- Database management
- Systems management
- Application management
- Internet access
- Math education
- Benefits administration
Tools
- SilkTest (QA Partner)
- WinRunner
- Rational Robot (TeamTest)
- TestQuest (Btree)
- WebLoad
- QA Run (Compuware)
Languages
- Perl
- Expect (TCL)
- Java
- Korn shell
- Lisp
- Python
Acknowledgements
Much of the material in this course results from discussions with colleagues:
- Los Altos Workshops on Software Testing (LAWST 1 & 2)
- Austin Workshops on Test Automation (AWTA 1 & 2)
- Workshop on Model-Based Testing (Wombat)
- Other course notes and reviews
Introduction
Test Automation, Patterns, Context, Architecture, Mission
A Fable: First Episode
Jerry Overworked starts an automation project (on top of everything else he is doing).
He can't get the test tool to work. He calls support several times.
Eventually the vendor sends out an engineer who gets it to work with their product.
Many months have passed.
Jerry refuses to work on automation any further.
A Fable: Second Episode
Kevin Shorttimer takes over. He is young and eager.
He is excited about doing test automation. He builds a large library and a complex testing system.
He uses automated tests for testing.
He actually finds bugs.
Kevin leaves for a development position.
A Fable: Final Episode
Ahmed Hardluck is given the test suite.
It takes him a while to figure out how to run the tests.
Major product changes break the automation.
Most tests fail.
Ahmed gets help and the test suite is repaired.
The tests eventually pass and the product ships. Unfortunately, the test suite ignored errors.
Customers are irate. The product is a failure.
A Fable: Some Problems
Spare-time test automation
Lack of clear goals
Lack of experience
Testing the wrong stuff
High turnover
Reaction to desperation
Reluctance to think about testing
Technology focus
Working in isolation
The Rules of Software Development
Define requirements
Manage source code, test data, tools
Design before coding
Review test automation code
Test the code
Track automation bugs
Document for other users
Establish milestones and deliverables
Don't be a level-zero organization
Seven Steps to Test Automation Success
1. Follow the Rules of Software Development
2. Improve the Testing Process
3. Define Requirements
4. Prove the Concept
5. Champion Product Testability
6. Design for Sustainability
7. Plan for Deployment
The Capture Replay Idea
Regression testing is repetitive, boring, mindless, and repetitive.
Therefore it should be easy to automate:
Capture the tester's events while testing.
Replay the same events on a later build.
If anything different happens, it must be a bug.
No programming required!
Birth of capture replay tools
[Does not print]
Capture Replay: Methodological Problems
Must specify tolerances and scope for what counts as "the same."
Must prepare for user interface changes.
Must be able to work in different configurations and environments.
Must track the state of the software under test.
Hard-coded data limits reuse.
Solving these problems requires programming!
How can Capture Replay tools be so smart?
They must instrument the system so that they can interrogate controls and intercept and insert user events (listed newest to oldest):
- Insert hooks into the operating system.
- Replace browser DLLs.
- Replace operating system DLLs.
- Supplant shared libraries by changing the load path.
- Provide their own instrumented window manager.
- Directly instrument the application event loop.
Capture Replay: Modify the System?
Modifying system libraries only adds another element of instability.
To do this, the test tool engineers must reverse engineer the system being instrumented.
They often have to use undocumented and unsupported interfaces, so minor configuration changes can break the test tool.
Recording technology is the most sensitive.
Capture Replay: Using the Latest Technology?
Your developers incorporate the latest version in your software. You get to test it.
The test tool developers also get the latest version. They reverse engineer it so that they can support it in the next release.
Test tools are always playing catch-up.
If they are behind, you will have to debug the problems yourself.
Capture Replay: Technical Problems
Tools are constantly playing catch-up with new technologies.
Instrumented systems tend to be more unreliable, especially with configuration variations.
- Recording technology is the most sensitive.
Some subtle system bugs can seriously confuse test tools.
Tools require special customization to support custom controls.
Capture Replay: Custom Controls
Custom controls are non-standard controls that a tool can't support out of the box.
GUI test tool experts must customize the test tools to provide the necessary support.
A control may or may not be custom depending on the tool you are using.
Examples:
- Grids with embedded drop-down lists
- Treeviews
- Delphi lists
- 3270 emulators
- PowerBuilder
- Icons
Capture Replay: Supporting Custom Controls
Mapping to known controls
Operating using key events
Computing mouse events
Using optical character recognition
Calling internal APIs
Faking it by accessing external data
Peeking at the clipboard
Capture Replay: A Pattern without a Context
When could Capture Replay work?
- The user interface is defined and frozen early.
- Programmers use late-market, non-customized interface technologies.
These situations are rare.
- There are very few reports of projects actually using Capture Replay successfully.
Published criticism of capture replay
[This foil does not print]
Test Automation Context
Staff Skills
- Find a way to leverage the skills available for testing.
Product Architecture
- Take advantage of all the interfaces of the software under test.
Test Mission
- Focus on key features or concerns that can benefit from the added power of automated testing.
[Diagram: triangle linking Mission, Staff, and Product]
Tester Skill Variation
User Specialists
- Understand the user perspective and the problem domain
- Have experience in the targeted user role
- Anticipate unwritten requirements
Technical Specialists
- Understand the technology and the solution domain
- Have experience with the technology
- Anticipate technology challenges
Automation Specialists
- Understand testing technology
- Have experience with test tools
- Anticipate automation needs
A mix of skills improves testing effectiveness.
Design your test strategy and automation architecture to allow contributions from all.
Staffing Models
User Experts + Tools
User Experts + Automation Experts
Junior Programmers
Tester/Programmers
Test Expert + Warm Bodies
Spare-time Automation
Central Automation Team
Forming an Automation Team
Treating automation as a development project requires committing staff to the automation as their first priority.
Do you have user specialists, technical specialists, automation specialists, or warm bodies?
Do you have someone with experience in automation to lead the project?
Do you have product developers who can lend assistance?
Activity: Contextual Questions About People
Answer these questions for your current or most recent project.
Source: LAWST 2; reprinted in Kaner, "Architectures of Test Automation" and "Avoiding Shelfware."
Product Architecture
Hardware and Software
Multiple machines
Distributed architecture
Networking
Databases
Multiple users, multiple user roles
Both GUI and non-GUI interfaces are available for testing
Event-driven & multi-threaded
What is Architecture?
Selection of tools, languages and components
Decomposition into modules
- Standard modules, which can be acquired
- Custom modules, which must be built
Distribution of labor and responsibility
Product and Test Architectures
Test Automation Architecture:
- Provides support for the testing
- Designed to meet the quality requirements of the testing
- Determined by test designers
Product Architecture:
- Provides the context for the testing
- Designed to meet the quality requirements of the customer
- Determined by system designers
Activity: Contextual Questions About Test Tools
Answer these questions for your current or most recent project.
Source: LAWST 2; reprinted in Kaner, "Architectures of Test Automation" and "Avoiding Shelfware."
Activity: Contextual Questions About Your Product
Answer these questions for your current or most recent project.
Source: LAWST 2; reprinted in Kaner, "Architectures of Test Automation" and "Avoiding Shelfware."
Contextual questions to ask about your product
[does not print]
Test Mission
What is your test mission?
- What kind of bugs are you looking for?
- What concerns are you addressing?
Possible missions
- Find important bugs fast
- Verify key features
- Keep up with development
- Assess software stability, concurrency, scalability
- Provide service
Make automation serve your mission.
Expect your mission to change.
Two Focuses
Efficiency
- Reduce testing costs
- Reduce time spent in the testing phase
- Improve test coverage
- Make testers look good
- Reduce impact on the bottom line
Service
- Tighten build cycles
- Enable refactoring and other risky practices
- Prevent destabilization
- Make developers look good
- Increase management confidence in the product
Automation projects with a service focus are more successful.
Test Strategy
Opportunities for test automation:
1. Software Setup (next slide)
2. Test creation
   - Test inputs
   - Expected results
   - Test selection
3. Test execution
   - External interfaces
   - Internal interfaces
4. Results evaluation
   - Consulting oracles
   - Comparing baselines
Automating execution can leave lots of manual work remaining.
[Diagram: Specify Tests (2) -> Execute Tests (1) -> Verify Results (3)]
Test Setup
Software testing usually requires lots of setup activities in preparation for testing:
- Installing Product Software
- Configuring Operating Systems
- Initializing Databases
- Loading Test Data
Many of these activities can be automated.
System administration tools are often useful and cost-effective.
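The setup activities above can be scripted end to end. A minimal sketch in Python, where every function and field name is hypothetical (the slides do not prescribe an implementation); the point is ordering the steps in one driver so a broken precondition fails loudly instead of corrupting later tests:

```python
# Hypothetical setup steps for a product under test. Each step records
# its effect in a shared environment dict and checks its preconditions.

def install_product(env):
    env["product_installed"] = True

def init_database(env):
    if not env.get("product_installed"):
        raise RuntimeError("install product before initializing the database")
    env["db_ready"] = True

def load_test_data(env, rows):
    if not env.get("db_ready"):
        raise RuntimeError("database must be initialized first")
    env["rows"] = list(rows)

def set_up(rows):
    """Run the setup steps in order and return the prepared environment."""
    env = {}
    install_product(env)
    init_database(env)
    load_test_data(env, rows)
    return env

env = set_up([{"name": "mfayad", "password": "xyz"}])
print(len(env["rows"]))  # 1
```

In practice each step would shell out to an installer, a database tool, or a system administration utility; the driver structure stays the same.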
Contextual questions to ask about your mission
[does not print]
Quality Attributes
Maintainability, Reviewability, Dependability, Reusability
Essential Capabilities
"Automation is replacing what works with something that almost works, but is faster and cheaper." (Professor Roger Needham)
What trade-offs are we willing to make?
Functional requirements for test automation
- The tests that need to be executed (and re-executed)
Quality (non-functional) requirements for test automation
- The requirements that result from the fact that we are automating
Maintainability
Will the tests still run after product design changes?
Will tests for 1.0 work with 2.0?
Can tests be easily updated for 2.0?
Will tests fail because of changes to the output format?
Maintainability
[does not print]
User Interfaces Change
Your GUI test automation will likely break. What can you do?
- Prevent developers from changing the interfaces
- Design your automation so it is adaptable
- Test via non-user interfaces
Reviewability
Can others review the test scripts and understand what is being covered?
Are the test scripts documented?
Can we make sure that the automated script matches the original test design?
What kind of coverage does the test suite have? How can we know?
Is the test testing the right stuff?
Repeatability
Will your test do the exact same thing every time?
Is random data embedded in your tests?
Do your tests modify objects in a way that prevents them from being re-run?
Integrity
Can your test results be trusted?
Do you get lots of false alarms?
Are you sure that failed tests always appear in the test results?
Is it possible for tests to be inadvertently skipped?
A Basic Principle: Automated tests must fail if the product under test is not installed.
Reliability
Will the test suite actually run?
Will tests abort?
Can you rely on the test suite to actually do some testing when you really need it?
Will it run on all the platforms and configurations you need to test?
Dependability
[does not print]
Reusability
To what degree can testing assets be reused to create more, different tests?
This goes beyond mere repetition.
Can you amass a collection of data, procedures, mappings and models that can be reused in new ways to make more testing happen?
Independence
Can your tests be run individually, or only as part of a suite?
Can developers use them to reproduce defects?
Will your tests run correctly if previous tests fail?
Will one failure cause all succeeding tests to fail?
Performance
Rarely is it worth optimizing test automation code for performance.
Supporting independence and repeatability can impact performance.
But performance improvements can complicate tests, reduce reliability, and may even damage integrity.
Performance Example
[does not print]
Simplicity
"Things should be as simple as possible, but no simpler." (Einstein)
Complexity is the bugbear of test automation.
You will need to test your test automation, but you are likely to have few resources for this.
Therefore your architecture must be as simple and perspicuous as possible.
Quality Attributes
Maintainability
Reviewability
Repeatability
Integrity
Reliability
Reusability
Independence
Performance
Simplicity
Quality Attribute Assessment
[does not print]
Quality Attributes for Architectural Patterns
We will assess each architectural pattern in terms of
- Maintainability
- Reviewability
- Dependability (Reliability & Integrity)
- Reusability
Test Automation Architecture
Generally, software architecture is:
- Selection of tools, languages and components
- Decomposition into modules
  - Standard modules, which can be acquired
  - Custom modules, which must be built
- Distribution of labor and responsibility
Our approach:
- Context
  - People
  - Product
  - Mission
- Test strategy
  - Test creation
  - Test execution
  - Test evaluation
- Quality Attributes
  - Maintainability
  - Reviewability
  - Dependability
  - Reusability
Scripting Framework
You want to create a lot of tests without building a lot of custom support code.
Therefore, use a generalized scripting framework. Extend it as needed for your project.
Most commercial GUI test tools are scripting frameworks with GUI drivers.
Scripting Framework: Example

[-] testcase response_notice_group ()
[ ]   Desktop.SetActive ()
[ ]   Desktop.IconView.DoubleClick ("Notices")
[ ]   ReadNotices.Update.Click ()  // update
[ ]   ReadNotices.ListBox.Select ("Sentry*")
[ ]   ReadNotices.CatchUp.Click ()
[ ]   ReadNotices.Close.Click ()
[ ]   Desktop.IconView.DoubleClick ("sentry-region")
[ ]   PolicyRegion.IconView.DoubleClick ("BVT_DM")
[ ]   ProfileManager.Profiles.DoubleClick ("Sentry_BVT")
[ ]   DistributedMonitoring.AddMonitor.Click ()
[ ]   AddMonitor.Collections.Select ("Universal")
[ ]   AddMonitor.Sources.Select ("Swap space available")
[ ]   AddMonitor.AddEmpty.Click ()
[ ]   EditMonitor.ResponseLevel.Select ("always")
[ ]   EditMonitor.SetMonitoringSchedule.Click ()
[ ]   SetMonitoringSchedule.CheckEvery.SetText ("1")
[ ]   SetMonitoringSchedule.TimeUnits.Select ("minutes")
[ ]   SetMonitoringSchedule.ChangeClose.Click ()
[ ]   EditMonitor.SendTivoliNotice.Click ()
[ ]   EditMonitor.NoticeGroupList.Select ("Sentry")
[ ]   EditMonitor.ChangeClose.Click ()
[ ]   DistributedMonitoring.Profile.Save.Pick ()
[ ]   DistributedMonitoring.Profile.Distribute.Pick ()
[ ]   DistributeProfile.DistributeClose.Click ()
[ ]   DistributedMonitoring.Profile.Close.Pick ()
[ ]   sleep (80)
[ ]   Desktop.SetActive ()
[ ]   Desktop.IconView.DoubleClick ("Notices")
[ ]   ReadNotices.ListBox.Select ("Sentry (* unread)")
Scripting Framework: Context
People
- Tester/Programmers
- Test Tool Specialists
Product
- Can automate GUI testing
- Frameworks also support API and unit testing
Mission
- Automate tests that will last the life of the product
- Automate tests that are defined in advance
- Provide maximum flexibility of approach
Scripting Framework: Basic Components
Scripting Language
- Often proprietary
- Standard languages are preferred
Test Scripts
- Written in the scripting language
- May contain calls to library functions
Test Harness
- Executes tests and collects test results
- Optional support for preconditions and postconditions (setup and teardown)
Scripting Framework: Test Harness
Necessary Capabilities
- Run the test scripts
- Collect test verdicts
- Report test results
Optional Capabilities
- Check test preconditions (abort or correct if not met)
- Allow selected subsets of tests to run
- Distribute test execution across multiple machines
- Distinguish between known failures and new failures
- Allow remote execution and monitoring
- Use an Error Recovery System (later)
Scripting Framework: Examples

                 | Command-line testing | Unit testing    | Character-based testing
Test Harness     | TET                  | Xunit, Junit    | DejaGNU
Language         | Perl, Shell          | Source language | TCL
Interface Driver | N/A                  | N/A             | Expect

These frameworks are freely available.
Scripting Framework: GUI Support Components
Widget Support Library
- Driver functions for Graphical User Interfaces
- Identify user interface components using specified qualifiers
- Insert events into the input stream aimed at those components
- Requires customization to support custom controls
Recorders
- Action Recorders
  - Record user actions as scripts
- Object Recorders
  - Report control identities by class and properties
  - Assist hand-coding
  - "Spy"
Recorders are useful for creating tests.
Scripting Framework: Test Strategy
Test Creation
- Tests are hand-coded, using capture replay when possible.
- The product must be available before writing tests.
Test Execution
- The test harness and scripting language provide an execution environment for the tests.
Test Evaluation
- Expected outcomes are hand-coded when tests are created, or
- The framework supports capturing a baseline for later comparison when the test is first run.
Using Scripting Framework for GUI Testing
Verify support for custom controls. Customize as necessary. Encapsulate any special tricks.
Determine the best strategy for window maps.
Create test scripts in the scripting language, using recorders for assistance.
Use cut and paste to create test variants. Avoid placing re-used code in libraries. (See next slide.)
Build or customize a recovery system to handle known error messages. Build tests to verify the recovery system.
Cut and Paste: True Evil?
Common Wisdom
- Avoid using cut and paste
- Repeated code smells bad
- Instead, place it in functions
The Testing Exception
- Tests by their nature use lots of repetition
- Tests are easier to review if they don't include distracting function calls and conditional statements
- Tests become less reliable with added complexity
Therefore, use cut and paste when it makes tests easier to understand and modify.
Scripting Framework: Considerations
When might you use it?
- Pilot projects. It serves as a foundation for more complicated architectures.
- Minimum complexity that provides reasonable flexibility and robustness.
When might it be enough?
- The user interface is very stable, or the product has a short life-span.
- Domain specialists can program, or tests are well-specified.
Scripting Framework: Quality Attributes
Maintainability
- Medium
- Additional design patterns can be used to insulate tests from interface changes.
- Testers will require discipline to avoid using unmaintainable constructs.
Reviewability
- Low
- Tests are written in a scripting language that may not be known by many.
Dependability
- Medium
- The scripting language provides the freedom to write clever, complex, error-prone code.
Reusability
- Medium
- The framework can be the foundation of other architectures.
Error Recovery System
It resets the system when a test fails.
- Without a recovery system, test suites are prone to cascading failures (the domino effect).
A recovery system may
- Close extraneous application windows
- Shut down and restart the product
- Reboot the hardware
- Reset test files
- Reinitialize the database
- Log the error
A recovery system returns the product to a base state.
- Tests need to start from this base state.
Data-Driven Scripts
Good testing practice encourages placing test data in tables.
Hard-coding data in test scripts makes them hard to review and invites errors and omissions.
Therefore, write bridge code to allow test scripts to directly read test parameters from tables.
Data-Driven Testing & Make Test Data Convenient
[does not print]
Data-Driven Scripts: Example

Name     | Password | Result
mfayad   | xyz      | login
dschmidt | 123      | expired
rjohnson | abc      | reject
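A data-driven script for a table like the one above might look like the following sketch. `check_login` is an invented stand-in for code that drives the real product; the table-reading pattern is the point:

```python
# Data-driven login test: one test procedure, many parameter rows.
import csv
import io

TEST_DATA = """name,password,expected
mfayad,xyz,login
dschmidt,123,expired
rjohnson,abc,reject
"""

def check_login(name, password):
    # Stand-in for the product under test.
    accounts = {"mfayad": ("xyz", "login"),
                "dschmidt": ("123", "expired")}
    pw, status = accounts.get(name, (None, "reject"))
    return status if pw == password else "reject"

def run_data_driven():
    """Run the procedure once per row; return names of failing rows."""
    failures = []
    for row in csv.DictReader(io.StringIO(TEST_DATA)):
        actual = check_login(row["name"], row["password"])
        if actual != row["expected"]:
            failures.append(row["name"])
    return failures

print(run_data_driven())  # [] means every row behaved as expected
```

Adding a new test case is now a new spreadsheet row, not new code, which is what makes the pattern accessible to non-programmers.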
Data-Driven Scripts: Context
People
- Tester/Programmers create the test procedure scripts
- Anyone can create the inputs
Product
- Any functionality that must be tested with lots of variations
Mission
- Execute lots of tests with varying parameters
Data-Driven Scripts: Example 2
Captio
nlocation
Captio
ntypeface
Captio
nstyle
Captio
nGraphic(CG)
CGfo
rmat
CGsi
ze
Bound
ingboxwidth
1 Top Times Normal Yes PCX Large 3 pt.
2 Right Arial Italic No 2 pt.
3 Left Courier Bold No 1 pt.
4 Bottom Helvetica Bold Italic Yes TIFF Medium none
Data-Driven Scripts: Test Strategy
Test Creation
- Anyone can enter test parameters in a spreadsheet
- Tests may be automatically generated using spreadsheet formulas or external programs
- Live data may be used from legacy systems
Test Execution
- Tests are executed by data-driven scripts
Test Evaluation
- Specify expected results with test inputs, or
- Deliver input parameters with outputs to facilitate manual verification
Data-Driven Scripts: Components
Data-Driven Scripts
- Test procedure scripts that execute in a Scripting Framework
Test Case Parameters
- Stored in a spreadsheet
Bridge Code
- A framework library allows the test script to read spreadsheet data
Data-Driven Scripts: Considerations
When might you use this?
- You intend to run lots of variations of a test procedure
- Domain specialists are defining tests but are not programmers
- You have automation specialists
- You would like to reduce dependency on a specific test tool
When might this be enough?
- Your tests vary in terms of data rather than procedure
Data-Driven Scripts: Quality Attributes
Maintainability
- High
- The test procedure script can be adapted to interface changes without requiring changes to the data.
Reviewability
- Medium/High
- Test data is easy to review.
- Test procedure scripts should be double-checked with known test data first.
Dependability
- Medium
- Reviewability helps.
- The test procedure script must be reviewed to ensure that it is executing as expected.
- Supporting navigation options can complicate it and increase the chance of errors.
Reusability
- High
- Test data may be able to be used with different test procedure scripts.
Screen-based Tables
Non-programmers want to create and automate lots of tests.
They don't want to work through middlemen, and the screen designs are fixed in advance.
Therefore, use tables to specify the windows, controls, actions and data for the tests.
Screen-Based Tables: Context
People
- Many non-programming testers
- Dedicated automation staff
Product
- User interface defined early
- It won't change late
Mission
- Test business logic from a user perspective
- Allow tests to be reviewed by anybody
- Avoid committing to a test tool
Screen-Based Tables: Test Strategy
Test Creation
- User domain specialists specify tests, step-by-step, in spreadsheets
- Tests can be written as soon as the screen design is fixed
Test Execution
- Tests are executed by a dispatcher script running in a framework
Test Evaluation
- Expected results are also specified in the test case spreadsheets
-
Screen-Based Tables: Components
Test Tables
n Screen-based descriptions stored in spreadsheets
Dispatcher
n Test script that reads in test tables line by line and executes them.
Bridge Code
n Allows the dispatcher script to access spreadsheet data
Window Maps
n Define the names of widgets on a screen and how they can be accessed.
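A dispatcher over screen-based tables can be sketched briefly. The table format (window, control, action, value) follows the pattern; the `FakeGui` class stands in for the bridge code and GUI tool, and all names are illustrative.

```python
# One row per action: window, control, action keyword, value.
TEST_TABLE = [
    ("Login", "Username", "type",  "admin"),
    ("Login", "Password", "type",  "secret"),
    ("Login", "OK",       "click", ""),
]

class FakeGui:
    """Stand-in for bridge code that would drive a real GUI test tool."""
    def __init__(self):
        self.log = []
    def type(self, window, control, value):
        self.log.append(f"type {value!r} into {window}.{control}")
    def click(self, window, control, value):
        self.log.append(f"click {window}.{control}")

def dispatch(table, gui):
    # Read the test table line by line and execute each row.
    for window, control, action, value in table:
        getattr(gui, action)(window, control, value)

gui = FakeGui()
dispatch(TEST_TABLE, gui)
print(len(gui.log))  # 3
```

Because the action vocabulary is predefined (window, object, action, value), the dispatcher is general purpose and one copy serves every test.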
-
Window Maps
Abstraction layer that improves maintainability
Provides names and functions for conveniently accessing all controls or widgets
Included in the better test tools
n GUI Map, Window Declarations
Costs to generate depend on tool
Using window maps also greatly improves script readability
-
Window Maps (cont'd)
[does not print]
-
Window Map Examples

A (map widgets by internal name):
  Login    -> intname USERNM
  Password -> intname PWD

B (map widgets by index):
  Login    -> textbox #1
  Password -> textbox #2
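The two mapping styles above can be sketched as a plain lookup table; the screen name, logical names and locator strings are the ones from the example, everything else is an assumption for illustration.

```python
# Window map: logical widget names on the left, tool-specific
# locators on the right (style A: by internal name).
WINDOW_MAP = {
    "LoginScreen": {
        "Login":    {"intname": "USERNM"},
        "Password": {"intname": "PWD"},
    },
}

def locate(screen, logical_name):
    """Resolve a logical widget name to its locator."""
    return WINDOW_MAP[screen][logical_name]

print(locate("LoginScreen", "Login"))  # {'intname': 'USERNM'}
```

When the application renames `USERNM`, only this one map entry changes; every test keeps referring to the logical name `Login`, which is the maintainability win the slide describes.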
-
Screen-Based Tables: Quality Attributes
Maintainability
n Medium.
n Minor user interface changes can be handled using the Window Maps
Reviewability
n High.
n Tests can be reviewed by almost anyone
Dependability
n High.
n Error handling and logging is isolated to the execution system, which can be optimized for reliability
Reusability
n Low.
n Test format facilitates replacing the GUI test tool if the need arises.
-
Action Keywords
You would like easy-to-read test scripts that can be created by business domain experts who may not be programmers.
Therefore, define Action Keywords that can appear in spreadsheets yet correspond to user tasks.
-
Action Keywords: Example
Go To          | AddressBook
New Address    | Smith  | John  | 1010 Main St | 512-555-1212
Click On       | Done
Click On       | Name   | Address | John Smith
Verify Address | Smith  | John  | 1010 Main St | 512-555-1212
Change Address | XSmith | XJohn | 1010 Main St | 512-555-xxxx
Click On       | Done
Click On       | Name   | Address | John Smith
Verify Address | XSmith | XJohn | 1010 Main St | 512-555-xxxx
-
Action Keywords: Context
People
n Business domain experts write the test scripts
n Automation experts write the task libraries and navigation code
Product
n Typically used with GUI interfaces
n Can also be used effectively with other interfaces (e.g. telephone)
Mission
n Support thorough automated testing by business domain experts
n Facilitate test review
n Write test scripts before software is available for testing.
n Create tests that will last.
-
Action Keywords: Components
Keyword-based tests
n Tests in spreadsheet format
Keyword definitions
n Supported keywords and arguments
n Mapped to task library functions
Task library
n Functions that execute the tasks
n Written in the scripting language
Dispatcher
n Parses the spreadsheet data and executes the corresponding library function
Bridge code
n Allows the dispatcher script to read spreadsheet data
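The keyword-to-task mapping above can be sketched in a few lines; the keywords follow the earlier example, while the task functions and the `actions` list (standing in for the application under test) are invented for illustration.

```python
actions = []  # stands in for the application under test

# Task library: one function per user task, in the scripting language.
def go_to(screen):
    actions.append(f"goto:{screen}")

def click_on(target):
    actions.append(f"click:{target}")

# Keyword definitions: keywords mapped to task library functions.
KEYWORDS = {"Go To": go_to, "Click On": click_on}

def dispatch(rows):
    # Each spreadsheet row: keyword in the first cell, arguments after it.
    for keyword, *args in rows:
        KEYWORDS[keyword](*args)

dispatch([("Go To", "AddressBook"), ("Click On", "Done")])
print(actions)  # ['goto:AddressBook', 'click:Done']
```

The spreadsheet stays readable by domain experts because the row vocabulary is user tasks, not widget operations; only the task library knows about the interface.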
-
Action Keywords: Test Strategy
Test Creation
n Business domain specialists create tests in spreadsheets.
n Automation specialists create keywords and task libraries.
n Tests can be created early.
Test Execution
n Tests are executed using a dispatcher and framework.
Test Evaluation
n Expected results are defined as verification keywords when tests are authored.
-
Action Keywords: Sample Architecture

[Diagram: Test Cases feed Dispatch Control via Bridge Code; the dispatcher calls the Task Library, which uses Window Maps, a User Interface Driver, Support for Custom Controls, Error Recovery and Custom Testing Hooks to drive the User Interface Components.]
-
Action Keywords
When might you use this?
n Non-technical domain specialists
n Tests will be used for a long time
n Wish to write tests early
n Expect user interface changes
What are the risks?
n Complexity
n Cost
n Dispatchers and task libraries must be tested
-
Action Keywords: Quality Attributes
Maintainability
n High.
n Only the task libraries need to be updated when user interfaces change
Reviewability
n High.
n Test format is easy to understand
Dependability
n Medium.
n It really depends on how well the dispatcher and task functions are engineered
Reusability
n Medium.
n Tasks can be reused for many tests.
-
Comparison of Spreadsheet-based Architectures
                          | Data-Driven Scripts                                            | Action Keywords                              | Screen-Based Tables
Rows in the test files    | One per test case                                              | One per step                                 | One per action
Columns in the test files | Column semantics defined by each procedure script for all rows | Action word determines semantics for the row | Predefined as window, object, action, value
Control Script            | One per test procedure                                         | General purpose. For many test procedures    | General purpose. For many test procedures
-
Task Libraries
Writing good libraries is never simple
Two forces make test libraries a particular challenge
n Test variations present ample opportunities for premature generalization
n Test libraries are less likely to be tested themselves
-
Task Libraries: Principles
Therefore use test libraries as a way to make test scripts easier to understand, rather than being concerned with reducing lines of code
n Focus on grouping tasks in users' terms
n Document start and end states
n Only create functions that will be used dozens of times
n Write tests for your libraries
There is nothing wrong with open coding
-
Task Libraries: Task Definition
Tasks are common sequences of actions that appear repeatedly in tests
They may take place on a single screen or span a couple of screens, but they usually do involve multiple widgets
They closely map to manual procedures in terms of:
n detail and specificity
n terminology
They must note and verify start and end states
Tasks may require more verification than typically appears in manual test descriptions
-
Test Library Guidelines
[does not print]
-
Task Libraries: Examples
CreateAccount (Name, Address, CreditLimit)
n Creates a customer account and returns to the main screen.
OrderItem (ProductID, Quantity)
n Adds the specified product to the order sheet
CompleteOrder (CreditCard, Address)
n Completes the existing order using the specified credit card and address.
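One of the example tasks can be sketched with its start and end states noted and verified, as the principles above demand. The `FakeDriver` class and its methods are stand-ins for a real GUI driver; only the task name comes from the example.

```python
class FakeDriver:
    """Stand-in driver so the task can be exercised without a GUI tool."""
    def __init__(self):
        self.steps = []
    def click(self, control):
        self.steps.append(("click", control))
    def fill(self, **fields):
        self.steps.append(("fill", sorted(fields)))
    def current_screen(self):
        return "Main"

def create_account(driver, name, address, credit_limit):
    """Create a customer account.

    Start state: main screen. End state: main screen (verified below).
    """
    driver.click("NewAccount")
    driver.fill(Name=name, Address=address, CreditLimit=credit_limit)
    driver.click("Save")
    # Verifying the end state keeps later tasks from starting in the wrong place.
    assert driver.current_screen() == "Main"

d = FakeDriver()
create_account(d, "Smith", "1010 Main St", 5000)
print(len(d.steps))  # 3
```

Note the docstring states the start and end states in users' terms, so the task reads like a line of a manual procedure rather than widget-level code.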
-
Types of Library Functions
Task Libraries
n Encapsulate common user tasks.
Navigation Libraries
n Facilitate navigating through the elements of the user interface
Verification Libraries
n Check user interface elements against expected results
n Will automatically log errors when appropriate
Test Harness Libraries
n Support preconditions and postconditions
n An Error Recovery System is an example
-
Don't build test libraries simply to avoid repeating code
-
Task Libraries: Contexts
When might you use them?
n User interface is expected to change
n Lots of tests will be automated
What are the risks?
n Library design integrity must be adhered to
n Larger up-front costs
n Increased complexity
-
Levels of Testing: System Testing
System testing tests a complete software system from a user perspective.
The most popular kind of system testing is GUI testing.
[Diagram: levels of testing — Unit Testing inside Component Testing inside System Testing]
-
Levels of Testing: Unit Testing
Units are functions, methods or other lexical collections of code
Units are in a specific language
Unit testing is usually done in the same language as that being tested
Unit testing is usually done by programmers, sometimes the same ones who wrote the code under test
-
Levels of Testing: API and Unit Testing Comparison
                  | Unit Testing                          | API Testing
What's Tested     | Applies to units large and small      | Typically provided for large units only
Effort to Expose  | Exposed automatically                 | APIs must be created
Test Languages    | Same language as product              | Often different language from product
Public vs Private | Typically private                     | Typically public
-
Levels of Testing: Unit Integration Testing
What is a unit?
n A class, function or procedure
n Typically the work product of a single developer

Unit isolation testing
n Test each unit in isolation
n Create stubs or simulators for units being depended on
n Expensive and often difficult

Unit integration testing
n Test units in context
n Allow calls to classes, functions and components that the unit requires
n Typically cheaper and easier
-
Test-First Programming
You want to be sure that the code works as soon as it is written.
And you want to have regression tests that will facilitate reworking code later without worry.
Therefore use Test-First Programming, a technique that uses testing to structure and motivate code development.
n Many programmers believe that Test-First Programming results in better design.
-
Test-First Programming: Context
People
n Programmers use this technique when they develop code
Product
n Typically used on products using iterative development methodologies
Mission
n Test code before it is checked in
n All code must have automated tests
-
Test-First Programming: Test Strategy
Test Creation
n Write a test for anything that could possibly fail
n Programmers create a test, write some code, write another test
n Tests are written in the same language as the product code.
Test Execution
n Tests are executed using a unit testing framework
Test Evaluation
n Expected results are specified when the tests are written
-
Test-First Programming: Components
Unit Test Framework
n Several are in the public domain
Tests
n Written in the same language as the product code
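A minimal round of the cycle can be shown with Python's standard `unittest` framework (one of the public-domain frameworks the slide alludes to). The `add` function is a toy example: the test case is written first, then just enough code to make it pass.

```python
import unittest

def add(a, b):
    # Written after the test below, to make it pass.
    return a + b

class AddTest(unittest.TestCase):
    # The test comes first: it states the expected result before the code exists.
    def test_add(self):
        self.assertEqual(add(2, 3), 5)

suite = unittest.TestLoader().loadTestsFromTestCase(AddTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Running the test before `add` exists (and watching it fail) is what "tests the tests": a test that never failed proves nothing about the code it guards.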
-
Test-First Programming: Quality Attributes
Maintainability
n Medium.
n Tests are maintained with the code.
Reviewability
n Medium.
n Can be reviewed by other programmers.
Dependability
n Medium/High.
n Tests are run before the code is written. This helps to test the tests.
Reusability
n Low
-
API Tests
User interface tests are often tricky to automate and hard to maintain.
Therefore, use or expose programming interfaces to access the functionality that needs to be tested.
-
API Tests: Example Interfaces
Interfaces may be provided specifically for testing.
n Excel
n Xconq
Existing interfaces may be able to support significant testing.
n InstallShield
n Autocad
n Interleaf
n Tivoli
n Any defined client/server interface
Any interface is easier to automate than a GUI.
-
Hidden test interfaces
[does not print]
-
API Tests: Context
People
n Programmer/Testers write the tests.
n Good cooperation between testers and product programmers.
Product
n Any product with some kind of programming interface
n Any product that can have an interface added
Mission
n Find an effective way to write powerful automated tests.
n Testing starts early.
-
API Tests: Uncovering APIs
Does your product have an API?
n Ask — it might be undocumented
Client/Server protocol interfaces may be available.
APIs or command line interfaces may be available
Diagnostic interfaces may also be available.
-
API Tests: Building APIs
Request test interfaces.
n You may be surprised and get what you ask for.
It may be cheaper to create or expose one for testing than to build GUI test automation infrastructure
-
API Tests: Test Strategy
Test Creation
n Tests are written in a scripting language.
Test Execution
n Tests are executed in a scripting framework or using a programming language.
Test Evaluation
n Expected results are specified when tests are written
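An API-level test written in a scripting language can be sketched as follows. The `OrderApi` class is a stand-in for a product's programming interface (its methods are invented for this example); the point is that the test drives functionality directly, with expected results stated in the test itself.

```python
class OrderApi:
    """Stand-in for a product's programming interface."""
    def __init__(self):
        self.items = []
    def order_item(self, product_id, quantity):
        if quantity <= 0:
            raise ValueError("quantity must be positive")
        self.items.append((product_id, quantity))
    def total_items(self):
        return sum(q for _, q in self.items)

api = OrderApi()
api.order_item("SKU-1", 2)
api.order_item("SKU-2", 3)
assert api.total_items() == 5      # expected result specified in the test
try:
    api.order_item("SKU-3", 0)     # invalid input should be rejected
    rejected = False
except ValueError:
    rejected = True
assert rejected
print("API tests passed")
```

No window maps, no synchronization waits, no tool license: the stability and speed of tests like this are why the pattern rates maintainability as high.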
-
API Tests: Quality Attributes
Maintainability
n High.
n APIs tend to be more stable than GUIs
Reviewability
n Medium/High.
n Tests are written in a standard scripting language.
Dependability
n Medium.
n The ability to write flawed tests is unimpeded.
Reusability
n Low.
-
Levels of Testing: Comparison of Approaches
                      | Unit                                  | API                                      | GUI
Who can do it?        | Developers                            | Programmer/users                         | Testers + automation experts
Tool support required | Unit test harnesses (public domain)   | Scripting test harnesses (public domain) | GUI test tools (purchase)
Training required     | Tests are in same language as product | Need to understand API                   | Requires tool training
Flexibility           | Anyone can run                        | Anyone can run                           | Must have tool license to run
-
Thin GUI
You want to test the user interface without using a GUI test tool.
Therefore, design the GUI as a thin layer of presentation code atop the business logic. Use unit or API test techniques to test the business logic layer.
-
Comparing Two Kinds of Test Interfaces
Which interface will be easier to test?
[Diagram: the Product Under Test is split into Presentation Code and Domain Code. Test Automation Libraries can reach it through the User Interface, or through a Programming Interface that bypasses the presentation layer.]
-
Thin GUI: Context
People
n Programmer/Testers
Product
n Product presentation layer is developed as a thin layer atop the business logic code.
Mission
n Automate user interface tests using an existing unit test framework
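The split can be illustrated with a toy example (the discount rule and function names are invented): all logic lives in a plain function that unit tests hit directly, while the presentation layer only formats and forwards.

```python
def compute_discount(total):
    """Domain code: all the business logic lives here, below the GUI."""
    return round(total * 0.1, 2) if total >= 100 else 0.0

def render_discount(total):
    """Thin presentation layer: nothing here is worth a GUI test tool."""
    return f"Your discount: ${compute_discount(total):.2f}"

# Unit tests exercise the domain code without any GUI tool.
assert compute_discount(50) == 0.0
assert compute_discount(200) == 20.0
print(render_discount(200))  # Your discount: $20.00
```

The thinner `render_discount` is, the less remains that only traditional GUI testing can cover, which is the trade the pattern makes.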
-
Thin GUI: Test Strategy
Test Creation
n Tests are created as unit tests.
Test Execution
n Tests are executed in the unit test framework.
Test Evaluation
n Expected results are defined in the tests.
-
Thin GUI: Quality Attributes
Maintainability
n High.
n Splitting the GUI separates tests from interface changes.
n Technical issues, however, may interfere.
Reviewability
n Medium.
n Other programmers can review.
n No special tool language used.
Dependability
n Medium.
n Note that some traditional testing of the GUI is still required.
Reusability
n Medium.
n The unit test framework is being reused.
-
Consult an Oracle
You want to evaluate lots of tests.
Therefore, Consult an Oracle.
An oracle is a reference program that computes correct results.
-
Consult an Oracle: Examples
A calculator is tested by randomly generating inputs. The resulting outputs are compared to the results from Mathematica.
Changes to a business system are tested by randomly generating inputs and then comparing the results to those from a previous version.
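The calculator example can be sketched with the standard library playing the trusted oracle. The Newton's-method `sut_sqrt` is a made-up implementation under test; `math.sqrt` is the oracle, and the tolerance rule is an example of the accuracy decisions the pattern requires.

```python
import math
import random

def sut_sqrt(x):
    # Hypothetical implementation under test (Newton's method).
    guess = x
    for _ in range(50):
        guess = (guess + x / guess) / 2
    return guess

random.seed(1)
for _ in range(100):
    x = random.uniform(1.0, 1e6)
    expected = math.sqrt(x)  # the oracle computes the correct result
    # Rule: accept agreement to within a relative tolerance of 1e-6.
    assert abs(sut_sqrt(x) - expected) <= 1e-6 * expected, x
print("oracle agrees on 100 random inputs")
```

Because the inputs are generated, test creation is nearly free; the engineering effort goes into choosing the oracle's domain of authority and the acceptable accuracy, exactly as the Test Evaluation bullet on the next slide describes.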
-
Consult an Oracle: Context
People
n Whoever
Product
n A suitable oracle must already exist
Mission
n Test thoroughly
-
Consult an Oracle: Test Strategy
Test Creation
n Typically large numbers of tests are generated randomly
Test Execution
n A testing framework sends the test inputs to both the system under test and the oracle.
Test Evaluation
n The results are compared.
n Typically rules must be defined regarding the domain in which the oracle is considered authoritative and the degree of accuracy that is acceptable.
-
Consult an Oracle: Five Types of Oracles
None
n Doesn't check correctness of results
n Just makes sure the program doesn't crash
True
n Independent generation of results
n Often expensive
Consistency
n Compares results from different runs/versions.
n Gold files
Self-Verifying
n Inputs indicate the correct result
n Correct result is computed when data is generated
Heuristic
n Only checks some characteristics of results
n Often very useful
-
Consult an Oracle: Quality Attributes
Maintainability
n High
Reviewability
n High
n If you save the inputs and outputs, anyone can double-check them.
Dependability
n Varies.
n Mostly this depends on the dependability of the oracle.
n Don't forget to test your framework and accuracy calculations by seeding errors.
Reusability
n High.
-
Automated Monkey
Users will invariably try more input sequences than you could ever think up in the test lab.
Interaction problems, in which a feature only fails after a previous action triggered a hidden fault, are hard to find.
Therefore, develop an Automated Monkey. This is a state model of the product under test that can generate lots of test sequences.
-
Automated Monkey: Example
A state model of a web-based ordering system.
A test generated from this model:
  Home → Add Account → Home → Add Order → Add Order → Home → Add Order
A small part of our model of an ordering system.
-
Automated Monkey: Example State Table

Start State           | Transition                                     | End State
AccountAdministration | AccountAdministration.CreateNewAccount.Click() | CreateAccountAdmin
AccountAdministration | AccountAdministration.AgentAccounts.Click()    | Administration
AccountProfile        | AccountProfile.ReturnHome.Click()              | AgentHome
AccountProfile        | AccountProfile.LogOut.Click()                  | MainHome
AccountProfile        | AccountProfile.Tasks_Bottom.Click()            | Tasks
AccountProfile        | AccountProfile.NewSolution_Bottom.Click()      | NewSolution
AccountProfile        | AccountProfile.Administration_Bottom.Click()   | Administration
Accounts              | Accounts.ReturnHome.Click()                    | AgentHome
Accounts              | Accounts.LogOut.Click()                        | MainHome
Accounts              | Accounts.Tasks_Bottom.Click()                  | Tasks
Accounts              | Accounts.NewSolution_Bottom.Click()            | NewSolution
Accounts              | Accounts.Administration_Bottom.Click()         | Administration
Accounts              | Accounts.Sites.Click()                         | AccountSites
Accounts              | Accounts.CreateNewAccountTop.Click()           | CreateAccount
Accounts              | Accounts.ShowAll1.Click()                      | AccountsShowAll
AccountsEmpty         | AccountsEmpty.ReturnHome.Click()               | AgentHome
AccountsEmpty         | AccountsEmpty.LogOut.Click()                   | MainHome
The beginning of a state table that lists all transitions.
-
Automated Monkey: Example Test
A test generated from the state table, ready to be executed.
Start State    | Transition                             | End State
MainHome       | MainHome.Login()                       | AgentHome
AgentHome      | AgentHome.NewSolution_Center.Click()   | NewSolution
NewSolution    | NewSolution.ReturnHome.Click()         | AgentHome
AgentHome      | AgentHome.Accounts_Center.Click()      | Accounts
Accounts       | Accounts.Sites.Click()                 | AccountSites
AccountSites   | AccountSites.NewSolution_Tab.Click()   | NewSolution
NewSolution    | NewSolution.Administration_Tab.Click() | Administration
Administration | Administration.Accounts_Tab.Click()    | Accounts
Accounts       | Accounts.Sites.Click()                 | AccountSites
AccountSites   | AccountSites.ReturnHome.Click()        | AgentHome
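The simplest generation algorithm, a random walk over a state table, can be sketched directly. The state names echo the example; the table contents here are a reduced, invented subset, and transitions are recorded rather than executed against a real product.

```python
import random

# Condensed, illustrative state table: state -> [(transition, end state)].
STATE_TABLE = {
    "MainHome":     [("MainHome.Login()", "AgentHome")],
    "AgentHome":    [("AgentHome.Accounts_Center.Click()", "Accounts"),
                     ("AgentHome.NewSolution_Center.Click()", "NewSolution")],
    "Accounts":     [("Accounts.Sites.Click()", "AccountSites"),
                     ("Accounts.ReturnHome.Click()", "AgentHome")],
    "AccountSites": [("AccountSites.ReturnHome.Click()", "AgentHome")],
    "NewSolution":  [("NewSolution.ReturnHome.Click()", "AgentHome")],
}

def random_walk(start, steps, seed=0):
    rng = random.Random(seed)
    state, path = start, []
    for _ in range(steps):
        transition, end = rng.choice(STATE_TABLE[state])
        path.append((state, transition, end))
        state = end  # a real monkey would also verify the product's state here
    return path

path = random_walk("MainHome", 8)
print(len(path))  # 8
```

Graph-theoretic algorithms (e.g. ones that cover every edge) can replace `rng.choice` when specific coverage levels are needed, as the architecture slide notes.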
-
Automated Monkey: Test Strategy
Test Creation
n Define a state model that corresponds to part of the product
n Use algorithms to generate test paths through the state model.
Test Execution
n Execute the test paths against the product.
n As a practical matter, this needs to be automated.
Test Evaluation
n Verify that the product is in the correct state for each transition.
-
Automated Monkey: Architecture
State Model
n A state model consists of nodes, which are the states, and edges, which are the transitions between states.
Generation Algorithm
n The simplest algorithm simply picks a random transition from each node. (Random Walk)
n Mathematical graph theory provides several algorithms that can be used to ensure specific levels of coverage.
Test Paths
n A test path is a chain of transitions that traverses the model.
n Each transition is an action and each node on the chain is a state.
Execution Engine
n This script executes the test path.
n A verification method is executed for each node.
-
Automated Monkey: Context
People
n Programmer/Testers with some mathematical skill
Product
n Many
Mission
n Thoroughly test functionality that must work correctly
-
Automated Monkey: Quality Attributes
Maintainability
n Varies
Reviewability
n Low/Medium
n State models may be hard to review
n It helps if tests are generated in a reviewable form
Dependability
n Varies.
n Depends on the state verification functions. (Instrumentation may be required.)
Reusability
n Varies
-
Assertions and Diagnostics
Delayed-fuse bugs are hard to detect. These create bad data or invalid states, but it may take further testing before the problem becomes obvious.
Therefore, add Assertions and Diagnostics to the product code.
n Assertions are statements of invariants: when these fail, you've found a bug.
n Diagnostics are warnings: further analysis is required.
-
Assertions and Diagnostics: Example
Digital PBX
Internal errors are logged. (Assertions)
Diagnostic messages report events that are expected to happen infrequently.
Tests are generated with a real-time simulator.
Logs are inspected to find errors.
-
Assertions and Diagnostics: Context
People
n Tester/Programmers
n Programmer/Testers
Product
n Many.
n Many standard components have diagnostic interfaces built in.
Mission
n Test software thoroughly.
-
Assertions and Diagnostics: More Examples
n Assertions. These logical statements in the code make assumptions explicit. If one is false, there must be a bug. Typically assertion checking is only done during testing and debugging.
n Database Integrity Checks. A program checks the referential integrity of a database, reporting errors found.
n Code Integrity Checks. Compute a checksum to see whether code has been overwritten.
n Memory Integrity Checks. Modify memory allocation to make wild pointers more likely to cross application memory allocations and trigger memory faults.
n Fault Insertion. Allow error handling code to be triggered without having to actually create the error conditions (e.g. bad media, disk full).
n Resource Monitoring. Allow configuration parameters, memory usage and other internal information to be viewed.
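The assertion/diagnostic distinction can be shown in a few lines of product code. The `transfer` function, its invariant and its "unusually large" threshold are all invented for illustration; the structure is what matters: the warning flags something worth analyzing, the assertion fires only on an outright bug.

```python
import logging

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger("product")

def transfer(balance, amount):
    if amount > balance * 0.9:
        # Diagnostic: legal but expected to be infrequent; worth a closer look.
        log.warning("transfer of %s is unusually large for balance %s",
                    amount, balance)
    new_balance = balance - amount
    # Assertion: an invariant of the code; if it fails, there is a bug.
    assert new_balance >= 0, "balance must never go negative"
    return new_balance

print(transfer(100, 95))  # 5, with a diagnostic warning logged
```

In a test run, the logs are then inspected: assertion failures are bugs outright, while a burst of diagnostics may reveal a delayed-fuse problem before it becomes visible through the interface.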
-
Assertions and Diagnostics: Test Strategy
Test Creation
n Varies.
n Tests can be created using an Automated Monkey.
n Manual exploratory testing is also supported.
Test Execution
n Varies.
n May require a debug version of the software.
Test Evaluation
n Assertions report that errors occurred.
n Diagnostics must be analyzed, either to help debug assertion errors or to suggest problems lying in wait.
-
Assertions and Diagnostics: Quality Attributes
Maintainability
n Medium/High.
n Assertions and Diagnostics should be revised as the code is changed.
Reviewability
n Medium.
n This makes the code execution easier to understand.
n Diagnostics may help provide information regarding test coverage.
Dependability
n Medium.
n Depends on how well Assertions and Diagnostics have been added.
Reusability
n High.
n Any test can use them.
-
Quick and Dirty
You want results fast and you have no time for architecture.
Therefore, just do what you can:
n Focus on smoke tests, configuration tests, tests of variations, and endurance tests.
n Plan to throw away code.
n In the process, learn about your tools and the possibilities for automation.
"Plan to throw one away; you will, anyhow." -- Fred Brooks
-
Quick and Dirty: Opportunities
Platform Setup and Reset
Pre-load Database with Testbed Data
Smoke Tests
Regression Testing
Configuration and Multi-platform Testing
Load Testing
Randomized Testing
Code Coverage Measurement
Memory Leak and other specialized testing
Test compliance with interface standards
Collect performance metrics
Confirm pre-defined release criteria
-
Quick and Dirty: Context
People
n Tester/Programmers
Product
n Any
Mission
n Automate tests that will pay back quickly for the time invested in creating them.
-
Quick and Dirty: Test Strategy
Test Creation
n Hand coding
n Capture replay (if it works)
n Nothing fancy
Test Execution
n Yes!
Test Evaluation
n Manual verification of results is OK.
-
Quick and Dirty: Quality Attributes
Maintainability
n Low/None.
Reviewability
n Low
Dependability
n Varies.
n You're really depending on the people who create and run the tests.
Reusability
n Low
-
Architecture Patterns

[Keeping it Simple]
Scripting Frameworks
[User Interface Abstraction]
Data-Driven Scripts
Screen-Based Tables
Action Keywords
[Alternate Interfaces]
Test-First Programming
API Tests
Thin GUI
[Verdict Focus]
Consult an Oracle
Automated Monkey
Assertions and Diagnostics
[Keeping it Simple (Stupid)]
Quick and Dirty
-
Are You Ready to Automate?
Surveying Test Automation Objectives
Close your workbooks
-
Top 10 Reasons for Automating Tests
1. Manual testing sucks.
2. Tool vendor said capture replay works.
3. Fun to watch the dialogs popping up on the screen.
4. Afterwards we can fire all those pesky testers.
5. Everybody else is doing it.
6. No humans were harmed in the testing of this software.
7. Big bucks already spent on the test tool.
8. Looks good on the resume.
9. No Testing for Dummies book ... yet.
10. Keep the intern busy.
-
Gradual Test Automation
Test automation benefits from a gradual approach.
Build some tests and see how they run before adding complexity.
This seminar has presented architectures in a way that allows gradual adoption.
-
Pilot Project
Validate your tools and approach
Demonstrate that your investment in automation is well-spent
Get a trial license for any test tools
Scale your automation project in steps
-
Perspectives Differ: Reasons for Automating
Roles
n Development manager
n Testing manager
n Developers
n Testers
n Automators
Reasons for Automating
n Speed up testing
n Allow more frequent testing
n Reduce manual labor costs
n Improve test coverage
n Ensure consistency
n Simplify testing
n Define the testing process
n Make testing more interesting and challenging
n Develop programming skills
n Justify cost of the tools
n Of course we'll have automation!
Marick: How should developers think about testing?
-
Reasons for Automating
n Speed up testing
n Allow more frequent testing
n Reduce manual labor costs
n Improve test coverage
n Ensure consistency
n Simplify testing
n Just want testing to go away
-
Reasons for Automating (cont.)
n Define the testing process
n Make testing more interesting and challenging
n Develop programming skills
n Justify cost of the tools
n Of course we'll have automation!
-
Reasons for Automating: Summary
Reasonable
n High Reuse
n Speed Development
n Expand Reach
n Smooth Development
Unreasonable
n Simplify Testing
n Force Organization
n 100% Automation
n Justify Tool Purchase
Mixed Bag
n Regression Testing
n Build Skill & Morale
-
Success Criteria
What are your success criteria?
The automation runs
The automation does real testing
The automation finds defects
The automation saves time
What bugs aren't you finding while you are working on the automation?
What is the goal of testing?
-
Ready to Automate?
1. Is "automation" or "testing" a label for other problems?
2. Are testers trying to use automation to prove their prowess?
3. Can testability features be added to the product code?
4. Do testers and developers work cooperatively and with mutual respect?
5. Is automation developed on an iterative basis?
6. Have you defined the requirements and success criteria for automation?
7. Are you open to different concepts of what test automation can mean?
8. Is test automation led by someone with an understanding of both programming and testing?
Ready to Automate? Scoring
Score        Verdict
90-100       Ready to Automate
80-85        Win Over Some Converts
70-75        Time for More Training
60-65        Wait and See
55 or less   Never mind
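The scoring scheme can be sketched in a few lines. Note the weighting below is an assumption for illustration only: the slides do not say how the eight questions map to the 0-100 scale, so this sketch simply gives each "yes" equal weight.

```python
# Hypothetical scoring of the eight readiness questions.
# Assumption (not from the slides): each "yes" is worth an equal
# share of 100 points, so eight "yes" answers score 100.

QUESTIONS = [
    'Is "automation" or "testing" a label for other problems?',
    "Are testers trying to use automation to prove their prowess?",
    "Can testability features be added to the product code?",
    "Do testers and developers work cooperatively and with mutual respect?",
    "Is automation developed on an iterative basis?",
    "Have you defined the requirements and success criteria for automation?",
    "Are you open to different concepts of what test automation can mean?",
    "Is test automation led by someone who understands both programming and testing?",
]

def readiness_score(answers):
    """answers: one boolean per question, in order."""
    if len(answers) != len(QUESTIONS):
        raise ValueError("expected one answer per question")
    return sum(answers) * 100 / len(answers)

def verdict(score):
    """Map a score onto the buckets from the scoring slide."""
    if score >= 90:
        return "Ready to Automate"
    if score >= 80:
        return "Win Over Some Converts"
    if score >= 70:
        return "Time for More Training"
    if score >= 60:
        return "Wait and See"
    return "Never mind"
```

For example, five "yes" answers score 62.5, landing in "Wait and See".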
What Have We Learned?
- Introduction
- Quality Attributes
- Architectural Patterns
- Are You Ready to Automate?
- Concluding Themes
Concluding Themes
Keep It Simple
- Test automation tends to complicate testing.
- The test suite itself will need to be tested.
- Make sure the test suite meets the original goals.
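One small example of "the test suite itself will need to be tested": any comparison helper the suite defines deserves its own checks. The `almost_equal` helper below is hypothetical, not from the slides.

```python
# A hypothetical helper a test suite might define for comparing
# floating-point results from the application under test.
def almost_equal(a, b, tolerance=1e-6):
    """True when a and b differ by no more than tolerance."""
    return abs(a - b) <= tolerance

# Tests for the helper itself: a buggy helper would silently
# pass (or fail) every product test that relies on it.
assert almost_equal(0.1 + 0.2, 0.3)       # floating-point noise tolerated
assert not almost_equal(1.0, 1.1)         # real differences still flagged
assert almost_equal(1.0, 1.0 + 5e-7)      # just inside the tolerance
```

A helper that compared with `<` instead of `<=`, or dropped the `abs`, would corrupt every result that depends on it, which is why even this trivial code gets asserted.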
Build Flexibly and Incrementally
- Build and deliver test automation in stages.
- Deliver a proof of concept early.
- Deliver automation updates regularly.
- Package the automation so that it can be installed easily.
- Document all dependencies.
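A minimal sketch of what "document all dependencies" can look like in practice, assuming a Python-based suite (the slides do not prescribe a language or tool); the file name and pinned versions are illustrative only:

```
# requirements.txt -- hypothetical pinned dependencies for the test suite,
# checked in next to the tests so anyone can recreate the environment.
pytest==7.4.0
requests==2.31.0
```

Pinning exact versions means a teammate installing the suite months later gets the same environment the automation was developed against.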
Work With Development
- Test automation is development. Get help from your development experts (developers).
- Incorporate automation milestones into the development plan.
Keep Tests Visible
- Visibility facilitates review.
- Review encourages realism about test coverage.
- Test suites require review and improvement. Don't assume that old features are covered by existing tests.
- Assess test suite weaknesses and product risks. Are these tests still useful?
Commitment Is Essential
- It is easy for test automation to be designated as a side project. It won't get the resources it needs.
- Commitment ensures that test automation gets the resources, cooperation and attention that it needs:
  - From development
  - From management
Get an Early Start
- An early start makes it more likely you can improve testability and get test APIs.
- Your test strategy will include automation from the start.
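The kind of "test API" an early start makes possible can be as simple as a programmatic seam that lets tests bypass the GUI. Everything below is a hypothetical illustration; the slides name no specific product or hook.

```python
# Hypothetical product class with a testability hook requested
# early in development, before the GUI hardened around it.
class InventoryApp:
    def __init__(self):
        self._items = {}

    # Ordinary product code.
    def add_item(self, name, qty):
        self._items[name] = self._items.get(name, 0) + qty

    # Testability hook: a read-only snapshot so automated tests
    # can verify state directly instead of scraping the screen.
    def dump_state_for_tests(self):
        return dict(self._items)

app = InventoryApp()
app.add_item("widget", 3)
app.add_item("widget", 2)
assert app.dump_state_for_tests() == {"widget": 5}
```

Hooks like this are cheap to add while the design is still fluid and very expensive to retrofit, which is the argument for the early start.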
Activity: Next Steps

What are the next things for your project?
Resources
Books, Websites, Consultation
Test Automation Books
Software Test Automation, Fewster & Graham (1999)
- The first section provides a general overview of test automation with a description of common practices. The second collects accounts from various automators describing their projects. Describes Scripting Framework, Data-driven Scripts, Screen-based Tables, and Action Keywords.

Integrated Test Design and Automation, Buwalda et al. (2002)
- A detailed elaboration of Action Keywords.

The Automated Testing Handbook, Linda Hayes (1995)
- A concise description of Screen-based Tables.

Visual Test 6 Bible, Thomas Arnold (1999)
- Contains a chapter on the Automated Monkey by Noel Nyman.

Visual Basic for Testers, Mary Sweeney (2001)
- Describes API-based testing.
Software Testing Books
Lessons Learned in Software Testing, Kaner, Bach & Pettichord (2001)
- The chapter on test automation has been described as more useful than any of the books on test automation.

Testing Applications on the Web, Nguyen (2001)
- Understand how to customize your testing strategy based on the architecture of your system.

Testing Computer Software, Kaner, Falk & Nguyen (1993)
- Describes how to test effectively when programmers don't follow the rules.

How to Break Software, Whittaker (2003!)
- Details 23 attacks for uncovering common bugs, including fault-insertion techniques.
Websites
Reference Point: Test Automation, Pettichord (Readings)
- Identifies articles, books and websites.

Software Testing Hotlist, Pettichord, testinghotlist.com (Readings)
- Articles and websites for software testing and test automation.

QA Forums, qaforums.com
- Good place to get current tool-specific information. Has boards for all the popular test tools.
Free Consultation
As a student in this seminar you are entitled to a free consultation.
- One hour
- By phone or email
- Face to face if I'm already in town

Send me an email describing your situation. Remind me that you attended this seminar.
- If you want to talk on the phone, let me know good times when you can be reached.

Contact
- [email protected], 512-302-3251
Bibliography