7 deadly sins of agile software test automation
DESCRIPTION
Automated software testing is a key enabler for teams wanting to build high-quality software that can be progressively enhanced and continuously released. To ensure development practices are sustainable, automated testing must be treated as a first-class citizen, and not all approaches are created equal. Some approaches can accumulate technical debt, cause duplication of effort and even create team dysfunctions. The seven deadly sins of automated software testing are a set of common anti-patterns that erode the value of automated testing, resulting in long-term maintenance issues and ultimately affecting the ability of development teams to respond to change and continuously deliver. Taking the classic seven sins (Gluttony, Sloth, Lust, Envy, Rage, Pride, Greed) as they might be applied to test automation, we will discuss how to identify each automated sin and, more importantly, provide guidance on recommended solutions and how to avoid them in the first place.

TRANSCRIPT
7 DEADLY SINS OF AUTOMATED TESTING
Dr Adrian Smith
September 2012
Adrian Smith
• Background in Engineering
• Software development using Agile and Lean
• Technical and Organisational Coach
• Founded a startup product development and consulting business
Diverse Experience Base
• Aerospace Engineering: Commercial and military engineering design, analysis and manufacturing experience on major programs including the A380 and F35.
• Agile Software Development: Software development, architecture and management for engineering CAE, automation, scientific and digital media.
• Systems Integration: Integration of logistics, financial, engineering and resource management systems for mining, defence and government.
• Agile Software & Systems: 15 years developing and integrating engineering software products in the US, Australia and UK.
Geeks hate repetition
Airbus A380 Wing
Envy
Flawed comparison of manual testing and automation
How management see testing
How management would like to see testing
Manual vs Automation
• A flawed comparison
• Assumes that automation can replace manual testing effort
• Automation generally doesn't find new defects
• Testing is not merely a sequence of repeatable actions
• Testing requires thought and learning
Ideal automation targets
• Regression testing - assessing current state
• Automation of test support activities
• Data generation or sub-setting (see the sketch below)
• Load generation
• Non-functional testing
• Deterministic problems
• Big data problems
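As a sketch of the data-generation point above, here is a minimal, hypothetical test-data generator in Python. The customer schema and field names are illustrative assumptions, not from the talk; the point is that seeding the generator makes the fixture deterministic and reproducible on any machine.

    # Hypothetical sketch: deterministic test-data generation.
    import csv
    import random

    def generate_customers(path, count, seed=42):
        """Write `count` synthetic customer rows to a CSV file.

        Seeding the random generator keeps the data deterministic,
        so the same fixture can be regenerated on any CI agent.
        """
        rng = random.Random(seed)
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["id", "name", "credit_limit"])
            for i in range(count):
                writer.writerow([i, f"customer-{i}", rng.randrange(0, 10000)])

    if __name__ == "__main__":
        generate_customers("customers.csv", count=1000)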
Common symptoms
• Relying on automation as the basis for all testing activities
• All tests are built by developers
• Absence of code reviews
• Absence of exploratory testing
• Absence of user testing
Suggested approach
• Avoid comparison between manual and automated testing - both are needed
• Distinguish between the automation and the process that is being automated
• Use automation to provide a baseline (a sketch follows below)
• Use automation in conjunction with manual techniques
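One hedged way to read "use automation to provide a baseline" is a golden-master (characterisation) test: record the current output once, then let regression runs flag any drift. The function and file names below are illustrative assumptions.

    # Hypothetical sketch: a golden-master regression baseline (pytest style).
    from pathlib import Path

    def render_report(data):
        # Stand-in for the real system under test.
        return "\n".join(f"{key}: {value}" for key, value in sorted(data.items()))

    def test_report_matches_baseline():
        baseline = Path("report.golden.txt")
        actual = render_report({"status": "ok", "total": 42})
        if not baseline.exists():
            # First run records current behaviour as the baseline.
            baseline.write_text(actual)
        # Later runs assert that behaviour has not drifted.
        assert actual == baseline.read_text()

Note that the baseline only tells you the system still behaves as it did; finding new defects remains a job for thinking testers.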
Gluttony
Over-indulging on commercial test tools
Promise of automation
• Software vendors have sold automation as the capture-replay of manual testing processes
• Miracle tools that solve all testing problems
License barrier
• Commercial licenses restrict usage
• Not everyone can run the tests
• Typically, organisations create special groups or privileged individuals
Incompatible technology
• Underlying technology of commercial tools is often not compatible with the development toolchain
• Special file formats or databases
• Lack of version control for tests
• Tests cannot be versioned within the software
• Continuous integration problems
• Can't be adapted or extended by the developers
Justifying the expense
• Financial commitments distort judgement
• Difficult to make objective decisions
• Tendency to use the tool for every testing problem
• People define their role by the tools they use
Common symptoms
• A commercial tool forms the basis of the testing strategy
• Only certain teams or individuals can access a tool or run tests
• Developers have not been consulted in the selection of testing tools
• “We always use <insert tool-name> for testing!”
Suggested approach
• Use Open Source software tools wherever possible
• Use tools that can easily be supported by the development team and play nicely with the existing development toolchain
• Ensure any commercial tools can be executed in a command-line mode (see the sketch below)
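To make the command-line point concrete, here is a hedged sketch of wrapping a commercial tool's CLI so it runs unattended in Continuous Integration. "vendortool" and its flags are entirely hypothetical; substitute whatever headless interface the real tool provides.

    # Hypothetical sketch: run a commercial tool headlessly and
    # propagate its exit code so CI can pass or fail the build.
    import subprocess
    import sys

    def run_suite(suite_file):
        result = subprocess.run(
            ["vendortool", "--headless", "--suite", suite_file],  # hypothetical CLI
            capture_output=True,
            text=True,
        )
        print(result.stdout)
        print(result.stderr, file=sys.stderr)
        return result.returncode

    if __name__ == "__main__":
        sys.exit(run_suite("regression.suite"))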
Lust
User interface forms the basis for all testing
Testing through the GUI
• Non-technical testers often approach testing through the user interface
• Ignores the underlying system and application architecture
• Resulting tests are slow and brittle
• Difficult to set up test context - resulting in sequence-dependent scripts
Investment profile
[Diagram: a test pyramid plotted against investment / importance. From bottom to top: Unit/Component tests (developer built, optimised for fast feedback), Integration and Acceptance tests (collaboratively built around system behaviour; exercise components and systems), Interface tests, and Manual Exploratory testing at the peak, crossed by opposing arrows labelled Confidence and Speed / Feedback.]
Architecture
• Understanding application and system architecture improves test design
• Creates opportunities to verify functionality at the right level (a sketch follows below)
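A minimal sketch of "the right level": instead of driving a browser to check that a discount appears on screen, the pricing rule is verified directly where it lives. The discount rule below is an illustrative assumption.

    # Hypothetical sketch: verify a business rule at unit level
    # rather than through the user interface.
    def discounted_price(price, is_member):
        """Members get 10% off; prices never drop below zero."""
        rate = 0.10 if is_member else 0.0
        return max(price * (1 - rate), 0.0)

    def test_member_discount_applied():
        assert discounted_price(100.0, is_member=True) == 90.0

    def test_non_member_pays_full_price():
        assert discounted_price(100.0, is_member=False) == 100.0

These run in milliseconds; a UI test asserting the same rule would be orders of magnitude slower and far more brittle.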
Test design
[Diagram: Test Intent (clearly identifies what the test is trying to verify) sits above Test Data and the Test Implementation (implementation of the test including usage of test data), which together exercise the System Under Test. A sketch follows below.]
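A sketch of that layering in code, with all names invented for illustration: the test name and assertion carry the intent, the data is declared separately, and the mechanics of driving the system under test are hidden behind a small helper.

    # Hypothetical sketch: separating test intent, test data and
    # test implementation.
    VALID_ORDER = {"sku": "A-100", "quantity": 2}  # test data

    class OrderApi:
        """Test implementation: hides how the system under test is driven."""
        def place_order(self, order):
            # A real suite would call a service layer or HTTP API here.
            return {"status": "accepted", **order}

    def test_valid_order_is_accepted():
        # Test intent: a well-formed order should be accepted.
        response = OrderApi().place_order(VALID_ORDER)
        assert response["status"] == "accepted"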
F.I.R.S.T. class tests
• F - Fast
• I - Independent
• R - Reliable
• S - Small
• T - Transparent
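A small pytest-style illustration of those properties, under the assumption that F.I.R.S.T. is read as above: no I/O (fast), fresh state per test (independent), no hidden dependencies (reliable), and one clear expectation (small and transparent).

    # Hypothetical sketch of a F.I.R.S.T. class test.
    def add_line_item(cart, price):
        cart.append(price)
        return sum(cart)

    def test_adding_item_updates_cart_total():
        cart = []                      # independent: builds its own state
        total = add_line_item(cart, 19.99)
        assert total == 19.99          # transparent: one clear expectation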
Common symptoms
• Testers cannot draw the application or system architecture
• Large proportion of tests are being run through the user interface
• Testers have limited technical skills
• No collaboration with developers
• Intent of tests is unclear
Suggested approach
• Limit the investment in automated tests that are executed through the user interface
• Collaborate with developers
• Focus investment in automation at the lowest possible level with clear test intent
• Ensure automation gives fast feedback
Pride
Too proud to collaborate when creating tests
Poor collaboration
• Organisations often create specialisations of roles and skills
• Layers of management and control then develop
• Collaboration becomes difficult
• Poor collaboration = poor tests
Automating too much
• Delegating test automation to a special group inhibits collaboration
• Poor collaboration can result in duplicated test cases / coverage
• Duplication wastes effort and creates maintenance issues
• Setting performance goals based on the number of test cases automated leads to problems
No definition of quality
• Automated testing effort should match the desired system quality
• Risk that too much, too little or not the right things will be tested
• Defining quality creates a shared understanding and can only be achieved through collaboration
Good collaboration
[Diagram: Analyst, Developer and Tester overlap on Specification and Elaboration, Acceptance Criteria, and Automation; the shared centre is Collaboration.]
• Cross-functional teams build better software
• Collaboration improves definition and verification
Specification by Example
• Recognises the value of collaboration in testing
• More general than ATDD and/or BDD
• Based around building a suite of Living Documentation that can be executed (a sketch follows below)
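One hedged way to keep such documentation alive with plain pytest: the examples the analyst, developer and tester agree on become a table that executes on every build. The shipping rule and numbers are illustrative assumptions, not from the talk.

    # Hypothetical sketch: an executable specification driven by examples.
    import pytest

    def shipping_cost(order_total):
        """Orders of 50.00 or more ship free; otherwise a flat 5.00 fee."""
        return 0.0 if order_total >= 50.0 else 5.0

    # The shared specification: order total -> expected shipping cost.
    EXAMPLES = [
        (49.99, 5.0),
        (50.00, 0.0),
        (120.0, 0.0),
    ]

    @pytest.mark.parametrize("order_total,expected", EXAMPLES)
    def test_shipping_specification(order_total, expected):
        assert shipping_cost(order_total) == expected

Tools such as Cucumber, SpecFlow or FitNesse serve the same purpose in business-readable formats; the parametrised test is simply the smallest executable form of the idea.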
Common symptoms
• Automated tests are being built in isolation from the team
• Intent of tests is unclear or not matched to quality
• Poor automation design (abstraction, encapsulation, ...)
• Maintainability or compatibility issues
Suggested approach
• Collaborate to create good tests and avoid duplication
• Limit the investment in UI-based automated tests
• Collaborate with developers to ensure good technical practices (encapsulation, abstraction, reuse, ...)
• Test code = Production code (see the sketch below)
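"Test code = Production code" in practice means refactoring tests too. A small sketch, with invented names: duplicated setup is pulled into one builder so a schema change touches one place instead of every test.

    # Hypothetical sketch: a test-data builder that removes duplication.
    def make_user(name="alice", active=True, roles=None):
        """The single place that knows how a test user is constructed."""
        return {"name": name, "active": active, "roles": roles or ["viewer"]}

    def test_active_user_can_view():
        assert "viewer" in make_user()["roles"]

    def test_inactive_user_is_flagged():
        assert make_user(active=False)["active"] is False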
Sloth
Too lazy to properly maintain automated tests
Automated Test Failures
[Timeline diagram: test failures accumulate over time, triggered by a new feature, a system interface change, an OS patch and reference data changes.]
• Many potential causes of failure
• Unless maintained - value is slowly eroded
Importance of maintenance
[Chart: cost / effort over time for three lines - manual test execution, maintained automation and unmaintained automation - contrasting the potential value of a maintained automated test suite with the declining value of an unmaintained one.]
Continuous integration
Common symptoms
• Test suite has not been recently run - state is unknown
• Continuous Integration history shows consistent failures following development changes / releases
• Test suite requires manual intervention
• Duplication within automation code
• Small changes trigger a cascade of failures
Suggested approach
• Ensure automated tests are executed using a Continuous Integration environment
• Ensure tests are always running - even if the system is not being actively developed
• Make test results visible - create transparency of system health
• Ensure collaboration between developers and testers
Rage
Frustration with slow, brittle or unreliable automated tests
Slow automation
• Large datasets
• Unnecessary integrations
• Inadequate hardware/environments
• Too many tests
• Reliance on GUI-based tests
• Manual intervention
• ... many others
Fast Feedback
Brittle Tests
• Contain time-bound data (see the clock-injection sketch below)
• Have external dependencies
• Rely on UI layout/style
• Rely on sequence of execution
• Based on production data or environments
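A sketch of curing the first brittleness above: make "now" an injectable parameter instead of a hidden call, so the test cannot start failing when the calendar rolls over. The function and dates are illustrative.

    # Hypothetical sketch: injecting a fixed clock to avoid time-bound data.
    from datetime import date

    def is_expired(expiry, today=None):
        """The current date is a parameter, not a hidden datetime call."""
        today = today or date.today()
        return today > expiry

    def test_expiry_with_injected_clock():
        fixed_today = date(2012, 9, 20)  # deterministic "now"
        assert is_expired(date(2012, 1, 1), today=fixed_today)
        assert not is_expired(date(2013, 1, 1), today=fixed_today)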
Frustration
Unreliable Tests
• False positives
• Waste time investigating failures
• Failures start being ignored
• Create uncertainty about system health
• Workarounds and alternate tests are created
Suggested approach
• Treat automated tests with the same importance as production code
• Review, refactor, improve ...
• Apply a “Stop the line” approach to test failure
• Eliminate (quarantine) unreliable tests (see the sketch below)
• Ensure collaboration with developers
• Up-skill / pair testers
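One concrete, hedged form of quarantine with pytest: tag the unreliable test with a custom marker and exclude it from the main run while it is investigated, so it stops training the team to ignore red builds.

    # Hypothetical sketch: quarantining a flaky test with a custom marker.
    # Main suite runs with:  pytest -m "not quarantine"
    # (register the marker in pytest.ini to silence the unknown-marker warning)
    import pytest

    @pytest.mark.quarantine
    def test_flaky_search_results():
        ...  # unreliable test, parked until its root cause is fixed

    def test_stable_search_ranking():
        # Reliable tests keep running on every build.
        assert sorted([3, 1, 2]) == [1, 2, 3]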
Avarice (Greed)
Trying to cut costs through automation
Lure of cheap testing
• Testing tool vendors often try to calculate ROI based on saving labour
• Analysis is unreliable and undervalues the importance of testing
Automation is not cheap
• Adopting test automation tools and techniques requires significant investment
• Investment in new ways of working
• Investment in skills
• Investment in collaboration
• Ongoing investment in maintenance
Common symptoms
• Investment in commercial tools using a business case based on reducing headcount
• Using a predicted ROI as a way of reducing the budget for testing
• Consolidating automated testing within a special group
Suggested approach
• Ensure the reasons for automation are clear and are NOT based purely on saving money/headcount
• Ensure the business case for automation includes costs for ongoing maintenance
7 Deadly Sins
• Envy - Flawed comparison of manual testing and automation
• Gluttony - Over-indulging on commercial test tools
• Lust - User interface forms the basis for all testing
• Pride - Too proud to collaborate when creating tests
• Sloth - Too lazy to maintain automated tests
• Rage - Frustration with slow, brittle or unreliable tests
• Greed - Trying to cut costs through automation
Why automate testing?
How geeks really work
Thank you
Dr Adrian Smith
September 2012