Testing and DevOps
TRANSCRIPT
Testing and DevOps: Instill a DevOps Culture in Your Testing Team and Organization
Adam Auerbach, Co-Head of Cloud and DevTestSecOps Practice
©2018 EPAM Systems, Inc.
Adam Auerbach: Co-Head, Cloud and DevTestSecOps Practice; VP, DevTestSecOps
Joined EPAM in April of 2018 as VP, DevTestSecOps, based in Newtown PA.
Prior to EPAM, Adam served as the VP of Quality and DevOps Engineering at Lincoln Financial Group, where he was responsible for leading the implementation of Continuous Testing and Continuous Delivery.
Adam regularly speaks at conferences and writes blogs and articles on testing and DevOps.
You can find more of Adam’s thinking on the TechWell blog or follow him on Twitter at @bugman31.
EPAM Fast Facts
• Founded in 1993; U.S.-headquartered public company (NYSE: EPAM)
• Q4 2020 revenue: $723M; FY 2020 revenue: $2.7B; 5-year revenue CAGR of 24% (FY15–FY20)
• 36,700+ engineers, designers & consultants; 41,150+ employees; 35+ countries
[Charts: Revenue by Geography (North America, Europe, APAC, CIS) and Revenue by Industry Vertical (Financial Services, Travel & Consumer, Emerging, Life Sciences & Healthcare, Business Information & Media, Software & Hi-Tech), reported $ and YoY growth; figures are as of FY 2020, percentages presented on a YoY basis unless otherwise noted.]
DevOps is a Philosophy
DELIVER HIGH QUALITY WORKING SOFTWARE FASTER
DevOps is a philosophy where teams are accountable for everything required to get their code developed, tested and deployed to production, while shared service teams provide the automation and tools to enable them.
“You build it, you own it”
Agile Pod (Dev, QA, PO, BSA) + Prod Support, Architecture, Infrastructure, Shared Services (e.g. Security Testing, Perf Testing)
5 Key Principles of DevOps
1. Build once, deploy many
2. Enable self-service and on-demand capabilities
3. Leverage SMEs for support and best practices
4. Identify opportunities to remove bottlenecks
5. Increase and assume accountability
DevOps is the Next Part of Your Agile Journey
[Chart: software quality vs. number of releases for Waterfall, Agile, DevOps, and Continuous Delivery]
• Agile accelerates software development with iterative, incremental releases
• DevOps enables faster feedback for developers through automation and increased responsibilities
• Continuous Delivery allows code to flow via an automated software factory that deploys constantly
Agile + DevOps
DevOps adds Continuous Integration, Continuous Testing, Continuous Monitoring, and Continuous Delivery on top of Agile, enabling real-time automation.

BEHAVIOR-DRIVEN DEVELOPMENT (BDD)
• Team focused • Stress on stories
• Add a test • Run all tests • Write code • Refactor until added test passes

TEST-DRIVEN DEVELOPMENT (TDD)
• Developer focused • Closer to unit level
• Add a test • Run all tests • Write code • Refactor until added test passes

ACCEPTANCE TEST-DRIVEN DEVELOPMENT (ATDD)
• Team focused • Stress on acceptance criteria
• Add a test • Run all tests • Write code • Refactor until added test passes
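The add-a-test / run-all-tests / write-code / refactor loop shared by all three practices can be sketched in a few lines. This is an illustrative example (the function and test names are invented), not code from the talk:

```python
# Red-green-refactor in miniature. Step 1 (red): the test is written
# first and fails until the implementation exists.
def test_add_item_returns_running_total():
    cart = {}
    assert add_item(cart, "book", 10.0) == 10.0
    assert add_item(cart, "pen", 2.5) == 12.5

# Step 2 (green): write just enough code to make the added test pass.
def add_item(cart, item, price):
    cart[item] = price
    return sum(cart.values())

# Step 3: run all tests again, then refactor while they stay green.
test_add_item_returns_running_total()
```

In BDD and ATDD the same loop runs at the story or acceptance-criteria level rather than the unit level, but the rhythm is identical.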
Testing Approach
Software Testing Ice-Cream Cone Anti-Pattern (current state)
[Diagram, top to bottom: Manual Tests, Automated GUI Tests, Integration Tests, Unit Tests; source: watirmelon.com]

Ideal Software Testing Pyramid (destination)
[Diagram, top to bottom: Manual Session-Based Testing, Automated GUI Tests, Automated API Tests, Automated Integration Tests, Automated Component Tests, Automated Unit Tests; source: watirmelon.com]
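Moving from the cone toward the pyramid means pushing checks down the stack, for example asserting at the API level instead of through a GUI. A hedged sketch, where a tiny in-process HTTP service stands in for a real application under test:

```python
# API-level test sketch: faster and more stable than driving a GUI.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class PriceHandler(BaseHTTPRequestHandler):
    """Stand-in for the application under test (illustrative only)."""
    def do_GET(self):
        body = json.dumps({"sku": "A1", "price": 9.99}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):  # silence request logging
        pass

def api_test():
    server = HTTPServer(("127.0.0.1", 0), PriceHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    try:
        url = f"http://127.0.0.1:{server.server_port}/price/A1"
        data = json.loads(urllib.request.urlopen(url).read())
        assert data["price"] == 9.99  # API-level assertion, no GUI needed
        return True
    finally:
        server.shutdown()
```

The same behavior checked through an automated GUI test would need a browser, selectors, and waits; at the API layer it is one request and one assertion.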
Test Data Management
1. Effective data: building the right data
2. Efficient data: getting it right the first time
3. Production data: reduce security exception dependency
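"Building the right data" often means generating synthetic, deterministic records instead of depending on masked production extracts. A minimal sketch, with field names invented for illustration:

```python
# Synthetic test data builder: same seed always yields the same record,
# so tests are repeatable and no production data is needed.
import random

def make_policy(seed, state="PA", coverage=100_000):
    """Deterministic synthetic insurance-policy record (hypothetical)."""
    rng = random.Random(seed)
    return {
        "policy_id": f"POL-{rng.randint(100000, 999999)}",
        "state": state,
        "coverage": coverage,
        "holder_age": rng.randint(18, 85),
    }
```

Because each record is derived from a seed, a failing test can be reproduced exactly without ever touching production data or waiting on a security exception.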
Service Virtualization
Integrated environment vs. virtualized environment
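The idea behind virtualizing a dependency instead of waiting on a shared integrated environment can be sketched with a stub: canned responses replace the real third-party call. `CreditCheckClient` and its API are hypothetical:

```python
# Service-virtualization sketch: the dependency is replaced by a stub
# with canned responses, so tests run without the shared environment.
from unittest import mock

class CreditCheckClient:
    """Stand-in for a real third-party client (illustrative)."""
    def score(self, ssn):
        raise RuntimeError("integrated environment unavailable")

def approve(client, ssn):
    # Business rule under test: approve when the credit score is high enough.
    return client.score(ssn) >= 650

# Virtualize the dependency: deterministic response, no network, no waiting.
virtual = mock.Mock(spec=CreditCheckClient)
virtual.score.return_value = 700
assert approve(virtual, "000-00-0000") is True
```

Full service-virtualization tools simulate the dependency at the protocol level rather than in-process, but the payoff is the same: the team controls the dependency's behavior instead of being blocked by it.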
Continuous Testing
[Pipeline diagram: Developer → Version Control → CI/Build Server (mock/unit tests, static code analysis, code review) → Build Artifact Repository → QA/INT environments (automated smoke & regression tests, test data setup) → UAT/SYS → PERF/STAG (functional and acceptance tests, performance tests, security tests) → PROD (prod approval, synthetic monitoring, reporting); supported by quality gates, test data management, test case management (driven by the QA engineer), notifications and reports, persistent components, build artifact flow, and automated test environment provisioning.]
Delivery and governance are tracked and supported by KPIs and industry-standard quality gates at every level of the testing architecture.
CONFIDENTIAL. Copyright © 2019
HIGHLIGHTS
• An automated quality gate is implemented as a step in the pipeline that outputs a binary result (pass/fail)
• Based on that result, the decision whether the build can move further down the pipeline is made automatically
EXPECTED BENEFITS
• Time savings on build/deployment administration
• Time savings on test result analysis
• Harmful code remains in the hands of the author until fixed and doesn't affect the rest of the team
• Tests are properly maintained
KPIs TO MEASURE VALUE
• Count of bugs let through the gates
• Count of bug-free code increments
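A quality gate of this kind can be as small as a function that reduces a test summary to a binary result the pipeline acts on. A minimal sketch; the thresholds and field names are illustrative, not values from the talk:

```python
# Automated quality gate as a pipeline step: reduce a test summary to
# pass/fail, which decides whether the build is promoted or blocked.
def quality_gate(summary, min_pass_ratio=0.95, min_coverage=0.80):
    """Return True (promote the build) or False (block it)."""
    total = summary["passed"] + summary["failed"]
    pass_ratio = summary["passed"] / total if total else 0.0
    return pass_ratio >= min_pass_ratio and summary["coverage"] >= min_coverage

# In a real pipeline this result would set the step's exit code, so the
# promote/block decision happens automatically, with no human review.
result = quality_gate({"passed": 190, "failed": 10, "coverage": 0.85})
```

A build with a 95% pass ratio and 85% coverage clears this gate; drop either below its threshold and the build stays with its author until fixed.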
Example of a Continuous Testing Pipeline
Continuous Testing Quality Gates
[Diagram: quality gates by pipeline stage, split into common QGs and advanced QGs]
• Code: code quality scan, unit tests, code coverage, composition analysis, static application security scan
• Feature branch / code review: mutation tests, component tests, contract tests, developer review
• Master branch: unit tests, code coverage, composition analysis, mutation tests, component tests, contract tests
• Deploy: performance tests, security scans, penetration tests
• Release: functional automation, user acceptance testing, four-eyes check, system monitoring, synthetic monitoring
Continuous Testing isn't common yet
Survey results across test process, automation health, and delivery health ((1) Embedded Software & Hi-Tech, (2) Retail):
• 70% (1) have smoke and regression tests automated
• 79% (1)(2) have exploratory testing
• 90% (2) have test cases combined into suites
• 29% (2) run performance tests daily or per sprint
• 12% (1) run security tests daily or per sprint
• 35% (2) have unit test coverage > 80%
• 38% (2) have regression test coverage > 80%
• 24% (1) have > 80% of regression tests on the non-UI level
• 55% (1) have a pass ratio > 90%
• 43% (1)(2) have the ability to promote a new feature within 1 week
• 25% (2) have a defect containment level higher than 95%
• 37% (2) have a resolution time for blocking issues < 1 day
• 52% (1) have automation as a part of DoD
• 41% (1)(2) have static code analysis as a quality gate (QG)
• 42% (1)(2) have smoke tests as a QG
• 33% (2) have regression tests as a QG
Builds promoted automatically to upper environments: testing environment 37% (1)(2); integration environment 32% (2); STAG/UAT environment 21% (2); PROD environment 10% (1)(2)
Maturity scale: Opportunistic → Developing → Engaging → Sustainable → Transformative
CT standards:
• Code review, code standards, pair programming
• Pre-commit verification, automated deployment process
• Automation tests as a part of DoD
• Automated quality gates, quality metrics and dashboards covering all test stages, risk-based quality management
• Self-service test data management and service virtualization
• Full traceability between requirements, tests and test results
• Non-functional requirements are assessed continuously
• Continuous improvement process
• Real-time app monitoring
• Self-service infrastructure
SDLC Maturity Drives Immediate Needs
Provide a testing transformation journey tailored to the client's maturity. Types of testing engagements, from low to high maturity:

Waterfall/Agile: cost savings through improved productivity and automation
• Improving automation effectiveness
• Implementing testing best practices
• Maximizing utilization of on/offshore resources
Organizational needs: QA assessment and consulting; open source testing frameworks; AI/ML based tools; Testing as a Service (manual and automation); onsite, nearshore and offshore

DevOps: increased value by improving overall quality and time to market
• Continuous quality
• Testers who are engineers
• Non-functional testing is automated
• Quality gates are enforced
Organizational needs: EngX assessment and consulting; off-the-shelf performance and security testing platform; Testing as a Service (performance and security); test data management as a service; mobile and IoT device farms

Continuous Delivery: leveraging emerging technologies to know what tests to run and which data to use
• Cloud native automation frameworks and tools
• AI based testing insights
• ML based risk models
Organizational needs: continuum consulting; cloud native CI/CD and testing framework; gig workforce; AI based tools; ML based data modeling IP
Keep QA or Not?
Models along the automation continuum (high → low), from a dedicated QA organization to developers testing only:
1. Test engineers and SDETs build robust test frameworks; testing solutions are fully embedded in the release process for CI/CD. Quick regression, quick feedback.
2. Developers test their own code from within the code; tests are in sprint. Focus is on the system, not on the stakeholders; tends to miss big-picture issues like usability or end-to-end integration.
3. Testers focus on test design and coverage; developers write the automated frameworks. Regression tests may be slow; unautomated scenarios are tested manually.
4. Developers or the project team run test scripts when available. Focus is on the stakeholders and the value of the product; exploratory. Slow release cycles, regression testing bottlenecks.
5. Developers might have some unit tests; product owners, project managers, and other staff do UAT in their spare time. Shallow focus on the stakeholders, not the system. High risk.
Example: Value Stream
[Value stream map across DEV 1/2, SYS, INT/BAT, DEV PRICING, TRAINING, UW BAT, CROWD, PRICING BAT, RELEASE, and PRODUCTION, covering unit tests, smoke testing, manual regression testing, system testing of new features, and ad hoc testing, with smoke tests run by both QA and the business (BU). Stage durations of 13–14 weeks, 6 weeks, 7.5 weeks, 1 week, 4 hrs, and 1 week add up to a 23.5-week whole cycle; BAT happens in-sprint relative to the release.]
• BAT users find >60% of all defects
• BAT vs. QA test data setup flows overlap 25–35%
Example: Current State Pipeline
[Pipeline diagram: Local Dev Environment (unit testing) → automated deployment → SYS Env (delivery increment testing, static security testing, smoke testing, feature testing, E2E regression testing) → INT Env, Dev Pricing Env, Training Env (integration testing of all delivery increments, user acceptance testing, smoke testing, feature testing, E2E regression testing) → Release Env (smoke testing, crowd testing, ad hoc testing, user acceptance testing, release sign-off) → manual deployment → Production Env (smoke testing). Deployments are a mix of automated and manual; quality gates are unenforced and testing is largely manual effort.]
Example: Destination Pipeline
[Pipeline diagram with automated quality gates replacing manual effort: Local Dev Environment (unit testing, static code analysis, static security testing) → automated deployment → SYS Env (sanity testing, smoke testing, API regression testing, feature testing, ad hoc testing, delivery increment testing, data configuration) → INT Env (smoke testing, API regression testing, E2E regression testing, integration testing of all delivery increments with 3rd parties, user acceptance testing) → Dev Pricing Env and Training Env (smoke testing, API regression testing, E2E regression testing) → Release Env (smoke testing, crowd testing, release candidate stabilization, release sign-off, data configuration) → Production Env (smoke testing, dynamic security testing, deployment testing, regression). Numbered additions 1–5 introduce performance testing and non-functional testing into the pipeline.]
Empower People to Drive Change
[Roadmap graphic: the next stop on the journey from a Waterfall process to an Agile process, via training, test data management, tools, and technical processes]
Common Challenges
• Lack of technical resources
• Lack of funding
• Project delivery mindset
• 3rd party dependencies
• Questions on direction
Top-down and bottom-up support is critical.
Take Inventory of Your Team
[Team inventory graphic: map your team across SME knowledge, Waterfall/process-driven vs. Agile/innovative mindset, business focus, programming skills, and DevOps skills]
Use multiple levers to get more technical: training, sourcing, and new hires, resulting in increased technical team members.
Enterprise Groups Focus on Enablement
Enterprise teams focus on technology & support, while team members focus on shippable code.
Proven Metrics
LEAD TIME: from code commit to code successfully running in production or in a releasable state
DEPLOY FREQUENCY: how often code is deployed
MEAN TIME TO RESTORE (MTTR): how long it takes to restore service when a service incident occurs
CHANGE FAIL PERCENTAGE: the percentage of changes that result in degraded service or require remediation

HIGH PERFORMERS: lead time less than one hour; deploy on demand (multiple deploys per day); MTTR less than one hour; change fail 0–15%
MEDIUM PERFORMERS: lead time between one week and one month; deploy between once per week and once per month; MTTR less than one day; change fail 31–45%
LOW PERFORMERS: lead time between one month and six months; deploy between once per month and once every six months; MTTR less than one day; change fail 16–30%
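The four metrics above can be computed mechanically from raw deployment records. A hypothetical sketch; the record fields (`commit_time`, `deploy_time`, `failed`, `restore_minutes`) are invented for illustration:

```python
# Derive lead time, deploy count, change fail percentage, and MTTR
# from a list of deployment records.
from datetime import datetime, timedelta

def dora_metrics(deploys):
    lead_times = [d["deploy_time"] - d["commit_time"] for d in deploys]
    failures = [d for d in deploys if d["failed"]]
    return {
        "deploys": len(deploys),
        # Lead time: commit to running in production
        "avg_lead_time": sum(lead_times, timedelta()) / len(deploys),
        # Change fail percentage: share of deploys needing remediation
        "change_fail_pct": 100.0 * len(failures) / len(deploys),
        # MTTR: average time to restore service after a failed change
        "mttr_minutes": (sum(d["restore_minutes"] for d in failures)
                         / len(failures)) if failures else 0.0,
    }
```

Instrumenting the pipeline to emit these records is usually the hard part; once they exist, tracking performance against the high/medium/low bands above is a one-liner per metric.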
Continuous Testing Results I’ve Experienced
• Teams who adopted Continuous Testing, and eventually Continuous Delivery, improved deployment frequencies by over 4x
• The incident rate per production deployment fell 90%, to less than 5%
• Manual testing tools were eliminated due to the rollout of CT
• Over 100k test data requests served in a given month
• The hardening phase was eliminated for agile teams
4X deployment speed; 90% decrease in incidents; 1 hr lead time from check-in to prod deployment
Teams are enabled via enterprise solutions: LOB teams consume enterprise products, including testing tools and frameworks, performance and monitoring tools, a test data management tool, service virtualization, and test environment tooling.