Use the Windshield, Not the Mirror: Predictive Metrics that Drive Successful Product Releases
DESCRIPTION
Sharon Niemi, Practice Director of SQA, talks about how the right combination of predictive and reactive metrics can help you build a measurement portfolio that improves product quality and release consistency. You’ll learn how to build a measurement system that incorporates leading and lagging indicators to improve your team’s consistency in delivering quality products on time and within budget.
TRANSCRIPT
Software Quality Associates
Use the Windshield, Not the Mirror:
Predictive Metrics that Drive Successful Product Releases
Presented by: Sharon Niemi
Practice Director, Lifecycle Optimization
Agenda
My “measurement aha! moment”
Measures Today: The Missing Links
Why Measure: The ROI
What to Measure: The Portfolio of Measures
How to Measure: The Measurement System
Pulling It Together: A Case Study
Seapine Tools Demonstration
Measures Today
Our benchmark studies and other sources of data have revealed:
Fewer than 50% of projects are delivered successfully.
40% of a project team’s effort is wasted on unproductive rework.
70% of defects uncovered in production are requirements related.
90% of our clients most often measure and use:
Schedule
Budget
Defects (found in Systems Test)
So what’s missing?
Use measures to effectively and proactively manage: a Portfolio of Measures and a Measurement System.
The Basics – Why Measure
Measure
Metric
Measurement Technique
Baseline
Benchmark
Action
Measures are a Means ~ Not an End!
“You can’t manage what you don’t measure”
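To make the vocabulary concrete, here is a minimal sketch in Python; every name and value is an invented example, not data from the talk:

    # Illustrative sketch only; names and values are invented examples.

    defects_found = 42   # a measure: a raw count from the defect tracker
    test_hours = 350.0   # another measure: raw effort data

    # A metric derives meaning from one or more measures.
    defects_per_100_hours = defects_found / test_hours * 100

    baseline = 15.0      # your own starting point, captured before improving
    benchmark = 9.0      # an external point of comparison (e.g., industry peers)

    print(f"Metric: {defects_per_100_hours:.1f} defects per 100 test hours")
    print(f"vs. baseline {baseline} and benchmark {benchmark}")

    # Action is the point: measures are a means, not an end.
    if defects_per_100_hours > baseline:
        print("Worsening against our own baseline -> investigate and act")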
The Portfolio of Measures: Outcome Measures
The Rear-View Mirror: Reactive
How well did you execute? (Performance)
Time, Cost, Quality, Customer Satisfaction
The Portfolio of Measures: Predictive Measures – The Missing Set
The Windshield
Resources: Capability & Capacity
Tools: Efficiency, Effectiveness
Internal Process Activities* (a sketch of two of these indicators follows below):
-Process Compliance
-Requirements Stability
-Change Request Backlog
-Velocity
-Trends
* Methodology agnostic
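As promised above, a minimal sketch of how two of these leading indicators might be computed. The formulas are common interpretations and the sample values are invented; they are not the presenter’s definitions:

    # Sketch of two leading indicators; formulas are reasonable
    # interpretations, not necessarily the presenter's exact definitions.

    def requirements_stability(baselined: int, current: int) -> float:
        """Percent growth in requirements since baseline; 0% is perfectly stable."""
        return (current - baselined) / baselined * 100

    def backlog_trend(weekly_backlog: list[int]) -> str:
        """Direction of the change-request backlog over recent weeks."""
        if weekly_backlog[-1] > weekly_backlog[0]:
            return "growing"
        if weekly_backlog[-1] < weekly_backlog[0]:
            return "shrinking"
        return "flat"

    # Hypothetical weekly snapshots:
    print(f"Requirements growth: {requirements_stability(100, 112):.1f}%")
    print(f"CR backlog is {backlog_trend([12, 15, 19, 24])}")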
Creating the Line of Sight
For the Portfolio of Measures to provide value, it needs to be holistic.
Predictive (Proactive): Resources, Tools, and Internal Process Activities*
-Process Compliance
-Requirements Stability
-Change Request Backlog
-Velocity
-Trends
Outcomes (Reactive): How much do you want to improve?
Learning & Feedback ties the outcomes back to their causes in the predictive measures.
The Measurement System
How to Turn the Portfolio of Measures into Action!
A measurement-based technique is applied to processes, tools, and capabilities to supply fact-based data for decisions, and to improve them.
Techniques:
Balanced Scorecard
Goal – Question – Metric
Six Sigma’s DMAIC
ISO / CMMi / ITIL / GxP
The Portfolio of Measures must be:
Defined
Owned
Visible
Used
Improved
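To make one of the listed techniques concrete, here is a rough sketch of a Goal – Question – Metric structure in Python. The shape follows the general GQM approach, and the sample content echoes this presentation’s case study; treat it as illustrative, not the presenter’s tooling:

    # Rough sketch of the Goal-Question-Metric (GQM) structure named above.
    # Sample content echoes the case study later in this presentation.

    from dataclasses import dataclass, field

    @dataclass
    class Metric:
        name: str
        data_source: str

    @dataclass
    class Question:
        text: str
        metrics: list[Metric] = field(default_factory=list)

    @dataclass
    class Goal:
        statement: str
        questions: list[Question] = field(default_factory=list)

    portfolio = Goal(
        statement="Avoid client impact",
        questions=[
            Question(
                text="Where are customers finding defects?",
                metrics=[Metric("Escalated calls by category", "Help Desk")],
            ),
            Question(
                text="How stable are the requirements?",
                metrics=[Metric("Requirements growth vs. baseline", "BA / Test Managers")],
            ),
        ],
    )

    for q in portfolio.questions:
        print(q.text, "->", [m.name for m in q.metrics])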
The Four Main Points
Remember…
Develop a Portfolio of Measures ~ balanced and integrated.
Tie measures to Key Business Drivers and Goals ~ make them meaningful and relevant.
Implement a Measurement System that Drives Action ~ openly communicate progress, gaps, and action plans.
Continue to update the Portfolio of Measures as goals are attained and new goals are identified!
Putting it into Practice - A Case Study
Step One – Identify Goals and Perceived Issues
Three organizational goals were established:
Avoid Client Impact
Consistency in Delivery
Strive to be Best in Class
Perceived issues: testing was perceived to be the roadblock to on-time delivery.
The software development process was defined and iterative in nature.
A test management tool had recently been implemented, and there was a question of whether it was being fully utilized.
Step Two – Develop the Questions: Where do we begin to look?
Predictor Questions
Tools: Are the tools being utilized appropriately? How integrated are they?
Resources: How well trained are the testers? What are their workloads?
Process: We hear that the SDLC is defined, but is it followed, and is it effective?
Process: How stable are the requirements? When do they baseline or “freeze” them?
Trends: Is there any trending data that we can use?
Outcome Questions
Time: How much time is allocated for testing, and how much effort does it actually take?
Quality: What’s the state of the builds being deployed to Test, and what defects are being uncovered throughout the development process?
Cost: What is the cost of migrating defects?
Customer Satisfaction: Are customers finding defects, and if so, where?
Step Three – Build the Portfolio of Measures: What measures are available?
Outcome Measures

Category | Description | Data Source | Data Elements
Satisfaction | Defects/errors reported by customers | Help Desk | Excel spreadsheets
Time | Planned testing effort vs. actual testing effort | Test Analysts | LOE actual vs. planned by project/release
Time | Tests planned vs. actual | Test Analysts | # of planned test cases vs. # of actual test cases run
Quality | Defects identified (full lifecycle) | QA Analysts | # of defects and type, by priority
Cost | Cost of rework | QA Director | LOE / fully loaded cost / # of defects found

Predictive Measures

Category | Description | Data Source | Data Elements
Process Adherence | SDLC | PMO | Excel spreadsheets
Process Adherence | Change request backlog | PMO | Excel spreadsheets
Process Adherence | Requirements and testing practices | BA and Test Managers | Excel spreadsheets
Skill Levels | Skills matrix by job description | HR Performance Management | Excel spreadsheets
Training | Training planned and completed | Test Manager | Training plans and performance reviews
Tools | Utilization and guidelines | Test Manager | Review of data and adherence to guidelines
Trending Metrics | Test cost per defect | QA Director | LOE / fully loaded cost / # of defects found
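As a minimal sketch of how the Time rows above might be turned into numbers; the variance_pct helper and the sample values are invented for illustration, not data from the case study:

    # Sketch of the planned-vs.-actual comparison behind the Time rows.
    # Field names and sample values are invented for illustration.

    def variance_pct(planned: float, actual: float) -> float:
        """Positive means over plan, negative means under plan."""
        return (actual - planned) / planned * 100

    releases = {
        "1.22.13": {"planned_loe_hours": 400, "actual_loe_hours": 510,
                    "planned_tests": 120, "actual_tests": 152},
    }

    for name, r in releases.items():
        loe = variance_pct(r["planned_loe_hours"], r["actual_loe_hours"])
        tests = variance_pct(r["planned_tests"], r["actual_tests"])
        print(f"Release {name}: effort {loe:+.0f}% vs. plan, "
              f"tests run {tests:+.0f}% vs. plan")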
Step Four – Gather and Analyze the Measures: Predictive Measures
Tools:
- Available
- Suitable for intended use
- Integrated
- Widespread adoption (test organization)
- Applicable
Capability (Resources / Skills):
- Trained appropriately
- Used consistently
- Data integrity
Step Four – Gather and Analyze the Measures: Predictive Measures, continued
[Chart: Requirements, 1.22.13 Release. Y-axis: # of requirements (0–100). X-axis: Design, Code, Test – Week 1, Test – Week 2. Two series, Planned and Actual; visible bar labels are 65, 67, 68, 68, 70, 81, 82, and 93, with actual requirements reaching 93 against roughly 68 planned by Test Week 2.]
Requirements Stability
Requirements were never frozen.
Requirements continued to be unstable and changing: a 36% increase to plan during week 2 of Systems Testing.
Requirements reviews do not include a representative from Test.
Added effort was required to address rework due to the impact of the changes.
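A quick check of the 36% figure, assuming the chart’s bars read 68 planned vs. 93 actual requirements by week 2 of Systems Testing:

    # Reproducing the slide's 36% requirements-growth figure, assuming the
    # bars read 68 planned vs. 93 actual by week 2 of Systems Testing.

    planned = 68
    actual_week2 = 93

    increase_pct = (actual_week2 - planned) / planned * 100
    print(f"Increase to plan: {increase_pct:.1f}%")  # ~36.8%, quoted as 36% on the slide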
Process:
- Defined and documented
- Appropriate to the culture
- Used consistently
- Associated metrics / quantifiable
- Roadblocks, duplicate work, gaps
- Outcomes measured (time / cost / quality)
Step Four – Gather and Analyze the Measures: Outcome Measures
Effort
With requirements unstable and changing, sixteen new builds were passed to the Test environment in four weeks.
Test Analysts averaged 8 or more hours of overtime in the weeks ending 11/22, 11/29, and 12/13.
Three Test Analysts were out sick the week ending 12/6.
Due to the number of defects being found, additional test cases were selected for execution.
Test Analysts did not possess the right level of skills to complete the job.
Step Four – Gather and Analyze the Measures: Outcome Measures, continued
[Chart: Escalated Calls by Category. Y-axis: # of escalated calls (0–90). X-axis categories: Options, Trading, Positions, Accounts, Quote, PDF, Balances, Quicken. One series per week ending 11/22/2012, 11/29/2012, 12/6/2012, and 12/13/2012.]
Customer Satisfaction
Over a four-week period, 53% of the calls were due to Options, Quote, and Trading issues.
Only 10% of the test cases run are against Options, Quote, and Trading.
Options, Quote, and Trading are not fully covered in the regression suites.

Category | Options | Trading | Positions | Accounts | Quote | PDF | Balances | Quicken
Share of calls | 24.7% | 13.2% | 7.6% | 12.2% | 15.1% | 7.9% | 7.6% | 11.8%
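The coverage gap called out above can be reproduced directly from the table. A small sketch; note that per-category test coverage beyond the quoted 10% total is not given on the slide:

    # Coverage-gap analysis behind the bullets above, using the call
    # percentages from the table. The 10% test-case figure is quoted from
    # the slide; per-category test coverage was not given.

    call_share = {
        "Options": 24.7, "Trading": 13.2, "Positions": 7.6, "Accounts": 12.2,
        "Quote": 15.1, "PDF": 7.9, "Balances": 7.6, "Quicken": 11.8,
    }

    hot_spots = ("Options", "Quote", "Trading")
    hot_share = sum(call_share[c] for c in hot_spots)
    test_case_share = 10.0  # % of executed test cases aimed at these categories

    print(f"{hot_share:.0f}% of escalated calls come from {', '.join(hot_spots)}")
    print(f"but only {test_case_share:.0f}% of test cases target them "
          f"(gap: {hot_share - test_case_share:.0f} points)")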
Step Four – Gather and Analyze the Measures: Outcome Measures, continued
Cost
Total Production Defects = 251
Phase | Unit Cost (Industry Trend) | Calculated Cost | # of Defects Reported | Min $ | Max $
Requirements | 1 | $39 | | $3,120 |
Design | 3 to 5 | $117 / $195 | | $9,360 | $15,600
Code | 10 | $390 | | $31,200 |
Systems Test | 15 to 40 | $585 / $1,560 | | $46,800 | $124,800
User Acceptance | 30 to 70 | $1,170 / $2,730 | | $293,670 | $685,230
Production | 40 to 1000 | $1,560 / $39,000 | 80 | $124,800 | $3,120,000

Phase: software development lifecycle phase.
Unit Cost per Defect: based on industry-trend multipliers.
Calculated Cost per Defect: based on $585 per defect if found in the Systems Test phase (cost of testers / average # of defects found = average cost per defect in that phase).
# of Defects Reported: based on 80 requirements-related defects reported in the Production environment over a three-month period.
Min / Max $: cost to fix if found in that phase.
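The table’s arithmetic can be reconstructed from the industry-trend multipliers and the $585 Systems Test anchor. A sketch; note that the slide’s User Acceptance row appears to be computed from the full 251 production defects rather than the 80 requirements-related ones:

    # Reconstructing the cost table: multipliers are the industry-trend
    # figures from the slide, anchored at $585 per defect found in
    # Systems Test (multiplier 15 -> $585 / 15 = $39 base cost).

    BASE_COST = 585 / 15  # $39 per defect at multiplier 1 (Requirements)

    multipliers = {  # (min, max) industry-trend multipliers per phase
        "Requirements": (1, 1),
        "Design": (3, 5),
        "Code": (10, 10),
        "Systems Test": (15, 40),
        "User Acceptance": (30, 70),
        "Production": (40, 1000),
    }

    defects = 80  # requirements-related defects reported in Production

    for phase, (lo, hi) in multipliers.items():
        min_cost = defects * lo * BASE_COST
        max_cost = defects * hi * BASE_COST
        print(f"{phase:15s} ${min_cost:>12,.0f}  ${max_cost:>12,.0f}")

    # Note: the slide's User Acceptance row ($293,670 / $685,230) matches
    # the full 251 production defects (251 * $1,170 and 251 * $2,730).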
Step Five – Take Action! Alignment of Measures is Key to Success
Goals: Avoid Client Impact, Drive Consistency, Strive to be Best in Class (World Class).

[Each level below was illustrated with a small probability-vs.-target chart for Time / $ / Quality.]

Protect Production – Reactive (Rear-View Mirror)
Satisfaction, Cost, & Quality: defect propagation and customer-found defects
Time: effort planned vs. actual; test cases planned vs. actual due to the changing landscape
Actions: improve regression testing; need full-lifecycle defect management

Control – Proactive (Windshield)
Process: abandoned when schedule pressures arose
Tools: expansion required
Resources: weren’t skilled and completely capable
Actions: need application training; need process adherence; need stable requirements

Assure
Trends: data available (cost per defect)
Measurement: need an integrated Portfolio of Measures, baselines, and benchmarks
Action: need expansion to BA / development activities

Manage
Need the Measurement System fully established
Need governance and oversight: improvement through measurement as a way of life
Thank You!
For questions or additional information, please feel free to contact:
Sharon M. Niemi, (508) [email protected]
Jeff Amfahr, (513) [email protected]
or visit our websites: www.sqassociates.com and www.seapine.com
Copyright © 2013 SQA
All Rights Reserved