TRANSCRIPT
8/20/2009
Measuring Return On
Investment: Agony &
Ecstasy!
Brian Wells
Experimentus Ltd.
www.experimentus.com
Measuring Return on Investment:
the Agony & the Ecstasy!
CTH3 – 1 Dec 2005, EuroSTAR 2005, Copenhagen
Agenda
What is Measurement? Where does it fit in?
My DIM Method!
Define
Implement
Measure
Experiences so far!
The Agony & the Ecstasy!
Where do Metrics fit in?
Test & Debugging Policies & (Corporate) Goals
Corporate Framework – what is the structure to comply with/deliver
Process & procedure definitions, templates defined – how to undertake activities
Training & mentoring and QA Audit – implementation and monitoring of uptake & compliance
Metrics programme – accurate data to measure the current state (project, organisation) & identify
improvement opportunities
My DIM Method!
D(efine)
I(mplement)
M(easure)
Why Define a Measurement Programme?
How many of you have been part of process
improvement programmes?
How many programmes have said they will do things so
much better?
And save lots of money/effort?
How many of you have seen PREDICTED,
DEMONSTRABLE, RELIABLE savings/improvements?
Measure Return on Investment (ROI)
Developing IT costs money
Process Improvements are expensive but are
supposed to save money!
Need to accurately measure the cost to improve
against savings gained (money, efficiency, quality)
Metrics used well can measure any element to ensure
it has improved sufficiently!
Complex activity that is organisation-specific
Business Drivers (in the beginning)
Business knew they had:
Little accurate knowledge of complete cost of (poor) quality
Few quality processes & controls
Many issues & knew they needed to improve, but not how
A culture of local “heroes” at project level
They hoped to:
Accurately measure cost of quality over time
Identify process improvement opportunities
Monitor and control delivery process at all stages
Improve quality and reduce costs
Business Drivers (eventually)
Eventual reasons became, primarily:
Measure a (partially) outsourced development life cycle
Measure outsourced, centrally managed IT services
Reduce costs and Improve Quality
Hoped for benefits:
Demonstrate benefits of out-sourced development and
testing effort
Demonstrate savings of out-sourced Managed Services
Demonstrate improved quality and reduced costs
The Main Persuaders (in the end)
Needed to QUICKLY:
Measure benefits of the out-sourced, centrally managed service
contract
Control quality and costs of (partially) out-sourced delivery
process
Demonstrate savings in effort and improvements in quality
i.e. DEMONSTRATE ROI accurately!
By adopting and implementing a full, base Metrics programme!
Define Stage
2 categories of measurement: Product Quality & Process
Efficiency (contract SLAs added later)
Used Goal/Question/Measure (GQM) method
Identified 47 base measurements
Grouped Testing Quality, Debugging and Service performance
Important to accept that base measurements:
Evolve (periodically review for validity and usefulness)
Are based on industry experience but are also business-specific
Accompanied by targets (Best, Should, Must Achieve)
Must be reviewed and agreed by all
Base measurement GQM example
Goal: Ensure that the Business/Functional
Requirements & Attributes have been sufficiently
tested (and are traceable), based on robust risk
assessment and prioritisation
Question(s): Do test records show that sufficient
testing has been successfully completed to
demonstrate adequate PRIORITISED coverage of
requirements/attributes on the Traceability Matrices?
Base measurement GQM example
Measurement(s): Requirements/Attributes Coverage
(successfully tested)
Targets:
Must >=60% coverage
Should >=80% coverage
Best 100% coverage
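As an illustration only (not from the talk), the coverage measurement and its Must/Should/Best targets could be evaluated with a minimal sketch like the following; the function names and data shapes are assumptions:

```python
# Minimal sketch: evaluate requirements coverage against the GQM targets
# on the slide. Names and data shapes are illustrative assumptions.

def coverage(successfully_tested: int, total_requirements: int) -> float:
    """Percentage of requirements/attributes successfully tested."""
    if total_requirements == 0:
        raise ValueError("no requirements to cover")
    return 100.0 * successfully_tested / total_requirements

def target_band(pct: float) -> str:
    """Classify a coverage percentage against the slide's targets."""
    if pct >= 100.0:
        return "Best"        # 100% coverage
    if pct >= 80.0:
        return "Should"      # >= 80% coverage
    if pct >= 60.0:
        return "Must"        # >= 60% coverage
    return "Below Must"      # target not met

print(target_band(coverage(34, 40)))  # 85% coverage -> "Should"
```

In practice the raw counts would come from the test-management tool repository mentioned later in the talk; the point is that each base measurement pairs a formula with agreed target bands.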
Implementation Stage: Project Level
Project level first!
Establish collection process & monitoring at ground level first
Ensure accurate raw data items are collected quickly (using a tool
repository)
Demonstrated quick benefits at project level (standard
reporting toolkit)
Ran for 9+ months before implementing at organisation level
On-going (some resisted at first; some wanted more than
the base!)
Implementation: Organisation Level
Look at organisational level after project level adopted across
(most of) the organisation – walk before you run!
Central Metrics Officer (finally) appointed
Central metrics repository created – collects project level data
and adds other, central data collection routines
Define reporting requirements at all levels – incremental
introduction and issue of different reports
Publicise and agree at all levels & integrate into QA function
3 months to define; 6 months to implement
Training & Awareness
This was key to success; metrics is a mystery to most!
Awareness Programme required
Training module for Reporting Toolkit rolled out
All test and Project resources must be trained
Mentor available (to assist collection, monitoring &
reporting)
Stakeholders involved in reporting requirements definition
and approval
Measurement Stage
“Lies, Damn Lies & Statistics” (Benjamin Disraeli)
Must address culture issues & ensure information is analysed
and presented in a positive and objective manner, e.g.
90% of all known faults found by the System Testing stage
but no faults found before System Testing
Ensure compliance & benefits accruing through project level
measurements
Organisational analysis and reporting is being incrementally built
up
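The "statistics can mislead" example above can be sketched with invented numbers (the stage names and fault counts are not from the talk): the same raw data yields both the flattering headline and the underlying warning sign, so both should be reported.

```python
# Illustrative sketch with invented data: one set of fault counts supports
# both a flattering headline and a warning sign.

faults_by_stage = {            # faults found at each life-cycle stage
    "Requirements Review": 0,
    "Design Review": 0,
    "Unit Testing": 0,
    "System Testing": 90,
    "Live": 10,
}

total = sum(faults_by_stage.values())
stages = list(faults_by_stage)

# Headline: cumulative % of known faults found by the end of System Testing
upto_system = stages[: stages.index("System Testing") + 1]
by_system_pct = 100 * sum(faults_by_stage[s] for s in upto_system) / total
print(f"Found by System Testing: {by_system_pct:.0f}%")   # 90%

# Warning sign: no stage before System Testing found anything
early = [s for s in upto_system[:-1] if faults_by_stage[s] > 0]
print("Any faults found before System Testing:", bool(early))  # False
```

Here "90% found by System Testing" sounds positive, yet the zero counts in earlier stages point at weak reviews and unit testing, which is exactly the culture issue the slide warns about.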
Initial results
At Project Level:
Monitoring in a standard, structured manner has proved
extremely popular
Like the accurate data they receive but do not always like the
messages!
At Organisation level, results are patchy:
First 3 months analysis indicates that approximately 50% of
projects/measurements are not being achieved BUT process
weaknesses being identified & Services being monitored
Not all measurements are being captured accurately or
consistently
The agony!
Lack of understanding by the business of the organisational
needs/benefits that could accrue through Metrics
Resulted in little “political” will to allow development &
implementation until more urgent business drivers appeared
Constant misinterpretation of what could accrue
Frustration in trying to develop and implement over long period
of time (5 iterations over 3.5 years)
Even now, some key elements not being measured (Off-shore
quality measurements, project level resource/cost/time tracking
of actual vs. estimated etc.)
The ecstasy!
When implemented, little resistance – clearly demonstrated
benefits immediately
Even at this early stage, process weaknesses at project,
business unit and organisational level being identified and acted
upon
Increasing awareness of need and potential benefits across the
organisation
Differing presentation of data analysis serves many different
information requirements (increasingly) accurately
The changing mind set towards the metrics programme –
POSITIVE!
What could have been done differently?
What would we have done differently?
Target known problem area
Create limited pilot metrics programme at project
and organisational level
Use results to demonstrate potential benefits and
to gain approval to develop full programme
Control Quality Assurance and Compliance Audit as
part of Metrics programme
Remember…
Accurate measurement is increasingly necessary
but…
Gaining acceptance for the need and benefit is
difficult because…
Measurement is frequently misunderstood and…
There are many political and cultural issues to overcome
but…
Presented properly, the benefits are enormous and quick!
Questions?
Thank You
Brian Wells
www.experimentus.com
M +44(0) 7725 709262